Better Planets

I really don't want to spend all my time on planets, because I see how often these kinds of projects get wrapped up in planetary generation and then never come back...still, I guess I need to have good planets since pretty much everybody does procedural planets these days.

They're getting better, although I've constrained them to looking "Earth-like" for now. Instead of picking random colors like before, I think I'm going to have to restrict the generator to a list of palettes that I'll create by hand, since some of the earlier planets just looked plain ugly...purple land and green water don't really go well together.

The lighting model still needs a lot of work. In particular, the bumps are too exaggerated (and are actually based on luminance instead of height), and the water doesn't react to light any differently than the ground does, which is a terrible thing. Nonetheless, it's a step forward!

Blurring a Cubemap

I was surprised by how few resources there are online about blurring a cubemap. I'll give a brief explanation of how to do it in case someone else with the same needs happens to stumble upon this.

First, it is a very reasonable thing to want to blur a cubemap. The most obvious reason, at least in my mind, is to simulate a reflection map for a glossy surface. The cheap alternative is to use the mipmaps, but those will probably be generated with a box filter, which looks pretty bad as an approximation of glossy reflection. So you might want to generate a nice gaussian-blurred version of your environment map. But how?

Doing so is straightforward with a 2D texture - we all know the algorithm. But a cubemap is a very different beast. Or is it? Here's the thing to keep in mind when approaching this problem: stop thinking in terms of six textures. Instead, think of your cubemap as the unit sphere. Which it is, right? Or at least that's how you index into it. Now, if you have a ray on that sphere, how might you find the "blurred" equivalent of that ray? The obvious answer is by summing contributions of "nearby" rays. What does "nearby" mean? Well, that totally depends on what kind of kernel you want! To approximate a gaussian blur, we will sample points on a disk around our ray, where the radius of each sample is drawn from a gaussian distribution.

Suppose that e1 is the normalized position that corresponds to the current cubemap pixel, and that e2 and e3 are the orthogonal vectors that form the tangent space of e1 (see the tangent/bitangent post from a while back if you need code for that). Also suppose that "samples" is an int corresponding to how many samples you want to take (I suggest at least 1000 for high quality, though this will depend on your SD), and that "SD" is the standard deviation of the blur. Finally, suppose that you have a function Noise2D(x, y, s) that returns a random float in [0, 1] given two coordinates and a seed (for this application, the noise doesn't need to be continuous). Then, here you have it:
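(What follows is a C++ sketch of the loop - Vec3, Normalize, SampleCubemap, and PI are stand-ins for your own math types, cubemap fetch, and constants, and the seed values passed to Noise2D are arbitrary; everything else is as defined above.)

    // Accumulate samples around e1 (uses <cmath> and <algorithm>).
    Vec3 result(0, 0, 0);
    for (int i = 0; i < samples; ++i) {
        // Stratify the angle: one stratum per sample, jittered within it.
        float angle = 2.0f * PI * (i + Noise2D(i, 0, 1)) / samples;

        // Draw the radius from a gaussian via Box-Muller on two uniforms.
        float u1 = std::max(Noise2D(i, 1, 2), 1e-6f);
        float u2 = Noise2D(i, 2, 3);
        float radius = SD * std::sqrt(-2.0f * std::log(u1))
                          * std::cos(2.0f * PI * u2);

        // Perturb the ray within the tangent plane of e1, then renormalize
        // so we land back on the unit sphere before fetching.
        Vec3 dir = Normalize(e1 + radius * (std::cos(angle) * e2
                                          + std::sin(angle) * e3));
        result = result + SampleCubemap(dir);
    }
    result = result / float(samples);  // the blurred value for this cubemap pixel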

This is a great example of a Monte Carlo technique. We solved the problem in an elegant and asymptotically-correct fashion with minimal math and minimal code! If you want better results, look up "stratified sampling" and play around with that (note that the code above already performs stratification on the angle, just not on the radius). For my purposes, this simple sampling strategy was sufficient.

Dithering!

I'm really excited about this post. If you look carefully at my previous screenshots, especially those in grayish/dusty environments, you might notice a dirty little secret that I've been refusing to talk about until now: color quantization (in the form of color banding). Color quantization isn't a problem that most games have to address. However, the dusty, monochromatic atmospheres that I've been producing as of late are particularly subject to it, since they use a very narrow spectrum of colors. Within that narrow range, one runs into the fact that 32-bit color only affords 256 levels each for red, green, and blue intensity in a given pixel. When the entire screen is spanned by a narrow spectrum of colors, the discrete steps between those 256 levels become visible, especially in desaturated colors (since there are only 256 total choices for gray).

As an example, here's a shot that demonstrates the problem:

Notice the visible banding in the lower-left. If you open this image in Photoshop, you can see that adjacent bands differ by only one unit in each color component - so we really can't do any better with 8 bits per component! The solution? A very old technique called dithering! Dithering adds an element of "noise" to break up the perfect borders between discrete color steps, which results in a remarkably convincing illusion of color continuity.

OpenGL supposedly supports dithering, but it's not at all clear to me how it does so, and especially how that would interact with a deferred renderer. Luckily, it's actually quite easy to implement dithering in a given shader. The appropriate time to do so is when you have a shader that you know will be performing some color computations in floating point, but will then be outputting to a render target with a lower bit depth (e.g., RGBA8). You'd like to somehow retain the extra information that you had during the floating-point color computations - that's where dithering comes in. Before you output the final color, just add a bit of screen-space noise to the result! In particular, take a random vector v = (x, y, z) such that x, y, and z are each uniformly distributed over [-1, 1], where the uniform random function uses the fragment coordinates (or the texture coordinates) to compute a pseudorandom number (which need not be of high quality). Then add v / 256 to your result. Why v / 256? Well, this allows your pixel to go up or down a single level of intensity in each component, assuming you're using an 8-bit-per-component format. In my experiments, this worked well.
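Here's the idea in a nutshell (a C++-style sketch of the shader math, not my actual shader; rand01(x, y, s) stands in for whatever cheap screen-space hash you use, returning a float in [0, 1]):

    // Nudge each color component up or down by at most one 8-bit level.
    Vec3 Dither(Vec3 color, float fragX, float fragY) {
        // Three independent uniforms, remapped from [0, 1] to [-1, 1].
        Vec3 v(2.0f * rand01(fragX, fragY, 0) - 1.0f,
               2.0f * rand01(fragX, fragY, 1) - 1.0f,
               2.0f * rand01(fragX, fragY, 2) - 1.0f);
        // One intensity level in an 8-bit-per-component target is 1/256 of
        // the normalized color range, hence the division by 256.
        return color + v / 256.0f;
    }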

Right now, I've implemented this the lazy way: I switched my primary render targets to RGBA16, and I dither a single time immediately before presenting the result. If you think about it, that's pretty much equivalent to the above, but requires 2x the bandwidth. I will switch to the more efficient way soon.

And here's the same scene, now with dithering:

As if by magic, the banding has totally disappeared, and we are left with the flawless impression of a continuous color space. Happy times :)

Mining and Beam Weapons

I haven't given up! Not yet, at least. I've got a five-week plan to get a Kickstarter ready for this project. I hope to flesh out a lot of cool features between now and then, and to pull together some real cinematic footage so as to show off all the graphical polish on which I've been spending time.

Right now, my priority is all things mining. Unfortunately, I'm still not positive how the mining system will be implemented. I know that ore/mineable things will actually be physical objects attached to asteroids. I think what will happen is normal weapons will be able to break off shards of the ore, which can then be tractored in. Specialty mining lasers will also be available, and will provide far more efficient extraction of ore (i.e., higher yield than just blasting at it). Mining lasers will have built-in tractor capability, so they won't require a two-step mining process. Of course, this requires implementation of beam weapons, which I've been avoiding.

Here's my first attempt at mining beams.

The beams look pretty cool, and are animated using coherent temporal noise that flows along the direction of the beam, which breaks up the perfect gradient and provides the illusion that the beam is pulsing into the target object.
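For the curious, the gist of the animation looks something like this (a sketch of the idea, not my exact code; CoherentNoise1D is any smooth 1D noise function, u is the normalized distance along the beam, and the constants are tuning values):

    // Sample the noise at a coordinate that advances with time, so the
    // pattern appears to flow from the emitter toward the target.
    float BeamIntensity(float u, float time) {
        float n = CoherentNoise1D(u * 8.0f - time * 4.0f);  // n in [-1, 1];
                                                            // 8 = frequency, 4 = flow speed
        // Remap to a brightness factor that pulses around full intensity.
        return 0.75f + 0.25f * n;
    }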

I can't wait to use this same functionality to slap some beam weapons on large ships and observe epic cap-ship battles (which, sadly, will require a good ship generator, which I'm still avoiding :/).

Dust and Collision Detection

I made some great progress yesterday with volumetric effects, and am officially excited about the current look of dust clouds/nebulae! I've implemented a very cheap approximation of global illumination for volumes. Although the effect is really simple, it adds a lot to the realism. On top of illumination from background nebulae, there's also a fake illumination effect from stars. Here are some shots:

Definitely starting to feel like the game has a strong atmosphere!

The rest of my time lately has been devoted to finally getting real collision detection. I tried several options, including Bullet, OZCollide, and Opcode. For now, I've opted (no pun intended...) to go with Opcode. I love the cleanliness of the interface, as well as the fact that it's really easy to use as just a narrow-phase collider, since I've already implemented my own broad phase.

Once again, I started a fight that I shouldn't have started with someone way bigger than me. Luckily, thanks to the new, accurate collision detection, I was able to sneakily take refuge in a depression in this large asteroid:

Opcode is very fast, so I'm pleased with that. And I haven't even set up temporal coherence caches yet, which supposedly can yield 10x speedups. Unfortunately, I've already run into a very serious problem that seems to be universal among collision detectors: they don't support scaling in world transformations!! I'm not sure why...presumably the precomputed acceleration structures and primitive tests assume rigid (rotation + translation) transforms, but there may be a deeper issue that I'm not aware of. At any rate, this poses a big problem: I've got thousands of collidable asteroids drifting around systems. With scaling support, this wouldn't be an issue, because I only generate about 15 unique meshes for a given system; with instancing and scaling, you can't tell that some of the asteroids are the same. Scaling is an absolutely critical part of the illusion! But to get the scaled asteroids to work with collision detectors, I have to bake the scale data into the meshes. I can't possibly generate a unique, scaled mesh and acceleration structure for each of the thousands of asteroids at load time. That would take way too long, and way too much memory.

To solve this, what I'm currently doing is anticipating collisions and launching another thread to build the acceleration structures for the objects that would be involved. To do so, I keep track of a "fattened" world-space AABB for every object, and launch the builder thread when the fattened boxes intersect. The factor by which the world boxes are exaggerated controls how early the acceleration structures get cached. So far, this method is working really well. Opcode is fast at building acceleration structures, so I haven't had any trouble with response time. In theory, on a slow processor, a collision will be ignored if the worker thread doesn't finish building the acceleration structure before the collision occurs. I've tested to see how far I can push the complexity of the meshes before this starts happening. Indeed, if I use the full-quality asteroid meshes (~120k tris) for collision, my first missile flies right through them (admittedly, I also turned the missile speed way up). But let's face it, 120k tris is a little much for a collision mesh! And the only real alternative is to stall and wait for the acceleration structure to be generated, which means the game would freeze for a moment as your missile heads toward an asteroid or ship. I'd much rather risk an occasional missed collision on a really low-end machine than have the game regularly stall when I shoot at things.
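In case it helps, the anticipation test itself is dead simple (a sketch; AABB, Vec3, Intersects, and the queueing details are placeholders for my actual structures):

    // Exaggerate a world-space box about its center. A larger factor means
    // acceleration structures get cached earlier, at the cost of more
    // speculative builds.
    AABB Fatten(const AABB& box, float factor) {
        Vec3 center = (box.min + box.max) * 0.5f;
        Vec3 half   = (box.max - box.min) * 0.5f * factor;
        return AABB(center - half, center + half);
    }

    // Each frame, for candidate pairs from the broad phase:
    //   if (Intersects(Fatten(boxA, factor), Fatten(boxB, factor)))
    //       enqueue a build job for this pair on the worker thread.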

I'm very impressed with how easy multithreading is with SFML. The thread and mutex structures provided by SFML are completely intuitive and work perfectly! It took less than an hour to implement the aforementioned collision structure worker thread.
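The whole thing boils down to a few SFML primitives (a trimmed sketch; CollisionPair, buildQueue, and BuildAccelerationStructure stand in for my actual collision code):

    #include <SFML/System.hpp>
    #include <vector>

    sf::Mutex queueMutex;
    std::vector<CollisionPair> buildQueue;  // pairs awaiting acceleration structures

    void BuilderLoop() {
        for (;;) {
            CollisionPair pair;
            bool havePair = false;
            {
                sf::Lock lock(queueMutex);  // RAII lock, released at scope exit
                if (!buildQueue.empty()) {
                    pair = buildQueue.back();
                    buildQueue.pop_back();
                    havePair = true;
                }
            }
            if (havePair)
                BuildAccelerationStructure(pair);  // the slow part, off the main thread
            else
                sf::sleep(sf::milliseconds(1));    // idle politely
        }
    }

    sf::Thread builderThread(&BuilderLoop);  // launched once via builderThread.launch()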

Engine Trails and Dust

Wasn't such a good idea to mess with this guy...he has a lot of guns. I've upgraded engine trails to a first-class subsystem; they no longer use the particle system. They look much better as continuous ribbons than as bunches of particles.
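For reference, the usual way to build this kind of ribbon is to keep a short history of points behind the engine and stitch them into a strip (a sketch of the general technique, not necessarily my exact implementation; kTrailLifetime is an assumed tuning constant):

    #include <deque>

    // Record the engine's position every frame; the trail is a triangle
    // strip stitched through this history, fading with age.
    struct TrailPoint { Vec3 position; float age; };

    std::deque<TrailPoint> trail;  // newest at the front

    void UpdateTrail(const Vec3& enginePos, float dt) {
        for (TrailPoint& p : trail) p.age += dt;
        trail.push_front({ enginePos, 0.0f });
        while (!trail.empty() && trail.back().age > kTrailLifetime)
            trail.pop_back();  // drop points that have fully faded
    }
    // To render: for consecutive pairs of points, emit two vertices offset
    // perpendicular to the segment (camera-facing), with width and alpha
    // shrinking as age approaches kTrailLifetime.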

Last night I started playing with fake volumetric effects for nebulae/dust clouds/etc. The effect is really cool, but isn't that apparent in screenshots; when flying around, it's pretty compelling. This was one of my favorite graphical aspects of Freelancer - the look and feel of the dust clouds. They felt so distinct, and really broke up the monotonous feeling of space.