GPU Terrain

December 20, 2011 · Algorithmic Art

I've finally admitted it to myself: CPU-based terrain generation just isn't smart, and it'll probably get crushed by all things GPU-based in the future. Sure, the CPU makes threading and precision easier to deal with, but it doesn't make sense that my current terrain takes about the same amount of time to generate on my Intel HD Graphics-driven laptop as it does on my GTX 560-powered desktop. Why on earth are all those processors going to waste?

So, here we go: my second endeavor into GPU-based terrain generation. I've already come farther than last time, when I gave it all up because of cracks that I thought were the result of GPU precision problems. This time I managed to solve the cracks. Granted, my skirts were hiding them anyway, but it still made me uneasy that tiles didn't line up perfectly - luckily, a little math solved that.
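The post doesn't spell out that math, but a common way to keep independently generated tiles from cracking at their seams is to derive every vertex from shared integer grid coordinates, so that neighbouring tiles compute bit-identical positions and heights along their common edge. Here's a minimal C++ sketch of that idea; all of the names, sizes, and the placeholder height function are hypothetical and not taken from XDX:

```cpp
#include <cstdint>

// Hypothetical tile description: each tile is anchored at integer
// coordinates (tileX, tileZ) in a global tile grid.
struct Tile {
    std::int32_t tileX, tileZ;
};

constexpr int   TILE_QUADS = 64;     // quads per tile edge (assumed)
constexpr float CELL_SIZE  = 1.0f;   // world units per quad (assumed)

// Stand-in for the real height function evaluated on the GPU.
float heightAt(double worldX, double worldZ) {
    (void)worldX; (void)worldZ;
    return 0.0f;  // flat placeholder; see the noise sketch further down
}

// Vertex position for local vertex (i, j) of a tile.  The world
// coordinate is built from *integer* global indices, so the right edge
// of one tile and the left edge of its neighbour evaluate exactly the
// same (worldX, worldZ) and therefore exactly the same height: no
// floating-point drift between tiles, no cracks.
void vertexPosition(const Tile& t, int i, int j,
                    float& outX, float& outY, float& outZ)
{
    std::int64_t gx = std::int64_t(t.tileX) * TILE_QUADS + i;  // global column
    std::int64_t gz = std::int64_t(t.tileZ) * TILE_QUADS + j;  // global row
    double worldX = double(gx) * CELL_SIZE;
    double worldZ = double(gz) * CELL_SIZE;
    outX = float(worldX);
    outZ = float(worldZ);
    outY = heightAt(worldX, worldZ);
}
```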

The screenshot may look like a regression from my previous one, but since it's GPU-generated, it represents a step in a direction that I think will prove a worthwhile investment in the future of XDX's terrain engine. The terrain below is obviously simple: just a single layer of iterated value noise. Building up layers is the next step, and it should be relatively easy with a few more shaders.
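For reference, "iterated value noise" means summing several octaves of value noise (pseudo-random values at integer lattice points, smoothly interpolated in between), each octave at double the frequency and half the amplitude of the last. The C++ sketch below is just an illustration of that technique, written CPU-side so it's easy to read but straightforward to port to a pixel or compute shader; the hash and function names are my own placeholders, not XDX code:

```cpp
#include <cmath>
#include <cstdint>

// Pseudo-random value in [0, 1] for an integer lattice point.  Any
// cheap, GPU-friendly integer hash works; this one is only illustrative.
static float latticeValue(std::int32_t x, std::int32_t y) {
    std::uint32_t h = std::uint32_t(x) * 374761393u + std::uint32_t(y) * 668265263u;
    h = (h ^ (h >> 13)) * 1274126177u;
    h ^= h >> 16;
    return float(h) / 4294967295.0f;
}

// One octave of value noise: bilinear interpolation of the four
// surrounding lattice values with a smoothstep fade, which is exactly
// what a shader would do per sample.
static float valueNoise(float x, float y) {
    std::int32_t ix = std::int32_t(std::floor(x));
    std::int32_t iy = std::int32_t(std::floor(y));
    float fx = x - float(ix), fy = y - float(iy);
    float ux = fx * fx * (3.0f - 2.0f * fx);   // smoothstep fade
    float uy = fy * fy * (3.0f - 2.0f * fy);
    float v00 = latticeValue(ix,     iy);
    float v10 = latticeValue(ix + 1, iy);
    float v01 = latticeValue(ix,     iy + 1);
    float v11 = latticeValue(ix + 1, iy + 1);
    float a = v00 + (v10 - v00) * ux;
    float b = v01 + (v11 - v01) * ux;
    return a + (b - a) * uy;
}

// "Iterated" value noise: sum octaves at doubling frequency and halving
// amplitude (the classic fractal layering), normalized back to [0, 1].
float iteratedValueNoise(float x, float y, int octaves) {
    float sum = 0.0f, amplitude = 1.0f, frequency = 1.0f, norm = 0.0f;
    for (int i = 0; i < octaves; ++i) {
        sum  += amplitude * valueNoise(x * frequency, y * frequency);
        norm += amplitude;
        amplitude *= 0.5f;
        frequency *= 2.0f;
    }
    return sum / norm;
}
```

Presumably, each additional terrain layer would then just be another evaluation like this with its own frequency, amplitude, and hash, blended together in later shader passes.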