Real-Time procedurally generated planetary landscapes with CLOD


Abstract
We present a real-time method for procedurally generating huge planetary landscapes with continuous level of detail. This approach makes it possible to produce interesting planets with a small or even non-existent pre-generated dataset, which in turn can be used to visualize an endless number of different planets. Where previous work in landscape generation has generally been either purely procedural or purely designed, we have devised a method which allows for a seamless integration of design into the computer-generated world.
Another novelty is the decoupling of the mesh optimization from the rendering. While a high frame rate is a requirement for fast and smooth animation, the mesh optimization can run in the background at a slower pace. We have implemented a system with different update frequencies for rendering and mesh optimization, which lets us prioritize the different tasks and distribute the workload across multiple processors.
A method for generating natural-looking river systems in the procedurally generated terrain is explored and implemented. While we found actual real-time procedural river generation to be very difficult, one can combine a fast preprocessing step with correct river-flow calculations, whose results can later be placed inside the terrain.
* Seamless blend of design and purely procedurally generated terrain
* Decoupled mesh optimization and rendering to better utilize multi-core processors
* Correct river flow calculation in a procedural landscape.
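The decoupled update frequencies mentioned above can be sketched with two threads sharing the latest mesh through a lock. This is an illustrative sketch, not the paper's implementation; all names (`PlanetViewer`, `render_hz`, `optimize_hz`) are made up for the example.

```python
import threading
import time

class PlanetViewer:
    def __init__(self, render_hz=60.0, optimize_hz=10.0):
        self.render_dt = 1.0 / render_hz      # renderer runs fast...
        self.optimize_dt = 1.0 / optimize_hz  # ...mesh optimizer runs slower
        self.lock = threading.Lock()
        self.current_mesh = {"version": 0}    # stand-in for the CLOD mesh
        self.frames = 0
        self.optimizations = 0

    def render_loop(self, duration):
        end = time.time() + duration
        while time.time() < end:
            with self.lock:
                mesh = self.current_mesh      # draw whatever mesh is newest
            self.frames += 1
            time.sleep(self.render_dt)

    def optimize_loop(self, duration):
        end = time.time() + duration
        while time.time() < end:
            new_mesh = {"version": self.optimizations + 1}  # refine mesh here
            with self.lock:
                self.current_mesh = new_mesh  # publish the refined mesh
            self.optimizations += 1
            time.sleep(self.optimize_dt)

def run(duration=0.3):
    viewer = PlanetViewer()
    t1 = threading.Thread(target=viewer.render_loop, args=(duration,))
    t2 = threading.Thread(target=viewer.optimize_loop, args=(duration,))
    t1.start(); t2.start(); t1.join(); t2.join()
    return viewer.frames, viewer.optimizations
```

Because the optimizer only swaps in a finished mesh under the lock, the renderer is never blocked for longer than one pointer exchange, which is what keeps the frame rate independent of optimization cost.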

The paper can be downloaded as a PDF here.


A series of videos recorded from the virtual planet. 


A rough cloud system seen from space. The clouds are rendered on a sphere surface which moves through a 3D perlin noise space.
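The cloud idea can be sketched as sampling a 3D noise field at points on a sphere and sliding the sample position through the noise space over time. A simple hash-based value noise stands in for Perlin noise here; the function names and constants are illustrative, not taken from the paper.

```python
import math

def hash3(ix, iy, iz):
    # deterministic integer hash -> pseudo-random value in [0, 1]
    n = (ix * 73856093) ^ (iy * 19349663) ^ (iz * 83492791)
    n = (n ^ (n >> 13)) * 1274126177
    return ((n ^ (n >> 16)) & 0xFFFFFFFF) / 0xFFFFFFFF

def smooth(t):
    return t * t * (3.0 - 2.0 * t)  # smoothstep interpolation

def value_noise(x, y, z):
    ix, iy, iz = math.floor(x), math.floor(y), math.floor(z)
    u, v, w = smooth(x - ix), smooth(y - iy), smooth(z - iz)
    lerp = lambda a, b, t: a + (b - a) * t
    c = lambda dx, dy, dz: hash3(ix + dx, iy + dy, iz + dz)
    # trilinear interpolation of the eight lattice corners
    x00 = lerp(c(0, 0, 0), c(1, 0, 0), u)
    x10 = lerp(c(0, 1, 0), c(1, 1, 0), u)
    x01 = lerp(c(0, 0, 1), c(1, 0, 1), u)
    x11 = lerp(c(0, 1, 1), c(1, 1, 1), u)
    return lerp(lerp(x00, x10, v), lerp(x01, x11, v), w)

def cloud_density(lat, lon, t, frequency=3.0, drift=0.1):
    # point on the unit cloud sphere
    x = math.cos(lat) * math.cos(lon)
    y = math.cos(lat) * math.sin(lon)
    z = math.sin(lat)
    # move through the noise space over time to animate the clouds
    n = value_noise(x * frequency + t * drift, y * frequency, z * frequency)
    return max(0.0, (n - 0.5) * 2.0)  # threshold so only denser areas show
```

Animating by offsetting the noise coordinates, rather than regenerating the field, is what makes this cheap enough to run per frame.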



Night turns to day as the sun moves across the sky.







A rough procedural mesh is rendered with a pixel-shader-generated noise texture. The noise is 3D Perlin noise. This video shows that even a very rough mesh can look decent if noise is added through texturing.



Procedural planet where the rendered pixel colors are based on a lookup table. For a given distance from the planet center, the table defines the color of the pixel. As shown in the paper, this results in a striped look which appears unnatural. For that reason, the height values are perturbed slightly with a noise function before the color is looked up in the table.
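The lookup-table coloring can be sketched as follows. The table entries, noise function and `perturb` amount are illustrative stand-ins, not the paper's actual values:

```python
import math

COLOR_TABLE = [               # (max_height, RGB) entries, lowest first
    (0.00, (0, 0, 128)),      # deep water
    (0.02, (194, 178, 128)),  # beach
    (0.30, (34, 139, 34)),    # grass
    (0.60, (139, 137, 137)),  # rock
    (1.00, (255, 255, 255)),  # snow
]

def cheap_noise(x, y, z):
    # deterministic pseudo-noise in [-1, 1]; a stand-in for Perlin noise
    return math.sin(x * 12.9898 + y * 78.233 + z * 37.719)

def surface_color(px, py, pz, planet_radius=1.0, perturb=0.01):
    # distance from the planet center gives the height above "sea level"
    height = math.sqrt(px * px + py * py + pz * pz) - planet_radius
    height += cheap_noise(px, py, pz) * perturb  # break up the striped look
    for max_h, rgb in COLOR_TABLE:
        if height <= max_h:
            return rgb
    return COLOR_TABLE[-1][1]
```

Without the `perturb` term, every point at the same altitude gets exactly the same color, which is what produces the stripes.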


A number of different "planets" are quickly generated by changing the seed for the random number generator used in the noise functions.



Night turns to day, through a red sunrise. As the sun moves across the sky, the moving clouds can be observed before the sun eventually sets again.



A "flight" over a procedurally generated planet. The sun rises through a red sunrise and during the (short) day the planet is explored a little before the sun sets again. After sunset, the "space ship" quickly flies towards the sunset to catch up with daylight again.



A flight over the procedural planet. This time the mesh is textured. The texture is not really suitable for this purpose, but it was used nonetheless. At a distance, the surface is colored by the terrain type's base color. As the camera gets closer, the texture starts to blend in.
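The distance-based blend can be sketched as a clamped linear weight between the base color and the detail texture. The distance thresholds and function names are hypothetical:

```python
def blend_factor(camera_distance, full_detail=10.0, no_detail=100.0):
    # 1.0 = detail texture fully visible, 0.0 = base color only
    t = (no_detail - camera_distance) / (no_detail - full_detail)
    return min(1.0, max(0.0, t))

def shade(base_rgb, texture_rgb, camera_distance):
    # mix per channel according to the camera distance
    w = blend_factor(camera_distance)
    return tuple(b * (1.0 - w) + t * w for b, t in zip(base_rgb, texture_rgb))
```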



The camera starts quite close to the planet and then moves away. As more and more of the planet comes into view, the finer details are removed to maintain a somewhat constant polygon count.



To detect and handle collisions between an arbitrary object and the planet, I implemented a pixel-shader method where any pixel whose 3D spatial position lies inside a bounding sphere around the "space ship" is rendered into an off-screen render target. If any pixels were rendered, part of the planet was inside the bounding sphere and we had a collision. The rendered color encodes the surface normal, which makes it easy to average the pixel positions and normals when handling the collision.
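The same logic can be sketched on the CPU (the actual implementation ran in a pixel shader writing to an off-screen target): every surface sample inside the ship's bounding sphere is collected, and the averaged positions and normals describe the contact. All names here are illustrative.

```python
import math

def detect_collision(surface_samples, ship_center, ship_radius):
    """surface_samples: list of (position, normal) 3-vector pairs."""
    hits = []
    r2 = ship_radius * ship_radius
    for pos, normal in surface_samples:
        d2 = sum((p - c) ** 2 for p, c in zip(pos, ship_center))
        if d2 < r2:                      # inside the bounding sphere
            hits.append((pos, normal))   # the shader would write the normal
    if not hits:
        return None                      # nothing rendered -> no collision
    n = len(hits)
    avg_pos = tuple(sum(h[0][i] for h in hits) / n for i in range(3))
    avg_normal = tuple(sum(h[1][i] for h in hits) / n for i in range(3))
    length = math.sqrt(sum(c * c for c in avg_normal)) or 1.0
    avg_normal = tuple(c / length for c in avg_normal)  # renormalize
    return avg_pos, avg_normal
```

Doing this on the GPU means the planet mesh never has to be read back for collision queries; only the small off-screen target does.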


A few static images showing various elements of the program: sunrises, level of detail, and the blending of design.

The fractal subdivision scheme was designed to weight areas such as coast lines higher than most other areas. The reasoning is that the coastal areas have a higher detail level and should therefore be rendered with a more detailed mesh. Adding vertices in areas away from the coast will generally not benefit the visual quality. Other areas with higher priority are mountain ridges seen from a shallow angle: under those circumstances, the mountains form a clear silhouette against the sky, and a coarse mesh would be very noticeable.
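A priority weighting along these lines can be sketched as follows; the boost factors and thresholds are invented for the example, not taken from the paper:

```python
def subdivision_priority(screen_error, height, view_dir, surface_normal,
                         coast_band=0.02, coast_boost=4.0, silhouette_boost=3.0):
    priority = screen_error
    if abs(height) < coast_band:          # close to sea level -> coast line
        priority *= coast_boost
    # silhouette test: surface nearly edge-on to the viewing direction
    facing = abs(sum(v * n for v, n in zip(view_dir, surface_normal)))
    if facing < 0.2:                      # shallow angle -> clear silhouette
        priority *= silhouette_boost
    return priority
```

Triangles would then be split in descending priority order until the vertex budget is spent, so coasts and silhouettes receive vertices first.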



A sunrise or sunset coloring the sky red well before the sun itself becomes visible.



The view from high in the atmosphere. The sun is visible, and so are the stars. It is not entirely realistic to see the clouds, the sun and the stars in the same direction like this, but it looks nice, I think ;-)



Three examples of blending user-designed terrain with the procedural generation, as described in the paper. Click the image for a better look. The images show the blending of a user-designed circle with a central bump into the procedural landscape. The topmost image blends so that the user design is very strong, while the bottom image shows a blend where the procedural content is weighted more heavily.
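The effect in the images can be sketched as a weighted mix of the designed height and the procedural height, with the design weight falling off toward the edge of the designed region. The falloff shape and `design_strength` parameter are illustrative, not the paper's exact formulation:

```python
def blended_height(designed_h, procedural_h, dist_from_center,
                   region_radius, design_strength=1.0):
    # weight 1 at the center of the designed region, 0 at (and past) its edge
    t = max(0.0, 1.0 - dist_from_center / region_radius)
    w = design_strength * t * t * (3.0 - 2.0 * t)   # smooth falloff
    return w * designed_h + (1.0 - w) * procedural_h
```

Lowering `design_strength` gives the bottom-image behavior, where the procedural content dominates even inside the designed region.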


A common problem for real-time procedurally generated content is rivers. The essence of the problem is that terrain features far away or out of sight are commonly ignored when doing real-time continuous-level-of-detail procedural content generation and rendering. For rivers this is a problem, since a river may well start far away and out of sight: the flow of the river is then controlled by terrain which is out of sight and never generated. The method implemented, and described in the paper, is to allow for a fixed number of pre-generated rivers which are then known beforehand and can be blended into the landscape using the previously described method of blending design with procedural content.
A random river seed position is chosen. The terrain in the surroundings is generated and evaluated to see if a river would start there. If it would, the river starts flowing down. As it flows, the terrain surrounding the river is procedurally refined, so the river flows through highly detailed terrain. The river flows downhill, filling river basins, and eventually reaches the ocean.
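The flow-and-fill step can be sketched on a toy 1D height field with a priority flood (this is a simplified stand-in, not the paper's code): the water always proceeds through the lowest reachable cell, which automatically fills basins up to their spill point before continuing, until sea level is reached.

```python
import heapq

SEA_LEVEL = 0.0

def trace_river(heights, start):
    visited = {start}
    frontier = [(heights[start], start)]   # min-heap of reachable cells
    level = heights[start]                 # current water surface
    path = []
    while frontier:
        h, i = heapq.heappop(frontier)
        level = max(level, h)              # rising past a rim fills the basin
        path.append(i)
        if h <= SEA_LEVEL:
            return path                    # the river reached the ocean
        for j in (i - 1, i + 1):           # neighbors on the 1D grid
            if 0 <= j < len(heights) and j not in visited:
                visited.add(j)
                heapq.heappush(frontier, (heights[j], j))
    return path                            # landlocked: no outlet found
```

Unlike naive steepest-descent tracing, the priority flood cannot get stuck oscillating in a pit, which is why basin filling falls out of it for free.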
For the detailed version, read the paper.
Thomas Grønneløv,
Apr 12, 2011, 1:35 AM