Kévin Boulanger

Rendering Grass in Real Time with Dynamic Lighting

Grass rendering

Since grass is very abundant on the Earth’s surface, it is an important element of rendered natural 3D scenes. Realistic real-time rendering of grass has always been difficult due to the huge number of grass blades. Overcoming this geometric complexity usually requires many coarse approximations to provide interactive frame rates. However, this performance comes at the cost of poor lighting quality and a lack of detail in the grass. Our research work describes a grass rendering technique that allows better lighting and parallax effects while maintaining real-time performance. We use a combination of geometry and lit volume slices, composed of Bidirectional Texture Functions (BTFs). The BTFs, generated in a fast precomputation step, provide accurate per-pixel lighting of the grass. Our implementation allows the rendering of a football field, covered by approximately 627 million virtual grass blades, with dynamic lighting, shadows and anti-aliasing, in real time. Arbitrarily shaped patches of grass can be created using our density management, which also allows seamless transitions between levels of detail.

Our method supports three levels of detail, so it allows rendering at any distance from the grass. Here are some results (click on the thumbnails to see larger images):

Renderings of the football field

In nature, grass is never uniformly distributed over the ground. Various external phenomena introduce chaos: the type of soil, the availability of water, rocks, roads, etc., all of which affect the grass density. We define grass density as the number of grass blades per unit of surface area. The density of grass at a point of the ground, which we call local density, can be defined with a density map mapped over the terrain. Here is an example (the black pixels of the map represent the regions of the terrain with no grass, the white pixels represent the regions with the maximum grass density):

Density map for the terrain

Managing grass density in real-time applications is often difficult. Typically, when many instances of a primitive, a tree for example, have to be distributed over a terrain using a density map, the position of each instance is computed in a preprocessing step. These positions are then used to translate each instance of the primitive when rendering the final scene. In the case of grass, the memory required to store all the grass blade locations is unavailable for large terrains containing hundreds of millions of grass blades. We define a method that requires neither the preprocessing step nor the storage space to locate the instances. Here are some results, using the previous density map (click on the thumbnails to see larger images):

Renderings of the park demo
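The core idea behind storage-free placement can be sketched as follows. This is an illustrative reconstruction, not the exact scheme of the paper: each terrain cell seeds a pseudo-random generator from its own coordinates, so the same blade positions are regenerated identically every frame, and each candidate blade is kept only where the density map is high enough.

```python
import random

def blades_in_cell(cx, cy, density_at, max_per_cell=64):
    """Deterministically generate grass-blade positions for one terrain cell.

    cx, cy      -- integer cell coordinates
    density_at  -- function (x, y) -> local density in [0, 1], e.g. a
                   lookup into the density map (illustrative callback)
    """
    # Seed the PRNG from the cell coordinates only: the same cell always
    # yields the same blades, so nothing has to be stored between frames.
    rng = random.Random(hash((cx, cy)))
    blades = []
    for _ in range(max_per_cell):
        x = cx + rng.random()        # candidate position inside the cell
        y = cy + rng.random()
        threshold = rng.random()     # persistent per-blade random threshold
        # Keep the blade only where the density map is high enough; the
        # expected number of blades per cell is proportional to the map.
        if threshold < density_at(x, y):
            blades.append((x, y))
    return blades
```

In a real renderer this test would run on the GPU for the cells around the viewer; the Python version above only illustrates the determinism that removes the need for precomputed positions.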

We also use our density management to achieve smooth transitions between levels of detail: a function of the distance from the viewer modulates the local density at each point of the terrain. The following image shows, in false colors, the three levels of detail and how they are interleaved. Red grass blades are rendered with geometry for full detail. Blue grass blades are rendered using the vertical slices of a volume rendering method. Green grass blades are rendered using the horizontal slice of the same method.

Levels of detail of the grass rendering algorithm
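One way to realize such a density-modulated transition can be sketched as follows (the switch distances, fade widths, and function names are illustrative assumptions, not the paper's exact values): each distance is assigned a share of the local density per level of detail, and a blade's persistent random threshold selects which representation renders it, so blades of different levels stay interleaved during the transition.

```python
def lod_weights(distance, d_geom=10.0, d_slices=50.0, fade=5.0):
    """Distance-dependent share of the local density assigned to each LOD.

    Returns (geometry, vertical_slices, horizontal_slice) weights in [0, 1]
    summing to 1. d_geom / d_slices are the switch distances and fade is
    the width of each transition band; all values are illustrative.
    """
    def ramp(a, b, x):               # linear ramp clamped to [0, 1]
        return min(1.0, max(0.0, (x - a) / (b - a)))
    to_slices = ramp(d_geom, d_geom + fade, distance)
    to_flat = ramp(d_slices, d_slices + fade, distance)
    geometry = 1.0 - to_slices
    horizontal = to_flat
    vertical = 1.0 - geometry - horizontal
    return geometry, vertical, horizontal

def lod_for_blade(threshold, distance):
    """Pick the LOD rendering a blade with persistent random 'threshold'."""
    g, v, _ = lod_weights(distance)
    if threshold < g:
        return "geometry"
    if threshold < g + v:
        return "vertical slices"
    return "horizontal slice"
```

Because the threshold is tied to the blade (not to the frame), a blade migrates from one representation to the next exactly once as the camera recedes, which is what makes the transition seamless.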

Shadows are an important factor of realism in rendered scenes. Without them, the rendered images look flat, with low contrast, and it is difficult to judge the exact location of 3D objects relative to one another. However, rendering scenes with shadows involves expensive computations, resulting in low frame rates. Rendering exact shadows for each grass blade would be prohibitively expensive, so we need fast approximations that give visually pleasing dynamic shadows. We take three kinds of shadows into account: points of the ground occluded by grass blades, points of grass blades occluded by other blades, and occlusions by external elements of the environment.

Here is an example of rendering showing the first kind of shadows:

Shadows cast onto the ground by grass blades

Here is an example of rendering showing the shadows cast on grass blades by neighboring blades:

Shadows on grass blades cast by neighboring blades

The last kind of shadows is managed using an ambient occlusion map, representing the amount of occlusion from the environment at each point of the ground. Here is an example of such a map, with an example of a rendered scene (the arrow represents the camera):

Ambient occlusion map      Ambient occlusion map applied to grass rendering
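Applying such a map amounts to attenuating the ambient lighting term by the precomputed occlusion factor sampled at each ground point. A minimal sketch, with an illustrative shading model and a hypothetical `ao_map_lookup` callback standing in for the texture fetch:

```python
def shade_ground(base_color, ambient, direct, ao_map_lookup, x, y):
    """Combine direct lighting with an ambient term attenuated by the
    precomputed ambient-occlusion map (illustrative shading model).

    ao_map_lookup(x, y) returns 0 where the environment fully occludes
    the point and 1 where the point is fully open to the sky.
    """
    ao = ao_map_lookup(x, y)
    # Direct light is shadowed separately; only ambient light is scaled.
    return tuple(c * (direct + ambient * ao) for c in base_color)
```

Since the occlusion comes from static external elements, the map is computed once and the per-pixel cost at render time is a single texture lookup and multiply.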

We achieve real-time frame rates with a Pentium D 3 GHz processor and a GeForce 7800 GTX video card, at 1024×768 resolution with 4× antialiasing. We obtain 10 to 55 frames per second for the park scene and 20 to 250 frames per second for the football field.

Warning! The speeds we claim are for the WORST case! The demos we show are proofs of concept whose goal was to present the maximum number of features. In the context of games, many parameters of the algorithms can be set to constants (the position of the sun, for example), drastically reducing the work done by the shaders and increasing the frame rates. We also use antialiasing to provide better image quality, but it is optional. The distance thresholds used to switch between levels of detail can be adjusted: we doubled the frame rates simply by switching levels of detail a bit earlier. The implementation can be highly optimized: we used a GeForce 7 and the OpenGL Shading Language. By using DirectX features, HLSL or Cg for the shaders, and GeForce 8 class hardware (to access the instancing API and texture arrays), the frame rate would be much higher. That is why the algorithm is applicable to games with some modifications and optimizations; it is actually highly scalable.

An early version of this work was accepted as a sketch at the Siggraph 2006 conference. Additional work has since been done to improve the method: non-flat terrains are now supported and the level-of-detail transition algorithm has been improved. The final paper has been published in IEEE Computer Graphics & Applications.

The trees and flowers present in the park scene are modeled using XFrog (using the basic tree library and the European groundcover library).

Downloads


We provide videos of our test program rather than the test program itself. The main reason is that the program is a prototype; it needs some additional work before it can be made publicly available. To avoid giving the impression that the videos hide problems in the algorithm, we show all features at the same time. Several heights and angles of the moving camera show different views that would be identical in a walk-through. Also, the movement of the camera is continuous; we did not assemble several clips to hide problems.

Demo video - High definition, 1280x720 (150 MB)
Demo video - Low definition, 640x360 (37.6 MB)

Some video compression artefacts appear in the videos because of the low video bitrates (the sudden changes of color every second are due to the compression, not the rendering): grass rendering contains many high frequencies that make video compression difficult. It is better to download the high definition video if your internet connection allows it. You need the Windows Media Video 9 codec to play the videos.

Preview of the grass video

If you want to have a quick preview of the video, here is a Flash version:


For up-to-date details on how the algorithm works, you can read the chapter on grass in my PhD thesis:

PhD thesis (pdf) (8.1 MB)

The following documents were made with an earlier version of the algorithm. The latest version provides a more advanced method for the transitions between levels of detail, and non-flat terrains are supported.

Siggraph 2006 sketch (pdf) (2.5 MB)
Siggraph 2006 sketch supplemental material (pdf) (2.6 MB)
Siggraph 2006 presentation (pdf) (17.8 MB)
Siggraph 2006 sketch demo video, low quality (27.4 MB)
Siggraph 2006 sketch demo video, high quality (73.5 MB)

INRIA research report RR-5960 (pdf) (27.3 MB)

If you have any comment, you can contact me at info@kevinboulanger.net