Shawn Hargreaves Blog
In videos of real-life rainy MotoGP races, one of the most dramatic visual effects is when raindrops land on the camera lens. We wanted to recreate this in our game, so we needed to understand what happens when light passes through a raindrop.
Had we been smarter, more mathematically inclined, or paid more attention in school, we could have worked this out from first principles (I vaguely recall something about an index of refraction, a dude called Newton, and an apple? :-) but it is sometimes easier to just use your eyes and then figure out how to replicate whatever you see.
Jay (the lead artist) and I found one of those sprays like you use for misting plants, and spent some time squirting it on the office window, then examining the resulting droplets from different angles and against different backgrounds. Here's what we learned:
"Hmm, I wonder why our test droplets don't come away when we try to wipe them off the window? Hey Lynn, what exactly is in this spray bottle? Huh. Some kind of acid cleaning solution, you say?" The office windows never did look quite the same after that :-)
First, we made a droplet texture. We drew the shape of a typical droplet into the alpha channel, and I wrote a function (using the MotoGP equivalent of a content processor) that generated 2D refraction offsets into the red and green channels.
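The post doesn't show the offset-generating function, but here is a rough sketch of the idea in Python/NumPy: treat the droplet as a little hemispherical lens, so the refraction offset grows toward the rim and points back through the centre (the hemisphere model, function name, and encoding are my assumptions, not the actual MotoGP code):

```python
import numpy as np

def build_droplet_texture(size=64, strength=127):
    """Generate an RGBA droplet texture: alpha holds the droplet shape,
    red/green hold 2D refraction offsets biased around 128."""
    # Normalized coordinates in [-1, 1] across the texture.
    ys, xs = np.mgrid[0:size, 0:size]
    x = (xs + 0.5) / size * 2 - 1
    y = (ys + 0.5) / size * 2 - 1
    inside = x * x + y * y < 1.0           # circular droplet mask

    # Hemisphere model: the surface normal tilts further from vertical
    # near the rim, so the offset magnitude grows with radius. Pointing
    # the offset back toward the centre inverts the image, like a lens.
    offset_x = np.where(inside, -x, 0.0)
    offset_y = np.where(inside, -y, 0.0)

    rgba = np.zeros((size, size, 4), dtype=np.uint8)
    rgba[..., 0] = np.clip(128 + offset_x * strength, 0, 255)  # red   = x offset
    rgba[..., 1] = np.clip(128 + offset_y * strength, 0, 255)  # green = y offset
    rgba[..., 3] = np.where(inside, 255, 0)                    # alpha = droplet shape
    return rgba
```

Baking the offsets into a texture at build time keeps the runtime pixel shader down to a couple of texture reads.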
We were already drawing the main 3D scene into a rendertarget, so we could apply postprocessing effects such as motion blur. It was trivial to draw a number of 2D droplet sprites over the top, using a pixel shader that sampled the droplet texture, used its refraction offset to compute texture coordinates into the main scene image, then sampled the scene at this modified location (a similar concept to this XNA Framework sample).
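The per-pixel logic of that shader can be sketched on the CPU like this (a Python stand-in for the HLSL; the function name, scale parameter, and offset encoding are assumptions for illustration):

```python
import numpy as np

def composite_droplet(scene, droplet, pos, scale=0.25):
    """CPU sketch of the droplet pixel shader: for each texel of the
    droplet sprite, decode the red/green refraction offset, perturb the
    screen-space coordinate, and sample the scene image there."""
    out = scene.copy()
    h, w = scene.shape[:2]
    dh, dw = droplet.shape[:2]
    px, py = pos
    for j in range(dh):
        for i in range(dw):
            a = droplet[j, i, 3] / 255.0
            if a == 0:
                continue                      # outside the droplet shape
            # Decode offsets from [0, 255] back to roughly [-1, 1].
            ox = (droplet[j, i, 0] / 255.0 - 0.5) * 2
            oy = (droplet[j, i, 1] / 255.0 - 0.5) * 2
            # Sample the scene at the refracted location.
            sx = int(np.clip(px + i + ox * scale * w, 0, w - 1))
            sy = int(np.clip(py + j + oy * scale * h, 0, h - 1))
            out[py + j, px + i] = (1 - a) * scene[py + j, px + i] + a * scene[sy, sx]
    return out
```

On the GPU this is just two texture samples per pixel: one into the droplet texture, one into the scene rendertarget at the perturbed coordinate.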
It looked good, but the performance was terrible. When I drew a hundred droplets, each one was scaling down the entire scene to cover that single droplet. Adjacent pixels of the droplet were sampling background texels from radically different parts of the scene, which had terrible locality and thus thrashed the texture cache. Our framerate dropped from 60 fps to a slideshow.
I created a 128x64 rendertarget, scaled the main scene down to this lower resolution, and used the resulting smaller texture while drawing the droplet sprites. Now we were only taking the bandwidth hit once per frame, rather than once per droplet. Phew! Back up to 60 fps.
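The downscale pass itself is trivial: in NumPy terms it amounts to a box filter down to the small rendertarget (the real version would just let the GPU bilinear-filter during a fullscreen quad draw; this sketch assumes the dimensions divide evenly):

```python
import numpy as np

def downscale_scene(scene, target_w=128, target_h=64):
    """Box-filter the full-resolution scene down to a small rendertarget,
    so every droplet samples this cheap, cache-friendly copy instead of
    the full-size frame."""
    h, w = scene.shape[:2]
    # Average each (h/target_h) x (w/target_w) block of source pixels.
    return scene.reshape(target_h, h // target_h,
                         target_w, w // target_w, -1).mean(axis=(1, 3))
```

Since the droplets blur and distort the background anyway, sampling a 128x64 copy costs almost nothing in visual quality.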
I used a simple particle system to animate the droplets. New ones appeared at random intervals, and could either fade out, slide slowly downwards, or move outward toward the edge of the lens if the camera was moving particularly fast.
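A toy version of that particle system might look like the following (all class names, constants, and the exact behaviours are hypothetical; the post only specifies random spawning plus the fade / slide / streak-outward behaviours):

```python
import random

class DropletSystem:
    """Toy sketch of a lens-droplet particle system."""

    def __init__(self, spawn_chance=0.2, seed=None):
        self.rng = random.Random(seed)
        self.spawn_chance = spawn_chance
        self.droplets = []   # each: dict with position, alpha, behaviour

    def update(self, dt, camera_speed=0.0):
        # Spawn new droplets at random intervals.
        if self.rng.random() < self.spawn_chance:
            self.droplets.append({
                "x": self.rng.random(), "y": self.rng.random(),
                "alpha": 1.0,
                "mode": self.rng.choice(["fade", "slide", "streak"]),
            })
        for d in self.droplets:
            if d["mode"] == "fade":
                d["alpha"] -= 0.5 * dt          # quietly evaporate
            elif d["mode"] == "slide":
                d["y"] += 0.05 * dt             # run slowly down the lens
            elif d["mode"] == "streak":
                # At speed, droplets get blown outward from the lens centre.
                d["x"] += (d["x"] - 0.5) * camera_speed * dt
                d["y"] += (d["y"] - 0.5) * camera_speed * dt
        # Retire droplets that have faded out or left the lens.
        self.droplets = [d for d in self.droplets
                         if d["alpha"] > 0.0 and 0.0 <= d["y"] <= 1.0]
```

Driving the streak behaviour from camera speed is what makes fast-moving onboard cameras feel so much wetter than a static TV camera.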
For variety, I tweaked the settings to make the droplets look different for each camera type. Third-person views had a medium number of small droplets, except when they got splattered by driving through the spray plume of another bike.
In the first-person cockpit view, I drew a mixture of very large and very small droplets onto the bike windscreen, as opposed to the camera lens.
TV replay cameras had stationary droplets, scaled according to the field of view. This looked particularly realistic when the droplets changed size as the camera zoomed.
Bike-mounted replay cameras had many bigger, faster-moving, shorter-lived, and more opaque droplets. It was sometimes hard to see what was going on behind all this, but it looked very wet indeed, especially when in motion.