How would you design a database if you have to store gigabytes of new data every second?

A particle accelerator generates a huge amount of data that needs to be stored in real time on very large storage systems. The offline analysis phase usually happens weeks later. But nowadays the quantity of data is so large, and the query patterns so diverse, that you also have to build the database indexes in real time! The team at Berkeley Lab performed some interesting optimizations in this area:

http://www.physorg.com/news4137.html
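One technique often used for this kind of append-heavy, query-diverse workload is a bitmap index that is updated at ingest time, so equality queries never have to scan the raw records. The sketch below is a hypothetical toy illustration of that idea, not the Berkeley Lab implementation: it uses plain Python sets where a real system would use compressed bitmaps, and the class and event names are invented for the example.

```python
from collections import defaultdict

class StreamingBitmapIndex:
    """Toy bitmap index built incrementally as records stream in.

    Each distinct attribute value maps to a "bitmap" (here a set of
    row ids), so lookups are set operations instead of table scans.
    """

    def __init__(self):
        self.bitmaps = defaultdict(set)  # value -> set of row ids
        self.rows = 0                    # total records ingested

    def append(self, value):
        # Index the record at ingest time, not in a later batch pass.
        self.bitmaps[value].add(self.rows)
        self.rows += 1

    def query(self, *values):
        # OR together the bitmaps of the requested values.
        result = set()
        for v in values:
            result |= self.bitmaps[v]
        return result

# Simulate a stream of detector events keyed by particle type.
idx = StreamingBitmapIndex()
for event in ["muon", "pion", "muon", "kaon", "pion", "muon"]:
    idx.append(event)

print(sorted(idx.query("muon")))          # rows holding muon events
print(sorted(idx.query("pion", "kaon")))  # union of two bitmaps
```

A production-grade version would compress each bitmap (e.g. run-length encoding) so that millions of distinct values stay affordable in memory while appends remain cheap.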