IT struggles with climate change

06.02.2006

We now have a better connection to the high-speed Internet2 Abilene network, with its 10Gbit/sec. cross-country backbone. The bottom line is that we still need much higher bandwidth, less network congestion and smart transfer protocols, such as the large-file transfer protocol [bbFTP], that minimize CPU load.
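
As a rough illustration of why backbone speed alone doesn't settle the question, here is a minimal back-of-envelope sketch of wall-clock transfer time for a large data set. The 10Gbit/sec. link speed comes from the interview; the 30% utilization factor is an assumption, since congestion and protocol overhead usually keep sustained throughput well below line rate.

```python
# Back-of-envelope transfer time over a shared 10 Gbit/sec backbone.
# The 0.3 utilization factor is an assumption, not a measured figure.

def transfer_time_hours(data_terabytes, link_gbps=10.0, utilization=0.3):
    bits_to_move = data_terabytes * 1e12 * 8       # terabytes -> bits
    effective_bps = link_gbps * 1e9 * utilization  # sustained bits/sec
    return bits_to_move / effective_bps / 3600     # seconds -> hours

# A 100 TB data set at 30% utilization occupies the link for ~3 days.
print(f"{transfer_time_hours(100):.0f} hours")     # ~74 hours
```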

Hack: The so-called sneakernet continues to provide the best bandwidth for moving large data sets between computing centers -- shipping data on tape or disk via overnight services. We are engaged in emerging computational projects that will generate hundreds of terabytes per experiment. Moving that data is a significant challenge, and storing and accessing it for analysis is a comparably difficult technical task.
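
The sneakernet point is easy to check with arithmetic. This sketch assumes a 24-hour courier transit and a 500 TB payload (both illustrative figures, not from the interview) and ignores read/write time at each end; even so, shipped media comfortably beats a 10Gbit/sec. backbone.

```python
# Effective bandwidth of shipping media overnight ("sneakernet").
# The 24-hour transit time and 500 TB payload are illustrative
# assumptions; media read/write time at each end is ignored.

def sneakernet_gbps(data_terabytes, transit_hours=24.0):
    bits_shipped = data_terabytes * 1e12 * 8       # terabytes -> bits
    seconds = transit_hours * 3600
    return bits_shipped / seconds / 1e9            # Gbit/sec

# 500 TB shipped overnight delivers ~46 Gbit/sec of effective
# bandwidth, several times the 10 Gbit/sec network backbone.
print(f"{sneakernet_gbps(500):.0f} Gbit/sec")      # ~46
```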

How adequate is supercomputer capacity in the U.S. for scientific research?

Hack: One could argue that there will never be enough supercomputing capacity. In [a] sense, scientific progress is paced by the availability of high-performance computing cycles. And the problem becomes more acute as the need to address nonlinear scientific problems in other disciplines, like materials science, computational chemistry and computational biology, continues to grow.

There remains some controversy about global warming. Could better climate models and/or better computer technology help resolve that?