Yesterday I encountered a new distributed computing project. In this one, the client downloads climate models and runs them over the period from 1920 to 2080. The tool estimates 2445 hours to complete the first two models it downloaded. All results are submitted to the project's servers, where researchers use them to evaluate various predictions (the 1920 to 2007 period is known history, so it serves as a check) and to forecast long-term trends. The tool also includes visualization and screensaver modes that let you watch clouds and other global climate systems move as they are calculated.
For more information on this project, check out climateprediction.net.
Distributed Computing Primer: For those who aren't aware, the idea behind distributed computing is that people with internet-connected computers donate unused processing time to perform calculations for a project. A server on the internet assigns work units to each computer and keeps track of what each has been assigned and what it has completed. Completed work also feeds a ranking, so people can compete to contribute more processing time by adding more computers under their account. The other half is a piece of software installed on each participating computer, which handles fetching work from the project's server and determining when it can run without impacting normal use of the machine.
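The fetch/compute/submit cycle described above can be sketched as a small client loop. This is a hypothetical illustration, not the real protocol any project uses; the function names, the fake work unit, and the idle check are all invented stand-ins for what an actual client (which speaks HTTP to a project server and monitors real system load) would do.

```python
import time

def fetch_work_unit(server):
    # Stand-in for asking the project's server for an assignment.
    # A real client would make a network request; the server records
    # which computer received which work unit.
    return {"id": 1, "numbers": list(range(1, 1001))}

def machine_is_idle():
    # Real clients check CPU load, user activity, power state, and
    # user preferences. Here we simply pretend the machine is free.
    return True

def compute(work_unit):
    # Stand-in for the long-running model calculation.
    return sum(work_unit["numbers"])

def submit_result(server, work_unit, result):
    # Stand-in for uploading the result; the server marks the work
    # unit complete and credits the contributor's account.
    return {"work_unit": work_unit["id"], "result": result}

def client_loop(server, max_units=1):
    receipts = []
    for _ in range(max_units):
        unit = fetch_work_unit(server)
        while not machine_is_idle():
            time.sleep(60)  # back off until the machine is free
        result = compute(unit)
        receipts.append(submit_result(server, unit, result))
    return receipts
```

Running `client_loop("example-project-server")` fetches one fake work unit, computes it, and returns the submission receipt; the idle check is the piece that keeps real clients from slowing down the machine while you're using it.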