History of Climate Models
Early studies of atmospheric and oceanic circulation patterns relied on simple physical models. Sometimes, a container filled with fluid was placed on a hotplate that represented the warming of the tropics; cooler areas of the container represented the poles. In more elaborate physical models, a heated tank was mounted on a turntable to mimic Earth’s rotation (Coriolis force). These efforts became known as general circulation models (GCMs). Physical GCMs, however, were limited in their ability to account for the intricacies of atmospheric convection or ocean currents, and climatologists soon turned from physical to mathematical GCMs...
[Figure: The ENIAC computer, which began operation in 1947, was 2.4 m × 0.9 m × 30 m in size and contained 17,468 vacuum tubes.]
Contemporary mathematical models depict Earth’s climate in its entirety, and the acronym GCM now also stands for global climate model. Digital computers, developed near the end of World War II, are well suited to the repetitive calculations in GCMs. In 1950, the first large-scale digital computer, ENIAC, achieved real-time climate prediction: 1 minute of calculations estimated air movements for the next minute. As digital computers became more sophisticated, so did the GCMs. For convenience, GCMs divide the world into boxes having a certain length and width (grid size) and a certain height determined by the number of vertical levels. They also treat the passage of time in a stepwise fashion, calculating conditions only at discrete moments separated by a certain interval (time step). Generally, the accuracy of such models improves with a smaller grid size, more vertical levels, or a shorter time step, but each of these refinements requires additional computation time.
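The tradeoff between resolution and computation time can be sketched numerically. The scaling assumed below (cost proportional to the number of horizontal cells, the number of vertical levels, and the number of time steps) is an illustrative simplification for this example, not the cost model of any particular GCM:

```python
def relative_cost(grid_km, levels, dt_min, ref=(500, 3, 10)):
    """Estimate compute cost relative to a reference configuration.

    Illustrative assumption: cost scales with the number of horizontal
    grid cells (inversely with grid size squared), the number of
    vertical levels, and the number of time steps (inversely with the
    time step length).
    """
    g0, l0, t0 = ref
    return (g0 / grid_km) ** 2 * (levels / l0) * (t0 / dt_min)

# Cost of the 1994 configuration (200 km, 9 levels, 7.5 min) relative
# to the 1975 one (500 km, 3 levels, 10 min):
print(round(relative_cost(200, 9, 7.5), 1))  # 25.0
```

Under this simplified scaling, halving the grid size alone quadruples the work per time step, which is why each jump in resolution demanded a new generation of hardware.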
In 1969, one of the first mathematical models to couple atmospheric and oceanic processes engaged the most powerful computer of its time for 46 straight days. Six years later, in 1975, the next generation of computer churned for 50 straight days on the next generation of this model to simulate 300 years of global climate at low resolution (500 km grid size, 3 vertical levels, 10-minute time step). In 1994, another model, running on a cluster of 352 computers, required 10 minutes to calculate the climate for 1 day at higher resolution (200 km grid size, 9 vertical levels, 7.5-minute time step); at this rate, reconstructing 300 years of climate would require about 760 days of computer time. In 2004, a cluster of 2560 computers at the Japanese Earth Simulator Center required as little as 3 hours to simulate 1 day of climate at very high resolution (3.5 km grid size, 54 vertical levels, 25-second time step); at this rate, a 300-year reconstruction, were it ever attempted, would require over 37 years of computer time. Clearly, GCMs have expanded to take full advantage of advances in computer science, and their insatiable appetite for computation has been a major force spurring the development of ever faster and larger machines.
Mechoso, C. R., J. D. Farrara, and J. A. Spahr (1994) Achieving superlinear speedup on a heterogeneous, distributed system. IEEE Parallel & Distributed Technology 2:57–61.
Satoh, M., H. Tomita, T. Nasuno, S.-I. Iga, K. Goto, Y. Tanaka, M. Tsugawa, M. Sakasita, and M. Kogi (2004) Development of super high-resolution atmospheric and oceanic general circulation models on quasi-uniform grids. In Annual Report of the Japanese Earth Simulator Center.
This is an excerpt from the book Global Climate Change: Convergence of Disciplines by Dr. Arnold J. Bloom, taken from UCVerse of the University of California.
©2010 Sinauer Associates and UC Regents