Climate modelling

now, given the dense population increase that has occurred. At the moment we are setting up a model to examine the Tambora volcanic eruption of 1815, the largest known volcanic eruption in human history. I have also led a number of regional climate impact assessments, where we downscale climate from the 3D models to the local scale using either statistical techniques or regional models; that allows us to look at the potential impact climate change may have on human systems such as agriculture, transportation and water resources in much greater detail and with far better accuracy. I am also about to be named a member of the Federal Advisory Committee that will lead the next national climate assessment for the US.

Ever since I became an atmospheric scientist in the early 1970s, my research has been based around numerical models of the physics, chemistry and biology affecting the atmosphere. When I look back to when I started, the supercomputers we were using were the equivalent of the calculators we have today, and certainly a lot less powerful than the MacBook Pro I have sitting on my desk! We are now able to include all the relevant processes in much more complete and complex ways, but as we go through the huge shift to parallel computing we have had to essentially rewrite our codes. In addition, we are trying to figure out how to reach petascale computing capabilities, and we are already thinking about exascale and how to apply climate modelling when that generation of computers becomes available. We have to consider what the issues are going to be and how we are going to confront them. The way computing has developed over the years has totally changed the way we do modelling, but at the same time it has allowed us to do the science much more completely and effectively.

The most complete codes we use now are Earth System models. They were previously known as global climate models, and before that as general circulation models (both use the acronym GCM), because they primarily represented the atmosphere but not the chemistry, the interactions between the biosphere, land and atmosphere, or the interactions with the oceans. Today's models include all of that information, and as they increase in resolution they allow us to study climate change in much greater depth. As computing systems change, we try
to port the codes over, but there are always problems and it is a very time-consuming process. Ultimately, we are paid to do science, not development work, so we have to overcome those issues as quickly as possible. That said, one member of my team is ostensibly a chemist but in practical terms he is a computer scientist who keeps our codes running on all the platforms. These are very complex codes that take a lot of computing resources, and we are continually looking for new computers to run on because we never seem to have enough computer time. We are continually porting codes to new machines, running into parallelisation problems as we do so, and having to overcome them. Another member of my team spends half his time dealing with those aspects. I do think that, as we look at the new types of
dynamical cores, which essentially means changing the entire grid structure of the models so that we can now go to petascale, we are facing another learning experience that will allow us to do things we couldn't do before. The process enables us to keep improving the physics and the models, and eventually the models will need to undergo a shift of their own so that they include human interactions. I believe that is where the future of Earth System models lies.
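The parallelisation problems that come up when porting these codes are easiest to see in the communication pattern at the heart of any gridded model: each processor owns a slab of the global grid and has to exchange boundary rows with its neighbours every timestep. The sketch below is purely illustrative and is not code from any actual climate model; it assumes mpi4py and NumPy, a toy latitude-longitude grid, and a processor count that divides the grid evenly.

```python
# Purely illustrative halo-exchange sketch (not code from any real climate model).
# Assumes mpi4py and NumPy; each MPI rank owns a horizontal band of a toy
# latitude-longitude grid and swaps edge rows with its neighbours.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, nprocs = comm.Get_rank(), comm.Get_size()

nlat_global, nlon = 180, 360            # toy global grid
nlat_local = nlat_global // nprocs      # assumes nprocs divides nlat_global

# Local slab with one extra "halo" row on each side for neighbours' data.
field = np.full((nlat_local + 2, nlon), float(rank))

north = rank + 1 if rank + 1 < nprocs else MPI.PROC_NULL
south = rank - 1 if rank - 1 >= 0 else MPI.PROC_NULL

# Swap edge rows so calculations near the slab boundary see valid neighbour
# data; getting this right on every new machine and every new grid layout is
# a large part of the porting work described above.
send_n = np.ascontiguousarray(field[-2, :]); recv_s = np.empty(nlon)
send_s = np.ascontiguousarray(field[1, :]);  recv_n = np.empty(nlon)
comm.Sendrecv(send_n, dest=north, recvbuf=recv_s, source=south)
comm.Sendrecv(send_s, dest=south, recvbuf=recv_n, source=north)
if south != MPI.PROC_NULL:
    field[0, :] = recv_s
if north != MPI.PROC_NULL:
    field[-1, :] = recv_n

print(f"rank {rank}: halo rows now {field[0, 0]:.0f} (south) and {field[-1, 0]:.0f} (north)")
```

Run under an MPI launcher, for example `mpiexec -n 4 python halo_sketch.py` (the file name is hypothetical). Changing the decomposition or the grid layout, as a new dynamical core does, means reworking exactly this kind of bookkeeping on every platform.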
Gil Compo, research scientist, Climate Diagnostics Center CIRES at the University of Colorado, USA
The work we do is considerably different from most other climate modelling. Usually in climate modelling you are trying to simulate what happened
during a specific historical period, or predict what might happen in the future. What we are trying to do is recreate what actually happened, every six hours, around the globe. We implemented an advanced technique called the ensemble Kalman filter, and what that allows us to do is combine a series of model-based guesses as to what's happening at any particular time with observations of what did happen. The observations that we use in particular
are surface and sea-level pressure. The reason is that, if you think about what our meteorological observing system looks like right now, we have data from satellites, weather balloons, aeroplanes, ships, buoys and land stations. More than 100 years ago many of those technologies had not yet been invented, so we had to think about the kinds of data we would actually be able to gather going back to the 19th century. The barometer has proven to be a fairly consistent observation tool. We run a numerical weather prediction
model similar to what's used to make weather forecasts, and every six hours of reconstructed time we make an ensemble of guesses. We run 56 equally plausible guesses as to what was happening at a particular time, and then modify those guesses using the ensemble Kalman filter algorithm before combining them with the available observations. We use a window of observations: three hours before and up to 2.95 hours afterwards. The observations are combined with the 56 guesses to have
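To make the method concrete, here is a minimal sketch of a single ensemble Kalman filter analysis step in NumPy. Only the 56-member ensemble and the idea of pressure-like point observations come from the description above; the state size, observation operator, error statistics and random inputs are invented purely for illustration, and the perturbed-observation form shown is just one common variant of the filter, not necessarily the one used in this reanalysis.

```python
# Minimal, illustrative ensemble Kalman filter analysis step (NumPy only).
# Toy sizes and error values are invented; only the 56-member ensemble
# follows the description in the text.
import numpy as np

rng = np.random.default_rng(0)

n_ens = 56        # 56 equally plausible model "guesses"
n_state = 500     # toy state vector (a real model state is vastly larger)
n_obs = 40        # toy number of pressure observations in the 6-hour window

# Forecast ensemble: one column per member (here just random numbers).
X = rng.normal(size=(n_state, n_ens))

# Observation operator: each observation simply reads one state element.
obs_idx = rng.choice(n_state, size=n_obs, replace=False)
def H(x):
    return x[obs_idx]

y = rng.normal(size=n_obs)           # the observed pressure values
obs_err = 1.0                        # assumed observation-error std. dev.
R = (obs_err ** 2) * np.eye(n_obs)   # observation-error covariance

# Sample covariances estimated from the ensemble perturbations.
Xp = X - X.mean(axis=1, keepdims=True)                     # state perturbations
HX = np.column_stack([H(X[:, i]) for i in range(n_ens)])   # obs-space ensemble
Yp = HX - HX.mean(axis=1, keepdims=True)

Pxy = Xp @ Yp.T / (n_ens - 1)        # state/observation cross-covariance
Pyy = Yp @ Yp.T / (n_ens - 1) + R    # innovation covariance
K = np.linalg.solve(Pyy, Pxy.T).T    # Kalman gain, K = Pxy @ inv(Pyy)

# Perturbed-observation update: nudge every member toward the observations,
# weighted by the relative uncertainty of forecast and observations.
Xa = np.empty_like(X)
for i in range(n_ens):
    y_pert = y + rng.normal(scale=obs_err, size=n_obs)
    Xa[:, i] = X[:, i] + K @ (y_pert - H(X[:, i]))

spread = lambda E: (E - E.mean(axis=1, keepdims=True)).std()
print("prior spread:", spread(X), " analysis spread:", spread(Xa))
```

Each member is pulled toward the observations by the Kalman gain, which weights the ensemble spread against the assumed observation error; repeating this every six hours of reconstructed time, with a fresh model forecast in between, is the cycling described above.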