Okay, stick with me here. The IPCC's Summary for Policymakers (PDF) of its report on global warming was issued in February; the full report will be released in May. The IPCC used data from over a dozen climate models in preparing the report. The 2001 report relied heavily on a model developed by the UK's very own Hadley Centre. The policy summary notes the following problems with the climate models used:
- systematic biases in the simulation of the Southern Ocean, which is important for the transfer of heat between the atmosphere and the oceans
- ongoing problems in simulating the El Niño - Southern Oscillation (ENSO) cycle, which is a major factor in the Earth's climate
- poor simulation of precipitation events: "in general, models tend to produce too many days of weak precipitation (<10 mm per day) and too little precipitation overall in intense events (>10 mm per day)"
- substantial uncertainty in the simulation of feedbacks from sea ice, which are coupled with polar cloud formation and the transport of heat through the polar oceans
Given the potential gravity of worst-case global warming scenarios, you cannot wait for perfect data before making policy decisions. But these are not trivial issues.
This should be a hotbed of scientific activity, and we should see lots of research being done on the issues above. In particular, we should see attempts to harness Moore's Law (roughly, that computing power has doubled every 18 months to two years in the recent past) to make climate models more detailed, easier to test, and quicker to run. So what do we find?
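To put some rough numbers on that claim (my own back-of-the-envelope arithmetic, not anything from the IPCC): the previous report came out in 2001 and this one in 2007, so Moore's Law alone should have delivered several doublings of raw computing power in between.

```python
# Back-of-the-envelope only (my arithmetic, not the IPCC's): how many
# Moore's Law doublings fit between the 2001 report and the 2007 report?

years_between_reports = 2007 - 2001

for doubling_period_years in (1.5, 2.0):   # the 18-month to 2-year range above
    doublings = years_between_reports / doubling_period_years
    speedup = 2 ** doublings
    print(f"doubling every {doubling_period_years} years: "
          f"{doublings:.0f} doublings, roughly {speedup:.0f}x the compute")
```

Call it 8x to 16x more compute between reports. That sounds like a lot until you see what finer model grids actually cost.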
Here are some positive indications:
- (I thought this was much more exciting until I noticed the publication date: 2002.) From Science Daily: "Atmospheric scientists from Lawrence Livermore National Laboratory have performed the first global climate simulations with spatial resolutions of roughly 50 km (30 miles). This capability will be used to assess climate change and its societal impacts. Typical global climate simulations use spatial resolutions of about 300 kilometers (186 miles), which limits their ability to simulate climate and climate change on a regional scale." (For a rough sense of what that jump in resolution costs, see the first sketch after this list.)
- This (also from Science Daily, but from 2007) is the type of article I was hoping to find: "Open Source Software Toolkit Plays Key Role In New Climate Simulations." "The Model Coupling Toolkit created by the U.S. Department of Energy's Argonne National Laboratory played a key role in the climate simulations used in preparing the new U.N. report 'Climate Change 2007: The Physical Science Basis.' The Model Coupling Toolkit (MCT) is an open source software library for constructing parallel coupled models from individual parallel models. MCT is designed for high performance and portability. All of the simulations of the Community Climate System Model used the Model Coupling Toolkit." This is good, although the skeptic in me wonders whether any errors in the Model Coupling Toolkit are therefore replicated in all models that use it... (For a toy picture of what a coupler actually does, see the second sketch after this list.)
- University of Arizona researcher Joellen Russell, working with a group from the National Oceanic and Atmospheric Administration's Geophysical Fluid Dynamics Laboratory in Princeton, New Jersey, published a paper in December 2006 called "The Southern Hemisphere Westerlies in a Warming World: Propping Open the Door to the Deep Ocean," which may specifically address some of the IPCC study's self-identified areas of weakness. Although I'm mainly interested in the fact that the research took place within the past five years, the results (that the Southern Ocean can absorb more carbon dioxide and additional heat, perhaps reducing the rate of global warming) are not in strict accordance with what the IPCC is publishing.
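First, about that jump from 300 km to 50 km grids: it's worth sketching why finer resolution is so expensive and why Moore's Law doublings get eaten so quickly. The numbers below are a standard rule of thumb (finer horizontal grid plus a proportionally smaller timestep), not figures from the Livermore work itself.

```python
import math

# Rule-of-thumb cost of refining a climate model grid (my own sketch, not
# figures from the Livermore press release): 6x finer spacing in each
# horizontal direction means ~36x more grid columns, and the stability
# (CFL) limit forces roughly 6x more timesteps per simulated year.

coarse_km, fine_km = 300.0, 50.0
refinement = coarse_km / fine_km               # 6x finer in each direction

grid_factor = refinement ** 2                  # ~36x more horizontal columns
timestep_factor = refinement                   # ~6x more timesteps
cost_factor = grid_factor * timestep_factor    # ~216x the compute, all else equal

doublings = math.log2(cost_factor)             # ~7.8 Moore's Law doublings
print(f"~{cost_factor:.0f}x the compute, or about {doublings:.1f} doublings "
      f"(roughly 12-16 years at one doubling every 18-24 months)")
```

Which is, more or less, why a 50 km global run was press-release material in 2002.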
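Second, since I mentioned the Model Coupling Toolkit: here is a toy sketch of the pattern a coupler supports, with separate component models handing fields to each other through a coupling step instead of calling each other directly. To be clear, this is not MCT's actual API (MCT is a parallel Fortran library); the classes, numbers, and variable names below are made up purely for illustration.

```python
# Toy coupled model: NOT the Model Coupling Toolkit's API, just an
# illustration of components exchanging fields through a coupling step.

class Atmosphere:
    def step(self, sea_surface_temp):
        """Fake one atmospheric step: return a surface heat flux (W/m^2)."""
        return 10.0 * (sea_surface_temp - 15.0)    # toy bulk formula

class Ocean:
    def __init__(self):
        self.sea_surface_temp = 16.0               # degrees C, a toy scalar, not a grid

    def step(self, heat_flux):
        """Fake one ocean step: nudge SST in response to the surface flux."""
        self.sea_surface_temp += 0.001 * heat_flux
        return self.sea_surface_temp

def run_coupled(atmosphere, ocean, n_steps):
    """Hand fields back and forth each coupling interval; a real coupler
    also regrids between the component grids and runs in parallel."""
    sst = ocean.sea_surface_temp
    for _ in range(n_steps):
        flux = atmosphere.step(sst)                # ocean -> atmosphere: SST
        sst = ocean.step(flux)                     # atmosphere -> ocean: heat flux
    return sst

print(run_coupled(Atmosphere(), Ocean(), n_steps=10))
```

The point, and the reason for my skepticism above: the coupler sits in the data path of every exchange, so a bug there touches every coupled model built on top of it.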
I'll do a sequel to this post very soon. I don't like blog posts that go on forever, and besides, breakfast is waiting.