When it came to the greenhouse effect, the models climatologists use as stand-ins for the real world had always confounded researchers. If you enriched a model's hypothetical air with the amount of CO2 humans have been pumping out, you got a much warmer climate--much warmer, in fact, than the one we actually have. Which meant that either some other force was moderating the human influence, or the climate was simply doing its own thing independent of human inputs.
Then some enterprising graduate students plugged another key real-world variable into their model: sulfates, the acid-rain chemicals that are a major by-product of fossil-fuel burning. If CO2 acts to reinforce Earth's greenhouse layer, sulfates are thought to create a "parasol effect." Like veils draped through the air high above, they reflect radiation back into space before it can warm Earth.
As soon as sulfates were accounted for, the models' warming curves snapped into position, overlapping the actual record with eerie accuracy. The globe would warm more at night than during the day, they predicted (and it has); it would stay cooler in places downwind from most coal burning, such as the Northeastern U.S. (right again); the stratosphere would get colder while the air further down heated up (likewise).
It looked, for all intents and purposes, like a climatic "fingerprint": No other force--solar cycles, volcanoes, and the like--could produce as good a match to the real-world record as the CO2-plus-sulfate model. And so the 2,400 IPCC scientists finally put their signatures to a statement as bland as it was remarkable: "The balance of evidence suggests that there is a discernible human influence on global climate."
A predictable firestorm followed. Critics called the IPCC a political body, beholden to world governments--which, apparently, couldn't wait to have their energy policies indicted. There were rumors about key statements being inserted, or removed, surreptitiously. Congress even held hearings to debunk global warming.
And it's true that much remains uncertain about the IPCC report. Some unknown force could be playing hide-and-seek, changing the climate in a way that only looks like human interference. This is a standard scientific disclaimer: Researchers don't know that gravity exists, either; they just know the planet behaves as if it did. "There's not a single experiment that proves human emissions cause climate change," says John Pastor, a University of Minnesota-Duluth researcher who has studied the response of forests to climate change. "For a long time, it also wasn't proven that smoking causes cancer. But there is such a thing as the cumulative weight of evidence."
And new evidence keeps emerging, with crucial studies published almost monthly since the release of the IPCC report. Perhaps the most interesting was the National Oceanic and Atmospheric Administration's disclosure that over the past 80 years, "extreme weather" has become more common around the world--and that the change has occurred in just the way the global-warming models predict. Precipitation, for example, now happens less frequently, in bigger downpours, and in winter more than summer.
The NOAA extreme-weather study team was headed by Robert Quayle, a fairly prominent "climate agnostic." In 1991, he said that it would take at least another four decades to validate the theory of human-induced greenhouse warming. Last year, Quayle told the magazine International Wildlife he had revised that estimate: Chances were 19 out of 20, he said, that what was being observed was indeed global warming. "I'm not particularly agnostic anymore," he said. "There is such a convergence of data it gets to be a little spooky."
Why should we care? That people have changed the climate may be philosophically crucial--you can't, if you let the fact sink in, look at storms or other "acts of God" the same way again--but so far, it doesn't seem to have made much of a tangible difference.
Or has it? That's the trouble with the law of averages. The researchers can tell you that somewhere, there's a river that wouldn't have overflowed without global warming, a drought that would have ended sooner, a storm that wouldn't have been so severe. But they can't and won't say whether it was the Red River of '97 or the Mississippi of '95, the storm that blacked out South Minneapolis last week or the heat wave that killed 500 in Chicago two years ago. Nor will they ever: Climate, it's been said, is what you predict. Weather is what you get.
Every prediction needs a starting point; right now, most of the forecasts assume that between 2050 and 2100, CO2 in the atmosphere will reach a level twice as great as before the Industrial Revolution. This, the IPCC estimates, will send average global temperatures up an additional 2.5 to 6.5 degrees. Most forecasters split the difference and estimate an increase of 4 to 5 degrees, with the warming distributed unevenly around the world. On the models' crude regional scale, the Twin Cities look to get right about the average increase, becoming 4 degrees warmer and somewhat drier. It would, in effect, look more like Omaha around here.
If that seems unremarkable, consider the historical precedent. Through all of the time humans have been recording their stories, there is not a single instance of warming even approaching the level predicted now, and far lesser climate ripples have split societies apart. Egypt's Old Kingdom ended when the Sahara's freshwater lakes dried up after the winds shifted slightly south. The "Little Ice Age," a mere 1-degree cooling from circa 1400 to 1850, sent Europe into a cycle of war, famine, pestilence and mass migration to the New World.