I don’t write a lot about climate change, but every once in a while something catches my eye. Over on Watts Up With That, a post apparently written by Julius Sanks asks the question “Can we puny humans produce enough energy to melt the huge icecap?” Before explaining why he’s wrong — or rather, why that’s the wrong question — I want to take a detour to another Watts Up With That post that Sanks praises at the beginning of his post. Along the way, I’m going to try to reproduce some of the math in both posts.
The post Sanks likes is written by Anthony Watts himself and is titled “Getting ‘Cooked’ by Hiroshima Atomic Bomb Global Warming.” In it, Watts complains about the claim by John Cook that
our planet has been building up heat at the rate of about four Hiroshima bombs every second
I kind of agree with some of Watts’s criticism. Describing global warming in terms of nuclear bombs can be unnecessarily sensationalist. Nuclear weapons concentrate all their energy in a spectacular blast, whereas weather and climate events spread out their effects over millions of square miles. On that scale, nuclear bombs are not very significant.
The standard unit of energy is the Joule (J), but the yield of nuclear bombs is usually expressed in terms of the number of tons of trinitrotoluene (TNT) explosive that would be needed to give off the same amount of energy. That’s not a very precise measurement, but the conventional assumption is that 1 ton of TNT releases 4.184 billion Joules (4.184×10^9 J) of energy. The bomb dropped on Hiroshima was a 15 kiloton weapon, equivalent to 15,000 tons of TNT, so it released 62,760 billion Joules (62.76×10^12 J). Four of them would release about 251,000 billion Joules (251×10^12 J).
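If you want to check that arithmetic, here’s a quick back-of-the-envelope version in Python, using only the figures quoted above:

```python
# Sanity check of the TNT / Hiroshima figures quoted above.
TON_TNT_J = 4.184e9                   # Joules released per ton of TNT
hiroshima_J = 15_000 * TON_TNT_J      # 15 kiloton weapon
print(f"One Hiroshima bomb:      {hiroshima_J:.3e} J")      # ~6.276e13 J
print(f"Four bombs (per second): {4 * hiroshima_J:.3e} J")  # ~2.51e14 J
```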
Those numbers are big, but so is our planet, and lots of big things happen on it. For example, a fully-formed hurricane releases about 600,000 billion Joules (600×10^12 J) per second, which is pretty close to ten Hiroshima bombs every second. In a single day, a hurricane will release 52 billion billion Joules (52×10^18 J), the equivalent of 800,000 bombs. (Fortunately, only about 1/4 of 1 percent of that energy is used to drive winds. The rest is used to evaporate water from the oceans and turn it into clouds and rain.)
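The hurricane comparison works the same way; here’s a small sketch using the roughly 600×10^12 J/s figure above:

```python
# Compare a hurricane's energy release rate to Hiroshima-bomb units.
hiroshima_J = 15_000 * 4.184e9      # ~6.28e13 J per bomb
hurricane_J_per_s = 600e12          # total energy release rate quoted above
print(hurricane_J_per_s / hiroshima_J)            # ~9.6 bombs per second
print(hurricane_J_per_s * 86_400 / hiroshima_J)   # ~826,000 bombs in one day
```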
Watts’s post uses a different point of comparison by calculating how much energy is in sunlight. The Earth’s radius is 6378 km, which means that the Earth’s cross-sectional area (given by πr²) is 128 million square kilometers. Solar flux, the density of energy reaching the Earth, is 1361 Joules per second per square meter, so multiplying it out, the Earth intercepts 174,000,000 billion Joules per second (174×10^15 J/s) from the sun. That works out to about 41 megatons, or about 2770 Hiroshima bombs, every second.
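Here’s the same solar calculation as a short script, plugging in the radius and flux values from Watts’s post:

```python
import math

# Energy intercepted from the Sun by the Earth's cross-section.
EARTH_RADIUS_M = 6378e3            # equatorial radius in meters
SOLAR_FLUX = 1361                  # J/s per square meter
cross_section = math.pi * EARTH_RADIUS_M**2      # ~1.28e14 m^2
solar_input = cross_section * SOLAR_FLUX         # J/s reaching Earth
print(f"{solar_input:.3e} J/s")                  # ~1.74e17 J/s
print(solar_input / (1e6 * 4.184e9))             # ~41.6 megatons per second
print(solar_input / (15_000 * 4.184e9))          # ~2770 Hiroshima bombs per second
```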
From this, Watts concludes that global warming energy is negligible:
Gosh, a thousand Hiroshima bombs exploding on this planet every second? How frightening! With that sort of threat, one wonders why Obama isn’t going to announce taxing the sun into submission next Tuesday.
These calculation just go to illustrate that in the grand scheme of things, not only is the global energy associated with global warming small, it isn’t even within the bounds of measurement certainty.
(Watts gets a figure of only 1000 bombs because he uses ballpark estimates for solar flux, the bomb’s energy, and frickin’ pi, and all the errors are in the same direction.)
This is kind of a silly argument. A lot of natural processes take a very long time. Water washing over rocks erodes only a tiny amount of material every day, but let it go on long enough, and the Colorado river can carve out the Grand Canyon.
Sanks’s post ostensibly addresses the issue of whether humans could possibly produce enough energy to melt the Antarctic ice cap, as is predicted to occur if the Earth’s climate gets warm enough. I’ll go through the math before explaining why this is a silly thing to talk about.
It takes about 334 Joules to melt a single gram of ice, and according to the National Snow & Ice Data Center, the Antarctic ice cap contains about 30 million cubic kilometers of ice. Given that 1 cubic centimeter of ice ideally weighs 0.9169 grams, that’s 27.5 billion billion kilograms of ice. Multiplying by 334 J/g lets us figure out that melting the Antarctic ice cap would take 9,200,000 billion billion Joules (9.2×10^24 J).
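Again, the arithmetic is easy to check; this sketch uses the NSIDC volume along with the density and latent-heat figures above:

```python
# Energy required just to melt the Antarctic ice cap.
ICE_VOLUME_KM3 = 30e6            # cubic kilometers of ice (NSIDC figure)
ICE_DENSITY = 0.9169             # grams per cubic centimeter
LATENT_HEAT = 334                # Joules to melt one gram of ice
grams_of_ice = ICE_VOLUME_KM3 * 1e15 * ICE_DENSITY        # 1 km^3 = 1e15 cm^3
print(f"Mass of ice: {grams_of_ice / 1000:.2e} kg")        # ~2.75e19 kg
print(f"Melt energy: {grams_of_ice * LATENT_HEAT:.2e} J")  # ~9.2e24 J
```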
Sanks then tries to estimate how long it would take to melt the Antarctic ice cap if we used all of the energy produced by human civilization to do it. Using an energy production figure of 89 billion billion Joules per year (8.906×10^19 J/yr) from the U.S. Energy Information Administration, he estimates that it would take about 103,000 years.
Unfortunately, Sanks is using the wrong numbers. As you might guess from the name, the U.S. Energy Information Administration only reports U.S. figures. A better figure for the world (for 2015) is 146,000 terawatt-hours, which is 526 billion billion Joules per year (5.256×10^20 J/yr). At that rate, it would take 17,500 years to melt the Antarctic ice cap.
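Dividing the melt energy by the two annual-production figures reproduces both estimates:

```python
# Years to melt the ice cap at U.S.-only vs. worldwide energy production.
MELT_ENERGY = 9.2e24                     # J, from the previous calculation
US_ENERGY_PER_YR = 8.906e19              # J/yr, the EIA figure Sanks used
WORLD_ENERGY_PER_YR = 146_000 * 3.6e15   # 146,000 TWh is about 5.256e20 J/yr
print(MELT_ENERGY / US_ENERGY_PER_YR)      # ~103,000 years
print(MELT_ENERGY / WORLD_ENERGY_PER_YR)   # ~17,500 years
```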
It’s actually a little worse than that. Ice only melts when it reaches its melting temperature, and the Antarctic ice sheet is actually at about -49 degrees C, some 49 degrees below its melting point (according to Sanks), so some energy will be absorbed just raising it to the melting point. The amount of energy needed to raise one gram of ice by 1 degree C is 1.865 Joules. Re-running the math, it would take 22,300 years to melt the ice cap.
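Folding in the warm-up energy (using Sanks’s -49 degrees C and the 1.865 J per gram per degree figure) gives the longer estimate:

```python
# Add the energy needed to warm the ice from -49 C up to its melting point.
grams_of_ice = 2.75e22           # from the mass calculation above
SPECIFIC_HEAT_ICE = 1.865        # J per gram per degree C
warm_energy = grams_of_ice * SPECIFIC_HEAT_ICE * 49   # ~2.5e24 J
total_energy = warm_energy + 9.2e24                   # warming plus melting
print(total_energy / 5.256e20)   # ~22,300 years at world energy production
```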
Sanks thinks the conclusion is pretty clear:
My goal here is to show the enormous energy levels involved and how ridiculous it is to blame humans for any significant ice melt.
That’s just plain dishonest. Climatologists aren’t claiming that human energy production is directly raising the temperature of the planet. The principal global warming claim is that a variety of human activities are changing the composition of the Earth’s atmospheric gases in ways that change the way the Earth retains heat, leading to a slow temperature increase. As it happens, the combustion products of fossil fuels are the main source of the “greenhouse” gases that are causing the change, but it’s not the fossil fuel energy itself that is raising the Earth’s temperature.
The weird thing is that Watts’s post (referred to by Sanks) actually sets the scene for how this works. As he explains, the amount of energy reaching the Earth from the Sun is a whopping 174,000,000 billion Joules — 2770 Hiroshima bombs — per second. That’s about 10,000 times as much energy as our entire civilization produces. If all of that energy were focused on the Antarctic ice cap, it would melt away in a little over two years.
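Both of those claims are easy to verify with the numbers already worked out (the seconds-per-year constant is the only value not taken from the posts):

```python
# Solar input vs. human energy production, and the "two years" figure.
solar_input = 1.74e17            # J/s intercepted from the Sun
world_energy_per_yr = 5.256e20   # J/yr produced by human civilization
total_ice_energy = 1.17e25       # J to warm and melt the Antarctic ice cap
SECONDS_PER_YEAR = 3.156e7
print(solar_input * SECONDS_PER_YEAR / world_energy_per_yr)  # ~10,000x civilization
print(total_ice_energy / solar_input / SECONDS_PER_YEAR)     # ~2.1 years
```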
That’s not going to happen of course, and no climate scientist is predicting that it will. But here’s the thing: As Watts emphasizes, this is a tremendous amount of energy — so much that even a really small change in how it is absorbed could cause big changes to the Earth’s climate.
Early climate models were built around the assumption that everything was in equilibrium: The weather may be chaotic from moment to moment and vary from place to place, but over reasonable amounts of time, the planet-wide climate as a whole did not change. This constraint was a good enough approximation for scientists to build useful models of weather and climate.
For that constraint to hold, there would have to be no net energy change over the entire planet: If 174 petajoules of sunlight strike the planet every second, then the planet has to radiate away 174 petajoules of energy to space every second. This doesn’t have to be true from second to second, but it has to hold over periods of years or decades.
If the energy flows in and out are not exactly balanced, the Earth will experience temperature changes. So if changes to the Earth’s atmosphere reduce the amount of energy radiated to space, the Earth will accumulate energy, and that energy will cause the Earth’s temperature to increase. The temperature will continue increasing until the energy equilibrium is restored. Fortunately, all other things being equal, the warmer an object gets, the more energy it radiates, so as the Earth warms, it will begin to radiate more energy. Eventually, it will reach a temperature where it radiates as much energy as it did before, at which point the balance of energy flows will be returned, and the Earth’s temperature will stabilize at the new value.
There are, of course, a lot of complexities. Some solar energy is reflected away without ever being absorbed, there are complex energy exchanges between the land, the seas, and the atmosphere, and there are other sources of energy besides the Sun, such as nuclear decay of radioactive elements deep inside the planet. But the basic idea still holds: If the energy flows are not balanced, the Earth will experience temperature changes.
And given the enormous quantity of energy involved, over time spans of decades or centuries, it should not surprise us that even slight perturbations to the Earth’s energy handling mechanisms could cause devastating climate changes.
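To put a rough number on “slight perturbation”: Cook’s four Hiroshima bombs per second is a small fraction of the solar input, but it keeps adding up. This sketch just compares the two rates and accumulates the smaller one over a century:

```python
# How big is "four Hiroshimas per second" relative to the solar input,
# and how much energy does it add up to over a century?
solar_input = 1.74e17                 # J/s intercepted from the Sun
imbalance = 4 * 15_000 * 4.184e9      # J/s, four Hiroshima bombs per second
print(imbalance / solar_input)        # ~0.0014, about 0.14% of the solar input
print(imbalance * 3.156e7 * 100)      # ~7.9e23 J accumulated over a century
```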