Thursday, November 29, 2007

My reply to "What's Wrong With Warm Weather"
from the blog
A Few Things Ill Considered

This blog entry tells us that what's bad about global warming is not the final temperature but the rate of change of temperature. If the temperature changes too fast, then all sorts of bad things happen.

This is essentially a truism. Anything that is too fast is not good; otherwise it wouldn't be "too" fast. It's pretty much impossible to argue with a truism, since it is inherently true, but the problem with truisms is that they also don't tell you anything of value.

The blog could have escaped its truism label if it had told us the point at which climate change becomes too fast.

I, on the other hand, will make the conjecture that any climate change rate that has occurred in the last 20,000 years on this planet is a safe climate change rate. We can see in their graph (shown below) that the climate changes quite quickly and frequently. So if our current climate were to change with the same rate and frequency, there should be no concern, or do you presume that prehistoric man was altering the climate even then?

So let's look at a particular peak in the graph around 8000 years ago. If you look closely, you'll see that the rise in temperature is about 3 degrees Celsius. I blew up that peak and superimposed a ruler to show you just how wide (how long it lasted) the peak is here:

Since each tick mark on the grey line represents 2000 years, I used a ruler that breaks that interval into 20 sections. This makes each of my ticks represent 100 years, or one century. As you can see, the rising edge of the highest peak lasted one century. This equates to a 3 degree rise per century. So according to our planet, a 3 deg/century change is within expected operating parameters. It's not for us to decide whether this rate of change is good or bad. All we can say is that it has happened before, so it is not out of the ordinary to see it happening again.
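The ruler arithmetic can be written out as a quick sketch (the 3 degree rise and the 2000-year tick spacing are the values I read off the graph, not measured data):

```python
# Sketch of the ruler arithmetic; inputs are values read off the graph.
tick_years = 2000                 # one tick mark on the grey line
ruler_sections = 20               # my ruler splits each tick into 20
years_per_section = tick_years / ruler_sections
print(years_per_section)          # 100.0 -> each ruler tick is one century

temp_rise = 3.0                   # degrees C, read from the peak
rise_centuries = 1                # rising edge spans one ruler tick
rate = temp_rise / rise_centuries
print(rate)                       # 3.0 degrees C per century
```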

Now let's compare this to what's happening today. The IPCC's 4th report states that "Warming in the last 100 years has caused about a 0.74 °C increase in global average temperature." Then, in its projection for the next 100 years, the IPCC states that the temperature rise will be from 1.8 °C to 4.0 °C based on its "low" and "high" estimates. Averaging the high and low estimates gives you 2.9 °C/century, which is still within the norm. If you don't like my averaging methodology, then consider this: the IPCC lists 6 scenario families and a temperature rise prediction for each. Of the 6, only 2 have predicted temperatures over 2.9 °C. Couple this with the fact that world oil supplies are dwindling and that we could run out in less than 100 years, and the worst scenarios like A1FI (4.0 °C) become increasingly less likely to occur (we won't really run out of oil; it'll just not be cost effective anymore).

Now let's look at the period 12,000-10,000 years ago. Doesn't it seem strange that the temperature rise occurs before the CO2 rise? Here's a closeup with some vertical lines added to help you align everything.
Notice how the temperature increase starts even before any CO2 increase. Also, the temperature peaks before the CO2 peak. If there is a cause-and-effect relationship between temperature and CO2, it must be concluded that temperature change causes CO2 change and not the other way around. This also means that there is no historical evidence that CO2 change has caused temperature change on this planet (I'm not saying that it cannot, just that it hasn't been shown to us so far in the past 20,000 years).

The proponents of anthropogenic global warming base their argument on two main points. First, historical measurements tell us that CO2 causes global climate change. Second, computer models show us how our temperature will rise if we don't control our carbon emissions. These models are validated against historical data to give them credibility. The problem I see is that if the historical data doesn't show that CO2 causes climate change, then how can the models show it?