I have about 10 years’ experience in the IT world, but even before I entered that line of work, I knew the maxim most people have heard about computers: garbage in, garbage out.
In other words, if you put garbage in as your data, you’re going to get garbage out as your results. This is also true if your program itself is garbage.
If you weren’t busy guzzling global-warming Kool-Aid, you might have noticed a few years ago when the famous “hockey stick” temperature graph (produced by Michael Mann and popularized by Al Gore) was exposed for the fraud it was.
From Technology Review in 2004:
But now a shock: Canadian scientists Stephen McIntyre and Ross McKitrick have uncovered a fundamental mathematical flaw in the computer program that was used to produce the hockey stick. In his original publications of the stick, Mann purported to use a standard method known as principal component analysis, or PCA, to find the dominant features in a set of more than 70 different climate records.
But it wasn’t so. McIntyre and McKitrick obtained part of the program that Mann used, and they found serious problems. Not only does the program not do conventional PCA, but it handles data normalization in a way that can only be described as mistaken.
Now comes the real shocker. This improper normalization procedure tends to emphasize any data that do have the hockey stick shape, and to suppress all data that do not. To demonstrate this effect, McIntyre and McKitrick created some meaningless test data that had, on average, no trends. This method of generating random data is called Monte Carlo analysis, after the famous casino, and it is widely used in statistical analysis to test procedures. When McIntyre and McKitrick fed these random data into the Mann procedure, out popped a hockey stick shape!
Richard Muller, the article’s author, goes on to provide a more technical explanation as well.
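The effect Muller describes is easy to reproduce in miniature. Below is a toy Monte Carlo sketch of my own, not Mann’s actual code: the series count, lengths, and the “hockey stick index” metric are choices I made for illustration. It centers each trendless random walk on only its closing segment (the kind of improper normalization McIntyre and McKitrick identified) and compares the resulting first principal component against conventional full-series centering:

```python
import numpy as np

def pc1(X, center_rows):
    # Center each series (column) on the mean of the chosen rows,
    # then take the leading principal component via SVD.
    Xc = X - X[center_rows].mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return U[:, 0] * S[0]  # PC1 as a time series

def hockey_stick_index(series, blade=40):
    # Offset between the closing "blade" and the earlier "shaft",
    # measured in standard deviations of the whole series.
    return abs(series[-blade:].mean() - series[:-blade].mean()) / series.std()

rng = np.random.default_rng(42)
T, N, TRIALS = 200, 70, 30
full_hsi, short_hsi = [], []
for _ in range(TRIALS):
    X = rng.standard_normal((T, N)).cumsum(axis=0)  # 70 trendless random walks
    full_hsi.append(hockey_stick_index(pc1(X, slice(0, T))))        # conventional centering
    short_hsi.append(hockey_stick_index(pc1(X, slice(T - 40, T))))  # "short" centering

print(f"mean hockey-stick index, full centering:  {np.mean(full_hsi):.2f}")
print(f"mean hockey-stick index, short centering: {np.mean(short_hsi):.2f}")
```

Run this and the short-centered PC1 consistently shows a far larger shaft-to-blade offset than the conventionally centered one — in other words, feed in random, trendless data and out pops a hockey stick shape.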
In what seems like an uncanny replay of environmentalist fiddling with reality, Marc Sheppard writes at the American Thinker about the source code of the climate model the Climatic Research Unit used to try to fool us into believing in anthropogenic global warming.
While a great deal of the program code seems normal and legitimate,
many others fall into the precarious range from highly questionable (removing MXD data that demonstrate poor correlations with local temperature) to downright fraudulent (replacing MXD data entirely with measured data to reverse a disorderly trend-line).
In fact, workarounds for the post-1960 “divergence problem,” as described by both RealClimate and Climate Audit, can be found throughout the source code. So much so that perhaps the most ubiquitous programmer’s comment (REM) I ran across warns that the particular module “Uses ‘corrected’ MXD – but shouldn’t usually plot past 1960 because these will be artificially adjusted to look closer to the real temperatures.”
What exactly is meant by “corrected” MXD, you ask? Outstanding question — and the answer appears amorphous from program to program. Indeed, while some employ one or two of the aforementioned “corrections,” others throw everything but the kitchen sink at the raw data prior to output.
For instance, in the subfolder “osborn-tree6\mann\oldprog,” there’s a program (Calibrate_mxd.pro) that calibrates the MXD data against available local instrumental summer (growing season) temperatures between 1911-1990, then merges that data into a new file. That file is then digested and further modified by another program (Pl_calibmxd1.pro), which creates calibration statistics for the MXD against the stored temperature and “estimates” (infills) figures where such temperature readings were not available. The file created by that program is modified once again by Pl_Decline.pro, which “corrects it” – as described by the author — by “identifying” and “artificially” removing “the decline.”
But oddly enough, the series doesn’t begin its “decline adjustment” in 1960 — the supposed year of the enigmatic “divergence.” In fact, all data between 1930 and 1994 are subject to “correction.”
The descriptions in Sheppard’s piece go on to become pretty technical, but even the code comments (statements left in program code that are ignored when it runs but serve as notes to whichever programmer reads it later) show a clear “fudge factor” in the program.
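To make the “fudge factor” idea concrete, here is a hypothetical sketch in the spirit of the adjustment Sheppard describes — not the actual CRU code (which was written in IDL); the years, knot points, and offset values below are all invented for illustration. An additive “correction” series is interpolated across the years and simply added to the proxy data, erasing a post-1960 decline:

```python
import numpy as np

# Toy proxy series with a post-1960 decline (all values invented).
years = np.arange(1930, 1995)
mxd = -0.01 * np.clip(years - 1960, 0, None)

# Hand-tuned additive "correction": zero through 1960, then offsets chosen
# to cancel the decline. This is the essence of a fudge factor: the
# adjustment encodes the answer the programmer wants to see.
knot_years = np.array([1930, 1960, 1975, 1994])
knot_vals  = np.array([0.00, 0.00, 0.15, 0.34])
corrected = mxd + np.interp(years, knot_years, knot_vals)

print(f"raw 1994 value:       {mxd[-1]:+.2f}")
print(f"corrected 1994 value: {corrected[-1]:+.2f}")
```

Nothing about the underlying measurements changed; the “decline” disappears only because the adjustment was chosen to make it disappear.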
It seems the approach of the Apostles of Global Warming is what many of us have suspected for years: where there is no data, make it up; where there is contradictory data, massage it out or ignore it; if the data happens to fit the hypothesis, by all means use it.
And the socialists in Washington D.C. expect us to sit idly and quietly by as they slam us and our economy with a crippling cap and trade global warming tax, based on the silly and fraud-ridden notion of anthropogenic global warming?
Dream on! Maybe if we were Euro-sheep, but red-blooded Americans are anything but Euro-sheep. We have a 233-year history of fighting against government oppression and heavy-handedness.
Yet the U.S. Senate hopes to pass the cap and trade global warming tax that the U.S. House already passed in June, and President Obama is already packing his socks to head for Copenhagen to chain us to a UN Climate Change treaty.
We threw off an oppressive despot in 1776, and it’s time to do it again, my fellow Americans. Only this time we have the legal means (ensured by the U.S. Constitution) to do legally and electorally what we had to do on the battlefield then. So let’s get those letters to the editor, emails and phone calls to our representatives, and efforts to recruit and support limited-government candidates rolling, people!
We have a heritage to preserve!
Meanwhile, the White House continues to sail down that famous river in Egypt.