Here's an admirably succinct summary of how climate change alarmists are 'cooking the books', falsifying data to support their claims (and, naturally, to demand ever more money and resources to continue their spurious 'research'). I've inserted links to information about relevant organizations, individuals and data, for the benefit of those who may not be familiar with the field.
How have we come to be told that global temperatures have suddenly taken a great leap upwards to their highest level in 1,000 years? In fact, that rise has been no greater than the upward leaps between 1860 and 1880, and between 1910 and 1940, both part of the gradual natural warming since the world emerged from its centuries-long “Little Ice Age” around 200 years ago.
This belief has rested entirely on five official data records. Three of these are based on measurements taken on the Earth’s surface, versions of which are then compiled by Giss [NASA's Goddard Institute for Space Studies], by the US National Oceanic and Atmospheric Administration (NOAA) and by the University of East Anglia’s Climatic Research Unit working with the Hadley Centre for Climate Prediction, part of the UK Met Office. The other two records are derived from measurements made by satellites, and then compiled by Remote Sensing Systems (RSS) in California and the University of Alabama, Huntsville (UAH).
In recent years, these two very different ways of measuring global temperature have increasingly been showing quite different results. The surface-based records have shown a rising temperature trend, culminating in 2014 being declared “the hottest year since records began”. RSS and UAH have, meanwhile, for 18 years recorded no rise in the trend, with 2014 ranking as low as only the sixth warmest year since 1997.
One surprise is that the three surface records, all run by passionate believers in man-made warming, in fact derive most of their land surface data from a single source. This is the Global Historical Climate Network (GHCN), managed by the US National Climate Data Center under NOAA, which in turn comes under the US Department of Commerce.
But two aspects of this system for measuring surface temperatures have long been worrying a growing array of statisticians, meteorologists and expert science bloggers. One is that the supposedly worldwide network of stations from which GHCN draws its data is flawed: 80 per cent or more of the Earth’s surface is not reliably covered at all. Furthermore, around 1990, the number of stations more than halved, from 12,000 to fewer than 6,000 – and most of those remaining are concentrated in urban areas or places where studies have shown that, thanks to the “urban heat island effect”, readings can be up to 2 degrees higher than in the rural areas where thousands of stations were lost.
To fill in the huge gaps, those compiling the records have resorted to computerised “infilling” or “homogenising”, whereby the higher temperatures recorded by the remaining stations are projected out to vast surrounding areas (Giss allows single stations to give a reading covering 1.6 million square miles). This alone contributed to the sharp temperature rise shown in the years after 1990.
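The arithmetic behind that claim can be seen in a toy sketch. This is not any agency's actual algorithm, and all the station numbers below are hypothetical; it simply shows how, if the stations left reporting in a grid cell happen to run warm (say, urban ones), infilling the cell from them alone pulls the cell average up:

```python
# Toy illustration (hypothetical numbers, not NOAA's or GISS's actual
# method): a grid cell's average is computed from whichever stations
# still report, so losing rural stations spreads any urban warm bias
# across the whole cell.

def cell_mean(readings):
    """Plain average of the stations reporting in one grid cell."""
    return sum(readings) / len(readings)

# Before 1990: four rural stations plus one urban station whose
# anomaly (deg C) runs about 2 degrees warm.
before_1990 = [0.1, 0.0, 0.2, 0.1, 2.1]

# After 1990: the rural stations have closed, and the lone urban
# reading is "infilled" over the entire cell.
after_1990 = [2.1]

print(round(cell_mean(before_1990), 2))  # 0.5
print(round(cell_mean(after_1990), 2))   # 2.1
```

The cell's apparent anomaly jumps from 0.5 to 2.1 degrees without any station's thermometer reading changing.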
But still more worrying has been the evidence that even this data has then been subjected to continual “adjustments”, invariably in only one direction. Earlier temperatures are adjusted downwards, more recent temperatures upwards, thus giving the impression that they have risen much more sharply than was shown by the original data.
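The effect of such one-directional adjustments on a trend is easy to demonstrate with made-up numbers. The sketch below is purely illustrative (the readings and adjustment values are invented, not taken from any real record); it shows that nudging early readings down and late readings up turns an essentially flat series into a clear warming trend:

```python
# Toy illustration with hypothetical numbers: adjusting older
# temperatures downwards and recent ones upwards steepens the
# fitted trend even when the raw series is flat.

def slope(ys):
    """Least-squares slope of ys against time steps 0..n-1."""
    n = len(ys)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(ys) / n
    num = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
    den = sum((x - x_mean) ** 2 for x in xs)
    return num / den

raw = [15.0, 15.1, 14.9, 15.0, 15.1, 14.9]   # essentially flat (deg C)
adjust = [-0.3, -0.2, -0.1, 0.1, 0.2, 0.3]   # early down, late up
adjusted = [r + a for r, a in zip(raw, adjust)]

print(round(slope(raw), 3))       # -0.011 (no real trend)
print(round(slope(adjusted), 3))  # 0.114  (apparent warming)
```

The underlying measurements are unchanged; only the adjustments manufacture the slope.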
An early glaring instance of this was spotted by Steve McIntyre, the statistician who exposed the computer trickery behind that famous “hockey stick” graph, beloved by the IPCC [Intergovernmental Panel on Climate Change], which purported to show that, contrary to previous evidence, 1998 had been the hottest year in 1,000 years. It was McIntyre who, in 2007, uncovered the wholesale retrospective adjustments made to US surface records between 1920 and 1999 compiled by Giss (then run by the outspoken climate activist James Hansen). These reversed an overall cooling trend into an 80-year upward trend – even though Hansen himself had previously accepted that the “dust bowl” decade of the 1930s was the hottest US decade of the entire 20th century.
Assiduous researchers have since unearthed countless similar examples across the world, from the US and Russia to Australia and New Zealand. In Australia, an 80-year cooling of 1 degree per century was turned into a warming trend of 2.3 degrees. In New Zealand, there was a major academic row when “unadjusted” data showing no trend between 1850 and 1998 was shown to have been “adjusted” to give a warming trend of 0.9 degrees per century. This falsified new version was naturally cited in an IPCC report (see “New Zealand NIWA temperature train wreck” on the Watts Up With That science blog, WUWT, which has played a leading role in exposing such fiddling of the figures).
By far the most comprehensive account of this wholesale corruption of proper science is a paper written for the Science and Public Policy Institute, “Surface Temperature Records: Policy-Driven Deception?”, by two veteran US meteorologists, Joseph D’Aleo and WUWT’s Anthony Watts (and if warmists are tempted to comment below this article online, it would be welcome if they could address their criticisms to the evidence, rather than just resorting to personal attacks on the scientists who, after actually examining the evidence, have come to a view different from their own).
One of the more provocative points arising from the debate over those claims that 2014 was “the hottest year evah” came from the Canadian academic Dr Timothy Ball when, in a recent post on WUWT, he used the evidence of ice-core data to argue that the Earth’s recent temperatures rank in the lowest 3 per cent of all those recorded since the end of the last ice age, 10,000 years ago.
There's more at the link, including graphs. Useful reading.
So much for the 'scientific consensus' that globull warmening is all humanity's fault and is going to kill us all and woe is us . . .