The 1990s were the Arctic’s warmest decade in the past 2,000 years, says a study released in Friday’s edition of Science. The warming -- due to the release of greenhouse gases into the Earth’s atmosphere -- overrode a natural cooling trend that would otherwise have continued.
Scientists used “natural” thermometers -- such as glacial ice cores, tree rings and sediments from lakes -- to calculate the temperatures of the Arctic over the past two millennia. Instruments have been used to measure the actual temperature of the Arctic since the late 1800s.
“This study provides us with a long-term record that reveals how greenhouse gases from human activities are overwhelming the Arctic’s natural climate system,” reports study co-author David Schneider of the National Center for Atmospheric Research (NCAR) in Boulder, Colo.
This “study” has more holes in it than a sieve. See Steve McIntyre’s blog for the full story. A brief summary of some of the bigger problems:
(1) Some of the proxy data -- the lake-sediment series published by Mia Tiljander -- are used *upside down*, so that they show warming where the record actually shows cooling, and vice versa.
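The effect of an inverted proxy can be shown with a toy example (synthetic numbers of my own, not Tiljander's actual data): negating a series reverses every trend, so a cooling record reads as warming.

```python
# Toy illustration of an "upside-down" proxy series.
# The numbers are synthetic, not Tiljander's actual data.

# A proxy calibrated so that FALLING values mean WARMING conditions
# (many sediment proxies have such an inverse calibration).
proxy = [5.0, 4.5, 4.0, 3.5, 3.0]  # values fall over time -> warming


def trend(series):
    """Sign of the overall change: +1 rising, -1 falling, 0 flat."""
    change = series[-1] - series[0]
    return (change > 0) - (change < 0)


# Correct orientation: the proxy values are falling.
print(trend(proxy))    # -1

# Used with the sign flipped, the identical record appears to rise,
# so a warming interval would be read as cooling, and vice versa.
flipped = [-x for x in proxy]
print(trend(flipped))  # +1: same data, opposite apparent trend
```

The data are unchanged in both cases; only the assumed orientation differs, which is why an inverted proxy silently poisons any average it enters.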
(2) The Kaufman et al. (2009) study engages in data mining: using the data that support the authors’ preconceived notion that CO2 emissions warmed the planet in the 20th century, while discarding the data that do not support that theory. As McIntyre explains it:
The most cursory examination of Kaufman et al shows the usual problem of picking proxies ex post: e.g. the exclusion of the Mount Logan ice core and Jellybean Lake sediment series; or the selection of Yamal rather than Polar Urals - a problem that is even pernicious because of the failure to archive "bad" results (e.g. Thompson's Bona-Churchill or Jacoby's "we're paid to tell a story").
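The ex-post screening problem described above can be sketched with a toy simulation (my own illustration of the statistical point, not the Kaufman et al. procedure): generate proxies that are pure noise, keep only those that happen to correlate with a target temperature record, and the average of the survivors will mimic the target even though no proxy contains any climate signal.

```python
# Toy sketch of ex-post proxy screening bias (illustrative only).
import random

random.seed(0)
n_years = 50
n_proxies = 1000

# A synthetic "target" record with an upward 20th-century-style trend.
target = [t / n_years for t in range(n_years)]


def corr(x, y):
    """Pearson correlation of two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)


# The proxies are pure noise: by construction they carry NO signal.
proxies = [[random.gauss(0, 1) for _ in range(n_years)]
           for _ in range(n_proxies)]

# Ex-post screening: keep only proxies that correlate with the target.
kept = [p for p in proxies if corr(p, target) > 0.3]

# Average the "passing" proxies into a reconstruction.
recon = [sum(p[t] for p in kept) / len(kept) for t in range(n_years)]

# The reconstruction built from pure noise now tracks the target trend,
# purely because of the selection step.
print(len(kept), round(corr(recon, target), 2))
```

Because only the proxies that drifted upward by chance survive the screen, their average inherits the target's shape; this is why selecting proxies after looking at the answer, rather than on physical grounds fixed in advance, biases a reconstruction.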
The mind boggles. If some researcher in economics had used such fraudulent techniques, he would be laughed right out of the profession.