Naïve questions?
As a non-academic who, in a prior life, spent some seven years as Secretary and Fundraiser for two research foundations at an internationally recognised university, I have been able to read and collate over several years a large volume of scientific data and media reports on climate change – a discipline unrelated to my own. Perhaps I am naïve, but can “consensus” scientists answer these conundrums:
First, if, as scientific fact, water vapour constitutes more than 95% of greenhouse gases by volume and CO2 just 0.039%; if there is now irrefutable proof that CO2 lags temperature by a minimum of several hundred years, so that, ipso facto, our high emissions cannot be the cause of global temperature rises – more so given (as Dr Susan Solomon argues) that “the trend in global surface temperatures has been flat since the late 1990s despite continuing increases in the forcing due to the sum of well-mixed greenhouse gases, raising questions regarding the understanding of forced climate change, its drivers, the parameters that define natural internal variability, and how fully these terms are represented in climate models”; if (as Professor Tim Ball argues) there is also proof now that water vapour provides a negative feedback in relation to CO2, in contrast to the positive feedback mechanism built into “consensus”-sponsored computer models, which (as Professor Richard Lindzen argues) thereby overestimate climate sensitivity by a factor of five; and if (as Professor Reid Bryson states) 80% of radiative energy is absorbed by water vapour in the first 30 feet and by CO2 only eight-hundredths of 1% – why are we so preoccupied with a doubling of CO2, or with pricing it as a catastrophic “pollutant”?
Secondly, if the hockey-stick theory of global warming is totally discredited, and if governments around the world have colluded in manipulating temperature records – take the following example from the recently published Watts et al (2012):
“…The new analysis demonstrates that reported 1979-2008 U.S. temperature trends are spuriously doubled, with 92% of that over-estimation resulting from erroneous NOAA adjustments of well-sited stations upward. The paper is the first to use the updated siting system which addresses USHCN siting issues and data adjustments.
“The new improved assessment, for the years 1979 to 2008, yields a trend of +0.155C per decade from the high quality sites, a +0.248 C per decade trend for poorly sited locations, and a trend of +0.309 C per decade after NOAA adjusts the data.”
And: “The chart below from Willis Eschenbach’s WUWT essay, ‘The smoking gun at Darwin Zero,’ … plots GHCN raw versus homogeneity-adjusted temperature data at Darwin International Airport in Australia. The ‘adjustments’ actually reversed the 20th-century trend, from temperatures falling at 0.7°C per century to temperatures rising at 1.2°C per century.”
He writes: “Intrigued by the curious shape of the average of the homogenized Darwin records, I went to see how they had homogenized each of the individual station records. What made up that strange average in Fig 7?
This is Station Zero at Darwin, showing the raw and the homogenized versions.
[Fig 8. Darwin Zero Homogeneity Adjustments. The black line shows the amount and timing of adjustments.]
“…What on earth justifies such an adjustment? How can they do that? We have five different records covering Darwin from 1941 on. They all agree almost exactly. Why adjust them at all? They’ve just added a huge artificial, totally imaginary trend to the last half of the raw data! Now it certainly looks like the IPCC diagram in Figure 1, all right … but with a six degree per century trend!!? And in the shape of a regular stepped pyramid climbing to heaven?
“Those…are the clumsy fingerprints of someone messing with the data … they are indisputable evidence that the “homogenized” data has been changed to fit the preconceptions of someone about whether the earth is warming.”
Finally, Mr Ottmar Edenhofer, a leading member of the United Nations’ Intergovernmental Panel on Climate Change (IPCC), admitted in an interview with the German news outlet NZZ am Sonntag in November 2010 that the entire theory of man-made global warming was a scheme towards “globalization”, designed to redistribute, de facto, the world’s wealth: “First of all, developed countries have basically expropriated the atmosphere of the world community. But one must say clearly that we redistribute de facto the world’s wealth by climate policy. Obviously the owners of coal and oil will not be enthusiastic about this. One has to free oneself from the illusion that international climate policy is environmental policy. This has almost nothing to do with environmental policy anymore, with problems such as deforestation or the ozone hole.
“We distribute de facto the world’s wealth by climate policy. Africa will be the big winner, and huge amounts of money will flow there. This will have enormous implications for development policy. And it will raise the question [of whether] these countries can deal responsibly with so much money.”
So much money? The noted Australian economist Terry McCrann, writing in “The Australian” (August 20, 2011) on the extent of these U.N. policy “obligations”, drew attention to the Australian Treasury Department’s detailed modelling of the proposed carbon dioxide tax, and to the fact that “by 2050, Australia will be obliged to send $A57 billion a year overseas [that’s $A57,000,000,000 every year] just for the right to keep our lights on.”
“…That is to say, it will be an entirely artificial cost imposed on all Australians …with not the slightest offsetting benefit. It has the same economic consequences as taking $57bn and just shredding it: Every year!”
Now, in light of all this, and as one who knows full well the intense pressure associated with funding long-term research, why should I not suspect that your “consensus” arguments on behalf of your “Piper” – the U.N. and its IPCC – are not so much about scientific integrity as about a continuing flow of public money for your research endeavours, and about meeting their Socialist Agenda?