Monday, March 28, 2011

Under the Cloak of ‘Climate Change’, childhoods are being sacrificed for political gain

'When asked to choose the 3 biggest threats to the world from a list of 9, the most common answer is terrorism, chosen by more than half (59%), followed by climate change (49%).' -- Extract from the results of a BBC survey of some 329 schools, with 24,000 respondents aged 11 to 16 years, published 24 March, 2011

So, if the survey has been well conducted, approximately half of secondary-school children in the UK regard 'climate change' as one of the biggest threats facing the world. How can that be, given that nothing at all unusual has happened to any weather phenomena, including air temperatures, rainfall and storminess, nor to commonly associated phenomena such as polar ice extents?

The answer, of course, is clear enough: very successful lobbying and publicising of the results of computer models programmed to give CO2 a large effect as a driver of climate through positive feedbacks. Given that CO2 levels have been rising, and are confidently expected to rise further, there are clearly the makings of a good scare story here.

However, neither the atmosphere itself nor many leading climate scientists have been sufficiently convinced by these stories: the atmosphere has not displayed unusual behaviour, and those scientists have not displayed alarm. Yet many others are alarmed, or find it convenient to act as if they are for the sake of political and other advantages. Finance houses, political parties, environmentalists, and development organisations have all seen substantial boosts to their incomes and/or their influence thanks to the widespread publicity given to bodies such as the IPCC.

Many well-intentioned individuals and groups have no doubt been persuaded to 'do something' by all of this, and some are even trying to get schoolchildren involved in political action.

One such group is Norwich Education and Action for Development (NEAD), whose Windmill Project was reported upon this week in the Norwich Evening News.

The headline, and the activities described, look innocent enough. Since our climate has always changed and is no doubt still changing, children should be taught about it as part of their nature, geography or science studies. Who would not want that? The changes, however, are quite slow and hard to detect amidst the within-year variation, so it is unlikely that this topic ought to be a major part of any curriculum for such a young age group. The problem, though, is that the children may be being misled about climate risks, that these risks may in turn be scaring them, and that they may be being led into political roles which seem utterly unsuited to their tender years. On the NEAD site, one can find phrases such as this one:
'Most importantly, children are offered information about some of the solutions to problems related to climate change. This will give children the power to make informed decisions and allow them to move towards behavioural and attitudinal change.'

Primary school children have been visited by this group in the past. Although their teaching materials are not available to non-members on their site, my concern that they may be the usual alarmist stuff is not allayed by listening to this song, sung and partly composed by children at a NEAD event at a school in October last year:
'The Norfolk Flood Blues'

It is quite hard to make out all the words, but it seems to begin with stamping of feet in time to the music, while chanting
'Rain Flood Rain Flood Rain Flood Rain Flood ...'

Later on, I think I heard these phrases (please email corrections or confirmations about these!):

'Water in my home, Water in my bed'
'It's destroying everything'
'I feel doomed. I feel scared.'

I looked up the UK Met Office site to see what weather records I could find for East Anglia, the region in which Norwich lies. Records were available for Lowestoft, a coastal town less than 20 miles from Norwich. I extracted monthly rainfall, monthly sunshine hours, and monthly mean maximum and mean minimum temperatures for the years 1980 to 2010, and used these to produce the plots shown below. Can you see any grounds for alarm in them?
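Anyone who wants to repeat the exercise can do so with a few lines of Python. The sketch below is only illustrative: it assumes the Met Office still publishes its Lowestoft monthly station data as a plain-text file at the address shown, with data rows of the form year, month, mean maximum, mean minimum, air-frost days, rainfall and sunshine; the URL, column order and missing-value markers should all be checked against the actual file before trusting the output.

```python
import urllib.request
import matplotlib.pyplot as plt

# Assumed location and layout of the Met Office historic station data for
# Lowestoft: plain text, data rows of "yyyy mm tmax tmin af rain sun", with
# "---" for missing values and "*"/"#" flags on some numbers.
URL = ("https://www.metoffice.gov.uk/pub/data/weather/uk/"
       "climate/stationdata/lowestoftdata.txt")

def to_float(token):
    token = token.rstrip("*#")              # drop estimated/instrument flags
    return float("nan") if token == "---" else float(token)

def parse(text):
    rows = []
    for line in text.splitlines():
        parts = line.split()
        # Data rows start with a four-digit year followed by the month number.
        if len(parts) >= 7 and parts[0].isdigit() and len(parts[0]) == 4:
            year, month = int(parts[0]), int(parts[1])
            tmax, tmin = to_float(parts[2]), to_float(parts[3])
            rain, sun = to_float(parts[5]), to_float(parts[6])
            rows.append((year + (month - 0.5) / 12.0, tmax, tmin, rain, sun))
    return rows

with urllib.request.urlopen(URL) as resp:
    rows = parse(resp.read().decode("latin-1"))

rows = [r for r in rows if 1980 <= r[0] < 2011]     # the period discussed above
years = [r[0] for r in rows]
panels = [("Mean max temp (C)", 1), ("Mean min temp (C)", 2),
          ("Rainfall (mm)", 3), ("Sunshine (hours)", 4)]

fig, axes = plt.subplots(len(panels), 1, sharex=True, figsize=(8, 10))
for ax, (label, idx) in zip(axes, panels):
    ax.plot(years, [r[idx] for r in rows], linewidth=0.8)
    ax.set_ylabel(label)
axes[-1].set_xlabel("Year")
fig.tight_layout()
plt.show()
```

The result is four panels of raw monthly values; no smoothing or trend fitting is applied, so the reader can judge the within-year variation for themselves.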



The pupils will have some difficulty in discerning ‘climate change’ in such a display, dominated as it is by within-year variation. Throughout this period, CO2 levels grew, along with increasingly agitated pleas and warnings from people who ought to have known better, such as James Hansen, who in 1986 was warning of mean global temperature rises of several degrees by the year 2010. Since the computer models suggest that temperature rises will be greater away from the equatorial regions and towards the poles, a naive observer might well have expected more action in the Lowestoft data by now. Could it be that the models are also useless for predicting such things?

Mercifully, the NEAD people do not seem deranged like those who produced the film ‘No Pressure’, in which children of non-compliant parents were portrayed as being violently destroyed, ‘pour encourager les autres’. I suspect that NEAD attracts many good people, but people who have been misled by the IPCC and by others. There are further grounds for concern about NEAD: first, is it really a charity; second, is it at risk of crossing the line on political indoctrination in schools; and third, will campaigning around climate change really help the world's poor in the long run?

More HERE





Disturbing Imagery Of The Permanent Drought In California

Climate models run on the world’s largest supercomputers predicted this drought.



SOURCE







CO2 Causes More Precipitation And Less Precipitation

Jeff Masters [of Weather Underground] has been telling us that more CO2 causes more precipitation – due to larger amounts of water vapor in the atmosphere. Now we find out that more CO2 also causes less precipitation due to a drier atmosphere. From Ken Caldeira of the Carnegie Institution we read:
Cutting carbon dioxide helps prevent drying

Recent climate modeling has shown that reducing the concentration of carbon dioxide in the atmosphere would give the Earth a wetter climate in the short term. New research from Carnegie Global Ecology scientists Long Cao and Ken Caldeira offers a novel explanation for why climates are wetter when atmospheric carbon dioxide (CO2) concentrations are decreasing. Their findings, published online today by Geophysical Research Letters, show that cutting carbon dioxide concentrations could help prevent droughts caused by global warming.

Cao and Caldeira’s new work shows that this precipitation increase is due to the heat-trapping property of the greenhouse gas carbon dioxide in the atmosphere. Carbon dioxide traps heat in the middle of the atmosphere. This warm air higher in the atmosphere tends to prevent the rising air motions that create thunderstorms and rainfall.

As a result, an increase in the atmospheric concentration of carbon dioxide tends to suppress precipitation. Similarly, a decrease in the atmospheric concentration of carbon dioxide tends to increase precipitation.

The results of this study show that cutting the concentration of precipitation-suppressing carbon dioxide in the atmosphere would increase global precipitation. This is important because scientists are concerned that unchecked global warming could cause already dry areas to get drier. (Global warming may also cause wet areas to get wetter.) Cao and Caldeira’s findings indicate that reducing atmospheric carbon dioxide could prevent droughts caused by climate change.

“This study shows that the climate is going to be drier on the way up and wetter on the way down,” Caldeira said, adding: “Proposals to cool the earth using geo-engineering tools to reflect sunlight back to space would not cause a similar pulse of wetness.”

The team’s work shows that carbon dioxide rapidly affects the structure of the atmosphere, causing quick changes in precipitation, as well as many other aspects of Earth’s climate, well before the greenhouse gas noticeably affects temperature. These results have important implications for understanding the effects of climate change caused by carbon dioxide, as well as the potential effects of reducing atmospheric carbon dioxide concentrations.

“The direct effects of carbon dioxide on precipitation take place quickly,” said Cao. “If we could cut carbon dioxide concentrations now, we would see precipitation increase within the year, but it would take many decades for climate to cool.”
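To see why the claimed pattern, drier while concentrations rise and wetter while they fall, is at least arithmetically plausible, here is a toy calculation. It is emphatically not Cao and Caldeira's model: the coefficients, the single exponential temperature response and the simple split between a fast CO2 suppression term and a slow warming term are all illustrative assumptions.

```python
import math

# All numbers below are illustrative assumptions, not values from the paper.
k_T = 2.5    # % precipitation increase per K of surface warming (assumed)
k_F = 0.5    # % precipitation suppression per W/m^2 of direct CO2 forcing (assumed)
lam = 0.8    # equilibrium warming per unit forcing, K per W/m^2 (assumed)
tau = 30.0   # e-folding time of the surface temperature response, years (assumed)

def co2_forcing(c_ratio):
    """Common logarithmic approximation: F = 5.35 * ln(C/C0) W/m^2."""
    return 5.35 * math.log(c_ratio)

def precip_change(c_ratio, years=100):
    """Global-mean precipitation change (%) after a step change in CO2."""
    F = co2_forcing(c_ratio)
    series = []
    for t in range(years + 1):
        dT = lam * F * (1.0 - math.exp(-t / tau))  # slow warming toward equilibrium
        dP = k_T * dT - k_F * F                    # fast suppression acts at once
        series.append((t, dP))
    return series

# A doubling of CO2: precipitation dips immediately, then recovers and ends up
# higher once the slow warming term dominates.
for t, dP in precip_change(2.0):
    if t in (0, 10, 30, 100):
        print(f"year {t:3d}: {dP:+.2f} %")
```

With these made-up numbers a step increase in CO2 dries things out at once and only turns "wetter" decades later, once the warming term catches up; reversing the sign of the CO2 change reverses the pattern, which is the qualitative point the press release is making.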

SOURCE





The dismal record of prophecy

Andy Revkin Points To The End of The Line For The IPCC And Its Ilk
Beginning in the 1980s, [University of Pennsylvania Professor Philip] Tetlock examined 27,451 forecasts by 284 academics, pundits and other prognosticators. The study was complex, but the conclusion can be summarized simply: the experts bombed. Not only were they worse than statistical models, they could barely eke out a tie with the proverbial dart-throwing chimps. [...] The least accurate forecasters, [Tetlock] found, were hedgehogs: “thinkers who ‘know one big thing,’ aggressively extend the explanatory reach of that one big thing into new domains” and “display bristly impatience with those who ‘do not get it,’ ” he wrote. Better experts “look like foxes: thinkers who know many small things,” “are skeptical of grand schemes” and are “diffident about their own forecasting prowess.”

So there we have it…experts of the “big thing” called “climate change”, aggressive (to the point of hiding declines, preventing publication of competing ideas, inserting unsubstantiated critiques in the IPCC report, etc etc) and definitely “impatient” with us little humans wondering aloud about their certitudes (any post at RC, Connolley, Deltoid, Romm, etc etc keeps confirming this point).

Note how none of the above can be defined as “gross negligence” or “conspiracy”, and yet, despite all the whitewashing by the Climategate inquiries, there is a scientific consensus, and the best of our scientific knowledge demonstrates, that that bunch, and pretty much all the bigwigs around the IPCC, ARE the “least accurate forecasters”. QED.

For more discussion about “wrongology”: here and here. Read also here a critique-essay by Tetlock himself, listing a set of criteria suggested by David Freedman, author of Wrong: Why Experts* Keep Failing Us—And How to Know When Not to Trust Them, as signs of claims we should be “especially wary of”:
* dramatic (“claiming to have invented the psychological equivalent of the telescope qualifies”)

* a tad too clear-cut (“devoid of qualifications about when propositions do and do not hold”)

* doubt free (“portraying findings as beyond reasonable doubt and one’s measure as 100 percent pure”)

* universal (“implying that one is tapping into powerful unconscious forces that, hitherto unbeknownst to us, drive all human behavior”)

* palatable (“likely to appeal to one’s favorite ideological constituencies”)

* receiving “a lot of positive” media attention (“widely covered in the mass media and millions have visited the website”)

* actionable implications (“claims about what employers now need to do to guarantee true equality of opportunity in workplaces”)

Let me now make a statement that is dramatic, very clear-cut, doubt-free, universal, palatable (to most of my readers), yet likely media-ignored and hardly actionable: the “scientific consensus” on climate change (rather, the unscientific stuff that constitutes the IPCC-led propaganda bandied about as “scientific consensus”) scores 7 out of 7 on the Freedman scale and therefore should lie at the bottom of anybody’s trust level:
* dramatic (having reached the computational power needed to project future climate just as CO2 emissions got to a previously-unknown “dangerous” level)

* a tad too clear-cut (with climate change almost completely due to a “thermostat” called CO2)

* doubt free (the IAC spent an inordinate amount of time complaining about the absurd IPCC policy of underplaying uncertainties)

* universal (everybody will feel the (bad) consequences of climate change, and everybody is guilty of it)

* palatable (as it happens, the usual evils of capitalism and freedoms are the underlying cause of climate change)

* receiving “a lot of positive” media attention (shall I really comment on this?)

* actionable implications (every ha’penny worth of a politician understands how many things can be pinned upon the bandwagon called “climate change”)

I find one sentence by Tetlock especially relevant to the climate debate: "Whatever may be the merits of the underlying science in the peer-reviewed literature, in the public forum, the ratio of pseudoexpertise to genuine expertise is distressingly high."

Yes, I might be wrong. On the other hand, I am not asking for billions of dollars for dubious research, have never attempted to restrict anybody’s liberty, don’t use the ’net to show off my superiority complex, and do let almost every comment through on this website, etc.

SOURCE (See the original for links)




The "experts" are regularly wrong: Some notable examples

Sunday Book Review: ‘Future Babble’ by Dan Gardner (March 27, 2011)

“The end of everything we call life is close at hand and cannot be evaded.” H.G. Wells, 1946

George Edward Scott, my mother’s father, was born in an English village near the city of Nottingham. It was 1906. We can be sure that anyone who took notice of George’s arrival in the world agreed that he was a very lucky baby. There was the house he lived in, for one thing. It was the work of his father, a successful builder, and it was, like the man who built it, correct, confident, and proudly Victorian. Middle-class prosperity was evident throughout, from the sprawling rooms to the stained-glass windows and the cast-iron bathtub with a pull-cord that rang a bell downstairs. A maid carrying a bucket of hot water would arrive in due course.

And there was the country and the era. Often romanticized as the “long Edwardian summer,” Britain at the beginning of the twentieth century was indeed a land of peace and prosperity, if not strawberries and champagne. Britain led the world in industry, science, education, medicine, trade, and finance. Its empire was vaster than any in history, its navy invincible. The great and terrible war with Napoleon’s France was tucked away in dusty history books and few worried that its like would ever come again.

It was a time when “Progress” was capitalized. People were wealthier. They ate better and lived longer. Trade, travel, and communication steadily expanded, a process that would be called, much later, globalization. Science advanced briskly, revealing nature’s secrets and churning out technological marvels, each more wonderful than the last, from the train to the telegraph to the airplane. The latest of these arrived only four years before George Scott was born, and in 1912, when George was six, his father gathered the family in a field to witness the miracle of a man flying through the air in a machine. The pilot waved to the gawkers below. “Now I’ve seen it,” George’s grandmother muttered. “But I still don’t believe it.”

And the future? How could it be anything but grand? In 1902, the great American economist John Bates Clark imagined himself in 2002, looking back on the last hundred years. He pronounced himself profoundly satisfied. “There is certainly enough in our present condition to make our gladness overflow” and to hope that “the spirit of laughter and song may abide with us through the years that are coming,” Clark wrote. The twentieth century had been a triumph, in Clark’s imagining. Technology had flourished, conflict between labour and capital had vanished, and prosperity had grown until the slums were “transformed into abodes of happiness and health.” Only trade had crossed borders, never armies, and in the whole long century not a shot had been fired in anger. Of course this was only to be expected, Clark wrote, even though some silly people in earlier generations had actually believed war could happen in the modern world — “as if nations bound together by such economic ties as now unite the countries of the world would ever disrupt the great industrial organism and begin fighting.”

At the time, Clark’s vision seemed as reasonable as it was hopeful, and it was widely shared by eminent persons. “We can now look forward with something like confidence to the time when war between civilized nations will be as antiquated as the duel,” wrote the esteemed British historian, G.P. Gooch, in 1911. Several years later, the celebrated Manchester Guardian journalist H.N. Norman was even more definitive. “It is as certain as anything in politics can be, that the frontiers of our modern national states are finally drawn. My own belief is that there will be no more wars among the six Great Powers.”

One day, a few months after H.N. Norman had declared the arrival of eternal peace, George Scott fetched his father’s newspaper. The top story was the latest development in the push for Irish home rule. Below that was another headline. “War Declared,” it read.

It was August 1914. What had been considered impossible by so many informed experts was now reality. But still there was no need to despair. It would be “the war to end all wars,” in H.G. Wells’s famously optimistic phrase. And it would be brief. It had to be, wrote the editors of the Economist, thanks to “the economic and financial impossibility of carrying out hostilities many more months on the present scale.”

For more than four years, the industry, science, and technology that had promised a better world slowly ground millions of men into the mud. The long agony of the First World War shattered empires, nations, generations, and hopes. The very idea of progress came to be scorned as a rotten illusion, a raggedy stage curtain now torn down and discarded.

In defeated Germany, Oswald Spengler’s dense and dark Decline of the West was the runaway best-seller of the 1920s. In victorious Britain, the Empire was bigger but the faith in the future that had sustained it faded like an old photograph left in the sun. The war left crushing debts and the economy staggered. “Has the cycle of prosperity and progress closed?” asked H.G. Wells in the foreword to a book whose title ventured an even bleaker question: Will Civilisation Crash? Yes to both, answered many of the same wise men who had once seen only peace and prosperity ahead. “It is clear now to everyone that the suicide of civilization is in progress,” declared the physician and humanitarian Albert Schweitzer in a 1922 lecture at Oxford University. It may have been “the Roaring Twenties” in the United States — a time of jazz, bathtub gin, soaring stocks, and real estate speculation — but it was a decade of gloom in Britain. For those who thought about the future, observes historian Richard Overy, “the prospect of imminent crisis, a new Dark Age, became a habitual way of looking at the world.”

My grandfather’s fortunes followed Britain’s. His father’s business declined, prosperity seeped away, and the bathtub pull-cord ceased to summon the downstairs maid. In 1922, at the age of fifteen, George was apprenticed to a plumber. A few years later, bowing to the prevailing sense that Britain’s decline was unstoppable, he decided to emigrate. A coin toss — heads Canada, tails Australia — settled the destination. With sixty dollars in his pocket, he landed in Canada. It was 1929. He had arrived just in time for the Great Depression.

A horror throughout the industrialized world, the Great Depression was especially savage in North America. Half the industrial production of the United States vanished. One-quarter of workers were unemployed. Starvation was a real and constant threat for millions. Growing numbers of desperate, frightened people sought salvation in fascism or communism. In Toronto, Maple Leaf Gardens was filled to the rafters not for a hockey game but a Stalinist rally, urging Canadians to follow the glorious example of the Soviet Union. Among the leading thinkers of the day, it was almost a truism that liberal democracy and free-market capitalism were archaic, discredited, and doomed. Even moderates were sure the future would belong to very different economic and political systems.

In 1933, the rise to power of the Nazis added the threat of what H.G. Wells called the “Second World War” in his sci-fi novel The Shape of Things to Come. Published the same year Adolf Hitler became chancellor of Germany, The Shape of Things to Come saw the war beginning in 1940 and predicted it would become a decade-long mass slaughter, ending not in victory but the utter exhaustion and collapse of all nations. Military analysts and others who tried to imagine another Great War were almost as grim. The airplanes that had been so wondrous to a young boy in 1912 would fill the skies with bombs, they agreed. Cities would be pulverized. There would be mass psychological breakdown and social disintegration. In 1934, Britain began a rearmament program it could not afford for a war that, it increasingly seemed, it could not avoid. In 1936, as Nazi Germany grew stronger, the program was accelerated.

A flicker of hope came from the United States, where economic indicators jolted upward, like a flat line on a heart monitor suddenly jumping. It didn’t last. In 1937, the American economy plunged again. It seemed nothing could pull the world out of its death spiral. “It is a fact so familiar that we seldom remember how very strange it is,” observed the British historian G.N. Clark, “that the commonest phrases we hear used about civilization at the present time all relate to the possibility, or even the prospect, of its being destroyed.”

That same year, George Scott’s second daughter, June, was born. It is most unlikely that anyone thought my mother was a lucky baby.

The Second World War began in September 1939. By the time it ended in 1945, at least forty million people were dead, the Holocaust had demonstrated that humanity was capable of any crime, much of the industrialized world had been pounded into rubble, and a weapon vastly more destructive than anything seen before had been invented. “In our recent history, war has been following war in ascending order of intensity,” wrote the influential British historian Arnold Toynbee in 1950. “And today it is already apparent that the War of 1939—45 was not the climax of this crescendo movement.” Ambassador Joseph Grew, a senior American foreign service officer, declared in 1945 that “a future war with the Soviet Union is as sure as anything in this world.” Albert Einstein was terrified. “Only the creation of a world government can prevent the impending self-destruction of mankind,” declared the man whose name was synonymous with genius. Some were less optimistic. “The end of everything we call life is close at hand and cannot be evaded,” moaned H.G. Wells.

Happily for humanity, Wells, Einstein, and the many other luminaries who made dire predictions in an era W.H. Auden dubbed “The Age of Anxiety” were all wrong. The end of life was not at hand. War did not come. Civilization did not crumble. Against all reasonable expectation, my mother turned out to be a very lucky baby, indeed.

Led by the United States, Western economies surged in the postwar decades. The standard of living soared. Optimism returned, and people expressed their hope for a brighter future by getting married earlier and having children in unprecedented numbers. The result was a combination boom — economic and baby — that put children born during the Depression at the leading edge of a wealth-and-population wave. That’s the ultimate demographic sweet spot. Coming of age in the 1950s, they entered a dream job market. To be hired at a university in the early 1960s, a professor once recalled to me, you had to sign your name three times “and spell it right twice.” Something of an exaggeration, to be sure. But the point is very real. Despite the constant threat of nuclear war, and lesser problems that came and went, children born in the depths of the Great Depression — one of the darkest periods of the last five centuries — lived their adult lives amid peace and steadily growing prosperity. There has never been a more fortunate generation.

Who predicted that? Nobody. Which is entirely understandable. Even someone who could have foreseen that there would not be a Third World War — which would have been a triumph of prognostication in its own right — would have had to correctly forecast both the baby boom and the marvellous performance of post-war economies. And how would they have done that? The baby boom was caused by a post-war surge in fertility rates that sharply reversed a downward trend that had been in place for more than half a century. Demographers didn’t see it coming. No one did.

Similarly, the dynamism of the post-war economies was a sharp break from previous trends that was not forecast by experts, whose expectations were much more pessimistic. Many leading economists even worried that demobilization would be followed by mass unemployment and stagnation. One surprise after another. That’s how the years unfolded after 1945. The result was a future that was as unpredictable as it was delightful — and a generation born at what seemed to be the worst possible time came to be a generation born at the most golden of moments.

The desire to know the future is universal and constant, as the profusion of soothsaying techniques in human cultures — from goats’ entrails to tea leaves — demonstrates so well. But certain events can sharpen that desire, making it fierce and urgent. Bringing a child into the world is one such force. What will the world be like for my baby? My great-grandfather undoubtedly asked himself that question when his little boy was born in 1906. He was a well-read person, and so he likely paid close attention to what the experts said. George Edward Scott was a very lucky baby, he would have concluded. And any intelligent, informed person would have agreed. Thirty-one years later, when my grandfather held his infant daughter in his arms, he surely asked himself the same question, and he, too, would have paid close attention to what the experts said. And he would have feared for her future, as any intelligent, informed person would have.

My great-grandfather was wrong. My grandfather was wrong. All those intelligent, informed people were wrong. But mostly, the experts were wrong.

They’re wrong a lot, those experts. History is littered with their failed predictions. Whole books can be filled with them. Many have been.

Some failed predictions are prophecies of disaster and despair. In the 1968 book The Population Bomb, which sold millions of copies, Stanford University biologist Paul Ehrlich declared “the battle to feed all of humanity is over. In the 1970s, the world will undergo famines — hundreds of millions of people will starve to death in spite of any crash programs embarked upon now.” But there weren’t mass famines in the 1970s. Or in the 1980s. Thanks to the dramatic improvements in agriculture collectively known as “the Green Revolution” — which were well underway by the time Ehrlich wrote his book — food production not only kept up with population growth, it greatly surpassed it.

Ehrlich thought that was utterly impossible. But it happened. Between 1961 and 2000, the world’s population doubled but the calories of food consumed per person increased 24 per cent. In India, calories per person rose 20 per cent. In Italy, 26 per cent. In South Korea, 44 per cent. Indonesia, 69 per cent. China had experienced a famine that killed some 30 million people in the dark years between 1959 and 1961, but in the 40 years after that horror China’s per capita food consumption rose an astonishing 73 per cent. And the United States? In the decades after The Population Bomb was published, fears that people would not get enough to eat were forgotten as American waistlines steadily expanded. The already-substantial consumption of the average American rose 32 per cent, and the United States became the first nation in history to struggle with an epidemic of obesity.

In 1977, President Jimmy Carter called for the “moral equivalent of war” to shift the American economy off oil because, he said, the production of oil would soon fail to keep up with demand. When that happened, oil prices would soar and never come down again — the American economy would be devastated and the American dream would turn brown and die like an unwatered suburban lawn. Eight years later, oil prices fell through the floor. They stayed low for two decades.

A small library could be filled with books predicting stock market crashes and economic disasters that never happened, but the giant of the genre was published in 1987. The hardcover edition of economist Ravi Batra’s The Great Depression of 1990 hit the top spot on the New York Times best-seller list and spent a total of ten months on the chart; the paperback stayed on the list for an astonishing nineteen months. When the American economy slipped into recession in 1990, Batra looked prophetic. When the recession proved to be mild and brief, he seemed less so. When the 1990s roared, he looked foolish, particularly when he spent the entire decade writing books predicting a depression was imminent.

In 1990, Jacques Attali — intellectual, banker, former adviser to French president François Mitterrand — published a book called Millennium, which predicted dramatic change on the other side of the year 2000. Both the United States and the Soviet Union would slowly lose their superpower status, Attali wrote. Their replacements would be Japan and Europe. As for China and India, they “will refuse to fall under the sway of either the Pacific or the European sphere,” but it would be hard for these desperately poor countries to resist. Catastrophic war was “possible, even probable.” However, Attali cautioned, this future isn’t quite chiselled in stone. “If a miracle were to occur” and China and India were to be “integrated into the global economy and market, all strategic assumptions underpinning my prognostications would be overturned. That miracle is most unlikely.” Of course, that “miracle” is precisely what happened. And almost nothing Attali predicted came true.

Even economists who win Nobel Prizes have been known to blow big calls. In 1997, as Asian economies struggled with a major currency crisis, Paul Krugman — New York Times columnist and winner of the Nobel in 2008 — worried that Asia must act quickly. If not, he wrote in Fortune magazine, “we could be looking at a true Depression scenario — the kind of slump that 60 years ago devastated societies, destabilized governments, and eventually led to war.” Krugman’s prescription? Currency controls. It had to be done or else. But mostly, it wasn’t done. And Asia was booming again within two years.

Pessimists have no monopoly on forecasting flops, however. Excited predictions of the amazing technologies to come — Driverless cars! Robot maids! Jet packs! — have been dazzling the public since the late nineteenth century. These old forecasts continue to entertain today, though for quite different reasons. And for every bear prophesying blood in the stock markets, there is a bull who is sure things will only get better.

The American economist Irving Fisher was one. “Stock prices have reached what looks like a permanently high plateau,” the esteemed economist assured nervous investors. “I do not feel there will soon be, if ever, a 50 or 60 point break from present levels, such as they have predicted. I expect to see the stock market a good deal higher within a few months.” That was October 17, 1929. The market crashed the following week. But that crash was none of Britain’s concern, the legendary John Maynard Keynes believed. “There will be no serious consequences in London resulting from the Wall Street Slump,” Keynes wrote. “We find the look ahead decidedly encouraging.” Shortly afterward, Britain sank with the rest of the world into the Great Depression.

Another bull market, this one in the late 1990s, produced a bookshelf full of predictions so giddy they made Irving Fisher sound like Eeyore. The most famous was the 1999 book Dow 36,000 by James Glassman and Kevin Hassett. “If you are worried about missing the market’s big move upward, you will discover that it’s not too late,” Glassman and Hassett wrote. Actually, it was too late. Shortly after Dow 36,000 was published, the Dow peaked at less than 12,000 and started a long, painful descent.

Paul Ehrlich can also take consolation in the fact that many of the optimists who assailed his writing were not much better at predicting the future. “The doomsayers who worry about the prospect of starvation for a burgeoning world population” will not see their terrible visions realized, Time magazine reported in 1966. The reason? Aquaculture. “Rand experts visualize fish herded and raised in offshore pens as cattle are today. Huge fields of kelp and other kinds of seaweed will be tended by undersea ‘farmers’ — frogmen who will live for months at a time in submerged bunkhouses. The protein-rich underseas crop will probably be ground up to produce a dull-tasting cereal that eventually, however, could be regenerated chemically to taste like anything from steak to bourbon.” The same Rand Corporation experts agreed that “a permanent lunar base will have been established long before A.D. 2000 and that men will have flown past Venus and landed on Mars.”

Herman Kahn, a founder of the Hudson Institute and a determined critic of Ehrlich, was similarly off the mark in a thick book called The Year 2000, published in 1967. It is “very likely,” Kahn wrote, that by the end of the century nuclear explosives would be used for excavation and mining, “artificial moons” would be used to illuminate large areas at night, and there would be permanent undersea colonies. Kahn also expected that one of the world’s fastest-growing economies at the turn of the millennium would be that of the Soviet Union.

More HERE




Choose your prophet: Twenty years or 1000?

Climate scientist and warmist Andy Pitman on Thursday: "If we could stop emissions tomorrow we would still have 20 to 30 years of warming ahead of us because of inertia of the system."

Climate Commissioner and warmist Tim Flannery on Friday: "If the world as a whole cut all emissions tomorrow the average temperature of the planet is not going to drop in several hundred years, perhaps as much as a thousand years."

SOURCE






Australian Power generator tells academic climate adviser to get real and to take into account asset-destruction effects of a carbon price

ELECTRICITY producers have called on the Gillard government's chief climate change adviser to drop "undergraduate rhetorical devices" and develop "real world" policy about power generation that doesn't damage the economy.

One of Australia's biggest electricity generators, InterGen, has challenged Ross Garnaut to change his position on not compensating power companies for asset value destruction under a carbon tax. Brent Gunther, managing director of InterGen, which produces 16 per cent of Queensland's electricity, has declared that Professor Garnaut's arguments have "missed the point" about financial damage to companies under a carbon price.

He joins several senior business figures in speaking out against the carbon tax proposed to start on July 1 next year.

Mr Gunther says, in an article published in The Australian today, that Professor Garnaut's position on compensating power companies under the Rudd government's carbon pollution reduction scheme would have resulted in "major damage to the national electricity market" and was a "prescription that will end up damaging the Australian economy".

Professor Garnaut "needs to deliver real-world solutions, not high-level principles that assume away problems", Mr Gunther writes.

Professor Garnaut will release another major discussion paper on electricity generation and the carbon tax tomorrow, but signalled last week he had not changed his position from 2008, when he argued there were no grounds for compensation for electricity generators.

He said that, although assistance to emissions-intensive, trade-exposed industries was needed to avoid unfair competition between Australian emitters and those in countries without a carbon price, this should not be confused with providing support for loss of profits or asset value.

"Any fall in asset value stemming from the internalisation of the carbon externality (through pricing carbon) creates no greater case for compensation than other government reforms to reduce other externalities, such as the introduction of measures to discourage smoking, control the use of asbestos or phase out lead in petrol" Professor Garnaut said.

Mr Gunther says the comments suggest Professor Garnaut's discussion paper tomorrow will be a "prescription based on a simplistic and superficial understanding of the power sector - a prescription that will end up damaging the Australian economy".

The InterGen chief also says that asset value losses for electricity companies raise the prospect of state governments having to direct "a power station to keep operating if things ever got bad".

In 2009, as a result of Professor Garnaut's recommendations, the Rudd government indicated it would provide $7.3 billion over 10 years to the power sector for the impact of an emissions trading scheme.

This was after commissioning a report from investment bank Morgan Stanley that highlighted generators would be unable to pass on to consumers the impact of a carbon price on their asset losses.

"At a time when the economic debate in Australia is starting to refocus on how to enhance productivity, the importance of the national electricity market should never be underestimated," Mr Gunther says.

He says the energy sector wants to "develop a solution", as did Climate Change Minister Greg Combet and Energy Minister Martin Ferguson.

Last week in parliament, Mr Ferguson said: "A highly efficient energy-driven system has been the key to the Australian economy.

"The Australian energy market is actually held up as the most efficient in the OECD world.

"It is estimated that over $17bn of capital is required for powerhouse generation assets - that is, refinancing, capital expenditure and new build over the next five years."

SOURCE

***************************************

For more postings from me, see DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC, GUN WATCH, AUSTRALIAN POLITICS, IMMIGRATION WATCH INTERNATIONAL and EYE ON BRITAIN. My Home Pages are here or here or here. Email me (John Ray) here. For readers in China or for times when blogger.com is playing up, there are mirrors of this site here and here

*****************************************
