Sunday, July 27, 2014

Half of Britain to be opened up to fracking

Ministers are this week expected to offer up vast swathes of Britain for fracking in an attempt to lure energy companies to explore shale oil and gas reserves.

The Department for Energy and Climate Change is expected to launch the so-called “14th onshore licensing round”, which will invite companies to bid for the rights to explore in as-yet untouched parts of the country.

The move is expected to be hugely controversial because it could potentially result in fracking taking place across more than half of Britain. Industry sources said the plans could be announced at a press conference tomorrow.

The Government is a big proponent of fracking and last year revealed that it would “step up the search” for shale gas and oil.

Ministers said they would offer energy companies the chance to bid for rights to drill across more than 37,000 square miles, stretching from central Scotland to the south coast.

Michael Fallon, the former energy minister, has previously described shale as “an exciting prospect, which could bring growth, jobs and energy security”.

A previous government-commissioned report said as many as 2,880 wells could be drilled in the new licence areas, supplying up to a fifth of the country’s annual gas demand at peak and creating as many as 32,000 jobs.

However, the report warned that communities close to drilling sites could see a large increase in traffic. Residents could face as many as 51 lorry journeys each day for three years, the study said.
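The cumulative scale of that traffic is easy to tally. A minimal sketch, assuming the study’s worst-case figure of 51 journeys a day holds every day for the full three years:

```python
# Rough tally of total lorry journeys per drilling site, using the
# study's worst-case figure of 51 journeys a day over three years.
JOURNEYS_PER_DAY = 51
DAYS_PER_YEAR = 365
YEARS = 3

total_journeys = JOURNEYS_PER_DAY * DAYS_PER_YEAR * YEARS
print(total_journeys)  # 55845
```

That is, on the order of 56,000 lorry movements per site over the period the report describes.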

It also warned of potential strain on facilities for handling the waste water generated by hydraulic fracturing, the process known as fracking, which involves pumping water, sand and chemicals into rocks at high pressure to extract gas.

There were also concerns over the potential environmental impact on the countryside.

Controversies include plans to offer land within national parks, despite National Trust opposition.

The area expected to attract the most interest is the Bowland basin in the north of England, where it is estimated there could be enough gas to supply the UK for 40 years.

Ministers also anticipate strong interest in the South East and the central belt of Scotland.

SOURCE





New York Senate Rejects Fracking Ban

The New York Senate has declined to pass a bill extending a statewide moratorium on hydraulic fracturing energy production in the state. Instead, Gov. Andrew Cuomo (D) and local governments will decide the fate of fracking in New York.

The New York State Assembly voted 89 to 34 on June 16 to continue the statewide moratorium, which was imposed as a temporary measure by former Gov. David Paterson (D). The Cuomo administration is currently reviewing the moratorium, and some legislators are trying to pass a law that would ban fracking even if Cuomo lifts the executive moratorium. The Senate, however, declined to vote on the bill.

New York environmental officials have missed multiple deadlines to issue final rulings on hydraulic fracturing. The Cuomo administration’s ongoing delay in making a final decision on fracking keeps the ban in place while enabling the governor to avoid the political consequences of making it permanent.

“Gov. Cuomo appears to be appeasing urban, far-left environmental activists while paying lip service to upstate voters who will decide his fate in the November election,” said Jay Lehr, science director for The Heartland Institute, which publishes Environment & Climate News. “This is the same political strategy employed by President Obama regarding the Keystone XL pipeline.”

SOURCE




UK: Stop building offshore wind farms, says energy company

Britain should stop building expensive offshore wind farms, energy giant Centrica has said, claiming that billpayers could be saved £96bn by 2030 if ministers pursued a cheaper green strategy.

The British Gas owner - whose chief executive Sam Laidlaw is preparing to step down after eight years - on Wednesday took the unusual step of issuing its own manifesto for how to solve Britain’s energy crisis, claiming its plans were three times cheaper than the Government’s.

Mr Laidlaw, whose exit and replacement by BP executive Iain Conn is expected to be confirmed as soon as next week, is said to have grown tired of taking the flak for rising energy bills.

The report, which points the finger of blame at Government for backing expensive green technologies, offers a “more affordable pathway to a lower-carbon future”, Mr Laidlaw said.  It advocates building no more offshore wind farms, which it calls “an expensive option that may not be needed”, stopping solar panel deployment, “since it generates no output at times of peak demand” and restricting use of expensive solid wall insulation for homes.

Instead it backs gas, nuclear and carbon capture and storage (CCS) plants. It claims the plan would save consumers £100 a year by 2030, compared with the Government’s strategy, while still hitting 2050 carbon targets.

But the manifesto would involve Britain failing to meet its legally binding EU target for renewable energy generation by 2020, and would also involve weakening green targets for the late 2020s.

One Whitehall source dismissed the report, saying: “Centrica ignores legally binding targets that are not going to go away.”

Peter Atherton, of Liberum Capital, said Centrica had entered the debate on policy “at least five years late” having previously supported policies such as offshore wind “as that suited their short term profit outlook”.

Centrica last year sold its interest in the proposed Race Bank offshore wind farm after deciding subsidies were inadequate, and hopes now to build gas-fired power plants.

Sophie Neuburg, of Friends of the Earth, said the report was “cynical” and served Centrica’s own interests. She said it was “ridiculous” to stop building offshore wind when it was not clear if CCS would work.

Joss Garman, of the think-tank IPPR, said: “Centrica’s proposals could fatally damage the UK’s efforts to reduce harmful carbon pollution because they directly contradict the recommendation of the Committee on Climate Change to introduce a 2030 decarbonisation target for the power sector. To regain the trust of consumers and bring down costs, Centrica needs to embrace new technologies and be part of the solution to climate change, not part of the problem.”

The energy department said it was working to “ensure the UK’s energy security and achieve our carbon targets in the most cost effective way possible”.

SOURCE




Guillotine climate change skeptics?

Don Surber

If the world is warming, it is doing so at one-quarter of the rate the Intergovernmental Panel on Climate Change predicted in its 2007 report.

The IPCC admits in a yet-to-be released report that it overestimated global warming, the London Daily Mail reported.

“But the new report says the observed warming over the more recent 15 years to 2012 was just 0.05C per decade -- below almost all computer predictions,” the newspaper reported.

That is a change of five-thousandths of a degree annually.  Feel the burn.
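For readers checking the arithmetic, a minimal sketch converting the Mail’s quoted trend of 0.05C per decade into an annual rate and projecting it over the 15-year period cited:

```python
# Convert the reported warming trend from per-decade to per-year,
# then project it over the 15-year span to 2012 cited above.
RATE_PER_DECADE = 0.05  # degrees C per decade, as quoted by the Mail

rate_per_year = RATE_PER_DECADE / 10        # 0.005 C per year
total_over_15_years = rate_per_year * 15    # 0.075 C in total

print(f"Annual rate: {rate_per_year:.3f} C/year")
print(f"Total over 15 years: {total_over_15_years:.3f} C")
```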

The weather is doing what Leona Woods Marshall Libby forecast 30 years ago.  She’s a big deal. At 23, she was the only female on physicist Enrico Fermi’s team that built the first nuclear reactor and first atom bomb.

Dr. Libby later developed the method used to measure temperatures centuries ago using tree ring data, which is a key tool in climatology.

In 1979 -- when the scientific consensus was global cooling -- Dr. Libby forecast a rise in temperatures until the year 2000 when it would get colder again for the next 50 years.

“Easily one to two degrees,” she told the Los Angeles Times. “And maybe even three or four degrees. It takes only 10 degrees to bring on an Ice Age.”

The first half of her prediction proved true. Temperatures peaked in 1998.

But why bother with the facts? Global warming is politics, not science. The head of the IPCC is an economist. Its Nobel is a Peace Prize.

Yes, horticulturists use water vapor and carbon dioxide in their greenhouses.  But that is to feed their plants. Carbon dioxide is your friend, not a pollutant.

As the evidence mounts that this is junk science, its promoters are getting ugly.

Two years ago, Professor Richard Parncutt of Graz University in Austria called for the execution of skeptics.

He later retracted his statement, but pardon people for being nervous. Austria was part of Nazi Germany.

And history shows that being right is small comfort.  In the 18th century, the scientific consensus backed the theory of phlogiston, which held that combustible matter contained a fire-like element that was released during burning.

Along came Antoine-Laurent de Lavoisier, who determined this was wrong.  People now consider Lavoisier the Father of Modern Chemistry, because he did the math and used experimentation to prove his point.

But some people held on to the phlogiston theory for a while longer. They did so because everyone else had said it was true. And if you do not believe what everyone else believes, then you are an idiot.

Hans Christian Andersen mocked this conformity in his story, “The Emperor’s New Clothes,” in which con men sold the emperor cloth that didn’t exist. They told him the cloth was invisible to the hopelessly stupid and people who are unfit for their office.

Not wishing to be known as a fool or unfit, the emperor pretended to see the cloth.  He put on the non-existent clothing and paraded naked before the people, who were silent lest they be considered fools.  Finally, a child blurted out that the emperor was wearing nothing. That broke the spell.

And so it goes with global warming. If you do not believe then you are a denier, anti-science, and a tool for that great bogeyman, Big Oil.

The truth is that no flood, drought, tornado, hurricane, cyclone, dip in the polar vortex, derecho, wildfire, blizzard or any other weather phenomenon proves global warming.

Forces that are beyond man’s comprehension control and determine the weather.

Skeptics, beware: they guillotined Lavoisier.  His execution was unrelated to his debunking of phlogiston, but his status as the father of chemistry did not spare him.

Legend has it that when he pleaded for a stay of execution so he could complete one final experiment, the judge replied: “La République n’a pas besoin de savants ni de chimistes; le cours de la justice ne peut être suspendu.”

That translates into “The Republic needs neither scientists nor chemists; the course of justice cannot be delayed.”

A motto fit for today’s global warming fanatics.

SOURCE





Just Who is Waging the ‘War on Science’?

Paul Driessen

Left-leaning environmentalists, media and academics have long railed against the alleged conservative “war on science.” They augment this vitriol with substantial money, books, documentaries and conference sessions devoted to “protecting” global warming alarmists from supposed “harassment” by climate chaos skeptics, whom they accuse of wanting to conduct “fishing expeditions” of alarmist emails and “rifle” their file cabinets in search of juicy material (which might expose collusion or manipulated science).

A primary target of this “unjustified harassment” has been Penn State University professor Dr. Michael Mann, creator of the infamous “hockey stick” temperature graph that purported to show a sudden spike in average planetary temperatures in recent decades, following centuries of supposedly stable climate. But at a recent AGU meeting a number of other “persecuted” scientists were trotted out to tell their story of how they have been “attacked” or had their research, policy demands or integrity questioned.

To fight back against this “harassment,” the American Geophysical Union actually created a “Climate Science Legal Defense Fund,” to pay mounting legal bills that these scientists have incurred. The AGU does not want any “prying eyes” to gain access to their emails or other information. These scientists and the AGU see themselves as “Freedom Fighters” in this “war on science.” It’s a bizarre war.

While proclaiming victimhood, they detest and vilify any experts who express doubts that we face an imminent climate Armageddon. They refuse to debate any such skeptics, or permit “nonbelievers” to participate in conferences where endless panels insist that every imaginable and imagined ecological problem is due to fossil fuels. They use hysteria and hyperbole to advance claims that slashing fossil fuel use and carbon dioxide emissions will enable us to control Earth’s climate – and that references to computer model predictions and “extreme weather events” justify skyrocketing energy costs, millions of lost jobs, and severe damage to people’s livelihoods, living standards, health and welfare.

Reality is vastly different from what these alarmist, environmentalist, academic, media and political elites attempt to convey.

In 2009, before Mann’s problems began, Greenpeace started attacking scientists it calls “climate deniers,” focusing its venom on seven scientists at four institutions, including the University of Virginia and University of Delaware. This anti-humanity group claimed its effort would “bring greater transparency to the climate science discussion” through “educational and other charitable public interest activities.” (If you believe that, send your bank account number to those Nigerians with millions in unclaimed cash.)

UVA administrators quickly agreed to turn over all archived records belonging to Dr. Patrick Michaels, a prominent climate chaos skeptic who had recently retired from the university. They did not seem to mind that no press coverage ensued, and certainly none that was critical of these Spanish Inquisition tactics.

However, when the American Tradition Institute later filed a similar FOIA request for Dr. Mann’s records, UVA marshaled the troops and launched a media circus, saying conservatives were harassing a leading climate scientist. The AGU, American Meteorological Society and American Association of University Professors (the nation’s college faculty union) rushed forward to lend their support. All the while, in a remarkable display of hypocrisy and double standards, UVA and these organizations continued to insist it was proper and ethical to turn all of Dr. Michaels’ material over to Greenpeace.

Meanwhile, although it had started out similarly, the scenario played out quite differently at the University of Delaware. Greenpeace targeted Dr. David Legates, demanding access to records related to his role as the Delaware State Climatologist. The University not only agreed to this. It went further, and demanded that Legates produce all his records – regardless of whether they pertained to his role as State Climatologist, his position on the university faculty, or his outside speaking and writing activities, even though he had received no state money for any of this work. Everything was fair game.

But when the Competitive Enterprise Institute filed a FOIA request for documents belonging to several U of Delaware faculty members who had contributed to the IPCC, the university told CEI the state’s FOIA law did not apply. (The hypocrisy and double standards disease is contagious.) Although one faculty contributor clearly had received state money for his climate change work, University Vice-President and General Counsel Lawrence White falsely claimed none of the individuals had received state funds.

When Legates approached White to inquire about the disparate treatment, White said Legates did not understand the law. State law did not require that White produce anything, White insisted, but also did not preclude him from doing so. Under threat of termination for failure to respond to the demands of a senior university official, Legates was required to allow White to inspect his emails and hardcopy files.

Legates subsequently sought outside legal advice. At this, his academic dean told him he had now gone too far. “This puts you at odds with the University,” she told him, “and the College will no longer support anything you do.” This remarkable threat was promptly implemented. Legates was terminated as the State Climatologist, removed from a state weather network he had been instrumental in organizing and operating, and banished from serving on any faculty committees.

Legates appealed to the AAUP – the same union that had staunchly supported Mann at UVA. Although the local AAUP president had written extensively on the need to protect academic freedom, she told Legates that FOIA issues and actions taken by the University of Delaware’s vice-president and dean “would not fall within the scope of the AAUP.”

What about the precedent of the AAUP and other professional organizations supporting Dr. Mann so quickly and vigorously? Where was the legal defense fund to pay Legates’ legal bills? Fuggedaboutit.

In the end, it was shown that nothing White examined in Legates’ files originated from state funds. The State Climate Office had received no money while Legates was there, and the university funded none of Legates’ climate change research through state funds. This is important because, unlike in Virginia, Delaware’s FOIA law says that regarding university faculty, only state-funded work is subject to FOIA.

That means White used his position to bully and attack Legates for his scientific views – pure and simple. Moreover, a 1991 federal arbitration case had ruled that the University of Delaware had violated another faculty member’s academic freedom when it examined the content of her research. But now, more than twenty years later, U Del was at it again.

Obviously, academic freedom means nothing when one’s views differ from the liberal faculty majority – or when they contrast with views and “science” that garner the university millions of dollars a year from government, foundation, corporate and other sources, to advance the alarmist climate change agenda. All these institutions are intolerant of research by scientists like Legates, because they fear losing grant money if they permit contrarian views, discussions, debates or anything that questions the climate chaos “consensus.” At this point, academic freedom and free speech obviously apply only to advance selected political agendas, and campus “diversity” exists in everything but opinions.

Climate alarmists have been implicated in the ClimateGate scandal, for conspiring to prevent their adversaries from receiving grants, publishing scientific papers, and advancing their careers. Yet they are staunchly supported by their universities, professional organizations, union – and groups like Greenpeace.

Meanwhile, climate disaster skeptics are vilified and harassed by these same groups, who pretend they are fighting to “let scientists conduct research without the threat of politically motivated attacks.” Far worse, we taxpayers are paying the tab for the junk science – and then getting stuck with regulations, soaring energy bills, lost jobs and reduced living standards…based on that bogus science.

Right now, the climate alarmists appear to be winning their war on honest science. But storm clouds are gathering, and a powerful counteroffensive is heading their way.

SOURCE




SCHOLARLY JOURNAL EXPOSES ‘PEER REVIEW RING’

Warmists have rivals for crookedness in journal publication

Every now and then a scholarly journal retracts an article because of errors or outright fraud. In academic circles, and sometimes beyond, each retraction is a big deal.

Now comes word of a journal retracting 60 articles at once.

The reason for the mass retraction is mind-blowing: A “peer review and citation ring” was apparently rigging the review process to get articles published.

You’ve heard of prostitution rings, gambling rings and extortion rings. Now there’s a “peer review ring.”

The publication is the Journal of Vibration and Control (JVC). It publishes papers with names like “Hydraulic engine mounts: a survey” and “Reduction of wheel force variations with magnetorheological devices.”

The field of acoustics covered by the journal is highly technical:

Analytical, computational and experimental studies of vibration phenomena and their control. The scope encompasses all linear and nonlinear vibration phenomena and covers topics such as: vibration and control of structures and machinery, signal analysis, aeroelasticity, neural networks, structural control and acoustics, noise and noise control, waves in solids and fluids and shock waves.

JVC is part of the SAGE group of academic publications. Here’s how it describes its peer review process:

[The journal] operates under a conventional single-blind reviewing policy in which the reviewer’s name is always concealed from the submitting author.

All manuscripts are reviewed initially by one of the Editors and only those papers that meet the scientific and editorial standards of the journal, and fit within the aims and scope of the journal, will be sent for peer review.  Generally, reviews from two independent referees are required.

An announcement from SAGE published July 8 explained what happened, albeit somewhat opaquely.

In 2013, the editor of JVC, Ali H. Nayfeh, became aware of people using “fabricated identities” to manipulate an online system called SAGE Track by which scholars review the work of other scholars prior to publication.

Attention focused on a researcher named Peter Chen of the National Pingtung University of Education (NPUE) in Taiwan and “possibly other authors at this institution.”

After a 14-month investigation, JVC determined the ring involved “aliases” and fake e-mail addresses of reviewers — up to 130 of them — in an apparently successful effort to get friendly reviews of submissions and as many articles published as possible by Chen and his friends. “On at least one occasion, the author Peter Chen reviewed his own paper under one of the aliases he created,” according to the SAGE announcement.

The statement does not explain how something like this happens. Did the ring invent names and say they were scholars? Did they use real names and pretend to be other scholars? Doesn’t anyone check on these things by, say, picking up the phone and calling the reviewer?
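As a purely hypothetical illustration of the simplest check such a system could run (none of these names, addresses or manuscript IDs come from the SAGE case; they are invented here), a submission system could flag the crudest form of the abuse described above, an author reviewing their own paper under an alias tied to a known address, by cross-checking reviewer contact details against author records:

```python
# Hypothetical integrity check: flag manuscripts where a reviewer's
# e-mail address also appears among the submission's authors.
# All identifiers below are invented for illustration only.

submissions = {
    "MS-101": {"authors": {"a.author@example.edu"},
               "reviewers": {"r.one@example.org", "a.author@example.edu"}},
    "MS-102": {"authors": {"b.author@example.edu"},
               "reviewers": {"r.two@example.org"}},
}

def flag_self_reviews(subs):
    """Return manuscript IDs where an author also acted as a reviewer."""
    return sorted(ms for ms, record in subs.items()
                  if record["authors"] & record["reviewers"])

print(flag_self_reviews(submissions))  # ['MS-101']
```

A check like this would of course miss reviewers using wholly fabricated identities, which is presumably why the SAGE investigation took 14 months.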

In any case, SAGE and Nayfeh confronted Chen to give him an “opportunity to address the accusations of misconduct,” the statement said, but were not satisfied with his responses.

In May, “NPUE informed SAGE and JVC that Peter Chen had resigned from his post on 2 February 2014.”

Each of the 60 retracted articles had at least one author and/or one reviewer “who has been implicated in the peer review” ring, said a separate notice issued by JVC.

Efforts by The Washington Post to locate and contact Chen for comment were unsuccessful.

The whole story is described in a publication called “Retraction Watch” under the headline: “SAGE Publications busts ‘peer review and citation ring.’”

“This one,” it said, “deserves a ‘wow.’”

Update: Some additional information from the SAGE statement: “As the SAGE investigation drew to a close, in May 2014 Professor Nayfeh’s retirement was announced and he resigned his position as Editor-in-Chief of JVC….

Three senior editors and an additional 27 associate editors with expertise and prestige in the field have been appointed to assist with the day-to-day running of the JVC peer review process. Following Professor Nayfeh’s retirement announcement, the external senior editorial team will be responsible for independent editorial control for JVC.”

***************************************

For more postings from me, see  DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC and AUSTRALIAN POLITICS. Home Pages are   here or   here or   here.  Email me (John Ray) here.  


*****************************************

Friday, July 25, 2014


Invertebrate populations have dropped by 45 percent in the last four decades

“Invertebrate” means “no backbone”.  At that rate America has an ample supply of invertebrates -- in Congress. Seriously, though, how can anybody know how many beetles, wasps, etc. there are?  It’s a fantasy.  They might as well have just made the number up -- which they probably did

Much has been said about the loss of bird, mammal, reptile, and amphibian species around the world. By current estimates, at least 322 species have gone extinct in the last 500 years. And researchers estimate that 16 to 33 percent of the world’s vertebrate species — animals with developed spinal cords — are currently threatened or endangered. But a new article, published today in Science, paints an even more alarming picture, as scientists have found that the number of individual insects, crustaceans, worms, and spiders decreased by 45 percent on average over the past 40 years — a period in which the global human population doubled.

"We had strong suspicions that the problem was largely with the vertebrates," said Rodolfo Dirzo, an ecologist at Stanford University, in an email to The Verge. "But it was surprising to see this now, also, among the invertebrates," or animals without developed spines. Dirzo calls this loss of animal life "defaunation," and he blames it on humans. "The richness of the animal world of our planet is being seriously threatened by human activities," he said. Many species have gone extinct and the ones that remain — mammals, birds, and insects alike — are showing dramatic declines in their abundance.

In the article, Dirzo and his colleagues reviewed past studies, and compiled a global index of all invertebrate species over the past 40 years. Overall, they found that 67 percent of the world’s invertebrates have declined in numbers by an average of 45 percent. In the UK, for instance, there has been a 30 to 60 percent decline in the number of butterflies, bees, beetles, and wasps. This, the researchers write, is important because too often we measure animal diversity in terms of number of species, or in terms of extinctions. But an animal’s contribution is about more than the mere presence of its species on the planet — it’s also about local shifts in populations that could impact everything from agriculture to human health.

More HERE




Making Earth a ‘High-Energy Planet’

Electricity for Africa may become a reality

If all goes smoothly, President Obama will be able to sign a landmark bipartisan energy-for-Africa bill when more than 40 African heads of state — all looking to attract more U.S. investment for their economies — convene at the White House for the U.S.-Africa Leaders Summit on Aug. 5 and 6.

Two bills in Congress are waiting in the wings for their high-voltage debut — the “Electrify Africa” measure (HR 2548), which has passed the House of Representatives, and the Senate’s companion “Energize Africa” bill (S 2508), which is ready for a floor vote. Both bills abandon yesterday’s foreign aid handouts and propose only private sector-based incentives such as loans, guarantees, and political risk insurance as a shock absorber for U.S. businesses willing to enter frontier African markets.

The mere possibility of an energy-for-Africa bill with the president’s signature on it is already sparking angry outbursts from Obama’s political base. That’s bizarre but predictable — congressional action would give Obama a big boost for his June 30, 2013, “Power Africa” initiative, which — amazingly — is an “all of the above” energy program and not one of those weasel-worded “all except fossil fuels” shams the White House usually perpetrates. The White House fact sheet specifically said, “Power Africa will partner with Uganda and Mozambique on responsible oil and gas resources management.” It was silent about coal, which is plentiful in Africa.

When Energy Secretary Ernest Moniz touts his department’s “Beyond the Grid” initiative to encourage greenie-approved off-grid and small-scale energy projects, he takes care to avoid disparaging fossil fuels because they’re part of Obama’s Power Africa plan.

Obama’s Big Green base is furious that a measly president of the United States would dare to thwart their exalted global mission to force people in developing nations to live off-grid with only the energy for two lightbulbs, a fan and a radio — a standard measure of “energy access” used by the U.N.’s callous “Sustainable Energy for All” initiative.

A recent Sierra Club report, “Clean Energy Services for All,” defines energy access for poor nations as living on 0.15 percent of the average Californian’s annual usage, according to several critiques. Sierra Clubbers, please lead by example.

The Sierra Club’s sourpuss misers got a nasty slapdown in June from the Breakthrough Institute, a brainstorming enterprise formed in 2004 by Big Green bad boys Ted Nordhaus and Michael Shellenberger. The unknown pair published a provocative essay titled “The Death of Environmentalism” — drubbing everything wrong with mainstream environmentalism — and presented all the dirty laundry at the annual retreat of the Environmental Grantmakers Association. They weren’t unknown afterward.

They have matured wonderfully into welcome thought leaders with their April publication, “Our High-Energy Planet: A Climate Pragmatism Project.” In the past I have disparaged some of their more leftward shenanigans, so I offer the following quote from their executive summary as part contrition, part admiration:

“Today, over 1 billion people around the world — 500 million of them in sub-Saharan Africa alone — lack access to electricity. Nearly 3 billion people cook over open fires fueled by wood, dung, coal, or charcoal. This energy poverty presents a significant hurdle to achieving development goals of health, prosperity, and a livable environment.”

I have friends in sub-Saharan Africa from my days working with leaders of the Congress of Racial Equality, two of whom run CORE Uganda, Fiona Kobusingye and her husband Cyril Boynes. Kobusingye is also an outspoken promoter of DDT sprays as coordinator of Uganda’s Kill Malarial Mosquitoes Now Brigade. She is a victim of malaria herself, requiring lifelong medical treatment, and she has lost many cherished family members to the disease. I was seated next to Fiona at a 2004 conference in New York City when she suffered an attack and went to a hospital where none of the doctors had ever seen malaria.

I asked CORE’s national chairman, Roy Innis, how he felt about the two energy-for-Africa bills now in Congress. Although best known for his activism in the civil rights movement of the 1960s, he is also a long-time champion of energy access for the disadvantaged — and author of “Energy Keepers, Energy Killers: The New Civil Rights Battle.”

Innis said, “A short visit to most of Africa reveals a crushing shortage of controlled and developed energy. It appears that on this legislation, HR 2548 and S 2508, we can avoid the usual fights that bogged down the legislative branch. We hope that the executive branch can follow.”

CORE Uganda hopes so in particular. The Ugandan census of 2002 reported that 7.7 percent of households used electricity for lighting (only 2.6 percent of rural households), with 74.8 percent of households using “tadooba,” a form of paraffin candle, for lighting. Most tourist areas need backup generators because of grid failures. In 2002, the network fed by hydroelectric dams on Lake Victoria provided power to only 33 of the 54 districts of Uganda. Things have improved with diesel-fueled power turbines and co-generation from sugar works, bringing most numbers up about 50 percent since 2002. And in February, the Ugandan Ministry of Energy signed a deal with three European — not American — oil companies to develop its petroleum reserves estimated at over 3.5 billion barrels, based on limited drilling and testing.

Assuming that Congress does the right thing and puts an energy-for-Africa bill on Obama’s desk soon, the new law and his Power Africa initiative may together have the momentum to steamroller the would-be energy-starvation despots of the world into the frozen darkness of Dante’s Ninth Circle of Hell and lift the Breakthrough Institute’s report title into global reality — “Our High-Energy Planet.”

SOURCE





Property Rights at Stake in EPA’s Water Power Grab

Thanks to the federal government, it soon may become far more difficult to use and enjoy private property. The Environmental Protection Agency and the Army Corps of Engineers want to make a water—and land—grab that should scare everyone.

Under the Clean Water Act, the federal government has jurisdiction over “navigable waters,” which the statute further defines as “the waters of the United States, including the territorial seas.” Property owners often need to get permits if waters covered under the law will be impacted. Therefore, a critical question is what types of “waters” are covered under the CWA. That’s what the EPA and Corps seek to address with a new proposed rule that would define “the waters of the United States.” As expected, the EPA and the Corps are seeking to expand their authority to cover waters never imagined when the Clean Water Act was passed in 1972.

For example, the new proposed rule would regulate all ditches, except in narrow circumstances. This even includes man-made ditches. The rule would apply to tributaries that have ephemeral flow. This would include depressions in land that are dry most of the year except when there’s heavy rain.

There’s widespread opposition to the proposed rule. Farmers and ranchers are concerned that the rule could affect normal agricultural practices. Homebuilders could face additional development costs that would likely be passed on to buyers. Counties are concerned about costly new requirements that could affect municipal storm sewer systems and roadside ditches, among other things.

This broad overreach could have significant costs and delays for permit applicants. In Rapanos v. United States (2006), a major CWA case, Justice Antonin Scalia cited a study highlighting the following costs and delays for one of the major types of permits (Section 404 permits), “The average applicant for an individual permit spends 788 days and $271,596 in completing the process, and the average applicant for a nationwide permit spends 313 days and $28,915—not counting costs of mitigation or design changes.”
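The figures Scalia cited imply a large gap between the two permit types; a quick arithmetic sketch, using only the numbers quoted above, with the ratios computed for illustration:

```python
# Permit figures cited by Justice Scalia in Rapanos v. United States (2006).
individual = {"days": 788, "dollars": 271_596}   # Section 404 individual permit
nationwide = {"days": 313, "dollars": 28_915}    # nationwide permit

cost_ratio = individual["dollars"] / nationwide["dollars"]
time_ratio = individual["days"] / nationwide["days"]

# An individual permit costs roughly 9.4x more and takes roughly 2.5x longer.
print(f"cost ratio: {cost_ratio:.1f}x, time ratio: {time_ratio:.1f}x")
```

Neither figure counts mitigation or design-change costs, so the true gap is likely wider.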

The American Farm Bureau Federation launched a national campaign to inform people why the Clean Water Act should be 'ditched.' (Photo: American Farm Bureau Federation Facebook)
If the EPA and Corps expand their authority over more waters, property owners will have to secure additional permits. They will have to get permission from federal bureaucrats to enjoy and use their property because of waters that were never intended to be regulated under the CWA. If property owners don’t comply with the law, they could face civil penalties as high as $37,500 per day per violation, or even criminal penalties.

In their craving for more power, the EPA and Corps are ignoring a critical aspect of the CWA: cooperative federalism. Both the states and federal government are supposed to play a role in implementation of the law. Yet, this power grab is an attempt by the federal government to push out state and local governments.

At the start of the CWA it states, “It is the policy of the Congress to recognize, preserve, and protect the primary responsibilities and rights of States to prevent, reduce, and eliminate pollution, to plan the development and use (including restoration, preservation, and enhancement) of land and water resources…” The EPA and Corps are pretending that this important policy doesn’t exist.

The EPA also had to ignore sound science and proper rulemaking to move forward with its power play. The agency developed a draft report entitled Connectivity of Streams and Wetlands to Downstream Waters: A Review and Synthesis of the Scientific Evidence. A Scientific Advisory Board was convened to peer review the study, which when finalized would provide the scientific foundation for implementation of the rule.

However, the EPA finalized the proposed rule before the Scientific Advisory Board even met. The EPA defends this action by claiming that the final study will still help inform the final rule. But this is putting the cart before the horse (or the rule before the science). The scientific foundation should inform the proposed rule so that the public can provide informed comments and have a meaningful voice in the process.

The public may be commenting on a proposed rule that seems to be a mere placeholder rather than a real policy proposal, or more likely, a proposal that already reflects the final conclusions of the EPA. The EPA has a strong incentive to avoid making major changes to the draft scientific report and, as a result, the final rule. If major changes are made, the EPA might be forced by law to restart the rulemaking process.

Congress is taking notice. The House Transportation and Infrastructure Committee passed a bill (H.R. 5078) that would prohibit implementation of the proposed rule, and legislation (S. 2496) has been introduced in the Senate to prohibit implementation as well. In addition, the FY 2015 House Interior and Environment appropriations bill that passed out of the appropriations committee includes a provision that withholds funds for implementation of the rule.

Ultimately though, it is the responsibility of Congress to define the term “navigable waters” instead of deferring to the EPA and the Corps. History shows these agencies will continue to seek to expand their authority. As with other laws, Congress needs to reassert its authority and rein in agency overreach. Private property rights are at stake.

SOURCE





Another comment on Risbey et al

The more you look at it, the stranger the paper becomes. The fact that it has among its authors two old Warmist warriors with no expertise in climate science may help explain that.

The Risbey et al. (2014) "Well-estimated global surface warming in climate projections selected for ENSO phase" is yet another paper trying to blame the recent dominance of La Niña events for the slowdown in global surface temperature warming, the hiatus. This one, however, states that ENSO contributes to the warming when El Niño events dominate. That occurred from the mid-1970s to the late-1990s. Risbey et al. (2014) also has a number of curiosities that make it stand out from the rest. One of those curiosities is that they claim that 4 specially selected climate models (which they failed to identify) can reproduce the spatial patterns of warming and cooling in the Pacific (and the rest of the ocean basins) during the hiatus period, while the maps they presented of observed versus modeled trends contradict the claims.

IMPORTANT INITIAL NOTE

I’ve read and reread Risbey et al. (2014) a number of times and I can’t find where they identify the “best” 4 and “worst” 4 climate models presented in their Figure 5. I asked Anthony Watts to provide a second set of eyes, and he was also unable to find where they list the models selected for that illustration.

Risbey et al. (2014) identify 18 models, but not the “best” and “worst” of those 18 they used in their Figure 5. Please let me know if I’ve somehow overlooked them. I’ll then strike any related text in this post.

Further to this topic, Anthony Watts sent emails to two of the authors on Friday, July 18, 2014, asking if the models selected for Figure 5 had been named somewhere. Refer to Anthony’s post A courtesy note ahead of publication for Risbey et al. 2014. Anthony has not received replies. While there are numerous other 15-year periods presented in Risbey et al (2014) along with numerous other “best” and “worst” models, our questions pertained solely to Figure 5 and the period of 1998-2012, so it should have been relatively easy to answer the question…and one would have thought the models would have been identified in the Supplementary Information for the paper, but there is no Supplementary Information.

Because Risbey et al. (2014) have not identified the models they’ve selected as “best” and “worst”, their work cannot be verified.

INTRODUCTION

The Risbey et al. (2014) paper Well-estimated global surface warming in climate projections selected for ENSO phase was just published online. Risbey et al. (2014) are claiming that if they cherry-pick a few climate models from the CMIP5 archive (used by the IPCC for their 5th Assessment Report)—that is, if they select specific climate models that best simulate a dominance of La Niña events during the global warming hiatus period of 1998 to 2012—then those models provide a good estimate of warming trends (or lack thereof) and those models also properly simulate the sea surface temperature patterns in the Pacific, and elsewhere.

Those are very odd claims. The spatial patterns of warming and cooling in the Pacific are dictated primarily by ENSO processes and climate models still can’t simulate the most basic of ENSO processes. Even if a few of the models created the warming and cooling spatial patterns by some freak occurrence, the models still do not (cannot) properly simulate ENSO processes. In that respect, the findings of Risbey et al. (2014) are pointless.

Additionally, their claims that the very-small, cherry-picked subset of climate models provides good estimates of the spatial patterns of warming and cooling in the Pacific for the period of 1998-2012 are not supported by the data and model outputs they presented, so Risbey et al. (2014) failed to deliver.

There are a number of other curiosities, too.

ABSTRACT

The Risbey et al. (2014) abstract reads:

"The question of how climate model projections have tracked the actual evolution of global mean surface air temperature is important in establishing the credibility of their projections. Some studies and the IPCC Fifth Assessment Report suggest that the recent 15-year period (1998–2012) provides evidence that models are overestimating current temperature evolution. Such comparisons are not evidence against model trends because they represent only one realization where the decadal natural variability component of the model climate is generally not in phase with observations. We present a more appropriate test of models where only those models with natural variability (represented by El Niño/Southern Oscillation) largely in phase with observations are selected from multi-model ensembles for comparison with observations. These tests show that climate models have provided good estimates of 15-year trends, including for recent periods and for Pacific spatial trend patterns."

Curiously, in their abstract, Risbey et al. (2014) note a major flaw with the climate models used by the IPCC for their 5th Assessment Report—that they are “generally not in phase with observations”—but they don’t accept that as a flaw. If your stock broker’s models were out of phase with observations, would you continue to invest with that broker based on their out-of-phase models or would you look for another broker whose models were in-phase with observations? Of course, you’d look elsewhere.

Unfortunately, we don’t have any other climate “broker” models to choose from. There are no climate models that can simulate naturally occurring coupled ocean-atmosphere processes that can contribute to global warming and that can stop global warming…or, obviously, simulate those processes in-phase with the real world. Yet governments around the globe continue to invest billions annually in out-of-phase models.

Risbey et al. (2014), like numerous other papers, are basically attempting to blame a shift in ENSO dominance (from a dominance of El Niño events to a dominance of La Niña events) for the recent slowdown in the warming of surface temperatures. Unlike others, they acknowledge that ENSO would also have contributed to the warming from the mid-1970s to the late 1990s, a period when El Niños dominated.

CHANCE VERSUS SKILL

The fifth paragraph of Risbey et al. (2014) begins (my boldface):

In the CMIP5 models run using historical forcing there is no way to ensure that the model has the same sequence of ENSO events as the real world. This will occur only by chance and only for limited periods, because natural variability in the models is not constrained to occur in the same sequence as the real world.

Risbey et al. (2014) admitted that the models they selected for having the proper sequence of ENSO events did so by chance, not out of skill, which undermines the intent of their paper. If the focus of the paper had been the need for climate models to be in-phase with observations, they would have achieved their goal. But that wasn’t the aim of the paper. The concluding sentence of the abstract claims that “…climate models have provided good estimates of 15-year trends, including for recent periods…” when, in fact, it was by pure chance that the cherry-picked models aligned with the real world. No skill involved. If models had any skill, the outputs of the models would be in-phase with observations.

More HERE





5 million Scottish trees felled for wind farms

ONLY a fraction of Scottish forests felled to make way for wind farms have been replanted, figures show, sparking calls for a ban on new developments.

Forestry Commission statistics reveal that about five million trees – almost one for every person in Scotland – have been cut down to clear space for turbines in the past six years but less than a third of them have been replaced.

Of the 2,510 hectares stripped of woodland to make way for turbines since 2007, just 792 hectares were reforested after construction was completed.
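The “less than a third” claim can be checked directly from the hectare figures above; a minimal arithmetic sketch:

```python
# Figures from the Forestry Commission FOI response cited in the article.
felled_ha = 2510      # hectares cleared for turbines since 2007
replanted_ha = 792    # hectares reforested after construction

fraction = replanted_ha / felled_ha
# Roughly 31.6% replanted, i.e. just under a third.
print(f"Replanted fraction: {fraction:.1%}")
```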

The Scottish Conservatives, who obtained the figures through a Freedom of Information request, claimed the figures are evidence that the Scottish Government is “destroying nature” in a bid to meet its own climate targets, which aim for all the country’s electricity to come from renewable sources by 2020.

MSP Murdo Fraser, energy spokesman for the party, said: “The SNP is so blindly obsessed with renewable energy that it doesn’t mind destroying another important environmental attribute to make way for it.

“It’s quite astonishing to see almost as many trees have been destroyed as there are people in Scotland.”

The government has hit back at the claims, saying the figures do not represent the full picture.

Environment and climate change minister Paul Wheelhouse said: “We have replanted nearly 800 hectares and have restored significant areas of important open habitat where this is best for the environment. The result is that, of the area felled for wind farms, only 315 hectares of land suitable for another rotation of trees has not been replanted.”

He also pointed out that 31,400 hectares of new forest were planted around the country in the same six-year period. “That’s a staggering 62 million trees in the ground across Scotland,” he said.

“Scotland is also shouldering the vast majority of tree-planting in Britain, with nearly two and a half times more in Scotland compared to south of the Border.”

Mr Fraser, who has previously voiced his opposition to wind farms, is calling for a year-long moratorium on planning applications for new developments.

The regional MSP for Mid-Scotland and Fife said: “The contribution of trees to our environment has been well established through the ages.

“I’m still waiting to see compelling evidence of the contribution wind farms make. They are an expensive, intermittent and unreliable alternative, and not one that it’s worth making this level of sacrifice to accommodate.

“If the Scottish Government cooled its ludicrous renewable energy targets, we wouldn’t see this kind of wanton destruction and intrusion on our landscape.”

Mr Wheelhouse defended Scotland’s planning rules, which he said require developers to plant new trees to replace any cut down to make way for wind farms.

He added: “It was the Scottish Government that took a proactive role in protecting Scotland’s forests and woodlands. In 2009, we tightened up the guidance around felling from wind farm developments.

“A key component is to keep any felling to a minimum and compensatory planting undertaken where suitable. Every energy company building wind farms has to comply with this policy. All renewable developments are subject to environmental scrutiny through the planning process and this manages any impacts on the natural environment, landscape and communities.”

SOURCE




Smart Growth Facts vs. Ideology

Debates over smart growth–sometimes known as new urbanism, compact cities, or sustainable urban planning, but always meaning higher urban densities and a higher share of people in multifamily housing–boil down to factual questions. But smart-growth supporters keep trying to twist the arguments into ideological issues.

For example, in response to my Minneapolis Star Tribune article about future housing demand, Thomas Fisher, the dean of the College of Design at the University of Minnesota, writes, “O’Toole, like many conservatives, equates low-density development with personal freedom.” In fact, I equate personal freedom with personal freedom.

Fisher adds, “we [meaning government] should promote density where it makes sense and prohibit it where it doesn’t”; in other words, restrict personal freedom whenever planners’ ideas of what “makes sense” differ from yours. Why? As long as people pay the costs of their choices, they should be allowed to choose high or low densities without interference from planners like Fisher.

Another writer who makes this ideological is Daily Caller contributor Matt Lewis, who believes that conservatives should endorse new urbanism. His weird logic is that conservatives want people to love their country, high-density neighborhoods are prettier than low-density suburbs, and people who don’t have pretty places to live will stop loving their country. Never mind that more than a century of suburbanization hasn’t caused people to stop loving their country; the truth is there are many beautiful suburbs and many ugly new urban developments.

Lewis adds, “Nobody I know is suggesting that big government–or the U.N.!–ought to mandate or impose these sorts of development policies.” He apparently doesn’t know many urban planners, and certainly none in Denver, Portland, San Francisco, Seattle, the Twin Cities, or other metropolitan areas where big government in the form of regional planning agencies (though not the U.N.) is doing just that. If new urbanism were simply a matter of personal choice, no one would criticize it.

The real issues are factual, not ideological.

Fact #1: Contrary to University of Utah planning professor Arthur Nelson, most people everywhere prefer low-density housing as soon as they have transport that is faster than walking. While a minority does prefer higher densities, the market will provide both as long as there is demand for them.

Fact #2: Contrary to Matt Lewis, American suburbanization did not result from a “post-World War II push for sprawl” coming from “the tax code, zoning, a federally financed highway system, and so on.” Suburbanization began before the Civil War when steam trains could move people faster than walking speed. Most American families abandoned transit and bought cars long before interstate highways–which, by the way, more than paid for themselves with the gas taxes collected from the people who drove on them. Nor did the tax code promote sprawl: Australians build bigger houses with higher homeownership rates in suburbs just as dispersed as America’s without a mortgage interest deduction.

Fact #3: Contrary to Thomas Fisher, low-density housing costs less, not more, than high-density. Without urban-growth boundaries or other artificial restraints, there is almost no urban area in America short of land for housing. Multifamily housing costs more to build, per square foot, than single-family, and compact development is expensive because the planners tend to locate it in areas with the highest land prices. The relative prices in my article–$375,000 for a 1,400-square-foot home in a New Urban neighborhood vs. $295,000 for a 2,400-square-foot home on a large suburban lot–are typical for many smart-growth cities. Compare these eastside Portland condos with these single-family homes in a nearby Portland suburb.
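The relative prices quoted above are even starker on a per-square-foot basis; a quick sketch using only the article’s figures:

```python
# Prices and sizes from the article's Minneapolis-area example.
new_urban = {"price": 375_000, "sqft": 1_400}   # New Urban neighborhood home
suburban = {"price": 295_000, "sqft": 2_400}    # large-lot suburban home

for name, home in (("New Urban", new_urban), ("suburban", suburban)):
    # Roughly $268/sqft for the New Urban home vs. $123/sqft for the suburban one.
    print(f"{name}: ${home['price'] / home['sqft']:.0f} per square foot")
```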

Fact #4: Contrary to Fisher, the so-called costs of sprawl are nowhere near as high as the costs of density. Rutgers University’s Costs of Sprawl 2000 estimates that urban services to low-density development cost about $11,000 more per house than services to high-density development. This is trivial compared with the tens to hundreds of thousands of dollars added to home prices in regions whose policies promote compact development.

Fact #5: Contrary to University of Minnesota planning professor Richard Bolan, the best way to reduce externalities such as pollution and greenhouse gases is to treat the source, not try to change people’s lifestyles. For example, since 1970, pollution controls reduced total air pollution from cars by more than 80 percent, while efforts to entice people out of their cars and onto transit reduced pollution by 0 percent.

Fact #6: Contrary to Lewis, suburbs are not sterile, boring places. Suburbanites have a strong sense of community and are actually more likely to engage in community affairs than city dwellers.

Fact #7: Smart growth doesn’t even work. It doesn’t reduce driving: After taking self-selection into account, its effects on driving are “too small to be useful.” It doesn’t save money or energy: Multi-family housing not only costs more, it uses more energy per square foot than single-family, while transit costs more and uses as much or more energy per passenger mile as driving. When planners say smart growth saves energy, what they mean is you’ll live in a smaller house and have less mobility.

Fact #8: If we end all subsidies and land-use regulation, I’ll happily accept whatever housing and transport outcomes result from people expressing their personal preferences. Too many planners want to control population densities and transport choices through prescriptive land-use regulation and huge subsidies to their preferred forms of transportation and housing.

These planners think only government can know what is truly right for other people. Even if you believe that, government failure is worse than market failure and results in subsidies to special interest groups for projects that produce negligible social or environmental benefits.

If urban planners have a role to play, it is to ensure people pay the costs of their choices. Instead, it is planners, rather than economists such as myself, who have become ideological, insisting density is the solution to all problems despite the preferences of 80 percent of Americans for low-density lifestyles.

SOURCE

***************************************

For more postings from me, see  DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC and AUSTRALIAN POLITICS. Home Pages are   here or   here or   here.  Email me (John Ray) here.  

Preserving the graphics:  Most graphics on this site are hotlinked from elsewhere.  But hotlinked graphics sometimes have only a short life -- as little as a week in some cases.  After that they no longer come up.  From January 2011 on, therefore, I have posted a monthly copy of everything on this blog to a separate site where I can host text and graphics together -- which should make the graphics available even if they are no longer coming up on this site.  See  here or here

*****************************************

Thursday, July 24, 2014


Mega pesky:  Deep Oceans have been Cooling For The Past 20 Years

In polite scientific language, this study demolishes the Warmist explanation for "missing heat".  At every point of the warmist explanation, the data show the opposite of what that explanation requires.  In addition, and as even I have repeatedly pointed out, the authors note that there is no known mechanism that would cause ocean heat to move in the paradoxical way that Warmists theorize.  It's all BS, to put it in layman's terms

Two of the world’s premier ocean scientists, from Harvard and MIT, have addressed the data limitations that currently prevent the oceanographic community from resolving the differences among various estimates of changing ocean heat content. They point out where future data is most needed so these ambiguities do not persist into the next several decades of change.

As a by-product of that analysis they 1) determined the deepest oceans are cooling, 2) estimated a much slower rate of ocean warming, 3) highlighted where the greatest uncertainties existed due to the ever changing locations of heating and cooling, and 4) specified concerns with previous methods used to construct changes in ocean heat content, such as Balmaseda and Trenberth’s re-analysis (see below). They concluded, “Direct determination of changes in oceanic heat content over the last 20 years are not in conflict with estimates of the radiative forcing, but the uncertainties remain too large to rationalize e.g., the apparent “pause” in warming.”

Wunsch and Heimbach (2014) humbly admit that their “results differ in detail and in numerical values from other estimates, but determining whether any are ‘correct’ is probably not possible with the existing data sets.”

They estimate the changing states of the ocean by synthesizing diverse data sets using models developed by the consortium for Estimating the Circulation and Climate of the Ocean, ECCO. The ECCO “state estimates” have eliminated deficiencies of previous models and they claim, “unlike most “data assimilation” products, [ECCO] satisfies the model equations without any artificial sources or sinks or forces. The state estimate is from the free running, but adjusted, model and hence satisfies all of the governing model equations, including those for basic conservation of mass, heat, momentum, vorticity, etc. up to numerical accuracy.”

Their results (Figure 18 below) suggest a flattening or slight cooling in the upper 100 meters since 2004, in agreement with the -0.04 Watts/m2 cooling reported by Lyman (2014). The consensus of previous researchers has been that temperatures in the upper 300 meters have flattened or cooled since 2003, while Wunsch and Heimbach (2014) found the upper 700 meters still warmed up to 2009.

The deep layers contain twice as much heat as the upper 100 meters, and overall exhibit a clear cooling trend for the past 2 decades. Unlike the upper layers, which are dominated by the annual cycle of heating and cooling, they argue that deep ocean trends must be viewed as part of the ocean’s long term memory which is still responding to “meteorological forcing of decades to thousands of years ago”. If Balmaseda and Trenberth’s model of deep ocean warming were correct, any increase in ocean heat content must have occurred between 700 and 2000 meters, but the mechanisms that would warm that “middle layer” remain elusive.

The detected cooling of the deepest oceans is quite remarkable given geothermal warming from the ocean floor. Wunsch and Heimbach (2014) note, “As with other extant estimates, the present state estimate does not yet account for the geothermal flux at the sea floor whose mean values (Pollack et al., 1993) are of order 0.1 W/m2,” which is small but “not negligible compared to any vertical heat transfer into the abyss.” (A note of interest: an increase in heat from the ocean floor has recently been associated with increased basal melt of Antarctica’s Thwaites glacier.) Since heated waters rise, I find it reasonable to assume that, at least in part, any heating of the “middle layers” likely comes from heat that was stored in the deepest ocean decades to thousands of years ago.
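To get a sense of the scale of that 0.1 W/m2 flux, a back-of-envelope sketch; the global ocean area used here (roughly 3.6e14 m2) is my assumption, not a figure from the article:

```python
# Scale check on the mean geothermal flux cited from Pollack et al. (1993).
geothermal_flux = 0.1     # W per square metre of sea floor (from the quote above)
ocean_area = 3.6e14       # m^2, approximate global ocean area (assumed here)

total_watts = geothermal_flux * ocean_area
# About 3.6e13 W, i.e. ~36 terawatts entering the abyss from below.
print(f"Total geothermal input: ~{total_watts / 1e12:.0f} TW")
```

Small per square metre, but integrated over the whole sea floor it is clearly not negligible next to any vertical heat transfer into the abyss.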

Wunsch and Heimbach (2014) emphasize the many uncertainties involved in attributing the cause of changes in the overall heat content concluding, “As with many climate-related records, the unanswerable question here is whether these changes are truly secular, and/or a response to anthropogenic forcing, or whether they are instead fragments of a general red noise behavior seen over durations much too short to depict the long time-scales of Fig. 6, 7, or the result of sampling and measurement biases, or changes in the temporal data density.”

Given those uncertainties, they concluded that much less heat is being added to the oceans compared to claims in previous studies (seen in the table below).  It is interesting to note that compared to Hansen’s study that ended in 2003 before the observed warming pause, subsequent studies also suggest less heat is entering the oceans. Whether those declining trends are a result of improved methodologies, or due to a cooler sun, or both requires more observations.

No climate model had predicted the dramatically rising temperatures in the deep oceans calculated by the Balmaseda/Trenberth re-analysis, and oceanographers suggest such a sharp rise is more likely an artifact of shifting measuring systems. Indeed the unusual warming correlates with the switch to the Argo observing system. Wunsch and Heimbach (2013) wrote, “clear warnings have appeared in the literature—that spurious trends and values are artifacts of changing observation systems (see, e.g., Elliott and Gaffen, 1991; Marshall et al., 2002; Thompson et al., 2008)—the reanalyses are rarely used appropriately, meaning with the recognition that they are subject to large errors.”

More specifically, Wunsch and Heimbach (2014) warned, “Data assimilation schemes running over decades are usually labeled ‘reanalyses.’ Unfortunately, these cannot be used for heat or other budgeting purposes because of their violation of the fundamental conservation laws; see Wunsch and Heimbach (2013) for discussion of this important point. The problem necessitates close examination of claimed abyssal warming accuracies of 0.01 W/m2 based on such methods (e.g., Balmaseda et al., 2013).”

So who to believe?

Because ocean heat is stored asymmetrically and that heat is shifting 24/7, any limited sampling scheme will be riddled with large biases and uncertainties. In Figure 12 below Wunsch and Heimbach (2014) map the uneven densities of regionally stored heat. Apparently associated with its greater salinity, most of the central North Atlantic stores twice as much heat as any part of the Pacific and Indian Oceans. Regions where there are steep heat gradients require a greater sampling effort to avoid misleading results. They warned, “The relatively large heat content of the Atlantic Ocean could, if redistributed, produce large changes elsewhere in the system and which, if not uniformly observed, show artificial changes in the global average.”

Furthermore, due to the constant time-varying heat transport, regions of warming are usually compensated by regions of cooling, as illustrated in their Figure 15. It offers a wonderful visualization of the current state of those natural ocean oscillations by comparing changes in heat content between 1992 and 2011. Those re-distributions involve enormous amounts of heat, making detection of changes in heat content that are many magnitudes smaller extremely difficult. Again, any uneven sampling regime in time or space would result in “artificial changes in the global average”.

Figure 15 shows the most recent effects of La Nina and the negative Pacific Decadal Oscillation. The eastern Pacific has cooled, while simultaneously the intensifying trade winds have swept more warm water into the western Pacific, causing it to warm. Likewise, heat stored in the mid-Atlantic has likely been transported northward, as that region has cooled while simultaneously the sub-polar seas have warmed. This northward change in heat content is in agreement with earlier discussions about cycles of warm water intrusions that affect Arctic sea ice, confound climate models of the Arctic and control the distribution of marine organisms.

Most interesting is the observed cooling throughout the upper 700 meters of the Arctic. There have been two competing explanations for the unusually warm Arctic air temperatures that heavily weight the global average. CO2-driven hypotheses argue that global warming has reduced polar sea ice that previously reflected sunlight, so the newly exposed dark waters are absorbing more heat and raising water and air temperatures. But clearly a cooling upper Arctic Ocean suggests any absorbed heat is insignificant. Despite greater inflows of warm Atlantic water, declining heat content of the upper 700 meters supports the competing hypothesis that warmer Arctic air temperatures are, at least in part, the result of increased ventilation of heat that was previously trapped by a thick insulating ice cover.7 That second hypothesis is also in agreement with extensive observations that Arctic air temperatures had been cooling in the 80s and 90s. Warming occurred after subfreezing winds, re-directed by the Arctic Oscillation, drove thick multi-year ice out of the Arctic.11

Regional cooling is also detected along the storm track from the Caribbean and along eastern USA. This evidence contradicts speculation that hurricanes in the Atlantic will or have become more severe due to increasing ocean temperatures. This also confirms earlier analyses of blogger Bob Tisdale and others that Superstorm Sandy was not caused by warmer oceans.

In order to support their contention that the deep ocean has been dramatically absorbing heat, Balmaseda/Trenberth must provide a mechanism and the regional observations showing where heat has been carried from the surface to those depths. But few are to be found. Warming at great depths and simultaneous cooling of the surface is antithetical to climate model predictions. Models had predicted global warming would store heat first in the upper layer and stratify that layer. Diffusion would require hundreds to thousands of years, so it is not the mechanism. Trenberth, Rahmstorf, and others have argued that the winds could drive heat below the surface. Indeed winds can drive heat downward in a layer that oceanographers call the “mixed layer,” but the depth where wind mixing occurs is restricted to a layer roughly 10-200 meters thick over most of the tropical and mid-latitude belts. And those depths have been cooling slightly.
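The diffusion timescale claimed above survives a back-of-the-envelope check. Assuming a canonical open-ocean diapycnal diffusivity of about 1e-5 m²/s (an illustrative textbook value, not a figure from any of the papers discussed), the time for heat to diffuse down through 2,000 meters is roughly L²/κ:

```python
# Back-of-the-envelope diffusive timescale: t ~ L^2 / kappa.
# kappa is an assumed canonical diapycnal diffusivity, not a value
# taken from Wunsch and Heimbach (2014).
kappa = 1e-5                  # m^2/s, assumed diffusivity
depth = 2000.0                # m, depth of the layer in question

t_seconds = depth**2 / kappa
t_years = t_seconds / (3600 * 24 * 365)
print(f"{t_years:,.0f} years")   # on the order of ten thousand years
```

Even with a diffusivity several times larger, the answer stays in the thousands of years, far too slow to move surface heat into the deep ocean within a 15-year hiatus.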

The only other possible mechanism that could reasonably explain heat transfer to the deep ocean is that the winds could tilt the thermocline. The thermocline delineates a rapid transition between the ocean’s warm upper layer and cold lower layer. As illustrated above in Figure 15, during a La Nina warm waters pile up in the western Pacific and deepen the thermocline. But the tilting Pacific thermocline typically does not dip below 700 meters, if ever.8

Unfortunately the analysis by Wunsch and Heimbach (2014) does not report on changes in the layer between 700 meters and 2000 meters. However, based on changes in heat content below 2000 meters (their Figure 16 below), deeper layers of the Pacific are practically devoid of any deep warming.

The one region transporting the greatest amount of heat into the deep oceans is the ice forming regions around Antarctica, especially the eastern Weddell Sea, where sea ice has been expanding annually.12 Unlike the Arctic, the Antarctic is relatively insulated from intruding subtropical waters (discussed here), so any deep warming is mostly from heat descending from above with a small contribution from geothermal heat.

Counter-intuitively, greater sea ice production can deliver relatively warmer subsurface water to the ocean abyss. When oceans freeze, the salt is ejected to form a dense brine with a temperature that always hovers at the freezing point. Typically this unmodified water is called shelf water. Dense shelf water readily sinks to the bottom of the polar seas. However, in transit to the bottom, shelf water must pass through layers of variously modified Warm Deep Water or Antarctic Circumpolar Water. Turbulent mixing also entrains some of the warmer water down to the abyss. Warm Deep Water typically comprises 62% of the mixed water that finally reaches the bottom. Any altered dynamic (such as increasing sea ice production, or circulation effects that entrain a greater proportion of Warm Deep Water) can redistribute more heat to the abyss.14 Due to the Antarctic Oscillation, the warmer waters carried by the Antarctic Circumpolar Current have been observed to undulate southward, bringing those waters closer to ice forming regions. Shelf waters have generally cooled and there has been no detectable warming of the Warm Deep Water core, so this region’s deep ocean warming is likely just re-distributing heat, not adding to the ocean heat content.
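The mixing arithmetic behind that 62% figure can be sketched with assumed temperatures. The values below are illustrative, not observations: shelf water near the surface freezing point of seawater, and a Warm Deep Water component a little above zero.

```python
# Linear two-component mixing of temperature by mass fraction.
# Both temperatures are illustrative assumptions, not measured values.
t_shelf = -1.9    # deg C, brine-enriched shelf water near the freezing point
t_wdw = 0.5       # deg C, entrained Warm Deep Water (assumed)
f_wdw = 0.62      # Warm Deep Water fraction of the descending mixture

t_bottom = f_wdw * t_wdw + (1 - f_wdw) * t_shelf
print(round(t_bottom, 3))    # -0.412: warmer than pure shelf water
print(t_bottom > t_shelf)    # True
```

Even though shelf water starts at the freezing point, the entrained Warm Deep Water lifts the temperature of the descending mixture, which is how more ice production can carry relatively warmer water to the abyss.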

So it remains unclear if and how Trenberth’s “missing heat” has sunk to the deep ocean. The depiction of a dramatic rise in deep ocean heat is highly questionable, even though alarmists have flaunted it as proof of CO2’s power. As Dr. Wunsch had warned earlier, “Convenient assumptions should not be turned prematurely into ‘facts,’ nor uncertainties and ambiguities suppressed.” … “Anyone can write a model: the challenge is to demonstrate its accuracy and precision... Otherwise, the scientific debate is controlled by the most articulate, colorful, or adamant players.”

To reiterate, “the uncertainties remain too large to rationalize e.g., the apparent ‘pause’ in warming.”

More HERE  (See the original for links, graphics etc.)





“WELL-ESTIMATED GLOBAL SURFACE WARMING”

Warmist paper was just being wise after the event

 Dr David Whitehouse

This new paper allows great headlines to proclaim that the warming “pause” in global surface temperature is explainable by climate models. As is often the case in climate reporting the details do not back up the headline.

Risbey et al (2014) in Nature Climate Change is yet another paper suggesting that the global surface temperature hiatus of the last 15 years or so is due to changes in the character of the ENSO. But they go a little further and say that once the observational timing of ENSO changes is included in climate models they do a good job. Unfortunately, whilst an interesting and thought-provoking paper, it does not support its own conclusion that “climate models have provided good estimates of the 15-year trends for recent periods.”

Climate models have many uses and are essential tools to discover what is going on and, with major caveats, suggest future possibilities. It is well known that, as a whole, the CMIP5 ensemble of models does not represent reality that well, with only two models coming anywhere near reflecting the hiatus in global surface temperature seen in the last 15 years or so.

With a climate model ensemble that is mainly unrepresentative of reality there are several possibilities for further action. One is to have faith in the models and trust that over longer timescales reality’s departure from them is temporary. Another is to select those models that best simulate reality and concentrate on them, and a third is to refine the models. Risbey et al (2014) carry out both of the latter options.

They selected 18 out of 32 CMIP5 models, choosing the ones that had sea surface temperature as a model output. In itself this introduces a selection effect whose influence on subsequent selections of “suitable” models is unknown. Out of those 18 they selected the four best and four worst. The best included ENSO parameters that are in phase with observations. They argue that when the phase of ENSO is right, climate models do represent reality. Unfortunately the evidence they provide for this is not convincing.

If the ENSO with La Nina dominant is having the effect of flattening the global surface temperature of the past 15 years or so, then the converse must also be true: ENSO with El Nino dominant would have contributed to the warming seen since about 1980. [Pesky!]

Our lack of understanding of the ENSO process also affects the stated conclusions of this paper. We cannot predict these events with any certainty and we cannot simulate them to any degree of great accuracy. So while there are ENSO components in a climate model, to say that those in the right phase do better could mean nothing. In addition there are other semi-regular changes such as the Atlantic oscillation that might, or might not, be in phase with the observations.

Supplementary information would have helped in understanding this paper, especially the selection of the models, but unfortunately there is none. This means that, given the information in this paper alone, it would not be possible to retrace the authors’ footsteps.

This paper allows great headlines to be written proclaiming that the “pause” in global surface temperature is explainable by climate models. As is often the case in climate reporting the details do not back up the headline.

What this paper has really done is to draw attention to the limitations of the climate models. One can select subsets of them and argue that they are better than others, but the real test is whether the Risbey et al (2014) paper has predictive power. In science, looking forward is always more powerful than looking back and adjusting models to fit the data.

Risbey et al (2014) say they expect the observed trend to bounce back. So do many others for different reasons. If it does how will we know who is right?

SOURCE




Deficient Chicago infrastructure blamed on climate change

Since there has been no climate change for 17 years, we can KNOW that to be false

Sewage gushed up Lori Burns’s toilet. It swept the floor. It wrecked the water heater, the deep freezer, her mother’s wedding veil.

This basement invasion was the third in five years. Burns, 40, could no longer afford to pay a cleanup crew. So she slipped on polka dotted rain boots, waded into the muck, wrenched out the stand-pipe and watched the brown water drain.

The South Side native, a marketing specialist, estimated damages at $17,000. And that did not include what she could not replace: the family heirlooms, the oriental rugs, her cashmere sweaters. The bungalow had flooded four times from 1985 to 2006, when her parents owned it. Lately, it flooded every other year. Burns felt nature was working against her. In a way, it was.

As Washington still fights over whether or not climate change is real, people across the country are already paying costs scientists ascribe to it — sometimes in unexpected places. You might think about climate change in terms of rising sea levels threatening coastal cities. But all over the Midwest, from Chicago to Indianapolis and Milwaukee, residents face just as many difficult issues as changing weather patterns collide with aging infrastructure. The costs — for governments, insurance companies and homeowners — are measured not only in dollars, but in quality of life.

In Chicago over the past century, downpours that force human waste up pipes and into homes — storms that dump at least 1.5 inches of rain in a single day — have struck the city more often. Annual precipitation in the Midwest grew about 20 percent during the past century. Rains of more than 2.5 inches a day are expected to increase another 50 percent in the next 20 years. That means more flooding — and more clean-up costs for people like Burns.

As the April rain poured, she texted her brother: How much bleach do you have?

On came the snowsuits, goggles and face masks. They dumped bleach on the floor, mopped and reminisced about what they had survived in this basement: a midnight home intruder, the occasional pop-pop of neighborhood gunfire, their parents’ divorce. Here they played Monopoly and watched “The Cosby Show” and learned the truth about Santa Claus.

Soon the silt, as Burns euphemistically called it, was gone. Fans would dry the dampness. The worst was over, it seemed.

In May, a year after sewage swamped Burns’s basement, an insurance giant took to an Illinois courtroom for what might have been a publicity stunt, or what might be a preview of a nationwide battle over who foots the bill for extreme weather events linked to climate change. Farmers Insurance Co. sued the city of Chicago for failing to prepare for the effects of global warming.

The city “should have known,” the lawsuit alleged, “that climate change in Cook County has resulted in greater rainfall volume … than pre-1970 rainfall history evidenced.” The storms are not an act of God, the suit claimed, but a carbon-driven reality outlined in Chicago’s own Climate Action Plan, published in 2010.

Last April, sewage water flooded roughly 600 Chicago buildings, according to the lawsuit: “Geysers of sewer water shot out from the floor drains, toilets, showers. … Elderly men and women and young children were forced to evacuate.” That could have been prevented, the company claimed, if Chicago had remedied an underground storm-water storage system that has become, over time, “obsolete.”

“Farmers has taken what we believe is the necessary action to recover payments made on behalf of our customers,” spokesman Trent Frager said in a statement, “for damages caused by what we believe to be a completely preventable issue.”

Two months later, the company dropped the suit — “We hoped that by filing … we would encourage cities and counties to take preventative steps,” Frager said — but not before raising issues that are sure to return to the courts if current climate trends persist.

“The debate we have entered now is: Why does it seem more and more disasters are happening?” said Erwann Michel-Kerjan, executive director of the Wharton Risk Management and Decision Processes Center at the University of Pennsylvania. “And, as a nation, who’s supposed to pay for them?”

The National Climate Assessment, released by the Obama administration in May, predicts that the “frequency and intensity” of the Midwest’s heaviest downpours will more than double over the next hundred years. A handful of heavy spring and summer storms, the kind that flood homes, can supply 40 percent of the region’s annual rainfall, according to the Environmental Protection Agency.

If weather patterns follow projections, that means trouble for aging urban infrastructures and the cities, like Chicago, that rely on them: “Designs are based upon historical patterns of precipitation and stream flow,” the climate assessment says, “which are no longer appropriate guides.”

The link between climate and flooding in Chicago, however, can’t be summarized with, “It’s warmer out, so this storm happened.” Inherent uncertainties in science make it difficult to disentangle just what forces play Rainmaker.

The American Association for the Advancement of Science, which calls itself the world’s largest non-government science advocacy group, released a report this year called “What We Know,” which offers a nuanced look at climate change and its effects. The report concludes that natural disasters, like floods, are striking harder and more often. But, beyond anecdotes and weather projections, it adds, it’s hard to link one specific flood to carbon emissions.

Increased storm frequency is particularly problematic in Chicago, where the sewer system was designed to absorb rain nearly 120 years ago. The city’s storm water systems were built on the assumption that the biggest storms happen only once each decade, at a time when the population was much smaller, said Robert Moore, who leads a climate preparation team at the Natural Resources Defense Council in downtown Chicago. “Climate change will only amplify an existing issue.”

The combined sewer system overflows when an inch of rain soaks the city, directing waste into the Chicago River. If more than 1.5 inches of rain fall city-wide in a day, Moore said, it floods basements across town, disrupting lives and bank accounts.

District engineers agree the problem is serious, and they’re building heavily to address it. They’ve seen the data and the changing weather patterns, but don’t think it suggests any particular cause. They don’t blame a man-made Apocalypse.

“Climate change is a political term,” said David St. Pierre, head of Chicago’s Metropolitan Water Reclamation District. “But you can’t ignore that our weather has changed drastically in the past five years.”

The city’s underground storm and wastewater storage can now hold about 2.7 billion gallons of overflow. By 2015, storage should total 7.5 billion gallons, St. Pierre said. By 2029, 17.5 billion gallons.

“I don’t see any overflows happening when that’s done,” he said. “We’re getting this under control, maybe more than any other city in the U.S.”

SOURCE





Britain Won’t Sign New Climate Treaty Unless China, India Agree CO2 Caps

Britain will not sign a global deal on climate change unless it includes commitments from China and India on reducing emissions, the energy and climate change secretary said on the eve of visiting the two countries.

China is the world’s highest emitter of greenhouse gases and India the third. Neither has agreed any cap on emissions. In an interview with The Times, Ed Davey said that there was little point in Britain making great efforts to cut emissions if other countries did not. “If I looked around the world and no one was doing anything I would have to ask myself the question: is it worth us doing anything if no one else is?” he said.

Speaking before meetings in Beijing and Delhi this week to discuss contributions to a global climate deal due to be signed in Paris next year, Mr Davey said: “We won’t do a deal unless these countries come on board. We need a deal that’s applicable to all — that’s what we didn’t get at Kyoto [the 1997 conference in Japan at which binding targets were set for the emissions of industrialised nations].” Mr Davey said that developing countries should be allowed to carry on increasing their emissions for a few years but at a lower rate and with clear targets for when the level should peak and start declining.

“We expect the rich, developed countries to cut aggressively, emerging economies to peak and then decline and the developing countries and the poorest to increase but hopefully at low rates and have a more sustainable development model than we had.”

On China, he said: “The key for them and the world is when they will peak. The earlier the better. I would like it to be 2025 or earlier. If the Chinese were to say ‘we are not going to commit to a peaking point’, I’m not sure you would get a deal.”

More HERE





A Great Plan to Replace the EPA

By Alan Caruba

For years now I have been saying that the Environmental Protection Agency (EPA) must be eliminated and its powers given to the fifty states, all of which have their own departments of environmental protection. Until now, however, there has been no plan put forth to do so.

Dr. Jay Lehr has done just that, and his plan no doubt will be sent to the members of Congress and the state governors. Titled “Replacing the Environmental Protection Agency,” it should be read by everyone who, like Dr. Lehr, has concluded that the EPA was a good idea when it was created in 1970, but has since evolved into a rogue agency threatening the U.S. economy, attacking the fundamental concept of private property, and harming the lives of all Americans in countless and costly ways.

Dr. Lehr is the Science Director and Senior Fellow of The Heartland Institute, for which I am a policy advisor. He is a leading authority on groundwater hydrology and the author of more than 500 magazine and journal articles and 30 books. He has testified before Congress on more than three dozen occasions on environmental issues and consulted with nearly every agency of the federal government and with many foreign countries. The Institute is a national nonprofit research and education organization supported by voluntary contributions.

Ironically, he was among the scientists who called for the creation of the EPA and served on many of the then-new agency’s advisory councils. Over the course of its first ten years, he helped write a significant number of legislative bills to create a safety net for the environment.

As he notes in his plan, “Beginning around 1981, liberal activist groups recognized EPA could be used to advance their political agenda by regulating virtually all human activities regardless of their impact on the environment. Politicians recognized they could win votes by posing as protectors of the public health and wildlife. Industries saw a way to use regulations to handicap competitors or help themselves to public subsidies. Since that time, not a single environmental law or regulation has passed that benefited either the environment or society.”

“The takeover of EPA and all of its activities by liberal activists was slow and methodical over the past 30 years. Today, EPA is all but a wholly owned subsidiary of liberal activist groups. Its rules account for about half of the nearly $2 trillion a year cost of complying with all national regulations in the U.S. President Barack Obama is using it to circumvent Congress to impose regulations on the energy sector that will cause prices to ‘skyrocket.’ It is a rogue agency.”

Dr. Lehr says that “Incremental reform of EPA is simply not an option.”  He's right.

“I have come to believe that the national EPA must be systematically dismantled and replaced by a Committee of the Whole of the 50 state environmental protection agencies. Those agencies in nearly all cases long ago took over primary responsibility for the implementation of environmental laws passed by Congress (or simply handed down by EPA as fiat rulings without congressional vote or oversight).”

Looking back over the years, Dr. Lehr notes that “The initial laws I helped write have become increasingly draconian, yet they have not benefited our environment or the health of our citizens. Instead they suppress our economy and the right of our citizens to make an honest living. It seems to me, and to others, that this is actually the intention of those in EPA and in Congress who want to see government power expanded without regard to whether it is needed to protect the environment or public health.”

Eliminating the EPA would provide major savings by cutting 80% of its budget. The remaining 20% could be used to run its research labs and administer the Committee of the Whole of the 50 state environmental agencies. “The Committee would determine which regulations are actually mandated in law by Congress and which were established by EPA without congressional approval.”

Dr. Lehr estimates the EPA’s federal budget would be reduced from $8.2 billion to $2 billion. Staffing would be reduced from more than 15,000 to 300, and that staff would serve in a new national EPA headquarters he recommends be “located centrally in Topeka, Kansas, to allow the closest contact with the individual states.” The staff would consist of six delegate-employees from each of the 50 states.

“Most states,” says Dr. Lehr, “will enthusiastically embrace this plan, as their opposition to EPA’s ‘regulatory train wreck’ grows and since it gives them the autonomy and authority they were promised when EPA was first created and the funding to carry it out.”

The EPA was a good idea when it was created; the nation’s air and water needed to be cleaned, and by now they have been. Since then, the utterly bogus “global warming,” now called “climate change,” has been used to justify a torrent of EPA regulations. The science the EPA cites as justification is equally tainted and often kept secret from the public.

“It’s time for the national EPA to go,” says Dr. Lehr, and I most emphatically agree. “All that is missing is the political will.”

SOURCE




The EPA takes aim at Tesla, electric cars

The cornerstone of personal independence and commerce in the modern world is motorized mobility — the car. Ever since Henry Ford’s Model T revolutionized travel in the United States over a hundred years ago, people have relied on the automobile for virtually every personal interaction and business expenditure. Today, the car may very well be at the precipice of its evolutionary leap into the 21st century, and Obama’s regulatory state could kill it on arrival.

Elon Musk, founder and CEO of Tesla Motors, has been a pioneer in the development of electric cars that are as practical as they are attractive. Tesla cars are inherently American: efficient, sleek, fast, and, well, sexy. Everything we look for in the vehicles that represent such an enormous part of the American experience.

Recent stories have revealed Musk’s plan to release a $35,000 Tesla model capable of traveling more than 200 miles per charge — about double what the unattractive, euro-like Nissan Leaf can travel — and said to possess the amenities and attractiveness of the current, far more expensive Tesla models. A top-end electric car for the every-man. If achieved successfully, this may mark the beginning of the commonly used exhaust-free, electric automobile. What a glorious achievement for the environmentalist left! … Right?

Well, not quite.

As one can easily deduce, the electric car requires electricity. For electricity to be a more efficient way to power said electric car than, say, petrol fuels, it needs to be available in inexpensive abundance. That’s the non-starter for the EPA and the environmental extremist allies of the Obama administration.

Most American energy is generated by coal and natural gas. Coal is already on its way out. Regardless of the resource’s ability to power the nation for over 500 years at current energy usage rates, the EPA has recently laid down a regulation forcing all plants to reduce emissions by 30 percent — a crippling blow to an already suffering industry. The regulations may actually work far better, and worse, than expected. They may very well reduce emissions from coal-generated power by 100 percent, when the industry is unable to afford the staggering costs of retrofitting plants with new government-regulated technology. They may also, ironically, kill an industry that actually lures the American public away from the gasoline-fired automobile that the same regulatory clear-cutters want to do away with.

If energy prices skyrocket, as Obama said would be an inevitable outcome of his environmental policies, there is no practical purpose to investing in an electric car at any price point.

The free market could be ready to be rid of the carbon-puffing car and the alarmist, reactionary left may have already killed it upon arrival.

What exactly does the Obama administration want for the future of American energy? The market knows what it wants, the people know what they want. But it seems like the environmentalist radicals behind the Obama administration’s energy and environment public policies have an indiscriminate taste to destroy, rather than build for the future.

Progress is just over the horizon, only the self-titled “progressives” stand in the way.

SOURCE

***************************************

For more postings from me, see  DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC and AUSTRALIAN POLITICS. Home Pages are   here or   here or   here.  Email me (John Ray) here.  

Preserving the graphics:  Most graphics on this site are hotlinked from elsewhere.  But hotlinked graphics sometimes have only a short life -- as little as a week in some cases.  After that they no longer come up.  From January 2011 on, therefore, I have posted a monthly copy of everything on this blog to a separate site where I can host text and graphics together -- which should make the graphics available even if they are no longer coming up on this site.  See  here or here

*****************************************