EPA WILL KILL THE ECONOMY AND YOUR JOB

A discussion of EPA and Endangered Species Act rules and regulations, covering the damage and delays they cause in producing everything from food [farming] to energy [nuclear]. Many articles are posted so there can be a library of items to use in your blogs or as a basis for discussions.


Hope you enjoy some of the items.

89 comments
  1. July 1, 2011
    Suits, Not Suites, Determine Firm Performance
    By Kennedy Maize
    Let’s hear it for “the suits.” That’s the phrase Wharton School management professor Ethan Mollick applies to the phalanx of middle managers who populate an important ecological niche in our modern business environment. These folks, writes Mollick in a new report, are “often overlooked and sometimes maligned” in the business world. But they are often the keys to effective performance, not just “interchangeable parts in an organization.”

    In his paper “People and Process, Suits and Innovators: Individuals and Firm Performance,” Mollick argues that it is time to look at the real people who accomplish the work in business organizations, not just the abstractions on organization charts. In this, he is swimming against a tide of organizational studies that look at structure and function—things such as business strategy, management systems, HR policies, and practices—as the drivers of success or failure.

    “Performance differences between firms,” Mollick writes, “are generally attributed to organizational factors—such as routines, knowledge, and strategy—rather than to differences among the individuals who make up firms. As a result, little is known about the part that individual firm members play in explaining the variance in performance among firms. The absence of evidence at the individual level of analysis also prevents a thorough understanding of which roles beyond those of top managers contribute most to firm performance.”

    Much of modern management science ignores the importance of people on the end product, says Mollick. “Is firm performance driven by people or by process?” he asks. “The strategy and organization literature has historically argued that a good process is the key to good performance. The result is a long tradition of using organizational factors, rather than differences among individual employees, to explain differences in firm performance.” Not enough academic work has focused on the role of people in those structures.

    This is not to denigrate top management, says Mollick. The CEOs and CFOs set the overall direction, which is clearly important. But most earlier studies overplay the role of the executive suite and underplay the roles of the non-executive suits. His paper suggests that differences in top management explain less than 5% of firm performance among Fortune 800 firms, concluding that “top managers, at last, account for relatively little of why some companies perform better than others.” But when it comes down to actual performance, “it is all about the middle managers.”

    Another school of thought elevates the role of “innovators”—the creative, outside-the-box types—in explaining business success. Neither conventional approach adequately explains why some firms prosper and others do not, according to Mollick. “It is the individuals who fill the role of middle managers—the ‘suits’—rather than the creative innovators that best explain variation in firm performance.”

    Mollick examined the computer gaming industry in his research for the University of Pennsylvania’s noted business school. But his work likely has application across other businesses, particularly where knowledge, not just rote activity, lies at the roots of success. Looking at the gaming business, Mollick decided that middle managers accounted for some 22% of the variation in revenue among projects, compared to 7% explained by “innovators” (the propeller-heads) and 21% by the organization itself (strategy, leadership, and practices). While Mollick looked at a new, high-growth knowledge industry, gaming for personal computers, he found, “Even in a young industry that rewards creative and innovative products, innovative roles explain far less variation in firm performance than do managers.”

    Many analyses of organizational behavior argue a variation of the old saw, “The clothes make the man,” suggesting that the organizational structure and culture determines individual performance. Not so, argues Mollick. “This is not about a person being a good fit in just one specific organization,” he says. “Their skills are useful anywhere.” Good managers are not just cogs in the business machine. “There is something innate in them that makes them good at what they do.”

    Middle managers in every business environment have “a tough job,” says Mollick. They have to function at the daily firing line, balancing finite resources, attempting to control what is often inherently difficult to control, and trying to fit their activities into the overall goals of the firm. “It’s always easy to think about the worst managers you have had,” he says, “the ones you see in the Dilbert cartoons. But it’s important to recognize the vital role these middle managers play in making sure that information flows and that creativity happens.”

    —Kennedy Maize is MANAGING POWER’s executive editor.

  2. August 1, 2011

    PRISM: A Promising Near-Term Reactor Option
    By James M. Hylko

    In the early 1970s, the liquid metal reactor (LMR) program focused on the construction and deployment of the Clinch River Breeder Reactor (CRBR), a sodium-cooled, fast-neutron reactor near Oak Ridge, Tenn. At that time, the 1,000-MWth-rated CRBR (not to be confused with the plant power rating, in MWe) was viewed as a commercially viable power generation source in the U.S. and a stepping stone to larger, 3,000-MWth commercial plants, the scale thought necessary to be economically competitive with large light water reactors (LWRs).

    The CRBR program was a joint effort of the U.S. Atomic Energy Commission, its successor agencies—the U.S. Energy Research and Development Administration and the Department of Energy (DOE)—and the U.S. electric power industry. In addition to encountering legal challenges, cost escalation, and schedule delays over the next several years, the LMR program faced an unexpected challenge: Uranium was not becoming scarce and prohibitively expensive as had earlier been predicted. Work continued on the CRBR until the U.S. Congress terminated funding in 1983 (see sidebar).

    The Liquid-Metal Fast Neutron Reactor Has a Long History
    In the U.S., several fast neutron reactors have been designed, and five of them have achieved operation (Figure 2). There has also been one U.S. Navy application of a liquid-metal reactor.

    2. Evolution of the PRISM reactor. Source: GE Hitachi Nuclear Energy

    Experimental Breeder Reactor-I (EBR-I, 1949–1964). As part of the National Reactor Testing Station (now known as the Idaho National Laboratory), construction of EBR-I started in late 1949 in the desert, about 18 miles southeast of Arco, Idaho. In 1951, it became the world’s first electricity-generating nuclear power plant when it produced sufficient electricity to illuminate four 200-watt lightbulbs, and subsequently generated enough electricity to power its own building. However, the actual design purpose of EBR-I was not focused on electricity production but rather on validating the breeder reactor concept. In 1953, experiments revealed the reactor was producing additional fuel during fission, as designed. However, on Nov. 29, 1955, EBR-I suffered a partial meltdown during a coolant flow test. After being repaired, EBR-I continued to operate until it was deactivated in 1964 and replaced with the EBR-II. The EBR-I was declared a National Historic Landmark in 1965.

    EBR-II (1963–1994). The EBR-II was a 19-MWe (62.5-MWth) demonstration reactor, providing heat and power to the Idaho facility from 1963 through 1994. The EBR-II demonstrated a complete sodium-cooled breeder reactor power plant with on-site reprocessing of metallic fuel between 1964 and 1969. The emphasis then shifted to testing materials and fuels (metal and ceramic oxides, carbides and nitrides of uranium and plutonium) for larger fast reactors.
    The EBR-II became the basis of the U.S. Integral Fast Reactor (IFR) program. The plan was to develop a fully integrated system with electrometallurgical “pyroprocessing,” fuel fabrication, and a fast reactor in the same complex. The reactor could be operated as a breeder or a non-breeder. About $46 million of IFR funding was provided by a Japanese utility consortium. In April 1986, the IFR program demonstrated the inherent safety of the EBR-II design, independent of engineered controls. The program was terminated by the Clinton administration in 1994.

    USS Seawolf (1957–1987). USS Seawolf (SSN-575) was the only U.S. submarine built with a sodium-cooled nuclear reactor. The submarine was launched on July 21, 1955, and commissioned on March 30, 1957. Although the sodium-cooled reactor reduced the size of machinery spaces by nearly 40% and was more efficient than a water-cooled reactor, concerns posed by the liquid sodium coolant led to the selection of the pressurized water reactor (PWR) as the Navy’s standard. The USS Seawolf was converted to a PWR and returned to service in 1960. The submarine was decommissioned on Mar. 30, 1987, after 30 years of service.

    Fermi 1 (1957–1972). The 94-MWe prototype, located in Frenchtown Charter Township, Mich., was the first U.S. commercial fast breeder reactor. Construction started in 1957, but it operated for only three years before a coolant problem caused overheating and partial damage to the fuel on Oct. 5, 1966. Following an extended shutdown that involved fuel replacement, repairs to the vessel, and cleanup, Fermi 1 was restarted in 1970 and continued to operate intermittently until Sept. 22, 1972. Fermi 1’s license was not renewed, and it was officially decommissioned Dec. 31, 1975.

    The Southwest Experimental Fast Oxide Reactor (SEFOR) (1965–1972). The deactivated 20-MWth SEFOR reactor is located in Cove Creek Township, near Fayetteville, Ark. The reactor was built in 1965 and was used to confirm the inherent safety features of the oxide fuel/sodium-cooling configuration. The fuel and irradiated sodium coolant were removed and taken offsite in 1972, and the facility was placed in safe storage. The reactor was acquired by the University of Arkansas in 1975. SEFOR was designated a Nuclear Historic Landmark site in October 1986.
    Clinch River Breeder Reactor (CRBR) Project (1970–1983). Congress authorized the CRBR program in 1970 as a cooperative effort between industry and the U.S. government to build a demonstration-scale sodium-cooled reactor. The plant was to be located within Tennessee Valley Authority’s system at a site on the Clinch River just west of Oak Ridge, Tenn. The CRBR was designed to produce plutonium from the abundant uranium-238 isotope (hence the term “breeder”). The plutonium would be recovered from the reactor fuel and then used in water-cooled reactors as a constituent of a mixed oxide of uranium and plutonium.
    Licensing began in 1973, and the project environmental impact statement was submitted on Dec. 31, 1975. The plant design was reported as 90% complete, and $500 million of reactor components were on order by Sept. 30, 1981. The Reagan administration made the CRBR project a funding priority in fiscal year 1982 and said it “should be constructed in a timely and expeditious manner.” However, on Oct. 26, 1983, the U.S. Senate terminated funding for the project.

    Fast Flux Test Facility (FFTF, 1978–1993, 1997–2001). Construction of the 400-MWth FFTF in Hanford, Wash., was completed in 1978; initial operation followed in 1980. From April 1982 to April 1992, it operated as a national research facility to test various aspects of commercial reactor design and operation, especially relating to breeder reactors. The FFTF is not a breeder reactor itself, but rather a sodium-cooled fast neutron reactor. By December 1993, the number of uses for the reactor was diminishing, so the decision was made to deactivate it. In January 1997, the DOE ordered that the reactor be maintained in a standby condition, pending a decision to incorporate it into the U.S. Government’s tritium production program for both medical and fusion research. In December 2001, deactivation continued after it was determined that the FFTF was not needed for tritium production. In May 2005, the core support basket was drilled to drain the remaining sodium coolant, effectively making the reactor unusable. In April 2006, the FFTF was honored by the American Nuclear Society as a National Nuclear Historic Landmark.

    Recent Advanced Reactor and Nuclear Fuel Cycle Programs (1999–Present). The U.S. slowly restarted its advanced reactor and nuclear fuel cycle programs via the Generation IV Nuclear Energy Systems program, the Generation IV International Forum, Advanced Fuel Cycle Initiative, and the Global Nuclear Energy Partnership (GNEP). (See “Developing the Next Generation of Reactors,” April 2008 in the POWER archives at http://www.powermag.com.) In April 2009, the DOE announced the cancellation of the U.S. domestic component of GNEP, and in June 2009 it announced that it is no longer pursuing domestic commercial reprocessing of nuclear fuel. The program is now referred to as the International Framework for Nuclear Energy Cooperation.

    http://www.powermag.com/issues/features/PRISM-A-Promising-Near-Term-Reactor-Option_3887.html?hq_e=el&hq_m=2264757&hq_l=18&hq_v=81403a90a1

  3. EPA uses phony statistics to justify costly air-quality rules
    Posted on August 19, 2011
    MILLOY & DUNN: Air-pollution scare debunked
    Steve Milloy and Dr. John Dale Dunn

    What if today’s levels of air pollution didn’t kill anybody? That certainly would be bad news for the U.S. Environmental Protection Agency (EPA), which has spent the past 15 years stubbornly defending its extraordinarily expensive and ever-tightening air-quality regulations.

    The EPA claims airborne fine particulate matter kills tens of thousands annually and that the prevention of those deaths will provide society $2 trillion annually in monetized health benefits by 2020.

    But we can debunk those claims with more than mere criticisms of EPA’s statistical malpractice and secret data. We have actual data that simply discredit the EPA’s claims. (Washington Times)

  4. “beginning today the average human being must not emit more than 2 tons of CO2 per year in order to avoid dangerous global climate change”
    A Planetary Crisis Is A Terrible Thing to Waste by Christian Schwäg…
    One example is per capita CO2 emissions. German Chancellor Angela Merkel, a conservative politician, has just publicly endorsed calculations by the Potsdam Institute for Climate Impact Research that show how much we overdraw our global CO2 budget. According to the institute, beginning today the average human being must not emit more than 2 tons of CO2 per year in order to avoid dangerous global climate change. But the average Chinese citizen is emitting 4 to 5 tons per year, the average German 11 tons, and the average American more than 20 tons. That means we’re overdrawing our CO2 budget by a factor of 2 to 10.
    Yale Environment 360: Christian Schwägerl
    Christian Schwägerl is an environmental journalist who has reported on science and public policy for two decades and is the author of the book The Age of Men, published in German under the title Menschenzeit by Riemann/Random House.
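    The overdraw factors quoted above are simple ratios of per-capita emissions to the 2-ton budget. A minimal sketch of that arithmetic (the 4.5-ton value for China is an assumed midpoint of the quoted “4 to 5 tons”; the other figures are taken directly from the passage):

```python
# Per-capita CO2 overdraw factors, per the Potsdam Institute figures
# quoted in the article above.
BUDGET_TONS = 2.0  # sustainable per-capita CO2 budget, tons/year

# Annual per-capita emissions, tons/year (China is an assumed midpoint)
emissions = {
    "China": 4.5,
    "Germany": 11.0,
    "USA": 20.0,  # article says "more than 20 tons"
}

# Overdraw factor = actual emissions / sustainable budget
overdraw = {country: tons / BUDGET_TONS for country, tons in emissions.items()}

for country, factor in sorted(overdraw.items(), key=lambda kv: kv[1]):
    print(f"{country}: {factor:.2f}x the budget")
```

    The computed factors (about 2.25, 5.5, and 10) line up with the article’s claim that the budget is overdrawn “by a factor of 2 to 10.”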

  5. August 18, 2011
    PBS and Global Warming Skeptics’ Lockout
    By Russell Cook
    I’ve repeatedly asked politicians, policymakers, and mainstream media journalists to explain to me why we need greenhouse gas regulation when skeptic scientists’ climate assessments indicate that it is pointless to try to stop a natural phenomenon. You’d think at least the journalists would directly answer my questions about their articles’ claims of a scientific consensus, acid oceans, and who corroborated a singular accusation that fossil fuel industries conspire with skeptic scientists to fool the public. Each time, I got evasive replies instead.
    Rush Limbaugh will probably howl at me for expecting the MSM to accurately report both sides of the issue, but my incessant inquiries reveal that one news outlet, PBS’ NewsHour, appears unable to clearly state why skeptic scientists’ viewpoints aren’t worth considering.
    Its national affairs editor, Murrey Jacobson, has now sidestepped my questions three times in a row about why his program excluded skeptic scientists since 1996, first via a private 12/7/09 email forwarded to me after numerous inquiries to the PBS ombudsman. Having received Jacobson’s permission to quote it publicly only days ago, I placed it word-for-word here, for all to see. Back when I got it in 2009, I suggested to the ombudsman that it should be public, which resulted in Jacobson’s different but equally evasive public response that I linked to in the first paragraph of my 12/19/09 piece, “The Lack of Climate Skeptics on PBS’s ‘NewsHour.'” He sidestepped my questions for the third time in his 6/20/11 email, seen verbatim here.
    Jacobson’s defense essentially boils down to a “belief” that skeptics are far outnumbered, and an insistence that the NewsHour’s coverage “has reflected the trajectory of the data while offering differing perspectives on these issues.” I’ll point out that those are perspectives on solving the human-created problem.
    After seeing Robert (aka “Robin”) MacNeil repeat Jim Lehrer’s “personal guidelines he works by as a journalist,” I stopped waiting for Jacobson and tried to get answers from the top man himself, via snail-mail directly to Lehrer. If Jacobson’s responses are troublesome, Lehrer’s is a jaw-dropper — scroll down the page if you can’t wait to see it. But first, my literal word-for-word letter is here (I had to spell out my web links), now as an open letter:
    May 18th, 2011
    Dear Mr Lehrer,
    I wanted to respond to what Robin MacNeil said about your ‘stealth exit from the NewsHour’ May 13th. As a NewsHour viewer since sometime in the late ’70s, you might find it amusing that as recently as a few years ago, I was still occasionally calling your show the “MacNeil/Lehrer NewsHour” despite Robin’s long-ago departure. No offense to the commercial news broadcasts, but when John Chancellor retired from NBC, the NewsHour became my sole source of properly done news and analysis of current political events. I particularly appreciated the two-side analysis approach on Middle East affairs, US / Soviet relations, and US political developments, as I reasoned the solutions to such problems lay somewhere in the middle, and could decide for myself just where that middle ground was.
    No doubt this is what you mean in your MacNeil/Lehrer journalism guideline about “assume there is at least one other side or version to every story.”
    Now for my journalism concern, which may be something you are not fully aware of: I firmly believe this guideline was never applied to the story of man-caused global warming at the NewsHour. Correct me if I am wrong, no skeptic scientists have ever appeared on the program in debate with IPCC scientists. I’ve done my own extensive online research at the NewsHour archive pages, going as far back as they allow, to 1996, and I also do not see any as guests offering their basic science viewpoints, while substantial amounts of time were given to multiple-repeat IPCC scientist guests like Michael Oppenheimer, Stephen Schneider, and Kevin Trenberth, along with others offering detailed explanations on conclusions about man-caused global warming.
    My worry is that you or your staff relied on reasons to exclude skeptic scientists from former Boston Globe reporter Ross Gelbspan, who was described by Al Gore as the Pulitzer-winning discoverer of ‘smoking gun’ evidence showing skeptic scientists received fossil fuel industry money in exchange for fabricated climate assessments that were only intended to confuse the public. My fear is that nobody at the NewsHour ever checked the veracity of Gelbspan’s claims or myriad other problems with his assertions:
    Gelbspan never won a Pulitzer; surely you’d agree his CYA response about it borders on preposterous — the Pulitzer group rewards exemplary reporting, not conceiving story ideas, editing, or guiding a reporting staff.
    Gelbspan did not discover the set of 1991 coal industry PR campaign memos he is so widely credited with doing, where one in particular contains a sentence which is the central bit of evidence in his accusation against skeptic scientists. In fact, he never discloses how those memos came into his hands.
    Neither Gelbspan nor multiple other reporters who rely on that central accusation sentence ever show the memo in its full context in any book, magazine article, web page, or media presentation. A reading of the actual memo reveals the sentence is out-of-context, and not actually any kind of top-level industry directive (I found the complete memo after seven months of searching for it, at an obscure Greenpeace page of archive scans).
    Gelbspan’s fossil fuel funding accusation is at best guilt-by-association; he never shows irrefutable proof that an exchange of industry money to skeptic scientists prompted false climate assessments.
    No one else has corroborated Gelbspan’s accusation, yet he is relied on as evidence in places ranging from Al Gore’s movie to two of the major global warming nuisance lawsuits.
    The long-repeated idea that the media gives too much balance to skeptic scientists is literally unsupportable. That is proven by the sheer lack of such scientists appearing at the NewsHour, or even significant amounts of time on the program devoted to skeptics’ viewpoints; moreover, the credibility of a 2004 study by Boykoff & Boykoff supposedly proving the existence of ‘too much balance’ is critically undermined by their own ties to Gelbspan.
    I could go on and on. I’ve done my own research, and have accumulated a computer notes file of web site pages and keyword phrases copied from those that is over 62,000 words. My concern about the lack of skeptic scientists at the NewsHour has been seen online at the PBS Ombudsman pages several times now. Yes, Murrey Jacobson responded to my question about why no skeptic scientists debated IPCC scientists in the December 17, 2009 Ombudsman page, but it is rather apparent he danced around the question instead of answering it directly.
    I did my own reporting on this at the American Thinker web site back in December 2009 in a blog piece titled “The Lack of Climate Skeptics on PBS’s ‘NewsHour’” http://www.americanthinker.com/blog/2009/12/the_lack_of_climate_skeptics_o.html In a July 2010 American Thinker article, I quantified the sheer imbalance of IPCC side vs skeptic side at the NewsHour and further showed how Margaret Warner appears to have relied on a Gelbspan book quote in her Dec 1997 interview of Western Fuels CEO Fred Palmer, see “The Left and Its Talking Points” http://www.americanthinker.com/2010/07/the_left_and_its_talking_point.html My latest A.T. article details how an ex-WCCO TV anchorman appears to be repeating Gelbspan’s 15-year-old talking point about ‘unfair media balance’, and how the lack of fact-checking in this particular situation showcases an ominous sign of things to come for the mainstream media, see “Warmist Mantra Wearing Out” http://www.americanthinker.com/2011/05/warmist_mantra_wearing_out.html (note: headlines at A.T. are written by its editors)
    Do you understand the enormity of this problem, not only for you specifically, but also for all of basic journalism? I’m no journalist, I don’t pretend to be one, and I even partially turned back praise by the UK Telegraph’s James Delingpole of my research being ‘investigative journalism’ in my A.T. article “Warmist Slander of Scientific Skeptics” http://www.americanthinker.com/2010/09/warmist_slander_of_scientific.html I am simply asking tough questions that journalists have surprisingly not asked.
    You’ve had a fabulous career, but the apparent huge contradiction to your own guidelines on this specific topic threatens to put a very black mark on it. I seriously doubt this is deliberate on your part, but is rather a simple oversight that’s been made more serious over the years through a self-feeding influx of information supporting only the original oversight.
    This is an appearance problem you must face, and either prove me wrong, or acknowledge the problem and address how you intend to fix it. My preference is for you to do so at the PBS Ombudsman page, or better yet, at the NewsHour itself. I have nothing to hide, and would be glad to share all that I’ve found, if you have questions about any part of it. And, to borrow a point made by one of the more prominent speakers about the skeptic side, Lord Christopher Monckton, “no need to trust what I say, you may look all of this up for yourself.”
    Sincerely,
    Russell Cook

    Scientists Discover Arctic Temperature Dataset Seriously Flawed – Significant Fabricated Warming Is The Result
    Read here. Over 2,000 previous peer-reviewed studies are severely tainted with bad data from the often used ERA-40 Reanalysis regarding Arctic region temperature trends. Researchers Screen and Simmonds concluded that this dataset should no longer be relied on in future studies, which implies that many past studies indicating Arctic warming are robustly in error.
    Essentially, group-think consensus science by “experts” at its worst: “Hey…why don’t we all use the same computer output for every Arctic study.” Brilliant.
    “This study explicitly documents a discontinuity in the 40-yr European Centre for Medium-Range Weather Forecasts (ECMWF) Re-Analysis (ERA-40) that leads to significantly exaggerated warming in the Arctic mid- to lower troposphere, and demonstrates that the continuing use of ERA-40 to study Arctic temperature trends is problematic…Decadal or multidecadal Arctic temperature trends calculated over periods that include 1997 are highly inaccurate…It is shown that ERA-40 is poorly suited to studying Arctic temperature trends and their vertical profile, and conclusions based upon them must be viewed with extreme caution. Consequently, its future use for this purpose is discouraged.”…..”Such an error not only affects the Arctic troposphere, but necessarily must effect the entire northern hemisphere jet stream.”[James A. Screen, Ian Simmonds 2011: Journal of Climate]

    The Truth About Greenhouse Gases
    Tuesday, 16 August 2011 14:43 William Happer

    “The object of the Author in the following pages has been to collect the most remarkable instances of those moral epidemics which have been excited, sometimes by one cause and sometimes by another, and to show how easily the masses have been led astray, and how imitative and gregarious men are, even in their infatuations and crimes,” wrote Charles Mackay in the preface to the first edition of his Extraordinary Popular Delusions and the Madness of Crowds. I want to discuss a contemporary moral epidemic: the notion that increasing atmospheric concentrations of greenhouse gases, notably carbon dioxide, will have disastrous consequences for mankind and for the planet. This contemporary “climate crusade” has much in common with the medieval crusades Mackay describes, with true believers, opportunists, cynics, money-hungry governments, manipulators of various types, and even children’s crusades.
    Read the full paper here

  8. The Original Enviro-Nazis
    By Mark Musser

    Mixing Green With Red Makes Brown (Shirts)

    In light of the recent environmental controversy over splattergate, where the green propaganda ad in the United Kingdom explosively became gory red in the classroom, it is time to be reminded that the Nazis started out green but became bloody red. This political reality came much to the dismay of many German conservationists as they slowly found out the real intent of Adolf Hitler.

    Naïve German greens had no idea that many of the Führer’s savage premeditations about continental hegemony were conjured up at his mountain retreat in the wild Bavarian Alps. The greens just assumed that he was enjoying the alpine serenity of the region away from the hubbub of Berlin. Neither did they apprehend that the sweeping Nazi environmental laws of 1933-35 were an implicit anti-Semitic precursor for the racially charged Nuremberg laws. Together, these laws specifically targeted Jewish internationalism, whether coming from the capitalist West or the communist East, as being unnatural and alien, i.e., not indigenous to Germany.

    One of the most embarrassing environmental facts of the 1930s was that between 60% and 70% of the German greens were Nazi Party members, compared to only 10% of the population at large. In fact, German greens outperformed even medical doctors and teachers, with Nazi foresters and veterinarians leading the charge. Somehow, the so-called independent German wandervogels (German word for “wandering free spirits”) found themselves at the footstool of Der Führer. Their wandervogel attitudes about civilization and the wild forestlands found a political niche in the isolationist biology of the Nazi Party. Furthermore, their strong beliefs in holism found a political voice in the totalitarian Social Darwinism of the Nazis, which was largely rooted in Ernst Haeckel’s ecology of the 1800s.

    In those days, racism was good, “scientific” biology. Racism (disguised as eugenics) was the rage of the late 1800s and early 1900s. It required the cataclysm of World War II to bring about an international repentance on the subject. In Germany, ethnicity and nature, racism and environmentalism, often went hand in hand. By the time of the Nazi period, this collaboration had hardened into a political-biological ecology known as “blood and soil.”

    Though the German greens predated the Nazis by well over a hundred years, with the advent of the Nazi Party, their previous Romanticism was transformed into something far more sinister because of its ties to a strong totalitarian state, where, for the first time, they became political players at the federal level. While environmental historians are quick to point out that the Four Year Plan’s great battle for production later compromised the new conservation laws, the fact of the matter is that the idea of a political totalitarian environmentalism was born in the Third Reich. Indeed, the 1935 Reich Nature Protection Act trumpeted the slogan, “it shall be the whole landscape.”

    Green historians may debate on how effective this totalitarian conservation law was, but nonetheless, here is seen the birth of environmental impact statements and the like. In Nazi Germany, private property could even be seized without compensation for the sake of environmental concerns. As such, what was unique about Nazi environmentalism was not in its setting aside of nature reserves à la Teddy Roosevelt, but in its willingness to regulate already developed properties for the sake of the environment. This was a line that Teddy Roosevelt and Gifford Pinchot did not cross. The green Nazis did.

    Another man who worked with Kurt Tucholsky in the late 1920s was pacifist-Marxist John Heartfield. He was a very gifted political artist who often used exaggerated metaphors to hyperbolize the contradictions of National Socialism. Not surprisingly, upon Nazi accession to power, he was forced to suffer through a harrowing escape from Germany in 1933. In the later thirties, he made up a montage of Hitler watering an oak tree which bore the acorn fruit of army helmets and shells painted with swastikas. Like Tucholsky before him, Heartfield astutely recognized the dark shade of green that characterized Hitler and the Nazis of the 1930s.

    The oak tree was considered the sacred tree of Germany. Hitler loved oak trees and had them planted all over the Reich as “concordant with the spirit of the Führer.” Oak leaves and acorns were even the symbols of the SS, Hitler’s green Praetorian Guard. Hitler gave oak saplings to all of the gold medal winners of the 1936 Olympics as nationalistic living symbols of the competitive Olympic spirit. Jesse Owens received four of them. American gold medal winner Glenn Morris also took one home. Some of these Olympic oaks are still alive.

    In July of 2009, a tremendous controversy erupted in Jaslo, Poland over one of Hitler’s oaks planted by the Wehrmacht in 1942 to celebrate his birthday. According to the U.K. Times, “The ritual was always the same: a brass band and a speech by the German-imposed mayor that would always invoke the metaphor of deep roots, of tiny acorns turning into great oaks, of 1,000-year Reichs. Attendance was compulsory.”

    The Jaslo oak tree is evidence of Nazi Germany’s green plans for the East. Landscape planners were even chomping at the bit to get started in Ukraine and Belarus behind the Wehrmacht’s advance into Russia. The oak tree was scheduled to be taken down due to a construction project, but an eyewitness to the event intervened and informed the people of the city that the oak tree should be saved since it was a silent witness to some of the greatest crimes committed in the 20th century. With all of the articles that were written about the Jaslo controversy, it is intriguing how uninterested so many of them were in delving into the Nazi environmental record behind the planting of that oak tree. It seems that postmodern Western man has become too infatuated with environmental fascism to take notice.

    While many continue to use the eco-fascism label as a figure of speech to help illustrate the growing totalitarian menace of the modern green movement, it is no metaphor. Eco-fascism is rooted in history like a German oak tree. This is nowhere more shockingly evident than at the Buchenwald Concentration Camp. Buchenwald means “Beech Forest.” The SS enjoyed a zoo just outside the camp, and in the midst of the camp are the remains of Goethe’s oak. The famous scholar Johann Wolfgang von Goethe (1749-1832) blazed the trail for the German Romantic Movement. Goethe spent much time around the environs of that particular oak tree, the destination of his favorite forest retreat. Today, Romanticism is known as Environmentalism. When the Nazis cleared the ground for the construction of the camp, they carefully preserved Goethe’s oak tree. Late in the war, the tree died during an Allied bombing raid. However, the stump still remains thanks to the special care that the Nazis gave to it at its funeral.

    Mark Musser is the author of “Nazi Oaks: The Green Sacrificial Offering of the Judeo-Christian Worldview in the Holocaust” and a commentary on the warning passages in the book of Hebrews called “Wrath or Rest: Saints in the Hands of an Angry God.”

    * This was first pointed out by pacifist-leftist Kurt Tucholsky (1890-1935).

  9. Nathan,

    Thank you. As you can see, I am trying to build a library of articles and papers that show how silly the false hypothesis of AGW is when real science is used instead of money-grant chasing. I have debated NASA people about the lack of real peer review, but they get really upset when you question their views, or rather their estimates.

    It has even gone so far as a friend of mine, a professor of microbiology and ocean studies with Scripps and the University of California at San Diego. He flies around the world making presentations to support the global warming estimates and to raise research money from anyone, even the Saudis. He admitted that the real data about ocean temperatures and currents they have collected would take decades with supercomputers to even evaluate. It is all about the money: pay to play. If you find evidence my way, you get money; if not, no more money.

  10. Climate model output is now “data”
    Posted on 08/16/2011 by jblethen
    Both [Seth] Wenger [a fisheries researcher with Trout Unlimited in Boise] and [Dan] Isaak, a fisheries biologist at the U.S. Forest Service’s Rocky Mountain Research Station in Boise, were a part of a team of 11 scientists who said trout habitat could drop by 50 percent over the next 70 years because of a warming world. The paper, published Monday in the peer-reviewed science journal Proceedings of the National Academy of Sciences, predicts native cutthroat habitat could decline by 58 percent.

    The two men, who have devoted their lives to scientific research, say they depend on the scientific method and peer review to judge the quality of the research that underscores their findings. The climate predictions are based on 10 of the 20 climate models developed independently worldwide that all show the world is getting warmer.

    “The climate models have been right for 30 years and they are getting better all the time,” Isaak said. …

    The most dire climate models show temperatures in Idaho rising an average of 9 degrees in 70 years, Wenger said. …

    “I have to set aside my feelings and use the best data,” he said. …

    But what if all the climate models are wrong?

    “There just is not a lot of data supporting the alternative view,” Wenger said. (“Idaho trout face climate trouble, study finds”)

  11. “it is clear there is a serious problem with the models”
    Posted on August 16, 2011

    http://www.theaustralian.com.au/

    Sea-level rises are slowing, tidal gauge records show
    By Stuart Rintoul, The Australian, July 22, 2011

    Author of the NSW government’s sea levels report, Phil Watson, at Terrigal beach on the NSW central coast yesterday. (Picture: Dan Himbrechts, The Australian)
    ONE of Australia’s foremost experts on the relationship between climate change and sea levels has written a peer-reviewed paper concluding that rises in sea levels are “decelerating”.

    The analysis, by NSW principal coastal specialist Phil Watson, calls into question one of the key criteria for large-scale inundation around the Australian coast by 2100 — the assumption of an accelerating rise in sea levels because of climate change.

    Based on century-long tide gauge records at Fremantle, Western Australia (from 1897 to present), Auckland Harbour in New Zealand (1903 to present), Fort Denison in Sydney Harbour (1914 to present) and Pilot Station at Newcastle (1925 to present), the analysis finds there was a “consistent trend of weak deceleration” from 1940 to 2000.

    Mr Watson’s findings, published in the Journal of Coastal Research this year and now attracting broader attention, support a similar analysis of long-term tide gauges in the US earlier this year. Both raise questions about the CSIRO’s sea-level predictions.


    Climate change researcher Howard Brady, at Macquarie University, said yesterday the recent research meant the sea-level rises accepted by the CSIRO were “already dead in the water as having no sound basis in probability”.

    “In all cases, it is clear that sea-level rise, although occurring, has been decelerating for at least the last half of the 20th century, and so the present trend would only produce sea level rise of around 15cm for the 21st century.”

    Dr Brady said the divergence between the sea-level trends from models and sea-level trends from the tide gauge records was now so great “it is clear there is a serious problem with the models”.

    “In a nutshell, this factual information means the high sea-level rises used as precautionary guidelines by the CSIRO in recent years are in essence ridiculous,” he said. During the 20th century, there was a measurable global average rise in mean sea level of about 17cm (plus or minus 5cm).

    But scientific projections, led by the Intergovernmental Panel on Climate Change, have suggested climate change will deliver a much greater global tide rise in mean sea level this century of 80-100cm.

    The federal government has published a series of inundation maps based on the panel’s predictions showing that large areas of Australia’s capital cities, southeast Queensland and the NSW central coast will be under water by 2100.

    Without acceleration in sea-level rises, the 20th-century trend of 1.7mm a year would produce a rise of about 0.15m by 2100.
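The 0.15m figure is simple arithmetic: a linear extrapolation of the historical rate with no acceleration. A minimal sketch of that calculation (the 2011 start year is an assumption based on the article's publication date):

```python
# Linear extrapolation of sea-level rise at the constant 20th-century
# rate of 1.7 mm/yr (i.e., assuming no acceleration at all).
RATE_MM_PER_YEAR = 1.7

def projected_rise_m(start_year: int, end_year: int) -> float:
    """Total rise in metres between two years at the constant rate."""
    return RATE_MM_PER_YEAR * (end_year - start_year) / 1000.0

# From roughly the article's publication (2011) out to 2100:
print(round(projected_rise_m(2011, 2100), 2))  # → 0.15
```

Run over a full century (2000 to 2100) the same rate gives about 0.17m, so the article's "about 0.15m by 2100" is consistent with counting from the present rather than from 2000.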

    Mr Watson’s analysis of the four longest continuous Australian and New Zealand records is consistent with the findings of US researchers Robert Dean and James Houston, who analysed monthly averaged records for 57 tide gauges, covering periods of 60 to 156 years.

    The US research concluded there was “no evidence to support positive acceleration over the 20th century as suggested by the IPCC, global climate change models and some researchers”.

    Mr Watson cautioned in his research and again yesterday that studies of a small number of northern hemisphere records spanning two or three centuries had found a small acceleration in sea-level rises. He said it was possible the rises could be subject to “climate-induced impacts projected to occur over this century”.

    Mr Watson’s research finds that in the 1990s, when sea levels were attracting international attention, although the decadal rates of ocean rise were high, “they are not remarkable or unusual in the context of the historical record at each site over the 20th century”.

    “What we are seeing in all of the records is there are relatively high rates of sea-level rise evident post-1990, but those sorts of rates of rise have been witnessed at other times in the historical record,” he said.

    “What remains unknown is whether or not these rates are going to persist into the future and indeed increase.”

    He said further research was required, “to rationalise the difference between the acceleration trend evident in the global sea level time-series reconstructions (models) and the relatively consistent deceleration trend evident in the long-term Australasian tide gauge records”.

    With an estimated 710,000 Australian homes within 3km of the coast and below 6m elevation, accurate sea-level predictions are vital for planning in coastal areas anticipating predicted rises of almost a metre by 2100.

  12. August 20, 2011
    USEPA: Hell-Bent on Over-Control
    By Harvey M. Sheldon

    In the United States, we are repeatedly told that good science requires this or that regulation because this or that bad thing will happen otherwise. We live in a world that depends on honest answers to good questions in order to keep itself from making serious mistakes. Unfortunately, government is not an institution designed to ask all the right questions, and it often fails to give good or honest answers to the questions it does ask. This is occurring more and more often in the environmental arena, where political and ideological goals are driving the misuse of science by the United States Environmental Protection Agency to churn out highly questionable new regulations. For example, in the field of air pollution control, companies big and small are faced with some new requirements that will do little for the environment while shrinking the chances of business or job growth.
    The USEPA has been preparing to issue new national ambient air quality standards (NAAQS) for ozone. These may be the most expensive environmental rules ever issued. Yet they are not needed, and they are ill-advised from a policy and program viewpoint.
    According to the Clean Air Act, the EPA must periodically review and adjust ambient air quality standards in order to provide protection of public health and the environment, with an adequate safety margin. Over the decades since its original passage in 1970, the Act has brought about beneficial reductions in ozone and other pollutant concentrations for which NAAQS exist. The EPA administrator must adjust the standards in keeping with the best science on the subject, and the reviews are to take place every five years. They are done in a staggered manner so that states can have staff attend to the need for new regulation in an orderly way and business can count on getting a certain amount of mileage from their pollution control investments.
    The EPA’s willingness to take a very aggressive position in favor of public protection is admirable in a sense, but regulation based on bad science is destructive. The Agency has been alarmist in air quality programs, to the point where over-control is rather clearly occurring. The method for this over-control is to claim thousands of instances of death or disease prevented, yet there is little reliable solid scientific evidence of such a death toll actually being caused by air pollution. The Agency cites reviews of disease and mortality that it correlates with a given pollutant instead of findings of actual causes of death or disease. As Steve Milloy aptly says, show us the bodies.
    The current administrator has departed from the bipartisan prior practice and instead revisited a NAAQS on ozone in less time than the law provides. This is especially questionable since ozone air quality has greatly improved over the years, and the asserted bad health effects are dubious. Prior administrators have generally relied upon the advice and findings of a Clean Air Scientific Advisory Committee (CASAC), which is an expert panel set up by law to provide unbiased scientific opinion on air standards and the health effects, if any, of air pollutants.
    Administrator Lisa Jackson intends to go ahead and ratchet down the ozone standard fully two years before the next CASAC full review is due. The stated need for a more stringent standard is that people’s health is adversely affected, yet look at the facts on asthma, an ancient disease: ozone levels have declined 30% in the nation at the same time that asthma cases have increased threefold. Dr. Roger O. McClellan, a former CASAC member, has stated that “there is no compelling reason based on the EPA CASACs advice that the Ozone NAAQS be set in the range of 60 to 70 ppb.” He also says that Administrator Jackson’s decision to “reconsider” that [2008] decision “is without precedent and, if she proceeds, will set a terrible precedent for any future EPA Administrator to reconsider every rule of the previous administration. What a way to create havoc and send the economy in to a tail spin.” Senator James Inhofe of Oklahoma, ranking Republican on the Senate Environment and Public Works Committee, has accused the EPA of deliberately using flawed science, biased advisors, and a lack of serious objectivity in its critical reviews of data and evidence respecting the need for a tougher standard.
    Other examples of EPA overcontrol with political motives include:
    On Global Warming: Carbon dioxide (CO2) is a naturally created gas essential to life on the planet that occurs only in trace amounts atmospherically. The EPA insists that CO2 causes and threatens serious warming of the climate. Recent research shows that the alarm is based on a number of doubtful ideas, including flawed and skewed interpretations of twentieth-century temperature data, overestimation of CO2’s “heat-trapping,” and incompetent computer models. The EPA continues in full alarm mode on CO2, without honestly subjecting it to the full hearing and scientific process required by the law. Control of CO2 is essentially the power to control almost all means of industrial production and heating.
    On Mercury: The EPA has proposed and will soon adopt a standard as to mercury emissions from electricity generation (e.g., coal) plants. The air quality measures for mercury put out by the EPA are two or three times more stringent than those of the FDA and some world health organizations. The EPA even used data other professionals have rejected as not probative. Moreover, serious scientists say that our power plants account for less than 0.5% of all the mercury in the air we breathe, and most mercury is naturally occurring. The EPA nevertheless demands that utility companies spend billions to retrofit coal-fired power plants that produce half of all U.S. electricity.
    On Fine Particulates: The EPA is fostering concern over very fine particulates associated with diesel fuel and other human activities. The Agency generally claims human health danger based on reviews of inexact public records, even though such studies are inherently incapable of proving causation. Associations between fine particles and mortality or heart disease have been said to exist in some and shown to be nonexistent in other studies. Health impacts are even seen to disappear when confounding conditions are taken into account. Scientists like James Enstrom in California have lost their jobs after publishing honest studies showing that no health effects occur, and many trucking companies in California now face ruin from the cost of these rules based on statistical guesswork.
    In sum, the EPA is assuming more control of the economy of the United States for reasons driven more by ideology and politics than demonstrated scientific or public health need. The Agency’s approach to regulation needs serious reform, and the ability of courts to provide serious scrutiny of agency action also must be improved.
    Harvey M. Sheldon, a former USEPA Regional Counsel and graduate of Harvard Law, has extensive environmental law practice experience. This article expresses his personal opinions and is not on any client’s or firm’s behalf.

    Read more: http://www.americanthinker.com/2011/08/usepa_hell-bent_on_over-control.html

  13. Democratic Senator: Environmental Protection Agency Out of Control
    By Susan Jones
    June 10, 2011

    (CNSNews.com) – Democratic Sen. Joe Manchin, the former governor of coal-producing West Virginia, is blasting the Obama administration for using the Environmental Protection Agency to regulate coal-fueled power plants out of business.

    On Thursday, American Electric Power company announced that to comply with a series of EPA regulations, it will close five coal-fired plants — three in West Virginia and one each in Ohio and Virginia — at a net cost of 600 jobs.

    American Electric Power is one of the largest electric utilities in the United States, delivering electricity to more than 5 million customers in 11 states.

    “We have worked for months to develop a compliance plan that will mitigate the impact of these rules for our customers and preserve jobs, but because of the unrealistic compliance timelines in the EPA proposals, we will have to prematurely shut down nearly 25 percent of our current coal-fueled generating capacity, cut hundreds of good power plant jobs, and invest billions of dollars in capital to retire, retrofit and replace coal-fueled power plants,” said AEP Chairman and CEO Michael G. Morris.

    “The sudden increase in electricity rates and impacts on state economies will be significant at a time when people and states are still struggling,” he added.

    The plant closures in West Virginia alone will result in 242 lost jobs — “and that’s simply wrong,” Manchin said:

    “Let me be clear, it’s decisions like the one made by AEP today that demonstrate the urgent need to rein in government agencies like the EPA, preventing them from overstepping their bounds and imposing regulations that not only cost us good American jobs, but hurt our economy.

    “It is because of out-of-control agencies like the EPA as well as the need to protect American jobs that I sponsored the REINS Act — a commonsense measure that will help protect and create jobs by reining in needless or burdensome regulations, and that will put responsibility back where it belongs – in the hands of the people who are elected to govern and lead this great nation,” Manchin concluded.

    During his campaign for president, Barack Obama admitted that “if somebody wants to build a coal fired plant, they can. It’s just that it will bankrupt them because they’re going to be charged a huge sum for all that greenhouse gas that’s being emitted.”

    The cost of AEP’s compliance plan could range from $6-$8 billion in capital investment through the end of the decade, the company said. That’s in addition to the $7.2 billion that AEP has invested since 1990 to reduce emissions from its coal-fired plants.

    The company noted that annual emissions of nitrogen oxides from AEP plants are 80 percent lower today than they were in 1990, and sulfur dioxide emissions are 73 percent lower than they were in 1990.

    “We support regulations that achieve long-term environmental benefits while protecting customers, the economy and the reliability of the electric grid, but the cumulative impacts of the EPA’s current regulatory path have been vastly underestimated, particularly in Midwest states dependent on coal to fuel their economies,” AEP said.

    The company said while some jobs will be created from the installation of emissions-reduction equipment, AEP expects a net loss of around 600 power plant jobs with annual wages totaling approximately $40 million as a result of complying with the proposed EPA rules.

    “We will continue to work through the EPA process with the hope that the agency will recognize the cumulative impact of the proposed rules and develop a more reasonable compliance schedule. We also will continue talking with lawmakers in Washington about a legislative approach that would achieve the same long-term environmental goals with less negative impact on jobs and the U.S. economy,” Morris said.

    “With more time and flexibility, we will get to the same level of emission reductions, but it will cost our customers less and will prevent premature job losses, extend the construction job benefits, and ensure the ongoing reliability of the electric system.”

    AEP said the following plants will be closed by the end of 2014:

    — Glen Lyn Plant, Glen Lyn, Va.
    — Kammer Plant, Moundsville, W.Va.
    — Kanawha River Plant, Glasgow, W.Va.
    — Phillip Sporn Plant, New Haven, W.Va.
    — Picway Plant, Lockbourne, Ohio

    In addition, AEP plans to scale back power generation at six plants.

    One of the links covers the micro-dust issue, which is just one part of the EPA’s coal problems; the other covers mercury. The limits are completely unjustified and have no bearing on the health of humans or the environment. The purpose of this thread is to provide resources that refute these false statements and silly regulations.

  15. 2306
    Comments
    Getting ready for a wave of coal-plant shutdowns
    Posted by Brad Plumer at 12:19 PM ET, 08/19/2011
    Text Size PrintE-mailReprints
    Share:
    More >

    Over the next 18 months, the Environmental Protection Agency will finalize a flurry of new rules to curb pollution from coal-fired power plants. Mercury, smog, ozone, greenhouse gases, water intake, coal ash—it’s all getting regulated. And, not surprisingly, some lawmakers are grumbling.
    Industry groups such as the Edison Electric Institute, which represents investor-owned utilities, and the American Legislative Exchange Council have dubbed the coming rules “EPA’s Regulatory Train Wreck.” The regulations, they say, will cost utilities up to $129 billion and force them to retire one-fifth of coal capacity. Given that coal provides 45 percent of the country’s power, that means higher electric bills, more blackouts and fewer jobs. The doomsday scenario has alarmed Republicans in the House, who have been scrambling to block the measures. Environmental groups retort that the rules will bring sizeable public health benefits, and that industry groups have been exaggerating the costs of environmental regulations since they were first created.
    So, who’s right? This month, the nonpartisan Congressional Research Service, which conducts policy research for members of Congress, has been circulating a paper that tries to calmly sort through the shouting match. Thanks to The Hill’s Andrew Restuccia, it’s now available (PDF) for all to read. And the upshot is that CRS is awfully skeptical of the “train wreck” predictions.
    First, the report agrees that the new rules will likely force the closure of many coal plants between now and 2017, although it’s difficult to know precisely how many. For green groups, that’s a feature, not a bug: Many of these will be the oldest and dirtiest plants around. About 110 gigawatts, or one-third of all coal capacity in the United States, came online between 1940 and 1969. Many of these plants were grandfathered in under the Clean Air Act, and about two-thirds of them don’t have scrubbers:

    (FGD = Flue Gas Desulfurization, SCR = Selective Catalytic Reduction)
    CRS notes that many of the plants most affected by the new EPA rules were facing extinction anyway: “Many of these plants are inefficient and are being replaced by more efficient combined cycle natural gas plants, a development likely to be encouraged if the price of competing fuel—natural gas—continues to be low, almost regardless of EPA rules.”
    Still, that’s a lot of plants. Won’t this wreak havoc on the grid? Not necessarily, the CRS report says, although the transition won’t be simple. For one, most of these plants don’t provide as much baseload power as it might appear at first glance—pre-1970 coal plants operating without emissions controls are in use, on average, only about 41 percent of the time. Second, the report notes that “there is a substantial amount of excess generation capacity at present,” caused by the recession and the boom in natural gas plants. Many of those plants can pitch in to satisfy peak demand. Third, electric utilities can add capacity fairly quickly if needed — from 2000 to 2003, utilities added more than 200 gigawatts of new capacity, far more than the amount that will be lost between now and 2017.
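The capacity figures in that paragraph can be checked with back-of-the-envelope arithmetic. This sketch is illustrative only, using the numbers quoted above:

```python
# Back-of-the-envelope check on the CRS figures quoted above
# (illustrative only; all numbers come from the article).
old_coal_nameplate_gw = 110      # pre-1970 coal capacity, per CRS
capacity_factor = 0.41           # average utilization of those old plants
new_capacity_2000_2003_gw = 200  # capacity utilities added in 2000-2003

# Average power actually delivered by the old fleet:
avg_output_gw = old_coal_nameplate_gw * capacity_factor
print(round(avg_output_gw, 1))   # → 45.1

# The 2000-2003 build-out alone exceeds the old fleet's entire nameplate capacity:
print(new_capacity_2000_2003_gw > old_coal_nameplate_gw)  # → True
```

In other words, the old fleet delivers roughly 45 GW on average, which is why CRS argues the grid can absorb the retirements: utilities have previously added several times that much capacity in just a few years.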
    Granted, those upgrades and changes won’t be free. The CRS report doesn’t try to independently evaluate the costs of the new rules, noting that they will depend on site-specific factors and will vary by utility and state. (Matthew Wald recently wrote a helpful piece in The New York Times looking at how utilities might cope.) But, the report says, industry group estimates are almost certainly overstated. For one, they were analyzing early EPA draft proposals, and in many cases, the agency has tweaked its rules to allay industry concerns. And many of the EPA’s rules are almost certain to get bogged down in court or delayed for years, which means that utilities will have more time to adapt than they fear.
    The CRS report also agrees with green groups that the benefits of these new rules shouldn’t be downplayed. Those can be tricky to quantify, however. In one example, the EPA estimates that an air-transport rule to clamp down on smog-causing sulfur dioxide and nitrogen dioxide would help prevent 21,000 cases of bronchitis and 23,000 heart attacks, and save 36,000 lives. That’s, at the high end, $290 billion in health benefits, compared with $2.8 billion per year in costs (according to the EPA) by 2014. “In most cases,” CRS concludes, “the benefits are larger.”
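Taking the EPA's own figures at face value, the implied benefit-to-cost ratio is easy to compute (a sketch using the high-end benefit estimate quoted above):

```python
# Benefit-to-cost comparison for the air-transport rule as cited above
# (EPA's own figures; the high-end estimate is used for benefits).
annual_benefits_usd = 290e9  # up to $290 billion/yr in health benefits
annual_costs_usd = 2.8e9     # ~$2.8 billion/yr in compliance costs by 2014

ratio = annual_benefits_usd / annual_costs_usd
print(round(ratio))  # → 104
```

On those numbers the claimed benefits exceed costs by roughly a hundredfold, which is the arithmetic behind the CRS conclusion that "in most cases, the benefits are larger"; critics, of course, dispute the benefit estimates themselves.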
    Granted, few would expect this report to change many minds in Congress. Just 10 days ago, Michele Bachmann was on the campaign trail promising that if she becomes president, “I guarantee you the EPA will have doors locked and lights turned off, and they will only be about conservation.” That doesn’t sound like someone who’s waiting for a little more data before assessing the impact of the new regulations.

  16. EPA Says That Freezing People In The Dark Is Good For Their Health
    Posted on August 21, 2011
    The head of the Texas Public Utility Commission expressed concern Friday that a new federal air quality rule, set to take effect Jan. 1, will cause disruptions in electric service.

    If implementation of the Cross-State Air Pollution Rule is not delayed, “I have no doubt in my mind that this rule will result in reliability issues and rolling outages in Texas,” Donna Nelson said at the start of the commission’s meeting.

    The rule, issued in early July by the Environmental Protection Agency, would require substantial reductions in emissions of nitrogen oxides and sulfur dioxide at power plants in 27 states.

    The EPA says the rule will save and prolong lives by reducing harmful smog and soot pollution. Gina McCarthy, an EPA assistant administrator, said …

    http://www.star-telegram.com/

  17. The Bureaucrazies Part 3: the Dihydrogen Oxide Effect and the EPA’s Onslaught on Affordable Power

    Dihydrogen oxide, also known as hydric acid, is a dangerous chemical whose use goes widely unregulated in industry, homes, and population centers as a whole. The substance can also be found outside of areas with human activity. High concentrations can be found in the soil in a variety of environments including forests and farms. Even greater concentrations are found in lakes, rivers, and oceans across the globe. Extreme levels of hydric acid are also trapped in the polar ice caps. High levels of dihydrogen oxide have also been linked to heavy thunderstorm activity among other natural disasters. Excessive hydric acid presence in the soil is known to increase the risk of sink holes and mud slides that can devastate entire communities. Hydric acid exposure can also cripple levees, dams, and other flood preventative structures, creating widespread destruction. The corrosive nature of hydric acid poses a severe risk to any metal based infrastructure, slowly destroying bridges, power line towers, communication arrays, sewage lines, and many other necessary structures. Dihydrogen oxide is so widespread that its presence in the atmosphere has created a global hydric acid rain/snow pandemic, with occurrences reported on every continent.

    The environmental dangers of hydric acid levels alone are enough to warrant action in this regulatory climate; however, the danger doesn’t stop there. Human exposure to dihydrogen oxide can be and is often fatal. The material safety data sheet (MSDS) on dihydrogen oxide published by the HSE Group lists the following dangers to the human body if exposed:

    Inhalation:
    Acute overexposure: Inhalation can result in asphyxiation and is often fatal.

    Skin Contact:
    Acute overexposure: Prolonged but constant contact with liquid may cause a mild dermatitis.
    Chronic overexposure: Mild to severe dermatitis.

    Ingestion:
    Acute overexposure: Excessive ingestion of liquid form can cause gastric distress and mild diarrhea.

    Specific personal protective equipment:
    Eyes: Goggles or full face splash shield when dealing with hot liquid.
    Hands: Use insulating gloves when extensive exposure to solid state or high temperature liquid state is contemplated.
    Other clothing and equipment: Use heat protective garment when exposed to large quantities of heated vapor.

    Precautionary statements:
    Compound is known as “the universal solvent” and does dissolve, at least to some extent, most common materials.
    Compound will conduct electricity when dissolved ionic solutes are present.

    Dihydrogen oxide is everywhere, and it’s killing people through overexposure and through the adverse weather and other environmental conditions it creates. The EPA has created and implemented regulations that ban, or label as hazardous, far less lethal substances. So we must demand that the EPA take action and regulate hydric acid, right? After all, the spread of dihydrogen oxide is so great that every single human being carries close to a 70% contamination level. So where is the action? The dangers are proven. Why do we allow hydric acid to kill so many people and destroy so much? Simple:

    Dihydrogen oxide’s chemical formula is H2O. Hydric acid is water.

    So clearly it would be silly for the EPA to take action against water. Sure, it can kill you, but you can’t live without it. If we use and recycle water safely, we can prevent most of the dangers it poses. Sure, we can’t stop the thunderstorms, and floods can be a bear to prevent, but just about everyone knows how to avoid drowning, and that sticking your hand in boiling water is a bad idea. So if we can safely use a chemical or substance despite its inherent dangers, it would be silly to impose government regulations on it, right?

    Well, not so fast – the EPA may beg to differ. The Environmental Protection Agency is on track to unleash a train wreck of new regulations upon the country: regulations that are the logical equivalent of banning water because you can drown in it.

    The EPA is currently developing and finalizing a collection of nearly 30 major regulations and over 170 major policy rules set to take effect between now and 2016. Using backdoor tactics that circumvent the authority of Congress, the EPA is implementing unilateral regulations on transportation and energy. The EPA is self-servingly interpreting powers granted under acts such as the Clean Water Act, Clean Air Act, and the Resource Conservation and Recovery Act to impose regulations that are staggering in both audacity and cost. Here are some of the more notable regulations.

    Burning anything solid naturally creates an ash byproduct, and coal-fired power plants are no exception. Much of the ash produced, while obviously not something you want to fill your lungs with (much like hydric acid), is usefully recycled: a plurality of the drywall used in homes and offices is made from recycled coal ash, and recycled coal ash is a $2-billion-a-year trade with other valuable uses in construction products and methods. Yet the EPA is contemplating a regulation that would label coal ash a hazardous material requiring special disposal techniques. An American Legislative Exchange Council report estimates that such a label would cripple the industries surrounding coal ash recycling, and that compliance costs for coal ash disposal could run as high as $77 billion, driving energy costs up as the expense is passed on to consumers.
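To put the coal ash figures above in perspective, here is a back-of-envelope sketch. The $2 billion/year recycling market and the $77 billion compliance estimate come from the ALEC report cited in the text; the 20-year amortization window is an assumption added here purely for illustration.

```python
# Coal ash figures from the text: a $2B/yr recycling trade vs. an
# estimated $77B in compliance costs if coal ash is labeled hazardous.
recycling_market_per_year = 2e9   # $/yr, recycled coal ash trade (from text)
compliance_cost_high = 77e9       # $, high-end ALEC disposal estimate (from text)

horizon_years = 20                # assumed amortization window (not from text)
annualized_compliance = compliance_cost_high / horizon_years

# Even spread over two decades, the high-end compliance cost would be
# nearly twice the value of the entire recycling market each year.
ratio = annualized_compliance / recycling_market_per_year
print(f"annualized compliance cost: ${annualized_compliance / 1e9:.2f}B/yr")
print(f"multiple of the recycling market: {ratio:.2f}x")
```

Under that (hypothetical) amortization, the disposal mandate alone would annually outweigh the entire market it threatens.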
    The onslaught on power production continues.

    Fossil-fuel and nuclear power plants often use a simple once-through system of water flow to cool and condense the steam produced by their furnaces or reactors back into water. The cooling water is pumped from a natural source near the facility, run through the plant once, and then returned to its source. This is a cost-effective and safe method of cooling the plants and condensing the steam. However, the EPA is scheduled to finalize a rule by mid-2012 that would require water-cooled plants to shift to a closed-cycle cooling system, such as a cooling tower.

    The retrofits required by such a regulation would send energy costs skyrocketing and could drastically reduce energy production. Each plant requiring cooling-tower modifications would face a capital cost of nearly a billion dollars, and aggregate capital costs across the country could reach $64 billion. Companies or states that own affected power plants would have to decide whether to keep them open. Many states already facing budget deficits would likely be forced to close the doors of their power plants and lay off thousands of people; others that keep plants open would naturally pass the sky-high cost of cooling-tower installation right on to their customers. Either way, open or closed, energy costs for the consumer would soar.
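A quick sanity check of the retrofit numbers above (both figures come from the text; the division is only there to show the implied scale of the rule):

```python
# Cooling-tower retrofit figures from the text.
per_plant_capital = 1e9      # ~$1B capital cost per retrofitted plant
aggregate_capital = 64e9     # high-end nationwide estimate

# The aggregate estimate implies roughly this many plants affected.
implied_plants = aggregate_capital / per_plant_capital
print(f"implied number of plants needing retrofits: {implied_plants:.0f}")
```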

    Further retrofits or retirements of power plants could come from new rules on Hazardous Air Pollutants (HAPs). The EPA is proposing rules that require the use of Maximum Achievable Control Technology (MACT) to bring all power plants up to the average pollution-control performance of the top 12% of existing plants. The prominent HAP driving the MACT rule is mercury. While mercury pollution is a serious issue, the EPA has overblown and overreacted to the issue as it exists in the United States. The MACT rule could cost as much as $358 billion in resource costs, with another $100 billion in capital costs taken on by electric companies and subsequently passed on to utility consumers. These expensive retrofits could force nearly 15 gigawatts of generating capacity into retirement. And the costs are unnecessary, because U.S. energy production is not behind the mercury pollution problem. Current statistics show that mercury contamination of the oceans, and thus of seafood, is an internationally driven externality: an estimated 30% of mercury contamination in the United States comes from outside the country, and 80% of the seafood consumed here (the primary source of mercury exposure) comes from foreign markets. In total, the Electric Power Research Institute estimates that less than 5% of the 2,500 tons of mercury released globally each year comes from the United States, rendering stricter mercury-emission compliance regulations unnecessary on top of being expensive.
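The mercury arithmetic above can be laid out explicitly. All inputs below are the figures quoted in the text (the EPRI estimate and the import statistics); nothing else is assumed:

```python
# Mercury figures quoted in the text.
global_release_tons = 2500     # global annual mercury release, tons
us_share_max = 0.05            # EPRI: less than 5% is from the U.S.
imported_contamination = 0.30  # ~30% of U.S. mercury contamination is foreign-sourced
imported_seafood = 0.80        # ~80% of seafood consumed is imported

# Upper bound on annual U.S. mercury release implied by the EPRI share.
us_release_tons = global_release_tons * us_share_max
print(f"U.S. annual release (upper bound): {us_release_tons:.0f} tons")
print(f"foreign-sourced share of U.S. contamination: {imported_contamination:.0%}")
print(f"imported share of seafood consumed: {imported_seafood:.0%}")
```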

    We all acknowledge that the production of electric power has its negatives. The byproducts can be dangerous, yet our society cannot function, and our economy cannot heal itself and expand, without a reliable power supply. We simply can’t live without affordable and reliable electricity. Our relationship with power is similar to our relationship with water: water is everywhere and creates dangers from drowning to hurricanes, but we’ll all wither away without it. The problem with the current slate of EPA regulations is that they address inherent issues, hoping to eliminate dangers where only management is necessary or possible. The fragile economy isn’t prepared for massive retirements and expensive retrofits of major electric power sources simply because the EPA has deemed the already effective management of the externalities of power production unacceptable. Mercury, coal ash, and other byproducts of economical power production are already effectively contained, managed, or recycled. This new slew of EPA rules and regulations, the logical equivalent of banning water for its inherent dangers, only burdens the American people with astronomically higher energy costs for a less reliable electric power system, and will not significantly affect the state of the environment.

    • Charles, what a great post – I had almost forgotten: THE ULTIMATE SOLVENT, it even dissolves glass. You will have the E-GREENS UP ALL NIGHT TRYING TO FIGURE IT ALL OUT – he he he he he

      Mercury comes from the mountains – do you think they will support leveling them out so it does not leach into the water and fish? What – someone said science was not fun?

  18. Utilities warn of higher rates because of pollution rules
    By Thomas Content of the Journal Sentinel Aug. 19, 2011

    Two state utilities said this week that new federal pollution rules will lead to higher electricity costs come January.

    Wisconsin Public Service Corp. of Green Bay said its residential customers can expect an increase of more than $4 a month next year, including about $2 linked to the new rules designed to limit air pollution from coal-fired power plants.

    The utility said it would see higher costs of about $32.6 million in 2012 from the Cross-State Air Pollution Rule that was finalized recently by the U.S. Environmental Protection Agency. That will result in rates going up by 6.8% instead of 3.4%, the utility said.
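As a rough consistency check of the WPS figures, a hypothetical sketch: the $32.6 million rule cost, the roughly $2/month residential share, and the 6.8% vs. 3.4% rate figures come from the article, but the split across customer classes is not given, so the account count below only indicates scale.

```python
# WPS figures from the article.
rule_cost = 32.6e6                  # 2012 cost of the Cross-State rule, $
residential_share_per_month = 2.0   # $/month attributed to the rule
rate_increase_with_rule = 0.068
rate_increase_without_rule = 0.034

# Portion of the rate increase attributable to the new rule.
rule_share_of_increase = rate_increase_with_rule - rate_increase_without_rule

# Accounts that would be needed to recover the full cost at $2/month.
implied_accounts = rule_cost / (residential_share_per_month * 12)

print(f"rate-increase share attributable to the rule: {rule_share_of_increase:.1%}")
print(f"accounts needed to recover the full cost at $2/mo: {implied_accounts:,.0f}")
```

That account count is larger than a typical mid-size utility’s residential base, which suggests most of the $32.6 million falls on commercial and industrial customers – consistent with the paper-mill complaint later in the article.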

    The U.S. Environmental Protection Agency last month finalized stronger regulations for Wisconsin and 26 other states aimed at curbing air pollution from long-distance sources.

    Environmental groups praised the new rule because it would reduce acid rain and air pollution as well as help curb health effects from dirty air linked to coal plants. The EPA projected the rule will save up to 34,000 lives a year and prevent more than 400,000 asthma attacks as well as 19,000 admissions to hospitals.

    Nationwide, the EPA projects that utilities will spend $800 million on the rule in 2014, in addition to the $1.6 billion a year already being spent to satisfy an earlier version of the regulations.

    But the EPA estimates the nation will see $120 billion to $280 billion in annual health and welfare benefits beginning in 2014.
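Taking the EPA’s own numbers quoted above at face value, the implied benefit-to-cost ratio is easy to compute. Adding the $800 million for the new rule to the $1.6 billion a year for the earlier version is an assumption made here to get a combined annual cost:

```python
# EPA cost and benefit projections quoted in the text.
new_rule_cost = 800e6        # $/yr projected spend on the new rule in 2014
earlier_rule_cost = 1.6e9    # $/yr already spent on the earlier version
benefit_low, benefit_high = 120e9, 280e9  # EPA's projected annual benefits

annual_cost = new_rule_cost + earlier_rule_cost  # combined (assumption, see above)
ratio_low = benefit_low / annual_cost
ratio_high = benefit_high / annual_cost
print(f"claimed benefit-to-cost ratio: {ratio_low:.0f}x to {ratio_high:.0f}x")
```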

    The new rule has been in development for several years but the first phase of compliance hits utilities in 2012. WPS said it won’t have time to install pollution controls by next year at its plants, but will be able to comply by purchasing credits from other utilities that have cut emissions.

    The utility also said it plans to operate its coal plants less next year than it otherwise would have, and will buy more power from the Midwest wholesale power market as a result, a move that it said is also a factor in higher costs for customers.

    “This is the best option we have to meet power supply needs for 2012 and comply with the new EPA rule at this time,” said Karen Kollmann, WPS director of fuels management in a statement.

    On Thursday, Wisconsin Power & Light Co. of Madison said it would face an additional $9 million in costs linked to the air pollution rule. With the change, the utility is now seeking an increase in 2012 of $20 million, or 2%, utility finance manager Martin Seitz said in a filing with state regulators.

    Todd Stuart, executive director of the Wisconsin Industrial Energy Group, criticized the increases, and he noted that large energy users like paper mills will see higher than average increases, compared with homeowners and small businesses. Paper mills served by WPS could see a 9% hike, he said.

    “The EPA’s new rules have directly resulted in a major new cost for struggling homeowners and manufacturers,” Stuart said in a statement. “Members of Congress should be taking a very hard look at the significant compliance costs of EPA’s new mandates.”

    “Industry always cries wolf whenever EPA tries to reduce air pollution,” said Katie Nekola, lawyer with the conservation group Clean Wisconsin. “The fact is, the new rule will affect old, inefficient, unnecessary coal plants that should have been shut down long ago. The continued operation of those old units is costing ratepayers money, but you don’t hear industry complaining about that.”

  19. Here is the extreme E-GREEN fight-back, an attempt to convince the 69% of non-believers in AGW that science agrees man is a cause – in their OPINION. That is not science: first you form a hypothesis, then you peer review it, and then you must PRODUCE A PROOF – until all that is done you have no scientific PROOF, just opinions. Most of the research was corrupted by underlying false or cherry-picked data, rendering all the computer-model predictions null, void, and invalid. Just look at the ocean-level predictions from the computer models vs. actual on-the-ground reality.

  20. And coal ash must be limited – if Texas complies with a new EPA ruling next year, we’ll be having nice, wonderful rolling blackouts. Sure hope it’s through being 100+ degrees every day by then.
    The EPA is currently considering a federal proposal to regulate coal ash that includes two options: the first option would classify coal ash as hazardous waste, requiring water quality monitoring, liners and the phase out of dangerous “wet” storage of coal ash, such as the pond that collapsed in Kingston, Tennessee in 2008. The second option would continue to allow states to inadequately regulate coal ash by establishing only guidelines that states are free to ignore. Unsurprisingly, coal ash generators support the weaker option. The EPA, under pressure from industry, has postponed finalizing the coal ash standard until 2012.
    http://www.commondreams.org/newswire/2011/08/17-2

    so i guess reclassifying coal ash as hazardous waste is gonna set new standards for all the sheetrock in our houses?? we’ll now all have toxic houses?

    • TJE,

      Seems to me that we could use it in highway blacktop – it is just carbon dust, and the rocks and sands in the asphalt contain much of the same metals and radiation IMO, so use the carbon instead of, or along with, the oil?

      • carbon based creatures in a hysterical frenzy fearing carbon…
        silly friggin humans

  21. i gotta say, Lock – if extreme weather patterns are supposed to convince folks the Al Gore myth is true, I should rank among the most convinced, considering what this #^%!%#!% drought has put me through. But that darn understanding of natural law just won’t seem to be overridden.
    Although I’ve had my moments of wondering what the heck is going on, I’m more inclined to blame the earth’s axis shift or ring-of-fire activity changing water temperatures. The foolish mistake, repeated throughout history, of humans believing they’re superior to the forces of nature is not one I’ll be making.
    Humans just aren’t ever gonna do it – not by CO2, not by HAARP, not ever… it’s the foolishness of failing civilizations.

  22. DUST TO DUST DIRT TO DIRT – ALL THE SAME IN THE END.

    We humans live for, say, 100 years, and we want to dictate global climate out of some 4.5 billion years of Earth’s history – WTF. If observation of the past were a reliable indicator of the future, stockbrokers would be rich and not working – and so would farmers?

    Pyramid Lake near Reno, Nevada was once a redwood forest beside a huge inland sea but is now desert, the trees petrified by the high-mineral-content water. The Sahara desert was once a forest. So what makes us so damn smart today? We can tell climate patterns 10, 20, 30, 50, 100 years from now? I do not think so.

    I know that I am intelligent, because I know that I know nothing.
    Socrates

  23. Is economic “graceful decline” the true agenda of some warmists?
    Posted on August 20, 2011 by Anthony Watts

    Bill McKibben – Image via Wikipedia

    Guest commentary by Indur Goklany

    Sometimes the true agenda is laid bare.

    From http://www.eenews.net/climatewire/print/2011/08/19/1, a piece on Bill McKibben in which E&E News’ Paul Fialka discusses his agenda, come these passages.

    [My comments are in brackets. I have highlighted some passages.]

    Many of the climate theories in [McKibben’s] book [“The End of Nature.”]– and the future career path of McKibben — were shaped by James Hansen, who was then and is now the head of the NASA Goddard Institute for Space Studies in New York. Starting in 1988, Hansen had begun to testify before Congress that greenhouse gas emissions had begun to change familiar weather patterns on the planet and, without action to limit them, the changes would become more obvious and dangerous in the 21st century.

    As Hansen explained and as McKibben later found out, the people who were most vulnerable to the flooding, famine and drought and the spread of tropical diseases lived in developing countries. McKibben was interviewing people in the slums of Bangladesh in 2006 when he was hospitalized with dengue fever, which is still untreatable. As he watched others dying, he recalled in a later book: “Something in me snapped. Nothing concrete had come from my work, or anyone else’s.”…

    Putting the U.S. economy into ‘graceful decline’

    While some companies have been critical of the chamber’s lobbying, McKibben will have great difficulty convincing them about another premise of his, which is that to cope with the more expensive food, weather, health and energy challenges of a climate-changed world, the growth of America’s economy can’t continue.

    350.org supporters line up in Baku, Azerbaijan. They were among those in 188 countries who demonstrated for climate change solutions on Oct. 10, 2010. Photo courtesy of Flickr.
    He talks about federal policies that put the economy in a “graceful decline,” one that stimulates small-scale, organic farming and has more of a focus on activities in neighborhoods, towns and states than on national and international affairs. “We need to scale back, to go to ground,” he says in “Eaarth.”

    [COMMENT: (1) Apparently, it has never occurred to McKibben that the perhaps the major reason why people in developing countries were most vulnerable to flooding, famine and drought and the spread of tropical diseases and why Bangladeshis died from dengue is that they lacked economic development and had stuck to “organic farming” for much longer than farmers in the developed countries. (2) There is nothing “graceful” about lower economic development. Ask not only people in developing countries but also those trapped without jobs in developed countries.]

    What McKibben says he wants from Washington is a “stiff price on carbon” emissions. He calls cap and trade, the Democrats’ most recent legislative attempt to impose a price on carbon emissions through an economywide emissions trading scheme, “an incredibly complicated legislative scheme that gives door prizes to every interested industry and turns the whole operation over to Goldman Sachs to run.”

    …Fred Krupp, president of the Environmental Defense Fund…one of the leaders of a coalition of major environmental groups and corporations that pushed cap and trade through the House [when asked] about McKibben’s advocacy of civil disobedience, … said “that’s a matter of personal conscience and personal choice. It’s not among the tactics that EDF uses.”

    Frank O’Donnell, president of Clean Air Watch, a small, Washington-based environmental group, is among those lining up alongside McKibben…

    Paul Bledsoe, a former Clinton administration White House aide, has known McKibben for 15 years [and] now works with Washington’s Bipartisan Policy Center, said he isn’t surprised by McKibben’s move toward civil disobedience. “Because climate impacts will hurt and potentially devastate the poor disproportionately, the moral and social justice elements of climate are much greater than many other environmental problems,” Bledsoe said.

    [COMMENT: So how would a decline in economic development – “graceful” or otherwise – reduce climate impacts?]

    In the interview here, McKibben explained that his group, 350.org, gets about $1 million a year in donations, most of it from foundations. Most of its activists are volunteers, led by 20 to 30 staffers “who are paid very little.” Financially, it is outgunned by the U.S. Chamber and fossil fuel companies, which is why he has organized it as a “movement” to raise public awareness. “Our currency is bodies and spirit,” he said. “This [climate change] is the biggest thing that’s ever happened.”

  24. we need to sue the epa and make them prove all the things they do are really needed for the country. i think this is all a bigger plan than we know – obama and epa, very bad for our country

    • Texas is doing that now – it may also be a nullification issue, a 10th Amendment issue, as it is not an Article I, Section 8 enumerated power. The limits are now arbitrary and capricious, not based on facts, economics, or even real health issues – the limits are contrived to give them more power and control over the economy and manufacturing.

  25. Can we all agree that the EPA is the biggest and baddest of all the government agencies strangling this country’s economic welfare? It needs to be shut down, or at least seriously curtailed – add that to the list of criteria we ask of our candidates for POTUS.
    All of the government agencies that Nixon and Carter started, as well as the Department of Agriculture, have been used to artificially manipulate this country’s economy, as well as that global economy we keep hearing about!

    • It is right there with its evil twin, “THE ENDANGERED SPECIES ACT” – subspecies are now endangered if they cannot be found down the road, and they delay projects for decades at a minimum. Look how a little fish shut down irrigation to the California Central Valley, putting 40% in the unemployment lines and bankrupting farmers.

  26. Drinking water run through oil-powered filters, then put in an oil-product plastic bottle – oh yeah, it is not about power and money [fines and taxes], it is about saving the environment? BS

  27. President Barack Obama is ignoring heated concerns from within his own administration that new Environmental Protection Agency coal industry regulations will be economically devastating.

    The EPA is plowing forward with new Maximum Achievable Control Technology (MACT) mandates. The regulations would force coal energy plants to install giant scrubber-like materials inside smokestacks to capture and cleanse carbon particles before their atmospheric release.

    The upgrade cost would fall on company employees and coal miners in the form of layoffs, as well as on businesses, which could expect to pay more for energy.

    In a lengthy letter to EPA Administrator Lisa Jackson, the advocacy office of Obama’s Small Business Administration wrote that the EPA “may have significantly understated” the economic “burden this rulemaking would impose on small entities.”

    One Southern Indiana Chamber of Commerce vice president, Tonya Fischer, told The Daily Caller the entire state of Indiana would be “devastated” by these regulations. “We are definitely in opposition to [the MACT regulations] because it would be devastating for the state of Indiana.” She adds that local businesses, which are struggling with the tough economy already, would be forced to pick up the extra energy production costs Obama’s EPA is pushing. “We get 95 percent of our electricity from coal.”

    “The cost to convert those facilities would be passed on to the small business owners, or basically shut them [the coal energy producing facilities] down altogether,” Fischer said. “It would become cost-prohibitive for them [local businesses] to continue paying their electricity bills.”

    If the EPA regulations aren’t halted, Fischer expects unemployment numbers in Indiana to skyrocket. “This has got to affect tens of thousands of jobs in the area because, not only would you lose the employees from the coal facilities, the plants themselves would become more streamlined so you’d lose jobs there and, of course, the small and local businesses.”

    Read more: http://dailycaller.com/2011/08/11/white-house-epa-ignore-small-business-admins-report-that-new-coal-regulations-will-kill-jobs-economy/#ixzz1w0c8LAR6

    • BREAKING NEWS: The President has just confirmed that the DC earthquake occurred on a rare and obscure fault-line, apparently known as “Bush’s Fault”. The President also announced that the Secret Service and Maxine Waters continues an investigation of the quake’s suspicious ties to the Tea Party. Conservatives however have proven that it was caused by the founding fathers rolling over in their graves!!
