The Paris Agreement on climate — a good start, but…

A 31-page accord on climate, the Paris Agreement, was adopted on 12 December 2015, and endorsed by acclamation by 195 countries, parties to the United Nations Framework Convention on Climate Change (UNFCCC) at their 21st meeting (COP21). The achievement of universality was remarkable and historic because, for the first time, developing countries also committed to taking action to prevent climate disaster. The rich countries reaffirmed that there are differential responsibilities — code for their far greater contribution to the problem of climate disruption.

Another truly remarkable thing was the skill with which the small island states, like the Marshall Islands, and their supporters navigated the waters where the Exxons and Saudi Arabias of the world sail. They led COP21 to an accord that seeks to hold “the increase in the global average temperature to well below 2°C above pre-industrial levels and to pursue efforts to limit the temperature increase to 1.5°C above pre-industrial levels, recognizing that this would significantly reduce the risks and impacts of climate change” (Article 2). The slogan was “one-point-five to survive.” Anything more would mean their destruction by rising oceans — along with so many other coastal communities and lands from Bangladesh to Shanghai to Miami and Mumbai. Hundreds of millions would be displaced at 2°C, the previous average temperature rise limit agreed to in climate negotiations. Take a look at the excellent New York Times illustration of Chinese cities now, with a 2°C temperature rise, and with a 4°C rise.

The 1.5°C limit implies an end to the large-scale destruction of forests; Article 5 begins to address the issue. It would require leaving most oil, gas, and coal in the ground: fossil fuels would become like stones after the Stone Age — obsolete. While essential for Mother Nature and people generally, such a phase-out would cost millions of workers their jobs. A just transition for them and the communities they live in was an option in Article 2 of the draft going into COP21; it was relegated to the preamble in the final document, as were “obligations on human rights, the right to health, the rights of indigenous peoples, local communities, migrants, children, persons with disabilities and people in vulnerable situations and the right to development, as well as gender equality, empowerment of women and intergenerational equity…” (p. 21). But the words are still there, inviting action. In addition, there was acknowledgement of the need for “gender balance” and of the value of indigenous knowledge in adaptation.

Critically, the substance of the commitments, if they can be called that, is not remotely up to the task of limiting temperature rise to 1.5°C. Indeed, there are no legally binding targets at all. Instead there are highly inadequate, voluntary “Intended Nationally Determined Contributions” (INDCs) that imply roughly a 3°C rise, double the 1.5°C target. Remember: damage would rise far faster than average temperature.

To keep temperature rise to less than 2°C, the Intergovernmental Panel on Climate Change (IPCC), in its Mitigation Report, estimated that CO2-equivalent (CO2eq) concentrations would have to be limited to 450 parts per million (ppm) by the year 2100 (pp. 8-10). That means emissions 40 to 70 percent below 2010 levels by 2050 and “near or below zero” in 2100 (pp. 10, 12; italics added). That would make it likely that the temperature rise would be less than 2°C; the chance that it would suffice for 1.5°C? Just 16 percent, with a likely overshoot above that in mid-century (Figure 6-13, p. 439). The IPCC also noted in its summary explicitly addressed to policy makers that “Only a limited number of studies have explored scenarios that are more likely than not to bring temperature change back to below 1.5°C by 2100 relative to pre-industrial levels; these scenarios bring atmospheric concentrations to below 430 ppm CO2eq by 2100.” (p. 16, emphasis in the original) Below 430 ppm! The world was already at 430 ppm CO2eq (including all greenhouse gases) in 2011; we are at more than that now.
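What those percentage targets mean year by year can be checked with back-of-envelope arithmetic. A minimal sketch, under my own illustrative assumption that emissions fall at a constant compound rate from 2010 to 2050:

```python
# Back-of-envelope arithmetic for the IPCC range cited above.
# Assumption (mine, for illustration): emissions fall at a constant
# compound annual rate between 2010 and 2050.

def annual_reduction_rate(fraction_remaining: float, years: int) -> float:
    """Constant annual rate r such that (1 - r) ** years == fraction_remaining."""
    return 1 - fraction_remaining ** (1 / years)

# "40 to 70 percent below 2010 by 2050" leaves 60% to 30% of 2010 emissions.
for cut, remaining in [(0.40, 0.60), (0.70, 0.30)]:
    rate = annual_reduction_rate(remaining, 2050 - 2010)
    print(f"{cut:.0%} below 2010 by 2050 -> roughly {rate:.1%} cut per year")
```

Even the low end of the range requires sustained cuts every single year for four decades; the world has never managed that outside of deep recessions.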

The breathtaking scale of this task is not evident in the Paris Agreement, though it does express “serious concern” about “the significant gap” between the INDCs and the stated ambition. Only it’s not just a significant gap; it’s a Himalayan crevasse. It seems clear that for a reasonable chance of limiting the temperature rise to 1.5°C, global emissions would have to go to zero well before 2100. Considering differentiated responsibilities, rich countries would have to get to essentially zero emissions by about 2050 or before.

The Paris Agreement has provisions for countries to strengthen their commitments to reduce emissions and for five-year reviews. The first review will be in 2018 (“facilitative dialogue…to take stock”, p. 4). A high-priority task, if we are serious about 1.5°C, would be to get zero emissions in the energy sector for rich countries by 2050 (at the latest) on the agenda for that dialogue. Global justice requires at least that. Energy justice within countries will need to be addressed too. For the United States, I suggest that the energy burdens of low-income households be capped at 6 percent of income, a level considered affordable. We’ve done a study detailing that for Maryland; it also explores how to provide universal solar energy access. Both are more essential now, for economic justice and for climate goals.
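The 6 percent cap is simple arithmetic. A sketch with hypothetical numbers (the income and bill figures below are my own illustrations, not from the Maryland study):

```python
# Illustrative sketch of a 6 percent energy-burden cap.
# All household figures here are hypothetical.

def affordable_energy_bill(annual_income: float, cap: float = 0.06) -> float:
    """Maximum affordable annual energy bill under an energy-burden cap."""
    return annual_income * cap

income = 24_000        # hypothetical annual household income, USD
actual_bill = 2_400    # hypothetical annual energy bill, USD (a 10% burden)

capped_bill = affordable_energy_bill(income)
print(f"Current burden: {actual_bill / income:.0%}")
print(f"Capped bill: ${capped_bill:,.0f}; gap to cover: ${actual_bill - capped_bill:,.0f}")
```

The gap between the actual bill and the capped bill is what efficiency programs, solar access, or assistance would need to close for that household.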

Presumably, the $100 billion a year that the rich countries promise to provide by 2020 and thereafter (pp. 16-17) would partly make up for constraining the carbon space of those who did not contribute much to creating the problem. In fact, while recognizing that countries and peoples are already experiencing “loss and damage”, the Paris Agreement flatly states that the article covering such losses “does not involve or provide a basis for any liability or compensation.” (p. 8) The accord lacks a vital tool: teeth.

There is one bad element, a carryover from the Kyoto Protocol. Article 6 of the Paris Agreement would allow international offsets (“cooperative approaches that involve the use of internationally transferred mitigation outcomes towards nationally determined contributions”). This means that some countries (likely rich ones) could continue to pollute while claiming that others (likely poor ones) are doing more than their share or storing carbon in some way, for instance in soil or trees. It is a giant loophole, with potential for serious corruption as well.

The Paris Agreement is a good start, especially in that it sets forth a temperature goal and commits all parties to act, with differentiated responsibilities for the rich. Most of the needed words are there; however, they are, for the most part, weak. To give them effect and keep most fossil fuels in the ground will take the global equivalent of the movement that stopped the Keystone Pipeline. Yet, the agreement could be a solid beginning: it has created immense organizing energy. The work of keeping fossil fuels in the ground has already begun, among others by 350.org, the group that led the huge and diverse Keystone struggle.

We will also need national and local roadmaps for efficiency and renewable energy, transportation, and sustainable agriculture (a large source of greenhouse gas emissions). That vision will need to be broad. For instance, it will need to include the cooking energy needs of hundreds of millions of families who now cook with wood, cow dung, and crop residues. Women and children die in the millions each year of respiratory diseases, and black-carbon (soot) emissions contribute to global warming.

The world already has more than one billion petroleum-fueled cars — it is headed to 2 billion by 2030. That is incompatible with the Paris Agreement. Transportation will need to be revolutionized — and electrified — with electrified public transport much more in the center of things and all types of transportation running on renewable energy. Paris itself should be an inspiration: a walkable city with wonderful public transport.

We will need roadmaps, created with public input, for productively investing and spending the $100 billion a year, along with intense pressure to ensure that at least that much money is forthcoming, that it is well spent, and that it creates good jobs for workers now in the fossil fuel sectors.

At bottom, 1.5°C is about reshaping a world created by imperialist-drawn borders, often with oil at the center, and a hundred years of wars — still going on — into one that is ecologically sane, peaceful, and economically just. Remember Syria and Iraq (among others) were essentially created by Britain and France after World War I. Actually achieving a limit of 1.5°C will mean taking the tiger out of Exxon’s tank and putting it into the Paris Agreement. It may well be a perilous exercise in itself. But it is one that is essential — it is the one-point-five imperative.

The Clean Power Plan is a step in the right direction

With the publication of the final Clean Power Plan, the United States can finally claim some leadership in curbing CO2 emissions at the federal level. The final rule is, on balance, technically, economically, and environmentally coherent. The actual goal falls short of what it needs to be, but it is better than in the draft plan. And the direction is right, which is the most important thing. Thanks to all of you who worked with us and supported us in the process, especially Scott Denman, Diane Curran, Lisa Heinzerling, Elena Krieger, and my co-author, M.V. Ramana.

We asked for many things in our comments on the draft EPA Clean Power Plan. The EPA agreed not only with the substance but, more important, with the reasoning underlying our policy positions in the final Clean Power Plan (CPP) rule.

Most of all we asked for a coherent, technology-neutral rule that would be more protective of climate. Here are some of the big picture items:

  1. Existing nuclear plants and license extensions will not be subsidized by the CPP: We asked that both existing nuclear power plants and existing renewable energy be removed from the calculation of emission targets because they do nothing to reduce CO2 emissions. We asked that they be treated consistently. (Neither has significant onsite emissions of CO2, and both have offsite lifecycle emissions that are much lower than those of natural gas per unit of generation.) Existing generation should not be part of the “Best System of Emission Reduction” (BSER) because we want to reduce CO2 emissions from where they are now (or in 2012, the baseline year). The EPA agreed. Both are gone from the final rule. Further, in its draft rule, the EPA implicitly assumed (in its modeling of the electricity sector) that licenses of existing plants would be extended. The relicensing issue has been removed from the CPP since existing generation is not in the calculation of emission reductions. It is simply the baseline generation, as is clear from page 345 of the final plan (italics added):

    …we believe it is inappropriate to base the BSER on elements that will not reduce CO2 emissions from affected EGUs below current levels. Existing nuclear generation helps make existing CO2 emissions lower than they would otherwise be, but will not further lower CO2 emissions below current levels. Accordingly,…the EPA is not finalizing preservation of generation from existing nuclear capacity as a component of the BSER.

    The same reasoning was applied to license extensions. Only uprates (increases in licensed capacity of existing plants) would be allowed to be counted. This is consistent and technology-neutral (in the same way that increasing the capacity of a wind farm would be counted). The rule does not seek to “preserve” existing power plants. Or to shut them down. That will happen on the merits, without an EPA hand on the scale in favor of nuclear.

  2. New and under-construction nuclear reactors are not part of the best system of emission reduction; renewable energy is: We pointed out that new nuclear plants are very expensive; even the State of Georgia, whose ratepayers are forced to subsidize two nuclear units through their electricity bills, noted that in its comments. Since the “Best System of Emission Reduction” (BSER) has a cost criterion, new nuclear should be excluded from the BSER. (We also cited other reasons for that.) The EPA excluded new nuclear from BSER but included new renewable energy (p. 345, italics added):

    Investments in new nuclear capacity are very large capital-intensive investments that require substantial lead times. By comparison, investments in new RE generating capacity are individually smaller and require shorter lead times. Also, important recent trends evidenced in RE development, such as rapidly growing investment and rapidly decreasing costs, are not as clearly evidenced in nuclear generation. We view these factors as distinguishing the under-construction nuclear units from RE generating capacity, indicating that the new nuclear capacity is likely of higher cost and therefore less appropriate for inclusion in the BSER.

    This is a critically important statement. We don’t have a shortage of low-CO2 sources. We have a shortage of time and money to reduce CO2 emissions. The EPA recognized (very delicately!) that renewable energy is better on both counts. As a result, one or more of the four new reactors under construction at Vogtle and Summer can proceed or stop on the financial merits, rather than these units being pushed into existence with the Clean Power Plan playing the role of midwife.

    The EPA also “seeks to drive the widespread development and deployment of wind and solar, as these broad categories of renewable technology are essential to longer term climate strategies” (p. 874). This is an excellent goal. The EPA recognized that costs of solar and wind are declining.

  3. New natural gas plants are not part of the best system of emission reductions: This is perhaps the best and most solid indication that the Obama administration takes long-term reductions seriously. New natural gas combined cycle plants, even though they have lower CO2 emissions per megawatt-hour (using EPA leak rates and global warming potential for methane), will not be part of the BSER even though they meet the cost test and emission rate test. The reason: they will be emitting CO2 for decades (p. 346, italics added):

    However, our determination not to include new construction and operation of new NGCC capacity in the BSER in this final rule rests primarily on the achievable magnitude of emission reductions rather than costs. Unlike emission reductions achieved through the use of any of the building blocks, emission reductions achieved through the use of new NGCC capacity require the construction of additional CO2-emitting generating capacity, a consequence that is inconsistent with the long-term need to continue reducing CO2 emissions beyond the reductions that will be achieved through this rule. New generating assets are planned and built for long lifetimes — frequently 40 years or more — that are likely longer than the expected remaining lifetimes of the steam EGUs whose CO2 emissions would initially be displaced by the generation from the new NGCC units. The new capacity is likely to continue to emit CO2 throughout these longer lifetimes….

  4. Increased capacity factor of existing natural gas plants is BSER: The EPA is still allowing increased capacity factor of existing natural gas combined cycle power plants to displace coal. This is the result of its estimates of methane leak rates and global warming potential. So long as new central station natural gas plants are not encouraged, the rate of use of existing plants is a problem that can be sorted out in the coming years. It would have been very difficult to argue, only on the grounds of the BSER rules and existing methane leak estimates, that increasing the capacity factor of existing natural gas combined cycle units to displace coal is not BSER. The job now is to get the EPA to recognize a wider array of methane leak rates (which have ample empirical support) and to use both a 20-year and a 100-year warming potential screen in the design of its CO2 reduction programs. The recent report from the IPCC uses a global warming potential of 28-34 for methane, including feedback effects. It would be entirely appropriate for the EPA to adopt a similar evaluation metric. The 20-year warming potential, which is about three times higher, would be even more appropriate given that the climate crisis is developing more rapidly than previously anticipated.
  5. The EPA has incentivized early investment in low-income efficiency programs (p. 864 onward): This is a very important feature of the CPP. States that want to make very sure that low-income households are not adversely impacted by the rule will take advantage of the additional emission reduction credits the EPA is offering for early action. This also promises to provide other benefits such as reduction of the cost of energy assistance programs and lower adverse health impacts due to inability to pay for health care or medicines.
  6. The cap-and-trade provision is OK in the electricity context, though with reservations: Carbon permits from new generation can be traded. For instance, existing nuclear plants cannot generate tradeable CO2 credits (unless they are from a licensed uprate). I am not a fan of expansive cap-and-trade, but the EPA formulation in the CPP makes sense to me. It is the same as if emission limits were set for a group of states or at the grid level, such as the PJM grid in the mid-Atlantic region but extending inland to Ohio and beyond, or the MISO grid in the upper Midwest. The EPA seeks not to impose a model of reductions, only to get to a certain level of reductions. In the cap-and-trade system permitted by the EPA, the CO2 reduction could happen in one state or in another, but it will have to happen. One of my reservations is that the EPA also allows the trading of energy efficiency credits across state lines. It is difficult enough to account for program-induced efficiency improvements within a state and distinguish them from, say, the effects of federal appliance standards. Bundling these efficiency gains into tradeable credits is not a good idea. Another issue is that the method of calculating the reduction in emission rate is not the best as applied to efficiency. We had asked for a more global and comprehensive approach to CO2 accounting, but did not succeed on this point.
  7. Conclusion – The CPP is a real tour de force; it gives me hope. Of course, there is much work to do now that the final CPP has been published (besides making it stick). We need to advocate for states to mandate GHG reduction targets of 40 to 50 percent by 2030 from all sources; we need to accelerate electrification of transportation and restructuring of the grid….But the CPP is a great springboard from which to make these leaps.
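The stakes in the leak-rate and warming-potential choices discussed in item 4 can be sketched with rough numbers. In the example below, the heat rate, emission factors, leak rate, and GWP values are my own illustrative assumptions, not EPA figures:

```python
# Rough illustration of why methane leak rates and the GWP horizon matter
# for gas-fired electricity. All parameter values are assumptions of mine.

def gas_co2eq_per_mwh(leak_rate: float, gwp_ch4: float,
                      heat_rate_mmbtu: float = 7.0,    # assumed NGCC heat rate per MWh
                      co2_kg_per_mmbtu: float = 53.1,  # assumed combustion CO2 factor
                      gas_kg_per_mmbtu: float = 19.2): # assumed mass of gas per MMBtu
    """kg CO2-equivalent per MWh: combustion CO2 plus upstream CH4 leaks."""
    combustion_co2 = heat_rate_mmbtu * co2_kg_per_mmbtu
    gas_burned = heat_rate_mmbtu * gas_kg_per_mmbtu        # kg of methane burned
    leaked_ch4 = gas_burned * leak_rate / (1 - leak_rate)  # kg leaked upstream
    return combustion_co2 + leaked_ch4 * gwp_ch4

# Compare a 100-year GWP (the 28-34 range cited above) with a roughly
# three-times-higher 20-year GWP, at an assumed 3% leak rate.
for gwp, horizon in [(34, "100-year"), (86, "20-year")]:
    total = gas_co2eq_per_mwh(leak_rate=0.03, gwp_ch4=gwp)
    print(f"{horizon} GWP, 3% leak rate: about {total:.0f} kg CO2eq/MWh")
```

With no leaks, combustion alone contributes roughly 370 kg CO2eq/MWh under these assumptions; even a modest leak rate viewed through the 20-year GWP pushes gas substantially closer to coal, which is why the choice of screen matters.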

Fukushima reflections on the second anniversary of the accident

Statement of Arjun Makhijani for the March 2013 conference commemorating the Fukushima accident
To be read by Helen Caldicott

I appreciate that my friend Helen Caldicott, one of the two people who inspired my book Carbon-Free and Nuclear-Free (the other was S. David Freeman), has agreed to read a brief statement from me on this second anniversary of the Fukushima disaster. I wanted to share two of the new things I have learned as I have watched the consequences of Fukushima unfold.

First, the Japanese government proposed to allow doses as high as 2 rem (20 millisieverts) per year to school children, claiming that the risk was low or at least tolerable. An exposure at this level over five years – 10 rem in all — to a girl, starting at age five, would create a cancer incidence risk of about 3 percent, using the [age- and gender-specific] risk estimates in the National Academies BEIR VII report.

Now imagine that you are a parent in Japan trying to decide whether to send your daughter to such a school. Roughly thirty of every hundred girls would eventually develop cancer at some point in their lives; about three more per hundred would develop a cancer attributable to Fukushima school exposure, according to the risk numbers. But no one would know whether their daughter’s cancer was attributable to the exposure at school, and neither would the Japanese government’s radiation bureaucrats. Why is it difficult to understand that while the risk attributable to school contamination would be one in thirty, the proportion of parents stricken with guilt and doubt would be closer to one in three? Would you ever forgive yourself if you made the decision to send your daughter to that school? Or your son, though the risk attributable to Fukushima exposure would be less than that experienced by girls?
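A back-of-envelope sketch of that arithmetic. The one-in-thirty attributable risk comes from the text above; the roughly 30 percent baseline lifetime cancer incidence is my own round figure for illustration:

```python
# Why guilt falls on ~1 in 3 parents while the attributable risk is ~1 in 30.

attributable = 1 / 30   # lifetime cancer risk from the school exposure (text above)
baseline = 0.30         # assumed lifetime cancer incidence absent the exposure

# Chance that a given girl develops cancer from any cause, treating the
# two risks as independent:
any_cancer = 1 - (1 - baseline) * (1 - attributable)

print(f"Risk attributable to the school exposure: {attributable:.1%}")
print(f"Risk of cancer from any cause: {any_cancer:.0%} -- about one in three")
```

Every cancer in that one-third raises the question of whether the school exposure caused it, and no test can answer that question for any individual.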

Indeed, due to the long latency period of most cancers, you would be fearful even if no cancer had as yet appeared. The Pentagon understood this when a Joint Chiefs of Staff Task Force evaluated the extensive contamination produced by the July 1946 underwater nuclear bomb test (Test Baker) at Bikini for its usefulness in war. Here is a quote from their 1947 report:

“Of the survivors in the contaminated areas, some would be doomed by radiation sickness in hours, some in days, some in years. But, these areas, irregular in size and shape, as wind and topography might form them, would have no visible boundaries. No survivor could be certain he was not among the doomed, and so added to every terror of the moment, thousands would be stricken with a fear of death and the uncertainty of the time of its arrival.”

Compare this for yourself with the aftermath of Fukushima and the plight of the parents.

Second, nuclear power’s conceit was that it is a 24/7 electricity supply. Since Fukushima, over sixty of the world’s light water power reactors have been prematurely shut down for a variety of reasons, though just four reactors were stricken by the accident: fifty-two in Japan, eight in Germany, and several in the U.S. Even if some are eventually restarted, nuclear power has shown a unique ability to go from 24/7 power supply to 0/365 essentially overnight for long periods – hardly a convincing claim of reliability.

We can do better than making plutonium just to boil water or polluting the Earth with fossil fuel use. When I finished Carbon-Free and Nuclear-Free in 2007, I estimated it would take about forty years to get to an affordable, fully renewable energy system in the United States. Today, I think it can be done in twenty-five to thirty years. Are we up to the challenge? Finally, I truly regret I cannot be there to publicly thank and honor my friend Helen for inspiring Carbon-Free, Nuclear-Free, which you can download free from ieer.org, also thanks to her. I wish you a very productive conference.

(Also see IEER’s publication Plutonium: Deadly Gold of the Nuclear Age, June 1992.)

When small is not beautiful, is it at least cheap?

On February 6, 2013, Dan Stout, the Tennessee Valley Authority’s senior manager for its Small Modular Reactor project, gave a colloquium at the University of Tennessee in Knoxville. Much of the talk was just nuclear boosterism. For instance, he claimed that “nuclear power was tested hard in 2011. It remains safe, reliable and affordable.”

There was no mention of the fact that post-Fukushima, about 60 of the world’s light water reactors were closed for one reason or another, mainly in Japan and Germany, taking them from 24/7 power generation to 0/365. He said nothing of the enormous social, economic, and environmental dislocation and loss that has afflicted the Fukushima region and beyond since that time. And there was nothing about the Nuclear Regulatory Commission’s Task Force report of July 2011, which found US regulations seriously lacking in a number of respects (http://pbadupws.nrc.gov/docs/ML1118/ML111861807.pdf). But there was some refreshing candor about Small Modular Reactors (SMRs) mixed in with this sales talk. His talk is archived at http://160.36.161.128/UTK/Viewer/?peid=fa73ded60b7b46698e9adc0732101a76

SMRs are supposed to overcome the loss of economies of scale by using assembly-line mass-manufacturing techniques such as are used for cars and passenger aircraft. The site setup would be standardized, and the setup and commissioning on site would be relatively quick (36 months). So “economies of replication” would replace the “economies of scale” (one of the principal reasons that reactors got larger as time went on).

But there is a chicken-and-egg problem here, to quote a cliché. You’ve got to have a lot of orders before you can set up your assembly line and produce cheap reactors, but you have to have demonstrated and certified your reactors before customers line up in large numbers to order them, given the risks involved. There are no such orders yet; no assembly line is in sight.

So for now, SMRs would be “cobbled together” in existing facilities. “Does the federal government want to help?” he asked rhetorically. “I don’t know,” he went on. “We’re going to find out. I am not going to make your electric bills all triple because of this project. That’s just …TVA won’t do that.” (Italics added.)

So for the foreseeable future, without massive subsidies, nuclear power using SMRs will be the same as nuclear power with the present Atomic Leviathans – expensive and in need of government subsidies. But you have to hand it to Mr. Stout for one thing. Unlike Georgia Power and South Carolina Electric and Gas, two utilities building the new large behemoth variety (the AP1000), he is not willing to saddle TVA ratepayers with the cost of yet another nuclear gamble. TVA has been there and done that, and is still doing it with large reactors. A large part of TVA’s indebtedness from the 1970s and early 1980s was due to the mid-stream cancellation of costly and unneeded reactors. No, prudently for TVA, Mr. Stout wants the taxpayers to take the risk.

And the federal government has obliged by committing up to half of the $452 million proposed for SMRs to Babcock & Wilcox, designer of the mPower, 180 megawatt reactor, and the TVA to advance design and certification. [1] B&W had spent over $200 million on the project as of the middle of last year. Where will it lead other than the “cobbled together” machine? Specifically, what will it take to get an assembly line? Here is how Mr. Stout explained it:

    So the concept is that you gotta have an assembly line cranking out repeatable parts, achieving a standardized vision of lots of mPower reactors. That creates the nth of a kind plant that has the efficiency in cost. I’m building Unit One. I don’t want to pay for B&W’s factory with automation to crank out repeatable parts. So that creates a contracting challenge… So as you scratch your head and puzzle how does this work, remember the math won’t work on one unit. In fact our unit is most likely going to be, I’m going to use the word “cobbled together”, it’s going to be manufactured within existing facilities. But if B&W can get an order backlog of a hundred SMRs and they are going to start delivering them in China and India etc. then they’ll be able to go get the financing and build the building and have new stuff put in place to crank out these parts in a more automated manner. So as long as the design is the same this should all work. The devil is going to be in the details and in the oversight and in the inspections. [italics added]

A hundred reactors, each costing almost $450 million, would amount to an order book of $45 billion. That does not include construction costs, which would double the figure to $90 billion, leaving aside for now the industry’s record of huge cost escalations (see the B&W 2012 presentation at http://www.efcog.org/library/council_meeting/12SAECMtg/presentations/GS_Meeting/Day-1/B&W%20mPower%20Overview%20-%20EFCOG%202012-Ferrara.pdf for a total estimated cost figure of $5000 per kW). This would make the SMR launch something like creating a new commercial airliner, say the Dreamliner or the Airbus A350. There were a total of 350 orders for the A350 in 2007, when it was seriously launched as a rival to Boeing’s Dreamliner. The list price of the A350 is between about $250 million and $400 million (rounded — http://www.airbus.com/presscentre/corporate-information/key-documents/?eID=dam_frontend_push&docID=14849), which would make the initial order total the same order of magnitude in cost as 100 completed mPower SMRs.
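The order-book arithmetic above can be checked directly, using the figures as cited in the text (the $5000/kW total estimated cost is from the 2012 B&W presentation):

```python
# Checking the SMR order-book arithmetic cited in the text.

units = 100                  # hypothetical order backlog, per Mr. Stout's scenario
order_cost = 450e6           # ~$450 million per mPower reactor order, USD
capacity_kw = 180_000        # 180 MW per unit
total_cost_per_kw = 5_000    # USD/kW, total estimated cost (B&W 2012)

order_book = units * order_cost                     # equipment orders alone
all_in = units * capacity_kw * total_cost_per_kw    # total at $5000/kW

print(f"Order book: ${order_book / 1e9:.0f} billion")
print(f"All-in at $5000/kW: ${all_in / 1e9:.0f} billion")
```

The all-in total at $5000/kW comes out to twice the bare order book, which is the doubling noted above.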

The end of this decade is the target for the commissioning of the first mPower reactor (the most advanced SMR in planning and design and the only one with substantial federal government support so far). It would take some years after that – well into the 2020s – to fully prove the design and, if needed, debug it. It stretches credulity that China and India, which, along with Russia, are the main centers of nuclear power construction today, would put in orders totaling a hundred reactors much before 2020. Indeed, if they were that attracted to SMRs, why would they not pay the license fees and set up the assembly line themselves? Most notably, China, where 28 reactors are under construction (http://www.world-nuclear.org/info/inf63.html), already has a much better supply chain than the United States. So the government subsidy to B&W and TVA would likely pave the way for an assembly line in China! Unless…

…the federal government orders a hundred reactors or entices utilities to do so with massive subsidies. We are talking about putting tens of billions of taxpayer dollars at risk – at a time when the air is full of talk of furloughs for teachers, air traffic controllers, and civilian Pentagon employees, among others, and of cutbacks in training of military personnel.

What happens if a common design or manufacturing problem is discovered, as it was with the Dreamliner batteries? How is a mass manufacturer of reactors, whose internals become radioactive after commissioning, going to recall them or their major components? No word from Mr. Stout or, as far as I am aware, from any of the SMR enthusiasts about this. For safety issues, see the testimony of Dr. Ed Lyman of the Union of Concerned Scientists, at http://www.ucsusa.org/assets/documents/nuclear_power/lyman-appropriations-subcom-7-14-11.pdf.

IEER and Physicians for Social Responsibility wrote an SMR fact sheet in 2010, outlining such concerns. (http://ieer.org/resource/factsheets/small-modular-reactors-solution/) They have only deepened with time. SMRs are highly unlikely to provide the industry the nuclear renaissance that has so far eluded it. Indeed, by the time the first ones are being commissioned, solar photovoltaics, smart grids, and a host of other distributed electricity technologies will be more economical. Wind-generated electricity already is. No one has a perfect crystal ball in the energy business, but mine tells me that SMRs will not solve our CO2 woes. They are likely to be economically obsolete before this decade is done – and the end of this decade is the target date for the first mPower reactor to come on line. So why go there, to a time when there are going to be only costs and little prospect of benefits?

Notes:

  1. This sentence was corrected on April 3, 2013. The original text “And the federal government has obliged by committing $452 million to Babcock & Wilcox, designer of the mPower, 180 megawatt reactor, and the TVA to advance design and certification.” is incorrect.

The German Energy Transition

The Heinrich Böll Foundation has created a new site to inform the public about Germany’s historic energy transition, or “Energiewende”: http://www.EnergyTransition.de The aim is to reduce greenhouse gas emissions by 80 percent by 2050. A renewable electricity sector is a principal part of this goal. Germany already produces 26% of its electricity from renewable sources and is set to exceed its target of 40% by 2020. As evidence for severe climate disruption mounts, the German Energiewende provides some hope. This kind of transformation was a gleam in my eye when I finished Carbon-Free and Nuclear-Free in 2007. It was my hope for the United States. I still hope that action by states, corporations, individuals, and even the federal government (through CO2 rules, efficiency standards, etc.) will get us to a fully renewable energy sector by 2050 in the United States and worldwide.

— Arjun

Carbon-Free and Nuclear-Free is getting an enthusiastic response, but several new nuclear facilities are planned


I have been going around the country speaking about my new book, Carbon-Free and Nuclear-Free: A Roadmap for U.S. Energy Policy. (Download it free)

Nothing I have done in 37 years of work on energy, environment, and nuclear weapons and power issues has caught on like this.

As evidence of serious and rapid climate change mounts and a price on carbon emissions looks more and more certain, coal-fired power plants are becoming hard to justify and harder to finance. So the nuclear industry wants to ride into town as the savior. Having failed to deliver electricity “too cheap to meter” (promised in the 1950s by the Chairman of the Atomic Energy Commission, Lewis Strauss), it now wants massive new government subsidies in the form of loan guarantees.

But it is a false choice. Those who oppose nuclear power as the “solution” to the global climate crisis are right: a combination of efficiency, renewable energy, combined heat and power, and emerging technologies such as plug-in hybrid cars can allow us to phase out all fossil fuels and nuclear power in 30 to 50 years.

Eight new nuclear reactors are being proposed in Texas alone. The two near Amarillo, in the panhandle, would consume 60 million gallons of water every day—more than the entire city uses. The company proposing the plants has said that a lake in an undisclosed location will supply the water. In Idaho, the CEO of Alternate Energy Holdings, which wants to build a power plant there, implies that nuclear power will cost only 1 to 2 cents per kilowatt-hour because the capital cost is borne by the investors, as if Wall Street were a kind of charity for electricity consumers. Far from it. Wall Street got burned by nuclear power in the 1980s; it is leery of financing new reactors. That’s why the nuclear industry has the largest hat in hand in Washington, D.C., asking for handouts such as license application subsidies and 100 percent loan guarantees.

But at least some investors are catching on. MidAmerican Energy, owned by Warren Buffett’s Berkshire Hathaway, announced last month (January 2008) that it was abandoning plans to build a nuclear power plant in Idaho because it could not provide economical power to its customers. Austin Energy, the city-owned utility in the capital of Texas, has recommended that the City vote not to buy a share of the two proposed reactors near Bay City, Texas. The investment would, at this time, be “unwise” and “imprudent,” said the utility, because of insufficient time to examine the paperwork and the risk of cost overruns and delays.

Here is a link to a summary of my book (Note: 2.5 MB pdf)

and to an op ed I recently wrote for the Deseret News (Salt Lake City)

I invite you to comment on the analysis in my book, on what you are doing in your neighborhood, city, county, or state regarding energy and climate and to link to my blog.

–Arjun
