The Clean Power Plan is a step in the right direction

With the publication of the final Clean Power Plan, the United States can finally claim some leadership in curbing CO2 emissions at the federal level. The final rule is, on balance, technically, economically, and environmentally coherent. The actual goal falls short of what it needs to be, but it is better than in the draft plan. And the direction is right, which is the most important thing. Thanks to all of you who worked with us and supported us in the process, especially Scott Denman, Diane Curran, Lisa Heinzerling, Elena Krieger, and my co-author, M.V. Ramana.

We asked for many things in our comments on the draft EPA Clean Power Plan. In the final Clean Power Plan (CPP) rule, the EPA agreed not only with the substance of our positions but, more important, with the reasoning underlying them.

Most of all we asked for a coherent, technology-neutral rule that would be more protective of climate. Here are some of the big picture items:

  1. Existing nuclear plants and license extensions will not be subsidized by the CPP: We asked that both existing nuclear power plants and existing renewable energy be removed from the calculation of emission targets because they do nothing to reduce CO2 emissions below current levels. We asked that they be treated consistently. (Neither has significant onsite emissions of CO2, and both have some offsite lifecycle emissions, which are much lower per unit of generation than those of natural gas.) Existing generation should not be part of the “Best System of Emission Reduction” (BSER) because we want to reduce CO2 emissions from where they are now (or in 2012, the baseline year). The EPA agreed. Both are gone from the final rule. Further, in its draft rule, the EPA implicitly assumed (in its modeling of the electricity sector) that licenses of existing plants would be extended. The relicensing issue has been removed from the CPP since existing generation is not in the calculation of emission reductions. It is simply the baseline generation, as is clear from page 345 of the final plan (italics added):

    …we believe it is inappropriate to base the BSER on elements that will not reduce CO2 emissions from affected EGUs below current levels. Existing nuclear generation helps make existing CO2 emissions lower than they would otherwise be, but will not further lower CO2 emissions below current levels. Accordingly,…the EPA is not finalizing preservation of generation from existing nuclear capacity as a component of the BSER.

     The same reasoning was applied to license extensions. Only uprates (increases in the licensed capacity of existing plants) can be counted. This is consistent and technology-neutral (in the same way that an increase in the capacity of a wind farm would be counted). The rule does not seek to “preserve” existing power plants, or to shut them down. That will happen on the merits, without an EPA hand on the scale in favor of nuclear.

  2. New and under-construction nuclear reactors are not part of the best system of emission reduction; renewable energy is: We pointed out that new nuclear plants are very expensive; even the State of Georgia, whose ratepayers are forced to subsidize two nuclear units through their electricity bills, noted that in its comments. Since the “Best System of Emission Reduction” (BSER) has a cost criterion, new nuclear should be excluded from the BSER. (We also cited other reasons for that.) The EPA excluded new nuclear from BSER but included new renewable energy (p. 345, italics added):

    Investments in new nuclear capacity are very large capital-intensive investments that require substantial lead times. By comparison, investments in new RE generating capacity are individually smaller and require shorter lead times. Also, important recent trends evidenced in RE development, such as rapidly growing investment and rapidly decreasing costs, are not as clearly evidenced in nuclear generation. We view these factors as distinguishing the under-construction nuclear units from RE generating capacity, indicating that the new nuclear capacity is likely of higher cost and therefore less appropriate for inclusion in the BSER.

    This is a critically important statement. We don’t have a shortage of low-CO2 sources; we have a shortage of time and money to reduce CO2 emissions. The EPA recognized (very delicately!) that renewable energy is better on both counts. As a result, one or more of the four new reactors under construction at Vogtle and Summer can proceed or stop on the financial merits, rather than being pushed into existence with the Clean Power Plan playing the role of midwife.

    The EPA also “seeks to drive the widespread development and deployment of wind and solar, as these broad categories of renewable technology are essential to longer term climate strategies” (p. 874). This is an excellent goal. The EPA recognized that costs of solar and wind are declining.

  3. New natural gas plants are not part of the best system of emission reduction: This is perhaps the best and most solid indication that the Obama administration takes long-term reductions seriously. New natural gas combined cycle plants will not be part of the BSER, even though they have lower CO2-equivalent emissions per megawatt-hour than coal (using EPA leak rates and global warming potential for methane) and meet both the cost test and the emission rate test. The reason: they would be emitting CO2 for decades (p. 346, italics added):

    However, our determination not to include new construction and operation of new NGCC capacity in the BSER in this final rule rests primarily on the achievable magnitude of emission reductions rather than costs. Unlike emission reductions achieved through the use of any of the building blocks, emission reductions achieved through the use of new NGCC capacity require the construction of additional CO2-emitting generating capacity, a consequence that is inconsistent with the long-term need to continue reducing CO2 emissions beyond the reductions that will be achieved through this rule. New generating assets are planned and built for long lifetimes – frequently 40 years or more – that are likely longer than the expected remaining lifetimes of the steam EGUs whose CO2 emissions would initially be displaced by the generation from the new NGCC units. The new capacity is likely to continue to emit CO2 throughout these longer lifetimes….

  4. Increased capacity factor of existing natural gas plants is BSER: The EPA is still allowing increased capacity factors at existing natural gas combined cycle power plants to displace coal. This is the result of its estimates of methane leak rates and of the global warming potential of methane. So long as new central-station natural gas plants are not encouraged, the rate of use of existing plants is a problem that can be sorted out in the coming years. It would have been very difficult to argue, on the grounds of the BSER rules and existing methane leak estimates alone, that increasing the capacity factor of existing natural gas combined cycle units to displace coal is not BSER. The job now is to get the EPA to recognize a wider array of methane leak rates (which have ample empirical support) and to use both a 20-year and a 100-year warming potential screen in the design of its CO2 reduction programs. The recent IPCC report uses a 100-year global warming potential of 28 to 34 for methane, the higher value including feedback effects; it would be entirely appropriate for the EPA to adopt a similar evaluation metric. The 20-year warming potential, which is about three times higher, would be even more appropriate given that the climate crisis is developing more rapidly than previously anticipated. (A rough numerical illustration of this point appears after this list.)
  5. The EPA has incentivized early investment in low-income efficiency programs (p. 864 onward): This is a very important feature of the CPP. States that want to make very sure that low-income households are not adversely impacted by the rule will take advantage of the additional emission reduction credits the EPA is offering for early action. This also promises other benefits, such as reducing the cost of energy assistance programs and the adverse health impacts that arise when households cannot pay for health care or medicines.
  6. The cap-and-trade provision is OK in the electricity context, though with reservations: Carbon permits from new generation can be traded; existing nuclear plants, for instance, cannot generate tradeable CO2 credits (unless they are from a licensed uprate). I am not a fan of expansive cap-and-trade, but the EPA formulation in the CPP makes sense to me. It is the same as if emission limits were set for a group of states or at the grid level, such as the PJM grid in the mid-Atlantic region (extending inland to Ohio and beyond) or the MISO grid in the upper Midwest. The EPA seeks not to impose a model of reductions, only to get to a certain level of reductions. In the cap-and-trade system permitted by the EPA, the CO2 reduction could happen in one state or in another, but it will have to happen. One of my reservations is that the EPA also allows the trading of energy efficiency credits across state lines. It is difficult enough to account for program-induced efficiency improvements within a state and to distinguish them from, say, the effects of federal appliance standards. Bundling these efficiency gains into tradeable credits is not a good idea. Another issue is that the method of calculating the reduction in emission rate is not the best as applied to efficiency. We had asked for a more global and comprehensive approach to CO2 accounting, but did not succeed on this point.
  7. Conclusion – The CPP is a real tour de force; it gives me hope. Of course, there is much work to do now that the final CPP has been published (besides making it stick). We need to advocate for states to mandate GHG reduction targets of 40 to 50 percent by 2030 from all sources; we need to accelerate electrification of transportation and restructuring of the grid….But the CPP is a great springboard from which to make these leaps.
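
To make the warming-potential point in item 4 concrete, here is a rough back-of-the-envelope comparison of coal and existing gas combined cycle generation on a CO2-equivalent basis. Every input below is an illustrative assumption of mine (typical emission rates, AR5-style warming potentials), not a figure from the rule:

    # Rough CO2-equivalent comparison of coal vs. existing natural gas
    # combined cycle (NGCC) generation. All inputs are illustrative
    # assumptions, not EPA rule values.

    COAL_CO2 = 1000.0  # kg CO2 per MWh, typical existing coal unit (assumed)
    NGCC_CO2 = 370.0   # kg CO2 per MWh, typical NGCC unit (assumed)
    GAS_USE = 135.0    # kg of gas burned per MWh of NGCC output (assumed)

    def ngcc_co2e(leak_rate, gwp):
        """CO2-equivalent (kg/MWh) for NGCC, adding upstream methane leaks.
        (Coal-mine methane is ignored here, which flatters coal slightly.)"""
        leaked_ch4 = GAS_USE * leak_rate / (1.0 - leak_rate)
        return NGCC_CO2 + leaked_ch4 * gwp

    for leak in (0.015, 0.03, 0.06):  # low, mid, and high leak estimates
        for horizon, gwp in (("100-yr", 34), ("20-yr", 86)):
            print(f"leak {leak:.1%}, {horizon} GWP {gwp}: "
                  f"NGCC ~{ngcc_co2e(leak, gwp):.0f} vs coal {COAL_CO2:.0f} kg CO2e/MWh")

On these assumptions, existing gas retains a clear advantage over coal at low leak rates under a 100-year warming potential, but that advantage largely disappears at the high end of the empirical leak estimates evaluated over 20 years – which is the crux of the argument for a dual-horizon screen.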

From Pearl Harbor to Hiroshima

Every anniversary of the atomic bombings of Hiroshima and Nagasaki, two schools of thought square off. One says the bombings were not necessary to end the war; the Japanese were close to surrender anyway. The other says: remember Pearl Harbor and the Japanese militarists’ determination to fight to the end. But many questions remain unasked in this framework. Why was the U.S. Pacific fleet moved to Pearl Harbor in 1940? Why did Japan bomb it? When was Japan, rather than Germany, first targeted for the atomic bomb? The answers may surprise you. They are in a talk I gave in Santa Fe in 2012: see the video of it below. It’s about an hour. Links to some historical documents and additional information are provided below.

  • Pearl Harbor was not a “sneak attack” in the sense that it was a total surprise to the United States. The Pacific fleet had been moved there in June 1940 to assert U.S. power in the Pacific. Admiral Kimmel, in charge of the fleet, noted in February 1941, nearly ten months before the attack, that he felt “a surprise attack (submarine, air, or combined) on Pearl Harbor is a possibility.” So the bombing of Pearl Harbor in December 1941 was not a surprise in any military sense; the fleet’s vulnerability had been anticipated.
  • Japan depended on the U.S. for 80 percent of its oil imports in 1940, making it very vulnerable to the oil embargo that began on August 1, 1941. It had to decide — pursue empire and get to Indonesian oil, or give up its attempt to conquer China and other areas of Asia.
  • In 1944, as a senator, Harry Truman had been frustrated and upset that he had not been allowed to send a personal military representative to the Hanford Site to determine whether the large expenditures there were wasteful or not. He gave in for the moment and agreed not to investigate, but warned Secretary Stimson in March 1944 that “[t]he responsibility … for any waste or improper action which might otherwise be avoided rests squarely upon the War Department.” Truman did not learn what the Manhattan Project was actually building until after he became President in April 1945, upon the death of President Roosevelt.
  • James Byrnes, as director of the Office of War Mobilization, advised President Roosevelt in a March 3, 1945 memorandum that there should be an independent scientific investigation “to justify continuance of the project.” He warned that “if the project proves a failure, it will then be subjected to relentless investigation and criticism.”
  • The concept that half a million lives might be saved by an early end to the war was mentioned in a note to Henry Stimson by an “economist” friend (my guess is that it was his cousin, Alfred Loomis, a wealthy Wall Street banker and amateur physicist) as an argument in favor of a conditional surrender policy. The main purpose of the suggestion of conditional surrender was to end the war before the Soviets entered, and thereby keep markets in Asia other than Formosa and Korea for the British and the Americans. Formosa and Korea were proposed to be ceded to Japan as part of the early end to the war.
  • Stimson forwarded the letter to General Marshall for evaluation by his staff, which the general sent him on June 7, 1945. The General Staff considered the proposal “acceptable from the military standpoint, but its implementation on the terms suggested is considered doubtful.” Overall, the staff analysis leaned in the direction of terms that add up to unconditional surrender. It rejected out of hand the suggestion that the invasion of Japan would cost half a million American lives, stating that the estimate “under our present plan of campaign, is considered entirely too high” (underlining in the original). General Marshall’s cover note to Stimson stated that he was in “general agreement” with the analysis.
  • The war plan itself had three scenarios. Estimated deaths were in the range of 25,000 to 46,000; injuries were in the range of 105,000 to 170,000, plus 2,500 to 4,000 missing in action. A few days later, on June 18, 1945, General MacArthur clarified that the “estimate was derived from the casualty rates in Normandy and Okinawa, the highest our forces have sustained….The estimate is purely academic and routine and was made for planning alone. I do not anticipate such a high rate of loss.”

Some reflections on nuclear costs

Thoughts inspired by the news near the anniversaries of the atomic bombings of Hiroshima and Nagasaki

Britain’s nuclear stockpile is estimated at 225 warheads, of which no more than 160 are available to be operational at any time, according to the Bulletin of the Atomic Scientists (http://bos.sagepub.com/content/67/5/89.full.pdf+html). The cost of “cleanup” of Sellafield, the site that produced the plutonium for these bombs, is now estimated at 67.5 billion pounds, or about 100 billion dollars (http://www.independent.co.uk/news/uk/politics/sellafield-failed-by-private-cleanup-firms-series-of-expensive-mistakes-has-led-to-review-at-nuclear-plant-8735040.html). A modest amount of electricity was produced there, but its value falls far short of the cleanup cost, which may yet rise. More than 100 metric tons of plutonium for civilian purposes were produced but have less than zero value. In simple economic terms, the civilian plutonium is a liability, not an asset.

(Aerial view of Sellafield, Cumbria. Photo credit: Simon Ledingham)

So, as a practical matter, most of the $100 billion must be chalked up as the cost of Britain’s nuclear bombs, since those were the only (arguably) “useful” output. That is roughly $600 million per operational bomb in Sellafield cleanup costs alone. Then add the remediation of all the other bomb-related sites and the costs of setting up and running the nuclear bomb complex.
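
The per-bomb arithmetic is simple enough to check. A minimal sketch, using the figures above and an assumed exchange rate of about $1.50 per pound:

    # Sellafield cleanup cost per British warhead, using the figures in the text.
    cleanup_gbp = 67.5e9  # cleanup estimate, pounds
    gbp_to_usd = 1.5      # assumed approximate exchange rate
    operational = 160     # warheads available to be operational at any time
    stockpile = 225       # total estimated stockpile

    cleanup_usd = cleanup_gbp * gbp_to_usd
    print(f"cleanup: ~${cleanup_usd / 1e9:.0f} billion")
    print(f"per operational warhead: ~${cleanup_usd / operational / 1e6:.0f} million")
    print(f"per stockpiled warhead: ~${cleanup_usd / stockpile / 1e6:.0f} million")

Even spread over the full 225-warhead stockpile, the Sellafield bill alone comes to roughly $450 million per warhead.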

In the United States, cumulative costs to the year 1996 were about $5.5 trillion (1995 dollars), including not only the bombs but also the delivery systems, personnel, and other expenses incurred until then. The total has been increasing by tens of billions each year since. Cleanup will total hundreds of billions of dollars. And, according to current plans, many sites will be left with significant amounts of contamination. (For an accounting of the U.S. program, see Atomic Audit, ed. Stephen Schwartz, Brookings Institution, 1998. I was one of the authors. It’s available from IEER’s website: http://ieer.org/resource/books/atomic-audit/.)

In the meantime, Fukushima continues to be an emergency without end – vast amounts of radioactivity, including strontium-90 in the groundwater, evidence of leaks into the sea, the prospect of contaminated seafood. Strontium-90, being a calcium analog, bioaccumulates in the food chain. It is likely to be a seaside nightmare for decades. (Listen to Arjun discuss the ongoing radioactivity leaks in an interview with Living on Earth radio: http://www.loe.org/shows/segments.html?programID=13-P13-00030&segmentID=4)

According to the New York Times (http://www.nytimes.com/2013/07/27/world/asia/operator-of-fukushima-plant-criticized-for-delaying-disclosures-on-leaks.html), Shunichi Tanaka, chairman of Japan’s new Nuclear Regulation Authority, recently said:

“The difficulties we face at Fukushima Daiichi are on par with the difficulties we faced in the wake of World War II. Tepco [the Tokyo Electric Power Company] needs more assistance from others in Japan, me included. We cannot force everything on Tepco; that’s probably not going to solve the problem.”

So nuclear power has gone from the promise of “too cheap to meter” to a disaster that Japan’s own chief regulator compares to the horrific post-war rubble of World War II. And the biggest bang for the buck that was supposed to be the bomb has become endless bucks for the bang. A sad reminder as we approach the anniversaries of the bombings of Hiroshima and Nagasaki.

All or nothing nuclear power – from 24/7 to 0/365

The last few months have seen some definite signs that commercial nuclear power is not the wave of the future but a way of boiling water that might be seen as a twentieth-century folly. Four commercial nuclear reactors have been shut permanently, ostensibly for different reasons, but economics underlies them all.

Crystal River in Florida came first, in early February 2013. It had been shut since 2009. Like many other pressurized water reactors, it had to have a premature replacement of its steam generators, the huge heat exchangers where the hot reactor water (“primary water”) heats up water in the secondary circuit to make the steam that drives the turbine-generator set. The outer layer of the containment structure cracked during the replacement. Duke Energy, the owner, determined it was too costly to fix the problem. See Duke’s press release at http://www.duke-energy.com/news/releases/2013020501.asp

The 556-megawatt Kewaunee reactor in Wisconsin came next, in early May, unable to compete with cheap natural gas and falling electricity prices. Indeed, electricity consumption in the United States is declining, due in part to the increasing efficiency of electricity use, even as the economy recovers from the Great Recession. There doesn’t appear to be enough money in the reserve fund for decommissioning at present – see the New York Times article at http://www.nytimes.com/2013/05/08/business/energy-environment/kewaunee-nuclear-power-plant-shuts-down.html.

San Onofre, with two reactors, came next. Both had been down since early 2012, when excessive wear of steam generator tubes and leaks of primary water were discovered. The steam generators were new, but contrary to the company’s claims, it turned out that the new ones were not copies of the original licensed design. A long, contentious process followed; prospects for a green light to restart faded. The blame game between Mitsubishi, the supplier of the steam generators, and Southern California Edison, the majority owner, grew intense (and it continues). Announcing the decision to close the plant, SCE President Ron Litzinger said: “Looking ahead, we think that our decision to retire the units will eliminate uncertainty and facilitate orderly planning for California’s energy future.” (See the Los Angeles Times article at http://www.latimes.com/local/lanow/la-me-ln-edison-closing-san-onofre-nuclear-plant-20130607,0,7920425.story)

Nuclear plants were supposed to create certainty, reliability, predictability, 24/7 operation. But in the last few years, this has given way to a new reality: nuclear reactors are 24/7 until they become 0/365, with little or no notice. The above are just four examples. Before the Fukushima disaster, Japan had 54 reactors. Four were irretrievably damaged by the accident. In the 15 months that followed, the other 50 were progressively shut or remained in shutdown mode. In the last year, only two have been restarted. It will be a contentious process before any more of them can be restarted; it is possible none will be. Many in Japan assume they won’t be, for they are installing solar power at rapid rates – 1.5 gigawatts in the first quarter of 2013 alone, equal to about one-and-a-half reactors in peak power output. About 6 gigawatts of solar would be required to generate as much electricity as one typical power reactor. Capacity comparable to that will likely be installed in Japan this year.
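
The solar-to-reactor conversion above is a capacity-factor calculation. Here is a minimal sketch; the capacity factors are round numbers I have assumed for illustration, not measured Japanese data:

    # How much solar capacity matches one reactor's annual generation?
    reactor_gw = 1.0   # typical large power reactor, GW (assumed)
    reactor_cf = 0.90  # nuclear capacity factor (assumed)
    solar_cf = 0.14    # solar capacity factor in Japan (assumed)

    reactor_twh = reactor_gw * reactor_cf * 8760 / 1000  # annual output, TWh
    solar_gw = reactor_gw * reactor_cf / solar_cf        # solar GW for same output

    print(f"one reactor: ~{reactor_twh:.1f} TWh per year")
    print(f"equivalent solar capacity: ~{solar_gw:.1f} GW")

With these numbers, roughly six and a half gigawatts of solar generate as much electricity in a year as one large reactor, consistent with the rough figure of 6 gigawatts above.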

Finally, Germany prematurely shut eight reactors following Fukushima, consolidating and accelerating the post-Chernobyl process of phasing out nuclear power altogether (the end date is now set for 2022).

But officialdom in the United States still clings to the idea that we need nuclear power. So reliable, so baseload, so twentieth century (oops, wrong century).

Fukushima reflections on the second anniversary of the accident

Statement of Arjun Makhijani for the March 2013 conference commemorating the Fukushima accident
To be read by Helen Caldicott

I appreciate that my friend Helen Caldicott, one of the two people who inspired my book Carbon-Free and Nuclear-Free (the other was S. David Freeman), has agreed to read a brief statement from me on this second anniversary of the Fukushima disaster. I want to share two of the new things I have learned as I have watched the consequences of Fukushima unfold.

First, the Japanese government proposed to allow doses as high as 2 rem (20 millisieverts) per year to school children, claiming that the risk was low or at least tolerable. An exposure at this level over five years – 10 rem in all – to a girl, starting at age five, would create a cancer incidence risk of about 3 percent, using the [age- and gender-specific] risk estimates in the National Academies BEIR VII report.

Now imagine that you are a parent in Japan trying to decide whether to send your daughter to such a school. Roughly thirty of every hundred girls would eventually develop cancer at some point in their lives in any case; the school exposure would add about three more, according to the risk numbers. But no one would know whether a particular daughter’s cancer was attributable to the exposure at school, and neither would the Japanese government’s radiation bureaucrats. Why is it difficult to understand that while the risk attributable to school contamination would be one in thirty, the proportion of parents stricken with guilt and doubt would be closer to one in three? Would you ever forgive yourself if you made the decision to send your daughter to that school? Or your son, though the risk attributable to Fukushima exposure would be less than that experienced by girls?
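
To make the arithmetic behind “one in thirty versus one in three” explicit, here is a small sketch. The 3 percent attributable risk follows the BEIR VII-based estimate above; the roughly 30 percent baseline lifetime cancer incidence is a round number I have assumed for illustration:

    # The "one in thirty vs. one in three" arithmetic.
    girls = 100
    baseline_incidence = 0.30  # lifetime cancer incidence anyway (assumed)
    attributable_risk = 0.03   # ~3% for 10 rem starting at age 5 (BEIR VII, per text)

    baseline_cases = girls * baseline_incidence  # ~30 cancers regardless
    added_cases = girls * attributable_risk      # ~3 more from school exposure
    total_cases = baseline_cases + added_cases

    print(f"cancers among {girls} girls: ~{total_cases:.0f}")
    print(f"attributable to school exposure: ~{added_cases:.0f}")
    print(f"parents left in doubt: all ~{total_cases:.0f} (about one in three)")
    print(f"chance a given cancer came from the exposure: {added_cases / total_cases:.0%}")

Since no individual cancer can be traced to the exposure, every one of the roughly thirty-three affected families faces the doubt, even though only about three of the cancers are attributable.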

Indeed, due to the long latency period of most cancers, you would be fearful even if no cancer had as yet appeared. The Pentagon understood this when a Joint Chiefs of Staff Task Force evaluated the extensive contamination produced by the July 1946 underwater nuclear bomb test (Test Baker) at Bikini for its usefulness in war. Here is a quote from their 1947 report:

“Of the survivors in the contaminated areas, some would be doomed by radiation sickness in hours, some in days, some in years. But, these areas, irregular in size and shape, as wind and topography might form them, would have no visible boundaries. No survivor could be certain he was not among the doomed, and so added to every terror of the moment, thousands would be stricken with a fear of death and the uncertainty of the time of its arrival.”

Compare this for yourself with the aftermath of Fukushima and the plight of the parents.

Second, nuclear power’s conceit was that it is a 24/7 electricity supply. Since Fukushima, over sixty of the world’s light water power reactors have been prematurely shut for a variety of reasons, though just four reactors were stricken by the accident: 52 in Japan, eight in Germany, several in the U.S. Even if some are eventually restarted, nuclear power has shown a unique ability to go from 24/7 power supply to 0/365 essentially overnight, for long periods – hardly a convincing claim of reliability.

We can do better than making plutonium just to boil water or polluting the Earth with fossil fuel use. When I finished Carbon-Free and Nuclear-Free in 2007, I estimated it would take about forty years to get to an affordable, fully renewable energy system in the United States. Today, I think it can be done in twenty-five to thirty years. Are we up to the challenge? Finally, I truly regret that I cannot be there to publicly thank and honor my friend Helen for inspiring Carbon-Free and Nuclear-Free, which you can download free from ieer.org, also thanks to her. I wish you a very productive conference.

(Also see IEER’s publication Plutonium: Deadly Gold of the Nuclear Age, June 1992.)

When small is not beautiful, is it at least cheap?

On February 6, 2013, Dan Stout, the Tennessee Valley Authority’s senior manager for its Small Modular Reactor project, gave a colloquium at the University of Tennessee in Knoxville. Much of the talk was just nuclear boosterism. For instance, he claimed that “nuclear power was tested hard in 2011. It remains safe, reliable and affordable.”

There was no mention of the fact that post-Fukushima, about 60 of the world’s light water reactors were closed for one reason or another, mainly in Japan and Germany, taking them from 24/7 power generation to 0/365. He said nothing of the enormous social, economic, and environmental dislocation and loss that has afflicted the Fukushima region and beyond since that time. And there was nothing about the Nuclear Regulatory Commission’s Task Force report of July 2011, which found US regulations seriously lacking in a number of respects (http://pbadupws.nrc.gov/docs/ML1118/ML111861807.pdf). But there was some refreshing candor about Small Modular Reactors (SMRs) mixed in with the sales talk. The talk is archived at http://160.36.161.128/UTK/Viewer/?peid=fa73ded60b7b46698e9adc0732101a76

SMRs are supposed to overcome the loss of economies of scale by using mass-manufacturing techniques such as those used for cars and passenger aircraft. The site setup would be standardized, and setup and commissioning on site would be relatively quick (36 months). “Economies of replication” would thus replace the “economies of scale” that were one of the principal reasons reactors got larger as time went on. (A stylized sketch of these two cost logics follows the next paragraph.)

But there is a chicken-and-egg problem here, to use a cliché. You’ve got to have a lot of orders before you can set up your assembly line and produce cheap reactors, but you have to have demonstrated and certified your reactors before the nuclear masses line up to order them, given the risks involved. There are no such orders yet; no assembly line is in sight.
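
As promised above, here is a stylized contrast of the two cost logics. The scaling exponent and learning rate are textbook-style assumptions chosen for illustration, not B&W or TVA figures:

    import math

    # Economies of scale: total plant cost ~ capacity**0.6 (assumed exponent),
    # so a small one-off reactor costs more per kilowatt than a large one.
    def cost_per_kw(size_mw, ref_mw=1000.0, ref_usd_per_kw=5000.0, exponent=0.6):
        total = ref_usd_per_kw * ref_mw * 1000 * (size_mw / ref_mw) ** exponent
        return total / (size_mw * 1000)

    # Economies of replication: each doubling of cumulative units built cuts
    # unit cost by an assumed learning rate (here 10% per doubling).
    def replication_factor(n_units, learning=0.10):
        return n_units ** math.log2(1 - learning)

    print(f"one-off 180 MW unit: ~${cost_per_kw(180):,.0f}/kW vs $5,000/kW at 1,000 MW")
    print(f"100th identical unit costs ~{replication_factor(100):.0%} of the first")

On these assumptions, a one-off 180 MW unit costs nearly twice as much per kilowatt as a large plant, and it takes a run of around a hundred identical units for learning to claw that penalty back – which is exactly the order-book problem Mr. Stout describes.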

So for now, SMRs would be “cobbled together” in existing facilities. “Does the federal government want to help?” he asked rhetorically. “I don’t know,” he went on. “We’re going to find out. I am not going to make your electric bills all triple because of this project. That’s just …TVA won’t do that.” (Italics added.)

So for the foreseeable future, without massive subsidies, nuclear power using SMRs will be the same as nuclear power with the present Atomic Leviathans – expensive and in need of government subsidies. But you have to hand it to Mr. Stout for one thing. Unlike Georgia Power and South Carolina Electric and Gas, two utilities building the new large behemoth variety (the AP1000), he is not willing to saddle TVA ratepayers with the cost of yet another nuclear gamble. TVA has been there and done that, and is still doing it with large reactors. A large part of TVA’s indebtedness, dating from the 1970s and early 1980s, was due to the mid-stream cancellation of costly and unneeded reactors. No, prudently for TVA, Mr. Stout wants the taxpayers to take on the risk.

And the federal government has obliged by committing up to half of the $452 million proposed for SMRs to Babcock & Wilcox, designer of the 180-megawatt mPower reactor, and to the TVA, to advance design and certification. [1] B&W had spent over $200 million on the project as of the middle of last year. Where will it lead, other than to the “cobbled together” machine? Specifically, what will it take to get an assembly line? Here is how Mr. Stout explained it:

So the concept is that you gotta have an assembly line cranking out repeatable parts, achieving a standardized vision of lots of mPower reactors. That creates the nth of a kind plant that has the efficiency in cost. I’m building Unit One. I don’t want to pay for B&W’s factory with automation to crank out repeatable parts. So that creates a contracting challenge… So as you scratch your head and puzzle how this works, remember the math won’t work on one unit. In fact our unit is most likely going to be, I’m going to use the word “cobbled together”, it’s going to be manufactured within existing facilities. But if B&W can get an order backlog of a hundred SMRs and they are going to start delivering them in China and India etc. then they’ll be able to go get the financing and build the building and have new stuff put in place to crank out these parts in a more automated manner. So as long as the design is the same this should all work. The devil is going to be in the details and in the oversight and in the inspections. [italics added]

A hundred reactors, each costing almost $450 million, would amount to an order book of $45 billion. That does not include construction costs, which would roughly double the figure to $90 billion, leaving aside for now the industry’s record of huge cost escalations (see the B&W 2012 presentation at http://www.efcog.org/library/council_meeting/12SAECMtg/presentations/GS_Meeting/Day-1/B&W%20mPower%20Overview%20-%20EFCOG%202012-Ferrara.pdf for the total estimated cost figure of $5,000 per kW; the arithmetic is sketched below). This would make the SMR launch something like creating a new commercial airliner, say the Boeing Dreamliner or the Airbus A350. There were a total of 350 orders for the A350 in 2007, when it was seriously launched as a rival to the Dreamliner. The list price of the A350 is between about $250 million and $400 million (rounded — http://www.airbus.com/presscentre/corporate-information/key-documents/?eID=dam_frontend_push&docID=14849), which would make that initial order total the same order of magnitude in cost as 100 completed mPower SMRs.
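
A minimal check of this figuring against B&W’s roughly $5,000 per kilowatt estimate (the equipment/construction split is the rough halving used in the text, not a published breakdown):

    # Order-book arithmetic for 100 mPower units.
    units = 100
    unit_kw = 180 * 1000  # 180 MW mPower reactor
    usd_per_kw = 5000.0   # B&W 2012 total estimated cost

    per_unit = unit_kw * usd_per_kw  # ~$900 million per completed unit
    print(f"completed unit: ~${per_unit / 1e6:.0f} million")
    print(f"equipment order book (~half): ~${units * per_unit / 2 / 1e9:.0f} billion")
    print(f"100 completed units: ~${units * per_unit / 1e9:.0f} billion")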

The end of this decade is the target for the commissioning of the first mPower reactor (the most advanced SMR in planning and design and the only one with substantial federal government support so far). It would take some years after that – well into the 2020s – to fully prove the design and, if needed, debug it. It stretches credulity that China and India, which, along with Russia, are the main centers of nuclear power construction today, would put in orders totaling a hundred reactors much before 2020. Indeed, if they were that attracted to SMRs, why would they not pay the license fees and set up the assembly lines themselves? Most notably, China, where 28 reactors are under construction (http://www.world-nuclear.org/info/inf63.html), already has a much better supply chain than the United States. So the government subsidy to B&W and TVA would likely pave the way for an assembly line in China! Unless…

…the federal government orders a hundred reactors or entices utilities to do so with massive subsidies. We are talking about putting tens of billions of taxpayer dollars at risk – at a time when the air is full of talk of furloughs for teachers, air traffic controllers, and civilian Pentagon employees, among others, and of cutbacks in the training of military personnel.

What happens if a common design or manufacturing problem is discovered, as it was with the Dreamliner batteries? How is a mass manufacturer of reactors, whose internals become radioactive after commissioning, going to recall them or their major components? No word from Mr. Stout or, as far as I am aware, from any of the SMR enthusiasts about this. For safety issues, see the testimony of Dr. Ed Lyman of the Union of Concerned Scientists at http://www.ucsusa.org/assets/documents/nuclear_power/lyman-appropriations-subcom-7-14-11.pdf.

IEER and Physicians for Social Responsibility wrote an SMR fact sheet in 2010 outlining such concerns (http://ieer.org/resource/factsheets/small-modular-reactors-solution/). The concerns have only deepened with time. SMRs are highly unlikely to provide the industry the nuclear renaissance that has so far eluded it. Indeed, by the time the first ones are being commissioned, solar photovoltaics, smart grids, and a host of other distributed electricity technologies will be more economical; wind-generated electricity already is. No one has a perfect crystal ball in the energy business, but mine tells me that SMRs will not solve our CO2 woes. They are likely to be economically obsolete before this decade is done – and the end of this decade is the target date for the first mPower reactor to come on line. So why go there, to a time when there will be only costs and little prospect of benefits?

Notes:

  1. This sentence was corrected on April 3, 2013. The original text “And the federal government has obliged by committing $452 million to Babcock & Wilcox, designer of the mPower, 180 megawatt reactor, and the TVA to advance design and certification.” is incorrect.
