The Clean Power Plan is a step in the right direction

With the publication of the final Clean Power Plan, the United States can finally claim some leadership in curbing CO2 emissions at the federal level. The final rule is, on balance, technically, economically, and environmentally coherent. The actual goal falls short of what it needs to be, but it is better than in the draft plan. And the direction is right, which is the most important thing. Thanks to all of you who worked with us and supported us in the process, especially Scott Denman, Diane Curran, Lisa Heinzerling, Elena Krieger, and my co-author, M.V. Ramana.

We asked for many things in our comments on the draft EPA Clean Power Plan. In the final Clean Power Plan (CPP) rule, the EPA agreed not only with the substance of our policy positions but, more important, with the reasoning underlying them.

Most of all we asked for a coherent, technology-neutral rule that would be more protective of climate. Here are some of the big picture items:

  1. Existing nuclear plants and license extensions will not be subsidized by the CPP: We asked that both existing nuclear power plants and existing renewable energy be removed from the calculation of emission targets because they do nothing to reduce CO2 emissions, and that they be treated consistently. (Neither has significant onsite emissions of CO2, and both have some offsite lifecycle emissions that are much lower than those of natural gas per unit of generation.) Existing generation should not be part of the “Best System of Emission Reduction” (BSER) because we want to reduce CO2 emissions from where they are now (or in 2012, the baseline year). The EPA agreed. Both are gone from the final rule. Further, in its draft rule, the EPA implicitly assumed (in its modeling of the electricity sector) that licenses of existing plants would be extended. The relicensing issue has been removed from the CPP since existing generation is not in the calculation of emission reductions. It is simply the baseline generation, as is clear from page 345 of the final plan (italics added):

    …we believe it is inappropriate to base the BSER on elements that will not reduce CO2 emissions from affected EGUs below current levels. Existing nuclear generation helps make existing CO2 emissions lower than they would otherwise be, but will not further lower CO2 emissions below current levels. Accordingly,…the EPA is not finalizing preservation of generation from existing nuclear capacity as a component of the BSER.

    The same reasoning was applied to license extensions. Only uprates (increases in the licensed capacity of existing plants) may be counted. This is consistent and technology-neutral (in the same way that increasing the capacity of a wind farm would be counted). The rule does not seek to “preserve” existing power plants. Or to shut them down. That will happen on the merits, without an EPA hand on the scale in favor of nuclear.

  2. New and under-construction nuclear reactors are not part of the best system of emission reduction; renewable energy is: We pointed out that new nuclear plants are very expensive; even the State of Georgia, whose ratepayers are forced to subsidize two nuclear units through their electricity bills, noted that in its comments. Since the BSER has a cost criterion, new nuclear should be excluded from it. (We also cited other reasons for that.) The EPA excluded new nuclear from the BSER but included new renewable energy (p. 345, italics added):

    Investments in new nuclear capacity are very large capital-intensive investments that require substantial lead times. By comparison, investments in new RE generating capacity are individually smaller and require shorter lead times. Also, important recent trends evidenced in RE development, such as rapidly growing investment and rapidly decreasing costs, are not as clearly evidenced in nuclear generation. We view these factors as distinguishing the under-construction nuclear units from RE generating capacity, indicating that the new nuclear capacity is likely of higher cost and therefore less appropriate for inclusion in the BSER.

    This is a critically important statement. We don’t have a shortage of low-CO2 sources. We have a shortage of time and money to reduce CO2 emissions. The EPA recognized (very delicately!) that renewable energy is better on both counts. As a result, one or more of the four new reactors under construction at Vogtle and Summer can proceed or stop on the financial merits, rather than being pushed into existence with the Clean Power Plan playing the role of midwife.

    The EPA also “seeks to drive the widespread development and deployment of wind and solar, as these broad categories of renewable technology are essential to longer term climate strategies” (p. 874). This is an excellent goal. The EPA recognized that costs of solar and wind are declining.

  3. New natural gas plants are not part of the best system of emission reduction: This is perhaps the best and most solid indication that the Obama administration takes long-term reductions seriously. New natural gas combined cycle plants will not be part of the BSER even though they have lower CO2 emissions per megawatt-hour than the coal plants they would displace (using EPA leak rates and global warming potential for methane) and even though they meet the cost test and the emission rate test. The reason: they will be emitting CO2 for decades (p. 346, italics added):

    However, our determination not to include new construction and operation of new NGCC capacity in the BSER in this final rule rests primarily on the achievable magnitude of emission reductions rather than costs. Unlike emission reductions achieved through the use of any of the building blocks, emission reductions achieved through the use of new NGCC capacity require the construction of additional CO2-emitting generating capacity, a consequence that is inconsistent with the long-term need to continue reducing CO2 emissions beyond the reductions that will be achieved through this rule. New generating assets are planned and built for long lifetimes – frequently 40 years or more – that are likely longer than the expected remaining lifetimes of the steam EGUs whose CO2 emissions would initially be displaced by the generation from the new NGCC units. The new capacity is likely to continue to emit CO2 throughout these longer lifetimes….

  4. Increased capacity factor of existing natural gas plants is BSER: The EPA is still allowing increased capacity factors at existing natural gas combined cycle power plants to displace coal. This is the result of its estimates of methane leak rates and of the global warming potential of methane. So long as new central station natural gas plants are not encouraged, the rate of use of existing plants is a problem that can be sorted out in the coming years. It would have been very difficult to argue, only on the grounds of the BSER rules and existing methane leak estimates, that increasing the capacity factor of existing natural gas combined cycle units to displace coal is not BSER. The job now is to get the EPA to recognize a wider array of methane leak rates (which have ample empirical support) and to use both a 20-year and a 100-year warming potential screen in the design of its CO2 reduction programs (see the back-of-the-envelope sketch after this list). The recent report from the IPCC uses a global warming potential of 28-34, including feedback effects. It would be entirely appropriate for the EPA to adopt a similar evaluation metric. The 20-year warming potential, which is about three times higher, would be even more appropriate given that the climate crisis is developing more rapidly than previously anticipated.
  5. The EPA has incentivized early investment in low-income efficiency programs (p. 864 onward): This is a very important feature of the CPP. States that want to make very sure that low-income households are not adversely impacted by the rule will take advantage of the additional emission reduction credits the EPA is offering for early action. This also promises other benefits, such as lower costs for energy assistance programs and fewer of the adverse health impacts that follow from an inability to pay for health care or medicines.
  6. The cap-and-trade provision is OK in the electricity context, though with reservations: Carbon permits from new generation can be traded. Existing nuclear plants, however, cannot generate tradeable CO2 credits (unless they are from a licensed uprate). I am not a fan of expansive cap-and-trade, but the EPA formulation in the CPP makes sense to me. It is the same as if emission limits were set for a group of states or at the grid level, such as the PJM grid in the mid-Atlantic region but extending inland to Ohio and beyond, or the MISO grid in the upper Midwest. The EPA seeks not to impose a model of reductions, only to get to a certain level of reductions. In the cap-and-trade system permitted by the EPA, the CO2 reduction could happen in one state or in another, but it will have to happen. One of my reservations is that the EPA also allows the trading of energy efficiency credits across state lines. It is difficult enough to account for program-induced efficiency improvements within a state and distinguish them from, say, the effects of federal appliance standards. Bundling these efficiency gains into tradeable credits is not a good idea. Another issue is that the method of calculating the reduction in emission rate is not the best as applied to efficiency. We had asked for a more global and comprehensive approach to CO2 accounting, but did not succeed on this point.
  7. Conclusion – The CPP is a real tour de force; it gives me hope. Of course, there is much work to do now that the final CPP has been published (besides making it stick). We need to advocate for states to mandate GHG reduction targets of 40 to 50 percent by 2030 from all sources; we need to accelerate electrification of transportation and restructuring of the grid….But the CPP is a great springboard from which to make these leaps.
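A back-of-the-envelope sketch of the screening arithmetic raised in item 4 above (my illustration, not the EPA’s: the stack CO2, fuel use, and coal figures below are rough assumed values, and the leak rates are example inputs spanning EPA-like and higher empirical estimates):

```python
# Rough comparison of coal vs. natural gas combined cycle (NGCC) on a
# CO2-equivalent basis, under different methane leak rates and global
# warming potentials (GWPs). All figures are assumed round numbers:
# a ~50%-efficient NGCC unit burning ~150 kg CH4/MWh and emitting
# ~400 kg CO2/MWh at the stack; a coal steam unit at ~900 kg CO2/MWh.

def ngcc_co2e_per_mwh(leak_rate, gwp):
    """Effective kg CO2e/MWh for NGCC: direct stack CO2 plus upstream
    methane leaks (fraction of production) weighted by the chosen GWP."""
    stack_co2 = 400.0    # kg CO2/MWh at the stack (assumed)
    gas_burned = 150.0   # kg CH4 consumed per MWh (assumed)
    leaked_ch4 = gas_burned * leak_rate / (1.0 - leak_rate)
    return stack_co2 + leaked_ch4 * gwp

coal_co2e = 900.0  # kg CO2/MWh for a typical coal steam unit (assumed)

for leak in (0.015, 0.03, 0.06):  # EPA-like vs. higher empirical estimates
    for label, gwp in (("100-yr GWP of 30", 30), ("20-yr GWP of 85", 85)):
        co2e = ngcc_co2e_per_mwh(leak, gwp)
        print(f"leak {leak:.1%}, {label}: NGCC ~{co2e:4.0f} kg CO2e/MWh "
              f"({co2e / coal_co2e:.0%} of coal)")
```

On these assumptions, with a 20-year GWP and leak rates at the high end of field measurements, the gas advantage over coal shrinks sharply and can vanish altogether, which is why the choice of metric matters so much.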

Some reflections on nuclear costs

Thoughts inspired by the news near the anniversaries of the atomic bombings of Hiroshima and Nagasaki

Britain’s nuclear stockpile is estimated at 225 warheads, of which no more than 160 are available to be operational at any time, according to the Bulletin of the Atomic Scientists (http://bos.sagepub.com/content/67/5/89.full.pdf+html). The costs of “cleanup” of Sellafield, the site that produced the plutonium for these bombs, are now estimated at 67.5 billion pounds, or about 100 billion dollars (http://www.independent.co.uk/news/uk/politics/sellafield-failed-by-private-cleanup-firms-series-of-expensive-mistakes-has-led-to-review-at-nuclear-plant-8735040.html). A modest amount of electricity was produced there, but its value falls far short of the cleanup cost, which may yet rise. More than 100 metric tons of plutonium for civilian purposes were produced but have less than zero value: in simple economic terms, the civilian plutonium is a liability, not an asset.

(Aerial view of Sellafield, Cumbria. Photo credit: Simon Ledingham)

So, as a practical matter, most of the $100 billion must be chalked up as the cost of Britain’s nuclear bombs, since that was the only (arguably) “useful” output. That is roughly $600 million per operational bomb in Sellafield cleanup costs alone. Then add remediation of all the other bomb-related sites and the costs of setting up and running the nuclear bomb complex.
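For the record, the arithmetic behind that figure:

$$\frac{\$100\ \text{billion (Sellafield cleanup)}}{160\ \text{operational warheads}} \approx \$625\ \text{million per warhead}$$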

In the United States, cumulative costs to the year 1996 were about $5.5 trillion (1995 dollars), including not only the bombs but also the delivery systems, personnel, and other expenses incurred until then. The total has been increasing by tens of billions of dollars each year since. Cleanup will total hundreds of billions of dollars. And, according to current plans, many sites will be left with significant amounts of contamination. (For an accounting of the U.S. program, see Atomic Audit, Brookings Institution, 1998, ed. Stephen Schwartz. I was one of the authors. It’s available from IEER’s website: http://ieer.org/resource/books/atomic-audit/)

In the meantime, Fukushima continues to be an emergency without end – vast amounts of radioactivity, including strontium-90 in the groundwater, evidence of leaks into the sea, the prospect of contaminated seafood. Strontium-90, being a calcium analog, bioaccumulates in the food chain. It is likely to be a seaside nightmare for decades. (Listen to Arjun discuss the ongoing radioactivity leaks in an interview with Living on Earth radio: http://www.loe.org/shows/segments.html?programID=13-P13-00030&segmentID=4)

According to the New York Times (http://www.nytimes.com/2013/07/27/world/asia/operator-of-fukushima-plant-criticized-for-delaying-disclosures-on-leaks.html), Shunichi Tanaka, chairman of Japan’s new Nuclear Regulation Authority, recently said:

“The difficulties we face at Fukushima Daiichi are on par with the difficulties we faced in the wake of World War II. Tepco [the Tokyo Electric Power Company] needs more assistance from others in Japan, me included. We cannot force everything on Tepco; that’s probably not going to solve the problem.”

So nuclear power has gone from a promise of “too cheap to meter” to a disaster like the horrific post-war rubble in Japan. And the biggest bang for the buck that was supposed to be the bomb has become endless bucks for the bang. A sad reminder as we approach the anniversaries of the bombings of Hiroshima and Nagasaki.

All or nothing nuclear power – from 24/7 to 0/365

The last few months have seen some definite signs that commercial nuclear power is not the wave of the future but a way of boiling water that might be seen as a twentieth-century folly. Four commercial nuclear reactors have been shut down permanently, ostensibly for different reasons, but economics underlies them all.

Crystal River in Florida came first, in early February 2013. It had been shut since 2009. Like many other pressurized water reactors, it had to have a premature replacement of its steam generators, the huge heat exchangers where the hot reactor water (“primary water”) heats up water in the secondary circuit to make the steam that drives the turbine-generator set. The outer layer of the containment structure cracked during the replacement. Duke Energy, the owner, determined it was too costly to fix the problem. See Duke’s press release at http://www.duke-energy.com/news/releases/2013020501.asp

The 556-megawatt Kewaunee reactor in Wisconsin came next, in early May, unable to compete with cheap natural gas and falling electricity prices. Indeed, electricity consumption in the United States is declining even as the economy recovers from the Great Recession, due in part to the increasing efficiency of electricity use. There doesn’t appear to be enough money in the reserve fund for decommissioning at present – see the New York Times article at http://www.nytimes.com/2013/05/08/business/energy-environment/kewaunee-nuclear-power-plant-shuts-down.html.

San Onofre, with two reactors, came next. Both had been down since early 2012, when excessive wear of steam generator tubes and leaks of primary water were discovered. The steam generators were new, but contrary to the company’s claims, it turned out that the new ones were not copies of the original licensed design. A long, contentious process followed; prospects for a green light to restart faded. The blame game between the supplier of the steam generators, Mitsubishi, and the majority owner, Southern California Edison, grew intense (and it continues). Announcing the decision to close the plant, SCE President Ron Litzinger said: “Looking ahead, we think that our decision to retire the units will eliminate uncertainty and facilitate orderly planning for California’s energy future.” (See the Los Angeles Times article at http://www.latimes.com/local/lanow/la-me-ln-edison-closing-san-onofre-nuclear-plant-20130607,0,7920425.story)

Nuclear plants were supposed to create certainty, reliability, predictability, 24/7 operation. But in the last few years, this has given way to a new reality: nuclear reactors are 24/7 until they become 0/365 with little or no notice. The above are just four examples. Before the Fukushima disaster, Japan had 54 reactors. Four were irretrievably damaged by the accident. In the 15 months that followed, the other 50 were progressively shut or remained in shutdown mode. In the last year, only two have been restarted. It will be a contentious process before any more of them can be restarted. It is possible none will be. Many in Japan assume they won’t be, and the country is installing solar power at rapid rates – 1.5 gigawatts in the first quarter of 2013 alone, equal to about one-and-a-half reactors in peak power output. About 6 gigawatts would be required to generate an amount of electricity equal to that of one typical power reactor. Capacity comparable to that will likely be installed in Japan this year.
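The rough arithmetic behind that equivalence rests on capacity factors; assuming about 15 percent for solar in Japan and about 90 percent for a typical 1,000-megawatt reactor (my round numbers, not figures from the post):

$$6\ \text{GW (solar)} \times 0.15 \approx 1\ \text{GW (nuclear)} \times 0.90 \approx 0.9\ \text{GW of average output}$$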

Finally, Germany prematurely shut eight reactors following Fukushima, consolidating and accelerating the post-Chernobyl process of phasing out nuclear power altogether (the end date is now set for 2022).

But officialdom in the United States still clings to the idea that we need nuclear power. So reliable, so baseload, so twentieth century (oops, wrong century).

Fukushima reflections on the second anniversary of the accident

Statement of Arjun Makhijani for the March 2013 conference commemorating the Fukushima accident
To be read by Helen Caldicott

I appreciate that my friend, Helen Caldicott, one of the two people who inspired my book Carbon-Free and Nuclear-Free (the other was S. David Freeman), has agreed to read a brief statement from me on this second anniversary of the Fukushima disaster. I wanted to share two of the new things I have learned as I have watched the consequences of Fukushima unfold.

First, the Japanese government proposed to allow doses as high as 2 rem (20 millisieverts) per year to school children, claiming that the risk was low or at least tolerable. An exposure at this level over five years – 10 rem in all – to a girl, starting at age five, would create a cancer incidence risk of about 3 percent, using the [age- and gender-specific] risk estimates in the National Academies BEIR VII report.

Now imagine that you are a parent in Japan trying to decide whether to send your daughter to such a school. Roughly thirty of every hundred girls would eventually develop cancer at some point in their lives; about three of those would be attributable to Fukushima school exposure, according to the risk numbers. But no one would know whether their daughter’s cancer was attributable to the exposure at school, and neither would the Japanese government’s radiation bureaucrats. Why is it difficult to understand that while the risk attributable to school contamination would be one in thirty, the proportion of parents stricken with guilt and doubt would be closer to one in three? Would you ever forgive yourself if you made the decision to send your daughter to that school? Or your son, though the risk attributable to Fukushima exposure would be less than that experienced by girls?
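Restating the two ratios side by side (my rearrangement of the numbers above):

$$\Pr(\text{cancer attributable to the school exposure}) \approx \frac{1}{30}, \qquad \Pr(\text{cancer from any cause}) \approx \frac{30}{100} \approx \frac{1}{3}$$

Any of the thirty cancers could plausibly be blamed on the exposure, and no test can sort them out; that gap between the two ratios is the parents’ predicament.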

Indeed, due to the long latency period of most cancers, you would be fearful even if no cancer had as yet appeared. The Pentagon understood this when a Joint Chiefs of Staff Task Force evaluated the extensive contamination produced by the July 1946 underwater nuclear bomb test (Test Baker) at Bikini for its usefulness in war. Here is a quote from their 1947 report:

“Of the survivors in the contaminated areas, some would be doomed by radiation sickness in hours, some in days, some in years. But, these areas, irregular in size and shape, as wind and topography might form them, would have no visible boundaries. No survivor could be certain he was not among the doomed, and so added to every terror of the moment, thousands would be stricken with a fear of death and the uncertainty of the time of its arrival.”

Compare this for yourself with the aftermath of Fukushima and the plight of the parents.

Second, nuclear power’s conceit was that it is a 24/7 electricity supply. Since Fukushima, over sixty of the world’s light water power reactors have been prematurely shut for a variety of reasons (52 in Japan, eight in Germany, several in the U.S.), though just four reactors were stricken by the accident. Even if some are eventually restarted, nuclear power has shown a unique ability to go from 24/7 power supply to 0/365 essentially overnight for long periods – hardly a convincing claim of reliability.

We can do better than making plutonium just to boil water or polluting the Earth with fossil fuel use. When I finished Carbon-Free and Nuclear-Free in 2007, I estimated it would take about forty years to get to an affordable, fully renewable energy system in the United States. Today, I think it can be done in twenty-five to thirty years. Are we up to the challenge? Finally, I truly regret I cannot be there to publicly thank and honor my friend Helen for inspiring Carbon-Free and Nuclear-Free, which you can download free from ieer.org, also thanks to her. I wish you a very productive conference.

(Also see IEER’s publication Plutonium: Deadly Gold of the Nuclear Age, June 1992.)

When small is not beautiful is it at least cheap?

On February 6, 2013, Dan Stout, who is the Tennessee Valley Authority’s senior manager for its Small Modular Reactor project, gave a colloquium at the University of Tennessee in Knoxville. Much of the talk was just nuclear boosterism. For instance, he claimed that “nuclear power was tested hard in 2011. It remains safe, reliable and affordable.”

There was no mention of the fact that, post-Fukushima, about 60 of the world’s light water reactors were closed for one reason or another, mainly in Japan and Germany, taking them from 24/7 power generation to 0/365. He said nothing of the enormous social, economic, and environmental dislocation and loss that has afflicted the Fukushima region and beyond since that time. And there was nothing about the Nuclear Regulatory Commission’s Task Force report of July 2011, which found US regulations seriously lacking in a number of respects (http://pbadupws.nrc.gov/docs/ML1118/ML111861807.pdf). But there was some refreshing candor about Small Modular Reactors (SMRs) mixed with this sales talk. His talk is archived at http://160.36.161.128/UTK/Viewer/?peid=fa73ded60b7b46698e9adc0732101a76

SMRs are supposed to overcome the loss of economies of scale by using assembly-line mass-manufacturing techniques like those used for cars and passenger aircraft. The site design would be standardized, and setup and commissioning on site would be relatively quick (36 months). So “economies of replication” would replace the “economies of scale” that were one of the principal reasons reactors got larger as time went on.
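To make “economies of replication” concrete, here is a minimal sketch under a classic Wright learning-curve assumption (my illustration; the 90 percent learning rate is a conventional assumed figure, not one from the talk):

```python
import math

def unit_cost(first_unit_cost, n, learning_rate=0.90):
    """Cost of the n-th unit off the line, assuming each doubling of
    cumulative output cuts unit cost by 10% (a 90% learning curve)."""
    b = math.log(learning_rate, 2)   # progress exponent (negative)
    return first_unit_cost * n ** b

for n in (1, 10, 100):
    print(f"unit {n:3d}: {unit_cost(1.0, n):.2f}x the first-unit cost")
```

On these assumptions the hundredth unit costs about half as much as the first, which is exactly why, as Mr. Stout concedes below, “the math won’t work on one unit.”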

But there is a chicken-and-egg problem here, to use a cliché. You’ve got to have a lot of orders before you can set up your assembly line and produce cheap reactors, but your reactors have to be certified and demonstrated before the nuclear masses line up to order them, given the risks involved. There are no such orders yet; no assembly line is in sight.

So for now, SMRs would be “cobbled together” in existing facilities. “Does the federal government want to help?” he asked rhetorically. “I don’t know,” he went on. “We’re going to find out. I am not going to make your electric bills all triple because of this project. That’s just …TVA won’t do that.” (Italics added.)

So for the foreseeable future, without massive subsidies, nuclear power using SMRs will be the same as nuclear power with the present Atomic Leviathans – expensive and in need of government subsidies. But you have to hand it to Mr. Stout for one thing. Unlike Georgia Power and South Carolina Electric and Gas, two utilities building the new large behemoth variety (the AP1000), he is not willing to saddle TVA ratepayers with the cost of yet another nuclear gamble. TVA has been there and done that, and is still doing it with large reactors. A large part of TVA’s indebtedness from the 1970s and early 1980s was due to the mid-stream cancellation of costly and unneeded reactors. No, prudently for TVA, Mr. Stout wants the taxpayers to take the risk.

And the federal government has obliged by committing up to half of the $452 million proposed for SMRs to Babcock & Wilcox, designer of the 180-megawatt mPower reactor, and to TVA to advance design and certification. [1] B&W had spent over $200 million on the project as of the middle of last year. Where will it lead, other than to the “cobbled together” machine? Specifically, what will it take to get an assembly line? Here is how Mr. Stout explained it:

So the concept is that you gotta have an assembly line cranking out repeatable parts, achieving a standardized vision of lots of mPower reactors. That creates the nth-of-a-kind plant that has the efficiency in cost. I’m building Unit One. I don’t want to pay for B&W’s factory with automation to crank out repeatable parts. So that creates a contracting challenge… So as you scratch your head and puzzle how this works, remember the math won’t work on one unit. In fact our unit is most likely going to be, I’m going to use the word “cobbled together”, it’s going to be manufactured within existing facilities. But if B&W can get an order backlog of a hundred SMRs and they are going to start delivering them in China and India etc. then they’ll be able to go get the financing and build the building and have new stuff put in place to crank out these parts in a more automated manner. So as long as the design is the same this should all work. The devil is going to be in the details and in the oversight and in the inspections. [italics added]

A hundred reactors, each costing almost $450 million, would amount to an order book of $45 billion. That does not include construction costs, which would double the figure to $90 billion, leaving aside for now the industry’s record of huge cost escalations (see the B&W 2012 presentation at http://www.efcog.org/library/council_meeting/12SAECMtg/presentations/GS_Meeting/Day-1/B&W%20mPower%20Overview%20-%20EFCOG%202012-Ferrara.pdf for a total estimated cost figure of $5,000 per kW). This would make the SMR launch something like creating a new commercial airliner, say, the Dreamliner or the Airbus A350. There were a total of 350 orders for the A350 in 2007, when it was seriously launched as a rival to Boeing’s Dreamliner. The list price of the A350 is between about $250 million and $400 million (rounded; http://www.airbus.com/presscentre/corporate-information/key-documents/?eID=dam_frontend_push&docID=14849), which would make the initial order total the same order of magnitude in cost as 100 completed mPower SMRs.
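The order-book arithmetic, spelled out (my sketch, using the round numbers above):

```python
# mPower order-book arithmetic using the post's round numbers:
# ~$450 million per reactor on the order book, construction roughly
# doubling that, consistent with B&W's ~$5,000/kW for a 180 MW unit.
unit_kw = 180 * 1000                    # mPower rated capacity, kW
cost_per_kw = 5000                      # $/kW total estimated cost (B&W 2012)
completed_cost = unit_kw * cost_per_kw  # ~$900 million per completed unit
order_book_share = 0.5                  # ~$450 million of that is the order book

orders = 100
print(f"per completed unit:  ${completed_cost / 1e6:,.0f} million")
print(f"100-unit order book: ${orders * completed_cost * order_book_share / 1e9:.0f} billion")
print(f"100 completed units: ${orders * completed_cost / 1e9:.0f} billion")
```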

The end of this decade is the target for the commissioning of the first mPower reactor (the most advanced SMR in planning and design and the only one with substantial federal government support so far). It would take some years after that – well into the 2020s – to fully prove the design and, if needed, debug it. It stretches credulity that China and India, which, along with Russia, are the main centers of nuclear power construction today, would put in orders totaling a hundred reactors much before 2020. Indeed, if they were that attracted to SMRs, why would they not pay the license fees and set up the assembly line themselves? Most notably, China, where 28 reactors are under construction (http://www.world-nuclear.org/info/inf63.html), already has a much better supply chain than the United States. So the government subsidy to B&W and TVA would likely pave the way for an assembly line in China! Unless…

The federal government orders a hundred reactors or entices utilities to do so with massive subsidies. We are talking about putting tens of billions of taxpayer dollars at risk – at a time when the air is full of talk of furloughs for teachers, air traffic controllers, and civilian Pentagon employees, among others, and of cutbacks in training of military personnel.

What happens if a common design or manufacturing problem is discovered, as it was with the Dreamliner batteries? How is a mass manufacturer of reactors, whose internals become radioactive after commissioning, going to recall them or their major components? No word from Mr. Stout or, as far as I am aware, from any of the SMR enthusiasts about this. For safety issues, see the testimony of Dr. Ed Lyman of the Union of Concerned Scientists at http://www.ucsusa.org/assets/documents/nuclear_power/lyman-appropriations-subcom-7-14-11.pdf.

IEER and Physicians for Social Responsibility wrote an SMR fact sheet in 2010 outlining such concerns (http://ieer.org/resource/factsheets/small-modular-reactors-solution/). They have only deepened with time. SMRs are highly unlikely to provide the industry the nuclear renaissance that has so far eluded it. Indeed, by the time the first ones are being commissioned, solar photovoltaics, smart grids, and a host of other distributed electricity technologies will be more economical. Wind-generated electricity already is. No one has a perfect crystal ball in the energy business, but mine tells me that SMRs will not solve our CO2 woes. They are likely to be economically obsolete before this decade is done – and the end of this decade is the target date for the first mPower reactor to come on line. So why go there, to a time when there are going to be only costs and little prospect of benefits?

Notes:

  1. This sentence was corrected on April 3, 2013. The original text “And the federal government has obliged by committing $452 million to Babcock & Wilcox, designer of the mPower, 180 megawatt reactor, and the TVA to advance design and certification.” is incorrect.

Turning low-level radioactive waste into high levels of contamination

Three decades after the Nuclear Regulatory Commission first promulgated rules for disposal of low-level waste, it is proposing to massively relax them. The original rules were none too strict. As you can see from the calculation in the comments I made yesterday on the proposed rule, the existing rules would allow contamination of groundwater hundreds of times above the drinking water limit in the case of disposal of graphite made radioactive by being used in a reactor. This is a Department of Energy calculation, used in my comments as an illustration of what is now allowed under NRC rules. Instead of making the rules better and more rational, and instead of filling the health protection gaps, a massive relaxation is proposed. For instance, the allowable pollution from plutonium, uranium, thorium, strontium-90, radioactive iodine, and many other radioactive materials from low-level waste disposal would increase greatly, because the dose limits for individual human organs are to be eliminated under the proposal. The proposed rule also does not define the term “member of the public.” So far, in calculating compliance, infants and children have not been explicitly taken into account.

Take a look at the proposed rules. Read my comments, and also read my letter to Allison Macfarlane, the new Chairman of the Nuclear Regulatory Commission. I had a very interesting meeting with her in November, and I hope she will follow up to get children into the picture and, as a scientist, to straighten out the NRC bureaucracy on fixing at least the most egregious scientific errors.

I am going to write a bit more broadly on science and democracy this year.
Happy New Year

From Pearl Harbor to Hiroshima

The political temperature between Japan and China is rising again over the Senkaku/Diaoyu islands. Once more, oil appears to be a principal issue – as it was in the period leading up to the Japanese attack on Pearl Harbor. The road to Pearl Harbor, and from there to the atomic bombings of Hiroshima and Nagasaki that have shaped so much of the world ever since, needs to be clearly illuminated, now more than at any time since the end of World War II. The question of whether Japan should consider developing its own nuclear weapons is moving into the political discourse and even gaining some acceptability. The newly formed right-wing Japan Restoration Party, which the former Governor of Tokyo, Shintaro Ishihara, resigned his post to lead as he ran for national office, won 61 of the 480 seats in the lower house of Japan’s Parliament (the Diet). Mr. Ishihara has “suggested there is a need for Japan to arm itself with nuclear weapons, expand the military and revise the pacifist constitution,” according to news reports. See more: http://www.theprovince.com/news/Nationalists+take+power+Japan+fire+warning+shot+China/7707292/story.html#ixzz2FLjApjLp

On August 4, 2012, I gave a talk in Santa Fe, New Mexico on the history of US-Japanese relations that led up to rising tensions and the bombing of Pearl Harbor, and on events from that time until the use of the atom bombs on Japan. More than 67 years after those bombings, few know that Japanese forces were selected as the preferred target for those atom bombs on May 5, 1943, long before the bombs were built and well before anyone knew when the war would be over. In fact, Germany was explicitly de-targeted on that same date by the Military Policy Committee. Watch a video of the talk here.

This speech offers a perspective that differs in many ways from those common in US discourse about the bombings. One side only discusses the evidence that the bombings were unjustified; the other points to Japanese militarism and the intensity of the violence in the Pacific Theater of World War II to justify the use of the bombs. I sought to affirm the truths in both arguments but added much that has been missing. So I would particularly welcome your comments on this speech and blog post. If you think you’ve learned something new, we encourage you to ask radio stations and television stations to use this material. It was broadcast on KEXP in Seattle shortly after the anniversary of Pearl Harbor earlier this month.

The German Energy Transition

The Heinrich Böll Foundation has created a new site to inform the public about Germany’s historic energy transition, or “energiewende”: http://www.EnergyTransition.de The aim is to reduce greenhouse gas emissions by 80 percent by 2050. A renewable electricity sector is a principal part of this goal. Germany already produces 26% of its electricity from renewables and is set to exceed the target of 40% by 2020. As evidence for severe climate disruption mounts, the German energiewende provides some hope. This kind of transformation was a gleam in my eye when I finished Carbon-Free and Nuclear-Free in 2007. It was my hope for the United States. I still hope that action by states, corporations, individuals, and even the federal government (through CO2 rules, efficiency standards, etc.) will get us to a fully renewable energy sector by 2050 in the United States and worldwide.

— Arjun

Is there a role for small modular reactors?

On November 20, President Obama announced funding to develop small modular reactors. The US went from small reactors to large ones to get economies of scale: power reactor generating capacity goes up faster than material costs, which is one of the sources of those economies. Despite that, large reactors are very expensive, and the so-called nuclear renaissance is sputtering. So now the new nuclear nirvana is going back to small reactors. It is unlikely to work. Government should not still be subsidizing nuclear power 55 years after the first “commercial” nuclear power plant came on line in 1957. Like the first time, government is rushing in without due consideration of many of the problems.
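A minimal sketch of that scaling intuition (my illustration; the ~0.6 exponent is the classic rule of thumb for process plants, not a figure from the announcement):

```python
def relative_capital_cost(capacity_ratio, exponent=0.6):
    """Capital cost scales roughly as capacity**exponent (the "six-tenths
    rule"), so cost per kW falls as plants get bigger."""
    return capacity_ratio ** exponent

for ratio in (1, 2, 5, 10):  # e.g., ratio 10 compares a 1,000 MW unit to a 100 MW unit
    cost = relative_capital_cost(ratio)
    print(f"{ratio:2d}x capacity -> {cost:4.1f}x cost, {cost / ratio:.2f}x cost per kW")
```

On this rule, a plant ten times bigger costs only about four times as much, i.e., roughly 40 percent of the cost per kilowatt. That is why reactors grew, and it is the advantage that going small gives back.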

For more information on SMRs and their potential problems, see the IEER-PSR factsheet.

— Arjun

After Sandy: Mitigation or Adaptation?

Arjun Makhijani [1]

A decade ago, concern about climate disruption focused mainly on mitigation: how could the world drastically reduce greenhouse gas emissions to curb the severity and frequency of extreme weather events? With global treaty efforts in tatters and Washington in gridlock, however, the focus began to shift to adaptation: how can the damage from climate change be reduced?

Even a cursory look at the destruction wrought by Hurricane Sandy – a waterlogged landscape, natural gas explosions, devastating fires, shortages of food, water, and gasoline, and vast areas without electricity – makes it clear that we must do both.

Thoroughly revamping the country’s century-old electrical infrastructure is a critical starting point. We need a system that is much more resistant to damage and recovers more quickly. One way to accomplish both goals was illustrated at Japan’s Tohoku-Fukushi University after last year’s devastating tsunami. The university’s electric power generation system consists of local natural gas-fired generators, fuel cells, solar photovoltaics, and storage batteries. Because of this microgrid, essential facilities, including the water plant, elevators, and lighting, kept functioning even as much of the rest of the larger grid was swept away. That allowed vital nursing facilities and clinic and laboratory equipment to keep running. (Learn more about the Tohoku-Fukushi microgrid and about other microgrid examples at the Lawrence Berkeley Lab website.)

(Courtesy of DOE/NREL. Credit: Connie Komomua.)

Normally, a microgrid functions as part of a larger regional or national system. Electricity is generated, stored, and supplied locally. At the same time, power is exchanged with the rest of the grid to reduce costs and maintain a high level of reliability and performance. In an emergency, however, a microgrid will cut itself off automatically from the stricken network. It goes into “island” mode, continuing to supply local customers’ essential needs. That would prevent problems like the one during Hurricane Sandy, when an explosion at a single substation caused a massive blackout in Lower Manhattan. Of course, microgrids cannot protect specific locations from flooding or damage. That is a different kind of problem. But with a system of interconnected microgrids, much of the essential equipment in Lower Manhattan that was out of the reach of flooding would have kept operating.
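A toy sketch of the island-mode behavior just described (my illustration, not an actual microgrid controller; all the capacities below are invented for the example):

```python
from dataclasses import dataclass

@dataclass
class Microgrid:
    local_generation_kw: float   # gas generators + fuel cells + PV + batteries
    essential_load_kw: float     # water plant, elevators, clinics, lighting
    total_load_kw: float         # everything served in normal, grid-connected mode
    islanded: bool = False

    def step(self, grid_ok: bool) -> float:
        """Serve load for one interval, islanding automatically if the grid fails."""
        if not grid_ok:
            self.islanded = True    # breaker opens: cut off from the stricken network
        elif self.islanded:
            self.islanded = False   # resynchronize once the wider grid is back
        if self.islanded:
            # Shed non-essential load; carry essentials on local sources alone.
            return min(self.local_generation_kw, self.essential_load_kw)
        return self.total_load_kw   # grid exchange covers any local shortfall

mg = Microgrid(local_generation_kw=800.0, essential_load_kw=600.0, total_load_kw=2000.0)
for hour, grid_ok in enumerate([True, True, False, False, True]):
    served = mg.step(grid_ok)
    print(f"hour {hour}: grid_ok={grid_ok}, islanded={mg.islanded}, served={served:.0f} kW")
```

The essential point is the automatic transition: the same equipment that trades power with the wider grid on a normal day carries the critical loads alone when that grid goes down.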

Putting microgrids at the core of the transformation of the electrical system will end total dependence on a vulnerable, overly centralized system. The replacement will be a distributed, intelligent system whose essential parts are much more likely to function without disruption during extreme events. In addition, a system based on microgrids is well matched to greatly increasing the efficiency of electricity use. The higher the efficiency of use, the larger the number of functions a microgrid in island mode can supply. Higher efficiency also means that a much larger part of the economy can keep functioning at any given level of power. Buildings that are well insulated will stay warm longer without the heating system functioning; food will be preserved much longer without power in highly efficient refrigerators. Crucially, this technology, built for adapting to climate disruption, will also mitigate it by helping to reduce greenhouse gas emissions.
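The efficiency point can be put in one line: in island mode, the time t that a fixed store of energy E (batteries, fuel on hand) can carry a load scales inversely with the power P the load draws, so halving demand through efficiency doubles endurance:

$$t = \frac{E}{P} \qquad\Rightarrow\qquad P \to \frac{P}{2} \implies t \to 2t$$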

As they consider how to protect the region from extreme storms and floods, Governors Christie and Cuomo and Mayor Bloomberg should appoint a task force to create a roadmap for building a distributed, resilient, efficient, and intelligent grid in New York City, on Long Island, and along the Jersey Shore. Such a project could be the core of the infrastructural transformation that is needed all along the Gulf and Atlantic Coasts. Interconnected microgrid networks can enable people and the economy to flourish in the new normal of more frequent and more violent weather events.

Notes:

  1. Arjun Makhijani is senior engineer and president of the Institute for Energy and Environmental Research; he has consulted with electric utilities and several agencies of the United Nations on energy issues.
