The Paris Agreement on climate — a good start, but…

The Paris Agreement, a 31-page climate accord, was adopted on 12 December 2015 and endorsed by acclamation by 195 countries, parties to the United Nations Framework Convention on Climate Change (UNFCCC), at their 21st meeting (COP21). The achievement of universality was remarkable and historic because, for the first time, developing countries also committed to taking action to prevent climate disaster. The rich countries reaffirmed that there are differentiated responsibilities — code for their far greater contribution to the problem of climate disruption.

Another truly remarkable thing was the skill with which the small island states, like the Marshall Islands, and their supporters navigated the waters where the Exxons and Saudi Arabias of the world sail. They led COP21 to an accord that seeks to hold “the increase in the global average temperature to well below 2°C above pre-industrial levels and to pursue efforts to limit the temperature increase to 1.5°C above pre-industrial levels, recognizing that this would significantly reduce the risks and impacts of climate change” (Article 2). The slogan was “one-point-five to survive.” Anything more would mean their destruction by rising oceans — along with so many other coastal communities and lands from Bangladesh to Shanghai to Miami and Mumbai. Hundreds of millions would be displaced at 2°C, the previous average temperature rise limit agreed to in climate negotiations. Take a look at the excellent New York Times illustration of Chinese cities now, with a 2°C temperature rise, and with a 4°C rise.

The 1.5°C limit implies an end to the large-scale destruction of forests; Article 5 begins to address the issue. It would also require leaving most oil, gas, and coal in the ground: fossil fuels would become like stones after the Stone Age — obsolete. While such a transition is essential for Mother Nature and for people generally, millions of workers would lose their jobs. A just transition for them and the communities they live in was an option in Article 2 of the draft going into COP21; it was relegated to the preamble in the final document, as were “obligations on human rights, the right to health, the rights of indigenous peoples, local communities, migrants, children, persons with disabilities and people in vulnerable situations and the right to development, as well as gender equality, empowerment of women and intergenerational equity…” (p. 21). But the words are still there, inviting action. In addition, there was acknowledgement of the need for “gender balance” and recognition that the knowledge of indigenous people would be valuable in adaptation.

Critically, the substance of the commitments, if they can be called that, is not remotely up to the task of limiting temperature rise to 1.5°C. Indeed, there are no legally binding targets at all. Instead there are highly inadequate, voluntary “Intended Nationally Determined Contributions” (INDCs), which together imply roughly a 3°C rise, double the 1.5°C target. Remember: damage would rise far faster than average temperature.

To keep temperature rise to less than 2°C, the Intergovernmental Panel on Climate Change (IPCC), in its Mitigation Report, estimated that CO2-equivalent (CO2eq) concentrations would have to be limited to 450 parts per million (ppm) by the year 2100 (pp. 8-10). That means emissions 40 to 70 percent below 2010 levels by 2050 and “near or below zero” in 2100 (pp. 10, 12; italics added). That would make it likely that the temperature rise would be less than 2°C; the chance that it would suffice for 1.5°C? Just 16 percent, with a likely overshoot above that level in mid-century (Figure 6-13, p. 439). The IPCC also noted in its summary explicitly addressed to policy makers that “Only a limited number of studies have explored scenarios that are more likely than not to bring temperature change back to below 1.5°C by 2100 relative to pre-industrial levels; these scenarios bring atmospheric concentrations to below 430 ppm CO2eq by 2100.” (p. 16, emphasis in the original) Below 430 ppm! The world was already at 430 ppm CO2eq (including all greenhouse gases) in 2011; we are at more than that now.
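To make the arithmetic concrete, here is a small illustrative calculation of what those percentage cuts imply as a sustained year-on-year reduction rate. The exponential-decline framing and the round figures are my assumptions for illustration, not an IPCC prescription.

```python
# Illustrative arithmetic only: what sustained annual decline do the IPCC
# ranges quoted above imply? The exponential-decline framing is an
# assumption for illustration, not an IPCC prescription.

def annual_decline(fraction_remaining, years):
    """Constant yearly fractional cut so that emissions after `years` years
    are `fraction_remaining` of the starting level."""
    return 1 - fraction_remaining ** (1 / years)

YEARS_2010_TO_2050 = 40
for cut in (0.40, 0.70):  # 40 to 70 percent below 2010 levels by 2050
    rate = annual_decline(1 - cut, YEARS_2010_TO_2050)
    print(f"{cut:.0%} below 2010 by 2050 -> roughly {rate:.1%} cut per year, every year")
```

Even the low end of the range means cutting global emissions by more than one percent per year, every year, for four decades; the high end roughly three percent per year. That is the scale of the task before we even get to the tighter 1.5°C pathways.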

The breathtaking scale of this task is not evident in the Paris Agreement, though it does express “serious concern” about “the significant gap” between the INDCs and the stated ambition. Only it’s not just a significant gap; it’s a Himalayan crevasse. It seems clear that, for a reasonable chance of limiting the temperature rise to 1.5°C, global emissions would have to go to zero well before 2100. Considering differentiated responsibilities, rich countries would have to get to essentially zero emissions by about 2050 or before.

The Paris Agreement has provisions for countries to strengthen their commitments to reduce emissions and for five-year reviews. The first review will be in 2018 (“facilitative dialogue…to take stock”, p. 4). A high-priority task, if we are serious about 1.5°C, would be to get zero emissions in the energy sector for rich countries by 2050 (at the latest) on the agenda for that dialogue. Global justice requires at least that. Energy justice within countries will need to be addressed too. For the United States, I suggest that the energy burdens of low-income households be capped at 6 percent of household income, a level generally considered affordable. We’ve done a study detailing that approach for Maryland; it also explores how to provide universal solar energy access. Both are more essential now, for economic justice and for climate goals.
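As a simple illustration of the 6 percent benchmark: a household’s energy burden is just its annual energy bills divided by its income, and the cap translates directly into how much its bills would have to come down. The income and bill figures in this sketch are invented for illustration; they are not taken from the Maryland study.

```python
# Hypothetical illustration of the 6 percent affordability benchmark discussed
# above; the income and bill figures are invented, not taken from the Maryland study.

def required_bill_reduction(annual_income, annual_energy_bill, cap=0.06):
    """Return (current burden, reduction in dollars per year needed to reach the cap)."""
    burden = annual_energy_bill / annual_income
    affordable_bill = cap * annual_income
    return burden, max(0.0, annual_energy_bill - affordable_bill)

burden, reduction = required_bill_reduction(annual_income=24_000, annual_energy_bill=2_400)
print(f"Energy burden: {burden:.0%}; bill reduction needed to reach 6%: ${reduction:,.0f}/year")
```

In this invented example, a household paying 10 percent of its income for energy would need its bills cut by about a thousand dollars a year, through some combination of efficiency, solar access, and assistance, to reach the affordable level.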

Presumably, the $100 billion a year that the rich countries promise to provide by 2020 and thereafter (pp. 16-17) would partly make up for constraining the carbon space of those who did not contribute much to creating the problem. In fact, while recognizing that countries and peoples are already experiencing “loss and damage”, the Paris Agreement flatly states that the article covering such losses “does not involve or provide a basis for any liability or compensation.” (p. 8) The accord lacks a vital tool: teeth.

There is one bad element, a carryover from the Kyoto Protocol. Article 6 of the Paris Agreement would allow international offsets (“cooperative approaches that involve the use of internationally transferred mitigation outcomes towards nationally determined contributions”). This means that some countries (likely rich ones) could continue to pollute while claiming that others (likely poor ones) are doing more than their share or are storing carbon in some way, for instance in soil or trees. It is a giant loophole with potential for serious corruption as well.

The Paris Agreement is a good start, especially in that it sets forth a temperature goal and commits all parties to act, with differentiated responsibilities for the rich. Most of the needed words are there; however, they are, for the most part, weak. To give them effect and keep most fossil fuels in the ground will take the global equivalent of the movement that stopped the Keystone Pipeline. Yet, the agreement could be a solid beginning: it has created immense organizing energy. The work of keeping fossil fuels in the ground has already begun, among others by 350.org, the group that led the huge and diverse Keystone struggle.

We will also need national and local roadmaps for efficiency and renewable energy, transportation, and sustainable agriculture (a large source of greenhouse gas emissions). That vision will need to be broad. For instance, it will need to include the cooking energy needs of the hundreds of millions of families who now cook with wood, cow dung, and crop residues. Women and children die in the millions each year from respiratory diseases, and black-carbon (soot) emissions contribute to global warming.

The world already has more than one billion petroleum-fueled cars and is headed toward 2 billion by 2030. That is incompatible with the Paris Agreement. Transportation will need to be revolutionized — and electrified — with electrified public transport much more at the center of things and all types of transportation running on renewable energy. Paris should be an inspiration for a walkable city with wonderful public transport.

We will need roadmaps, created with public input, for productively investing and spending the $100 billion a year, along with intense pressure to ensure that at least that much money is forthcoming, that it is well spent, and that it creates good jobs for the workers now in the fossil fuel sectors.

At bottom, 1.5°C is about reshaping a world created by imperialist-drawn borders, often with oil at the center, and a hundred years of wars — still going on — into one that is ecologically sane, peaceful, and economically just. Remember Syria and Iraq (among others) were essentially created by Britain and France after World War I. Actually achieving a limit of 1.5°C will mean taking the tiger out of Exxon’s tank and putting it into the Paris Agreement. It may well be a perilous exercise in itself. But it is one that is essential — it is the one-point-five imperative.

The Clean Power Plan is a step in the right direction

With the publication of the final Clean Power Plan, the United States can finally claim some leadership in curbing CO2 emissions at the federal level. The final rule is, on balance, technically, economically, and environmentally coherent. The actual goal is short of what it needs to be, but it is better than in the draft plan. And the direction is right, which is the most important thing. Thanks to all of you who worked with us and supported us in the process, especially Scott Denman, Diane Curran, Lisa Heinzerling, Elena Krieger, and my co-author, M.V. Ramana.

We asked for many things in our comments on the draft EPA Clean Power Plan. In the final Clean Power Plan (CPP) rule, the EPA agreed not only with the substance but, more important, with the reasoning underlying our policy positions.

Most of all we asked for a coherent, technology-neutral rule that would be more protective of climate. Here are some of the big picture items:

  1. Existing nuclear plants and license extensions will not be subsidized by the CPP: We asked that both existing nuclear power plants and existing renewable energy be removed from the calculation of emission targets because they do nothing to reduce CO2 emissions below current levels. We asked that they be treated consistently. (Neither has significant onsite emissions of CO2, and both have some offsite lifecycle emissions that are much lower than those of natural gas per unit of generation.) Existing generation should not be part of the “Best System of Emission Reduction” (BSER) because we want to reduce CO2 emissions from where they are now (or in 2012, the baseline year). The EPA agreed. Both are gone from the final rule. Further, in its draft rule, the EPA implicitly assumed (in its modelling of the electricity sector) that licenses of existing plants would be extended. The relicensing issue has been removed from the CPP since existing generation is not in the calculation of emission reductions. It is simply the baseline generation, as is clear from page 345 of the final plan (italics added):

    …we believe it is inappropriate to base the BSER on elements that will not reduce CO2 emissions from affected EGUs below current levels. Existing nuclear generation helps make existing CO2 emissions lower than they would otherwise be, but will not further lower CO2 emissions below current levels. Accordingly,…the EPA is not finalizing preservation of generation from existing nuclear capacity as a component of the BSER.

    The same reasoning was applied to license extensions. Only uprates (increases in licensed capacity of existing plants) would be allowed to be counted. This is consistent and technology neutral (in the same way that increasing the capacity of a wind farm would be counted). The rule does not seek to “preserve” existing power plants. Or to shut them down. That will happen on the merits without an EPA hand on the scale in favor of nuclear.

  2. New and under-construction nuclear reactors are not part of the best system of emission reduction; renewable energy is: We pointed out that new nuclear plants are very expensive; even the State of Georgia, whose ratepayers are forced to subsidize two nuclear units through their electricity bills, noted that in its comments. Since the “Best System of Emission Reduction” (BSER) has a cost criterion, new nuclear should be excluded from the BSER. (We also cited other reasons for that.) The EPA excluded new nuclear from BSER but included new renewable energy (p. 345, italics added):

    Investments in new nuclear capacity are very large capital-intensive investments that require substantial lead times. By comparison, investments in new RE generating capacity are individually smaller and require shorter lead times. Also, important recent trends evidenced in RE development, such as rapidly growing investment and rapidly decreasing costs, are not as clearly evidenced in nuclear generation. We view these factors as distinguishing the under-construction nuclear units from RE generating capacity, indicating that the new nuclear capacity is likely of higher cost and therefore less appropriate for inclusion in the BSER.

    This is a critically important statement. We don’t have a shortage of low-CO2 sources. We have a shortage of time and money to reduce CO2 emissions. The EPA recognized (very delicately!) that renewable energy is better on both counts. As a result, one or more of the four new reactors under construction at Vogtle and Summer can proceed or stop on the financial merits, rather than being pushed into existence with the Clean Power Plan playing the role of midwife.

    The EPA also “seeks to drive the widespread development and deployment of wind and solar, as these broad categories of renewable technology are essential to longer term climate strategies” (p. 874). This is an excellent goal. The EPA recognized that costs of solar and wind are declining.

  3. New natural gas plants are not part of the best system of emission reduction: This is perhaps the best and most solid indication that the Obama administration takes long-term reductions seriously. New natural gas combined cycle plants will not be part of the BSER, even though they have lower CO2 emissions per megawatt-hour than the coal plants they would displace (using EPA leak rates and global warming potential for methane) and even though they meet the cost test and the emission rate test. The reason: they will be emitting CO2 for decades (p. 346, italics added):

    However, our determination not to include new construction and operation of new NGCC capacity in the BSER in this final rule rests primarily on the achievable magnitude of emission reductions rather than costs. Unlike emission reductions achieved through the use of any of the building blocks, emission reductions achieved through the use of new NGCC capacity require the construction of additional CO2-emitting generating capacity, a consequence that is inconsistent with the long-term need to continue reducing CO2 emissions beyond the reductions that will be achieved through this rule. New generating assets are planned and built for long lifetimes — frequently 40 years or more — that are likely longer than the expected remaining lifetimes of the steam EGUs whose CO2 emissions would initially be displaced by the generation from the new NGCC units. The new capacity is likely to continue to emit CO2 throughout these longer lifetimes….

  4. Increased capacity factor of existing natural gas plants is BSER: The EPA is still allowing increased capacity factor of existing natural gas combined cycle power plants to displace coal. This is the result of its estimate of methane leak rates and global warming potential. So long as new central station natural gas plants are not encouraged, the rate of use of existing plants is a problem that can be sorted out in the coming years. It would have been very difficult to argue, only on the grounds of the BSER rules and existing methane leak estimates, that increasing the capacity factor of existing natural gas combined cycle units to displace coal is not BSER. The job now is to get the EPA to recognize a wider array of methane leak rates (which have ample empirical support) and to use both a 20-year and a 100-year warming potential screen in the design of its CO2 reduction programs. The recent report from the IPCC uses a global warming potential of 28-34, including feedback effects. It would be entirely appropriate for the EPA to adopt a similar evaluation metric. The 20-year warming potential, which is about three times higher, would be even more appropriate given that the climate crisis is developing more rapidly than previously anticipated. (A rough numerical sketch of why leak rates and the warming-potential horizon matter so much follows this list.)
  5. The EPA has incentivized early investment in low-income efficiency programs (p. 864 onward): This is a very important feature of the CPP. States that want to make very sure that low-income households are not adversely impacted by the rule will take advantage of the additional emission reduction credits the EPA is offering for early action. This also promises to provide other benefits such as reduction of the cost of energy assistance programs and lower adverse health impacts due to inability to pay for health care or medicines.
  6. The cap-and-trade provision is OK in the electricity context, though with reservations: Carbon credits from new generation can be traded; existing nuclear plants, by contrast, cannot generate tradeable CO2 credits (unless the credits are from a licensed uprate). I am not a fan of expansive cap-and-trade, but the EPA formulation in the CPP makes sense to me. It is the same as if emission limits were set for a group of states or at the grid level, such as the PJM grid in the mid-Atlantic region but extending inland to Ohio and beyond, or the MISO grid in the upper Midwest. The EPA seeks not to impose a model of reductions, only to get to a certain level of reductions. In the cap-and-trade system permitted by the EPA, the CO2 reduction could happen in one state or in another, but it will have to happen. One of my reservations is that the EPA also allows the trading of energy efficiency credits across state lines. It is difficult enough to account for program-induced efficiency improvements within a state and distinguish them from, say, the effects of federal appliance standards. Bundling these efficiency gains into tradeable credits is not a good idea. Another issue is that the method of calculating the reduction in emission rate is not the best as applied to efficiency. We had asked for a more global and comprehensive approach to CO2 accounting, but did not succeed on this point.
  7. Conclusion – The CPP is a real tour de force; it gives me hope. Of course, there is much work to do now that the final CPP has been published (besides making it stick). We need to advocate for states to mandate GHG reduction targets of 40 to 50 percent by 2030 from all sources; we need to accelerate electrification of transportation and restructuring of the grid….But the CPP is a great springboard from which to make these leaps.
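To illustrate the point made in item 4 above about leak rates and warming potentials, here is a rough, purely illustrative comparison of an existing gas combined cycle unit with a coal unit. Every figure in it (the per-MWh emission and fuel numbers, the leak rates, and the warming potentials) is an assumption of mine for illustration, not an EPA determination.

```python
# Rough, illustrative comparison of CO2-equivalent emissions per MWh for an
# existing gas combined cycle unit versus a coal unit, under different assumed
# methane leak rates and warming-potential horizons. All numbers are
# approximate assumptions for illustration, not EPA or IPCC calculations.

GWP_100YR = 34    # methane, 100-year horizon with feedbacks (IPCC AR5 gives 28-34)
GWP_20YR = 86     # methane, 20-year horizon with feedbacks (assumed, from IPCC AR5)

COAL_CO2_PER_MWH = 1000.0  # kg CO2/MWh, typical existing coal steam unit (assumed)
GAS_CO2_PER_MWH = 400.0    # kg CO2/MWh, combined cycle combustion only (assumed)
GAS_FUEL_PER_MWH = 133.0   # kg of methane burned per MWh at ~7,000 Btu/kWh (assumed)

def gas_co2eq_per_mwh(leak_fraction, gwp):
    """Combustion CO2 plus upstream leaked methane, expressed as CO2eq per MWh."""
    leaked_ch4 = GAS_FUEL_PER_MWH * leak_fraction / (1 - leak_fraction)
    return GAS_CO2_PER_MWH + leaked_ch4 * gwp

for leak in (0.015, 0.03, 0.05):  # leak rates spanning published estimates
    for label, gwp in (("100-yr", GWP_100YR), ("20-yr", GWP_20YR)):
        gas = gas_co2eq_per_mwh(leak, gwp)
        print(f"leak {leak:.1%}, {label} GWP: gas ~{gas:.0f} kg CO2eq/MWh "
              f"vs coal ~{COAL_CO2_PER_MWH:.0f}")
```

Even with these rough numbers, the point in item 4 holds: the advantage of existing gas over coal shrinks substantially at higher leak rates and over the 20-year horizon, which is why the choice of leak-rate estimates and warming-potential screen matters so much.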

The German Energy Transition

The Heinrich Böll Foundation has created a new site to inform the public about Germany’s historic energy transition, or “energiewende”: http://www.EnergyTransition.de. The aim is to reduce greenhouse gas emissions by 80 percent by 2050. A renewable electricity sector is a principal part of this goal. Germany already produces 26% of its electricity from renewable sources and is set to exceed the target of 40% by 2020. As evidence for severe climate disruption mounts, the German energiewende provides some hope. This kind of transformation was a gleam in my eye when I finished Carbon-Free and Nuclear-Free in 2007. It was my hope for the United States. I still hope that action by states, corporations, individuals, and even the federal government (through CO2 rules, efficiency standards, etc.) will get us to a fully renewable energy sector by 2050 in the United States and worldwide.

— Arjun

After Sandy: Mitigation or Adaptation?

Arjun Makhijani [1]

A decade ago, concern about climate disruption focused mainly on mitigation. How could the world drastically reduce greenhouse gas emissions to curb the severity and frequency of extreme weather events? With global treaty efforts in tatters and Washington in gridlock, however, the focus began to shift to adaptation. How can the damage from climate change be reduced?

Even a cursory look at the destruction wrought by Hurricane Sandy – a waterlogged landscape, natural gas explosions, devastating fires, shortages of food, water, and gasoline, and vast areas without electricity — makes it clear that we must do both.

Thoroughly revamping the country’s century-old electrical infrastructure is a critical starting point. We need a system that is much more resistant to damage and that recovers more quickly. One way to accomplish both goals was illustrated at Japan’s Tohoku-Fukushi University after last year’s devastating tsunami. The university’s electric power generation system consists of local natural gas-fired generators, fuel cells, solar photovoltaics, and storage batteries. Because of this microgrid, essential facilities, including the water plant, elevators, and lighting, kept functioning even as much of the rest of the larger grid was swept away. That allowed vital nursing facilities and clinic and laboratory equipment to keep running. (Learn more about the Tohoku-Fukushi microgrid and about other microgrid examples at the Lawrence Berkeley Lab website.)

Courtesy of DOE/NREL, Credit – Connie Komomua.

Normally, a microgrid functions as part of a larger regional or national system. Electricity is generated, stored, and supplied locally. At the same time, power is exchanged with the rest of the grid to reduce costs and maintain a high level of reliability and performance. In an emergency, however, a microgrid will cut itself off automatically from the stricken network. Instead, it goes into “island” mode, continuing to supply local customers’ essential needs. That would prevent problems like the one during Hurricane Sandy, when an explosion at a single substation caused a massive blackout in Lower Manhattan. Of course, microgrids cannot protect specific locations from flooding or damage. That is a different kind of problem. But with a system of interconnected microgrids, much of the essential equipment in Lower Manhattan that was out of the reach of flooding would have kept operating.
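For readers who want a concrete picture, here is a minimal, hypothetical sketch of the islanding decision just described: stay grid-connected when the wider grid is healthy, and otherwise disconnect and serve essential loads first. Real microgrid controllers follow interconnection standards such as IEEE 1547; the thresholds, class names, and load list below are invented for illustration.

```python
# A minimal, hypothetical sketch of the islanding logic described above.
# The thresholds and names are invented for illustration; real controllers
# follow interconnection standards (e.g., IEEE 1547) and utility practice.

from dataclasses import dataclass

NOMINAL_HZ = 60.0
FREQ_TOLERANCE = 0.5      # Hz deviation treated as a grid disturbance (assumed)
VOLTAGE_TOLERANCE = 0.12  # fractional voltage deviation treated as a fault (assumed)

@dataclass
class GridMeasurement:
    frequency_hz: float
    voltage_pu: float     # per-unit voltage at the point of common coupling

@dataclass
class MicrogridController:
    islanded: bool = False

    def update(self, m: GridMeasurement, local_generation_kw: float, loads: dict) -> dict:
        """Decide operating mode and return the loads that stay energized."""
        grid_healthy = (abs(m.frequency_hz - NOMINAL_HZ) <= FREQ_TOLERANCE
                        and abs(m.voltage_pu - 1.0) <= VOLTAGE_TOLERANCE)
        self.islanded = not grid_healthy
        if not self.islanded:
            return loads  # grid-connected: serve everything, trade with the wider grid
        # Island mode: serve essential loads first, then others if generation allows.
        served, remaining = {}, local_generation_kw
        for name, (kw, essential) in sorted(loads.items(), key=lambda x: not x[1][1]):
            if kw <= remaining:
                served[name] = (kw, essential)
                remaining -= kw
        return served

loads = {"water plant": (400, True), "elevators": (150, True),
         "lighting": (100, True), "HVAC comfort": (600, False)}
ctrl = MicrogridController()
print(ctrl.update(GridMeasurement(frequency_hz=58.8, voltage_pu=0.70),
                  local_generation_kw=700, loads=loads))
```

The design choice this sketch illustrates is the one that mattered at Tohoku-Fukushi: the decision to island and the priority order of loads are made locally, so essential services keep running even when the larger network fails.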

Putting microgrids at the core of the transformation of the electrical system will end total dependence on a vulnerable, overly centralized system. The replacement will be a distributed, intelligent system whose essential parts are much more likely to function without disruption during extreme events. In addition, a system based on microgrids is well matched to greatly increased efficiency of electricity use. The higher the efficiency of use, the larger the number of functions a microgrid in island mode can supply. Higher efficiency also means that a much larger part of the economy can keep functioning at any given level of power. Buildings that are well insulated will stay warm longer without the heating system functioning; food will be preserved much longer without power in highly efficient refrigerators. Crucially, this technology, built for adapting to climate disruption, will also mitigate it by helping to reduce greenhouse gas emissions.

As they consider how to protect the region from extreme storms and floods, Governors Christie and Cuomo and Mayor Bloomberg should appoint a task force to create a roadmap for building a distributed, resilient, efficient, and intelligent grid in New York City, Long Island, and the Jersey shore. Such a project could be the core of the infrastructural transformation that is needed all along the Gulf and Atlantic Coasts. Interconnected microgrid networks can enable people and the economy to flourish in the new normal of more frequent and more violent weather events.

Notes:

  1. Arjun Makhijani is senior engineer and president of the Institute for Energy and Environmental Research; he has consulted with electric utilities and several agencies of the United Nations on energy issues.

Bad News on Climate; Good News on Energy

My February 26, 2008, op-ed in the Dallas Morning News seems to have excited a great deal of interest, including on this blog. I really enjoyed my speaking tour of Texas, including appearing on Think, the Dallas PBS TV program, to talk about Carbon-Free and Nuclear-Free. See the video here.

(Dr. Egghead’s philosophical disclosure: Descartes could have done better than “I think therefore I am.” I prefer what the French do rather than what their philosophers say: “I eat therefore I am” and also “I am therefore I eat.”)

Watch the video anyway. You’ll like it. Krys Boyd was a really knowledgeable and gracious host at KERA TV. If you love my mellifluous voice on that, see clips from one of my Dallas area speeches, courtesy of the Dallas Peace Center.

There is bad news on climate and good news on energy.

One of the indicators of a warming Earth is the extent of summer Arctic ice melting. Last summer’s melting was not only the worst since measurements began; the rate of change also increased drastically. Here is a chart showing model projections (the red and the dashed lines) and actual satellite measurements (heavy black line).

Great Arctic Ice Melt of 2007

Chart of IPCC's modelling predictions to the end of the century versus actual satellite measurements.

Chart is courtesy of Dr. A. Sorteberg, Bjerknes Centre for Climate Research, University of Bergen, Norway.

The previous worst-case estimate for complete summer melting was about 2070. Now it may be less than a decade away. We cannot afford to wait for time to tell us whether this worst case will come about. We must act. Two climate scientists, H. Damon Matthews of Concordia University and Ken Caldeira of the Carnegie Institution of Washington, recently published an article in Geophysical Research Letters analyzing the long-term requirements for protecting climate and concluded as follows:

“We have shown here that stable global temperatures within the next several centuries can be achieved if CO2 emissions are reduced to nearly zero. This means that avoiding future human-induced climate warming may require policies that seek not only to decrease CO2 emissions, but to eliminate them entirely.” [emphasis and color added]

Source: H. Damon Matthews and Ken Caldeira, Stabilizing climate requires near-zero emissions, Geophysical Research Letters, Vol. 35, XXXX, 2008 (prepublication).

See a New Scientist article about this paper

There is good news to offset the bad news: My book Carbon-Free and Nuclear-Free shows that we don’t have to go to the poorhouse to eliminate carbon dioxide emissions from fossil fuels. We can have a flourishing economy and protect climate. Wind energy in good areas is already cheaper than nuclear or competitive with it. The country needs sensible rules for investment in transmission lines to create more of a boom in wind. It’s already happening in Texas, which has such rules; some oilmen, like T. Boone Pickens, see wind farms as the future of energy. See the New York Times article.

In the United States, the area of parking lots and commercial building rooftops is large enough to supply much or most of the country’s electricity requirements. And Nanosolar, located in Silicon Valley, is all set to make solar panels on a large scale for less than a dollar a watt (plus installation). That means solar electricity is likely to make nuclear energy economically obsolete by the time the first proposed new nuclear plants come on line (if all goes according to the nuclear industry’s plans), making for another generation of economic lemons, for which ratepayers and taxpayers will pay a heavy price. Why go there?
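As a back-of-envelope illustration of what a price near a dollar a watt can mean: the rough cost of solar electricity follows from annualizing the capital cost and dividing by the energy a watt of capacity produces in a year. The installed-cost, financing, and capacity-factor figures below are my assumptions for illustration, not vendor or market data.

```python
# Back-of-envelope illustration only: roughly what a dollar-a-watt panel
# (plus assumed installation and financing costs) means in cents per kWh.
# All inputs are assumptions for illustration, not vendor or market data.

def simple_lcoe(capital_per_watt, fixed_charge_rate, capacity_factor,
                annual_om_per_watt=0.01):
    """Very simplified levelized cost in cents/kWh: annualized capital plus O&M
    divided by the annual generation per watt of capacity."""
    annual_cost = capital_per_watt * fixed_charge_rate + annual_om_per_watt  # $/W-yr
    annual_kwh_per_watt = capacity_factor * 8760 / 1000.0                    # kWh/W-yr
    return 100 * annual_cost / annual_kwh_per_watt                           # cents/kWh

# Assumed: $1/W panels plus $2/W installation, ~8% fixed charge rate, 20% capacity factor
print(f"{simple_lcoe(capital_per_watt=3.0, fixed_charge_rate=0.08, capacity_factor=0.20):.1f} cents/kWh")
```

With these assumed inputs the result is in the mid-teens of cents per kilowatt-hour, and it falls directly as installed cost per watt falls, which is the point about panels approaching a dollar a watt.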

New Zealand has announced a goal of zero CO2 emissions without nuclear power by mid-century. Why not the United States? Declaring that to be a goal and enacting the tough policies that will be needed could work wonders for restoring the positive image that most of the world’s people once had of the United States, which has fallen into sad disrepute abroad in recent times.

S. David Freeman, former Chairman of the TVA, noted in his Foreword to my book, that it will take “determination and guts …[to] achieve a renewable energy economy.” That means your involvement. Take the message of Carbon-Free and Nuclear-Free to the candidates of all parties, independent of those whom you personally support; ask them if they are familiar with Carbon-Free and Nuclear-Free, which shows we can live well without fossil fuels or nuclear power.

You can do more. Link to this blog; comment on it; make it the go-to place for energy commentary, discussion, and Q&A about the energy problems of our time. Read my book. Download it free. Discuss it in your book club.

Posts to come: On China and India; on efficiency; on the coming generation of passenger vehicles.

–Arjun

Carbon-Free and Nuclear-Free is getting an enthusiastic response, but several new nuclear facilities are planned


I have been going around the country speaking about my new book, Carbon-Free and Nuclear-Free: A Roadmap for U.S. Energy Policy. (Download it free)

Nothing I have done in 37 years of work on energy, environment, and nuclear weapons and power issues has caught on like this.

As evidence of serious and rapid climate change mounts and a price on carbon emissions looks more and more certain, companies are finding coal-fired power plants hard to justify and harder to finance. So the nuclear industry wants to ride into town as the savior. Having failed to deliver electricity “too cheap to meter” (promised in the 1950s by the Chairman of the Atomic Energy Commission, Lewis Strauss), it now wants massive new government subsidies in the form of loan guarantees.

But it is a false choice. Those who oppose nuclear power as the “solution” to the global climate crisis are right: a combination of efficiency, renewable energy, combined heat and power, and emerging technologies such as plug-in hybrid cars can allow us to phase out all fossil fuels and nuclear power in 30 to 50 years.

Eight new nuclear reactors are being proposed in Texas alone. The two near Amarillo, in the panhandle, would consume 60 million gallons of water every day—more than what the entire city uses. The company proposing the plant has said that a lake in an unidentified location will supply the water. In Idaho, the CEO of Alternate Energy Holdings, which wants to build a power plant there, implies that nuclear power will cost only 1 to 2 cents per kilowatt-hour because the capital cost is borne by the investors, as if Wall Street were a kind of charity for electricity consumers. Far from it. Wall Street got burned by nuclear power in the 1980s; it is leery of financing new reactors. That’s why the nuclear industry has the largest hat in hand in Washington, D.C., asking for handouts such as license application subsidies and 100 percent loan guarantees.
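A back-of-envelope calculation shows why a claim of 1 to 2 cents per kilowatt-hour cannot survive contact with capital costs. The overnight cost, fixed charge rate, capacity factor, and fuel-plus-O&M figures below are illustrative assumptions of mine, not company or official estimates.

```python
# Rough illustration of why capital recovery dominates the cost of new nuclear
# power, and why "1 to 2 cents per kilowatt-hour" is implausible unless the
# capital cost is simply ignored. All figures are assumptions for illustration.

def nuclear_cost_cents_per_kwh(overnight_cost_per_kw, fixed_charge_rate,
                               capacity_factor, fuel_and_om_cents=2.0):
    """Simplified busbar cost: annualized capital plus fuel and O&M."""
    annual_capital = overnight_cost_per_kw * fixed_charge_rate  # $/kW-yr
    annual_kwh = capacity_factor * 8760                         # kWh per kW per year
    return 100 * annual_capital / annual_kwh + fuel_and_om_cents

# Assumed: $6,000/kW overnight cost, 10% fixed charge rate, 90% capacity factor
print(f"{nuclear_cost_cents_per_kwh(6000, 0.10, 0.90):.1f} cents/kWh")
```

With capital recovery alone contributing most of the total in this sketch, the only way to get to 1 or 2 cents is to pretend the plant was free.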

But at least some investors are catching on. MidAmerican Energy, owned by Warren Buffett’s Berkshire Hathaway, announced last month (January 2008) that it was abandoning plans to build a nuclear power plant in Idaho because it could not provide economical power to its customers. Austin Energy, the city-owned utility in the capital of Texas, has recommended that the City vote not to buy a share of the two proposed reactors near Bay City, Texas. The investment would, at this time, be “unwise” and “imprudent,” said the utility, because of insufficient time to examine the paperwork and the risk of cost overruns and delays.

Here is a link to a summary of my book (Note: 2.5 MB pdf)

and to an op-ed I recently wrote for the Deseret News (Salt Lake City)

I invite you to comment on the analysis in my book, on what you are doing in your neighborhood, city, county, or state regarding energy and climate and to link to my blog.

–Arjun
