The Nagasaki atomic bombing – why the rush?

Nagasaki was destroyed by a plutonium atomic bomb seventy-five years ago, on August 9, 1945. Called “Fat Man,” it was the same design that had been tested in the New Mexico desert less than a month before, spreading intense radioactive fallout over a wide area. A day before, on August 8, 1945, the Soviet Union had declared war on Japan, having been neutral until then. Japanese wartime leaders had been divided about surrender for weeks; now, with Japan facing a two-front war, the debate became more urgent and intense.

Susan Southard’s account of the debate in the Japanese councils of war notes that “the news of the second atomic bombing had no apparent impact on their deliberations [on August 9], which, according to notes from their meeting continued throughout the day and with no further mention of Nagasaki.” The specter of occupation by the Soviets and the hopelessness of the two-front war seem to have persuaded the Emperor of Japan, that very night, to signal a surrender to the United States, which duly happened on August 15, 1945.

This much seems well supported by the facts; despite that, the controversy rages. What is debated even less is the timing of the use of the bombs and the targeting of Japan. In a blog post three days ago, I recounted, once again, that the decision not to target Germany and, instead, to orient the bomb to the Pacific theater had been taken on May 5, 1943, more than two years before Hiroshima and Nagasaki were destroyed.

The bombs were used as soon as they were ready and the weather permitted — August 6 and 9. On August 9, Nagasaki was not even the intended target; it was Kokura, but there were too many clouds over that city. So Nagasaki, a secondary target that had already been bombed with more mundane explosives, had the atomic misfortune instead. The main targets were cities that had deliberately been spared conventional bombing; the object was to measure the impact of the atomic bomb with as much scientific precision as possible. Prior destruction would confound those measurements.

The rush was not related to the anticipated large loss of lives of U.S. troops in an invasion (tens of thousands in the military’s estimates from June 1945) because that D-day was not until November 1, nearly three months after the atomic bombings. President Truman, General Marshall, and other military leaders believed that a Soviet entry into the war would cause Japan to capitulate. At the mid-July Potsdam conference, the Soviets had agreed to declare war on Japan on August 15. Following the Hiroshima bombing, Stalin accelerated that declaration by a week. He got the message the U.S. sent about the shape of the post-war world; he was going to have his say.

Why not wait till a few days after August 15 to bomb Hiroshima? Why not wait for a few days after August 8, when the Soviets actually entered the war, to use the second bomb? Why persist in the hurried schedule?

Two principal reasons can explain the rush. One imperative was to justify the use of vast resources on the Manhattan Project. If the bombs were not used and shown to be important, even decisive, in ending the war, there would be endless investigations. President Truman, as a senator in 1944, had already threatened investigations when he was frustrated in his attempts to find out where all the money was going. Jimmy Byrnes, FDR’s director of the War Mobilization Board (and later Truman’s Secretary of State), had warned FDR in February 1945 that he had better be able to show that the money spent on the project was actually contributing to the war effort.

The Manhattan Project had had very high priority on wartime resources; for example, welders were sent from shipyards in San Francisco to Hanford to build the plutonium production plants there. Were the bombs not used, there could well be a reasonable argument that the Project actually cost the lives of U.S. soldiers and sailors.

The second was to announce the shape of the post-war world, most of all to the Soviet Union. Groves frankly acknowledged that when he said after the war: “There was never from about two weeks from the time I took charge of this Project any illusion on my part but that Russia was our enemy, and the project was conducted on that basis. I didn’t go along with the attitude of the country as a whole that Russia was a gallant ally….Of course, that was so reported to the President.” (as quoted by Martin Sherwin in his book A World Destroyed, Vintage Books paperback, 1987, p. 62).

The United States and Britain had kept the fact of the bomb project from their Soviet wartime ally. Churchill had explicitly spurned a plea from the great Danish physicist Niels Bohr in 1944 that the Soviets, as allies, should be informed. In any case, Stalin was well informed; he already knew of the bomb project through his spy network. With Truman’s hint at Potsdam about the successful atom bomb test, Stalin accelerated the Soviet bomb effort. With Hiroshima, he accelerated the Soviet declaration of war on Japan. He would have his say.

The die was cast. The project had been started, at the urging of Einstein and others, as an effort to deter Hitler from blackmailing the world with atom bombs; over time it ceased being about Germany – most definitively so by early December 1944, by which time the Manhattan Project spy mission, Alsos, had determined that Germany did not have a viable bomb project.

At that point, the vast plutonium separation plants at Hanford, Washington, operated by DuPont, had not yet been started. None of the tens of millions of gallons of highly radioactive waste from plutonium separation that still haunt Eastern Washington State had been created. At that moment, when the bomb project was accelerated instead of being stopped (a logical step had it still been about the Nazis), it had definitively become about money and power — wartime money and postwar power. That is the secret in open view about the timing of the bombing of Hiroshima and, even more so, about the destruction of Nagasaki 75 years ago.

PS: Many years ago, I had the privilege of interviewing Walter Hooke, a marine veteran who was among the US troops that occupied Nagasaki. And here is my Hiroshima blog post from three days ago. These are vignettes of the history. For an overview, please refer to my 2012 talk in Santa Fe, From Pearl Harbor to Hiroshima; it’s about one hour.

When was the decision to use the atom bomb made?

When was the decision made to use atom bombs on Hiroshima and Nagasaki? Was it one decision — or several that made their use inexorable and inevitable? What were the forums in which those decisions were made? When was Japan targeted? And Germany? Seventy-five years after those cities were obliterated, these remain insistent questions.

The first step, the establishment of the Uranium Committee in October 1939, after President Roosevelt read Einstein’s letter urging a bomb project, was not really a decision to use the bomb. It was a scientific exploration whose aim was, more or less, deterrence: to beat the Nazis to the bomb.

Vannevar Bush (no relation to the two presidents to come), who headed the National Defense Research Committee in the White House, slow-walked the project until well into 1941. Only $10 million was spent in the first two years. He reported directly to FDR and was the central decider in weapons development during World War II.

Bush was an electrical engineer, inventor, Vice-President of the Massachusetts Institute of Technology and, from 1939 onward, President of the Carnegie Institution. He came to the White House job in 1940 determined to bring the full force of U.S. science, including that in academia, to bear on the development of weapons in this war, not the next. He knew that the generals considered academic scientists to be eggheads who, at best, would develop weapons for the next war — or, in the words of Harvey Bundy, confidant of Secretary of War Henry Stimson, the military “would naturally have the feeling that these damn scientists weren’t very practical men; they were visionaries.” Bush was determined to prove them wrong.

At first, the atom bomb project seemed too speculative for his purpose; who knew if it would work at all? But in July 1941 the British MAUD (Military Application of Uranium Detonation) Committee concluded that a modest amount of uranium-235 would be sufficient to produce a massive atomic explosion. Now the bomb was not speculative, though it was not yet a reality. Such was the certainty that, four years later, the uranium bomb was used directly on Hiroshima without ever being tested. (The July 1945 Trinity test was for the trickier plutonium implosion design.) Now the argument that the U.S. should beat the Nazis to the nuclear punch could be joined to Bush’s ambition to put science in the service of U.S. weapons to be used in World War II and for global power after it (now more politely called “national security”).

Bush briefed FDR on October 9, 1941, recommending an all-out effort to make the bomb. Given his position and his ambition that the weapons whose development he was overseeing should be used in World War II, the first decision to use the bomb was made, in spirit and at least arguably, on that date, almost two months before the U.S. formally entered the war upon the bombing of Pearl Harbor. Of course, it had to be built in time. But the course was set.

By May 1943, what came to be known as the Manhattan Project was firmly established. Leading U.S. physicists, including immigrants who had fled Europe, met in 1942 at the University of California, Berkeley, for what should be the most famous summer study ever. They too concluded a uranium bomb would work. A nuclear chain reaction, the explosive heart of the bomb, was demonstrated at the University of Chicago on December 2, 1942. Hanford, Washington, had been selected as the site to make plutonium on an industrial scale. Los Alamos, high on an isolated New Mexico mesa, was selected as the place where scientists would design the bomb. It was a region that Robert Oppenheimer knew well.

On May 5, 1943, the Military Policy Committee, consisting of five men, chaired by Bush, met to review progress. It was, in effect, the Executive Committee of the Manhattan Project. James Conant, President of Harvard, was Vice-Chair. General Groves, who oversaw and coordinated the massive project on the ground, was a member, as were two other military personnel, General Styer, an expert in logistics, and Admiral Purnell, Deputy Chief of Naval Operations for Materiel. The Committee did not have a single general or admiral in charge of actually prosecuting the war that was raging. In fact, almost to a man, the generals and admirals in the field did not know about the Manhattan Project.

It was on that fateful day, two years before the end of the war in Europe, that the Military Policy Committee decided that Germany would NOT be targeted; the target selected was the Japanese fleet stationed at the Pacific island of Truk. Manhattan Project scientists, other than Bush and Conant, continued to labor under the idea that the bomb would serve to deter Germany or perhaps be used on it. Bush had decided not to inform them. When I interviewed several leading ones who were still alive in 1995, including Glenn Seaborg and Hans Bethe, none knew of the May 5, 1943 decision, even though the fact had by then been public, in fine print, for decades.

May 5, 1943 was the first specific decision to use the bomb; it was the first targeting decision. Germany was not targeted since it might reverse engineer the bomb if it were a dud. Japan was thought to be less likely to do so. The targeting of the fleet at Truk rather than Tokyo was an added precaution — a dud would sink and be hard to recover. From May 5, 1943 onward, the use of the bomb was all about the Pacific theater and, from 1944, Japan itself.

On September 18, 1944, FDR and Churchill agreed that “when a ‘bomb’ is finally available, it might perhaps, after mature consideration, be used against the Japanese, who should be warned that this bombardment will be repeated until they surrender.” The aide-memoire makes no mention of Germany. In that same time frame, logistical preparations were made in the Pacific theater to use the bomb on Japan. There were none in the European theater.

By December 1944, the Manhattan Project spy mission, Alsos, had enough information to conclude that Germany’s efforts “to develop a bomb were still in the experimental stages,” as Groves wrote in his memoir, Now It Can Be Told. The war was coming to a close; the Soviets were well into Eastern Europe. Paris had been liberated months before. Joseph Rotblat, a scientist at Los Alamos, decided his job was done; there was no German bomb threat. He quit. He was the only one.

The Project itself was accelerated so the bomb could be used before the war in the Pacific ended. Bush was joined by others, notably General Groves, in his determination that weapons developed during the war should be used in the war. A principal motivation was to justify the use of immense resources for the atom bomb project; that meant showing that the bomb had played a big role in ending the war and saving the lives of US armed forces personnel by preventing an invasion.

The last decisions were made in May 1945; they were to pick target cities in Japan. While some scientists opposed the use of the bomb on cities, their pleas did not reach President Truman. Jimmy Byrnes, soon to be his Secretary of State, explicitly rejected a similar plea from Leo Szilard, who had first conceptualized the chain reaction years before it was experimentally achieved. At the top, there was no serious consideration of whether the bombs should be used. It was just a question of when.

The answer: as soon as the bombs were ready and weather permitted — those dates were August 6 and August 9, 1945. The invasion of Kyushu was not due till November 1, 1945. The determination to use the bomb in World War II was realized by its early use. The role of the bombings in ending the war has increasingly come into question, with strong evidence pointing to the decisive role of the Soviet entry into the war on August 8, 1945. The Soviets had been neutral with respect to Japan before then. Japanese rulers, observing Eastern Europe, did not want to be occupied by the Soviets. So the Japanese submitted to the United States’ demands.

It is worth noting that, on April 23, 1945, as part of briefing materials for the newly installed President Truman, Groves wrote: “The target is and was always expected to be Japan.” (italics added). The decision in the direction of Japan and away from Germany had been made two years earlier, on May 5, 1943. Groves knew because he was a part of it. Finally, there is much evidence that the use of the bombs was, in significant measure, a message to the Soviets about who would run the post-war world. Bush too had achieved his objective — the bomb was used in World War II and the U.S. had announced itself as the preeminent global power.

Remembering Dave Freeman – green cowboy, pioneer of U.S. energy policy

It was 1970. Dave Freeman had transitioned from being an energy advisor in Johnson’s White House to Nixon’s. At one of our lunches after he had moved to Washington, D.C., having retired as Chairman of the Port of Los Angeles, he recounted a conversation with John Ehrlichman, Nixon’s assistant for domestic policy:

“Ehrlichman told me ‘Dave, you had better get out of here. Things are going to get very hot and nasty in the coming campaign [to re-elect Nixon]. This is no place for a Democrat like you.'”

Dave found a most interesting and, as it turned out, historic exit. He convinced the Ford Foundation to give him four million dollars (about twenty-five million in today’s money) to establish the Energy Policy Project within the Foundation. It would approach energy policy comprehensively; among other things, it would explore how much of the energy supply could be replaced by energy efficiency. The project would do its work and then disband. He asked for, and got, a free hand, though he did have a Board of Advisors, which included corporate chieftains like Donald Burnham, the Chairman of Westinghouse; luminaries from academia, like Carl Kaysen, Director of the Institute for Advanced Study, and Harvey Brooks, Dean of Engineering and Applied Sciences at Harvard; and, famously, William Tavoulareas, the president of Mobil Oil Company.

It was widely believed at the time that energy consumption growth and economic growth were closely coupled. Dave, an engineer and a lawyer, had other ideas. He thought the same economic growth could be achieved at various levels of energy growth, including zero energy growth, which was a truly revolutionary concept at the time. At the other end of the country, as a doctoral student at the University of California, Berkeley, I had discovered, with extensive but back-of-the-envelope calculations done for a two-credit seminar, that the common wisdom about closely coupled economic and energy growth seemed to be wrong. A much bigger economy could be supported by the energy that the United States was consuming. Dave, or one of his staff, noticed that work, which was published with my academic advisor Allan Lichtenberg, and read into the Congressional Record by the maverick senator from Alaska, Mike Gravel. That is how I, with my wild head of hair and my freshly minted doctorate in nuclear fusion, met Dave and moved to Washington, D.C. in November 1972.

His staffing idea was as gutsy as his substantive concept. Until the early 1970s, U.S. energy policy was mainly oil policy. But Dave felt oil companies had far too much influence, not only on energy but on political life in general. Indeed, much of the world’s politics was then dominated by what was known as the “Seven Sisters” – the Anglo-Iranian Oil Company, Shell, Standard Oil of New York, Standard Oil of New Jersey, Standard Oil of California, Gulf Oil, and Texaco. A major example was the U.S.-British orchestrated 1953 overthrow of the elected Iranian government of the time — an act designed to protect the interests of the Anglo-Iranian Oil Company that still haunts world politics and security.

Dave wanted his staff to be as sharp with numbers and analysis as any petroleum engineer drilling for oil; but he wanted open minds, free of oil industry cobwebs. He gave his (mostly) young staff a great deal of leeway. Besides the iconoclastic internal work, we also got to manage large external grants. In three years, the project published about twenty books on energy policy that covered the waterfront from economic modeling to demographics to industrial energy efficiency to nuclear proliferation to energy aspects of foreign policy to the energy implications of recycling steel and aluminum. I had the special privilege as a staff member to do my own research project (in addition to my normal work), not related to U.S. energy. That research was published in 1975 as Energy and Agriculture in the Third World; it achieved recognition in its own right, though in a rather specialized niche in Washington.

By the time of the October 1973 Arab Oil Embargo, occasioned by the Arab-Israeli War (aka the Yom Kippur War), the core technical analysis was mostly done; the main features of the energy scenarios were clear. Dave decided we would do an urgent preliminary report. Working day and night, the team did it in two months. Exploring Energy Choices, published in January 1974, became a selection of the Book-of-the-Month Club, which distributed half a million copies and put the Energy Policy Project on the Washington map.

The Deputy Director of the project was going to send out for chicken sandwiches for the celebratory lunch. When I protested that the staff deserved better, Dave let me order it — and gave me no instruction as to the budget. I called one of the best French restaurants in town – alas, I have forgotten its name; but I do remember we had a 1966 St. Emilion grand cru to accompany the boeuf bourguignon served on fine china by liveried restaurant staff in our very own conference room at our very memorable address: 1776 Massachusetts Avenue, Northwest. Dave was shocked by the tab but said not a word to me then. Years later he told me he had decided to send the invoice quietly along to headquarters, figuring it would not be noticed as unusual in the Foundation’s Executive Suite (headed at the time by McGeorge Bundy). It wasn’t. Among the project’s staff, I am remembered not so much for my technical work as for ordering that lunch. Dave liked to share that story too.

Dave sent our final report, A Time to Choose: America’s Energy Future, to every governor, among others. It caught the eye of the Governor of Georgia, a nuclear engineer named Jimmy Carter. It became, as Dave wrote later, “the foundation of President Carter’s energy policy.”

In the years that followed, Dave, first as a senior Senate staffer and then as part of the Carter administration, shepherded some of our most important recommendations into policy and law. Our recommendation on vehicle fuel economy became the Corporate Average Fuel Economy regulations, better known as the CAFE standards. Intensified renewable energy research and development had been one of our energy supply recommendations. The Solar Energy Research Institute had been authorized in law in 1974; it opened in 1977 and was later broadened to become the National Renewable Energy Laboratory. The 1978 Public Utilities Regulatory Policies Act (PURPA), which opened up utility-owned transmission and distribution wires to non-utility power, also had its roots in A Time to Choose. The greatest impact of that law lay far into the future. It has allowed large amounts of non-utility power — solar, wind, co-generation — to be carried (for a charge) on utility-owned wires.

In a few short years, Dave Freeman, the Green Cowboy, had gone from being an obscure White House staffer with a Tennessee drawl to being the visionary progenitor of energy policy in the United States — energy policy that really was public policy, and not dressed up petroleum company policy. His vision that energy growth could be decoupled from economic growth became a reality: from 1973 until the heyday of the Reagan years in the mid-1980s, the economy grew at an average annual rate of 2.8 percent; energy use growth was essentially zero — less than 0.1 percent a year.

A large part of the work of the Energy Policy Project was informed by Dave’s public power ethos and the notion that, while private capital had its place, the influence of corporate power, and especially oil company power, on public policy needed to be curbed. And our report said so. Tavoulareas thought the project had greatly exceeded its charter — and said so. But Dave was a man to write his own charter. That gave those of us on his staff the chance to be a part of the history he made.

In 1978, President Carter appointed Dave to be the Chair of the Tennessee Valley Authority. He loved the idea and reality of public power, in a way that only someone who grew up in Tennessee during the Depression could. TVA had built dams and power plants and irrigation canals; it had lighted up the back roads of the country. The New Deal, spearheaded by a government determined to alleviate unemployment and suffering, had shown that government could stand up to corporate power and be an enlightened force for the public good. In contrast, Wall Street had largely opposed FDR’s proposals to hike income tax rates and his abandonment of the gold standard; the latter action was the monetary foundation of the New Deal.

He continued to make history at the TVA. By 1978, the agency had become something of an adjunct to the nuclear industry. Fourteen nuclear power reactors were being built at the same time. Dave asked me to come to Tennessee and help him put an energy efficiency program in place. He sent me to the power planning division in Chattanooga. It was soon very obvious to me that the division was not facing up to the fundamental changes in the energy landscape since 1973. Nationally, the growth rate of electricity was only about half of what it had been. On top of that, TVA was facing the loss of its largest single user, the federal government’s World War II uranium enrichment plant in Oak Ridge, Tennessee, which was to be shut down. None of that had been properly factored in.

I reported to Dave that there would be a vast surplus of electricity even without efficiency if TVA did not cancel at least eight of the 14 reactors under construction. Continued construction of all 14 would mean spiraling electricity costs to pay for idle reactors generating no revenue, hurting households and businesses. It was a Herculean task, but he did succeed in cancelling those reactors. He is now remembered rightly for his advocacy of solar energy and efficiency at the TVA, as at the other public power agencies he led after his TVA tenure.

A vignette, recounted to me over lunch — lunch seems to have been a theme in our relationship — showed one of his most admirable sides: his integrity. The Clinch River Breeder Reactor – supposed to make more plutonium than it used as a fuel, a design long dreamed of by nuclear engineers — was being built in Tennessee. Billions had already been spent around the world since the early 1950s to try and commercialize the design, to no avail. Costs of the Clinch River reactor had skyrocketed and the project was in trouble in Congress.

Senator Howard Baker of Tennessee, who had become the Majority Leader of the Senate in 1981, wanted the project completed. He asked Dave to go to bat for it. But Dave, the son of an umbrella repairman, stood up to Baker, arguably the most powerful person in the U.S. Congress at the time. He said no. Dave believed that that project was bad for the TVA and bad for the country. He was right. Nearly four decades, and tens of billions of dollars more spent worldwide, after his refusal, the design still has not been commercialized. In my view, its prospects remain miserably dim.

Dave led several public power agencies after TVA — the Lower Colorado River Authority in Texas, where he acquired his signature cowboy hat; the New York Power Authority; the Sacramento Municipal Utility District, which he saved from itself by shutting down its costly nuclear power plant; and the Los Angeles Department of Water and Power. As a utility executive with a reputation for making tough decisions, he could easily have led an investor-owned utility and made oodles of money — far more than he made in public power. But over all the decades I knew him, I never once heard him even mention that possibility. The New Deal for him meant serving public power with integrity and competence; it was in his Depression-era DNA. He had enough to live well. He did not want more for himself; he wanted more for us all — clean air; an affordable, renewable, efficient energy system; government with integrity, not beholden to corporate power. As much as anything, that made him a very great man.

I always felt that he was not as renowned as he should have been for being a leading pioneer of U.S. energy policy, as the man who, well before the 1973 energy crisis, dared to think economic growth could be decoupled from energy growth and then played a central role in making it happen. I used to joke with him that he was a bad salesman. Were he better, “Green Cowboy,” hat, drawl, and all, would long ago have become a widely celebrated brand, a green rival to the tiger-in-the-tank.

In 2006, Dave and I were at an energy conference organized by the famous physician and nuclear disarmament leader, Dr. Helen Caldicott. During a break, with Helen listening, he said “Arjun, I think we should get rid of oil and coal and nuclear and go to solar energy.”

I reacted sharply and noted that solar was very costly; his idea could create very serious problems for the economy.

His rejoinder was blunt: “You’re just being a knee-jerk naysayer. When is the last time you seriously looked at the energy landscape?”

I had to admit that it had been a while. Helen urged me to do the research. “I’ll raise the money for you,” she promised. She did; and I did. Both she and Dave were on my Board of Advisors for that project. When it ended, I concluded that Dave was right on both counts. First, my response in 2006 had indeed been a knee-jerk reaction. Second, while it would be very difficult, a renewable energy system was feasible in the United States. The result of that effort was a book: Carbon-Free and Nuclear-Free: A Roadmap for U.S. Energy Policy. It was the first assessment of the feasibility of a renewable energy economy in the United States. A feather in Dave’s hat; I had been the numbers vehicle for his inspiration.

That conclusion has held up. The only change, based on my most recent work on the topic, Prosperous, Renewable Maryland, is that I think it won’t be as difficult to get there. Solar and wind are now the cheapest sources of electricity. We have the technology to deal with their intermittency. There have been breakthroughs in batteries for electric vehicles. Dave also wrote about the new technical realities and prospects in a 2016 book, An All-Electric America. The requirement for political guts to take on fossil fuel corporate power and for a vision grounded in technical reality has not changed.

Dave and I spoke a few weeks ago in late March – me at home in suburban Maryland, and he at his daughter’s in suburban Virginia; our lunch had been derailed by the new coronavirus. We spoke of the pandemic and the possibility that the moment might serve to make the world more in harmony with nature, more sustainable, one in which living well was joined with a notion of enough. I wanted to engage him in that conversation that day.

“It’s too early,” he said decidedly. “It’s too hard to see the outlines of things to come. Let’s wait till we can meet for lunch and talk.”

It is a lunch that I will have to eat without Dave. A heart attack has snatched him from us at a moment perhaps more pregnant with potential than the 1973 oil crisis. I will try to channel his visionary spirit and determination to serve the public purpose and meld them with my own nascent ideas.


The Nagasaki Cross and Walter Hooke, World War II veteran

I interviewed World War II veteran Walter Hooke at his home in New York State in 2002. I got to know him when he wrote to me supporting my idea of establishing a Truth Commission on the health and environmental damage done by nuclear weapons, including from their production and testing across the world — not only in the nuclear weapon states but also in other places like Polynesia, Algeria, Australia, Kazakhstan, and the Marshall Islands, where weapons were tested, and places like Congo, Namibia, and Canada, which supplied much of the uranium.

He was a concerned citizen in the very best sense of the term — his compassion and concern were global, but he was especially worried about the United States. He was an activist for democracy, for equality, for workers’ rights, and for peace. He was among the US troops sent to occupy Nagasaki after that city had suffered an atomic bombing. He was posted there at the end of October 1945, more than two-and-a-half months after the August 9, 1945 bombing.

Nagasaki was not actually the intended target, but as it happened the target city, Kokura, had too little visibility; to its misfortune, Nagasaki had a break in the clouds and was destroyed. It wasn’t the first choice because it had already been bombed with other weapons. One of the Target Committee’s criteria for target selection was to bomb an intact city so the impact of the atom bomb could be evaluated more accurately.

“At the time everyone was relieved that the war was over,” he told me. “But once you saw what happened you wondered how you could do something like that.” He was referring to terrible destruction and suffering, of course.

“The first troops that went in the occupation went in on 22 September [1945] and it was a lot worse then than in October. But in October there were still people walking around with their skin hanging. But we did not run into the terrible odors and everything. It was just an awful mess.”

Walter was very upset that General MacArthur had ordered the opening of “houses of prostitution”; he wrote a protest letter to the Secretary of the Navy, James Forrestal. He met and befriended Paul Yamaguchi, the Bishop of Nagasaki, which was demographically Japan’s most Christian city before the bombing. He “crawled all over” the destroyed cathedral to salvage things. Bishop Yamaguchi gave him a cross from that cathedral as a gift. He sent it home to his mother; it was eventually gifted to the Peace Resource Center of Wilmington College in Ohio. This year, 2019, the Director of that Center, Tanya Maus, returned that cross to the City of Nagasaki in an August 7 ceremony there. Walter died in 2010.

We had an extraordinary conversation. He mentioned his niece, Sister Megan Rice, who protested the training of Latin American military personnel at the School of the Americas in Fort Benning, Georgia, now called the Western Hemisphere Institute for Security Cooperation. Two years after Walter’s death, Sister Rice was part of a three-person Plowshares group that broke into the Oak Ridge nuclear weapons complex and splashed blood on the Y-12 complex, the location for fabricating the uranium parts of nuclear weapons, as part of a non-violent protest.

On a broader note, he wondered what had gone wrong in the United States after World War II. He thought World War II was fought for democracy; yet workers in the United States could not freely express their views. He thought “a lot of it takes off from the Manhattan Project.” It was, he said, the marriage of “deep science and deep secrecy.”

Oil, Pearl Harbor and the Atomic Bombings of Hiroshima and Nagasaki

The Japanese attack on Pearl Harbor in December 1941 was mainly about oil — Indonesian oil. And oil is still at the center of global insecurity, with climate disruption as an added danger; in fact, there is a lot of overlap in the cast of characters. First, a few preliminaries on why I am saying that in the context of the atomic bombings of Hiroshima and Nagasaki.

Every August, on the anniversaries of the atomic bombings of Hiroshima and Nagasaki, U.S. public opinion divides sharply into two camps. One side, pointing to the Japanese attack on Pearl Harbor and the ferocity with which the Japanese militarists prosecuted the war, believes the atomic bombings were justified because they ended the war and saved half a million allied soldiers’ lives (as President Truman claimed in his memoir). The other, pointing to official statements, including that of General Leslie Groves, who directed the Manhattan Project, says the war was essentially over and the atomic bombings were mainly a message to the Soviets about the postwar world order in the context of a U.S. atomic monopoly. (The Soviets were U.S. allies in World War II; nonetheless, Groves had said “There was never from about two weeks from the time I took charge of this Project [in 1942] any illusion on my part but that Russia was our enemy.” He further said that “The Project was conducted on that basis.” See A World Destroyed, p. 62.)

My own view of the bombings is somewhat more complex; it is laid out in an August 2012 talk I gave in Santa Fe, New Mexico. In this blog post I want to explore the connections between events before Pearl Harbor and subsequent events, including the atomic bombings. I do think that, when all is said and done, the bombings were unjustified, even in a war in which all sides killed civilians in large numbers. The timing of their use is a critical factor. The atom bombs were used as soon as they were ready, without waiting to see if the Japanese would surrender upon Soviet entry into the war. That this was likely to happen was a widely held view among the Allies, including President Truman. The U.S. invasion of the main islands of Japan was not due to start till November 1, 1945. The bombings, of course, happened on August 6 and 9, 1945. The Soviets declared war on Japan on August 8, 1945.

Since about the start of the twentieth century, Japan’s ambition had been to become an imperialist global power, an Eastern counterpart of Britain, another island state and long a global imperialist power. Japan waged war on Russia and won. It conquered Korea.

But Japan had no oil. (Neither did Britain. But Britain, along with its Soviet ally, had already invaded Iran in August 1941 to secure wartime oil supplies.) As is well known, oil had become central to modern war machines – for ships, tanks, aircraft…. No oil effectively meant no capacity for conquest on a continental scale.

The U.S. was at that time an oil exporter and a principal supplier to Japan. But as Japan expanded its war into China and Southeast Asian countries (themselves ruled by the French and the British until Japanese occupation), tensions with the United States rose. The United States itself nurtured Pacific region ambitions as evidenced by the conquest of the Philippines and the overthrow of the Hawaiian queen decades earlier. In mid-1940, the United States moved its Pacific fleet from San Diego to Pearl Harbor. It also eventually embargoed oil exports to Japan.

The Japanese militarists now had a choice: they could occupy Indonesian oil fields and continue their conquest of China, Southeast Asia, and South Asia; or they could give up their imperialist ambitions. They chose the first course; the U.S. fleet at Pearl Harbor stood in the way. The United States had known that the fleet might be a target when it was moved — indeed, Admiral James O. Richardson was removed from command in February 1941 because he opposed the move, feeling the fleet would become vulnerable to attack. Admiral Husband E. Kimmel was installed in his place. Ironically, he had written in that very month that “a surprise attack (submarine, air, or combined) on Pearl Harbor is a possibility…” He said he would take steps to “minimize damage” and make the attacker “pay.” He, in turn, was relieved of his post ten days after the attack on Pearl Harbor.

It is one of the bizarre facts of history and a commentary on that time that the casus belli in 1941 was the United States trying to prevent Japan from getting at Indonesian oil, when Indonesia was a Dutch colony and Holland itself was occupied by the Nazis.

President Roosevelt made the decision to pursue the atomic bomb with vigor on October 9, 1941, almost two months before Pearl Harbor. Einstein, like many other scientists, did not want Hitler to have a monopoly of the bomb, which is why he wrote to President Roosevelt in 1939 recommending a bomb project. But that project had been languishing in the backwaters of war research until a British scientific effort, organized as the MAUD Committee, concluded in mid-1941 that a uranium bomb was feasible. At about the same time, Hitler invaded the Soviet Union; the Nazis advanced with stunning speed.

Vannevar Bush, an MIT engineering professor and vice-president, headed the Roosevelt White House’s Office of Scientific Research and Development. He was in charge of critical wartime military R&D well before the United States entered World War II in December 1941. One of his goals was to make sure that the scientific and technological efforts of the United States during World War II were available for use during that war. Until mid-1941 he had been more interested in radar than the atom bomb. But once he made up his mind about the bomb, he convinced Roosevelt to commit vast resources to it. The die was cast. If it worked, he would help ensure the bomb was used in World War II.

Preventing a Nazi atom bomb monopoly and nuclear blackmail had been a principal reason, and for many the only reason, to develop a U.S. atom bomb. But sights soon turned eastward. The first official discussion of an atomic bomb target was held by the Military Policy Committee of the Manhattan Project, on May 5, 1943. The Committee was headed by Vannevar Bush. His deputy was James Conant, the President of Harvard University. Groves was a member. On that date, they decided to not target Germany. They were nervous that if the bomb was a dud, Germany might reverse engineer it and use it. Germany’s expertise in matters nuclear was well-known; indeed the first fission reaction was the result of an experiment in Germany in 1938. While many brilliant German and other European physicists had emigrated to Britain and the United States, the Germans still had Werner Heisenberg, known worldwide as one of the most brilliant in an age of brilliant physicists.

The target would be the Japanese fleet based at the island of Truk in the Pacific Ocean, the Committee decided on May 5, 1943; if the bomb were a dud it would sink. Bush had already made a similar decision in relation to the “proximity fuze” — which could be called the first smart bomb. Bombs carrying the fuze were not to be used over enemy territory to prevent reverse engineering. It is a testament to Vannevar Bush’s power that, until the Battle of the Bulge at the end of 1944, they were not.

As confidence rose that the bomb would work, Japan itself became the target. In September 1944 Churchill and Roosevelt discussed the possible use of the bomb against the Japanese. Despite much research, I have found no document specifying targets in Germany or any documentation after May 5, 1943 about preparations to use the atom bomb in the European theater. Groves, in preparing Secretary of War Stimson for his late April 1945 briefing of the newly installed President Truman, wrote that “The target is and was always expected to be Japan”.

There is more to the story of course, as I recounted in my August 2012 talk. Suffice it to say here, as we struggle to create a fossil-fuel-free world and as the nuclear crisis among the U.S., Britain, and Iran heats up, that oil, Pearl Harbor, Hiroshima and Nagasaki are part of a deadly, tangled dance that is not yet over.

Dear Arjun: Can we generate electricity from nuclear waste?

My friend Hank is involved with a summer science program that he attended and loved many years (decades) ago. He apparently posted something on the group’s FB page recently suggesting there is no viable plan for storage of nuclear waste and got this response (in part):

“We’ve known how to effectively destroy nuclear waste for fifty years, and refuse to do it. Read “Smarter Use of Nuclear Waste” in December 2005 Scientific American. Read “Plentiful Energy” by Charles E. Till and Yoon Il Chang….The stuff we call “nuclear waste” is actually valuable 5%-used fuel — 5% fission products and 95% future fuel. Future fuel needs custody for 300,000 years, which is madness to contemplate. Fission products, unseparated, need custody for 300 years — a trivial problem….. Of course, the American energy economy won’t be all nuclear, so the amount would be less. I don’t foresee a breakthrough in storage, so the options are nuclear, coal, and gas — if firm power is desired. Take your choice.”

Hi Hank:

Sigh. This is one of the claims that keeps on coming back. But here’s one thing (almost) right in it:

Pressurized light water reactor spent fuel is about 93% U-238, which is not a fuel itself but which can potentially be turned into plutonium, which is; about 1% U-235, which is fissile; and about 1% plutonium, which can (mostly) be used as fuel. The rest, roughly 5%, is fission products, though there are also some minor actinides: neptunium-237 (half-life 2.14 million years) and americium-241 (half-life 432 years).

To convert non-fuel U-238 to Pu-239 fuel you need a “breeder reactor,” which produces more fuel than it uses: the nuclear dream. Alvin Weinberg, nuclear physicist and inventor at Oak Ridge National Lab, called it a “magical energy source” that would need a priesthood to guard the waste and the bomb-usable fissile material; he famously called this the “Faustian bargain.”

The sodium-cooled reactor is an efficient breeder reactor in that it produces excess plutonium in a shorter time than other types of breeders. Roughly $100 billion has been spent globally over nearly seven decades to try to commercialize it. The International Panel on Fissile Materials did a good global survey of the technology in 2010. In fact, two of the most recent demonstration sodium-cooled breeders (Superphenix in France and Monju in Japan) were among the worst operating of the lot; they are both shut. Some sodium-cooled breeders have operated well, others with problems, and yet others have had accidents (like Monju) and/or been failures (like Superphenix, with a lifetime capacity factor of about 7 percent — one-half to one-fourth that of a typical solar installation, depending on where it is). In other words, after $100 billion, it is still not commercial. Not that the French and Japanese have given up. They are working on the “ASTRID Project” – the target opening date is in the 2030s. If successful, it would be the 2040s and 2050s before such reactors could be deployed in significant numbers. Got time to solve the CO2 problem?
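
For readers not used to the jargon: capacity factor is just the energy a plant actually generated divided by the energy it would have generated running flat out around the clock. Here is a minimal sketch of that arithmetic, using rough, assumed round numbers of about the right scale for Superphenix (my illustrative assumptions, not figures from this post):

```python
# Capacity factor = actual generation / (rated capacity x hours in the period).
# The inputs below are assumed round numbers for illustration only.
rated_mw = 1200          # assumed rated capacity, MW
years_connected = 11     # assumed years connected to the grid
generated_mwh = 8.0e6    # assumed lifetime output, about 8 TWh
potential_mwh = rated_mw * years_connected * 8760
print(f"lifetime capacity factor ~ {generated_mwh / potential_mwh:.1%}")  # ~6.9%
```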

My report Plutonium End Game might be helpful, as might my report on reprocessing, and one on a sodium-cooled breeder that Bill Gates likes — the so-called “traveling wave reactor.”

Even if they worked reliably and consistently from one reactor to the next, sodium-cooled breeders are more expensive than current light water reactors, which, in turn, are ~2 times the cost of wind and solar. Then there are the proliferation concerns. We currently have more separated bomb-usable plutonium in the commercial sector than in all the nuclear weapons in all the nuclear weapon states combined, enough for ~30,000 bombs or more. In addition, reprocessing, needed to recover the extra plutonium and fabricate fuel from it, is also costly and dirty. The French (at La Hague) and British (at Sellafield) reprocessing plants have polluted the oceans all the way to the Arctic and caused neighboring governments to ask them to stop the discharges of radioactive liquids into the seas.
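
As a rough check on the bomb-equivalent arithmetic, here is the calculation written out; the stockpile figure and the per-bomb quantity are round numbers I am assuming for illustration, not figures from the original post:

```python
# Assumed: ~300 metric tons of separated civilian plutonium worldwide,
# and ~8 kg of plutonium counted as one bomb's worth (the IAEA
# "significant quantity"). Both are illustrative round numbers.
separated_civilian_pu_kg = 300 * 1000
kg_per_bomb = 8
print(f"~{separated_civilian_pu_kg / kg_per_bomb:,.0f} bomb equivalents")  # ~37,500
```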

The theory behind it, and the gleam in the nuclear engineers’ eyes in the 1950s, was that this could turn non-fuel U-238 (99.3% of natural uranium) into fuel. Uranium was thought to be scarce, and this would give us an inexhaustible energy source for hundreds of thousands of years. But uranium turned out to be plentiful and cheap – it is a small part of the cost of nuclear power. So the economic rationale was gone.

And now solar and wind are much cheaper than nuclear. My hour-by-hour modeling shows that solar plus wind can provide reliable supply at affordable cost, using a balanced solar and wind portfolio, storage, demand response with a smart grid, and peaking generation using hydrogen made when there is no other use for solar and wind electricity. See my 2016 report, Prosperous, Renewable Maryland. Moreover, storage is not needed until solar and wind penetration reaches a much higher level than at present. Costs of storage have come down. A recent commercial solicitation in Arizona for meeting peak load resulted in solar plus battery storage beating out natural gas turbines. The idea that nuclear is needed is as obsolete as the technology. The plutonium in existing waste is just 1% of the spent fuel and hard to separate. It can’t be used to make bombs if it is left there. We don’t have to make a Faustian bargain to have reliable, clean and affordable electricity.
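
To make the hour-by-hour idea concrete, here is a minimal, purely illustrative sketch of the kind of bookkeeping such a model does each hour: meet load with solar and wind first, then storage, and cover any remaining gap with hydrogen-fueled peaking. The load, solar, and wind profiles and the storage size are invented for this example; they are not from the Maryland study.

```python
# Toy hourly balance: renewables first, then battery, then hydrogen peaking.
# All numbers are hypothetical, for illustration only.
load  = [50, 55, 70, 80, 75, 60]   # MW, assumed demand
solar = [ 0, 20, 60, 70, 30,  0]   # MW, assumed output
wind  = [30, 25, 10, 15, 25, 40]   # MW, assumed output
storage_energy, storage_cap = 40.0, 100.0   # MWh, assumed battery

for h in range(len(load)):
    net = solar[h] + wind[h] - load[h]          # surplus (+) or deficit (-)
    if net >= 0:
        charge = min(net, storage_cap - storage_energy)
        storage_energy += charge
        surplus = net - charge                  # would go to hydrogen production
        print(f"hour {h}: surplus of {surplus:.0f} MW sent to electrolysis")
    else:
        discharge = min(-net, storage_energy)
        storage_energy -= discharge
        gap = -net - discharge                  # met by hydrogen-fueled peakers
        print(f"hour {h}: {gap:.0f} MW supplied by hydrogen peaking")
```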

People got too excited based on the physics; but that’s just the starting point. The technology has to work consistently; it has to be affordable; and its other attributes should not pose dire risks — like CO2 from fossil fuels or plutonium from breeders. Every commercial nuclear reactor, ~1,000 MWe, produces about 30 bombs’ worth of plutonium each year, if it is separated from the spent fuel. Nuclear, in the end, is making plutonium just to boil water.
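
The reactor figure can be checked the same way; again, the kilogram numbers below are assumed round values for illustration, not data from this post:

```python
# A ~1,000 MWe light water reactor discharges on the order of 20-25 tonnes
# of spent fuel a year at roughly 1% plutonium, i.e. roughly 230 kg of
# plutonium (assumed); ~8 kg is counted as one bomb's worth (assumed).
pu_discharged_kg_per_year = 230
kg_per_bomb = 8
print(round(pu_discharged_kg_per_year / kg_per_bomb), "bombs' worth per year")  # ~29
```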

The 300-year waste claim is wrong. I-129 is one of the fission products: half-life ~16 million years; it is one of the more troublesome ones for a repository. Then there is Cs-135, half-life 2.3 million years; and technetium-99, half-life ~212,000 years. In addition there are the above-mentioned minor actinides.
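
The decay arithmetic behind that statement is simple: the fraction of a radionuclide remaining after a given time is 0.5 raised to the power of (elapsed time / half-life). A short sketch, adding Cs-137 (half-life about 30 years, a value I am supplying here) to show why 300 years does take care of the short-lived fission products while barely touching the long-lived ones:

```python
# Fraction of a radionuclide remaining after 300 years of "custody".
def fraction_remaining(half_life_years, elapsed_years=300):
    return 0.5 ** (elapsed_years / half_life_years)

for name, half_life in [("Cs-137", 30.2), ("Tc-99", 212_000), ("I-129", 16e6)]:
    print(f"{name}: {fraction_remaining(half_life):.4%} remaining after 300 years")
# Cs-137 is down to ~0.1%, but Tc-99 and I-129 are essentially all still there.
```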

There is another breeder reactor that has many fans: the Liquid Fluoride Thorium Reactor (LFTR). It is promoted by a set of folks who feel very cheated that this approach (Alvin Weinberg’s brainchild) was rejected in favor of the sodium-cooled breeder in the late 1960s or early 1970s. In my view, the LFTR is the most proliferation-prone of the various breeder approaches, in that it could give more countries the capability to acquire nuclear-weapons-usable material with less difficulty than other breeder designs. For one thing, the separation plant for the fissile material (in this case uranium-233) would be located at every reactor. The precursor of U-233, protactinium-233, has a half-life of 27 days. Thus, it is available for chemical separation for much longer than the precursor of plutonium-239 (neptunium-239, half-life 2.4 days). Chemical separation of Pa-233 would yield bomb-usable U-233 after the Pa-233 decays.
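
The difference those half-lives make can be seen with the same decay formula, using the two values cited above:

```python
# Fraction of each precursor still present (and thus available for
# chemical separation) after a given number of days.
precursors = {"Pa-233 (decays to U-233)": 27.0, "Np-239 (decays to Pu-239)": 2.4}
for days in (10, 30):
    for name, half_life_days in precursors.items():
        left = 0.5 ** (days / half_life_days)
        print(f"after {days:2d} days, {name}: {left:.1%} remaining")
# Pa-233 lingers for weeks; Np-239 is essentially gone within days.
```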

While nuclear-weapon states like the United States or Russia would not go through the bother, an aspiring nuclear weapon state might be tempted to separate Pa-233 and acquire weapons-usable U-233. Policing would be difficult, since there would be so many reprocessing plants (if this reactor came into widespread use). The fission products are in fluoride form, which poses more difficult long-term management challenges than the oxide form of current spent fuel. A pilot reactor of this type, 8 megawatts thermal, was built at Oak Ridge. It operated reasonably well but was never used to generate electricity or to make more fissile material — that wasn’t the purpose. It cost much less than $100 million in today’s dollars. The decommissioning is estimated to cost north of $400 million — this for a reactor that was about a quarter of one percent the size of a commercial nuclear power plant (~3,300 MW-thermal, or ~1,000 MW-electrical). See a debate I did on Science Friday with a proponent and make up your own mind.

You were essentially right — there is no good solution for the problem of spent fuel. The least bad approach to long-term management is disposal in a suitably designed geologic isolation system (the National Academies did an excellent report on this in 1983). It would also be sensible to transition to renewable energy so that we do not burden future generations even more with the waste from our economic activities.

The Atom Bomb visits Bikini — 1946

The tests at Bikini in July 1946 were the coming out party for the atom bomb. Operation Crossroads began just two weeks after the United States presented the so-called Baruch plan to control the Bomb: The U.S. would give up its weapons only after it was sure no one else had them (or any other “weapon adaptable to mass destruction”). The United States would have the veto-free right to punish anyone that it thought was cheating. Operation Crossroads demonstrated what the “condign punishments” administered by the U.S. might look like, as up close and personal as a bomb explosion would permit.

The U.S. Pacific fleet was there, along with 42,000 armed forces personnel. There were US and foreign dignitaries and journalists to report on the proceedings. But it was no tropical picnic for the armed forces personnel. Col. Stafford Warren noted with dismay the “hairy-chested approach” of many naval officers to the “unseen hazard” of radiation. In that spirit, after the first test (Test Able), an 18-year-old sailor, John Smitherman, and others were ordered to fight a fire on one of the target ships that were stationed in the lagoon to assess the effects of the bomb. Afterwards, to cool off, he (and others) jumped into Bikini lagoon; he did not know, and no one told him, that the lagoon was intensely radioactive, especially due to activated sodium (sodium-24), a powerful beta-emitter. He died in 1983 of lymphatic system cancer, a signature cancer of such exposure. The Veterans Administration repeatedly denied his claims that his illness was connected to his service.

The second test, Test Baker, was a more generalized radiological disaster. The bomb was under a barge; the explosion sent a million tons of radioactive spray into the air. Smitherman experienced some of the fallout. The ships of the fleet were taken into Bikini lagoon; all were contaminated.

The risks of significant exposure were widespread, from sailors scrubbing decks to meat being washed with contaminated seawater. There were no instruments to measure plutonium in the field. Yet even decades later, in the early 1980s, the Defense Department claimed that internal exposure was insignificant.

A study I did with David Albright on the radiological conditions at Bikini (initiated by my friend Bob Alvarez) was presented to the House Veterans Affairs Committee by Karl Z. Morgan, who was present at Operation Crossroads. He was one of the founders of the discipline of health physics — assessing the risks of radiation to health. The study was based on documents from the archive of the Chief of Radiological Safety during the tests, Col. Stafford Warren. The documents were brought to Bob by Anthony Guarisco, who was a veteran of the 1946 Bikini tests.

The report created quite a stir in part because the director of the Defense Nuclear Agency testified that he was not aware of the Stafford Warren documents. In effect, the government had come to its conclusions that Smitherman’s cancer, and those of so many other atomic veterans, were not related to their service without consulting the data of the chief of radiological safety at Bikini. Subsequently, in 1990, the US government passed the Radiation Exposure Compensation Act to compensate armed forces personnel who participated in atmospheric testing, among others (including uranium miners and certain “downwinders” who lived downwind from the Nevada Test Site).

Operation Crossroads tragically established a pattern of all nuclear weapon states harming their own people in the name of national security — and doing so without informed consent. In fact, General Groves, who oversaw the making of the bomb during the Manhattan Project, was fearful of claims being filed by participants in the Bikini tests.

Democracy in the nuclear age – remembering Bill Mitchell

My dear friend Bill Mitchell died on May 25: a grievous loss for his family and friends as well as for all those who care about the Earth and about democracy. The story of Bill’s profound impact on my life and on the Institute for Energy and Environmental Research is worth recounting: as an example of the immense impact his life and work had on millions, yes millions; as a reminder that we are again in the midst of rising nuclear dangers in the name of national security; and, not least, as a portrait of the most extraordinary man he was.

As the Cold War began to thaw in the mid-1980s, Bill had the idea of bringing together democracy and environmental activists concerned about the dangers posed by Hanford (in Washington State), where the plutonium for the 1945 Nagasaki bomb and for much of the U.S. Cold War nuclear arsenal had been produced. Dark secrets lay there: tanks of highly radioactive liquid waste at risk of exploding; plutonium-laden waste dumped in open trenches; a 1949 experimental release of intensely radioactive iodine-131, known as the “Green Run”…. Similar risks lurked throughout the country in the nuclear weapons complex, owned by the U.S. government and operated with complete immunity from liability by multinational corporations, like DuPont and General Electric, and universities, with the University of California the most prominent among them.

His inspiration was to bring together community leaders from the places most directly affected by nuclear weapons production and testing. The demand for information about what the weapons made in the name of national security were doing to the land, water and people was simple enough. Nuclear weapons are by their nature antithetical to democracy. As (then) Deputy Secretary of Energy, W. Henson Moore, said in 1989, nuclear weapons production had been “a secret operation not subject to laws…no one was to know what was going on….[T]he way we’ve [the government and its contractors] operated these plants in the past…was: This is our business, it’s national security, everybody else butt out.” (as quoted in the Washington Post, 17 June 1989).

The notion that the public should “butt out” of activities that were laying land and water to waste, while endangering the very public the government claimed to protect, was offensive to Bill and to the core of activists who founded the Nuclear Safety Campaign, which went on to become the Military Production Network (MPN), now known as the Alliance for Nuclear Accountability (ANA).

Information about, and accountability for, the environmental and health impacts of nuclear weapons production and testing were at the center of MPN’s demands. The people who were being harmed in the name of national security had a right to know. A claim of national security was not enough.

Bill Mitchell, Sharon Carlsen, Tim Connor, Lisa Crawford, and others began to expose the nuclear weapons establishment’s chamber of horrors. It was the U.S. version of the glasnost that was taking hold in the Soviet Union in the Gorbachev era. Through demands for and analyses of environmental impact statements, Freedom of Information requests, and savvy outreach to Congress and the media, the Military Production Network fostered a startling public awareness: the nuclear weapons complex was, first of all, hurting the very people it was supposed to protect. Under cover of national security and the demand that everybody but the government and its contractors “butt out,” the nuclear weapons establishment had been polluting the land and rivers celebrated in patriotic stories and songs, while conducting experiments on its own people without informed consent.

But the most remarkable thing about Bill Mitchell and the Military Production Network was this: the microphone was in the hands of the leaders and activists who were from the communities where the nuclear weapons plants were located. Bill Mitchell never put himself out front. He never was the principal speaker at a press conference. He was never the spokesperson quoted in news releases. On the contrary, he was always in the background.

It could easily have been otherwise. After all, Bill was the strategic inspiration of the enterprise, its co-founder, and its chief fundraiser. But there was something in his nature that was profoundly democratic. He never thought of it as ceding control. It came naturally that Lisa Crawford, who lived near the Fernald plant in Ohio that processed half a million tons of uranium, and who was directly affected by pollution of the well from which she unknowingly used water for years, was one of the people who were front and center. And so were many other community leaders and activists. Bill’s stewardship of the group consisted in the empowerment of the grassroots activists and the groups who were directly affected.

Bill was at the center of creating that most unusual of organizations: one with influence in Washington whose reins were not in the hands of some central, inside-the-Beltway office. They were firmly in the hands of strong community leaders who knew their minds and who developed the friendship and mutual respect that resulted in a most remarkable accomplishment.

I was privileged to be a part of this, with IEER providing technical analysis and support as well as technical training workshops for community leaders that became the highlight of my professional year, each year for many years. The work that Bill began was at the center of my professional life for nearly two decades. It continues to inspire me.

By the mid-1990s, the nuclear weapons establishment was under relentless pressure: to justify new weapons production when it was clear that there were already too many nuclear bombs; to clean up the mess already made; to explain how human radiation experiments could have happened; and to put out estimates of the cancers and deaths caused by nuclear testing fallout throughout the country. Most of the production facilities in the nuclear weapons complex were shut. That included the plutonium and tritium production reactors, the Hanford plutonium separation plant, the Rocky Flats plant where thousands of plutonium triggers for nuclear weapons were manufactured on an assembly line, the Fernald uranium processing plant, and many others.

The idea of accountability to the people by the nuclear weapons establishment had spread to the former Soviet Union. Grassroots collaborations between U.S. and Russian community activists were established. The Soviet nuclear test site at Semipalatinsk was closed. Nuclear testing was ended by the United States, Russia, France, Britain, and China. The United States, under pressure from ANA, made public a National Cancer Institute study on fallout that estimated that between 11,000 and 212,000 thyroid cancers would be caused by atmospheric testing in Nevada alone; those tests ended in 1962.

Not that the nuclear weapons complex gave up. Even as thousands of weapons were being declared surplus to security requirements in the early 1990s, the U.S. Department of Energy was promoting plans for a spanking new nuclear weapons complex, called Complex 21. MPN/ANA made sure it was not to be.

The legacy of that shutdown of key facilities in the nuclear weapons complex, notably Rocky Flats, is of immense significance today as tensions, including nuclear tensions, rise between Russia and the United States. The Alliance for Nuclear Accountability is among the groups pointing to the folly of the proposed 30-year, trillion-dollar “upgrade” of the U.S. nuclear arsenal, aka the “Trillion Dollar Trainwreck”.

The work of the community activists and leaders is as important as ever, perhaps more so in the context of the climate crisis, where international cooperation to solve the problem is critical. Bill’s legacy lives on in rising demands worldwide for accountability and democracy; it has expanded to nuclear disarmament and to addressing the climate crisis without resorting to nuclear power plants.

But for me, the most important legacy of Bill’s life and work, at least in the arena I know best, was the humility, modesty, and fidelity to the principles of democracy with which he conducted himself when he conceived, launched, and coordinated the Military Production Network. We need him sorely today. I hope we can take inspiration from the brilliant and principled way in which Bill Mitchell lived, and loved, and gave of himself.
————-
Bob Schaeffer of Public Policy Communications, a long-time adviser to IEER and to MPN/ANA, helped with this remembrance of our mutual friend Bill.

Choosing the first atomic target – May 5, 1943

May 5, 1943 is one of the most important dates, and possibly the least known, in the history of the nuclear age. It was the date when the first atomic bomb targeting decision was made — a full two years before the end of World War II in Europe.

The Military Policy Committee, the de facto executive committee of the ultra-secret Manhattan Project, met that day to review progress. It also made the first decision regarding the use of the bomb: it should be used so that it would land in water if it turned out to be a dud. And Germany would not be targeted; it would be the Japanese, who it was felt “would not be so apt to secure knowledge from it as would the Germans” (quote from a summary of the May 5 meeting). [After all, one of the world’s greatest physicists, Werner Heisenberg, was living in Nazi Germany.] So the Japanese fleet at the Pacific island of Truk was selected as the first target of the bomb; it met the criteria.

Later, in the fall of 1944, when scientists were certain that the uranium bomb would work, planning began for the use of the bomb on Japan. It was helped by the fact that Saipan had been conquered by July 9, 1944; Japan could be directly bombed from there using B-29 bombers. As Groves stated when he briefed the newly installed President Truman in April 1945, “The target is and was always expected to be Japan.”

Many of the Manhattan Project’s scientists were European Jews motivated by a fear of a Nazi atomic monopoly. Ironically, they were completely unaware that Germany had been ruled out as a target two years before the war in Europe ended.

When the U.S. atomic spy mission, called Alsos, confirmed in early December 1944 that Germany had no viable bomb project, the project was not stopped; rather, it was accelerated. Joseph Rotblat was the only Project scientist who left at that time; he went on to win a Nobel Peace Prize for that and his later efforts to rid the world of nuclear weapons. Decades later, the great physicist Richard Feynman wondered why he had continued to work and had not thought about the fact that the purpose of the Project had been accomplished when the war in Europe ended.

A very real and reasonable fear of a Nazi nuclear monopoly had been central to Einstein’s recommendation that President Roosevelt initiate an atomic bomb project. Apparently no one thought to ask what might occur if the United States wound up with the atomic monopoly. The answer came, of course, on August 6 and August 9, 1945 when Hiroshima and Nagasaki were obliterated.

Like many, I have concluded that the bombings were unjustified, though that is an opinion far from universally held. But some of my reasons may surprise you. I explained them in a talk I gave in Santa Fe in 2012, entitled From Pearl Harbor to Hiroshima.
