Posted on February 3rd, 2013
I won’t parse all 150+ of them, but here are a few more that caught my eye.
…most scientists have conceded the high ground of determining human values, morals, and ethics to philosophers, agreeing that science can only describe the way things are but never tell us how they ought to be. This is a mistake.
It’s only a mistake to the extent that there’s actually some “high ground” to be conceded. There is not. Assuming that Shermer is not referring to the trivial case of discovering mere opinions in the minds of individual humans, neither science nor philosophy is capable of determining anything about objects that don’t exist. Values, morals, and ethics do not exist as objects. They are not things-in-themselves. They cannot leap out of the skulls of individuals and acquire a reality and legitimacy that transcends individual whim. Certainly, large groups of individuals who discover that they have whims in common can band together and “scientifically” force their whims down the throats of less powerful groups and individuals, but, as they say, that don’t make it right.
Suppose we experienced a holocaust of some kind, and only one human survived the mayhem. No doubt he would still be able to imagine what it was like when there were large groups of others like himself. He might recall how they behaved, “scientifically” categorizing their actions as “good” or “evil,” according to his own particular moral intuitions. Suppose, now, that his life also flickered out. What would be left of his whims? Would the inanimate universe, spinning on towards its own destiny, care about them one way or the other? Science can determine the properties and qualities of things. Where, then, would the “good” and “evil” objects reside? Would they still float about in the ether as disembodied spirits? I’m afraid not. Science can have nothing to say about objects that don’t exist. Michael Shermer might feel “in his bones” that some version of “human flourishing” is “scientifically good,” but there is no reason at all why I or anyone else should agree with his opinion. By all means, let us flourish together, if we all share that whim, but surely we can pursue that goal without tacking moral intuitions on to it. “Scientific” morality is not only naive but, as was just demonstrated by the Communists and the Nazis, extremely dangerous as well. According to Shermer,
We should be worried that scientists have given up the search for determining right and wrong…
In fact, if scientists cease looking for and seeking to study objects that plainly don’t exist, that would seem to me more a cause for congratulations all around than worry. Here’s a sample of the sort of “reasoning” Shermer uses to bolster his case:
We begin with the individual organism as the primary unit of biology and society because the organism is the principal target of natural selection and social evolution. Thus, the survival and flourishing of the individual organism—people in this context—is the basis of establishing values and morals, and so determining the conditions by which humans best flourish ought to be the goal of a science of morality. The constitutions of human societies ought to be built on the constitution of human nature, and science is the best tool we have for understanding our nature.
Forgive me for being blunt, but this is gibberish. Natural selection can have no target, because it is an inanimate process, and can no more have a purpose or will than a stone. “Thus, the survival and flourishing of the individual organism – people in this context – is the basis of establishing values and morals”?? Such “reasoning” reminds me of the old “Far Side” cartoon, in which one scientist turns to another and allows that he doesn’t quite understand the intermediate step in his proof: “Miracle happens.” If a volcano spits a molten mass into the air which falls to earth and becomes a rock, is it not, in the same sense, the “target” of the geologic processes that caused indigestion in the volcano? Is not the survival and flourishing of that rock equally a universal “good”?
Of the remaining “worries,” this was the one that most worried me, but there were others. Kevin Kelly, Editor at Large of Wired Magazine, was worried about the “Underpopulation Bomb.” Noting the “Ur-worry” of overpopulation, Kelly writes,
While the global population of humans will continue to rise for at least another 40 years, demographic trends in full force today make it clear that a much bigger existential threat lies in global underpopulation.
Apparently the basis of Kelly’s worry is the assumption that, once the earth’s population peaks in 2050 or thereabouts, the decrease will inevitably continue until we hit zero and die out. In his words, “That worry seems preposterous at first.” I think it seems preposterous first and last.
Science writer Ed Regis is worried about, “Being Told That Our Destiny Is Among The Stars.” After reciting the usual litany of technological reasons that human travel to the stars isn’t likely, he writes,
Apart from all of these difficulties, the more important point is that there is no good reason to make the trip in the first place. If we need a new “Earth 2.0,” then the Moon, Mars, Europa, or other intra-solar-system bodies are far more likely candidates for human colonization than are planets light years away. So, however romantic and dreamy it might sound, and however much it might appeal to one’s youthful hankerings of “going into space,” interstellar flight remains a science-fictional concept—and with any luck it always will be.
In other words, he doesn’t want to go. By all means, then, he should stay here. I and many others, however, have a different whim. We embrace the challenge of travel to the stars, and, when it comes to human survival, we feel existential Angst at the prospect of putting all of our eggs in one basket. Whether “interstellar flight remains a science-fictional concept” at the moment depends on how broadly you define “we.” I see no reason why “we” should be limited to one species. After all, any species you could mention is related to all the rest. Interstellar travel may not be a technologically feasible option for me at the moment, but it is certainly feasible for my relatives on the planet, and at a cost that is relatively trivial. Many simpler life forms can potentially survive tens of thousands of years in interstellar space. I am of the opinion that we should send them on their way, and the sooner the better.
I do share some of the other worries of the Edge contributors. I agree, for example, with historian Noga Arikha’s worry about, “Presentism – the prospect of collective amnesia,” or, as she puts it, the “historical blankness” promoted by the Internet. In all fairness, the Internet has provided unprecedented access to historical source material. However, to find it you need to have the historical background to know what you’re looking for. That background about the past can be hard to develop in the glare of all the fascinating information available about the here and now. I also agree with physicist Anton Zeilinger’s worry about, “Losing Completeness – that we are increasingly losing the formal and informal bridges between different intellectual, mental, and humanistic approaches to seeing the world.” It’s an enduring problem. The name “university” was already a misnomer 200 years ago, and in the meantime the problem has only become worse. Those who can see the “big picture” and have the talent to describe it to others are in greater demand than ever before. Finally, I agree with astrophysicist Martin Rees’ worry that, “We Are In Denial About Catastrophic Risks.” In particular, I agree with his comment to the effect that,
The ‘anthropocene’ era, when the main global threats come from humans and not from nature, began with the mass deployment of thermonuclear weapons. Throughout the Cold War, there were several occasions when the superpowers could have stumbled toward nuclear Armageddon through muddle or miscalculation. Those who lived anxiously through the Cuba crisis would have been not merely anxious but paralytically scared had they realized just how close the world then was to catastrophe.
This threat is still with us. It is not “in abeyance” because of the end of the Cold War, nor does the fact that nuclear weapons have not been used since World War II mean that they will never be used again. They will. It is not a question of “if,” but “when.”
Posted on October 22nd, 2012
We have passed the end of the fiscal year, and the National Ignition Facility, or NIF, at Lawrence Livermore National Laboratory (LLNL) failed to achieve its goal of ignition (more fusion energy out than laser energy in). As I noted in an earlier post about the NIF more than three years ago, this doesn’t surprise me. Ignition using the current indirect drive approach (most of the jargon and buzzwords are explained in the Wiki article on the NIF) requires conversion of the laser energy into an almost perfectly symmetric bath of x-rays. These must implode the target, preserving its spherical shape in spite of a very high convergence ratio (initial radius divided by final radius), and launching a train of four shocks, which must all converge in a tiny volume at the center of the target, heating it to fusion conditions. That will release energetic alpha particles (helium nuclei), which must then dump their energy in the surrounding cold fuel material, causing a “burn wave” to propagate out from the center, consuming the remaining fuel. It would have been a spectacular achievement if LLNL had pulled it off. Unfortunately, they didn’t, for reasons that are explained in an excellent article that recently appeared in the journal Science. (Unfortunately, it’s behind a subscriber wall, and I haven’t found anything as good on the web at the moment. You can get the gist from this article at Huffpo.) The potential political implications of the failure were addressed in a recent article in the New York Times.
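The convergence ratio just defined can be illustrated with round numbers. Here is a minimal sketch, assuming a NIF-class capsule of about 1 mm initial radius imploded to a hot spot of roughly 30 micrometers; both figures are assumed, illustrative values, not numbers from the articles cited:

```python
# Convergence ratio = initial capsule radius / final hot-spot radius.
# Both radii below are assumed, round-number values for illustration only.
initial_radius_um = 1000.0  # assumed initial capsule radius (micrometers)
final_radius_um = 30.0      # assumed final hot-spot radius (micrometers)

convergence_ratio = initial_radius_um / final_radius_um
print(f"convergence ratio ~ {convergence_ratio:.0f}")
```

With these assumed numbers the ratio comes out to about 33, which gives a sense of why tiny asymmetries in the x-ray bath get so strongly amplified by the time the shocks converge at the center.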
All of which raises the question, “What now?” My opinion, in short, is that the facility should remain operational, at full capacity (not on half shifts, which, for various reasons, would reduce the experimental value of the facility by significantly more than half).
I certainly don’t base that opinion on the potential of inertial confinement fusion (ICF), the technology implemented on the NIF, for supplying our future energy needs. While many scientists would disagree with me, I feel it has virtually none. Although they may well be scientifically feasible, ICF reactors would be engineering nightmares, and far too expensive to compete with alternative energy sources. It would be necessary to fabricate many thousands of delicate, precisely built targets every day and fill them with highly radioactive tritium. Tritium does not occur in nature in usable quantities, and its half-life (the time it takes for half of a given quantity to undergo radioactive decay) is just over 12 years, so it can’t be stored indefinitely. It would be necessary to breed and extract the stuff from the reactor on the fly without releasing any into the environment (hydrogen is notoriously slippery stuff that can easily leak right through several types of metal barriers), load it into the targets, and then cool them to cryogenic temperatures. There is not a reactor design study out there that doesn’t claim that this can be done cheaply enough to make ICF fusion energy cost-competitive. They are all poppycock. The usual procedure in such studies is to pick the cost number you need, and then apply “science” to make it seem plausible.
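The tritium storage problem follows directly from the half-life quoted above. A minimal sketch of the decay arithmetic, using the standard 12.3-year half-life (the 20-year storage interval is simply an illustrative choice):

```python
HALF_LIFE_YEARS = 12.3  # published tritium half-life

def fraction_remaining(years):
    """Fraction of an initial tritium inventory left after `years`."""
    return 0.5 ** (years / HALF_LIFE_YEARS)

print(f"after one half-life: {fraction_remaining(12.3):.2f}")  # 0.50
print(f"after 20 years:      {fraction_remaining(20.0):.2f}")  # ~0.32
```

Losing a third of the inventory over two decades is why a reactor would have to breed its own tritium continuously rather than draw on a stockpile.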
However, despite all the LLNL hype, the NIF was never funded as an energy project, but as an experimental tool to help maintain the safety and reliability of our nuclear stockpile in the absence of nuclear testing. The idea that it will be useless for that purpose, whether it achieves ignition or not, is nonsense. The facility has met and in some cases exceeded its design goals in terms of energy and precision. Few if any other facilities in the world, whether existing or planned, will be able to rival its ability to explore equations of state, opacities, and other weapons-relevant physics information about materials at conditions approaching those that exist in nuclear detonations. As long as the ban on nuclear testing remains in effect, the NIF will give us a significant advantage over other nuclear states. It seems to me that maintaining the ban is a good thing.
It also seems to me that it would behoove us to maintain a robust nuclear stockpile. Nuclear disarmament sounds nice on paper. In reality it would invite nuclear attack. The fact that nuclear weapons have not been used since 1945 is a tremendous stroke of luck. However, it has also seduced us into assuming they will never be used again. They will. The question is not if, but when. We could continue to be very lucky. We could also suffer a nuclear attack tomorrow, whether by miscalculation, or the actions of terrorists or rogue states. If we continue to have a stockpile, it must be maintained. Highly trained scientists must be available to maintain it. Unfortunately, babysitting a pile of nuclear bombs while they gather dust is not an attractive career path. Access to facilities like the NIF is a powerful incentive to those who would not otherwise consider such a career.
One of the reasons this is true is the “dual use” capability of the NIF. It can be used to study many aspects of high energy density physics that may not be relevant to nuclear weapons, but are of great interest to scientists in academia and elsewhere who are interested in fusion energy, the basic science of matter at extreme conditions, astrophysics, etc. Some of the available time on the facility will be reserved for these outside users.
As for the elusive goal of ignition itself, we know that it is scientifically feasible, just as we know that its magnetic fusion equivalent is scientifically feasible. The only question remaining is how big the lasers have to be to reach it. It may eventually turn out that the ones available on the NIF are not big enough. However, the idea that failing to get ignition in the first attempts somehow proves that ignition is impossible and out of the question is ridiculous. It has not even been “proven” that the current indirect drive approach won’t work. If it doesn’t, there are several alternatives. The NIF is capable of being reconfigured for direct drive, in which the lasers are aimed directly at the fusion target. For various reasons, the beams are currently being frequency-tripled from the original “red” light of the glass lasers to “blue.” Much more energy, up to around four megajoules instead of the current 1.8, would be available if the beams were only frequency-doubled to “green.” It may be that the advantage of the extra energy will outweigh the physics-related disadvantages of green light. An interesting dark horse candidate is the “fast ignitor” scenario, in which the target would be imploded as before, but a separate beam or beams would then be used to heat a small spot on the outer surface to ignition conditions. An alpha particle “burn wave” would then propagate out, igniting the rest of the fuel, just as originally envisioned for the central hot spot approach.
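The green-versus-blue trade-off comes down to conversion losses in the frequency-doubling and frequency-tripling crystals. A toy sketch of the arithmetic, assuming an available fundamental (1ω, infrared) energy of 5 MJ and round-number conversion efficiencies chosen so the results match the figures quoted above; the efficiencies are assumptions for illustration, not measured NIF values:

```python
# All three numbers below are assumed, illustrative values picked to
# reproduce the ~4 MJ green and 1.8 MJ blue figures discussed above.
fundamental_1w_mj = 5.0  # assumed available 1-omega (infrared) energy (MJ)
eff_2w = 0.80            # assumed doubling (2-omega, green) efficiency
eff_3w = 0.36            # assumed tripling (3-omega, blue) efficiency

green_mj = fundamental_1w_mj * eff_2w
blue_mj = fundamental_1w_mj * eff_3w
print(f"green (2w): {green_mj:.1f} MJ, blue (3w): {blue_mj:.1f} MJ")
```

The point of the sketch is only that the same amplifiers can deliver far more energy on target if less of it is thrown away in conversion, which is exactly the appeal of the green option.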
Some of the comments following the Internet posts about NIF’s failure to reach ignition are amusing. For example, following an article on the Physics Today website we learn to our dismay:
With all due respect to the NIF and its team of well-meaning and enthusiastic researchers here, I am sorry to state hereby that sustainable nuclear fusion is predestined to fail, whether it be in the NIC, the Tokamak or anywhere else in solar space, for fundamentally two simple reasons paramount for fusion: (1) vibrational synchronism (high-amplitude resonance) of reacting particles; and (2) the overall isotropy of their ambient field.
Obviously the commenter hadn’t heard that the scientific feasibility of both inertial and magnetic fusion has already been established. He reminds me of a learned doctor who predicted that Zadig, the hero of Voltaire’s novel of that name, must inevitably die of an injury. When Zadig promptly recovered, he wrote a thick tome insisting that Zadig must inevitably have died. Voltaire informs us that Zadig did not read the book. In an article on the IEEE Spectrum website, suggestively entitled “National Ignition Facility: Mother of All Boondoggles?”, another commenter chimes in:
How about we spend the billions on real research that actually has a chance of producing something useful? There are a gazillion ideas out there for research that has a much higher probability of producing useful results. Must be nice to work for LLNL where your ideas don’t need vetting.
In fact, the NIF was “vetted” by a full scale Federal Advisory Committee. Known as the Inertial Confinement Fusion Advisory Committee, or ICFAC, its members included Conrad Longmire, Marshall Rosenbluth, and several other experts in plasma physics and technology of world renown who had nothing whatsoever to gain by serving as shills for LLNL. It heard extensive testimony on plans to build the NIF, both pro and con, in the mid-90’s. Prominent among those who opposed the project was Steve Bodner, head of the ICF Program at the Naval Research Laboratory (NRL) at the time. Steve cited a number of excellent reasons for delaying major new starts like the NIF until some of the outstanding physics issues could be better understood. The Committee certainly didn’t ignore what he and other critics had to say. However, only one of the 15 or so members dissented from the final decision to recommend proceeding with the NIF. I suspect that LLNL’s possession of the biggest, baddest ICF computer code at the time had something to do with it. No one is better at bamboozling himself and others than a computational physicist with a big code. The one dissenter, BTW, was Tim Coffey, Director of NRL at the time, who was convinced that Bodner was right.
There are, of course, the predictable comments by those in the habit of imagining themselves geniuses after the fact, such as,
I am convinced. Garbage research.
Don’t these people feel ashamed telling so many lies?
after the IEEE Spectrum article, and,
It’s amazing to think that you can spout lies to the government to receive $6 billion for a machine that doesn’t come close to performing to spec and there are no consequences for your actions.
following a post on the NIF at the LLNL – The True Story blog. Fortunately, most of the comments I’ve seen recently have been at a rather more thoughtful level. In any event, I hope Congress doesn’t decide to cut and run on the NIF. Pulling the plug at this point would be penny-wise and pound-foolish.
Posted on July 24th, 2012
According to a recent press release from Lawrence Livermore National Laboratory (LLNL) in California, the 192-beam National Ignition Facility (NIF) fired a 500 terawatt shot on July 5. The world record power followed a world record energy shot of 1.89 Megajoules on July 3. As news, this doesn’t rise above the “meh” category. A shot at the NIF’s design energy of 1.8 Megajoules was already recorded back in March. It’s quite true that, as NIF Director Ed Moses puts it, “NIF is becoming everything scientists planned when it was conceived over two decades ago.” The NIF is a remarkable achievement in its own right, capable of achieving energies 50 times greater than any other laboratory facility, with pulses shaped and timed to pinpoint precision. The NIF team in general and Ed Moses in particular deserve great credit, and the nation’s gratitude, for that achievement after turning things around following a very shaky start.
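The two record numbers can be checked against each other: 1.89 MJ delivered at 500 TW implies a pulse a few nanoseconds long. A rough estimate follows; the real NIF pulse is shaped, so peak power exceeds the average, making this only an order-of-magnitude consistency check:

```python
# Both inputs are the record figures quoted in the press release.
energy_j = 1.89e6  # record shot energy: 1.89 MJ
power_w = 500e12   # record peak power: 500 TW

# Energy / power gives a characteristic pulse duration, converted to ns.
duration_ns = energy_j / power_w * 1e9
print(f"implied average-power pulse length ~ {duration_ns:.2f} ns")
```

The result is a few nanoseconds, consistent with the nanosecond-scale shaped pulses the facility was designed to deliver.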
The problem is that, while the facility works as well, and even better than planned, the goal it was built to achieve continues to elude us. As its name implies, the news everyone is actually waiting for is the announcement that ignition (defined as fusion energy out greater than laser energy in) has been achieved. As noted in the article, Moses said back in March that “We have all the capability to make it happen in fiscal year 2012.” At this point, he probably wishes his tone had been a mite less optimistic. To reach their goal in the two months remaining, the NIF team will need to pull a rabbit out of their collective hat. A slim chance remains. Apparently the NIF’s 192 laser beams were aimed at a real ignition target with a depleted uranium capsule and deuterium-tritium fuel on July 5, and not a surrogate. The data from that shot may prove to be a great deal more interesting than the 500 terawatt power announcement.
Meanwhile, the Russians are apparently forging ahead with plans for their own superlaser, to be capable of a whopping 2.8 Megajoules, and the Chinese are planning another about half that size, to be operational at about the same time (around 2020). That, in itself, speaks volumes about the real significance of ignition. It may be huge for the fusion energy community, but not that great as far as the weaponeers who actually fund these projects are concerned. Many weapons designers at LLNL and Los Alamos were notably unenthusiastic about ignition when NIF was still in the planning stages. What attracted them more was the extreme conditions, approaching those in an exploding nuke, that could be achieved by the lasers without ignition. They thought, not without reason, that it would be much easier to collect useful information from such experiments than from chaotic ignition plasmas. Apparently the Russian bomb designers agree. They announced their laser project back in February even though LLNL’s difficulties in achieving ignition were well known at the time.
The same can be said of some of the academic types in the NIF “user community.” It’s noteworthy that two of them, Rick Petrasso of MIT and Ray Jeanloz of UC Berkeley, whose enthusiastic comments about the 500 terawatt shot were quoted in the latest press release, are both key players in the field of high energy density physics. Ignition isn’t a sine qua non for them either. They will be able to harvest scores of papers from the NIF whether it achieves ignition or not.
The greatest liability of not achieving early ignition may be the evaporation of political support for the NIF. The natives are already becoming restless. As noted in the Livermore Independent,
In early May, sounding as if it were discussing an engineering project rather than advanced research, the House Appropriations Committee worried that NIF’s “considerable costs will not have been warranted” if it does not achieve ignition by September 30, the end of the federal fiscal year.
Later that month, in a tone that seemed to demand that research breakthroughs take place according to schedule, the House Armed Services Committee recommended that NIF’s ignition research budget for next year be cut by $30 million from the requested $84 million budget unless NIF achieves ignition by September 30.
Funding cuts at this point, after we have come so far, and are so close to the goal, would be short-sighted indeed. One must hope that a Congress capable of squandering billions on white elephants like the International Space Station will not become penny-wise and pound-foolish about funding a project that really matters.
Posted on July 23rd, 2012
We tend to be strongly influenced by the recent past in our predictions about the future. After World War I, any number of pundits, statesmen, and military officers thought the next war would be a carbon copy of the one they had just lived through, albeit perhaps on a larger scale. The German government’s disastrous decision to declare war in 1914 was likely influenced by the quick and decisive German victories in 1864, 1866, and 1870. The Japanese were similarly mesmerized by their brilliant success against the Russians in 1904-05 after an opening surprise attack against the Russian fleet lying at anchor at Port Arthur, and assumed history would repeat itself if they launched a similar attack against Pearl Harbor.
Sometimes startling events force the reevaluation of old ideas and paradigms, such as the German armored Blitzkrieg or the destruction of powerful battleships from the air in World War II, or, more recently, the sudden collapse of Communism and the Soviet Union from 1989-91. We are always fascinated by such events, yet few of us grasp their significance as they are happening. Our tendency is always to look backwards, to fit the revolutionary and the unprecedented into the old world that we understand rather than the new one that we can’t yet imagine. So it was after the dropping of the first atomic bombs. It certainly focused the attention of public intellectuals, unleashing a torrent of essays full of dire predictions. For many, the future they imagined was simply a continuation of the immediate past, albeit with new and incredibly destructive weapons. It was to include the continued inexorable push for world dominion by totalitarian Communism, centered in the Soviet Union, and world wars following each other in quick succession every 15 to 20 years, about the same as the interval between the first two world wars.
Such a vision of the future was described by James Burnham in “The Struggle for the World,” published in 1947. Burnham was a former Marxist and Trotskyite who eventually abandoned Marxism, and became one of the leading conservative intellectuals of his day. His thought made a deep impression on, among others, George Orwell. For example, he had suggested the possibility of a world dominated by three massive totalitarian states, constantly at war with each other, in an earlier book, “The Managerial Revolution,” published in 1941. These became Oceania, Eastasia, and Eurasia in Orwell’s “1984.” The notions of “doublethink”, the totalitarian use of terms such as “justice” and “peace” in a sense opposite to their traditional meanings, and the rewriting of history every few years “so that history itself will always be a confirmation of the immediate line of the party,” familiar to readers of “1984,” were also recurrent themes in “The Struggle for the World.”
Burnham, born in 1905, had come of age during the stunning period of wars, revolutions, and the birth of the first totalitarian states that began and ended with the world wars of the 20th century. He assumed that events of such global impact would continue at the same pace, only this time in a world with nuclear weapons. As a former Marxist, he knew that the Communists, at least, were deliberately engaged in a “struggle for the world,” and was dismayed that U.S. politicians at the time were so slow to realize the nature of the struggle. He also correctly predicted that, unless they were stopped, the Communists would develop nuclear weapons in their Soviet base “in a few years.” This, he warned, could not be allowed to happen because it would inevitably and quickly lead to a full scale nuclear exchange. His reasoning was as follows:
Let us assume that more than one (two is enough for the assumption) power possesses, and is producing, atomic weapons. Each will be improving the efficiency and destructive potential of the weapons as it goes along. Now let us try to reason as the leaders of these powers would be compelled to reason.
Each leader of Power A could not but think as follows: Power B has at its disposal instruments which could, in the shortest time, destroy us. He has possibly made, or is about to make, new discoveries which will threaten even more complete and rapid destruction. At the moment, perhaps, he shows no open disposition to use these instruments. Nevertheless, I cannot possibly rely on his continued political benevolence – above all since he knows that I also have at my disposal instruments that can destroy him. Some hothead – or some wise statesman – of his may even now be giving the order to push the necessary buttons.
Even if there were no atomic weapons, many of the leaders would undoubtedly be reasoning today along these lines. Atomic weapons are, after all, not responsible for warfare, not even for the Third World War, which has begun. The fact that the political and social causes of a war are abundantly present stares at us from every edition of every newspaper. The existence of atomic weapons merely raises the stakes immeasurably higher, and demands a quicker decision.
But to assume, as do some foolish commentators, that fear of retaliation will be the best deterrent to an atomic war is to deny the lessons of the entire history of war and of society. Fear, as Ferrero so eloquently shows, is what provokes the exercise of force. Most modern wars have been, in the minds of every belligerent, preventive: an effort to stamp out the fear of what the other side might be about to do.
The existence of two or more centers of control of atomic weapons would be equal to a grenade with the pin already pulled.
According to Burnham, the resulting nuclear war or wars would lead to the collapse of Western Civilization. In his words,
If, however, we are not yet ready to accept passively the final collapse of Western Civilization, we may state the following as a necessary first condition of any workable solution of the problem of atomic weapons: there must be an absolute monopoly of the production, possession and use of all atomic weapons.
One wonders what direction world history might have taken had someone like Burnham been President in 1950 instead of Truman. He would have almost certainly adopted MacArthur’s plan to drop numerous atomic bombs on China and North Korea. We were lucky. In the end, Truman’s homespun common sense prevailed over Burnham’s flamboyant intellect, and the nuclear genie remained in the bottle.
However, in 1947 the U.S. still had a monopoly of nuclear weapons, and, for the reasons cited above, Burnham insisted we must keep it. He suggested that this might best be done by establishing an effectual world government, but dismissed the possibility as impractical. The only workable alternative to a Communist conquest of the world or full scale nuclear war and the end of Western Civilization was U.S. hegemony. In Burnham’s words,
It is not our individual minds or desires, but the condition of world society, that today poses for the Soviet Union, as representative of communism, and for the United States, as representative of Western Civilization, the issue of world leadership. No wish or thought of ours can charm this issue away.
This issue will be decided, and in our day. In the course of the decision, both of the present antagonists may, it is true, be destroyed. But one of them must be.
Whatever the words, it is well also to know the reality. The reality is that the only alternative to the communist World Empire is an American Empire which will be, if not literally worldwide in formal boundaries, capable of exercising decisive world control. Nothing less than this can be the positive, or offensive, phase of a rational United States policy.
As a first step to empire, Burnham proposed the union of Great Britain and the United States, to be followed, not by outright conquest, but by firm assertion of U.S. predominance and leadership in the non-Communist world. Beyond that, the Communist threat must finally be recognized for what it was, and a firm, anti-Communist policy substituted for what was seen as a lack of any coherent policy at all. Vacillation must end.
Fortunately, when it came to the nuclear standoff, Burnham was wrong, and the “foolish commentators” who invoked the fear of retaliation were right. Perhaps, having only seen the effects of dropping two low yield bombs, he could not yet imagine the effect of thousands of bombs orders of magnitude more powerful, or conceive of such a thing as mutually assured destruction. Perhaps it was only dumb luck, but the world did not stumble into a nuclear World War III as it had into the conventional world wars of the 20th century, and the decisive events in the struggle did not follow each other nearly as quickly as Burnham imagined they would.
Burnham also failed to foresee the implications of the gradual alteration in the nature of the Communist threat. At the time he wrote, it was everything he claimed it to be, a messianic secular religion at the height of its power and appeal. He assumed that it would retain that power and appeal until the battle was decided, one way or the other. Even though he was aware that the masses living under Communism, other than a dwindling number of incorrigible idealists, were already disillusioned by “the God that failed,” he didn’t foresee what a decisive weakness that would eventually become. In the end, time was on our side. The Communists, and not we, as Lenin had predicted, finally dropped onto the garbage heap of history “like a ripe plum.”
However, Burnham wasn’t wrong about everything. To win the struggle, it was necessary for us to finally recognize the threat. Whatever doubt remained on that score, at least as far as most of our political leaders were concerned, was dissipated by the North Korean invasion of the south. Our policy of vacillation didn’t exactly end, but was occasionally relieved by periods of firmness. In the end, in spite of a media dominated through most of the struggle by Lenin’s “useful idiots” and the resultant cluelessness of most Americans about what we were even trying to do on the front lines of the “clash between the cultures” in places like Vietnam, we prevailed.
It was a near thing. Burnham feared that, even after losing the opening battles of the next war to a United States with a monopoly of nuclear weapons, the Communists might regroup, abandon their vulnerable cities, and transform the struggle into a “people’s war.” His description of what would follow was eerily similar to what actually did happen, but in a much smaller arena than the whole world:
They would transform the struggle into a political war, a “people’s war,” fought in every district of the world by irregulars, partisans, guerillas, Fifth Columns, spies, stool pigeons, assassins, fought by sabotage and strikes and lies and terror and diversion and panic and revolt. They would play on every fear and prejudice of the United States population, every feeling of guilt or nobility; they would exploit every racial and social division; they would widen every antagonism between tentative allies; and they would tirelessly wear down the United States will to endure.
Though the result would be not quite so certain, perhaps, as if the communists also had atomic weapons, they would in the end, I think, succeed. Because of the lack of a positive United States policy, because it would not have presented to the world even the possibility of a political solution, its dreadful material strength would appear to the peoples as the unrelieved brutality of a murderer. Its failure to distinguish between the communist regime and that regime’s subject-victims would weld together the victims and their rulers. Americans themselves would be sickened and conscience-ridden by what would seem to them a senseless slaughter, never-ending, leading nowhere. The military leadership would be disoriented by the inability of their plans based on technical superiority to effect a decision. The failure to conceive the struggle politically would have given the communists the choice of weapons. From the standpoint of the United States, the entire world would have been turned into an ambush and a desert. In the long night, nerves would finally crack, and sentries would fire their last shots wildly into the darkness, and it would all be over.
Change “the world” to Vietnam and it reads like a history instead of a premonition. Tomorrow is another day, and I doubt that any of us will prove better at predicting what the future will bring than Burnham. We have lived through an era much different, more peaceful, and more sedate in the pace of events than the one he experienced between 1914 and 1945. We should beware of assuming, as he did, that the future will bear any resemblance to the immediate past. The world is still full of nuclear weapons, some of them already in the hands of, or soon to be in the hands of, dictators of suspect rationality. Some of our intellectuals soothe our fears with stories about the “vanishing of violence,” but as Omar Khayyam put it in the “Rubaiyat,” they could soon be “cast as foolish prophets forth, their mouths stopped with dust,” through some miscalculation or deliberate act of malice. As the Boy Scouts say, “be prepared.”
Posted on July 11th, 2012 1 comment
And you thought I was crazy… Check out this article by Freeman Dyson in the October 1968 issue of Physics Today entitled “Interstellar Transport.” Dyson was an active participant in Project Orion, a program to build interplanetary space vehicles propelled by nuclear bombs. After the program was ended by the 1963 nuclear test ban treaty, he decided to write a paper for a high visibility journal to ensure that the idea was kept alive and people were aware of its potential.
People thought big in those days, and Dyson’s notional interstellar transports certainly reflected the fact. The first was designed to absorb the blasts of one-megaton, deuterium-fueled bombs in a gigantic copper hemisphere with a radius of 10 kilometers, weighing 5 million tons. The fully loaded ship would have weighed 40 million tons, including 30 million of the one-megaton bombs. Assuming each bomb would require 10 pounds of plutonium (or about 60 pounds of highly enriched uranium), a total of 150,000 tons of plutonium would be required for the mission.
Dubious assumptions were made, as, for example, that 100% of the bomb’s energy would go into the kinetic energy of debris, even though it was known at the time (and certainly known to Dyson) that the actual fraction is much less than that. The cost was calculated to be one 1968 gross national product, based entirely on the projected cost of the necessary deuterium fuel (3 billion pounds at $200 per pound in 1968 dollars, for a total of $600 billion). In other words, the cost of the plutonium, copper, and other building material wasn’t even factored in, nor was the cost of getting it all into earth orbit prior to launch. In spite of all this, the massive ship, carrying about 20,000 colonists, would still take about 1300 years to reach the nearest stars. Barring a “Noah’s ark” forlorn hope escape from a dying world, even Dyson considered this impractical for human travel, writing,
As a voyage of colonization a trip as slow as this does not make much sense on a human time scale. A nonhuman species, longer lived or accustomed to thinking in terms of millennia rather than years, might find the conditions acceptable.
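The figures quoted above are easy to sanity-check. Here is a quick back-of-the-envelope script; all the inputs come from the text except the roughly 4.3 light-year distance to Alpha Centauri, which is my assumption for "the nearest stars":

```python
# Rough check of the figures from Dyson's "conservative" design.
# The ~4.3 light-year distance is an assumption; the rest is from the text.

N_BOMBS = 30_000_000          # one-megaton bombs carried
PU_PER_BOMB_LB = 10           # lb of plutonium per bomb (text's assumption)
LB_PER_TON = 2000

pu_tons = N_BOMBS * PU_PER_BOMB_LB / LB_PER_TON
print(pu_tons)                # 150,000 tons, matching the text

DEUTERIUM_LB = 3_000_000_000  # 3 billion pounds of deuterium
PRICE_PER_LB = 200            # 1968 dollars
print(DEUTERIUM_LB * PRICE_PER_LB / 1e9)   # $600 billion

# Implied average cruise speed for a 1300-year trip to Alpha Centauri:
LY_KM = 9.461e12
trip_km = 4.3 * LY_KM
trip_s = 1300 * 365.25 * 24 * 3600
print(trip_km / trip_s)       # roughly 1000 km/s, ~0.3% of light speed
```

In other words, the "conservative" ship would have had to sustain an average speed of about a thousand kilometers per second, which puts the scale of the proposal in perspective.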
To obviate some of the objections to this “conservative” design, Dyson also proposed an “optimistic” design, which allowed some ablation of the surface of the vehicle nearest to the explosions, rather than requiring all the energy to be absorbed in solid material. After removing this energy limitation, the main limitation on the ship’s performance would be imposed by momentum, or, as Dyson put it, “the capacity of shock absorbers to transfer momentum from an impulsively accelerated pusher plate to the smoothly accelerated ship.” Basing his reasoning on the optimum performance of practical shock absorbers, Dyson calculated that such a ship could be accelerated at a constant one g, enabling it to reach the nearest stars in centuries rather than millennia. The cost, again based solely on the value of the deuterium fuel, would be only $60 billion 1968 dollars, or a tenth of the GNP at that time. The weight of the ship would be “only” 400,000 tons, a factor of 100 less than that of the “conservative” design. Dyson concluded,
If we continue our 4% growth rate we will have a GNP a thousand times its present size in about 200 years. When the GNP is multiplied by 1000, the building of a ship for $100B will seem like building a ship for $100M today. We are now building a fleet of Saturn V which cost about $100M each. It may be foolish but we are doing it anyhow. On this basis, I predict that about 200 years from now, barring a catastrophe, the first interstellar voyages will begin.
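Dyson’s “about 200 years” is just compound interest, and his arithmetic checks out. A minimal verification:

```python
import math

# How long does 4% annual growth take to multiply GNP by a factor of 1000?
years = math.log(1000) / math.log(1.04)
print(round(years))   # ~176 years, i.e. "about 200 years" in round numbers
```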
I suspect Dyson wrote most of this paper “tongue in cheek.” He’s nobody’s fool, has remarkable achievements to his credit in fields such as quantum electrodynamics, solid state physics, and nuclear engineering, and remains highly regarded by his peers. Nobel laureate Steven Weinberg said that the Nobel Committee had “fleeced” Dyson by never awarding him the prize. The objections to his designs are obvious, but for all that, bomb-propelled space vehicles are by no means impractical. I suspect Dyson realized that other scientists would recognize ways they could improve on his “conservative” and “optimistic” designs as soon as they read the paper, and start thinking about their own versions. Project Orion might be dead as a budget line item, but would live on in the minds and imaginations of his peers. And so it did.
Posted on June 10th, 2012 2 comments
As I mentioned in a previous post about fusion progress, signs of life have finally been appearing in scientific journals from the team working to achieve fusion ignition at the National Ignition Facility, or NIF, located at Lawrence Livermore National Laboratory (LLNL) in California. At the moment they are “under the gun,” because the National Ignition Campaign (NIC) is scheduled to end with the end of the current fiscal year on September 30. At that point, presumably, work at the facility will be devoted mainly to investigations of nuclear weapon effects and physics, which do not necessarily require fusion ignition. Based on a paper that recently appeared in Physical Review Letters, chances of reaching the ignition goal before that happens are growing dimmer.
The problem has to do with a seeming contradiction in the physical requirements for fusion to occur in the inertial confinement approach pursued at LLNL. In the first place, it is necessary for the NIF’s 192 powerful laser beams to compress, or implode, a target containing fusion fuel in the form of two heavy isotopes of hydrogen to extremely high densities. It is much easier to compress materials that are cold than those that are hot. Therefore, it is essential to keep the fuel material as cold as possible during the implosion process. In the business, this is referred to as keeping the implosion on a “low adiabat.” However, for fusion ignition to occur, the nuclei of the fuel atoms must come extremely close to each other. Unfortunately, they’re not inclined to do that, because they’re all positively charged, and like charges repel. How to overcome the repulsion? By making the fuel material extremely hot, causing the nuclei to bang into each other at high speed. The whole trick of inertial confinement fusion, then, is to keep the fuel material very cold, and then, in a tiny fraction of a second, while its inertia holds it in place (hence the name, “inertial” confinement fusion), raise it, or at least a small bit of it, to the extreme temperatures necessary for the fusion process to begin.
The proposed technique for creating the necessary hot spot was always somewhat speculative, and more than one fusion expert at the national laboratories was dubious that it would succeed. It consisted of launching a train of four shocks during the implosion process, timed to overtake one another precisely at the moment of maximum compression, thereby creating the necessary hot spot. Four shocks are needed because of well-known theoretical limits on the increase in temperature that can be achieved with a single shock. Which brings us back to the paper in Physical Review Letters.
The paper, entitled “Precision Shock Tuning on the National Ignition Facility,” describes the status of efforts to get the four shocks to jump through the hoops described above. One cannot help but be impressed by the elegant diagnostic tools used to observe and measure the shocks. They are capable of peering through materials under the extreme conditions in the NIF target chamber, focusing on the tiny, imploded target core, and measuring the progress of a train of shocks over a period that only lasts for a few billionths of a second! These diagnostics, developed with the help of another team of brilliant scientists at the OMEGA laser facility at the University of Rochester’s Laboratory for Laser Energetics, are a triumph of human ingenuity. They reveal that the NIF is close to achieving the ignition goal, but not quite close enough. As noted in the paper, “The experiments also clearly reveal an issue with the 4th shock velocity, which is observed to be 20% slower than predictions from numerical simulation.”
It will be a neat trick indeed if the NIF team can overcome this problem before the end of the National Ignition Campaign. In the event that they don’t, one must hope that the current administration is not so short-sighted as to conclude that the facility is a failure, and severely reduce its funding. There is too much at stake. I have always been dubious about the possibility that either the inertial or magnetic approach to fusion will become a viable source of energy any time in the foreseeable future. However, I may be wrong, and even if I’m not, achieving inertial fusion ignition in the laboratory may well point the way to as yet undiscovered paths to the fusion energy goal. Ignition in the laboratory will also give us a significant advantage over other nuclear weapons states in maintaining our arsenal without nuclear testing.
Based on the progress reported to date, there is no basis for the conclusion that ignition is unachievable on the NIF. Even if the central hot spot approach currently being pursued proves too difficult, there are alternatives, such as polar direct drive and fast ignition. However, pursuing these alternatives will take time and resources. They will become a great deal more difficult to realize if funding for NIF operations is severely cut. It will also be important to maintain the ancillary capability provided by the OMEGA laser. OMEGA is much less powerful but also a good deal more flexible and nimble than the gigantic NIF, and has already proved its value in testing and developing diagnostics, investigating novel experimental approaches to fusion, developing advanced target technology, etc.
We have built world-class facilities. Let us persevere in the quest for fusion. We cannot afford to let this chance slip.
Posted on April 17th, 2012 10 comments
The National Ignition Facility, or NIF, is a huge, 192 beam laser system, located at Lawrence Livermore National Laboratory in California. It was designed, as the name implies, to achieve thermonuclear ignition in the laboratory. “Ignition” is generally accepted to mean getting a greater energy output from fusion than the laser input energy. Unlike magnetic confinement fusion, the approach currently being pursued at the International Thermonuclear Experimental Reactor, or ITER, now under construction in France, the goal of the NIF is to achieve ignition via inertial confinement fusion, or ICF, in which the fuel material is compressed and heated to the extreme conditions at which fusion occurs so quickly that it is held in place by its own inertia.
The NIF has been operational for over a year now, and a two-year campaign is underway with the goal of achieving ignition by the end of this fiscal year. Recently, there has been a somewhat ominous silence from the facility, manifesting itself as a lack of publications in the major journals favored by fusion scientists. That doesn’t usually happen when there is anything interesting to report. Finally, however, some papers have turned up in the journal Physics of Plasmas, containing reports of significant progress.
To grasp the importance of the papers, it is necessary to understand what is supposed to occur within the NIF target chamber for fusion to occur. Of course, just as in magnetic fusion, the goal is to bring a mixture of deuterium and tritium, two heavy isotopes of hydrogen, to the extreme conditions at which fusion takes place. In the ICF approach, this hydrogen “fuel” is contained in a tiny, BB-sized target. However, the lasers are not aimed directly at the fuel “capsule.” Instead, the capsule is suspended in the middle of a tiny cylinder made of a heavy metal like gold or uranium. The lasers are fired through holes on each end of the cylinder, striking the interior walls, where their energy is converted to x-rays. It is these x-rays that must actually bring the target to fusion conditions.
It was recognized many years ago that one couldn’t achieve fusion ignition by simply heating up the target. That would require a laser driver orders of magnitude bigger than the NIF. Instead, it is first necessary to compress, or implode, the fuel material to extremely high density. Obviously, it is harder to “squeeze” hot material than cold material to the necessary high densities, so the fuel must be kept as “cold” as possible during the implosion process. However, cold fuel won’t ignite, raising the question of how to heat it up once the necessary high densities have been achieved.
It turns out that the answer is shocks. When the laser generated x-rays hit the target surface, they do so with such force that it begins to implode faster than the speed of sound. Everyone knows that when a plane breaks the sound barrier, it, too, generates a shock, which can be heard as a sonic boom. The same thing happens in ICF fusion targets. When such a shock converges at the center of the target, the result is a small “hot spot” in the center of the fuel. If the temperature in the hot spot were high enough, fusion would occur. Each fusion reaction would release a high energy helium nucleus, or alpha particle, and a neutron. The alpha particles would be slammed to a stop in the surrounding cold fuel material, heating it, in turn, to fusion conditions. This would result in a fusion “burn wave” that would propagate out through the rest of the fuel, completing the fusion process.
The problem is that one shock isn’t enough to create such a “hot spot.” Four of them are required, all precisely timed by the carefully tailored NIF laser pulse to converge at the center of the target at exactly the same time. This is where real finesse is needed in laser fusion. The implosion must be extremely symmetric, or the shocks will not converge properly. The timing must be exact, and the laser pulse must deliver just the right amount of energy.
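Why isn’t one shock enough? For an ideal gas, the Rankine-Hugoniot jump conditions cap the compression a single shock can deliver at (γ+1)/(γ−1), a factor of 4 for γ = 5/3, and a very strong shock heats the fuel off its low adiabat in the bargain. A minimal textbook sketch of that limit (illustrative Python, not NIF design data):

```python
# Rankine-Hugoniot density jump across a single shock in an ideal gas,
# as a function of shock Mach number M (standard textbook result).
def compression_ratio(mach, gamma=5/3):
    m2 = mach * mach
    return (gamma + 1) * m2 / ((gamma - 1) * m2 + 2)

for mach in (2, 5, 50):
    print(mach, round(compression_ratio(mach), 2))

# Even an infinitely strong single shock tops out at (gamma+1)/(gamma-1) = 4,
# which is why ICF designs use a timed train of weaker shocks: they reach
# the very high compressions ignition requires while staying on a low adiabat.
```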
One problem in the work to date has been an inability to achieve high enough implosion velocities for the above scenario to work as planned. One of the Physics of Plasmas papers reports that, by increasing the laser energy and replacing with depleted uranium some of the gold originally used in the wall of the cylinder, or “hohlraum,” in which the fuel capsule is mounted, velocities of 99% of those required for ignition have been achieved. In view of the recent announcement that a shot on the NIF had exceeded its design energy of 1.8 megajoules, it appears the required velocity is within reach. Another of the Physics of Plasmas papers dealt with the degree to which implosion asymmetries were causing harmful mixing of the surrounding cold fuel material into the imploded core of the target. It, too, provided grounds for optimism.
In the end, I suspect the success or failure of the NIF will depend on whether the complex sequence of four shocks can really be made to work as advertised. That will depend on the accuracy of the physics algorithms in the computer codes that have been used to model the experiments. Time and again, earlier and less sophisticated codes have been wrong because they didn’t accurately account for all the relevant physics. There is no guarantee that critical phenomena have not been left out of the current versions as well. We may soon find out, if the critical series of experiments planned to achieve ignition before the end of the fiscal year are carried out as planned.
One can but hope they will succeed, if only because some of our finest scientists have dedicated their careers to the quest to achieve the elusive goal of controlled fusion. Even if they do, fusion based on the NIF approach is unlikely to become a viable source of energy, at least in the foreseeable future. Laser fusion may prove scientifically feasible, but getting useful energy out of it will be an engineering nightmare, dangerous because of the need to rely on highly volatile and radioactive tritium, and much too expensive to compete with potential alternatives. I know many of the faithful in the scientific community will beg to differ with me, but, trust me, laser fusion energy ain’t gonna happen.
On the other hand, if ignition is achieved, the NIF will be invaluable to the country, not as a source of energy, but for the reason it was funded in the first place – to ensure that our nation has an unmatched suite of experimental facilities to study the physics of nuclear weapons in an era free of nuclear testing. As long as we have unique access to facilities like the NIF, which can approach the extreme physical conditions within exploding nukes, we will have a significant leg up on the competition as long as the test ban remains in place. For that, if for no other reason, we should keep our fingers crossed that the NIF team can finally clear the last technical hurdles and reach the goal they have been working towards for so long.
Posted on January 19th, 2011 No comments
For those who don’t follow fusion technology, the National Ignition Facility, or NIF, is a giant, 192 beam laser facility located at Lawrence Livermore National Laboratory. As its name would imply, it is designed to achieve fusion ignition, which has been variously defined, but basically means that you get more energy out from the fusion process than it was necessary to pump into the system to set off the fusion reactions. There are two “classic” approaches to achieving controlled fusion in the laboratory. One is magnetic fusion, in which light atoms stripped of their electrons, or ions, typically heavy isotopes of hydrogen, are confined in powerful magnetic fields as they are heated to the temperatures necessary for fusion to occur. The other is inertial confinement fusion, or ICF, in which massive amounts of energy are dumped into a small target, causing it to reach fusion conditions so rapidly that significant fusion can occur in the very short time that the target material is held in place by its own inertia. The NIF is a facility of the latter type.
There are, in turn, two basic approaches to ICF. In one, referred to as direct drive, the target material is directly illuminated by the laser beams. In the other, indirect drive, the target is placed inside a small container, or “hohlraum,” with entrance holes for the laser beams. These are aimed at the inside walls of the hohlraum, where they are absorbed, producing x-rays which then compress and ignite the target. The NIF currently uses the latter approach.
The NIF was completed and became operational in 2009. Since that time, the amount of news coming out of the facility about the progress of experiments has been disturbingly slight. That is not a good thing. If everything were working as planned, a full schedule of ignition experiments would be underway as I write this. Instead, the facility is idle. The results of the first experimental campaign, announced in January, sounded positive. The NIF had operated at a large fraction of its design energy output of 1.8 Megajoules. Surrogate targets had been successfully compressed to very high densities in symmetric implosions, as required for fusion. However, on reading the tea leaves, things did not seem quite so rosy. Very high levels of laser plasma interaction (LPI) had been observed. In such complex scattering interactions, laser light can be scattered out of the hohlraum, or in other undesired directions, and hot electrons can be generated, wreaking havoc with the implosion process by preheating the target. We were assured that ways had been found to control the excess LPI, and even turn it to advantage in controlling the symmetry of the implosion. However, such “tuning” with LPI had not been foreseen at the time the facility was designed, and little detail was provided on how the necessary delicate, time-dependent shaping of the laser pulses would be achieved under such conditions.
After a long pause, another series of “integrated” experiments was announced in October. Even less information was released on this occasion. We were informed that symmetric implosions had been achieved, and that, “From both a system integration and from a physics point of view, this experiment was outstanding.” Since then, nothing.
It’s hard to imagine that the outlook is really as rosy as the above statement would imply. The NIF was designed for a much higher shot rate. If it sat idle through much of 2010, there must be a reason. It could be that damage to the laser optics has been unexpectedly high. This would not be surprising. Delicate crystals are used at the end of the chain of laser optics to triple the frequency of the laser light, and, given that the output energy of the facility is more than an order of magnitude larger than that of its next largest competitor, damage may have occurred in unexpected ways, as it did on Nova, the NIF’s predecessor at Livermore. LPI may, in fact, be more serious, more difficult to control, and more damaging than the optimistic accounts in January implied. Unexpected physics may be occurring in the absorption of laser light at the hohlraum walls. Whatever the problem, Livermore would be well advised to be forthcoming about it in its press releases. After all, the NIF will achieve ignition or not, regardless of how well the PR is managed.
All this seems very discouraging for the scientists who have devoted their careers to the quest for fusion energy, not to mention the stewards of the nation’s nuclear weapons stockpile, whose needs the NIF was actually built to address. In the end, these apparent startup problems may be overcome, and ignition achieved after all. However, I rather doubt it, unless perhaps Livermore comes up with an alternative to its indirect drive approach.
Posted on November 18th, 2010 4 comments
The Reliable Replacement Warhead is a really bad idea that never seems to go away. Congress has wisely condemned it, and it was explicitly rejected in the nation’s latest Nuclear Posture Review, but now the RRW has popped up again, artificially linked to the New START arms control treaty, in a couple of op-eds, one in the New York Times by former UN ambassador John Bolton, and another in the Wall Street Journal by R. James Woolsey, former arms control negotiator and Director of the CIA. Bolton writes, “Congress should pass a new law financing the testing and development of new warhead designs before approving New START,” and Woolsey chimes in,
…the administration needs to commit to replacing and modernizing our aging nuclear infrastructure as well as the bombers, submarines and ballistic missiles – and the warheads on them – that provide our ultimate guarantee of national security. The Senate’s resolution of ratification should, for example, require the president to commit to specific modernization plans so we can be sure these programs will have his full support. The administration has particularly resisted warhead modernization, beginning with its Nuclear Posture Review last year. This led 10 former directors of the nation’s nuclear weapons labs to write to the secretaries of Defense and Energy urging them to revisit that misguided policy. The secretaries should commit to doing so.
In fact, one hopes they have enough sense not to follow that advice. What Bolton and Woolsey are referring to when they speak of “modernizing” weapons isn’t the continued refurbishment of old weapons, or the adding of new conventional packaging around them, as in the case of the B61-11, to make them more effective for earth penetration or some other specific mission. They are speaking of a new design of the nuclear device itself. At the moment, the RRW is the only player in that game.
Going ahead with the RRW would be self-destructive at a number of levels. In the first place, it’s unnecessary. There is no reason to doubt the safety and reliability of the existing weapons in our arsenal, nor our ability to maintain them into the indefinite future. A reason given for building the RRW is that low-yield versions could be designed that would be “more effective deterrents,” because enemies would consider it a lot more likely that we would actually use such a weapon against them, as opposed to our existing high-yield weapons. The problem with that logic is that they would be right. Given the alacrity with which we went to war in Iraq, it is not hard to imagine that we would be sorely tempted to use a mini-nuke to take out, say, a buried and/or hardened enemy bunker suspected of containing WMDs. Any US first use of nuclear weapons, for whatever reason, and regardless of the chances of “collateral damage,” would be a disastrous mistake. It would let the nuclear genie out of the bottle once again, serving as a perfect pretext for the use of nuclear weapons by others, and particularly by terrorists against us. Those who think the Maginot line of nuclear detectors we are installing at our ports, or the imaginary difficulty of mastering the necessary technology, will protect us from such an eventuality, are gravely mistaken.
The building of a new weapon design would also provide a fine excuse for others to modernize their own arsenals. It is hard to imagine how this could work to the advantage of the United States. Our nuclear technology is mature, and it would simply give the lesser nuclear powers a chance to catch up with us. More importantly, it would almost inevitably imply a return to nuclear testing, thereby negating a tremendous advantage we now hold over every other nuclear power, namely, our above-ground experimental (AGEX) capability. In the National Ignition Facility at Lawrence Livermore National Laboratory, the Z pulsed power machine at Sandia, the DARHT radiographic test facility at Los Alamos, and a host of other experimental facilities, we possess an ability to study the physics that occurs in conditions near those in nuclear detonations that no other country comes close to matching. It would be utterly pointless to throw that advantage away in order to build a new nuclear weapon we don’t need.
It does not surprise me that 10 former directors of the nation’s nuclear weapons laboratories signed a letter calling on the Secretaries of Energy and Defense to revisit our RRW policy. It would certainly serve the interests of the nuclear weapons laboratories. It is much easier to attract talented physicists to an active testing program than to serve as custodians of an aging stockpile, and new designs would mean new money, and the removal of any perceived existential threats to one or more of the existing labs on the basis of their redundancy. The problem is that it would not serve the interests of the country.
Let the RRW stay buried. The nuclear genie will return soon enough as it is.
Posted on October 23rd, 2010 8 comments
Thorium is a promising candidate as a future source of energy. I just wonder what it is about the stuff that inspires so many people to write nonsense about it. It doesn’t take a Ph.D. in physics to spot the mistakes. Most of them should be obvious to anyone who’s taken the trouble to read a high school science book. Another piece of misinformation has just turned up at the website of Popular Mechanics, dubiously titled “The Truth about Thorium and Nuclear Power.”
The byline claims that, “Thorium has nearly 200 times the energy content of uranium,” a statement I will assume reflects the ignorance of the writer rather than any outright attempt to deceive. She cites physicist Carlo Rubbia as the source, but if he ever said anything of the sort, he was making some very “special” assumptions about the energy conversion process that she didn’t quite understand. I assume it must have had something to do with his insanely dangerous subcritical reactor scheme, in which case the necessary assumptions to get a factor of 200 would have necessarily been very “special” indeed. Thorium cannot sustain the nuclear chain reaction needed to produce energy on its own. It must first be transmuted to an isotope of uranium with the atomic weight of 233 (U233) by absorbing a neutron. Strictly speaking, then, the above statement is nonsense, because the “energy content” of thorium actually comes from a form of uranium, U233, which can sustain a chain reaction on its own. However, let’s be charitable and compare natural thorium and natural uranium as both come out of the ground when mined.
As I’ve already pointed out, thorium cannot be used directly in a nuclear reactor on its own. Natural uranium can. It consists mostly of an isotope of uranium with a mass number of 238 (U238), along with a bit over 0.7% of a lighter isotope with a mass number of 235 (U235). U238, like thorium, is unable to support a nuclear chain reaction on its own, but U235, like U233, can. Technically speaking, what that means is that, when the nucleus of an atom of U233 or U235 absorbs a slow neutron, enough energy is released to cause the nucleus to split, or fission. When U238 or natural thorium (Th232) absorbs a neutron, energy is also released, but not enough to cause fission. Instead, they become U239 and Th233, which decay through short-lived intermediates to plutonium 239 (Pu239) and U233, respectively.
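The two breeding chains just described each pass through a short-lived intermediate on the way to the fissile end product. Here is a minimal sketch in Python, with the chain data hardcoded for illustration (half-lives are approximate values from standard decay tables; the function and variable names are my own):

```python
# Fertile-to-fissile transmutation chains after a single neutron capture.
# Each step: (nuclide, decay mode, approximate half-life).
CHAINS = {
    "Th232": [("Th233", "beta-", "22 min"),
              ("Pa233", "beta-", "27 days"),
              ("U233",  "fissile end product", None)],
    "U238":  [("U239",  "beta-", "23.5 min"),
              ("Np239", "beta-", "2.36 days"),
              ("Pu239", "fissile end product", None)],
}

def fissile_product(fertile):
    """Return the fissile nuclide at the end of a fertile nuclide's capture chain."""
    return CHAINS[fertile][-1][0]

for fertile in ("Th232", "U238"):
    steps = " -> ".join(nuclide for nuclide, _, _ in CHAINS[fertile])
    print(f"{fertile} + n -> {steps}")
```

Note that in both cases the intermediates decay on a timescale of minutes to weeks, which is why a breeder can convert its fertile material continuously while running.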
Let’s try to compare apples to apples, and assume that enough neutrons are around to convert all the Th232 to U233, and all the U238 to Pu239. In that case we are left with a lump of pure U233 derived from the natural thorium, and a mixture of about 99.3% Pu239 and 0.7% U235 from the natural uranium. In the first case, the fission of each atom of U233 will release, on average, 200.1 million electron volts (MeV) of energy that can potentially be converted to heat in a nuclear reactor. In the second, each atom of U235 will release, on average, 202.5 MeV, and each atom of Pu239 211.5 MeV. In other words, once the slightly different atomic masses are taken into account, the potential energy release per kilogram of natural thorium is within a few percent of that of natural uranium, and nowhere remotely near a factor of 200.
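The arithmetic is easy to check. A quick sketch, using the per-fission energies just quoted and approximate molar masses (232.0 g/mol for thorium, 238.0 g/mol for natural uranium); the function name and structure are my own for illustration:

```python
# Idealized energy per kilogram of mined metal, assuming every fertile atom
# is converted and every resulting fissile atom undergoes fission.
N_A = 6.022e23  # Avogadro's number, atoms per mole
MEV_PER_FISSION = {"U233": 200.1, "U235": 202.5, "Pu239": 211.5}

def mev_per_kg(molar_mass_g, atom_fractions):
    """Energy (MeV) per kg of metal; atom_fractions maps each fissile
    isotope to the fraction of atoms that end up as that isotope."""
    atoms = 1000.0 / molar_mass_g * N_A
    return atoms * sum(frac * MEV_PER_FISSION[iso]
                       for iso, frac in atom_fractions.items())

thorium = mev_per_kg(232.0, {"U233": 1.0})
uranium = mev_per_kg(238.0, {"Pu239": 0.993, "U235": 0.007})
print(f"Th/U energy ratio: {thorium / uranium:.3f}")  # -> Th/U energy ratio: 0.971
```

On this idealized basis the two fuels come out within about three percent of each other, with natural uranium very slightly ahead.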
Unfortunately, the “factor of 200” isn’t the only glaring mistake in the article. The author repeats the familiar yarn about how uranium was chosen over thorium for power production because it produced plutonium needed for nuclear weapons as a byproduct. In fact, uranium would have been the obvious choice even if weapons production had not been a factor. As pointed out earlier, natural uranium can sustain a chain reaction in a reactor on its own, and thorium can’t. Natural uranium can be enriched in U235 to make more efficient and smaller reactors. Thorium can’t be “enriched” in that way at all. Thorium breeders also produce U232, a highly radioactive and dangerous isotope that can’t be conveniently separated from U233, complicating the thorium fuel cycle. Finally, the plutonium that comes out of nuclear reactors designed for power production, known as “reactor grade” plutonium, contains significant quantities of heavier isotopes of plutonium in addition to Pu239, making it unsuitable for weapons production.
Apparently the author gleaned some further disinformation from Seth Grae, CEO of Lightbridge, a Virginia-based company promoting thorium power. He supposedly told her that U233 produced in thorium breeders “fissions almost instantaneously.” In fact, the probability that it will fission is entirely comparable to that of U235 or Pu239, and it will not fission any more “instantaneously” than those isotopes. Why Grae felt compelled to feed her this fable is beyond me, as “instantaneous” fission isn’t necessary to prevent diversion of U233 as a weapons material. Unlike plutonium, it can be “denatured” by mixing it with U238, from which it cannot be chemically separated.
It’s a mystery to me why so much nonsense is persistently associated with discussions of thorium, a potential source of energy that has a lot going for it. It has several significant advantages over the alternative uranium/plutonium breeder technology: it produces far smaller quantities of plutonium and other heavy actinides, it poses less danger that materials in the fuel cycle will be diverted for weapons purposes if the technology is done right, and it can operate in a more easily controlled “thermal” neutron environment. I can only suggest that people who write popular science articles about nuclear energy take the time to educate themselves about the subject. Tried and true old textbooks like Introduction to Nuclear Engineering and Introduction to Nuclear Reactor Theory by John Lamarsh have been around for years, don’t require an advanced math background, and should be readable by any intelligent person with a high school education.