Pollyanna Pinker’s Power Profundities

Recently Steven Pinker, public intellectual and author of a “history” of the Blank Slate debacle that was largely a fairy tale but at least drew attention to the fact that it happened, has been dabbling in something entirely different. Inspired by the latest UN jeremiad against climate change, he has embraced nuclear power, endorsing in a series of tweets articles that advocate expanded reliance on it, such as one that recently turned up at Huffpo cleverly entitled “If We’re Going To Save the Planet, We’ve Got To Use the Nuclear Option.” As things now stand, that would be a dangerous, wasteful, and generally ill-advised idea.

I say “as things now stand.” I’m certainly not opposed to nuclear power. I’m just opposed to the way it would be implemented if we suddenly decided to build a bevy of new nukes given current economic realities.  The new reactors would probably look like the AP1000 models recently abandoned in South Carolina. Such reactors would use only a fraction of the available energy in their nuclear fuel, and would produce far larger amounts of long-lived radioactive waste than necessary. They are, however, cheaper than alternatives that could avoid both problems using proven technologies. Given the small number of players capable of coming up with the capital necessary to build even these inferior reactors, there is little chance that more rational alternatives will be chosen until alternative sources of energy become a great deal more expensive, or government steps in to subsidize them. Until that happens, we are better off doing without new nuclear reactors.

As noted above, the reasons for this have to do with the efficient utilization of nuclear fuel, and the generation of radioactive waste.  In nature there is only one potential nuclear fuel – uranium 235, or U235. U235 is “fissile,” meaning it may fission if it encounters a neutron no matter how slow that neutron happens to be traveling.  As a result, it can sustain a nuclear chain reaction, which is the source of nuclear energy. Unfortunately, natural uranium consists of only 0.7 percent U235. Almost all the rest is a heavier isotope – U238. U238 is “fissionable.” In other words, it will fission, but only if it is struck by a very energetic neutron. It cannot sustain a fission chain reaction by itself.  However, if U238 absorbs a neutron, it becomes the isotope U239, which quickly decays to neptunium 239, which, in turn, quickly decays to plutonium 239. Plutonium 239 is fissile. It follows that if all the U238 in natural uranium could be converted to Pu239 in this way, it could release vastly more energy than the tiny amount of U235 alone. This is not possible in conventional reactors such as the AP1000 mentioned above. A certain amount of plutonium is produced and burned in the fuel elements of such reactors, but the amount is very small compared to the amount of available U238. In addition, other transuranic elements, such as americium and curium, which are produced in such reactors, along with various isotopes of plutonium, would remain dangerously radioactive for thousands of years.
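
For readers who like to see it spelled out, the breeding chain just described can be written as follows (the half-lives shown are approximate):

\[
{}^{238}\mathrm{U} + n \;\rightarrow\; {}^{239}\mathrm{U} \;\xrightarrow{\beta^-,\ \sim 23\ \mathrm{min}}\; {}^{239}\mathrm{Np} \;\xrightarrow{\beta^-,\ \sim 2.4\ \mathrm{days}}\; {}^{239}\mathrm{Pu}
\]

Since U238 makes up more than 99 percent of natural uranium, converting it to Pu239 in this way would multiply the usable energy content of the ore by something on the order of a hundredfold.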

These problems could be avoided by building fast breeder reactors. In conventional reactors, neutrons are “thermalized” to low energies, where the probability that they will react with a fuel nucleus is greatly increased. The neutron spectrum in “fast” reactors is significantly hotter and, as a result, more neutrons are produced, on average, in each fission.  More neutrons means that more Pu239 can be produced without quenching the fission chain reaction.  It also means that the dangerous transuranic elements referred to above, as well as the longest-lived fission products – together the source of the most long-lived and dangerous radioactive isotopes in nuclear waste – could be destroyed via fission or transmutation.  As a result, the residual radioactivity resulting from running such a reactor for, say, 30 years would drop below that released into the environment by a coal plant of comparable size within 300 to 500 years, as opposed to the thousands of years it would take for the waste of conventional reactors. And, yes, radioactivity is released by coal plants, because coal contains several parts per million each of radioactive uranium and thorium.  Meanwhile, a far higher percentage of the U238 in natural uranium would be converted to Pu239, resulting in a far more efficient utilization of the fuel material.
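
To see why waste dominated by fission products decays away so much faster than waste laden with transuranics, a few lines of Python suffice (the half-lives are approximate and the isotopes merely illustrative, not a complete waste inventory):

def fraction_remaining(half_life_years, t_years):
    # Simple decay law: N(t)/N0 = 0.5 ** (t / half_life)
    return 0.5 ** (t_years / half_life_years)

# Representative half-lives, in years (approximate)
isotopes = {
    "Sr-90 (fission product)": 28.8,
    "Cs-137 (fission product)": 30.2,
    "Am-241 (transuranic)": 432.0,
    "Pu-239 (transuranic)": 24100.0,
}

for name, t_half in isotopes.items():
    print(f"{name:26s} fraction left after 500 years: {fraction_remaining(t_half, 500):.2e}")

After 500 years the major fission products have gone through sixteen or seventeen half-lives and are essentially gone, while the Pu239 has barely begun to decay. That is why burning the transuranics in the reactor itself makes such a difference.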

An even better alternative might be molten salt reactors. In such reactors, the critical mass would be in liquid form, and would include thorium 232 (Th232) in addition to a fissile isotope.  When Th232 absorbs a neutron, it becomes Th233, which decays via protactinium 233 into U233, another fissile material.  Such reactors could run at a lower neutron “temperature” than plutonium breeders, and would be easier to control as a result.  The liquid core would also greatly reduce the danger of a nuclear accident. If it became too hot, it could simply be decanted into a holding pan where it would immediately become subcritical. Thorium is more abundant than uranium in nature, so the “fuel” material would be cheaper.
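
The corresponding thorium chain, written out the same way (half-lives again approximate):

\[
{}^{232}\mathrm{Th} + n \;\rightarrow\; {}^{233}\mathrm{Th} \;\xrightarrow{\beta^-,\ \sim 22\ \mathrm{min}}\; {}^{233}\mathrm{Pa} \;\xrightarrow{\beta^-,\ \sim 27\ \mathrm{days}}\; {}^{233}\mathrm{U}
\]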

Consider the above in the context of the present. Instead of extracting the vast amounts of energy locked up in U238, or “depleted” uranium, we use it for tank armor and armor-piercing munitions. In addition to this incredibly stupid waste of potentially vast energy resources, we dispose of huge amounts of it as “radioactive waste.”  Instead of treasuring our huge stores of plutonium as sources of carbon-free energy, we busy ourselves thinking up clever ways to render them “safe” for burial in waste dumps.  It won’t work.  Plutonium can never be made “safe” in this way. Pu239 has a half-life of roughly 24,000 years.  It will always be possible to extract it chemically from whatever material we choose to mix it with.  Even if it is “reactor grade,” including other isotopes of plutonium such as Pu240, it will still be extremely dangerous – difficult to make into a bomb, to be sure, but easy to assemble into a critical mass that could potentially result in radioactive contamination of large areas. Carefully monitored breeder reactors are the only way of avoiding these problems.
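
A couple of lines of Python make the timescale concrete (simple decay arithmetic only, using an approximate half-life; this is not a waste-management model):

import math

PU239_HALF_LIFE_YEARS = 24_100  # approximate

# Years until only a given fraction of the original Pu239 remains:
# t = T_half * log2(1 / fraction)
for fraction in (0.5, 0.1, 0.01):
    years = PU239_HALF_LIFE_YEARS * math.log2(1 / fraction)
    print(f"{fraction:.0%} remaining after roughly {years:,.0f} years")

Waiting for the problem to solve itself is not an option on any human timescale.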

According to the Huffpo article referenced above,

Doesn’t nuclear power contribute to nuclear weapons proliferation? No. Weapons programs do not depend on civilian nuclear power, which operates under stringent international safeguards.

Really? Will the “stringent international safeguards” last for the tens of thousands of years it will take for even half the plutonium waste produced by conventional reactors to decay? I would advise anyone who thinks it is impossible to fabricate this waste into a bomb, no matter what combination of isotopes it contains, to take an elementary course in nuclear engineering. The only way to avoid this problem is to burn all the plutonium in breeder reactors.  Predictably, the article doesn’t even mention the incredible wastefulness of current reactors, or the existence of breeder technology.

It’s nice that a few leftist “progressives” have finally noticed that their narrative on nuclear power has been controlled by imbeciles for the last half a century. I heartily concur that nuclear energy is a potent tool for reducing carbon and other greenhouse gas emissions.  I simply suggest that, if we decide to return to nuclear, we either provide the subsidies necessary to implement rational nuclear technologies now, or wait until it becomes economically feasible to implement them.

No, We Don’t Need to Resume Nuclear Testing

According to an article entitled In Alarming New Study, Nuclear Lab Scientists Question U.S. Weapons’ Performance that recently appeared in Investor’s Business Daily, a couple of Los Alamos scientists have released a report questioning the reliability of the U.S. nuclear arsenal, and calling for a resumption of nuclear testing.  It would be a bad idea.  Let me explain why.

The two scientists claim that the reliability of the aging weapons in our arsenal will become increasingly questionable as the number of years since their “best if used by” date increases.  They base their argument largely on uncertainties about whether computer codes will be able to accurately predict the performance of these aging weapons.  It’s true that computer codes have not always been perfectly accurate in predicting the outcome of complex physical processes.  However, the significance of that fact must be weighed in the context of how it affects all the nuclear powers, not just the United States.  In the absence of nuclear testing, we all have the same problem.  The relevant question, then, is not whether the problem exists, but how severely it impacts us compared to the other nuclear states.  We have conducted more nuclear tests than any other country, and therefore have a much larger database than our competitors with which to compare code predictions.  When it comes to computer codes, that gives us a very significant advantage as long as the moratorium on testing continues.  That advantage will become a great deal less significant if testing is resumed.

However, computer codes are not the only means we have of assessing the reliability of the weapons in our arsenal.  The U.S. also has an unmatched advantage in terms of experimental facilities that are able to access physical conditions relevant to those that occur in nuclear weapons.  These include, for example, the Z machine at Sandia National Laboratories, which is capable of producing a far more powerful burst of x-rays in the laboratory than any competitor, and the National Ignition Facility (NIF) at Lawrence Livermore National Laboratory, which can dump a huge amount of energy into a tiny target in a very short time.  Such facilities can access extreme material densities and temperatures, enabling a host of experiments in weapon physics and weapon effects that are currently beyond the capabilities of any other country.  If we resume testing we will be throwing this advantage out the window as well.  In short, we may have our problems when it comes to assessing the reliability of the weapons in our arsenal, but the problems faced by our potential competitors are even worse.  Under the circumstances, it makes little sense to do something as destabilizing as rocking the nuclear boat.

If we are really worried about the reliability of our arsenal, we should seriously consider adding another experimental facility to go along with Z, NIF, and the rest.  I refer to what is known as an Advanced Hydrodynamic Facility, or AHF.  We seriously considered building such a facility back in the late 90s, but have pretty much forgotten about it since then.  Basically, the AHF would be a very powerful particle accelerator, capable of delivering beams so penetrating that they could image the critical, high-explosive driven implosion process in nuclear weapons in its entirety in three dimensions.  Obviously, it would be necessary to replace the nuclear materials used in real weapons with suitable surrogates, but this would introduce very little uncertainty into the experimental results.  An AHF would not only add to our existing advantage over other nuclear states as long as the test moratorium continues, but would effectively lay to rest any remaining uncertainties not resolved by the computer codes and experimental facilities we already have.

I can understand the eagerness of weapon scientists to resume nuclear testing.  It would make their lives a lot more interesting.  However, it would hardly be to the advantage of the rest of us.  I suggest that, instead of unilaterally taking such a foolhardy step, we maintain and expand the advantage we already have and will continue to enjoy as long as the test moratorium continues.

Fisking a Fusion Fata Morgana

Why is it that popular science articles about fusion energy are always so cringe-worthy? Is scientific illiteracy a prerequisite for writing them? Take the latest one to hit the streets, for example. Entitled Lockheed Martin Now Has a Patent For Its Potentially World Changing Fusion Reactor, it had all the familiar “unlimited energy is just around the corner” hubris we’ve come to expect in articles about fusion. When I finished reading it I wondered whether the author imagined all that nonsense on his own, or whether some devilish plasma physicist put him up to it as a practical joke. The fun starts in the first paragraph, where we are assured that,

If this project has been progressing on schedule, the company could debut a prototype system that size of shipping container, but capable of powering a Nimitz-class aircraft carrier or 80,000 homes, sometime in the next year or so.

Trust me, dear reader, barring divine intervention no such prototype system, capable of both generating electric energy and fitting within a volume anywhere near that of a shipping container, will debut in the next year, or the next five years, or the next ten years.  Reading on, we learn that,

Unlike in nuclear fission, where atoms hit each other release energy, a fusion reaction involves heating up a gaseous fuel to the point where its atomic structure gets disrupted from the pressure and some of the particles fuse into a heavier nucleus.

Well, not really.  Fission is caused by free neutrons, not by “atoms hitting each other.”  It would actually be more accurate to say that fusion takes place when “atoms hit each other,” although it’s really the atomic nuclei that “hit” each other.  Fusion doesn’t involve “atomic structure getting disrupted from pressure.” Rather, it happens when two positively charged atomic nuclei acquire enough energy to overcome the Coulomb repulsion between them (remember, like charges repel), and come within a sufficiently short distance of each other for the much stronger, short-range nuclear force of attraction to take over. According to the author,

But to do this you need to be able to hold the gas, which is eventually in a highly energized plasma state, for a protracted period of time at a temperature of hundreds of millions of degrees Fahrenheit.

This is like claiming that a solid can be in a liquid state. A plasma is not a gas. It is a fourth state of matter quite unlike the three (solid, liquid, gas) that most of us are familiar with. Shortly thereafter we are assured that,

Running on approximately 25 pounds of fuel – a mixture of hydrogen isotopes deuterium and tritium – Lockheed Martin estimated the notional reactor would be able to run for an entire year without stopping. The device would be able to generate a constant 100 megawatts of power during that period.

25 pounds of fuel would include about 15 pounds of tritium, a radioactive isotope of hydrogen with a half-life of just over 12 years. In other words, its atoms decay about 2000 times faster than those of the plutonium 239 found in nuclear weapons.  It’s true that the beta particle (electron) emitted in tritium decay has quite a low energy by nuclear standards but, as noted in Wikipedia, “Tritium is an isotope of hydrogen, which allows it to readily bind to hydroxyl radicals, forming tritiated water (HTO), and to carbon atoms. Since tritium is a low energy beta emitter, it is not dangerous externally (its beta particles are unable to penetrate the skin), but it can be a radiation hazard when inhaled, ingested via food or water, or absorbed through the skin.”  Obviously, water and many carbon compounds can be easily inhaled or ingested. Tritium is anything but benign if released into the environment. Here we will charitably assume that the author didn’t mean that all 25 pounds of fuel would be on hand at once, but rather that the tritium would be bred gradually and consumed as fuel during operation.  The amount present in the reactor at any given time would more appropriately be measured in grams than in pounds.  The article continues with rosy scenarios that might have been lifted from a “Back to the Future” movie:
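
For anyone who wants to check the arithmetic behind those two numbers, here is a quick sketch (half-lives approximate, and an equimolar deuterium-tritium mixture assumed):

T_HALF_TRITIUM_YEARS = 12.3
T_HALF_PU239_YEARS = 24_100.0

# Decay rate scales as 1 / half-life
rate_ratio = T_HALF_PU239_YEARS / T_HALF_TRITIUM_YEARS
print(f"Tritium decays roughly {rate_ratio:,.0f} times faster than Pu239")

# Mass fraction of tritium in an equimolar D-T mixture (atomic masses of about 2 and 3)
tritium_mass_fraction = 3.0 / (2.0 + 3.0)
print(f"Tritium share of a 25 lb D-T charge: about {25 * tritium_mass_fraction:.0f} lb")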

Those same benefits could apply to vehicles on land, ships at sea, or craft in space, providing nearly unlimited power in compact form allowing for operations across large areas, effectively eliminating the tyranny of distance in many cases. Again, for military applications, unmanned ground vehicles or ships could patrol indefinitely far removed from traditional logistics chains and satellites could conduct long-term, resource intensive activities without the need for large and potentially dangerous fission reactors.

Great shades of “Mr. Fusion!” Let’s just say that “vehicles on land” is a bit of a stretch. I can only hope that no Lockheed engineer was mean-spirited enough to feed the author such nonsense. Moving right along, we read,

Therein lies perhaps the biggest potential benefits of nuclear fusion over fission. It’s produces no emissions dangerous to the ozone layer and if the system fails it doesn’t pose nearly the same threat of a large scale radiological incident. Both deuterium and tritium are commonly found in a number of regular commercial applications and are relatively harmless in low doses.

I have no idea what “emission” of the fission process the author thinks is “dangerous to the ozone layer.” Again, as noted above, tritium is anything but “relatively harmless” if ingested. Next we find perhaps the worst piece of disinformation of all:

And since a fusion reactor doesn’t need refined fissile material, its much harder for it to serve as a starting place for a nuclear weapons program.

Good grief, the highly energetic neutrons produced in a fusion reactor are not only capable of breeding tritium, but plutonium 239 and uranium 233 from naturally occurring uranium and thorium as well.  Both are superb explosive fuels for nuclear weapons.  And tritium?  It is used in a process known as “boosting” to improve the performance of nuclear weapons.  Finally, we run into what might be called the Achilles heel of all tritium-based fusion reactor designs:

Fuel would also be abundant and relatively easy to source, since sea water provides a nearly unlimited source of deuterium, while there are ready sources of lithium to provide the starting place for scientists to “breed” tritium.

I think not. Breeding tritium will be anything but a piece of cake.  The process will involve capturing the neutrons produced by the fusion reactions in a lithium blanket surrounding the reactor, doing so efficiently enough to generate more tritium from the resulting reactions than the reactor consumes as fuel, and then extracting the tritium and recycling it into the reactor without releasing any of the slippery stuff into the environment.  Do you think the same caliber of engineers who brought us Chernobyl, Fukushima, and Three Mile Island will be able to pull that rabbit out of their hats without a hitch?  If so, you’re more optimistic than I am.

Hey, I like to be as optimistic about fusion as it’s reasonable to be. I think it’s certainly possible that some startup company with a bright idea will find the magic bullet that makes fusion reactors feasible, preferably one involving fusion reactions that don’t require tritium. It’s also quite possible that the guys at Lockheed will achieve breakeven, although getting a high enough gain of energy out versus energy in to enable efficient generation of electric power is another matter.  There’s a difference between optimism and scientifically illiterate hubris, though.  Is it too much to ask that people who write articles about fusion at least run them by somebody who actually knows something about the subject to see if they pass the “ho, ho” test before publishing?  What’s that you say?  What about me?  Please read the story about the Little Red Hen.

Whither Nuclear Power? A Few Comments on Thorium and the End of the “Nuclear Renaissance”

About a decade ago there was much talk of a “nuclear renaissance” amid concerns about greenhouse gas emissions and the increasing cost of fossil fuel alternatives.  The Nuclear Regulatory Commission received applications to build no fewer than 31 new nuclear plants as the price of crude oil spiked to over $140 per barrel.  Now, however, with last month’s decision by SCANA Corp. to abandon the V. C. Summer project, a pair of nukes that had been under construction in South Carolina, nuclear’s future prospects look dim, at least in the United States.  Two plants remain under construction in Georgia but, like the ones abandoned in South Carolina, they are to be AP1000s, designed by Westinghouse.  Westinghouse filed for bankruptcy in March.  Delays and massive cost overruns similar to those that led to the demise of V. C. Summer also afflict the Georgia project, and its future seems doubtful at best.

In short, the dream of a nuclear renaissance has evaporated.  For the time being, at least, nuclear in the U.S. is no match for more agile competitors like wind, solar, and natural gas.  However, there may be a silver lining to this cloud.  Plants like Westinghouse’s AP1000 waste most of the energy in their nuclear fuel, creating massive amounts of avoidable radioactive waste in the process.  To the extent that it makes sense to build nuclear plants at all, these are not the kind we should be building.  To understand why this is true it is first necessary to acquire some elementary knowledge about nuclear physics.

The source of the energy produced in the core of nuclear reactors is a nuclear fission chain reaction.  Only one material that exists in significant quantities in nature can sustain such a chain reaction – uranium 235, or U235.  U235 is an isotope of uranium.  Isotopes of a given element consist of atoms with the same number of positively charged protons in their central core, or nucleus.  Like all other isotopes of uranium, U235 has 92.  There are also 143 electrically neutral neutrons, making a total of 235 “nucleons.”  Natural uranium consists of only about 0.7 percent U235.  Almost all the rest is a different isotope, U238, with a nucleus containing 146 neutrons instead of 143.

When we say that U235 can sustain a nuclear chain reaction, we mean that if a free neutron happens to come within a very short distance of its nucleus, it may be captured, releasing enough energy in the process to cause the nucleus to split into two fragments.  When this happens, more free neutrons are released, which can then be captured by other uranium nuclei, which, in turn, fission, releasing yet more neutrons, and so on.  As noted above, U235 is the only naturally occurring isotope that can sustain such a nuclear chain reaction.  However, other isotopes can be created artificially that can do so as well.  The most important of these are U233 and plutonium 239, or Pu239.  They are important because it is possible to “breed” them in properly designed nuclear reactors, potentially producing more usable fuel than the reactor consumes.  U233 is produced by the reactions following absorption of a neutron by thorium 232, or Th232, and Pu239 by those following the absorption of a neutron by U238.  In other words, we know of three practical types of nuclear fuel: U235, U233 and Pu239.  The first occurs naturally, and the other two can be readily “bred” artificially in nuclear reactors.
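
In symbols, the two breeding chains just referred to are:

\[
{}^{238}\mathrm{U} + n \rightarrow {}^{239}\mathrm{U} \xrightarrow{\beta^-} {}^{239}\mathrm{Np} \xrightarrow{\beta^-} {}^{239}\mathrm{Pu},
\qquad
{}^{232}\mathrm{Th} + n \rightarrow {}^{233}\mathrm{Th} \xrightarrow{\beta^-} {}^{233}\mathrm{Pa} \xrightarrow{\beta^-} {}^{233}\mathrm{U}
\]

In both cases the intermediate isotopes decay with half-lives of minutes to days, so the fissile end product accumulates quickly once neutrons are available.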

Let’s consider what this means in the case of conventional nuclear reactors like the Westinghouse AP1000.  These are powered by fuel elements in which the U235 is typically enriched from the naturally occurring 0.7 percent to between three and five percent.  The remaining 95 to 97 percent of the uranium in these fuel elements is U238.  When the fission process starts, some of the neutrons released are captured by the U238, eventually resulting in the production of Pu239.  Some of this plutonium fissions along with the U235, contributing to the total energy produced by the fuel elements.  However, only a small fraction of the U238 is converted to Pu239 in this way before the fuel is consumed and it becomes necessary to replace the old fuel elements with fresh ones.  In addition to a great deal of U238, these spent fuel elements contain a significant amount of plutonium, as well as other transuranic elements such as americium and curium, which can remain dangerously radioactive for thousands of years.  The “waste” plutonium might even be used to produce a nuclear weapon.

Obviously, if possible it would be better to extract all the energy locked up in natural uranium rather than just a small fraction of it.  In fact, it is possible, or very nearly so.  It is feasible to build breeder reactors that could burn nearly all the U238 in natural uranium, as well as the U235, by converting the U238 into Pu239.  In the process they could destroy much of the transuranic waste that is the main source of radioactive danger from spent fuel.  In as little as 500 years the residual radioactivity from running a nuclear plant for 30 years could potentially be less than that of the original naturally occurring uranium.  Unfortunately, while all this is scientifically feasible, it is not economically feasible.  It won’t happen without massive government subsidies.  Perhaps such subsidies are warranted in view of the threat of climate change and perhaps not, but, regardless, breeder reactors won’t be built without them.  Since they are really the only types of reactors it makes sense to build, we would probably be better off, at least for the time being, building no reactors at all.  That’s the “silver lining” I referred to above.  Perhaps a time will come when the world runs out of expendable sources of base load electrical power, such as oil, coal and natural gas, and no way has been found to take up the slack with renewables.  In that case, it may once again make economic sense to build breeder reactors.  Until that time, the United States would do well to build up a healthy stockpile of uranium, and put a stop to the stupid, wasteful, and counterproductive practice of using depleted uranium – a potential source of vast amounts of energy – to produce munitions and armor.

But wait, there’s more!  What about thorium?  Thorium by itself can’t sustain a nuclear chain reaction.  It can, however, be converted into U233 by neutron absorption, and that is an ideal reactor fuel.  Among other things, U233 generates more neutrons per fission at lower neutron “temperatures” than either Pu239 or U235.  That means that extra neutrons are available to “breed” fuel at those lower temperatures, where nuclear reactors are easier to control.  By “temperature” here, we’re referring to the average speed of the neutrons.  The slower they are, the more likely they are to be absorbed by a nucleus and cause fission reactions.  Neutrons are slowed in “moderators,” which can consist of any of a number of light atoms.  The most common is plain water, consisting of the elements hydrogen and oxygen.  Think of a billiard ball hitting another billiard ball head on.  It comes to a complete stop, transferring its energy to the other ball.  The same thing can happen with a neutron and the proton nucleus of a hydrogen atom, which are of approximately equal mass.  To breed plutonium effectively, by contrast, reactors must be run at significantly higher neutron temperatures.
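
The billiard ball picture can be made quantitative.  For a head-on elastic collision between a neutron and a nucleus of mass number A, the fraction of its kinetic energy the neutron keeps is approximately

\[
\frac{E'}{E} = \left(\frac{A-1}{A+1}\right)^{2},
\]

which is zero for ordinary hydrogen (A = 1) and about 0.72 for carbon (A = 12).  That is why light nuclei make the best moderators, and why water, with its two hydrogen atoms per molecule, is so commonly used.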

There’s more good news about thorium.  It can be dissolved in various exotic mixtures and used to breed U233 in a reactor with a liquid instead of a solid core.  This would have a number of advantages.  In the first place, a “meltdown” would be impossible in a core that’s already “melted.”  If the core became too “hot” it could simply be drained into a holding pan to form a subcritical mass that would quickly cool.  It would also be possible to extract waste fission products and introduce fresh fuel, etc., into the core “on the fly.”  As a result the reactor would be able to stay in operation longer between shutdowns for maintenance and refueling.  The necessary technology has already been demonstrated at places like Oak Ridge, Tennessee and Shippingport, Pennsylvania.  Recently, a Dutch team finally began experiments with molten salt technology intended to take up where these earlier experiments left off after a hiatus of more than 40 years.

Perhaps thorium’s biggest problem is the tendency of its proponents to over-hype its promise.  It even has a founding myth based on bogus claims that thorium technology isn’t dominant in the energy industry today because “it’s much harder to weaponize.”  For example, according to the article about the Dutch experiments linked above, entitled, ‘Safer’ thorium reactor trials could salvage nuclear power,

But, if it’s so safe and reliable why hasn’t thorium been used all along? Because (unlike uranium) it’s much harder to weaponize. As a result, it’s historically been sidelined by nations in search of both energy and a potential source of weapons-grade plutonium.

This yarn about a benign source of energy that might have benefited all mankind being torpedoed by evil weaponeers might sound good, but it’s complete nonsense.  Thorium itself can’t be weaponized, because it can’t sustain a nuclear chain reaction on its own.  The sole reason there’s any interest in it at all as a source of nuclear power is the possibility of transmuting it to U233.  Of course, it can’t be used to produce weapons-grade plutonium.  However, there is no better material for making nuclear bombs than U233.  As is the case with Pu239, four kilograms is sufficient to make a nuclear weapon, compared to the 25 kilograms needed if U235 is used.  Its main drawback as a weapons material is the fact that small amounts of U232 are produced along with it in thorium-based reactors, and U232 decays into radioactive daughters that are deadly sources of powerful gamma rays.  However, the amount of U232 produced can be reduced dramatically by cooling the neutron spectrum to a low “temperature.”  In short, thorium could definitely be used to make weapons.  The reason it isn’t the dominant technology for that purpose is the same as the reason it isn’t the dominant technology for producing electric power: it would be significantly more complex and expensive than using natural or slightly enriched uranium as a fuel.  That reason is as valid now as it was in the days of Little Boy and Fat Man.  The “dominant technology” would be the same as it is today whether nuclear weapons had ever been produced or not.

When it comes to the technology itself, thorium proponents also tend to be coy about mentioning problems that don’t afflict other reactor types.  For example, the molten salts needed for practical molten salt reactors are extremely corrosive.  There has been progress towards finding a metal that can hold them, but no ideal alloy has yet been found.  This isn’t necessarily a show stopper, but it’s not an insignificant problem, either.  Such material issues have been largely solved for conventional reactors.  If, as would seem to be the case, these are no longer economically competitive with their rivals, then molten salt is pretty much out of the question, at least for the time being.  It’s important to point out that, if breeder reactors ever do become economically feasible again, it will always be necessary to ensure that they are secure, and that the materials they produce can’t be diverted for making weapons.  That concern applies to both plutonium and thorium breeders.

Meanwhile, it might behoove our political leaders to consider the question of why it was once possible to build more than 50 experimental reactors at what is now Idaho National Laboratory alone, in a relatively short period of time, for a small fraction of what similar reactors would cost today.  Merely negotiating the regulatory hurdles for building a power reactor based on anything as novel as the thorium fuel cycle would take the better part of a decade.  All these hurdles have been put in place in the name of “safety.”  That raises the question of how “safe” we will be if we lack reliable sources of electric energy.  There is a point beyond which excessive regulation itself becomes unsafe.

The Bomb and the Nuclear Posture Review

A Nuclear Posture Review (NPR) is a legislatively mandated review, typically conducted every five to ten years.  It assesses such things as the role, safety and reliability of the weapons in the U.S. nuclear stockpile, the status of facilities in the nuclear weapons complex, and nuclear weapons policy in areas such as nonproliferation and arms control.  The last one was conducted in 2010.  The Trump Administration directed that another one be conducted this year, and the review is already in its initial stages.  It should be finished by the end of the year.  There is reason for concern about what the final product might look like.

Trump has made statements to the effect that the U.S. should “expand its nuclear capability,” and that, “We have nuclear arsenals that are in very terrible shape.  They don’t even know if they work.”  Such statements have typically been qualified by his aides.  It’s hard to tell whether they reflect serious policy commitments, or just vague impressions based on a few minutes of conversation with some Pentagon wonk.  In fact, there are deep differences of opinion about these matters within the nuclear establishment.  That’s why the eventual content of the NPR might be problematic.  There have always been people within the nuclear establishment, whether at the National Nuclear Security Administration (NNSA), the agency within the Department of Energy responsible for maintaining the stockpile, or in the military, who are champing at the bit to resume nuclear testing.  Occasionally they will bluntly question the reliability of the weapons in our stockpile, even though by that very act they diminish the credibility of our nuclear deterrent.  If Trump’s comments are to be taken seriously, the next NPR may reflect the fact that they have gained the upper hand.  That would be unfortunate.

Is it really true that the weapons in our arsenal are “in very terrible shape,” and we “don’t even know if they work?”  I doubt it.  In the first place, the law requires that both the Department of Energy and the Department of Defense sign off on an annual assessment that certifies the safety and reliability of the stockpile.  They have never failed to submit that certification.  Beyond that, the weapons in our stockpile are the final product of more than 1000 nuclear tests.  They are both safe and robust.  Any credible challenge to their safety and reliability must cite some plausible reason why they might fail.  I know of no such reason.

For the sake of argument, let’s consider what might go wrong.  Modern weapons typically consist of a primary and a secondary.  The primary consists of a hollow “pit” of highly enriched uranium or plutonium surrounded by high explosive.  Often it is filled with a “boost” gas consisting of a mixture of deuterium and tritium, two heavy isotopes of hydrogen.  When the weapon is used, the high explosive implodes the pit, causing it to form a dense mass that is highly supercritical.  At the same time, nuclear fusion takes place in the boost gas, producing highly energetic neutrons that enhance the yield of the primary.  At the right moment an “initiator” sends a burst of neutrons into the imploded pit, setting off a chain reaction that results in a nuclear explosion.  Some of the tremendous energy released in this explosion in the form of x-rays then implodes the secondary, causing it, too, to explode, adding to the yield of the weapon.

What could go wrong?  Of course, high explosives can deteriorate over time.  Those used to implode the primary are therefore carefully monitored to detect any such deterioration.  Other than that, the tritium in the boost gas is radioactive, and has a half-life of only a little over 12 years.  It will gradually decay into helium, reducing the effectiveness of boosting.  This, too, however, is a well-understood process, and one which is carefully monitored and compensated for by timely replacement of the tritium.  Corrosion of key parts might occur, but this, too, is carefully checked, and the potential sources are well understood.  All these potential sources of uncertainty affect the primary.  However, much of the uncertainty about their effects can be eliminated experimentally.  Of course, the experiments can’t include actual nuclear explosions, but surrogate materials with similar properties can be substituted for the uranium and plutonium in the pit.  The implosion process can then be observed using powerful x-ray or proton beams.  Unfortunately, our experimental capabilities in this area are limited.  We cannot observe the implosion process in three dimensions all the way from the initial explosion to the point at which maximum density is achieved, taking “snapshots” at optimally short intervals.  To do that, we would need what has been referred to as an Advanced Hydrodynamic Facility, or AHF.
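
As a rough illustration of why the tritium has to be replaced on a schedule, using nothing but the published half-life of about 12.3 years:

T_HALF_YEARS = 12.3  # tritium half-life, approximate

for years in (1, 5, 10):
    remaining = 0.5 ** (years / T_HALF_YEARS)
    print(f"After {years:2d} years, about {remaining:.0%} of the original tritium remains")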

We currently have an unmatched suite of above ground experimental facilities for studying the effects of aging on the weapons in our stockpile, including the National Ignition Facility (NIF) at Lawrence Livermore National Laboratory, the Z Machine at Sandia National Laboratories, and the Dual-Axis Radiographic Hydrodynamic Test facility (DARHT) at Los Alamos.  These give us a very significant leg up on the international competition when it comes to maintaining our stockpile.  That is a major reason why it would be foolish for us to resume nuclear testing.  We would be throwing away this advantage.  Unfortunately, while we once seriously considered building an AHF, basically an extremely powerful accelerator, we never got around to doing so.  It was a serious mistake.  If we had such a facility, it would effectively pull the rug out from under the feet of those who want to resume testing.  It would render all arguments to the effect that “we don’t even know if they work” moot.  We could demonstrate with a very high level of confidence that they will indeed work.

But that’s water under the bridge.  We must hope that cooler heads prevail, and the NPR doesn’t turn out to be a polemic challenging the credibility of the stockpile and advising a resumption of testing.  We’re likely to find out one way or the other before the end of the year.  Keep your fingers crossed.

Nuclear Fusion Update

At the moment the National Ignition Facility (NIF) at Lawrence Livermore National Laboratory is in a class by itself when it comes to inertial confinement fusion (ICF) facilities.  That may change before too long.  A paper by a group of Chinese authors describing a novel three-axis cylindrical hohlraum design recently appeared in the prestigious journal Nature.  In ICF jargon, a “hohlraum” is a container, typically cylindrical in form.  Powerful laser beams are aimed through two or more entrance holes to illuminate the inner wall of the hohlraum, producing a burst of x-rays.  These strike a target filled with fusion fuel, typically heavy isotopes of hydrogen, mounted inside the hohlraum, causing it to implode.  At maximum compression, a series of shocks driven into the target are supposed to converge in the center, heating a small “hot spot” to fusion conditions.  Unfortunately, such “indirect drive” experiments haven’t worked so far on the NIF.  The 1.8 megajoules delivered by NIF’s 192 laser beams haven’t been enough to achieve ignition with current target designs, even though the beams are very clean and uniform, and the facility itself is working as designed.  Perhaps the most interesting thing about the Chinese paper is not the novel three-axis hohlraum design itself, but the fact that the authors are still interested in ICF at all in spite of the failure of the NIF to achieve ignition to date.  To the best of my knowledge, they are still planning to build SG-IV, a 1.5 megajoule facility, with ignition experiments slated for the early 2020s.
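
To put the 1.8 megajoules in perspective, a little arithmetic helps (the few-nanosecond pulse length used here is an assumption for illustration; real NIF pulses are shaped and can be considerably longer):

TOTAL_ENERGY_J = 1.8e6   # design limit quoted above
BEAMS = 192
PULSE_S = 4e-9           # assumed effective duration of the main drive

print(f"Energy per beam: about {TOTAL_ENERGY_J / BEAMS / 1e3:.0f} kJ")
print(f"Average power over the pulse: about {TOTAL_ENERGY_J / PULSE_S / 1e12:.0f} TW")

Hundreds of terawatts for a few billionths of a second – for that brief instant, far more power than the combined output of all the world’s power plants – delivered into a target a few millimeters across.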

Why would the Chinese want to continue building a 1.5 megajoule facility in spite of the fact that U.S. scientists have failed to achieve ignition with the 1.8 megajoule NIF?  For the answer, one need only look at who paid for the NIF, and why.  The project was paid for by the people at the Department of Energy (DOE) responsible for maintaining the nuclear stockpile.  Many of our weapons designers were ambivalent about the value of achieving ignition before the facility was built, and were more interested in the facility’s ability to access physical conditions relevant to those in exploding nuclear weapons for studying key aspects of nuclear weapon physics such as equation of state (EOS) and opacity of materials under extreme conditions.  I suspect that’s why the Chinese are pressing ahead as well.  Meanwhile, the Russians have also announced a super-laser project of their own that they claim will deliver energies of 2.8 megajoules.

Meanwhile, in the wake of the failed indirect drive experiments on the NIF, scientists in favor of the direct drive approach have been pleading their case.  In direct drive experiments the laser beams are shot directly at the fusion target instead of at the inner walls of a hohlraum.  The default approach for the NIF has always been indirect drive, but direct drive may still be possible on the facility using a technique called “polar direct drive.”  In recent experiments at the OMEGA laser facility at the University of Rochester’s Laboratory for Laser Energetics, the nation’s premier direct drive facility, scientists claim to have achieved results that, if scaled up to the energies available on the NIF, would produce five times more fusion energy output than has been achieved with indirect drive to date.

Meanwhile, construction continues on ITER, a fusion facility designed purely for energy applications.  ITER will rely on magnetic plasma confinement, the other “mainstream” approach to harnessing fusion energy.  The project is a white elephant that continues to devour ever increasing amounts of scarce scientific funding in spite of the fact that the chances that magnetic fusion will ever be a viable source of electric power are virtually nil.  That fact should be obvious by now, and yet the project staggers forward, seemingly with a life of its own.  Watching its progress is something like watching the Titanic’s progress towards the iceberg.  Within the last decade the projected cost of ITER has metastasized from the original 6 billion euros to 15 billion euros in 2010, and finally to the latest estimate of 20 billion euros.  There are no plans to even fuel the facility for full power fusion until 2035!  It boggles the mind.

Magnetic fusion of the type envisioned for ITER will never come close to being an economically competitive source of power.  It would already be a stretch if it were merely a question of controlling an unruly plasma and figuring out a viable way to extract the fusion energy.  Unfortunately, there’s another problem.  Remember all those yarns you’ve been told about how an unlimited supply of fuel is supposed to be on hand in the form of sea water?  In fact, reactors like ITER won’t work without a heavy isotope of hydrogen known as tritium.  A tritium nucleus contains a proton and two neutrons, and, for all practical purposes, the isotope doesn’t occur in nature, in sea water or anywhere else.  It is highly radioactive, with a very short half-life of a bit over 12 years, and the only way to get it is to breed it.  We are told that fast neutrons from the fusion reactions will breed sufficient tritium in lithium blankets surrounding the reaction chamber.  That may work on paper, but breeding enough of the isotope and then somehow extracting it will be an engineering nightmare.  There is virtually no chance that such reactors will ever be economically competitive with renewable power sources combined with base load power supplied by proven fission breeder reactor technologies, which can also consume most of the long-lived transuranic waste they produce.

In short, ITER should be stopped dead in its tracks and abandoned.  It won’t be, because too many reputations and too much money are on the line.  It’s too bad.  Scientific projects that are far worthier of funding will go begging as a result.  At best my descendants will be able to say, “See, my grandpa told you so!”

Fusion Update: The Turn of Direct Drive

Inertial confinement fusion, or ICF, is one of the two “mainstream” approaches to harnessing nuclear fusion in the laboratory.  As its name would imply, it involves dumping energy into nuclear fuel, commonly consisting of heavy isotopes of hydrogen, so fast that its own inertia holds it in place long enough for significant thermonuclear fusion to occur.  “Fast” means times on the order of billionths of a second.  There are, in turn, two main approaches to supplying the necessary energy: direct drive and indirect drive.  In direct drive the “target” of fuel material is hit directly by laser beams or some other type of energetic beam.  In indirect drive, the target is mounted inside a “can,” referred to as a “hohlraum.”  The beams are aimed through holes in the hohlraum at the inner walls.  There they are absorbed, producing x-rays, which supply the actual energy to the target.

To date, the only approach used at the biggest ICF experimental facility in the world, the National Ignition Facility, or NIF, at Lawrence Livermore National Laboratory (LLNL), has been indirect drive.  So far, it has failed to achieve the goal implied by the facility’s name – ignition – defined as more fusion energy out than laser energy in.  A lot of very complex physics goes on inside those cans, and the big computer codes used to predict the outcome of the experiments didn’t include enough of it to be right.  They predicted ignition, but LLNL missed it by over a factor of 10.  That doesn’t necessarily mean that the indirect drive approach will never work.  However, the prospects of that happening are becoming increasingly dim.

Enter direct drive.  It has always been the preferred approach at the Naval Research Laboratory and the Laboratory for Laser Energetics (LLE) at the University of Rochester, the latter being home to the second biggest laser fusion facility in the world, OMEGA.  They lost the debate to the guys at LLNL as the NIF was being built, but still managed to keep a crack open for themselves, in the form of polar direct drive.  It would have been too difficult and expensive to configure the NIF beams so that they would be ideal for indirect drive, but could then be moved into a perfectly symmetric arrangement for direct drive.  However, by carefully tailoring the pulse length and power of each of the 192 laser beams, and delicately adjusting the thickness of the target at different locations, it is still theoretically possible to get a symmetric implosion.  That is the idea behind polar direct drive.

With indirect drive on the ropes, there are signs that direct drive may finally have its turn.  One such sign was the recent appearance in the prestigious journal Physics of Plasmas of a paper entitled Direct-drive inertial confinement fusion: A review.  At the moment it is listed as the “most read” of all the articles in this month’s issue, although actually getting through it is probably beyond the ability of non-experts: the article is more than 100 pages long, and contains no fewer than 912 references to work by other scientists.  Take a look at the list of authors, though.  They include familiar direct drive stalwarts like Bob McCrory, John Sethian, and Dave Meyerhofer.  However, one can tell which way the wind is blowing by looking at some of the other names.  They include some that haven’t been connected so closely with direct drive in the past.  Notable among them is Bill Kruer, a star in the ICF business who specializes in theoretical plasma physics, but who works at LLNL, home turf for the indirect drive approach.

Will direct drive ignition experiments happen on the NIF?  Not only science, but politics is involved, and not just on Capitol Hill.  Money is a factor, as operating the NIF isn’t cheap.  There has always been a give and take, or tug of war, if you will, between the weapons guys and the fusion energy guys.  It must be kept in mind that the NIF was built primarily to serve the former, and they have not historically always been full of enthusiasm for ignition experiments.  There is enough energy in the NIF beams to create conditions sufficiently close to those that occur in nuclear weapons without achieving ignition.  Finally, many in the indirect drive camp are far from being ready to throw in the towel.

In spite of that, some tantalizing signs of a change in direction are starting to turn up.  Of course, the “usual suspects” at NRL and LLE continue to publish direct drive papers, but a paper was also just published in the journal High Energy Density Physics entitled, A direct-drive exploding-pusher implosion as the first step in development of a monoenergetic charged-particle backlighting platform at the National Ignition Facility.  An exploding pusher target is basically a little glass shell filled with fusion fuel, usually in gaseous form.  For various reasons, such targets are incapable of reaching ignition/breakeven.  However, they were the type of target used in the first experiments to demonstrate significant fusion via laser implosion at the now defunct KMS Fusion, Inc., back in 1974.  According to the paper, all of the NIF’s 192 beams were used to implode such a target, and they were, in fact, tuned for polar direct drive.  However, they were “dumbed down” to deliver a little over 43 kilojoules to the target, only a bit more than two percent of the design limit of 1.8 megajoules!  Intriguingly enough, that happens to be just about the same energy that can be delivered by OMEGA.  The target was filled with a mixture of deuterium (hydrogen with an extra neutron) and helium 3.  Fusion of those two nuclei produces a highly energetic proton at 14.7 MeV.  According to the paper, copious amounts of these mono-energetic protons were detected.  Ostensibly, the idea was to use the protons as a “backlighter.”  In other words, they would be used merely as a diagnostic, shining through some other target to record its behavior at very high densities.  That all sounds a bit odd to me.  If all 192 beams are used for the backlighter, what’s left to hit the target that’s supposed to be backlighted?  My guess is that the real goal here was to try out polar direct drive for later attempts at direct drive ignition.
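
For reference, the reaction being exploited here is

\[
\mathrm{D} + {}^{3}\mathrm{He} \;\rightarrow\; {}^{4}\mathrm{He}\ (3.7\ \mathrm{MeV}) + p\ (14.7\ \mathrm{MeV}),
\]

with the proton carrying most of the roughly 18.3 MeV released.  Because the protons all come off at essentially the same energy, they make a natural mono-energetic source for radiography.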

All I can say is, stay tuned.  The guys at General Atomics down in San Diego who make the targets for NIF may already be working on a serious direct drive ignition target for all I know.  Regardless, I hope the guys at LLNL manage to pull a rabbit out of their hat and get ignition one way or another.  Those “usual suspects” among the authors I mentioned have all been at it for decades now, and are starting to get decidedly long in the tooth.  It would be nice if they could finally reach the goal they’ve been chasing for so long before they finally fade out of the picture.  Meanwhile, I can but echo the words of Edgar Allan Poe:

Over the Mountains
Of the Moon,
Down the Valley of the Shadow,
Ride, boldly ride,
The shade replied —
If you seek for El Dorado.

Of Solar Energy and Amateurish Agitprop at Fox News

Reading the “news” can be a painful experience in our time.  Most of it consists of a blend of sensationalism, human interest stories, accounts of the lives of various vapid celebrities, and attempts to inspire virtuous indignation based on a half-baked knowledge of some ideologically loaded issue or other.  One finds very little that could be accurately described as useful knowledge about things that are likely to have a major impact on our lives.  I generally find Fox News less painful to read than what is commonly described as the Mainstream Media because I happen to be emotionally conservative.  However, I must admit that Fox can occasionally be more ham-handed than the competition when it comes to dishing out propaganda.

A story that recently turned up on the Fox website is a case in point.  It happened to be about the Ivanpah solar generating system that was recently completed in California’s Mojave Desert.  The word “solar” should enable most readers to predict the ideological slant on the story one is likely to find at Fox.  Sure enough, the title of the story is, “Taxpayer-backed solar plant actually a carbon polluter.”  In the article itself we learn that the plant,

…is producing carbon emissions at nearly twice the amount that compels power plants and companies to participate in the state’s cap-and-trade program.

In fact, the plant does emit CO2 because it burns natural gas to avoid damage to equipment and to serve as a baseline source of power to meet electricity needs at night or during cloudy days.  A bit further on, we learn from a “research fellow at the Heartland Institute” named H. Sterling Burnett that,

…designers also erred in placing Ivanpah between the tallest mountains in the Mojave where there is significant cloud cover and dust which would interfere with the sunlight.

He adds that,

…They say it is green, but that assumes that there is a power source without any environmental impact.

I don’t find anything as egregious as actual lies in the article.  Rather, Fox limits itself to “creative” use of the truth.  For example, it may be quite true that the plant, “…is producing carbon emissions at nearly twice the amount that compels power plants and companies to participate in the state’s cap-and-trade program,” but it’s also true that it produces far less carbon per unit of electricity delivered than a purely fossil fuel fired plant, a fact that is left unsaid in spite of its much greater relevance to the underlying issue of climate change.  A researcher at the Heartland Institute is quoted without mentioning that the institute is funded by the fossil fuel industry, and is considered a source of blatant disinformation by environmentalists.  That charge may be unfair, but one can hardly claim that it is irrelevant and should be ignored.  As for his claim that, “designers also erred in placing Ivanpah between the tallest mountains in the Mojave,” etc., I invite interested readers who may happen to visit Las Vegas to drive out and have a look at the plant.  It’s actually quite a spectacular sight.  It certainly doesn’t appear to be sitting in the shadow of towering mountains, and the cloud cover is generally minimal, as one can confirm by Googling nearby locations.  As for the dust, one surmises that it would have been worse if the plant had been built on the Los Angeles side of the mountains.  As for Burnett’s last remark, as far as I am aware not even the most wild-eyed and fanatical environmentalist has ever claimed that the description of a power source as “green” implies the assumption that it has no environmental impact at all.

The reality is that the plant is reasonably sited given the location of the major consumers of the power it produces.  Given the current limitations in our ability to store and distribute the excess power produced by renewable energy sources like wind and solar, some form of base load power is always necessary to ensure a steady supply of electricity when the wind isn’t blowing or the sun isn’t shining.  My own choice for that purpose would be nuclear, but given the regulatory hurdles in the way, that would probably have been impractical for Ivanpah.  Natural gas produces significantly less CO2 than, for example, coal, and was probably the best choice.

In short, the article is an example of what I have referred to above as “attempts to inspire virtuous indignation based on a half-baked knowledge of some ideologically loaded issue or other.”  If the goal at Fox had been to inform rather than propagandize, they would have provided the reader with “fair and balanced” information about the cost of electricity produced at Ivanpah compared to alternative sources, the amount actually produced in comparison with predictions, the amount of CO2 it produces per unit of electricity in comparison to coal or oil fired plants, the relative advantages of solar and nuclear in limiting greenhouse gas emissions, etc.  None of what I write here should be taken to imply a belief that solar should be preferred to any alternative.  In fact, my own choice would be to reduce the regulatory burden to rational levels and build next generation nuclear plants instead.  However, regardless of the technology involved, I would prefer to see it judged on a level playing field.

I know, I know, the MSM is hardly innocent of slanting the news.  Indeed, its hysterical response after the announcement that Sarah Palin would be John McCain’s running mate puts anything I have ever seen at Fox News completely in the shade.  Generally, however, it tends to be at least marginally more subtle.  For example, instead of attempting to slant important news stories that don’t fit its narrative, it will often simply ignore them.  If the story is too big to ignore, it will vilify the messenger instead.  Of course, such techniques reflect a greater maturity and experience in handling agitprop than is available to the team at Fox News.  However, that doesn’t prevent them from learning by example.  Given that we will be subjected to propaganda no matter which “news” source we choose to follow, we should at least be able to demand that it not be crudely done.

[Image: Ivanpah]

Another Fusion White Elephant Sighted in Germany

According to an article that just appeared in Science magazine, scientists in Germany have completed building a stellarator by the name of Wendelstein 7-X (W7-X), and are seeking regulatory permission to turn the facility on in November.  If you can’t get past the Science paywall, here’s an article in the popular media with some links.  Like the much bigger ITER facility now under construction at Cadarache in France, W7-X is a magnetic fusion device.  In other words, its goal is to use powerful magnetic fields to confine a plasma of heavy hydrogen isotopes at temperatures much hotter than the center of the sun, in order to get them to fuse and release energy in the process.  There are significant differences between stellarators and the tokamak design used for ITER, but in both approaches the idea is to hold the plasma in place long enough to get significantly more fusion energy out than was necessary to confine and heat the plasma.  Both approaches are probably scientifically feasible.  Both are also white elephants, and a waste of scarce research dollars.

The problem is that both designs have an Achilles heel.  Its name is tritium.  Tritium is a heavy isotope of hydrogen with a nucleus containing a proton and two neutrons instead of the usual lone proton.  Fusion reactions between tritium and deuterium, another heavy isotope of hydrogen with a single neutron in addition to the usual proton, begin to occur fast enough to be attractive as an energy source at plasma temperatures and densities far lower than those necessary for any alternative reaction.  The deuterium-tritium, or DT, reaction will remain the only feasible one for both stellarator and tokamak fusion reactors for the foreseeable future.  Unfortunately, tritium occurs in nature in only tiny trace amounts.

The question is, then, where do you get the tritium fuel to keep the fusion reactions going?  Well, in addition to a helium nucleus, the DT fusion reaction produces a fast neutron.  These can react with lithium to produce tritium.  If a lithium-containing blanket could be built surrounding the reaction chamber in such a way as to avoid interfering with the magnetic fields, and yet thick enough and close enough to capture enough of the neutrons, then it should be possible to generate enough tritium to replace that burned up in the fusion process.  It sounds complicated but, again, it appears to be at least scientifically feasible.  However, it is by no means as certain that it is economically feasible.
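
For those who like to check the arithmetic, the energy bookkeeping of the breeding reaction, n + ⁶Li → ⁴He + T, is easy to verify from tabulated atomic masses.  The short Python sketch below does the sum; the mass values are standard, but the script itself is merely illustrative and isn’t taken from any reactor design document:

# Q value of the tritium breeding reaction n + Li-6 -> He-4 + T,
# computed from atomic masses in unified atomic mass units (u).
U_TO_MEV = 931.494        # energy equivalent of 1 u, in MeV

masses = {                # atomic masses in u, rounded
    "n":   1.008665,
    "Li6": 6.015123,
    "He4": 4.002602,
    "T":   3.016049,
}

q_breed = (masses["n"] + masses["Li6"] - masses["He4"] - masses["T"]) * U_TO_MEV
print(f"Q(n + Li-6 -> He-4 + T) = {q_breed:.2f} MeV")   # roughly 4.78 MeV

The reaction is exothermic by roughly 4.8 MeV, which is one reason lithium-6 is the isotope of choice for the breeding blanket; lithium-7 also yields tritium, but only with fast neutrons and at an energy cost.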

Consider what we’re dealing with here.  Tritium is an extremely slippery material that can pass right through the walls of some types of metal.  It is also highly radioactive, with a half-life of about 12.3 years.  It will be necessary to find some way to extract it efficiently from the lithium blanket, allowing none of it to leak into the surrounding environment.  If any of it gets away, it will be easily detectable.  The neighbors are sure to complain and, probably, lawyer up.  Again, all this might be doable.  The problem is that it will never be doable at a low enough cost to make fusion reactor designs based on these approaches even remotely economically competitive with the non-fossil alternative sources of energy that will be available for, at the very least, the next several centuries.
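
That 12.3-year half-life matters for the fuel cycle as well as for safety: any tritium sitting in storage or held up in the blanket and processing loop is constantly decaying away.  Here is a quick back-of-the-envelope calculation, a sketch of my own rather than anything drawn from a reactor study:

import math

HALF_LIFE_YR = 12.3                      # tritium half-life in years
SECONDS_PER_YEAR = 3.156e7

lam_per_year = math.log(2) / HALF_LIFE_YR          # decay constant, 1/yr
lost_per_year = 1.0 - math.exp(-lam_per_year)      # fraction decaying each year
print(f"Fraction of a tritium inventory lost per year: {lost_per_year:.1%}")  # about 5.5%

# Specific activity of pure tritium, for a sense of how "hot" the inventory is.
AVOGADRO = 6.022e23
atoms_per_gram = AVOGADRO / 3.016                   # tritium atomic mass is about 3.016 g/mol
activity_bq_per_g = atoms_per_gram * lam_per_year / SECONDS_PER_YEAR
print(f"Specific activity: {activity_bq_per_g:.2e} Bq per gram")  # roughly 3.6e14 Bq/g

In other words, something like a twentieth of whatever tritium inventory the plant holds simply disappears every year, on top of whatever is burned or leaks, which is why the breeding blanket has to do better than merely break even.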

What’s that?  Reactor design studies by large and prestigious universities and corporations have all come to the conclusion that these magnetic fusion beasts will be able to produce electricity at least as cheaply as the competition?  I don’t think so.  I’ve participated in just such a government-funded study, conducted by a major corporation as prime contractor, with several other prominent universities and corporations participating as subcontractors.  I’m familiar with the methodology used in several others.  In general, it’s possible to make the cost of electricity come out at whatever figure you choose, within reason, using the most approved methods and the most sound project management and financial software.  If the government is funding the work, it can be safely assumed that they don’t want to hear something like, “Fuggedaboudit, this thing will be way too expensive to build and run.”  That would make the office that funded the work look silly, and the fusion researchers involved in the design look like welfare queens in white coats.  The “right” cost numbers will always come out of these studies in the end.

I submit that a better way to come up with a cost estimate is to use a little common sense.  Do you really think that a commercial power company will be able to master the intricacies of tritium production and extraction from the vicinity of a highly radioactive reaction chamber at anywhere near the cost of, say, wind and solar combined with next generation nuclear reactors for baseload power?  If you do, you’re a great deal more optimistic than I am.  W7-X cost a billion euros.  ITER is slated to cost 13 billion, and will likely come in at well over that.  With research money hard to come by in Europe for much worthier projects, throwing amounts like that down a rat hole doesn’t seem like a good plan.

All this may come as a disappointment to fusion enthusiasts.  On the other hand, you may want to consider the fact that, if fusion had been easy, we would probably have managed to blow ourselves up with pure fusion weapons by now.  Beyond that, you never know when some obscure genius might succeed in pulling a rabbit out of their hat in the form of some novel confinement scheme.  Several companies claim they have sure-fire approaches that are so good they will be able to dispense with tritium entirely in favor of more plentiful, naturally occurring isotopes.  See, for example, here, here, and here, and the summary at the Next Big Future website.  I’m not optimistic about any of them, either, but you never know.

[Image: Stellarator]

Fusion Power Update: Hoping for a Shortcut

Think of a pile of bowling balls in a deep well.  They don’t fly out because the force of gravity holds them in.  If you roll an extra bowling ball to the edge of the well and let it drop in, energy is released when it hits the pile at the bottom.  An atomic nucleus can be compared to the well.  The neutrons and protons that make it up are the bowling balls, and the “gravity” is the far more powerful “strong force.”  Roll some of these “bowling balls” into the well and energy will be released, just as in a real well.  The process is called nuclear fusion, and it’s the source of energy that powers the sun.  We’ve been trying to produce energy by repeating the process here on earth for a good many years now, but were only “lucky” enough to succeed in the case of thermonuclear weapons.  We’ve been stymied in our efforts to harness fusion energy in less destructive forms.  The problem is the Coulomb, or electrostatic, force.  It’s what causes unlike charges to attract and like charges to repel in the physics experiments you did in high school.  It’s much weaker than the strong force that holds the neutrons and protons in an atomic nucleus together, but the strong force has a very short range.  The trick is to get within that range.  All atomic nuclei contain protons, and protons are positively charged.  They repel each other, resisting our efforts to push them up to the edge of the “well,” where the strong force will finally overwhelm the Coulomb force, causing these tiny “bowling balls” to drop in.  So far, only atomic bombs have supplied enough energy to provide a “push” big enough to result in a net release of fusion energy.
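
To put a rough number on the size of the “push” required, one can estimate the height of the Coulomb barrier between two hydrogen nuclei at the point where the strong force takes over, a few femtometers apart.  The sketch below uses round numbers of my own choosing, purely for illustration:

# Rough Coulomb barrier between two hydrogen nuclei (charge Z = 1 each).
# V(r) = Z1 * Z2 * e^2 / (4 * pi * eps0 * r); in nuclear units,
# e^2 / (4 * pi * eps0) is about 1.44 MeV * fm.
COULOMB_CONST_MEV_FM = 1.44

def barrier_mev(z1, z2, r_fm):
    """Coulomb potential energy (MeV) of two nuclei at separation r_fm (femtometers)."""
    return z1 * z2 * COULOMB_CONST_MEV_FM / r_fm

# Assume, for illustration, that the strong force takes over at roughly 3 fm separation.
barrier = barrier_mev(1, 1, 3.0)
print(f"Coulomb barrier is roughly {barrier:.2f} MeV, or {barrier * 1000:.0f} keV")  # ~0.48 MeV

The barrier works out to a few hundred keV, tens of times higher than the 10 to 20 keV temperatures of a typical fusion plasma, which is why quantum tunneling and the energetic tail of the velocity distribution do most of the work, and why only reactions with large cross-sections at modest temperatures are of practical interest.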

To date, we’ve tried two main approaches to supplying the “push” in a more controlled form: magnetic fusion and inertial confinement fusion, or ICF.  In both approaches the idea is to heat the nuclei to extreme temperatures, causing them to bang into each other with enough energy to overcome the Coulomb repulsion.  However, when you dump that much energy into a material, it tends to fly apart, as in a conventional explosion.  Somehow a way must be found to hold it in place long enough for significant fusion to take place.  In magnetic fusion that’s accomplished with magnetic lines of force that hold the hot nuclei within a confined space.  Some will always manage to escape, but if enough are held in place long enough, the resulting fusion reactions will release enough energy to keep the process going.  In inertial confinement fusion, as the name would imply, the magnetic fields are replaced by the material’s own inertia.  The idea is to supply so much energy in such a short period of time that significant fusion occurs before the material has time to fly apart.  That’s essentially what happens in thermonuclear weapons.  In ICF the atomic bomb that drives the reaction is replaced by powerful arrays of laser or particle beams.
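
The trade-off between the two approaches can be made concrete with the Lawson criterion, which says, roughly, that for DT fuel the product of density and confinement time must exceed something like 10²⁰ seconds per cubic meter at temperatures of a few tens of keV.  The numbers below are representative round figures of my own, not the design parameters of any particular machine:

# Rough Lawson-criterion comparison of the two confinement regimes for DT fuel.
LAWSON_N_TAU = 1.5e20     # required density * confinement time, s/m^3 (order of magnitude)

# Magnetic confinement: the plasma is tenuous, so the confinement time must be long.
n_magnetic = 1.0e20                                  # ions per m^3 (tokamak-scale density)
tau_magnetic = LAWSON_N_TAU / n_magnetic
print(f"Magnetic fusion needs confinement times of order {tau_magnetic:.1f} s")

# Inertial confinement: the fuel is compressed to enormous densities, so the required
# confinement time shrinks to a tiny fraction of a nanosecond -- short enough that the
# fuel's own inertia suffices.
n_icf = 5.0e31                                       # ions per m^3 (roughly 1000x solid DT)
tau_icf = LAWSON_N_TAU / n_icf
print(f"Inertial fusion needs confinement times of order {tau_icf:.1e} s")

Either way the product is the same; magnetic fusion buys time with huge, steady-state magnets, while ICF buys density with an enormous, very brief jolt of driver energy.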

Both of these approaches are scientifically feasible.  In other words, both will almost certainly work if the magnetic fields can be made strong enough, or the laser beams powerful enough.  Unfortunately, after decades of effort, we still haven’t managed to reach those thresholds.  Our biggest ICF facility, the National Ignition Facility, or NIF, has so far failed to achieve “ignition,” defined as fusion energy out equal to laser energy in, by a wide margin.  The biggest magnetic fusion facility, ITER, currently under construction in France, may reach the goal, but we’ll have to wait a long time to find out.  The last time I looked there were no plans to even fuel it with deuterium and tritium (D and T, heavy isotopes of hydrogen with one and two neutrons in the nucleus in addition to the usual proton) until 2028!  The DT fusion reaction, shown below with some of the others, is the easiest to harness in the laboratory.  For reasons I’ve outlined elsewhere, I doubt that either the “conventional” magnetic or inertial confinement approaches will ever produce energy at a cost competitive with the alternatives.

There are, however, other approaches out there.  Over the years, startup companies have occasionally managed to attract millions in investment capital to explore these alternatives.  Progress reports occasionally turn up on websites such as NextBigFuture.  Examples may be found here, here and here, and many others may be found by typing in the search term “fusion” at the website.  Typically, they claim they are three or four years away from building a breakeven device, or even a prototype reactor.  So far none of them have panned out, but I keep hoping that eventually one of them will pull a rabbit out of their hat and come up with a workable design.  The chances are probably slim, but at least marginally better than the odds that someone will perfect a perpetual motion machine.

I tend to be particularly dubious when I see proposals involving fusion fuels other than the usual deuterium and tritium.  Other fusion reactions have their advantages.  For example, some produce no neutrons, which pose a radiation hazard and activate surrounding materials, and/or use fuels other than the highly radioactive tritium, which occurs in nature only in tiny trace amounts and must therefore be “bred” in the reactor to keep the process going.  Some of the most promising ones are shown below, along with the more “mainline” DT and DD reactions.

D + T → ⁴He (3.5 MeV) + neutron (14.1 MeV)

D + D → T (1.01 MeV) + proton (3.02 MeV)   (50%)

D + D → ³He (0.82 MeV) + neutron (2.45 MeV)   (50%)

H + ¹¹B → 3(⁴He); Q = 8.68 MeV

H + ⁶Li → ³He + ⁴He; Q = 4.023 MeV

³He + ⁶Li → H + 2(⁴He); Q = 16.88 MeV

³He + ⁶Li → D + ⁷Be; Q = 0.113 MeV
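
The Q values quoted above can be checked directly against tabulated atomic masses, since the energy released is just the mass defect of the reaction.  Here is a small sketch doing that for the DT and proton-boron reactions; the mass table is standard, but everything else is merely illustrative:

# Verify fusion Q values (MeV) from atomic masses (u): Q = (mass_in - mass_out) * 931.494
U_TO_MEV = 931.494

mass = {                       # atomic masses in u, rounded
    "H1":  1.007825, "D":   2.014102, "T":   3.016049,
    "He3": 3.016029, "He4": 4.002602, "B11": 11.009305,
    "n":   1.008665,
}

def q_value(reactants, products):
    """Q value in MeV for a reaction given lists of species names."""
    return (sum(mass[s] for s in reactants) - sum(mass[s] for s in products)) * U_TO_MEV

print(f"D + T    -> He-4 + n : Q = {q_value(['D', 'T'], ['He4', 'n']):.1f} MeV")     # about 17.6
print(f"H + B-11 -> 3 He-4   : Q = {q_value(['H1', 'B11'], ['He4'] * 3):.2f} MeV")   # about 8.68

The DT number, about 17.6 MeV, is simply the sum of the 3.5 MeV carried by the helium nucleus and the 14.1 MeV carried by the neutron in the first line of the table.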

The problem with the seemingly attractive alternatives to DT shown above, as well as a number of others that have been proposed, is that they all require significantly higher temperatures and/or confinement times for fusion “ignition” to occur.  Take a look at the graph below.

[Figure Cross_section_1: fusion reaction cross-sections in barns versus fuel temperature in keV]

The horizontal axis shows the “temperature” of the fuel in thousands of electron volts (keV), and the vertical axis shows the “cross-section” for each of the reactions, which is related to the probability that the reaction will occur.  The cross-section is measured in units of 10⁻²⁴ cm², or “barns,” because, at least by nuclear standards, that’s as big as the broad side of a barn.  Notice that the cross-section for the DT reaction is much higher at lower temperatures than that of any of the others.  Yet we failed to achieve fusion ignition on the NIF with that reaction, in spite of the fact that the facility is capable of focusing a massive 1.8 megajoules of laser energy on a fusion target in a period of a few billionths of a second!  Obviously, if we couldn’t get DT to work on the NIF, the other reactions will be difficult to harness indeed.
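
To appreciate what “a few billionths of a second” means, it’s worth converting that energy into power.  Assuming a main pulse of roughly three nanoseconds, which is my own round number for illustration rather than the actual pulse shape of any particular shot, the arithmetic looks like this:

# Average laser power on target for a NIF-class shot: energy / pulse length.
energy_joules = 1.8e6        # about 1.8 MJ of laser energy, per the figure quoted above
pulse_seconds = 3.0e-9       # assumed ~3 ns main pulse, for illustration only

power_watts = energy_joules / pulse_seconds
print(f"Average power on target is roughly {power_watts:.1e} W, or {power_watts / 1e12:.0f} TW")

That is hundreds of terawatts, far more than the combined output of every power plant on the planet, but only for a few billionths of a second, and even that was not enough to ignite DT.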

In short, I tend to be dubious when I read the highly optimistic progress reports, complete with “breakthroughs,” of the latest fusion startup.  I tend to be a great deal more dubious when they announce they will dispense with DT altogether, so sure are they of the superior qualities of their design that they believe lithium, boron, or some other exotic fuel will work just as well.  Still, I keep my fingers crossed.