The world as I see it
  • Fusion Update: The NIF Inches Closer to Ignition

    Posted on August 30th, 2013 Helian No comments

    In a recent press release, Lawrence Livermore National Laboratory (LLNL) announced that it had achieved a yield of 3 × 10^15 neutrons in the latest round of experiments at its National Ignition Facility, a giant, 192-beam laser facility designed, as its name implies, to achieve fusion ignition.  That’s nowhere near “ignition,” but still encouraging as it’s three times better than results achieved in earlier experiments.

    The easiest way to achieve fusion is with two heavy isotopes of hydrogen: deuterium, with a nucleus containing one proton and one neutron, and tritium, with a nucleus containing one proton and two neutrons.  Deuterium is not radioactive, and occurs naturally as about one atom to every 6400 atoms of “normal” hydrogen, with a nucleus containing only a single proton.  Tritium is radioactive, and occurs naturally only in tiny trace amounts.  It has a half-life (the time it takes for half of a given amount to undergo radioactive decay) of 12.3 years, and must be produced artificially.  When tritium and deuterium fuse, they release a neutron, a helium nucleus (an alpha particle), and lots of energy (17.6 million electron volts).
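    To put that 12.3-year half-life in perspective, here is a minimal sketch (in Python) of how quickly a stored tritium inventory decays away; the storage times are illustrative, not figures from the post:

        import math

        HALF_LIFE_YEARS = 12.3  # tritium half-life quoted above

        def fraction_remaining(years: float) -> float:
            """Fraction of an initial tritium inventory left after 'years' of decay."""
            return math.exp(-math.log(2) * years / HALF_LIFE_YEARS)

        # Example: how much of an initial stockpile survives typical storage times?
        for t in (1, 5, 12.3, 25, 50):
            print(f"after {t:5.1f} years: {fraction_remaining(t) * 100:5.1f}% remains")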

    Fortunately (because otherwise it would be too easy to blow up the planet), or unfortunately (if you want to convert the energy into electricity), fusion is hard.  The two atoms don’t like to get too close, because their positively charged nuclei repel each other.  Somehow, a way must be found to make the heavy hydrogen fuel material very hot, causing the thermal motion of the atoms to become very large.  Once they start moving fast enough, they can smash into each other with enough momentum to overcome the repulsion of the positive nuclei, allowing them to fuse.  However, the amount of energy needed per atom is huge, and when atoms get that hot, the last thing they want to do is stay close to each other (think of what happens in the detonation of a high explosive).  There are two mainstream approaches to solving this problem: magnetic fusion, in which the atoms are held in place by powerful magnetic fields while they are heated (the approach being pursued at ITER, the International Thermonuclear Experimental Reactor, currently under construction in France), and inertial confinement fusion (ICF), where the idea is to dump energy into the fuel material so fast that its own inertia holds it in place long enough for fusion to occur.  The NIF is an ICF facility.

    There are various definitions of ICF “ignition,” but, in order to avoid comparisons of apples and oranges between ICF and magnetic fusion experiments, LLNL has explicitly accepted the point at which the fusion energy out equals the laser energy in as the definition of ignition.  In the experiment referred to above, the total fusion energy release was about 10,000 joules, give or take.  Since the laser energy in was around 1.7 million joules, that’s only a little over one half of one percent of what’s needed for ignition.  Paltry, you say?  Not really.  To understand why, you have to know a little about how ICF experiments work.
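    For the record, the arithmetic behind that figure is trivial; a quick sketch using the round numbers quoted above:

        fusion_energy_out_J = 1.0e4   # ~10,000 joules of fusion yield, from the press release
        laser_energy_in_J = 1.7e6     # ~1.7 million joules of laser energy delivered

        fraction_of_ignition = fusion_energy_out_J / laser_energy_in_J
        print(f"fraction of the ignition threshold reached: {fraction_of_ignition:.2%}")  # ~0.59%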

    Recall that the idea is to heat the fuel material up so fast that its own inertia holds it in place long enough for fusion to occur.  The “obvious” way to do that would be to simply dump in enough laser energy to heat all the fuel material to fusion temperatures at once.  Unfortunately, this “volumetric heating” approach wouldn’t work.  The energy required would be orders of magnitude more than what’s available on the NIF.  What to do?   Apply lots and lots of finesse.  It turns out that if a very small volume or “hot spot” in the fuel material can be brought to fusion conditions, the alpha particles released in the fusion reactions might carry enough energy to heat up the nearby fuel to fusion conditions as well.  Ideally, the result would be an alpha “burn wave,” moving out through the fuel, and consuming it all.  But wait, it ain’t that easy!  An efficient burn wave will occur only if the alphas are slammed to a stop and forced to dump their energy after traveling only a very short distance in the cold fuel material around the hot spot.  Their range is too large unless the fuel is first compressed to a tiny fraction of its original volume, causing its density to increase by orders of magnitude.
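    The role of compression can be made a bit more quantitative.  The distance an alpha particle travels before stopping shrinks roughly in proportion to the density, so the relevant figure of merit is the areal density (density times radius) of the fuel.  For a fixed fuel mass squeezed into a smaller and smaller sphere, the areal density grows as the two-thirds power of the density; a rough sketch (the fuel mass is an illustrative figure, not a NIF specification):

        import math

        def areal_density(mass_g: float, rho_g_cm3: float) -> float:
            """rho*R in g/cm^2 for a uniform sphere of the given mass and density."""
            radius_cm = (3.0 * mass_g / (4.0 * math.pi * rho_g_cm3)) ** (1.0 / 3.0)
            return rho_g_cm3 * radius_cm

        mass = 2.0e-4  # grams of DT fuel (illustrative only)
        for rho in (0.25, 25.0, 250.0, 1000.0):  # g/cm^3: DT ice, then successive compressions
            print(f"rho = {rho:7.2f} g/cm^3  ->  rho*R = {areal_density(mass, rho):.3f} g/cm^2")

    Compressing the same mass by a factor of a few thousand in density raises the areal density by a factor of a few hundred, which is what finally makes the cold fuel “thick” enough to slam the alphas to a stop.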

    In other words, to get the fuel to fuse, we need to make it very hot, but we also need to compress it to very high density, which can be done much more easily and efficiently if the material is cold!  Somehow, we need to keep the fuel “cold” during the compression process, and then, just at the right moment, suddenly heat up a small volume to fusion conditions.  It turns out that shocks are the answer to the problem.  If a train of four shocks can be set off in the fuel material as it is being compressed, or “imploded,” by the lasers, precisely timed so that they will all converge at just the right moment, it should be possible, in theory at least, to generate a hot spot.  If the nice, spherical symmetry of the fuel target could be maintained during the implosion process, everything should work just fine.  The NIF would have more than enough energy to achieve ignition.  But there’s the rub. Maintaining the necessary symmetry has turned out to be inordinately hard.  Tiny imperfections in the target surface finish, small asymmetries in the laser beams, etc., lead to big deviations from perfect symmetry in the dense, imploded fuel.  These asymmetries have been the main reason the NIF has not been able to achieve its ignition goal to date.

    And that’s why the results of the latest round of experiments haven’t been as “paltry” as they seem.  As noted in the LLNL press release,

    Early calculations show that fusion reactions in the hot plasma started to self-heat the burning core and enhanced the yield by nearly 50 percent, pushing close to the margins of alpha burn, where the fusion reactions dominate the process.

    “The yield was significantly greater than the energy deposited in the hot spot by the implosion,” said Ed Moses, principal associate director for NIF and Photon Science. “This represents an important advance in establishing a self-sustaining burning target, the next critical step on the path to fusion ignition on NIF.”

    That’s not just hype.  If the self-heating can be increased in future experiments, it may be possible to reach a threshold at which the alpha heating sets off a burn wave through the rest of the cold fuel, as described above.  In other words, ignition is hardly a given, but the guys at LLNL still have a fighting chance.  Their main challenge may be to stem the gradual evaporation of political support for NIF while the experiments are underway.  Their own Senator, Dianne Feinstein, is anything but an avid supporter.  She recently turned down appeals to halt NIF budget cuts, and says the project needs to be “reassessed” in light of the failure to achieve ignition.

    Such a “reassessment” would be a big mistake.  The NIF was never funded as an energy project.  Its support comes from the National Nuclear Security Administration (NNSA), a semi-autonomous arm of the Department of Energy charged with maintaining the safety and reliability of the nation’s nuclear arsenal.  As a tool for achieving that end, the NIF is without peer in any other country.  It has delivered on all of its performance design goals, including laser energy, illumination symmetry, shot rate, the precision and accuracy of its diagnostic instrumentation, etc.  The facility is of exceptional value to the weapons program even if ignition is never achieved.  It can still generate experimental conditions approaching those present in an exploding nuclear device, and, along with the rest of our suite of “above-ground experimental facilities,” or AGEX, it gives us a major leg up over the competition in maintaining our arsenal and avoiding technological surprise in the post-testing era.

    Why is that important?  Because the alternative is a return to nuclear testing.  Do you think no one at NNSA wants to return to testing, and that the weapon designers at the National Weapons Laboratories wouldn’t jump at the chance?  If so, you’re dreaming.  It seems to me we should be doing our best to keep the nuclear genie in the bottle, not let it out.  Mothballing the NIF would be an excellent start at pulling the cork!

    I understand why the guys at LLNL are hyping the NIF’s potential as a source of energy.  It’s a lot easier to generate political support for lots of electricity with very little radioactive waste and no greenhouse gases than for maintaining our aging arsenal of nuclear weapons.  However, IMHO, ICF is hopeless as a source of electricity, at least for the next few hundred years.  I know many excellent scientists will disagree, but many excellent scientists are also prone to extreme wishful thinking when it comes to rationalizing a technology they’ve devoted their careers to.  Regardless, energy hype isn’t needed to justify the NIF.  It and facilities like it will ensure our technological superiority over potential nuclear rivals for years to come, and at the same time provide a potent argument against the resumption of nuclear testing.

  • More Plutonium Horror Stories in Germany

    Posted on March 25th, 2013 Helian No comments

    Germany is plagued by an unusually large per capita number of pathologically pious zealots of the type who like to strike heroic poses as saviors of humanity.  The number may even approach the levels found in the USA.  They definitely take the cake when it comes to the subspecies of the tribe whose tastes run to nuclear alarmism.  They came out of the woodwork in droves the last time an attempt was made to move radioactive waste via rail to the storage facility in Gorleben, tearing up the tracks, peacefully smearing a police vehicle with tar and setting it on fire, and generally making a nuisance of themselves.  Now, in keeping with that tradition, an article just appeared in the German version of New Scientist, according to which those evil Americans are actually planning to restart the production of (shudder) plutonium.

    Entitled The Return of Plutonium and written by one Helmut Broeg, the article assumes a remarkable level of stupidity on the part of its readers.  Mimicking Der Spiegel, Germany’s number one news magazine, its teaser is more sensational than the article that follows, based on the (probably accurate) assumption that that’s as far as most consumers of online content will read. Here’s the translation:

    The USA stopped producing plutonium 25 years ago.  In order to preserve the ability to launch deep space missions, they will resume the production of the highly poisonous and radioactive material.

    Only in the body of the article do we learn that the particular isotope that will be produced is plutonium 238, which, unlike plutonium 239, is useless for making nuclear explosives.  As it happens, Pu-238 is the ideal material for powering thermoelectric generators such as that used on the Curiosity Mars rover because it decays primarily via emission of alpha particles (helium nuclei) and has a half-life of 87.7 years.  That means that its decay products are mostly stopped in the material itself, generating a lot of heat in the process (because of the short half-life, the time it takes for half of the material to decay), which can be converted to electricity using devices with no moving parts.  The world supply of the material is currently running very short, and more is urgently needed to power future deep space missions.
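    As a rough illustration of why Pu-238 makes such a good heat source, one can estimate its decay heat from the half-life alone; a minimal sketch, in which the ~5.6 MeV carried by each alpha decay is an assumed textbook value, not a figure from the article:

        import math

        AVOGADRO = 6.022e23
        HALF_LIFE_S = 87.7 * 365.25 * 24 * 3600   # 87.7-year half-life from the article
        ATOMIC_MASS = 238.0                       # g/mol
        E_ALPHA_J = 5.6e6 * 1.602e-19             # ~5.6 MeV per decay (assumed value), in joules

        decay_constant = math.log(2) / HALF_LIFE_S      # decays per atom per second
        atoms_per_gram = AVOGADRO / ATOMIC_MASS
        watts_per_gram = decay_constant * atoms_per_gram * E_ALPHA_J
        print(f"specific power of Pu-238: ~{watts_per_gram:.2f} W/g")  # roughly half a watt per gram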

    All this is very sinister, according to Broeg.  He quotes Heinz Smital, who, we are informed, is an “atomic expert” at Greenpeace, to the effect that “the crash of such a satellite could contaminate large areas with radioactivity.”  Don’t look now, Mr. Smital, but if you’re really worried about radioactive contamination by alpha emitters like Pu-238, you might want to reconsider building all the coal plants that Germany is currently planning in order to replace the nuclear facilities it has decided to shut down.  Coal typically contains several parts per million of radioactive uranium and thorium.  A good-sized plant will release 5 tons of uranium and 10 tons of thorium into the environment each year.  Estimated releases in 1982 from worldwide combustion of 2800 million tons of coal totaled 3640 tons of uranium (containing 51,700 pounds of uranium-235) and 8960 tons of thorium.  That amount has gone up considerably in the intervening years.  The cumulative radiation now covering the earth from these sources dwarfs anything that might conceivably result from the crash of a rocket with a Pu-238 power source, no matter what implausible assumptions one might choose to make about how its containment would fail, how it would somehow enter the atmosphere at hypersonic speed so as to optimize its dispersion, etc.  Of course, the radioactive isotopes released from burning coal will also be with us for billions of years, not just the few hundred it takes for Pu-238 to decay.
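    Those coal figures are easy to sanity-check against the “several parts per million” claim; a quick sketch using only the numbers in the paragraph above:

        coal_burned_tons = 2800e6     # worldwide coal combustion in 1982, from the text
        uranium_released_tons = 3640.0
        thorium_released_tons = 8960.0

        u_ppm = uranium_released_tons / coal_burned_tons * 1e6
        th_ppm = thorium_released_tons / coal_burned_tons * 1e6
        print(f"implied uranium concentration: ~{u_ppm:.1f} ppm")   # ~1.3 ppm
        print(f"implied thorium concentration: ~{th_ppm:.1f} ppm")  # ~3.2 ppm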

    But wait!  Dispersal of Pu-238 isn’t the only problem.  There’s also (drum roll) the BOMB!  Broeg drags in another “expert,” Moritz Kütt, a physicist at the Technical University of Darmstadt, who assures us that, “In the production of Pu-238, some Pu-239 is produced as well.  As a matter of principle, that means the US is resuming the production of weapons-useful material.”  Kütt goes on to ask what the world community would have to say if Iran announced that it would produce Pu-238 for a space mission.

    To appreciate the level of gullibility it takes to swallow such “warnings,” one must spend a few minutes checking how Pu-238 is actually produced.  Generally, it is done by irradiating neptunium 237 from spent nuclear fuel with neutrons in a reactor.  Occasionally the Np-237 captures a neutron, becoming Np-238.  This, in turn, emits a beta particle (an electron) and is transmuted to Pu-238.  It’s quite true that some of the Pu-238 will also capture a neutron, and become Pu-239.  However, the amounts produced in this way would be vanishingly small compared to the amounts that could be produced in the same reactor by simply removing some of the fuel rods after a few months and chemically extracting the nearly pure Pu-239, which would not then have to be somehow separated from far greater quantities of highly radioactive Pu-238.  In other words, if the world community learned that Iran had a nefarious plan to produce bomb material in the way suggested by Kütt, the reasonable immediate reaction would be a horse laugh, perhaps followed by sympathy for a people who were sufficiently stupid to adopt such a plan.  As for the US deciding to replenish its stocks of bomb material in this way, the idea is more implausible than anything those good Germans, the brothers Grimm, ever came up with.  It only takes 4 kilos of Pu-239 to make a bomb, and we have tons of it on hand.  In the unlikely event we wanted more, we would simply extract it from reactor fuel rods.  The idea that we would ever prefer to attempt the separation of Pu-239 from Pu-238 instead is one that could only be concocted in the fevered imagination of a German “atomic expert.”

     

    Plutonium 238

  • But Wait! There are More “Worries” from The Edge!

    Posted on February 3rd, 2013 Helian No comments

    I won’t parse all 150+ of them, but here are a few more that caught my eye.

    Science writer and historian Michael Shermer, apparently channeling Sam Harris, is worried about the “Is-Ought Fallacy of Science and Morality.”  According to Shermer,

    …most scientists have conceded the high ground of determining human values, morals, and ethics to philosophers, agreeing that science can only describe the way things are but never tell us how they ought to be. This is a mistake.

    It’s only a mistake to the extent that there’s actually some “high ground” to be conceded.  There is not.  Assuming that Shermer is not referring to the trivial case of discovering mere opinions in the minds of individual humans, neither science nor philosophy is capable of determining anything about objects that don’t exist.  Values, morals and ethics do not exist as objects.  They are not things-in-themselves.  They cannot leap out of the skulls of individuals and acquire a reality and legitimacy that transcends individual whim.  Certainly, large groups of individuals who discover that they have whims in common can band together and “scientifically” force their whims down the throats of less powerful groups and individuals, but, as they say, that don’t make it right.

    Suppose we experienced a holocaust of some kind, and only one human survived the mayhem.  No doubt he would still be able to imagine what it was like when there were large groups of others like himself.  He might recall how they behaved, “scientifically” categorizing their actions as “good” or “evil,” according to his own particular moral intuitions.  Suppose, now, that his life also flickered out.  What would be left of his whims?  Would the inanimate universe, spinning on towards its own destiny, care about them one way or the other?  Science can determine the properties and qualities of things.  Where, then, would the “good” and “evil” objects reside?  Would they still float about in the ether as disembodied spirits?  I’m afraid not.  Science can have nothing to say about objects that don’t exist.  Michael Shermer might feel “in his bones” that some version of “human flourishing” is “scientifically good,” but there is no reason at all why I or anyone else should agree with his opinion.  By all means, let us flourish together, if we all share that whim, but surely we can pursue that goal without tacking moral intuitions on to it.  “Scientific” morality is not only naive, but, as was just demonstrated by the Communists and the Nazis, extremely dangerous as well. According to Shermer,

    We should be worried that scientists have given up the search for determining right and wrong…

    In fact, if scientists cease looking for and seeking to study objects that plainly don’t exist, that would seem to me more reason for congratulations all around than for worry.  Here’s a sample of the sort of “reasoning” Shermer uses to bolster his case:

    We begin with the individual organism as the primary unit of biology and society because the organism is the principal target of natural selection and social evolution. Thus, the survival and flourishing of the individual organism—people in this context—is the basis of establishing values and morals, and so determining the conditions by which humans best flourish ought to be the goal of a science of morality. The constitutions of human societies ought to be built on the constitution of human nature, and science is the best tool we have for understanding our nature.

    Forgive me for being blunt, but this is gibberish.  Natural selection can have no target, because it is an inanimate process, and can no more have a purpose or will than a stone.  “Thus, the survival and flourishing of the individual organism – people in this context – is the basis of establishing values and morals”??  Such “reasoning” reminds me of the old “Far Side” cartoon, in which one scientist turns to another and allows that he doesn’t quite understand the intermediate step in his proof:  “Miracle happens.”  If a volcano spits a molten mass into the air which falls to earth and becomes a rock, is it not, in the same sense, the “target” of the geologic processes that caused indigestion in the volcano?  Is not the survival and flourishing of that rock equally a universal “good?”

    Of the remaining “worries,” this was the one that most worried me, but there were others.  Kevin Kelly, Editor at Large of Wired Magazine, was worried about the “Underpopulation Bomb.”  Noting the “Ur-worry” of overpopulation, Kelly writes,

    While the global population of humans will continue to rise for at least another 40 years, demographic trends in full force today make it clear that a much bigger existential threat lies in global underpopulation.

    Apparently the basis of Kelly’s worry is the assumption that, once the earth’s population peaks in 2050 or thereabouts, the decrease will inevitably continue until we hit zero and die out.  In his words, “That worry seems preposterous at first.”  I think it seems preposterous first and last.

    Science writer Ed Regis is worried about, “Being Told That Our Destiny Is Among The Stars.”  After reciting the usual litany of technological reasons that human travel to the stars isn’t likely, he writes,

    Apart from all of these difficulties, the more important point is that there is no good reason to make the trip in the first place. If we need a new “Earth 2.0,” then the Moon, Mars, Europa, or other intra-solar-system bodies are far more likely candidates for human colonization than are planets light years away.  So, however romantic and dreamy it might sound, and however much it might appeal to one’s youthful hankerings of “going into space,” interstellar flight remains a science-fictional concept—and with any luck it always will be.

    In other words, he doesn’t want to go.  By all means, then, he should stay here.  I and many others, however, have a different whim.  We embrace the challenge of travel to the stars, and, when it comes to human survival, we feel existential Angst at the prospect of putting all of our eggs in one basket.  Whether “interstellar flight remains a science-fictional concept” at the moment depends on how broadly you define “we.”  I see no reason why “we” should be limited to one species.  After all, any species you could mention is related to all the rest.  Interstellar travel may not be a technologically feasible option for me at the moment, but it is certainly feasible for my relatives on the planet, and at a cost that is relatively trivial.  Many simpler life forms can potentially survive tens of thousands of years in interstellar space.  I am of the opinion that we should send them on their way, and the sooner the better.

    I do share some of the other worries of the Edge contributors.  I agree, for example, with historian Noga Arikha’s worry about, “Presentism – the prospect of collective amnesia,” or, as she puts it, the “historical blankness” promoted by the Internet.  In all fairness, the Internet has provided unprecedented access to historical source material.  However, to find it you need to have the historical background to know what you’re looking for.  That background about the past can be hard to develop in the glare of all the fascinating information available about the here and now.  I also agree with physicist Anton Zeilinger’s worry about, “Losing Completeness – that we are increasingly losing the formal and informal bridges between different intellectual, mental, and humanistic approaches to seeing the world.”  It’s an enduring problem.  The name “university” was already a misnomer 200 years ago, and in the meantime the problem has only become worse.  Those who can see the “big picture” and have the talent to describe it to others are in greater demand than ever before.  Finally, I agree with astrophysicist Martin Rees’ worry that, “We Are In Denial About Catastrophic Risks.”  In particular, I agree with his comment to the effect that,

    The ‘anthropocene’ era, when the main global threats come from humans and not from nature, began with the mass deployment of thermonuclear weapons. Throughout the Cold War, there were several occasions when the superpowers could have stumbled toward nuclear Armageddon through muddle or miscalculation. Those who lived anxiously through the Cuba crisis would have been not merely anxious but paralytically scared had they realized just how close the world then was to catastrophe.

    This threat is still with us.  It is not “in abeyance” because of the end of the cold war, nor does the fact that nuclear weapons have not been used since World War II mean that they will never be used again.  They will.  It is not a question of “if,” but “when.”

  • The NIF Misses its Ignition Milestone

    Posted on October 22nd, 2012 Helian 2 comments

    We have passed the end of the fiscal year, and the National Ignition Facility, or NIF, at Lawrence Livermore National Laboratory (LLNL) failed to achieve its goal of ignition (more fusion energy out than laser energy in). As I noted in an earlier post about the NIF more than three years ago, this doesn’t surprise me. Ignition using the current indirect drive approach (most of the jargon and buzzwords are explained in the Wiki article on the NIF) requires conversion of the laser energy into an almost perfectly symmetric bath of x-rays. These must implode the target, preserving its spherical shape in the process in spite of a very high convergence ratio (initial radius divided by final radius), and launching a train of four shocks, which must all converge in a tiny volume at the center of the target, heating it to fusion conditions. That will release energetic alpha particles (helium nuclei) which must then dump their energy in the surrounding, cold fuel material, causing a “burn wave” to propagate out from the center, consuming the remaining fuel. It would have been a spectacular achievement if LLNL had pulled it off. Unfortunately, they didn’t, for reasons that are explained in an excellent article that recently appeared in the journal Science. (Unfortunately, it’s behind a subscriber wall, and I haven’t found anything as good on the web at the moment. You can get the gist from this article at Huffpo.)  The potential political implications of the failure were addressed in a recent article in the New York Times.
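    To get a feel for why the convergence ratio is such an unforgiving number, note that for a fixed fuel mass compressed more or less spherically, the density rises roughly as the cube of the convergence ratio; a crude sketch (treating the fuel as a uniform sphere, which is an idealization):

        def density_multiplication(convergence_ratio: float) -> float:
            """Density increase for a uniform, fixed-mass sphere whose radius shrinks
            by the given convergence ratio (initial radius / final radius)."""
            return convergence_ratio ** 3

        for cr in (10, 20, 30, 40):
            print(f"convergence ratio {cr:2d}  ->  density multiplied by {density_multiplication(cr):,.0f}")

    A real target implodes as a thin shell rather than a solid sphere, so the actual numbers differ, but the steep scaling is why tiny asymmetries in the drive or the capsule surface get amplified so dramatically.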

    All of which raises the question, “What now?” My opinion, in short, is that the facility should remain operational, at full capacity (not on half shifts, which, for various reasons, would reduce the experimental value of the facility by significantly more than half).

    I certainly don’t base that opinion on the potential of inertial confinement fusion (ICF), the technology implemented on the NIF, for supplying our future energy needs.  While many scientists would disagree with me, I feel it has virtually none.  Although they may well be scientifically feasible, ICF reactors would be engineering nightmares, and far too expensive to compete with alternative energy sources.  It would be necessary to fabricate many thousands of delicate, precisely built targets every day and fill them with highly radioactive tritium.  Tritium is an isotope of hydrogen that occurs in nature only in trace amounts, and its half-life (the time it takes for half of a given quantity to undergo radioactive decay) is just over 12 years, so it can’t be stored indefinitely.  It would be necessary to breed and extract the stuff from the reactor on the fly without releasing any into the environment (hydrogen is notoriously slippery stuff that can easily leak right through several types of metal barriers), load it into the targets, and then cool them to cryogenic temperatures.  There is not a reactor design study out there that doesn’t claim that this can be done cheaply enough to make ICF fusion energy cost-competitive.  They are all poppycock.  The usual procedure in such studies is to pick the cost number you need, and then apply “science” to make it seem plausible.
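    To see why target fabrication is such a headache, consider the firing rate a power plant would need.  Here is a rough sketch; the per-target yield and the conversion efficiency are purely assumed, illustrative figures, not numbers from any particular design study:

        # Rough estimate of how many targets per day an ICF power plant would consume.
        electric_output_W = 1.0e9    # a nominal 1 GW(e) plant
        conversion_eff = 0.40        # assumed thermal-to-electric efficiency
        yield_per_target_J = 2.0e8   # assumed ~200 MJ of fusion yield per target

        thermal_power_W = electric_output_W / conversion_eff
        shots_per_second = thermal_power_W / yield_per_target_J
        print(f"required repetition rate: ~{shots_per_second:.0f} shots per second")
        print(f"targets consumed per day: ~{shots_per_second * 86400:,.0f}")

    Even with generous assumptions, the answer comes out at around a million cryogenic, tritium-filled targets per day, which only underscores the point of the paragraph above.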

    However, despite all the LLNL hype, the NIF was never funded as an energy project, but as an experimental tool to help maintain the safety and reliability of our nuclear stockpile in the absence of nuclear testing.  The idea that it will be useless for that purpose, whether it achieves ignition or not, is nonsense.  The facility has met and in some cases exceeded its design goals in terms of energy and precision.  Few if any other facilities in the world, whether existing or planned, will be able to rival its ability to explore equations of state, opacities, and other weapons-relevant physics information about materials at conditions approaching those that exist in nuclear detonations.  As long as the ban on nuclear testing remains in effect, the NIF will give us a significant advantage over other nuclear states.  It seems to me that maintaining the ban is a good thing.

    It also seems to me that it would behoove us to maintain a robust nuclear stockpile.  Nuclear disarmament sounds nice on paper.  In reality it would invite nuclear attack.  The fact that nuclear weapons have not been used since 1945 is a tremendous stroke of luck.  However, it has also seduced us into assuming they will never be used again.  They will.  The question is not if, but when.  We could continue to be very lucky.  We could also suffer a nuclear attack tomorrow, whether by miscalculation, or the actions of terrorists or rogue states.  If we continue to have a stockpile, it must be maintained.  Highly trained scientists must be available to maintain it.  Unfortunately, babysitting a pile of nuclear bombs while they gather dust is not an attractive career path.  Access to facilities like the NIF is a powerful incentive to those who would not otherwise consider such a career.

    One of the reasons this is true is the “dual use” capability of the NIF.  It can be used to study many aspects of high energy density physics that may not be relevant to nuclear weapons, but are of great interest to scientists in academia and elsewhere who are interested in fusion energy, the basic science of matter at extreme conditions, astrophysics, etc.  Some of the available time on the facility will be reserved for these outside users.

    As for the elusive goal of ignition itself, we know that it is scientifically feasible, just as we know that its magnetic fusion equivalent is scientifically feasible.  The only question remaining is how big the lasers have to be to reach it.  It may eventually turn out that the ones available on the NIF are not big enough.  However, the idea that because we didn’t get ignition in the first attempts somehow proves that ignition is impossible and out of the question is ridiculous.  It has not even been “proven” that the current indirect drive approach won’t work.  If it doesn’t, there are several alternatives.  The NIF is capable of being reconfigured for direct drive, in which the lasers are aimed directly at the fusion target.  For various reasons, the beams are currently being frequency-tripled from the original “red” light of the glass lasers to “blue.”  Much more energy, up to around four megajoules instead of the current 1.8, would be available if the beams were only frequency-doubled to “green”.  It may be that the advantage of the extra energy will outweigh the physics-related disadvantages of green light.  An interesting dark horse candidate is the “fast ignitor” scenario, in which the target would be imploded as before, but a separate beam or beams would then be used to heat a small spot on the outer surface to ignition conditions.  An alpha particle “burn wave” would then propagate out, igniting the rest of the fuel, just as originally envisioned for the central hot spot approach.

    Some of the comments following the Internet posts about NIF’s failure to reach ignition are amusing.  For example, following an article on the Physics Today website we learn to our dismay:

    With all due respect to the NIF and its team of well-meaning and enthusiastic researchers here, I am sorry to state hereby that sustainable nuclear fusion is predestined to fail, whether it be in the NIC, the Tokamak or anywhere else in solar space, for fundamentally two simple reasons paramount for fusion: (1) vibrational synchronism (high-amplitude resonance) of reacting particles; and (2) the overall isotropy of their ambient field.

    Obviously the commenter hadn’t heard that the scientific feasibility of both inertial and magnetic fusion has already been established.  He reminds me of a learned doctor who predicted that Zadig, the hero of Voltaire’s novel of that name, must inevitably die of an injury.  When Zadig promptly recovered, he wrote a thick tome insisting that Zadig must inevitably have died.  Voltaire informs us that Zadig did not read the book.  In an article on the IEEE Spectrum website, suggestively entitled National Ignition Facility:  Mother of All Boondoggles?, another commenter chimes in:

    How about we spend the billions on real research that actually has a chance of producing something useful? There are a gazillion ideas out there for research that has a much higher probability of producing useful results. Must be nice to work for LLNL where your ideas don’t need vetting.

    In fact, the NIF was “vetted” by a full scale Federal Advisory Committee.  Known as the Inertial Confinement Fusion Advisory Committee, or ICFAC, its members included Conrad Longmire, Marshall Rosenbluth, and several other experts in plasma physics and technology of world renown who had nothing whatsoever to gain by serving as shills for LLNL.  It heard extensive testimony on plans to build the NIF, both pro and con, in the mid-90’s.  Prominent among those who opposed the project was Steve Bodner, head of the ICF Program at the Naval Research Laboratory (NRL) at the time.  Steve cited a number of excellent reasons for delaying major new starts like the NIF until some of the outstanding physics issues could be better understood.  The Committee certainly didn’t ignore what he and other critics had to say.  However, only one of the 15 or so members dissented from the final decision to recommend proceeding with the NIF.  I suspect that LLNL’s possession of the biggest, baddest ICF computer code at the time had something to do with it.  No one is better at bamboozling himself and others than a computational physicist with a big code.  The one dissenter, BTW, was Tim Coffey, Director of NRL at the time, who was convinced that Bodner was right.

    There are, of course, the predictable comments by those in the habit of imagining themselves geniuses after the fact, such as,

    I am convinced. Garbage research.

    and,

    Don’t these people feel ashamed telling so many lies?

    after the IEEE Spectrum article, and,

    It’s amazing to think that you can spout lies to the government to receive $6 billion for a machine that doesn’t come close to performing to spec and there are no consequences for your actions.

    following a post on the NIF at the LLNL – The True Story blog.  Fortunately, most of the comments I’ve seen recently have been at a rather more thoughtful level.  In any event, I hope Congress doesn’t decide to cut and run on the NIF.  Pulling the plug at this point would be penny-wise and pound-foolish.

    One of the two NIF laser bays

  • The NIF: Lots of Power and Energy, but No Ignition

    Posted on July 24th, 2012 Helian No comments

    According to a recent press release from Lawrence Livermore National Laboratory (LLNL) in California, the 192-beam National Ignition Facility (NIF) fired a 500 terawatt shot on July 5.  The world record power followed a world record energy shot of 1.89 Megajoules on July 3.  As news, this doesn’t rise above the “meh” category.  A shot at the NIF’s design energy of 1.8 Megajoules was already recorded back in March.  It’s quite true that, as NIF Director Ed Moses puts it, “NIF is becoming everything scientists planned when it was conceived over two decades ago.”  The NIF is a remarkable achievement in its own right, capable of achieving energies 50 times greater than any other laboratory facility, with pulses shaped and timed to pinpoint precision.  The NIF team in general and Ed Moses in particular deserve great credit, and the nation’s gratitude, for that achievement after turning things around following a very shaky start.
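    For a sense of scale, if the full record energy were delivered at the record peak power, the pulse would last only a few nanoseconds; a rough sketch (ignoring the fact that NIF pulses are deliberately shaped rather than flat-topped, and that the two records were set on different shots):

        energy_J = 1.89e6   # record energy shot, from the press release
        power_W = 500e12    # record peak power, from the press release

        equivalent_flat_pulse_s = energy_J / power_W
        print(f"equivalent flat-top pulse length: {equivalent_flat_pulse_s * 1e9:.1f} ns")  # ~3.8 ns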

    The problem is that, while the facility works as well, and even better than planned, the goal it was built to achieve continues to elude us.  As its name implies, the news everyone is actually waiting for is the announcement that ignition (defined as fusion energy out greater than laser energy in) has been achieved.  As noted in the article, Moses said back in March that “We have all the capability to make it happen in fiscal year 2012.”  At this point, he probably wishes his tone had been a mite less optimistic.  To reach their goal in the two months remaining, the NIF team will need to pull a rabbit out of their collective hat.  A slim chance remains.  Apparently the NIF’s 192 laser beams were aimed at a real ignition target with a depleted uranium capsule and deuterium-tritium fuel on July 5, and not a surrogate.  The data from that shot may prove to be a great deal more interesting than the 500 terawatt power announcement.

    Meanwhile, the Russians are apparently forging ahead with plans for their own superlaser, to be capable of a whopping 2.8 Megajoules, and the Chinese are planning another about half that size, to be operational at about the same time (around 2020).  That, in itself, speaks volumes about the real significance of ignition.  It may be huge for the fusion energy community, but not that great as far as the weaponeers who actually fund these projects are concerned.  Many weapons designers at LLNL and Los Alamos were notably unenthusiastic about ignition when NIF was still in the planning stages.  What attracted them more was the extreme conditions, approaching those in an exploding nuke, that could be achieved by the lasers without ignition.  They thought, not without reason, that it would be much easier to collect useful information from such experiments than from chaotic ignition plasmas.  Apparently the Russian bomb designers agree.  They announced their laser project back in February even though LLNL’s difficulties in achieving ignition were well known at the time.

    The same can be said of some of the academic types in the NIF “user community.”  It’s noteworthy that two of them, Rick Petrasso of MIT and Ray Jeanloz of UC Berkeley, whose enthusiastic comments about the 500 terawatt shot were quoted in the latest press release, are both key players in the field of high energy density physics.  Ignition isn’t a sine qua non for them either.  They will be able to harvest scores of papers from the NIF whether it achieves ignition or not.

    The greatest liability of not achieving early ignition may be the evaporation of political support for the NIF.  The natives are already becoming restless.  As noted in the Livermore Independent,

    In early May, sounding as if it were discussing an engineering project rather than advanced research, the House Appropriations Committee worried that NIF’s “considerable costs will not have been warranted” if it does not achieve ignition by September 30, the end of the federal fiscal year.

    and,

    Later that month, in a tone that seemed to demand that research breakthroughs take place according to schedule, the House Armed Services Committee recommended that NIF’s ignition research budget for next year be cut by $30 million from the requested $84 million budget unless NIF achieves ignition by September 30.

    Funding cuts at this point, after we have come so far, and are so close to the goal, would be short-sighted indeed.  One must hope that a Congress capable of squandering billions on white elephants like the International Space Station will not become penny-wise and pound-foolish about funding a project that really matters.

  • The Atomic Bomb and the Premonitions of James Burnham

    Posted on July 23rd, 2012 Helian No comments

    We tend to be strongly influenced by the recent past in our predictions about the future.  After World War I, any number of pundits, statesmen, and military officers thought the next war would be a carbon copy of the one they had just lived through, albeit perhaps on a larger scale.  The German government’s disastrous decision to declare war in 1914 was likely influenced by the quick and decisive German victories in 1864, 1866, and 1870.  The Japanese were similarly mesmerized by their brilliant success against the Russians in 1904-05 after an opening surprise attack against the Russian fleet lying at anchor at Port Arthur, and assumed history would repeat itself if they launched a similar attack against Pearl Harbor.

    Sometimes startling events force the reevaluation of old ideas and paradigms, such as the German armored Blitzkrieg or the destruction of powerful battleships from the air in World War II, or, more recently, the sudden collapse of Communism and the Soviet Union from 1989-91.  We are always fascinated by such events, yet few of us grasp their significance as they are happening.  Our tendency is always to look backwards, to fit the revolutionary and the unprecedented into the old world that we understand rather than the new one that we can’t yet imagine.  So it was after the dropping of the first atomic bombs.  It certainly focused the attention of public intellectuals, unleashing a torrent of essays full of dire predictions.  For many, the future they imagined was simply a continuation of the immediate past, albeit with new and incredibly destructive weapons.  It was to include the continued inexorable push for world dominion by totalitarian Communism, centered in the Soviet Union, and world wars following each other in quick succession every 15 to 20 years, about the same as the interval between the first two world wars.

    Such a vision of the future was described by James Burnham in “The Struggle for the World,” published in 1947.  Burnham was a former Marxist and Trotskyite who eventually abandoned Marxism, and became one of the leading conservative intellectuals of his day.  His thought made a deep impression on, among others, George Orwell.  For example, he had suggested the possibility of a world dominated by three massive totalitarian states, constantly at war with each other, in an earlier book, “The Managerial Revolution,” published in 1941.  These became Oceania, Eastasia, and Eurasia in Orwell’s “1984.”  The notions of “doublethink”, the totalitarian use of terms such as “justice” and “peace” in a sense opposite to their traditional meanings, and the rewriting of history every few years “so that history itself will always be a confirmation of the immediate line of the party,” familiar to readers of “1984,” were also recurrent themes in “The Struggle for the World.”

    Burnham, born in 1905, had come of age during the stunning period of wars, revolutions, and the birth of the first totalitarian states that began and ended with the world wars of the 20th century.  He assumed that events of such global impact would continue at the same pace, only this time in a world with nuclear weapons.   As a former Marxist, he knew that the Communists, at least, were deliberately engaged in a “struggle for the world,” and was dismayed that U.S. politicians at the time were so slow to realize the nature of the struggle.  He also correctly predicted that, unless they were stopped, the Communists would develop nuclear weapons in their Soviet base “in a few years.”  This, he warned, could not be allowed to happen because it would inevitably and quickly lead to a full scale nuclear exchange.  His reasoning was as follows:

    Let us assume that more than one (two is enough for the assumption) power possesses, and is producing, atomic weapons.  Each will be improving the efficiency and destructive potential of the weapons as it goes along.  Now let us try to reason as the leaders of these powers would be compelled to reason.

    Each leader of Power A could not but think as follows:  Power B has at its disposal instruments which could, in the shortest time, destroy us.  He has possibly made, or is about to make, new discoveries which will threaten even more complete and rapid destruction.  At the moment, perhaps, he shows no open disposition to use these instruments.  Nevertheless, I cannot possibly rely on his continued political benevolence – above all since he knows that I also have at my disposal instruments that can destroy him.  Some hothead – or some wise statesman – of his may even now be giving the order to push the necessary buttons.

    Even if there were no atomic weapons, many of the leaders would undoubtedly be reasoning today along these lines.  Atomic weapons are, after all, not responsible for warfare, not even for the Third World War, which has begun.  The fact that the political and social causes of a war are abundantly present stares at us from every edition of every newspaper.  The existence of atomic weapons merely raises the stakes immeasurably higher, and demands a quicker decision.

    But to assume, as do some foolish commentators, that fear of retaliation will be the best deterrent to an atomic war is to deny the lessons of the entire history of war and of society.  Fear, as Ferrero so eloquently shows, is what provokes the exercise of force.  Most modern wars have been, in the minds of every belligerent, preventive:  an effort to stamp out the fear of what the other side might be about to do.

    The existence of two or more centers of control of atomic weapons would be equal to a grenade with the pin already pulled.

    According to Burnham, the resulting nuclear war or wars would lead to the collapse of Western Civilization.  In his words,

    If, however, we are not yet ready to accept passively the final collapse of Western Civilization, we may state the following as a necessary first condition of any workable solution of the problem of atomic weapons: there must be an absolute monopoly of the production, possession and use of all atomic weapons.

    One wonders what direction world history might have taken had someone like Burnham been President in 1950 instead of Truman.  He would have almost certainly adopted MacArthur’s plan to drop numerous atomic bombs on China and North Korea.  We were lucky.  In the end, Truman’s homespun common sense prevailed over Burnham’s flamboyant intellect, and the nuclear genie remained in the bottle.

    However, in 1947 the U.S. still had a monopoly of nuclear weapons, and, for the reasons cited above, Burnham insisted we must keep it.  He suggested that this might best be done by establishing an effectual world government, but dismissed the possibility as impractical.  The only workable alternative to a Communist conquest of the world or full scale nuclear war and the end of Western Civilization was U.S. hegemony.  In Burnham’s words,

    It is not our individual minds or desires, but the condition of world society, that today poses for the Soviet Union, as representative of communism, and for the United States, as representative of Western Civilization, the issue of world leadership. No wish or thought of ours can charm this issue away.

    This issue will be decided, and in our day. In the course of the decision, both of the present antagonists may, it is true, be destroyed. But one of them must be.

    Whatever the words, it is well also to know the reality. The reality is that the only alternative to the communist World Empire is an American Empire which will be, if not literally worldwide in formal boundaries, capable of exercising decisive world control. Nothing less than this can be the positive, or offensive, phase of a rational United States policy.

    As a first step to empire, Burnham proposed the union of Great Britain and the United States, to be followed, not by outright conquest, but by firm assertion of U.S. predominance and leadership in the non-Communist world.   Beyond that, the Communist threat must finally be recognized for what it was, and a firm, anti-Communist policy substituted for what was seen as a lack of any coherent policy at all.  Vacillation must end.

    Fortunately, when it came to the nuclear standoff, Burnham was wrong, and the “foolish commentators” who invoked the fear of retaliation were right.  Perhaps, having only seen the effects of dropping two low yield bombs, he could not yet imagine the effect of thousands of bombs orders of magnitude more powerful, or conceive of such a thing as mutually assured destruction.  Perhaps it was only dumb luck, but the world did not stumble into a nuclear World War III as it had into the conventional world wars of the 20th century, and the decisive events in the struggle did not follow each other nearly as quickly as Burnham imagined they would.

    Burnham also failed to foresee the implications of the gradual alteration in the nature of the Communist threat.  At the time he wrote, it was everything he claimed it to be, a messianic secular religion at the height of its power and appeal.  He assumed that it would retain that power and appeal until the battle was decided, one way or the other.  Even though he was aware that the masses living under Communism, other than a dwindling number of incorrigible idealists, were already disillusioned by “the God that failed,” he didn’t foresee what a decisive weakness that would eventually become.   In the end, time was on our side.  The Communists, and not we, as Lenin had predicted, finally dropped onto the garbage heap of history “like a ripe plum.”

    However, Burnham wasn’t wrong about everything.  To win the struggle, it was necessary for us to finally recognize the threat.  Whatever doubt remained on that score, at least as far as most of our political leaders were concerned, was dissipated by the North Korean invasion of the south.  Our policy of vacillation didn’t exactly end, but was occasionally relieved by periods of firmness.  In the end, in spite of a media dominated through most of the struggle by Lenin’s “useful idiots” and the resultant cluelessness of most Americans about what we were even trying to do on the front lines of the “clash between the cultures” in places like Vietnam, we prevailed.

    It was a near thing.  Burnham feared that, even after losing the opening battles of the next war to a United States with a monopoly of nuclear weapons, the Communists might regroup, abandon their vulnerable cities, and transform the struggle into a “people’s war.”  His description of what would follow was eerily similar to what actually did happen, but in a much smaller arena than the whole world:

    They would transform the struggle into a political war, a “people’s war,” fought in every district of the world by irregulars, partisans, guerillas, Fifth Columns, spies, stool pigeons, assassins, fought by sabotage and strikes and lies and terror and diversion and panic and revolt. They would play on every fear and prejudice of the United States population, every feeling of guilt or nobility; they would exploit every racial and social division; they would widen every antagonism between tentative allies; and they would tirelessly wear down the United States will to endure.

    Though the result would be not quite so certain, perhaps, as if the communists also had atomic weapons, they would in the end, I think, succeed. Because of the lack of a positive United States policy, because it would not have presented to the world even the possibility of a political solution, its dreadful material strength would appear to the peoples as the unrelieved brutality of a murderer. Its failure to distinguish between the communist regime and that regime’s subject-victims would weld together the victims and their rulers. Americans themselves would be sickened and conscience-ridden by what would seem to them a senseless slaughter, never-ending, leading nowhere. The military leadership would be disoriented by the inability of their plans based on technical superiority to effect a decision. The failure to conceive the struggle politically would have given the communists the choice of weapons. From the standpoint of the United States, the entire world would have been turned into an ambush and a desert. In the long night, nerves would finally crack, and sentries would fire their last shots wildly into the darkness, and it would all be over.

    Change “the world” to Vietnam and it reads like a history instead of a premonition.  Tomorrow is another day, and I doubt that any of us will prove better at predicting what the future will bring than Burnham.  We have lived through an era much different, more peaceful, and more sedate in the pace of events than the one he experienced between 1914 and 1945.  We should beware of assuming, as he did, that the future will bear any resemblance to the immediate past.  The world is still full of nuclear weapons, some of them already in the hands of, or soon to be in the hands of, dictators of suspect rationality.  Some of our intellectuals soothe our fears with stories about the “vanishing of violence,” but as Omar Khayyam put it in the “Rubaiyat,” they could soon be “cast as foolish prophets forth, their mouths stopped with dust,” through some miscalculation or deliberate act of malice.  As the Boy Scouts say, “be prepared.”

  • Interstellar Transport: Freeman Dyson and Hydrogen Bomb Propulsion

    Posted on July 11th, 2012 Helian 1 comment

    And you thought I was crazy…  Check out this article by Freeman Dyson in the October 1968 issue of Physics Today entitled “Interstellar Transport.”  Dyson was an active participant in Project Orion, a program to build interplanetary space vehicles propelled by nuclear bombs.  After the program was ended by the 1963 nuclear test ban treaty, he decided to write a paper for a high visibility journal to ensure that the idea was kept alive and people were aware of its potential.

    People thought big in those days, and Dyson’s notional interstellar transports certainly reflected the fact.  The first was designed to absorb the blast of one-megaton, deuterium-fueled bombs in a gigantic copper hemisphere with a radius of 10 kilometers weighing 5 million tons.  The fully loaded ship would have weighed 40 million tons, including 30 million of the one-megaton bombs.  Assuming each bomb would require 10 pounds of plutonium (or about 60 pounds of highly enriched uranium), a total of 150,000 tons of plutonium would be required for the mission.

    Dubious assumptions were made, as, for example, that 100% of the bomb’s energy would go into the kinetic energy of debris, even though it was known at the time (and certainly known to Dyson) that the actual fraction is much less than that.  The cost was calculated to be one 1968 gross national product, based entirely on the projected cost of the necessary deuterium fuel (3 billion pounds at $200 per pound in 1968 dollars, for a total of $600 billion).  In other words, the cost of the plutonium, copper, and other building material wasn’t even factored in, nor was the cost of getting it all into earth orbit prior to launch.  In spite of all this, the massive ship, carrying about 20,000 colonists, would still take about 1300 years to reach the nearest stars.  Barring a “Noah’s ark” forlorn-hope escape from a dying world, even Dyson considered this impractical for human travel, writing,

    As a voyage of colonization a trip as slow as this does not make much sense on a human time scale.  A nonhuman species, longer lived or accustomed to thinking in terms of millennia rather than years, might find the conditions acceptable.
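    Before moving on, the headline numbers of the “conservative” design are easy to reproduce from the figures quoted above; a quick sketch, in which the 4.4 light-year distance to the nearest stars is an assumed round figure rather than a number from the post:

        bombs = 30e6               # one-megaton bombs carried
        plutonium_lb_each = 10.0   # pounds of plutonium per bomb (the stated assumption)
        deuterium_lb = 3e9         # pounds of deuterium fuel
        deuterium_cost_lb = 200.0  # 1968 dollars per pound
        trip_years = 1300.0        # quoted trip time to the nearest stars
        LIGHT_YEAR_KM = 9.461e12
        distance_km = 4.4 * LIGHT_YEAR_KM  # assumed round figure

        print(f"plutonium required: {bombs * plutonium_lb_each / 2000:,.0f} short tons")   # 150,000
        print(f"deuterium cost:     ${deuterium_lb * deuterium_cost_lb / 1e9:,.0f} billion (1968)")
        cruise_speed_km_s = distance_km / (trip_years * 3.156e7)
        print(f"implied cruise speed: ~{cruise_speed_km_s:,.0f} km/s ({cruise_speed_km_s / 299792:.2%} of c)")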

    To obviate some of the objections to this “conservative” design, Dyson also proposed an “optimistic” design, which allowed some ablation of the surface of the vehicle nearest to the explosions, rather than requiring all the energy to be absorbed in solid material.  After removing this energy limitation, the main limitation on the ship’s performance would be imposed by momentum, or, as Dyson put it, “the capacity of shock absorbers to transfer momentum from an impulsively accelerated pusher plate to the smoothly accelerated ship.”  Basing his reasoning on the optimum performance of practical shock absorbers, Dyson calculated that such a ship could be accelerated at a constant one g, enabling it to reach the nearest stars in centuries rather than millennia.  The cost, again based solely on the value of the deuterium fuel, would be only $60 billion 1968 dollars, or a tenth of the GNP at that time.  The weight of the ship would be “only” 400,000 tons, a factor of 100 less than that of the “conservative” design.  Dyson concluded,

    If we continue our 4% growth rate we will have a GNP a thousand times its present size in about 200 years.  When the GNP is multiplied by 1000, the building of a ship for $100B will seem like building a ship for $100M today.  We are now building a fleet of Saturn V which cost about $100M each.  It may be foolish but we are doing it anyhow.  On this basis, I predict that about 200 years from now, barring a catastrophe, the first interstellar voyages will begin.
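    Dyson’s growth arithmetic, at least, roughly checks out; a quick sketch:

        import math

        # How long does a 4% annual growth rate take to multiply GNP by 1000?
        growth_rate = 0.04
        factor = 1000.0
        years = math.log(factor) / math.log(1.0 + growth_rate)
        print(f"~{years:.0f} years")  # about 176 years, in the ballpark of the "200 years" Dyson quotes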

    I suspect Dyson wrote most of this paper “tongue in cheek.”  He’s nobody’s fool, has remarkable achievements to his credit in fields such as quantum electrodynamics, solid state physics, and nuclear engineering, and remains highly regarded by his peers.  Nobel laureate Steven Weinberg said that the Nobel Committee had “fleeced” Dyson by never awarding him the prize.  The objections to his designs are obvious, but for all that, bomb-propelled space vehicles are by no means impractical.  I suspect Dyson realized that other scientists would recognize ways they could improve on his “conservative” and “optimistic” designs as soon as they read the paper, and start thinking about their own versions.  Project Orion might be dead as a budget line item, but would live on in the minds and imaginations of his peers.  And so it did.

  • Nuclear Fusion Update

    Posted on June 10th, 2012 Helian 2 comments

    As I mentioned in a previous post about fusion progress, signs of life have finally been appearing in scientific journals from the team working to achieve fusion ignition at the National Ignition Facility, or NIF, located at Lawrence Livermore National Laboratory (LLNL) in California.  At the moment they are “under the gun,” because the National Ignition Campaign (NIC) is scheduled to end with the end of the current fiscal year on September 30.  At that point, presumably, work at the facility will be devoted mainly to investigations of nuclear weapon effects and physics, which do not necessarily require fusion ignition.  Based on a paper that recently appeared in Physical Review Letters, chances of reaching the ignition goal before that happens are growing dimmer.

    The problem has to do with a seeming contradiction in the physical requirements for fusion to occur in the inertial confinement approach pursued at LLNL.  In the first place, it is necessary for the NIF’s 192 powerful laser beams to compress, or implode, a target containing fusion fuel in the form of two heavy isotopes of hydrogen to extremely high densities.  It is much easier to compress materials that are cold than those that are hot.  Therefore, it is essential to keep the fuel material as cold as possible during the implosion process.  In the business, this is referred to as keeping the implosion on a “low adiabat.”  However, for fusion ignition to occur, the nuclei of the fuel atoms must come extremely close to each other.  Unfortunately, they’re not inclined to do that, because they’re all positively charged, and like charges repel.  How to overcome the repulsion?  By making the fuel material extremely hot, causing the nuclei to bang into each other at high speed.  The whole trick of inertial confinement fusion, then, is to keep the fuel material very cold, and then, in a tiny fraction of a second, while its inertia holds it in place (hence the name, “inertial” confinement fusion), raise it, or at least a small bit of it, to the extreme temperatures necessary for the fusion process to begin.
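
    To put a rough number on that “tiny fraction of a second”: the heated fuel holds together only about as long as it takes a sound wave to cross it at the final temperature.  The sketch below uses a 50-micron hot-spot radius and a 5 keV temperature purely as illustrative round numbers, not NIF design values.

```python
import math

KEV_TO_JOULE = 1.602e-16       # energy of 1 keV in joules
AMU = 1.661e-27                # atomic mass unit, kg

def sound_speed(T_keV, gamma=5.0/3.0, A=2.5, Z=1.0):
    """Sound speed in a fully ionized plasma with mean ion mass A (in amu).

    The pressure includes both ions and electrons at a common temperature,
    so P/rho = (1 + Z) * kT / (A * amu) for charge state Z.
    """
    return math.sqrt(gamma * (1.0 + Z) * T_keV * KEV_TO_JOULE / (A * AMU))

radius = 50e-6                 # assumed hot-spot radius, meters
T_keV = 5.0                    # assumed temperature, keV
cs = sound_speed(T_keV)
print(f"Sound speed: ~{cs / 1e3:.0f} km/s")
print(f"Inertial confinement time ~ radius / sound speed: ~{radius / cs * 1e12:.0f} ps")
```

    Tens of picoseconds is all the time the fusion reactions get before the hot fuel flies apart, which is why every step of the laser pulse has to be timed with such precision.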

    The proposed technique for creating the necessary hot spot was always somewhat speculative, and more than one fusion expert at the national laboratories was dubious that it would succeed.  It consisted of launching a train of four shocks during the implosion process, timed so that they would all overtake one another precisely at the moment of maximum compression, thereby creating the necessary hot spot.  Four shocks are needed because of well-known theoretical limits on how much a single shock can compress the fuel without heating it, and raising its entropy, far more than a low adiabat allows.
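
    The limits in question follow from the textbook (Rankine-Hugoniot) jump conditions for a normal shock in an ideal gas: no matter how strong a single shock is, the density compression it produces saturates at (γ+1)/(γ-1), a factor of 4 for γ = 5/3, while the heating and the entropy jump keep growing.  A minimal sketch:

```python
def normal_shock_jumps(M1, gamma=5.0/3.0):
    """Rankine-Hugoniot ratios across a normal shock of Mach number M1
    in an ideal gas with ratio of specific heats gamma."""
    rho = (gamma + 1.0) * M1**2 / ((gamma - 1.0) * M1**2 + 2.0)
    p = (2.0 * gamma * M1**2 - (gamma - 1.0)) / (gamma + 1.0)
    T = p / rho                # ideal gas: T2/T1 = (p2/p1) / (rho2/rho1)
    return rho, p, T

for M in (1.5, 3.0, 10.0, 100.0):
    rho, p, T = normal_shock_jumps(M)
    print(f"Mach {M:6.1f}:  density x{rho:5.2f}   pressure x{p:9.1f}   temperature x{T:9.1f}")

# Density compression never exceeds (gamma + 1) / (gamma - 1) = 4 for gamma = 5/3,
# while the pressure and temperature jumps grow without bound.  A sequence of
# weaker, precisely timed shocks compresses the fuel much closer to adiabatically.
```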

    Which brings us back to the paper in Physical Review Letters.  Entitled “Precision Shock Tuning on the National Ignition Facility,” it describes the status of efforts to get the four shocks to jump through the hoops described above.  One cannot help but be impressed by the elegant diagnostic tools used to observe and measure the shocks.  They are capable of peering through materials under the extreme conditions in the NIF target chamber, focusing on the tiny, imploded target core, and measuring the progress of a train of shocks over a period that lasts for only a few billionths of a second!  These diagnostics, developed with the help of another team of brilliant scientists at the OMEGA laser facility at the University of Rochester’s Laboratory for Laser Energetics, are a triumph of human ingenuity.  They reveal that the NIF is close to achieving the ignition goal, but not quite close enough.  As noted in the paper, “The experiments also clearly reveal an issue with the 4th shock velocity, which is observed to be 20% slower than predictions from numerical simulation.”

    It will be a neat trick indeed if the NIF team can overcome this problem before the end of the National Ignition Campaign.  In the event that they don’t, one must hope that the current administration is not so short-sighted as to conclude that the facility is a failure, and severely reduce its funding.  There is too much at stake.  I have always been dubious about the possibility that either the inertial or magnetic approach to fusion will become a viable source of energy any time in the foreseeable future.  However, I may be wrong, and even if I’m not, achieving inertial fusion ignition in the laboratory may well point the way to as yet undiscovered paths to the fusion energy goal.  Ignition in the laboratory will also give us a significant advantage over other nuclear weapons states in maintaining our arsenal without nuclear testing.

    The progress reported to date provides no basis for concluding that ignition is unachievable on the NIF.  Even if the central hot spot approach currently being pursued proves too difficult, there are alternatives, such as polar direct drive and fast ignition.  However, pursuing these alternatives will take time and resources.  They will become a great deal more difficult to realize if funding for NIF operations is severely cut.  It will also be important to maintain the ancillary capability provided by the OMEGA laser.  OMEGA is much less powerful, but also a good deal more flexible and nimble, than the gigantic NIF, and has already proved its value in testing and developing diagnostics, investigating novel experimental approaches to fusion, developing advanced target technology, and more.

    We have built world-class facilities.  Let us persevere in the quest for fusion.  We cannot afford to let this chance slip.

  • Fusion Update: Signs of Life from the National Ignition Facility

    Posted on April 17th, 2012 Helian 10 comments

    The National Ignition Facility, or NIF, is a huge, 192-beam laser system located at Lawrence Livermore National Laboratory in California.  It was designed, as the name implies, to achieve thermonuclear ignition in the laboratory.  “Ignition” is generally accepted to mean getting a greater energy output from fusion than the laser input energy.  Unlike magnetic confinement fusion, the approach currently being pursued at the International Thermonuclear Experimental Reactor, or ITER, now under construction in France, the NIF seeks to achieve ignition via inertial confinement fusion, or ICF, in which the fuel material is compressed and heated to the extreme conditions at which fusion occurs so quickly that it is held in place by its own inertia.

    The NIF has been operational for over a year now, and a two-year campaign is underway with the goal of achieving ignition by the end of this fiscal year.  Recently, there has been a somewhat ominous silence from the facility, manifesting itself as a lack of publications in the major journals favored by fusion scientists.  That doesn’t usually happen when there is anything interesting to report.  Finally, however, some papers have turned up in the journal Physics of Plasmas, containing reports of significant progress.

    To grasp the importance of the papers, it is necessary to understand what is supposed to occur within the NIF  target chamber for fusion to occur.  Of course, just as in magnetic fusion, the goal is to bring a mixture of deuterium and tritium, two heavy isotopes of hydrogen, to the extreme conditions at which fusion takes place.  In the ICF approach, this hydrogen “fuel” is contained in a tiny, BB-sized target.  However, the lasers are not aimed directly at the fuel “capsule.”  Instead, the capsule is suspended in the middle of a tiny cylinder made of a heavy metal like gold or uranium.  The lasers are fired through holes on each end of the cylinder, striking the interior walls, where their energy is converted to x-rays.  It is these x-rays that must actually bring the target to fusion conditions.
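
    A crude power balance gives a feel for the x-ray drive the capsule sees.  The sketch below treats the heated hohlraum wall as a blackbody radiator with a high albedo and balances the absorbed laser power against what leaks into the wall and out through the laser entrance holes.  Every number in it (laser power, conversion efficiency, albedo, dimensions) is a round illustrative assumption of mine, not the NIF point design, but the answer lands in the few-hundred-eV range typically quoted for indirect drive.

```python
import math

SIGMA = 5.67e-8             # Stefan-Boltzmann constant, W m^-2 K^-4
EV_TO_KELVIN = 1.16e4       # temperature of 1 eV in kelvin

# Illustrative round numbers -- not NIF design values.
laser_power = 400e12        # peak laser power delivered to the hohlraum, W
conversion = 0.85           # fraction of laser power converted to x rays
albedo = 0.85               # fraction of x rays re-emitted by the heated wall
length, diameter = 1.0e-2, 0.6e-2       # hohlraum length and diameter, m
leh_diameter = 3.0e-3                   # laser entrance hole diameter, m

wall_area = math.pi * diameter * length + 2.0 * math.pi * (diameter / 2) ** 2
hole_area = 2.0 * math.pi * (leh_diameter / 2) ** 2

# Steady-state balance:
#   conversion * P_laser ~ sigma * T^4 * [(1 - albedo) * A_wall + A_holes]
loss_area = (1.0 - albedo) * wall_area + hole_area
T_rad = (conversion * laser_power / (SIGMA * loss_area)) ** 0.25    # kelvin
print(f"Radiation temperature: ~{T_rad / EV_TO_KELVIN:.0f} eV")
```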

    It was recognized many years ago that one couldn’t achieve fusion ignition by simply heating up the target.  That would require a laser driver orders of magnitude bigger than the NIF.  Instead, it is first necessary to compress, or implode, the fuel material to extremely high density.  Obviously, it is harder to “squeeze” hot material than cold material to the necessary high densities, so the fuel must be kept as “cold” as possible during the implosion process.  However, cold fuel won’t ignite, raising the question of how to heat it up once the necessary high densities have been achieved.

    It turns out that the answer is shocks.  When the laser-generated x-rays hit the target surface, they do so with such force that it begins to implode faster than the speed of sound.  Everyone knows that when a plane breaks the sound barrier, it, too, generates a shock, which can be heard as a sonic boom.  The same thing happens in ICF targets.  When such a shock converges at the center of the target, the result is a small “hot spot” in the center of the fuel.  If the temperature in the hot spot were high enough, fusion would occur.  Each fusion reaction would release a high-energy helium nucleus, or alpha particle, and a neutron.  The alpha particles would be slammed to a stop in the surrounding cold fuel material, heating it, in turn, to fusion conditions.  This would result in a fusion “burn wave” that would propagate out through the rest of the fuel, completing the fusion process.
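
    Why the alpha particle does the local heating, while the neutron does not, follows from simple kinematics plus the fact that the alpha is electrically charged.  The D-T reaction releases about 17.6 MeV, and momentum conservation splits that energy between the two products in inverse proportion to their masses.  A quick sketch:

```python
# Energy split of the D-T reaction products (D + T -> alpha + n, Q ~ 17.6 MeV).
# With the reactants nearly at rest, momentum conservation gives each product
# a share of Q inversely proportional to its mass.
Q_MEV = 17.6
m_alpha, m_neutron = 4.0, 1.0         # mass numbers, close enough for this estimate

E_alpha = Q_MEV * m_neutron / (m_alpha + m_neutron)
E_neutron = Q_MEV * m_alpha / (m_alpha + m_neutron)
print(f"alpha:   {E_alpha:.1f} MeV (charged -> stopped nearby, heats the cold fuel)")
print(f"neutron: {E_neutron:.1f} MeV (neutral -> mostly escapes the target)")
```

    The charged 3.5 MeV alpha is stopped over a short distance in sufficiently dense fuel and deposits its energy locally, while the 14.1 MeV neutron, being neutral, mostly streams straight out of the target.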

    The problem is that one shock isn’t enough to create such a “hot spot.”  Four of them are required, all precisely timed by the carefully tailored NIF laser pulse to converge at the center of the target at exactly the same time.  This is where real finesse is needed in laser fusion.  The implosion must be extremely symmetric, or the shocks will not converge properly.  The timing must be exact, and the laser pulse must deliver just the right amount of energy.

    One problem in the work to date has been an inability to achieve high enough implosion velocities for the above scenario to work as planned.  One of the Physics of Plasmas papers reports that, by increasing the laser energy and replacing with depleted uranium some of the gold originally used in the wall of the cylinder, or “hohlraum,” in which the fuel capsule is mounted, velocities of 99% of those required for ignition have been achieved.  In view of the recent announcement that a shot on the NIF had exceeded its design energy of 1.8 megajoules, it appears the required velocity is within reach.  Another of the Physics of Plasmas papers dealt with the degree to which implosion asymmetries were causing harmful mixing of the surrounding cold fuel material into the imploded core of the target.  It, too, provided grounds for optimism.

    In the end, I suspect the success or failure of the NIF will depend on whether the complex sequence of four shocks can really be made to work as advertised.  That will depend on the accuracy of the physics algorithms in the computer codes that have been used to model the experiments.  Time and again, earlier and less sophisticated codes have been wrong because they didn’t accurately account for all the relevant physics.  There is no guarantee that critical phenomena haven’t been left out of the current versions as well.  We may soon find out, if the critical series of experiments intended to achieve ignition before the end of the fiscal year is carried out as planned.

    One can but hope they will succeed, if only because some of our finest scientists have dedicated their careers to the quest to achieve the elusive goal of controlled fusion.  Even if they do, fusion based on the NIF approach is unlikely to become a viable source of energy, at least in the foreseeable future.  Laser fusion may prove scientifically feasible, but getting useful energy out of it will be an engineering nightmare, dangerous because of the need to rely on highly volatile and radioactive tritium, and much too expensive to compete with potential alternatives.  I know many of the faithful in the scientific community will beg to differ with me, but, trust me, laser fusion energy ain’t gonna happen.

    On the other hand, if ignition is achieved, the NIF will be invaluable to the country, not as a source of energy, but for the reason it was funded in the first place – to ensure that our nation has an unmatched suite of experimental facilities to study the physics of nuclear weapons in an era free of nuclear testing.  As long as we have unique access to facilities like the NIF, which can approach the extreme physical conditions within exploding nukes, we will have a significant leg up on the competition while the test ban remains in place.  For that, if for no other reason, we should keep our fingers crossed that the NIF team can finally clear the last technical hurdles and reach the goal they have been working towards for so long.

    Fusion ignition process, courtesy of Lawrence Livermore National Laboratory

  • The NIF: No News is Bad News

    Posted on January 19th, 2011 Helian No comments

    For those who don’t follow fusion technology, the National Ignition Facility, or NIF, is a giant, 192-beam laser facility located at Lawrence Livermore National Laboratory.  As its name would imply, it is designed to achieve fusion ignition, which has been variously defined, but basically means that you get more energy out of the fusion process than was necessary to pump into the system to set off the fusion reactions.  There are two “classic” approaches to achieving controlled fusion in the laboratory.  One is magnetic fusion, in which light atoms stripped of their electrons, or ions, typically heavy isotopes of hydrogen, are confined in powerful magnetic fields as they are heated to the temperatures necessary for fusion to occur.  The other is inertial confinement fusion, or ICF, in which massive amounts of energy are dumped into a small target, causing it to reach fusion conditions so rapidly that significant fusion can occur in the very short time that the target material is held in place by its own inertia.  The NIF is a facility of the latter type.

    There are, in turn, two basic approaches to ICF.  In one, referred to as direct drive, the target material is directly illuminated by the laser beams.  In the other, indirect drive, the target is placed inside a small container, or “hohlraum,” with entrance holes for the laser beams.  These are aimed at the inside walls of the hohlraum, where they are absorbed, producing x-rays which then compress and ignite the target.  The NIF currently uses the latter approach.

    The NIF was completed and became operational in 2009.  Since that time, the amount of news coming out of the facility about the progress of experiments has been disturbingly slight.  That is not a good thing.  If everything were working as planned, a full schedule of ignition experiments would be underway as I write this.  Instead, the facility is idle.  The results of the first experimental campaign, announced in January, sounded positive.  The NIF had operated at a large fraction of its design energy output of 1.8 Megajoules.  Surrogate targets had been successfully compressed to very high densities in symmetric implosions, as required for fusion.  However, on reading the tea leaves, things did not seem quite so rosy.  Very high levels of laser plasma interaction (LPI) had been observed.  In such complex scattering interactions, laser light can be scattered out of the hohlraum, or in other undesired directions, and hot electrons can be generated, wreaking havoc with the implosion process by preheating the target.  We were assured that ways had been found to control the excess LPI, and even turn it to advantage in controlling the symmetry of the implosion.  However, such “tuning” with LPI had not been foreseen at the time the facility was designed, and little detail was provided on how the necessary delicate, time-dependent shaping of the laser pulses would be achieved under such conditions.

    After a long pause, another series of “integrated” experiments was announced in October.  Even less information was released on this occasion.  We were informed that symmetric implosions had been achieved, and that, “From both a system integration and from a physics point of view, this experiment was outstanding.”  Since then, nothing.

    It’s hard to imagine that the outlook is really as rosy as the above statement would imply.  The NIF was designed for a much higher shot rate.  If it sat idle through much of 2010, there must be a reason.  It could be that damage to the laser optics has been unexpectedly high.  This would not be surprising.  Delicate crystals are used at the end of the chain of laser optics to triple the frequency of the laser light, and, given that the output energy of the facility is more than an order of magnitude larger than that of its next largest competitor, damage may have occurred in unexpected ways, as it did on Nova, the NIF’s predecessor at Livermore.  LPI may, in fact, be more serious, more difficult to control, and more damaging than the optimistic accounts in January implied.  Unexpected physics may be occurring in the absorption of laser light at the hohlraum walls.  Whatever the problem, Livermore would be well advised to be forthcoming about it in its press releases.  After all, the NIF will achieve ignition or not, regardless of how well the PR is managed.
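
    For reference on the frequency-tripling step mentioned above: the NIF’s neodymium-glass amplifiers produce infrared light near 1053 nm, and the KDP-family crystals at the end of each beamline convert it to the third harmonic, around 351 nm, before it enters the target chamber.  A trivial sketch of the conversion:

```python
# Third-harmonic conversion of the drive laser (1-omega to 3-omega).
HC_EV_NM = 1239.84             # h * c in eV * nm

fundamental_nm = 1053.0        # Nd:glass fundamental wavelength
tripled_nm = fundamental_nm / 3.0
print(f"3-omega wavelength: {tripled_nm:.0f} nm")
print(f"Photon energy: {HC_EV_NM / tripled_nm:.2f} eV at 3-omega "
      f"vs {HC_EV_NM / fundamental_nm:.2f} eV at 1-omega")
```

    The shorter ultraviolet wavelength couples to the hohlraum plasma more efficiently and is less prone to LPI, which is why the conversion is worth the damage-prone crystals it requires.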

    All this seems very discouraging for the scientists who have devoted their careers to the quest for fusion energy, not to mention the stewards of the nation’s nuclear weapons stockpile, whose needs the NIF was actually built to address.  In the end, these apparent startup problems may be overcome, and ignition achieved after all.  However, I rather doubt it, unless perhaps Livermore comes up with an alternative to its indirect drive approach.