Posted on April 22nd, 2013
A while back in an online discussion with a German “Green,” I pointed out that, if Germany shut down its nuclear plants, coal plants would have to remain in operation to take up the slack. He was stunned that I could be so obtuse. Didn’t I realize that the lost nuclear capacity would all be replaced by benign “green” energy technology? Well, it turns out things didn’t quite work out that way. In fact, the lost generating capacity is being replaced by – coal.
Germany is building new coal-fired power plants hand over fist, with 26 of them planned for the immediate future. According to Der Spiegel, the German news magazine that never misses a trick when it comes to bashing nuclear, that’s a feature, not a bug. A recent triumphant headline reads, “Export Boom: German Coal Electricity Floods Europe.” Expect more of the same from the home of Europe’s most pious environmentalists. Germany has also been rapidly expanding its solar and wind capacity recently thanks to heavy state subsidies, but the wind doesn’t always blow and the sun doesn’t always shine, especially in Germany. Coal plants are required to fill in the gaps – lots of them. Of course, it would be unprofitable to let them sit idle when wind and solar are available, so they are kept going full blast. When the power isn’t needed in Germany, it is sold abroad, serving as a useful prop to Germany’s export-fueled economy.
Remember the grotesque self-righteousness of Der Spiegel and the German “Greens” during the Kyoto Treaty debates at the end of the Clinton administration? Complying with the Kyoto provisions cost the Germans nothing. They had just shut down the heavily polluting and grossly unprofitable industries in the former East Germany, had brought large numbers of new gas-fired plants on line thanks to increasing gas supplies from the North Sea fields, and had topped it off with a lame economy in the 1990s compared to the booming U.S. Their greenhouse gas emissions had dropped accordingly. Achieving similar reductions in the U.S. wouldn’t have been such a “freebie.” It would have cost tens of thousands of jobs. The German “Greens” didn’t have the slightest problem with this. They weren’t interested in achieving a fair agreement that would benefit all. They were only interested in striking pious poses.
Well, guess what? Times have changed. Last year U.S. carbon emissions were at their lowest level since 1994, and down 3.7% from 2011. Our emissions are down 7.7% since 2006, the largest drop among major industrial states on the planet. German emissions were up at least 1.5% last year, and probably more like 2%. Mention this to a German “Green,” and he’s likely to mumble something about Germany still being within the Kyoto limits. That’s quite true. Germany is still riding the shutdown of what news magazine Focus calls “dilapidated, filthy, communist East German industry after the fall of the Berlin Wall,” to maintain the facade of environmental “purity.”
That’s small comfort to her eastern European neighbors. Downwind from Germany’s coal-fired plants, their “benefit” from her “green” policies is acid rain, nitrogen oxide-laced smog, deadly particulates that kill and sicken thousands and, last but not least, a rich harvest of radioactive fallout. That’s right, Germany didn’t decrease the radioactive hazard to her neighbors by shutting down her nuclear plants. She vastly increased it. Coal contains several parts per million each of radioactive uranium and thorium. These elements are harmless enough – if kept outside the body. The energetic alpha particles they emit are easily stopped by a normal layer of skin, dumping the energy they carry within a very short distance; but since the outer layer of skin is dead, it doesn’t matter. It’s an entirely different matter when they dump those several million electron volts of energy into a living cell – such as a lung cell. Among other things, that can easily derange the reproductive equipment of the cell, causing cancer. How can they reach the lungs? Very easily, if the uranium and thorium that emit them are carried in the ash from a coal-fired plant. A typical coal-fired plant releases about 5 tons of uranium and 12 tons of thorium every year. The German “Greens” have no problem with this, even though they’re constantly bitching about the relatively minuscule release of uranium from U.S. depleted uranium munitions. Think scrubber technology helps? Guess again! The uranium and thorium are concentrated in the ash, whether it ends up in the air or not. From there they can easily leach into surrounding cropland and water supplies.
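The tonnage figures above are easy to sanity-check. A minimal sketch – the 4-million-tonne annual burn rate and the ppm concentrations below are my illustrative assumptions, not figures from the article:

```python
# Back-of-envelope check of annual uranium/thorium release from one coal plant.
# Assumed inputs (illustrative, not from the article):
coal_burned_tonnes = 4_000_000   # a large plant burns roughly this much coal per year
uranium_ppm = 1.3                # assumed uranium concentration in coal, parts per million
thorium_ppm = 3.2                # assumed thorium concentration in coal, parts per million

uranium_tonnes = coal_burned_tonnes * uranium_ppm / 1e6
thorium_tonnes = coal_burned_tonnes * thorium_ppm / 1e6
print(f"uranium: {uranium_tonnes:.1f} t/yr, thorium: {thorium_tonnes:.1f} t/yr")
```

With these assumed inputs the result lands in the same ballpark as the “5 tons of uranium and 12 tons of thorium” quoted in the text.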
The last time there was an attempt to move radioactive waste to the Gorleben storage facility within Germany, the “Greens” could be found striking heroic poses as saviors of the environment all along the line, demonstrating, tearing up tracks, and setting police vehicles on fire. Their “heroic” actions forced the shutdown of Germany’s nuclear plants. The “gift” (German for “poison”) of their “heroic” actions to Germany’s neighbors came in the form of acid rain, smog, and airborne radiation. By any reasonable standard, coal-fired plants are vastly more dangerous and damaging to the environment than the nuclear facilities they replaced.
It doesn’t matter to Germany’s “Greens.” The acid rain, the radiation, the danger of global warming they always pretend to be so concerned about? It doesn’t matter. For them, as for the vast majority of other environmental zealots worldwide, the pose is everything. The reality is nothing.
Posted on February 4th, 2013
According to a German proverb, “Lügen haben kurze Beine” – lies have short legs. That’s not always true. Some lies have very long ones. One of the most notorious is the assertion, long a staple of anti-nuclear propaganda, that the nuclear industry once claimed that nuclear power would be “too cheap to meter.” In fact, according to the New York Times, the phrase did occur in a speech delivered to the National Association of Science Writers by Lewis L. Strauss, then Chairman of the Atomic Energy Commission, in September 1954. Here is the quote, as reported in the NYT on September 17, 1954:
“Our children will enjoy in their homes electrical energy too cheap to meter,” he declared. … “It is not too much to expect that our children will know of great periodic regional famines in the world only as matters of history, will travel effortlessly over the seas and under them and through the air with a minimum of danger and at great speeds, and will experience a lifespan far longer than ours, as disease yields and man comes to understand what causes him to age.”
Note that nowhere in the quote is there any direct reference to nuclear power, or for that matter, to fusion power, although the anti-nuclear Luddites have often attributed it to proponents of that technology as well. According to Wikipedia, Strauss was “really” referring to the latter, but I know of no evidence to that effect. In any case, Strauss had no academic or professional background that would qualify him as an expert in nuclear energy. He was addressing the science writers as a government official, and hardly as a “spokesman” for the nuclear industry. The sort of utopian hyperbole reflected in the above quote is just what one would expect in a talk delivered to such an audience in the era of scientific and technological hubris that followed World War II. There is an excellent and detailed deconstruction of the infamous “Too cheap to meter” lie on the website of the Canadian Nuclear Society. Some lies, however, are just too good to ignore, and anti-nuclear zealots continue to use this one on a regular basis, as, for example, here, here and here. The last link points to a paper by long-time anti-nukers Arjun Makhijani and Scott Saleska. They obviously knew very well the provenance of the quote and the context in which it was given. For example, quoting from the paper:
In 1954, Lewis Strauss, Chairman of the U.S. Atomic Energy Commission, proclaimed that the development of nuclear energy would herald a new age. “It is not too much to expect that our children will enjoy in their homes electrical energy too cheap to meter,” he declared to a science writers’ convention. The speech gave the nuclear power industry a memorable phrase to be identified with, but also it saddled it with a promise that was essentially impossible to fulfill.
In other words, it didn’t matter that they knew very well that Strauss had no intention of “giving the nuclear power industry a memorable phrase to be identified with.” They used the quote in spite of the fact that they knew that claim was a lie. In all fairness, it can be safely assumed that most of those who pass along the “too cheap to meter” lie are not similarly culpable. They are merely ignorant.
Posted on January 7th, 2013
Der Spiegel, Germany’s top news magazine, has been second to none in promoting green energy, striking pious poses over the U.S. failure to jump on the Kyoto bandwagon, and trashing nuclear energy. All this propaganda has succeeded brilliantly. Germany has a powerful Green Party and is a world leader in the production of wind and solar energy, the latter in a cloudy country, the lion’s share of which lies above the 50th parallel of latitude. Now the bill has come due. In 2012 German consumers paid more than 20 billion Euros for green energy that was worth a mere 2.9 billion on the open market. True to form, Der Spiegel has been churning out shrill condemnations of the high prices, as if it never had the slightest thing to do with promoting them in the first place. In an article entitled “Green Energy Costs Consumers More Than Ever Before,” we find, among other things, that,
The cost of renewable energy continues climbing year after year. At the beginning of the year it increased from 3.59 to 5.27 (Euro) cents per kilowatt hour. One of the reasons for the increase is solar energy: more new solar facilities were installed in Germany in 2012 than ever before. The drawback of the solar boom is that it drives up the production costs paid by consumers. The reason – green energy producers will receive guaranteed compensation for every kilowatt hour for the next 20 years.
As a result, German consumers saw their bills for electricity increase by an average of 12% at the beginning of 2013. The comments following the article are at least as revealing as its content. The environmental hubris of the population shows distinct signs of fading when translated into terms of cold, hard cash. Examples:
What a laugh! The consumers pay 17 billion Euros, and the producers receive 2.9 billion Euros. Conclusion: End the subsidies for solar facilities immediately!! It’s too bad that the pain of consumers – if the Green Party joins the government after the Bundestag election – won’t end, but will only get worse. Other than that, solar facilities belong in countries with significantly more hours of sunlight than Germany.
Those were the days, when (Green politician) Trittin told shameless lies to the public, claiming that the switch to green energy would only cost 1.5 Euros per household.
In ten years we’ll learn what the green energy lies are really going to cost us.
The real costs are even higher. When there’s no wind, or clouds cut off the sunlight, then the conventional energy sources held in reserve must make up the deficit: the oil, coal and brown coal plants. If production costs are calculated correctly, then their expense should be included in the price of green energy. All at once there is a jump from 17 billion to 25 billion Euros in the price we have to pay for the “favors” the Green-Red parties have done us.
Specious arguments about the supposedly comparable costs of the nuclear power plants Germany is in the process of shutting down are no longer swallowed with alacrity. For example, in response to the familiar old chestnut of citing exaggerated costs for decommissioning nuclear plants and storing the waste, a commenter replies:
Hmmm, if nuclear energy is so expensive, why are so many countries in central Europe – for example, the Czech Republic – interested in nuclear power? Certainly not to breed actinides to build nuclear weapons in order to become “nuclear powers.” The cost of long term waste storage in terms of the energy produced only amounts to about 0.01 Euros per kWh. Even decommissioning expenses don’t add significantly to the overall cost… Let us split atoms, not hairs.
A “green” commenter suggests that the cleanup costs for the Fukushima reactors be automatically added to the cost of all reactors:
According to the latest figures for November 2012 for Fukushima: 100 billion Euros. Distributing this over the total energy production of 880,000 GWh (according to Wikipedia), that’s 11 cents per kilowatt hour. That amounts to twice the “prettified” cost of nuclear power (without insurance and without subsidies) of 5 cents per kilowatt hour. And even then the Japanese were lucky that the wind didn’t shift in the direction of Tokyo. But the 100 billion won’t be the last word.
Drawing the response from another reader:
Let’s see. Japanese nuclear power plants have produced 7,656,400 GWh of energy. In comparison to economic costs in the high tens of billions, 100 billion suddenly doesn’t seem so unreasonable. It only adds 1.3 cents per kWh to the cost of nuclear energy. Peanuts. In Germany, renewables are currently costing an average of 18 cents per kWh. That translates to 100 billion in under four years. In other words, thanks to renewables, we have a Fukushima in Germany every four years.
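The arithmetic in these dueling comments is easy to verify. A quick sketch, using only the figures the commenters themselves cite:

```python
# Check the cost-per-kWh arithmetic from the two comments above.
fukushima_cost_eur = 100e9           # cleanup estimate cited by the first commenter

# First comment: spread the cost over Fukushima's total output of 880,000 GWh.
fukushima_output_kwh = 880_000 * 1e6
print(f"{100 * fukushima_cost_eur / fukushima_output_kwh:.1f} ct/kWh")   # ~11.4

# Second comment: spread it over all Japanese nuclear output, 7,656,400 GWh.
japan_output_kwh = 7_656_400 * 1e6
print(f"{100 * fukushima_cost_eur / japan_output_kwh:.1f} ct/kWh")       # ~1.3
```

Both commenters’ numbers check out; the argument turns entirely on which denominator one considers fair.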
In response to a remark about all the wonderful green jobs created, another commenter responds,
Jobs created? Every job is subsidized to the tune of 40,000 Euros; how, exactly, is that supposed to result in a net gain for the economy overall?? According to your logic, all we have to do to eliminate any level of unemployment is just subsidize it away. That’s Green politics for you.
Another unhappy power customer has noticed that, in addition to the hefty subsidy he’s paying for his own power, he has to finance his well-heeled “green” neighbor’s rooftop solar array as well:
Whoever is surprised about the increases in the cost of electricity hasn’t been paying attention. There’s no such thing as a free lunch. At the moment the consumer is paying for the solar cells on his neighbor’s roof right along with his own electricity bill. Surprising? Who’s surprised?
It’s amazing how effective a substantial and increasing yearly hit to income can be in focusing the mind when it comes to assessing the real cost of green energy.
Posted on October 23rd, 2012
ARPA-E, or the Advanced Research Projects Agency – Energy, is supposed to be DOE’s version of DARPA. According to its website, its mission,
…is to fund projects that will develop transformational technologies that reduce America’s dependence on foreign energy imports; reduce U.S. energy related emissions (including greenhouse gasses); improve energy efficiency across all sectors of the U.S. economy and ensure that the U.S. maintains its leadership in developing and deploying advanced energy technologies.
So far, it has not come up with anything quite as “transformational” as the Internet or stealth technology. There is good reason for this. Its source selection people are decidedly weak in the knees. Consider the sort of stuff it’s funded in the latest round of contract awards. The people at DARPA would probably call it “workmanlike.” H. L. Mencken, the great Sage of Baltimore, would more likely have called it “pure fla fla.” For example, there are “transformational” systems to twiddle with natural gas storage that would have been better left to the industry – not exactly short of cash at the moment – to develop on its own, such as,
Liquid-Piston Isothermal Home Natural Gas Compressor
Chilled Natural Gas At-Home Refueling
Superplastic-Formed Gas Storage Tanks
There is the “transformational” university research that is eye-glazingly mundane, and best reserved as filler for the pages of obscure academic journals, such as,
Cell-level Power Management of Large Battery Packs
Health Management System for Reconfigurable Battery Packs
Optimal Operation and Management of Batteries Based on Real Time Predictive Modeling and Adaptive Battery Management Techniques.
There is some “groundbreaking” stuff under the rubric of “build a better magnet, and the world will beat a pathway to your door.”
Manganese-Based Permanent Magnet with 40 MGOe at 200°C
Rare‐Earth‐Free Permanent Magnets for Electrical Vehicle Motors and Wind Turbine Generators: Hexagonal Symmetry Based Materials Systems Mn‐Bi and M‐type Hexaferrite
Discovery and Design of Novel Permanent Magnets using Non-strategic Elements having Secure Supply Chains
…and so on. Far be it from me to claim that any of this research is useless. It is, however, also what the people at DARPA would call “incremental,” rather than transformational. Of course, truly transformational ideas don’t grow on trees, and DARPA also funds its share of “workmanlike” projects, but at least the source selection people there occasionally go out on a limb. In the work funded by ARPA-E, on the other hand, I can find nothing that might induce the bureaucrats on Secretary Chu’s staff to swallow their gum.
If the agency is really serious about fulfilling its mission, it might consider some of the innovative ideas out there for harnessing fusion energy. All of them can be described as “high risk, high payoff,” but isn’t that the kind of work ARPA-E is supposed to be funding? According to a recent article on the Science Magazine website, the White House has proposed cutting domestic fusion research by 16% to help pay for the U.S. contribution to the international fusion experiment, ITER, under construction in Cadarache, France. As I’ve pointed out elsewhere, ITER is second only to the International Space Station as the greatest white elephant of all time, and is similarly vacuuming up funds that might otherwise have supported worthwhile research in several other countries. All the more reason to give a leg up to fusion, a technology that has bedeviled scientists for decades, but that could potentially supply mankind’s energy needs for millennia to come. Ideas being floated at the moment include advanced fusor concepts such as the Bussard polywell, magneto-inertial fusion, focus fusion, etc. None of them look particularly promising to me, but if any of them pan out, the potential payoff is huge. I’ve always been of the opinion that, if we ever do harness fusion energy, it will be by way of some such clever idea rather than by building anything like the current “conventional” inertial or magnetic fusion reactor designs.
When it comes to conventional nuclear energy, we are currently in the process of being left in the dust by countries like India and China. Don’t expect any help from industry here. They are in the business to make a profit. There’s certainly nothing intrinsically wrong with that, but at the moment, profits are best maximized by building light water reactors that consume the world’s limited supply of fissile uranium 235 without breeding more fuel to replace it, spawning long-lived, highly radioactive transuranic actinides that will have to be stored safely for thousands of years. This may be good for profits, but it’s definitely bad for future generations. Alternative designs exist that would breed as much new fuel as they consume, would be intrinsically safe against meltdown, would destroy the actinides along with some of the worst radioactive fission products, and would leave waste potentially less radioactive than the original ore within a few hundred years. DOE’s Office of Nuclear Energy already funds some research in these areas. Unfortunately, in keeping with the time-honored traditions of government research funding, they like to play it safe, funneling awards to “noted experts” who tend to keep plodding down well-established paths even when they are clearly leading to dead ends. ITER and the International Space Station are costly examples of where that kind of thinking leads. If it were really doing its job, an agency like ARPA-E might really help to shake things up a little.
Finally, we come to that scariest of boogeymen of “noted experts” the world over: cold fusion, or, as some of its advocates more reticently call it, Low Energy Nuclear Reactions (LENR). Following the initial spate of excitement on the heels of the announcement by Pons and Fleischmann of excess heat in their experiments with palladium cells, the scientific establishment agreed that such ideas were to be denounced as heretical. Anathemas and interdicts rained down on their remaining proponents. Now, I must admit that I don’t have much faith in LENR myself. I happened to attend the Cold Fusion Workshop in Santa Fe, NM, which was held in 1989, not long after the Pons/Fleischmann bombshell, and saw and heard some memorably wacky posters and talks. I’ve talked to several cold fusion advocates since then, and some appeared perfectly sober, but an unsettlingly large proportion of others seemed to be treading close to the lunatic fringe. Just as fusion energy is always “30 years in the future,” cold fusion proponents have been claiming that their opponents will be “eating crow in six months” ever since 1989. Some very interesting results have been reported. Unfortunately, they haven’t been reproducible.
For all that, LENR keeps hanging around. It continues to find advocates among those who, for one reason or another, aren’t worried about their careers, or lack respect for authority, or are just downright contrarians. The Science of Low Energy Nuclear Reactions by Edmund Storms is a useful source for the history of and evidence for LENR. Websites run by the cold fusion faithful may be found here and here. Recently, stories have begun cropping up again in “respectable” mags, such as Forbes and Wired. Limited government funding has been forthcoming from NASA Langley and, at least until recently, from the Navy at its Space and Naval Warfare Systems Command (SPAWAR). Predictably, such funding is routinely attacked as support for scientific quackery. The proper response to that from the source selection folks at ARPA-E should be, “So what?” After all,
ARPA-E was created to be a catalyst for innovation. ARPA-E’s objective is to tap into the risk-taking American ethos and to identify and support the pioneers of the future. With the best research and development infrastructure in the world, a thriving innovation ecosystem in business and entrepreneurship, and a generation of youth that is willing to engage with fearless intensity, the U.S. has all the ingredients necessary for future success. The goal of ARPA-E is to harness these ingredients and make a full-court press to address the U.S.’s technological gaps and leapfrog over current energy approaches.
The best way to “harness these ingredients and make a full-court press” is not by funding the next round of incremental improvements in rare earth magnets. Throwing a few dollars to the LENR people, on the other hand, will certainly be “high risk,” but it just might pan out. I hope the people at ARPA-E can work up the minimal level of courage it takes to do so. If the Paris fashions can face down ridicule, so can they. If they lack the nerve, then DOE would probably do better to terminate its bad imitation of DARPA and feed the money back to its existing offices. They can continue funding mediocrity just as well as ARPA-E.
Posted on October 22nd, 2012
We have passed the end of the fiscal year, and the National Ignition Facility, or NIF, at Lawrence Livermore National Laboratory (LLNL) failed to achieve its goal of ignition (more fusion energy out than laser energy in). As I noted in an earlier post about the NIF more than three years ago, this doesn’t surprise me. Ignition using the current indirect drive approach (most of the jargon and buzzwords are explained in the Wiki article on the NIF) requires conversion of the laser energy into an almost perfectly symmetric bath of x-rays. These must implode the target, preserving its spherical shape in spite of a very high convergence ratio (initial radius divided by final radius), and launching a train of four shocks that must all converge in a tiny volume at the center of the target, heating it to fusion conditions. That releases energetic alpha particles (helium nuclei), which must then dump their energy in the surrounding cold fuel material, causing a “burn wave” to propagate out from the center, consuming the remaining fuel. It would have been a spectacular achievement if LLNL had pulled it off. Unfortunately, they didn’t, for reasons that are explained in an excellent article that recently appeared in the journal Science. (Unfortunately, it’s behind a subscriber wall, and I haven’t found anything as good on the web at the moment. You can get the gist from this article at Huffpo.) The potential political implications of the failure were addressed in a recent article in the New York Times.
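Incidentally, the reason a high convergence ratio makes the symmetry requirement so unforgiving can be seen in one line of arithmetic: the volume compression scales as the cube of the ratio, so tiny asymmetries in the drive are enormously amplified. The radii below are purely illustrative, not actual NIF capsule dimensions:

```python
# Why a high convergence ratio is so demanding: compression scales as its cube.
# Illustrative numbers only, not NIF design values.
initial_radius_um = 1000.0   # assumed initial capsule radius, microns
final_radius_um = 30.0       # assumed compressed hot-spot radius, microns

convergence_ratio = initial_radius_um / final_radius_um
volume_compression = convergence_ratio ** 3
print(f"CR = {convergence_ratio:.0f}, volume compression = {volume_compression:.0f}x")
```

A convergence ratio in the thirties thus implies squeezing the fuel into tens of thousands of times less volume, which is why the x-ray bath must be so nearly perfectly symmetric.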
All of which raises the question, “What now?” My opinion, in short, is that the facility should remain operational at full capacity (not on half shifts, which, for various reasons, would reduce the experimental value of the facility by significantly more than half).
I certainly don’t base that opinion on the potential of inertial confinement fusion (ICF), the technology implemented on the NIF, for supplying our future energy needs. While many scientists would disagree with me, I feel it has virtually none. Although they may well be scientifically feasible, ICF reactors would be engineering nightmares, and far too expensive to compete with alternative energy sources. It would be necessary to fabricate many thousands of delicate, precisely built targets every day and fill them with highly radioactive tritium. Tritium is not a naturally occurring isotope of hydrogen, and its half-life (the time it takes for half of a given quantity to undergo radioactive decay) is just over 12 years, so it can’t be stored indefinitely. It would be necessary to breed and extract the stuff from the reactor on the fly without releasing any into the environment (hydrogen is notoriously slippery stuff that can easily leak right through several types of metal barriers), load it into the targets, and then cool them to cryogenic temperatures. There is not a reactor design study out there that doesn’t claim that this can be done cheaply enough to make ICF fusion energy cost-competitive. They are all poppycock. The usual procedure in such studies is to pick the cost number you need, and then apply “science” to make it seem plausible.
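The tritium bookkeeping problem follows directly from the half-life figure above. A minimal sketch of the decay arithmetic – the 10 kg starting inventory is an assumption for illustration, not a real reactor figure:

```python
# Tritium decays with a half-life of roughly 12.3 years, so a stored inventory
# shrinks steadily; a reactor would have to breed replacement fuel continuously.
half_life_years = 12.3
initial_inventory_kg = 10.0   # assumed starting stockpile (illustrative)

for t in (0, 5, 12.3, 25):
    # Standard exponential decay: N(t) = N0 * (1/2)^(t / half-life)
    remaining = initial_inventory_kg * 0.5 ** (t / half_life_years)
    print(f"after {t:>4} years: {remaining:.2f} kg")
```

After a single half-life, half the stockpile is simply gone, which is the arithmetic behind the claim that tritium “can’t be stored indefinitely.”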
However, despite all the LLNL hype, the NIF was never funded as an energy project, but as an experimental tool to help maintain the safety and reliability of our nuclear stockpile in the absence of nuclear testing. The idea that it will be useless for that purpose, whether it achieves ignition or not, is nonsense. The facility has met and in some cases exceeded its design goals in terms of energy and precision. Few if any other facilities in the world, whether existing or planned, will be able to rival its ability to explore equations of state, opacities, and other weapons-relevant physics information about materials at conditions approaching those that exist in nuclear detonations. As long as the ban on nuclear testing remains in effect, the NIF will give us a significant advantage over other nuclear states. It seems to me that maintaining the ban is a good thing.
It also seems to me that it would behoove us to maintain a robust nuclear stockpile. Nuclear disarmament sounds nice on paper. In reality it would invite nuclear attack. The fact that nuclear weapons have not been used since 1945 is a tremendous stroke of luck. However, it has also seduced us into assuming they will never be used again. They will. The question is not if, but when. We could continue to be very lucky. We could also suffer a nuclear attack tomorrow, whether by miscalculation, or the actions of terrorists or rogue states. If we continue to have a stockpile, it must be maintained. Highly trained scientists must be available to maintain it. Unfortunately, babysitting a pile of nuclear bombs while they gather dust is not an attractive career path. Access to facilities like the NIF is a powerful incentive to those who would not otherwise consider such a career.
One of the reasons this is true is the “dual use” capability of the NIF. It can be used to study many aspects of high energy density physics that may not be relevant to nuclear weapons, but are of great interest to scientists in academia and elsewhere who are interested in fusion energy, the basic science of matter at extreme conditions, astrophysics, etc. Some of the available time on the facility will be reserved for these outside users.
As for the elusive goal of ignition itself, we know that it is scientifically feasible, just as we know that its magnetic fusion equivalent is scientifically feasible. The only question remaining is how big the lasers have to be to reach it. It may eventually turn out that the ones available on the NIF are not big enough. However, the idea that because we didn’t get ignition in the first attempts somehow proves that ignition is impossible and out of the question is ridiculous. It has not even been “proven” that the current indirect drive approach won’t work. If it doesn’t, there are several alternatives. The NIF is capable of being reconfigured for direct drive, in which the lasers are aimed directly at the fusion target. For various reasons, the beams are currently being frequency-tripled from the original “red” light of the glass lasers to “blue.” Much more energy, up to around four megajoules instead of the current 1.8, would be available if the beams were only frequency-doubled to “green”. It may be that the advantage of the extra energy will outweigh the physics-related disadvantages of green light. An interesting dark horse candidate is the “fast ignitor” scenario, in which the target would be imploded as before, but a separate beam or beams would then be used to heat a small spot on the outer surface to ignition conditions. An alpha particle “burn wave” would then propagate out, igniting the rest of the fuel, just as originally envisioned for the central hot spot approach.
Some of the comments following the Internet posts about NIF’s failure to reach ignition are amusing. For example, following an article on the Physics Today website we learn to our dismay:
With all due respect to the NIF and its team of well-meaning and enthusiastic researchers here, I am sorry to state hereby that sustainable nuclear fusion is predestined to fail, whether it be in the NIF, the Tokamak or anywhere else in solar space, for fundamentally two simple reasons paramount for fusion: (1) vibrational synchronism (high-amplitude resonance) of reacting particles; and (2) the overall isotropy of their ambient field.
Obviously the commenter hadn’t heard that the scientific feasibility of both inertial and magnetic fusion has already been established. He reminds me of a learned doctor who predicted that Zadig, the hero of Voltaire’s novel of that name, must inevitably die of an injury. When Zadig promptly recovered, he wrote a thick tome insisting that Zadig must inevitably have died. Voltaire informs us that Zadig did not read the book. In an article on the IEEE Spectrum website, suggestively entitled National Ignition Facility: Mother of All Boondoggles?, another commenter chimes in:
How about we spend the billions on real research that actually has a chance of producing something useful? There are a gazillion ideas out there for research that has a much higher probability of producing useful results. Must be nice to work for LLNL where your ideas don’t need vetting.
In fact, the NIF was “vetted” by a full scale Federal Advisory Committee. Known as the Inertial Confinement Fusion Advisory Committee, or ICFAC, its members included Conrad Longmire, Marshall Rosenbluth, and several other experts in plasma physics and technology of world renown who had nothing whatsoever to gain by serving as shills for LLNL. It heard extensive testimony on plans to build the NIF, both pro and con, in the mid-90′s. Prominent among those who opposed the project was Steve Bodner, head of the ICF Program at the Naval Research Laboratory (NRL) at the time. Steve cited a number of excellent reasons for delaying major new starts like the NIF until some of the outstanding physics issues could be better understood. The Committee certainly didn’t ignore what he and other critics had to say. However, only one of the 15 or so members dissented from the final decision to recommend proceeding with the NIF. I suspect that LLNL’s possession of the biggest, baddest ICF computer code at the time had something to do with it. No one is better at bamboozling himself and others than a computational physicist with a big code. The one dissenter, BTW, was Tim Coffey, Director of NRL at the time, who was convinced that Bodner was right.
There are, of course, the predictable comments by those in the habit of imagining themselves geniuses after the fact, such as,
I am convinced. Garbage research.
Don’t these people feel ashamed telling so many lies?
after the IEEE Spectrum article, and,
It’s amazing to think that you can spout lies to the government to receive $6 billion for a machine that doesn’t come close to performing to spec and there are no consequences for your actions.
following a post on the NIF at the LLNL – The True Story blog. Fortunately, most of the comments I’ve seen recently have been at a rather more thoughtful level. In any event, I hope Congress doesn’t decide to cut and run on the NIF. Pulling the plug at this point would be penny-wise and pound-foolish.
Posted on September 13th, 2012 No comments
You might get the idea from reading my blog that I have something against thorium. It ain’t so! I consider thorium a very promising candidate for supplying our future energy needs. It’s just that there’s something about the stuff that seems to drive people off the deep end. I actually missed the really bad thorium idea that’s the subject of this post when it turned up on the Internet about a year ago. However, the articles are still out there, and are interesting examples of how really bad science can be promoted as perfectly plausible by people who have impressive credentials, but actually don’t know what they’re talking about.
The idea in question was the use of thorium fueled mini-subcritical reactors to generate power for a new generation of electric cars. It was proposed by an outfit called Laser Power Systems (LPS). Science may not be their strong suit, but their PR people must be top drawer. They actually convinced the people at Cadillac to embarrass themselves by designing a “concept car” around the idea.
Steven Ashley penned an article for GE’s Txchnologist website about the idea entitled, “Thorium lasers: the perfectly plausible idea for nuclear cars.” Ashley, who is described as a contributing editor for Scientific American and whose work has appeared in such venues as Popular Mechanics, MIT’s Technology Review, and Physics Today, does not explain on what credentials or academic background he bases his conclusion that the idea is “perfectly plausible.” Presumably, none, because it isn’t. He tells us, quoting an LPS official, that they are “working on a turbine/electric generator system that is powered by ‘an accelerator-driven thorium-based laser.’” The same individual further assured him that the new technology “would be totally emissions-free, with no need for recharging.” Ashley adds that,
…because a gram of thorium has the equivalent potential energy content of 7,500 gallons of gasoline, LPS calculates that using just 8 grams of thorium in the unit could power an average car for 5,000 hours, or about 300,000 miles of normal driving.
Where, the interested reader might ask, is all this energy to come from? Lasers, after all, are not a source of energy. In general, they are rather inefficient energy sinks. I found several similar articles, and none of them ever gets around to explaining this intriguing mystery. Well, the only possible way that such a small amount of thorium could come close to producing that much energy is via fission, and even if every bit of it underwent fission, it would still produce about an order of magnitude less energy than 7,500 gallons of gasoline. We are assured that “only a thin layer of aluminum foil is needed to shield people from the weakly emitting metal.” True, but the same doesn’t apply to thorium’s fission products. They would eventually accumulate to become a potentially deadly source of radiation unless heavily shielded. Nor does any of the articles explain where, exactly, the “thorium laser” comes in, what specific atomic transitions it would rely on, how, exactly, it would be pumped, and similar seemingly obvious questions.
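For what it’s worth, the fission arithmetic is easy to check. Here is a minimal back-of-envelope sketch; all figures are standard round numbers of my own choosing, not anything LPS has published:

```python
# Back-of-envelope check of the "1 gram of thorium = 7,500 gallons of
# gasoline" claim, assuming the absolute best case: every thorium atom
# is bred to U-233 and fissioned.

GRAMS_PER_MOLE_TH = 232.0            # thorium-232 molar mass
AVOGADRO = 6.022e23                  # atoms per mole
MEV_PER_FISSION = 200.0              # typical energy release per fission
JOULES_PER_MEV = 1.602e-13
JOULES_PER_GALLON_GASOLINE = 1.3e8   # ~34 MJ/liter * 3.785 liters/gallon

atoms_per_gram = AVOGADRO / GRAMS_PER_MOLE_TH
joules_per_gram_fission = atoms_per_gram * MEV_PER_FISSION * JOULES_PER_MEV
gasoline_equivalent_gallons = joules_per_gram_fission / JOULES_PER_GALLON_GASOLINE

print(f"Complete fission of 1 g: {joules_per_gram_fission:.2e} J")
print(f"Gasoline equivalent:     {gasoline_equivalent_gallons:.0f} gallons")
```

The answer comes out in the neighborhood of several hundred gallons per gram, roughly an order of magnitude short of the 7,500 gallons claimed, and that is with the physically impossible assumption of 100 percent burnup.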
What can one do but shake one’s head and congratulate the LPS people on their brilliant success in bamboozling Cadillac and a whole host of ostensibly perfectly respectable science writers into taking seriously an idea that is completely wacky on the face of it? I’m certainly glad that I don’t fall for such pseudo-scientific nonsense. Oh, by the way, would anyone out there like to purchase a slightly used supply of fish oil pills?
Posted on July 24th, 2012 No comments
According to a recent press release from Lawrence Livermore National Laboratory (LLNL) in California, the 192-beam National Ignition Facility (NIF) fired a 500 terawatt shot on July 5. The world record power followed a world record energy shot of 1.89 Megajoules on July 3. As news, this doesn’t rise above the “meh” category. A shot at the NIF’s design energy of 1.8 Megajoules was already recorded back in March. It’s quite true that, as NIF Director Ed Moses puts it, “NIF is becoming everything scientists planned when it was conceived over two decades ago.” The NIF is a remarkable achievement in its own right, capable of achieving energies 50 times greater than any other laboratory facility, with pulses shaped and timed to pinpoint precision. The NIF team in general and Ed Moses in particular deserve great credit, and the nation’s gratitude, for that achievement after turning things around following a very shaky start.
The problem is that, while the facility works as well, and even better than planned, the goal it was built to achieve continues to elude us. As its name implies, the news everyone is actually waiting for is the announcement that ignition (defined as fusion energy out greater than laser energy in) has been achieved. As noted in the article, Moses said back in March that “We have all the capability to make it happen in fiscal year 2012.” At this point, he probably wishes his tone had been a mite less optimistic. To reach their goal in the two months remaining, the NIF team will need to pull a rabbit out of their collective hat. A slim chance remains. Apparently the NIF’s 192 laser beams were aimed at a real ignition target with a depleted uranium capsule and deuterium-tritium fuel on July 5, and not a surrogate. The data from that shot may prove to be a great deal more interesting than the 500 terawatt power announcement.
Meanwhile, the Russians are apparently forging ahead with plans for their own superlaser, to be capable of a whopping 2.8 Megajoules, and the Chinese are planning another about half that size, to be operational at about the same time (around 2020). That, in itself, speaks volumes about the real significance of ignition. It may be huge for the fusion energy community, but not that great as far as the weaponeers who actually fund these projects are concerned. Many weapons designers at LLNL and Los Alamos were notably unenthusiastic about ignition when NIF was still in the planning stages. What attracted them more was the extreme conditions, approaching those in an exploding nuke, that could be achieved by the lasers without ignition. They thought, not without reason, that it would be much easier to collect useful information from such experiments than from chaotic ignition plasmas. Apparently the Russian bomb designers agree. They announced their laser project back in February even though LLNL’s difficulties in achieving ignition were well known at the time.
The same can be said of some of the academic types in the NIF “user community.” It’s noteworthy that two of them, Rick Petrasso of MIT and Ray Jeanloz of UC Berkeley, whose enthusiastic comments about the 500 terawatt shot were quoted in the latest press release, are both key players in the field of high energy density physics. Ignition isn’t a sine qua non for them either. They will be able to harvest scores of papers from the NIF whether it achieves ignition or not.
The greatest liability of not achieving early ignition may be the evaporation of political support for the NIF. The natives are already becoming restless. As noted in the Livermore Independent,
In early May, sounding as if it were discussing an engineering project rather than advanced research, the House Appropriations Committee worried that NIF’s “considerable costs will not have been warranted” if it does not achieve ignition by September 30, the end of the federal fiscal year.
Later that month, in a tone that seemed to demand that research breakthroughs take place according to schedule, the House Armed Services Committee recommended that NIF’s ignition research budget for next year be cut by $30 million from the requested $84 million budget unless NIF achieves ignition by September 30.
Funding cuts at this point, after we have come so far, and are so close to the goal, would be short-sighted indeed. One must hope that a Congress capable of squandering billions on white elephants like the International Space Station will not become penny-wise and pound-foolish about funding a project that really matters.
Posted on July 7th, 2012 No comments
It’s been over a century since Max Planck came up with the idea that electromagnetic energy could only be emitted in fixed units called quanta as a means of explaining the observed spectrum of light emitted by hot, glowing objects – blackbody radiation. Starting from this point, great physicists such as Bohr, de Broglie, Schrödinger, and Dirac developed the field of quantum mechanics, revolutionizing our understanding of the physical universe. By the 1930′s it was known that matter, as well as electromagnetic energy, could be described by wave equations. In other words, at the level of the atom, particles do not behave at all as if they were billiard balls on a table, or, in general, in the way that our senses portray physical objects to us at a much larger scale. For example, electrons don’t act like hard little balls flying around outside the nuclei of atoms. Rather, it is necessary to describe where they are in terms of probability distributions, and how they act in terms of wave functions. It is impossible to tell at any moment exactly where they are, a fact formalized mathematically in Heisenberg’s famous Uncertainty Principle. All this has profound implications for the very nature of reality, most of which, even after the passage of many decades, are still unknown to the average lay person. Among other things, it follows from all this that there are two basic types of elementary particles: fermions and bosons. It turns out that they behave in profoundly different ways, and that the idiosyncrasies of neither of them can be understood in terms of classical physics.
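The Uncertainty Principle is easy to illustrate numerically. Here is a minimal sketch; the one-angstrom confinement distance is an assumed, round figure for the size of an atom:

```python
# Heisenberg's Uncertainty Principle: delta_x * delta_p >= hbar / 2.
# Confine an electron to atomic dimensions and see how large its
# momentum (and velocity) spread must be.

HBAR = 1.055e-34           # reduced Planck constant, J*s
ELECTRON_MASS = 9.109e-31  # kg

delta_x = 1.0e-10                  # ~1 angstrom, roughly the size of an atom
delta_p = HBAR / (2 * delta_x)     # minimum possible momentum uncertainty
delta_v = delta_p / ELECTRON_MASS  # corresponding velocity spread

print(f"Minimum momentum spread: {delta_p:.2e} kg*m/s")
print(f"Velocity spread:         {delta_v:.2e} m/s")
```

The velocity spread comes out on the order of hundreds of kilometers per second, which is why there is simply no meaningful sense in which an electron in an atom is sitting at some definite spot.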
Sometimes the correspondence between mathematics and physical reality seems almost magical. So it is with the math that predicts the existence of fermions and bosons. When it was discovered that particles at the atomic level actually behave as waves, a brilliant Austrian scientist named Erwin Schrödinger came up with a now-famous wave equation to describe the phenomenon. Derived from a few elementary assumptions, based on postulates of Einstein and others relating the wavelength and frequency of matter waves to physical quantities such as momentum and energy, and on the behavior of waves in general, the Schrödinger equation could be solved to find wave functions. It was found that these wave functions were complex numbers, that is, they had a real component, and an “imaginary” component that was a multiple of i, the square root of minus one. For example, such a number might be written down mathematically as x + iy. Each such number has a complex conjugate, found by changing the sign of the imaginary term. The complex conjugate of the above number is, therefore, x – iy. Max Born found that the probability of finding a physical particle at any given point in space and time could be derived from the product of a solution to Schrödinger’s equation and its complex conjugate.
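Born’s prescription is easy to sketch numerically. The sample wave function value below is made up purely for illustration; the point is that multiplying any complex number by its conjugate always yields a real, non-negative number, just what a probability density needs to be:

```python
# Born's rule: probability density = wave function times its complex
# conjugate, (x + iy)(x - iy) = x^2 + y^2, always real and non-negative.

psi = 0.6 + 0.8j               # an invented (unnormalized) wave function value

density = psi * psi.conjugate()
print(density)                 # purely real: imaginary part is zero
print(density.real >= 0)       # True
```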
So far, so good, but eventually it was realized that there was a problem with describing particles in this way that didn’t arise in classical physics; you couldn’t tell them apart! Elementary particles are, after all, indistinguishable. One electron, for example, resembles every other electron like so many peas in a pod. Suppose you could put two electrons in a glass box, and set them in motion bouncing off the walls. Assuming you had very good eyes, you wouldn’t have any trouble telling the two of them apart if they behaved like classical billiard balls. You would simply have to watch their trajectories as they bounced around in the box. However, they don’t behave like billiard balls. Their motion must be described by wave functions, and wave functions can overlap, making it impossible to tell which wave function belongs to which electron! Trying to measure where they are won’t help, because the wave functions are changed by the very act of measurement.
All this was problematic, because if elementary particles really were indistinguishable in that way, they also had to be indistinguishable in the mathematical equations that described their behavior. As noted above, it had been discovered that the physical attributes of a particle could be determined in terms of the product of a solution to Schrödinger’s equation and its complex conjugate. Assuming for the moment that the two electrons in the box didn’t collide or otherwise interact with each other, that implies that the solution for the two-particle system would simply be the product of the solutions for the individual particles, with the probability density given by that product times its complex conjugate. Unfortunately, the simple product didn’t work. If the particles were labeled and the labels switched around in the solution, the answer came out different. The particles were distinguishable! What to do?
Well, Schrödinger’s equation has a very useful mathematical property: it is linear. What that means in practical terms is that if the product of the wave functions for the two-particle system is a solution, then any linear combination of such products will also be a solution. It was found that if the overall solution was expressed as the product of the two wave functions plus the same product with the labels of the two particles interchanged, or as that product minus the product with the labels interchanged, the resulting probability density function was not changed by switching the labels. The particles remained indistinguishable!
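A toy numerical sketch makes the point. The two single-particle wave functions below are invented for illustration; what matters is that the plain product “tells the particles apart” when the labels are swapped, while the symmetric and antisymmetric combinations give exactly the same probability density either way:

```python
import cmath

# Two made-up single-particle wave functions, evaluated at a position x:
def phi_a(x):
    return cmath.exp(1j * x)             # plane-wave-like state "a"

def phi_b(x):
    return cmath.exp(2j * x) * (1 + x)   # a different state "b"

def density(psi):
    return (psi * psi.conjugate()).real  # Born's rule

x1, x2 = 0.3, 1.1   # positions of "particle 1" and "particle 2"

plain      = phi_a(x1) * phi_b(x2)
plain_swap = phi_a(x2) * phi_b(x1)       # labels interchanged

sym       = phi_a(x1)*phi_b(x2) + phi_a(x2)*phi_b(x1)
sym_swap  = phi_a(x2)*phi_b(x1) + phi_a(x1)*phi_b(x2)

anti      = phi_a(x1)*phi_b(x2) - phi_a(x2)*phi_b(x1)
anti_swap = phi_a(x2)*phi_b(x1) - phi_a(x1)*phi_b(x2)

print(abs(density(plain) - density(plain_swap)) > 1e-9)   # True: distinguishable
print(abs(density(sym)   - density(sym_swap))   < 1e-12)  # True: indistinguishable
print(abs(density(anti)  - density(anti_swap))  < 1e-12)  # True: indistinguishable
```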
The solutions to the Schrödinger equation, referred to mathematically as eigenfunctions, are called symmetric in the plus case, and antisymmetric in the minus case. It turns out, however, that if you do the math, particles act in very different ways depending on whether the plus sign or the minus sign is used. And here’s where the magic comes in. So far we’ve just been doing math, right? We’ve just been manipulating symbols to get the math to come out right. Well, as the great physicist Richard Feynman once put it, “To those who do not know mathematics it is difficult to get across a real feeling as to the beauty, the deepest beauty, of nature.” So it is in this case. The real particles act just as the math predicts, and in ways that are completely unexplainable in terms of classical physics! Particles that can be described by an antisymmetric eigenfunction are called fermions, and particles that can be described by a symmetric eigenfunction are called bosons.
How do they actually differ? Well, for reasons I won’t go into here, the so-called exclusion principle applies to fermions. There can never be more than one of them in exactly the same quantum state. Electrons are fermions, and that’s why they are arranged in different levels as they orbit the nucleus of an atom. Bosons behave differently, and in ways that can be quite spectacular. If a collection of bosons can be cooled to a low enough temperature, they will tend to condense into the same low energy quantum state. As it happens, the helium-4 atom is a boson. When it is cooled below a temperature of 2.17 degrees above absolute zero, it shows some very remarkable large scale quantum effects. Perhaps the weirdest of these is superfluidity. In this state, it behaves as if it had no viscosity at all, and can climb up the sides of a container and siphon itself out over the top!
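Actually, the exclusion principle falls straight out of the antisymmetric eigenfunction: put both particles in the same single-particle state and the wave function vanishes identically, so the probability of finding two fermions in the same state is exactly zero. A one-line sketch, using an invented state:

```python
import cmath

def phi(x):
    return cmath.exp(1j * x)   # a single made-up quantum state

x1, x2 = 0.3, 1.1
# Antisymmetric combination with BOTH particles in the same state phi:
anti = phi(x1) * phi(x2) - phi(x2) * phi(x1)
print(anti)   # 0j -- the wave function vanishes identically
```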
No one really knows what matter is at a fundamental level, or why it exists at all. However, we do know enough about it to realize that our senses only tell us how it acts at the large scales that matter to most living creatures. They don’t tell us anything about its essence. It’s unfortunate that now, nearly a century after some of these wonderful discoveries about the quantum world were made, so few people know anything about them. It seems to me that knowing about them and the great scientist who made them adds a certain interest and richness to life. If nothing else, when physicists talk about the Higgs boson, it’s nice to have some clue what they’re talking about.
Posted on June 10th, 2012 1 comment
As I mentioned in a previous post about fusion progress, signs of life have finally been appearing in scientific journals from the team working to achieve fusion ignition at the National Ignition Facility, or NIF, located at Lawrence Livermore National Laboratory (LLNL) in California. At the moment they are “under the gun,” because the National Ignition Campaign (NIC) is scheduled to end with the end of the current fiscal year on September 30. At that point, presumably, work at the facility will be devoted mainly to investigations of nuclear weapon effects and physics, which do not necessarily require fusion ignition. Based on a paper that recently appeared in Physical Review Letters, chances of reaching the ignition goal before that happens are growing dimmer.
The problem has to do with a seeming contradiction in the physical requirements for fusion to occur in the inertial confinement approach pursued at LLNL. In the first place, it is necessary for the NIF’s 192 powerful laser beams to compress, or implode, a target containing fusion fuel in the form of two heavy isotopes of hydrogen to extremely high densities. It is much easier to compress materials that are cold than those that are hot. Therefore, it is essential to keep the fuel material as cold as possible during the implosion process. In the business, this is referred to as keeping the implosion on a “low adiabat.” However, for fusion ignition to occur, the nuclei of the fuel atoms must come extremely close to each other. Unfortunately, they’re not inclined to do that, because they’re all positively charged, and like charges repel. How to overcome the repulsion? By making the fuel material extremely hot, causing the nuclei to bang into each other at high speed. The whole trick of inertial confinement fusion, then, is to keep the fuel material very cold, and then, in a tiny fraction of a second, while its inertia holds it in place (hence the name, “inertial” confinement fusion), raise it, or at least a small bit of it, to the extreme temperatures necessary for the fusion process to begin.
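The scale of the repulsion can be estimated with a back-of-envelope calculation. The constants below are standard; the three-femtometer approach distance is a round, illustrative figure for where the strong force takes over:

```python
# Rough size of the Coulomb barrier two hydrogen nuclei (charge +1 each)
# must overcome to fuse: E = k * e^2 / r at nuclear distances.

COULOMB_K = 8.99e9         # Coulomb constant, N*m^2/C^2
E_CHARGE = 1.602e-19       # elementary charge, C
JOULES_PER_KEV = 1.602e-16

r = 3.0e-15                # ~3 femtometers, an assumed approach distance
barrier_joules = COULOMB_K * E_CHARGE**2 / r
barrier_kev = barrier_joules / JOULES_PER_KEV

print(f"Coulomb barrier: ~{barrier_kev:.0f} keV")
```

The answer, roughly half a million electron volts, overstates what is needed in practice: thanks to quantum tunneling, deuterium-tritium fuel ignites at temperatures of only a few keV. But it conveys why “extremely hot” means something very different here than it does in a furnace.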
The proposed technique for creating the necessary hot spot was always somewhat speculative, and more than one fusion expert at the national laboratories was dubious that it would succeed. It consisted of launching a train of four shocks during the implosion process, which were to overtake one another at the same time, precisely at the moment of maximum compression, thereby creating the necessary hot spot. Four shocks are needed because of well-known theoretical limits on the compression that can be achieved with a single shock. Which brings us back to the paper in Physical Review Letters.
The paper, entitled Precision Shock Tuning on the National Ignition Facility, describes the status of efforts to get the four shocks to jump through the hoops described above. One cannot help but be impressed by the elegant diagnostic tools used to observe and measure the shocks. They are capable of peering through materials under the extreme conditions in the NIF target chamber, focusing on the tiny, imploded target core, and measuring the progress of a train of shocks over a period that only lasts for a few billionths of a second! These diagnostics, developed with the help of another team of brilliant scientists at the OMEGA laser facility at the University of Rochester’s Laboratory for Laser Energetics, are a triumph of human ingenuity. They reveal that the NIF is close to achieving the ignition goal, but not quite close enough. As noted in the paper, “The experiments also clearly reveal an issue with the 4th shock velocity, which is observed to be 20% slower than predictions from numerical simulation.”
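Incidentally, the theoretical limit that forces the use of multiple shocks is easy to state. By the Rankine-Hugoniot relations, a single shock, no matter how strong, can compress an ideal gas by at most a factor of (gamma + 1)/(gamma − 1); pumping in more shock strength beyond that only heats the fuel, which is exactly what a low-adiabat implosion must avoid. A sequence of carefully timed weaker shocks approaches gentler, near-adiabatic compression. A one-liner illustrates the limit:

```python
# Rankine-Hugoniot strong-shock limit: a single shock can compress an
# ideal gas by at most (gamma + 1) / (gamma - 1), however strong it is.

def max_shock_compression(gamma):
    return (gamma + 1) / (gamma - 1)

# For a monatomic ideal gas (gamma = 5/3) the limit is a factor of 4:
print(max_shock_compression(5.0 / 3.0))
```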
It will be a neat trick indeed if the NIF team can overcome this problem before the end of the National Ignition Campaign. In the event that they don’t, one must hope that the current administration is not so short-sighted as to conclude that the facility is a failure, and severely reduce its funding. There is too much at stake. I have always been dubious about the possibility that either the inertial or magnetic approach to fusion will become a viable source of energy any time in the foreseeable future. However, I may be wrong, and even if I’m not, achieving inertial fusion ignition in the laboratory may well point the way to as yet undiscovered paths to the fusion energy goal. Ignition in the laboratory will also give us a significant advantage over other nuclear weapons states in maintaining our arsenal without nuclear testing.
Based on the progress reported to date, there is no basis for the conclusion that ignition is unachievable on the NIF. Even if the central hot spot approach currently being pursued proves too difficult, there are alternatives, such as polar direct drive and fast ignition. However, pursuing these alternatives will take time and resources. They will become a great deal more difficult to realize if funding for NIF operations is severely cut. It will also be important to maintain the ancillary capability provided by the OMEGA laser. OMEGA is much less powerful but also a good deal more flexible and nimble than the gigantic NIF, and has already proved its value in testing and developing diagnostics, investigating novel experimental approaches to fusion, developing advanced target technology, etc.
We have built world-class facilities. Let us persevere in the quest for fusion. We cannot afford to let this chance slip.
Posted on May 6th, 2012 9 comments
Nuclear power is an attractive candidate for meeting our future energy needs. Nuclear plants do not release greenhouse gases. They release significantly less radiation into the environment than coal plants, because coal contains several parts per million of radioactive thorium and uranium. They require far less space and are far more reliable than alternative energy sources such as wind and solar. In spite of some of the worst accidents imaginable due to human error and natural disasters, we have not lost any cities or suffered any mass casualties, and the horrific “China Syndrome” scenarios invented by the self-appointed saviors of mankind have proven to be fantasies. That is not to say nuclear power is benign. It is just more benign than any of the currently available alternatives. The main problem with nuclear is not that it is unsafe, but that it is being ill-used. In this case, government could actually be helpful. Leadership and political will could put nuclear on a better track.
To understand why, it is necessary to know a few things about nuclear fuel, and how it “burns.” Bear with me while I present a brief tutorial in nuclear engineering. Nuclear energy is released by nuclear fission, or the splitting of heavy elements into two or more lighter ones. This doesn’t usually happen spontaneously. Before a heavy element can undergo fission, an amount of energy above a certain threshold must first be delivered to its nucleus. How does this happen? Imagine a deep well. If you drop a bowling ball into the well, it will cause a large splash when it hits the water. It does so because it has been accelerated by the force of gravity. A heavy nucleus is something like a well, but things don’t fall into it because of gravity. Instead, it relies on the strong force, which is very short range, but vastly more powerful than gravity. The role of “bowling ball” can be played by a neutron. If one happens along and gets close enough to fall into the strong force “well,” it will also cause a “splash,” releasing energy as it is bound to the heavy element’s nucleus, just as the real bowling ball is “bound” in the water well until someone fishes it out. This “splash,” or release of energy, causes the heavy nucleus to “jiggle,” much like an unstable drop of water. In one naturally occurring isotope – uranium with an atomic weight of 235 – this “jiggle” is so violent that it can cause the “drop of water” to split apart, or fission.
There are other isotopes of uranium. All of them have 92 protons in their nucleus, but can have varying numbers of neutrons. The nucleus of uranium 235, or U235, has 92 protons and 143 neutrons, adding up to a total of 235. Unfortunately, U235 is only 0.7% of natural uranium. Almost all the rest is U238, which has 92 protons and 146 neutrons. When a neutron falls into the U238 “well,” the “splash” isn’t big enough to cause fission, or at least not unless the neutron had a lot of energy to begin with, as if the “bowling ball” had been shot from a cannon. As a result, U238 can’t act as the fuel in a nuclear reactor. Almost all the nuclear reactors in operation today simply burn that 0.7% of U235 and store what’s left over as radioactive waste. Unfortunately, that’s an extremely inefficient and wasteful use of the available fuel resources.
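The difference between the two isotopes can be made concrete by comparing the size of the “splash” (the binding energy released when the neutron is captured) with the energy threshold for fission of the compound nucleus each one forms. The values below are approximate textbook figures, in MeV:

```python
# Why a slow neutron fissions U-235 but not U-238: compare the energy
# "splash" (neutron binding energy) against the fission barrier of the
# compound nucleus that capture creates.  Approximate values in MeV.

neutron_binding = {"U-236": 6.5,   # formed by U-235 + n
                   "U-239": 4.8}   # formed by U-238 + n
fission_barrier = {"U-236": 6.2,
                   "U-239": 6.3}

for nucleus in neutron_binding:
    fissions = neutron_binding[nucleus] > fission_barrier[nucleus]
    print(f"{nucleus}: slow-neutron fission {'yes' if fissions else 'no'}")
```

For U-235, the splash alone clears the barrier, so even a barely moving neutron will do. For U-238, the splash falls about 1.5 MeV short, which is why only a neutron “shot from a cannon” can make it fission.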
To understand why, it’s necessary to understand something about what happens to the neutrons in a reactor that keep the nuclear chain reaction going. First of all, where do they come from? Well, each fission releases more neutrons. The exact number depends on how fast the neutron that caused the fission was going, and what isotope underwent fission. If enough are released to cause, on average, one more fission, then the resulting chain reaction will continue until the fuel is used up. Actually, two neutrons, give or take, are released in each fission. However, not all of them cause another fission. Some escape the fuel region and are lost. Others are absorbed in the fuel material. That’s where things get interesting.
Recall that, normally, most of the fuel in a reactor isn’t U235, but the more common isotope, U238. When U238 absorbs a neutron, it forms U239, which quickly decays to neptunium 239 and then plutonium 239. Now it just so happens that plutonium 239, or Pu239, will also fission if a neutron “falls into its well,” just like U235. In other words, if enough neutrons were available, the reactor could actually produce more fuel, in the form of Pu239, than it consumes, potentially burning up most of the U238 as well as the U235. This is referred to as the “breeding” of nuclear fuel. Instead of just lighting the U235 “match” and letting it burn out, it would be used to light and burn the entire U238 “log.” Unfortunately, there are not enough neutrons in normal nuclear reactors to breed more fuel than is consumed. Such reactors have, however, been built, both in the United States and other countries, and have been safely operated for periods of many years.
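The rough arithmetic of the “match” versus the “log” can be sketched as follows. The figures (0.7% U235, roughly 200 MeV per fission) are standard round numbers, and the comparison is idealized in that it ignores the plutonium partially bred and burned even in conventional reactors:

```python
# How much of natural uranium's energy a once-through reactor leaves on
# the table, per kilogram of natural uranium mined.

AVOGADRO = 6.022e23
JOULES_PER_MEV = 1.602e-13
MEV_PER_FISSION = 200.0

def fission_energy_joules(grams, molar_mass):
    return grams / molar_mass * AVOGADRO * MEV_PER_FISSION * JOULES_PER_MEV

once_through = fission_energy_joules(1000 * 0.007, 235.0)  # burn only the U-235
full_burn    = fission_energy_joules(1000.0, 238.0)        # breed and burn it all

print(f"Once-through: {once_through:.2e} J/kg of natural uranium")
print(f"Full burn:    {full_burn:.2e} J/kg of natural uranium")
print(f"Ratio: ~{full_burn / once_through:.0f}x")
```

The idealized payoff is better than a factor of a hundred, which is why the once-through fuel cycle is such a profligate use of a finite resource.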
Plutonium breeders aren’t the only feasible type. In addition to U235 and Pu239, another isotope will also fission if a neutron falls into its “well” – uranium 233. Like Pu239, U233 doesn’t occur in nature. However, it can be “bred,” just like Pu239, from another element that does occur in nature, and is actually more common than uranium – thorium. I’ve had a few critical things to say about some of the popular science articles I’ve seen on thorium lately, but my criticisms were directed at inaccuracies in the articles, not at thorium technology itself. Thorium breeders actually have some important advantages over plutonium. When U233 fissions, it produces more neutrons than Pu239, and it does so in a “cooler” neutron spectrum, where the average neutron energy is much lower, making the reactor significantly easier to control. These extra neutrons could not only breed more fuel. They could also be used to burn up the transuranic elements – those beyond uranium on the table of the elements – that are produced in conventional nuclear reactors, and account for the lion’s share of the long-lived radioactive waste. This would be a huge advantage. Destroy the transuranics, and the residual radioactivity from a reactor would be less than that of the original ore, potentially in a few hundred years, rather than many thousands.
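The neutron-economy argument can be made semi-quantitative. A breeder needs eta, the number of neutrons produced per neutron absorbed in the fuel, comfortably above two: one neutron sustains the chain reaction, one converts a fertile atom to replace the fuel atom just consumed, and the excess covers leakage and parasitic capture. The eta values below are approximate figures for a thermal (slow) neutron spectrum, and the loss margin is an assumed, purely illustrative number:

```python
# Thermal-spectrum breeding check: eta must exceed 2 plus a margin for
# neutron losses.  Eta values are approximate; LOSSES is an assumption.

eta_thermal = {"U-233": 2.29, "U-235": 2.07, "Pu-239": 2.11}

LOSSES = 0.15   # illustrative allowance for leakage and parasitic capture
for fuel, eta in eta_thermal.items():
    ok = eta > 2.0 + LOSSES
    verdict = "plausible" if ok else "marginal"
    print(f"{fuel}: eta = {eta:.2f} -> thermal breeding {verdict}")
```

On these numbers, only U233 clears the bar in a thermal spectrum, which is exactly why thorium is the standout candidate for a thermal breeder. (Pu239’s eta rises well above 2.5 in a fast spectrum, which is why plutonium breeders are fast reactors.)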
Thorium breeders have other potentially important advantages. The fuel material could be circulated through the core in liquid form, dissolved in a special molten “salt.” Since the fuel would already be molten by design, there would be no solid core to melt down. In the event of an accident like the one at Fukushima, the fuel would simply be allowed to drain into a holding basin, where it would be sub-critical and cool quickly. Perhaps more importantly, the United States has some of the biggest proven reserves of thorium on the planet.
Breeders aren’t the only reactor types that hold great promise for meeting our future energy needs. High temperature gas cooled reactors would produce gas heated to high temperature in addition to electricity. This could be used to produce hydrogen gas via electrolysis, which is much more efficient at such high temperatures. When hydrogen burns, it produces only water. Such reactors could also be built over the massive oil shale deposits in the western United States. The hot gas could then be used to efficiently extract oil from the shale “in situ” without the need to mine it. It is estimated that the amount of oil that could be economically recovered in this way from the Green River Basin deposits in Utah, Wyoming and Colorado alone is three times greater than the oil reserves of Saudi Arabia.
Will any of this happen without government support and leadership? Not any time soon. The people who build nuclear reactors expect to make a profit, and the easiest way to make a profit is to build more conventional reactors of the type we already have. Raise the points I’ve mentioned above, and they’ll simply tell you that there’s plenty of cheap uranium around and therefore no need to breed more fuel, the radioactive danger of transuranics has been much exaggerated, etc., etc. All these meretricious arguments make sense if your goal is to make a profit in the short run. They make no sense at all if you have any concern for the energy security and welfare of future generations.
Unless the proponents of controlled fusion or solar and other forms of alternative energy manage to pull a rabbit out of their collective hats, I suspect we will eventually adopt breeder technology. The question is when. After we have finally burnt our last reserves of fossil fuel? After we have used up all our precious reserves of U238 by scattering it hither and yon in the form of “depleted uranium” munitions? The longer we wait, the harder and more expensive it will become to develop a breeder economy. It would be well if, in this unusual case, government stepped in and did what it is theoretically supposed to do; lead.