Posted on October 22nd, 2013
A consortium led by EDF Energy, the British arm of France’s EDF, together with Chinese investors, has agreed with the UK government on terms for building a pair of new nuclear reactors at Hinkley Point in the southwest of the country, not far from Bristol. If a final investment decision is made sometime next year, and the plants are actually built, they will probably be big (about 1,600 megawatt) pressurized water reactors (PWRs) based on the French company Areva’s EPR design. These are supposed to be (and probably are) safer, more efficient, and more environmentally friendly than earlier designs. In general, I tend to be pro-nuclear. I would certainly feel a lot safer living next to a nuclear plant than next to a coal plant. However, I’m a bit ambivalent about these new starts. I think we could be a lot smarter in the way we implement nuclear power programs.
Reactors of the type proposed will burn uranium. Natural uranium consists mostly of two isotopes, U235 and U238, and only U235 can be burnt directly in a nuclear reactor. Why? The answer depends on something called “the binding energy of the last neutron.” Think of a neutron as a bowling ball, and the nucleus of a uranium atom as a deep well. If the bowling ball happens to roll into the well, it will drop over the edge, eventually smacking into the bottom and releasing the energy it gained falling under gravity. The analogous force in the nucleus of a uranium atom is the nuclear force, incomparably stronger than gravity, but it acts in much the same way. The neutron doesn’t notice this very short range force until it gets very close to the nucleus, or “lip of the well,” but when it does, it “falls in,” releasing energy in the process. This energy is what I’ve referred to above as “the binding energy of the last neutron.”
When this binding energy is released in the nucleus, it causes it to wiggle and vibrate, something like a big drop of water falling through the air. In the case of U235, the energy is sufficient to cause this “liquid drop” to actually break in two, or “fission.” Such isotopes are referred to as “fissile.” In U238, the binding energy of the last neutron alone is not sufficient to cause fission, but the isotope can still fission if the neutron happens to be moving very fast when it hits the nucleus, bringing some of its own kinetic energy to the mix. Such isotopes, while not “fissile,” are referred to as “fissionable.” Unfortunately, U235 makes up only about 0.7 percent of natural uranium. Once it’s burnt, the remaining U238 is no longer useful for starting a nuclear chain reaction on its own.
That would be the end of the story as far as conventional reactors are concerned, except for the fact that something interesting happens to the U238 when it absorbs a neutron. As mentioned above, it doesn’t fission unless the neutron is going very fast to begin with. Instead, with the extra neutron, it becomes U239. However, U239 is unstable, and decays into neptunium 239, which further decays into plutonium 239, or Pu239. In Pu239 the binding energy of the last neutron IS enough to cause it to fission. Thus, conventional reactors burn not only U235, but also some of the Pu239 that is produced in this way. Unfortunately, they don’t produce enough extra plutonium to keep the reactor going, so only a few percent of the U238 is “burnt” in addition to the U235 before the fuel has to be replaced and the old fuel either reprocessed or stored as radioactive waste. Even though a lot of energy is locked up in the remaining U238, it is usually just discarded or used in such applications as the production of heavy armor or armor piercing munitions. In other words, the process is something like throwing a log on your fireplace, then fishing it out and throwing it away when only a small fraction of it has been burnt.
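The timing of this breeding chain follows from the (well-established) half-lives involved: U239 decays with a half-life of about 23 minutes, and Np239 with a half-life of about 2.4 days. A minimal sketch in Python:

```python
# Sequential beta decay: U239 -> Np239 -> Pu239.
# Half-lives (standard values): U239 ~23.45 minutes, Np239 ~2.36 days.
T_HALF_U239 = 23.45 * 60           # seconds
T_HALF_NP239 = 2.36 * 24 * 3600    # seconds

def fraction_remaining(t_half, t):
    """Fraction of a radioactive species left undecayed after time t."""
    return 0.5 ** (t / t_half)

# After six hours, essentially all of the U239 has become Np239...
print(fraction_remaining(T_HALF_U239, 6 * 3600))          # ~2e-5
# ...but it takes a couple of weeks for the Np239 to finish
# decaying into fissile Pu239.
print(fraction_remaining(T_HALF_NP239, 14 * 24 * 3600))   # ~0.016
```

In other words, irradiated fuel keeps “ripening” into plutonium for a few weeks after it comes out of the reactor.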
Can anything be done about it? It turns out that it can. The key is neutrons. They not only cause the U235 and Pu239 to fission, but also produce Pu239 via absorption in U238. What if there were more of them around? If there were enough, then enough new Pu239 could be produced to replace the U235 and old Pu239 lost to fission, and a much greater fraction of the U238 could be converted into useful energy. A much bigger piece of the “log” could be burnt.
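A toy model makes the stakes concrete. If every fissile atom burned breeds some number CR of new fissile atoms (the “conversion ratio”; a value of about 0.6 for a conventional light-water reactor is an often-cited illustrative figure), the total fuel that can ever be burned is a geometric series:

```python
def total_burnable(initial_fissile_fraction, conversion_ratio):
    """Total fissile material eventually burnable, as a fraction of the
    uranium, when each atom fissioned breeds `conversion_ratio` new
    fissile atoms. Geometric series; valid only for CR < 1."""
    assert 0 <= conversion_ratio < 1
    return initial_fissile_fraction / (1 - conversion_ratio)

# Natural uranium is ~0.7% U235. With a conversion ratio of ~0.6
# (illustrative), only a few percent of the uranium can ever be burned:
print(total_burnable(0.007, 0.6))   # ~0.0175, i.e. about 1.75%
# A breeder with CR >= 1 escapes this limit entirely, and can in
# principle work through nearly all of the U238.
```

That is the quantitative content of the “log on the fireplace” analogy above.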
As a matter of fact, what I’ve described has actually been done, in so-called breeder reactors. To answer the question “How?” it’s necessary to understand where all those neutrons come from to begin with. In fact, they come from the fission process itself. When an atom of uranium or plutonium fissions, it releases an average of between 2 and 3 neutrons. These, in turn, can cause other fissions, keeping the nuclear chain reaction going. The chances that they actually will cause another fission depend, among other things, on how fast they are going. In general, the slower the neutron, the greater the probability that it will cause another fission. For that reason, the neutrons in nuclear reactors are usually “moderated” to slower speeds by allowing them to collide with light nuclei, such as hydrogen. Think of billiard balls. If one of them hits another straight on, it will stop, transferring its energy to the second ball. Much the same thing happens in neutron “moderation.”
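The billiard-ball analogy can be made quantitative. Reactor physics textbooks give a standard formula for the average logarithmic energy loss per elastic collision as a function of the target nucleus’s mass number; a sketch, using it to count the collisions needed to slow a fission neutron to thermal speeds:

```python
import math

def log_energy_decrement(A):
    """Average logarithmic energy loss per elastic collision with a
    nucleus of mass number A (standard reactor-physics formula)."""
    if A == 1:
        return 1.0  # hydrogen: the formula's limiting value
    alpha = ((A - 1) / (A + 1)) ** 2
    return 1 + alpha / (1 - alpha) * math.log(alpha)

def collisions_to_thermalize(A, e_start_ev=2.0e6, e_end_ev=0.025):
    """Mean number of collisions to slow a ~2 MeV fission neutron
    down to thermal energy (~0.025 eV)."""
    return math.log(e_start_ev / e_end_ev) / log_energy_decrement(A)

print(round(collisions_to_thermalize(1)))    # hydrogen: ~18 collisions
print(round(collisions_to_thermalize(12)))   # carbon (graphite): ~115
print(round(collisions_to_thermalize(238)))  # U238 itself: thousands
```

This is why light nuclei like hydrogen (in ordinary water) make such effective moderators, and why heavy nuclei barely slow neutrons at all.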
However, more neutrons will be produced in each fission if the neutrons aren’t heavily moderated, but remain “fast.” In fact, enough can be produced not only to keep the chain reaction going, but to convert more U238 into useful fuel via neutron absorption than is consumed. That is the principle of the so-called fast breeder reactor. Another way to do the same thing is to replace the U238 with the more plentiful, naturally occurring thorium 232. When it absorbs a neutron, it eventually decays into U233, which, like U235, is fissile. This thorium breeding cycle has many potential advantages: potentially greater resistance to nuclear weapons proliferation; the ability to run the process at slower average neutron speeds, allowing smaller reactor size and easier control; and less production of dangerous, long-lived transuranic actinides, such as plutonium and americium. In fact, if enough neutrons are flying around, they will fission and eliminate these actinides. That turns out to be very important, because they’re the nastiest components of nuclear waste. If they could be recycled and burned, the residual radiation from the waste produced by operating a nuclear plant for 30 or 40 years could fall below that of the original uranium or thorium ore in a matter of only a few hundred years, rather than the many thousands that would otherwise be necessary.
So breeders can use almost all the potential energy in uranium or thorium instead of just a small fraction, while at the same time minimizing problems with radioactive waste. What’s not to like? Why aren’t we doing this? The answer is profit. As things now stand, power from breeder reactors of the type I’ve just described would be significantly more expensive than that from conventional reactors like the EPR. EPRs would use enriched natural uranium, which is still relatively cheap and plentiful, and would require no expensive reprocessing step. Ask industry spokesmen, and they will generally assure you (and quite possibly believe themselves, because self-interest has always had a strong delusional effect) that we will never run out of natural uranium, that the radioactive danger from conventional reactor waste has been grossly exaggerated, and that there is no long-term proliferation danger from simply discarding plutonium-laced waste somewhere and letting it decay for several thousand years. I’m not so sure.
Now, I have no problem with profit, and I find Hollywood’s obsession with the evils of large corporations tiresome, but I really do think this is one area in which government might actually do something useful. It might involve some mix of increased investment in research and development of advanced reactor technology, including the building of small demonstration reactors, continued robust support for the nuclear Navy, and the elimination of subsidies on new conventional reactors. Somehow, we managed to build scores of research reactors back in the 50’s, 60’s and 70’s. It would be nice if we could continue building a few more now and then: not only for research into breeder technology, but as test beds for new corrosion- and radiation-resistant materials and fuels; for exploration of high temperature gas-cooled reactors that could produce not only electricity but also hydrogen from water and synthetic natural gas from carbon dioxide and coal, both processes being potentially much more efficient at high temperatures; and even for fusion-fission hybrids, if we can ever get fusion to work.
We aren’t going to run out of energy any time soon, but there are now over 7 billion people on the planet. Eventually we will run out of fossil fuels, and depending entirely on wind, solar and other renewables to take up the slack seems a little risky to me. Wasting potential fuel for the reactors of the future doesn’t seem like such a good idea either. Under the circumstances, keeping breeder technology on the table as a viable alternative doesn’t seem like a bad idea.
Posted on October 19th, 2013
Who says there’s no such thing as German humor? Take, for example, some of the comments left by Teutonic wags after an article about the recent fusion “breakthrough” reported by scientists at Lawrence Livermore National Laboratory working on the National Ignition Facility (NIF). One of the first was left by one of Germany’s famous “Greens,” who was worried about the long term effects of fusion energy. Very long term. Here’s what he had to say:
So nuclear fusion is green energy, is it? The opposite is true. Nuclear fusion is the form of energy that guarantees that any form of Green will be forever out of the question. In comparison, Chernobyl is a short-lived joke! Why? Have you ever actually considered what will be “burned” with fusion energy? Hydrogen, one of the two components of water, (and a material without which life is simply impossible)! Nuclear fusion? I can already see the wars over water coming. And, by the way, the process is irreversible. Once hydrogen is fused, it’s gone forever. Nothing and no one will ever be able to make water out of it ever again!
I’m not kidding! The guy was dead serious. Of course, this drew a multitude of comments from typical German Besserwisser (better-knowers), such as, “If you don’t have a clue, you should shut your trap.” However, some of the other commenters were more light-hearted. For example,
No, no, no. What eu-fan (the first commenter) doesn’t seem to understand is that this should be seen as a measure against the rise in sea level that will result from global warming. Less hydrogen -> less water -> reduced sea level -> everything will be OK.
Another hopeful commenter adds,
…if it ever actually does succeed, this green fusion, can we have our old-fashioned light bulbs back?
Noting that the fusion of hydrogen produces helium, another commenter chimes in,
So, in other words, if a fusion reactor blows up, the result will be a global bird cage: The helium released will make us all talk like Mickey Mouse!
In all seriousness, the article in Der Spiegel about the “breakthrough” wasn’t at all bad. The author actually bothered to ask a local fusion expert, Sibylle Günter, Scientific Director of the Max Planck Institute for Plasma Physics, about Livermore’s “breakthrough.” She replied,
The success of our colleagues (at Livermore) is remarkable, and I don’t want to belittle it. However, when one speaks of a “breakeven point” in the classical sense, in which the fusion energy out equals the total energy in, they still have a long way to go.
That, of course, is entirely true. The only way one can speak of a “breakthrough” in the recent NIF experiments is by dumbing down the accepted definition of “ignition” from “fusion energy out equals laser energy in” to “fusion energy out equals energy absorbed by the target,” a much lower bar. That didn’t deter many writers of English-language reports, who couldn’t be troubled to fact-check Livermore’s claims with the likes of Dr. Günter. In some cases the level of fusion wowserism was extreme. For example, according to the account at Yahoo News,
After fifty years of research, scientists at the National Ignition Facility (NIF) in Livermore, have made a breakthrough in harnessing and controlling fusion.
According to the BBC, NIF conducted an experiment where the amount of energy released through the fusion reaction was more than the amount of energy being absorbed by it. This process is known as “ignition” and is the first time it has successfully been done anywhere in the world.
I’m afraid not. The definition of “ignition” that has been explicitly accepted by scientists at Livermore is “fusion energy out equals laser energy in.” That definition puts them on a level playing field with their magnetic fusion competitors. It’s hardly out of the question that the NIF will reach that goal, but it isn’t there yet. Not by a long shot.
Posted on October 10th, 2013
It has always seemed plausible to me that some clever scientist(s) might find a shortcut to fusion that would finally usher in the age of fusion energy, rendering the two “mainstream” approaches, inertial confinement fusion (ICF) and magnetic fusion, obsolete in the process. It would be nice if it happened sooner rather than later, if only to put a stop to the ITER madness. For those unfamiliar with the field, the International Thermonuclear Experimental Reactor, or ITER, is a gigantic, hopeless, and incredibly expensive white elephant and welfare project for fusion scientists currently being built in France. In terms of pure, unabashed wastefulness, think of it as a clone of the International Space Station. It has always been peddled as a future source of inexhaustible energy. Trust me, nothing like ITER will ever be economically competitive with alternative energy sources. Forget all your platitudes about naysayers and “they said it couldn’t be done.” If you don’t believe me, leave a note to your descendants to fact check me 200 years from now. They can write a gloating refutation to my blog if I’m wrong, but I doubt that it will be necessary.
In any case, candidates for the hoped-for end run around magnetic and ICF keep turning up, all decked out in the appropriate hype. So far, at least, none of them has ever panned out. Enter two-stage laser fusion, the latest pretender, introduced over at NextBigFuture with the assurance that it can achieve “10x higher fusion output than using the laser directly and thousands of times better output than hitting a solid target with a laser.” Not only that, but it actually achieved the fusion of boron and normal hydrogen nuclei, which produces only stable helium atoms. That’s much harder to achieve than the usual deuterium-tritium fusion between two heavy isotopes of hydrogen, one of which, tritium, is radioactive and found only in tiny traces in nature. That means it wouldn’t be necessary to breed tritium from the fusion reactions just to keep them going, one of the reasons that ITER will never be practical.
Well, I’d love to believe this is finally the ONE, but I’m not so sure. The paper describing the results NBF refers to was published by the journal Nature Communications. Even if you don’t subscribe, you can click on the figures in the abstract and get the gist of what’s going on. In the first place, one of the lasers has to accelerate protons to high enough energies to overcome the Coulomb repulsion of the stripped (of electrons) boron nuclei produced by the other laser. Such laser particle accelerators are certainly practical, but they only work at extremely high power levels. In other words, they require what’s known in the business as petawatt lasers, capable of achieving powers in excess of a quadrillion (10 to the 15th power) watts. Power comes in units of energy per unit time, and such lasers generally reach the petawatt threshold by producing a lot of energy in a very, very short time. Often, we’re talking picoseconds (trillionths of a second).
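The arithmetic behind “petawatt” is simply power equals energy divided by time:

```python
def peak_power_watts(pulse_energy_joules, pulse_duration_seconds):
    """Peak power of a laser pulse: energy delivered divided by duration."""
    return pulse_energy_joules / pulse_duration_seconds

# A modest 1 kJ of energy squeezed into a single picosecond is
# already a petawatt:
print(peak_power_watts(1.0e3, 1.0e-12))   # 1e15 W, i.e. one petawatt
```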
Now, you can do really, really cool things with petawatt lasers, such as pulling electron-positron pairs right out of the vacuum. However, their practicality as drivers for fusion power plants, at least in their current incarnation, is virtually nil. The few currently available, for example at the University of Rochester’s Laboratory for Laser Energetics, the University of Texas at Austin, and the University of Nevada at Reno, are glass lasers. There’s no way they could achieve the “rep rates” (shot frequency) necessary for useful energy generation. Achieving lots of fusions, but only for a few picoseconds, isn’t going to solve the world’s energy problems.
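The rep-rate problem is equally simple arithmetic: average fusion power is yield per shot times shots per second. The plant parameters below are purely hypothetical, chosen for illustration:

```python
def average_fusion_power_watts(yield_per_shot_joules, shots_per_second):
    """Average fusion power: yield per shot times repetition rate."""
    return yield_per_shot_joules * shots_per_second

# A hypothetical power plant: 100 MJ of fusion yield per shot at 10 Hz
# gives a gigawatt of thermal power...
print(average_fusion_power_watts(100e6, 10))            # 1e9 W
# ...while a glass laser managing, say, four shots per day would
# average only a few kilowatts even at the same yield per shot:
print(average_fusion_power_watts(100e6, 4 / 86400.0))   # ~4.6e3 W
```

Hence the gap between “a very cool experiment” and a power plant.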
As it happens, conventional accelerators can also be used for fusion. As a matter of fact, it’s a common way of generating neutrons for such purposes as neutron radiography. Unfortunately, none of the many fancy accelerator-driven schemes for producing energy that people have come up with over the years has ever worked. There’s a good physical reason for that. Instead of using their energy to overcome the Coulomb repulsion of other nuclei (like charges repel, and atomic nuclei are all positively charged) and fuse with them, the accelerated particles prefer to uselessly dump that energy into the electrons surrounding those nuclei. As a result, it has always taken more energy to drive the accelerators than could be generated in the fusion reactions. That’s where the “clever” part of this scheme comes in. In theory, at least, all those pesky electrons are gone, swept away by the second laser. However, that, too, is an energy drain. So the question becomes: can both lasers be run efficiently enough, at high enough rep rates, and with enough energy output to strip enough boron atoms to get more energy out than goes in to drive the lasers? I don’t think so. Still, it was a very cool experiment.
Posted on August 30th, 2013
In a recent press release, Lawrence Livermore National Laboratory (LLNL) announced that it had achieved a yield of 3 x 10^15 neutrons in the latest round of experiments at its National Ignition Facility, a giant, 192-beam laser facility designed, as its name implies, to achieve fusion ignition. That’s nowhere near “ignition,” but still encouraging, as it’s three times better than the results achieved in earlier experiments.
The easiest way to achieve fusion is with two heavy isotopes of hydrogen; deuterium, with a nucleus containing one proton and one neutron, and tritium, with a nucleus containing one proton and two neutrons. Deuterium is not radioactive, and occurs naturally as about one atom to every 6400 atoms of “normal” hydrogen, with a nucleus containing only a single proton. Tritium is radioactive, and occurs naturally only in tiny trace amounts. It has a half-life (the time it takes for half of a given amount to undergo radioactive decay) of 12.3 years, and must be produced artificially. When tritium and deuterium fuse, they release a neutron, a helium nucleus, or alpha particle, and lots of energy (17.6 million electron volts).
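For a sense of scale, here’s the arithmetic behind those numbers (the physical constants are standard; the 25-year stockpile example is my own illustration):

```python
EV_TO_JOULES = 1.602176634e-19   # one electron volt in joules
AVOGADRO = 6.02214076e23         # atoms (or reactions) per mole

# Each D-T fusion releases 17.6 MeV, split between the neutron
# (14.1 MeV) and the alpha particle (3.5 MeV):
energy_per_reaction_j = 17.6e6 * EV_TO_JOULES
print(energy_per_reaction_j)     # ~2.8e-12 J per reaction

# One mole of D-T pairs is ~5 grams of fuel, so per gram:
energy_per_gram_j = energy_per_reaction_j * AVOGADRO / 5.0
print(energy_per_gram_j)         # ~3.4e11 J, hundreds of gigajoules per gram

# Tritium's 12.3-year half-life means a stockpile shrinks fast;
# after 25 years, for instance:
fraction_left = 0.5 ** (25 / 12.3)
print(fraction_left)             # ~0.24
```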
Fortunately (because otherwise it would be too easy to blow up the planet), or unfortunately (if you want to convert the energy into electricity), fusion is hard. The two atoms don’t like to get too close, because their positively charged nuclei repel each other. Somehow, a way must be found to make the heavy hydrogen fuel material very hot, causing the thermal motion of the atoms to become very large. Once they start moving fast enough, they can smash into each other with enough momentum to overcome the repulsion of the positive nuclei, allowing them to fuse. However, the amount of energy needed per atom is huge, and when atoms get that hot, the last thing they want to do is stay close to each other (think of what happens in the detonation of high explosive.) There are two mainstream approaches to solving this problem; magnetic fusion, in which the atoms are held in place by powerful magnetic fields while they are heated (the approach being pursued at ITER, the International Thermonuclear Experimental Reactor, currently under construction in France), and inertial confinement fusion (ICF), where the idea is to dump energy into the fuel material so fast that its own inertia holds it in place long enough for fusion to occur. The NIF is an ICF facility.
There are various definitions of ICF “ignition,” but, in order to avoid comparisons of apples and oranges between ICF and magnetic fusion experiments, LLNL has explicitly accepted the point at which the fusion energy out equals the laser energy in as the definition of ignition. In the experiment referred to above, the total fusion energy release was about 10,000 joules, give or take. Since the laser energy in was around 1.7 million joules, that’s only a little over one half of one percent of what’s needed for ignition. Paltry, you say? Not really. To understand why, you have to know a little about how ICF experiments work.
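First, the arithmetic behind that “one half of one percent” is just the ratio of the two energies:

```python
fusion_yield_joules = 1.0e4    # ~10,000 J of fusion output in the shot
laser_energy_joules = 1.7e6    # ~1.7 million J of laser energy in

gain = fusion_yield_joules / laser_energy_joules
print(gain)   # ~0.0059, a little over half a percent of "ignition"
```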
Recall that the idea is to heat the fuel material up so fast that its own inertia holds it in place long enough for fusion to occur. The “obvious” way to do that would be to simply dump in enough laser energy to heat all the fuel material to fusion temperatures at once. Unfortunately, this “volumetric heating” approach wouldn’t work. The energy required would be orders of magnitude more than what’s available on the NIF. What to do? Apply lots and lots of finesse. It turns out that if a very small volume or “hot spot” in the fuel material can be brought to fusion conditions, the alpha particles released in the fusion reactions might carry enough energy to heat up the nearby fuel to fusion conditions as well. Ideally, the result would be an alpha “burn wave,” moving out through the fuel, and consuming it all. But wait, it ain’t that easy! An efficient burn wave will occur only if the alphas are slammed to a stop and forced to dump their energy after traveling only a very short distance in the cold fuel material around the hot spot. Their range is too large unless the fuel is first compressed to a tiny fraction of its original volume, causing its density to increase by orders of magnitude.
In other words, to get the fuel to fuse, we need to make it very hot, but we also need to compress it to very high density, which can be done much more easily and efficiently if the material is cold! Somehow, we need to keep the fuel “cold” during the compression process, and then, just at the right moment, suddenly heat up a small volume to fusion conditions. It turns out that shocks are the answer to the problem. If a train of four shocks can be set off in the fuel material as it is being compressed, or “imploded,” by the lasers, precisely timed so that they will all converge at just the right moment, it should be possible, in theory at least, to generate a hot spot. If the nice, spherical symmetry of the fuel target could be maintained during the implosion process, everything should work just fine. The NIF would have more than enough energy to achieve ignition. But there’s the rub. Maintaining the necessary symmetry has turned out to be inordinately hard. Tiny imperfections in the target surface finish, small asymmetries in the laser beams, etc., lead to big deviations from perfect symmetry in the dense, imploded fuel. These asymmetries have been the main reason the NIF has not been able to achieve its ignition goal to date.
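Why compression matters so much can be seen from a standard back-of-the-envelope formula in the ICF literature: the fraction of D-T fuel that burns is roughly ρR/(ρR + 6), where ρR is the fuel’s areal density in g/cm². The specific ρR values below are illustrative:

```python
def burnup_fraction(rho_r_g_per_cm2):
    """Classic ICF estimate of the burned fraction of D-T fuel:
    f ~ rhoR / (rhoR + 6), with areal density rhoR in g/cm^2."""
    return rho_r_g_per_cm2 / (rho_r_g_per_cm2 + 6.0)

# Uncompressed cryogenic D-T in a millimeter-scale capsule has a tiny
# areal density (illustrative value), so essentially nothing burns:
print(burnup_fraction(0.025))   # ~0.004
# Compressed by orders of magnitude in density, rhoR of a few g/cm^2
# becomes reachable, and a substantial fraction of the fuel burns:
print(burnup_fraction(3.0))     # ~0.33
```

The whole implosion game is therefore about reaching a large ρR without wasting energy heating the fuel prematurely.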
And that’s why the results of the latest round of experiments haven’t been as “paltry” as they seem. As noted in the LLNL press release,
Early calculations show that fusion reactions in the hot plasma started to self-heat the burning core and enhanced the yield by nearly 50 percent, pushing close to the margins of alpha burn, where the fusion reactions dominate the process.
“The yield was significantly greater than the energy deposited in the hot spot by the implosion,” said Ed Moses, principal associate director for NIF and Photon Science. “This represents an important advance in establishing a self-sustaining burning target, the next critical step on the path to fusion ignition on NIF.”
That’s not just hype. If the self-heating can be increased in future experiments, it may be possible to reach a threshold at which the alpha heating sets off a burn wave through the rest of the cold fuel, as described above. In other words, ignition is hardly a given, but the guys at LLNL still have a fighting chance. Their main challenge may be to stem the gradual evaporation of political support for the NIF while the experiments are underway. Their own Senator, Dianne Feinstein, is anything but an avid supporter. She recently turned down appeals to halt NIF budget cuts, and says the project needs to be “reassessed” in light of the failure to achieve ignition.
Such a “reassessment” would be a big mistake. The NIF was never funded as an energy project. Its support comes from the National Nuclear Security Administration (NNSA), a semi-autonomous arm of the Department of Energy charged with maintaining the safety and reliability of the nation’s nuclear arsenal. As a tool for achieving that end, the NIF is without peer in any other country. It has delivered on all of its performance design goals, including laser energy, illumination symmetry, shot rate, the precision and accuracy of its diagnostic instrumentation, etc. The facility is of exceptional value to the weapons program even if ignition is never achieved. It can still generate experimental conditions approaching those present in an exploding nuclear device, and, along with the rest of our suite of “above-ground experimental facilities,” or AGEX, it gives us a major leg up over the competition in maintaining our arsenal and avoiding technological surprise in the post-testing era.
Why is that important? Because the alternative is a return to nuclear testing. Do you think no one at NNSA wants to return to testing, and that the weapon designers at the National Weapons Laboratories wouldn’t jump at the chance? If so, you’re dreaming. It seems to me we should be doing our best to keep the nuclear genie in the bottle, not let it out. Mothballing the NIF would be an excellent start at pulling the cork!
I understand why the guys at LLNL are hyping the NIF’s potential as a source of energy. It’s a lot easier to generate political support for lots of electricity with very little radioactive waste and no greenhouse gases than for maintaining our aging arsenal of nuclear weapons. However, IMHO, ICF is hopeless as a source of electricity, at least for the next few hundred years. I know many excellent scientists will disagree, but many excellent scientists are also prone to extreme wishful thinking when it comes to rationalizing a technology they’ve devoted their careers to. Regardless, energy hype isn’t needed to justify the NIF. It and facilities like it will ensure our technological superiority over potential nuclear rivals for years to come, and at the same time provide a potent argument against the resumption of nuclear testing.
Posted on May 13th, 2013
Paul Gross and Norman Levitt published their now classic Higher Superstition: The Academic Left and Its Quarrels with Science almost two decades ago. The book described the flipping and flopping of the various species of self-appointed saviors of mankind on campus left high and dry by the collapse of Marxism. In the absence of that grand, unifying philosophy, the authors found them running about like so many chickens with their heads cut off, engaged in internecine warfare, and chasing after the various chimeras of postmodernism, eco-extremism, radical feminism, anti-racist racism, etc. For some reason, perhaps because they were scientists and they objected to their ox being gored, Gross and Levitt were willing to subject themselves to the incredible boredom of attending the conferences, following the journals, and reading the books emanating from these various swamps. Since they happened to be on the left of the ideological spectrum themselves, their book was also thoughtfully written and not just one of the usual rants from the right.
Unfortunately, no one with similar insight and tolerance for pain has published anything of similar stature in the ensuing years. We have been reduced to scrutinizing the data points that periodically bubble up through the froth to formulate some idea of how close we are to being saved. Based on the meager information at our disposal, we gather that no great new secular religion has sprung up in the meantime to take the place of Marxism. The only thing on hand to fill the vacuum left behind by its demise has been radical Islam. Since, in a sense, it’s the only game in town, we’ve been treated to the amusing spectacle of watching leftist “progressives” making eyes at the fanatical zealots of one of the most reactionary religious systems ever concocted by the mind of man, while the latter have been busily cannibalizing the revolutionary vernacular familiar from the heyday of Communism.
Other than that, it would seem that the scene today would be quite familiar to readers of Higher Superstition. Consider, for example, the recent “revolutionary action” that took place on the campus of Swarthmore. If we are to believe the somewhat overwrought account at National Review Online, it involved intimidation of the school administration and bullying of conservative students at what was advertised as an open Board of Managers meeting. The ostensible goal of the disruption was to get the administration to agree to the divestment of stocks in fossil fuel companies, apparently based on the rather dubious assumption that nothing disagreeable would happen if all mankind suddenly stopped using them. However, the divestment thing is hardly what is nearest and dearest to the hearts of the “academic left” at Swarthmore. What is nearest and dearest? According to NRO,
The radicals are demanding a massive expansion of Swarthmore’s politicized “studies” programs, with a new Latino Studies major specifically dedicated to Latinos in the United States, and mandatory classes for all Swarthmore students in ethnic studies and gender and sexuality studies.
I doubt that the gentry at NRO really understand what is going on here, because they lack the proper grounding in Marxist theory. As Trotsky might have put it, they just don’t understand the dialectic. What we are really seeing here is the emergence of a new exploiting class of gigantic proportions, cleverly attempting to obfuscate their true historical role behind a smokescreen of revolutionary jargon. These people are exploiters, not exploitees. Ensconced in their ivory towers, untouchable within their tenured cocoons, they are increasingly gaining a monopoly of the social means of education. Like the bourgeoisie of old, who used the social means of production to suck the blood of the exploited workers, they use their own monopoly to feast on the sweat of the academic proletariat – their students. They accumulate these useless “studies” courses for the same reasons that the capitalists accumulated money.
Little realizing that they are being reduced to debt-serfs, with lives sold out and mortgaged to maintain these academic vampires in their accustomed luxury, the student proletariat are kept docile with fairy tales about “saving the world.” Now, if Marx was right (and how could he possibly be wrong?) this “thesis” of the academic exploiters will soon run head on into the “antithesis” of the developing revolutionary consciousness of the student proletariat they have so cynically betrayed. At least the bourgeoisie used their monopoly to produce something useful. The new class of academic exploiters fobs off its victims with “studies” that they will find entirely useless in their struggle against the slavery that awaits them, unless they are among the happy few co-opted into the exploiting class. Where is this leading? How will the exploited academic proletariat react when they finally figure out, crushed under a mountain of debt, with heads full of “liberating” jargon and no prospect of employment, that the “radical and emancipatory” blather they were being fed really leads to chains and slavery? I can but quote the ringing warning of Edwin Markham in his famous poem, “The Man with the Hoe”:
O masters, lords and rulers in all lands,
How will the Future reckon with this Man?
How answer his brute question in that hour
When whirlwinds of rebellion shake the world?
The pundits at NRO should relax. If I’ve interpreted the Marxist dialectic correctly, the revolutionary climax will be followed by a brief period of the dictatorship of the academic proletariat, followed by the gradual withering of academic administrations, and a new era of universal wisdom based on enlightened self-education.
And what of the academic exploiters? I think it goes without saying that it will be necessary to “expropriate the expropriators.” However, being by nature a kindly and sedate man, I can only hope that it doesn’t come to the “liquidation of the academic exploiters as a class.” On the other hand, I don’t want to be accused of “right opportunism” and realize full well that “you have to break some eggs to make an omelet.”
Posted on April 22nd, 2013 No comments
A while back in an online discussion with a German “Green,” I pointed out that, if Germany shut down its nuclear plants, coal plants would have to remain in operation to take up the slack. He was stunned that I could be so obtuse. Didn’t I realize that the lost nuclear capacity would all be replaced by benign “green” energy technology? Well, it turns out things didn’t quite work out that way. In fact, the lost generating capacity is being replaced by – coal.
Germany is building new coal-fired power plants hand over fist, with 26 of them planned for the immediate future. According to Der Spiegel, the German news magazine that never misses a trick when it comes to bashing nuclear, that’s a feature, not a bug. A recent triumphant headline reads, “Export Boom: German Coal Electricity Floods Europe.” Expect more of the same from the home of Europe’s most pious environmentalists. Germany has also been rapidly expanding its solar and wind capacity recently thanks to heavy state subsidies, but the wind doesn’t always blow and the sun doesn’t always shine, especially in Germany. Coal plants are required to fill in the gaps – lots of them. Of course, it would be unprofitable to let them sit idle when wind and solar are available, so they are kept going full blast. When the power isn’t needed in Germany, it is sold abroad, serving as a useful prop to Germany’s export-fueled economy.
Remember the grotesque self-righteousness of Der Spiegel and the German “Greens” during the Kyoto Treaty debates at the end of the Clinton administration? Complying with the Kyoto provisions cost the Germans nothing. They had just shut down the heavily polluting and grossly unprofitable industries in the former East Germany, had brought large numbers of new gas-fired plants on line thanks to increasing gas supplies from the North Sea fields, and had topped it off with a lame economy in the 90’s compared to the booming U.S. Their greenhouse gas emissions had dropped accordingly. Achieving similar reductions in the U.S. wouldn’t have been a similar “freebie.” It would have cost tens of thousands of jobs. The German “Greens” didn’t have the slightest problem with this. They weren’t interested in achieving a fair agreement that would benefit all. They were only interested in striking pious poses.
Well, guess what? Times have changed. Last year U.S. carbon emissions were at their lowest level since 1994, and down 3.7% from 2011. Our emissions are down 7.7% since 2006, the largest drop among major industrial states on the planet. German emissions were up at least 1.5% last year, and probably more like 2%. Mention this to a German “Green,” and he’s likely to mumble something about Germany still being within the Kyoto limits. That’s quite true. Germany is still riding the shutdown of what news magazine Focus calls “dilapidated, filthy, communist East German industry after the fall of the Berlin Wall,” to maintain the facade of environmental “purity.”
That’s small comfort to her eastern European neighbors. Downwind from Germany’s coal-fired plants, their “benefit” from her “green” policies is acid rain, nitrogen oxide-laced smog, deadly particulates that kill and sicken thousands and, last but not least, a rich harvest of radioactive fallout. That’s right, Germany didn’t decrease the radioactive hazard to her neighbors by shutting down her nuclear plants. She vastly increased it. Coal contains several parts per million each of radioactive uranium and thorium. These elements are harmless enough – if kept outside the body. The energetic alpha particles they emit are easily stopped by a normal layer of skin. When that happens, they dump the energy they carry within a very short distance but, since the outer layer of skin is dead, it doesn’t matter. It’s an entirely different matter when they dump those several million electron volts of energy into a living cell – such as a lung cell. Among other things, that can easily derange the reproductive equipment of the cell, causing cancer. How can they reach the lungs? Very easily, if the uranium and thorium that emit them are carried in the ash from a coal-fired plant. A typical coal-fired plant releases about 5 tons of uranium and 12 tons of thorium every year. The German “Greens” have no problem with this, even though they’re constantly bitching about the relatively minuscule release of uranium from U.S. depleted uranium munitions. Think scrubber technology helps? Guess again! The uranium and thorium are concentrated in the ash, whether it ends up in the air or not. They can easily leach into surrounding cropland and water supplies.
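The “5 tons of uranium and 12 tons of thorium” figure is easy to sanity-check. A minimal sketch, assuming a roughly 1000 MW plant burning about 4 million metric tons of coal per year, with trace concentrations of about 1.3 ppm uranium and 3.2 ppm thorium – these are my own assumed typical values, not numbers taken from the text:

```python
# Back-of-envelope check of the uranium/thorium release figures.
# All input values below are assumptions (typical literature values),
# not figures from the post itself.
coal_burned_t = 4_000_000   # metric tons of coal burned per year (~1000 MW plant)
uranium_ppm = 1.3           # grams of uranium per ton of coal
thorium_ppm = 3.2           # grams of thorium per ton of coal

uranium_t = coal_burned_t * uranium_ppm / 1e6   # tons of U released per year
thorium_t = coal_burned_t * thorium_ppm / 1e6   # tons of Th released per year

print(f"uranium: {uranium_t:.1f} t/yr, thorium: {thorium_t:.1f} t/yr")
# prints: uranium: 5.2 t/yr, thorium: 12.8 t/yr
```

With those assumed inputs the arithmetic lands close to the 5-ton and 12-ton figures cited above; the exact tonnage naturally depends on the coal's trace-element content and the plant's burn rate.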
The last time there was an attempt to move radioactive waste to the Gorleben storage facility within Germany, the “Greens” could be found striking heroic poses as saviors of the environment all along the line, demonstrating, tearing up tracks, and setting police vehicles on fire. Their “heroic” actions forced the shutdown of Germany’s nuclear plants. The “gift” (German for “poison”) of their “heroic” actions to Germany’s neighbors came in the form of acid rain, smog, and airborne radiation. By any reasonable standard, coal-fired plants are vastly more dangerous and damaging to the environment than the nuclear facilities they replaced.
It doesn’t matter to Germany’s “Greens.” The acid rain, the radiation, the danger of global warming they always pretend to be so concerned about? It doesn’t matter. For them, as for the vast majority of other environmental zealots worldwide, the pose is everything. The reality is nothing.
Posted on February 25th, 2013 1 comment
Apropos the baby bust discussed in my last post, an interesting article on the subject by Joel Kotkin and Harry Siegel recently turned up at The Daily Beast via Newsweek entitled, “Where have all the Babies Gone?” According to the authors, “More and more Americans are childless by choice. But what makes sense for the individual may spell disaster for the country as a whole.” Their forebodings of doom are based on a subset of the reasons cited in Jonathan Last’s “What to Expect when No One’s Expecting,” and are the same as those that usually turn up in similar articles. The lack of babies, “….is likely to propel us into a spiral of soaring entitlement costs and diminished economic vigor, and create a culture marked by hyperindividualism and dependence on the state as the family unit erodes.” As I pointed out in my earlier post on the subject, if all these things really will result absent a constantly increasing population, they are not avoidable outcomes, for the simple reason that, at some point, the population of the planet must stop growing. The only question is, how many people will be around to experience those outcomes when they happen, and what fraction of the planet’s depleted resources will still be around at the time to deal with the situation. Many of the commenters on the article do an excellent job of pointing that out. For example, from Si 1979,
At some point the population has to stop growing, space on the earth is finite. As such there is going to come a generation that needs to ‘take the hit’ and the earlier that hit is taken the easier it will be. The larger the population gets the worse an ageing population each generation will experience. We can take the hit now or leave future generations a much worse problem to deal with.
To deal with the “problem,” the authors propose some of the usual “solutions”:
These include such things as reforming the tax code to encourage marriage and children; allowing continued single-family home construction on the urban periphery and renovation of more child-friendly and moderate-density urban neighborhoods; creating extended-leave policies that encourage fathers to take more time with family, as has been modestly successful in Scandinavia; and other actions to make having children as economically viable, and pleasant, as possible. Men, in particular, will also have to embrace a greater role in sharing child-related chores with women who, increasingly, have careers and interests of their own.
As a father of children who has strongly encouraged his own children to have children as well, I am fully in favor of all such measures, as long as they remain ineffective.
The baby promoters have remarkably short historical memories. Are they unaware of the other side of this coin? One need go no further back than the 20th century. What spawned Hitler’s grandiose dreams of “Lebensraum in the east” for Germany, at the expense of Russia? Hint: It wasn’t a declining German population. Why did the Japanese come up with the “Greater East Asian Co-prosperity Sphere” idea and start invading their neighbors? It happened at a time when Japan’s population was expanding at a rapid pace, and was, by the way, only about half of what it is now. In spite of that, in the days before miracle strains of rice and other grains, it seemed impossible that she would be able to continue to feed her population. Have we really put such fears of famine behind us for all time? Such a claim must be based on the reckless assumption that the planet will never again suffer any serious disruptions in food production. These are hardly isolated examples. A book by Harold Cox entitled The Problem of Population, which appeared in 1923 and is still available on Amazon, cites numerous other examples.
Needless to say, I don’t share the fears of the Kotkins and Siegels of the world. My fear is that we will take foolhardy risks with the health of our planet by spawning unsustainably large populations in the name of maintaining entitlements which mankind has somehow incomprehensibly managed to live without for tens of thousands of years. In truth, we live in wonderful times. I and those genetically close to me can procreate without limit on a planet where the population may soon begin to decline to within sustainable limits because increasing numbers of people have decided not to have children. It’s a win-win. I’m happy with their choice, and presumably, they’re happy with mine, assuming they want to have some remnant of a working population available to exploit (or at least try to) once they’ve retired. My only hope is that people like Kotkin and Siegel don’t succeed in rocking the boat.
Posted on February 24th, 2013 3 comments
In his latest book, What to Expect When No One’s Expecting, Jonathan Last warns us of the dire consequences of shrinking populations. He’s got it backwards. It’s the best thing that could happen to us.
Before proceeding with my own take on this issue, I would like to assure the reader that I am not a rabid environmentalist or a liberal of the sort who considers people with children morally suspect. I have children and have encouraged my own children to have as many children as possible themselves. It seems to me that the fact that those among us who are supposedly the most intelligent are also the most infertile is a convincing proof of the stupidity of our species.
Why did I decide to have children? In the end, it’s a subjective whim, just like every other “purpose of life” one might imagine. However, as such I think it’s justifiable enough. The explanation lies in the way in which I perceive my “self.” As I see it, “we” are not our conscious minds, although that is what most of us perceive as “we.” Our conscious minds are evanescent manifestations of the physical bodies whose development is guided by our genes. They pop into the world for a moment and are then annihilated in death. They exist for that brief moment for one reason only – because they happened to promote our genetic survival. Is it not more reasonable to speak of “we” as that about us which has existed for billions of years and is potentially immortal, namely, our genes, than to assign that term to an ancillary manifestation of those genes that exists for a vanishingly small instant of time by comparison? We have a choice. We can choose that this “we” continue to survive, or we can choose other goals, and allow this “we” to be snuffed out, so that the physical bodies that bear our “we” become the last link in an unbroken chain stretching back over billions of years. There is no objective reason why we should prefer one choice or the other. The choice is purely subjective. The rest of the universe cares not a bit whether our genes survive or not. I, however, care. If countless links in a chain have each created new links in turn and passed on the life they carried over the eons, only to come to a link possessed of qualities that cause it to fail to continue the chain, it seems reasonable to consider that link dysfunctional, or, in the most real sense imaginable, a failure. I personally would not find the realization comforting that I am a sick and dysfunctional biological unit, a failure at carrying out that one essential function that a process of natural selection has cultivated for an almost inconceivable length of time. Therefore, I have children. 
As far as I am concerned, they, and not wealth, or property, or fame, are the only reasonable metric of success in the life of any individual. The very desires for wealth, property, and fame exist only because at some point in our evolutionary history they promoted our survival and procreation. As ends in themselves, divorced from the reason they came into existence in the first place, they lead only to death.
Am I concerned if others don’t agree with me? Far from it! And that brings us back to the main point of this post. I do not agree with Jonathan Last that a constantly increasing population, or even a stable one at current levels, is at all desirable. As far as I am concerned, it is a wonderful stroke of luck that in modern societies the conscious minds of so many other humans have become dysfunctional, resulting in their genetic death. I am interested in keeping other genes around only to the extent that they promote the survival of my own. That is also the only reason that I would prefer one level of population on the planet to one that is larger or smaller. That, of course, is a very personal reason, but it seems to me that it is a conclusion that must follow for anyone else to the extent that they prefer survival to the alternative.
Survival, then, is my sine qua non. Given that this planet is, for practical purposes, the only one we can depend on to support our survival, I consider it foolhardy to prefer a population that is potentially unsustainable, or that will diminish everyone’s chances of long term survival. I am hardly a fanatical environmentalist. I would just prefer that we refrain from rocking the boat. I have read Bjorn Lomborg’s The Skeptical Environmentalist, and am well aware of how frequently the environmentalists have been crying “wolf” lo these many years. However, like Lomborg, I agree that there is still reason for concern. Pollution and environmental degradation are real problems, as is the rapid exploitation of limited sources of cheap energy and other raw materials. Obviously, Paul Ehrlich’s dire predictions that we would run out of everything in short order were far off the mark. Eventually, however, those resources will run out, and it seems reasonable to me to postpone the date as long as possible. Let us consider the reasons Jonathan Last believes all these risks are worth taking. In all honesty, assuming we are agreed that survival is a worthwhile goal, they seem trivial to me.
To begin, while paying lip service to the old chestnut that a correlation does not necessarily indicate causation, Last suggests exactly that. On page 7 of the hardcover version of his book he writes, “Declining populations have always followed or been followed by Very Bad Things. Disease. War. Economic stagnation or collapse.” To see whether this suggestion holds water, let’s look at one of Last’s own examples of “declining populations.” On p. 36 he writes, “World population also declined steeply between 1340 and 1400, shrinking from 443 million to 374 million. This was not a period of environmental and social harmony; it was the reign of the Black Death.” I leave it as an exercise for the reader to determine whether declining populations were the cause of the Black Death, or the Black Death was the cause of declining populations. To anyone who has read a little history, it is abundantly clear that, while disease, war, and economic collapse may cause depopulation, the instances where the reverse was clearly the case are few and far between. In a similar vein, referring to the Roman Empire, Last writes on p. 35, “Then, between A.D. 200 and 600, population shrank from 257 million to 208 million, because of falling fertility. We commonly refer to that period as the descent into the Dark Ages.” Where is the evidence that the population fell because of “falling fertility”? Last cites none. On the other hand, there is abundant source material from the period to demonstrate that, as in the case of the Black Death, declining populations were a result, and not a cause. In Procopius’s history of the Great Italian War in the 6th century, for example, he notes that Italy had become depopulated. The great historian was actually there, and witnessed the cause first hand. It was not “declining fertility,” but starvation resulting from the destruction of food sources by marauding armies.
However, this allusion to “Very Bad Things” is really just a red herring. Reading a little further in Last’s book, it doesn’t take us long to discover the real burrs under his saddle. Most of them may be found by glancing through the 50 pages between chapters 5 and 7 of his book. They include: 1) the difficulty of caring for the elderly; 2) a decrease in inventiveness and entrepreneurship (because of a disproportionate number of elderly); 3) a decline in military strength, accompanied by an unwillingness to accept casualties; and 4) lower economic growth. The idea that anyone could seriously suggest that any of these transient phenomena could justify playing risky games with the ability of our planet to sustain life for millennia into the future boggles the mind. The population of the planet cannot keep increasing indefinitely in any case. At some point, it must stabilize, and these consequences will follow regardless. The only question is how many people will be affected.
Consider Japan, a country Last considers an almost hopeless demographic basket case. Its population was only 42 million as recently as 1900. At the time, it won wars against both China and Russia, which had much greater populations of 415 million and 132 million, respectively. Will it really be an unmitigated disaster if its population declines to that level again? It may well be that Japan’s elderly will have to make do with less during the next century or two. I hereby make the bold prediction that, in spite of that, they will not all starve to death or be left without health care to die in the streets. Demographically, Japan is the most fortunate of nations, not the least favored. At least to date, she does not enjoy the “great advantage” of mass immigration by culturally alien populations, an “advantage” that is likely to wreak havoc in the United States and Europe.
As for military strength, I doubt that we will need to fear enslavement by some foreign power as long as we maintain a strong and reliable nuclear arsenal, and, with a smaller population, the need to project our power overseas, for example to protect sources of oil and other resources, will decline because our needs will be smaller. As for inventiveness, entrepreneurship, and economic growth, it would be better to promote them by restraining the cancerous growth of modern tax-devouring welfare states than by artificially stimulating population growth. Again, all of Last’s “Very Bad Things” are also inevitable things. What he is proposing will not enable us to avoid them. It will merely postpone them for a relatively short time, at which point they will be even more difficult to manage because of depleted resources and a degraded environment than they are now. It seems a very meager excuse for risking the future of the planet.
In a word, I favor a double standard: unrestricted population growth for my own family and those closely related to me genetically, balanced by a decline in the population overall. There is nothing incongruous about this. It is the inherent nature of our species to apply one standard to our ingroup, and an entirely different one to outgroups. We all do the same, whether we are prepared to admit it or not. I leave you, dear reader, in the hope that you will not become confused by the distinction between the two.
Posted on February 4th, 2013 No comments
According to a German proverb, “Lügen haben kurze Beine” – lies have short legs. That’s not always true. Some lies have very long ones. One of the most notorious is the assertion, long a staple of anti-nuclear propaganda, that the nuclear industry once claimed nuclear power would be “too cheap to meter.” In fact, according to the New York Times, the phrase did occur in a speech delivered to the National Association of Science Writers by Lewis L. Strauss, then Chairman of the Atomic Energy Commission, in September 1954. Here is the quote, as reported in the NYT on September 17, 1954:
“Our children will enjoy in their homes electrical energy too cheap to meter,” he declared. … “It is not too much to expect that our children will know of great periodic regional famines in the world only as matters of history, will travel effortlessly over the seas and under them and through the air with a minimum of danger and at great speeds, and will experience a lifespan far longer than ours, as disease yields and man comes to understand what causes him to age.”
Note that nowhere in the quote is there any direct reference to nuclear power, or for that matter, to fusion power, although the anti-nuclear Luddites have often attributed it to proponents of that technology as well. According to Wikipedia, Strauss was “really” referring to the latter, but I know of no evidence to that effect. In any case, Strauss had no academic or professional background that would qualify him as an expert in nuclear energy. He was addressing the science writers as a government official, and hardly as a “spokesman” for the nuclear industry. The sort of utopian hyperbole reflected in the above quote is just what one would expect in a talk delivered to such an audience in the era of scientific and technological hubris that followed World War II. There is an excellent and detailed deconstruction of the infamous “Too cheap to meter” lie on the website of the Canadian Nuclear Society. Some lies, however, are just too good to ignore, and anti-nuclear zealots continue to use this one on a regular basis, as, for example, here, here and here. The last link points to a paper by long-time anti-nukers Arjun Makhijani and Scott Saleska. They obviously knew very well the provenance of the quote and the context in which it was given. For example, quoting from the paper:
In 1954, Lewis Strauss, Chairman of the U.S. Atomic Energy Commission, proclaimed that the development of nuclear energy would herald a new age. “It is not too much to expect that our children will enjoy in their homes electrical energy too cheap to meter,” he declared to a science writers’ convention. The speech gave the nuclear power industry a memorable phrase to be identified with, but also it saddled it with a promise that was essentially impossible to fulfill.
In other words, it didn’t matter that they knew very well that Strauss had no intention of “giving the nuclear power industry a memorable phrase to be identified with.” They used the quote in spite of the fact that they knew that claim was a lie. In all fairness, it can be safely assumed that most of those who pass along the “too cheap to meter” lie are not similarly culpable. They are merely ignorant.
Posted on January 7th, 2013 No comments
Der Spiegel, Germany’s top news magazine, has been second to none in promoting green energy, striking pious poses over the U.S. failure to jump on the Kyoto bandwagon, and trashing nuclear energy. All this propaganda has succeeded brilliantly. Germany has a powerful Green Party and is a world leader in the production of wind and solar energy, the latter in a cloudy country, the lion’s share of which lies above the 50th parallel of latitude. Now the bill has come due. In 2012 German consumers paid more than 20 billion Euros for green energy that was worth a mere 2.9 billion on the open market. True to form, Der Spiegel has been churning out shrill condemnations of the high prices, as if it never had the slightest thing to do with promoting them in the first place. In an article entitled “Green Energy Costs Consumers More Than Ever Before,” we find, among other things, that,
The cost of renewable energy continues climbing year after year. At the beginning of the year it increased from 3.59 to 5.27 (Euro) cents per kilowatt hour. One of the reasons for the increase is solar energy: more new solar facilities were installed in Germany in 2012 than ever before. The drawback of the solar boom is that it drives up the production costs paid by consumers. The reason – green energy producers will receive guaranteed compensation for every kilowatt hour for the next 20 years.
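The arithmetic behind these figures is easy to check. A minimal sketch, using the 20 billion and 2.9 billion Euro figures cited earlier and the surcharge jump from 3.59 to 5.27 cents per kilowatt hour; the 3,500 kWh annual household consumption is my own assumed round number, not a figure from the article:

```python
# Implied 2012 subsidy: consumers paid ~20 billion EUR for green power
# that was worth only 2.9 billion EUR on the open market.
paid_eur, market_eur = 20e9, 2.9e9
subsidy_eur = paid_eur - market_eur
ratio = paid_eur / market_eur

# EEG surcharge jump quoted above, applied to an assumed household
# consuming 3,500 kWh per year (my assumption).
old_ct, new_ct = 3.59, 5.27                           # euro cents per kWh
household_kwh = 3500
extra_eur = (new_ct - old_ct) * household_kwh / 100   # cents -> euros

print(f"implicit subsidy: {subsidy_eur/1e9:.1f} billion EUR ({ratio:.1f}x market value)")
print(f"surcharge up {(new_ct/old_ct - 1)*100:.0f}%, ~{extra_eur:.0f} EUR/yr per household")
# prints: implicit subsidy: 17.1 billion EUR (6.9x market value)
# prints: surcharge up 47%, ~59 EUR/yr per household
```

The 17.1 billion Euro gap is the same order as the “17 billion” one of the commenters below complains about; the surcharge increase alone adds roughly 60 Euros a year to a typical household bill, before any rise in the underlying power price.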
As a result, German consumers saw their bills for electricity increase by an average of 12% at the beginning of 2013. The comments following the article are at least as revealing as its content. The environmental hubris of the population shows distinct signs of fading when translated into terms of cold, hard cash. Examples:
What a laugh! The consumers pay 17 billion Euros, and the producers receive 2.9 billion Euros. Conclusion: End the subsidies for solar facilities immediately!! It’s too bad that the pain of consumers – if the Green Party joins the government after the Bundestag election – won’t end, but will only get worse. Other than that, solar facilities belong in countries with significantly more hours of sunlight than Germany.
Those were the days, when (Green politician) Trittin told shameless lies to the public, claiming that the switch to green energy would only cost 1.5 Euros per household.
In ten years we’ll learn what the green energy lies are really going to cost us.
The real costs are even higher. When there’s no wind, or clouds cut off the sunlight, then the conventional energy sources held in reserve must make up the deficit; the oil, coal and brown coal energy plants. If production costs are calculated correctly, then their expense should be included in the price of green energy. All at once there is a jump from 17 billion to 25 billion Euros in the price we have to pay for the “favors” the Green-Red parties have done us.
Specious arguments about the supposedly comparable costs of the nuclear power plants Germany is in the process of shutting down are no longer swallowed with alacrity. For example, in response to the familiar old chestnut of citing exaggerated costs for decommissioning nuclear plants and storing the waste a commenter replies:
Hmmm, if nuclear energy is so expensive, why are so many countries in central Europe – for example, the Czech Republic – interested in nuclear power? Certainly not to breed actinides to build nuclear weapons in order to become “nuclear powers.” The cost of long term waste storage in terms of the energy produced only amounts to about 0.01 Euros per kWh. Even decommissioning expenses don’t add significantly to the overall cost… Let us split atoms, not hairs.
A “green” commenter suggests that the cleanup costs for the Fukushima reactors be automatically added to the cost of all reactors:
According to the latest figures for November 2012 for Fukushima: 100 billion Euros. Distributing this over the total energy production of 880,000 GWh (according to Wikipedia) that’s 11 cents per kilowatt hour. That amounts to twice the “prettified” cost of nuclear power (without insurance and without subsidies) of 5 cents per kilowatt hour. And even then the Japanese were lucky that the wind didn’t shift in the direction of Tokyo. But the 100 billion won’t be the last word.
Drawing the response from another reader:
Let’s see. Japanese nuclear power plants have produced 7,656,400 GWh of energy. In comparison to economic costs in the high tens of billions, 100 billion suddenly doesn’t seem so unreasonable. It only adds 1.3 cents per kWh to the cost of nuclear energy. Peanuts. In Germany, renewables are currently costing an average of 18 cents per kWh. That translates to 100 billion in under four years. In other words, thanks to renewables, we have a Fukushima in Germany every four years.
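Both commenters’ per-kilowatt-hour figures can be reproduced from the numbers they cite. A quick verification, taking their 100 billion Euro cleanup estimate and their GWh totals at face value (neither is independently verified here):

```python
# Checking the two cost-per-kWh figures from the comments above.
cleanup_eur = 100e9   # assumed Fukushima cleanup cost, from the comments

# (description, lifetime generation in GWh) as cited by each commenter
cases = [
    ("spread over 880,000 GWh", 880_000),          # first commenter's base
    ("spread over 7,656,400 GWh", 7_656_400),      # second commenter's base
]

for label, gwh in cases:
    kwh = gwh * 1e6                        # 1 GWh = 1,000,000 kWh
    ct_per_kwh = cleanup_eur / kwh * 100   # euro cents per kWh
    print(f"{label}: {ct_per_kwh:.1f} ct/kWh")
# prints: spread over 880,000 GWh: 11.4 ct/kWh
# prints: spread over 7,656,400 GWh: 1.3 ct/kWh
```

The results, roughly 11 cents and 1.3 cents per kWh, match what the two commenters claim; the argument between them is purely over which generation total the cleanup cost should be amortized across.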
In response to a remark about all the wonderful green jobs created, another commenter responds,
Jobs created? Every job is subsidized to the tune of 40,000 Euros; how, exactly, is that supposed to result in a net gain for the economy overall?? According to your logic, all we have to do to eliminate any level of unemployment is just subsidize it away. That’s Green politics for you.
Another unhappy power customer has noticed that, in addition to the hefty subsidy he’s paying for his own power, he has to finance his well-heeled “green” neighbor’s rooftop solar array as well:
Whoever is surprised about the increases in the cost of electricity hasn’t been paying attention. There’s no such thing as a free lunch. At the moment the consumer is paying for the solar cells on his neighbor’s roof right along with his own electricity bill. Surprising? Who’s surprised?
It’s amazing how effective a substantial and increasing yearly hit to income can be in focusing the mind when it comes to assessing the real cost of green energy.