Posted on February 4th, 2013 No comments
According to a German proverb, “Lügen haben kurze Beine” – lies have short legs. That’s not always true. Some lies have very long ones. One of the most notorious is the assertion, long a staple of anti-nuclear propaganda, that the nuclear industry once claimed that nuclear power would be “too cheap to meter.” In fact, according to the New York Times, the phrase did occur in a speech delivered to the National Association of Science Writers by Lewis L. Strauss, then Chairman of the Atomic Energy Commission, in September 1954. Here is the quote, as reported in the NYT on September 17, 1954:
“Our children will enjoy in their homes electrical energy too cheap to meter,” he declared. … “It is not too much to expect that our children will know of great periodic regional famines in the world only as matters of history, will travel effortlessly over the seas and under them and through the air with a minimum of danger and at great speeds, and will experience a lifespan far longer than ours, as disease yields and man comes to understand what causes him to age.”
Note that nowhere in the quote is there any direct reference to nuclear power, or for that matter, to fusion power, although the anti-nuclear Luddites have often attributed it to proponents of that technology as well. According to Wikipedia, Strauss was “really” referring to the latter, but I know of no evidence to that effect. In any case, Strauss had no academic or professional background that would qualify him as an expert in nuclear energy. He was addressing the science writers as a government official, and hardly as a “spokesman” for the nuclear industry. The sort of utopian hyperbole reflected in the above quote is just what one would expect in a talk delivered to such an audience in the era of scientific and technological hubris that followed World War II. There is an excellent and detailed deconstruction of the infamous “Too cheap to meter” lie on the website of the Canadian Nuclear Society. Some lies, however, are just too good to ignore, and anti-nuclear zealots continue to use this one on a regular basis, as, for example, here, here and here. The last link points to a paper by long-time anti-nukers Arjun Makhijani and Scott Saleska. They obviously knew very well the provenance of the quote and the context in which it was given. For example, quoting from the paper:
In 1954, Lewis Strauss, Chairman of the U.S. Atomic Energy Commission, proclaimed that the development of nuclear energy would herald a new age. “It is not too much to expect that our children will enjoy in their homes electrical energy too cheap to meter,” he declared to a science writers’ convention. The speech gave the nuclear power industry a memorable phrase to be identified with, but also it saddled it with a promise that was essentially impossible to fulfill.
In other words, it didn’t matter that they knew very well that Strauss had no intention of “giving the nuclear power industry a memorable phrase to be identified with.” They used the quote in spite of the fact that they knew that claim was a lie. In all fairness, it can be safely assumed that most of those who pass along the “too cheap to meter” lie are not similarly culpable. They are merely ignorant.
Posted on January 7th, 2013 No comments
Der Spiegel, Germany’s top news magazine, has been second to none in promoting green energy, striking pious poses over the U.S. failure to jump on the Kyoto bandwagon, and trashing nuclear energy. All this propaganda has succeeded brilliantly. Germany has a powerful Green Party and is a world leader in the production of wind and solar energy, the latter in a cloudy country, the lion’s share of which lies above the 50th parallel of latitude. Now the bill has come due. In 2012 German consumers paid more than 20 billion Euros for green energy that was worth a mere 2.9 billion on the open market. True to form, Der Spiegel has been churning out shrill condemnations of the high prices, as if it never had the slightest thing to do with promoting them in the first place. In an article entitled “Green Energy Costs Consumers More Than Ever Before,” we find, among other things, that,
The cost of renewable energy continues climbing year after year. At the beginning of the year it increased from 3.59 to 5.27 (Euro) cents per kilowatt hour. One of the reasons for the increase is solar energy: more new solar facilities were installed in Germany in 2012 than ever before. The drawback of the solar boom is that it drives up the production costs paid by consumers. The reason – green energy producers will receive guaranteed compensation for every kilowatt hour for the next 20 years.
As a result, German consumers saw their bills for electricity increase by an average of 12% at the beginning of 2013. The comments following the article are at least as revealing as its content. The environmental hubris of the population shows distinct signs of fading when translated into terms of cold, hard cash. Examples:
What a laugh! The consumers pay 17 billion Euros, and the producers receive 2.9 billion Euros. Conclusion: End the subsidies for solar facilities immediately!! It’s too bad that the pain of consumers – if the Green Party joins the government after the Bundestag election – won’t end, but will only get worse. Other than that, solar facilities belong in countries with significantly more hours of sunlight than Germany.
Those were the days, when (Green politician) Trittin told shameless lies to the public, claiming that the switch to green energy would only cost 1.5 Euros per household.
In ten years we’ll learn what the green energy lies are really going to cost us.
The real costs are even higher. When there’s no wind, or clouds cut off the sunlight, then the conventional energy sources held in reserve must make up the deficit; the oil, coal and brown coal energy plants. If production costs are calculated correctly, then their expense should be included in the price of green energy. All at once there is a jump from 17 billion to 25 billion Euros in the price we have to pay for the “favors” the Green-Red parties have done us.
Specious arguments about the supposedly comparable costs of the nuclear power plants Germany is in the process of shutting down are no longer swallowed with alacrity. For example, in response to the familiar old chestnut of citing exaggerated costs for decommissioning nuclear plants and storing the waste, a commenter replies:
Hmmm, if nuclear energy is so expensive, why are so many countries in central Europe – for example, the Czech Republic – interested in nuclear power? Certainly not to breed actinides to build nuclear weapons in order to become “nuclear powers.” The cost of long term waste storage in terms of the energy produced only amounts to about 0.01 Euros per kWh. Even decommissioning expenses don’t add significantly to the overall cost… Let us split atoms, not hairs.
A “green” commenter suggests that the cleanup costs for the Fukushima reactors be automatically added to the cost of all reactors:
According to the latest figures for November 2012 for Fukushima: 100 billion Euros. Distributing this over the total energy production of 880,000 GWh (according to Wikipedia) that’s 11 cents per kilowatt hour. That amounts to twice the “prettified” cost of nuclear power (without insurance and without subsidies) of 5 cents per kilowatt hour. And even then the Japanese were lucky that the wind didn’t shift in the direction of Tokyo. But the 100 billion won’t be the last word.
Drawing the response from another reader:
Let’s see. Japanese nuclear power plants have produced 7,656,400 GWh of energy. In comparison to economic costs in the high tens of billions, 100 billion suddenly doesn’t seem so unreasonable. It only adds 1.3 cents per kWh to the cost of nuclear energy. Peanuts. In Germany, renewables are currently costing an average of 18 cents per kWh. That translates to 100 billion in under four years. In other words, thanks to renewables, we have a Fukushima in Germany every four years.
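The arithmetic in these dueling comments is easy to check. Here is a quick sanity check in Python, taking the commenters’ own figures at face value (they are claims from the comment thread, not independently verified numbers):

```python
# Rough check of the per-kWh figures quoted in the two comments above,
# using the commenters' own numbers (not independently verified).

cleanup_cost_eur = 100e9  # claimed Fukushima cleanup estimate, Nov. 2012

# First comment: spread the cost over Fukushima Daiichi's lifetime output.
fukushima_output_kwh = 880_000 * 1e6  # 880,000 GWh expressed in kWh
print(cleanup_cost_eur / fukushima_output_kwh)  # ~0.11 EUR, i.e. ~11 cents/kWh

# Second comment: spread the same cost over all Japanese nuclear output.
japan_output_kwh = 7_656_400 * 1e6  # 7,656,400 GWh expressed in kWh
print(cleanup_cost_eur / japan_output_kwh)  # ~0.013 EUR, i.e. ~1.3 cents/kWh
```

Both commenters did their division correctly; the disagreement is entirely over which denominator is the fair one.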
In response to a remark about all the wonderful green jobs created, another commenter responds,
Jobs created? Every job is subsidized to the tune of 40,000 Euros; how, exactly, is that supposed to result in a net gain for the economy overall?? According to your logic, all we have to do to eliminate any level of unemployment is just subsidize it away. That’s Green politics for you.
Another unhappy power customer has noticed that, in addition to the hefty subsidy he’s paying for his own power, he has to finance his well-heeled “green” neighbor’s rooftop solar array as well:
Whoever is surprised about the increases in the cost of electricity hasn’t been paying attention. There’s no such thing as a free lunch. At the moment the consumer is paying for the solar cells on his neighbor’s roof right along with his own electricity bill. Surprising? Who’s surprised?
It’s amazing how effective a substantial and increasing yearly hit to income can be in focusing the mind when it comes to assessing the real cost of green energy.
Posted on June 10th, 2012 2 comments
As I mentioned in a previous post about fusion progress, signs of life have finally been appearing in scientific journals from the team working to achieve fusion ignition at the National Ignition Facility, or NIF, located at Lawrence Livermore National Laboratory (LLNL) in California. At the moment they are “under the gun,” because the National Ignition Campaign (NIC) is scheduled to end with the end of the current fiscal year on September 30. At that point, presumably, work at the facility will be devoted mainly to investigations of nuclear weapon effects and physics, which do not necessarily require fusion ignition. Based on a paper that recently appeared in Physical Review Letters, chances of reaching the ignition goal before that happens are growing dimmer.
The problem has to do with a seeming contradiction in the physical requirements for fusion to occur in the inertial confinement approach pursued at LLNL. In the first place, it is necessary for the NIF’s 192 powerful laser beams to compress, or implode, a target containing fusion fuel in the form of two heavy isotopes of hydrogen to extremely high densities. It is much easier to compress materials that are cold than those that are hot. Therefore, it is essential to keep the fuel material as cold as possible during the implosion process. In the business, this is referred to as keeping the implosion on a “low adiabat.” However, for fusion ignition to occur, the nuclei of the fuel atoms must come extremely close to each other. Unfortunately, they’re not inclined to do that, because they’re all positively charged, and like charges repel. How to overcome the repulsion? By making the fuel material extremely hot, causing the nuclei to bang into each other at high speed. The whole trick of inertial confinement fusion, then, is to keep the fuel material very cold, and then, in a tiny fraction of a second, while its inertia holds it in place (hence the name, “inertial” confinement fusion), raise it, or at least a small bit of it, to the extreme temperatures necessary for the fusion process to begin.
The proposed technique for creating the necessary hot spot was always somewhat speculative, and more than one fusion expert at the national laboratories was dubious that it would succeed. It consisted of launching a train of four shocks during the implosion process, timed so that they would all overtake one another precisely at the moment of maximum compression, thereby creating the necessary hot spot. Four shocks are needed because of well-known theoretical limits on the increase in temperature that can be achieved with a single shock. Which brings us back to the paper in Physical Review Letters.
The paper, entitled Precision Shock Tuning on the National Ignition Facility, describes the status of efforts to get the four shocks to jump through the hoops described above. One cannot help but be impressed by the elegant diagnostic tools used to observe and measure the shocks. They are capable of peering through materials under the extreme conditions in the NIF target chamber, focusing on the tiny, imploded target core, and measuring the progress of a train of shocks over a period that only lasts for a few billionths of a second! These diagnostics, developed with the help of another team of brilliant scientists at the OMEGA laser facility at the University of Rochester’s Laboratory for Laser Energetics, are a triumph of human ingenuity. They reveal that the NIF is close to achieving the ignition goal, but not quite close enough. As noted in the paper, “The experiments also clearly reveal an issue with the 4th shock velocity, which is observed to be 20% slower than predictions from numerical simulation.”
It will be a neat trick indeed if the NIF team can overcome this problem before the end of the National Ignition Campaign. In the event that they don’t, one must hope that the current administration is not so short-sighted as to conclude that the facility is a failure, and severely reduce its funding. There is too much at stake. I have always been dubious about the possibility that either the inertial or magnetic approach to fusion will become a viable source of energy any time in the foreseeable future. However, I may be wrong, and even if I’m not, achieving inertial fusion ignition in the laboratory may well point the way to as yet undiscovered paths to the fusion energy goal. Ignition in the laboratory will also give us a significant advantage over other nuclear weapons states in maintaining our arsenal without nuclear testing.
Based on the progress reported to date, there is no basis for the conclusion that ignition is unachievable on the NIF. Even if the central hot spot approach currently being pursued proves too difficult, there are alternatives, such as polar direct drive and fast ignition. However, pursuing these alternatives will take time and resources. They will become a great deal more difficult to realize if funding for NIF operations is severely cut. It will also be important to maintain the ancillary capability provided by the OMEGA laser. OMEGA is much less powerful but also a good deal more flexible and nimble than the gigantic NIF, and has already proved its value in testing and developing diagnostics, investigating novel experimental approaches to fusion, developing advanced target technology, etc.
We have built world-class facilities. Let us persevere in the quest for fusion. We cannot afford to let this chance slip.
Posted on May 8th, 2012 1 comment
Theodosius Dobzhansky was an important early proponent of what is now generally referred to as evolutionary psychology. Although his last book appeared as recently as 1983, he is generally forgotten today, at least in the fanciful and largely imaginary “histories” of the field that appear in college textbooks. Unfortunately, he was indelicate enough to jump the gun, joining contemporaries like Robert Ardrey and Konrad Lorenz in writing down the essential ideas of evolutionary psychology, particularly as applied to humans, long before the publication of E. O. Wilson’s Sociobiology in 1975.
That event was subsequently arbitrarily anointed by the gatekeepers of the chronicles of the science as the official “beginning” of evolutionary psychology. In fact, the reason Sociobiology gained such wide notoriety was Wilson’s insistence that what is commonly referred to as human nature actually does exist. As I have noted elsewhere, neither that claim nor the controversy surrounding it began with Wilson. Far from it. The “Blank Slate” opponents of Wilson’s ideas had long recognized Robert Ardrey as their most significant and effective opponent, with Konrad Lorenz a close second. Dobzhansky’s Mankind Evolving also presented similar hypotheses, well-documented with copious experimental evidence which, if textbooks such as David Buss’ Evolutionary Psychology are to be believed, didn’t exist at the time. Anyone who reads Mankind Evolving, published in 1962, a year after Ardrey’s African Genesis, will quickly realize from the many counter-examples noted in the book that Buss’ claim that the early ethologists and their collaborators, “…did not develop rigorous criteria for discovering adaptations,” is a myth. Alas, Dobzhansky was premature. He wrote too early to fit neatly into the “history” of evolutionary psychology concocted later.
It’s unfortunate that Dobzhansky has been swept under the rug with the rest, because he had some interesting ideas that don’t appear in many other works. He also wrote from the point of view of a geneticist, which enabled him to explain the mechanics of evolution with unusual clarity.
Latter day critics of evolutionary psychology commonly claim that it minimizes the significance of culture. Not only is that not true today, but it has never been true. Thinkers like Ardrey, Lorenz and Eibl-Eibesfeldt never denied the importance of culture. They merely insisted that the extreme cultural determinism of the Blank Slate orthodoxy that prevailed in their day was wrong, and that innate, evolved traits also had a significant effect on human behavior. Dobzhansky was very explicit about it, citing numerous instances in which culture and learning played a dominant role, and others more reliant on innate predispositions. As he put it,
In principle any trait is modifiable by changes in the genes and by manipulation of the environment.
He went so far as to propose a theory of superorganisms:
In producing the genetic basis for culture, biological evolution has transcended itself – it has produced the superorganic.
…and constantly stressed the interdependence of innate predispositions and culture. For example,
Why do so many people insist that biological and cultural evolution are absolutely independent? I suggest that this is due in large part to a widespread misunderstanding of the nature of heredity… Biological heredity, which is the basis of biological evolution, does not transmit cultural, or for that matter physical, traits ready-made; what it does is determine the response of the developing organism to the environment in which the development takes place.
The dichotomy of hereditary and environmental traits is untenable: in principle, any trait is modifiable by changes in the genes and by manipulation of the environment.
In higher animals and most of all in man instinctual behavior is intertwined with, overlaid by, and serves merely as a backdrop to learned behavior. Yet it would be rash to treat this backdrop as unimportant.
…the old fashioned nature-nurture debates were meaningless. The dichotomy of environment vs. genetic traits is invalid; what we really want to know are the relative magnitudes of the genetic and environmental components in the variance observed in a given trait, a certain population, at a particular time.
It has a surprisingly modern ring to it for something written in 1962, doesn’t it? Dobzhansky was as well aware as Ardrey of the reasons for the Blank Slate orthodoxy that prevailed in the behavioral sciences when he wrote Mankind Evolving, an orthodoxy that is now being assiduously ignored, as if the decades-long ideological derailment of such “sciences” as anthropology, sociology and psychology, and their insistence on doctrines so bogus that a child could have recognized them as such, were a matter of no concern. Citing Ashley Montagu, editor of that invaluable little document of the times, Man and Aggression, as a modern proponent of such ideas, he writes,
Some philosophes who were perhaps bothered by questions of this sort (whether human nature was really good or not) concluded that human nature is, to begin with, actually a void, an untenanted territory. The “tabula rasa” theory was apparently first stated clearly by John Locke (1632-1704). The mind of a newborn infant is, Locke thought, a blank page.
Pastore (1949) compared the sociopolitical views of twenty-four psychologists, biologists, and sociologists with their opinions concerning the nature-nurture problem. Among the twelve classified as “liberals or radicals,” eleven were environmentalists and one an hereditarian; among the twelve “conservatives,” eleven were hereditarians and one an environmentalist. This is disconcerting! If the solution of a scientific problem can be twisted to fit one’s biases and predilections, the field of science concerned must be in a most unsatisfactory state.
That is certainly the greatest understatement in Dobzhansky’s book. In fact, for a period of decades in the United States, major branches of the behavioral sciences functioned, not as sciences, but as ideological faiths posing as such. The modern tendency to sweep that inconvenient truth under the rug is dangerous in the extreme. It is based on the apparent assumption that such a thing can never happen again. It not only will happen again, but is happening even as I write this. It will happen a great deal more frequently as long as we continue to refuse to learn from our mistakes.
Posted on May 6th, 2012 9 comments
Nuclear power is an attractive candidate for meeting our future energy needs. Nuclear plants do not release greenhouse gases. They release significantly less radiation into the environment than coal plants, because coal contains several parts per million of radioactive thorium and uranium. They require far less space and are far more reliable than alternative energy sources such as wind and solar. In spite of some of the worst accidents imaginable due to human error and natural disasters, we have not lost any cities or suffered any mass casualties, and the horrific “China Syndrome” scenarios invented by the self-appointed saviors of mankind have proven to be fantasies. That is not to say nuclear power is benign. It is just more benign than any of the currently available alternatives. The main problem with nuclear is not that it is unsafe, but that it is being ill-used. In this case, government could actually be helpful. Leadership and political will could put nuclear on a better track.
To understand why, it is necessary to know a few things about nuclear fuel, and how it “burns.” Bear with me while I present a brief tutorial in nuclear engineering. Nuclear energy is released by nuclear fission, or the splitting of heavy elements into two or more lighter ones. This doesn’t usually happen spontaneously. Before a heavy element can undergo fission, an amount of energy above a certain threshold must first be delivered to its nucleus. How does this happen? Imagine a deep well. If you drop a bowling ball into the well, it will cause a large splash when it hits the water. It does so because it has been accelerated by the force of gravity. A heavy nucleus is something like a well, but things don’t fall into it because of gravity. Instead, it relies on the strong force, which is very short range, but vastly more powerful than gravity. The role of “bowling ball” can be played by a neutron. If one happens along and gets close enough to fall into the strong force “well,” it will also cause a “splash,” releasing energy as it is bound to the heavy element’s nucleus, just as the real bowling ball is “bound” in the water well until someone fishes it out. This “splash,” or release of energy, causes the heavy nucleus to “jiggle,” much like an unstable drop of water. In one naturally occurring isotope – uranium with an atomic weight of 235 – this “jiggle” is so violent that it can cause the “drop of water” to split apart, or fission.
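To get a feel for the scale of the energy involved, each fission releases roughly 200 MeV, a standard round figure. A quick back-of-envelope calculation (illustrative only; the coal figure of ~24 MJ/kg is a typical textbook value) shows why that matters:

```python
# Back-of-envelope: energy released by fissioning 1 kg of U-235,
# using the standard round figure of ~200 MeV per fission.
AVOGADRO = 6.022e23   # nuclei per mole
MEV_TO_J = 1.602e-13  # joules per MeV

fissions_per_kg = AVOGADRO / 235 * 1000        # U-235 nuclei in 1 kg
energy_per_kg_j = fissions_per_kg * 200 * MEV_TO_J
print(f"{energy_per_kg_j:.1e} J/kg")           # ~8e13 J, about 80 terajoules

# Compare with burning coal (~24 MJ/kg, a typical textbook value):
print(energy_per_kg_j / 24e6)                  # fission wins by a factor of millions
```

That factor of a few million between nuclear and chemical fuels is the whole reason the fuel-efficiency questions discussed below are worth arguing about.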
There are other isotopes of uranium. All of them have 92 protons in their nucleus, but can have varying numbers of neutrons. The nucleus of uranium 235, or U235, has 92 protons and 143 neutrons, adding up to a total of 235. Unfortunately, U235 is only 0.7% of natural uranium. Almost all the rest is U238, which has 92 protons and 146 neutrons. When a neutron falls into the U238 “well,” the “splash” isn’t big enough to cause fission, unless the neutron had a lot of energy to begin with, as if the “bowling ball” had been shot from a cannon. As a result, U238 can’t act as the fuel in a nuclear reactor. Almost all the nuclear reactors in operation today simply burn that 0.7% of U235 and store what’s left over as radioactive waste. Unfortunately, that’s an extremely inefficient and wasteful use of the available fuel resources.
To understand why, it’s necessary to understand something about what happens to the neutrons in a reactor that keep the nuclear chain reaction going. First of all, where do they come from? Well, each fission releases more neutrons. The exact number depends on how fast the neutron that caused the fission was going, and what isotope underwent fission. If enough are released to cause, on average, one more fission, then the resulting chain reaction will continue until the fuel is used up. Actually, two neutrons, give or take, are released in each fission. However, not all of them cause another fission. Some escape the fuel region and are lost. Others are absorbed in the fuel material. That’s where things get interesting.
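The neutron balance just described can be captured in a toy model. This is purely illustrative, not a reactor physics calculation: it just shows how the fate of a neutron population depends on the effective multiplication factor, i.e., the average number of new fissions each fission produces:

```python
# Toy model of the neutron balance described above. Each fission releases
# a couple of neutrons; after losses to leakage and capture, what matters
# is k, the average number of new fissions each fission causes.

def neutron_population(k, generations, n0=1000):
    """Neutron count after a number of generations with multiplication factor k."""
    n = n0
    for _ in range(generations):
        n *= k
    return n

print(neutron_population(1.0, 50))   # k = 1, critical: population holds steady
print(neutron_population(0.9, 50))   # k < 1, subcritical: chain reaction dies out
print(neutron_population(1.1, 50))   # k > 1, supercritical: rapid growth
```

A power reactor is held at k = 1; the interesting question, taken up next, is what happens to the neutrons that don’t cause fission.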
Recall that, normally, most of the fuel in a reactor isn’t U235, but the more common isotope, U238. When U238 absorbs a neutron, it forms U239, which quickly decays to neptunium 239 and then plutonium 239. Now it just so happens that plutonium 239, or Pu239, will also fission if a neutron “falls into its well,” just like U235. In other words, if enough neutrons were available, the reactor could actually produce more fuel, in the form of Pu239, than it consumes, potentially burning up most of the U238 as well as the U235. This is referred to as the “breeding” of nuclear fuel. Instead of just lighting the U235 “match” and letting it burn out, it would be used to light and burn the entire U238 “log.” Unfortunately, there are not enough neutrons in normal nuclear reactors to breed more fuel than is consumed. Such reactors have, however, been built, both in the United States and other countries, and have been safely operated for periods of many years.
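The payoff of breeding is easiest to see as a fuel-utilization ratio. The numbers below are illustrative assumptions, not reactor design values: once-through reactors are treated as burning only the 0.7% U235 “match,” and a mature breeder economy is assumed to fission roughly 95% of the mined uranium:

```python
# Rough comparison of fuel utilization, per the breeding discussion above.
# Illustrative assumptions only, not reactor design values.

u235_fraction = 0.007    # natural abundance of U-235

# Fraction of a tonne of natural uranium that actually gets fissioned:
once_through = u235_fraction   # burn only the U-235 "match"
breeder = 0.95                 # assume a breeder eventually burns ~95% of the "log"

print(breeder / once_through)  # breeding stretches the resource more than 100-fold
```

Even with generous allowances for conversion of some U238 to plutonium in conventional reactors, the gap remains two orders of magnitude, which is the substance of the “match versus log” metaphor above.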
Plutonium breeders aren’t the only feasible type. In addition to U235 and Pu239, another isotope will also fission if a neutron falls into its “well” - uranium 233. Like Pu239, U233 doesn’t occur in nature. However, it can be “bred,” just like Pu239, from another element that does occur in nature, and is actually more common than uranium – thorium. I’ve had a few critical things to say about some of the popular science articles I’ve seen on thorium lately, but my criticisms were directed at inaccuracies in the articles, not at thorium technology itself. Thorium breeders actually have some important advantages over plutonium. When U233 fissions, it produces more neutrons than Pu239, and it does so in a “cooler” neutron spectrum, where the average neutron energy is much lower, making the reactor significantly easier to control. These extra neutrons could not only breed more fuel. They could also be used to burn up the transuranic elements – those beyond uranium on the table of the elements – that are produced in conventional nuclear reactors, and account for the lion’s share of the long-lived radioactive waste. This would be a huge advantage. Destroy the transuranics, and the residual radioactivity from a reactor would be less than that of the original ore, potentially in a few hundred years, rather than many thousands.
Thorium breeders have other potentially important advantages. The fuel material could be circulated through the core in the form of a liquid, suspended in a special “salt” material. Of course, this would eliminate the danger of a fuel meltdown. In the event of an accident like the one at Fukushima, the fuel would simply be allowed to run into a holding basin, where it would be sub-critical and cool quickly. Perhaps more importantly, the United States has the biggest proven reserves of thorium on the planet.
Breeders aren’t the only reactor types that hold great promise for meeting our future energy needs. High temperature gas cooled reactors would produce gas heated to high temperature in addition to electricity. This could be used to produce hydrogen gas via electrolysis, a process that is much more efficient at such high temperatures. When hydrogen burns, it produces only water. Such reactors could also be built over the massive oil shale deposits in the western United States. The hot gas could then be used to efficiently extract oil from the shale “in situ” without the need to mine it. It is estimated that the amount of oil that could be economically recovered in this way from the Green River Basin deposits in Utah, Wyoming and Colorado alone is three times greater than the oil reserves of Saudi Arabia.
Will any of this happen without government support and leadership? Not any time soon. The people who build nuclear reactors expect to make a profit, and the easiest way to make a profit is to build more conventional reactors of the type we already have. Raise the points I’ve mentioned above, and they’ll simply tell you that there’s plenty of cheap uranium around and therefore no need to breed more fuel, the radioactive danger of transuranics has been much exaggerated, etc., etc. All these meretricious arguments make sense if your goal is to make a profit in the short run. They make no sense at all if you have any concern for the energy security and welfare of future generations.
Unless the proponents of controlled fusion or solar and other forms of alternative energy manage to pull a rabbit out of their collective hats, I suspect we will eventually adopt breeder technology. The question is when. After we have finally burnt our last reserves of fossil fuel? After we have used up all our precious reserves of U238 by scattering it hither and yon in the form of “depleted uranium” munitions? The longer we wait, the harder and more expensive it will become to develop a breeder economy. It would be well if, in this unusual case, government stepped in and did what it is theoretically supposed to do: lead.
Posted on February 29th, 2012 1 comment
It’s quiet out there – too quiet. The National Ignition Facility, or NIF, at Lawrence Livermore National Laboratory, a giant, 192 beam laser facility, has been up and running for well over a year now. In spite of that, there is a remarkable lack of the type of glowing journal articles with scores of authors one would expect to see if the facility had achieved any notable progress towards its goal of setting off fusion ignition in a tiny target with a mix of fuel in the form of tritium and deuterium, two heavy isotopes of hydrogen. Perhaps they will turn things around, but at the moment it doesn’t look good.
The NIF was built primarily to study various aspects of nuclear weapons science, but it is potentially also of great significance to the energy future of mankind. Fusion is the source of the sun’s energy. Just as energy is released when big atoms, such as uranium, are split, it is also released when the central core, or nuclei, of light atoms are “fused” together. This “fusion” happens when the nuclei are moved close enough together for the attraction of the “strong force,” a very powerful force but one with a range limited to the very short distances characteristic of atomic nuclei, to overwhelm the “Coulomb” repulsion, or electric force that tends to prevent two like charges, such as positively charged atomic nuclei, from approaching each other. When that happens with deuterium, whose nucleus contains a neutron and a proton, and tritium, whose nucleus contains two neutrons and a proton, the result is a helium nucleus, containing two neutrons and two protons, and a free neutron that carries off a very large quantity of energy.
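Just how large that quantity of energy is follows directly from the mass defect of the reaction. Here is a short sketch of the calculation, using standard atomic mass values (the masses and conversion factor are textbook figures, not taken from the NIF papers discussed here):

```python
# Energy released in D + T -> He-4 + n, from the mass defect.
# Mass values in unified atomic mass units (u), from standard tables:
M_D, M_T = 2.014102, 3.016049      # deuterium, tritium
M_HE4, M_N = 4.002602, 1.008665    # helium-4, free neutron
U_TO_MEV = 931.494                 # energy equivalent of 1 u, in MeV

q_value = (M_D + M_T - M_HE4 - M_N) * U_TO_MEV
print(f"{q_value:.1f} MeV")        # ~17.6 MeV per reaction, most of it
                                   # carried off by the 14 MeV neutron
```

Per unit mass, that is several times more energy than even uranium fission releases, which is why fusion is worth the enormous trouble described below.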
The problem is that overcoming the Coulomb force is no easy matter. It can only be done if you pump in a lot of energy to “light” the fusion fire. On the sun, this is accomplished by the massive force of gravity. Here on earth the necessary energy can be supplied by a fission explosion, the source of energy that “lights” thermonuclear bombs. Mother Nature decided, no doubt very wisely, to make it very difficult to accomplish the same thing in a controlled manner on a laboratory scale. Otherwise we probably would have committed suicide with pure fusion weapons by now. At the moment, two major approaches are being pursued to reach this goal. One is inertial confinement fusion, or ICF, as used on the NIF. In inertial confinement fusion, the necessary energy is supplied in such a short period of time by massive lasers or other “drivers” that the fuel is held in place by its own inertia long enough for significant fusion to occur. In the other approach, magnetic fusion, the fusion fuel is confined by powerful magnetic fields as it is heated to fusion temperatures. This is the approach being pursued with ITER, the International Thermonuclear Experimental Reactor, currently under construction in France.
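The size of the obstacle can be estimated with a crude classical model: treat the two nuclei as charged spheres and compute the electrostatic energy when they touch. The nuclear radius parameter and the simplified geometry are standard textbook approximations, not anything specific to the NIF:

```python
# Crude estimate of the Coulomb barrier between D and T nuclei, treating
# them as spheres touching at separation r = r0 * (A1**(1/3) + A2**(1/3)).
# The 1.44 MeV*fm constant is e^2/(4*pi*eps0) in nuclear-friendly units.
R0 = 1.2     # fm, typical nuclear radius parameter
E2 = 1.44    # MeV * fm

r_touch = R0 * (2 ** (1 / 3) + 3 ** (1 / 3))   # D (A=2) and T (A=3), both Z=1
barrier_mev = E2 / r_touch
print(f"{barrier_mev:.2f} MeV")   # a few hundred keV, enormous compared to
                                  # ordinary thermal energies (~0.025 eV)
```

In practice, quantum tunneling lets the reaction proceed at temperatures corresponding to a few tens of keV rather than the full barrier height, but that still means heating the fuel to over a hundred million degrees, which is exactly the “hot spot” problem discussed next.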
Based on computer models and the results of experiments on much smaller facilities, such as NOVA at Livermore, and OMEGA at the University of Rochester, it was expected that fusion could be accomplished with the nominal 1.8 megajoules of energy available from the 192 NIF laser beams. It was to happen like this – carefully shaped laser pulses would implode the fusion fuel to extremely high densities. Such implosions have already been demonstrated many times in the laboratory. The problem is that, to achieve the necessary densities, one must compress the fuel while it is in a relatively “cold” state (it is much more difficult to “squeeze” something that is “hot” in that way). Unfortunately, fusion doesn’t happen in cold material. Once the necessary high densities have been achieved, it is somehow necessary to heat at least a small portion of the material to the extreme temperatures necessary for fusion to occur. If that can be done, a “burn wave” will move out from this “hot spot,” igniting the rest of the cold fuel material. Of course, this raises the question of how one is to produce the “hot spot” in the first place.
On the NIF, the trick was to be accomplished by setting off a series of converging shocks in the fuel material during the implosion process. Once the material had reached the necessary high density, these shocks would converge at a point in the center of the imploded target, creating a spot hot enough to set off the burn wave referred to above. It would be a neat trick if it could be done. Unfortunately, it was never demonstrated on a laboratory scale before the NIF was built. Obviously, the “trick” is turning out to be harder than the scientists at Livermore expected. There could be many reasons for this. If the implosion isn’t almost perfectly symmetric, the hot and cold fuel materials will mix, quenching the fusion reaction. If the timing of the shocks isn’t just right, or the velocity of the implosion is too slow, the resulting number of fusion reactions will not be enough to achieve ignition. All kinds of complicated physical processes, such as the generation of huge magnetic and electric fields, so-called laser-plasma instabilities, and anomalies in the absorption of laser light, can happen that are extremely difficult to include in computer models.
The game isn’t up yet, though. There are some very bright folks at Livermore, and they may yet pull a rabbit out of the hat. Even if the current “mainline” approach using central hot spot ignition doesn’t work, it may be possible to create a hot spot on the outer surface of the imploded target using a technique known as fast ignition. Currently, “indirect drive” is being used on the NIF. In other words, the laser beams are shot into a cylindrical can, or “hohlraum,” where their energy is converted to x-rays. These x-rays then “indirectly” illuminate the target. The NIF can also accommodate a “direct drive” approach, in which the laser beams are aimed directly at the target. Perhaps it will work better. One hopes so. Some of the best old knights of science have been riding towards that El Dorado for a long time. It would be great to see them finally reach it. Alas, to judge by the deafening silence coming out of Livermore, it seems they are still a long way off.
And what of ITER? Let me put it this way. Along with the International Space Station, the project is one of the two greatest scientific white elephants ever concocted by the mind of man. The NIF is justified because it cost only a fraction of what ITER will, and it was never conceived as an energy project. It was always intended as an above ground experimental facility that would enable us to maintain our nuclear arsenal in the absence of testing. As such, it is part of an experimental capability unequalled in the rest of the world, and one which will give us a very significant advantage over any potential enemy as long as the ban on testing continues. ITER, on the other hand, can only be justified as an energy project. The problem with that is that, while it may work scientifically, it will be an engineering nightmare. As a result, it is virtually inconceivable that magnetic fusion reactors similar to ITER will ever produce energy economically any time in the next few hundred years.
A big part of the problem is that such reactors will require a tritium economy. Each of them will burn on the order of 50 kilograms of tritium per year. Tritium is highly radioactive, with a half-life of 12.3 years, is as difficult to contain as any other form of hydrogen, and does not occur naturally. In other words, failing some outside source, each reactor will have to produce as much tritium as it consumes. Each fusion reaction produces a single neutron, and neutrons can interact with an isotope of lithium to produce tritium. However, some of the neutrons will inevitably be lost, so it will be necessary to multiply their number. This trick can be accomplished with the element beryllium. In other words, in order to build a workable reactor, it will be necessary to have a layer of some extremely durable material containing the plasma, thick enough to resist radiation embrittlement and corrosion for some reasonable period of time, followed by a layer of highly toxic beryllium thick enough to generate enough neutrons, followed by a layer of highly reactive lithium thick enough to produce enough tritium to keep the reaction going. But wait, there’s more! It will then be necessary to somehow quickly extract the tritium bred in the lithium and return it to the reaction chamber without losing any of it. Tritium? Lithium? Beryllium? Forget about it! I’m sure there are any number of reactor design studies that all “prove” that all of the above can be done economically. I’m also sure none of them are worth the paper they are printed on. We have other options that don’t suffer from the drawbacks of a tritium economy and are far more likely to produce the energy we need at a fraction of the cost.
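That figure of roughly 50 kilograms per year is easy to sanity-check: each D-T reaction consumes one triton and releases about 17.6 MeV. A minimal Python sketch, assuming a hypothetical reactor producing 1 gigawatt of fusion power (the power level is my assumption, not a figure from the text):

```python
# Rough tritium consumption of a D-T fusion reactor.
# Assumed: 1 GW of fusion power, 17.6 MeV released per reaction.
FUSION_POWER_W = 1.0e9
MEV_PER_REACTION = 17.6
JOULES_PER_MEV = 1.602e-13
TRITON_MASS_KG = 3.016 * 1.6605e-27  # tritium nucleus, ~3.016 u
SECONDS_PER_YEAR = 3.156e7

reactions_per_second = FUSION_POWER_W / (MEV_PER_REACTION * JOULES_PER_MEV)
tritium_kg_per_year = reactions_per_second * SECONDS_PER_YEAR * TRITON_MASS_KG

print(f"{tritium_kg_per_year:.0f} kg of tritium per year")  # ~56 kg
```

The result lands right around the quoted 50 kilograms, which is why each such reactor would have to breed essentially all of its own fuel.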
Meanwhile, ITER crawls ahead, sucking enormous amounts of research money from a host of more worthy projects. A classic welfare project for smart guys in white coats, it isn’t even scheduled to be fueled with tritium before the year 2028! I’m sure that at this point many European scientists are asking a simple question: Can’t we please stop this thing?
Fusion is immensely promising as a potential future source of energy. However, we should not be seduced by that promise into throwing good money after bad, funding a white elephant that has virtually no chance of ever fulfilling that promise. I suspect that one of these days we will “finesse” Mother Nature, and devise a clever way to overcome the Coulomb barrier without gigantic superconducting magnets or massive arrays of lasers. Scientists around the world are currently working on many novel and speculative approaches to fusion. Few of them are likely to succeed, but it just takes one. We would be much better off funding some of the more promising of these approaches with a fraction of the money currently being wasted on ITER, and devoting the rest to developing other technologies that have at least a fighting chance of eventually producing energy economically.
Meanwhile, I’m keeping my fingers crossed for the NIF crew at Livermore. It ain’t over until the fat lady sings, and she’s still a long way off.
Posted on February 20th, 2012 No comments
Rick Santorum threw the Left a meaty pitch right down the middle with his comments about “theology” to an audience in Columbus. Here’s what he said:
It’s not about you. It’s not about your quality of life. It’s not about your job. It’s about some phony ideal, some phony theology. Oh, not a theology based on the Bible. A different theology. But no less a theology.
The quote seems to lend credence to the “Santorum is a scary theocrat” meme, and the Left lost no time in flooding the media and the blogosphere with articles to that effect. The Right quickly fired back with the usual claims that the remarks were taken out of context. This time the Right has it right. For example, from Fox News,
Rick Santorum said Sunday he wasn’t questioning whether President Obama is a Christian when he referred to his “phony theology” over the weekend, but was in fact challenging policies that he says place the stewardship of the Earth above the welfare of people living on it.
“I wasn’t suggesting the president’s not a Christian. I accept the fact that the president is a Christian,” Santorum said.
“I was talking about the radical environmentalist,” he said. “I was talking about energy, this idea that man is here to serve the Earth as opposed to husband its resources and be good stewards of the Earth. And I think that is a phony ideal.
I note in passing a surprising thing about almost all the articles about this story, whether they come from the Left or the Right. The part of Santorum’s speech that actually does put things in context is absent. Here it is:
I think that a lot of radical environmentalists have it backwards. This idea that man is here to serve the earth, as opposed to husband its resources and be good stewards of the earth. Man is here to use the resources and use them wisely. But man is not here to serve the earth.
I can understand its absence on the Left, but on the Right? Could it be that contrived controversies are good for the bottom line? Well, be that as it may, I’m not adding my two cents worth to this kerfuffle because I’m particularly fond of Santorum. However, he did touch on a matter that deserves serious consideration: the existence of secular religions.
In fact, there are secular religions, and they have dogmas, just like the more traditional kind. It’s inaccurate to call those dogmas “theologies,” because they don’t have a Theos, but otherwise they’re entirely similar. In both cases they describe elaborate systems of belief in things that either have not been or cannot be demonstrated and proved. The reason for this is obvious in the case of traditional religions. They are based on claims of the existence of spiritual realms inaccessible to the human senses. Secular dogmas, on the other hand, commonly deal with events that can’t be fact-checked because they are to occur in the future.
Socialism in it’s heyday was probably the best example of a secular religion to date. While it lasted, millions were completely convinced that the complex social developments it predicted were the inevitable fate of mankind, absent any experimental demonstration or proof whatsoever. Not only did they believe it, they considered themselves superior in intellect and wisdom to other mere mortals by virtue of that knowledge. They were elitists in the truest sense of the word. Thousands and thousands of dreary tomes were written elaborating on the ramifications and details of the dogma, all based on the fundamental assumption that it was true. They were similar in every respect to the other thousands and thousands of dreary tomes of theology written to elaborate on conventional religious dogmas, except for the one very important distinction referred to above. Instead of describing an entirely different world, they described the future of this world.
That was their Achilles heel. The future eventually becomes the present. The imaginary worker’s paradise was eventually exchanged for the very real Gulag, mass executions, and exploitation by a New Class beyond anything ever imagined by the bourgeoisie. Few of the genuine zealots of the religion ever saw the light. They simply refused to believe what was happening before their very eyes, despite the testimony of thousands of witnesses and victims. Eventually, they died, though, and their religion died with them. Socialism survives as an idea, but no longer as the mass delusion of cocksure intellectuals. For that we can all be grateful.
In a word, then, the kind of secular “theologies” Santorum was referring to really do exist. The question remains whether the specific one he referred to, radical environmentalism, rises to the level of such a religion. I think not. True, some of the telltale symptoms of a secular religion are certainly there. For example, like the socialists before them, environmental ideologues are characterized by a faith, free of any doubt, that a theoretically predicted future, e.g., global warming, will certainly happen, or at least will certainly happen unless they are allowed to “rescue” us. The physics justifies the surmise that severe global warming is possible. It does not, however, justify fanatical certainty. Probabilistic computer models that must deal with billions of ill-defined degrees of freedom cannot provide certainty about anything.
An additional indicator is the fact that radical environmentalists do not admit the possibility of honest differences of opinion. They have a term for those who disagree with them; “denialists.” Like the heretics of religions gone before, denialists are an outgroup. It cannot be admitted that members of an outgroup have honest and reasonable differences of opinion. Rather, they must be the dupes of dark political forces, or the evil corporations they serve, just as, in an earlier day, anyone who happened not to want to live under a socialist government was automatically perceived as a minion of the evil bourgeoisie.
However, to date, at least, environmentalism possesses nothing like the all-encompassing world view, or “Theory of Everything,” if you will, that, in my opinion at least, would raise it to the level of a secular religion. For example, Christianity has its millennium, and the socialists had their worker’s paradise. The environmental movement has nothing of the sort. So far, at least, it also falls short of the pitch of zealotry that results in the spawning of warring internal sects, such as the Arians and the Athanasians within Christianity, or the Mensheviks and Bolsheviks within socialism.
In short, then, Santorum was right about the existence of secular religions. He was merely sloppy in according that honor to a sect that really doesn’t deserve it.
Posted on February 11th, 2012 No comments
I don’t think so! Less than a century after H. L. Mencken wrote that the Uplift was a purely American phenomenon, there may now be even more of the pathologically pious in Germany per capita than in the U.S. They all think they’re far smarter than the average human being, they all see a savior of mankind when they look in the mirror, and almost all of them are cocksure that nuclear power is one of the Evils they need to save us from. Just last November tens of thousands of them turned out in force to block the progress of a spent fuel castor from France to the German radioactive waste storage site at Gorleben. The affair turned into a regular Uplift feeding frenzy, complete with pitched battles between the police and the peaceful protesters, who were armed with clubs and pyrotechnics, tearing up of railroad tracks, etc. It’s no wonder the German government finally threw in the towel and announced the country would shut down its nuclear power plants.
At least the decision took the wind out of their sails for a while. As Malcolm Muggeridge once said, “nothing fails like success” for the Saviors of Mankind. Success tends to leave them high and dry. At best they have to go to the trouble of finding another holy cause to fight for. At worst, as in the aftermath of their fine victory in establishing a Worker’s Paradise in Russia, they’re all shot.
It would seem the “bitter dregs of success” were evident in a recent article on the website of the German news magazine, Der Spiegel, entitled “Electricity is Becoming Scarce in Germany.” Der Spiegel has always been in the van of the pack of baying anti-nuclear hounds in Germany, so I was somewhat surprised by the somber subhead, which reads as follows:
The nuclear power shutdown has been a burden for Germany’s electric power suppliers in any case. Now the cold wave is making matters worse. The net operators have already had to fall back on emergency reserves for the second time this winter, and buy additional electricity from Austria.
That’ s quite an admission coming from the Der Spiegel, where anti-nuclear polemics are usually the order of the day. Even the resolutely Green Washington Post editorialized against the German shutdown, noting, among other things,
THE INTERNATIONAL Energy Agency reported on Monday that global energy-related carbon emissions last year were the highest ever, and that the world is far off track if it wants to keep temperatures from rising more than 2 degrees Celsius, after which the results could be very dangerous.
So what does Germany’s government decide to do? Shut down terawatts of low-carbon electric capacity in the middle of Europe. Bowing to misguided political pressure from Germany’s Green Party, Chancellor Angela Merkel endorsed a plan to close all of the country’s nuclear power plants by 2022.
European financial analysts estimate that Germany’s move will result in about 400 million tons of extra carbon emissions by 2020, as the country relies more on fossil fuels. Donald Tusk, Poland’s prime minister, ominously announced that Germany has put coal-fired power “back on the agenda” — good for his coal-rich nation directly to Germany’s east but terrible for the environment and public health.
…and so on. Not exactly a glowing endorsement of the German Greens’ optimistic plans to replace nuclear with solar in a cloudy country that gets cold in the winter and lies on the wrong side of the 50th parallel of latitude. Poland’s prime minister is right to worry about being downwind of Germany. In spite of the cheery assurances of the Greens, Germany currently plans to build 26 new coal-fired power plants. It’s funny how environmental zealots forget all about the terrible threat of global warming if it’s a question of opposing nuclear power. But Poland has a lot more to worry about than Germany’s carbon footprint.
It’s estimated that 25,000 people die from breathing coal particulates in the U.S. alone every year. The per capita death rate in Poland, directly downwind from the German plants, will likely be significantly higher. Then there’s the radiation problem. That’s right, coal typically contains several parts per million of radioactive uranium and thorium. A good-sized plant will release 5 tons of uranium and 10 tons of thorium into the environment each year. Estimated releases in 1982 from worldwide combustion of 2800 million tons of coal totaled 3640 tons of uranium (containing 51,700 pounds of uranium-235) and 8960 tons of thorium. China currently burns that much coal by herself. The radiation from uranium and thorium is primarily in the form of alpha particles, or helium nuclei. Such radiation typically has a very short range in matter, because it slows down quickly and then dumps all of its remaining energy in a very limited distance, the so-called Bragg peak. On the one hand that means that a piece of paper is enough to stop most alpha radiation. On the other it means that if you breath it in, the radiation will be slammed to a stop in your sensitive lung tissue, dealing tremendous damage in the process. Have you ever heard of people dying of lung cancer who never smoked a day in their lives? If you’re looking for a reason, look no further.
No matter. As Stalin said, one death is a tragedy. One million is a statistic. Germany’s Greens will continue to ignore such dry statistics, and they will continue to strike noble poses as they fight the nuclear demon, forgetting all about global warming in the process. For them, the pose is everything, and the reality nothing.
Posted on February 7th, 2012 2 comments
Geoffrey Gorer was a British anthropologist, essayist, long-time friend of George Orwell, and, at least in my estimation, a very intelligent man. He was also a Blank Slater. In other words, he was a proponent of the orthodox dogma that prevailed among psychologists, anthropologists, sociologists, and other experts in the behavioral sciences during much of the 20th century according to which there was no such thing as human nature or, if it existed at all, its impact on human behavior was insignificant. He defended that orthodoxy, among other places, in Man and Aggression, a collection of essays edited by Ashley Montagu, and an invaluable piece of source material for students of the Blank Slate phenomenon.
Now, of course, after one of the most remarkable paradigm shifts in the history of mankind, the Blank Slate has gone the way of Aristotelian cosmology, and books roll off the presses in an uninterrupted stream discussing innate human behavior as if the subject had never been the least bit controversial. How, one might ask, if Geoffrey Gorer really was such an intelligent man, could he ever have taken the Blank Slate ideology seriously? Well, I speak of intelligence in relative terms. Taken as a whole, we humans aren’t nearly as smart as we think we are and, as Julius Caesar once said, we have a marked tendency to believe what we want to believe.
And why did Gorer “want” to believe in the Blank Slate? I submit it was for the same reason that so many of his contemporaries defended the theory: their faith in socialism. I do not use the term socialism in any kind of a pejorative sense. Rather, I speak of it as the social phenomenon it was; for all practical purposes a secular religion posing as a science. It is scarcely possible for people today to grasp the power and pervasiveness of socialist ideology in its heyday. We have the advantage of hindsight and have watched socialist systems, ranging from the Communist authoritarian versions to the benign, democratic variant of the type tried in Great Britain after the war, fail over and over again. Earlier generations did not have that advantage.
More or less modern socialist theories were prevalent in England long before Marx. By 1917 they had taken such root in the minds of the Russian intelligentsia that Maxim Gorky could write that he couldn’t imagine a democratic state that wasn’t socialist. A couple of decades later, in the aftermath of the Great Depression, Malcolm Muggeridge remarked that,
In 1931, protests were made in Parliament against a broadcast by a Cambridge economist, Mr. Maurice Dobb, on the ground that he was a Marxist; now the difficulty would be to find an economist employed in any university who was not one.
Anyone doubting the influence of similar ideas in the United States at the time need only go back and read the New Republic, The Nation, The American Mercury, and some of the other intellectual and political journals of the mid-30s. In a word, then, socialism was once accepted as an unquestionable truth by large numbers of very influential intellectuals. It seemed perfectly obvious that capitalism was gasping its last, and the only question left seemed to be how the transition to socialism would take place, and how the socialist states of the future should be run.
There was just one problem with this as far as the social and behavioral sciences were concerned. Socialism and human nature were mutually exclusive. The firmest defenders of genetically programmed behavioral predispositions in human beings have never denied the myriad possible variations in human societies that are attributable to culture and environment. Socialism, however, required more than that. It required human behavior to be infinitely malleable which, if innate behavioral predispositions exist, it most decidedly is not.
Which brings us back to Geoffrey Gorer. In an essay entitled, appropriately enough, The Remaking of Man, written in 1956, we can follow the intellectual threads that show how all this came together in the mind of a mid-20th century anthropologist. I will let Gorer speak for himself.
One of the most urgent problems – perhaps the most urgent problem – facing the world today is how to change the character and behavior of adult human beings within a single generation. This problem of rapid transformation has underlaid every revolution (as opposed to coups d’etat) at least from the time of the English Revolution in the seventeenth century, which sought to establish the Rule of the Saints by some modifications in the governing institutions and the laws they promulgated; and from this point of view every revolution has failed… the character of the mass of the population, their attitudes and expectations, change apparently very little.
Up till the present century revolutions were typically concerned with the internal arrangements of one political unit, one country; but the nearly simultaneous development of world-wide communications and world-wide ideologies – democracy, socialism, communism – has posed the problem not merely of how to transform ourselves – whoever ‘ourselves’ may be – but how to transform others.
In Gorer’s opinion the problem wasn’t human nature. It couldn’t be, or socialism wouldn’t work. The problem was that we simply hadn’t been using the right technique. For example, we hadn’t been relying on proper role models. Gorer had somehow convinced himself that female school teachers had played a decisive role in altering the character of immigrants to America, “transforming them within a generation into good citizens of their countries of adoption, with changed values, habits and expectations… In our original thinking, this role of the school-teacher, and the derivatives of this situation, were idiosyncratic to the culture of the United States.” According to Gorer, the introduction of police in England had had a similar magical effect. By serving as role models, they had, almost single-handedly, brought about “the great modifications in the behavior of the English urban working classes in the nineteenth century from violence and lawlessness to gentleness and law-abiding.” They had, “…provided an exemplar of self-control which the mass of the population could emulate and use as a model.” If the phrase “just so story” popped into your mind, you’re not alone. Of course, one man’s “just so story” is another man’s “scientific hypothesis.” It all depends on whether it happens to be politically convenient or not.
Proper role models, then, were one of the ingredients that Gorer discovered were needed to “change the character and behavior of adult human beings within a single generation.” He discovered no less than four more in the process of reading Margaret Mead’s New Lives for Old, which he described as “an account of a society which has transformed itself within twenty-five years.” The society in question was that of the Manus, inhabitants of the Admiralty Islands, which lie just north of New Guinea. And sure enough, their society did change drastically in a generation and, if we are to believe Mead, for the good. This change had been greatly facilitated by one Paliau, a charismatic leader of the Manus who luckily happened along at the time. Gorer admitted this was a fortuitous accident, but he saw, or at least imagined he saw, several other ingredients for radical change which could be applied by properly qualified experts. In his words,
The availability of a man of Paliau’s genius is obviously an unpredictable accident which cannot be generalized; but the other four conditions – readiness for change, the presentation of a model for study and observation, the sudden and complete break with the past, nurture and support during the first years of the new life – would seem to provide a paradigm of the way in which men may be changed in a single generation.
Human societies certainly may change radically within a very short time. It is an adaptive trait that accounts for the fact that we managed not only to survive, but to thrive during times of rapid environmental change. The brilliant South African, Eugene Marais, was the first to make the connection. In his words,
If now we picture the great continent of Africa with its extreme diversity of natural conditions – its high, cold, treeless plateaux; its impenetrable tropical forests; its great river systems; its inland seas; its deserts; its rain and droughts; its sudden climatic changes capable of altering the natural aspect of great tracts of country in a few years – all forming an apparently systemless chaos, and then picture its teeming masses of competing organic life, comprising more species, more numbers and of greater size than can be found on any other continent on earth, is it not at once evident how great would be the advantage if under such conditions a species could be liberated from the limiting force of hereditary memories? Would it not be conducive to preservation if under such circumstances a species could either suddenly change its habitat or meet any new natural conditions thrust upon it by means of immediate adaptation? Is it not self-evident that in a species far-wandering, whether on account of sudden natural changes, competitive pressure, or through inborn “wanderlust,” those individuals which could best and most quickly adapt themselves to the most varied conditions would be the ones most likely to survive and perpetuate the race, and that among species, one equipped for distant migrations would always have a better chance than a confined one? Are not all the elements present to bring about the natural selection of an attribute by means of which a species could thus meet and neutralise one of the most prolific causes of destruction?
This is not advanced as a demonstrable theory. It is no more than an attempt to show that it is hardly possible to imagine conditions existing anywhere in nature at any time which would not in some degree tend towards the evolution of such an attribute. If these present conditions are self-evidently likely to select it, how much more likely, for instance, would not its birth and growth have been during the earlier history of the planet, during the Pleistocene period, when cataclysmic movements of its crust and great and repeated climatic changes still belonged to the usual and customary category of natural events.
These astounding insights occurred to a man, working mainly alone in South Africa, in the early years of the 20th century. Marais was indeed a genius. Unfortunately, at least from Gorer’s point of view, while his theories accounted for mankind’s extreme adaptability, they in no way implied that that adaptability would enable well-meaning ideologues to reinvent human character at will to convert us into suitable inmates for whatever utopia du jour they were cooking up for us. It would seem that’s what Gorer overlooked in his sanguine conclusions about the Manus. Their society had indeed adapted, but it had done so on its own, and not as programmed by some inspired anthropologist. He concludes his essay as follows:
The great merit of New Lives for Old is that it opens up a whole new field for observation, experiment and speculation, a field of the greatest relevance to our present preoccupations.
The “present preoccupation” which required the “remaking of man” was, of course, our happy transformation to socialism. Unfortunately, the wishful thinking of a generation of Gorers made no impression on the genetic programming responsible for our behavioral predispositions. It remained stubbornly in place, spoiling who knows how many of the splendid Brave New Worlds that noble idealists the world over were preparing for us.
In retrospect, socialism ended, as the old Bolshevik Leon Trotsky suggested it might in 1939, just before Stalin had him murdered, in a utopia. It was a secular religion that inspired a highly speculative and mindless faith in a collection of untested theories in the minds of a host of otherwise highly intelligent and perfectly sane people like Gorer, who all managed to somehow convince themselves that the socialist mirage was a “science.” As E. O. Wilson so succinctly summed it up, “Great theory, wrong species.” And that, perhaps, is the reason that the Blank Slate was defended so fiercely for so long, in the teeth of increasingly weighty scientific evidence refuting it, not to mention common sense, until its conflict with reality became too heavy to bear, and it finally collapsed. It was an indispensable prop for a God that failed.
Posted on February 2nd, 2012
Ron Bailey just posted an interesting article on the ethics of extraterrestrial terraforming at Reason.com. It illustrates, once again, that before entering into deep philosophical debates about morality, it’s useful to know what you’re talking about.
Before taking up the article in question, let me lay my own cards on the table. I consider human morality to be the expression of behavioral traits that exist because they evolved in a species with a large brain. Thus, good and evil are subjective categories that depend for their existence on emotional responses in the minds of individuals. As such, it is impossible for them to have any independent objective existence as things in themselves. As subjective emotional responses elicited in individual minds, there is no way in which they can acquire objective legitimacy. Elaborations on this theme may be found here and here.
Of course, one can dispute my take on morality, but, in that case, it will be necessary to somehow explain away the increasing flood of findings relative to what some call hardwired morality now appearing in academic and scientific journals and the popular media, not to mention the increasingly compelling evidence of analogs of the behavioral traits we associate with morality in other animals.
What does all this have to do with the ethics of terraforming? Simply this: arguments about whether terraforming is morally good or evil are absurd, and exercises in futility. They amount to attempts to apply behavioral predispositions that have evolved over millions of years in circumstances utterly unlike the present, and that exist for the sole reason that they promoted the genetic survival of the creatures who carried them, to a situation completely unrelated to the conditions and causes under which they evolved in the first place. Such arguments are completely senseless absent the assumption, long cultivated by philosophers, but nevertheless delusional, that good and evil can somehow acquire an objective legitimacy and objective existence of their own. In view of what we now know about the evolved roots of morality, belief in the existence of good and evil as things in themselves is no longer rationally supportable.
The article in question, entitled Does Mars Have Rights?, argues that terraforming is good, contradicting an earlier essay by Australian philosopher Robert Sparrow entitled The Ethics of Terraforming that claims that, at least for the present, it is evil. Let’s take up Prof. Sparrow’s essay first. He uses what he calls an agent-based virtue ethics to support his claim that advocacy of terraforming reveals “a shocking moral bankruptcy at the heart of our attitude toward the environment.” An agent-based ethics is motivated by the observation that “It is much easier to point out those who are cruel or benevolent in a community than it is to provide a description of what counts as a cruel or benevolent act.” It is based on the assertion that it is “the virtuous (or vicious) character of the actor which makes the act virtuous (or vicious).” As such, it is easier to apply in practice than an alternative system of virtue ethics, namely, agent-focused ethics, which Sparrow describes in his essay. Basing his conclusions on such an agent-based ethics, Sparrow argues that “terraforming reveals two serious defects of character. First, it demonstrates that we are suffering from an ethically significant aesthetic insensitivity,” and, “Second, it involves us in the sin of hubris.”
Sparrow goes into a great deal of detail in describing these two “sins,” but their legitimacy as “real” sins is based on their validation as “vicious” according to whether some subset of a population of animals with large brains “feels” that persons committing such acts are vicious. I say subset because it has been demonstrated that even infants, presumably without the benefit of having read the ancient philosophers, judge “agents” according to their actions. Prof. Sparrow does not go into a great deal of detail as to how that subset would be chosen. Clearly, this “feeling” test does not actually call the sins in question into existence. Rather, it is merely a means of detecting them once they have been committed. In other words, in order to accept the validity of the system, it is necessary for us to assume, a priori, that the sins in question exist as things in themselves, independent of the actors and agents that allow us to detect them. If, however, as I have maintained, morality is really the expression of a subset of evolved behavioral traits in a particular type of animal, this assumption is absurd, and the system collapses. Regardless of my opinions about morality, it is irrational to simply assume the objective existence and legitimacy of good and evil as entities in themselves, as Sparrow has done, without making the slightest attempt to explain the rationale on which their existence and legitimacy are based.
And what of Bailey’s post at Reason taking issue with Prof. Sparrow? Either he failed to understand Sparrow’s definition of agent-based ethics, or he simply decided to ignore it. Instead, he explains to us why terraforming would be “really good” in terms of his own system of morality, which comes with rather less philosophical ballast courtesy of Aristotle and company. Addressing Sparrow’s two evidences of moral deficit, he writes,
Sparrow acknowledged that he did not offer an objective account of beauty, so the notion still resides in the eye of the beholder, as does desolate ugliness. And as awesome as the view down Valles Marineris might be right now, it would arguably be even more so if it were teeming with life. With regard to the hubris of terraforming, one initial response would be a hearty “so what?” Terraforming offers the promise of helping humanity toward practical moral improvement by increasing our understanding of just how precious terrestrial life is, aiding us in managing it toward greater integrity, stability, and beauty.
To this, Sparrow’s virtuous agent would presumably reply, “Yes, and your point is?” In fact, there is no point, because Bailey missed it. His reply simply ignores the role of the virtuous agent in Sparrow’s ethics, a role which the philosopher explained clearly enough. He could simply observe that Bailey has self-identified as an “unvirtuous agent,” and his remarks about beauty and hubris are, therefore, neither here nor there. Bailey’s implication that terraforming would be morally good because it, “offers the promise of helping humanity toward practical moral improvement,” is simply a statement of the circular argument that terraforming is moral because it is moral.
Again, while both authors’ arguments depend on the existence of objective good, they simply assume it a priori, without troubling themselves to explain to us how they have deduced the existence of that holy grail. Presumably it floats somewhere out there in the luminiferous ether, independent of any crude animal intelligence, and we are to take it on trust that, while it remains invisible to vulgar eyes, they have beheld it in all its glory. If all life in the universe ceased to exist, it would still remain, one gathers, as some kind of potential energy, ready to hop into the brain of any sentient beings that happened to evolve, guiding them towards the light.
Our consciousness certainly leads us to perceive the Good as an objective thing. In spite of that, it was clear enough to Hume, Mill, and any number of other pre-Darwinian thinkers that no such object existed. Still, the illusion is so strong that even now, after the recent “discovery” by our social scientists that such a thing as human nature exists, and that morality is a manifestation of that nature, objective Good is still taken for granted in deep, philosophical debates by people who should know better.
And what does all this have to do with terraforming? Simply this: morality is completely irrelevant to the question of whether we should do it or not. My personal opinion is that we should, as soon as we are able, because it will enhance the chances that both terrestrial life in general and our species in particular will survive and continue to evolve. Is our survival objectively good? Certainly not! Call it a mere whim of mine, if you will, but I submit that it’s at least a natural whim. Virtually everything about me exists because it happened to promote the survival of the genes responsible for putting me together at some point or other in the past. Furthermore, subjective though they may be, such whims make life not only endurable, but exciting and enjoyable. I hope that others will share this whim, this preference for survival over oblivion. If enough do, then terraforming will someday become a reality.