Posted on January 25th, 2011
Philosopher Nassim Taleb is famous for his theories regarding black swans, described in his book of that name as events of large magnitude and disproportionate consequence that are unexpected and unpredictable. According to the summary of his ideas on his webpage,
We don’t understand the world as well as we think we do and tend to be fooled by false patterns, mistake luck for skills (the fooled by randomness effect), overestimate knowledge about rare events (Black Swans), as well as human understanding, something that has been getting worse with the increase in complexity.
The collapse of the Soviet Union has my vote for the greatest Black Swan of the 20th century. As Taleb predicted, once it happened, it immediately became a basis for “overestimating our knowledge about rare events.” Transformed in the public imagination from an unprecedented and unpredicted anomaly into a commonplace, it now serves as the basis for all sorts of fanciful predictions, the most prominent of which are probably the recurring reports of China’s imminent demise. Insty just linked another typical example penned by Lawrence Solomon. According to the first two paragraphs:
In 1975, while I was in Siberia on a two-month trip through the U.S.S.R., the illusion of the Soviet Union’s rise became self-evident. In the major cities, the downtowns seemed modern, comparable to what you might see in a North American city. But a 20-minute walk from the centre of downtown revealed another world — people filling water buckets at communal pumps at street corners. The U.S.S.R. could put a man in space and dazzle the world with scores of other accomplishments yet it could not satisfy the basic needs of its citizens. That economic system, though it would largely fool the West until its final collapse 15 years later, was bankrupt, and obviously so to anyone who saw the contradictions in Soviet society.
The Chinese economy today parallels that of the latter-day Soviet Union — immense accomplishments co-existing with immense failures. In some ways, China’s stability today is more precarious than was the Soviet Union’s before its fall. China’s poor are poorer than the Soviet Union’s poor, and they are much more numerous — about one billion in a country of 1.3 billion. Moreover, in the Soviet Union there was no sizeable middle class — just about everyone was poor and shared in the same hardships, avoiding resentments that might otherwise have arisen.
Right. Except for the facts that the Chinese economy today does not parallel that of the latter-day Soviet Union (how prominent were Soviet consumer goods in the U.S. market in 1988?), that the mentality of China’s citizens has nothing in common with the pervasive despair in Soviet society so poignantly described by David Remnick in Lenin’s Tomb, and that the rest of these “obvious” parallels amount to a broad comparison of apples and oranges. Such stuff might have figured prominently in Taleb’s book had it been written a little earlier. In a chapter about World War I, for example, he describes how no one expected the war before it happened, and how, after the fact, everyone suffered from the illusion that they had known about it and predicted it all along. They then used it as the basis for all kinds of delusional predictions, almost none of which came true. Copious examples can be found in the intellectual journals of the decade following the war.
Meanwhile, predictions of China’s doom have become something of a cottage industry for some writers. Gordon Chang, for example, wrote a book in 2001 predicting China’s collapse not later than 2011, and spent the intervening years writing articles proving inductively and deductively that it must be true. China’s leaders apparently didn’t read the book. We have arrived at 2011, and China’s governing class seems to be as alive and kicking as ever. Black Swans can always happen, but I will not be too astounded if they are still around and still cheating their “inevitable” fate in 2021.
China’s rise is itself a Black Swan of sorts. She was a basket case in the 1920′s, and was still patronized as little removed from a third world country as recently as the 1980′s. Many in the West are uncomfortable with her sudden rise to superpower status. However, it’s unlikely she will be toppled by wishful thinking. In the long term, her government is in a state of unstable equilibrium. It does not govern by the consent of the governed, and bases its legitimacy on a failed alien philosophy which its economic policy entirely contradicts. However, Rome’s government was similarly unstable during the reign of Augustus Caesar. Somehow she managed to stagger on for another four centuries and more.
Posted on January 24th, 2011
The journal Evolutionary Psychology hosts a blog written by Robert Kurzban, an Associate Professor at the University of Pennsylvania. Its content is mostly commentary about ongoing research in the field, with a strong academic flavor. Now and again, however, Robert will react with a measure of chagrin, and seeming surprise, to the occasional potshot directed at EP by some unrepentant cultural determinist (for example, here, here and here). These latter typically seize on some supposed flaw in one obscure scientific paper or another as a pretext to condemn the entire field of EP as pseudo-science. What surprises me most about this is Robert’s surprise. His replies always have the air of someone who can’t comprehend why his field has been singled out for wholesale condemnation, like the victim of schoolyard bullies who can’t fathom the reason that they constantly steal his glasses and tromp on them.
In fact, nothing could be more predictable than these attacks. After all, a basic premise of the field of Evolutionary Psychology is that there is such a thing as innate human nature. That premise, obvious as it may seem, contradicts the quasi-religious, ideologically driven denial of human nature that has been the prevailing orthodoxy in the behavioral sciences ever since the days of Franz Boas, an orthodoxy that was very much alive and kicking well into the late 90′s. Should one really be surprised at the bitter mutterings of the many partisans of that now-shattered orthodoxy who are presumably still alive and kicking as well? EP, after all, does not exist in a vacuum. It is not just another scientific sandbox for specialists to play in, isolated, not only from all the other scientific sandboxes, but from the real world outside as well. It is inextricably entangled with any number of weighty issues relevant to politics, ideology, philosophy, and religion. The idea that one can arrive at independent scientific judgments in the field without taking the significance and influence of these connections into account is, at the very least, “bad science.” It assumes an almost complete lack of awareness of the intellectual history relevant to the field ever since the days of Darwin.
Perhaps I’m the one who should be surprised that I’m surprised. To see why, one need look no further than the works that pass as textbooks in the field. For example, Evolutionary Psychology, by David Buss, is accepted by many as the standard. The first chapter, entitled “The Scientific Movements Leading to Evolutionary Psychology,” is a remarkable example of “history” encapsulated in the form of a disarmingly simple-minded fairy tale. For example, there is a section entitled “The Ethology Movement.” To begin, as anyone who was actually alive at the time of the “ethology movement” and has some passing familiarity with both the relevant scientific and popular science literature that appeared at the time must be aware, by far the most significant player in this “movement” was Robert Ardrey, acknowledged at the time as such by scientific friend and foe alike. To confirm that fact, one need look no further than Man and Aggression, published in 1968 and edited by Ashley Montagu, a collection of essays directed at Ardrey and Konrad Lorenz by several experts in the behavioral sciences. By all means, check the source material. As I write this, the hardcover version is available at Amazon for $1.88, and the paperback for only a penny. Nowhere in Buss’ account of the Ethology Movement does one so much as encounter Ardrey’s name.
It is not so easy to studiously ignore Konrad Lorenz. He was, after all, a Nobel Prize winner. He appears in Buss’ book as a nice old man followed by a line of ducklings. It would seem, you see, that that was his primary contribution to the field. According to the book, “Lorenz (1965) started a new branch of evolutionary biology called ethology, and imprinting in birds was a vivid phenomenon used to launch this new field,” and “Indeed, the glimmerings of evolutionary psychology itself may be seen in the early writings of Lorenz, who wrote, ‘our cognitive and perceptual categories, given to us prior to individual experience, are adapted to the environment for the same reasons that the horse’s hoof is suited for the plains before the horse is born, and the fin of a fish is adapted for water before the fish hatches from its egg.’” One cannot but laugh out loud when reading such stuff. Imprinting, professor? Really? Have you never heard of such other works by Lorenz as King Solomon’s Ring, On Aggression, and Behind the Mirror, all of which contained a great deal more than a “glimmering” of what later was rechristened “Evolutionary Psychology,” and all of which had a great deal more to say about the significance of the field to the human condition than his papers about imprinting in ducks?
Professor Buss next helpfully informs us that,
Ethology ran into three problems, however. First, many descriptions acted more as “labels” for behavior patterns and did not really go very far in explaining them. Second, ethologists tended to focus on observable behavior – much like their behaviorist counterparts – and so did not look “inside the heads” of animals to the underlying mechanisms responsible for generating that behavior. And third, although ethology was concerned with adaptation (one of the four critical issues listed by Tinbergen), it did not develop rigorous criteria for discovering adaptations.
Yes, professor, and in the same sense, Aristotle “ran into the problem” of not inventing magnetic resonance imaging. Such abject trivialization of the work of a whole generation of brilliant thinkers is apparently what today passes for the official “history” of the field.
Which brings us to the anomalous situation we are in today. The whole essence of the “Ethology Movement,” and the whole essence of what is now called Evolutionary Psychology, is encapsulated in that one statement of Lorenz’: “our cognitive and perceptual categories, given to us prior to individual experience, are adapted to the environment for the same reasons that the horse’s hoof is suited for the plains before the horse is born, and the fin of a fish is adapted for water before the fish hatches from its egg.” The work of Ardrey, Lorenz, Tinbergen, and the lesser lights of the “Movement,” all focused on that one theme, has been triumphantly vindicated. And yet, in the weird Twilight Zone of what today passes for “history,” they have either been forgotten entirely or, failing that, the essential relevance of their ideas to the human condition writ large, which they always stressed, has been ignored, and students, who will never understand the significance of their field unless they are aware of these connections, have been fobbed off with incoherent mumblings about “imprinting theory.”
One can but shake one’s head. Would you know something about the real history of what is today called Evolutionary Psychology? You had better come armed with a fondness for seeking sources, the spirit of a detective, and a lot of patience.
Posted on January 19th, 2011
For those who don’t follow fusion technology, the National Ignition Facility, or NIF, is a giant, 192 beam laser facility located at Lawrence Livermore National Laboratory. As its name would imply, it is designed to achieve fusion ignition, which has been variously defined, but basically means that you get more energy out from the fusion process than it was necessary to pump into the system to set off the fusion reactions. There are two “classic” approaches to achieving controlled fusion in the laboratory. One is magnetic fusion, in which light atoms stripped of their electrons, or ions, typically heavy isotopes of hydrogen, are confined in powerful magnetic fields as they are heated to the temperatures necessary for fusion to occur. The other is inertial confinement fusion, or ICF, in which massive amounts of energy are dumped into a small target, causing it to reach fusion conditions so rapidly that significant fusion can occur in the very short time that the target material is held in place by its own inertia. The NIF is a facility of the latter type.
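The definition above can be reduced to simple bookkeeping: a target “gain” equal to fusion energy out divided by driver energy in, with ignition commonly taken to mean a gain greater than one. Here is a minimal sketch of that arithmetic; the function name is my own, and only the 1.8 Megajoule design figure comes from the discussion below.

```python
def fusion_gain(fusion_yield_mj, driver_energy_mj):
    """Target gain G = E_fusion / E_driver.

    Ignition, in the basic sense used here, means G > 1: the fusion
    reactions release more energy than the driver (laser) delivered
    to set them off. This ignores wall-plug efficiency of the laser,
    which is a far stricter standard.
    """
    return fusion_yield_mj / driver_energy_mj

# NIF's design driver energy is about 1.8 MJ. A shot yielding 1.8 MJ
# of fusion energy would sit exactly at G = 1 under this definition.
print(fusion_gain(1.8, 1.8))  # → 1.0
```

Note that this is the most forgiving accounting possible; counting the electrical energy needed to fire the laser would raise the bar by more than two orders of magnitude.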
There are, in turn, two basic approaches to ICF. In one, referred to as direct drive, the target material is directly illuminated by the laser beams. In the other, indirect drive, the target is placed inside a small container, or “hohlraum,” with entrance holes for the laser beams. These are aimed at the inside walls of the hohlraum, where they are absorbed, producing x-rays which then compress and ignite the target. The NIF currently uses the latter approach.
The NIF was completed and became operational in 2009. Since that time, the amount of news coming out of the facility about the progress of experiments has been disturbingly slight. That is not a good thing. If everything were working as planned, a full schedule of ignition experiments would be underway as I write this. Instead, the facility is idle. The results of the first experimental campaign, announced in January, sounded positive. The NIF had operated at a large fraction of its design energy output of 1.8 Megajoules. Surrogate targets had been successfully compressed to very high densities in symmetric implosions, as required for fusion. However, on reading the tea leaves, things did not seem quite so rosy. Very high levels of laser plasma interaction (LPI) had been observed. In such complex scattering interactions, laser light can be scattered out of the hohlraum, or in other undesired directions, and hot electrons can be generated, wreaking havoc with the implosion process by preheating the target. We were assured that ways had been found to control the excess LPI, and even turn it to advantage in controlling the symmetry of the implosion. However, such “tuning” with LPI had not been foreseen at the time the facility was designed, and little detail was provided on how the necessary delicate, time-dependent shaping of the laser pulses would be achieved under such conditions.
After a long pause, another series of “integrated” experiments was announced in October. Even less information was released on this occasion. We were informed that symmetric implosions had been achieved, and that, “From both a system integration and from a physics point of view, this experiment was outstanding.” Since then, nothing.
It’s hard to imagine that the outlook is really as rosy as the above statement would imply. The NIF was designed for a much higher shot rate. If it sat idle through much of 2010, there must be a reason. It could be that damage to the laser optics has been unexpectedly high. This would not be surprising. Delicate crystals are used at the end of the chain of laser optics to triple the frequency of the laser light, and, given that the output energy of the facility is more than an order of magnitude larger than that of its next largest competitor, damage may have occurred in unexpected ways, as it did on Nova, the NIF’s predecessor at Livermore. LPI may, in fact, be more serious, more difficult to control, and more damaging than the optimistic accounts in January implied. Unexpected physics may be occurring in the absorption of laser light at the hohlraum walls. Whatever the problem, Livermore would be well advised to be forthcoming about it in its press releases. After all, the NIF will achieve ignition or not, regardless of how well the PR is managed.
All this seems very discouraging for the scientists who have devoted their careers to the quest for fusion energy, not to mention the stewards of the nation’s nuclear weapons stockpile, whose needs the NIF was actually built to address. In the end, these apparent startup problems may be overcome, and ignition achieved after all. However, I rather doubt it, unless perhaps Livermore comes up with an alternative to its indirect drive approach.
Posted on January 18th, 2011
The Blank Slate is absurd. Consider your own behavior, the behavior of those around you, and the many commonalities in human behavior that are obvious to anyone who troubles to read a little history, and it becomes difficult to grasp how anyone could believe something so palpably ridiculous. In spite of that, it prevailed for many years as the dominant theory of human behavior among those who passed as experts in related fields. We have a powerful inclination to believe comforting fallacies over jarring realities, and nothing so jarred the comforting fallacy that human behavior is so malleable that we can be “re-educated” at will into perfect citizens of ideal fantasy worlds or systems as the reality of innate human behavioral traits. So intertwined are our emotions with the whole subject of why we act and think the way we do that the very history of the subject has been liberally adjusted to suit preferred narratives. That is true whether one speaks of the adherents of the Blank Slate or of its opponents.
An intriguing instance of the latter is the case of Robert Ardrey. He was arguably the most influential opponent of the Blank Slate who ever took up a pen. He is also an unperson. It is a remarkable fact that Steven Pinker, who wrote a book entitled The Blank Slate, purporting to describe the history and nature of a phenomenon he accurately described as a secular religion, could only bring himself to mention Ardrey’s name in a single paragraph. Even then it was only to distance himself from the man, as if from an untouchable. Speaking of Ashley Montagu’s Man and Aggression, a collection of essays by Blank Slaters directly aimed at Ardrey and, to a lesser extent, Konrad Lorenz, he wrote, apparently in the persona of Dawkins’ poodle,
Some of the criticisms were, to be sure, deserved: Ardrey and Lorenz believed in archaic theories such as that aggression was like the discharge of a hydraulic pressure and that evolution acted for the good of the species. But far stronger criticisms of Ardrey and Lorenz had been made by the sociobiologists themselves. (On the second page of The Selfish Gene, for example, Dawkins wrote, “The trouble with these books is that the authors got it totally and utterly wrong.”)
This statement must seem remarkable to anyone who has bothered to read Ardrey and Lorenz, not to mention Dawkins. To the best of my knowledge, Lorenz’ ideas about the “discharge of hydraulic pressure” never appeared in Ardrey’s work, and Lorenz himself only mentioned the hypothesis as an afterthought to an earlier paper. It by no means played any central or significant role in his thought or intellectual legacy, and no role in Ardrey’s work whatsoever. As for Dawkins’ claim that “the authors got it totally and utterly wrong,” it was based entirely on his rejection of theories of group selection proposed by Wynne-Edwards that Ardrey mentioned approvingly in The Social Contract. It is hard to believe that Pinker ever troubled himself to actually read Ardrey’s books, not to mention those of many other thinkers whose work he freely bowdlerized to fit his narrative in The Blank Slate. If he had, he would have noticed that the common theme of all of them was that the Blank Slate was wrong, that innate predispositions profoundly influence human behavior, with the caveat that they influence it less than in perhaps any other species, their actual expression being heavily influenced by culture and environment, and that, far from implying anything “deterministic” about either our behavior or our future, we can and should alter our behavior based on a recognition of the reality of human nature. In a word, the basic themes of The Blank Slate appeared in Ardrey’s work more than a quarter of a century earlier, but expressed more clearly, certainly more entertainingly, and without Pinker’s regrettable tendency to pontificate about the role of thinkers whose work he has either not read or not understood.
As for group selection, the notion that it played some kind of a central role in Ardrey’s work, or even in The Social Contract, the one of his books in which it is mentioned, is nonsense. The phrase in Dawkins’ book to which Pinker refers reads as follows (Dawkins is speaking of claims about the significance of his subject):
These are claims that could have been made for Lorenz’s On Aggression, Ardrey’s The Social Contract, and Eibl-Eibesfeldt’s Love and Hate. The trouble with these books is that their authors got it totally and utterly wrong. They got it wrong because they misunderstood how evolution works. They made the erroneous assumption that the important thing in evolution is the good of the species (or the group) rather than the good of the individual (or the gene.)
I haven’t read Eibl-Eibesfeldt’s book, but as far as Lorenz and Ardrey are concerned, the one who got it “totally and utterly” wrong here is Dawkins. Neither of them “assumed that the important thing in evolution is the good of the species.” Apparently, writing as a young man far less prominent than he is today, Dawkins completely missed the point of their work. Both of them understood the genetic basis of evolution, and were well aware of the controversy regarding group selection, which Dawkins hardly “discovered.” Human and animal behavior, rather than evolution, was the central theme of their work, a fact that Dawkins apparently missed completely. It’s difficult to understand his attack on them as anything other than an attempt to gain notoriety and promote his book by tweaking the tails of two individuals who were both a great deal more prominent than he at the time, and who both had many enemies in the orthodox scientific community. To get an idea of the basis for Dawkins’ remark, consider what he said about Ardrey a bit later in The Selfish Gene. Speaking of the theory of group selection he writes,
To put it in a slightly more respectable way, a group, such as a species or a population within a species, whose individual members are prepared to sacrifice themselves for the welfare of the group, may be less likely to go extinct than a rival group whose individual members place their own selfish interests first. Therefore the world becomes populated mainly by groups consisting of self-sacrificing individuals. This is the theory of ‘group selection’, long assumed to be true by biologists not familiar with the details of evolutionary theory, brought out into the open in a famous book by V. C. Wynne-Edwards, and popularized by Robert Ardrey in The Social Contract.
Robert Ardrey, in The Social Contract, used the group-selection theory to account for the whole of social order in general. He clearly sees man as a species that has strayed from the path of animal righteousness. Ardrey at least did his homework. His decision to disagree with orthodox theory was a conscious one, and for this he deserves credit.
Dawkins’ disingenuousness here is staggering. Let’s assume that he actually read The Social Contract. In that case, he either completely failed to comprehend what he was reading, or he is deliberately misrepresenting Ardrey’s work. In the first place, there’s the incredible arrogance of the comment that group selection was “assumed to be true by biologists not familiar with the details of evolutionary theory.” This is to completely ignore that group selection had long been a matter of scholarly debate well before Dawkins published his book, that the parties of any significance on either side were well aware of “his” theory of the selfish gene, and that they either supported or opposed it using sophisticated evolutionary arguments. Other than that, The Social Contract was not about group selection, nor was the subject central to the theme of the book. Ardrey brought up the subject, not as an “assumption,” but as an admittedly controversial hypothesis that might explain, for example, the prevalence of alpha males within groups from generation to generation. Ardrey must have scratched his head at reading Dawkins’ nonsense to the effect that he “used the group-selection theory to account for the whole of social order in general.” There is no basis whatsoever for that remark in any fair reading of Ardrey. He did not believe, nor did he ever claim, either implicitly or explicitly, that “man as a species has strayed from the path of animal righteousness.”
Other than that, Dawkins was himself “totally and utterly wrong” to claim that Ardrey, Lorenz, Wynne-Edwards, or any of group selection’s other serious proponents had gotten it “totally and utterly wrong.” That is apparent from the fact that the hypothesis of group selection hardly disappeared after Dawkins published his book. It continues to be a contentious and controversial issue to this day. However, the question is not whether group selection can or cannot actually occur. The question is whether there could have been any possible basis for claiming that the hypothesis was “totally and utterly wrong” in 1976, when The Selfish Gene was published. In fact, neither then nor now has there been sufficient knowledge of the complexity of gene interaction and expression, a detailed physical understanding of the causes of such complex behavioral traits as altruism and moral behavior, or mathematical tools precise enough to model the relevant processes, to justify such a claim. Thus, Dawkins’ implicit assertion that he was as infallible as the pope regarding group selection is ridiculous, and Pinker’s recognition of Dawkins as an infallible pope is even more absurd.
That such obscurantist versions of the “truth” can appear as easily among the supposed opponents as among the defenders of the Blank Slate is a testimony to the degree to which our emotions cloud the discussion of human nature. Scientific detachment is difficult to achieve in studying both ourselves and our species. We are so influenced by preferred narratives about the way things ought to be that we often can’t perceive the simplest truths about the way they really are. And what of Ardrey? One can only assume that, by pointing out that the “scientific” orthodoxy of the Blank Slate was palpably absurd, he insulted the gravitas of the entire professional scientific community, whether pro- or anti-Blank Slate. After all, he was a mere playwright (like Shakespeare, whom Darwin loved to quote). His was an act of unforgivable lese majeste. Hence, it was necessary that he disappear. He became an unperson.
To those interested in knowing the truth, I can only suggest that they read the source material. Those who trouble themselves to actually read Ardrey will find that group selection and the “good of the species” were virtually irrelevant to the central themes of his work. Again, those themes were that the Blank Slate is wrong, that innate predispositions profoundly influence human behavior, and that their actual expression is strongly dependent on culture and environment. They appeared in his books long before the publication of Sociobiology, which in its essentials is a mere echo of Ardrey. Ardrey’s own explanation of the existence of the Blank Slate in African Genesis was at once more concise, more entertaining, and less philosophically flatulent than Pinker’s The Blank Slate, which appeared almost half a century later. It would also never have occurred to Ardrey to write a long book about such a subject that studiously ignored the role of individuals who played key historical roles relevant thereto.
One can only hope that future historians have the intelligence and probity to recognize the true significance of Ardrey’s role. He was a man of many hypotheses, and was quick to admit it when he was wrong. However, regarding the key theme of his work, the profound influence of the innate on human behavior, he was right, and his detractors were wrong. None were better than he at grasping the “big picture,” in the spirit of E. O. Wilson’s Consilience. In the years since his last book was published, we have witnessed what amounts, for the most part, to a triumphant vindication of his work. As we have seen, his reward has been relegation to the status of an unperson.
No doubt many others who recognized important truths about the human condition have been consigned to oblivion, or bowdlerized, in the process. Would you like to know what Hume, or Mill, or Huxley, or Spencer, or Read, or Keith, or Lorenz, or Ardrey really had to say about the subject? There’s only one way to find out for sure. Read them yourself.
Posted on January 13th, 2011
Behavioral scientists of the old school would call the Amity/Enmity Complex a “just so story.” In other words, it is a universal phenomenon, observable in countless instances in both humans and other animals, inexplicable other than as a manifestation of an innate behavioral trait, but one they find inconvenient for ideological reasons and therefore choose to deny and ignore. To justify this seemingly irrational denial of the obvious, they demand a standard of proof for the existence of such traits immeasurably stronger than the one they apply to their own “proved scientific facts,” by which they mean far flimsier hypotheses that happen to have the virtue of agreeing with a preferred narrative.
Briefly put, the Amity/Enmity Complex refers to our innate tendency to categorize others of our species into in-groups and out-groups, favoring the former and hating and despising the latter. As the great anatomist and anthropologist Sir Arthur Keith put it, “Human nature has a dual constitution; to hate as well as to love are parts of it; and conscience may enforce hate as a duty just as it enforces the duty of love. Conscience has a two-fold role in the soldier: it is his duty to save and protect his own people and equally his duty to destroy their enemies… Thus conscience serves both codes of group behavior; it gives sanction to practices of the code of enmity as well as the code of amity.” Today the Complex is commonly referred to as in-group/out-group behavior, but I see no need to conform to the constantly shifting nuances of jargon in the behavioral sciences.
China’s Great Cultural Revolution was a great tragedy. It was also a perfect illustration of the Complex in action. In 1966 the bored old man who happened to run China at the time decided that the Chinese Communist Party and society at large were permeated by a “bourgeois spirit,” and that what the country needed was more revolutionary spirit. He decided to shake things up a bit. What happened next is summed up in Wikipedia as follows:
On August 8, 1966, the Central Committee of the CPC passed its “Decision Concerning the Great Proletarian Cultural Revolution” (also known as “the 16 Points”). This decision defined the GPCR as “a great revolution that touches people to their very souls and constitutes a new stage in the development of the socialist revolution in our country, a deeper and more extensive stage”:
“Although the bourgeoisie has been overthrown, it is still trying to use the old ideas, culture, customs, and habits of the exploiting classes to corrupt the masses, capture their minds, and endeavor to stage a comeback. The proletariat must do just the opposite: It must meet head-on every challenge of the bourgeoisie in the ideological field and use the new ideas, culture, customs, and habits of the proletariat to change the mental outlook of the whole of society. At present, our objective is to struggle against and crush those persons in authority who are taking the capitalist road, to criticize and repudiate the reactionary bourgeois academic “authorities” and the ideology of the bourgeoisie and all other exploiting classes and to transform education, literature and art, and all other parts of the superstructure that do not correspond to the socialist economic base, so as to facilitate the consolidation and development of the socialist system.”
The decision thus took the already existing student movement and elevated it to the level of a nationwide mass campaign, calling on not only students but also “the masses of the workers, peasants, soldiers, revolutionary intellectuals, and revolutionary cadres” to carry out the task of “transforming the superstructure” by writing big-character posters and holding “great debates.”
In the intervening years many eyewitnesses have published vignettes of what happened next, including Life and Death in Shanghai by Nien Cheng, Red Scarf Girl by Ji-Li Jiang, and China’s Son by Da Chen. One of the most interesting is Born Red, a fine piece of writing by Gao Yuan. It is a case study in how new in-group/out-group relationships emerged in the supposedly “classless” society that was established in the wake of the Communist victory, how easy it was to inflame them against each other, how seemingly insignificant and incomprehensible differences between them were magnified until they assumed earthshaking importance in the minds of the opposing factions, how loyalty to the in-group inspired acts of fearless bravado, “heroism,” and even martyrdom, and, in the end, how all the resulting chaos and mayhem were finally stopped and society returned to “normal.” In short, the Revolution was an experiment in human psychology on a massive scale, demonstrating the manifestation of an ancient and innate human behavioral trait in a world far different from the one in which it evolved.
The Amity/Enmity Complex describes the interplay of in-groups and out-groups and, of course, Communism has always had its own idiosyncratic out-group: the bourgeoisie. Technically the term refers to the private owners of the social means of production, but it has often been expanded to include peasants who had slightly more land or were slightly more productive and affluent than their neighbors, workers who were somewhat better off than average, people whose houses were larger than a certain size, or anyone else with some kind of real or imagined privilege. So it was that, when the Great Cultural Revolution was launched, it began with the posting of innumerable “dazibao,” or “big character posters,” attacking the “bourgeoisie.” It couldn’t be just a vague, general bourgeoisie. Individuals were needed. The party helped things along with its suggestion that the “criticism” start with “reactionary bourgeois academic authorities.” Thus, teachers and school administrators were among the first victims of the dazibao smears. They were charged with a host of evil traits that have been attributed to out-groups since the dawn of time. For example, they were “impure” and “dirty” by virtue of “bourgeois” parents, grandparents, or other associations. They were the essence of evil by virtue of their opposition to the embodiment of good in the person of Mao and his “revolutionary line.” They were guilty by association with evil incarnate in the person of Chiang Kai-shek and his Guomindang Party. These charges were usually baseless slander, but the “revolutionary masses” of students made them stick. After all, in-groups must have out-groups, even if it’s necessary to invent them out of whole cloth.
Eventually, the in-groups began to turn their wrath against each other. Nothing was easier than to convince themselves that the “others,” too, were “dirty,” “impure,” and “evil” distorters of the pure revolutionary line of Mao, just like the school authorities. They began to “struggle” against each other. Starting with dazibao, the means of “struggle” became ever more violent and destructive, escalating to fists, spears and slingshots, crude armor, homemade grenades, and, eventually, firearms. Captured opponents, people who had formerly been friends, schoolmates and neighbors, were beaten, viciously tortured, maimed, and occasionally killed. The author tells of one young girl who, on the point of being captured by the “enemy,” committed suicide by throwing herself from an upper-story window rather than be “defiled” by contact with the out-group. Anyone who failed to take part in these sanguinary and seemingly senseless battles, or who sought to “desert,” became the target of all the opprobrium traditionally heaped on “traitors.”
And so it continued until Mao, finally tiring of the sport or deciding his political goal of consolidating power had been accomplished, called the whole thing off in 1969. The active phase of the revolution sputtered on for a while, ending for good only with the death of Mao and the arrest of the Gang of Four in 1976. Their mortal deity having passed from the scene, the contending factions forgot all the reasons for their mutual hatred that had formerly seemed of such earth-shattering importance. Disavowed by the powers that had called them into existence, and having no legitimacy but that conferred by a man who was now dead, the in-groups collapsed, and their members disbanded and went back to their “normal” lives. In the epilogue, the author, who had emigrated to America in the meantime, recounts how he went back to visit some of his former enemies and torturers. All acted as if the whole thing had been a bad dream.
We have all seen it happen over and over and over again, across nations, cultures, tribes and societies of all stripes. We have seen the incarnations of the Complex in the form of racism, religious bigotry, anti-Semitism, and countless other “isms.” The details change, but the fundamental nature of the behavior is always the same. Isn’t it time to recognize that five thousand years of recorded history repeating the same phenomenon over and over again is no coincidence? If there is any reason for optimism about the Chinese experience, it is that it was neither inevitable that the Complex become active and virulent as it did, nor was it impossible to suppress and control once people with the necessary authority finally realized how destructive it had become. If that experience is any guide, surely we are intelligent enough to control an innate behavioral trait that exists because it promoted our survival at some point in the distant past, but has now become the most likely source of our potential self-destruction. We cannot, however, effectively control it until we recognize it for what it is, accept its existence, and stop covering our eyes, stopping up our ears, and shouting “just so story” because the Amity/Enmity Complex doesn’t fit in the “nice” world of our fond imaginations. It’s time to end the denial. We’ve graduated far beyond dazibao and slingshots to nuclear weapons. It has become much too dangerous to refuse to understand ourselves in the name of preserving a world that never was.
Posted on January 12th, 2011
There is no such thing as news. There is only narrative. The significance of most of what passes for news is derived from the attention the media pays to it rather than its intrinsic importance. A case in point is the remarkable, ongoing obsession of the news media on both the left and right with the shootings in Arizona. In this case the feeding frenzy was set in motion by the left. Even though there have obviously always been people on both ends of the spectrum who have no life outside of politics, I was still taken aback by their desperate attempts to seize on this issue like so many drowning men grasping at straws. Evidently their resounding defeat in November was even more galling than I imagined. They made no secret of the fact that they were waiting with bated breath for some incident they could construe as evidence of the “violent nature” of the Tea Party movement, conservative talk radio, and the rest of their pet bogeymen. They admitted as much. As their reaction to the shootings makes clear, they were very eager indeed. They’re acting for all the world like so many Communists marching behind the coffin of a murdered “martyr” in days gone by. All that’s missing is the red flags.
Some examples of their overwrought reaction can be found here, here, and here, all based on zero evidence that there was any link whatsoever between the shooter and the Tea Party movement, Sarah Palin, Rush Limbaugh, Mark Levin, or anyone else on the right. The “objective” CNN even went so far as to write a panegyric of Sheriff Dupnik, now infamous for his ham-handed attempts at political exploitation of the murders, portraying him as the soul of wisdom, complete with everything but a birth in a log cabin. I doubt we’ll be seeing more of the same from those quarters, as in the meantime the good sheriff has been giving off such a stench that even the stalwarts of the left have begun holding their noses.
The left’s seizing on this particular straw was, obviously, ill-considered. Not only did they fail to come up with any evidence to back up their accusations, only to find out after the fact that there was none; they also set their own hypocrisy on a pedestal for the right to take pot shots at. After all, the left doesn’t commonly engage its opponents in reasoned discourse. Its fortes have always been demonization, virtuous indignation, and a style of “eliminationist rhetoric” all its own. They gave the other side a perfect opportunity to point that out, as it did with relish, for example, here, here and here.
Little demonstrates the extent to which the left overshot its mark in its crudely insensitive attempts to exploit the Arizona deaths and the grave wounding of Gabrielle Giffords better than the reaction of the foreign media. Germany’s press, for example, is usually reliably leftist, often taking its talking points directly from the New York Times. It is all the more remarkable that the Washington correspondent of Der Spiegel, Marc Hujer, penned an article entitled “America’s Insane Debate,” in which he wrote, among other things,
The very people who got so upset about the tone of debate in the past year, about the rhetoric of the Tea Party, the harsh words of the Right, the unabashed caricatures of Obama as Hitler, are now poisoning the debate themselves with shameless insinuations. Without learning the facts, they seek the guilty behind the attack, and commonly find them on the right, in the Tea Party, in Republican Party chief Michael Steele and Tea Party heroine Sarah Palin.
The language chosen by Sarah Palin and other Tea Partiers was doubtless raw and over the top, but doesn’t come close to providing any proof for the claim that they motivated the shootings in Arizona. Indeed, what is known about the shooter at this point gives no indication that he is a member of the Tea Party movement, or a fan of Palin, or that he has any clear political convictions at all. His favorite books included the Communist Manifesto, Hitler’s Mein Kampf, and Peter Pan, a weird collection. However, there is no indication that his act was motivated by politics.
The massive criticism directed at Sarah Palin is delusional, and not just because it’s a baseless accusation. The attempt to weaken Palin in this way could accomplish the opposite.
That’s strong stuff coming from a source that’s usually reliably critical of the right, in the U.S. as well as in Germany. The left in this country might do well to take heed for their own good. Perhaps more worrisome than their baseless accusations is what they propose as a cure: a further dismantling of the Bill of Rights. In this case their targets are the First and Second Amendments to the Constitution. If the history of the last hundred years is any guide, we have more reason than ever before to continue to fight against any diminishing of those rights.
Posted on January 6th, 2011
Bonobos are the new darlings of the noble savage crowd. Its members were bitterly disappointed by the rest of the great apes, which, as recently as the 1970s, were all supposed to be peaceful, vegetarian, and inoffensive. When Jane Goodall and others started actually observing great apes in the wild and, as chronicled in books such as Wrangham and Peterson’s Demonic Males, found that they occasionally displayed a few less endearing traits, such as hunting and eating meat, rape, infanticide, and the use of weapons in violent border warfare and raiding, true believers in the innate “goodness” of mankind demonstrated their own nobility by subjecting the messengers to furious ad hominem attacks. It didn’t work. Too many observers were reporting the same thing, and the evidence was too compelling.
Enter the Bonobo. They supposedly possess all the “good” traits their close relatives, the chimpanzees, so notably lack. Occasionally their halo will slip. For example, they compete for status, just like the other great apes. Then, too, their hagiographers will occasionally slip up. I was at a lecture about them once at which the speaker sought to emphasize their “feminist” nature. It seems the females in bonobo groups tend to form alliances for self-protection, and to maintain decorum among the males. The speaker recounted how, in one of the groups, an unruly male had attempted some aggressive behavior towards one of the females. She and her pals ganged up on the evil-doer, giving him a thorough drubbing and, in the words of the speaker, nearly tearing his scrotum completely off. Feminism was certainly vindicated by the incident, but the bonobo’s supposedly non-violent nature less so.
Be that as it may, apart from a few such rare lapses, bonobos do seem to be a great deal less violent and generally “demonic” than their close relatives, the chimpanzees. If estimates that they shared a common ancestor as recently as 1.5 to 3 million years ago are correct, it would seem to demonstrate a high degree of flexibility in the evolutionary toolkit pertaining to the innate behavioral traits that characterize humans as well as other animals. On the other hand, it may be that all these observed traits are subject to greater cultural variation within species than previously imagined. Perhaps bonobo groups can be more “demonic” than their observed behavior to date would indicate, and chimps have taken a bum rap and are really capable of more placid behavior under the right conditions.
The inimitable Robert Ardrey drew attention to a few data points to that effect in his The Social Contract, published in 1970. In Chapter 7 of that book he recounted a series of observations of langurs, a leaf-eating monkey widely distributed in India. Carried out by different researchers in different locations and environments, they revealed widely divergent information about the “typical behavior” of langurs. The first, carried out by Phyllis Jay in an area where the creatures are fairly scarce, found that troops of about 25 members occupied ranges of about two square miles and rarely contacted each other. There appeared to be no defended territories, and no evident boundaries between groups. A rigid rank order prevailed within the groups, and serious quarrels were almost non-existent. As Ardrey put it, they
…seemed the ideal, sunny, non-aggressive creatures of legend, and (the) study, completed in the early year of 1959, did much to reinforce the arguments of those primate students that monkeys never fight, never defend territory, never do anything but behave themselves in a fashion rarely glimpsed in human schoolyards. It was a time when we all still said that “langurs are this way.”
Then, however, an account of another study of langurs appeared, carried out this time in Ceylon by Suzanne Ripley. Again quoting Ardrey,
Troops were of about the same size. But nowhere did there exist those infinite distances for the happy, wandering life. The troop’s two square miles of India’s central forests became an eighth of a square mile in Ceylon. And here there were not only territories, with actively defended, unchanging borders; groups sought combat. (Like chimps! Alas, Ardrey never lived to learn the truth about them or the great apes, universally believed to all be truly peaceful, vegetarian, and inoffensive at the time he wrote his books. If only he had known how thoroughly the subsequent revelations about them vindicated his hypotheses. But I digress.) Like the howler and the callicebus, the langur is a noisy monkey. Morning treetop whoops would bring defiant answers from whooping neighbors and mobilization on the border. Ritualized displays might take place, with vast leaps through the trees. But in these combats between groups true fighting could take place, too, with chasing, wrestling, biting, tail-pulling.
But wait, there’s more. Yukimaru Sugiyama of Kyoto University also went to India to study langurs, this time at the highest population density of the three studies, about twice that in the Ceylon study. What he found was what has been described by others as a “behavioral sink.” Again quoting Ardrey,
…disorder was quite nearly perfect. There were territories, but borders were obscure and ill-defended. When troops met, leaders fought unassisted. Neither were there the rigid rank orders of dominance so characteristic of Jay’s widely separated groups. Perhaps as a consequence almost all troops had only one adult male, though there might be six or ten adult females. Sugiyama speculated that without a hierarchy regulating the relationships of males, quarrels were so disruptive that only one male usually remained. The expelled males formed their own groups in the forest.
When the sexual season approached its peak, an all-male gang, …would descend on a troop containing females, kill or drive off the leader and any sub-adult males, and fight among themselves for sexual sovereignty. Far from mourning their departed overlord, the females would respond to the action with sexual stimulation which brought on an immediate peak of copulation with the conqueror. Infants were neglected. And the episode reached its climax when the conqueror bit to death all young.
Readers of Demonic Males will note the remarkable parallels between Sugiyama’s langurs and the behavior of individual “outsider” male gorillas, which will occasionally raid a silverback’s troop, seeking to kill its infants. If the raider succeeds, the mother may follow him as her new overlord!
But the upshot of the story is that the behavior of a given species of primate can vary widely depending on environmental conditions. Innate behavior does not imply “deterministic” behavior. It merely constrains the potential paths behavior can take. May not the same phenomena observed among langurs be possible in the great apes? Given the right conditions, could there be peaceful chimps and violent bonobos? What about our own species? Human populations that have been peaceful for generations have often become incredibly violent in a relatively short time. Why? We need to learn. There is no more important study than the study of our own nature. Our survival depends on learning who we are.
Posted on January 5th, 2011
There is nothing more important for us to learn and understand than our own nature. Human nature, by which I mean our innate behavioral traits, does not determine human history, but it constrains it. Anyone aware of those fundamentally emotional traits, related although certainly not identical versions of which exist in many other animals, would have realized that Communism was a non-starter. The Communists and their intellectual fellow travelers fondly believed that their noble experiment would be immune from such hard-wired features of our mental equipment as ingroup-outgroup behavior and the inevitable competition for status and power in human groups, whether they be political parties, “classes,” or social clubs. It was not. As E. O. Wilson so accurately observed concerning Communism, “Great theory, wrong species.”
Communism was a costly experiment. In the attempt to apply it, countries like Russia and Cambodia virtually decapitated themselves. Given its cost, it would behoove us to learn from it. I see very little happening along those lines. The whole phenomenon is fading from living memory, and the historical facts relating to its spectacular rise to prominence as the greatest secular religion of all time, its brutal and bloody reality, and its eventual collapse are all becoming dim as they recede into the mists of time. I can think of no history that it would be more important for our children to learn, but I doubt that more than one in a hundred of our high school students knows who Lenin actually was, let alone the basic tenets of Marxist philosophy.
One lesson we should surely learn is humility. We are not really an intelligent species. We are just smarter than the rest. Our powers of self-delusion are, nevertheless, phenomenal. In the wake of the Great Depression, a whole generation of some of the best and brightest intellectuals among us managed to bamboozle themselves, in spite of copious evidence to the contrary existing at the time, into believing that Communism was both humane and the inevitable future of mankind. Read the pages of such journals as The New Republic, the Nation, and the American Mercury after H. L. Mencken turned over the editorship to Charles Angoff, and you’ll see what I mean. There were certainly a few more sober heads among them, but many of the most prominent political thinkers were cocksure that the Depression proved beyond any reasonable doubt that capitalism had reached a dead end, and the only remaining question regarding the transition to socialism was how it would occur. There are countless examples of this mindset, well known to anyone familiar with the history of the time. One of the more obscure but illustrative examples was published in 1933 by Elias Tobenkin in his book, Stalin’s Ladder. Here are some vignettes from that work:
(The criminologist) leaves the Soviet Union with a heartening sense of having witnessed something new under the sun. Soviet prisons and the Soviet penal system open a novel and inspiring chapter in the relations between society and the criminal. Soviet Russia is successfully coping with the age-old problem of crime and punishment on the basis of a complete transformation of prison life and a complete reversal of the old attitude of vindictiveness toward the individual offender.
With the antiquated prison system there went by the board the practices of corporal punishment, of solitary confinement and of the “iron bags” – the vault-like individual cells that gave the Czarist prisons their stark appellation of the “House of the Dead.”
The conception of punishment, of revenge upon the criminal has been outlawed. In a decree issued in March, 1919, the Communist party ordered the country’s prisons to be transformed into educational institutions. Confinement in such an institution was declared to be an “economic corrective,” for the purpose of educating the offender “in the discipline common to all workers.”
The prisoner has his whole life recut and reshaped during his period of confinement. He gets a complete overhauling physically, mentally, and psychically. His emotions are drained of past bitterness and disappointments and attuned to a course of labor and peace with his fellows, with the world.
…and much more of the same. Solzhenitsyn’s The Gulag Archipelago and Eugenia Ginzburg’s Journey into the Whirlwind were yet to appear, but there were already many published accounts available of the reality of the Soviet forced labor camps by lesser known authors with firsthand knowledge at the time Tobenkin was writing his book. They were ignored by those who should have known better, swamped by a vast wave of confirmation bias, and trivialized by phrases like, “you have to break a few eggs to make an omelet.”
It’s easy to gain a sense of intellectual hubris in reading the countless similar examples of delusional self-deception published at the time by the likes of George Bernard Shaw and H. G. Wells, not to mention such lesser intellectual lights as Tobenkin. We would do well to resist the urge. In matters touching the Soviet prison system we have the familiar advantage of Monday morning quarterbacks. Not so concerning the political and intellectual controversies of our own day. The true believers in the political narratives of the left and the right are just as cocksure they have a monopoly on the truth as ever the likes of Shaw or Wells were in their own day. History is likely to prove them just as delusional.
The truth is elusive to minds as limited as ours. It is best to retain a due sense of intellectual humility, and refrain from wandering too far from the domain of repeatable experiments into the realm of unfalsifiable speculation. Otherwise we are just so many Tobenkins waiting to happen.
Posted on January 2nd, 2011
Trying to learn some history, but the relentless political correctness of leftist authors makes you nauseous? You need a change of pace. Try Modern Times by Paul Johnson. It’s still politically correct, but it’s the masculine, almost Rhodesian political correctness of the right instead of the pecksniffing, pathologically pious political correctness of the left. In accordance with the rules of modern historiography, all the important players are sorted into bad guys and good guys, but the roles are reversed: Harding and Coolidge are good guys, and Roosevelt and the New Dealers are bad guys.
According to Johnson, the bad guys became evil because they abandoned Judeo-Christian morality, source of such uplifting triumphs of the objective good as the hanging and burning of several hundred thousand “witches,” centuries of genocidal attacks on the Jews, and religious wars beyond counting. The bad guys, in other words, are all “moral relativists.” In case you’re wondering what a “moral relativist” is, the Stanford Encyclopedia of Philosophy has a very good article about it. In short, a “moral relativist” is anyone who differs with you touching matters of morality. As the Stanford Encyclopedia puts it:
Moral relativism has the unusual distinction—both within philosophy and outside it—of being attributed to others, almost always as a criticism, far more often than it is explicitly professed by anyone.
Johnson’s own version of “objective morality” at least has the virtue of being idiosyncratic. Where academic leftists would use the phrase, “You have to break some eggs to make an omelet” to rationalize the crimes of Stalin, Johnson would be more likely to use it to rationalize Franco’s shooting of more than 150,000 helpless victims after his victory in the Spanish Civil War. He speaks of the matter as if it were a mere bagatelle, and, after all, Franco was an upholder of “Judeo-Christian morality.”
Be that as it may, Modern Times is no hack journalist’s history. Johnson has a profound knowledge of the events he describes, and has much of interest to say about the intellectual currents and personalities of the 20th century. Any history with such a broad scope is bound to transmute complex human beings into wooden dummies. That’s not Johnson’s fault. Given the nature of his book, he’s done his job if he at least points them out to you. Finding the human being beneath the wooden shell is always something you’ll have to do on your own.
Posted on January 1st, 2011
According to a favorite argument of religious believers, God must exist because otherwise the physical universe with all its wonders would be inexplicable. I have always considered it a very powerful argument against His existence that such arguments leave you with an even bigger problem. If you can’t accept the existence of the universe without a Creator, why do you accept the existence of a Creator to begin with? He must necessarily be even more complex and inexplicable than that which he created. In other words, you don’t gain anything by positing the existence of something more complex to explain something less complex. Jean Meslier used the argument in his Testament, and Richard Dawkins and others have included it in more recent works.
Muslims and some Christians use divine inspiration, or faith, to get around the argument. In the more extreme, Muslim version, God decided in advance who would have faith and who not. He created unbelievers in such a way that their minds would be hardened against faith in Him, and for the “sin” of being created that way, He intends to burn them forever. It’s all set forth very explicitly in the Koran.
However, Christians who imagine themselves more sophisticated than the rest, apparently never having read the bit in Matthew 18:3 about the impossibility of entering the kingdom of heaven except as a little child, have more “complex” arguments. One such is Paul Wallace, who set forth a version thereof at the website of Religion Dispatches.
Wallace begins with the well-worn argument that, if you don’t believe in God, you’re really just a religious horse of a different color. In his words,
The atheisms of most committed, principled atheists are often not more than mirror images—inversions—of the theisms they negate.
By that logic, if you don’t believe in fairies, you belong to the “anti-fairy cult,” and if, having never read Virginia’s letter, you’ve lost faith in Santa, you’re a zealot in the “anti-Santa” religion. Winston in Orwell’s 1984 was presumably a fundamentalist religious fanatic because he insisted he counted only four fingers instead of five when his torturer held up his hand.
Wallace is just warming up, though. Citing Yale theology professor Denys Turner, he explains that, if you don’t see the fifth finger, you’re just not trying hard enough:
Turner also writes that, very often, the theisms attacked by atheists are not very interesting; therefore, the atheisms of most committed, principled atheists are not very interesting. Why this is so is not clear; perhaps it is because in many cases theism was abandoned before it was allowed time to develop into something of substance.
He then focuses on the version of the argument presented in Richard Dawkins’ The God Delusion -
In The God Delusion, Dawkins presents his central argument against the existence of God in the fourth chapter. His thinking goes something like this: The universe is a complex thing. Therefore the God of the Christians, who, Christians say, made the universe, must be at least as complex as the universe God made. Therefore we are left with an even bigger problem than before: Who made this ultra-complex God? A hyper-complex megaGod? It makes plain sense, according to Occam’s razor, to stop before we get to the first God. The complex universe is enough. Ergo, in all likelihood, God does not exist.
This argument, which boils down to Well, who made God, then?, assumes that God is a thing like any other thing. It assumes that God must exist in the same way the moon exists, in the same way Dawkins himself exists. As Terry Eagleton wrote in his now-infamous review of The God Delusion, Dawkins seems to think that God is “a celestial super-object or divine UFO,” a creature like other creatures, only bigger and smarter: a kind of überthing, but a thing nonetheless.
But nowhere does Dawkins get outside of himself and ask, Is my assumption that God is a thing like any other thing really necessary? On what is this assumption grounded? Where did it come from?
I’m no fan of Dawkins. As I’ve mentioned elsewhere, I was not enthralled by his quasi-racist anti-American ranting about the “U.S. Taliban” and overt bigotry against Christian fundamentalists in The God Delusion. Be that as it may, his argument doesn’t depend on God being a thing like other things. It only requires that God is a thing, as opposed to nothing. Nowhere does Dawkins suggest that God is a thing like other things, but merely that, whatever sort of thing He is imagined to be, if He is the creator, He must necessarily be more complex than that which He created. As a result, whatever kind of thing believers of whatever stripe might imagine Him to be, the argument that He must exist because otherwise the remarkable physical world we see around us could not exist becomes absurd. It is assuming something more complicated to explain something less complicated. It doesn’t solve anything. Wallace, however, demurs:
What is at issue here is, Dawkins refuses to examine the ground on which he stands: science itself. That is, Dawkins may change his mind about evolution, but nothing will change his mind about science. He will never question—in a serious way—the sufficiency of science as a guide to truth.
Here we see the familiar portrayal of “science” as a religious belief. In fact, it is nothing of the sort, but merely a systematic way of discovering and acquiring knowledge. There is nothing mystical about the word “science” at all. It is simply one way of reasoning about what is true. Continuing with Wallace:
He will never question—in a serious way—the sufficiency of science as a guide to truth. Perhaps he thinks the success of science makes it a self-evident choice when it comes to grounding his worldview; what he does not and will not consider is the very real possibility that science is so successful precisely because it is so limited. To reject this possibility out-of-hand is nothing but intellectual laziness. Dawkins is dogmatically rigid and fixed in place. He is a fundamentalist.
Fine. Science is limited. However, Christian fundamentalism, an “easy target,” is also limited. Dawkins just wasn’t aiming high enough. Forget the Christians as “little children” meme. If you want to “see through” his argument, it’s going to take some serious mental gymnastics. Wallace describes the process in terms of four levels of “God-talk,” with the third being the most important. Let’s let him explain:
The third level is the most difficult but the most important. This is second-order negation, or the inversion of the inversion. Here we would say, “God is not a fire, but God is not a not-fire either,” and “God is not love, but neither is God not-love.” God transcends the (human-based) distinction between love and not-love.
Also on this third level is found the insistence, made for centuries by theologians throughout Christendom, that God transcends the distinction of being and not-being. Therefore, if we use the conventional definition of existence, God does not exist. Our category of existence does not apply to God. Put another way, the word “exist” cannot be used univocally of things and God. These are artificial categories imagined and used by human beings; they are manifestly not divine attributes. In the end, to speak correctly, there are no divine attributes. Which means that God is not distinct from creation, nor is God not-distinct from creation. That is, in God there is no distinction at all, nor is there non-distinction. No affirmation or denial properly applies to God.
Or, in other words, God is neither a thing nor nothing. This is very convenient for believers, because it puts their God out of reach of logic. By the same token, I can say that fairies, Santa, or the Great Green Grasshopper God are neither thing nor nothing, and no one can prove they don’t exist.
But atheists say that Christianity is false, that God does not exist. Asking them to defend their position in light of mature theology is doing nothing but taking them at their word and respecting their intelligence.
So atheists are wrong because, like Winston and his four fingers, they can’t imagine an entity that is neither a thing nor nothing. Wallace assigns them the task of disproving the existence of that entity, but without using language, because that would be too deceptive, and without reasoning, because that which is outside the union of “thing” and “nothing” is also outside the realm of rational argument. If they fail, then, voilà, the existence of God is proved! Of course, the author realizes he’s walking on thin ice. He admits as much:
Also, one may say that negative theology is content-free and useless because it nullifies the use of rational thought. In a sense this is a valid argument. But one can go beyond negative theology while bearing in mind its lessons. In fact, negative theology constitutes the central nervous system, if you will, of the entire Summa Theologica of Thomas Aquinas that Dawkins so happily and ignorantly mocks. In this work, Thomas employs analogical language in order to speak freely of God’s attributes without the possibility of confusing them with the attributes of, say, fire or kingship or love or being.
Since it’s obviously impossible to believe in an un-thing, the author, after assuring us that God is neither thing nor nothing, is suddenly speaking of Him as an object with attributes. I, and I daresay anyone else who speaks English fluently, would call an object with attributes a thing.
This is one of the most powerful aspects of negative theology: It cleanses the mind not only of assumptions about God, but of idols (like science, say) that can so easily replace God.
Here again the author assigns some mystical quality to “Science.” As noted above, science is just systematic reasoning. What the above amounts to is the claim that anyone who dares to use their brain as something other than inert stuffing for their skull is an “idolater.”
We are required to have faith in no thing at all; only then will our faith have any chance of finding its true home in God.
There are, of course, different flavors of this “no thing.” The author should take care that he has faith in the right “no thing.” If it turns out that the Moslem “no thing” is the real one, he’ll be spending quadrillions and quintillions of years sizzling in hell, and that’s just for starters. I will leave that to the competing “no things” to sort out among themselves. Poor, deluded atheist that I am, I am left by all these arguments in direr straits than before. I will certainly end up frying in the afterlife regardless unless, without relying on logic or language, I somehow manage to figure out what “no thing” is, and that with alacrity, I being no longer the youngest. I gather from what the author is telling me that this will only be possible by virtue of reading Thomas Aquinas and a voluminous stack of other religious tomes. I suspect that such fare may not really be the path to divine enlightenment. Rather, it seems more likely that the author has been left in more or less the same condition by reading his own pile of books about religion as Don Quixote was left by reading a pile of books about knight errantry. Miguel de Cervantes provides a detailed psychological description in the first chapter of his famous account of that gentleman.
While I strongly suspect that Wallace is as deluded in matters of religion as Don Quixote was touching knights in shining armor, I am content to let him believe whatever he chooses as long as he accords the same right to me, and does not conclude, as so many others have done in the past, that his “no thing” requires him to burn people, or launch wars against those who believe in other “no things,” or fly airplanes into buildings on behalf of the “no thing”, or that the state should serve as an interpreter of the will of the “no thing.” As long as we’re clear about those things we should be able to coexist.