The world as I see it
  • Ingroups and Outgroups and Sir Arthur Keith – Adventures in the Bowdlerization of History

    Posted on February 16th, 2019 by Helian

    There is no more important aspect of human nature than our tendency to perceive others in terms of ingroups and outgroups. Without an awareness of its existence and its power it is impossible to understand either our history or many of the critical events that are happening around us today. A trait that probably existed in our ancestors millions of years ago, it evolved because it promoted our survival when our environment and way of life were radically different from what they are now. In the context of current human technologies and societies, it often appears to have become wildly dysfunctional. We can distinguish ingroup from outgroup based on the subtlest of differences. That worked fine when we all lived in small groups of hunter-gatherers. The outgroup was always just the next group over. Today the same mental equipment for identifying the outgroup has resulted in endless confusion and, in many cases, disaster. The only way out lies in self-understanding, but as a species we exhibit an incorrigible resistance to knowing ourselves.

    In my last post I commented on the foibles of an ingroup of intellectuals whose “territory” was defined by ideology. I’m sure they all believed their behavior was entirely rational, but they had no clue what was going on as they reacted to a “turncoat” and “heretic” in the same way that ingroups have done for eons. Had they read a seminal book by Sir Arthur Keith entitled A New Theory of Human Evolution, they might have had at least an inkling about the real motivation of their behavior. Published in 1948, the book was of critical importance, not just because it addressed the question of ingroups and outgroups, but because of Keith’s sure feel for the aspects of human behavior that really matter, and for his forthright and undaunted insistence on the existence and importance of innate human nature. He was certainly not infallible. What scientist is? He believed the Piltdown skull was real until it was finally proved a hoax just before he died. Some of what he had to say about human behavior has stood the test of time and some hasn’t. However, his hypotheses about ingroups and outgroups definitely belong in the former category, along with many others. There is no question that they were closer to the truth than the Blank Slate dogmas that already served as holy writ for most of the so-called behavioral scientists of the day.

    Today there are few original copies of his book around, although some are offered at Amazon as I write this. However, it is available online, and reprints are available elsewhere. It is a must read if you are interested in human behavior, and even more so if you are interested in the history of the behavioral sciences in general and the Blank Slate in particular. Unfortunately, most of the accounts of that history that have appeared in the last 50 years or so are largely fairy tales, concocted either to deny or “embellish” the reality that the Blank Slate was the greatest scientific catastrophe of all time. If you want to know what really happened, there is no alternative to consulting the source material yourself. One of the biggest fairy tales is that the man who played the greatest single role in demolishing the Blank Slate, Robert Ardrey, was “totally and utterly wrong.” In fact, Ardrey was “totally and utterly right” about the main theme of all his books; that human nature is both real and important. He insisted on that truth in the teeth of the Blank Slate lies that had been swallowed by virtually every “behavioral scientist” of his day.

    Ardrey had an uncanny ability to ferret out scientists whose work actually did matter. Sir Arthur Keith was no exception. What he had to say about Keith and his take on ingroup/outgroup behavior was far more eloquent than anything I could add. For example,

    In his last two books, Essays on Human Evolution in 1946 and A New Theory of Human Evolution in 1948, Keith took the final, remorseless step which his thinking had made inevitable. Conscience, he affirmed, is simply that human mechanism dictating allegiance to the dual code. Those who assert that conscience is inborn are therefore correct. But just how far does conscience compel our actions in such an ultimate direction as that of the brotherhood of man? Not far. Conscience is the instrument of the group.

    Human nature has a dual constitution; to hate as well as to love are parts of it; and conscience may enforce hate as a duty just as it enforces the duty of love. Conscience has a two-fold role in the soldier: it is his duty to save and protect his own people and equally his duty to destroy their enemies… Thus, conscience serves both codes of group behavior: it gives sanction to practices of the code of enmity as well as of the code of amity.

    These were Keith’s last words on the subject. If the grand old man had any noteworthy capacities for self-delusion, they escape the eye. And when he died a few years later, at the age of ninety, with him ended truth’s brief history. His thoughts by then were overwhelmed by the new romanticism (the Blank Slate, ed.) when falsehood came to flower: his sentiments were condemned by that academic monopoly which substituted high-mindedness for the higher learning. And as for almost twenty years no one followed C. R. Carpenter (a primatologist who published some “inconvenient truths” about the behavior of monkeys and apes in the field, anticipating the revelations of Goodall and others, ed.) into the rain forest, so for almost twenty years none has followed Sir Arthur Keith into the jungle of noble intentions.

    Beautifully said by the great nemesis of the Blank Slate. Ardrey had much else to say about both Keith and the history of hypotheses about ingroup/outgroup behavior in Chapter 8, “The Amity-Enmity Complex” of his The Territorial Imperative. If you’re looking for source material on the history of the Blank Slate, Ardrey’s four books on human nature wouldn’t be a bad place to start. They’re certainly more accurate than Pinker’s fanciful “history” of the affair. Keith himself was certainly aware of Blank Slate ideologues and their “academic monopoly.” However, he had a naïve faith that, if he only told the truth, he would eventually be vindicated. A hint about the extent to which that faith was realized can be gleaned by perusing the Wiki entry about him, which dismisses him into the realm of unpersons with the usual hackneyed claim of the pathologically pious that he was a “racist,” along with a gleeful notice that he was taken in by the Piltdown skull.

    When it comes to the bowdlerization of history, by all means, have a look at the Wiki entry on “Ingroups and outgroups” as well. The most shocking thing about it is the thought that its author might actually believe what he’s written. We learn, for example, that “The terminology was made popular by Henri Tajfel and colleagues during his work in formulating social identity theory.” One wonders whether to laugh or despair on reading such absurdities. The idea that the history of what Ardrey referred to as the “Amity-Enmity Complex” began with some inconsequential “study” done by a Polish psychologist back in 1971 is beyond ludicrous. That’s just one of the reasons why it’s important to read such essential bits of source material as Keith’s book. He actually presents an accurate account of the history of this critical aspect of human behavior. For example,

    In brief, I hold that from the very beginning of human evolution the conduct of every local group was regulated by two codes of morality, distinguished by Herbert Spencer as the “code of amity” and the “code of enmity.”

    Spencer wrote extensively about the subject in his Principles of Ethics, which appeared in 1892, nearly 80 years before the subject “was made popular” in Tajfel’s “study.” Unfortunately, he also noted the fallacies behind the then fashionable versions of socialism in another of his books, and gave reasons that governments based on them would fail that were amply confirmed by the history of the next hundred years. For that, he was viciously condemned as a “Social Darwinist” by the socialist true believers. The moniker has stuck to this day, in spite of the fact that Spencer was never even a “Darwinist” to begin with. He certainly had his own theories of evolution, but they were much closer to Lamarckism than Darwinism. In any case, Keith continues,

    As a result of group consciousness, which serves to bind the members of a community together and to separate the community from all others, “there arises,” to use the words of Professor Sumner, “a differentiation between ourselves – the ‘we’ group or ‘in’ group – and everybody else – the ‘out’ group.”

    The passage Keith refers to appeared in Folkways, published by Prof. William Graham Sumner in 1906, also somewhat earlier than the good Prof. Tajfel’s study. Of course, studies by learned professors of psychology are not necessary to document ingroup/outgroup behavior. Just read a little history. Look around you. Can one really understand the furious hatred of Trump by so many highly educated academics and intellectuals absent a grasp of this aspect of human behavior? Are racism, anti-Semitism, religious bigotry, hatred of the “bourgeoisie” or other versions of the “class enemy,” or any of the other myriad versions of outgroup identification that have been documented in our history best understood as the acts of “evil” people, who apparently get up every morning wracking their brains to decide what bad deeds they can do that day, or are they better understood as manifestations of the type of innate behavior described by Prof. Keith? I personally lean towards the latter explanation. Given the incredibly destructive results of this aspect of our behavior, would it not be advisable for our “experts” in evolutionary psychology to devote a bit more attention to it, as opposed to the more abstruse types of sexual behavior by which they now seem to be so fascinated? No doubt it would annoy the hardcore Blank Slaters who still haunt academia, but on the other hand, it might actually be useful.

    Sir Arthur had much more to say about the evolution of human nature, including that great tool of historical obfuscation, “group selection.” But that’s a matter best left to another day.

  • A New York Intellectual’s Unwitting Expose; Human Nature Among the Ideologues

    Posted on February 11th, 2019 by Helian

    Norman Podhoretz is one of the New York literati who once belonged to a group of leftist intellectuals he called the Family. He wrote a series of books, including Making It, Breaking Ranks, and Ex-Friends, describing what happened when he underwent an ideological metamorphosis from leftist radical to neoconservative. In the process he created a wonderful anthropological study of human nature in the context of an ingroup defined by ideology. Behavior within that ingroup was similar to behavior within ingroups defined by race, class, religion, ethnicity, or any of the other often subtle differences that enable ingroups to distinguish themselves from the “others.” The only difference was that, in the case of Podhoretz’ Family, the ingroup was defined by loyalty to ideological dogmas. Podhoretz described a typical denizen as follows:

    Such a person takes ideas as seriously as an orthodox religious person takes, or anyway used to take, doctrine or dogma. Though we cluck our enlightened modern tongues at such fanaticism, there is a reason why people have been excommunicated, and sometimes even put to death, by their fellow congregants for heretically disagreeing with the official understanding of a particular text or even of a single word. After all, to the true believer everything important – life in this world as well as life in the next – depends on obedience to these doctrines and dogmas, which in turn depends on an accurate interpretation of their meaning and which therefore makes the spread of heresy a threat of limitless proportions.

    This fear and hatred of the heretic, together with the correlative passion to shut him up one way or the other, is (to say the least, and in doing so I am bending over backward) as much a character trait of so-called liberal intellectuals as it is of conservatives… For we have seen that “liberal” intellectuals who tell us that tolerance and pluralism are the highest values, who profess to believe that no culture is superior to any other, and who are on that account great supporters of “multiculturalism” will treat these very notions as sacred orthodoxies, will enforce agreement with them in every venue in which they have the power to do so (the universities being the prime example at the moment), and will severely punish any deviation that dares to make itself known.

    Podhoretz may not have been aware of the genetic roots responsible for such behavior, but he was certainly good at describing it. His description of status seeking, virtue signaling, hatred of the outgroup, allergic reaction to heretics, etc., within the Family would be familiar to any student of urban street gangs. As anthropological studies go, his books have the added advantage of being unusually entertaining, if only by virtue of the fact that his ingroup included such lions of literature as Norman Mailer, Hannah Arendt, Mary McCarthy, Allen Ginsberg, and Lionel Trilling.

    Podhoretz was editor of the influential cultural and intellectual magazine Commentary from 1960 to 1995. When he took over, the magazine already represented the anti-Communist Left. However, he originally planned to take a more radically leftist line, based on the philosophy of Paul Goodman, a utopian anarchist. In his Growing Up Absurd, Goodman claimed that American society was stuck with a number of “incomplete revolutions.” To escape this “absurdity” it was necessary to complete the revolutions. Podhoretz seized on Goodman’s ideas as the “radical” solution to our social ills he was seeking, and immediately started a three-part serialization of his book in Commentary. Another major influence on Podhoretz at the time was Life Against Death by Norman O. Brown, a late Freudian tract intended to reveal “the psychoanalytical meaning of history.” It is depressing to read these books today in the knowledge that they were once taken perfectly seriously by people who imagined themselves to be the cream of the intellectual crop. Goodman certainly chose the right adjective for them – “absurd.”

    In any case, as the decade wore on, the Left did become more radicalized, but not in the way foreseen by Podhoretz. What was known then as the New Left emerged, and began its gradual takeover of the cultural institutions of the country, a process that has continued to this day. When he came of age, most leftists had abandoned the Stalinism or Trotskyism they had flirted with in the 30’s and 40’s, and become largely “pro-American” and anti-Communist as the magnitude of the slaughter and misery in the Soviet Union under Stalin became impossible to ignore. However, as the war in Vietnam intensified, the dogs returned to their vomit, so to speak. Leftists increasingly became useful idiots – effectively pro-Communist whether they admitted it or not. As Israel revealed its ability to effectively defend itself, they also became increasingly anti-Semitic, a development that continues to this day. Then, as now, anti-Semitism was fobbed off as “anti-Zionism,” but Podhoretz, a Jew, as were many of the other members of the Family, was not buying it. He may have been crazy enough to take Goodman and Brown seriously, but he was not crazy enough to believe that it was preferable to live in a totalitarian Communist state than in the “imperialist” United States, nor, in light of the Holocaust, was he crazy enough to believe that the creation of a Jewish state was “unjust.” In the following passage he describes his response when he first began to notice this shift in the Zeitgeist, in this case on the part of an erstwhile “friend”:

    I was not afraid of Jason. I never hesitated to cut him off when he began making outrageous statements about others, and once I even made a drunken public scene in a restaurant when he compared the United States to Nazi Germany and Lyndon Johnson to Hitler. This comparison was later to become a commonplace of radical talk, but I had never heard it made before, and it so infuriated me that I literally roared in response.

    Today, of course, one no longer roars. One simply concludes that those who habitually resort to Hitler comparisons are imbeciles, and leaves it at that. In any case, Podhoretz began publishing “heretical” articles in Commentary, rejecting these notions, and nibbling away at the shibboleths that defined what had once been his ingroup in the process. In the end, he became a full-blown neoconservative. The behavioral responses to Podhoretz’s “treason” to his ingroup should be familiar to all students of human behavior. His first book length challenge to his ingroup’s sense of its own purity and righteousness was Making It, published in 1967. As Podhoretz recalls,

    In an article about Making It and its reception that was itself none too friendly to the book, Norman Mailer summed up the critical response as “brutal – coarse, intimate, snide, grasping, groping, slavering, slippery of reference, crude and naturally tasteless.” But, he added, “the public reception of Making It was nevertheless still on the side of charity if one compared the collective hooligan verdict to the earlier fulminations of the Inner Clan.” By the “Inner Clan,” Mailer meant the community of New York literary intellectuals I myself had called the Family. According to Mailer, what they had been saying in private about Making It even before it was published made the “horrors” of the public reception seem charitable and kind. “Just about everyone in the Establishment” – i.e., the Family – was “scandalized, shocked, livid, revolted, appalled, disheartened, and enraged.” They were “furious to the point of biting their white icy lips… No fate could prove undeserved for Norman, said the Family in thin quivering late-night hisses.”

    Podhoretz notes that academia was the first of the cultural institutions of the country to succumb to the radical Gleichschaltung that has now established such firm control over virtually all the rest, to the point that it has become the new “normalcy.” In his words,

    For by 1968 radicalism was so prevalent among college students that any professor who resisted it at the very least risked unpopularity and at the worst was in danger of outright abuse. Indeed it was in the universities that the “terror” first appeared and where it operated most effectively.

    By the late 60’s the type of behavior that is now ubiquitous on university campuses was hardly a novelty. “De-platforming” was already part of the campus culture:

    By 1968 SDS (the leftist Students for a Democratic Society) had moved from argument and example to shouting down speakers with whom it disagreed on the ground that only the “truth” had a right to be heard. And it also changed its position on violence… and a number of its members had gone beyond advocacy to actual practice in the form of bombings and other varieties of terrorism.

    As Podhoretz documents, the War in Vietnam had originally been supported, and indeed started and continued by intellectuals and politicians on the left of the political spectrum. He noted that Robert Kennedy had been prominent among them:

    Kennedy too then grew more and more radicalized as radicalism looked more and more like the winning side. Having been one of the architects of the war in Vietnam and a great believer in resistance to Communist power in general, he now managed to suggest that he opposed these policies both in the small and in the large.

    However, in one of the rapid changes in party line familiar to those who’ve read the history of Communism in the Soviet Union and memorialized by George Orwell in 1984, the hawks suddenly became doves:

    …a point was soon reached where speakers supporting the war were either refused a platform or shouted down when they attempted to speak. A speaker whose criticisms were insufficiently violent could even expect a hard time, as I myself discovered when a heckler at Wayne State in Detroit accused me, to the clear delight of the audience, of not being “that much” against the war because in expressing my opposition to the American role I had also expressed my usual reservations about the virtues of the Communist side.

    Of course, there was no Internet in the 60’s, so “de-platforming” assumed a form commensurate with the technology available at the time. Podhoretz describes it as follows:

    The word “terror,” like everything else about the sixties, was overheated. No one was arrested or imprisoned or executed; no one was even fired from a job (though there were undoubtedly some who lost out on job opportunities or on assignments or on advances from book publishers they might otherwise have had). The sanctions of this particular reign of “terror” were much milder: one’s reputation was besmirched, with unrestrained viciousness in conversation and, when the occasion arose, by means of innuendo in print. People were written off with the stroke of an epithet – “fink” or “racist” or “fascist” as the case might be – and anyone so written off would have difficulty getting a fair hearing for anything he might have to say. Conversely, anyone who went against the Movement party line soon discovered that the likely penalty was dismissal from the field of discussion.

    Seeing others ruthlessly dismissed in this way was enough to prevent most people from voicing serious criticisms of the radical line and – such is the nature of intellectual cowardice – it was enough in some instances to prevent them even from allowing themselves to entertain critical thoughts.

    The “terror” is more powerful and pervasive today than it ever was in the 60’s, and its ability to “dismiss from the field of discussion” is far more effective. As a result, denizens of the leftist ingroup or those who depend on them for their livelihood tend to be very cautious about rocking the boat. That’s why young, pre-tenure professors include ritualistic denunciations of the established heretics in their fields before they dare to even give a slight nudge to the approved dogmas. Indeed, I’ve documented similar behavior by academics approaching retirement on this blog, so much do they fear ostracism by their own “Families.” Podhoretz noticed the same behavior early on by one of his erstwhile friends:

    As the bad boy of American letters – itself an honorific status in the climate of the sixties – he (Norman Mailer) still held a license to provoke and he rarely hesitated to use it, even if it sometimes meant making a fool of himself in the eyes of his own admirers. But there were limits he instinctively knew how to observe; and he observed them. He might excoriate his fellow radicals on a particular point; he might discomfit them with unexpected sympathies (for right-wing politicians, say, or National Guardsmen on the other side of a demonstration) and equally surprising antipathies (homosexuality and masturbation, for example, he insisted on stigmatizing as vices); he might even on occasion describe himself as (dread word) a conservative. But always in the end came the reassuring gesture, the wink of complicity, the subtle signing of the radical loyalty oath.

    So much for Podhoretz’s description of the behavioral traits of the denizens of an ideologically defined ingroup. I highly recommend all three of the books noted above, not only as unwitting but wonderfully accurate studies of “human nature,” but as very entertaining descriptions of some of the many famous personalities Podhoretz crossed paths with during his long career. One of them was Jackie Kennedy, who happened to show up at his door one day in the company of his friend, Richard Goodwin, “who had worked in various capacities for President Kennedy.”

    She and I had never met before, but we seemed to strike an instant rapport, and at her initiative I soon began seeing her on a fairly regular basis. We often had tea alone together in her apartment on Fifth Avenue where I would give her the lowdown on the literary world and the New York intellectual community – who was good, who was overrated, who was amusing, who was really brilliant – and she would reciprocate with the dirt about Washington society. She was not in Mary McCarthy‘s league as a bitchy gossip (who was?), but she did very well in her own seemingly soft style. I enjoyed these exchanges, and she (an extremely good listener) seemed to get a kick out of them too.

    Elsewhere Podhoretz describes McCarthy as “our leading bitch intellectual.” Alas, she was an unrepentant radical, too, and even did a Jane Fonda in North Vietnam, but I still consider her one of our most brilliant novelists. I guess there’s no accounting for taste when it comes to ingroups.

  • Robert Plomin’s “Blueprint”: The Reply of the Walking Dead

    Posted on January 30th, 2019 by Helian

    The significance of Robert Plomin’s Blueprint is not that every word therein is infallible. Some reviewers have questioned his assertions about the relative insignificance of the role that parents, schools, culture, and other environmental factors play in the outcome of our lives, and it seems to me the jury is still out on many of these issues. See, for example, the thoughtful review of Razib Khan in the National Review. What is significant about it is Plomin’s description of new and genuinely revolutionary experimental tools of rapidly increasing power and scope that have enabled us to confirm beyond any reasonable doubt that our DNA has a very significant influence on human behavior. In other words, there is such a thing as “human nature,” and it is important. This truth might seem obvious today. It is also a fact, however, that this truth was successfully suppressed and denied for over half a century by the vast majority of the “scientists” who claimed to be experts on human behavior.

    There is no guarantee that such scientific debacles are a thing of the past. Ideologues devoted to the quasi-religious faith that the truth must take a back seat to their equalist ideals are just as prevalent now as they were during the heyday of the Blank Slate. Indeed, they are at least as powerful now as they were then, and they would like nothing better than to breathe new life into the flat earth dogmas they once foisted on the behavioral sciences. Consider, for example, a review of Blueprint by Nathaniel Comfort entitled “Genetic determinism rides again,” that appeared in the prestigious journal Nature. The first paragraph reads as follows:

    It’s never a good time for another bout of genetic determinism, but it’s hard to imagine a worse one than this. Social inequality gapes, exacerbated by climate change, driving hostility towards immigrants and flares of militant racism. At such a juncture, yet another expression of the discredited, simplistic idea that genes alone control human nature seems particularly insidious.

    Can anyone with an ounce of common sense, not to mention the editors of a journal that purports to speak for “science,” read such a passage and conclude that the author will continue with a dispassionate review of the merits of the factual claims made in a popular science book? One wonders what on earth they were thinking. Apparently Gleichschaltung is sufficiently advanced at Nature that the editors have lost all sense of shame. Consider, for example, the hoary “genetic determinism” canard. A “genetic determinist” is a strawman invented more than 50 years ago by the Blank Slaters of old. These imaginary beings were supposed to believe that our behavior is rigidly programmed by “instincts.” I’ve searched diligently during the ensuing years, but have never turned up a genuine example of one of these unicorns. They are as mythical as witches, but the Blank Slaters never tire of repeating their hackneyed propaganda lines. It would be hard to “discredit” the “simplistic idea that genes alone control human nature” by virtue of the fact that no one ever made such a preposterous claim to begin with, and Plomin certainly wasn’t the first. Beyond that, what could possibly be the point of dragging in all the familiar dogmas of the “progressive” tribe? Apparently Nature would have us believe that scientific “truth” is to be determined by ideological litmus tests.

    In the next paragraph Comfort supplies Plomin, a professor of behavior genetics, with the title “educational psychologist,” and sulks that his emphasis on chromosomal DNA leaves microbiologists, epigeneticists, RNA experts, and developmental biologists out in the cold. Seriously? Since when did these fields manage to hermetically seal themselves off from DNA and become “non-overlapping magisteria?” Do any microbiologists, epigeneticists, RNA experts or developmental biologists actually exist who consider DNA irrelevant to their field?

    Comfort next makes the logically questionable claim that, because “Darwinism begat eugenics”, “Mendelism begat worse eugenics,” and medical genetics begat the claim that men with an XYY genotype were violent, therefore behavioral genetics must also “begat” progeny that are just as bad. QED

    Genome-wide association (GWA) methods, the increasingly powerful tool described in Blueprint that has now put the finishing touches on the debunking of the Blank Slate, are dismissed as something that “lures scientists” because of its “promise of genetic explanations for complex traits, such as voting behavior or investment strategies.” How Comfort distills this “promise” out of anything that actually appears in the book is beyond me. One wonders if he ever actually read it. That suspicion is greatly strengthened when one reads the following paragraph:

    A polygenic score is a correlation coefficient. A GWAS identifies single nucleotide polymorphisms (SNPs) in the DNA that correlate with the trait of interest. The SNPs are markers only. Although they might, in some cases, suggest genomic neighborhoods in which to search for genes that directly affect the trait, the polygenic score itself is in no sense causal. Plomin understands this and says so repeatedly in the book – yet contradicts himself several times by arguing that the scores are in fact, causal.

    You have to hand it to Comfort, he can stuff a huge amount of disinformation into a small package. In the first place, the second and third sentences contradict each other. If SNPs are variations in the rungs of DNA that occur between individuals, they are not just markers, and they don’t just “suggest genomic neighborhoods in which to search for genes that directly affect the trait.” If they are reliable and replicable GWA hits, they are one of the actual points at which the trait is affected. Plomin most definitely does not “understand” that polygenic scores are in no sense causal, and nowhere does he say anything of the sort, far less “repeatedly.” What he does say is:

    In contrast, correlations between a polygenic score and a trait can only be interpreted causally in one direction – from the polygenic score to the trait. For example, we have shown that the educational attainment polygenic score correlates with children’s reading ability. The correlation means that the inherited DNA differences captured by the polygenic score cause differences between children in their school achievement, in the sense that nothing in our brains, behavior, or environment can change inherited differences in DNA sequence.

    I would be very interested to hear what Comfort finds “illogical” about that passage, and by virtue of what magical mental prestidigitations he proposes to demonstrate that the score is a “mere correlation.” Elsewhere we read,

    Hereditarian books such as Charles Murray and Richard Herrnstein’s The Bell Curve (1994) and Nicholas Wade’s 2014 A Troublesome Inheritance (see N. Comfort Nature 513, 306–307; 2014) exploited their respective scientific and cultural moments, leveraging the cultural authority of science to advance a discredited, undemocratic agenda. Although Blueprint is cut from different ideological cloth, the consequences could be just as grave.

    In fact, neither The Bell Curve nor A Troublesome Inheritance has ever been discredited, if by that term is meant being proved factually wrong. If books are “discredited” by the number of ideological zealots who begin foaming at the mouth on reading them, of course, it’s a different matter. Beyond that, if something is true, it does not become false by virtue of Comfort deeming it “undemocratic.” I could go on, but what’s the point? Suffice it to say that Comfort’s favorite “scientific authority” is Richard Lewontin, an obscurantist high priest of the Blank Slate if ever there was one, and author of Not in Our Genes.

    I can understand the desire of Nature’s editors to virtue signal their loyalty to the prevailing politically correct fashions, but this “review” is truly abject. It isn’t that hard to find authors on the left of the political spectrum who can write a book review that is at least a notch above the level of tendentious ideological propaganda. See, for example, Kathryn Paige Harden’s review of Blueprint in the Spectator. Somehow she managed to write it without implying that Plomin is a Nazi in every second sentence. I suggest that next time they look a little harder.

    My initial post about Blueprint tended to emphasize the historical ramifications of the book in the context of the Blank Slate disaster. As a result, my description of the scientific substance of the book was painted with a very broad brush. However, there are many good reviews out there that cover that ground, some of which express reservations, which I share, about Plomin’s conclusions regarding the importance of environment. See, for example, the excellent review by Razib Khan in the National Review linked above. As I mentioned in my earlier post, the book itself is only 188 pages long, so, by all means, read it.

  • Robert Plomin’s “Blueprint” – The Blank Slate and the Behavioral Genetics Insurgency

    Posted on January 28th, 2019 Helian No comments

    Robert Plomin‘s Blueprint is a must-read. That would be true even if it were “merely” an account of recent stunning breakthroughs that have greatly expanded our understanding of the links between our DNA and behavior. Beyond that, however, it reveals an aspect of history that has been little appreciated to date: the guerrilla warfare carried on by behavioral geneticists against the Blank Slate orthodoxy from a very early date. You might say the book is an account of the victorious end of that warfare. From now on, those who deny the existence of heritable genetic effects on human behavior will self-identify as belonging to the same category as the seedier televangelists, or even professors in university “studies” departments.

    Let’s begin with the science. We have long known, by virtue of thousands of twin and adoption studies, that many complex human traits, including psychological traits, are more or less heritable due to differences in DNA. These methods also enable us to come up with a ballpark estimate of the degree to which these traits are influenced by genetics. However, until very recently we have not been able to detect exactly which inherited differences in DNA sequences are actually responsible for the variations we see in these traits. That’s where the “revolution” in genetics described by Plomin comes in. It turns out that detecting these differences was a far more challenging task than optimistic scientists expected at first. As he put it,

    When the hunt began twenty-five years ago everyone assumed we were after big game – a few genes of large effect that were mostly responsible for heritability. For example, for heritabilities of about 50 per cent, ten genes each accounting for 5 per cent of the variance would do the job. If the effects were this large, it would require a sample size of only 200 to have sufficient power to detect them.
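    Plomin’s sample-size arithmetic can be checked with a standard power calculation for detecting a correlation. The sketch below uses Fisher’s z transformation; the exact figures depend on the significance level and power one assumes (here, the conventional two-sided alpha of 0.05 and 80% power), so it is an illustration rather than a definitive recipe:

    ```python
    import math

    def required_n(r, z_alpha=1.96, z_beta=0.84):
        """Approximate sample size needed to detect a correlation r
        (two-sided test, alpha = 0.05, 80% power) via Fisher's z."""
        fisher_z = 0.5 * math.log((1 + r) / (1 - r))
        return math.ceil(((z_alpha + z_beta) / fisher_z) ** 2 + 3)

    # A gene accounting for 5% of the variance corresponds to r = sqrt(0.05).
    # The required sample is on the order of a couple hundred subjects,
    # consistent with the figure of 200 quoted above.
    print(required_n(math.sqrt(0.05)))

    # An effect of 0.01% of the variance -- the size actually found for
    # complex traits -- demands samples in the tens of thousands.
    print(required_n(math.sqrt(0.0001)))
    ```

    The second number makes vivid why the early, small-sample gene hunts described next were bound to fail.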

    This fond hope turned out to be wishful thinking. As noted in the book, some promising genes were studied, and claims were occasionally made in the literature that a few such “magic” genes had been found. The result, according to Plomin, was a fiasco. The studies could not be replicated. It was clear by the turn of the century that a much broader approach would be necessary. This, however, would require the genotyping of tens of thousands of single-nucleotide polymorphisms, or SNPs (pronounced “snips”). A SNP is a change in a single one of the billions of rungs of the DNA ladder each of us carries. SNPs are one of the main reasons for differences in the DNA sequence among different human beings. To make matters worse, it was expected that sample sizes of a thousand or more individuals would have to be genotyped in this way to accumulate enough data to be statistically useful. At the time, such genome-wide association (GWA) studies would have been prohibitively expensive. Plomin notes that he attempted such an approach to find the DNA differences associated with intelligence, with the aid of a few shortcuts. He devoted two years to the study, only to be disappointed again. It was a second false start. Not a single DNA association with intelligence could be replicated.

    Then, however, a major breakthrough began to make its appearance in the form of SNP chips. According to Plomin, these chips could “genotype many SNPs for an individual quickly and inexpensively. SNP chips triggered the explosion of genome-wide association studies.” He saw their promise immediately, and went back to work attempting to find SNP associations with intelligence. The result? A third false start. The chips available at the time were still too expensive, and could identify too few SNPs. Many other similar GWA studies failed miserably as well. Eventually, one did succeed, but there was a cloud within the silver lining. The effect sizes of the SNP associations found were all extremely small. Then things began to snowball. Chips were developed that could identify hundreds of thousands instead of just tens of thousands of SNPs, and sample sizes in the tens of thousands became feasible. Today, sample sizes can be in the hundreds of thousands. As a result of all this, revolutionary advances have been made in just the past few years. Numerous genome-wide significant hits have been found for a host of psychological traits. And now we know the reason why the initial studies were so disappointing. In Plomin’s words,

    For complex traits, no genes have been found that account for 5 per cent of the variance, not even 0.5 per cent of the variance. The average effect sizes are in the order of 0.01 per cent of the variance, which means that thousands of SNP associations will be needed to account for heritabilities of 50 per cent… Thinking about so many SNPs with such small effects was a big jump from where we started twenty-five years ago. We now know for certain that heritability is caused by thousands of associations of incredibly small effect. Nonetheless, aggregating these associations in polygenic scores that combine the effects of tens of thousands of SNPs makes it possible to predict psychological traits such as depression, schizophrenia and school achievement.

    In short, we now have a tool that, as I write this, is rapidly increasing in power, and that enables falsifiable predictions regarding many psychological traits based on DNA alone. As Plomin puts it,

    The DNA revolution matters much more than merely replicating results from twin and adoption studies. It is a game-changer for science and society. For the first time, inherited DNA differences across our entire genome of billions of DNA sequences can be used to predict psychological strengths and weaknesses for individuals, called personal genomics.
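    To make the idea of a polygenic score concrete: it is nothing more than a weighted sum over a person’s genotyped SNPs, with each weight taken from the GWA results. Here is a minimal sketch; the three SNP effect sizes are invented for illustration, whereas real scores aggregate tens of thousands of SNPs with weights estimated from enormous samples:

    ```python
    def polygenic_score(allele_counts, effect_sizes):
        """Weighted sum: copies (0, 1 or 2) of the trait-associated
        allele at each SNP, times that SNP's estimated effect size."""
        return sum(g * b for g, b in zip(allele_counts, effect_sizes))

    # Hypothetical effect sizes for three SNPs -- each tiny, as Plomin describes.
    betas = [0.004, -0.002, 0.003]

    person_a = [2, 0, 1]   # allele counts at the three SNPs
    person_b = [0, 2, 1]

    print(polygenic_score(person_a, betas))
    print(polygenic_score(person_b, betas))
    ```

    Each individual weight is negligible; only the aggregate has predictive power, which is why the early hunts for single “magic” genes were doomed while scores summed over the whole genome succeed.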

    As an appreciable side benefit, thanks to this revolution we can now officially declare the Blank Slate stone cold dead. It’s noteworthy that this revolutionary advance in our knowledge of the heritable aspects of our behavior did not happen in the field of evolutionary psychology, as one might expect. Diehard Blank Slaters have been directing their ire in that direction for some time. They could have saved themselves the trouble. While the evolutionary psychologists have been amusing themselves inventing inconsequential just-so stories about the more abstruse aspects of our sexual behavior, a fifth column that germinated long ago in the field of behavioral genetics was about to drive the decisive nail in their coffin. Obviously, it would have been an inappropriate distraction for Plomin to expand on the fascinating history behind this development in Blueprint. Read between the lines, though, and it’s quite clear that he knows what’s been going on.

    It turns out that the behavioral geneticists were adept at dodging the baleful attention of the high priests of the Blank Slate, flying just beneath their radar, from a very early date. A useful source document recounting some of that history, entitled Origins of Behavior Genetics: The Role of The Jackson Laboratory, was published in 2009 by Donald Dewsbury, emeritus professor of psychology at the University of Florida. He notes that,

    A new field can be established and coalesce around a book that takes loosely evolving material and organizes it into a single volume. Examples include Watson’s (1914) Behavior: An Introduction to Comparative Psychology and Wilson’s (1975) Sociobiology. It is generally agreed that Fuller and Thompson’s 1960 Behavior Genetics served a similar function in establishing behavior genetics as a separate field.

    However, research on the effects of genes on behavior had begun much earlier, in the 1930s, when the Blank Slate already had a firm grip on the behavioral sciences. According to the paper, Harvard alumnus Alan Gregg, who was Director of the Medical Sciences Division of the Rockefeller Foundation,

    …developed a program of “psychobiology” or “mental hygiene” at the Foundation. Gregg viewed mental illness as a fundamental problem in society and believed that there were strong genetic influences. There was a firm belief that the principles to be discovered in nonhuman animals would generalize to humans. Thus, fundamental problems of human behavior might be more conveniently and effectively studied in other species.

    The focus on animals turned out to be a very wise decision. For many years it enabled the behavioral geneticists to carry on their work while taking little flak from the high priests of the Blank Slate, whose ire was concentrated on scientists who were less discreet about their interest in humans, in fields such as ethology. Eventually Gregg teamed up with Clarence Little, head of the Jackson Laboratory in Bar Harbor, Maine, and established a program to study mice, rabbits, guinea pigs, and, especially, dogs. Gregg wrote papers about selective breeding of dogs for high intelligence and good disposition. However, as his colleagues were aware, another of his goals “was conclusively to demonstrate a high heritability of human intelligence.”

    Fast forward to the 60’s. It was a decade in which the Blank Slate hegemony began to slowly crumble under the hammer blows of the likes of Konrad Lorenz, Niko Tinbergen, Robert Trivers, Irenäus Eibl-Eibesfeldt, and especially the outsider and “mere playwright” Robert Ardrey. In 1967 the Institute for Behavioral Genetics (IBG) was established at the University of Colorado by Prof. Jerry McClearn with his colleagues Kurt Schlesinger and Jim Wilson. In the beginning, McClearn et al. were a bit coy, conducting “harmless” research on the behavior of mice, but by the early 1970’s they had begun to publish papers that were explicitly about human behavior. It finally dawned on the Blank Slaters what they were up to, and they were subjected to the usual “scientific” accusations of fascism, Nazism, and serving as running dogs of the bourgeoisie, but by then it was too late. The Blank Slate had already become a laughing stock among lay people who could read and had an ounce of common sense. Only the “experts” in the behavioral sciences would be rash enough to continue futile attempts to breathe life back into the corpse.

    Would that some competent historian could reconstruct what was going through the minds of McClearn and the rest when they made their bold and potentially career-ending decision to defy the Blank Slate and establish the IBG. I believe Jim Wilson is still alive, and he could no doubt tell some wonderful stories about this nascent insurgency. In any case, in 1974 Robert Plomin made the very bold decision for a young professor to join the Institute. One of the results of that fortuitous decision was the superb book that is the subject of this post. As noted above, digression into the Blank Slate affair would only have been a distraction from the truly revolutionary developments revealed in his book. However, there is no question that he was perfectly well aware of what had been going on in the “behavioral sciences” for many years. Consider, for example, the following passage, about why research results in behavioral genetics are so robust and replicate so strongly:

    Another reason seems paradoxical: behavioral genetics has been the most controversial topic in psychology during the twentieth century. The controversy and conflict surrounding behavioral genetics raised the bar for the quality and quantity of research needed to convince people of the importance of genetics. This has had the positive effect of motivating bigger and better studies. A single study was not enough. Robust replication across studies tipped the balance of opinion.

    As the Germans say, “Was mich nicht umbringt, macht mich stärker” (What doesn’t kill me makes me stronger). If you were looking for a silver lining to the Blank Slate, there you have it. What more can I say? The book is a short 188 pages, but in those pages is concentrated a wealth of knowledge bearing on the critical need of our species to understand itself. If you would know yourself, then by all means, buy the book.

  • Morality Whimsy: Darwin and the Latter Day Philosophers

    Posted on September 30th, 2018 Helian No comments

    It’s hard to imagine how Darwin could have explained morality more clearly, given the Victorian context in which he wrote.  In Chapter IV of his The Descent of Man he said in so many words that it is a subjective manifestation of human nature. However, as I pointed out in my last post, even the philosophers of the 19th century who understood natural selection couldn’t draw the obvious conclusions.  None of them could free themselves of the illusion that Good and Evil are real, objective things, existing independently of human minds.  This was reflected in the various systems of “evolutionary morality” they proposed. They typically assumed that evolved morality had a goal, or purpose, which was usually some version of human flourishing, moral perfection, or “the good of the species.”  To all appearances, it never occurred to any of them that, as a natural process, evolution by natural selection cannot have a goal or a purpose.  In the 20th century, moral philosophers began to accept some of the more obvious implications of Darwinism.  In spite of that, they remained spellbound by the power of the illusion.  The only significant exception I’m aware of was Edvard Westermarck, who pointed out some of the obvious implications of Darwin’s claim that morality exists by virtue of evolved behavioral traits as far back as 1906.  He was forgotten, and we haven’t recovered the lost ground since.

    Today we know a lot more about the mechanics of natural selection than they did in the 19th century.  The study of morality suffered as much as any of the other behavioral sciences during the Blank Slate debacle, but we seem to be on the path to recovery, at least for the time being. Today many scientists and philosophers are at least vaguely aware of the fact, obvious as it was to Darwin, that human morality is a manifestation of innate behavioral traits. Some of them have even drawn some of the more obvious conclusions from that fact. However, we live in a highly moralistic era, especially in academia, and what we find written about morality today reflects this moralistic culture.

    To illustrate how far we’ve come, and how far we have yet to go, let’s consider the work of the philosopher Michael Ruse, one of the current crop of evolutionary moralists. He has written much on the subject, but I will focus on a paper he co-authored with E. O. Wilson back in 1986 entitled Moral Philosophy as Applied Science and the book Taking Darwin Seriously, published in 1999. First, the good news. Ruse does take Darwin seriously when it comes to the illusion of objective morality:

    …human beings function better if they are deceived by their genes into thinking that there is a disinterested objective morality binding upon them, which all should obey.

    We believe that implicit in the scientific interpretation of moral behavior is a conclusion of central importance to philosophy, namely that there can be no genuinely objective external ethical premises. Everything that we know about the evolutionary process indicates that no such extrasomatic guides exist.

    As these passages imply, Ruse also rejected the Blank Slate:

    The evidence from both genetic and cognitive studies demonstrates decisively that the human brain is not a tabula rasa.

    The following passage just repeats what Darwin wrote over a century ago in Chapter IV of The Descent of Man:

    It is easy to conceive of an alien intelligent species evolving rules its members consider highly moral but which are repugnant to human beings, such as cannibalism, incest, the love of darkness and decay, parricide, and the mutual eating of faeces. Many animal species perform some or all of these things, with gusto and in order to survive. If human beings had evolved from a stock other than savanna-dwelling, bipedal, carnivorous man-apes we might do the same, feeling inwardly certain that such behaviors are natural and correct. In short, ethical premises are the peculiar products of genetic history. And they can be understood solely as mechanisms that are adaptive for the species that possess them. It follows that the ethical code of one species cannot be translated into that of another. No abstract moral principles exist outside the particular nature of individual species.

    Ruse explicitly rejects the currently fashionable philosophical conceit that evolved morality somehow tracks “true” morality:

    It is thus entirely correct to say that ethical laws can be changed, at the deepest level, by genetic evolution. This is obviously quite inconsistent with the notion of morality as a set of objective, eternal verities. Morality is rooted in contingent human nature, through and through.

    Nor is it possible to uphold the true objectivity of morality by believing in the existence of an ultimate code, such that what is considered right corresponds to what is truly right – that the thoughts produced by the epigenetic rules parallel external premises.

    Here “epigenetic rules” is a term Ruse and Wilson coined referring to the innate predispositions that are responsible for the existence of morality. In other words, they’re what the 19th century philosophers referred to as “instincts.” It was an unfortunate choice in view of the current bitter disputes about the significance of epigenetic inheritance. They would have done better to stick with the terms already in use.

    So where is the fly in this promising ointment? To begin, Ruse isn’t quite on board with his own philosophy. In spite of his insistence on the subjective nature of morality, we constantly find him signaling to his morality-drenched academic peers that he’s “really good.” He suffers from the same morality addiction as the rest of them. Indeed, to get that monkey off his back, he would have to jump right out of his academic ingroup. For example,

    Like Huxley, I find these views (Social Darwinism)  taken to the extreme to be morally repellant. They are the epitome of all that is immoral, and anything but a guide to proper behavior… This philosophy I believe (generally) to be grossly immoral.

    Children with the disease (Tay-Sachs) develop at first in a normal manner. Then at six months they start to collapse into zombies, and die by the age of four. I see nothing immoral about detecting and aborting such children. In fact, I believe we have a positively moral obligation to do so.

    John Stuart Mill’s campaign for women’s rights was a good thing, as was Bertrand Russell’s opposition to nuclear weapons.

    What we have in the case of Darwinian ethics is a denial of objectivity, which is surely a denial of metaphysical reality by another name, and an affirmation of subjectivity, which is no less a commitment to common sense, in which the subject plays an active creative part. If anything is common sense, it is that rape is simply, totally, wrong.

    In spite of having affirmed that morality is a manifestation of innate predispositions, or “epigenetic rules,” Ruse can find nothing wrong with applying it to decide all sorts of issues that could not possibly have contributed to the evolution of those rules. Consider, for example, this passage, which also includes virtue signaling in the form of a wink and a nod to his liberal ingroup.

    Darwinism is anything but a gospel for the extreme conservative. Apart from anything else, no one is saying that there are humans towards whom we have no sense of moral obligation whatsoever. Furthermore, the pretense that we need not bother about the Third World is self-refuting. If we ignore it, then through such effects as overpopulation, we shall soon find that it raises all sorts of difficult moral issues which do directly impinge on us.

    In case we are left in any doubt about Ruse’s actual commitment to objective morality under a veneer of subjectivism, he adds,

    My only hope is to have shown that a Darwinian approach to morality does not call for a repudiation of standards and values cherished by decent people of all nations.

    It is beyond me where in Ruse’s philosophy one can find a definition of “decent people.” Indeed, his philosophy excludes the possibility that one can make unqualified reference to “decent people” unless “decency” exists as an independent object. In other words, his use of the term is a blatant non sequitur. All this makes no sense at all unless we are aware that Ruse imagines he has found a way to skip blithely around Hume’s is/ought barrier. It goes something like this:

    If morality means anything, it means being prepared to hold out a helping hand to others. Christians, utilitarians, Kantians, and everyone else come together on this.

    I guess I’m not one of the above. To me, morality refers to social behavior that is ultimately the result of evolved behavioral traits. The above is yet another example of Ruse’s tendency to objectivize a possible manifestation of that behavior as “good.” Next, we are optimistically informed that a universal human morality is possible based on the dubious assumption that there are no differences in the evolved traits on which it is based among human populations:

    When it comes to general shared moral principles, the Darwinian stands firm. Humans share a common moral understanding. This universality is guaranteed by the shared genetic background of every member of Homo sapiens. The differences between us are far outweighed by the similarities. We (virtually) all have hands, eyes, ears, noses, and the same ultimate awareness. That is part of being human. There is, therefore, absolutely nothing arbitrary about morality, considered from the human perspective.

    All this is so much hand waving. Given the evidence of vast differences in moral rules and behavior across human populations, the idea that there is absolutely nothing arbitrary about it is nonsense. No matter. Apparently based on this axiom of universality, a miracle happens. Ruse cuts the Gordian knot, and walks right around the is/ought barrier!

    To use an American sporting metaphor, the Darwinian does an end-run around the is/ought barrier. He/she realizes that you cannot go through it, but argues that you can go around it, giving morality all of the justificatory insight possible.

    In fact, all the “justificatory insight possible” amounts to zero. There is no plausible reason for the claim that the implausible assumption of universal “epigenetic rules” relevant to morality enables an “end-run” around the is/ought barrier. In other words, Ruse is just another modern philosopher attempting to have his cake and eat it, too.

    Unfortunately, Ruse has left a few things out of his “universal moral understanding.” Among them is the outgroup. He never mentions its existence in any of his work I’ve read so far, and yet, if there are any universal aspects of human moral behavior, that is surely one of them. If what Ruse has written above about skipping around the is/ought barrier is true, then it becomes our duty to hate the outgroup with a blind, irrational fury. Beyond that, he never seriously takes into account the vast difference between the environment in which we now live and the one in which the predispositions responsible for moral behavior evolved. If he did, it would immediately reduce to an absurdity his notion that morality is an appropriate tool for deciding issues about how to deal with the Third World.

    Perhaps the most significant thing of all that Ruse has left out of his philosophizing is a very fundamental feature of human morality. We do not apply it to ourselves alone. We apply it to others as well. To the extent that one imagines that he has done an “end-run” around the is/ought barrier, he also imagines that he has acquired the right to dictate behavior to others. After all, who are we to dispute such a noted philosopher’s take on what our “universal human morality” consists of? That is my biggest problem with our latter day “evolutionary moralists.” In reality, they are just as addicted to objective morality as their 19th century precursors, and just as intent on explaining to the rest of us what we “ought” to do.

    Do you like to have others dictate to you what you ought and ought not to do? I don’t. I know that we require some form of morality, because as a species we are too stupid to do without it. Under the circumstances, I prefer to keep it as simple as possible, and to reduce its sphere of influence as much as possible. It strikes me that expanding that sphere to include “the Third World,” or anything of the sort, is not only absurd, but extremely dangerous. I cannot give you any objective reason why you ought not to grovel before people who presume to dictate to you what you ought or ought not to do. I can only inform you that I prefer not to grovel myself. That, it seems to me, is one of the great advantages of grasping the truth about the subjective nature of morality. That truth does not imply moral chaos, or the impossibility of a society with “absolute” moral rules. It merely provides some insight into what such an “absolute” morality might look like in the context of whatever goals or purpose you’ve established for yourself in life.

    In my next post I will review the work of another modern “evolutionary moralist” who, predictably, has been no more capable of shaking the objective morality illusion than Ruse. Things haven’t changed much since the 19th century. The symptoms of the addiction have just become more subtle.

  • Morality Whimsy: What the Philosophers “Learned” from Darwin

    Posted on September 15th, 2018 Helian 4 comments

    When he published The Descent of Man, Charles Darwin practically spoon-fed the rest of us the truth about human morality. He explained that it was as much a result of evolution by natural selection as any of our more obvious physical features. Similar versions of the heritable mental traits responsible for its existence are also present in other animals. The only difference between us and them is our ability to contemplate what we experience as a result of those traits with our large brains, and to communicate our thoughts to others. As the result of a natural process, morality is not fixed, and could potentially be entirely different in other animals that might eventually happen to acquire levels of intelligence close to our own. In other words, it is a purely subjective phenomenon that does not “track” some imaginary “true” version of objective moral law. As a natural phenomenon, there is no reason to expect that it is striving towards some imaginary goal, such as human perfection or ideal virtue. It’s hard to imagine how Darwin could have expressed these facts in simpler or more straightforward terms.

    If Darwin’s claim that morality is derived from heritable mental traits that exist by virtue of natural selection is right, it follows that it is not a perfectly malleable manifestation of environment or culture. Human beings cannot be programmed by learning or environment to adopt completely arbitrary versions of morality. It also follows that humans will perceive moral rules as absolutes. Furthermore, human beings are social animals. If morality exists by virtue of evolved mental traits, it follows that it enhances the probability of the survival and reproduction of the responsible genes in a group environment. It would hardly be effective in doing so if it predisposed us to believe that certain of our behaviors are “good” and others “evil” merely as individuals, but that no such rules or categories apply to the behavior of others. In that case altruism would certainly be a losing strategy in the struggle for survival. However, altruism exists. It follows that we must perceive the moral “rules” not only as absolute, and not only as applying to ourselves, but to everyone else as well. In short, belief in objective morality is an entirely predictable illusion, but an illusion regardless. If it were not an illusion, Darwin’s comment that completely different versions of morality could evolve for different intelligent species would necessarily be false. Whatever else one thinks of objective morality, it is certainly un-Darwinian.

    In the years that followed, Darwin’s great theory spawned a host of different versions of “evolutionary morality.” One cannot but experience a sinking feeling in reading through them. Not a single one of the authors had a clue what Darwin was talking about. As far as I can tell, every single one of the systems of “evolutionary morality” concocted in the 19th century was based on the assumption of objective moral law. Evolution was merely the “natural” process of mankind’s progress towards the “goal” of compliance with this objective law, and the outcome of this “natural” process would be (of course) human moral perfection, in harmony with assorted versions of “true” morality. In other words, the power of the illusion asserted itself with a vengeance. “Man the wise” proved incapable of putting two and two together. Instead we clung to the old, familiar mirage that good and evil exist as objective things, just as our minds have always portrayed them to us.

    One can confirm the above by reviewing some representative samples of the early versions of evolutionary morality. Many of them were described by Charles Mallory Williams in his A Review of the Systems of Ethics Founded on the Theory of Evolution, published in 1893. By that time such systems were hardly a novelty. As Williams put it,

    Now every year and almost every month brings with it a fresh supply of books, pamphlets and magazine articles on The Evolution of Morality. So many are the waters which now pour themselves into this common stream that the current threatens soon to become too deep and swift for any but the most expert swimmers.

    Noting that it was already impossible to do justice to all the theories in a single book, Williams limited himself to reviewing the systems proposed by the most prominent authors in the field. These included Ernst Haeckel, who suggested substituting a “nature religion” based on evolution for the old “church religions.” According to Haeckel,

    The greatest rudeness and barbarity of custom often goes hand in hand with the absolute dominion of an all-powerful church; in confirmation of which assertion one need only remember the Middle Ages. On the other hand, we behold the highest standard of perfection attained by men who have severed connection with every creed. Independent of every confession of faith, there lives in the breast of every human being the germ of a pure nature religion; this is indissolubly bound up with the noblest sides of human life. Its highest commandment is love, the restraint of our natural egoism for the benefit of our fellow-men, and for the good of human society, whose members we are.

    The very un-Darwinian assumptions that evolution had resulted in a moral sense that was in tune with some version of ideal goodness, referred to by Haeckel as “a pure nature religion,” and that this moral sense existed to serve “the good of human society,” or the good of the species, are characteristic of all the early versions of “evolutionary morality.” For example, from the system proposed by Herbert Spencer,

    From the fundamental laws of life and the conditions of social existence are inducible certain imperative limitations to individual action – limitations which are essential to a perfect life, individual and social, or in other words essential to the greatest possible happiness. And these limitations following inevitably as they do from undeniable first principles deep as the nature of life itself constitute what we may distinguish as absolute morality… In the ideal state towards which evolution tends, any falling short of function implies deviation from perfectly moral conduct.

    Spencer’s friend, John Fiske, imagined that Darwin, “properly understood,” pointed in a similar direction:

    Man is slowly passing from a primitive social state, in which he was little better than a brute, toward an ultimate social state in which his character shall have become so transformed that nothing of the brute can be detected in it. The “original sin” of theology is the brute inheritance, which is being gradually eliminated; and the message of Christianity: “Blessed are the meek for they shall inherit the earth” will be realized in the state of universal peace towards which mankind is tending. Strife and Sorrow shall disappear. Peace and Love shall reign supreme. The goal of evolution is the perfecting of man, whereby we see, more than ever, that he is the chief object of divine care, the fruition of that creative energy which is manifested throughout the knowable universe.

    Another Englishman, Alfred Barratt, proposed an even more confused version of “Darwinian morality:”

    The Moral Sense therefore is merely one of the emotions, though the last of all in the order of evolution. It can only claim a life of some two or three centuries, (!) and there are even some who still doubt its existence. Man, at any rate, is the only animal who possesses it in its latest development, for even in horses and dogs we cannot believe that it has passed the intentional or conscious stage. Good with them has no artificial meaning; it is simply identical with the greatest pleasure. Only by complete and perfect obedience to all emotions can perfect freedom from regret be obtained in the gratification of all desire. Man is at present passion’s slave because he is so only in part, for the cause of repentance is never the attainment of some pleasure, but always the non-attainment of more; not the satisfaction of one desire, but the inability to satisfy all. The highest virtue, therefore, consists in being led not by one desire but by all in the complete organization of the Moral Nature.

    According to the abstruse version of “Darwinism” proposed by Austrian philosopher Bartholomäus von Carneri, evolution had a “goal.” Happily, it was “the perfection of man.”

    When we do away with all concessions to one sided extravagant desires, abstain from placing mind above the universal law of causality, and are content with the facts made known to us by science, we perceive that the absolute True, Beautiful, and Good bears the character of the Universal. In this universal character it has always finally found expression in human life and in this character it will always find expression… There is no absolute Evil in contrast to the absolute Good. Evil is negative. The perfection of man is identical with the attainment of absolute Good through evolution.

    So much for “evolutionary morality” in the 19th century.  None of these philosophers had a clue that they were spouting nonsense that flew in the face of what Darwin had actually said about morality.  None of them so much as stopped to think that there is no path from a natural process such as evolution by natural selection to objective “oughts.”  They could not free themselves of the powerful illusion that good and evil are real things. It took a critic of Darwin who rejected the idea that evolution had anything to do with morality to see the blatant fallacies at the bottom of all these systems of “evolutionary morality.” Such a man was Jacob Gould Schurman, who took occasion to point out some of the gaping holes in all these fine theories in his The Ethical Import of Darwinism, published in 1888. The diehard Schurman commented bitterly that,

    It is a historical fact that no one nowadays seems to doubt the validity of the general theory of evolution. However, the same cannot be said of natural selection.

    He cited several prominent contemporary scientists, including Alfred Russel Wallace, who rejected Darwin’s theory either in whole or in part. Noting that “Darwin is certainly the father of evolutionary ethics,” Schurman then continued with a scathing attack on the whole idea, pointing out gaping holes in the above theories of “evolutionary morality” that are just as applicable to the tantrums of modern SJWs. For example,

    It is worse than idle for mechanical evolutionists to talk of the reason or end or ground of morality.

    The mental and moral faculties are both reduced to the rank of natural phenomena.

    The absolute ought cannot be the product of (evolution).

    Will not evolution, then, as thus interpreted, work revolution in our views of the moral nature of man, since it implies that morality is not grounded in the nature of things, but something purely relative to man’s circumstances; a happy device whereby man’s ancestors managed to cohere in a united society, and so kill out rival and disunited groups.

    Exactly! If Darwin was right, then the claims of any system of “evolutionary morality” to represent objective moral truths must be dismissed as absurd. It is impossible for objective Good and Evil to be “grounded in the nature of things” if morality is the outcome of a contingent natural process. Indeed, it is not out of the question that intelligent life may already have evolved on other planets by a process similar to the one that occurred on earth, resulting in entirely different versions of good and evil.  It is a tribute to the power of the illusions that our evolved “moral sense” spawns in our brains that it is only obvious to those who disagree with our preferred version of “moral truth” that we are delusional.

    Today we suffer from an infestation of secular “Social Justice Warriors,” who are in the habit of delivering themselves of bombastic moral pronunciamientos, and become furious when the rest of us pay no attention to them. Only Christians and other theists appear capable of noticing that they lack any basis for the legitimacy of their moral claims. In fact, they are behaving just as Darwin would have predicted, blindly responding to innate moral emotions, oblivious to the fact that the consequences of doing so today are highly unlikely to be the same as those that applied in the radically different world in which those emotions evolved. Just as the Darwin critic Schurman immediately recognized that the evolutionary moralists’ fantastic notion that they had discovered a philosopher’s stone to prop up their “absolute ought” was absurd, today’s theists can immediately see that the fine “objective truths” in which secular humanists imagine they’ve arrayed their moralistic emperor are purely figments of their imaginations.  Their emperor is naked.

    As far as “evolutionary morality” is concerned, little has changed since the 19th century.  “Evolutionary moralists” flourish even more luxuriantly now than they did then.  Some of them even deny the existence of objective moral truths.  None that I am aware of are to be taken seriously when they make that claim.  In nearly the same breath in which they announce their belief in subjective morality, they will launch into a morally drenched rant against conservatives, or populists, or nationalists, or capitalists, or whoever else has the honor of belonging to their outgroup.  They do this without the least explanation, as if there were nothing at all contradictory about it.  They announce that there are no moral truths, and then proceed to furiously defend whatever flavor of moral truth they happen to prefer. Nothing could be further from their minds than explaining just how they imagine the particular “moral truths” they endorse will enhance the odds that the responsible genes they happen to carry will survive and reproduce. Only the great Edvard Westermarck popped for a brief moment out of the prevailing fog and followed the teachings of Darwin to their logical conclusion.  He was quickly forgotten.

    Why is all this important?  I can only answer that question from a personal point of view.  It may not be important to some people.  That said, it is important to me because I find it expedient to know and base my actions and decisions on the truth.  I can’t say with absolute certainty whether anything is true or not, so I settle for what I consider probably true, and I deem it highly probable that there is no such thing as objective moral truth.

    Some have argued that acknowledging this particular truth will harm society, because it will lead to moral relativism and moral chaos.  Human history in general, and the historical facts I have cited above in particular, demonstrate that this conclusion is false.  In view of what Darwin wrote about morality, it would seem perfectly clear and perfectly obvious that no system of objective morality can be based on his theory of evolution by natural selection.  This was abundantly clear to many of his opponents.  It remains obvious to the theists who reject his theory today.  However, almost to a man, those who considered themselves “Darwinians” and proposed systems of morality supposedly based on his theory concluded that there are objective moral truths, and that it is the “goal” of evolution to realize these truths! I can think of no rational explanation for this fact other than the existence of a powerful, innate human predisposition to perceive moral rules as independent, objective facts.  The power of this common illusion is demonstrated by the fact that highly intelligent “Darwinian” moral philosophers could not wean themselves from it even after Darwin had, for all practical purposes, told them point blank that they were fooling themselves.  In short, our species faces no danger from moral relativism.  The opposite is true. We are moral absolutists by nature, and will continue to be moral absolutists regardless of the scribblings of philosophers.  The real danger we face is our tendency to blindly follow the promptings of our “moral sense” in an environment that is radically different from the one in which that moral sense evolved.

    Demonstrating the truth of the above couldn’t be simpler. Just gather up as many evolutionary moralists, postmodernists, and self-proclaimed believers in subjective morality as you please. Then take a close look at what they’ve actually written.  You’ll quickly find that every single one of them has made and continues to make morally loaded pronouncements that make no sense whatever absent the implicit assumption that there are objective moral truths.  They will announce that someone in their outgroup is immoral, or that we “ought” to do something, not merely as a matter of utility, but because it is the “right” thing to do, or that we have a “duty” to do something and refrain from doing something else.  They will proclaim their desire for “moral progress” or “human flourishing” without feeling in the least embarrassed by their failure to explain how “moral progress” or “human flourishing” will promote the survival of the genes that are the ultimate reason they find these nebulous utopias so attractive to begin with.

    I, too, am human, and tend to wander off into such irrationalities myself sometimes.  However, if challenged, I will at least admit that I am merely expressing whims spawned by my own “moral sense,” and that I know of no legitimate basis whatever for claiming that my whims have some magical power to dictate to others what they ought or ought not to do.

    We are not threatened by moral relativism.  We are threatened by the pervasive illusion that the objects we refer to as good and evil are real, and that we and the members of our ingroup have a monopoly on the knowledge of what these imaginary objects look like.  We cannot free ourselves of this illusion.  We are moral absolutists by nature.  Under the circumstances, it might behoove us to construct an “absolute morality” that is as benign, useful, and unobtrusive as possible.  If nothing else, it would pull the rug out from under the feet of the pious bullies and self-appointed moral dictators that I personally find an insufferable blight on modern society.  With luck, it might even encourage some of our benighted fellow creatures, who are now rushing down “morally pure” paths to extinction, to think twice about the wisdom of what they are doing, or at least to refrain from insisting that the rest of us accompany them on the journey.

  • The Pinker Effect: Prof. Pickering’s Violent Agreement with the Hunting Hypothesis

    Posted on August 18th, 2018 Helian No comments

    Rough and Tumble by Prof. Travis Pickering is an amazing little book. The author’s ostensible goal was to defend the “hunting hypothesis,” according to which hunting played an important role in the evolution of our species. In spite of that, Pickering devotes much of it to furiously denouncing authors who proposed very similar versions of that hypothesis, in some cases nearly a century earlier. I’ve seen this phenomenon often enough now to coin a phrase for it: the Pinker Effect.  The Pinker Effect may be described as proposing a hypothesis combined with a denunciation and/or vilification of authors who proposed the same hypothesis years earlier, often in a clearer, more articulate and accurate form.  The quintessential example is Steven Pinker’s denunciation of Robert Ardrey in his The Blank Slate, in spite of the fact that Ardrey had presented a better and more accurate description of the Blank Slate debacle in books he had published as many as four decades earlier. Interestingly enough, Ardrey is also one of the authors who presented a very similar version of Pickering’s hunting hypothesis in a book, appropriately entitled The Hunting Hypothesis, back in 1976.  He is bitterly denounced in Rough and Tumble, along with several other authors, including Carveth Read, who proposed a prescient version of the hypothesis in his The Origin of Man as long ago as 1920. What could explain this counterintuitive phenomenon?

    I can only speculate that what we are seeing is a form of ritual appeasement of the powers that control the ideology, not to mention the purse strings, of one’s tribe. In this case we are speaking of academia, now controlled by aging leftists.  I suspect that many of them haven’t forgotten the shame and humiliation they experienced when Ardrey, Konrad Lorenz, and several others made a laughing stock of them back in the 60’s and 70’s in the process of demolishing the Blank Slate orthodoxy. This demolition crew included several authors who were also prominently associated with the hunting hypothesis.  Now, nearly half a century later, it would seem that Pickering still doesn’t dare to defend that hypothesis without first performing a triple kowtow before the former high priests of the Blank Slate! The historical background is fascinating.

    First, let’s review the striking similarities between Pickering’s version of the hunting hypothesis and those proposed by other authors as much as a century earlier. Keep in mind as you read down the list that he not only borrows their ideas without attribution or even praise, but actually denounces and vilifies every one of them!

    Early meat eating

    Pickering: Like others before me, I argue that hunting was a primary factor in our becoming fully human – a factor underpinning the completely unique ways in which we organize ourselves and interact with others of our own kind. This means, in turn, that we need to characterize human predation as accurately as possible in order to build the fullest and most realistic understanding of what it is to be human.
    Carveth Read: But the ancestor of Man found an object for association and cooperation in the chase. Spencer, indeed, says that a large carnivore, capable of killing its own prey, profits by being solitary; and this may be true where game is scarce: in the Oligocene and Miocene periods game was not scarce. Moreover, when our (ancestral, ed.) ape first pursued game, especially big game (not being by ancient adaptation in structure and instinct a carnivore), he may have been, and probably was, incapable of killing enough prey single-handed; and, if so, he will have profited by becoming both social and cooperative as a hunter, like the wolves and dogs – in short, a sort of wolf-ape (Lycopithecus).

    Early bipedalism

    Pickering: “The contrasting (in comparison to Australopithecines, ed.) long legs of Homo (including even those of its earliest species, like Homo erectus) probably made it a more efficient bipedal strider than were the australopithecines. But the anatomy of the ape-man hips, legs, knees, and ankles indicates that its species were also quite capable terrestrial bipeds.”
    Raymond Dart: “It is significant that this index, which indicates in a measure the poise of the skull upon the vertebral column, points to the assumption by this fossil group of an attitude appreciably more erect than that of modern anthropoids. The improved poise of the head, and the better posture of the whole body framework which accompanied this alteration in the angle at which its dominant member was supported, is of great significance. It means that a greater reliance was being placed by this group upon the feet as organs of progression, and that the hands were being freed from their more primitive function of accessory organs of locomotion.” (Australopithecus africanus: The Man-Ape of South Africa, published in Nature, February 7, 1925.)

    Use of weapons

    Pickering: Perhaps in an effort to maintain at least a semblance of behavioral distinction between “us and them,” some scientists still insist on clinging to the remaining (seemingly less consequential) disparities. Hunting with weapons was one such vestige of supposed human uniqueness. But, recently primatologist Jill Pruetz saw to toppling even this minor remnant of presumed human exceptionalism. Using their teeth to sharpen the ends of sticks into points, the chimpanzees of Fongoli, in the West African country of Senegal, fashion what are essentially simple thrusting spears into hollows in trees in an effort to stab and extract bushbabies, the small nocturnal primates who sleep in the holes during the day.
    Carveth Read: The utility and consequent selection of hands had been great throughout; but their final development may be referred to the making and using of weapons fashioned according to a mental pattern. Those who had the best hands were selected because they made the best weapons and used them best. (The Origin of Man, 1920)

    Debunking of human scavenging

    Pickering: Like all scientific hypotheses, these that sought to balance the reality of ancient cut marks with the idea of passive scavenging generated testable predictions. And, time and again, they failed their archaeological tests. In failing, they also effectively falsified the overarching hypothesis of passively scavenging hominins.
    Robert Ardrey: I wondered from an early date about the popularity of the scavenger hypothesis. If we were incapable of killing large prey animals such as wildebeest and waterbuck, then how were we capable of stealing their remains from their rightful and more dangerous killers? If we had been concerned with only a few stray bones, then luck could account for it. But the impressive accumulations at early hominid living sites must indicate either that we had been even more adept thieves than we are today, or that the great carnivores in those times were unaccountably lazy at guarding their kills.

    Hypothesis of ambush hunting:

    Pickering: Along this tactical continuum, hunting from a tree-stand is fairly simple, but it still conveys many benefits to the hunter. In addition to the disadvantaging nature of hunting from above (again, ungulates do not typically look up when scanning for predators), attacking an animal from above also takes the hunter out of potentially harmful physical contact with the prey.
    Carveth Read: We may, indeed, suppose that at first prey was sometimes attacked by leaping upon it from the branch of a tree, as leopards sometimes do.
    Robert Ardrey: The rare waterhole, the occasionally trickling stream, were the only places where they (other animals, ed.) could come to drink. So water became a natural trap. We did not need the long-striding foot: we could wait with our ambush for the game to come to us.

    I could cite many other examples. The fact that Pickering devotes much of his book to denouncing these authors who agree with him seems odd enough, but it’s not so surprising if you happen to be familiar with the history of the Blank Slate debacle.  Let’s review some of the salient details.

    Konrad Lorenz and Robert Ardrey were two authors singled out by Pickering as paragons of villainy. To hear him tell it, they both must have wracked their brains each morning to come up with a list of bad deeds they could do that day. Oddly enough, it happens that they were also the twin bêtes noires of the Blank Slaters of old. They were loathed and hated, not because of anything they had to say about hunting, but because they insisted there is such a thing as human nature, and it is not only significant and important, but extremely dangerous for us to ignore. During a period of several decades before they appeared on the scene, it had gradually become anathema for scientists in fields relevant to human behavior to suggest that we were possessed of innate behavioral traits of any kind. Marxism and the other fashionable egalitarian ideologies of the time required it. Instead, reality was ignored in favor of the myth that all our behavior is a result of learning and experience. The result was what we now refer to as the Blank Slate. During the 60’s and 70’s Ardrey and Lorenz published a series of books that revealed to an amused lay audience the absurd nonsense that passed for “science” among these “experts.” As one might expect, this provoked a furious reaction, as documented, for example, in books like Man and Aggression, edited by Blank Slate high priest Ashley Montagu, which appeared in 1968.  It’s still available for just two dollars at Amazon, and is required reading for anyone with a serious interest in the affair. It didn’t help. The Blank Slate charade slowly began to unravel. As increasing numbers of the more honest members of the academic and professional tribe began to break ranks, it eventually collapsed.
    Clearly, the shame of the Blank Slaters of old still rankles because, after all these years, Pickering found it necessary to appease them by coming up with a ludicrously contrived rationalization for claiming that his “good” version of the hunting hypothesis was different from the “evil” version proposed by Ardrey, Lorenz, and company long ago.

    As it happens, the reason Pickering gives for smearing Ardrey, Lorenz, and the rest, who are conveniently no longer around to defend themselves, is their supposed support for the so-called “Killer Ape Theory.” It is commonly defined as the theory that war and interpersonal aggression were the driving forces behind human evolution. It is usually associated with “genetic determinism,” the notion that humans have an irresistible and uncontrollable instinct to murder others of their kind. None of the authors Pickering denounces believed any such thing. This “theory” was a strawman invented by their Blank Slate enemies. Its genesis is of historical interest in its own right.

    Raymond Dart is usually cited as the author of the theory. The basis for this claim is a paper he published in 1953 entitled The Predatory Transition from Ape to Man. The paper is available online. Read it, and you will see that it contains nothing even approaching a coherent “theory that war and interpersonal aggression were the driving forces behind human evolution.” To the extent that a “theory” is present in the paper at all, it is just what the title claims: that pre-human anthropoid apes hunted and ate meat. The problem with the paper, seized on years later by the Blank Slaters to prop up their “Killer Ape Theory” strawman, was that it appeared to have been written by a somewhat unhinged junior high school student who had been watching too many Friday night creature features. Some of the more striking examples include,

    Either these Procrustean proto-human folk tore the battered bodies of their quarries apart limb from limb and slaked their thirst with blood, consuming the flesh raw like every other carnivorous beast; or, like early man, some of them understood the advantages of fire as well as the use of missiles and clubs.

    A microcephalic mental equipment was demonstrably more than adequate for the crude, carnivorous, cannibalistic, bone-club wielding, jawbone-cleaving Samsonian phase of human emergence.

    On this thesis man’s predecessors differed from living apes in being confirmed killers: carnivorous creatures, that seized living quarries by violence, battered them to death, tore apart their broken bodies, dismembered them limb from limb, slaking their ravenous thirst with the hot blood of victims and greedily devouring livid writhing flesh.

    To characterize this class B movie stuff as a “theory” is a bit of a stretch. When it comes to human nature, there is nothing in the paper in the form of a coherently elaborated theory at all. The only time Dart even mentions human nature is in the context of a sentence claiming that “recognition of the carnivorous habit as a distinctive australopithecine trait” has implications for understanding it. Based on this flimsy “evidence” that the “Killer Ape Theory” strawman was real, and Dart was its author, Pickering goes on to claim that,

    Ironically, it was Robert Ardrey, an American dramatist (and Dart’s mouthpiece in four popular books), who provided the voice closest to cool detachment when he abstracted the “killer ape hypothesis” thusly: ‘Man is a predator whose natural instinct is to kill with a weapon.’ In no subtle way, predation and aggression were coupled as the ultimate propellants of human evolution.

    Here we must charitably assume that Pickering has never actually read Ardrey’s books, because otherwise we would be forced to conclude that he is a bald-faced liar. The theme of all Ardrey’s books, which reviewed the work, not only of Dart, but of hundreds of other scientists, was that there is such a thing as human nature, and it is significant and important. The idea that he was nothing but “Dart’s mouthpiece” is beyond absurd. His books are easily available today, and anyone who takes the trouble to actually read them can confirm that fact. In the process, they will see that when Ardrey wrote that “Man is a predator whose natural instinct is to kill with a weapon,” he had nothing even remotely similar to the “Killer Ape Theory” in mind. Pickering himself amply documents in his book that not only human beings but our hominin ancestors were predators, that they killed, and that they did so with weapons. That leaves only the term “instinct” as the basis for all Pickering’s fulminations against Ardrey and the rest.

    In order to pull off this feat, he had to come up with a fairy tale according to which they all believed that humans were driven to hunt by some kind of genetically induced rage, directed both against their animal prey and against other human beings. He, on the other hand, while generously admitting that some emotions are relevant to hunting behavior, prefers a more cerebral version of hunting behavior characterized by cool calculation rather than emotion. This is really the only significant difference he comes up with between their version of the hunting hypothesis and his own, and it is apparently the basis of his conclusion that they were “evil,” whereas he is “good.” According to Pickering, those earlier, “evil” proponents of the hunting hypothesis believed in a version of hunting behavior that was actually more characteristic of chimpanzees. He goes to a great deal of trouble to distinguish their “emotional” style of hunting from our own, “cerebral” version. To quote from the book,

    Expertise in hunting the large, warily dangerous prey of human foragers and cashing in on its concomitant evolutionary rewards does not mature from the hell-bent approach employed by chimpanzees to dispatch their prey. Application of brute physicality is an efficient means for chimpanzees to kill because they hunt in groups, they concentrate on much smaller animals than themselves, and they rely on their superhuman strength and agility to overpower their victims… A human has no hope of out-muscling, out-running, or out-climbing his typical prey, but, if his mind stays clear, he can absolutely count on out-thinking those animals.

    …all the brain power and fine motor control in the world aren’t worth a damn to a human hunter if his brain’s commands are overridden by emotion. Clear thinking in survival situations – and what is a hunting and gathering life if not a daily struggle for survival? – is dependent on control of emotion.

    General emotional control in hominins may not have yet developed by the time of Homo erectus. But, the archaeological record of Homo erectus implies strongly that the species applied emotional control, at least situationally, when it hunted…

    So much for Pickering’s version of the difference between his ideas and the “Killer Ape Theory” he attributes to Ardrey, Lorenz, et al. Even as it stands, it’s a pathetic excuse, not only for failing to attribute the many “original” ideas in his book about human hunting to the virtually identical versions presented by Ardrey in his The Hunting Hypothesis, not to mention years earlier by Carveth Read in his The Origin of Man, but for actually denouncing and vilifying those authors. However, the “difference” itself is imaginary, as can be easily seen by anyone who takes the trouble to read what Ardrey and the rest actually wrote.

    Pickering’s deception is particularly obvious in the case of Lorenz. He made it perfectly clear that he didn’t associate Pickering’s version of “emotion” with hunting behavior. Indeed, he was dubious about associating “aggression” with hunting at all.  For example, in On Aggression, he wrote,

    In yet another respect the fight between predator and prey is not a fight in the real sense of the word:  the stroke of the paw with which a lion kills his prey may resemble the movements that he makes when he strikes his rival, just as a shot-gun and a rifle resemble each other outwardly; but the inner motives of the hunter are basically different from those of the fighter.  The buffalo which the lion fells provokes his aggression as little as the appetizing turkey which I have just seen hanging in the larder provokes mine.  The differences in these inner drives can clearly be seen in the expression movements of the animal:  a dog about to catch a hunted rabbit has the same kind of excitedly happy expression as he has when he greets his master or awaits some longed-for treat. From many excellent photographs it can be seen that the lion, in the dramatic moment before he springs, is in no way angry.  Growling, laying the ears back, and other well-known expression movements of fighting behavior are seen in predatory animals only when they are very afraid of a wildly resisting prey, and even then the expressions are only suggested.

    In none of his books did Lorenz ever suggest that hunting behavior in man was any different from that of other hunting animals.  What Ardrey actually wrote on the subject, as opposed to the “killer ape theory” flim flam that is constantly and falsely attributed to him, is much the same.  For example, in The Hunting Hypothesis he discusses what might have given us an advantage as nascent predators as follows,

    Yet we had some advantages.  There was the innocence of animals, such as Paul Martin has described in North American prey pursued by skilled but unfamiliar intruders from Asia; our Pliocene victims could only have been easy marks.  There was our ape brain, incomparably superior to that of any natural predator.  If the relatively unintelligent lioness can practice tactical hunting and plan ambushes as Schaller has described, then our talents must have been of an order far beyond lion imagination.

    In his Serengeti studies George Schaller shows that any predator taking his prey is cool, calculating, methodical.  It is a kind of aggressive behavior radically unlike his defense of a kill against competitors.  Then there is overwhelming emotion, rage, and sometimes a lethal outcome unlike normal relations within a species.  Such would have been the situation between competing hunters in glacial Europe.

    Pickering anointed poor Carveth Read and other early authors honorary proponents of the “killer ape theory” even though they were long dead before Dart ever published his paper.  At the beginning of chapter 3 he writes,

    The same nauseating waves of cannibalism, unquenchable bloodthirst, cruel misogyny (specifically), and raging misanthropy (generally) that course through the writings of Dart and Ardrey also typify the pre-Dartian ramblings of Morris, Campbell and Read.

    Dart may have been a bit over the top in his “seminal” paper, but the above is truly unhinged. Pickering must imagine that no one will take the trouble to excavate Read’s The Origin of Man from some dusty library stack and read it.  In fact, it can be read online.  Even out of the context of his time, this furious rant against Read is truly grotesque.  Read the first few chapters of his book, and you will see that his hypothesis about hunting behavior in early man actually came quite close to the version proposed by Pickering.

    In his eagerness to virtue signal to the other inmates of his academic tribe that his version of the hunting hypothesis is “good” as opposed to the “evil” versions of the “others,” Pickering actually pulls off the amusing stunt of using now irrelevant studies once favored by the Blank Slaters of old because they “proved” early man didn’t hunt, to attack Dart, supposed author of the “killer ape theory,” even though the same studies undermine his own hypotheses.  In particular, he devotes a great deal of space to describing studies done by C. K. Brain to refute Dart’s claim that statistical anomalies in the distribution of various types of bones in South African caves were evidence that certain bones had been used as weapons and other tools. It was masterful work on cave taphonomy, in which Brain explored the statistics of bone accumulations left by animals as diverse as hyenas, leopards, owls and porcupines.  Unfortunately, he chose to publish his work under the title The Hunters or the Hunted? The work was immediately seized on by the Blank Slaters as “proof” that early man hadn’t hunted at all, and was really a meek vegetarian, just as Ashley Montagu and his pals had been telling us all along.  Brain was immediately anointed a “good” opponent of hunting, as opposed to the “evil” men whose ideas his work supposedly contradicted.  Pickering apparently wanted to bask in the reflected glory of Brain’s “goodness.”

    Of course, all that happened in the days when one could still claim that chimpanzees were “amiable vegetarians,” as Ashley Montagu put it.  It’s worth noting that when Jane Goodall began publishing observations that suggested they aren’t really all that “amiable” after all, she was vilified by the Blank Slaters just as viciously as Pickering has vilified Dart, Ardrey, Lorenz and Read.  Now we find Pickering trotting out Brain’s book even though it “disproves” his own hypotheses.  Meanwhile it has been demonstrated, for example, in careful isotopic studies of Australopithecine teeth, that the species Dart first discovered ate a substantial amount of meat after all, as he had always claimed.  Clearly, they were also occasionally prey animals.  So were Neanderthals, as their remains have been found in predator bone accumulations as well.  That hardly proves that they didn’t hunt.

    In short, if you like to read popular science books, beware the Pinker Effect.  I note in passing that C. K. Brain never stooped to the practice of “proving” the value and originality of his own work via vicious ad hominem attacks on other scientists.  He was Dart’s friend, and remained one to the end.

  • On Steven Pinker’s Second Fairy Tale: The “Hydraulic Theory” of Konrad Lorenz

    Posted on August 4th, 2018 Helian 2 comments

    You have to hand it to Steven Pinker.  At least his book about the Blank Slate drew attention to the fact that it ever happened.  It would have been nice if he’d gotten the history right as well.  Unfortunately, his description of the affair airbrushes the two men most responsible for ending it completely out of the picture.  I refer to Robert Ardrey and Konrad Lorenz.  Ardrey played by far the most significant role of any individual in smashing the Blank Slate orthodoxy.  He was an outsider, a former playwright, whose highly popular and influential books insisting on the existence and significance of human nature made a mockery of the Blank Slate among intelligent lay people.  The academic and professional tribe of “scientists” in the behavioral disciplines never forgave him.  The humiliation they suffered during their slow, post-Ardrey return to reality following their long debauch with ideologically motivated myths tarted up as “science” rankles to this day.  One can still find occasional artifacts of their hatred in the popular media, as I noted in an earlier post.  That probably explains why Pinker dropped Ardrey down the memory hole.  It can be understood, at least in part, as a belated defense of his academic ingroup.  The result was a ludicrous “history” of the Blank Slate affair that studiously avoided mentioning the role of the individual who played the single most important role in ending it.

    Pinker’s rationalization for ignoring Ardrey and Lorenz was certainly crude enough.  He managed it in a single paragraph in Chapter 7 of The Blank Slate.  The first part of the paragraph reads as follows:

    The Noble Savage, too, is a cherished doctrine among critics of the sciences of human nature.  In Sociobiology, Wilson mentioned that tribal warfare was common in human prehistory.  The against-sociobiologists declared that this had been “strongly rebutted both on the basis of historical and anthropological studies.” I looked up these “studies,” which were collected in Ashley Montagu’s Man and Aggression.  In fact they were just hostile reviews of books by the ethologist Konrad Lorenz, the playwright Robert Ardrey, and the novelist William Golding (author of Lord of the Flies).  Some of the criticisms were, to be sure, deserved.  Ardrey and Lorenz believed in archaic theories such as that aggression was like the discharge of a hydraulic pressure and that evolution acted for the good of the species.  But far stronger criticisms of Ardrey and Lorenz had been made by the sociobiologists themselves.  (On the second page of The Selfish Gene, for example, Dawkins wrote, “The trouble with these books is that the authors got it totally and utterly wrong.”)  In any case, the reviews contained virtually no data about tribal warfare.

    That’s for sure!  Man and Aggression, published in 1968, was a collection of essays by some of the most prominent anthropologists and psychologists of the day.  It’s quite true that it had little to do with tribal warfare, because it was intended mainly as an attempt to refute Ardrey and Lorenz’ insistence on the existence and importance of human nature.  As such, it is one of the most important pieces of historical source material relevant to the Blank Slate.  Among other things, it demonstrates that Pinker’s portrayal of E. O. Wilson as the knight in shining armor who slew the Blank Slate dragon in Chapter 6 of his book is nonsense.  The battle had been joined long before the appearance of Wilson’s Sociobiology in 1975, and the two chapters in that book that had even mentioned human nature were essentially just restatements of what Ardrey, Lorenz, and several other authors of note, such as Robin Fox, Paul Leyhausen, Desmond Morris, Anthony Storr, and Lionel Tiger, had already written, in part, more than a decade earlier.

    As can be seen in the paragraph from Pinker’s book, he cites two main reasons for airbrushing Ardrey and Lorenz out of existence.  The first is Dawkins’ comment in The Selfish Gene that, “The trouble with these books is that the authors got it totally and utterly wrong.”  If you actually read what Dawkins was talking about, you’ll see this comment had nothing to do with human nature, the Blank Slate, or sociobiology.  Indeed, it had nothing to do with the theme of Pinker’s book, or any fundamental theme in the work of either Ardrey or Lorenz, either, for that matter.  It turns out Dawkins was referring solely to their favorable comments about group selection! In one of the more amusing ironies of scientific history, E. O. Wilson, Pinker’s heroic debunker of the Blank Slate, later outed himself as a far more devoted advocate of group selection than anything Ardrey or Lorenz ever dreamed of!  If they were “totally and utterly wrong,” Wilson must be doubly “totally and utterly wrong,” and himself a candidate for the memory hole.  I’ve written at length about this dubious rationale for dismissing Ardrey and Lorenz elsewhere.

    However, group selection wasn’t Pinker’s only excuse for creating his fairy tale version of the Blank Slate.  His other one (or more correctly, two), is contained in the sentence, “Ardrey and Lorenz believed in archaic theories such as that aggression was like the discharge of a hydraulic pressure and that evolution acted for the good of the species.”  In fact, Lorenz often does discuss whether particular adaptations are for the good of the species or not.  He does so mainly to illustrate his point that, while the innate behavioral traits that can result in aggression in human beings were “good for the species,” in the sense that they promoted the survival of our species as a whole, at the time that they evolved, the same traits may now be “not for the good of the species” in the radically different environment we find ourselves in today.  One could say in the same sense that our hands, feet and eyes are “for the good of the species,” because we are better off with them than without them.  I can only surmise that Pinker falsely imagined that Lorenz was trying to claim that selection operated at the level of the species.  In fact, he never claimed anything of the sort.  In the few instances he actually spoke of selection in his book, On Aggression, he was careful to point out that it took place at the level of individuals, or perhaps a few individuals.

    It turns out that the history behind Pinker’s comment that “Ardrey and Lorenz believed in archaic theories such as that aggression was like the discharge of a hydraulic pressure” is a great deal more interesting.  I seriously doubt that Pinker even knew what he was talking about here.  His knowledge of the “hydraulic theory” was probably second or third hand.  In the first place, Lorenz never had a “hydraulic theory.”  He did have a “hydraulic model,” and referred to it often.  An animated version of the model, which he first presented at a conference in 1949, may be found here.  Lorenz never referred to it as other than an admittedly crude model, but one which illustrated what he actually saw in the behavior of many different species.  Anyone who is capable of raising fish in an aquarium or ducks and geese in their backyard can read Lorenz and see for themselves that, whether Pinker thinks the model is “archaic” or not, it does nicely illustrate aspects of how these species actually behave.

    This raises the question of how this simple and accurate model became transmogrified into a “theory.”  It turns out that the “authority” the Blank Slaters of old most often used to “refute” Lorenz’ “hydraulic theory” was one Daniel Lehrman, a professor at Rutgers and a purveyor of behaviorist flim flam of the first water.  His A Critique of Konrad Lorenz’s Theory of Instinctive Behavior appeared in The Quarterly Review of Biology back in 1953. By all means, have a look at it.  To read it is to marvel at how delusional the Blank Slaters had become by the early 50’s.  Lehrman denied the existence of instincts, not only in the great apes and human beings, as Ashley Montagu did in the 60’s, but in rats and geese, no less!  For example, according to Lehrman, the innate egg retrieving behavior of geese described by Lorenz was not innate, but was a result of “conditioning” while the goose was still in the egg!  He cited studies according to which the neck movements used by the goose to retrieve the egg actually began developing a few days after the egg was laid when the “head is stimulated tactually by the yolk sac.”  Apparently it never occurred to Lehrman that he was merely kicking the can down the road.  Why would the fetal goose move its head one way rather than another in response to this “conditioning?”  Indeed, why would it move its head at all?  As Lorenz put it, there must have been an innate “schoolmarm” to teach the goose these things.  Lehrman gives several other examples, explaining innate developmental feedback mechanisms in terms of behaviorist “conditioning.”  The following is another example of his “devastating” arguments against Lorenz:

    Now, what exactly is meant by the statement that a behavior pattern is “inherited” or “genetically controlled?”  Lorenz undoubtedly does not think that the zygote contains the instinctive act in miniature, or that the gene is the equivalent of an entelechy which purposefully and continuously tries to push the organism’s development in a particular direction.  Yet one or both of these preformistic assumptions, or their equivalents, must underlie the notion that some behavior patterns are “inherited” as such.

    Quick!  Someone run and tell the computer programmers!  Everything they’ve done to date is clearly impossible.  Are they trying to claim that their video games actually exist in miniature in the software they’re trying to peddle?  Lehrman next gives a perfect illustration of what George Orwell was talking about when he spoke of “Newspeak,” in his 1984.  Newspeak was a version of the language that would make it impossible to even conceptualize “Crimethink.”  As Lehrman puts it,

    To lump them (behavioral traits) together under the rubric of “inherited” or “innate” characteristics serves to block the investigation of their origin just at the point where it should leap forward in meaningfulness.

    Elsewhere Lehrman makes a similar case for actually expunging the words “innate” and “instinct” from the behavioral science dictionary.  To borrow Orwell’s terminology, he considered them “doubleplus ungood.”  In retrospect, I think we can see perfectly well at this point what kinds of “investigation” really were blocked for upwards of half a century by the high priests of the Blank Slate, and it certainly wasn’t the kind that was dear to the heart of Prof. Lehrman.  But what of the “hydraulic theory?”  Here’s what Lehrman has to say about it:

    Lorenz (1950) describes in some detail a hydraulic model, or analogy, of the instinct mechanism, including a reservoir of excitation and devices for keeping it dammed up (innate releasing mechanism) until appropriate keys unlock the sluices.  Hydraulic analogies have reappeared so regularly in Lorenz’s papers since 1937 as to justify the impression that they are not really analogies – they are actual representations of Lorenz’s conception and channeling of “instinctive energy.”

    Got that?  You’d better not hum the tune to the Rolling Stones’ “She’s Like a Rainbow” too often, or you’ll find yourself accused of proposing a “theory” of the transformation of women into rainbows.  The same goes for “Like the Dawn,” by the Oh Hellos.  Heaven forefend that you ever describe a cloud as like a camel, or a whale, or a unicorn, or you might find yourself accused of proposing a “theory” of the transubstantiation of clouds.  That, my friends, was the magical process by which Lorenz’ simple model was transmuted into Pinker’s mythical “archaic hydraulic theory.”

    So much for Pinker’s “fake but true” history of the Blank Slate.  To my knowledge he has never yet shown the slightest remorse for the violence he has done to the history of what is probably the greatest scientific debacle of all time, not to mention to the legacy of the two men most responsible for restoring some semblance of sanity to the behavioral sciences.  I would caution those who expect that he ever will not to hold their breath.  As for Lehrman, he became a member of any number of prestigious learned societies, and received any number of prestigious awards and decorations for his brilliant contributions to the advancement of “science.”  It would seem that, just as no good deed goes unpunished, no bad deed goes unrewarded.

  • Why the Blank Slate? Let Max Eastman Explain

    Posted on July 29th, 2018 Helian 1 comment

    In my opinion, science, broadly construed, is the best “way of knowing” we have.  However, it is not infallible, is never “settled,” cannot “say” anything, and can be perverted and corrupted for any number of reasons.  The Blank Slate affair was probably the worst instance of the latter in history.  It involved the complete disruption of the behavioral sciences for a period of more than half a century in order to prop up the absurd lie that there is no such thing as human nature.  Its grip on the behavioral sciences hasn’t been completely broken to this day.  It’s stunning when you think about it.  Whole branches of the sciences were derailed to support a claim that must seem ludicrous to any reasonably intelligent child.  Why?  How could such a thing have happened?  At least part of the answer was supplied by Max Eastman in an article that appeared in the June 1941 issue of The Reader’s Digest.  It was entitled, Socialism Doesn’t Jibe with Human Nature.

    Who was Max Eastman?  Well, he was quite a notable socialist himself in his younger days.  He edited a radical magazine called The Masses from 1913 until it was suppressed in 1918 for its antiwar content.  In 1922 he traveled to the Soviet Union, and stayed to witness the reality of Communism for nearly two years, becoming friends with a number of Bolshevik worthies, including Trotsky.  Evidently he saw some things that weren’t quite as ideal as he had imagined.  He became increasingly critical of the Stalin regime, and eventually of socialism itself.  In 1941 he became a roving editor for the anti-Communist Reader’s Digest, and the above article appeared shortly thereafter.

    In it, Eastman reviewed the history of socialism from its modest beginnings in Robert Owen’s utopian village of New Harmony through a host of similar abortive experiments to the teachings of Karl Marx, and finally to the realization of Marx’s dream in the greatest experiment of them all: the Bolshevik state in Russia.  He noted that all the earlier experiments had failed miserably but, in his words, “The results were not better than Robert Owen’s but a million times worse.”  The outcome of Lenin’s great experiment was,

    Officialdom gone mad, officialdom erected into a new and merciless exploiting class which literally wages war on its own people; the “slavery, horrors, savagery, absurdities and infamies of capitalist exploitation” so far outdone that men look back to them as to a picnic on a holiday; bureaucrats everywhere, and behind the bureaucrats the GPU; death for those who dare protest; death for theft – even of a piece of candy; and this sadistic penalty extended by a special law to children twelve years old!  People who still insist that this is a New Harmony are for the most part dolts or mental cowards.  To honest men with courage to face facts it is clear that Lenin’s experiment, like Robert Owen’s, failed.

    It would seem the world produced a great many dolts and mental cowards in the years leading up to 1941.  In the 30’s Communism was all the rage among intellectuals, not only in the United States but worldwide.  As Malcolm Muggeridge put it in his book, The Thirties, at the beginning of the decade it was rare to find a university professor who was a Marxist, but at the end of the decade it was rare to find one who wasn’t.  If you won’t take Muggeridge’s word for it, just look at the articles in U.S. intellectual journals such as The Nation, The New Republic, and the American Mercury during, say, the year 1934.  Many of them may be found online.  These were all very influential magazines in the 30’s, and at times during the decade they all took the line that capitalism was dead, and it was now merely a question of finding a suitable flavor of socialism to replace it.  If you prefer reality portrayed in fiction, read the guileless accounts of the pervasiveness of Communism among the intellectual elites of the 1930’s in the superb novels of Mary McCarthy, herself a leftist radical.

    Eastman was too intelligent to swallow the “common sense” socialist remedies of the newsstand journals.  He had witnessed the reality of Communism firsthand, and had followed its descent into the hellish bloodbath of the Stalinist purges and mass murder by torture and starvation in the Gulag system.  He knew that socialism had failed everywhere else it had been tried as well.  He also knew the reason why.  Allow me to quote him at length:

    Why did the monumental efforts of these three great men (Owen, Marx and Lenin, ed.) and tens of millions of their followers, consecrated to the cause of human happiness – why did they so miserably fail? They failed because they had no science of human nature, and no place in their science for the common sense knowledge of it.

    In October 1917, after the news came that Kerensky’s government had fallen, Lenin, who had been in hiding, appeared at a meeting of the Workers and Soldiers’ Soviet of Petrograd.  He mounted the rostrum and, when the long wild happy shouts of greeting had died down, remarked: “We will now proceed to the construction of a socialist society.” He said this as simply as though he were proposing to put up a new cowbarn.  But in all his life he had never asked himself the equally simple question: “How is this newfangled contraption going to fit in with the instinctive tendencies of the animals it was made for?”

    Lenin actually knew less about the science of man, after a hundred years, than Robert Owen did.  Owen had described human nature, fairly well for an amateur, as “a compound of animal propensities, intellectual faculties and moral qualities.”  He had written into the preamble of the constitution of New Harmony that “man’s character… is the result of his formation, his location, and of the circumstances within which he exists.”

    It seems incredible, but Karl Marx, with all his talk about making socialism “scientific,” took a step back from this elementary notion. He dropped out the factor of man’s hereditary nature altogether.  He dropped out man altogether, so far as he might present an obstacle to social change.  “The individual,” he said, “has no real existence outside the milieu in which he lives.” By which he meant: Change the milieu, change the social relations, and man will change as much as you like.  That is all Marx ever said on the primary question.  And Lenin said nothing.

    That is why they failed.  They were amateurs – and worse than amateurs, mystics – in the subject most essential to their success.

    To begin with, man is the most plastic and adaptable of animals.  He truly can be changed by his environment, and even by himself, to a unique degree, and that makes extreme ideas of progress reasonable.  On the other hand, he inherits a set of emotional impulses or instincts which, although they can be trained in various ways in the individual, cannot be eradicated from the race.  And no matter how much they may be repressed or redirected by training, they reappear in the original form – as sure as a hedgehog puts out spines – in every baby that is born.

    Amazing, considering these words were written in 1941.  Eastman had a naïve faith that science would remedy the situation, and that, as our knowledge of human behavior advanced, mankind would see the truth.  In fact, by 1941, those who didn’t want to hear the inconvenient truth that the various versions of paradise on earth they were busily concocting for the rest of us were foredoomed to failure already had the behavioral sciences well in hand.  They made sure that “science said” what they wanted it to say.  The result was the Blank Slate, a scientific debacle that brought humanity’s efforts to gain self-understanding to a screeching halt for more than half a century, and one that continues to haunt us even now.  Their agenda was simple – if human nature stood in the way of heaven on earth, abolish human nature!  And that’s precisely what they did.  It wasn’t the first time that ideological myths have trumped the truth, and it certainly won’t be the last, but the Blank Slate may well go down in history as the deadliest myth of all.

    I note in passing that the Blank Slate was the child of the “progressive Left,” the same people who today preen themselves on their great respect for “science.”  In fact, all the flat earthers, space alien conspiracy nuts, and anti-Darwin religious fanatics combined have never pulled off anything as damaging to the advance of scientific knowledge as the Blank Slate debacle.  It’s worth keeping in mind the next time someone tries to regale you with fairy tales about what “science says.”

  • How a “Study” Repaired History and the Evolutionary Psychologists Lived Happily Ever After

    Posted on June 12th, 2018 Helian No comments

    It’s a bit of a stretch to claim that those who have asserted the existence and importance of human nature have never experienced ideological bias. If that claim is true, then the Blank Slate debacle could never have happened. However, we know that it happened, based not only on the testimony of those who saw it for the ideologically motivated debasement of science that it was, such as Steven Pinker and Carl Degler, but of the ideological zealots responsible for it themselves, such as Hamilton Cravens, who portrayed it as The Triumph of Evolution. The idea that the Blank Slaters were “unbiased” is absurd on the face of it, and can be immediately debunked by simply counting the number of times they accused their opponents of being “racists,” “fascists,” etc., in books such as Richard Lewontin’s Not in Our Genes, and Ashley Montagu’s Man and Aggression. More recently, the discipline of evolutionary psychology has experienced many similar attacks, as detailed, for example, by Robert Kurzban in an article entitled, Alas poor evolutionary psychology.

    The reasons for this bias have never been a mystery, either to the Blank Slaters and their latter day leftist descendants, or to evolutionary psychologists and other proponents of the importance of human nature. Leftist ideology requires not only that human beings be equal before the law, but that the menagerie of human identity groups they have become obsessed with over the years actually be equal, in intelligence, creativity, degree of “civilization,” and every other conceivable measure of human achievement. On top of that, they must be “malleable,” and “plastic,” and therefore perfectly adaptable to whatever revolutionary rearrangement in society happened to be in fashion. The existence and importance of human nature has always been perceived as a threat to all these romantic mirages, as indeed it is. Hence the obvious and seemingly indisputable bias.

    Enter Jeffrey Winking of the Department of Anthropology at Texas A&M, who assures us that it’s all a big mistake, and there’s really no bias at all! Not only that, but he “proves” it with a “study” in a paper entitled, Exploring the Great Schism in the Social Sciences, that recently appeared in the journal Evolutionary Psychology. We must assume that, in spite of his background in anthropology, Winking has never heard of a man named Napoleon Chagnon, or run across an article entitled Darkness’s Descent on the American Anthropological Association, by Alice Dreger.

    Winking begins his article by noting that “The nature-nurture debate is one that biologists often dismiss as a false dichotomy,” but adds, “However, such dismissiveness belies the long-standing debate that is unmistakable throughout the biological and social sciences concerning the role of biological influences in the development of psychological and behavioral traits in humans.” I agree entirely. One can’t simply hand-wave away the Blank Slate affair and a century of bitter ideological debate by turning up one’s nose and asserting the term isn’t helpful from a purely scientific point of view.

    We also find that Winking isn’t completely oblivious to examples of bias on the “nature” side of the debate. He cites the Harvard study group which “evaluated the merits of sociobiology, and which included intellectual giants like Stephen J. Gould and Richard Lewontin.” I am content to let history judge whether Gould and Lewontin were really “intellectual giants.” Regardless, if Winking actually read these “evaluations,” he cannot have failed to notice that they contained vicious ad hominem attacks on E. O. Wilson and others that it is extremely difficult to construe as anything but biased. Winking goes on to note similar instances of bias by other authors in various disciplines, such as,

    Many researchers use [evolutionary approaches to the study of international relations] to justify the status quo in the guise of science.

    The totality [of sociobiology and evolutionary psychology] is a myth of origin that is compelling precisely because it resonates strongly with Euro American presuppositions about the nature of the world.

    …in the social sciences (with the exception of primatology and psychology) sociobiology appeals most to right-wing social scientists.

    These are certainly compelling examples of bias. Now, however, Winking attempts to demonstrate that those who point out the bias, and correctly interpret the reasons for it, are just as biased themselves. As he puts it,

    Conversely, those who favor biological approaches have argued that those on the other side are rendered incapable of objective assessment by their ideological promotion of equality. They are alleged to erroneously reject evidence of biological influences because such evidence suggests that social outcomes are partially explained by biology, and this might inhibit the realization of equality. Their critiques of biological approaches are therefore often blithely dismissed as examples of the moralistic/naturalistic fallacy. This line of reasoning is exemplified in the quote by biologist Jerry Coyne:

    If you can read the [major Evolutionary Psychology review paper] and still dismiss the entire field as worthless, or as a mere attempt to justify scientists’ social prejudices, then I’d suggest your opinions are based more on ideology than judicious scientific inquiry.

    I can’t imagine what Winking finds “blithe” about that statement! Is it really “blithe” to so much as suggest that people who dismiss entire fields of science as worthless may be ideologically motivated? I note in passing that Coyne must have thought long and hard about that statement, because his Ph.D. advisor was none other than Richard Lewontin, whom he still honors and admires! Add to that the fact that Coyne is about as far from “right wing” as anyone can imagine, as a visit to his Why Evolution is True website will confirm, and the notion that he is being “blithe” here is ludicrous. Winking’s other examples of “blitheness” are similarly dubious, including,

    For critics, the heart of the intellectual problem remains an ideological adherence to the increasingly implausible view that human behavior is strictly determined by socialization… Should [social] hierarchies result strictly from culture, then the possibilities for an egalitarian future were seen to be as open and boundless as our ever-malleable brains might imagine.

    Like the Church, a number of contemporary thinkers have also grounded their moral and political views in scientific assumptions about… human nature, specifically that there isn’t one.

    Unlike the “comparable” statements by the Blank Slaters, these statements don’t accuse those who deny the existence of human nature of being Nazis, nor do they lack evidence to back them up.  On the contrary, one could cite a mountain of supporting evidence supplied by the Blank Slaters themselves.  Winking soon supplies us with the reason for this strained attempt to establish “moral equivalence” between “nature” and “nurture.”  It appears in his “hypothesis,” as follows:

    It is entirely possible that confirmation bias plays no role in driving disagreement and that the overarching debate in academia is driven by sincere disagreements concerning the inferential value of the research designs informing the debate.

    Wait a minute!  Don’t roll your eyes like that!  Winking has a “study” to back up this hypothesis.  Let me explain it to you.  He invented some “mock results” of studies which purported to establish, for example, the increased prevalence of an allele associated with “appetitive aggression” in populations with African ancestry.  Subtle, no?  Then he used Mechanical Turk and social media to come up with a sample of 365 people with master’s degrees or Ph.D.s for a survey on what they thought of the “inferential power” of the fake data.  Another sample of 71 was scraped together for another survey on “research design.”  In the larger sample, 307 described themselves as either only “somewhat” on the “nature” side or “somewhat” on the “nurture” side.  Only 57 claimed they leaned strongly one way or the other.  The triumphant results of the study included, for example, that,

    Participants’ perceptions of inferential value did not vary by the degree to which results supported a particular ideology, suggesting that ideological confirmation bias is not affecting participant perceptions of inferential value.

    Seriously?  Even the author admits that the statistical power of his “study” is low because of the small sample sizes.  But statistical power is only meaningful when the samples actually represent the groups in question — in this case, participants who are unequivocally on the “nature” or “nurture” side.  That is hardly the case here.  Mechanical Turk samples, for example, are biased toward a younger and more liberal demographic.  Most of the participants were on the fence between nature and nurture.  In other words, there’s no telling what their true opinions were, even assuming they reported them honestly.  Even the most extreme Blank Slaters admitted that nature plays a significant role in such bodily functions as urinating, defecating, and breathing, and so could easily have described themselves as “somewhat bioist.”  Perhaps most importantly, any high school student could have seen what this “study” was about.  There is no doubt whatsoever that holders of master’s and doctoral degrees in related disciplines a) had no trouble inferring what the study was about, and b) had an interest in making sure the results demonstrated that they were “unbiased.”  In other words, we’re not exactly talking “double blind” here.
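    To see just how weak the study is on the power question alone, consider a back-of-the-envelope calculation.  The numbers below are illustrative assumptions, not figures from Winking’s paper: suppose the 57 strong partisans split roughly in half, and suppose the real bias effect is a modest standardized difference of d = 0.3.  A normal-approximation power calculation for a two-sided, two-sample test then looks like this:

    ```python
    # Sketch only: approximate power of a two-sided two-sample z-test.
    # Assumed inputs (not from Winking's paper): effect size d = 0.3,
    # ~28 strong "nature" vs ~28 strong "nurture" respondents.
    from math import erf, sqrt

    def normal_cdf(x):
        """Standard normal cumulative distribution function."""
        return 0.5 * (1.0 + erf(x / sqrt(2.0)))

    def two_sample_power(d, n_per_group, z_crit=1.96):
        """Normal-approximation power for a two-sided two-sample test
        of a standardized mean difference d (unit variance assumed)."""
        se = sqrt(2.0 / n_per_group)  # std. error of the mean difference
        return (1.0 - normal_cdf(z_crit - d / se)
                + normal_cdf(-z_crit - d / se))

    print(round(two_sample_power(0.3, 28), 2))  # prints roughly 0.2
    ```

    In other words, under these (admittedly hypothetical) assumptions the study would detect a real, modest bias effect only about one time in five — hardly a basis for declaring that no bias exists.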

    I think the author was well aware that most readers would have no trouble detecting the blatant shortcomings of his “study.”  Apparently to ward off ridicule he wrote,

    Regardless of one’s position, it is important to remind scholars that if they believe a group of intelligent and informed academics could be so unknowingly blinded by ideology that they wholeheartedly subscribe to an unquestionably erroneous interpretation of an entire body of research, then they must acknowledge they themselves are equally as capable of being so misguided.

    Kind of reminds you of the curse over King Tut’s tomb, doesn’t it?  “May those who question my study be damned to dwell among the misguided forever!”  Sorry, my dear Winking, but “a group of intelligent and informed academics” not only could, but were “so unknowingly blinded by ideology that they wholeheartedly subscribed to an unquestionably erroneous interpretation of an entire body of research.”  It was called the Blank Slate, and it derailed the behavioral sciences for more than half a century.  That’s what Pinker’s book was about.  That’s what Degler’s book was about, and yes, that’s even what Cravens’ book was about.  They all did an excellent job of documenting the debacle.  I suggest you read them.

    Or not.  You could decide to believe your study instead.  I have to admit, it would have its advantages.  History would be “fixed,” the lions would lie down with the lambs, and the evolutionary psychologists would live happily ever after.