Secular Humanism and Religion; Standoff at Quillette

As I noted in a recent post (Is Secular Humanism a Religion? Is Secular Humanist Morality Really Subjective?), John Staddon, a Professor of Psychology and Professor of Biology, emeritus, at Duke, published a very timely and important article at Quillette entitled Is Secular Humanism a Religion?, noting the gaping inconsistencies and irrationalities in secular humanist morality. These included its obvious lack of any visible means of support, even one as flimsy as a God, for its claims to authority and legitimacy. My post included a link to a review by Prof. Jerry Coyne, proprietor of the Why Evolution is True website and New Atheist stalwart, which called Prof. Staddon’s article the “worst” ever to appear on Quillette, based on the false assumption that he actually did maintain that secular humanism is a religion. In fact, it’s perfectly obvious from a fair reading of the article that he did nothing of the sort.

Meanwhile, Quillette gave Prof. Coyne the opportunity to post a reply to Staddon. His rebuttal, entitled Secular Humanism is Not a Religion, doubled down on the false assertion that Staddon had claimed it is. Then, in a counterblast entitled Values, Even Secular Ones, Depend on Faith: A Reply to Jerry Coyne, Staddon simply pointed out Prof. Coyne’s already obvious “confusion” about what he had actually written, and elaborated on his contention that secular values depend on faith. As I noted in the following comment I posted at Quillette, I couldn’t agree more: Continue reading “Secular Humanism and Religion; Standoff at Quillette”

On the Illusion of Objective Morality; We Should Have Listened to Westermarck

The illusion of objective morality is amazingly powerful. The evidence is now overwhelming that morality is a manifestation of emotions, and that these emotions exist by virtue of natural selection. It follows that there can be no such thing as objective moral truths. The brilliant Edvard Westermarck explained why more than a century ago in his The Origin and Development of the Moral Ideas:

As clearness and distinctness of the conception of an object easily produces the belief in its truth, so the intensity of moral emotion makes him who feels it disposed to objectivize the moral estimate to which it gives rise, in other words, to assign to it universal validity. The enthusiast is more likely than anybody else to regard his judgments as true, and so is the moral enthusiast with reference to his moral judgments. The intensity of his emotions makes him the victim of an illusion.

Westermarck, in turn, was merely pointing out some of the more obvious implications of what Darwin had written about morality in his The Descent of Man, published in 1871. Today Westermarck is nearly forgotten, what Darwin wrote about morality is ignored as if it didn’t exist, and the illusion is as powerful and persistent as it was more than a century ago. Virtually every human being on the planet either believes explicitly in objective moral truths, or behaves as if they did regardless of whether they admit to believing in them or not. Continue reading “On the Illusion of Objective Morality; We Should Have Listened to Westermarck”

Has It Ever Occurred To You That None Of Us Are Acting Rationally?

Do you imagine that you are acting for the good of all mankind? You are delusional. What is your actual goal when you imagine you are acting for the good of all mankind? Maximization of human happiness? Maximization of the rate at which our species as a whole reproduces? Complete elimination of our species? All of these mutually exclusive goals are deemed by some to be for the “good of all mankind.” How is that possible if there really is such a thing as “the good of all mankind?” The answer is that there is no such thing, for the simple reason that there is no such thing as good, unless one is speaking of a subjective impression.

Look, just stop arguing with me in your mind for a moment and try a thought experiment. Imagine that what I’ve said above about good – that it is merely a subjective impression – is true. In that case, how can we account for the existence of this subjective impression, this overpowering belief that some things are good and other things are evil? It must exist for the same reason that all of our other behavioral predispositions and traits exist – by virtue of natural selection, the same process that accounts for our very existence to begin with. If so, these subjective impressions, these overpowering beliefs, must exist because, in the environment in which they evolved, they enhanced the odds that the responsible genes would survive and reproduce. How, then, is it possible for us to imagine that our goal is “the good of all mankind”? Natural selection does not operate at the level of “all mankind.” It operates at the level of the individual and, perhaps, at the level of small groups. If our goal is to act for “the good of the species,” we can only conclude that the behavioral predispositions responsible for this desire have become “dysfunctional,” in the sense that they are no longer likely to promote the survival of the responsible genes. The most plausible reason they have become “dysfunctional” is that they now exist in the context of a radically changed environment.

This has some obvious implications as far as the rationality of our behavior is concerned. Try tracing the reasons you imagine you have for doing what you do down through the accumulated “rational” muck to the emotional bedrock where they originate. You can string as many reasons together as you want, one following the other, and all perfectly rational, but eventually the chain of reasons must lead back to the origin of them all. That origin cannot be the “good in itself,” because such an object does not exist. It is imaginary. In fact, the bedrock we are seeking consists of behavioral predispositions that exist because they evolved. Being the products of a natural process, they cannot possibly be “rational,” in the sense of having some deeper purpose or meaning more fundamental than themselves. It is evident that these behavioral traits exist because, at least at some point in time and in some environment, they enhanced the odds that the individuals possessing them would survive and reproduce. That, however, is not their purpose, or their function, because there was no one around to assign them a purpose or function. They have no purpose or function. They simply are.

That’s what I mean when I say that none of us acts rationally. The sun does not act rationally when it melts solid objects that happen to fall into it. It does not have the purpose or goal of melting them. It simply does. The ocean does not act rationally when it drowns air-breathing creatures that are unfortunate enough to sink beneath its surface. Millions of creatures have drowned in the ocean, but the ocean didn’t do it on purpose, nor did it have a goal in doing so. In the same sense, our behavioral traits do not have a goal or purpose when they motivate us to act in one way or another. Just as it is a fact of nature that the sun melts solid objects, and the ocean drowns land creatures, it is a fact of nature that we are motivated to do some things, and avoid others. That is what I mean when I say that our behavior is irrational. I don’t mean that it can’t be explained. I do mean that it has no underlying purpose or goal. Goals and purposes are things we assign to ourselves. They cannot be distilled out of the natural world as independent objects or things in themselves.

Consider what this implies when it comes to all the utopian schemes that have ever been concocted for our “benefit” over the millennia. A goal that many of these schemes have had in common is “moral progress.” It is one of the more prominent absurdities of our day that even those among us who are most confident that Darwin was right, who have admitted that there is a connection between morality and our innate behavioral predispositions, and who also realize and have often stated publicly that morality is subjective, nevertheless embrace this goal of “moral progress.” This raises the question, “Progress towards what?” Assuming one realizes and has accepted the fact that morality is subjective, it can’t be progress towards any objective Good, existing independently of what anyone thinks about it. It must, then, be progress towards something going on in conscious minds. However, as noted above, conscious minds are a fact of nature, existing by virtue of natural processes that have no function and no goal. They simply are. Furthermore, our conscious minds are not somehow connected all across the planet in some mystical collective. They all exist independently of each other. They include predispositions that motivate the individuals to whom they belong to have desires and goals. However, those desires and goals cannot possibly exist by virtue of the fact that they benefit all mankind. They exist by virtue of the fact that they enhanced the odds that the responsible genetic material would survive and reproduce. They were selected at the level of the individual, and perhaps of small groups. They were definitely not selected by virtue of any beneficial effect on all mankind.

In other words, when one speaks of “moral progress,” what one is in reality speaking of is progress towards satisfying the whims of some individual. The reason for the existence of these whims has nothing to do with the welfare of all mankind. To the extent that the individual imagines they have some such connection, the whims have become “dysfunctional,” in the sense that they have been redirected towards a goal that is disconnected from the reasons they exist to begin with. Belief in “moral progress,” then, amounts to a blind emotional response to innate whims on the part of individuals who have managed to profoundly delude themselves about exactly what it is they’re up to. The problem, of course, is that they’re not the only ones affected by their delusion. Morality is always aimed at others. They insist that everyone else on the planet must respect their delusion, and allow it to dictate how those others should or should not behave.

This fundamental irrationality applies not just to morality, but to every other aspect of human behavior. Whether it’s a matter of wanting to be “good,” or of “serving mankind,” or accumulating wealth, or having sex, or striving for “success” and recognition, we are never motivated by reason. We are motivated by whims, although we certainly can and do reason about what the whims are trying to tell us. This process of reasoning about whims can result in a bewildering variety of conclusions, most of which have nothing to do with the reasons the whims exist to begin with. You might say that our brains have evolved too quickly. Our innate behavioral baggage has not kept up, and remains appropriate only to environments and forms of society that most of us left behind thousands of years ago. We continue to respond blindly to our emotions without understanding why they exist, pursuing goals that have nothing to do with the reasons those emotions evolved. In effect, we are living in an insane asylum.

I am not suggesting that we all stop having goals and aspirations. Life would be extremely boring without them, and they can be just as noble as we please, at least from our own point of view. From my point of view, the fact that creatures like us can exist at all seems wildly improbable, wonderful, and sublime. For all we know, the life we are a part of may exist on only one of the trillions of planets in our universe. I personally deem it precious, and one of my personal goals is that it be preserved. Others may have different goals. I merely suggest that, regardless of what they are, we keep in mind what motivates us to seek them in the first place. I personally would prefer that we avoid botching the wildly improbable, wonderful, and sublime experiment of nature that is us by failing to understand ourselves.

A New York Intellectual’s Unwitting Expose; Human Nature Among the Ideologues

Norman Podhoretz is one of the New York literati who once belonged to a group of leftist intellectuals he called the Family. He wrote a series of books, including Making It, Breaking Ranks, and Ex-Friends, describing what happened when he underwent an ideological metamorphosis from leftist radical to neoconservative. In the process he created a wonderful anthropological study of human nature in the context of an ingroup defined by ideology. Behavior within that ingroup was similar to behavior within ingroups defined by race, class, religion, ethnicity, or any of the other often subtle differences that enable ingroups to distinguish themselves from the “others.” The only difference was that, in the case of Podhoretz’ Family, the ingroup was defined by loyalty to ideological dogmas. Podhoretz described a typical denizen as follows:

Such a person takes ideas as seriously as an orthodox religious person takes, or anyway used to take, doctrine or dogma. Though we cluck our enlightened modern tongues at such fanaticism, there is a reason why people have been excommunicated, and sometimes even put to death, by their fellow congregants for heretically disagreeing with the official understanding of a particular text or even of a single word. After all, to the true believer everything important – life in this world as well as life in the next – depends on obedience to these doctrines and dogmas, which in turn depends on an accurate interpretation of their meaning and which therefore makes the spread of heresy a threat of limitless proportions.

This fear and hatred of the heretic, together with the correlative passion to shut him up one way or the other, is (to say the least, and in doing so I am bending over backward) as much a character trait of so-called liberal intellectuals as it is of conservatives… For we have seen that “liberal” intellectuals who tell us that tolerance and pluralism are the highest values, who profess to believe that no culture is superior to any other, and who are on that account great supporters of “multiculturalism” will treat these very notions as sacred orthodoxies, will enforce agreement with them in every venue in which they have the power to do so (the universities being the prime example at the moment), and will severely punish any deviation that dares to make itself known.

Podhoretz may not have been aware of the genetic roots responsible for such behavior, but he was certainly good at describing it. His description of status seeking, virtue signaling, hatred of the outgroup, allergic reactions to heretics, etc., within the Family would be familiar to any student of urban street gangs. As anthropological studies go, his books have the added advantage of being unusually entertaining, if only by virtue of the fact that his ingroup included such lions of literature as Norman Mailer, Hannah Arendt, Mary McCarthy, Allen Ginsberg, and Lionel Trilling.

Podhoretz was editor of the influential cultural and intellectual magazine Commentary from 1960 to 1995. When he took over, the magazine already represented the anti-Communist Left. However, he originally planned to take a more radically leftist line, based on the philosophy of Paul Goodman, a utopian anarchist. In his Growing Up Absurd, Goodman claimed that American society was stuck with a number of “incomplete revolutions.” To escape this “absurdity” it was necessary to complete the revolutions. Podhoretz seized on Goodman’s ideas as the “radical” solution to our social ills he was seeking, and immediately started a three-part serialization of his book in Commentary. Another major influence on Podhoretz at the time was Life Against Death by Norman O. Brown, a late Freudian tract intended to reveal “the psychoanalytical meaning of history.” It is depressing to read these books today in the knowledge that they were once taken perfectly seriously by people who imagined themselves to be the cream of the intellectual crop. Goodman certainly chose the right adjective for them – “absurd.”

In any case, as the decade wore on, the Left did become more radicalized, but not in the way foreseen by Podhoretz. What was known then as the New Left emerged, and began its gradual takeover of the cultural institutions of the country, a process that has continued to this day. When Podhoretz came of age, most leftists had abandoned the Stalinism or Trotskyism they had flirted with in the 30’s and 40’s, and become largely “pro-American” and anti-Communist as the magnitude of the slaughter and misery in the Soviet Union under Stalin became impossible to ignore. However, as the war in Vietnam intensified, the dogs returned to their vomit, so to speak. Leftists increasingly became useful idiots – effectively pro-Communist whether they admitted it or not. As Israel revealed its ability to defend itself effectively, they became increasingly anti-Semitic as well, a development that likewise continues to this day. Then, as now, anti-Semitism was fobbed off as “anti-Zionism,” but Podhoretz, a Jew, as were many of the other members of the Family, was not buying it. He may have been crazy enough to take Goodman and Brown seriously, but he was not crazy enough to believe that it was preferable to live in a totalitarian Communist state than in the “imperialist” United States, nor, in light of the Holocaust, was he crazy enough to believe that the creation of a Jewish state was “unjust.” In the following passage he describes his response when he first began to notice this shift in the Zeitgeist, in this case on the part of an erstwhile “friend”:

I was not afraid of Jason. I never hesitated to cut him off when he began making outrageous statements about others, and once I even made a drunken public scene in a restaurant when he compared the United States to Nazi Germany and Lyndon Johnson to Hitler. This comparison was later to become a commonplace of radical talk, but I had never heard it made before, and it so infuriated me that I literally roared in response.

Today, of course, one no longer roars. One simply concludes that those who habitually resort to Hitler comparisons are imbeciles, and leaves it at that. In any case, Podhoretz began publishing “heretical” articles in Commentary, rejecting these notions, and nibbling away at the shibboleths that defined what had once been his ingroup in the process. In the end, he became a full-blown neoconservative. The behavioral responses to Podhoretz’s “treason” to his ingroup should be familiar to all students of human behavior. His first book-length challenge to his ingroup’s sense of its own purity and righteousness was Making It, published in 1967. As Podhoretz recalls,

In an article about Making It and its reception that was itself none too friendly to the book, Norman Mailer summed up the critical response as “brutal – coarse, intimate, snide, grasping, groping, slavering, slippery of reference, crude and naturally tasteless.” But, he added, “the public reception of Making It was nevertheless still on the side of charity if one compared the collective hooligan verdict to the earlier fulminations of the Inner Clan.” By the “Inner Clan,” Mailer meant the community of New York literary intellectuals I myself had called the Family. According to Mailer, what they had been saying in private about Making It even before it was published made the “horrors” of the public reception seem charitable and kind. “Just about everyone in the Establishment” – i.e., the Family – was “scandalized, shocked, livid, revolted, appalled, disheartened, and enraged.” They were “furious to the point of biting their white icy lips… No fate could prove undeserved for Norman, said the Family in thin quivering late-night hisses.”

Podhoretz notes that academia was the first of the cultural institutions of the country to succumb to the radical Gleichschaltung that has now established such firm control over virtually all the rest, to the point that it has become the new “normalcy.” In his words,

For by 1968 radicalism was so prevalent among college students that any professor who resisted it at the very least risked unpopularity and at the worst was in danger of outright abuse. Indeed it was in the universities that the “terror” first appeared and where it operated most effectively.

By the late 60’s the type of behavior that is now ubiquitous on university campuses was hardly a novelty. “De-platforming” was already part of the campus culture:

By 1968 SDS (the leftist Students for a Democratic Society) had moved from argument and example to shouting down speakers with whom it disagreed on the ground that only the “truth” had a right to be heard. And it also changed its position on violence… and a number of its members had gone beyond advocacy to actual practice in the form of bombings and other varieties of terrorism.

As Podhoretz documents, the War in Vietnam had originally been supported, and indeed started and continued, by intellectuals and politicians on the left of the political spectrum. He noted that Robert Kennedy had been prominent among them:

Kennedy too then grew more and more radicalized as radicalism looked more and more like the winning side. Having been one of the architects of the war in Vietnam and a great believer in resistance to Communist power in general, he now managed to suggest that he opposed these policies both in the small and in the large.

However, in one of the rapid changes in party line familiar to those who’ve read the history of Communism in the Soviet Union and memorialized by George Orwell in 1984, the hawks suddenly became doves:

…a point was soon reached where speakers supporting the war were either refused a platform or shouted down when they attempted to speak. A speaker whose criticisms were insufficiently violent could even expect a hard time, as I myself discovered when a heckler at Wayne State in Detroit accused me, to the clear delight of the audience, of not being “that much” against the war because in expressing my opposition to the American role I had also expressed my usual reservations about the virtues of the Communist side.

Of course, there was no Internet in the 60’s, so “de-platforming” assumed a form commensurate with the technology available at the time. Podhoretz describes it as follows:

The word “terror,” like everything else about the sixties, was overheated. No one was arrested or imprisoned or executed; no one was even fired from a job (though there were undoubtedly some who lost out on job opportunities or on assignments or on advances from book publishers they might otherwise have had). The sanctions of this particular reign of “terror” were much milder: one’s reputation was besmirched, with unrestrained viciousness in conversation and, when the occasion arose, by means of innuendo in print. People were written off with the stroke of an epithet – “fink” or “racist” or “fascist” as the case might be – and anyone so written off would have difficulty getting a fair hearing for anything he might have to say. Conversely, anyone who went against the Movement party line soon discovered that the likely penalty was dismissal from the field of discussion.

Seeing others ruthlessly dismissed in this way was enough to prevent most people from voicing serious criticisms of the radical line and – such is the nature of intellectual cowardice – it was enough in some instances to prevent them even from allowing themselves to entertain critical thoughts.

The “terror” is more powerful and pervasive today than it ever was in the 60’s, and its ability to “dismiss from the field of discussion” is far more effective. As a result, denizens of the leftist ingroup, or those who depend on them for their livelihood, tend to be very cautious about rocking the boat. That’s why young, pre-tenure professors include ritualistic denunciations of the established heretics in their fields before they dare to give even a slight nudge to the approved dogmas. Indeed, I’ve documented similar behavior on this blog by academics approaching retirement, so much do they fear ostracism by their own “Families.” Podhoretz noticed the same behavior early on by one of his erstwhile friends:

As the bad boy of American letters – itself an honorific status in the climate of the sixties – he (Norman Mailer) still held a license to provoke and he rarely hesitated to use it, even if it sometimes meant making a fool of himself in the eyes of his own admirers. But there were limits he instinctively knew how to observe; and he observed them. He might excoriate his fellow radicals on a particular point; he might discomfit them with unexpected sympathies (for right-wing politicians, say, or National Guardsmen on the other side of a demonstration) and equally surprising antipathies (homosexuality and masturbation, for example, he insisted on stigmatizing as vices); he might even on occasion describe himself as (dread word) a conservative. But always in the end came the reassuring gesture, the wink of complicity, the subtle signing of the radical loyalty oath.

So much for Podhoretz’s description of the behavioral traits of the denizens of an ideologically defined ingroup. I highly recommend all three of the books noted above, not only as unwitting but wonderfully accurate studies of “human nature,” but as very entertaining descriptions of some of the many famous personalities Podhoretz crossed paths with during his long career. One of them was Jackie Kennedy, who happened to show up at his door one day in the company of his friend, Richard Goodwin, “who had worked in various capacities for President Kennedy.”

She and I had never met before, but we seemed to strike an instant rapport, and at her initiative I soon began seeing her on a fairly regular basis. We often had tea alone together in her apartment on Fifth Avenue where I would give her the lowdown on the literary world and the New York intellectual community – who was good, who was overrated, who was amusing, who was really brilliant – and she would reciprocate with the dirt about Washington society. She was not in Mary McCarthy‘s league as a bitchy gossip (who was?), but she did very well in her own seemingly soft style. I enjoyed these exchanges, and she (an extremely good listener) seemed to get a kick out of them too.

Elsewhere Podhoretz describes McCarthy as “our leading bitch intellectual.” Alas, she was an unrepentant radical, too, and even did a Jane Fonda in North Vietnam, but I still consider her one of our most brilliant novelists. I guess there’s no accounting for taste when it comes to ingroups.

Robert Plomin’s “Blueprint”: The Reply of the Walking Dead

The significance of Robert Plomin’s Blueprint is not that every word therein is infallible. Some reviewers have questioned his assertions about the relative insignificance of the role that parents, schools, culture, and other environmental factors play in the outcome of our lives, and it seems to me the jury is still out on many of these issues. See, for example, the thoughtful review by Razib Khan in the National Review. What is significant about it is Plomin’s description of new and genuinely revolutionary experimental tools of rapidly increasing power and scope that have enabled us to confirm beyond any reasonable doubt that our DNA has a very significant influence on human behavior. In other words, there is such a thing as “human nature,” and it is important. This truth might seem obvious today. It is also a fact, however, that this truth was successfully suppressed and denied for over half a century by the vast majority of the “scientists” who claimed to be experts on human behavior.

There is no guarantee that such scientific debacles are a thing of the past. Ideologues devoted to the quasi-religious faith that the truth must take a back seat to their equalist ideals are just as prevalent now as they were during the heyday of the Blank Slate. Indeed, they are at least as powerful now as they were then, and they would like nothing better than to breathe new life into the flat-earth dogmas they once foisted on the behavioral sciences. Consider, for example, a review of Blueprint by Nathaniel Comfort entitled “Genetic determinism rides again,” which appeared in the prestigious journal Nature. The first paragraph reads as follows:

It’s never a good time for another bout of genetic determinism, but it’s hard to imagine a worse one than this. Social inequality gapes, exacerbated by climate change, driving hostility towards immigrants and flares of militant racism. At such a juncture, yet another expression of the discredited, simplistic idea that genes alone control human nature seems particularly insidious.

Can anyone with an ounce of common sense, not to mention the editors of a journal that purports to speak for “science,” read such a passage and conclude that the author will continue with a dispassionate review of the merits of the factual claims made in a popular science book? One wonders what on earth they were thinking. Apparently Gleichschaltung is sufficiently advanced at Nature that the editors have lost all sense of shame. Consider, for example, the hoary “genetic determinism” canard. A “genetic determinist” is a strawman invented more than 50 years ago by the Blank Slaters of old. These imaginary beings were supposed to believe that our behavior is rigidly programmed by “instincts.” I’ve searched diligently during the ensuing years, but have never turned up a genuine example of one of these unicorns. They are as mythical as witches, but the Blank Slaters never tire of repeating their hackneyed propaganda lines. It would be hard to “discredit” the “simplistic idea that genes alone control human nature” given that no one ever made such a preposterous claim to begin with, and Plomin certainly makes no such claim now. Beyond that, what could possibly be the point of dragging in all the familiar dogmas of the “progressive” tribe? Apparently Nature would have us believe that scientific “truth” is to be determined by ideological litmus tests.

In the next paragraph Comfort supplies Plomin, a professor of behavior genetics, with the title “educational psychologist,” and sulks that his emphasis on chromosomal DNA leaves microbiologists, epigeneticists, RNA experts, and developmental biologists out in the cold. Seriously? Since when did these fields manage to hermetically seal themselves off from DNA and become “non-overlapping magisteria?” Do any microbiologists, epigeneticists, RNA experts or developmental biologists actually exist who consider DNA irrelevant to their field?

Comfort next makes the logically questionable claim that, because “Darwinism begat eugenics,” “Mendelism begat worse eugenics,” and medical genetics begat the claim that men with an XYY genotype were violent, behavioral genetics must therefore “beget” progeny that are just as bad. QED.

Genome-wide association (GWA) methods, the increasingly powerful tools described in Blueprint that have now put the finishing touches on the debunking of the Blank Slate, are dismissed as something that “lures scientists” because of its “promise of genetic explanations for complex traits, such as voting behavior or investment strategies.” How Comfort distills this “promise” out of anything that actually appears in the book is beyond me. One wonders if he ever actually read it. That suspicion is greatly strengthened when one reads the following paragraph:

A polygenic score is a correlation coefficient. A GWAS identifies single nucleotide polymorphisms (SNPs) in the DNA that correlate with the trait of interest. The SNPs are markers only. Although they might, in some cases, suggest genomic neighborhoods in which to search for genes that directly affect the trait, the polygenic score itself is in no sense causal. Plomin understands this and says so repeatedly in the book – yet contradicts himself several times by arguing that the scores are in fact, causal.

You have to hand it to Comfort, he can stuff a huge amount of disinformation into a small package. In the first place, the second and third sentences contradict each other. If SNPs are variations in the rungs of DNA that occur between individuals, they are not just markers, and they don’t just “suggest genomic neighborhoods in which to search for genes that directly affect the trait.” If they are reliable and replicable GWA hits, they are one of the actual points at which the trait is affected. Plomin most definitely does not “understand” that polygenic scores are in no sense causal, and nowhere does he say anything of the sort, far less “repeatedly.” What he does say is:

In contrast, correlations between a polygenic score and a trait can only be interpreted causally in one direction – from the polygenic score to the trait. For example, we have shown that the educational attainment polygenic score correlates with children’s reading ability. The correlation means that the inherited DNA differences captured by the polygenic score cause differences between children in their school achievement, in the sense that nothing in our brains, behavior, or environment can change inherited differences in DNA sequence.

I would be very interested to hear what Comfort finds “illogical” about that passage, and by virtue of what magical mental prestidigitations he proposes to demonstrate that the score is a “mere correlation.” Elsewhere we read,

Hereditarian books such as Charles Murray and Richard Herrnstein’s The Bell Curve (1994) and Nicholas Wade’s 2014 A Troublesome Inheritance (see N. Comfort Nature 513, 306–307; 2014) exploited their respective scientific and cultural moments, leveraging the cultural authority of science to advance a discredited, undemocratic agenda. Although Blueprint is cut from different ideological cloth, the consequences could be just as grave.

In fact, neither The Bell Curve nor A Troublesome Inheritance has ever been discredited, if by that term is meant being proved factually wrong. If books are “discredited” according to how many ideological zealots begin foaming at the mouth on reading them, of course, it’s a different matter. Beyond that, if something is true, it does not become false by virtue of Comfort deeming it “undemocratic.” I could go on, but what’s the point? Suffice it to say that Comfort’s favorite “scientific authority” is Richard Lewontin, an obscurantist high priest of the Blank Slate if ever there was one, and co-author of Not in Our Genes.

I can understand the desire of Nature’s editors to virtue signal their loyalty to the prevailing politically correct fashions, but this “review” is truly abject. It isn’t that hard to find authors on the left of the political spectrum who can write a book review that is at least a notch above the level of tendentious ideological propaganda. See, for example, Kathryn Paige Harden’s review of Blueprint in the Spectator. Somehow she managed to write it without implying that Plomin is a Nazi in every second sentence. I suggest that next time they look a little harder.

My initial post about Blueprint tended to emphasize the historical ramifications of the book in the context of the Blank Slate disaster. As a result, my description of the scientific substance of the book was very broad-brush. However, there are many good reviews out there that cover that ground, expressing in the process some of my own reservations about Plomin’s conclusions regarding the importance of the environment. See, for example, the excellent review by Razib Khan in the National Review linked above. As I mentioned in my earlier post, the book itself is only 188 pages long, so, by all means, read it.

Robert Plomin’s “Blueprint” – The Blank Slate and the Behavioral Genetics Insurgency

Robert Plomin’s Blueprint is a must read. That would be true even if it were “merely” an account of recent stunning breakthroughs that have greatly expanded our understanding of the links between our DNA and behavior. Beyond that, however, it reveals a little-appreciated aspect of history: the guerrilla warfare carried on by behavioral geneticists against the Blank Slate orthodoxy from a very early date. You might say the book is an account of the victorious end of that warfare. From now on, those who deny the existence of heritable genetic effects on human behavior will self-identify as belonging to the same category as the seedier televangelists, or even professors in university “studies” departments.

Let’s begin with the science. We have long known, by virtue of thousands of twin and adoption studies, that many complex human traits, including psychological traits, are more or less heritable due to differences in DNA. These methods also enable us to come up with a ballpark estimate of the degree to which these traits are influenced by genetics. However, until very recently we have not been able to detect exactly which inherited differences in DNA sequences are actually responsible for the variations we see in these traits. That’s where the “revolution” in genetics described by Plomin comes in. It turns out that detecting these differences was a far more challenging task than optimistic scientists expected at first. As he put it,

When the hunt began twenty-five years ago everyone assumed we were after big game – a few genes of large effect that were mostly responsible for heritability. For example, for heritabilities of about 50 per cent, ten genes each accounting for 5 per cent of the variance would do the job. If the effects were this large, it would require a sample size of only 200 to have sufficient power to detect them.

This fond hope turned out to be wishful thinking. As noted in the book, some promising genes were studied, and some claims were occasionally made in the literature that a few such “magic” genes had been found. The result, according to Plomin, was a fiasco. The studies could not be replicated. It was clear by the turn of the century that a much broader approach would be necessary. This, however, would require the genotyping of tens of thousands of single-nucleotide polymorphisms, or SNPs (snips). A SNP is a change in a single one of the billions of rungs of the DNA ladder each of us carries. SNPs are one of the main reasons for differences in the DNA sequence among different human beings. To make matters worse, it was expected that sample sizes of a thousand or more individuals would have to be checked in this way to accumulate enough data to be statistically useful. At the time, such genome-wide association (GWA) studies would have been prohibitively expensive. Plomin notes that he attempted such an approach to find the DNA differences associated with intelligence, with the aid of a few shortcuts. He devoted two years to the study, only to be disappointed again. It was a second false start. Not a single DNA association with intelligence could be replicated.

Then, however, a major breakthrough began to make its appearance in the form of SNP chips. According to Plomin, these chips could “genotype many SNPs for an individual quickly and inexpensively. SNP chips triggered the explosion of genome-wide association studies.” He saw their promise immediately, and went back to work attempting to find SNP associations with intelligence. The result? A third false start. The chips available at the time were still too expensive, and could identify too few SNPs. Many other similar GWA studies failed miserably as well. Eventually, one did succeed, but there was a cloud within the silver lining. The effect sizes of the SNP associations found were all extremely small. Then things began to snowball. Chips were developed that could identify hundreds of thousands instead of just tens of thousands of SNPs, and sample sizes in the tens of thousands became feasible. Today, sample sizes can be in the hundreds of thousands. As a result of all this, revolutionary advances have been made in just the past few years. Numerous genome-wide significant hits have been found for a host of psychological traits. And now we know the reason why the initial studies were so disappointing. In Plomin’s words,

For complex traits, no genes have been found that account for 5 per cent of the variance, not even 0.5 per cent of the variance. The average effect sizes are in the order of 0.01 per cent of the variance, which means that thousands of SNP associations will be needed to account for heritabilities of 50 per cent… Thinking about so many SNPs with such small effects was a big jump from where we started twenty-five years ago. We now know for certain that heritability is caused by thousands of associations of incredibly small effect. Nonetheless, aggregating these associations in polygenic scores that combine the effects of tens of thousands of SNPs makes it possible to predict psychological traits such as depression, schizophrenia and school achievement.
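To get a feel for what effect sizes that small imply, here is a rough back-of-envelope sketch (my own illustration, not something taken from the book) of the sample size needed to detect a single variant, using the standard Fisher z approximation for testing a correlation. The significance thresholds and power level are conventional assumptions, not figures Plomin gives.

```python
# Back-of-envelope sketch (not from Blueprint): approximate sample size needed
# to detect a single variant explaining a given share of trait variance,
# using the standard Fisher z approximation for testing a correlation.
from math import atanh, sqrt
from scipy.stats import norm

def required_n(variance_explained, alpha=5e-8, power=0.8):
    """Approximate N needed to detect a correlation of r = sqrt(variance_explained)."""
    r = sqrt(variance_explained)
    z_alpha = norm.ppf(1 - alpha / 2)  # two-sided significance threshold
    z_power = norm.ppf(power)
    return int(((z_alpha + z_power) / atanh(r)) ** 2) + 3

# A "big game" gene explaining 5 per cent of the variance, tested at p < 0.05:
print(required_n(0.05, alpha=0.05))  # on the order of 150-200 people
# A typical SNP explaining 0.01 per cent, at genome-wide significance (5e-8):
print(required_n(0.0001))            # on the order of 400,000 people
```

Under those assumptions, a gene accounting for 5 per cent of the variance could indeed have been found with a couple of hundred subjects, whereas an effect of 0.01 per cent requires samples in the hundreds of thousands, which is roughly where the field has now arrived.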

In short, we now have a tool that, as I write this, is rapidly increasing in power, and that enables falsifiable predictions regarding many psychological traits based on DNA alone. As Plomin puts it,

The DNA revolution matters much more than merely replicating results from twin and adoption studies. It is a game-changer for science and society. For the first time, inherited DNA differences across our entire genome of billions of DNA sequences can be used to predict psychological strengths and weaknesses for individuals, called personal genomics.
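For readers wondering what such a DNA-based prediction actually looks like, here is a minimal sketch of how a polygenic score is computed. The SNP identifiers and effect weights below are made up for illustration; in a real score the weights come from GWA study results and there are tens of thousands of them.

```python
# Minimal sketch of a polygenic score: a weighted sum of effect-allele counts
# across many SNPs. All SNP IDs and weights here are hypothetical.

# GWA-derived weight per copy of the effect allele at each SNP (tiny effects).
effect_weights = {
    "rs0000001": 0.0021,
    "rs0000002": -0.0013,
    "rs0000003": 0.0008,
    # ...tens of thousands more in a real score
}

# One individual's genotype: number of effect alleles (0, 1 or 2) at each SNP.
genotype = {
    "rs0000001": 2,
    "rs0000002": 0,
    "rs0000003": 1,
}

def polygenic_score(genotype, weights):
    """Sum allele counts multiplied by their GWA effect weights."""
    return sum(weights[snp] * count for snp, count in genotype.items())

print(polygenic_score(genotype, effect_weights))
```

Each individual weight is minuscule, but summing tens of thousands of them yields a score that predicts a useful share of the variance in traits such as school achievement, and, because the inherited DNA differences being summed are fixed at conception, the correlation between score and trait can only be interpreted causally in one direction, as Plomin notes.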

As an appreciable side benefit, thanks to this revolution we can now officially declare the Blank Slate stone cold dead. It’s noteworthy that this revolutionary advance in our knowledge of the heritable aspects of our behavior did not happen in the field of evolutionary psychology, as one might expect. Diehard Blank Slaters have been directing their ire in that direction for some time. They could have saved themselves the trouble. While the evolutionary psychologists have been amusing themselves inventing inconsequential just-so stories about the more abstruse aspects of our sexual behavior, a fifth column that germinated long ago in the field of behavioral genetics was about to drive the decisive nail into the Blank Slate’s coffin. Obviously, it would have been an inappropriate distraction for Plomin to expand on the fascinating history behind this development in Blueprint. Read between the lines, though, and it’s quite clear that he knows what’s been going on.

It turns out that the behavioral geneticists were already astute at dodging the baleful attention of the high priests of the Blank Slate, flying just beneath their radar, at a very early date. A useful source document recounting some of that history, entitled Origins of Behavior Genetics: The Role of The Jackson Laboratory, was published in 2009 by Donald Dewsbury, emeritus professor of psychology at the University of Florida. He notes that,

A new field can be established and coalesce around a book that takes loosely evolving material and organizes it into a single volume. Examples include Watson’s (1914) Behavior: An Introduction to Comparative Psychology and Wilson’s (1975) Sociobiology. It is generally agreed that Fuller and Thompson’s 1960 Behavior Genetics served a similar function in establishing behavior genetics as a separate field.

However, research on the effects of genes on behavior had already begun much earlier, in the 1930s, when the Blank Slate already had a firm grip on the behavioral sciences. According to the paper, Harvard alumnus Alan Gregg, who was Director of the Medical Sciences Division of the Rockefeller Foundation,

…developed a program of “psychobiology” or “mental hygiene” at the Foundation. Gregg viewed mental illness as a fundamental problem in society and believed that there were strong genetic influences. There was a firm belief that the principles to be discovered in nonhuman animals would generalize to humans. Thus, fundamental problems of human behavior might be more conveniently and effectively studied in other species.

The focus on animals turned out to be a very wise decision. For many years it enabled the behavioral geneticists to carry on their work while taking little flak from the high priests of the Blank Slate, whose ire was concentrated on scientists who were less discreet about their interest in humans, in fields such as ethology. Eventually Gregg teamed up with Clarence Little, head of the Jackson Laboratory in Bar Harbor, Maine, and established a program to study mice, rabbits, guinea pigs, and, especially, dogs. Gregg wrote papers about the selective breeding of dogs for high intelligence and good disposition. However, as his colleagues were aware, another of his goals “was conclusively to demonstrate a high heritability of human intelligence.”

Fast forward to the 60’s. It was a decade in which the Blank Slate hegemony began to slowly crumble under the hammer blows of the likes of Konrad Lorenz, Niko Tinbergen, Robert Trivers, Irenäus Eibl-Eibesfeldt, and especially the outsider and “mere playwright” Robert Ardrey. In 1967 the Institute for Behavioral Genetics (IBG) was established at the University of Colorado by Prof. Jerry McClearn with his colleagues Kurt Schlesinger and Jim Wilson. In the beginning, McClearn et al. were a bit coy, conducting “harmless” research on the behavior of mice, but by the early 1970’s they had begun to publish papers that were explicitly about human behavior. It finally dawned on the Blank Slaters what they were up to, and they were subjected to the usual “scientific” accusations of fascism, Nazism, and serving as running dogs of the bourgeoisie, but by then it was too late. The Blank Slate had already become a laughing stock among lay people who were able to read and had an ounce of common sense. Only the “experts” in the behavioral sciences would be rash enough to continue futile attempts to breathe life back into the corpse.

Would that some competent historian could reconstruct what was going through the minds of McClearn and the rest when they made their bold and potentially career-ending decision to defy the Blank Slate and establish the IBG. I believe Jim Wilson is still alive, and no doubt he could tell some wonderful stories about this nascent insurgency. In any case, in 1974 Robert Plomin made what was, for a young professor, the very bold decision to join the Institute. One of the results of that fortuitous decision was the superb book that is the subject of this post. As noted above, a digression into the Blank Slate affair would only have been a distraction from the truly revolutionary developments revealed in his book. However, there is no question that he was perfectly well aware of what had been going on in the “behavioral sciences” for many years. Consider, for example, the following passage, about why research results in behavioral genetics are so robust and replicate so strongly:

Another reason seems paradoxical: behavioral genetics has been the most controversial topic in psychology during the twentieth century. The controversy and conflict surrounding behavioral genetics raised the bar for the quality and quantity of research needed to convince people of the importance of genetics. This has had the positive effect of motivating bigger and better studies. A single study was not enough. Robust replication across studies tipped the balance of opinion.

As the Germans say, “Was mich nicht umbringt, macht mich stark” (What doesn’t kill me makes me strong). If you were looking for a silver lining to the Blank Slate, there you have it. What more can I say? The book is a short 188 pages, but in those pages is concentrated a wealth of knowledge bearing on the critical need of our species to understand itself. If you would know yourself, then by all means, buy the book.

Morality and Reason – Why Do We Do the Things We Do?

Consider the evolution of life from the very beginning. Why did the first stirrings of life – molecules that could reproduce themselves – do what they did? The answer is simple – chemistry. As life forms became more complex, they eventually acquired the ability to exploit external sources of energy, such as the sun or thermal vents, to survive and reproduce. They improved the odds of survival even further by acquiring the ability to move towards or away from such resources. One could easily program a machine to perform such simple tasks. Eventually these nascent life forms increased the odds that they would survive and reproduce even further by acquiring the ability to extract energy from other life forms. Those other life forms could only survive by virtue of acquiring mechanisms to defend themselves from these attacks. This process of refining the traits necessary to survive continues to this day. We refer to it as natural selection. Survival tools of astounding complexity have evolved in this way, such as the human brain, with its ability to evoke consciousness of such things as the information received from our sense organs, drives such as thirst, hunger, and sexual desire, and our emotional responses to, for example, our own behavior and the behavior of others. Being conscious of these things, we can also reason about them, considering how best to satisfy our appetites for food, water, sex, etc., and how to interpret the emotions we experience as we interact with others of our species.

A salient feature of all these traits, from simple to complex, is the reason they exist to begin with. They exist because at the time and in the environment in which they evolved, they enhanced the odds that we would survive, or at least they did to the extent that they were relevant to our survival at all. They exist for no other reason. Our emotions and predispositions to behave in some ways and not others are certainly no exception. They are innate, in the sense that their existence depends on genetic programming. Thanks to natural selection, we also possess consciousness and the ability to reason. As a result, we can reason about what these emotions and predispositions mean, and how we should respond to them. They are not rigid instincts, and they do not “genetically determine” our behavior. In the case of a subset of them, we refer to the outcome of this process of reasoning about and seeking to interpret them as morality. It is these emotions and predispositions that are the root cause for the existence of morality. Without them, morality as we know it would not exist. They exist by virtue of natural selection. At some time and in some environment, they promoted our survival and reproduction. It can hardly be assumed that they will accomplish the same result at a later date and in a different environment. In fact, it is quite apparent that in the drastically different environment we live in today, they often accomplish the opposite. For a sizable subset of the human population, morality has become maladaptive.

The remarkable success of our species in expanding from a small cohort of African apes to cover virtually the entire planet is due in large part to our ability to deal with rapid changes in the environment. We can thrive in the tropics or the arctic, and in deserts or rain forests. However, when it comes to morality, we face a very fundamental problem in dealing with such radical changes. Our brain spawns illusions that make it extremely difficult for us to grasp the nature of the problem we are dealing with. We perceive Good, Evil, Rights, etc., as real, objective things. These illusions are extremely powerful, because by being powerful they could most effectively regulate our behavior in ways that promoted survival. Now, in many cases, the illusions have become a threat to our survival, but we can’t shake them, or see them for what they really are. What they are is subjective constructs that are completely incapable of existing independently outside of the minds of individuals. Even those few who claim to see through the illusion are found defending various “Goods,” “Evils,” “Rights,” “Duties,” and other “Oughts” in the very next breath as if they were referring to real, objective things. They often do so in support of behaviors that are palpably maladaptive, if not suicidal.

An interesting feature of such maladaptive behaviors is the common claim that they are justified by “reason.” The Scotch-Irish philosopher Francis Hutcheson explained very convincingly, almost 300 years ago, why moral claims can’t be based on reason alone. As David Hume put it somewhat later, “Reason is, and ought only to be the slave of the passions, and can never pretend to any other office than to serve and obey them.” Reason alone can never do anything but chase its own tail. After all, computers don’t program themselves. There must be something to reason about. In the case of human behavior the chain of reasons can be as long and as elaborate as you please, but it must always and invariably originate in an innate predisposition or drive, whether it be hunger, thirst, lust, or what is occasionally referred to as our “moral sense.” Understood in that way, all of our actions are “unreasonable,” because reason can never, ever serve as the cause of our actions itself. Reasoning about good and evil is equivalent to reasoning about the nature of God. In both cases one is reasoning about imaginary things. Behavior can never be objectively good or evil, because those categories only exist as illusions. It can, however, be objectively described as adaptive or maladaptive, depending on whether it enhances the odds of genetic survival or not.

In the case of morality, maladaptive behavior is seldom limited to a single individual. Morality is always other-regarding. The illusion that Good, Evil, etc., exist as independent, objective things implies that, not just we ourselves, but everyone else “ought” to behave in ways that embrace the “Good,” and resist “Evil.” As a result we assume a “right” to dictate potentially maladaptive and/or suicidal behavior to others. If we are good at manipulating the relevant emotions, those others may quite possibly agree with us. If we can convince them to believe our version of the illusion, they may accept our reasoning about what our moral emotions are “really” trying to tell us, and become convinced that they must act in ways detrimental to their own survival as well. They may clearly see that they are being induced to behave in a way that is not to their advantage, but the illusion would tend to paralyze any attempt to behave differently. The only means of resistance would be to manipulate the moral sense so as to evoke different illusions of what good and evil “really” are.

If, as noted above, there is nothing objectively good or evil about anything, it follows that there is nothing objectively good or evil about any of these behaviors. They are simply biological facts that happen to be observable at a given time and in a given environment. However, whatever we seek to accomplish in life, we will be more likely to succeed if we base our actions on facts rather than illusions. That applies to the illusions associated with our moral sense as much as to any others. The vast majority of us, including myself, have an almost overwhelming sense that the illusions are real, and that good and evil are objective things. However, it is becoming increasingly dangerous, if not suicidal, to continue to cling to these illusions, assuming one places any value on survival.

Most of us have goals in life. In most cases those goals are based on illusions such as those described above. Human beings tend to stumble blindly through life, without a clue about the fundamental reasons they behave the way they do. Occasionally one sees them jumping off cliffs, stridently insisting that others must jump off the cliff too, because it is “good,” or it is their “duty.” Perhaps Socrates had such behavior in mind when he muttered, “The unexamined life is not worth living,” at his trial. Before jumping off a cliff, would it not be wise to closely examine your reasons for doing so, following those reasons to their emotional source, and considering why those emotions exist to begin with? I, too, have goals. Paramount among my personal goals are survival and reproduction. There is nothing intrinsically or objectively better about those goals than anyone else’s, including the goal of jumping off a cliff. I have them because I perceive them to be in harmony with the reasons I exist to begin with. Those who do not wish to survive and reproduce appear to me to be sick and dysfunctional biological units. I do not care to be such a unit. As corollary goals I wish for the continued evolution of my species to become ever more capable of survival, and beyond that for the continued existence of biological life in general. I have no basis for claiming that my goals are “correct,” or that the goals of others are “wrong.” Mine are just as much expressions of emotion as anyone else’s. Call them whims, if you will, but at least they have the virtue of being whims that aren’t self-destructive.

Supposing you have similar goals, I suggest that it would behoove you to shed the illusion of objective morality. That is by no means the same thing as dispensing with morality entirely, nor does it imply that you can’t treat a version of morality you deem conducive to your survival as an absolute. In other words, it doesn’t imply “moral relativism.” It is our nature to perceive whatever version of morality we happen to favor as absolute. Understanding why that is our nature will not result in moral nihilism, but it will have the happy effect of pulling the rug out from under the feet of the moralistic bullies who have always assumed a right to dictate behavior to the rest of us. To understand morality is to realize that the “moral high ground” they imagine they’re standing on doesn’t exist.

It is unlikely that any of us, merely by recognizing morality and the associated illusions of objective good and evil for what they really are, will be able to resist or significantly influence the massive shifts in population and ideology, or the other radical changes to the world we live in, that are happening at an ever-increasing rate. However, it seems to me that recognizing the truth will at least enhance our ability to cope with those changes. In other words, it will help us survive, and, after all, survival is the reason that morality exists to begin with.

How a “Study” Repaired History and the Evolutionary Psychologists Lived Happily Ever After

It’s a bit of a stretch to claim that those who have asserted the existence and importance of human nature have never experienced ideological bias. If that claim is true, then the Blank Slate debacle could never have happened. However, we know that it happened, based not only on the testimony of those who saw it for the ideologically motivated debasement of science that it was, such as Steven Pinker and Carl Degler, but also on that of the ideological zealots responsible for it themselves, such as Hamilton Cravens, who portrayed it as The Triumph of Evolution. The idea that the Blank Slaters were “unbiased” is absurd on the face of it, and can be immediately debunked by simply counting the number of times they accused their opponents of being “racists,” “fascists,” etc., in books such as Richard Lewontin’s Not in Our Genes, and Ashley Montagu’s Man and Aggression. More recently, the discipline of evolutionary psychology has experienced many similar attacks, as detailed, for example, by Robert Kurzban in an article entitled, Alas poor evolutionary psychology.

The reasons for this bias have never been a mystery, either to the Blank Slaters and their latter-day leftist descendants, or to evolutionary psychologists and other proponents of the importance of human nature. Leftist ideology requires not only that human beings be equal before the law, but that the menagerie of human identity groups they have become obsessed with over the years actually be equal, in intelligence, creativity, degree of “civilization,” and every other conceivable measure of human achievement. On top of that, they must be “malleable” and “plastic,” and therefore perfectly adaptable to whatever revolutionary rearrangement of society happened to be in fashion. The existence and importance of human nature has always been perceived as a threat to all these romantic mirages, as indeed it is. Hence the obvious and seemingly indisputable bias.

Enter Jeffrey Winking of the Department of Anthropology at Texas A&M, who assures us that it’s all a big mistake, and there’s really no bias at all! Not only that, but he “proves” it with a “study” in a paper entitled, Exploring the Great Schism in the Social Sciences, that recently appeared in the journal Evolutionary Psychology. We must assume that, in spite of his background in anthropology, Winking has never heard of a man named Napoleon Chagnon, or run across an article entitled Darkness’s Descent on the American Anthropological Association, by Alice Dreger.

Winking begins his article by noting that “The nature-nurture debate is one that biologists often dismiss as a false dichotomy,” but adds, “However, such dismissiveness belies the long-standing debate that is unmistakable throughout the biological and social sciences concerning the role of biological influences in the development of psychological and behavioral traits in humans.” I agree entirely. One can’t simply hand-wave away the Blank Slate affair and a century of bitter ideological debate by turning up one’s nose and asserting that the term “nature-nurture” isn’t helpful from a purely scientific point of view.

We also find that Winking isn’t completely oblivious to examples of bias on the “nature” side of the debate. He cites the Harvard study group which “evaluated the merits of sociobiology, and which included intellectual giants like Stephen J. Gould and Richard Lewontin.” I am content to let history judge whether Gould and Lewontin were really “intellectual giants.” Regardless, if Winking actually read these “evaluations,” he cannot have failed to notice that they contained vicious ad hominem attacks on E. O. Wilson and others that it is extremely difficult to construe as anything but biased. Winking goes on to note similar instances of bias by other authors in various disciplines, such as,

Many researchers use [evolutionary approaches to the study of international relations] to justify the status quo in the guise of science.

The totality [of sociobiology and evolutionary psychology] is a myth of origin that is compelling precisely because it resonates strongly with Euro American presuppositions about the nature of the world.

…in the social sciences (with the exception of primatology and psychology) sociobiology appeals most to right-wing social scientists.

These are certainly compelling examples of bias. Now, however, Winking attempts to demonstrate that those who point out the bias, and correctly interpret the reasons for it, are just as biased themselves. As he puts it,

Conversely, those who favor biological approaches have argued that those on the other side are rendered incapable of objective assessment by their ideological promotion of equality. They are alleged to erroneously reject evidence of biological influences because such evidence suggests that social outcomes are partially explained by biology, and this might inhibit the realization of equality. Their critiques of biological approaches are therefore often blithely dismissed as examples of the moralistic/naturalistic fallacy. This line of reason is exemplified in the quote by biologist Jerry Coyne

If you can read the [major Evolutionary Psychology review paper] and still dismiss the entire field as worthless, or as a mere attempt to justify scientists’ social prejudices, then I’d suggest your opinions are based more on ideology than judicious scientific inquiry.

I can’t imagine what Winking finds “blithe” about that statement! Is it really “blithe” to so much as suggest that people who dismiss entire fields of science as worthless may be ideologically motivated? I note in passing that Coyne must have thought long and hard about that statement, because his Ph.D. advisor was none other than Richard Lewontin, whom he still honors and admires!  Add to that the fact that Coyne is about as far as you can imagine from “right wing,” as anyone can see by simply visiting his Why Evolution is True website, and the notion that he is being “blithe” here is ludicrous. Winking’s other examples of “blitheness” are similarly dubious, including,

For critics, the heart of the intellectual problem remains an ideological adherence to the increasingly implausible view that human behavior is strictly determined by socialization… Should [social] hierarchies result strictly from culture, then the possibilities for an egalitarian future were seen to be as open and boundless as our ever-malleable brains might imagine.

Like the Church, a number of contemporary thinkers have also grounded their moral and political views in scientific assumptions about… human nature, specifically that there isn’t one.

Unlike the “comparable” statements by the Blank Slaters, these statements neither accuse those who deny the existence of human nature of being Nazis, nor do they lack evidence to back them up.  On the contrary, one could cite a mountain of evidence, supplied by the Blank Slaters themselves, to back them up.  Winking soon supplies us with the reason for this strained attempt to establish “moral equivalence” between “nature” and “nurture.”  It appears in his “hypothesis,” as follows:

It is entirely possible that confirmation bias plays no role in driving disagreement and that the overarching debate in academia is driven by sincere disagreements concerning the inferential value of the research designs informing the debate.

Wait a minute!  Don’t roll your eyes like that!  Winking has a “study” to back up this hypothesis.  Let me explain it to you.  He invented some “mock results” of studies which purported to establish, for example, the increased prevalence of an allele associated with “appetitive aggression” in populations with African ancestry.  Subtle, no?  Then he used Mechanical Turk and social media to come up with a sample of 365 people with Master’s degrees or Ph.D.s for a survey on what they thought of the “inferential power” of the fake data.  Another sample of 71 was scraped together for another survey on “research design.”  In the larger sample, 307 described themselves as either only “somewhat” on the “nature” side, or “somewhat” on the “nurture” side.  Only 57 claimed they leaned strongly one way or the other.  The triumphant results of the study included, for example, that,

Participants’ perceptions of inferential value did not vary by the degree to which results supported a particular ideology, suggesting that ideological confirmation bias is not affecting participant perceptions of inferential value.

Seriously?  Even the author admits that the statistical power of his “study” is low because of the small sample sizes.  However, statistical power only applies where the samples are truly random, meaning, in this case, where the participants are either unequivocally on the “nature” or the “nurture” side.  That is hardly the case.  Mechanical Turk samples, for example, are biased towards a younger and more liberal demographic.  Most of the participants were on the fence between nature and nurture.  In other words, there’s no telling what their true opinions were even if they were honest about them.  Even the most extreme Blank Slaters admitted that nature plays a significant role in such bodily functions as urinating, defecating, and breathing, and so could have easily described themselves as “somewhat bioist.”  Perhaps most importantly, any high school student could have easily seen what this “study” was about.  There is no doubt whatsoever that holders of Master’s and doctoral degrees in related disciplines a) had no trouble inferring what the study was about, and b) had an interest in making sure that the results demonstrated that they were “unbiased.”  In other words, we’re not exactly talking “double blind” here.
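
For readers who like to see the arithmetic behind the complaint about low power, here is a minimal sketch in Python.  The group sizes loosely mirror the 57 strongly committed respondents mentioned above, split into two camps; the assumed effect size, alpha level, and the Monte Carlo approach itself are my own illustrative choices, not figures or methods taken from Winking’s paper.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    def estimated_power(n1, n2, effect_size=0.4, alpha=0.05, trials=5000):
        """Monte Carlo estimate of the power of a two-sample t-test:
        the fraction of simulated studies that detect a true group
        difference of the given (assumed) effect size."""
        hits = 0
        for _ in range(trials):
            a = rng.normal(0.0, 1.0, n1)          # group with no shift
            b = rng.normal(effect_size, 1.0, n2)  # group shifted by the assumed effect size
            _, p = stats.ttest_ind(a, b)
            if p < alpha:
                hits += 1
        return hits / trials

    # Splitting 57 strongly committed respondents into camps of 28 and 29:
    # with these assumptions the estimate comes out around 0.3, far below
    # the conventional 0.8 threshold.
    print(estimated_power(28, 29))

The exact number changes with different assumptions, but the qualitative point stands: with samples this small, a finding of “no significant difference” is very weak evidence of anything.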

I think the author was well aware that most readers would have no trouble detecting the blatant shortcomings of his “study.”  Apparently, to ward off ridicule, he wrote,

Regardless of one’s position, it is important to remind scholars that if they believe a group of intelligent and informed academics could be so unknowingly blinded by ideology that they wholeheartedly subscribe to an unquestionably erroneous interpretation of an entire body of research, then they must acknowledge they themselves are equally as capable of being so misguided.

Kind of reminds you of the curse over King Tut’s tomb, doesn’t it?  “May those who question my study be damned to dwell among the misguided forever!”  Sorry, my dear Winking, but “a group of intelligent and informed academics” not only could be, but actually were “so unknowingly blinded by ideology that they wholeheartedly subscribed to an unquestionably erroneous interpretation of an entire body of research.”  It was called the Blank Slate, and it derailed the behavioral sciences for more than half a century.  That’s what Pinker’s book was about.  That’s what Degler’s book was about, and yes, that’s even what Cravens’ book was about.  They all did an excellent job of documenting the debacle.  I suggest you read them.

Or not.  You could decide to believe your study instead.  I have to admit, it would have its advantages.  History would be “fixed,” the lions would lie down with the lambs, and the evolutionary psychologists would live happily ever after.

Of Ingroups and Outgroups and the Hatreds they Spawn

Did it ever strike you as odd that the end result of Communism, a philosophy that was supposed to usher in a paradise of human brotherhood, was the death of 100 million people, give or take, and the self-decapitation of countries like Cambodia and the former Soviet Union?  Does it seem counter-intuitive that the adherents of a religion that teaches “blessed are the peacemakers” should have launched wars that killed tens of millions?  Is it bewildering that another one, promoted as the “religion of peace,” should have launched its zealots out of Arabia, killing millions more, and becoming the most successful vehicle of colonialism and imperialism ever heard of?  Do you find the theory that human warfare resulted from purely environmental influences that were the unfortunate outcome of the transition to Neolithic economies somewhat implausible?  In fact, all of these “anomalies” are predictable manifestations of what is perhaps both the most important and the most dangerous aspect of innate human behavior: our tendency to perceive others in terms of ingroups and outgroups.

Our tendency to associate the good with our ingroup, and all that is evil, disgusting and contemptible with outgroups, is a most inconvenient truth for moral philosophy.  You might call it the universal solvent of all moral systems concocted to date.  It is a barrier standing in the way of all attempts to manipulate human moral emotions, to force them to serve a “higher purpose,” or to cajole them into promoting the goal of “human flourishing.”  Because it is such an inconvenient truth, it was vehemently denied as one aspect of the Blank Slate catastrophe.  Attempts were made to scare it away by calling it bad names.  Its different specific manifestations were labeled “racism,” “bigotry,” “xenophobia,” and so on.  The result was something like squeezing jello.  The harder we squeezed, the faster the behavior slipped through our fingers in new forms.  New outgroups emerged to take the place of the old ones, but the hatred remained, often more virulent than before.

It is impossible to understand human behavior without first determining who the ingroups are, and who their associated outgroups are.  Consider, for example, recent political events in the United States.  Wherever one looks, whether in news media, social media, on college campuses, or in the “jokes” of comedians, one finds manifestations of a furious hatred directed at Trump and his supporters.  There is jubilation when they are murdered in effigy on stage, or shot in reality on baseball fields.  The ideologically defined ingroup responsible for all this hatred justifies its behavior with a smokescreen of epithets, associating all sorts of “bad” qualities with its outgroup, following a pattern that should be familiar to anyone who has studied a little history.  In fact, their hate is neither rational, nor does it result from any of these “bad” things.  They hate for the same reason that humans have always hated: because they have identified Trump and his supporters as an outgroup.

Going back several decades, one can see the same phenomenon unfolding under the rubric of the Watergate Affair.  In that case, of course, Nixon and his supporters were the outgroup, and the ingroup can be more specifically identified with the “mainstream media” of the day.  According to the commonly peddled narrative, Nixon was a very bad man who committed very terrible crimes.  I doubt it, but it doesn’t matter one way or the other.  Nixon was deposed in what we are informed was a “triumph of justice” by some heroic reporters.  In fact, it was a successful coup d’état carried out behind a façade of legality.  The idea that what Nixon did or didn’t do had anything to do with it can be immediately exposed as a fiction by anyone who is aware of the type of human behavior described in this post, and who bothers to read through the front pages of the Washington Post and the New York Times during the 18 months or so the affair lasted.  There he will not find a conscientious attempt to keep readers informed about affairs in the world that might be important to them.  Rather, he will see an unrelenting obsession with Watergate, inexplicable as other than the manifestation of a deep hatred.  The result was a dangerous destabilization of the U.S. government, leading to further attempts to depose legitimately elected Presidents, as we saw in the case of Clinton, and as we now see underway in the case of Trump.  In Nixon’s day the mainstream media controlled the narrative.  They were able to fob off their coup d’état as the triumph of virtue and justice.  That won’t happen this time around.  Now there are powerful voices on the other side, and the outcome of such a “nice and legal” coup d’état carried out against Trump will be the undermining of the trust of the American people in the legitimacy of their political system at best.  At worst, some are suggesting we will find ourselves in the middle of a civil war.

Those still inclined to believe that the behavior in question really can be explained by the rationalizations used to justify it need only look a bit further back in history.  There they will find descriptions of exactly the same behavior, but rationalized in ways that appear incomprehensible and absurd to modern readers.  For example, read through the accounts of the various heresies that afflicted Christianity over the years.  Few Christians today could correctly identify the “orthodox” number of persons, natures, and wills of the Godhead, or the “orthodox” doctrines regarding the form of Communion or the efficacy of faith, and yet such issues have spawned ingroup/outgroup identification accompanied by the usual hatreds, resulting in numerous orgies of mass murder and warfare.

I certainly don’t mean to claim that issues and how they are decided never matter in themselves.  However, when it comes to human behavior, their role often becomes a mere pretext, a façade used to rationalize hatred that is actually a manifestation of innate emotional predispositions.  Read the comments following articles about politics and you will get the impression that half the population wakes up in the morning determined to deliberately commit as many bad deeds as they possibly can, and the other half is heroically struggling to stop them and secure the victory of the Good.  Does that really make sense?  Is it really so difficult to see that such a version of reality represents a delusion, explicable only if one accepts human nature for what it is?  Would you understand what’s going on in the world?  Then for starters you need to identify the ingroups and outgroups.  Lacking that fundamental insight, you will be stumbling in the dark.  In the dark it’s very difficult to see that you, too, are a hater, simply by virtue of the fact that you belong to the species Homo sapiens, and to understand why you hate.  Hatred is a destructive force.  It would behoove us to learn to control it no matter what our goals happen to be, but we will have a very difficult time controlling it unless we finally understand why it exists.

More Whimsical History of the Blank Slate

As George Orwell wrote in 1984, “Who controls the past controls the future. Who controls the present controls the past.”  The history of the Blank Slate is a perfect illustration of what he meant.  You might say there are two factions in the academic ingroup: those who are deeply embarrassed by the Blank Slate, and those who are still bitterly clinging to it.  History as it actually happened is damaging to both factions, so they’ve both created imaginary versions that support their preferred narratives.  At this point the “official” histories have become hopelessly muddled.  I recently ran across an example of how this affects younger academics trying to make sense of what’s going on in their own fields in an article entitled, Sociology’s Stagnation, at the Quillette website.  It was written by Brian Boutwell, Associate Professor of Criminology and Criminal Justice at St. Louis University.

Boutwell cites an article published back in 1990 by sociologist Pierre van den Berghe excoriating the practitioners in his own specialty.  Van den Berghe was one of those rare sociologists who insisted on the relevance of evolved behavioral traits to his field.  He did not mince words.  Boutwell quotes several passages from the article, including the following:

Such a theoretical potpourri is premised on the belief that, in the absence of a powerful simplifying idea, all ideas are potentially good, especially if they are turgidly presented, logically opaque, and empirically irrefutable. This sorry state of theoretical affairs in sociology is probably the clearest evidence of the discipline’s intellectual bankruptcy. But let my colleagues rest assured: intellectual bankruptcy never spelled the doom of an academic discipline. Those within it are professionally deformed not to recognize it, and those outside of it could not care less. Sociology is safe for at least a few more decades.

In response, Boutwell writes,

Intellectually bankrupt? Those are strong words. Can a field survive like this? It can, and it has. Hundreds of new sociology PhDs are minted every year across the country (not to mention the undergraduate and graduate degrees that are conferred as well). How many students were taught that human beings evolved around 150,000 years ago in Africa? How many know what a gene is? How many can describe Mendel’s laws, or sexual selection? The answer is very few. And, what is worse, many sociologists do not think this ignorance matters.

In other words, Boutwell thinks the prevailing malaise in sociology continues because sociologists don’t know about Darwin.  He may be right in some cases, but that’s not really the problem.  The problem is that the Blank Slate still prevails in sociology.  It is probably the most opaque of all the behavioral “sciences.”  In fact, it is just an ideological narrative pretending to be a science, just as psychology was back in the day when van den Berghe wrote his article.  Psychologists deal with individuals.  As a result, they have to look at behavior a lot closer to the source of what motivates it.  As most reasonably intelligent lay people have been aware for millennia, it is motivated by human nature.  By the end of the 1990s, naturalists, neuroscientists, and evolutionary psychologists had heaped up such piles of evidence supporting that fundamental fact that psychologists who tried to prop up the threadbare shibboleths of the Blank Slate ran the risk of becoming laughingstocks.  By 2000 most of them had thrown in the towel.  Not so the sociologists.  They deal with masses of human beings.  It was much easier for them to insulate themselves from the truth by throwing up a smokescreen of “culture.”  They’ve been masturbating with statistics ever since.

Boutwell thinks the solution is for them to learn some evolutionary biology.  I’m not sure which version of the “history” gave him that idea.  However, if he knew how the Blank Slate really went down, he might change his mind.  Evolutionary biologists and scientists in related fields were part of the heart and soul of the Blank Slate orthodoxy.  They knew all about genes, Mendel’s laws, and sexual selection, but it didn’t help.  Darwin?  They simply redacted those parts of his work that affirmed the relationship between natural selection, human nature in general, and morality in particular.  No matter that Darwin himself was perfectly well aware of the connections.  For these “scientists,” an ideological narrative trumped scientific integrity until the mass of evidence finally rendered the narrative untenable.

Of course, one could always claim that I’m just supporting an ideological narrative of my own.  Unfortunately, that claim would have to explain away a great deal of source material, and because the events in question are so recent, the source material is still abundant and easily accessible.  If Prof. Boutwell were to consult it he would find that evolutionary biologists like Stephen Jay Gould, geneticists like Richard Lewontin, and many others like them considered the Blank Slate the very “triumph of evolution.”  I suggest that anyone with doubts on that score have a look at a book that bears that title by scientific historian Hamilton Cravens published in 1978 during the very heyday of the Blank Slate.  It is very well researched, cites scores of evolutionary biologists, geneticists, and behavioral scientists, and concludes that all the work of these people who were perfectly familiar with Darwin culminated in the triumphant establishment of the Blank Slate as “scientific truth,” or, as announced by the title of his book, “The Triumph of Evolution.”  His final paragraph gives a broad hint about how something so ridiculous could ever have been accepted as an unquestionable dogma.  It reads,

The long-range, historical function of the new evolutionary science was to resolve the basic questions about human nature in a secular and scientific way, and thus provide the possibilities for social order and control in an entirely new kind of society.  Apparently this was a most successful and enduring campaign in American culture.

Here, unbeknownst to himself, Cravens hit the nail on the head.  Social control was exactly what the Blank Slate was all about.  It was essential that the ideal denizens of the future utopias that the Blank Slaters had in mind for us have enough “malleability” and “plasticity” to play their assigned parts.  “Human nature” in the form of genetically transmitted behavioral predispositions would only gum things up.  They had to go, and go they did.  Ideology trumped and derailed science, and kept it derailed for more than half a century.  As Boutwell has noticed, it remains derailed in sociology and a few other specialties that have managed to develop similarly powerful allergic reactions to the real world.  Reading Darwin isn’t likely to help a bit.

One of the best books on the genesis of the Blank Slate is In Search of Human Nature, by Carl Degler.  It was published in 1991, well after the grip of the Blank Slate on the behavioral sciences had begun to loosen, and presents a somewhat more sober and realistic portrayal of the affair than Cravens’ triumphalist account.  Among other things, it gives an excellent account of how the whole thing began.  As portrayed by Degler, in the beginning it hadn’t yet become such a blatant tool for social control.  One could better describe it as an artifact of idealistic cravings.  Then, as now, one of the most important of these was the desire for human equality, not only under the law, but in a much more real, physical sense, among both races and individuals.  If human nature existed and was important, then such equality was out of the question.  Perfect equality was only possible if every human mind started out as a Blank Slate.

Degler cites the work of several individuals as examples of this nexus between the ideal of equality and the Blank Slate, but I will focus on one in particular: John B. Watson, the founder of behaviorism.  One of the commenters to an earlier post suggested that the behaviorists weren’t Blank Slaters.  I think that he, too, is suffering from historical myopia.  Again, it’s always useful to look at the source material for yourself.  In his book, Behaviorism, published in 1924, Watson notes that all human beings breathe, sneeze, have hearts that beat, etc., but have no inherited traits that might reasonably be described as human nature.  In those days, psychologists like William James referred to hereditary behavioral traits as “instincts.”  According to Watson,

In this relatively simple list of human responses there is none corresponding to what is called an “instinct” by present-day psychologists and biologists.  There are then for us no instincts – we no longer need the term in psychology.  Everything we have been in the habit of calling an “instinct” today is the result largely of training – belongs to man’s learned behavior.

A bit later on he writes,

The behaviorist recognizes no such things as mental traits, dispositions or tendencies.  Hence, for him, there is no use in raising the question of the inheritance of talent in its old form.

In case we’re still in doubt about his Blank Slate bona fides, a few pages later he adds,

I should like to go one step further now and say, “Give me a dozen healthy infants, well-formed, and my own specified world to bring them up in and I’ll guarantee to take any one at random and train him to become any type of specialist I might select – doctor, lawyer, artist, merchant-chief and, yes, even beggar-man and thief, regardless of his talents, penchants, tendencies, abilities, vocations, and race of his ancestors.”  I am going beyond my facts and I admit it, but so have the advocates of the contrary and they have been doing it for many thousands of years.  Please note that when this experiment is made I am to be allowed to specify the way the children are to be brought up and the type of world they have to live in.

Here, in a nutshell, we can see the genesis of hundreds of anecdotes about learned professors dueling over the role of “nature” versus “nurture,” in venues ranging from highbrow intellectual journals to several episodes of The Three Stooges.  Watson seems to be literally pulling at our sleeves and insisting, “No, really, I’m a Blank Slater.”  Under the circumstances I’m somewhat dubious about the claim that Watson, Skinner, and the rest of the behaviorists don’t belong in that category.

What motivated Watson and others like him to begin this radical reshaping of the behavioral sciences?  I’ve already alluded to the answer above.  To make a long story short, they wanted to create a science that was “fair.”  For example, Watson was familiar with the history of the Jukes family outlined in an account of a study by Richard Dugdale published in 1877.  It documented unusually high levels of all kinds of criminal behavior in the family.  Dugdale himself insisted on the role of environmental as well as hereditary factors in explaining the family’s criminality, but later interpreters of his work focused on heredity alone.  Apparently Watson considered such an hereditary burden unfair.  He decided to demonstrate “scientifically” that a benign environment could have converted the entire family into model citizens.  Like many other scientists in his day, Watson abhorred the gross examples of racial discrimination in his society, as well as the crude attempts of the Social Darwinists to justify it.  He concluded that “science” must support a version of reality that banished all forms of inequality.  The road to hell is paved with good intentions.

I could go on and on about the discrepancies one can find between the “history” of the Blank Slate and source material that’s easily available to anyone willing to do a little searching.  Unfortunately, I’ve already gone on long enough for a single blog post.  Just be a little skeptical the next time you read an account of the affair in some textbook.  It ain’t necessarily so.