The Red Centennial

Today marks the 100th anniversary of the Bolshevik Revolution.  If there’s anything to celebrate, it’s that Communism was tried, it failed, and as a result it is no longer viable as a global secular religion.  Unfortunately, the cost of the experiment in human lives was far greater than that of any comparable revolutionary ideology before or since.  It’s not as if we weren’t warned.  As I noted in an earlier post, Herbert Spencer was probably the most accurate prophet of all.  In his A Plea for Liberty he wrote,

Already on the continent, where governmental organizations are more elaborate and coercive than here, there are chronic complaints of the tyranny of bureaucracies – the hauteur and brutality of their members. What will these become when not only the more public actions of citizens are controlled, but there is added this far more extensive control of all their respective daily duties? What will happen when the various divisions of this vast army of officials, united by interests common to officialism – the interest of the regulators versus those of the regulated – have at their command whatever force is needful to suppress insubordination and act as ‘saviors of society’? Where will be the actual diggers and miners and smelters and weavers, when those who order and superintend, everywhere arranged class above class, have come, after some generations, to intermarry with those of kindred grades, under feelings such as are operative under existing classes; and when there have been so produced a series of castes rising in superiority; and when all these, having everything in their own power, have arranged modes of living for their own advantage: eventually forming a new aristocracy far more elaborate and better organized than the old?

What will result from their (the bureaucracy’s) operation when they are relieved from all restraints?…The fanatical adherents of a social theory are capable of taking any measures, no matter how extreme, for carrying out their views: holding, like the merciless priesthoods of past times, that the end justifies the means. And when a general socialistic organization has been established, the vast, ramified, and consolidated body of those who direct its activities, using without check whatever coercion seems to them needful in the interests of the system (which will practically become their own interests) will have no hesitation in imposing their rigorous rule over the entire lives of the actual workers; until eventually, there is developed an official oligarchy, with its various grades, exercising a tyranny more gigantic and more terrible than any which the world has seen.

Spencer’s prophecy was eloquently confirmed by former Communist Milovan Djilas in his The New Class, where he wrote,

The transformation of the Party apparatus into a privileged monopoly (new class, nomenklatura) existed in embryonic form in Lenin’s prerevolutionary book Professional Revolutionaries, and in his time was already well under way. It is just this which has been the major reason for the decay of communism… Thus he, Stalin, the greatest Communist – for so everyone thought him save the dogmatic purists and naive “quintessentialists” – the incarnation of the real essence, the real possibilities, of the ideal – this greatest of all Communists, killed off more Communists than did all the opponents of Communism taken together, worldwide… Ideology exterminates its true believers.

The biggest danger we face in the aftermath of Communism is that the lesson will be forgotten.  It was spawned on the left of the ideological spectrum, and today’s leftists would prefer that the monster they created be forgotten.  Since they control the present, in the form of the schools, they also control the past, according to the dictum set forth by George Orwell in his 1984.  As a result, today’s students hear virtually nothing about the horrors of Communism.  Instead, they are fed a bowdlerized “history,” according to which nothing of any significance has ever happened in the United States except the oppression and victimization of assorted racial and other minority groups.  No matter that, by any rational standard, the rise of the United States has been the greatest boon to “human flourishing” in the last 500 years.  No matter that Communism would almost certainly have spread its grip a great deal further and lasted a great deal longer if the US had never existed.  The Left must be spared embarrassment.  Therefore, the US is portrayed as the “villain,” and Communism has been dropped down the memory hole.

Indeed, if Bernie Sanders’ recent bid for the Presidency, sadly sabotaged by the Clinton machine via the DNC, is any indication, socialism, if not Communism, is still alive and well.  Of course, anyone with even a passing knowledge of history knows that socialism has been tried in a virtually infinite array of guises, from the “hard” versions that resulted in the decapitation of Cambodia and the Soviet Union to the “soft” version foisted on the United Kingdom after World War II.  It has invariably failed.  No matter.  According to its proponents, that’s only because “it hasn’t been done right.”  These people are nothing if not remarkably slow learners.

Consider the implications.  According to Marx, the proletarian revolution to come could not possibly result in the slaughter and oppression characteristic of past revolutions because, instead of the dictatorship of a minority over a majority, it would result in the dictatorship of the proletarian majority over a bourgeois minority.  However, the Bolshevik Revolution did result in oppression and mass slaughter on an unprecedented scale.  How to rescue Marx?  We could say that the revolution wasn’t really a proletarian revolution.  That would certainly have come as a shock to Lenin and his cronies.  If not a proletarian revolution, what kind was it?  There aren’t really many choices.  Was it a bourgeois revolution?  Then how is it that all the “owners of the social means of production” who were unlucky enough to remain in the country had their throats slit?  Who among the major players was an “owner of the social means of production”?  Lenin?  Trotsky?  Stalin?  I doubt it.  If not a bourgeois revolution, could it have been a feudal revolution?  Not likely in view of the fact that virtually the entire surviving Russian nobility could be found a few years later waiting tables in French restaurants.  If we take Marx at his word, it must, in fact, have been a proletarian revolution, and Marx, in fact, must have been dead wrong.  In one of the last things he wrote, Trotsky, probably the best and the brightest of all the old Bolsheviks, admitted as much.  He had hoped until the end that Stalinism was merely a form of “bureaucratic parasitism,” and that the proletariat would soon shrug it off and take charge as it should have from the start.  However, just before he was murdered by one of Stalin’s assassins, he wrote,

If, however, it is conceded that the present war (World War II) will provoke not revolution but a decline of the proletariat, then there remains another alternative; the further decay of monopoly capitalism, its further fusion with the state and the replacement of democracy wherever it still remained by a totalitarian regime. The inability of the proletariat to take into its hands the leadership of society could actually lead under these conditions to the growth of a new exploiting class from the Bonapartist fascist bureaucracy. This would be, according to all indications, a regime of decline, signaling the eclipse of civilization… Then it would be necessary in retrospect to establish that in its fundamental traits the present USSR was the precursor of a new exploiting regime on an international scale… If (this) prognosis proves to be correct, then, of course, the bureaucracy will become a new exploiting class. However onerous the second perspective may be, if the world proletariat should actually prove incapable of fulfilling the mission placed upon it by the course of development, nothing else would remain except only to recognize that the socialist program, based on the internal contradictions of capitalist society, ended as a Utopia.

And so it did.  Trotsky, convinced socialist that he was, saw the handwriting on the wall at last.  But then, Trotsky was a very smart man.  Obviously, our latter day socialists aren’t quite as smart.  It follows that we drop the history of Communism down Orwell’s “memory hole” at our peril.  If we refuse to learn anything from the Communist experiment, we may well find the socialists foisting another one on us before long.  Those who do want to learn something about it would do well to be wary of latter day “interpretations.”  With Communism, as with anything else, it’s necessary to consult the source literature yourself if you want to uncover anything resembling the truth.  There is a vast amount of great material out there.  Allow me to mention a few of my personal favorites.

There were actually two Russian Revolutions in 1917.  In the first, which occurred in March (new style), the tsar was deposed and a provisional government established in the place of the old monarchy.  Among other things it issued decrees that resulted in a fatal relaxation of discipline in the Russian armies facing the Germans and Austro-Hungarians, paving the way for the Bolshevik coup that took place later that year.  Perhaps the best account of the disintegration of the armies that followed was written by a simple British nurse named Florence Farmborough in her With the Armies of the Tsar: A Nurse at the Russian Front, 1914-18.  The Communists themselves certainly learned from this experience, executing thousands of their own soldiers during World War II at the least hint of insubordination.  My favorite firsthand account of the revolution itself is The Russian Revolution 1917: An Eyewitness Account, by N. N. Sukhanov, a Russian socialist who played a prominent role in the Provisional Government.  He described Stalin at the time as a “grey blur.”  Sukhanov made the mistake of returning to the Soviet Union.  He was arrested in 1937 and executed in 1940.  Another good firsthand account is Political Memoirs, 1905-1917, by Pavel Miliukov.  An outstanding account of the aftermath of the revolution is Cursed Days, by novelist Ivan Bunin.  Good accounts by diplomats include An Ambassador’s Memoirs by French ambassador to the court of the tsar Maurice Paleologue, and British Agent by Bruce Lockhart.

When it comes to the almost incredible brutality of Communism, it’s hard to beat Solzhenitsyn’s classic The Gulag Archipelago.  Other good accounts include Journey into the Whirlwind by Yevgenia Ginzburg and Back in Time by Nadezhda Joffe.  Ginzburg was the wife of a high Communist official, and Joffe was the daughter of Adolph Joffe, one of the most prominent early Bolsheviks.  Both were swept up in the Great Purge of the late 1930’s, and both were very lucky to survive life in the Gulag camps.  Ginzburg had been “convicted” of belonging to a “counterrevolutionary Trotskyist terrorist organization,” and almost miraculously escaped being shot outright.  She spent the first years of her sentence in solitary confinement.  In one chapter of her book she describes what happened to an Italian Communist who dared to resist her jailers:

I heard the sound of several feet, muffled cries, and a shuffling noise as though a body were being pulled along the stone floor.  Then there was a shrill cry of despair; it continued for a long while on the same note, and stopped abruptly.

It was clear that someone was being dragged into a punishment cell and was offering resistance… The cry rang out again and stopped suddenly, as though the victim had been gagged… But it continued – a penetrating, scarcely human cry which seemed to come from the victim’s very entrails, to be viscous and tangible as it reverberated in the narrow space.  Compared with it, the cries of a woman in labor were sweet music.  They, after all, express hope as well as anguish, but here there was only a vast despair.

I felt such terror as I had not experienced since the beginning of my wanderings through this inferno.  I felt that at any moment I should start screaming like my unknown neighbor, and from that it could only be a step to madness.

At that moment I heard clearly, in the midst of the wailing, the words “Communista Italiana, Communista Italiana!”  So that was it!  No doubt she had fled from Mussolini just as Klara, my cellmate at Butyrki, had fled from Hitler.

I heard the Italian’s door opened, and a kind of slithering sound which I could not identify.  Why did it remind me of flower beds?  Good God, it was a hose!  So Vevers (one of her jailers) had not been joking when he had said to me:  “We’ll hose you down with freezing water and then shove you in a punishment cell.”

The wails became shorter as the victim gasped for breath.  Soon it was a tiny shrill sound, like a gnat’s.  The hose played again; then I heard blows being struck, and the iron door was slammed to.  Dead silence.

That was just a minute part of the reality of the “workers’ paradise.”  Multiply it millions of times and you will begin to get some inkling of the reality of Communism under Stalin.  Many of the people who wrote such accounts began as convinced Communists and remained so until the end of their days.  They simply couldn’t accept the reality that the dream they had dedicated their lives to was really a nightmare.  Victor Serge was another prominent Bolshevik and “Trotskyist” who left an account of his own struggle to make sense of what he saw happening all around him in his Memoirs of a Revolutionary:

Nobody was willing to see evil in the proportions it had reached.  As for the idea that the bureaucratic counterrevolution had attained power, and that a new despotic State had emerged from our own hands to crush us, and reduce the country to absolute silence – nobody, nobody in our ranks was willing to admit it.  From the depths of his exile in Alma-Ata Trotsky affirmed that this system was still ours, still proletarian, still Socialist, even though sick; the Party that was excommunicating, imprisoning, and beginning to murder us remained our Party, and we still owed everything to it:  we must live only for it, since only through it could we serve the Revolution.  We were defeated by Party patriotism:  It both provoked us to rebel and turned us against ourselves.

Serge was lucky.  He was imprisoned years before the Great Purge began in earnest, and was merely sentenced to internal exile.  The secret police even supplied him and a fellow exile with a bread ration.  After a few years, thanks to pressure from foreign socialists, he was allowed to leave the Soviet Union.  Conditions for the normal citizens of Orenburg, where he spent his exile, were, if anything, worse than his, even though more than a decade had elapsed since the advent of the “workers’ paradise.”  In the following passage he describes what happened when they received their bread ration:

I heard shouting from the street, and then a shower of vigorous knocks on the door.  “Quick, Victor Lvovich, open up!”  Bobrov was coming back from the bakery, with two huge four-kilo loaves of black bread on his shoulders.  He was surrounded by a swarm of hungry children, hopping after the bread like sparrows, clinging on his clothes, beseeching:  “A little bit, uncle, just a little bit!”  They were almost naked.  We threw them some morsels, over which a pitched battle promptly began.  The next moment, our barefooted maidservant brought boiling water, unasked, for us to make tea.  When she was alone with me for a moment, she said to me, her eyes smiling, “Give me a pound of bread and I’ll give you the signal in a minute… And mark my words, citizen, I can assure you that I don’t have the syphilis, no, not me…”  Bobrov and I decided to go out only by turns, so as to keep an eye on the bread.

So much for the look of real oppression, as opposed to the somewhat less drastic versions that occupy the florid imaginations of today’s Social Justice Warriors.  Speaking of SJW’s, especially of the type whose tastes run to messianic revolutionary ideologies, the demise of Communism has had an interesting effect.  It has pulled the rug out from under their feet, leaving them floating in what one might describe as an ideological vacuum.  Somehow writing furious diatribes against Trump on Facebook just doesn’t scratch the same itch as Communism did in its day.  When it comes to fanatical worldviews, oddly enough, radical Islam is the only game in town.  The SJWs can’t really fall for it hook, line and sinker the way they once did for Communism.  After all, its ideology is diametrically opposed to what they’ve claimed to believe in lo these many years.  The result has been the weird love affair between the radical Left and Islam that’s been such an obvious aspect of the ideological scene lately, complete with bold flirtations and coy, steamy glances from afar.  Strange bedfellows indeed!

In terms of the innate, ingroup/outgroup behavior of human beings I’ve often discussed on this blog, the outgroup of the Communist ingroup was, of course, the “bourgeoisie.”  If even the most tenuous connection could be made between some individual and the “bourgeoisie,” it became perfectly OK to murder and torture that individual, after the fashion of our species since time immemorial.  We saw nearly identical behavior directed against the “aristocrats” after the French Revolution, and against the Jews under the Nazis.  If our species learns nothing else from its experiment with Communism, it is to be hoped that we at least learn the extreme danger of continuing to uncritically indulge this aspect of our behavioral repertoire.  I realize that it is very likely to be a vain hope.  If anything, ingroup/outgroup identification according to ideology is intensifying and becoming increasingly dangerous.  The future results are unpredictable, but are very unlikely to be benign.  Let us at least hope that, under the circumstances, no new messianic secular religion appears on the scene to fill the vacuum left by Communism.  We can afford to wait a few more centuries for that.

Life Among the Mormons

A few years ago I moved into an almost entirely Mormon neighborhood.  It turns out that Mormons are a great deal more tolerant than the average atheist Social Justice Warrior.  As a result I was able to learn some things about them that certainly won’t be news to other Mormons, but may interest the readers of this blog.

One day, shortly after my arrival, I was chatting with my next door neighbor, and she mentioned that some of the neighbors in our age group were in the habit of getting together socially every other week, and wondered if I would like to tag along.  I said, “Sure.”  She suggested I ride along with her and her husband, as the group rotated from house to house, and they knew the neighborhood.  Well, when we were underway, she casually slipped me a large Bible.  It turns out that the “social gathering” was what the Mormons call Family Home Evening, or FHE.  The host is responsible for coming up with a program that relates to the church in some way.  This time around it involved each guest reading passages from the Bible with a common theme, which the group would then discuss.  At other times the Book of Mormon or other Mormon religious books might be substituted for the Bible.  Once we were to act out different parables, and the others would try to guess what they were.  On another occasion there was a presentation about the Mormon system of indexing genealogical records, and how volunteers might help with the process.  I wasn’t particularly uncomfortable with any of this, as I had attended Sunday School regularly and gone to church camps as a child, and still know my Bible fairly well.

After the first meeting I e-mailed my neighbor to thank her for taking me to FHE, but told her that I had no intention of changing my religion.  I quoted my favorite Bible passage, Ephesians 2:8-9, in self-defense.  It goes like this:

For by grace are ye saved through faith; and that not of yourselves:  It is the gift of God:  Not of works, lest any man should boast.

I strongly recommend it to my fellow atheists.  It’s great for warding off pesky proselytizers.  After all, if you’ve read the Bible and have an open mind, then nothing more can be done for you by human agency.  The rest depends on God, “lest any man should boast.”  It usually works, but not this time.  It turns out my neighbor was something of an activist in the Mormon community, and was bound and determined to make sure that when “grace” came, I would be standing close enough to the source to notice it.  She said that I’d made a very favorable impression on the other neighbors, and they would be very disappointed if I stopped coming to FHE.  They knew I wasn’t a Mormon, but it didn’t matter.

Well, my curiosity got the best of me, and I agreed to keep coming.  I must admit with a certain degree of shame that I never flat out said I was an atheist.  I mentioned that an ancestor had been a Baptist preacher, and I think they took me for some kind of a hard core Protestant, probably with a distinct Calvinist bent.  As an extenuating circumstance I might mention that I’m not much of a cook, and delicious snacks were served at the end of each meeting.  I’m not talking potato chips.  I’m not sure if “my” FHE was typical, but these people were real gourmets.  They laid out some goodies that gladdened my heart, and were a welcome relief from the hamburgers and bologna sandwiches that were my usual fare.  It’s possible my FHE was an outlier in things other than food as well.  My boss was a Mormon, and seemed surprised when he heard that I attended.  He said I’d better watch out.  I was getting pretty close to the fire!

In the meetings that followed I always felt accepted by the group, and never “othered” for not being a Mormon.  None of them ever came to my door to engage in spiritual arm twisting (that was limited to the local Jehovah’s Witnesses), nor was I ever subjected to any heavy-handed attempts at conversion.  They did let me know on occasion that, if I had any questions about the church, they would be glad to answer them.  They also encouraged me to come to church to see what it was like, and always invited me to other Mormon social affairs.  These included a barn dance, “Trick or Trunk,” a convenient substitute for trick or treating on Halloween at which candy is passed out from the trunks of cars parked side by side, Christmas dinner at the church, a Christmas pageant, etc.  The atmosphere at these affairs always reminded me of the church I grew up in during the 50’s and 60’s.  Now it is a typical mainstream Protestant church, attended mainly by people who appear to be well over 70, but in those days it was a great deal more vibrant, with a big congregation that included many children.  So it was in the Mormon church.  There were members of all ages, and there must have been 50 boys and girls in the children’s choir.  In a word, you didn’t get the feeling that the church was dying.

I did attend church on one occasion, and it was quite different from a typical Protestant service.  To begin with, there are no regular pastors.  Everything is done by lay people.  The church services last about three hours.  Ours was divided into a general service, a lesson delivered by one of the lay members, and a final period in which the men and women met in separate groups.  Of course, there’s also Sunday school for the children.

Each church is attended by one or more “wards,”  and there are several wards in a “stake.”  Each ward has a lay “Bishop,” who is appointed for a period of five years, give or take.  The stake is headed by a lay “President,” also appointed for a limited time.  These part time clergymen aren’t paid, don’t get to wear any gorgeous vestments, and certainly nothing like the Pope’s Gucci slippers, but they still have all the counseling, visiting, and other duties of more conventional clergy.  I was familiar with both my ward Bishop and stake President.  Both were intelligent and capable professional men.  They were respected by the rest of the congregation, but the ones I knew weren’t patronizing or in any way “stuck up.”  They were just members of the congregation at the service I attended, but perhaps they occasionally play a more active role.

Hard core Mormons give ten percent of their gross income to the church.  I’m not sure what percentage is “hard core,” and I’m also not sure what the church does with all the money.  That question has probably been asked ever since the days of Joseph Smith.  I suspect the IRS is reasonably well informed, but otherwise they keep financial matters pretty close to the vest.  In any case, only members who tithe are allowed to attend services at or be married in a Mormon Temple.

Mormons are a great deal more “moral” when it comes to reproduction than the average atheist.  In other words, their behavior in such matters is consistent with what the relevant predispositions evolved to accomplish.  For example, the lady who tossed the Bible in my lap had 11 children and 37 grandchildren.  Large families were the rule in our neighborhood.  I can’t really understand the objections of the “anti-breeders” to such behavior in a country where the population would be declining if it weren’t for massive illegal immigration.  In any case, all those grandchildren and great grandchildren will have inherited the earth long after the mouths of those who criticized their ancestors have been stopped with dust.

The people in my ward included some who were brought up in the Mormon faith, and some, including my zealous neighbor lady, who had been converted later in life.  Among the former there were some older people who still had a lively memory of the days when polygamy was a great deal more common than it is now.  They recall that there were federal “revenuers” who were on the lookout for such arrangements just as their more familiar peers were snooping after moonshine stills.  A neighbor, aged about 80, recounted a story of one such family she had heard as a child.  A baby had been born to a man with several wives, but died soon after birth.  The “revenuers” were aware of the fact.  Soon, however, the stork arrived again, and this time delivered a healthy baby.  Shortly thereafter the man was sitting at the dinner table holding the new arrival when he was warned that inspectors were on the way to pay him a visit.  He took it on the lam out the back door, and hid in the family cemetery where the first child was buried.  When the inspectors arrived, they asked the wife who happened to be in the house where they could find her husband.  With a downcast look she replied, “He’s up in the cemetery with the baby.”  That statement was, of course, perfectly true.  The embarrassed “revenuers” muttered their condolences and left!

I must say I had to clench my teeth occasionally on listening to some of the passages from the Book of Mormon.  On the other hand, there’s really nothing there that’s any more fantastic than the similar stories you can read in the Bible, or the lives of the saints.  In any case, what they believe strikes me as a great deal less dangerous than the equally fantastic belief held by the “men of science” for half a century that there is no such thing as human nature, not to mention “scientific” Marxism-Leninism.  According to some atheists, indoctrinating children with stories from the Bible and the Book of Mormon constitutes “child abuse.”  I have my doubts given the fact that they seem to accomplish those most “moral” of all goals, survival and reproduction, a great deal better than most of my fellow infidels.  Many of my fellow atheists have managed to convince themselves that they’ve swallowed the “red pill,” but in reality they’re just as delusional as the Mormons, and their delusions are arguably more destructive.  I personally would rather see my children become Mormons than dour, barren, intolerant, and ultra-Puritanical Social Justice Warriors, striding down the path to genetic suicide with a self-righteous scowl.  I would also much rather live among spiritual Mormons than secular Communists.

As one might expect, there were many non-Mormons in the local community who “othered” the Mormons, and vice versa.  Nothing is more natural for our species than to relegate those who are in any way different to the outgroup.  For example, Mormons were supposed to stick together and favor each other in business dealings, government appointments, etc.  Unfortunately, there has never been a population of humans who consider themselves members of the same group that has not done precisely the same, at least to some extent.  Mormon religious beliefs were considered “crazy,” as opposed, apparently, to such “perfectly sane” stories as Noah’s ark, the loaves and the fishes, the magical conversion of bread and wine to flesh and blood, etc.  Mormons were supposed to imagine that they wore “magic clothes.”  In reality the Mormons don’t consider such garments any more “magical” than a nun’s habit or a Jew’s yarmulke.

In general, I would prefer that people believe the truth.  I am an atheist, and don’t believe in the existence of any God or gods.  I’m not an “accommodationist,” and I don’t buy Stephen Jay Gould’s notion of “Non-Overlapping Magisteria.”  On the other hand, when people treat me with kindness and generosity, as I was treated in the Mormon community, I’m not in the habit of responding with stones and brickbats, either.  The hard core Hobbesians out there will claim that all that kindness sprang from selfish motives, but hard core Hobbesians must also perforce admit that neither they nor anyone else acts any differently.

If you want to get a fictional “taste” of what Mormons are like, I recommend the film “Once I Was a Beehive.”  You can rent it at Amazon.  It’s about a teenage girl whose mom remarries, this time to a Mormon.  The flavor of the Mormon community pictured in the film reflects my own impressions pretty accurately.  The Mormon Bishop, in particular, is very typical and true to life.

As for me, in the fullness of time I left the land of the Mormons and now live among the heathen once again.  None of them has seen fit to follow me and pull me back from the fiery furnace by the scruff of my neck.  It may be that they finally realized I was a hopeless case, doomed to sizzle over the coals in the hereafter for the edification of the elect.  I’m afraid they’re right about that.  If they do come after me they’ll find me armed with my copy of Ephesians, as stubborn as ever.

The God Myth and the “Humanity Can’t Handle The Truth” Gambit

Hardly a day goes by without some pundit bemoaning the decline in religious faith.  We are told that great evils will inevitably befall mankind unless we all believe in imaginary super-beings.  Of course, these pundits always assume a priori that the particular flavor of religion they happen to favor is true.  Absent that assumption, their hand wringing boils down to the argument that we must all somehow force ourselves to believe in God whether that belief seems rational to us or not.  Otherwise, we won’t be happy, and humanity won’t flourish.

An example penned by Dennis Prager entitled Secular Conservatives Think America Can Survive the Death of God that appeared recently at National Review Online is typical of the genre.  Noting that even conservative intellectuals are becoming increasingly secular, he writes that,

They don’t seem to understand that the only solution to many, perhaps most, of the social problems ailing America and the West is some expression of Judeo-Christian religion.

In another article entitled If God is Dead…, Pat Buchanan echoes Prager, noting, in a rather selective interpretation of history, that,

When, after the fall of the Roman Empire, the West embraced Christianity as a faith superior to all others, as its founder was the Son of God, the West went on to create modern civilization, and then went out and conquered most of the known world.

The truths America has taught the world, of an inherent human dignity and worth, and inviolable human rights, are traceable to a Christianity that teaches that every person is a child of God.

Today, however, with Christianity virtually dead in Europe and slowly dying in America, Western culture grows debased and decadent, and Western civilization is in visible decline.

Both pundits draw attention to a consequence of the decline of traditional religions that is not merely a figment of their imaginations: the rise of secular religions to fill the ensuing vacuum.  The examples typically cited include Nazism and Communism.  There does seem to be some innate feature of human behavior that predisposes us to adopt such myths, whether of the spiritual or secular type.  It is most unlikely that it comes in the form of a “belief in God” or “religion” gene.  It would be very difficult to explain how anything of the sort could pop into existence via natural selection.  It seems reasonable, however, that less specialized and more plausible behavioral traits could account for the same phenomenon.  Which raises the question, “So what?”

Pundits like Prager and Buchanan are putting the cart before the horse.  Before one touts the advantages of one brand of religion or another, isn’t it first expedient to consider the question of whether it is true?  If not, then what is being suggested is that mankind can’t handle the truth.  We must be encouraged to believe in a pack of lies for our own good.  And whatever version of “Judeo-Christian religion” one happens to be peddling, it is, in fact, a pack of lies.  The fact that it is a pack of lies, and obviously a pack of lies, explains, among other things, the increasingly secular tone of conservative pundits so deplored by Buchanan and Prager.

It is hard to understand how anyone who uses his brain as something other than a convenient stuffing for his skull can still take traditional religions seriously.  The response of the remaining true believers to the so-called New Atheists is telling in itself.  Generally, they don’t even attempt to refute their arguments.  Instead, they resort to ad hominem attacks.  The New Atheists are too aggressive, they have bad manners, they’re just fanatics themselves, etc.  They are not arguing against the “real God,” who, we are told, is not an object, a subject, or a thing ever imagined by sane human beings, but some kind of an entity perched so high up on a shelf that profane atheists can never reach Him.  All this spares the faithful from making fools of themselves with ludicrous mental flip flops to explain the numerous contradictions in their holy books, tortured explanations of why it’s reasonable to assume the “intelligent design” of something less complicated by simply assuming the existence of something vastly more complicated, and implausible yarns about how an infinitely powerful super-being can be both terribly offended by the paltry sins committed by creatures far more inferior to Him than microbes are to us, and at the same time incapable of just stepping out of the clouds for once and giving us all a straightforward explanation of what, exactly, he wants from us.

In short, Prager and Buchanan would have us somehow force ourselves, perhaps with the aid of brainwashing and judicious use of mind-altering drugs, to believe implausible nonsense, in order to avoid “bad” consequences.  One can’t dismiss this suggestion out of hand.  Our species is a great deal less intelligent than many of us seem to think.  We use our vaunted reason to satisfy whims we take for noble causes, without ever bothering to consider why those whims exist, or what “function” they serve.  Some of them apparently predispose us to embrace ideological constructs that correspond to spiritual or secular religions.  If we use human life as a metric, Prager and Buchanan would be right to claim that traditional spiritual religions have been less “bad” than modern secular ones, costing only tens of millions of lives via religious wars, massacres of infidels, etc., whereas the modern secular religion of Communism cost, in round numbers, 100 million lives, and in a relatively short time, all by itself.  Communism was also “bad” to the extent that we value human intelligence, tending to selectively annihilate the brightest portions of the population in those countries where it prevailed.  There can be little doubt that this “bad” tendency substantially reduced the average IQ in nations like Cambodia and the Soviet Union, resulting in what one might call their self-decapitation.  Based on such metrics, Prager and Buchanan may have a point when they suggest that traditional religions are “better,” to the extent that one realizes that one is merely comparing one disaster to another.

Can we completely avoid the bad consequences of believing the bogus “truths” of religions, whether spiritual or secular?  There seems to be little reason for optimism on that score.  The demise of traditional religions has not led to much in the way of rational self-understanding.  Instead, as noted above, secular religions have arisen to fill the void.  Their ideological myths have often trumped reason in cases where there has been a serious confrontation between the two, occasionally resulting in the bowdlerization of whole branches of the sciences.  The Blank Slate debacle was the most spectacular example, but there have been others.  As belief in traditional religions has faded, we have gained little in the way of self-knowledge in their wake.  On the contrary, our species seems bitterly determined to avoid that knowledge.  Perhaps our best course really would be to start looking for a path back inside the “Matrix,” as Prager and Buchanan suggest.

All I can say is that, speaking as an individual, I don’t plan to take that path myself.  It has always seemed self-evident to me that, whatever our goals and aspirations happen to be, we are more likely to reach them if we base our actions on an accurate understanding of reality rather than myths, on truth rather than falsehood.  A rather fundamental class of truths concerns, among other things, where those goals and aspirations came from to begin with.  These are the truths about human behavior: why we want what we want, why we act the way we do, why we are moral beings, why we pursue what we imagine to be noble causes.  I believe that the source of all these truths, the “root cause” of all these behaviors, is to be found in our evolutionary history.  The “root cause” we seek is natural selection.  That fact may seem inglorious or demeaning to those who lack imagination, but it remains a fact for all that.  Perhaps, after we sacrifice a few more tens of millions in the process of chasing paradise, we will finally start to appreciate its implications.  I think we will all be better off if we do.

More Fun with Moral Realism

What is moral realism?  Edvard Westermarck provided a good definition in the first paragraph of his Ethical Relativity:

Ethics is generally looked upon as a “normative” science, the object of which is to find and formulate moral principles and rules possessing objective validity.  The supposed objectivity of moral values, as understood in this treatise, implies that they have a real existence apart from any reference to a human mind, that what is said to be good or bad, right or wrong, cannot be reduced merely to what people think to be good or bad, right or wrong.  It makes morality a matter of truth and falsity, and to say that a judgment is true obviously means something different from the statement that it is thought to be true.  The objectivity of moral judgments does not presuppose the infallibility of the individual who pronounces such a judgment, nor even the accuracy of a general consensus of opinion; but if a certain course of conduct is objectively right, it must be thought to be right by all rational beings who judge truly of the matter and cannot, without error, be judged to be wrong.

Westermarck dismissed moral realism as a chimera.  So do I.  Indeed, in view of what we now know about the evolutionary origins of moral emotions, the idea strikes me as ludicrous.  It is, however, treated as matter-of-factly as if it were an unquestionable truth, and not only among the general public.  Philosophers merrily discuss all kinds of moral conundrums and paradoxes in academic journals, apparently in the belief that they have finally uncovered the “truth” about such matters, to all appearances with no more fear of being ridiculed than the creators of the latest Paris fashions.  The fact is all the more disconcerting if one takes the trouble to excavate the reasons supplied for this stubborn belief that subjective emotional constructs in the minds of individuals actually relate to independently existing things.  Typically, they are threadbare almost beyond belief.

Recently I discussed the case of G. E. Moore, who, after dismissing the arguments of virtually everyone who had attempted a “proof” of moral realism before him as fatally flawed by the naturalistic fallacy, supplied a “proof” of his own.  It turned out that the “objective good” consisted of those things that were most likely to please an English country gentleman.  The summum bonum was described as something like sitting in a cozy house with a nice glass of wine while listening to Beethoven.  The only “proof” supplied for the independent existence of this “objective good” was Moore’s assurance that he was an expert in such matters, and that it was obvious to him that he was right.

I recently uncovered another such “proof,” this time concocted in the fertile imagination of the Swedish philosopher Torbjörn Tännsjö. It turned up in an interview on the website of 3:AM Magazine under the title, The Hedonistic Utilitarian.  In response to interviewer Richard Marshall’s question,

Why are you a moral realist and what difference does this make to how you go about investigating morals from, for example, a non-realist?

Tännsjö replies,

I am indeed a moral realist.  In particular, I believe that one basic question, what we ought to do, period (the moral question), is a genuine one.  There exists a true answer to it, which is independent of our thought and conceptualization.  My main argument in defense of the position is this.  It is true (independently of our conceptualization) that it is wrong to inflict pain on a sentient creature for no reason (she doesn’t deserve it, I haven’t promised to do it, it is not helpful to this creature or to anyone else if I do it, and so forth).  But if this is a truth, existing independently of our conceptualization, then at least one moral fact (this one) exists and moral realism is true.  We have to accept this, I submit, unless we can find strong reasons to think otherwise.

In reading this, I was reminded of PFC Littlejohn, who happened to serve in my unit when I was a young lieutenant in the Army.  Whenever I happened to pull his leg more egregiously than even he could bear, he would typically respond, “You must be trying to bullshit me, sir!”  Apparently Tännsjö doesn’t consider Darwin’s theory, or Darwin’s own opinion regarding the origin of the moral emotions, or the flood of books and papers on the evolutionary origins of moral behavior, or the convincing arguments for the selective advantage of just such an emotional response as he describes, or the utter lack of evidence for the physical existence of “moral truths” independent of our “thought and conceptualization,” as sufficiently strong reasons “to think otherwise.”  Tännsjö continues,

Moral nihilism comes with a price we can now see.  It implies that it is not wrong (independently of our conceptualization) to do what I describe above; this does not mean that it is all right to do it either, of course, but yet, for all this, I find this implication from nihilism hard to digest.  It is not difficult to accept for moral reasons.  If it is false both that it is wrong to perform this action and that it is right to perform it, then we need to engage in difficult issues in deontic logic as well.

Yes, in the same sense that deontic logic is necessary to determine whether it is true or false that there are fairies in Richard Dawkins’ garden.  No deontic logic is necessary here – just the realization that Tännsjö is trying to make truth claims about something that is not subject to truth claims.  The claim that it is objectively “not wrong” to do what he describes is as much a truth claim, and therefore just as irrational, as the claim that it is wrong.  As for his equally irrational worries about “moral nihilism,” his argument is similar to those of the religious true believers who think that, because they find a world without a God unpalatable, one must therefore perforce pop into existence.  Westermarck accurately described the nature of Tännsjö’s “proof” in his The Origin and Development of the Moral Ideas, where he wrote,

As clearness and distinctness of the conception of an object easily produces the belief in its truth, so the intensity of a moral emotion makes him who feels it disposed to objectivise the moral estimate to which it gives rise, in other words, to assign to it universal validity.  The enthusiast is more likely than anybody else to regard his judgments as true, and so is the moral enthusiast with reference to his moral judgments.  The intensity of his emotions makes him the victim of an illusion.

The presumed objectivity of moral judgments thus being a chimera, there can be no moral truth in the sense in which this term is generally understood.  The ultimate reason for this is, that the moral concepts are based upon emotions, and that the contents of an emotion fall entirely outside the category of truth.

Today, Westermarck is nearly forgotten, while G. E. Moore is a household name among moral philosophers.  The Gods and angels of traditional religions seem to be in eclipse in Europe and North America, but “the substance of things hoped for,” and “the evidence of things not seen” are still with us, transmogrified into the ghosts and goblins of moral realism.  We find atheist social justice warriors hurling down their anathemas and interdicts more furiously than anything ever dreamed of by the Puritans and Pharisees of old, supremely confident in their “objective” moral purity.

And what of moral nihilism?  Dream on!  Anyone who seriously believes that anything like moral nihilism can result from the scribblings of philosophers has either been living under a rock, or is constitutionally incapable of observing the behavior of his own species.  Human beings will always behave morally.  The question is, what kind of morality can we craft for ourselves that is in harmony with our moral emotions, does the least harm, and that most of us can live with?  I personally would prefer one that is based on an accurate understanding of what morality is and where it comes from.

Do I think that anything of the sort is on the horizon in the foreseeable future?  No.  When it comes to belief in religion and/or moral realism, one must simply get used to living in Bedlam.

James Burnham and the Anthropology of Liberalism

James Burnham was an interesting anthropological data point in his own right.  A left-wing activist in the ’30s, he eventually became a Trotskyite.  By the ’50s, however, he had completed an ideological double back flip to conservatism, and became a Roman Catholic convert on his deathbed.  He was an extremely well-read intellectual, and a keen observer of political behavior.  His best-known book is The Managerial Revolution, published in 1941.  Among others, it strongly influenced George Orwell, who had something of a love/hate relationship with Burnham.  For example, in an essay in Tribune magazine in January 1944 he wrote,

Recently, turning up a back number of Horizon, I came upon a long article on James Burnham’s Managerial Revolution, in which Burnham’s main thesis was accepted almost without examination.  It represented, many people would have claimed, the most intelligent forecast of our time.  And yet – founded as it was on a belief in the invincibility of the German army – events have already blown it to pieces.

A bit over a year later, in February 1945, however, we find Burnham had made more of an impression on Orwell than the first quote implies.  In another essay in the Tribune he wrote,

…by the way the world is actually shaping, it may be that war will become permanent.  Already, quite visibly and more or less with the acquiescence of all of us, the world is splitting up into the two or three huge super-states forecast in James Burnham’s Managerial Revolution.  One cannot draw their exact boundaries as yet, but one can see more or less what areas they will comprise.  And if the world does settle down into this pattern, it is likely that these vast states will be permanently at war with one another, although it will not necessarily be a very intensive or bloody kind of war.

Of course, these super-states later made their appearance in Orwell’s most famous novel, 1984.  However, Orwell was right about Burnham the first time.  Burnham had an unfortunate penchant for making wrong predictions, often based on the assumption that transitory events must represent a trend that would continue into the indefinite future.  For example, impressed by the massive industrial might brought to bear by the United States during World War II, and its monopoly of atomic weapons, he suggested in The Struggle for the World, published in 1947, that we immediately proceed to force the Soviet Union to its knees, and establish a Pax Americana.  A bit later, in 1949, impressed by a hardening of the U.S. attitude towards the Soviet Union after the war, he announced The Coming Defeat of Communism in a book of that name.  He probably should have left it at that, but reversed his prognosis in Suicide of the West, which appeared in 1964.  By that time it seemed to Burnham that the United States had become so soft on Communism that the defeat of Western civilization was almost inevitable.  The policy of containment could only delay, but not stop, the spread of Communism, and in 1964 it seemed that once a state had fallen behind the Iron Curtain it could never throw off the yoke.

Burnham didn’t realize that, in the struggle with Communism, time was actually on our side.  A more far-sighted prophet, a Scotsman by the name of Sir James Mackintosh, had predicted in the early 19th century that the nascent versions of Communism then already making their appearance would eventually collapse.  He saw that the Achilles heel of what he recognized was really a secular religion was its ill-advised proclamation of a coming paradise on earth, where it could be fact-checked, instead of in the spiritual realms of the traditional religions, where it couldn’t.  In the end, he was right.  After they had broken 100 million eggs, people finally noticed that the Communists hadn’t produced an omelet after all, and the whole, seemingly impregnable edifice collapsed.

One thing Burnham did see very clearly, however, was the source of the West’s weakness – liberalism.  He was well aware of its demoralizing influence, and its tendency to collaborate with the forces that sought to destroy the civilization that had given birth to it.  Inspired by what he saw as an existential threat, he carefully studied and analyzed the type of the western liberal, and its evolution away from the earlier “liberalism” of the 19th century.  Therein lies the real value of his Suicide of the West.  It still stands as one of the greatest analyses of modern liberalism ever written.  The basic characteristics of the type he described are as familiar more than half a century later as they were in 1964.  And this time his predictions regarding the “adjustments” in liberal ideology that would take place as its power expanded were spot on.

In Chapters III-V of the book, Burnham developed a “more or less systematic set of ideas, theories and beliefs about society” characteristic of the liberal syndrome, and then listed nineteen of them, along with possible contrary beliefs, in Chapter VII.  Some of them have changed very little since Burnham’s day, such as,

It is society – through its bad institutions and its failure to eliminate ignorance – that is responsible for social evils.  Our attitude toward those who embody these evils – of crime, delinquency, war, hunger, unemployment, communism, urban blight – should not be retributive but rather the permissive, rehabilitating, educational approach of social service; and our main concern should be the elimination of the social conditions that are the source of the evils.

Since there are no differences among human beings considered in their political capacity as the foundation of legitimate, that is democratic, government, the ideal state will include all human beings, and the ideal government is world government.

The goal of political and social life is secular:  to increase the material and functional well-being of humanity.

Some of the 19 have begun to change quite noticeably since the publication of Suicide of the West in just the ways Burnham suggested.  For example, items 9 and 10 on the list reflect a classic version of the ideology that would have been familiar to and embraced by “old school” liberals like John Stuart Mill:

Education must be thought of as a universal dialogue in which all teachers and students above elementary levels may express their opinions with complete academic freedom.

Politics must be thought of as a universal dialogue in which all persons may express their opinions, whatever they may be, with complete freedom.

Burnham had already noticed signs of erosion in these particular shibboleths in his own day, as liberals gained increasing control of academia and the media.  As he put it,

In both Britain and the United States, liberals began in 1962 to develop the doctrine that words which are “inherently offensive,” as far-Right but not communist words seem to be, do not come under the free speech mantle.

In our own day of academic safe spaces and trigger warnings, there is certainly no longer anything subtle about this ideological shift.  Calls for suppression of “offensive” speech have now become so brazen that they have spawned divisions within the liberal camp itself.  One finds old school liberals of the Berkeley “Free Speech Movement” days resisting Gleichschaltung with the new regime, looking on with dismay as speaker after speaker is barred from university campuses for suspected thought crime.

As noted above, Communism imploded before it could overwhelm the Western democracies, but the process of decay goes on.  Nothing about the helplessness of Europe in the face of the current inundation by third world refugees would have surprised Burnham in the least.  He predicted it as an inevitable expression of another fundamental characteristic of the ideology – liberal guilt.  Burnham devoted Chapter 10 of his book to the subject, and noted therein,

Along one perspective, liberalism’s reformist, egalitarian, anti-discrimination, peace-seeking principles are, or at any rate can be interpreted as, the verbally elaborated projections of the liberal sense of guilt.

and

The guilt of the liberal causes him to feel obligated to try to do something about any and every social problem, to cure every social evil.  This feeling, too, is non-rational:  the liberal must try to cure the evil even if he has no knowledge of the suitable medicine or, for that matter, of the nature of the disease; he must do something about the social problem even when there is no objective reason to believe that what he does can solve the problem – when, in fact, it may well aggravate the problem instead of solving it.

I suspect Burnham himself would have been surprised at the degree to which such “social problems” have multiplied in the last half a century, and the pressure to do something about them has only increased in the meantime.  As for the European refugees, consider the following corollaries of liberal guilt as developed in Suicide of the West:

(The liberal) will not feel uneasy, certainly not indignant, when, sitting in conference or conversation with citizens of countries other than his own – writers or scientists or aspiring politicians, perhaps – they rake his country and his civilization fore and aft with bitter words; he is as likely to join with them in the criticism as to protest it.

It follows that,

…the ideology of modern liberalism – its theory of human nature, its rationalism, its doctrines of free speech, democracy and equality – leads to a weakening of attachment to groups less inclusive than Mankind.

All modern liberals agree that government has a positive duty to make sure that the citizens have jobs, food, clothing, housing, education, medical care, security against sickness, unemployment and old age; and that these should be ever more abundantly provided.  In fact, a government’s duty in these respects, if sufficient resources are at its disposition, is not only to its own citizens but to all humanity.

…under modern circumstances there is a multiplicity of interests besides those of our own nation and culture that must be taken into account, but an active internationalism in feeling as well as thought, for which “fellow citizens” tend to merge into “humanity,” sovereignty is judged an outmoded conception, my religion or no-religion appears as a parochial variant of the “universal ideas common to mankind,” and the “survival of mankind” becomes more crucial than the survival of my country and my civilization.

For Western civilization in the present condition of the world, the most important practical consequence of the guilt encysted in the liberal ideology and psyche is this:  that the liberal, and the group, nation or civilization infected by liberal doctrine and values, are morally disarmed before those whom the liberal regards as less well off than himself.

The inevitable implication of the above is that the borders of the United States and Europe must become meaningless in an age of liberal hegemony, as, indeed, they have.  In 1964 Burnham was not without hope that the disease was curable.  Otherwise, of course, he would never have written Suicide of the West.  He concluded,

But of course the final collapse of the West is not yet inevitable; the report of its death would be premature.  If a decisive change comes, if the contraction of the past fifty years should cease and be reversed, then the ideology of liberalism, deprived of its primary function, will fade away, like those feverish dreams of the ill man who, passing the crisis of his disease, finds he is not dying after all.  There are a few small signs, here and there, that liberalism may already have started fading.  Perhaps this book is one of them.

No, liberalism hasn’t faded.  The infection has only become more acute.  At best one might say that there are now a few more people in the West who are aware of the disease.  I am not optimistic about the future of Western civilization, but I am not foolhardy enough to predict historical outcomes.  Perhaps the fever will break, and we will recover, and perhaps not.  Perhaps there will be a violent crisis tomorrow, or perhaps the process of dissolution will drag itself out for centuries.  Objectively speaking, there is no “good” outcome and no “bad” outcome.  However, in the same vein, there is no objective reason why we must refrain from fighting for the survival of our civilization, our culture, or even the ethnic group to which we belong.

As for the liberals, perhaps they should consider why all the fine moral emotions they are so proud to wear on their sleeves exist to begin with.  I doubt that the reason has anything to do with suicide.

By all means, read the book.

Scientific Morality and the Illusion of Progress

British philosophers demonstrated the existence of a “moral sense” early in the 18th century.  We have now crawled through the rubble left in the wake of the Blank Slate debacle and finally arrived once again at a point they had reached more than two centuries ago.  Of course, men like Shaftesbury and Hutcheson thought this “moral sense” had been planted in our consciousness by God.  When Hume arrived on the scene a bit later it became possible to discuss the subject in secular terms.  Along came Darwin to suggest that the existence of this “moral sense” might have developed in the same way as the physical characteristics of our species; via evolution by natural selection.  Finally, a bit less than half a century later, Westermarck put two and two together, pointing out that morality was a subjective emotional phenomenon and, as such, not subject to truth claims.  His great work, The Origin and Development of the Moral Ideas, appeared in 1906.  Then the darkness fell.

Now, more than a century later, we can once again at least discuss evolved morality without fear of excommunication by the guardians of ideological purity.  However, the guardians are still there, defending a form of secular Puritanism that yields nothing in intolerant piety to the religious Puritans of old.  We must not push the envelope too far, lest we suffer the same fate as Tim Hunt, with his impious “jokes,” or Matt Taylor, with his impious shirt.  We cannot just blurt out, like Westermarck, that good and evil are merely subjective artifacts of human moral emotions, so powerful that they appear as objective things.  We must at least pretend that these “objects” still exist.  In a word, we are in a holding pattern.

One can actually pin down fairly accurately the extent to which we have recovered since our emergence from the dark age.  We are, give or take, about 15 years pre-Westermarck.  As evidence of this I invite the reader’s attention to a fascinating “textbook” for teachers of secular morality that appeared in 1891.  Entitled Elements of Ethical Science: A Manual for Teaching Secular Morality, by John Ogden, it taught the subject with all the most up-to-date Darwinian bells and whistles.  In an introduction worthy of Sam Harris the author asks the rhetorical question,

Can pure morality be taught without inculcating religious doctrines, as these are usually interpreted and understood?

and answers with a firm “Yes!”  He then proceeds to identify the basis for any “pure morality”:

Man has inherently a moral nature, an innate moral sense or capacity.  This is necessary to moral culture, since, without the nature or capacity, its cultivation were impossible… This moral nature or capacity is what we call Moral Sense.  It is the basis of conscience.  It exists in man inherently, and, when enlightened, cultivated, and improved, it becomes the active conscience itself.  Conscience, therefore, is moral sense plus intelligence.

The author recognizes the essential role of this Moral Sense as the universal basis of all the many manifestations of human morality, and one without which they could not exist.  It is to the moral sentiments what the sense of touch is to the other senses:

(The Moral Sense) furnishes the basis or the elements of the moral sentiments and conscience, much in the same manner in which the cognitive faculties furnish the data or elements for thought and reasoning.  It is not a sixth sense, but it is to the moral sentiments what touch is to the other senses, a base on which they are all built or founded; a soil into which they are planted, and from which they grow… All the moral sentiments are, therefore, but the concrete modifications of the moral sense, or the applications of it, in a developed form, to the ordinary duties of life, as a sense of justice, of right and wrong, of obligation, duty, gratitude, love, etc., just as seeing, hearing, tasting and smelling are but modified forms of feeling or touch, the basis of all sense.

And here, in a manner entirely similar to so many modern proponents of innate morality, Ogden goes off the tracks.  Like them, he cannot let go of the illusion of objective morality.  Just as the other senses inform us of the existence of physical things, the moral sense must inform us of the existence of another kind of “thing,” a disembodied, ghostly something that floats about independently of the “sense” that “detects” it, in the form of a pure, absolute truth.  There are numerous paths whereby one may, more or less closely, approach this truth, but they all converge on the same, universal thing-in-itself:

…it must be conceded that, while we have a body of incontestable truth, constituting the basis of all morality, still the opinions of men upon minor points are so diverse as to make a uniform belief in dogmatical principles impossible.  The author maintains that moral truths and moral conduct may be reached from different routes or sources; all converging, it is true, to the same point:  and that it savors somewhat of illiberality to insist upon a uniform belief in the means or doctrines whereby we are to arrive at a perfect knowledge of the truth, in a human sense.

The means by which this “absolute truth” acquires the normative power to dictate “oughts” to all and sundry is described in terms just as fuzzy as those used by the moral pontificators of our own day, as if it were ungenerous to even ask the question:

When man’s ideas of right and wrong are duly formulated, recognized and accepted, they constitute what we denominate MORAL LAW.  The moral law now becomes a standard by which to determine the quality of human actions, and a moral obligation demanding obedience to its mandates.  The truth of this proposition needs no further confirmation.

As they say in the academy to supply missing steps in otherwise elegant proofs, it’s “intuitively obvious to the casual observer.”  In those more enlightened times, only fifteen years elapsed before Westermarck demolished Ogden’s ephemeral thing-in-itself, pointing out that it couldn’t be confirmed because it didn’t exist, and was therefore not subject to truth claims.  I doubt that we’ll be able to recover the same lost ground so quickly in our own day.  Secular piety reigns in the academy, in some cases to a degree that would make the Puritans of old look like abandoned debauchees, and is hardly absent elsewhere.  Savage punishment is meted out to those who deviate from moral purity, whether flippant Nobel Prize winners or overly principled owners of small town bakeries.  Absent objective morality, the advocates of such treatment would lose their odor of sanctity and become recognizable as mere absurd bullies.  Without a satisfying sense of moral rectitude, bullying wouldn’t be nearly as much fun.  It follows that the illusion will probably persist a great deal longer than a decade and a half this time around.

Be that as it may, Westermarck still had it right.  The “moral sense” exists because it evolved.  Failing this basis, morality as we know it could not exist.  It follows that there is no such thing as moral truth, or any way in which the moral emotions of one individual can gain a legitimate power to dictate rules of behavior to some other individual.  Until we find our way back to that rather elementary level of self-understanding, it will be impossible for us to deal rationally with our own moral behavior.  We’ll simply have to leave it on automatic pilot, and indulge ourselves in the counter-intuitive hope that it will serve our species just as well now as it did in the vastly different environment in which it evolved.

The Regrettable Overreach of “Faith versus Fact”

The fact that the various gods that mankind has invented over the years, including the currently popular ones, don’t exist has been sufficiently obvious to any reasonably intelligent pre-adolescent who has taken the trouble to think about it since at least the days of Jean Meslier.  That unfortunate French priest left us with a Testament that exposed the folly of belief in imaginary super-beings long before the days of Darwin.  It included most of the “modern” arguments, including the dubious logic of inventing gods to explain everything we don’t understand, the many blatant contradictions in the holy scriptures, the absurdity of the notion that an infinitely wise and perfect being could be moved to fury or even offended by the pathetic sins of creatures as abject as ourselves, the lack of any need for a supernatural “grounding” for human morality, and many more.  Over the years these arguments have been elaborated and expanded by a host of thinkers, culminating in the work of today’s New Atheists.  These include Jerry Coyne, whose Faith versus Fact represents their latest effort to talk some sense into the true believers.

Coyne has the usual human tendency, shared by his religious opponents, of “othering” those who disagree with him.  However, besides sharing a “sin” that few if any of us are entirely free of, he has some admirable traits as well.  For example, he has rejected the Blank Slate ideology of his graduate school professor/advisor, Richard Lewontin, and even goes so far as to directly contradict him in FvF.  In spite of the fact that he is an old “New Leftist” himself, he has taken a principled stand against the recent attempts of the ideological Left to dismantle freedom of speech and otherwise decay to its Stalinist ground state.  Perhaps best of all as far as a major theme of this blog is concerned, he rejects the notion of objective morality that has been so counter-intuitively embraced by Sam Harris, another prominent New Atheist.

For the most part, Faith versus Fact is a worthy addition to the New Atheist arsenal.  It effectively dismantles the “sophisticated Christian” gambit that has encouraged meek and humble Christians of all stripes to imagine themselves on an infinitely higher intellectual plane than such “undergraduate atheists” as Richard Dawkins and Chris Hitchens.  It refutes the rapidly shrinking residue of “God of the gaps” arguments, and clearly illustrates the difference between scientific evidence and religious “evidence.”  It destroys the comfortable myth that religion is an “other way of knowing,” and exposes the folly of seeking to accommodate religion within a scientific worldview.  It was all the more disappointing, after nodding approvingly through most of the book, to suffer one of those “Oh, No!” moments in the final chapter.  Coyne ended by wandering off into an ideological swamp with a fumbling attempt to link obscurantist religion with “global warming denialism!”

As it happens, I am a scientist myself.  I am perfectly well aware that when an external source of radiation such as that emanating from the sun passes through an ideal earthlike atmosphere that has been mixed with a dose of greenhouse gases such as carbon dioxide, impinges on an ideal earthlike surface, and is re-radiated back into space, the resulting equilibrium temperature of the atmosphere will be higher than if no greenhouse gases were present.  I am also aware that we are rapidly adding such greenhouse gases to our atmosphere, and that it is therefore reasonable to be concerned about the potential effects of global warming.  However, in spite of that it is not altogether irrational to take a close look at whether all the nostrums proposed as solutions to the problem will actually do any good.
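The idealized radiation balance described in the paragraph above can be captured in a few lines.  What follows is a minimal sketch of my own, a zero-dimensional energy-balance model with a single partially absorbing atmospheric layer; the albedo and emissivity values are rough textbook assumptions, not figures taken from Coyne or from any actual climate code:

```python
# Toy zero-dimensional energy-balance model (illustration only, not a
# climate code): a single-layer atmosphere with emissivity eps absorbs
# part of the outgoing longwave radiation and re-radiates half of it
# back downward, raising the equilibrium surface temperature.
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0              # solar constant, W m^-2
ALBEDO = 0.3             # planetary albedo (rough assumption)

def surface_temp(eps):
    """Equilibrium surface temperature (K) for atmospheric emissivity eps."""
    absorbed = S0 * (1.0 - ALBEDO) / 4.0        # solar input averaged over the sphere
    return (absorbed / (SIGMA * (1.0 - eps / 2.0))) ** 0.25

t_clear = surface_temp(0.0)    # no greenhouse absorption
t_green = surface_temp(0.78)   # rough present-day effective emissivity
print(f"no greenhouse: {t_clear:.1f} K, with greenhouse: {t_green:.1f} K")
```

Even this cartoon reproduces the basic point: with no absorbing atmosphere the surface sits near 255 K, and with a plausible emissivity it lands near the observed 288 K.  Everything else, from clouds to ocean circulation, is what the real models have to wrestle with.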

In fact, the earth does not have an ideal static atmosphere over an ideal static and uniform surface.  Our planet’s climate is affected by a great number of complex, interacting phenomena.  A deterministic computer model capable of reliably predicting climate change decades into the future is far beyond the current state of the art.  It would need to deal with literally millions of degrees of freedom in three dimensions, in many cases using potentially unreliable or missing data.  The codes currently used to address the problem are probabilistic, reduced-basis models that can give significantly different answers depending on the choice of initial conditions.
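A simple way to see why initial conditions matter so much is to watch a classic chaotic toy system, the Lorenz equations, diverge.  The sketch below is purely illustrative, a three-variable system rather than anything resembling a climate model: two integrations starting one part per million apart soon bear no resemblance to each other.

```python
# Sensitivity to initial conditions in the Lorenz system, integrated
# with a simple forward-Euler step (adequate for a demonstration).
def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz system one Euler step of size dt."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 1.0)
b = (1.000001, 1.0, 1.0)   # perturbed by one part per million
for _ in range(3000):      # 30 time units at dt = 0.01
    a = lorenz_step(a)
    b = lorenz_step(b)
print(a[0], b[0])          # the two trajectories have long since decorrelated
```

A climate code faces the same mathematical pathology with millions of coupled variables instead of three, which is why such codes are run as ensembles of perturbed initial states rather than as single deterministic forecasts.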

In a recently concluded physics campaign at Lawrence Livermore National Laboratory, scientists attempted to achieve thermonuclear fusion ignition by hitting tiny targets containing heavy isotopes of hydrogen with the most powerful laser system ever built.  The codes they used to model the process should have been far more accurate than any current model of the earth’s climate.  These computer models included all the known relevant physical phenomena, and had been carefully benchmarked against similar experiments carried out on less powerful laser systems.  In spite of that, the best experimental results didn’t come close to the computer predictions.  The actual number of fusion reactions did not even come within two orders of magnitude of the predicted values.  The number of physical approximations that must be used in climate models is far greater than was necessary in the Livermore fusion codes, and their value as predictive tools must be judged accordingly.

In a word, we have no way of accurately predicting the magnitude of the climate change we will experience in coming decades.  If we had unlimited resources, the best policy would obviously be to avoid rocking the only boat we have at the moment.  However, this is not an ideal world, and we must wisely allocate what resources we do have among competing priorities.  Resources devoted to fighting climate change will not be available for medical research and health care, education, building the infrastructure we need to maintain a healthy economy, and many other worthy purposes that could potentially not only improve human well-being but save many lives.  Before we succumb to frantic appeals to “do something,” and spend a huge amount of money to stop global warming, we should at least be reasonably confident that our actions will measurably reduce the danger.  To what degree can we expect “science” to inform our decisions, whatever they may be?

For starters, we might look at the track record of the environmental scientists who are now sounding the alarm.  The Danish scientist Bjorn Lomborg examined that record in his book, The Skeptical Environmentalist, in areas as diverse as soil erosion, storm frequency, deforestation, and declining energy resources.  Time after time he discovered that they had been crying “wolf,” distorting and cherry-picking the data to support dire predictions that never materialized.  Lomborg’s book did not start a serious discussion of potential shortcomings of the scientific method as applied in these areas.  Instead he was bullied and vilified.  A kangaroo court, made up of some of the more abject examples of so-called “scientists” in that country, was organized in Denmark and quickly found Lomborg guilty of “scientific dishonesty,” a verdict which the Danish science ministry later had the decency to overturn.  In short, the same methods were used against Lomborg as were used decades earlier to silence critics of the Blank Slate orthodoxy in the behavioral sciences, resulting in what was possibly the greatest scientific debacle of all time.  At the very least we can conclude that all the scientific checks and balances that Coyne refers to in such glowing terms in Faith versus Fact have not always functioned with ideal efficiency in promoting the cause of truth.  There is reason to believe that the environmental sciences are one area in which this has been particularly true.

Under the circumstances it is regrettable that Coyne chose to equate “global warming denialism,” a pejorative term used in ideological squabbles that is by its very nature unscientific, with some of the worst forms of religious obscurantism.  Instead of sticking to the message, in the end he let his political prejudices obscure it.  Objections to the prevailing climate change orthodoxy are hardly coming exclusively from the religious fanatics who sought to enlighten us with “creation science” and “intelligent design.”  I invite anyone suffering from that delusion to have a look at some of the articles the physicist and mathematician Lubos Motl has written about the subject on his blog, The Reference Frame.  Examples may be found here, here and, for an example with a “religious” twist, here.  There he will find documented more instances of the type of “scientific” behavior Lomborg cited in The Skeptical Environmentalist.  No doubt many readers will find Motl irritating and tendentious, but he knows his stuff.  Anyone who thinks he can refute his take on the “science” had better be equipped with more knowledge of the subject than is typically included in the bromides that appear in the New York Times.

Alas, I fear that I am once again crying over spilt milk.  I can only hope that Coyne has an arrow or two left in his New Atheist quiver, and that next time he chooses a publisher who will insist on ruthlessly chopping out all the political Nebensächlichkeiten.  Meanwhile, have a look at his Why Evolution is True website.  In addition to presenting a convincing case for evolution by natural selection and a universe free of wrathful super beings, Professor Ceiling Cat, as he is known to regular visitors for reasons that will soon become apparent to newbies, also posts some fantastic wildlife pictures.  And if it’s any consolation, I see his book has been panned by John Horgan.  Anyone with enemies like that can’t be all bad.  Apparently Horgan’s review was actually solicited by the editors of the Wall Street Journal.  Go figure!  One wonders what rock they’ve been sleeping under lately.

But What of Shaftesbury?

In this and my previous post, I discuss some British philosophers that even most well-educated laypeople have never heard of.  Why?  Because they shed a great deal of light on the subjects of human nature and morality.  These subjects are critical to our self-understanding, which, in turn, is critical to our survival.  If we had read, understood, and built on what they taught, we might have avoided wandering into many of the blind alleys into which we were led by subsequent generations of the “men of science.”  The most damaging and delusional blind alley of all was the Blank Slate orthodoxy.  Ironically, it was enforced by exploiting the very moral emotions whose existence it denied, setting back the behavioral sciences and moral philosophy by more than a century in the process.  View, if you like, these posts as an attempt to pick up the lost threads.

In my previous post I highlighted the philosophy of Francis Hutcheson.  I note in passing that he was actually born in Ireland, and studied and received his degree in Scotland.  I did that because Hutcheson was the first, or at least the first I know of, to elaborate a well thought out and coherent theory of the origins of morality in an innate “moral sense,” demonstrating in the process why, absent such a moral sense, moral behavior is not even possible.  In other words, the “root cause” of morality is this moral sense.  Furthermore, Hutcheson explained why, as a consequence, it is impossible to distinguish between good and evil using reason alone.  Two hundred years later the great Finnish moral philosopher Edvard Westermarck, who had read and admired Hutcheson, noted that in the ensuing years, his contention that, “the moral concepts are ultimately based on emotions either of indignation or approval, is a fact which a certain school of thinkers have in vain attempted to deny.”

That said, Hutcheson was hardly the only 18th century British author in whose works such ideas appear.  Such authors are often able to see further and more clearly than those who have come before by virtue of the privilege of, as Newton put it, “standing on the shoulders of giants.”  In Hutcheson’s case, one such giant was Anthony Ashley-Cooper, the third Earl of Shaftesbury.  Hutcheson certainly left no one in doubt concerning his debt to Shaftesbury in his own time.  Sir James MacKintosh, who left sketches of many forgotten British moral philosophers who are well worth reading today in his “On the progress of ethical philosophy, chiefly during the XVIIth & XVIIIth centuries,” which first appeared as a supplement to the Encyclopaedia Britannica in 1829, went so far as to refer to Shaftesbury as Hutcheson’s “master.”  Although Shaftesbury was born in England, MacKintosh claimed that, “…the philosophy of Shaftesbury was brought by Hutcheson from Ireland,” after it and similar works had been suppressed in England for some time “by an exemplary but unlettered clergy.”

Like Hutcheson’s, many of the themes in Shaftesbury’s writings would sound very familiar to modern evolutionary psychologists.  For example, he had this to say on the Blank Slate ideology of his day:

It was Mr. Locke that struck at all fundamentals, threw all order and virtue out of the world, and made the very ideas of these… unnatural and without foundation in our minds.

Locke, of course, is often cited as a forerunner of the Blank Slaters of the 20th century, although the comparison isn’t entirely accurate.  He rejected innate morality because it was incompatible with his Christian theology rather than the secular “progressive” ideology of a later day.

The key theme of Hutcheson’s work as far as the modern science of morality is concerned – the existence of an innate “moral sense” – is, if anything, emphasized even more strongly in the writings of Shaftesbury.  For example, from his Inquiry Concerning Virtue or Merit, probably his most important work on morality as far as modern readers are concerned,

Sense of right and wrong therefore being as natural to us as natural affection itself, and being a first principle in our constitution and make; there is no speculative opinion, persuasion or belief, which is capable immediately or directly to exclude or destroy it.  That which is of original and pure nature, nothing beside contrary habit or custom (a second nature) is able to displace.  And this affection being an original one of earliest rise in the soul or affectionate part; nothing beside a contrary affection, by frequent check and control, can operate upon it, so as either to diminish it in part, or destroy it in whole.

A somewhat startling aspect of Shaftesbury’s work, given the time in which it was written, was his recognition of the continuity between human beings and other animal species.  For example, again from the Inquiry,

We know that every creature has a private good and interest of his own, which Nature has compelled him to seek, by all the advantages afforded him within the compass of his make.  We know that there is in reality a right and a wrong state of every creature, and that this right one is by nature forwarded and by himself affectionately sought.

and

We have found that, to deserve the name of good or virtuous, a creature must have all his inclinations and affections, his dispositions of mind and temper, suitable, and agreeing with the good of his kind, or of that system in which he is included, and of which he constitutes a part.

and, finally,

The ordinary animals appear unnatural and monstrous when they lose their proper instincts, forsake their kind, neglect their offspring, and pervert those functions or capacities bestowed by nature.  How wretched must it be, therefore, for man, of all other creatures, to lose that sense and feeling which is proper to him as a man, and suitable to his character and genius?

If one didn’t know better, one might easily imagine that E. O. Wilson’s latest book, The Meaning of Human Existence, with its assertions about our “good” nature being the result of group selection, and our “evil” nature the result of selection at the level of the individual, had been inspired by Shaftesbury.  For example,

There being allowed therefore in a creature such affections as these towards the common nature or system of the kind, together with those other which regard the private nature or self-system, it will appear that in following the first of these affections, the creature must on many occasions contradict and go against the latter.  How else should the species be preserved?  Or what would signify that implanted natural affection, by which a creature through so many difficulties and hazards preserves its offspring and supports its kind.

One must hope that such passages won’t draw down on Shaftesbury’s head the anathemas of Richard Dawkins and Steven Pinker as the great heresiarch of group selection theory.

In a remarkable passage that might have been lifted from the pages of Westermarck, Shaftesbury reveals some doubt regarding the objective existence of good and evil, in spite of our tendency to imagine them in that way:

If there be no real amiableness or deformity in moral acts, there is at least an imaginary one of full force.  Though perhaps the thing itself should not be allowed in nature, the imagination or fancy of it must be allowed to be from nature alone.  Nor can anything besides art and strong endeavor, with long practice and meditation, overcome such a natural prevention or prepossession of the mind in favor of this moral distinction.

Finally, at the risk of exhausting the patience of even my most dogged readers, allow me to throw in another aspect of Shaftesbury’s writings that would put him “ahead of his time” even if he were alive today; his dispassionate and temperate comments on the subject of atheism.  Consider, for example, the following:

…it does not seem, that atheism should of itself be the cause of any estimation or valuing of anything as fair, noble, and deserving which was the contrary.  It can never, for instance, make it be thought that the being able to eat man’s flesh, or commit bestiality, is good and excellent in itself.  But this is certain, that by means of corrupt religion or superstition, many things the most horridly unnatural and inhuman come to be received as excellent, good, and laudable in themselves.

and

…religion, (according as the kind may prove) is capable of doing great good or harm, and atheism nothing positive in either way.   For however it may be indirectly an occasion of men’s losing a good and sufficient sense of right and wrong, it will not, as atheism merely, be the occasion of setting up a false species of it, which only false religion or fantastical opinion, derived commonly through superstition or credulity, is able to effect.

To confirm those observations, one need look no further than recent events in the Middle East.  When it comes to “fantastical opinion, derived commonly through superstition or credulity,” the 20th century gave us two outstanding examples, in the form of Communism and Nazism.  Pundits like Bill O’Reilly claim that atheism itself is responsible for all the crimes of these modern secular versions of “corrupt religion.”  This was a form of bigotry of which Shaftesbury, writing three centuries earlier, give or take, was not capable.

Of course, Shaftesbury no more wrote in a vacuum than Hutcheson.  Similar themes may be found in the work of many other British moral philosophers of the time.  In particular, Joseph Butler, like Hutcheson, borrowed heavily from Shaftesbury in developing his own ideas regarding the origins of morality in human nature.  Brief descriptions of the work of many others may be found in the book by Sir James MacKintosh referred to above, and in Michael Gill’s excellent book, The British Moralists on Human Nature and the Birth of Secular Ethics.

E. O. Wilson’s “The Meaning of Human Existence:” Doubling Down on Group Selection

It’s great to see another title by E. O. Wilson.  Reading his books is like continuing a conversation with a wise old friend.  If you run into him on the street you don’t expect to hear him say anything radically different from what he’s said in the past.  However, you always look forward to chatting with him because he’s never merely repetitious or tiresome.   He always has some thought-provoking new insight or acute comment on the latest news.  At this stage in his life he also delights in puncturing the prevailing orthodoxies, without the least fear of the inevitable anathemas of the defenders of the faith.

In his latest, The Meaning of Human Existence, he continues the open and unabashed defense of group selection that so rattled his peers in his previous book, The Social Conquest of Earth.  I’ve discussed some of the reasons for their unease in an earlier post.  In short, if it can really be shown that the role of group selection in human evolution has been as prominent as Wilson claims, it will seriously mar the legacy of such prominent public intellectuals as Richard Dawkins and Steven Pinker, as well as a host of other prominent scientists, who have loudly and tirelessly insisted on the insignificance of group selection.  It will also require some serious adjustments to the fanciful yarn that currently passes as the “history” of the Blank Slate affair.  Obviously, Wilson is firmly convinced that he’s on to something, because he’s not letting up.  He dismisses the alternative inclusive fitness interpretation of evolution as unsupported by the evidence and at odds with the most up-to-date mathematical models.  In his words,

Although the controversy between natural selection and inclusive fitness still flickers here and there, the assumptions of the theory of inclusive fitness have proved to be applicable only in a few extreme cases unlikely to occur on Earth or on any other planet.  No example of inclusive fitness has been directly measured.  All that has been accomplished is an indirect analysis called the regressive method, which unfortunately has itself been mathematically invalidated.

Interestingly, while embracing group selection, Wilson then explicitly agrees with one of the most prominent defenders of inclusive fitness, Richard Dawkins, on the significance of the gene:

The use of the individual or group as the unit of heredity, rather than the gene, is an even more fundamental error.

Very clever, that, a preemptive disarming of the predictable invention of straw men to attack group selection via the bogus claim that it implies that groups are the unit of selection.  The theory of group selection already has a fascinating, not to mention ironical, history, and its future promises to be no less entertaining.

When it comes to the title of the book, Wilson himself lets us know early on that it’s just a forgivable form of “poetic license.”  In his words,

In ordinary usage the word “meaning” implies intention.  Intention implies design, and design implies a designer.  Any entity, any process, or definition of any word itself is put into play as a result of an intended consequence in the mind of the designer.  This is the heart of the philosophical worldview of organized religions, and in particular their creation stories.  Humanity, it assumes, exists for a purpose.  Individuals have a purpose in being on Earth.  Both humanity and individuals have meaning.

Wilson is right when he says that this is what most people understand by the term “meaning,” and later in the book he decidedly rejects the notion that such “meaning” is even possible, dismissing religious belief more bluntly than in any of his previous books.  He provides himself with a fig leaf in the form of a redefinition of “meaning” as follows:

There is a second, broader way the word “meaning” is used, and a very different worldview implied.  It is that the accidents of history, not the intentions of a designer, are the source of meaning.

I rather suspect most philosophers will find this redefinition unpalatable.  Beyond that, I won’t begrudge Wilson his fig leaf.  After all, if one takes the trouble to write books, one generally also has an interest in selling them.

As noted above, another significant difference between this and Wilson’s earlier books is his decisive support for what one might call the “New Atheist” line, as set forth in books by the likes of Richard Dawkins, Sam Harris, and Christopher Hitchens.  Obviously, Wilson has been carefully following the progress of the debate.  He rejects religions, significantly in both their secular as well as their traditional spiritual manifestations, as both false and dangerous, mainly because of their inevitable association with tribalism.  In his words,

Religious warriors are not an anomaly.  It is a mistake to classify believers of particular religious and dogmatic religionlike ideologies into two groups, moderate versus extremist.  The true cause of hatred and violence is faith versus faith, an outward expression of the ancient instinct of tribalism.  Faith is the one thing that makes otherwise good people do bad things.

and, embracing the ingroup/outgroup dichotomy in human moral behavior I’ve often alluded to on this blog,

The great religions… are impediments to the grasp of reality needed to solve most social problems in the real world.  Their exquisitely human flaw is tribalism.  The instinctual force of tribalism in the genesis of religiosity is far stronger than the yearning for spirituality.  People deeply need membership in a group, whether religious or secular.  From a lifetime of emotional experience, they know that happiness, and indeed survival itself, require that they bond with others who share some amount of genetic kinship, language, moral beliefs, geographical location, social purpose, and dress code – preferably all of these but at least two or three for most purposes.  It is tribalism, not the moral tenets and humanitarian thought of pure religion, that makes good people do bad things.

Finally, in a passage worthy of New Atheist Jerry Coyne himself, Wilson denounces both “accommodationists” and the obscurantist teachings of the “sophisticated Christians:”

Most serious writers on religion conflate the transcendent quest for meaning with the tribalistic defense of creation myths.  They accept, or fear to deny, the existence of a personal deity.  They read into the creation myths humanity’s effort to communicate with the deity, as part of the search for an uncorrupted life now and beyond death.  Intellectual compromisers one and all, they include liberal theologians of the Niebuhr school, philosophers battening on learned ambiguity, literary admirers of C. S. Lewis, and others persuaded, after deep thought, that there must be Something Out There.  They tend to be unconscious of prehistory and the biological evolution of human instinct, both of which beg to shed light on this very important subject.

In a word, Wilson has now positioned himself firmly in the New Atheist camp.  This is hardly likely to mollify many of the prominent New Atheists, who will remain bitter because of his promotion of group selection, but at this point in his career, Wilson can take their hostility pro granulum salis.

There is much more of interest in The Meaning of Human Existence than I can cover in a blog post, such as Wilson’s rather vague reasons for insisting on the importance of the humanities in solving our problems, his rejection of interplanetary and/or interstellar colonization, and his speculations on the nature of alien life forms.  I can only suggest that interested readers buy the book.

Of the War on Christmas and the Thinness of Leftist Skins

‘Twas the month before Christmas, and Bill O’Reilly launched his usual jihad against the purported “War on Christmas.” It drew the predictable counterblasts from the Left, and I just happened to run across one that appeared back on December 4 on Huffpo, entitled “A War on Reason, Not on Christmas.” I must admit I find the “War on Christmas” schtick tiresome. Conservatives rightly point to the assorted liberal cults of victimization as so much pious grandstanding. It would be nice if they practiced what they preach and refrained from concocting similar cults of their own. Be that as it may, I found the article in question somewhat more unctuous and self-righteous than usual, and left a comment to that effect. It was immediately deleted.

My comment included no ad hominem attacks, nor was it abusive. I simply disagreed with the author on a few points, and noted that the political Left has an exaggerated opinion of its devotion to reason. The main theme of the article was the nature of the political divide in the U.S. According to the author, it is less between rich and poor than between “reasonable” liberals and “irrational” conservatives. As he put it,

Before imploding in the face of his sordid extramarital trysts, presidential candidate John Edwards based his campaign on the idea of two Americas, one rich the other poor. He was right about the idea that America is divided, but wrong about the nature of the division. The deeper and more important split is defined by religiosity, not riches.

The conflict between these two world views is made apparent in the details of our voting booth preferences. Religiosity alone is the most important, obvious and conclusive factor in determining voter behavior. Simply put, church goers tend to vote Republican. Those who instead go to the hardware store on Sunday vote Democrat by wide margins.

He then continued,

Those who accept the idea of god tend to divide the world into believers and atheists. Yet that is incorrect. Atheist means “without god” and one cannot be without something that does not exist. Atheism is really a pejorative term that defines one world view as the negative of another, as something not what something else is.

This evoked my first comment, which seemed to me rather harmless on the face of it. I merely said that as an atheist myself, I had no objection to the term, and would prefer to avoid the familiar game of inventing ever more politically correct replacements until we ended up with some abomination seven or eight syllables long. However, what followed was even more remarkable. The author proceeded to deliver himself of a pronouncement about the nature of morality that might have been lifted right out of one of Ardrey’s books. In a section entitled, “Secular and Religious Morality,” he writes,

Traits that we view as moral are deeply embedded in the human psyche. Honesty, fidelity, trustworthiness, kindness to others and reciprocity are primeval characteristics that helped our ancestors survive. In a world of dangerous predators, early man could thrive only in cooperative groups. Good behavior strengthened the tribal bonds that were essential to survival. What we now call morality is really a suite of behaviors favored by natural selection in an animal weak alone but strong in numbers. Morality is a biological necessity and a consequence of human development, not a gift from god.

Exactly! Now, as I’ve often pointed out to my readers, if morality really is the expression of evolved traits as the author suggests, it exists because it happened to enhance the chances that certain genes we carry would survive and reproduce in the environment in which they happened to appear. There is no conceivable way in which they could somehow acquire the magic quality of corresponding to some “real, objective” morality in the sky. There is no way in which they could assume a “purpose” attributed to them by anyone, whether on the left or the right of the political spectrum. Finally, there is no way in which they could acquire the independent legitimacy to dictate to anyone the things they “really” ought or ought not to do. So much is perfectly obvious. Assuming one really is “reasonable,” it follows immediately from what the author of the article says about the evolved origins of morality above. That, of course, is not how the Left is spinning the narrative these days.

No, for a large faction on the secular Left, the fact that morality is evolved means not merely that the God-given morality of the Christians and other religious sects is “unreasonable.” For them, it follows that whatever whims they happen to tart up as the secular morality du jour become “reasonable.” That means that they are not immoral, or amoral. They are, by default, the bearers of the “true morality.”  In the article in question it goes something like this:

The species-centric arrogance of religion cultivates a dangerous attitude about our relationship with the environment and the resources that sustain us. Humanists tend to view sustainability as a moral imperative while theists often view environmental concerns as liberal interference with god’s will. Conservative resistance to accepting the reality of climate change is just one example, and another point at which religious and secular morality diverge, as the world swelters.

It’s wonderful, really. The Left has always been addicted to moralistic posing, and now they don’t have to drop the charade! Now they can be as self-righteous as ever, as devotees of this secular version of morality that has miraculously acquired the power to become a thing-in-itself, presumably drifting up there in the clouds somewhere beyond the profane ken of the unenlightened Christians. As it happens, at the moment my neighbors are largely Mormon, and I must say their dogmas appear to me to be paragons of “reason” compared to this secular version of morality in the sky.

Of course, I couldn’t include all these observations in the Huffpo comment section. I merely pointed out that what the author had said about morality would have branded him as a heretic no more than 20 years ago, and evoked frenzied charges of “racism” and “fascism” from the same political Left in which he now imagines himself so comfortably ensconced. That’s because 20 years ago the behavioral sciences were still in thrall to the Blank Slate orthodoxy, as they had been for 50 years and more at the time. That orthodoxy was the greatest debacle in the history of science, and it was the gift, not of the Right, but of the “reasonable” secular Left. That was the point I made in the comment section, along with the observation that liberals would do well to keep it in mind before they break their arms patting themselves on the back for being so “reasonable.”

The author concluded his article with the following:

There is no war on Christmas; the idea is absurd at every level. Those who object to being forced to celebrate another’s religion are drowning in Christmas in a sea of Christianity dominating all aspects of social life. An 80 percent majority can claim victimhood only with an extraordinary flight from reality. You are probably being deafened by a rendition of Jingle Bells right now. No, there is no war on Christmas, but make no mistake: the Christian right is waging a war against reason. And they are winning. O’Reilly is riding the gale force winds of crazy, and his sails are full.

I must agree that the beloved Christian holiday does have a fighting chance of surviving the “War on Christmas.” Indeed, Bill O’Reilly himself has recently been so sanguine as to declare victory.  When it comes to popular delusions, however, I suspect the Left’s delusion that it has a monopoly on “reason” is likely to be even more enduring.  As for the deletion of my comment, we all know about the Left’s proclivity for suppressing speech that they find “offensive.”  Thin skins are encountered in those political precincts at least as frequently as the characteristic delusions about “reason.”