Thinking Thoughts About Gods and Science in Other Venues

I recently wrote a couple of brief op-eds for the website Atheist Republic, an online community for folks inclined toward secular thinking.

I figured I would link to them below. Follow the links for the full text.

Religious Belief Is Hard Work

Religious belief stands in belligerent indifference to information about what the world is like. It persists in spite of nature, not because of it. The scales started to fall from my eyes as I developed a deeper and more expansive understanding of science. In a panicked state of youthful naivety, I tried to justify my religious beliefs despite the fact that they were contradicted by many of the more elegant and substantive truths derived from science. It was an exhausting struggle.

Aspirational Atheism

…an embrace of reason need not stop at recognition of and resistance to the harms of superstitious belief. It can also inform our sense of what we want for ourselves and our fellow humans. Reason leads us to reject religion, but it also leads us to recognize our shared humanity. It leads to the eradication of disease and the recognition of individual human rights. Embracing reason is the groundwork for unleashing human potential and building a world increasingly amenable to the business of human thriving.

Trolling the Professor Watchlist

I’ve pounded the keys exhaustively over the issue of the postmodern left’s petty, censorious crusade to sanitize discourse. But lest we forget, an opportunistic relationship to free speech is a bipartisan issue. Case in point: the recently christened “Professor Watchlist”, where concerned college students can report their instructors for “liberal bias”. This is a frankly absurd initiative, with more than a few ominous undertones. I’m no historian, but I can’t think of an example of anything good ever coming from putting academics on a watchlist. It’s a campaign that reeks of deep, authoritarian impulses.

In any event, the sheer ridiculousness of the Professor Watchlist calls for mockery. To that end, I went ahead and submitted a few particularly egregious offenders:

[screenshots: slide2, slide3, slide4]

If you’re aware of any similarly subversive pinkos, feel free to submit a tip to the Professor Watchlist.

On the Value of Work

Historian James Livingston has written an interesting piece for Aeon. In it, he asks “what is the value of work?” – a question given added urgency by the fact that, hanging just over the horizon, is a future where advances in AI and automation may wipe out a huge segment of the job market.

The conviction that there is a clear correspondence between effort and reward probably emerged on the pre-industrial American frontiers. Out in the hinterlands, the connection between hard work and economic return was always obvious. If you were a farmer, the amount of food you harvested followed directly from the amount of seed you sowed, the work you put into building irrigation systems, and the time you spent tending your crops. Ranchers had more beef to sell when they spent more time watching their herds. For a fur trapper, supplies and money varied in proportion to the number of hides he could sell back to a company at the end of the season. Gold prospectors got more money out of mining more gold.

Today, the relationships are considerably more nebulous. Those who have had a lot of luck in life are quick to point to the efforts that preceded it. No doubt, those are causally efficacious, but they are hardly comprehensively explanatory. There are plenty of people who work hard and go nowhere. Likewise, there are even a few people who become wealthy beyond any coherent sense of proportion to the value they add to society. Was the work Lloyd Blankfein did in 2015 really worth over $23 million? Are there products circulating the globe whose value has been increased by $71.5 billion by the efforts of Warren Buffett? Those questions are clearly rhetorical, because the objective answer is a flat, unequivocal “no”.

To argue otherwise is to imbue markets with a sort of mystic omnipotence, suggesting that the prices that emerge from economic transactions are always and everywhere reflective of true value. Which is pure, unadulterated nonsense. There’s simply no way for economic agents to account for all the information that kind of computation would require. As a result, situations emerge where subsidiaries achieve market valuations in excess of their parent companies, or where an 86-year-old man is worth $71.5 billion despite never having invented a world-altering technology, discovered a lifesaving medical treatment, or even sold a piece of art. That people have profited immensely from grossly unethical – sometimes even outright criminal – behavior without ever suffering the slightest consequence suggests that the myth that human value is somehow reflected or enhanced by wages and net worth is not only misguided, but laughably deranged.

Once, decades – maybe even centuries – ago, under certain conditions at the fringes of the industrialized world, that was true. Not any longer. Some people work hard and do pretty well for themselves. Others work just as hard and accrue riches greater than the GDPs of entire nations. Some work even harder – two jobs and brutal swing shifts – and can’t save enough to retire or afford health insurance. Desperate to preserve that crusty, ramshackle American ethos of rugged individualism and the self-made man, some might interject that surely, while it is possible to succeed tremendously or fail miserably despite your best efforts, it is also true that it is impossible to succeed at all without at least putting your shoulder to the wheel in the first place. For the most part, that’s probably true – but I would remind that misty-eyed romantic that there are people alive and wealthy today because a rich man’s sperm fertilized a rich woman’s egg – generations ago.

The Use and Abuse of Cultural Relativism

A little over a year and a half ago, my wife and I were in Cambodia, sitting cross-legged on the wooden bow of a weathered prop boat, chatting with a local guide as he took us through a floating village populated – he told us – by ethnic Vietnamese immigrants. The boat cut through a wide channel in a patch of inundated forest on the Tonle Sap, our wake fanning out lazily to disappear among the trees or slosh mildly against the wooden bases of floating houses.

I don’t recall the precise details, but somehow our conversation turned to taxonomy. Our guide wanted to know what, according to our view, counted as an animal. Was an ant an animal? What about a human? This initially struck me as something of an odd question, but through the course of our back-and-forth it became apparent that it was quite sensible when understood from his perspective. Though the Western system of taxonomic classification exudes a comfortable aura of familiarity to those who have been raised with it, it is far from the most obvious way to order the natural world. No doubt this is part of the reason it didn’t occur to anyone until the last three centuries of human history.

In the view of our guide, the relationships among living things are defined by principles of opposition and symmetry. There must be two of each kind in a category and the relationships among categories are defined by an array of similarities and differences. That dogs and monkeys and snakes count as animals seemed to him common sense. That ants and humans are also members of the same category struck him as a little more peculiar.

When it comes to ordering the world of birds and beasts, the primary difference between our guide and ourselves was that we happened to subscribe to a classification system that orders the living world according to relationships of descent with modification. We apply a Linnaean classification system structured around the notion that all living things share a common origin, and that their relationships are defined by how recently they diverged from a shared ancestor in a nested hierarchy extending back to a successful batch of single-celled organisms that emerged from the primordial soup some 3.5 billion years ago. Though many of the features that seemed most salient in his classification system are irrelevant to our Darwinian framework, they nonetheless have their own internal logic.

Unfortunately, our conversation was far too short for anyone involved to build a comprehensive picture of the other’s worldview. Nevertheless, a few general points are obvious. Foremost, that our guide’s view differed so substantively from our own is surely not the product of any innate difference in cognitive capacity. Nor can it be attributed to any inherent difference between his most recent ancestors and our own.

Instead, his perspective is primarily a product of vast networks of highly contingent influences. A non-exhaustive list might include the distinct and ever-changing traditional beliefs of whatever ethnic group (or groups) he might belong to, his parents’ interpretation thereof, the ideas of his friends, the recent political history of Cambodia, and so forth. It’s highly unlikely that his ideas about the world are identical to those of his friends and neighbors, but they are nonetheless shaped by sampling a more widely shared cultural repertoire. The same can be said of my wife and me – the fact that we have learned about biological evolution is a consequence of the circumstances in which we were born and raised, and the distinct cultural trajectory pursued by a certain subset of Western Europeans following the invention of the printing press and the subsequent intellectual revolution of the Enlightenment.


A Brief History of Cultural Relativism

The view that human differences in knowledge, belief, or practice are usefully (if only partially) explained by the proximate influence of culture, and that these differences can be usefully illuminated through an understanding of the internal logic of the cultures that produce them, is the perspective offered by cultural relativism.

Unfortunately, cultural relativism is a widely misunderstood principle. The concept (though not the term itself) probably first emerged in the writings of the anthropologist Franz Boas in the closing decades of the 19th century. Since then, it has been a useful tool for ethnographers seeking to understand culturally motivated behavior. At the same time, it has also been a plague upon the larger enterprise of producing scientifically justifiable explanations for human behavior. Upon entering the popular vernacular, it has offered a veneer of intellectual rigor and scholarly nuance to facile arguments about the plurality of knowledge and the primacy of subjective experience. It has granted succor to the idea that there are “other ways of knowing” as reliable and equally deserving of confidence as the scientific method. It has sired pleas for “trigger warnings”, outrage over “microaggressions”, and an exaggerated emphasis on the subjective experience of feeling offended as a suitable justification for curtailing speech.

In its most extreme incarnations, cultural relativism has been advanced as a prohibition on any and all cross-cultural evaluation of beliefs, practices, and knowledge-claims. Everything from traditional accounts of cosmic creation and religious explanations of natural phenomena to moral arguments about the nature of good and bad behavior is taken off the table, shielded from scrutiny by the notion that cultural differences are sufficiently deep to render the humans they divide mutually unintelligible. Superficially, this stance has often been made to look like a sophisticated embrace of uncertainty, but it is really nothing of the sort. Instead, it is a pedantic, dehumanizing, and pusillanimous retreat from the hard work of uncovering truth and defending modern ideas about universal human rights.

Cultural relativism can be decomposed into three components. The first is methodological cultural relativism. This is a tool used by anthropologists and ethnographers to understand the beliefs, customs, ideas, and behavior of people with cultures and histories different from their own. In many respects mundane, methodological cultural relativism has proven extremely useful, allowing researchers to strip away the often blinding baggage of personal history and cultural bias.

Sadly, methodological relativism has come to be associated with the parasitic vices of normative (or moral) and cognitive relativism. Too often, these have served to rob mundane cultural relativism of its methodological utility. Cognitive relativism is what emerges when one looks upon the real problems inherent in the human quest for knowledge – the things that make rock-solid, axiomatic certainty so difficult to achieve – and withdraws from the search entirely. Similarly, normative relativism represents a withdrawal from the challenges that emerge from the fluid, turbulent ambiguity inherent in developing and applying coherent ethical systems.

Taken together, cognitive and normative relativism are the seeds for a system of intellectual and ethical obfuscation that will henceforth be referred to as fundamentalist cultural relativism (FCR). Rather than a useful system of intellectual criticism and methodological prescriptions, FCR is the intellectual equivalent of a mountaineer collapsing at the base of a gnarly, ferocious peak and declaring it impossible to summit before even giving it a serious attempt. Put more simply, it’s what happens when someone looks at a problem, notes that it’s hard, and immediately declares it intractable.

As a consequence, many serious academics have abandoned cultural relativism entirely. The physicist David Deutsch has cast it as a brand of “persistently popular irrationality.” Insofar as he is referring to the idea’s most common flavor, so thoroughly infused with cognitive and normative relativism and particularly pervasive among the social justice crowd, he isn’t far off the mark. Indeed, the extreme relativism that flourished in the humanities and social sciences in the 1980s and 1990s is nothing if not a persistently popular brand of irrationality. With the spread of the vapid school of vigorous intellectual masturbation and enthusiastic pedantry politely referred to as “postmodernism,” the notion that all knowledge-claims or moral positions are equally valid became dogma throughout many of the subdisciplines that fall within the scope of the humanities and social sciences. My own discipline, anthropology, is still recovering from the effects of this intellectual rot.

One of the chief problems with the brand of cultural relativism (justifiably) decried by thinkers like Deutsch is that it represents an abuse of a more useful principle. The notion that the beliefs, customs, and behaviors of different cultural groups could be usefully analyzed and understood in their own terms was a radical advance over previous anthropological methods. When the earliest forms of the concept were first advanced, beliefs about the social – and even biological – supremacy of white Western men were scarcely given a second thought. White men in Western, industrialized societies occupied the pinnacle of a long chain of social and biological evolution.

According to these views, Western culture represented the purest, most complete manifestation of a natural order, the metric against which all other human cultures should be judged. The cultural differences uncovered in places like the equatorial rain forests of Papua New Guinea or the frigid expanses of the high Arctic were commonly viewed as deviations from perfection. Today, these notions are rightly considered signally ridiculous and repugnant by all but the most intransigent racists and xenophobes. But to many prominent and serious thinkers occupying important positions of academic and political authority in the late 19th and early 20th century, nothing could have been more obvious than their own innate superiority.

Eventually, ethnographers and anthropologists like Franz Boas came to recognize that the rampant ethnocentrism of their peers was translating into bad science, fundamentally hobbling efforts to actually understand the roots of human behavioral variation. The idea that “traditional” societies, such as those found practicing foraging or horticulturalist lifestyles in the rainforests of Brazil, were somehow frozen in time – primitive vestiges of points along the slow but inevitable march toward the industrialized modernity of centralized state authority and market economics – represented a roadblock to intellectual progress. Recognizing the faults in this perspective, some turn-of-the-century anthropologists began to advocate analyzing cultures according to their own internal logic.

It’s worth noting that the kind of relativism espoused by Boas and his students was almost certainly more permissive than the version being advocated here. Nonetheless, as an analytical tool, cultural relativism represented a considerable improvement over the parochial, Western-centric worldview that had been distorting the thinking of early explorers and ethnographers. Rather than viewing non-Western cultures as primitive holdovers, serious anthropologists began to recognize that individual cultures – and the differences among them – could be fruitfully illuminated by seriously attempting to understand them in their own terms. Substantial utility could be found in recognizing that each culture is a product of a vast web of historical contingency.


Cultural Relativity as a Modern Research Tool

To understand why this is so, it’s useful to couch the argument in more modern terms. Human behavior, like that of other animals, is a product of a complex stew of influences. This includes a suite of genes – some highly conserved, others more recently selected as a result of the specific challenges that came along with being a bipedal, largely hairless, highly social primate living on the African savannah for the better share of the Pleistocene. It also includes a capacity to learn and usefully modify our behavior over the span of hours, days, weeks, months, or years – allowing us to respond to environmental variability much more rapidly than the relatively glacial pace at which natural selection sorts the genetic wheat from the chaff. But the inheritance of genetic information is ubiquitous across the living world, and learning is not particularly rare. Humanity’s peculiar spark is found in our unique capacity to create, accumulate, store, and transmit vast sums of non-genetic information – i.e. culture.

Culture itself is an evolving system, emerging as generations of individuals interact with one another while attempting to cope with the challenges of living in particular environments and trying to build stable social arrangements therein. Ideas about how best to live are dictated not only by hard practicality (e.g. what foods offer the best caloric returns relative to caloric energy spent in acquiring them), but the beliefs, customs, superstitions, and additional cultural bric-a-brac that tends to accumulate as a byproduct of human efforts to live in a world they don’t fully comprehend. The ability to accumulate, compress, retain, and manipulate all of that extra information symbolically, and transmit it across generations, is the primary feature that has allowed humans to not only survive, but actively flourish, in environments ranging from the frozen expanses of the Canadian Arctic to the tropical rain forests of Papua New Guinea. The capacity for culture transformed an African primate into a global species. Without understanding culture – and the processes that shape it – it is fundamentally impossible to produce a complete account of human behavioral variation.

Cultural relativism, then, is a useful lens for investigating one of the principal variables shaping human behavior. If you want to understand why people think and act the way they do, it is often useful to try your best to see things from their angle – to adopt what anthropologists call an “emic”, or “inside-looking-out” view of culture. This is not controversial. It is simply a specific incarnation of what philosopher James Woodward has called the manipulability conception of causal explanation, a counterfactual account of causal relationships that suggests one can find out why things are the way they are by imagining how they might be otherwise. People are the way they are, in part, because of the culture they experience. If you changed or removed that culture, they would be different. Thus, to understand why they are the way they are, you should understand the set of beliefs, customs, practices, ideas, rituals and what-not that are caught under the umbrella term, “culture”.

This is also an idea that has been hijacked and corrupted into something that, in its own ways, has come to prove almost as debilitating as 19th century Western ethnocentrism. Extreme relativists advocate the position that cultures can only be understood in their own terms, that outsiders have no justifiable basis for cross-cultural evaluation, that all culturally derived knowledge is equally true, and that all culturally derived moral positions are equally valid. Proponents of this school of cultural relativism argue not merely that the differences between my perspective and that of my guide are usefully illuminated by understanding the details of the differing cultural contexts from which they emerged, but that they are equally true pictures of the way the world works.


Fundamentalist Cultural Relativism and Science

By completely abandoning the prospect of “etic” (or “outside-looking-in”) analysis, the proponents of fundamentalist relativism place themselves in a thorny situation. Extreme relativism implies a fairly strict allegiance to idealist epistemologies. This means that they subscribe to the notion that subjective experience is the absolute arbiter of reality – or, roughly, that our minds make the world, rather than our minds being a product of a world with an existence independent of and discoverable by human observers.

Under this view, the special utility of science as a knowledge gaining activity is implicitly (and often explicitly) denied. This represents a shift from the rather mundane observation that the process of scientific discovery is a cultural phenomenon to the much more drastic (and far less tenable) position that scientific knowledge – the product of that process – is culturally contingent as well. Force is only equivalent to the mass of an object multiplied by its acceleration in cultures with the concepts supplied by Newtonian physics, a molecule of water is only composed of two hydrogen atoms covalently bonded to an oxygen atom in societies with chemists, and the electrical resistance of a conductive metal only decreases with lower temperatures in societies that have developed an understanding of electromagnetism.

Such a line of thinking may seem ridiculous – a hyperbolic straw-man set to topple in the slightest breeze – but it is in fact an accurate representation of positions forwarded by serious and influential intellectuals in the postmodern movement. This has resulted in a disconcerting number of otherwise intelligent people (typically either heavily informed by or professionally engaged in the humanities and social sciences) taking fundamentalist cultural relativism seriously. Instead of placing special emphasis on the process of scientific discovery, progressive intellectuals too often celebrate “other ways of knowing”, as if astrology or eastern spiritualism can produce a knowledge-claim half as worthy of confidence as rigorous empirical investigation and peer review.

To a degree, this breed of extreme relativism may seem harmless enough. After all, as astrophysicist and science-popularizer Neil deGrasse Tyson is fond of pointing out, the findings of science are true regardless of whether or not anyone buys into them. Most of the people who fall into the trap of FCR aren’t likely to be engaged in the process of scientific discovery – they can watch from the sidelines and pooh-pooh the veracity of scientific claims without impinging on their actual truth in any way, just as I could sit on the sidelines of a basketball game and deny that a player has successfully made a free-throw without affecting whether or not a new point has actually been scored.

This would be true if a tolerance for nonsense didn’t inevitably yield ugly results. Making important life choices based on one’s zodiac sign is foolish and risky, but generally innocuous. But what of an impotent Chinese man who thinks ground-up rhinoceros horn will give him a powerful erection? Or parents who treat their child’s meningitis with maple syrup? How about parents who believe in faith-healing, praying their hearts out as their child withers and dies in agony?


Fundamentalist Cultural Relativism and Morality

Dangerous consequences also emerge when the principles of FCR are applied to problems of moral and ethical evaluation. Here, identifying the deficits of extreme relativism requires slightly more nuance than is necessary to articulate its failings with regard to the problem of scientific truth. This is because the discovery or demonstration of moral absolutes is extraordinarily difficult. Moral systems emerge from culture and cultures change from place to place and evolve over time. As a result, ideas about right and wrong behavior change along with the cultural substrate in which they are embedded.

Consider, for example, the notion of individual human rights. Today, the idea that individuals within human societies have certain rights is frequently taken for granted, and the moral consequences of either supporting or infringing on those rights are often taken into account when debating the merits of law or political action on the local, federal, and international stage. Prior to the advent of modern notions of human rights, the idea that certain humans should have access to a certain range of social resources and be granted a certain level of equal treatment as a de facto condition of their existence was an alien proposition. Rights instead flowed from monarchs or religious officials. But since the Enlightenment, the Western notion of human rights has continued to expand, becoming increasingly inclusive as people have come to identify and labor to eradicate previously unexamined strains of prejudice and bigotry. One day – probably very soon – they will expand to encompass certain non-human animals. Our ideas about human rights have changed over time, continually altering the standards by which we judge moral propriety.

Another example should drive the point down to bedrock. Nowhere is the temporal plasticity of moral strictures more clearly demonstrated than in religion, where interpretations of the dictates of moral propriety outlined in sacred texts are constantly renegotiated in light of secular changes. The precise phrasing varies with translation, but the content of the Bible possessed by modern Episcopalians is basically the same as the content of the one possessed by 17th century Puritans. Nevertheless, the two sects exhibit significant differences concerning what kinds of behavior are considered morally acceptable. The primary cause of these deviations is that modern Episcopalians have shifted their understanding of doctrine to accommodate wider cultural changes regarding the perception of what does and does not count as righteous behavior.

So clearly, moral prescriptions change with time and context. Given this, anyone advocating a cross-cultural evaluation of moral propriety might seem to be on shaky ground. Here, it becomes important to recognize that cultural relativism, as a methodological tool, says nothing about moral evaluation. Appropriately applied, cultural relativism supplies an observer with the perspective needed to see – for instance – why parents in certain sects of fundamentalist Christianity might deny their child life-saving medical care. It does not prevent anyone from feeling or expressing moral revulsion at the underlying beliefs and the practices they engender. Nor does it prevent a society that places value on the preservation of human life from interceding on the child’s behalf and putting the parents in prison for criminal negligence or even homicide.

The same can be said of any of the horrors that flow from religious fundamentalism: honor killings, homophobic hate crimes, female genital mutilation, child brides, suicide bombings, the torture and murder of heretics in places like Saudi Arabia, and the full litany of offenses that trespass the bounds of the moral intuitions of people who value human life and equality. Cultural relativism should be deployed as a tool to understand why these things occur – why the confluence of certain social contexts and religious beliefs leads people to kill and mutilate one another. It says nothing about how we should react to these things. By positing cultural relativism as a prohibition on a moral evaluation, extreme relativists retreat from all responsibility for upholding modern liberal values like universal education, racial and gender equality, freedom from oppression, and access to basic healthcare.

Indeed, though the articulation of universal human values is a notoriously thorny problem, one can only deny their existence by adopting a fantastical view of the forces structuring human behavior. As a rather odious stew of idealism, FCR entails a rank denial of the existence of anything resembling human nature – replaced, presumably, with a Lockean conception of humans as infinitely malleable blank slates. In this view, human moral proscriptions are fashioned from the aether, their only worldly determinant the cultural milieu in which they arise.

But worldwide, humans create and enforce prohibitions on certain types of in-group killing. Likewise, a capacity to monitor social contracts and detect cheaters seems to be innate, offering a clear indication that humans universally appreciate fairness in social arrangements. An aversion to incest is similarly widespread. Though they are differently expressed in myriad taboos and moral prescriptions, these are strong contenders for universal moral preferences. They very likely have a basis in humanity’s evolved psychology, and therefore offer the crude foundations for a universal code of ethics.

On the other end of the spectrum, humans have innate predispositions toward misconduct that can be exacerbated by an exaggerated emphasis on a culture as an unassailable fount of moral knowledge. For instance, humans have an ugly impulse toward tribalism. Allowing cultural boundaries to play a greater role in shaping values than our shared identity as humans not only grants tribalism succor, it is a natural consequence of taking absolute moral relativism seriously. It is also detrimental to the project of building and maintaining humanistic moral codes and the universalized standards of human thriving they entail. With its exaggerated celebration of boundless pluralism, FCR has the potential to prove inimical to the practical goals of building and maintaining the social institutions most amenable to human success.

Consider the very project of building stable social institutions. While it is important to encourage diversity, it is also true that some degree of assimilation is critical to the formation and long-term stability of societies. When the locus of identity in a given society is a multitude of distinct religions, ethnic affiliations, or political subgroups, the resulting fragmentation is a recipe for long term instability and strife. Each group is bound to pursue clannish interests, guided by moral codes that may be both mutually exclusive and entirely divorced from the best interests of the collective.

This is precisely what has happened in the East Ramapo Central School District in New York, where Orthodox Jews seized the local school board and began cutting services in an attempt to alleviate their tax burden. Their goal was to avoid paying taxes on schools their children didn’t attend, but their myopic focus on religious and ethnic affiliation has led them to neglect – and even harm – the wellbeing of their neighbors. Adherence to the moral proscriptions of an ancient faith, in concert with a very likely evolved predisposition to favor cultural familiars, has led them to place value on their identity as Orthodox Jews to the exclusion of their identity as human beings – and all the ethical imperatives that identity implies.

Yet even in the absence of any evolved, universal preferences to form the foundation of widely applicable moral prescriptions, it is impossible to advocate extreme forms of cultural relativism without abandoning modest claims about right and wrong. For example, it might be argued without a lot of protest that individual behaviors that encourage or contribute to the physical health or well-being of others are laudable. Conversely, those that can be shown to be directly or indirectly harmful to others are not. It’s good to feed your kids, bad to starve them. A parent who poisons the lungs of her offspring with second-hand smoke can be sensibly accused of engaging in some form of wrongdoing. But what of the men who have foisted upon their wives the bizarre and inarguably sexist tradition of wearing burqas? Not only is this practice reflective of the possessive, proprietary interests of a regressive patriarchy, it has also been linked to vitamin D deficiency and associated problems like rickets.

According to FCR, the latter claim offers no basis for ethical evaluation. Lacking any direct knowledge of what it is like to be a Muslim woman, I have no basis for suggesting that their adherence to the tenets of Islamic faith leads them to live a less healthy and fulfilling life than they could otherwise. Boiled down, extreme relativism argues that my desire to see all humans, everywhere, granted basic human rights is ill-founded. Moreover, it posits that my opinion that certain cultural practices or religious beliefs are inimical to that goal is bigoted. Instead of expressing an interest in the well-being of all humans, the proponents of FCR see this view as “Islamophobic”.

Cultural Relativism as Ethical Obstructionism

By suggesting that cultural differences are essentially unbridgeable and denying the possibility of either uncovering or negotiating a universal standard of human thriving, FCR has the curious consequence of more substantially “otherizing” (to borrow a rather obscurantist term from the social justice world) people from distinct cultural backgrounds. In one of its more popular incarnations, it argues that everyone is gifted with a “positionality” (yet another term of polite PC pedantry) that renders their ideas about right and wrong both externally unintelligible and permanently unassailable. It is to say, in effect, “your worldview is so different from my own that I can hardly justify feeling or expressing outrage when you either abuse or are abused by a member of your cultural subgroup.” At best, this is a recipe for social stagnation. At worst, it’s a way of abrogating centuries of moral progress – dispensing with hard-earned notions of human rights in favor of milquetoast ideas about cultural sensitivity.

FCR is also a direct progenitor of the modern strain of intellectual and moral sensitivity sweeping college campuses in the form of pleas for “trigger warnings” and concern over “microaggressions”. These violations of Enlightened, humanistic ethics seem superficial in comparison to some of the more heinous transgressions countenanced by FCR, but they presage something sinister. Where once FCR might have precipitated innocuously nondescript rhetorical utterances along the lines of “who are we to judge?”, it now motivates an authoritarian push for the establishment of pristine thought-sanctuaries – places where ideas are vetted for the slightest hint of potential trespass against the ideals and preferences of cultural subgroups. In this regard, FCR has turned from a bastion for moral cowardice to a direct assault on civil liberties – and it is here that it most significantly earns a sobriquet typically reserved for regressive strains of religious intolerance and slavish adherence to ideology: fundamentalism.

Thomas Paine was absolutely correct when he observed, in The Rights of Man, that natural rights are irrevocable. Though it may be a recent invention, the concept of “human rights” represents a monumental transition in our understanding of human ethics. Absent its cataclysmic obliteration from the realm of human thought, it will remain a critical component of our modern assessments of right and wrong. The secular notion of individual human rights is, inarguably, an improvement over previous moral codes, yet FCR – manifest in the mutual unintelligibility implied by overwrought concerns over “positionality” and an obsequious devotion to cultural sensitivity – would have us abandon that progress by asserting that people’s culturally derived beliefs about the will of Allah or the efficacy of vaccines are more important than anyone’s right to live a healthy, independent life. Brass tacks, there are good reasons to think some ways of living are better than others.

I would humbly submit that we shouldn’t throw the proverbial baby out with the bathwater. Cultural relativism, in its original, lighter form has proven immensely useful to anthropologists and ethnographers seeking to understand the proximate causes of locally expressed human differences. It was a substantial leap over the ham-fisted, Western-centric, deeply racialized theorizing of 19th century intellectuals. The underlying principle, that all human cultural systems deserve to be understood in their own terms, is useful as more than just a methodological tool for ethnographers. On the global stage, it has a role to play in the shaping of foreign policy and international affairs. On the more humble scale of individual lives, it has a role to play in helping neighbors understand one another.

It just needs to be deployed under the recognition that, inasmuch as it is a useful tool for building human understanding, it has little value as a tool for evaluating knowledge claims and moral proscriptions. To the extent that cultural relativism, manifest in its fundamentalist form, is interpreted as endorsing the primacy of subjective experience in dictating the structure of reality or mandating abstinence from adopting or defending human rights on the grounds that those rights are a “Western construct”, it deserves all the ridicule it receives. Science is the best tool for gauging truth. Cultural practices that inhibit the achievement of universal human rights can be justifiably viewed as harmful and ought to be stridently opposed and vigorously critiqued. These assertions aren’t oppressive or marginalizing or bigoted. They’re true.

Sacrificing Reason on the Altar of Purity: U. Va. Students Protest Use of Jefferson Quotes

University of Virginia students and faculty have signed a letter criticizing University President Teresa Sullivan for invoking the words of Thomas Jefferson. In an email apparently intended to salve the all-too-understandable confusion and anxiety stimulated by the election of Donald Trump, Sullivan quoted Jefferson on the importance of U. Va. students, who “are not of ordinary significance only: they are exactly the persons who are to succeed to the government of our country, and to rule its future enmities, its friendships and fortunes.” In other words, “don’t let the election of a deranged demagogue lead you into hopelessness: you are the future, so act accordingly.”

The heart of the complaint is unsurprising: Jefferson owned slaves. Slavery was (and is) an ethical abomination. This is indisputable. That Thomas Jefferson – among other American founders – owned human beings as ranchers today own cattle is a telling stain on the American myth. For many, it gives the lie to the words Jefferson penned – “that all men are created equal.”

But this is a fallacy. All men are created equal. The truth of the idea exists independent of its originator. For centuries, powerful men in the United States have repeatedly failed to make this truth manifest in the lives of all citizens. Some rancorous bastards have even worked against that lofty proposition, exploiting the poor and the dispossessed, brutalizing those unfortunate enough to have been born without white skin, rich parents, and a penis. That places some of these people on an ethical spectrum somewhere between pitiful disappointments and full-bore monsters. For others, it clouds a veneer of heroic righteousness, leaving us to puzzle over what to make of people who have done both good and awful things.

Yet America’s history of racism and oppression says nothing about the truth of the idea of human equality. Either all humans are born with equal intrinsic value or they are not.

The same is true of the votive to intellectual pedantry and banality some of the students and faculty are building at U. Va. Either Jefferson’s statement is true and valuable, or it is not. His personal crimes are immaterial. To think otherwise is to sink into the trap of ad hominem thinking and, doing so, help perpetuate the rancid stew of identity politics currently corroding political discourse in the United States. It suggests not only that human beings should be judged entirely in terms of their worst behavior, but also that ideas cannot rise above the inevitable flaws of the humans who create them.

This is truly bizarre thinking. It’s hard to imagine what ideas and expressions would remain permissible in a climate where they must first be sterilized of any murky or odious associations. If the proscription is that ideas can’t come with any baggage, either in terms of the person who dreamt them up or the context in which they originated, then most ideas automatically become verboten. If readers were to judge my arguments entirely in terms of the worst things I’ve done or said, then my humble attempts at persuasion would be irrevocably impotent to a huge swath of the population.

Insofar as this view seems extreme, it is nonetheless implicit in the complaints of people who would rather not have to suffer under the tyranny of a Thomas Jefferson quote. This is ironic, because U. Va. was founded by Jefferson. If a Thomas Jefferson quote is an ethical provocation beyond anyone’s capacity to bear, what are we to make of a salary or education provided by a school that wouldn’t exist without him?

In the sweep of history, the insipid criticisms of a well-intentioned email will be (or at least should be) a mote of dust. But it is nonetheless illustrative. It tells us that becoming an enemy of reason clearly demands no specific political allegiance. All it takes is that perennially destructive commitment to ideological purity captured under the sprawling umbrella of fundamentalism. Religious fundamentalism. Communist fundamentalism. Free-market fundamentalism. Libertarian fundamentalism. And now, liberal fundamentalism: the belief that everyone’s personal experience is a window of unassailable insight and everyone’s opinion – except those with “privilege” and “power” – is infinitely precious. To satisfy this belief, its proponents are willing to wage war against the climate of open and free expression that gave rise to everything from life-saving vaccines to the very notion of individual human rights. There is very little good in this world that isn’t due to people who cherish reason and accept the premise that ideas should flourish or fail on their individual merits.

Perhaps President Sullivan’s email had other flaws. If it normalized Trump, for instance, it would present a prime target for serious criticism and a springboard for worthwhile debate. Maybe the idea that U.Va. students are special is false, in which case it should be refuted. But the idea that it ought to be censured because it echoes an idea from a man who was, in terms of racial justice and human equality, quite clearly a hypocrite is dubious at best.

Consider an historical anecdote, at once usefully reductive and logically instructive. In the late 19th and early 20th century, agricultural production was limited by the availability of fertilizers. Using the technology available at the time, producing food required more land to feed far fewer people than it does today. A couple of German chemists changed this, developing a method to capture atmospheric nitrogen and turn it into ammonia for use in fertilizers. Billions of people are alive today who would never have existed had those German chemists not made those breakthroughs, inventing what is today known as the Haber-Bosch process.

Thing is, Fritz Haber (the Haber, in the Haber-Bosch process) was a real son of a bitch. Not only did he treat his family terribly, putting his professional ambitions and nationalistic impulses ahead of familial loyalty – thereby likely contributing to the suicides of his first wife and, later, two of his children – he is also considered the father of chemical warfare. He pioneered the weaponization of chlorine and other poisonous gases, directly contributing to the agonizing deaths of tens of thousands of Allied soldiers. Later, scientists working under Haber developed a form of cyanide gas known as Zyklon A – the predecessor to the Zyklon B pesticide used to murder Jews during the Holocaust.

Under the theory of discourse the complainants at U. Va. are implicitly advocating, the Haber-Bosch process – and all descendent technologies – should be immediately abandoned. After all, Fritz Haber was, to put things in disturbingly mild terms, a real dick. Of course, millions – if not billions – of people would starve to death, but the descendants of those who died miserably in the trenches of WWI or in the gas chambers of Nazi Germany wouldn’t have to deal with eating food tainted by Haber’s hideous legacy.

Ideas and opinions should be judged by their qualities, irrespective of the confusion of dastardly or enlightened deeds left in the wake of the people who produce them. The Haber-Bosch process should be weighed in terms of its effects: is it better that billions of people exist today who very likely wouldn’t have otherwise, or does it matter more that the Haber-Bosch process has contributed to overpopulation and all the attendant environmental and social costs that come with it? That’s an interesting question. Whether or not we should do away with the good works and useful ideas of people like Fritz Haber and Thomas Jefferson because the character of those men was blighted by the misery they inflicted on others is not. In fact, it’s not even a question. Those ideas exist. They are worthwhile or dispensable on their own merits. Only a reckless, enthusiastic embrace of authoritarianism could ever get rid of them. And that, I worry, is precisely where the postmodern left – in its urgent pursuit of ideological purity and boundless inclusivity – is headed.

In the world of ideas – that is, in other words, the world of higher education – what matters is not whether they make people feel welcome or offended. It’s whether or not they are true and make sense.