In Defense of Cultural Appropriation

Stripped to its core, cultural appropriation is a matter of one culture borrowing from another. But recently, it has morphed into an altogether more nebulous construct. At once a righteous denunciation of exploitation – capitalist and colonialist alike – and a strident clarion call asserting a bizarre provincialism of the oppressed, it’s difficult to nail down precisely. Yet concerns over cultural appropriation are motivating increasingly censorious campaigns.

By some accounts, cultural appropriation is universally odious. By others, its propriety is conditional, depending explicitly on the power relationships between the appropriator and the appropriated. What is and is not permissible is a question of whether a member of a historically privileged group is co-opting the cultural artifacts of the historically marginalized. Often, the bounds of propriety are marked by racial ascriptions, and are thus dependent on the maintenance and perpetuation of social constructions that only loosely track actual patterns of genetic variation and cultural inheritance.

Too frequently, concerns over cultural appropriation reflect legitimate grievances. It’s hard to fault the black pioneers of American blues – who often died penniless – and their descendants for harboring some resentment toward the white British and American artists who got rich borrowing heavily from their craft. Likewise for Native Americans, who see caricatures of their ceremonial garb suddenly populate suburban landscapes on Halloween, sexualized and commercialized in abject indifference to their traditional significance.

Other times, the apparent abuses are decidedly more banal. There are those who claim that it is inappropriate for white students to eat bad sushi at their university cafeteria. Recently, members of a student government were berated and disciplined for wearing sombreros – a hat with no particular significance beyond its capacity to shield the wearer from the sun – to a party. Annually, celebrities are pilloried for their choice of Halloween costume. More ominously, activists have begun to suggest that it is wrong – even criminal – for white novelists to portray the experiences of minority groups in their fiction.

In these latter cases, social justice warriors ostensibly clamoring for fairness and equality are inadvertently giving voice to a divisive program of artificially imposed ethnic purity. Foundational to their argument is the suggestion that people with the right kind of heritage can and should exercise absolute and exclusive dominion over specific cultural artifacts, setting the terms of exchange and expression for anyone who doesn’t meet the appropriate stipulations.

In this sense, much of the recent clamor over cultural appropriation should strike a profoundly unsettling note for anyone dedicated to the preservation of liberal values like freedom of thought and – more pointedly – freedom of expression. It suggests that one person's identity, however fluidly construed, can be used to establish limits on another person's behavior. This is a perversely leftward retreat to tribalism, explicitly granting the erroneous claim that the differences between humans with different ethnic backgrounds, economic prospects, gender identities (and so forth) are not variations on a theme, but unbridgeable chasms. To accept that premise is to reject the larger enterprise of progressive humanism, abandoning hope for a global community in favor of insular patrimonialism.

Of course, it doesn't help that many of the worries over cultural appropriation are implicitly rooted in an abject misunderstanding of what culture is and how it forms. For those interested in using the notion of cultural appropriation to establish limits on other people's expression or consumption, the fact that most (if not all) cultural artifacts, from musical traditions to sacred religious icons, are forged at the interface between cultures is dismissed as ideologically inconvenient. Cultures are fluid and permeable – their only universal features are plasticity and change. Cultural appropriation isn't an assault on identity. It's at the core of what culture is: people sharing information, learning from one another, borrowing and trading ideas.

###

Defining culture is almost as thorny a problem as defining cultural appropriation. In the late 19th century, the English anthropologist E. B. Tylor offered a comprehensive definition, writing that “Culture, or civilization, taken in its broad, ethnographic sense, is that complex whole which includes knowledge, belief, art, morals, law, custom, and any other capabilities and habits acquired by man as a member of society.” Recently, researchers have opted for a more parsimonious approach, defining culture as the non-genetic information people create and share with each other throughout their lifetimes and across generations, often embodied in artifacts (tools, art) and rituals. More simply still, culture is the trait that sets humans apart from other animals – it’s what has allowed a species of primate that evolved primarily in the African tropics to occupy nearly every ecological niche on the planet, along the way inventing everything from bows and arrows and iPods to algebra and Catholicism.

Regardless of how we choose to define it, one characteristic of culture remains obvious: it is not inert. A culture in stasis is a culture on the way to extinction. All cultures present today exist because of their capacity to evolve and change, incorporating new ideas about the world and new strategies for living in it according to the exigencies of human psychology and the vicissitudes of the natural world. Our decision to associate certain features of culture with specific geographic regions, nationalities, or ethnic divisions is largely a product of parochialism. Rare are the cuisines or religious traditions that can't be traced to patterns of intergroup interaction and borrowing at some point in their history. That we think otherwise is due to the myopia foisted on us by minds selected to process time on the scale of weeks, months, or years instead of decades, centuries, and millennia.

Consider, for the sake of illustration, all the problems of creating a reliable definition of separate species in biology. One of the most widely accepted criteria is reproductive isolation – organisms belong to distinct species if they can't successfully hybridize and produce fertile offspring. This seems reasonable, but it presents problems. For instance, transient and resident killer whales can produce fertile offspring, but in the wild they do not breed or commingle. Likewise, despite being separated by thousands of years of artificial selection, dogs and wolves can still breed – chihuahuas and wolves are members of the same species. In both killer whales and canines, a broadly useful definition of what does and does not count as a particular species proves insufficient. The factors that govern reproductive isolation can be both functional and facultative.

These problems intensify when we take a deeper view of animal relationships. Obviously humans can't have babies with chimpanzees. But somewhere between six and seven million years ago, we were both part of the same species – a common ancestor to both modern chimps and modern humans. A gap that is today unbridgeable is composed of hundreds of thousands of tiny steps, mothers and offspring that were essentially indistinguishable from one another. The process worked slowly, but at some point the creatures that would lead to modern humans and the creatures that would lead to modern chimps crossed an irrevocable frontier, beyond which hybridization was impossible.

This situation is much the same for human culture. At some point in our prehistory, it's reasonable to say that there was so little variation in human culture that the distinctions between cultural groups would be difficult to recognize. The toolkit was simple, comprising traditions for making chipped stone tools, processing food, and managing – if not outright creating – fire. Over time, these traditions were diversified and elaborated, becoming increasingly sophisticated and more highly specialized as the cognitive capacities of hominids increased, people migrated to new environments, and groups spent time developing in relative isolation. With the human migration out of Africa, cultural diversity exploded: in an exponentially accelerating curve, the cultural repertoire of humans has expanded to include thousands of diverse religious traditions, mutually exotic cuisines, strategies for living and thriving in disparate ecosystems like the high arctic and the tropical rainforest, and a cornucopia of technological innovations.

As with humans and chimps, the relationships between cultures can be characterized as part of a process of descent with modification. Pluck native, monolingual English and Hindi speakers from their homes in Omaha and New Delhi and put them in a room together. Very likely, verbal communication between the two would be extremely limited. But carefully trace the development of those languages back through the centuries and you arrive at Proto-Indo-European, a language ancestral to both Hindi and English – along with thousands of other languages and dialects.

But under close scrutiny, the similarities between cultural and biological change begin to break down. Significantly, biological change is a one way street. Genetic information passes from parents to offspring, but not from offspring to parents or offspring to offspring. That is, in most sexually reproducing organisms, genetic transmission is strictly vertical. This is why the boundaries between species eventually become irreversible: once you’re a chimp ancestor that can’t produce a fertile child with a human ancestor, there’s no going back.

Culture doesn't work that way. Sure, a lot of cultural information gets passed down from parents to offspring. But it also gets passed around among friends and siblings, or from grandchildren to grandparents. Cultural information can even cross group boundaries marked by intense mutual antipathy. Social learning is the cornerstone of cultural evolution – the only requirement for one person to acquire a cultural trait from another. Consequently, in cultural evolution, the lines of transmission can be vertical, horizontal, or oblique. They aren't dependent on the strictures of genetic kinship. Those Hindi and English speakers are separated by vast geographical distances, thousands of years of unique history, and lifetimes of personal experience. But with a bit of effort, the native English speaker can learn Hindi (or vice versa) and they can begin to understand each other incredibly well.

The nature of human learning and cultural transmission makes culture an incredibly dynamic force. It also makes establishing good definitions for particular cultures exceptionally difficult, strictly dependent on narrow frames of time and space. The bounds of culture are largely relative, constructed on the fluid substrate of human relationships and historical contingency. Any emblem considered diagnostic of a certain culture is often a product of perspective and perception. From the inside looking out, there are plenty of people who would confidently proclaim that the United States has no national culture (save perhaps consumerism), but plenty of people from other nations would beg to disagree. Moreover, that national culture – whatever it is – emerges from the interaction of countless subcultures that can't, no matter how hard anyone tries, be meaningfully dissected into discrete, universally recognizable units.

Look back at the embarrassing huff over bad sushi. Consider the absurd generalization at its core: that Japanese culture is a monothetic entity, coherently symbolized by a specific cuisine. This view conflates nationality with ethnicity, and both nationality and ethnicity with culture. Both are part of culture, but neither is entirely constitutive of it. Moreover, it disregards any internal variation within the Japanese nation-state and the complex historical processes that have shaped it. The implication is that being Japanese can be reduced to and signified by consumption of raw fish and rice of a certain style and quality.

The example is reductive, but it does get to the core of the issue. If we try to define a broad racial category like “Caucasian”, what we are dealing with is something like a normal curve that significantly overlaps with the curves defining other racial categories. In this definition, there might be a certain range of skin pigments that reliably track patterns of genetic variation and inheritance at high latitudes among populations with diets poor in vitamin D. But at the edges, where the tails begin to taper and overlap, things get tricky – there is no sharp partition between the variation that defines one “race” or another. And within the coarse boundaries of “people who descended from European populations living above a certain latitude”, there are dozens – even hundreds – of racial subcategories. These divisions are invisible to some, intensely meaningful to others, and all constructed from the shifting, malleable scaffolds of cultural change. Racial categories are post hoc constructions, selectively targeting physical features and treating them like essential characteristics.

These distinctions are further complicated when we attempt to dissect the world along cultural boundaries, which are more often than not only connected to underlying patterns in superficial physical characteristics and genetic variation in loose and complicated ways. Compared with other species, humans exhibit both very little genetic variation – most of it within, rather than between ethnic groupings – and huge extremes of behavioral variation, largely attributable to culture. Cultures are molded by patterns of diffusion and acculturation (fancy anthropological terms for the spread and sharing of culture) as ideas leap across the barriers that limit genetic transmission, tweaked and mutated and shared (or stolen) again and again and again. Talk of this or that culture is more a matter of convention, an imprecise reference to the blurry peak of a broad and shifting curve. This presents definitional problems, but as a reflection of the reality of human culture, it’s worth celebrating. It means the obstacles that divide us are surmountable, and the frontiers of human innovation and learning are virtually limitless.

The most dire anxieties about cultural appropriation are a product of Platonic essentialism applied to culture and identity, as if a given culture is something discrete, bounded, and easily definable: "Japanese culture is the culture in which people eat sushi" or "African-American culture is the culture in which people listen to and create hip-hop". Nothing could be further from reality. Like race, most notions of culture – and the bounds between them – are social constructions. They rarely reflect the actual patterns of information transmission and inheritance that have shaped them into the forms we currently observe. Though talk of Japanese culture or Southern culture or Native American or dominant/mainstream American culture is convenient, the idea that these things exist as sharply defined entities is simply false. The boundaries between them are fuzzy.

Yet even if we are permissive in our assessment, granting that the cultural boundaries people see reliably track the patterns of divergent evolution and historical contingency that have shaped observable differences in belief, cuisine, and attire, we're still left with a vexing quandary. The charge of cultural appropriation is a matter of privileging our position in space and time. It says that the cultural differences we see today are the ones that matter and that we should make deliberate efforts to freeze them in place. And, when you take the time to recognize that cultural differences are really the only things that divide us, this impulse morphs from a kind of social sensitivity into a disheartening initiative to cement cultural divisions into the fabric of human experience.


###

All this discussion of the ambiguity of culture deserves a caveat. Taken to an extreme, these considerations could lead to some spurious conclusions. For example, one might conclude that culture is entirely a matter of perception – that it is, more or less, all in our heads. This is nonsense. The bounds of culture are both porous and flexible, responsive to the position and experience of observers. What does and does not count as a valid appropriation of a cultural artifact looks different to those within the bounds of a given culture than it does to those without. More counterintuitively, what does and does not count as a reliable diagnostic feature of a given culture changes over time. But that does not mean culture doesn't exist as a measurable part of the real world with real effects.

Black American culture is different today than it was fifty or a hundred years ago. It will be different fifty or a hundred years in the future. Yet it still makes sense to speak of that culture as something with continuity and coherence. It exists because patterns of human interaction have produced ideas and traditions that are reliably correlated with a broad group of people at a certain point in time. Those traditions and ideas and the people who share them change with time, but they are connected by discernible lines of history and heritage. That doesn't mean black culture in the United States can be decomposed into a precise set of elements that will allow anyone to unerringly spot a part or product of that culture when they encounter it. Instead, it means that culture is a thing that exists and, however difficult it is to pin down, exerts a real influence on the behavior of individuals and the shape of society.

Consequently, concern over cultural appropriation can’t be dismissed with a hand-wave. Most of the ruckus does entail some faulty assumptions about the nature of culture, but that doesn’t mean that cultural artifacts can’t be deployed in ways insulting either to their originators or to a group that has chosen to ascribe them with special significance. But because it’s impossible to assign any cultural artifact a fixed and universal meaning, the nature and degree of offense remain subjective.

For example, I see some room for offense if a white, upper middle-class kid dresses like a Sexy Indian on Halloween, dismissively treating ham-fisted reconstructions of Native American ritual outfits like a Batman or Frankenstein costume. At the same time, I'm not the least bit bothered by a Satanist or atheist who wants to make a statement by wearing the Christian cross upside-down. Likewise, it doesn't bother me that Christians annually decorate their homes with coniferous trees, a practice attributable to the pagans of northern Europe whom medieval Christians routinely slaughtered. In each case, we are dealing with a situation where an item held sacred by one group is being appropriated (and perhaps mistreated) by others. Only one of them bothers me (and only when I really put effort into being bothered by it), and though I can think of a number of justifications for ire, I can't think of a consistent set of criteria for reliably distinguishing between permissible and unacceptable forms of appropriation.

The impulse is to set the litmus test at relationships of dominance and oppression, but this isn't as clear cut as it seems. It takes all the definitional problems of culture and heaps them on individuals, basically positing that a foolish and insensitive suburban kid is culpable for the entire history of oppression Native Americans have faced at the hands of European colonists. This pushes aside the important work of educating someone about the historical and social facts that make some choices rude or reprehensible in favor of establishing a crude heuristic whereby people are taught simply to avoid experimenting with unfamiliar cultural artifacts.

Defining valid or invalid cultural appropriation is ultimately a matter of how individual people feel about various forms of cultural expression. One can point to relationships of power, privilege, oppression, and exploitation as a guideline, but that doesn’t eliminate any of the problems inherent in any initiative that seeks to use cultural appropriation to construct limits on expression and consumption. Inarguably, cultural appropriation happens. Uncontroversially, some forms of cultural appropriation are thoughtless, crass, hurtful, and bigoted. Responding to them is a matter of vociferous criticism and serious debate, not a retreat to delicate cultural provincialism.

Taken seriously, the most strident attacks on cultural appropriation amount to advocacy of ethnic tribalism and cultural stasis. No one should be permitted to write a book that contains a character whose arc involves elements that trespass the frontiers of personal experience: a white, able-bodied, cisgender woman can't write a story involving a character who is a transgender Australian aboriginal paraplegic. Suitable meals for breakfast, lunch, and dinner should be restricted to those involving recipes and ingredients directly attributable to your personal heritage (historical cutoff yet to be defined): no tacos or tikka masala for the Irish. Appropriate attire should scrupulously avoid incorporating elements invented or commonly used by people who don't look like you: if you're not a direct descendant of a horse-culture from the Eurasian steppes, probably best to avoid wearing pants.

Given the general ideological inclinations of the people most vocally protesting cultural appropriation, this is a perplexingly conservative impulse, effectively sacralizing parochial snapshots of culture by establishing rigid strictures on its appropriate use. It ignores the hazy edges and dynamic nature of culture as a force for sculpting and expanding the human behavioral repertoire. Cultural appropriations can make people upset – with varying degrees of legitimacy – but they do not do any measurable harm. That puts efforts to constrict them on a dangerously slippery slope. If you’re not convinced, consider the following: an activist arguing for limitations on the use of a Native American headdress as inspiration for a Halloween costume is employing the same reasoning as a fundamentalist Christian arguing for limitations on the rights of gay men to marry. In both cases, the grievance boils down to a sense of offense rooted in a subjective, historically contingent definition of propriety. In neither case is anyone demonstrably harmed – gay marriage has never actually hurt a Christian family, nor has an ill-conceived choice of Halloween costume inflicted more than emotional harm on a cultural subgroup – but people nonetheless claim to have had their traditional values violated.

It's important to be clear and precise about the argument I'm advocating. It is perfectly acceptable, even sometimes laudable, for people to feel outraged by the appropriation and perceived misuse of whatever they take to be their cultural heritage. Thus aggrieved, they should express themselves, and – especially in the case of historically marginalized or oppressed groups – have platforms for doing so. At no point, however, is it justifiable for one person or group's sense of offense to motivate the establishment of boundaries on another person or group's capacity to express themselves. One can point to specific injustices to validate this or that claim of cultural misappropriation, but in the end, it all reduces to one thing: a plea for humans to respect one another's points of view. That endeavor can never involve placing restrictions on the realm of permissible expression. Indeed, if we accept respect and understanding as praiseworthy aims, we must grant the corollary – that they are best served by allowing everyone as much room as possible to engage in the fraught business of exploring other people's experiences.


The Use and Abuse of Cultural Relativism

A little over a year and a half ago, my wife and I were in Cambodia, sitting cross-legged on the wooden bow of a weathered prop boat, chatting with a local guide as he took us through a floating village populated – he told us – by ethnic Vietnamese immigrants. The boat cut through a wide channel in a patch of inundated forest on the Tonle Sap, our wake fanning out lazily to disappear among the trees or slosh mildly against the wooden bases of floating houses.

I don’t recall the precise details, but somehow our conversation turned to taxonomy. Our guide wanted to know what, according to our view, counted as an animal. Was an ant an animal? What about a human? This initially struck me as something of an odd question, but through the course of our back-and-forth it became apparent that it was quite sensible when understood from his perspective. Though the Western system of taxonomic classification exudes a comfortable aura of familiarity to those who have been raised with it, it is far from the most obvious way to order the natural world. No doubt this is part of the reason it didn’t occur to anyone prior to the past three centuries of human history.

In the view of our guide, the relationships among living things are defined by principles of opposition and symmetry. There must be two of each kind in a category and the relationships among categories are defined by an array of similarities and differences. That dogs and monkeys and snakes count as animals seemed to him common sense. That ants and humans are also members of the same category struck him as a little more peculiar.

When it comes to ordering the world of birds and beasts, the primary difference between our guide and ourselves was that we happened to subscribe to a classification system that orders the living world according to relationships of descent with modification. We apply a Linnaean classification system structured around the notion that all living things share a common origin, and that their relationships are defined by how recently they diverged from a shared ancestor in a nested hierarchy extending back to a successful batch of single-celled organisms that emerged from the primordial soup some 3.5 billion years ago. Though many of the features that seemed most salient in his classification system are irrelevant to our Darwinian framework, they nonetheless have their own internal logic.

Unfortunately, our conversation was far too short for anyone involved to build a comprehensive picture of the other’s worldview. Nevertheless, a few general points are obvious. Foremost, that our guide’s view differed so substantively from our own is surely not the product of any innate difference in cognitive capacity. Nor can it be attributed to any inherent difference between his most recent ancestors and our own.

Instead, his perspective is primarily a product of vast networks of highly contingent influences. A non-exhaustive list might include the distinct and ever-changing traditional beliefs of whatever ethnic group (or groups) he might belong to, his parents' interpretation thereof, the ideas of his friends, the recent political history of Cambodia, and so forth. It's highly unlikely that his ideas about the world are identical to those of his friends and neighbors, but they are nonetheless shaped by sampling a more widely shared cultural repertoire. The same can be said of my wife and me – the fact that we have learned about biological evolution is a consequence of the circumstances in which we were born and raised, and the distinct cultural trajectory pursued by a certain subset of Western Europeans following the invention of the printing press and the subsequent intellectual revolution of the Enlightenment.

 

A Brief History of Cultural Relativism

The view that human differences in knowledge, belief, or practice are usefully (if only partially) explained by the proximate influence of culture, and that these differences can be usefully illuminated through an understanding of the internal logic of the cultures that produce them, is the perspective offered by cultural relativism.

Unfortunately, cultural relativism is a widely misunderstood principle. The concept (though not the term itself) probably first emerged in the writings of the anthropologist Franz Boas in the closing decades of the 19th century. Since then, it has been a useful tool for ethnographers seeking to understand culturally motivated behavior. At the same time, it has also been a plague upon the larger enterprise of producing scientifically justifiable explanations for human behavior. Upon entering the popular vernacular, it has offered a veneer of intellectual rigor and scholarly nuance to facile arguments about the plurality of knowledge and the primacy of subjective experience. It has granted succor to the idea that there are "other ways of knowing" as reliable and equally deserving of confidence as the scientific method. It has sired pleas for "trigger warnings", outrage over "microaggressions", and an exaggerated emphasis on the subjective experience of feeling offended as a suitable justification for curtailing speech.

In its most extreme incarnations, cultural relativism has been advanced as a prohibition on any and all cross-cultural evaluation of beliefs, practices, and knowledge-claims. Everything from traditional accounts of cosmic creation and religious explanations of natural phenomena to moral arguments about the nature of good and bad behavior is taken off the table, shielded from scrutiny by the notion that cultural differences are sufficiently deep to render the humans they divide mutually unintelligible. Superficially, this stance has often been made to look like a sophisticated embrace of uncertainty, but it is really nothing of the sort. Instead, it is a pedantic, dehumanizing, and pusillanimous retreat from the hard work of uncovering truth and defending modern ideas about universal human rights.

Cultural relativism can be decomposed into three components. The first is methodological cultural relativism. This is a tool used by anthropologists and ethnographers to understand the beliefs, customs, ideas, and behavior of people with cultures and histories different from their own. In many respects mundane, methodological cultural relativism has proven extremely useful, allowing researchers to strip away the often blinding baggage of personal history and cultural bias.

Sadly, methodological relativism has come to be associated with the parasitic vices of normative (or moral) and cognitive relativism. Too often, these have served to rob mundane cultural relativism of its methodological utility. Cognitive relativism is what emerges when one looks upon the real problems inherent in the human quest for knowledge – the things that make rock-solid, axiomatic certainty so difficult to achieve – and withdraws from the search entirely. Similarly, normative relativism represents a withdrawal from the challenges that emerge from the fluid, turbulent ambiguity inherent in developing and applying coherent ethical systems.

Taken together, cognitive and normative relativism are the seeds for a system of intellectual and ethical obfuscation that will henceforth be referred to as fundamentalist cultural relativism (FCR). Rather than a useful system of intellectual criticism and methodological prescriptions, FCR is the intellectual equivalent of a mountaineer collapsing at the base of a gnarly, ferocious peak and declaring it impossible to summit before even giving it a serious attempt. Put more simply, it's what happens when someone looks at a problem, notes that it's hard, and immediately declares it intractable.

As a consequence, many serious academics have abandoned cultural relativism entirely. The physicist David Deutsch has cast it as a brand of “persistently popular irrationality.” Insofar as he is referring to the idea’s most common flavor, so thoroughly infused with cognitive and normative relativism and particularly pervasive among the social justice crowd, he isn’t far off the mark. Indeed, the extreme relativism that flourished in the humanities and social sciences in the 1980s and 1990s is nothing if not a persistently popular brand of irrationality. With the spread of the vapid school of vigorous intellectual masturbation and enthusiastic pedantry politely referred to as “postmodernism,” the notion that all knowledge-claims or moral positions are equally valid became dogma throughout many of the subdisciplines that fall within the scope of the humanities and social sciences. My own discipline, anthropology, is still recovering from the effects of this intellectual rot.

One of the chief problems with the brand of cultural relativism (justifiably) decried by thinkers like Deutsch is that it represents an abuse of a more useful principle. The notion that the beliefs, customs, and behaviors of different cultural groups could be usefully analyzed and understood in their own terms was a radical advance over previous anthropological methods. When the earliest forms of the concept were first advanced, beliefs about the social – and even biological – supremacy of white Western men were scarcely given a second thought. White men in Western, industrialized societies occupied the pinnacle of a long chain of social and biological evolution.

According to these views, Western culture represented the purest, most complete manifestation of a natural order, the metric against which all other human cultures should be judged. The cultural differences uncovered in places like the equatorial rain forests of Papua New Guinea or the frigid expanses of the high Arctic were commonly viewed as deviations from perfection. Today, these notions are rightly considered signally ridiculous and repugnant by all but the most intransigent racists and xenophobes. But to many prominent and serious thinkers occupying important positions of academic and political authority in the late 19th and early 20th century, nothing could have been more obvious than their own innate superiority.

Eventually, ethnographers and anthropologists like Franz Boas came to recognize that the rampant ethnocentrism of their peers was translating into bad science, fundamentally hobbling efforts to actually understand the roots of human behavioral variation. The idea that “traditional” societies, such as those found practicing foraging or horticulturalist lifestyles in the rainforests of Brazil, were somehow frozen in time – primitive vestiges of points along the slow but inevitable march toward the industrialized modernity of centralized state authority and market economics – represented a roadblock to intellectual progress. Recognizing the faults in this perspective, some turn of the century anthropologists began to advocate analyzing cultures according to their own internal logic.

It's worth noting that the kind of relativism espoused by Boas and his students was almost certainly more permissive than the version being advocated here. Nonetheless, as an analytical tool, cultural relativism represented a considerable improvement over the parochial, Western-centric worldview that had been distorting the thinking of early explorers and ethnographers. Rather than viewing non-Western cultures as primitive holdovers, serious anthropologists began to recognize that individual cultures – and the differences among them – could be fruitfully illuminated by seriously attempting to understand them in their own terms. Substantial utility could be found in recognizing that each culture is a product of a vast web of historical contingency.

 

Cultural Relativity as a Modern Research Tool

To understand why this is so, it’s useful to couch the argument in more modern terms. Human behavior, like that of other animals, is a product of a complex stew of influences. This includes a suite of genes – some highly conserved, others more recently selected as a result of the specific challenges that came along with being a bipedal, largely hairless, highly social primate living on the African savannah for the better share of the Pleistocene. It also includes a capacity to learn and usefully modify our behavior over the span of hours, days, weeks, months, or years – allowing us to respond to environmental variability much more rapidly than the relatively glacial pace at which natural selection sorts the genetic wheat from the chaff. But the inheritance of genetic information is ubiquitous across the living world, and learning is not particularly rare. Humanity’s peculiar spark is found in our unique capacity to create, accumulate, store, and transmit vast sums of non-genetic information – i.e. culture.

Culture itself is an evolving system, emerging as generations of individuals interact with one another while attempting to cope with the challenges of living in particular environments and trying to build stable social arrangements therein. Ideas about how best to live are dictated not only by hard practicality (e.g. what foods offer the best caloric returns relative to the caloric energy spent acquiring them), but by the beliefs, customs, superstitions, and additional cultural bric-a-brac that tends to accumulate as a byproduct of human efforts to live in a world they don't fully comprehend. The ability to accumulate, compress, retain, and manipulate all of that extra information symbolically, and transmit it across generations, is the primary feature that has allowed humans to not only survive, but actively flourish, in environments as diverse as the frozen expanses of the Canadian Arctic and the tropical rain forests of Papua New Guinea. The capacity for culture transformed an African primate into a global species. Without understanding culture – and the processes that shape it – it is fundamentally impossible to produce a complete account of human behavioral variation.

Cultural relativism, then, is a useful lens for investigating one of the principal variables shaping human behavior. If you want to understand why people think and act the way they do, it is often useful to try your best to see things from their angle – to adopt what anthropologists call an "emic", or "inside-looking-out", view of culture. This is not controversial. It is simply a specific incarnation of what philosopher James Woodward has called the manipulability conception of causal explanation, a counterfactual account of causal relationships that suggests one can find out why things are the way they are by imagining how they might be otherwise. People are the way they are, in part, because of the culture they experience. If you changed or removed that culture, they would be different. Thus, to understand why they are the way they are, you should understand the set of beliefs, customs, practices, ideas, rituals and what-not that are caught under the umbrella term, "culture".

This is also an idea that has been hijacked and corrupted into something that, in its own way, has come to prove almost as debilitating as 19th century Western ethnocentrism. Extreme relativists advocate the position that cultures can only be understood in their own terms, that outsiders have no justifiable basis for cross-cultural evaluation, that all culturally derived knowledge is equally true, and that all culturally derived moral positions are equally valid. Proponents of this school of cultural relativism argue not merely that the differences between my perspective and that of my guide are usefully illuminated by understanding the differing cultural contexts from which they emerged, but that the two are equally true pictures of the way the world works.

 

Fundamentalist Cultural Relativism and Science

By completely abandoning the prospect of “etic” (or “outside-looking-in”) analysis, the proponents of fundamentalist relativism place themselves in a thorny situation. Extreme relativism implies a fairly strict allegiance to idealist epistemologies. This means that they subscribe to the notion that subjective experience is the absolute arbiter of reality – or, roughly, that our minds make the world, rather than our minds being a product of a world with an existence independent of and discoverable by human observers.

Under this view, the special utility of science as a knowledge-gaining activity is implicitly (and often explicitly) denied. This represents a shift from the rather mundane observation that the process of scientific discovery is a cultural phenomenon to the much more drastic (and far less tenable) position that scientific knowledge – the product of that process – is culturally contingent as well. Force is only equivalent to the mass of an object multiplied by its acceleration in cultures with the concepts supplied by Newtonian physics, a molecule of water is only composed of two hydrogen atoms covalently bonded to an oxygen atom in societies with chemists, and the electrical resistance of a conductive metal only decreases with lower temperatures in societies that have developed an understanding of electromagnetism.

Such a line of thinking may seem ridiculous – a hyperbolic straw man set to topple in the slightest breeze – but it is in fact an accurate representation of positions advanced by serious and influential intellectuals in the postmodern movement. This has resulted in a disconcerting number of otherwise intelligent people (typically either heavily informed by or professionally engaged in the humanities and social sciences) taking fundamentalist cultural relativism seriously. Instead of placing special emphasis on the process of scientific discovery, progressive intellectuals too often celebrate "other ways of knowing", as if astrology or eastern spiritualism can produce a knowledge-claim half as worthy of confidence as rigorous empirical investigation and peer review.

To a degree, this breed of extreme relativism may seem harmless enough. After all, as astrophysicist and science-popularizer Neil deGrasse Tyson is fond of pointing out, the findings of science are true regardless of whether or not anyone buys into them. Most of the people who fall into the trap of fundamentalist CR aren’t likely to be engaged in the process of scientific discovery – they can watch from the sidelines and pooh-pooh the veracity of scientific claims without impinging on their actual veracity in any way, just as I could sit on the sidelines of a basketball game and deny that a player has successfully made a free-throw without affecting whether or not a new point has actually been scored.

It would indeed be harmless if a tolerance for nonsense didn't inevitably yield ugly results. Making important life choices based on one's zodiac sign is foolish and risky, but generally innocuous. But what of an impotent Chinese man who thinks ground-up rhinoceros horn will give him a powerful erection? Or parents who treat their child's meningitis with maple syrup? How about parents who believe in faith-healing, praying their hearts out as their child withers and dies in agony?

 

Fundamentalist Cultural Relativism and Morality

Dangerous consequences also emerge when the principles of FCR are applied to problems of moral and ethical evaluation. Here, identifying the deficits of extreme relativism requires slightly more nuance than is necessary to articulate its failings with regard to the problem of scientific truth. This is because the discovery or demonstration of moral absolutes is extraordinarily difficult. Moral systems emerge from culture and cultures change from place to place and evolve over time. As a result, ideas about right and wrong behavior change along with the cultural substrate in which they are embedded.

Consider, for example, the notion of individual human rights. Today, the idea that individuals within human societies have certain rights is frequently taken for granted, and the moral consequences of either supporting or infringing on those rights are often taken into account when debating the merits of law or political action on the local, federal, and international stage. Prior to the advent of modern notions of human rights, the idea that certain humans should have access to a certain range of social resources and be granted a certain level of equal treatment as a de facto condition of their existence was an alien proposition. Rights instead flowed from monarchs or religious officials. But since the Enlightenment, the Western notion of human rights has continued to expand, becoming increasingly inclusive as people have come to identify and labor to eradicate previously unexamined strains of prejudice and bigotry. One day – probably very soon – they will expand to encompass certain non-human animals. Our ideas about human rights have changed over time, continually altering the standards by which we judge moral propriety.

Another example should drive the point down to bedrock. Nowhere is the temporal plasticity of moral strictures more clearly demonstrated than in religion, where interpretations of the dictates of moral propriety outlined in sacred texts are constantly renegotiated in light of secular changes. The precise phrasing varies with translation, but the content of the Bible possessed by modern Episcopalians is basically the same as the content of the one possessed by 17th century Puritans. Nevertheless, the two sects exhibit significant differences concerning what kinds of behavior are considered morally acceptable. The primary cause of these deviations is that modern Episcopalians have shifted their understanding of doctrine to accommodate wider cultural changes regarding the perception of what does and does not count as righteous behavior.

So clearly, moral prescriptions change with time and context. Given this, anyone advocating a cross-cultural evaluation of moral propriety might seem to be on shaky ground. Here, it becomes important to recognize that cultural relativism, as a methodological tool, says nothing about moral evaluation. Appropriately applied, cultural relativism supplies an observer with the perspective needed to see – for instance – why parents in certain sects of fundamentalist Christianity might deny their child life-saving medical care. It does not prevent anyone from feeling or expressing moral revulsion at the underlying beliefs and the practices they engender. Nor does it prevent a society that places value on the preservation of human life from interceding on the child's behalf and putting the parents in prison for criminal negligence or even homicide.

The same can be said of any of the horrors that flow from religious fundamentalism: honor killings, homophobic hate crimes, female genital mutilation, child brides, suicide bombings, the torture and murder of heretics in places like Saudi Arabia, and the full litany of offenses that trespass the bounds of the moral intuitions of people who value human life and equality. Cultural relativism should be deployed as a tool to understand why these things occur – why the confluence of certain social contexts and religious beliefs leads people to kill and mutilate one another. It says nothing about how we should react to these things. By positing cultural relativism as a prohibition on moral evaluation, extreme relativists retreat from all responsibility for upholding modern liberal values like universal education, racial and gender equality, freedom from oppression, and access to basic healthcare.

Indeed, though the articulation of universal human values is a notoriously thorny problem, one can only deny their existence by adopting a fantastical view of the forces structuring human behavior. As a rather odious stew of idealism, FCR entails a rank denial of the existence of anything resembling human nature – replaced, presumably, with a Lockean conception of humans as infinitely malleable blank slates. In this view, human moral proscriptions are fashioned from the aether, their only worldly determinant the cultural milieu in which they arise.

But worldwide, humans create and enforce prohibitions on certain types of in-group killing. Likewise, a capacity to monitor social contracts and detect cheaters seems to be innate, offering a clear indication that humans universally appreciate fairness in social arrangements. An aversion to incest is similarly widespread. Though they are differently expressed in myriad taboos and moral prescriptions, these are strong contenders for universal moral preferences. They very likely have a basis in humanity’s evolved psychology, and therefore offer the crude foundations for a universal code of ethics.

On the other end of the spectrum, humans have innate predispositions toward misconduct that can be exacerbated by an exaggerated emphasis on a culture as an unassailable fount of moral knowledge. For instance, humans have an ugly impulse toward tribalism. Allowing cultural boundaries to play a greater role in shaping values than our shared identity as humans not only grants tribalism succor, it is a natural consequence of taking absolute moral relativism seriously. It is also detrimental to the project of building and maintaining humanistic moral codes and the universalized standards of human thriving they entail. With its exaggerated celebration of boundless pluralism, FCR has the potential to prove inimical to the practical goals of building and maintaining the social institutions most amenable to human success.

Consider the very project of building stable social institutions. While it is important to encourage diversity, it is also true that some degree of assimilation is critical to the formation and long-term stability of societies. When the locus of identity in a given society is a multitude of distinct religions, ethnic affiliations, or political subgroups, the resulting fragmentation is a recipe for long-term instability and strife. Each group is bound to pursue clannish interests, guided by moral codes that may be both mutually exclusive and entirely divorced from the best interests of the collective.

This is precisely what has happened in the East Ramapo Central School District in New York, where Orthodox Jews seized the local school board and began cutting services in an attempt to alleviate their tax burden. Their goal was to avoid paying taxes on schools their children didn't attend, but their myopic focus on religious and ethnic affiliation has led them to neglect – and even harm – the wellbeing of their neighbors. Adherence to the moral proscriptions of an ancient faith, in concert with a very likely evolved predisposition to favor cultural familiars, has led them to place value on their identity as Orthodox Jews to the exclusion of their identity as human beings – and all the ethical imperatives that identity implies.

Yet even in the absence of any evolved, universal preferences to form the foundation of widely applicable moral prescriptions, it is impossible to advocate extreme forms of cultural relativism without abandoning modest claims about right and wrong. For example, it might be argued without a lot of protest that individual behaviors that encourage or contribute to the physical health or well-being of others are laudable. Conversely, those that can be shown to be directly or indirectly harmful to others are not. It’s good to feed your kids, bad to starve them. A parent who poisons the lungs of her offspring with second-hand smoke can be sensibly accused of engaging in some form of wrongdoing. But what of the men who have foisted upon their wives the bizarre and inarguably sexist tradition of wearing burqas? Not only is this practice reflective of the possessive, proprietary interests of a regressive patriarchy, it has also been linked to vitamin D deficiency and associated problems like rickets.

According to FCR, the latter claim offers no basis for ethical evaluation. Lacking any direct knowledge of what it is like to be a Muslim woman, I have no basis for suggesting that their adherence to the tenets of Islamic faith leads them to live a less healthy and fulfilling life than they could otherwise. Boiled down, extreme relativism argues that my desire to see all humans, everywhere, granted basic human rights is ill-founded. Moreover, it posits that my opinion that certain cultural practices or religious beliefs are inimical to that goal is bigoted. Instead of expressing an interest in the well-being of all humans, the proponents of FCR see this view as “Islamophobic”.

 

Cultural Relativism as Ethical Obstructionism

By suggesting that cultural differences are essentially unbridgeable and denying the possibility of either uncovering or negotiating a universal standard of human thriving, FCR has the curious consequence of more substantially "otherizing" (to borrow a rather obscurantist term from the social justice world) people from distinct cultural backgrounds. In one of its more popular incarnations, it argues that everyone is gifted with a "positionality" (yet another term of polite PC pedantry) that renders their ideas about right and wrong both externally unintelligible and permanently unassailable. It is to say, in effect, "your worldview is so different from my own that I can hardly justify feeling or expressing outrage when you either abuse or are abused by a member of your cultural subgroup." At best, this is a recipe for social stagnation. At worst, it's a way of abrogating centuries of moral progress – dispensing with hard-earned notions of human rights in favor of milquetoast ideas about cultural sensitivity.

FCR is also a direct progenitor of the modern strain of intellectual and moral sensitivity sweeping college campuses in the form of pleas for "trigger warnings" and concern over "microaggressions". These violations of Enlightened, humanistic ethics seem superficial in comparison to some of the more heinous transgressions countenanced by FCR, but they presage something sinister. Where once FCR might have precipitated innocuously nondescript rhetorical utterances along the lines of "who are we to judge?", it now motivates an authoritarian push for the establishment of pristine thought-sanctuaries – places where ideas are vetted for the slightest hint of potential trespass against the ideals and preferences of cultural subgroups. In this regard, FCR has turned from a bastion for moral cowardice into a direct assault on civil liberties – and it is here that it most significantly earns a sobriquet typically reserved for regressive strains of religious intolerance and slavish adherence to ideology: fundamentalism.

Thomas Paine was absolutely correct when he observed, in The Rights of Man, that natural rights are irrevocable. Though it may be a recent invention, the introduction of the concept of "human rights" represents a monumental transition in our understanding of human ethics. Absent its cataclysmic obliteration from the realm of human thought, it will remain a critical component of our modern assessments of right and wrong. The secular notion of individual human rights is, inarguably, an improvement over previous moral codes, yet FCR – manifest in the mutual unintelligibility implied by overwrought concerns over "positionality" and an obsequious devotion to cultural sensitivity – would have us abandon that progress by asserting that people's culturally derived beliefs about the will of Allah or the efficacy of vaccines are more important than anyone's right to live a healthy, independent life. Brass tacks, there are good reasons to think some ways of living are better than others.

I would humbly submit that we shouldn't throw the proverbial baby out with the bathwater. Cultural relativism, in its original, lighter form, has proven immensely useful to anthropologists and ethnographers seeking to understand the proximate causes of locally expressed human differences. It was a substantial leap over the ham-fisted, Western-centric, deeply racialized theorizing of 19th century intellectuals. The underlying principle, that all human cultural systems deserve to be understood in their own terms, is useful as more than just a methodological tool for ethnographers. On the global stage, it has a role to play in the shaping of foreign policy and international affairs. On the more humble scale of individual lives, it has a role to play in helping neighbors understand one another.

It just needs to be deployed with the recognition that, inasmuch as it is a useful tool for building human understanding, it has little value as a tool for evaluating knowledge claims and moral proscriptions. To the extent that cultural relativism, manifest in its fundamentalist form, is interpreted as endorsing the primacy of subjective experience in dictating the structure of reality or mandating abstinence from adopting or defending human rights on the grounds that those rights are a "Western construct", it deserves all the ridicule it receives. Science is the best tool for gauging truth. Cultural practices that inhibit the achievement of universal human rights can be justifiably viewed as harmful and ought to be stridently opposed and vigorously critiqued. These assertions aren't oppressive or marginalizing or bigoted. They're true.

Sacrificing Reason on the Altar of Purity: U. Va. Students Protest Use of Jefferson Quotes

University of Virginia students and faculty have signed a letter criticizing University President Teresa Sullivan for invoking the words of Thomas Jefferson. In an email apparently intended to salve the all-too-understandable confusion and anxiety stimulated by the election of Donald Trump, Sullivan quoted Jefferson on the importance of U. Va. students, who “are not of ordinary significance only: they are exactly the persons who are to succeed to the government of our country, and to rule its future enmities, its friendships and fortunes.” In other words, “don’t let the election of a deranged demagogue lead you into hopelessness: you are the future, so act accordingly.”

The heart of the complaint is unsurprising: Jefferson owned slaves. Slavery was (and is) an ethical abomination. This is indisputable. That Thomas Jefferson – among other American founders – owned human beings as ranchers today own cattle is a telling stain on the American myth. For many, it gives the lie to the words Jefferson penned – “that all men are created equal.”

But this is a fallacy. All men are created equal. The truth of the idea exists independent of its originator. For centuries, powerful men in the United States have repeatedly failed to make this truth manifest in the lives of all citizens. Some rancorous bastards have even worked against that lofty proposition, exploiting the poor and the dispossessed, brutalizing those unfortunate enough to have been born without white skin, rich parents, and a penis. That places some of these people on an ethical spectrum somewhere between pitiful disappointments and full-bore monsters. For others, it tarnishes a veneer of heroic righteousness, leaving us to puzzle over what to make of people who have done both good and awful things.

Yet America’s history of racism and oppression says nothing of ideas about the equality of human beings. Either all humans are born with equal intrinsic value or they are not.

The same is true of the shrine to intellectual pedantry and banality some of the students and faculty are building at U. Va. Either Jefferson’s statement is true and valuable, or it is not. His personal crimes are immaterial. To think otherwise is to sink into the trap of ad hominem thinking and, in doing so, to help perpetuate the rancid stew of identity politics currently corroding political discourse in the United States. It suggests not only that human beings should be judged entirely in terms of their worst behavior, but also that ideas cannot rise above the inevitable flaws of the humans who create them.

This is truly bizarre thinking. It’s hard to imagine what ideas and expressions would remain permissible in a climate where they must first be sterilized of any murky or odious associations. If the proscription is that ideas can’t come with any baggage, either in terms of the person who dreamt them up or the context in which they originated, then most ideas automatically become verboten. If readers were to judge my arguments entirely in terms of the worst things I’ve done or said, then my humble attempts at persuasion would be irrevocably impotent to a huge swath of the population.

However extreme this view may seem, it is nonetheless implicit in the complaints of people who would rather not have to suffer under the tyranny of a Thomas Jefferson quote. This is ironic, because U. Va. was founded by Jefferson. If a Thomas Jefferson quote is an ethical provocation beyond anyone’s capacity to bear, what are we to make of a salary or an education provided by a school that wouldn’t exist without him?

In the sweep of history, the insipid criticisms of a well-intentioned email will be (or at least should be) a mote of dust. But it is nonetheless illustrative. It tells us that becoming an enemy of reason clearly demands no specific political allegiance. All it takes is that perennially destructive commitment to ideological purity captured under the sprawling umbrella of fundamentalism. Religious fundamentalism. Communist fundamentalism. Free-market fundamentalism. Libertarian fundamentalism. And now, liberal fundamentalism: the belief that everyone’s personal experience is a window of unassailable insight and everyone’s opinion – except those with “privilege” and “power” – is infinitely precious. To satisfy this belief, its proponents are willing to wage war against the climate of open and free expression that gave rise to everything from life-saving vaccines to the very notion of individual human rights. There is very little good in this world that isn’t due to people who cherish reason and accept the premise that ideas should flourish or fail on their individual merits.

Perhaps President Sullivan’s email had other flaws. If it normalized Trump, for instance, it would present a prime target for serious criticism and a springboard for worthwhile debate. Maybe the idea that U. Va. students are special is false, in which case it should be refuted. But the idea that the email ought to be censured because it echoes an idea from a man who was, in terms of racial justice and human equality, quite clearly a hypocrite is dubious at best.

Consider a historical anecdote, at once usefully reductive and logically instructive. In the late 19th and early 20th centuries, agricultural production was limited by the availability of fertilizers. Using the technology available at the time, producing food required more land to feed far fewer people than it does today. A couple of German chemists changed this, developing a method to capture atmospheric nitrogen and turn it into ammonia for use in fertilizers. Billions of people are alive today who would never have existed had those German chemists not made those breakthroughs, inventing what is today known as the Haber-Bosch process.

Thing is, Fritz Haber (the Haber, in the Haber-Bosch process) was a real son of a bitch. Not only did he treat his family terribly, putting his professional ambitions and nationalistic impulses ahead of familial loyalty – thereby likely contributing to the suicides of his first wife and, later, two of his children – he is also considered the father of chemical warfare. He pioneered the weaponization of chlorine and other poisonous gases, directly contributing to the agonizing deaths of tens of thousands of Allied soldiers. Later, scientists working under Haber developed a form of cyanide gas known as Zyklon A – the predecessor to the Zyklon B pesticide used to murder Jews during the Holocaust.

Under the theory of discourse the complainants at U. Va. are implicitly advocating, the Haber-Bosch process – and all descendent technologies – should be immediately abandoned. After all, Fritz Haber was, to put things in disturbingly mild terms, a real dick. Of course, millions – if not billions – of people would starve to death, but the descendants of those who died miserably in the trenches of WWI or in the gas chambers of Nazi Germany wouldn’t have to deal with eating food tainted by Haber’s hideous legacy.

Ideas and opinions should be judged by their qualities, irrespective of the confusion of dastardly or enlightened deeds left in the wake of the people who produce them. The Haber-Bosch process should be weighed in terms of its effects: is it better that billions of people exist today who very likely wouldn’t have otherwise, or does it matter more that the Haber-Bosch process has contributed to overpopulation and all the attendant environmental and social costs that come with it? That’s an interesting question. Whether or not we should do away with the good works and useful ideas of people like Fritz Haber and Thomas Jefferson because the character of those men was blighted by the misery they inflicted on others is not. In fact, it’s not even a question. Those ideas exist. They are worthwhile or dispensable on their own merits. Only a reckless, enthusiastic embrace of authoritarianism could ever get rid of them. And that, I worry, is precisely where the postmodern left – in its urgent pursuit of ideological purity and boundless inclusivity – is headed.

In the world of ideas – which is to say, the world of higher education – what matters is not whether they make people feel welcome or offended. It’s whether or not they are true and make sense.