Stripped to its core, cultural appropriation is a matter of one culture borrowing from another. But recently, it has morphed into an altogether more nebulous construct. At once a righteous denunciation of exploitation – capitalist and colonialist alike – and a strident clarion call asserting a bizarre provincialism of the oppressed, it’s difficult to nail down precisely. Yet concerns over cultural appropriation are motivating increasingly censorious campaigns.
By some accounts, cultural appropriation is universally objectionable. By others, its propriety is conditional, depending explicitly on the power relationships between the appropriator and the appropriated. What is and is not permissible is a question of whether a member of a historically privileged group is coopting the cultural artifacts of the historically marginalized. Often, the bounds of propriety are marked by racial ascriptions, and are thus dependent on the maintenance and perpetuation of social constructions that only loosely track actual patterns of genetic variation and cultural inheritance.
Frequently, concerns over cultural appropriation reflect legitimate grievances. It’s hard to fault the black pioneers of American blues – who often died penniless – and their descendants for harboring some resentment toward the white British and American artists who got rich borrowing heavily from their craft. Likewise for Native Americans, who see caricatures of their ceremonial garb suddenly populate suburban landscapes on Halloween, sexualized and commercialized in abject indifference to their traditional significance.
Other times, the apparent abuses are decidedly more banal. There are those who claim that it is inappropriate for white students to eat bad sushi at their university cafeteria. Recently, members of a student government were berated and disciplined for wearing sombreros – a hat with no particular significance beyond its capacity to shield the wearer from the sun – to a party. Annually, celebrities are pilloried for their choice of Halloween costume. More ominously, activists have begun to suggest that it is wrong – even criminal – for white novelists to portray the experiences of minority groups in their fiction.
In these latter cases, social justice warriors ostensibly clamoring for fairness and equality are inadvertently giving voice to a divisive program of artificially imposed ethnic purity. Foundational to their argument is the suggestion that people with the right kind of heritage can and should exercise absolute and exclusive dominion over specific cultural artifacts, setting the terms of exchange and expression for anyone who doesn’t meet the appropriate stipulations.
In this sense, much of the recent clamor over cultural appropriation should strike a profoundly unsettling note for anyone dedicated to the preservation of liberal values like freedom of thought and – more pointedly – freedom of expression. It suggests one person’s identity, however fluidly construed, can be used to establish limits on another person’s behavior. This is a perversely leftward retreat to tribalism, explicitly granting the erroneous claim that the differences between humans with different ethnic backgrounds, economic prospects, gender identities (and so forth) are not variations on a theme, but unbridgeable chasms. To accept that premise is to reject the larger enterprise of progressive humanism, abandoning hope for a global community in favor of insular patrimonialism.
Of course, it doesn’t help that many of the worries over cultural appropriation are implicitly rooted in an abject misunderstanding of what culture is and how it forms. For those interested in using the notion of cultural appropriation to establish limits on other people’s expression or consumption, the fact that most (if not all) cultural artifacts, from musical traditions to sacred religious icons, are forged at the interface between cultures is dismissed as ideologically inconvenient. Cultures are fluid and permeable – their only universal features are plasticity and change. Cultural appropriation isn’t an assault on identity. It’s at the core of what culture is: people sharing information, learning from one another, borrowing and trading ideas.
Defining culture is almost as thorny a problem as defining cultural appropriation. In the late 19th century, the English anthropologist E. B. Tylor offered a comprehensive definition, writing that “Culture, or civilization, taken in its broad, ethnographic sense, is that complex whole which includes knowledge, belief, art, morals, law, custom, and any other capabilities and habits acquired by man as a member of society.” Recently, researchers have opted for a more parsimonious approach, defining culture as the non-genetic information people create and share with each other throughout their lifetimes and across generations, often embodied in artifacts (tools, art) and rituals. More simply still, culture is the trait that sets humans apart from other animals – it’s what has allowed a species of primate that evolved primarily in the African tropics to occupy nearly every ecological niche on the planet, along the way inventing everything from bows and arrows and iPods to algebra and Catholicism.
Regardless of how we choose to define it, one characteristic of culture remains obvious: it is not inert. A culture in stasis is a culture on the way to extinction. All cultures present today exist because of their capacity to evolve and change, incorporating new ideas about the world and new strategies for living in it according to the exigencies of human psychology and the vicissitudes of the natural world. Our decision to associate certain features of culture with specific geographic regions, nationalities, or ethnic divisions is largely a product of parochialism. Rare are the cuisines or religious traditions that can’t be traced to patterns of intergroup interaction and borrowing at some point in their history. That we think otherwise is due to the myopia foisted on us by minds selected to process time on the scale of weeks, months, or years instead of decades, centuries, and millennia.
Consider, for the sake of illustration, all the problems of creating a reliable definition of separate species in biology. One of the most widely accepted criteria is reproductive isolation – organisms belong to distinct species if they can’t successfully hybridize and produce fertile offspring. This seems reasonable, but it presents problems. For instance, transient and resident killer whales can produce fertile offspring, but in the wild they do not breed or commingle. Likewise, despite being separated by thousands of years of artificial selection, dogs and wolves can still interbreed – chihuahuas and wolves are members of the same species. In both killer whales and canines, a criterion that is broadly useful for deciding what does and does not count as a particular species falls short. The factors that govern reproductive isolation can be both functional and facultative.
These problems intensify when we take a deeper view of animal relationships. Obviously humans can’t have babies with chimpanzees. But somewhere between six and seven million years ago, we were both part of the same species – a common ancestor to both modern chimps and modern humans. A gap that is today unbridgeable is composed of hundreds of thousands of tiny steps, mothers and offspring that were essentially indistinguishable from one another. The process worked slowly, but at some point the creatures that would lead to modern humans and the creatures that would lead to modern chimps crossed an irrevocable frontier, beyond which hybridization was impossible.
The situation is much the same for human culture. At some point in our prehistory, there was probably so little variation in human culture that the distinctions between cultural groups would have been difficult to recognize. The toolkit was simple, composed of traditions for making chipped stone tools, processing food, and managing – if not outright creating – fire. Over time, these traditions were diversified and elaborated, becoming increasingly sophisticated and more highly specialized as the cognitive capacities of hominids increased, people migrated to new environments, and populations spent time developing in relative isolation. With the human migration out of Africa, cultural diversity exploded: In an exponentially accelerating curve, the cultural repertoire of humans has expanded to include thousands of diverse religious traditions, mutually exotic cuisines, strategies for living and thriving in disparate ecosystems like the high arctic and the tropical rainforest, and a cornucopia of technological innovations.
As with humans and chimps, the relationships between cultures can be characterized as part of a process of descent with modification. Pluck native, monolingual English and Hindi speakers from their homes in Omaha and New Delhi and put them in a room together. Very likely, verbal communication between the two would be extremely limited. But carefully trace the development of those languages back through the centuries and you arrive at Proto-Indo-European, a language ancestral to both Hindi and English – along with thousands of other languages and dialects.
But under close scrutiny, the similarities between cultural and biological change begin to break down. Significantly, biological change is a one-way street. Genetic information passes from parents to offspring, but not from offspring to parents or offspring to offspring. That is, in most sexually reproducing organisms, genetic transmission is strictly vertical. This is why the boundaries between species eventually become irreversible: once you’re a chimp ancestor that can’t produce a fertile child with a human ancestor, there’s no going back.
Culture doesn’t work that way. Sure, a lot of cultural information gets passed down from parents to offspring. But it also gets passed around among friends and siblings, or from grandchildren to grandparents. Cultural information can even pass group boundaries marked by intense mutual antipathy. Social learning is the cornerstone of cultural evolution – the only essential requirement for one person to acquire a cultural trait from another human. Consequently, in cultural evolution, the lines of transmission can be vertical, horizontal, or oblique. They aren’t dependent on the strictures of genetic kinship. Those Hindi and English speakers are separated by vast geographical distances, thousands of years of unique history, and lifetimes of personal experience. But with a bit of effort, the native English speaker can learn Hindi (or vice versa) and they can begin to understand each other incredibly well.
The nature of human learning and cultural transmission makes culture an incredibly dynamic force. It also makes establishing good definitions for particular cultures exceptionally difficult, strictly dependent on narrow frames of time and space. The bounds of culture are largely relative, constructed on the fluid substrate of human relationships and historical contingency. Any emblem considered diagnostic of a certain culture is a product of perspective and perception. From the inside looking out, there are plenty of people who would confidently proclaim that the United States has no national culture (save perhaps consumerism), but plenty of observers of other nationalities would beg to disagree. Moreover, that national culture – whatever it is – emerges from the interaction of countless subcultures that can’t, no matter how hard anyone tries, be meaningfully dissected into discrete, universally recognizable units.
Look back to the embarrassing huff over bad sushi. Consider the absurd generalization at its core: that Japanese culture is a monothetic entity, coherently symbolized by a specific cuisine. This view conflates nationality with ethnicity, and both nationality and ethnicity with culture. Both are part of culture, but neither is entirely constitutive of it. Moreover, it disregards any internal variation within the Japanese nation-state and the complex historical processes that have shaped it. The implication is that being Japanese can be reduced to and signified by consumption of raw fish and rice of a certain style and quality.
The example is reductive, but it does get to the core of the issue. If we try to define a broad racial category like “Caucasian”, what we are dealing with is something like a normal curve that significantly overlaps with the curves defining other racial categories. In this definition, there might be a certain range of skin pigments that reliably track patterns of genetic variation and inheritance at high latitudes among populations with diets poor in vitamin D. But at the edges, where the tails begin to taper and overlap, things get tricky – there is no sharp partition between the variation that defines one “race” or another. And within the coarse boundaries of “people who descended from European populations living above a certain latitude”, there are dozens – even hundreds – of racial subcategories. These divisions are invisible to some, intensely meaningful to others, and all constructed from the shifting, malleable scaffolds of cultural change. Racial categories are post hoc constructions, selectively targeting physical features and treating them like essential characteristics.
These distinctions are further complicated when we attempt to dissect the world along cultural boundaries, which more often than not are connected to underlying patterns in superficial physical characteristics and genetic variation only loosely and in complicated ways. Compared with other species, humans exhibit both very little genetic variation – most of it within, rather than between, ethnic groupings – and huge extremes of behavioral variation, largely attributable to culture. Cultures are molded by patterns of diffusion and acculturation (fancy anthropological terms for the spread and sharing of culture) as ideas leap across the barriers that limit genetic transmission, tweaked and mutated and shared (or stolen) again and again and again. Talk of this or that culture is more a matter of convention, an imprecise reference to the blurry peak of a broad and shifting curve. This presents definitional problems, but as a reflection of the reality of human culture, it’s worth celebrating. It means the obstacles that divide us are surmountable, and the frontiers of human innovation and learning are virtually limitless.
The most dire anxieties about cultural appropriation are a product of Platonic essentialism applied to culture and identity, as if a given culture is something discrete, bounded, and easily definable: “Japanese culture is the culture in which people eat sushi” or “African-American culture is the culture in which people listen to and create hip-hop”. Nothing could be farther from reality. Like race, most notions of culture – and the bounds between them – are social constructions. They rarely reflect the actual patterns of information transmission and inheritance that have shaped them into the forms we currently observe. Though talk of Japanese culture or Southern culture or Native American or dominant/mainstream American culture is convenient, the idea that these things exist as sharply defined entities is simply false. The boundaries between them are fuzzy.
Yet even if we are permissive in our assessment, granting that the cultural boundaries people see reliably track the patterns of divergent evolution and historical contingency that have shaped observable differences in belief, cuisine, and attire, we’re still left with a vexing quandary. Concern over cultural appropriation is a matter of privileging our position in space and time. It says that the cultural differences we see today are the ones that matter, and that we should make deliberate efforts to freeze them in place. And, when you take the time to recognize that cultural differences are really the only things that divide us, this impulse morphs from a kind of social sensitivity into a disheartening initiative to cement cultural divisions into the fabric of human experience.
All this discussion of the ambiguity of culture deserves a caveat. Taken to an extreme, these considerations could lead to some spurious conclusions. For example, one might conclude that culture is entirely a matter of perception – that it is, more or less, all in our heads. This is nonsense. The bounds of culture are both porous and flexible, responsive to the position and experience of observers. What does and does not count as a valid appropriation of a cultural artifact looks different to those within the bounds of a given culture than it does to those without. More counterintuitively, what does and does not count as a reliable diagnostic feature of a given culture changes over time. But that does not mean that culture doesn’t exist as a measurable, consequential part of the real world.
Black American culture is different today than it was fifty or a hundred years ago. It will be different fifty or a hundred years in the future. Yet it still makes sense to speak of that culture as something with continuity and coherence. It exists because patterns of human interaction have produced ideas and traditions that are reliably correlated with a broad group of people at a certain point in time. Those traditions and ideas and the people who share them change with time, but they are connected by discernible lines of history and heritage. That doesn’t mean black culture in the United States can be decomposed into a precise set of elements that will allow anyone to inerrantly spot a part or product of that culture when they encounter it. Instead, it means that culture is a thing that exists and, however difficult it is to pin down, exerts a real influence on the behavior of individuals and the shape of society.
Consequently, concern over cultural appropriation can’t be dismissed with a hand-wave. Most of the ruckus does entail some faulty assumptions about the nature of culture, but that doesn’t mean that cultural artifacts can’t be deployed in ways insulting either to their originators or to a group that has chosen to ascribe special significance to them. But because it’s impossible to assign any cultural artifact a fixed and universal meaning, the nature and degree of offense remain subjective.
For example, I see some room for offense if a white, upper middle-class kid dresses like a Sexy Indian on Halloween, dismissively treating hamfisted reconstructions of Native American ritual outfits like a Batman or Frankenstein costume. At the same time, I’m not the least bit bothered by a Satanist or atheist who wants to make a statement by wearing the Christian cross upside-down. Likewise, it doesn’t bother me that Christians annually decorate their homes with coniferous trees, a practice attributable to the pagans of northern Europe whom medieval Christians routinely slaughtered. In each case, we are dealing with a situation where an item held sacred by one group is being appropriated (and perhaps mistreated) by others. Only one of them bothers me (and only when I really put effort into being bothered by it), and though I can think of a number of justifications for ire, I can’t think of a consistent set of criteria for reliably distinguishing between permissible and unacceptable forms of appropriation.
The impulse is to make relationships of dominance and oppression the litmus test, but this isn’t as clear-cut as it seems. It takes all the definitional problems of culture and heaps them on individuals, basically positing that a foolish and insensitive suburban kid is culpable for the entire history of oppression Native Americans have faced at the hands of European colonists. This pushes aside the important work of educating someone about the historical and social facts that make some choices rude or reprehensible in favor of establishing a crude heuristic whereby people are taught to simply avoid experimenting with unfamiliar cultural artifacts.
Defining valid or invalid cultural appropriation is ultimately a matter of how individual people feel about various forms of cultural expression. One can point to relationships of power, privilege, oppression, and exploitation as a guideline, but that doesn’t eliminate any of the problems inherent in any initiative that seeks to use cultural appropriation to construct limits on expression and consumption. Inarguably, cultural appropriation happens. Uncontroversially, some forms of cultural appropriation are thoughtless, crass, hurtful, and bigoted. Responding to them is a matter of vociferous criticism and serious debate, not a retreat to delicate cultural provincialism.
Taken seriously, the most strident attacks on cultural appropriation amount to advocacy of ethnic tribalism and cultural stasis. No one should be permitted to write a book that contains a character whose arc involves elements that trespass the frontiers of personal experience: a white, able-bodied, cisgender woman can’t write a story involving a character who is a transgender Australian Aboriginal paraplegic. Suitable meals for breakfast, lunch, and dinner should be restricted to those involving recipes and ingredients directly attributable to your personal heritage (historical cutoff yet to be defined): no tacos or tikka masala for the Irish. Appropriate attire should scrupulously avoid incorporating elements invented or commonly used by people who don’t look like you: if you’re not a direct descendant of a horse-culture from the Eurasian steppes, probably best to avoid wearing pants.
Given the general ideological inclinations of the people most vocally protesting cultural appropriation, this is a perplexingly conservative impulse, effectively sacralizing parochial snapshots of culture by establishing rigid strictures on its appropriate use. It ignores the hazy edges and dynamic nature of culture as a force for sculpting and expanding the human behavioral repertoire. Cultural appropriation can make people upset – with varying degrees of legitimacy – but it does not do any measurable harm. That puts efforts to constrict it on a dangerously slippery slope. If you’re not convinced, consider the following: an activist arguing for limitations on the use of a Native American headdress as inspiration for a Halloween costume is employing the same reasoning as a fundamentalist Christian arguing for limitations on the rights of gay men to marry. In both cases, the grievance boils down to a sense of offense rooted in a subjective, historically contingent definition of propriety. In neither case is anyone demonstrably harmed – gay marriage has never actually hurt a Christian family, nor has an ill-conceived choice of Halloween costume inflicted more than emotional harm on a cultural subgroup – but people nonetheless claim to have had their traditional values violated.
It’s important to be clear and precise about the argument I’m advocating. It is perfectly acceptable, even sometimes laudable, for people to feel outraged by the appropriation and perceived misuse of whatever they take to be their cultural heritage. Thus aggrieved, they should express themselves, and – especially in the case of historically marginalized or oppressed groups – have platforms for doing so. At no point, however, is it justifiable for one person or group’s sense of offense to motivate the establishment of boundaries on another person or group’s capacity to express themselves. One can point to specific injustices to validate this or that claim of cultural misappropriation, but in the end, it all reduces to one thing: a plea for humans to respect one another’s points of view. That endeavor can never involve placing restrictions on the realm of permissible expression. Indeed, if we accept respect and understanding as praiseworthy aims, we must grant the corollary – that they are best served by allowing everyone as much room as possible to engage in the fraught business of exploring other people’s experiences.