The Dull Art of Problematizing Everything

Here’s an essay for Areo Magazine, a very fine place to go if you like to read interesting things:

Few things in life are certain. Some will populate a short list of inevitabilities with death and taxes, but really, only the former is guaranteed—just ask the sitting president of the United States. If you have spent any amount of time on the internet, however, I’d wager a lofty sum that you’ve seen plenty of headlines of the “Why Blank Is Problematic” variety. More often than not, these aren’t essays that offer insight or clarity. Instead, they simultaneously monetize a boring fact about the world—that everyone’s conception of it is necessarily incomplete—while snidely sidestepping all efforts to understand the intent behind a given act of communication or creation and empathize with its originator.

Read more here.

Read This Book, Dammit: The Secret of Our Success

Starting in the late 1970s and early 1980s, a new program of research began to emerge in the study of human culture and behavior. Building on pre-existing tools from population genetics and evolutionary biology, researchers like Luigi Cavalli-Sforza, Marcus Feldman, Robert Boyd, and Peter Richerson began to construct a theory of cultural evolution rooted in Darwinian principles. They showed that attention to the functional roots of culture could be couched in a larger framework capable of explaining both the nature of culture and the processes behind cultural change.

The notion that cultures evolve was hardly new. Archaeologists and anthropologists had been working under that assumption for a few decades, striving to refine their theoretical and methodological approaches into the roots of a mature, rigorous science. Ultimately, these efforts yielded a framework that used a thoroughly Darwinian lexicon – adaptation, selection, evolution – in only loosely Darwinian ways. Researchers developed a focus on local ecological specialization – a useful step forward – but frequently situated their insights in a framework that was both incomplete and inconsistent. The recognition that cultural change was not only evolutionary, but sensibly Darwinian, provided the tools necessary to build formal – and testable – explanations of cultural phenomena.
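To give a flavor of what such a formal model looks like, here is a toy sketch in the spirit of the population-genetics-style recursions those researchers developed. Everything in it – the function name, the parameter values, the specific recursion – is my own illustration, not something lifted from any particular paper: it just tracks the frequency of a cultural variant that learners are slightly biased toward adopting.

```python
# Toy sketch of biased cultural transmission (illustrative only; names and
# parameter values are invented, not drawn from Cavalli-Sforza/Feldman or
# Boyd/Richerson directly).

def biased_transmission(p, b):
    """One generation of transmission: p is the frequency of a cultural
    variant, b is the strength of the learning bias in its favor."""
    return p + b * p * (1.0 - p)

p = 0.01   # the favored variant starts rare
b = 0.1    # modest adoption bias
for generation in range(100):
    p = biased_transmission(p, b)

print(f"Frequency after 100 generations: {p:.3f}")  # spreads toward fixation
```

Models of roughly this shape – extended to include conformity, prestige bias, transmission error, and interactions with genetic fitness – are the kind of formal, testable machinery the paragraph above is pointing at.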

In the years since Cavalli-Sforza, Feldman, Boyd, and Richerson laid down their pioneering work, the theory of cultural evolution as a Darwinian process – capable of both causing and responding to new patterns of biological evolution – has been consistently vindicated, demonstrating its utility in the lab and field. Joseph Henrich’s book, The Secret of Our Success, is a thrilling exploration of the frontiers of that research. Henrich makes a strong case that underlying humanity’s broad ecological success and expansive behavioral repertoire is our faculty for creating, transmitting, manipulating, storing, and accumulating massive amounts of non-genetic information in the form of culture.

Cumulative cultural evolution, as it is called, is unique to humans (putting aside emerging evidence for simple forms in New Caledonian crows – the difference in degree is large enough that we might as well call it a difference in kind). Other species have cultural traditions – those crows, for instance, make tools, as do chimpanzees – but none of them retain that information and build on it in any meaningful way over successive generations. The techniques individual chimps learn for termite fishing or nut-cracking are lost at death, inevitably hung up on the barriers that inhibit the transmission of all the other traits organisms acquire throughout their lifetimes.

In technical parlance, this is called Weismann’s barrier. Put simply, it means that inheritance is a one-way street – information moves from germ (sex) cells to somatic (body/tissue) cells, but not the other way around. A chimp might learn a great deal about how to use tools to access otherwise inaccessible resources throughout its lifetime, but it has no way to get that information from the neurons in its brain to the eggs or sperm in its reproductive system.

Somewhere in the hominid line, our ancestors found a way around that obstacle, sidestepping the whole business of one-way genetic transmission by transmitting a lifetime’s worth of acquired information from individual to individual in the form of culture. The foundations of this remarkable evolutionary transition rest in humanity’s spectacular facility for social learning. Other species are, of course, capable of social learning, but these abilities are vastly enhanced in humans. We pay far more attention to each other than other animals do, selectively targeting individuals that exhibit signals of above-average proficiency or expertise. In the same vein, we have a highly developed theory of mind (the ability to think about what other people are thinking), allowing us to understand each other in terms of intentionality and purpose. It has even been suggested that our unusually small iris, set against a very white sclera, is an adaptation for non-verbal communication – making it easier for us to keep track of other people’s attention and communicate our own.

Critically, growing evidence indicates our social learning expertise – unlike many other forms of human knowledge and behavior – is innate. Henrich discusses experiments in which children were matched against adult chimps and orangutans on a variety of tasks. In most domains, they do about the same or a little worse than the other primates. But in social learning, human children massively outperform their hairier cousins.

Such highly evolved adaptations for social learning provide a scaffold for extraordinary levels of information sharing. Even absent language, humans watch one another and pay special attention to signals of above average proficiency and prestige in order to learn new or better ways to solve adaptive challenges. They create and maintain social norms that encourage cooperation, foster stable traditions, and aid in patterns of ingroup-outgroup competition. Social learning allows us to make and accumulate culture.

This, more than anything else, explains why a species with relatively little genetic variation displays such a sweeping range of behavioral variation and ecological specialization. It gets to the why of the descriptive insights uncovered by earlier cultural evolutionists – that humans display local ecological adaptation – by presenting a plausible and, increasingly, empirically justified mechanism. Humans can meet the challenges of living on frigid ice sheets in the high arctic and sweltering jungles in the subtropics because we have the capacity to accumulate information about how to live in those environments at a rate far in excess of that afforded by strict biological adaptation. And critically, it’s not a matter of individual genius. Humans learn how to live in new environments through the accumulated wisdom of generations of trial-and-error learning, resulting in cultural packages that are expertly tailored to the challenges of specific ecosystems.
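The logic of that last point is easy to caricature in a few lines of code. The sketch below is my own illustration, not a model from the book: “skill” is an arbitrary currency, and the copying-loss and learning numbers are invented. The point is just the contrast between learners who start from scratch each generation and learners who copy the best of the previous generation before adding their own tinkering.

```python
# Toy contrast between individual trial-and-error and cumulative cultural
# learning. All quantities are made up for illustration.
import random

random.seed(1)
GENERATIONS = 50
POP = 100
LIFETIME_GAIN = 1.0   # most skill an individual can add on their own in one lifetime

def lifetime_learning():
    """Skill an isolated individual acquires by trial and error in one lifetime."""
    return random.uniform(0, LIFETIME_GAIN)

# (a) No social learning: every generation starts from zero.
individual_best = max(lifetime_learning() for _ in range(POP))

# (b) Cumulative culture: each generation copies the most skilled individual of
# the previous one (with a little copying loss), then adds its own improvements.
pop_skill = [0.0] * POP
for _ in range(GENERATIONS):
    best = max(pop_skill)
    pop_skill = [0.95 * best + lifetime_learning() for _ in range(POP)]

print(f"Best skill without social learning: {individual_best:.2f}")
print(f"Best skill after {GENERATIONS} generations of copying: {max(pop_skill):.2f}")
```

With copying switched on, skill ratchets upward across generations; without it, every generation tops out near whatever one lifetime of trial and error can deliver.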

Clearly, this point stands in contradiction to those who would link humanity’s extreme success as a species to extraordinary – and innate – individual intelligence. Individual humans can be pretty smart, but they rarely (if ever) have the cognitive horsepower necessary to build, from scratch, the sophisticated cultural innovations needed to survive in novel environments. This is true of modern technology and scientific progress as much as it is of forager subsistence and ritual observance. There is a popular tendency to think of technological innovation as a matter of lone geniuses and marvelous insights. But James Watt’s steam engine was inspired by the earlier Newcomen steam engine. Similarly, Albert Einstein’s theories of special and general relativity drew inspiration from, and built on, the work of Gottfried Leibniz and Bernhard Riemann. Individual genius is real (who could argue that Einstein wasn’t a genius?) but the fruits of genius accrue incrementally.

For his purposes, Henrich makes this point another way. To illustrate the failings of individual intelligence – and, by contrast, the power of cumulative cultural evolution – he relates a variety of historical anecdotes. In this light, we might think of them as little natural experiments. In each, healthy, intelligent European explorers found themselves in a scenario where they were forced to survive on their wits alone in an unfamiliar environment. Be it the muggy, swampy coasts of the Gulf of Mexico, the icy wastes of the high arctic, or the arid sprawl of the Australian outback, the outcome was inevitably the same: suffering and death. Those who survived did so because kindly natives, with the cumulative knowledge necessary to survive in a particular ecosystem, lent the naive Europeans a hand.

Tellingly, the causal mechanics of successful cultural adaptations are usually opaque to the people who employ and perpetuate them. Most people don’t understand the physics of bow and arrow technology or the insulative properties of snow and ice. They don’t understand the chemistry of effective poisons for hunting or detoxifying otherwise inedible plants. Yet, using the cumulative intelligence of many individuals over multiple generations, they develop technologies that successfully exploit principles of aerodynamics, thermodynamics, and chemistry to build sophisticated suites of cultural know-how that allow them to live and thrive in almost any environment.

The breakthrough Henrich presents is not that culture is useful. That’s pretty obvious, intuitively speaking. It’s in the emerging understanding of how humans make culture – and how culture makes humans – in dynamic patterns of feedback and response between our genetic architecture and cultural developments over successive generations. Learning how to process plants and meat, and passing that information down from generation to generation, has worked extraordinary changes on our guts. Domesticating certain ungulates and incorporating their milk into their diets has modified certain populations’ ability to metabolize milk well into adulthood. Specialized adaptations allowed humans to move up into otherwise inhospitable latitudes, eventually altering the skin pigmentation of some European populations. The Darwinian framework of gene-culture coevolution allows researchers to move beyond insightful explanation about the plausible roots of human cultural and behavioral variation and get down to the serious business of scientifically explaining these things.

And that is the core point. The revolution here isn’t descriptive, it’s explanatory. Placing our understanding of cultural change in a comprehensive, unified Darwinian framework has moved the study of human behavior forward in a way that other, similarly minded attempts have so far failed to achieve. As more and more researchers across the social sciences – from psychology and sociology to economics and anthropology – come to appreciate and accept the utility of the Darwinian perspective, these fields (particularly anthropology) are beginning to move out of the aimless shadows of what Thomas Kuhn called pre-paradigmatic science.

The reasons for this are simple: the more researchers who work within a coherent, mutually intelligible framework, the greater a field’s capacity for real scientific progress. This is because science itself is something of a Darwinian process. It works through patterns of competition and cooperation among individual researchers (and research groups), who collaborate on complex problems and criticize each other’s work where it falls short of established criteria. This process doesn’t work very well if everyone is working under an entirely different framework – Marxist anthropologists can’t add much to the discussion of Darwinian approaches because they lack both the specialized knowledge and the shared values needed to make sense of and properly evaluate Darwinian work (and vice versa).

Along these lines, the work of Henrich and other evolutionarily minded social scientists has been immensely beneficial, forging as it has a deeper, broader understanding of the roots of human behavior. And there’s a compelling case to be made that the growing popularity of this theoretical framework isn’t some intellectual fad. Rather, it’s a product of people who share similar goals (to explain things) and similar standards for judging how well those aims have been met (internal coherence, experimental and observational evidence, falsifiability) responding to relevant evidence. The array of approaches couched under the wider framework of gene-culture coevolution just seems to work.

Henrich’s synthesis of this research is among the best that I have read, carefully explaining how evolved psychological traits – like a bias toward watching and mimicking prestigious or successful individuals or a tendency to monitor and enforce social contracts – work in concert with our ever-increasing capacity for high-fidelity information storage and transmission – language evolution, writing technology, printing presses, the internet – to create a potentially boundless realm of cultural innovation. Humans are a remarkable species. But, as Henrich argues, our singularity comes not from our innate intelligence – which has been much overblown – but from our ability to put our heads together, creating resilient forms of collective intelligence that allow us to survive – and thrive – practically anywhere we find ourselves.

The Secret of Our Success: How Culture Is Driving Human Evolution, Domesticating Our Species, and Making Us Smarter – by Joseph Henrich

GOP Presidential Hopefuls Resoundingly Reject Science and Constitutional Values

A recent Salon article provides a synopsis of the views the GOP’s current 2016 presidential hopefuls hold regarding evolution. In aggregate, they take a bold stance against science and reason, which should come as a surprise to absolutely no one. Jeb Bush holds the most enlightened view by a considerable margin, accepting evolution on the one hand and arguing that it should not be a part of school curricula on the other. Compared to his fellow presidential hopefuls, this is a remarkably intelligent and nuanced position, but it still ultimately boils down to sycophantic pandering to the far-right religious zealots the GOP depends on to remain competitive. That anyone holding any of the views expressed by the GOP’s potential 2016 candidates – even Bush’s milquetoast appeals to the lowest common denominator – has some chance of securing the presidency is exceptionally disheartening.

The worst offenders – Ben Carson, Mike Huckabee, Rick Perry, and Rick Santorum – have adopted a position in abject opposition to all measures of rationality and evidence, essentially casting their lot with emotional/ideological preferences rooted in flimsy interpretations of ancient myths and, I suspect, deep fears regarding their own cosmic insignificance. The sad thing is that there is a significant proportion of the U.S. electorate that finds this sort of vehemently stubborn, fact-averse religious fanaticism appealing. According to a recent Pew poll, some 31% of Americans reject the reality of human evolution. This is disconcerting, but offset by the 35% or so (depending on who you ask – Gallup comes up with a different number) who recognize that evolution by purely natural means is the best explanation for human origins. Still, the 31% who more or less reject everything the best evidence and most coherent theory tells them regarding the origins and diversity of life on earth should not be written off.

[Figure: Pew survey results on acceptance of human evolution. Ideally, the public’s views would mirror those of AAAS scientists.]

Not being a Jedi master/mind reader, I can only speculate about the motivations behind the GOP candidates’ stated beliefs. I get the impression that the four gentlemen mentioned in the previous paragraph aren’t being anything less than genuine. They are religious fanatics, pure and simple. The actual beliefs of the other candidates are harder to discern, clouded as they are in the nebulous miasma of obfuscation and pandering that seems to follow career politicians wherever they go. All of the candidates endorse some breed of “teach the controversy” nonsense (read: allow Christian creation myths to be taught in science class), and obsequious attention to the right-wing base seems like a plausible motive. Though the 31% of the population that rejects evolution aren’t likely to decide an election on their own, it’s worth noting that their votes aren’t evenly distributed. Results of a Gallup poll indicate that 58% of Republicans endorse the Creationist view that humans were created by God within the last 10,000 years, as opposed to 41% for Democrats. Consequently, pandering to anti-evolution religious zealots is essentially mandatory for anyone hoping to secure a chance at the Republican presidential nomination. The relationship between religious belief and party affiliation tells a similar story. 64% of white Protestants reject evolution; 67% of white Evangelical Protestants are registered as Republicans. The exact degree to which these two subsets of the white “I find reality intensely unsettling” demographic overlap is unclear, but I suspect it is considerable.


In any event, the outlook for modern Republicans with presidential aspirations is bleak: grovel at the feet of superstitious troglodytes or lose. But perhaps I’m being too partisan in my analysis. Certainly the fact that Republicans can’t win an election without pandering to one of the most stubbornly ill-informed subsets of the modern American populace should be properly viewed as a stain on their party: the only way they can maintain their brand is to sell snake-oil to eager dupes. More disconcerting, however, is that any member of any party has to invest energy in either placating or pleasing society’s most grossly ignorant factions. No one who expresses any of the views enumerated in the Salon article should have a chance of becoming the president of the United States – or of any other 21st-century nation, for that matter. The answer, of course, is not to disenfranchise the ignorant. Rather, it is to work to eradicate ignorance by remedying the flaws in our educational system and the broader social milieu in which it rests that have allowed that ignorance to persist. In a supposedly advanced, modern society with near-instant access to endless information, the proportion of the population that rejects evolution, believes GMOs are unsafe, thinks vaccinations are dangerous, or subscribes to any number of the hare-brained, lunatic-fringe notions that have taken up residency in the popular consciousness should be 5% or less.

From this perspective, there is some reason to be hopeful. The proportion of the population that accepts naturalistic evolution is up to 19% (from 15% in 2012), even as the percentage of the population that takes the nonsensical creation myths of the Bible seriously has dropped to 42% (from 46% in 2012). Slim improvements, to be sure, but I’ll take them enthusiastically. Viewed through properly rose-tinted glasses, this is a silver lining that can be magnified, unfolding into a future in which presidential candidates don’t have to pander to religious zealots, and sincere religious nuts don’t register as even remote options for the presidency. Maybe it’s a long shot, but I’m not quite prepared to abandon hope.


The Open-Carry Movement: Turds in the Punchbowl of Civilized Society

Is the open-carry movement composed almost entirely of bullies, douche-bags, and ignorant militants? Men and women possessed of an inexplicable sense of entitlement, passionately defending a radical interpretation of the Second Amendment like savage dogs squabbling over gristle? These questions are rhetorical, of course. The evidence indicates that the answer all around is a resounding yes. As several recent videos demonstrate, the open-carry movement is one of shallow, dick-swinging bravado and reactionary thoughtlessness, conditioned by crude political indoctrination.

Don’t simply take my word for it. Watch as they crassly harass a U.S. Marine Veteran, goading him like crude caricatures of high-school bullies.

These people are some of the dregs of modern society. Their strongest arguments are rooted in wan appeals to a fever-dream of “American tradition”. Under scrutiny, they have all the resilience of dust in the wind. So, lacking the ammunition to win by persuasion, they resort to tactics not dissimilar from a cockerel advertising its dominance over a clutch of hens. Typically, men and women of this sort pride themselves on their independence, yet by their very nature, they are prime targets for political manipulation.

Take, for instance, the very cause they claim to champion – the inalienable right of each individual to own a gun. The very notion of an inalienable right – immutable, eternal, unassailable – doesn’t hold a lot of water. It’s pure linguistic trickery, likely rooted in the Western heritage of Platonic essentialism. Probably the closest we might come to defining a right as inalienable is to say it satisfies our senses of compassion and empathy in light of pertinent exigencies. The Second Amendment was written by men, and – as with the rest of the U.S. Constitution – it was born of a mix of high-minded idealism and political pragmatism. Indeed, historical evidence suggests that the Second Amendment in particular was likely a concession to the latter on behalf of the former. As Michael Waldman pointed out in his recent (and excellent) essay, the framers of the Constitution added the Second Amendment in order to placate anti-Federalist forces largely opposed to the very notion of a new constitution. The explicit purpose of the Amendment was to provide legal justification for the establishment and maintenance of state militias. After some fiddling, that justification was finally expressed as follows:

A well regulated militia, being necessary to the security of a free state, the right of the people to keep and bear arms, shall not be infringed.

Note that it says nothing of an individual right to bear arms. Up until 2008, the Supreme Court consistently ruled against that interpretation. It was not until the Bush administration gave us a Supreme Court dominated by right-wing activists – men who brought us such disastrous decisions as Citizens United and McCutcheon v. FEC – that the individual right to bear arms garnered any legal standing. This in turn was the product of a long process of political manipulation that began in the 1970s, when right-wing ideologues hijacked the NRA, which had previously supported common-sense gun regulation.

The road to radicalization was both long and subtle – it’s worth reading Waldman’s entire piece for the full story – but the ultimate point is that the “inalienable” right the open-carry movement purports to defend is purely fictitious. It is a political fabrication, pure and simple. But more importantly, it is also ridiculous. It has nothing to do with compassion, empathy, or pragmatism. Nothing to do with promoting the general welfare of anyone, anywhere. The original intent of the Second Amendment was to pacify people wary of the potential for tyranny implicit in the foundation of a strong federal government. Given the historical context, this was not unreasonable. But with the dissolution of official state militias, the Second Amendment ceased to serve its intended purpose. In light of the massive imbalance of force and skill evident between the modern U.S. military and the most well-armed right-wing militant, it’s not even clear that it could.

None of which is even to argue against an individual right to bear arms. I own guns. I enjoy hunting and recreational shooting. Even the idea of owning a gun for self-defense is sensible. But fulfilling any of the aforementioned ends does not require carrying an assault rifle in public. It doesn’t even require owning one in private. The idea that individuals have a right to own any and all guns they want, absent any and all restrictions, is a complete and utter fiction. It has no grounding in either law or history, let alone utility. There hasn’t even been a serious political movement toward curtailing the most outlandish interpretations of Second Amendment rights. Indeed, the political influence of the NRA and other radical gun interests is so great that even the most conservative of gun laws – supported by 90% of the U.S. public – can’t find sufficient support in Congress. The gun-rights movement in general – and the open-carry movement in particular – is a fight against a paranoid delusion of government overreach. It is a cause for developmentally stunted men and women hoping to act out their childish action-movie fantasies and preserve their petty sense of entitlement. Which is a long and strenuous way of saying something simple:

The radicals in charge of the modern NRA, the political cowards that bow to their will, and the men and women of the open-carry movement can all go fuck themselves.

The end.

 

Trigger Fever: Liberal Censorship and Ideological Sensitivity


I’ve spent enough time ranting about religious hokum and conservative chicanery that a casual reader could be forgiven for thinking I’m some kind of self-righteous, bleeding-heart lefty. Personally, I like to think that my ideological biases – whatever they may be – are rooted in the rational analysis of empirical evidence. I try not to take a strong stance on something unless I think I have good reason for doing so. Nevertheless, it would be disingenuous for me to suggest that my sympathies aren’t more closely aligned with those on the left end of the political spectrum than with those on the raving end of intellectual stagnation that is the modern conservative right. The reasons for this are simple. For the most part, the conservative outlook is purely ideological. It is about how people believe the world ought to be, irrespective of any (and very often all) available facts about the way the world actually is.

Fortunately, whenever my feelings of affinity for the ideological left become too strong, something usually comes along to remind me that much of what falls into the broad classificatory scheme of liberalism is just as deeply rooted in ideological preferences about how people think things should be as any conservative notions about the rationality of unregulated markets and the purity of self-interest as a motive for social action. Case in point: the rise of so-called “trigger warnings”, thoughtfully discussed in Jenny Jarvie’s March piece in The New Republic.

Clearly I don’t spend enough time visiting forums populated by victims of violent trauma or by people afflicted with an overactive sense of political sensitivity, because I had no idea trigger warnings had become a topic of debate prior to reading Jarvie’s essay. In the abstract, trigger warnings are designed to steer people away from material that might stir up unpleasant memories of past experiences. Situated in the appropriate context, trigger warnings appear to be a sensible courtesy motivated by feelings of compassion and empathy. Unfortunately, a noble seed seems to have grown into a rather noxious weed, as trigger warnings have begun to crop up on college campuses – outside the blogosphere in which they were incubated.

In the early months of 2014, students at the University of California, Santa Barbara, passed a resolution urging faculty members to employ trigger warnings before assigning or discussing material that might offend certain students’ sensibilities. There are definitely people out there who have experienced horrible trauma in their lives. Victims of sexual violence and combat veterans, for instance, have probably seen the most vicious human savagery imaginable. Traumatic experiences are bound to leave tender spots in their wake, wounds easily aggravated by provocative words and imagery.

The problem is that there are no objective criteria defining what is and is not a trigger. What words or phrases count as triggers? What do those words or phrases trigger? What types of people, with what kinds of histories, should trigger warnings be constructed to protect? As Jarvie points out, triggers aren’t particularly amenable to discrete classification or prediction – they can be tied to simple, subtle cues or complex arrays of sensory information. Likewise, the responses they engender can vary widely. What is and is not a trigger is extraordinarily subjective. If we tip-toe around everything that anyone might find redolent of past trauma (real or perceived), then nothing is open for in-depth discussion or debate. Jarvie writes:

Issuing caution on the basis of potential harm or insult doesn’t help us negotiate our reactions; it makes our dealings with others more fraught. As Breslin pointed out, trigger warnings can have the opposite of their intended effect, luring in sensitive people (and perhaps connoisseurs of graphic content, too). More importantly, they reinforce the fear of words by depicting an ever-expanding number of articles and books as dangerous and requiring of regulation. By framing more public spaces, from the Internet to the college classroom, as full of infinite yet ill-defined hazards, trigger warnings encourage us to think of ourselves as more weak and fragile than we really are.

Slippery slope arguments are, as the name might suggest, slippery. They are among the cheaper tools in the sophist’s toolkit and should be wielded with care. But in this case, I think the term applies. At some point, trigger warnings cross a line from polite consideration and begin to skirt the borders of ideological censorship, as individuals attempt to cleanse the social world of the things they find unsettling. In this sense, the idea of issuing trigger warnings whenever potentially controversial topics are going to be addressed in the classroom is well within the same ballpark as the idea that certain books should be banned because they risk offending certain people’s religious or moral sensibilities. The students who support the implementation of trigger warnings are toying with a mindset dangerously similar to that of the rigid prudes and rotten curmudgeons who recently pressured an Idaho school board into removing a controversial book from their curriculum. Both are notions that appeal primarily to intellectual cowards – people who cling to ideologies (feminism and obsequious political correctness in the case of the former, fundamentalist Christianity and right-wing fascism in the case of the latter) as bastions of order and comfort in a sea of complex and conflicting ideas.


On a more fundamental level, trigger warnings are a plea for a sanitized world, a place where no one ever encounters anything that makes them feel an unwanted emotion. This, unfortunately, is a fantasy: the world is filled with unpleasant things, including an absolutely staggering amount of human suffering. Should we table all discussion of tragedy and despair until everyone present has affirmed their comfort with the subjects or left for greener pastures? In addition to being something of a worn-out cowboy platitude, the idea that the world can be a tough place to carve out a living happens to have considerable grounding in reality. The universe is indifferent to our sensitivities. Carnage and brutality are as much a part of nature as beauty and elegance. That’s not to say we shouldn’t strive for the latter while attempting to minimize the former, but to highlight the cold reality that ugliness is a fact of life. Looked at in this way, it becomes readily apparent that the trigger warning movement is primarily a case of masturbatory activism. It is a product of WEIRD (western, educated, industrialized, rich, and democratic) privilege – the sort of thing that postmodern narratives, like those at the core of this debate, are designed to subvert.

In my previous post, I wrote about how building a college around Christian dogma was absolutely inimical to the very purpose of higher education. So it is with trigger warnings. Certainly there are people out there who have experienced horrible things. Those clinically diagnosed with P.T.S.D. might well deserve special accommodations, but these can be worked out privately with their institutions and instructors. But most people are not going to fall into that camp. Blanket trigger warnings are an implicit endorsement of a socially constructed, infinitely malleable sense of victimization. They reek of entitlement – of the notion that, as Americans, people have an inalienable right never to come into contact with anything that makes them twist and squirm. If the desire for such a sterilized existence is unrealistic, it is also pitiable. If you can’t take a bit of discomfort in your academic career, you should probably just stay the fuck home. If you aren’t made to feel uncomfortable by some of the things you read and hear, you probably aren’t going to learn a damn thing. Testing and exceeding your boundaries, both ethical and ideological, is how you develop new perspectives, nurture empathy, and cultivate wisdom. Here, Jarvie’s words are eloquent:

Structuring public life around the most fragile personal sensitivities will only restrict all of our horizons. Engaging with ideas involves risk, and slapping warnings on them only undermines the principle of intellectual exploration.

Personally, I have experienced profound insight and growth following periods in which I engaged with new ideas that left me untethered from the preconceptions I once held dear. Some folks have heaped a lot of ideas concerning propriety on the frail – if more or less well-intentioned – edifices of esoteric social theory, imbuing lofty reflections with the weight of moral imperative. Many of those ideas were forged out of a sense of intellectual curiosity, fortitude, and adventure. Now, a handful of liberal arts students, enamored of their convoluted ideas and expanding vocabularies, are willing to abandon all of that in favor of preserving their fragile sensibilities. Never let it be said that conservatives have a perfect monopoly on reactionary stupidity.

 


Postscript:

It has been pointed out to me that someone might read this entry and think it downplays sexual assault, both as a societal problem and as an experience of intense individual trauma. That is not the case. The statistics here are unequivocally grim. Even if they were not, the existence of even isolated cases of sexual assault should be considered socially unacceptable. So, to be clear, I think sexual assault is an absolutely horrific crime and its perpetrators are some of the most vile scum imaginable.

The point remains that trigger warnings aren’t a very practical way to address the legitimate emotional trauma of that type of experience. If an institution issues trigger warnings for one thing, fairness dictates that they must issue them for all potentially uncomfortable subject matter. The cravings of addicts can be triggered by a suite of stimuli, and the consequences of being triggered could be severe (up to and including death). Nevertheless, I don’t think issuing a trigger warning any time a class might address the topic of drugs or alcohol is reasonable. But more to the point, issuing a trigger warning any time drugs or alcohol come up will not guard addicts against potential triggers. It is a definitional problem – perhaps talking about drugs and alcohol will trigger cravings in an addict, or discussion of rape will trigger emotional pain in a victim, but so might any number of other stimuli pertinent to the context in which drug abuse or sexual assault occurred. Because of this, trigger warnings have the functional effect of sterilizing intellectual discourse without reliably protecting their intended targets.

As I said, people who have experienced real trauma and carry the resulting psychological scars should probably be allowed to work out some kind of accommodations with instructors. This is an issue that likely has more to do with the bizarre way Western society treats psychological illness than it does with the propriety of trigger warnings. Trigger warnings in this case are at best a means of treating a symptom. The actual problems that trigger warnings might address lie elsewhere. Which is what leads me to my concerns about trigger warnings as a mechanism for ideological purification, rather than a legitimate means of helping individuals cope with past trauma.

Religious Moderation vs. Religious Fanaticism vs. Science


Over at io9, Mark Strauss has written a nice piece cataloging the brouhaha underway at Bryan College, a Christian school in Dayton, Tennessee. The dispute revolves around a change to the wording of the college charter. The previous version was plenty nonsensical, but apparently the board of trustees wanted to make their pro-hokum position a little more rigid. Consequently, a charter that once read:

“that the origin of man was by fiat of God in the act of creation as related in the Book of Genesis; that he was created in the image of God; that he sinned and thereby incurred physical and spiritual death;”

now carries the addendum:

“We believe that all humanity is descended from Adam and Eve. They are historical persons created by God in a special formative act, and not from previously existing life forms.”

Both phrases mean approximately the same thing, but the addition of the clarification concerning the historical veracity of Adam and Eve strikes a stricter tone, eliminating all room for “Bible as metaphor” apologetics. As Strauss points out, the altered wording gets right to the heart of one of the primary hurdles preventing Christian fundamentalists from accepting biological evolution: if Adam and Eve are not the literal progenitors of all mankind, then there is no “original sin”, and – here is the critical point – if there is no original sin, there is no reason for God to send his only begotten son, Jesus Christ, to learn carpentry and die for our sins. Of course, even in the absence of empirical contradictions, this story doesn’t make a lot of sense, but that’s not the point. To hardcore believers, the Biblical creation story is a literal recounting of actual events. For them, the story of Adam and Eve is the linchpin of their religious beliefs.

Strauss’ take on the whole affair is thoughtful and lucidly written. According to Strauss, the change in wording and subsequent schism can be partially traced to the rise of genomics. New tools have increased the resolution and fidelity of genetic research, allowing researchers to both ask and answer important questions about human ancestry. Unsurprisingly, the resulting accumulation of evidence argues strongly for a human ancestry that is ancient and shared. More to the point, it argues strongly against a literal interpretation of the Biblical story of Adam and Eve. A recent study (published last year) conservatively estimated that the minimum population size needed to account for the genetic diversity of modern humans is 2,250. Consequently, a hypothetical two-person gene pool would fail to account for modern human genetic diversity by several orders of magnitude.
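For readers curious how geneticists get from observed diversity to population-size estimates at all, here is a crude back-of-the-envelope sketch using the textbook mutation-drift equilibrium relation H = 4Nμ / (1 + 4Nμ). To be clear, this is a generic illustration with placeholder numbers of my own choosing, not the method or the data behind the 2,250 figure.

```python
# Back-of-the-envelope sketch of how observed genetic diversity constrains
# effective population size, using the textbook mutation-drift equilibrium
# relation H = 4*N*mu / (1 + 4*N*mu). The numbers below are placeholders;
# this is NOT the method (or the data) behind the 2,250 estimate.

def effective_population_size(heterozygosity, mutation_rate):
    """Invert H = 4*N*mu / (1 + 4*N*mu) to solve for N."""
    return heterozygosity / (4.0 * mutation_rate * (1.0 - heterozygosity))

H = 7e-4     # illustrative per-site heterozygosity (placeholder value)
mu = 1.2e-8  # illustrative per-site, per-generation mutation rate (placeholder)

print(f"Implied effective population size: {effective_population_size(H, mu):,.0f}")
# With these placeholder numbers the answer comes out at many thousands of
# individuals - orders of magnitude more than a founding pair of two.
```

The study Strauss cites uses more sophisticated methods, but the basic intuition is the same: the amount of variation we see today is simply too large to have passed through a bottleneck of two people.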

Despite the historical depth and philosophical breadth of Strauss’ analysis, he does eventually stumble. Everything you’ve read thus far is more or less a recapitulation of his take on a microcosm of the modern struggle between the forces of religious moderation and religious fanaticism. Now we get to the meat of things – what I really wanted to address. About two thirds of the way through his piece, Strauss tries to make a point by juxtaposing the opinion of David Coppedge, a former NASA JPL employee and paragon of cognitive dissonance, and Jerry Coyne, an evolutionary biologist and general advocate for reason. Coppedge is cast in the role of the raving religious fanatic, Coyne in that of the strident and dismissive scientific purist. Strauss writes:

Coyne is several magnitudes more rational than Coppedge. Yet, the underlying sentiment of both these statements bother me, in that they suggest a false dichotomy between faith and science—the idea that you can believe in the Bible or you can believe in evolution, but you can’t believe in both.

I think otherwise. Ever since Darwin first published On the Origin of Species, many theologians have reconciled evolution and scripture in ways that are not only elegant but that, in my view, have inspired new ways of thinking that enhance the tenets of existing belief systems for the better.

This is a nice sentiment. Unfortunately, as far as biological evolution is concerned, faith and science are fundamentally irreconcilable. This is true no matter how loosely one chooses to interpret the Bible. The problem is deeper than any question about whether or not a growing mountain of evidence renders a literal interpretation of the Bible untenable. This is because the Bible indisputably paints humanity as the ultimate object of God’s design. Consequently, even the most diplomatic form of theistic evolution construes biological change as a more or less teleological, goal-oriented process. Here, a liberal interpretation of the Bible allows room for evolution, with the caveat that evolution occurs for the express purpose of creating man.

That is simply not how evolution works.

Evolution is a blind process. It has no endgame in mind. In fact, it has no mind. It is a process of extreme contingency, unfolding according to the aggregate effects of the day-to-day exigencies of the struggle to survive and reproduce. A person who thinks that belief in the Bible can be reconciled with acceptance of evolution has, somewhere along the line, stumbled into a profound misunderstanding about the meaning of the former or the consequences of the latter.

Humans are not the pinnacle of creation or the end point of the evolutionary process. Nevertheless, this is exactly what the Bible teaches, irrespective of how one chooses to spin it. Certainly one can believe in the Bible, on one hand, and evolution on the other. But the two views are not amenable to philosophical reconciliation. To espouse both the Bible and evolution is to simultaneously hold explicitly contradictory viewpoints. People can (and frequently do) have conflicting views. Which is perplexing, but fine. Far better that one accepts reality with a sprinkling of superfluous superstition than rejects it altogether. That said, to argue that scripture can be reconciled with the science of evolution (or geology, physics, astronomy, cosmology, archaeology, and so forth) is to adopt an extremely fragile conciliatory stance. It might sound smart to the ears of polite and sophisticated society – it certainly appeals to the lowest common denominator – but wait until the real wind blows.

In the end, the Bryan College story can be boiled down to a theme of ideological conflict – the stubborn traditionalists railing against the forces of progress and discovery. On the surface, it is about the conflict that results from the sort of ideological intransigence that leads one to reject science in favor of ancient superstition. However, there is also something deeper here, and that is the conflict implicit in the attempt to build institutions of higher learning where education is bound by the dictates of religious dogma. Bertrand Russell once wrote that…

“It may be said that an academic institution fulfills its proper function to the extent that it fosters independent habits of mind and a spirit of inquiry free from the bias and prejudices of the moment. In so far as a university fails in this task it sinks to the level of indoctrination.”

Wisdom of the West, 1959

How can a school like Bryan College possibly succeed in this regard? At a school like Bryan, the bounds of inquiry are strictly set. By purportedly divine fiat, there are places one cannot go, things one cannot think. This is made clear in the university charter: think like us or go elsewhere. Better still, the very doctrine of Christianity (as espoused by fundamentalists) can be roughly translated into “agree with us or burn in hell”. Such a philosophy is inimical to the very purpose of higher education. It is nothing short of crude indoctrination – the work of intensely insular minds grasping for company. In that sense, the Bryan College affair isn’t about whether it is best to interpret the Bible literally or metaphorically in light of scientific evidence. It is about an endeavor that is, by its very nature, doomed to fail: building an edifice of higher learning with built-in limits on what is okay to learn.


edit: Billy Bryan pointed out that the Adam and Eve language is not replacing the previous passage. Having confirmed this, I’ve edited the blog to reflect that.