A Celebration of Ignorance and Curiosity

I’ve written a guest post for a site called macrospective.net. Here’s an excerpt:

One of the better expressions of what I’m getting at came from no less august a source than Lt. Commander Data. In the Star Trek: The Next Generation episode “Where Silence Has Lease”, Data put it this way: “The most elementary and valuable statement in science, the beginning of wisdom, is ‘I do not know’.” My argument is not that the groups that have been on the receiving end of my more critical barbs should abandon their explanations and put their trust blindly in science or expert opinion. Rather, they should endeavor to learn about the process of scientific discovery and to uncover the reasons why it is reasonable to put stock in certain claims but not in others. Before I would encourage people to put their trust in science, I would encourage them to recognize the gaps in their knowledge. That is—for example—accepting that they don’t have the tools to say one way or the other whether evolution is the proper account of the origins and diversity of life.

Read more here: “A Celebration of Ignorance and Curiosity”. 

Paleo-Hokum: The Human Tendency to Build Romanticized Versions of the Past

Conservatives often seem gripped by an almost crippling nostalgia for days gone by, idealizing the 1950s as some kind of wholesome social Eden or arguing that the moral strictures concocted by people living 3,000 years ago provide a useful template for how to live in 2014. Occasionally characterized as hallmarks of the conservative disposition, such willfully romanticized, ferociously uncritical views of the past are products of a type of delusional sentimentality wherein one constructs a largely fictitious picture of history and argues that the present should be structured accordingly. The notion that conservatives have a proclivity toward adopting signally imaginative pictures of history is not entirely unfair. Indeed, the Right’s habit of repeatedly attempting to rewrite public school history and science curricula to better match their ideological sensitivities has been well documented. The flaw in this perspective has nothing to do with its veracity. Rather, it comes from the notion that it is a trait unique to conservatives.

Artist’s rendering of a Paleolithic hunt.

Personally, I am more sympathetic to the perspective that this tendency to construct and subsequently fetishize incongruous versions of history is a more broadly human characteristic. Take, for example, the recent “Paleodiet/Paleo Lifestyle” trend. Personal experience suggests that the rank-and-file of the Paleodiet movement consists of left-leaning folks. Unfortunately, such anecdotal evidence makes for a rather wobbly foundation upon which to build broader conclusions, and reliable data on the political demographics of the Paleo movement are hard to come by. So, for the sake of inclusivity, let’s just say for now that the Paleo trend seems to be a load of bullshit just about anyone from anywhere on the political spectrum can get behind. Liberals might find the rhetoric more readily palatable, appealing as it does to their instinctive revulsion regarding all things industrialized, capitalistic, or otherwise offensive to a strong sense of equity. Conservatives – especially those huddled out in the feral intellectual hinterlands of the Far Right – might be somewhat more likely to find Paleo lifestyles unsettling, given their appeal (both implicit and explicit) to the notion that humans have evolved, or their not-so-subtle suggestion that there was a period of history that could be called Paleolithic. After all, reputable scholars have demonstrated that the world is only 6,000 or so years old.

In any event, the exact demographics of the Paleo movement are largely immaterial to my overall point. The core of my argument is that the Paleo diet is based on a nonsensical view of the past, and that such views are not entirely monopolized by people sympathetic to Right Wing ideology – rather, the construction of romanticized versions of the past is a broadly human theme.

To be clear at the outset, I take no position on whether or not this trend is actually healthy. Its health consequences are more or less incidental to the core philosophy. Those who experience health benefits probably do so because it is beneficial to cut back on high-calorie, low-nutrient foods, not because they are eating a diet that more closely resembles the one that humans have “evolved to eat”. And therein lies my primary grievance: it’s not that a Paleo lifestyle is somehow deleterious, it’s that it is based on both a distorted picture of the past and a shoddy understanding of the process of evolution.

The Paleodiet is essentially a hodge-podge of pseudoscience and outright fantasy, concocted more out of imagination than out of any real understanding of the evolutionary history of modern Homo sapiens. It would hardly be unfair to cast it as a modern manifestation of the primitive Eden Jean-Jacques Rousseau invented in the 18th century. Rousseau argued that humanity resided in a state of simple, gentle savagery, until the disease of avarice tore us from our position of grace. Such a rosy view – especially absent empirical evidence – appropriately warrants a hearty dose of incredulity. Nonetheless, Rousseau’s ideas about humanity’s utopian ancestry have proven immensely influential, particularly in the humanities, where researchers have seized upon his narrative as a reflex against the abject barbarity of European colonialism and the pervasive racism that clouded anthropological and sociological thought throughout the 19th and early 20th centuries. It is hard to see some of the more spurious ethnographic work produced by subsequent generations of scholars as anything less than ideologically motivated reactions to colonialism, racism, and “social Darwinism” (a misnomer among misnomers), fashioned in the image of Rousseau’s initial musings.

Jean-Jacques Rousseau, 1712-1778

The Paleodiet is heir to this tradition of romanticizing the primitive. As pioneered by Rousseau, it is – make no mistake – a liberal tradition. It sets itself the task of vilifying those aspects of modernity some people find uncomfortable in light of their ideological disposition by casting them as “unnatural” (more on that shortly). Humans, the argument goes, are the behavioral and physiological progeny of selective pressures that would have been prevalent during the millennia our ancestors spent as bands of nomadic foragers, scratching a living out of the Pleistocene landscape. Our fall from grace began with plant and animal domestication, spiraling toward the cacophonous dénouement that is a modern world populated by Big Macs, Hot Pockets, microwave burritos, and white bread.

In a way, the basic claim is a truism. Modern humans are the product of millions of years of evolution. The vast majority of our genetic architecture was in place well before the invention of agriculture. Similarly, the majority of the selective pressures that would have been significant in shaping the suite of behaviors that distinguish us from our primate relatives would have related to the environment(s) hominids inhabited during the five or so million years of the Plio-Pleistocene. During that time, our hominid ancestors were almost certainly foragers, and definitely didn’t have access to Big Macs. Shouldn’t we be adapted to eat a diet of fruit, grass-fed meat, and tubers, rather than corn-fed, hormone-enriched beef product slathered in special sauce, sandwiched between two gluten-rich buns?

Though such a perspective might have some intuitive appeal, it rests on a number of fallacious assumptions. First, it bespeaks a teleological view of the evolutionary process. Humans, according to the Paleo philosophy, evolved to subsist on a diet typical of the average Paleolithic forager. Once we had reached that stage, selection ceased to operate and our evolution came to a halt. For several reasons, this is a clumsy way of looking at the process of evolution. For one, organisms – humans included – don’t evolve to do anything. Genetic variation is generated in a manner blind to the challenges that will arbitrate its proliferation or eradication. Additionally, such arguments suggest that evolution has a stopping point – that once creatures have become appropriately adapted to their environment, they cease to evolve. The notion that we – or any other creature – evolved toward some end point and stopped once we got there is simply wrong. Some might argue that this is an overly simplistic caricature of the Paleo movement’s core arguments. The more sophisticated Paleo adherents probably don’t think humans have ceased to evolve. Instead, they recognize that evolution typically occurs gradually, so humans haven’t had time to adapt to a post-agricultural – and, more importantly, post-industrial – diet.

Hieroglyphic depiction of agricultural practices.

In a sense, this is true, but it exposes more unfounded assumptions. Most evolutionary change is a product of the gradual accumulation of advantageous mutations. Populations evolve by slow, subtle steps, so the notion that humans might not have experienced a lot of evolutionary change since the advent of agriculture, some 10,000 to 12,000 years ago, is not completely unreasonable. However, there is evidence that some human populations have experienced non-negligible genetic change since the Neolithic Revolution (i.e. the widespread adoption of agriculture as a primary means of subsistence). Adult lactase persistence, for instance, is associated with a pair of single-nucleotide polymorphisms that arose – or were at least selected for – after certain human populations began to consistently engage in ungulate husbandry. The ability some people have to metabolize milk into adulthood is a product of recent evolutionary change, probably due to the fitness gains associated with prolonged access to a new source of caloric energy. So while evolutionary change tends to occur quite slowly, it can still happen rapidly enough to make us better suited to modern diets than the staunchest Paleo advocates would have us believe.
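
To get a feel for just how fast “slow” can be, here is a back-of-the-envelope sketch using the textbook deterministic model of genic selection. The starting frequency, selection coefficient, and generation time are illustrative assumptions of mine (roughly in the range researchers have discussed for lactase persistence), not figures from any particular study:

```python
# A minimal sketch of the textbook deterministic genic-selection model:
# an allele with relative fitness (1 + s) changes in frequency each
# generation as p' = p(1 + s) / (1 + p*s). All parameters below are
# illustrative assumptions, not estimates from any particular study.

def generations_until(p, s, target):
    """Count generations for allele frequency p to reach `target`
    under genic selection with coefficient s."""
    gens = 0
    while p < target:
        p = p * (1 + s) / (1 + p * s)  # frequency after one generation
        gens += 1
    return gens

gens = generations_until(p=0.001, s=0.05, target=0.5)
print(f"~{gens} generations, or ~{gens * 25:,} years at 25 years/generation")
```

With even a modest 5% fitness advantage, a rare allele reaches majority frequency in roughly 140 generations – about 3,500 years – comfortably inside the window since the Neolithic Revolution.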

Consider also that plant and animal domestication did not occur as suddenly as is popularly conceived. The long dance of coevolution that is domestication began long before the Neolithic Revolution, as humans began to interact with their ecological neighbors in more and more complex ways. The apparent suddenness with which agriculture became ubiquitous in places like Mesopotamia and Mesoamerica is a result of a cascade of innovations at the tail end of a longer process of give and take. Prior to building irrigation systems and tilling fields, humans were likely setting up camp near useful perennials and annuals, actively encouraging their growth through more subtle types of landscape modification. Later, individuals began to engage in broadcast sowing, actively distributing the seeds of useful plants around productive and accessible stretches of land. Modern wheat, corn, cows, and chickens may be recent innovations, but we’ve been eating their ancestors for millennia.

Furthermore, the very notion that humans are specifically adapted to a certain diet is fallacious on at least two fronts. First, it ignores the extraordinary phenotypic plasticity Homo sapiens displays. We are a species marked by a remarkable range of behavioral and physiological flexibility. To suggest we are adapted, even in the broadest possible sense, to a particularly well-bounded lifestyle ignores one of the primary components of our nature. Paleoecological evidence indicates the Plio-Pleistocene was a time of considerable environmental variability. Under such circumstances, a rigid dietary regime would have been hard to maintain. Indeed, the overall trend of hominid dietary evolution seems to be one of increasing generality, with successive generations becoming better and better suited to eating a wider and wider variety of plants and animals.

A second, but not unrelated, point is that we do not have a perfect picture of what prehistoric diets really consisted of. Even if humans were adapted to eating a certain range of forager staples, exactly what those staples were remains cloudy. More than likely, our prehistoric diet was frequently dictated more by what we could physically capture and metabolize than by some idealized set of nutritional guidelines. Basic behavioral ecology would suggest most species, including humans, will preferentially target those food items that yield the largest caloric returns relative to the costs associated with capture and/or collection. That means – and both archaeological and ethnographic evidence bears this out – that what people ate likely varied from ecosystem to ecosystem. Homo sapiens was a more or less global species before the dawn of institutionalized agriculture. Different populations occupying a wide range of environments seemed to get by just fine by exploiting an expansive variety of food items. In the high Arctic, forager populations subsisted (and still do) on a diet massively biased toward animal protein, while groups living in the verdant tropics incorporated (and still do) myriad fruits and nuts, in addition to a greater diversity of animal protein, into their diet. Given this apparent versatility, the argument that there is one diet humans are designed to eat looks somewhat less than convincing.
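
The logic at work here is the classic prey-choice reasoning from optimal foraging theory: rank food items by calories gained per unit of handling cost, and the “best” menu simply falls out of whatever the local ecosystem happens to offer. A toy sketch of that ranking (all items and numbers are hypothetical, chosen only to illustrate the idea):

```python
# A toy illustration of prey-choice ranking from optimal foraging theory:
# items are ranked by profitability, i.e. caloric return per unit of
# handling cost. All items and numbers here are hypothetical.

food_items = {
    # item: (kcal per item captured, handling time in minutes)
    "shellfish":  (300, 15),
    "tubers":     (900, 60),
    "nuts":       (600, 45),
    "small game": (1500, 90),
}

# Sort by kcal per minute of handling, highest first.
ranked = sorted(food_items.items(),
                key=lambda kv: kv[1][0] / kv[1][1],
                reverse=True)

for item, (kcal, minutes) in ranked:
    print(f"{item:>10}: {kcal / minutes:5.1f} kcal/min")
```

Swap in a different ecosystem’s items and the ranking changes, which is the point: there is no single menu the model singles out as “the” human diet.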

Paleo Lifestyle rhetoric is littered with appeals to nature, as if we’ve somehow lost our way and become something abhorrent before the eyes of the All-Knowing Universe. Advocates of this lifestyle have set up an arbitrary demarcation between natural and unnatural, as if the innovations associated with agriculture mark a frontier beyond which humans suddenly began to behave “unnaturally”. This sets up a false dichotomy between the types of behavior we engage in today and the types of behavior our ancestors engaged in during the largely invisible past. Eating Hot Pockets and watching YouTube videos may accurately be considered behavioral novelties in the broad scope of human evolution. But it is not reasonable to infer from said novelty that such behaviors are somehow unnatural.

On a purely philosophical level, the very idea of unnatural behavior subverts the Paleo advocate’s attempts to find an evidentiary basis for their dietary choices, implying as it does a certain level of mysticism. Methodological naturalism – the primary scaffolding around which all scientific research is constructed – denies the very possibility of anything “unnatural” occurring. How could it? Nothing can or ever will transpire in violation of what we might loosely call the Laws of Nature. To suggest otherwise is a direct invocation of the supernatural or at the very least a crude argument for some kind of vitalism, both of which strain scientific credulity and provide sufficient room for motivated advocates to weasel around unwelcome dispositive evidence. Either way, it boils down to a pile of fluffy nonsense.

Brass tacks, the Paleo Lifestyle is based on a romanticized version of the past. Like the conservative fantasies of 1950s suburban utopia or Wild West individualism, it is a canvas onto which people project their dissatisfaction with the present, crying, “if only things were thus…” Perhaps such fantasies provide a star by which people reckon their course, assuaging their fears that there might not be a right way to live. After all, many people probably find the idea that humanity has lost its way more comforting than the idea that it never really had one to begin with.

Religious Moderation vs. Religious Fanaticism vs. Science

Over at io9, Mark Strauss has written a nice piece cataloging the brouhaha underway at Bryan College, a Christian school in Dayton, Tennessee. The dispute revolves around a change to the wording of the college charter. The previous version was plenty nonsensical, but apparently the board of trustees wanted to make their pro-hokum position a little more rigid. Consequently, a charter that once read:

“that the origin of man was by fiat of God in the act of creation as related in the Book of Genesis; that he was created in the image of God; that he sinned and thereby incurred physical and spiritual death;”

now carries the addendum:

“We believe that all humanity is descended from Adam and Eve. They are historical persons created by God in a special formative act, and not from previously existing life forms.”

Both phrases mean approximately the same thing, but the addition of the clarification concerning the historical veracity of Adam and Eve strikes a stricter tone, eliminating all room for “Bible as metaphor” apologetics. As Strauss points out, the altered wording gets right to the heart of one of the primary hurdles preventing Christian fundamentalists from accepting biological evolution: if Adam and Eve are not the literal progenitors of all mankind, then there is no “original sin”, and – here is the critical point – if there is no original sin, there is no reason for God to send his only begotten son, Jesus Christ, to learn carpentry and die for our sins. Of course, even in the absence of empirical contradictions, this story doesn’t make a lot of sense, but that’s not the point. To hardcore believers, the Biblical creation story is a literal recounting of actual events. For them, the story of Adam and Eve is the linchpin of their religious beliefs.

Strauss’ take on the whole affair is thoughtful and lucidly written. According to Strauss, the change in wording and subsequent schism can be partially traced to the rise of genomics. New tools have increased the resolution and fidelity of genetic research, allowing researchers to both ask and answer important questions about human ancestry. Unsurprisingly, the resulting accumulation of evidence argues strongly for a human ancestry that is ancient and shared. More to the point, it argues strongly against a literal interpretation of the Biblical story of Adam and Eve. A recent study (published last year) conservatively estimated that the minimum population size needed to account for the genetic diversity of modern humans is 2,250. Consequently, a hypothetical two-person gene pool would fail to account for modern human genetic diversity by roughly three orders of magnitude.

Despite the historical depth and philosophical breadth of Strauss’ analysis, he does eventually stumble. Everything you’ve read thus far is more or less a recapitulation of his take on a microcosm of the modern struggle between the forces of religious moderation and religious fanaticism. Now we get to the meat of things – what I really wanted to address. About two thirds of the way through his piece, Strauss tries to make a point by juxtaposing the opinions of David Coppedge, a former NASA JPL employee and paragon of cognitive dissonance, and Jerry Coyne, an evolutionary biologist and general advocate for reason. Coppedge is cast in the role of the raving religious fanatic, Coyne in that of the strident and dismissive scientific purist. Strauss writes:

Coyne is several magnitudes more rational than Coppedge. Yet, the underlying sentiment of both these statements bother me, in that they suggest a false dichotomy between faith and science—the idea that you can believe in the Bible or you can believe in evolution, but you can’t believe in both.

I think otherwise. Ever since Darwin first published On the Origin of Species, many theologians have reconciled evolution and scripture in ways that are not only elegant but that, in my view, have inspired new ways of thinking that enhance the tenets of existing belief systems for the better.

This is a nice sentiment. Unfortunately, as far as biological evolution is concerned, faith and science are fundamentally irreconcilable. This is true no matter how loosely one chooses to interpret the Bible. The problem is deeper than any question about whether or not a growing mountain of evidence renders a literal interpretation of the Bible untenable. This is because the Bible indisputably paints humanity as the ultimate object of God’s design. Consequently, even the most diplomatic form of theistic evolution construes biological change as a more or less teleological, goal-oriented process. Here, a liberal interpretation of the Bible allows room for evolution, with the caveat that evolution occurs for the express purpose of creating man.

That is simply not how evolution works.

Evolution is a blind process. It has no endgame in mind. In fact, it has no mind. It is a process of extreme contingency, unfolding according to the aggregate effects of the day-to-day exigencies of the struggle to survive and reproduce. A person who thinks that belief in the Bible can be reconciled with acceptance of evolution has, somewhere along the line, stumbled into a profound misunderstanding about the meaning of the former or the consequences of the latter.

Humans are not the pinnacle of creation or the end point of the evolutionary process. Nevertheless, this is exactly what the Bible teaches, irrespective of how one chooses to spin it. Certainly one can believe in the Bible, on the one hand, and evolution on the other. But the two views are not amenable to philosophical reconciliation. To espouse both the Bible and evolution is to simultaneously hold explicitly contradictory viewpoints. People can (and frequently do) have conflicting views. Which is perplexing, but fine. Far better that one accepts reality with a sprinkling of superfluous superstition than rejects reality altogether. That said, to argue that scripture can be reconciled with the science of evolution (or geology, physics, astronomy, cosmology, archaeology, and so forth) is to adopt an extremely fragile conciliatory stance. It might sound smart to the ears of polite and sophisticated society – it certainly appeals to the lowest common denominator – but wait until the real wind blows.

In the end, the Bryan College story can be boiled down to a theme of ideological conflict – the stubborn traditionalists railing against the forces of progress and discovery. On the surface, it is about the conflict that results from the sort of ideological intransigence that leads one to reject science in favor of ancient superstition. However, there is also something deeper here, and that is the conflict implicit in the attempt to build institutions of higher learning where education is bound by the dictates of religious dogma. Bertrand Russell once wrote that…

“It may be said that an academic institution fulfills its proper function to the extent that it fosters independent habits of mind and a spirit of inquiry free from the bias and prejudices of the moment. In so far as a university fails in this task it sinks to the level of indoctrination.”

Wisdom of the West, 1959

How can a school like Bryan College possibly succeed in this regard? At a school like Bryan, the bounds of inquiry are strictly set. By purportedly divine fiat, there are places one cannot go, things one cannot think. This is made clear in the college charter: think like us or go elsewhere. Better still, the very doctrine of Christianity (as espoused by fundamentalists) can be roughly translated into “agree with us or burn in hell”. Such a philosophy is inimical to the very purpose of higher education. It is nothing short of crude indoctrination – the work of intensely insular minds grasping for company. In that sense, the Bryan College affair isn’t about whether it is best to interpret the Bible literally or metaphorically in light of scientific evidence. It is about an endeavor that is, by its very nature, doomed to fail: building an edifice of higher learning with built-in limits on what is okay to learn.


edit: Billy Bryan pointed out that the Adam and Eve language is not replacing the previous passage. Having confirmed this, I’ve edited the blog to reflect that.