The New York Times and The Chronicle today both picked up the report in Science about the discovery of an archaeological site northwest of Austin that dates to between 13,200 and 15,000 years ago. It appears significantly older than the Paleo-Indian culture called Clovis that archaeologists once thought represented the first human occupation of North America. The Clovis culture, known for its distinctively fluted projectile points, offered a relatively simple narrative of big-game hunters arriving from Siberia and moving quickly across the continent to exploit the rich opportunities, especially the abundance of mammoths.
The Clovis projectile points were first scientifically excavated (near Clovis, New Mexico) in the 1930s and were recognized as especially ancient artifacts. For a long time, nothing else turned up as unambiguously older, and New World archaeologists gradually settled on the “Clovis-first” hypothesis as the best fit with the data. There were, however, dissenters within the discipline—and herein lies the part of the story that I think bears on broader issues in academe.
“Clovis-first” began as a well-framed scientific hypothesis. It fit the available facts and it was open to revision if new facts contradicting it came to light. But it is seemingly hard for people, even empirical scientists, to live with provisionality, and in time “Clovis-first” hardened into a dogma. That became clear as more and more evidence accumulated of pre-Clovis artifacts and sites. Each time some excavation turned up artifacts and other evidence of pre-Clovis Americans, the guardians of the Clovis-first orthodoxy would announce that they saw flaws in the methodology, sloppiness in the analysis, and if all else failed they would invoke the principle that “extraordinary claims require extraordinary proof,” as though there were something inherently extraordinary about human beings arriving in the New World before the Clovis culture took off.
The accumulation of pre-Clovis evidence gradually wore down the old Clovis-first consensus among New World prehistorians. Probably the decisive event was the 1975 discovery of the Monte Verde site in Chile, which eventually yielded a child’s footprint in a stratum dated to between 13,800 and 14,800 years ago. Monte Verde pushed a lot of archaeologists still lingering in support of Clovis-first into rethinking their positions. And the recognition of the validity of the Monte Verde site brought new attention to other sites that had been put forward as pre-Clovis. Nonetheless, some Clovis-firsters stuck to the old doctrine.
The new Buttermilk Creek, Tex., findings ought in principle to put an end to the debate. The artifacts come from an undisturbed layer of sediments beneath a layer of Clovis artifacts, and have yielded, across multiple samples (49 of them), a consistent set of dates older than Clovis. Nonetheless, some stalwarts of Clovis-first aren’t giving up. The Times quotes James M. Adovasio, a professor of archaeology at Mercyhurst College, noting that Clovis-first has been “dying a slow death.” Of the archaeologists who still support it, Adovasio observes, “The last spear carriers will die without changing their minds.”
“Spear carriers” is a nice touch.
The phenomenon of established scientists becoming over-committed to a favored theory has been noted many times before—sometimes as a knock against the idea that science is self-correcting. Thomas Kuhn’s endlessly quoted account in The Structure of Scientific Revolutions (1962) of how “paradigms” are challenged and replaced might be invoked here, except that it would be a stretch to call the Clovis-first hypothesis a paradigm. It was never really anything more than an empirical generalization, and it required no radical new methods or concepts to overthrow it—only new data.
But with what cunning obstinacy we academics can resist facts when they fail to suit us!
The Clovis-first dead-enders are only a particular case of a common ailment in American higher education. I refer to our collective efforts to conserve against all odds hypotheses about academe that have long since met the equivalent of their Monte Verde moments:
- The overwhelming preponderance of liberals in higher education continues because, in fair-minded open competitions for academic positions, liberals just happen to win out.
- The undergraduate degree is the best investment the average student can make after high school.
- The federal government’s creation of Title IV student loans and other forms of support has made college more affordable.
- Programs of study based on group grievance and political advocacy have enriched the college curriculum.
- Making “diversity” a supervening value of higher education has opened access and fostered opportunity for those who would otherwise be excluded.
Every one of these items has accumulated its own body not just of dissenting opinion but of substantial countervailing evidence: the child’s footprints in the ashes; the artifacts buried in the clay where the theory says they cannot be.
Why are these doctrines so impervious to criticism? In truth, I don’t think they really are. As with the Clovis-first hypothesis, the doubters are slowly making their way, less by trying to convince those who have a professional stake in the status quo and more by going over their heads to the broader community of educated and intelligent people, who have nothing invested in the doctrine per se. Still, it would be better for academe if it devoted less time and energy to propping up failed hypotheses and more to putting things on a better track.
This article was originally posted on the Chronicle of Higher Education's Innovations blog.