Adventures in Neurohumanities | The Nation



[Image: brain scan, from Libertas Academica]
At Stanford University in 2012, a young literature scholar named Natalie Phillips oversaw a big project: a new way of studying the nineteenth-century novelist Jane Austen. No surprise there—Austen, a superstar of English literature and the inspiration for an endless array of Hollywood and BBC productions based on her work, has been the subject of thousands of scholarly papers. 

About the Author

Alissa Quart
Alissa Quart, the author of The Republic of Outsiders and Branded, is the editor for the nonprofit Economic Hardship...


But the Stanford study was different. Phillips used a functional magnetic resonance imaging (fMRI) machine to track the blood flow of readers’ brains when they read Mansfield Park. The subjects—mostly graduate students—were asked to skim an excerpt and then read it closely. The results were part of a study on reading and distraction.

The “neuro novel” story was quickly picked up by the mainstream media, from NPR to The New York Times. But the Austen project wasn’t merely a clever one-off—the brainchild, so to speak, of one imaginatively interdisciplinary scholar. And it wasn’t just the result of ambitious academics crossing brain science with “the marriage plot” in unholy matrimony simply to grab headlines. The Stanford study reflects a real trend in the humanities. At Yale University, Lisa Zunshine, now a literature scholar at the University of Kentucky, was part of a research team that studied modernist authors using fMRI, also in order to better understand reading. Rather than working in a cramped office or library carrel, the researchers got to use the Haskins Laboratory in New Haven, with funding from the Teagle Foundation, to carry out their project, in which twelve participants were given texts of higher and lower complexity and had their brains monitored.

Duke and Vanderbilt universities now have neuroscience centers with specialties in humanities hybrids, from “neurolaw” onward: Duke has a Neurohumanities Research Group and even a neurohumanities abroad program. The money is serious as well. Semir Zeki, a neuroaesthetics specialist—that is, neuroscience applied to the study of visual art—was the recipient of a £1 million grant in the United Kingdom. And there are conferences aplenty: in 2012, you could have attended the aptly named Neuro-Humanities Entanglement Conference at Georgia Tech. 

Neurohumanities has been positioned as a savior of today’s liberal arts. The Times is able to ask “Can ‘Neuro Lit Crit’ Save the Humanities?” because of the assumption that literary study has descended into cultural irrelevance. Neurohumanities, then, is an attempt to provide the supposedly loosey-goosey art and lit crowds with the metal spines of hard science. 

The forces driving this phenomenon are many. Sure, it’s the result of scientific advancement. It’s also part of an interdisciplinary push into what is broadly termed the digital humanities, and it can be seen as offering an end run around intensifying funding challenges in the humanities. As Columbia University historian Alan Brinkley wrote in 2009, the historic gulf between funding for science and engineering on the one hand and the humanities on the other is “neither new nor surprising. What is troubling is that the humanities, in fact, are falling farther and farther behind other areas of scholarship.”

Neurohumanities offers a way to tap the popular enthusiasm for science and, in part, gin up more funding for humanities. It may also be a bid to give more authority to disciplines that are more qualitative and thus are construed, in today’s scientized and digitalized world, as less desirable or powerful. Deena Skolnick Weisberg, a Temple University postdoctoral fellow in psychology, wrote a 2008 paper titled “The Seductive Allure of Neuroscience Explanations,” in which she argued that the language of neuroscience affected nonexperts’ judgment, impressing them so much that they became convinced that illogical explanations actually made sense. Similarly, combining neuroscience with, say, the study of art nowadays can seem to offer an instant sheen of credibility.

But neurohumanities is also the result of something else. Neuroscience appears to be filling a vacuum where a single dominant mode of thought and criticism once existed. That plinth has been held in the American academy by critical theory, neo-Marxism and psychoanalysis. Alva Noë, a University of California, Berkeley, philosopher who might be called a “neuro doubter,” sees neurohumanities as a reaction to the previous postmodern moment. “The pre-eminence of neuroscience” has legitimated an “anti-theory stance” within the humanities, says Noë, the author of Out of Our Heads.

Noë argues that neurohumanities is the ultimate response to—and rejection of—critical theory, a mixture of literary theory, linguistics and anthropology that dominated the American humanities through the 1990s. Critical theory’s current decline was somewhat inevitable, as all intellectual movements erode over time. This was exemplified by the so-called Sokal affair in 1996, in which a physics professor named Alan Sokal submitted a hoax theoretical paper on science to Social Text, only to unmask himself and lambaste the theorists who accepted and published his piece as not understanding the science. Another clear public repudiation was the harsh Times obituary in 2004 of the philosopher Jacques Derrida, who was dubbed an “abstruse theorist”—in the obit’s headline, no less. But as critical theory’s power—along with that of Marxism and Freudianism—fades within the humanities, neurohumanities and literary Darwinism are stepping up, ready to explain how we live, love art and read a novel (or rather, how the cortex absorbs text). And while much was gained as “the brain” replaced “individual psychology” or social class readings, much has also been lost.

Critical theory offered us the fantasy that we have no control, making a fetish of haze and ambiguity and exhibiting what Noë terms “an allergy to anything essentialist.” In neurohumanities, by contrast, we do have mastery and concrete, empirical ends, which has proved more appealing, even as (or perhaps because) it is highly reductive. At least since George H.W. Bush declared the 1990s the decade of the brain, the media have been flooded with simplistic empirical answers to many of life’s questions. Neuroscience is now the favored method for explaining almost every element of human behavior. President Obama recently proposed an initiative called Brain Research Through Advancing Innovative Neurotechnologies, or BRAIN, to be modeled on the Human Genome Project. The aim is to create the first full model of brain circuitry and function. Scientists are hoping that BRAIN will be as successful (and as well funded) as the Human Genome Project turned out to be. 

* * *
