Research support for this article was provided by the Investigative Fund of The Nation Institute.
The accident happened during the construction of a railroad in Vermont, in 1848, and it happened fast: A three-foot-long tamping iron sparked an explosion, shot skyward and sailed through the frontal cortex of the project’s foreman, Phineas Gage. Gage, famously, got a whole new personality, and students of the brain got perhaps their most iconic case study. In transforming Gage from the amiable and responsible person he had been before the accident to the temperamental and bawdy one he became after, the iron bar also drilled a hole in Cartesian dualism, the intuitive distinction we all make between our minds and our brains. As the foreman had the misfortune to demonstrate, altering the physical brain can alter personality, behavior, mood–virtually everything we think of as constituting our essential (and incorporeal) self.
Thankfully, scientifically fruitful construction accidents happen only so often, and brain research has traditionally been hamstrung by ethical constraints on experimenting with human subjects. In recent years, however, scientists have developed minimally invasive and comparatively benign techniques for exploring–and altering–the brain. Like advances in genetics (another field that investigates the biological substrata of selfhood), these developments raise significant philosophical, legal and ethical issues. Yet while genetics has spawned a robust watchdog industry, complete with academic departments, annual conferences and dedicated funding, neuroscience currently receives far less scrutiny.
Ultimately, though, neuroscience may raise even more troubling ethical issues, for the simple reason that it is easier to predict and control behavior by manipulating neurons than by manipulating genes. Even if all ethical and practical constraints on altering our DNA vanished tomorrow, we’d have to wait for years (or decades) to see the outcome of genetic experiments–and all the while environmental factors would confound our tinkering. Intervening on the brain, by contrast, can produce startlingly rapid results, as anyone knows who has ever downed too many margaritas or, for that matter, too many chocolate-covered coffee beans.
Caffeine and tequila are helpful reminders that, one way or another, we have been meddling with our brains since time immemorial. But the latest developments in neuroscience are sufficiently novel–different from coffee, and also different from cloning–to require a rethinking of both personal and social ethics. Broadly speaking, these developments can be divided into technologies that seek to map the brain and those that seek to alter it.