Humanism, Science, and the Radical Expansion of the Possible

Why we shouldn’t let neuroscience banish mystery from human life.

Marilynne Robinson

October 22, 2015

Marilynne Robinson (Kelly Ruth Winter)

Humanism was the particular glory of the Renaissance. The recovery, translation, and dissemination of the literatures of antiquity created a new excitement, displaying so vividly the accomplishments and therefore the capacities of humankind, with consequences for civilization that are great beyond reckoning.

This essay is excerpted from The Givenness of Things, to be published by Farrar, Straus and Giroux on October 27, 2015. © Marilynne Robinson.

The disciplines that came with this awakening, the mastery of classical languages, the reverent attention to pagan poets and philosophers, the study of ancient history, and the adaptation of ancient forms to modern purposes, all bore the mark of their origins yet served as the robust foundation of education and culture for centuries, until the fairly recent past. In muted, expanded, and adapted forms, these Renaissance passions live on among us still in the study of the humanities, which, we are told, are now diminished and threatened. Their utility is in question, it seems, despite their having been at the center of learning throughout the period of the spectacular material and intellectual flourishing of Western civilization. Now we are less interested in equipping and refining thought, more interested in creating and mastering technologies that will yield measurable enhancements of material well-being—for those who create and master them, at least. Now we are less interested in the exploration of the glorious mind, more engrossed in the drama of staying ahead of whatever it is we think is pursuing us. Or perhaps we are just bent on evading the specter of entropy. In any case, the spirit of the times is one of joyless urgency, many of us preparing ourselves and our children to be means to inscrutable ends that are utterly not our own. In such an environment, the humanities do seem to have little place. They are poor preparation for economic servitude. This spirit is not the consequence but the cause of our present state of affairs. We have as good grounds for exulting in human brilliance as any generation that has ever lived.

The antidote to our gloom is to be found in contemporary science. This may seem an improbable stance from which to defend the humanities, and I do not wish to undervalue contemporary art or literature or music or philosophy. But it is difficult to recognize the genius of a period until it has passed. Milton, Bach, Mozart all suffered long periods of eclipse, beginning before their lives had ended. Our politics may appear in the light of history to have been filled with triumphs of statecraft, unlikely as this seems to us now. Science, on the other hand, can assert credible achievements and insights, however tentative, in present time. The last century and the beginning of this one have without question transformed the understanding of Being itself. “Understanding” is not quite the right word, since this mysterious old category, Being, fundamental to all experience past, present, and to come, is by no means understood. However, the terms in which understanding may, at the moment, be attempted have changed radically, and this in itself is potent information. The phenomenon called quantum entanglement, relatively old as theory and thoroughly demonstrated as fact, raises fundamental questions about time and space, and therefore about causality.

Particles that are “entangled,” however distant from one another, undergo the same changes simultaneously. This fact challenges our most deeply embedded habits of thought. To try to imagine any event occurring outside the constraints of locality and sequence is difficult enough. Then there is the problem of conceiving of a universe in which the old rituals of cause and effect seem a gross inefficiency beside the elegance and sleight of hand that operate discreetly beyond the reach of all but the most rarefied scientific inference and observation. However pervasive and robust entanglement is or is not, it implies a cosmos that unfolds or emerges on principles that bear scant analogy to the universe of common sense. It is abetted in this by string theory, which adds seven unexpressed dimensions to our familiar four. And, of course, those four seem suddenly tenuous when the fundamental character of time and space is being called into question. Mathematics, ontology, and metaphysics have become one thing. Einstein’s universe seems mechanistic in comparison. Newton’s, the work of a tinkerer. If Galileo shocked the world by removing the sun from its place, so to speak, then this polyglot army of mathematicians and cosmologists who offer always new grounds for new conceptions of absolute reality should dazzle us all, freeing us at last from the circle of old Urizen’s compass. But we are not free.

There is no art or discipline for which the nature of reality is a matter of indifference, so one ontology or another is always being assumed if not articulated. Great questions may be as open now as they have been since Babylonians began watching the stars, but certain disciplines are still deeply invested in a model of reality that is as simple and narrow as ideological reductionism can make it. I could mention a dominant school of economics with its anthropology. But I will instead consider science of a kind. The study of brain and consciousness, mind and self—associated with so-called neuroscience—asserts a model of mental function as straightforward, causally speaking, as a game of billiards, and plumes itself on just this fact. It is by no means entangled with the sciences that address ontology. The most striking and consequential changes in the second of these, ontology, bring about no change at all in the first, neuroscience, either simultaneous or delayed. The gist of neuroscience is that the adverbs “simply” and “merely” can exorcise the mystifications that have always surrounded the operations of the mind/brain, exposing the machinery that in fact produces emotion, behavior, and all the rest. So while inquiries into the substance of reality reveal further subtleties, idioms of relation that are utterly new to our understanding, neuroscience tells us that the most complex object we know of, the human brain, can be explained sufficiently in terms of the activation of “packets of neurons,” which evolution has provided the organism in service to homeostasis. The amazing complexity of the individual cell is being pored over in other regions of science, while neuroscience persists in declaring the brain, this same complexity vastly compounded, an essentially simple thing. If this could be true, if this most intricate and vital object could be translated into an effective simplicity for which the living world seems to provide no analogy, this indeed would be one of nature’s wonders.

* * *

Neuroscience has, as its primary resource, technology that captures images of processes within the living brain. Fear lights up a certain area, therefore fear is a function of that area, which developed for the purposes of maintaining homeostasis. It prepares the organism to fight or flee. Well and good. But fear is rarely without context. People can be terrified of spiders, dentists, the Last Judgment, germs, the need to speak in public, the number 13, extraterrestrials, mathematics, hoodies, the discovery of a fraud in their past. All of these fears are the creatures of circumstance, of the history and state of health of a specific brain. They identify threat, interpreting an environment in highly individual terms. They, not threat in the abstract, trigger alarm, and they are the products of parts of the brain that do not light up under technological scrutiny and would elude interpretation if they did. If they are not taken into account, the mere evidence of an excitation has little descriptive and no predictive value. A fearful person might take a pill, faint, or commit mayhem. The assumptions behind the notion that the nature of fear and the impulses it triggers could be made legible or generalizable for the purposes of imaging would have to exclude complexity—the factor that introduces individuality with all its attendant mysteries. In fairness, however, the neuroscientists seem well content with the technology they have, extrapolating boldly from the data it yields. Refinements that introduced complication might not be welcome.

This all appears to be a straightforward instance of scientists taking as the whole of reality that part of it their methods can report. These methods are as much a matter of vocabulary as of technology, though the two interact and reinforce each other. Here is an example. Neuroscientists seem predisposed to the conclusion that there is no “self.” This would account for indifference to the modifying effects of individual history and experience, and to the quirks of the organism that arise from heredity, environment, interactions within the soma as a whole, and so on. What can the word “self” mean to those who wish to deny its reality? It can only signify an illusion we all participate in, as individuals, societies, and civilizations. So it must also be an important function of the brain, the brain aware of itself as it is modified by the infinite particulars of circumstance, that is, as it is not like others. But this would mean the self is not an illusion at all but a product of the mind at other work than the neuroscientists are inclined to acknowledge. Of course, the physical brain is subject to every sort of impairment, the areas that light up during imaging as surely as any others. Impairments that seem to compromise the sense of self may be taken to demonstrate that it is rooted in the physical brain, that same fleshly monument to provident evolution the neuroscientists admire, selectively. If the physical disruption of the sense of self is taken to prove that the self is an experience created by the physical brain, then there are no better grounds to call its existence into question than there would be to question equilibrium or depth perception. Obviously, there is a conceptual problem here—equilibrium does not “exist” except in the moment-to-moment orientation of an organism to its environment. Say as much of the self, mutatis mutandis, and it is granted the same kind of reality.

* * *

But to take a step back. It is absurd for scientists who insist on the category “physical,” and who argue that outside this category nothing exists, to dismiss the reality of the self on the grounds that its vulnerabilities can be said to place it solidly within this category. How can so basic an error of logic survive and flourish? There is a certain Prometheanism in this branch of science that would rescue us mortals from entrenched error—for so it sees the problem of making its view of things persuasive. For this reason—because questions might seem a betrayal of science as rescuer—its tenets enjoy a singular immunity from the criticism of peers. And its proponents feel confirmed by doubts and objections on the same grounds, that their origins and motives can be taken to lie in a hostility to science. On scrutiny, the physical is as elusive as anything to which a name can be given. The physical as we have come to know it frays away into dark matter, antimatter, and by implication on beyond them and beyond our present powers of inference. But for these scientists, it is a business of nuts and bolts, a mechanics of signals and receptors of which no more need be known. Their assertions are immune to objection and proof against information. One they dismiss and the other they ignore.

The real assertion being made in all this (neuroscience is remarkable among the sciences for its tendency to bypass hypothesis and even theory and go directly to assertion) is that there is no soul. Only the soul is ever claimed to be nonphysical, therefore immortal, therefore sacred and sanctifying as an aspect of human being. It is the self but stands apart from the self. It suffers injuries of a moral kind, when the self it is and is not lies or steals or murders, but it is untouched by the accidents that maim the self or kill it. Obviously, this intuition—it is much richer and deeper than anything conveyed by the word “belief”—cannot be dispelled by proving the soul’s physicality, from which it is aloof by definition. And on these same grounds, its nonphysicality is no proof of its nonexistence. This might seem a clever evasion of skepticism if the character of the soul were not established in remote antiquity, in many places and cultures, long before such a thing as science was brought to bear on the question.

I find the soul a valuable concept, a statement of the dignity of a human life and of the unutterable gravity of human action and experience. I would add that I find my own soul interesting company, if this did not seem to cast doubt on my impeccable objectivity. This is not entirely a joke. I am not prepared to concede objectivity to the arbitrarily reductionist model of reality that has so long claimed, and been granted, this virtue. The new cosmologies open so many ways of reconceiving the universe(s) that all sorts of speculations are respectable now. We might have any number of other selves. If most or all these speculations are only flaunting new definitions of the possible, the exercise is valuable and necessary. Possibility has been captive to a narrow definition for a very long time, ourselves with it, and we must expect to blink in the light. These new cosmologies preclude almost nothing, except “the physical” as a special category. The physicality enshrined by neuroscientists as the measure of all things is not objectivity, but instead a pure artifact of the scale at which, and the means by which, we and our devices perceive. So to invoke it as the test and standard of reality is quintessentially anthropocentric.

I am content to place humankind at the center of Creation. We are complex enough, interesting enough. What we have learned, limited as we must assume it to be, is wonderful even in the fact of its limitations. This is no proof, of course. Be that as it may. It is not anthropocentricity that is a problem here, but the fact that it is unacknowledged and misapplied, and all the while imputed to the other side of the controversy, as if it were, eo ipso, a flagrant error. The objectivity claimed by neuroscience implies that it is free of this bias. Yet there could be no more naive anthropocentricity than is reflected in the certainty and insistence that what we can know about the nature of things at this moment makes us capable of definitive judgments about much of anything. That we have come to this place is not a failure of science but a glorious achievement, the continuous opening of insights that science itself could never have anticipated. Nothing can account for the reductionist tendencies among neuroscientists except a lack of rigor and consistency, a loyalty to conclusions that are prior to evidence and argument, and an indifference to science as a whole.

This kind of criticism is conventionally made of religion. I am not attempting some sort of rhetorical tae kwon do, to turn the attack against the attacker. My point is simply that neuroscience, at least in its dominant forms, greatly overreaches the implications of its evidence and is tendentious. Its tendency is to insist on the necessity of a transformation of our conception of human nature—to make it consistent with a view of reality that it considers clear-eyed and tough-minded, therefore rational and true. Its ultimate argument seems to be that we all really know better than to subscribe to the mythic foolery that sustains us in a lofty estimation of ourselves and our kind. The evidence it offers is secondary to this conclusion and inadequate to it, because it is based in a simplistic materialism that is by now a nostalgia. The profound complexity of the brain is an established fact. The depiction of a certain traffic of activation in it can only understate its complexity. One might reasonably suspect that the large and costly machines that do the imaging are very crude tools whose main virtue is that they provide the kind of data their users desire and no more.

Is it fair to say that this school of thought is directed against humanism? This seems on its face to be true. The old humanists took the works of the human mind—literature, music, philosophy, art, and languages—as proof of what the mind is and might be. Out of this has come the great aura of brilliance and exceptionalism around our species that neuroscience would dispel. If Shakespeare had undergone an MRI, there is no reason to believe there would be any more evidence of extraordinary brilliance in him than there would be of a self or a soul. He left a formidable body of evidence that he was both brilliant and singular, but it has fallen under the rubric of Renaissance drama and is somehow not germane, perhaps because this places the mind so squarely at the center of the humanities. From the neuroscientific point of view, this only obscures the question. After all, where did our high sense of ourselves come from? From what we have done and what we do. And where is this awareness preserved and enhanced? In the arts and the humane disciplines. I am sure there are any number of neuroscientists who know and love Mozart better than I do, and who find his music uplifting. The inconsistency is for them to explain.

* * *

A type of Darwinism has a hand in this. If evolution means that the species have a common ancestry and have all variously adapted and changed, that is one thing. Ovid would not object. If it means that whatever development is judged to be in excess of the ability to establish and maintain homeostasis in given environments, to live and propagate, is less definitive of the creature than traits that are assumed to reflect unambiguous operations of natural selection, then this is an obvious solecism. It is as if there are tiers to existence or degrees of it, as if some things, though manifest, are less real than others and must be excluded from the narrative of origins in favor of traits that suit the teller’s preferences. So generosity is apparent and greed is real, the great poets and philosophers toiled in the hope of making themselves attractive to potential mates—as did pretty well every man who distinguished himself by any means or tried to, from Tamburlaine to Keats to anyone’s uncle. (Women have little place in these narratives—they are the drab hens who appraise the male plumage.) This positing of an essential and startlingly simple mechanism behind the world’s variety implies to some that these pretenses, these very indirect means to the few stark ends that underlie all human behaviors, ought to be put aside, if only for honesty’s sake. So, humanities, farewell. You do not survive Darwinian cost-benefit analysis.

If there is a scientific mode of thought that is crowding out and demoralizing the humanities, it is not research in the biology of the cell or the quest for life on other planets. It is this neo-Darwinism, which claims to cut through the dense miasmas of delusion to what is mere, simple, and real. Since these “miasmas” have been the main work of human consciousness for as long as the mind has left a record of itself, its devaluing is a major work of dehumanization. This is true because it is the great measure of our distinctiveness as a species. It is what we know about ourselves. It has everything in the world to do with how we think and feel, with what we value or despise or fear, all these things refracted through cultures and again through families and individuals. If the object of neuroscience or neo-Darwinism was to describe an essential human nature, it would surely seek confirmation in history and culture. But these things are endlessly complex, and they are continually open to variation and disruption. So the insistence on an essential simplicity is understandable, if it is not fruitful. If I am correct in seeing neuroscience as essentially neo-Darwinist, then it is affixed to a model of reality that has not gone through any meaningful change in a century, except in the kind of machinery it brings to bear in asserting its worldview.

A nematode is more complex than a human being was thought to be 50 years ago. Now biology is in the course of absorbing the implications of the fact that our bodies are largely colonies of specialized microorganisms, all of them certainly complex in their various ways and in their interactions. It is the elegance of nature that creates even the appearance of simplicity. The double helix as a structure expedites fluent change, modifications within the factors it contains or that compose it, which baffle determinist associations with the word “gene.” Elegance of this kind could be called efficiency, if that word did not have teleological implications. I think the prohibition against teleology must be an arbitrary constraint, in light of the fact that we do not know what time is. It is not respectable to say that an organism is designed to be both stable as an entity and mutable in response to environment, though it must be said that this complex equilibrium is amazing and beautiful and everywhere repeated in a wealth of variations that can seem like virtuosity regaling itself with its own brilliance.

I am a theist, so my habits of mind have a particular character. Such predispositions, long typical in Western civilization, have been carefully winnowed out of scientific thought over the last two centuries in favor of materialism, by which I mean a discipline of exclusive attention to the reality that can be tested by scientists. This project was necessary and very fruitful. The greatest proof of its legitimacy is that it has found its way to its own limits. Now scientific inference has moved past the old assumptions about materiality and beyond the testable. Presumably it would prefer not to have gone beyond its classic definitions of hypothesis, evidence, demonstration. And no doubt it will bring great ingenuity to bear on the questions that exceed any present ability to test responses to them. It seems science may never find a way to confirm or reject the idea of multiple universes, or to arrive at a satisfactory definition of time or gravity. We know things in the ways we encounter them. Our encounters, and our methods and assumptions, are determined by our senses, our techniques, our intuitions. The recent vast expansion and proliferation of our models of reality and of the possible bring with them the realization that our situation, on this planet, and within the cocoon of our senses, is radically exceptional, and that our capacity for awareness is therefore parochial in ways and degrees we cannot begin to estimate. Again, to have arrived at this point is not a failure of science but a spectacular achievement.

* * *

That said, it might be time to pause and reflect. Holding to the old faith that everything is in principle knowable or comprehensible by us is a little like assuming that every human structure or artifact must be based on yards, feet, and inches. The notion that the universe is constructed, or we are evolved, so that reality must finally answer in every case to the questions we bring to it, is entirely as anthropocentric as the notion that the universe was designed to make us possible. Indeed, the affinity between the two ideas should be acknowledged. While the assumption of the intelligibility of the universe is still useful, it is not appropriately regarded as a statement of doctrine, and should never have been. Science of the kind I criticize tends to assert that everything is explicable, that whatever has not been explained will be explained—and, furthermore, by its methods. Its practitioners have seen to the heart of it all. So mystery is banished—mystery being no more than whatever their methods cannot capture yet. Mystery being also those aspects of reality whose implications are not always factors in their worldview, for example, the human mind, the human self, history, and religion—in other words, the terrain of the humanities. Or of the human.

Now we know that chromosomes are modified cell by cell, and that inheritance is a mosaic of differentiation within the body, distinctive in each individual. Therefore the notion that one genetic formula, one script, is elaborated in the being of any creature must be put aside, with all the determinist assumptions it has seemed to authorize. Moreover, the impulse toward generalization that would claim to make the brain solvable should on these grounds be rejected, certainly until we have some grasp of the deeper sources of this complexity and order, the causal factors that lie behind this infinitesimal nuancing. The brain is certainly more profoundly individuated than its form or condition can reveal.

So if selfhood implies individuality, or if our undeniable individuality justifies the sense of selfhood, then there is another mystery to be acknowledged: that this impulse to deny the reality, which is to say the value, of the human self should still persist and flourish among us. Where slavery and other forms of extreme exploitation of human labor have been general, moral convenience would account for much of it, no doubt. Where population groups are seen as enemies or even as burdens, certain nefarious traits are attributed to them as a whole that are taken to override the qualities of individual members. Again, moral convenience could account for this. Both cases illustrate the association of the denial of selfhood with the devaluation of the human person. This would seem too obvious to be said, if it were not true that the denial of selfhood, which is, we are told, authorized by the methods of neuroscience and by the intentionally generalized reports it offers of the profoundly intricate workings of the brain, persists and flourishes.

There are so many works of the mind, so much humanity, that to disburden ourselves of our selves is an understandable temptation. Open a book and a voice speaks. A world, more or less alien or welcoming, emerges to enrich a reader’s store of hypotheses about how life is to be understood. As with scientific hypotheses, even failure is meaningful, a test of the boundaries of credibility. So many voices, so many worlds, we can weary of them. If there were only one human query to be heard in the universe, and it was only the sort of thing we were always inclined to wonder about—“Where did all this come from?” or “Why could we never refrain from war?”—we would hear in it a beauty that would overwhelm us. So frail a sound, so brave, so deeply inflected by the burden of thought, that we would ask, “Whose voice is this?” We would feel a barely tolerable loneliness, hers and ours. And if there were another hearer, not one of us, how starkly that hearer would apprehend what we are and were.

Marilynne Robinson is a novelist and essayist. Her recent books are Lila: A Novel and The Givenness of Things: Essays. She is professor emeritus at the University of Iowa Writers’ Workshop.

