One of the great advantages of teaching psychology is exploring its own identity crisis. Is it a science? A branch of the humanities? What kind of thinker is a psychologist – a humanist, a behaviourist, an evolutionary psychologist, a neuroscientist? What method would we use to decide which kind of psychologist you want to be? Are there limits to what the scientific method can tell us about human nature? This turns out to be a great exercise in meta-cognitive skills – knowledge of how and when to use the right cognitive strategies – something in short supply these days in the scientific community, triumphant as it is over the liberal arts in universities.
Students should be initiated into these discussions because they will dominate the world they live in. As our technological grasp of nature improves, we have a greater duty to engage in ethical discussions and decisions. It is a crucial time to understand meta-cognition, as biologist E. O. Wilson said in a recent NY Times opinion piece:
It is time to consider what science might give to the humanities and the humanities to science in a common search for a more solidly grounded answer to the great riddle….
Self-understanding is what counts for long-term survival, both for individuals and for the species.
The great riddle is human nature, and the great unknown is human consciousness. Current efforts to map the circuits of the human brain will bring this question front and centre. Scientists widely acknowledge that there is no current explanation for consciousness, in either evolutionary biology or physics. As with so many past discoveries, scientists are making a Faustian bargain: that better technology will lead to the holy grail. And who can deny the fantastic progress technology, and thus science, has made in the past half-century?
The project has been criticized by neuroscientists such as Erin McKiernan for putting technology ahead of concepts, rather than the other way around. The concept of mind is not yet defined, so how will we know when we’ve found it? One underlying assumption of the project, shared by many scientists, is that human consciousness can be understood through its physical connections – through its material causes alone. McKiernan also argues that the initiative’s goals are not well defined:
What will a complete map of the brain look like? How will we know when it is complete? How will the dynamic nature of the brain be accounted for?
The Myers AP Psychology text I use reassures students that they aren’t just a bundle of neurons – that their thoughts and beliefs are still valid, still real. While discussing evolutionary psychology, Myers takes great pains to explain that Darwinian science does not imply determinism, because we have the capacity to shape our environments. They’re not just “jumped-up monkeys,” as UC Berkeley economist and prominent blogger Brad DeLong believes. But that’s not the reality they will face in the universities.
While it’s true that scientific progress can be made without reducing everything to material causes, scientists should be more consistent about whether they believe such reduction is possible in principle. When philosopher Thomas Nagel made this point in his book Mind and Cosmos: Why the Materialist Neo-Darwinian Conception of Nature Is Almost Certainly False, the philosophers Leiter and Weisberg said he was attacking a straw man – most scientists, they argued, reject theoretical reductionism. But for materialism to hold, all aspects of reality must in principle be reducible to material causes. Biologist Jerry Coyne responded to Leiter and Weisberg, accusing them of shying away from the hard conclusions that reductionism demands (from Andrew Ferguson’s essay on Thomas Nagel):
It’s not surprising that scientists in various disciplines aren’t actively trying to reduce all science to physics; that would be a theoretical problem that is only solvable in the distant future. However: “The view that all sciences are in principle reducible to the laws of physics,” he wrote, “must be true unless you’re religious.” Either we’re molecules in motion or we’re not.
As I’ve noted here, one of the primary intellectual struggles today is the battle for belief in the human person, the individual, as opposed to the bundle of neurons of scientific reductionism.
Michael Gazzaniga, the psychologist famous for his split-brain experiments showing a dual-track mind, gave the 2009 Gifford Lectures defending the concept of moral and legal responsibility in light of the latest wave of neuroscientific determinism. Determinists such as Sam Harris have also defended personal responsibility, despite their view that it is, in reality, a fiction. Philosopher and cognitive scientist Daniel Dennett has expressed reservations about lifting the veil for the hoi polloi – civilization may collapse if we explain to everyone that their subjective experiences of love, guilt, joy, and perception are nothing but neural signals governed by millions of years of random mutations in their DNA.
It’s this idea that our subjective experiences of the world are false, or at least not “real” according to science, that will throw students for a loop once they come face to face with its practical consequences. Under the popular belief that morality and truth are “relative,” and the evolutionary explanation that morality is merely a useful convention for survival, students will become further alienated from their personal desires. I think Roger Scruton’s analysis is astute – they will come to view the world as a world of objects. They will end up treating people as objects, since that is what people are believed to be. Self-knowledge will give way to object-knowledge.
This alienation from ourselves, from the ability to question our view of reality, is illustrated well in the popular idea that the brain is simply a complex computer. The idea has been most forcefully advanced by Google founders Larry Page and Sergey Brin (in the sense that it has changed our habits the most). They believe that increasing the speed at which we can access information will lead to greater human potential (see Nicholas Carr’s book The Shallows, or his 2008 Atlantic article “Is Google Making Us Stupid?”). Neuroscientists such as MIT wunderkind Ed Boyden are hopeful that we may one day be able to upload or download memories and experiences, Star Trek style, with enough processing speed at our fingertips (see his fascinating TED talk about turning neural pathways on and off like a light switch).
While I’m a relative newcomer to the topic, it seems to me that scientists like Boyden have not engaged with the meta-cognitive discipline of philosophy as they should – the discipline where alternatives to the materialist conception of mind are discussed. In the CNN article “Top brain scientist is a philosopher at heart,” Boyden shows that he’s still a freshman when it comes to philosophy:
He’s a man of many ideas, and wants to understand the biology behind where ideas come from. “I guess I’m still drawn by the philosophy,” he said.
Sure, Aristotle was a biologist as well as a logician. But it’s a sad day when philosophy is reduced to finding mechanical ways of enhancing your computational power:
People don’t like to talk about enhancing the brain, Boyden said; it makes people uneasy to think about designing or engineering a way to sharpen our minds. Yet plenty of people take pharmaceuticals — sometimes without a prescription — to help themselves focus or be less anxious, and caffeine and alcohol have been around for centuries.
“I think the most important thing is for humanity to openly discuss this topic,” he said. “If we can discuss it, and we also can talk about side effects, should we maybe try to design more optimized versions of things?”
What does this say about our society, about the place of liberal arts, when philosophic thought is reduced to utilitarian calculus, however successful?
More worrying is whether it is even safe to question the dominant materialist, reductionist account of the universe.
Thomas Nagel found out the hard way, mocked and ridiculed by thinkers and scientists such as Steven Pinker for questioning materialism and suggesting a teleological view of nature. The subtitle of his book didn’t help – Mind and Cosmos: Why the Materialist Neo-Darwinian Conception of Nature Is Almost Certainly False. For one of America’s greatest philosophers, the fall from grace was so great that colleagues rushed to his defense…to propose that he isn’t, in fact, mentally ill.
The point about meta-cognitive skills: as it becomes harder to make sense of life and to integrate views from various disciplines – physics, biology, ethics – it is precisely the disciplines that deal with meaning that can piece together a coherent story. Without meaning, life is reduced to material causes and ceases to be worth living; there is always one more material fact underlying the last material fact. Knowing that the love I have for my daughter is a neurochemical response based on genetic expression tells me nothing about my experience of that love, let alone whether I should love her just because I seek to pass on my genes.
Students and scientists should be more aware of the need to justify or ground their disciplines in a view that brings together their ideas and their actions. In other words, E. O. Wilson is right: we need both the humanities and the sciences to understand human nature.
Do we really believe that we are a pack of neurons? If so, why do we insist on acting morally, on distinguishing right from wrong? Scientists like Sam Harris have tried to produce a biologically based ethics. But as Thomas Nagel points out, why should we, on a Darwinian point of view, subject our emotions and intuitions to reason? Why should we choose to act rationally according to what science tells us? These are questions of moral philosophy and philosophy of science, not of science per se.
Before we agree to a reductive materialist view of the world, it might be wise to consider whether it is possible to live in a world of pure facts. Might there be other ways of looking at the world, methods that don’t depend on observation alone?