Wash Your Hands, Cleanse Your Brain
First off, the usual apologies for the long radio silence. As some of you know, I was teaching this semester at both Columbia University and New York University, and things got predictably hectic in the final weeks of the term. But I want to congratulate a fantastic group of students at both places, and hope you have an opportunity to see their work, either in the scientific literature or in the popular media, in the not too distant future. Now, I’m going to try to weave some fascinating recent research on hand-washing, age-related cognitive declines, and Neanderthal hanky-panky into a web of wisdom.
Did you wash your hands before reading this blog?
The most recent (May 14, 2010) issue of Science is especially juicy in research articles touching upon wisdom. The shortest, and in many ways most provocative, is a study by Spike Lee (no, not that one) and Norbert Schwarz at the University of Michigan, who designed a clever experiment to see if washing one’s hands after a decision reduced the need for post-decision rationalization. One aim of the study was to see if ritual (or metaphoric) hand-washing did more than just “attenuate” moral angst or guilt, and in fact could reduce what is known as “cognitive dissonance”—the need to rationalize a choice when one is forced to choose between two similarly attractive (or, presumably, two similarly distasteful) options, which is certainly the case in many decisions demanding wisdom.
As usual, the decisions in these experiments involved relatively trivial choices (selecting a music CD or a flavor of jam), made by relatively young brains (college undergraduates). But because the experiment was cleverly disguised as a consumer survey, some participants “tested” a liquid soap or antiseptic wipe after making their decisions, while others did not. In both instances, hand washing or wiping “significantly reduced” the need to justify one’s prior decision. In other words, participants literally seemed to wash their hands of a difficult choice after making it. Lee and Schwarz conclude that “the psychological impact of physical cleansing extends beyond the moral domain.” So the next time you face a tough decision, you might want to wash your hands. Whether it’s psychology or mere metaphor, it seems to make a difference.
A second, highly technical article by a group of European researchers reports a significant finding about the cognitive impairments associated with aging. As I describe in Wisdom, Paul Baltes and other psychologists identified a narrow window for the exercise of wisdom—after the accumulation of lifetime experience and knowledge, but before the inevitable cognitive declines of advancing age begin to set in. Those declines have traditionally been assumed to be the simple wear and tear of age on the cognitive machinery; memory falters because the parts of the brain essential to memory, including the hippocampus and prefrontal cortex, function less crisply with the passage of time.
But an alternative explanation of cognitive decline—known as epigenetics, which this experiment addresses—is gaining empirical momentum. Epigenetics refers to the way environmental (or “life”) experiences can alter the way genes are turned on or off in the body, including in the brain (many cancers, for example, develop or accelerate due to epigenetic changes in cells, and I did an article for Newsweek in 2009 chronicling the development of new drugs that use epigenetic approaches to correct these changes). In the Science report, researchers based mainly in Göttingen, Germany, showed that age-associated memory impairment in mice could be traced in part to epigenetic changes in hippocampal cells; in short, the DNA in these memory cells became entangled in its packaging, to the point where genes were inappropriately turned off and the mice were unable to consolidate memories after performing learning tasks. As I mention in the Newsweek article, understanding the basis of this process has already resulted in the FDA approval of several new “epigenetic” drugs to treat cancer, with many more on the way, and the German work hints at similar possibilities. Mice treated with an epigenetic “drug” regained the ability to turn on genes induced by learning experiences and recovered their cognitive abilities. This is early, but exciting, work suggesting that some cognitive declines associated with aging are potentially reversible.
Finally, this same issue of Science included a report revealing a draft sequence of the Neanderthal genome, including clear genetic evidence that Neanderthals and modern humans must have interbred, perhaps 60,000 years ago, somewhere in the Middle East.
When I reported on this project several years ago for National Geographic, I became curious about the prehistoric origins of wisdom, and indeed, in an early draft of my last book, I had a chapter (which we ultimately cut for space) on the evolution of wisdom. The important concept here is group size.
One of the most adventurous and provocative theories on human evolution—and, by extension, on the evolution of wisdom—is the “social brain hypothesis,” first proposed by Robin I. M. Dunbar in the 1990s and recently updated by Dunbar and his colleagues at Oxford University. Dunbar is perhaps best known for the “Dunbar number,” his calculation that the human brain has the capacity to manage at most 150 different social relationships. But he has always been keenly interested in the interplay of cognitive function and group size—that is, how the functioning of our brains is affected by the size of our social group.
The core idea of the social brain hypothesis is that humans, like apes, need to attend to social relations to keep their group functioning smoothly, and that humans, unlike apes, evolved language “to service social bonds in a more generic sense by providing a substitute for social grooming, the main mechanism that our fellow primates use for bonding social relationships.” The critical need for language arose, Dunbar believes, as a function of both increasing primate brain size and increasing group size. By developing language to complement their relatively larger brains (where, it should be noted, most of the newer growth occurred in the neocortex), humans were able to maintain larger social groups, with all the cultural advantages (and baggage) that come with a larger group. I’ll elaborate on this in a future post, but Neanderthal groups are believed to have been fairly small, while prehistoric human groups were probably larger, and this difference in group size could have exerted subtle but important selective pressures improving social cognition in modern humans. Okay, it’s a bit of a stretch to call it wisdom, but maybe proto-wisdom in the form of cooperation, group effort, and cultural knowledge—all of which increased the odds of survival for modern humans.
Hand-washing, failing memory, Neanderthal-human canoodling: all send tendrils into the world of wisdom.
A note to readers: There are a lot of clever spam-bots out there, and it’s hard for a vain mortal like me, susceptible to digital flattery, to distinguish genuine reader comments from generic spam-generated remarks. So if you’d like to comment or complain or point out—with civility, please—what an idiot I am, please try to include some specific reference to the content of the post so I can separate the much-appreciated wheat (however much it might cut) from the spammish chaff.