June 15, 2004
Ladies Home Journal
Melissa Hendricks

Your Ever-Evolving Brain
Every few months, a course catalog from a local community college appears in my mailbox. Usually I flip through the listings, then toss it in the recycling bin. But last week, when the catalog arrived, a course in Latin beckoned to me. Why not give it a try? I thought. Tackling a new language could be fun, and it would be illuminating to know the Latin roots of English words.

But something held me back. Granted, a dead language spoken only by priests and Oxford dons has limited usefulness. But there was more to it than that. The prospect of studying, memorizing, and being tested filled me with trepidation. I hadn’t crammed for a test in 20 years. Between then and now, my life had grown to accommodate a career, a husband, two kids, and a dog, and my once agile mind seemed to have taken a vacation — perhaps, I feared, a permanent one. Had childbirth softened my brain as well as my abdominals?

Thirty years ago scientists might well have told us so. At birth, went the theory, we already have all the brain cells we will ever have, and by our early 20s they begin to die off in large numbers. Then, a few years ago, researchers made a discovery that sparked hope in aging hearts everywhere: We continue to generate new brain cells throughout our lives. One of the hottest areas in neuroscience now involves figuring out what triggers the production of these cells, and what role they may play in learning and memory.

Meanwhile, neuroscientists have also discovered other important ways in which the adult brain changes: The connections between brain cells (which control the way information flows from one cell to the next) can grow stronger or weaker, and the cells, if stimulated appropriately, may even sprout new connections. “It turns out that the brain, even in adulthood, is remarkably adaptive,” says Michael Merzenich, PhD, professor of neuroscience at the University of California at San Francisco. Yes, our existing brain cells do start to die as we age, but recent research shows that even this loss is not inevitable. We can stem it by using our brains, even in ways that require little effort.

Sounds great, you may be saying to yourself, but why do I sometimes have trouble remembering a person’s name, or even the name of a simple household item?

Sister, I hear you. I am all too familiar with the frustration of needing a word, having it just within reach, yet being unable to seize it: “Honey, where’s the whatchamacallit — you know, the flat metal thingy for turning pancakes?”

The hard truth is that memory does begin to decline noticeably in our 40s, as does the rate at which we process information. Fortunately, the drop-off is slight, far less of a big deal than most of us make it out to be. And, indeed, experts attribute much midlife muddle to factors unrelated to anything physical. Kids, work, bills, chores, and e-mail all compete for time in our heads, making it harder to devote the attention needed to learn something new or recall something learned earlier. Like any overloaded computer system, the one between our ears occasionally crashes, bringing forth a “senior moment.”

This theory is borne out by a study from the University of Michigan, in Ann Arbor. Cognitive neuroscientist Denise Park, PhD, compared the ability of 121 people, ages 34 to 84, to remember to take their medicine. Guess who fared the worst? The middle-aged folks. Busy lives seem to overload the memory circuits, says Dr. Park.

Grow More Creative
The good news is that, while our ability to remember may fade with age, the brain continues to accrue more knowledge. In other words, it might take us longer to dredge up a particular word from our memory banks, but we know more words overall. And along with our vocabularies, our ability to solve problems and perform new tasks can expand as we age.

Barry Gordon, MD, professor of neurology and cognitive science at Johns Hopkins Medical Institutions, in Baltimore, has written a book on this topic, Intelligent Memory, a capacity he defines as the ability to use facts and figures to solve problems, innovate, and think creatively. Some famous inventions and discoveries probably sprang from this sort of memory. Dr. Gordon cites the case of George de Mestral, whose observations of how burrs cling to clothing gave him the brainstorm that led him, in his 40s, to create Velcro.

“As people mature, they can integrate data into a larger framework,” says Lawrence Katz, PhD, professor of neurobiology at Duke University Medical Center, in Durham, North Carolina. “They see the connections between things and their ability to think strategically, on large scales, increases.” So, while a 15-year-old might learn and remember a fact more easily, a 35-year-old understands it in a broader context and sees the possible ramifications. In short, the old chestnut really is true: Younger people may be smarter, but older people are wiser.

Sheila McCabe Taylor, 37, of Annapolis, Maryland, made this discovery firsthand. When the newly divorced mother of four decided to return to college in her early 30s to become a paralegal, she was apprehensive. Would she remember how to study and take notes? Was her mind too rusty to memorize loads of information? As it turns out, Taylor has done extremely well in her classes, not least because her life experience (including her divorce) had already taught her a lot about the legal system. “In many ways, I have an advantage over the younger people in my classes,” she admits. “Because I’m so motivated, I’m more focused and I listen better. You don’t really understand the importance of these things until you’ve matured.” In fact, Taylor is so enthralled with her new field that she now plans to earn her law degree and work in family law.

Use It or Lose It
But why work so hard at learning at this point in your life? You sweated through years of homework, cramming, and test-taking. Why not simply coast?

One big reason is that, just as exercising your body protects against heart disease and osteoporosis, exercising your brain can keep the gray matter in tip-top shape, says John Ratey, MD, associate clinical professor of psychiatry at Harvard Medical School, in Boston, and author of A User’s Guide to the Brain (Vintage, 2002). Scientists have found that people who spend their free time reading, doing crossword puzzles, visiting museums, and pursuing other mentally stimulating activities lower their risk of Alzheimer’s disease and other types of cognitive decline. Formal education may provide similar benefits. In autopsies on volunteers who died during a long-term study on aging, researchers at Chicago’s Rush Alzheimer’s Disease Center have found that the more education a person had, the less likely he was to exhibit symptoms of Alzheimer’s — even if his brain revealed the presence of the disease.

Indeed, research conducted by Yaakov Stern, PhD, a clinical neuropsychologist at Columbia University, in New York City, suggests that stimulating your mind may build a better brain, a more complex one with alternative networks for thinking through problems and retrieving memories. This extraordinary organ is, as Dr. Merzenich puts it, a special-purpose machine that can take in information from the outside world, then change itself to meet the requirements at hand. Pretty impressive.

I, for one, need no further convincing: I’m determined to get mentally fit, starting right now. I haven’t worked up to enrolling in a class yet, but I did buy a book of Latin roots that I’ve been studying, and I’ve been urging everyone I know to take some similar action. Whether you think big (enroll in graduate school) or, like me, start small (attempt this Sunday’s crossword puzzle), remember — your brain craves learning. So why not give it the attention and respect it deserves?

Viagra for Your Brain?
If a pill could make you smarter, would you take it? That possibility is less Brave New World-ish than you might think. Building on the work of Nobel laureate Eric Kandel, PhD, who helped decode the molecular changes involved in learning and memory, scientists around the country are developing memory-enhancing drugs to treat Alzheimer’s disease and dementia, as well as milder forms of age-related memory loss. Ethical concerns abound: Many experts believe these pills would soon find their way into the medicine cabinets of people with no impairments, who might pop them whenever they needed a mental boost. Would taking this “Viagra for the brain” (as such drugs have been dubbed) give these users an unfair advantage? Furthermore, notes Tim Tully, PhD, a founder of Helicon Therapeutics, in Farmingdale, New York, “this new class of memory enhancer may produce heretofore unknown psychological side effects.” One risk is remembering too much. Natural memory is selective; we remember some things because we forget others. If a pill enabled us to retain every fleeting thought and sensation, most of us would have trouble making sense of the world.