Chapter 20: Don’t Forget to Remember to Forget
“The day she went away
I made myself a promise
That I’d soon forget we ever met,
Well, but something sure is wrong,
‘Cause I’m so blue and lonely.
I forgot to remember to forget.”
—Elvis Presley, 1955
My earliest memory is of lying in a crib, at age two or three, in a camp for Displaced Persons in Allied-occupied post-Second World War Germany. My mother comes over and gives me the heel of a loaf of black bread, which has been rubbed vigorously with garlic (and into which what I now know were meant to be medicinal bits of garlic had been poked), then baked. The warmth of the room, the exotic taste, the flooding sense of security after so much dislocation and movement are so wonderful that I cry!
One of my most recent memories is of walking into my house a few weeks ago while thinking about writing this particular column, with my car keys in hand. I put the keys down in the customary spot, but a little while later, on my way out, they're not there. I trace and retrace my steps and look and relook everywhere, in the likely places and even in the unlikely ones. No keys. Eventually I give up, get the spares and go about my day, very aware of the irony of having been ambushed by a classic "senior moment" exactly when I was thinking of writing about the subject, and royally irritated by it at the same time.
This column is the first in a group of Zoomer Philosophy Chapters in which I intend to explore the senses and how they change as we get older. Five of these you’ll recognize: sight, sound, touch, smell and taste. I’ve taken the liberty of adding a sixth – memory – and of giving it pride of place. Why? First, because in a subtle and disturbing way, memory seems more closely related to being human and to being who we are than any of the traditional senses. Lose your vision and you’ve lost your sense of sight. Lose your hearing and you’ve lost your sense of sound. But lose your memory and you’ve lost yourself. Without my eyes or ears or even limbs, I am still me; but without my memory, I am, by most definitions, someone else.
Losing our memory, of course, is very much on our collective mind these days. For people of our demographic, memory may be the cause of more anxiety – as we contemplate its decline – than any other faculty we have. Take my two “memories” above, one early, one late. They can be interpreted as illustrations of what’s, supposedly, a classic syndrome among aging people: long-term memory, recollections of things long past, remains crystal-clear (although there are people who, when I recount my memory of the crib and the bread, are highly skeptical that it’s a “real” memory and not some concoction of events I’ve heard about subsequently). Meantime, our short-term memory, of what we had for breakfast or what we watched on TV last night or where we left the keys, fades into mist.
The short-term-long-term memory divide is part of what you could call the “senility myth,” one that our culture – and, more unfortunately, we ourselves – have come to accept as gospel: that the memories of all older people are doomed to fail over time and that losing pieces of our memory is just more evidence that we are effectively losing our minds. I don’t deny or trivialize for a second the reality of dementia and Alzheimer’s in the aging population – it’s estimated that, in the developed West, 25 per cent of people over 85 suffer from Alzheimer’s (1) – but, for the majority of us, I have a personal layman’s theory of Memory and Aging that’s far less bleak than what most of us believe to be the case.
My theory is evolutionary. As recently as 500 to 600 years ago, the ability to remember huge tracts of written and spoken material by rote was considered a mark of education and culture. As a result, the invention of the printing press in the mid-1400s provoked a storm of criticism and warning from church leaders and other scholars, who were convinced that its arrival would mean the end of memory and, by extension, learning and morality. What it did to learning and morality is arguable, but it certainly had the predicted effect on rote memory in humans, as did every technological advance that followed, up to and including the smartphones of today (how many phone numbers do you still know off by heart?). My contention is that this progression is natural and not new. Evolution has simply selected out the need to have a massive on-hand memory, leaving the part of the brain that was previously used for certain kinds of information retention to be directed towards other, more complex tasks. A BlackBerry or iPhone probably contains as much information as the collective memory of an entire primitive tribe, maybe more; so the old storage ability becomes unnecessary and atrophies.
The perverse thing is that it’s only for older people that this general trend is considered not an evolution of the species but a decay of the person. In fact, studies show that a large percentage of what we call “senior moments” are the results of self-fulfilling prophecies that lead aging people to misidentify momentary lapses in memory and to underestimate their general memory capacities. (Everyone, young and old, has tip-of-the-tongue experiences, but you won’t hear anyone calling a teenager senile.)
One recent study done at Johns Hopkins University compared older Chinese and American adults regarding their beliefs about the effects of aging on memory and their respective abilities to actually remember. The Chinese seniors were far less likely to maintain that to be older was to be more forgetful (possibly a function of the superior respect given the elderly in China). They also performed far better on the actual memory tests – so much so that some of the oldest Chinese subjects did as well as the youngest. The simple conclusion: if you expect to forget, you probably will. (Remarkably, even those American seniors who performed extremely well on the memory tests had tended beforehand to rate their own memories as poor.)
It’s estimated by psychologists that the sum total of all memories acquired by a person over their lifetime is somewhere in the neighbourhood of a few hundred gigabytes (2) – or roughly the storage capacity of a personal computer. This relatively small capacity means that the real genius of our brains when it comes to memories isn’t in accumulating them but in filing them and continually discarding the ones that we don’t need, all at an incredible speed (the human brain still operates 20 times faster than the most powerful computers in existence today). This paring away of the “sludge” doesn’t even stop when we sleep: when we dream, that’s what we’re doing. “Dreams are just the body’s way,” says Dr. Robert Stickgold, a psychiatrist at Harvard Medical School who led a study on the process, “of clearing out the mental ‘inbox.’ ” For older people, who are closer to reaching the limit of their memory storage capacity, this discarding of extraneous memories is even more important.
Most of us know from personal experience what happens when our computer approaches its memory capacity: it slows down and may even start losing files. Same with the brain. What gets denigrated as a “senior moment” may actually be nothing more than what happens to an overtaxed computer. Our brains need to discard something to stay operational, and short-term memories become the logical thing to pare away; they’re far less significant in the grand arc of our lives than long-term ones.
The fact is, to survive viably, it’s as important for us to forget as to remember. This fall, a new crime series debuted on CBS television, called Unforgettable, about a New York City police detective who suffers from hyperthymesia, an extremely rare medical condition that renders her virtually incapable of forgetting anything that happens in her personal life. In the show, this is characterized as a valuable detecting tool; in real life, it’s something else altogether. The model for the fictional detective is a California woman named Jill Price, who, when a book about her was published in 2008, became a talk-show phenomenon, putting on displays of near-magical recall. The psychologists who studied Price, though, described her memory as “nonstop, uncontrollable and automatic.”
Hyperthymesiacs, it turns out (only 20 cases have been confirmed to date), aren’t prodigies of memory so much as involuntary obsessives, whose obsession (through no personal choice or fault) is their own lives. Jill Price has lived almost her entire 44-year life with her mother and still has every stuffed animal she ever received as a child. There is a difference between being able to recall anything at will and not being able to forget anything at all. The first is a great power, the second a great question mark. A person who can’t forget is a person who can’t unclutter his or her life, can’t “forgive and forget” and move on.
Older people are the diametric opposite. We’re not more forgetful human beings, I’d argue, but more evolved ones. We’re most advanced at discarding unnecessary memories – and if the occasional necessary one goes missing for a while too, well, let’s call it an occupational hazard. To long for perfect recall is like longing to master Morse code; it’s a quaint skill at best, superfluous in the modern age.
It turns out I hadn’t misplaced my car keys at all that day; a handyman doing some minor maintenance work around the house had mistaken them for his and picked them up. Actually, what I’m looking forward to is the day I won’t need keys at all because I’ll be able to use my smartphone to turn on my car and open my front door. When that happens, I’ll really be set.
As long as I don’t forget where I put that phone.
1. The National Institute on Aging (USA), November 2000
2. “How Much Do People Remember? Some Estimates of the Quantity of Learned Information in Long-term Memory,” Cognitive Science 10 (1986): 477–493
Moses’ Zoomer Philosophy — which launched in ZOOMER Magazine in October 2009 — is a series of monthly essays on age and aging, and the secrets and the science to living better, longer, healthier and happier lives. The first volume of his collection is now available in e-book format on the Kobo Books website.