Brains


Quotes

Hoarder brains

The brains of hoarders show differences in their cingulate cortex and insulas from those of non-hoarders and OCD sufferers, and brain injuries seem to be able to trigger hoarding behaviour. In 1848, an explosion drove a tamping rod through the brain of Phineas Gage, a railway worker in Vermont. He survived the accident, but was said by his doctor to have developed ‘a great fondness for pets and souvenirs’ that was ‘only exceeded by his attachment to his tamping iron, which was his constant companion during the remainder of his life’.

Neuroscience of afterlife

Now we know that rather than merely reacting to the external world, the brain spends much of its time and energy actively making predictions about the future—mostly the next few moments. Will that baseball flying through the air hit my head? Am I likely to become hungry soon? Is that approaching stranger a friend or a foe? These predictions are deeply rooted, automatic, and subconscious. They can’t be turned off through mere force of will.

And because our brains are organized to predict the near future, they presuppose that there will, in fact, be a near future. In this way, our brains are hardwired to prevent us from imagining the totality of death.

If I am allowed to speculate—and I hold that a dying person should be given such dispensation—I would contend that this basic cognitive limitation is not reserved for those of us who are preparing for imminent death, but rather is a widespread glitch that has profound implications for the cross-cultural practice of religious thought. Nearly every religion has the concept of an afterlife (or its cognitive cousin, reincarnation). Why are afterlife/reincarnation stories found all over the world? For the same reason we can’t truly imagine our own deaths: because our brains are built on the faulty premise that there will always be that next moment to predict. We cannot help but imagine that our own consciousness endures.

Remapping the neural circuitry

Over the past few years I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn’t going—so far as I can tell—but it’s changing. I’m not thinking the way I used to think. I can feel it most strongly when I’m reading. [...] Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.

Reading gains and losses

One thing that changed pretty dramatically is that the visual cortex, the part of our brain that processes our vision, became dedicated to deciphering text. [...]

As we practice that, more and more neurons get dedicated to reading. Eventually, you no longer have to decipher a particular letter or even a particular word because our brains represent those letters and words — it’s automatic. So we got all the benefits that come with being good readers, whether it’s the value of losing yourself in a novel or the value of gaining complex information from some sophisticated nonfiction book.

But we also lost something. One thing we lost is a lot of our visual acuity in reading nature and reading the world. If you look at older cultures that aren’t text-based, you see incredible abilities to, for instance, navigate by all sorts of natural signs. We lost some of that acuity in reading the world, which also requires a lot of the visual cortex, simply because we had to reprogram our brains to become good readers.

Primed for distraction

The suggestion that, in a few generations, our experience of media will be reinvented shouldn't surprise us. We should, instead, marvel at the fact that we ever read books at all. Great researchers such as Maryanne Wolf and Alison Gopnik remind us that the human brain was never designed to read. Rather, elements of the visual cortex – which evolved for other purposes – were hijacked in order to pull off the trick. The deep reading that a novel demands doesn't come easily, and it was never "natural." Our default state is, if anything, one of distractedness. The gaze shifts, the attention flits; we scour the environment for clues. (Otherwise, that predator in the shadows might eat us.) How primed are we for distraction? One famous study found humans would rather give themselves electric shocks than sit alone with their thoughts for 10 minutes. We disobey those instincts every time we get lost in a book.

Information overload

Yet back [in 2010] the evidence was already strongly suggesting that the internet was a very powerful way to access lots of information very quickly. We were all concentrating on that great new bounty of information: the more information, the better — the faster it comes to me, the better.

What we lost sight of was how we actually take that information into our mind. There’s all sorts of very good evidence that if you’re distracted — if your attention is shifting very quickly — you can gather lots of information in a very swift fashion, but you’re not going to assemble it very well into knowledge. It’s going to just remain bits of information. You’re not going to develop a rich store of personal knowledge, which is all about connections and associations.

Brain metaphors

The process of adapting to new intellectual technologies is reflected in the changing metaphors we use to explain ourselves to ourselves. When the mechanical clock arrived, people began thinking of their brains as operating “like clockwork.” Today, in the age of software, we have come to think of them as operating “like computers.” But the changes, neuroscience tells us, go much deeper than metaphor. Thanks to our brain’s plasticity, the adaptation occurs also at a biological level.