Quotes



Neuroscience of the afterlife

Now we know that rather than merely reacting to the external world, the brain spends much of its time and energy actively making predictions about the future—mostly the next few moments. Will that baseball flying through the air hit my head? Am I likely to become hungry soon? Is that approaching stranger a friend or a foe? These predictions are deeply rooted, automatic, and subconscious. They can’t be turned off through mere force of will.

And because our brains are organized to predict the near future, they presuppose that there will, in fact, be a near future. In this way, our brains are hardwired to prevent us from imagining the totality of death.

If I am allowed to speculate—and I hold that a dying person should be given such dispensation—I would contend that this basic cognitive limitation is not reserved for those of us who are preparing for imminent death, but rather is a widespread glitch that has profound implications for the cross-cultural practice of religious thought. Nearly every religion has the concept of an afterlife (or its cognitive cousin, reincarnation). Why are afterlife/reincarnation stories found all over the world? For the same reason we can’t truly imagine our own deaths: because our brains are built on the faulty premise that there will always be that next moment to predict. We cannot help but imagine that our own consciousness endures.



Concentrating

Think about what the word [concentrating] means. It means gathering yourself together into a single point rather than letting yourself be dispersed everywhere into a cloud of electronic and social input. It seems to me that Facebook and Twitter and YouTube—and just so you don’t think this is a generational thing, TV and radio and magazines and even newspapers, too—are all ultimately just an elaborate excuse to run away from yourself. To avoid the difficult and troubling questions that being human throws in your way.



Thinking for yourself

Thinking means concentrating on one thing long enough to develop an idea about it. Not learning other people’s ideas, or memorizing a body of information, however much those may sometimes be useful. Developing your own ideas. In short, thinking for yourself. You simply cannot do that in bursts of 20 seconds at a time, constantly interrupted by Facebook messages or Twitter tweets, or fiddling with your iPod, or watching something on YouTube.

I find for myself that my first thought is never my best thought. My first thought is always someone else’s; it’s always what I’ve already heard about the subject, always the conventional wisdom. It’s only by concentrating, sticking to the question, being patient, letting all the parts of my mind come into play, that I arrive at an original idea. By giving my brain a chance to make associations, draw connections, take me by surprise. And often even that idea doesn’t turn out to be very good. I need time to think about it, too, to make mistakes and recognize them, to make false starts and correct them, to outlast my impulses, to defeat my desire to declare the job done and move on to the next thing.



Acting against attention stealing

Some scientists say these worries about attention are a moral panic, comparable to the anxieties in the past about comic books or rap music, and that the evidence is shaky. Other scientists say the evidence is strong and these anxieties are like the early warnings about the obesity epidemic or the climate crisis in the 1970s. I think that given this uncertainty, we can’t wait for perfect evidence. We have to act based on a reasonable assessment of risk. If the people warning about the effects on our attention turn out to be wrong, and we still do what they suggest, what will be the cost? We will spend less time being harassed by our bosses, and we’ll be tracked and manipulated less by technology – along with lots of other improvements in our lives that are desirable in any case. But if they turn out to be right, and we don’t do what they say, what’s the cost? We will have – as the former Google engineer Tristan Harris told me – downgraded humanity, stripping us of our attention at the very time when we face big collective crises that require it more than ever.



Itching powder

At the moment it’s as though we are all having itching powder poured over us all day, and the people pouring the powder are saying: “You might want to learn to meditate. Then you wouldn’t scratch so much.” Meditation is a useful tool – but we actually need to stop the people who are pouring itching powder on us. We need to band together to take on the forces stealing our attention and take it back.



Attention-harming factors

I learned that the factors harming our attention are not all immediately obvious. I had been focused on tech at first, but in fact the causes range very widely – from the food we eat to the air we breathe, from the hours we work to the hours we no longer sleep. They include many things we have come to take for granted – from how we deprive our children of play, to how our schools strip learning of meaning by basing everything on tests.



Nothing as evidence

The spread of ever more realistic deep fakes will make it even more likely that people will be taken in by fake news and other lies. The havoc of the last few years is probably just the first act of a long misinformation crisis. Eventually, though, we’ll all begin to take deep fakes for granted. We’ll come to take it as a given that we can’t believe our eyes. At that point, deep fakes will start to have a very different and perhaps even more pernicious effect. They’ll amplify not our gullibility but our skepticism. As we lose trust in the information we receive, we’ll begin [...] to “doubt reality itself.” We’ll go from a world where our bias was to take everything as evidence [...] to one where our bias is to take nothing as evidence.



Consistent, boring and wonderful

Out there in the multiverse is a reality where the web is a complete borefest. Information is the only driving factor to visit a “web page”, and PWAs have never come to exist. Custom styling, fancy interactive animations, and single-page functionality aren’t even things that can be implemented. The web is just a system of HTML/plaintext documents sharing information and data. Users browse the web in quick bursts to satisfy their queries or read something interesting. Then, they return to real life.

My goodness, what a beautiful reality that would be. Consistent, boring and wonderful.




