26 Nov 2024
We’re at a crossroads again, and most people don’t even realise it.
About a year ago, I dived into some fascinating books that really challenge how we see reality:
• The Case Against Reality: How Evolution Hid the Truth from Our Eyes by Donald D. Hoffman
• The Experience Machine: How Our Minds Predict and Shape Reality by Andy Clark
• Phantoms in the Brain: Probing the Mysteries of the Human Mind by V. S. Ramachandran
• How Emotions Are Made by Lisa Feldman Barrett
Now, I’m not some academic heavyweight, so getting through these wasn’t exactly a walk in the park. But with a bit of curiosity, persistence, and some help from AI chatbots, I managed to wrap my head around their main ideas—or at least I like to think so.
I spent weeks asking AI models like ChatGPT and Claude questions, looking for patterns and common threads.
As a quick side note, this whole experiment wouldn’t have been possible without these AI tools because I didn’t know anyone who had read all four books and was willing to discuss them.
So there I was, trying to match the experiments and findings of Hoffman, Clark, and Ramachandran. Later, Lisa Feldman Barrett’s work added more colour.
Then, not too long ago, it all clicked. I started seeing things differently, especially when I looked at the impact of social media and the coming wave of artificial intelligence.
The usual suspects in articles, documentaries, and interviews are shorter attention spans and rising anxiety and depression. And, to be honest, I don’t think we do enough to prevent or fix them.
After my little experiment mentioned above, I realised something bigger was happening. So, let’s break it down because it’s all connected.
The Attention Dilemma.
We live in interesting times, mainly because of how we consume information and the mental habits we’ve built as a culture. There’s an entire economy focused on grabbing our attention. Companies big and small, and individuals too, wake up every single day with one goal: to capture our attention and monetise whatever they can, whether it’s our time, our data, or our actions.
It’s a cutthroat game, and these organisations use everything they’ve got—technology, psychological hacks, dark design patterns—to hook us and build habits that make them money.
Because there’s so much cash on the table, we’ve seen the rise of a new subculture: influencers. Somehow, these folks are seen as role models. Personally, I think they’re not much different from drug dealers, but that’s another post.
Today, we’ve got millions of people contributing to the erosion of our shared reality. Most of them operate without much regulation or oversight. Their main goal? To distract us, grab our attention, and sell it off along with any data they can gather. This may be news to some: once a system is “aware” of your existence as an individual, even the time you spend outside its ecosystem counts as data points. If you haven’t checked Instagram or LinkedIn for a few hours or days, it matters, and “the system” will find a way to bring you back through notifications or anything else that can push your buttons and motivate you to get back on the ride.
The fact that you scroll through the feed and don’t stop looking at an image or reacting to some information is considered valuable data. The systems will adjust, recombining the mix of information that triggers your reactions.
In this economy, every nanosecond matters: every choice, every move, and every space in between.
Why does this matter?
Well, I see culture as the operating system of our societies.
The glue that holds our societies together is how we see things as a group and solve problems together. The first variable in this collaboration is our shared reality.
One significant outcome of this assault on our attention is that our ability to think deeply and with nuance about complex issues has suffered, especially over the last few generations. Plenty of studies support this.
In this context, people often tell me to keep my writing short and punchy. Use snappy paragraphs with catchy headlines. “People don’t read long pieces anymore,” they say. “They get bored quickly. They mostly engage with stuff that outrages them or confirms their beliefs. Everything needs to be a story; otherwise, it will be ignored.”
And you know what? They’re right. It doesn’t sit well with me, but it’s true.
A big part of this shift comes from how we access information now: platforms that reward short-term thinking through instant gratification with micro-doses of dopamine. Everything is on-demand and happening right now, with little sense of history. If history does come up, it’s usually to make a point, shut down a conversation, and push you to the next one without a break, context, or mercy for your time.
I think our attention is being held hostage by this new way of life.
Most of us live in a world where our shared reality no longer forms a complete picture.
Fragmented realities.
This brings me back to the main topic: our shared reality—the values, knowledge, and goals we hold together. All of this is guided by the “software” our brains are running: our minds. Minds connected to local traditions, global culture, personal and collective values, anchored in principles—well, what’s left of them, anyway.
According to Hoffman, Clark, Ramachandran, and Barrett, each of us lives in our own version of reality, shaped by the information we receive, our experiences, and our personal beliefs. Technology, which tailors content to our preferences, amplifies this fragmentation.
Our shared values are fading or never forming, threatening the future of communities and civilisation.
Enter Artificial Intelligence.
Why is this broader perspective relevant? A new technology is knocking at our door that’s set to make a massive impact on humanity. It’s designed to mimic human intelligence, connecting us intellectually and emotionally. And here’s the thing: it can adapt to each of us individually.
How do I know? I’ve spent a lot of time over the past 18 months with these technologies. I was blown away by the speed of this new infrastructure, by this mind extension. Then I realised something unsettling:
This new thinking habit has messed with my reality. My expectations when interacting with other human beings have changed. People around me seemed slow, unable to keep up with my AI-induced hallucinations.
Here’s some advice: if you find yourself lost in this echo chamber, unable to hear your own voice, stop.
Step back. Talk to a human being, a friend.
This is important, whether you notice it or not. Our reality is being affected—both individually and collectively. It’s not an outright attack (at least not yet), but more like a slow-rising tide.
Idealists and tech enthusiasts [including myself] often say that AI is a great equaliser. This realisation has made me rethink that stance.
The AI flood.
I must disagree with Mustafa Suleyman’s thesis that AI will arrive as a wave. We’re already witnessing a slow-rising tide, not a sudden crash. These days, our world revolves around one technology: AI. Its gravity is amplified by massive investments and the will to be first. The tide created by this force might lift some boats [mainly yachts] but push others ashore or leave them stranded.
As a species, we’ve survived and thrived because of trust and the structures we’ve built for the common good: principles, norms, laws, infrastructure, languages, currencies, and cooperation systems. Right now, all that is slowly being flooded with artificial intelligence and automated systems.
We’re not ready. We haven’t recovered from the last flood: social media, the experiment that hooked us all.
This new tide amplifies the minds already affected. Conspiracy theories are on the rise. We trust our shared reality less and less, risking a slide from cynicism into nihilism. And that’s a tough spot for a society to be in, long-term and at scale.
How vulnerable are we?
One in seven young men can’t name a single friend, and one in four can’t name a best friend. That alone should tell us we’ve got a sizeable group of people at risk of addiction, depression, or worse: a part of our society ripe for exploitation, the perfect candidate for nefarious AI experiments and the ideal target for f/cked-up influencers.
Our shared reality is dissolving.
All this points to a bigger issue: our shared reality is disappearing. Technology offers powerful tools to diminish and reject the reality that holds society together. The senses through which we experience this world are intermediated by technology. We don’t meet in person; we go through technology to shop for food, distraction, and love. Carefully crafted user experiences handle our connections.
The age of mass media is over. The age of individual media began when social media gave everyone a voice in this digital space.
We all jumped in, dreaming about having meaningful conversations and relationships, only to find that these spaces were built on principles we never agreed to: addiction, outrage, conspiracy thinking.
Final thoughts.
Our shared reality matters more than ever because we have less and less of it.
The choices we make every day when engaging with technology will shape our society for generations.
Our shared reality is vital for our future and is constantly under threat. We spend ever more hours being users rather than humans: neighbours, parents, siblings, strangers on a train saying hello. Our realities are curated user experiences designed by specialists like myself. Every move is watched, and every action or inaction is analysed.
Real-life experience is losing value at the dawn of AI as fabricating new realities becomes cheaper.
In this context, we’re losing touch with each other and ourselves.