Opinion

An AI-induced hallucination.

This thing isn’t a response. It’s an intervention.
But before anything else, it may just be an AI-induced hallucination. Mine.

Okay, let's do this.
Last week, Scott Galloway, Robert Armstrong, and Ed Elson sat down and had a fascinating, well-argued, and [respectfully] completely wrong conversation about AI. Listen to it here or read about it there.

The thesis? AI will be like aviation, bicycles, or vaccines: a utility that spreads, democratizes value, and ultimately benefits the many rather than the few.

That’s cute. But no. I think Galloway’s initial instinct was closer to a possible future. Or just a different hallucination of mine.

AI isn’t aviation. It isn’t bicycles. It sure as hell isn’t vaccines.

My thesis? AI [synthetic intelligence] is more like Oil. A finite, extractable, and manipulable resource that will make a handful of people disgustingly wealthy while everyone else pays for access. And if history is any indicator, the people selling it will make sure you never get to own it outright.

Why Oil? I'm glad you asked: because Oil powers the school bus, but it also moves tanks. And with OpenAI and Google openly declaring their stake in what is set to become the most profitable industry of our time [read: war]... Let's leave that for another post.

Open-source, a source of distraction.

Right now, the internet is flooded with AI tools, models, and open-source experiments. It feels like anyone can spin up an LLM or play with some AI agent. That’s not the future. That’s a carnival. The actual game isn’t what’s floating around on GitHub—it’s who owns the infrastructure and access to billions of devices.

DeepSeek? A distraction. Meta’s open models? A distraction.

Sure, you can download an AI model. But can you train it? Can you run it at scale? Can you integrate it into billions of devices, run it through the world’s most advanced semiconductor pipelines, and fine-tune it on proprietary datasets that cost billions to assemble?

No. And neither can most companies or governments. Because the real power of AI [like oil] isn’t in discovering it exists. It’s in who controls the extraction, refinement, and distribution. More importantly, it’s in what you do with it.
That’s where it gets interesting.

Intelligence isn’t valuable unless you can use it to make a difference [at scale]. An IQ of 200-plus is wasted if you don’t have a setup to apply it in.

Synthetic Intelligence is a tool, arguably a weapon: a way to accelerate dominance in verticals, lock down markets, and extract revenue.
That's why we see such insane valuations for OpenAI.

My prophecy: Big AI will look like Big Oil.

A century ago, oil was just black goop in the ground. It took a few visionaries and cartels, Rockefeller, Standard Oil, OPEC, to turn it into the most powerful economic weapon in history. Today, a few trillion-dollar players, OpenAI (Microsoft), Alphabet, Amazon, Apple, and Alibaba [cough... the CCP], are dressing up and positioning themselves as the OPEC of intelligence.

How? Three ways:
1. Control Over Extraction – AI needs compute. Compute needs chips. Chips need fabs. Guess who owns that pipeline? Not you.

2. Price Dictation & Market Influence – Want top-tier AI? Pay up. Want to build on their models? Pay up. Want access to the best intelligence? They decide how and when you get it. 

3. National & Corporate Power Play – AI is already a geopolitical asset, just like oil. If you think open-source AI will save you, ask yourself why the White House is regulating NVIDIA chip exports and why governments around the world are scrambling to control AI development.

The future of intelligence is paywalled.

The mistake we make is thinking AI is “freeing intelligence.” It’s not. It’s commoditizing it. Intelligence won’t be something you just “have.” It’ll be something you rent, packaged into subscription models, pay-as-you-go APIs, and enterprise contracts. We’re already seeing the early stages of this. But there is more to it.

Having access to an AI agent isn’t enough.
Look around. We’ve had the internet for decades. We’ve had personal computing power for decades. The result? A handful of people made fortunes, while the majority, despite owning laptops, smartphones, and internet access, remain just eyeballs generating revenue for influencers and corporations.

This isn’t doom-mongering. It’s precisely what happened with energy. And if you think AI is different because “open-source exists,” ask yourself this:
Can you build your own smartphone?
Can you generate your own electricity?
Can you refine your own gasoline?

No. Not at scale. Because that’s not how power works.

The bottom line.

I think I'm running out of whatever I was under, so let's wrap this up.
Galloway, Elson, and Armstrong's conversation was fun, but it leads in the wrong direction. It's an interesting thought, but it's a distraction.

In my distracted opinion, Scott's initial instinct was closer to a possible future.
AI isn't some benevolent, society-leveling force. It's a business model run by a handful of players who control the most valuable resource of the 21st century: synthetic intelligence.

And just like oil, those who own it will make sure you stay dependent.
No matter if you are an individual, a government, or a corporation.

/End of hallucination.