From forbes.com
By Dan Fitzpatrick
There is a line from Mustafa Suleyman’s book, The Coming Wave, that I keep returning to. Suleyman, the co-founder of Google DeepMind, writes about children who grew up travelling by horse and cart in the late 19th century, but spent their final days flying on airplanes, living in houses heated by the splitting of the atom. All in one lifetime, from horse and cart to nuclear energy.
I mentioned this last week during a talk I gave for the UAE Girls in AI programme, founded by Abeda Natha, director of digital learning at GEMS Wellington International School. The students who joined me are going to live through something even more dramatic. The real question, I feel, is whether our education systems can get them ready for it.
A survey commissioned by Kingston University, which polled more than 2,000 business leaders, found that 74 percent of them do not believe current graduates are prepared to succeed in a world of artificial intelligence. That is not a fringe concern; that is a near consensus.
What Has Changed?
To help the students make sense of this during my talk, I decided to walk them through some economics 101. That is, how the four factors of production have shifted across economic ages. Looking at the work of the 18th-century economist Adam Smith and, later, Alfred Marshall, we explored the classic framework of the four factors of production: land, capital, labour and entrepreneurship.
Going back to the feudal age, land was king. If you owned the land, you had the power. Then industry came along, the Industrial Revolution, and capital surged. You needed factories, machines, serious investment. At this time, labour became enormously more valuable than it ever had been because someone had to run those machines. Then the information age flipped it again. Suddenly you needed knowledge. Skilled, educated workers became the dominant factor.
Here is the thing. I think we are now entering the intelligence age. The economics are shifting again, quite radically. Capital requirements are collapsing. A laptop, a Wi-Fi connection and a subscription to a powerful AI. That is your start-up kit.
Land barely registers for huge swathes of knowledge work. In the talk, I showed the students an image of a young woman sitting in a Starbucks with a laptop and asked them to look. All four factors of production were right there. The table was her land, her laptop was her capital, she was the labour, and the idea in her head was entrepreneurship. The barriers to build something have never been lower.
If capital costs have cratered and land is becoming irrelevant for digital work, what is left? I think it is entrepreneurship. By entrepreneurship, I do not mean the Silicon Valley pitch-deck, seed-round culture. I mean entrepreneurial thinking, the ability to spot a problem, the creativity to imagine a fix, and the nerve to try building it.
In the session with the students, I gave them a simple exercise. Finish this sentence: “Someone really should make it easier to…” Then write down the first thing that pops into your head. Do not overthink it. Notice the friction in your daily life. This is where it starts.
I encouraged them to keep what I call a frustration diary. Every time you think this is annoying or why is there not a better way to do this, write it down. Then pick one a week and ask three questions. Who else has this problem? Would someone pay to fix it? Could I build a first version of this solution within a week?
That last question would have been laughable five years ago. In fact, it would have been laughable 12 months ago. Not anymore. A teenager with an idea and access to a powerful AI agent can prototype faster today than a funded start-up could manage even five years ago.
Humans Of The Gap
There is a concept I keep coming back to in my work, something I call the humans of the gap. As AI fills more of the capability space, writing code, generating content, crunching data and more, there is a temptation to place ourselves in the gaps, to get skilled in what AI cannot yet do. I get this question from parents almost weekly. What should my child be learning?
It is rooted in a flawed idea that we can thrive in the gaps where AI cannot perform yet. I think this is futile. The capability of AI will continue to increase and squeeze us out of those gaps. Instead, our human abilities become more valuable. Our judgement, our empathy, our ability to earn trust.
When talking with the students, I introduced them to an idea I have worked on with other groups: “the AI entourage”. I asked them to imagine that money was not an issue. Who would be in their entourage? What skills would their team have? The thing is, money is increasingly not the constraint. Any professional with AI “know-how” can work alongside a team of agents.
Think of it as an amplification layer. One person directing an AI entourage can handle work that used to require an entire department. The human directs. The AI executes. We outsource the doing, not the thinking.
This is a critical message for schools. In too many cases, we are still training students to be the doers, to store all knowledge, to follow instructions, and to produce standardised outputs. In the intelligence age, the doing is increasingly automated, and the value lies in directing, deciding, and imagining. It requires a fundamentally different educational model from what many are used to.
The Liminal Space
I then moved on to something that matters just as much as the skills conversation. The awareness that we are in a liminal space right now. We are somewhere between the old world, where a degree automatically led to a career, where knowledge was the currency, where institutions could afford to move slowly, and a new world that has not quite taken shape yet.
In that in-between zone, it can feel disorienting, anxiety-inducing, and uncertain. There is also something else in that overlap space. A chance to create the future. Not just react to it, but create it. For young people feeling overwhelmed by change, that is a powerful reframe. We can be agents in what comes next.
Finally, I shared the Japanese concept of ikigai, a framework for finding your reason for being at the intersection of what you love, what you are good at, what the world needs, and what you can be paid for. In this intelligence age, I would argue ikigai matters more than ever. When barriers to building are low, and AI handles much of the doing, the question that matters most is not what job should I get. It is what problem do I care about enough to solve? That is a question no AI can answer for you.
What Does This Mean For Schools?
Every idea I shared with the UAE Girls in AI, from the economic shift to the frustration diary to the AI entourage, challenges how many schools currently operate. We are still preparing students for the information age. In some cases, we are still preparing them for the industrial age. Asking them to memorise, follow instructions, produce standardised outputs.
The intelligence age demands something different. Entrepreneurial instincts, creative problem solving, comfort with ambiguity and the ability to work with AI as a genuine partner. If graduates are falling short, then adding a coding module or hosting an AI awareness day will not be enough. The response needs to be structural and cultural. We need to rethink what we assess, how we teach and what we are actually preparing young people for.
Are we playing a finite game, optimising for exam results and university placements? Or are we playing an infinite game, building humans who can thrive at 25, 35, 45 and beyond?
William Gibson once wrote that the future is already here. It is just not evenly distributed. Some young people are already using AI, building things, thinking like entrepreneurial founders. Others are hearing these ideas for the first time. Some have yet to hear them at all. The distribution of the future depends in large part on the educators who choose to shape access to it.
The stakes have never been higher.