Some thoughts on imagination, AI, and the Jehovah’s Witness moment I had with friends who think this shift is still far off (hint: it’s already here).
Also: why the real bottleneck isn’t tech anymore—it’s our ability to see what’s possible (even if my seeing is somewhat limited these days, courtesy of hay fever).
The Great Cognitive Shift
For a while now, I've had this idea.
One of those "wouldn't it be amazing if…" kinds of ideas: pretty bold, beautiful, clearly possible in theory, but completely out of reach unless I found the right people, paid a lot of money, and managed the chaotic puzzle of execution.
You know how it goes. Developers, designers, no-code tools, a maze of decisions, and most likely some duct tape (AI will never replace duct tape!).
Even if the idea is good, pulling it off becomes a mountain. And when that happens, most people either don't start at all, overwhelmed before they begin, or burn out halfway because it's just too much.
But something shifted. Recently, I asked an AI to build the exact thing I had imagined.
And it’s doing it.
It's not just offering ideas or tutorials. Not simply pointing me to tools. It's actually building the damn thing: connecting the pieces, generating working versions, adapting as it goes. Yes, it still means I need to do some walking, but AI is holding my hand and telling me exactly what to do when I ask it to dumb it down for me (and I do, often).
It's not flawless, but it's real. And it's happening. The game has changed, and it's becoming obvious, at least to some.
Execution isn’t the hurdle anymore.
What we’re waiting on now is clarity, vision, and plenty of imagination.
A Different Kind of Dissonance
Recently, I went to visit some family and friends. We clearly live in different bubbles, which I know is common these days. But the contrast hit me more than usual.
I live in a funny space, largely online, consuming a very different information diet than the people I was sitting across from. We have similar backgrounds in many ways, and a lot in common, but we definitely live in different worlds. Possibly more than ever.
We started talking about AI. I asked them how they saw it—whether they’d used any of the tools, what they thought about where things were headed.
For the record: I don’t consider myself an expert. I’m merely interested. I’ve been watching things unfold since the early days of ChatGPT, when it just answered questions and wrote some decent text. Now, I use most of the major models regularly. I experiment and build things with them (clunky, ugly, often silly). I try to understand what they’re capable of and how to use them better. Not because I think everyone should automate everything, but because to not engage with this shift at all feels, frankly, a little wild right now.
But as I was talking to them—smart, educated people—I felt like a Jehovah’s Witness knocking on someone’s door to tell them the end is near. I wasn’t trying to sell anything. I just wanted to share what I was seeing.
They looked at me like I was crazy.
Some of them said AI isn’t even close to changing the world in any meaningful way. That it’s overhyped. That it’s still “just a new Google.” And I get the instinct. If you’re only judging it by what you used once last year, or worse, if you haven’t used it at all, you’re going to miss the trajectory. Most people aren’t even caught up to what it can do today, let alone what’s just around the corner.
At one point, I talked to an 18-year-old who said she wanted to be a writer or journalist. And I couldn’t help but think: are we back in 1998? Not in a dismissive way, just with a sense of disorientation, like watching a young person step into a game whose rules are already being rewritten.
That conversation reminded me how big this gap is. And not just between generations, but, most importantly, between awareness and application.
The Old Game: Execution
Not long ago, your value hinged on how well you could execute.
You needed to know how to build things, how to connect systems, navigate tools, understand which part went where. To some extent, that's still the case. But it's also different.
Execution meant friction, and the people who could navigate that complexity were in high demand.
But that friction is fading fast. AI is absorbing a huge portion of it. The stitching, mapping, formatting, and deploying are increasingly handled for you.
What used to require a team of specialists now takes a well-phrased prompt and a bit of iteration. Funny enough, it still proves to be too much for most.
Anyway, the question has shifted.
The bottleneck is no longer how to do the thing.
It’s what to do—and why do it.
Vision, Synthesis, and Imagination
If AI handles the “how,” then the value moves upstream, to the “what,” “why,” and “when.”
What used to be a major friction point (e.g., stitching tools together, coding MVPs, building systems) is now being done for you. And soon it will be done with you, in real-time collaboration. This shifts the whole nature of what we need to be good at.
The new bottleneck is imagination, framing, and discernment.
What should be built?
Why is this worth solving?
What's the real problem under the surface-level complaint (aka the "problem behind the problem")?
This is systems thinking, problem framing, and imaginative synthesis—skills that require both lived experience and conceptual clarity.
So here’s what (I believe) actually matters now:
1. Imagination
The ability to see what isn’t there yet.
To remix, reframe, and spot overlooked opportunities.
This is where meaningful work begins. Noos¹ over noise.
2. Clarity of Thought and Language
If you can’t articulate what you want, AI can’t help you build it.
Language has become the interface. Precision of thought and articulation is the new superpower. You have to know what you want, explain it clearly, and understand the implications. Clear thinking is now a form of design.
3. Pattern Recognition
The ability to notice gaps, themes, and signals hiding in plain sight: trends, recurring patterns, the problems behind the problems.
It’s one thing to respond to a prompt. It’s another to spot the thing no one else is seeing yet. To spot the real problem instead of just reacting to symptoms.
4. Problem Framing
AI can help you solve problems. But it can’t tell you if you’re solving the right one.
That’s on you. You need to be the one asking the better questions. Spotting what matters. Defining what needs fixing.
5. Systems Thinking
Zooming out. Understanding how parts connect.
Being able to anticipate second-order effects and build for durability, not just functionality. It means understanding the consequences and holding complexity without drowning in it.
This is how you design something that lasts, not just something that works today.
6. Critical Thinking and Judgment
Because even if AI gives you 10 options, you still need to choose. That can be overwhelming.
You need to know what makes sense, what feels aligned, what lands in the real world.
AI doesn’t have taste. You do.
7. Emotional Agility and Agency
Because even with AI, it’s still easy to stall out. To freeze, to overconsume, and to never decide. Actually, that’s one of the dangers I see: having too much choice.
Anyway, agency still matters. You still have to move (and do it fast, but thoughtfully).
And you still have to handle uncertainty, resistance, and complexity without shutting down.
You still need to care.
Execution Is Not Dead
Let’s not confuse ease with irrelevance. Execution still matters. It’s just morphed into something new.
Execution now means primarily:
Orchestrating: Knowing when to intervene, when to delegate, and how to keep momentum.
Sequencing: Structuring how tools and outputs connect, step by step.
Curating: Choosing what not to build. Deciding what’s essential.
Refining: Spotting the small details that make the biggest difference.
You could say it's meta-execution: less about labor, more about leverage.
You’re no longer doing all the work by hand, but rather designing systems that do it for you. And that’s tricky (for now).
Lower Barriers, Higher Demands
Here’s (yet another) paradox of today’s world: AI lowers the technical barrier to entry, but it raises the bar for thinking.
Now anyone can build and ship something.
But not everyone can recognize what's worth building. That's the real divide.
Some people will use AI to speed up what they already do.
Others will use it to step into entirely new possibilities.
This shift isn’t just about pace. It’s about direction.
The Deeper, Non-Tech Stack
As AI absorbs more of the tactical layer, these human qualities become more valuable. We kind of know that. But what does that even mean? I’m not sure there’s consensus on that, but here’s what I see as important:
Cultural attunement – Understanding context, nuance, and what actually resonates (yay to my Anthropology class from ages ago!)
Ethical discernment – Choosing wisely, not just efficiently
Embodied wisdom – Drawing on lived experience, not just data. Our bodies are here for a reason, and not just a bottleneck.
Narrative craft – Making meaning, shaping understanding, connecting the dots.
Signal detection – Spotting original insights before they become obvious; filtering the info soup with better filters, from our bodies to better questions.
These are not "soft skills"; they're strategic elements now.
Imagination as a Practice
Imagination isn’t magic. It’s not reserved for artists or visionaries.
It’s essential, trainable, and would probably benefit from a bit of rewilding.
Some ways to build it:
Cross-pollinate disciplines. Let unlike inputs collide.
Add constraints. Limitations sharpen creativity.
Ask better questions. Flip assumptions.
Track anomalies. Pay attention to the odd stuff.
Create white space. Ideas need room to breathe.
Prototype fast. Clarity often comes through action.
But all of the above requires a degree of intentionality, some deliberate practice, before it becomes second nature. It's much easier to drown in noise and distractions than to do any of it. Still, this is a fork-in-the-road moment.
Co-Creation
AI isn't replacing you, but it is most likely replacing a lot of what you do. It's also inviting you into a different kind of relationship: you bring depth, judgment, and lived experience; it brings speed, reach, and adaptability.
The magic is in the combination and in partnering with a new kind of intelligence.
The best outcomes come from people who know how to bring their full selves into that collaboration.
I still see people online arguing about who wrote what. Was this written by a human? By AI? By a "real" writer? These debates are everywhere right now.
But not long from now, we’ll look back and think, “Wow… that was so 2025.”
Because soon, nobody will care who or what wrote something.
The only question will be: Did it say something worth hearing?
We’re all holding the same three primary colors now: access to the same tools, models, and inputs. What matters isn’t whether the red came from a paint tube or a machine.
What matters is what we do with those colors. What we choose to depict. How we choose to see.
You can look at the world like Cézanne. Or like Picasso. Or like no one else.
The palette is infinite and the difference is your vision.
The Real Risk Is Complacency
The real risk isn't that AI will replace you (ok, it kind of is, to some degree), but that it will amplify whatever mindset you already have. If you're passive, vague, or waiting for someone to tell you what to build, AI won't fix that.
It’ll just make your stuckness faster.
Meanwhile, someone sharper, more curious, more playful and willing to experiment will move past you. Fast.
So, How To Live Life Now?
If you’re someone who thinks deeply, sees patterns early, or dreams in projects you’ve never quite executed—this is your moment, because it means you don’t need to master every tool.
You just need to know what matters and be able to describe it clearly.
And you need the courage to act before it’s perfect.
Your biggest leverage is:
Problem intuition
Clear articulation
Systems awareness
Relational thinking
Follow-through with feedback
We’re not just adapting to the future. We’re shaping it.
AI won’t replace your imagination.
But it will test how much of it you're actually using. So ask better questions.
Make weirder connections.
Build the thing that only you can see.
The question is no longer: Can it be built? but: Is your vision worth building?
¹ In Ancient Greek, νοῦς (noûs), or νόος (nóos), translates to mind, intellect, or understanding. It's a key concept in philosophy, particularly in the works of Plato and Aristotle, representing the faculty of the mind that allows for grasping truth and reality.