Toward a Deeper Tomorrow

At this point in our distracted age, the warnings are loud and clear. We’ve witnessed the slow erosion of attention and depth as endless stimuli chip away at our ability to concentrate. James Williams (a Googler turned philosopher) warned in Stand Out of Our Light that the attention economy threatens human freedom through Big Tech’s systems of intelligent distraction and persuasion. Such distraction-by-design has structural incentives: more profit for Big Tech, more compliant subjects for government—yet it undermines our mental clarity. Meanwhile, leadership often stands by as this grand experiment on our minds runs unbridled. The result? A culture skimming the surface of everything, with precious little willingness (or even ability) to dive deep.

While the problem isn’t new, the stakes have never been higher. Advancements in AI have accelerated our shift into an era where much of our meaning-making takes place online; at the same time, it’s an era where money-making (by corporations) and opinion-shaping (by governments) often trump genuine human engagement. This has profound implications. When we outsource our thinking to algorithms and reactive platforms, we risk surrendering something essential. Reflex replaces reflection; creativity atrophies in the shadow of automation. Writer Pedro Gonzales put it bluntly: “Completely outsourcing creativity to algorithms would be a grave mistake and nothing short of a betrayal of what makes us human.” If we rely on algorithmic feeds to decide what we read, watch, or believe, our inner life risks becoming a puppet of code.

Throughout this essay, we’ve cautioned about letting the emissaries of technology run the whole show—apps, feeds, and AI tools can be incredibly useful, but they can also depose their master if we’re not careful. Psychiatrist-philosopher Iain McGilchrist offers an uncanny metaphor in The Master and His Emissary: a society governed by the narrow, analytical part of the mind while the holistic, context-rich part is sidelined. In our rush for efficiency and instant gratification, we risk living in an imbalanced “left-brain” world—all detail, no big picture. Nuance, context, and wisdom get sidelined. Like the parable’s wise master, we’re overthrown by an overconfident emissary that sees only fragments. In an era when metrics trump meaning and the “background vision” needed for insight is lost, the cultural and spiritual risks loom large. Trivial notifications drown out any chance for introspection, and entire nations find themselves more malleable to commerce and politics than ever before.

Yet this is not a eulogy for humanity; it’s a call to arms for depth and agency. After all the critique, we turn to hope. The digital age has been a wild ride through the shallows, but it doesn’t have to end in tragedy. Recognizing the problem is the first step to reclaiming our minds. And here’s the heartening truth: depth is a choice. As AI becomes more capable, our distinctly human gifts—imagination, empathy, moral reasoning—become more important than ever. If AI can handle the routine or augment our knowledge work, that could free us for the kind of deep, strategic, and creative thinking no machine can replicate. Think of how calculators liberated mathematicians to tackle bigger questions; if we steward AI well, it can mark the beginning of a more humane future. But that’s a big if. We must shape technology with human values rather than let it dictate our values by default. Even if intelligence tasks shift to machines, we can still cultivate wisdom—and we may discover that true wisdom is ultimately what we’ve been missing.

Restoring depth is not about rejecting technology or indulging in nostalgia. This isn’t a Luddite call to smash our smartphones or a misty-eyed longing for pre-digital days. It’s about reclaiming control and refining our values in a world that profits from our distraction. We can embrace modern tools while insisting they serve as our tools—not our masters. That means setting boundaries that protect time for concentration and daydreaming, choosing media that nourishes rather than numbs, and demanding user-centered design that respects attention. Educationally, we can teach the next generation not just to code but to reflect, resist distraction, and find meaning in an age of endless information and persuasion. Politically, it might mean treating the defense of human attention and cognition as a public good. In short, reclaiming depth is about choosing what we pay attention to and what we value—asserting that some things are too important to entrust to autopilot.

Democratizing AI and Filter Systems

But we must also look beyond personal habits toward structural reforms. If governments and large companies continue to hold the secret sauce of AI and algorithmic filtering—tuning our feeds for profit or political convenience—shallowfication will persist. To fight it, we need ways to amplify our agency rather than constrain it. I believe we have much to learn from the open source movement here, especially its transparency, shared ownership, and community-driven improvement. By “open sourcing” personalization algorithms, or at least letting people bring and modify their own “settings”, we can give people back control of their digital lives. Personally, I advocate for marketplaces of preferences where individuals can craft and grow their own belief sets and personal filters. Instead of relying on a single opaque algorithm curated in a black box, users could bring their own preferred filters—privately held algorithms—into digital spaces, social platforms, or even government sites. By “breaking open” the black box of AI, we let users choose how content is sorted, highlighted, or downplayed, respecting both their attention and their privacy. Imagine logging into a platform with your personal preference profile, co-created with experts and communities you trust, to shape what you see and how you see it. Your feed becomes a reflection of your values and curiosity, not a puppet of click-driven metrics. This user-level autonomy is a direct challenge to corporate profit motives and government manipulation—an act of reclaiming intellectual and moral agency in the digital sphere. Nor need this be detrimental to corporations; quite the opposite: platforms where people can make meaning, discover, and live create more fulfilled and loyal customers.
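
To make the idea concrete, here is a minimal sketch of what “bring your own filter” could look like in code. It assumes a hypothetical platform that exposes raw candidate items and accepts a user-supplied preference profile; every name here (FeedItem, PreferenceProfile, rank_feed) is illustrative rather than an existing API, and a real system would need privacy-preserving execution, community-maintained profiles, and far richer signals.

```python
# Minimal sketch of a user-owned feed filter. All names are hypothetical;
# this is an illustration of the idea, not any platform's real API.

from dataclasses import dataclass, field


@dataclass
class FeedItem:
    """A candidate item the platform would normally rank in a black box."""
    title: str
    topics: set[str]
    reading_time_minutes: int


@dataclass
class PreferenceProfile:
    """A portable, user-owned preference set: inspectable, editable, shareable."""
    valued_topics: set[str]
    muted_topics: set[str] = field(default_factory=set)
    prefer_long_reads: bool = True


def score(item: FeedItem, profile: PreferenceProfile) -> float:
    """Transparent scoring the user can read and modify."""
    if item.topics & profile.muted_topics:
        return float("-inf")                      # filtered out entirely
    s = float(len(item.topics & profile.valued_topics))  # reward topical relevance
    if profile.prefer_long_reads:
        s += min(item.reading_time_minutes, 30) / 30     # nudge toward depth, not quick hits
    return s


def rank_feed(items: list[FeedItem], profile: PreferenceProfile) -> list[FeedItem]:
    """The platform supplies candidates; the user's own profile decides the order."""
    scored = [(score(item, profile), item) for item in items]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [item for s, item in scored if s != float("-inf")]


# Example: a profile co-created with a community you trust, applied to a raw feed.
profile = PreferenceProfile(valued_topics={"philosophy", "attention"},
                            muted_topics={"outrage"})
feed = [
    FeedItem("Deep essay on attention", {"attention", "philosophy"}, 25),
    FeedItem("Outrage bait", {"outrage", "politics"}, 2),
    FeedItem("Quick tech note", {"ai"}, 3),
]
for item in rank_feed(feed, profile):
    print(item.title)
```

The point of the sketch is that the ranking logic is short enough to read, edit, and share with a community: the opposite of an opaque, engagement-optimized black box.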

A More Urgent Frontier

The urgency for such reforms is far greater now than it was a decade ago. AI systems capable of generating deepfakes, delivering personalized persuasion, and even shaping our political discourse in real time are rapidly outpacing the guardrails we once assumed would keep us safe. As meaning-making increasingly moves online, money-making (by corporations) and opinion-shaping (by governments) step in to steer the conversation. If we don’t address these tensions, shallowfication will deepen—and we risk losing our capacity for reflection at scale. This isn’t dystopian hyperbole; it’s the next evolution of the same dynamic we’ve been describing. But with the rise of AI, the stakes jump exponentially. The time to choose depth has never been more critical.

So what might a deeper digital tomorrow look like? Imagine an online culture designed for curiosity and patience, rather than quick hits of dopamine. Envision social platforms that foster meaningful connection and dialogue over mindless scrolling. Picture collaborations between humans and AI that amplify our imagination instead of undercutting it—where we remain firmly in charge of the narrative. In such a future, our devices become instruments of focus when needed, not constant distraction engines. Our workplaces measure success by creativity and insight, not just “engagement” metrics. Our leaders finally catch up, championing an “attention ecology” that regards the human capacity for thought as a treasure, not a resource to be strip-mined for profit.

This future is plausible—but it requires our will and wisdom to bring it about. Each of us, in our own lives, can practice the art of depth: reading a book without the urge to click away, engaging in an unhurried conversation, blocking out time for genuine creativity. These small, repeated acts spark a cultural shift. Coupled with systemic reforms—like democratized algorithms, thoughtful regulation, and a renewed cultural emphasis on genuine meaning-making—they form a blueprint for a new age of depth.

In the end, choosing depth is an act of courage and hope. We are not doomed to be shallow beings in a shallow world. We remain, defiantly, human, and we can continually redefine what that means in the face of change. The future belongs not to the fastest scroller but to those willing to think, feel, and imagine deeply. With enough resolve, we can build the structures—and personal habits—that make depth possible. And that deeper tomorrow is ours to create—if we care enough to choose it.