Thoughts on Reclaiming Depth

Algorithm Insubordination

Reclaiming depth in a world that profits from our distraction requires some rewiring - both within each one of us and within society. For us individuals, it means taking control of our mental environment rather than giving algorithms full rein to decide what to filter, when to interrupt, and how to approach everyday activities and decisions. Even simple acts can pave the way for deeper engagement. Take managing or turning off notifications: an app that constantly pushes meaningless prompts, much like a friend who cuts you off mid-sentence to promote their side hustle, might not deserve prime real estate on your home screen. Create your own rules of engagement for the apps and technology in your life, and enforce them; our devices should be treated as helpful but silent partners rather than clingy, attention-hungry acquaintances.
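To make "rules of engagement" concrete, here is a minimal sketch of per-app notification triage. The app names and policy labels are hypothetical, invented for illustration rather than drawn from any real platform's settings API; the point is simply that the rules live with you, stated once, and are applied mechanically.

```python
from dataclasses import dataclass

# Hypothetical rules of engagement: which apps may interrupt, which get
# batched into a digest, and which stay silent. Names are illustrative.
RULES = {
    "messages": "deliver",    # real people get through immediately
    "calendar": "deliver",
    "news":     "summarize",  # batched into a daily digest
    "shopping": "mute",       # promotional prompts never interrupt
}

@dataclass
class Notification:
    app: str
    text: str

def triage(note: Notification, rules=RULES) -> str:
    """Return 'deliver', 'summarize', or 'mute' for a notification.
    Unknown apps default to 'summarize': new arrivals must earn
    the right to interrupt."""
    return rules.get(note.app, "summarize")
```

The defaulting choice matters: an app you have never vetted should have to earn interruption rights, not inherit them.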

Algorithms as Puppet Masters

Meaningful experiences often bloom during moments of undistracted focus and playful idleness. To borrow from Karen Stenner’s work on social complexity, human beings sometimes flee from real depth when overwhelmed by noise, drifting instead toward simplistic or shallow solutions. Reducing that background clamor by setting firm boundaries around and (importantly!) WITHIN technology—much as one might enforce a healthy diet—can restore a sense of calm. But sure, if you feel that binge-scrolling can be relaxing, then have at it, as long as it’s not the default setting of daily life. Things like Focus Mode, Notification Summaries, and time-boxing social media can all play a part in maintaining sanity.

Incentivize Humanity

Of course, the problem runs deeper than personal habits. Major tech platforms thrive on capturing as many minutes of user attention as possible, regardless of whether the content is genuinely valuable. The easier it is to bucketize people’s behavioral streams, the easier it is to shape their needs, wants, and desires. This is where bigger measures become essential. Humane technology design, for example, could replace infinite scrolling and autoplay with content on-ramps, off-ramps, and natural stopping points - requiring interaction of some sort to continue. Instead of luring people into a digital labyrinth, platforms might gamify knowledge acquisition and application. Calmer designs reorient the incentive structure so that technology serves users’ goals rather than the other way around. Yet as Peter Pomerantsev observes in his critiques of online propaganda, such ethical redirection might require external pressure—perhaps through government oversight—to ensure these companies put public interest before profit. Right-to-disconnect legislation in some countries already aims to give workers breathing room after hours, and the same principle could expand to broader protections like algorithmic transparency or recourse for wrongful content moderation.
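The difference between infinite scroll and a natural stopping point can be sketched in a few lines. In this illustrative generator (not any real platform’s code), the feed serves one page at a time and only continues when the reader explicitly asks for more - silence means stop, the opposite of autoplay’s default:

```python
def paginated_feed(items, page_size=5):
    """Sketch of a feed with built-in off-ramps: yields one page of
    content, then pauses until the caller explicitly sends True to
    continue. Anything else (False, None) ends the session."""
    for start in range(0, len(items), page_size):
        keep_going = yield items[start:start + page_size]
        if not keep_going:
            break  # the default outcome is stopping, not autoloading
```

A caller would prime it with `next(feed)` for the first page, then `feed.send(True)` for each deliberate "show me more"; `feed.send(False)` closes the session. The design choice is the inversion of defaults: continuation costs a conscious act, stopping costs nothing.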

Schools and universities can also do their part by reinforcing habits that foster depth. That means valuing long-term projects over quick performance metrics, ensuring that students read entire books instead of condensed summaries, and teaching them to handle complexity rather than avoid it. Steven Sloman’s warnings about the “knowledge illusion” come into play here: in a digital environment replete with short snippets and illusions of expertise, learners may believe they grasp a topic when they’ve only skimmed the surface. Educators who encourage reflective study and critical thinking help inoculate students against that false sense of mastery. Instilling these skills early might be the most durable solution, because adults who’ve grown up with a taste for deep engagement are less likely to become passive consumers of digital quick hits.

Human Preference Autonomy

AI - especially in applications like LLMs - can be harnessed to expand horizons, but only if people remain in the driver’s seat. As Cailin O’Connor and others in the study of belief formation have pointed out, humans are social creatures who outsource much of their understanding to trusted networks. If those networks become dominated by fleeting content, misinformation, or manipulative algorithms, it takes collective effort to restore epistemic health. Societies must develop norms and regulations that reward thoughtful discourse, penalize exploitative data practices, and promote open, moderated dialogue around contentious topics. But I don’t believe that this sort of systemic “depth assurance” can come in the form of censorship, think tanks, and fact checkers; we’ve seen far too much polarization to think of these as precise instruments. Instead, I believe that a large part of the solution lies in democratizing algorithms and filters - for example, by creating marketplaces of preferences where individuals can manage, craft, and grow their own sets of preferences and beliefs, build their own privately held algorithms, and then carry them to systems, online platforms, governments, and so on, to interact with and view content in personal yet privacy-conscious ways. This would also involve helping individuals shape their preference sets and share them with others.
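A user-owned preference set could be sketched as follows. Everything here is hypothetical and simplified - the class name, the tag-based scoring, the JSON export - but it captures the architecture the paragraph describes: the filter belongs to the individual, runs on their side, and can travel with them between platforms.

```python
import json

class PreferenceSet:
    """A sketch of a privately held algorithm: the user's own filter,
    applied client-side, so raw behavioral data need not leave them."""

    def __init__(self, boosts=None, blocks=None):
        self.boosts = set(boosts or [])  # topics the user wants more of
        self.blocks = set(blocks or [])  # topics filtered out locally

    def score(self, item_tags):
        """Return None to hide an item, else a relevance score."""
        tags = set(item_tags)
        if tags & self.blocks:
            return None                   # never shown
        return len(tags & self.boosts)    # rank by overlap with interests

    def export(self):
        """Serialize the set so it can travel with its owner."""
        return json.dumps({"boosts": sorted(self.boosts),
                           "blocks": sorted(self.blocks)})

def rank_feed(items, prefs):
    """items: list of (title, tags) pairs. Returns the titles the
    user's own algorithm admits, highest-scoring first."""
    scored = [(prefs.score(tags), title) for title, tags in items]
    kept = [(s, t) for s, t in scored if s is not None]
    return [title for _, title in sorted(kept, key=lambda p: -p[0])]
```

The `export` step is where the "marketplace" idea enters: a serialized preference set is something its owner could inspect, refine, share, or present to a platform, rather than a profile inferred about them behind closed doors.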

All of this might sound ambitious, but the capacity for deep engagement is hardly extinct. People continue to produce novels, scientific studies, and heartfelt art, often facilitated by the very tools that can also distract. The challenge, then, is not to reject smartphones or AI outright—these are marvels of human ingenuity—but to reframe them so that they amplify rather than erode our ability to focus and connect. The challenge is also to bring the essence of digital experiences and AI (algorithms and filters) under the control of the individual. Imagine a future where your phone’s default settings are designed to protect your attention. Imagine a workplace culture that expects you to disconnect after five o’clock. Imagine schools giving children the tools to read deeply, argue respectfully, and question responsibly. If that sounds utopian, it needn’t be. History is full of examples, from the spread of printing presses to public libraries, in which societies steered technological revolutions toward public enrichment instead of shallow consumption.

Enacting such transformations demands humor, discipline, transparency, privacy, and a certain confidence that depth can triumph over speed and spectacle. In an era of data overload, refusing to let every app flash updates is an act of liberation; choosing to linger in thought is an affirmation of our human capacity for wonder. To borrow from Csíkszentmihályi’s flow states, the richest experiences still arise when we devote ourselves wholly to a single pursuit. As long as we remain mindful of the traps, as long as we hold onto the idea that some things—like reading novels, playing sports with friends, or even daydreaming—are worth embracing unhurriedly, we stand a good chance of preserving the depths that make life truly meaningful.