Choosing Friction: Why Embracing Difficulty Enhances Life in the Age of AI
In an era that prizes convenience, this piece champions embracing friction—difficulty, human interaction, and deliberate effort—as essential for personal growth, authentic art, and resilient communities, critically examining the "frictionless" promises of generative AI.
In 2018, legal scholar Tim Wu articulated in The New York Times: "Today’s cult of convenience fails to acknowledge that difficulty is a constitutive feature of human experience. Convenience is all destination and no journey." This observation, made well before the current AI boom, perfectly encapsulates why using artificial intelligence to create art primarily appeals to those who view creativity through the lens of content production and intellectual property generation. For them, the goal is a marketable product, not the process of creation.
While the discourse around AI can be exhausting, this piece, like much of my writing on technology, is fundamentally about people. I am currently engrossed in David Graeber and David Wengrow's The Dawn of Everything, a book that introduced me to the concept of schismogenesis—the deliberate formation of social divisions. This process isn't random; it's born from conscious choices made by an in-group to differentiate itself from an out-group. This can be seen in national identities, or, as I observe, in the stance taken by those who consciously refuse to engage with generative AI.
One might view this through a lens of despair, questioning why even our use of tools and technologies has become a matter of identity. However, anthropology reveals that humans have always been this way. Ancient tribes rejected agriculture not out of ignorance, but to distinguish themselves from neighboring agricultural societies. Others refused to domesticate cattle, upholding a group identity that revered bulls as wild. My own refusal to use generative AI stems from a desire not to be the kind of person who uses it.
The allure of AI lies in its promise of friction removal. Regardless of its actual capability, powerful entities with vast resources believe in this promise. Their ideal world is one where AI handles all thinking, eliminating the need for human thinkers and, by extension, the complexities of human desires, needs, and rights. This vision suggests a future where interactions with others are largely unnecessary.
I, however, value thinking and believe its exercise is vital for humanity. A reduction in our cognitive engagement risks atrophy, benefiting only authoritarians seeking easier control. While I personally find AI's "thinking" to be superficial, that's almost beside the point. I simply do not desire the frictionless world that the political project of generative AI offers—a world designed to minimize human interaction. My refusal is a philosophical stance, much like my choices to avoid Amazon, Netflix, or Spotify. These decisions introduce friction into my life, serving an abstract ideal rather than inflicting significant pain on corporate oligarchs. I harbor no illusions that my individual choices will halt AI's societal impact, just as I don't believe Jeff Bezos mourns my unspent money. Yet, if I cannot endure this minor friction in an otherwise comfortable life for my ideals, what discomfort could I withstand when it truly matters?
Suffering is not inherently virtuous, nor is discomfort noble. Nevertheless, every meaningful aspect of my life—love, friendship, art, community—has been forged by navigating discomfort. Pain doesn't guarantee growth, but growth invariably requires it. In our modern age, it's effortless to prioritize convenience, avoid friction at all costs, and rationalize it as self-care. I choose the friction of refusal because I fear that losing the capacity for discomfort means losing the capacity for growth. Both refusal and acquiescence are habit-forming.
Generative AI could undoubtedly expedite certain tasks, assuming one overcomes the fundamental moral disagreement with the technology. Embracing it might seemingly allow more into one's life. But "more" doesn't always equate to "better," and abundance is not an uncomplicated good. Our values and identities are expressed in what we dedicate our time to. The necessity of sacrificing some things to prioritize others—the recognition that we cannot "have it all"—is precisely what gives our choices meaning.
What makes art compelling is its origin: it's created by individuals who could have chosen countless other ways to spend their finite time. The sheer dedication involved in writing a novel, for instance, is astonishing. Every human-authored book, regardless of its perceived quality, represents hundreds, if not thousands, of hours of work from someone driven by an urgent need to express something. These hours could have been spent on leisure, yet the message was too important to ignore. That, I find truly remarkable.
The fundamental issue with AI output masquerading as art isn't technical ineptitude or uncanny valley effects. AI media generation has advanced significantly and will likely continue to improve. The problem with AI "art" is its lack of human intentionality; it's not the imperfect expression of a mortal being navigating mediocrity to communicate a feeling to other mortal beings. For this reason, it remains fundamentally uninteresting to me.
If a screenplay, symphony, or painting can be generated at the push of a button for a nominal subscription fee—one that scarcely covers the true global cost of this technology—without requiring the creator to confront their mortality and commit their finite time, without their desire to speak being strong enough to overcome the friction of learning to speak, then one must ask: was it truly something that needed to be said?
Regarding Taylor Swift’s new album, The Life of a Showgirl, the independent sports outlet Defector aptly quoted Fyodor Dostoevsky from The Idiot: "Lack of originality, everywhere, all over the world, from time immemorial, has always been considered the foremost quality and the recommendation of the active, efficient and practical man." This describes what art becomes when its primary goal is monetary gain: unoriginal, boring, and palatable. Good art, like good community, is inherently inefficient.
My book club, formed this year with individuals previously unknown to each other, exemplifies this inefficiency. We began by reading Robert Putnam’s Bowling Alone, a book exploring the decline of civil society when convenience triumphs over civic participation. We read it across months, meeting in various Toronto locations, discussing one segment at a time. This process is undeniably inefficient: coordinating multiple adult schedules, commuting, chatting, ordering food. It would have been far easier to read alone at home, perhaps with an audiobook at double speed or an AI-generated summary. Frictionless.
Authoritarianism, too, promises a frictionless world for a select group, with adherents always believing they belong to that group, safe from its consequences. The promise of frictionlessness often masquerades as efficiency. Why endure committees and public consultations when you can simply bulldoze your way to build a megaspa?
Having worked on many projects and teams, I've often felt immense frustration with collaborators, wishing for the power to simply dictate outcomes. The friction that AI promises to remove is largely the same friction authoritarianism seeks to eliminate: other people. It frees one from building relationships with complex, contradictory individuals who will inevitably challenge preconceptions or demand accountability. There's no need to learn compromise or accept that one won't always get one's way. Instead, one can simply consult an AI that affirms one's position.
This promised frictionlessness, however, is an illusion. It's not real because tech companies change their services, cash burn catches up, and governments might exploit AI to dismantle social services, leaving only friction. It's not real because deepfakes threaten consensus reality, a reality most of us have always known and cannot fathom living without. Most profoundly, it's not real because, despite all attempts to distance ourselves from the inconvenient needs of others, we remain human. The friction within our own minds is inescapable. We need other people in countless ways, big and small, and we might as well learn to be needed by them too.
Does Mark Zuckerberg or Peter Thiel appear particularly happy or fulfilled?
I can read books alone, indulging only my own interpretations, never encountering dissent. Or I could press a button for a 10-minute AI summary, then another for an AI-generated blog post to share on LinkedIn as "thought leadership." Alternatively, I can take a 45-minute subway ride to a brightly lit food court, share delicious Indian food, and laugh with friends about historical figures. Which path is more frictionless? Which makes me feel more whole?
I choose friction.