There was a time when logging onto the internet felt like stepping into a living, breathing commons. It was messy, imperfect, sometimes hostile—but it was also vibrant, creative, and unexpectedly humane. Communities formed organically. Knowledge circulated freely. Platforms felt like tools rather than traps. For many users, that era reached its emotional peak during the early months of the COVID-19 pandemic, when social platforms briefly rediscovered their original purpose: connection.
Today, that feeling is gone. What remains is a digital environment that feels extractive and manipulative, cluttered with ads, algorithmic noise, and low-quality content. Platforms that once empowered users now exploit them. This decline has a name: enshittification.

Coined by writer and activist Cory Doctorow in 2022, the term has since entered mainstream discourse because it captures something deeply felt but poorly articulated—the sense that nearly every major digital service has become worse on purpose. This is not accidental decay. It is systemic, structural, and profitable.
The Golden Age of Platforms and the Illusion of Digital Utopia
In its formative years, the internet promised liberation. Barriers to publishing collapsed. Anyone could write, create, share, and be discovered. Early platforms prioritized usability, community trust, and growth through genuine engagement rather than coercive retention.
Twitter, before its rebranding and algorithmic distortions, exemplified this era. With careful curation, users could access journalists reporting in real time, experts sharing hard-won insights, artists refining their voices, and friends contributing humor and perspective. It felt intimate and global at the same time—a digital town square with personality.
That sense of belonging was not accidental. Platforms were incentivized to serve users well because they had not yet discovered how to extract maximum value from them. Profit mattered, but survival depended on goodwill.
The Turning Point: When Growth Became Extraction
As platforms matured, they encountered a problem familiar to all venture-backed enterprises: growth plateaued. User acquisition slowed. Investors demanded returns. At that moment, priorities shifted.
Rather than improving user experience, platforms began optimizing for monetization. Algorithms were retooled to maximize time on site. Engagement metrics replaced meaningful interaction. Advertising models grew increasingly invasive. Data collection expanded silently, turning users themselves into products.
This shift marked the beginning of enshittification’s first stage: platforms still appeared useful, but subtle degradations began accumulating beneath the surface.
Elon Musk, X, and the Acceleration of Decay
No single episode symbolizes enshittification more vividly than Elon Musk’s takeover of Twitter and its transformation into X. While the platform was already struggling, Musk’s changes accelerated its decline in visible and aggressive ways.
Verification was monetized, severing trust signals from credibility. Moderation systems were dismantled, allowing misinformation, harassment, and explicit content to flourish. Algorithms were altered to privilege certain voices and provoke outrage rather than inform.
The result was not ideological diversity or free expression, but chaos. Advertisers fled. Users complained. And yet, many stayed—not because the platform improved, but because their social graphs, audiences, and professional visibility were trapped there.
This is a defining feature of enshittification: exit becomes harder than endurance.
From Platforms to Prisons: The Lock-In Effect
Modern platforms are designed to minimize user mobility. Network effects ensure that leaving means social and economic loss. Interoperability is intentionally restricted. Data portability exists in theory but not in practice.
This is why dissatisfaction does not lead to mass exodus. Users remain not because platforms are good, but because alternatives lack critical mass—or because moving would mean starting over from nothing.
In this environment, competition becomes illusory. Dominance hardens into inevitability.
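To make the pull of critical mass concrete, here is a deliberately crude sketch that uses Metcalfe's law (network value growing roughly with the square of the user count) as a stand-in for what an individual gives up by leaving. The numbers and the formula are illustrative assumptions, not a model of any real platform's economics.

```python
# Toy illustration of lock-in under network effects (Metcalfe-style proxy).
# Purely illustrative: real switching costs involve far more than user counts.

def network_value(users: int) -> float:
    """Rough proxy: value scales with the number of possible connections."""
    return users * (users - 1) / 2

incumbent_users = 100_000_000   # hypothetical dominant platform
challenger_users = 1_000_000    # hypothetical alternative

ratio = network_value(incumbent_users) / network_value(challenger_users)
print(f"Connections forfeited by switching: roughly {ratio:,.0f}x fewer on the challenger")
```

Even if the challenger is better built and better governed, the connection deficit dominates the individual calculation, which is why dissatisfaction alone rarely turns into exodus.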
Cory Doctorow’s Theory of Enshittification Explained
Doctorow describes enshittification as a predictable lifecycle with three phases. Initially, platforms serve users generously to attract participation. Once a loyal user base is established, platforms pivot to serving business customers, offering access to users through advertising, promotion, or preferential treatment.
Eventually, platforms exploit both groups, extracting value from users while raising costs for businesses, until all surplus flows upward to the platform owners themselves.
This is not a failure of capitalism in theory, Doctorow argues—it is the logical outcome of unregulated digital monopolies.
Amazon: The Case Study of Digital Feudalism
Amazon illustrates enshittification with brutal clarity. It began as a consumer-friendly marketplace offering low prices and fast delivery. Sellers flocked to the platform, benefiting from access to a massive audience.
Over time, Amazon imposed increasing fees, copied successful third-party products, manipulated search rankings, and forced sellers into costly logistics programs. Sellers became tenants. Amazon became the landlord.
Once competition was eliminated, service quality stagnated. Search results filled with sponsored listings and counterfeit products. The user experience worsened—but profits soared.
Doctorow describes this system not as capitalism but as technofeudalism.
The Enshittocene: Living Inside a Degraded Digital World
Doctorow’s term “enshittocene” describes a world where degradation is no longer episodic but ambient. Poor quality becomes normalized. Frustration is routine. Users adapt rather than resist.
From subscription fatigue to dark-pattern design, from autoplay manipulation to AI-generated spam, enshittification is now the background condition of digital life.
The tragedy is not that platforms became bad—but that we stopped expecting them to be good.
Surveillance, Compliance, and the Illusion of Choice
Enshittification thrives on surveillance. Platforms track behavior with microscopic precision, shaping experiences not for user benefit but for monetization efficiency. Choice exists, but only within carefully constrained parameters.
You may choose content—but not how it is ranked. You may leave—but not easily. You may complain—but it will not change outcomes.
This asymmetry of power produces compliance, not consent.
Why This Is Not Inevitable
The most radical claim Doctorow makes is also the most hopeful: this didn’t have to happen.
The internet was once demonstrably better. Open standards thrived. Interoperability was assumed. Communities governed themselves. Profit existed—but it did not suffocate participation.
The decline was a policy choice, not a technological necessity.
The Four Paths to Deshittification
Doctorow argues that recovery requires structural intervention, not nostalgia.
Competition must be restored by breaking monopolies and preventing anti-competitive mergers. Regulation must ensure transparency, accountability, and fair markets. Interoperability must be mandated so users can move freely without losing social capital. And tech workers—the people who build and maintain these systems—must be empowered to resist harmful directives.
These are not utopian demands. They are pragmatic corrections.
Interoperability as Freedom Infrastructure
Just as electrical standards allow devices to function universally, digital systems must be required to communicate. Messaging platforms should interconnect. Social graphs should be portable. Content should not be locked behind proprietary walls.
Without interoperability, exit is punishment. With it, exit becomes leverage.
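As a minimal sketch of what a portable social graph could look like in practice, the snippet below writes a follow list to a plain, documented JSON file that any competing service could read back. The FollowRecord structure and its field names are hypothetical, chosen for illustration; they are not an existing standard or any platform's real export API.

```python
# Minimal sketch of social-graph portability: serialize a follow list into a
# neutral, documented format the user owns, rather than a proprietary silo.
# The record layout below is hypothetical, not an existing standard.
import json
from dataclasses import dataclass, asdict

@dataclass
class FollowRecord:
    handle: str          # account identifier on the originating platform
    display_name: str
    followed_since: str  # ISO 8601 date

def export_follows(follows: list[FollowRecord], path: str) -> None:
    """Write the follow graph to a plain JSON file outside the platform."""
    with open(path, "w", encoding="utf-8") as f:
        json.dump([asdict(r) for r in follows], f, indent=2)

def import_follows(path: str) -> list[FollowRecord]:
    """Any other service could rebuild the same graph from that file."""
    with open(path, encoding="utf-8") as f:
        return [FollowRecord(**r) for r in json.load(f)]

if __name__ == "__main__":
    graph = [FollowRecord("@alice", "Alice", "2019-04-02")]
    export_follows(graph, "follows.json")
    print(import_follows("follows.json"))
```

The point is not this particular file format but the power relationship it implies: when the graph lives in a format the user controls, leaving a platform stops being a penalty and starts being leverage.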
Tech Workers as Agents of Change
Most engineers, designers, and moderators do not want to build exploitative systems. They want to solve problems and create value. But their influence is constrained by corporate hierarchy.
Empowering tech workers through labor organization and ethical governance could realign incentives toward public good rather than extraction.
Conclusion: Making the Internet Good Again Is Possible
Enshittification is not destiny. It is a consequence of choices—legal, economic, and political. If those choices can degrade the internet, they can also restore it.
The internet became worse because it was allowed to. It can become better if we decide it must.
FAQs
1. What does “enshittification” mean?
It describes how digital platforms degrade over time to maximize profit.
2. Who coined the term?
Writer and activist Cory Doctorow in 2022.
3. Why do platforms get worse after becoming popular?
Because monopoly power allows extraction without fear of user exit.
4. Is this unique to social media?
No. It affects e-commerce, streaming, search, and software ecosystems.
5. What is technofeudalism?
A system where platforms act like landlords extracting rent from users.
6. Why don’t users just leave bad platforms?
Lock-in, network effects, and lack of interoperability prevent easy exit.
7. Can regulation fix this?
Yes, if it enforces competition, transparency, and interoperability.
8. What role do tech workers play?
They can resist harmful designs if empowered institutionally.
9. Was the internet ever actually better?
Yes—open standards and user-centric design once dominated.
10. Is deshittification realistic?
Yes, but it requires political and cultural will.