When free platforms made users the product

Old photographs of public transportation show everyone reading the same newspaper. Now everyone's on their phones, each consuming a personalized algorithmic feed. The homogeneity of ideas didn't disappear—it fragmented into a million echo chambers.
Web 2.0 made users into creators. It also made them into products.
Web 1.0 was read-only. Web 2.0 added a keyboard.
MySpace (2003) and Facebook (2004) proved users would create content for free if you gave them a platform. YouTube (2005) did the same for video. Twitter (2006) for short-form text. The pattern repeated across every medium.
Suddenly, anyone could publish. No web development skills required. No server costs. Just type and post.
The platforms captured the value.
Web 2.0 services are "free." The product is you.
The more you use Facebook, the more Facebook knows about you—what you click, how long you linger, what makes you angry, what makes you buy. That data trains algorithms that serve you content designed to maximize engagement, which generates more data, which serves better-targeted ads.
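A toy sketch makes the loop concrete. Everything below is invented for illustration (no platform publishes its ranking code); the point is only that a feed sorted purely on past engagement rewards whatever provoked clicks before:

```python
# Toy model of an engagement-maximizing feed. Not any real platform's
# algorithm: the fields and scoring here are hypothetical.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    clicks: int = 0
    impressions: int = 0

    @property
    def engagement(self) -> float:
        # Click-through rate as a crude engagement proxy.
        return self.clicks / self.impressions if self.impressions else 0.0

def rank_feed(posts: list[Post]) -> list[Post]:
    # Sort purely by past engagement; truth and nuance never enter the score.
    # Whatever got clicked before gets shown more, so it gets clicked more.
    return sorted(posts, key=lambda p: p.engagement, reverse=True)

feed = [
    Post("careful, nuanced analysis", clicks=5, impressions=100),
    Post("outrage bait", clicks=30, impressions=100),
]
for post in rank_feed(feed):
    print(f"{post.text}: {post.engagement:.0%}")  # outrage bait ranks first
```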
The same pattern applies everywhere: Google, YouTube, Instagram, TikTok.
You're not the customer. Advertisers are. You're what's being sold.
Technology compounds exponentially:
Moore's Law: Transistor density doubles roughly every two years. Computing power increases; costs decrease.
Metcalfe's Law: A network's value scales with the square of its users. Each new user makes the network more valuable for everyone—and harder to leave.
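A back-of-the-envelope sketch of both laws in a few lines of Python (the starting values are illustrative assumptions, not measurements):

```python
# Back-of-the-envelope versions of both laws. Numbers are illustrative.

def moores_law(transistors_now: float, years: float) -> float:
    # Density doubles roughly every two years.
    return transistors_now * 2 ** (years / 2)

def metcalfe_value(users: int) -> int:
    # Network value scales with the square of the user count.
    return users ** 2

print(moores_law(1e9, 10))  # ~32x the density after a decade
# 10x the users yields 100x the value -- the moat compounds:
print(metcalfe_value(1_000_000) / metcalfe_value(100_000))  # 100.0
```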
Facebook isn't valuable because of its technology. It's valuable because everyone you know is already there. That's the moat. That's why challengers fail.
Social networks were the most influential Web 2.0 innovation. For better and worse.
They enabled unprecedented connection. They also enabled misinformation at scale, outrage amplification, and polarization.
The algorithms don't care about truth. They care about engagement. Outrage engages. Nuance doesn't.
Platforms like Amazon, eBay, Etsy, and Shopify let anyone build a virtual storefront. Manufacturing monopolies weakened. Distribution democratized—partially.
Amazon learned from its sellers, then competed with them. The platform became the gatekeeper, taking a cut of every transaction and controlling what buyers see first.
Twitch, YouTube, OnlyFans, Patreon—platforms where individuals monetize audiences directly. A genuine shift from institutional media to independent creators.
But the platforms take 30-50%. They control discovery. They can demonetize or deplatform at will. "Direct" still routes through intermediaries.
When Web 2.0 platforms launched, they seemed like free gifts. Use as much as you want. No charge.
The cost became clear later: your data, concentrated on servers you don't control, with security you can't audit.
Robinhood's 2021 breach exposed the personal data of roughly 7 million customers. Their liability? Minimal. They had disclosed the risks in terms of service nobody reads.
Centralized platforms can remove any content, ban any user, for any reason. Sometimes this removes genuine harm. Sometimes it silences legitimate speech.
The distinction depends on who's making the call—and platforms have accumulated enormous power to make that call for billions of people.
Before Web 2.0, information sources were limited but shared. Everyone read the same newspapers, watched the same news broadcasts.
Now, hundreds of networks compete for attention in a 24/7 cycle. Clickbait outperforms nuance. Negative headlines drive more engagement than positive ones. Algorithmic feeds create filter bubbles where users see only content that confirms existing beliefs.
The result: people consuming completely different realities, with no shared factual baseline.
Web 2.0 built extraordinary capabilities: global communication, instant information access, platforms for creativity and commerce. It also created surveillance business models, gatekeepers taking a cut of every transaction, and fragmented realities with no shared factual baseline.
These aren't bugs. They're features of an architecture where platforms capture the value users create.
In 2008, as the financial system collapsed, a pseudonymous developer known as Satoshi Nakamoto proposed a different architecture, one where users own the network. That's where Web 3.0 begins.