It’s over. Facebook is in decline, Twitter in chaos. Mark Zuckerberg’s empire has lost hundreds of billions of dollars in value and laid off 11,000 people, with its ad business in peril and its metaverse fantasy in irons. Elon Musk’s takeover of Twitter has caused advertisers to pull spending and power users to shun the platform (or at least to tweet a lot about doing so). It’s never felt more plausible that the age of social media might end—and soon.
Now that we’ve washed up on this unexpected shore, we can look back at the shipwreck that left us here with fresh eyes. Perhaps we can find some relief: Social media was never a natural way to work, play, and socialize, though it did become second nature. The practice evolved via a weird mutation, one so subtle that it was difficult to spot happening in the moment.
The shift began 20 years ago or so, when networked computers became sufficiently ubiquitous that people began using them to build and manage relationships. Social networking had its problems—collecting friends instead of, well, being friendly with them, for example—but they were modest compared with what followed. Slowly and without fanfare, around the end of the aughts, social media took its place. The change was almost invisible, but it had enormous consequences. Instead of facilitating the modest use of existing connections—largely for offline life (to organize a birthday party, say)—social software turned those connections into a latent broadcast channel. All at once, billions of people saw themselves as celebrities, pundits, and tastemakers.
A global broadcast network where anyone can say anything to anyone else as often as possible, and where such people have come to think they deserve such a capacity, or even that withholding it amounts to censorship or suppression—that’s just a terrible idea from the outset. And it’s a terrible idea that is entirely and completely bound up with the concept of social media itself: systems erected and used exclusively to deliver an endless stream of content.
But now, perhaps, it can also end. The possible downfall of Facebook and Twitter (and others) is an opportunity—not to shift to some equivalent platform, but to embrace their ruination, something previously unthinkable.
A long time ago, many social networks walked the Earth. Six Degrees launched in 1997, named after a Pulitzer-nominated play based on a psychological experiment. It shut down soon after the dot-com crash of 2000—the world wasn’t ready yet. Friendster arose from its ashes in 2002, followed by MySpace and LinkedIn the next year, then Hi5 and Facebook in 2004, the latter for students at select colleges and universities. That year also saw the arrival of Orkut, made and operated by Google. Bebo launched in 2005; eventually both AOL and Amazon would own it. Google Buzz and Google+ were born and then killed. You’ve probably never heard of some of these, but before Facebook was everywhere, many of these services were immensely popular.
Content-sharing sites also acted as de facto social networks, allowing people to see material posted mostly by people they knew or knew of, rather than from across the entire world. Flickr, the photo-sharing site, was one; YouTube—once seen as Flickr for video—was another. Blogs (and bloglike services, such as Tumblr) raced alongside them, hosting “musings” seen by few and engaged by fewer. In 2008, the Dutch media theorist Geert Lovink published a book about blogs and social networks whose title summarized their average reach: Zero Comments. (...)
That changed when social networking became social media around 2009, between the introduction of the smartphone and the launch of Instagram. Instead of connection—forging latent ties to people and organizations we would mostly ignore—social media offered platforms through which people could publish content as widely as possible, well beyond their networks of immediate contacts. Social media turned you, me, and everyone into broadcasters (if aspirational ones). The results have been disastrous but also highly pleasurable, not to mention massively profitable—a catastrophic combination.
The terms social network and social media are used interchangeably now, but they shouldn’t be. A social network is an idle, inactive system—a Rolodex of contacts, a notebook of sales targets, a yearbook of possible soul mates. But social media is active—hyperactive, really—spewing material across those networks instead of leaving them alone until needed.
A 2003 paper published in Enterprise Information Systems made an early case that drives the point home. The authors propose social media as a system in which users participate in “information exchange.” The network, which had previously been used to establish and maintain relationships, becomes reinterpreted as a channel through which to broadcast.
by Ian Bogost, The Atlantic | Read more:
Image: Tayfun Coskun/Anadolu Agency/Getty

[ed. See also: This Is What It Looks Like When Twitter Falls Apart (The Atlantic). Update: Facebook Parent Meta Will Pay $725M to Settle a Privacy Suit Over Cambridge Analytica (NPR); (ed... a few hundred million here, a few billion there... : )]
"Facebook's data leak to Cambridge Analytica sparked global backlash and government investigations into the company's privacy practices the past several years.
Facebook CEO Mark Zuckerberg gave high-profile testimonies in 2020 before Congress and as part of the Federal Trade Commission's privacy case for which Facebook also agreed to a $5 billion fine. The tech giant also agreed to pay $100 million to resolve U.S. Securities and Exchange Commission claims that Facebook misled investors about the risks of user data misuse. (...)
Cambridge Analytica was in the business to create psychological profiles of American voters so that campaigns could tailor their pitches to different people. The firm was used by Texas Sen. Ted Cruz's 2016 presidential campaign and then later by former President Donald Trump's campaign after he secured the Republican nomination.
According to a source close to the Trump campaign's data operations, Cambridge Analytica staffers did not use psychological profiling for his campaign but rather focused on more basic goals, like increasing online fundraising and reaching out to undecided voters. [ed. Uh, huh...]
Whistleblower Christopher Wylie then exposed the firm for its role in Brexit in 2019. He said Cambridge Analytica used Facebook user data to target people susceptible to conspiracy theories and convince British voters to support exiting the European Union. Former Trump adviser Steve Bannon was the vice president and U.S. hedge-fund billionaire Robert Mercer owned much of the firm at the time."