A better kind of social media is possible — if we want it
Talk to almost anyone today about social media, and you’ll hear that it’s toxic. One person might diagnose an excess of outrage, another a deficit of free speech. Some bemoan the invasion of privacy, the scourge of lies and hate, the capricious rule of technology titans, the trashing of attention spans. And some feel that no matter how delicious any morsel it offers, the indulgence leaves a bad aftertaste.
The pendulum has fully swung from early Pollyanna predictions that social media would unite families and topple repressive regimes to today’s declarations that it is depressing teenagers and destroying democracy. Frustration is widespread; calls for change cross the political aisle. Pioneers have decamped from Twitter and Facebook to join experimental platforms such as Mastodon and Post, despite clunky features and the sacrifice of caches of followers and friends.
But for many of us, the dream of digital town squares where we openly discuss important matters has lost its luster. Messaging apps such as WhatsApp and Facebook Messenger, where chats take place in private groups, have for years been more popular than broadcasting thoughts to a public feed. And who can blame teenagers for preferring TikTok? Even on a good day, a whimsical video has more appeal than a heated exchange with a vitriolic stranger.
Meanwhile, the technologies and the talent pool to create new kinds of online communities are expanding. Thousands of workers have left Big Tech companies in recent months, some of whom itch to do something civic-minded. While building social networks once took a great deal of money and technical expertise, today’s wannabe hosts can use open-source protocols and their own servers to build micro-communities that offer users more control over content, privacy and the rules of the road.
Even in Washington, the political will to change the status quo is growing. Last year, bipartisan bills targeting the downsides of digital platforms emerged in both chambers of Congress. Though there’s still no consensus about what to do, there’s an emerging consensus that something must be done about social media.
Where all this momentum leads is anyone’s guess. But there’s no going back to a world before Facebook, however pretty it might look in the foggy rearview mirror. What we should hope for instead is a new era of social media — one that serves the best interests of society instead of exploiting its worst impulses. To get there will require new business models and funding sources — and probably some smart and not heavy-handed legislation. It also will require something sorely lacking from most social media conversations today: imagination.
How do we want to gather online in the future? What would lively, inviting, edifying social media communities give us, and how would they look and feel? Whom do we want to connect with, and on what terms? What kinds of conversations and content do we want to see and share? And most important, whom do we want to make choices about something that has become essential to the way we interact with one another, the way we learn about the world, and the way we impart what we know?
Most of us haven’t asked ourselves these questions. We accept the social media we have, perhaps out of convenience or because it’s all we know.
“We basically have two problems in digital governance: one, not knowing what we want; and two, not trusting people to give it to us,” said Jonathan Zittrain, faculty director of Harvard’s Berkman Klein Center for Internet and Society. “When we imagine a solution, we still think about Facebook and Twitter, single communities of millions or billions of people where one statement might get exposed to 10 million or 100 million people in the span of 24 hours.”
It’s little wonder that the prevailing platforms have become so noxious. The for-profit companies such as Meta and Twitter that host them rely on advertising for revenue; the more engaged we are by what we see, the more ad dollars they earn. Polarizing content creates stronger reactions and therefore more sharing and commenting, so social media companies’ algorithms amplify it. The profit scheme favors conflict.
Swashbuckling entrepreneurs now decide whether a post should be taken down — whether it’s merely an unpopular viewpoint or an outright lie, whether it’s mild nudity or a neo-Nazi slur. These men are not sufficiently motivated by the marketplace or their consciences to cultivate communities that inform the masses with the truth, to surface nuance on public matters, or to protect our privacy from companies that seek to sell us their wares.
We stay because we still get something from social media. Facebook, Instagram and Twitter can feel like thriving cities where we already know a lot of people and our way around. This makes it hard to leave — despite discontent with our interactions or with how Mark Zuckerberg and Elon Musk police our posts, or welcome trolls to the party.
Mercifully, the possibility for a different kind of social media — where we could get more of what we like and less of what we don’t — is opening up.
Divya Siddarth, co-founder of the Collective Intelligence Project, a new effort to create more community control of emerging technologies, said it’s becoming easier to choose what kinds of online communities we want. Technology models such as federated networks, which underlie Mastodon, the ascendant Twitter refuge, allow people to choose from a range of options for privacy, transparency of identity and social norms in smaller communities linked to a common hub.
Innovators are creating ranking systems that would enable people to vote on a community’s content-moderation algorithm based on what it amplifies and suppresses, rather than accept a single company’s default strategy. Researchers are working on algorithms that would “bridge” divides, amplifying not what’s causing strong reactions but what people of various persuasions, whether political affiliations or musical tastes, agree on. In the future, people might be able to choose algorithms or entire networks that expose them to ideas from diverse sources and surface consensus views, and that also might better prevent harassment and hate speech.
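To make the contrast concrete, here is a toy sketch — with invented post names and numbers, not any platform’s actual algorithm — of how a “bridging” ranker differs from an engagement ranker: one rewards sheer reaction volume, the other rewards only what earns approval across otherwise divided groups.

```python
# Toy comparison of engagement-style vs. bridging-style ranking.
# All posts, reaction counts and approval rates below are invented.

posts = {
    "outrage_bait":    {"reactions": 12_000, "approval": {"group_a": 0.9, "group_b": 0.1}},
    "local_news":      {"reactions": 3_000,  "approval": {"group_a": 0.6, "group_b": 0.7}},
    "shared_interest": {"reactions": 4_500,  "approval": {"group_a": 0.8, "group_b": 0.75}},
}

def engagement_score(post: dict) -> float:
    # Engagement ranking: raw reaction volume, no matter who is reacting or why.
    return post["reactions"]

def bridging_score(post: dict) -> float:
    # Bridging ranking: a post scores well only if every group approves of it,
    # so the least-approving group caps the score.
    return min(post["approval"].values())

for label, score in [("engagement", engagement_score), ("bridging", bridging_score)]:
    ranked = sorted(posts, key=lambda name: score(posts[name]), reverse=True)
    print(f"{label} ranking: {ranked}")
    # engagement puts "outrage_bait" first; bridging puts "shared_interest" first.
```

Under the invented numbers, the divisive post wins on engagement but loses on bridging, which is the whole point of the approach.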
Progress in artificial intelligence, including large language models such as ChatGPT — although it poses risks such as spreading more misinformation — could also help, Siddarth said, by summarizing areas of convergence on hot topics or translating across languages to bring more people around the globe into online conversations.
“We have this core desire for community, and people are creative about using the internet in ways that go around the existing incentive structures,” she said. “A lot of people are experimenting and bootstrapping in this space.”
That experimentation has yet to yield vibrant and enticing alternatives to the cities we inhabit on the big social media sites. But with the right kind of thinking, investment and policy, it could.
Eli Pariser, author of “The Filter Bubble” and co-director of New_public, believes next-generation digital platforms should be public spaces, such as the parks, libraries and trails where we go in the real world to have fun, exercise, learn, meet new friends or connect with old ones, check out public art, and sometimes organize charity drives or have political conversations. In the future social media landscape, Pariser said, people might go to a certain online network to connect with people at their school, a different one to connect with neighbors, and yet another to connect with people in their cities or more globally. New_public calls itself a community and studio aimed at building such online spaces.
If the idea of encountering more digital spaces sounds overwhelming today, that’s because we don’t yet have seamless ways to move among networks. Each one has its own app, profile, data and connections. But it’s possible to create an open, shared protocol for social media that would enable us to use one or two apps to move among various networks — the way we now send one another email even if our correspondence relies on distinct servers and design interfaces (Gmail, Yahoo Mail and so on). For this to happen, companies will have to be compelled by their customers or the law to make their networks open and interoperable, allowing us to leave without losing our data and to move our profiles and connections around. This would make it easier to spend more time in communities that are better for us.
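Here is a schematic sketch of why an open, shared format makes that kind of portability possible. It is loosely inspired by federated protocols such as ActivityPub (the one beneath Mastodon) but is not a faithful implementation; the server names and fields are invented for illustration.

```python
# Schematic sketch: if identity and follower lists are plain data in a shared
# format, any compliant server can import them, so leaving a server does not
# mean losing your social graph. Server names and fields are invented.

from dataclasses import dataclass, field, asdict
import json

@dataclass
class Profile:
    handle: str                                          # "alice@oldserver.example"
    display_name: str
    followers: list[str] = field(default_factory=list)   # globally addressed, not server-local IDs

def export_account(profile: Profile) -> str:
    # Any server that speaks the shared format can produce this export.
    return json.dumps(asdict(profile))

def import_account(data: str, new_server: str) -> Profile:
    # A different server can rebuild the same account from the export,
    # keeping the follower list because followers are addressed globally.
    raw = json.loads(data)
    user = raw["handle"].split("@")[0]
    return Profile(handle=f"{user}@{new_server}",
                   display_name=raw["display_name"],
                   followers=raw["followers"])

alice = Profile("alice@oldserver.example", "Alice",
                followers=["bob@otherserver.example", "cara@thirdserver.example"])
moved = import_account(export_account(alice), "newserver.example")
print(moved)   # same name and followers, new home server
```

The analogy to email holds: the message format and addressing scheme are shared, so the choice of provider becomes a preference rather than a trap.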
You might wonder, as I have, whether the social media we have today is the social media we deserve — if human nature is the true driver of all the outrage. But consider that the entire internet is not a cesspool of terrorists and conspiracy theorists. The environments that people inhabit, and their design, matter. Wikipedia, although it is an online encyclopedia and not a social media network, shows what can happen when people are invited to tend their own digital community gardens. Notably, as a nonprofit, this platform was designed not to keep its users addicted, but to share credible information with the public.
Wikipedia has motivated people to build and maintain its compendium of knowledge for free — with a collaborative spirit and remarkable accuracy. A cadre of volunteer editors sets the standards for how entries are added and updated. Conflict, lies, bias and bad faith are not rewarded; editors get demoted or promoted by the community based on their behavior, and their track records are made public.
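As a rough illustration of that kind of public track record — a toy model with invented numbers, not Wikipedia’s actual rules — contributors might earn or lose standing based on how the community receives their edits, and standing could determine who gets moderation privileges.

```python
# Toy model of community-governed standing: +1 for an edit the community
# endorses, -1 for one it reverts. Thresholds and names are invented.

from collections import defaultdict

track_record: dict[str, list[int]] = defaultdict(list)

def log_edit(editor: str, endorsed: bool) -> None:
    track_record[editor].append(1 if endorsed else -1)

def standing(editor: str) -> float:
    history = track_record[editor]
    return sum(history) / len(history) if history else 0.0

def can_moderate(editor: str, threshold: float = 0.5) -> bool:
    # Promotion and demotion are transparent: anyone can inspect the record.
    return standing(editor) >= threshold

for endorsed in [True, True, True, False]:
    log_edit("steady_contributor", endorsed)
for endorsed in [False, False, True]:
    log_edit("bad_faith_editor", endorsed)

print(can_moderate("steady_contributor"))  # True  (standing 0.5)
print(can_moderate("bad_faith_editor"))    # False (standing about -0.33)
```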
Former Wikimedia chief executive Katherine Maher believes the next era of social media could similarly let communities govern themselves. “You can get platforms where users have responsibility for content moderation — friendly spaces, safe spaces with robust and meaningful discussion,” she said. Maher suggests that existing media companies hire facilitators who listen to their communities and set the boundaries for what gets amplified and what gets taken down, instead of relying on a central authority that farms the job out to people bound to follow its rules.
Wikipedia is instructive, too, in that its operation relies on donations. It will probably take public or philanthropic investment — and sustained financial support over time from the government or directly from us via subscription fees — to support the next generation of social media technologies and platforms. The current offerings might seem free, but they come at the cost of our individual and collective well-being. Social media could be freed from its flawed business model if we thought of it as a public good worthy of public investment, similar to roadways, K-12 schools and the military.
That’s not to say that commercial platforms can be expected to disappear. But with more competition and greater capacity for users to leave them, ad-driven companies might be motivated to temper their preference for polarization over community-building. Already, several social media companies are struggling to grow their subscriber numbers.
For all its ills, social media still serves some good in our lives and in our society, and it can one day serve far more. Family and friends still share their milestones. People with rare diseases find solidarity and solace and advocate for cures together. It’s on social media that Iranian women brought worldwide attention to their protests over wearing the hijab and that pro-democracy activists organized in Hong Kong. And it’s still a place to witness awe-inspiring dance routines, optical illusions and, yes, cat antics.
Breaking the stranglehold that existing platforms have on users is not going to be easy. Technology companies will have to play by new rules, and civic leaders must decide to collectively invest in innovation. Just as critical, however, is that we begin to envision alternatives to our current malaise. Capital, political will and people follow good ideas. Without ideas, we may squander the chance to channel discontent into progress.
The first era of social media was designed for us by clever young men with technical expertise and vague notions of connecting people to one another — but not much foresight about how it would change the world. And politicians in their thrall have let them run amok.
Better online communities won’t grow from the same kinds of ideas, companies and thinkers who got us here. The next era should be shaped by the wide public it will serve. The new communities should be designed and shepherded by all of us: artists, teachers, entrepreneurs, policy wonks, nurses, architects, moms, mechanics, community organizers, musicians, writers. It’s our turn to choose the technological future we want.
Can you imagine a different, better kind of social media in the future? Share your ideas at wapo.st/fixsocialmedia.