Lodi News-Sentinel

Under Elon Musk’s Twitter takeover, who will protect users?

- SAFIYA NOBLE/RASHAD ROBINSON

Elon Musk’s Twitter takeover has triggered widespread criticism. Many people are panicked about the direction Musk will take the social platform. There’s a reason for alarm, but focusing solely on Musk ignores the crisis of monopoly control without accountability that characterizes much of the media in this country.

In recent decades, sustaining a public square, the space available to debate, contest, experiment with and expand democratic discourse, has been a struggle fraught with challenges. The tech sector has remade our understanding of who can speak and who should be heard, in both good and troubling ways. That has given rise to algorithmic and automated boosting of everything from evidence-based research and investigative journalism to outright racist propaganda.

Social media has created new monopolists, such as Mark Zuckerberg, who leads Facebook and Instagram. But in reality, we’ve been living in a world of media controlled by a very few private actors — sometimes single families — for a very long time. Ownership of communication outlets continues to be consolidated into the hands of a few, which has had an incredibly harmful effect on politics, education and the way we narrate and understand our shared societal challenges.

In announcing his $44 billion deal to buy Twitter, Musk said: “Free speech is the bedrock of a functioning democracy.” Over the years, Twitter has navigated how to handle content moderation, de-platforming Nazis and violent incitement to overthrow governments. As a public company with a board of directors, it has had to face some legal accountability, however limited, to agencies such as the Securities and Exchange Commission.

By taking the company private, Musk will remove this layer of oversight from Twitter. There is no question that abuses on a platform that has already struggled with racism and harassment will become even more difficult to rein in.

The issue is not just that rich people have influence over the public square; it’s that they can dominate and control a wholly privatized square — they’ve created it, they own it, they shape it around how they can profit from it. So perhaps the real question is whether people are going to have any space, and be able to engage in any activity, that is not totally dominated by an entity seeking profits.

Technology companies are media companies. They have a responsibility for the way they affect our lives and democracy. Yet, when a few uncontrollable people control such platforms, that responsibility becomes voluntary and unenforceable. A self-regulated company is a nonregulated company.

Just as we need rules for television and the telecommunications industry designed to protect people, we need rules for technology companies. Frameworks of fairness and accountability for harm are necessary for a just society free from exploitation, anchored in civil and human rights. That’s true of every industry, and media platforms are no exception.

Pundits arguing that “Twitter was great before and now it will be terrible” take us off track from the bigger problems at hand: the lack of rules and accountability. Federal laws and regulations must be crystal clear: The tech sector’s products must be subject to regulatory scrutiny before they are released. Just as drugs are subject to oversight by the Food and Drug Administration, tech products need to pass inspection — an independent auditing process, conducted by civil rights experts, that exposes what companies want to hide and requires proof that their products do no harm.

Regulators cannot be allowed to shift the burden and blame to consumers. The lie that we simply need to “put more control in the hands of users” is like holding individuals responsible for the air we breathe, or the pollution that destroys our lives, rather than regulating water and air quality in the best interest of the public.

In the tech world, self-regulation by corporations is essentially complete non-regulation. The real issue is regulating deceptive and manipulative content, consumer exploitation, calls to violence and discriminatory or harmful products. Section 230 of the Communications Decency Act states that no provider or user of an “interactive computer service shall be treated as the publisher or speaker” of any information provided by another content provider. Long hailed as protecting free speech on the internet, the measure shouldn’t be used by tech-media publishers as a shield against demands to protect users from harm, or to nullify 60 years of civil rights and consumer safety law.

The right approach is not complicated: If you make the internet safe, then you make the system safer for everyone. Big Tech puts Black people, people of color, women, queer and disabled people in more danger than anyone else. We need regulations and stronger digital civil rights that will safeguard the public, rather than debates about which billionaire will own the next communications platform.

Safiya Noble is a professor at UCLA in the departments of Gender Studies and African American Studies, and a 2021 MacArthur Foundation Fellow. Rashad Robinson is the president of Color of Change, a racial justice organization, and served as the co-chair of the Aspen Institute’s Commission on Information Disorder.
