The Washington Post
Fla. takes clash on social media regulation to high court
It asks justices to rule on states’ power to govern content moderation
Florida’s attorney general on Wednesday asked the Supreme Court to decide whether states have the right to regulate how social media companies moderate content on their services. The move sends one of the most controversial debates of the internet age to the country’s highest court.
At stake is the constitutionality of state laws in Florida and Texas that would bar social media platforms such as Facebook, Twitter and YouTube from blocking or limiting certain types of political speech. Federal appeals courts have issued conflicting rulings on the two similar laws, with the U.S. Court of Appeals for the 11th Circuit striking down much of Florida’s law while the U.S. Court of Appeals for the 5th Circuit last week upheld Texas’s law.
“That irreconcilable divide warrants this Court’s review,” Florida Attorney General Ashley Moody wrote in the petition to the Supreme Court. Specifically, the petition asks the court to determine whether the First Amendment prohibits states from forcing platforms to host speech that they don’t want to host — such as news stories or posts by politicians that they deem to violate their rules.
The petition sets up the most serious test to date of assertions that Silicon Valley companies are unlawfully censoring conservative viewpoints, a view that gained momentum on the right after major social media sites suspended Donald Trump in January 2021. If the Supreme Court agrees to hear the case, its decision could have wide-ranging effects on the future of democracy and elections, as tech companies play an increasingly significant role in disseminating news and discussion about politics.
Critics of the state social media laws warn that restricting tech companies’ freedom to moderate content could lead to a torrent of hate speech, misinformation and violent material.
The question of how the First Amendment rights of social media companies interact with the speech rights of their users is important and unresolved, said Genevieve Lakier, a professor at the University of Chicago Law School. She expects the Supreme Court to take it up, possibly by consolidating the Florida and Texas cases to issue a single ruling.
“This is a really major question: How do we regulate social media platforms?” Lakier said. “I think it could shape the operation of the internet really significantly. If these laws are upheld, it’s going to require the platforms to host a lot of speech that they don’t want to host.”
The 11th Circuit earlier this year ruled that Florida could not prohibit social media platforms from removing or limiting the posts of news organizations or candidates for office. It also struck down a provision that would require platforms to provide notice and explanation to users anytime they limit or remove something users post. It upheld parts of the law requiring companies to provide more transparency on their content policies.
The Florida attorney general incorporated in the state’s petition the recent conservative victory from the 5th Circuit, which upheld a Texas law that bars companies from removing posts based on a person’s political ideology. The Florida petition says the circuit courts’ decisions are in conflict, and the Supreme Court must resolve those differences. Moody did not immediately respond to a request for comment.
The legal battle over the Florida law began in May 2021, when NetChoice and the Computer & Communications Industry Association (CCIA), two industry groups representing major social media companies, filed a lawsuit to block the law from taking effect. The tech companies scored major victories when a federal judge in June of last year blocked the law from taking effect and then when the 11th Circuit upheld much of that ruling. The tech companies say they believe they will see a similar outcome in the Supreme Court.
“We agree with Florida that the U.S. Supreme Court should hear this case, and we’re confident that First Amendment rights will be upheld,” NetChoice vice president and general counsel Carl Szabo said in a statement. “We have the Constitution and 200 years of precedent on our side.”
As the Florida petition points out, some members of the Supreme Court have already expressed an interest in taking up the questions at issue. In a dissent from a Supreme Court decision that granted an emergency stay on the Texas social media law, Justice Samuel A. Alito wrote that the case raised “issues of great importance” that “plainly merit this court’s review.” He added, “It is not at all obvious how our existing precedents, which predate the age of the internet, should apply to large social media companies.”
Florida argues that social media companies have grown so powerful that their content moderation decisions, such as the decision to suppress baseless assertions about the origin of the coronavirus, or a New York Post story about Hunter Biden’s laptop, “distort the marketplace of ideas.” Florida contends that gives the state a compelling interest in regulating them.
On the other side, NetChoice argues that such decisions amount to an exercise of editorial discretion akin to the editorial decisions of newspapers and TV stations — which are considered protected speech under the First Amendment. That would set a high legal bar for any government to interfere with those decisions.
A Supreme Court decision would have consequences that stretch far beyond Florida, as more than 100 bills related to social media content moderation have been introduced in state legislatures across the country, according to a July analysis from CCIA. Many of those legislatures have already recessed until 2023, and they are closely watching how the litigation over the Florida and Texas laws is resolved.
Although the first social media content regulation laws were passed in conservative states, liberal states are now following with legislation to force more transparency on how the companies respond to threatening and hate speech. Any decision on states’ First Amendment power to regulate how companies police their platforms could have implications for those bills as well.