This is Facebook’s chance to hit the reset button
Facebook gave itself a test, and the results are in: The company’s recent record on nondiscrimination, according to an expert civil rights audit it commissioned, has been marked by pitfalls as much as progress.
Facebook’s audit began two years ago at the encouragement of advocacy groups concerned that the company’s commitment to free speech above all else had allowed toxicity to seep throughout the platform, giving bad actors an excuse to harass and to spread hate, and even leading to real-world violence as seen in Charlottesville, Va. Now, the top-level takeaway from the detailed assessment is that Facebook wrongly prioritizes free expression — yet that’s a too-simple reading of the complicated world of platform governance. Facebook was created to give people a place to speak and be spoken to; free expression will always be a lodestar. The question is to what extent the company allows other values to guide it, too. The civil rights report reveals how Facebook has struggled to weave a commitment to nondiscrimination and protection for minorities into a system that didn’t put its focus there to start with.
Facebook clearly recognizes that harms such as incitement of violence, voter suppression or mere intimidation into silence can outweigh the benefits of unfettered expression. But it’s a constant challenge to figure out when. The audit is useful in spelling out the various levels where that challenge takes place, from policy formulation to enforcement to platform design, including algorithmic bias.
The auditors recognize improvements in all these areas. Facebook has, for instance, instituted more robust bans on voter suppression and barred explicit support for white nationalism. The site has launched a pilot to guard against a too-common enforcement error of punishing those who complain about hate speech. It is going after census interference with tough policy and tough enforcement alike. And Facebook has overhauled its advertising infrastructure so that U.S. housing, employment and credit ads can no longer be targeted by age, gender or Zip code.
Still, the efforts so far have been, as the auditors put it, “reactive and piecemeal.” The updated policies on voter suppression came at the investigators’ urging. The advertising changes were impressive but arose after lawsuits called attention to the failure. More generally, Facebook’s most notable changes in service of nondiscrimination have come after public pushback.
This pattern is also in sync with the “heartbreaking” decisions auditors note on three posts by President Donald Trump bearing on the cherished rights to assemble and to vote — one threatening violence against Black Lives Matter protesters and the others falsely alleging the illegality of mail-in ballots. The auditors viewed these as obvious violations of Facebook’s rules against incitement to violence, as well as of its voter suppression policies. Yet when the matters were “escalated” to leadership, Facebook decided to do nothing — neither removing the posts as the auditors wished, nor taking the more balanced approach we favor of adding context with warning labels and fact-checks. No one on Facebook’s senior executive team has expertise in civil rights; the platform has pledged to change that.
Facebook simply isn’t wired top to bottom for nondiscrimination the way it is wired for free expression, in its staffing, its terms of service and its product. The task ahead, then, is a rewiring: not one that strips away the company’s commitment to free expression, but one that integrates that commitment with one to nondiscrimination at every level.