Allowing Big Tech to police themselves hasn’t worked
The explosion of social media has fundamentally changed how we live. These ultimate communication and networking platforms have delivered incredible benefits, but not without alarming side effects.
The recent “Big Tech and the Online Child Sexual Exploitation Crisis” congressional hearing on Jan. 31 made national headlines as the CEOs of the major social media companies were called to testify before the Senate Judiciary Committee.
What made this hearing particularly compelling were the families present – many holding up pictures of loved ones who had seriously harmed themselves or taken their own lives.
At one point, Mark Zuckerberg even turned and faced the families to apologize for what they have gone through. But talk is cheap. Have Big Tech companies done enough to protect children? The hearing focused on many of the dark corners of social media – mental health, unwanted sexual images, sexual harassment, the drug trade, and human trafficking recruitment – again, involving teens and preteens.
Is social media a human trafficker’s best friend or worst enemy?
Human trafficking is a pandemic – the modern slavery of our time. Twenty-one million people globally are victims of forced labor – labor exploitation, sexual exploitation, or state-imposed forced labor – a $150 billion industry that disproportionately impacts marginalized populations. And traffickers are highly active on social media, using it as a recruitment engine.
In fact, 30% of all victims in federal sex trafficking cases were recruited online. One month before the hearing, New Mexico Attorney General Raúl Torrez launched legal action against Meta, claiming that Facebook and Instagram are “breeding grounds” for predators targeting children for human trafficking, grooming and solicitation.
Each CEO claimed to have made “industry leading” and “unprecedented efforts” to address the issue and to partner with law enforcement on all identified cases. But on social media networks with imperfect algorithms, where people can lie about their age and face no identity verification, how many cases are never identified – or identified only after someone is enslaved?
Alarming investigative reports have shined a light on companies like Meta “failing to report or even detect the full extent of what is happening.” In a 2020 Human Trafficking Institute review of 105 child sex trafficking cases, Facebook was the No. 1 platform, implicated in 65% of cases, followed by Instagram and Snapchat. As for Big Tech “doing all we can” – watch the hearing and you’ll learn there is no regulatory body overseeing these companies. On top of that, social media companies are shielded from legal liability under Section 230 because they are not considered the publishers of the content.
Good luck collecting damages. That being the case, what real accountability does Big Tech feel?
The longer you watch the hearing, the more you may wonder why our elected officials are not the ones apologizing to these families. Certain words spoken by our elected officials give us hope – “it’s on us,” “it takes a village,” and “bipartisan legislation” – along with the realization that Big Tech “grading their own homework” will not work any longer.
But again, talk is cheap. We know Big Tech’s lobbying power and army of lawyers are not making things easy, but how is there not a regulatory body holding Big Tech accountable, and bipartisan – yes, bipartisan – legislation imposing legal liability on these companies?
As with FDA regulation of medical devices, companies should not be allowed to operate on their own honor code. Both in bringing a product to market and in monitoring that product’s safety after release, companies are held accountable to industry-wide requirements – not a CEO’s word.
And therein lies the rub. When Mark Zuckerberg and others have to apologize with their wallets and company dollars, we’ll see true accountability. Then we can get to industry-wide requirements – perhaps age and identity verification, investment in personnel and advanced algorithms, and empowered, simplified parental controls and visibility.
Do social media platforms want to be a human trafficker’s best friend or worst enemy – and what will we allow as parents, voters, and legislators? Time – and the village – will tell. Let’s have Big Tech, but the best of Big Tech, with true accountability for user safety. Only then will we see change.
Andrew Murtagh works as an executive in the medical device industry and is a former resident of Albuquerque, now living in Fishers, Indiana. He writes for Entrepreneur and Medium on the intersection of philosophy, leadership, and sports/martial arts.