Albuquerque Journal

Allowing Big Tech to police themselves hasn’t worked

By Andrew Murtagh, Fishers, Indiana resident

The explosion of social media has fundamentally changed our humanity. The ultimate communication and networking platforms have delivered incredible benefits, but not without alarming side effects.

The recent “Big Tech and the Online Child Sexual Exploitation Crisis” congressional hearing on Jan. 31 gained national headlines, with the CEOs of the major social media companies called to testify before the Senate Judiciary Committee.

What made this hearing particularly compelling were the families present – many holding up pictures of their loved ones who had seriously harmed themselves or taken their own lives.

At one point, Mark Zuckerberg even turned and faced the families to apologize for what they have gone through. But talk is cheap. Have Big Tech companies done enough to protect children? This hearing focused on many of the dark corners of social media, including mental health, unwanted sexual images, sexual harassment, the drug trade and human trafficking recruitment – again, involving teens and preteens.

Is social media a human trafficker’s best friend or worst enemy?

Human trafficking is a pandemic – the modern slavery of our time. Twenty-one million people globally are victims of forced labor – labor exploitation, sexual exploitation or state-imposed forced labor – a $150 billion industry that disproportionately impacts marginalized populations. And traffickers are highly active on social media as a recruitment engine.

In fact, 30% of all victims in federal sex trafficking cases were recruited online. One month before this hearing, New Mexico Attorney General Raúl Torrez launched legal action against Meta, claiming that Facebook and Instagram are “breeding grounds” for predators targeting children for human trafficking, grooming and solicitation.

Each CEO claimed to have made “industry leading” and “unprecedented” efforts to address the issue and to partner with law enforcement on all identified cases. But on social media networks with imperfect algorithms, where people can lie about their age and face no identity verification, how many cases are never identified – or identified only after someone is enslaved?

Some alarming investigative reports have shined a light on companies like Meta “failing to report or even detect the full extent of what is happening.” In a 2020 report by the Human Trafficking Institute covering 105 child sex trafficking cases, Facebook was the No. 1 platform at 65%, followed by Instagram and Snapchat. As for these Big Tech companies “doing all we can” – if you watch the hearing, you’ll learn there is no regulatory body overseeing these companies. What’s more, social media companies are protected from legal liability under Section 230, as they are not viewed as the publishers of the content.

Good luck collecting damages. That being the case, what real accountabi­lity does Big Tech feel?

The longer you watch the hearing, the more you may wonder why our elected officials are not the ones apologizing to these families. Certain words spoken by our elected officials give us hope – “it’s on us,” “it takes a village” and “bipartisan legislation” – along with the realization that letting Big Tech “grade their own homework” will not work any longer.

But again, talk is cheap. We know Big Tech’s lobbying power and army of lawyers are not making things easy, but how is there not a regulatory body holding Big Tech accountable, and bipartisan – yes, bipartisan – legislation ensuring legal liability for these companies?

As with FDA regulation of medical devices, companies should not be allowed to operate on their own honor code. Both in bringing a product to market and in monitoring that product’s safety after release, companies are held accountable by industry-wide requirements – not a CEO’s words.

And therein lies the rub. When Mark Zuckerberg and others have to apologize with their wallets and company dollars, we’ll see true accountability. Then we can get to industry-wide requirements – perhaps age and identity verification, investment in personnel and advanced algorithms, and empowered, simplified parental controls and visibility.

Do social media platforms want to be a human trafficker’s best friend or worst nightmare – and what will we allow as parents, voters, and legislators? Time – and the village – will tell. Let’s have Big Tech, but the best of Big Tech, with true accountability on user safety. Only then will we see change.

Andrew Murtagh works as an executive in the medical device industry and is a former resident of Albuquerque, now living in Fishers, Indiana. He writes for Entrepreneur and Medium on the intersection of philosophy, leadership, and sports/martial arts.

JOSE LUIS MAGANA/ASSOCIATED PRESS: Meta CEO Mark Zuckerberg turns to address the audience during a Senate Judiciary Committee hearing on Capitol Hill in Washington on Jan. 31 to discuss child safety. X CEO Linda Yaccarino watches at left.
