The Guardian (USA)

How Facebook Messenger and Meta Pay are used to buy child sexual abuse material

- Katie McQue

When police in Pennsylvania arrested 29-year-old Jennifer Louise Whelan in November 2022, they charged her with dozens of counts of serious crimes, including sex trafficking and indecent assault of three young children.

One month earlier, police said they had discovered Whelan was using three children as young as six, all in her care, to produce child sex abuse material. She was allegedly selling and sending videos and photos to a customer over Facebook Messenger. She pleaded not guilty.

The alleged buyer, Brandon Warren, was indicted by a grand jury in February 2022 and charged with nine counts of distribution of material depicting minors engaged in sexually explicit conduct. Warren also pleaded not guilty.

Court documents seen by the Guardian quote Facebook messages between the two in which Warren allegedly describes to Whelan how he wants her to make these videos.

“I’ll throw in a little extra if you tell him it makes mommy feel good and get a good length video,” he tells Whelan, according to the criminal complaint document used for her arrest.

Whelan received payment for the footage over Meta Pay, Meta’s payment system, according to the criminal complaint against him. “Another 250 right? Heehee,” she allegedly wrote to Warren after sending him a video of her abusing a young girl.

Meta Pay, known as Facebook Pay before rebranding in 2022, is a peer-to-peer payment service enabling users to transfer money over the company’s social networks. Users upload their credit cards, debit cards or PayPal account information to Facebook Messenger or Instagram to send and receive money.

A spokesperson for Meta confirmed that the company has seen and reported payments via Meta Pay on Facebook Messenger that are suspected of being linked to child sexual exploitation.

“Child sexual exploitation is a horrific crime. We support law enforcement in its efforts to prosecute these criminals and invest in the best tools and expert teams to detect and respond to suspicious activity. Meta reports all apparent child sexual exploitation to NCMEC [the National Center for Missing & Exploited Children], including cases involving payment transactions,” the spokesperson said.

Meta fails to detect payments for child abuse material, say moderators

Through reviewing documents and interviewing former Meta content moderators, a Guardian investigation has found that payments for child sexual abuse content taking place on Meta Pay are probably going undetected, and unreported, by the company.

Court documents show Whelan and Warren’s actions were not spotted or flagged by Meta. Instead, Kik Messenger, another social platform, reported Warren had uploaded videos suspected to be child sexual abuse material (CSAM) to share with other users. This triggered a police investigation in West Virginia, where Warren lives. His electronics were seized, and police then discovered the eight videos and five images that he had allegedly bought from Whelan over Facebook Messenger.

“We responded to valid legal process,” said a Meta spokesperson, in response to the Guardian’s findings that the company did not detect these crimes.

Additionally, two former Meta content moderators, employed between 2019 and 2022, told the Guardian that they saw suspicious transactions taking place via Meta Pay that they believed to be related to child sex trafficking, yet they were unable to communicate with Meta Pay compliance teams to flag these payments.

“It felt like [Meta Pay] was an easy-to-use payment method since these people were communicating on Messenger. The amounts sent could be hundreds of dollars at a time,” says one former moderator, who spoke on condition of anonymity because they had signed a non-disclosure agreement as a condition of employment. The moderator, employed for four years until mid-2022 by Accenture, a Meta contractor, reviewed interactions between adults and children over Facebook Messenger for inappropriate content.

Payments for sex or CSAM are typically just a few hundred dollars or less in cases reviewed by the Guardian. According to a former Meta compliance analyst, transactions of such small amounts are unlikely to be flagged for review by Meta’s systems.

This means that payments connected to illicit activities are probably taking place undetected, financial crimes experts said.

A Meta spokesperson said that the company uses a combination of automated and human review to detect suspicious financial activity in payment transactions in Messenger.

“The size of the payment is just one signal our teams use to identify potentially suspicious activity, and our compliance analysts are trained to assess a variety of signals,” said the Meta spokesperson. “If our teams had reason to suspect suspicious activity, especially activity involving a child and even if the payments are small, it would be investigated and reported appropriately.” The spokesperson also said that the company had “a strong ‘see something, say something’ culture”.

In cases where American men were targeting underage girls abroad for grooming, payments could be for things like a phone and school supplies, the moderator said.

“Most of what we saw were older men from America, targeting girls in Asian countries and often travelling there,” the moderator added.

“When it comes to child exploitation and CSAM, it’s really all about small amounts,” said Silvija Krupena, director of the financial intelligence unit at RedCompass Labs, a London-based financial consultancy. “It’s a global crime and criminals, with different types of offenders. In low-income countries like the Philippines, $20 is big money. The production usually happens in those countries. These are small amounts that can fall through the cracks when it comes to traditional money-laundering controls.”

Meta has a team of about 15,000 moderators and compliance analysts who are tasked with monitoring its platforms for harmful and illegal content. Possible criminal behavior is supposed to be escalated by Meta and reported to law enforcement. Anti-money laundering regulations also require money services businesses to train their compliance staff and give them access to enough information to detect when illegal financing occurs.

Yet contractors monitoring Meta Pay transaction activity do not receive specific training for detecting and reporting money flows that could be related to human trafficking, including the language, codewords and slang that traffickers typically use, a former Meta Pay payment compliance analyst contractor said.

“If a human trafficker is using a codeword for selling girls, we didn’t get into that. We didn’t really get trained on those,” said the former compliance analyst. “You don’t even give it a second thought or even dig into that kind of stuff at all.”

A Meta spokesperson disputed the payment compliance analyst’s claims.

“Compliance analysts receive both initial and ongoing training on how to detect potentially suspicious activity – which includes signs of possible human trafficking and child sexual exploitation. Our program is regularly updated to reflect the latest guidance from financial crime regulators and safety experts,” the spokesperson said.

Meta’s history with accusations of child exploitation

Meta’s platforms have been linked to alleged child exploitation and the distribution of CSAM in the past. In December, the New Mexico attorney general’s office filed a lawsuit against the company, alleging Facebook and Instagram are “breeding grounds” for predators targeting children for human trafficking, grooming and solicitation. The suit followed an April 2023 Guardian investigation, which revealed how child traffickers were using Meta’s platforms to buy and sell children into sexual exploitation.

As a money services business, Meta Pay is subject to US anti-money laundering and “know your client” (KYC) banking regulations, which require businesses to report illicit financing to the US treasury department’s Financial Crimes Enforcement Network (FinCEN).

If Meta fails to detect and report these payments, it could be in violation of US anti-money laundering laws, financial crimes experts have said.

“Regulations apply to any company that participates in a payments business. But for social media, because they can see users, they see their lives, their transactions, they can see abuse and see contact. It’s such a low-hanging fruit for them to detect this,” said Krupena.

Other peer-to-peer payment apps have faced scrutiny for their practices in preventing illicit activity. In 2023, Senate Democrats requested detailed fraud detection and prevention methods from PayPal, Venmo and Cash App. Sex trafficking “ran rampant” on Cash App, according to a report last year by US investment research firm Hindenburg. Block, Cash App’s owner, disputed these claims, threatening legal action.

Meta introduced end-to-end encryption to Facebook Messenger in late 2023, but even before this, payment compliance analyst contractors could not access the Messenger chat between the two users exchanging funds. The former Meta compliance analyst told the Guardian their team could only see transactions with notes and the relationship between the two users.

“I don’t know how you do compliance in general without being able to see intentions around transacting,” said Frances Haugen, a former Facebook employee turned whistleblower, who released tens of thousands of damaging documents about its inner workings in 2021. “If the platforms actually wanted to keep these kids safe, they could.”

Siloed work prevents flagging suspicious transactions, say ex-moderators

Other former content moderators interviewed by the Guardian compared their jobs to call center or factory work. Their jobs entailed reviewing content flagged as suspicious by users and artificial intelligence software and making quick decisions on whether to ignore, remove or escalate the content to Meta through a software program. They say they could not communicate with the Meta Pay compliance analysts about suspicious transactions they witnessed.

“We were not allowed to contact Facebook employees or other teams,” one former moderator said. “Our managers didn’t tell us why this was.”

Gretchen Peters, executive director of the Alliance to Counter Crime Online, has documented the sale of narcotics, including fentanyl, over Meta’s platforms. She also interviewed Meta moderators who were not permitted to communicate with other teams in the company. She said this siloing was a “major violation” of “know your customer” banking regulations.

“We’ve heard from moderators at Meta they can see illegal conduct is occurring and that there are concurrent transactions through Meta Pay, but they have no way of communicating what they are seeing internally to moderators at Meta Pay,” said Peters.

A Meta spokesperson said the company prohibits the sale or purchase of narcotics on its platforms and removes that content when it finds it.

“Meta complies with all applicable US anti-money laundering laws,” the spokesperson said. “It is also untrue to suggest that there is a lack of communication between teams. Content moderators are trained to escalate to a specific point of contact, who brings in the appropriate specialist team.”

Messenger’s encryption will hide illicit behaviors on Meta Pay, say advocates

In December, Meta announced it had rolled out end-to-end encryption for messages sent on Facebook and via Messenger. Encryption hides the contents of messages from anyone but the sender and intended recipient by converting text and images into unreadable cyphers that are unscrambled on receipt.

Yet this move could also affect the company’s ability to prevent illicit transactions on Meta Pay. Child safety experts, policymakers, parents and law enforcement criticized the move, arguing encryption obstructs efforts to rescue child sex trafficking victims and prosecute predators.

“When Meta Pay is linked to Messenger or Instagram, the messages associated with payments could uncover illicit behaviors,” said Krupena. “Now that this context is removed, the implications are significant. It almost feels like encryption is inadvertently facilitating illicit activity. This opens many opportunities for criminals to hide in plain sight.”

A Meta spokesperson said the decision to move to encryption was to “provide people with privacy”, and that the company encourages users to self-report private messages related to child exploitation to the company.

“Moving to an encrypted messaging environment does not mean we will sacrifice safety, and we have developed over 30 safety tools, all of which work in encrypted messaging,” said the spokesperson. “We’ve now made our reporting tools easier to find, reduced the number of steps to report and started encouraging teens to report at relevant moments.”

FinCEN declined to comment. PayPal did not respond to a request for comment.


“The industries have created a situation where illegal logging traps people inside this job,” she says.

“Valdir has no other option in his town – he didn’t get to have an education.”

Greene says their team received death threats during one of their expeditions. “We left immediately,” she says. “A different group of loggers found out what we were doing, and they wanted us dead.”

The illegal timber market, she says, compounds already heightened tensions between Indigenous peoples and loggers. “If the US, Europe and China were not buying this protected and endangered hardwood, then Valdir wouldn’t be in that situation,” Greene says.

Parallel to Duarte’s story, the documentary follows two leading Indigenous figures: Marçal Guajajara, the 32-year-old regional coordinator of the forest guardians from the Indigenous territory of Araribóia; and Puyr Tembé, an Indigenous leader and activist from the territory of Alto Rio Guamá.

The film opens with shots of loggers sawing down a 500-year-old tree. “For us, they killed a life, and that’s sad,” Marçal Guajajara says after inspecting the felled tree. He is later seen donning urucum – red face-paint derived from the achiote plant (Bixa orellana) – in preparation for a surveillance mission.

Puyr Tembé, who wears a cocar feather headdress when she defends her territory and western clothes when she travels to the Amazonian city of Belém for activism, stresses the critical role of Indigenous peoples in protecting forests.

“Five per cent of the world’s population is Indigenous, and we protect 80% of the remaining biodiversity on the planet,” she says in the film. “At least 600 of us land defenders have been murdered since 2014.”

Another key person in the film is Tadeu Fernandes, who bought 28,000 hectares (69,000 acres) of forest land in the 1970s to create an ecological sanctuary. But his land has since been prey to numerous invasions. Aerial shots show a complete town carved out of his land, with 3,200 people living there illegally.

“This is the biggest environmental crime in Brazil,” Fernandes says about his battle against governmental indifference. Most of his 500 or so official complaints have gone unheeded.

The documentary, filmed between 2019 and 2022, also adds a political backdrop to its intimate, character-focused narrative. Tembé is warned that powerful politicians are connected to timber companies, cattle ranchers and encroaching miners. An ex-mayor becomes a wood exporter, while parliamentarians are accused of receiving money from agribusiness associations.

However, the film ends on a positive note, as Luiz Inácio Lula da Silva defeats the far-right incumbent, Jair Bolsonaro, to become Brazil’s president. Edivan Guajajara says Lula’s time in office has already shown positive effects, such as the government’s decision “to support the demarcation of the Indigenous territories”, even though Brazil’s National Congress has struck down some measures.

“I think you have to have the participation of an Indigenous person if you’re talking about Indigenous issues, no matter what,” says Edivan Guajajara, a member of Maranhão state’s Guajajara people, whose leading representative became Brazil’s first minister for Indigenous Peoples.

“The Indigenous people’s struggle is a struggle for all of us, not just those in the territory,” he says. “It’s a fight for all of humanity.”

Grobman says a tangible change in the fate of the Amazon will only happen if the world understands “that the Amazon is critical to the health of the entire planet”.

He adds: “We all need to recognise our part in its destruction.”

The three film-makers launched an impact campaign to further the work covered in the documentary and recently received a $200,000 grant from the Erol Foundation to support reforestation and agroforestry projects in the Tembé and Guajajara territories.

“Most of the destruction of the Amazon comes from a small handful of companies,” Greene says. “People around the world need to stand up and say something to these companies because if enough people speak out, they won’t continue this unacceptable behaviour.

“All of us have a deep-down intuition and understanding that when you harm nature, we’re harming ourselves, and we all feel this anguish at what’s happening on this planet.”

We Are Guardians will be screened at the Human Rights Watch film festival on 15 March at the Rich Mix arts centre in Shoreditch, London. It can be seen online from 18-24 March.

A Meta spokesperson said the company uses a combination of automated and human review to detect suspicious payment transactions. Photograph: Yves Herman/Reuters
Puyr Tembé in her cocar headdress with her team of forest guardians. ‘We protect 80% of the remaining biodiversity on the planet,’ she says of her fellow Indigenous people. ‘At least 600 of us land defenders have been murdered since 2014.’ Photograph: Fernanda Luna/We Are Guardians
A key theme in the film is the story of the illegal logger Valdir Duarte, seen here. Photograph: Evandro Rocha/We Are Guardians
