Messenger photos get a peek from Facebook
Social network says it looks for signs of malware, abuse
Facebook Messenger can seem the least social part of the social network — much of the time, you’re only conversing with one other person.
But it’s not just you and your chat partner on Messenger. Facebook itself automatically scans links and attached photos on its chat system for malware and child sexual abuse.
Facebook’s data-policy pages don’t explicitly describe this automated scanning, but the company has confirmed these practices to USA TODAY and other news sites after comments by CEO Mark Zuckerberg drew attention to the practice. Facebook can also investigate reports by users of Messenger content that violates its posted community standards. Bloomberg earlier reported on the scanning.
“Most services do some form of this,” said Joseph Lorenzo Hall, chief technologist at the Center for Democracy & Technology (the Washington non-profit derives 35% of its funding from corporate donors, Facebook among them). He noted the key benefit of checking links against blacklists of abusive sites: Spammers and scammers can’t get as many people to click on pages pushing dangerous malware or annoying “adware.”
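The link check Hall describes amounts to comparing a message’s URLs against a list of known-bad domains. Here is a minimal sketch in Python; the blacklist entries and the `link_is_safe` helper are hypothetical, for illustration only, and real services use far larger, continuously updated lists.

```python
from urllib.parse import urlparse

# Hypothetical blacklist of domains known to push malware or adware.
# Real services maintain much larger lists that are updated constantly.
BLACKLISTED_DOMAINS = {"malware-example.test", "adware-example.test"}

def link_is_safe(url: str) -> bool:
    """Return False if the link's host appears on the blacklist."""
    host = urlparse(url).hostname or ""
    return host not in BLACKLISTED_DOMAINS

print(link_is_safe("https://malware-example.test/free-prize"))  # False
print(link_is_safe("https://example.com/article"))              # True
```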
Screening for images of exploited children has been an industry-standard practice for years. A mail or messaging service performs a mathematical check, usually employing a Microsoft-maintained system called PhotoDNA, for matches against a database maintained by the National Center for Missing and Exploited Children.
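The mathematical check works by reducing each image to a compact hash and looking that hash up in the database. The sketch below is a simplification: PhotoDNA itself is proprietary and uses a perceptual hash that survives resizing and recompression, while this example substitutes an ordinary SHA-256, which only matches exact bytes. The database contents and function name here are illustrative, not real.

```python
import hashlib

# Hypothetical database of hashes of known flagged images.
# A real system (e.g. PhotoDNA) would hold perceptual hashes
# supplied by the National Center for Missing and Exploited Children.
KNOWN_FLAGGED_HASHES = {
    hashlib.sha256(b"example-flagged-image-bytes").hexdigest(),
}

def image_matches_database(image_bytes: bytes) -> bool:
    """Hash the image and check the hash against the database."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_FLAGGED_HASHES

print(image_matches_database(b"example-flagged-image-bytes"))  # True
print(image_matches_database(b"an-ordinary-photo"))            # False
```

Because only hashes are compared, the service never needs to store or display the flagged images themselves.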
“It’s a clear example of how technology tools and (artificial intelligence) can work, as it were, behind the scenes to catch the most egregious content,” said Stephen Balkam, CEO of the Family Online Safety Institute. (Facebook is among this Washington-based group’s member firms.)
A year ago, some of Facebook Messenger’s mobile apps added a different sort of robot reading: its M digital assistant can pop up to suggest you use such Messenger features as stickers, polls and location sharing. You can mute M in Messenger’s settings.
In that respect, Facebook was only following the lead of Google, which introduced the option of Smart Reply in its Inbox app in 2015 and has since added it to Gmail’s mobile apps.
Facebook, Google and Microsoft do not scan messages for ad-targeting purposes. Google did so for years in its free Gmail service but stopped last June. That’s not the case with the Yahoo and AOL mail services of Verizon’s Oath media division, as a new privacy FAQ reminds users while linking to pages where they can decline this targeting. Charles Stewart, an Oath spokesperson, said the company will add a privacy dashboard that will let users see and control how their data gets used across Oath’s various sites.
(Disclosure: I also write for Yahoo Finance, another Oath subsidiary.)
For maximum messaging privacy, you’ll have to use a service that encrypts your conversation from your screen to the recipient’s.
Hall called the free and open-source mobile app Signal “the top-of-the-line and most secure messaging service out there.” He also suggested the Secret Conversations encryption option in Facebook Messenger and the Incognito Mode of Google’s Allo messaging app.
Facebook-owned WhatsApp is another option, although its end-to-end encryption only works in conversations where everybody uses that app.