Fake news deadlier than first thought, British report finds
LONDON — The “very future of democracy” is being threatened by fake news, a British parliamentary committee is warning, and the government’s response should include making tech companies legally liable for harmful content and subjecting their algorithms to audits.
That’s the conclusion after a lengthy probe by Britain’s digital, culture, media and sport committee.
The committee, led by Conservative lawmaker Damian Collins, has covered issues including Facebook data mining, Russian election meddling and questionable Brexit financing. Witnesses who testified before it have generated some of the biggest political stories of the year.
The report concludes that an even bigger concern than obviously false information is the manipulation and misuse of personal data and the deliberate stoking of fears and prejudices — by state-sponsored actors, private companies and other groups with an agenda — in order to influence voting.
In an interview with The Post last month, Collins said that after interviewing U.S. tech companies and researchers in Washington in February, the committee realized that fake news based on lies was only a small part of the problem. “One of the bigger areas, the much more difficult area, is the relentless targeting of people with hyper-partisan content, that’s not necessarily fake, but it’s highly skewed to a particular point of view,” Collins said.
“The issue there is: Do people understand why they are receiving this information? And also where is it coming from?
“One of the big issues with the Russian activities is it’s not just that they are doing it, but they are masquerading as people in your country.”
Among the committee’s recommendations are that the government should update electoral law to account for modern campaigning techniques, consider new restrictions on political advertising on social media and “establish clear legal liability for the tech companies to act against harmful and illegal content on their platforms.”
The report suggests that companies should be responsible for both “content that has been referred to them for takedown by their users, and other content that should have been easy for the tech companies to identify for themselves.” It cites free speech concerns related to a German law that fines companies up to 20 million euros if they don’t remove hate speech from their sites within 24 hours. But the report also notes, “As a result of this law, one in six of Facebook’s moderators now works in Germany, which is practical evidence that legislation can work.” The report also proposes that the government should have the power to audit nonfinancial aspects of tech companies, including their algorithms.