Americans are more vulnerable to foreign propaganda, senator warns
The threat against U.S. elections by Russia and other foreign powers is far greater today than it was in 2020, the chair of the Senate Intelligence Committee said Tuesday.
Sen. Mark Warner, D-Va., who leads the committee, said the danger had grown for multiple reasons: Adversarial countries have become more adept at spreading disinformation, Americans are more vulnerable to propaganda, communication between the government and social media companies has become more difficult, and artificial intelligence is giving foreign powers new abilities.
The Intelligence Committee was set to hold a hearing on election threats today, but it was postponed Tuesday to allow the Senate to consider the articles of impeachment against Alejandro Mayorkas, the homeland security secretary. Warner said he hoped to reschedule the hearing quickly.
A bipartisan report by the Senate Intelligence Committee, the final volume of which was released in 2020, chronicled extensive efforts by Russians to influence U.S. politics in 2016. Since then, Russia has only honed its ability to shape debates in Europe and the United States, while people everywhere have become more vulnerable, Warner said.
For a time, Americans and Europeans were becoming more aware of disinformation or influence operations by Russia or other foreign powers. But today, conspiracy theories seem to be gaining more traction.
“With polarization in this country, and the lack of faith in institutions, people will believe anything or not believe things that come from what used to be viewed as trusted sources of information,” Warner said. “So there's a much greater willingness to accept conspiracy theories.”
Vulnerability to influence operations, Warner said, is not confined to the United States. In Slovakia, for example, Russian information operations influenced views of Russia's war in Ukraine.
After 2016, intelligence agencies intensified efforts to go after foreign governments influencing the election. During the 2018 midterm elections, U.S. Cyber Command issued warnings to Russians conducting influence operations and shut down a troll farm trying to spread disinformation.
In 2020, senior intelligence officials repeatedly warned the public about Russian efforts to spread false stories about Joe Biden.
But Warner said he believed some of that focus may have waned.
At the same time, artificial intelligence tools have transformed the kinds of deceptive information foreign governments can spread.
Deepfake videos remain, for now, imperfect, and such manipulated imagery is often detected quickly. Experts on artificial intelligence say, however, that faked voice recordings are a more effective vehicle for disinformation campaigns.
Warner said faked video could still influence the public, especially if an adversary mimicked a less-known local politician. Artificial intelligence, he said, is advancing very quickly, and it would be foolish to discount any aspect of the threat against elections.