Senator: Maryland among top states targeted by Russian ads

By CHANGEZ ALI and J.F. MEILS, Capital News Service

WASHINGTON — Heavily Democratic Maryland was one of the three states most targeted by Russian ads during the 2016 presidential election, the chairman of the Senate Intelligence Committee said Wednesday.

Sen. Richard Burr, R-North Carolina, used the example in an apparent effort to diminish the perception that Russian ads may have had an impact on the election of fellow Republican Donald Trump to the presidency.

“The (current) narrative here is that ads linked to Russia were targeted at pivotal states and directly influenced the election outcome,” Burr said, opening a hearing on the influence of social media on last year’s election. “What you haven’t heard is that almost five times more ads were targeted at the state of Maryland than of Wisconsin.”

Maryland was targeted by 262 ads in the lead-up to the 2016 election, compared with 55 in Wisconsin, according to Burr. Maryland was won easily by Democrat Hillary Clinton while Wisconsin went narrowly for Trump. The latter state was one of the keys to Trump’s Electoral College victory.

“With the 2018 midterm elections just around the corner, Congress must come together and advance bipartisan reforms to prevent foreign agents from undermining our electoral process,” said Rep. John Sarbanes, D-Towson, who chairs the Democracy Reform Task Force.

One of the Russian-linked messages to appear in Maryland was a fake Black Lives Matter ad allegedly aimed at disenfranchising African-American voters in Baltimore, according to reporting by CNN and the Baltimore Sun.

“Some ads targeted users in Ferguson, Baltimore and Cleveland,” Sen. Chuck Grassley, R-Iowa, told a Senate Judiciary crime and terrorism subcommittee hearing on Tuesday. “These ads spread stories about abuse of black Americans by law enforcement. These ads are clearly intended to worsen racial tensions and possible violence in those cities.”

Although the exact content of the fake Black Lives Matter ad used in Baltimore has yet to be made public, it has been described as expressing support for the Black Lives Matter movement while also implying that the group represented a threat to others, presumably white people.

The ad was one of more than 3,000 linked to Russian entities that appeared on Facebook between June 2015 and May 2017 and have been turned over to Congress.

At Wednesday’s intelligence panel hearing, senators from both parties lashed out at lawyers from social media companies Facebook and Twitter and internet giant Google on the second day that the tech companies’ executives faced questions about what the industry intended to do to block fake Russian ads and accounts in future elections.

Facebook’s general counsel, Colin Stretch, revealed on Tuesday that Russian-linked accounts delivered ads to more than 126 million Americans in the lead-up to and aftermath of the 2016 election.

“Many of these (Russia-linked) ads and posts are inflammatory, some are downright offensive,” Stretch told lawmakers. “And much of it will be particularly painful to communities that engaged with this content believing it to be authentic. They have every right to expect more of us.”

Some senators put the challenge facing these companies and the American public in starker terms.

“What we’re talking about is a cataclysmic change. What we’re talking about is the beginning of cyber warfare,” said Sen. Dianne Feinstein, D-California. “What we’re talking about is a major foreign power with the sophistication and ability to involve themselves in an election and sow conflict and discontent all over this country.”

The crux of the issue going forward is not just how to secure vulnerable social media platforms but who will do it: the government or the companies themselves.

“We believe as a user-generated platform, the rules around section 230 (of the Communications Decency Act) provide a platform to our users around free speech and expression and don’t require us to take a bias on removing content that we fear will violate certain rights,” said Sean Edgett, Twitter’s acting general counsel.

Another issue with which lawmakers and tech companies must wrestle is how to manage unverified or anonymous content without also regulating speech.

“We don’t want to put ourselves in the position of being the arbiter of truth. We don’t think that’s a tenable position for any company,” Stretch told the House Intelligence Committee on Tuesday.

But some in government feel there is a way for social media companies to do just that.

“It’s not actually news. These are stories that are placed by people with … malicious intent,” Ellen Weintraub, a member of the Federal Election Commission, told Capital News Service. “That’s not news. That’s people trying to cause trouble and what I want to know is are they U.S. people or are they foreign people.”

“When you get information on the internet there’s really no way of knowing where it’s coming from if it doesn’t carry some kind of disclaimer,” Weintraub added.

A bipartisan bill has been introduced in the Senate requiring political ads on social media to be subject to the same transparency laws as advertisements on television and radio.

The Honest Ads Act is sponsored by Democratic Sens. Amy Klobuchar of Minnesota and Mark Warner of Virginia, and is co-sponsored by Sen. John McCain, R-Arizona.

“We’re simply asking the companies to make a reasonable attempt so that if that ad is being paid for by a foreign agent, that they will try to reveal that foreign agent,” Warner said in an interview with NPR.

Weintraub supports the legislation. “I think we need to get better disclosure,” she said. “I think there’s some of that we can do by regulation.”

How technology companies or the government will actually regulate content or determine what can be regulated is an open question.

All three companies outlined efforts already underway or planned to combat not just political content but malicious social content as well. These included various forms of transparency reporting for ads, additional verification for advertisers, adjusted algorithms to spot fake news and more staff to manually review sensitive content.

Some lawmakers were cautious about committing to a specific remedy.

“For every complex problem, there is a very clear, simple and wrong answer and so we need to be very careful, I think, in how we deal with this,” said Sen. John Cornyn, R-Texas.

Photo caption: Sen. Richard Burr (R-N.C.), chairman of the U.S. Senate Select Committee on Intelligence, said Maryland was one of three states most targeted by Russian ads in 2016. (RON SACHS/CNP/SIPA USA/TNS)
