FTC Chair Edith Ramirez talks about her role as the country’s top cop on data privacy.

The Washington Post Sunday - BUSINESS - By Andrea Peterson, andrea.peterson@washpost.com

As the digital economy has exploded, tech companies are collecting untold amounts of data on everyday Americans. At the center of the discussion of how to protect that information is the Federal Trade Commission, which has increasingly played the role of the country’s top officer on digital privacy and security.

Edith Ramirez, a Harvard Law School classmate of President Obama, took the agency’s helm in 2013. Since then, it has secured settlements against tech giants, including Google and Apple, for allowing children to make mobile purchases within apps on their platforms, and against Snapchat for promising that photos taken with its service would disappear forever (but they don’t).

The job is only getting tougher as Americans grow more dependent on smartphones and as tech companies wield more power in Washington.

The FTC has been doubling down on its efforts, in part, by opening a new research office to bolster its technical expertise. It also plans to host public workshops examining “sharing economy” companies such as Uber and Airbnb, as well as advertisers’ ability to follow users online from their computers to their smartphones.

Ramirez sat down with The Washington Post to talk about what’s next in the FTC’s quest to keep consumers — and their data — safe. This interview has been edited for length and clarity.

Q: Often, the FTC is described as the government’s de facto privacy cop. Do you see it that way?

A: We absolutely are, in my mind, the key cop on the beat when it comes to privacy. We do a very effective job on enforcement and are also thinking on the policy side. It’s very important for us to stay on top of technological developments, so we’re not only thinking about what’s happening today and ensuring companies are complying with the law, but also about what companies will do tomorrow.

These are issues that are complex and challenging. We want to deal with them in a way that allows companies to innovate and new players to develop new products and lead new frontiers. But how do we allow that to happen while at the same time making sure that consumers are in a marketplace they can trust?

The commission’s enforcement in the area of privacy and data security comes primarily from its authority to guard consumers from deceptive and unfair practices. Could you explain how that oversight applies to technology?

In terms of the deception principle, it’s really very simple: We expect companies that make promises to actually fulfill those promises. If a company makes a particular promise in its privacy policy or through some other mechanism, we expect it to comply.

Similarly, when it comes to data security, if a company makes a particular promise to consumers about providing reasonable protections, we expect them to fulfill that promise. It’s quite a simple test, and we’ve used it very effectively, because we find that companies say things about their practices and don’t follow through.

In terms of our unfairness authority, the test is a bit different. It’s basically: Has something caused consumers significant harm that they could not have reasonably avoided and that isn’t outweighed by some other benefit either to consumers or to competition?

One core example is the area of data security — we think a company’s failure to provide reasonable data protections constitutes an unfair practice, because we think it’s a reasonable expectation for a consumer. If a company is making use of personal financial information, they ought to have appropriate protections in place to make sure that information isn’t compromised.

It seems that companies facing FTC enforcement often get their first strike free, then get slapped with fines the next time, once they are under an enforcement order. Is that generally how it works?

As a general matter, we don’t have civil penalty authority — we can’t simply fine a company because they failed to comply with Section 5 [of the FTC Act, which contains the authority to protect consumers from unfair and deceptive practices]. If, however, a company is under order [from the FTC] and they violate the order, at that point we do have civil penalty authority.

The commission as a whole has been urging Congress to enact data security legislation, and as part of that we believe we ought to have civil penalty authority. There are other areas where Congress has given us that specific authority, although privacy and data security isn’t one of those areas right now.

What we can do, however, is seek monetary relief for consumer redress — so it’s not always the case that a company we don’t have an order against won’t be subject to a judgment that would encompass financial penalties. A lot of our privacy cases rely primarily on injunctive relief — where we mandate that the company put in place comprehensive privacy programs and also enjoin them from similar actions in violation of the FTC Act going forward.

How has the FTC’s approach to technology evolved as that technology has evolved? Where do you think that’s going next?

One of our responsibilities has always been to stay on top of evolving business models. We were certainly looking at online commerce when the Internet first became popular.

Back in 2000, only a small percentage of Americans used mobile devices. Today, more than 60 percent are using smartphones.

As a consequence of that, we’ve been increasingly placing a priority on making sure that consumer protection extends beyond the brick-and-mortar world and into the mobile ecosystem. We brought a number of cases, even just in 2014, that emphasized to companies the need to ensure that consumers have adequate information about purchases and that disclosures are made effectively on mobile devices.

Do you think consumers are generally aware of the tradeoffs that they’re making when it comes to privacy?

Most of us carry our phones all the time, and that means a lot of information is being collected. That brings a lot of benefits to the consumer but then also raises certain risks.

You might purchase a smart bed that could monitor your heart rate and your respiration, as well as capture snoring patterns. It might also permit you, from the comfort of your bed, to lock your doors or turn off your lights.

Not only is this smart bed collecting a lot of health information, it’s also now providing connectivity that could raise security issues.

We want to highlight what those risks are but also think about ways to mitigate those risks. We encourage companies to think hard about privacy and data security from the get-go. From the time that they conceive a service or product, we want them to be thinking about how to incorporate protections.

How has your background in corporate law influenced your approach to enforcement?

Having been both on the side of defending companies and on the side of enforcement gives you an important perspective — so it’s not just one side pitted against the other. I think it’s important to understand that the vast majority of companies want to comply with the law. You need to have a constructive relationship with them and provide guidance.

Learning how to assess a case, learning how to evaluate whether it’s appropriate to move forward with an enforcement action, determining what type of relief is needed — these are all things I was familiar with from my days as a litigator.

At the same time, being in an agency that places such an importance on policy and research has been new terrain for me, but it is something I feel is valuable.

Do you feel like your background in litigation played into the commission’s decision not to pursue antitrust action against Google for its search practices?

I’m not going to get into the Google situation. We issued a statement that articulated our thinking and why it was that we felt it appropriate to close that particular investigation.

But as a general matter, the first question we ask when we are determining whether to bring an enforcement action is: What is the right outcome here? Has there been a violation of the law? Sometimes we deal with issues that are very complex, but that’s the first and foremost question that is on my mind when I’m helping to decide whether we ought to proceed.

NIKKI KAHN/THE WASHINGTON POST

Federal Trade Commission Chairwoman Edith Ramirez says the FTC “has been urging Congress to enact data security legislation.”
