The Pak Banker

Tech groups cannot be allowed to hide from scrutiny

- Marietje Schaake

TECHNOLOGY companies have governments over a barrel. Whether maximising traffic flow, matching pupils with their preferred schools or anticipating drought from satellite and soil data, most governments rely heavily on critical infrastructure and artificial intelligence (AI) developed by the private sector. This growing dependence has profound implications for democracy.

An unprecedented information asymmetry is growing between companies and governments. We can see this in the long-running investigation into interference in the 2016 US presidential elections. Companies build voter registries, voting machines and tallying tools, while social media companies sell precisely targeted advertisements using information gleaned by linking data on friends, interests, location, shopping and search. This has big privacy and competition implications, yet oversight is minimal. Governments, researchers and citizens risk being blindsided by the machine room that powers our lives and vital aspects of our democracies.

Governments and companies have fundamentally different incentives on transparency and accountability. While openness is the default and secrecy the exception for democratic governments, companies resist providing transparency about their algorithms and business models. Many of them actively prevent accountability, citing rules that protect trade secrets.

We must revisit these protections when they shield companies from oversight. There is a place for protecting proprietary information from commercial competitors, but the scope and context need to be clarified and balanced where such protections have an impact on democracy and the rule of law. Regulators must act to ensure that those designing and running algorithmic processes do not abuse trade secret protections.

Tech groups also use the EU's General Data Protection Regulation to deny access to company information. Although the regulation was enacted to protect citizens against the mishandling of personal data, it is now being wielded cynically to deny scientists access to data sets for research. The European Data Protection Supervisor has intervened, but problems could recur.

To mitigate concerns about the power of AI, provider companies routinely promise that the applications will be understandable, explainable, accountable, reliable, contestable, fair and - don't forget - ethical. Yet there is no way to test these subjective notions without access to the underlying data and information. Without clear benchmarks and information to match, proper scrutiny of the way that vital data is processed and used will be impossible.

In general, laws should apply online in the same way they do in the physical world. But digitalisation creates specific new contexts. Algorithms are not just a secret sauce that allows technology companies to compete; they provide the architecture for all online information and affect economic processes and fundamental rights. To assess whether principles such as fair competition, non-discrimination or free-speech rights are being upheld, authorities need to be able to look under the algorithmic hood. Governments, as major technology customers, should also be responsible buyers and write public accountability into their tendering requirements.

None of this is revolutionary. Coca-Cola protects its formula from competitors, but regulators can still measure its safety, health impact and compliance with food standards. The same principle must apply to intangible algorithms, especially those used for government services. We must rebalance the information asymmetry between tech companies and democratic governments by making oversight proportionate to the power technology companies already have. FT
