NHS’s “inexcusable” Google deal a lesson for Big Data
Identifiable record-sharing highlights naivety over privacy issues and value of data
The deal that gave Google's DeepMind access to millions of identifiable NHS patient records has become a case study in how not to run a Big Data programme, according to research from the University of Cambridge.
The data-sharing was part of a scheme to monitor kidney injury, but instead of sharing information only on kidney patients, the Royal Free trust gave DeepMind an entire data dump of its patients.
The deal was rewritten in 2016 after the scope of the data handed to Google became apparent, but experts say the project structure remains flawed and that the same mistakes could be made again.
“The ground’s been constantly shifting, and the very flawed basis on which the deal was originally struck is a real lesson about how this can happen – and we don’t want it to happen again,” Dr Julia Powles of the Faculty of Law and Computer Laboratory at the University of Cambridge told PC Pro.
“There has been some response to the issues raised when the deal became public, but many of the issues still remain,” she added. “It continues to be the case that nearly two million Londoners – whose data is now sitting on servers controlled by Google and DeepMind – haven’t given consent and don’t know how the data is being used. The failure on both sides to engage in any conversation with patients and citizens is inexcusable.”
At the heart of the criticism is a confused set of guidelines for using identifiable and anonymised data for clinical and research purposes. Where patients fall under the definition of “direct care” – as the kidney patients do – it is acceptable to provide identifiable data to tech providers on the basis of implied consent. For all other patients, consent should be explicit.
“The approach is too blunt,” explained Powles. “It’s not just for the patients with relevant medical histories and it captures the whole of the population.”
Unhealthy secrecy
The relationship between the health authority and Google has also been criticised for lack of transparency. “It still fails to admit to the public key details, such as how much money is passing hands, what they are actually developing and the data still includes the whole population,” Powles said.
“Tech is capable of having a more efficient solution, which keeps control of the data with the trust,” Powles added. “The Royal Free is basically giving up the data and saying ‘do good things with it’.”
The report also questions public bodies giving away data to private companies that will profit from the information. “We need to signal to people working in health, surveillance or science the value of these datasets and the public interest in making sure that any innovation that happens is principled – with value for patients and the NHS in this case.”
As the report points out: “We do know that DeepMind will keep all algorithms that are developed during the studies.”
DeepMind and the Royal Free trust issued a statement – disputed by the researchers – dismissing the report’s findings. “This paper completely misrepresents the reality of how the NHS uses technology to process data,” the statement claimed.