Since the GDPR became effective, there has been significant concern about NHS Digital’s failure to secure appropriate consents to data sharing in a flawed IT project.

September 16, 2021


The enduring link between health and data has saved thousands of lives during the pandemic.

But the history of NHS data usage is more problematic, particularly data sharing with third parties and data protection.

In May, the government announced the General Practice Data for Planning and Research (GPDPR) scheme.

Under the scheme, GP health data for everyone registered with a practice in England would be made available for healthcare research and planning.

To protect their privacy, individuals’ identifying details would be partially removed.

But privacy campaigners cautioned that the removal of identities could be reversed. Within weeks, an online campaign encouraging people to opt out of NHS data sharing had gained nearly 1.4m supporters.
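That caution reflects a well-documented weakness of partial de-identification: pseudonymized records that retain quasi-identifiers such as birth date and postcode can often be re-identified by joining them with an auxiliary dataset. The following is a minimal sketch of that linkage technique, using entirely synthetic data and invented column names rather than the GPDPR schema:

```python
# Minimal sketch (synthetic data) of a linkage attack: pseudonymized
# health records are re-identified by joining them with an auxiliary
# dataset on quasi-identifiers such as birth date and postcode.
import pandas as pd

# "De-identified" release: names replaced by opaque IDs,
# but quasi-identifiers retained.
health = pd.DataFrame({
    "pseudonym": ["a9f3", "7c21", "e04b"],
    "birth_date": ["1980-02-14", "1975-07-01", "1980-02-14"],
    "postcode": ["SW1A", "M1", "LS2"],
    "diagnosis": ["asthma", "diabetes", "hypertension"],
})

# Auxiliary dataset an attacker might obtain (e.g. electoral roll,
# marketing lists).
public = pd.DataFrame({
    "name": ["Alice", "Bob"],
    "birth_date": ["1980-02-14", "1975-07-01"],
    "postcode": ["SW1A", "M1"],
})

# The join restores identities wherever the quasi-identifier
# combination is unique in both datasets.
reidentified = public.merge(health, on=["birth_date", "postcode"])
print(reidentified[["name", "diagnosis"]])
```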

Following the backlash against making patient data available to private companies, the plan was put on hold.

The NHS has a checkered history in data sharing and protection. In 2016, the Royal Free NHS Foundation Trust was censured by the Information Commissioner's Office (ICO) over data on 1.6m people that the Trust handed over to Google's DeepMind (an AI company) to enhance its machine learning capability.

The ICO ruled that the Royal Free failed to protect patients’ privacy and that it was "inexcusable" that they had not been informed.

Since the General Data Protection Regulation (GDPR) became effective in 2018, there has been significant concern about NHS Digital’s failure to secure appropriate consents to data sharing in a flawed IT project.

Currently, the NHS is working on AI projects via NHSX that deploy machine learning in research and development, raising questions over what people know about how their data is being used.

This year, Big Brother Watch reported that NHS Digital’s management of Covid vaccination status data had failed to deliver basic safeguards, leaving the information open to exploitation by insurers, companies, employers, and scammers.

Once it was revealed that people’s vaccination status was being leaked, NHS Digital altered its vaccination booking website.

Potential misuse is a big issue when the NHS shares confidential patient data with third-party organizations.

What happens to that data if it becomes part of an AI project? Critically, what do data subjects know about the consent they provide for processing that data, and about where it is used?

The NHS has nearly 1,000 external suppliers, while the number of NHS Digital supply chain partners is not itemized.

NHS Digital simply states: "Our supply chain partners are fundamental to our ongoing success, creating significant value through the delivery of new thinking and innovative solutions. Through the deployment of Strategic Supplier Relationship Management (SSRM) we are focused on creating an effective and collaborative relationship with our most important suppliers, creating additional value and innovation that goes beyond our contracts."

It adds the following on data usage: "We ensure that external organizations can access the information they need to improve outcomes, and the public are confident that their data will be stored safely by NHS Digital."

Public confidence in NHS Digital’s commercial relationships is open to question.

The Data Protection Act 2018 and the GDPR are designed to ensure that an individual data subject (the person giving consent) is fully apprised of every use of that data and of how long it will be stored.

Post-GDPR, uncertainty exists concerning AI projects: how often data is used in the machine learning process and where it ends up.

The designers of machine learning programs carefully guard information about the algorithms that underpin them. Individual data subjects do not know what happens to their data: there is very little transparency.

Individuals are rightly concerned about whether, once they have given consent, they can withdraw it and have their data removed from the AI system. If they cannot, the processing does not accord with GDPR principles.
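The technical obstacle is that a trained model absorbs information from every record it sees, so deleting the raw record afterwards does not undo its influence. The sketch below, which uses synthetic data and a generic scikit-learn classifier rather than any actual NHS system, illustrates why honoring a withdrawal of consent generally requires retraining, the so-called machine unlearning problem:

```python
# Minimal sketch (hypothetical data and model) of why consent withdrawal
# is hard once data has been used for training: deleting the raw record
# does not remove its influence from the already-fitted model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                          # stand-in patient features
y = (X[:, 0] + 0.5 * rng.normal(size=100) > 0).astype(int)  # stand-in labels

model = LogisticRegression().fit(X, y)  # record 0's influence now sits in the weights

# The data subject withdraws consent, so record 0 is deleted from storage...
X_remaining, y_remaining = X[1:], y[1:]

# ...but the deployed model is unchanged: its coefficients still encode
# information learned from the deleted record.
print(model.coef_)

# Fully honoring the withdrawal would require retraining (or a dedicated
# "unlearning" procedure), which data controllers rarely undertake.
retrained = LogisticRegression().fit(X_remaining, y_remaining)
print(retrained.coef_)  # differs: the old model retained the record's imprint
```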

The EU is now considering GDPR-like regulation for AI, but the UK seems unlikely to follow.

In the UK, GDPR and data protection rights will probably evolve more through judicial intervention than through additional regulation, and regulatory divergence from the EU will, over time, produce greater judicial divergence from EU law.

NHS Digital’s future relationships with third-party companies give cause for concern.

Given their track record, state-owned entities like the NHS simply do not seem to have the technical capability to understand what AI projects do.

The NHS buys in outside resources that sometimes have their own agendas. Notably, some US tech companies operating in the NHS market have a very well-established agenda around the provision of current and future services, and how they can monetize them.

NHS Digital’s technical capability is not in doubt, but its understanding of some of the tech companies with which it does business may well be.

Those companies’ core objectives do not easily align with those of the NHS. And although the NHS has sophisticated diagnostic processes that facilitate early diagnosis of difficult conditions, essential changes need to be made.

In some areas, its technical systems are floundering: the more the NHS relies on outside agencies, the greater the risk that it will not achieve the appropriate level of compliance, particularly where no alignment of interests exists.

Regrettably, significant data misuse seems inevitable, as does the ensuing litigation. So, too, do increased consumer understanding and greater transparency in how individuals’ data is managed.

Presently, few people appreciate the commercial value of their personal and medical data. In some instances, it's probably worth more than gold.

Greater awareness will be achieved by investigations into what tech and AI products are doing.

Access to that information will enable people to understand how their data is used and, hopefully, to regain some control.

Inevitably, this will provoke litigation against some of the organizations with which the NHS has commercial relationships: their financial motives will lead to actions against them. How data is implemented and processed will be key.

People will not sue the NHS for how it endeavors to improve its services and handle their data. Instead, those that handle that data on its behalf will be in the line of sight.

Kingsley Hayes is Keller Lenkner UK’s head of data breach. Previously a managing director at data breach law firm Hayes Connor Solicitors, Hayes is an expert litigator and has represented multiple claimants in complicated class actions.

