In February 2019, Matt Hancock announced plans for a new innovation agency in the NHS. A year on, its chief executive, Matthew Gould, talks to Beckie Smith about health data, listening to patients, and his move from diplomacy to digital


This interview was conducted before the UK's coronavirus outbreak

You could say Matthew Gould’s journey to leading NHSX started three and a half thousand miles away. A career diplomat, Gould was serving as the British ambassador to Israel when he helped to launch the UK-Israel Technologies Hub – an initiative run out of the British Embassy in Tel Aviv to forge technology partnerships between UK and Israeli companies.

“I spent a lot of time building links between the UK and Israel in tech. The Israeli tech scene is extraordinary,” Gould says.

Israel was his last international posting – previously he had spent time in Islamabad, Washington, Manila and Tehran, during which time he was sent to negotiate the release of a group of marines and sailors who had been taken prisoner by the Iranian Revolutionary Guard. (“When you get off a plane and into a taxi and turn on the BBC News, and item number one on the worldwide news is the fact that you’ve just got off a plane, you feel very much like you are the centre of things and need to deliver,” he says.)


Rather than go back to the Foreign Office when he returned to the UK in 2015, he became director of cybersecurity at the Cabinet Office – a move in part spurred by his involvement in the tech partnership in Israel, he says.

“That in turn led me to become the government’s first director general for digital policy. And then having lived it on a national and policy level, the prospect of doing it for real in a very sort of punchy way here [for the NHS] was too good an opportunity to pass up.”

Gould became chief executive of NHSX – an organisation that sits across NHS England and the Department for Health and Social Care – last May, three months after health secretary Matt Hancock announced its creation. Its aim, according to Hancock, is to “bring strategy, leadership and technical expertise to... the world’s most exciting public sector digital transformation project”.

The role of NHSX – and its relationship to existing bodies like NHS Digital, which provides tech and information systems to support healthcare – is not yet well understood by many outside the organisation. But when CSW puts this to the former ambassador, he says the two have very different roles to play.

“NHSX is the guiding mind at the centre: we are the centre for strategy, we hold the budgets, we set the policy and we commission from NHS Digital and from others like NHS Business Services Authority.”

He refers to a speech by Hancock in January, which described NHS Digital and NHS BSA as the “tracks down which tech runs in the NHS”. Gould adds: “So they are the core delivery partner for health and care.”

“They do a lot of build, normally commissioned by us, so there is a clear distinction between our world and theirs. We haven’t done a good enough job yet explaining that publicly – partly because we’ve been working out ourselves exactly how we want the delineation to work – but I think it is a familiar one. And I think it’s useful. And I think it will be clear.”

For now, the two words the public are likeliest to associate with NHSX are probably “innovation” and “data”. Last year headlines declared that the innovation body had taken control of industry access to patients’ data and commercial deals with the NHS.

“I’m not sure I’d frame it quite like that,” Gould says, choosing his words carefully. Rather, it provides policy and guidance on the use of data. NHSX is in the process of setting up a team known as the Centre for Expertise, whose job will include providing commercial and legal advice to NHS organisations signing data agreements, developing standard contracts and guidance, and advising on ensuring deals benefit the health service.

And NHSX has worked with NHS Digital and with the Office for Life Sciences – which straddles DHSC and the Department for Business, Energy and Industrial Strategy, and which champions research, innovation and tech in healthcare – to come up with a set of principles to govern the use of patient data, including in commercial transactions.

“In all this, we’re very clear, first of all, that the privacy of patient data is absolutely paramount. We need to keep patients with us and we need to keep staff with us, so we can’t do anything that erodes trust,” Gould says.

“People who are trying to care for patients at their most vulnerable are doing so with a hand tied behind their back because we haven’t found a way to make sure that the information about the patient can flow in the way that it needs to”

The principles establish, among other things, that any use of patient data must have an “explicit aim to improve the health, welfare and/or care of patients in the NHS”; that the terms of data-sharing agreements should include “quantifiable and explicit benefits for patients”; and that agreements should be transparent and must not undermine the work of the health service.

And there is one that Gould calls an “absolute”: no hospital or trust can sign a deal giving a research group or company exclusive access to its patient data. “Because one of the things that is most valuable about the patient data that the NHS holds, is that it is a coherent set of 55 million people’s records.” Whittling away at the whole would undermine that value, he says.

“That’s very corrosive. I don’t think the public would support it, and I think it’s damaging to the system. So we’ve laid down that rule.”

But CSW wonders how many members of the public need to get as far as thinking about cohorts or exclusive agreements before they start to get nervous about how their data is used. The bigger worries for many people are twofold: that they won’t be given a choice over what happens to their data, and that it could be traced back to them.

These aren’t new worries. In 2016, the ill-fated care.data project, which was to extract anonymised records from GP surgeries into a central database, was closed after concerns about patient consent and confidentiality. But one need only look at the last year’s worth of headlines to see that debate around the topic is not getting any less heated, and that some of the public have already lost trust in what is happening to health data. In December, the Atlantic ran an article warning about the trend of big companies gathering up huge reams of health data – whether that’s Apple gathering it directly through footstep and sleep trackers on people’s iPhones, or Facebook’s AI division working with radiologists to teach its machine-learning systems to read MRI scans. The headline? “Everyone should be worried by big tech’s huge NHS data grab.”

What does Gould think – should everyone be worried? “Well, I think what I would rather frame it as is: we should make sure that whatever we do is in line with the principles that we’ve set out,” Gould says.

He says polling shows that while there will be people at either end of the spectrum of attitudes to data, “there’s a large group of people in the middle... who are comfortable with their data being used to develop new treatments or therapies or whatever it is, as long as they are confident that it’s being done in a way that sufficient benefit comes back to the NHS”.

Indeed, a poll commissioned last year by the National Data Guardian, Dame Fiona Caldicott – who advises and challenges the health and care system on safeguarding citizens’ confidential information – found about seven in 10 people in England approved of partnerships that led to benefits like access to new technologies at a reduced cost and better patient care. In other areas the respondents were more evenly split: half felt it was acceptable for private companies to share in any profit generated from such deals, and exactly half were happy for the NHS to receive a one-off payment for access to data.

But the question that remains unaddressed is what to do about the cohort that Gould readily admits “would find it difficult to countenance their data being used for any purpose other than direct care”. And another uncomfortable issue remains. What if the safeguards that Gould talks about don’t work?

Last year, the Clinical Practice Research Datalink – part of the Medicines and Healthcare Products Regulatory Agency, through which companies can buy patient data compiled from GP surgeries and hospitals for research purposes – changed the way it described the data it was selling from “anonymous” to “anonymised”. The latter means it has been through processes to de-identify the patients it comes from – but there is no rigid standard for how this should be done.
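CPRD does not publish its pipeline in this level of detail, but a minimal sketch can illustrate the kind of processing “anonymised” implies: stripping direct identifiers, replacing them with a pseudonym, and coarsening quasi-identifiers such as dates of birth and postcodes. All of the field names and rules below are invented for the illustration.

```python
# Hypothetical illustration only: a minimal de-identification pass of the kind
# "anonymised" implies. Field names and rules are invented for this sketch and
# do not describe CPRD's actual pipeline.
import hashlib

DIRECT_IDENTIFIERS = {"name", "nhs_number", "address", "phone"}

def de_identify(record: dict, salt: str) -> dict:
    """Strip direct identifiers and coarsen quasi-identifiers in one patient record."""
    out = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    # Replace the NHS number with a salted one-way pseudonym so records can be
    # linked across datasets without exposing the original identifier.
    out["pseudonym"] = hashlib.sha256((salt + record["nhs_number"]).encode()).hexdigest()[:16]
    # Generalise quasi-identifiers: full date of birth -> year, full postcode -> outward code.
    out["birth_year"] = record["date_of_birth"][:4]
    out["postcode_district"] = record["postcode"].split()[0]
    del out["date_of_birth"], out["postcode"]
    return out

print(de_identify(
    {"name": "Jane Doe", "nhs_number": "9434765919", "address": "1 High St",
     "phone": "07700 900000", "date_of_birth": "1954-06-02", "postcode": "M21 9UN",
     "diagnosis": "C50.9"},
    salt="example-salt",
))
```

Even after such a pass, the quasi-identifiers that remain – birth year, postcode district, diagnosis – can in principle be pieced together with other datasets, which is exactly the re-identification risk Gould goes on to acknowledge.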

Two days before CSW meets Gould, the Observer has published claims by anonymous “senior NHS figures” saying patient data sold for research via CPRD could be traced back to individual medical records.

Gould says he wouldn’t want to comment on that story, given that it concerns a separate government body. “That was a particular data set belonging to a particular organisation. I wouldn’t leap to a conclusion that because people have concerns, therefore it’s not working.”

But leaving aside that specific example, Gould acknowledges that when data is shared, “there is certainly a risk, depending on what data you have, that it can be pieced together and the process of anonymisation can be reversed.”

He adds: “The question is, what systems and what safeguards do you have in place, so that you can be sufficiently confident that doesn’t happen?”

The conversation returns to those safeguards when Gould speaks about another of NHSX’s areas of expertise: artificial intelligence. Last autumn, the agency put out a report on good practice in AI, and is now in the process of setting up the National AI Lab.

The lab – a joint venture with the Accelerated Access Collaborative, which speeds up adoption of new treatments and tech in the NHS – will fund and develop artificial intelligence technologies to tackle tough healthcare challenges.

Gould says the lab is “starting to turn into more concrete plans” after the initial announcement of £250m to set it up last year. It will provide grants and facilitate cross-government, industry and academic collaborations on both mature and earlier-stage AI tech. The first tranche of funding, £114m, was announced in January.

But many people have mixed feelings about AI – not least because of a scandal that erupted when the Royal Free Hospital handed over 1.6 million patients’ health records to DeepMind, an AI company owned by Google, without patients’ consent. The Information Commissioner’s Office later found the hospital had failed to comply with data-protection legislation.

When CSW asks what some of the safeguards around data and AI should be, Gould says: “First of all, making sure that patient confidentiality is protected. So the data needs to be shared in a way that does that. If you allow an algorithm to span over a data set, have you done that in a way that protects patient data or puts it at risk? It’s a technical question about how you structure the access.”

Gould also acknowledges that “there is concern, both in the NHS and in the health tech community, that we haven’t created coherent rules [for AI] that all fit together and give people the sense that there’s cover for what they’re doing”.

In January, he convened a roundtable with the heads of 14 regulators and agencies involved in setting those rules – among them the MHRA, the information commissioner and the Health Research Authority. There they agreed to develop a platform to improve communication and share advice; create a regulatory “sandbox” where AI tech can be tested; and identify any gaps in regulation.

Gould is keen to realise what he says has been a long-held belief that “smart, innovation-friendly regulation that keeps citizens safe” will give the UK a competitive edge internationally. In other words, rules must be in place, but must not be so stringent that they “strangle the technology”.

One risk the regulators will consider has become something of a ministerial talking point ever since then-PM Theresa May warned in Davos in 2018 that public bodies must ensure “algorithms don’t perpetuate the human biases of their developers”. She said the government’s new Centre for Data Ethics and Innovation must tackle this.

Gould was on the team that set up the data-ethics centre in his previous role as director general for digital and media policy in the Department for Digital, Culture, Media and Sport, “so it’s an area of thought that I’ve sort of brought with me to this job,” he says. He wants AI to be used to tackle health inequalities, not worsen them.

“We’re super mindful of the risks. There’s no easy answer other than making sure that in regulation and practice, how we distribute resources and everything we do, the risks of exclusion, algorithmic bias and unintended consequences have been properly thought about, and we’re doing it in a way that mitigates against those risks.”

In October, NHSX published a report entitled “Artificial intelligence: how to get it right” setting out a framework for governance, models of “good AI” and guidelines for international best practice.

It’s important to get this right because AI has huge potential benefits for the NHS, Gould says: “One of the exciting things about this is that some of the more mature and proven AI systems are precisely in those areas where the NHS has some of its most acute clinical shortages.” AI programmes have been developed to read mammograms, and work with DeepMind at Moorfields Eye Hospital in London has led to a breakthrough tool to improve referrals for eye disease.

And there is also a wealth of non-clinical AI applications that could save the NHS time and money, Gould adds. A doctor at University College London Hospital has developed a system to predict when someone is likely to miss an appointment, which can then be used to send out prompts.
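The UCLH system itself isn’t public code, but the general approach – train a classifier on past appointment records, then flag high-risk bookings for a reminder – can be sketched with made-up features and synthetic data, purely as an illustration.

```python
# Hypothetical sketch of appointment no-show prediction; the features and data are
# invented, and this is not the UCLH system.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
# Made-up features: days between booking and appointment, patient age,
# number of previously missed appointments.
X = np.column_stack([
    rng.integers(0, 90, n),     # lead time in days
    rng.integers(18, 90, n),    # age
    rng.integers(0, 5, n),      # prior no-shows
])
# Synthetic labels: longer lead times and more prior no-shows make a miss likelier.
p_miss = 1 / (1 + np.exp(-(0.03 * X[:, 0] + 0.8 * X[:, 2] - 3)))
y = rng.random(n) < p_miss

model = LogisticRegression().fit(X, y)
# Score tomorrow's clinic list and flag the riskiest bookings for a reminder text or call.
risk = model.predict_proba([[60, 34, 2], [3, 71, 0]])[:, 1]
print(risk)
```

The point is operational rather than clinical: even a crude risk score lets a booking team target reminders where they are most likely to prevent a wasted slot.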

“It’s really easy in a big system to lock yourself away at the centre and come up with the answers. But we won’t get that right unless we listen to patients”

Gould is quick to point out that NHSX has another – rather less controversial – strand of work on data sharing, within the health service itself. “There’s a lot of stress in the system around when it’s okay to share patient data, even for direct care,” he says.

In his early days in the role, Gould visited hospitals, GP surgeries and mental-health trusts and found that while it was technically possible to transfer patients’ records from one of these bodies to another, it often wasn’t happening because people were nervous about what would happen if they mistakenly shared the wrong data. Overwhelmed by seemingly conflicting guidance from different authorities, “some people have got a bit paralysed”, he says.

NHSX is using what Gould calls its “convening power at the centre” to work with the National Data Guardian, the Information Commissioner’s Office and NHS Digital to simplify the guidance “so that people know when it’s safe to share and do so”. He adds that this is a way NHSX “can directly impact on patient care and patient outcomes.”

Gould became familiar with these data blockages on a personal level when his wife underwent treatment for breast cancer last year. “She was treated in four places, all local, none of which could speak to any of the others. She would have to carry around her scans, all her details, her test results; she would very often go and see the oncologist, who wouldn’t be able to call up the results of the scans that he had ordered.”

And Gould points out that not everyone can do that. “It’s infuriating when you’re in control of it; it’s heartbreaking when the patient isn’t in a state to be able to provide their own interoperability.” He recalls visiting a hospice outside Manchester, where staff found themselves unable to access records for the patients in their care.

“That’s heartbreaking, when you think that these people who are trying to care for patients at their most vulnerable are doing so with a hand tied behind their back because we haven’t found a way to make sure that the information about the patient can flow in the way that it needs to,” he says.

“This just is not the best, most efficient, safest way to run the system. If we can help with that, through creating clarity around the rules so that the data can flow, creating shared care records so that technically it’s possible, having both technical standards, but also standards for how we describe different things so that systems can speak to each other – I think that will probably be the single most useful thing NHSX does.”
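Gould doesn’t name particular standards here, but the idea is easy to make concrete. The sketch below uses a FHIR-style resource and a SNOMED CT code – both widely used in NHS interoperability work, though their use here is an assumption for illustration – to show how a shared structure and a shared clinical vocabulary let a receiving system interpret a record without bespoke translation.

```python
# Illustrative only: a FHIR-style observation using a SNOMED CT code, showing how a
# shared data standard and shared vocabulary let two systems interpret the same record.
# The article doesn't name these standards; they are assumed here for the example.
import json

observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {
        "coding": [{
            "system": "http://snomed.info/sct",
            "code": "271649006",            # SNOMED CT concept: systolic blood pressure
            "display": "Systolic blood pressure",
        }]
    },
    "subject": {"reference": "Patient/example"},
    "valueQuantity": {"value": 128, "unit": "mmHg"},
}

def describe(resource: dict) -> str:
    """Any system that understands the same structure and code system can read this."""
    coding = resource["code"]["coding"][0]
    value = resource["valueQuantity"]
    return f'{coding["display"]} ({coding["code"]}): {value["value"]} {value["unit"]}'

# Round-trip through JSON as the record would arrive over an API.
print(describe(json.loads(json.dumps(observation))))
```

Because both sides agree on the structure and on what the code means, the hospice in Gould’s example could display the reading without knowing anything about the system that produced it.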

As well as drawing on his own experiences and visits to health providers, Gould says he values patients’ perspectives on the health service.

In November, he announced NHSX was seeking two citizens to act as part-time advisers to ensure “the patient voice is properly built into everything we do”. The agency will consult them during the conception, design, and development of new services.

Asked where the idea comes from, he says the “X” in the organisation’s name stands for “user experience”.

“One of our guiding principles is it’s all about user need. It’s really easy in a big system to lock yourself away at the centre and come up with the answers. But actually, we won’t get that right unless we are talking to users, and we’re listening to patients, listening to staff and listening to people at the front line.”

Every project should have a “patient voice” represented, he says, as well as the rigorous user research that already takes place. Otherwise, it’s too easy to get caught up in what an organisation thinks patients need, even if that doesn’t quite align with what they actually do need.

“Because ultimately, this is not about the tech. It’s not even about the data. It’s about patients. It’s about the citizens. And so having those people who will make sure that in everything we do, we are focused on that and we remember why we’re doing it and we skew what we do to the patient, I think can only be a good thing.”
