By Jonathan Owen

14 Oct 2019

Having high-quality data is a prerequisite if policy and decisions are to be truly evidence-based. Jonathan Owen reports on a CS Live round table where leading civil servants came together to share ideas on how government can improve the way data is used to help design and deliver services.


Cabinet Office permanent secretary John Manzoni chaired the discussion, sponsored by global digital transformation specialists Cognizant, which was held at Civil Service Live in London.

Britain is “one of the most digital governments in the world” and data has “enormous potential for us”, Manzoni said.

“The question is making better decisions: how can government use data more effectively to improve how we design and deliver services?”

Tom Smith, managing director of the data science campus at the Office for National Statistics, paid tribute to some “really excellent work going on” in government. The campus has 70 people recruited from industry, academia and across government, and its role is “across the government’s statistical service as an analysis function.”

Smith commented that the “government and the civil service has a huge skill set.” The question is how those are shared and “how do we build up cross-government capability for data sharing and also for sharing the tools and techniques that we are developing?”

However, the government is not necessarily best placed to exploit its own data, according to Cathrine Armour, director of the customer division at the UK Hydrographic Office.

“How can government use its own data more effectively? Perhaps we shouldn’t be using it, perhaps, particularly where it’s not GDPR collected, we need to provide it to others who have better domain expertise and understand our customers and their journeys more than we do,” she said.

“Not all data is created equal. We need to be really conscious of the fact that we want integrity of the decisions that we are making, we need integrity of the data we are providing to make those decisions,” Armour added.

On the issue of building up skills within government, Steven Steer, head of data at Ofgem, commented: “We need to have expertise ourselves to be able to have good approaches to policies on the market.” He cited the rise of online consultations in recent years as an example of how digital technology can increase participation in government. In Ofgem’s case, the organisation is moving towards “self service of regulation” and “crowd sourcing of regulatory rules,” he said.

“We need data and digital capabilities to enable the right decision to be made, not necessarily to be the ones directly doing it,” Steer commented. 

The issue of greater standardisation over the collection, sharing and usage of data dominated the discussion. Marcus Bell, director at the Cabinet Office’s race disparity unit, expressed concern over a lack of standardisation in data.

He outlined how different departments use different words and terms to define ethnicity, something which is part of a wider issue. “Big tech is based on linking together different data sets. If the currency is all different it’s almost impossible to do that, so I’m wondering if there’s a case for a bit more top down prescription about data standards,” he said.


The case for common standards was reiterated by Erika Lewis, deputy director, national data strategy at the Department for Digital, Culture, Media & Sport. “We are pulling together, with ONS and GDS, some work for a spending review package, but the focus will be on some cross-governmental work and also a big push on standards, because that’s exactly what I’ve been hearing since I arrived in DCMS. That’s the big thing, we’ve got to grab it by the horns,” she said.

Using master data management, where there is a central source of data that everything else spins off from, “is a very standard way of doing things in the private sector” and could be useful for government too, according to Matthew O’Kane, VP EMEA head of AI and analytics, Cognizant.

In terms of the wealth of data available, not just within government, Julie Pierce, director of openness, data and digital, Food Standards Agency, commented: “There’s a huge amount of global data out there that’s open and published and that’s a huge asset to us all.”

Potential legal issues around the use of data were raised by Wendy Hardaker, commercial law director at the Government Legal Department. She warned: “It’s about the way in which you’ve collected the data, often, as to what you can then do with it, and you can fall into all sorts of elephant traps if you’ve not thought about that at the start of the process. We now have a whole wealth of data, all collected in lots of different ways, the question then is can we share it?”

Some of the difficulties around the use of data revolve around trust, according to Yatin Mahandru, head of public sector at Cognizant: “The citizen’s trust in the data and what government are going to do with it and the service the government is going to provide, and then also the trust between departments when you are exchanging that data as to what it’s going to be used for, whether it’s enforcement or something else.”

Manzoni brought the discussion to a close by outlining what he called “three big dilemmas as we nudge forward.” The first is the “top-down, bottom-up one: do we let a thousand flowers bloom, do we put in the standards, at what point do we do this?”

The second surrounds open data. “Is it a total muddle, or does it have value for somebody else? And if it has value for somebody else, why would I give my data away when I could sell it?” he asked.

Third is the skills issue. “How do we actually teach people to frame the question right, so have we got that right or are we focussing on the high-end data scientists or actually, should we do something completely different?”
