The government has no plans to introduce legally binding measures to allow citizens to challenge the outcome of decisions made using algorithms, its response to a select committee report has indicated.
MPs said in May that the government should consider giving people the right to ask how machine-learning programs made decisions affecting them. This “right to explanation” would increase accountability and transparency in public services and in the private sector, the Science and Technology Committee said.
In its report, the committee also said the government's Centre for Data Ethics and Innovation and the Information Commissioner’s Office should examine how to allow individuals to challenge the results of algorithm decisions and seek redress where appropriate.
The government’s response, published on 10 September, said the centre would “have an ongoing role in reviewing the adequacy of existing regulatory frameworks” and identifying ways to strengthen them, but stopped short of addressing the lack of a legal framework for challenging algorithm decisions.
The government acknowledged the need for the civil service to be transparent about how it uses algorithms. However, it did not commit to producing and maintaining a list of where algorithms “with significant impacts” are used, as recommended by the committee.
It also failed to address the committee’s call to add the use of algorithms to a ministerial brief. MPs had urged the government to appoint a named minister to oversee and coordinate its departments’ use and development of algorithms and their partnerships with industry.
And it appeared to reject the committee's recommendation for the Crown Commercial Service to commission the Alan Turing Institute or another expert body to conduct a review establishing an appropriate procurement model for algorithms. The response said the CCS regularly reviewed where there might be a need for commercial procurement agreements, adding that it would work with the Turing Institute and others to inform the categories for review.
It did accept the committee’s recommendation that government departments should continue to make public sector datasets available for big data and algorithm developers. It also agreed that departments should engage with “innovative organisational models… to provide access to more, better quality, timely, and machine readable open data”.
“The government recognises that there is work to be done in order to ensure the quality of published data is of the highest calibre,” it said.
It also accepted many of the recommendations relating to the Centre for Data Ethics, which the committee said should play a critical role in safeguarding against bias in decisions made using algorithms and ensuring their use is transparent. In a letter accompanying the response, Margot James, minister for digital, said the committee’s report would “inform our thinking both as we finalise the terms of reference for the centre, and as we discuss the initial work programme with the centre’s chair”.
However, the government did not go as far as to accept MPs’ recommendation that the centre’s first task should be to assess the tools available for auditing algorithms. Such tools were “urgently needed” given the proliferation of algorithms, and the centre should advise government on which to prioritise, the committee said.
“The government expects the centre to support industry and the public sector by identifying, sharing and building on best practice - including (but not limited to) mechanisms such as codes of conduct, standards and principles,” the response said.
The response also accepted the committee’s recommendation to work with the funding body UK Research and Innovation to determine whether there was a need for more government-backed research into the benefits and risks of algorithms.