Last week, departments published figures showing their spending on a range of back office functions. Suzannah Brecknell analyses what they reveal about departments’ efficiency – and about the weakness of government data.
Last Thursday was a bumper day for data-hungry Whitehall-watchers. Departments published information on bonuses paid over the last year; workforce data over the last six months; their spending on government procurement cards; and the second of their Quarterly Data Summaries (QDS) – updates to departmental business plans, providing the latest management information and reports on progress against key business plan indicators.
The ministerial statement which heralded these publications focused on the first three data sets – perhaps understandably, since performance-related pay and spending on procurement cards have attracted great interest from the media, and can be presented as positive stories for the coalition government. The QDS, meanwhile, have attracted relatively little external comment and present an altogether more mixed story.
The QDS update the management and financial information that was first published in annexes accompanying business plans in May 2011. At first glance, the business plan annexes and QDS provide a remarkable resource and a big step along the road towards transparent government. They follow a template and definitions set out by the Cabinet Office, and are accompanied by documents giving qualifications where data is unavailable or not fully comparable. They should, therefore, allow interested outsiders – and concerned civil servants – to compare a range of indicators such as staff turnover, absence rates, accuracy of financial forecasts and broad progress against business plans. They also appear to enable comparisons of departmental spending in key common areas such as HR – exactly the sort of data that the government hopes armchair auditors will scrutinise.
Before making these comparisons, however, armchair auditors must note some fairly large caveats. These are best summarised by the message which sits at the bottom of all of these documents, warning observers to pay careful attention to the notes and qualifications, and adding: “Many of the measures are not yet directly comparable because they do not have common definitions, time periods or data-collection processes.” For corporate services costs such as HR and finance, for example, the Cabinet Office specified that departments could either use the definitions set out in the Treasury’s Operational Efficiency Programme, or their own definitions. This leaves the door wide open for variations in the data, before we even begin to consider the different ways in which departments report and collect their financial information.
There is also a qualification that the data is based on management accounts rather than audited financial statements, and since it is quarterly data it does not compare easily with other data published in annual accounts or by the NAO. Armed with these caveats, CSW waded into the information, looking at a few key areas to consider what the data tells us about corporate services and transparency across Whitehall.
Human resources
To assess the efficiency of departments’ HR functions, we can compare the amount they spend on HR per full-time equivalent (FTE) employee. The data included in business plans and QDS allows us to do this for the 2009-10 financial year, and for the period January to June 2011.
The 2009-10 figures show that the 17 departments which filed these reports spent a total of £545m on HR services for nearly 500,000 FTEs, equating to a cross-Whitehall average of £1,269 per FTE. There is, however, considerable variation between departments (see Figure 1), with spend per FTE ranging from £920 at HM Revenue & Customs (HMRC) to £5,137 at the environment department (Defra).
There is a close correlation between size and spend per head, with larger departments coming out more efficient. This variation is what we would expect for a service such as HR, where many processes are repeatable and economies of scale can be achieved. It may also lend strength to arguments for greater use of shared services, which might reduce costs for smaller departments by enabling them to share the efficiencies which big players can achieve.
It’s trickier to examine changes in the data over time, most obviously because we move from an annual figure to a quarterly one. The average spend per FTE in the first quarter of 2011-12 (see Figure 2) works out at £263 (excluding departments which did not report all information). If spend and FTEs remained constant over the following nine months, this would equate to an annual figure of £1,053 – a cost reduction of around 17 per cent from 2009-10. However, consistency is not a feature of Whitehall workforces or spending at the moment. According to the figures here, the number of FTEs in departments fell by two per cent between the last quarter of 2009-10 and the first of 2011-12, while average HR spending per FTE fell by seven per cent.
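For readers who want to reproduce these comparisons from the published spreadsheets, the arithmetic is simple enough to sketch in a few lines of code. The inputs below are the rounded figures quoted above, not the raw QDS data, so the output differs from the article’s figures by a pound or so of rounding.

```python
# Back-of-envelope check on the HR figures quoted above, using the
# article's rounded numbers rather than the raw QDS spreadsheets.

avg_2009_10 = 1269      # annual HR spend per FTE, 2009-10 (GBP)
avg_q1_2011_12 = 263    # quarterly HR spend per FTE, Q1 2011-12 (GBP)

# Annualise the quarterly figure by assuming spend and FTE numbers
# stay constant over the remaining three quarters.
projected_annual = avg_q1_2011_12 * 4   # 1,052 here; the article's 1,053
                                        # reflects an unrounded quarterly figure
reduction = 1 - projected_annual / avg_2009_10

print(f"Projected annual spend per FTE: £{projected_annual:,}")
print(f"Implied reduction vs 2009-10: {reduction:.0%}")   # ~17%
```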
James Page, a senior researcher at the Institute for Government who leads its work studying the transformations taking place across Whitehall, suggests the fact that HR spend fell more quickly than FTE numbers fits with what we know about the intentions of senior leaders in the HR profession, who have made it clear that they aim to reduce the ratio of HR professionals to staff through their Next Generation HR programme. Another possible interpretation, Page says, is that by the end of the 2009-10 financial year several departments had already been through major reductions in FTE numbers, and then began to cut the corporate services which had supported those reductions. “Potentially, this is HR catching up after seeing through the downsizing in some departments,” he says.
Page does, however, raise some question marks over the reliability of the data. He says an increase of nearly 33 per cent in HR costs in the Department of Health from one quarter to the next seems “improbable”, while it seems “equally unlikely” that the Cabinet Office has reduced HR costs by 54 per cent over this period. This could point to a change in the way spend has been counted from one quarter to the next, as definitions are standardised or improved; it could also point to annual fixed costs that appear in one quarter’s accounts.
While Page cautions that “these are just two data points, so it’s dangerous projecting long-term trends from them”, he adds: “it certainly shows that there’s a huge amount of change taking place.” Even allowing for ongoing changes in both civil service headcount and the structure of HR teams, the figures here suggest that we will see a substantial reduction in HR costs by the end of 2011-12. Whether that change is accompanied by a reduction in the variation in departments’ HR costs will be an indicator of the success or otherwise of both the Next Generation HR team, and proponents of shared services across Whitehall.
Estates
The QDS and business plan annexes include two figures that can be used to compare departments’ efficiency in managing their estates. Alongside the total cost and size of their office estate, each organisation is required to report the cost per m2 and per FTE. As with HR, we can compare figures in 2009-10, and quarterly figures for January to June 2011. In addition, some departments have given figures for the whole 2010-11 year.
In all of the categories we examined, some departments were unable to provide information for the requested time periods, but it was in the estates data that this became most obvious. Of 17 departments, seven were unable to report data for April-June 2011, and eight were unable to report data for the 2010-11 year. Many indicated that they were waiting for annual figures collected through the Government Property Unit’s benchmarking exercise.
Taking only those that were able to provide the most recent figures, there was a surprising shift in the total size of office estate: it grew by eight per cent. This seems odd given the current focus on efficiency and budget cuts, but on closer inspection it was caused by just two departments – the Cabinet Office and the energy department, DECC – which had increased their estates by 30 and 17 per cent respectively. A DECC spokesperson explained that expansion was needed to house new staff working on a policy area that has transferred to DECC from OFGEM. The Cabinet Office, which has also taken on extra functions and staff this year, did not offer a formal explanation. All other reporting departments had reduced their estate size by between five and 19 per cent. However, these figures should be treated with extreme caution as an indicator of the total government office estate, as some of the big landowning departments – including those for work and pensions, justice and defence – did not provide figures for the most recent quarter.
James Grierson, head of public sector at property consultancy DTZ, says that although there has been “lots of good work in both improving and reducing [the government estate], there is frankly a long way still to go”. If you compare the total size of the government estate with the number of staff and the Government Property Unit’s targets (which set a ratio of 10m2 for each FTE), he says, it’s “not difficult to conclude that the civil estate is twice as big as it needs to be”.
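Grierson’s “twice as big” conclusion rests on a simple ratio: multiply headcount by the Government Property Unit’s 10m2-per-FTE target and compare the result with the actual estate. The sketch below uses hypothetical inputs – neither figure is reported in the QDS – purely to show the calculation.

```python
# Illustration of Grierson's reasoning. Both input figures are
# hypothetical placeholders, not numbers from the QDS.

TARGET_M2_PER_FTE = 10          # Government Property Unit target
fte_count = 450_000             # hypothetical total headcount
actual_estate_m2 = 9_000_000    # hypothetical total office estate

needed_m2 = TARGET_M2_PER_FTE * fte_count
print(f"Estate needed at target ratio: {needed_m2:,} m2")
print(f"Actual estate is {actual_estate_m2 / needed_m2:.1f}x that size")  # 2.0x
```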
He has, however, seen a definite change in departmental attitudes to their estates: “There’s nothing like budget pressure to get departments and agencies to fall into line,” he says.
Considering the 2009-10 numbers (see Figure 3), which are the most complete indicator of cross-government performance, the larger departments appear more efficient – though the correlation is not a strong one. The Department for Work and Pensions, which has to run a large number of small offices around the country, has a huge and rather inefficient estate; HMRC, whose large estate is combined into fewer, bigger offices, pays significantly less per FTE. There are also differences between small departments with largely Whitehall-based estates such as the Cabinet Office and culture department (DCMS), which have to pay for expensive central London properties, and those with larger estates in cheaper parts of the country.
The Treasury stands out as one of the most expensive estates, at a cost of £10,560 per FTE in 2009-10. The department’s Whitehall HQ is “a ridiculous building as an office,” says Grierson, despite big improvements in its usability made in the recent (and very costly) PFI-funded refurbishment. On the other hand, he says, the department has been working hard to improve efficiency – for example, by accommodating staff from the Cabinet Office – and it may be unfair to judge this building by the standards of other spaces. “They could have sold it and turned it into a hotel,” he suggests, but “it’s part of the nation’s heritage.” He compares it to the Palace of Westminster: “Parliament isn’t an efficient place of work, but it has a role that’s about very much more than that.”
Where it is possible to compare costs between 2009-10 and 2010-11, we see that several departments seem to be spending more per FTE while their costs per m2 have remained constant or fallen – suggesting that staffing levels are falling faster than floor space (see Figure 4). It may take longer to reduce estate size than headcount, which means “these ratios go wrong even where [a department is] doing the right thing,” says Grierson. He also advises caution about the cost per FTE indicator. “It’s relatively straightforward for us to work out metrics related to properties,” he says. “The really hard bit for us is to get a handle on FTEs. There are so many assumptions. Do you count contractors? Do you count staff on short-life, externally-funded projects? Do you count staff that are based in three different locations?”
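A toy example makes Grierson’s point concrete: hold estate costs and floor space constant while headcount falls, and cost per FTE rises even though cost per m2 is unchanged. The numbers below are invented for illustration.

```python
# Hypothetical illustration: headcount falls faster than floor space,
# so cost per FTE worsens while cost per m2 stays flat.

annual_cost = 10_000_000    # estate running cost, GBP (held constant)
floor_space = 40_000        # m2 (estates take time to shrink)

for year, ftes in [("2009-10", 4_000), ("2010-11", 3_200)]:  # 20% headcount cut
    print(year,
          f"| cost per m2: £{annual_cost / floor_space:,.0f}",   # £250 both years
          f"| cost per FTE: £{annual_cost / ftes:,.0f}")         # £2,500 -> £3,125
```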
At HMT, costs have risen in 2010-11 both for the FTE and m2 indicators. A Treasury spokesperson said that this can be partly explained by the fact that payments on the department’s PFI contract are rising in line with inflation. Also, she noted, the department has reduced headcount, which will have changed FTE ratios. Staff from the Cabinet Office have begun to move into the Treasury building, improving occupancy and thus efficiency, but this work is still ongoing.
Like Page, Grierson cautions that there is “a danger of over-interpreting this data”. There is “so much change in the pipeline” that the data “may already have been overtaken by events”, he says, warning that we may not see the real results of estates teams’ current work to rationalise buildings until early next year.
He notes, as well, that one cannot simply consider cost and floor space targets when looking at how estates can contribute to the efficiency agenda. “The most important thing is that you get value for money out of the work that is done by civil servants,” he says. “If you screw around with their efficiency because you’re trying to cram them into somewhere that’s unsuitable, you’re winning a battle but losing a war.”
Procurement
To compare the efficiency of departments’ procurement operations, CSW looked at how much they spent on buying decisions and processes: we compared the value of goods and services procured by each department with the costs of its procurement function in both 2009-10 and 2010-11 (see Figure 5). Assuming that the figures are correct, higher numbers here would signal more efficient departments, with the caveat that some departments buy many large-value contracts – roads, for example – which might inflate their figures. However, wide variations in results, including some figures well outside the normal range, led both of the procurement experts among our commentators to question the validity of the data.
Both experts commented on the very poor result for HMRC, which appears to have bought just £14 of goods in 2009-10 for every pound it spent on procurement. As a large department, it would be expected to be a more efficient procurer. Colin Cram, managing director of consultancy Marc1, notes that HMRC is “recognised as one of the best departments” for procurement, and is also working to improve efficiency with the use of ‘continuous improvement’ management techniques, so its reported cost of £93m for procurement seems very high. He believes that HMRC may be using an unusual definition of procurement cost, and says a breakdown of what each department includes in its calculation would be helpful for further analysis.
A spokesperson from HMRC confirmed this theory, explaining that HMRC’s QDS figures cover a wider basket of functions than other departments. “The cost of procurement for HMRC includes all costs charged to procurement cost centres. It includes the centralised budgets that procurement manage on behalf of HMRC such as post, print, couriers, vehicles, envelopes, office machines and stationery. This inflates the figure materially,” she said.
Another obvious outlier is the Department of Energy and Climate Change, which seems to be buying far more with each ‘procurement pound’ than any other department. This anomaly is explained by the fact that the department’s total procurement spend includes its nuclear decommissioning budget – a vast amount of money which is in fact controlled by the Nuclear Decommissioning Authority (NDA) rather than the department itself. If we adjust the figures to remove the NDA’s procurement spend (which was £2.7bn in 2009-10 and £2.8bn in 2010-11), then DECC’s figures reduce to £307 for ‘09-10 and a very poor £34 for ‘10-11, when its procurement spending dropped while its fixed procurement costs remained roughly constant.
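The adjustment itself is easy to express in code. Only the NDA spend figure below comes from the article; DECC’s total spend and procurement-function cost are hypothetical placeholders, chosen to reproduce the quoted £307 ratio, since the underlying QDS inputs are not quoted here.

```python
# Sketch of the DECC adjustment described above. The NDA figure comes
# from the article; the other two inputs are hypothetical, picked to
# reproduce the quoted adjusted ratio of ~£307 for 2009-10.

def spend_per_procurement_pound(total_spend, excluded_spend, function_cost):
    """Value procured per £1 of procurement-function cost, after
    stripping out spend the department does not actually control."""
    return (total_spend - excluded_spend) / function_cost

nda_spend = 2_700_000_000          # GBP, 2009-10, controlled by the NDA
decc_total_spend = 3_007_000_000   # hypothetical
decc_function_cost = 1_000_000     # hypothetical

ratio = spend_per_procurement_pound(decc_total_spend, nda_spend, decc_function_cost)
print(f"£{ratio:.0f} procured per procurement pound")   # £307
```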
Peter Smith, consultant and writer of the Spend Matters blog, believes this example also points to a question about defining procurement which is broader than just cost attribution. The NDA, he suggests, could be seen broadly as a “purely procurement and contract organisation”, and one could argue that all of its costs should be counted as ‘procurement’.
The figures also show significant variation between 2009-10 and 2010-11 in many departments (also see Figure 5). Cram believes that much of this will be due to changing definitions, though Smith notes that at least some of the changes may be due to budget constraints and changing working practices. He suggests, for example, that the DWP’s figures – which show lower procurement costs and therefore improved efficiency – could be reliable, since “we know they have decent data and they’re a good organisation in terms of procurement”. He suggests that in this and several other cases, the drop in costs may reflect “quite a big drop in the use of consultants and contractors” in this financial year.
Cram adds that the changing results suggest a need for greater clarity about what is included in the total procurement spend for each year. Where a department is letting long-duration contracts, such as those for prisons, he asks: “Should you count the procurement spend in the year that you let the contract, or should you spread the procurement spend over the 15 or 25 years?” To give a fair picture of efficiency, it would be important to weight spending figures according to the contract’s timescale.
Both men agree that more work will be needed to align definitions and explain what is being counted in these numbers before the data becomes very valuable. In this regard, there is hope for the armchair auditors. The HMRC spokesperson said: “The Cabinet Office Efficiency and Reform Group has recently produced and consulted on new guidance, and we expect that departments will be able to provide more consistent data once that has been issued.”
Another way to measure procurement efficiency is to compare the price departments pay for common goods or services. The QDS, for example, include information on the price departments pay for energy – a useful indicator since it is such a large part of spending, says Cram (the environment department and Ministry of Defence are most efficient, according to the most recent data). It also covers the cost of a ream of A4 paper, which Cram calls “pointless, as it is such a tiny proportion of procurement spend”.
Information technology
Departments are also asked to provide a figure for the amount they spend on desktop PC provision per FTE (see Figure 6). Again, departments were able to use their own definitions, so often we are not comparing like with like. It’s also important to consider the departmental context: the FCO’s cost of £3,000 per FTE may seem rather high, for example, but – as the department painstakingly points out in the annex to its data – this includes high-spec security provisions and international contracts. Next to this, the Cabinet Office’s 2009-10 cost of £3,665 for desktops mainly located in Whitehall looks very high. However, in 2010-11 the department changed the way it calculated the cost, adopting a method used by most departments, and the figure fell by a dramatic 62 per cent.
Sir Ian Magee, senior fellow at the IFG and co-author of System Error, a recent report into government ICT, says that while “it’s to the government’s credit that they are publishing this data, it’s only a start. Piecing together a story from this is difficult.” Discussing the figures on desktops, he says the large variations between departments suggest the data may not be trustworthy and “therefore any analysis would be tentative.” As with HR and estates, large departments do appear to get better value, though Magee notes that “if these figures are true, the variation in cost is unacceptable.”
On the same day as the QDS were released, government published a set of strategies setting out plans to achieve reform in key areas of ICT provision, including one for ‘end user devices’ – desktops, laptops and so on. Sureyya Cansoy, head of public sector at technology industry association Intellect, notes that the strategy will aim to create a minimum set of standards and definitions which, the Cabinet Office believes, will deliver cost savings. We will soon be able to judge the success of this strategy, she says: “The DWP contract is due to be renewed next year, and this will provide a test-bed for the strategy.”
The QDS also allow us to compare departments’ external spend on ICT services, at least for 2009-10. The available data shows that three of the 17 reporting departments appear to account for over half of external ICT spend in 2009-10, with six of them accounting for 82 per cent. Magee suggests this shows there is a strong potential for savings by improving “efficiency and effectiveness in these departments”, and “this is reflected in the government’s arrangements for governing ICT strategy implementation, with the Delivery Board – the six big spenders and the Cabinet Office – meeting regularly in addition to the CIO Council meetings.”
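The concentration claim is a straightforward cumulative-share calculation across the 17 departments’ external ICT spend. The sketch below uses invented figures, roughly shaped to match the pattern described above (the QDS carries the real 2009-10 numbers), to show how the top-three and top-six shares are derived.

```python
# Cumulative-share sketch for external ICT spend across 17 reporting
# departments. All figures are hypothetical placeholders (GBP millions).

spend = sorted([2_000, 1_500, 1_200, 900, 800, 700,                  # six big spenders
                350, 280, 220, 170, 130, 110, 90, 75, 55, 45, 35],   # the other eleven
               reverse=True)

total = sum(spend)
top3 = sum(spend[:3]) / total
top6 = sum(spend[:6]) / total
print(f"Top 3 of 17 departments: {top3:.0%} of external ICT spend")   # ~54%
print(f"Top 6 of 17 departments: {top6:.0%} of external ICT spend")   # ~82%
```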
The available figures suggest that external ICT spend fell by around 15 per cent in reporting departments between 2009-10 and 2010-11 and this, according to IFG programme director Tom Gash, “reflects our intelligence of significant reductions in spending, partly as a result of contract renegotiations and partly as a result of a reduction in activity”.
He continues: “Collecting and publishing data on ICT spending is a good start. However, ideally we need to see some indicative data on quality of provision as well as cost. Without some data on quality, it’s impossible to tell whether value for money is improving or getting worse. Many suppliers would say that during recent contract negotiations, at least part of the cost reduction they were able to provide was due to agreements to reduce service levels.”
Finance
In all the areas we’ve considered, CSW has had to pick fairly simplistic measures by which to assess departmental efficiency. Clearly, better data and greater transparency over definitions and calculation methods are required in order to paint a picture of the true value for money in many areas – but assessing the finance function proved particularly difficult.
One option was to compare spend on the finance function in each department with its total annual managed expenditure (AME) and departmental expenditure limit (DEL) – but this would be overly simplistic. In a fair assessment, finance function spending should be weighed against the overall role and objectives of the department and its finance team.
Sir John Bourn, who was Comptroller and Auditor General for 20 years, points out several reasons why the figures could vary. Some departments – such as the education department – seem to have relatively small finance costs compared to overall expenditure, but this may be because much of their budget is made up of grants which are then spent by other organisations.
Conversely, the Foreign Office could look rather inefficient since it has (according to the 2010-11 figures provided in the QDS) the sixth-most expensive finance function but a relatively small total expenditure. Bourn notes the department is “spending money throughout the world, in many embassies and high commissions which are very vulnerable to corruption, as many NAO reports have shown”. In this context, a strong finance function would be essential.
The QDS also include three ‘finance indicators’ which could give a clue as to the effectiveness of a department’s finance team – for example, showing how accurate they’ve been in forecasting cash flow over the quarter. However, these indicators would be likely to fluctuate widely, both between quarters and with external changes such as the sudden creation or abolition of departmental agencies, or changes to policies. CSW therefore decided that it would not be possible to compare performance in this area until at least a year’s data is available.
Conclusion
The scepticism with which our set of experts treated much of the information in the QDS raises the question of whether the data as it stands has any use at all, and whether it should have been published in its current state. One consultant who works with government on financial information, but preferred not to be named, says that he understands that senior officials were generally opposed to publishing this data, but Cabinet Office minister Francis Maude was insistent. He was told by one official that the information in the QDS is “not used for board-level decision making” – indicating scepticism within Whitehall about the figures, their reliability and the value of this exercise.
Then there’s the fact that the data is currently so hard to compare. Procurement expert Peter Smith suggests that there are only two reasons why one would publish this data: to allow departments to carry out their own benchmarking exercises; and to allow the public to “hold people to account if it appears that things aren’t being done as well as they should”. Both reasons require comparable data, he suggests, and in its absence the figures risk looking like “a purely political sop to transparency, in that you can say: ‘We publish all this’, but you’re deliberately doing it in a way that makes it meaningless – though I don’t think that’s what is actually happening here”.
So what is happening? The fact that government is working to build common definitions supports Smith’s assertion that there isn’t a deliberate attempt to make this data meaningless; the caveats reflect variations in information management across government, which the Cabinet Office is working to eliminate.
Meanwhile, the QDS also contain other data sets which we chose not to analyse, as they don’t relate directly to efficiency. However, some of these – such as those measuring staff turnover and absence, and spending on major projects – are more readily comparable, and thus more valuable.
At Civil Service Live, a panel of experts concluded that if the transparency agenda is to be effective, the data released must be good quality. But they differed in their response to the current poor quality of much government data. The NAO’s current head, Amyas Morse, advised that government should not be too precipitate in publishing data. He warned that if published data is “not particularly useful and informative, and low-to-rubbish grade”, people will lose faith in the agenda. The Cabinet Office’s director of transparency, Tim Kelsey, argued that there isn’t time for government to proceed with the caution Morse advocates, and that his own experience in launching health information service Dr Foster showed that the fear of inaccurate criticisms “was a big incentive [for managers] to get on with data quality. We need to get the data out there as quickly as possible, particularly so that we can improve it.”
In an interview with CSW last month, Morse said he understands this argument, but it does present challenges. “The risk is that you have a very messy debate, and people ask questions or draw conclusions based on not-very-good information,” he warned. “And that has a cyclic effect: you get drawn into handling that, denying it, and instead of becoming clearer, things become more opaque.”
If these figures cannot yet be used to make confident analyses of departmental efficiency in key areas, the QDS certainly tell us something about the state of management information across government. Close inspection of the annexes reveals which departments haven’t got the processes to collect certain information on a quarterly basis, for example, and identifies those whose definitions are widely out of kilter with the majority. The next few QDS publications will allow us to judge just how much progress is being made by the Cabinet Office and Treasury as they try to improve financial information and produce meaningful comparisons of departmental spending. For those concerned with efficiency, these lessons may be just as interesting as any about particular corporate services, since good management information is the starting point of good decision-making – and therefore, in turn, the basis of any successful efficiency programme.