More Action Is Needed to Improve Recovery Act Data Quality

The Recovery Act may be a great step forward for spending transparency, but it is also exposing how difficult it is to obtain high-quality data from recipient reporting. Two new government reports show that recent revisions and additions to Office of Management and Budget (OMB) rules on recipient reporting are not necessarily "magic bullets" for addressing reporting errors. The reports also make clear that ensuring recipients have a clear understanding of existing guidance is a crucial aspect of any data quality improvement effort.

The first report is from the Government Accountability Office (GAO) and comes in the form of one of its regular Recovery Act oversight reports. The report's main focus is on how a select group of states spent Recovery Act funds, but a significant portion is devoted to recipient report data quality.

In December 2009, only a month before the second recipient reporting cycle began, OMB, which is responsible for writing the rules for the recipient reporting process, published a new Recovery Act guidance document. The new guidance addressed data quality issues and tried to simplify some reporting processes. The GAO report found far fewer recipient reporting errors following the release of this guidance, but many problems remain.

According to GAO, "The second round of reporting appears to have gone more smoothly as recipients have become more familiar with the reporting system and requirements." At the same time, GAO noted, "Data errors, reporting inconsistencies, and decisions by some recipients not to use the new job reporting guidance for this round compromise data quality and the ability to aggregate the data."

The new jobs reporting guidance GAO refers to was published by OMB and says that any work paid for with Recovery Act dollars should be reported. Yet only 56 percent of prime recipients reported paying anyone with Recovery Act funds. Sixteen percent of the prime recipient reports showed jobs created or saved despite having received no funding from the government. The GAO report does note that some of these unusual numbers could be explained by the time lag involved in the reimbursement of government funding.

In trying to ascertain how these recipient errors arose, the GAO found a disturbing trend. Some recipients, according to the GAO, did not use the new formula for reporting full-time equivalents (FTEs), which OMB outlined in the December guidance. Instead, these recipients used the old formula from the first reporting quarter, under which recipients were to identify the FTEs they created or saved, leaving ambiguity about what counted as a "saved" job. The new formula removes that judgment call: recipients simply report the hours worked and paid for with Recovery Act funds, expressed as FTEs. GAO noted that without interviewing every single recipient, there is no way to tell which method each recipient used in its report. However, when it came to education, the largest category of jobs reported, GAO found that a number of states reported job numbers using the old methodology.
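
To make the difference concrete, here is a minimal sketch of the new calculation in Python. It assumes a standard 40-hour week over a 13-week quarter (520 full-time hours); the function name and figures are illustrative rather than drawn from OMB's guidance:

    def quarterly_fte(recovery_act_hours_worked, full_time_hours_in_quarter=520):
        """Hours worked and paid for with Recovery Act funds in a quarter,
        divided by the hours in a full-time schedule for that quarter. No
        judgment about whether a job was "created" or "saved" is required."""
        if full_time_hours_in_quarter <= 0:
            raise ValueError("full-time schedule must be a positive number of hours")
        return recovery_act_hours_worked / full_time_hours_in_quarter

    # Example: 1,300 Recovery Act-funded hours in a quarter -> 2.5 FTEs
    print(quarterly_fte(1300))

Under the old formula, by contrast, a recipient had to decide for itself whether each position was newly created or merely retained, which is exactly where the ambiguity crept in.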

This means that analysts and policymakers will not be able to rely on the numbers from this last quarter because of a lack of consistency in what is being reported. Additionally, the change in defining jobs between the first and second quarters means it will be impossible to compare data over the two quarters or to get cumulative data. GAO remains hopeful that the changes that have been made in the jobs reporting guidance and other reporting system enhancements will "ultimately result in improved data quality and reliability."

The second report further emphasizes that the current reporting problems cannot simply be solved with new guidance. The report came from the Recovery Accountability and Transparency Board (Recovery Board), which is tasked under the Recovery Act with overseeing recipient reporting and Recovery.gov. As part of the Board's one-year assessment, one of its members, the inspector general of the Department of Transportation, led a study on Recovery Act data quality. While the Board's report has substantially fewer specific examples and figures than GAO's voluminous publication, it does efficiently analyze the types of recipient errors. The Board categorized the errors it found into four distinct groups:

  • Recipients misinterpreting OMB and agency guidance
  • Technical challenges
  • Recipients not knowing or having incorrect codes or numbers
  • Human error

Unfortunately, the report does not provide detailed definitions of these error types. For instance, one Department of Justice office the Board contacted found that 31 percent of its data inaccuracies came from incorrect DUNS numbers, the nine-digit company identifiers provided by Dun & Bradstreet. Such inaccuracies could be the result of either human error (e.g., typing mistakes) or recipients having the wrong DUNS numbers on file. Nevertheless, these categories allow the government to begin identifying possible data quality solutions, and they indicate that more guidance from OMB may not prevent future recipient errors.
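
A small validation sketch illustrates both the promise and the limit of checking identifiers like these automatically. The nine-digit DUNS format is real; the function name and the sample values are assumptions for illustration:

    import re

    def is_plausible_duns(duns):
        """Return True if the identifier is nine digits once dashes and
        spaces are stripped. This catches many typing errors, but it cannot
        flag a well-formed number that belongs to the wrong company."""
        digits = re.sub(r"[-\s]", "", duns)
        return len(digits) == 9 and digits.isdigit()

    print(is_plausible_duns("15-048-3782"))  # True: correct format
    print(is_plausible_duns("150483"))       # False: too short, likely a typo

As the comment notes, a format check alone cannot distinguish a typing error from a wrong-but-valid number, which is why the Board's two possible explanations cannot be separated after the fact.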

The Recovery Board indicates that the problem is not a dearth of OMB guidance but rather recipients' difficulty understanding what exactly the existing guidance requires. To reduce the rate of errors falling under the first three categories, the Recovery Board recommends that OMB and the agencies increase communication with recipients. Addressing errors in the fourth category, human error, requires tighter coordination among the Recovery Board, OMB, and the agencies: the agencies can catch project-specific errors soon after recipients report them, while the Recovery Board can work to keep these errors from being entered in the first place.

Indeed, the Board is already implementing solutions that do just that, with so-called "hard checks" that prevent recipients from submitting clearly erroneous information. For example, the system compares the "Amount Received" field against the "Award Amount" field and blocks recipients from filing a report in which the numbers do not add up. By using these kinds of checks, the Board completely eliminated the "phantom" congressional district problem from the first cycle. Similarly, the GAO found that while 133 records in the first quarter reported receiving more than the award amount, none in the second quarter did so.
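
A hard check is, in essence, blocking validation at submission time: the report is rejected until the offending fields are corrected. The sketch below is a minimal illustration, not the Recovery Board's actual implementation; the field names and the partial district table are assumptions:

    # Hypothetical district counts for a few states (2000s apportionment);
    # a real check would use a complete lookup table.
    DISTRICTS = {"CA": 53, "TX": 32, "OH": 18}

    def hard_check(report):
        """Return a list of blocking errors; a report that produces any
        errors cannot be filed until they are corrected."""
        errors = []
        if report["amount_received"] > report["award_amount"]:
            errors.append("'Amount Received' exceeds 'Award Amount'")
        state, district = report["state"], report["district"]
        if not 1 <= district <= DISTRICTS.get(state, 0):
            errors.append("district %s-%02d does not exist" % (state, district))
        return errors

    # A "phantom district" report from the first cycle would now be rejected:
    print(hard_check({"amount_received": 5000, "award_amount": 10000,
                      "state": "OH", "district": 26}))

The design point is that such checks run before the data ever enter the system, so clearly impossible values never have to be found and corrected downstream.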

The nature of the problems found by both the GAO and the Recovery Board indicates that more recipient guidance won't necessarily improve data quality. What does seem to help, as the success of the Recovery Board's hard checks shows, is increased attention from federal agencies and the implementation of automated data validation. Whether through external efforts, such as agencies helping recipients understand the existing reporting rules, or internal efforts, such as hard checks, more action is needed to improve recipient report data quality.
