The Data Quality Act passed through Congress as Sec. 515 of the Treasury and General Government Appropriations Act for Fiscal Year 2001 (Public Law 106-554; H.R. 5658). The guidelines, implemented in 2002, could be misused to delay, manipulate, and unfairly affect the outcome of federal agencies' activities.

The Data Quality Act

OMB's guidelines were required by an appropriations rider, sometimes referred to as the "Data Quality Act," contained in Sec. 515 of the Treasury and General Government Appropriations Act for Fiscal Year 2001 (Public Law 106-554; H.R. 5658). There were no hearings on the rider -- which was added at the last minute by Rep. Jo Ann Emerson (R-MO), who serves on the Appropriations Committee -- and no debate on the floor, leaving little besides the legislative language from which to judge congressional intent. Specifically, the law directed OMB to issue, by Sept. 30, 2001, "policy and procedural guidance to Federal agencies" subject to the Paperwork Reduction Act (44 U.S.C. chapter 35) requiring that they:
  • Issue their own data quality guidelines, within one year of OMB's implementing guidelines, "ensuring and maximizing the quality, objectivity, utility, and integrity of information (including statistical information) disseminated";
  • Establish "administrative mechanisms allowing affected persons to seek and obtain correction of information maintained and disseminated by the agency that does not comply with the guidelines"; and
  • Report periodically to OMB, once the guidelines are put into practice, detailing "the number and nature" of data quality complaints received by the agency, as well as "how such complaints were handled."
This rider builds on an industry lobbying effort to put roadblocks in the regulatory process. As noted by the Center for Regulatory Effectiveness (CRE), a strong advocate for the rider, similar report language was added to the FY 99 Omnibus Appropriations Act (PL 105-277), also at the last minute and without any debate. The report language (House Report 105-592, at 49-50) directed OMB to develop "rules providing policy and procedural guidance to Federal agencies for ensuring and maximizing the quality, objectivity, utility, and integrity of information (including statistical information) disseminated by Federal agencies, and information disseminated by non-Federal entities with financial support from the Federal government." The report language, incorporated by reference in the conference report, also called for administrative mechanisms for error correction to be established at each agency. CRE and other pro-industry representatives, frustrated that OMB never issued guidelines based on the report language, persuaded Emerson to put it into law. Unlike the report language, however, the Emerson rider does not specifically apply to "non-Federal entities with financial support" from the government.

Some have expressed concern that the Data Quality Act (DQA) also builds on another appropriations rider, offered on the FY 99 Omnibus Appropriations Act by Sen. Richard Shelby (R-AL). The Shelby amendment directed OMB to revise OMB Circular A-110 -- a circular dealing with grants to nonprofits -- to allow public access to federally funded research data through Freedom of Information Act (FOIA) requests. The rider applies only to research done by federal grantees, not contractors, and like the DQA was adopted without any public debate or scrutiny. Some speculated that industry would request underlying data from universities, for instance, potentially stifling ongoing research that might lead to regulation by federal agencies.
OMB's Implementing Guidelines

OMB published final implementing guidelines in the Federal Register on Sept. 28, 2001. In doing so, OMB requested additional comments on the "reproducibility" standard and the related definition of "influential" information (discussed below), which were issued on an interim final basis. Final guidance on these issues was published in the Federal Register on Jan. 3, 2002, corrected on Feb. 5, 2002, and republished on Feb. 22, 2002 (Vol. 67, No. 36, pg. 8452). Specifically, in developing their own guidelines, which are to be finalized by Oct. 1, 2002, agencies are to:
  • Adopt specific standards for data quality, consistent with OMB's definitions of "objectivity, utility, and integrity" -- which, given the expansive definition of "objectivity," is enormously consequential, as discussed further below.
  • Develop a process for reviewing information for quality before it is disseminated (applicable to information disseminated on or after Oct. 1, 2002).
  • Establish administrative mechanisms to allow for challenges from "affected persons" (which OMB leaves to the agencies to define) to seek and obtain "timely correction of information maintained and disseminated by the agency that does not comply with OMB or agency guidelines." According to OMB, this applies to any information disseminated by the agency on or after Oct. 1, 2002, regardless of when it was first disseminated. However, in their draft guidelines, several agencies have placed time limits on data quality challenges; the State Dept., for instance, provides that a challenge must be brought within 60 days after the information is first disseminated. Agencies must respond to a challenge "in a manner appropriate to the nature and extent of the complaint," including through personal contacts, form letters, press releases or mass mailings that "correct a widely disseminated error or address a frequently raised complaint."
  • Implement an appeals process, which the statute does not speak to, allowing those who disagree with an agency's verdict on a data-quality challenge to "file for reconsideration within the agency."
  • Submit draft guidelines to OMB for review no later than July 1, 2002. These guidelines, which OMB will review for consistency with its own implementing guidelines, are to be presented in a report that explains how the agency will achieve OMB's data-quality objectives. Once the guidelines are put into practice, agencies are to report to OMB annually, beginning Jan. 1, 2004, on the "number and nature of complaints" and "how such complaints were resolved."
OMB notes that its guidelines are intended to allow agencies to incorporate their existing practices in a "common-sense and workable manner," rather than "create new and potentially duplicative or contradictory processes." For example, OMB acknowledges that under OMB Circular A-130, agencies already address data quality issues. At the same time, OMB prescribes a number of requirements that go beyond the statute, instructing that "agencies should not disseminate substantive information that does not meet a basic level of quality."

There is significant debate on whether the new law and OMB's guidelines create any new judicially reviewable process. Industry lobbyists suggest that the administrative mechanisms for error correction, including the appeals process, establish new legally reviewable responsibilities. In their draft guidelines, several agencies argue the opposite. OMB has taken no position on this.

Through all of the above requirements, the definition of "quality" information is crucial. As stated above, the statute directs guidelines that ensure "quality, objectivity, utility, and integrity" of information disseminated to the public. OMB treats "quality" as "an encompassing term comprising utility, objectivity, and integrity" and provides definitions for each of these constituent terms. The definitions for "utility" and "integrity" appear relatively benign. "Utility" refers to information's usefulness to the public. "Integrity" refers to the "protection of the information from unauthorized access or revision." (Because OMB's guidelines encourage flexibility, agencies may have their own unique, and perhaps more specific, definitions for these terms.) The expansive definition of "objectivity" is where things get complicated, as OMB packs in a number of controversial requirements. OMB instructs that "objectivity" contains two elements.
The first involves presentation -- whether information "is being presented in an accurate, clear, complete, and unbiased manner. This involves whether the information is presented within a proper context." (emphasis added) How this is woven into administrative mechanisms for error correction remains unclear and will likely be a contentious issue for some agencies. The second element involves substance -- whether it is "accurate, reliable, and unbiased information" and uses "sound statistical and research methods."

For dissemination of information that is "influential," OMB calls for even higher standards of data quality. Specifically, "influential" scientific, financial, or statistical information must be "reproducible." "Influential" means that the agency can "reasonably determine that dissemination of the information will have or does have a clear and substantial impact on important public policies or important private sector decisions." Such information must be reproducible by "qualified third parties," meaning the same result would be achieved following reanalysis. Accordingly, agencies must offer a "high degree of transparency about data and methods" to ensure reproducibility, which is applied differently for three types of "influential" information:
  • For "original and supporting data," agencies are to consult with "relevant scientific and technical communities" to determine which data is subject to the reproducibility requirement. However, reproducibility in this context means that there is a high level of transparency about research design and methods, negating the need to replicate work before dissemination.
  • For "analytic results," there must be "sufficient transparency about data and methods that an independent reanalysis could be undertaken." OMB adds that this means that "independent analysis of the original or supporting data using identical methods would generate similar analytic results, subject to an acceptable degree of imprecision or error." However, the transparency necessary to achieve this is not meant to "override other compelling interests such as privacy, trade secrets, intellectual property, and other confidentiality protections." In such cases where the public does not have access to data and methods, "agencies shall apply especially rigorous robustness checks to analytic results and document what checks were undertaken."
  • For analysis of "risks to human health, safety and the environment maintained or disseminated" by agencies (emphasis added), agencies must "adopt or adapt" the principles related to risk analysis in the Safe Drinking Water Act. More on this below.
Peer review may be necessary to demonstrate objectivity. According to OMB, "'objectivity' involves a focus on ensuring accurate, reliable, and unbiased information," which can be achieved "using sound statistical and research methods." Independent, external peer-reviewed information "may generally be presumed to be of acceptable objectivity." However, OMB adds a caveat: peer review may not be adequate to demonstrate "objectivity" if a "persuasive" showing is made to the contrary. OMB's guidelines do not direct peer review panels to be balanced in terms of viewpoints. Rather, as recommended by OMB on Sept. 20, 2001, peer reviewers are to be selected "primarily on the basis of necessary technical expertise," and any financial conflicts of interest or related policy positions taken prior to peer review must be disclosed to the agency, but not necessarily the public.

Agencies must "adopt or adapt" requirements for risk assessment under the Safe Drinking Water Act Amendments of 1996 (42 U.S.C. 300g-1(b)(3)(A) & (B)). "With regard to analysis of risks to human health, safety, and the environment maintained or disseminated by the agencies," agencies are to meet principles laid out in the Safe Drinking Water Act (SDWA), which are perhaps the most rigorous standards for risk assessment written into statute. Previously, John Graham, administrator of OMB's Office of Information and Regulatory Affairs (OIRA), had issued an agency-wide memo on regulatory analysis that also pressed SDWA principles for risk assessment, saying that agency proposals employing these methods would be viewed more favorably by OIRA -- which must grant clearance to all health, safety, and environmental protections before they can take effect. Graham, who was in charge of formulating OMB's data quality guidelines, seized the opportunity to achieve formal adoption of these risk assessment principles across agencies.
The SDWA places particular emphasis on "peer-reviewed science and supporting studies" and asks for very detailed information about the risk being examined. For instance, the agency is to identify "each population" affected, the "expected risk" for each of these populations, and "each significant uncertainty" that emerges in the risk assessment. Graham has said such rigor, specifically the practice of agency peer review, should satisfy the "objectivity" requirement of the guidelines.

The U.S. Chamber of Commerce and other industry representatives argue that all information related to a rulemaking, regardless of whether it was generated by a third party, should be presumed to be "influential" and subject to the robust requirements suggested by OMB. Accordingly, industry would like agencies to label information as "influential" or "non-influential" and to define "influential" very broadly. Industry does not want agencies to define "influential" as equivalent to an "economically significant" rule under Executive Order 12866, which they consider too narrow.

Background and Data Quality History
  • "Scientific Integrity in Policymaking: An Investigation into the Bush Administration's Misuse of Science" (02/2004)
  • "Information Quality and the Law, or, How to Catch a Difficult Horse" (11/2003)
  • "Agencies Finalize Data Quality Guidelines" (11/06/02)
  • "OMB Speaks on Data Quality, Again" (09/16/02)
  • "New Work on Data Quality" (05/28/02)
  • "Agencies 'Adapt' Data Quality Guidelines" (05/15/02)
  • "Industry Targets EPA Data Quality" (04/17/02)
  • "Data Quality Approaches" (04/15/02)