
Forest Service Challenged on Categorical Exclusions for Small Timber Harvesting
by Guest Blogger, 7/18/2003
In March 2003, the John Muir Project of Earth Island Institute, the Sierra Club, and Heartwood filed a data quality challenge with the U.S. Department of Agriculture (USDA). The groups sought a correction of information related to the January 8, 2003 Federal Register notice, "National Environmental Policy Act Documentation Needed for Limited Timber Harvest." The Forest Service used information from monitoring timber sales to develop new criteria for limited tree removal Categorical Exclusions (CEs) under the National Environmental Policy Act (NEPA).
Background Information
Under NEPA, federal agencies can issue CEs for small-scale activities. The CEs exempt those actions from the requirement to prepare either an environmental impact statement or an environmental assessment. This limits the administrative burden for activities that have minimal or no environmental impact, such as maintenance activities or rulemakings that establish administrative procedures. The challenged Forest Service proposal would add three categorical exclusions to the agency's regulations, applicable to small timber harvesting projects. The Forest Service's Federal Register notice claims that the timber harvest projects would not have significant effects on the human environment. The data quality petition challenges the data used to develop the new criteria for the CEs.
In order to develop the new criteria referenced in the Federal Register notice, the Forest Service developed protocols for monitoring forest resources. Data from the monitoring of 154 projects influenced the agency's development of the tree removal CEs. Apparently, the agency used the technique of observation for 85 percent of the data points monitored.
Request for Correction
Pointing to the Forest Service's use of observation, the petitioners assert that this technique "is considered the least reliable monitoring technique by the science community" and that it is not replicable. For other measurements, such as water quality, they argue that the agency did not take baseline measurements before or after the projects, leaving it unable to determine whether quality had degraded.
The petitioners argue that the information supporting the Federal Register notice violates the objectivity standard of USDA's information quality guidelines, which apply to all environmental assessments, impact statements, and other documents prepared under NEPA. The challenge states several reasons for noncompliance. First, data on the kinds of observation or measurement techniques used to monitor projects, data specifically referenced in the proposed rule, were not publicly available. Second, the technique of observation is not objective because it can be neither independently validated nor duplicated.
The challenge asserts that because the information and monitoring techniques are used to create a new Categorical Exclusion for logging, they must be considered "influential" information. The petitioners claim that the proposed rule violates the guidelines' objectivity standard for influential regulatory information because the data it relies on do not reflect the best science and were not collected by accepted or best available methods.
The petition lists a number of effects resulting from the use of this challenged data. The challengers cannot assess the significance of the CEs and therefore cannot provide accurate comments or advise their constituents. Also, if the proposed rule is finalized, the petitioners believe their ability to petition the government for redress of grievances would be harmed because the projects would be excluded from administrative appeal. Lastly, the petition implies that the environment would be unduly harmed if the challenged information were used to implement the CEs as currently planned.
To correct the data, the petitioners recommend that the Forest Service use "measurement" rather than "observation" as the monitoring technique for all data, making the data more easily replicable. They also ask that the specific measurement techniques and the entire data set used in the proposed rule be made public. Once the information is released, the petitioners believe the agency should re-evaluate its conclusions based on the data and restart the rulemaking process.
Agency Response
The USDA Forest Service responded in a July 29, 2003 letter, denying the request for correction. The agency believes that the observation techniques constitute expert opinion, which is allowable under the USDA Data Quality Guidelines. Furthermore, the letter states that the resource specialists who collected the data are highly trained and that their judgment is exercised within the context of protective laws. The Forest Service uses these factors to justify its data, rejecting the petitioners' request and allowing the rulemaking to continue.
The agency initially stated in a May 22, 2003 letter of acknowledgement that the data quality challenge would be considered as part of the comment process for the rulemaking. However, the agency's response indicates that it is examining data quality petitions outside the comment process, contradicting its earlier statement that the comment process within a rulemaking is an appropriate forum for data quality challenges.
Appeal
The John Muir Project, Sierra Club, and Heartwood submitted a request for reconsideration on September 10, 2003, again citing the improper use of observation as a technique and the lack of transparency in the supporting information. The specific claims that the petitioners outline are:
- The initial agency review of the original request was not conducted with due diligence.
- Sound analytical methods were not used in analyses.
- Reliable data and information were not used.
- The reliance on observation fails to ensure transparency of analysis.
- The agency fails to explain rationale for its choices in data.
- No model or logical analysis is presented to support recommendations.
- Sources of uncertainty are not identified.
- Proper methods were not followed for "influential" information.
- The data fails the test of "objectivity."
- Statistically flawed data were included without explanation.
