
1 The Science/Policy Interface in Logic‑based Evaluation of Forest Ecosystem Sustainability
Keith M. Reynolds, USDA Forest Service K. Norman Johnson, Oregon State University Sean N. Gordon, Oregon State University

2 Acknowledgments
USDA Forest Service, Washington Office: National Forest System, Ecosystem Management
USDA Forest Service, Pacific Northwest Research Station: Human and Natural Resource Interactions RD&A Program

3 Objectives Illustrate the value of a logic-based approach in designing a formal specification to evaluate the Montreal criteria and indicators. Identify the roles of science and policy in this effort. Highlight lessons learned from this process. Suggest some general recommendations.

4 Overview
Introduction
Knowledge bases and logic modeling
Analysis
Model design issues
Lessons learned
Recommendations

5 Introduction: criterion
A standard that a thing is judged by (Prabhu et al. 2001). Criteria are the intermediate points to which the information provided by the indicators can be integrated and where an interpretable assessment crystallizes. Principles [e.g., sustainability] form the final point of integration. A criterion should be treated as a reflection of knowledge. It can be viewed as a large‑scale selective combination … of related pieces of information.

6 Introduction: indicators
An indicator is any variable or component of the forest ecosystem … used to infer attributes of the sustainability of the resource and its utilization. Indicators should convey a ‘single meaningful message.’ This ‘single message’ is termed information. It represents an aggregate of one or more data elements with certain established relationships (Prabhu et al. 2001).

7 Introduction: measurement endpoint
Some indicators are simple. Their definition suggests an obvious one‑to‑one correspondence between an indicator and a metric for that indicator. Definitions of some indicators are more complex. They represent a synthesis of two or more data elements, which we refer to as measurement endpoints.

8 Introduction: scales of application
Purpose varies with scale (Castañeda 2001).
National and regional: policy instruments to evaluate laws, policy, and regulations (e.g., Montreal Process, NWFP, ICBEMP).
Management unit: evaluation and adjustment of management practices (e.g., CIFOR, USDA FS IMI [LUCID]).

9 Knowledge bases
A form of meta-database: a formal logical representation of how to evaluate information; networks of interrelated topics that serve as a mental map.
Advantages: interactive, graphic design (modularity); numerous and diverse topics can be analyzed within a single integrated analysis.

10 Knowledge bases: forms of uncertainty
Probabilistic uncertainty: uncertainty about events.
Linguistic uncertainty: uncertainty about the definition of events; vagueness or imprecision.
A proposition is the smallest unit of thought to which one can assign a measure of truth (strength of evidence).
Strength of evidence (SE): a measure that quantifies the degree of support for a proposition provided by its premises.

11 Knowledge bases: networks of topics
[Diagram: a network of topics in which concerns (Concern 1, Concern 2) are evaluated from ecostates (A, B, C, D, etc.), each connected to underlying data through data links.]

12 Knowledge bases: topics
Each topic typically evaluates a proposition.
Attributes of topics:
Name (e.g., watershed condition)
Proposition (e.g., watershed condition is suitable)
Strength of evidence for the proposition (the NetWeaver scale is [-1, 1])
A formal logic specification (e.g., a proof)
Documentation: explanation, sources, citations, assumptions
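As an illustration only (not NetWeaver code), the attributes listed above can be captured in a small data structure in which each topic carries a name, its proposition, documentation, and an evaluation function that returns strength of evidence on the [-1, 1] scale. The field names and the example topic below are hypothetical.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Topic:
    """Hypothetical stand-in for a NetWeaver topic (illustrative only)."""
    name: str                          # e.g., "watershed condition"
    proposition: str                   # e.g., "watershed condition is suitable"
    evaluate: Callable[[dict], float]  # returns strength of evidence in [-1, 1]
    documentation: str = ""            # explanation, sources, citations, assumptions

# Example: a topic whose evidence is supplied directly as a value in [-1, 1]
ws_condition = Topic(
    name="watershed condition",
    proposition="Watershed condition is suitable",
    evaluate=lambda data: max(-1.0, min(1.0, data.get("ws_condition_se", 0.0))),
    documentation="Strength of evidence read directly from an upstream analysis.",
)

print(ws_condition.evaluate({"ws_condition_se": 0.6}))  # 0.6
```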

13 Knowledge bases: evaluation
[Diagram: evaluation of Concern 1 — data requirements are retrieved through the data links for Ecostates A, B, and C, and the data are then evaluated.]

14 Knowledge bases: strength of evidence

15 Analysis: Montreal C&I
The Montreal specifications provide relatively clear definitions of biophysical, socioeconomic, and framework attributes requiring evaluation (WGCICSMTBF 1995) ... But, design of evaluation procedures that allow interpretation of the Montreal C&I is one of the major technical issues that remain to be resolved (Raison et al. 2001).

16 Analysis: conceptual framework
1. Specified conditions or outcomes to be sustained (the indicators).
2. A measure for each condition or outcome.
3. Calculation of the level of the indicator over some time period, using the selected measure.
4. A frame of reference for gauging sustainability.
5. Methods for evaluating sustainability (the sustainability check).
6. A monitoring program.
7. A formalism that supports requirements 1 to 6.

17 Analysis: logic models as design frameworks
Logic models (knowledge bases) provide a formal specification for organizing and interpreting information.
NetWeaver knowledge base developer system: the problem is represented in terms of propositions about topics of interest and their interdependencies.
Topics are translated into propositions (lexical uncertainty).

18 Analysis: logic models as design frameworks (continued)
Need for transparency (Prabhu et al. 2001): models embody important policy decisions, and they depend on value judgments and critical assumptions that need clear documentation.
Model development: the graphic representation is an effective basis for organizing discussion and for evolution of the design.
Communication: between scientists and policy makers, and with interested publics.

19 Design issues: model organization
Basic organization of topics. For example, evaluation of criteria in the current prototype.

20 Design issues: model organization
An alternative organization with very different emphasis on criteria.

21 Design issues: synthesis
AND operator: arguments evaluated as limiting factors. SUM operator: arguments contribute incrementally to evaluation and can compensate.
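A rough sketch of the behavioral difference follows, using a minimum for AND and a weighted average for SUM. NetWeaver's actual operator definitions differ in detail, so treat this only as an illustration of limiting-factor versus compensatory synthesis.

```python
def fuzzy_and(evidence):
    """Limiting-factor behavior: the weakest argument dominates.
    Approximated here by the minimum; NetWeaver's operator differs in detail."""
    return min(evidence)

def fuzzy_sum(evidence, weights=None):
    """Compensatory behavior: arguments contribute incrementally,
    approximated here by a (weighted) average of strengths of evidence."""
    weights = weights or [1.0] * len(evidence)
    return sum(w * e for w, e in zip(weights, evidence)) / sum(weights)

args = [0.9, 0.8, -0.4]    # strengths of evidence in [-1, 1]
print(fuzzy_and(args))     # -0.4  (limited by the weakest argument)
print(fuzzy_sum(args))     # ~0.43 (strong arguments compensate for the weak one)
```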

22 Design issues: synthesis
Another example, including the OR operator.

23 Design issues: weighting
Intrinsic weights: each topic in a NetWeaver logic model has an intrinsic weight attribute; setting it on any topic adjusts that topic's contribution of evidence to a proposition. A bad idea: the weight becomes part of the specification, but it is not obvious to readers of the model.
Explicit weights: better, but they add another layer of subjectivity. They do serve some valid purposes, however.
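To illustrate the transparency point, an explicit weighting scheme can at least be written down and reviewed alongside the evidence it modifies. The topic names and weight values below are invented for illustration only.

```python
# Hypothetical illustration of explicit weighting of topics' evidence.
# Listing the weights in one place makes the underlying value judgment
# visible and reviewable, unlike weights buried inside the specification.
EXPLICIT_WEIGHTS = {
    "biodiversity": 2.0,      # assumed policy emphasis (illustrative values)
    "soil_and_water": 1.0,
    "socioeconomic": 1.0,
}

evidence = {"biodiversity": 0.2, "soil_and_water": 0.7, "socioeconomic": -0.1}

weighted = sum(EXPLICIT_WEIGHTS[k] * evidence[k] for k in evidence)
total_w = sum(EXPLICIT_WEIGHTS[k] for k in evidence)
print(weighted / total_w)  # weighted strength of evidence, still in [-1, 1]
```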

24 Design issues: reference conditions
Each elementary network evaluates a measurement endpoint against reference conditions. Lack of reference conditions is a basic problem for most measurement endpoints.

25 Design issues: reference conditions
Implementation of an elementary network to evaluate a measurement endpoint against reference conditions.
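One simple way to implement such an evaluation is a linear ramp between two reference values marking fully unsuitable and fully suitable conditions. This sketch assumes that form (the prototype's actual fuzzy membership curves may differ), and the reference values shown are hypothetical.

```python
def ramp_evidence(observed, full_false, full_true):
    """Map a measurement endpoint onto strength of evidence in [-1, 1]:
    -1 at or below full_false, +1 at or above full_true, linear in between.
    (Swap the two reference values for 'less is better' endpoints.)"""
    if full_true == full_false:
        raise ValueError("Reference values must differ")
    t = (observed - full_false) / (full_true - full_false)
    t = max(0.0, min(1.0, t))
    return 2.0 * t - 1.0

# Hypothetical reference conditions: 0% late-successional forest is fully
# unsuitable evidence, 40% or more is fully suitable.
print(ramp_evidence(10.0, full_false=0.0, full_true=40.0))  # -0.5
print(ramp_evidence(55.0, full_false=0.0, full_true=40.0))  #  1.0
```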

26 Design issues: qualitative measures
Outcomes evaluated on an ordinal scale.
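One minimal way to handle ordinal outcomes is to assign each category a fixed strength of evidence. The category labels and values below are assumptions for illustration, not those used in the prototype.

```python
# Illustrative mapping from ordinal ratings to strength of evidence in [-1, 1].
ORDINAL_EVIDENCE = {
    "poor": -1.0,
    "fair": -0.33,
    "good": 0.33,
    "excellent": 1.0,
}

print(ORDINAL_EVIDENCE["fair"])  # -0.33
```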

27 Design issues: reliability of data
Reliability of data for evaluation of the Montreal C&I involves stochastic, rather than lexical, uncertainty.
Formal representation of stochastic uncertainty is problematic in the context of a logic model, and it is not addressed in the current Montreal C&I prototype.
A possible solution: adjust topic weights with a normalized metric such as the standard error of the mean.
Problems: data availability and unknown error correlations.
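The possible solution noted above might be sketched as a reliability discount on a topic's weight; the scaling rule below is an assumption chosen only to illustrate the idea.

```python
def reliability_weight(base_weight, standard_error, scale=1.0):
    """Hypothetical discounting rule: shrink a topic's weight as the
    normalized standard error of its underlying data grows."""
    return base_weight / (1.0 + standard_error / scale)

print(reliability_weight(1.0, standard_error=0.0))  # 1.0   (fully reliable data)
print(reliability_weight(1.0, standard_error=0.5))  # ~0.67 (discounted weight)
```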

28 Design issues: precision of knowledge
Sequential OR (SOR) to specify multiple alternative pathways in order of preference.
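A sequential OR can be read as "use the most preferred pathway for which data are actually available." The sketch below assumes each pathway either returns a strength of evidence or signals missing data with None; the pathway names are hypothetical, and this is not the NetWeaver implementation.

```python
def sequential_or(pathways, data):
    """Evaluate alternative pathways in order of preference and return the
    first one that can be computed from the available data (a sketch of the
    SOR idea only)."""
    for evaluate in pathways:
        result = evaluate(data)
        if result is not None:   # None signals missing data for this pathway
            return result
    return None                  # no pathway could be evaluated

# Hypothetical pathways, ordered from most to least preferred data source.
def from_field_plots(data):
    return data.get("plot_based_se")

def from_remote_sensing(data):
    return data.get("satellite_based_se")

print(sequential_or([from_field_plots, from_remote_sensing],
                    {"satellite_based_se": 0.4}))  # 0.4 (falls back to 2nd pathway)
```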

29 Lessons learned Lexical uncertainty is an important issue in evaluation of Montreal criteria and indicators. Many aspects of evaluating sustainability cannot be answered by science alone. Acquiring data on sustainability is necessary but not sufficient for setting policy. Evaluating sustainability is not the same as defining desired future conditions. Evaluating the state of sustainability and deciding how to respond are separate but interdependent decision processes.

30 Recommendations Assess the policy role in sustainability evaluation, and undertake a policy review of model organization and strategies for integrating sustainability information. The clearest, and most critical, role of science is in development of reference conditions. A major effort is needed to identify measurement endpoints for indicators of the institutional framework (criterion 7).

31 Followup Scientist peer review (Mar 14/21, 2002)
John Gordon (forester, Yale) Jerry Franklin (ecologist, U. of Washington) Hal Salwasser (Dean, CoF, Oregon State University) Denise Lach (sociologist, Oregon State University) Gordie Reeves (fisheries biologist, PNW) Richard Haynes (economist, PNW)

32 Followup Management peer review (Mar 14/21, 2002)
Darrel Kenops (supervisor, Willamette NF) Gloria Brown (supervisor, Siuslaw NF) Ted Lorensen (manager, Oregon DoF) Jon Martin (REO monitoring, Region 6) Dick Phillips (economist, Region 6) Sarah Crim (analyst, Region 6)

33 Followup Demo with criterion 6 data from 2003 report
Richard Haynes (PNW) Ken Skog (FPL) Sue Alexander (PNW) Others? Demo with LUCID link? Possibly July/August 2002

34 Authors
Keith M. Reynolds, USDA Forest Service, Pacific Northwest Research Station
K. Norman Johnson, Oregon State University, College of Forestry
Sean N. Gordon, Oregon State University

