Validity and utility of theoretical tools: does the systematic review process from clinical medicine have a use in conservation?
Ioan Fazey & David Lindenmayer
Centre for Resource and Environmental Studies, The Australian National University
Overview:
PART 1: Theoretical tools in conservation biology
– Theory in conservation research
– Validity and utility of theory: what are we trying to achieve?
PART 2: Can we use systematic review methods from clinical medicine to clarify the biological limitations of theoretical tools?
Aim: to stimulate discussion...
PART 1: Theory and theoretical tools in conservation biology
Theory in ecology: how the world works
Theory in conservation biology: how the world works when we mess around with it
Ecology and conservation are related, but theory in ecology has not necessarily had conservation in mind
We often resort to theoretical tools or concepts...
[Figure: food web diagram]
Examples of theoretical tools:
– Indicators/surrogates
– 30% rule for habitat thresholds
– Keystone species
– Umbrella species
– Focal species
– etc...
All theoretical tools are simplified versions of the real world. We use them to build mental models of complexity and to help us make decisions or solve problems.
How much theory is in the mainstream conservation literature?
Survey of publications in conservation biology:
– All papers published in 2001 in the 3 highest-impact conservation journals (Conservation Biology, Biological Conservation, Biodiversity and Conservation) were read (n = 547)
– Papers were classified as addressing theory if they explicitly commented on, reviewed, tested or proposed theory and concepts
Results: 14.1% (n = 77) of all publications explicitly addressed some aspect of theory
Results:
– Of the theoretical papers, 52% assessed empirical evidence (i.e. 7.3% of all papers surveyed)
– Indicators and surrogates: 93% of papers assessed evidence
Were theories fully or partially supported by evidence?
– Theoretical papers: 70%
– Indicators/surrogates: 57%
Main points:
– Most theory is implicitly assumed
– The types of theories tested are the more tractable ones (e.g. edge effects vs. fragmentation theory)
What about other theoretical tools?
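(As a quick check of how the 7.3% figure follows from the numbers above: 52% of the 77 theoretical papers is roughly 40 papers, and 40 / 547 ≈ 7.3% of all papers surveyed.)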
The term “validity” is not appropriate for assessing theoretical tools...
[Diagram: theory as a simplified version of reality]
– “Validity”, from verus (truth): “to establish the truth, accuracy, reality of…”
– This implies a theory is to be supported by objective truth
– But theory is a simplified version of reality: if we ‘test’ a theoretical tool, there will always be something wrong with it!
– If we ‘test’ and falsify it, are we rejecting a useful tool?
E.g. metapopulation biology: successful because the principle is simple
– Helps practitioners explain the need to conserve more than one population
– Helps visualise landscape processes
– Fits well with the human way of looking at the world
But many species don’t fit the paradigm…
Need to be clear about what a model is for
Is “usefulness” a better way of assessing theoretical tools?
Does a tool reliably:
– Explain?
– Describe?
– Anticipate?
– Facilitate design?
Theoretical tools will never cater for all researchers and practitioners at the same time
We therefore need to be explicit about a model’s limitations
It is often difficult to be candid about limitations when presenting a new theoretical construct
Key points:
– Most theory in the literature is implicitly assumed
– Because all theories are simplified representations of the real world, trying to “validate a theory” is not appropriate
– It may be better to ask whether a theory is useful
– Usefulness can mean many things: we need to state clearly for whom and for what the model is intended
– We need to be aware that theories have limitations and will never cater for all circumstances
PART 2: Can systematic reviews help clarify the biological limitations of theoretical tools?
Systematic reviews in clinical medicine:
– Evidence for medical interventions is reviewed in a systematic way
– Presented in accessible and understandable formats
– Widely disseminated, e.g. through an international collaborative organisation
Evidence-based medicine: to ensure that the best available evidence is used when treating patients
Reviewing evidence:
Systematic review: “…a review of a clearly formulated question that uses systematic and explicit methods to identify, select and critically appraise relevant research, and to collect and analyse data from the studies that are included in the review” (Sackett et al. 1997)
3 ‘systematic’ components:
1) Systematically searching for studies
2) Using specific criteria for deciding whether to include a particular study
3) A defined process for appraising the chosen studies
Types of evidence in conservation and medicine:
– Difficulty in obtaining replicates
– Multiple species/communities
– Difficulty defining measurable outcomes
[Figure: relative proportion of evidence at different levels (observation, natural experiments, experiments, systematic reviews) in medicine vs. conservation]
– Experimental evidence is easier to review
– The majority of evidence for management in conservation is not experimental
Methods are now being used to rank different types of evidence (e.g. combining observational and experiential evidence)
We can use at least some components of systematic reviews for assessing evidence (e.g. being explicit in how we search for studies and what criteria we use to decide whether to include them in a review)
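As an aside, a minimal sketch of what the second of those components could look like if written down, assuming hypothetical study attributes and illustrative inclusion criteria (none of this is a real protocol, only an example of making criteria explicit and applying them uniformly):

```python
# Hypothetical sketch: explicit inclusion criteria for screening studies in a review.
# The attributes and thresholds below are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class Study:
    ref: str                 # identifier for the candidate study
    design: str              # "experiment", "natural experiment", or "observation"
    replicated: bool         # were treatments/sites replicated?
    outcome_measured: bool   # does the study report a measurable outcome?

def include(study: Study) -> bool:
    """Inclusion criteria stated up front, rather than applied ad hoc."""
    return (
        study.design in {"experiment", "natural experiment"}
        and study.replicated
        and study.outcome_measured
    )

candidates = [
    Study("Study A", "experiment", True, True),
    Study("Study B", "observation", False, True),
    Study("Study C", "natural experiment", True, False),
]

included = [s.ref for s in candidates if include(s)]
print(included)  # -> ['Study A']
```

The point is not the code itself but that the criteria are written down before screening, so the same rules are applied to every candidate study.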
Disseminating reviews:
The Cochrane Collaboration (CC) was set up specifically to publish reviews and maintain the independence of the approach
– 60 international review groups (editorial boards) that review the systematic reviews
– A central non-profit organisation manages the main database
– 15 Cochrane centres worldwide provide support
– A strict peer-review-like system for systematic reviews
– Collaboration based on independence and altruism
Many benefits of reviewing evidence through the CC:
– A forum for producing reviews
– A forum for two-way communication between researchers, doctors and patients
– Increased accessibility of the primary literature
– An increase in the publication of null results in the literature
– Increased recognition for researchers who produce reviews
– Demonstrates the significance of medical research to the wider community
Key points:
– We can use at least some components of systematic reviews to increase the objectivity of assessing empirical evidence
– The success of systematic reviews in medicine is in part due to:
 – Their aim to clearly inform research and practice
 – The independent process for reviewing information
 – Wide dissemination of results
– The reviews and the dissemination process go hand in hand: increased synthesising activity leads to a better understanding of the current limits of knowledge and promotes the collection of more and better evidence
5 main conclusions:
1. We need to be more explicit about what a theory is for
2. “Validity” is not a helpful word to use when assessing a theoretical tool
3. Assessing the utility of a theory may be more appropriate
4. Using at least some of the components of systematic reviews can help us assess the biological limitations of theoretical tools
5. We may need to consider whether the fora for publishing reviews of theoretical tools are currently adequate