Presentation on theme: "A Framework for Monitoring" — Presentation transcript:

1 A Framework for Monitoring

2 Conference outcomes
 The Board work groups reviewed the work-group-specific and general Board-related conference recommendations
 The Board and work groups discussed how the recommendations could be meshed with current Board product efforts and included in a longer-term strategy
 The Board will consider how we can make the framework most relevant to the various monitoring entities
 The Board and its work groups will continue this process, to culminate in a revised work plan at the next Board meeting
 As part of this effort, the Board will identify audiences that can provide success stories and/or use Board products

3 Board Strategy
 Develop element considerations and related goal group products
 –Framework-related product(s)
 –Use conference input to focus the product choice (especially discussion recommendations and questions 18 and 19?)
 –Determine new technology relationships to products
 Develop a two-year work group product strategy to prioritize products to showcase at the next conference
 –Compile/develop “success stories” – pilot studies that demonstrate the relevance of the framework or a product
 –Link products together via an information warehouse/expert system
 –Determine IMPACT contributions

4 Some considerations for developing a framework-related product strategy for the Council
 Develop a list of element considerations and relationships to goal groups
 Use conference recommendations and the framework to develop a potential goal group product list
 Develop work groups to deliver products
 Prioritize products and consider a two-year strategy to showcase products at the conference
 Compile/develop “success stories” that demonstrate the relevance of the framework
 Determine contributions to:
 –expert system to link products together
 –IMPACT issue
 –new technology relationships to products
 –data management needs – which goal groups?

5 Collaboration and Comparability  Each year, government agencies, industry, academic researchers, and private organizations devote enormous amounts of time and money to monitor, protect, manage, and restore water resources and watersheds.

6 2002 National Monitoring Conference  The mission of the National Council is to provide a national forum to coordinate consistent and scientifically defensible methods and strategies for improving water quality monitoring, assessment, and reporting.

7 Why Focus on Collaboration & Comparability?  Critical differences in project design, methods, data analysis, and data management make it difficult for monitoring information to be shared among a wider range of potential data users.

8 Why do we monitor?  Describe status and trends  Describe and rank existing and emerging problems  Design management and regulatory programs  Respond to emergencies From the Final Report of the Intergovernmental Task Force on Monitoring (1995)

9 Collaboration and Comparability  Development of a national monitoring strategy requires that we create a framework for collaboration and comparability among programs

10 What is a Monitoring Framework?  The process of monitoring and assessment should principally be seen as a sequence of related activities that –start with the definition of information needs and –end with the use of the information product. UN/ECE Task Force on Monitoring and Assessment (2000)

11 Proposed National Monitoring Framework

12 Elements of the Framework  Identify Monitoring Objectives  Design Monitoring Program  Collect Data in the Field and Lab  Manage Data  Interpret Data  Convey Information and Results

13 Examples of Element Considerations
 Identify monitoring objectives
 –Define Data Quality Objectives (DQOs)
 –Determine information expectations of legal requirements
 –Determine data and information required to support watershed assessments and other collaborators' needs
 Design monitoring program
 –Articulate and document the overall monitoring/information strategy
 –Identify the environmental setting and water-quality issues
 –Determine the spatial/temporal and constituent approach to meet information needs

14 Examples of Element Considerations (cont.)
 Collecting Data in the Field and Lab
 –Determine Measurement Quality Objectives (MQOs)
 –Identify optimal methods
 –Develop a sample management plan
 –Train and certify personnel
 Managing Data
 –Determine data management requirements and develop and document a data handling and audit approach
 –Develop metadata requirements
 –Use data checking programs to determine the reliability of chemical data – data verification (see the sketch below)
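To make the "data checking programs" consideration above concrete, here is a minimal sketch of the kind of automated verification such a program might run on chemical results. It is illustrative only: the analyte names, acceptance ranges, and the 20% duplicate criterion are hypothetical stand-ins for limits a real project would take from its DQOs and MQOs.

```python
# Minimal sketch of automated data verification for chemical results.
# Column names and acceptance limits are hypothetical examples only;
# a real program would take its limits from the project's DQOs/MQOs.

RANGE_LIMITS = {          # plausible-value screens, mg/L
    "nitrate_mg_L": (0.0, 100.0),
    "chloride_mg_L": (0.0, 10000.0),
}

def verify_record(record: dict) -> list[str]:
    """Return a list of verification flags for one analytical result record."""
    flags = []

    # 1. Range check: value must fall inside the plausible range for the analyte.
    for field, (lo, hi) in RANGE_LIMITS.items():
        value = record.get(field)
        if value is not None and not (lo <= value <= hi):
            flags.append(f"{field}={value} outside plausible range [{lo}, {hi}]")

    # 2. Blank check: field blanks should not show detections above the reporting limit.
    if record.get("sample_type") == "field_blank":
        for field in RANGE_LIMITS:
            value = record.get(field)
            if value is not None and value > record.get("reporting_limit", 0.0):
                flags.append(f"field blank detection in {field}: {value}")

    # 3. Duplicate check: relative percent difference between field duplicates.
    dup = record.get("duplicate_value")
    primary = record.get("primary_value")
    if dup is not None and primary is not None and (dup + primary) > 0:
        rpd = abs(dup - primary) / ((dup + primary) / 2) * 100
        if rpd > 20:  # hypothetical MQO of 20% RPD
            flags.append(f"duplicate RPD {rpd:.1f}% exceeds 20% criterion")

    return flags

# Example: one record with an implausible nitrate value.
print(verify_record({"nitrate_mg_L": 250.0, "chloride_mg_L": 12.0,
                     "sample_type": "routine"}))
```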

15 Examples of Element Considerations (cont.)
 Interpreting Data
 –Interpretation/implications: historical evaluation, water quality relevance, management relevance, professional judgment, information goals met?
 –Use existing indicators/indices
 –Choose and run appropriate water-quality models
 Conveying Information and Results
 –Determine audience
 –Determine media – internet, reports, news releases, oral, conference/meeting displays
 –Peer review of information

16 Examples of Element Considerations (cont.)  Coordination/Collaboration –National and regional monitoring conferences –State and regional monitoring council participation –Partner identification –Partner comparability studies –Monitoring data inventories –Conduct data and information swaps

17 A Framework and the Council  A framework will support the Council’s mission by providing a systematic & conceptual approach to the monitoring process to guide the efforts of the NWQMC, the Methods Board, and State and Regional Councils

18 Council’s Product-Based Approach
 Develop products through the goal group structure
 Deliver products in the short term while thinking and planning strategically in the long term
 Goal groups: Water Information Strategies; Methods & Data Comparability; Collaboration & Outreach; Watershed Components Interaction

19 Council’s Product-Based Approach
 Product-based approach:
 –Generate intermediate and final products to demonstrate success
 –Prioritize longer-term product activities
 –Organize meetings to focus on product accomplishments
 –Not attempt more than can be accomplished
 –Continue to involve additional volunteer stakeholders
 –Publicize what we do
 Goal groups: Water Information Strategies; Methods & Data Comparability; Collaboration & Outreach; Watershed Components Interaction

20 Water Information Strategies
 Purpose: Create and communicate goal-oriented monitoring design guidance that results in comparable information, over time and space, being produced in support of management decision making.
 Current framework focus:

21 Methods and Data Comparability
 Purpose: Explore, evaluate and develop methods and approaches to measurement that facilitate collaboration and promote comparability between water quality monitoring programs.
 Current framework focus:

22 Watershed Components Interactions  Purpose: Provide a national forum to advance the integration of ground and surface water monitoring to more fully understand the connected nature of these watershed components and their combined impact on the ecological integrity of the hydrologic system.  Current framework focus:

23 Collaboration and Outreach
 Purpose: Build and support creative partnerships among the many elements of the monitoring community, particularly by supporting the development of state and regional monitoring councils. Provide support so that Council members can serve as ambassadors to heighten the awareness and involvement of all stakeholders in water resource monitoring, protection, and restoration.
 Current framework focus:

24 Using the Framework to help coordinate monitoring efforts  The “cogs” of the graphic define the six elements of the Framework  Each of the elements includes monitoring considerations  Products can be developed and information summarized to address the element considerations  Products can be linked via an on-line expert system (information warehouse)

25 Methods and Data Comparability Framework (Element / Element Considerations / Product or Activity)

Element: Identify objectives and design of monitoring project
 Considerations: Study objectives; Monitoring questions; Data quality objectives; Measurement quality objectives; Sampling design
 Products/Activities: DQO paper (future activity); Expert system (ongoing); NEMI (beta release); PBMS paper (NWQMC Tech Report 01-02); COD pilot paper (submitted to ES&T)

Element: Collect data in the field
 Considerations: Field certification & training; Field protocols; Field method performance; Sample handling & preservation
 Products/Activities: Field certification position paper (future activity); NEMI (phase 3 – 2002 start); Field biology PBMS paper (draft 2002); Nutrient PBMS pilot (2002 start); Macroinvertebrate PBMS pilot (2002 start)

Element: Collect data in the laboratory
 Considerations: Method comparability; Laboratory accreditation; Reference materials availability; Laboratory method verification
 Products/Activities: NEMI (beta release); Federal laboratory accreditation position (ACWI approved 2002); Coordination with NELAC (ongoing); State laboratory accreditation position (future activity); PBMS position paper (NWQMC Tech Report 01-02); COD pilot paper (submitted to ES&T)

Element: Manage data
 Considerations: Required metadata; Data quality documentation; Water quality data elements
 Products/Activities: Chemical & microbiological list (ACWI approved 2001); Biological list (2001 start); NEMI coordination (ongoing)

26 Product-based approach – WQDE example
 Element: Manage data
 Element considerations:
 –Determine data management requirements and develop and document a data handling and audit approach
 –Develop metadata requirements
 –Use data checking programs to determine the reliability of chemical data – data verification
 Product: a common set of water quality data elements (WQDE)
 Identify the aspects that have to be considered within each element of the framework
 Develop products and activities—TOOLS—that help people address the considerations within each element of the framework
 Having and using a common set of data elements builds our capacity to understand our water resources (see the sketch below)
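To illustrate what a "common set of WQDE" might look like in practice, the sketch below defines one result record built from shared data elements. The field names are illustrative stand-ins rather than the ACWI-approved chemical and microbiological element list.

```python
# Illustrative sketch of a record built from a common set of water quality
# data elements (WQDE). Field names here are stand-ins, not the
# ACWI-approved element list; the point is that every partner reports the
# same elements so results can be pooled and compared.
from dataclasses import dataclass, asdict
from datetime import datetime

@dataclass
class WaterQualityResult:
    organization_id: str        # who collected the sample
    station_id: str             # where it was collected
    sample_datetime: datetime   # when it was collected
    analyte: str                # what was measured
    result_value: float
    result_units: str
    analytical_method: str      # e.g., a method identifier such as one listed in NEMI
    detection_limit: float
    qa_qc_flags: str            # data quality documentation

record = WaterQualityResult(
    organization_id="example_volunteer_group",
    station_id="RIVER-001",
    sample_datetime=datetime(2003, 6, 15, 9, 30),
    analyte="dissolved oxygen",
    result_value=8.2,
    result_units="mg/L",
    analytical_method="hypothetical-method-id",
    detection_limit=0.1,
    qa_qc_flags="accepted",
)

# Because every partner fills in the same elements, records from different
# programs can be merged into one warehouse and interpreted together.
print(asdict(record))
```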

27 IMPACT Issue to describe and announce the Framework  September 2003 issue  Teams to prepare short “cog” articles  Handout provides draft outline for issue and suggestions for lead authors and collaborators  Determine/agree upon lead authors for articles

28 Building a Framework for the Future  Conference organized around 6 thematic tracks, reflecting pieces of the framework

29 Conference Organized Around 6 Tracks T1--Setting the Stage for Monitoring T2 & T3--Field & Lab Methods for Today & Tomorrow T4--Exploring Opportunities in Data Management T5--Making Sense of the Data T6--Data to Information to Action

30 Conference Structure
 Four different session types:
 –workshops and extended sessions (Monday)
 –presentation sessions (Tuesday & Wednesday—each track had five 90-minute presentation sessions)
 –poster sessions (Tuesday & Wednesday)
 –Council goal group discussion sessions (Wednesday)
 All sessions feed into creating the framework

31 Workshops, Presentations, and Posters followed by Discussions  One discussion session for each of the Council’s four working goal groups  Forum for sharing experiences and exploring ways the Council’s workgroups can build, foster, and promote a monitoring framework for the future  Opportunity to incorporate the ideas and issues raised in the workshops and track sessions

32 Discussions (continued)  Brainstorm specific roles the workgroup can play in emphasizing a monitoring framework for the future.  Produce recommendations on how the workgroup can promote, foster, and support the framework and the national monitoring community.  Recommendations will help guide the Council’s work

33 Framework workshop  Adding Structure to the Monitoring Framework –Discuss conference outcomes –Brainstorm the missing pieces –Guide the National Council’s current and future efforts to promote and sustain the monitoring framework.

34 Conference evaluations  109 evaluations completed  Each session rated on a 1-5 scale –(1 = poor, 2 = fair, 3 = satisfactory, 4 = good, 5 = excellent)

35 Monday -- Workshops  T2 & T3--Field & Lab Methods for Today & Tomorrow (workshop evaluation ratings: 4.0 and 4.29)

36 Tuesday -- Tracks 2 and 3  Field and Laboratory Methods for Today and Tomorrow, including NEMI and PBMS sessions (session evaluation ratings ranged from 3.67 to 4.40)

37 Wednesday -- Track 4  Exploring Opportunities in Data Management (mini workshop; evaluation rating: 3.90)

38 Wednesday -- Discussion Sessions  One for each of the Council’s four working goal groups  Forum for sharing experiences and exploring ways the Council’s workgroups can build, foster, and promote a monitoring framework for the future  Opportunity to incorporate the ideas and issues raised in the workshops and track sessions (evaluation rating: 4.00)

39 Workgroup discussions  Attendance at the various Council workgroup discussions (based only on evaluation responses)

40 Conference attendance breakdown (based only on evaluation responses)

41 Framework Workshop recommendations  Add a “cog” to ID potential users of data – the first step in the process  Consider a “cog” to evaluate outcomes  Discussed elements to consider in each “cog”  Provide case studies where the framework saved resources

42 Public Outreach/Communication  Communicate the most important monitoring information –ID benefits of monitoring –Demonstrate that water quality information is making a difference –Include economic/quality-of-life values –Market the use of indicators  We need to remind the public of the importance of water quality monitoring  Make water quality information more relevant to more audiences

43 Establish working relationships between state/regional councils and the NWQMC  Compile a directory of state/regional councils –ID & inventory all existing monitoring programs –Foster and encourage two-way communication (bottom-up & top-down)

44 Establish working relationships …  Develop a communication system to facilitate this info exchange –Showcase successes –Document efficiencies, value added benefits  Encourage organizations to look beyond their immediate needs  Encourage establishment of new councils

45 Fully Involve the Monitoring Community  Give all interested parties the opportunity to become involved  Communicate the value of field and lab certification –builds trust –improves comparability of data –Investigate different levels of certification for different data uses

46 Fully Involve the Monitoring Community  Develop a compendium/directory of training tools  Develop a glossary –move toward a common monitoring language  Report results to the interested public and decision makers –“If it’s worth the effort to monitor, it’s worth the effort to report the results” –Use a variety of communication tools to get the message out  NWQMC needs to set up a booth at other conferences, and advertise, advertise, advertise

47 Promote NWQMC as a vehicle for federal and state agency collaboration  Provide assistance to electronically share data –Develop consistent formats for sharing data and reporting results (see the sketch below) –Promote secondary uses of data (e.g., Secchi Dip-In)  Promote use of uniform indicators  Explore opportunities for volunteer monitoring input (harmonious data sets)  Foster greater interstate collaboration for monitoring and assessment of shared water resources  Communicate the Unified Federal Agency policy for WQ monitoring on federal lands
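As one hedged illustration of "consistent formats for sharing data," the sketch below maps records from two hypothetical partners onto a single shared column set before writing them to one file. The partner names, field names, and shared columns are all assumptions made for the example, not an adopted Council format.

```python
# Minimal sketch of converting partner data into one consistent exchange
# format. The partner field names and the shared column set are hypothetical;
# the point is that each contributor maps its local names onto agreed columns
# before results are pooled.
import csv

SHARED_COLUMNS = ["station_id", "sample_date", "analyte", "value", "units"]

# Each partner supplies a mapping from the shared column to its local field name.
PARTNER_FIELD_MAPS = {
    "state_lab": {"station_id": "STATION", "sample_date": "DATE",
                  "analyte": "PARAM", "value": "RESULT", "units": "UNIT"},
    "volunteer_group": {"station_id": "site", "sample_date": "sampled_on",
                        "analyte": "parameter", "value": "reading", "units": "units"},
}

def to_shared_row(partner: str, record: dict) -> dict:
    """Map one partner record onto the shared column set."""
    field_map = PARTNER_FIELD_MAPS[partner]
    return {col: record.get(field_map[col], "") for col in SHARED_COLUMNS}

records = [
    ("state_lab", {"STATION": "WI-014", "DATE": "2003-05-02",
                   "PARAM": "nitrate", "RESULT": "1.8", "UNIT": "mg/L"}),
    ("volunteer_group", {"site": "Mill Creek", "sampled_on": "2003-05-03",
                         "parameter": "nitrate", "reading": "2.1", "units": "mg/L"}),
]

# Pool both partners' results into one consistently formatted file.
with open("shared_results.csv", "w", newline="") as handle:
    writer = csv.DictWriter(handle, fieldnames=SHARED_COLUMNS)
    writer.writeheader()
    for partner, record in records:
        writer.writerow(to_shared_row(partner, record))
```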

48 Watershed Components
 Traditionally we’ve focused on the interactions of ground and surface water.
 Need to expand to include other key watershed components
 –air deposition, wetlands, soil zone
 watershed characteristics
 –land use, impervious cover, urbanization, agriculture, population expansion, disposal techniques, and underground injection
 and watershed interrelationships
 –ground water, surface water and wetlands, water quality and quantity
 Surface and ground watersheds do not usually coincide geographically.

49 Watershed Components  Need to identify and assemble watershed case studies, e.g. –Impact of ground water withdrawal on surface water (WA) –Models that have multiple management objectives (Dane County, WI) –Nitrogen loading (Chesapeake Bay)

50 Promoting Consistent Methods: Models  Need models that identify regional characteristics  Use models to identify quality/quantity issues  Promote models that are relevant for many stakeholders (helps to promote buy-in and funding)  Promote models that are good management tools

51 Promoting Consistent Methods: Data Collection  Need water quality data elements specific to ground water  Ground water field collection methods – Are samples truly representative of aquifer water quality?

52 Promoting Consistent Methods: Implementation  BMPs for surface and ground water - don’t transfer the problem!  Integration of ground water loadings into TMDLs

53 Impacts of Ground Water Discharge  Expand from marine environments to include freshwater systems  Expand to include impacts to ground water from surface water recharge  Council should promote interactions between coastal and freshwater stakeholders and seek expertise, e.g. –WEF, AWWA, AWRA, EWRI, NGWA, GWPC, NOAA, LTER, ASIWPCA, SWCS, and international organizations

54 Public Outreach and Education  Coordinate with Collaboration and Outreach workgroup  Form sub-workgroups within WCI for education/outreach and technical issues  Provide educational materials for school curriculums  Develop website activities for children, and provide links to other educational programs from NWQMC site

55 Recommendations to NWQMC  NWQMC needs to provide financial support to this workgroup, WCI, to recruit outside expertise.  In the future, WCI should –focus on the “convey information and results” segment of the proposed monitoring framework. –carry and support the issues of watershed component interactions to the other Council workgroups. –develop a list of suggested elements to include in watershed models to address the holistic system.

56 Overarching Methods Related Issues  Need for Communication, Collaboration, and Coordination -- transcended track 2/3 sessions and NEMI and New Tech Workshops  Promote implementation of ACWI approved recommendations by senior managers at federal and state agencies  Differentiate between information quality and data quality  Standard definition of terms such as accuracy, precision, etc., for use by the monitoring community  Need success stories as agents for change

57 Overarching Methods Related Issues  Need to develop a Comparability Protocol –How to design studies to demonstrate comparability for field and lab methods –Conduct pilot studies –Evaluation of metadata -- precision, accuracy, etc. –Evaluation of previously collected data

58 National Environmental Methods Index  Include additional explanatory information as part of NEMI  Use NEMI as a basis for database registries  Develop an expert system to provide monitoring design recommendations  Need to prioritize methods that are added to NEMI  Implement suggestions made on the NEMI prototype

59 Methods Acceptance Issues  Address method approval issues for compliance and other monitoring programs  Continue to evaluate outstanding PBS issues  Advocate PBS implementation  Coordinate with NELAC PBS approach

60 New Technologies and Early Warning  Develop a protocol for decision making with respect to data interpretation and use  Develop partnerships for sensor technology development  Document performance and acceptability criteria  Provide broad range of testing -- methods and environmental conditions

61 New Technologies and Early Warning  Provide training in use of emerging technologies  Coordinate global expertise in biomonitoring -- biohazards and emerging technologies  Address technical and management issues with false positives/false negatives in early warning alarm systems.

62 Water Quality Data Elements  Need for targeted outreach  Need for implementation approaches  Demonstrated success -- leads to adoption  Hierarchy from core to desired elements  Need participants to develop biological WQDEs and to test chemical and microbiological WQDEs -- so please volunteer

63 Prioritize Methods Board Projects  PBS  Method Comparison protocols including field methods  Glossary of terms related to comparability  Outreach  Implementation of recommendations at all levels

64 Additional Methods Issues  Reassessment of detection limit protocols  Reporting of low level data  Reference materials  Field accreditation  Training

65 System Wide Considerations  Database management  Quality Assurance/Quality Control  Information goal (information strategy)  Costs  Peer-reviewed system elements

66 Flexibility vs. Standardization  Unintended impacts –“One size does not fit all” vs. Tower of Babel  Monitoring for regulation or “management”

67 Instructions to the Workgroup  Need glossary/thesaurus of terms  Need tools to connect cogs of framework – smoothly & seamlessly –Strategies are important to connect cogs with each other –Better define the content of the cogs of the framework  Need case studies to illustrate definition of cogs and connection between cogs

68 Some considerations for developing a framework-related product strategy for the Council
 Develop a list of element considerations and relationships to goal groups
 Use conference recommendations and the framework to develop a potential goal group product list
 Develop work groups to deliver products
 Prioritize products and consider a two-year strategy to showcase products at the conference
 Compile/develop “success stories” that demonstrate the relevance of the framework
 Determine contributions to:
 –expert system to link products together
 –IMPACT issue
 –new technology relationships to products
 –data management needs – which goal groups?

69 Expert System Concept  Concept discussed at Council meetings for several years (late 1999/early 2000)  A coordinated product approach – an internet-based information guide through the monitoring framework  Pilot being developed under an NSF small business grant by Instant References Sources, Inc. (Larry Keith, chair of the NEMI work group)

70 Expert system pilot: EMMA - interactive software
 Designed to help you plan improved and cost-effective environmental monitoring projects.
 Guides you through complex decisions to tailor your plans to meet specific project needs by considering the physical and chemical characteristics of the sampling site and target analytes, desired data quality, available budget, and your objectives.
 Combines decision criteria based on EPA’s DQO process, your specific project needs, and methods information from the new National Environmental Methods Index (NEMI).
 Consists of three modules, each based on a group of interactive decision criteria. It helps you consider, and answer, all critical questions for project planning so that you have a plan that ensures you get the right data, on time, the first time, with no unpleasant surprises.

71 EMMA currently has three modules
 Authoritative Decisions - Objectives, decisions, timing, budget, sampling site, and data quality
 Method Selection - Accuracy, precision, sensitivity, selectivity, cost, regulatory approval, etc.
 Numbers of Samples - Environmental site samples, QA/QC samples, and decision rules
 A second grant was developed to move the pilot further
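As a rough illustration of the kind of arithmetic a "Numbers of Samples" module performs, the sketch below uses a standard survey-design formula to estimate how many samples are needed to hit a target precision at a chosen confidence level. This is generic textbook logic offered as an analogy, not EMMA's actual algorithm or interface, and the example inputs are assumptions.

```python
# Generic sketch of a "Numbers of Samples" style calculation: how many
# environmental samples are needed to estimate a mean concentration within
# a chosen margin of error at a chosen confidence level. This is standard
# survey-design arithmetic, not EMMA's actual decision logic.
import math
from statistics import NormalDist

def samples_for_mean(std_dev: float, margin_of_error: float,
                     confidence: float = 0.95) -> int:
    """n = (z * sigma / d)^2, rounded up to the next whole sample."""
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)  # two-sided z-score
    n = (z * std_dev / margin_of_error) ** 2
    return math.ceil(n)

# Example: analyte with an assumed standard deviation of 2.0 mg/L,
# target precision of +/- 0.5 mg/L at 95% confidence.
n_site = samples_for_mean(std_dev=2.0, margin_of_error=0.5)
n_qc = math.ceil(0.10 * n_site)   # hypothetical 10% QA/QC sample allowance
print(f"site samples: {n_site}, QA/QC samples: {n_qc}")
```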

72 Possible Council Strategies  Should the Council build off of the EMMA effort?  Develop a 3rd proposal to NSF (not a small business grant) to connect other Council products and design a complete Framework system?  Should the Council provide technical advice/support/review/guidance and develop product pieces?  http://infotrek.er.usgs.gov/doc/nemi/emma/

73 New Technologies and Early Warning  Develop a protocol for decision making with respect to data interpretation and use  Develop partnerships for sensor technology development  Document performance and acceptability criteria  Provide broad range of testing -- methods and environmental conditions

74 New Technologies and Early Warning  Provide training in use of emerging technologies  Coordinate global expertise in biomonitoring -- biohazards and emerging technologies  Address technical and management issues with false positives/false negatives in early warning alarm systems.

75 Goal Groups’ objectives  Product strategy  Framework  Success stories  IMPACT  Expert system  New technologies

76 A few slides on MDCB (Methods and Data Comparability Board) progress and plans – near term and longer term
 Show the revised Board framework
 Work groups are developing two-year product strategies that consider conference recommendations
 NEMI – showcase at the conference; going public with 650 methods; phase 3 will add constituent groups and field protocols
 WQDE – implement and conduct outreach for the chemical and microbiological lists; develop a structure to connect the various WQDE lists; develop the various biology lists
 PBMS – publish COD results; tackle “data integrity”?; conduct the biology and nutrient pilots
 Accreditation – promote recommendations; tackle State lab recommendations; tackle field certification; work with the new NELAC
 Nutrient, biology, and new technologies work groups contribute to these product efforts.

