Water Quality Monitoring: Quality Assurance/Quality Control


1 Water Quality Monitoring: Quality Assurance/Quality Control
May 17, 2012

2 Presentation Outline:
Why is Quality Assurance Important?
Definitions
Volunteer Monitors' QA Responsibilities
Good Documentation
Data Evaluation
DEQ's Use of Secondary Data

3 Why Do Volunteers Monitor?
You believe that someone can put the data to good use!

4 Monitoring Goals
Monitoring: Support decision-making
Data: Provide relevant information
Data Quality: Credible, defensible, documented
In order to influence someone in power to act on your data, you need believable data. Wouldn't you like to be able to say, "You can trust my data and I can prove it"? You can demonstrate that your data are credible, defensible, and documented by using two words: Quality Assurance.

5 Why Is Quality Assurance Important?
Quality assurance makes your data acceptable so that others may USE your data.
Data quality ensures the usability of the data and allows the data to support decision-making.
Data of unknown quality are useless.

6 Definitions: Quality Assurance (QA)
QA is a set of operating principles which, if strictly followed during sample collection and analysis, will produce data of known and defensible quality.
Examples of QA activities: project planning, data collection activities, quality control, and evaluation of data.
QA ensures that your data will meet defined standards of quality with a stated level of confidence. QA involves the upfront planning and management activities conducted prior to sampling and analysis to ensure that the appropriate kinds and quantities of data are collected. QA refers to a broad management system that includes the organization, planning, data collection, quality control, documentation, evaluation, and reporting activities of your group.
Examples: DQO planning; QAPP, SAP, and SOP development; audits; training; documentation and records.

7 Definitions: Quality Control (QC)
QC activities are implemented to evaluate the effectiveness of QA activities and are used to produce and document the quality of the data.
Quality assurance (QA) and quality control (QC) are distinct but related activities. QC refers to the routine verification activities whose purpose is to detect and control errors. Since errors can occur in the field, in the laboratory, or during data analysis, QC must be part of each of these functions. QC criteria are used to determine whether the method remained in control.
Examples of QC activities: collecting field duplicates/blanks, calibrating equipment, and reviewing data sheets.
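For example, one routine QC check on field duplicates is the relative percent difference (RPD) between a sample and its duplicate. The sketch below is illustrative only; the 20% acceptance limit is an assumed example, and real limits come from the project's data quality objectives.

```python
def relative_percent_difference(primary: float, duplicate: float) -> float:
    """Relative percent difference (RPD) between a sample and its field duplicate."""
    mean = (primary + duplicate) / 2.0
    if mean == 0:
        return 0.0
    return abs(primary - duplicate) / mean * 100.0

# Example: a dissolved-oxygen sample (8.4 mg/L) and its field duplicate (7.9 mg/L),
# compared against an assumed project acceptance limit of 20% RPD.
rpd = relative_percent_difference(8.4, 7.9)
within_limit = rpd <= 20.0  # True here; out-of-limit pairs would be flagged for review
```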

8 Volunteer Monitors' QA Responsibilities
Show that you know what you are doing and that you are doing it right. How?
Developing QAPPs/SAPs
Establishing DQOs
Participating in training
Using SOPs or specified methods
Documenting calibrations/field activities

9 Volunteer Monitors' QA Responsibilities (continued)
Show that you know what you are doing and that you are doing it right. How?
Recording data accurately
Collecting samples according to the QAPP/SAP
Ensuring proper equipment maintenance
Performing data evaluation
Managing data

10 Good Documentation
Metadata are data descriptors or qualifiers that document the when, where, what, why, how, and "how good" of sample collection and analysis. The more background information (metadata) you can provide, the more valuable your data will be for multiple purposes and users. It is important to use and completely fill out data sheets; the same holds true for sample bottle labels, lab sheets (if applicable), and sample drop-off sheets. Ten years from now, when key people may have moved on, how easy will it be to locate old laboratory logbooks, volunteer training records, or field data sheets and connect these documents with the appropriate monitoring results in your database?
Incorporating as much metadata as possible into your electronic database is the best way to ensure that the metadata stays with the data. A number of volunteer monitoring programs are already moving in this direction, for example by storing detailed site location information such as verbal descriptions, site codes, and latitude/longitude coordinates in tables that are linked to the results tables in the database.
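As a minimal sketch of the linked-table idea (the table and column names here are hypothetical, not a prescribed schema), site metadata can live in its own table and stay joined to the results through a shared site code:

```python
import sqlite3

# Minimal sketch of a monitoring database: site metadata (description,
# coordinates) lives in its own table and is linked to results by site code.
con = sqlite3.connect("monitoring.db")
con.executescript("""
CREATE TABLE IF NOT EXISTS sites (
    site_code   TEXT PRIMARY KEY,
    description TEXT,          -- verbal description of the location
    latitude    REAL,
    longitude   REAL
);
CREATE TABLE IF NOT EXISTS results (
    result_id   INTEGER PRIMARY KEY,
    site_code   TEXT REFERENCES sites(site_code),
    sample_date TEXT,          -- when the sample was collected
    parameter   TEXT,          -- what was measured (e.g., dissolved oxygen)
    value       REAL,
    units       TEXT,
    method      TEXT           -- how it was measured (SOP or method citation)
);
""")

# Because the tables are linked, every result can be reported with its site
# metadata attached, so the "where" stays with the "what".
rows = con.execute("""
    SELECT r.sample_date, r.parameter, r.value, r.units,
           s.description, s.latitude, s.longitude
    FROM results r JOIN sites s ON r.site_code = s.site_code
""").fetchall()
```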

11 Good Documentation (continued)
Electronically "flagging" questionable data is another tool to help document data quality. These flags and comments constitute a type of metadata: essentially, the data manager's professional judgment that there is some question about the validity of a particular result. In a well-run monitoring program, the data manager is constantly on the lookout for irregularities. Anything that suggests a problem, such as a holding time exceeded or a result that is very different from past results at the same site, is flagged in the database, usually with an accompanying comment explaining the reason for the flag.
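A minimal sketch of such a flagging step, assuming hypothetical field names, flag codes, and thresholds (real holding times and review criteria come from the QAPP/SAP):

```python
from datetime import timedelta

# Illustrative holding-time limit; real limits depend on the parameter
# and the method cited in the QAPP/SAP.
MAX_HOLDING_TIME = timedelta(hours=48)

def flag_result(result: dict, site_history: list[float]) -> dict:
    """Attach data-quality flags and explanatory comments to one result.

    `result` is a hypothetical dict with 'value', 'collected', and 'analyzed'
    fields; `site_history` holds past values from the same site.
    """
    flags, comments = [], []

    # Flag results whose holding time was exceeded.
    if result["analyzed"] - result["collected"] > MAX_HOLDING_TIME:
        flags.append("H")
        comments.append("Holding time exceeded")

    # Flag results far outside the range of past results at this site.
    if site_history:
        mean = sum(site_history) / len(site_history)
        spread = (sum((v - mean) ** 2 for v in site_history) / len(site_history)) ** 0.5
        if spread and abs(result["value"] - mean) > 3 * spread:
            flags.append("J")
            comments.append("Result differs greatly from past results at this site")

    result["flags"] = flags
    result["comments"] = comments
    return result
```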

12 Data Evaluation
Evaluate the field and laboratory data to determine whether the data meet project objectives.
Use a QC checklist.
Determine how any data that do not meet requirements affect the usability of the dataset.
Screen data for outliers.
Flag data as appropriate.
So you have finished the field and lab work; now what? Evaluate the usefulness of the data. Did you meet all of your DQOs? Are the data usable?
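One common way to screen for outliers, shown here only as a sketch (the interquartile-range rule with a 1.5 multiplier is a general statistical convention, not a stated project requirement):

```python
import statistics

def screen_outliers(values: list[float]) -> list[int]:
    """Return the indices of values falling outside the interquartile-range fences.

    Uses the common 1.5 * IQR rule; flagged values are candidates for review,
    not automatic rejection.
    """
    if len(values) < 4:
        return []
    q1, _, q3 = statistics.quantiles(values, n=4)
    iqr = q3 - q1
    low, high = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [i for i, v in enumerate(values) if v < low or v > high]

# Example: one turbidity reading stands far above the rest of the season's results.
turbidity = [3.1, 2.8, 3.5, 2.9, 3.2, 18.7, 3.0]
suspect = screen_outliers(turbidity)  # -> [5], the 18.7 NTU reading
```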

13 DEQ's Use of Secondary Data
Data must be of known quality for DEQ to use them for decision-making. Secondary data must meet the following minimal requirements:
Data less than 10 years old
Written documentation (i.e., a QAPP or SAP)
QA/QC documentation
Notes indicating deviations from the QAPP or SAP
Data location information (i.e., latitude/longitude)
DEQ solicits outside data and information from other local, state, and federal agencies, volunteer monitors, private entities, non-profit organizations, and individuals with an interest in water quality. Data submissions are screened to determine whether the objectives of the original collection design allow for secondary use in a water quality assessment, the spatial and temporal representation of the data relative to a waterbody assessment unit, and the rigor of the quality assurance and quality controls applied during the original collection. Data that do not meet the needs of a structured water quality assessment may not be used explicitly to justify an impairment decision, but may still help DEQ identify problems that require further attention and study.
The QA/QC documentation is written assurance that the procedures and methods in the QAPP and SAP were followed, supporting the conclusion that the results are reproducible and that data requirements were met. The deviation notes are any field notes, laboratory notations, or summaries that indicate deviations from the QAPP or SAP and their potential impact on data quality and the objective outcome.
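The screening described above can be pictured as a simple checklist. Below is a minimal sketch, assuming a hypothetical submission record with fields named after the requirements in the list; it is illustrative only and not DEQ's actual review tool.

```python
from datetime import date

def screen_secondary_data(sub: dict) -> list[str]:
    """Return the reasons (if any) a secondary-data submission fails the minimal requirements.

    `sub` is a hypothetical dict describing one submission, e.g.:
    {"collection_date": date(2010, 6, 1), "qapp_or_sap": True,
     "qaqc_records": True, "deviation_notes": "none", "latitude": 46.6, "longitude": -112.0}
    """
    problems = []
    if (date.today() - sub["collection_date"]).days > 365 * 10:
        problems.append("Data are more than 10 years old")
    if not sub.get("qapp_or_sap"):
        problems.append("No written documentation (QAPP or SAP)")
    if not sub.get("qaqc_records"):
        problems.append("No QA/QC documentation")
    if sub.get("deviation_notes") is None:
        problems.append("No notes on deviations from the QAPP or SAP")
    if sub.get("latitude") is None or sub.get("longitude") is None:
        problems.append("No data location information (latitude/longitude)")
    return problems  # an empty list means the minimal requirements are met
```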

14 Temporal Requirements: DEQ's Assessment Methodology
Data used by DEQ for assessments must meet certain requirements that are specific to each pollutant group.
Pollutant-specific assessment methods: core indicators, spatial requirements, temporal requirements, defined index period, minimum sample size.
Each pollutant group has specific core indicators that have spatial and temporal requirements, defined index periods, and a minimum sample size. Please note that the collection methods have not changed, just the way we assess waterbodies. Our process allows for a more structured and consistent approach so that decisions can be replicated: a clear decision framework!
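As a rough sketch of how such a temporal screen might be applied (the July through September index period and the minimum of 10 samples are assumed example values; the real numbers are pollutant-specific and come from the assessment method):

```python
from datetime import date

# Assumed example values; the actual index period and minimum sample size
# depend on the pollutant group being assessed.
INDEX_PERIOD = (7, 9)   # July through September
MIN_SAMPLES = 10

def meets_temporal_requirements(sample_dates: list[date]) -> bool:
    """Check that enough samples fall within the defined index period."""
    start_month, end_month = INDEX_PERIOD
    in_period = [d for d in sample_dates if start_month <= d.month <= end_month]
    return len(in_period) >= MIN_SAMPLES
```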

15 DEQ's Data Quality Assessment
Evaluate data to consider the technical, representativeness, quality, and age components of data and information.
Using data quality assessments (DQAs), DEQ reviews chemical, biological, and physical/habitat data to determine whether they have adequate rigor for use in decision-making. The technical, spatial/temporal, and quality aspects of the data, as well as their age, are considered. In addition, data must represent ambient water quality conditions in order to be useful for assessing the waterbody. If data are of sufficient quality, they are incorporated into the water quality assessments.
Data quality: Are QAPPs and SAPs available? Is there written assurance that procedures and methods were followed? Is the lab performing the analysis certified, or does it have a documented quality system? Does sample collection follow DEQ SOPs or cite applicable methods and procedures?
Spatial/temporal independence: Do the data demonstrate spatial and temporal independence and meet the assessment method's requirements for index period and minimum sample size?
Technical: Do the data contain all the required metadata? Do the data meet the required sensitivity?
Data currency: Are the data less than 10 years old, or are conditions known not to have changed since the data were collected? Are the data representative of current conditions?

16 QA/QC Summary
QA/QC leads to credible and usable data.
The ultimate goal of most volunteer monitoring programs is to ensure that well-trained volunteers collect high-quality data and that the data are used. Despite decades of demonstrating that volunteers can and do collect representative data, government agencies, scientists, and often the general public are sometimes reluctant to use data not collected by "experts." Therefore, volunteer water quality monitoring programs must work especially hard to build and maintain credibility; some have even said, "twice as hard for half the recognition."

17 Questions?

