South Carolina Oyster Restoration and Enhancement Water Quality Monitoring: Evaluation of a Digital Training Product for South Carolina Oyster Restoration.


South Carolina Oyster Restoration and Enhancement Water Quality Monitoring: Evaluation of a Digital Training Product for South Carolina Oyster Restoration and Enhancement (SCORE) Volunteers

Steven O’Shields, I.M. Systems Group, Inc., NOAA Coastal Services Center, Charleston, South Carolina
Nancy Hadley, South Carolina Department of Natural Resources (SCDNR)

National Water Quality Monitoring Conference, San Jose, California, May 10, 2006

Outline
Overview of SCORE Program
Overview of Digital Training Product
Evaluation of Digital Training Product
–Purpose
–Methods
–Results
–Conclusions
Questions

Overview of SCORE Program
Lead organization: South Carolina Department of Natural Resources (SCDNR)
Purpose: To restore and enhance oyster habitat by planting recycled oyster shells in the intertidal environment to form new, self-sustaining oyster reefs—all with the help of volunteers.
Partners:
–Various school and community volunteer groups
–Numerous local, state, and federal funding partners

Overview of SCORE Program
Program components:
–Oyster shell recycling
–Oyster reef building
–Oyster reef monitoring

Overview of SCORE Program
NOAA Coastal Services Center support:
Funding
Services
–Outreach support
–Web site development
–Content development
–Data management system development
–Digital training product development

Overview of Digital Training Product
The product:
–Informs about water quality characteristics important to oyster health
–Interactively demonstrates how to measure those water quality characteristics

Specifically, the product:
–Defines water quality characteristics
–Explains how water quality characteristics are measured
–Explains the importance of water quality to aquatic life
–Explains natural influences on water quality
–Explains how human actions influence water quality
–Illustrates measuring tools and demonstrates their use
–Interactively simulates procedural steps and requires user input

Product Evaluation: Purpose
The purpose of the evaluation was to gauge how effective the product has been at:
–Helping SCDNR staff members train and prepare SCORE volunteers to measure water quality
–Educating SCORE volunteers and the public on water quality characteristics that are important to understanding oyster habitat

Product Evaluation: Methods
Product success was examined on five levels:
1. Product usage – Is the product being used?
2. User reaction – Is the product well received and considered useful?
3. Learning results – Are product users gaining knowledge?
4. Behavior in the field – Are behaviors in the field changing?
5. Business impacts – Are day-to-day volunteer program operations improving?
Evaluation methods included:
–Web log analyses
–Surveys
–Pre-test / post-test assessments
–Field observations
Time period of data analysis: October 2005 to February 2006 (5 months)

Product Evaluation: Level 1 Results
Level 1 – Product usage
Web log (usage) analysis (excludes NOAA and SCDNR URLs):
–Unique visitors (IP addresses): 951
–Loyalty (repeat visitors): 178
–Pages visited: 3,620
–Unique files downloaded: 42
–Total file downloads: 2,737
–Time online: 60+ minute sessions accounted for 75 percent of total pages visited
On-line surveys:
–User survey (no responses)
–Teacher survey (two high school and one middle school science teacher)
Volunteer survey:
–Four out of 18 adult volunteers surveyed had completed the tutorial
–Five out of five volunteer school groups had completed the tutorial
Product evaluation participants:
–118 students
–Four teachers
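The usage metrics above come from a web server log analysis. A minimal sketch of that kind of analysis is below, assuming a standard common/combined-format access log in which the client IP is the first field; the sample log lines and the excluded address prefix are illustrative, not the actual NOAA/SCDNR data.

```python
# Sketch of a Level 1 web log analysis: unique visitors, repeat visitors,
# and total page hits, excluding internal traffic. The excluded prefix and
# sample log lines are hypothetical.
from collections import Counter

EXCLUDED_PREFIXES = ("192.168.",)  # stand-in for internal NOAA/SCDNR addresses

def summarize_log(lines):
    """Count per-IP hits, skipping excluded addresses."""
    hits = Counter()
    for line in lines:
        ip = line.split(" ", 1)[0]            # first field is the client IP
        if ip.startswith(EXCLUDED_PREFIXES):
            continue                           # skip internal traffic
        hits[ip] += 1
    return {
        "unique_visitors": len(hits),
        "repeat_visitors": sum(1 for n in hits.values() if n > 1),
        "pages_visited": sum(hits.values()),
    }

sample = [
    '10.0.0.1 - - [01/Oct/2005:10:00:00] "GET /tutorial HTTP/1.1" 200 512',
    '10.0.0.1 - - [02/Oct/2005:11:00:00] "GET /quiz HTTP/1.1" 200 256',
    '10.0.0.2 - - [01/Oct/2005:12:00:00] "GET /tutorial HTTP/1.1" 200 512',
    '192.168.1.5 - - [01/Oct/2005:13:00:00] "GET /tutorial HTTP/1.1" 200 512',
]
print(summarize_log(sample))
# {'unique_visitors': 2, 'repeat_visitors': 1, 'pages_visited': 3}
```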

Product Evaluation: Level 2 Results
Level 2 – User reaction
On-line teacher survey:
–All three teacher respondents “agreed” or “strongly agreed” that the tutorial:
–Covers topics adequately
–Contains information at the appropriate scientific level
–Makes appropriate use of Web technology
–Is easy to use
–Should be recommended to others
Teacher responses:
–“Absolutely amazing program—copyright it now and sell it in a textbook/pamphlet form.”
–“…all i can say over and over again is...WOW!!!!!!!!!”
–“Any teacher or citizen volunteer who has privilege to this module should be very grateful for all the time and work put into the program. It is so clear and flows so logically from one point to another, we just can't say enough about how much we are impressed and how much we like it.”
Unsolicited student responses:
–Most found the material interesting, easy to understand, and useful.

Product Evaluation: Level 3 Results
Level 3 – Learning results
Self-assessments:
–On-line survey (no responses)
Pre-test vs. post-test evaluations:
–118 participants

Product Evaluation: Level 3 Results
Level 3 – Learning results (pre-test / post-test results)
[Five slides of pre-test / post-test result charts; chart data not captured in the transcript.]
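The pre-test / post-test analysis behind these results can be sketched as a simple paired comparison: each participant's post-test score minus their pre-test score. The scores below are made up for demonstration; the actual evaluation had 118 participants whose chart data is not in the transcript.

```python
# Illustrative paired pre-test / post-test comparison (Level 3).
# Scores are hypothetical percent-correct values, not the study data.
def paired_gain(pre, post):
    """Return mean pre score, mean post score, and mean per-student gain."""
    assert len(pre) == len(post), "scores must be paired by student"
    gains = [b - a for a, b in zip(pre, post)]
    return (sum(pre) / len(pre), sum(post) / len(post), sum(gains) / len(gains))

pre_scores = [40, 55, 60, 70]    # before the tutorial (hypothetical)
post_scores = [70, 75, 80, 90]   # after the tutorial (hypothetical)
mean_pre, mean_post, mean_gain = paired_gain(pre_scores, post_scores)
print(mean_pre, mean_post, mean_gain)  # 56.25 78.75 22.5
```

Pairing the scores by student, rather than comparing group averages alone, keeps the gain attributable to individual learning rather than to differences in group composition.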

Product Evaluation: Level 4 Results
Level 4 – Behavior in the field
Before-and-after field comparisons:
–Data collection (protocols and technique)
–Data accuracy
Field participant survey:
–Self-perception of change
SCDNR staff member survey:
–Staff member perception of change in volunteers

Product Evaluation: Level 4 Results
Level 4 – Behavior in the field (before-and-after field comparisons)
Procedure
Who:
–Four groups of ninth-grade students (two to three students per group)
Month 1:
–Completed background portion of training product
–Received on-site, instructional field training
Month 3:
–Students measured water quality (first time – “before”)
–Completed instructional portion of training product
Month 4:
–Students measured water quality (second time – “after”)

Product Evaluation: Level 4 Results
Level 4 – Behavior in the field (before-and-after field comparisons)
Results:
–Data collection: some things improved, some things got worse
–Data accuracy: some things improved, some things got worse
Notes:
–Three out of four groups did not use written protocols
–No practice between the “before” and “after” measurements
–Initial field training seemed significantly improved
Outcomes:
–Laminated, abbreviated protocols should be included with the field equipment
–Teachers and trainers should stress the importance of reading protocols

Product Evaluation: Level 4 Results
Level 4 – Behavior in the field (field participant survey)
[Five slides of field participant survey charts; chart data not captured in the transcript.]

Product Evaluation: Level 4 Results
Level 4 – Behavior in the field (SCDNR staff member survey)
Have there been fewer errors found in volunteer-collected data?
–2002 and 2003 (pre-tutorial): ~10 percent error rate
–2004 through September 2005 (pre-tutorial): ~5.5 percent error rate
–October 2005 to March 2006 (post-tutorial): ~2 percent error rate***
***Only 11 percent (31 records) of the “post-tutorial” data was entered by volunteers who had completed the tutorial. No errors were found in that 11 percent.
Have there been fewer problems with improper care and storage of monitoring equipment?
–No noticeable difference
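The footnote's figures imply a rough size for the post-tutorial dataset: if 31 records are 11 percent of it, the whole set is about 282 records. A quick check of that arithmetic, with the error rate as a simple ratio (only the 31-record and 11-percent figures come from the source; everything else is illustrative):

```python
# Arithmetic behind the staff-survey footnote. Only the 31 records and the
# 11 percent share are from the source; the rest is a derived estimate.
def error_rate(errors, records):
    """Fraction of submitted records containing at least one error."""
    return errors / records

tutorial_records = 31
tutorial_share = 0.11
implied_total = round(tutorial_records / tutorial_share)
print(implied_total)        # 282 post-tutorial records, approximately
print(error_rate(0, 31))    # 0.0: no errors in the tutorial-completer subset
```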

Product Evaluation: Level 5 Results
Level 5 – Business impacts (SCDNR staff member survey)
Has the number of volunteer requests for technical assistance decreased?
–About the same; there were very few to begin with
Has the average time required to conduct field training decreased?
–Yes; training sessions are generally quicker and more productive
Any other observations?
–Volunteers seem to have a better understanding of why we are concerned with monitoring water quality

Product Evaluation: Conclusions
Levels 1 to 5 – Product success:
1. Product usage – Is the product being used? Yes, by volunteers and school groups.
2. User reaction – Is the product well received and considered useful? Yes, particularly by teachers (limited student/volunteer feedback).
3. Learning results – Are product users gaining knowledge? Yes, across all ages and backgrounds.
4. Behavior in the field – Are behaviors in the field changing? Yes and no (subjective vs. objective measures).
5. Business impacts – Are day-to-day volunteer program operations improving? Yes; training sessions are generally quicker and more productive.