Caitlin Carpenter, Sheri Denning, Alan Elmore, Patricia Oliver, Samarah Shakir
- Evaluation Management
- Data Collection
- Analysis
- Results
- Reporting
- Stakeholders
- Closing Statements
- References

Stakeholder Name | Stakeholder Category | Interest or Perspective | Role in the Evaluation | How and When to Engage
Scott Watkins | Secondary | Front-line assistance | Scott is allowing NCSU Evaluators to incorporate the pre- and post-test materials into the seminar curriculum. | Scott has already agreed to assist NCSU Evaluators in the collection of data.
Samarah Shakir | Primary | Project Leader | Samarah is very active in working with Scott to coordinate the inclusion of pre- and post-test materials into the seminar curriculum. | Engaged throughout the process.
Caitlin Carpenter | Primary | Project Team Member | Caitlin is responsible for analyzing and interpreting the data collected from the evaluation. | Engaged throughout the process.
Sheri Denning | Primary | Project Team Member | Sheri is responsible for the management and implementation of the evaluation. | Engaged throughout the process.
Alan Elmore | Primary | Project Team Member | Alan is responsible for stating the evaluation purpose and identifying the project stakeholders. | Engaged throughout the process.
Patricia Oliver | Primary | Project Team Member | Patricia is responsible for identifying how the evaluation will be designed and the data collected. | Engaged throughout the process.
Dr. Kate Guerdat | Primary | Project Customer | Dr. Guerdat will provide final approval or denial of the evaluation proposal, as well as feedback on results and presentation. | Dr. Guerdat has already approved the evaluation proposal; final draft of results due April 26, 2013.

Individual | Title or Role | Responsibilities
Scott Watkins | DELTA Associate Director & Seminar Facilitator | Distribute and collect the pre/post test; send the Level Three survey to seminar participants.
Samarah Shakir | NCSU Evaluators Project Leader | Manage the evaluation process and coordinate communication between DELTA and NCSU Evaluators for the administration of evaluation instruments; collect Level One and Level Two results.
Alan Elmore | NCSU Evaluators Project Team Member | Level Three evaluation instrument.
Sheri Denning | NCSU Evaluators Project Team Member | Level Two evaluation instruments.
Patricia Oliver | NCSU Evaluators Project Team Member | Level Two evaluation instruments.
Caitlin Carpenter | NCSU Evaluators Project Team Member | Level Three evaluation instrument.

Evaluation question 1: Do participants understand the differences between Moodle 1 and Moodle 2?
- Pre/post test: paper pre-test and paper post-test (Scott Watkins, due March 21, 2013)
- Post-seminar email & survey links: participant email addresses (Scott Watkins, due April 4, 2013); survey links (Scott Watkins and NCSU Evaluators Team, due April 4, 2013)

Evaluation question 2: Was excitement and momentum built for the Moodle 2 upgrade?
- Post-seminar email & survey links: participant email addresses (Scott Watkins, due April 4, 2013); survey links (Scott Watkins and NCSU Evaluators Team, due April 4, 2013)

Analysis to Be Performed | Data to Be Analyzed | Person(s) Responsible | Due Date
Level One survey | Survey results | NCSU Evaluators Team | April 11, 2013
Level Two survey instrument | Pre/post test results | NCSU Evaluators Team | April 11, 2013
Level Three survey | Survey results | NCSU Evaluators Team | April 11, 2013

- Emailed survey after completion of the seminar
- 6 responses received
- Contained 7 Likert-scale questions on a Strongly Agree to Strongly Disagree scale
- 6 questions received 100% Agree
- 1 question received 83% Agree and 17% Neutral
Key Evaluation Questions
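The percent-Agree figures above come from simple tabulation of the Likert responses. A minimal sketch of that tabulation, using hypothetical response data for one question (the actual six survey responses are not reproduced here):

```python
from collections import Counter

# Hypothetical Likert responses for one survey question (6 respondents),
# illustrating how the percent-Agree split above can be computed.
responses = ["Agree", "Agree", "Agree", "Agree", "Agree", "Neutral"]

counts = Counter(responses)
total = len(responses)
percentages = {option: round(100 * n / total) for option, n in counts.items()}
print(percentages)  # {'Agree': 83, 'Neutral': 17}
```

The same tabulation applied to a question where all six respondents chose Agree yields the 100% Agree result reported for the other six questions.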

- Pre/post test design
- Sample group selected by date
- Results indicate learning transfer
- Reported increase in level of comfort with the new software
- Improvement in excitement for the new software
- More than 80% of respondents can list three changes between the prior and current versions of the software
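The learning-transfer claim rests on comparing each participant's pre-test and post-test scores. A sketch of that paired comparison, with hypothetical scores rather than the actual seminar data:

```python
# Hypothetical pre/post test scores for five participants (illustrative
# only; the real seminar scores are not reproduced here).
pre_scores = [4, 5, 3, 6, 5]
post_scores = [8, 9, 7, 9, 8]

# Paired gains: positive values indicate learning transfer.
gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
mean_gain = sum(gains) / len(gains)
share_improved = sum(g > 0 for g in gains) / len(gains)
print(f"mean gain: {mean_gain:.1f}, share improved: {share_improved:.0%}")
```

Pairing scores by participant, rather than comparing group averages alone, shows whether individuals actually improved between the two administrations.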

- Post-session survey through Survey Monkey
- Results indicate behavioral change
- Respondents continue to feel excited about the change
- Respondents report an increase in knowledge and comfort level after attending the session
- More than 80% of respondents discussed the training and the elements of the new software with coworkers
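The "more than 80%" behavioral finding is a simple proportion over respondents. A hedged sketch with hypothetical yes/no responses (not the real survey data):

```python
# Hypothetical Level Three responses: did each respondent report discussing
# the new software with coworkers after the session?
discussed = [True, True, True, True, True, False]

share = sum(discussed) / len(discussed)
meets_benchmark = share > 0.80  # the "more than 80%" criterion above
print(f"{share:.0%} discussed the training; meets benchmark: {meets_benchmark}")
```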

Stakeholder Name | Stakeholder Category | Interest or Perspective | Role in the Evaluation | How and When to Engage
Scott Watkins | Secondary | Front-line assistance | Scott is allowing NCSU Evaluators to incorporate the pre- and post-test materials into the seminar curriculum. | Scott has already agreed to assist NCSU Evaluators in the collection of data.
Dr. Kate Guerdat | Primary | Project Customer | Dr. Guerdat will provide final approval or denial of the evaluation proposal, as well as feedback on results and presentation. | Dr. Guerdat has already approved the evaluation proposal; final draft of results due April 26, 2013.

- Results
- Questions/Comments
- Please contact Samarah Shakir at

- Centers for Disease Control and Prevention. (2013, March 15). Evaluation plan outline (Appendix F). Retrieved from http://www.cdc.gov/asthma/program_eval/AppendixF_Evaluation_Plan_Outline.doc
- Bates, R. (2004, June 8). A critical analysis of evaluation practice: the Kirkpatrick model. Evaluation and Program Planning. Retrieved April 12, 2013.
- Russ-Eft, D. (1986). Evaluability assessment of the Adult Education Program (AEP). Evaluation and Program Planning, 9.
- Russ-Eft, D., & Preskill, H. (2009). Evaluation in Organizations (2nd ed.). New York: Basic Books.