EVALUATION RESEARCH

 How do we begin?  What are the different types of evaluation research?  How do these different types fit together?  What purpose do they serve?


 Step 1 – Start with Program Model
 Step 2 – Identify Resources
 Step 3 – Identify Inputs
 Step 4 – Identify Activities
 Step 5 – Identify Outputs
 Step 6 – Identify Outcomes

Other Common Terms Used to Identify Steps

 Step 1 – Identify Program Model
 Step 2 – Identify Inputs (also called Resources)
 Step 3 – Identify Activities (also called Program Events)
 Step 4 – Identify Outputs (also called Resulting Products)
 Step 5 – Identify Outcomes (also called Changes/Success)

Program Model

 Inputs – Teachers, Grant Money
 Activities – Service Learning Activities
 Outputs – Service Learning Projects Completed
 Outcomes – Measure Leadership Skills

Program Model with Indicators

 Inputs – Teachers, Grant Money; indicators: # of Teachers, Amount of Money for Building Materials
 Activities – Service Learning Activities; indicator: Number of Service Learning Projects Planned by Students
 Outputs – Service Learning Projects Completed; indicator: Number of Projects Completed
 Outcomes – Leadership Skills; indicator: Perception of Leadership Skills Developed

 Resources – Step One: Identify and count resources used to instigate the program
 Activities – Step Two: Identify and count activities that took place
 Outputs – Step Three: Identify and count events or products that resulted
 Short-Term Outcomes – Step Four: Identify and count short-term changes that took place (i.e., positive changes or accomplishments immediately after participation)
 Long-Term Outcomes – Step Five: Identify and count long-term impacts that occurred (i.e., positive changes or accomplishments at a specified future date)
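The five stages above can be sketched as a simple record, with counted indicators grouped by stage. This is only an illustration; the field names and all numbers are hypothetical, not part of any actual program:

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """Counted indicators for each stage of a program model."""
    resources: dict = field(default_factory=dict)            # e.g., staff, grant dollars
    activities: dict = field(default_factory=dict)           # e.g., planning meetings held
    outputs: dict = field(default_factory=dict)              # e.g., projects completed
    short_term_outcomes: dict = field(default_factory=dict)  # measured right after participation
    long_term_outcomes: dict = field(default_factory=dict)   # measured at a later, specified date

# Hypothetical service-learning program
model = LogicModel(
    resources={"staff": 4, "grant_dollars": 25000},
    activities={"planning_meetings": 6, "service_learning_hours": 120},
    outputs={"projects_completed": 9},
    short_term_outcomes={"leadership_score_gain": 0.8},
    long_term_outcomes={"volunteer_hours_per_year": 40},
)
print(model.outputs["projects_completed"])  # 9
```

Keeping the indicators grouped by stage makes it easy to check that every stage of the model has at least one thing being counted.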

 Resources – Count the number of staff and amount of money used to instigate the program. Count the number of networks established.
 Activities – Count the number of planning meetings and number of service learning activities planned by students. Count the number of hours students spent in activities.
 Outputs – Count the number of projects completed by students. This could be subdivided into things such as the number of picnic tables built for a park, etc.
 Short-Term Outcomes – Use a survey to assess positive changes or accomplishments as perceived by service learning participants. Compare to a control group.
 Long-Term Outcomes – At a specified future time, use a survey to assess positive changes in the lives of students (i.e., number of hours spent in volunteer work, etc.). Compare to a control group.

 Resources – Count the number of staff and amount of money used to instigate the 4-H program. Count the number of networks established.
 Activities – Count the number of 4-H activities that occurred and/or the number of students who participated in planning activities.
 Outputs – Count the number of students who participated in workshops and projects, and the number of educational packets distributed.
 Short-Term Outcomes – Assess students' sense of accomplishment and increase in knowledge using a survey.
 Long-Term Outcomes – Count the number of 4-H students who attend college after attending educational workshops. Compare to a control group.

 Outcome Evaluation
 Process Evaluation
 Formative Evaluation
 Needs Assessment

An Example of Each Type of Evaluation

 Outcome Evaluation – Example: surveys, intensive interviews, and focus groups are used to determine (1) the long-term impact on USU students and (2) the long-term impact on high school students.
 Process Evaluation – An ongoing evaluation is used to monitor the program. Example: the grantee comes from time to time to "see how things are going."
 Formative Evaluation – Information is gathered to formulate a plan to establish a program. Example: planning meetings with high school administrators, etc.
 Needs Assessment – A need for a program is identified. Example: the need for a mentoring program with public school students.

Needs Assessment

 Purpose
 ◦ Evaluate the need for a program/intervention/project
 Basic Steps
 ◦ Use secondary data (i.e., census data, community reports, etc.) and/or surveys to create a profile of the community that assesses existing resources and current needs
 ◦ Determine if current resources are adequately meeting needs
 ◦ Identify one or more specific needs that are not currently being met

Needs Assessment – Example

 Purpose
 ◦ Determine the percent of the local population that is from an underrepresented group and compare it to the percent of students in local post-secondary educational institutions. Based on this comparison and the programs already available, determine the need to provide additional opportunities for this group.
 Basic Steps
 ◦ Use secondary data to determine the percent of the population that is Latino. Use USU data to compute the percent of USU students who are Latino. Compare the percentages and note any discrepancy.
 ◦ Using local webpages and phone books, identify programs/organizations that focus on the Latino population and meeting their needs.
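The comparison in the basic steps is simple arithmetic: the share of the community that is Latino versus the share of the campus that is Latino. A minimal sketch, with invented counts standing in for real census and enrollment figures:

```python
# Hypothetical counts; real figures would come from census data and USU enrollment records.
county_population = 110_000
county_latino = 14_300
usu_students = 28_000
usu_latino = 1_960

pct_community = 100 * county_latino / county_population  # share of community that is Latino
pct_campus = 100 * usu_latino / usu_students             # share of campus that is Latino
gap = pct_community - pct_campus                         # representation gap in percentage points

print(f"Community: {pct_community:.1f}%  Campus: {pct_campus:.1f}%  Gap: {gap:.1f} points")
# Community: 13.0%  Campus: 7.0%  Gap: 6.0 points
```

A positive gap suggests the group is underrepresented on campus relative to the community, which (together with an inventory of existing programs) supports the case for additional opportunities.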

Formative Evaluation

 Purpose
 ◦ Plan (form) a program that will meet the needs of the community, keeping in mind the need to develop indicators that can be used to evaluate the success of the program
 Basic Steps
 ◦ Locate other similar programs to use as a model
 ◦ Meet with "experts" in this area and get their advice
 ◦ Organize a program and apply for a grant
 ◦ Form an "ad hoc" evaluation committee to evaluate the program

Formative Evaluation – Example

 Purpose
 ◦ Plan a mentoring program where USU students mentor high school students. Identify objectives (i.e., increase high school graduation rates, increase college retention rates of mentees).
 Basic Steps
 ◦ Identify other mentoring programs
 ◦ Meet with directors of programs, professors, etc., who have been or are involved in mentoring programs
 ◦ Organize your mentoring program and apply for a grant to implement the program
 ◦ Select members of your committee to engage in evaluation research

Process Evaluation

 Purpose
 ◦ To monitor the implementation of your program on an ongoing basis; it is a type of program monitoring
 Steps
 ◦ Specifically describe your program
 ◦ Decide what products to count
 ◦ Consider objectives: (verb) (target) (date)
 ◦ Decide on valid ways to count products
 ◦ Identify process indicators
 ◦ Quality assurance

Process Evaluation – Example

 Purpose
 ◦ Monitor the mentoring program and address two questions: (1) Is the mentoring occurring as planned? (2) How are the mentees responding?
 Steps
 ◦ Describe the program: the mentoring program connects high school mentees with USU mentors through a monitored blog
 ◦ Decide what products to count: the number of hours that mentors spent in contact with their mentees, etc.
 ◦ Decide on a valid way to count products: have the program administrator count hours spent on the website
 ◦ Quality assurance: high school counselors will be contacted periodically to determine how mentees are responding
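Counting a process indicator like mentor contact hours amounts to tallying entries from the site's activity log. A minimal sketch, assuming a log of (mentor, hours) records; the log format and all values here are invented for illustration:

```python
from collections import defaultdict

# Hypothetical contact log pulled from the monitored blog: (mentor, hours) per session.
contact_log = [
    ("mentor_a", 1.5), ("mentor_b", 2.0), ("mentor_a", 1.0),
    ("mentor_c", 0.5), ("mentor_b", 1.5),
]

# Tally hours per mentor, then total across the program.
hours_by_mentor = defaultdict(float)
for mentor, hours in contact_log:
    hours_by_mentor[mentor] += hours

total_hours = sum(hours_by_mentor.values())
print(total_hours)  # 6.5
```

Per-mentor tallies also answer the "is mentoring occurring as planned?" question: a mentor with zero logged hours is an implementation problem worth flagging before the outcome evaluation.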

Outcome Evaluation

 Purpose
 ◦ To determine how well you meet the objectives of your program
 Steps
 ◦ Identify short-term and long-term outcomes
 ◦ Construct indicators to measure outcomes
 ◦ Decide on data collection and sampling techniques
 ◦ Collect data
 ◦ Analyze data
 ◦ Write report

Outcome Evaluation – Example

 Purpose
 ◦ To determine how well the mentoring program meets its objectives
 Steps
 ◦ Short-term outcome: establish relationships between mentors and mentees
 ◦ Long-term outcomes: higher graduation and retention rates for mentees
 ◦ Mentees will be asked questions about their experience, and then about their graduation/retention
 ◦ Surveys will be administered to all mentees after completion of the program and then again in three years
 ◦ Surveys are administered
 ◦ Using SAS, data are entered and analyzed
 ◦ A report is written
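The slides mention SAS for the analysis; as a language-neutral illustration, the core comparison (mentees versus a control group on a survey measure) can be sketched in standard-library Python. The scores below are invented, and the statistic shown is Welch's two-sample t, one common choice for comparing group means:

```python
import statistics as st

# Hypothetical survey scores (e.g., a graduation-intent scale) for each group.
mentees = [78, 85, 91, 70, 88, 82, 95, 76]
control = [72, 68, 80, 75, 65, 79, 71, 74]

m1, m2 = st.mean(mentees), st.mean(control)
v1, v2 = st.variance(mentees), st.variance(control)  # sample variances
n1, n2 = len(mentees), len(control)

# Welch's two-sample t statistic: difference in means scaled by its standard error.
t = (m1 - m2) / ((v1 / n1 + v2 / n2) ** 0.5)
print(f"mentee mean {m1:.1f}, control mean {m2:.1f}, t = {t:.2f}")
```

A large positive t (judged against the appropriate t distribution) would support the claim that mentees score higher than the control group on the outcome indicator.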

Quality Assurance vs. Outcome Evaluation

 Quality Assurance
 ◦ Informs administrators and clients
 ◦ Relies on "expert" opinion
 ◦ Legislative mandate
 ◦ Client satisfaction is the focus
 Outcome Evaluation
 ◦ Informs administrators
 ◦ Relies on statistics
 ◦ Not mandated
 ◦ Does not necessarily focus on client satisfaction

Questions or comments? Please contact:

Carol Albrecht
Assessment Specialist
USU Extension