Dr. Carol Albrecht Research Team EXAMPLE of EVALUATION RESEARCH.


18 public school districts in the state of Texas were involved in Service Learning Projects. The State of Texas Service Learning Center hired Dr. Carol Albrecht and her students to evaluate this program. These PowerPoint slides outline the steps they took to complete the evaluation. We met with the Center to identify their objectives. They wanted to know how the program impacted public school children, teachers, community partners, and parents of students.

Elementary Students | High School Students | Teachers | Parents | Community Partners

Step One: Identify the objectives, conceptualize variables, and develop indicators.
Step Two: Select the sample.
Step Three: Construct the indicators.
Step Four: Select the research design.
Step Five: Collect and analyze the data.
Step Six: Write the report and present findings.


An Old Chinese Proverb States: I HEAR, I FORGET I SEE, I REMEMBER I DO, I UNDERSTAND By “doing” this project we learned and really understood some important components of valid and reliable evaluation research.

We learned to:
- Carefully consider the impact of timing
- Select the "right" indicators
- Deal with social desirability
- Use multi-methods and multiple units of analysis

First: Carefully Consider the Impact of Timing

Timing of the Pre-Test
Many programs are ongoing, and this can have a major impact on the pre-test. In our study, many of the students had already participated in a Service Learning activity at some point in their school years, so we didn't have a true "pre" test: the "pre" test scores were contaminated by prior participation.

Timing of the Post-Test: The "May" Effect
Outside environmental and social factors need to be considered. In our study, we discovered that both teachers and students were more negative about almost EVERYTHING related to school at the end of the year. This may be explained by two factors: first, they were just TIRED of school and looking forward to vacation; second, they had just taken standardized tests (the TAKS), which were stressful for both teachers and students.

Second: Select the "Right" Indicators

Selecting the Right Indicators: The Head Start Program
In the 1960s, the Head Start program was launched. The objective was to increase the IQ scores of underrepresented populations, including children living in poverty. Early research showed that standardized IQ scores increased for several years and then decreased, until there was no difference between the experimental and control groups. While some felt this was evidence for discontinuing the program, parents came forward arguing that the researchers weren't using the right measurements.

Selecting the Right Indicators: The Head Start Program
A group of researchers called the Perry Preschool Consortium, with the input of teachers and parents, identified (1) social, (2) educational, and (3) socioeconomic indicators that differentiated preschool participants from a control group up to 19 years after participation in the program. The differences were compelling.

Accurate Interpretation of Indicators
Furthermore, this group argued that the decreasing IQ scores actually provided evidence that environmental factors CAN influence IQ, both positively and negatively. Thus, being in an "enriched" environment (i.e., the Head Start Program) can increase IQ, but then being transferred to an impoverished environment (i.e., public schools in poor neighborhoods) can decrease IQ.

Schooling Success: High School Graduation or Equivalent; College or Vocational Training
Functional Competence: Ever Classified as Mentally Retarded; Time Spent in Special Education
Social Responsibility: Ever Detained or Arrested; Teen Pregnancies; Employed; Receiving Welfare

Selecting the Right Indicators
Using focus groups and intensive interviews, we looked to (1) teachers, (2) parents, and (3) student participants, as well as past research, to help us identify valid and accurate indicators. Analysis of this qualitative data indicated that (1) some students did experience the desired impact, and (2) we needed to use "control" variables to accurately assess the impact of Service Learning.

Selecting the Right Control Variables
The following control variables were all highly significantly related to EVERY outcome measurement.
- Ownership of Project: Students who were involved in planning the project and felt they "had a voice" were significantly more likely to experience positive outcomes.
- Amount of Participation: A significant number of students spent four hours or less involved in the project. This level of involvement did not result in positive outcomes. However, students who spent more time did experience positive outcomes.
- Teacher's Attitudes and Experience: The teacher's attitude toward the project was an important factor. If teachers were excited about the program, their students tended to have more positive outcomes.

Student Success Outcomes:
1. Leadership Skills
2. Problem Solving Skills
3. Academic Aspirations
4. Liking for School

Indicators of Participation:
- Whether or not students felt they made decisions about the project
- Amount of time students indicated they participated in the project
- Teachers' self-reported attitudes about the project

Results from focus groups with students and intensive interviews with teachers indicated that these were valid indicators, and that the quality and quantity of participation were related to outcomes.

Mean Score on Outcome Measurements by Amount of Time Student Planned or Participated in Project

Students Planned Four Hours or More        Yes         No
Attitudes Toward Attending College         12.81***    12.50***
Attitudes Toward School                    19.84***    18.45***
Problem Solving Skills                     17.01***    15.58***
Leadership Skills                          17.15***    15.85***

Students Participated Four Hours or More   Yes         No
Attitudes Toward Attending College         12.89***    12.18***
Attitudes Toward School                    19.54***    18.38***
Problem Solving Skills                     16.91***    15.09***
Leadership Skills                          17.04***    15.46***

*p<.05  **p<.01  ***p<.0001

Mean Scores on Outcome Measurements by Sense of Ownership (Whether or Not Students Made Decisions)

Students Made Decisions            Yes         No
Attitudes Toward College           13.04***    12.36***
Attitudes Toward School            19.59***    18.84***
Problem Solving Skills             17.26***    15.58***
Leadership Skills                  17.51***    15.81***

*p<.05  **p<.01  ***p<.0001
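Group comparisons like the ones in these tables can be sketched in a few lines of code. The scores below are invented for illustration (they are NOT the study's raw data), and a simple Welch t statistic stands in for whatever significance test the team actually used:

```python
# Hypothetical sketch: compare mean outcome scores for two groups of students
# (e.g., planned four hours or more vs. did not). All numbers are invented.
from statistics import mean, variance
from math import sqrt

def welch_t(a, b):
    """Welch's two-sample t statistic (does not assume equal variances)."""
    se = sqrt(variance(a) / len(a) + variance(b) / len(b))
    return (mean(a) - mean(b)) / se

planned_yes = [18, 20, 21, 19, 22, 20]   # hypothetical "Attitudes Toward School" scores
planned_no  = [17, 18, 19, 18, 17, 19]

# Report the two group means and the t statistic for the difference.
print(mean(planned_yes), mean(planned_no), round(welch_t(planned_yes, planned_no), 2))
```

A large t statistic (checked against the t distribution for a p-value) is what the asterisks in the tables summarize.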

CHART 1. High School Students’ Perception of How Good They are at Speaking in Front of Groups by Whether or Not They Made Decisions about Service Learning Projects (p <0.0001)

CHART 2. High School Students’ Perception of How Good They are at Finding Resources by Whether or Not They Made Decisions about Service Learning Projects (p <0.0001)

Third: Deal with Social Desirability

Beware of Social Desirability
In evaluation research, participants often evaluate a program positively EVEN when the program is poor and ineffective. It may not be seen as socially acceptable to do otherwise.

Social Desirability
- Why did we become concerned?
- What are the "danger" signs?
- How did we attempt to alleviate it?
- How did we modify the construction of our surveys, our research design, and our analysis of the data to deal with this problem?

Social Desirability: Danger Signs
Type of research: Past literature indicates that respondents tend to be very positive when asked about their participation in a program, even when it is a poor program. They don't want to believe they wasted their time, and they often feel an obligation to express appreciation for those who implemented the program.

Social Desirability: Danger Signs
Self-selection into the program: Students and teachers were not required to participate in the program. Therefore, the program was more likely to attract participants who already had positive attitudes toward these "types" of activities.

Social Desirability: Danger Signs
Consistently high scores on every aspect of the program (no variation):
- Response set can occur: respondents give you the same response (usually positive) without seriously considering the question.
- The "ceiling" effect is a similar problem: consistently high positive scores on the pre-test leave little room for improvement in scores.
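One simple screen for response set is to flag respondents whose answers show essentially no variation across an index. This is a minimal sketch, assuming each respondent's items share a numeric scale; the function name and threshold are hypothetical, not part of the study:

```python
# Hypothetical sketch: flag a possible response set, i.e., a respondent who
# gives (nearly) the same answer to every item without reading the questions.
from statistics import pstdev

def flag_response_set(answers, tol=0.0):
    """Return True when the respondent's answers have (nearly) zero spread."""
    return pstdev(answers) <= tol

print(flag_response_set([5, 5, 5, 5]))   # identical answers: possible response set
print(flag_response_set([5, 3, 4, 2]))   # varied answers: no flag
```

Flagged cases warrant a closer look rather than automatic deletion; a sincere respondent can legitimately agree with everything.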

Dealing with Social Desirability when Constructing your Survey/Questionnaire: Check List
- Make participation voluntary and make answers anonymous or confidential.
- Vary negative/positive statements in your index.
- Avoid misleading/biased questions.
- Make statements or questions very specific.

Dealing with Social Desirability when Constructing your Survey/Questionnaire: Check List (continued)
- Put "sensitive" questions at the end.
- Ask how they would change the program "under ideal circumstances."
- Avoid yes/no answers; ask for "degrees" of positive or negative.
- Ask for their input in improving the program rather than simply evaluating it. For instance: NOT "Is this a successful program?" but rather "What factors increase or decrease the success of this program?"
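The "vary negative/positive statements" item implies reverse-coding at scoring time. A minimal scoring sketch follows; the item names and the 5-point scale are assumptions for illustration, not the study's actual instrument:

```python
# Hypothetical sketch: score an attitude index that mixes positively and
# negatively worded items. Negatively worded items are reverse-coded so that
# a higher total always means a more positive attitude.
POSITIVE = {"q1_like_school", "q3_college_plans"}
NEGATIVE = {"q2_school_boring"}          # reverse-coded before summing
SCALE_MAX = 5                            # 1 = Strongly Disagree ... 5 = Strongly Agree

def index_score(responses):
    """Sum item answers, reverse-coding the negatively worded items."""
    total = 0
    for item, answer in responses.items():
        if item in NEGATIVE:
            answer = SCALE_MAX + 1 - answer   # 5 -> 1, 4 -> 2, ...
        total += answer
    return total

# Agreeing that school is boring (4) contributes only 2 after reverse-coding.
print(index_score({"q1_like_school": 5, "q2_school_boring": 4, "q3_college_plans": 5}))
```

Mixing item directions this way also makes a response set visible: answering "5" to everything produces a middling index score instead of a perfect one.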

Dealing with Social Desirability in Your Research Design: Check List
- If possible, don't evaluate your own program. An "outsider" would tend to be more objective, and participants would be more likely to provide unbiased answers.
- Have a variety of participants evaluate the program so you can look for consistencies/inconsistencies in answers: students, teachers, parents of participants, community partners.

Dealing with Social Desirability in Your Research Design: Check List (continued)
- Use multi-methods so you can compare results across methods, to see if you get similar results and to look for additional insights. These could include: focus groups, participant observation, surveys, intensive interviews, content analysis.
- Content analysis is especially important for researchers who identify tangible products (e.g., bushels of grain) as their outcomes.

Dealing with Social Desirability when Analyzing the Data: Check List
- Compare your program with other programs.
- Compare across different levels of participation within your sample to see if there are variations.
- Compare across different types of participation within your sample (i.e., in our study, we compared across types of Service Learning projects).

Dealing with Social Desirability when Analyzing the Data: Check List (continued)
- Compare across different "types" of participants (males vs. females, parents vs. children, rural vs. urban dwellers).
- Compare scores across questions, especially questions that measure the same outcomes.
- Compare answers across time (fall vs. summer participants).
The most important thing to remember here is to NOT just ask if the program was successful, but rather HOW and WHEN it is most successful.
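The "compare across types of participants" step amounts to grouping scores by an attribute and comparing the group means. A minimal sketch with invented records (field names and values are hypothetical):

```python
# Hypothetical sketch: group survey scores by a participant attribute
# (sex, grade level, fall vs. summer, etc.) and compare the group means.
from collections import defaultdict
from statistics import mean

records = [
    {"group": "male",   "score": 17},
    {"group": "male",   "score": 19},
    {"group": "female", "score": 20},
    {"group": "female", "score": 18},
]

def mean_by_group(rows, key="group", value="score"):
    """Return {group label: mean score} for the given rows."""
    buckets = defaultdict(list)
    for row in rows:
        buckets[row[key]].append(row[value])
    return {g: mean(v) for g, v in buckets.items()}

print(mean_by_group(records))   # prints the mean score for each subgroup
```

Consistent means across subgroups support the results; large gaps point to WHERE and FOR WHOM the program works, which is the question the checklist ends on.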

Fourth: Use Multi-Methods and Diverse Groups

Different research designs can provide both additional insights and further support for your results:
Focus Groups, Surveys, Content Analysis, Participant Observation, Case Studies, Field Trials, Laboratory Experiments, Intensive Interviews

Evaluation of the Service Learning Project:
- Intensive interviews with teachers
- Focus groups with students
- Mail-out surveys with community partners
- Face-to-face surveys with parents
- On-line surveys of Service Learning coordinators
- Surveys with students

Evaluation of the Service Learning Program
Examples of data, collected from teachers using telephone surveys and focus groups, showing that the program is producing the desired outcomes.

Descriptive Statistics for Elementary and Middle/High School Service Learning Teachers: Extent to Which Teachers Agree (Agree/Strongly Agree) with the Following Statements about the Impact of Service Learning in Their Classroom (number and percent reported for elementary and for middle/high school teachers)
- Positive Addition to Classroom Learning
- Beneficial for ALL Students
- Motivates Students to be Involved
- Helps Students Learn Curriculum
- Should be Required for All Students

Descriptive Statistics: Identification of GREATEST BENEFITS by Middle/High School and Elementary Service Learning Teachers (percent reported for each group)
Benefits for Students: Service to Others; Understanding of World; Personal Growth; Help Learn Curriculum
Benefits for Teachers: Student Growth; Service to Others; Involvement with Students; Break in TAKS

As One Teacher Stated, "Service Learning is the most powerful and impactful thing I ever did in the classroom as a teacher. It hooked me, and I am a believer in the power." Another Teacher Claimed, "I think this program has transcended anything that anyone expected when they began the program. It has extended beyond what they thought it could achieve."

One Teacher Argued, “ I could have never ever taught the lessons they learned about human nature.” While Another Claimed, “ It teaches kids the skills that are not book skills….skills like how to think, how to plan, how to organize, how to manage - stuff you can read about in a book, but until you do it, you don’t know you have the ability to do it.”

One Teacher Stated, “ school is not as…engaging as when they learn through these projects…they are learning all of these things by action – their great public speaking skills, their writing skills, their marketing…” Another Teacher Explained, “ in the writing TAKS, we had to write with a prompt so it kind of helped with the writing and the reading TAKS too.”

Evaluation of the Service Learning Program
Examples of data, collected from parents and community partners using telephone surveys and focus groups, showing that the program is producing the desired outcomes.

Descriptive Statistics for Parents and Community Partners: An Evaluation of the Service Learning Program (mean and range reported for each group on a 5-point scale from Strongly Disagree to Strongly Agree; the community partner sample size is small)
- Positive Addition to Classroom
- Beneficial for All Students
- Motivates Students to be Involved
- Helps Agency Achieve Goals
- Is Valued by Agency
- Is Valued by Community

One Community Partner Described Their Relationship with the School: "We actually came to the schools…and we were looking for assistance. It's a great marriage. We are still married." And when describing the benefits for students: "…we've watched students mature into more socially aware students - much more mature. It's amazing. It's just amazing."

- Timing of data collection is important.
- Selecting reliable/valid indicators is critically important. Spend some time doing this.
- IF you are doing evaluation research, plan ways to reduce the impact of social desirability on your results.
- Use multi-methods when feasible to provide additional insights and greater support for your results.
- Try to gather information from all the different groups that may be impacted by the program (i.e., parents, students, etc.).

Dr. Carol Albrecht
Assessment Specialist, USU Ext
(979)