Research and Evaluation: The Basics
A Workshop for Educators
Session 519, Saturday, June 30, 2007
Rita DeMaria, Ph.D., and Scott Gardner, Ph.D.
Research
Careful, thorough, patient, systematic investigation undertaken to establish facts or principles in a field of study. Research uses the scientific method:
- Quantitative: statistics
- Qualitative: interviews
Evaluation
Is my program accomplishing its goals and objectives? How do participants view and experience the service?
What’s Good and What’s Bad Research?
- Don’t do anything
- Touchy-feely
- Sample problems (biased)
- No control
- Too broad (trying to change the world)
- Bad measures
- Random assignment
Something is Much Better than Nothing!
Ways of Collecting Data
- Insider vs. outsider
- Questionnaires
- Self-anchored rating scales (Likert)
- Interviews
- Behavioral observations (of self and others; ratings), e.g., a husband’s TV habits
- Timed daily logs
- “Hard” measures
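A timed daily log like the TV-habits example above can be summarized with a few lines of code. A minimal sketch in Python; the log entries below are hypothetical illustration data:

```python
# Summarizing a timed daily behavioral log (e.g., minutes of TV watched per day).
# The entries are made-up illustration data, not real observations.
from statistics import mean

daily_log = {
    "Mon": 120, "Tue": 95, "Wed": 150, "Thu": 80, "Fri": 200,
}

total = sum(daily_log.values())          # total minutes logged this week
avg = mean(daily_log.values())           # average minutes per logged day
print(f"Total minutes this week: {total}, average per day: {avg:.0f}")
```

Even a simple weekly tally like this turns a behavioral observation into a number that can be compared before and after a program.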
Use Multiple Methods and Levels
Individual, sibling dyad, couple dyad, parent-child dyad, family, community, etc.
What are reliability and validity?
The benefits of using established, reliable, and valid measures
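Established measures are typically published with reliability statistics such as Cronbach's alpha, a common index of internal-consistency reliability (how well a scale's items hang together). A minimal sketch in Python; the Likert responses are made-up illustration data:

```python
# Cronbach's alpha, a common index of internal-consistency reliability:
#   alpha = (k / (k - 1)) * (1 - sum(item variances) / variance of total scores)
from statistics import pvariance

def cronbach_alpha(items):
    """items: one inner list of respondent scores per questionnaire item."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # each respondent's total
    item_var = sum(pvariance(scores) for scores in items)
    return (k / (k - 1)) * (1 - item_var / pvariance(totals))

# Hypothetical Likert responses: 3 items (rows), 5 respondents (columns).
responses = [
    [4, 5, 3, 4, 5],
    [4, 4, 3, 5, 5],
    [5, 4, 2, 4, 4],
]
print(f"alpha = {cronbach_alpha(responses):.2f}")
```

Values of alpha above roughly .70 are conventionally taken as acceptable reliability, which is one reason to prefer an established scale over a homemade one.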
Examples from Our Research
- From Scott’s research
- From Rita’s research: a comparison of marriage education program participants
Scales You Can Use
- Dyadic Adjustment Scale
- Marital Status Inventory
- Conflict Tactics Scale
Resources for finding measurement scales
Overview of Different Research Designs
- Basics of the pretest-posttest design
- Observation, interviews, surveys
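The pretest-posttest design compares each participant's score before and after the program, so the natural analysis is a paired (dependent-samples) t-test on the gain scores. A minimal standard-library sketch in Python; the pre and post scores are hypothetical:

```python
# Paired t-test for a pretest-posttest design:
#   t = mean(differences) / (stdev(differences) / sqrt(n))
from math import sqrt
from statistics import mean, stdev

pre  = [12, 15, 11, 14, 13, 10, 16, 12]   # hypothetical pretest scores
post = [15, 17, 14, 16, 15, 13, 18, 14]   # hypothetical posttest scores

diffs = [b - a for a, b in zip(pre, post)]  # per-participant gain
n = len(diffs)
t = mean(diffs) / (stdev(diffs) / sqrt(n))
print(f"mean gain = {mean(diffs):.2f}, t({n - 1}) = {t:.2f}")
```

Pairing each person with themselves controls for individual differences, which is why the paired test is more sensitive than comparing two unrelated groups of the same size.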
Other Issues
- Informed consent
- Protection of human subjects
Statistics
- Generalizability
- What is statistical significance?
Some Statistical Programs
- SPSS
- Excel
Stat Help 101
Dissemination
What’s Your Plan? (Small Groups)
- My program
- My population
- My plan: pretest, posttest, follow-up
- Getting the word out
Free Consultation and Support
Please contact either of us for help or questions: Rita DeMaria, Ph.D., or Scott Gardner, Ph.D.