Broadening Participation in Computing (BPC) Alliance Evaluation Workshop
Patricia Campbell, PhD, Campbell-Kibler Associates, Inc.

Evaluation Basics: Soup, Cooks, Guests & Improvement
When cooks taste the soup, it's formative evaluation: the collection of information that can be used to improve the soup. If necessary, the cook's next step is to explore strategies to fix the problem. The cook makes some changes and then re-tastes the soup, collecting more formative evaluation data. When the guests taste the soup at the table, they're doing summative evaluation. They are collecting information to make a judgment about the overall quality and value of the soup. Once the soup is on the table and in the guests' mouths, there is little that can be done to improve that soup. (Thanks to Bob Stake for first introducing this metaphor.)

Challenging Assumptions
"When I was a physicist, people would often come and ask me to check their numbers, which were almost always right. They never came and asked me to check their assumptions, which were almost never right." (Eli Goldratt)

Pat's Evaluation Assumptions
The core evaluation question is "What works for whom in what context?"
Black hole evaluations are bad.
If you aren't going to use the data, don't ask for it.
A bad measure of the right thing is better than a good measure of the wrong thing.
Acknowledging WIIFM ("What's in it for me?") increases response rates.
Process is a tool to help understand outcomes.
Outcomes are at the core of accountability.

Some Thoughts on Measurement
Don't reinvent the wheel; where possible, use existing measures.
Share measures with other projects. Common questions can be useful.
Look for benchmark measures that are predictors of your longer-term goals.
All self-developed measures need some checking for validity and reliability.
A sample with a high response rate is better than a population with a low one (see the sketch below).
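
To illustrate that last point, here is a minimal, hypothetical simulation (not part of the original slides). The numbers are invented: it assumes people who are satisfied with a program are more likely to answer a survey, and shows how a full-population survey with a 20% response rate can be more misleading than a modest random sample with strong follow-up.

```python
# Illustrative sketch with invented numbers: nonresponse bias in a census
# vs. a smaller random sample with a high response rate.
import random

random.seed(1)
# True population: about 40% "satisfied" (coded 1), 60% not (coded 0).
population = [1 if random.random() < 0.40 else 0 for _ in range(10_000)]

# Census where satisfied people are twice as likely to respond (roughly 20% overall).
census_responses = [x for x in population if random.random() < (0.26 if x else 0.13)]

# Random sample of 500 with an 85% response rate, roughly independent of satisfaction.
sample = random.sample(population, 500)
sample_responses = [x for x in sample if random.random() < 0.85]

print(f"True satisfaction rate:   {sum(population) / len(population):.1%}")
print(f"Census, ~20% response:    {sum(census_responses) / len(census_responses):.1%}")
print(f"Sample, ~85% response:    {sum(sample_responses) / len(sample_responses):.1%}")
```

The census estimate lands well above the true rate because responders are not representative, while the smaller, well-followed-up sample stays close to it.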

Some Web-based Sources of Measures
OERL, the Online Evaluation Resource Library
ETS Test Link (a library of more than 25,000 measures)

Sample Under-represented Student Instruments From OERL
Attitude Surveys
Content Assessments
Course Evaluations
Focus Groups
Interviews
Journal/Log Entries
Project Evaluations
Surveys
Workshop Evaluations

Compared to What? Evaluation Designs
Experimental designs
Quasi-experimental designs
Mixed-methods designs
Case studies
NSF does not promote one design; rather, it wants the design that will do the best job of answering your evaluation questions!

Making Comparisons: Why Bother?

Web-based Sources of Comparisons: K-12
In every state, public schools have web-based school report cards that include grade-level student achievement scores on standardized mathematics and language arts/reading tests, often disaggregated by race/ethnicity and by sex, for a period of years. The U.S. Department of Education's Common Core of Data (CCD) reports public school data including student enrollment by grade, student demographic characteristics, and the percent of students eligible for free or reduced-price lunches. Comparison schools can be selected using the CCD, and achievement data for both sets of schools over time can be downloaded from states' report cards (see the sketch below).
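
As a rough illustration of that workflow, the sketch below pairs each project school with a demographically similar non-project school from a CCD-style extract and then compares report-card scores over time. It is only a sketch under stated assumptions: the file names and column names (school_id, enrollment, pct_frl, pct_proficient_math, and so on) are hypothetical placeholders, not the actual CCD or report-card layouts.

```python
# Hypothetical sketch: matching project schools to comparison schools using
# CCD-style enrollment/demographic data, then lining up state report-card scores.
# All file names and column names are invented placeholders.
import pandas as pd

ccd = pd.read_csv("ccd_school_extract.csv")            # hypothetical CCD download
project = pd.read_csv("project_schools.csv")           # schools in the project
scores = pd.read_csv("state_report_card_scores.csv")   # downloaded report-card data

# Candidate comparison schools: everything in the CCD that is not in the project.
candidates = ccd[~ccd["school_id"].isin(project["school_id"])]

def closest_match(school, pool):
    """Pick the same-state school with the most similar size and poverty level."""
    same_state = pool[pool["state"] == school["state"]].copy()
    same_state["distance"] = (
        (same_state["enrollment"] - school["enrollment"]).abs() / school["enrollment"]
        + (same_state["pct_frl"] - school["pct_frl"]).abs() / 100
    )
    return same_state.nsmallest(1, "distance")["school_id"].iloc[0]

project["comparison_id"] = project.apply(closest_match, axis=1, pool=candidates)

# Average math proficiency by year for project schools vs. their comparisons.
project_trend = scores[scores["school_id"].isin(project["school_id"])]
comparison_trend = scores[scores["school_id"].isin(project["comparison_id"])]
print(project_trend.groupby("year")["pct_proficient_math"].mean())
print(comparison_trend.groupby("year")["pct_proficient_math"].mean())
```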

Caveats on Using Web-based Sources of Comparisons: K-12
These data can be used only if:
the goal of the strategy/project is to increase student achievement in a subject area tested by the state;
participating students have not yet taken their final state-mandated test in that subject area; and
most of the teachers in a school teaching that subject area are part of the strategy/project, and/or most of the students studying the subject area are part of that strategy/project.

Web-based Sources of Comparisons: College and University
The WebCASPAR database provides free access to institution-level data on students from surveys such as the Integrated Postsecondary Education Data System (IPEDS) and the Survey of Earned Doctorates. The Engineering Workforce Commission provides institution-level data (for members) on bachelor's, master's, and doctorate enrollees and recipients, by sex and by race/ethnicity for US students and by sex for foreign students. Comparison institutions can be selected from the Carnegie Foundation for the Advancement of Teaching's website based on Carnegie Classification, location, private/public designation, size, and profit/nonprofit status (see the sketch below).
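
Building on that, here is a minimal, hypothetical sketch of a trend comparison for computer science bachelor's degrees earned by under-represented minority students, with the project institution set against a comparison group chosen by Carnegie Classification. The file name, column names, institution ID, and classification label are invented stand-ins for a WebCASPAR/IPEDS-style completions extract, not the real data layout.

```python
# Hypothetical sketch: URM share of CS bachelor's degrees over time,
# project institution vs. a Carnegie-matched comparison group.
# File, columns, IDs, and labels below are invented placeholders.
import pandas as pd

completions = pd.read_csv("ipeds_completions_extract.csv")  # hypothetical extract

PROJECT_UNITID = 999999  # hypothetical institution ID
comparison_ids = completions.loc[
    (completions["carnegie_class"] == "Doctoral Universities: High Research Activity")
    & (completions["unitid"] != PROJECT_UNITID),
    "unitid",
].unique()[:10]  # first ten matches as a simple comparison group

def urm_share(df):
    """Yearly share of CS bachelor's degrees earned by under-represented minority students."""
    cs = df[df["cip_family"] == "11"]  # assumes CIP family 11 = computer & information sciences
    return cs.groupby("year").apply(lambda g: g["urm_degrees"].sum() / g["total_degrees"].sum())

print("Project institution:")
print(urm_share(completions[completions["unitid"] == PROJECT_UNITID]))
print("Comparison group:")
print(urm_share(completions[completions["unitid"].isin(comparison_ids)]))
```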

Some Web-based Sources of Resources
OERL, the Online Evaluation Resource Library
The User Friendly Guide to Program Evaluation
AGEP Collecting, Analyzing and Displaying Data
The American Evaluation Association

OERL, the Online Evaluation Resource Library
Includes NSF project evaluation plans, instruments, reports, and professional development modules on:
Designing an Evaluation
Developing Written Questionnaires
Developing Interviews
Developing Observation Instruments
Data Collection
Instrument Triangulation and Adaptation

User Friendly Guide to Program Evaluation
Introduction
Section I - Evaluation and Types of Evaluation
Section II - The Steps in Doing an Evaluation
Section III - An Overview of Quantitative and Qualitative Data Collection Methods
Section IV - Strategies That Address Culturally Responsive Evaluations
Other Recommended Reading, Glossary, and Appendix A: Finding an Evaluator

AGEP Collecting, Analyzing and Displaying Data
I. Make Your Message Clear
II. Use Pictures, Where Appropriate
III. Use Statistics and Stories
IV. Be Responsive to Your Audience
V. Make Comparisons
VI. Find Ways To Deal With Volatile Data
VII. Use the Results

Some Thoughts for Discussion
1. What evaluation resources do you have that you would like to share? What evaluation resources would you like to have from others?
2. What can you (and we) do to make your evaluation results known to and useful to:
your project?
other projects?
Jan?
the broader world?
3. Why do you think your strategies will lead to your desired outcomes? Use research, theory, or just plain logic.