SEPTEMBER SESSION 1: RESEARCH QUESTIONS & SELECTING MEASURES
The first steps toward getting your EA research project underway.

Presentation transcript:


YOUR EA RESEARCH PROJECT
As an EA, you will be evaluating the impact of your SLC project. Why? Because you need to know if it works. Typical SLC projects do outreach to a group of students with a particular goal in mind, say, getting middle school students excited about computing. If you don't evaluate it, you won't know if it is effective!

EVALUATION
Evaluation is the systematic acquisition and assessment of information to provide useful feedback about some object:
- Questions about the value or worth of something
- The context in which value is measured
- The data measured
- The knowledge drawn from the measured data
In practice, this maps to four steps:
1. Research question
2. What data can be collected
3. What data will be collected, when, how, and by whom
4. Findings and interpretations

RESEARCH QUESTIONS
Questions require answers: they give you a way to evaluate evidence. Clear, open-ended questions require research and critical thinking.

STEP 1: IDENTIFY A TOPIC
What are you interested in? What is your SLC doing? What are you curious about?
Example: an EA is curious whether doing a pair programming activity with students and parents will make parents think that computer science is a viable option for their kids.

STEP 2: BRAINSTORM
Start asking questions about the SLC outreach. What are the goals of the SLC outreach? How will you know that the goals are being met?
Example: the EA starts thinking about how he can answer his question. Do parents enjoy the activity? Do they think computing is a viable career for their kids? Is there a larger effect on the kids' attitudes when they do pair programming with their parents, versus a group of similar kids doing pair programming with peers?

STEP 3: HYPOTHESIZE
What do you expect to happen? What do you think the outcome will be? Or what is the intended outcome? Are there several expected outcomes? Prioritize: what's feasible in your timeframe?
For our example: the ability to get groups of parents and a comparable group of kids will determine the questions and the research design.

STEP 4: DESIGN YOUR RESEARCH
What measurement tool(s) will you use?
- Student survey, knowledge test, interviews, observation
- Teacher survey, interviews
- Parent survey, interviews
What will be the design?
- Quasi-experimental: pre/post assessment (before activity/after activity) [repeated measures design], or post assessment only
- Qualitative: interviews, focus groups, open-ended items on a survey [mixed methods]; documented observations
- Experimental: requires a control group
A sketch of how the resulting data might be organized follows below.
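To make these design choices concrete, here is a minimal sketch (not part of the original slides) of how pre/post survey data for such a project might be laid out. The column names, group labels, and 1-5 Likert scores are all assumptions for illustration.

```python
import pandas as pd

# Hypothetical pre/post survey records: one row per participant.
# "group" distinguishes the intervention group from a comparison
# group (only needed for an experimental design); scores are
# 1-5 Likert-scale attitude ratings.
records = pd.DataFrame({
    "participant_id": [1, 2, 3, 4, 5, 6],
    "group": ["intervention"] * 3 + ["control"] * 3,
    "pre_score": [2, 3, 2, 3, 2, 3],
    "post_score": [4, 4, 3, 3, 2, 3],
})

print(records)
```

Keeping one row per participant with both pre and post scores makes the repeated-measures comparison straightforward later on.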

COMMON RESEARCH DESIGNS
Descriptive: the goal is to observe and describe what happened.
Methods used:
- Surveys about opinions, attitudes, behaviors (called self-report)
- Observation
Example: collect teacher observations of the classroom environment (the mood of students) following an SLC activity. A sketch of summarizing such data follows below.
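As an illustration only (the slides do not prescribe an analysis), a descriptive design can be reported with simple frequencies and averages. The mood ratings below are hypothetical.

```python
import pandas as pd

# Hypothetical teacher ratings of classroom mood after an SLC
# activity, on a 1-5 scale (1 = disengaged, 5 = excited).
moods = pd.Series([4, 5, 3, 4, 4, 5, 3, 4], name="classroom_mood")

# Descriptive designs just observe and describe: report the
# distribution and central tendency, with no causal claims.
print(moods.value_counts().sort_index())  # frequency of each rating
print(f"mean = {moods.mean():.2f}, sd = {moods.std():.2f}")
```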

COMMON RESEARCH DESIGNS
Quasi-experimental (because you have no 'control' group): the goal is to determine cause, i.e., to see if your SLC outreach (called an "intervention") creates the outcomes you hypothesize.
Methods used:
- Surveys about opinions, attitudes, behaviors (called self-report)
- Tests of knowledge
- Can use a pre/post design (before and after the intervention)
Example: collect student attitudes (survey) about robots before an SLC activity and after. This works best for long-term interventions, not a one-time workshop. A sketch of the pre/post comparison follows below.
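One common way to analyze a pre/post (repeated measures) design is a paired t-test. This is a sketch under assumptions, not a method mandated by the session, and the attitude scores are invented.

```python
from scipy import stats

# Hypothetical pre/post attitude scores (1-5 Likert) for the same
# students, listed in the same order: a repeated-measures comparison.
pre = [2, 3, 2, 4, 3, 2, 3, 4]
post = [4, 4, 3, 5, 4, 3, 3, 5]

# Paired t-test: did attitudes change from before to after the
# intervention? Each student serves as their own baseline.
result = stats.ttest_rel(post, pre)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")
```

Note that without a control group, even a significant change cannot be attributed to the intervention alone, which is why this design is called quasi-experimental.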

COMMON RESEARCH DESIGNS
Experimental (has a 'control' group): the goal is to determine cause, i.e., to see if your SLC outreach (the "intervention") creates the outcomes you hypothesize. A control group that receives no intervention provides the comparison.
Methods used:
- Surveys about opinions, attitudes, behaviors (called self-report)
- Tests of knowledge
- Can use a pre/post design (before and after the intervention)
Example: collect student attitudes (survey) about robots before an SLC activity and after, and collect the same survey from a classroom of students in the same school and same grade who didn't have the SLC activity. Works best for long-term interventions, not a one-time workshop. A sketch of the treatment-vs-control comparison follows below.
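For the experimental design, a common analysis (again a hedged sketch, with made-up scores, rather than the slides' prescription) compares pre-to-post gains between the intervention classroom and the control classroom.

```python
from scipy import stats

# Hypothetical pre/post scores for an intervention classroom and a
# control classroom (same school, same grade, no SLC activity).
interv_pre, interv_post = [2, 3, 2, 4, 3], [4, 4, 3, 5, 4]
control_pre, control_post = [3, 2, 3, 3, 2], [3, 2, 3, 4, 2]

# Compare gain scores (post - pre) across groups: the control group
# lets you attribute a difference in gains to the intervention.
interv_gain = [b - a for a, b in zip(interv_pre, interv_post)]
control_gain = [b - a for a, b in zip(control_pre, control_post)]
result = stats.ttest_ind(interv_gain, control_gain)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")
```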

QUALITATIVE
- Interviews: a protocol (the questions you will ask); semi-structured vs. structured
- Focus groups: small groups of 8-10 people
- Observations: identify what you will record; documentation (form, pictures, recordings, etc.)
A sketch of tallying coded qualitative responses follows below.
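Qualitative data is usually coded into themes before reporting. As a purely illustrative sketch (the theme labels and responses are made up), here is how coded interview responses might be tallied.

```python
from collections import Counter

# Hypothetical theme codes assigned to parent interview responses
# after a pair programming activity (one code per response segment).
coded_responses = [
    "viable_career", "enjoyed_activity", "viable_career",
    "too_difficult", "enjoyed_activity", "viable_career",
]

# Tally how often each theme appears across interviews; counts like
# these support (but don't replace) quotes in the write-up.
for theme, count in Counter(coded_responses).most_common():
    print(f"{theme}: {count}")
```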

WHAT WILL YOU DO?
Report back in the EA session with your initial thoughts. Your write-up is your IRB application.