
An Evaluation Plan and Tool Kit for the Archibald Bush Innovative Teaching and Technology Strategies Grant Valerie Ruhe and J.D. Walker, Center for Teaching and Learning Services and Digital Media Center, University of Minnesota

Evaluation Plan  Evaluation: A formal appraisal of the quality of an educational phenomenon (Popham, 1993)  Gathering evidence to determine how well the innovation is “working”.  What does the grant say about the evaluation plan?

Foci of the Bush Grant: Course-level Evaluation  Student Outcomes  Learning Process: Student Engagement, Reflective Learning, Responsible Learning

The Bush grant re: Evidence  Grades: ABC and DFW rates (outcomes)  Faculty Reflection Logs and conversations with consultants (process)  Surveys to compare engagement across groups, e.g., with/without the innovation, and/or:  Surveys to compare across time, e.g., before/after the innovation.
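The DFW-rate outcome evidence above can be tallied with a short script. This is a minimal sketch; the section names and grade lists are hypothetical illustration data, not from the Bush grant.

```python
# Minimal sketch: comparing DFW rates across two hypothetical sections.

def dfw_rate(grades):
    """Fraction of grades that are D, F, or W (withdrawal)."""
    dfw = sum(1 for g in grades if g in {"D", "F", "W"})
    return dfw / len(grades)

# Hypothetical grade lists for sections with and without the innovation.
with_innovation = ["A", "B", "B", "C", "A", "D", "B", "A", "C", "B"]
without_innovation = ["B", "C", "D", "F", "B", "W", "C", "A", "D", "C"]

print(f"With innovation:    {dfw_rate(with_innovation):.0%}")     # 10%
print(f"Without innovation: {dfw_rate(without_innovation):.0%}")  # 40%
```

A raw rate comparison like this only describes the two groups; whether the difference is meaningful depends on enrollment sizes and study design.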

Survey Items: Closed-ended (Examples)  Student Engagement  Mostly, I come to class because I want the certificate/degree.  Most of the time, I enjoy this class.  Reflective Learning  I enjoy applying theories or concepts from this class to new situations.  I sometimes find myself thinking about the lecture after it has ended.  Responsible Learning  I usually come to class prepared.  I usually review my lecture notes after class.

Survey Items: Open-ended  What are the most important benefits of this innovation for you?  What are the drawbacks?  What kind of problems have you had with the technology? Be specific.  What suggestions do you have for improvement?

Tailoring the Evaluation Plan to Meet Your Needs  What do you want to know?  Collect the kind of evidence that will answer that question, for example:  Business: an experimental design with two treatment groups and long pre- and post-intervention surveys.  Architecture: a series of short surveys, interviews, and observations, analyzed for recurring themes.

Optional Evaluation Methods  Scoring rubrics for student assignments.  Class observation: take notes and analyze them for themes; findings can inform surveys.  Narrative: to learn about learners’ reflective practices; triangulation is important.  Interviews: open-ended items; the interviewer can deviate from the protocol; tape-record and analyze for themes.  Focus groups: validity checks.

Implementing the Plan  Decide what you want to know  Choose your preferred methods/tools  Write surveys and/or interview protocols  Collect data  Data entry: training?  Data analysis
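For closed-ended survey items, the data-entry and data-analysis steps above can start as a simple frequency tally. A sketch, using hypothetical responses to one of the engagement items:

```python
from collections import Counter

# Hypothetical responses to "Most of the time, I enjoy this class."
responses = ["agree", "agree", "neutral", "agree", "disagree",
             "strongly agree", "agree", "neutral"]

tally = Counter(responses)
total = len(responses)

# Print each option with its count and percentage, most frequent first.
for option, count in tally.most_common():
    print(f"{option:>15}: {count} ({count / total:.0%})")
```

Frequencies and percentages like these are usually enough for a single-course evaluation report; cross-group comparisons need the matched before/after or with/without data described earlier.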

Tools: Scoring Rubrics  “Scoring rubrics…guide the analysis of the products and/or processes of students' efforts…[and] provide a description of what is expected at each score level.” (Moskal, 2000)  a systematic way of evaluating qualitative data; a way of reducing subjectivity

Tools: Scoring Rubrics Development process:  break general concepts down into evaluation criteria  develop descriptions of degrees to which, and ways in which, the criteria can be satisfied  conduct pilot testing on sample data
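The development process above (criteria, then level descriptions) maps naturally onto a small data structure. A sketch with invented criteria and descriptions, not taken from any actual rubric:

```python
# Hypothetical rubric: criterion -> {score level: description}.
RUBRIC = {
    "evidence": {
        3: "Claims supported by multiple, relevant sources",
        2: "Claims supported by at least one source",
        1: "Claims asserted without support",
    },
    "organization": {
        3: "Clear structure; each section builds on the last",
        2: "Recognizable structure with some gaps",
        1: "No discernible structure",
    },
}

def total_score(scores):
    """Sum per-criterion scores after validating them against the rubric."""
    for criterion, level in scores.items():
        if level not in RUBRIC[criterion]:
            raise ValueError(f"invalid level {level} for {criterion}")
    return sum(scores.values())

print(total_score({"evidence": 3, "organization": 2}))  # prints 5
```

Writing the level descriptions down explicitly, as here, is what reduces subjectivity: two raters scoring the same assignment are working from the same words.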

Tools: Surveys Survey questions should:  be interpreted by respondents in the same way;  ask for information that respondents are able to provide;  provide information that is interpretable by you.

Tools: Questionnaires Time estimates: How many hours per day do you typically study? Scale 1:  Less than 1 hour  1 – 1.5 hours  1.5 – 2 hours  2 – 2.5 hours  More than 2.5 hours Scale 2:  Less than 2.5 hours  2.5 – 3 hours  3 – 3.5 hours  3.5 – 4 hours  More than 4 hours
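One mechanical check worth running on a time-estimate scale like the one above is whether adjacent options share endpoints, because a student who studies exactly 1.5 hours fits two boxes. A sketch, with bounds read off the labels and a 24-hour upper cap assumed for the open-ended top option:

```python
# Hypothetical (low, high) bounds in hours, both ends inclusive as labeled.
options = [(0, 1), (1, 1.5), (1.5, 2), (2, 2.5), (2.5, 24)]

def boundary_problems(ranges):
    """Return values that fall into two adjacent inclusive ranges."""
    return [hi1 for (_, hi1), (lo2, _) in zip(ranges, ranges[1:]) if hi1 >= lo2]

print(boundary_problems(options))  # [1, 1.5, 2, 2.5]
```

The usual fix is to make one side exclusive in the wording, e.g. "1 to less than 1.5 hours", so every possible answer maps to exactly one option.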

Tools: Questionnaires Scale issues: How would you rate your instructor’s knowledge of the subject matter of this class?  poor  fair  adequate  good  very good  exceptional

Tools: Questionnaires Scale issues: How effective were the small group discussions in helping you to learn the course material?  very effective  effective  don’t know  ineffective  very ineffective

Tools: Questionnaires Item order effects: Which of the following activities helped you to learn the course material?  Lectures  Large group discussions  Small group discussions  Group assignments  Course readings  Studying for quizzes

Tools: Questionnaires Open versus closed-ended questions: How easy or difficult to use did you find each of the components of our course website?

Tools: Questionnaires Open versus closed-ended questions: How easy or difficult to use did you find these components of our course website? Rate each component (Discussions, Quizzes, Content modules, Audio files, PowerPoint files, etc.) on the scale:  very easy  easy  difficult  very difficult

Tools: Questionnaires Interpretability: Which of these statements best describes your professor?  very good teacher, but not very approachable  about the same on approachability and teaching quality  very approachable, but not a very good teacher  a born teacher  extremely approachable