Evaluating Innovative Courses in Introductory Statistics: Resources from the eATLAS Project
Elizabeth Fry and Rebekah Isaak, University of Minnesota
eATLAS funded by NSF DUE 1044812 & 1043141

Overview
Principles of curriculum evaluation
Example: evaluation of the CATALST project
Instruments developed for CATALST that became part of eATLAS
Additional instrument developed for eATLAS
Recommendations for future curriculum evaluations

Why evaluate?

Why evaluate?
Evaluation produces information that can be used to improve the project.
Evaluation can document what has been achieved and to what extent the desired goals and impacts have been attained.

Guidelines for Evaluating Curriculum
One size does not fit all (Frechtling, 2010).
Clearly define the purpose: formative vs. summative. "The purpose of an evaluation should derive in part from the project, what it is intended to achieve, and the questions it is addressing." (Frechtling, 2010, p. 114)
Use multiple methods.
Document well.

Guidelines for Evaluating Curriculum
Tradeoffs depend on: evaluation purpose, degree of confidence needed, breadth, cost, time, and rigor.
Work smart, not hard: choose evaluation activities that cover multiple purposes.

The CATALST Project (http://www.tc.umn.edu/~catalst/)
5-year project
Purpose:
To create and implement innovative learning materials for an introductory, non-calculus-based statistics course
To assess student achievement

The CATALST Project (http://www.tc.umn.edu/~catalst/)
Evaluation:
Ongoing formative evaluation
Final summative evaluation
External evaluator: Rob Gould (UCLA)

CATALST Goals & Evaluation Questions
Goal 1: Create innovative learning materials for an introductory, non-calculus-based statistics course based on modeling and simulation.
Evaluation Question: Has the project succeeded in this goal?

CATALST Goals & Evaluation Questions
Goal 2: Implement the Educational Innovations.
Evaluation Question: What is the feasibility of implementing the CATALST materials and approach in an undergraduate statistics course?

CATALST Goals & Evaluation Questions
Goal 3: Assess Student Achievement.
Evaluation Question: Has this been accomplished?

CATALST Goals & Evaluation Questions
Goal 4: Conduct Research on Undergraduate Statistics Education.
Evaluation Question: Have these studies taken place, and what has been learned from these studies?

CATALST Goals & Evaluation Questions
Goal 5: Develop Faculty Expertise (to teach a CATALST course).
Evaluation Questions:
What is the impact on teachers who attend CATALST workshops and implement aspects of the CATALST curriculum?
What are the barriers for teachers who want to adapt aspects of this approach, and what are effective ways of overcoming these barriers?
What is the feasibility of other instructors adopting the methods and materials developed by this project?

CATALST: Formative Evaluation
Constant changes, updates, and improvements to:
Curriculum: content, contexts, activities
Pedagogy: scaffolding, inverted classroom, cooperative learning, group assessments

CATALST: Formative Evaluation
Implementation:
Workshops and gatherings
Lesson plans
Implementer visits
Feedback from implementers

Summative Evaluation
What was the impact of CATALST?
Clinical interviews with students
Retention study
2012 instruments: to compare with non-CATALST courses across different institutions
Both qualitative and quantitative components

Summative Evaluation: Data Gathered
Fall 2011/Spring 2012: 14 instructors at 8 institutions
CATALST, Spring 2012: 289 students taught by 8 instructors
Non-CATALST, Fall 2011/Spring 2012: 440 students taught by 6 instructors
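The slides do not describe the statistical analysis the project used. As a rough illustration only, the sketch below shows one way the quantitative side of such a CATALST vs. non-CATALST comparison could be run, assuming per-student scores (e.g., number correct on the 23 GOALS items common to both versions) are available for each group. The function name and the choice of a permutation test are illustrative assumptions, not the project's actual method.

```python
import numpy as np

rng = np.random.default_rng(1)

def permutation_mean_diff_test(catalst_scores, comparison_scores, n_perms=10_000):
    """Two-group permutation test for a difference in mean scores.

    Takes arrays of per-student scores for the two groups and returns
    the observed mean difference plus a two-sided p-value estimated by
    repeatedly reshuffling the group labels."""
    catalst_scores = np.asarray(catalst_scores, dtype=float)
    comparison_scores = np.asarray(comparison_scores, dtype=float)
    observed = catalst_scores.mean() - comparison_scores.mean()

    pooled = np.concatenate([catalst_scores, comparison_scores])
    n_catalst = len(catalst_scores)
    diffs = np.empty(n_perms)
    for i in range(n_perms):
        rng.shuffle(pooled)  # random relabeling of students into the two groups
        diffs[i] = pooled[:n_catalst].mean() - pooled[n_catalst:].mean()

    p_value = np.mean(np.abs(diffs) >= abs(observed))
    return observed, p_value
```

A permutation approach is shown here only because it mirrors the randomization-based inference emphasized in the CATALST curriculum; any standard two-group comparison would serve the same illustrative purpose.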

Instruments Developed for CATALST
For assessing student outcomes:
Goals and Outcomes Associated with Learning Statistics (GOALS), 2 versions:
TRAD: for students in traditional courses
RAND: for students in randomization-based courses
Models of Statistical Thinking (MOST)
For assessing student attitudes:
Affect Survey (attitudes and beliefs about statistics)
These instruments were developed for the evaluation of CATALST, but they can also be used in other settings.

Goals and Outcomes Associated with Learning Statistics (GOALS)
27 forced-choice items
Items assess statistical reasoning in a first course in statistics
Two versions:
TRAD: Items 19-22 assess the traditional approach to statistical inference
RAND: Items 19-22 assess the randomization-based approach to statistical inference
23 items are common to both versions

GOALS: Example Item
A certain manufacturer claims that 50% of the candies they produce are brown and that candy pieces are randomly placed into bags. Sam plans to buy a large family-size bag of these candies and Kerry plans to buy a small fun-size bag. Which bag is more likely to have more than 70% brown candies?
a. Sam's, because a larger bag is more likely to have a larger proportion of brown candies.
b. Kerry's, because there is more variability in proportions of colors among smaller samples.
c. Both have the same chance because the bags they buy are both random samples of candy pieces.
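To see why option (b) is the intended answer, a short simulation helps. The sketch below is only illustrative and is not part of the GOALS instrument; the bag sizes (20 candies for the fun-size bag, 200 for the family-size bag) are hypothetical, since the item does not specify them.

```python
import numpy as np

rng = np.random.default_rng(42)

def prob_over_70_percent_brown(bag_size, p_brown=0.5, n_sims=100_000):
    """Estimate the chance that a bag of `bag_size` candies contains
    more than 70% brown candies, when each candy is independently
    brown with probability `p_brown`."""
    brown_counts = rng.binomial(bag_size, p_brown, size=n_sims)
    return np.mean(brown_counts / bag_size > 0.70)

# Hypothetical bag sizes (the item itself does not specify them).
print(prob_over_70_percent_brown(20))    # small bag: roughly 0.02
print(prob_over_70_percent_brown(200))   # large bag: essentially 0
```

Smaller samples produce more variable sample proportions, so only the small bag has a non-negligible chance of exceeding 70% brown.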

Models of Statistical Thinking (MOST)
4 real-world contexts
4 open-ended items that ask students to explain how they would set up and solve a problem involving statistical inference
7 forced-choice follow-up items
Used in both traditional and randomization-based courses

MOST: Example Item
Consider a random sample of 50 breakups reported on Facebook within the last year. Of these 50, 20% occurred on Monday. Explain how you could determine whether this result would be surprising if there really is no difference in the chance for relationship break-ups among the seven days of the week. Be sure to give enough detail that someone else could easily follow your explanation in order to implement your proposed analysis and draw an appropriate inference (conclusion).
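A student answer built on the simulation approach taught in CATALST might amount to something like the following sketch; this is an assumption about one acceptable line of reasoning, not a scoring key from MOST. It simulates many samples of 50 breakups under the null model that every day of the week is equally likely and checks how often Monday accounts for 20% or more of them.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_monday_proportions(n_breakups=50, n_sims=100_000):
    """Simulate samples of breakups under the null model that each
    breakup is equally likely to occur on any of the 7 days, and
    return the proportion landing on Monday in each simulated sample."""
    monday_counts = rng.binomial(n_breakups, 1 / 7, size=n_sims)
    return monday_counts / n_breakups

props = simulate_monday_proportions()
# How often does chance alone give 20% or more Mondays (10 or more of 50)?
p_value = np.mean(props >= 0.20)
print(p_value)  # around 0.17, so 20% on Monday would not be surprising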

Affect Survey
12 questions:
4 items assess experience in an introductory statistics course
4 items assess use of statistical software
4 items assess beliefs about statistics
4 response categories: Strongly Disagree, Disagree, Agree, Strongly Agree

Affect Survey: Example Items
This course helped me understand statistical information I hear or read about in the media.
I would be comfortable using software to test for a difference between groups after completing this class.
I feel that statistics offers valuable methods to analyze data to answer important research questions.

Information Provided by Evaluation
CATALST can be taught successfully by a variety of instructors in a variety of settings.
Data are still being analyzed, but preliminary results suggest that, compared to students in comparison courses, CATALST students show higher levels of:
Statistical thinking
Positive attitudes and beliefs
Understanding and interpreting p-values and confidence intervals

Information Provided by Evaluation
Even though CATALST students did not study much traditional content, they did not score lower on the 23 common items on GOALS.
Weakest areas: understanding how sample size affects sampling variability.
Several months after the course, students retain positive attitudes about what they have learned and a good understanding of modeling and inference.

eATLAS Instruments
eATLAS (Evaluation and Assessment of Teaching and Learning About Statistics) is an NSF grant, 2011-2013.
Developed instruments to use in large-scale assessments across introductory statistics classes in the USA, as well as in evaluations of new curricula:
Assessments of student outcomes: GOALS, MOST, and the Affect Survey
Assessment of teacher practice and beliefs: Statistics Teaching Inventory (STI)

Statistics Teaching Inventory (STI)
4 versions:
Online classes
Hybrid classes
Face-to-face classes, one instructor per section
Face-to-face classes, lecture/recitation format (lecturer plus TA)

Statistics Teaching Inventory (STI)
Six different sections:
Pedagogy
Curricular Emphasis
Technology
Assessment
Beliefs
Course Characteristics

STI: Example Items
From the Curricular Emphasis section

Next Steps for eATLAS
The Statistics Teaching Inventory will be given to a national random sample to track change over time and provide baseline data.
A subset of STI respondents will administer the student instruments (GOALS, MOST, Affect Survey) in their courses.
The STI can also be used in evaluations of projects that seek to impact instructors.

Recommendations for Designing Curriculum Evaluations
Clarify the purpose and goals of the project.
Have clear, focused evaluation questions, and identify what types of information can be used to answer each question.
Clarify processes for gathering both formative and summative data.
Use good assessment instruments!
Have a good external evaluator to provide critical feedback.
Gather different types of information to continually improve materials.

Contact Information
Elizabeth Fry: fryxx069@umn.edu
Rebekah Isaak: isaak009@umn.edu
Joan Garfield: jbg@umn.edu
Thank you!

References
Frechtling, J. (2010). The 2010 User-Friendly Handbook for Project Evaluation. Retrieved from http://www.westat.com/pdf/projects/2010ufhb.pdf