Developing a Statistics Teaching and Beliefs Survey
Jiyoon Park, Audbjorg Bjornsdottir
Department of Educational Psychology
The National Statistics Teaching Practice Survey Project (NSF DUE )

Overview
Statistics Teaching Inventory (STI)
– PIs: Joan Garfield, Robert delMas, and Andrew Zieffler
– NSF-funded project to develop, pilot, and gather validity evidence
Background of the Statistics Teaching Inventory (STI)
Development of Instrument
Process of STI Validation
– Psychometric validation
– Interview validation
Validation Results
Future Plans

Background of STI
Increasing calls for reform in undergraduate education in STEM disciplines
Implications for teaching and learning:
– Students learn by constructing knowledge
– Real-world problems provide effective ways to structure learning
– Collaborative groups facilitate learning
– Classroom discourse plays a critical role in learning
– Well-designed technological tools can help students visualize and explore abstract concepts and processes

Background of STI (continued)
Reform in statistics
– GAISE report endorsed by the ASA (2005)
– GAISE recommendations:
  - Emphasize statistical literacy and develop statistical thinking
  - Use real data
  - Stress conceptual understanding rather than mere knowledge of procedures
  - Foster active learning in the classroom
  - Use technology for developing conceptual understanding and analyzing data
  - Integrate assessments that are aligned with course goals to improve as well as evaluate student learning

Background of STI (continued)
Studies showing students' lack of understanding of statistical concepts and statistical reasoning
Teachers' resistance to recommended ways of teaching
ARTIST project: need for an instrument to use in research studies along with CAOS
INSPIRE project: need for data on teachers

Development of STI
Objectives of the STI
– To assess the practices and beliefs of teachers of introductory statistics courses
– To pilot an instrument and integrate it into a database to explore the relationship between teaching and student learning in introductory statistics courses
Funding from NSF (ARTIST, INSPIRE, and STEPS grants)
Support from two mini-grants from the Dept. of Educational Psychology, University of Minnesota

Development of STI (continued)
Development process of the STI
– The first version (102 items)
– Pared down based on feedback from members of the statistics education community (e.g., the Research Advisory Board (RAB) of CAUSE)
– Focus group conducted with faculty from two different disciplines
– Online pilot testing followed by focus-group interviews
– The resulting version of the STI administered to 101 participants of the 2009 US Conference on Teaching Statistics (USCOTS)

STI
The latest version of the STI: 50 multiple-choice items
Four sections:
– Teaching Practice
– Assessment Practice
– Teaching Beliefs
– Assessment Beliefs
Also collects course characteristics and additional teacher information

STI - Examples
Part 1: Teaching Practice
– e.g., "Small group discussions are used to help students learn."
  (Never, Seldom, Some of the time, Most of the time, All of the time)
Part 2: Course Characteristics
– e.g., "Please indicate the mathematical prerequisite for this course."
Part 3: Assessment Practice
– e.g., "My assessments evaluate students' abilities to use formulas to produce numerical summaries of a data set."
  (Disagree, Agree)

Examples of the STI (continued)
Part 4: Teaching Beliefs
– e.g., "Students learn statistics more effectively from a good lecture than from a good activity."
  (Strongly Disagree, Disagree, Agree, Strongly Agree, Undecided)
Part 5: Assessment Beliefs
– e.g., "Alternative assessments (e.g., projects, presentations, minute papers) should be used to evaluate student learning."
  (Strongly Disagree, Disagree, Agree, Strongly Agree, Undecided)
Part 6: Additional information (demographic)

Data Collection and Coding
Data collection for pilot test
– Administered to 101 participants of the 2009 US Conference on Teaching Statistics (USCOTS)
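The slides do not spell out the coding scheme itself. As a purely hypothetical illustration (the scale labels come from the example items above, but the numeric mapping and the function name are assumptions, not the project's actual scheme), mixed-format Likert responses might be mapped onto a common 0-1 scale like this:

```python
# Hypothetical coding sketch: map the two response formats used in the STI
# examples onto a common 0-1 scale. The actual project coding is not given
# in the slides; the numeric values here are illustrative assumptions.

FREQ_SCALE = {"Never": 0.0, "Seldom": 0.25, "Some of the time": 0.5,
              "Most of the time": 0.75, "All of the time": 1.0}
AGREE_SCALE = {"Strongly Disagree": 0.0, "Disagree": 1 / 3,
               "Agree": 2 / 3, "Strongly Agree": 1.0}
# "Undecided" is not on the agreement scale and is treated as missing.

def code_response(item_format, response):
    """Return a numeric score in [0, 1], or None for missing/undecided."""
    scale = FREQ_SCALE if item_format == "frequency" else AGREE_SCALE
    return scale.get(response)  # None if the response is not on the scale

print(code_response("frequency", "Seldom"))     # 0.25
print(code_response("agreement", "Undecided"))  # None
```

Scoring every item into [0, 1] would make per-section means directly comparable, which is consistent with the 0-1 range of the scale means reported in the conclusions.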

Validation Results – Psychometric Properties
Validation based on Classical Test Theory
Examination of reliability, individual item properties, and scale scores
Analysis results

Validation Results – Interview
Interview validation process
– Interviewees: 16 instructors who had completed the STI
– Face-to-face interviews with 10 people at USCOTS
– Phone interviews with 6 people
– Course syllabi and other course materials provided as validity evidence
– Interviews rated against the GAISE report
– A consensus rating reached through discussions among three professionals

Validation Results – Interview (continued)
Correlated the 16 interviewees' scores on the STI with their interview ratings
Four outliers found: two scored high on the STI but low in the interview; two scored low on the STI but high in the interview
This indicates that these four responses on the two measurements (STI & interview) do not match
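This step can be sketched in a few lines, assuming a Pearson correlation between the two measures and a simple standardized-difference rule for flagging mismatched cases (the function names and the flagging cutoff are assumptions for illustration, not the authors' actual procedure):

```python
# Illustrative sketch (not the project's code): correlate STI scores with
# interview ratings and flag respondents whose two scores disagree.
import statistics

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length lists."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (statistics.stdev(xs) * statistics.stdev(ys))

def flag_outliers(sti, interview, threshold=1.0):
    """Return indices where the standardized scores differ by > threshold."""
    zx = [(x - statistics.fmean(sti)) / statistics.stdev(sti) for x in sti]
    zy = [(y - statistics.fmean(interview)) / statistics.stdev(interview)
          for y in interview]
    return [i for i, (a, b) in enumerate(zip(zx, zy)) if abs(a - b) > threshold]

# Tiny made-up example: respondent 0 (low STI, high interview) and
# respondent 3 (high STI, low interview) would be flagged as outliers.
print(flag_outliers([1, 2, 3, 10], [10, 3, 2, 1]))  # [0, 3]
```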

Validation Results – Interview (continued)
Why the outliers?
– Thorough examination of interview recordings and course materials
– Some of them had changed the course after taking the STI
– Some of them held different ideas about the use of technology or the teaching methods recommended in GAISE

Conclusions
Psychometric validation of the STI supported by high Cronbach's alpha values (>0.80)
The instructors sampled use a moderately reformed approach to teaching statistics
The mean score for Teaching Practice (0.58) differs from that for Assessment Practice (0.74)
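The Cronbach's alpha reported here is a standard internal-consistency measure and is straightforward to compute from per-item scores. A self-contained sketch (the data in the example are made up, not STI responses):

```python
# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).
# Values above 0.80 are conventionally taken as good internal consistency.

def cronbach_alpha(items):
    """items: list of per-item score lists, one score per respondent."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var_sum = sum(var(item) for item in items)
    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1 - item_var_sum / var(totals))

# Two perfectly consistent items give alpha = 1.0 (made-up data).
print(cronbach_alpha([[1, 2, 3], [1, 2, 3]]))  # 1.0
```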

Discussion
Instructors have different conceptions when rating themselves
Their beliefs and practices were sometimes influenced by constraints
Revisions needed:
– Some items have item discrimination below 0.30, and one item has zero standard deviation
– The mixed item format made interpretation of the results inconsistent
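The two revision criteria named here can be sketched as a corrected item-total screen. The 0.30 cutoff and the zero-standard-deviation check come from the slide; the helper names and the use of corrected item-total correlation as the discrimination index are illustrative assumptions:

```python
# Illustrative item-revision screen (not the project's code): flag items with
# corrected item-total discrimination below 0.30 or zero standard deviation.
import statistics

def item_discrimination(item, rest_total):
    """Corrected item-total correlation: item vs. total of remaining items."""
    si = statistics.stdev(item)
    st = statistics.stdev(rest_total)
    if si == 0 or st == 0:
        return 0.0  # no variation: discrimination is undefined, treat as 0
    mi, mt = statistics.fmean(item), statistics.fmean(rest_total)
    cov = sum((a - mi) * (b - mt)
              for a, b in zip(item, rest_total)) / (len(item) - 1)
    return cov / (si * st)

def flag_for_revision(items, cutoff=0.30):
    """Return (index, reason) pairs for items that need revision."""
    flags = []
    for i, item in enumerate(items):
        if statistics.pstdev(item) == 0:
            flags.append((i, "zero standard deviation"))
            continue
        rest = [sum(it[j] for k, it in enumerate(items) if k != i)
                for j in range(len(item))]
        if item_discrimination(item, rest) < cutoff:
            flags.append((i, "low discrimination"))
    return flags

# Made-up data: the third item gives every respondent the same score.
print(flag_for_revision([[1, 2, 3, 4], [1, 2, 3, 4], [2, 2, 2, 2]]))
```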

Limitations
A larger sample is needed for stronger statistical and psychometric analyses
Biased sample (USCOTS participants)
The coding of items and the scaling of scores for the mixed item format

Current Work
Modular version with new parts for online and hybrid classes
International version, based on interviews at the International Conference on Teaching Statistics (ICOTS)
National survey in Fall 2011 (e-ATLAS project, NSF)
Linking STI results to new CAOS results (e-ATLAS)
Implementation of a classroom observation rating instrument for further validation

Thank You
Jiyoon Park
Audbjorg Bjornsdottir
Department of Educational Psychology