Discovery Learning Projects in Introductory Statistics

PI: Dianna Spence     Co-PI: Brad Bailey     NSF DUE-1021584

Overview
Objective: Develop materials that help instructors incorporate real-world, discovery-learning projects into introductory statistics courses, and measure the impact of those projects on student performance and beliefs.

Design of Pilot Test
Eight participating instructors (nationwide) test the methods and materials developed to facilitate discovery projects in their classes.
- Control groups: Each instructor teaches one or more sections without using the methods and materials designed to facilitate projects. (Completed AY 2011-2012)
- Treatment groups: During subsequent academic terms, each instructor teaches one or more sections using the methods and materials developed. (Completed AY 2012-2014)

Curriculum Development
Materials (online/print):
- Student Guide
- Instructor Guide
- Technology Guide
Contents include:
- Scoring rubrics
- Proposal forms
- Guidance at each phase
- Condensed and extended project timelines
- Resources for developing variables/constructs

Student Project Description
Students generate their own research questions, define their own variables, and draft a research proposal. Upon approval, students collect data; organize, analyze, and interpret the results; and report their findings in a formal paper and an oral presentation. Project types and examples are listed below (a brief analysis sketch follows the examples).
- Linear Regression – examples:
  - NBA Player Salaries and Points per Game
  - Car Engine Horsepower and Average Miles per Gallon
- t-Tests – examples:
  - Comparing Males and Females: Attitudes about Tattoos
  - Comparing Northern and Southern States: Divorce Rates
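To make the two project types concrete, here is a minimal Python sketch of the analyses a student project of each type might run. The numbers are invented placeholders (not data from any actual project), and the variable names are illustrative only; the project's own Technology Guide may prescribe different tools.

```python
# Illustrative sketch only: invented placeholder data, not from any student project.
# Shows the two analysis types used in the discovery projects:
# simple linear regression and a two-sample t-test.
import numpy as np
from scipy import stats

# Linear regression example: car engine horsepower vs. average miles per gallon
horsepower = np.array([95, 105, 110, 130, 150, 175, 200, 220])
mpg = np.array([38, 36, 34, 30, 27, 24, 21, 19])
reg = stats.linregress(horsepower, mpg)
print(f"slope = {reg.slope:.3f}, intercept = {reg.intercept:.1f}, "
      f"r = {reg.rvalue:.2f}, p = {reg.pvalue:.4f}")

# t-test example: attitudes about tattoos (1-10 rating), males vs. females
male_ratings = np.array([6, 7, 5, 8, 6, 7, 4, 6])
female_ratings = np.array([5, 4, 6, 3, 5, 4, 6, 5])
t_stat, p_value = stats.ttest_ind(male_ratings, female_ratings, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

Either analysis could just as well be carried out in a spreadsheet or on a graphing calculator, which is closer to what many introductory sections use.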
Instruments: Student Outcomes and Subscales
- Content Knowledge (CK) – 17 multiple-choice items
  Subscales: Linear Regression; Hypothesis Testing; Identifying Appropriate Type of Analysis; Sampling
- Statistics Self-Efficacy (SE) – 16 Likert-scale items
  Subscales: Linear Regression; Hypothesis Testing; Data Collection; General (Learning and Understanding Statistics)
- Perceived Usefulness of Statistics (PU) – 11 Likert-scale items
  Subscales: Real-World Relevance; Personal Benefit to Understanding

Mixed Methods Research

Qualitative Inquiry
Five instructors participated in qualitative data collection.
Guiding questions:
- How do instructors vary in their implementation of the discovery statistics projects?
- How do the instructors' teaching philosophies pervade the project implementation?
- What are the implications of the observed instructor differences for student learning?
Data collection: written prompts, course syllabi/materials, phone interviews, classroom observations.
Analysis: code and triangulate (team process); identify and describe themes; synthesize with quantitative findings.
Primary coding categories: Conceptual Framework; Project Purpose; Instructor Pedagogy; Student Dispositions; Communication
Emerging themes: Engagement; Real-World Application; Conceptual Learning

Quantitative Analyses
Control vs. treatment: compare student outcomes
- For individual instructors
- For overall groups:
  - All 8 instructors (N = 353 Control, 441 Treatment)
  - Preferred protocol only: 5 instructors (N = 198 Control, 344 Treatment)
    Instructors omitted for protocol exceptions:
    - Treatment section taught in a summer mini-mester format
    - Target student population substantially different in the treatment group
    - Treatment section was a different course than the control section
- Repeated treatment sections after the initial trial: 3 instructors (N = 122 Control; N = 122 / 103 / 43 in the 1st, 2nd, 3rd treatments)
Multivariate models: predict student outcomes by treatment, instructor variables, and other factors (especially for instructors with repeated treatment sections); an illustrative model sketch appears at the end of this page.

Results (Qualitative)
Each theme is explored with respect to the coding categories. Participants are compared, with commonalities, differences, and specific manifestations noted.
Sample observations:
- The nature of the interaction between instructor and students changes when projects are incorporated. (Engagement / Communication)
- Student misconceptions are easier to identify earlier in the learning cycle when students carry out projects. (Conceptual Learning / Project Purpose)
- Students appear more invested in the outcome of studies they have designed themselves. (Real-World Application / Student Dispositions)
- Instructors vary widely in how they choose to guide students conceptually through projects. (Conceptual Learning / Instructor Pedagogy)

Results (Quantitative)

Control vs. Treatment – Overall Group Gains: All Instructors (8 instructors; N = 353 Control, 441 Treatment; overall group declines: none)

  Scale                               Control mean (% of max)   Treatment mean (% of max)   p
  CK Subscale: Identifying Analysis   1.33 (44.3%)              1.54 (51.3%)                .001
  Self-Efficacy (Main Scale)          78.51 (81.8%)             80.20 (83.5%)               .021
  SE Subscale: Hypothesis Testing     23.41 (78.0%)             24.43 (81.4%)               .002
  SE Subscale: Data Collection        19.12 (79.7%)             19.98 (83.3%)               .000

Control vs. Treatment – Overall Gains: Instructors within Protocol (5 instructors; N = 198 Control, 344 Treatment)

  Scale                               Control mean (% of max)   Treatment mean (% of max)   p
  Content Knowledge (Main Scale)      7.52 (44.2%)              8.63 (50.7%)                .000
  CK Subscale: Linear Regression      2.50 (35.7%)              3.12 (44.6%)
  CK Subscale: Identifying Analysis   1.23 (41.0%)              1.61 (53.7%)
  Self-Efficacy (Main Scale)          76.00 (79.2%)             80.13 (83.5%)
  SE Subscale: Hypothesis Testing     22.08 (73.6%)             24.38 (81.3%)
  SE Subscale: Data Collection        18.86 (78.6%)             20.04 (83.5%)
  [p values for the remaining rows not shown]

Significant Gains and Declines: Individual Instructors (* = protocol exceptions, omitted from the within-protocol table above)
[Table of CK and SE main scales and subscales (Linear Regression, Hypothesis Testing, Identifying Analysis, Sampling, Data Collection, General) by instructor #1-#8 (#5*, #6*, #8*); per-instructor gain/decline indicators not shown.]

The Experience Factor: Instructors Repeating Treatment Sections
Experience with treatment: significant gains (N = 122 Control; N = 122, 103, 43 in the 1st, 2nd, 3rd treatment terms)
[Table of the same scales and subscales for instructors #1, #2, #7 and the overall group; per-cell gain indicators not shown.]

Contact Us
PI: Dr. Dianna Spence, djspence@ung.edu
Co-PI: Dr. Brad Bailey, bbailey@ung.edu
Department of Mathematics, University of North Georgia, Dahlonega, GA 30597
Phone: 706-864-1805
Project website: http://faculty.ung.edu/DJSpence/NSF/
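As a rough illustration of the quantitative comparisons described above (group means, percent of the scale maximum, and the multivariate models predicting outcomes from treatment and instructor), the following Python sketch uses simulated placeholder data. The column names, effect sizes, and scale maximum are assumptions for illustration only; they are not the study's data, instruments, or analysis code.

```python
# Hedged illustration only: simulated placeholder data, not the study's records.
# Sketches (1) a control-vs-treatment summary in the style of the tables above and
# (2) a multivariate model predicting an outcome from treatment and instructor.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 600
SCALE_MAX = 96  # assumed maximum for a 16-item Likert self-efficacy scale

df = pd.DataFrame({
    "instructor": rng.integers(1, 9, size=n),   # instructors #1-#8
    "treatment": rng.integers(0, 2, size=n),    # 0 = control section, 1 = treatment section
})
instructor_effect = rng.normal(0, 2, size=8)    # placeholder instructor-to-instructor variation
df["self_efficacy"] = (
    76 + 4 * df["treatment"]                    # placeholder baseline and treatment effect
    + instructor_effect[df["instructor"] - 1]
    + rng.normal(0, 9, size=n)
).clip(16, SCALE_MAX)

# Control vs. treatment summary: mean score and percent of the scale maximum
for label, group in df.groupby("treatment"):
    name = "Treatment" if label == 1 else "Control"
    mean = group["self_efficacy"].mean()
    print(f"{name}: N = {len(group)}, mean = {mean:.2f} ({100 * mean / SCALE_MAX:.1f}% of max)")

# Multivariate model: outcome predicted by treatment, with instructor as a categorical factor
model = smf.ols("self_efficacy ~ treatment + C(instructor)", data=df).fit()
print(model.summary())
```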