Teaching the Control of Variables Strategy in Fourth Grade Classrooms

Robert F. Lorch, Jr., William J. Calderhead, Emily E. Dunlap, Emily C. Hodell, Benjamin Dunham Freer, & Elizabeth P. Lorch
University of Kentucky

ABSTRACT

We investigated an intervention (Chen & Klahr, 1999) designed to teach the "control of variables strategy" (CVS) to young students. The study addressed four questions: (1) Can the teaching intervention be successfully translated to the classroom? (2) Is the intervention effective with students from lower-achieving schools as well as higher-achieving schools? (3) Is direct instruction sufficient for teaching CVS in 4th-grade classrooms? (4) Does hands-on experience conducting experiments contribute to learning CVS beyond the gains produced by direct instruction in CVS? Students from high- and low-achieving schools were taught CVS by direct instruction, hands-on experimentation, or both. The results demonstrate that the basic intervention can be successfully translated to the classroom, but it is relatively less effective in lower-achieving classrooms. Further, direct instruction alone was effective in teaching CVS, but it was more effective when combined with hands-on manipulation. Hands-on manipulation unaccompanied by direct instruction was relatively ineffective.

INTRODUCTION

The experimental method is based on the logic that, to demonstrate a causal relationship between two variables, the experimenter must manipulate the independent variable and find corresponding variation in the dependent variable while holding all other variables constant. Chen and Klahr (1999) developed a brief instructional intervention that they showed to be very effective at teaching CVS to 4th-grade students. Klahr and his associates (Klahr & Nigam, 2004; Toth, Klahr, & Chen, 2000) have successfully applied the teaching intervention in the classroom, albeit on a limited basis.
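The logic of CVS can be illustrated with a small simulation loosely modeled on the ramp task used in the study. This is a hypothetical sketch for illustration only; the variable names and effect sizes are invented, not taken from the study's materials.

```python
# Hypothetical sketch of the CVS logic, using a ramp-like task.
# The effect sizes below are invented for illustration.
def distance(steepness, surface):
    """Distance a ball rolls, as an additive function of two variables."""
    effects = {"steep": 6, "shallow": 3, "smooth": 4, "rough": 1}
    return effects[steepness] + effects[surface]

# Confounded "bad" experiment: both variables change at once, so the
# observed difference cannot be attributed to surface alone.
bad = distance("steep", "smooth") - distance("shallow", "rough")

# Controlled "good" experiment: only surface varies while steepness is
# held constant, so the difference isolates the effect of surface.
good = distance("steep", "smooth") - distance("steep", "rough")

print(bad)   # conflates the steepness and surface effects
print(good)  # equals the surface effect alone
```

In this toy model the confounded comparison mixes the two effects together, while the controlled comparison recovers exactly the contribution of the manipulated variable; this is the distinction between the "bad" and "good" experiments used in the teaching protocol described below.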
The current study scales up the intervention both by applying it in many 4th-grade classrooms (i.e., 36) and by testing its effectiveness in two very distinct environments (i.e., high-achieving vs. lower-achieving classrooms). In addition to examining whether Chen and Klahr's teaching intervention can be scaled up, the study addressed basic theoretical questions about the conditions necessary for successful teaching of CVS. The teaching intervention comprises both direct instruction in the logic of CVS and a "discovery" component involving hands-on manipulation of the experimental apparatus. Klahr and associates have already demonstrated that relatively little learning results if students conduct hands-on manipulation in the absence of any direct instruction. Based on their results, they have argued that direct instruction is a necessary component of teaching CVS. However, they have not demonstrated that direct instruction is sufficient for learning CVS, nor have they shown that the hands-on component of their intervention produces no benefits beyond the gains of direct instruction. Our study addresses four questions concerning the basic teaching intervention first developed by Chen and Klahr:

1. Can we successfully translate the basic teaching intervention into 4th-grade classrooms on a relatively large scale (i.e., more than 1 or 2 classrooms)?
2. Will the intervention be successful in both high- and low-achieving classrooms?
3. Direct instruction may be necessary for teaching CVS, but is it sufficient?
4. Does hands-on experience in designing and conducting experiments contribute to learning CVS beyond the gains found with direct instruction alone?

RESULTS

Figure 1 displays mean performance on the paper-and-pencil tests of CVS understanding. The data are averaged across high- and low-achieving schools.
Several patterns can be seen in Figure 1:

- Performance is at chance levels (50%) on the pretest.
- Significant gains are found in all three instructional conditions on the posttest.
- Gains are ordered: Both > Instruct > Manipulate.
- Gains at posttest are maintained 4-5 months later at the delayed test.

Figures 2 and 3 present the effects of instructional condition across the three tests, broken down separately for higher-achieving and lower-achieving schools.

METHOD

Participants

Participants included in analyses were 543 children from 36 4th-grade classrooms. Students were from high-achieving (n = 269) and low-achieving (n = 274) schools in the Fayette County school district, Lexington, KY.

Design

The experimental design included three factors:

1. Students attended schools that were either high or low scoring on a statewide test of science achievement (Kentucky Core Content Test).
2. Classrooms were assigned at random to one of three instructional interventions: (a) CVS instruction (Instruct), (b) hands-on manipulation (Manipulate), or (c) CVS instruction + hands-on manipulation (Both).
3. Students were tested at three points: (a) the day before instruction (Pretest), (b) the day after instruction (Posttest), and (c) 4-5 months after instruction (Delayed Test).

Materials

Procedure

Day 1
- All students completed the paper-and-pencil test to obtain a baseline measure of their knowledge of CVS.
- Students in the Manipulate and Both conditions worked in groups of 3 to design and run experiments to test the effects of individual variables on the distance the ball rolled.

Day 2
- Students in the Instruct and Both conditions received the teaching protocol, which presented confounded "bad" experiments and corrected "good" experiments, with explanations. Students in the Manipulate condition did not receive the teaching protocol.
- Students in the Manipulate and Both conditions designed and ran experiments with the ramps to test the effects of each of the variables.
- Students in the Instruct condition did not do experiments with the ramps.

Day 3 and Delayed
- All students received the paper-and-pencil test on Day 3 and at a delay of 4-5 months. The version of the paper-and-pencil test differed for each student across the three testing phases.

Comparison of Figures 2 and 3 shows several similarities between the performances of students in the higher- vs. lower-achieving schools:

- Students in both environments improve from pretest to posttest.
- Students in both environments maintain their gains at the delayed test.
- The pattern of relative performance as a function of instructional condition is the same for students in both environments.

However, comparison of Figures 2 and 3 also shows some differences between the performances of students in the higher- vs. lower-achieving schools:

- The same instruction produced greater gains in the higher-achieving schools than in the lower-achieving schools.
- The Manipulate condition produced significant learning only in the higher-achieving schools.
- The addition of hands-on experience to direct instruction produced significant gains relative to direct instruction alone for the higher-achieving schools, but not for the lower-achieving schools (compare Both vs. Instruct in each graph).

CONCLUSIONS

1. The teaching intervention developed by Klahr and his associates was successfully translated into 4th-grade classrooms on a relatively large scale. Instruction in the Both condition resulted in substantial improvement from the pretest to the posttest. Further, the gains were maintained several months later on the delayed test.
2. Students in both higher- and lower-achieving classrooms benefited from instruction in the two conditions involving direct instruction (i.e., the Instruct and Both conditions). However, the gains were much larger in the higher-achieving classrooms than in the lower-achieving classrooms.
There were indications that hands-on experience benefited students in higher-achieving classrooms, but had relatively little benefit for students in lower-achieving classrooms.
3. Direct instruction is sufficient for teaching CVS, as demonstrated by the fact that the gains in the Instruct condition were reliable and were maintained at the delayed test.
4. However, including hands-on experience in designing and conducting experiments results in better learning of CVS than direct instruction alone, as demonstrated by the finding that the Both condition produced better learning than the Instruct condition.

References

Chen, Z., & Klahr, D. (1999). All other things being equal: Children's acquisition of the control of variables strategy. Child Development, 70.
Klahr, D., & Nigam, M. (2004). The equivalence of learning paths in early science instruction. Psychological Science, 15.
Toth, E. E., Klahr, D., & Chen, Z. (2000). Bridging research and practice: A research-based classroom intervention for teaching experimentation skills to elementary school children. Cognition and Instruction, 18.

Acknowledgment

The research reported here was supported by the Institute of Education Sciences, U.S. Department of Education, through Grant 1 R305 H to the University of Kentucky. The opinions expressed are those of the authors and do not represent views of the Institute or the U.S. Department of Education.