STAT MINI-PROJECT NKU Executive Doctoral Cohort, August 2012, Dot Perkins.


Gallatin County High School  Implemented E-PREP to improve ACT results.  My hypothesis is that students who utilize E-PREP will score statistically significantly higher on the ACT in March 2012 than students who did not utilize E-PREP, i.e., the previous three junior classes (2011, 2010, 2009).

 I will compare the 2012 junior class ACT scores to the ACT scores of the previous three junior classes (2011, 2010, 2009).  Those classes did not have access to E-PREP. I will also compare each class's sophomore PLAN scores to its ACT scores to determine possible student growth or decline.

E-PREP  An online tool to quickly improve skills and confidence on the ACT test.  Each course features engaging, expert, personalized instruction, available 24/7 through on-demand videos and interactive lessons.  Students had access 24/7, 365 days a year.  The high school scheduled E-PREP three times during the school year as benchmarks.

PLAN & ACT, CLASS OF 2013. Summary statistics for PLAN and ACT scores (table columns: n, Mean, Variance, Std. Dev., Std. Err., Median, Range, Min, Max, Q1, Q3). *A gain of 2 points is typical growth. Shows promise; not conclusive.
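The summary columns in the table above can be regenerated from raw score lists. A minimal sketch using Python's standard statistics module; the score lists here are hypothetical, since the actual Gallatin County data do not appear in the transcript:

```python
import math
import statistics

def summarize(scores):
    """Return the same summary columns as the slide: n, mean, variance,
    std. dev., std. err., median, range, min, max, Q1, Q3."""
    n = len(scores)
    sd = statistics.stdev(scores)                   # sample standard deviation
    q1, _, q3 = statistics.quantiles(scores, n=4)   # quartile cut points
    return {
        "n": n,
        "mean": statistics.mean(scores),
        "variance": statistics.variance(scores),    # sample variance (n - 1)
        "std_dev": sd,
        "std_err": sd / math.sqrt(n),               # standard error of the mean
        "median": statistics.median(scores),
        "range": max(scores) - min(scores),
        "min": min(scores),
        "max": max(scores),
        "Q1": q1,
        "Q3": q3,
    }

# Hypothetical PLAN (sophomore) and ACT (junior) composite scores
plan = [15, 16, 17, 18, 18, 19, 20, 21]
act  = [17, 18, 19, 20, 20, 21, 22, 23]

print(summarize(plan)["mean"], summarize(act)["mean"])
```

With these made-up lists the ACT mean sits 2 points above the PLAN mean, matching the "typical growth" benchmark noted on the slide.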

Boxplot of all PLAN and ACT results for the Classes of 2013, 2012, and 2011.

CLASS OF 2013 ACT - PLAN DIFFERENCE COMPARED TO CLASS OF 2012 & 2011 ACT - PLAN DIFFERENCE. Summary statistics for the ACT - PLAN differences (table columns: n, Mean, Variance, Std. Dev., Std. Err., Median, Range, Min, Max, Q1, Q3) for PLAN/ACT 2013 and PLAN/ACT 2012 & 2011.

Table: PLAN, ACT, and Difference for the Class of 2013 (E-PREP access) vs. the Classes of 2012 & 2011.

Here are the hypothesis tests for combined ACT scores and growth. Hypothesis test results: μ1 = mean of ACT 2013; μ2 = mean of ACT 2012 & 2011; μ1 - μ2 = mean difference. H0: μ1 - μ2 = 0; HA: μ1 - μ2 > 0 (without pooled variances). (Results table columns: Difference, Sample Mean, Std. Err., DF, T-Stat, P-value.) VERY slight indication of significance: the difference in sample mean ACT scores is slightly unusual if there is truly no difference in mean ACT scores for all students in these grades. On average, the E-PREP group scored 0.7 points higher than the non-E-PREP group.
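The "without pooled variances" test used throughout these slides is the Welch two-sample t-test, and its statistic and degrees of freedom can be computed by hand. A sketch with Python's standard library, again on hypothetical score lists since the raw data are not in the transcript:

```python
import math
import statistics

def welch_t(x, y):
    """One-sided Welch two-sample t-test of H0: mu_x - mu_y = 0
    vs. HA: mu_x - mu_y > 0 (unpooled variances).
    Returns (t statistic, Welch-Satterthwaite degrees of freedom)."""
    nx, ny = len(x), len(y)
    vx, vy = statistics.variance(x), statistics.variance(y)
    se = math.sqrt(vx / nx + vy / ny)        # unpooled standard error
    t = (statistics.mean(x) - statistics.mean(y)) / se
    # Welch-Satterthwaite approximation for degrees of freedom
    df = (vx / nx + vy / ny) ** 2 / (
        (vx / nx) ** 2 / (nx - 1) + (vy / ny) ** 2 / (ny - 1)
    )
    return t, df

# Hypothetical ACT composites: 2013 (E-PREP) vs. 2012 & 2011 combined
act_2013  = [18, 19, 20, 21, 22, 20, 19, 21]
act_prior = [17, 18, 19, 20, 21, 19, 18, 20]

t, df = welch_t(act_2013, act_prior)
# For the p-value, look t up in a t table with df degrees of freedom
# (or use scipy.stats.t.sf(t, df) if SciPy is available).
print(round(t, 3), round(df, 1))
```

The one-sided alternative (HA: μ1 - μ2 > 0) matches the directional hypothesis on the slide that E-PREP students score higher.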

Hypothesis test results: μ1 = mean of 2013 growth; μ2 = mean of 2011/12 growth; μ1 - μ2 = mean difference. H0: μ1 - μ2 = 0; HA: μ1 - μ2 > 0 (without pooled variances). (Results table columns: Difference, Sample Mean, Std. Err., DF, T-Stat, P-value.) Growth = ACT minus PLAN: how much growth? The non-E-PREP group actually increased slightly more from PLAN to ACT (started higher).

Here are the hypothesis tests for ACT and growth comparisons for each year (2013 vs. 2012, 2013 vs. 2011). Hypothesis test results: μ1 = mean of ACT 2013; μ2 = mean of ACT 2012; μ1 - μ2 = mean difference. H0: μ1 - μ2 = 0; HA: μ1 - μ2 > 0 (without pooled variances). (Results table columns: Difference, Sample Mean, Std. Err., DF, T-Stat, P-value.) VERY slight indication of significance: the difference in sample mean ACT scores is slightly unusual if there is truly no difference in mean ACT scores for all students in these grades.

Hypothesis test results: μ1 = mean of 2013 growth; μ2 = mean of 2012 growth; μ1 - μ2 = mean difference. H0: μ1 - μ2 = 0; HA: μ1 - μ2 > 0 (without pooled variances). (Results table columns: Difference, Sample Mean, Std. Err., DF, T-Stat, P-value.) P-value: no evidence of a difference in growth/change between the E-PREP and non-E-PREP groups. *Slight indication that ACT scores are higher, but no conclusive evidence that E-PREP had an impact.

Hypothesis test results: μ1 = mean of ACT 2013; μ2 = mean of ACT 2011; μ1 - μ2 = mean difference. H0: μ1 - μ2 = 0; HA: μ1 - μ2 > 0 (without pooled variances). (Results table columns: Difference, Sample Mean, Std. Err., DF, T-Stat, P-value.) P-value: no indication of a difference between the ACT scores of 2013 and 2011. Not unusual.

Hypothesis test results: μ1 = mean of 2013 growth; μ2 = mean of 2011 growth; μ1 - μ2 = mean difference. H0: μ1 - μ2 = 0; HA: μ1 - μ2 > 0 (without pooled variances). (Results table columns: Difference, Sample Mean, Std. Err., DF, T-Stat, P-value.) P-value: 0.7. No evidence of a difference in change between 2013 and 2011.

Quantitative Data Examination  A summative sample of ACT test results was obtained and compiled from Gallatin County High School. Conclusions: *E-PREP was implemented with all juniors, sophomores, and freshmen. This online program has potential because the E-PREP group exhibited higher ACT scores. The data are also inconclusive, however, because there is no difference in growth. (There are higher ACT scores for the first E-PREP class. ***We may simply have had a good class last year.)

Quantitative Data Examination  Checking assumptions: samples were selected independently; the number of individuals in each group exceeds 30; the standard deviation of ACT and growth scores for all students in each class is unknown. A potential threat to validity is the lack of random selection. (How was the program used? Consistently?)

Qualitative Data Examination A focus group of eight randomly selected juniors was held to discuss their impressions, observations, likes, and dislikes of using E-PREP as a tool to improve their performance on the ACT. The conversation was recorded and transcribed.

QUESTIONS  1. How many times did you take E-PREP? How much time was involved? Did you ever utilize E-PREP at home? At the public library? On your cell phone?  2. After experiencing E-PREP, did you find it rigorous/difficult/challenging? Why or why not?  3. Did you give E-PREP your best effort? Why or why not?  4. As a result of participating in E-PREP, do you think your ACT score improved? Why or why not?  5. Were there other strategies/things that you did to improve your ACT score along with using E-PREP? If so, what were they?

QUESTIONS  6. Did the technology work for you?  7. Did your teacher, principal, assistant principal, or guidance counselor discuss your E-PREP results with you? How? When?  8. If your teacher reviewed your E-PREP results with you, did instruction within your classroom change as a result of these results in order to help you improve your ACT score? How?  9. If our current sophomores who will be juniors use E-PREP, do you think this program would benefit them in their ACT preparation? Why or why not?  10. What, if anything, could be improved in using E-PREP? Why?

Qualitative Data Conclusions After convening this focus group of students to discuss E-PREP, the following are common observations and impressions: *Students desire a set schedule of when E-PREP will be given; let them know when they will be expected to perform. *Give students time to do the tutoring and individual lessons that E-PREP offers to help them improve. *Individualize instruction so students can improve once they get their E-PREP results back; students want teachers to go over their results with them (feedback).

 *Students suggested counting E-PREP as part of their grade; recognize students who are working hard and doing what they're supposed to do.  *Time: give students time to read the questions they missed and the correct answers.  *Full implementation is needed; E-PREP will do more, but we don't know how to use it all yet.  *All teachers need to take E-PREP seriously.

NOTE  The variability in administration approaches and use of E-PREP from class to class may undermine the validity of statistical analyses. Greater uniformity of use is suggested if results are to be studied statistically.

Final Conclusions  *The consensus of the focus group is that E-PREP is a valuable tool and does impact ACT results by giving students confidence with the types of questions asked on this assessment and the time constraints imposed by the test.  *E-PREP will continue to be implemented next year, and more data will be collected to determine the value of this program in improving ACT scores among our juniors.