Presented by Ogbonnaya “John” Nwoha, Rosemary Mokia, & Bilal Dia
ERCBEC Conference, Myrtle Beach, SC, Oct. 22 - 24, 2009

Presentation Outline
- Introduction
- Review of Literature
- Study Design
- Data
- Methodology
- Results
- Conclusion

Introduction
The purpose of the study is to ascertain whether the use of Aplia makes any difference in students' test scores. It is hypothesized that students who use Aplia in a Business Statistics course will score higher on course tests than students who do not use the technology.

Introduction Cont'd
The study is necessary to justify the money and time spent by students and faculty in adopting the Aplia technology. The study provides an independent assessment of the technology's claims. The study also contributes to the literature on Computer-Assisted Instruction (CAI) in general and Aplia in particular.

Literature Reviewed
- CAI as a supplement to classroom instruction (Burke et al., 1992)
- CAI versus paper-based assessments (Clariana & Wallace, 2002)
- Differences in exam scores by:
  - gender, ethnicity, GPA, ACT, SAT, and citizenship (Zhang et al., 2008)
  - employment (Swanson et al., 2006)

Literature Reviewed Cont'd
Differences in exam scores by:
- age, educational level, learning style, self-efficacy, task value, and motivation (Yukselturk et al., 2007)
- class attendance (Tiruneh, 2007)
- nationality, race, and ethnic background (Viadero, 2003)

Study Design
- Two sections of Business Statistics
- Taught by the same instructor
- Offered on the same days of the week
- Administered the same tests
- Identical course syllabi
- Graded components included Aplia assignments for one section only

The Grading Scales

Table 1: Grading Scales

Grading Scale for the AM Class (Aplia)
Students will be evaluated in the following manner:
1. Three exams (worth 20 percent each): 60%
2. Nine quizzes (worth 1 percent each): 9%
3. Assignments and homework: 6%
4. Aplia assignments: 25%
Grading will be on a scale as follows:
A: 90 – 100 percent
B: 80 – 89 percent
C: 70 – 79 percent
D: 60 – 69 percent
F: Less than 60 percent

Grading Scale for the PM Class (No Aplia)
Students will be evaluated in the following manner:
1. Three exams (worth 100 points each): 300 points
2. Nine quizzes (worth 10 points each): 90 points
3. Assignments and homework: 60 points
Grading will be on a scale as follows:
A: 405 – 450 points
B: 360 – 404 points
C: 315 – 359 points
D: 270 – 314 points
F: Less than 270 points
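For illustration only, a minimal Python sketch of how the AM (Aplia) weighting scheme in Table 1 combines the graded components is shown below; the component scores and helper names are hypothetical, not taken from the study.

    # Minimal sketch of the AM (Aplia) section's weighting scheme from Table 1.
    # The example component scores below are hypothetical.

    def final_percentage(exam_pcts, quiz_pcts, homework_pct, aplia_pct):
        """Combine graded components using the AM-class weights:
        three exams at 20% each, nine quizzes at 1% each,
        assignments/homework at 6%, Aplia assignments at 25%."""
        exams = sum(0.20 * p for p in exam_pcts)    # up to 60 points
        quizzes = sum(0.01 * p for p in quiz_pcts)  # up to 9 points
        homework = 0.06 * homework_pct              # up to 6 points
        aplia = 0.25 * aplia_pct                    # up to 25 points
        return exams + quizzes + homework + aplia

    def letter_grade(pct):
        if pct >= 90: return "A"
        if pct >= 80: return "B"
        if pct >= 70: return "C"
        if pct >= 60: return "D"
        return "F"

    # Hypothetical student: three exams, nine quizzes, homework at 90%, Aplia at 95%.
    score = final_percentage([75, 82, 88],
                             [100, 90, 80, 100, 70, 100, 90, 100, 85],
                             90, 95)
    print(round(score, 1), letter_grade(score))  # 86.3 B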

Data
- Test scores for three exams were recorded
- Aplia scores for Aplia users were obtained
- A questionnaire was administered to obtain additional information on demographics, study style, employment, preferred class time, level of computer use, and many other variables
- Open-ended questions asked for the pros and cons of the Aplia technology

Methodology
Descriptive statistics were used to summarize the data on exam scores and Aplia scores. A test of the difference between two population means was used to make inferences in this study. The Microsoft Excel Analysis ToolPak was used to implement the analysis.
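As a rough illustration of the same kind of analysis described above (carried out by the authors in the Excel Analysis ToolPak), the Python sketch below runs a two-sample, two-tailed t-test on hypothetical score lists; the actual class data are not reproduced here.

    # Minimal sketch of a two-sample test of the difference between means,
    # analogous to the Excel Analysis ToolPak procedure used in the study.
    # The score lists below are hypothetical placeholders, not the study data.
    from scipy import stats

    am_scores = [78, 85, 90, 72, 88, 95, 81, 77]  # hypothetical AM (Aplia) exam scores
    pm_scores = [65, 70, 58, 74, 62, 69, 71, 60]  # hypothetical PM (no Aplia) exam scores

    # Descriptive statistics
    print("AM mean:", sum(am_scores) / len(am_scores))
    print("PM mean:", sum(pm_scores) / len(pm_scores))

    # Two-sample t-test (two-tailed); equal_var=False gives Welch's test,
    # which does not assume equal population variances.
    t_stat, p_value = stats.ttest_ind(am_scores, pm_scores, equal_var=False)
    print("t =", round(t_stat, 3), "two-tailed p =", round(p_value, 4))

    # The difference is judged statistically significant when the p-value
    # falls below the chosen significance level (the slides report
    # significance at the 2 percent level).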

Results

Table 2: Mean Test Scores

              AM Class (Aplia)   PM Class (No Aplia)   Difference   P-Value (two-tail)
EXAM 1
EXAM 2
EXAM 3
EXAM TOTAL
Sample Size          54                  42

Results Cont'd
Table 2 indicates that:
- The mean score for the AM class (Aplia users) was higher than for the PM class on all three exams
- The differences were:
  - 11 percent for the first exam
  - 16 percent for the second exam
  - 23 percent for the third exam
- The differences are statistically significant at the 2% level

Results Cont'd
Interestingly, the difference in mean scores between the two classes increased with each exam. Possible explanations:
- The students became increasingly familiar with Aplia as the semester progressed
- The students increased their use of Aplia as the instructor began every class session with a discussion of the Aplia assignments

Results Cont'd
Table 3: Frequency distribution for the sum of the three exams (EXAMTOTAL)

Scores    AM Class (Aplia)    PM Class (No Aplia)    Grade Equivalent
              23 (43%)            31 (74%)               F
               9 (17%)             5 (12%)               D
               7 (13%)             2 (5%)                C
               9 (17%)             2 (5%)                B
               6 (11%)             2 (5%)                A
Sample Size        54                  42

The percent distribution for each class is in parentheses.
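As a rough illustration of how a grade-equivalent frequency distribution like Table 3 can be tabulated from EXAMTOTAL scores, the Python sketch below uses a hypothetical score list and a hypothetical percentage-based cutoff rule; neither is taken from the study.

    # Minimal sketch of tabulating a grade-equivalent frequency distribution
    # from total exam scores, in the spirit of Table 3. The score list and
    # the percentage cutoffs are hypothetical illustrations.
    from collections import Counter

    def grade_equivalent(total, max_total):
        pct = 100 * total / max_total
        if pct >= 90: return "A"
        if pct >= 80: return "B"
        if pct >= 70: return "C"
        if pct >= 60: return "D"
        return "F"

    exam_totals = [262, 199, 140, 231, 287, 175, 210, 255, 120, 268]  # hypothetical EXAMTOTAL values
    max_total = 300  # e.g., three exams worth 100 points each

    counts = Counter(grade_equivalent(t, max_total) for t in exam_totals)
    n = len(exam_totals)
    for grade in "FDCBA":
        c = counts.get(grade, 0)
        print(f"{grade}: {c} ({100 * c / n:.0f}%)")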

Results Cont'd
Table 3 reveals that:
- 74 percent of the students who did not use Aplia would have failed the course if exam scores were the only consideration in assigning final grades. This compares to 43 percent "F" grades for those who used Aplia.
- 28 percent of those who used Aplia would have made either an "A" or a "B", compared to only 10 percent of those who did not use Aplia.

Results Cont'd
In the open-ended questions, most of the students stated that Aplia helped them understand the material. On the con side, students stated that Aplia was expensive and time-consuming.

Conclusions
- Students who used Aplia scored higher on business statistics exams than those who did not use the technology. Aplia users scored 16.5 percentage points higher, on average, than non-users.
- Students identified cost and time as the disadvantages of using Aplia. However, the students who used Aplia acknowledged that it was useful in furthering their understanding of business statistics.

Conclusions Cont'd
One implication of the findings of this study is that CAI tools, including Aplia, should be part of an instructor's toolkit. These tools fit the way most of our students work: they spend a lot of time on computers and enjoy all kinds of computer games, so they might as well be meaningfully engaged on the computer.

Future Studies
- Willingness to pay
- Multinomial vs. simple regression
- Concentrating on intervening variables
- ANOVA
- Report on other variables collected