Teacher Practices in Scoring Multiple-Choice Items, Interpreting Test Scores & Performing Item Analysis – Cheryl Ng Ling Hui, Hee Jee Mei, Ph.D, Universiti Teknologi Malaysia




Similar presentations
HONG KONG EXAMINATIONS AND ASSESSMENT AUTHORITY PROPOSED HKDSE ENGLISH LANGUAGE ASSESSMENT FRAMEWORK.

Advanced Topics in Standard Setting. Methodology Implementation Validity of standard setting.
Research Ethics Levels of Measurement. Ethical Issues Include: Anonymity – researcher does not know who participated or is not able to match the response.
Quantifying Data.
Perception between Regular and Sped teachers in Handling Children with Intellectual Disability: Basis for a Specialized Training Program for Teachers by.
Item Analysis: Classical and Beyond SCROLLA Symposium Measurement Theory and Item Analysis Modified for EPE/EDP 711 by Kelly Bradley on January 8, 2013.
Evaluation and analysis of the application of interactive digital resources in a blended-learning methodology for a computer networks subject F.A. Candelas,
The Learning Behaviors Scale
Office of Institutional Research, Planning and Assessment January 24, 2011 UNDERSTANDING THE DIAGNOSTIC GUIDE.
Student Engagement Survey Results and Analysis June 2011.
What is EzyOMR? Answer sheet reading solution which provides – High speed scanning – 100% Accuracy – High Flexibility – Low cost – So easy for everyone.
Exam Taking Kinds of Tests and Test Taking Strategies.
Tailoring Course Evaluations/Student Feedback to Improve Teaching Jeffrey Lindstrom, Ph.D. Siena Heights University Webinar 6 October 2014.
IntroductionDiscussion  Academic, mental health, behavioral, and social deficits in student adjustment are major causes of college attrition rates. 1.
Standardized Testing (1) EDU 330: Educational Psychology Daniel Moos.
Descriptive Research Study Investigation of Positive and Negative Affect of UniJos PhD Students toward their PhD Research Project Dr. K. A. Korb University.
Your Own Research Method and Materials. Procedure BATs Write a method Create materials such as - consent forms, standardised instructions, questionnaire,
Assessment and Testing
The effects of Peer Pressure, Living Standards and Gender on Underage Drinking Psychologist- Kanari zukoshi.
Researching Technology in South Dakota Classrooms Dr. Debra Schwietert TIE Presentation April 2010 Research Findings.
Psychometrics: Exam Analysis David Hope
Assistant Instructor Nian K. Ghafoor Feb Definition of Proposal Proposal is a plan for master’s thesis or doctoral dissertation which provides the.
Viewpoints of Academics and Students on Using the E-EPOSTL at University of Prešov Barbora Popovičová and Nikola Mihoková.
Multivariate Analysis - Introduction. What is Multivariate Analysis? The expression multivariate analysis is used to describe analyses of data that have.
PROCESSING DATA.
Where We Are and Where We Want to Be
Alexandria City Public Schools Preliminary Results of the 2016 Teaching, Empowering, Leading, and Learning (TELL) Survey. Dawn Shephard Associate Director, Teaching,
Why do we need a compensation survey
Dr. Scott Thur Dr. Kathleen Marshall
Multiple Regression: I
Paulina Liżewska Paweł Kamiński
Assessing Students' Understanding of the Scientific Process Amy Marion, Department of Biology, New Mexico State University Abstract The primary goal of.
Multivariate Analysis - Introduction
Assessments Attitudes of UK Teachers & Parents
Key findings on comparability of language testing in Europe ECML Colloquium 7th December 2016 Dr Nick Saville.
Method.
Using EduStat© Software
Parts of an Academic Paper
Classroom Analytics.
Item Analysis: Classical and Beyond
Factors influencing customer behavior
Research Methodology: Instruments and Procedures
Preparing to Teach and Overview of Teaching Assignments
SATs Information Evening
Research amongst Physical Therapists in the State of Kuwait: Participation, Perception, Attitude and Barriers Presented by Sameera Aljadi, PT, PhD Assistant.
AND THE DRAWING BOARD Jamie Long
Digital Learning Framework Evaluation Overview
Awatef Ahmed Ben Ramadan
The new educational model Assessment of project groups
Survey What? It's a way of asking group or community members what they see as the most important needs of that group or community. The results of the.
Calculating Reliability of Quantitative Measures
Theme 4 Describing Variables Numerically
Survey phases, survey errors and quality control system
Asist. Prof. Dr. Duygu FIRAT, Asist. Prof. Dr. Şenol HACIEFENDİOĞLU
Designed for internal training use:
Dept. of Community Medicine, PDU Government Medical College,
Student Satisfaction Results
Writing the IA Report: Analysis and Evaluation
This lesson is for both investigation and artefact projects.
Assessment - the hidden curriculum
CLASSROOM ENVIRONMENT AND THE STRATIFICATION OF SENIOR HIGH SCHOOL STUDENT’S MATHEMATICS ABILITY PERCEPTIONS Nelda a. nacion 5th international scholars’
  Using the RUMM2030 outputs as feedback on learner performance in Communication in English for Adult learners Nthabeleng Lepota 13th SAAEA Conference.
M.A. Vargas Londoño E.O. Cardoso Espinosa J.A. Cortés Ruíz
Presentation transcript:

Teacher Practices in Scoring Multiple-Choice Items, Interpreting Test Scores & Performing Item Analysis. Cheryl Ng Ling Hui, Hee Jee Mei, Ph.D, Universiti Teknologi Malaysia. Introduction – M.Ed (Measurement and Evaluation), final semester; this study was carried out as part of my master's project.

Introduction. Multiple-choice items are widely used to assess student learning: they feature prominently in public examinations and in teacher-made tests. Because they are easy to score objectively, more items can be included in a single test, giving better coverage of the scope of study. However, a raw score alone does not give much information of educational value. Research project – to develop a tool to assist teachers in scoring multiple-choice items, interpreting test scores and performing item analysis. Thus, this preliminary needs analysis study was conducted.

Aim and Purposes of the Study. Objectives: (1) identify teachers' current practices and the challenges they face in scoring multiple-choice items, interpreting test scores and performing item analysis; and (2) ascertain the prevalence of OMR scanners and item analysis software in Malaysian secondary schools. Aim: findings from this preliminary study serve to inform the subsequent development of software to aid teachers in these three aspects.

Methodology. Research design: survey. Participants: 130 secondary school teachers serving in public schools across Malaysia. Instrument: a questionnaire with three sections. Section A: Demographic Profile – gender, location and type of school, teaching experience. Section B: Teacher Practices – scoring OMR answer sheets, interpreting test scores, performing item analysis. Section C: OMR Scanner and Item Analysis Software – availability in school, willingness to invest in one, etc.

Methodology. Procedures: questionnaires were disseminated to various schools with the help of colleagues and friends. An online version of the questionnaire was created using a free online survey tool, and the link was sent to potential respondents. The collected data were analysed using descriptive statistics.

Respondents' Demographic Profile. Number of respondents: N = 130 teachers. Number of schools represented: 74. Teaching experience: 0–33 years.

Teachers' Practices: Scoring. Teachers estimated the amount of time (in seconds) needed to score one OMR answer sheet (assuming 40 items), count the number of correct answers and convert the raw score to a percentage.

Table 1. Means and Standard Deviations for Teachers' Practices in Scoring (seconds)
- Scoring an OMR answer sheet (N = 130): M = 51.52, SD = 37.69
- Counting the number of correct answers (N = 127): M = 27.29, SD = 26.43
- Converting the raw score to a percentage (N = 114): M = 22.40, SD = 25.87
- Total: 101.21

As the number of items differs across subjects and levels, the respondents were told to assume a 40-item test. Scoring + counting + converting = 101.21 seconds; for a class of 40 students, about 1 hour 8 minutes per class.
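All three steps in Table 1 are purely mechanical, which is what makes them a natural target for automation. Below is a minimal Python sketch of the three steps for one sheet; the answer key and the student's responses are made-up examples.

```python
# Score a 40-item multiple-choice sheet against an answer key,
# count the correct answers, and convert the raw score to a percentage.
answer_key = ["A", "C", "B", "D"] * 10   # hypothetical 40-item key
responses = ["A", "C", "D", "D"] * 10    # hypothetical student answers

raw_score = sum(given == correct
                for given, correct in zip(responses, answer_key))
percentage = raw_score / len(answer_key) * 100

print(f"Raw score: {raw_score}/{len(answer_key)} ({percentage:.1f}%)")
# -> Raw score: 30/40 (75.0%)
```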

Teachers' Practices: Scoring. Teachers have to spend a substantial amount of time processing OMR answer sheets when they do it manually. Based on the mean of teachers' estimates, one OMR answer sheet takes around 101.21 seconds (i.e. 1.69 minutes) to mark. 5 classes x 40 students = 200 OMR answer sheets; 200 sheets x 1.69 minutes = 5 hours 38 minutes.
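The same estimate, written out as a quick check (rounding the per-sheet time to 1.69 minutes first, as the slide does):

```python
# Reproduce the slide's time estimate from the survey means (seconds).
per_sheet_s = 51.52 + 27.29 + 22.40        # scoring + counting + converting
per_sheet_min = round(per_sheet_s / 60, 2) # = 1.69 min, as on the slide

sheets = 5 * 40                            # 5 classes x 40 students
total_min = sheets * per_sheet_min         # = 338 minutes
print(f"{per_sheet_s:.2f} s per sheet; "
      f"{total_min // 60:.0f} h {total_min % 60:.0f} min in total")
# -> 101.21 s per sheet; 5 h 38 min in total
```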

Teachers' Practices: Scoring. They would need to spend even more time if they also did the following (times estimated by individual teachers): “Check if two answers were marked for one question” (3 s); “Make a note of items answered wrongly by students” (120 s); “Recheck for errors in marking” (20 s); “Key in the data” (60 s).

School Requirements to Interpret Test Scores. The respondents were asked to indicate on a five-point Likert scale (every time, often, occasionally, seldom, never) how frequently they were required by their schools to interpret test scores. The results show that requirements vary from school to school (M = 3.09, SD = 1.406).

Teachers' Practices – Interpreting Test Scores. Frequencies on the five-point scale (Every time = 5 … Never = 1), N = 130:
- Determine class average: Every time 25 (19%), Often 20 (15%), Occasionally 24 (18%), Seldom 28 (22%), Never 33 (25%); M = 2.82, SD = 1.462
- Determine standard deviation: Every time 5 (4%), … , 23 (21%), … , Never 69 (53%); M = 1.84, SD = 1.091
- Convert raw scores to standard scores: Every time 15 (12%), Often 8 (6%), Occasionally 12 (9%), Seldom 30 (23%), Never 65 (50%); M = 2.06, SD = 1.374

Only a third of the respondents (34%) consistently determine the class mean; almost half (47%) seldom or never do so. Almost three quarters indicated that they seldom or never determine the standard deviation (74%) or convert raw scores to standard scores (73%).
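For context on the third practice above: converting a raw score to a standard score needs only the class mean and standard deviation. A minimal sketch using z-scores and the common T-score rescaling (the choice of T-scores here is an assumption, and the raw scores are made-up examples):

```python
from statistics import mean, stdev

# Standard scores from raw scores: z = (x - mean) / SD,
# then the common T-score rescaling T = 50 + 10z.
raw_scores = [35, 28, 22, 30, 25]   # hypothetical class scores

m, sd = mean(raw_scores), stdev(raw_scores)
for x in raw_scores:
    z = (x - m) / sd
    print(f"raw {x:2d} -> z = {z:+.2f}, T = {50 + 10 * z:.1f}")
```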

Challenges in Interpreting Test Scores. “I struggle with time constraints.” (106 respondents, 81.54%) “I do not know how to do it.” (44 respondents, 33.85%) Other comments: “Too much paper work.” “Do not do it as I find the result irrelevant.” “Do not face any challenges (not required to do it / SU Peperiksaan does it for teachers).” Time constraint is thus the greatest challenge hindering teachers from interpreting test scores, and another third of the respondents reported that they did not know how to do it.

School Requirements to Perform Item Analysis. Requirements to perform item analysis also vary from school to school; comparatively, more schools require teachers to perform item analysis than to interpret test scores (M = 3.44, SD = 1.341).

Teachers' Practices – Performing Item Analysis. Frequencies on the five-point scale (Every time = 5 … Never = 1), N = 130:
- Determine the difficulty of each item: Every time 19 (15%), Often 41 (32%), Occasionally 30 (23%), Seldom 24 (19%), Never 16 (12%); M = 3.18, SD = 1.256
- Determine the discrimination index of each item: Every time 6 (5%), Often 16 (12%), Occasionally 32 (24%), Seldom 36 (28%), Never 40 (31%); M = 2.32, SD = 1.169

70% of the respondents determine the item difficulty (every time 15%; often 32%; occasionally 23%). More than half (59%) seldom or never determine the discrimination index; only 5% did so for every test.
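For reference, both indices named above come from classical test theory and are simple to compute once answer sheets are scored. A minimal sketch, assuming the common upper/lower-27% convention for the discrimination index (the 0/1 response matrix is hypothetical):

```python
# Classical item analysis: difficulty (p) and discrimination index (D).
# Rows = students, columns = items; 1 = correct, 0 = wrong (made-up data).
scored = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 0],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
    [0, 0, 0, 0],
]

n = len(scored)
ranked = sorted(range(n), key=lambda s: sum(scored[s]), reverse=True)
k = max(1, round(0.27 * n))          # size of the upper and lower groups
upper, lower = ranked[:k], ranked[-k:]

for item in range(len(scored[0])):
    p = sum(row[item] for row in scored) / n                 # difficulty
    d = (sum(scored[s][item] for s in upper)
         - sum(scored[s][item] for s in lower)) / k          # discrimination
    print(f"Item {item + 1}: p = {p:.2f}, D = {d:+.2f}")
```

Items with a very low p (nearly everyone got them wrong) or a near-zero D (they fail to separate strong from weak students) are the ones a teacher would flag for revision, which is exactly the decision the proposed tool is meant to support.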

Challenges in Performing Item Analysis. “I struggle with time constraints.” (115 respondents, 88.46%) “I do not know how to do it.” (42 respondents, 32.31%) Other comments: “The information from item analysis is considered irrelevant by students.” “Do not do it as I find the result irrelevant.” “Do not face any challenges (not required to do it).” Similarly, time constraint is the greatest challenge hindering teachers from performing item analysis, and another third of the respondents reported that they did not know how to do it.

Prevalence of OMR Scanners & Item Analysis Software Out of the 74 schools represented in this study, only 13 schools (18%) have invested in an OMR scanner and 16 schools (22%) provide teachers with item analysis software.

Teachers’ Willingness to Invest in the Tool

Teachers' Willingness to Invest in the Tool. Among those willing to invest (N = 84), respondents gave a wide range of answers when asked how much they would pay for such a tool, from RM5 (minimum) to RM3000 (maximum). The most common response was RM100 (mode, 22 respondents); the median was slightly higher at RM150. The mean was RM337.65, with a standard deviation of RM512.97.

Discussion. Teachers have to spend a substantial amount of time scoring OMR answer sheets, interpreting test scores and performing item analysis when they do it manually, and time constraint is the greatest challenge that hinders teachers from interpreting test scores and performing item analysis. Therefore, a tool to aid teachers in scoring OMR answer sheets would greatly reduce teachers' burden and allow them more time to focus on enhancing test items and improving their teaching practice. However, the tool should be made available to teachers at an affordable price (RM100–RM350).

Discussion. Interpreting test scores and performing item analysis are crucial aspects of the assessment process. Interpreting test scores gives teachers insight into their students' performance on the test and helps highlight areas of weakness that need to be addressed. Information from item analysis is crucial for improving item quality: from the results, teachers can decide whether to retain, modify or discard test items, and functional and modified items can be stored for future use (Gronlund and Linn, 1990; Reynolds, Livingston and Willson, 2010).

Discussion Requirements to interpret test scores and perform item analysis vary from school to school. If teachers neglect to interpret test scores and perform item analysis, then the value of the assessment process is greatly reduced. Suggestion: Teachers should be required to interpret test scores and perform item analysis as part of their professional practice. However, they should be released from the burden of clerical work (i.e. manually scoring OMR answer sheets and keying in the data) which could be done effectively with the assistance of an OMR processing and item analysis tool.

Discussion In conclusion, there is a need to develop an affordable and teacher-friendly tool to help teachers to process OMR answer sheets and provide them with information on students’ performance in the test as well as the quality of the items.

~Thank you~ Cheryl Ng Ling Hui, Universiti Teknologi Malaysia cherlinghui@gmail.com http://intoherworldofteaching.wordpress.com