
1 Cheryl Ng Ling Hui; Hee Jee Mei, Ph.D.; Universiti Teknologi Malaysia
Teacher Practices in Scoring Multiple-Choice Items, Interpreting Test Scores and Performing Item Analysis
Cheryl Ng Ling Hui and Hee Jee Mei, Ph.D., Universiti Teknologi Malaysia
Introduction: M.Ed. (Measurement and Evaluation), final semester; carried out as part of my master's project.

2 Introduction
Multiple-choice items are widely used to assess student learning; they feature prominently in public examinations and likewise in teacher-made tests. Because multiple-choice items are easy to score objectively, more items can be included in a single test, giving better coverage of the scope of study. However, a raw score alone does not give much information of educational value.
Research project: to develop a tool to assist teachers in scoring multiple-choice items, interpreting test scores and performing item analysis. This preliminary needs analysis study was therefore conducted.

3 Aim and Purposes of the Study
Objectives:
1. Identify teachers' current practices and the challenges they face in scoring multiple-choice items, interpreting test scores and performing item analysis; and
2. Ascertain the prevalence of OMR scanners and item analysis software in Malaysian secondary schools.
Aim: findings from this preliminary study serve to inform the subsequent development of software to aid teachers in these three aspects.

4 Methodology Research Design: Survey
Participants: 130 secondary school teachers serving in public schools across Malaysia.
Instrument: questionnaire with three sections:
Section A: Demographic Profile – gender, location and type of school, teaching experience
Section B: Teacher Practices – scoring OMR answer sheets, interpreting test scores, performing item analysis
Section C: OMR Scanner and Item Analysis Software – availability in school, willingness to invest in one, etc.

5 Methodology
Procedures:
Questionnaires were disseminated to various schools with the help of colleagues and friends. An online version of the questionnaire was created using a free online survey tool, and the link was sent to potential respondents. The collected data were analysed using descriptive statistics.

6 Respondents’ Demographic Profile
Number of respondents: N = 130 teachers
Number of schools represented: 74
Teaching experience: 0–33 years

7 Teachers' Practices: Scoring
Teachers' estimates of the time (in seconds) needed to score one OMR answer sheet (assuming 40 items), count the number of correct answers, and convert the raw score to a percentage:

Table 1. Means and Standard Deviations for Teachers' Practices in Scoring (time in seconds)

Task | Mean | Standard Deviation
Scoring an OMR answer sheet (N = 130) | 51.52 | 37.69
Counting the number of correct answers (N = 127) | 27.29 | 26.43
Converting the raw score to a percentage (N = 114) | 22.40 | 25.87
Total | 101.21 |

As the number of items differs across subjects and levels, respondents were told to assume 40 items. Scoring + counting + converting = 101.21 seconds per sheet; for a class of 40 students, about 1 hr 8 min.

8 Teachers' Practices: Scoring
Teachers have to spend a substantial amount of time processing OMR answer sheets when they do it manually. Based on the mean of teachers' estimates, one OMR answer sheet takes around 101 seconds (i.e. 1.69 minutes) to mark.
5 classes x 40 students = 200 OMR answer sheets
200 sheets x 1.69 minutes = 338 minutes ≈ 5 hours 38 minutes
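The arithmetic on this slide can be reproduced in a few lines (a quick sketch in Python; the per-sheet times are the means reported in Table 1, and the 5-classes-of-40 scenario is the slide's assumed workload):

```python
# Mean per-sheet processing times (seconds) from Table 1.
scoring, counting, converting = 51.52, 27.29, 22.40

per_sheet_seconds = scoring + counting + converting    # 101.21 s per sheet
per_sheet_minutes = round(per_sheet_seconds / 60, 2)   # ~1.69 min per sheet

# Assumed scenario from the slide: 5 classes of 40 students.
sheets = 5 * 40                                        # 200 answer sheets
total_minutes = round(per_sheet_minutes * sheets)      # ~338 min in total
hours, minutes = divmod(total_minutes, 60)

print(f"{sheets} sheets x {per_sheet_minutes} min = about {hours} h {minutes} min")
```

Running this reproduces the slide's estimate of roughly 5 hours 38 minutes of manual marking per test.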

9 Teachers' Practices: Scoring
They would need to spend even more time on the following additional steps (reported by individual teachers, with their own time estimates):
"Check if two answers were marked for one question" (3 s)
"Make a note of items answered wrongly by students" (120 s)
"Recheck for errors in marking" (20 s)
"Key in the data" (60 s)

10 School Requirements to Interpret Test Scores
The respondents were asked to indicate on a five-point Likert scale (every time, often, occasionally, seldom and never) how frequently they were required by their schools to interpret test scores. The result shows that school requirements vary from school to school. Mean = 3.09, S.D. = 1.406

11 Teachers' Practices – Interpreting Test Scores

Practice | Every time (5) | Often (4) | Occasionally (3) | Seldom (2) | Never (1) | Mean | S.D.
Determine class average | 25 (19%) | 20 (15%) | 24 (18%) | 28 (22%) | 33 (25%) | 2.82 | 1.462
Determine standard deviation | 5 (4%) | – | 23 (21%) | – | 69 (53%) | 1.84 | 1.091
Convert raw scores to standard scores | 15 (12%) | 8 (6%) | 12 (9%) | 30 (23%) | 65 (50%) | 2.06 | 1.374

Only a third of the respondents (34%) consistently determine the class mean; almost half (47%) seldom or never do so. Almost three quarters of the respondents indicated that they seldom or never determine the standard deviation (74%) or convert raw scores to standard scores (73%).
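For readers unfamiliar with the three practices surveyed here, each amounts to a few lines of arithmetic (a minimal sketch on hypothetical raw scores; the T-score convention T = 50 + 10z is one common choice of standard score, not something prescribed by the study):

```python
import statistics

# Hypothetical raw scores for one class (illustration only).
scores = [35, 28, 22, 30, 25, 18, 32, 27]

class_average = statistics.mean(scores)   # determine class average
sd = statistics.stdev(scores)             # determine standard deviation (sample SD)

# Convert raw scores to standard scores:
# z = (x - mean) / sd, then T = 50 + 10z (one common convention).
z_scores = [(x - class_average) / sd for x in scores]
t_scores = [50 + 10 * z for z in z_scores]

print(f"mean = {class_average:.2f}, sd = {sd:.2f}")
```

T-scores put every class on the same 50-centred scale, which makes performance comparable across tests with different raw-score ranges.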

12 Challenges in Interpreting Test Scores
"I struggle with time constraints." (106 respondents, 81.54%)
"I do not know how to do it." (44 respondents, 33.85%)
Other comments:
"Too much paper work."
"Do not do it as I find the result irrelevant."
"Do not face any challenges (not required to do it / SU Peperiksaan does it for teachers)"
Time constraint is the greatest challenge hindering teachers from interpreting test scores (106 respondents, 81.54%). Another third of the respondents reported that they did not know how to interpret test scores (44 respondents, 33.85%).

13 School Requirements to Perform Item Analysis
School requirements to perform item analysis also vary from school to school. Comparatively, more schools require teachers to perform item analysis than to interpret test scores. Mean = 3.44, S.D. = 1.341

14 Teachers' Practices – Performing Item Analysis

Practice | Every time (5) | Often (4) | Occasionally (3) | Seldom (2) | Never (1) | Mean | S.D.
Determine the item difficulty of each item | 19 (15%) | 41 (32%) | 30 (23%) | 24 (19%) | 16 (12%) | 3.18 | 1.256
Determine the discrimination index of each item | 6 (5%) | 16 (12%) | 32 (24%) | 36 (28%) | 40 (31%) | 2.32 | 1.169

70% of the respondents determine the item difficulty (every time 15%; often 32%; occasionally 23%). More than half (59%) seldom or never determine the discrimination index; only 5% did so for every test.
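The two indices in the table have simple definitions: item difficulty p is the proportion of students who answered the item correctly, and a common discrimination index is D = p_upper - p_lower, comparing the top and bottom scorer groups (often 27% of the class each). A minimal sketch on hypothetical 0/1 response data:

```python
# Hypothetical responses: rows = students sorted by total score (best first),
# columns = items; 1 = correct, 0 = wrong. Illustration only.
responses = [
    [1, 1, 1],
    [1, 1, 0],
    [1, 0, 1],
    [1, 1, 0],
    [0, 1, 0],
    [1, 0, 0],
    [0, 0, 1],
    [0, 0, 0],
]

n = len(responses)
k = max(1, round(0.27 * n))          # upper/lower group size (27% rule)
upper, lower = responses[:k], responses[-k:]

difficulty, discrimination = [], []
for item in range(len(responses[0])):
    p = sum(row[item] for row in responses) / n    # difficulty index p
    p_upper = sum(row[item] for row in upper) / k
    p_lower = sum(row[item] for row in lower) / k
    difficulty.append(p)
    discrimination.append(p_upper - p_lower)       # discrimination index D

print("difficulty:", difficulty)
print("discrimination:", discrimination)
```

A D near 0 (like the third item here) signals an item that fails to separate strong from weak students, which is exactly the retain/modify/discard evidence discussed later in the presentation.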

15 Challenges in Performing Item Analysis
"I struggle with time constraints." (115 respondents, 88.46%)
"I do not know how to do it." (42 respondents, 32.31%)
Other comments:
"The information from item analysis is considered irrelevant by students."
"Do not do it as I find the result irrelevant."
"Do not face any challenges (not required to do it)"
Similarly, time constraint is the greatest challenge hindering teachers from performing item analysis (115 respondents, 88.46%). Another third of the respondents reported that they did not know how to do it (42 respondents, 32.31%).

16 Prevalence of OMR Scanners & Item Analysis Software
Out of the 74 schools represented in this study, only 13 schools (18%) have invested in an OMR scanner and 16 schools (22%) provide teachers with item analysis software.

17 Teachers’ Willingness to Invest in the Tool

18 Teachers' Willingness to Invest in the Tool
Among those willing to invest (N = 84), respondents gave a wide range of answers when asked how much they would pay for such a tool, from RM5 (min) to RM3000 (max). The most common response was RM100 (mode, 22 respondents); the median was slightly higher at RM150; the mean was RM337.65.
Min = RM5, Mode = RM100, Median = RM150, Mean = RM337.65, Max = RM3000

19 Discussion
Teachers have to spend a substantial amount of time scoring OMR answer sheets, interpreting test scores and performing item analysis when they do it manually. Time constraint is the greatest challenge that hinders teachers from interpreting test scores and performing item analysis. Therefore, a tool to aid teachers in scoring OMR answer sheets would greatly reduce their burden and allow them more time to focus on enhancing test items and improving their teaching practice. However, the tool should be made available to teachers at an affordable price (RM100–RM350).

20 Discussion
Interpreting test scores and performing item analysis are crucial aspects of the assessment process. Interpreting test scores gives teachers insight into their students' performance in the test and helps to highlight areas of weakness that need to be addressed. Information from item analysis is crucial for improving item quality: from the item analysis results, teachers can decide whether to retain, modify or discard test items, and functional and modified items can be stored for future use (Gronlund and Linn, 1990; Reynolds, Livingston and Willson, 2010).

21 Discussion Requirements to interpret test scores and perform item analysis vary from school to school. If teachers neglect to interpret test scores and perform item analysis, then the value of the assessment process is greatly reduced. Suggestion: Teachers should be required to interpret test scores and perform item analysis as part of their professional practice. However, they should be released from the burden of clerical work (i.e. manually scoring OMR answer sheets and keying in the data) which could be done effectively with the assistance of an OMR processing and item analysis tool.

22 Discussion In conclusion, there is a need to develop an affordable and teacher-friendly tool to help teachers to process OMR answer sheets and provide them with information on students’ performance in the test as well as the quality of the items.

23 Universiti Teknologi Malaysia
~Thank you~ Cheryl Ng Ling Hui Universiti Teknologi Malaysia

