1
Using Item Analysis to Adjust Testing and Topical Coverage in Individual Courses
Bernd S. W. Schröder
College of Engineering and Science, Louisiana Tech University
2
ABET's requirements
- "The institution must evaluate, advise, and monitor students to determine its success in meeting program objectives." (from Criterion 1)
- "(The institution must have) a system of ongoing evaluation that demonstrates achievement of these objectives and uses the results to improve the effectiveness of the program." (Criterion 2d)
3
ABET's requirements (cont.)
- "Each program must have an assessment process with documented results. Evidence must be given that the results are applied to the further development and improvement of the program. The assessment process must demonstrate that the outcomes important to the mission of the institution and the objectives of the program, including those listed above, are being measured." (postlude to outcomes a-k)
4
ABET's requirements and us
- Long-term assessment is the only way to go, but how can more immediate data be obtained?
- Large feedback loops need to be complemented with small feedback loops
- All of this costs time and money
- Assessment still feels foreign to some faculty
- "Free" tools that "do something for me" would be nice
5
Data that is immediately available
- Faculty give LOTS of tests
- Tests are graded and returned
- Next term we make a new test
- ... and we may wonder why things do not improve
6
Presenter's context
- Adjustment of topical coverage in Louisiana Tech University's integrated curriculum
- Presenter had not taught all courses previously
- Some material was moved to "nontraditional" places
- How do we find out what works well?
7
Integrated Courses -- Freshman Year (fall, winter, spring quarters)

Fall:   math 240 (3) -- Precalc algebra & trig, single-variable differential calculus
        engr 120 (2) -- Problem solving, data analysis, team skills, statistics
Winter: math 241 (3) -- Single-variable differential calculus
        engr 121 (2) -- Statics, strengths, report writing, sketching, design
Spring: math 242 (3) -- Integral calculus, intro differential equations
        engr 122 (2) -- Circuits, engr economics, CAD, design project
Science sequence: chem 100 (2) and chem 101 (2) -- Engineering chemistry; phys 201 (3) -- Mechanics
Plus 1 additional class -- History, English, Art, ...

Engineering class themes: Engineering Fundamentals, Teamwork, Communication Skills, Design, Computer Skills, Laboratory Experiences; 2 classes/labs (2 hrs each) per week
8
Integrated Courses -- Sophomore Year (fall, winter, spring quarters)

Fall:   math 242 (3) -- Basic statistics, multivariable integral calculus
        engr 220 (3) -- Statics and strengths
Winter: math 244 (3) -- Multivariable differential calculus, vector analysis
        engr 221 (3) -- EE applications and circuits
Spring: math 245 (3) -- Sequences, series, differential equations
        engr 222 (3) -- Thermodynamics
Science sequence: memt 201 (2) -- Engineering materials; physics 202 (3) -- Electric and magnetic fields, optics
Plus 1 additional class -- History, English, Art, ...

Engineering class themes: Engineering Fundamentals, Teamwork, Communication Skills, Design, Statistics & Engr Economics, Laboratory Experiences; 3 hours lab & 2.5 hours lecture per week
9
Implementation Schedule
- AY 1997-98: One pilot group of 40
- AY 1998-99: One pilot group of 120
- AY 1999-2000: Full implementation
10
Item analysis
- Structured method to analyze (MC) test data
- Can detect "good" and "bad" test questions
  - Awkward formulation
  - "Blindsided" students
- Can detect problem areas in the instruction
  - Difficult material
  - Teaching that was less than optimal
- Plus, data that is usually lost is stored
11
"But I don't give tests..."
- Do you grade projects, presentations, or lab reports with a rubric?
  - Scores are sums of scores on parts
- Do you evaluate surveys? (Gloria asked)
  - Individual questions may have numerical responses (Likert scale)
- Item analysis is applicable to any situation in which many "scores" are to be analyzed
12
Literature
- R. M. Zurawski, "Making the Most of Exams: Procedures for Item Analysis," The National Teaching and Learning Forum, vol. 7, no. 6, 1998, pp. 1-4
  - http://www.ntlf.com
- http://ericae.net/ft/tamu/Espy.htm (Bio!)
- http://ericae.net/ (On-line Library)
13
Underlying Assumptions in the Literature
- Multiple choice
- Homogeneous test
- Need to separate high performers from low performers
- Are these valid for our tests? (This will affect how we use the data.)
14
How does it work?
- Input all individual scores into a spreadsheet
  - If you already use any calculating device to do this, this step is free
  - The same goes for machine-recorded scores (multiple choice, surveys)
- Compute averages, correlations, etc. (a minimal setup is sketched below)
  - But what does it tell us? (Presentation based on actual and "cooked" sample data.)
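A minimal sketch of that spreadsheet step in Python, assuming a hypothetical file scores.csv whose header row holds the item labels and whose remaining rows each hold one student's numeric score on every item (nothing else in the file):

```python
# Sketch: read an item-score table exported as CSV and compute the
# per-student totals and per-item averages mentioned on this slide.
# "scores.csv" is a hypothetical file: header row = item labels,
# each following row = one student's scores on every item.
import csv

with open("scores.csv", newline="") as f:
    rows = list(csv.reader(f))

header = rows[0]
scores = [[float(x) for x in row] for row in rows[1:]]

student_totals = [sum(row) for row in scores]
item_averages = [sum(col) / len(col) for col in zip(*scores)]

print("class average total:", sum(student_totals) / len(student_totals))
for label, avg in zip(header, item_averages):
    print(f"{label}: average score {avg:.2f}")
```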
15
Item Difficulty
- Compute the average score of the students on the given item (see the sketch below)
- Is a high/low average good or bad?
- How do we react?
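One common way to report item difficulty is the mean item score as a fraction of the item's maximum points; a small sketch with made-up scores:

```python
def item_difficulty(item_scores, max_points):
    """Difficulty index: mean score on the item divided by the maximum
    possible points (1.0 means everyone earned full credit)."""
    return sum(item_scores) / (len(item_scores) * max_points)

# Made-up example: a 10-point problem answered by six students
print(item_difficulty([10, 8, 9, 3, 10, 6], max_points=10))  # 0.766...
```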
16
Comparison Top vs. Bottom
- General idea: high performers should outperform low performers on all test items
- Compare the average scores of the top X% to the average scores of the bottom X% (sketched below)
- Problems on which the top group outscores the bottom group by about 30% are good separators (retain)
- Advantage: simple
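A sketch of that comparison under stated assumptions: students are ranked by total test score, and the default cut is the top/bottom 27% commonly used in the item-analysis literature (the slide's X% is a parameter):

```python
def top_bottom_separation(total_scores, item_scores, max_points, fraction=0.27):
    """Separation index for one item: (top-group mean - bottom-group mean),
    as a fraction of the item's maximum points.  Groups are the top and
    bottom `fraction` of the class, ranked by total test score."""
    ranked = sorted(range(len(total_scores)), key=lambda i: total_scores[i])
    k = max(1, int(len(ranked) * fraction))
    bottom, top = ranked[:k], ranked[-k:]
    top_mean = sum(item_scores[i] for i in top) / k
    bottom_mean = sum(item_scores[i] for i in bottom) / k
    return (top_mean - bottom_mean) / max_points

# Values around 0.30 or higher mark good separators (retain the item);
# values near zero or negative flag the item for a closer look.
```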
17
Comparison Top vs. Bottom (cont.)
- Problems on which the bottom group scores near or above the top group should be analyzed
  - Is the formulation intelligible?
  - Was the material taught adequately?
  - Was the objective clear to everyone?
  - Does the problem simply address a different learning style?
  - Is there a problem with the top performers?
18
Comparison Student Group vs. Rest
- The same idea can analyze strengths and weaknesses of specific demographics (even vs. odd, 11-20 vs. rest); a sketch follows
- Knowing a weakness and doing something about it unfortunately need not be the same thing. (3NT)
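A small sketch of the group-vs.-rest comparison for one item; the subgroup indices are whatever split you care about, and the "students 11-20" example below is purely illustrative:

```python
def group_vs_rest(item_scores, group_indices):
    """Mean score on one item for a chosen subgroup vs. everyone else."""
    group = set(group_indices)
    in_group = [s for i, s in enumerate(item_scores) if i in group]
    rest = [s for i, s in enumerate(item_scores) if i not in group]
    return sum(in_group) / len(in_group), sum(rest) / len(rest)

# Illustrative: compare students 11-20 (zero-based indices 10..19) with the rest
# group_mean, rest_mean = group_vs_rest(item_scores, range(10, 20))
```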
19
Comparison Class vs. Class
- If the test is given to several groups of individuals, the scores of the different groups can be compared (sketched below)
- Differences in scores can sometimes be traced to differences in teaching style
- Similarity in scores can reassure faculty that a particular subject may have been genuinely "easy" or "hard"
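A sketch that lays per-item averages for several sections side by side so the differences and similarities the slide mentions are easy to scan; the section names and numbers are made up:

```python
def class_vs_class(item_averages_by_class):
    """Print per-item average scores side by side for several class sections."""
    sections = sorted(item_averages_by_class)
    n_items = len(next(iter(item_averages_by_class.values())))
    for item in range(n_items):
        row = "   ".join(f"{s}: {item_averages_by_class[s][item]:.2f}" for s in sections)
        print(f"item {item + 1}:  {row}")

# Made-up per-item averages for two sections of the same course
class_vs_class({"Section A": [0.82, 0.45, 0.71], "Section B": [0.79, 0.62, 0.40]})
```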
20
Correlation
- Related material should have scores that correlate
- Should individual problem scores correlate with the total? (What if very different skills are tested on the same test?) A sketch of the item-total correlation follows.
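A sketch of the item-total correlation, using a plain Pearson correlation and excluding the item itself from the total so it does not correlate with itself (a standard precaution, not something the slide spells out):

```python
def pearson(x, y):
    """Pearson correlation coefficient (assumes neither list is constant)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def item_total_correlation(scores, item):
    """Correlate one item (column index `item` in a students-by-items
    score matrix) with the total of all other items."""
    item_col = [row[item] for row in scores]
    rest_total = [sum(row) - row[item] for row in scores]
    return pearson(item_col, rest_total)
```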
21
Correlation and Separation
- The two measures are often correlated, but "cross fires" can occur
  - Questions with the same correlation can have different separations, and vice versa
  - A question may separate well yet not correlate well, and vice versa
22
Distractor Analysis
- An incorrect answer choice (distractor) on a multiple-choice item that nobody selected should be replaced
- This is possible by slightly misusing the tool; a counting sketch follows
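A tiny sketch of the counting behind distractor analysis; the responses and answer key below are made up:

```python
from collections import Counter

def distractor_counts(responses):
    """Tally how often each answer choice was selected on one MC item.
    Distractors that nobody picks are doing no work and are candidates
    for replacement."""
    return Counter(responses)

# Made-up responses to one item; the keyed answer is "B"
print(distractor_counts(["B", "B", "A", "B", "D", "B", "A", "B"]))
# Counter({'B': 5, 'A': 2, 'D': 1}) -- choice "C" was never selected
```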
23
Data remains available
- Many faculty look at old tests (their own or others') when making a new test
- Past problems are often forgotten
- Item analysis provides a detailed record of the outcome and allows faculty to re-think testing and teaching strategies
- Anyone who has spent time thinking about curving may want to spend that time on item analysis instead
24
Consequences of the Evaluation (Gloria's law: E = mc²)
- Don't panic; keep the data with the test
- Better test design
- Identification of challenging parts of the class leads to adjustments in coverage
- Students are better prepared for the classes that follow