What Should You Consider When Talking About Technical Assessments?
John M. Townsend
Tennessee Board of Regents
NACTEI 27th Annual Conference
Why are you assessing? The most important question:
– For Perkins IV?
– For program improvement?
– For student attainment?
If for student attainment, what level is appropriate?
– Entry level
– Mastery level
– Industry certification level
When do you assess?
– End-of-course
– End-of-program
– End of pathway
– Cluster-related
Norm- or Competency-referenced?
Norm-referenced
– Comparison among individuals
– Demonstrates changes of students as a group
Competency-referenced
– Attainment of an individual
– Measurable learning outcomes based upon industry standards
Buy or Develop?
1. Standardized instruments
2. Centralized state-developed instruments
3. Local assessments recognized by the state
Performance or Cognate?
Performance
– What the student should be able to do
– Scored with a checklist
– A limited number of areas can be assessed in a specified time
Cognate
– What the student should know
– Scored with test items
– Many areas can be assessed in a specified time
Building an Assessment
– Adopt: items or assessments that you can use as is
– Adapt: items or assessments that can be modified to fit your standards
– Build: new items or assessments when suitable ones cannot be found
Table of Specifications (Blueprint)
– Determine the competencies to be assessed
– Determine competency groups (3–10)
– Determine the percentage of each cognitive typology per group
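To make the blueprint idea concrete, here is a minimal sketch of turning competency-group weights and cognitive-typology percentages into item counts. The group names, percentages, and 60-item total are invented for illustration; they are not from the presentation.

```python
# Illustrative table-of-specifications sketch: allocate test items across
# competency groups and cognitive typologies. All names and numbers below
# are hypothetical examples.

TOTAL_ITEMS = 60

# Weight of each competency group on the assessment (sums to 1.0).
group_weights = {
    "Safety": 0.20,
    "Tools & Equipment": 0.30,
    "Procedures": 0.35,
    "Documentation": 0.15,
}

# Share of each cognitive typology within every group (sums to 1.0).
cognitive_mix = {
    "Recall": 0.40,
    "Application": 0.40,
    "Analysis": 0.20,
}

# Build the blueprint: items allocated to each group x cognitive-level cell.
blueprint = {
    group: {
        level: round(TOTAL_ITEMS * g_wt * c_wt)
        for level, c_wt in cognitive_mix.items()
    }
    for group, g_wt in group_weights.items()
}

for group, cells in blueprint.items():
    print(group, cells, "group total:", sum(cells.values()))
```

Rounding can make the cell counts sum to slightly more or less than the target, so the final blueprint usually gets a manual adjustment pass.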
Development Questions
– Are the cost and time worth the benefit of the test?
– Who should develop the assessment?
– What are the test security issues?
Multiple-choice elements:
– Stem
– Correct response
– Foils or distractors
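As one way to represent these three elements in an item bank, a small sketch follows. The class and field names are hypothetical, not a standard item-banking schema, and the sample item is invented.

```python
# Minimal sketch of a multiple-choice item record with the three elements
# named above: stem, correct response, and foils (distractors).
from dataclasses import dataclass
import random


@dataclass
class MultipleChoiceItem:
    stem: str                 # the question or problem statement
    correct_response: str     # the keyed answer
    distractors: list[str]    # plausible but incorrect options (foils)

    def shuffled_options(self) -> list[str]:
        """Return all answer options in randomized order for delivery."""
        options = [self.correct_response, *self.distractors]
        random.shuffle(options)
        return options


item = MultipleChoiceItem(
    stem="Which instrument measures electrical current?",
    correct_response="Ammeter",
    distractors=["Voltmeter", "Ohmmeter", "Wattmeter"],
)
print(item.shuffled_options())
```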
Fixed-form vs. Variable-form
Fixed-form
– High reliability
– Needs high security
– Limited number of test items needed
Variable-form
– Good reliability
– Less security needed
– Requires a large number of test items
Assessment Administration
– Decide the media for administration
– Identify who should conduct the assessment
– Determine when the assessment activities should occur
Reporting issues:
1. Grading of the assessment
2. Assessment improvement needs
3. Reports needed
Assessment Review
Validity – the degree to which certain inferences can be made from test scores.
Reliability – the degree of consistency between two or more measures of the same thing.
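As a concrete illustration of the reliability idea, here is a minimal sketch of one common internal-consistency estimate, Cronbach's alpha, computed on an invented 0/1 score matrix (rows are students, columns are items). The choice of index and the data are assumptions for illustration; the presentation does not prescribe a particular reliability statistic.

```python
# Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
import statistics

scores = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
]

k = len(scores[0])                                        # number of items
item_vars = [statistics.pvariance(col) for col in zip(*scores)]
total_var = statistics.pvariance([sum(row) for row in scores])

alpha = (k / (k - 1)) * (1 - sum(item_vars) / total_var)
print(f"Cronbach's alpha: {alpha:.2f}")
```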
Item & Assessment Analysis
Find a good statistician, psychometrician, or educational psychologist!
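Expert help is the right call for a full analysis, but as a rough illustration, the sketch below computes two classical item statistics, difficulty (p-value) and a simple upper-lower discrimination index, on invented response data. A psychometrician would normally use a dedicated package and a much larger sample.

```python
# Classical item analysis for one dichotomously scored item.
# Each tuple is (1 if the examinee answered this item correctly else 0, total test score).
responses = [(1, 38), (0, 22), (1, 35), (1, 30), (0, 18), (1, 40), (0, 25), (1, 33)]

# Difficulty: proportion of examinees answering the item correctly.
p_value = sum(correct for correct, _ in responses) / len(responses)

# Discrimination: correct rate in the top half of scorers minus the bottom half.
ranked = sorted(responses, key=lambda r: r[1], reverse=True)
half = len(ranked) // 2
upper, lower = ranked[:half], ranked[-half:]
discrimination = (sum(c for c, _ in upper) - sum(c for c, _ in lower)) / half

print(f"difficulty p = {p_value:.2f}, discrimination D = {discrimination:.2f}")
```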
Ethical Issues in Testing
Code of Fair Testing Practices in Education, by the Joint Committee on Testing Practices (American Educational Research Association, American Psychological Association, and National Council on Measurement in Education)
Legal Issues
– Accommodations for members of special populations
– Access to the test items by parents and others
– Use of assessments in relation to passing or graduation
Development Cycle
– Competency learning outcome development
– Blueprint development
– Test-item development
– Field testing
– Item analysis
– Set cut scores
– Test form development
– Assessment administration
– Reporting of results
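For the "set cut scores" step, one widely used standard-setting approach is a modified Angoff procedure; the sketch below illustrates its basic arithmetic with invented panelist ratings. This is offered only as an example of the step, since the presentation does not prescribe a particular standard-setting method.

```python
# Modified Angoff sketch: each panelist estimates the probability that a
# minimally competent examinee answers each item correctly; a panelist's raw
# cut score is the sum of those ratings, and the panel's cut score is the
# mean across panelists. Ratings below are invented for illustration.

panelist_ratings = [
    [0.8, 0.6, 0.9, 0.5, 0.7],   # panelist 1, one rating per item
    [0.7, 0.5, 0.8, 0.6, 0.6],   # panelist 2
    [0.9, 0.6, 0.8, 0.5, 0.8],   # panelist 3
]

panelist_cuts = [sum(ratings) for ratings in panelist_ratings]
cut_score = sum(panelist_cuts) / len(panelist_cuts)

print(f"recommended cut score: {cut_score:.1f} of {len(panelist_ratings[0])} items")
```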
Contact Information:
Dr. John M. Townsend
Executive Director, Workforce Development
Office of Academic Affairs
Tennessee Board of Regents
1415 Murfreesboro Pike, Suite 350
Nashville, TN