Technical Assessment Development and Validation: Methods for Ensuring The Utility, Validity and Reliability of Technical Skill Assessment Systems.

Session Outcomes
- Build understanding of key criteria for technical skill assessments: utility, validity, and reliability.
- Understand a state-led process for developing a technical assessment system that meets such criteria.

Carl D. Perkins Career and Technical Education Act of 2006
- Each state established a performance accountability system with multiple measures of student learning, program completion, and transitions to further education, employment, and the military.
- Perkins III allowed wide flexibility in how to measure "technical skill attainment."
- Perkins IV requires a more focused assessment approach for technical skill attainment.

Technical Skill Attainment -- Secondary
Sec. 113(b)(2)(A): "...core indicators of performance...that are valid and reliable...measures of each of the following:"
- "Student attainment of career and technical skill proficiencies, including student achievement on technical assessments, that are aligned with industry-recognized standards, if available and appropriate."

Technical Skill Attainment -- Postsecondary
Sec. 113(b)(2)(B): "...core indicators of performance...that are valid and reliable...measures of each of the following:"
- "Student attainment of career and technical skill proficiencies, including student achievement on technical assessments, that are aligned with industry-recognized standards, if available and appropriate."
- "Student attainment of an industry-recognized credential, a certificate, or a degree."

Critical Features
a) Measure what is important
b) Provide useful and timely feedback to stakeholders
c) Use fair, consistent, and accurate measures (i.e., reliable and valid assessments)

Key concepts: utility, validity, reliability

Utility: something useful or designed for use

Some core assumptions:
1. We have to do this, so let's do it in a way that is going to be maximally useful to our stakeholders.
2. Assessment systems should ultimately influence and reflect what is occurring in the educational setting(s).
3. Without buy-in, items 1 and 2 will never happen.
4. A systematic process for stakeholder involvement and communication must be explicitly planned and built in.

Validity: To what extent does the assessment measure what it is supposed to measure?

Commonly used methods:
- Face and content validity
- Construct validity (convergent, divergent, factor-analytic techniques, etc.)
- Criterion-related validity (concurrent validity, predictive validity)
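Criterion-related validity, for example, is often summarized as a correlation between assessment scores and an external criterion measure. Below is a minimal, hand-rolled Python sketch of concurrent validity; the assessment scores and supervisor ratings are invented purely for illustration.

```python
# Criterion-related (concurrent) validity sketch: correlate technical
# assessment scores with an external criterion collected at the same
# time (here, hypothetical supervisor ratings). All data are invented.

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    var_x = sum((a - mean_x) ** 2 for a in x)
    var_y = sum((b - mean_y) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

assessment_scores = [72, 85, 90, 65, 78, 88, 60, 95]
supervisor_ratings = [3.1, 4.0, 4.5, 2.8, 3.5, 4.2, 2.5, 4.8]

r = pearson_r(assessment_scores, supervisor_ratings)
print(f"Concurrent validity coefficient: r = {r:.2f}")
```

A high positive coefficient suggests the assessment ranks students similarly to the criterion measure; for predictive validity, the criterion would instead be gathered later (e.g., job performance after program completion).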

Reliability refers to the stability or consistency of assessment results: does the assessment yield consistent results across different raters, different periods of time, different samples of tasks, and so forth?

Commonly used methods:
- Internal consistency reliability
- Test-retest reliability
- Inter-rater reliability
- Others (equivalency/parallel forms, expert-rater reliability, etc.)
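Internal consistency is commonly summarized with Cronbach's alpha, which compares the variance of individual item scores to the variance of total scores. The Python sketch below uses invented right/wrong item data purely for illustration.

```python
# Internal-consistency reliability sketch: Cronbach's alpha for a
# small matrix of item scores. Rows are examinees, columns are items;
# the 1/0 data are invented for illustration.

def cronbach_alpha(scores):
    """scores: list of examinee rows, each a list of item scores."""
    k = len(scores[0])          # number of items
    n = len(scores)             # number of examinees

    def variance(values):       # population variance
        m = sum(values) / len(values)
        return sum((v - m) ** 2 for v in values) / len(values)

    item_vars = [variance([row[i] for row in scores]) for i in range(k)]
    total_var = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

item_scores = [
    [1, 1, 1, 0],   # examinee 1: right/wrong on 4 items
    [1, 1, 0, 0],
    [1, 0, 1, 1],
    [0, 0, 0, 0],
    [1, 1, 1, 1],
]
print(f"Cronbach's alpha: {cronbach_alpha(item_scores):.2f}")
```

An alpha of roughly 0.70 or higher is a common (if rough) rule of thumb for low-stakes use; high-stakes assessments typically aim much higher.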

Wyoming CTE Assessment Project Goals
- Establish shared expectations as to what students should know and be able to do in Wyoming's CTE programs
- Develop a valid and reliable CTE assessment system
- Ensure the system provides useful, timely, and accurate feedback to teachers, administrators, students, and employers

Options to fulfill Perkins IV are:
- Use industry-based certifications (IBCs) or other standardized assessments, and/or
- Develop valid and reliable assessments through a statewide collaborative process

Challenges and Considerations
- Access to assessment data to improve classroom instruction
- Dealing with the expense of buying IBCs
- Making sure the IBCs match up to the course content
- Making sure the IBC is valuable to employers and the job market
- Getting data from externally administered exams
- Deciding when to assess (end-of-program or course by course)
- Assessments that are appropriate to various program structures and goals
- Assessing CTE skills AND employability skills

Putting First Things First
FIRST, decide WHAT to assess.
THEN decide HOW to assess.

Sources of Standards
- State standards drawing on SCANS
- State standards drawing on SCCI Knowledge & Skills (K&S)
- State standards drawing on industry standards
- State standards drawing on de facto standards (texts and tests)

Courtesy of Steve Klein & MPR Associates

Can One Assessment Measure It All?
No matter the approach, a program will inherently include applied academic skills, employability skills, cluster- and pathway-level skills, and program/occupation skills. When considering assessments, consider whether one assessment can adequately measure all those skills. Examples of assessments that each cover part of a Program of Study:
- Industry certification test
- State test from national item banks
- Commercial employability skills test

Setting up the Structure: Assessment Project Advisory Group
- Participants: CTE administrators, community college administrators, teachers from various clusters, state agency staff
- Provide general input on the development process
- Liaison to the education communities at secondary and postsecondary levels
- Identify and prioritize clusters for development in the remainder of the project
- Meet in person and through webinars, 2-3 times per year

Setting up the Structure: Business/Industry Advisory Group
- Cross-section of business/industry representatives; should include representatives from each of the three initial clusters (Agriculture, Construction, Manufacturing)
- Provide general input on the development process from a business/industry perspective
- Liaison to the business communities across the state
- Review, provide input on, and affirm content developed by the Cluster/Pathway Work Teams
- Advise on raising the value of CTE and the CTE assessment system within Wyoming business/industry
- Meet in person and through webinars, 2-3 times per year

Setting up the Structure: Cluster/Pathway Work Groups
- 7-10 content experts in each of three clusters: Agriculture, Construction, Manufacturing
- Provide input on the priority competencies to include in the assessment system
- Assist in identifying the relative usefulness and applicability of existing assessments
- Provide input on any state-developed assessments that are determined to be necessary
- Kick-off briefings on March 7; work sessions March 16-20; in-person and webinar follow-up sessions through June
- Optional involvement in the assessment pilot phase

Identifying Competencies and Objectives
March 08: Convene initial Cluster/Pathway Working Groups (CPWGs). Each CPWG identifies core competencies (technical, academic, employability) that need to be assessed in each cluster/pathway.
April-May 08: Draft competencies are completed by the CPWGs and posted online for review. Other Wyoming teachers and faculty are invited to review and comment on the draft competencies. Cluster/pathway competencies are finalized.

Identifying Test Items and Assessment Options
May-June 08 (5/12/08): Manufacturing and Arch/Construction CPWGs meet to review sample test items for the competencies.
June 08: Agriculture/Natural Resources CPWG meets to review sample test items for competencies. The consultant team gathers information on assessment resources (NOCTI, SkillsUSA, industry groups) and delivery system options, and completes a feasibility report for the assessment development phase.

Pilot Testing Assessments and Next Steps
Fall 2008 - Spring 2009: Development and pilot testing of first-phase assessments for the initial clusters. Possibly begin to work with additional Cluster/Pathway Work Groups to identify essential core competencies for other areas.

Example – the Utah CTE assessment system

Overview of Assessment Development Process
1. Identify competencies and objectives.
2. Decide what to assess and how (e.g., develop an assessment blueprint).
3. Feasibility phase: examine existing options.
4. Pilot assessments and conduct the analyses necessary to document their technical quality.
5. Finalize assessments, delivery, and the key features the system must possess.

What is an Assessment Blueprint? An assessment blueprint helps us determine what should be covered in the assessment(s) as well as the number of test items that should be included in each category. You can also use it to help determine the total length of the test as well as the types of items to be included.

Example: NOCTI Experienced Worker Assessment Blueprint
Areas covered in the Building Construction Occupations written assessment:
29% Carpentry
17% Electrical
7% Plumbing
4% Math
6% Metal Work/Guttering
10% Painting & Decorating
8% Building Code & Safety
19% Masonry

Example: Texas Education Agency TAKS 9th Grade Reading
Objective 1: Reading - basic understanding: 9 multiple-choice items
Objective 2: Reading - literary elements and techniques: 12 multiple-choice items, 1 short-answer item
Objective 3: Reading - analysis and critical evaluation: 12 multiple-choice items, 2 short-answer items
Total: 33 multiple-choice items, 3 short-answer items
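Blueprint percentages like those in the NOCTI example can be turned into per-area item counts once a total test length is chosen. The Python sketch below is illustrative only: the 60-item length is an assumption, and simple rounding can leave the total an item or two off, so counts usually need a final manual adjustment.

```python
# Sketch: converting blueprint weights (percentages per content area)
# into item counts for a test of an assumed length. The weights echo
# the NOCTI example; the 60-item length is a made-up assumption.

blueprint = {
    "Carpentry": 29, "Masonry": 19, "Electrical": 17,
    "Painting & Decorating": 10, "Building Code & Safety": 8,
    "Plumbing": 7, "Metal Work/Guttering": 6, "Math": 4,
}
test_length = 60  # assumed total number of items

counts = {area: round(pct / 100 * test_length)
          for area, pct in blueprint.items()}

for area, n in sorted(counts.items(), key=lambda kv: -kv[1]):
    print(f"{area}: {n} items")
# Rounding each area separately may not sum exactly to test_length,
# so the final blueprint typically gets a manual adjustment.
print("Total:", sum(counts.values()))
```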

One last example ... A sample assessment blueprint from a test on human geography crossed content areas (rows: Food, Clothing, Shelter) with objectives (columns: knows common terms, knows specific facts, understands principles, applies principles, interprets charts and graphs), plus a total item count for each row; the individual cell counts did not survive transcription. Constructing the blueprint is a useful process, as it helps ensure that the items produced cover both the content of the program and the educational objectives. It can also allow the balance of 'worth' of individual items to be determined.

Developing an Assessment Blueprint
- What is the relative emphasis you want to place on cluster-level competencies versus pathway-level competencies?
- What is the relative emphasis you want to place on areas within the pathway?
- Within a cluster/pathway, are there objectives that are a greater priority for you to measure than others?
- What parameters do you wish to set for the total length and duration of the test?
- What are your thoughts regarding the distribution and use of different types of assessment items across the areas? How many multiple-choice, short-answer/constructed-response, or performance tasks? (Note: consider Bloom's Taxonomy, etc.)

Some factors to consider when examining potential existing assessments:
- Alignment: Do the items align or match up with the competencies/objectives we've identified as important? (i.e., Is the assessment measuring what we want it to measure?)
- Ease of use: Is it manageable for teachers and students (e.g., administration method, clear directions, resource requirements)?
- Administration time
- Cost
- Flexibility
- Fairness
- Content in terms of assessment items (see next page)

Some characteristics of good assessment items:
- Each item has a specific purpose and is designed to test a significant learning outcome.
- Items are clear: avoid irrelevant material, and keep the language easy to understand; otherwise you may be measuring student comprehension of English rather than the trait you wish to measure.
- Multiple-choice items contain plausible distractors.
- Do the items within the assessment employ multiple methods to provide a more complete picture of student knowledge and/or skills?
- Do questions discriminate between the more able and less able students?
- Do they allow students to go well beyond the threshold requirements if they are able to?
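One common way to check whether questions discriminate between more and less able students is the upper/lower-group discrimination index: the proportion of top scorers answering an item correctly minus the proportion of bottom scorers. The sketch below uses invented data; the 27% group fraction is a conventional choice in item analysis, not something from this presentation.

```python
# Item discrimination sketch using the upper/lower-group method.
# All examinee data are invented for illustration.

def discrimination_index(item_correct, total_scores, fraction=0.27):
    """item_correct: 1/0 per examinee on one item;
    total_scores: total test score per examinee;
    fraction: share of examinees in each comparison group."""
    n_group = max(1, int(len(total_scores) * fraction))
    ranked = sorted(range(len(total_scores)),
                    key=lambda i: total_scores[i], reverse=True)
    upper, lower = ranked[:n_group], ranked[-n_group:]
    p_upper = sum(item_correct[i] for i in upper) / n_group
    p_lower = sum(item_correct[i] for i in lower) / n_group
    return p_upper - p_lower

totals = [95, 88, 82, 75, 70, 66, 60, 52, 45, 40]
item   = [1,  1,  1,  1,  0,  1,  0,  0,  0,  0]
print(f"Discrimination index: {discrimination_index(item, totals):.2f}")
```

Values near +1 indicate an item that strongly separates high and low scorers; values near zero (or negative) flag items worth reviewing or replacing.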

Key features of the assessment system:
- Timing of assessments
- Types of scores produced
- Reporting (ongoing documentation? access?)
- Order of presentation (items presented in a specific order, randomly, etc.)
- If online, desired features (security/access, timed access, etc.)
- Other things you want to be sure the system has or is able to do

As a teacher, what are the key features this CTE assessment system should have in order to make it really useful for you and your students?

For more information, contact Mariam Azin or Hans Meeder.