Presentation transcript:

This video is for UNC Charlotte faculty developing a response form as part of the content validity protocol. The response form is the instrument expert reviewers use to give feedback on your rubrics. A separate video with directions for reviewers is available on the COED website; please do not share this video, which is intended for faculty, with reviewers.

- Completing the initial review of rubrics
- Selecting the panel of experts
- Creating the materials for reviewers: response form and assessment packet
- Collecting the information
- Submitting the data to COED Assessment

- Rubric review has been ongoing.
- Rubrics must meet minimal CAEP guidelines; a checklist for this is on the COED Assessment website under the "Content Protocol" link.
- The recommended date to have rubric changes made is February 15.
- Send your rubrics to Laura Hart for a quick review before you proceed to the next step.

- Identify a panel of experts and the credentials required for their selection. The panel should include a mixture of IHE faculty (i.e., content experts) and B-12 school or community practitioners (lay experts).
- Minimal credentials for each expert should be established by consensus among program faculty; credentials should bear up to reasonable external scrutiny.
- Invite the experts to participate and let them know they will receive additional information in the near future.

- At least 3 content experts from the program/department in the College of Education at UNC Charlotte;
- At least 1 external content expert from outside the program/department. This person could be from UNC Charlotte or from another IHE, as long as the requisite content expertise is established; and
- At least 3 practitioner experts from the field.
TOTAL NUMBER OF EXPERTS: At least seven (7)

- For each internally developed assessment/rubric, there should be an accompanying response form that panel members are asked to use to rate the items that appear on the rubric.
- Multiple rubrics can be reviewed by the same panel of reviewers, provided the panel members have the appropriate credentials.
- The form can be hard copy, electronic, SurveyMonkey/SurveyShare ... your preference.
- Example follows.

Rate each item on a scale of 1-4 for:
- How representative the item is of the key construct
- How important the item is in measuring the key construct
- How clear the item is

Representativeness of the item in measuring the overarching construct:
1 = item is not representative
2 = item needs major revisions to be representative
3 = item needs minor revisions to be representative
4 = item is representative

Importance of the item in measuring the overarching construct:
1 = item is not necessary to measure the construct
2 = item provides some information but is not essential to measure the construct
3 = item is useful but not essential to measure the construct
4 = item is essential to measure the construct

Clarity of the item:
1 = item is not clear
2 = item needs some major revisions to be clear
3 = item needs some minor revisions to be clear
4 = item is clear

- Provide the response form, or a link to it, in the materials.
- The reviewer watches the directions video.
- The reviewer rates each item on how well it measures the key constructs.
- Open-ended items: Are there enough items to measure the construct? Too many? Other feedback.
[Slide diagram: a key construct with Item 1, Item 2, and Item 3 listed beneath it]
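For programs collecting responses electronically, the form can be as simple as a spreadsheet with one row per rubric item and one column per rating criterion. The following is a minimal sketch (Python) of generating such a template; the rubric item names and the output file name are hypothetical placeholders, while the three criteria and the 1-4 scale follow the slides above.

```python
import csv

# Hypothetical rubric items; replace with the items from your own rubric.
rubric_items = [
    "Item 1: Candidate states measurable learning objectives",
    "Item 2: Candidate aligns assessments with objectives",
    "Item 3: Candidate differentiates instruction for diverse learners",
]

# The three rating criteria from the protocol, each rated on a 1-4 scale.
criteria = ["Representativeness (1-4)", "Importance (1-4)", "Clarity (1-4)"]

with open("response_form_template.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Rubric item"] + criteria + ["Comments"])
    for item in rubric_items:
        writer.writerow([item, "", "", "", ""])
    # Open-ended questions appended as extra rows for reviewer feedback.
    writer.writerow(["Are there enough items to measure the construct?", "", "", "", ""])
    writer.writerow(["Are there too many items?", "", "", "", ""])
    writer.writerow(["Other feedback", "", "", "", ""])
```

Each reviewer then completes one copy of the file (or the equivalent online survey), which keeps compiling the results straightforward later.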

Send hard copies or electronic versions of:
1. A letter of purpose (may be in the body of the email).
2. A copy of the assessment instructions provided to candidates.
3. A copy of the rubric used to evaluate the assessment.
4. The response form aligned with the assessment/rubric, for the panel member to rate each item.

Send the materials to the participants and set a reasonable deadline for responses.

Once response data has been collected, submit the compiled results to the COED Assessment Office. Please do not submit individual results; compile them into a summary chart or document. Copies of all forms and/or an Excel file of submitted scores (if collected electronically) should be placed in the designated file on the S: drive. This file is accessible by program directors (if you need access, please contact Ashley Flatley in the COED Assessment Office). Content Validity Results are due by May 15, 2016. There is a specific format; to see it and the file path, go to the COED Assessment website.
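If responses were collected electronically in a layout like the template sketched earlier, compiling them can be scripted. The sketch below assumes one CSV per reviewer in a hypothetical reviewer_responses/ folder with the column names used above; it stacks the files and reports the mean rating per item for each criterion.

```python
from pathlib import Path
import pandas as pd

# Hypothetical folder of completed response forms, one CSV per reviewer.
response_files = sorted(Path("reviewer_responses").glob("*.csv"))

frames = []
for path in response_files:
    df = pd.read_csv(path)
    df["Reviewer"] = path.stem   # keep track of which reviewer the ratings came from
    frames.append(df)

all_ratings = pd.concat(frames, ignore_index=True)

rating_cols = ["Representativeness (1-4)", "Importance (1-4)", "Clarity (1-4)"]
# Drop the open-ended rows, which carry no numeric ratings.
all_ratings = all_ratings.dropna(subset=rating_cols)

# Summary: mean rating per rubric item for each of the three criteria.
summary = all_ratings.groupby("Rubric item")[rating_cols].mean().round(2)

# Example summary file to place in the designated folder on the S: drive
# (writing .xlsx requires the openpyxl package).
summary.to_excel("content_validity_summary.xlsx")
```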

Once Content Validity Results have been submitted, the COED Assessment Office will generate a Content Validity Index (CVI) for each item:
CVI = (number of experts who rated the item as 3 or 4) ÷ (total number of experts)
A CVI score of .80 or higher will be considered acceptable.
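As a concrete illustration of the arithmetic, the sketch below computes an item-level CVI from a list of expert ratings (the ratings shown are hypothetical). With a seven-member panel, at least six experts must rate an item 3 or 4 to reach the .80 threshold, since 5/7 ≈ .71.

```python
def content_validity_index(ratings):
    """CVI = (number of experts rating the item 3 or 4) / (total number of experts)."""
    favorable = sum(1 for r in ratings if r >= 3)
    return favorable / len(ratings)

# Hypothetical ratings from a seven-member panel for one rubric item.
clarity_ratings = [4, 3, 4, 2, 4, 3, 4]   # six of seven experts rated the item 3 or 4

cvi = content_validity_index(clarity_ratings)
print(f"CVI = {cvi:.2f}")                 # 0.86, which meets the .80 threshold
```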

Final changes will be made based on the CVI results from the expert review panel. If we have done our work well, these should be small changes (or none at all), but some adjustments will probably need to be made. Final result = a VALID RUBRIC (establishing interrater reliability is the next step).

COED Website