Presentation transcript:

1

2 This video is for UNC Charlotte faculty developing a response form as part of the content validity protocol. The response form is what expert reviewers will use to give feedback on your rubrics. A separate video with directions for reviewers is available on our COED website. Please do not share the video you are now watching with reviewers.

3 Completing the initial review of rubrics
Selecting the panel of experts
Creating the materials for reviewers: the response form and the assessment packet
Collecting the information
Submitting the data to COED Assessment

4 Rubric review has been ongoing.
Rubrics must meet minimal CAEP guidelines. A checklist for this is on the COED Assessment website under the “Content Protocol” link.
The recommended date to have rubric changes made is February 15, 2016.
Send your rubrics to Laura Hart for a quick review before you proceed to the next step.

5 Identify a panel of experts and the credentials for their selection. The panel should include a mixture of IHE faculty (i.e., content experts) and B-12 school or community practitioners (lay experts).
Minimal credentials for each expert should be established by consensus among program faculty; credentials should bear up to reasonable external scrutiny.
Invite the experts to participate and let them know they will be receiving additional information in the near future.

6 At least 3 content experts from the program/department in the College of Education at UNC Charlotte;
At least 1 external content expert from outside the program/department. This person could be from UNC Charlotte or from another IHE, as long as the requisite content expertise is established; and
At least 3 practitioner experts from the field.
TOTAL NUMBER OF EXPERTS: At least seven (7)

7 For each internally developed assessment/rubric, there should be an accompanying response form that panel members are asked to use to rate the items that appear on the rubric.
Multiple rubrics can be reviewed by the same panel of reviewers, provided the reviewers have the appropriate credentials.
The form can be hard copy, electronic, Survey Monkey/Survey Share … your preference.
Example …

8

9 How representative the item is of the key construct
How important the item is in measuring the key construct
How clear the item is
Rate each item on a scale of 1-4

10 Representativeness of the item in measuring the overarching construct
1 = item is not representative
2 = item needs major revisions to be representative
3 = item needs minor revisions to be representative
4 = item is representative

11 Importance of the item in measuring the overarching construct
1 = item is not necessary to measure the construct
2 = item provides some information but is not essential to measure the construct
3 = item is useful but not essential to measure the construct
4 = item is essential to measure the construct

12 Clarity of the item
1 = item is not clear
2 = item needs some major revisions to be clear
3 = item needs some minor revisions to be clear
4 = item is clear

13 Provide the response form, or a link to it, in the materials.
The reviewer watches the directions video.
The reviewer rates each item on how well it measures the key constructs.
Open-ended items: Are there enough items to measure the construct? Too many? Other feedback?
[Slide diagram: each key construct listed with its items, e.g., Key Construct – Item 1, Item 2, Item 3]
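As a purely illustrative example (hypothetical ratings, not a required format), one completed row of a response form might read: Item 1 – Representativeness: 4, Importance: 3, Clarity: 4, with a comment suggesting minor wording changes.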

14 Hard copies or electronic versions:
1. Letter of purpose (may be in the body of the email)
2. A copy of the assessment instructions provided to candidates
3. A copy of the rubric used to evaluate the assessment
4. The response form aligned with the assessment/rubric, for the panel member to rate each item

15 Send materials to participants. Set a reasonable deadline.

16 Once response data has been collected, submit the compiled results to the COED Assessment Office. Please do not submit individual results; please compile the results into a summary chart or document. Copies of all forms and/or an Excel file of submitted scores (if collected electronically) should be submitted in the designated file on the S: drive. This file is accessible by program directors (if you need access, please contact Ashley Flatley in the COED Assessment Office). Content Validity Results are due by May 15, 2016. There is a specific format – to see it and the file path, go to the COED Assessment website.

17 Once Content Validity Results have been submitted, the COED Assessment Office will generate a Content Validity Index (CVI).
CVI = (number of experts who rated the item as 3 or 4) ÷ (total number of experts)
A CVI score of .80 or higher will be considered acceptable.
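To make the arithmetic concrete, here is a minimal sketch of the per-item calculation in Python, using hypothetical ratings rather than real panel data:

    def content_validity_index(ratings):
        # ratings: one 1-4 rating per expert for a single rubric item
        agreeing = sum(1 for r in ratings if r >= 3)  # experts who rated the item 3 or 4
        return agreeing / len(ratings)                # proportion of agreeing experts

    # Hypothetical panel of 7 experts: 6 rate the item 3 or 4
    print(content_validity_index([4, 3, 4, 2, 3, 4, 3]))  # ≈ 0.86, which clears the .80 threshold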

18 Final changes will be made based on the CVI results from the expert panel review. If we’ve done our work well, these should be small changes (or no changes at all), but some adjustments will probably be needed. Final result = VALID RUBRIC (interrater reliability becomes the next step)

19 COED Website

20

