
1 WE GAVE THE TEST - NOW WHAT? ANALYSIS AND REPORTING FOR THE SOUTH CAROLINA ARTS ASSESSMENT PROGRAM
Ashlee A. Lewis, Office of Program Evaluation
R. Scot Hockman, SC Department of Education

2 Intents:
• To demonstrate the process by which the SC Arts Assessment Program moves from test administration to data analysis to reporting results to schools
• To talk through the continuous improvement process used for the SCAAP

3 SOUTH CAROLINA ARTS ASSESSMENT PROGRAM (SCAAP) https://scaap.ed.sc.edu

4 SCAAP Collaborators
• South Carolina Department of Education (SCDE)
  - Funding agency
• Office of Program Evaluation (OPE) at USC
  - Test and measurement specialists
  - Logistics
• South Carolina arts educators
  - Content-area experts
  - Capacity building through statewide arts assessment institutes

5 Current SCAAP Assessments
Six assessments in various stages:
• Four entry-level assessments
  - Music & Visual Arts (pilot tested in 2002)
  - Dance & Theatre (pilot tested in 2005)
• Two middle-level assessments
  - Music & Visual Arts (pilot tested in 2008)

6 Current SCAAP Assessments
All assessments:
• are aligned to the SC Academic Standards for Visual & Performing Arts (2010)
• have two sections:
  - a multiple-choice/selected-response section (45 items)
  - a performance task section (2 tasks)

7 SCAAP Administration
• Administered each spring at individual schools that received Distinguished Arts Program (DAP) grants from the SC Department of Education
• Administered by appointed test administrators, trained online and in person by the SCAAP team

8 SCAAP Participants
• 2004: 66 schools, approximately 5,200 students
• 2005: 51 schools, approximately 3,700 students
• 2006: 70 schools, approximately 4,900 students
• 2007: 81 schools, approximately 5,800 students
• 2008: 56 schools, approximately 4,400 students
• 2009: 46 schools, approximately 3,500 students
• 2010: 41 schools, approximately 3,740 students
• 2011: 45 schools, approximately 3,540 students
• 2012: 32 schools, approximately 2,545 students
• 2013: 34 schools, approximately 2,763 students
• 2014: 34 schools, approximately 2,572 students
• 2015: 41 schools, approximately 3,700 students (expected)

9 Music Test Specifications

  Standard                                 Overall Emphasis   Selected Response   Performance Tasks
  Standard 1: Performance                  25%                -                   100%
  Standard 2: Creating Music               20%                25%                 75%
  Standard 3: Music Literacy               25%                100%                -
  Standard 4: Critical Response to Music   15%                100%                -
  Standard 5: History and Culture          10%                100%                -
  Standard 6: Connections                  5%                 100%                -

  (The last two columns give the percent of each standard covered by each assessment format.)

10 Visual Arts Test Specifications

  Standard                                       Overall Emphasis   Selected Response   Performance Tasks
  Standard 1: Creating Art                       25%                40%                 60%
  Standard 2: Structures and Functions           25%                50%                 50%
  Standard 3: Exploring Content                  10%                100%                -
  Standard 4: History and Culture                10%                100%                -
  Standard 5: Interpreting Works of Visual Art   25%                25%                 75%
  Standard 6: Connections                        5%                 100%                -

  (The last two columns give the percent of each standard covered by each assessment format.)

11 MULTIPLE CHOICE/SELECTED RESPONSE https://scaap.ed.sc.edu

12 The Assessment Process – Achieving the Arts Assessment Mission (Brophy)
[Cycle diagram: Establish goals and outcomes → Develop and implement assessments → Collect assessment data → Interpret and evaluate the data → Modify and improve → Improve teaching and learning → back to the start]

13 The Assessment Process – Achieving the Arts Assessment Mission (Brophy)
[The same cycle diagram, with the Arts Education Mission at its center]

14 SCAAP Item Bank

15 Example Music Item

16 Example Visual Arts Item

17 Analyses Performed
• Reliability indices for test forms: Cronbach's alpha and corrected split-half index (see the sketch below)
• Test form equating using Item Response Theory (IRT)
  - Cross-year and cross-form test equating
  - Empirical reliability based on the fitted IRT model
• Differential item functioning (DIF) analysis for gender and ethnicity
• Distribution of p-values (percent correct) for items
• Discrimination indices for each item
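As a minimal sketch of the two form-level reliability indices, here is how Cronbach's alpha and a Spearman-Brown corrected split-half could be computed from a students × items matrix of dichotomous (0/1) scores. The function names and the odd/even item split are illustrative assumptions, not the SCAAP implementation:

```python
import numpy as np

def cronbachs_alpha(scores):
    """Cronbach's alpha for a (students, items) matrix of 0/1 scores."""
    n_items = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)        # per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)    # variance of total scores
    return n_items / (n_items - 1) * (1 - item_vars.sum() / total_var)

def corrected_split_half(scores):
    """Odd/even split-half correlation, corrected to full test length
    with the Spearman-Brown prophecy formula."""
    half_a = scores[:, 0::2].sum(axis=1)          # odd-position items
    half_b = scores[:, 1::2].sum(axis=1)          # even-position items
    r = np.corrcoef(half_a, half_b)[0, 1]
    return 2 * r / (1 + r)
```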

18 Reliability Estimates

  Test          Empirical Reliability   Form   # of Items   Cronbach's Alpha   Corrected Split-Half
  Music         .83                     1      45           .79                .79
                                        2      45           .82                .82
  Visual Arts   .86                     1      45           .85                .86
                                        2      45           .86                .85

19 Item Review Process
• Convene arts advisors in the fall to revise items identified as problematic
• Review and revise based on:
  - p-values
  - discrimination indices
  - differential item functioning (DIF)
  - distribution of distractors
• Archive items
• Write new items to incorporate into test forms

20 Improving the Assessment
Each year, item analysis guides revision of the assessment for the following year. Items falling outside these criteria are flagged for review (a sketch of the screening logic follows this list):
• p-values: between .20 and .85
• Discrimination index: .19 or higher
• Differential item functioning (DIF): all items that receive a C classification are examined
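A sketch of how these screening rules might be applied in practice. The dict keys and the A/B/C DIF classification labels are hypothetical (the slide names only the C category):

```python
def flag_items_for_review(items):
    """Apply the SCAAP screening thresholds; `items` is a list of dicts
    with hypothetical keys 'id', 'p_value', 'discrimination', 'dif_class'."""
    flagged = []
    for item in items:
        reasons = []
        if not 0.20 <= item["p_value"] <= 0.85:
            reasons.append("p-value outside .20-.85")
        if item["discrimination"] < 0.19:
            reasons.append("discrimination below .19")
        if item["dif_class"] == "C":        # largest DIF under an A/B/C scheme
            reasons.append("C-level differential item functioning")
        if reasons:
            flagged.append((item["id"], reasons))
    return flagged
```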

21 PERFORMANCE TASKS https://scaap.ed.sc.edu

22 Performance Tasks
• Visual Arts
  - Compare and contrast two artworks using art terms
  - Create a drawing with given art elements/principles based on a prompt
• Music
  - Sing a familiar song
  - Improvise an 8-beat rhythm pattern using rhythm syllables while maintaining a steady beat

23 Preparing Performance Tasks
• Receive tasks from schools (flash drives and booklets)
• Scan visual arts performance tasks
• Mass-upload tasks to the website for online rating

24 Performance Task Benchmarking
• A sample of responses is chosen and pre-scored by members of the SCAAP team based on the rubrics
• Arts advisors indicate their agreement or disagreement with the existing scores
• Discrepancies in scores are discussed and elements of the rubric are clarified
• Validated, benchmarked tasks are used to train and monitor raters

25 Rater Training
• Trainings are conducted online or in person, depending on raters' level of experience
• Raters review rubrics, anchor items, and common types of rater bias
• Interactive practice tests provide feedback on scoring
• Raters must pass a qualifying test before they begin rating

26 Rating System
• The entire rating system is online
• Raters must pass a refresher test after scoring 100 tasks
• Monitoring: pre-scored seed tasks are placed randomly throughout unscored tasks (a sketch follows this list)
• On average, each rater scores 600 tasks over the course of three weeks
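A hypothetical sketch of the seeding and monitoring logic described above. SCAAP's actual online system is not public, so every name and the 5% seed rate are assumptions:

```python
import random

def build_queue(unscored_tasks, seed_tasks, seed_rate=0.05, rng=None):
    """Randomly intersperse benchmarked 'seed' tasks among unscored tasks.
    Assumes len(seed_tasks) is at least seed_rate * len(unscored_tasks)."""
    rng = rng or random.Random()
    queue = list(unscored_tasks)
    n_seeds = max(1, int(len(queue) * seed_rate))
    for seed in rng.sample(seed_tasks, n_seeds):
        queue.insert(rng.randrange(len(queue) + 1), seed)
    return queue

def seed_agreement(ratings, benchmark_scores):
    """Exact-agreement rate on seed tasks, for flagging drifting raters.
    `ratings` maps task_id -> score; benchmarks cover only seed tasks."""
    seeds_seen = [t for t in ratings if t in benchmark_scores]
    if not seeds_seen:
        return None
    hits = sum(1 for t in seeds_seen if ratings[t] == benchmark_scores[t])
    return hits / len(seeds_seen)
```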

27 Analyses Performed
• Inter-rater reliability, estimated with GENOVA (see the sketch below)
• For both multiple-choice and performance tasks: correlations between multiple-choice test forms and performance tasks in each content area
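GENOVA estimates variance components for generalizability analyses. As an illustration of the coefficients reported on the next two slides, here is a minimal NumPy sketch of a fully crossed persons × raters G-study; it is not GENOVA itself, and the crossed design is assumed for illustration:

```python
import numpy as np

def g_study(scores, n_raters_d=None):
    """Generalizability coefficient and index of dependability for a
    fully crossed design; `scores` is a (persons, raters) array."""
    n_p, n_r = scores.shape
    n_d = n_raters_d or n_r                 # raters assumed in the D-study
    grand = scores.mean()
    pm = scores.mean(axis=1)                # person means
    rm = scores.mean(axis=0)                # rater means
    ms_p = n_r * np.sum((pm - grand) ** 2) / (n_p - 1)
    ms_r = n_p * np.sum((rm - grand) ** 2) / (n_r - 1)
    resid = scores - pm[:, None] - rm[None, :] + grand
    ms_pr = np.sum(resid ** 2) / ((n_p - 1) * (n_r - 1))
    var_p = max((ms_p - ms_pr) / n_r, 0.0)  # true person variance
    var_r = max((ms_r - ms_pr) / n_p, 0.0)  # rater severity variance
    var_pr = ms_pr                          # interaction confounded with error
    g_coef = var_p / (var_p + var_pr / n_d)          # relative decisions
    phi = var_p / (var_p + (var_r + var_pr) / n_d)   # absolute decisions
    return g_coef, phi
```

The generalizability coefficient ignores rater severity, which matters only for relative rankings, while the index of dependability also charges the rater main effect against the score; this is why the dependability index can run slightly lower than the generalizability coefficient.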

28 Inter-rater Reliability Estimates: Music 2013

  Performance Task    Criteria        Generalizability Coefficient   Index of Dependability
  1 (Singing)         Tonal           0.94                           0.94
                      Rhythm          0.84                           0.83
                      Vocal           0.87                           0.87
  2 (Improvisation)   Rhythm          0.87                           0.87
                      Improvisation   0.79                           0.79

29 Inter-rater Reliability Estimates: Visual Arts 2013

  Performance Task           Generalizability Coefficient   Index of Dependability
  1 (Compare and Contrast)   0.88                           0.88
  2 (Drawing)                0.74                           0.74

30 An annual technical report is provided to the SC Arts Education Associate and posted online each fall. The full report is available at: https://scaap.ed.sc.edu

31 REPORTING https://scaap.ed.sc.edu

32 Report Cards
• Revisions to report cards are made based on teacher feedback
• Report cards are generated in collaboration with a programmer
• The multiple-choice section of the report cards is generated and disseminated before the end of the school year in May
• Full report cards, including performance task results, are disseminated the following September

33 Sample Report Card

34 Research on the SCAAP Assessments
• Comparing the Dimensionality Structures of Music & Visual Arts Multiple-Choice Assessments (SCEPUR, 2006)
• An Exploratory Study of the Dimensionality Structure of a Music Multiple-Choice Assessment (AERA, 2006)
• Efficacy of a Web-Based Training and Monitoring Procedure in Scoring Performance Tasks (AERA, 2007)
• Rater Characteristics and Performance Scores (AERA, 2008)

35 Research on the SCAAP Assessments
• Rhythm Syllable System and Rhythm Achievement (AERA, 2008)
• The Effect of Gender on a Language-Related Theatre Task (SCEPUR, 2009)
• Teachers' Use of Assessment Results (AEA, 2010)
• Teachers Making Meaning of Displays of Student Results (AEA, 2011)

36 Research Using SCAAP Results
• Comparing Arts Achievement to English Language Arts and Mathematics Achievement in Arts Education Reform Schools (SCEPUR, 2005)
• Evaluating the Program Characteristics of Arts Schools with Disparate Achievement Levels (SCEPUR, 2006)
• Multiyear Evaluation of the Arts Education Reform Effort in South Carolina (AERA, 2007)
• Investigating Arts Programs and Implementation Strategies for Infusing Arts Into Curriculum (AERA, 2007)

37 SCAAP Publications
Featured in an assessment textbook:
• Assessing Performance: Designing, Scoring, and Validating Performance Tasks (Johnson, Penny, & Gordon, 2008)
Music assessment symposium proceedings:
• Assessment in Music Education: Integrating Curriculum, Theory, and Practice (Yap & Pearsall, 2007)

38 Thank you! We welcome your questions!
lewisaa2@mailbox.sc.edu
shockman@ed.sc.gov

