Measuring Student Growth in the Instrumental Music Classroom
Dr. Phillip M. Hash, Illinois State University, Normal, Illinois
pmhash@ilstu.edu | www.pmhmusic.weebly.com
ISU Band Director Workshop, Wednesday, July 11, 2018
Purpose
Discuss ways of assessing and demonstrating student progress in school band and orchestra programs
- Basic principles
- Definitions
- Performance assessments
- Traditional pencil/paper assessments
- Ensuring integrity of the process
- Current practices in Michigan
IL School Code on Evaluation (Part 50)
- Growth = at least 30% of the evaluation score (after 2 yrs. of implementation)
- Two or more assessments to determine student growth, aligned to school improvement goals
- Types of assessments: (I) standardized; (II) district-wide; (III) teacher/school-specific
- Includes teacher-created measures, textbook publishers' tests, work samples/portfolios, and measures of student performance
- Specify growth expectations and how growth is calculated (data analysis)
- Must include at least one type I or II and at least one type III, unless a district joint committee on evaluation allows two type III assessments for a particular classification of teachers (e.g., music, PE)
- Interval of instruction could be one year to the next
- Evaluation rating is related to the number of students meeting the growth expectation
Basic Principles
- Assessment = the art of the possible (e.g., rubrics, multiple-choice tests)
- Growth vs. achievement (pre/post test or levels of achievement)
- Multiple measures (3?): performance test, written test, group assessment
- Assessment should NOT dominate instruction (incorporate it into auditions; use last year's spring playing test as the baseline for next year's test)
- Meaningful & useful vs. "hoops"
- Individual student progress (vs. group measures)
- Skills & concepts vs. "the piece"
- Not necessarily for a grade
- What do you already do?
- Consistent across the district & music content areas (to compare teachers)
- Administrators want music educators to lead the process
- Assessing IL music standards & benchmarks adds validity to what we teach (performance standards are only part of what is required)
Assessment Terms
- Reliability = consistency
  - Test/retest: same result regardless of the number of times the test is taken (T/F items are not reliable)
  - Interrater: every judge scores the same (e.g., solo/ensemble contest)
- Validity = the extent to which an assessment measures what it intends (e.g., scales assessed through reflection, written work, and playing) [connect w/ IL Music Standards]
- Authentic assessment = demonstrating knowledge and skills in a real-world context (e.g., performance)
- Psychometric = pencil-and-paper test
- Quantitative = numerical data (anything that can be counted; percentages)
- Qualitative = data in words (descriptions, written critiques)
Performance Assessment
Rubrics [examples in handout]
One of the most effective tools for measuring student music performance is the rubric. Examples are provided in your handout, and many others are available online: put "orchestra," "strings," "band," or "instrumental rubric" into your search engine and look at both web and image results.
An assessment tool containing:
- Categories or dimensions of performance
- Multiple levels of achievement
- Descriptors for each level of each category
Advantages:
- Improves reliability
- Informs student & teacher
- Focuses attention on the whole performance
- Dimension scores can be combined into a summative total
- OK to measure only a few categories
Several types. Analytic rubric: multiple categories; focuses on quality in each category.
Additive Rubric (a point is awarded for each aspect of a dimension that is demonstrated)
Holistic Rubric (overall performance). OK to underline statements that apply and score using a decimal. Fast and easy to use, but a single level might not accurately represent the performance; use the rubric as a guide and average numbers when needed.
Creating a Rubric
- Determine categories: between 3 and 5
- Write descriptors for different levels of proficiency (1-4 or 5) [analytic]
  - Short paragraph (2-4 sentences)
  - Top (far left) describes the very best
  - Bottom (far right) describes unacceptable work
  - 2-3 levels in between
- Use constructive labels (excellent/needs work, pro to novice, etc.) [see handout]
- Be prepared to pilot test & revise!!
http://rubistar.4teachers.org/index.php
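As an illustration of how an analytic rubric's dimension scores combine into a summative total, here is a minimal Python sketch. The dimension names, descriptors, and four-level scale are hypothetical examples for demonstration only, not a rubric from the handout:

```python
# Illustrative analytic rubric: the dimensions and 1-4 level
# descriptors below are hypothetical examples, not a required rubric.
RUBRIC = {
    "tone":       {4: "Characteristic, centered tone throughout",
                   3: "Mostly characteristic tone",
                   2: "Inconsistent tone quality",
                   1: "Uncharacteristic tone"},
    "rhythm":     {4: "All rhythms accurate at a steady tempo",
                   3: "Minor rhythmic errors",
                   2: "Frequent rhythmic errors",
                   1: "Rhythms mostly inaccurate"},
    "intonation": {4: "Consistently in tune",
                   3: "Occasional intonation lapses",
                   2: "Frequent intonation problems",
                   1: "Rarely in tune"},
}

def score_performance(levels):
    """Combine per-dimension levels (1-4) into a summative total and percent."""
    total = sum(levels[dim] for dim in RUBRIC)   # e.g., 3 + 4 + 3 = 10
    maximum = 4 * len(RUBRIC)                    # 12 points possible here
    return total, round(100 * total / maximum)

total, pct = score_performance({"tone": 3, "rhythm": 4, "intonation": 3})
print(total, pct)   # 10 points -> 83 percent
```

Because the descriptors live alongside the numbers, the same structure can print the wording for each awarded level on a student's feedback sheet.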
Rubistar http://rubistar.4teachers.org/ Create rubrics using existing templates & descriptors Search other teachers’ rubrics for samples Edit to fit your needs Free
Using the Rubric
- Distribute ahead of time
- Self/peer evaluation
- Be clear as to how the rubric will be used: formative/pretest/ungraded vs. summative/posttest/graded
- Mastery learning: repeat until the performance reaches a high level
SmartMusic®
- Interactive practice and assessment tool with an extensive library
- Create, send, and grade assignments
- As a student plays an exercise or selection, the assessment feature displays the music and shows correct notes and rhythms in green and incorrect ones in red
- Using a computer, students can record a performance at home and submit the grade (%), assessment screenshot, and recording to their teacher
- Measures accuracy of notes and rhythms only; the most objective option
- Directors can further assess tone, intonation, phrasing, and other aspects while listening to the recording (MakeMusic, Inc., 2013)
- Educator = $40; per student = $4-$12
www.vocaroo.com
- Record or upload (e.g., from a smartphone); very easy!
- Recordings archived up to 5 months
- Sends a link to an email address
- Download as .WAV or .MP3
- Useful for performance tests
- The record function works better for strings & woodwinds
Resources Wendy Barden (Kjos) Paul Kimpton (GIA)
RCMDP Syllabi Components (10-11 levels) https://www.rcmusic.com/
- Progressive curriculum for all instruments
- Repertoire (A & B lists)
- Technical requirements (scales, arpeggios)
- Ear training: intervals, clapbacks & playbacks
- Sight reading
- Theory & history
- Tests are available; adapt as needed
Contest Ratings [Group Measure]
- Possible to use festival ratings as a group measure of student growth
- There are concerns w/ teacher-created local measures as well
- IF festival ratings are used:
  1. Up to the director
  2. Clear to the director and administration HOW they will be used
  3. One of MULTIPLE measures
Contest Ratings: Advantages/Disadvantages
Advantages:
- Quantitative third-party assessment; adds credibility
- Focuses on a major aspect of the ensemble curriculum
- Can show growth over time in some circumstances (e.g., II, II, II, II/I, II, II, II/I, I, II, II = growth over three years, assuming adjudication and repertoire are consistent/reliable)
- Valid to the extent that they measure the quality of an ensemble's performance of three selected pieces & sight reading at one point in time
- Final ratings are likely reliable over a 3-yr. period, based on previous research
- Will probably fit into a state-wide evaluation tool if individual adjudicators' ratings are used
- ISBE Type I assessment(?)
Disadvantages:
- Narrow: 3 pieces & sight reading at one point in time
- Ceiling effect
- Subject to outside influences
- Role of contest?
Ratings Growth Example
Hypothetical Contest Ratings for One Ensemble over a Three-Year Period

        Judge 1  Judge 2  Judge 3  Sight-Reading  Average  Annual Increase(a)  Final
Year 1  II       III      ...      ...            2.25     -                   2
Year 2  I        ...      ...      ...            1.75     22%                 ...
Year 3  ...      ...      ...      ...            1.25     29%                 1

Note: (a) Total increase from year 1 to year 3 = 44%.
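The percentage figures in the example follow directly from the yearly average ratings, remembering that a lower number is better (Division I is the top rating), so growth shows up as a percent decrease in the average. A minimal sketch of that arithmetic, using the 2.25/1.75/1.25 averages from the hypothetical table:

```python
# Yearly average contest ratings from the hypothetical example
# (lower is better: Division I is the top rating).
averages = [2.25, 1.75, 1.25]   # years 1, 2, 3

def pct_improvement(earlier, later):
    """Percent improvement = percent decrease in the average rating."""
    return round(100 * (earlier - later) / earlier)

# Year-over-year improvement: 2.25 -> 1.75 is 22%; 1.75 -> 1.25 is 29%.
for prev, curr in zip(averages, averages[1:]):
    print(pct_improvement(prev, curr), "%")

# Total improvement from year 1 to year 3: 2.25 -> 1.25 is 44%.
print(pct_improvement(averages[0], averages[-1]), "% total")
```

Note that the same half-point drop (0.50) yields a larger percentage each year because the baseline shrinks, which is why the annual increase grows from 22% to 29% even though the raw change is identical.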
Solo/Ensemble Contest Ratings [Type I Assessment]
- Consider requiring S/E participation, perhaps for only two consecutive grade levels (e.g., 6-7; 9-10)
- Need time to work w/ students; other considerations (e.g., $)
- Solos, duets, trios
- Use the ABC grading chart (handout) or FJH (online) to standardize grade levels; OK to use 1/2 grades
- Growth expectation (any of): higher grade level of literature w/ the same rating; same grade level w/ a higher rating; higher grade level w/ a rating one level lower
- Holistic, authentic assessment
Psychometric Tests [Refer to HS Orchestra Example & “Strategic Testing” article in Handout]
Uses
- Theory
- History
- Listening, analyzing, describing, & evaluating
Psychometric Tests: Eimer (2007) [see sample HS orchestra exam in handout]
Goal = test clarity & reduced anxiety, so that the test is clear to students:
- Review and provide study guides that let students know what to expect
- Same basic format and possible score (e.g., 100) for every test
- Reasonable length
- No clues w/in the test
- Test important information/concepts
- Avoid T/F (unreliable; facts only if used)
- Matching:
  - Facts only
  - Same type/topic for each set
  - No more than 10 per set
  - Consider an uneven number of questions vs. possible answers
  - Let students know how many times an answer may be used
Multiple Choice
- Incomplete sentence (stem) w/ a clear answer & 2-3 distractors
- Match grammar b/w stem & choices
- Arrange choices alphabetically/numerically
- Stem longer than the choices
- Avoid "all/none of the above," "a & c," etc.
Psychometric Tests: Essay & Short Answer [see HS orchestra example]
- NOT for factual info
- Use to make connections, apply higher-order thinking skills, and evaluate understanding
- Make expectations clear in the question
- Grade w/ a holistic rubric
Also: Notate & Respond [see HS orchestra example]
Ensuring Integrity
- Demonstrate validity & reliability
- Demonstrate the connection b/w state standards and assessments
- Explain/demonstrate the process for creating, administering, & grading
- Archive recordings & other student work
Conclusion Work together Share good ideas pmhash@ilstu.edu