Assessment Strategies in the Music Classroom
Dr. Phillip M. Hash, Calvin College
pmh3@calvin.edu
November 10, 2014
Overview of AM Workshop
1. Current Trends
2. Evaluation Strategies
3. Assessment Resources
4. Assessment Basics
5. Assessment Strategies
   – Performance
   – Psychometric Tests
   – Festival Ratings
6. Your Experience
Current Trends in MI Teacher Evaluation
EVALUATION IMPROVEMENTS IN HB 5223 & HB 5224
– Student growth must be determined by 2 or more measures
– All similarly situated teachers must be evaluated on the same measurements & assessments
– 2014-15: growth data = 25% (vs. 40%) of evaluation
– 2015-16/2016-17: growth data = 25% (12.5% state tests & 12.5% local measures for teachers of tested subjects)
– 2017-18: growth data = 40% (20% state tests & 20% local measures for teachers of tested subjects)
Tools Used in Local Evaluation of Teachers (2012-13)
Current Trends: Effectiveness Ratings for 2011-13
Assessment Practices: MSBOA Informal Survey (2012)
N = 76 MI school band & orchestra teachers
Number of growth assessments administered to students:
– Unspecified (n = 3)
– 1 (n = 41)
– 2 (n = 21)
– 3 (n = 10)
Types of assessments by teacher:
– Individual performance (n = 40)
– Group performance (n = 31)
– Psychometric test (n = 31)
– Building measures (n = 5)
– Composition (n = 1)
– Student reflection (n = 5)
MSBOA Informal Survey (2012): Additional Observations
– Scope of all assessments varied widely
– Some districts require psychometric tests
– Many teachers are using technology such as SmartMusic, GarageBand, Audacity, Data Director, smartphones, iPads, etc.
– Survey available on MSBOA website
Ensuring Integrity
– Demonstrate validity & reliability
– Demonstrate the connection b/w state standards and assessments
– Explain/demonstrate the process for creating, administering, & grading
– Archive recordings & other student work
Evaluation Strategies
Basic Principles
– Assessment = the art of the possible
– Growth vs. achievement
– Multiple measures (3?)
– Should NOT dominate – what do you already do?
– Meaningful & useful vs. “hoops”
– Individual student progress
– Skills & concepts vs. “the piece”
– Not necessarily for a grade
– Consistent across the district
Evaluation Strategies
– Always have lesson plans connecting to standards
  – See MI GLCE
  – Incorporate as many standards as make sense for your class – but not just performing and reading notation
– Study the evaluation form
– Plan lessons using the evaluation rubric as a guide
– Be prepared to provide evidence of instructional & professional practices
  – Student work, rubrics, lesson plans, parent call log, etc.
– Use a variety of instructional practices
– Focus on student engagement
– Don’t try to put on a show for the evaluator
[Is it time to reconsider the number of performances per year?]
NAfME Evaluation Workbooks
Philosophical premise: “Good music teacher evaluation is not only about valid & reliable summative evaluation, but it is also about quality formative professional development.” “[Intended] to provide a helpful tool to music educators, principals and/or supervisors engaged in the entire process of professional development. It should be used as a guide to personal reflection and improvement.”
Part 1: Instruction Manual
Part 2: Ensemble Teacher Evaluation Summary Form: Criteria for Evaluation (based on Danielson)
Part 3: Evaluation Worksheets
Appendix: Resources
Danielson Example 1e
NAfME Workbook Example Secondary 1e
Danielson & NAfME (GM) 1f – Designing Student Assessments
Developing Local Assessment Strategies
Creating an Assessment Plan
District music faculty (by area):
– Establish curriculum based on MI Standards
  – What should students in each grade level know and be able to do?
  – How and when will objectives be assessed? (Perhaps not every grade every year)
  – How will assessments show growth? (e.g., difference in % b/w pre- and post-test; defined NP, PP, P, HP?)
Take the plan to administration for approval
– The law says “with the involvement of teachers”
Pilot, review, revise, implement
MI Grade Level Content Expectations (June 2011)
– What students should know and be able to do in grades K-8 & HS
– Aligned w/ VPAA & 21st century skills
– Standards & benchmarks by grade level
– Teachers evaluated on use of standards
[See handout]
Assessment Terms
– Reliability = consistency
  – Test/retest (regardless of year, location, etc.)
  – Interrater (every judge the same)
– Validity = the extent to which an assessment measures what it purports to measure
– Authentic assessment = students demonstrate knowledge and skills in a real-world context (e.g., performance)
– Quantitative – data is numerical (anything that can be counted; percentages)
– Qualitative – data is in words (descriptions, written critiques)
– Formative vs. summative – practice vs. final
– Formal vs. informal – planned & produced vs. on the spot
Assessment Terms – RTTT
– Rigorous – assessments that measure grade-level standards
– Two points in time
  – Pre- & post-test
  – Proficiency from one year to the next
  – Ongoing assessments of musical skills (steady beat, pitch matching, singing, recorder, instrumental performance, sight-reading, etc.)
– Comparable across classrooms
  – Same for all teachers at a particular level or area
  – Assessments comparable in rigor to other subjects
Resources
Wendy Barden (Kjos)
Paul Kimpton (GIA)
www.vocaroo.com
– Audio emails
– Archived up to 5 months
– Sends a link to an email address
– Download as .WAV or .Ogg
– Useful for performance tests
– Very easy!
http://vocaroo.com/?media=vAdx5RJr1DVC7upIc
SmartMusic©
– Interactive practice and assessment tool
– Extensive library
– Create, send, and grade assignments
– Students record a performance and submit the grade (%), assessment screenshot, and recording
– Correct notes and rhythms in green; incorrect in red
– Accuracy of notes and rhythms only
– Most objective
– Educator = $140; Student = $40
Rubistar
http://rubistar.4teachers.org/
– Create rubrics using existing descriptors
– Search other teachers’ rubrics for samples
  – Edit to fit your needs
Student Growth Measures
Checklists
1. Define the activity or task (e.g., students will sing “Brother John” on pitch)
2. Define the criterion (student sings on pitch)
3. Conduct the assessment
   – Scale = Yes (+ or 2), Sometimes (1 or *), No (0 or -)
   – Embedded into instruction
Sample checklist columns: Student’s Name; Singing on Pitch, “Brother John” (Trial 1, Trial 2); Maintaining Steady Beat w/ Orff Accompaniment (Trial 1, Trial 2). Rows: John, Bill, Susan, Sherri, Damon, etc.
The Systemic Assessment: Maintaining Vocal Independence in a 2- or 3-part Vocal Context
The activity/task: The students will sing “Inanaya” in 2 or 3 parts, each maintaining her/his own voice part independently.
Criterion: The student maintains her/his own part independently in a multiple-part context.
Assessment: Inform the students that you’ll be observing their performance of “Inanaya” and keeping track of who is maintaining their part in the harmony and who is not. Describe the scoring procedure to the students, and ask for questions.
Multilevel, single-criterion scoring procedure:
“+” maintains vocal independence consistently
“~” vocal independence is inconsistent
“|” does not maintain independence
Dr. Tim Brophy – Univ. of FL
Sample Data Collection Instrument – Vocal Independence Data
1/15/13, Vocal Independence, 3 parts – Jimmy: |, Sherree: |, Ida: +, LeDarrius: ~
1/22/13, Vocal Independence, 3 parts – Jimmy: +, Sherree: ~, Ida: +, LeDarrius: +
Rubrics
Types include:
– Holistic (overall performance)
– Analytic (specific dimensions of performance)
– Additive (yes/no)
Descriptors must be valid (meaningful)
Scores:
– Must be reliable (consistent)
– Should relate to actual levels of student learning
Can be used by students for self-assessment and to assess the performance of other students
Give to students b/f the assessment
What does a rubric look like?
TONE, by level:
– Beginning: Breathy; unclear; lacks focus; unsupported
– Basic: Inconsistent; beginning to be centered and clear; breath support needs improvement
– Proficient: Consistent breath support; centered and clear; beginning to be resonant
– Advanced: Resonant; centered; vibrant; projecting
Adapted from: K. Dirth, Instituting Portfolio Assessment in Performing Ensembles, NYSSMA Winter Conference, Dec. 2, 1997.
Features:
– Scale includes rating points (at least 4) – see next slide & handout for sample headings
– Highest point represents exemplary performance
– Criterion-based categories (3-5 work best)
– Descriptors are provided for each level of student performance
– Pre- and/or post-test; teacher, peer, & self-assessment
Constructive Rubric Headings
Holistic Rubric
Piano Rubric – Analytic
Quiz #1: Scales – two octaves, hands together, ascending and descending
Keys ____________  1/6/12
Sample Rating Scale vs. Analytic Rubric
Additive Rubric
Showing Growth w/ Rubrics (or any other pre-/post-test)
Pre- & post-test: average class posttest % − average class pretest % = % growth

Post | Pre | % Growth
67 | 57 | 10
79 | 65 | 14
59 | 32 | 27
90 | 80 | 10
82 | 72 | 10
58 | 45 | 13
72.5 | 58.5 | 14 (class averages)
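The pre-/post-test arithmetic above can be sketched in a few lines of Python; the scores are the slide's hypothetical class percentages, and "growth" is the difference between the class-average percentages.

```python
# Sketch of the pre-/post-test growth calculation described above.
# Growth = average class posttest % minus average class pretest %.
def percent_growth(post_avg, pre_avg):
    """Growth expressed as the difference in class-average percentages."""
    return post_avg - pre_avg

# Hypothetical pre- and post-test percentages from the slide's table.
pre_scores = [57, 65, 32, 80, 72, 45]
post_scores = [67, 79, 59, 90, 82, 58]

pre_avg = sum(pre_scores) / len(pre_scores)    # 58.5
post_avg = sum(post_scores) / len(post_scores)  # 72.5
print(percent_growth(post_avg, pre_avg))  # 14.0
```

The same subtraction applied per student gives the per-row growth figures (e.g., 67 − 57 = 10).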
Est. Personal Reliability
– Record 10 students
– Grade w/ rubric
– Grade again in 2 weeks
– Measure the difference in score for each recording
– Calculate the average difference
– Lower = better

Trial 1 | Trial 2 | Difference
9 | 9 | 0
6 | 7 | 1
8 | 6 | 2
11 | 10 | 1
9 | 7 | 2
12 | 10 | 2
4 | 4 | 0
Avg. Diff. = 1.14
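The personal-reliability check above (grade the same recordings twice, then average the absolute score differences) can be sketched as follows; the scores are the slide's hypothetical values, and a lower average difference means more consistent grading.

```python
# Sketch of the personal-reliability check: the same recordings graded
# with the rubric on two occasions, two weeks apart (slide's sample data).
trial_1 = [9, 6, 8, 11, 9, 12, 4]
trial_2 = [9, 7, 6, 10, 7, 10, 4]

# Absolute difference per recording, then the average difference.
differences = [abs(a - b) for a, b in zip(trial_1, trial_2)]
avg_diff = sum(differences) / len(differences)
print(round(avg_diff, 2))  # 1.14 (lower = more reliable grading)
```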
Rate these 6 recorder performances on a scale of 1-12, then rate the same examples using the rubric in the handout.
Trial 1:
1 _____  2 _____  3 _____  4 _____  5 _____  6 _____
Recorder Trial 2
– Use the rubric
– Training
  – Procedures
  – Definitions
– Add up the score
– Match scores from Trial 1 to scores from Trial 2
– Is there a difference? In which scores are you most confident?
Progressive Curricula – Levels of Achievement
– Jason Lowe – Beal City Public Schools (MS & HS examples)
  – Fundamental (MS)/Comprehensive (HS) Musicianship Battery
  – http://bealcitybands.weebly.com/ or http://pmhmusic.weebly.com
– MSBOA Proficiency Levels (only 3)
– ASBDA Curriculum Guide (pub. by Alfred)
– Royal Conservatory Music Development Program
RCMDP Syllabi Components (10-11 levels)
http://www.musicdevelopmentprogram.org/
– Repertoire (a & b lists)
– Technical requirements (scales, arpeggios)
– Ear training
  – Intervals
  – Clapback – playback
– Sight reading
– Theory & history
– Tests are available; adapt as needed
Royal Conservatory Music Development Program (see handout)
– Recorder, strings, woodwinds, brass, percussion, voice
– Graded preparatory, 1-10
  – RC Grade 8 considered college entrance level
– Includes solos, etudes, scales/arpeggios, ear training, sight reading, theory
– Curricula online
– Adapt for your program
PSYCHOMETRIC TESTS [Refer to HS Orchestra Example & “Strategic Testing” article in Handout]
Uses
– Theory
– History
– Listen, analyze, describe, evaluate
Psychometric Tests – Eimer (2007) [See sample HS orch. exam in handout]
Goal = test clarity & reduced anxiety
– Give a study guide
– Use the same basic format and scoring for every test
– Keep tests a reasonable length
– No clues w/in the test
– Test important information/concepts
– Avoid T/F – unreliable
– Matching
  – Only facts
  – No more than 10 per set
  – Same type/topic for each set
  – Let students know how many times to use an answer
Multiple Choice
– Incomplete sentence (stem) w/ a clear answer & 2-3 distractors
– Match grammar b/w stem & choices
– Order choices alphabetically/numerically
– Keep the stem longer than the choices
– Avoid “all/none of the above,” “a & c,” etc.
Psychometric Tests – Essay & Short Answer
– NOT for factual info
– Use to make connections, apply higher-order thinking skills, and evaluate understanding
– Make expectations clear in the question
– Grade w/ a holistic rubric [See HS Orchestra Example]
– Notate & respond
Elementary General Music – Grade 3 Pre- & Post-Test Sample [See handout]
– Paper/pencil, but relies on musical response
– Prompts can be different for the pre-test
– The pre-test can be an abbreviated version
– Requires 2-3 class periods to complete
– A music supervisor could issue musical examples & prompts before the test (to avoid teaching to the test)
Creating a Similar Elementary General Music Assessment
– For grades 3-5, determine which GLCEs can be measured through paper/pencil response
– Create question(s) for each benchmark
  – Deliberately connect questions to GLCEs (validity, rigor, comparability across classrooms)
– Decide the number of questions needed to determine competency
– Create questions that fit different prompts
Excellence in Theory or Standard of Excellence Music Theory & History Workbooks
– Publisher: Kjos
– 3 volumes (see handout sample)
– Includes theory, ear training, history
– Takes all of MS & HS to complete the 3 volumes
– Students work on lessons during down time in rehearsal
– Establish grade-level expectations and a written exam
Festival Ratings
NAfME Position Statement Successful music teacher evaluation must, where the most easily observable outcomes of student learning in music are customarily measured in a collective manner (e.g., adjudicated ratings of large ensemble performances), limit the use of these data to valid and reliable measures and should form only part of a teacher’s evaluation. (NAfME, 2011)
Festival Ratings: Advantages/Disadvantages
Advantages:
– Third-party assessment – credibility
– Focuses on a major aspect of the ensemble curriculum
– Final ratings are likely reliable over time
Disadvantages:
– Narrow: 3 pieces & sight reading at one point in time
– Ceiling effect
– Subject to outside influences
Role of MSBOA?
Ratings Growth Example
Hypothetical contest ratings for one ensemble over a three-year period (average of 3 judges’ ratings + sight-reading):
Year 1: Average = 2.25; Final = II
Year 2: Average = 1.75; Annual increase = 22%; Final = II
Year 3: Average = 1.25; Annual increase = 29%; Final = I
Note: Total increase from year 1 to year 3 = 44%.
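The annual-increase percentages above follow from one calculation: because a rating of I is best, "growth" is the percentage decrease in the average rating from one year to the next. A minimal sketch, using the slide's hypothetical averages:

```python
# Sketch of the festival-ratings growth calculation. A rating of I is
# best, so improvement is the percentage DECREASE in the average rating.
def pct_decrease(old_avg, new_avg):
    """Percentage decrease from old_avg to new_avg, rounded to whole %."""
    return round((old_avg - new_avg) / old_avg * 100)

# Yearly averages of 3 judges' ratings + sight-reading (slide's example).
year_avgs = [2.25, 1.75, 1.25]

print(pct_decrease(year_avgs[0], year_avgs[1]))  # 22  (year 1 -> 2)
print(pct_decrease(year_avgs[1], year_avgs[2]))  # 29  (year 2 -> 3)
print(pct_decrease(year_avgs[0], year_avgs[2]))  # 44  (total, year 1 -> 3)
```

Note that the total increase (44%) is computed against the year-1 average, not by summing the annual figures.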
Experiences
Describe Your Situation (in roundtables by area?)
– How are you measuring student growth at your school?
– What support are you getting?
– What needs or concerns do you have?