Music Teacher Evaluation in Michigan
Dr. Phillip M. Hash, Calvin College
February 14, 2013

Overview of PM Workshop
1. New Legislation
2. Current Trends
3. Evaluation Strategies
4. Assessment Strategies
5. Your Experience

Legislative Review
– All teachers evaluated annually
– Percentage of evaluation to relate to student growth
– National, state, and local assessments
– Evaluations vs. seniority in personnel decisions
– Michigan Council on Educator Effectiveness

MDE Will Provide Measures
For every educator, regardless of subject taught, based on data:
– Student growth levels in reading and math
– Student proficiency levels in math, reading, writing, science, and social studies
– Foundational measure of student proficiency and improvement (same for each teacher in a school)
Understanding Michigan's Educator Evaluations, MDE (December 2010)
How will this data be used for arts educators?
– Currently up to school districts
– Might be specified by the state after this year

Performance-Based Compensation
A district shall implement a compensation method for teachers and administrators that includes "job performance and job accomplishments as a significant factor" to determine "compensation and additional compensation." MCL (1)
Meaning for arts educators?

New Prohibited Bargaining Subjects
1. Teacher Placement
2. Reduction in Force/Recall
3. Classroom Observation
4. Performance Evaluation
5. Teacher Discharge/Discipline
6. Performance-Based Compensation
7. Parent Notification

Pilot Programs
– 14 districts
– 4 evaluation models
– Standardized tests
– Local measures for non-tested subjects
– Recommendations by school year

Current Trends in MI Teacher Evaluation

Frameworks, Methods, Systems Used as part of Local Evaluation

% Student Growth Counted in Teacher Evaluation (chart: % of growth in local evaluation systems)

Current Trends: Effectiveness Ratings

Teacher Ratings & Student Growth

Evaluation Strategies

– Always have lesson plans connecting to standards
  – See MI GLCE
  – Incorporate as many standards as make sense for your class, but not just perform and read notation
– Study the evaluation form
– Plan lessons using the evaluation rubric as a guide
– Be prepared to provide evidence of instructional & professional practices
  – Student work, rubrics, lesson plans, parent call log, etc.
– Use a variety of instructional practices
– Focus on student engagement
– Don't try to put on a show for the evaluator
[Is it time to reconsider the number of performances per year?]

Danielson Example

Student Engagement in Rehearsal (timeline chart: student-led warm-ups and breathing, chorale, student sectionals with feedback)

Developing Local Assessment Strategies

Creating an Assessment Plan
District music faculty (by area):
– Establish curriculum based on MI standards
– What should students in each grade level know and be able to do?
– How and when will objectives be assessed? (perhaps not every grade every year)
– How will assessments show growth? (e.g., difference in % between pre- and post-test; defined NP, PP, P, HP levels; see the sketch below)
Take the plan to administration for approval
– The law says "with the involvement of teachers"
Pilot, review, revise, implement
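A minimal sketch of the growth idea above, assuming hypothetical cut scores and the common expansions of the level labels (NP = not proficient, PP = partially proficient, P = proficient, HP = highly proficient); a district would substitute its own levels and boundaries.

```python
# Illustrative only: map percentage scores to assumed performance levels and
# report a student's growth from pre-test to post-test.
LEVELS = [(85, "HP"), (70, "P"), (50, "PP"), (0, "NP")]  # assumed cut scores

def level(score_pct: float) -> str:
    """Return the performance level for a percentage score."""
    for cut, label in LEVELS:
        if score_pct >= cut:
            return label
    return "NP"

def growth(pre_pct: float, post_pct: float) -> dict:
    """Summarize one student's growth between two testing points."""
    return {
        "pre_level": level(pre_pct),
        "post_level": level(post_pct),
        "pct_growth": post_pct - pre_pct,
    }

print(growth(pre_pct=48.0, post_pct=76.0))
# {'pre_level': 'NP', 'post_level': 'P', 'pct_growth': 28.0}
```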

MI Grade Level Content Expectations (June 2011)
– What students should know and be able to do in grades K-8 & HS
– Aligned with VPAA & 21st century skills
– Standards & benchmarks by grade level
– Teachers evaluated on use of standards
[See handout]

Assessment Terms
– Reliability = consistency
  – Test/retest (regardless of year, location, etc.)
  – Interrater (every judge the same; sketch below)
– Validity = the extent to which an assessment measures what it purports to measure
– Authentic assessment = students demonstrate knowledge and skills in a real-world context (e.g., performance)
– Quantitative = data is numerical (anything that can be counted, percentages)
– Qualitative = data is in words (descriptions, written critiques)
– Formative vs. summative
– Formal vs. informal
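For the interrater sense of reliability, one simple check is the proportion of performances on which two judges assign the same rubric level. The judges and scores below are hypothetical.

```python
# Hypothetical interrater check: how often do two judges give the same rubric
# level to the same performances?
judge_a = [3, 4, 2, 3, 4, 1, 3, 2]   # judge A, rubric levels on a 1-4 scale
judge_b = [3, 4, 2, 2, 4, 1, 3, 3]   # judge B, same eight performances

agreement = sum(a == b for a, b in zip(judge_a, judge_b)) / len(judge_a)
print(f"Exact agreement: {agreement:.0%}")   # Exact agreement: 75%
```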

Assessment Terms – RTTT
– Rigorous = assessments that measure grade-level standards
– Two points in time = pre- & post-test
  – Proficiency from one year to the next
  – Ongoing assessments of musical skills (steady beat, pitch matching, singing, recorder, instrumental performance, sight-reading, etc.)
– Comparable across classrooms = same for all teachers at a particular level or area
  – Assessments comparable in rigor to other subjects

Student Growth Measures

Rubistar
– Create rubrics using existing descriptors
– Search other teachers' rubrics for samples
– Edit to fit your needs

Rubrics
Types include:
– Holistic (overall performance)
– Analytic (specific dimensions of performance)
– Additive
Descriptors must be valid (meaningful)
Scores:
– Must be reliable (consistent)
– Should relate to actual levels of student learning
Can be used by students for self-assessment and to assess the performance of other students
Give to students before the assessment

What does a rubric look like?
TONE:
– Beginning: Breathy; unclear; lacks focus; unsupported
– Basic: Inconsistent; beginning to be centered and clear; breath support needs improvement
– Proficient: Consistent breath support; centered and clear; beginning to be resonant
– Advanced: Resonant; centered; vibrant; projecting
Adapted from: K. Dirth, Instituting Portfolio Assessment in Performing Ensembles, NYSSMA Winter Conference, Dec. 2
Features:
– Scale includes rating points (at least 4); see handout for sample headings
– Highest point represents exemplary performance
– Criterion-based categories
– Descriptors are provided for each level of student performance
– Pre- and/or post-test
– Teacher, peer, & self assessment
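One way to keep a rubric's levels and descriptors paired up is to store them in a small data structure. The sketch below uses the TONE row from the slide above and an assumed 1-4 point mapping (Beginning = 1 through Advanced = 4); other criteria would be added the same way.

```python
# Analytic rubric stored as data: criterion -> descriptors for levels 1-4.
LEVELS = ["Beginning", "Basic", "Proficient", "Advanced"]

RUBRIC = {
    "Tone": [
        "Breathy; unclear; lacks focus; unsupported",
        "Inconsistent; beginning to be centered and clear; breath support needs improvement",
        "Consistent breath support; centered and clear; beginning to be resonant",
        "Resonant; centered; vibrant; projecting",
    ],
    # Additional criteria (e.g., intonation, rhythm) would follow the same pattern.
}

def describe(criterion: str, points: int) -> str:
    """Return the level name and descriptor for a 1-4 score on one criterion."""
    return f"{LEVELS[points - 1]}: {RUBRIC[criterion][points - 1]}"

print(describe("Tone", 3))
# Proficient: Consistent breath support; centered and clear; beginning to be resonant
```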

Holistic Rubric

Piano Rubric – Analytic
Quiz #1: Scales, two octaves, hands together, ascending and descending. Keys: ____________ (1/6/12)

Sample Rating Scale

Showing Growth w/ Rubrics (or any other pre-/post-test)
– Pre- & post-test
– Average class post-test % - average class pre-test % = % growth (sketch below)
(Table columns: Post, Pre, % growth)
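The class-growth calculation above, written out with hypothetical scores; any rubric or test that yields a percentage works the same way.

```python
# Class growth = average post-test percent minus average pre-test percent.
pre_scores  = [55, 62, 48, 70, 66]   # hypothetical pre-test % for each student
post_scores = [78, 80, 65, 88, 74]   # the same students' post-test %

pre_avg  = sum(pre_scores) / len(pre_scores)
post_avg = sum(post_scores) / len(post_scores)
class_growth = post_avg - pre_avg

print(f"Class pre-test average:  {pre_avg:.1f}%")             # 60.2%
print(f"Class post-test average: {post_avg:.1f}%")            # 77.0%
print(f"Class growth:            {class_growth:.1f} points")  # 16.8 points
```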

Est. Personal Reliability
– Record 10 students
– Grade w/ rubric
– Grade again in 2 weeks
– Measure the difference in score for each recording
– Calculate the average difference (lower = better); see the sketch below
(Table columns: Trial 1, Trial 2, Difference; Av. Diff. = 1.14)
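The same check as a few lines of code, using hypothetical rubric totals for the ten recordings.

```python
# Personal (test/retest) reliability: grade ten recordings twice, two weeks
# apart, and average the absolute difference between the two passes.
trial_1 = [8, 10, 6, 9, 7, 11, 5, 8, 10, 7]   # rubric totals, first pass
trial_2 = [9, 10, 7, 8, 7, 12, 5, 9, 11, 8]   # same recordings, second pass

differences = [abs(a - b) for a, b in zip(trial_1, trial_2)]
avg_diff = sum(differences) / len(differences)

print("Differences:", differences)            # [1, 0, 1, 1, 0, 1, 0, 1, 1, 1]
print(f"Average difference: {avg_diff:.2f}")  # Average difference: 0.70
```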

Rate these 6 recorder performances on a scale of 1-12, then rate the same examples using the rubric in the handout.
Trial 1:
1 _____  2 _____  3 _____  4 _____  5 _____  6 _____

Recorder Trial 2
– Use the rubric on the loose sheet
– Add up the score
– Match scores from Trial 1 to scores from Trial 2
– Is there a difference? In which scores are you most confident?

Elementary General Music – Grade 3 Pre- & Post-Test Sample [See handout]
– Paper/pencil, but relies on musical response
– Prompts can be different for the pre-test
– Pre-test can be an abbreviated version
– Requires 2-3 class periods to complete
– Music supervisor could issue musical examples & prompts before the test (to avoid teaching to the test)

Creating a Similar Elementary General Music Assessment
– For grades 3-5, determine which GLCEs can be measured through paper/pencil response
– Create question(s) for each benchmark
  – Deliberately connect each question to the GLCEs (validity, rigor, comparability across classrooms)
– Decide the # of questions needed to determine competency
– Create questions that fit different prompts

Performing Ensembles Semester Exam [see handout]
– Jason Lowe – Bay City HS Bands
– Mandy Smith – Rockford HS Choirs

Watkins-Farnum Performance Scale
– Sight reading – band
– Published by Hal Leonard
– Reliable & valid assessment
– Forms A & B
– Easy to score per the directions in the handout
– 14 exercises worth X pts.
– Score until the student earns 0 on 2 consecutive exercises (sketch below)
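A sketch of the stopping rule named in the last bullet; the per-exercise scores are hypothetical, and the published directions (point values, handling of remaining exercises) govern actual scoring.

```python
# Add up exercise scores, stopping after two consecutive zeros.
def total_score(exercise_scores):
    total, consecutive_zeros = 0, 0
    for score in exercise_scores:
        total += score
        consecutive_zeros = consecutive_zeros + 1 if score == 0 else 0
        if consecutive_zeros == 2:
            break
    return total

print(total_score([14, 12, 10, 9, 0, 6, 0, 0, 5, 3]))  # 51 (stops at the 8th exercise)
```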

Royal Conservatory Music Development Program (see handout)
– Recorder, strings, woodwinds, brass, percussion, voice
– Graded: preparatory, 1-10
  – RC Grade 8 considered college entrance level
– Includes solos, etudes, scales/arpeggios, ear training, sight reading, theory
– Curricula online
– Adapt for your program

Performing Ensembles

Excellence in Theory or Standard of Excellence Music Theory & History Workbooks
– Publisher: Kjos
– 3 volumes (see handout sample)
– Include theory, ear training, history
– Takes all of MS & HS to complete the 3 volumes
– Students work on lessons during down time in rehearsal
– Establish grade-level expectations and a written exam

Ensuring Integrity

Self-created, administered, and graded assessments: colleagues & administrators will ask about them.
– Standards-based assessments
– Comparable across classrooms
– Demonstrate validity & reliability
  – Explain/demonstrate the process for creating, administering, & grading
  – Demonstrate the connection b/w state standards and assessments
  – Archive recordings

Audio
– Archived up to 5 months
– Sends a link to an address
– Download as .WAV or .Ogg
– Useful for performance tests
– Very easy!

Festival Ratings

NAfME Position Statement
Successful music teacher evaluation must, where the most easily observable outcomes of student learning in music are customarily measured in a collective manner (e.g., adjudicated ratings of large ensemble performances), limit the use of these data to valid and reliable measures and should form only part of a teacher's evaluation. (NAfME, 2011)

Festival Ratings: Advantages
– Provide quantitative, third-party assessment
– Can show growth over time in some circumstances
  – Individual judges' ratings
  – Repertoire difficulty
  – 3-yr. period
– Valid to the extent that they measure the quality of an ensemble's performance of three selected pieces & sight reading at one point in time
– Likely reliable over a 3-yr. period based on previous research
– Probably adaptable to a state-wide evaluation tool
– Assess a few performance standards

Ratings Growth Example
Hypothetical contest ratings for one ensemble over a three-year period (table columns: Judge 1, Judge 2, Judge 3, Sight-Reading, Average, Annual Increase, Final; rows: Year 1, Year 2, Year 3).
Note: Roman numerals represent division ratings. Total increase from year 1 to year 3 = 44%.
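A sketch of how such year-to-year growth might be computed from division ratings. The conversion of ratings to points (I = 5 down to V = 1) and the ratings themselves are illustrative assumptions and do not reproduce the 44% figure above.

```python
# Convert division ratings to points (assumed: I = 5 ... V = 1) and report the
# year-to-year percent increase in the average rating.
POINTS = {"I": 5, "II": 4, "III": 3, "IV": 2, "V": 1}

def average(ratings):
    """Average point value of one year's ratings (3 judges + sight-reading)."""
    return sum(POINTS[r] for r in ratings) / len(ratings)

years = {
    "Year 1": ["II", "II", "II", "I"],   # hypothetical ratings
    "Year 2": ["II", "I", "II", "I"],
    "Year 3": ["I", "I", "I", "I"],
}

previous = None
for year, ratings in years.items():
    avg = average(ratings)
    note = "" if previous is None else f" ({(avg - previous) / previous:.0%} increase)"
    print(f"{year}: average {avg:.2f}{note}")
    previous = avg
```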

Ratings ≠ MEAP or MME Exams
MEAP & MME:
– Same for all each year
– Reliability and validity established
– Many standards
– Individual
– Mostly objective
– Reflect multiple levels of achievement
Ratings:
– Repertoire & adjudicators change
– Validity & reliability not established
– Performance standards only
– Group
– Mostly subjective
– 90%+ earn I or II out of V ratings

Festival/Contest Ratings: Challenges
– Reliability
– Curricular limitations
– Score inflation
– Ratings' effectiveness in differentiating quality
– Influence of non-performance factors
– Group vs. individual performance
– Other factors
– Role of MSBOA & MSVMA?

Experiences

Describe Your Situation (in roundtables by area)
– How are you measuring student growth at your school?
– What support are you getting?
– What needs or concerns do you have?