Measuring Student Growth in the Instrumental Music Classroom


Measuring Student Growth in the Instrumental Music Classroom
Dr. Phillip M. Hash, Illinois State University, Normal, Illinois
pmhash@ilstu.edu | www.pmhmusic.weebly.com
ISU Band Director Workshop, Wednesday, July 11, 2018

Purpose
Discuss ways of assessing and demonstrating student growth in school band and orchestra programs:
- Basic principles
- Definitions
- Performance assessments
- Traditional pencil/paper assessments
- Ensuring integrity of the process
- Current practices in Michigan

IL School Code on Evaluation (Part 50)
- Growth = at least 30% of the evaluation score (after 2 years of implementation)
- Two or more assessments to determine student growth, aligned to school improvement goals
- Types of assessments: Type I = standardized; Type II = district-wide; Type III = teacher/school specific
- Acceptable measures include teacher-created assessments, textbook publishers' tests, work samples/portfolios, and measures of student performance
- Districts must specify growth expectations and how growth is calculated (data analysis)
- Must include at least one Type I or II and at least one Type III, unless a district joint committee on evaluation allows two Type III assessments for a particular classification of teachers (e.g., music, PE)
- The interval of instruction could be one year to the next
- The evaluation rating is related to the number of students meeting the growth expectation

Basic Principles
- Assessment = art of the possible (e.g., rubrics, multiple-choice tests)
- Growth vs. achievement (pre/post test or levels of achievement; see the sketch after this list)
- Multiple measures (3?): performance test, written test, group assessment
- Assessment should NOT dominate instruction (incorporate assessment into auditions; use last year's spring playing test as the baseline for next year's test)
- What do you already do?
- Meaningful & useful vs. "hoops"
- Individual student progress (vs. group measures)
- Skills & concepts vs. "the piece"
- Not necessarily for a grade
- Consistent across the district & music content areas (to compare teachers)
- Assessing MI Music Standards & Benchmarks adds validity to what we teach; performance standards are only part of what is required
- Administrators want music educators to lead this process
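To make the growth-versus-achievement distinction concrete, the sketch below shows one way a director might compute each student's growth from a fall pre-test to a spring post-test on the same playing-test rubric, and then report the share of students meeting a district-defined target (the kind of growth calculation the IL School Code asks districts to specify). This is a minimal illustration; the student names, scores, and the 20% target are hypothetical, not from the presentation.

```python
# Minimal sketch (hypothetical data): individual pre/post growth and the
# share of students who meet a district-defined growth target.

def percent_growth(pre: float, post: float) -> float:
    """Growth as a percentage of the pre-test score (higher score = better)."""
    return (post - pre) / pre * 100

def proportion_meeting_target(scores: dict[str, tuple[float, float]],
                              target_pct: float = 20.0) -> float:
    """Proportion of students whose pre-to-post growth meets the target."""
    met = sum(1 for pre, post in scores.values()
              if percent_growth(pre, post) >= target_pct)
    return met / len(scores)

if __name__ == "__main__":
    # Fall/spring playing-test totals on a 20-point analytic rubric (hypothetical).
    scores = {"Student A": (12, 16), "Student B": (15, 16), "Student C": (8, 13)}
    for name, (pre, post) in scores.items():
        print(f"{name}: {percent_growth(pre, post):.0f}% growth")
    print(f"Meeting the 20% target: {proportion_meeting_target(scores):.0%}")
```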

Assessment Terms
- Reliability = consistency
  - Test/retest: scores hold up regardless of how many times the test is taken (true/false items are not reliable)
  - Interrater: every judge scores the same, e.g., at solo/ensemble contest (see the agreement sketch after this list)
- Validity = the extent to which an assessment measures what it intends (e.g., scales assessed through reflection, written work, and playing) [connect with IL Music Standards]
- Authentic assessment = demonstrating knowledge and skills in a real-world context (e.g., performance)
- Psychometric = pencil-and-paper test
- Quantitative = numerical data (anything that can be counted; percentages)
- Qualitative = data in words (descriptions, written critiques)
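Before judges' or directors' scores feed into an evaluation, interrater reliability can be checked informally. The sketch below, a minimal illustration rather than anything from the presentation, computes simple percent agreement between two judges' scores on one rubric dimension, both exact and within one level.

```python
# Minimal sketch (hypothetical scores): percent agreement between two judges.

def percent_agreement(judge_a: list[int], judge_b: list[int],
                      tolerance: int = 0) -> float:
    """Share of performances the judges score within `tolerance` levels of each other."""
    assert len(judge_a) == len(judge_b), "each judge must score every performance"
    agree = sum(1 for a, b in zip(judge_a, judge_b) if abs(a - b) <= tolerance)
    return agree / len(judge_a)

if __name__ == "__main__":
    judge_a = [4, 3, 2, 4, 3]   # scores on one 4-level rubric dimension
    judge_b = [4, 3, 3, 4, 2]
    print(f"Exact agreement: {percent_agreement(judge_a, judge_b):.0%}")
    print(f"Within one level: {percent_agreement(judge_a, judge_b, tolerance=1):.0%}")
```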

Performance Assessment

Rubrics [examples in handout]
One of the most effective tools for measuring student music performance is the rubric. Examples are provided in the handout, and many others are available online: search for "orchestra," "strings," "band," or "instrumental rubric," and look at both web and image results.
An assessment tool containing:
- Categories or dimensions of performance
- Multiple levels of achievement
- Descriptors for each level of each category
Advantages:
- Improves reliability
- Informs student & teacher
- Focuses attention on the whole performance
- Dimension scores can be combined into a summative total (see the sketch after this slide)
- OK to measure only a few categories
Analytic rubric: multiple categories; focuses on quality in each category
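To picture how dimension scores combine into a summative total, here is a minimal sketch of an analytic rubric represented as a data structure. The categories and single-word level labels are hypothetical stand-ins (real descriptors are the short paragraphs described later under Creating a Rubric); this is not the presenter's rubric.

```python
# Minimal sketch (hypothetical rubric): an analytic rubric as a data structure,
# with 1-4 dimension scores summed into a summative total.

RUBRIC_LEVELS = ["unacceptable", "developing", "proficient", "excellent"]
DIMENSIONS = ["Tone", "Intonation", "Rhythm", "Articulation"]

def summative_total(scores: dict[str, int]) -> int:
    """Sum the dimension scores after checking they fall within the rubric levels."""
    for dim, level in scores.items():
        if dim not in DIMENSIONS or not 1 <= level <= len(RUBRIC_LEVELS):
            raise ValueError(f"invalid score for {dim}: {level}")
    return sum(scores.values())

if __name__ == "__main__":
    playing_test = {"Tone": 3, "Intonation": 2, "Rhythm": 4, "Articulation": 3}
    print(f"Summative total: {summative_total(playing_test)} / "
          f"{len(RUBRIC_LEVELS) * len(DIMENSIONS)}")   # 12 / 16
```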

Additive Rubric (a point is awarded for each aspect of a dimension that is demonstrated)

Holistic Rubric (scores the overall performance)
It is OK to underline the statements that apply and score using a decimal. Holistic rubrics are fast and easy to use, but no single level may accurately represent the performance. Use the rubric as a guide and average the level numbers when needed (see the sketch below).
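As a small illustration of "average numbers when needed," the sketch below (a hypothetical example, not from the presentation) averages the levels of the underlined descriptor statements into a decimal holistic score.

```python
# Minimal sketch (hypothetical data): average the levels of the underlined
# holistic-rubric statements into a decimal score.
from statistics import mean

underlined_levels = [3, 3, 2, 4]   # levels of the statements that applied
print(f"Holistic score: {mean(underlined_levels):.1f}")   # 3.0
```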

Creating a Rubric
- Determine categories: between 3 and 5
- Write descriptors for the different levels of proficiency (1-4 or 5) [analytic]:
  - a short paragraph (2-4 sentences) per level
  - the top level (far left) describes the very best performance
  - the bottom level (far right) describes an unacceptable performance
  - 2-3 levels in between
- Use constructive labels (excellent/needs work, professional to novice, etc.) [see handout]
- Be prepared to pilot test & revise!
http://rubistar.4teachers.org/index.php

Rubistar (http://rubistar.4teachers.org/)
- Create rubrics using existing templates & descriptors
- Search other teachers' rubrics for samples
- Edit to fit your needs
- Free

Using the Rubric
- Distribute it ahead of time
- Use it for self/peer evaluation
- Be clear about how the rubric will be used: formative/pretest/ungraded vs. summative/posttest/graded
- Mastery learning: students repeat the task until the performance reaches a high level

SmartMusic©
- Interactive practice and assessment tool with an extensive library
- Teachers create, send, and grade assignments and track progress
- As a student plays an exercise or selection, SmartMusic's assessment feature displays the music and shows correct notes and rhythms in green and incorrect ones in red
- Students can record a performance at home and submit the grade (%), an assessment screenshot, and the recording to their teacher
- SmartMusic measures only the accuracy of notes and rhythms (the most objective criteria), but directors can assess tone, intonation, phrasing, and other aspects while listening to the recording (MakeMusic, Inc., 2013)
- Pricing: educator = $40; per student = $4-$12

www.vocaroo.com
- Record or upload audio (e.g., from a smartphone); very easy to use
- Recordings are archived for up to 5 months
- Sends a link to an email address
- Download as .WAV or MP3
- Useful for performance tests
- The record function works better for strings & woodwinds

Resources
- Wendy Barden (Kjos)
- Paul Kimpton (GIA)

RCMDP Syllabi (https://www.rcmusic.com/)
A progressive curriculum for all instruments. Components (10-11 levels):
- Repertoire (A & B lists)
- Technical requirements (scales, arpeggios)
- Ear training: intervals, clapback/playback
- Sight reading
- Theory & history
Tests are available; adapt as needed.

Contest Ratings [Group Measure]
It is possible to use festival ratings as a group measure of student growth; there are concerns with teacher-created local measures as well. IF festival ratings are used:
1. The decision is up to the director
2. It must be clear to the director and administration HOW the ratings will be used
3. They must be one of MULTIPLE measures

Contest Ratings: Advantages/Disadvantages
Advantages
- A quantitative, third-party assessment that adds credibility
- Focuses on a major aspect of the ensemble curriculum
- Final ratings are likely reliable over a three-year period, based on previous research
- Possibly an ISBE Type I assessment(?)
- Can show growth over time in some circumstances, assuming adjudication and repertoire are consistent (reliable); e.g., ratings of II, II, II, II / I, II, II, II / I, I, II, II across three years demonstrate growth
Disadvantages
- Narrow: valid only to the extent that they measure the quality of an ensemble's performance of three selected pieces & sight reading at one point in time
- Ceiling effect
- Subject to outside influences
- What is the role of contest?
It will probably be possible to fit ratings into a state-wide evaluation tool if individual adjudicators' ratings are used.

Ratings Growth Example
Hypothetical contest ratings for one ensemble over a three-year period (three judges plus sight-reading; lower numerals are better ratings, so a falling average reflects growth; see the calculation sketch below):
- Year 1: judges' ratings of II and III; average = 2.25; annual increase = -; final rating = II
- Year 2: ratings improved to include a I; average = 1.75; annual increase = 22%
- Year 3: average = 1.25; annual increase = 29%; final rating = I
Note: Total increase from Year 1 to Year 3 = 44%.
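The percentages above can be reproduced with the arithmetic below: because lower averages are better, each year's increase is the drop in the average rating expressed as a percentage of the previous year's average. This is a minimal sketch of that calculation, not an official formula.

```python
# Minimal sketch: percent increase in ensemble achievement from year to year.
# Lower rating averages are better, so improvement = (previous - current) / previous.

def rating_increase(previous_avg: float, current_avg: float) -> float:
    return (previous_avg - current_avg) / previous_avg * 100

averages = [2.25, 1.75, 1.25]              # the hypothetical three-year averages
for prev, curr in zip(averages, averages[1:]):
    print(f"{prev} -> {curr}: {rating_increase(prev, curr):.0f}% increase")   # 22%, 29%
print(f"Total: {rating_increase(averages[0], averages[-1]):.0f}% increase")   # 44%
```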

Solo/Ensemble Contest Ratings [Type I Assessment]
- Consider requiring S/E participation, perhaps in only two consecutive grade levels (e.g., 6-7; 9-10)
- Allow time to work with students; keep other considerations in mind (e.g., cost)
- Solos, duets, trios
- Use the ABC grading chart (handout) or the FJH chart (online) to standardize repertoire grade levels; OK to use half grades
- Growth expectation (see the sketch after this list): a higher grade level with the same rating; the same grade level with a higher rating; or a higher grade level with a rating one level lower
- A holistic, authentic assessment
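The growth expectation in the bullet above can be read as a simple decision rule. The sketch below encodes one possible reading of it, assuming numeric repertoire grade levels (half grades allowed) and integer ratings where 1 (Division I) is best; it is an illustration, not an official rubric.

```python
# Minimal sketch (one reading of the slide, not an official rule): did a student
# meet the solo/ensemble growth expectation from one year to the next?
# Repertoire grades are numeric (half grades allowed); ratings are integers, 1 = best.

def met_growth_expectation(grade1: float, rating1: int,
                           grade2: float, rating2: int) -> bool:
    harder_piece = grade2 > grade1
    same_grade = grade2 == grade1
    same_or_better_rating = rating2 <= rating1   # lower numeral = higher rating
    better_rating = rating2 < rating1
    one_rating_lower = rating2 == rating1 + 1

    return ((harder_piece and same_or_better_rating) or   # > grade / same rating
            (same_grade and better_rating) or             # same grade / > rating
            (harder_piece and one_rating_lower))          # > grade / one rating lower

if __name__ == "__main__":
    print(met_growth_expectation(2.0, 2, 2.5, 2))   # harder piece, same rating -> True
    print(met_growth_expectation(3.0, 2, 3.0, 1))   # same grade, better rating -> True
    print(met_growth_expectation(3.0, 1, 3.5, 2))   # harder piece, one lower   -> True
    print(met_growth_expectation(3.0, 1, 3.0, 2))   # same grade, worse rating  -> False
```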

Psychometric Tests [Refer to HS Orchestra Example & “Strategic Testing” article in Handout]

Uses
- Theory
- History
- Listen, analyze, describe, evaluate

Psychometric Tests (Eimer, 2007) [see the sample HS orchestra exam in the handout]
Goal = test clarity & reduced anxiety. So that the test is clear to students:
- Review material and provide a study guide that lets students know what to expect
- Use the same basic format and possible score (e.g., 100 points) for every test
- Keep tests a reasonable length
- Avoid clues within the test
- Test important information and concepts
- Avoid true/false items: they are unreliable and limited to facts
- Matching: facts only; keep the same type/topic within each set; no more than 10 items per set; consider an uneven number of questions vs. possible answers; tell students how many times each answer may be used

Multiple Choice
- Write an incomplete sentence (stem) with one clear answer & 2-3 distractors
- Match grammar between the stem and the choices
- Arrange choices alphabetically or numerically
- Keep the stem longer than the choices
- Avoid "all/none of the above," "a & c," etc.

Psychometric Tests: Essay & Short Answer [see the HS orchestra example]
- NOT for factual information
- Use to make connections, apply higher-order thinking skills, and evaluate understanding
- Make the expectations clear in the question
- Grade with a holistic rubric
Notate & Respond [see the HS orchestra example]

Ensuring Integrity
- Demonstrate validity & reliability
- Demonstrate the connection between state standards and assessments
- Explain/demonstrate the process for creating, administering, & grading assessments
- Archive recordings & other student work

Conclusion
- Work together
- Share good ideas
pmhash@ilstu.edu