We Gave the Test - Now What? Analysis and Reporting for the South Carolina Arts Assessment Program
Ashlee A. Lewis, Office of Program Evaluation

• Rubrics promote student performance and encourage reflective teaching practices (Beeth et al., 2001; Luft, 1998)
• Consistent rubric use in the classroom can effectively guide students’ revisions; this may lead to improvements in overall writing ability (Andrade, 2001; Schirmer & Bailey, 2000)
• The use of rubrics as a teaching strategy significantly improves writing in the following areas: topic, content, story development, and organization (Schirmer, Bailey, & Fitzgerald, 1999)

• Rubrics can give teachers insight into the effectiveness of their instructional practices (Waltman et al., 1998)
• Providing scoring rubrics to educators and students can improve student performance (Johnson, Penny, & Gordon, 2009; U.S. Congress, 1992)

• Providing rubrics can give teachers a sense of control and agency over student assessment.
• Releasing scoring rubrics might allay the fears and dispel the mysteries surrounding standardized forms of assessment.

• Purpose
  › Provide arts educators and school administrators with a tool to authentically measure students’ arts achievement and to evaluate schools’ arts programs
  › Improve the quality of arts education in South Carolina
• Goal
  › Develop standards-based arts assessments in dance, music, theatre, and visual arts in order to encourage implementation of the standards and to align curriculum, instruction, and assessment

• Current Assessments
  › Four entry level: Music, Visual Arts, Dance & Theatre
  › Two intermediate level: Music & Visual Arts
• Assessment Format
  › 2 parallel multiple-choice test forms with 45 items
  › 2 performance tasks

• Performance Task 1
  › Compare and contrast two artworks
• Performance Task 2
  › Draw and evaluate own artwork

• Are performance assessment scores improved by releasing rubrics to test administrators?
• What other factors might influence students’ scores?
• Are the impacts different across types of performance tasks?

• Data Collected
  › SCAAP visual arts performance assessment results
    - Task 1 & Task 2a
    - N = 24 schools
    - Mean scores from each school
  › Teacher Feedback Survey
• Quantitative Analysis (a code sketch follows below)
  › Repeated-Measures ANOVA
  › Kruskal-Wallis test
• Qualitative Analysis of Teacher Feedback Survey
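The slides name the analysis procedures but do not show the computation. As a minimal sketch of how the repeated-measures ANOVA could be run in Python, assuming a long-format table of school-level mean scores (the column names and simulated values below are placeholders, not SCAAP data):

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(42)

# Simulated stand-in for 24 school-level mean scores in each of three
# administrations; the real SCAAP values are not reproduced in the slides.
df = pd.DataFrame({
    "school": np.repeat(np.arange(1, 25), 3),
    "year": np.tile([2008, 2009, 2010], 24),
    "mean_score": rng.normal(loc=2.0, scale=0.4, size=72),
})

# Repeated-measures ANOVA: every school is observed in all three years,
# so "school" is the subject and "year" is the within-subject factor.
result = AnovaRM(df, depvar="mean_score", subject="school", within=["year"]).fit()
print(result)  # reports the F statistic and p-value for the year effect
```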

• Mean score changes for the SCAAP schools
• Mean score changes for individual SCAAP schools:
  › Twelve schools out of 24 (50%) improved mean scores from 2008 to 2009
  › Ten schools out of 24 (42%) improved mean scores from 2009 to 2010
[Table: mean scores and standard deviations for all schools and for individual schools, by year; values not preserved in this transcript]
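The "improved mean scores" counts are straightforward year-over-year comparisons; a small illustration of the arithmetic, on simulated stand-in values rather than the actual school means:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)

# Simulated wide table: one row per school, one column per year.
wide = pd.DataFrame(
    rng.normal(loc=2.0, scale=0.4, size=(24, 3)),
    columns=[2008, 2009, 2010],
)

# Count schools whose mean score rose from one administration to the next,
# which is how the "12 of 24 (50%)" style figures are derived.
for y1, y2 in [(2008, 2009), (2009, 2010)]:
    improved = (wide[y2] > wide[y1]).sum()
    print(f"{improved} of 24 schools improved from {y1} to {y2} "
          f"({improved / 24:.0%})")
```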

Task 1: Compare and Contrast

• Mean score changes by amount of time used to administer SCAAP Task 1
Note: We suggest students be given about 30 minutes to complete each task.
[Table: mean scores grouped by administration time (below, near, and above the suggested time, plus N/A); numeric cut-points and values not preserved in this transcript]

• Repeated-Measures ANOVA: To examine differences across years
  › No significant differences were found between the three years (F = 2.39, p = .10)
• Kruskal-Wallis on Time Allocation: To examine how the amount of time used impacted student scores (see the sketch below)
  › No significant differences were found for any of the years (using α = .05)
[Table: Kruskal-Wallis H statistics and critical values (CV) by year; values not preserved in this transcript]
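For the time-allocation comparison, a minimal scipy sketch of the Kruskal-Wallis test; the group labels and scores are hypothetical, since the transcript does not preserve the time categories' cut-points or data:

```python
from scipy import stats

# Hypothetical school mean scores grouped by how much time was allowed
# to administer the task (illustrative values only).
shorter_time = [1.8, 2.1, 2.4, 2.0, 2.2]
suggested_time = [2.3, 2.0, 1.9, 2.5, 2.1, 2.2]  # about 30 minutes
longer_time = [2.2, 2.4, 2.1, 1.9]

# Kruskal-Wallis is a rank-based test, appropriate for small,
# unequal groups where normality is doubtful.
h, p = stats.kruskal(shorter_time, suggested_time, longer_time)
print(f"H = {h:.2f}, p = {p:.3f}")  # compare p to alpha = .05
```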

• Mean score changes for the SCAAP schools
• Mean score changes for individual SCAAP schools:
  › Thirteen schools out of 24 (54%) improved mean scores from 2008 to 2009
  › Sixteen schools out of 24 (67%) improved mean scores from 2009 to 2010
[Table: mean scores and standard deviations for all schools and for individual schools, by year; values not preserved in this transcript]

Task 2a: Drawing

• Mean score changes by amount of time used to administer SCAAP Task 2a
Note: We suggest students be given about 30 minutes to complete each task.
[Table: mean scores grouped by administration time (below, near, and above the suggested time, plus N/A); numeric cut-points and values not preserved in this transcript]

• Repeated-Measures ANOVA: Significant differences were found across years (F = 7.46, p = .002); pairwise comparisons (sketched in code below):
  › 2008 vs. 2009: F = .12, p = .73
  › 2009 vs. 2010: F = 11.26, p = .003
  › 2008 vs. 2010: F = 8.26, p = .009
• Kruskal-Wallis on Time Allocation: To examine how the amount of time used impacted student scores
  › No significant difference was found for any of the years (using α = .05)
[Table: Kruskal-Wallis H statistics and critical values (CV) by year; values not preserved in this transcript]
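The year-pair contrasts above can be reproduced as paired comparisons: with only two repeated measures per contrast, the repeated-measures F equals the squared paired t statistic. A sketch on the same kind of simulated frame as before (not the actual SCAAP scores):

```python
from itertools import combinations

import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(42)

# Simulated school-by-year mean scores, as in the earlier sketch.
df = pd.DataFrame({
    "school": np.repeat(np.arange(1, 25), 3),
    "year": np.tile([2008, 2009, 2010], 24),
    "mean_score": rng.normal(loc=2.0, scale=0.4, size=72),
})
wide = df.pivot(index="school", columns="year", values="mean_score")

# Pairwise follow-ups to a significant overall ANOVA. For two paired
# samples, F = t**2, so these mirror the slide's year-pair F tests.
for y1, y2 in combinations([2008, 2009, 2010], 2):
    t, p = stats.ttest_rel(wide[y1], wide[y2])
    print(f"{y1} vs {y2}: F = {t**2:.2f}, p = {p:.3f}")
```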

• Teachers were asked 3 questions regarding their use of the SCAAP rubrics:
  › Have you reviewed the performance task rubrics (provided in the Visual Arts Script)?
  › Have you shared the information in the rubrics with your students in any way? (If yes, what methods did you use to share that information?)
  › Did you use, or will you use, the information in the rubrics to inform your instructional practices? (If yes, please explain.)
• 29 visual arts teachers responded to the feedback survey

Have you reviewed the performance task rubrics (provided in the Visual Arts Script)?
• Yes: 72%
• No: 28%

Have you shared the information in the rubrics with your students in any way? (If yes, what methods did you use to share that information?)
• Yes: 25%
• No: 75%

• For the open-ended portion, n = 6
• Responses indicated that teachers:
  › Read the rubrics aloud to the students
  › Summarized the rubrics for the students
  › Explained what was required to receive the highest score on the task

I believe that last year was the first year that these were provided… As we practiced writing prompts during the year, I tried to get across the need to use terms, refer to the pictures, and explain what is being compared… After receiving materials this year, I tried to show them some examples of sentences that might be used and possible scores they may receive. (This was as we practiced in the classroom using different types of pictures.) We had practiced comparing and contrasting landscapes. I may have written a sentence and said, "This may get you a zero or a one." I would add information to the sentence to show how they could give more information.

Did you use, or will you use, the information in the rubrics to inform your instructional practices? (If yes, please share how you have used or will use the information in the rubrics to inform instruction.)
• Yes: 50%
• No: 50%

• For the open-ended portion, n = 10
• Responses indicated that rubrics inform instruction by:
  › Serving as a tool for sharing assessment results
  › Assisting teachers in preparing for the assessment
  › Informing future instructional planning
  › Inspiring teachers to incorporate similar elements in their classrooms

• Importantly, one teacher responded: "I assumed that the instructions were the same as in the past, so, sadly, I did not know to look for any rubrics. Because of this mistake, I did not share with my students. I do think that sharing this information with the students ahead of taking the test would be helpful."

• "I did like the rubric and if I am allowed to have a copy of it, it would be very helpful to use in future lessons when writing an art critique."
• "I would like to incorporate elements of your rubrics into my classroom so the terminology is consistent in class and in the rubric. I also think it will help the students on the multiple choice section because they will know the terms."
• "I will use it to help students understand expectations of what is expected in the art room and what they can do to improve not only their work, but also their grades. It is a tangible and understandable measuring tool for art expectations."

• Small sample size: 24 schools
• No control over whether and how teachers use the rubrics
• Lack of information about how teachers administered the test

• Releasing rubrics may affect student performance, but those impacts might differ based on the nature of the task
• Scores varied with the time allocated to administer the tasks, but no evidence was found that time allocation significantly changed the results

• Investigation of how teachers used the rubrics
• Examination of how the nature of the task relates to the differing impact of rubrics on student scores

• Make the intended uses of the rubrics more explicit to teachers
• Facilitate teachers' sharing of suggestions for how to use the rubrics
• Create and provide grade-appropriate student versions of the rubrics

Andrade, H. G. (2001). The effects of instructional rubrics on learning to write. Current Issues in Education [Online], 4(4).
Beeth, M. E., Cross, L., Pearl, C., Pirro, J., Yagnesak, K., & Kennedy, J. (2001). A continuum for assessing science process knowledge in grades K-6. Electronic Journal of Science Education, 5(3).
Johnson, R. L., Penny, J. A., & Gordon, B. (2009). Assessing performance: Designing, scoring, and validating performance tasks. New York, NY: The Guilford Press.
Luft, J. (1998). Rubrics: Design and use in science teacher education. Paper presented at the annual meeting of the Association for the Education of Teachers in Science.
Schirmer, B. R., & Bailey, J. (2000). Writing assessment rubric. Teaching Exceptional Children, 33.
Schirmer, B. R., Bailey, J., & Fitzgerald, S. M. (1999). Using a writing assessment rubric for writing development of children who are deaf. Exceptional Children, 65, 383–397.
U.S. Congress, Office of Technology Assessment. (1992). Testing in American schools: Asking the right questions (OTA-SET-519). Washington, DC: U.S. Government Printing Office.
Waltman, K., Kahn, A., & Koency, G. (1998). Alternative approaches to scoring: The effects of using different scoring methods on the validity of scores from a performance assessment (CSE Technical Report 488). Los Angeles.

Thank you!
Contact Information
Ashlee Lewis
Min Zhu
Xiaofang Zhang
Office of Program Evaluation, University of South Carolina