Network 536 Principal’s Institute MOSL Overview August 28, 2013


Agenda:
- MOSL/ADVANCE Deadlines and Overview
- Baseline Assessments and Growth Measures
- MOSL Quiz
- NYC Performance Assessments – Overview
- NYC Performance Assessments – Tools for Norming with Staff

Facilitator notes: Talk about the shifts from content knowledge to content literacy; it is more holistic and less about skills in isolation as we move toward college and career readiness.

CFN 536 – Gerard Beirne, Network Leader
Center for Education Innovation – Public Education Association
Facilitators: Antonio Arocho and Niobe Hayes

Fall Timeline (see: ADVANCE TOOLS and TIMELINES)

August:
- MOSL Menu Interactive Tool released.
- Advance Web Application 1.0 released.
- Sample NYC Performance Assessments available.
- Continued MOSL training webinars and resources.

September:
- Committees recommend Local Measure; principal approves (or chooses default). (9/9)
- Principals select State Measure (where the option exists).
- Teachers administer and score baseline assessments.

October:
- Schools submit baseline assessment data (optional) by 10/31; the deadline is 10/4 if the data is to be included in DOE targets.

November:
- DOE targets for individual student goal-setting released (goal-setting only). (11/15)
- Goals finalized (goal-setting only).

Facilitator notes: We can add specific dates to this slide as they are released. Review the fall timeline. All of these items will be discussed throughout the day, with more specific details and dates provided.

MOSL Scoring Requirements

In accordance with state teacher evaluation law, teachers may not score their own end-of-year assessments. Schools may opt to either:
- Have another teacher in the school score both baseline and end-of-year assessments, for consistency; or
- Have the classroom teacher score the baseline and another teacher in the school score the end-of-year assessment.

Facilitator notes (2 min): Discuss scoring requirements. Be sure to note that this will require some work from schools in terms of developing new scoring protocols for teacher-scored assessments.

Who Is Eligible?

Take out your MOSL guide and your MOSL FAQ, and look at page 5 of the FAQ. Note that teachers who teach less than 40% are rated S or U, but are still subject to the Danielson rubric.

Baseline Assessments

Where choices exist, schools may opt to use a DOE-recommended baseline assessment or a school-selected baseline. Remember:
- Baselines only need to be selected for a few state assessments and AP exams.* All other 3rd-party and Performance Assessments come with baseline assessments.
- Where baselines must be selected, the PRINCIPAL chooses.
- If the selected baseline is not on the DOE list, schools must use goal-setting. DOE targets provided to those schools will not include results from the baseline.

* Principals also have the option of selecting an alternative baseline assessment for Discovery Math in K only.

Facilitator notes (3 min): Review the information about baselines. Draw attention to the fact that only principals can select baselines (there are very few instances where a committee would have to select a baseline, but where this comes up, the principal needs to select, e.g., if using the 4th Grade Science Assessment for Local but not State, or AP exams in HS). Draw attention to the fact that goal-setting MUST be the growth measurement if the baseline is school-selected.

When does a principal need to choose a baseline?

The following assessments have required baselines (Assessment → Baseline):
- All NYC Performance Assessments → NYC Performance Assessment (fall administration)
- Performance Series → Performance Series (fall administration)
- State Assessments: grades 4-8 ELA and Math → ELA/Math Assessment (prior year)
- Discovery Math, 1st and 2nd Grade → Discovery Math (fall administration)

Look at page 9 of your FAQ.

When does a principal need to choose a baseline?

Assessment → Baseline choices:
- Discovery Math: Kindergarten (in K-2 schools only) → Discovery Math (fall administration), or school-selected baseline
- Advanced Placement (AP) Exams → PSAT (State Assessment: 8th Grade if PSAT not available), or school-selected baseline (a school-selected baseline should be chosen by schools that want to use goal-setting, since PSAT fall administration results will not be available until the November 15 goal-setting deadline)
- State Assessments: 3rd Grade ELA and Math → NYC 3rd Grade ELA/Math Performance Assessment (fall administration)
- State Assessments: 4th Grade Science → NYC 4th Grade Science Performance Assessment (fall administration), or State Assessment: 3rd Grade Math (prior year)
- State Assessments: 8th Grade Science → NYC 8th Grade Science Performance Assessment (fall administration), or State Assessment: 7th Grade Math (prior year)
- State Regents Exams → PSAT (State Assessment: 8th Grade if PSAT not available; State Assessment: 7th Grade if Regents administered in 8th grade)
- NYSESLAT → NYSESLAT (prior year), or Lab-R if prior-year NYSESLAT not available
- NYSAA → NYSAA (prior year), or SANDI if prior-year NYSAA not available

Review: Growth Measurement Options

After choosing assessments and target populations, committees must select one of two growth measurements for each assessment.
- Growth models: The DOE calculates student targets, results, and teacher scores. Results are shared after assessments have been administered, so student growth can be compared to similar students' performance on assessments.
- Goal-setting: The DOE provides targets for how students will perform on assessments, which principals and teachers can adjust based on their knowledge of students. Principals approve targets.

Facilitator notes (2 min).

Content Review: Growth Models

Growth models allow us to compare the progress that students make in a year to the progress of similar students. In the State Growth Model, similar students are defined by four characteristics, measured at the student and classroom levels:
- Academic history
- Economic disadvantage
- Disability status
- Language learner status

Citywide models account for characteristics similar to the State model's. Growth models control for the degree to which students are expected to make gains given their achievement history and demographic characteristics. (A minimal illustrative sketch of this comparison idea follows below.)

Facilitator notes (3 minutes).
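To make the comparison idea above concrete, here is a minimal sketch in Python. It is an editorial illustration only, not the DOE's State or Citywide growth model: the grouping keys, field names, and sample scores are all hypothetical, and the real models use statistical procedures well beyond a simple peer-group average.

```python
# Illustrative sketch only -- NOT the DOE's actual growth model. It shows the
# general idea described above: compare each student's end-of-year result to
# the typical result of "similar" students (same baseline band and the same
# demographic flags), so a teacher is credited for growth beyond expectation.
# All field names and sample data are hypothetical.

from statistics import mean
from collections import defaultdict

students = [
    # (baseline_band, econ_disadvantage, disability, language_learner, end_of_year_score)
    ("low",  True,  False, True,  62),
    ("low",  True,  False, True,  55),
    ("mid",  False, False, False, 78),
    ("mid",  False, False, False, 84),
]

# Group students who share the four "similar student" characteristics.
peer_scores = defaultdict(list)
for band, econ, dis, ell, score in students:
    peer_scores[(band, econ, dis, ell)].append(score)

# A student's growth credit = actual score minus the peer-group average
# (a stand-in for the model's predicted/expected score).
for band, econ, dis, ell, score in students:
    expected = mean(peer_scores[(band, econ, dis, ell)])
    print(f"group={(band, econ, dis, ell)} actual={score} "
          f"expected={expected:.1f} growth={score - expected:+.1f}")
```

In the actual models, the expected score also reflects classroom-level characteristics and the student's full achievement history, and, per the slides above, the DOE then converts the resulting growth into student targets, results, and teacher scores.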

Content Review: Growth Models

Benefits:
- Does not introduce additional work in schools.
- Enables schools to compare their students' performance to similar students.
- Gives teachers credit for the degree to which students exceed predicted growth.
- Better able to account for unexpected outcomes resulting from unfamiliar, new assessments.

Challenges:
- Growth model score results are not available until after assessments have been administered (i.e., the following spring/summer).

Facilitator notes (5 minutes).

Goal-Setting: Process Overview

1. Administer baseline assessment: The baseline assessment is administered (not required for all assessments). Report baseline assessment results.
2. DOE sends predicted student targets: The DOE sends predictions for how individual students will perform. Predictions are based on baseline performance, student achievement history, and student demographic characteristics.
3. Teachers review DOE-predicted targets: Teachers may choose to adjust these targets based on additional information about their students. Teachers submit student targets to the principal.
4. Principals approve or adjust targets: The principal (or designee) reports finalized student targets.
5. Administer end-of-year assessment: The end-of-year assessment is administered to students.
6. Teachers' ratings: Teachers' HEDI ratings are calculated with a conversion chart based on students' performance on outcome assessments relative to their targets. (A hypothetical worked example of this conversion follows below.)

Facilitator notes (10 minutes).
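Step 6 above turns the share of students who met their targets into a HEDI rating via a conversion chart. The sketch below is a hypothetical illustration of that lookup only: the cut points, helper function names, and sample scores are invented for this example and are not the State's actual conversion chart.

```python
# Hypothetical sketch of the conversion-chart step described above.
# The cut points below are invented for illustration; the real HEDI
# conversion chart is set by the State, not by this example.

def percent_meeting_targets(results):
    """results: list of (actual_score, target_score) pairs for one teacher's students."""
    met = sum(1 for actual, target in results if actual >= target)
    return 100.0 * met / len(results)

def hedi_rating(pct_met):
    # Invented cut points -- for illustration only.
    if pct_met >= 90:
        return "Highly Effective"
    if pct_met >= 70:
        return "Effective"
    if pct_met >= 50:
        return "Developing"
    return "Ineffective"

class_results = [(78, 75), (62, 70), (88, 80), (71, 70)]  # hypothetical students
pct = percent_meeting_targets(class_results)
print(f"{pct:.0f}% of students met their targets -> {hedi_rating(pct)}")
```

Note how a percentage-based conversion like this treats a student who misses a target by one point the same as one who misses by twenty; that is the limitation flagged under "Challenges" on the next slide.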

Goal-Setting

Benefits:
- Particularly valuable for teachers/schools with unique student populations or high mobility.
- Allows teachers and principals to individually tailor student goals.

Challenges:
- Requires additional time/resources.
- A teacher's rating is based on the percentage of students who meet their targets, but does not consider the degree to which students fall below or exceed their targets.
- Targets must be set early in the school year, possibly before much diagnostic information is gathered.
- Setting goals may be challenging if teachers are not familiar with the comparability between assessments, or if assessments are new or changing.

Facilitator notes (5 minutes).

Goal-Setting Considerations

- This is not the same as the goal-setting you may typically see in schools.
- Goals are scored against a state conversion chart, which makes the target-setting process difficult and non-intuitive.
- Before considering goal-setting, make sure you understand the work and additional training this entails.

Goal-Setting: Fall Implementation Steps

By 9/9:
- School Local Measure Committees recommend whether or not to use goal-setting for the Local Measure. Principals decide the same for the State Measure (where the option exists).

September:
- Schools that choose goal-setting norm on expectations across classrooms.
- Schools administer baseline assessments.
- 9/24 – DOE target reference tables and goal-setting worksheets released on the Intranet.

October:
- By 10/4 – Schools submit results of baseline assessments if they want the data included in DOE-suggested targets (otherwise submit by 10/31).
- By 10/15 – Principals submit grade/school-level goals to the superintendent for review.
- Teachers start the goal-setting process using baseline data and reference tables.

November:
- 11/1 – Schools can access goal-setting worksheets, which include baseline results and DOE-suggested targets.
- 11/15 – Principals finalize teacher-set goals. Superintendents finalize school/grade goals.

Facilitator notes (3 min): Remind participants that, apart from these dates, they should consider the additional time needed to norm on target setting, gather background data for adjusting goals, print forms, complete targets, approve targets, etc.

Goal-Setting – Who Sets Goals?

Teachers set goals for students in their classroom (target population: individual), but when goals go beyond an individual teacher's students (target population: grade or school), the principal sets the goal and the superintendent approves.

- Individual level (by student, by subgroups of students, or whole-class goal): teachers set; principals approve.*
- Grade level (whole-grade goal): principals set; superintendents approve.
- School level (whole-school goal): principals set; superintendents approve.

* If a teacher fails to submit goals, the principal will set the goals for that teacher. The principal should not simply "approve," but rather work with the teacher on adjusting targets as necessary.

Facilitator notes (2 minutes): There are several goal types by target population; we will go into detail in a minute. The most important thing here is the "set" vs. "approve" distinction across target populations. For an "individual" target population, the teacher sets, discusses, and adjusts targets with the principal, and the principal approves. However, and this is new information, for school and grade target populations, the principal sets the goals and the superintendent approves.

MOSL Quiz

Please use your MOSL FAQ document to complete the MOSL quiz (orange paper). The first principal finished will receive a prize!

Performance Assessments – Overview

- NYC Performance Assessments are available in multiple K-12 subjects, some as optional, others as required. These include baseline and end-of-year assessments (no additional baseline selections needed).
- If the principal and/or committee chooses the default option, the mandated NYC Performance Assessments in grades 4-8 ELA for the Local Measure do not need to be administered, as all teachers in the school will use the default measure.
- There are Performance Assessments required in some grades/subjects for the State Measure. Those assessments are not affected by the selection of the default option for the Local Measure.
- In most grades/subjects with Performance Assessments, there is one type of assessment. However, in K-8 ELA, committees have two options within the Performance Assessment category:
  - Running Records NYC Performance Assessment (using one of three reading programs: TC, DRA, or F&P)
  - Writing-based NYC Performance Assessment

Facilitator notes (3 min): Review all points on the slide. Focus on the optional vs. mandatory nature of performance assessments and how choosing the default option removes the 4-8 ELA Local Measure requirement (but not in other places where a performance assessment is mandated for the State Measure). Focus on the fact that there are two options for the K-8 ELA performance assessments; we will review both today.

Performance Assessments: NYC Performance Assessments

Considerations: where they exist on the assessment list.
- ELA (writing-based): K-12
- Math: Grade 3, Integrated Algebra
- Social Studies: 6-8, Global History, U.S. History
- Science: 4, 6-8, Living Environment

In K-8 ELA, schools also have the choice of Running Records. All NYC Performance Assessments (except ELA) are translated into 9 languages. They require norming on rubrics at the school level. Samples are available for download (final versions by 9/9).

Facilitator notes (2 min): These were all on the old assessment list except 8th grade science, which was added in the new version. Remind people on this slide that we are now talking about the "other" type of performance assessment; this type exists across several grades and subjects, whereas Running Records are specific to ELA K-8.

Performance Assessments: Implementation Steps

By 9/9:
- Final performance assessments released online.
- School Local Measure Committees recommend whether or not to administer Performance Assessments where optional. The principal approves or chooses the default.
- For ELA K-8, principals and committees decide between Running Records and a writing-based performance assessment.
- Principals plan necessary teacher support around administration and scoring.

September:
- Schools develop scoring protocols.
- Schools norm teachers on Performance Assessment rubrics to ensure a shared understanding of the demands of the rubric and of scoring across student work.
- Teachers administer Performance Assessments.

October:
- Schools score and scan Performance Assessments. Scores must be submitted by 10/4 for inclusion in DOE targets (otherwise by 10/31).

Facilitator notes (2 min): Review the points on the slide. The timeline isn't different for performance assessments, but it does include some additional work in terms of norming in September.

Deep Dive: NYC Performance Assessments – Activity #1: Task Design Investigation

Objective: Participants will be able to interpret a sample NYC Performance Assessment by analyzing the requirements for students and predicting students' strengths and struggles.

Activity:
- Preview the sample NYC Performance Assessment and consider the four key questions on the handout.
- Use the chart to take notes while you're reviewing the task and texts.
- Share your reflections with your table.

Facilitator notes (15-20 min; more importantly, Activity #1 and Activity #2 should take an hour total):
- Distribute/identify materials for participants: sample task, rubric, Handout 1.
- Introduce the Task Design activity, e.g.: "We're going to take a few minutes to review a sample NYC Performance Assessment. As you're reviewing the task, we'd like you to pay attention to two key questions. First, what will students need to know in order to complete the assessment? Second, what will they have to do? After considering those two questions, imagine how students in your schools will approach this task. What will show off their strengths? What will they struggle with?"
- Ask participants to take 5-7 minutes reviewing the task and texts and taking notes on the Handout 1 chart.
- Invite participants to share their responses for 5-7 minutes in pairs or aloud to the whole group.
- Wrap up the conversation and transition smoothly into Norming and Scoring.

Deep Dive: NYC Performance Assessments – Activity #1: Task Design

This slide is for illustrative purposes only. The handout is provided, but the slide may be used when introducing the activity to let participants know which handout they should be looking at.

Deep Dive: NYC Performance Assessments – Activity #2: Norming & Scoring

Objective: Participants will be able to apply the rubric as an evaluative tool for scoring student writing and establish a shared understanding of the expectations of the NYC Performance Assessments.

Activity:
- Individually read and score Sample 1 according to the rubric. For this experience, you are only asked to grade the rubric dimension of "Focus: Position"; then discuss your findings with your table.
- Refer to evidence in the sample of student work and the rubric to justify your choices and seek consensus.
- Repeat the process for Sample 2 (time permitting).

Facilitator notes (40-45 minutes; more importantly, Activity #1 and Activity #2 should take an hour total):
- Distribute/identify materials for participants: sample student work, rubric, Handout 2.
- Introduce the Norming and Scoring activity, e.g.: "School teams will be responsible for norming their scoring practices by using the rubric. It is important that all scorers are calibrated to the rubric to ensure fair grading practice across students, teachers, and schools. We're going to have an abbreviated norming session to experience a bit of what schools will be doing in the fall."
- Distribute Sample 1 to the participants. Ask participants to read the student sample carefully alongside the rubric and then to score the sample based on their interpretation of the rubric. The student sample should receive a unique score for each trait/dimension of the rubric (rather than a single, holistic score).
- After scoring the sample individually, ask participants to share their scores with the whole group, or in pairs/trios. Using the chart as a guide, ask participants to share out and justify their scores using evidence from the rubric and the student work sample. Guide the conversation toward consensus for each score. Refer to the anchor paper commentary to negotiate any disagreements.
- After norming Sample 1, repeat the process for Sample 2.
- Time permitting, consider the following discussion questions: How did your scoring change between Sample 1 and Sample 2? How is the rubric similar to or different from others you/your schools have used? What are some of the implications for schools' scoring processes, given the restrictions on teachers scoring their own students?
- Close the conversation on norming practices after 45 minutes. If the table is behind schedule, Sample 2 can be cut short.
- Transition into the next activity by asking participants to select one sample that they'd like to investigate in more depth.

Deep Dive: NYC Performance Assessments – Activity #2: Norming & Scoring

The handout chart covers 12th Grade ELA Sample A and Sample B, with columns for My Score, Normed Score, and Evidence from the Text across the rubric dimensions:
- Focus: Position
- Development: Elaboration
- Development: Textual Analysis
- Development: Counterclaims
- Reading
- Organization
- Conventions

This slide is for illustrative purposes only. The handout is provided, but the slide may be used when introducing the activity to let participants know which handout they should be looking at.

Deep Dive: NYC Performance Assessments – Activity #3: Planning for Instruction

Objective: Participants will be able to analyze a sample of student work within a specific element of the rubric to consider the implications of assessment practices for instruction and the development of student skills.

Activity:
- Choose one dimension/trait (in this case, "Focus: Position") from the rubric to analyze in depth.
- List the rubric dimension and key characteristics of the rubric, then use this as a lens to reread and analyze the sample.
- With this new lens, consider the student's strengths and struggles as well as strategies that might support the student's growth.

By analyzing student work samples through the lens of the rubric, teachers will be able to identify strengths and struggles in student performance that inform their instruction throughout the year and support student growth.

Facilitator notes:
- Ask participants to select one sample of student work they'd like to examine in more depth. Then ask participants to select a single dimension/trait from the rubric to use as a focus. Using the chart as a guide, ask participants to write down the key points or essential attributes of the rubric dimension they selected.
- After selecting the sample and the trait, ask participants to reread the sample of student work, looking only for evidence of their selected trait (ignoring all others). They should take notes on their findings, including the student's strengths, struggles, and potential strategies for supporting the student.
- Ask participants to debrief their findings, sharing their ideas with others who analyzed the same student sample.
- As participants complete the analysis, invite them to review other potential tasks and discuss the range of tasks available to schools.

Deep Dive: NYC Performance Assessments – Activity #3: Planning for Instruction

This slide is for illustrative purposes only. The handout is provided, but the slide may be used when introducing the activity to let participants know which handout they should be looking at.

Deep Dive: NYC Performance Assessments – Activity #4: Implications

Objective: Participants will reflect on the cycle of assessment and identify the implications of this assessment option for a variety of stakeholders. They will then plan next steps.

Activity:
- As you review additional sample tasks, consider the implications of the NYC Performance Assessments for various stakeholders.
- As a result of today's activities, what are the next steps?
- What questions, comments, or considerations are you contemplating?

Facilitator notes (15 min):
- Distribute/identify materials for participants: additional tasks for review, Handout 4.
- Transition from analyzing student work into reviewing additional tasks. Participants may want to consider the following questions: How are these tasks sequenced? How do expectations change from year to year? How do the tasks address the standards? How accessible are the tasks to students at each grade range?
- Ask participants to consider the implications of their experience with the Performance Assessments for the various stakeholders. What are the next steps for each group? Encourage participants to discuss with one another at their tables as they debrief the activities.
- To close the conversation, ask participants to share with the table one item on their "To Do" list as a result of the day's activities.

Consider the implications of the NYC Performance Assessments for various stakeholders. As a result of today's activities, what are the next steps? What questions, comments, or considerations are you contemplating?

Please include your name on any index card questions so that we may respond via email if we do not respond today.