PLEPs for GIEPs that Make Sense
Tanya Morret, 2012
What are we really trying to do?

Answer these questions:
- Does this child need enrichment?
- Does this child need acceleration?
- Does this child need a combination of both?
Enrichment/Acceleration
What do we know?

From the GMDE/GWR we should know:
- Information on interests, challenges, passions, and abilities as described by parents, teacher(s), and the student
- Academic Strengths
- Cognitive Strengths
- Current Level of Functioning at Grade Level
- Academic Achievement Level – aligned to grade-level standards
- Educational Needs related to Strengths
- Intervening Factors (ELL, Disability, Cultural/Gender Bias, Socio-Economic Status)
What do we know?

From last year's GIEP:
- Information on interests, challenges, passions, and abilities as described by parents, teacher(s), and the student
- Academic Strengths
- Cognitive Strengths
- Current Level of Functioning at Grade Level
- Academic Achievement Level – aligned to grade-level standards
- Educational Needs related to Strengths
- Progress on Goals
- Support Services
How do we find it out? Systematic use of all four assessment types!
Assessment – It all begins here

Four Types of Assessments:
- Summative
- Formative
- Benchmark
- Diagnostic

Fold – list all four, identify their purpose and an example. Turn to your partner: A's explain summative and formative; B's explain benchmark and diagnostic.
Types of Assessments: Summative
- Assesses what students have had an opportunity to learn – after instruction
- Used to determine whether students have met the lesson, unit, grade-level, or course goals
- Used to set district and school-wide goals to improve student outcomes
- Examples: state tests (PSSA, Keystone Exams), mastery tests, unit or chapter tests, final exams
- The "Educational Autopsy" – cumulative in nature
Types of Assessments: Formative
- Assesses what students have had an opportunity to learn – during instruction
- Allows teachers to adjust teaching practices to improve student learning
- Should not be used to evaluate or grade students, but can provide ongoing feedback
- Formal or informal
- Examples: progress monitoring measures; ticket out the door, whiteboards, thumbs up/down; anecdotal progress reports; student work – evidence of all stages

Participants look over their sort – would you take any assessments out or leave any in?
Types of Assessments: Benchmark
- Given on the student's actual grade level
- Assesses end-of-grade-level expectations
- Administered 3 or 4 times per year to check "trajectory"
- Compares the student to same-age peers
- Used to evaluate the core, discover trends, identify at-risk students, and identify advanced-level students
- Examples: DIBELS/AIMSweb, 4Sight, Study Island, PVAAS Projections**

Participants look over their sort – would you change any of the assessments? Other examples: STAR Math and Reading, Terra Nova. Can be used to determine the likelihood that a student will meet a predetermined future goal (e.g., PSSA, Keystone). Looks for deficits in the student's knowledge at grade level. **PVAAS is not another assessment, but its data serve a similar purpose to a benchmark.
Types of Assessments: Diagnostic
- Provides insights into the student's strengths, needs, knowledge, and skills prior to further instruction
- Targeted for a specific audience
- Examples: WASI – Wechsler Abbreviated Scale of Intelligence; OLSAT – Otis-Lennon School Ability Test; Woodcock-Johnson III; Stanford-Binet; Classroom Diagnostic Tools (CDT); teacher-created diagnostics

SAS definition – ascertains, prior to instruction, students' strengths, weaknesses, knowledge, and skills. Usually occurs before instruction. Gives objective, standardized data on pupil skill levels in the context of the curriculum. Other examples: running records – reading accuracy and progress; GMADE – Group Mathematics Assessment and Diagnostic Evaluation (K-12 math placement, assessment, and measurement); WIAT – Wechsler Individual Achievement Test (linked with the WISC-IV; PreK-16); Woodcock-Johnson III – Tests of Cognitive Abilities; Gates-MacGinitie Reading Test (GMRT) – students' levels of reading achievement; GRADE – Group Reading Assessment and Diagnostic Evaluation (PreK, K-12, adult); MAP – Measures of Academic Progress.
An Effective System…
- finds the child
- has an assessment plan that is prescriptive
- has defined targets
- has a clear link to curriculum and instruction

How can we articulate the sequence in a way that makes sense, so the system works to find the child as early as possible? And when are those results reviewed?
Assessments should…
- Be current (within the last year)
- Establish strength areas
- Indicate present mastery level
- Help us measure growth
- Report progress on goals (maintenance)

Needs: Current – within one year. Indicate present mastery level – mid-terms/finals; must be linked to standards; what constitutes mastery; out-of-level testing; is the child working at one or more levels above the current grade? Help us measure growth – instructional level, PVAAS. Establish strength areas – not a standard list. Report progress on goals (maintenance).
Current
- Current (within the last year)
- Allows us to collect assessment information that has been maintained on the student
- Should be collected (not necessarily administered) as part of the gifted support teacher's case-managing duties
Establish areas of strengths

"Gifted kids' needs stem from their strengths – not their deficiencies."
"Twice-exceptional students' needs stem from both – documented giftedness and documented learning disability."
– Dr. Julia Roberts, Western Kentucky University, 2011

Prioritize the needs of the gifted child.
Academic Strength
- Refer to the GWR
- As evidenced by achievement in one or more academic areas

Hyperlink to parent input form.
Cognitive Strength
- Refer to the GWR
- As evidenced by aptitude/ability in one or more cognitive areas
Indicate present mastery level
- Must be linked to standards
- Frustration, Instructional, Independent
- Clear decisions about what constitutes mastery
- Use all four assessment types

How is this determined? Mastery must be linked to standards; in reading we use the terms Frustration, Instructional, and Independent. What constitutes mastery is district determined: if mastery is 93%, how do you address the needs of a child who, without instruction, already demonstrates knowledge of 73% of the information? Consider out-of-level testing (a sketch of this decision logic follows below). If the instructional level of the child is determined as part of the evaluation – and it should be – then this should be relatively simple to maintain. At what grade level is the child reading, writing, and solving mathematical problems? Allowing chronological age to be the sole determining factor is wrong; it will require a little work on our part. Once the initial level is determined, the goals and STLOs will help us maintain records of the student's progress. The problem is that we don't identify this level when we evaluate, so we start off with missing information. Why? Ask participants: why are we hesitant to establish a child's reading, writing, and math level at evaluation when the regulations clearly state that, to be identified, a child needs to be working a year or more above the current grade level in something? Let the audience share answers.
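A minimal sketch of the mastery-band and out-of-level logic described above. The cut scores (95%/90% for reading levels, a 93% mastery cut) and the 75%-of-mastery trigger are illustrative assumptions, not policy – districts set their own thresholds.

```python
# Illustrative sketch only: all thresholds below are assumptions.

def reading_level(percent_correct: float) -> str:
    """Classify performance into the Frustration/Instructional/Independent bands."""
    if percent_correct >= 95:
        return "Independent"
    if percent_correct >= 90:
        return "Instructional"
    return "Frustration"

def consider_out_of_level(pretest_percent: float, mastery_cut: float = 93.0) -> bool:
    """Flag a child who, before instruction, already approaches the mastery cut."""
    # e.g., a child at 73% on a pre-assessment against a 93% mastery cut
    # has little left to learn at this level; consider the next level up.
    return pretest_percent >= 0.75 * mastery_cut  # invented heuristic

print(reading_level(96))          # Independent
print(consider_out_of_level(73))  # True
```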
What can Summative Data tell us?

PSSA/Keystones, aka "On-Level Testing"
- Performance level against grade-level expectations
- Scaled score
- Gives us a percentile rank against all other PA students who took the same test in a given year
- Gives us an indication of how successfully they have achieved the knowledge and skills required at the grade/course level tested – but not an instructional level!
What can Summative Data tell us?

Common Assessments – On Level
- Performance level against lesson, unit, and grade-level expectations
- Gives us an achievement score (how much they have learned)

Common Assessments – Out of Level
- Performance against above-grade-level expectations
- Indicates percent mastery to determine frustration, instructional, or independent level

Examples: finals, mid-terms, tests, quizzes
What can Summative Data tell us?

Project-Based Assessments
- Performance against project/grade-level standards
- Indication of where a student is (working below, at, or above) in relation to the project/grade-level standard
What can Summative Data tell us?

Standards-Based Report Card
- Performance against grade-level standards
- Could provide an indication of where a student is (working below, at, or above) in relation to chronological grade-level standards – or where a child is working in relation to all standards!
What can Benchmark Data tell us?

Curriculum-Based Measures
Performance against grade-level standards – On Level
- Considered an indicator, not the primary assessment!
- Whether the student is maintaining positive growth and is on track to meet end-of-year expectations

Performance against grade-level standards – Out of Level
- Gives an indication of where the student might be performing against above-grade-level expectations
- Should be considered when the first benchmark indicates above-level performance (90th percentile?)

Measures learning along the way. If, in September, at the first on-level benchmark the student has already "achieved" at the 90th percentile of all students who have taken the measure, then we should consider assessing the child out of level (the next level up?) – see the sketch below.
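The rule suggested on this slide, reduced to a few lines. The 90th-percentile trigger comes from the slide itself; the function name and the one-grade-up step are assumptions for illustration.

```python
# Sketch of the out-of-level benchmarking rule; details are assumptions.

def next_benchmark_grade(current_grade: int, first_benchmark_percentile: float,
                         trigger: float = 90.0) -> int:
    """Pick the grade level for the student's next benchmark probe."""
    if first_benchmark_percentile >= trigger:
        return current_grade + 1  # assess out of level: the next level up
    return current_grade          # stay on level

# A 3rd grader at the 94th percentile on the September benchmark:
print(next_benchmark_grade(3, 94))  # 4 -> administer the grade-4 probe
```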
Positive or Negative Trajectory
What can Formative Data tell us?

Student Work – compared to grade-level exemplars
- Is the student meeting the expectations – before or after teaching (pre-assessment)?
- What skills/knowledge might be missing or need refining?
- Are missing skills on or above grade level (i.e., will they require specially designed instruction or not)?
- Needs to be collected along the way!
What can Diagnostic Data tell us?

Classroom Diagnostic Tools
- A profile of where a student is achieving against all standards in a particular course/grade level
- Math, Literacy, and Science
Measure Growth
- To know how far they have grown, we need to know where they start
- PA standard: one year of growth for one year of school!
- Need a starting line and a clear finish line

[Screenshot: PVAAS]
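The "starting line / finish line" point lends itself to a one-function sketch. Only the one-year-per-year standard comes from the slide; the grade-level-equivalent inputs are invented for illustration.

```python
# One year of growth for one year of school: growth is only measurable
# against a known starting point. Example values are invented.

def met_growth_standard(start_level: float, end_level: float,
                        expected_growth: float = 1.0) -> bool:
    """Did the student gain at least one grade-level equivalent this year?"""
    return (end_level - start_level) >= expected_growth

# Began the year reading at a 5.2 grade-level equivalent, ended at 6.5:
print(met_growth_standard(5.2, 6.5))  # True -- 1.3 years of growth
```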
What does Summative Data tell us?

PVAAS – PA Value-Added Assessment System
- Performance diagnostic – a longitudinal analysis of PSSA scores in all areas
- Not a true summative assessment, because it is not an assessment
- Indicates how much growth a cohort of students with similar characteristics made over the last year and the last three years
The PVAAS Methodologies

Looking Forward/Planning… PVAAS Projection Reports – for individual students and cohorts of students
Looking Back/Evaluation… Value-Added Growth Reports – for cohorts of students

The two methodologies look at two different issues. The value-added, or growth, methodology looks back; it helps schools evaluate how much students have gained: how much growth did students make in the past school year? The projection methodology looks forward; it helps schools plan for the future: are students on a path to proficiency? Both serve different purposes, and both are equally important for continuous school improvement.
Patterns of Growth (chart comparing cohorts A, B, and C)

So when a suggestion is made by the district, a valid question could be: how has this worked in the past? If cohorts of students who shared the same instructional decisions show growth, we can feel more confident about the recommendations for enrichment/acceleration we are making.
What does Summative Data tell us?

PVAAS – PA Value-Added Assessment System
- Not a true summative assessment, because it is not an assessment
- Indicates how much growth a cohort of students with similar characteristics made over the last year and the last three years
- The standard error of measure is too great to get a single-student measure of growth
Student History Report in PVAAS

Reports scale scores, percentiles, and NCEs. The Normal Curve Equivalent expresses where a score falls in a distribution; it is similar to, but not the same as, a percentile. (A conversion sketch follows below.)
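For reference, the standard percentile-to-NCE conversion (this is general psychometrics, not anything PVAAS-specific): NCE = 50 + 21.06z, where z is the normal deviate for the percentile rank.

```python
# NCE = 50 + 21.06 * z. The 21.06 scale factor makes NCEs of 1 and 99
# coincide with percentile ranks of 1 and 99; elsewhere the scales differ.
from scipy.stats import norm

def percentile_to_nce(percentile_rank: float) -> float:
    z = norm.ppf(percentile_rank / 100.0)
    return 50 + 21.06 * z

print(round(percentile_to_nce(50)))  # 50 -- the two scales agree at the median
print(round(percentile_to_nce(90)))  # 77 -- but diverge away from it
```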
What does Summative Data tell us?

PVAAS – PA Value-Added Assessment System
- Not a true summative assessment, because it is not an assessment
- Indicates how much growth a cohort of students with similar characteristics made over the last year and the last three years
- The standard error of measure is too great to get a single-student measure of growth
- Not available to parents – district access only (public site excepted)

Can be informative to a district when looking back over the effectiveness of its system, but not really valuable to the present-levels process.
What does Benchmark Data tell us?

Curriculum-Based Measures – Predictive
- Indicates how quickly the child is achieving when looked at longitudinally (historically)
- Whether growth is accelerated
- Be careful: you cannot just overlay graphs
What does Benchmark Data tell us?

PVAAS Projections
- Indicates predicted performance on next year's PSSA (eventually Keystones, AP, maybe even the SATs)
- Projection to Proficient and/or Advanced
What does Formative Data tell us?

Student Work/Teacher Report
- Record of how quickly the student has moved through the curriculum (pre-assessment/compaction record)
- Student work samples over time – hard evidence of gained skills
What does Diagnostic Data tell us?

Any diagnostic given repeatedly over time (annually?)
- Indication of how much growth is occurring
- Identifies gaps in the student's knowledge
- Indication of whether growth is accelerating or decelerating
- Be careful: you cannot just overlay graphs – scores from different measures or norm groups must be put on a common scale first (see the sketch below)
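One way to make the "don't overlay graphs" caution concrete: convert each measure to z-scores against its own norm group before comparing trends. The norm means and standard deviations below are invented for illustration.

```python
# Scores from two different probes can't be overlaid directly; convert
# each to a z-score against its own norm group first. Norms are invented.

def to_z(score: float, norm_mean: float, norm_sd: float) -> float:
    """Express a raw score in standard-deviation units of its norm group."""
    return (score - norm_mean) / norm_sd

# Fall score on one probe vs. spring score on a different, harder probe:
fall_z = to_z(120, norm_mean=90, norm_sd=25)     # +1.2 SD above fall norms
spring_z = to_z(140, norm_mean=110, norm_sd=25)  # +1.2 SD above spring norms
print(fall_z, spring_z)  # equal z's: growth kept pace with the norm group
```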
Progress on Goals
- Failure is an option: goals may not always be completed
- Include results from your objective criteria
- If you know where you start, progress will be clearly defined through all four assessment types

Look back over your notes – what would you use? "Proficient" on a rubric: the rubric does not have to be part of the GIEP, but it should be accessible to the GIEP team.
Not a standard list
- Driven by the individual child's strengths
- Not a static list determined by the district
- Should include all four assessment types
- Multiple measures needed for planning annual goals
- Will differ from the previous year

Office for Dispute Resolution cases: 1259 – gifted identification and re-testing; 1483 – PLEP and individualization of program; 1525 – PLEPs and GIEPs; 1768 – identification and multiple criteria (IQ test administered for the GMDE). Gifted White Paper, 2010. Remember: a gifted child may already know 70-90% of the information at any given grade level.
Dreams: Something you learned that you would like to see implemented in your district
Nightmares: Fears, hurdles, or challenges you foresee in using any of the information shared