Georgia Alternate Assessment Eighth Annual Maryland Conference October 2007 Melissa Fincher, Georgia Department of Education Claudia Flowers, UNCC.



GAA Purpose
–To ensure all students, including students with significant cognitive disabilities, are provided access to the state curriculum
–To ensure all students, including students with significant cognitive disabilities, are given the opportunity to demonstrate their progress in learning and achieving high academic standards

Overview of the GAA The GAA is a portfolio of student work provided as evidence that a student is making progress toward grade-level academic standards. Evidence provided must show instructional activities and student work aligned to specific grade-level standards. The portfolio system is flexible, allowing for the diversity of the students participating in the GAA.

GAA Core Belief and Guiding Philosophy
–All students can learn when provided access to instruction predicated on the state curriculum
–Educators are key: significant training and support surrounding curriculum access is critical
–Test development and technical documentation are ongoing and include documentation of decisions surrounding development and implementation
–Technical expertise is important: Georgia’s Technical Advisory Committee, augmented with an AA-AAS expert

Additional Resources Georgia took advantage of
–Learning from other states
–The growing understanding in the field of alternate assessments and what students with significant cognitive disabilities can do
–US ED’s offer of technical assistance; we elected to focus on technical documentation
–Invitation to have the National Alternate Assessment Center (NAAC) Expert Review Panel review documentation
–National Center on Educational Outcomes (NCEO)
–Peer Review

Description of GAA Structured Portfolio –a compilation of student work that documents, measures, and reflects student performance and progress in standards-based knowledge and skills over time

Overview of the GAA
English/Language Arts (Grades K–8 and 11)
–Entry #1: Reading Comprehension
–Entry #2: Communication – Writing or Listening/Speaking/Viewing
Mathematics (Grades K–5)
–Entry #1: Numbers and Operations
–Entry #2: Choice from Measurement, Geometry, Data Analysis and Probability, or Algebra (grades 3–5)

Overview of the GAA
Mathematics (Grades 6–8 and 11)
–Entry #1: Numbers and Operations or Algebra*
–Entry #2: Choice from Measurement, Geometry, Data Analysis and Probability, or Algebra
Science (Grades 3–8 and 11)
–Entry #1: Choice from blueprint, paired with a Characteristics of Science standard
Social Studies (Grades 3–8 and 11)
–Entry #1: Choice from blueprint
*The Algebra strand is mandated for grade 11.

Overview of the GAA There are two collection periods for each entry over the course of the school year; the minimum time between collection periods is 3 weeks. Teachers collect evidence of student performance within tasks aligned to a specific grade-level content standard. This evidence shows the student’s progress toward those standards. Each entry comprises four pieces of evidence:
–Primary and Secondary for Collection Period 1
–Primary and Secondary for Collection Period 2

Types of Evidence
–Primary Evidence – demonstrates knowledge and/or skills either through work produced by the student or by any means that shows the student’s engagement in instructional tasks.
–Secondary Evidence – documents, relates, charts, or interprets the student’s performance on similar instructional tasks.

Sample Entry: Permanent Product
[Sample evidence page dated 1/31, scored 100%] Captioned photos clearly show the student in the process of the task as well as his completed product. The captions describe each step of the task and annotate the student’s success.

Rubric Dimensions
–Fidelity to Standard – the degree to which the student’s work addresses the grade-level standard
–Context – the degree to which the student work is purposeful and uses grade-appropriate materials in a natural/real-world application
–Achievement/Progress – the degree of demonstrated improvement in the student’s performance over time
–Generalization – the degree of opportunity given to the student to apply the learned skill in other settings and with various individuals across all content areas assessed

Rangefinding and Scoring
Rangefinding took place in Georgia
–committee of general and special educators
–scored a representative sample of portfolios
–provided guidance and a rationale for each score point assigned, which were used to create scoring guides and training/qualifying sets
Scoring took place in Minnesota
–GaDOE staff on site
–15% read behind and other typical quality control checks

2006 – 2007 Entries by Grade

Standard Setting
Three performance/achievement standards called “Stages of Progress”:
–Emerging Progress
–Established Progress
–Extending Progress
Descriptions written by development committee

Definitions of Stages of Progress

Standard Setting Method
Portfolio Pattern Methodology
–combined critical aspects of the Body of Work (Kingston, Kahl, Sweeney, & Bay, 2001) and the Massachusetts (Wiener, 2002) models
–holistic view of student work and direct tie to the analytic rubric as applied to performance levels
–standards set by grade bands: K–2, 3–5, 6–8, and 11
–articulation committee reviewed recommendations across grade bands and content areas

Individual Student Report (Side 1 and Side 2)

First Year Look at Data
Reliability
–Potentially the largest source of error is the test administrator
–Inter-rater agreement (% exact agreement, % adjacent agreement, & Kappa)
–Correlation between scores for Entry 1 and Entry 2
–G-study (persons, items, raters)
–Comparability of scores across years (planned): stability over time

Inter-rater Agreement (based on 15% read behind; % represents exact agreement)

Dimension             ELA Entry 1  ELA Entry 2  Math Entry 1  Math Entry 2  Science  Social Studies
Fidelity to Standard  86.4%        84.7%        88.4%         88.3%         88.2%    89.8%
Context               88.0%        86.3%        88.9%         89.6%         90.0%    91.1%
Progress              75.1%        74.4%        73.3%         72.7%         76.8%    76.0%
Generalization        85.3%

Kappa (based on 15% read behind)

Dimension             ELA Entry 1  ELA Entry 2  Math Entry 1  Math Entry 2  Science  Social Studies
Fidelity to Standard  [values not captured in transcript]
Context               [values not captured in transcript]
Progress              [values not captured in transcript]
Generalization        .76
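The exact agreement, adjacent agreement, and Kappa statistics reported in these slides can all be computed from paired rater scores. A minimal sketch in Python; the rater score vectors below are hypothetical illustrations, not actual GAA scoring data:

```python
# Illustrative inter-rater statistics for two raters scoring the same
# portfolios on a 1-4 rubric dimension. Hypothetical data, not GAA results.
from collections import Counter

def exact_agreement(r1, r2):
    # Proportion of portfolios where both raters gave the same score.
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def adjacent_agreement(r1, r2):
    # Proportion of portfolios where scores differ by at most one point.
    return sum(abs(a - b) <= 1 for a, b in zip(r1, r2)) / len(r1)

def cohens_kappa(r1, r2):
    # Cohen's kappa: observed agreement corrected for chance agreement.
    n = len(r1)
    p_observed = exact_agreement(r1, r2)
    c1, c2 = Counter(r1), Counter(r2)
    p_chance = sum(c1[k] * c2[k] for k in set(r1) | set(r2)) / (n * n)
    return (p_observed - p_chance) / (1 - p_chance)

rater1 = [4, 3, 2, 4, 1, 3, 2, 4, 3, 2]
rater2 = [4, 3, 3, 4, 1, 2, 2, 4, 3, 1]
print(exact_agreement(rater1, rater2))
print(adjacent_agreement(rater1, rater2))
print(round(cohens_kappa(rater1, rater2), 2))
```

Kappa is typically well below the raw agreement percentage, since it discounts agreement expected by chance alone, which is one reason both are reported side by side above.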

Correlation Between Entry 1 and Entry 2

Dimension             ELA   Math
Fidelity to Standard  [values not captured in transcript]
Context               [values not captured in transcript]
Achievement/Progress  .57   .59
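The Entry 1 vs. Entry 2 figures above are ordinary Pearson correlations between paired dimension scores. A minimal sketch, again using hypothetical score vectors rather than actual GAA data:

```python
# Pearson correlation between Entry 1 and Entry 2 scores on one rubric
# dimension, computed for the same students. Hypothetical data.
import math

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

entry1_scores = [3, 4, 2, 4, 3, 1, 2, 4]
entry2_scores = [3, 3, 2, 4, 4, 2, 2, 3]
print(round(pearson(entry1_scores, entry2_scores), 2))
```

A moderate positive correlation (like the .57 and .59 reported) suggests the two entries measure related, but not identical, aspects of student progress.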

Sources of Evidence for Validity
–Inter-correlation among the dimensions
–Content and Alignment – Fidelity to Standard
–Consequential Validity Study: curriculum access (baseline completed); alternate assessment (planned)
–Comparability of scores across years (planned)
–Alignment study (Links for Academic Learning): content coverage (depth and breadth); differentiation across grades (vertical relationship); barriers to learning (bias review); alignment of instruction

ELA Entry 1 Correlations for Grade 3

Correlation           Fidelity to Standard  Context  Achievement/Progress  Generalization
Fidelity to Standard  1.0
Context               [values not captured in transcript]
Achievement/Progress  [values not captured in transcript]
Generalization        [values not captured in transcript]

The Challenge of Documenting Technical Quality of Alternate Assessments
–Diversity of the group of students being assessed and how they demonstrate knowledge and skills
–Often “flexible” assessment experiences
–Relatively small numbers of students/tests
–Evolving view/acceptance of academic curriculum (i.e., learning experiences)
–The high degree of involvement of the teacher/assessor in administering the assessment (Gong & Marion, 2006)
–A lack of measurement tools for evaluating non-traditional assessment approaches (e.g., portfolios, performance tasks/events), demonstrating a need to expand the conceptualization of technical quality (Linn, Baker, & Dunbar, 1991)
NHEAI / NAAC

Georgia Technical Considerations
Heavy investment in training special educators
–Curriculum access training for teachers started two years before implementation of the assessment
–Four-step process for aligning instruction to grade-level standards
Teacher Training (Regional and Online)
–Regional workshops and online presentations
–Electronic message/resource board
–Sample activities vetted with curriculum staff

GAA Technical Documentation
Our documentation includes
–Rationale for our AA approach (portfolio)
–Consideration of the purpose of the assessment and its role in our assessment system
–Consideration of who the students are and how they build and demonstrate their achievement
–Consideration of the content assessed and establishment of alignment, including monitoring plans

GAA Technical Documentation
Our documentation includes
–Development of the assessment, including rationale for key decisions
–Training and support for educators responsible for compiling portfolios, from both a curriculum access and an assessment perspective
–Consideration and analysis of potential test bias
–Scoring and reporting procedures, including steps taken to minimize error

GAA Technical Documentation
Our documentation includes
–Our plans for specific validity studies
–Traditional statistics you would expect to find in technical documents: inter-rater reliabilities; score point distributions with standard deviations; correlations of rubric dimensions by content area
Our goal is to collect validity evidence over time and systematically document the GAA “story”

Next Steps
–Continue refining the program
–Continue training and support of teachers, expanding the resource board with adapted lessons and materials
–Conduct a series of validity studies in collaboration with NAAC and other states (GSEG)