• Elaine Mormer, PhD, University of Pittsburgh
• Deborah Moncrieff, PhD, University of Pittsburgh
• Deborah Dixon, MA, ASHA, Director, School Services
• Janet Deppe, MS, ASHA, Director, State Advocacy

• Personnel preparation programs must develop valid and reliable student assessment instruments (ACAE standards, 2005; CAA standards, 2013).
• There is no standardized approach to assessing clinical skills and knowledge across AuD and SLP programs.
• Several states have developed performance measures for classroom teachers, but not for audiologists and speech-language pathologists (ASHA, 2012).
• Development of a student/clinician performance evaluation should follow a systematic approach, using appropriate resources.

We applied a systematic approach to design an instrument to assess and track the knowledge and skills that AuD and SLP students need to provide services to high-need children in underserved populations. The steps were as follows:
1. Identification of relevant skills and knowledge (i.e., the items to be rated)
2. Application of a reliable and valid rating scale
3. Implementation of a data collection mechanism

1. Identification of a valid skill and knowledge list
   • Identified resource materials (e.g., publications, standards; see next panel)
   • Solicited input from relevant constituents (e.g., school personnel, consumers)
   • Conducted peer review and incorporated feedback
   • Piloted items with clinical instructors
2. Application of a valid and reliable rating scale (see the sketch after this list)
   • Determined the scale format (e.g., number of points vs. checklist)
   • Created anchor values
   • Created a clearly defined descriptor for each scale point
   • Considered priorities among psychometric characteristics
3. Implementation of a manageable data collection mechanism
   • Format: paper-based vs. online?
   • Availability of support staff for distribution and collection?
   • Use of existing online systems (e.g., Typhon AHST, E*Value)
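To make step 2 concrete, one way to encode an anchored scale is shown below. This is a minimal sketch only: the four point labels and descriptors are hypothetical, not the instrument's actual anchors.

```python
# Hypothetical 4-point anchored rating scale; each point pairs an anchor
# value with a clearly defined behavioral descriptor.
from dataclasses import dataclass

@dataclass(frozen=True)
class ScalePoint:
    value: int       # anchor value
    label: str       # short anchor label
    descriptor: str  # defined behavior for this scale point

RATING_SCALE = [
    ScalePoint(1, "Emerging",   "Performs the skill only with direct modeling."),
    ScalePoint(2, "Developing", "Performs the skill with frequent cueing."),
    ScalePoint(3, "Proficient", "Performs the skill independently in routine cases."),
    ScalePoint(4, "Advanced",   "Adapts the skill independently to novel cases."),
]
```

Writing the descriptor into the form itself, rather than leaving raters to interpret bare numbers, is one of the simplest ways to improve inter-rater agreement.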

Produced by the American Speech-Language-Hearing Association's (ASHA) Value-Added Project Team in response to member requests and to rapidly developing state-level policies regarding accountability measures for school-based speech-language pathologists (SLPs). The objective was to identify a value-added model designed for SLPs, or one that specifically accounted for the unique contributions of SLPs. The team reviewed the literature, attended seminars, and conducted a peer review to obtain input on value-added models and assessments from related professional organizations, members, pertinent stakeholders, and researchers.

• Address cultural/linguistic variations in screening and assessment activities
• Calculate classroom reverberation times (a worked example follows this list)
• Participate effectively on a multidisciplinary team
• Minimize barriers to curriculum access
• Assist educational team members in making referrals
• Train and supervise support personnel
• Write appropriate IEP goals, considering academic, behavioral, and developmental issues
• Counsel regarding transition planning
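The reverberation item typically draws on the Sabine equation, RT60 = 0.161 · V / Σ(Si · αi), where V is the room volume and each surface contributes its area times its absorption coefficient. A minimal sketch, with hypothetical room dimensions and absorption coefficients:

```python
# Sabine estimate of reverberation time (RT60), metric units.
# Room dimensions and absorption coefficients are hypothetical.

def rt60_sabine(volume_m3, surfaces):
    """surfaces: list of (area_m2, absorption_coefficient) pairs."""
    total_absorption = sum(area * alpha for area, alpha in surfaces)
    return 0.161 * volume_m3 / total_absorption  # 0.161 s/m is the metric constant

# Example: 9 m x 7 m x 3 m classroom (volume = 189 m^3)
surfaces = [
    (63.0, 0.15),  # ceiling
    (63.0, 0.10),  # floor
    (96.0, 0.05),  # walls: 2 * (9 + 7) * 3 m^2
]
print(f"RT60 = {rt60_sabine(189.0, surfaces):.2f} s")  # ~1.48 s
```

For context, classroom acoustics guidelines (e.g., ANSI/ASA S12.60) recommend roughly 0.6 s or less, so an untreated room like this one would need added absorption.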

The data collection mechanism is online, via the TyphonGroup© EASI survey system.

1. Currently piloting with selected clinical instructors
2. Implementation of the instrument in the fall term
3. Evaluation of the instrument, pending responses from the first round of implementation
4. Ongoing editing of items and scale per constituent feedback

• Kogan, J., Conforti, L., Bernabeo, E., Iobst, W., & Holmboe, E. (2011). Opening the black box of clinical skills assessment via observation: A conceptual model. Medical Education.
• American Speech-Language-Hearing Association (2005). Quality indicators for professional service programs in audiology and speech-language pathology. Available from the ASHA website.
• American Speech-Language-Hearing Association (2012). Performance assessment of contributions and effectiveness of speech-language pathologists (PACE). Available from the ASHA website.
• American Psychological Association (1999). Standards for educational and psychological testing. Washington, DC: American Educational Research Association.
• Crossley, J., Humphris, G., & Jolly, B. (2002). Assessing health professionals. Medical Education, 36.

• Value-added assessment
• Research findings
• Rationale for the development of the PACE
• Goals of an assessment system
• Components of the PACE
• Tools and resources

• Value-added assessment (VAA) is a process for accurately and fairly assessing a professional's impact on student performance and on the overall success of the school community.
• It is a comprehensive, statistical method of analyzing test data to measure teaching and learning (a toy numerical sketch follows).
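The statistical core is usually some form of growth prediction: each student's expected score is modeled from prior achievement, and the professional is credited with the average gap between actual and expected scores for their caseload. The sketch below illustrates the idea only; real VAA models are far more elaborate, and all numbers are hypothetical.

```python
# Toy value-added estimate: mean residual (actual minus predicted post-test
# score) over one professional's caseload. All data are hypothetical.
import numpy as np

pre  = np.array([410, 455, 430, 480, 420, 465])  # prior-year scores
post = np.array([440, 470, 455, 500, 435, 495])  # current-year scores

# Predict post from pre with a least-squares line fit across all students.
slope, intercept = np.polyfit(pre, post, 1)
residuals = post - (intercept + slope * pre)

caseload = [0, 2, 5]  # indices of students served by one professional
print(f"Estimated value-added: {residuals[caseload].mean():+.1f} points")
```

This framing also exposes the concerns noted in the research findings that follow: residuals reflect everything the model omits, not just one professional's contribution.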

• Federal grant programs and waivers require states to include VAA/teacher accountability measures in their applications.
• VAA is viewed as an important accountability measure.
• Teacher accountability systems are being developed in many states.

• Research has primarily focused on the implications of using VAA with classroom teachers.
• Notable concerns have surfaced, such as the difficulty of linking student outcomes to a single teacher and uncertainty about the accuracy of imputation models for missing student data.

• Evaluating the value an SLP brings to the school, or connecting that value to specific student performance, is more challenging than it is for a classroom teacher.
• ASHA's Value-Added Working Team was unable to identify any VAA models that specifically incorporated SLPs.

• Since no system had been developed specifically for SLPs or other support personnel, ASHA wanted to ensure that an assessment model for SLPs:
   • accurately reflects the SLP's unique role in contributing to a child's overall performance; and
   • demonstrates that the SLP contributes to the success of the school community.

• ASHA also wanted to make sure that the evaluation system for SLPs:
   • is comprehensive;
   • uses multiple measures;
   • produces valid and reliable findings;
   • provides data for professional development objectives; and
   • is linked to the specific roles and responsibilities of the job.

• PACE Matrix
• Portfolio
• Observation chart
• Teacher, student, and parent checklists
• Self-reflection tool
• Observations by an individual with knowledge of the roles and responsibilities of the SLP

The Matrix consists of a set of nine objectives by which an SLP should be evaluated. These objectives are derived from the typical roles and responsibilities of a school-based SLP. The portfolio is developed to show evidence of mastery of each objective.

• The PACE documents can be located on the ASHA website (Performance-Assessment-of-Contributions-and-Effectiveness).
• Additional tools and resources include:
   • a guide for developing the portfolio for the Matrix;
   • an evaluator's guide;
   • observation "look-fors" and a scoring system for the Matrix; and
   • a step-by-step guide for using the PACE system.

• When considering an assessment instrument, what quality markers must be included?
• What are the challenges of evaluating related-services scholars and graduates?
• How can the data collected be used to improve program quality?
• Can we ensure "calibrated" responses across raters? (One common check is sketched below.)
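One concrete way to check calibration is to have two raters score the same students and compute chance-corrected agreement. Below is a minimal sketch using Cohen's kappa; the ratings are hypothetical.

```python
# Cohen's kappa: agreement between two raters, corrected for chance.
# The ten ratings below are hypothetical scores on a 4-point scale.
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    n = len(rater_a)
    observed = sum(x == y for x, y in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / n**2
    return (observed - expected) / (1 - expected)

a = [1, 2, 2, 3, 4, 2, 3, 3, 1, 4]  # rater A
b = [1, 2, 3, 3, 4, 2, 3, 2, 1, 4]  # rater B, same ten students
print(f"kappa = {cohen_kappa(a, b):.2f}")  # 0.73: substantial, but imperfect
```

On an ordinal scale, weighted kappa or an intraclass correlation would credit near-misses; either way, low values signal a need for rater training before scores are compared across programs.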