Visual Impairment Scale of Service Intensity of Texas Study Report AER International Conference San Antonio, Texas August 1, 2014.


Members of Service Intensity Subcommittee
Rona Pogrund, TTU, Chair
Cyral Miller, TSBVI Outreach
Frankie Swift, SFASU
Kitra Gray, Region 10 ESC
Mary Ann Siller, Richardson ISD
Chrissy Cowan, TSBVI Outreach
Michael Munro, SFASU
Tracy Hallak, SFASU
Cecilia Robinson, Region 4 ESC
Shannon Darst, TTU Ph.D. student

What is the VISSIT?
Visual Impairment Scale of Service Intensity of Texas (VISSIT)
Types of service addressed: direct intervention and collaborative consultation
Focus is on student need in the ECC (Expanded Core Curriculum) areas
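The VISSIT itself is a rating form rather than software, but a small sketch can make its two-part output concrete. The sketch below is a hypothetical representation, assuming each ECC area is rated for intensity of need and the total is split into direct intervention and team collaboration minutes; the rating levels and the minutes-per-point conversion are invented placeholders, not the published VISSIT scoring rules.

```python
# Hypothetical sketch only: the real VISSIT rubric, rating levels, and
# recommended time ranges come from the published instrument, not this code.
from dataclasses import dataclass

# Expanded Core Curriculum (ECC) areas a TVI rates for intensity of need.
ECC_AREAS = [
    "compensatory access", "sensory efficiency", "assistive technology",
    "orientation and mobility", "social interaction", "independent living",
    "recreation and leisure", "career education", "self-determination",
]

@dataclass
class ServiceRecommendation:
    direct_minutes_per_week: int         # direct intervention with the student
    collaboration_minutes_per_week: int  # educational team support/collaboration

def summarize(ratings: dict) -> ServiceRecommendation:
    """Turn per-area intensity ratings (0 = no need ... 4 = intensive need)
    into a rough weekly time recommendation. The minutes-per-point
    conversion below is an arbitrary placeholder, not the VISSIT's rule."""
    total = sum(ratings.get(area, 0) for area in ECC_AREAS)
    return ServiceRecommendation(
        direct_minutes_per_week=total * 15,
        collaboration_minutes_per_week=total * 5,
    )
```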

History of the VISSIT
June 2013 retreat of the Service Intensity Subcommittee of the Texas Action Committee for the Education of Students with Visual Impairments
Creation of the VISSIT
Field trials

History of the VISSIT (continued)
Research study proposal through Texas Tech University's and Stephen F. Austin State University's Institutional Review Boards
Study approval and participant selection
Study implementation: September to March 2014

Overview of the VISSIT – First Page

Overview of the VISSIT – Final Page

Overview of the VISSIT – Additional Areas of Family Support Table

Overview of the VISSIT – Recommended Direct Service Time Range Form

Overview of the VISSIT – Recommended Educational Team Support/Collaboration Service Time Range Form

Frequently Asked Questions

PURPOSE
Q: Can the VISSIT be used for all students on my caseload, including those with multiple impairments and/or those with deafblindness? How about infants?
A: The VISSIT is designed to determine the appropriate type and amount of services needed for ALL students with visual impairments on the TVI caseload.
Q: Is the VISSIT to be used as a caseload analysis?
A: The VISSIT is not a caseload analysis tool, but it can be used as part of a process to determine appropriate caseload size. The VISSIT does not take into account issues related to workload (e.g., planning, travel, and material preparation).

HOW TO USE THE VISSIT
Q: Can professionals who are not teachers of students with visual impairments fill out the VISSIT?
A: The VISSIT must be completed by a TVI, who has the vision-specific knowledge to quantify the levels of service intensity.
Q: How often should the VISSIT be completed? When might I complete the VISSIT?
A: The VISSIT should be completed prior to any determination of service type and amount. It should be completed before any IEP or IFSP meeting so that the TVI has data to determine and support the recommended type and amount of services for students.

Information About the Study
- 38 responded and consented to participate
- 25 actual participants
- 81 actual VISSITs completed and returned
- Electronic survey
- Results of the study indicated that the tool is moderately valid and reliable

Was the VISSIT, in its entirety, easy to use? (n=25)
Response scale: Very difficult to use / Mostly difficult to use / Easy to use / Mostly easy to use / Completely easy to use (response counts and mean not preserved in the transcript)

Time Needed to Complete the VISSIT
The average time to complete the VISSIT was 31 minutes per student.
Completion time dropped to 15-20 minutes after participants had completed several scales and gained familiarity with the tool.

Did you base your VISSIT scoring of student need on the student's evaluation results? (n=24)
Response scale: Completely not based / Mostly not based / Somewhat based / Mostly based / Completely based on evaluation results (response counts and mean not preserved in the transcript)

Did the results of the VISSIT match your professional judgment regarding student need and recommended type and amount of service? (n=24)
Response scale: Completely did not match / Mostly did not match / Somewhat matched / Mostly matched / Completely matched (response counts and mean not preserved in the transcript)

Did your VISSIT results directly translate into the type and amount of service you recommended for your student's IEP? (n=24)
Response scale: Completely did not translate / Mostly did not translate / Somewhat translated / Mostly translated / Completely translated to recommendation (response counts and mean not preserved in the transcript)

Do you feel you would use the VISSIT in the future for determining the type and amount of service you recommend for your students? (n=23)
Yes: 22 (96%)
No: 1 (4%)
Total: 23 (100%)

Do you feel that the VISSIT is a better tool to use for determining the type and amount of service than other available tools or methods you are currently using? (n=23)
Yes: 18 (78%)
No: 5 (22%)
Total: 23 (100%)
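As a quick arithmetic check (a throwaway sketch with the counts hard-coded from the two yes/no tables above), the reported percentages follow directly from the response counts:

```python
# Reproduce the rounded percentages reported in the two yes/no tables above.
def pct(count: int, total: int) -> int:
    return round(100 * count / total)

assert pct(22, 23) == 96 and pct(1, 23) == 4    # would use the VISSIT in the future
assert pct(18, 23) == 78 and pct(5, 23) == 22   # VISSIT better than current tools/methods
print("Reported percentages match the response counts.")
```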

Revised VISSIT
Based on feedback from the study participants, the instructions were simplified and reformatted to make the tool more user-friendly.
A test-retest phase of the study was conducted in March with the revised VISSIT, asking the same participants to complete one additional administration of the tool on one student and take a short survey to assess the revisions of the scale.
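The presentation does not say which statistic the test-retest comparison used. Purely as a generic illustration of how test-retest agreement is often quantified, the sketch below computes a Pearson correlation between two administrations' total scores on made-up data; it is not the study's actual analysis or data.

```python
# Generic illustration of test-retest agreement on invented data;
# not the study's actual analysis method or results.
from statistics import correlation  # Python 3.10+

first_administration  = [12, 30, 7, 22, 15]   # hypothetical total scores, time 1
second_administration = [14, 28, 9, 21, 17]   # hypothetical total scores, time 2

r = correlation(first_administration, second_administration)
print(f"Test-retest Pearson r (illustrative data only): {r:.2f}")
```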

Overall, was the revised VISSIT easy to use? (n=16)
Completely difficult to use: 0 (0%)
Mostly difficult to use: 0 (0%)
Easy to use: 0 (0%)
Mostly easy to use: 8 (50%)
Completely easy to use: 9 (56%)

Were the revised instructions clear and understandable? (n=16)
Completely unclear and not understandable: 0 (0%)
Mostly unclear and not understandable: 0 (0%)
Clear and understandable: 1 (6%)
Mostly clear and understandable: 5 (31%)
Completely clear and understandable: 10 (63%)

Would you prefer to use a paper format or an electronic format of the VISSIT? (n=16)

Next Steps
The electronic version has been completed.
A website for accessing the electronic version is in development.
Data analysis is continuing to gather more information about various aspects of the validity and reliability of the tool.

Filling Out the VISSIT – Sample Student
Sample VISSIT – STUDENT A, "Lily"
Lily is a 5-year-old girl who has a history of "extensive multifocal cystic encephalomalacia involving bilateral cerebral hemispheres." Dr. XXX reports "likely cortical vision impairment," that she is legally blind, and that she "… appears to have no vision." Lily also has a history of HSV meningitis as an infant, cerebral palsy, seizure disorder, encephalopathy, and developmental delay. She is able to alert to sounds and to her name being called; turn or tilt her head toward music, bells, and instruments; sit in an adapted chair; smile when she is happy; and demonstrate discomfort (as when she is wet).
She demonstrates the three distinct criteria for Cortical Visual Impairment: 1) the eye exam does not explain the child's functional use of vision, 2) there is a history of a neurologic incident or some neurologic sequelae, and 3) she demonstrates the unique visual and behavioral characteristics associated with CVI.
With regard to her vision, Lily is able to visually attend to and track an 8-inch lighted red ball when it is presented in either her far left or far right visual field or no more than 12 inches away from her face. She also visually attends to other targets that are either lighted or have reflective qualities. No distance viewing is observed, other than staring at ceiling lights and sunlight for brief moments. Results from her learning media assessment indicate that Lily relies heavily on her auditory mode for learning, but she will explore real objects that are of a single bright color, reflective, and/or can light up.
Lily is transitioning from a school-based self-contained classroom to a homebound setting, with services being provided in her home.

Filling Out the VISSIT – Intensity of Need in Skill Areas
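The filled-in form for this slide is not reproduced in the transcript. Purely as a hypothetical illustration of the step it describes, the sketch below records invented intensity-of-need ratings for a student like Lily in a few ECC areas and totals them; the values are not Lily's actual VISSIT scores.

```python
# Hypothetical ratings only -- not the sample student's actual VISSIT scores.
lily_ratings = {
    "sensory efficiency": 4,    # heavy reliance on auditory input; emerging use of lighted/reflective targets
    "compensatory access": 4,   # real-object, tactile/auditory learning media
    "assistive technology": 2,
    "independent living": 3,
}

# The real VISSIT converts this kind of total into a recommended range of
# service time rather than a single number.
total_intensity = sum(lily_ratings.values())
print(f"Total intensity-of-need score (hypothetical): {total_intensity}")  # -> 13
```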

Questions? Comments?

Contact Information
Rona Pogrund, Ph.D., Texas Tech University –
Cyral Miller, Texas School for the Blind and Visually Impaired –
Michael Munro, Stephen F. Austin State University –
Shannon Darst, Ph.D., Texas Tech University –