+ EOIS-CaMS Data Management, Integrity and Analysis (Data MIA)
Prepared by: Robyn Cook-Ritchie, RCR Consulting


+ Focus for today… Data:
Management: processes to collect good data
Integrity: processes to input good data; timely data
Analysis: using the data to ask questions and make good decisions related to the five core measures

+ Currently Available Reports - SDS
LBS - All Data - IR
LBS - All Data - Outcomes/Follow-ups
LBS - All Data - Service Plan/Profile
LBS - Case Activity
DF - LBS - Participation
DF - LBS - Information Session
DF - LBS - Sub Goals And Plan Items
DF - LBS - Service Plans
LBS - Detailed Service Quality
DF - Clients (Participants)
DF - Education History
DF - Employment History
DF - Follow-up Events
Service Plan Case Follow Up
Inactive Cases
Service Provider User Management

+ Currently Available Reports - Rollup
LBS - All Data - IR (Rollup)
LBS - All Data - Outcomes/Follow-ups (Rollup)
LBS - All Data - Service Plan/Profile (Rollup)
LBS - Service Delivery Site Mailing Labels
LBS - Service Provider Mailing Labels
LBS - Detailed Service Quality (Rollup)
DF - Codes And Descriptions
DF - Employers
Available for the province or by region: Central, Eastern, Northern, Western

+ Useful Reports
Report 60D: LBS - All Data - Outcomes/Follow-ups
Summarizes all outcome information; gives you summary data on completions and outcomes at exit, 3, 6 and 12 months.
Report 60B: LBS - All Data - Service Plan/Profile
Summarizes all profile information; gives you summary data for 4 of the 5 core measures.
Report 61: LBS Case Activity (weekly)
Shows all data related to service plans that are open, approved, active or have been closed in the fiscal year.
Report 19A: Service Plan Case Follow-up (weekly)
Shows follow-up reviews that are overdue or due within the next 30 days.

+ Most Important Report
Report 64: Detailed Service Quality
The MOST IMPORTANT report, and the only performance measure report. Shows performance commitments and actual results related to:
Customer Service
Effectiveness
Efficiency
This is the report you use to complete your Quarterly Status and Adjustment Report (QSAR).

+ Report 64 - Where did that number come from?
Phase II-A: Five measures
Dimension | Measure | Weight | Standard | SQS Value
Customer Service (40%) | 1. Customer Satisfaction | 15% | 90% | 1.35
Customer Service (40%) | 2. Service Coordination | 25% | 50% | 1.25
Effectiveness (50%) | 3. Suitability/Learner Profile (all 12 indicators) | 20% | 30% | 0.6
Effectiveness (50%) | 5. Learner Progress | 30% | 60% | 1.8
Efficiency (10%) | 7. Learners Served | 10% | 90% | 0.9
SQS Standard: 5.9
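The SQS figures follow directly from the table: each measure's SQS value is its weight multiplied by its standard (on the table's 10-point scale), and the SQS standard of 5.9 is their sum. A minimal sketch of that arithmetic in Python (illustrative only, not part of CaMS):

```python
# Reproduce the Report 64 SQS arithmetic: each measure contributes
# (weight x standard) x 10 to the overall SQS standard.
measures = {
    "Customer Satisfaction":       (0.15, 0.90),
    "Service Coordination":        (0.25, 0.50),
    "Suitability/Learner Profile": (0.20, 0.30),
    "Learner Progress":            (0.30, 0.60),
    "Learners Served":             (0.10, 0.90),
}

sqs_values = {name: round(weight * standard * 10, 2)
              for name, (weight, standard) in measures.items()}
sqs_standard = round(sum(sqs_values.values()), 2)

print(sqs_values)    # per-measure SQS values from the table
print(sqs_standard)  # 5.9
```

Working through one row: Learner Progress is 30% weight at a 60% standard, giving 0.30 x 0.60 x 10 = 1.8, matching the table.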

+ Five Core Measures: What is different next fiscal year?

+ Dimension - Customer Service (40%)
Measure #1: Customer Satisfaction (15%)
This number is based on closed service plans.
Data comes from the answer to the question "On a scale of 1-5, how likely are you to recommend the LBS Program to someone looking for similar services as those you received?" Learners who respond with a rating of 4 or 5 are considered satisfied.
This information is captured on the Participant Exit Form and is entered when a service plan is closed. Found on the service plan home page.
Target: 90%
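Under that definition, the satisfaction rate is simply the share of closed service plans whose exit-survey rating was 4 or 5. A hypothetical illustration (the sample ratings are invented for the example, not real CaMS data):

```python
# Exit-survey ratings (1-5) from closed service plans; invented sample data.
ratings = [5, 4, 5, 3, 4, 5, 5, 4, 2, 5]

satisfied = sum(1 for r in ratings if r >= 4)  # only ratings of 4 or 5 count
rate = satisfied / len(ratings)

print(f"Satisfaction: {rate:.0%}")  # Satisfaction: 80%
print(rate >= 0.90)                 # False: below the 90% target
```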

+ Customer Satisfaction
Integrity
Prior to closing a file in CaMS, do we have a process in place to make sure we have collected the customer satisfaction data so it can be entered (e.g. a closing-file checklist)?
Management
Is this information collected from the learner at exit in a formal way (e.g. using the old Learner Satisfaction Survey or another survey)?
Is this information kept in the learner file?
Is customer satisfaction data collected at points other than exit?

+ Customer Satisfaction
Analysis
Are 90% of our learners satisfied with our service? Where do I find this information? Review Report 64 on a monthly basis.
If not:
Ask learners why not and make changes based on that information
Survey learners more frequently
Implement an anonymous suggestion box
Possible changes:
Hours of operation
Offer short-term focused programming options

+ Dimension - Customer Service (40%)
Measure #2: Service Coordination (25%)
Based on closed service plans.
Three sources:
Referred in (when a service plan is opened; captured on the Participant Registration Form)
Referral during service (as a service plan sub-goal)
Referral at exit (as a service plan sub-goal)
Referrals in from EO Literacy and Basic Skills providers, informal referrals, and word of mouth/media referrals are not included.
Target: 50%
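In other words, a closed service plan counts toward service coordination only if it has at least one referral from an eligible source. A hypothetical sketch of that filter (the plan ids and source labels are invented for illustration):

```python
# Referrals per closed service plan; plan ids and source labels are invented.
# LBS-provider, informal, and word-of-mouth/media referrals do not count.
EXCLUDED = {"EO-LBS Provider", "Informal", "Word of Mouth/Media"}

plans = {
    "plan-1": ["Employment Service"],                # eligible referral
    "plan-2": ["Word of Mouth/Media"],               # excluded source only
    "plan-3": [],                                    # no referral at all
    "plan-4": ["EO-LBS Provider", "Ontario Works"],  # one eligible referral
}

coordinated = sum(1 for refs in plans.values()
                  if any(r not in EXCLUDED for r in refs))
rate = coordinated / len(plans)
print(f"Service coordination: {rate:.0%}")  # 50%, right at the target
```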

+ Service Coordination
Integrity
Do we make sure that a "referral in" is recorded in CaMS when a service plan is opened?
Are referrals during service and at exit that are recorded on the learner plan entered on the service plan in CaMS?
Do we check to make sure all referrals have been entered prior to closing a service plan in CaMS?
Management
At intake, do we record a "referral in" on the PRF if there is one?
Do we keep track of any referrals made during service and record them on the learner plan?
Do we record referrals made at exit on the learner plan?
Do we keep additional documentation related to referrals in the learner file?

+ Service Coordination
Analysis
Are we meeting the service coordination target of 50%? Where do I find this information?
If not:
Determine where the gaps are: use Report 60B for a summary of "referral in" and "referral out" information
Make sure you are capturing all referrals that take place
Review the Report 64 guide to see which referrals count
Discuss it at your service planning meeting with other community partners

+ Dimension - Effectiveness (50%)
Measure #3: Suitability/Learner Profile (20%)
Based on closed service plans.
Phase II-A includes all suitability indicators (see page 31 of the Service Provider Guidelines):
Less than grade 12
Income source (OW/ODSP/no source of income/Crown ward)
Out of education six years or more
Out of training for six years or more
Over 45 and under 64
History of interrupted education
Person with a disability
Aboriginal person
Deaf/deaf-blind
Francophone
This information is captured on the Participant Registration Form (PRF) and recorded when creating the case in CaMS, and when entering information on the Person Home Page and in the Client Summary in CaMS.
Target: 30%

+ Suitability/Learner Profile
Management
Is the PRF filled out completely?
Do intake staff understand which items on the PRF relate to suitability/learner profile in CaMS?
Are the PRF and the CaMS person home page updated if a client self-identifies after the intake process (e.g. as an Aboriginal person)?
Integrity
Is the information on the person home page in CaMS correct and up to date?
Do all learners have a client summary in CaMS? Use Report 60B to determine if any are missing.
Is the client summary completed and up to date in CaMS?

+ Suitability/Learner Profile
Analysis
Are we meeting our suitability criteria?
If not:
Which suitability indicators are not represented? Use Report 60B for a summary of the suitability indicators represented, or use the sort feature in Report 61.
Think about ways your agency can target underrepresented groups.
Do your referral partners understand the suitability criteria for the LBS Program?

+ Dimension - Effectiveness (50%)
Measure #5: Learner Progress (30%)
Based on both active and closed service plans.
It is the percentage of learners who have completed at least one milestone in the current reporting period.
Data comes from "attained" competency plan items.
Target: 60%
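That definition amounts to: count a learner if any of their competency plan items is "attained" in the reporting period. A hypothetical example (the learner records are invented for illustration):

```python
# Competency plan item statuses per learner; invented sample records.
learners = {
    "A": ["attained", "in progress"],
    "B": ["in progress"],
    "C": ["attained"],
    "D": [],
    "E": ["attained", "attained"],
}

# A learner counts once, no matter how many items they have attained.
progressed = sum(1 for items in learners.values() if "attained" in items)
rate = progressed / len(learners)
print(f"Learner progress: {rate:.0%}")  # 60%, meeting the target
```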

+ Learner Progress
Integrity
Is there a process in place to make sure completed milestones are recorded in CaMS?
Are completed milestones stored in the learner file?
When milestones are completed, are they updated in CaMS in a timely fashion?
Do we check to make sure all milestones have been entered prior to closing a service plan in CaMS?
Management
Have practitioners had adequate training related to the administration of milestones?
Do learners understand that they will need to complete milestones as part of the program?
Do instructors have easy yet secure access to milestones?
Is there at least one milestone related to the learner goal path identified on each learner plan?

+ Learner Progress
Analysis
Are at least 60% of learners attaining one milestone?
If not:
Ensure there are clear processes in place so that learners and practitioners understand the expectations around completion of milestones.
Make sure ongoing training delivery includes a variety of task-based activities that will help practitioners and learners know when they are ready to attempt milestones.

+ Dimension - Efficiency (10%)
Measure #7: Learners Served (10%)
The number includes all learners served with an active learner plan.
Only one service plan per learner is counted.
Only service plans with at least one milestone are included; the milestone must be "in progress" or "completed".
Target: 90%, but each service delivery agency has a performance commitment target of 100%.
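Those counting rules (one plan per learner, and only plans with an "in progress" or "completed" milestone) reduce to collecting a set of distinct learner ids. A hypothetical sketch (the plan records are invented for illustration):

```python
# Service plans with a learner id and milestone statuses; invented data.
plans = [
    {"learner": "A", "milestones": ["in progress"]},
    {"learner": "A", "milestones": ["completed"]},  # same learner: counted once
    {"learner": "B", "milestones": []},             # no milestone: not counted
    {"learner": "C", "milestones": ["completed"]},
]

# A set keeps each learner id once, regardless of how many plans qualify.
served = {p["learner"] for p in plans
          if any(m in ("in progress", "completed") for m in p["milestones"])}

print(len(served))  # 2 distinct learners (A and C)
```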

+ Learners Served
Management and Integrity
Do we know how many learners we have to serve? (See Schedule E of your service delivery agreement.)
Are learner plans developed in a timely fashion for each learner?
Are all service plans entered in the system in a timely fashion?
Are all our service plans "active" in the system? Use Report 61.
Do all service plans in CaMS have a "completed" or "in progress" milestone?

+ Learners Served
Analysis
Are we meeting our numbers?
If not:
Which client group could we target that we are not serving? Use Report 60B.
Who are our current referral partners? Which other partners could we focus on?
Are our services accessible? Could we offer itinerant or mobile service?
Do our operating hours meet the needs of the learners?
Is our agency contact information and service description up to date on the Web, in social media and in any print material?
Do we offer programming that meets the needs of clients being served by other partners such as Employment Services or Ontario Works?

+ The END!