Data Fidelity Preconference


Data Fidelity Preconference
Dr. Diane Cordry Golden
March 2019

Device Demonstration
A demonstration event has one AT type (even with multiple devices), one or more participants, one access (decision-making) performance measure, and multiple satisfaction ratings (one from each participant).
- Best practice: multiple devices within an AT type are compared and contrasted and reported as one demo event.
- A different AT type is a separate demo event.
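As a rough illustration of the one-event, one-AT-type structure described above, an individual demo record in a grantee's own data system might look something like the sketch below. The class and field names are hypothetical, not the NATADS/APR schema.

    # Hypothetical record layout for a single demonstration event; field names
    # are illustrative only, not the NATADS/APR schema.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class DemoEvent:
        at_type: str                 # exactly one AT type per demo event
        devices: List[str]           # one or more devices within that AT type
        participants: List[str]      # participant types (PWD, family member, professional, etc.)
        access_outcome: str          # single access (decision-making) performance measure
        satisfaction: List[str] = field(default_factory=list)  # one rating per participant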

Demonstration: Potential Red Flags
- Demo event count (by AT type) always equals participant count
  - Never more than one participant per demo?
  - Reporting choice, or accurate?
- Average participant count is 4x the number of demo events
  - Large participant numbers are unlikely in all demos.
  - More than 4 participants for an individual demo is suspicious; more than 10 is likely not a demo.
- Participants with disabilities exceed the number of demos
  - Two PWD decision-makers in one demo is unlikely.
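A minimal sketch of how these demonstration red flags could be screened from aggregate counts pulled out of a grantee's data system; the function and variable names are assumptions, and the thresholds are the rules of thumb from the slide above.

    # Screen aggregate demonstration counts for the red flags listed above.
    def demo_red_flags(demo_events: int, participants: int, pwd_participants: int) -> list[str]:
        flags = []
        if demo_events and participants == demo_events:
            flags.append("Participants always equal demo events: never more than one participant per demo?")
        if demo_events and participants >= 4 * demo_events:
            flags.append("Average participants per demo is 4x or more: unusually large demos?")
        if pwd_participants > demo_events:
            flags.append("More PWD participants than demos: multiple PWD decision-makers per demo is unlikely.")
        return flags

    print(demo_red_flags(demo_events=50, participants=50, pwd_participants=55))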

Short-term Device Loan
One device loan event has one borrower type, can have multiple devices (usually within one AT type), one performance measure (access or acquisition), and one satisfaction rating.
- Set a short-term loan period appropriate for the purpose.
- The majority purpose should be decision-making (80% or more of loans).
- A minority of loans are for other purposes, each a confirmed short-term event:
  - Event-specific accommodation
  - Loaner while waiting for repair or funding
  - Professional development/training event
- This is different from an open-ended or unspecified loan period, e.g., a device loaned for use after hospital discharge with no clear end date for when the device is no longer needed.

Device Loan: Potential Red Flags
- The majority purpose is not decision-making
  - How is decision-making supported for complex AT?
- Loan period by policy is longer than 35 days
  - Longer loan periods suggest events are not set up as short-term.
  - Paired with accommodation as the majority purpose, this suggests more open-ended loans.
- Borrower count always equals device count
  - Typically some decision-making loans will have more than one associated device.
  - Is this an artifact of the data system, of procedures that allow only one device out to a borrower at a time, or something else?
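A similar sketch for the device-loan flags, again over assumed summary values; the 80% decision-making share and 35-day policy cutoff come from the slides, everything else is illustrative.

    # Screen aggregate loan counts and the policy loan period for the flags above.
    def loan_red_flags(borrowers: int, decision_making_loans: int, devices: int, policy_loan_days: int) -> list[str]:
        flags = []
        if borrowers and decision_making_loans / borrowers < 0.80:
            flags.append("Majority purpose is not decision-making: how is decision-making supported for complex AT?")
        if policy_loan_days > 35:
            flags.append("Loan period by policy exceeds 35 days: loans may not be set up as short-term events.")
        if borrowers and devices == borrowers:
            flags.append("Borrower count always equals device count: possible data-system or procedural artifact.")
        return flags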

Reuse
A reuse event has one recipient, can have multiple devices in one or more AT type categories, and has one acquisition performance measure and one satisfaction rating.
- Recipients can be duplicated: a person who acquires reused devices and returns 6 months later to obtain additional devices is typically reported as 2 reuse events.
- The data system must allow one recipient to have multiple associated devices acquired within an AT type or across AT types.
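To illustrate the requirement that one recipient can have multiple devices within or across AT types, a hypothetical individual reuse record might be structured as below; the field names are not the APR schema.

    # Hypothetical reuse record allowing multiple devices per recipient,
    # within or across AT types.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class ReusedDevice:
        at_type: str
        description: str
        retail_value: float          # must be $1 or more for any reported device

    @dataclass
    class ReuseEvent:
        recipient_id: str            # recipients may repeat across events
        devices: List[ReusedDevice]  # one or more devices, possibly spanning AT types
        acquisition_outcome: str     # single acquisition performance measure
        satisfaction: str            # single satisfaction rating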

Reuse: Potential Red Flags
- Devices reported with zero retail value
  - Retail value must be $1 or more if a device number is reported for an AT type.
- Average retail price per device is very low
  - A mean retail price of only a few dollars per device suggests the devices may be supplies or consumables rather than reused AT.
- Recipient count always equals device count
  - Typically some recipients will get more than one device unless there is something unusual about the reuse program (limited AT types, etc.).
  - An artifact of the data system?
- Average device count per recipient is very large
  - 2-3 devices per recipient is not unusual, but 5-10 per recipient, especially within one AT type, is highly unusual.
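A sketch of the reuse red-flag checks over assumed aggregate values; the "few dollars" and "5-10 devices per recipient" rules of thumb above are turned into illustrative numeric cutoffs, which are assumptions rather than APR rules.

    # Screen aggregate reuse counts and values for the flags above.
    def reuse_red_flags(recipients: int, devices: int, total_retail_value: float, zero_value_devices: int) -> list[str]:
        flags = []
        if zero_value_devices > 0:
            flags.append("Devices reported with $0 retail value: every reported device must be $1 or more.")
        if devices and total_retail_value / devices < 5:            # "a few dollars" rendered as < $5 (assumed cutoff)
            flags.append("Very low average retail price: supplies/consumables rather than reused AT?")
        if recipients and devices == recipients:
            flags.append("Recipient count always equals device count: possible data-system artifact.")
        if recipients and devices / recipients >= 5:                 # 5+ devices per recipient is highly unusual
            flags.append("Average devices per recipient is unusually high.")
        return flags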

State Financing
- A financial loan has multiple data elements for each loan made, with one or more AT devices, one acquisition performance measure, and one satisfaction rating.
- An other state financing (direct provision or savings) event has one recipient with one or more AT devices, one acquisition performance measure, and one satisfaction rating.
- Other SFA direct provision or savings can be reported as a single activity or separately by different activities.

State Financing: Potential Red Flags
- Financial loans: highest/lowest values and the frequency distribution are not mathematically consistent
  - Highest/lowest values and a frequency distribution are reported for approved applicant incomes and interest rates.
  - It is not possible to report a highest income of $80,000 with nothing in the "$75,000 or more" frequency cell.
  - It is not possible for a frequency distribution grouped in one range to have a sum much larger or smaller than that grouping implies.
- Other SFA savings: devices reported with zero retail value
  - Any device reported must have a retail value of $1 or greater.
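A minimal consistency check for the highest-value versus frequency-distribution flag described above, using illustrative income brackets; the bracket edges are assumptions for the example only.

    # The bracket containing the highest reported income must have a non-zero count.
    def income_distribution_consistent(highest: float, bracket_counts: dict[tuple[float, float], int]) -> bool:
        for (low, high), count in bracket_counts.items():
            if low <= highest < high:
                return count > 0      # bracket holding the highest value must be non-empty
        return False                  # highest value falls outside every reported bracket

    # Example: highest income of $80,000 with an empty "$75,000 or more" bracket is flagged.
    brackets = {(0, 15000): 3, (15000, 30000): 7, (30000, 75000): 12, (75000, float("inf")): 0}
    print(income_distribution_consistent(80000, brackets))   # False -> red flag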

Training
- Total participants by type = total participants by geographic distribution = total participants by topic (see the cross-check sketch below).
- Accessible ICT training participants have one performance measure each.
- Transition training or technical assistance is mandatory:
  - If the transition topic has participants reported, there must be a narrative description.
  - If a transition narrative is provided, there must be an associated participant number on the transition topic line.
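A short sketch of the cross-check that the three participant breakdowns sum to the same total; the dictionaries are assumed inputs, not APR field names.

    # All three training breakdowns (type, geography, topic) must sum to the same total.
    def training_totals_match(by_type: dict, by_geography: dict, by_topic: dict) -> bool:
        totals = {sum(by_type.values()), sum(by_geography.values()), sum(by_topic.values())}
        return len(totals) == 1

    by_type = {"PWD/family": 40, "professionals": 60}
    by_geography = {"metro": 70, "non-metro": 30}
    by_topic = {"AT products": 55, "transition": 20, "accessible ICT": 25}
    print(training_totals_match(by_type, by_geography, by_topic))   # True only if all sum to the same total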

Training: Potential Red Flags
- Total accessible ICT participants is very low (10 or less)
  - A sufficient N helps with performance measure stability.
- The accessible ICT narrative describes AT training, not ICT accessibility training
  - Performance measures relate to changing policies, procedures, and practices around ICT accessibility (e.g., web accessibility, procurement of accessible ICT, and similar) or supportive turnaround training.
  - AT training is inconsistent with the desired performance measure outcomes.
- A large portion of participants cannot be categorized
  - Unable to categorize by type and/or geography is unknown.
  - Possibly a public awareness/expo event rather than training?
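A sketch of the training red-flag screen; the 10-participant accessible ICT cutoff comes from the slide, while the 25% uncategorized share is an assumed threshold.

    # Screen training totals for the flags above.
    def training_red_flags(ict_participants: int, total_participants: int, uncategorized: int) -> list[str]:
        flags = []
        if ict_participants <= 10:
            flags.append("10 or fewer accessible ICT participants: small N makes the performance measure unstable.")
        if total_participants and uncategorized / total_participants > 0.25:   # assumed cutoff, not an APR rule
            flags.append("Large share of uncategorized participants: public awareness/expo event rather than training?")
        return flags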

Technical Assistance
- Technical assistance is provided to agencies/programs and is designed to improve laws, policies, practices, services, and outcomes.
- Transition training or technical assistance is mandatory.
- Red flag: the TA narrative describes TA provided to an individual. By definition, TA is not provided to an individual.

State Improvement Outcomes
- State improvement outcomes are usually associated with "systems change" initiatives and/or TA.
- The end result is improved or expanded laws, policies, practices, or programs.
- Not mandatory, but reporting none can be seen as less than positive.
- Red flag: the narrative describes an initiative but no outcome is identified. If a changed law, policy, practice, or program cannot be clearly identified, the initiative is likely TA with the outcome not yet realized.

Leveraged Funding
- Non-AT Act dollars that flow to the grantee and are used to support authorized AT Act activities.
- Does not include dollars that flow to contractors.
- Does not include in-kind contributions.
Potential red flags:
- Dollars are reported in Section B.
  - Section B is limited to leveraged funding that supports AT Act authorized activities not included in the State Plan; there is almost no reason not to include authorized activities in the State Plan.
- A large amount of "Federal" leveraged funding.
  - The Federal source is limited to direct federal grants, which are fairly rare; it is more likely a public/state agency flow-through of federal dollars.

Performance Measures & Satisfaction Collected directly from state level activity recipients, borrowers and participants. Program should identify area (education, employment, community living) for all potential respondents Performance measure questions should be presented with provided response options for affirmative selection Satisfaction rating choices should be presented for affirmative selection Documentation of responses should be recorded/filed Potential Red Flag: All performance measures are 100% and all satisfaction rating is highly satisfied. Unlikely especially with large N. Do procedures discourage certain responses? Artifact of data system function?

Ensuring Data Fidelity
Professional development
- Consistent, ongoing PD (training and technical assistance) is provided for everyone collecting and reporting data.
- Adult PD facts: one intensive training can be expected to result in about a 12% level of implementation with fidelity. Ongoing review, oversight, additional training/coaching, and technical assistance are necessary to increase fidelity.
Systematic data review
- Systematic review of data (monthly is best practice, quarterly at a minimum) to identify volume changes/concerns and fidelity issues.
- Develop and implement interventions quickly to address them.

Data System Management
- Grantees report aggregate data into the APR, but grantees need access to individual data records to ensure data accuracy and consistency.
- All of the fidelity checks identified require access to individual data records to find issues that need to be addressed.
- Grantees should ensure that any data system used internally or by contractors aligns with the APR parameters unique to each activity's data reporting, and that the data system structure and aggregation tables produce accurate numbers.
- Without access to individual data records, it is very difficult to provide sufficient oversight to ensure data fidelity.

APR Data Entry/Submission/Follow-up
Data entry/submission sequence (deadline December 31):
- Grantees enter data in NATADS; request an exception if the deadline cannot be met.
- Automatic validation rules are applied in NATADS.
- Manual fidelity data review (grantee and CATADA/AT3 staff).
- Certifying representative review/approval.
- Mark complete; data is locked by December 31.
- ACL review and approval.
Follow-up:
- Data clean-up and export to the CATADA web portal.
- Current APR posted to the program's CATADA profile page.
- In-depth review of current and historical data by AT3 staff for TA.

Next Steps
- TA materials available (more description than these slides).
- Add initial "red flag" fidelity checks into NATADS: not validation rules, but reminders of things to check.
- Pull previous-year data for 5 key elements and ask for an explanation of any significant change (a comparison sketch follows below): SFA recipients, reuse recipients, device loans, device demos, and training participants.
- Flags may look different for the aggregate APR vs. individual records in D2D.
Suggestions/Questions: diane.golden@ataporg.org, vance.dhooge@ataporg.org
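A minimal sketch of the previous-year comparison for the 5 key elements, assuming a 20% relative change counts as "significant"; that cutoff is not specified in the slides.

    # Compare current vs. previous year counts for the five key elements and
    # flag any change beyond a relative threshold (20% is an assumed cutoff).
    KEY_ELEMENTS = ["sfa_recipients", "reuse_recipients", "device_loans", "device_demos", "training_participants"]

    def significant_changes(previous: dict, current: dict, threshold: float = 0.20) -> dict:
        flagged = {}
        for key in KEY_ELEMENTS:
            prev, curr = previous.get(key, 0), current.get(key, 0)
            if prev and abs(curr - prev) / prev > threshold:
                flagged[key] = (prev, curr)   # grantee asked to explain the change
        return flagged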