Data Fidelity Preconference
Dr. Diane Cordry Golden
March 2019
Device Demonstration
A demonstration event has one AT type (even with multiple devices), one or more participants, one access (decision-making) performance measure, and multiple satisfaction ratings (one from each participant). Best practice is multiple devices within an AT type compared and contrasted, reported as one demo event. A different AT type is a separate demo event.
Demonstration: Potential Red Flags
- Demo event count (by AT type) always equals participant count
  - Never more than one participant per demo? Reporting choice vs. accurate?
- Average participant number is 4x the demo event count
  - Large participant numbers are unlikely in all demos; over 4 for an individual demo is suspicious, and over 10 is likely not a demo.
- PWD participants > demo events x 2
  - Even 2 person-with-a-disability (PWD) decision-makers in 1 demo is unlikely.
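These checks can be expressed as a validation pass over individual demo records. A minimal sketch, assuming each record is a dict with hypothetical "participants" and "pwd_participants" counts; this is an illustration, not the NATADS record format.

```python
# Minimal sketch of the demonstration red-flag checks above.
# The record shape (dicts with "participants" and "pwd_participants"
# counts) is a hypothetical illustration, not the NATADS format.

def demo_red_flags(events):
    """Return warning strings for suspicious demonstration data."""
    flags = []
    if not events:
        return flags
    total = sum(e["participants"] for e in events)
    total_pwd = sum(e.get("pwd_participants", 0) for e in events)
    if all(e["participants"] == 1 for e in events):
        flags.append("Every demo has exactly one participant: reporting choice or accurate?")
    if total > 4 * len(events):
        flags.append("Average participants per demo exceeds 4: suspicious.")
    if total_pwd > 2 * len(events):
        flags.append("PWD participants exceed 2x demo events: unlikely decision-maker counts.")
    for i, e in enumerate(events):
        if e["participants"] > 10:
            flags.append(f"Event {i}: over 10 participants, likely not a demo.")
    return flags

print(demo_red_flags([{"participants": 12, "pwd_participants": 3}]))
```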
Short-term Device Loan
One device loan event has one borrower type, can have multiple devices (usually in one AT type), one performance measure (access or acquisition), and one satisfaction rating. Set a short-term loan period appropriate for the purpose.
- Majority decision-making purpose (80% or more)
- Minority other purposes, each a confirmed short-term event:
  - Event-specific accommodation
  - Loaner while waiting for repair/funding
  - Professional development/training event
- Different from an open-ended or unspecified loan period, e.g., loan of a device for use post hospital discharge with no clear end date for when the device is no longer needed.
Device Loan: Potential Red Flags
- Majority purpose is not decision-making
  - How is decision-making supported for complex AT?
- Loan period by policy is more than 35 days
  - Longer loan periods suggest events are not set up as short-term; paired with accommodation as the majority purpose, this suggests more open-ended loans.
- Borrower count always equals device count
  - Typically some decision-making loans will have more than one associated device. Is this an artifact of the data system, of procedures allowing only one device out to a borrower at a time, or something else?
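The loan checks can be sketched the same way. A minimal illustration, assuming each loan record carries a hypothetical "purpose" string and "device_count", with the policy loan period passed in as a number of days:

```python
# Sketch of the device loan red-flag checks; the record layout
# (purpose strings, device_count) is an assumption for illustration.

DECISION_MAKING = "decision-making"

def loan_red_flags(loans, policy_loan_period_days):
    flags = []
    if loans:
        dm_share = sum(l["purpose"] == DECISION_MAKING for l in loans) / len(loans)
        if dm_share < 0.80:
            flags.append(f"Decision-making is only {dm_share:.0%} of loans (expect 80% or more).")
        if all(l["device_count"] == 1 for l in loans):
            flags.append("Borrower count always equals device count: data system artifact?")
    if policy_loan_period_days > 35:
        flags.append("Policy loan period over 35 days suggests loans are not set as short-term.")
    return flags

print(loan_red_flags([{"purpose": "accommodation", "device_count": 1}],
                     policy_loan_period_days=60))
```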
Reuse
A reuse event has one recipient and can have multiple devices in one or more AT type categories, with one acquisition performance measure and one satisfaction rating.
- Recipients can be duplicated: e.g., a person who acquires reused devices and returns 6 months later to obtain additional devices is typically reported as 2 reuse events.
- The data system must allow one recipient to have multiple associated devices acquired within an AT type or across AT types.
Reuse: Potential Red Flags
- Devices reported with zero retail value
  - Retail value must be $1 or more if a device number is reported for an AT type.
- Average retail price per device is very low
  - A mean retail price of only a few dollars per device suggests the devices may be supplies or consumables rather than reused AT.
- Recipient count always equals device count
  - Typically some recipients will get more than one device unless there is something unusual about the reuse program (limited AT types, etc.). An artifact of the data system?
- Average device count per recipient is very large
  - 2-3 devices per recipient is not unusual, but 5-10 per recipient, especially within one AT type, is highly unusual.
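A minimal sketch of these reuse checks, assuming each recipient is represented as a list of device retail values in dollars; the $5 cutoff for a "very low" mean price is an assumed threshold, not a stated rule:

```python
# Sketch of the reuse red-flag checks. The per-recipient shape (a list
# of device retail values) and the $5 mean-price cutoff are assumptions.

def reuse_red_flags(recipients):
    """recipients: list of lists; each inner list holds device retail values ($)."""
    flags = []
    devices = [v for r in recipients for v in r]
    if any(v < 1 for v in devices):
        flags.append("Device reported with zero retail value (must be $1 or more).")
    if devices and sum(devices) / len(devices) < 5:
        flags.append("Mean retail price of a few dollars: supplies/consumables, not reused AT?")
    if recipients and all(len(r) == 1 for r in recipients):
        flags.append("Recipient count always equals device count: data system artifact?")
    if any(len(r) >= 5 for r in recipients):
        flags.append("5-10 devices for one recipient is highly unusual.")
    return flags

print(reuse_red_flags([[0, 2, 3], [1], [2, 2, 2, 2, 2]]))
```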
State Financing
A financial loan has multiple data elements for each loan made, with one or more AT devices, one acquisition performance measure, and one satisfaction rating.
An other state financing (direct provision or savings) event has one recipient with one or more AT devices, one acquisition performance measure, and one satisfaction rating. Other SFA direct provision or savings can be reported as one activity or separately by different activities.
State Financing: Potential Red Flags
- Financial loan: highest/lowest values and frequency distribution are not mathematically possible
  - Highest/lowest values and frequency distributions are reported for approved applicant incomes and interest rates.
  - It is not possible to have a highest income of $80,000 but nothing reported in the "$75,000 or more" frequency cell.
  - It is not possible to have a frequency distribution grouped in one range whose sum is much larger or smaller than the grouping suggests.
- Other SFA savings: devices reported with zero retail value
  - Any device reported must have a retail value of $1 or greater.
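The highest-value consistency check can be stated precisely: the reported highest (and lowest) value must fall in a frequency bucket with a nonzero count. A minimal sketch, with bucket bounds that mirror the $75,000-or-more example rather than the actual APR income ranges:

```python
# Sketch of the mathematical-consistency check for financial loan data:
# the reported highest/lowest value must land in a nonempty frequency
# bucket. Bucket bounds here are illustrative only.

def freq_consistent(highest, lowest, buckets):
    """buckets: (lower_bound, count) pairs, sorted ascending; last is open-ended."""
    def count_for(value):
        count = 0
        for lower, n in buckets:
            if value >= lower:
                count = n  # keep the count of the last bucket the value reaches
        return count
    return count_for(highest) > 0 and count_for(lowest) > 0

# Highest income of $80,000 but an empty "$75,000 or more" cell: inconsistent.
print(freq_consistent(80_000, 20_000, [(0, 5), (25_000, 3), (75_000, 0)]))  # False
```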
Training
- Total participants by type = total participants by geographic distribution = total participants by topic (this cross-total check is sketched in code after the red flags below).
- Accessible ICT training participants have one performance measure each.
- Transition training or technical assistance is mandatory.
  - If the transition topic has participants reported, there must be a narrative description; if a transition narrative is provided, there must be an associated participant number on the transition topic line.
Training: Potential Red Flags
- Total accessible ICT participants very low (10 or less)
  - A sufficient N is helpful for performance measure stability.
- Accessible ICT narrative describes AT training, not ICT accessibility training
  - Performance measures relate to changing policies, procedures, and practices around ICT accessibility (e.g., web accessibility, procurement of accessible ICT, and similar) or supportive turn-around training. AT training is inconsistent with the desired performance measure outcomes.
- Large portion of participants cannot be categorized
  - Unable to categorize by type and/or geography unknown. Possibly a public awareness/expo event rather than training?
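A minimal sketch of the training cross-total and transition-pairing checks; the dict keys ("transition", "accessible_ict", and the type/geography categories) are hypothetical labels, not APR field names:

```python
# Sketch of the training checks: participants counted by type, by
# geography, and by topic must sum to the same total, and the transition
# topic line and narrative must appear together. Key names are assumed.

def training_red_flags(by_type, by_geo, by_topic, transition_narrative):
    flags = []
    if len({sum(by_type.values()), sum(by_geo.values()), sum(by_topic.values())}) != 1:
        flags.append("Totals by type / geography / topic do not match.")
    has_transition = by_topic.get("transition", 0) > 0
    if has_transition and not transition_narrative:
        flags.append("Transition participants reported but no narrative description.")
    if transition_narrative and not has_transition:
        flags.append("Transition narrative provided but no participant number on that line.")
    if by_topic.get("accessible_ict", 0) <= 10:
        flags.append("Total accessible ICT participants very low (10 or less).")
    return flags

print(training_red_flags({"educators": 30}, {"metro": 25, "nonmetro": 5},
                         {"transition": 30}, transition_narrative=""))
```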
Technical Assistance
- Technical assistance is provided to agencies/programs and is designed to improve laws, policies, practices, services, and outcomes.
- Transition training or technical assistance is mandatory.
- Red flag: the TA narrative describes TA provided to an individual. By definition, TA is not provided to an individual.
State Improvement Outcomes
- State improvement outcomes are usually associated with "systems change" initiatives and/or TA; the end result is improved/expanded laws, policies, practices, or programs.
- Not mandatory, but reporting none can be seen as less than positive.
- Red flag: the narrative describes an initiative but no outcome is identified. If a changed law, policy, practice, or program cannot be clearly identified, the initiative is likely TA with the outcome not yet realized.
Leveraged Funding
- Non-AT Act dollars that flow to the grantee, used to support authorized AT Act activities.
- Does not include dollars that flow to contractors or in-kind contributions.
Potential red flags:
- Dollars are reported in Section B. Section B is limited to leveraged funding that supports AT Act authorized activities not included in the State Plan, and there is almost no reason not to include authorized activities in the State Plan.
- Large amount of "Federal" leveraged funding. The federal source is limited to direct federal grants, which are fairly rare; more likely this is a public/state agency flow-through of federal dollars.
Performance Measures & Satisfaction
- Collected directly from state-level activity recipients, borrowers, and participants.
- The program should identify the area (education, employment, community living) for all potential respondents.
- Performance measure questions should be presented with the provided response options for affirmative selection.
- Satisfaction rating choices should be presented for affirmative selection.
- Documentation of responses should be recorded/filed.
- Potential red flag: all performance measures are 100% and every satisfaction rating is "highly satisfied." This is unlikely, especially with a large N. Do procedures discourage certain responses? Is it an artifact of data system function? (A uniform-response check is sketched below.)
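A minimal sketch of that uniform-response flag; the N threshold of 30 is an assumed cutoff for "large N", not a stated rule:

```python
# Sketch of the uniform-response check: all-identical performance
# measures or satisfaction ratings grow less plausible as N grows.
# The n_threshold value of 30 is an assumption.

def uniform_response_flag(responses, n_threshold=30):
    """responses: list of satisfaction ratings or performance-measure answers."""
    if len(responses) >= n_threshold and len(set(responses)) == 1:
        return (f"All {len(responses)} responses are identical: do procedures "
                "discourage certain responses, or is this a data system artifact?")
    return None

print(uniform_response_flag(["highly satisfied"] * 120))
```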
Ensuring Data Fidelity
Professional development
- Consistent, ongoing PD (training and technical assistance) provided for everyone collecting and reporting data.
- Adult PD facts: one intensive training can be expected to result in about a 12% level of implementation with fidelity. Ongoing review, oversight, additional training/coaching, and technical assistance are necessary to increase fidelity.
Systematic data review
- Systematic review of data (monthly is best practice, quarterly at minimum) to identify volume changes/concerns and fidelity issues. Develop and implement interventions quickly to address them.
Data System Management
- Grantees report aggregate data into the APR but need access to individual data records to ensure data accuracy and consistency.
- The fidelity checks identified here all require access to individual data records to identify issues that need to be addressed.
- Grantees should ensure that any data system used internally or by contractors aligns with the APR parameters unique to each activity's data reporting, and that the data system structure and aggregation tables produce accurate numbers.
- Without access to individual data records, it is very difficult to provide sufficient oversight to ensure data fidelity.
APR Data Entry/Submission/Follow-up
Data entry/submission sequence (deadline December 31):
- Grantees enter data in NATADS; request an exception if the deadline cannot be met.
- Automatic validation rules are applied in NATADS.
- Manual fidelity data review (grantee and CATADA/AT3 staff).
- Certifying representative review/approval; marked complete, data is locked by December 31.
- ACL review and approval.
Follow-up:
- Data clean-up and export to the CATADA web portal; current APR posted to the program's CATADA profile page.
- In-depth review of current/historical data by AT3 staff for TA.
Next Steps
- TA materials available (more description than the slides).
- Add initial "red flag" fidelity checks into NATADS: not validation rules, but reminders of things to check.
- Pull previous-year data for 5 key elements and ask for an explanation of any significant change: SFA recipients, reuse recipients, device loans, device demos, and training participants (a year-over-year sketch follows below).
- Flags may look different for the aggregate APR vs. individual records in D2D.
Suggestions/questions: diane.golden@ataporg.org, vance.dhooge@ataporg.org
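The year-over-year comparison could look like the following. A minimal sketch; the 20% "significant change" threshold and the element key names are assumptions, since the slide does not define what counts as significant:

```python
# Sketch of the previous-year comparison for the five key elements.
# The 20% threshold and the key names are illustrative assumptions.

KEY_ELEMENTS = ["sfa_recipients", "reuse_recipients", "device_loans",
                "device_demos", "training_participants"]

def significant_changes(prev_year, curr_year, threshold=0.20):
    """Return {element: (old, new)} where the relative change exceeds threshold."""
    flags = {}
    for key in KEY_ELEMENTS:
        old, new = prev_year.get(key, 0), curr_year.get(key, 0)
        if old and abs(new - old) / old > threshold:
            flags[key] = (old, new)
    return flags

print(significant_changes({"device_demos": 100}, {"device_demos": 160}))
```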