
1 This Conversation May Be Recorded for Quality Purposes Fundamentals of a Call Observe Quality Assurance Program Jasmine Thomas Quality Assurance Specialist WellSpan Access Call Center

2 What should you be listening for? Establish the specific behaviors employees must practice based on departmental goals and expectations.

3 Wildly Important Goals
- Did the agent ask the caller for their preferred name and address the caller by that name throughout the call?
- Did the agent pull the caller/patient up in the scheduling system?
- Did the agent remind the caller to review their preparation instructions?

4
- Authenticates Two Identifiers
- Verifies Diagnosis/Symptoms/Dx Code(s)
- Encourages Caller to Register for Labs/Schedule Imaging Studies
- Clearly Identifies Arrival Instructions

5 What should you be listening for?
- Listens Actively
- Avoids Long, Unexplained Silent Pauses
- Uses Courteous Words & Statements
- Enunciates Clearly & Speaks Slowly for Ease of Understanding
Establish behaviors that are specific to a call center and/or customer service work environment.

6 Create an Evaluation Form!!
- Does Not Meet
- Partially Meets
- Meets
- Exceeds
- Not Applicable
Determine how you are going to rate each behavior.

7 Define Each Behavior
Checks Scheduling System for Additional Appointments to be Registered
- Does Not Meet: Agent fails to pull the caller up in the scheduling system, or pulls the caller up but does not complete the registration for any future visits occurring within the next 30 days.
- Partially Meets: N/A
- Meets: Agent pulls each and every caller up in the scheduling system, identifies any future visits occurring within the next 30 days, and completes the registration for said visit(s).

8 Define Each Behavior
Listens Actively
- Does Not Meet: While completing the registration, the agent does not give the caller their full attention, continuously interrupts the caller while they are speaking, and continuously asks the caller to repeat themselves.
- Partially Meets: Agent only gives the caller their undivided attention during parts of the registration. During other parts, the agent is noticeably distracted and/or asks the caller to repeat themselves.
- Meets: Agent gives the caller their undivided attention, is able to pull cues from the caller/conversation to assist in the registration, and listens without interrupting the caller.

9 Categorize the Behaviors
- First Impression/Great Greeting
- Telephone Skills
- Identifying Information
- Customer Experience
- Handoffs
- Closing
- Bonus (how to achieve an Exceeds)

10 Develop a Scoring System/Scoring Key
Percentage bands:
- Does Not Meet: below 75%
- Partially Meets: 75 – 84.99%
- Meets: 85 – 100%
- Exceeds: above 100% (possible via bonus items)
Point bands:
- Does Not Meet: 0 – 43
- Partially Meets: 44 – 49
- Meets: 50 – 55
- Exceeds: 56+
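The point-based key on this slide can be expressed as a small lookup, e.g. in Python. This is a minimal sketch; the function name is ours, and the bands are read directly off the point ranges above (scores past the 55-point "Meets" ceiling, such as those boosted by bonus items, land in "Exceeds"):

```python
def rating_from_points(points):
    """Map a raw evaluation score to a rating band.

    Bands follow the point-based scoring key on this slide.
    """
    if points <= 43:
        return "Does Not Meet"
    elif points <= 49:
        return "Partially Meets"
    elif points <= 55:
        return "Meets"
    return "Exceeds"

print(rating_from_points(52))  # Meets
```

Encoding the key this way also makes calibration easier: every evaluator's spreadsheet or tool maps points to ratings identically.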

11

12

13 QA Specialist Best Practices
- Create a Summary
- Make sure your team understands the behaviors and how they are being evaluated
- Keep your team privy to any changes/updates
- Provide timely feedback
- Schedule routine coaching
- Calibrate!!!

14 Questions???

15 Sample Calls…

16 Well THAT Didn’t Go Well … Lessons Learned from a Project Failure Virginia Robbins Director, Patient Access Penn State Hershey Medical Center

17 The Project
- Implement automated eligibility checking for:
  - Ambulatory pre-registration
  - ED registration
- Let's call it product "Auto Elig" from Vendor X

18 Some Key Facts
- Fully employed physician model
- One Patient/One Chart, OP and IP
- Ambulatory Pre-Registration
  - 5000 visits per day
  - Team Manager: 6 months in position and new to leadership
- ED Registration
  - 200 visits per day
  - Manager: 1 year in position, 4 years in Access leadership

19 Team Members
- IT Analysts
  - Role is to understand software
  - Meet system requirements
- Access Team Manager/Manager
  - Running complex teams and meeting operational business needs
- Director
  - Simultaneously responsible for Access evaluation of new Revenue Cycle software

20 Project Goals
1. Ambulatory:
   - Eliminate manual eligibility verification of 5000 visits/day
   - Automate the process
   - Work exceptions only
   - Increase accuracy, reduce time, reduce staff
2. ED:
   - Replace multi-site verification with a single site
   - Increase accuracy, reduce time, no staffing impact

21 Ambulatory Pre-Registration Process
1. Create encounter in billing system to initiate billing process
2. Set "Precert" to Y if service could potentially need pre-certification
3. Has patient been seen within 6 months?
   - Yes: verify insurance eligibility
   - No: call patient, perform full registration, verify insurance eligibility

22 Pre-Registration Process – Cont'd
4. Copy co-pay information from insurance eligibility source to billing software for practice site staff at patient arrival
5. Set "Referral" to Y if patient's insurance requires a referral for specialist care

23 Downstream Processes – Patient Access
- Centralized Pre-Certification team pulls all "Pre-Cert = Y" encounters, reviews benefits, and initiates the pre-certification process if needed.
- Centralized Referral team writes PSHMC referrals and calls non-PSHMC PCPs to obtain referrals.

24 Downstream Processes – Practice Sites
- Patient arrives for care, checks in
- Pre-Registration complete; reduces registration time
- Pre-Cert and/or referral, if needed, are in place
- Co-pay displayed in billing system for easy collection

25 Project Progression
- Director unable to be present due to competing project requirements with Revenue Cycle software evaluation (Red Flag #1)
- Vendor X spent 6 hours with Pre-Registration and Pre-Certification managers to review current procedures in detail
- IT not present (Red Flag #2)

26 Project Progression
- Vendor X returns with project plan based upon their standard processes and without reference to our current processes (Red Flag #3)
- Step 1: implement new tool without automation, i.e., replace today's multiple web sites with the vendor's one web site

27 Project Progression
- Interfaces built and mapping done to send billing data to Vendor X's Auto Elig website
- Pre-Registration Associates trained in Auto Elig
- Ready for Go Live!

28 GO LIVE Results
- Vendor's automatic "push" of the encounter to the Auto Elig website occurs up to 7 days prior to date of service
- We validate at 25 days prior to date of service to:
  - Allow time to obtain Pre-Certification
  - Allow time to write Referral
- We had to do a manual push of data, encounter by encounter

29 GO LIVE Results
- Manual push takes 30-40 seconds longer to process than the previous websites
- 30 sec × 5000 lookups daily ≈ 41.7 extra hours/day ≈ 5.7 FTEs' worth of additional processing time
- This is progress?!?
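The staffing math on this slide, spelled out. We take the lower bound of the 30-40 second range; the productive-hours-per-FTE figure is an assumption inferred by working backward from the slide's 5.7-FTE result, not something the slide states:

```python
extra_sec_per_lookup = 30            # lower end of the 30-40 s range
lookups_per_day = 5000

extra_hours = extra_sec_per_lookup * lookups_per_day / 3600
print(round(extra_hours, 1))         # ~41.7 extra hours/day

# The 5.7-FTE figure implies ~7.3 productive hours per FTE per day
productive_hours_per_fte = 7.3       # assumption, inferred from the slide
ftes = extra_hours / productive_hours_per_fte
print(round(ftes, 1))                # ~5.7 FTEs
```

At the 40-second upper bound the same arithmetic gives roughly 55.6 extra hours per day, so the slide's figure is the conservative case.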

30 It Gets Worse …
- Even if Auto Elig could check at 25 days (which it cannot), it also cannot return the co-pay to the billing system in the same place and format that the Practice Site Check-in Staff (500 people) expect to see
- Stakeholder base just exploded beyond Patient Access … and they weren't included in the project.

31 Time for Discussion
- Vendor:
  - PSHMC is validating at 25 days instead of 7 days as all other clients do
  - 7 days is best practice
  - PSHMC must change
- PSHMC:
  - Very willing to change processes but …
  - How do other clients do pre-certification?

32 Discussion continues …
- Vendor:
  - Ummm …
- Answer:
  - Ordering physicians' offices obtain Pre-Certification
  - No other clients with a centralized Pre-Certification team because …
  - No other clients with a fully employed physician model

33 Ambulatory Lessons Learned
- Vendor:
  - Did not identify what is unique to PSHMC despite a six-hour review of current processes
- Lessons:
  - Vendor assumptions may be invalid
  - Assume you know more than the vendor
  - Question everything
  - Put the brakes on if not satisfied with the information provided

34 Ambulatory Lessons Learned (Cont'd)
- Operations leadership:
  - Pre-Registration Team Manager too green to see where the new process was lacking
  - Director was not present; conflicting projects
- Lessons:
  - Don't exceed bandwidth
  - Balance operational needs and new technology implementation

35 Ambulatory Lessons Learned (Cont'd)
- Operations/IT Intersection:
  - IT did not need to understand current processes to implement the new product
  - Operations team did not understand the "techie talk"
- Lessons:
  - Need a project leadership resource
    - No operational duties
    - Manages the vendor relationship

36 Meanwhile, over in the ED …
- All is working well
- Why?
- ED visits are for today, so the encounter flows to Auto Elig automatically (trigger is 7 days or younger)
- ED Registration Associates pull the ED co-pay and continue to manually add it to the billing system

37 ED Benefits
- One source of eligibility verification instead of the previous multiple sites
- ED co-pay is available at check-out per the previous process; no change
- Time savings per registration, but volume is too low to impact staffing (ED: 200 visits per day over 24 hours vs. Ambulatory: 5000 visits per day over 8 hours)

38 So Where Do We Go From Here?
- Coming soon to a theatre near you: Automated Eligibility, Take II
- What's different this time …
  - Project Manager is assigned
  - Director is leading the Operations project team
  - Solved the 7-day/25-day barrier
  - Practice Site leadership on board – staff will learn how to use Auto Elig

39 Proposed New Flow for Ambulatory
- Create encounter at 25 days prior to date of service (no change)
- Manually validate eligibility at 25 days only if Pre-Cert = Y (20%)
- Automatic eligibility verification at 7 days for all other services (80%)
- Exception worklists for the 80%
- Practice Site: get co-pays in Auto Elig

40 QUESTIONS?

41 Getting it Right Up Front: Registration Accuracy

42 TAKING A CLOSER LOOK: REGISTRATION ACCURACY
We will take a look at how to ensure registration accuracy from three different perspectives:
- Registrar
- Department
- Organization Quality Assurance Process

43 RESEARCH ON ACCURACY
- 30-40% of denials are caused by registration errors, representing as much as 0.5% in lost revenues
- Hospitals in the top quartile have a registration accuracy rate of 97%, compared to the national average of 93%

44 GETTING IT RIGHT UP FRONT: REGISTRAR
- Use all available resources
  - Help documents
- Follow an established workflow
  - A workflow is a set of tasks, grouped chronologically, that are necessary to accomplish a given goal
  - The goal is to maximize efficiency and accuracy

45 GETTING IT RIGHT UP FRONT: DEPARTMENT
- Department-specific policies and procedures
  - Necessary as a resource and to hold staff accountable
  - Have staff review policies and procedures annually (including clinical staff who perform registration tasks)
- Develop a workflow
  - Have new staff trained using the workflow
  - Choose training staff wisely

46 GETTING IT RIGHT UP FRONT: ORGANIZATION QUALITY ASSURANCE
- Resources
  - Help documents
  - Email distribution lists
  - Hot Tips
  - Monthly Registration Newsletter
- Bill Edits
  - System generated
  - Customized based on registration errors

47

48 GETTING IT RIGHT UP FRONT: ORGANIZATION QUALITY ASSURANCE
- Maintain a Quality Assurance (Audit) Process
  - To monitor and track registration errors to identify necessary improvements to the registration process
  - To improve the education and enhance the accountability of registrars in order to achieve the highest standards of registration accuracy

49 GETTING IT RIGHT UP FRONT: ORGANIZATION QUALITY ASSURANCE
- INTERNAL AUDITS
  - Monitor and correct accounts on daily reports
  - System-generated reports
  - Custom reports
- EXTERNAL AUDITS
  - Billing and Registration Departments

50 REGISTRATION ACCURACY STANDARDS
The parameters of the Accuracy Standards policy should be clearly defined and consistent across all registration departments and should include:
- Registration Quality Assurance Referral Legend
- Accuracy Rates

51 REGISTRATION ACCURACY STANDARDS
- Registration Quality Assurance Referral Legend
  - Lists errors and the assigned point values
  - Point values range from 1 to 5
    - Incorrect Location = 1 point
    - ABN not presented to the patient (Compliance) = 5 points
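One way a referral legend like this could feed an accuracy rate, sketched under assumptions the slide does not state: the flat 100-point base per audited account and the retained-points formula are illustrative choices of ours, not the policy's actual method, and only the two example errors from the legend are included.

```python
# Point values taken from the legend above; a real legend lists more error types.
ERROR_POINTS = {
    "incorrect_location": 1,
    "abn_not_presented": 5,   # compliance error
}

def accuracy_rate(accounts_audited, errors, points_per_account=100):
    """Percent of possible points retained after error deductions.

    points_per_account=100 is an illustrative assumption, not policy.
    """
    deducted = sum(ERROR_POINTS[e] for e in errors)
    total = accounts_audited * points_per_account
    return round(100.0 * (1 - deducted / total), 2)

print(accuracy_rate(10, ["incorrect_location", "abn_not_presented"]))  # 99.4
```

Weighting errors this way lets a 5-point compliance miss pull the rate down five times harder than a 1-point location error, which matches the intent of the legend.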

52 QUALITY ASSURANCE FEEDBACK
- Automated email sent directly to the registrar
  - Explains the error
  - Provides the corrections that were made
  - References a help document (when necessary)

53 REGISTRATION ACCURACY STANDARDS

54 QUESTIONS?

