Revamping the interview season: Considerations and lessons learned Charlotte Derr, MD, RDMS, FACEP Associate Program Director University of South Florida Emergency Medicine Residency Program
Considerations and lessons learned
- Interview invitation
- Scoring system and interview committee
- Feedback
Interview invitation
- Previously: the program coordinator sent out invitations and received phone calls to schedule interview dates
- Now: an interview broker with applicant self-scheduling
Scoring system
Previously the program used a letter grade system:
- A group (A+, A, A-)
- Even larger B group
- Some C's
- Few D's and F's
- DNR (do not rank) category
Scoring system
Previous evaluation method:
- Knowledge-based question
- Ethics question
- General questions about the applicant
- Multiple faculty involved; the program director interviewed all applicants
Scoring system
Problems with the letter grade system:
- Based on overall gut feeling; not terribly objective
- Large number of applicants in each "grade" level, with no good way to organize them within that group
- Applicants were compared after each interview date to place them in order of preference, but memory of earlier applicants would fade
Scoring system
A scoring tool to rank applicants by numeric value:
- Commonly used "metrics"
- Attributes related to emergency medicine
- Interview day observations
Scoring system
- Interview committee of the same individuals: 4 faculty and 1 senior resident
- Other faculty and residents participated in the social event and lunch
- Scoring was completed at the end of each day and entered into ERAS
- A rank list was created by the ERAS scoring system as each applicant was interviewed
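The scoring tool described above can be sketched as a simple composite calculation. This is an illustrative sketch only: the criterion names, point values, and equal weighting are assumptions, not the program's actual rubric.

```python
# Hypothetical sketch of a numeric interview scoring tool.
# Each applicant gets a score per criterion; the composite is the sum.
# Criteria and values below are made up for illustration.

def composite_score(scores: dict) -> float:
    """Sum the per-criterion scores into one composite value."""
    return sum(scores.values())

applicants = {
    "Applicant A": {"metrics": 8, "EM attributes": 9, "interview day": 7},
    "Applicant B": {"metrics": 9, "EM attributes": 7, "interview day": 9},
}

# Sort applicants by composite score, highest first, to form a rank list.
ranked = sorted(applicants.items(),
                key=lambda kv: composite_score(kv[1]),
                reverse=True)

for name, scores in ranked:
    print(name, composite_score(scores))
```

Because every interviewer scores the same numeric criteria, applicants from different interview days can be compared directly, addressing the fading-memory problem of the letter grade system.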
(Screenshots of the ERAS scoring setup) Scroll down to see more
Enter names of scoring criteria
To enter or view a score in an applicant's file, go to the "score" tab
Scroll down to the bottom to enter numerical values
Enter numerical values in each category
Sort by composite score to see ranked candidates
You can export to Excel and create fields of comparison
Click on Bulk Print Requests; a window will open. Select the file to view in Excel
Sort by composite score to see ranked candidates. The Excel document can be uploaded into your NRMP rank order list
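The sort-and-rank step above can be sketched in a few lines. The column names and scores here are made up for illustration; in practice the rows would come from the spreadsheet exported out of ERAS.

```python
# Illustrative sketch: sort exported applicant rows by composite score,
# highest first, to draft a rank order list.
# Column names ("Name", "Composite") are assumptions, not ERAS's actual
# export headers.

rows = [
    {"Name": "Applicant A", "Composite": "24"},
    {"Name": "Applicant C", "Composite": "21"},
    {"Name": "Applicant B", "Composite": "25"},
]

# Spreadsheet exports often store numbers as text, so convert before sorting.
rank_list = sorted(rows, key=lambda r: float(r["Composite"]), reverse=True)
rank_order = [r["Name"] for r in rank_list]
```

The same sort can of course be done directly in Excel; the point is that once scores are numeric, producing the draft rank order list is a one-step operation.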
Interview Feedback
- Previously: a self-addressed stamped envelope given to all applicants (5-year lapse)
- This year: a SurveyMonkey survey with new questions, sent only to unmatched applicants
36% response rate
Interview Feedback
Reasons for not ranking us higher:
- Felt more like I was being grilled on information
- Impersonal
- Interviewer read through my application for what seemed like the first time
- Difficult to connect with some of the interviewers / had awkward interviews
Interview Feedback
Reasons for not ranking us higher:
- Hours worked by residents
- Resident satisfaction
Interview Feedback
Reasons for not ranking us higher; perceived weaknesses:
- Did not market itself well on interview day
- Rough hours (12s vs. 8s)
- Department's reputation within the hospital
- Surgery runs most traumas except airway (need more involvement in procedures/running the trauma)
Interview Feedback
Suggestions for interview days:
- More time allocated to each interview
- Have more residents available
- Provide a pre-made folder for each participant
- Give us a chance to get to know you more
- This program is better than it is sold to applicants on interview day
How do you get the information you really want? Hartman Value Profile in the Resident Selection Process – Michael Harrington, MD
Future Directions
Reevaluate the interview scoring tool:
- Similar application "metrics"
- Review criteria
- Modify or remove interview day questions
Future Directions
Reevaluate the interview day:
- Program folder with faculty profiles
- Fewer impersonal questions; interviewers come prepared
- More resident involvement
- Survey matched applicants starting in July
- Marketing (suggestions?)
Future Directions
Resident perception and wellness:
- Our most important voice
- Evaluate shift hours
- Internal survey of residents
- Solicit feedback on the interview process and ways for them to contribute