1 A Dashboard Report: Value Added Through Drill-Downs, Peer Comparisons, and Significance Tests
Mary M. Sapp, Ph.D.
Assistant Vice President, Office of Planning & Institutional Research
University of Miami
SAIR, October 24, 2005

2 Definition of Dashboard
A Dashboard is a visual display of Key Performance Indicators (KPIs) presented in a concise, intuitive format that allows decision makers to monitor institutional performance at a glance.

3 Dashboard Characteristics
• Provides visual display of important Key Performance Indicators (KPIs)
• Uses concise, intuitive, “at-a-glance” format (icons and colors)
• Offers high-level summary (reduces voluminous data)
A display of “gauges” to monitor key areas

4 Uses of a Dashboard
• Provides quick overview of institutional performance
• Monitors progress of the institution over time (trends)
• Alerts user to problems (colors indicate positive/negative data)
• Highlights important trends and/or comparisons with peers
• Allows access to supporting analytics when needed to understand KPI results (drill-down)

5 Predecessors & Related Approaches
• Executive Information Systems, and their predecessor, Decision Support Systems
• On-Line Analytical Processing (OLAP), associated with data warehouses
• Balanced scorecards
• Key success factors
• Benchmarking
• Key performance indicators

6 Why Do a Dashboard?
Senior managers:
• Want to monitor institutional performance
• Are very busy, with little time to study reports
• Value reports that clearly show conclusions
• Appreciate an overview, with indicators from different areas in one place
• Use both trends and peer data
“What you measure is what you get.” – Robert S. Kaplan and David P. Norton

7 Impetus for Next Generation Dashboard Report
• Session at Winter 2004 HEDS conference
• Representatives from 4 HEDS institutions shared dashboard reports
• Presentation & discussion prompted ideas about features that might add value
• Dashboard report described here developed as a result
An example of how a conference session led to a project that would not have been done otherwise

8 Characteristics of Dashboards Presented at HEDS Conference
• All used a single page (though some had a 2nd page for definitions & instructions)
• All presented trend data (changes over 1, 5, 6, and 10 years)
• All used up/down arrows, icons, or “Up”/“Down” to show direction of trends
• Three displayed minimum and maximum values for the trend period
• Three used colors to show whether trends were positive or negative
• One used peer data

9 Questions Generated by HEDS Dashboards, and Proposed Solutions
1. Concise or detailed?
• HEDS: Laments about not being able to provide more detail (“senior administrators should want to see more”)
• Reaction: Sympathized with that viewpoint, but have learned most senior administrators want summaries, not detail
• Next Generation Dashboard: Keep the concise format, plus links to optional graphs & tables

10 Issues that Came Up in Discussion at HEDS, and UM Solutions
2. Trends or peer data?
• HEDS: All four dashboards used trend data; one also used peer data
• Reaction: UM values peer data to support benchmarking
• Next Generation Dashboard: Use both peer and trend data

11 Issues that Came Up in Discussion at HEDS, and UM Solutions
3. When should an icon for a trend or a difference from peers appear?
• HEDS: Dashboards seemed to display icons for all non-zero differences
• Reaction: Didn’t want small differences to be treated as real changes
• Next Generation Dashboard: Use p-values from regression and t-tests to control the display of icons for trends and peer comparisons (a sketch of this gating logic follows)
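To make the gating idea concrete, here is a minimal sketch in Python. The actual report was built in an Excel spreadsheet with macros, so the function name, data, and the 0.05 threshold here are assumptions: a trend arrow is shown only when the regression slope over the trend period differs significantly from zero.

```python
# Minimal sketch of significance-gated trend icons (illustrative only;
# the actual dashboard implemented this logic in an Excel spreadsheet).
from scipy.stats import linregress

def trend_icon(years, values, alpha=0.05):
    """Return 'up', 'down', or 'flat' for a KPI trend.

    An arrow appears only when the regression slope is significantly
    different from zero; otherwise small wobbles are shown as 'flat'
    rather than treated as real changes.
    """
    fit = linregress(years, values)
    if fit.pvalue >= alpha:          # slope not significantly non-zero
        return "flat"
    return "up" if fit.slope > 0 else "down"

# Illustrative six-year series (made-up numbers)
years = [2000, 2001, 2002, 2003, 2004, 2005]
retention = [88.1, 88.4, 89.0, 89.3, 90.1, 90.6]
print(trend_icon(years, retention))  # -> 'up'
```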

12 Issues that Came Up in Discussion at HEDS, and UM Solutions
4. Include minima and maxima?
• HEDS: Three displayed minima and maxima over the trend period
• Reaction: UM’s senior VP decided this was too cluttered
• Next Generation Dashboard: Shows trends of own institution and the 25th and 75th percentiles of peers, with no maxima or minima (sketched below)
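For illustration, computing the peer band plotted alongside the institution's own trend line (the 25th and 75th percentiles across peers in a given year) is a one-liner; the peer values below are made up.

```python
# Illustrative computation of the peer band (25th and 75th percentiles)
# shown on the trend graph instead of minima and maxima.
import numpy as np

peer_values = [84.2, 85.0, 86.1, 87.3, 88.0, 88.8,
               89.5, 90.2, 90.9, 91.4, 92.0, 93.1]  # 12 peers, one year
p25, p75 = np.percentile(peer_values, [25, 75])
print(f"peer band: {p25:.1f} to {p75:.1f}")
```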

13 Unique Aspects of Next Generation Dashboard
• Provides drill-down links to graphs and tables for more detail, if desired
• Provides peer data in addition to trends
• Uses regression (rather than maxima and minima) to determine direction of trends
• Uses statistical significance of the slope (rather than just the difference) to generate trend icons
• Uses t-tests to generate the peer comparison summary
Functions like adding a Global Positioning System (GPS) to your dashboard

14 Implementation
• Two dashboard reports: student indicators and faculty/financial indicators
• 17 and 21 KPIs, respectively, on a single page
• Box for each KPI with current value, arrows to show trends, and text to show relation to peers
• Links to more detailed graphs and tables

15 Indicator Display
Upper left corner:
• Up arrow, down arrow, or horizontal line
• Shows direction of UM trend for the last 6 years
• Up vs. down vs. flat based on slope of regression & p-values (see the sketch below)
• Color based on desired outcome
• Link to graph with trends for UM and 25th & 75th percentiles for peers
The box itself shows the KPI’s current value
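A sketch of the display rule just described, assuming a simple encoding: the direction comes from the significance-gated slope test above, and the color depends on whether that direction is the desired outcome for the KPI. The glyphs and names here are illustrative, not UM's macro code.

```python
# Sketch of mapping a trend direction plus the KPI's desired outcome
# to the arrow glyph and color described on this slide (assumed names).
ARROWS = {"up": "\u2191", "down": "\u2193", "flat": "\u2014"}

def indicator_display(direction, desired):
    """direction: 'up'/'down'/'flat' from the slope test;
    desired: 'up' or 'down', whichever is good for this KPI."""
    if direction == "flat":
        return ARROWS["flat"], "black"        # neutral: no real change
    color = "green" if direction == desired else "red"
    return ARROWS[direction], color

print(indicator_display("up", "up"))    # ('↑', 'green')  e.g., retention
print(indicator_display("up", "down"))  # ('↑', 'red')    e.g., acceptance rate
```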

16 Indicator Display
Upper right corner:
• Shows relation to 12 peers
• “Above Peers” vs. “Below Peers” vs. “Mid. of Peers” based on t-tests (UM vs. mean of peers); a sketch follows
• Color based on desired outcome
• Link to table with five years of data for UM and peers
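Here is a minimal sketch of the peer-comparison label, assuming the t-test takes the one-sample form (does the peer mean differ from UM's value?); the author's spreadsheet may compute it differently, and the threshold and data are assumptions.

```python
# Sketch of the significance-gated peer comparison label (assumed form:
# one-sample t-test of the 12 peer values against the institution's value).
from scipy.stats import ttest_1samp

def peer_label(own_value, peer_values, alpha=0.05):
    """Return 'Above Peers', 'Below Peers', or 'Mid. of Peers'."""
    result = ttest_1samp(peer_values, popmean=own_value)
    if result.pvalue >= alpha:           # no significant difference
        return "Mid. of Peers"
    peer_mean = sum(peer_values) / len(peer_values)
    return "Above Peers" if own_value > peer_mean else "Below Peers"

peers = [84.2, 85.0, 86.1, 87.3, 88.0, 88.8,
         89.5, 90.2, 90.9, 91.4, 92.0, 93.1]
print(peer_label(95.0, peers))   # -> 'Above Peers' (made-up numbers)
```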

17 Macros
Used to display:
• Data for the year chosen
• Direction of arrow icons
• Color of arrows (green for positive, red for negative, black for neutral)

18 Spreadsheet
• Dashboard developed using an Excel spreadsheet, with one sheet for the dashboard report and one sheet for each indicator (graph, peer data, and raw data); a rough analogue of the layout is sketched below
• Macro updates the year and controls display of the arrow icons (direction and color)
• Spreadsheet with a template for the dashboard and instructions for customizing it shared upon request (leave a card)
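As a rough analogue of the workbook layout, in Python rather than the Excel/VBA actually used, and with a hypothetical file name and sheet names: the one-page dashboard sheet sits alongside one supporting sheet per indicator.

```python
# Rough Python analogue of the workbook structure: one dashboard sheet
# plus one sheet per indicator (graph, peer data, and raw data).
# "dashboard.xlsx" and the sheet names are hypothetical.
import pandas as pd

sheets = pd.read_excel("dashboard.xlsx", sheet_name=None)  # dict of sheets
for name, frame in sheets.items():
    if name == "Dashboard":
        continue                      # the one-page report itself
    print(name, frame.shape)          # each indicator's supporting data
```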

19 Indicators Used
• Selected with input from the Provost, Vice President for Enrollments, Senior Vice President for Business & Finance, and Treasurer
• Mandatory criterion: availability of peer data; sources:
– CDS data from U.S. News (Peterson’s/Fiske for earlier years)
– IPEDS
– National Association of College and University Business Officers
– Council for Aid to Education
– National Science Foundation
– Moody’s (average A data used instead of individual peers)
– National Academies
• See the last page of the handout for the list of indicators used by UM and HEDS institutions

20 Dashboard Complements Existing Key Success Factors (KSF) Report
• Distinction between monitoring “critical” measures (tactical/operational, usually updated on a daily, weekly, or monthly basis) and tracking strategic outcomes (key to long-term goals, updated less often)
• Both KSF & Dashboard presented to senior administrators in the Operations Planning Meeting (KSF bi-monthly and each Dashboard annually)

21 KSF Monitors Changes for Critical Tactical KPIs
• KPIs in KSF usually related to process (e.g., admissions, revenue sources, and expenditures in various categories)
• KSF indicators limited to indicators that change on a continuous (e.g., daily, monthly) basis, as captured at the end of each month

22 Dashboard Monitors Strategic KPIs
• KPIs related to effectiveness and quality (student quality and success, faculty characteristics, peer evaluations)
• Dashboard KPIs not included in KSF because they are measured on an annual rather than continuous basis
• Dashboard KPIs limited to indicators for which peer data are available

23 Future Directions and Adaptations
• Adapt the Dashboard format for UM’s KSF report
• Include targets and significant differences from targets, instead of or in addition to peers
• Make the Dashboard available online
• Link directly to various data sources (e.g., a data warehouse)
• Apply at the school or department level
• Allow individuals to personalize their own dashboards to include KPIs directly relevant to them

24 Implementing the Next Generation Dashboard at Other Institutions
• Session focus is on effective presentation rather than integration of data into the report (low-tech spreadsheet, with tables of existing data copied in)
• The spreadsheet itself can be used, or some of the key concepts can be adapted to other situations
• Author will e-mail the spreadsheet template and instructions to those interested

25 Choosing KPIs
• Choosing which KPIs to use is critical
• Space is limited, so choose carefully
• Appropriateness of KPIs is institution-specific
• Critical or strategic focus?
• Interview key stakeholders to determine what data are important to them
• Use different types of KPIs (e.g., quality, process, financial, personnel) to provide a balanced perspective

26 Demo of Dashboard Spreadsheet
Copies of the spreadsheet are available upon request: e-mail pliu@miami.edu

