A Dashboard Report: Value Added Through Drill-Downs, Peer Comparisons, and Significance Tests Mary M. Sapp, Ph.D. Assistant Vice President Office of Planning & Institutional Research University of Miami SAIR, October 24, 2005

Definition of Dashboard A Dashboard is a visual display of Key Performance Indicators (KPIs) presented in a concise, intuitive format that allows decision makers to monitor institutional performance at a glance.

Dashboard Characteristics
- Provides a visual display of important Key Performance Indicators (KPIs)
- Uses a concise, intuitive, "at-a-glance" format (icons and colors)
- Offers a high-level summary (reduces voluminous data)
- Displays "gauges" to monitor key areas

Uses of a Dashboard
- Provides a quick overview of institutional performance
- Monitors the institution's progress over time (trends)
- Alerts users to problems (colors indicate positive/negative data)
- Highlights important trends and/or comparisons with peers
- Allows access to supporting analytics when needed to understand KPI results (drill-down)

Predecessors & Related Approaches
- Executive Information Systems, and their predecessor, Decision Support Systems
- On-Line Analytical Processing (OLAP), associated with data warehouses
- Balanced scorecards
- Key success factors
- Benchmarking
- Key performance indicators

Why Do a Dashboard? Senior managers:
- Want to monitor institutional performance
- Are very busy, with little time to study reports
- Value reports that clearly show conclusions
- Appreciate an overview, with indicators from different areas in one place
- Use both trends and peer data
"What you measure is what you get." (Robert S. Kaplan and David P. Norton)

Impetus for Next Generation Dashboard Report
- Session at the Winter 2004 HEDS conference
- Representatives from 4 HEDS institutions shared dashboard reports
- Presentation & discussion prompted ideas about features that might add value
- The dashboard report described here was developed as a result
An example of how a conference session led to a project that would not have been done otherwise.

Characteristics of Dashboards Presented at HEDS Conference
- All used a single page (though some had a 2nd page for definitions & instructions)
- All presented trend data (changes over 1, 5, 6, and 10 years)
- All used up/down arrows, icons, or "Up"/"Down" to show the direction of trends
- Three displayed minimum and maximum values for the trend period
- Three used colors to show whether trends were positive or negative
- One used peer data

Questions Generated by HEDS Dashboards and Proposed Solutions
1. Concise or detailed?
- HEDS: Laments about not being able to provide more detail ("senior administrators should want to see more")
- Reaction: Sympathized with the viewpoint, but have learned most senior administrators want summaries, not detail
- Next Generation Dashboard: Keep the concise format, plus links to optional graphs & tables

Issues that Came Up in Discussion at HEDS, and UM Solutions
2. Trends or peer data?
- HEDS: All four dashboards used trend data; one also used peer data
- Reaction: UM values peer data to support benchmarking
- Next Generation Dashboard: Use both peer and trend data

Issues that Came Up in Discussion at HEDS, and UM Solutions
3. When should an icon for a trend or a difference from peers appear?
- HEDS: Dashboards seemed to display icons for all non-zero differences
- Reaction: Didn't want small differences to be treated as real changes
- Next Generation Dashboard: Use p-values from regression and t-tests to control the display of icons for trends and peer comparisons
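
The significance screen for trend icons can be sketched as follows. This is a minimal illustration in Python with hypothetical KPI data; the report described in these slides actually implemented the logic with Excel macros, and the 0.05 threshold is an assumed significance level, not one stated in the slides.

```python
from scipy.stats import linregress

def trend_icon(values, alpha=0.05):
    """Return 'up', 'down', or 'flat' for a series of annual KPI values.
    An arrow appears only when the regression slope differs significantly
    from zero (p < alpha); otherwise a horizontal line is shown."""
    years = list(range(len(values)))          # 0, 1, ..., n-1 as the x-axis
    result = linregress(years, values)
    if result.pvalue >= alpha:
        return "flat"                         # change not statistically significant
    return "up" if result.slope > 0 else "down"

# Hypothetical six-year retention-rate series
print(trend_icon([88.0, 88.5, 89.1, 89.8, 90.2, 90.9]))  # steady rise
print(trend_icon([88.0, 88.4, 87.9, 88.2, 88.1, 88.3]))  # noise around 88
```

The point of the screen is the middle branch: a small positive slope in the second series is suppressed rather than rendered as an up arrow.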

Issues that Came Up in Discussion at HEDS, and UM Solutions
4. Include minima and maxima?
- HEDS: Three displayed minima and maxima over the trend period
- Reaction: UM's senior VP decided it was too cluttered
- Next Generation Dashboard: Shows trends for one's own institution and the 25th and 75th percentiles of peers, with no maxima or minima

Unique Aspects of Next Generation Dashboard
- Provides drill-down links to graphs and tables for more detail, if desired
- Provides peer data in addition to trends
- Uses regression (rather than maxima and minima) to determine the direction of trends
- Uses the statistical significance of the slope (rather than just the difference) to generate trend icons
- Uses t-tests to generate the peer comparison summary
Functions like adding a "Global Positioning System (GPS)" to your dashboard.

Implementation
- Two dashboard reports: student indicators and faculty/financial indicators
- 17 and 21 KPIs, respectively, each report on a single page
- A box for each KPI with its current value, an arrow to show the trend, and text to show the relation to peers
- Links to more detailed graphs and tables

Indicator Display: Upper Left Corner
- Up arrow, down arrow, or horizontal line
- Shows the direction of the UM trend for the last 6 years
- Up vs. down vs. flat, based on the slope of a regression and its p-value
- Color based on the desired outcome
- Link to a graph with trends for UM and the 25th & 75th percentiles for peers
- Current value

Indicator Display: Upper Right Corner
- Shows relation to 12 peers
- "Above Peers" vs. "Below Peers" vs. "Mid. of Peers," based on t-tests (UM vs. mean of peers)
- Color based on the desired outcome
- Link to a table with five years of data for UM and peers
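
The peer-comparison text can be generated along these lines. This is a hedged Python sketch with made-up graduation rates, assuming the "t-test (UM vs. mean of peers)" is a one-sample t-test of the peer values against the institution's own value at a 0.05 level; the slides do not specify the exact test setup, and the actual report used Excel macros.

```python
from scipy.stats import ttest_1samp

def peer_position(own_value, peer_values, alpha=0.05):
    """Classify an institution's KPI relative to its peer group.
    A one-sample t-test asks whether the peer mean differs
    significantly from the institution's own value."""
    result = ttest_1samp(peer_values, own_value)
    if result.pvalue >= alpha:
        return "Mid. of Peers"                # difference could be chance
    peer_mean = sum(peer_values) / len(peer_values)
    return "Above Peers" if own_value > peer_mean else "Below Peers"

# Hypothetical graduation rates for 12 peers, own value 81
peers = [72, 74, 75, 76, 76, 77, 78, 78, 79, 80, 81, 82]
print(peer_position(81, peers))
```

As with the trend icons, the design choice is that an institution near the peer mean reads as "Mid. of Peers" rather than being labeled above or below on the strength of a trivial difference.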

Macros
Used to display:
- Data for the year chosen
- Direction of the arrow icons
- Color of the arrow (green for positive, red for negative, black for neutral)
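
The color rule depends on the desired outcome as well as the direction, since for some KPIs (an acceptance rate, say) a downward trend is good news. A minimal sketch of that mapping, in Python for illustration (the report itself did this with an Excel macro):

```python
def arrow_color(direction, higher_is_better):
    """Map a trend direction onto a dashboard color:
    green when moving toward the desired outcome, red when
    moving away from it, black when there is no significant change."""
    if direction == "flat":
        return "black"
    improving = (direction == "up") == higher_is_better
    return "green" if improving else "red"

# A rising acceptance rate may be undesirable for selectivity
print(arrow_color("up", higher_is_better=False))
```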

Spreadsheet
- Dashboard developed using an Excel spreadsheet, with one sheet for the dashboard report and one sheet for each indicator (graph, peer data, and raw data)
- A macro updates the year and controls the display of the arrow icons (direction and color)
- A spreadsheet with a template for the dashboard and instructions for customizing it is shared upon request (leave a card)

Indicators Used
- Selected with input from the Provost, Vice President for Enrollments, Senior Vice President for Business & Finance, and Treasurer
- Mandatory criterion: availability of peer data; sources:
–CDS data from U.S. News (Peterson's/Fiske for earlier years)
–IPEDS
–National Association of College and University Business Officers
–Council for Aid to Education
–National Science Foundation
–Moody's (average A data used instead of individual peers)
–National Academies
- See the last page of the handout for the list of indicators used by UM and HEDS institutions

Dashboard Complements Existing Key Success Factors (KSF) Report
- Distinction between monitoring "critical" measures (tactical/operational, usually updated on a daily, weekly, or monthly basis) and tracking strategic outcomes (key to long-term goals, updated less often)
- Both the KSF report & the Dashboard are presented to senior administrators in the Operations Planning Meeting (KSF bi-monthly and each Dashboard annually)

KSF Monitors Changes for Critical Tactical KPIs
- KPIs in the KSF report usually relate to process (e.g., admissions, revenue sources, and expenditures in various categories)
- KSF indicators are limited to indicators that change on a continuous (e.g., daily, monthly) basis, as captured at the end of each month

Dashboard Monitors Strategic KPIs
- KPIs related to effectiveness and quality (student quality and success, faculty characteristics, peer evaluations)
- Dashboard KPIs are not included in the KSF report because they are measured on an annual rather than continuous basis
- Dashboard KPIs are limited to indicators for which peer data are available

Future Directions and Adaptations
- Adapt the Dashboard format for UM's KSF report
- Include targets and significant differences from targets, instead of or in addition to peers
- Make the Dashboard available online
- Link directly to various data sources (e.g., a data warehouse)
- Apply at the school or department level
- Allow individuals to personalize their own dashboards to include the KPIs directly relevant to them

Implementing the Next Generation Dashboard at Other Institutions
- The session's focus is on effective presentation rather than on integrating data into the report (a low-tech spreadsheet, with tables of existing data copied in)
- The spreadsheet itself can be used, or some of the key concepts can be adapted to other situations
- The author will share the spreadsheet template and instructions with those interested

Choosing KPIs
- Choosing which KPIs to use is critical
- Space is limited, so choose carefully
- The appropriateness of KPIs is institution-specific
- Critical or strategic focus?
- Interview key stakeholders to determine what data are important to them
- Use different types of KPIs (e.g., quality, process, financial, personnel) to provide a balanced perspective

Demo of Dashboard Spreadsheet
Copies of the spreadsheet available upon request—