Metrics & Dashboards Survey Results (with help from Marty Klubeck at Notre Dame and Brenda Osuna at USC)



Who are we? Brown, Carnegie Mellon, Columbia, Cornell, CU-Boulder, Duke University, Georgetown University, Harvard*, Michigan State University, New York University, Penn State, Princeton, Stanford University, UC San Diego, UCSF, University of Chicago*, University of Iowa, University of Michigan, University of Minnesota, University of Notre Dame, University of Washington, University of Wisconsin, Virginia Tech

How often do we collect the following types of metrics around service health (effectiveness)?

For what services do we collect metrics? The good news is that no one said zero!

And our metrics to measure business efficiency and delivery of goals?
OTHER:
1) It varies widely depending on the service
2) We do not collect any business efficiency metrics
3) Project delivery; # of calls abandoned; # change requests; # s; # abandoned calls; resolution time; cycle time; abandonment rate; capacity; mean time to repair
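To make the arithmetic behind a couple of these efficiency measures concrete, here is a minimal Python sketch; the record layout and field names are hypothetical illustrations, not drawn from any respondent's data.

```python
from datetime import datetime

# Hypothetical ticket and call records; real numbers would come from a
# service desk export or a call-center report.
tickets = [
    {"opened_at": "2012-01-03 09:15", "resolved_at": "2012-01-03 11:45"},
    {"opened_at": "2012-01-04 14:00", "resolved_at": "2012-01-05 10:30"},
]
calls = [{"abandoned": False}, {"abandoned": True}, {"abandoned": False}]

def hours_between(start, end, fmt="%Y-%m-%d %H:%M"):
    """Elapsed hours between two timestamp strings."""
    delta = datetime.strptime(end, fmt) - datetime.strptime(start, fmt)
    return delta.total_seconds() / 3600

# Mean time to repair: average hours from ticket open to resolution.
mttr = sum(hours_between(t["opened_at"], t["resolved_at"]) for t in tickets) / len(tickets)

# Abandonment rate: share of calls dropped before an agent answered.
abandonment_rate = sum(c["abandoned"] for c in calls) / len(calls)

print(f"MTTR: {mttr:.1f} hours, abandonment rate: {abandonment_rate:.0%}")
```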

Our use of targets
OTHER:
1) Working on the use of ITIL (Information Technology Infrastructure Library)
2) Note: we don't do this consistently though
3) We do not use any service target range metrics
4) Industry Practices / Standards
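Where target ranges are in use, the comparison itself is simple enough to automate; the following sketch, with made-up thresholds, classifies a monthly value against a green/yellow/red target range.

```python
def rate_against_target(value, green_min, yellow_min):
    """Classify a metric against a simple target range (higher is better)."""
    if value >= green_min:
        return "green"
    if value >= yellow_min:
        return "yellow"
    return "red"

# Hypothetical availability target: green at or above 99.9%, yellow down to 99.5%.
print(rate_against_target(99.7, green_min=99.9, yellow_min=99.5))  # -> yellow
```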

Metrics are collected and analyzed primarily as…
OTHER: System performance metrics in transition to organizational effort.

Who is the audience for our metrics?
OTHER: Post publicly

How do we share them?

Benefits so far
OTHER: 1) Right-sizing the organization; metrics enable us to tune documentation and training and better prepare support providers

How do we rate the maturity of our organization’s use of metrics?

Our use of external data sources
OTHER:
1) Gartner for benchmarking
2) Used to participate in the Campus Computing Survey
3) Gartner

Any BI action?
OTHER: Currently considering an environment; platform selection pending

Our biggest challenges
OTHER:
1) Continuing engagement from mid-level leadership to respond to metrics findings
2) The organization's ability to identify specific KPIs to measure specific objectives
3) Changing leadership/definition of what is necessary and relevant; metrics must mean something to be used effectively; lack of a plan; staff resentment

What would we find useful?
OTHER:
1) None of the above
2) A unified approach to metrics from an organizational perspective; lack of a plan; dedicated resources would be better. No one is going to use another template, and different services would be measured by different metrics unless the metrics were provided at a very high level

Tools – what have we used, what do we think?
“Believe that the process and commitment to consistent data collection is far more important than the tool”

Lessons learned
Metrics have helped to highlight areas of significant service difficulty (e.g., with BlackBerry services) and to flag some low-level problem spots (e.g., around some of our network measures). At the same time, our current metrics processes are highly manual in nature and require significant time investment to collect and report. We have seen challenges in getting service management engaged on the data writ large, which can lead to problems when errors due to service changes are missed, thus impacting trending. Goals for us in the coming year include focusing on trend analysis/reporting through executive summarization (done), gaining more mileage out of system-generated metrics on availability and low-level alarms, improving automated collection of non-availability data, and aggregating human-generated, automated, and other data into a dashboard to reduce the effort required to visualize service data.
Benchmarking is very challenging because of the variation in environments at each institution. Cost components may be different, service features and SLAs may not match, accounting practices can be problematic, labor is tracked differently, etc.
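As a concrete (and purely illustrative) sketch of the data-aggregation goal above, the Python below merges a system-generated export with a human-maintained one into a single table a dashboard could consume; the file names and column layout are assumptions, not any institution's actual pipeline.

```python
import csv
from collections import defaultdict

def load(path):
    """Read a CSV export with assumed columns: service, month, value."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

merged = defaultdict(dict)
for row in load("availability.csv"):      # system-generated, e.g. email,2012-01,99.95
    merged[(row["service"], row["month"])]["availability"] = float(row["value"])
for row in load("satisfaction.csv"):      # human-generated, e.g. email,2012-01,4.2
    merged[(row["service"], row["month"])]["satisfaction"] = float(row["value"])

# One combined table that a dashboard or spreadsheet can read directly.
with open("dashboard.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["service", "month", "availability", "satisfaction"])
    for (service, month), vals in sorted(merged.items()):
        writer.writerow([service, month, vals.get("availability"), vals.get("satisfaction")])
```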

Lessons Learned
We had a nascent metrics program under development with dedicated resources, focused on helping service managers develop metrics from their local data. With the departure of that resource in October, we are choosing to re-prioritize the work away from dedicated attention to metrics at this time. Instead, we watch with great interest the aggressive agenda that EDUCAUSE has developed with the reinvigoration of ECAR under Susan Grajek. We will continue to monitor the progress of the various EDUCAUSE initiatives around research, data, and analytics and pursue collaboration opportunities based on our own priorities and resources.
We made quite a push to get a metrics dashboard going a couple of years ago, and it was quite successful. However, the back-end work of building a metrics data repository was never completed. This has limited our ability to answer deeper analytics questions and still often requires us to run manual queries. On the other hand, when we recently needed to pull together a metrics dashboard for a large client (a hospital), we were able to reuse much of the work we had done previously.

Lessons Learned
We collect a lot of operational performance data using traditional tools (Cricket, Nagios, home-grown scripts) but don't have a reasonable dashboard or approach to making the data useful. We have recently started measuring the performance of our service desk and the groups behind it to track delivery against SLAs in our service catalog. We've started a Service-Now implementation and expect to use the metrics delivered by that tool.
It is a challenge to get consistent operational definitions, both for internal use and for benchmarking. Data collection is still a time-consuming, manual process that we are working to automate by pulling metrics from disparate systems into a BI environment. We are exploring the use of Microsoft BI tools (e.g., PowerPivot, SQL Server 2012, Power View).
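As an illustration of the kind of automation described above, the sketch below loads exports from two disparate systems into a single staging table that a BI tool could query; sqlite3 merely stands in for whatever BI environment is actually used, and the file names and columns are assumptions.

```python
import csv
import sqlite3

# Staging table a BI tool (PowerPivot, a SQL Server warehouse, etc.) could read;
# sqlite3 stands in for the real BI environment here.
conn = sqlite3.connect("metrics_staging.db")
conn.execute(
    """CREATE TABLE IF NOT EXISTS service_metrics (
           source TEXT, service TEXT, month TEXT, metric TEXT, value REAL
       )"""
)

def load_export(path, source):
    """Load one system's CSV export (assumed columns: service, month, metric, value)."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            conn.execute(
                "INSERT INTO service_metrics VALUES (?, ?, ?, ?, ?)",
                (source, row["service"], row["month"], row["metric"], float(row["value"])),
            )

# Hypothetical exports from two of the systems mentioned above.
load_export("nagios_availability.csv", source="nagios")
load_export("service_desk_tickets.csv", source="service_desk")
conn.commit()
conn.close()
```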

THANK YOU