
1 Denise Troll Covey Associate University Librarian, Carnegie Mellon NERCOMP Library/IT SIG Workshop – April 2002 versus Count-Ability

2 Personal Observation
Lots of data gathering
Some data are compiled
Some compiled data are analyzed
Some analyzed data are actually used
Wasted effort?

3 Personal Suspicion
Lots of data languishing – Why?
–Accountability for data reporting is minimal
–Data management is cumbersome
–Sense that some of the data aren’t useful
–Imperative to implement new measures

4 New Data Gathering
Evidence of becoming “predominantly digital”
Address Advisory Board concerns

5 Assessment Challenges
Deciding what data to gather
Understanding the whole assessment process
Determining what skills to acquire & how
Organizing assessment as a core activity
Managing assessment data
Acquiring an interpretive context
(Distinguished Fellowship, Nov 2000 – Oct 2001)

6 2001 MIS Task Force
Assess current & proposed data practices
Determine what data to gather & how
Develop a specification for a new MIS that solves current problems
Oversee implementation & deployment of the new MIS

7 MIS TF Time Line
                                            INITIAL  REVISED
Conduct data audit & needs assessment       May 01   Jun 01
Recommend data to gather & manage in MIS    Jul 01   Apr 02
Prepare & approve functional specification  Jul 01   Jun 02
Evaluate & recommend software for new MIS   Sep 01   Aug 02
Prepare & approve design specification      Nov 01   Nov 02
Implement & test MIS prototype              Feb 02   Feb 03
Implement & test production MIS             May 02   May 03
Document MIS & provide training             Jun 02   Jul 03
Migrate data & release new MIS              Jul 02   Aug 03

8 May-June 2001 – Data Audit
Interviews with functional groups
–What data do you gather or plan to gather?
–How do you gather the data? How often?
–Who uses the data? How often?
–For what purpose do you gather the data?
–For what audience do you gather the data?
–How do you present the data? To whom?
–How do you store, access, & manipulate the data?
–What problems do you have with the data?

9 July 2001 – Assemble Results
Created 12-page spreadsheet – data, department, source, audience, software, process, problems, & solutions
Identified common problems
–Data management is decentralized & uncoordinated
–Current “MIS” is incomplete & not kept up-to-date
–Errors or anomalies in the data are not corrected
–Data gathering & entry are too complicated
–Difficult to generate multi-year trend lines
–Sheer volume of data is overwhelming

10 More Problems
Lack of communication, training, & support
–What data are available? Where?
–How do I access & use the current MIS?
–Problems with access privileges, server, & network
Wasted resources
–Duplicate efforts
–No one has time, skills, or responsibility to analyze the data
–Data are either not used or underused

11 August 2001 – Libraries Council
Distributed spreadsheet to governing council
–To confirm information was accurate & complete
–To confirm problems with current data practices
–Are we gathering the right data?
–Are the data we’re gathering worth the effort?
Outcome
–Add missing data (e.g., ACRL, IPEDS)
–Add more proposed new measures

12 Sept 2001 – Field Trip
Keep task force small & strategic
–Members with power, willing to take the heat & do the work
Be judgmental about usefulness of the data
Distinguish data for external & internal audiences
Data gathering advice
–Stop counting what you don’t need
–Automate everything you can
–Use more sampling
Identify data for MIS based on research questions
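The “use more sampling” advice above can be sketched as a back-of-the-envelope estimate: count a measure for a few sampled weeks and extrapolate to a year, instead of counting continuously. The numbers and the measure (reference transactions) are hypothetical; the slide names no specific data.

```python
import statistics

# Hypothetical counts of reference transactions in 4 randomly sampled weeks
sample_weeks = [412, 388, 450, 401]

weekly_mean = statistics.mean(sample_weeks)   # average transactions per week
weekly_sd = statistics.stdev(sample_weeks)    # sample standard deviation

# Extrapolate the weekly average to a 52-week year
annual_estimate = weekly_mean * 52

# Rough 95% interval for the weekly mean, scaled to a year
margin = 1.96 * (weekly_sd / len(sample_weeks) ** 0.5) * 52

print(f"Annual estimate: {annual_estimate:.0f} \u00b1 {margin:.0f}")
```

A handful of sampled weeks replaces a year of manual tallying, at the cost of a confidence interval rather than an exact count – a trade the slide argues is worth making for internally used, manually gathered data.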

13 October 2001 – Department Heads
Added data LC requested
Distributed 15-page spreadsheet organized by department
–To confirm information was accurate & complete
Outcome: clarification of minor details

14 November 2001 – Winnow the List
Least useful data = data required for ACRL & IPEDS
–University Librarian will consider NOT gathering the data IF we don’t really use it or it can’t be automated
Don’t know enough about data gathering & use
–Someone above them in the food chain uses the data
–Data definitions are inconsistent or shifting
–How difficult is it to gather data manually?
–Can data gathering be automated?

15 February 2002 – Department Heads
Distributed spreadsheet per department – data (current & proposed), audience (internal & external)
–How used by department?
–Is data gathering automated? Can it be automated?
–Recommend – Keep regularly, Sample, or Not gather
Outcome
–Few recommendations to Sample or Not gather data
–Uncertainty about what was or could be automated
–Hypotheses about how data could be used

16 Current Data Gathering

17 Proposed Data Gathering

18 April 2002 – Recommendations to LC
What data to gather
How to gather the data
What data to manage in 1st version of new MIS

19 Criteria for Data Gathering
Benefits must be worth the cost
Gather data that are useful & easy to gather
–Usefulness gauged by relationship to strategic & digital plans, & Advisory Board concerns
–Keep regularly data that are or can be gathered automatically
–Sample data gathered manually, used less frequently, & only by internal audiences
Don’t gather data that are not useful or are difficult to gather
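The criteria above amount to a small decision rule for each measure. A minimal sketch, assuming the prose criteria map onto four yes/no questions (the parameter names are illustrative, not from the slides, and the slides’ cost/benefit judgment is simplified to a single “useful” flag):

```python
def gathering_strategy(useful: bool, automated: bool,
                       external_audience: bool, frequent_use: bool) -> str:
    """Classify one measure as Eliminate, Sample, or Keep regularly.

    Hypothetical encoding of the slide's criteria; all names are
    illustrative, since the slides state the rules only in prose.
    """
    if not useful:
        return "Eliminate"        # benefits not worth the cost
    if automated:
        return "Keep regularly"   # gathered automatically, cheap to keep
    if not external_audience and not frequent_use:
        return "Sample"           # manual, infrequently used, internal only
    return "Keep regularly"       # manual but needed regularly

# e.g., a useful manual measure used only occasionally, internally
print(gathering_strategy(useful=True, automated=False,
                         external_audience=False, frequent_use=False))
```

Applying such a rule to every row of the audit spreadsheet would reproduce the Eliminate / Sample / Keep regularly split summarized on the next slide.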

20 What Data to Gather & How
STRATEGY                     C    P   TOTAL
Eliminate                   11    6    17   15%
Sample                      13    3    16   14%
Keep regularly:
  Already automated                    22   19%
  Automate if possible      13    7    20   17%
  Manual                    36    6    42   36%
TOTAL to gather             84   16   100
C = Current, P = Proposed

21 Changes in Data Gathering

22 Criteria for Data in the New MIS
Data gathered regularly & useful with external audiences
–University administrators, ACRL, IPEDS, & FactBook
–Measures of important traditional or digital trends
–Selected service costs & expenditures
Small enough data set to implement in a year
–Other data may be added to the MIS later
–Not all data gathered will be managed by the MIS (for example, department goals for the Advisory Board)

23 Next Steps for MIS TF
May 2002 – Decide what data manipulations, access methods & controls, & graphics we want the new MIS to do; consider additional new measures
July 2002 – Determine the feasibility of what we want; document functional requirements specification; submit to the Libraries Council (LC) for approval
Sept 2002 – Evaluate software & make recommendation to LC
Dec 2002 – Design the user interface of the new MIS, using paper prototyping to facilitate design work
Jan 2003 – Begin implementing new MIS prototype

24 Next Steps for MIS TF
??? – MIS prototype ready for data entry & testing
2-3 weeks – Test prototype MIS
??? – Revise design & functionality based on testing
??? – Implement MIS; work with LC to decide what existing data, if any, gets “migrated” to the new MIS
2-3 months – Documentation, training, data entry, & testing
??? – New MIS released

25 “Culture of Assessment”
Traditional & emerging measures
–Inputs, outputs, outcomes, composites, performance
–Meta-assessments of new measures
Guidelines, best practices, & standards
BS about “creating a culture”
–As if new know-what & know-how are enough
–No attention to what a culture really is
–No understanding of what it takes to change a culture

26 What is a Culture?
Beliefs – values & expectations
Behaviors – observable activities
Assumptions – unconscious rationale for continuing beliefs & behaviors
Conner, D.R. Managing at the Speed of Change. NY: Villard, 1992.

27 Orchestrating a Culture
Conduct a cultural audit
–If there’s a gap between your current culture & your objectives, close it
–If you don’t close it, the current culture wins
[Diagram: CURRENT beliefs, behaviors, & assumptions → Transition → DESIRED beliefs, behaviors, & assumptions]

28 Audit via Observation
             CURRENT                           DESIRED
Beliefs      Data aren’t useful                Data are useful
             Data aren’t used                  Data are used
             No reward for work                Work is rewarded
Behaviors    Haphazard data gathering,         Accurate, timely data gathering,
             reporting, compilation,           reporting, compilation,
             & analysis                        & analysis
             Inefficient data mgmt             Efficient data mgmt
             Ineffective data use              Effective data use
Assumptions  Data aren’t important             Data are important

29 Audit via Questionnaire
Survey of perceptions of assessment practices, priorities, & problems
–Perception by level in organizational hierarchy: administrator, middle manager, other
–Perception by library unit: public services, technical services, IT
–Perception by status: faculty, staff

30 Audit via Questionnaire
Somewhat agree on top three assessment priorities
–Understand user behaviors, needs, expectations, & priorities
–Operate the libraries cost-effectively
–Validate expenditures
Disagree on
–How assessment is organized
–What assessments are conducted
–What resources are allocated to assessment
–What we need to solve our assessment problems

31 Creating a Culture of Assessment
Change beliefs, behaviors, & assumptions
–Resistance is natural, but it should be futile
–The onus is on administration & management to apply consequences (motivate) & to provide training (enable)
Change management is pain management
–Pain = incentive to disengage from the status quo
–Remedies = incentive to adopt the vision & plans

32 Communication is Critical
Administrators must focus & secure commitment
–Articulate goals & rationale
–Develop & distribute action plans
–Manage expectations
–Assign responsibilities
–Provide training & consequences
–Answer questions
–Identify & solve problems
–Repeat as necessary
The alternative is to suffer the consequences of resistance, failure, or missed opportunity

33 Speed of Technological Change
Change is inevitable & accelerating
By 2015, it will occur at an exponential rate

34 Good management without sound administration is like straightening the deck chairs on the Titanic

35 Thank You Denise Troll Covey Associate University Librarian Carnegie Mellon troll@andrew.cmu.edu 412-268-8599

