1
Quality Reports Come from Quality Data
Andrew Baldwin, College of Management, University of Wisconsin-Stout
Jan Boyd, Information School, University of Washington
2
Overview
What is the “why”?
What is the best way to display this information?
Is the data accurate?
3
What is the “why”?
4
What questions need to be answered? ASK!
6
What’s important to you, Prof? Don’t fence me in!
Highlight: Advising, Research, Service, Teaching, Other
7
From the “Scheduled Teaching” screen: the screen is read-only for faculty except for the comment field.
8
From the Yearly Narratives screen
From the Scheduled Teaching screen
9
Capture all my efforts regardless of current status.
11
Ask me what makes sense.
Outcome: Evidence of the effect of our research activities is felt and understood in many places.
Indicator: Researchers create artifacts or tools that are implemented and used by others.
Target: Percentage increase in the number and variety of artifacts/tools developed.
12
Evidence of Impact screen
Featured / Incorporated / Referenced / Reviewed / Quoted / Other
Internet Site / Magazine / Newspaper / Podcast / Radio / Software / TV / Other
Selectors and dates for research group websites
13
“Evidence of Impact” on the Faculty Activity Report
14
Make my accomplishments visible externally.
Faculty Directory
16
What is the best way to display this information?
17
Formatting
Level: Aggregate vs. Individual
Organization: Tables vs. Free Form
Medium: Report vs. Website
18
Staff Activity Insight
Admissions loves the Experts page for recruiting students.
The Curriculum Manager uses the same data to see who can teach what.
19
Research Projects for the Web
A special screen that integrates multiple faculty/current PhDs and has DSAs allowing multiple of each of the following (sketched after the list):
People or Collaborators
Partners
Associated Groups or Centers
Sponsors
Geographic Impact Locations
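
As a rough illustration of that one-to-many record shape, here is a minimal Python sketch. The class and field names are invented for this example; they are not the Activity Insight schema.

    from dataclasses import dataclass, field

    # Hypothetical record shape for a "Research Projects for the Web" entry.
    # Field names are illustrative, not the actual Activity Insight schema.
    @dataclass
    class ResearchProject:
        title: str
        collaborators: list[str] = field(default_factory=list)    # People or Collaborators
        partners: list[str] = field(default_factory=list)
        groups_or_centers: list[str] = field(default_factory=list)
        sponsors: list[str] = field(default_factory=list)
        impact_locations: list[str] = field(default_factory=list)  # Geographic Impact Locations

    project = ResearchProject(
        title="Example project",
        collaborators=["Faculty A", "PhD Student B"],
        sponsors=["Example Sponsor"],
    )

Each repeatable field is a list, which is what lets one project record carry any number of collaborators, partners, or sponsors.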
22
Staff Activity Insight
Research Services staff can easily report the status of grants to the Deans.
23
Program Evaluation Initiative
Intellectual Contributions by Type and Year
Required: Title, Contribution type, Peer-reviewed?, Current status
24
Is the data accurate?
25
Data Quality
Good data → Good reports
Bad data → Bad reports
26
Data Imports and Proxies
Import as much as possible from central data sources.
Data entry assistance: graduate students if possible.
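
To make “import from central data sources” concrete, here is a hedged sketch of the kind of one-off transform a proxy might run: mapping a registrar export onto the columns an activity-reporting import expects. The file names and column names are invented for the example.

    import csv

    # Map a hypothetical registrar export onto the fields a faculty-activity
    # system might expect for Scheduled Teaching. All names are placeholders.
    FIELD_MAP = {
        "INSTRUCTOR_ID": "faculty_id",
        "COURSE_ID": "course",
        "TERM": "term",
        "ENROLLMENT": "enrolled",
    }

    with open("registrar_export.csv", newline="") as src, \
         open("scheduled_teaching_import.csv", "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=list(FIELD_MAP.values()))
        writer.writeheader()
        for row in reader:
            writer.writerow({new: row[old] for old, new in FIELD_MAP.items()})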
27
Make data entry links easy to find (especially for faculty!)
28
Most teaching is uploaded from central data sources.
After Scholarship/Research on the Main Menu:
29
What staff need but faculty seldom use is at the bottom of the menu.
30
Is the data captured accurately?
Does the data make sense?
Does anything appear to be missing?
Is something included that shouldn’t be?
31
Strategic Plan Reports – Tables!!
32
Provide tools for data cleaning (a sketch of one such check follows this list).
Reports faculty can run themselves:
Intellectual Contributions Metadata
Presentations Metadata
Professional Service Metadata
Reports for staff or grad assistants:
DCR - Intellectual Contributions with Students
DCR - Presentations with Students
DCR - Research Projects
DCR - Research Projects for the Web
DCR - Student Participation
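
The metadata reports boil down to one question: which records have blank required fields? A minimal, hedged sketch of that check over a CSV export (the column names are assumptions, not the real report layout):

    import csv

    # Flag intellectual-contribution rows with blank required fields.
    # Column names below are assumptions for illustration.
    REQUIRED = ["title", "contribution_type", "peer_reviewed", "current_status"]

    with open("intellectual_contributions.csv", newline="") as f:
        for rownum, row in enumerate(csv.DictReader(f), start=2):  # row 1 = header
            missing = [col for col in REQUIRED if not row.get(col, "").strip()]
            if missing:
                print(f"Row {rownum}: missing {', '.join(missing)}")

Faculty (or a grad assistant) can run a check like this against their own export and fix the flagged rows before report season.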
33
Intellectual Contributions Metadata
Easier to spot errors: Contribution type, Current status, Peer-reviewed
34
The Student Participation Report combines:
Intellectual Contributions
Presentations
Research Projects
35
The spreadsheet (DCR) version of the Student Participation report lets you filter for inconsistencies.
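
One way to filter that spreadsheet for inconsistency, sketched with pandas under assumed column names (“student”, “activity_type”): find student names that appear with more than one spelling or capitalization across the combined activity types.

    import pandas as pd

    # Hypothetical DCR export; column names are assumptions for illustration.
    df = pd.read_csv("dcr_student_participation.csv")

    # Normalize names, then flag any student whose name appears in
    # more than one raw form (spelling or case variants).
    df["student_key"] = df["student"].str.strip().str.lower()
    variants = df.groupby("student_key")["student"].nunique()
    suspect = variants[variants > 1].index

    print(df[df["student_key"].isin(suspect)]
            .sort_values("student_key")[["student", "activity_type"]])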
36
Macro-Level Data Cleaning
Who is a “researcher”?
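
Counting “researchers” means committing to a definition before any aggregate report runs. A hedged sketch of making that rule explicit in code (one possible rule, not the presenters’ definition; the file, columns, and years are placeholders):

    import pandas as pd

    # One row per research record; column names are placeholders.
    activity = pd.read_csv("research_activity.csv")

    # Example rule: a "researcher" is anyone with at least one research
    # record in the reporting window. Adjust the rule, not each report.
    window = activity[activity["year"].between(2012, 2014)]
    researchers = set(window["faculty_id"])

    print(f"{len(researchers)} faculty counted as researchers, 2012-2014")

Keeping the definition in one place means every macro-level report counts the same denominator.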
39
Summary
Need to ask questions!
What is the “why”?
What is the best way to display this information?
Is the data accurate?
40
Questions?