A Review and Redesign of Roper Center Infrastructure

A Review and Redesign of Roper Center Infrastructure
Elise Dunham, Cindy Teixeira, Marc Maynard, Lois Timms-Ferrara
The Roper Center for Public Opinion Research, University of Connecticut
IASSIST Conference 2014, Toronto, Canada

Throughout this conference, there have been many presentations and discussions about how our community is handling the data proliferation occurring today. We are all having to evaluate our infrastructures and make decisions about how to move forward. The Roper Center has been conducting a review and redesign of our infrastructure with these challenges in mind.

Project Goals
- Improve processing of digital objects
- Expand our collection: volume; types of data/objects
- Openness with our policies
- Develop new services

When moving forward with this type of project, we needed to identify some basic goals. First, we knew we needed to improve the processing of digital objects, including increasing efficiency, reducing duplication of effort by Roper Center staff, and decreasing the manual processing of materials. Second, we wanted to expand our collection by increasing both the volume of materials and the types of data acquired by the Roper Center. We want to be able to document new and experimental methodologies used in the public opinion field, as well as collect disclosure information; the AAPOR Transparency Initiative recommends a more open exchange of metadata by public opinion data collectors. Third, we want to be more open with our policies to uphold the Roper Center's long-standing reputation for data quality. We need to update internal policy documents, create external use policy documents, and connect those policies directly to our processing actions. Finally, we want to develop new and innovative services for our user community.

Project Overview
- Funded in 2011 by the Robert Wood Johnson Foundation
- Three phases: Review, Design, Implementation

With these goals in mind, the Roper Center in 2011 submitted a proposal and was awarded funds from the Robert Wood Johnson Foundation to address our infrastructure needs. The project was to be conducted in three phases.

Review. During this phase, the Roper Center was to hire a consultant to facilitate an intensive workflow analysis: documenting all aspects of the processes and procedures in each workflow and defining common terminology used at the Center. From this workflow analysis, we hoped to develop recommendations for integrating our current workflows into a single-stream model.

Design. During this phase, we would redesign the information architecture using a common metadata structure. We would determine new and desired requirements for data processing, preservation, and access, while keeping in mind current archival standards, the needs of the research community, and the movement toward an open exchange of information with other archives.

Implementation. Finally, we would prioritize and implement our new process and system changes. We planned to make these changes gradually so as not to disrupt services to our user community.

iPOLL
The Roper Center has two main processing groups. Each processing group is responsible for workflows based on a product or service that we provide to our users, and specific staff members are assigned to those workflows. Our first processing group works on iPOLL, our question-level retrieval database of over 600,000 questions and results. The database is populated, generally, from published materials like press releases, reports, and toplines that include full question wording. Approximately 20,000 questions are added to the database in an election year, and 15,000 to 18,000 in a non-election year, with 1.5 staff members and 6 undergraduate students.
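To make the question-level idea concrete, here is a minimal sketch of what a record in an iPOLL-style retrieval database might carry; the class and field names are hypothetical illustrations, not the Roper Center's actual schema, and the example values are invented.

```python
from dataclasses import dataclass

# Hypothetical sketch of a question-level record in an iPOLL-style
# retrieval database; field names are illustrative, not Roper's schema.
@dataclass
class QuestionRecord:
    question_id: str        # unique identifier for the question
    study_id: str           # links back to the parent study
    question_text: str      # full question wording, as published
    responses: dict         # response category -> reported percentage
    survey_org: str = ""    # e.g. the polling organization
    field_dates: str = ""   # dates the survey was in the field

# Example record built from a published topline (all values invented).
q = QuestionRecord(
    question_id="Q-000001",
    study_id="STUDY-2014-001",
    question_text="Do you approve or disapprove of ...?",
    responses={"Approve": 47.0, "Disapprove": 45.0, "No opinion": 8.0},
)
print(q.question_text)
```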

RoperExpress
Our second main processing group is referred to as the Archive group. They are responsible for processing studies for RoperExpress. RoperExpress allows users to search for studies online and download the related data and documentation files. We process approximately 1,500 studies per year with 1.5 staff members, 2 graduate students, and 2 undergraduate students.

Review Phase
- Creation of a central repository
- Workflow analysis: hand-offs; skillsets; standards and policies; tracking and queuing; product-specific requirements

In the summer of 2011, we created a wiki, which included modifying the existing user manuals used for training, creating new documents, and reviewing the mapping of existing databases to DDI.

In the fall of 2011, the in-depth workflow analysis began. It included a review of the processing documents and instructions on the wiki and a complete walk-through of the workflows from acquisition to the public release of materials, covering: hand-offs; the skillset required for each step; the standards and policies adhered to; tracking and quality assurance mechanisms; current queues; product-specific requirements for iPOLL and RoperExpress; and a wishlist for a new processing system. We also mapped our workflow to an information model (OAIS).
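Since the workflow was mapped to OAIS, a rough sketch of what such a mapping can look like in code follows; the workflow step names come from the slides, but the assignment of each step to an OAIS functional entity is an illustrative assumption, not the Center's published mapping.

```python
# Illustrative mapping of workflow steps (names taken from the slides) to
# OAIS functional entities; the assignments are an assumption made for
# demonstration here, not the Roper Center's official OAIS mapping.
oais_mapping = {
    "Acquire Study Materials": "Ingest",
    "Initial Processing": "Ingest",
    "Quality Control 1": "Ingest (quality assurance)",
    "Add Study Level Information": "Data Management",
    "Add Question Level Information": "Data Management",
    "Archive Electronic Files": "Archival Storage",
    "Release to Public": "Access",
}

for step, entity in oais_mapping.items():
    print(f"{step:32} -> {entity}")
```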

Processing Workflows
ASSESSMENT precedes both streams.
iPOLL workflow: Acquire Study Materials → Prepare Study for Release → Add Study Level Information → Add Question Level Information → Proof Question Level Content → Release to Public
Archive workflow: Acquire Study Materials → Initial Processing → Quality Control 1 → Intensive Processing → Quality Control 2 → Archive Electronic Files → Release to Public
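As a small illustration of how these two streams could be treated as data, for instance when thinking about the single-stream model and a later queue/tracking system, the sketch below encodes them as ordered step lists; only the step names come from the slide, everything else is an assumption.

```python
# The two current streams encoded as ordered step lists (step names from
# the slide); the shared-prefix check below is a toy illustration of why
# a single ingest unit in front of both streams is attractive.
IPOLL_WORKFLOW = [
    "Acquire Study Materials",
    "Prepare Study for Release",
    "Add Study Level Information",
    "Add Question Level Information",
    "Proof Question Level Content",
    "Release to Public",
]
ARCHIVE_WORKFLOW = [
    "Acquire Study Materials",
    "Initial Processing",
    "Quality Control 1",
    "Intensive Processing",
    "Quality Control 2",
    "Archive Electronic Files",
    "Release to Public",
]

shared = []
for a, b in zip(IPOLL_WORKFLOW, ARCHIVE_WORKFLOW):
    if a != b:
        break
    shared.append(a)
print("Steps both streams share up front:", shared)
```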

Review Phase
- Guide to common Roper Center terminology
- Peer and environmental scan
- Metadata standards
- Reporting our findings

We created the common Roper Center terminology document. Ann conducted a peer and environmental scan covering information models (OAIS), assessment frameworks (Data Seal of Approval and DRAMBORA), preservation metadata (PREMIS), repository certification (TRAC and ISO 16363), and software (Colectica). For metadata standards, we identified common metadata "buckets" or "containers" and considered standards like DDI. The reporting produced a full report identifying our current deficiencies and recommendations for a new processing system, an executive summary to communicate our general findings to our Board of Directors, and a workbook that acts as a roadmap through the design and implementation phases.

Project Recommendations
- Develop new processing and metadata production infrastructure
- Develop a DDI-compliant Ingest & Processing Unit
- Combine processing teams
- Develop an integrated queue and tracking system
- Document policies and processing rules
- Extend support for the AAPOR Transparency Initiative
- Develop a formal preservation planning process

In more detail, the recommendations were to develop a new processing and metadata infrastructure based on a single-stream workflow processing model: develop a single ingest unit; combine the processing teams; build a DDI-compliant, integrated common metadata structure; develop a metadata production interface; develop an integrated queue and tracking system; document policies and processing rules; extend support for the AAPOR Transparency Initiative; hire highly skilled, specialized staff; consider self- and external-assessment certifications; and develop integrated preservation actions and planning.
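To give a flavor of what a "DDI-compliant common metadata structure" can mean at the study level, here is a hedged sketch that serializes a few study-level fields into DDI-Codebook-style elements; the element nesting is deliberately simplified for illustration and is not the Center's production schema, and the study values are invented.

```python
import xml.etree.ElementTree as ET

# Hypothetical study-level metadata; the values are invented.
study = {
    "title": "Example National Opinion Survey, 2014",
    "producer": "Example Survey Organization",
    "collection_date": "2014-05-01",
}

# A handful of DDI-Codebook-style element names (codeBook, stdyDscr,
# citation, titl, AuthEnty, collDate) in a deliberately simplified
# nesting; a production record would follow the full DDI schema,
# including namespaces and the required intermediate elements.
codebook = ET.Element("codeBook")
stdy_dscr = ET.SubElement(codebook, "stdyDscr")
citation = ET.SubElement(stdy_dscr, "citation")
ET.SubElement(citation, "titl").text = study["title"]
ET.SubElement(citation, "AuthEnty").text = study["producer"]
ET.SubElement(stdy_dscr, "collDate").text = study["collection_date"]

print(ET.tostring(codebook, encoding="unicode"))
```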

Implementation Phases
- Core Ingest Unit
- Merge of processing teams
- Study Assessment Module
- Acquisition Management Module
- Workflow Management/Queue and Tracking System

Within her project recommendations, Ann identified a number of feasible "first steps" the Roper Center could take toward infrastructure redesign. Our team's first task after the review was to break these steps down even further into lean, manageable phases. There are only 3 of us at the Roper Center dedicating time to this project, and with all 3 of us wearing multiple hats and juggling the day-to-day, and 1 of us transitioning into a new organization and position, it's crucial for us to tackle this project incrementally. The implementation phases we've identified are: develop the Core Ingest Unit; merge our processing teams; develop the Study Assessment Module; develop the Acquisition Management Module; and develop the Queue and Tracking System (for processing at study, file, and variable levels).

Progress So Far
1. Developed Core Ingest Unit: persistent Study ID; study- and file-level metadata production infrastructure
2. Began merge of processing teams
3. Developed Study Assessment Module

Developed Core Ingest Unit. The persistent Study ID is a unique identifier assigned and maintained no matter what we acquire first from a study. This is crucial because we often acquire published material before we acquire the dataset it came from, and our access tools require us to make connections between these materials. In the current system we backtrack and identify these connections manually; the Ingest Unit increases our efficiency by assigning a centralized unique identifier for the study and allowing us to attach everything to do with a single study to that study's record. The core Ingest Unit also currently supports study- and file-level metadata entry and management, which we'll see in a moment.

Began merge of processing teams and workflows. We started by integrating our physical space and continued by incrementally cross-training our undergraduate student workers in tasks completed "on the other side": students hired for iPOLL work are able to jump into some of the work on the data processing side, and vice versa.

Developed Study Assessment Module. We designed and developed an assessment module for the systematic intake of new acquisitions. Its key features are that staff working in acquisitions begin building study-level metadata records pre-ingest, and that the system itself encourages conscious adherence to our acquisitions policy. We're going to take a look at the design and implementation of the assessment module.
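A minimal sketch of the persistent Study ID idea, assigning one identifier to whatever arrives first and attaching later materials to the same record, follows; the registry design, matching key, and ID format are assumptions for illustration, not the actual Ingest Unit.

```python
import itertools

# Minimal sketch of a persistent Study ID registry: whichever material
# arrives first (a press release, a topline, the dataset itself) triggers
# ID assignment, and everything acquired later attaches to the same
# record. The matching key and ID format are illustrative assumptions.
class StudyRegistry:
    def __init__(self):
        self._counter = itertools.count(1)
        self._materials = {}   # study_id -> list of attached materials
        self._by_title = {}    # crude lookup key; real matching is richer

    def ingest(self, study_title, material):
        study_id = self._by_title.get(study_title)
        if study_id is None:                        # first item from this study
            study_id = f"ROPER-{next(self._counter):06d}"
            self._by_title[study_title] = study_id
            self._materials[study_id] = []
        self._materials[study_id].append(material)  # attach to the study record
        return study_id

reg = StudyRegistry()
first = reg.ingest("Example Poll, May 2014", "press release")
later = reg.ingest("Example Poll, May 2014", "dataset")
assert first == later   # same persistent ID regardless of arrival order
```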

Conceptual Design
There are 3 categories of activities within our new system: Assessment, Acquisitions, and Workflow Management. We see the 3 blue boxes at the top as Assessment tasks: we determine whether or not the study meets our acquisition requirements and whether or not we've already acquired something from it. On the right, we have Acquisitions tasks, such as checking for missing metadata components like sampling methodology information or permissions verification; anything that requires back-and-forth with data providers is something we want to track in our Acquisitions Tracking system. The boxes on the bottom represent tasks that become relevant to Workflow Management: we can't assign material a "task path" until our Workflow Management/Processing piece is in place, so that is outside the scope of our Assessment Module phase.

Detailed Conceptual Design
[Flowchart] Decision points: Check for Study (NEW vs. IN SYSTEM); Assessment Checkboxes (PASS vs. FAIL); What is the processing priority? (IPOLL vs. NOT IPOLL). Destinations: INGEST, Acquisitions Tracking, Holding Bin, Question-Level Processing Queue, Archive Processing Queue.
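Reading the flowchart together with the conceptual design above, the routing logic might look roughly like the sketch below; the branch names and queue labels follow the slide, but the exact decision logic is our interpretation, not the Center's implementation.

```python
# Hedged sketch of the assessment routing implied by the flowchart; the
# decision order is an assumption drawn from the conceptual-design notes.
def route_new_material(study_is_new, passes_assessment,
                       missing_acquisition_info, is_ipoll_priority):
    """Return the destination for newly assessed material."""
    if not study_is_new:
        return "attach to existing study record"      # already IN SYSTEM
    if not passes_assessment:
        return "Holding Bin"                          # FAIL: does not meet policy
    if missing_acquisition_info:
        return "Acquisitions Tracking"                # back-and-forth with provider
    # PASS: ingest, then queue by processing priority
    return ("Question-Level Processing Queue" if is_ipoll_priority
            else "Archive Processing Queue")

print(route_new_material(True, True, False, True))    # Question-Level Processing Queue
```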

Demo of Roper Center Assessment module

Next Steps
- Upstream: Acquisitions Management Module
- Downstream: metadata-driven variable-level processing interface; workflow management/task paths; quality assurance controls
- Preservation planning

In terms of next steps, we frame discussions of this project by referring to processes that are "upstream" and "downstream" from Ingest in the workflow. For "upstream" work we have the Acquisitions Management piece. "Downstream," we will be developing a DDI-compliant infrastructure and interface for variable-level processing and a method for managing workflow and tasks throughout the entire system, and we will continue to integrate quality assurance controls across the board. And, as a rider to this project, we'll be focusing a lot of attention on preservation planning.
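Because the downstream work centers on DDI-compliant, metadata-driven variable-level processing, here is a hedged sketch of a variable description expressed with a few DDI-Codebook-style elements (var, labl, qstnLit, catgry); the nesting is abridged for illustration and the variable content is invented, not taken from the planned interface.

```python
import xml.etree.ElementTree as ET

# Simplified DDI-Codebook-style variable description (var / labl /
# qstn/qstnLit / catgry); the nesting is abridged for illustration and
# is not a complete or validated DDI instance.
var = ET.Element("var", attrib={"name": "Q1"})
ET.SubElement(var, "labl").text = "Presidential approval"
qstn = ET.SubElement(var, "qstn")
ET.SubElement(qstn, "qstnLit").text = "Do you approve or disapprove of ...?"
for code, label in [("1", "Approve"), ("2", "Disapprove"), ("8", "No opinion")]:
    catgry = ET.SubElement(var, "catgry")
    ET.SubElement(catgry, "catValu").text = code
    ET.SubElement(catgry, "labl").text = label

print(ET.tostring(var, encoding="unicode"))
```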

Workflow Analysis Tips
- Bring in an outside expert
- Be comfortable with your dirty laundry
- Be open to change
- Change incrementally

The takeaway we really hope you leave with is that internal reviews are feasible and worth your while, no matter the size of your staff. We all recognize that our workflows could be more efficient, but fully addressing workflow issues takes sitting down and hashing things out with an outside expert, being comfortable with your dirty laundry, being open to change, and being willing to change at the pace that works for your staff. If we can do it, so can you. Bring in an expert for the value of the outsider perspective. Be fully open to the process; be okay with your dirty laundry. Be open to change. Change incrementally, working in manageable steps, and start with a concept and take it all the way through.

Thank you! Elise Dunham: elise.dunham@uconn.edu Cindy Teixeira: cynthia.teixeira@uconn.edu Roper Center: www.ropercenter.uconn.edu Follow us on Twitter: @RoperCenter Like us on Facebook: www.facebook.com/ropercenter Thanks to Ann Green for all of her work and continued guidance on this project.