My Resource for Excellence.
Canadian Heritage Information Network
Creation of the Collections Management Software Review (CMSR)
Heather Dunn, CHIN

Presentation Summary
- CHIN and its mandate
- What is the CHIN Collections Management Software Review?
- Why did CHIN undertake this project?
- How was the Review conducted?
  - Evaluation team
  - Tools
  - Process
- Publication

CHIN and Its Mandate
- The National Centre of Excellence for Museums of the Department of Canadian Heritage, created in 1972
- Develops and provides skill development products and services for heritage professionals
- Supports the development, presentation and marketing of innovative digital technologies

CHIN
- 1972 – The National Inventory Program
- 1982 – Canadian Heritage Information Network
- 1995 – First corporate / professional Web site
- 1999 – Artefacts Canada national database
- 2001 – Virtual Museum of Canada
- 2009 – Redesign of Web sites into 3 portals:
  - The Corporate site
  - The Professional Exchange
  - The Virtual Museum of Canada

Active Network of Heritage Institutions
- More than 1,400 not-for-profit heritage member institutions of all sizes and disciplines, from across Canada
- Over 35 years of experience
- National and international partnerships

What Is the Collections Management Software Review (CMSR)?
- A series of CHIN publications that evaluated collections management software products for museums
- Four editions, published between 1996 and 2003

What Is the CMSR (continued)?
- Assessed the suitability of specific software to museum discipline, collections size, museum functions, and hardware and software environment
- Analyzed vendor reliability, support requirements, customization possibilities, and costs
- Ensured that the software met CHIN and international standards, and allowed for importing and exporting data

Why Did CHIN Undertake the CMSR Project?
- In 1995, CHIN began assisting museums with the transition of their collections data from the CHIN mainframe to in-house collections management systems
- The CMSR was created to assist museums with this transition by helping them select appropriate software
- The transition was complete by 1998
- Today, museums maintain their own collections management data in-house and periodically upload it to the Canadian national database, now called "Artefacts Canada"

Editions of the Review
CHIN published four editions of the Review:
- Edition 1 (1996) reviewed 11 software products
- Edition 2 (1997) reviewed 16 software products
- Edition 3 (2000) reviewed 18 software products
- Edition 4 (2003) reviewed 16 software products

Product Reports – excerpt

How Was the Review Produced?
- Creation of the Criteria Checklist
- Request for Information
- Evaluation Team
- Demonstrations/Evaluations
- Publication

Creation of the Criteria Checklist
- A list of over 500 functions that can be performed by a collections management system, for example:
  - Does the system allow the user to record the person who moved an object or specimen lot? Demonstrate.
  - Is it possible for external pre-built thesaural files to be integrated into the software? Demonstrate.
- The Checklist was a key tool in the creation of the Review: it was the basis for comparison used to assess and rate each function performed by the various software packages (a data-model sketch follows below)
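
The transcript does not describe how the Checklist was actually stored or structured, so the following is only a minimal sketch of how its criteria could be modelled; the `Criterion` class, the identifier codes, and the category labels are all hypothetical, and only the two example questions come from the slide above.

```python
from dataclasses import dataclass

@dataclass
class Criterion:
    """One yes/no function question from the Checklist (hypothetical model)."""
    code: str        # hypothetical identifier, e.g. "LOC-12"
    category: str    # hypothetical grouping label
    question: str    # the question put to the vendor

checklist = [
    Criterion("LOC-12", "Location and movement control",
              "Does the system allow the user to record the person who "
              "moved an object or specimen lot?"),
    Criterion("TERM-03", "Terminology control",
              "Is it possible for external pre-built thesaural files to be "
              "integrated into the software?"),
]

# Vendors returned the Checklist marking which functions their product
# supports; only claimed functions were scheduled for demonstration.
vendor_claims = {"LOC-12": True, "TERM-03": False}
to_demonstrate = [c for c in checklist if vendor_claims.get(c.code)]
print([c.code for c in to_demonstrate])  # ['LOC-12']
```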

Requesting Information from Software Vendors
- A request for product information was sent to over 40 collections management software vendors internationally
- The request:
  - outlined the parameters for the evaluation
  - asked the vendor for pertinent information, such as a product description, product costs, etc.
  - included the Criteria Checklist, to be completed by the vendor
- Responding vendors indicated which functions within the Checklist their software could perform, and were scheduled to demonstrate the functions they claimed to support

The Evaluation Team
- The Evaluation Team for the Reviews consisted of:
  - 4 CHIN staff members (3 for some Editions) who were dedicated full-time to the Review
  - Approximately 20 museum professionals volunteering as reviewers
- The volunteer team members were generally from Canadian or U.S. museums that were looking for software
- To find volunteer evaluators, CHIN notified the museum community of the opportunity to evaluate collections management software
- Respondents included registrars, curators and collections managers; all had a background in collections management, but represented a wide variety of museum sizes and disciplines

Product Demonstrations
- Each software vendor that had responded to the Request for Information was scheduled to demonstrate its software
- For the earlier Editions:
  - Demos at the CHIN office
  - A 2-day demo of all the Checklist items the vendor supported
  - Approximately 20 evaluators from Canadian museums – some local, many remote
- For later Editions:
  - Evaluations were "taken to the community" – demos held in conjunction with U.S. and Canadian museum conferences (e.g., AAM, CMA)
  - A 1-day demo of selected criteria (169 of the 500)
  - The list of criteria selected for evaluation was not shared with the vendor in advance

Evaluation
- The Evaluation Team followed the Criteria Checklist, asking the vendor to demonstrate only the functions it claimed to support
- For each function demonstrated, team members provided:
  - Scores (Good, Fair, Poor, or Does not Perform, with "+" or "-" modifiers for finer grading)
  - Comments on each demonstrated criterion
  - A narrative overall evaluation of the software
- The scores were converted to numeric values, then averaged and summarized for the "Software Review" (a numeric sketch follows below)
- Detailed average scores and comments were made available in the "Product Profiles", one for each software product
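
The presentation does not give the numeric scale CHIN used when converting the letter scores, so the sketch below assumes a simple 0–3 scale with ±0.25 for the "+" and "-" modifiers; only the score labels themselves come from the slide above, and the evaluator data is invented.

```python
from statistics import mean

# Assumed numeric scale (not from the source): Good=3, Fair=2, Poor=1,
# Does not Perform=0, with "+"/"-" worth +/-0.25.
BASE = {"Good": 3.0, "Fair": 2.0, "Poor": 1.0, "Does not Perform": 0.0}
MODIFIER = {"+": 0.25, "-": -0.25}

def score_value(score: str) -> float:
    """Convert a score such as 'Good+' or 'Fair-' to a number."""
    if score and score[-1] in MODIFIER:
        return BASE[score[:-1]] + MODIFIER[score[-1]]
    return BASE[score]

# One criterion as scored by three evaluators (invented data).
evaluator_scores = ["Good", "Good-", "Fair+"]
average = mean(score_value(s) for s in evaluator_scores)
print(round(average, 2))  # 2.67
```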

Publication
- The Software Review and the Criteria Checklist were made freely available on the CHIN Web site; printed versions were sold
- The Criteria Checklist was also available online in a "customizable" version: museums could select the criteria they required from the Checklist and produce a custom report detailing which software products met those criteria, and how well they performed (sketched below)
- "Product Profiles" (detailed reports on individual software products) were given to CHIN member museums on request, but sold to others
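
How the online tool generated these custom reports is not described in the transcript; the snippet below is only a hypothetical sketch of the underlying idea, filtering per-product average scores down to a museum's selected criteria. The product names, criterion codes, and scores are all invented.

```python
# Per-product average scores keyed by criterion code; a missing key means
# the criterion was not demonstrated. All names and numbers are invented.
product_scores = {
    "Product A": {"LOC-12": 2.75, "TERM-03": 1.50},
    "Product B": {"LOC-12": 2.00},
}

# The criteria this museum selected from the Checklist.
selected = ["LOC-12", "TERM-03"]

# Build the custom report: how each product performed on the selection.
for product, scores in product_scores.items():
    row = {c: scores.get(c, "not demonstrated") for c in selected}
    print(product, row)
```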

Why Did Software Vendors Participate?
- CHIN did not pay vendors to participate or cover their demonstration costs
- Vendors saw this as an opportunity to market their software
- Vendors that were included in the first editions of the Review had a head start in an emerging market in Canada
- CHIN received requests from vendors wanting to participate in subsequent reviews

Influence on the Software Market
- "CHIN Accreditation" was achieved if the software imported/exported data in a format compatible with Canada's national collections inventories
- Software products were then marketed as "CHIN-Accredited"
- As a result, many vendors developed an import/export function specifically for the Canadian market, based on CHIN data fields

How Museums Used the Review and Related Products
- Museums used the Review to shortlist software products
- A museum would then request the more detailed "Product Profiles" for its shortlisted systems
- The museum would request an in-house demonstration from the vendors of the products that met its criteria
- Museums downloaded and modified the Criteria Checklist, using it to score and rate products during their own demonstrations
- Museums used the online Software Selection course to guide them through the software selection process
- The Review received positive feedback from museums and from evaluators

Collections Management Software in Canada
- A very wide variety of software is used in Canada, and CHIN does not endorse any particular software; however, the predominant packages are:
  - For small museums:
    - Virtual Collections (GCI Inc.)
    - PastPerfect
  - For medium to large museums:
    - Mimsy (Willoughby/Selago)
    - The Museum System (Gallery Systems)
    - KE EMu

How Long Did It Take?
- About 9 months per Edition
- For the 4th Edition:
  - January 2003 – RFI sent out to vendors
  - February 2003 – Responses received
  - April to June 2003 – Software evaluations took place at various locations
  - July to September 2003 – Results were compiled
  - Late Fall 2003 – Publication

Future?
- Plans to update the Criteria Checklist
- The update would reflect the new functionality of today's software products

Thank You!
Heather Dunn
Heritage Information Analyst
Canadian Heritage Information Network (CHIN)
Department of Canadian Heritage
Government of Canada