
1 RDA: A Practitioner’s Approach to Staff Training and Testing
Erin Stalberg, Head, Metadata & Cataloging, NCSU Libraries
erin_stalberg@ncsu.edu
LYRASIS Ideas & Insights Series: Technical Services
January 20, 2011

2 What this presentation is NOT
It is not RDA training
It is not a debate about the merits of RDA
It is not any sort of future-guessing about the US National Libraries’ implementation decision

3 What this presentation IS
It is about the NCSU experience with staff training and transition and what we learned during the test
It is about what we learned about cooperative cataloging by getting a window into the workings of the National Libraries
It is about the testers finding out what YOU want to learn from our experience(s)

4 About Metadata & Cataloging @ NCSU
20 Metadata & Cataloging staff
–7 in Monographs
–6 in Serials & Continuing Resources
–4 in Metadata & Data Quality
–1 Technology Support for Technical Services
Highly centralized
2009-2010 cataloging output
–60,568 physical & electronic titles (MARC)
–50,504 physical volumes (MARC)
–12,909 digital image assets (non-MARC)
–779 digital text assets (non-MARC)
–669.75 linear feet of manuscript materials (non-MARC)
–2,943 faculty citations (non-MARC)

5 About Metadata & Cataloging @ NCSU
MARC … OCLC & SirsiDynix Symphony
EAD … Archivist’s Toolkit
VRA Core … VCat
MODS … local “Digital Assets” database
Dublin Core … DSpace
NCSU is not a participant in the PCC.

6 Discovery @ NCSU
Endeca … fed by multiple data sources/standards
Historical State … fed by multiple data sources/standards
Finding Aids Interface … EAD
Luna … fed by multiple data sources/standards
Summon (Serials Solutions) … fed by multiple data sources/standards
QuickSearch … fed by multiple data sources/standards

7 Why did NCSU choose to participate in the test?
To force ourselves to learn
Copy-heavy institution
Support-staff-heavy institution
Trying to re-invigorate our training program
How do records of various types co-exist happily?
Assessment/usability & cost/value

8 About the test – Timeline
July-September 2010
–Test partners became familiar with the content of RDA and with navigating the RDA Toolkit.
October-December 2010
–Test partners produced records in the test.
–http://www.loc.gov/catdir/cpso/RDAtest/rdatestrecords.html
January-March 2011
–The US RDA Test Coordinating Committee analyzes the results of the test and prepares its report to the management of the three national libraries.
–Institutional analysis of records and surveys.
–Decision expected for ALA Annual, June 2011.

9 About the test – Participants
Library of Congress
National Agricultural Library
National Library of Medicine
Backstage Library Works
Brigham Young University
Carnegie Library of Pittsburgh
Clark Art Institute Library
University of Chicago
College Center for Library Automation (Florida)
Columbia University
Douglas County Libraries, Colorado
Emory University
GSLIS Group
Minnesota Historical Society
Morgan Library and Museum
Music Library Association/Online Audiovisual Catalogers, Inc.
North Carolina State University Libraries
University of North Dakota
North East Independent School District, San Antonio, Texas
Northeastern University
OCLC Metadata Contract Services
Ohio State University Libraries
State Library of Pennsylvania
Quality Books
Stanford University Libraries
George Washington University

10 About the test – Common set original
–6 print single-part monographs
–1 print multipart monograph
–3 e-monographs
–1 film DVD
–1 streaming video
–1 sound recording on CD
–1 audiobook
–1 poster
–2 print serials
–3 e-serials
–1 integrating loose-leaf
–4 integrating e-resources

11 About the test
Common set copy
Extra set
Surveys
–Institutional surveys
–Cataloger demographic surveys
–Record-by-record surveys
–Non-testers could also share their experience here: http://www.loc.gov/catdir/cpso/RDAtest/admindoc8.doc

12 Record-by-record surveys
What is the sequential number of this record in your personal bibliographic record production?
How much experience do you have in cataloging this type of resource?
How many minutes did it take you to complete this bibliographic record? How many minutes were spent consulting others?
In creating this record, which of the following did you encounter difficulties with?
–Online tool (RDA Toolkit)
–Content of cataloging instructions
–Selecting from options in the cataloging instructions
–Coding/tagging

13 Record-by-record surveys
How many minutes did it take you to review, create, and/or update authority records associated with the item in question?
How many new authority records did you create in describing this item?
What type of new authority records did you create?
–Personal name
–Corporate name
–Conference name
–Authorized access point for a work or expression
–Family name
–Geographic name
–Series title

14 Approach to training @ NCSU
Involved all Metadata & Cataloging staff
Was not to be a debate about the merits of RDA
It would not cover everything
It covered what we needed to know for the test
Established a training team
Knowing when and what to ask
It had to succeed!

15 Approach to training @ NCSU
Training team
–Christee Pascale, Associate Department Head
–Jacquie Samples, Continuing & Electronic Resources Librarian
–Patrice Daniels, Monographs
–Anne Navarro, Monographs
–Lisa Madden, Continuing & Electronic Resources

16 How the trainers were trained
Jacquie Samples went to the Library of Congress’ Train the Tester session (for testing participants) at ALA Midwinter, January 2010
Cataloging Management Team watched webinars together
The RDA Training Team, who are all members of the Cataloging Management Team, further assembled other available resources, then learned and muddled through as a group, developing training content while simultaneously learning the material ourselves
Developed local policies & procedures

17 Local policy development
Policy setting v. use of cataloger’s judgment
Policy setting for the test v. forever (if adopted)
Alternatives/optional omissions/optional deletions
–To follow LC or not to follow LC?
LC Core Plus
Criteria for upgrading copy
Relationship designators (an illustrative MARC example follows this list)
Rule of 3
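For illustration only (this example is not from the original slides): during the test, relationship designators drawn from RDA Appendix I were typically recorded in subfield $e of name access points in MARC bibliographic records. The names below are hypothetical placeholders.

100 1# $a Doe, Jane, $e author.
700 1# $a Roe, Richard, $e editor.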

18 Training @ NCSU – Curriculum
Webinar: RDA Changes from AACR2 for Texts (Barbara Tillett)
FRBR
RDA core training
Breakout groups
ALCTS webinars & ongoing discussion

19 Approach to training @ NCSU – FRBR
Hour-long session before the official start of RDA training
Deliberately tailored the content to focus on the concepts needed to carry over into RDA training and then attempted to make those concepts more concrete

20 Approach to training @ NCSU – FRBR
What worked well?
–Tailoring the content to need-to-know for RDA
–Concrete examples & props
–Focus on user tasks
–Group discussion

21 Approach to training @ NCSU – FRBR
What worked less well?
–FRBR is hard and needs to be reinforced throughout.
–The “fun” FRBR philosophical debate is around resources published in multiple expressions/manifestations. NCSU does not collect heavily in these areas.

22 Training @ NCSU – RDA core training
Differed from LC’s training in two ways:
–More intentionally taught RDA in terms of MARC21 and AACR2
–Softened the presentation of RDA in its FRBR/FRAD-based conceptual framework
12 hours of training over a 3-day period

23 Training @ NCSU – RDA core training
Day One
–Introducing RDA
–Access Points
–Relationship Designators
–Preferred Title for the Work
Day Two
–Sources of Information
–Identifier for the Manifestation
–Title Proper and Statement of Responsibility
–Content, Media and Carrier Types
–Designation of the Edition
–Publication Statement and Copyright Date
–Extent, Illustrative Content (etc.) and Dimensions
Day Three
–Dates for Multipart Monographs, Serials and Integrating Resources
–Series Statement
–Numbering of Serials & Series
–Notes
–MARC Encoding for the US RDA Test
–Wrap-Up

24 NCSU did NOT cover
Changes to types of materials we do not often collect:
–Parts of the Bible
–Rare books
–Treaties
–Music
We also did not train in-depth on the new MARC Authority Record fields … just enough to be able to read an RDA authority record
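For orientation, a minimal sketch of the kind of descriptive fields added to the MARC authority format alongside RDA; the values below are hypothetical and the list is not exhaustive:

046 ## $f 1972
370 ## $e Raleigh (N.C.)
374 ## $a Librarians $2 lcsh
377 ## $a eng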

25 Training @ NCSU – Breakout groups
Split our staff of 20 into small groups
Created practice records together
Resources we were likely to catalog during the US Test:
–Single part monograph (print and electronic)
–Multipart monograph (print and electronic)
–Upgraded monographic copy (from AACR2 to RDA)
–DVD
–Children's resources
–Streaming media
–Theses & dissertations
–Serials (print and electronic)
–Integrating resources (print and electronic)

26 Training @ NCSU – RDA core training
What worked well?
–Half-day sessions
–Involving support staff in the content creation
–Having more than one presenter
–Starting with the harder stuff and leaving on a “high”
–Having professional-looking PowerPoints & handouts
–Having and sticking to an agenda
–Investing in the planning
–Discussion that ended in decision-making & follow-up
–Snacks!

27 Training @ NCSU – RDA core training
What worked less well?
–Easy to get derailed by the edge cases
–Discussion that did not end in decision-making
–Staff really want examples, examples, examples: we did not have enough examples, we did not show full records, and they were not all contextual to our environment

28 Impact on systems
NCSU added all new MARC fields (ourselves) to our SirsiDynix Symphony policy tables
GMD/336-338
–NCSU is not displaying and/or utilizing these fields in Endeca
–We currently facet & draw icons based on item data
–Hit lists & reports in Symphony are a bit of an issue without the GMD as easy format notation
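For readers who have not seen the 33X fields mentioned above, here is a minimal sketch (not drawn from NCSU records) of the RDA content, media, and carrier type fields that replace the AACR2 GMD, assuming an ordinary print volume:

336 ## $a text $b txt $2 rdacontent
337 ## $a unmediated $b n $2 rdamedia
338 ## $a volume $b nc $2 rdacarrier

The coded $b values here are the “33X subfield b” coding flagged as an early stumbling point on the next slide.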

29 Oops!
Provider-neutral monograph cataloging
33X subfield b
Cataloger’s judgment & MARC coding implications for RDA decisions

30 Supplying a Date of Publication for Single Part Monographs
RDA allows both:
–260 ## $a Hoboken, N.J. : $b Wiley, $c [date of publication not identified], ©2010.
and
–260 ## $a Hoboken, N.J. : $b Wiley, $c [2010], ©2010.

31 NCSU Statistics
Common set original: 25
Common set copy:
–met NCSU criteria for upgrade: 4
–did not meet NCSU criteria for upgrade: 1
Extra set with surveys: 261
–MARC Original: 189
–MARC Copy: 62
–MODS: 10
Extra set without surveys: 201 ETDs

32 Effect of NCSU RDA cataloging on the community thus far (December 31, 2010)
NCSU MARC original cataloging: 189
–156 records (82.5%) have no other holdings to date
–28 records (14.8%) have 1-10 other holdings to date
–5 records (2.6%) have more than 10 holdings to date

33 What we’ve learned so far
It is kind of cool to reset the training baseline
LC is not like the rest of us
It is very hard to make local decisions without knowing what LC is planning to do
It is easy to get bogged down by edge cases
Unlearning is hard
Energy is a good thing in and of itself

34 Things we know in our heads but it is still good to be reminded of
People like examples
Catalogers like rules
Catalogers like when expectations are clear and documentation is up to date
Support staff like when their bosses know the answers to their questions
Cataloging managers like when LC figures things out first

35 Things we know in our heads but it is still good to be reminded of
“Cataloger’s judgment” needs something to be grounded in
–FRBR user tasks
The people at LC are just people too
We are all responsible for our metadata future
We are all responsible for the cost/value decisions in our libraries

36 What we still don’t know
Costs for implementation
Time study results
User assessment
Impact on authority control

37 What is NCSU doing during the evaluation period?
We are continuing to catalog in RDA for original cataloging & copy that meets our upgrade criteria.
We are accepting full AACR2 copy as is.
We are following PCC policy to use AACR2 authorized headings in RDA bibliographic records.
We are reflecting on our experience & our local decisions.

38 Documentation & Resources
NCSU:
–https://staff.lib.ncsu.edu/confluence/display/MNC/RDA+Test
–particularly: NCSU RDA Training FAQ
University of Chicago:
–http://www.lib.uchicago.edu/staffweb/depts/cat/rda.html
Library of Congress:
–http://www.loc.gov/catdir/cpso/RDAtest/rdatest.html
–particularly: RDA Changes from AACR2 for Texts (Barbara Tillett)
RDA-L:
–http://www.rda-jsc.org/rdadiscuss.html
LChelp4rda@loc.gov
–Email address to contact LC for questions about rules

39 Questions?
What do you want to know from us and our experience?
What can the early adopters provide that will lower your implementation costs?
What worries you most about staff transition for an RDA implementation?

