Minutes of the OCLC Expert Cataloging Community Sharing Session
ALA Midwinter Meeting
Friday, 2018 February 9
10:30 a.m.-12:00 noon
Colorado Convention Center, Denver, Colorado

 

The ALA Midwinter 2018 edition of Breaking Through: What’s New and Next from OCLC and the compilation of News from OCLC were distributed. The first item from the latter was highlighted:

  • The first Virtual AskQC monthly Office Hours took place on Wednesday, 2018 January 31. OCLC’s Metadata Quality Control staff made themselves available for one hour online to answer cataloging questions. OCLC will continue to hold these office hours on the last Wednesday of each month at 1:00 p.m. Eastern Time through June and then evaluate whether there has been enough interest and participation to make it worthwhile to continue.
  • Notes from the January 31 session will be distributed soon, although, because of a technical problem, that first session was not recorded. OCLC intends to record future sessions. Questions to be answered by OCLC Quality Control staff may be sent to AskQC@oclc.org. Laura Ramsey would discuss the Member Merge Project next.

Laura Ramsey (Section Manager, Quality Control) announced the expansion of the Member Merge Project, in which member library staff are trained to merge duplicate records. Phase Two had begun in August 2017 with four libraries; Phase Three will begin with four or five additional libraries in the coming year. Those wishing to apply should send a message to AskQC@oclc.org. A total of nine institutions currently participate. The training process includes WebEx meetings and test exercises, and participants work one-on-one with a QC staff member. It remains a work in progress.

There were two pre-submitted questions:

At past Expert Community sessions we have been told that if we change even one subject heading we should delete all the FAST headings so that they will be regenerated. Lately I am updating quite a few records that already have FAST headings but no LC subject headings. I assign LC subject headings which I think are the ones from which the FAST headings would derive. Do I need to delete all the FAST headings for regeneration in this case also?

Not too long ago, the following announcement was distributed to PCC participants:

We would like to provide updated information about how catalogers should treat FAST headings in OCLC records when updating LCSH. A monthly process monitors additions or changes to LCSH and makes applicable changes to FAST headings. Because of this, catalogers do not need to edit FAST headings when they change LCSH. Please note that if a cataloger would like to change the FAST headings, this is okay, and the monthly process will look at those changes, updating or correcting the FAST headings as necessary. However, with cataloger-entered changes, no attempt will be made to synchronize the LCSH and FAST headings.

We are in the process of documenting this in Bibliographic Formats and Standards and on the FAST website, so in the future we can point to documentation on this process. For further questions, please contact Diane Vizine-Goetz at fast@oclc.org.
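
As a hypothetical illustration of that derivation (the heading and the fst numbers below are placeholders, not taken from actual records), an LCSH string and the faceted FAST headings generated from it might appear together as:

    650 _0 $a Libraries $z Minnesota
    650 _7 $a Libraries $2 fast $0 (OCoLC)fst00000000
    651 _7 $a Minnesota $2 fast $0 (OCoLC)fst00000000

The monthly process described above would keep FAST fields like these in step with subsequent edits to the LCSH field.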

What are OCLC’s plans for the use of subfield $0?

Here’s what OCLC has done regarding URIs in subfield $0:

  • Validated subfield $0 in all bibliographic fields where MARC defines it.
  • Adjusted subfield $0 validation to accommodate URIs in the most recent OCLC-MARC Update, installed in September 2017.
  • Updated BFAS (http://www.oclc.org/bibformats/en/controlsubfields.html) to account for the expanded uses of subfield $0.
  • Continued to hope that clear guidelines on the appropriate uses of subfield $0 will be made available.

OCLC is currently working on implementing MARC Update 25 (issued in December 2017), which includes the new subfield $1 (Real World Object URI).
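
As a sketch of the distinction (hypothetical heading, placeholder URIs), a personal name access point could eventually carry both subfields, with $0 identifying an authority record or other authority identifier and $1 pointing to a URI for the real-world entity itself:

    100 1_ $a Rivera, Ana M. $0 http://id.loc.gov/authorities/names/n00000000 $1 http://www.wikidata.org/entity/Q00000000

(Both the name and the identifiers above are placeholders for illustration only.)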

The floor was then opened for questions, answered by Bryan Baldus (Consulting Database Specialist, Metadata Quality); Hayley Moreno (Database Specialist II, Metadata Quality); Sara Newell (Senior Product Analyst, Metadata Services); Rosanna O’Neil (Senior Library Services Consultant, Library Services for Americas); Nathan Putnam (Director, Metadata Quality); Laura Ramsey (Section Manager, Metadata Quality); Roy Tennant (Senior Program Officer, OCLC Research Library Partnership); Jay Weitz (Senior Consulting Database Specialist, Metadata Quality); and Cynthia Whitacre (Manager, Metadata Quality).

Now that you have the AskQC Virtual Office Hours, do you prefer that we submit our questions there, or should we send them to AskQC@oclc.org as in the past?

That’s entirely up to you. If you have a question that needs an answer now, you needn’t wait till the end of the month to get an answer, but if you think the answer will be of general interest and want others to hear it, you can save it for the virtual office hours if you prefer.

If a 100 field has a subfield $0, should we control it, which causes the subfield $0 to disappear?

When a heading is controlled in WorldCat, the link in subfield $0 is to the authority record controlling the heading. That’s the preferred method for keeping headings up to date; headings that aren’t controlled cannot be updated automatically. Sara Newell noted that Metadata Services just updated Record Manager to export subfield $0 and has plans to expand that to Collection Manager. Connexion will not be updated to allow export of subfield $0. Even though the subfield $0 appears to be stripped out when you control a heading, it’s actually still there as the link, unless the subfield $0 refers to a different vocabulary or authority file. Nathan Putnam pointed out that this is a limitation of MARC 21: there is no way to specify which vocabulary a subfield $0 refers to. So a subfield $0 containing an ISNI will be lost when the heading is controlled. There are conversations going on about ways to deal with this problem.
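
As a hypothetical example of that behavior (placeholder name and identifiers), compare two uncontrolled 100 fields:

    100 1_ $a Rivera, Ana M. $0 http://id.loc.gov/authorities/names/n00000000
    100 1_ $a Rivera, Ana M. $0 http://isni.org/isni/0000000000000000

(Placeholder data for illustration only.) Based on the discussion above, controlling the first field would preserve its $0 as the link to the LC/NACO authority record, while the ISNI in the second would be lost, since controlling links the heading only to the authority record.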

We have problems with metadata quality in the knowledge base (KB).

Please contact OCLC support in your region at http://oc.lc/support.

What are the implications for the name authority files of the launch of Voilà, the National Union Catalogue of Library and Archives Canada (LAC), in WorldCat?

Sara said that LAC is migrating to WMS. OCLC plans to make the LAC authority files editable in Record Manager later in 2018. She also expects that Canadian Subject Headings will be controllable.

Casey Mullin of Western Washington University, a participant in the Member Merge Project, delivered a testimonial to the project. Western Washington was part of the second cohort to be trained, beginning with the basics of monograph merging (books and electronic books), then moving on to other bibliographic formats. They are currently training in scores and hope to go on to sound recordings and cartographic materials. As many of your cataloging staff may participate as your institution sees fit; PCC membership, at least at the NACO level, is required, and institutional support is very helpful. Western Washington’s catalog is synchronized with WorldCat, so their work immediately improves the local database, and it is satisfying to get rid of duplicates. You have to know BFAS Chapter 4 (“When to Input a New Record”), which fields transfer when records are merged, and which record should be retained. Because Encoding Level M records have been added via automated processes and have not been subject to human review, one can be more lenient in deciding whether such records are duplicates. It may be difficult and time-consuming to learn and understand the very complex merge rules, but anything you can do to help clean up the database is a great help to all the member libraries. No algorithm can possibly pick up on all the subtle clues that human catalogers use to decide whether to merge.

Can reporting duplicates be made easier?

Duplicates can be reported directly from the bibliographic record. In the Connexion client, use the “Action” dropdown and click on “Report Error.” Copy and paste the OCLC numbers you think are duplicates, or the search that brought the duplicates up, or the ISBNs the records have in common. We can usually figure it out from there.

Can anything be done about the many duplicates from the British Library?

British Library records have been a quality issue for a long time. As part of the LAC project, we are working on cleaning up variant 015 and 016 fields. We hope to use these fields as another, cleaner matching point for British Library and LAC records.

We are having problems with entering Bengali script. The cursor won’t position properly, which makes inputting the text difficult. It’s not just Bengali; Arabic and possibly other scripts are affected as well.

Save the record in your online save file and we’ll take a look at what’s going on. It could possibly be a Windows issue. Cutting and pasting text from the Web can also be a problem, as we often see even with such common characters as apostrophes and quotation marks. Sometimes it works better if you paste the material into Word first and then copy and paste it from there into Connexion.

We were able to improve our cataloging statistics and raise the profile of one of our library technical assistants (LTAs) by actively participating in Expert Community work. It didn’t hurt productivity, either.

The whole idea behind Enhance historically, as well as its expansion into the Expert Community, was to expand the capabilities of OCLC members and to make WorldCat quality a widely shared responsibility. WorldCat would not be what it is without the contributions of members of the cooperative. Every Expert Community edit helps, even though we know that WorldCat is never going to be perfect. OCLC and your fellow members of the OCLC cooperative thank you for all the work you do. We’re all working toward “a more perfect union catalog.”

Can you publish the statistics for merged records and other enhancements?

Some of these statistics are already on the OCLC website at https://www.oclc.org/en/worldcat/cooperative-quality.html. The original Duplicate Detection and Resolution (DDR) implementation ran from 1991 to 2005, during which time DDR went through WorldCat sixteen times, merging about 1.5 million duplicate book records. Since the development of the new DDR between 2005 and 2010, we have run it through WorldCat and continually process newly added records, resulting in the removal of over 31 million duplicates across all bibliographic formats, not only books. Please report any records you think may have been incorrectly merged and we will recover them if they were merged recently enough.

OCLC produces all these splashy infographics, but we would like to see more publicity for statistics like these. The marketing people in the room may be able to take that back.

We’ll look into it. Thanks for the suggestion.


Respectfully submitted by
Doris Seely
University of Minnesota
2018 February 15

With contributions and edits from Bryan Baldus and Jay Weitz.

OCLC
2018 March 6