
OCLC Expert Cataloging Community Sharing Session minutes, January 2015

Minutes of the OCLC Enhance Sharing Session
Friday, January 30, 2015
10:30 a.m.-12:00 p.m.
McCormick Place West, Chicago


The Midwinter 2015 editions of What’s New at OCLC and News From OCLC were distributed. OCLC’s Jay Weitz hosted; Linda Gabel and Robert Bremer were also present to answer questions.

  1. OCLC acquires Sustainable Collection Services (SCS)
    OCLC announced on January 13 that it has acquired Sustainable Collection Services (SCS), which uses WorldCat data to help libraries decide which print titles to keep locally, which can be kept in shared collections, and which can be discarded as more of their materials go digital. OCLC and SCS had already been working closely together; the acquisition will make that collaboration even easier.
  2. OCLC-LC white paper on linked data
    OCLC and the Library of Congress have collaborated on a white paper comparing and contrasting their approaches to linked data, with LC working with BIBFRAME and OCLC with Schema.org. The paper is available online.
  3. Merged records
    In response to a question about the status of the pilot project allowing members to merge duplicate records, Linda Gabel reported that a fifth library has now been added to the original four working on the project; it has gone very well, and OCLC hopes to have more news on the results soon. Members are urged to continue reporting duplicates. The automated merge programs continue to run on selected categories of new and changed records and had merged 17,526,850 records as of the end of January 2015. OCLC constantly fine-tunes the algorithms, but they still make some mistakes, and incorrect merges do occur. Members are urged to report these as well, since OCLC analyzes them to see what went wrong and uses that information to further fine-tune the algorithms. Another merge problem is created by MARCIVE records for documents that are too sparse to allow proper merging. One library working with WorldCat Local has found that the numbers for merged records are sometimes wrong in the knowledge base although they are correct in OCLC.

    One cataloger working on the pilot project has come to realize how complicated the merge process is and now respects OCLC's caution in declining to merge records that seem at first glance to be obvious matches but may reveal subtle differences on closer examination.

    Discussion moved on to why there are so many serial duplicates, particularly from AU@, that are clearly earlier copies of CONSER records. Often part of the problem is that the description has been updated to reflect an earlier issue, so when the old record comes back through batchload it no longer matches the record from which it was originally derived. Sometimes the LCCN was also changed; this used to be routine practice when LC claimed a non-LC CONSER serial for its own catalog. In general, different LCCNs should indicate different titles, but with serials that is not always the case. One comment noted that some of the serial duplicates seemed to be identical character for character, or so nearly identical that it was not clear why they were not automatically merged. Robert commented that OCLC would like to have some of those reported so that they could be tested to see why they did not match, and the software improved if needed.
  4. FAST headings
    There was a question about the status of the project to add FAST subject headings retrospectively; it seems many records still have no FAST headings. The project is ongoing, but some LC subject headings are too complex for FAST to handle. If you change the subject headings on a record that already has FAST headings, you should delete the corresponding FAST headings so that they can be regenerated.
  5. Vendor records that cannot be updated
    One cataloger reported that when he tries to upgrade a vendor record to encoding level I, the system sometimes will not accept his updates. Others suggested this is probably because he has inadvertently changed something in one of the protected vendor fields; even a single extra space inserted in one of these fields will prevent the system from accepting the record replace. It may be impossible to determine what caused the problem. You can sometimes work around it by deriving a new record from the vendor record, which lets you discard the vendor fields, or by saving your edits, canceling all changes to the vendor record, and then pasting your edits back in.
  6. E-book collection management
    Now that e-book collection management has been merged into WorldShare Management Services (WMS), how does one find out what is happening there? You can access WMS information on the OCLC web site or send questions to OCLC.
  7. Metadata management problem
    Metadata management allows libraries to have their records automatically enhanced as enhancements are added in OCLC and have the records returned to them. One library has not been receiving the proper enhanced records. It was suggested that OCLC be asked to update the institution’s profile.
  8. What to do with NUC?
    A member reported that her library has to get rid of 100,000 volumes and she is afraid that the National Union Catalog (NUC) will be among them. She asked what other libraries are doing with their NUC sets. One member said her library had to discard its NUC; they tried to find a home for it but could not. Another said her library had lined a wall in technical services with its set, and staff occasionally show a user how to find needed information in it. No one could offer a compelling reason for retention.
  9. RDA name headings updates
    There was a question about the status of the third stage of the RDA updates to the Name Authority File (NAF). The project is proceeding. Headings that are controlled in master records are easy to update, but those that are not controlled require much more work.

Respectfully submitted by
Doris Seely
January 4, 2015

Edited by Jay Weitz