OCLC Expert Cataloging Community Sharing Session minutes, June 2015

Minutes of the OCLC Enhance and Expert Community Sharing Session
ALA Annual Conference
Friday, 2015 June 26
10:30 a.m.-12:00 p.m.
Moscone Convention Center, San Francisco


The ALA Annual 2015 edition of Breaking Through: What’s New and Next from OCLC and the compilation of News From OCLC were distributed.

  1. OCLC-MARC Update 2015
    During the third quarter of calendar year 2015, the 2015 OCLC-MARC Update will be installed. It will implement the MARC 21 Bibliographic, Authority, and Holdings format changes announced in MARC 21 Updates No. 19 (October 2014) and No. 20 (April 2015). OCLC will also make additions to WorldCat indexing, validate all MARC codes defined by LC since July 2014, and implement subfield $8 (Field Link and Sequence Number) in some 46 bibliographic fields. OCLC is harmonizing BFAS, WorldCat validation, and MARC 21 as to subfield validity and repeatability in both the MARC-defined and the OCLC-defined 1XX, 6XX, 7XX, and 8XX fields. Details are available in OCLC Technical Bulletin 265 at
  2. New interface for QuestionPoint
    The QuestionPoint reference management service now offers a new, more contemporary user experience, new interface, and redesigned user forms. Screens, menus and action buttons are easier to read, and screens display as well on smartphones as they do on desktop or laptop computers.
  3. Seventeen new webinars added to the WebJunction catalog
    The Nebraska Library Commission and the Washington State Library have collaborated with WebJunction to add 17 new webinars for library staff to WebJunction. There are now 143 webinar archives in WebJunction and 38 self-paced courses in the Alternative Basic Library Education (ABLE) Program. Learn more at
  4. Questions and answers
    Questions answered by Jay Weitz (Senior Consulting Database Specialist, WorldCat Quality) and Cynthia Whitacre (Manager, WorldCat Quality).
    What progress is being made with bibliographic duplicates?

    We work continually on improving the matching algorithms. We recently made a major fix to the algorithms for electronic resources, and it is making a big difference: we now find and eliminate many more duplicates among e-resources. We meet several times a week to discuss further tweaks and to check whether the changes we have made have improved the process and, if not, why not.
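    As a rough illustration of the kind of normalization such matching depends on, the sketch below builds a crude match key from a few descriptive fields. The field choices and normalization steps are our own assumptions for illustration only; OCLC's actual matching algorithms are far more sophisticated and are not public.

```python
import re
import unicodedata

def match_key(title: str, author: str, year: str) -> str:
    """Build a crude normalized key for grouping candidate duplicates.

    All field choices and normalization steps here are illustrative
    assumptions, not OCLC's actual matching algorithm.
    """
    def norm(s: str) -> str:
        # Strip diacritics, punctuation, and case differences.
        s = unicodedata.normalize("NFKD", s)
        s = "".join(c for c in s if not unicodedata.combining(c))
        return re.sub(r"[^a-z0-9 ]", "", s.lower()).strip()

    return "|".join((norm(title), norm(author), norm(year)))

# Records differing only in punctuation and diacritics collapse to one key,
# so they become candidates for duplicate review.
k1 = match_key("Les Misérables.", "Hugo, Victor,", "1987")
k2 = match_key("Les Miserables", "Hugo, Victor", "1987")
assert k1 == k2
```

    In practice such keys only gather candidates; deciding whether two candidates truly describe the same resource requires many more comparisons.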

    Why are there often multiple vendor records for the same resource?

    Many Encoding Level 3 records from vendors are so sparse that they are difficult to match. We do have special matching routines for sparse records that try to account for the risks of matching on too little information. OCLC can also use macros to match Encoding Level 3 records using ISBNs. We can give special matching attention to batchloaded records from library symbols with notorious reputations. If you see library symbols with consistently sparse or bad data, you may report them to or
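    As one concrete example of what ISBN-based matching requires, the 10- and 13-digit forms of the same ISBN must first be reduced to a single comparable value. The sketch below shows the standard ISBN-10-to-ISBN-13 conversion; it illustrates the general technique and is not OCLC's actual routine.

```python
def normalize_isbn(isbn: str) -> str:
    """Normalize an ISBN to its 13-digit form so that records carrying
    the 10- and 13-digit forms of the same ISBN can be compared.

    A simplified illustration; OCLC's sparse-record matching routines
    are not public.
    """
    digits = isbn.replace("-", "").replace(" ", "").upper()
    if len(digits) == 13:
        return digits
    if len(digits) != 10:
        raise ValueError(f"not an ISBN: {isbn!r}")
    core = "978" + digits[:9]          # drop the old ISBN-10 check digit
    # ISBN-13 check digit: weights alternate 1, 3 across the 12 core digits.
    total = sum(int(d) * (1 if i % 2 == 0 else 3) for i, d in enumerate(core))
    check = (10 - total % 10) % 10
    return core + str(check)

# The same ISBN in both forms yields one comparable value.
assert normalize_isbn("0-306-40615-2") == normalize_isbn("978-0-306-40615-7")
```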

    Can we create a new record when the one already in WorldCat is really bad?

    We urge you not to create duplicates on purpose (or by accident, for that matter). Rather than create a duplicate, catalogers should use a macro to wipe out the bad content and redo it properly. That is exactly what Enhance and the Expert Community are for. The macro can be found in the Connexion client under Tools/Macros/Manage/OCLC/ClearELvl3Workform; it “Clears candidate fields from an Encoding Level 3 bibliographic record and replaces them with workform prompts.” You are also encouraged to convert the record to RDA.

    What has happened with the Merge Pilot?

    Two years ago OCLC created a pilot project in which catalogers at four institutions were trained to merge duplicate WorldCat records. Merging of records had previously been done only by OCLC staff and OCLC processes. Although the participants learned to do this very well, the process was much more labor-intensive than expected, both for the OCLC trainers and for the institutional trainees. One institution is still extremely active merging records, but in evaluating the pilot project, OCLC wonders whether the results have been worth the effort. Adolfo Tarango (University of California, San Diego) made a plea from the floor that the project not be abandoned but rather expanded, and asked that many more institutions get involved in this work. In his opinion it is well worth doing: although it may cost us much time and effort now, it will save us much time and effort later on.

    Can there be an easier way to report duplicates?

    In the Connexion client, you can report errors and duplicates directly from the bibliographic record. Use the Action menu and go to Report Error. Just tell us that OCLC Number X is a duplicate record and hit “Report Error.” It’s as simple as that. You could make it even simpler by adding the “ActionReportError” button to your toolbar using Tools/Toolbar Editor.

    Is there a chart for deciding Encoding Levels?  I need a quick guide to what I can change and what I cannot.

    The definitions of the Encoding Levels are in Bibliographic Formats and Standards at There are other details in Chapter 2.4 on “Full, Core, Minimal, and Abbreviated-Level Cataloging” (, including a chart comparing the standards for the various levels of cataloging. In BFAS Chapter 5.3 ( is a chart of which fields may be added and/or changed under Database Enrichment (and hence, the Expert Community), prefaced with the explanation of which records you may replace using a Full-level authorization, including many PCC records.

    Is the discussion of incentive credits truly dead?

    Yes, the issue of transactional credits is settled with the move to flat-rate credits. But interestingly, from Fiscal Year 2014 to Fiscal Year 2015, the number of bibliographic records replaced by OCLC member institutions actually rose, in the absence of transactional credits, from slightly over a million replaces to just short of 1.2 million replaces. This includes all member replaces under the Expert Community, Database Enrichment, Minimal-Level Upgrade, Enhance, and CONSER. This is a heartening affirmation of the cooperative spirit of OCLC that has built WorldCat into the unique resource it has been for decades.

    The number of edits at my institution has increased dramatically since the flat rate was established.  Will that flat rate ever be re-evaluated?

    At this time, there are no plans to re-evaluate the flat-rate credits.

    Institution Records (IRs) are very important to our institution and the Local Bibliographic Data (LBD) option doesn’t meet our needs. What recourse do we have?

    IRs are definitely going away; that decision has been made.  In the near-decade since the merger of the Research Libraries Group (RLG) and OCLC, which precipitated OCLC’s introduction of IRs to accommodate certain practices from the RLG Union Catalog, the electronic catalogs of many institutions have been made available online. That means that many more IR-equivalent records are freely available for examination on the Web than was the case back then. LBDs are in their relative infancy in terms of WorldCat, and if they don’t currently do what you need them to do, you are encouraged to let OCLC know how those LBD capabilities can be expanded and made more visible. Such requests from OCLC members could have a real impact. Please send your comments and suggestions to

    How can OCLC increase search engine optimization?

    The transition to Resource Description and Access (RDA), the development of the Bibliographic Framework Initiative (BIBFRAME), the availability of WorldCat data as Linked Data via, and the support of the Virtual International Authority File (VIAF) are just a few of the things OCLC is involved in that make bibliographic and authority data more visible on the Web. Visit the OCLC “Data Strategy and Linked Data” page ( to learn more.

    Do you check whether records from institutional loads are in the proper format?

    Batchloaded records go through extensive preprocessing that attempts to clean them up and correct as many problems as can be identified and fixed. There are certain categories of records on incorrect bibliographic formats that we are able to fix, but obviously, not all such errors can be caught. If you come across records on the wrong format, please report them to if you are not able to correct them yourself.  BFAS 5.1 “Type and BLvl changes” ( outlines which Type and Bibliographic Level changes you should be able to do with your specific authorization level.

    If there is only a brief vendor record in English, but a full foreign language record, how can I copy the foreign language record?

    You should be able to use that ClearELvl3Workform macro to delete the bad data on the vendor record. There are user-created macros available from the Connexion Client macros page that may help you copy and paste data from a better non-English Language of Cataloging record into the vendor record.

    What can be done to prevent the records for textbooks which we catalog from being merged with others in OCLC which are not the same?

    If the textbook itself does not include an explicit edition statement that would distinguish it from a similar but different textbook, an effective option is to apply RDA (or AACR2 1.2B4, the parallel instructions in the other chapters in Part 1, and the associated LCRIs) and supply a bracketed edition statement suitable to the situation.

    Do you know the percentage of international records in WorldCat as compared to U.S. records?

    As of January 2015, about 61.3% of the bibliographic records in WorldCat represented non-English-language materials, with about 38.7% English-language materials.  Regarding the Language of Cataloging (field 040 subfield $b), 50.7% of bibliographic records in WorldCat are cataloged in English and 49.3% in languages of cataloging other than English. Non-Latin scripts are represented on about 9.75% (roughly 33.1 million) of the bibliographic records in WorldCat as of June 2015. As far as I’m aware there is no easy way to differentiate records created by institutions in the United States from those created by institutions outside of the United States, if that was the literal intent of the question.

    Foreign records not done according to Anglo-American rules use their own standards and analyze things differently. Does this make de-duplication harder?

    Yes, it does, but remember that records for the same resource cataloged in different languages (specified in field 040 subfield $b) are considered parallel records (see BFAS 3.10) rather than duplicates of records cataloged in English. Only records cataloged in the same language can be considered duplicates. Stephen Hearn (University of Minnesota) noted that he corrects tagging errors in UK and other records and asked if this helps with de-duplication; yes, it may.

    Are you working on adding the subfield $0 to indicate that a heading is controlled to the LC NAF?

    Currently, bibliographic record access points that are controlled to the LC/NACO Authority File display as hot links that take you to the specific authority record when clicked on in the Connexion client. Part of what is actually going on behind the scenes in Connexion is the presence of the subfield $0 with that authority record identifier. The goal at some still-undefined point in the future is for the subfield $0 for the LC/NACO Authority File to be implemented in Record Manager. As other authority files are added to Record Manager, they will use the subfield $0 implementation. This is currently true for the Dutch Names Authority File (NTA Personal Names). When working on a MARC 21 record in Record Manager, you can apply a heading from the NTA Personal Names file. This feature is available only for records that have “dut” in the 040 subfield $b. After searching for a Dutch authority record, you can copy the link data from the Authority field 100 and insert it into Bibliographic fields 100 and/or 700. You may find additional information on these functions in Record Manager Help at

    When will General Material Designations (GMDs) be removed from bibliographic records?

    As stated in the “OCLC RDA Policy Statement,” the plan is to begin removing GMDs in field 245 subfield $h after March 2016, which is three years after RDA “Day One.” We have no idea how long it will take to accomplish this. As you may have noticed, OCLC WorldCat Quality staff have been using macros to go through WorldCat to make numerous RDA-related and other changes to bibliographic records in the Books, Scores, and Cartographic Materials formats, including the addition of 33X fields, to the extent that we can safely do so. Some of the other changes we’re making are noted in the “OCLC RDA Policy Statement.” In the process, we are also fixing up other things, such as controlling headings when possible. As part of the process of removing the GMDs, we will also begin adding the appropriate 33X (and other RDA-related) fields to those records.
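    To illustrate the kind of edit involved, the sketch below strips a subfield $h GMD from a 245 field rendered as a flat string with $-prefixed subfields. This is a simplified stand-in for the macros described above, not OCLC's actual process; real MARC records should be edited with a MARC-aware tool rather than a regular expression.

```python
import re

def strip_gmd(field_245: str) -> str:
    """Remove the General Material Designation (subfield $h) from a
    245 field rendered as a flat string with $-prefixed subfields.

    A simplified sketch of the kind of edit described above, not the
    actual macro OCLC uses.
    """
    # Drop "$h[...]" while keeping any ISBD punctuation that follows it.
    return re.sub(r"\s*\$h\[[^\]]*\]", "", field_245)

before = "245 00 $aCitizen Kane$h[videorecording] /$cMercury Productions."
after = strip_gmd(before)
assert after == "245 00 $aCitizen Kane /$cMercury Productions."
```

    A production version would also need to handle the RDA side of the change, adding the corresponding 33X fields, which this sketch does not attempt.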

    At our library, we currently do good foreign language cataloging, including non-Roman records, but all the catalogers working on these records are reaching retirement age. Who will do that work in the future? Where are the new specialized catalogers coming from?

    That’s a dilemma facing almost every institution, including OCLC, where we’ve seen the retirements of so many vital coworkers and the resulting loss of so much institutional memory. When we look around this room and in other cataloging sessions here at ALA, however, we can’t help but be encouraged by the many new faces we see among the familiar ones. That has to be encouraging. Adolfo Tarango also reminded us that we need to make sure that administrators fully understand the value of good quality metadata and that we have to educate those coming into the profession about the continuing importance of clean data, authority control, and the rest of our traditional concerns.

  5. Some final comments
    Each of us on OCLC’s data quality team continues to work hard on your behalf to address concerns about duplicate records. We spend a substantial portion of every working day dealing with duplicates in one way or another, both directly through manual merges and indirectly through work to improve matching algorithms. Additionally, several of us meet at least two or three times each week with developers and project managers to review, correct and test solutions to incorrect matches detected by our duplicate detection software and reported by members. Please continue to report duplicates and records that may have been merged incorrectly to Comments about bibliographic duplicates can continue to be sent to, and will be shared with others, as appropriate.

Respectfully submitted by
Doris Seely
University of Minnesota
2015 July 6

With edits by Becky Dean, Janet Hawk, Sandi Jones, Marty Loveless, Cynthia Whitacre, and Jay Weitz.

2015 August 12