
2021 AskQC office hour member Q&A

Review all AskQC office hour member questions from 2021.

January 2021: More on 5xx fields

January 2, 2021

Some bibliographic records have two or three large 505 fields. What determines whether a 505 field is split or repeated? If a record has, e.g., two split 505 fields, are the contents divided evenly between them?

It really depends on the vintage of the particular record, or of the 505 field if it was added later. A long time ago, there were practical limits to the number of characters that could be in a particular field. Those limits really don't exist anymore, or they're high enough that they're almost never reached. So if the record is old enough, it's very possible that if you tried to input a record manually with a huge 505 field, the system would hiccup and you'd have to split the 505 into multiple fields so that each individual field was under the limit. Again, that's not really the case anymore. Nowadays, it's more common that a cataloger will split a 505 for logical or bibliographical reasons: the contents of different volumes or different parts of a multipart resource, and that kind of thing. So it really depends on the context of the record and the resource you're cataloging whether you want to split a 505 field or whether it's necessary. Sometimes it's easier to make the 505 field legible and understandable if you split it up, but it's not something nowadays that usually gets done automatically.

Re: 505 field. Isn't there a limit on number of characters in a field?

Not really, any longer, as I mentioned before. There used to be limits to the length of individual fields. The limits on both fields and records themselves are now so high that for most practical purposes they are nonexistent. I don't think there's a practical limit on the number of characters in a 505 field or any other field.

Since the subfields in an enhanced 505 aren't intended to be indexed as phrases, what's the logic behind putting every single title in a 505 that consists only of titles in its own separate subfield $t? Wouldn't using a single subfield $t for all those titles suffice?

I guess logically that would be the case, but MARC21 calls for individual titles in a 505 to be separately subfielded in subfield $t if you are going to use the enhanced 505 practice. But that's just what MARC21 says and we carry that over into Bibliographic Formats and Standards.
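As a quick illustration of the difference, here is a minimal Python sketch (mine, not OCLC tooling) that formats a list of titles either as one unsubfielded contents string or with each title separately subfielded, using "ǂt" as a display stand-in for the subfield delimiter, matching the convention used elsewhere on this page. The sample titles are invented.

```python
# Minimal sketch: a contents note formatted two ways.
# "ǂt" stands in for the subfield-t delimiter; titles are invented examples.

def basic_505(titles):
    """One run of text, as in a basic 505 with a single subfield."""
    return " -- ".join(titles)

def enhanced_505(titles):
    """Each title separately subfielded in ǂt, as MARC 21 prescribes
    for the enhanced contents note practice."""
    return " -- ".join("ǂt " + t for t in titles)
```

So `enhanced_505(["The lottery", "Charles"])` yields "ǂt The lottery -- ǂt Charles", where the basic form would simply read "The lottery -- Charles".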

Is there a recommended way to indicate a sequence of 505 fields?

If there is a bibliographically logical way to identify each of the 505 fields, such as a volume number or volume title, that would be the way to go if you're going to have multiple 505s.

Are the indexed 5XX fields in a field-specific index or in indexes which merge content from multiple fields?

Most of the 5xx fields that we've mentioned today that are indexed are in both the notes index, which is "nt:", and the keyword index, "kw:". Some of the fields we've mentioned may also be in other indexes, in many cases together with fields that aren't 5xxs or are from other ranges in the MARC format. We didn't go into much detail about which specific indexes the fields are in because the presentation was already more than half an hour long and we try to keep it to a manageable length; it would have been even longer if we had gone into that kind of detail. But those details are in Searching WorldCat Indexes.

Does the presence of any of these 5XX fields block the dedup process?

Not per se. The deduplication process is, as we have talked about many times, really complicated. There are some 5xx field elements that are brought into consideration in certain comparisons within the deduplication process. So it's quite possible that the information in a 5xx field can block a deduplication, or prove to the DDR process that the records in question are not duplicates, or are questionable enough not to merge. So yes, you could say that the presence of certain information in some 5xx fields can prevent a DDR transaction.

Do 5XX fields that 'do not transfer as part of the deduplication process' transfer if they are stored in an LBD?

The LBDs themselves do transfer. The information in an LBD does not transfer into the bibliographic record, but the LBDs that are associated with a record remain associated with the retained record when records are merged.

Because the 521 Target audience note has a repeatable subfield $a, do you recommend when cataloging videos (though not print materials) that additional statements be added in the repeatable subfield $a rather than in a separate 521 field? For example: 521 8_ MPAA rating: R; for strong violence, some drug use, and language. ǂa Canadian Home Video rating: 14A; brutal violence, coarse language.

This is really a matter of cataloger's judgment in many cases. If you would be creating 521 fields with different first indicators--say, reading grade level, interest age level, interest grade level, and so on--then obviously you would want separate 521 fields. If the information is of a similar type and would take the same first indicator, even a blank or an 8, it could go either way. Generally, I would use separate 521s where there are different rating systems: the MPAA rating in one 521, the Canadian Home Video rating system in a separate 521 field. But it's really up to you as the cataloger.

What is the order of these notes in a bib record?

Well, in the days of AACR2, the order of notes was supposed to be the order of the instructions in AACR2, more or less, although there was a provision that a cataloger could choose to make a particular note the first note if it deserved some kind of prominence. In the era of RDA, however, there is really no prescribed order of notes. So it's really up to you. A lot of catalogers, especially those of us of a certain age, have continued to use the AACR2-ish order of notes, but strictly speaking, there is no order. And it's my understanding that some local systems actually rearrange the notes into some kind of order, especially numerical order, but there is no longer any prescribed order for notes under RDA.

Can the reformat option put the 5xx notes in the expected order?

Reformat does not do that, especially considering that there is no longer an "expected order" for notes.

Records that are added to WorldCat tend to keep all their 5xx fields, even 561. But when other records are matched to existing records, they lose their own 5xx fields (those that do not transfer). Is there a way for such 5xx fields to be preserved in the LBDs, please?

From the response to a previous question, it sounds like LBD information is not lost in a merge transaction. It's just that information in the bibliographic record itself, based on whatever circumstances occur with the deduplication process, may transfer or may not.

It sounds like what you're asking is: when your incoming record matches an existing record, and there are 5xx fields that do not transfer during that matching process, you want those 5xx fields preserved in an LBD even though they weren't there originally. I don't think any of us on this call have the expertise to answer that.

Re: 585 Exhibitions note: are there 'impact' criteria regarding the exhibition? Does it need to have been (inter)national, to have generated a publication, etc.? For instance, would an in-house exhibition justify such a note in the materials that had been used in that exhibition?

With WorldCat records, I tend to think you want notes that will be of interest to other catalogers who have that resource. If it's local, I'm thinking it belongs in a local note, like a 590. An exhibition with national or international impact makes more sense for a 585 field in a WorldCat record, because it will be of interest beyond just your local institution.

Just remember that you're cataloging a bibliographic resource. If the resource itself indicates that materials in it have been used as part of an exhibition, or are somehow related to an exhibition, that mention in the resource would be enough justification for the 585 note, regardless of the "impact"--national, international, or local.

Are there any differences in the distribution of use of specific 5XX fields across OCLC's different language-of-cataloging communities?

That would take some research. It would not surprise me if there were differences among language-of-cataloging communities. There are certainly fields that get heavier use not just in particular language-of-cataloging communities but also in particular descriptive-cataloging-standards communities. But it would take extensive research to answer that definitively, I think.

I would like to see 510 be indexed, particularly for rare materials; in some cases, searching for "ESTC" in the 510 is the only way to know how many items from the ESTC database a library holds.

The 510 field is at, or very near, the top of our list of fields to be indexed, so that is definitely under consideration for the future.

Re: slide 63: the slide said it is indexed, but we were told it is not.

My apologies if I misspoke. It is indexed and my notes indicated that.

January 21, 2021

Is there a character limit to the 505 field? Does it generate a continuation 505 field if the character limit is exceeded? Also, the 520 field?

There's no practical limit to the number of characters that may be in any field--505, 520, any field. There used to be, but those limits have more or less been eliminated. Previous OCLC systems would sometimes break up a 505 or 520 field that was too long into multiple fields, but as far as I'm aware, the current implementations of WorldCat do not do that.

One of our colleagues mentioned a character limit of 9999 to be able to comply with MARC (and export the record).
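For context, that 9999 figure comes from the MARC 21 record structure itself: each directory entry stores a field's length in four digits, so a single field cannot exceed 9999 bytes, and the five-digit record length in the leader caps a whole record at 99999 bytes. A small Python sketch of the arithmetic--for illustration only, not any official validator:

```python
# Sketch of the MARC 21 structural limits.
# A directory entry's field length is 4 digits -> max 9999 bytes per field;
# the leader's record length is 5 digits -> max 99999 bytes per record.
MAX_FIELD_BYTES = 9_999
MAX_RECORD_BYTES = 99_999

def field_fits(note: str, encoding: str = "utf-8") -> bool:
    """True if the note, once encoded, fits within a single MARC field."""
    return len(note.encode(encoding)) <= MAX_FIELD_BYTES
```

A 505 that fails this check is the kind that would historically have been split into continuation fields on export.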

Recently we discovered over 80 OCLC records in our database for audiobooks that had a 506 note for access restricted to disabled patrons; this field was not applicable to any of these records. Could they have come from deduplication? The current OCLC records no longer have these fields.

It's possible that the 506s could have transferred. If you have examples--OCLC number examples--we could take a look at the journal history to see when the 506s were added and possibly where they came from.

How is field 520 indexed?

520, if I remember correctly, is in the notes index and the keyword index--nt: for note and kw: for keyword. It's indexed word by word, or as phrases, however you want to search. Most field pages in BFAS have a link to Searching WorldCat Indexes for that field; if you click on that, it will give you the details of how the field is indexed and in which indexes it appears.

Most of these fields are in those two indexes: nt: for note and kw: for keyword.

Is it okay to add two 520 fields one for English and one for the non-English language for works that are not in English?

Strictly speaking, if a resource is being cataloged in English--that is, the 040 $b says "eng"--the descriptive information and the notes that you add to the record are supposed to be in English. That's how things are supposed to work.

If a book is bilingual, could we add a 520 field or appropriate notes in both languages?

If the book is in parallel text, could you add 520 notes in both languages?

If the resource is being cataloged in English, that is, the 040 $b is coded as English, the notes and things related to the notes would best be in English. That's the general rule.

You may have a local policy of including notes or summaries in a different language, or in the language of the item, but it's not standard practice. Another thing I have sometimes seen: if the summary is a quoted note, quoted from a source that is not in English, you may have it in that language, in quotes, within an English-language-of-cataloging record. So that's possible.

We want it discoverable in both languages. We are trying to make resources available to patrons who do not speak English.

All of the notes should follow the language of cataloging. Even though you might not be able to do that to the WorldCat record, you can definitely do that in your local practices in your catalog to serve your patrons.

Would it be possible to get information in BFAS on whether fields can be added or edited in PCC records? Or maybe it's already in there and I'm just not seeing it?

It is already there; it's in chapter 5.

How does OCLC decide whether a 5xx field is indexed or not? For instance, 563, 'Binding information,' is indexed.

That is a really great question, and unfortunately I do not know how it gets decided whether a field is indexed or not. I know there are some fields that members wish were indexed but aren't.

A lot of indexing decisions were made a long time ago; there used to be a group of users who would consult with people at OCLC about indexing and display issues. We still get recommendations nowadays from groups such as the Online Audiovisual Catalogers (OLAC) and the Music OCLC Users Group (MOUG) about indexing and related matters. When new fields or subfields are added as part of the OCLC-MARC update, we make decisions about indexing based on what we know about the field or subfield and how we expect it to be used--how it fits into the rest of the field and how useful it would be to have it indexed, or whether it's something that isn't worth indexing. If you have requests for things that aren't indexed that you would like to see indexed, you can send them to us and we will add them to the wish list we keep of things that could be indexed in the future.

What email address should we send indexing requests to? 

They can just send those to the AskQC email address.

Do we need to cite where we get the summary for 520 fields? For example, "summary is from Amazon," do I need to put a subfield c? 

I think current practice is, if you're quoting from a source other than the resource itself (and possibly even depending on where it comes from within the resource), to cite that source. The citation goes in subfield $c. So yes, if you cite the source, put it in subfield $c.

Are there characters that should not be put into a 505 or 520? Sometimes we want to copy and paste...

When you copy and paste into a bibliographic record, that can sometimes cause problems, because not all sources from which you might copy conform to the character rules for bibliographic records. There are ways around that.

The characters not to include are covered in chapter 2 of Bibliographic Formats and Standards--things like the vertical bar and smart characters. Off the top of my head, certain resources like math books have a lot of symbols, like pi, that can also cause issues when copied and pasted into a record. There's also the option of pasting into records as plain text rather than doing a straight Ctrl+C and Ctrl+V, so that can sometimes help as well.
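A hedged sketch of that cleanup step in Python. The substitution table below is an illustrative guess at the kinds of characters BFAS chapter 2 warns about--smart quotes and the vertical bar--not OCLC's official list:

```python
# Sketch: scrubbing problem characters from pasted text before adding it
# to a record. The substitution table is illustrative, not an official list.
SUBSTITUTIONS = {
    "\u201c": '"', "\u201d": '"',   # smart double quotes -> straight
    "\u2018": "'", "\u2019": "'",   # smart single quotes -> straight
    "|": "",                        # vertical bar: not usable, drop it
}

def scrub(text: str) -> str:
    """Replace or remove characters that tend to fail validation."""
    for bad, good in SUBSTITUTIONS.items():
        text = text.replace(bad, good)
    return text
```

Pasting as plain text, as mentioned above, avoids a different class of problems (hidden formatting); a character scrub like this targets the characters themselves.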

When you validate a record, doesn't that tell you if there are characters that shouldn't be there?

Yes, usually when you validate, it will say something like "bad character" or "invalid character" if there is something there that is not valid. That happens a lot less often now that we validate Unicode within WorldCat than it used to, but it does occasionally happen, and it is those characters that are documented as not being usable.

Sometimes those validation error messages are kind of cryptic though, so sometimes they aren't as useful as they could be.

Sometimes validation doesn't report everything that it should; that can happen a lot with authority records and illegal-character issues, so keep that in mind as well.

I was taught that the 505 should not include things like "Introduction", "Bibliography" or "Index". Is that still correct? Also, not to put in "Chapter 1" before the title of Chapter 1.

Generally, yes, it is still good practice not to include things like that. There will occasionally be cases where you will want to make an exception, but generally those are good practices.

We still see a lot of records in WorldCat that have those words in them, like "Introduction" or "Index" or "Bibliography" that you probably wouldn't normally include if you were putting in your own contents note. Those are often there in eBook records that are machine-generated, so an automated process is generating those contents notes rather than a human being and that's often why you see them.

I sometimes see page numbers included in the 505 field. Should we delete those?

I don't necessarily know; I don't recall seeing page numbers. I suppose there could be circumstances where it would be appropriate to include them, but I can't think of any offhand. If you're able to edit that 505 and you're so inclined, you can probably take them out.

Is the rule about not putting in index and bibliography different for children's books?

No, it's not different. Bibliography and index notes go into a 504 field, where it says "Includes bibliographical references and index," or, if it's just an index, usually into a 500 field. That's the main reason, I think, that they are not put into a contents note in the 505 field: they are recorded in their own distinct fields.

There used to be a Library of Congress project which added 505s or 520s (machine addition) to bibliographic records for monographs. Is that still being done?

I suspect we're all being silent because we don't know the answer. I don't know the answer.

If a paperback is issued at a later time (i.e., in 2020) with a unique ISBN, but it is actually a reproduction of the 2018 book described by a DLC record, should or could we add the 020 to that DLC record rather than create a new record for the 2020 paperback?

The answer is "it depends." If indeed this is just another printing of the 2018 book and it just happens to be in paperback and it's the same size, it has the same pagination, et cetera, you're welcome to add that ISBN to the existing record and use that record. If, however, there is a difference with the paperback issuance, meaning that it has a different size or a different pagination or perhaps a different edition statement, maybe even some new foreword or something, then you would want to create a new record.

You can also find more information in Bibliographic Formats and Standards, in the chapter on when to input a new record. That will give you an idea of whether your situation warrants a new record or not.

We have noticed some records with many sets of ISBNs. Are they all valid for that one record? Does someone verify them?

I would say it depends as well. They could have been transferred through merges in the past: if everything else matched and there was a difference in ISBNs, those would transfer over. Sometimes the ISBNs are really for another version of the resource, like the large print or the electronic version, and they're not coded correctly to indicate whether they're valid for the particular description in the record. Someone could have put those ISBNs in after verifying them, or they could have come from a transfer transaction, which is not necessarily done by a person. You would really have to look at the record to verify. So yes, it depends.

Should we be suppressing invalid ones even if the record is DLC?

I would say that if you can confirm an ISBN is really for the large print and not for the hardback, then it would be appropriate to code it as invalid, in subfield $z. There's nothing wrong with having ISBNs from other formats of the same title in the record--the large print and the electronic, say--as long as you indicate they're invalid because they represent another version of the resource.

Under both AACR2 and RDA, you're allowed to include in a bibliographic record all of the ISBNs or other standard numbers that appear in the resource, whether they apply to that resource or not. Those that do not apply to the resource being cataloged properly go in subfield $z rather than subfield $a. Only those that apply to the actual resource being cataloged belong in subfield $a of the 020, for instance.
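One mechanical part of "verifying" an ISBN can be automated: the ISBN-13 check digit. This sketch (mine, not OCLC's) catches transcription errors, though it cannot tell which manifestation a number belongs to--deciding between subfield $a and subfield $z still takes cataloger judgment:

```python
def isbn13_valid(isbn: str) -> bool:
    """Check the ISBN-13 check digit: digits are weighted 1,3,1,3,...
    and the total must be divisible by 10. Hyphens and spaces are ignored."""
    digits = [int(c) for c in isbn if c.isdigit()]
    if len(digits) != 13:
        return False
    return sum(d * (3 if i % 2 else 1) for i, d in enumerate(digits)) % 10 == 0
```

For example, isbn13_valid("978-0-306-40615-7") returns True, while a number with a mistyped final digit fails the check.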

Why is there no punctuation at the end of the 586 field?

It's not just the 586 now; it's all of them, with punctuation now being optional--and there was a whole Virtual AskQC Office Hour on this--so a lot of these notes will not end in any punctuation. The Program for Cooperative Cataloging (PCC) has issued guidelines about punctuation, and you now have a choice of including ISBD punctuation or not. You'd have to go to the PCC website to find the instructions on what the options are, but generally speaking, unless a 5xx field ends in an abbreviation or some other term that would naturally have a period after it, the final punctuation is now generally left off.

February 2021: 7xx linking fields

February 9, 2021

Can you explain a bit more about what it means when fields do not transfer when merged?

What that means is that there are different ways a record can be identified as a duplicate and merged into a better record. This can happen through our duplicate detection and resolution (DDR) software, or it can happen when an institution reports duplicate records and they are manually reviewed and then merged. When that transaction takes place, there is a hierarchical table that lists the different fields, indicators, and subfields in the record, and the decision whether a given field transfers or not is made based on those criteria. So certain fields automatically transfer during a merge transaction and some do not. I hope that answers the question.

Is the 773 field also used for offprints?

No, it is not. Bibliographic Formats and Standards (BFAS), chapter 3, section 3.2.2 covers offprints and detached copies; you do not use 'In' analytic cataloging conventions for offprints or detached copies.

Is field 777 the best choice for bound-with situations?

No, field 777 shouldn't be used for bound-with situations; those are better recorded in field 501. Field 777 covers items that, even though separate, were actually issued or published together.

Also, regarding bound-withs: when some titles are bound together at one branch but separate in another, what is the best way to indicate that?

That sounds like local practice. The items weren't issued or bound together at publication; it was done after the fact. So you probably wouldn't want to indicate that in the shared WorldCat bibliographic record; it would be considered local information.

Just to add, I agree that's purely local information that maybe would best be handled with notes rather than putting anything into the record in WorldCat.

Can field 777 be used for multivolume training materials, for instance? E.g. Module 1, Module 2, etc.?

I do not know enough about that field to say for certain. I would think that if the modules were all issued at the same time, somehow issued together, then it would be appropriate to use a 777.

It sounds like it is possible. I have never seen a record where that kind of thing has happened, at least that I remember, and we see so many different scenarios played out in records. What you do see often in that case is somebody deciding to catalog the set itself and then, if the individual volumes are scattered around the collection under different classifications, making separate records for the individual parts. That is potentially a situation that could be handled with a 773 field, to link the individual parts up to the parent record for the set, but not so much a 777.

Would you use a 786 field with items that were listed in a bibliography?

I am not sure about that. It seems that the field is used when there is data in one item that is being used to create a different item. I don't know.

I think that you're basically correct. It's not the kind of thing you would typically use to cite sources that appeared in a bibliography in some item, to say this is where the information came from. It's not as if the 786 cannot be repeated--it can--but the typical situation envisioned when this field was added to the format was that the resource you're cataloging was derived from the cited data source. You don't see that many 786 fields around.

Does the "Insert from cited record" function pull in data appropriate to the particular linking tag -- for example, the edition statements for linked editions?

That feature does pull in information from a cited record. I'm not sure how targeted that data is for each specific tag. I do know you need to have a correct linking tag and correct indicators in order for the feature to work. If you were creating a linking field and wanted the edition information present, I would recommend double-checking the linking field to make sure it was pulled over, or you could add it manually. I'm not sure whether the feature was built to work generically on most fields or whether there was any tag-specific targeting, but when I use this feature, I always double-check that everything was pulled in and that nothing has to be deleted.

The way that "insert from cited record" works is that for whatever record that you key in the number and pull in the data, it is exactly the same format in all of the fields from 760 to 787. The difference in the way something is cited is based on whether it is a serial or a monograph. It will pull in different information from the other record depending on what it is. You could look at what we do for monographs and say well, that isn't necessarily current for RDA, or it has more information than what you might put in a link in an RDA record, but that's based on "insert from cited record" having been implemented in the system back in the days of AACR2, where you could have monographic titles that were in conflict, so the only way you could identify that would be to include edition statements, the place, publisher, date, so that's why all of that is there when you're citing a monograph and it's not there when you're citing a serial, because serials had unique titles.

What would be an example of an edition in a vertical relationship?

The only example I can think of would be an edition statement that reflects the coverage of an item: if you had something that was, say, the United States edition, and another publication that was specifically the Ohio edition of that publication.

For 7xx fields, is $w required or optional?

In the input standards, that is a required subfield. It's of course really helpful in making linking fields actually link if there's an identifying number that can be used to navigate to that other record.

For these 7xx fields where they cannot be added to and/or edited in PCC records, are PCC participants able to add or edit these 7xx fields?

Yes. The statements in the slides that a field cannot be added or edited in PCC records apply only if you are not a PCC participant. PCC participants can edit those fields in PCC records.

For editions of integrating resources (especially legal resources), is it better to use the 780/785 fields or 775 field?

I don’t know that one way is better than the other. For legal loose-leaf publications, I have seen both 780/785 fields used to link between the different editions and 775 fields with subfield $i linking the editions. Normally you think of 780/785 being used to link between the different iterations of serial titles, those fields show the before/after. With loose-leaf IRs, the editions are a form of before/after so that is a valid reason to use 780/785. On the other hand, they are “editions” and that is what field 775 is used for. I think this is ultimately a matter of cataloger’s judgment.

Is there a movement to add multiple 780/785 fields to reflect multiple title changes within a serial's history?

The purpose of these fields was to identify the immediate successor and predecessor, so that you could go to Title A, see that it changed to Title B, go to Title B and see the link back to Title A as well as the link forward to Title C, and follow the progression that way. Even though complex notes are allowed in serial records, they are meant more for complexities involving the immediate title change; the "absorbed by" example that we covered was a complex note about an immediate title change. These title histories can get very unwieldy, and what counts as a major versus a minor title change has altered over time depending on the cataloging guidelines used, so there would be quite a bit more complexity if we included all of the title changes preceding a particular serial in these linking fields.

For electronic resource serial records, should the 780 and 785 fields reflect the electronic resource records? Or can they reflect the print version records? Or should both be included?

There are situations when you could point from a print version to an electronic version as a later title. However, in a normal title change situation where you have a print and an electronic version of Title A, and a print and an electronic version of Title B, the 780 and 785 fields in the print records would only point back and forth between the print titles, and likewise the 780 and 785 in the electronic records would only point back and forth between the electronic records for Title A and Title B. Then Title A in print would be linked to Title A electronic with the 776 field that Robin covered, and Title B would likewise be linked between print and electronic using a 776 field, so you would have multiple 7xx fields in this situation. There is also the situation where the print ends and is continued by an electronic version with a title change; in this case, because it is a different-version record, you would use the 776 field to link to that later title instead of the 785 field. You would note that in the relationship, so the subfield $i would be "Continues online," and the online record would point back to the print using the dates the print spanned.
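The pattern described above can be summarized in a small lookup table. This is just a restatement of the answer; labels like "print A" are invented shorthand, not MARC terminology:

```python
# Which linking field connects which pair of serial records, per the
# answer above. "A" and "B" are successive titles; "print"/"online"
# are the two physical forms.
LINK_FIELD = {
    ("print A", "print B"): "780/785",    # title change, same format
    ("online A", "online B"): "780/785",  # title change, same format
    ("print A", "online A"): "776",       # same title, other physical form
    ("print B", "online B"): "776",       # same title, other physical form
    ("print A", "online B"): "776",       # print ends, continued online
}
```

The one surprise in the table is the last row: a title change that coincides with a format change still takes 776, not 785, because the target is a different-version record.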

A number of slides state that some fields are required if applicable for full-level cataloging. However, the PCC BIBCO Standard Record doesn't consider most 7XX fields as required. Instead they are optional. So why are these fields "required if applicable"?

My guess is these are required if applicable when inputting full-level cataloging in WorldCat, for purposes of WorldCat cataloging, and they do in general follow the guidelines put forth by the PCC. The idea is that you would want a full-level record to include as many pieces of information as possible, especially with title changes within the WorldCat database, so that it's easier to move between the different titles and see the relationship.

In some cases, we may have looked at a field in the past and said "okay, there are standards outside of WorldCat that treat them differently, and we may want to go above and beyond in requiring something," but I think in other cases, some of these input standards have been around for many years and they just haven't been reevaluated in light of changes in cataloging. So I think that it would be useful if people sent any fields they were concerned about to and we could reconsider the input standards, or at least discuss them and say we decided to be different from the BIBCO Standard Record for a particular reason. It might be that it's required if applicable just because that's the way it was implemented 30 years ago.

Are all the 7xx that are required in a full record needed only if the record is RDA, or are they equally necessary when the record is AACR2?

The input standards are required no matter what the cataloging rules used are. So basically, it means if it's RDA or AACR2, you would still follow the input standards that are spelled out in Bibliographic Formats and Standards.

What is the best linking field to use if you have an antique map that has been extracted from an atlas and you want to link the map to the atlas title? 773 or 787?

Because the map has been extracted from the atlas, you no longer need the directions that are usually available in the 773 to actually go to the atlas to find the map, so it would be better to use the 787 field. If you wanted to create a record for the map and it's still inside the atlas, then you would use field 773, pointing to the exact location within the atlas, where the map is located.
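As a sketch of the two situations, with a hypothetical atlas title and imprint used only for illustration:

     Record for the map extracted from the atlas:

     787 08 $i Extracted from: $t Atlas of the world $d London : Example Press, 1850

     Record for a map still bound in the atlas:

     773 0_ $t Atlas of the world $d London : Example Press, 1850 $g plate 27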

When an eBook is a "reprint", what date(s) should be used in the fixed fields? And how should the publisher information and dates be expressed in 26x, 5xx, and 7xx fields?

I'm assuming in this case that you're talking about something that was a reprint in its print form and that that has been reproduced to be the digitized version that you see online. In that case, you would treat fields like the dates and 260 or 264 the same as you would have if you were cataloging its print counterpart.

No, I'm saying the publisher-contributed MARC record for an eBook has a different date than the print. Is an eBook that duplicates the title page, except for a change in date, a "reprint," an edition, or neither? What should the dates be in the fixed field, and how is the relationship explained in notes and 7xx fields?

In general, the date when an item is made available online is not a publication date. The date used in a record for an electronic resource which was originally published in print would be the same publication date as the print. As the electronic item is an electronic representation of a print item, the record description should describe the original print item but contain the appropriate electronic fields and coding. This relationship would use 776 fields to link the print and online versions to each other.

Sometimes a publisher will obtain a title, then publish or republish it as an electronic resource, the original having been published in print and/or online by the original publisher. In this case, the new publisher will have "removed" the original title page and replaced it with a new title page. When that happens, the electronic resource record description should reflect the new publication information, and a note could be included in the record referencing the original item, such as a field 534, Original Version Note. A field 775, Other Edition Entry, could also be used to link to the record for the original publication.

If the only difference is date, you need to determine if that date is indicating when the resource was made available online or if it is actually a publication date. If you need assistance with a specific resource, you are welcome to ask Metadata Quality staff at, we would be happy to help.

OCLC’s Bibliographic Formats and Standards (BFAS) addresses electronic resources in Chapter 3, Special Cataloging Guidelines. Specifically sections 3.1.1, Provider-Neutral Cataloging: Online Resources and section 3.3.1, Special Types of Publications: Electronic Resources. For field 534, see, and for field 775, see

If a title is issued simultaneously in different languages and you are not sure which is the original language, what tag would you suggest using to trace the relationship?

It seems like it would probably be field 767. This field can be used when the item in the horizontal relationship is the original or another translation. It might be safer to use this field because you're not asserting what the original language was; you're just pointing to a different-language edition.

You do see that in Canadian publications, where the government will issue the same text in English and also in French simultaneously, so you can't say that one is necessarily a translation of the other. If they are truly simultaneous and you don't know whether there's any translation involved or what the original language is, you would use 775. You would use 765 or 767 in cases where you actually know there is a translation.
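A sketch of that simultaneous-publication situation, using a hypothetical Canadian government report issued in both languages:

     Record for the English version:

     775 08 $i Issued also in French: $t Rapport annuel

     Record for the French version:

     775 08 $i Issued also in English: $t Annual report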

When a print title ceases and it continues as the same title in an electronic version, do you use 780 and 785 to link the two?

When a print title ceases and it continues as the same title in an electronic version, you would use field 776 to link the two, instead of fields 780 and 785.

776 08 $i Continued online: … [on print version record]

776 08 $i Print version, -2019: … [on online version record]

Only use fields 780 and 785 when both a title change and a format change exist, for example, when the print version ceases along with a title change to the online version. Both the print and online version records representing the earlier title would point to each other using field 776, and both records would also link to the later title, online version record using field 785.

     Print version record:

     776 08 $i Online version: … [link to other format with same title]

     785 00 … [link to later title]

     Online version record:

     776 08 $i Print version: … [link to other format with same title]

     785 00 … [link to later title]

     Later title, online version record:

     780 00 … [link to earlier title, online version record]

     780 00 … [link to earlier title, print version record]

When a new item has a relationship with an older item, who is responsible for adding the reciprocal relationship to the older record?

In general, when a title changes and a member institution adds a new record representing that title change, the institution inputting the new record is encouraged to add the reciprocal linking field in the record representing the earlier title. If this does not fit with your workflow or you are unable to, you may email a request to

In the past I asked how to link print and eBook records for older materials when there is more than one print record and more than one e-book record. I was told we are allowed to add more than one 7xx field in the record to point to the different records for the other format. Is this still the case?

Yes, you may use multiple 776 fields in a bibliographic record to point to other formats. You may also use multiple 7xx linking fields in a bibliographic record as appropriate.

Can a 7xx field link to a known version if it is not recorded in the OCLC or LC systems, since there would be no $w record control number?

Subfield $w is "Required if applicable" in 7xx linking fields in bibliographic records. So, if a control number is available, then you should add it to the bibliographic record. However, if there is no control number available, then you do not have to include one in the field. If a record is added at a later date, the control number may be added at that time.
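For example, with a hypothetical earlier serial title and an invented control number, the 780 might look like this when a control number is available and when it is not:

     780 00 $t Quarterly review $w (OCoLC)123456789

     780 00 $t Quarterly review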

Field 773 is not used for offprints and detached copies. How can we link digital articles to the journal they belong to? True, there's not a physical relationship, but they're still associated with a specific journal issue, keep continuous pagination, etc.

Offprints or detached copies are issued separately but often alongside the original, for either the author or limited distribution. Information about offprints and detached copies, along with OCLC's policy, can be found at BFAS 3.2.2, Offprints and Detached Copies.

However, you may use field 773 to link digital articles to the journal they belong to. BFAS 3.2.1, "In" Analytics, states that articles are considered part of this category and a 773 should be used. Note that some types of publications, such as a single issue of a serial, may also use field 773 but are not considered "In" Analytics in nature.
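A sketch of such an "In" analytic link, using a hypothetical journal citation:

     773 0_ $t Journal of example studies $g Vol. 12, no. 3 (2020), pages 45-67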

February 18, 2021

Do we only worry about coding the related resources in various 7xxs fields when the related bib records are also available in WorldCat?

I think you could still create a linking field for the related resource even if there wasn't a functional number to link it to. So, for example, with the earlier and later titles, you might know the later title but just not know the individual numbers that go along with it. You could enter part of the information into a 785 for the later title, and that would be okay. The idea behind doing it that way would be that later when the control number comes along, you could easily fit it in at that point.

Are the slides for the series/subseries missing the subfield "i"? Slides 31 and 32, the examples that didn't include subfield "i."

With some of them, we included subfield $i with second indicator 8 to show how you would set it up with second indicator 8. In other examples, I showed it using the default second indicator, to show it would generate that default constant display.

Does "may not be added to and/or edited in PCC records" mean that it can't be edited by non-PCC libraries, but can by PCC libraries?

Yes. We took those statements based on information in Bibliographic Formats and Standards, Chapter 5, which has a section on what fields non-PCC libraries can add or edit in PCC records. If you're interested in seeing the full list, I recommend BFAS, Chapter 5.

In PCC cataloging, wouldn't you ordinarily make a 7xx for a related work as a work--in a 700 name-title entry, for example--rather than using a linking field to specify a particular manifestation?

Yeah, that is correct. In monographic cataloging, you typically would make an access point for a related work, and as coding has developed in the MARC format in support of the implementation of RDA, it's far easier now to be explicit about what the relationship is, so that that information can be included directly in that access point field. A lot of these linking fields would not necessarily be used in monographic cataloging.

Maybe I missed something in the beginning, but why can these fields NOT be added to PCC records? 

Generally, the fields that can be added directly to PCC records by non-PCC participants have been limited to things like call numbers in additional schemes, subject headings in additional schemes, and various kinds of note fields that are pretty unique, such as a contents note or even a summary note. The linking fields haven't really been in that category. Not that we couldn't reconsider that, in light of the question, but they just have never been considered to be quite in the same category for people to just add them to PCC records.

Why are these fields not transferred when records are merged? Does this mean that the linking data is lost?

When records are merged with DDR, we're unsure of the quality of some of the records. Some of them might be very good quality, both the retained record and the duplicate, but there's no real way to programmatically evaluate this in a meaningful way for DDR purposes. So that's why some of these fields are not automatically transferred when those records are merged. Now, for manual transfers, if the person doing the merging looks at the two records and says "oh, wow, this field really does need to be added to the other one; it would make the record more complete, it adds quality to it," then that cataloger can manually transfer it, but it does require someone to manually look at the records when they are merging.

Does the fact that some 7XX fields are not transferrable factor in to whether a record gets merged? Do these fields get added to a record before a merge takes place?

The 7XX fields are not taken into consideration directly in terms of deciding which record to keep and merge. They would be considered for records that we would look at and say are equal in rank. Let's say that we have two records that are both I-level, and the software is looking at the number of fields that are present as well as the number of holdings. In that sense, 7XX fields would be counted and be part of the equation of which record we're going to keep in a case like that. That is a different situation from somebody manually looking at two records to merge and evaluating their content to say, "this one looks better than the other one does." And certainly, you can think of cases in serials where you would look at the linking fields in particular to figure out what is going on. If one record has coverage that's greater than the other record, it may be that somebody created a duplicate record and added a 785 for what should have been a minor title change, so that explains why this one only runs for a period of ten years while the other record is still open and ongoing, covering fifty years. You look at 7XX fields, the linking fields, but in terms of automation, they don't really get considered in quite the same way.

What is the difference between 501 (with) and 777 (issued with)?

Field 501 is a "with" note, so it's used primarily to describe resources as they were originally published, released, issued, or executed. The 777 "issued with" entry is for information about publications that are separately cataloged but issued with or included with the target item. You specifically would not use field 777 for bound-with notes.

Does anyone ever use 760 or 762 fields anymore? 

CONSER practice is to not use 760/762, instead relying on 830 to describe the relationship.

I'm sure that monograph practice is to do the same thing and rely on an 8xx series tracing instead. Use of 760/762 within WorldCat nowadays is pretty rare.

I assume we should link to English language titles if we are an English language cataloging institution and NOT link to records with other languages in 040 $b.

That's correct. The links should be to records that are in the same language of cataloging rather than crossing from one language to another. It may be the case that today there is only a German language of cataloging record available; if so, don't put that bibliographic record number in a citation if you are cataloging in English. To cite a number, there really should be an English language record to cite.

If you were merging a CONSER record and an unreliable record, and the CONSER record has a linking field, does it get retained? 

There's a hierarchy on which records get retained, and the CONSER record does get retained over any other record in that hierarchy. The linking fields in the CONSER record will never go away, at least as far as a merge is concerned. The CONSER record will always win out.

Many CDs consist (in part or as a whole) of re-issued content. Under what circumstances is it useful and correct to include a 775 field linking to the original issue? 

This would be a local decision and, even then, may vary greatly depending upon the circumstances surrounding each individual audio recording. In general, a 500 note explaining that the material has been previously released in whole or in part is sufficient. In that note, you may include further identification about earlier releases, including the audio format (such as LP or 78), the recording label or publisher, any pertinent title information, dates, publisher number, and so on.

If you look at the PCC Standing Committee on Training (SCT) Training Manual for Applying Relationship Designators in Bibliographic Records, “Guideline 13: Relationship Designators for Resource-to-Resource Relationships” seems to be the only relevant guidance. It does state that “The use of relationship designators for resource-to-resource relationships is encouraged,” but if you go through the still-official Original RDA Toolkit Appendix J (Relationship Designators: Relationships between Works, Expressions, Manifestations, and Items), none of the designators really apply to this situation. You are certainly allowed to “use another concise term to indicate the nature of the relationship” (J.1), but you may alternatively draw the inference that perhaps a linking field isn’t necessary to account for these types of relationships.

My suggestion would be generally to not bother with a linking field in this circumstance. Use field 500 to include the previous manifestation data at the level of detail you believe to be useful and appropriate. If you want to give access to the publisher and publisher number of any earlier manifestations, use field 028. If there is any title information worth giving access to, use field 740.

How important is name/title data in a 7XX linking field when an OCLC record number or other record number is present? Does name/title metadata in 7XX need to be maintained to support linking functionality?

I'm sure that the theory is "no, it doesn't need to be there; you just need some kind of identifier." On the other hand, we're still at a point where people are dependent on that data in a linking field to actually display a name and title, to see what's going on. I expect that most local systems don't necessarily use the identifier to go grab the information from the related record and supply it in a display. Maybe some do, but I'm thinking probably most don't. We still have this historical practice of including the name/title data in addition to identifiers that we would also put in the link. And it helps in certain situations. I think certainly in the work that we do in maintaining the quality of the data in WorldCat, we've seen instances where the subfield $w had a typo in the number, and we have the name/title for the successor title in the case of a serial that had a major title change, and having that information helps in sort of figuring out what was intended when the identifier leads you to something that's clearly incorrect.

Is 7XX $a included in browse indexes in OCLC? My concern is that a lot of older print items now appear in digital form with 7xx links. Maintaining a name in the older record becomes more onerous if we have to chase down 7XX occurrences.

I believe that it is not, but I'm not 100% sure on that. It does not appear to be part of phrase searching, only keyword searching. For example, field 780 subfield $a is only searchable by the au: search.

I can agree with that; it is a real nuisance to have to maintain the same information all over the place. Going back to what was mentioned earlier about just including an identifier in a link, it makes sense if we could then interactively pull in the information from the related record and always display whatever is currently in that related record. That would be a good thing. And that's something some of us have been talking about for 20, 30 years. It just hasn't happened. The way I would look at it now is that the numbers you would potentially include, the ISSN in subfield $x, the ISBN in subfield $z, and of course the control numbers in subfield $w, are the most important things to maintain, and if the citation gets a little bit out of date, out of step with the related record, that's something we might be able to resolve in the future using all of those identifiers. If you think of a future environment where we rely on identifiers more than is the case today, then we could get information from the related resource and populate a display.

Is 775 added to a record for the original title when what I have is an intermediate translation... I have a Spanish translation from the English version of a French work... the 775 will be for the original French?

I would typically use field 775 for situations where you have the same resource in two languages issued at the same time, when it would be difficult to say that the English is a translation from the French or the French is a translation from the English. For an actual translation, I would think more that 765/767 would be used. But this is another case where, in monographic cataloging, you would more likely make an access point.

We have started to see uncapitalized 655 AAT headings in records, which seem to be related to a linking project adding subfield 0. Is this the final version of these headings?

Yes, that is the final version.

Say I am cataloging an online version of something using title main entry, and I use a 776 field with the insert-from-cited-record function for the print version record, which incorrectly uses author/title main entry. If the print version record is corrected at some point to title main entry, does the 776 field in the online version get updated somehow?

Unfortunately, no, that is not an automated process. That wouldn't happen. Ideally, whoever was working on the record would see that and hop over and manually fix the record or report it to Bibchange so we can fix the record, and I know myself and others in Metadata Quality, if we're working on, say, a print record and we see that 776, we will pop over to the electronic record and make sure the links are correct going back and forth between the records.

What is the Member Merge Program? How is the merge program going? Who do I contact if I’m interested in participating in the merge program?  

The Member Merge Program is a program where we train our member institutions to merge duplicate records. It is going very well. We have 53 institutions that are participating, and we are starting up another round right now; we just reached out to four more institutions that will be joining the program. If you're interested and you are a participant of PCC, please send a message to

Are there major functionality differences between Record Manager and Connexion Browser? Is one more limited than the other?

There is a webinar coming up on Record Manager in a few days, and then I located a comparison chart that shows the differences between our cataloging applications that might be useful.

I've noticed some German language records in OCLC that say DLC. For example, OCLC no. 1179117272. What is the story behind that? Why would they say DLC if they are not cataloged by the Library of Congress?

A longstanding practice from years ago would be for our indexing to look at the coding of source at the tail end of the 008 field, and based on that, if it was coded as blank or even coded as c, we assumed that it was either the Library of Congress or a Library of Congress cooperative cataloging program. But the definitions in MARC are broader than that nowadays, where blank can be used by any national library, and c can be used for any cooperative program, so source isn't coded in quite the same way across WorldCat as you would have seen in the past. But we haven't updated our indexing to reflect those changes, so records that come from the German National Library have source blank, and consequently we're marking things as LC; and particularly in the case of source c, we're marking things as Library of Congress cataloging that are not. We do have a JIRA ticket open so we can take care of that; it's not implemented yet.

I was trying to reach WorldCat Discovery Team to ask a question about its Release Notes. What is their email address?

I would recommend going to OCLC Support and they will direct you to the OCLC staff that could answer your question. And that's

I work mostly with technical and scientific materials, for which there are numerous records in WorldCat in Dublin Core or other non-"regular" MARC records. I get overwhelmed by them. But I am interested only in regular MARC records. Can OCLC create a search limiter in Connexion and Record Manager to limit searches by "regular" MARC and non-MARC (or Dublin Core/non-DC). That would really help.

I don't know if this will help, but in a lot of the searching that I do where I want to weed out of the results Digital Gateway records, I would put in "not AC=DC," and that would cause them to fall out of the search results.

In linking fields, subfield $t contains "title information from subfields ǂa, ǂf, ǂg, ǂk, ǂn, and ǂp of field 130 or field 245 of the related record. " If you have a 130, where does info from 245 go? In the same $t subfield?

Only one field (either field 130 or field 245) is chosen as the title used in subfield $t in the linking field. If a record has a 130 field, that field would be used because it has the differentiating information needed to distinguish the title in the 245 field from others with the same main title.

Therefore, the thinking is that there would be no need to also include the title from field 245.

March 2021: PCC and OCLC

March 9, 2021

Closing records for serials that have ceased is a need for PCC CONSER records. Is there any movement toward more serials having ceased dates added?

Libraries do report that kind of thing to Metadata Quality, and we will go ahead and close out records. A lot of CONSER participants will spot these same kinds of needed changes in their own work. So, it really is a combination of both. We take the requests as they come; either a CONSER library will do it ahead of us, or we do it when it's reported.

Are PCC participants going to more regularly add $0 subfields?

There is currently a pilot going on about adding URIs in PCC records, both for NACO and BIBCO records. Once that pilot reaches its conclusion, they will publish their recommendations and open up entry of those subfields to the rest of the PCC. So, yes, you will start seeing more subfield $0 and subfield $1 now, in the appropriate places. Look for best practices to come.

Does CONSER still have a moratorium on new members? Is there any sense of when the moratorium might end?

The LC Secretariat said that they are willing to allow more members now. See the PCC website for information on how to apply.

Please confirm: MARC Organization Codes are used in 040 field for authority records and not our OCLC Symbol?

Yes, the MARC Organization Codes are used in the 040 for authority records.

For the PCC standing committees, who can participate, and how would one join if there is an opening?

There are membership slots that open up each year. Generally, it is the chair of the standing committee who recommends, or seeks out, new members. The PCC year corresponds with the federal fiscal year, so the terms run from October through September; October 1st is when new members join. If you are interested in becoming a member of one of the standing committees, we would suggest getting in touch with one of the chairs of a standing committee and letting them know. Occasionally there are also calls for volunteers on the PCC List, and you are welcome to volunteer when there is a call.

Will you add the email link for sending cataloging questions to OCLC?

We have several different emails, depending on the purpose. Cataloging questions go to Requests to correct bibliographic records go to If you are not a NACO member, requests for correcting or creating NACO records can be sent to

What can we do about EZ proxy links to other university libraries in CONSER records? Example: We update our records in WorldShare, and some serials records has an EZ proxy link that goes to a different library and doesn't work for our users.

You can report those kinds of links to Metadata Quality and we will take them out. The intention is that the CONSER record would only contain general links to an online serial publication, and not institution specific links.

I often find mistakes in PCC records, why is this? And when reported to OCLC, why does it take so long for them to be edited, and why don't they notify the reporting library when the edit has been made?

PCC catalogers make mistakes, just like everyone else. If you find mistakes in records, do feel free to report them. We suspect you may find mistakes in DLC records too, and those are also a part of PCC records. If you do find them and want us to correct them, we'll be glad to do that. It shouldn't be taking very long for them to be corrected. We usually turn around our requests to the Bibchange inbox within several days to maybe a week. If you are not seeing that correction being made, perhaps we didn't receive your request, so you might send it again. Also, due to the volume of requests we receive, we don't respond to each request. We want to take the time to address the request, and if we had to respond to every request that was sent in, you can see how that would cut into the time we can spend making corrections. Don't forget that you can make many, many changes to PCC records, which are listed in Bibliographic Formats and Standards, Chapter 5. There may be changes you can make that you may not be aware of. If you would like for us to notify you because you are waiting on that record to be corrected before you can use it, please just add that to the email or the request and we'll try to do that.

At our library we have recently found out that although records have links to authority records, when users use a heading from a 4XX field, in their searches, Discovery seems not to be able to find records using the authority heading. Why is this?

In Discovery, if Search Expansion is configured, a user should be able to enter 4XX terms and have that term, as well as any authorized headings, returned as part of their search results. Institutions that want to use Search Expansion will need to enable this functionality in Service Configuration, as well as select the authority files they want to be used. Should an institution need any help, they can contact their Customer Service area.

In Service Configuration, navigate to WorldCat Discovery and WorldCat Local > Search Settings > Search Expansion Settings.

According to OCLC Bib Formats field 264: Optionally, add a terminal period at the end of the field unless the last subfield ends with an ellipsis, exclamation point, hyphen, period (following an abbreviation or initial), question mark, closing bracket, or closing parenthesis. The BIBCO example has a period after the bracket. Which is correct?

Longstanding practice for that kind of information, whether in field 264 or 260, is not to include a period after a closing bracket at the end of the field.
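For example, with a hypothetical imprint, the field ends without a period when the last subfield ends in a closing bracket, but takes a terminal period otherwise:

     264 _1 $a [Place of publication not identified] : $b Example Press, $c [2019]

     264 _1 $a London : $b Example Press, $c 2019.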

Can you briefly explain what Sinopia is?

Sinopia is a cataloging interface where you can catalog resources using linked data. There is a website you can visit to read about Sinopia and the efforts involved with the Sinopia cataloging interface.

My library's holdings are currently attached to an incorrect WorldCat record and there are no corresponding attached holdings for us in Connexion. Who do I contact to sort this out?

Send a message to so that we can take a look and investigate why your holdings may have been attached to this incorrect record. If there is a problem with matching, we can address that. If you have a data sync project, we can get you in touch with the Database Specialist for that project to see how we might be able to improve the matching, if it's matching incorrectly.

Will it ever become possible to identify the meaning of the MARC Organization Codes in field 040 of authority records by hovering over them in the authority record (in Connexion), as we can do with OCLC holding symbols in bibliographic records? Can this feature be added to Connexion for authority records 040 fields?

Hovering over those MARC organization codes in an authority record in Connexion and seeing what library each refers to is on our wish list for a future enhancement. It would be a nice feature, but it is not something that is feasible in the near term. Also, at this time there are no plans to add a hover-over feature for the institution symbol in Record Manager, for either bibliographic records or authority records.

I always get the usage of OCLC symbol vs MARC organization code confused. Is there a simple rule for usage?

Your OCLC symbol is used in bibliographic records, your MARC organization code is used in the 040 field of authority records. In the past, OCLC sometimes did carry MARC organization codes in the 040 in bibliographic records, but currently we convert those to OCLC symbols.

Catalogers sometimes come across notes in WorldCat records that pertain to a specific organization's holdings, but that do not specify the organization. For example, see the note "Autographed copy" in OCN 27758517 or 2867234. In such cases, may a cataloger report the note to OCLC QC through the "Send change request" so that the note is either related to the pertinent organization through a $5 or deleted from the WorldCat record? Is there another suggested course of action?

If you are unsure whether a note should be removed from a record, you are always welcome to send it to us and we will look into it. Or you can use the error-reporting function in Connexion or Record Manager. Our April session of Virtual AskQC Office Hours will be "Local Data in WorldCat Records" on Tuesday, April 13 at 9 AM Eastern and Thursday, April 22 at 4 PM Eastern.

What is the relationship between WorldCat and OCLC?

WorldCat, as we refer to it, is the bibliographic database that you search. OCLC is the organization to which all five of us report and to which many of your institutions belong. We try to refer to the bibliographic database as WorldCat, although for decades people have colloquially referred to all of OCLC's databases as "OCLC." Some people think of WorldCat as the Discovery database, the one openly discoverable or searchable on the web, and consider it something different from what is used in cataloging, but it is the same database no matter what interface or service it is used as part of.

I updated a UKM record yesterday, they use "colour" in the 300 subfield $b. Do I change it to "color"?

Leave it intact with the "u" spelling. We serve libraries in countries outside the U.S. that spell "colour" differently than we do in the United States. In our own work, we tend to leave the two spellings intact as found on records. If you were adding that information and keying it in for the first time because it wasn't there, then use the spelling you are familiar with. Otherwise, leave the spelling intact, because it is not necessarily incorrect.

Is there a way, or can there be a way, to view previous versions of bibliographic records in Connexion? Would be very helpful to troubleshoot cases where a record doesn't quite match the item in hand, but has many holdings, and we want to find out if it's the correct record with a mistaken edit, or a new record is needed.

We have an internal database called Journal History where we can view previous versions of a bibliographic record. That is not something we can make available externally at this time. If you have a question that you think can only be solved by viewing the history of a record, send that query to us and we can help you figure out what is going on with the particular record.

Many of the PCC and BFAS guidelines are premised on libraries having a local system where they can edit the bibliographic record as needed. This can pose challenges for some libraries using WMS that no longer have a separate local system. Is there any chance that this will be considered for future updates to guidelines?

If WMS is your local system and there is something you want edited in a PCC record, please send it to us. We will make the edit if that seems feasible, or discuss it with you further if there are questions.

I have come across a record for a digital resource linked in a 7XX to the record for the item I have in hand, but it's clear by following the link to the item that it's for a different (very close) resource. The e-resource record looks like it was derived from the physical record by OCLC, some kind of automated process. Should I report this to the Bibchange email or take some other action?

It turns out that in our processing for Google and HathiTrust resources, occasionally an incorrect record is cloned to represent an item online. If you can tell that has happened, you can report the record to us and we can adjust it to reflect what is at that one link, as long as it looks like the record really has just that one link and is supposed to represent what is at that URL. If the record has picked up additional links, it may be more confused (one link refers to one version of the resource while the other links refer to a different version); that kind of situation should be reported to us. Ordinarily, if it's just the one 856 field on a record that was derived, with symbol OCLCE in the 040 field, you can make adjustments to that record yourself and also correct the link in the 776 field to point to the correct print version of the same item.

March 18, 2021

If a CONSER record has a 130, is the 130 "self-authorizing"? Would the CONSER record be the authorizing source for use of the 130 title in a subject access point on a PCC record?

For many years there had been a policy in the NACO file to not necessarily include every uniform/preferred title for works in field 130. Instead, it could just exist on the bibliographic record itself. That is why, when you look at the CONSER file, you'll see so many 130 fields that are there to differentiate similar titles, the same title, for two different publications. So, yes, the CONSER record itself would be the authoritative source for the 130 that you were going to use as a subject heading on another record.

Edits to PCC records to be done in a single transaction. Does that mean we should only use "replace record" once, and not make another edit/replace if you missed something?

Yes, essentially that is the case. Because of the limitations on editing PCC records (editing, adding, or otherwise changing a field), you may find that if you made an error in a field, it cannot be corrected afterward because that field already exists once the record has been replaced. So it is much preferable to make all your edits to a PCC record in a single replace transaction.

Can individuals join PCC?

No, individuals may not join PCC; membership is institutional. If you are from a small library and only have one or two people interested in participating, your institution can still join PCC. There are lots of funnels within NACO that allow smaller institutions to join without having to contribute large numbers of records.

I see a lot of foreign language DLC records with encoding level 7 and 'pcc' in field 042. I thought the record had to be full level to be coded PCC. Also, I occasionally still see level 4 PCC records and I thought level 4 was discontinued. What's up with these?

You may see older records that have encoding level 4 because when the PCC started, level 4 was the code that PCC libraries used for BIBCO records. That hasn't been true for quite a few years now, but older records do still have that code and you'll still see them within WorldCat. Some other libraries still use code 4; their records may or may not be PCC records, but probably are not if the code is in current use. As for level 7, the Library of Congress automatically adds 'pcc' in the 042 field for almost all of its new cataloging. There is not a problem with encoding level 7, which is minimal level, on PCC records, as long as it accurately represents the record. Any PCC records with encoding level 'blank' ought to adhere to the BSR (BIBCO Standard Record) or CSR (CONSER Standard Record).

It seems that in the case of BIBCO records, a combination of encoding level 7 and 'pcc' 042 would be a little more unusual and would not pass our validation. Encoding level 7 in combination with 'pcc' might be something that you see more often in a CONSER serial record. The thinking is that 'pcc' indicates that the access points are under authority control and encoding level 7 is indicating how complete the description is, and those are two different things. So, it is possible to have this combination in CONSER. Although typically, if somebody is doing full authority work so that they could add code 'pcc', they usually do a more complete description. So, the combination of encoding level 'blank' and 'pcc' would be far more common.

The combination of encoding level 7 and 'pcc' in BIBCO records fails validation because we are not supposed to see that combination in monograph records. But, as mentioned, it has been seen on some records. A search in the database revealed that for monographs we have more than 9,000, maybe closer to 10,000, records that fall into that category. They look like errors that have come from the Library of Congress. There isn't a situation where we have converted any encoding level 'M' (full) records to encoding level 7, so they were presumably received by us as encoding level 7. That being the case, we should probably take the 'pcc' code out of those, or at least look at them to determine whether the encoding level is an error and they really should be full.

We will investigate further to see what is in the Library of Congress' catalog versus what we have, in case something did change on our side along the way, determine what the issue is and take care of it. 
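The check described above, flagging monograph records where minimal encoding level 7 co-occurs with 'pcc' in field 042, can be sketched in code. This is a hypothetical illustration, not OCLC's actual validation logic; the record representation (a dict with an 'elvl' encoding level and a list of 042 authentication codes) is assumed for the sake of the example.

```python
def flag_pcc_level7(records):
    """Return monograph records where encoding level '7' (minimal)
    co-occurs with 'pcc' in field 042 -- a combination that, per the
    discussion above, fails BIBCO validation and likely indicates
    an error in the encoding level."""
    return [r for r in records
            if r["elvl"] == "7" and "pcc" in r["auth_codes"]]
```

A quality-control pass over a file of records could then route the flagged records for review, either to remove the 'pcc' code or to raise the encoding level to full.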

Would it be helpful to report encoding level 7 records with pcc as an error to OCLC as we come across them?

No. We have identified this issue and will be doing something about them in the coming weeks.

If a CONSER record has no 130, are the 245 subfields $a, $n, and $p considered the authorized title for the serial for use in other CONSER and BIBCO records?

Yes, just like the case with field 130, if a CONSER record has no 130 because the title was not in conflict, then that information from field 245, the title proper in subfields $a, $n, and $p, would be considered authorized to use for an access point for that serial in another record.

I know I've sometimes replaced a record twice in quick succession, not because of being cavalier about it but because I realized I made a typo or other mistake, or noticed only after I'd replaced it the first time that there was another error to fix. Is that creating some kind of problem?

No, it is not creating a problem. It's best practice to do a replace once, but if you do notice an error or a typo that you need to fix after you replace a record the first time and need to replace the record a second time, that's fine.

The training materials on the PCC website for basic serials cataloging are from 2014. Have there not been updates since then?

There is some work currently going on to update portions of the CONSER cataloging manual. We are not sure when those revised sections will be available.

Can you provide a quick overview of what Sinopia is?

Sinopia is a cataloging interface where you can catalog using linked data. We recommend the Sinopia website, and the Standing Committee on Training's Sinopia training materials, for more information.

There is still a moratorium on NACO membership, correct? If so, does anyone know when it might be lifted?

The moratorium has been lifted as of March 1, 2021. If you are interested in applying for PCC membership, you may do so.

Are you aware of official online or in-person training sessions for NACO being offered through PCC for new participants?

We suggest contacting the Secretariat at the Library of Congress. Because of the moratorium mentioned earlier, no training sessions were given in person or online during the last year. Now that the moratorium has been lifted, some could be planned for the future. There is also a lot of training that has been recorded and is on the website.

I have recently encountered some non-English language bibs (i.e., 040 $b NOT eng) that display as though they are DLC or PCC-authenticated bibs within Connexion search results. Any news on these?

This is the issue related to the coding of the Source (Srce) element in the fixed field, where records are either coded with Srce 'blank' or coded with Srce 'c', and they display as if they are LC when they are not. That is something that we are working to resolve. It is an outstanding issue that has been reported and is in our backlog to work on, but we are unsure when we will be able to get to it.

Does anyone know why the 546 field always seems to display before any of the other 500/5XX fields? Just wondering, as I always have to reformat to put everything in numerical order.

Answer (from participants in chat): That is the correct order of 5XX notes, and they should not be reformatted. Information about note order can be found in Bibliographic Formats and Standards (BFAS).
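The point above, that 5XX notes follow a prescribed display sequence rather than simple numerical order, can be illustrated with a small sorting sketch. The ordering list here is illustrative only (it places 546 ahead of other common notes, as discussed); consult BFAS for the actual prescribed order.

```python
# Illustrative priority list -- NOT the authoritative BFAS sequence.
NOTE_ORDER = ["546", "500", "505", "504", "520"]

def reorder_notes(fields):
    """Stable-sort 5XX note fields, given as (tag, value) tuples,
    by the prescribed display order rather than by tag number.
    Tags not in the list keep their relative order at the end."""
    def priority(field):
        tag, _ = field
        return NOTE_ORDER.index(tag) if tag in NOTE_ORDER else len(NOTE_ORDER)
    return sorted(fields, key=priority)  # sorted() is stable
```

With such a routine, a 546 language note sorts before a 500 general note even though 500 < 546 numerically, which is why reformatting records into strict numerical order is unnecessary.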

April 2021: Local data in WorldCat records

April 13, 2021

If we find a library's local information in an OCLC WorldCat record, should we report it, and how?

A page on the OCLC support site shows all the different ways you can report issues with records to Metadata Quality staff.

If we encounter another institution's local data in a WorldCat record that isn't for an archival or special collections resource, should we ignore it, or do we have permission to delete these fields? The presence of this information is often problematic for Discovery users because it appears to users as if it applies to our own holdings.

If you're unsure whether a field is too local and doesn't belong in a WorldCat record, but you don't want to take it off yourself, feel free to send it to us. If you are able to edit the WorldCat record to remove what is clearly local data, by all means do that, but we would appreciate a note about it in whatever way is convenient for you, whichever way you normally report errors. The odds are that if you're finding a local note on one record, it's going to be on multiple records, and that gives us the opportunity to hunt down any additional records and correct them in bulk. There might also be an opportunity for us to reach out to the institution that contributed those notes for educational purposes. So feel free to edit the record if you're able to, but we would also appreciate being notified so we can follow up if additional actions need to be taken.

What is the philosophy behind including local, copy-specific info in the OCLC record for rare/special collections materials? What is the value to the world to see in WorldCat that Library X's copy 2 is bound in purple leather, or that it was acquired as part of the Blah-Blah Collection? Or are these cases of "abuse of local info" that we shouldn't be seeing in OCLC records?

Certainly, you're going to find records that have local information. It may not be coded as local information, but when you look at it you know that's what it is. Please do report those to us. It may be that you have something unique, for example an artist's book, something original that you're cataloging for your collection; you want to make it as descriptive as possible for your users, and you keep your records in WorldCat. I can see plenty of reasons why you would want to, but always keep in mind that if you're sharing this information in WorldCat, other institutions are going to see it in the main record. There are lots of different opinions about cataloging rare materials. What's important to somebody who deals with rare materials may not matter to somebody who works in a more general way and views the item as just an old book; someone else may look at the same item and say it's really valuable and needs to be described in very specific detail. A lot of the details you see in rare book cataloging are genuinely interesting to others dealing with the same kinds of materials: they're trying to determine whether they have another copy of exactly the same thing, and differences in printings are going to be of interest to different users. We try to accommodate everybody as much as we can, which means we lean toward allowing more information in records that represent rare materials, with the idea that somebody using the same record for copy cataloging who doesn't need it can edit it out locally. It is very much a different philosophy for rare materials versus everything else.

How secure is LBD information, e.g., for donor names?

See the WorldCat Discovery release notes for March 2021. If you don't want donor information in a particular field to be shown to your users, you can omit that field from display.

I would be interested in a session on how to pull reports in WMS on LBD and LHR data. We have Report Designer.

WorldShare Analytics Office Hours

I'm going to pass this along to the Analytics team and have them reach out to you. If you send us your email address, we'll make sure you get in contact with the right person.

Does anything need to be said about the use of $5 with non-local MARC fields? Or, when would it be appropriate to have in the WorldCat record a field that represents data specific to the institution designated in subfield $5?

That would be for a note or added entry where, say, somebody famous had donated an item to a specific institution. Say a noteworthy person passes a rare book on to your institution and you want to note that in the bibliographic record. It's rare; it's unlikely anybody else is going to have this same item. You have the note, and you have the subfield $5. If somebody looking for this item sees your resource and it has a 500 note with a subfield $5 saying it's signed by Abraham Lincoln, that's kind of noteworthy. It's probably of some interest outside of your local institution, and that information is made available to everybody else even though it is specific only to your copy.

$5 with non-local  MARC fields: this is part of the PCC Provider-neutral Guidelines for e-resources

Yes, in the case of provider-neutral cataloging, there was a need to indicate preservation information specific to a single institution. So rather than have two records for an electronic resource when a library has been involved in some digital preservation program, the decision was to include that information in the single provider-neutral record that would otherwise be used for cataloging any online instance of that resource. Preservation information, though, is not necessarily of interest to everybody, so one of the preservation fields used would be marked with subfield $5 to say that it is really specific to the particular instance that has been used for digital preservation.

For eBooks, what dates should be put in MARC records, and in which fields? When creating an eBook MARC record: the eBook has 2021 on the t.p. and c2014 on the t.p. verso; there is no edition statement in the eBook.

 A print book was released in 2014.

Publisher in eBook matches publisher in the 2014 print record.

Pagination in eBook matches 2014 print record.

Option A: 264 _1 2014. Single date in Fixed fields, 2014, ---- (ignore the 2021 date)

Option B: 264 _4  $c c2014 ; 264 _2  2021.  Single date in Fixed field, 2014, ----

Option C: 264 _4  $c 2014 ; 264 _2  2021.  Reprint date in Fixed field, 2021, 2014

other options?

Recall 264 2nd indicators:

264 _0    Production.

264 _1    Publication.

264 _2    Distribution.

264 _3    Manufacture.

264 _4    Copyright notice date.

What dates do you use in a bibliographic record when it is an electronic resource, and it was originally issued or published in print form?

With those criteria, the description of the electronic resource should match the description of the print resource, and it's the additional electronic fields that bring out anything specific to the electronic version. That doesn't necessarily include dates, because different providers make the resource available online in different years. Say one provider has an agreement to provide the title for five years and then drops it from their collection, and somebody else picks it up and makes it available; you'd otherwise have to go into the bibliographic record and change dates. The actual date is the date the item was originally published, and then the electronic information is added to make an electronic record. Under provider-neutral cataloging guidelines, you take the publication date from the title page that you see, which would normally correspond to the print. There isn't the same level of interest in when the item was digitized and placed online, partly because the one record is going to stand for all instances of that same resource as found online. They were probably put online by different providers at different points in time, so it's the original date of publication on the title page that matters. Does this conflict with what is in AACR2 or RDA? Absolutely, it is not in line with either standard, but it's what is required for provider-neutral cataloging.

As for providers that changed the title page date but nothing else: providers can play all sorts of tricks on us. It may be that you end up looking at the print record and realizing it's the same thing, particularly if you're dealing with a provider that has a history of changing bibliographic information or not presenting everything you would expect to see; they digitize a book but don't give you the original title page. It depends on what is available to you as you are cataloging. A record that you might base on one instance available from one provider might be altered when it's available from a second provider and an original title page can be seen. You have to take that into account as a cataloger. You can't do endless research on an item while cataloging it; you pretty much have to take what you see. If you suspect that you can't see the original title page, describe from the title page that you have. It's possible to include a 588 note to say what you based the description on, and it's also possible to base the description of the electronic version on the print item itself.

I discovered some subject headings that we had entered in a record as 655 _7 $2 local had been programmatically changed by OCLC to 655 _4.  Is that the preferred entry form to use 2nd indicator 4 instead of 2nd indicator 7 with subfield 2 local?

We view those as essentially equivalent. A lot of libraries look at the MARC definition and say those don't mean exactly the same thing, but in the context of WorldCat they really do. There isn't a particular preference, but we do change them to second indicator "4" rather than second indicator "7" with a subfield $2 local, in part because we transfer data into records based on the scheme identified by the second indicator and the subfield $2. To our system, a 655 with second indicator "4" looks like a different scheme than a 655 with second indicator "7" and a subfield $2 that says local, when in fact they're the same. So if we make them the same, we avoid the duplication that we would have otherwise. It's possible to call up a record and see the very same term in a 655 with second indicator "4" and also in a 655 with second indicator "7" and a subfield $2; we try to avoid that as much as possible.
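The normalization described above, treating 655 second indicator 7 with $2 local as equivalent to second indicator 4 and collapsing the resulting duplicates, can be sketched as follows. This is an illustrative sketch, not OCLC's actual process; fields are represented as simple (tag, ind1, ind2, subfields) tuples for the example.

```python
def normalize_local_genre(fields):
    """Convert 655 fields with second indicator '7' and $2 'local'
    to second indicator '4' (dropping the now-redundant $2), then
    remove exact duplicate fields that result."""
    out, seen = [], set()
    for tag, ind1, ind2, subs in fields:
        if tag == "655" and ind2 == "7" and ("2", "local") in subs:
            ind2 = "4"
            subs = [s for s in subs if s != ("2", "local")]
        key = (tag, ind1, ind2, tuple(subs))
        if key not in seen:          # skip fields that became duplicates
            seen.add(key)
            out.append((tag, ind1, ind2, subs))
    return out
```

Run over a record carrying both 655 _4 $a Diaries. and 655 _7 $a Diaries. $2 local, this leaves a single 655 _4 field, which is the deduplication effect described above.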

Is it possible to export LBD records? We only managed to export bib records which are merged with LBD fields (with query collection).

Yes, you are able to export the records through Record Manager. There is a great resource on how to use query collections: About query collections in Collection Manager.

Does OCLC have plans to deal with the presence in WorldCat records of 856 fields coded for specific institutions’ authentication/proxy methods? There seem to be a small number of libraries responsible for adding these fields (perhaps through an automated process?), but they tend to clutter the WorldCat records and the time it takes to erase them really adds up over time. Does OCLC routinely and systematically "comb through" master level electronic resource records to remove institution-specific 856s?

We do deal with removing 856 fields that are specific to an institution, particularly when a more general one is available instead. It's much easier to do for commercial providers where we have a general URL available: we try to transform the local URL into the general one, and then, if it turns out to be a duplicate field, it drops out of the record. We do that using macros to clean these kinds of things up, but it only deals with a portion of the problem. There was a period of time when many more 856 fields transferred between records than is the case now, and that's where a lot of these have come from. If you see one specific institution or a URL from a particular provider where this happens a lot, go ahead and contact us. Then we can put some effort into dealing with that particular problem and get it out of the way as much as possible.
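The kind of cleanup described above, transforming an institution-specific proxied URL into the general one so that duplicates drop out, can be sketched as below. This is a hypothetical sketch assuming an EZproxy-style wrapper that carries the general URL in a "url=" query parameter (e.g. https://proxy.example.edu/login?url=...); real proxy patterns vary, and actual cleanup would need more cases.

```python
from urllib.parse import urlparse, parse_qs

def generalize_url(url):
    """If the URL is a proxy-style link wrapping a general URL in a
    'url=' query parameter, return the wrapped general URL;
    otherwise return the URL unchanged."""
    qs = parse_qs(urlparse(url).query)
    return qs["url"][0] if "url" in qs else url

def dedupe_856(urls):
    """Generalize each 856 URL, then drop duplicates that result,
    keeping the first occurrence."""
    seen, out = set(), []
    for u in map(generalize_url, urls):
        if u not in seen:
            seen.add(u)
            out.append(u)
    return out
```

When a record carries both a proxied link and the matching general link, generalizing the proxied one makes the two identical, and the duplicate drops out, which mirrors the macro behavior described above.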

I see a lot of 710 fields with subfield 5 for the special collections of other institutions. We use WMS and Discovery, so there is no way for us to delete them. Is it proper to request that the fields be removed?

Absolutely shoot us a message and let us know, and we will definitely take a look at it. If you are concerned and want feedback, just say I found this, there's possibly more, could you let me know? We'll definitely take care of it, respond back to you, and say yes thanks for reporting this, there were 50 more we've taken care of. These often have subfield $5s and we still want to know because not everybody uses the $5 as it was intended. So, we would want to review that.

In Record Manager in OCLC WorldShare, can you create a MARC record online for your library's use only?

Not yet, but future functionality will include this. It's planned work; we have the requirements for it, we just don't have a timeline for when it will be released.

LAC uses 710 for the bilingual equivalents as part of our bilingual cataloguing as a national library. Would you be removing those?  Our use of the 7XX for bilingual equivalents was discussed and implemented as part of our migration to WorldShare.

We're very careful about what we do with Library and Archives Canada records, in particular because of the need to accommodate elements required for their unique catalog and the situation of accommodating records in two languages, French and English. We like to have these reported, and we do remove things when it's appropriate to remove them. Speaking generally, though, we are rather conservative in our edits, just as we're conservative with our merging. That may leave more data, or in the merging case duplicate records, within WorldCat, but that's because we're erring on the side of caution and feel that a duplicate may be more acceptable in some cases than removing something and losing that information. Rich comments that most of the community, he believes, has a hands-off approach to LAC data in WorldCat; we're definitely very respectful of that.

Situation: a PN (provider-neutral) record has a 500 field, with no subfield $5, that says the item was digitized from some specific library. Can that note be removed? It clearly doesn't apply to all the digitized versions on the provider-neutral record.

I would agree. Because the record is intended to stand for all instances of that same resource as available online, particularly when it's available from more than one provider, a note that is specific to one institution probably isn't needed.

Is there more information about the 338 field and Spanish?  $2rdacarrier/spa

I do know that with the RDA Toolkit having a translated version in Spanish, you can find more of the terminology, the controlled vocabulary, for the RDA terms in Spanish there. The macro work that we're doing automatically adds Spanish 33X fields to records when appropriate.

Hayley's example, of course, was in the context of an LHR, but in bibliographic records the expectation is that the 33X fields will be in the same language of cataloging as the rest of the description. Occasionally we'll come across a record that is marked as cataloged in Spanish, but the terms in 336, 337, and 338 are in English. We'll convert those whenever we can and mark the language code in the subfield $2, so you'll see something like $2 rdacarrier/spa to indicate Spanish.

RDA Registry has such controlled terms in multiple languages.

Irene asks if we could recommend OCLC resources that describe in detail the information that has been presented today.

Bibliographic Formats and Standards: all of the fields are listed there, and also in the online help pages. If you go to the OCLC website and look under support or help, you will find local holdings information and local data information. The help site can help; do a keyword search. It's really nice, and it has categorized listings of all the fields. Plus, you'll have the resources in this video to look back on as well.

Very off-topic question: Would you ever consider making it possible to save a record without closing it? Connexion closes the record when saving to the online save file, and the process of saving and re-opening has been a big time-eater this past year of working from home with a less-than-totally-reliable connection.

We'll look into this.

April 22, 2021

At this library, we receive only the cross-reference exception report after we upload our records, which is done weekly. Is it okay to ignore the other reports that come along with the cross-reference report? And can you explain how to retrieve the impounded records?

Thank you very much for the question. We have limited expertise in this area; the Metadata Quality team doesn't work with this much, but we had some internal chat as the session was going on. First, I've put into the chat for everybody a link to understanding My Files reports. If that ultimately doesn't answer this particular question, then we suggest reaching out to OCLC Support; they will absolutely be able to direct you, or anyone who has questions about these cross-reference exception reports, to the right place.

At the Lillian Goldman Law Library (Yale University), we review only the Cross.Ref exception report after we upload our records to OCLC weekly. Is it okay to ignore other reports that come along with the Cross.Ref? Also, can you explain as to how to “retrieve” the impounded records? Thank you.

See the My Files reports documentation, and please reach out to OCLC Support.

How can a consortium use the local notes field?

If the group has some sort of Discovery package, and they want to share that particular note, then an LBD seems to be the more appropriate place, rather than an LHR.

Should we decide to use our institution public interface, even if we are subscribing to WorldCat Discovery, would we be able to extract local data for our users?

Yes, in both Record Manager and Collection Manager. It's a little easier to do in Collection Manager as a query collection. If you want to learn how to do that, check out this link.

Then the other part of the question is: should we use our institution's public interface even if we're subscribing to Discovery? That's a great question, and a lot of institutions need to deal with that particular situation. There are reasons in some cases to use your public interface, and there are reasons to use Discovery. Of course, us being on the call here, we would love for you to use the WorldCat Discovery interface, and there are quite a few benefits to doing that as well. But the first part of that question is really a good local conversation to have, not only with the catalogers within your institution but also with the people who work with public services and users, who may be one and the same. It's definitely a local decision.

You also have a comment about music libraries having local notes for these music records. There are providers around that do not have any kind of general URL that would take you to something about the item, just metadata, or in the case of something like ebooks, a URL that drops you at a title page they might let you see. In those cases where it's entirely local, we would certainly like to reduce the number. The problem we have when there is a general URL plus one additional URL that is institution-specific is really a problem of clutter in these records, where you have the same domain name over and over, and most of those are not going to work for anybody. If there was only one URL for Naxos on one of those records and it was local, it's still not going to work for anybody else. Our general approach in the past was that we're okay with perhaps removing the others. At times we've looked at it and said, well, one is better than none at all. So it's something we may need to discuss, about what we want to do with these kinds of things across the board, because if it's entirely unavailable to everybody else in WorldCat, that rather raises the question of why have it at all.

Should things like an 856 with a link to a local book fund plate, or a table of contents that is password protected, be in the WorldCat record?

What is described in the question is a link to a local book fund plate, or a table of contents, that's password protected. Those sound like local, institution-specific links. So, no, they should not be in the WorldCat bibliographic record.

Is adding DOIs to LHR 856s a norm? Is adding those kinds of identifiers to an 856 a norm? Or, to put it a different way, are those good places to add them?

I’m used to seeing them in the 024 field. To be honest, I have not yet come across one in an 856, but I would tend to see them more in an 024 field, and we actually recently had an example in Bibliographic Formats and Standards showing a DOI in field 024, even though we don't see them a whole lot within the 856. It seems to me, though, that because of the nature of a DOI being a potential link to the item itself, those would go in the WorldCat record. Bibliographic Formats and Standards has an element about DOI.

I find that 38X fields often won't validate because of an invalid value in the $2. I end up having to delete them to validate the record.

If something doesn't validate, please notify us. Validation updates happen often, and we usually need to update both the validation rules and the user interface to account for the change.

If there are invalid values in subfield $2, perhaps they are older generic codes from an earlier time period, as opposed to a more specific code based on terminology coming from the RDA Registry, because those codes have developed over time and our validation has to match that development as well. But I suppose that if a particular field has a subfield $2 with a generic code when the expected code for that 38X field is from some specific list within the RDA Registry, it would probably fail. Let us know about any one example; that's one of those cases where we can go look for more of them and potentially get them changed to a valid code.

I enter holdings in summary field, 852 $z and 599. Where does 852 $z information show up in Discovery?

Our group does not work directly with Discovery, but here is a link to the documentation. We encourage you to write to OCLC Support, and they will help you.

How is LBD data merged with WorldCat record data for export to a local ILS?

Record Manager interweaves it. Collection Manager has the two options.

Does OCLC routinely and systematically comb through WorldCat electronic resource records to remove institution-specific 856s?

When errors or issues like this are reported to us, we review them, and if it looks like something local that has been added to multiple records, we'll go ahead and fix those records. Then we usually go on and search for additional records and target those with URLs that require a login, where only somebody from that institution can log in with their credentials. Of course, we always want to be careful about any that we end up deleting or transforming in some other way. But if we get a report where one record was reported and we have another 5,000 that have the same kind of issue, we'll try to add logic to the macros that we use to possibly transform a URL to be a generic one for that provider, as opposed to an institution-specific one. Periodically we have gone back through ebook records to deal with this. There was a time period where we did a lot of field transfer that wasn't really intended, and we picked up a lot of institution-specific URLs that we needed to get rid of. We could probably do that over and over again to help clean them up. If you see a problem like that, where the same kind of institution-specific URL appears across a whole set of records, let us know and we'll try to get rid of it.

Should we decide to use our institution public interface, even if we are subscribing to WorldCat Discovery, would we be able to extract our local data for our users?

These local notes appear in the modernized view of Discovery in the item details. They can appear before or after the WorldCat notes. Libraries can use their WorldShare sign-in information. This is a link to a specific program that talks about this configuration option.

When you find an OCLC record with local information, should you report it? And if so, to whom?

Yes, that can be reported to us. And if it's more of a general question, not about a specific record but about local information generally, then send it to bibchange. We would ask, though, that if it is local information from another institution and you are confident, you can delete it if it makes the record better. But maybe shoot us an email anyway and let us know, so we can look and make sure there aren't additional records with the same issue.

We use Collection Manager and keep getting ILL requests for articles which are published outside of the years we own. Is that because they are patron-initiated requests or the ILL staff at the borrowing institution may not have time to look up which years we own?

That’s a very good question but outside of the expertise of this group so we need to have you contact OCLC Support and they will get the right group to assist you.

Do English and Spanish subject headings have to be listed in groups, all of one type together (all 650 _0) and then the other (all 650 _7):

650 0

650 0

650 0

650 7

650 7

650 7

or can they alternate...?

650 0  English subject heading

650 7  Spanish equivalent

650 0  another English subject heading

650 7  corresponding Spanish equivalent

They will generally sort by the type of subject heading, so that you can see the entire set of headings assigned according to a particular scheme grouped together. I realize that for some libraries this is a little more problematic, because in a bilingual setting you may be trying to duplicate headings in English and Spanish, and you want to see them paired up. But eventually, when other processes get to those records, they'll be sorted by indicator, just as with a regular reformat in Connexion. You may be able to input them paired up, but they won't necessarily stay that way. If you are using a local system and you're exporting the WorldCat record, the export will maintain the order that they're in. When the record first arrives, it's no doubt in the order it was in when it was sent to us, but once it's in the database and subject to other processing that we do, things can get sorted around. Discovery doesn't necessarily follow the MARC tags, so the subjects may end up in a different order anyway, regardless of what order you see within Record Manager.
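As a rough illustration of that sorting behavior, here is a hypothetical Python sketch (the tuple representation of fields is invented for illustration, not a real MARC structure). A stable sort by second indicator regroups alternating 650 fields by scheme while preserving the original order within each scheme:

```python
# Hypothetical sketch of the regrouping described above: subject fields
# sort by scheme (here, the 650 second indicator), and Python's stable
# sort keeps the original order within each scheme group.

def sort_subjects(fields):
    """Sort (tag, ind2, heading) tuples by second indicator."""
    return sorted(fields, key=lambda field: field[1])

fields = [
    ("650", "0", "English subject heading"),
    ("650", "7", "Spanish equivalent"),
    ("650", "0", "another English subject heading"),
    ("650", "7", "corresponding Spanish equivalent"),
]

for tag, ind2, heading in sort_subjects(fields):
    print(tag, ind2, heading)
```

So alternating input is accepted, but a reformat regroups it; a local export simply keeps whatever order the record happens to be in at that moment.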

May 2021: All things authorities

May 11, 2021

Do non-WMS customers, but those who are OCLC members and Connexion subscribers have free access to WMS record manager? If so, will you provide info about setting up record manager access? Thank you!

Yes, if you are an OCLC member and Connexion subscriber, you are able to access Record Manager for free. To do this, go to WorldShare Record Manager Ordering and fill in the form to request access.

When we talk about matching records, if I alter a record to place a statement of responsibility in the $c subfield of the 245 tag in my local record, when reclamation projects or matching projects occur between OCLC and my institution's records, will the system find the match record less the statement of responsibility that I inserted? Everything else being equal but that change, will the records still be the same as the 035 will be the same and mostly everything else except the change in the statement of responsibility that I made?

I ask this question because I am relabeling (classifying) some of the K books in my library (thousands). I am putting new call numbers on them, as they did not have any call numbers but only a K and the item book number. Most of them were done when DLC was not putting the statement of responsibility for the 1XX creator into the $c subfield; everything else was there (the prefacer, translator, etc.), but not the creator. Since RDA emphasizes that a statement of responsibility is core, it must be transcribed as it is on the preferred source. So I have been adding classification numbers and statements of responsibility. But I might stop adding the statements of responsibility and let the records stay as they were cataloged, since the creator does show up: the system was configured to insert the 1XX field so it looks as if a statement of responsibility is there.

If you need to alter a statement of responsibility in a WorldCat record, in general this can only be accomplished by editing the record manually or by reporting the correction to us.

In certain circumstances, it's possible that the record will be updated through a record replace process depending on how the project is profiled. If it's set to 'replace own' records and another library has not modified the record, i.e. no library symbols in field 040 subfield $d, the record will be replaced with the modified version that was sent as long as the record matches.

If the statement of responsibility is correctly subfielded (i.e. $c), it should not affect matching when the record is matched via a batch process. The presence/absence of a statement of responsibility alone does not affect matching.
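To illustrate why the statement of responsibility doesn't affect matching, here is a hedged sketch, not OCLC's actual matching algorithm: if a match key is built from the title subfields and deliberately skips $c, two records that differ only in the statement of responsibility produce identical keys. The key-building function and sample data below are invented for illustration:

```python
# Illustrative only: build a normalized title key from 245 subfields,
# deliberately skipping $c (statement of responsibility). Two records
# that differ only in $c yield identical keys and would still match.

def title_key(subfields):
    """subfields: list of (code, value) pairs from field 245."""
    parts = [value for code, value in subfields if code in ("a", "b")]
    return " ".join(parts).lower().rstrip(" /:;,.")

with_sor = [("a", "Example title :"), ("b", "a subtitle /"), ("c", "by Jane Author.")]
without_sor = [("a", "Example title :"), ("b", "a subtitle /")]

print(title_key(with_sor) == title_key(without_sor))  # → True
```

The key point is that this only works when the statement of responsibility is correctly subfielded in $c; text misplaced in $a or $b would change the key.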

Regarding URIs, would you mind repeating the recommended best practice for controlling uncontrolled LCSH/NAF headings (or not) in Connexion Client when $0 is present, when replacing the OCLC record?

For English language of cataloging, headings should be controlled to the Library of Congress Name Authority File (LC NAF) and the Library of Congress Subject Headings (LCSH). The $0 will disappear when the headings are controlled but when the records are exported, the headings will include the URIs.

Will the retrospective controlling match with authority records which should not be used? For example, in Canadiana there are many authority records with the mention in field 667 "CETTE NOTICE VERSÉE EN LOT DOIT ÊTRE ÉVALUÉE AVANT D'ÊTRE UTILISÉE, POUR S'ASSURER QU'ELLE RÉPONDE AUX NORMES DU PFAN" ("THIS RECORD SHOULD BE EVALUATED PRIOR TO USE TO ENSURE THAT IT MEETS THE PFAN STANDARDS").

The presence of the 667 field alone will not prevent the heading from being controlled with automated controlling. To prevent the heading from being controlled, it needs to have special coding in the fixed fields. For example, headings coded as undifferentiated names will not be controlled via automated controlling.

Will the remaining Library of Congress (LC) files, Library of Congress Medium of Performance Terms for Music (LCMPT) and Library of Congress Demographic Group Terms (LCDGT), ever be searchable from Connexion? Even without any control feature, it would be a real time saver to be able to search these directly without going out to an additional source.

We are aware of the benefits of adding the option to search these vocabularies from Connexion and Record Manager. It is on Metadata Quality’s list of requested files to be added but currently OCLC has no plans to add these.

I have two questions. 1) Are there any plans to expand the Metadata API to allow retrieval of authorities in addition to bibliographic records? 2) Are there any plans to deliver authority records alongside bibs with Collection Manager's MARC delivery?

No, not at this time, but we'll consider it. You are welcome to request enhancements using the OCLC Community Center's Enhancement Suggestions, narrowing the topic to Collection Manager. Many of the enhancements that have been developed in the past have come from these member-requested enhancements. If you do not already have access to the OCLC Community Center and have a cataloging subscription, you will need to request access.

Could you please confirm which cataloging level subscription is required to contribute records to the LC/NACO authority file?

All Program for Cooperative Cataloging (PCC) level authorizations and roles come with the capability to edit and add records to the LC/NACO authority file. If you don't already have a PCC level authorization or role, then you are welcome to apply to join the PCC. PCC trains libraries, and when a library participates, that library will be given a PCC level authorization or role. You can request training at the PCC website. If you are currently a NACO library and need to adjust the authorizations that you already have, email us or fill in the webform.

For more information on the PCC's NACO, BIBCO, and CONSER programs, see the Program for Cooperative Cataloging website.

For more information on authorization levels and roles within Record Manager, see BFAS 5.2.1, General Guidelines.

If you have further questions, please email us.

Will all controlled vocabularies have the option to export the URIs? Right now some are missing: NTA records that are controlled no longer have the $0. In Record Manager the $0 goes away and is missing from export; when the heading has been controlled to the NTA record, the $0 will not show in the bibliographic record anymore.

We currently have a development ticket to add the $0 as an option when exporting NTA records. In general, it depends on the settings when exporting records.

There are currently no plans for adding the HTTP URIs for other vocabularies, although we can see that it would be useful. You should be able to see the subfield $0 with the related URI code/number, so while that is not a direct HTTP URI, if you were using software like MarcEdit, you could append that code to the vocabulary's base URL and it should get you the same result as a URI.
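As a rough sketch of that MarcEdit-style step, assuming the id.loc.gov URL pattern used by the Library of Congress Linked Data Service (the helper name and sample control number are illustrative; verify the pattern for each vocabulary before relying on it):

```python
# Assumption: LC authority URIs follow the id.loc.gov pattern
# http://id.loc.gov/authorities/<vocabulary>/<control number>.
# This helper and the sample numbers are hypothetical illustrations.

def control_number_to_uri(code, vocabulary="names"):
    """Turn a $0 control number like 'n 79021164' into an HTTP URI."""
    normalized = code.replace(" ", "")  # internal spaces are dropped in URIs
    return f"http://id.loc.gov/authorities/{vocabulary}/{normalized}"

print(control_number_to_uri("n 79021164"))
# → http://id.loc.gov/authorities/names/n79021164
```

In other words, the $0 control number carries the same identifying information as the HTTP URI; only the base URL needs to be prepended.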

Can you talk about controlling of unqualified names and automated controlling?

Unqualified names will not be controlled automatically, since they require a choice from the user. A cataloger must verify that that is the authority record that applies to the name.

If your cataloging language is English, can you use and control terms drawn from authority files whose language is not English?

When controlling descriptive headings, the language of cataloging has to match. When controlling subject headings, the language of cataloging does not have to match.

I heard FAST headings are not searchable by the public. What are FAST headings used for?

While FAST is not searchable using the cataloging interfaces, FAST headings can be searched in searchFAST and added manually to WorldCat records. OCLC also automatically adds them to records based on the Library of Congress Subject Headings (LCSH) that appear in the WorldCat records.

Is there any plan to create a way to download authority records for holdings synced in WorldCat en masse automatically?

Not at this time; however, you are welcome to request this as an enhancement using the OCLC Community Center's Enhancement Suggestions, narrowing the topic to Collection Manager.

When I make a new NAR I always go back and control all the records for that NAR. Are there any plans for OCLC to control these once the record comes out of distribution on a weekly or monthly basis?

Other than the retrospective that Nathan discussed in the presentation, there are no plans to do this on a regular basis for headings that can be controlled automatically. However, generally, when a bibliographic record is replaced or updated, the offline controlling service will be called to control all controllable headings.

We have an open issue for this in the hope that it will be an option in the future, but right now we are waiting on development time and resources to be able to implement it. We would love to see this happen, because it would go a long way toward helping the quality of controlled headings.

Does OCLC have a way to detect and report cases where a name authority is updated but its dependent name/title entries are not? e.g., via an interactive audit as the operator updates the name record?

No, we currently don't have that. You can always report cases to AuthFile, either using the form or by email.

Are all institution codes authorized for NACO work, or is that something done selectively?

No. Once you have joined the PCC and participated in the training, the PCC will send you information on how to contact OCLC to set up cataloging authorizations.

Could you please identify which value in the fixed field might prevent automated controlling: is it Leader/17 encoding level, 008/33 level of establishment, or another?

If Leader/14, Leader/15, and Leader/16 are all "b", that prevents automated controlling because the heading wouldn't be valid for use. Otherwise, we check 008/09, but you shouldn't attempt to use that code to prevent automated controlling. Instead, the best way is to use Leader/14-Leader/16.
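A minimal sketch of that check, assuming the three bytes in question are heading-use values coded "a" (appropriate) or "b" (not appropriate), as in the MARC authority fixed field; the fixed-field strings below are padded stand-ins, not real records:

```python
# Sketch only: if positions 14-16 of the fixed field are all "b", the
# heading is not valid for any use and automated controlling skips it.

def blocked_from_auto_controlling(fixed_field):
    """Return True when bytes 14-16 are all 'b'."""
    return fixed_field[14:17] == "bbb"

blocked = " " * 14 + "bbb" + " " * 23   # padded stand-in record
usable  = " " * 14 + "aab" + " " * 23

print(blocked_from_auto_controlling(blocked), blocked_from_auto_controlling(usable))
# → True False
```

Note that a record blocked this way differs from one merely carrying a 667 caution note, which, as stated above, does not by itself prevent controlling.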

When I change the 1XX in a name authority, why does it sometimes take days or weeks for the controlled headings linked to that NAR to update?

There are a variety of reasons, but primarily it depends on how busy our system is updating bibliographic records after a change to an authority file record. We are working on ways to improve the process. If something has not changed after a week, it's possible that it will not be changed; please report those cases so we can see if there is a problem we can look into.

What is the value of the subfield $0 URI code?

The value there is the code related to the text label in the subfield $a. If you go to a source like the Library of Congress Linked Data Service, all of the headings are associated with a URI, either as a URI code/number or as a full HTTP URI. One way to add this to a MARC record is manually, but we recommend not doing it manually; instead, use some sort of automation, such as controlling headings. There are other tools, such as MarcEdit, that allow you to add URIs to your records. The value of the subfield $0 is the URI code, which comes from a source such as the Library of Congress Linked Data Service.

Why are there so many records that are less than desirable?

There are a variety of reasons. We get records from thousands of sources. Some catalogers work directly within the WorldCat systems, Record Manager or Connexion, to do their cataloging, but a lot of catalogers work within their local institution's ILS and send us the records. We have several processes to match, merge, and clean up the records as they come in, but anything automated is never going to be 100% right. The other issue with some of these records is a matter of perspective. We get a lot of vendor records for items that have not been published yet but are in the works. These, for the most part, aren't very useful to catalogers and can create a lot of clutter when searching for other materials: the title might not be correct, you might have an all-caps situation, and some of the fields might be wrong. While not useful for catalogers, we've had several conversations with a variety of libraries indicating that those records are useful to acquisitions staff who need to attach an order record for the pre-publication work, because they've already planned on ordering it and are just waiting for it to come out. So it's a matter of fitness for purpose, depending on who the user is. Records that are not so great for cataloging might be just fine for another user. That's one of the trade-offs of working within an aggregated system like WorldCat: you get the benefits of catalogers updating and enhancing records on one end of the spectrum, and on the other end you get added noise as a cataloger while we try to help out these other situations, like acquisitions or ordering specialists.

What is your current/typical expectation for how long it takes a new/recently changed NAR to go in and out of distribution (i.e., be unlocked)?

As long as everything is going as it should, it takes 2-3 days depending on when it’s submitted. If it's updated today, then it should get to the Library of Congress (LC) tomorrow. LC would then send it back that night and it would return the next day.

Is there a workflow that you recommend to libraries that do not use OCLC WorldShare, but a local system (e.g. ALMA) to add their improvements/enrichments to WorldCat and not just their local system?  I'd love to hear a QC presentation about such workflows.

We would advocate that you do your cataloging in either Connexion or Record Manager and then export the records to your local system. That would allow you to improve and enrich records in WorldCat while you are doing your cataloging.

Depending on your local system’s requirements, you could possibly set up the TCP/IP connection which would connect directly to your system and add the record (including any changes made to it) instead of taking the extra steps of exporting the records and then importing them into your local system.

For information on exporting within Record Manager, see

For information on exporting within Connexion Client, see

For information on exporting within Connexion Browser, see

May 20, 2021

When is the new version of Connexion going to be released?

David Whitehair posted in OCLC-CAT in January that early adopter field testing was planned for May/June and that release was tentatively planned for July/August. On June 18, 2021, we are hosting a Cataloging Community Session where David Whitehair will give an update on what is included in Connexion 3.0.

Are there any plans to support controlling all those other thesauri in the new version of Connexion?

The data set within WorldCat is exactly the same whether you search within Connexion (current or new version) or Record Manager. There is different functionality between Connexion and Record Manager, but the set of data remains the same.

Will OCLC please obtain and load and provide access to Library of Congress Medium of Performance Terms for Music (LCMPT) and Library of Congress Demographic Group Terms (LCDGT) authorities? They can be downloaded from LC even though they aren't distributed through CDS.

In order to control headings, OCLC has to host the file or a copy of the data set within our systems. At this time OCLC does not have these authorities loaded into our system. While we would love to do this, we don't know when it will happen. Metadata Quality has requested it in the past. If you would like to let OCLC know that an authority file, such as LCMPT or LCDGT, is important to control, please go to the Cataloging Community Center and request that the authority file be added. The more these are requested by our members, the higher the priority assigned to the request.

Adam Schiff put in a request for LCDGT and LCMPT in the Cataloging Community Center during this webinar.

For Canadiana, aren't the headings in both English and French? Why does the language of cataloguing need to be French to access Canadiana records?

The Canadiana database is comprised of two different files: the Canadiana Subject Headings, which are in English, and the Canadiana Name Authority File, which is in French. While the Canadiana Subject Headings may be used on any record, no matter the language of cataloging, only records cataloged with French as the language of cataloging may apply and control headings from the Canadiana Name Authority File.

Are there plans to provide access to Medical Subject Headings (MeSH)?

OCLC provides access to MeSH using Record Manager but currently there are no plans to provide access to MeSH using Connexion.

Why aren't FAST headings controlled?

This is something Metadata Quality would like to do, but it is currently further down the list of things to implement. One of the things that happens as part of controlling is that headings are maintained when the authority file heading changes. When that happens in FAST right now, we have to go through the effort of changing all of the bibliographic records in a different way than is easily done with controlled headings.

This is another example of an authority file to request from the Cataloging Community Center.

Instead of using subfield $0 with a string, have you considered adding URIs in either subfield $0 or subfield $1?

We have considered that but did not implement it. Subfield $0 has been around longer than subfield $1, and in terms of linked data, subfield $1 is pretty important. Subfield $0 can contain a URI or an authority record control number, so we were already on the path of using subfield $0, even though we didn't display the authority record control number in the case of LC controlled headings. We have looked at the possibility of outputting either subfield $0 or subfield $1 through Collection Manager, although we haven't quite gotten there, so that you could get what you need for a local catalog while we maintain the internal mechanism already built around the authority record control number in subfield $0. That could then end up being the basis of what we output.

Will you start controlling 3XX attribute fields in bib records at some point in the future? There's a great need for this now.

This is on our wish list. We do not have a date for this but Metadata Quality has been asking for this enhancement.

The enrichment retrospective will be available in Connexion as well as Record Manager and Collection Manager?

Regardless of the interface that you use to access WorldCat, you will see updates. As the changes are made, you will see them applied to WorldCat as a whole. So, whether you are working in Connexion or Record Manager, you will see those updates. If you get updates through Collection Manager, you will see the updates there as well.

Will the control project flip terms found in fields 450 of authority records to the term in field 150?

Yes, if the heading in the bibliographic record matches a reference in an authority file record, the controlling service will flip the heading to match the form found in field 150 of the authority file. It is our hope that as we go through the records in WorldCat retrospectively, we will pick up headings that weren't controlled for some reason in the past and get them controlled to the authorized form.

From searchFAST, is there an easy way to insert a heading into Connexion?

At the moment the only way to insert a FAST heading into Connexion or Record Manager is to copy and paste the heading into the bibliographic record. You can also look into using assignFAST, which may be helpful when adding headings.

I have a problem controlling headings when there is a $e in the 100 or 700 fields in Record Manager; there is no problem when there is no $e, and no issue when I do the same in Connexion client.

If you can reproduce it regularly and it seems to be a system problem, then you can report it to us; we can look into the situation and, if needed, forward the problem on to the correct group of people to make sure it's not a functionality issue on our end.

Is there any way that I can export all of the name authority file records that my institution’s bibliographic records are linked to?

We currently do not have this capability. You are welcome to add this request to the enhancements in the Cataloging Community Center.

Do you want us to ask for an enhancement in the Community Center? If so, is there only a place for enhancements to Record Manager and not to Connexion? Where should we ask for it?

All enhancement requests may be submitted using the general Cataloging Community Center. You also have the ability to upvote an enhancement request that has already been submitted by another member library.

Perhaps OCLC should provide a dynamic link to this "wish list" you work from for the community, just so we all know which of our wishes may be in the works. Thanks!

The best way to see the set of enhancements that have been requested is to use the Cataloging Community Center site.

In other words, some of the upcoming record update batches are going to be huge.

That would be the case, depending on your settings for receiving record updates. That is one of the reasons we are giving plenty of notice before starting this project, so that libraries have time to make any needed changes to their settings and the effects on their workflows are minimized. We are planning on starting at the very highest OCN and working our way down to the lowest, and we expect this to take about three months to complete.

In French records, is it currently possible to control subject strings combining headings from both Canadiana and RVM, e.g., a name followed by a subject subdivision?

Not at this time. To do this, we would need a request in the Cataloging Community Center.

Often I need to change only the MARC tag (100 to 700, for example), but I have to uncontrol and re-control the whole heading. Can the function be changed so that changing a MARC tag within the same type (i.e., 100 to 700, or 611 to 111) does not require the uncontrol/recontrol routine?

This would be difficult to do, as we protect controlled headings from tag changes because of the various rules it would take to make sure the tag is correct. So, while it's possible, it is not likely at this time.

We are switching to creating original records directly in OCLC. Should we use Record Manager or Connexion? Currently we use Connexion for authorities and SOME cataloging.

Anyone with a full cataloging subscription has access to both Connexion and Record Manager, so which one you use depends on your institution’s policy. They each have different functionality: Connexion allows more bulk editing and customization, while Record Manager has other functionality such as enrichment and controlling to more authority files. Note that Connexion Client is only available for PC users and requires that you download the software onto your computer, while Record Manager is a web-based tool with no software to download.

When I use the generate authority macro in Connexion, my screen freezes right away.

We expect this macro to be fixed with Connexion 3.0.

Does anyone have problems using WMS with Edge? I find that it freezes up. I am in Record Manager when this happens.

This is a good question to send to OCLC Support. They can help you troubleshoot with Edge.

We currently use the Extract Metadata function but have found that it does not work because of a "security issue" on some (but not all) of our computers.  OCLC has been notified of the issue and, last we heard, were not going to fix this.  Is it possible the new version of Connexion will fix this?

We don’t know for sure as everyone’s set up is different. We encourage you to attend the Cataloging Community Session on June 18, 2021 for an update on Connexion 3.0.

Could you talk a little about the WorldShare bug that indicated the 006 field was too short, thus wouldn't allow you to edit and save a record.

This bug was fixed earlier this week so should not be a problem anymore.

June 2021: Evolution of a WorldCat record

June 8, 2021

Occasionally I find monograph records (ranging from sparse to pretty complete) in LC's catalog, but there is no record at all for the work in WorldCat yet. What is the workflow for DLC records to enter WorldCat and is there a particular action or record change at LC's end that triggers the export of that record to WorldCat?

The Library of Congress (LC) distributes most of the records they catalog and we receive weekly distributions from them; however, they do not distribute everything in their catalogs. If you are finding a record in LC's catalog, that does not necessarily mean we have received it, so if you want to know about a specific record, please send a question to OCLC Support.

Can we have a link to this documentation? Because I would prefer to read it.

We are hoping to have the documentation finished and posted later this summer to the OCLC Community Center and the OCLC Help website.

Is language of cataloging a factor in this processing? Are there any differences in the process on that basis?

The language of cataloging is taken into consideration in DataSync processing and in deduplicating WorldCat. For example, we want to match an English-language cataloged record to an English-language cataloged record and a German-language cataloged record to a German-language cataloged record, but not match an English-cataloged record to a German-cataloged record. Beyond that, there really is no difference in the processing: we compare the various elements of the records in the same way regardless of the language of cataloging. Taking field 300 as an example, we normally look at things like the coding of the record to determine format, and then at the extent in terms of the numbers that are present, rather than actually comparing the terms, since different terms are used in different languages to indicate the same thing.

DataSync: from my understanding of it only "record matching" is available at this time. Will "number matching" be available again at some point in the future?

Right now, we are in the process of doing some testing to incorporate number matching back into DataSync matching. We do not have a definite date or timeline when that will be installed, but it may be as soon as in the next month or so.

Are the OCLC numbers in the 019 Field entered chronologically? I wonder how they are ordered in the one Field.

As records are merged the OCLC record numbers are added to the 019 field in numerical order, regardless of when a particular merge took place in relation to any other OCLC record numbers in the 019.
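The ordering rule can be sketched in a few lines. This is a hypothetical illustration (the list-of-integers data model is invented, not OCLC's internal format): each merged-away OCN is inserted so the 019 values stay in numerical order, regardless of merge chronology.

```python
# Hypothetical sketch: OCNs of merged-away records are kept in the 019
# field in numerical order, regardless of when each merge happened.

def add_merged_ocn(field_019, merged_ocn):
    """Add a merged record's OCN to 019, keeping numerical order."""
    return sorted(set(field_019) | {merged_ocn})

# A later merge involving a *lower* OCN still sorts to the front:
f019 = add_merged_ocn([445566, 778899], 112233)
print(f019)  # [112233, 445566, 778899]
```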

Does this documentation revolve around Data Sync? (Trying to figure out the context.) Does the same process described also apply when vendors submit e-resource records to Collection Manager?

The documentation will cover DataSync as well as many other processes. For example, Collection Manager and Duplicate Detection and Resolution (DDR) are two of the wide array of other processes that will be covered in the new documentation.

Do you have any rough estimate on the percentage or proportion of WorldCat records that are added manually via Connexion or Record Manager versus through a batch or other process?

This comparison between manual and automated processes to add new records is not something that is currently tracked; however, the vast majority of new records added are contributed by automated batch processes.

Does the process recognize hybrid records, i.e., records with evidence of different language-of-cataloging practices? How is merging done in that case?

The process looks at the 040 $b for the language of cataloging code only in determining the language of cataloging of a record. If a record were a hybrid where the 040 $b code did not match the language of cataloging in the description of the record, then there would be the possibility that the record could be merged into a record with the same language of cataloging code and fields in the different language could end up getting transferred.
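The 040 $b gate described above can be sketched as follows. The record structure is an assumption for illustration, as is the default of "eng" for records lacking a $b; this is not OCLC's implementation.

```python
# Illustrative sketch: two records are merge candidates only when their
# 040 $b language-of-cataloging codes agree. Record structure is invented.

def language_of_cataloging(record):
    # MARC language code from 040 $b; assume "eng" when $b is absent.
    return record.get("040", {}).get("b", "eng")

def same_language_of_cataloging(rec_a, rec_b):
    return language_of_cataloging(rec_a) == language_of_cataloging(rec_b)

eng_rec = {"040": {"a": "ABC", "b": "eng"}}
ger_rec = {"040": {"a": "XYZ", "b": "ger"}}
print(same_language_of_cataloging(eng_rec, ger_rec))  # False
```

Note that the gate only sees the code in $b, which is exactly why a hybrid record with a mismatched $b can end up merged into the "wrong" language pool.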

Is there any move to make the matching algorithms between DataSync and DDR more uniform? Frequently DataSync will input my records as new (even if they contain OCLC numbers), but then these new records are merged later on with pre-existing records via DDR.

With the updates coming to the DataSync matching that was mentioned earlier, the match rates for records coming in via DataSync should improve. This is not necessarily to make DataSync and DDR matching algorithms more uniform, but it will improve the match rates in DataSync processing.

Is every number in a 019 field represented by an institution code in 040 $d?

Not necessarily, if no information was transferred during a merge, then the symbols present in the 040 will not transfer to the retained record's 040, only the OCLC record number will be added to the 019 field. If data does transfer during the merge, then the symbols will be added to the 040 unless already present.
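The rule above can be expressed as a small sketch (the data model is invented for illustration, not OCLC's internal merge code): symbols transfer only when field data transferred, and only if not already present.

```python
# Sketch: during a merge, the merged-away record's 040 symbols transfer
# only when data actually transferred, and duplicates are skipped.
# (The OCN itself always goes to the 019 field regardless.)

def transfer_symbols(retained_040d, merged_040d, data_transferred):
    if not data_transferred:
        return list(retained_040d)  # 040 left unchanged
    return list(retained_040d) + [s for s in merged_040d
                                  if s not in retained_040d]

print(transfer_symbols(["OCLCF", "ABC"], ["ABC", "XYZ"], True))
# ['OCLCF', 'ABC', 'XYZ']
print(transfer_symbols(["OCLCF", "ABC"], ["ABC", "XYZ"], False))
# ['OCLCF', 'ABC']
```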

I see a number of PCC records that use the $v Juvenile ... with annotated children's headings, which is against the guidelines for annotated children's headings. Is there anything I can do to these other than reporting them as an error?

If you are not a PCC library, so that you're not able to change those headings to their correct form on your own, I would say go ahead and report them and also note that there are others in the database. We have been doing a little bit of work with juvenile subject headings of late and noticing cases where juvenile subject headings include the subdivisions Juvenile fiction or Juvenile literature when they should not, as well as related LCSH headings that need to be adjusted, so go ahead and report those errors.

MARC doesn't have a method of recording data provenance, does OCLC keep this information internally, e.g., who input a certain field?

No. Because the MARC format doesn't record data provenance, we do not keep that kind of information internally either. The only record of a change to an existing bibliographic record is the addition of a symbol to the 040 field. But once you have a lengthy 040 field with many $d's, and a lot of fields that have been added or changed in the record, there's no way to match up who did what. We do have a history of the record, so we can look back at certain points in time and compare before and after images to tell what happened, but not in the sense the question was asked: you cannot tell by looking at a field who added it, for example.

Is there a way to see the way a record looked prior to the most recent change? 

We do have an internal history tool that will allow us to look at records prior to a change transaction as long as it happened after April 2012.

I've seen a number of records for media lately that put personal name $a and $d information in separate sets of 700 fields. How would the processing handle such cases? Are problematic batch loads ever excluded from routine processing?

Unfortunately, I don't think the data prep processing I described can handle that. We do a lot of clean-up, but I don't believe that's something our processing can address. Definitely bring these to our attention so we can work with the database specialists for that particular project. We can communicate back to the institution so they can fix future records, and there is a possibility fields could be excluded from future processing if it's something they are not able to fix.

Would the number of 700s be a factor in choosing which record to retain?

I can say that it does figure into some decisions in part. The record retention hierarchy that we have in place starts off by looking at things like codes in field 042 and the source of the record, such as the Library of Congress versus an OCLC member. But once two records are in the same rank, we consider the number of fields present and the number of holdings in combination, so a record that appears to be more complete because it has more access points in field 700 could possibly win out in that comparison.
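That two-stage comparison can be sketched as a sort key. The ranks and the completeness formula below are invented for illustration; the actual hierarchy has many more levels and rules.

```python
# Simplified sketch of the retention comparison: rank by source first;
# within the same rank, the record with more fields plus holdings wins.
# Ranks and the completeness formula are illustrative assumptions.

RANK = {"pcc": 0, "lc": 1, "member": 2}  # lower rank = preferred

def retention_key(record):
    rank = RANK.get(record.get("source", "member"), 2)
    completeness = record.get("field_count", 0) + record.get("holdings", 0)
    return (rank, -completeness)  # more complete sorts first within a rank

def choose_retained(rec_a, rec_b):
    return min(rec_a, rec_b, key=retention_key)

a = {"source": "member", "field_count": 18, "holdings": 5}
b = {"source": "member", "field_count": 30, "holdings": 40}
print(choose_retained(a, b) is b)  # True: same rank, b is more complete
```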

How much of what you just described is automated?

All of it.

I work in a library that uses multiple OCLC symbols. If a record was created using the wrong authorization, so the wrong symbol, is there a way to correct this? Should we report these to OCLC?

Yes, that's something that we can correct for you. If you send a message to OCLC Support with the affected record numbers, if you have them, we can make the change for you if you would prefer that a particular symbol be associated with a record. That's something that we have to do; it's not something that our users are able to change in a record.

June 17, 2021

Will the documentation give a better idea of the various OCLC codes used in the 040 field?

No, we are not including that; however, you should be able to find information on the codes we use and the purpose of those codes in Bibliographic Formats and Standards (BFAS). This might be good information to include, so I'll bring that back to the team that's working on the documentation.

I've seen many fields transferred into merged records with $5 subfields.  It would be great if you could remove these fields or not transfer them.

If you could send a message with some examples, we're happy to look into it and see whether those fields are appropriate for the record or not. If they're not, of course, we will be happy to remove them.

Are very brief bib records from Book vendors still being loaded?

We have what we refer to as sparse record checks, so we do have criteria that records need to meet in order to be added to WorldCat. The records that you refer to are evidently meeting those criteria. I don't know if you have any specific examples that you're asking about; again, we're happy to take a look to see if there are some issues, but we do try to set criteria to make sure that we're getting records that have enough metadata so they can be searched, retrieved, and discovered.

When I am editing an OCLC record via Connexion Client, am I directly editing the WorldCat record that appears on, or is there some distance between the records as they appear in those two different places? Would you see that change more or less in real time, or is there an amount of time between that edit and its appearing on

You should be able to see the edits you make using Connexion Client in the display of the record after replacing the record and then refreshing the webpage. Additionally, there can be information from external sources displayed in the record display that does not exist in the WorldCat bibliographic record.

When is the new Connexion client going to be deployed?

We don't have an exact timeframe yet. The testing is happening right now with the field test. I would predict in the next few months, but I don't think we have a specific timeframe yet.

If you do a title search for "untitled" and limit to 2020 on, there are hundreds of brief records that appear to be from book vendors. How long do these stay in OCLC?

That's actually one of those things that's on our list to clean up periodically. So, it looks like we need to put that on our list to clean up sooner rather than later. We do try to take care of those and either merge them into duplicate records or delete them.

I have a totally unrelated question about indexing: Is it possible to "turn on" indexing for MARC field 655_4 in Connexion or Record Manager so that field can be used for keyword searching?

If our Searching WorldCat Indexes document is correct and if I'm reading it correctly, the 655 is indexed in the subject [su] index and in the keyword [kw] index, in addition to the specific indexes that take into consideration the 2nd indicator. So, a 655_4 is indexed and should be searchable in both the subject and keyword indexes. 

What is the usual time from adding a record, to time matching, replacing, determining if keeping a record, a decision is made?

We have several different flows that occur with our DDR duplicate detection program, so how a record is added to WorldCat affects when it will go through DDR processing. For example, records added via DataSync generally get into the DDR queue within several days and are processed within 48 hours. For a record added or modified online, it takes seven days to enter the DDR queue.

For the last week or so when I login into OCLC the message is very brief and says: Welcome to the OCLC Connexion™ service.  You will be using the service in Master mode, and then there are phone numbers listed. Is there something going on with OCLC Connexion right now?

The message of the day was decommissioned in March so a message of the day is no longer being posted. To elaborate, this is related to the new client version, which is in field-testing right now, where there will no longer be a message of the day.

How different will the new Connexion client be from the current program?

Except for the opening screen, which looks a little different, it looks very much the same as it always has. The changes are largely, although not exclusively, under the hood, if you will, behind the scenes. The idea was to upgrade the client to be better compatible with more modern, up-to-date Windows software and so on. So it won't look a lot different, but in theory it will behave better. Additionally, references to things such as GLIMR clustering and institution records that appeared in various places in the Connexion client have been removed. If I remember correctly, the help system has been moved to the OCLC website; when you access help, you'll be going to the OCLC website rather than within Connexion itself.

They are not currently indexed, and we would love them to be. Where could a request be made?

* Random feature request: Add a keyboard shortcut for setting holdings in Record Manager.  Thank you all for what you do!
* It would be very useful to us to have the 655_4 keyword searchable under "genre"
* We'd like to be able to limit to a local genre string, and this doesn't work. One cannot browse by genre Theses -- Biology and one cannot do a genre keyword search for Theses Biology. If you do a subject kw search on those words, you'd get too many extraneous records
* I should say the strings are subdivided: 655 _4 Theses $x Biology.  The browse ignores subdivisions

Record Manager enhancement requests can be made through the Cataloging & Metadata Community Center or by sending a message to OCLC Support.

How long does it take a NAR to get into VIAF?

Data from the LC/NACO authority file is sent each week, and the VIAF database is updated weekly as well. It depends on timing: if a change or addition was made early in the week, you might see it in VIAF the same week, but if it was made later in the week, the change might not be reflected in the VIAF database until the following week.

I did not realize until recently that the 505 field is optional/optional by input standard, and I wondered why. The 505 field is very important for resources like song albums, collections of children's tales, anthologies, etc.

The input standards of mandatory and optional and so on were originally set long ago by an advisory group of catalogers from member libraries and were partially based on the Library of Congress national-level bibliographic record. Fields such as the 505 contents note are, of course, appropriate in many cases and are extremely useful, as you point out, for sound recording song albums, collections of stories, anthologies, and so on. But the 505 is also not applicable to many things, such as a fictional novel, for example. Input standards aren't meant to indicate whether a field is worthwhile or not; they are meant to be floors, not ceilings, so if any optional field is appropriate and useful in a particular instance, then yes, include it.

Will there be new training on the Connexion Client?

We should have some info on this soon — stay tuned!

Remind us what GLIMR is?

GLIMR was an experiment that OCLC undertook some years ago to cluster bibliographic records together. It evolved into the clustering you now see on

Follow-up on client: will we still have access to our current text strings and macros?

Some of us did do some testing prior to the field test and all of our personal macros that we use were made available. I don't use text strings, but I believe both the text strings and macros will transfer over to the new version.

VIAF clustering is so weird sometimes; things get clustered or not clustered for the same authority records.

Sometimes it gets it right, sometimes it doesn't. VIAF clustering is determined by algorithms, which are not static; they are always being worked on and refined. So, if you see any VIAF clusters that are incorrect, please report them to us. We manually edit the clusters as needed and can work with the VIAF team to see if there's a bigger underlying issue when indicated.

When the new client is available, is it an update or a whole new download?

It will be a completely new download.

I have seen a lot of clusters that are not "right" and reported them but months have passed and nothing has been done.

We get many requests for editing the VIAF clusters, so we apologize for not getting those completed as quickly as we would want to. Definitely keep reporting them, we try to work on them as much as we can.

How is the consolidation and streamlining of Encoding levels going?  I have been seeing level 1 records and I thought those were for old LC copy that was input without much checking of the book.

We're currently working on the conversion of encoding level K records; that's the first encoding level we decided to tackle. I just checked today; we still have about 16 million records to go, so we have a little while yet to work on those. I think we were talking about moving on to encoding level M next, but that's a huge number and we might break it down somehow or other.

I have also had that experience, Ivan. Are there plans to open up editing privileges for VIAF?

At this point in time, I do not believe there's any discussion on that being developed for VIAF.

Will vendor records continue to be coded M and not at what level they were input as?

We actually are still having discussions on how we're going to evaluate incoming records and how to code them. We haven't decided yet so unfortunately, I have nothing more concrete to share. 

Are vendor partners allowed to use WorldShare Collection Manager?

Yes, if they pay for a cataloging subscription.

What does "VI" in VIAF stand for?

VIAF stands for Virtual International Authority File.

August 2021 - Medley of popular topics

August 10, 2021

Is there a way to validate for "precomposed" characters?

Institutions do not need to worry about precomposed characters in bibliographic records, as OCLC uses UTF-8. For authority records, institutions should make sure that characters are decomposed, since precomposed characters cause authority records to get stuck in distribution. Copying and pasting such characters from the web should be avoided; instead, use OCLC's diacritics and special character set when entering diacritics in authority records.

Also, BabelStone Unicode's "What Unicode character is this?" is an online tool that can help identify whether a character is precomposed or decomposed. Copy and paste a character into the tool to see the Unicode code points that are being used.

And a good tool for diacritics is Joel Hahn's Macros for the Connexion Client: CvtDiacritics.
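You can also check precomposed versus decomposed yourself with Python's standard unicodedata module: NFC normalization produces precomposed characters, NFD produces decomposed ones.

```python
# NFC = precomposed, NFD = decomposed. Inspect the code points directly.
import unicodedata

precomposed = "\u00e9"                                  # é as one code point
decomposed = unicodedata.normalize("NFD", precomposed)  # "e" + combining acute

print(len(precomposed))                   # 1
print(len(decomposed))                    # 2
print([hex(ord(c)) for c in decomposed])  # ['0x65', '0x301']

# Before pasting text into an authority record, decompose it:
def decompose(text):
    return unicodedata.normalize("NFD", text)
```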

When MARC records are machine loaded are "precomposed" characters changed to "decomposed" characters?

No, they are not. For authority records, there's no validation, and the load also doesn't automatically change them.

In Discovery, when search terms include "precomposed" or "decomposed" characters, does the search retrieve both kinds of characters? That is, are the search terms normalized in some way?

Normalization does take place for searching functionality and that extends to precomposed versus decomposed characters.
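One common way such normalization is implemented (illustrative only, not OCLC's actual algorithm) is to decompose, strip combining marks, and casefold, so precomposed and decomposed spellings of a search term retrieve the same results.

```python
# Illustrative search normalization: decompose (NFD), drop combining
# marks, then casefold. "Révolution" in either Unicode form and plain
# "revolution" all normalize to the same key.
import unicodedata

def normalize_for_search(text):
    decomposed = unicodedata.normalize("NFD", text)
    stripped = "".join(c for c in decomposed
                       if not unicodedata.combining(c))
    return stripped.casefold()

print(normalize_for_search("R\u00e9volution"))   # revolution (precomposed é)
print(normalize_for_search("Re\u0301volution"))  # revolution (decomposed é)
```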

I've noticed lately that [name] $v Correspondence doesn't control automatically. I get the pop-up window that says, "Select Modify Heading for a main entry to start building new heading". The fully controlled heading doesn't seem different from my original heading, but I have to go through that extra step to get the (apparently) same end result. Any idea why this is happening?

Correspondence is one of several special subdivisions that often have multiple possible subfield codes: you can have Correspondence as a subfield $t or as a subfield $v. With the goal of trying to be helpful, the controlling software proposes both options and lets the user choose which one is correct, rather than automatically controlling to the subfield code that was entered into the record.

Will FAST headings always be based on the LCSH that they are derived from? Is there any consideration for FAST headings deviating from the LCSH terms if the terms are offensive or just outdated?

FAST headings will change when the LC term changes. We did consider diverging from LC for certain terms, but it complicates processing to such an extent that we decided not to do so for the time being, with the expectation that many of the offensive and outdated terms will change in the next few years.

We've seen issues where fields are not paired/linked, even though they are "Model A": in Connexion (client) there are two "identical" fields in OCLC, one with vernacular + transliteration, but they aren't linked, and when they are exported there are issues with the 880 not being linked correctly to the field it is supposed to be paired with. Is there a way to ensure that the vernacular script fields *are* paired with the transliterated field?

The only way I know of to ensure these are paired correctly is to look at them manually and pair them manually. Our system does try to pair things up, but it doesn't always get it quite right, and you can edit that. So, if you're working in Connexion, you can pair and re-pair fields as needed to make sure the pairings are correct. Not all fields need to be paired: if there is a vernacular field without a transliterated field, you wouldn't pair it with anything.

I have noticed that I can't search with mathematical operators, such as the infinity symbol, in Record Manager; I get a system error. guidelines_and_requirements#Special_characters_in_Latin_script_searches

Is there a list of thesauri that are validated in OCLC bibliographic records?

If "validated" means controllable, then yes, there is a list available:

When updating an LCSH term, is it preferable to update the FAST or just delete them all?

It's preferred to just go ahead and delete them all; that way, they can be regenerated in about a month or so. It's certainly not required that you delete them all, but it is preferred.

Is it best not to strip FAST headings out of records loaded into our ILS if we are not comfortable with using them but they may be needed in the future? Should we just have them not display in our ILS and leave them as is, in case we decide it is best to keep them for future use?

If you're not comfortable with using them, you can remove them locally from your system, so they're not showing up in your local copy of the record.

If programming your local system not to display the FAST headings while leaving them in the records is an option with your own ILS, that's certainly a good option to consider for FAST or any sort of fields that you don't want to display to the public. If you decide 10 years down the road that you really wanted those headings but you've deleted them, you won't have them; but if you've kept them and just suppressed the display, you will have them for any future uses you want to make of them.

Has there been any consideration of not having Lawyers--fiction controlled to legal stories? Often the work has a character that is a lawyer and it is not a legal story.

Lots of people have considered it, but it would be up to the Library of Congress to make the decision to modify their authority records to allow that.

Are there guidelines on how to format diacritics in bib records correctly, or should I just always use the OCLC tool when adding diacritics anywhere in a bib record to be sure it is decomposed?

We do not have specific guidelines in place; both are allowed in bibliographic records. We prefer decomposed characters because they work better with the Connexion macros. Also, if you picked something up from a bibliographic record and wanted to create an authority record, since the authority record can't accept precomposed characters, it's better for the bibliographic record to be decomposed. So yes, it's better to use the OCLC tool for adding decomposed characters, or to add diacritics using the tools available in the cataloging interface.

August 19, 2021

Will NARs validate in OCLC with precomposed chars?

No, they will not. There is no validation check in either Connexion or Record Manager that will cause a validation error when a record has precomposed characters. The best option is to use the Connexion and Record Manager functionality to enter those diacritics: in Record Manager, there is a function to insert a diacritic, and in Connexion, you can enter diacritics under the Edit menu.

Other tools:

Regarding controlling subject headings: I couldn't get headings in OCLC #21969614 (Midnight's Children) to control. Islam--Relations--Hinduism--Fiction and Hinduism--Relations--Islam--Fiction refuse to control. Any idea why?

From the quick look I took just now, Islam--Relations--Hinduism and Hinduism--Relations--Islam appear to be valid headings. However, since there is no separately established subdivision Hinduism or Islam that would be controllable as subfield $x, those subfields are uncontrolled. So only Islam--Relations--Fiction and Hinduism--Relations--Fiction are controllable in those headings; the $x Hinduism and $x Islam subfields remain uncontrolled while the rest of the subfields are controlled.

How long does it take for a reported incorrect merge to be looked at?

Currently, our turnaround time is within a couple of days. Incorrect merges do take higher priority, but keep in mind that, depending on the complexity of the merge, it could take a little while to get them teased apart; usually, though, it's within a couple of days.

Does OCLC have any opinion on whether or not it prefers transliterations to be included for non-Latin characters? Or is it completely optional? Our library is thinking about whether we want to bother with Pinyin transliterations of Chinese characters.

It is optional. There is no requirement for you to enter transliterations, or to include the non-Latin characters. But if you are cataloging according to certain standards, you may want to follow whatever those particular standards say.

I know there is some thinking going on about what will happen with Bibframe, and the idea has been floated that there will be fewer fields that would routinely be transliterated within Bibframe. "Fields" might even be the wrong term in Bibframe; it would be elements within the Bibframe structure.

There are a lot of libraries thinking about this and what they want to do going forward. The reason, of course, that transliterations were entered on such a large scale initially, when people were doing online cataloging in MARC, is that the vast majority of local systems did not support non-Latin characters. The only way to enter data into your local system, or into WorldCat, was using Latin characters and transliteration. But that's changed, since OCLC now supports all of Unicode. There are a lot of options.

We were wondering whether DDR took detailed dates in the fixed fields (DtSt = e) into consideration when deciding whether to merge records. (We haven't been using that code and we have been supplying a devised edition statement for editions within the same year, but we were just curious about that detailed date and DDR.)

It does not take the detailed date into consideration, and we would recommend that if there's a way to differentiate different versions of a similar document, you supply an edition statement if there isn't one you can use. So what you're doing is a good practice.

Is OCLC going to fix the freezing problem with many of the macros, e.g., Add 33x or Generate Authority Record? It is aggravating to work on a record, use the macro, and have it freeze so that your only recourse is Ctrl+Alt+Delete to get out of the record and start all over again.

The next version of Connexion Client is in field test right now, and these are some of the issues they are working on resolving. Hopefully that will come out later this year and solve most of these problems, if not all of them.

There were three enhancement requests on August 10 regarding Unicode. Are those enhancements for both the bib record and authority record?

Here is one:

I assume that they would be for bibliographic records; that's what OCLC has control over. If you have concerns about the use of precomposed versus decomposed diacritics in the authority file, LC is the place to talk to about that for the LC/NACO Authority File, because it's their system that requires the use of decomposed diacritics.

How do you search for a particular NACO participant in Connexion?

They cannot be searched directly in Connexion but PCC maintains lists of participants on their NACO website:

Also, you can search for the MARC Organization Code in the MARC Code Lists for Organizations.

Then there is help here on searching OCLC Cataloging Products: ed_Fields/0Indexes_and_indexed_Fields_A_to_Z 

September 2021 - The complexities of field transfer

September 14, 2021

I have a question about Collection Manager and OverDrive books. Before becoming a WMS library, we loaded MARC records from a partner collection from OverDrive. Now how do we get these turned on and updated in Collection Manager without getting MARC records?

This question is better sent to OCLC Support. You can also check out the Collection Manager/Knowledge Base Virtual Office Hours and ask them there.

Does the field transfer protocol apply to records from other national libraries?

This is probably about the Data Sync matching, and yes, there are other libraries, like the National Library of Medicine and the Library of Congress, that fall into the national library category, as well as Program for Cooperative Cataloging projects. So yes, there are others beyond just the obvious national libraries that come to mind.

How often do you run DDR?

We're actually running DDR continuously, pretty much around the clock, and we have several different queues by which records get into DDR. This has come up in previous sessions, but for example, if significant changes are made to a record by a cataloger, that record would get into the DDR queue within a week. New records that come in through Data Sync generally get into the queue within several days. But we do run DDR 24 hours a day.

Any special policy regarding Juvenile materials ($v)?

I'm assuming this is about subject headings where you have a heading "Cats" with a subfield $v "Juvenile materials." In that particular scheme, if headings with that second indicator 1 value are not present in the retained record, then when the records are merged, those specific subject headings would transfer to the retained record.

If we notice ISBNs for a 3 volume title and a single volume title with the same content on one bib record, should we report that to be demerged?

Anytime you suspect that records have been incorrectly merged, you are welcome to submit those to us, and we would be happy to look into whether those ISBNs transferred during a possible incorrect merge and then undo it if appropriate.

For Data Sync, is there documentation of the conditions that cause a 'match' as opposed to 'create' or 'field transfer'? We've seen it when the retained record is missing 264 $a and $b, or when there is a bad character in a note on the retained record. Are there other situations?

When you see "Match," that means that was the only action taken on the record. "Create" obviously would be a new record. "Field transfer" would be when data did transfer to the WorldCat record. We do have some documentation that we've been working on that will explain in a little more detail how matching works, along with many other things such as field transfer. There is some information on Data Sync matching already available that may help answer the question. There are conditions that go into matching: we do a lot of comparison of points from the incoming record to the record in WorldCat. It's all based on the field transfer rules that are outlined in Chapter 5.

If we're trying to match a record that is coming in from Data Sync, and the database record has errors, that can be enough to make them not match--if they're even considered in the first place. We retrieve records and then compare the fields. If it's a case where one record is missing the publisher in 264 $b and the other record has one, that may not prevent a match completely, but it's taken into consideration. We have situations like that, such as comparing a published version of an item versus an unpublished version. The quality of the records that are already in the database can affect the matching of incoming records.

What is the status of the enhanced authority control project (i.e. adding equivalent subject headings in controlling)?

We're still working on making sure everything will work correctly when we run that controlling and enrichment retrospective. We're working on final details before we test the first set of records. It's not clear when the full runs will start, but hopefully soon.

September 23, 2021

If the retained record has one 650 _0 and the deleted record has three 650 _0, the extra two fields will not transfer to the retained record?

Yes, that is correct. If there are also LC Subject Headings or subject headings from another scheme already present in the retained record, those from the deleted record will not transfer.
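
That rule can be sketched as a toy function. This is an illustrative approximation, not OCLC's merge software: records are simplified to (tag, second indicator, heading) tuples, the subject scheme is approximated by the second indicator alone, and the function name is hypothetical.

```python
# Sketch of the subject-heading transfer rule described above: 6xx fields
# from the deleted record transfer only when the retained record has no
# headings from that scheme (scheme approximated here by second indicator,
# e.g. "0" = LCSH).

def transfer_subjects(retained, deleted):
    """Each record is a list of (tag, indicator2, heading) tuples."""
    retained_schemes = {ind2 for tag, ind2, _ in retained if tag.startswith("6")}
    merged = list(retained)
    for field in deleted:
        tag, ind2, _ = field
        if tag.startswith("6") and ind2 not in retained_schemes:
            merged.append(field)
    return merged

retained = [("650", "0", "Cats")]
deleted = [("650", "0", "Felidae"), ("650", "0", "Pets"), ("650", "7", "Cats.")]
result = transfer_subjects(retained, deleted)
assert result == [("650", "0", "Cats"), ("650", "7", "Cats.")]
```

Under this approximation, the two extra LCSH headings from the deleted record are blocked because the retained record already carries an LCSH heading, while the indicator-7 heading transfers because the retained record has none in that scheme.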

Can you send the link to field transfer for data sync processing?

Are these the same rules used by the Member Merge project participants?

These are the same rules that apply when records are manually merged.

When the institution code for a non-CONSER library appears in the 040 field of a CONSER record, is that due to a field transfer transaction?

Basically, yes. Even though there are a lot of limits on fields that will transfer to a CONSER record, a non-CONSER library couldn't make a change manually online that would cause its symbol to show up in the 040. So when you see a subfield $d with a non-CONSER symbol in a CONSER record, dating from after the point the record became authenticated in WorldCat, the only way it should have ended up there is through a field transfer situation.

Why are fields transferred that will not pass the validation process?

We don't actually validate fields at the point where they transfer; that would be an additional step, and it's just never happened that way. And yes, it's unfortunate to take a retained record that passes validation and then add a field that should improve the record, but now it doesn't pass validation because the field was incorrectly coded. For the kinds of fields that we routinely transfer--call numbers, subject headings, contents notes, that kind of thing--we usually try to clean them up as much as possible, but obviously we don't catch everything. We let a lot of records into WorldCat based on the value of being able to share libraries' holdings, and look the other way on some validation errors, but we try to go after them as much as possible. Maybe we should further discuss validating fields at the point of transfer.

If you come across such fields, you're also welcome to send those to us, to see if we're able to figure out what's wrong with them or find a way to fix them.

Why are records for audio books merged with the record for the print book?

Please report any such incorrect merges so that we may unmerge them if necessary. It could be that there was incorrect coding in either of the records that caused them to be merged. We do come across a lot of audiobooks that are on the Books format rather than Sound Recordings, where they should be. That is a problem and we do fix a lot of those, but it is possible things get by us in those cases. It's normally miscoding that causes that to happen, because our merge software is looking at the distinctions, not just "oh, they have the same title and publisher and date in common." It's really looking at the format to say "this is an audiobook, this is printed text," or "here is the online version," to keep all of those things separate. So when they do get merged, it's got to be the result of some incorrect coding.

Is the reasoning/logic provided for why certain fields are or are not transferred in the Data Sync document?

No, I don't believe we go into that kind of detail in that documentation--for the link that I sent--but I would say the reason that something wouldn't transfer would be similar to what I outlined as the overarching reasons why we don't transfer some fields. Any specific question about why something does or doesn't transfer can be sent to us.

If I have questions about authority records, can I send those to AskQC?

Authority questions can be sent to Authfile. If there's a question about whether records are duplicates, or whether something might be wrong with a record, you can go ahead and send those to that address. If it's a question that would better be answered by AskQC, it will be forwarded from Authfile to AskQC.

October 2021 - Getting a fix on fixed field elements

October 12, 2021

Could you speak more about adding language edition statements, please? Is this needed for records that have 041 and/or 546 fields?

Neither the 041 nor the 546 field, both having to do with language, plays a direct part in matching or in record resolution. In most cases, especially when titles differ from language to language (which is most of the time), you won't necessarily need a language edition statement.

But in cases where the title of a resource may be the same from language to language, that is, for instance, if the title happens to be a person’s proper name, a biography or something, where that wouldn’t change from language to language, it may be useful to include the language edition statement to help differentiate from one language of the resource to another language of the resource.

If other bibliographic elements, such as the publisher or the pagination, are different, it's probably less important to add a language edition statement. But if the title is the same from language to language, it may be useful.

Would you discuss DtSt = t ?

DtSt t is used when you have a date of publication and a date of copyright. That gets used a lot nowadays because, even when the copyright date and the publication date are the same, they are different RDA elements and are coded separately. So you could in some cases have either an actual publication date or an inferred publication date in Date 1, and the same date, representing the date of copyright, in Date 2.

That wouldn’t have happened under earlier instructions, but it does happen quite often under RDA. A publication date and a copyright date could be either the same, in which case you would use DtSt t, or different, in which case you would use DtSt t as well. And just to remember: under RDA, a copyright date cannot be used by itself as a publication date. It can be used only as an inferred publication date, if you have no other evidence of publication.
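
The coding practice described here can be sketched in a few lines. This is a simplified illustration of the stated rule only, not a complete treatment of the DtSt hierarchy (reprints, date ranges, and questionable dates are ignored), and the function name and inputs are hypothetical.

```python
# Sketch of DtSt/Date 1/Date 2 assignment per the RDA practice described:
# publication date + copyright date -> "t", even when the years match;
# a copyright date alone serves as an inferred publication date.

def dtst_dates(pub_year=None, copyright_year=None):
    if pub_year and copyright_year:
        return ("t", pub_year, copyright_year)
    if pub_year:
        return ("s", pub_year, "    ")   # single known date, Date 2 blank
    if copyright_year:
        # Copyright date used as an inferred publication date: it is still
        # both the publication date (Date 1) and the copyright date (Date 2).
        return ("t", copyright_year, copyright_year)
    return ("n", "uuuu", "uuuu")         # dates unknown

assert dtst_dates("2021", "2021") == ("t", "2021", "2021")
assert dtst_dates(copyright_year="2005") == ("t", "2005", "2005")
```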

Can the Date Entered and Date Replaced indexes be searched with ranges, e.g., "20211001-20211012"?

Yes. You may use a range of dates to search those indexes.

Would you address when the Target Audience code for adult should be used? Every now and then, I see it and it surprises me. But it is rare.

It is kind of general, and isn’t necessarily helpful unless the publisher emphasizes that something is meant for adults as opposed to children. For instance, if there’s an adult edition and a children’s edition, you may want to include the target audience code for the adult version. But in many cases, because target audience is optional, you don’t need to include that it’s for adults.

Could you elaborate on the GPub and coding for state university press publications?

This coding is confusing, because it has changed over the years. The current practice is to not code for government publication if the publisher is a state university press, and this is stated in Bibliographic Formats and Standards, in the fixed field government publication page, under academic publications, where we say, “Publications of state college or state university presses in the United States (e.g., Kent State University Press, Michigan State College Press) are not considered to be government publications.”

There is actually an exception to that. The exception would be things like college catalogs or directories, or things like that. So those are actual publications of the state. 

Is that preferable, then, to use DtSt = t, even if the date is the same (or if only an inferred publication date is available?)


If for a library illustrative content (008/18-21; 006/01-04) has a lot of importance and will be starting to use WMS soon, would you recommend them to continue to enhance illustrative content in this shared environment?

You are welcome to code those elements. We did not cover those elements today—we will be covering them in the 2022 presentation on the fixed field elements that are not used in all the formats. But certainly, you are most welcome to code them, and if you see records in WorldCat that don’t have these codes and ought to, please go ahead and add them to the records and replace them.

Can any OCLC member add an incomplete record (example, title, author, imprint only) to WorldCat and assign value 7 to leader 17?

Yes. This would be the same thing as adding an encoding level K record in the past—encoding level 7 means minimal. Now, if you have a really incomplete record—only title, author, and imprint, without the paging—you would have what we call an abbreviated record, which is encoding level 3, and you are welcome to add those as well.

If my ILS sent an invalid 008 "Date entered on file" to OCLC, there isn't a way I can fix it in OCLC because that field isn't editable, correct?

If you send a file of records to our Data Sync with the date entered being a future date, Data Sync will reject those as invalid. Any past dates are accepted, but if you sent something coded, for example, with 2023 in the data, our system would reject it.
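
A minimal sketch of that check, assuming only what is stated above plus the standard 008 layout (the "Date entered on file" occupies 008 positions 00-05 as YYMMDD): a future date is rejected, past dates are accepted. The function name is hypothetical.

```python
# Sketch of the Data Sync validity check described: 008/00-05 holds the
# "Date entered on file" as YYMMDD; a future date makes the record invalid.
from datetime import date, datetime

def date_entered_is_valid(field_008, today=None):
    today = today or date.today()
    try:
        entered = datetime.strptime(field_008[:6], "%y%m%d").date()
    except ValueError:
        return False          # not a parseable YYMMDD value
    return entered <= today

# "230101..." parses as 2023-01-01; against an October 2021 processing date
# it is a future date and would be rejected.
assert date_entered_is_valid("230101s2023", today=date(2021, 10, 12)) is False
assert date_entered_is_valid("211001s2021", today=date(2021, 10, 12)) is True
```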

Sorry to loop back, but under what circumstances would it be more appropriate to use DtSt = s?

Nowadays, under RDA, you would use s as a DtSt code much less frequently than you used to. Unpublished items that have a single date of execution would have DtSt s. If you’re cataloging something older that may not have a copyright date, for instance, or that doesn’t state a publication date outright so that you have to infer one, it’s possible you could use DtSt code s. But it’s much less frequently used now than it was under AACR2.

Speaking of which, when will the mnemonic codes in Encoding Level become invalid?

This is referring, I believe, to the alphabetic codes, I and K in particular, the OCLC-defined codes. I is equivalent to blank, and K is equivalent to 7 in the official MARC code list.

We don’t have a projected date as to when they will become invalid. We are currently working on converting the level K records to the official MARC codes in WorldCat. We have a long way to go, though. We’re going to have to convert all the Is, and then we’re going to have to convert all the Ms, so we aren’t sure yet.

We will give plenty of notice before we do this. Meanwhile, the official MARC codes are available now, for everyone to use, when they are entering records, so if you want to switch to using blank or 7, or whatever your coding requires for the record you’re entering, please feel free to go ahead and do that now, to get used to using those codes.
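
As a rough sketch of that conversion (hypothetical code, not OCLC's actual process): the encoding level lives in Leader position 17, and the OCLC-defined alphabetic codes map to the official MARC values, I to blank (full) and K to 7 (minimal).

```python
# Sketch of the I/K -> official MARC encoding-level conversion described:
# Leader/17 holds the encoding level; OCLC-defined "I" maps to blank (full)
# and "K" maps to "7" (minimal). Other codes are left untouched.
OCLC_TO_MARC_ELVL = {"I": " ", "K": "7"}

def convert_leader(leader):
    elvl = leader[17]
    new = OCLC_TO_MARC_ELVL.get(elvl, elvl)
    return leader[:17] + new + leader[18:]

leader = "00000cam a2200000Ki 4500"   # position 17 is "K"
assert convert_leader(leader)[17] == "7"
```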

If a music score is a collection of folk songs with lyrics in non-Latin language, the language code still would be zxx?

No. If this is one single non-Latin language, that’s the language code that ought to go in the fixed field. If it is multiple languages—and if it’s a collection, you might have multiple languages—you would put mul, and then you code the individual languages in the 041.

The zxx code, for no linguistic content, in the case of sound recordings would be used when the music is entirely instrumental and has no words associated with it. If there are accompanying materials that need to be coded—program notes, things like that—those would be coded in field 041. But when the main content has no language content, such as purely instrumental music, that’s when you use zxx.
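
The decision described in these two answers can be summarized in a tiny hypothetical helper: one language gets its own code in the language fixed field, a multilingual collection gets mul (with the individual languages coded in 041), and purely instrumental music with no words gets zxx.

```python
# Sketch of the language fixed field decision described above.
def fixed_field_language(languages, instrumental_only=False):
    if instrumental_only:
        return "zxx"          # no linguistic content, e.g. instrumental music
    if len(languages) == 1:
        return languages[0]   # the single language's own code
    return "mul"              # multiple languages; details go in the 041

assert fixed_field_language(["rus"]) == "rus"         # songs in one language
assert fixed_field_language(["rus", "ukr"]) == "mul"  # multilingual collection
assert fixed_field_language([], instrumental_only=True) == "zxx"
```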

For the 264 #4 field, if the copyright date is noted with the word copyright instead of the symbol, can we use the symbol in 264 $4, or should it be the word copyright and the year?  e.g. 264 #4 $c copyright 1881

RDA allows you to do either way. You can spell out copyright if that’s how it’s presented, or you can use the copyright symbol, the C surrounded by a circle. Either way is fine as far as RDA is concerned.

Did I hear correctly that if the Type code for a record is incorrect, it can be changed without having to input a new record?

Yes, that is correct. You may change the Type code. If for some reason the system prevents you from changing the Type code when it’s incorrect, feel free to send us a message, use the request correction function in Record Manager, or report error in Connexion, and this will go into the bibchange inbox and be dealt with as swiftly as we possibly can.

Now, there are instances where a cataloger may legitimately decide that one type is more important than the other—for example, a printed book that’s accompanied by an audio recording of someone reading the book could also be considered an audio recording accompanied by the printed book. And so there may be legitimate differences in how people decide to code something, in which case there may be two records in WorldCat. But if indeed there is a mistake, for example, if a map is coded as a book, feel free to change it and/or report it to us so we can change it.

For a map sheet with two main maps that each have a different projection type, polyconic and Lambert, for example, how would you choose which to record in the projection fixed field?

You could code one in the projection fixed field, and then also make a note in the record that there are two different projections.

October 21, 2021

What determines whether a LDR/06 Type change is allowed? I would expect not to be able to do it on PCC records, but on others sometimes I can and sometimes I can't, and it hasn't been immediately clear to me why or why not.

Editing the Type and BLvl fixed fields depends on the role or authorization that you have. With a full-level authorization, you can always edit your local copy, and in bibliographic records coded as less than full, you're able to change those. If it's a record that you input yourself, you're also able to change those. But authenticated records you would not be able to change; those you can submit to us, and the bibchange staff would be able to help.

It really is about the fullness. Depending on the encoding level, if it is a full-level record, then you probably won’t be able to change it, but if it’s less than full, then you can. And if your library created the record and you’re the only one with holdings on it, you can also change the Type code.

If DtSt is t and uses the same year for copyright and pub. date, but a previous date exists in a note, ie. for a reprint, should it be changed?

Not necessarily. Nowadays, under RDA, it’s quite common to have the date of publication and the copyright date be the same, so that DtSt is coded t and both Date 1 and Date 2 are the same, or they look the same. That’s because under RDA, the date of publication and the copyright date are entirely different elements. That was not the case in AACR2 and other earlier descriptive instructions. So if you have a date of publication, even if you are deriving or inferring that date of publication from a copyright date, and the publication date and the copyright date are the same, you use DtSt t and the same date in both Date 1 and Date 2.

I’m not sure exactly what you mean by a previous date existing in a note. In most cases, if not all, even under RDA, it will be the most recent copyright date that is considered to be the date of copyright. In any case, having DtSt t and the same date in both Date 1 and Date 2 is quite common these days under RDA.

A reproduction date, or the fact that something had been previously published, may or may not factor into the date of publication and the DtSt coding. You’d have to check the hierarchy of DtSt codes. I don’t remember offhand if t is higher than either of the reprint or other DtSt reproduction codes. I think it is higher, so DtSt code t would take precedence.

Related to Date Entered and Date Replaced - is it possible to see a previous version of a bib record (for a specific date or a date range)?

No. I know that’s a pretty brief answer, but within WorldCat, you are always looking at the current version of the record when you’re in a cataloging interface. We do have a file that is internal to OCLC called Journal History, so if there is a reason for us to look up a previous version of a record, we can do that here, and that helps us know what happened in the past for a record, but that is only available internally at OCLC right now.

So, for Russian records we no longer need to code MRec?

My interpretation is that those codes about romanization and such applied only to transcription from cards. That’s certainly my reading of it. But no, you do not need to code that if you are not transcribing from a card.

Using Connexion client, I occasionally find that somehow I have changed the Rec stat, even though that field can not be edited. Do you have any idea how this happens? When it does happen, I cannot update/ create the record.

That one’s a mystery to me! Perhaps you could provide an example and send it to bibchange, and we could take a look to see what’s happening behind the scenes.

To make sure I understood: going forward, should I be coding the encoding level blank for full-level records?

Yes. That would be great. That is definitely what we would like you to do. It is not required at this point—you can still use code I, but we would prefer that you use the blank, because at some point in the future, we will be changing those I-level records into blank, and converting all the records in the database that have level I to the correct, official MARC code.

For RDA records where the copyright date and the publication date match, should you use t and enter both dates, or just the s with the one date? I'm seeing this all over the place in OCLC.
  • Under RDA, different types of dates (including copyright date, date of capture, date of distribution, date of manufacture, date of production, date of publication) are each independent data elements, even if they are related or look similar.
  • Date of Publication is a Core Element in RDA 2.8.6, whereas Copyright Date is not designated as a Core Element in RDA 2.11; hence, copyright date may be omitted (264/4 subfield $c).
  • PCC practice, in both the BSR note on RDA 2.11 and the RDA LC-PCC PS 2.11, recommends that copyright date be recorded for rare materials.
  • If a cataloger has chosen not to record a copyright date, that date does not come into consideration when determining the coding of DtSt.
  • If a cataloger has chosen to record a copyright date (264/4 subfield $c) and the consideration of available dates points to the assignment of DtSt code “t” (“Publication date and copyright date”), proper practice would call for recording the publication date (whether actual or inferred) in Date 1 and the copyright date in Date 2, even in cases where those dates are the same.

It seems redundant to have the language coded in the 008 field, and in the 040 and 546 fields as well. Which of these controls how our local ILS displays the language of the material? Thank you.

I would say it is probably the code in the 008 field that controls your local ILS display, but you would have to verify that with your local ILS documentation, because we don’t really have control over that.

I would also say that the code in the 040 subfield b and the code in the 008 language fixed field are totally different meanings. The language in the 008 fixed field is for the language of the material you are cataloging. The language in the 040 subfield b is the language of cataloging. In other words, are you cataloging in English, are you cataloging in French? What language are your notes in? If you’re cataloging in French, your notes will be in French, even if the material is English, which would then be the code in the 008 field.

Now, the 546 field is about the language of the resource, rather than the language of the cataloging, and that’s usually used when there’s something complicated that you want to explain, or some sort of translation involved. Usually, you don’t enter a 546 field, if it’s very clearly just one language for the resource.

So if I have a 264 _1 and a 264 _4, is that an example of when a DtSt code "t" should be used?

Yes. When you have DtSt coded t, you usually see the 264 second indicator 1 for publication information, and 264 second indicator 4 for copyright. You usually do see that.

It’s really hard to generalize, because there is a hierarchy of DtSt codes, both in MARC 21 and in Bibliographic Formats and Standards. So in the case of reprints—certain kinds of reprints, anyway, as explained in much more detail under DtSt in BFAS—when you have the date of the reprint in hand and the date of the original, if it’s presented as such, DtSt code r may take precedence over DtSt code t. So just because there are two 264 fields, one with a date of publication and one with a date of copyright, that doesn’t necessarily mean you will code DtSt t.

What do you mean by "used for matching"?

When we say something is used for matching, there are two times when matching comes into play; this is machine matching we’re talking about. First, when libraries load records into OCLC by Data Sync or some other batch process, the machine is using certain elements for matching. And second, when we are trying to merge duplicate records by DDR, the duplicate detection and resolution software, that’s a machine, or software program, using certain elements for matching those records. So that’s what we mean by used for matching: it’s a machine process.

Also, is there a reason the policy was changed to use "t" for the same publication and copyright date?

It wasn’t a change of policy per se. It was the evolution from AACR2 to RDA. In RDA, different kinds of information, even though they may be related, or may even be the same in some cases, different kinds of information are most often separate elements. So unlike under AACR2, under RDA, the date of publication is one kind of date element, and a copyright date is another kind of date element, and so those are considered to be related but separate elements, or pieces of data, and so that is why you’re seeing DtSt coded t and the same date in date 1 and date 2 in so many cases now, under RDA.

So should MRec be made obsolete?

That’s a really good question! When it was coded in the past, for transcription from cards, it was important, and so those records that were converted from cards still have those coded elements, and that may be important at some point. So I’m not sure we want to make the element obsolete, because of that. But it is just not something that’s used currently.

If someone does want to make an element like this obsolete, they would need to propose it to the MARC Advisory Committee. That’s not something OCLC can determine.

So, if we catalog in English, our notes should be in English, not the language of the material? Because that is not what I'm seeing most of the time.

Yes. If you catalog in English your notes ought to be in English, except for quoted notes. If you are quoting something from the resource, and the resource is not in English, then you may have a note that is in quotes, that is not in English. So if you are seeing things that are coded incorrectly with language of cataloging, and you’re seeing a pattern of those, with a particular library, feel free to report those to bibchange, and we’ll see what we can do about making corrections. Or if it’s just an individual record, you can probably change it yourself.

We have to use the publisher-provided abstract at our agency. If the publisher-provided abstract is in another language, do we need to translate it?

The best possible cataloging would be yes, it would be good for you to translate it, but I too have seen many records where you’re using a publisher-provided abstract in the 520 note, the summary note, and it’s not translated, it’s in the language of the item, but everything else is in the language of cataloging. If that’s all you’ve got, I say it’s acceptable. It’s not the best, but it’s acceptable.

When is the new Connexion client going to be deployed?

It already has been. You can download it now.

November 2021: Cataloging children's materials

November 9, 2021

As part of the resources in the book, did you include the Cataloger's Reference Shelf? I used/use that a lot as a school libraries cataloger and now even as an academic cataloger.

This is not one of the resources listed in the back of the book. It might have been mentioned within one of the chapters, but if not there's always a place for it in the CCK7 (Cataloging Correctly for Kids, edition 7). I am marking it now so that the next authors have a chance to include that.

There are new instructions in the free-floating subdivisions list H1095 which seem to say that "Juvenile films" should be used for nonfiction children's films and "Juvenile drama" should be used for fiction children's films. This contradicts the other instructions. Which is correct?

H1095 was updated in August 2021, while the other Subject Headings Manual documents were updated around 2013. I would guess that H1095 is the way that the Library of Congress wants to go, and they just haven't finished updating all of the documentation that needs to be updated. "Juvenile films" now would be for nonfiction and "Juvenile drama" would be for fiction films.

There is a lot of Young adult fiction out there. Is there a way to indicate that in the subject headings or subdivisions?

It doesn't look like there is a way to do so within Library of Congress Subject Headings (LCSH) or Sears. It would have to be through some other thesaurus, or your own made-up thesaurus with second indicator 4, in a subject heading or a genre heading. There are "Youth films" and "Youth television programs"; other than that, I'm not aware of anything in LCSH that allows you to use a subdivision "Young adult fiction" or the like.

A lot of copy cataloging records that are coming in do have the genre heading "Young adult fiction". This has not been investigated thoroughly as to how authentic that is, but it is being used out there as a locally created heading.

Another option for identifying young adult fiction in your catalogs may be through using your ILS. Some can facet by young adult, in the online catalog, as long as the participating libraries have located that particular resource in the Young Adult Room. So, there may be an opportunity to do it sometimes through your ILS features.

Will web-based DDC and POD replace periodical print publication of new versions?

Yes, in fact, they already have. It used to be that when there were full printed editions, editors would be working on changes in the meantime, but sometimes those changes would get held back until a new print edition was put out, which seemed to make less and less sense as we looked at it. WebDewey allows us to roll out a change as soon as it's ready. The print-on-demand versions hopefully fill that hole and mean print isn't completely dead.

So, will we just continue to code subfield $2 in 082 fields as 23 forever?

Yes, but there is extra information that goes in there too, which was just approved by the MARC Advisory Committee at the beginning of this year. The documentation on the Library of Congress MARC pages is up-to-date, but we're still getting the word out on this one. There is different guidance based on whether you are using a print edition, where the content is static, or WebDewey.


The short version is that with WebDewey, you will cite the date on which you're cataloging, combined with the history notes that are in WebDewey. That way, if you ever come across a number in the future and the number assigned to a resource doesn't make sense, you can refer to that data and maybe find that, when the number was assigned, it was the right number, even if the subject has since moved.

If we have an item, such as an original music score, that we do not want available for ILL but do want it discoverable for our patrons in WorldCat, is it ok to put a new record in OCLC Connexion, if we will likely have the only holding?

Yes, you are more than welcome to enter original materials in WorldCat. It is a huge benefit for scholars out in the world to be able to discover that the resource exists, even if you're the only one that holds it.

You can talk with your inter-library loan folks about how to have policies in place to make your library staff and patrons aware that something is not available for inter-library loan. Lots of libraries have policies that restrict certain materials from inter-library loan so that people can only come and use them onsite.

We are attempting to catalog in a way that's respectful of authors' cultures' naming conventions (for example, Icelandic authors are listed/shelved by given name rather than family name). Is there somewhere we can look to find which other cultures' naming conventions differ from the anglophone family-name-first convention?

Wikipedia is a very good resource for name order. It helps that there is a really global base of volunteer editors. If you look at Wikipedia articles on a lot of famous Icelandic people, for example, there is usually a note at the top saying that it's an Icelandic name and that you typically refer to that person by what looks to us like their first name. It does this for other cultures as well.

There may also be a Library of Congress authority record representing that author that can be used.

IFLA has a document that lists naming conventions for many countries.

There are also special instructions in RDA for Icelandic names (RDA Appendix F: Additional Instructions on Names of Persons).

November 18, 2021

Is anything being done to update or replace DDC so it is more inclusive and less white centered?

Yes, absolutely. In this area, as in any other, we are really trying to prioritize community-driven updates to the DDC that will best reflect the needs of your users. In past years, we have been very open and welcoming to suggestions from the community at large. It's been a very successful program. The Dewey Editorial Policy Committee has an opportunity to review these suggestions from all sorts of places, and we move on them pretty aggressively. For more information about how to get involved in this process, see Contributors.

How are you collecting the community-driven updates?

They are published and made available along with other changes. If you are looking at a feed of changes in WebDewey, it's not necessarily going to be clear which changes came from the community. We will sometimes highlight specific contributions in a blog post. We also have another related site where we post proposals that come either from me or from the community. You can also see proposals from the past few years there, and if you look at a specific proposal you can see where it's coming from.

I sometimes struggle with deciding which Fixed Field code to use for Audience: 'j' or a more specific children's code. Suggestions?

I try to look to see if the resource suggests the audience level. If there isn't an audience level clearly indicated with the resource, I default to 'j' just because I don't feel that I can responsibly 'level' a resource.
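For reference, the codes in question come from the MARC 21 fixed field Audn (008/22, Target Audience). 'j' is the general juvenile code covering ages up through 15, while 'a' through 'd' are the more specific children's codes:

```
a - Preschool
b - Primary
c - Pre-adolescent
d - Adolescent
e - Adult
f - Specialized
g - General
j - Juvenile
```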

When I download records for children's books, there are often two separate LC call numbers. One starting with PZ, and the other starting with PN, PR, PS, etc. I'm never sure which one to go with.

PZ is juvenile fiction and other juvenile materials, such as stories and rhymes. I haven't seen a lot with double LC classification, PZ in addition to PN, PR, or PS. Generally, if what you have is juvenile fiction or juvenile stories, rhymes, or something like that, then PZ would be what you would want to go with. Of course, there are different numbers associated with PZ for the language the resource is in, just as there are for the various PN, PR, and PS adult numbers. Unless you are talking about poetry, in which case I believe PN, PR, or PS would be more appropriate than PZ.

Does anyone know if there is an update on the LC children's and young adult literature call number policy revision?

I don't remember seeing a follow-up to their survey asking whether the letters they use in place of a cutter should be changed to the same or a similar practice as what they use for the non-PZ numbers. As far as I know, they are still using the letters instead of cutters. If you have any questions about LC practice or the Library of Congress in general, you can send an email to Stacey Devine, head of CYAK (the Children's and Young Adults' Cataloging Program at the Library of Congress). They are very responsive and helpful with any sort of question you may have for them.

What is the best practice for creating bibliographic records for picture books published by various publishers, ordered from Recorded Books, and arriving with accompanying audio CDs? Describing these sets as kits seems to be the most popular approach. If so, why are there book-format bibliographic records in Connexion describing a picture book with an audio CD as accompanying material, or alternately sound recording-format bibliographic records describing an audio CD with a book as accompanying material? In both cases, these resources are not published at the same time or by the same publisher and are not published as a set. Are these records in Connexion an acceptable practice?

For my local catalog, when you are faced with something like that, you have options. It could be a book with a CD, a CD with a book, or it could be a kit. I really try to limit what I call a kit to something that is in more than two formats. So, in order for something like this to be a kit, I would have to have a book, a CD, plus a puppet, or a book, a CD, plus flashcards. With just two formats, it can go either way. What I try to do in my catalog is keep any audio plus book, just that, have it be an audio format with an accompanying book. Just to keep them consistent and the icons correct.

For Connexion, if the materials are purchased together or issued together by a publisher, definitely cataloging them together is a good idea. Local practice may vary between libraries. You're trying to determine what is the predominant material, when you're cataloging these. With some of them, it really is hard. It may be that you're using a book with an accompanying CD, or maybe the audio recording is more important than the book, but sometimes they're both equally important. So, you really are making choices about how to catalog them. Either choice is acceptable, a book with accompanying CD or a CD with accompanying book. I agree that if you only have those two pieces, you probably don't want to go with a kit. But, if you have more than two pieces, three types of formats, you will want to consider kit as an option. If they are issued separately, you may well want to catalog them separately, but in WorldCat (where Connexion is one of the ways to access WorldCat or catalog in WorldCat), you will probably see all three - kit, book with accompanying CD, or CD with accompanying book.

It seems like the situation here is that the books are often published by various publishers and the CDs were done by Recorded Books, which is then issuing them together. So, in the case of the CD, Recorded Books is really the publisher, but in the case of the book that comes with the CD, Recorded Books is maybe acting more as a distributor. Aside from the issue of what material predominates, and whether to describe this as a book with accompanying CD or a CD with accompanying book, there is the issue of whether it is really issued together. It kind of is, but then it was published separately, so it's kind of not. This is something that we could really use the help of our colleague, Jay, on, and maybe we should have some discussion about this. We will follow up with Jay Weitz and chat about this a little bit more and make sure that any decisions are added to the notes when they are available.

Juvenile fiction is a bit broad; I sometimes find it hard to assign it to books for young adults even though the audience is coded 'd'. What is the recommendation for young adult books with more 'adult' topics? Should these have only LC headings and not children's headings?

Subject Headings Manual H 1690 says to assign LC subject headings and subdivisions to topical materials for juveniles up through age 15 or ninth grade, and to use the juvenile subdivisions for those. So, if the work or resource is intended for ages above 15, then you would in theory treat it as just a regular general work or adult book without those juvenile subdivisions. Although actual practice on that may vary in the real world.
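As a sketch of the two treatments described above, a work intended for readers up through age 15 would get the juvenile form subdivision, while one intended for older readers would be treated like an adult work. The topic "Dogs" here is a hypothetical example:

```
650  0 $a Dogs $v Juvenile fiction.    (intended for ages up through 15)
650  0 $a Dogs $v Fiction.             (intended for ages above 15)
```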

In local catalogs, you do have the opportunity to set the location as young adult. So, when you are working with your local system, you might find a way to represent these resources as young adult and not use "juvenile fiction", which sometimes looks a little awkward on young adult literature. Plus, if a young adult is looking for something and sees "juvenile fiction" on it, they think it's for babies and they don't want it. I would suggest looking at your local systems and seeing if there's some type of solution you can come up with, pairing faceting of location with subject headings or call numbers, to responsibly label these young adult materials so that it's clear they are for young adults.

Is the structure of the print-on-demand copy of Dewey the same as the former print editions? In other words, separate manual, tables, index, in addition to the schedules?

Yes, it is. We use the same printing software to prepare those as we did the old print editions. They're nice trade paperbacks rather than hardbacks, but otherwise the look is very similar.

What maintenance issues are there for children's subject headings? How has the recent expansion of the LCSH authorities helped or complicated that work?

Since the children's subject headings aren't currently controllable, maintenance has to be done manually or by automated tools that are essentially the same as manual work. The expansion of the authorities for children hasn't really had an impact as of yet, aside from explicitly establishing headings that were previously only assumed to be established by their existence in LC records. Now there's an explicit authority record for a children's subject heading, so it's clearer that the heading is backed by an authority record even though it can't be controlled.

Is there any help judging genre headings, for example picture books, beginner, etc., and creative non-fiction versus fiction?

With genre headings, there are many thesauri and lists in use. Make sure you are drawing from the same list all the time. When you use a genre heading from a particular list, go by the guidance set up by that list: its definitions and its instructions for use will usually take care of everything. If not, take a common-sense approach to using that particular genre heading.