May 2-4, 2001
With links to background documents
CONSER Meeting, May 2, 2001
The CONSER Operations Committee held its first three-day meeting this year. The CONSER and BIBCO Operations Committees met jointly on May 3. Jean Hirons, CONSER Coordinator, chaired the CONSER meeting, with assistance from Les Hawkins, CONSER Specialist.
The first day of the CONSER meeting was an orientation session for those new to CONSER.
Hirons began the meeting with introductions that started with the newest members and progressed to those with ten to twenty years of experience. There were 39 attendees representing 34 CONSER institutions, in addition to LC staff. The morning session consisted of two panel discussions that gave experienced CONSER members, including LC and OCLC staff, an opportunity to share information and views of CONSER participation with new members. The first, "How CONSER works: Part 1: In our libraries," included panelists Sue Fuller (University of Texas), Ruth Haas (Harvard), David Van Hoy (MIT), Kevin McShane (NLM), and Kristin Lindlan (University of Washington). Panelists described their institutions' participation in CONSER, including the scope of material contributed to the CONSER database and workflow issues. Hirons then asked panelists to respond to specific prepared questions, after which the forum was opened to the audience. Some of the issues covered included:
- Implementing in-house CONSER training
- Deciding on what to contribute as CONSER cataloging
- Record maintenance, closing out CONSER records
- Consulting with other CONSER members for problem solving
- What it means to "CONSERize" in adapting records for CONSER authentication
Challenges that were identified during the discussion included electronic publishing, declining staff throughout CONSER institutions, institutional perceptions of the value of CONSER work (that CONSER work is more time consuming, difficult, and expensive), and the relative value of adding multiple control numbers to linking fields.
The second panel, "How CONSER works: Pt. 2: LC and OCLC," included panelists Lucy Barron (LC Serial Record), Robert Bremer (OCLC), Judy Kuhagen (LC CPSO), Simone Jones (LC Serial Record), John Levy (LC Serial Record), Les Hawkins (LC Serial Record), and Tom Yee (LC CPSO). Each panelist described their role or the role of their organization in the CONSER program.
Descriptions of the OCLC/LC Pinyin conversion project were given. Transcription of PREMARC LCCNs from LC field offices was discussed. Procedures were outlined for contacting CPSO through a triage approach (CONSER members contact SRD staff who then would forward problems to CPSO if further resolution is needed). Consultation with LC's Serial Record staff regarding cataloging issues was discussed and encouraged. The workings of various aspects of the CONSER program were described: contacting OCLC for quality control purposes, CONSER record flow and distribution, and procedures for submitting updates to CONSER documentation.
Action: Jean suggested that a review of Part I of the CEG be undertaken to determine the situations in which CONSER participants are to notify LC. Sue Fuller volunteered to compile a list. LC will then review it to determine what situations still require notification. The list will be included in the CEG and CCM.
The afternoon was devoted to six breakout sessions on the following topics:
- Problems from home
- CONSER workflow
- CONSER cataloging issues
- MARC holdings
- Electronic serials
- Open discussion
Participants had the opportunity to discuss workflow, procedural, and cataloging issues during these sessions.
BIBCO/CONSER Joint Meeting, May 3, 2001
(This portion of the CONSER meeting summary has been excerpted from the BIBCO Operations Meeting summary)
A special welcome was given to the new BIBCO OpCo representatives as well as to new CONSER members. The new BIBCO representatives are Alice Jacobs (NLM), John Sluk (Oberlin), John B. Wright (BYU), Jimmie Lundgren (University of Florida), and Chris Mueller (University of New Mexico). New CONSER members in attendance were Mary Grenci (University of Oregon), Alwyn Owen (National Library of Wales), Renette Davis (University of Chicago), and Everett Allgood (New York University). Larry Alford, Chair of the Program for Cooperative Cataloging (PCC), recognized the presence of three members of the PCC Policy Committee: Carlen Ruschoff (University of Maryland), Glenn Patton (OCLC), and Robert Wolven (Columbia). Alford extended his thanks to the Cooperative Cataloging Team, John D. Byrum, Ruta Penkiunas, and the participants for the Program's success during the past year.
Metadata in PCC libraries
The agenda commenced with a discussion led by Les Hawkins. The discussion opened with a definition of metadata to ensure that all participants were on the "same page".
For purposes of this discussion, metadata "refers to cataloging codes, lists of data elements, or other schema that are used to create records or data that enhances the retrieval and interpretation of bibliographic resources in digital form." Hawkins surveyed the OpCo representatives and determined that more than half of those in attendance were using metadata at their institutions. Participants then offered information about what types of materials are involved and what metadata are being used for them at the respective BIBCO and CONSER institutions.
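As a point of reference, the simple Dublin Core records mentioned in this discussion consist of a small set of descriptive elements. A hypothetical record for an online journal might look like this (the title, publisher, and URL are invented; the serialization shown is one common XML-style convention, not a prescribed format):

```xml
<record>
  <dc:title>Journal of Example Studies</dc:title>
  <dc:creator>Example Research Society</dc:creator>
  <dc:publisher>Example University Press</dc:publisher>
  <dc:date>2001</dc:date>
  <dc:type>Text</dc:type>
  <dc:format>text/html</dc:format>
  <dc:identifier>http://www.example.org/jes/</dc:identifier>
  <dc:language>eng</dc:language>
</record>
```

The element names come from the fifteen-element Dublin Core set; "enhanced" or qualified use, as some institutions report, adds qualifiers to these base elements.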
Cornell uses Dublin Core to assist in searching the documents that make up its Library Gateway Help system. They have also used Dublin Core in CORC (Cooperative Online Resource Catalog). Other digital collections projects at Cornell use SGML (with a TEI Lite mark-up), the Dienst protocol, and EAD (Encoded Archival Description). Cornell has a clearinghouse node in the National Spatial Data Infrastructure called CUGIR. The metadata for these geospatial datasets is the Federal Geographic Data Committee's (FGDC) content standard. In other cases, Cornell uses locally created metadata structures, for example to serve numeric data and reports for the USDA Economics and Statistics System.
The following list enumerates only those metadata schemes that had not already been mentioned; the list for each institution is therefore not exhaustive.
The University of Oregon is using metadata for visual resources in its slide collections and added that it is working to develop a core list of data elements, regardless of the metadata scheme being used.
Columbia is working on specialized projects using a modified AACR2 format and is using a selector input template for digital resources called Digital Scriptorium.
The University of Florida is keeping finding aids in an Access database. The Florida Center for Library Automation is using NDLTD metadata (Networked Digital Library of Theses and Dissertations).
The National Library of Wales reports using SGML, XML, and TEI.
The National Library of Medicine (U.S.) reported using Dublin Core for digital resources, and enhanced Dublin Core for more permanent resources and for materials that have been archived.
BEAT (Business and Economics Advisory Team) at the Library of Congress reported using Dublin Core for free Web-based resources, with a template designed by reference librarians.
Harvard University uses a union catalog of visual resources (VIA) and OASIS, an archival and manuscript finding aid system using EAD (Encoded Archival Description).
Hawkins then asked participants about the kind and level of staff involved with metadata applications and/or the creation of records for digital resources. Responses were quite mixed. Staff involvement seems to occur at all levels and is a highly collaborative function, often requiring the assistance of reference staff. It was pointed out that the resource itself may often require the use of a particular metadata scheme. Naturally, support functions and funding are vital for maintenance of the metadata. Abstracting and indexing services are creating the largest number of metadata schemes.
Questioning also focused on the individuals responsible at each institution for the cataloging of metadata. The responses varied; however, many institutions reported that mainly serials catalogers appear to be involved with the cataloging of electronic integrating resources, although some institutions reported using monographic catalogers, and some reported using both. As an interesting aside, it was noted that catalogers with an AV (audio/visual) background had the least apprehension about cataloging digital resources.
Hawkins also tried to get a reading on when and whether AACR2 will be used to provide description and access to resources such as Web sites, databases, e-journals, and other digital resources. Responses indicated that AACR2 will be used for such resources once the revised rules are published.
The metadata warm-up session seemed to produce the desired effect: much enthusiasm was generated, and many of the OpCo representatives participated. When asked how participants envision the role of PCC in helping to define the standards by which digital resources would be cataloged, the consensus was that a subgroup of the Standing Committee on Standards could be tasked with developing guidelines, that the PCC should become a "clearinghouse" for metadata standards, and that no one particular scheme should be preferred. It was also suggested that catalogers begin to look "outside the box" of technical services to help promote access to digital resources.
Action: PCC, via SCS, to investigate the need to define guidelines for digital resources cataloging. PCC Web site to be used as a clearinghouse for various defined metadata standards.
Standing Committee on Automation (SCA)
The discussion moved to an update on automated classification given by SCA Chair Karen Calhoun (Cornell). Classification is the last cataloging process to be significantly touched by automation and remains essentially an expensive manual process. Calhoun referred participants to the work done by the SCA Task Group on Automated Classification, chaired by Gary Strawn (Northwestern). Their report includes product specifications for ILS vendors to consider and makes three main recommendations: (1) that the ILS be able to produce a list of subject headings associated with a classification number; (2) that the ILS be able to produce a list of classification numbers associated with a subject heading; and (3) that the ILS be able to report duplicate classification numbers.
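The three recommendations amount to building simple indexes over the classification number/subject heading pairs found in bibliographic records. A minimal sketch in Python (the sample data is invented, and this illustrates the idea only, not any vendor's actual implementation):

```python
from collections import defaultdict

# Invented sample of (classification number, subject heading) pairs,
# as they might be harvested from a library's bibliographic records.
bib_pairs = [
    ("QA76.73.P98", "Python (Computer program language)"),
    ("QA76.73.P98", "Scripting languages (Computer science)"),
    ("Z695.712", "Cataloging of serial publications"),
    ("Z695.712", "Cataloging of serial publications"),  # duplicated pair
]

class_to_subjects = defaultdict(set)   # recommendation 1: class no. -> headings
subject_to_classes = defaultdict(set)  # recommendation 2: heading -> class nos.
pair_counts = defaultdict(int)         # recommendation 3: spot duplicates

for class_no, heading in bib_pairs:
    class_to_subjects[class_no].add(heading)
    subject_to_classes[heading].add(class_no)
    pair_counts[(class_no, heading)] += 1

# Pairs assigned more than once, which an ILS could report for review.
duplicates = [pair for pair, n in pair_counts.items() if n > 1]
```

A production system would build these indexes from the full bibliographic file, but the lookups themselves are this straightforward.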
Calhoun then referred participants to the questions prepared by Jeanne Baker (University of Maryland) in advance of the Operations Committee meeting. The questions could be helpful for assessing a library's readiness for and interest in automated classification. Calhoun asked the OpCo representatives: if the enhancements suggested by the Task Group were available now, how would current practice at each home institution change? The overall feeling was that the output of cataloged titles would increase substantially. Encouraging the OpCo representatives to get the word out to the ILS vendors, particularly at the various vendors' users group meetings, Calhoun solicited volunteers from among the attendees to publicize the need for classification automation.
Action: John B. Wright (BYU, SIRSI) and Jeanne Baker (University of Maryland, Voyager) will work with Calhoun to help with reaching vendors.
Calhoun turned her attention to OCLC batch processing. The SCA Task Group on OCLC Batch Processing has completed three surveys--one with BIBCO liaisons, one with CONSER OpCo representatives, and one with OCLC Users Council delegates. The Task Group's preliminary report (made available in advance of the OpCo meeting) reported that catalogers at BIBCO/CONSER libraries who are contributing Program records online in OCLC are happy with this contribution method. However, BIBCO institutions that batchload Program records are dissatisfied with the process, and enhancements to batchloading would be a significant help to these libraries. There was some interest in batchloading enhancements among CONSER respondents to the survey, but not at the level expressed among BIBCO respondents.
While the availability of batchloading enhancements does not appear to be a driving factor in who joins PCC or how much is contributed, such enhancements could further the goals of the PCC by making more BIBCO upgrades available for use in WorldCat. Currently, when batchloaded BIBCO upgrades match existing WorldCat records, only the library's holding symbol is attached, and the BIBCO upgrade itself is discarded. If batchload were changed, these upgrades would no longer be discarded.
In the past, OCLC Users Council delegates had expressed some reservations about the replacement of OCLC member-contributed records with BIBCO full and core records. The Task Group's Users Council survey was done to learn more about the delegates' concerns and preferences. Results indicated that over three-fourths of the Users Council respondents were either somewhat in favor or fully in favor of OCLC batchloading enhancements that would facilitate the contribution of Program records. Residual uneasiness about the enhancements may be the result of a general lack of familiarity with PCC core and full record guidelines among Users Council delegates.
Survey participants were asked to choose among a variety of options for replacing OCLC member-contributed records with Program records. A majority of respondents from all three surveys preferred a replacement option in which any batchloaded PCC core record would replace any less-than-full member record, and any batchloaded PCC full record would replace any member record. There was some support for an option in which data from member and PCC records would be merged, but it was clear that merging would be complex, difficult, and costly to implement. Calhoun reminded participants that some merging already takes place; when the OCLC batchloading process "bumps" a member record with an incoming record, subject headings and call numbers in schemes not present on the incoming record are retained.
Calhoun asked OpCo participants to help the Task Group with its recommendations to OCLC and the PCC Policy Committee. There was further discussion of the "replace" versus "merge" issue, with some OpCo representatives from CONSER libraries expressing support for data merging.
Action: The Task Group will make recommendations to the PCC Policy Committee and to OCLC, taking OpCo respondents' comments into account.
Utilities Wish List
The meeting then featured the initial responses from the utilities (both RLG and OCLC) to the PCC wish list of electronic enhancements designed to facilitate contributions of records. Ed Glazier (RLG) took center stage to tackle the eleven issues for BIBCO/NACO/SACO outlined in the background document, and Cynthia Whitacre followed in turn with OCLC's response. For reporting purposes, each of the eleven points is enumerated below with the respective responses from the utilities.
- Validation of headings on bibliographic records; linked authority control.
RLG: Plans to perform this function were abandoned years ago and have not been revisited.
OCLC: Experimenting with making the authority file available in OCLC's FirstSearch so that references can assist in end-user searching. OCLC already offers linked authorities in CORC.
- Record distribution between OCLC and RLIN.
RLG: RLG remains prepared to discuss this issue.
OCLC: Equity of exchange is important to OCLC. OCLC plans to examine searching of other databases, the first of these will be to a database in the Netherlands; linking to RLG is a possibility to be explored.
- Batch-loaded BIBCO records to overlay other records in OCLC.
RLG: No comment necessary.
OCLC: Is striving to improve on this capability and is working with the SCA Task Group on OCLC Batch Processing to define enhancements to the process.
- SACO and classification workflow online in the utilities.
RLG: Development not justified; not cost-effective; however, Glazier noted that RLG does not want to interrupt LC's mechanism of subject heading review.
OCLC: SACO is under LC's control; OCLC is looking to LC to provide direction in how to streamline SACO proposal submission.
- BFM: Revision of headings on bibliographic records in the utilities' databases.
RLG: Not planning to implement; see 1 above.
OCLC: "Good news"! OCLC is doing proactive clean-up of headings and is experimenting with using reports of newly established headings, which would eliminate the need for BIBCO libraries to report BFM. Global replace is not yet possible except in the context of linked authorities in CORC.
- Series numbering should file/sort in the utilities as a PCC Task Group has recommended for vendors of local systems.
RLG: No development is currently underway to implement.
OCLC: OCLC does not currently display the contents of subfield $v in truncated lists of search results, but will consider doing so as its new platform is implemented.
- Ability to import records from remote databases into the OCLC database.
RLG: No comment necessary.
OCLC: This is part of OCLC's Extended WorldCat strategy.
- More search capabilities for the MARC21 tagging in records, including the fixed fields.
RLG: Currently available through Web-based EUREKA and RLIN technical processing system.
OCLC: OCLC responded that it is interested in clarification on these points (e.g., Which fixed fields are most useful?) and will be asking for input on the BIBCO- and CONSER- mailing lists.
- The ability to see all in-process authority records in OCLC, as members can do in RLIN.
RLG: No comment necessary.
OCLC: Definitively no!
- Retain a user-friendly interface.
RLG: We are working to make EUREKA even more user-friendly.
OCLC: Working hard at retaining a user-friendly interface. OCLC is moving in that direction, particularly with the move to the relational database.
- Allow for bigger records.
RLG: EUREKA has essentially no limits on record size.
OCLC: There are no limits to record size in CORC, and as OCLC implements its new relational database, there will no longer be a limit to record size.
Robert Bremer (OCLC) spoke to the CONSER wish list categories, listed as: 1) linking-related; 2) multi-part/multi-dimensional records; 3) maintenance-related; and 4) long-range.
Bremer stated that the ability to link to external databases is indeed central to the strategy on which OCLC's WorldCat is based and that links to other databases are a very real possibility. He added that OCLC is working toward making better use of existing links in records to bring together related records in catalogs. Bremer continued that maintenance-related issues would require more discussion at all levels, particularly with input from other library users and not only technical services staff, and that long-range planning should involve input from a myriad of sources. Bremer announced that the new OCLC platform, with its new databases and database models, should become available at the end of summer 2002.
PCC Task Force on Multiple Manifestations of Electronic Resources
Jean Hirons reported on the findings in the final report of the PCC Task Force on Multiple Manifestations of Electronic Resources, chaired by John Riemer. Hirons summarized some of the work done by the PCC on works issued in multiple versions. Efforts have included 1) the development of the single-record approach for providing access to online versions through manipulation of the record for the print version; 2) the development of guidelines in the CONSER Cataloging Manual (CCM) which provide guidance on the use of separate records for electronic versions vs. the single record approach; and 3) working with vendors and aggregators of serials to provide access to bibliographic records contained in the vendor's products.
Hirons announced a promising development in the formation of a JSC-commissioned task force to study cataloging at the expression level, using the OCLC-Europe database for the sample. Jennifer Bowen will be chair of this group; JSC member Matthew Beacom will be a member as well.
An informal show of hands revealed what some BIBCO and CONSER members were doing to handle multiple manifestations. Some institutions are loading record sets from various aggregators. Attendees reported experience with other monographic record sets, including Books24x7, Lexis/Nexis Tripod, ProQuest records, and sets from netLibrary. Cornell is experimenting with harvesting data from digital resources to automatically create its own record sets for loading into its catalog. This alleviates a resource drain on cataloging staff in that at least a brief record is created. The University of Florida is also experimenting with creating its own record sets for its digital collections.
Action: PCC Web site to be used as a clearinghouse for existing record sets that are available, including lists of commercially available products.
Hirons pointed out that CONSER assumes that the print record represents the first tangible manifestation of an item. Attendees agreed that sometimes the electronic version is actually produced as the first or primary manifestation and other format versions are secondary. In a case where a record for the print or other tangible format version does not exist, elements describing the availability of the print or other versions could be added to the record for the electronic version.
Several issues relating to ISSN and multiple aggregator records were raised. Under current CONSER policy, separate records are created for online versions of a single title simultaneously distributed by different aggregators. The ISSN network assigns one ISSN to the print version and a separate ISSN to the online version; however, it does not assign a separate ISSN to each online version of a single title offered by multiple aggregators. If there is only one ISSN for the online version of a print title, which among multiple aggregator records should be used for recording the ISSN?
Regina Reynolds (LC, NSDP, the U.S. ISSN Center) suggested the ideal of being able to assign the ISSN for an electronic version based on the original electronic text produced by the publisher, creating a "master" record for the electronic version. This approach emphasizes identification of the electronic version of the work over description of individual aggregator online versions which might differ from one another in completeness, coverage, etc. It would allow libraries to maintain specific information about aggregator versions (subscription details, coverage, etc.) at the holdings level rather than the bibliographic level.
Some attendees described the local practice of using multiple ISSNs in multiple 022 fields (one for the print version, one for the electronic version) when using the single-record approach.
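In MARC terms, this local practice places both ISSNs on the single (print-based) record. A hypothetical fragment (the title, ISSNs, and URL below are invented for illustration):

```text
022 ## $a 1234-5679          <- ISSN of the print version
022 ## $a 4321-7656          <- ISSN of the online version (local practice)
245 00 $a Journal of example studies.
530 ## $a Also available via the World Wide Web.
856 41 $u http://www.example.org/jes/
```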
Action: CONSER to consider the impact of incorporating the use of multiple ISSNs into its single-record approach for national level records.
AACR2, MARBI, and Integrating Resources
An update on AACR2 and MARBI actions relating to Integrating Resources (IR) was provided by Hirons as well. The JSC has given basic approval to the revised chapter 12 of AACR2, which will have the title "Continuing Resources" with publication expected in 2002. Chapter 12 will provide rules for both successively issued serials (including series) and integrating resources.
An integrating resource will be defined in AACR2 as "a bibliographic resource that is added to or changed by means of updates that do not remain discrete and are integrated into the whole." Hirons further explained that "integrating" is a form of issuance and that IRs can be print, electronic, or available by means of other media; examples include a serial that becomes an IR in its electronic manifestation, updating loose-leaf publications, and updating Web sites. Some highlights of the cataloging rules for integrating resources include: 1) description based on latest entry; 2) a new record not being required for title changes; 3) serial-like designations not generally being applicable; and 4) notes to reflect earlier information when it is known.
Hirons presented issues with MARC coding for IRs and reported on several MARBI documents to be discussed at ALA in June, including 1) Proposal 2001-05: Bibliographic level (Leader/07) code 'i' for integrating resources; 2) a discussion paper on which 008 should be used with textual materials: serials or books; and 3) Proposal 2001-04: repeatable 260 fields. Hirons noted that a bibliographic level code 'i' identifying a record as an integrating resource may give PCC program members more opportunity to perform maintenance on records for IRs and to distribute them. Should code 'i' be implemented for IRs, it would need to be determined whether it should be used with the Serials or Books 008 field.
The Books 008 is currently in use because these records are coded as bibliographic level 'm'; continued use of the Books 008 would not require that records be converted, but its bytes seem less useful. If the Serials 008 is used to help describe IRs, possible codes to be added include 'k' to identify a frequency of "continuously updated," 'l' to identify loose-leaf publications, and '2' to identify latest entry, similar to an update for an IR, as when a new title is recorded in the record. In describing MARBI Proposal 2001-04, Hirons also suggested the possible use of the 247 field in records for IRs to record earlier titles. Hirons asked for a show of hands on which 008 should be used, and there was unanimous approval of the Serials 008.
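To illustrate the coding under discussion, a record for an updating Web site might carry values along these lines if the proposals were adopted (hypothetical values; none of these codes had been approved at the time of the meeting):

```text
Leader/07 (bibliographic level):     i   <- integrating resource (Proposal 2001-05)
008/18    (Serials: frequency):      k   <- "continuously updated"
008/34    (Serials: entry conv.):    2   <- integrated (latest) entry
247 10 $a Earlier title of the site $f 1999-2000   <- earlier title kept on the same record
```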
Action: Compile a chapter on the cataloging of integrating resources. Volunteers will include John Sluk, Naomi Young, legal catalogers, Judy Kuhagen, etc. (depending on whether loose-leafs are included or given a separate chapter).
Note: PCC participants are encouraged to visit the JSC Web site for the latest updates and progress on changes to AACR2 beyond those discussed at this meeting.
Interim Report of the Task Group on Implementation of Integrating Resources
Valerie Bross (UCLA) gave an overview on the Interim Report of the Task Group on Implementation of Integrating Resources. The group focused on the needs for training, documentation, maintenance of records, and the distribution of records for IRs. It seems that the PCC is in a good position to create training materials and documentation for the cataloging of integrating resources. Since catalogers in both the BIBCO and CONSER Programs could conceivably be involved in creating records for integrating resources, it was suggested that documentation be available widely, perhaps as a module of the CONSER Cataloging Manual (and consequently available through Catalogers Desktop) and as a chapter in the BIBCO manual.
Action: A decision will need to be made on whether there should be one large document for all IRs or whether separate CCM-like modules should be developed for loose-leaf publications and electronic IRs. Other issues to be resolved include who should prepare the documentation and when the documentation should become available.
Maintenance of records for IRs was discussed, including record distribution questions. Generally it was agreed that record maintenance for IRs would be important and should be shared among BIBCO and CONSER institutions. However, it was noted that BIBCO records are not simultaneously shared in both OCLC and RLIN, thereby making it difficult to share maintenance of all records for integrating resources. There was also uncertainty expressed about the mechanisms that would trigger the need to perform maintenance on IRs. Would routine staff scrutiny be required to monitor changes or would automated means (such as link checking software) be sufficient?
Action: Maintenance guidelines will be developed to give BIBCO and CONSER libraries an idea of the resources required to monitor and make maintenance changes to IRs.
Record distribution and sharing among the utilities also brought up the use of LCCNs for record authentication and record distribution in the CONSER database. It was suggested that automatic generation of LCCNs in OCLC may be a possibility. Perhaps this automated mechanism would allow more libraries (BIBCO and CONSER) to share in record authentication and the maintenance of IRs.
Standing Committee on Standards
Ann Caldwell (Brown), Chair, Standing Committee on Standards (SCS), followed with a report on the work of the SCS. Caldwell focused on the review of the core records for all formats, noting any discrepancies among the various core standards. As she reported in the previous day's meeting, the SCS has been grappling with the core record elements included in the record for cartographic materials. At issue was whether a single core record for all types of cartographic materials would be sufficient; the SCS has chosen to endorse that approach. Certain elements of the cartographic core have been determined to overlap with elements of other core records, most notably the core records developed for computer files and for serials. Caldwell commented that the SCS may look to develop a core record with standard elements to be used across the board for all materials, regardless of format, with footnotes supplementing the core record with additional elements based on the type of material. No negative feedback was generated; however, it was noted that each core record should also be maintained as it currently is.
Action: SCS to maintain separate core record standards: one with its complete array of elements accompanied by footnotes to supply additional elements based on the material, and a core record with all elements for each type of material without reference to the others.

Kay Guiles (Senior Policy Specialist, CPSO, LC) presented a follow-up report concerning recommended changes to LCRIs from the final report of the SCS Cross Reference Task Group, which had been submitted on December 17, 1999. The proposed LCRIs drafted in response to recommendations #8, #9, #10, and #13 are available on the CPSO Web site. Comments on the proposed changes are to be sent to the CPSO e-mail account at firstname.lastname@example.org by July 20, 2001. LC's disposition of the remaining 8 recommendations can also be found in the LC report.
Standing Committee on Training
Carol Hixson, Chair, Standing Committee on Training (SCT), was the last speaker of the day. She began her report with the announcement that the SACO Participants' Manual, authored by Adam Schiff (U. Washington), had been completed and was at the LC print shop; it would soon be available in print, on LC's Cataloger's Desktop, and in PDF form via the SACO home page. The PCC is deeply indebted to Schiff for this achievement, which will greatly help to facilitate contributions to the SACO Program.
Hixson applauded the work of the four SCT Task Groups (the Task Group on Educational Needs of the Cataloging Community: final report; the Task Group on NACO Continuing Education: final report, in which one recommendation called for a revision of the NACO Training Manual to parallel the CONSER Editing Guide and the CONSER Cataloging Manual, both sources rich in examples; the Task Group on PCC Participant and Training Documentation: final report; and the Task Group on Web-Based Training and Distance Education: interim report). Hixson also noted the work of the joint SCS-SCT Task Group on Implementation of Integrating Resources and its interim report, which had been presented by Bross earlier in the day. Hixson wanted feedback from OpCo representatives on whether it falls within the purview of the SCT to conduct basic cataloging classes.
The point was made that library schools are not addressing the current needs of technical services librarians; Hixson proposed that the PCC become a training and education hub for technical services and that the training/education be used as a recruiting tool, both to this aspect of the profession and to the Program. Hixson summarized the work of the Task Groups as calling for standardized documentation, following the CONSER model; a NACO Coordinator and a strong BIBCO Coordinator (equivalent to the CONSER Coordinator); a collection of local cataloging documentation with links from the various PCC home pages; and a personnel resources inventory for BIBCO participants, through which the strengths of individual BIBCO members could be tapped when needed. Hixson stated that such an inventory had been alluded to during ALA Midwinter and expressed perplexity that it had not been pursued. A CONSER participant commented that the CONSER documentation does not obviate the need for consulting AACR2, the LCRIs, or other official policy documentation. John Byrum (LC) commented that a restructuring of the Cooperative Cataloging Team was in the pipeline that would help strengthen the various PCC programs.
Action: Hixson will develop a survey instrument to inventory the personnel resources of BIBCO participants. The responses would then be keyed into a database and maintained. A decision will need to be made on who is responsible for maintaining such a resource.
CONSER Meeting May 4, 2001
AACR revision, MARBI proposals, and documentation
Jean Hirons gave an overview of AACR revision, the need for updating current CONSER documentation, and planning for new documentation. The revised chapter 12 incorporates the concept of "continuing resources," an umbrella concept for serials and integrating resources. The chapter will also encompass some finite resources (e.g. newsletters of an event, finite loose-leafs) which will be treated in the same manner as serials or integrating resources that are continuing. She mentioned that a statement was added to exclude non-serial cartographic materials (e.g. map series) even if they are of an integrating type. There was brief discussion by attendees about the differences between Web sites that contain digitized cartographic material versus digitized maps or map series. Issues surrounding some types of digitized cartographic material and coverage under the new chapter 12 may need further clarification. The revised rules will incorporate many current LCRIs and CONSER practices.
Hirons highlighted the addition of new rules for minor title variations:
- The addition or deletion of words anywhere in the title that indicate the type of publication (e.g., journal, magazine, series)
- The addition, deletion, or change in the name of a body that is part of the title. (Hirons emphasized that this refers only to changes in the name of the same body, rather than a change from one responsible body to another.)
- The addition, deletion, or change of words from a list of words within a title.
There was discussion of what constitutes a list of words, and it was clear that good examples will be needed for the CONSER documentation. Hirons pointed out that in considering changes in the resource, the emphasis in the revised chapter 12 is to err on the side of fewer title changes and fewer records.
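To illustrate the kind of example the documentation will need, consider a hypothetical minor change under the revised rules (the titles here are invented for illustration):

```
Title proper on earlier issues:  Management review
Title proper on later issues:    Management review magazine
```

Because the added word ("magazine") merely indicates the type of publication, the revised rules would treat this as a minor change: the existing record is retained rather than a new successive-entry record being created.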
Revisions in other areas of the description were also outlined, including other title information, numbering, publishing area and notes. Note rules have been expanded in the new chapter 12 because the information they include will differ between serials (later information) and integrating resources (earlier information). "Latest issue consulted" has been added to the "Description based on" note for serials while the "Description based on" note will be used alone for integrating resources.
Hirons described MARBI proposal 2001-05, which advocates a new bibliographic level code "i" that would allow for better record identification of integrating resources in OPAC displays and could be important in allowing both BIBCO and CONSER members to maintain records for integrating resources. If approved, it will be necessary to provide clear instructions beyond the PCC community. BIBCO and CONSER members may be especially tuned into such changes, but there is a need to reach other cataloging constituencies that are cataloging integrating resources. There was also a question about how utilities and individual library systems will be able to handle changes in MARC 21.
MARBI proposal 2001-04 calls for a repeatable field 260 in order to provide better access to the current and earliest publishing information. Recording both the earliest and the latest publisher in multiple 260 fields would aid users of records and reference staff in identifying the correct record and is particularly needed by acquisitions staff. Hirons said serials and integrating resources would have to be treated similarly, with the earliest publisher given in the first 260 and the latest given in a repeated 260; however, displays could be configured according to AACR2. There was some discussion about whether only the latest and earliest publishers should be given or if all intervening publishers should be recorded in multiple 260 fields. Subfield "3" could be used to add explanatory dates of coverage for the different publishers. Generally it was felt that leaving out intervening publishers would probably make the record easier to read. There was discussion of how and when place changes should be recorded: should new places associated with the publisher originally recorded in the record simply be described in a note rather than placed in a separate repeated 260 field?
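As a hypothetical sketch of how a record might look under the proposal (indicator values and coding conventions were not final at the time, and the publisher names and dates are invented), the earliest publisher would appear in the first 260 and the latest in a repeated 260, with subfield $3 supplying explanatory dates of coverage:

```
260 ## $3 1985-1994 $a Chicago, Ill. : $b Example Press
260 3# $3 1995-     $a New York, N.Y. : $b Example Media Group
```

Leaving out intervening publishers, as most attendees preferred, keeps the record to these two fields and easier to read.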
Kathleen Dougherty (NAL) raised the problem of fluctuating publisher names: there are situations where the publisher name switches back and forth between two publishers, and she asked how this would be handled. Another possible problem, raised by Ed Glazier, was record matching in RLIN when newly changed CONSER records containing multiple 260s are sent to replace existing RLIN records that have a single 260. It was pointed out that this may be similar to record-matching challenges now encountered with CJK records, which have multiple 260 fields. Hirons noted that careful attention had been given to retaining access to the earliest publisher for purposes of record matching.
Hirons described CONSER documentation that will need to be revised to incorporate AACR rule changes:
- SCCTP Basic Serials Cataloging Workshop
She also noted that the two new SCCTP courses would be based on the new rules. Additionally, Adele Hallam's Cataloging Rules for the Description of Loose-leaf Publications will need to be revised and incorporated into new documentation. For the CCM revision, Hirons outlined the process: an LC team will review the modules for needed changes, Hirons will perform the actual revisions, an LC team will give the new draft an initial review, and both LC and CONSER catalogers will conduct the final review. CONSER and LC catalogers would be integral to the process, providing new examples to the review team and to Hirons for inclusion in the draft. The revision of specialized CCM modules could be assigned to individuals or groups with a special connection to the module; for example, the CONSER e-serials specialist group could head the review and updating of Module 31 on remote access e-serials, and individual authors of other modules (or other volunteers) could lead the review for them. Similarly, the revision of the CEG and the SCCTP workshops will require additional examples and input from CONSER and LC catalogers. Hirons asked for volunteers to provide examples for the CCM revision, with a tentative deadline of fall 2001, and asked attendees interested in revising specialized modules of the CCM to contact her. Volunteers for the final review process will be sought later in 2001.
USNP and CONSER
Bob Harriman (LC) briefed the attendees on the status of the U.S. Newspaper Program. Many of the state projects are coming to a close and most of the cataloging is completed. Harriman asked who would be responsible for maintaining newspaper bibliographic records after projects close and CONSER authentication is no longer available to them. Maintenance of Local Data Records (LDRs) was also raised, but as a separate issue from bibliographic record maintenance. It was estimated that there are over 600,000 LDRs.
As for bibliographic record maintenance, several possibilities were suggested:
- For USNP members who are interested in maintaining records, perhaps a USNP enhance authorization could be used to perform maintenance on CONSER/USNP records.
- Records that need to be authenticated (e.g. in the case of a title change) could be sent to LC after USNP members have created the record for simple authentication.
- This could be done also in cases where other types of changes need to be made on authenticated records.
- There could be CONSER involvement in regional centers for maintaining records. A CONSER institution could pick up maintenance and authentication for records that fall within their institution's scope. They could also act as a regional contact point for USNP members when authentication and maintenance needs are identified.
A discussion of state libraries and their relationship to newspapers ensued with suggestions that, in some cases, the addition of the state library to CONSER would accommodate the needs of newspapers for that state.
Action: Hirons, Harriman, and Hawkins will pursue development of a USNP Enhance category and other possibilities.
Publication Pattern Initiative
Acknowledgment was given to various people involved with the patterns initiative: Carroll Davis for developing the patterns Web site and refining the statistics collection, Rich Greene for facilitating the Harvard data load of 40,000 records, Robert Bremer for developing the macro to add patterns data, and Frieda Rosenberg for writing the "CONSER Guidelines for input of caption/pattern and holdings data" document and the "SCCTP Serial Holdings Workshop".
The forthcoming ALA preconference, "The Future of Serials Control: Implementation of the MARC21 Holdings Format," Friday, June 15, 2001 was mentioned. The preconference will include a vendor panel discussion which will allow participants to compare and discuss specific serials control features of different ILS products directly with vendors.
Lucy Barron (LC) gave an overview of the development of publication pattern sharing using the MARC Format for Holdings Data (MFHD). The impetus for the current publication patterns initiative comes in part from the predictive check-in workflows required by new integrated library systems. Using the MFHD to record pattern data in bibliographic records allows libraries to share this data rather than each library creating it for each serial. Barron drew a distinction between the serial holdings information one would find in an institution's catalog and the publication pattern data added to bibliographic records. The only holdings data being added to bibliographic records for the Publication Patterns Initiative is a citation of the first issue in hand that displayed the particular pattern given in the record. Some in the audience likened publication pattern data to a more elaborate frequency history. Publication pattern data may be difficult to load or import into an existing ILS at present, but if available in quantity and in a standard format it could be useful for loading into a new ILS.
Elmer Klebs (LC) demonstrated use of the macro created by Robert Bremer. The macro uses data in the record to set up and add the 891 fields in an OCLC record. Klebs pointed out situations where the macro easily adds data and when subsequent editing is necessary.
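In OCLC records, the pattern data is carried in 891 fields that embed caption/pattern (853) and enumeration/chronology (863) data, with subfield $9 identifying the embedded tag. A hypothetical monthly pattern might look like the following sketch (indicator coding omitted and all values invented for illustration):

```
891 ## $9 853 $8 1 $a v. $b no. $u 12 $v r $i (year) $j (month) $w m
891 ## $9 863 $8 1.1 $a 15 $b 1 $i 1998 $j 01
```

The 863 here cites the first issue in hand that exhibited the pattern (v. 15, no. 1, Jan. 1998), which is the only holdings statement the initiative adds to the bibliographic record.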
James Castrataro (Indiana University) discussed workflow considerations for adding publication pattern data. He noted that some ILS's allow the use of templates to generate patterns, so staff adding them do not necessarily need to know the MARC 21 tagging; this is useful if acquisitions or other staff unfamiliar with tagging are going to add the data. Indiana University was able to get started by sharing established pattern data from another local institution. Castrataro told the audience that cataloging staff at Indiana University were responsible for adding the data, with assistance from acquisitions staff. The question was raised whether the workflow should be set up to add patterns to new serials when cataloging from first issues, or whether it should focus on adding patterns to established serials. The answer may depend on which staff are adding the data; in some cases acquisitions or check-in staff may have better access to retrospective pattern information than cataloging staff.
Serials and your OPAC
As a warm-up to the discussion, Hirons asked attendees to gather in groups according to the ILS they use in their libraries, so participants could see which other CONSER members shared their system. ILS's identified were Voyager, III, SIRSI, VTLS, Ex Libris, DRA, GEAC, NOTIS, and HORIZON; the majority used Voyager and III.
Hirons noted that there were some CONSER practices that had been driven by LC's MUMS system that may not be needed now. She asked attendees if they felt it was necessary to continue requiring the use of the 580 note when describing mergers and splits. In answer to this question, the audience made it clear that systems vary in how they handle indicators for linking fields. Several attendees mentioned difficulty in using indicators to generate notes for mergers and splits in their systems and found they need to follow the current CONSER practice anyway. Others have devised in-house shortcuts for generating notes from the linking field indicators and are able to work with the records as they are currently coded. Overall it was felt that it was best to keep the current practice as it stands. Similarly there was an overall feeling that CONSER should continue the practice of adding the 310 field as not many attendees mentioned a preference for generating frequencies from the fixed field.
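Current CONSER practice for a merger, which attendees preferred to retain, pairs an explicit 580 note with linking fields whose first indicator (1) suppresses any system-generated note. A hypothetical example, with titles and control numbers invented for illustration:

```
580 ## $a Formed by the union of: Example journal; and: Example bulletin.
780 14 $t Example journal $w (OCoLC)11111111
780 14 $t Example bulletin $w (OCoLC)22222222
```

Systems that can generate a readable note from the second indicator (here 4, "formed by the union of") could in principle omit the 580, which is why the question was raised; the varied system behavior reported above is what argued for keeping it.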
David Van Hoy discussed MIT's efforts to use linking field control numbers in the local ILS as hot links to records for earlier and later titles and asked whether others were considering this. There was mixed reaction among attendees. Systems base this type of linking on particular control numbers (such as the OCLC number) which work well if the numbers are present and correctly recorded. Some institutions have not settled on the best control number to use for this type of linking. The discussion moved on to record display issues in the OPAC and how these might be improved. Many mentioned that it would be nice to see families of related records display in the OPAC. The ability to do this depends on the type of record control number links that can be made (an example of the VTLS "cascading control number search" was given). Desired features for viewing a family of records would include being able to view at least 780, 785, and 776 links, as well as related holdings information. It was pointed out that if too many links were included, the family tree may become too unwieldy for the OPAC to display.
Other OPAC display issues were discussed including other title information and the part title. Some ILS's index and display this in OPACs in a confusing or inconsistent way.
The discussion of ILS inconsistencies and incompatibilities raised the possibility that CONSER compile a list of the indexing, display, and serials-oriented capabilities that CONSER libraries would like to see in an ILS, together with the vendors that offer them, drawing on requirements lists that are already available. One goal would be to help libraries adequately request features in a prospective ILS and to push vendors to provide needed features.
Participants asked about deleting MARC 21 fields from CONSER records that appear to be obsolete; these include 012 subfield "i" (John Levy will review), 035 fields (per Robert Bremer, all serial 035 fields could be deleted), and the 850 field. It was decided that the 850 could be deleted except when it appears on records for preservation microform masters.
John Levy will review the impact of deleting 012 subfield "i"; the CEG will be updated to show policies for 012 subfield "i" and the 035 fields.
Use of PURLS or other mechanisms for maintenance of URLs
Valerie Bross (University of California, Los Angeles) presented a PURL Pilot proposal developed by Bross and Becky Culbertson (University of California, San Diego). The proposal asks "that a group of CONSER participants test the concept of maintenance of URLs through an OCLC-hosted cooperative PURL server. The pilot would be conducted with the intention that, if successful, a recommendation would be passed to PCC regarding possible use of a PURL server for records maintained by BIBCO/CONSER institutions."
Bross pointed out that PURL servers are already used by some CONSER members on a local or regional scale and the proposal asks if this could be done on a wider scale within the PCC. Advantages to sharing maintenance of a PURL server would be the reduction of maintenance efforts and expense across various local files. The proposal also suggests a liberal approach to the type of material covered by these efforts: "the records would not be limited to serials (BLvl "s"). That is, during the pilot, volunteers could create PURLs for serials, integrating resources, and even monographs if desired."
There would, however, be a need to make some limitations on the type of material that could be covered, such as not including commercially licensed materials that require URLs for log-on pages in the 856.
The question was raised about testing strategies to deal with broken links in the bibliographic record. For instance, could broken URLs be moved to other fields while considering whether the URL is valid? Should there be further documentation in the CCM Module 31 about when to remove broken URLs from records, particularly print records with URLs? Could the PURL Pilot also include testing such strategies?
Bross asked whether CONSER members could express at least tentative interest in participating in the pilot project and received a number of volunteers.
CORC and CONSER: what are we using it for? What are the implications?
The discussion of the PURL pilot flowed smoothly into a discussion of CORC, as there were overlapping issues. CONSER catalogers briefly discussed the link validation feature and other tools associated with CORC, including the ability to harvest resource metadata to create CORC records and the ability to view MARC 21 elements in Dublin Core. One difficulty with using CORC is the inability to enter diacritics in records; this capability will soon be available through a key-mapping feature. GPO's use of CORC, and its involvement in a joint effort with OCLC to archive Web resources and sites via CORC, was briefly described.
Use of URL in other note fields (anticipating implementation)
This discussion of MARBI proposal 2000-06, Defining URI Subfields in Fields 505, 514, 520, 530, 545, 552, and 773, also raised the issue of PURLs and URLs in the context of GPO's use of PURLs. In the single record approach, the PURL is recorded in 856 subfield $u, while the original URL is recorded in a 530 field. Once subfield $u is validated for use in fields other than the 856, catalogers will have more flexibility in recording URLs in the record. The attendees appeared to agree this would be a useful implementation.
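Under the single record approach as described, a GPO record would carry the PURL in the 856 and the original URL in a 530 note; once subfield $u is defined for the 530 per the proposal, the original URL could become machine-actionable there as well. A hypothetical sketch, with invented URLs:

```
530 ## $a Also available via Internet from the agency Web site.
       $u http://www.example.gov/reports/annual/
856 40 $u http://purl.example.gov/agency/annual
```

The 856 PURL remains the maintained access point, while the 530 preserves the original location for reference.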
OLAC report on terminology for title source notes, etc
This report was not discussed at length but Diane Boehr mentioned that there were going to be some corrections made to the journal title sources section of the document. The CONSER e-serials specialists group will consider this report as a resource in future updates to CCM Module 31.
CONSER visioning ideas from the CONSER At Large Meeting January 14, 2001: next steps and action items
The first challenge from the visioning exercise dealt with the possibility of CONSER broadening its scope to include non-English-language, non-MARC, or non-AACR2 records, and also whether to include non-cataloging activities (e.g., indexing journals not covered by standard indexes). Discussion centered mainly on inclusion of non-English-language records, given the interest of the Hemeroteca Nacional de México in officially becoming a CONSER member. Hirons pointed out that the model of multilingual records already exists for Canadian bilingual publications. In the past, the need for English-only records has kept Latin American countries from applying. Other stumbling blocks are access to OCLC, NACO membership, and the need for Spanish-language documentation. [A Spanish version of the CCM is currently being written.] The formation of the MARBI Multilingual Record Task Force, which is dealing with records in multiple languages, was mentioned; however, its focus appears to be on authority records rather than bibliographic records.
Concerns included subject headings and notes that would appear in non-English languages. The problem of duplicate records was also raised: records in many different languages could be created for the same work, even when the work itself is in only one language (unlike the French and English Canadian records for bilingual publications). A suggestion was made that record duplication might be more of a problem for OCLC and the other utilities if multilingual records were contributed.
The next visioning topic concerned CONSER standards for quality and quantity and whether these standards are barriers to increased participation or retention of existing members. What other factors are potential barriers to CONSER participation, and how can they be overcome? The discussion indicated that the quantity of records required was sometimes a problem, particularly for smaller institutions. It was suggested that some sort of scaling factor be used, based on the size of the institution; a larger number of institutions, each contributing fewer records, could increase record contributions without harming record quality. Ideas for adding more institutions included having state libraries take over maintenance of newspaper records at a less-than-full or associate level. A possible barrier is that such libraries are not all NACO members, though perhaps a minimal-level approach could be taken for such a project. The requirement for NACO participation was mentioned as a barrier in other aspects of the discussion as well.
Other ideas for widening CONSER participation and record contribution included targeting new members for particular collection strengths or language expertise. The model of funneling record contributions from a variety of regional or special libraries was suggested. Judy Knop (ATLA) described a project she has been working on to authenticate records contributed by several theological libraries. The funnel model would add unique, high-quality records to CONSER. A possible barrier is that the original contributors would not receive OCLC credit for CONSER authentication, though they presumably received an OCLC credit when creating the record. Sue Fuller (University of Texas) is aware of several libraries that might be willing to contribute to a Latin American funnel through the University of Texas.
Hirons and Hawkins will work with Operations representatives and PCC Policy members to revise CONSER applications and discuss revising quantity standards. Guidelines for CONSER funnel projects may also be needed.
A final topic for the afternoon touched on the need for CONSER to develop a closer relationship with reference and acquisitions librarians. This will be important for many of the issues discussed at this year's operations meeting, including developing guidelines for cataloging integrating resources, the multiple versions problem, developing better OPAC displays and ILS features, and the publication patterns project. It was suggested as a start that a liaison from the reference community (e.g. someone from RUSA, the Reference and User Services Association) be invited to address the CONSER at Large meeting at ALA, and that a CONSER representative could in turn address a meeting of a group like RUSA.
Hirons will explore the possibility of having a RUSA representative address CONSER at Large.
Next Joint CONSER/BIBCO meeting
Hirons asked the group how they liked the three day structure. There was overwhelming approval and participants said they had little problem getting institutional support. With this in mind, Hirons suggested a three day meeting, May 1-3, 2002, which will include an AACR2 workshop to acquaint catalogers with changes in the rules.