D-Lib Magazine
David Green
Introduction

Museums today are increasingly active partners with libraries, archives, historical societies and others in building digital libraries. The Museum Computer Network (MCN) has been a leading agent in museums' use of computers, from improving operational efficiency to animating collections and connecting them to a wider world. MCN's seminal 1997 anthology, The Wired Museum [1], brought together the key issues for museums moving online. Seven years later, its 32nd annual conference (Great Technology for Collections, Confluence and Community) was an occasion to measure progress.

"Confluence" is certainly key to museums' ability to collaborate successfully with other institutions, as well as to manage the internal shifts that going digital inevitably brings to their own organizational structures. Other strong themes at the conference were the preeminence of audiences (understanding them and evaluating institutional performance in supplying what they need), the rise of service providers, the responsibility of larger institutions to assist and collaborate with smaller ones, and the key role of automation and software tools in meeting the complexity and expense of going digital.

Evaluation, Leadership and New Metrics

One of the most compelling issues for this constituency has been how to give museum leaders an incentive to participate fully in the online revolution. Digital has often been an organizational sideshow, and digital projects too rarely influence the larger picture of how the museum is organized and driven. Keynote speaker Max Anderson, founder and president of the Art Museum Network, knows these struggles intimately as a leader in demonstrating the benefits of ICT, most recently as director of Manhattan's Whitney Museum of American Art. Anderson's challenge to conferees was that, to influence the boardroom, they should develop software serving new metrics that distill audience response and quantify museums' effectiveness in impacting the community. Funders increasingly demand proof of social impact, and legislators are about to do so too. Anderson cautioned that it would be better for museums to evaluate their performance with cutting-edge tools than to have government attempt it. He advised museums to prioritize their role as educational institutions, no matter how unfashionable, bearing in mind how analogous museums can be to universities [2].

Evaluation of museums' effectiveness in reaching audiences is certainly underway. The Canadian Heritage Information Network (CHIN) has demonstrated great success in aggregating and integrating content, and in responding to audience behavior, with its Virtual Museum of Canada [3]. CHIN recently established a Research and Business Intelligence unit, demonstrating the shift from a "content-centered" to an "audience-centered" approach. The unit's work was presented by Kim Gauvin in a session on the future of the virtual museum. The new team examines how performance can best be measured, what kinds of audience statistics are most useful and how best to analyze user feedback. As Anderson had protested, audience statistics usually measure raw numbers of site visits, but the new move is to disaggregate the numbers. CHIN's "Engagement Factor" divides the number of visits to a specific online exhibit by the number of visitors it received, and then multiplies this by the average duration of a visit to that exhibit. This new statistic has proved very effective, especially when combined with larger traffic patterns.
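To make the arithmetic concrete, here is a minimal sketch of that calculation. The class and field names are illustrative assumptions, not CHIN's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class ExhibitStats:
    """Raw web statistics for one online exhibit (illustrative fields)."""
    visits: int               # total visits the exhibit received
    unique_visitors: int      # distinct visitors it received
    avg_visit_minutes: float  # average duration of a visit, in minutes

def engagement_factor(stats: ExhibitStats) -> float:
    """Engagement Factor as described at MCN 2004:
    (visits / visitors) * average visit duration."""
    if stats.unique_visitors == 0:
        return 0.0
    return (stats.visits / stats.unique_visitors) * stats.avg_visit_minutes

# Example: 12,000 visits by 8,000 visitors averaging 4.5 minutes each:
# (12000 / 8000) * 4.5 = 6.75
print(engagement_factor(ExhibitStats(12000, 8000, 4.5)))
```

Read this way, an exhibit that draws repeat visits and holds visitors longer scores higher than one with many brief, one-off hits, which is exactly the kind of disaggregation Anderson called for.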
CHIN also analyzes user feedback, using a typology and database to quantify this qualitative material. Evaluation tools currently include formal and informal user observation, web statistics, focus groups, interviews and surveys, usability labs, online feedback pages and external specialist review. In a session dedicated to evaluation, James Ockuly, who employs these tools at the Minneapolis Institute of Arts (MIA), reviewed them all and emphasized two key lessons: ongoing evaluation with multiple toolsets produces more telling and trustworthy results, but "too much" data is, not surprisingly, a problem. Ockuly's case study concerned an underused digital museum directory in a kiosk at the entrance of the MIA [4]. By measuring audience awareness, use and satisfaction (through visitor surveys, staff interviews and a technical evaluation), the museum refocused the directory's content on immediate (as opposed to future) activities, integrated material from the web and the collections management system, and placed other kiosks at "decision points" around the museum, improving use dramatically.

David Schaller presented the results of research he felt compelled to do after eight years of producing Eduweb's award-winning educational websites. He showed dramatically diverse responses to web presentations of material for different audiences, ranging from direct information delivery (via interactive reference or guided-tour formats) through more indirect delivery (role-playing, puzzles and interactive mysteries) to simulation. His broad-brush conclusions included that a "general public" doesn't exist and any producer needs to research target audiences closely; that the web is used for very different purposes by different population segments; and that expectations for a museum website visit are much more wide-ranging than for a museum visit.

A much more complex project is the two-year-long evaluation of the Museums and the Online Archive of California (MOAC) site (aggregating collection- and item-level material from libraries, museums, historical societies and archives of California), presented by Layna White. The study confronts many issues about both its aggregated content and its multiple users (K-12 teachers; university students; academics in the humanities and social sciences; and museum professionals, librarians and archivists), including their surprisingly low level of engagement in an evaluation survey. Project leaders aim to complement some of the evaluation findings of the parallel Perseus Digital Library, which, with its very diverse collection of digital objects (at item and collection levels), concluded that the key factors of success were creating very well-structured information objects and understanding the nature of humanities research and researchers' behavior [5].

Museums and Education

Efforts at understanding the interaction between the educational community and museums were also represented in a Museums & Education session presenting the offerings of ARTstor, the Cleveland Museum of Art's online archeological game Dig In (designed for children) and a Scottish report on educational use of museum resources. ARTstor is the Mellon Foundation's project licensing image collections for educational use only. Nancy Allen reported that, while continuing its higher-education focus, ARTstor is exploring K-12 applications through a pilot project with twelve schools, following the report ARTstor & the K-12 Education Community [6].
Trying to discover how educational institutions, large and small, urban and remote, use online museum resources in Scotland led Jim Devine of Glasgow University's Hunterian Museum to propose a study to the Scottish Museums Council that would improve communication between the suppliers and users of such material, especially in determining what would work most effectively. The report, "What Clicks?", includes compelling video interviews with teachers and students in remote areas such as Barra in the Outer Hebrides, for whom remote access and shared resources are key [7].

The User-Driven Virtual Museum

The same impulse to share resources in remote communities through ICT was at the heart of the Community Museum Project, set on Washington State's northwest Olympic Peninsula. The project was notable in being community-driven: while instigated by the University of Washington (bringing its libraries, Center for the Study of the Pacific Northwest and Burke Museum of Natural History into play), local institutions were very active participants [8]. The project's main goal is to preserve and share indigenous culture and traditions at a time when their survival is at risk. With further goals of creating an access model for the information-poor and a cost-effective model for sustaining lifelong learning, it was hoped the project would foster greater unity among the many participating groups. The Peninsula's high level of connectivity helped overcome the great distances involved in working together, but the Seattle-based librarians still had difficulties working with volunteers from diverse communities and had to learn to deal with local enmities (e.g., between the Makah and Quileute tribes), language gulfs, local suspicion of the University, and the 18 months spent getting permission to digitize and post material. Tribal concepts of intellectual property ownership are at odds with US copyright law (only a song's owner can perform it), and some images could only be scanned "at the kitchen table". However, the Community Museum Project also created immense local enthusiasm and energy: scanning training sessions and photo-identification booths at festivals and farmers' markets produced several formerly unknown image collections and, from the elderly "West End pioneer" local historians, suggestions for 55 digital exhibit topics. The project will produce 12,000 digitized images and 6 online exhibitions, each accompanied by curriculum materials and teacher workshops.

Such a community-driven aggregation of materials, where (at least initially) the creators are the users, was surprisingly pertinent to the session on the Virtual Museum of Canada (VMC): The Next Generation, dedicated to the study paper issued by CHIN [9]. Howard Besser, one of the report's authors, made clear that the report was designed to stimulate thinking about all virtual museums, not just the VMC. Aspects Besser highlighted included: thinking through the implications of merging digital and analog, physical and virtual museum experiences; the lack of a mass audience for virtual museums, necessitating the discovery (or creation) of many specific audiences; and equipping visitors to create their own individualized virtual museums.
Traditionally elite, highly interpreted and curator-driven, museums will be more user-driven in their "virtual" rendition, allowing users to create pathways through collections and exhibits, to contribute and manage content, to interact and communicate with other users and to employ interfaces relevant to their information needs and backgrounds. The virtual museum will need to support (and offer tools to assist with) many different re-organizations of content, transparently collocated [10]. The "near-future" virtual museum thus needs to be an authoritative but open and active set of spaces for interpretation, expression and sharing of views.

New Tools for Collaborative Aggregation and Discovery

This discussion leads to the biggest cluster of sessions, around the means to achieve sustainable aggregations of digital content that can be viewed and used as seamlessly as possible. John Perkins introduced a session on Museums & the Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH). Under Perkins's direction, the CIMI Consortium in 2001 built a testbed repository of some 50,000 metadata records from over 20 museums [11]. The testbed worked, but though comparatively simple, it was still too difficult for most museums to contribute to because of their diverse information management practices and capabilities. The Berkeley Art Museum's Richard Rinehart thought OAI might still be a route for getting more museums to participate in aggregating projects: getting Dublin Core metadata into an OAI repository remains easier than participating in the centralized aggregating projects the community has seen to date, in which only around 150 of the 16,000 museums in the country have participated. Advocating simpler or "lite" standards to allow for greater participation, Rinehart introduced a handy tool in the MOAC Community Toolbox that can export museum metadata in EAD XML or METS XML formats, among others [12].

Relevant to this effort is Emory University's ongoing (2001-2007) MetaScholar Initiative, which, collaborating with museums, libraries and archives, provides services to scholars; it was described by Martin Halbert, Emory's Director of Library Systems. A component of the initiative is "Metadata Gardening," which extends the harvesting model to include the back-end creation, conversion and enhancement of local metadata for inclusion in repositories. One of the MetaScholar projects, The Music of Social Change, explored how music and civil rights material interacted; in sharing metadata between libraries and museums, project participants quickly discovered the need for tools to deal with non-standard and idiosyncratic metadata. Emory's Metadata Migrator is one response: software that uploads and enhances material from a desktop to create OAI-compliant metadata [13].

Why participating in an OAI repository can be so difficult was graphically illustrated by Timothy Cole [14] in a talk on the Institute of Museum and Library Services (IMLS) Digital Collections & Content (IMLS-DCC) project to build a harvester for material from IMLS-funded digital projects. Harvesting appears a cost-effective way to collect material from diverse sources and to expose it to wider audiences. However, it is proving a great challenge, given the huge diversity of IMLS grantees: their metadata formats, technical skills, technology infrastructures and information management traditions (including the sadly classic confusion over whether catalogers are describing an original object or its digital surrogate).
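Part of OAI-PMH's appeal is how little machinery a harvesting round trip requires. Below is a minimal sketch of a single ListRecords request for unqualified Dublin Core; the endpoint URL is a placeholder, and a production harvester such as the IMLS-DCC's would add error handling and follow resumption tokens to page through a large repository.

```python
import urllib.request
import xml.etree.ElementTree as ET

# Placeholder endpoint: substitute a real OAI-PMH base URL.
BASE_URL = "https://repository.example.org/oai"
OAI = "{http://www.openarchives.org/OAI/2.0/}"
DC = "{http://purl.org/dc/elements/1.1/}"

def harvest_titles(base_url: str) -> list[str]:
    """Issue one ListRecords request for unqualified Dublin Core
    (metadataPrefix=oai_dc) and pull out each record's dc:title."""
    url = f"{base_url}?verb=ListRecords&metadataPrefix=oai_dc"
    with urllib.request.urlopen(url) as response:
        tree = ET.parse(response)
    titles = [t.text for t in tree.iter(f"{DC}title") if t.text]
    # A full harvester would loop on the resumptionToken element
    # to retrieve the remainder of the repository.
    token = tree.find(f".//{OAI}resumptionToken")
    if token is not None and token.text:
        print(f"More records available; token: {token.text}")
    return titles

print(harvest_titles(BASE_URL))
```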
The main challenge was getting the metadata right and getting it into XML. Overall, Cole saw OAI-PMH as a lowest-common-denominator approach to sharing and interoperability: insufficient for some high-level, domain-specific applications, but useful for sharing across more heterogeneous communities because it allows participation with less technology. OAI-PMH metadata harvesters can normalize and augment metadata before sharing it with domain-specific federated search portals. Currently in alpha, the IMLS-DCC project is harvesting 27 collections with some 200,000 records, analyzing the metadata, documenting practices, and looking at the potential for normalization and the implications for interface and search-engine design.

An alternative to metadata harvesting is metasearch, also known as federated search, which searches across multiple databases, platforms and protocols, using metadata mapped to a common element set, such as Dublin Core, and a flexible search engine, such as FAST [15]. Murtha Baca and Karim Boughida, both of the Getty Research Institute, opened a session on metasearch by introducing the Getty Trust's ongoing pursuit of seamless integrated access to its heterogeneous resources. Currently, a searcher on the Getty's "Conducting Research" page is confronted with twelve different resources, each with its own search path, described in language that is not clear to many users (what is the difference, for a first-time user, between Special Collections and Digitized Library Collections?) [16]. Metasearch could make a single search of the Getty's resources possible. Its benefits are that it happens in real time (harvested data can be out-of-date) and that results are easier to sort and de-dupe (though they can be difficult to contextualize). Overall, metasearch appears the most promising method of providing integrated access to diverse resources in real time; a minimal sketch of the pattern appears below. Sara Randall, who works with ENCompass, one of the leading metasearch tools, reported on the recently established NISO Metasearch Initiative, designed to provide more effective and responsive metasearch services and to assist content providers in delivering enhanced content. Museums are not represented on the initiative (although RLG and ARTstor represent museum material), and the presenters at this session knew of no museums working on metasearch. However, Charles Lockwood reported on the success to date of metasearch at Loyola Notre Dame University in getting truly heterogeneous returns of citations and digital objects: books, dissertations, video, catalogs, etc. [17].

Automation continued as the main theme in a session on implementing metadata. Kurt Bollacker, from the Long Now Foundation, dedicated to long-term thinking especially with regard to preservation, encouraged institutions to employ tools that give quick wins to those creating metadata, favoring initially simple systems that at least produce metadata usable in format conversion. The Long Server system is a prototype desktop digital preservation system, dedicated to pursuing universal file-format conversion but in the meantime built for basic user functionality (using, for example, an autocomplete tool to suggest new fields from contextual information). Bollacker called for a national collaborative database of metadata schemas [18].
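To ground the metasearch discussion above, here is a minimal sketch of the pattern: query several sources concurrently, map each native result into a common element set (Dublin Core here), then merge and de-duplicate in real time. The sources and field names are invented for illustration; production tools such as ENCompass negotiate Z39.50, SRU and assorted HTTP APIs rather than calling local Python functions.

```python
import concurrent.futures

# Invented stand-ins for heterogeneous sources; a real metasearch tool
# would speak Z39.50, SRU/SRW or vendor HTTP APIs instead.
def search_library_catalog(query: str) -> list[dict]:
    return [{"ti": "Rembrandt drawings", "au": "Benesch, Otto"}]

def search_image_database(query: str) -> list[dict]:
    return [{"caption": "Rembrandt drawings", "maker": "Benesch, Otto"}]

# Per-source mappings into a shared Dublin Core-style record.
def map_catalog(rec: dict) -> dict:
    return {"dc:title": rec["ti"], "dc:creator": rec["au"]}

def map_images(rec: dict) -> dict:
    return {"dc:title": rec["caption"], "dc:creator": rec["maker"]}

SOURCES = [(search_library_catalog, map_catalog),
           (search_image_database, map_images)]

def metasearch(query: str) -> list[dict]:
    """Query every source in parallel (metasearch is live, unlike
    harvesting), normalize to Dublin Core and de-dupe on title."""
    merged, seen = [], set()
    with concurrent.futures.ThreadPoolExecutor() as pool:
        futures = [pool.submit(search, query) for search, _ in SOURCES]
        for (_, mapper), future in zip(SOURCES, futures):
            for rec in future.result():
                record = mapper(rec)
                key = record["dc:title"].lower()  # crude de-duplication
                if key not in seen:
                    seen.add(key)
                    merged.append(record)
    return merged

print(metasearch("rembrandt drawings"))
```

The trade-offs the panel described fall straight out of this structure: results are as current as the sources themselves, but response time is bounded by the slowest source, and de-duplication and contextualization must happen on the fly.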
Back in the session on implementing metadata, Mary Elings, Archivist for Digital Collections at UC Berkeley's Bancroft Library, introduced two tools being developed by the community to help automate the production of the many types of metadata, often written to multiple standards, required by repositories such as Berkeley's. One is the soon-to-be-released Archivists' Toolkit, which, in addition to managing collection processing, accessioning, description, resource location and provenance registration, will automate the production of EAD-encoded finding aids and METS records; the other is the MOAC Community Toolbox, described by Richard Rinehart in another session, which automatically outputs data in EAD XML and METS XML formats [19].

RLG's Günter Waibel, while referencing useful metadata-extraction tools such as Adobe's Extensible Metadata Platform (XMP) and the National Library of New Zealand's Metadata Extractor [20], which mines and displays TIFF header metadata, reminded the audience both of the demands of the recent NISO Z39.87 Technical Metadata for Digital Images standard (with 124 elements, 39 of them mandatory) and of the metadata-holding capacity of JPEG2000. Such demands necessitate automation [21]. One answer is RLG's Automatic Exposure initiative, which works with manufacturers to encourage products that automatically capture technical metadata and make it available for transfer into digital repositories and asset management systems [22]. As Waibel put it, for preservation especially, "the more you know about an object (and use it) the easier it is to preserve."

Collaborative Preservation

This brings us finally to one of the richest sessions, "Achieving Digital Preservation Through Collaboration." OCLC's Judy Cobb ably reviewed some national collaborative preservation initiatives: the PREMIS working group (developing strategies for implementing preservation metadata); the RLG-NARA Digital Repository Certification Task Force (ensuring trustworthy repositories); the National Digital Information Infrastructure and Preservation Program (NDIIPP), which helps institutions identify web content to be placed in several repository systems; and OCLC's Preservation Policy document, featuring a handy file-format stability index. Others spoke to their local experience [23].

University of Minnesota Associate Librarian Eric Celeste, somewhat spooked by the definition of collaboration as "traitorous cooperation with the enemy," spoke of his sense of the differences between libraries, which let you "take material away with you"; historical societies, which focus on "the local culture"; and museums, which concentrate on context and story. The Minnesota Digital Library Coalition (MDLC), in his opinion, is doing a good job of bringing these different institution types together, expanding the notion of the digital "library" [24]. St. Cloud State University Librarian Keith Ewing, stimulated by discovering all his own family's memorabilia in a small-town archive, spoke about leveraging skills and resources throughout a community to make a digital library hum. Working at a comparatively small institution, he was engaged in the MDLC alongside scores of even smaller partners, working with the world-famous Minnesota Historical Society (MHS) and the behemoth University of Minnesota Library. The opportunity was to enable people to create their own narratives, supply their own metadata for material and create a community of participation throughout the state for the benefit of education and the public.
But, echoing the experience of Washington State's Community Museum Project (see above), he was surprised by the degree of resistance and suspicion the smaller institutions exhibited toward the larger ones (here, the University of Minnesota): one of the most difficult tasks was establishing trust between different kinds and sizes of institutions. Within the MDLC, agreeing on common language to describe, discuss and define the mission of the project took far too much time, and many institutions (such as the Minneapolis Institute of Arts and the Walker Art Center) drifted away from the project. The team also worked hard to build its own metadata standard, only later deciding to use the metadata standards and best practices adopted by the Colorado Digitization Project.

Sam Quigley, Director of Digital Information and Technology at Harvard University Art Museums, appreciates that the more than four terabytes of material at his home institution are in good hands from a digital preservation perspective. Curious about how other museums were faring, he surveyed preservation and related policy development at nine major art museums and found that none had digital-specific policies (as opposed to procedures). The museums had between 4,000 and 60,000 digital images of original works (typically 300 dpi TIFFs with four derivatives), with an expected longevity of three to five years. For longevity beyond ten years, the typical plan was to start planning; Quigley found absolutely no preparation for costing any of this and few, if any, funding models. The current trends are to focus on assessment methodology and to fold the whole digital preservation process into existing preservation efforts. One commentator pointed out that only the largest libraries are proceeding as pioneers with digital preservation, implying that smaller museums had best find a large partner to work with. Another confessed that material gets lost (especially when personnel change and when there is no file-naming protocol), and another commented that institutions often only discover they have lost a digital image when they go to use it. Serious digital asset management and institutional re-organization are key.

Bob Horton, State Archivist at MHS, presented a very pragmatic model for a collaborative digital preservation program, derived from his e-government experience with its more hard-nosed mandate to produce user-defined benefits. He noted the evolutionary service phases an e-government provider moves through: from providing information and enabling simple transactions, to offering web services that authenticate users, integrating web services that share data, and delivering customer-centered web services that bundle and channel customizable data. Throughout this process new skills are demanded, and he commented that what the cultural nonprofit arena typically needs is far more project and risk management experience; awareness of incremental development (not hankering for perfection each time); building infrastructure through partnerships; responding to adoption rates; and adding value to services. Horton shared some conclusions drawn from his involvement in the $360 billion tobacco settlement case, which involved processing 20 million pages of tobacco records, separately digitized by the tobacco companies. While only a small group of people could use the records, the records would have worldwide impact, and international user videoconferences helped determine what the needs were.
While librarians demanded the records first be standardized, users most wanted "immediate access" that could be functionalized later. Horton's conclusions were that users are critical; that niche markets can be very powerful; and that upfront preservation demands could be problematic and are better postponed until demand is clear. Technology, in Horton's experience, is rarely the problem. The problem is making our organizations change so that the technology works: organizational change is key. He suggested that the nonprofit community needs to shift its focus from digitization projects to implementing the institutional changes necessary for large-scale digital production.

Conclusions

Although many museums still go it alone, the conference was inspiring in showing how many realize that collaboration, both with each other and with other types of institutions, is increasingly the key to successfully mobilizing digital resources. Large and small can help each other with technical expertise, products, tools, information and content. Good cataloging and accurate, knowledgeable description according to appropriate standards are still a basic requirement for any digital collection, but once that is done, there are new tools to assist in bringing an institution's riches out into a wider and more diverse world. Competition may still be present, but with the greater aggregation of material from many sources, making one's own material attractive to aggregators, virtual museums, exhibit designers, educators and others seems the smarter route. Evaluating the effectiveness of one's material and strategies, and acknowledging the diversity of one's audiences, also seems smarter, as private funders and government are increasingly interested in the returns they are getting on their investment. While there are still very large internal organizational mazes to negotiate, costing and pricing mysteries to unravel and the perpetual funding problem, museums now seem on the right track in their approach to working with others to share the richness of our intellectual and cultural heritage.

Notes and References

Slides from many of the presentations mentioned above are available at the Museum Computer Network website at <http://www.mcn.edu>.

[1] The Wired Museum: Emerging Technology & Changing Paradigms, edited by Katherine Jones-Garmil; introduction by Maxwell Anderson. Washington, DC: American Association of Museums, 1997.

[2] See Anderson's "Defining Success in Art Museums," in Art Museum Network News, October 2004; adapted from "Metrics of Success in Art Museums," a forthcoming paper developed at the request of the Getty Leadership Institute (GLI). <http://www.amnnews.com/view_10_2004.jsp>.

[3] Launched in 2001, the Virtual Museum of Canada <http://www.virtualmuseum.ca/> has an average of 500,000 visits a month to its half-million images from 2711 Canadian museums, in some 200 exhibits, 650 education modules, an image gallery, a local history site, and a games site.

[4] See the online version of the MIA directory at <http://artsmia.org/directories/>. This project was part of a larger IMLS-funded project, What Clicks?, assessing MIA's audiences' relationship with the Institute's interactive media and web resources; see <http://www.artsmia.org/what-clicks/>.

[5] For more on MOAC, see <http://www.oac.cdlib.org/>; the study can be found at <http://www.gseis.ucla.edu/~moac>, and Perseus is at <http://www.perseus.tufts.edu>. For more on the design and evaluation work on Perseus, see G. Crane, et al., "Drudgery and deep thought: Designing a digital library for the humanities," Communications of the Association for Computing Machinery, 44(5), 35-40 (2001). Available at <http://www.perseus.tufts.edu/Articles/cacm2000.pdf>.

[6] ARTstor and the K-12 Education Community, <http://www.artstor.org/info/news/ARTstorK-12.pdf>.

[7] For information on the What Clicks? report, see <http://www.hunterian.gla.ac.uk/what_clicks/>.

[8] Local participants included the Clallam County Historical Society, the City of Forks, Peninsula College, the Forks Timber Museum, the Forks Chamber of Commerce, the Hoh, Quileute, and Makah Tribes and the Makah Museum, as well as the North Olympic Library System. For a full list of participants and more information on the project, see <http://content.lib.washington.edu/communitymuseum/groups.html>.

[9] Virtual Museum (of Canada): The Next Generation, by Steve Dietz, Howard Besser, Ann Borda, and Kati Geber with Pierre Lévy. Canadian Heritage Information Network, 2004. <http://www.chin.gc.ca/English/Members/Next_Generation/pdf.html>.

[10] The report and Besser frequently cite Christine Borgman's paper, "Personal digital libraries: Creating individual spaces for innovation," Wave of the Future: NSF Post Digital Library Futures Workshop, June 15-17, 2003. <http://www.sis.pitt.edu/~dlwkshop/paper_borgman.html>.

[11] For material on CIMI's Metadata Harvesting project, see <http://www.cimi.org/wg/metadata/>.

[12] For material on the MOAC Community Toolbox, see <http://www.bampfa.berkeley.edu/moac/community_toolbox.html>.

[13] MetaScholar home page, <http://metascholar.org/>.

[14] For Tim Cole's PowerPoint presentation, see <http://imlsdcc.grainger.uiuc.edu/Cole_MCN2004_OAI.ppt>.

[15] See <http://www.fastsearch.com/>.

[16] Conducting Research at the Getty, <http://www.getty.edu/research/conducting_research/>.

[17] For the NISO Metasearch Initiative, see <http://www.niso.org/committees/MetaSearch-info.html>. For an overview of the initiative, see the article by committee co-chair Andrew Pace, "Much Ado about Metasearch," American Libraries Online, June/July 2004. American Library Association. <http://www.ala.org/ala/alonline/techspeaking/techspeak2004/Junejuly2004muchado.htm>.

[18] See a related PowerPoint presentation on this subject by Kurt Bollacker at <http://www.diglib.org/forums/fall2004/bollacker1004_files/frame.htm>.

[19] The Mellon-funded Archivists' Toolkit is a collaborative project of the libraries of the University of California, San Diego, New York University and Five Colleges, Inc.; see <http://euterpe.bobst.nyu.edu/toolkit/>. For the MOAC Community Toolbox, see Note 12 above.

[20] See the Adobe Extensible Metadata Platform, <http://www.adobe.com/products/xmp/main.html>, and the National Library of New Zealand's Metadata Extractor, <http://www.natlib.govt.nz/files/Project Description_v3-final.pdf>.

[21] For an introduction to the NISO standard for Technical Metadata for Digital Images, see <http://www.niso.org/committees/committee_au.html>; the trial standard itself is at <http://www.niso.org/standards/resources/Z39_87_trial_use.pdf>; and for information on the XML schema being developed at the Library of Congress to provide a format for interchange and storage of the data specified in the draft standard (NISO Metadata for Images in XML, NISO MIX), see <http://www.loc.gov/standards/mix/>.

[22] For RLG's Automatic Exposure initiative, see <http://www.rlg.org/en/page.php?Page_ID=2681>.

[23] Further information on PREMIS (PREservation Metadata Implementation Strategies), an OCLC-RLG working group developing recommendations and best practices for implementing preservation metadata, is at <http://www.oclc.org/research/projects/pmwg/>; on the RLG-NARA Digital Repository Certification Task Force, at <http://www.rlg.org/en/page.php?Page_ID=580>; and on the National Digital Information Infrastructure and Preservation Program (NDIIPP), at <http://www.digitalpreservation.gov/>.

[24] The Minnesota Digital Library (MDL) Coalition <http://www.mndigital.org/> includes libraries, archives, historical societies, and museums in its work to create a Minnesota Digital Library, "providing a server and database environment and imaging support" as well as funding the Minnesota Electronic Resources in the Visual Arts (MINERVA) and its annual symposium.
Copyright © 2004 David Green
doi:10.1045/december2004-green