Focus on presentations on electronic resources at the Library Assessment Conference (August 4-6, 2014) – Seattle

2014 Library Assessment Conference: Building Effective, Sustainable and Practical Assessment

By Martha Kyrillidou, Steve Hiller and Amy Yeager

Submitted to Journal of Electronic Resources Librarianship, E-Resource Round Up, Vol. 26, No. 4

The fifth gathering of the biennial Library Assessment Conference took place in Seattle, August 4-6, 2014, with accompanying pre- and post-conference workshops. The Conference attracted more than 600 professionals this year in an energetic gathering on the University of Washington campus. The event, cosponsored by the University of Washington Libraries and the Association of Research Libraries, began in 2006 and has tripled in size since then. Assessment has become an essential activity in many libraries as they engage actively in transforming their services and focus on what is important and relevant for increasing the value delivered to library users.

As the largest conference of its kind in the world, the Library Assessment Conference has helped build and foster an enthusiastic community that is deeply committed not only to improving libraries but also to documenting their contributions to individual and institutional success. This year the organizers broadened the range of presentations to include panels and lightning talks in addition to papers, posters, workshops and keynote speakers. These design innovations were highly successful, with the lightning talks among the most popular sessions. The result was a deeper sense of learning, engagement and interaction among presenters and participants. The range of formats also allowed the Conference to cover topics, such as public libraries, that had not been addressed before.

A number of papers are of interest to the audience of this journal; below we highlight short abstracts of some of the most relevant ones:

Using Bibliometrics to Demonstrate the Value of Library Journal Collections
Christopher Belter, National Institutes of Health Library
Neal Kaske, National Oceanic and Atmospheric Administration — NOAA
As budgets shrink and journal costs rise, libraries face increasing pressure to justify their journal subscription decisions. In response, Belter and Kaske demonstrated the value of the NOAA libraries’ journal subscriptions by analyzing the cited references from recent journal articles written by NOAA-affiliated authors.

Measuring Impact: Tools for Analysing and Benchmarking Usage
Jo Lambert and Ross MacIntyre, Mimas
Angela Conyers, Evidence Base
The Journal Usage Statistics Portal (JUSP) and Institutional Repository Usage Statistics (IRUS-UK) are services developed in response to the requirements of academic libraries in the UK. These services enable librarians and repository managers to exploit usage data in order to gain insight into use of their collections, inform decisions, enable development of policies, and assess and demonstrate value and impact. The session provided an overview of these services and highlighted their value and benefits to institutions. Recent qualitative research conducted with users of these services in the UK was outlined during the session through a series of use cases.

Time for “Alt Metrics” to Drop the “Alt”: Developing a Standards Foundation for Alternative Assessment of Scholarship
Todd Carpenter and Nettie Lagace, National Information Standards Organization — NISO
The NISO Altmetrics Initiative is a two-phase project, funded by the Alfred P. Sloan Foundation, that seeks to study, propose and develop community-based standards and/or recommended practices in the field of alternative metrics. The first phase of the project gathers three groups of invited experts in the fields of alt metrics research, bibliometrics, traditional publishing, and faculty assessment for in-person discussions to identify key issues in this area and determine which ones may be best addressed through standards and/or recommended practices. This paper covered the output of the first three in-person discussion groups and the community prioritization effort undertaken in summer 2014, and outlined potential next steps.

Downloads and Beyond—New Perspectives on Usage Metrics
Peter Shepherd, COUNTER
Carol Tenopir, University of Tennessee, Knoxville
Marie Kennedy, Loyola Marymount University
This panel discussed the advantages and limitations of the COUNTER-based metrics, the extent to which it is useful to develop such metrics further, and the approach being taken by the Beyond Downloads project to understanding the impact of article sharing on online journal usage. The Beyond Downloads research project is seeking to:
• Define ways to measure non-download usage of digital content
• Evaluate and measure the relationship between COUNTER-defined usage and usage of digital articles obtained through other means, notably via shared content, taking into account differences by stakeholder groups
• Develop practical methodologies and heuristics for estimating total digital article usage as a function of known downloads and non-download usage
• Design a usage multiplier that could be used to reweight total measured usage towards a more accurate measure of total digital usage that varies by subject discipline and other factors
• Initiate discussion across the publisher, STM research, and library communities regarding these issues.

Usage Statistics of Electronic Resources: A Comparison of Vendor Supplied Usage Data (Lightning Talk)
Kanu Nagra, Borough of Manhattan Community College, City University of New York — CUNY
This presentation analyzed and compared vendor-supplied usage reports for the 165 diverse electronic resources available to the Borough of Manhattan Community College (BMCC) in the City University of New York (CUNY): the types of reports available, data categories and definitions, access platforms, compliance with local, national and international standards, sufficiency of the data, and the extent of data mining possible.

Running the Numbers: Evaluating an E-Book Short-Term Loan Program for Cost-Effectiveness (Lightning Talk)
Brendan O’Connell, John Vickery, North Carolina State University
This study explored whether an e-book short-term loan (STL) program would be a cost-effective model for the North Carolina State University libraries. Using e-book demand-driven acquisition (DDA) purchase reports and usage data from the previous three years, the authors ran various hypothetical models in SAS statistical analysis software to determine whether the NCSU libraries would have saved money by using STL as an alternative to DDA for these titles.

E-Book Reading Patterns of Faculty: A Lib-Value Project (Lightning Talk)
Lisa Christian, Carol Tenopir, University of Tennessee
Donald W. King, Bryant University
With the advent and increasing adoption of e-readers and tablets, e-books are an increasingly important part of a library collection. This study examined whether and how faculty and students use e-books in their academic work.

Three E-Book Outlooks: What Humanists, Social Scientists and Scientists Want and Predict (A LibValue Study) (Lightning Talk)
Tina Chrzastowski, Lynn Wiley, Jean-Louise Zancanella, University of Illinois at Urbana-Champaign
An e-book survey focused on user attitudes and valuation was conducted at the University of Illinois at Urbana-Champaign between 2013 and 2014 to determine how humanities, social science and science scholars viewed the current and future use of e-books in their fields.

“Absolutely Amazing!”: A Comparative Study of User Satisfaction with Summon
Dana Thomas, Kevin Manuel, Ryerson University
Because many academic libraries currently invest heavily in web-scale discovery services to meet the needs and expectations of undergraduate students, it is important to evaluate user satisfaction. The presenters shared the results of two survey iterations, comparing satisfaction levels with the service in 2013/14 with those evident in 2011/12, and showed how much students at Ryerson University liked using Search Everything, powered by Summon.

What’s It Worth? Qualitative Assessment of E-Resources by a National Consortium
Eva Jurczyk, University of Toronto
The Canadian Research Knowledge Network (CRKN) is a partnership of Canadian universities dedicated to expanding digital content for the academic research enterprise in Canada. This paper described how CRKN adapted a simplified version of the California Digital Library’s quantitative assessment model, allowing for the establishment of an assessment program using existing resources. The paper highlighted the technological, human, and knowledge resources needed to put an assessment program in place and provided an overview of the process by which the program was established. Lastly, the paper highlighted two cases in which a quantitative assessment of licensed resources was applied: the cancellation of a low-value package for the consortium, and decision support for a member institution returning to a title-selection model when a Big Deal was no longer affordable.

Assessment of the Use of Electronic Resources at the University of Massachusetts Amherst: A MINES Study Using Tableau Software for Visualization and Analysis
Rachel Lewellen, University of Massachusetts Amherst
Terry Plum, Simmons College
This paper presented the findings of the second year-long, systematic evaluation of electronic resources at the University of Massachusetts Amherst, using the MINES for Libraries® methodology. The presentation compared two implementation methods for a point-of-use, intercept survey launched at the EZproxy server: (1) randomly chosen two-hour sessions and (2) an every-Nth-user methodology. The presentation also demonstrated how using Tableau Software, a data analytics and visualization application, to interact with the survey data in real time had unexpected benefits.

Assessing Electronic Collections at Point of Use
Jane Nichols, Rick Stoddart, Oregon State University
While libraries are using increasingly sophisticated metrics to determine electronic resources’ usefulness, impact and cost-effectiveness, much of this data reflects past usage. More nuanced information is still needed to guide collection managers’ decisions about which content to purchase, borrow or deselect. To fill this gap, librarians at Oregon State University Libraries and Press and Ohio State University Libraries are currently testing the utility of a pop-up survey to gather patron feedback at the point of use. This paper discussed how the application works, whether users respond to a pop-up survey as expected, and other preliminary findings with JSTOR and Elsevier electronic resources.

The Conference covered the broad spectrum of assessment efforts related to libraries. For more information on the full range of topics, consult the ‘schedule’ page on the Library Assessment Conference homepage. Poster files and presentation slides are linked from that page, and the conference proceedings will be published there in the coming year.

While the Conference serves as a learning, networking and community event, it also has a tradition of being fun. Participants enjoyed the receptions as well as the delightful Seattle summer weather. The University of Washington, Seattle, is consistently ranked as one of the most beautiful urban campuses, with plenty of open spaces to soak up the rays. Conference sessions ended by mid-afternoon, giving participants time to take advantage of the campus, rest, or explore other parts of Seattle. The Local Arrangements Committee prepared guides to the best of Seattle and was ready to offer advice on restaurants, shopping, the annual return of the salmon, public transportation and whatever else participants were interested in. The Twitter account @LA_Conference kept busy during the event, and a lively thread is captured under the #lac14 hashtag.

Proceedings from earlier years are available on the website and offer insight into the rich perspectives and approaches library assessment encompasses as it covers all functional areas, crosses department boundaries and emphasizes strategic engagement. Those wanting to stay in touch with this community and its development can join the arl-assess Google group supported by the Association of Research Libraries!
