Notes by Karen Neurohr (Oklahoma State) and Martha Kyrillidou (ARL)
A gathering of 80+ colleagues interested in library assessment met, as usual, on the Friday before ALA to hear about recent developments in library assessment. This forum takes place twice a year in conjunction with the ALA Annual and Midwinter meetings, from 1:30 to 3:00pm on Fridays, and brings the community of interest together in addition to the biennial Library Assessment Conference (which now gathers more than 500 people). Please note that the upcoming conference is scheduled to take place in Seattle, WA, on August 4–6, 2014. http://libraryassessment.org/index.shtml
Martha Kyrillidou opened the meeting by sharing with everyone that assessment is a continuous learning and iterative process and she introduced today’s speakers: Bob Fox, Teresa Walker, and Danuta Nitecki. ARL is currently doing work in the area of facilities and is also interested in expanding the new tools developed through the LibValue IMLS grant. For more information on the LibValue Webcast Series, see: http://www.arl.org/news/arl-news/2488-library-value-webcast-series-launched-by-arl-and-libvalue-project.
Among the colleagues attending was Carla Stoffle, who chaired the ARL Statistics and Assessment Committee in 1999 when the ‘new measures’ work was initiated at ARL. She briefly shared her latest news: she has been serving as acting Dean of the library school after retiring from her role as library director. Announcement: http://sirls.arizona.edu/content/dean-carla-stoffle-announces-she-will-be-returning-sirls-faculty
Bob Fox (Dean, University of Louisville Libraries, and Chair, ARL Statistics & Assessment Committee) spoke on the committee’s current work on developing a facilities inventory.
There is interest at different levels—the first is basic inventory information, the second is a deeper understanding of renovations and innovative uses of space, and the third is really about what this all means for our learning and research mission and outcomes. Bob highlighted the ARL directors’ desire to take a pragmatic approach and outlined the main reasons why we assess:
1. Do our jobs better, improve a process; inform our direction
2. Tell our story: public relations, statistical compilations, benchmarking
3. Convince others to allocate resources to us (donors, funding agencies, university)
He comes to this work from a couple of different administrative perspectives:
• As Associate Dean at Georgia Tech (GT), he had many assessment opportunities for their facilities and used a variety of assessment tools, such as design charrettes and observations. They created an assessment librarian position. Library advocates included a student advisory board and a faculty advisory board.
• As Dean at Louisville, his position is very different. His role is to work with campus administrators and external donors and to have ongoing conversations about the library’s priorities and what it should focus on. There are 13 Deans all vying for funding amidst years of budget cuts.
• Stakeholders want different messages. Many donors prefer to see traditional print materials in development materials, and they care about gate counts. The president prefers to see extraordinary achievements, not the number of books circulated or the gate count. The Library needs to fit its efforts into the things on the President’s scorecard.
Statistics & Assessment Perspective
• The ARL Annual Statistics lack facilities information. Although there is certainly an interest in keeping the statistics concise, the ARL Statistics & Assessment Committee polled ARL Library Directors and Deans with about 80% indicating an interest in facilities data such as renovations. They perceive facilities as still very relevant. There is interest in creating a Facilities Inventory, but very different needs were expressed, such as
1. Data for benchmarking (gate count, etc.)
2. A visual/programmatic database to inform renovations (a very rich database with detailed information, funded by an IMLS grant; discussions are now under way about how to maintain it)
3. Describing the value, ROI of library facilities
• A facilities questionnaire is in development. It will go to ARL Stats and Assessment committee and then, hopefully by next May’s meeting of survey coordinators, the questionnaire will go out.
1. An ARL facilities inventory is being drafted that would address the top three needs expressed by deans, for example: the number of seats, the cost of the most recent renovation, and how the space in the facility is allocated (all quantitative measures). It would be designed primarily to give ARL libraries benchmarking data. Data would be collected every 3–5 years, not every year. This would be a very practical tool.
2. They did look at the ACRL academic library facilities questionnaire and thought the two groups could work together, but found that its data are framed in a different way, so there may end up being two facilities surveys. The ACRL version covers all types of libraries and offers a snapshot of academic libraries in general. It is easy to fill out and does not require a lot of data gathering.
Bob asked for input and questions on the ARL Facilities Inventory data elements from attendees:
• The number of seats: should classroom seating in the library building be included? Comments: could define seating as specialized or general and collect those numbers.
• Gross square footage (SF) of library areas
• Allocation of spaces: collections, partners, staff/administration
• Total gate count
• Total number of physical locations
• Question about the number of sites, satellite campuses. Martha said ARL already captures that but does not specify locations. Michael Maciel (TAMU) said what is important from a location standpoint is asking if the same services are being provided in the different facilities.
• Construction, renovation: do we ask cost? Most libraries are investing tremendous amounts of money, much of which must be raised, in physical facilities. Attendees agreed it would be helpful to have dollar figures and cost per SF for new construction and for renovations, because renovations can really inflate a library’s expenditure numbers. Comment: maybe differentiate renovations from renovations with new additions, because these are very different.
• Question about library operational hours/facilities hours, because hours are a big expenditure concern. Student groups demand 24/7 access, but the provost only has so much money; libraries are always trying to reconcile this. If asked, libraries would have to qualify which building on campus the hours apply to. They could differentiate by building and should consider building hours vs. service hours.
• Question: time period the survey will cover? If the survey is every three years, it will cover the past three years.
• Comment: include who paid for the cost of renovation; the source of funds
• Comment: ARL directors interested in special collections may want to link special collections space to some of the facilities inventory data but today’s topic does not include that.
Teresa Walker, Head of Learning, Research & Engagement, University of Tennessee Libraries
LibValue Commons Study
Topic: information commons assessment conducted through the IMLS-funded LibValue grant. (Teresa was glad Joan Lippincott was at the meeting; Joan (CNI) has been through three iterations of consulting visits with UT.)
• Teresa provided an overview of the LibValue study. The question: Is there a link between commons spaces and student success? They had usage data and many numbers already collected, such as the National Survey of Student Engagement, student exit surveys, LibQUAL+, and service point statistics, but they needed student-reported usage, student-reported value, persistence and time-to-degree data, and an augmented university data set for tracking. Admissions and demographic data are there, but more “progress towards degree” data was needed. That data is hard to get.
• They conducted two surveys:
1. An in-person, one-page survey about services used in the Commons on a typical visit; 957 respondents
2. An in-class survey on use of Commons services and spaces and feelings about the value of the Commons; they worked with communications students
• Why ROI? Changes in the higher education landscape: the Complete College Tennessee Act; Tennessee’s move to performance-based funding; the Top 25 initiative, a strategic priority for the university. Also, their accrediting body has a focus on learning outcomes. The top priority at UT is undergraduate education. Infrastructure is also a high priority.
• Defining student success: the retention rate needed to improve; the library looked at the problem and asked what it could do.
• 74% of students said working in the Commons makes them feel more involved in the university community. Other findings addressed who was using what, and to what extent the library commons provide space for collaboration, conversation, group work, or individual study. They had to look at the data to figure out what was needed; glassed-in group study rooms let students work in groups while still feeling connected.
• Students in the Commons who made the most use of research assistance, computer support, and tutoring had higher GPAs; the lowest GPAs belonged to students who were not making use of these resources.
• Other findings: 90% said the Commons provides resources they need for class; 85% said it is a great place to get help with assignments and that it facilitates group work and collaboration. Librarians asked instructors to suggest that their students use the Commons; at first faculty disliked this, but now it is much better received.
• Their most recent study is about sustainability. They are combining branches of surveys to include all informal learning spaces and not just the commons.
• Teresa said we should ask what we need to know about our own spaces: what about the space contributes to student success? She said don’t just survey once; continue to survey and follow the cohort through their college years.
• The UT Library has made such a good case that other campus renovations are looking to the library for input on student success.
• UT has used LibValue to inform facilities planning and to get more funding by directly tying to student success and aligning with university mission and initiatives.
• ARL and the UT Libraries are looking to do this pilot with some additional schools and test the replicability and comparability of this methodology across institutions.
Danuta A. Nitecki, Dean, Drexel University Libraries
Topic: challenges of assessment [utilizing ROI and observational data to better advance the university’s mission]
• Drexel is not an ARL library, but Danuta’s career has been primarily with large research libraries. At Drexel, the most used library is not physical, so comparisons of physical and online use are important. Drexel University has a tuition-driven budget and 23,000 students. The US News ranking as an up-and-coming university to watch resonates on campus. Danuta is the ACRL liaison to the Society for College & University Planning (SCUP) and recommends their resources: http://www.scup.org/page/index.
• At Drexel they are looking at transforming the concept of the academic library and asking how the library contributes to advancing the mission of the university. They began with a focus on the library as a learning enterprise.
• She recently made a presentation to a trustee Academic Affairs committee (she believes this was a first for the library); one outcome was an invitation to return with proposed metrics to measure the progress the library makes in meeting its goals to advance the university mission.
Danuta discussed four data gathering approaches and their challenges:
1. Customer opinions and perceptions [captured via surveys, discussions, interviews]
2. Observations of physical space use
3. Financial benefits; ROI
4. Statistical correlations as a possible way to associate engagement with library programs and resources with academic success factors such as retention, GPA, and time to graduation
Customer opinions: Many of the comments heard from library users related to space. Drexel Libraries uses Counting Opinions to routinely capture value judgments and comments; the lack of seats in the libraries accounts for most of the concerns expressed. Staff are looking to focus on one question or topic per month to get more details on a given topic, such as space. The libraries are crowded and heavily used. Reframing the topic of library space has been one of the Libraries’ strategic directions—to build learning environments. The challenge is to identify what makes a library learning environment different; one difference is access to resources and expertise. Gathering customer perspectives around this topic has been informative, for example:
Drexel held two forums a few years ago to introduce plans for a project to build a learning terrace. Over 250 people, mostly students, attended to give feedback and ask questions. The desire for more group study facilities was identified then. After the Library Learning Terrace was built, the 3,000 persons who entered the facility during the first few quarters were asked through an email survey to evaluate the environment’s success in facilitating a number of factors associated with learning engagement. High scores were recorded, with the exception of access to expert help. Staff have since experimented with ways to bring effective programming and access to librarians to this space.
Another data gathering method was used for renovation planning. An architect-led series of sessions envisioning use of the main library resulted in an articulation of a destination for “serious” work: a space communicating the expectation of focused engagement with learning. The process involved around 20 faculty, students, and staff over a couple of visually rich discussions. With the general success of the flexible Learning Terrace, Drexel students were now saying they can find group study space anywhere and want more spaces for focused individual learning (which echoes what is found in contemporary workplaces): solo space, spaces for focus and concentration.
It can be difficult to relate the results of such opinion-gathering efforts to students’ actual use, and hard to reconcile them with the different kinds of information available on campus, but results indicated student satisfaction with the transformed environments in the library.
2. Observation of space use
Gate counts aren’t very useful beyond identifying who comes into the building; Danuta has been exploring how to identify percentage of occupancy as one metric of use of space.
Drexel Libraries staff conduct seating surveys over sample times to literally count how many students sit where, and whether they are working in groups or not; one subjective judgment made by the observer is to gauge whether students are actually working together, just sitting together [“working alongside”], or studying alone.
• Also want to know how the environment is being used; e.g. if students use library laptops and computers.
• The challenge for this type of data collection is that it is very labor intensive.
• SCUP colleagues urge that there is already enough evidence that engagement contributes to student learning, and thus suggest focusing on examining the extent to which students are engaged.
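The percentage-of-occupancy metric described above can be sketched with a small calculation; all zone names and counts below are hypothetical, standing in for the tallies an observer would record during a seating survey.

```python
# Hypothetical seating-survey tallies for one sample time.
# zone: (seats observed occupied, total seats in zone)
seat_counts = {
    "quiet reading room": (62, 80),
    "group study rooms": (45, 48),
    "open commons": (130, 200),
}

occupied = sum(o for o, _ in seat_counts.values())
capacity = sum(t for _, t in seat_counts.values())
occupancy_pct = 100 * occupied / capacity  # percentage of occupancy
print(f"Occupancy: {occupied}/{capacity} seats ({occupancy_pct:.0f}%)")
```

Repeating this over many sample times and averaging per zone would give the kind of occupancy trend the notes suggest, without relying on gate counts.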
3. ROI: Compare expenditures to actual value. Previous research in which Drexel participated involved many variables and was too intricate; more recently they looked for a quick-and-dirty method to measure ROI. Patterned after work done by Bob Dugan as well as the toolkit from the University of Huddersfield [http://eprints.hud.ac.uk/16316/1/Toolkit_final_LIDP2_GS_EC.pdf], an approach evolved: select those library services for which staff could identify an alternative in the marketplace, measure local use, and calculate the ratio of the sum of alternative costs, based on local use, to total library expenses. Examples:
• Access to e-resources, measured in downloads: a $31.50 average cost per article multiplied by the number of downloads
• Room use: the library pays for security and upgrades; as a market alternative, students could go to a hotel and rent conference space; count group study reservations and multiply by the hotel cost (other considerations: lockers, concierge, etc.)
• An initial estimate calculated through this approach suggests an ROI of $10.50
• Challenges: it is difficult to validate; estimating accurate comparable market costs is hard; and not all expenses come out of budgeted resources
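The ratio described above can be illustrated with a minimal sketch. The $31.50 per-download figure comes from the notes; every other number (usage counts, the hotel rate, the library budget) is a hypothetical placeholder, not Drexel data.

```python
# Quick-and-dirty ROI sketch: for each service with a market alternative,
# multiply local use by the alternative's unit cost, then divide the total
# by library expenses. All figures except the $31.50 average download cost
# are invented for illustration.

# (service, annual local use, market-alternative unit cost in dollars)
services = [
    ("e-resource downloads", 400_000, 31.50),          # avg. article cost
    ("group study reservations", 12_000, 95.00),       # hotel room rate
]

total_alternative_cost = sum(use * cost for _, use, cost in services)
total_library_expenses = 1_250_000  # hypothetical annual expenses

roi = total_alternative_cost / total_library_expenses
print(f"Alternative-market value: ${total_alternative_cost:,.2f}")
print(f"ROI: ${roi:.2f} of market value per dollar of library expense")
```

With these invented inputs the ratio lands near the $10.50 figure reported in the notes, which is the shape of result this method produces: dollars of marketplace value per dollar spent.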
4. Correlations: Exploring the use of statistical correlations to test a null hypothesis: that there is no significant difference in retention (persistence or dropout) between students with and without library engagement. Engagement may be measured by entry into facilities, use of collection resources (circulations or downloads), or engagement with library experts through instruction or reference, for example.
• The first exploration is with entry to facilities, using gate swipe data now being analyzed by statisticians to explore whether variation between continuing students and dropouts might relate to use of library space
• Challenges include looking at downloads and who is using databases, who comes to instruction sessions, and examining different kinds of exposure to the library
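One common way to test this kind of null hypothesis is a chi-square test on a 2x2 table of library entry vs. retention; the sketch below hand-rolls the Pearson statistic with invented counts (a real analysis of gate-swipe and registrar data would typically use a statistics package such as scipy.stats.chi2_contingency).

```python
# Hypothetical 2x2 contingency table:
#   rows = entered library / no recorded entry
#   cols = retained / dropped out
table = [[820, 90],
         [400, 110]]

row_totals = [sum(row) for row in table]
col_totals = [sum(col) for col in zip(*table)]
n = sum(row_totals)

# Pearson chi-square: sum of (observed - expected)^2 / expected,
# where expected[i][j] = row_total[i] * col_total[j] / n under the null.
chi2 = 0.0
for i, row in enumerate(table):
    for j, observed in enumerate(row):
        expected = row_totals[i] * col_totals[j] / n
        chi2 += (observed - expected) ** 2 / expected

# For a 2x2 table (1 degree of freedom), chi2 > 3.84 rejects the null
# hypothesis of no association at the 5% level.
print(f"chi-square = {chi2:.1f}; reject null at 5%: {chi2 > 3.84}")
```

Rejecting the null would only show association, not causation; students who swipe into the library may differ in other ways, which is one reason the notes flag the other exposure measures as challenges.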
Danuta concluded by showing how these data gathering approaches help align the library with the university’s strategic plan.
Four strategic directions that emerged for Drexel Libraries and related assessment methods:
1. Ensure access to authoritative information and ideas—is heavily service based, and assessment includes measuring customer satisfaction and ROI (for how the library lowers expenses for the institution).
2. Build learning environments—assessed through observations of use and correlations with academic success factors. SCUP’s Perry Chapman Prize has funded research on learning environments, a hot topic; see: http://www.scup.org/page/resources/perry-chapman-prize. The first prize recipients conducted a thorough review of research on learning spaces (see: http://www.scup.org/page/resources/perry-chapman-prize/previous-recipients) and conducted a workshop at the last SCUP conference. This year’s prize is funding researchers from Northumbria who are designing tools for evaluating such spaces. At Drexel, goals to gauge both opportunity (e.g., whether 5% of students could have a seat in a focused learning environment with access to expert services) and actual use are difficult to meet.
3. Strengthen Drexel connections to scholarship—libraries facilitate bridges between people and information, as well as across disciplines, helping people find collaborators and share inquiries. One focus is to strengthen learners’ connections to resources, introducing researchers, whether faculty or students, to new sources of information. Held on the last day of each quarter, the “Scholar Sip” event brings faculty and professional staff together from across campus to share research interests in an informal context. About 65 faculty attend, and the library welcomes indicators that new collaborations have occurred as a result. In addition, Drexel is building a central database of faculty achievements, working with Thomson Reuters and its Research in View and InCites systems.
4. Model an organization that responds to opportunities for transformation. Current new directions include development of campus infrastructure for research data management, including curation of raw data and the associated analytics, processes, and documentation that give them meaning. One effective metric at Drexel is press coverage of innovation. For example, the Libraries introduced a self-checkout laptop kiosk; it was something new and different, the press picked up on it, and the story went viral. The President noticed.
Additional comment: Joe Matthews pointed us to a “really wonderful University of Minnesota study all assessment librarians should read”: Huesman, Brown, Lee, Kellogg, & Radcliffe (2009). “Gym bags & mortarboards: Is use of campus recreation facilities related to student success?” Journal of Student Affairs Research and Practice, 46(1), 50–71. ISSN (online) 1949-6605; DOI: 10.2202/1949-6605.5005.
The study found positive associations for first-year student retention and for five-year graduation for students who use recreational facilities 25 times or more. The study helped the recreational facilities get $60 million for their facilities.
Steve Hiller—Highlights of the upcoming Assessment Conference. Registration will open in May; register early, as there have been waiting lists for previous conferences. Sunday, Aug. 3 will offer workshops in the afternoon, and Thursday, Aug. 7 will offer workshops in the morning. Two new presentation formats are included this year: lightning talks and panels, in addition to paper and poster presentations. In 2011 they received 229 proposals; in 2014 they received 254 proposals, an 11% increase. He and his colleagues look forward to welcoming us in Seattle! Weather predictions call for a hot sunny day in Seattle!