Data liberation of in-house library statistics

Have libraries used an institutional repository as a “container” for library-related statistics, current or retrospective, and/or dumped sources of raw data into a web-based application, such as Nesstar, for collaborative viewing, sharing, and manipulation within the institution? I have started exploring the idea of the IR as container, but we need to formulate a set of guidelines to consider first; not everything statistical is suitable for this somewhat public “display”. I am thinking of something more interactive than simply a dashboard of facts and figures. The purpose would be to democratize in-house access to data for library administrators, librarians, and others interested in manipulating the data for customized purposes. Is this topic old-hat? In other words, am I coming to this rather late and it has already been solved in your library? Or is there pioneering work being done (IR standards, scope, case studies) that can be shared? Thank you, Margaret Friesen, University of British Columbia Library, Assessment Librarian.


2 Responses to Data liberation of in-house library statistics

  1. eun-ha hong January 14, 2010 at 5:26 pm

    This is a very interesting idea. I have not looked into it, but I would certainly like to hear about your experience and/or look into this option, if possible, for our institution. I am currently organizing all internally produced data in basic Excel format in a dedicated directory until I find something better.
    But I never thought of Nesstar or an IR. Excellent idea. How can we look into this further?
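    As a rough sketch of what that could look like (the directory name, file layout, and column assumptions below are only placeholders, not anything settled), a short pandas script could sweep a folder of workbooks into one combined file that would be easier to deposit in an IR or load into a tool like Nesstar:

        from pathlib import Path

        import pandas as pd

        # Hypothetical locations; point these at the local directory of .xlsx files.
        STATS_DIR = Path("library-stats")
        OUTPUT = Path("combined-stats.csv")

        frames = []
        for workbook in sorted(STATS_DIR.glob("*.xlsx")):
            df = pd.read_excel(workbook)        # assumes one sheet per workbook with shared columns
            df["source_file"] = workbook.name   # keep provenance so every figure stays traceable
            frames.append(df)

        if frames:
            combined = pd.concat(frames, ignore_index=True)
            combined.to_csv(OUTPUT, index=False)
            print(f"Wrote {len(combined)} rows from {len(frames)} workbooks to {OUTPUT}")

    A single combined file like that would also be easier to govern deliberately, whether access ends up open or limited to administrators and librarians.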

  2. Eric Phetteplace February 20, 2010 at 4:03 pm

    Margaret,

    The Library Assessment Working Group at UIUC has started discussing this very issue recently. I certainly don’t think it’s old-hat and I would definitely be interested in any findings you come across or any replies to this post.
    I know a few of our preliminary issues are A) requiring enough background information to make it transparent how the data were collected, without making the deposit process so burdensome that it discourages contributions; B) the relationship between our IRB and what data can be published, including the possibility of limiting access (i.e. to administrators and librarians only, as you allude to); and C) linking this to a larger culture of assessment: how to incentivize deposits and collaboration between distinct units. At least those are the issues that come to mind right now. If you come across something substantive, please do think of contacting me.
    Best,
    Eric Phetteplace
    Graduate Assistant with UIUC LAWG

