LOEX is one of my favorite conferences. Its smallness makes it more intimate than ALA or ACRL. It’s “all inclusive,” which promotes those in-between-sessions conversations that are often the most fruitful. And everything is about instruction. Win win win.
And sometimes one of the most helpful takeaways appears in an unexpected format. All the sessions I attended at LOEX 2014 in Grand Rapids, Michigan, were great, but the one that paid the most immediate dividends for what’s happening right now at my library was the lightning talk by Chantelle Swaren of the University of Tennessee at Chattanooga. In a strictly timed 7-minute presentation to all LOEX attendees after lunch, Chantelle explained the statistical concept of correlation and demonstrated how to use Microsoft Excel to generate correlation coefficients. Correlation is a statistical method for revealing relationships among the piles of data we have lying around: circulation stats, survey results, instruction session attendance, etc.
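For readers who want to see what Excel's CORREL function is doing under the hood, here is a minimal Python sketch of the same calculation, the Pearson correlation coefficient. The sample numbers are invented Likert-scale responses for illustration only; they are not from any actual survey.

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient: what Excel's CORREL(x, y) computes.

    Returns a value between -1 (perfect negative relationship) and +1
    (perfect positive relationship); values near 0 suggest no linear
    relationship.
    """
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Covariance of x and y (numerator), scaled by the spread of each
    # variable on its own (denominator).
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / sqrt(var_x * var_y)

# Hypothetical 1-5 Likert responses from seven respondents to two questions.
impact_ratings = [5, 4, 4, 3, 5, 2, 4]
value_ratings = [5, 5, 4, 3, 4, 2, 5]
print(round(pearson_r(impact_ratings, value_ratings), 2))
```

In Excel, the equivalent is simply `=CORREL(A2:A8, B2:B8)` with the two response columns in A and B.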
As it happened, my library received the results of our local Ithaka S+R Survey of faculty right before I departed for Grand Rapids. While Chantelle was giving her presentation, I had the Excel spreadsheet of the survey responses (scrubbed of identifying information, of course) sitting in my e-mail inbox. How fortuitous! I have to admit that I did test out the Excel correlation function before dinner that same day, but I still had some things to learn about correlation and the survey data before I could make these numbers meaningful.
Since then, I’ve done a little reading (thanks Wikipedia!) to better understand how statistical correlation works and what its limitations are. And now that my entire library is focused on digesting and interpreting our Ithaka Survey results, I’ve been putting my new Excel skills to good use. The folks at Ithaka sold us an analytic report of the survey findings, but that mostly included comparisons of the Tulane faculty responses to those of the 2012 national faculty survey. I could have done that myself, since the data set for the national survey is in the ICPSR data archive. I wanted to know more. For example, does a respondent’s perception of librarians’ impact on student success have any relationship to how much they value librarians overall? (Answer: It does.) Or does a respondent’s heavy use of the library collections have any relationship to their willingness to divert funds away from the library building and staff? (Answer: It doesn’t.) Correlation of these survey responses does not mean causation, but it does generate some interesting questions about how the library is perceived and valued by our faculty, and what aspects of our work with them may have the most impact.
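Questions like these amount to running CORREL over every pair of response columns. A small sketch of that pairwise approach, using invented column names and made-up 1-5 responses (none of this is the actual Ithaka data):

```python
from math import sqrt
from itertools import combinations

def pearson_r(xs, ys):
    """Pearson correlation, equivalent to Excel's CORREL(x, y)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / sqrt(var_x * var_y)

# Hypothetical survey columns; the question names are invented for
# illustration and the values are fabricated 1-5 Likert responses.
responses = {
    "librarian_impact": [5, 4, 3, 5, 2, 4, 4],
    "value_overall": [5, 4, 3, 4, 2, 5, 4],
    "divert_funds": [2, 3, 4, 1, 3, 2, 3],
}

# Correlate every pair of questions, like filling in a correlation matrix.
for a, b in combinations(responses, 2):
    print(f"{a} vs {b}: r = {pearson_r(responses[a], responses[b]):+.2f}")
```

Scanning the pairs this way is what surfaces the interesting relationships (and non-relationships) worth a closer look, always keeping in mind that a strong r is a prompt for questions, not proof of cause.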
As a library we’re still digesting the survey results, but my work with correlation coefficients has quantified what most librarians in public services have known for a long time: the more visible we make our work to our users, the more they will value us as partners in their research and teaching. When we make ourselves invisible (a common side effect of making discovery and access easier for users), the work of librarians becomes devalued simply because our users don’t know about it. How my library will respond to these findings is still being discussed, but to me the solutions are obvious: put more resources into the visible services that generate value for the library as a whole, and find ways to make the inherently invisible work of librarians (collection development, technical services, electronic resources management) more visible outside the library walls. It’s up to all of us to demonstrate value to our communities, and the numbers suggest that even at a research institution, just having a great collection is not enough.