- The discovery service implementation experience will vary depending on your organization's size, structure, and culture. Grand Valley State had support from administrators and the library director, an apparent organizational tolerance for risk, and plenty of inquisitiveness (manifested in several colleagues besides Jeffrey who have published papers related to the Summon implementation – I'm guessing this goes beyond the need for tenure that Jeffrey alluded to humorously in his presentation!). All these factors combined to enable them to sign on as an early Summon implementer. The size of your library also affects how you approach things like troubleshooting implementation issues or "marketing" the service to other staff or library users.
- Marketing is really important, in e-resources as everywhere else in the library. But who is the audience? I found it intriguing to hear Jeffrey say that administrators need to be a major audience for librarians' outreach efforts. Students and faculty make up the majority of my library's user population, and they are the main groups my colleagues and I think of when we consider "outreach". But administrators are an important group to be in dialogue with, given their decision-making and fund-allocation powers. This underscores the "marketing is everyone's job!" idea that was expressed so clearly during the afternoon brainstorming session.
- Discovery services’ impact on usage is hard to predict. Be prepared for complex collection development decisions and budget forecasting. Jeffrey mentioned that GVSU was hoping to recoup some of the costs of their discovery service by tracking usage and cancelling some A&I databases, aggregators or e-journals later on. However, the only A&I databases that lost usage turned out to be required for department reaccreditation! Journal usage went up, but aggregator usage held steady so they couldn’t conclude that one was substituting for the other, and didn’t cancel either. I took away from this that I should not expect an investment in a discovery service to (partially) pay for itself down the road.
- Librarians need to be selective about what kind of assessment to perform. Laura Robinson's initial question about whether libraries have positions dedicated to analysis (i.e. assessment) was telling – most attendees were juggling this task among many other responsibilities. In addition to the familiar COUNTER analysis (an input metric), there is also outcomes analysis (such as citation studies), contingent analysis (estimating what the cost to users or the institution would be if the resource were not there), and return on investment (ROI) analysis (comparing some beneficial result created – approved grant dollars, information-literate graduates, etc. – against the cost invested in the library to produce it). With so many types of analyses available and so little time, librarians need to focus on the method that yields the most convincing evidence.
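To make a couple of these analysis types concrete, here is a minimal sketch (entirely hypothetical figures, not GVSU's or anyone's actual data) of the kind of arithmetic involved: cost-per-use derived from COUNTER download counts, and ROI as a simple benefit-to-cost ratio.

```python
# Hypothetical illustration of two of the analyses above, with made-up numbers.

def cost_per_use(annual_cost, counter_downloads):
    """Input metric: subscription cost divided by COUNTER-reported downloads."""
    return annual_cost / counter_downloads

def roi(benefit, cost):
    """ROI analysis: beneficial result produced (e.g., grant dollars
    attributed to library support) divided by the cost invested."""
    return benefit / cost

# Hypothetical journal package: $12,000/year with 3,000 full-text downloads.
print(f"Cost per use: ${cost_per_use(12_000, 3_000):.2f}")  # Cost per use: $4.00

# Hypothetical: $50,000 in grant income traced back to $10,000 of library cost.
print(f"ROI: {roi(50_000, 10_000):.1f}:1")  # ROI: 5.0:1
```

The point of the ratio framing is that the two metrics answer different questions: cost-per-use measures an input (how cheaply use was bought), while ROI attempts to measure an outcome per dollar spent.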
Just as library instruction assessment has moved from inputs (# of students taught) toward more outcomes-focused assessment (rubrics, demonstrable achievement of predetermined learning outcomes), perhaps the state of the art in e-resource assessment is also moving past COUNTER input metrics (# of articles downloaded) toward a richer indicator of the output or result of that use – what the user actually DID with that download. What kind of user studies would allow libraries to convincingly answer the question of what a discovery service allows, say, a student researcher to do with the sources she found?