Laurie A. Alexander
Library Blogs: Tiny Studies

The 2018 Library Assessment Conference (https://libraryassessment.org/) brought together a community of practitioners and researchers with a responsibility for, or interest in, the broad field of library assessment. This post recaps a conference poster presented by Laurie Alexander and Doreen Bradley on how analytics advanced the Library's internal understanding of the course-integrated instruction provided by Library staff.

The first post ("Personas: A Classic User Experience Design Technique") in this two-part series described what personas are and, generally, how to create them. I closed with some cautions about ways personas can turn out less than helpful: flat, overloaded, or fake (unresearched). This second post presents our persona development for a specific website project.

Document Delivery provides traditional Interlibrary Loan borrowing as well as scanning and delivery of books and articles from material owned by the U-M Library. Following a successful pilot that made Local Document Delivery free for faculty and graduate students, the department next sought to change the fee-based service for undergraduate students and staff. Departmental managers wondered: what would happen if we made scanning and delivery free for these patron groups too?

Personas are employed in user experience design work to help design teams create or improve systems, spaces, and services with targeted populations in mind. Libraries use personas as archetypes to maximize effective library user experiences. This is the first of two posts about the creation and use of personas in the U-M Library.

Not everything a library wants to know is available via web-scale analytics tools such as Google Analytics. Often, custom instrumentation and logging are the best way to answer usability and analytics questions, and can offer better protections for patron privacy as well.
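As a minimal sketch of what such custom instrumentation might look like (the function names, fields, and salting scheme here are illustrative assumptions, not the Library's actual implementation), one common pattern is to log each usage event as a structured record while hashing the visitor identifier with a rotating salt, so the questions get answered without raw patron identifiers ever being stored:

```python
import hashlib
import json
import time

def anonymize(session_id: str, daily_salt: str) -> str:
    """Hash the session ID with a rotating salt so raw IDs never persist.

    Rotating the salt (e.g. daily) prevents linking one visitor's
    activity across salt periods.
    """
    return hashlib.sha256((daily_salt + session_id).encode()).hexdigest()[:16]

def log_event(log: list, event: str, session_id: str, daily_salt: str, **fields):
    """Append one usage event as a JSON line, keeping only the fields
    the analytics question actually needs."""
    record = {
        "ts": int(time.time()),
        "event": event,
        "visitor": anonymize(session_id, daily_salt),
        **fields,
    }
    log.append(json.dumps(record))

# Example: record a catalog search without storing the query text itself.
log = []
log_event(log, "search", "abc123", "salt-2024-01-01",
          query_length=12, results_returned=40)
```

Because only a truncated hash and the minimal fields are written, the log can still answer usability questions (how often searches return zero results, say) while offering stronger patron-privacy protections than shipping full page data to a third-party analytics service.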

When developing or reconsidering a library service, sometimes you can get stuck in your head. You go back and forth with your colleagues, proposing different ways of doing things. You model out different scenarios, do an environmental scan, read the literature, and weigh pros and cons, but you still can't decide how to proceed. A great way to figure out how to move forward is to go to your users for feedback by employing intercept interviews.

Continuing the discussion about survey design (see Let's Talk about Surveys, Part 1): you've decided a survey is an appropriate methodology for what you want to find out and are thinking about what questions to ask. But how you ask those questions, how you structure them within the survey itself, and the question formats and response options you give people all require careful consideration.

A survey is often the default research method that comes to mind when you need to answer questions about what people like, expect, or want, among other things. While surveys may seem like the easiest option, you can't conflate "easy to create" with "easy to create well." Even when a survey is an appropriate methodology for the question you're looking to answer, the questions you ask, the way you ask them, and the options you give people for responding all require a thoughtful approach.

“Learning from Advanced Student Staff Experiences” was a University of Michigan Library study conducted in 2017, integrating methodologies of user-centered design and critical librarianship.

In this study, engineering librarians Leena Lalwani, Jamie Niehof, and Paul Grochowski sought to learn from graduate students in the College of Engineering (CoE) how these students could benefit from more instruction on U-M Library resources.