Posts tagged with Assessment in the Tiny Studies blog
Assessment and research activities focused on the experiences of U-M Library faculty, staff, and students happen regularly, and the Library Human Resources (LHR) team often contributes to these activities if not leading the research. This work can focus on quantitative data, qualitative data, or take a hybrid approach, and can involve surveys, interviews, and/or some general number-crunching. This post looks at some recent HR assessment projects.
When planning an assessment project in the Library, one important step is to consider whether your project should be vetted by the Institutional Review Board (IRB) at U-M, a committee that ensures studies with human subjects are ethical, that subjects are protected from unnecessary psychological or physical risks, and that subjects are participating in a fully informed, voluntary manner. This post details when your data collection may be subject to a full IRB application and review process.
Assessing library impact on student learning is essential for demonstrating libraries’ integrated value and commitment to higher education. In 2018 the author investigated faculty perceptions of student learning in library instruction sessions and found that faculty observe enhanced learning when their students participate in library instruction opportunities.
In this post, the author describes how assessments of a revised library curriculum for the College of Pharmacy demonstrated the value of the sessions for students and prompted the creation of a new learning object, a game, to improve student learning.
The 2018 Library Assessment Conference (https://libraryassessment.org/) brought together a community of practitioners and researchers with responsibility for, or interest in, the broad field of library assessment. This post recaps the conference poster presented by Laurie Alexander and Doreen Bradley on how analytics advanced the Library's internal understanding of the course-integrated instruction provided by Library staff.
The first post ("Personas: A Classic User Experience Design Technique") in this 2-part series described what personas are and, generally, how to create them. I closed with some cautions about ways personas can turn out less than helpful: flat, overloaded, or fake (unresearched) personas. This second post presents our persona development for a specific website project.
Document Delivery provides traditional Interlibrary Loan borrowing service as well as scanning and delivery of books and articles from materials owned by the U-M Library. After a successful pilot providing free Local Document Delivery for faculty and graduate students, the department next sought to change the fee-based service for undergraduate students and staff. Departmental managers wondered: what would happen if we made scanning and delivery free for these patron groups?
Personas are employed in user experience design work to help design teams create or improve systems, spaces, and services with targeted populations in mind. Libraries use personas as archetypes of their users to design more effective library experiences. This is the first of two posts about the creation and use of personas in the U-M Library.
Not everything a library wants to know is available via web-scale analytics tools such as Google Analytics. Often, custom instrumentation and logging are the best way to answer usability and analytics questions, and can offer better protections for patron privacy as well.
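For illustration only, here is a minimal sketch of what such custom instrumentation might look like on the server side. It is not taken from the post; the event name, field choices, and rotating-salt approach are all assumptions, meant only to show how logging coarse, non-identifying fields locally can answer usage questions without handing patron data to a third-party analytics service.

```python
# Minimal sketch (assumptions, not the Library's actual instrumentation):
# log a search event with anonymized, coarse fields to a local application log.
import hashlib
import json
import logging
from datetime import datetime, timezone

logger = logging.getLogger("usage")
logging.basicConfig(filename="usage_events.log", level=logging.INFO, format="%(message)s")

DAILY_SALT = "rotate-me-daily"  # rotating salt so hashed IDs cannot be linked across days


def anonymize(session_id: str) -> str:
    """Hash the session identifier with the rotating salt instead of logging it directly."""
    return hashlib.sha256((DAILY_SALT + session_id).encode()).hexdigest()[:16]


def log_search_event(session_id: str, query_length: int, results_count: int) -> None:
    """Record a search event using only coarse, non-identifying fields."""
    event = {
        "ts": datetime.now(timezone.utc).isoformat(timespec="minutes"),  # truncated timestamp
        "session": anonymize(session_id),
        "event": "search",
        "query_length": query_length,    # log the length, not the query text itself
        "results_count": results_count,
    }
    logger.info(json.dumps(event))


# Example: a 17-character query that returned 42 results
log_search_event("abc123-session", query_length=17, results_count=42)
```

The design choice in this sketch is simply to keep the raw identifiers and query text out of the log entirely, so the resulting data can answer "how is this feature used?" questions without creating a privacy liability.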
When developing or reconsidering a library service, sometimes you can get stuck in your head. You go back and forth with your colleagues, proposing different ways of doing things. You model out different scenarios, do an environmental scan, read the literature, and weigh pros and cons, but you still can’t decide how to proceed. A great way to figure out how to move forward is to go to your users for feedback by employing intercept interviews.