Empathetic Assessment Strategies: A Conversational Approach to Navigating Ethos and Best Practices

At the beginning of the Fall 2019 semester, I was very excited at the prospect of starting my new career as the Library Operations Outreach and Engagement Specialist, a position that would help students and faculty learn about our many library services and programs. I created a number of goals around that work and engaged with more than eight programs to reach student populations outside our Library. One of my goals at the time was to evaluate the effectiveness of my engagement approach and solicit feedback on how to improve my services via a survey that I planned to distribute and analyze toward the end of the academic year. Once the pandemic began in March 2020, I was left with some very tough assessment questions, the most important being: knowing what I had learned about assessment best practices, how could I get valid data without traumatizing or alienating the students whose wellbeing was my utmost concern during a very trying time? (Asking the question even makes me feel a little... queasy.)

Easy answer: use empathetic assessment strategies. But wait, are those even an established thing, or just my name for an approach to assessment that centers the wellbeing of the population being assessed?

Digression: we should be prepared for situations where our hopes of acquiring data run up against the timing or the morality of an assessment project. In my particular case, that tension came up as a double-edged sword of student suffering and unknown furlough conditions. Pretty much a moral quagmire.

With this post, I'll share the choices I made and my thinking as I navigated that quagmire to end up proudly rocking a response rate of 34% (N=50).

Do I even do assessment?

This particular dilemma was tough because I already had enough programming and artifacts to demonstrate beyond a doubt that I was going above and beyond in my position. Distributing a survey seemed extraneous and, if not done appropriately, potentially alienating to the students and community. Would they think I was desperately hustling to avoid a pandemic furlough, and find me terrible for bothering them? Or might the fact that I was asking students and faculty for feedback on my programming actually encourage them? I wasn't sure, so I had a consultation with our assessment specialist, Craig Smith, who reviewed my survey and my aims, and we developed a plan of action that made me feel far more comfortable about distributing the survey. I then revised the survey, reduced the number of questions, and timed it at 5 minutes or less to complete.

When do I release my survey?

When a pandemic hits toward the end of a semester, when students are forced into tough decisions about staying on campus or moving home while also attempting to navigate new learning environments, and the world is just generally crashing around them, you have to be very considerate in your timing. I chose to release the survey as Study Days moved into Finals Week and allowed eight days for responses. I sent the survey on a Wednesday and sent two brief reminders.

Radical response rate acceptance

When you distribute a survey in the midst of a pandemic, especially one that doesn't relate directly to the pandemic at hand, you have to release all expectations about what your response rate will be. You should also understand how response rate works so you can make sound, ethical choices about what N should equal. In my case, I opted not to blast the survey to any email listservs because I wanted to ensure that only people who had engaged with me in some manner were providing feedback. I went through my Google Calendar invites and my program sign-in sheets to find the N=50 people who met my criteria.

After distributing the survey, I chose to send only two well-spaced reminders so as not to overwhelm respondents, and also because I didn't want my community to worry that I was in any type of danger and would not be available to them. They had enough to worry about.

A few days after the survey closed, I was overjoyed and humbled to find that I had a response rate of 34%, or 17 respondents.
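For the curious, here is a minimal sketch of the arithmetic behind that number. This is purely an illustration in Python, not part of my actual workflow (the responses lived in a Google Form, not in code):

```python
# Minimal sketch (illustration only, not part of my actual workflow) of the
# response-rate arithmetic: N is the number of people I invited, drawn from my
# calendar invites and sign-in sheets; responses is the number who replied.
N = 50
responses = 17

response_rate = responses / N                 # 17 / 50 = 0.34
print(f"Response rate: {response_rate:.0%}")  # prints "Response rate: 34%"
```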

Based on a concept I encountered called "Radical Acceptance," "Radical Response Rate Acceptance" means that I had to truly appreciate the respondents who were able to participate, and also those who were unable to participate due to a plethora of circumstances known and unknown to me. Although I had worked fairly closely with all 50 potential respondents, I accepted that circumstances did not favor even a 50% response rate. Therefore, I could not take nonresponses personally. I hoped that limiting my reminders to two emails would put potential respondents at ease if they came across my noncritical survey after the response period had ended. I figured that seeing only two reminders would let them rest easy and imply that I was confident with the current response rate, which I was.

How do I get good data during a pandemic?

This was also a tough question to grapple with, but consulting with our Assessment Specialist was key to helping me develop both the survey and the distribution plan. I was sure to be very explicit at the beginning of the survey about the number of questions and the amount of time it would take to finish.

For example: “In order to better serve student and community populations typically under-represented in academic libraries in general, and the University of Michigan Library specifically, we would like to ask you 8 questions about your interactions with Library Operations Outreach and Engagement Specialist, Jasmine Pawlicki. This should take no more than 5 minutes to complete.”

In the survey I used a combination of multiple choice, 5-point scale, and two short answer questions that covered the ways respondents interacted with me, the events and programs we've attended or collaborated on, the types of information they received from me, how likely they were to seek help through the library, whether they had met someone else in the library they could turn to, how supported they felt, their impression of me, and any additional comments or suggestions. (Click here for an open-access example of my survey.)
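To make that structure a little more concrete, here is a hypothetical outline of how those eight questions might break down by type. This is an illustrative sketch only; the exact wording and the type assigned to each question in my actual Google Form may differ:

```python
# Hypothetical outline of the eight survey topics and plausible question types.
# Illustrative only: the real form's wording and question types may differ.
survey_questions = [
    ("Ways you have interacted with me",                             "multiple choice"),
    ("Events and programs attended or collaborated on",              "multiple choice"),
    ("Types of information you received from me",                    "multiple choice"),
    ("How likely are you to seek help through the Library?",         "5-point scale"),
    ("Have you met someone else in the Library you could turn to?",  "multiple choice"),
    ("How supported do you feel by the Library?",                    "5-point scale"),
    ("What is your impression of me?",                               "short answer"),
    ("Additional comments or suggestions",                           "short answer"),
]

assert len(survey_questions) == 8  # matches the "8 questions" promised in the invitation
```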

As for distribution during pandemic conditions, I opted for a succinct email subject line: 5 Minute Survey Request-Library Outreach and Engagement Specialist. I will admit, the brevity of this subject line really depended upon my community trusting that I was sharing pertinent information in this email, but that trust had been established through my careful curation of informative yet succinct emails to the community.

The body of my email invitation was worded thus:

“I hope Finals Week finds you well! I am hoping you could take 5 minutes to fill out this 8-question survey based on your interactions with me in my role as Library Operations Outreach and Engagement Specialist since this Fall. I hope to collect responses up until April 30, 2020. The Google form will not collect your email address.”

Short, sweet, not terrified or desperate. Just trusting that those who could, would, and being thankful for them.

So, what did I learn?

I discovered that all 17 respondents had encountered me both in person and over email, and that half of the participants were faculty and staff, one quarter were graduate students, and the remaining quarter were undergraduate students. The most common contexts in which respondents encountered me were the Dance for Mother Earth powwow committee, the Library, my office, and the Native Graduate Student Group Saturday Gatherings. Other popular contexts included the Powwow Dance Steps class, the University of Michigan Native American Student Association and Eastern Michigan University Native American Student Organization Get Together, and virtual office hours. In regard to the types of information I shared with respondents, 94.1% cited Traditional Knowledge and Cultural practices, 82.4% cited services offered by the U-M Library, and 82.4% cited U-M Campus Services and Resources.

“Jasmine is a friendly, professional, and committed specialist in library outreach who makes serving all patrons, especially Native and Indigenous people, a tangible priority.”

Of those 17 respondents, 13 said they were Very Likely to come to or refer someone to a campus library upon encountering me, with the remaining 4 answering Likely. It made me very happy to see that 58.8% of respondents had met someone else at the Library whom they feel comfortable going to for assistance. What tickled me, though, was that 82.4% of respondents felt very supported by the Library and the remaining 17.6% felt supported by the Library.
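Because all of those percentages are fractions of the same 17 respondents, it is easy to translate them back into head counts. Here is a quick back-of-the-envelope check of my own, again just an illustration rather than part of the original analysis:

```python
# Back-of-the-envelope check: translate the reported percentages back into
# approximate head counts out of the 17 respondents.
respondents = 17

results = [
    ("Very Likely to come to or refer someone to a campus library", 0.765),  # 13 of 17
    ("Met someone else at the Library they can turn to",            0.588),
    ("Felt very supported by the Library",                          0.824),
    ("Felt supported by the Library",                               0.176),
]

for label, share in results:
    print(f"{label}: ~{round(share * respondents)} of {respondents}")
```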

Wrapping it up

I’ll be the first to admit that I am suspicious of assessment, especially when it is used to justify decisions that center quantitative over qualitative data. I think it was leaning into my own distrust of assessment, and into the knowledge of a trusted expert in assessment, that helped me slog through the moral quagmire of distributing a noncritical survey. Under my newly developed empathetic assessment strategies, I did my best to center the circumstances of my potential respondents by limiting the number of questions, aiming for completion in less than 5 minutes, and sending only two reminders. Key to this process were the relationships I had established or strengthened throughout the Fall semester, so I strongly encourage my colleagues to find ways to meet students and faculty in their spaces as well as the spaces we create for them.

I approach my outreach and engagement holistically, combining the promotion of, and assistance with, library services with academic advising and cognitive-coaching methods. The fact that all respondents felt supported by the Library, and that many of them appreciated that I was able to transmit Traditional and Cultural knowledge as well as academic and library knowledge, lets me know that my pre-pandemic approach was working. Now I'll be translating my in-person approaches to a virtual environment for the upcoming Fall semester, and I will surely ask at the end of the next academic year how the virtual approach has worked.

In desperate times, we will need to weigh our need for data against the circumstances that our potential respondents are experiencing. I do hope discussing my approach to Empathetic Assessment and Radical Response Rate Acceptance helps you navigate any assessment-related moral quagmires you may encounter.