Launching the Digital Preservation Assessment

Initiating an Urgent Assessment Response

In the summer of 2021, the University of Michigan Library’s Digital Preservation Steering Committee found itself with a compelling challenge. Library leadership had asked for a proposal to improve preservation support, and we had only a few months to complete an internal assessment of our digital preservation capabilities to inform it. From the start, we knew there were flaws in how the library conducted digital preservation activities, but we were also concerned with identifying unknown blind spots. Our recommendations needed to reach administrators before the annual budget process in October, which meant finishing the assessment by the beginning of the fall semester.

In mid-2020, following the library’s renewed focus on digital preservation as a strategic objective, the Digital Preservation Steering Committee formed to oversee policy development. The Steering Committee completed a cornerstone visioning document called Baseline Digital Preservation in June 2021, which defined the minimal activities required for the responsible stewardship of the library’s digital materials. In response to the Baseline document, library administration asked the Steering Committee to submit a request for resources that would bring our digital preservation program closer to enacting the Baseline standards. The Steering Committee formed an Assessment group to plan and conduct a survey, analyze the results, and create a report to share with the administration.

In all of this program-building work, we benefited from regular consultations with our colleagues in the Big Ten Academic Alliance (BTAA) Digital Preservation group, a network of digital preservation practitioners from peer institutions. In particular, the Baseline document was inspired by a similar document created by the University of Wisconsin–Madison Library, and our assessment plan was informed by Michigan State University’s experience in implementing the Digital Preservation Coalition’s Rapid Assessment Model. We considered community standards and existing assessment models during planning, but ultimately chose to develop our own approach.

Figuring Out Assessment Methods

The Assessment group began meeting to figure out how to quickly gather evidence to make the case to library administrators for additional support. We soon realized there was not enough time for a thorough gap analysis of every essential activity defined in the Baseline, so we looked for a different way to get the information we needed.

We consulted with the library’s Assessment Specialist, Craig Smith, who helped us think through our goals and make the best use of the short time available. Following Craig’s suggestions, we split the planned assessment into two phases. Phase 1 would be an informal, targeted effort to gather impressions from library stakeholders, completed over the summer in time to inform our recommendations to administrators. Phase 2, on a more relaxed timeline, would delve into a more detailed gap analysis and measurement of our Baseline commitments. The information gathered during phase 1 would shape our approach to phase 2 and to ongoing assessments in the future.

During phase 1 of the assessment we hoped to learn how much preservation activities depended on organizational context: staff capacity, access to support and expertise, communication channels, and how roles and responsibilities were defined. We wanted a picture of the digital preservation program that went beyond metrics, gathering critical feedback on where our digital preservation services needed to improve. Essentially, we wanted to give stakeholders the chance to tell us how they perceived problems with the digital preservation support they received.

To get the most mileage out of our limited time, we invited a small set of stakeholders (around 15 people) from different areas of the library to participate in one-hour Zoom interviews. Over the course of two months, members of the Assessment group met with interview subjects selected from Library Information Technology (LIT), Digital Scholarship, Research Data Services, Collections, and other areas of library work that involve digital preservation activities.

To encourage engagement with the assessment, we chose a survey format that fostered a conversation around preservation. We drafted a set of seven questions to guide the discussion, and left room for respondents to raise any issues that fell outside the scope of our questions. The survey acknowledged that we were aware of existing digital preservation shortcomings, and invited respondents to elaborate on how they perceived these issues.

The questions included:

  • What do you see as obstacles to the Library implementing the strategic goals to plan and deliver a comprehensive digital preservation program?
  • Where do you see hurdles to the proper preservation of the material you work with? 
  • What digital preservation commitments do you feel are not currently being met?
  • What additional funding or technical support do you require now or in the future to manage your digital content? (For example: additional storage, software or tools, file migration, etc.)
  • What are the strongest needs for digital preservation expertise and/or staffing in your area of the Library? (For example: additional staff hours for quality assurance, professional development or other training opportunities, digital media specialists, etc.)
  • What is the most important digital preservation priority for the content that you manage? How can the Library best support that priority?
  • Is there anything else you would like to tell us about digital preservation in the Library?

Participants were emailed a link to a Google Doc with the survey questions in advance and invited either to fill out the form themselves or to go through the questions with us during the interview, with the interviewer completing the survey as notes on the discussion. The Google Doc remained open after the interview so respondents could revise or elaborate on their answers if ideas occurred to them later.

We wanted respondents to feel free to be candid, so each survey document was kept private and seen only by members of the Assessment group, and only anonymized quotes were used in the report to administration. We did not want to filter the responses or tone down any of the criticisms. Throughout the process, we sought to be transparent about the goals of the survey and how the responses would be used in the report to administrators.

Outcomes

The survey conversations surfaced concerns about a lack of support for collecting born-digital materials. Meeting our current preservation commitments requires building up our knowledge and expertise around born-digital formats and new forms of digital media and scholarship. Respondents identified the need for better communication between those who collect digital materials and those who support their preservation. The survey also highlighted serious limitations in our ability to handle these problems at current staffing levels.

After the survey interviews were completed in August, the Assessment group analyzed the results and drafted the report. The report included recommendations that we could implement right away to meet the challenges identified by the survey, as well as proposals for four additional staff positions to address capacity issues. We also endorsed several existing position requests in Digital Scholarship and the Digital Conversion Unit that would support preservation priorities.

One key finding of the assessment was that the survey itself proved a valuable tool for strengthening our digital preservation policy. The conversations with colleagues not only identified problems, but also created an opportunity for deeper communication that could carry over into finding solutions. The report therefore recommended establishing a standing program to document and evaluate our digital preservation requirements on an ongoing basis, starting with the follow-up phase of the current assessment.

For an in-depth look at the digital preservation results of the assessment, check out our Bits & Pieces blog post.

Up Next: The Baseline Assessment

July marks the one-year anniversary of the completion of the Baseline Digital Preservation document. The Baseline was created to calibrate a shared understanding of digital preservation activities so that we can align our efforts and priorities across different areas of library work. 

Building on what we learned from phase 1 of the assessment, we will launch phase 2 in July 2022 to gather information related to the activities defined by the Baseline and build a more detailed picture of our digital preservation channels, an effort we’re calling “Mapping the Universe.” This will give us a clear sense of how expectations are communicated between library services and confirm which policies are being enacted across the preservation lifecycle of our digital materials.

The methods for phase 2 of the assessment project will be different, casting a much wider net to gather responses from a general audience of librarians who work with digital materials. This second survey will aim to answer specific questions about preservation functions: when and how they occur, who is responsible, and how preservation actions and policy are documented. After further discussions with Craig Smith, we’re developing a Qualtrics-based survey that uses filtered questions to tailor the survey to each respondent’s role. The results will capture the current status of our preservation program and form the basis of a long-term assessment project, recurring every few years, to measure our progress. Over time, we want the assessment to track our progress against the Baseline by identifying and validating the changes that have been most effective in carrying out our preservation policies.
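As a rough illustration of how filtered questions work, here is a minimal sketch of role-based question routing written in Python. The roles and questions are hypothetical placeholders rather than our actual survey content, and in Qualtrics itself this filtering is configured through the survey editor’s display logic rather than in code.

    # A minimal, hypothetical sketch of role-based question filtering.
    # The roles and question texts below are illustrative only.
    QUESTIONS = [
        ("Which preservation actions happen when your content is ingested?", {"curator", "developer"}),
        ("How are preservation responsibilities documented in your unit?", {"curator", "manager"}),
        ("Where is the preservation policy for your content recorded?", {"manager"}),
    ]

    def questions_for(role):
        """Return only the questions tagged for the respondent's role."""
        return [text for text, roles in QUESTIONS if role in roles]

    for question in questions_for("curator"):
        print(question)  # prints only the curator-tagged questions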

Submitted by Scott Witmer (switmer@umich.edu) and Lance Stuchell (lstuch@umich.edu).