Ky Wojciechowski

I'm a researcher, writer, and designer committed to making information easier to use.

Flint Public Library

Conducted a series of contextual inquiries and developed a comprehensive method of collecting usage statistics and feedback from patrons.

Role: UX Researcher

Timeline: Sept.–Dec. 2017


Overview

As part of the Contextual Inquiry and Consulting Foundations course at the University of Michigan (SI 501), my team and I conducted a series of contextual interviews with stakeholders at the Flint Public Library to get a better sense of the issues preventing them from gathering meaningful feedback from patrons and program participants.

Background

The library’s 2016 annual report indicated that its mission was to become “Flint’s go-to place to learn for life”. To this end, its staff is committed to the development of family and digital literacy and the creation of a community gathering space. These commitments are met through a full schedule of workshops, group meetings, services, tutorials, activities, programs, and resources.

However, library administrators found that their methods of gathering feedback from patrons and program participants had been neither efficient nor successful. These methods were informal: feedback was collected mostly through verbal communication and paper forms, which often went missing or were never entered into a centralized database for future use.

Accordingly, administrators wanted to develop

  • a comprehensive, standardized method of collecting usage statistics and feedback from patrons and program participants and
  • an effective framework for turning that data into meaningful information, whether for the purposes of informing decision-making or providing content for year-end reports, funding requests, or the like.

Methodology

Contextual Inquiry

To more accurately assess the library’s need for a more formalized and standardized system of collecting feedback, we interviewed six employees (administrators, program facilitators, and Digital Learning Specialists) to understand their workflows and current information management practices and to validate or refine the problems described to us.

Data Analysis

After completing each interview, our team held an interpretation session and analyzed our findings as a group, using notes and recordings as points of discussion. Individual pieces of data—those that captured single facts emerging from either the interview or our observation—were then recorded as discrete affinity notes.

After our interviews and individual interpretation sessions were complete, we held a final interpretation session in which we created an affinity wall. This activity required us to gather our nearly 250 discrete affinity notes and categorize each according to theme or sentiment.

From our affinity wall came a series of findings and recommendations.

Findings & Recommendations

Lack of Formalized and Standardized Work Processes

Without a formal framework for librarians or Digital Learning Specialists to follow when planning new programs, most programs were implemented based not on the specific learning outcomes they ought to achieve but instead on the types of activities the Library had the resources to offer.

Recommendations: Accordingly, we recommended using outcome measurement as a program planning tool for programs that meet the outlined criteria.

For programs that do not meet those criteria, the general principles behind outcome measurement can still be applied through backward design.

Lack of Motivation for Feedback Collection

The few data points that are regularly collected (circulation data and program head-counts) are gathered for annual reports and grant applications. Librarians and specialists have no underlying motivation or clear purpose for collecting more in-depth, qualitative feedback.

Recommendations: To better communicate the needs and expectations for library programs between the administration team and other staff, we recommended that the administration team promote internal communication about program purposes and desired outcomes, incorporating team communication and collaboration tools such as Slack.

Also, because the library lacks strong personnel and funding support, we recommended leveraging the joint efforts of Library ambassadors and volunteers for easily implemented feedback-collection tasks such as entering data, conducting short follow-up interviews with program participants, and gathering “impact” stories from patrons.

Lack of Organizational Resources

The Flint Public Library ultimately suffers from funding and staffing shortfalls, which create constraints in program planning, implementation, and standardization that stifle innovation and may prevent the Library and its resources from reaching certain sectors of the Flint community.

Recommendations: To ease staffing constraints and allow the Digital Initiatives team to explore adult digital literacy programming, the Flint Public Library could collaborate with the University of Michigan-Flint to develop student volunteer programs in the Library, focusing on the Computer Science, Engineering, and Physics department and on student organizations such as the Association for Computing Machinery.