A New Website for the Louisiana Department of Agriculture and Forestry

Screenshot of an LDAF website page titled "Supporting agriculture producers and sellers"

My first project at Ad Hoc LLC was an overhaul of the website for a state agency, the Louisiana Department of Agriculture and Forestry (LDAF). Check out the website we launched last fall at ldaf.la.gov. 🎉

Partnering with LDAF stakeholders, we were able to improve task success rates, reduce time on task, and increase user confidence in the website overall. I led research for the project, which included market/competitor analysis, a top tasks survey, card sorting, tree testing, first-click testing, and pre- and post-launch usability testing. There was also a ton of information architecture and content strategy work.

It comes as no surprise to most people that government websites can be hard to use, and getting the chance to improve this one for the people of Louisiana was really rewarding.

Learn more about the project:

Lessons From a UX Librarian

Me ready to conduct usability testing in the library lobby, 2019

As I’m beginning my new role with Ad Hoc after 19 years at the University of Arizona Libraries, I recently published this article on Medium: What I’ve Learned as a UX Librarian: 19 lessons over 19 years at an academic library.

Here’s the abbreviated list:

  1. Decision making needs to be clear
  2. Leadership needs to be on board
  3. UX can influence culture and should inform strategy
  4. Students should be on our teams
  5. UX requires research, design, and content
  6. Content strategy is critical
  7. Web writing matters
  8. Usability testing pays off
  9. Internal-facing UX is good, too
  10. We need allies and champions
  11. Assessment and UX work should be aligned
  12. Marketing and UX should collaborate
  13. We should make it fun
  14. We need to work with the implementers
  15. Research can be lightweight
  16. We should democratize UX (to a point)
  17. Patience is key
  18. We need to adapt and iterate
  19. We should prioritize what matters most

I’d love to hear from others what resonates and what I might have missed, either in comments here or on the Medium post!

Vanna from Wheel of Fortune in front of complete puzzle reading: I had a good run
Timely Wheel of Fortune puzzle, May 2022

Lightweight & Impactful: UX in Action and on a Budget

Constraints within the UX process are a common challenge. Restrictions such as budget, time, tools, and access to users can lead to new ways of producing lightweight yet impactful work.

In this talk for UX Wellington last month, I shared methods and experiments from the University of Arizona. Given that I’m leaving for a position in the private sector next month – improving federal government services – this was a great way to cap off my career at the University of Arizona Libraries and share the approaches we used to scale our work, including our Tiny Café, participant pool, and research repository.

The talk is filled with examples of how I implemented impactful UX practices within constraints and on a tight budget. The presentation was interactive: participants shared a bit about their own work and the challenges they face. We’re all in this together.

See the slide deck or watch the full recording below.

What We’ve Learned About Remote UX Research

Since March 2020, our UX team has been working remotely due to the pandemic. We’re expecting to go back on-site in a few weeks, so I wanted to take the opportunity to reflect on what we’ve learned about remote research. It took a while, but we learned several strategies that allowed us to keep a research practice going.

Recruit from a pool

We built a participant pool that allows us to send out email invitations through Mailchimp. Now that the pool includes over 320 students, faculty, staff, and community members, we can reliably get responses to our requests for participation. See how we set it up in Remote recruitment for UX studies, an article by our intern Rachel Brown from last July. We also just made a public webpage about our participant pool, including some guidelines for using it.

Text on a screenshot saying “Help us improve our websites, services, and more” followed by a description of the sign-up to receive invitations to surveys, feedback sessions, prototyping sessions, and more.
Google form inviting people to join the participant pool
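
If you’re curious about the mechanics, here’s a minimal sketch of how form sign-ups like these could be pushed into a Mailchimp audience through its Marketing API. The API key, audience ID, and CSV column names below are placeholders for illustration, not details of our actual setup.

```python
# Sketch: push participant-pool sign-ups (exported from a form as CSV)
# into a Mailchimp audience. All keys, IDs, and column names are placeholders.
import csv
import requests

API_KEY = "YOUR_MAILCHIMP_API_KEY"   # placeholder
DATACENTER = "us1"                   # the data center suffix of your API key
LIST_ID = "YOUR_AUDIENCE_ID"         # placeholder audience (list) ID

BASE_URL = f"https://{DATACENTER}.api.mailchimp.com/3.0"

def add_participant(email, first_name=""):
    """Subscribe one sign-up to the participant-pool audience."""
    resp = requests.post(
        f"{BASE_URL}/lists/{LIST_ID}/members",
        auth=("anystring", API_KEY),  # Mailchimp uses HTTP Basic auth with the API key
        json={
            "email_address": email,
            "status": "subscribed",
            "merge_fields": {"FNAME": first_name},
        },
        timeout=10,
    )
    return resp.status_code

if __name__ == "__main__":
    # The column names below are assumptions about the form export.
    with open("participant_pool_signups.csv", newline="") as f:
        for row in csv.DictReader(f):
            print(row["Email"], add_participant(row["Email"], row.get("First name", "")))
```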

Keep the ask simple and relevant

We’ve sent over 30 recruitment emails since last March, and one of our student colleagues, Yashu Vats, analyzed the email data to determine which messages had the highest response rates. Those that did best:

  • Had active subject lines (e.g. “Help us pick the best thank-you items”)
  • Were a short time commitment (5 minutes or less)
  • Had clear calls to action (e.g. “Vote on your favorite”)

Emails that asked the participant to reply to set up a time, or to complete an activity in a tool like Padlet or Lookback, didn’t do as well.

Email with library logo saying "Your feedback could help us make decisions!" Call to action button labeled: Help us decide in 1 minute.
Recruitment email for a simple survey
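
If you want to run a similar tally on your own recruitment emails, a lightweight version of the analysis might look like the sketch below. The CSV layout and column names are hypothetical, not taken from our actual data.

```python
# Sketch: compare response rates across recruitment emails.
# Assumes a hand-built CSV with one row per email and hypothetical columns:
# subject, sends, responses, time_commitment_minutes, has_clear_cta.
import pandas as pd

emails = pd.read_csv("recruitment_emails.csv")
emails["response_rate"] = emails["responses"] / emails["sends"]

# Which individual emails did best?
top = emails.sort_values("response_rate", ascending=False)
print(top[["subject", "response_rate"]].head(10))

# Did short asks (5 minutes or less) outperform longer ones?
emails["short_ask"] = emails["time_commitment_minutes"] <= 5
print(emails.groupby("short_ask")["response_rate"].mean())

# Did a clear call to action help?
print(emails.groupby("has_clear_cta")["response_rate"].mean())
```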

Consider unmoderated methods

We tend to prefer moderated methods, and miss Tiny Café terribly since it allowed us to conduct moderated, lightweight research on a weekly basis. But the logistics surrounding scheduling and technology for virtual sessions proved a big barrier for recruitment. Especially when we had no incentives to offer, we struggled to get people to sign up for time slots.

We learned that we can get useful data through unmoderated methods, including first-click tests, preference tests, impression tests, and other well-written surveys. They’re not always our first choice, but if we hadn’t used these methods we’d have barely heard from our users this past year. And when we kept the responses simple, we could easily get 50 responses within a day.

Pie chart showing responses to the prompt, "Vote on the version that makes sense to you." 80 responses with "How safe does this space feel?" winning 62.5% of the vote.
Survey sent to participant pool to inform language for a campus project on perceptions of COVID safety

Make signing up easy

For moderated sessions, the sign-up process can be a barrier. After trying and failing with SignUp Genius, Zoom registration, and a “Respond to this email to sign up” option, we now use Calendly for almost all moderated research. It’s easy to use for both researchers and participants. You can customize time slots, sync with Outlook, and add screener questions. Calendly also sends meeting request invites, and we have yet to have a no-show.

Interface with a calendar and options to sign up on Friday, Jun 25 at 9am, 10am, or 10:30am
Calendly sign-up form for user interviews

Provide an incentive

Some compensation or incentive for participants goes a long way. If you want someone to sign up for a moderated session especially, this can be critical. The incentives don’t have to be big, but they should be something. One of our campus partners offered a bag of swag left over from our IT Summit and had a strong response.

After much delay, we were able to secure gift cards for student participants through an incentive program offered by our campus bookstore. Since being able to offer gift cards (specifically $15 for a 30-minute session), we’ve had a huge increase in response rate. We’re working towards other, non-monetary incentives for the fall semester. In a survey, coffee, tea, and items unique and local to Tucson were popular options. In the guidelines we’ve set up for our participant pool, we now require some sort of incentive if you are asking for more than 30 minutes of a participant’s time.

Miro virtual sticky notes with ideas about incentives, including tea, key tracker, fridge magnets, hot sauce
Ideation board for incentives, completed by the library’s UX team, Business Office, and CATalyst Studios

Don’t stop now

We’ll be going back to the office almost fully in the fall semester, but we’ll continue to take advantage of what we’ve learned about remote research. Having this option strengthens our research program, since it allows us to connect with a diverse and distributed population of participants, including remote students and instructors as well as those who might never visit the library in person. That said, I’d be lying if I told you I wasn’t really excited to be back at Tiny CafĂ©.

three people sitting at a table with laptops and snacks, next to a Tiny Cafe sign
Tiny Café, pre-pandemic, at the UArizona Health Sciences Library

Lost in the Stacks: Human-Centered Research

Last November, I was invited to join a podcast conversation on Lost in the Stacks: the Research Library Rock ‘n’ Roll Radio Show.

We often talk of human-centered design, but rarely do we talk about how to make our research itself (which guides our design) also human-centered and empathy-driven. In this conversation, I joined Aditi Joshi, Code for America Senior Qualitative Researcher, to talk through human-centered, trauma-informed research that puts wellness and care of human participants front and center.

Listen to Lost in the Stacks Episode 476: Human-Centered Research.

Service Blueprinting 101

Service blueprinting, as a core component of service design, can be a helpful tool in the early stages of prototyping an idea.

I created this short presentation for the Innovation for Justice course and thought I’d share it here. It covers:

  • Why services are often complicated: variety of players, channels, and interdependencies
  • What service blueprints do: visualize processes across swim lanes
  • Characteristics of service blueprints: comprehensive, specific, iterative
  • Examples of service blueprints
  • Considerations for creating a service blueprint: your audience, level of fidelity, and goals

Also access the slide deck with notes.
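
To make the swim-lane idea a bit more concrete, here’s a toy sketch of a blueprint for a hypothetical “reserve a study room” service, expressed as a simple data structure. The lanes and steps are invented for illustration and aren’t from the Innovation for Justice presentation.

```python
# Toy example: a service blueprint as swim lanes, where each lane lists what
# happens at the same step of the user's journey. Entirely hypothetical.
blueprint = {
    "customer actions":  ["searches website", "books a room", "checks in at desk"],
    "frontstage":        ["reservation form", "confirmation email", "staff greeting"],
    "backstage":         ["availability updated", "reminder email queued", "card access granted"],
    "support processes": ["booking system", "email service", "card access system"],
}

# Walk through the journey step by step so interdependencies line up across lanes.
for step in range(len(blueprint["customer actions"])):
    print(f"Step {step + 1}")
    for lane, actions in blueprint.items():
        print(f"  {lane:<17} {actions[step]}")
```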

Practical Personas: Built Collaboratively and Purpose-Driven

We’ve been using personas at the University of Arizona Libraries for a good while as design and communication tools for different projects. I’ve learned a lot from our different attempts at persona development, so I wanted to share what I’ve learned here, in particular how we’ve collaboratively created personas in a way that builds buy-in and shared ownership across the organization.

Previous personas

I believe it was 2011 when we first tinkered with persona development. But we made several missteps on that first attempt. We:

  • based them on assumptions (rather than research)
  • created them in isolation (by the 4-person Website Steering Group of the time)
  • used stock photos and stereotypes

They were pretty silly and simplistic, and didn’t really help us build empathy for our users. I remember the donor persona, in particular, was inspired by Daddy Warbucks and became more of a joke than an actual tool for our conversations.

In 2014, we gave it another go. This time, we created personas specific to our Website Redux project where we were re-designing the digital user experience. We based them on data, including web analytics, usability testing, and surveys. We shared them with the library at a “Meet Our Personas” open house event.

people standing around table learning about personas
“Meet Our Personas” event

These became much more useful, particularly as we incorporated them into the Redux project. We used them in:

  • User stories, the framework for all web development work (e.g. “Cheyenne wants to reserve a room from her smartphone.”)
  • Content planning, as we associated every new or revised web page with particular persona(s)
  • Project updates, as we held monthly brown bags and used the personas as a basis for much of our work

We also distinguished between our primary and secondary audiences. We had 4 primary personas:

  • Cheyenne, the freshman
  • Brandon, the PhD student
  • Emily, the graduate student and teaching assistant
  • Renee, the faculty member

And 3 secondary personas:

  • Donald, the potential donor
  • Elle, the library staff member
  • Craig, the community user

7 personas from 2014
Snapshot of personas from 2014 website project

2018 Persona Project

Context

Come 2018, a number of things had changed. Our content strategist who provided leadership in persona development, Shoshana Mayden, had left for another position on campus. We had hired a new content strategist, Kenya Johnson, who also played the role of marketing and communications manager. I had moved out of the technology unit into our administration, providing vision for our UX work library-wide. We also realized that, hey, the personas were from 2014, and Cheyenne the freshman would be graduating.

Most of the library staff were familiar with personas. In addition to having used the 2014 personas for several years in the context of our website, we’d also had a design thinking project in late 2017 that gave library employees the experience of creating their own student and faculty personas. This design thinking project also gave us a wealth of new user research data.

So in spring 2018, Kenya and I started working on developing new personas that could be used library-wide.

Intention

We wanted the new personas to be a bit different. We wanted them to:

  • Be useful and adaptable for different project needs
  • Be inclusive and diverse
  • Avoid stereotyping

We identified the purpose of personas as design and communication tools that:

  • consider the users’ perspective and experience, not ours
  • help us understand our audience
  • encourage us to question our assumptions
  • ensure we focus on what matters to people and has the most impact
  • provide a useful foundation and starting point for any project

We wanted personas to help us:

  • describe and empathize with our target audience
  • get on the same page about who we are designing for
  • guide decisions related to services, products, content, design, and more

Workshops

We invited all library staff to attend collaborative workshops to build our personas. We held multiple workshops at different times to allow people to attend no matter their work schedule.

We ultimately had 35 attendees from varied departments, including technology, access services, research and learning, health sciences, and marketing. In the first 1-hour workshop, we:

  • reviewed design thinking personas
  • conducted mock user interviews
  • identified behaviors, motivations, and constraints of particular user types

Two people presenting a sketch and sticky notes version of a persona
Second persona workshop

In the second 2-hour workshop, we:

  • formed teams and drafted goals, behaviors, and constraints for 5 personas
  • identified names, quotes, and photos for personas
  • presented personas to the larger group in a creative way

Our new personas

Persona for Nate with goals, behaviors, and constraints
Final persona for Nate the navigator

Informed by the outcomes of the workshop, we created the following primary personas:

  • Nate the navigator
  • Sam the scholar
  • Isaiah the instructor
  • Linda the learner

And secondary personas:

  • Esmeralda the explorer
  • Evan the employee

One of the main shifts from our previous set of personas was that these were structured around purpose rather than status. We had discovered over the past few years that many of our services weren’t geared specifically to a demographic such as undergraduates, graduate students, or faculty members. Rather, they were geared towards an audience based on their purpose.

Our research services serve all researchers, whether they are faculty, staff, students, or visiting scholars. Our instructional services serve all instructors, whether they are teaching assistants, faculty, or adjunct faculty.

When consulting with staff on projects, such as research support services, we’d often hear things like, “Well, it could be a PhD student or a faculty member, or maybe even an undergraduate.” We’d end up with three or four personas listed as the audience for a service, which was less helpful. So we shifted from thinking about students vs. faculty members and started thinking about learners vs. scholars. We also recognized that, depending on context, an individual could take on different persona identities throughout their experience with the library. Someone might be working on a class assignment in the morning, teaching a course in the afternoon, and navigating library spaces in the evening. We’ve found this to be a much more helpful framing.

Final persona for Sam the scholar

Rollout and training

Kenya and I presented the final personas to our library leadership team, encouraging leaders to use the personas in upcoming projects and to share them with staff. We also provided hands-on training to departments upon request. In one-hour training sessions, we presented the personas and had people break into small groups. They worked through a Project Starter where they came up with a project (usually a real one), identified their primary persona(s), adapted them as needed, and thought through how the persona would help guide their design and communication decisions.

We were hopeful that, by developing the personas collaboratively and through the hands-on training sessions, people across the library would find them useful in their daily work.

Adoption and adaptation

Since launching the personas, they’ve proved helpful for a variety of projects, including the design of new websites, tutorials, and services. The staff who attended the workshops are also now equipped to develop personas whenever they find them useful.

I’ve probably found our new personas most useful as a starting point. Project teams will take one of the personas and adapt it to best fit their purposes. Since the personas were created in PowerPoint, they are easy to update to fit a particular need. By providing complete personas as well as an adaptable template, we’re helping empower staff to place users at the center of their projects, informing their conversations and their decision making.

IT Summit: Creating User-Centered Website Navigation

Over the past several months, our UX team has been preparing for updates to our primary, global drop-down menus on the library’s main website. We started this project in anticipation of significant building renovations and the launch of associated new services to happen in 2020 (see CATalyst Studios). We realized that our existing menu structure didn’t allow for this evolution in our services.

We still have some work to do before launching our new menus, but in October, I presented with two colleagues, America Curl and Lara Miller, on our progress to date. This was part of the University of Arizona’s IT Summit.

In this talk, we covered our user-centered and content-focused process, with our main techniques being card sorting and tree testing. We’ve also done some prototype testing and first-click testing. Hope you enjoy!

User Interviews: Asking the Right Questions

person interviewing someone at a table
User interview at Tiny Cafe, May 2019

User interviews are a great way to learn about and understand the current user experience. They require asking a lot of questions, so asking the right types of questions matters.

A while back, I created a basic training for library employees on how to conduct effective user interviews. I’ve pulled this together as a resource for others.

Don’t lead.

Staying as neutral as possible will ensure more authentic responses from your participants.

Leading question → Non-leading question

  • Is research funding a big stressor? → What has been causing you stress lately?
  • Did you come to the library to study? → Why did you come to the library today?
  • Have you noticed that it’s cold in the library? → What do you notice when you visit the library?
  • When you said X, did you mean Y? → Can you tell me more about what you meant when you said X?

Get them to talk specifics and stories.

What people say is not always what they do, so try asking them to recollect specific experiences.

General question → Question that invites specifics and stories

  • How do you usually conduct research? → Can you walk me through your last experience conducting research?
  • Tell me about your teaching challenges. → Tell me about a time you had a teaching challenge and how you handled it.
  • How do you tend to do that? → Can you give me an example?

Be encouraging but neutral.

There are no right or wrong answers, and all insights are valuable. Don’t insert your own reactions or ideas. Try to keep your poker face and manage expectations.

Leading → Neutral

  • That’s a fantastic idea – I love it! → Interesting. Thank you for sharing that.
  • Yeah, but… → I appreciate your ideas.
  • We tried that once before, but… → Thank you, this is really helpful for us to hear.

Listen and dive deeper.

As advocates for our libraries and organizations, we are tempted to talk about ourselves and the services we offer. We also might want to share our own experiences. But this can harm the interview, distracting from the focus: learning about the user.

User interviews are not intended as two-way conversations. As the interviewer, you should ask questions, listen to the responses, then ask more questions. If there’s a moment of silence, that’s fine! Avoid jumping in. Be comfortable with seven-second breaks where no one is speaking. Allow the participant time to process their thoughts and share their experiences.

Avoid such comments as:

  • You probably didn’t know that the library already…
  • We’ve already heard…
  • Don’t worry, we’re trying to….
  • We tried that before but…

Insert “why” and “how” questions to dive deeper.

  • Interesting. Can you tell me why?
  • Why is that important to you?
  • Why did you approach it that way?
  • Interesting. Can you tell me how?
  • How did that make you feel?
  • Why do you think you felt that way?
  • How do you currently deal with this?
  • Why does that example come to mind?

Other helpful follow-ups include:

  • Tell me more about…
  • Can you expand on that?
  • Is that what you expected?
  • Can you give me an example?
  • What did you find frustrating about that experience?
  • What might have improved upon that experience?
  • Do you have another example of…?

If participants don’t think they are helping.

Participants will occasionally express concern that they aren’t being helpful. They might say things like, “I don’t know enough about…” or “this probably isn’t the type of response you’re looking for.” To respond to this, try things like:

  • We’d like to hear from everyone, and your input is really valuable to us. Can you tell me more about…?
  • This is exactly the sort of information we’d like to hear. Why…?
  • Don’t worry, I have lots of other questions! If you don’t have more to say about ____, can you instead tell me about…?

If participants ask you questions.

Participants occasionally will ask you questions about the project or related services. It’s probably fine to talk with them about these details at the close of the interview (depending on the project), but you want to avoid getting sidetracked by them while you’re still in the midst of the interview itself.

If a participant starts asking you questions, try something like:

  • I can’t answer that right now, because we’d like to hear from you first. But if you still want to know more when we’re done with the interview, I’m happy to share more.
  • I’m happy to tell you more about X, but first let me ask you…

I hope you find these tips helpful! I’d love to hear more tips or suggestions in the comment sections, as this (like everything) is a work in progress.

Gathering Feedback on Library Furniture

When the UX team moved out of the technology department to be housed centrally in administration, I knew we’d expand our scope to include projects related to physical space. I didn’t know that for two weeks at the end of the fall semester, I’d be immersed in a furniture study to quickly gather feedback from students on dozens of furniture options. It was a new thing for the UX team. It was interesting. And I thought I’d share what we did.

two people talking in a collaborative furniture setup
Preliminary user interviews about furniture in spring 2018 (that’s me on the left)

Context

We’re undergoing a massive, multi-million dollar renovation to our Main and Science-Engineering Libraries. Part of the renovation includes new furniture for our ground floors. In early November 2018, we received dozens of chairs, a few tables, and a couple pieces of lounge furniture to pilot for a few weeks. They were placed throughout the Main Library ground floor, with tags identifying them.

We were given a quick timeline to gather as much feedback as we could from students to guide decisions. The UX team (3 of us) worked with our new assessment librarian, Lara Miller, and staff from access services, John Miller-Wells and Michael Principe.

Methods

Michael and John created a survey with Qualtrics that people could take online or fill out in person. They have student workers dedicated to data collection who collected observational data (primarily counts of furniture usage) and transcribed the print survey results.

Screenshot of survey questions: letter of option, rate this option, and tell us more
Survey we provided in person and online

Lara and the UX team conducted more qualitative observations, and did some informal interviews with people using the furniture.

We placed stock photos that the companies gave us (covering only a selection of the furniture) on boards in the space and asked people to mark their favorites with sticky notes. Quickly realizing that this did little to tell us why pieces were marked, we also asked participants to describe the furniture in a few words.

Stock photos with instructions to tell us what you think by putting sticky notes next to your favorites and describing in 1-2 words
Stock photos with instructions for passersby to vote on their favorites and describe them

Getting close to the end of the pilot, we realized we didn’t have as much feedback as we wanted on particular choices, such as the laptop tables. We also wanted to compare some specific stools and specific chairs side-by-side.

To get this feedback, we posted photos of the pilot furniture on a large whiteboard and then placed the actual furniture nearby. We asked participants to mark their favorites with green sticky dots and their least favorites with red sticky dots. (We’ve since realized this would be an issue for colorblind users, so in the future we might use something like stars and sad faces instead.)

Whiteboard with votes surrounded by pilot furniture and students testing the furniture out
Pilot furniture placed around a whiteboard where students could provide feedback

Limitations

We were short on time, it was near the end of the semester, and we had a lot of furniture to get feedback on. The stock photos also didn’t match the pilot furniture exactly.

Stock photos of lounge furniture with sticky notes and descriptions like "flexible" and "outlets!"
Feedback on popular stock photos – we didn’t actually have this furniture as part of the pilot

It’s hard to get authentic feedback on this type of thing. Most of the data we collected was attitudinal rather than behavioral. And if we really want to make the best decisions for our students, we should know what they do, not just what they think. The best way to discover how students actually use the furniture and what they prefer might be an ethnographic study, but we didn’t have time or resources for that.

A significant issue with most of our methods is that students could vote on a chair for aesthetic reasons (color, shape) when they haven’t actually used it in any real capacity. So a chair could score highly because it’s attractive but not particularly functional, especially for long periods of study.

The decision-making process at the end of the day was also unclear, as it’s a negotiation between the library project team, the architects, and the vendors. We can provide the data we collected, but then it’s essentially out of our hands.

Findings

We ended up with 283 completed surveys, 606 sticky notes on the stock photos, and 573 sticky dots on the whiteboards (we removed the stickies as we went so the boards wouldn’t get overwhelmed). We also had 13 days worth of usage data and a handful of notes from qualitative observations.

While we had a couple of hundred survey results, each survey referred to only a single piece of furniture, so it was hard to draw conclusions (just 0-5 pieces of feedback per piece). We found the comparative data much more useful, and in retrospect we would have done more of this from the get-go.

We put together all the data and in December were able to present a selection of furniture we recommended and didn’t recommend for purchase.
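
For anyone attempting something similar, combining the per-piece survey ratings with the comparative dot votes could look roughly like the sketch below. The file names, columns, and scoring are illustrative only, not our actual Qualtrics pipeline.

```python
# Sketch: roll up furniture feedback from two hypothetical exports:
#   surveys.csv   -> one row per survey response (furniture_id, rating)
#   dot_votes.csv -> one row per sticky dot (furniture_id, vote = "favorite" or "least_favorite")
import pandas as pd

surveys = pd.read_csv("surveys.csv")
dots = pd.read_csv("dot_votes.csv")

# Average survey rating and number of responses per piece
ratings = surveys.groupby("furniture_id")["rating"].agg(["mean", "count"])

# Net preference from the whiteboard voting: favorites minus least favorites
votes = pd.crosstab(dots["furniture_id"], dots["vote"])
votes["net_votes"] = votes.get("favorite", 0) - votes.get("least_favorite", 0)

# One table to review alongside the qualitative observations
summary = ratings.join(votes["net_votes"], how="outer")
print(summary.sort_values("net_votes", ascending=False))
```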

In the chairs category, most of the recommendations were adjustable, on wheels, with arms, and with fabric seats. For stools, the ability to adjust up and down for people of different heights was especially important. Those chairs we didn’t recommend tended to have hard plastic seats or metal arms, be non-adjustable, or be less comfortable for longer term use.

whiteboard with images of stools and chairs and voting with sticky dots
Stools and chairs comparison – results of preference voting

The winning laptop tables had larger surfaces (to fit a laptop and a mouse/notebook), felt sturdier, and the legs could fit under a variety of chairs or tables.

whiteboard with images of laptop tables and voting dots on each image
Laptop table comparison – results of preference voting

Overall, we didn’t find anything groundbreaking in the data. But we do now have some solid recommendations to share with the powers that be. And we did learn a lot just through the process, which was in many ways an experiment for us:

  • how to gather data on people’s attitudes around furniture
  • how to act quickly and iterate on our process
  • that it’s possible to gather a bunch of data in a short, focused amount of time
  • that a mixed methods approach works best for this type of thing (as it does for most things!)