Designing the interface for Ontario.ca chat support pilot projects

Working with the Call Centre Modernization team to design and usability test the interface for Ontario's first online chat support options

Problem

Chat support is a common feature across the web today. As a province, we needed to address the growing demand for web-based support in a post-pandemic world.

Goal

Our goal was to develop a robust chat support solution that could scale effectively. It was important to match the current understanding of how online chat support looks and feels for users. We were also required to adhere to the current design system and standards of Ontario.ca.

Action

I facilitated feedback sessions, created personas, built and tested prototypes, and took on a teaching role with my stakeholders so that they could continue to focus on human-centred design after I left the project.

Impact

I wrote thorough developer handoff documentation, which is being used to develop the beta, launching across 44 different program areas. This research will inform the rollout of online chat support across Ontario.ca, potentially reaching millions of users.

Problem

Context

Chat support is a feature that users have come to expect from websites on today's internet, and the Ontario Public Service wants to take advantage of the opportunities it creates to provide helpful support to users (and often to automate that support). I was tasked with joining a team of technical and business stakeholders to research and design an effective chat solution that could meet the needs of visitors to Ontario.ca. Put simply: users in 2020 expect some form of chat support to exist, and in many cases prefer it to other support methods.

Graphical representation of quotes from feedback sessions. Quote 1: "I just like chat support because I think I can easily get my point across. I think over the phone, it's sometimes hard to explain what actually is going wrong." Quote 2: "It’s 2020, you can’t get away without using chat support." Quote 3: "I would do the chat, because I like being able to see the information to go refer back to it." Quote 4: "When I think about support over the phone, I think about long wait times." Quote 5: "Sometimes when I have simple questions, it's hard to sit there for 45 minutes or an hour on the phone just for a simple yes or no." Quote 6: "I am a millennial, a true millennial so for me it's definitely not [phone]... I would definitely prefer either email or some sort of chat available." Quote 7: "The wait time is typically shorter than when you’re actually calling someone."
Figure 1.1: Highlight quotes from our feedback sessions showing the desire for well-built chat support options

As a member of the Ontario Digital Service prototyping lab, I was assigned to this project as the lead designer and lead researcher. I worked closely with stakeholders from the Ministry of Government and Consumer Services to deliver actionable, user-centred recommendations for what chat support should look like on Ontario.ca.


Chat support takes many different forms across the web. This research specifically concerns user preferences for online, live chat with a support agent. The idea was that the contact centre team would be responsible for the initial implementation of this chat feature; as a government, we would then learn from the launch of this service and continue to iterate on the findings. The service is currently under development and may look different at launch than what is presented here. To learn more about Ontario's approach to iterative, user-centred design, see the Digital Service Standard.

Teaching stakeholders the value of human-centred design

Throughout the project, I worked with stakeholders who had very little experience with user research and user testing. As a representative of the prototyping lab, I was not only expected to complete this research work to a high standard, but was also encouraged to take on a teaching role. By teaching and co-leading the research with business-side stakeholders, the hope was that they could continue the practice of conducting effective, people-focused user research after I was moved to a new project.

Figure 1.2: A summary of the process for user research on this project


Goal

Problem statement

How might we deliver a user-friendly chat interface to the people of Ontario that meets and exceeds user expectations for features, wait times, and overall satisfaction with the service? Additionally: how might we gather measurable feedback throughout the design and delivery of this service to continually iterate and improve?

Core research objectives

  • What features and appearance do users expect from online chat support?
  • What preferences do users have when receiving chat support?
  • What will the button to start a chat look like, and where will it appear?


Action

My responsibilities as the research lead

As the research lead on the project, I was responsible for all of the research operations: writing a screener, selecting participants, writing outreach emails, and scheduling sessions. In between those tasks, I was crafting effective, open-ended questions, creating a clickable prototype to put in front of users, and coaching my stakeholders on how to do these things so that they could take up the mantle after my engagement on the project ended. Typically this work would be divided among two or three members of the prototyping lab, but because of how thinly we were stretched, I was the sole product designer assigned to this project.

A graphic depiction of my responsibilities regarding the research on this project. 1: Writing a participant screener, going through screener results to reach the most diverse set of participants possible. 2: Scheduling feedback sessions, collecting user consent to participate. 3: Translating research outcome questions into more conversational, open-ended questions. 4: Writing a facilitation guide and coaching stakeholders so they can host some sessions. 5: Taking notes on the recordings of feedback sessions I hosted. 6: Synthesizing notes into core research insights that can inform additional prototyping.
Figure 1.3: The responsibilities I had relating to the research on this project

Once my prototyping tasks were done and the scripts were written, we were able to gather a ton of feedback. I was then responsible for taking notes on the transcriptions and recordings of the interviews I had led, and synthesizing those notes into actionable insights for prototype adjustments. It was a lot of work, but the sessions were super informative, and my stakeholders were engaged observers throughout the project, attending a large majority of the sessions.

Conducting sessions, creating personas

A screenshot of a series of chat windows, which were shown to users. The text in the chat window is not meant to be visible.
Figure 1.4: The chat window flow we showed users

I led (or co-led with project stakeholders) 11 feedback sessions, totalling 330 minutes of feedback from the public. We were able to reach a wide variety of users, especially considering the potential limitations of conducting feedback sessions during the COVID-19 pandemic. I was particularly proud that we were able to reach multiple users over 65 years old, and users from 5 different self-identified ethnic backgrounds. Based on notes and observations from myself and my stakeholder team, I facilitated work on surface-level personas to explain key information about the types of users we were able to speak to.

A graphic-based persona. Name: Michelle. Age: 16-30. Occupation: student / young professional. Digital literacy: high level of digital literacy, comfortable with technology. Familiarity with online government services: comfortable (has accessed previously). Familiarity with online chat support services: comfortable (has accessed previously). Preferred method of seeking support: online chat, considers it to be the fastest option. Amount of time willing to wait to connect: short period, 2-3 minute max. Quote: “When I think about support over the phone, I think about long wait times.”
Figure 2.1: A persona, based on insights from feedback sessions

Figure 2.2: Another persona, based on insights from feedback sessions

A graphic-based persona. Name: Claire. Age: 50-75. Occupation: Retired elementary school teacher. Digital literacy: low/medium level of digital literacy, online shops since COVID-19. Familiarity with online government services: has no previous experience accessing. Familiarity with online chat support services: has accessed once or twice, not often. Preferred method of seeking support: in-person support (if available safely). Amount of time willing to wait to connect: medium to long period, 10-20 minutes. Quote: “Obviously I have to do things online now because of COVID, so I’m trying to learn.”
Figure 2.3: A third persona, based on insights from feedback sessions

What we heard in our feedback sessions

Based on our findings from these sessions, we were able to put together some common thoughts expressed by a majority of participants. We used a prioritization matrix to make informed changes to our prototype and to create actionable recommendations for the project moving forward.

  1. Users expect a 'start chat' button in the bottom right-hand corner of the page
  2. Users want the ability to save a copy of their chat to reference at a later date (a sketch of how this could work follows this list)
  3. Users don't want to wait a long time to be connected
  4. Users prefer a human agent to a chat bot
  5. Users would like to see their agent's name to indicate they are speaking to a human
  6. Colour contrast, message labels and on-screen location made it clear which message belonged to which party
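
To make the save-a-copy finding concrete, here is a minimal sketch of how a browser-based "save a copy of chat" action could work. This is illustrative only: the names (ChatMessage, saveCopyOfChat) are hypothetical, and the production Ontario.ca implementation may be built quite differently.

```typescript
// Hypothetical sketch of a "save a copy of chat" action.
// ChatMessage and saveCopyOfChat are illustrative names only.

interface ChatMessage {
  sender: "You" | "Agent";
  text: string;
  sentAt: Date;
}

function saveCopyOfChat(messages: ChatMessage[]): void {
  // Format each message as a plain-text line, e.g. "[2:41:05 PM] You: Hello"
  const lines = messages.map(
    (m) => `[${m.sentAt.toLocaleTimeString()}] ${m.sender}: ${m.text}`
  );

  // Package the text as a downloadable file in the browser
  const blob = new Blob([lines.join("\n")], { type: "text/plain" });
  const url = URL.createObjectURL(blob);

  const link = document.createElement("a");
  link.href = url;
  link.download = "copy-of-chat.txt"; // plain language, not "transcript"
  link.click();

  URL.revokeObjectURL(url);
}
```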

It's also worth mentioning that users expressed an interesting common line of thinking around their data: specifically, they felt the privacy statement should be kept short. A majority of users expressed a lack of interest in the details, but also mentioned that they appreciated having some knowledge of how their data was being used. This isn't how I originally anticipated users reacting to the privacy statement we presented them; I would've expected them to ignore it altogether. This is something that I'm interested in exploring further as the service launches.

Figure 2.4: Iterations of the chat button based on user feedback

Figure 2.5: Iterations of the window header based on user feedback


Opportunities to improve initial prototype

I pride myself on being a feedback-driven designer. In practice, that means taking findings from feedback sessions, analyzing their causes, and making improvements to the prototypes to better serve users. This is the core of the work that I do: I actively communicated with my stakeholders throughout this feedback analysis so that they understood my interpretations of the findings. Qualitative research findings can be difficult to parse, but removing ego from the equation and remembering that good designers thrive on feedback makes for a much more effective team.


After carefully considering all user feedback, we identified core areas where we could improve our prototype designs. Although our sample size was relatively small in this testing round (11 participants), it was large enough to identify and address trends. As the service matures and nears launch, it will be interesting to revisit testing to see whether user concerns were effectively addressed.

A graphic-based summary of insights. 1: Users worried they may lose the original button design on pages with more text than the sample shown in testing. 2: Users expected to know their place in queue or an estimate of how long they would be waiting. 3: Users want to know that they are able to save a copy of their chat earlier in the process. 4: Users did not like the word "client" to identify their message (preferred “you” or “your message”). 5: Using emojis as buttons to denote level of satisfaction with service was not perfectly clear on its own for all users. 6: Users were worried they might miss incoming messages if too many were sent by their agent at once. 7: "Transcript" is a technical word, and the reading level needed to be reduced. 8: Users wanted to see the Trillium logo, or some sort of branding to clearly identify that this chat service was provided by the Government of Ontario
Figure 2.6: Summary of insights after 11 feedback sessions
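
One of the insights above, users expecting to know their place in the queue, lends itself to a small illustration. The sketch below is hypothetical (the names QueueStatus and queueMessage are mine, not the team's), showing how a plain-language queue message could be generated:

```typescript
// Hypothetical sketch: telling users their place in the queue and an
// estimated wait, in plain language. Illustrative only.

interface QueueStatus {
  position: number;           // the user's place in line, starting at 1
  averageChatMinutes: number; // rough average length of one chat
}

function queueMessage(status: QueueStatus): string {
  if (status.position <= 1) {
    return "You are next in line. An agent will be with you shortly.";
  }
  // Estimate the wait from the number of people ahead of the user
  const peopleAhead = status.position - 1;
  const waitMinutes = Math.ceil(peopleAhead * status.averageChatMinutes);
  return (
    `You are number ${status.position} in line. ` +
    `Estimated wait: about ${waitMinutes} minutes.`
  );
}

// Example: queueMessage({ position: 3, averageChatMinutes: 2 })
// -> "You are number 3 in line. Estimated wait: about 4 minutes."
```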


Impact

Guiding principles for moving the project forward

After making the previously mentioned adjustments to our prototype, we were able to generate additional guiding principles that the team should keep in mind as they continue with research and development.


Some of the core principles we generated based on feedback:

  • The shorter the wait time, the better (both for initial connection and responses)
  • The shorter the messages, the easier they are to understand
  • Keep the reading level of messages low, and do not use technical words that could intimidate or confuse users (e.g. 'transcript' becomes 'copy of chat'; see the sketch after this list)
  • Confirm with users whether or not their questions have been answered
  • Clearly labelling all key elements is important to users
  • The more requirements you impose (such as the need to give a name or email address), the less likely you are to retain your users
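
One way the plain-language principles could be enforced in code is to centralize user-facing labels in one place, so that choices like 'copy of chat' instead of 'transcript' are applied consistently across the interface. This is a sketch under my own assumptions (chatLabels is a hypothetical name), not the production implementation:

```typescript
// Hypothetical sketch: one source of truth for user-facing labels, so
// plain-language decisions are applied consistently. Illustrative only.

const chatLabels = {
  saveChat: "Save a copy of this chat",  // not "Download transcript"
  userMessage: "You",                    // not "Client"
  agentMessage: (agentName: string) => `${agentName} (agent)`,
  wrapUp: "Did we answer your question today?",
} as const;

// Example: labelling a message row for an agent named "Sam"
// chatLabels.agentMessage("Sam") -> "Sam (agent)"
```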

Based on the initial prototype and the feedback we received from users, I created a revised prototype that was approved by stakeholders, and then produced handoff documentation for the development team.

Making interface adjustments, providing documentation

Once the interface was re-created in response to user feedback, I generated thorough handoff documentation for the development team to use while building the beta of the service. You can see some screenshots of what that interface may look like below.

Reflection

The project will continue to evolve over time; as we learn more from users, improvements will be made. Because the service is pre-launch, it is hard to quantify its eventual reach. However, I can report that it received strong support internally, and the team of stakeholders I worked with is operating under the assumption that this window design will become a standard across Ontario.ca.

"Your work incorporating the design system into their live chat UI and testing it with users is greatly appreciated. Thank you for your dedication, confidence and flexibility in being thrown into a project that you had to lead on your own. We are so proud of coops like yourself who exemplify the digital way of working."

Hillary Hartley
Deputy Minister, Chief Digital and Data Officer
Ontario Digital Service