Chat support has become a feature users expect from modern websites. The Ontario Public Service saw an opportunity to provide helpful support to users (and, in many cases, to automate that support). I was tasked with joining a team of technical and business stakeholders to research and design an effective chat solution that could meet the needs of visitors on Ontario.ca. Put simply: users in 2020 expect some form of chat support to exist, and in many cases prefer it to other contact methods.
As a member of the Ontario Digital Service prototyping lab, I was assigned to this project as the lead designer and lead researcher. I worked closely with stakeholders from the Ministry of Government and Consumer Services to deliver actionable, user-centred recommendations for what chat support should look like on Ontario.ca.
Chat support takes many different forms across the web. This research specifically concerns user preferences for online, live chat with a support agent. The contact centre team would be responsible for the initial implementation of the chat feature; as a government, we would then learn from the launch of the service and continue to iterate on the findings. The service is currently under development and may look different at launch than what is presented here. Ontario's Digital Service Standard describes this iterative, user-centred approach in more detail.
Throughout the project, I worked with stakeholders who had very little experience with user research and user testing. As a representative of the prototyping lab, I not only completed this research work to a high standard but was also encouraged to take on a teaching role. The hope was that by teaching and co-leading the research, business-side stakeholders could continue the practice of conducting effective, people-focused user research after I moved on to a new project.
How might we deliver a user-friendly chat interface to the people of Ontario that meets and exceeds user expectations for features, wait times, and overall satisfaction with the service? Additionally: how might we gather measurable feedback throughout the design and delivery of this service to continually iterate and improve?
As the research lead on the project, I was responsible for all of the research operations: writing the screener, selecting participants, writing outreach emails, and scheduling sessions. In between those tasks, I crafted effective, open-ended questions, created a clickable prototype to put in front of users, and coached my stakeholders on how to do these things so that they could take up the mantle after my engagement on the project ended. Typically this work would be divided among two or three members of the prototyping lab, but because of how thinly we were stretched, I was the sole product designer assigned to the project.
Once my prototyping tasks were done and the scripts were written, we were able to gather a wealth of feedback. I was then responsible for taking notes on the transcriptions and recordings of the interviews I had led, and for synthesizing those notes into actionable insights for prototype adjustments. It was a lot of work, but the sessions were incredibly informative, and my stakeholders were engaged observers throughout the project, attending a large majority of the sessions.
I led (or co-led with project stakeholders) 11 feedback sessions, totalling 330 minutes of feedback from the public. We were able to reach a wide variety of users, especially considering the limitations of conducting feedback sessions during the COVID-19 pandemic. I was particularly proud that we reached multiple users over 65 years old, as well as users from 5 different self-identified ethnic backgrounds. Based on notes and observations from myself and my stakeholder team, I facilitated work on surface-level personas to capture key information about the types of users we were able to speak to.
Based on our findings from these sessions, we synthesized common themes expressed by a majority of participants. We then used a prioritization matrix to make informed changes to our prototype and to create actionable recommendations for the project moving forward.
It's also worth mentioning that users expressed an interesting common line of thinking about their data: specifically, that the privacy statement should be kept short. A majority of users expressed a lack of interest in the details, but also mentioned that they appreciated having some knowledge of how their data was being used. This isn't how I originally anticipated users reacting to the privacy statement we presented; I would've expected them to ignore it altogether. This is something I'm interested in exploring further as the service launches.
I pride myself on being a feedback-driven designer. In practice, that means taking findings from feedback sessions, analyzing their causes, and making improvements to the prototypes to better serve users. This is the core of the work that I do: I actively communicated with my stakeholders throughout the feedback analysis so that they understood my interpretations of the findings. Qualitative research findings can be difficult to parse, but removing ego from the equation and remembering that good designers thrive on feedback makes for a much more effective team.
After carefully considering all user feedback, we identified core areas where we could improve our prototype designs. Although our sample size was relatively small (11 participants), it was large enough to identify recurring trends and address them. As the service matures and nears launch, it will be interesting to revisit testing to see whether user concerns were effectively addressed.
After making these adjustments to our prototype, we distilled additional guiding principles for the team to keep in mind as research and development continue.
Some of the core principles we generated based on feedback:
Based on the initial prototype and the feedback we received from users, I created a revised prototype that was approved by stakeholders. Once the interface was re-created in response to user feedback, I generated thorough handoff documentation for the development team to use while building the beta of the service. You can see some screenshots of what that interface may look like below.
The project will continue to evolve over time; as we learn more from users, improvements will be made. Because the service is pre-launch, it is hard to quantify its eventual reach. However, I can report that it received strong support internally, and the team of stakeholders I worked with is operating under the assumption that this chat window design will become a standard across Ontario.ca.
"Your work incorporating the design system into their live chat UI and testing it with users is greatly appreciated. Thank you for your dedication, confidence and flexibility in being thrown into a project that you had to lead on your own. We are so proud of coops like yourself who exemplify the digital way of working."
Hillary Hartley
Deputy Minister, Chief Digital and Data Officer
Ontario Digital Service