Many things have changed since COVID-19 began to impact our way of life, and the virus has undeniably accelerated the transition to a digital-first society. The effects on how governments engage with citizens, in particular, cannot be overstated; many projects that might otherwise have taken years to be prioritized became the immediate focus of public servants.
For Ontario's Ministry of Finance and the Canada Revenue Agency, collecting electronic signatures had long been a topic of conversation, but it came into the spotlight in a post-COVID world. Working with the Ontario Digital Service Lab as a product designer, I was tasked with an early-stage research project that would help solidify our understanding of user expectations for digitally signing documents.
I worked closely with stakeholders from the Ministry of Finance to develop prototypes of current-state and future-state implementations of eSignature collection, and independently facilitated user feedback sessions that helped validate and challenge our assumptions.
The program area of focus for the project was specifically the Healthy Smiles Ontario program, which provides dental care to people 17 years of age or younger who come from low-income households.
Working with a previously hired vendor, we were initially unable to view any demo of what a realistic potential solution could look like, due to legal limitations at this early stage of the project. Still, our mandate was to provide thorough recommendations to the Ministry of Finance that would help inform a user-friendly eSignature implementation.
I did not have the luxury of delaying the project until the vendor was able to work with me more closely, so I began by facilitating a two-hour workshop with a diverse set of project stakeholders to develop a plan of action and clearly define our priorities.
Our anticipated delivery must include:
We initially aimed to quantify our changes by:
We observed that the process for signing and submitting an application to Healthy Smiles Ontario does not enable users to sign up entirely online. How might we enable users to submit signatures online so that we can reduce the time and effort it takes to enrol in HSO? Further, how can we design a simple, secure and inclusive experience that reduces the abandonment rate of users attempting to enrol?
Typically, we’d hope to conduct sessions in person to reach as broad a range of participants as possible. With everyone working from home, however, we were limited to online recruitment methods. It should be noted that everyone we spoke to needed some level of technology literacy, since sessions were conducted virtually. We paid special attention to outreach for low-income individuals, asking community health centres and other local community organizations for help reaching this user group.
After clearly defining our problem space and objectives, I created a research plan for our feedback sessions that included a script and specific scenarios for users to walk through. Special attention was given to writing open-ended questions that avoided leading the user: directly asking users what we were hoping to learn wouldn't make for great conversation or promote honest answers.
As I built the script, I worked in parallel on the prototype we would show to users. I collaborated with another designer on the prototyping, taking on a teaching role since they had no prior experience with Figma. Together, we built a recreation of the current process, as well as a future state for the application that adopted the Ontario Design System. We tested with the current-state visual standard, as we decided we would be best served showing users representations as close to the current state as possible.
As a team, we went over the script and prototype together to make sure it aligned with our research goals. I then facilitated user feedback sessions via Zoom, asking generative introductory questions and walking users through the prototype using Zoom's remote access features and our Figma document.
When creating scripts for user testing sessions, it’s important to understand that directly asking what we want to know doesn’t always make for a great conversation. We also want to avoid leading users with our questioning, allowing them to freely offer their own answers.
Something that has been a challenge for me as I've learned and grown as a designer is becoming comfortable with awkward silence. Users usually offer up valuable feedback to fill it, so it's best to let a question sit until you're asked to clarify.
As a simple example, we often ask “what are your thoughts on (a feature)?” or “what would you expect from (a product or service)?” rather than “do you like this?” The hope is that this lets users offer their own thoughts with less of a lead-in to their answer.
Because we were using the Healthy Smiles Ontario program as a testing ground for the eSignature implementation, some feedback was specific to that program. Much of the feedback concerned the content design and the instructions users saw as they worked through the process. We leaned into this focus while maintaining our priority: establishing user preference for how their electronic signature is collected.
Based on the feedback to this point, it was clear we needed to go back to the drawing board on the content design. After our first round of sessions, we improved the instructions on the page.
At this point in our testing, users found the overall process simple and easy. Some concerns were expressed specifically about the accuracy of their signature, but overall feedback was quite positive. As a team, we drilled down on the core criticisms users had with the process. Once our updated prototypes were properly built in Figma, I led another round of feedback sessions to solidify the findings.
Between rounds of testing, I took the opportunity to speak with some internal staff. The verification agents we spoke to had first-hand knowledge of the types of users who typically apply for the Healthy Smiles Ontario program, and of their behaviours. Our project stakeholders gave us a great foundation, and after our public feedback sessions we had a solid set of questions for the people who work most closely with these users.
We learnt a lot from the feedback session with the verification agents, but here are some highlights:
We then pushed forward with a second round of testing with the public.
I pride myself on being a feedback-driven designer. In practice, that means taking findings from feedback sessions, analyzing their causes, and improving the prototypes to better serve users. This is the core of the work I do: I communicated actively with my stakeholders throughout the feedback analysis so they could understand my interpretation of the findings. Qualitative research findings can be difficult to parse, but removing ego from the equation and remembering that good designers thrive on feedback makes for a much more effective team.
After carefully considering all user feedback, we identified core areas where we could improve our prototype designs and made actionable recommendations. Although we were very early in the process and the vendor had not yet been on-boarded, these recommendations form a solid foundation for the project to move forward while prioritizing user needs. Ultimately, the results of this work are largely conceptual: the recommendations serve as best practices, and exact interface designs will need further testing as they are developed.
Based on the feedback we received from business stakeholders, verification agents and the public, we were able to make informed recommendations for best practices. Stakeholders were pleased to have an informed set of recommendations to give to the vendor once development began.
The project will continue to evolve over time; as we learn more from users, improvements will be made. Because the service is pre-launch, it is hard to quantify how much reach it will have. However, I can report that it received strong support internally, and the team of stakeholders I worked with is moving forward with the principles we recommended based on user feedback.
Note that my work on this project was done very early on in the development process: the eventual final product that launches may look different from what is shown here.
With more time and resources, I would have loved to stay involved through usability testing. Having a vendor in place before understanding your users' real needs is also not ideal, but it is an unfortunately common reality in government.