The World Council for City Data (WCCD) Open Data Portal showcases data for ISO 37120 certified cities. It allows users to visualize complex data sets, and to compare and benchmark trends across cities the world over.
We were tasked with conducting usability studies and building prototypes of recommendations that would improve the usability of the site. Working over the course of 8 weeks, our group of 7 students collaborated in modified GV Sprints to analyze the current offering, create mockups of improvements, test those mockups, and make revisions based on our feedback sessions.
We began with a competitive analysis and literature review to develop our understanding of the problem space, looking at other well-designed websites centred on data visualization. We also conducted stakeholder interviews to get a better sense of their priorities.
Here are some highlights from what we learned:
Working with such a limited budget (surprise: it was literally $0) and limited time, we had to be mindful of scope when creating our project plan. Ultimately, we decided to move forward with one round of testing and feedback sessions in the hopes of learning about user needs directly from users themselves. After the feedback sessions, we would create mockups and recommendations based on what we heard, and encourage further testing.
How might we create a simpler, easier-to-understand interface that removes barriers to entry for accessing city data? Further, how might we build more engaging visualizations that address the needs of advanced users?
With a reasonable frame of reference for the project established, we moved forward into building a testing framework. We started by analyzing our research and agreeing on some common takeaways, then examined the current platform to better understand the opportunities for improvement.
Once that was completed, we moved on to ideation exercises, and decided to format our feedback sessions as a hybrid interview and prototype walkthrough.
Our team worked through a number of ideation activities: four-step sketches, 'Crazy 8s', and solution sketching. Each team member took what they learned from the competitive analysis and suggested improvements to the current platform, sketching ideas over a 2-hour workshop.
Ultimately, each team member drew out their best idea into a 3-panel storyboard and taped their ideas up onto a whiteboard, art museum style. We then heat-mapped our favourites by placing dot stickers beside the parts of each solution that seemed the most promising, and wrote down some questions based on these ideas.
From there, we went into a speed critique: discussing each solution sketch to decide what should be included in the first round of prototyping. We took the best of everyone's solution sketches, and built those into a storyboard we could show users in testing. We understood our gaps and priorities at this point, and all that was left to do before the round was write the script and complete the prototype.
I personally built out the screens for the prototype shown to users, and for the final recommendations. The prototype was built in Sketch over a short time frame and included over 50 artboards. I was responsible for leading the prototyping in both the initial round and the recommendations.
Some of the features we added to the platform in the first round of testing are below:
We hosted interviews and observed users walking through our prototype. We also asked generative research questions prior to the walkthrough, and used A/B testing to validate some assumptions that our stakeholders had not previously validated with users.
Users completed pre-determined tasks, and were timed and recorded as they walked through. We observed users' clicks and eye movements, and we heard lots of feedback. Here are some highlights of what we heard:
Our initial plan for our solution involved adjusting the scatter-plot layout of the data as presented in the current site visualization. However, we heard overwhelmingly from users that the way the current data was visualized was actually “refreshing,” despite its somewhat clear usability issues. In our final prototype, we elected to focus on addressing the usability issues we found through testing rather than the data visualizations themselves, and recommended exploring the visualizations in more depth in a further sprint.
We adjusted our onboarding experience based on our testing feedback, trimming the fat off of our initial prototype and taking into account the needs and wants of our users.
Some things we brought over from the current site that worked well for our users in testing:
Some key changes we made from the current site to now:
* Visualization views weren't fully redesigned (based on user feedback), but instead adapted from the current design to the new design language of our proposed solution.
Continuing to explore the ways complex data gets represented is a key to the long-term sustainability of the site. The people we tested with generally understood the language of the city-based scatter-plot, but problems like labeling and density still exist in the current visualization.
While we initially planned to tackle that challenge, our focus quickly shifted to accessing information (selecting filters) once we realized that was a key roadblock sitting before the visualization in the user's process. In the future, it would be worth exploring ways to display information that work in denser situations.
As mentioned previously (during the Initial Prototyping stage), we attempted to work out an option where a user could log in and save a list of filters for later reference, but we simplified our prototype for the sake of testing our core functional changes to the system's usability rather than this feature. Moving forward, it may be worth revisiting the ability to save and select visualizations from a user profile or list.
I received a 100% final grade for this project.
Looking back on this project years later... it's clear to me we over-thought a lot of what we recommended. We were in way over our heads with the amount of ground we tried to cover, and rather than focus on a truly impactful portion of the site that we would be able to affect, we spread ourselves too thin and delivered surface-level recommendations. I would love to go back and cut out a ton of the scope creep that plagued this project.
I'd also spend less time on high-fidelity prototyping than we did. Although medium- and high-fidelity prototyping could be helpful with a project like this, we were much too early in the process to extract any value from that time. Our time would've been better spent writing more effective research questions, and really getting to the bottom of what people want from a service like this.
It's also worth mentioning that the company we were partnering with experienced some turnover mid-project, which disrupted the advising we received from the community partner meant to liaise between our group and the World Council for City Data. Looking back, we definitely could've used a firmer hand to push us in the right direction to make an impact.
I would personally like to thank the World Council for City Data for being gracious partners, and for affording our group the opportunity to work on a project that challenged us as students and designers in new and exciting ways.
Our group would also like to thank Abby Goodrum for her continued guidance and support throughout the duration of the project, as well as our testers from the Digital Media and Journalism program at Laurier Brantford for being kind enough to let us test our ideas on them.