Project Details

  • Client: People Sourced Policy
  • Timeline: July 31, 2017 - August 4, 2017 
  • Team: Ben Dustman, Josh Kaeding, Laura Cesafsky, Megan Kelly 

Tidbit Overview

  • Full-stack development students from Prime Digital Academy recently built a site for our client, People Sourced Policy. 
  • Our UX team ran usability tests on the existing site to pinpoint usability issues. 
  • We redesigned the site to address user pain points that were discovered from testing. 

Methods

  • Heuristic analysis
  • Usability review
  • In-person usability test
  • Remote usability test
  • Findings and recommendations report
  • Interactive prototype design with Axure

Deliverables

 
Project overview infographic
 

The Problem

People Sourced Policy’s mission is to host an online community forum where people can discuss Minnesota social issues and policies. The current interface is overwhelming and lacks visual hierarchy, so it is unclear how visitors should use and participate in the web forum. Furthermore, many user actions are unclear and irreversible, which makes the interface unfamiliar and uncomfortable, especially for new visitors.

The Solution

I proposed a design for a more inviting and informative interface. Simple and thoughtful information architecture on the landing page allows visitors to quickly grasp the organization’s mission and the purpose of the web forum. In addition, I increased user feedback and error prevention to make the forum a more straightforward and comfortable space for users to participate in.

 

The Process

 

Goals

 
  • Conduct formal in-person user interviews with the existing application. 
  • Uncover pain points and propose design solutions in a usability report. 
 
 

Understanding The Client

We worked with People Sourced Policy, a nonprofit whose goal is to increase awareness of local community issues and politics. A group of full-stack development students at Prime had built a functioning site for the organization, and now it was time to test its usability.

 
 

Research

 

Usability Review

The first thing we did as a team was evaluate the site ourselves. We identified many glitches and potential pain points; the key was to surface usability issues that the full-stack students had not already identified. Each team member completed a usability review, which included a heuristic analysis based on common tasks a user might complete. We then prepared a script with tasks and scenarios for our participants in the upcoming usability tests. 


Testing

In-Person Usability Test

We were fortunate to have Fathom Consulting lend us their interview space for our in-person usability tests. It was eye-opening to see how much preparation and planning go into conducting effective interviews. We had the opportunity to interview and test the site with three participants, and each session lasted approximately 30 minutes. 


Remote Usability Test

After the in-person tests, each team member conducted two remote usability tests. As a group, two of these remote tests were unmoderated, while eight were moderated over video chat and screen sharing. Two of my remote participants experienced unexpected technical difficulties, but I was able to troubleshoot and continue with my interviews as planned.


Synthesis

Data Analysis

Working individually, I picked apart the large volume of data we had acquired, and little by little I started to see patterns and themes within the many usability issues our users encountered. 


Findings

I narrowed down the user goals to three items each for the community member and administrator. Next, I identified the most problematic issues that would interfere with the user's main objective when using the site.

 
 

Community Member Issues

  • Users may have difficulty understanding the purpose of the website. 
  • Users may have difficulty interpreting the flow of the site and hierarchy of the content.
  • Users did not receive consistent feedback confirming task completion.
  • There are not enough systems for error prevention and there are no options for error correction. 


Administrator Issues

  • The administrator dashboard is not easily discoverable and accessible. 
  • It is difficult to determine all the actions a user can perform on the “flagged item” screen. 
  • There are not enough systems for error prevention. 
 
 

Issue Ranking

Each issue was tagged with a severity ranking. 

 
 

Usability Report


Findings and recommendations were compiled in a usability report. [1]


Prototype

 

Building on the recommendations in my report, I created an interactive prototype that incorporated some of those changes. [2] For this portion of the project, I focused on the following:

  • Creating a home page with better flow and hierarchy that helps users intuitively determine which actions they can perform. 
  • Providing explicit and concise information about the purpose of the website and organization. 
  • Giving consistent and obvious feedback when a task has been completed.
 
 

Next Steps


Conclusion

This project taught me how to prepare for and conduct different forms of usability tests. It also allowed us to fully evaluate and iterate on a low-fidelity website. In future client projects, I hope to take notes more efficiently and to try out different data analysis methods to see what works best for me. 

 
 

Documentation & deliverables

 

[1] Usability report [open PDF] 

[2] Prototype [open Axure link]