RiseSmart

Research, IA, UX | 2019

 
 

Redefining the wayfinding experience for job seekers

 

 

Team
3 Designers

My Role
Project Lead
Research & Design

Timeline
3 Weeks

Deliverables
Information Architecture
Wireframe

Research
Card Sorting
Tree Testing
Usability Testing

 

 
 
 

Overview

The Company

RiseSmart is an HR company that specializes in outplacement services. It helps individuals impacted by layoffs find new opportunities by providing career transition services such as career coaching, resume writing, and job search assistance.

The Product

Our flagship product, Spotlight, is a one-stop shop where job seekers can find everything they need to get back on their feet. Users can book an appointment with their career coach to prepare for job interviews, request a resume writing service, or get matched with jobs that best fit their skill sets. It is a web app, supported by companion mobile apps.

Our web app before the redesign:

 
 

Problem

If users can’t find it, it doesn’t exist.


Users were struggling to find resources on our platform. That created a poor user experience, and, worse, it hurt the business: our revenue depended on the number of services requested by participants, many of which came directly from the platform, so any navigation difficulty translated into lost revenue.



How did we get here?

Our platform was about ten years old. Over time, new features were added without modifications to the navigation because of the same old story: we didn't have time! As a result, the menu grew longer and longer, making it more difficult for users to find things. The UI structure became unsustainable as we continued to cram more stuff into the limited real estate on the screen. We decided to tackle the problem with a systematic approach through a complete redesign.

 
 

Design Brief

Redefine the wayfinding experience to help job seekers quickly understand what’s offered and where to find it.

 
 

My Role

This project had two phases.

I led the first phase, delivering the information architecture and wireframes in collaboration with two other designers. I set the strategy, conducted user research, synthesized the data, shared findings, and created the information architecture and wireframes.

 
 

Pillars of Success

These three design elements are critical to crafting a successful wayfinding experience:

 
 

Strategy

Our strategy was to involve users along the way. We stayed close to them throughout the project, running usability testing, card sorting, and tree testing to validate our assumptions and refine our solutions.

 
 

Process

Here were the steps we took for phase 1 of the project:

  1. Measure Findability & Discoverability

  2. Define Information Architecture

  3. Create Wireframe

Step 1: Measure Findability & Discoverability

I conducted usability testing with 5 users to validate the problem, get buy-in, and set a benchmark. We uncovered the usability issues that prevented users from accessing our key offerings, such as resume and coaching services.

 

Step 2: Define Information Architecture

Instead of relying on design “experts” to define the site structure, I believed we’d have a better chance of arriving at an intuitive structure if we let the people who would actually use the system “tell us how to design.” So we performed card sorting to uncover users’ mental models, transformed those insights into a site structure, and iterated on it through tree testing.

Card Sorting

Card sorting is a research method for understanding how users make sense of the information on your website. Participants are asked to sort a deck of cards representing that content into categories as they see fit.

We chose an open card sort, in which participants also define labels for the categories they create, so we could draw inspiration from how they would naturally organize our content.

In preparation, we ran a few offline sessions with our internal team for quality assurance and to make sure everyone was aligned on the purpose and process of the project.

We then ran the study remotely and unmoderated with 9 participants, asking each to group 29 cards into categories as they saw fit and to label the groups they created.

For analysis, we sliced and diced the data in four different ways:

  1. Participants: How did people sort information on an individual level? What was the logic behind the groupings?

  2. Cards: What were the categories created for each card? Was there any consistent pattern?

  3. Categories: What categories did people create? What cards were sorted under these categories?

  4. Similarity Matrix: How often were two cards paired together?
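
As a concrete illustration of the similarity matrix, here is a minimal sketch (in Python) of how the pairing counts could be computed from raw open-sort data. The cards, category labels, and data format below are hypothetical; in practice, the card-sorting tool generated this matrix for us.

```python
# Minimal sketch: count how often each pair of cards lands in the same
# category across participants. The data below is illustrative, not our study data.
from itertools import combinations
from collections import Counter

# Each participant's sort: card -> category label they created
sorts = [
    {"Resume review": "Get help", "Book a coach": "Get help", "Job matches": "Find jobs"},
    {"Resume review": "My documents", "Book a coach": "Coaching", "Job matches": "Find jobs"},
    {"Resume review": "Coaching", "Book a coach": "Coaching", "Job matches": "Coaching"},
]

pair_counts = Counter()
for sort in sorts:
    # Group this participant's cards by the category they assigned
    by_category = {}
    for card, category in sort.items():
        by_category.setdefault(category, []).append(card)
    # Every pair of cards sharing a category counts as one co-occurrence
    for cards in by_category.values():
        for a, b in combinations(sorted(cards), 2):
            pair_counts[(a, b)] += 1

# Report each pairing as a percentage of participants, as a similarity matrix would
for (a, b), count in pair_counts.most_common():
    print(f"{a} + {b}: paired by {count / len(sorts):.0%} of participants")
```

Pairs that most participants group together are strong candidates for living under the same category in the navigation.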

 

So, how do people sort the content? We uncovered several interesting patterns in the ways people organized the information:

  • By relevance: related to me vs. not me

  • By timeline: past, present, future

  • By urgency: to-dos vs. no action needed

  • By topics: interviews, job search, networking

  • By stages: job search, interviews, receiving job offers, salary negotiation

So, what’s the best way to sort the content?

I struggled to find the right way to sort the information. Eventually, I realized there was no such thing as a perfect sorting, because every individual is unique: we differ in gender, age, occupation, background, and personal goals. This led me to conclude that we should design for users with different mindsets. In other words, a good design should accommodate different mental models, which we could do by providing multiple pathways to the same destination.

 

Tree Testing

While card sorting is great for understanding how our users think, it doesn’t tell us whether our proposed site structure works. Tree testing tells us how easily people can find information on our site, and where they get lost. It is a fast and iterative way to evaluate whether our categories and labeling make sense to the users.

We started by mapping out the tree representing the current website and the new tree in a table view:

We quickly found ourselves in a difficult situation when each designer proposed a unique structure. How would we know which one was better? Would the better one solve the problem? Would it create new problems?

We decided to resolve the conflict in a civilized way: tree testing. We compared the performance of the trees proposed by each designer, then selected the best one for further refinement and iteration.

For the test, we defined a list of tasks focusing on our most important offerings and potential problem areas:

We built the trees using Treejack, and asked participants to complete the list of tasks by navigating through the tree we set up. Below is a demo showing a participant clicking through the tree to find the target item:

TASK: Your job assistant found some jobs for you. Where would you go to find them?

Logistics:

  • Remote, unmoderated testing

  • Tested 4 trees

  • 10 Participants / Tree

  • 6 Tasks / Tree

  • Average time to complete: 3 mins

For analysis, we looked at the results from different angles:

  1. Success rate

  2. Pietree

  3. Destinations

  4. First click

  5. Paths
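
To make that analysis concrete, here is a minimal sketch of how task-level results could be summarized by success rate, directness, and first click. The record format and field names are assumptions for illustration, not Treejack’s actual export schema.

```python
# Minimal sketch: summarize tree-test results per task. Records are illustrative.
from dataclasses import dataclass

@dataclass
class TaskResult:
    task: str          # the task prompt
    success: bool      # ended on a correct node?
    direct: bool       # got there without backtracking?
    first_click: str   # top-level item clicked first

results = [
    TaskResult("Find matched jobs", True,  True,  "Jobs"),
    TaskResult("Find matched jobs", True,  False, "Dashboard"),
    TaskResult("Find matched jobs", False, False, "Resources"),
]

def summarize(task: str, results: list[TaskResult]) -> None:
    rows = [r for r in results if r.task == task]
    n = len(rows)
    success_rate = sum(r.success for r in rows) / n
    directness = sum(r.direct for r in rows) / n
    print(f"{task}: success {success_rate:.0%}, direct {directness:.0%} (n={n})")
    # The first-click distribution hints at where the tree pulls people off course
    for item in sorted({r.first_click for r in rows}):
        share = sum(r.first_click == item for r in rows) / n
        print(f"  first click on {item!r}: {share:.0%}")

summarize("Find matched jobs", results)
```

Comparing these numbers across the four candidate trees is what let us pick a winner with data rather than opinion.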

 
 

Step 3: Create the Wireframe

Arriving at an intuitive site structure was only half the battle. The next step was to transform that structure into an interactive experience for our users.

We created rapid iterations of low-fidelity clickable prototypes in UXPin. Having something tangible allowed us to get a feel for how everything came together, from the content and layout to the transitions between pages.

 
 

Outcome

By following a user-centered design process, we believe we improved the product’s discoverability and findability, making it easier for our users to navigate.

Here’s what we did to improve the wayfinding experience:

  1. Reduced cognitive load by cutting down the amount of text and visual distractions on the screen. We also reduced the number of CTA buttons on a page, making it easier for the users to focus on the most critical tasks.

  2. Introduced a more sustainable navigation UI by using dropdowns in the navigation menu to allow for second-layer information access. This solved the issue of the limited horizontal space, especially for users with smaller devices.

  3. Adopted a more intuitive labeling. We moved away from marketing jargon to simple language that people speak. For example, we replaced “Your brand” with “My resume”.

  4. Surfaced our key offerings that were previously hidden from the UI, such as resume writing and career coaching. We moved these critical services up to the first level in the navigation so they would always be visible.

  5. Enhanced accessibility by offering multiple pathways to a destination. It turns out that everyone benefits from a more flexible and inclusive navigation design.

 

Before

 

After

 
 

Measuring Success

How did we know if the redesign was successful? 

Just as we completed the redesign, priorities shifted, so we didn’t get the chance to measure its impact. If given the opportunity, I would evaluate the following metrics:

  • Usability

    • Task completion rate

    • Time on task

  • Service Requested

    • Number of coaching sessions requested

    • Number of resume writing services requested

  • Cost Savings

    • Time reduced on tech support by coaches

  • User Satisfaction

    • System Usability Scale

    • User satisfaction rating and qualitative feedback
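
For the System Usability Scale in particular, scoring follows a fixed formula, so it would be easy to automate. Below is a minimal sketch of how I would compute a participant’s SUS score from the standard 10-item questionnaire; the sample answers are made up.

```python
# Minimal sketch: standard SUS scoring. Odd-numbered statements are positively
# worded, even-numbered ones negatively worded; answers are on a 1-5 scale.
def sus_score(responses: list[int]) -> float:
    """Return the 0-100 SUS score for one participant's 10 answers."""
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    total = 0
    for i, answer in enumerate(responses, start=1):
        # Odd items contribute (answer - 1); even items contribute (5 - answer)
        total += (answer - 1) if i % 2 == 1 else (5 - answer)
    return total * 2.5

# Hypothetical example: one participant's answers
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # 85.0
```

A score above roughly 68 is generally considered above average, which would give us a clear benchmark to track over time.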

 