
Rachel in Industry

Portfolio Overview

Professional Publications

Various white papers I've written and published for WillowTree in service of their loyalty program offerings.

WillowTree

Mixed-methods research project developing a user-centered, data-driven app strategy for a globally recognized fitness brand.

LiveRamp

Mixed-methods benchmarking assessment of a SaaS platform for highly technical users.

Ashley Furniture Industries

Multivariate A/B test on the user experience of an e-commerce website to improve engagement on product detail pages.

White Papers & Professional Publications


A loyalty program is a way to reward your customers in service of building more meaningful relationships with them. We can better understand how to drive consumers’ loyalty to your brand through the following relationship characteristics:

❤️ Relationship Satisfaction: Do your consumers feel good about the quality of the services being provided to them by your loyalty program? Does it take consumers a lot of effort to redeem rewards? Do consumers feel like they get value from your loyalty program?

💳 Relationship Investment: Have consumers been loyalty program members for a long time? Do they get extra benefits or incentives for being long-time members? Would it be hard for them to start over somewhere else?

👀 Quality of Alternatives: Are there other credit card loyalty programs that would afford them better benefits or rewards than your loyalty program?

Taken together, this model states:
Relationship Satisfaction + Investment - Quality of Alternatives = Consumer Commitment to their Loyalty Program
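The model above can be sketched as a simple scoring function. The function name and the example values on a 1–7 scale are illustrative assumptions, not part of the white paper:

```python
# Illustrative sketch of the commitment model described above.
# The 1-7 rating scale and example values are assumptions for illustration.

def commitment_score(satisfaction: float, investment: float,
                     quality_of_alternatives: float) -> float:
    """Higher satisfaction and investment raise commitment;
    attractive alternatives lower it."""
    return satisfaction + investment - quality_of_alternatives

# Example: a satisfied, long-tenured member with few appealing alternatives.
score = commitment_score(satisfaction=6.0, investment=5.5,
                         quality_of_alternatives=2.0)
print(score)  # 9.5
```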

In this white paper, we explore which elements of a credit card loyalty program’s digital experience impact these different contributing factors of consumer brand loyalty in five regions (AP, EUR, MEA, NAM, & LAC).

You can download our white paper in full for free!


WillowTree

I am currently a Sr. Product Researcher at WillowTree. You can learn more about WillowTree here. I help companies understand their customers, make strategic decisions, and implement best-in-class digital experiences.

Example #1: Mixed-Methods Research Project

Fitness App Strategy & Market Viability


Stakeholders: Large, global, household-name fitness brand (C-Suite, VPs, Directors)

Objective: To explore the viability of and strategy for the development of a direct-to-consumer app. 

Research Activities: 8 stakeholder interviews, in-depth 45-minute semi-structured interviews with 10 consumers (6 who had prior experience with the brand, 4 who regularly engage with the brand), a 750-person survey, a 400-person experimental survey, and in-depth 30-minute semi-structured interviews with 10 instructors, all over the course of 14 weeks.

Team: Strategy, Design, Growth Marketing, Engineering, Product Management

Read Case Study here

Project Outcomes: Through this process, we identified the market viability of the app and a price range at which the app would appeal to consumers and fit within a competitive space. Implementation of product recommendations resulted in 400,000 app downloads in the first four months on the app store.

LiveRamp

Example #2: Mixed-Methods Research Project

Screenshot of LiveRamp's Safe Haven Platform

Safe Haven Benchmarking Assessment

Stakeholders: Product Team
 

Objective: To explore how ease-of-use metrics, time to task completion, and user-noted pain points changed year over year in the Safe Haven product.
 

Research Activities: 10 in-depth 60-minute usability interviews, survey, analytics

Process:

  • I modified a pre-existing benchmarking plan to account for product changes that had occurred within the prior year. I recruited participants who fell into particular data-informed user personas based on their engagement with the tool and their job roles and responsibilities.

  • Participants were asked to walk through a list of tasks that individuals with similar day-to-day responsibilities in the platform should be able to accomplish. During this process, participants were asked to share their thoughts aloud as they navigated each task. Each task was timed to see whether modifications to the product had improved ease of use. Participants then completed the System Usability Scale (SUS) survey after the task-completion activity to see whether overall perceptions of product usability had changed year over year.
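The SUS measure mentioned above has a standard scoring formula: ten 1–5 Likert items, where odd-numbered items are positively worded and even-numbered items negatively worded, scaled to 0–100. A minimal sketch of that scoring, not LiveRamp's actual analysis code:

```python
# Standard System Usability Scale (SUS) scoring for one participant.
# Responses are 1-5 Likert ratings on the 10 SUS items.

def sus_score(responses):
    """Return a 0-100 SUS score from a list of 10 item responses."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd items are positively worded (score r-1);
        # even items are negatively worded (score 5-r).
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Example: 4 on every positive item, 2 on every negative item.
print(sus_score([4, 2] * 5))  # 75.0
```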

Outcomes:

  • I compiled a deliverable summarizing task-score changes year over year and highlighting whether each experience was unchanged, improved, or degraded. I tied in thematic assessments of the insights participants shared during the course of the study to ensure the user voice was present throughout the presentation. My recommendations rerouted the product roadmap for the next year.

Ashley Furniture Industries

Example #3: A/B Testing

Mattress Configurator Multivariate Test

Stakeholders: Merchandising and UX teams

Objective: To identify which button display for product options was most effective for customers on mattress product pages.

Research Activities: Two Multivariate A/B tests

Team: UX Design, Research, Development (Front End & Full Stack)

Screenshot of Ashley Furniture Mattress Page

Process: This test employed a 2 × 2 × 2 multivariate design through which we explored the interplay between mattress-size selection options displayed as text versus an icon, with versus without a radio selector, and with versus without an option selected by default. Every user was exposed to one condition from each variable (e.g., text, radio selector, default selection). The test was run separately on mobile and desktop devices to account for differences in the volume of traffic from each device type, and was limited to mattress product pages.
 

Data Analysis: I looked at both the main effects for each variable as well as the interaction effects between variables to see which variable and which combination of variables were driving effects. I then assessed differences between mobile and desktop users to appropriately make recommendations for each user group.
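As a rough illustration of that analysis, the sketch below computes a main effect and a two-way interaction from a 2 × 2 × 2 table of conversion rates. The rates, factor coding, and function names are invented for illustration and are not the study's data:

```python
# Hedged sketch: main effects and interactions in a 2x2x2 multivariate test.
# Cell key is (text_vs_icon, radio, default); 1 = variant on, 0 = off.
# Add-to-cart rates below are made up for illustration.
rate = {
    (0, 0, 0): 0.040, (0, 0, 1): 0.031,
    (0, 1, 0): 0.052, (0, 1, 1): 0.044,
    (1, 0, 0): 0.043, (1, 0, 1): 0.034,
    (1, 1, 0): 0.055, (1, 1, 1): 0.047,
}

def main_effect(factor: int) -> float:
    """Mean rate with the factor on minus mean rate with it off."""
    on  = [r for cell, r in rate.items() if cell[factor] == 1]
    off = [r for cell, r in rate.items() if cell[factor] == 0]
    return sum(on) / len(on) - sum(off) / len(off)

def interaction(f1: int, f2: int) -> float:
    """Difference of differences: how f1's effect changes when f2 is on."""
    def effect_of_f1(f2_level):
        on  = [r for c, r in rate.items() if c[f1] == 1 and c[f2] == f2_level]
        off = [r for c, r in rate.items() if c[f1] == 0 and c[f2] == f2_level]
        return sum(on) / len(on) - sum(off) / len(off)
    return effect_of_f1(1) - effect_of_f1(0)

# Factor 2 is "default selection": a negative main effect here would mean
# defaulting an option hurts add-to-cart rate.
print(round(main_effect(2), 4))
```

In practice the real analysis would also include significance testing per effect, run separately on the mobile and desktop traffic.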
 

Outcomes & Recommendations:

  • Add-to-cart rates were most strongly impacted by having no default selection on product option buttons, for both mobile and non-mobile users.

  • Radio buttons significantly impacted add-to-cart rates for both mobile and non-mobile users.

  • Finally, icon and text buttons differed in effectiveness between mobile and non-mobile users.

  • From these results, I made separate recommendations for the mobile and desktop experiences, and recommended a follow-up test to assess the impact of default selection on product options across the website (beyond mattress detail pages).

Image of mobile display for mattress size options using radio buttons and text

Mobile Option Selections (above)
Desktop Option Selections (below)

Screenshot of desktop mattress selection options with icons and text

Professional Development

Wanek School of Business
Ashley Leadership Foundations 1 (Dec 2021)

SiteSpect Case Study

My manager, Matt Sparks, and I were invited to discuss the testing program at Ashley after a very successful year in 2020. Here I provide tips for designing more rigorous experiments and conducting appropriate analysis in A/B testing.
 
