
Role: Senior UX Design & Research Manager
Industry: Strategy, User Testing, UX Design, A/B Testing, AI Personalization
Years: 2020 - 2023

Testing & Personalization

Challenge

When I arrived at YETI, there was skepticism and hesitation toward making data-driven decisions. The company had been riding a wave of success as customers took to the outdoors as a safe activity during COVID, so why rock the boat?

Quantitative

The first order of business was cleaning up and adapting the current quantitative approach.

 

Part of the hesitation toward testing stemmed from previous A/B testing that existed but lacked strategy and reporting. After creating a strategy and a testing roadmap, we began our testing efforts. Every test that went live had a data-driven hypothesis. Then the data began flowing in. Taking a statistical approach, we analyzed the results carefully to make the case for success or capture the learnings. We then created a process to tell the story of each test, illustrating the results to the business in a far more approachable way than a plain table of numbers. This approach quickly convinced the business of the benefit of testing a hypothesis rather than spending the time and money to implement it outright.
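To make that statistical approach concrete, here is a minimal sketch of the kind of check behind each call, a two-proportion z-test on conversion. The session and order counts are hypothetical, and this stands in for whatever the testing platform actually computed:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates between variants."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)                 # pooled rate under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))   # standard error of the difference
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))    # two-sided p-value
    return p_b - p_a, z, p_value

# Hypothetical test: control vs. variant sessions and orders.
lift, z, p = two_proportion_z_test(conv_a=410, n_a=10_000, conv_b=468, n_b=10_000)
print(f"lift={lift:+.2%}  z={z:.2f}  p={p:.4f}")  # call the test only if p clears our threshold
```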

Another method we deployed was quick dynamic tests that rapidly allocated traffic to the winning variant based on a key metric. Often a winner was declared within 24 hours to a week. This allowed us to adapt quickly during our most crucial holiday sales season and optimize key metrics like conversion and revenue per session, without testing over a longer period or rolling out an experience that would hurt customers simply because we assumed it would work.
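Dynamic allocation of this kind behaves like a multi-armed bandit: traffic drifts toward whichever variant is winning. A minimal sketch of the idea using Thompson sampling; the variants and rates are hypothetical, and our platform handled the real allocation:

```python
import random

class ThompsonSampler:
    """Beta-Bernoulli bandit: shift traffic toward the converting variant."""
    def __init__(self, variants):
        # One Beta(1, 1) prior per variant, stored as [successes + 1, failures + 1].
        self.stats = {v: [1, 1] for v in variants}

    def choose(self):
        # Sample a plausible conversion rate per variant; serve the best draw.
        draws = {v: random.betavariate(a, b) for v, (a, b) in self.stats.items()}
        return max(draws, key=draws.get)

    def record(self, variant, converted):
        self.stats[variant][0 if converted else 1] += 1

sampler = ThompsonSampler(["control", "variant_b"])
for _ in range(10_000):  # simulated sessions
    v = sampler.choose()
    # Hypothetical true rates: variant_b converts slightly better.
    sampler.record(v, random.random() < (0.045 if v == "variant_b" else 0.040))
print(sampler.stats)     # traffic concentrates on the winner within the run
```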

Example site test: Interactive Accessories, which drove a +8.3% increase in AOV

The business also previously lacked an understanding of the customer: their behaviors and the technology they used to browse our site. After building a comprehensive dashboard in Data Studio on top of our Google Analytics data, we could quickly see detailed device, demographic, consumer behavior, and purchase behavior data that could be sliced in multiple ways (by date, device, site section, site test variant, etc.).
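A sketch of the kind of slicing the dashboard made possible, assuming GA data exported into a flat table; the column names and numbers here are hypothetical:

```python
import pandas as pd

# Hypothetical export of GA session data feeding the dashboard.
sessions = pd.DataFrame({
    "date": ["2021-11-26", "2021-11-26", "2021-11-27", "2021-11-27"],
    "device": ["mobile", "desktop", "mobile", "desktop"],
    "test_variant": ["control", "variant_b", "variant_b", "control"],
    "sessions": [8_200, 3_100, 7_900, 2_800],
    "orders": [340, 170, 395, 150],
    "revenue": [51_000, 32_300, 60_700, 28_400],
})

# Slice by any dimension: here, conversion and revenue per session by device.
by_device = sessions.groupby("device").agg(sessions=("sessions", "sum"),
                                           orders=("orders", "sum"),
                                           revenue=("revenue", "sum"))
by_device["conversion"] = by_device["orders"] / by_device["sessions"]
by_device["rev_per_session"] = by_device["revenue"] / by_device["sessions"]
print(by_device)  # swap "device" for date, test_variant, etc. to re-slice
```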

Lastly, there was no strategy for staying in tune with customer feedback. Using site surveys, we maintained consistent feedback loops with customers around whatever question we needed answered. Example: why are customers exiting the customization process? Trigger a survey when customers exit the customization tool without adding to bag, asking their reason for abandoning, what they were seeking, and so on. These surveys helped us answer the tough questions that other forms of data were not uncovering.

Qualitative

Another source YETI had not yet tapped was qualitative data. I knew this had to change immediately, as quantitative data does not answer every facet of the experience. Within a couple of months, we were using UserTesting for all of our qualitative needs. The moment the first highlight reel of customer feedback flashed in a meeting, everyone understood. Suddenly we were flooded with requests to test the customer journey. Equipped with this new tool, we tested everything we had the capacity to: we supplemented our quantitative data, answered why our customers behaved the way they did, benchmarked our site's usability, and usability-tested every new feature and experience. It became a regular part of our toolkit.

Personalization

Personalization seemed like a daunting topic for a company that had never attempted it. Where do you start? Which audiences do you segment? How do you approach it?

Start small. Prove it works and gain business buy-in.

 

When we launched the Alpine Yellow seasonal color, we knew that seasonal colors were purchased more frequently by returning customers, while new customers preferred to simply buy drinkware in whatever color they desired. Armed with this knowledge, we designed a test to prove the point with data. On launch day, we displayed two main banners on the homepage: one showcasing the new color in two categories (coolers and drinkware), and one showcasing drinkware in a variety of colors. Our hypothesis, based on past purchasing behavior, was validated. The Alpine Yellow banner performed best with returning customers (+14% conversion, +37% revenue), while the variety of drinkware colors performed best with new customers (+81% conversion, +95% revenue). The strong performance of even this simple new-versus-returning split demonstrated the value of relevancy and allowed us to petition for more.
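A sketch of how a segmented read-out like this is computed; the session, order, and revenue counts below are hypothetical stand-ins chosen to reproduce the lifts quoted above:

```python
import pandas as pd

# Hypothetical per-segment results from the launch-day banner test.
results = pd.DataFrame({
    "segment": ["returning", "returning", "new", "new"],
    "banner": ["alpine_yellow", "drinkware_colors"] * 2,
    "sessions": [5_000, 5_000, 5_000, 5_000],
    "orders": [228, 200, 100, 181],
    "revenue": [34_300, 25_000, 10_000, 19_500],
})
results["conversion"] = results["orders"] / results["sessions"]
results["rev_per_session"] = results["revenue"] / results["sessions"]

# Within each segment, compare the winning banner against the other.
for seg, grp in results.groupby("segment"):
    g = grp.set_index("banner")
    win = g["conversion"].idxmax()
    lose = g.index[g.index != win][0]
    conv_lift = g.loc[win, "conversion"] / g.loc[lose, "conversion"] - 1
    rev_lift = g.loc[win, "rev_per_session"] / g.loc[lose, "rev_per_session"] - 1
    print(f"{seg}: {win} wins, conversion {conv_lift:+.0%}, revenue/session {rev_lift:+.0%}")
```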

Soon, strategies built on our advanced analytics (product propensity, LTV, pursuit propensity, product ownership) and on other tools in our personalization platform (previously viewed, weather-based targeting) flooded our roadmap.

Agile Methodology

Challenge

Like most designers, ours resisted the idea of being constrained by the processes of agile methodology. Knowing the benefits, and that we could optimize faster in this kind of environment, I set out to change our mindset.

Design Sprints

The easiest way to ease hesitation toward the process is to host a design sprint. The collaboration and excitement of completing a project, with full customer feedback, in one week motivates teams and is the best introduction to agile methodology. The moment we started our first design sprint, I could read nervousness and doubt on their faces. By keeping to the process, making it fun with whatever emojis they wanted across the digital whiteboard, and closing each activity with a positive tone and a break, I started to see a new face on the team: engagement. The design sprint concluded with a watch party of customers interacting with the prototype we had created. Sticky notes full of ideas flooded the scorecard on our digital whiteboard from every participant. Engagement was at an all-time high. Everyone was active and absorbing. After the takeaways were filled in, I congratulated everyone on their first YETI design sprint and surveyed the crowd. Every one of them had glowing reviews of the process and was eager for the next. They were so enamored with the design sprint that we made it a monthly exercise and a powerful tool in our new agile toolkit.

Remote Collaboration

Our first design sprint fell in the middle of the hairiest part of the pandemic, which meant we were all still remote. This posed a threat to the adoption of agile methodology: not only did we have to figure out how to conduct a design sprint remotely (all of my previous sprints had been in person), but a failure risked the team never seeing the value in the process. Conducting thorough research prior to kickoff, I found multiple tips to help.

 

First and foremost, we had to find a way to simulate traditional in-person whiteboarding. I had used Miro in the past, but never for anything as extensive as a week-long event. Luckily, I had far underestimated the power of the tool; it was exactly what we needed and the perfect host for a fully remote design sprint.

 

Second, we needed to stay engaged in a very distracting environment: our homes. Most of the team was also unaccustomed to meetings that demanded a full day of engagement and attention. We enacted rules to increase engagement:

  1. Always have your camera on. Great for collaboration: since 90% of communication is body language, this allowed me to monitor emotions (including exhaustion, so I could offer a break) and let team members see each other's expressions.

  2. Take frequent breaks between exercises to let the brain relax.

  3. Communicate your thoughts clearly and often through voice, not chat.

These simple adjustments made remote collaboration smooth and effective. So effective, in fact, that Miro became our ongoing way to work: notes, brainstorming, and collaboration in meetings, and continued collaboration outside of them to maintain the agile environment.

Dev Partnership

The development team, of course, was already using agile methodology to complete their work, which presented another challenge: how do we fit fluidly into their already successful process? Partnering with our Dev Manager, we found a process that worked. We updated the workflow to include design request and design feedback steps, weekly meetings to review design questions and feedback, and backlog prioritization that involved the team in weighing impact. While the process is not perfect, these improvements began making the UX team more agile and more integrated with development.

Site Redesign

Challenge

YETI's site launched in 2018 with little data to jumpstart its eComm business, leaving much of the design up to assumptions, with few UX considerations. In addition, the site was not built to handle the company's category growth, leaving much to be desired as it expanded its catalog.

Quantitative & Qualitative Data

Luckily, we had an advantage: in 2019 the newly hired Director of eCommerce had implemented data tracking through Google Analytics and Quantum Metric, so we could see some historical data about how customers were behaving on the site.

Using Google Analytics, we were able to track how customers were progressing through the customer funnel, which pages were driving the most conversion and revenue, which products were the most purchased, and basic demographics including device type, size, and OS. 

Using Quantum Metric, we began to understand the friction points behind our GA data. For example, why were customers bouncing from PLP to PDP and back to PLP at above-benchmark rates? By watching session replays and detecting behaviors at a more granular level, we could analyze the behavior a little more clearly.

That's when qualitative made its debut. While Quantum Metric got us a little closer to the heart of behavior, we needed customer sentiment and thought processes presented more clearly. Using usability testing, we took customers through the full customer journey and explored the reasons behind their behaviors. We uncovered far more once words and expressions of feeling were added as a layer of visibility.

 

Returning to the PLP-PDP-PLP example, we discovered that customers were often confused by our size naming conventions. They would click on our Tundra 45, our second-largest cooler, read the minimal educational information we provided at the time, be confused about size, and, with no way to see other sizes, go back to the Coolers PLP. Upon returning, they would attempt to select another size. Sometimes they would forget whether they had clicked into the Tundra 35, 45, or 65 and end up at the same 45 in confusion. Other times they would click into another size, explore its benefits (again with limited educational detail), be confused again, and hop back to the Coolers PLP once more. With no good way to compare the coolers or change size within the product family, customers were bouncing between category page and product page, often ending in frustration and confusion. These experiences and emotions were not present in our quantitative data and surfaced concrete friction points for us to solve as we entered the design phase.

We then brought in external industry experts: Baymard. With a full site audit, we uncovered additional insights and solutions and validated what we had already been witnessing in our data.

Armed with all the data and insights, we turned to our competitors and to the brands we aspired to emulate.

Competitor Analysis

To jumpstart the project with inspiration, we sought examples of eCommerce sites that were executing well, drawing from multiple competitors and brands to incorporate what we believed would work well for us.


We gathered inspiration from brands that blend storytelling with selling, like Patagonia, Nike, and Marmot. We viewed competitors such as Camelbak, Hydroflask, and Corkcicle to see how they educated customers on the benefits of their products over the next brand's. We also drew from more eCommerce-driven sites like Amazon and Target. All of these informed how we would build the best site for us.

Agile Design & Testing

To accomplish this efficiently, we adopted an agile approach. Breaking the site into parts and elements, we ran design sprints focused around a two-week timeline.

 

First was to approach all the global elements of the site: the foundation.

Addressing this first allowed the development team to take these designs and begin skinning the site with global CSS. It also laid the foundation for the remaining parts of the site, so we were not revisiting H1s, H2s, links, etc. while we created the next parts.

 

This included the backbone of the site: the navigation. We knew from our qualitative research that customers found our navigation hard to use. We had lumped everything outside of coolers, drinkware, and bags into a generic “gear” category, damaging awareness of our innovation in outdoor chairs, dogs, and cargo storage. Most users thought “gear” meant peripherals such as apparel and small add-ons for their cooler. Our quantitative data echoed this: most customers clicked on drinkware and coolers, as expected, since awareness of YETI products in those two categories was high. Another interesting discovery was that “dogs” was a regular search term on our site, consistently ranking in the top three. Customers who were aware of our innovation in the dogs category could not find it in the navigation and resorted to search. The same was true of “kids.” We knew our taxonomy had to change.

To bring more attention to the custom category, and drawing inspiration from our competitors, we moved to a three-link approach: Shop, Custom, Stories. This let the customer set their mindset as they clicked into each. Once in Shop, we surfaced the categories individually to draw attention to our innovation and create awareness around lesser-known categories such as dogs and cargo.

 

A usability test revealed both success and room for improvement. While customers understood the new navigation and had no issues navigating, we discovered frustration and confusion around our original name for “Stories”: “Dispatch.” Dispatch is the name of our quarterly magazine/catalog that tells rich stories of our ambassadors and provides deep product education. While Dispatch was a perfect name for the publication in hand, customers did not understand it as a top-level navigation link. When asked what they expected to find, answers varied: some thought it would dispatch customer service, the way you would dispatch emergency services on a mountain; others thought it meant they could “dispatch their order status”; and one angry customer summed up the sentiment: “This is what I'm talking about, you expect me to understand your company language. You are acting like we're all working behind the counter instead of treating me like a customer.” His frustration with the language added an extra five minutes to his user video. Only one customer guessed correctly, and only after talking herself in a circle for a couple of minutes before concluding it was most likely our “blog” of stories. So much insight came from usability testing that we were able to iterate on the design before handing it off to the development team.

 

Another change to the navigation was the ability to shop multiple categories at once. The old navigation made the user select individual categories (coolers, drinkware, bags, etc.) with no exposure to the others. As a brand, we pride ourselves on helping customers get better at their outdoor passion. By offering a way to shop by passion (fishing, camping, commuting, travel, snow, beach, etc.), we could expose multiple categories at once. A camper might gravitate toward purchasing a cooler, but did they know we also sell a water cooler, a chair, and a blanket perfect for all weather? This let us surface products across categories suited to individual needs and create relevancy.

 

We also knew from our data that our customers have a strong affinity for color and for matching their products across categories in the same color. Another way to create relevancy and awareness of other categories was shop-by-color: letting a customer choose black, pink, navy, etc. to shop their favorite color across all the products we make. It seemed like a no-brainer, but at the time it had not even been considered.

 

This sprint also kicked off the foundation of our global design system, setting the tone, brand, and styling for the rest of the site redesign. Partnering carefully with Brand, we achieved a look that was less masculine (our customer split was skewing more female) and fresh for the 2020s.

Second sprint included the main selling sections of the site: The categories (PLP) and product pages (PDP).

The PLP and PDP were crucial sections of the purchase path that had been neglected since the launch of the original site, with many opportunities to improve how customers discovered products and learned reasons to buy them.

For the PLP, we had a rare opportunity: we had never had filters. That one change could help customers find what they were looking for exponentially faster and with less effort. We had also never considered product education a key component of the categories; we simply put up a grid of products and expected customers to click through to the PDP to learn about each. Mixed with storytelling, we knew we could help the customer find what they were looking for by adding the ability to funnel into subcategories (e.g., hard versus soft coolers, or types of drinkware like mugs, jugs, and tumblers), by being more educational about product benefits (such as a carousel of cup-holder-friendly products, a question our customer service team was regularly asked), and by offering better product details and badging to differentiate products for easier comparison.

Concerned about the length we were adding above the product grid, we ran a usability test to see whether the extra information impeded customers from choosing a product. The study showed no cause for concern: if customers wanted to get to product quickly, they clicked the filter button in the top banner, but most found it engaging to peruse the content and discover product information as they explored the category.

 

For the PDP, we had much more work to do. Our data showed multiple frustrations with our product pages: confusion about naming, difficulty comparing similar products, lack of education on why to purchase, unclear shipping and return options, and more. There was a lot we could do to improve the PDP.

 

First up was the persistent PLP-PDP-PLP frustration we had seen in our customer journey mapping and qualitative data. One cause was the naming convention: customers were confused and often forgot which product they had just viewed. Another was load time: after five seconds for the PLP to load, selecting a second product, and five more seconds for the new PDP to load, the customer had forgotten the little information the previous PDP provided. We knew internally that the Tundra 35, Tundra 45, and Tundra 65 were comparable in every aspect except size; our customers did not.

To allow better comparison within a product family, we added the ability to change size, in addition to color, on the product page. This simple change cut out the seconds of page loading customers had previously endured to compare. While it was met with raised technical eyebrows, we knew it was necessary to resolve this friction. It became the top priority: load the product education for the selected size, add the respective selected SKU to cart, and load all new content quickly to ease page-load concerns.

Alongside the size selector, we added a comparison of what each selected size was best used for, so a customer could quickly compare as they clicked through. For coolers, for example, size was expressed in how many cans it could hold: on brand, and a relevant comparison for customers. For drinkware, it could be as simple as the 10oz holding a cup of coffee, the 20oz holding two, and the 30oz holding all the coffee you could want for a day. Each gave the customer a quick comparison in a few simple words. And if the customer missed the size selector, further down the page we offered a more detailed comparison carousel with the ability to switch to the size relevant to them.

 

Prior to this design sprint, we had tested adding value props to the product page: free shipping, free returns, and the product's warranty. These were things customers had frequently voiced uncertainty about in previous usability tests. We wanted them clear and visible to give reassurance and peace of mind at the point of adding to bag. A simple row of icons under the add-to-bag button contributed an additional $750k in revenue; that small piece of education let customers find important information without spending energy hunting it down. It was an obvious permanent addition to the redesigned PDP.


Accessories were another component of the shopping experience we had done little to surface for the customer. Showcasing the ability to complete your product with accessories that make it better helped raise AOV and build awareness in a digital space.

Third sprint tackled the core of the design system: Modular Components.

The idea of a design system was a foreign concept at YETI. Previously, text had been embedded in images so teams could get whatever design they wanted on any homepage or landing page. After laying out the drawbacks of that approach (poor accessibility, pixelated text, larger files and slower loads, more design production work for global translations), it was clear we needed a modular library of storytelling-focused and selling components for building homepages and landing pages. We designed multiple components to accommodate this: banners, category carousels, videos, customer and ambassador testimonials, shoppable recommendations, and many more. This created the foundation of our library and, in turn, educated the business on the capabilities and flexibility it offered.

Fourth sprint focused on the final key components of the purchase path: Cart and Checkout.

While we historically had little friction in the checkout process, as evidenced by our conversion rate, session replays, and qualitative studies, there were still improvements to be made.

 

One opportunity was to clearly display shipping options and delivery estimates. We had rough delivery time frames but no solid dates to show the customer. Working with our partner Narvar, we designed an experience where a customer could enter their ZIP in cart and get delivery estimates by line item. These estimates followed the customer throughout checkout: when selecting shipping methods, customers could clearly see the estimated date, and the order summary line items displayed the dates as well. This was key, especially for our customized product, which took longer than standard non-custom product.

 

Our checkout flow at the time included account creation within the shipping step. While this proved effective, with about two-thirds of orders including an account, it impeded the customer mid-flow. Not only is it intimidating to create an account while selecting a shipping method, it also appeared mandatory when it was optional, as our user testing made evident. To remove this extra step and ease the process, we moved account creation to the order confirmation page, standard for most of our competitors.

 

We also had not surfaced express payment options. While we offered PayPal and Apple Pay, we displayed them only at the end of checkout in the payment step. To speed up the process and let the payment provider fill in the forms, we added express payment methods to the mini cart, cart, and checkout.

 

The mini cart also had room for improvement. On the old site it was a small drawer that played tag and refused to lose: it disappeared as quickly as it appeared when a product was added to cart. It also provided no space for upsells. After an A/B test proved upsells were beneficial in the mini cart, we created a full-height drawer that let the customer review the products in their cart, see optional add-ons, and access checkout options.

 

Checkout also felt daunting, with no clear beginning or end. To alleviate this, we created sectioned steps with clear numbering for each.

 

Lastly, on mobile, the order summary of products being purchased was buried below all the forms. This confused customers; not knowing where to view their products, they often resorted to returning to cart. We pinned the product summary to the top of the checkout pages so customers could review and verify their order without abandoning checkout to hunt for the information.

Last sprint included customer retention and post-purchase: Account and Emails.

These two items had been ignored since the original site launched in 2018, so there were plenty of improvements to be made. Account had been home to order history and nothing more.

 

To welcome the customer back, we included a personal greeting and a progress status for completing their account. We also wanted to lead with the latest order and a clear status; we liked to call it the pizza tracker. One of the consistent top reasons for contacting customer service was checking on order status, so we clearly were not doing our job on the site of answering that question. The new tracker displayed the order's step in the process (received, preparing, shipped, delivered) with a consistently updated delivery estimate. Lastly, we brought the preferences survey from our email footer onto the site in a more engaging, quiz-like interface, letting customers update us on their pursuit, color preference, and more, even down to their pet's name, so we could best customize their experience.
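A minimal sketch of the pizza-tracker idea: an ordered list of stages plus the order's latest event resolves into what the customer sees. The function and field names here are hypothetical, not our actual implementation:

```python
from datetime import date

STAGES = ["received", "preparing", "shipped", "delivered"]

def tracker_state(latest_stage: str, delivery_estimate: date) -> dict:
    """Resolve an order's latest stage into the tracker steps shown in Account."""
    current = STAGES.index(latest_stage)
    return {
        "steps": [{"stage": s, "complete": i <= current} for i, s in enumerate(STAGES)],
        "estimated_delivery": delivery_estimate.isoformat(),  # kept consistently updated
    }

print(tracker_state("shipped", date(2022, 6, 14)))
```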

We also wanted to give customers peace of mind and a view of what was in their garage: a place to see their previously purchased online and in-store products. It gave them an up-to-date timeline of their warranty status, reminded them to register their product (or showed them that they had), and let them easily write a review or reorder. Our usability test showed this was the most exciting feature of the improvements; customers felt reassured, were reminded of their warranty, and could do everything in a more gamified way.

 

For emails, we needed to reflect these order status changes and improve the automated flow. We added a delay email to the series to let customers know if their order slipped past the previously expected delivery date, an occurrence that happened with customization. We also updated the customization cancellation email to include the reason behind the cancellation (copyright infringement, profanity, too detailed a design, etc.) so the customer could resolve the issue without contacting customer service to discover the reasoning. And we included the pizza tracker in each email so the customer could clearly see the current estimated delivery date and status.

Once all the sprints had concluded, we began refining and finalizing the design system and style guide.

Once the design sprints were over and handed off to development, we began producing a full style guide and design system library ready for use. This let designers, developers, and content builders reference the foundation and modular components at any time. Via a web-based link, we created a readable deck of all available global elements, styles, and design components that anyone could reference. For designers, we created a library of templated components they could drag into any of their files. It was the first organized design structure of its kind at YETI.

A/B Testing

We had produced many new designs that the business and development considered risky. To validate success or failure before committing to the full site redesign, and since development would take some time, we tested some of the changes early.

 

  • Category carousels were one of the first items met with skepticism. Adding subcategories to our already small catalog seemed tedious and unnecessary to the business, and likely to create clutter when browsing.

    • To test this, we chose the top-level categories of coolers, drinkware, and bags and ran a traditional split A/B test. Even with a small product assortment, the carousels proved successful and echoed our hypothesis: subcategories like mugs and jugs not only let the customer funnel to relevant product more quickly, but also created awareness that YETI offered these types of products.

    • We also tested this approach on the homepage. The results showed the category carousel was the top-clicked element of the homepage, a performance that continues today. Echoing our qualitative data, the carousel helped customers quickly discover the breadth of the YETI catalog they might not otherwise have known.

  • Accessories exposed on the PDP were new and unknown. Previously they had been hidden behind a drawer that was hard to find and little used once opened. The business was concerned that exposing them on the PDP might distract users from the product they were considering. We decided to test. We found that customers were more likely to convert, and significantly increased their AOV, with the products visible. Education on how to make your cooler, drinkware, or bag better helped convince the customer to purchase.

Development

As each design sprint completed, development began working on its output. With that came close partnership: answering questions, compromising on technical functionality where needed, and ensuring the design was implemented as intended.

 

As a pilot, we launched the new site in Canada in January 2022 to monitor the customer experience and work out any bugs. There were multiple considerations: French translation, a differing product catalog, and cultural differences. Partnering over the course of four months, we developed, adjusted, and tested the new site in staging to prepare it for launch. On January 6th, we all met virtually to launch the new YETI.ca. The beauty of launching before the larger YETI.com was the chance to monitor successes, friction points, and opportunities. Through usability testing and session-recording monitoring, we tracked where the customer experience could be improved. The three short months before the US launch let us optimize, A/B test, and design solutions for customer frustration.

 

In April 2022, we launched YETI.com, this time in a room all together starting at 4am. Between CA and US, it was the smoothest launch I have witnessed in my career: the development team was at the top of their game, the merchandising team successfully managed any content fires, and we caught and fixed every customer-facing error.

Qualitative & Quantitative Testing

After the site went live, we conducted a benchmark usability study against our old site. This let us see the effects of the improvements, along with friction points we had not considered. The findings sparked new A/B tests and future iterations. Pleasantly, our SUS score increased from 86 to 92, but our work to improve was far from done.
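For context on those numbers, SUS (System Usability Scale) scores come from a standard formula over ten 1-5 survey items. A minimal sketch with one hypothetical participant's responses:

```python
def sus_score(responses):
    """Standard SUS scoring: odd items contribute (r - 1), even items (5 - r); sum * 2.5."""
    assert len(responses) == 10
    total = sum((r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based, so even i = odd-numbered item
                for i, r in enumerate(responses))
    return total * 2.5

# Ten hypothetical responses (1 = strongly disagree ... 5 = strongly agree).
print(sus_score([5, 1, 5, 2, 4, 1, 5, 1, 4, 2]))  # a site's score is the mean across participants
```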

 

Fueled by our benchmark findings, session replays, and interaction data, we formed new hypotheses that refreshed our A/B testing roadmap. Some examples of the tests we ran:

Navigation Mobile Quick Links

When we launched the site, we removed the small buttons under the main navigation that exposed top-level categories for quick navigation. While this provided a cleaner visual, we debated whether removing the links served the customer. Testing validated that they were needed: restoring them significantly decreased bounce and increased PLP views. Based on these results, we added them back for customer ease.


PLP Mobile Two Column

When we designed the PLP, we considered a mobile experience with two products across, but abandoned the effort for fear that removing the swatches would hurt color discovery and that the images would be too small to show product detail. After launch, however, we saw increased bounce on the PLP. We hypothesized that viewing more products at once would help customers discover what they were looking for faster and with significantly less scrolling. Setting aside our original bias, we tested a two-column execution, substituting an “X more colors” label (as other brands do) for the swatches, with a smaller product image. The results were astounding: not only did we increase PDP views and add-to-carts, the direct metrics for this kind of change, we also greatly improved AOV and conversion. It was immediately implemented for the full audience without hesitation.

Interactive Accessories

As a team, we made a field trip to one of our stores. While there, we observed customer behavior and asked the sales associates questions. One thing immediately stood out, confirmed by the associates: customers very rarely left with just one product. They often paired a cooler with one or two accessories, or drinkware with an alternative lid; they rarely left without both hands full. This differed from our site experience. While we exposed accessories to customers in the accessory view, we provided no education, reason to buy, or visual of how an accessory paired with the product, things a store associate could easily demonstrate and describe in person. To recreate that experience as closely as we could on our site, Interactive Accessories was born. Thanks to a strong developer who brought the vision to life, customers could see compatible items on the product in real time at their convenience. Within a week of testing, multiple cross-functional teams across the organization discovered the experience and were begging to put it sitewide. Customer results echoed the enthusiasm, with an 8.3% increase in AOV for products that included the experience. Rolling it out to all products takes significant time, producing the assets, developing the configuration, and building each one, but the results made its success clear, and the sitewide rollout is in progress.

Checkout More Prominent Order Summary for New Users

When we launched YETI.com, we saw a spike in checkout initiations without payment. Given the nature of the launch, the styling of cart and checkout had not been completed in order to hit the deadline; everything sat close together on a white background, making the order summary very difficult to find, as our benchmark study made evident. We aimed to change that with an A/B test, with the added benefit of quantifying the impact to justify completing the design changes. We discovered that the increased visibility of the order summary was most beneficial to new customers, greatly decreasing returns to cart from checkout and boosting conversion. With these results we justified completing the development and, until it was done, personalized the experience for new users through our testing and personalization platform.

Checkout Shipping Expanded

We continued to uncover new trends as we gathered intel on the site's performance. One data point suggested a year-over-year dip in advancements to the shipping fields in checkout. While some customers do use the shipping form as their source for cost and delivery estimates, as our usability testing showed, we knew this warranted a test. One hypothesis was that being met with an email field first, before the shipping section, was a blocker: customers are skeptical and cautious about entering personal data, and that barrier could keep them from proceeding to verify shipping. We created a test exposing the shipping form without requiring an email to access it.

Never stop improving

We had successfully launched a full site redesign and continued to test and optimize, but by no means is our job done. We have learned a lot, and there is still plenty of opportunity to improve the customer journey and go beyond it.

Customization

Challenge

Customization was the big differentiator of YETI.com: wholesalers and stores could not offer the same product.

Quantitative & Qualitative Data

Being such a large differentiator, we knew that we had to approach it with a large arsenal of data before we even considered executing.

Using Google Analytics, we could see how customers flowed in and out of the customizer, where they initiated it (PLP vs. PDP), and how many converted. We could see that 70-75% of our customers were on mobile, meaning we had no excuse not to optimize mobile-first. We also discovered that many customers were initiating but dropping off before adding to cart, yet when they did add to cart, the conversion rate was almost double the rest of the site's. This data instantly told us this was a priority.
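A sketch of the funnel math behind that read, with hypothetical counts standing in for the real GA numbers:

```python
# Hypothetical customizer funnel counts pulled from analytics.
funnel = {"initiated": 40_000, "added_to_cart": 6_000, "purchased": 2_400}

init_to_cart = funnel["added_to_cart"] / funnel["initiated"]
cart_to_buy = funnel["purchased"] / funnel["added_to_cart"]
print(f"initiate -> add to cart: {init_to_cart:.0%}")  # the large drop-off = the priority
print(f"add to cart -> purchase: {cart_to_buy:.0%}")   # roughly double the site-wide rate in our data
```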

 

Using Quantum Metric, we viewed interactions and hotspots. How were customers exploring? How many categories did they interact with before deciding? We were able to see many of these behaviors.

 

Using onsite surveys, we came to understand the reasons behind exits. We knew many customers were abandoning before adding to cart, but we did not know why. We triggered a survey when a customer exited without adding to cart, asking why they exited, what they were looking for if a choice limitation was the issue, and, for those who answered with confusion, what was confusing. This showed us that many customers were using it simply as a fun tool to browse a product. What we did not expect were the answers about confusion: why we would print a black design on a black cup (while editing, we did not display the stainless-steel effect on the Duracoat, which caused confusion), why our back button exited the customizer when they thought it navigated the category selection within designs, or why they could not change cup color in the same size without exiting. In addition, some customers simply wanted different artwork; simple designs such as hearts and more animals were top requests. From the survey, we could answer far more clearly the frustrations of customers who exited without committing to a customized product.

Using qualitative usability studies and customer interviews, we observed how customers experienced the current customization tool. Decision fatigue in choosing a design was frequent, the additional pricing was not clear, and there was a lot of confusion around the UI elements meant to guide them through the experience, echoing our onsite survey results.

Competitor Analysis

We had done a lot of introspective work. Now we needed to study how others in the customization space were creating personalized experiences.

 

We studied competitors that also offered customization. Camelbak had a sleek, intuitive experience with vastly more features than ours. In addition to making its large library of options easier to understand, it allowed moving and resizing designs and pairing designs with text on one side. Tervis let the customer customize the full 360-degree surface, displaying the product in a movable 3D view so the customer could review the design as they wished.

 

Other companies outside our competitors also offered customization. At Kendra Scott, I had experience offering the ability to design your own jewelry with varying stones and metals, and I knew customers like to use such tools as a canvas and get creative. We also studied Nike, which by far had the best mobile experience across the many steps of designing shoes: clear direction with tooltips, zooming in on the element being customized, and a clear, present menu of options for making selections. We studied less sophisticated experiences too. Etsy was a great example of personalization across multiple shop owners: even though it was simple, with form fields for most customization and no preview of the final design, it gave clear direction with words and secondary images showing example customized products on its product pages.

 

We learned a great deal about what worked and what did not from every brand we studied.

Design Sprints

To achieve the result we desired, we knew this would be a large project that needed an agile approach to succeed.

 

To kick off the customization tool redesign, we used a design sprint, the first ever at YETI, so we could prototype a design and gather customer feedback all within a week. Our standard timeline for a design like this was around 4-6 months, and I wanted to show we could achieve something in a much shorter time frame, then keep iterating to take it further. The result: a functioning prototype that needed only a few iterations from customer feedback. When the prototype, and the design sprint as a method, were presented to the broader team, the idea spread like wildfire. I was asked to present the concept to SLT, and they were amazed at the result. Being more agile and answering a long-term goal with a prototype backed by real customer feedback was a new concept, and it was well received. Not only did other teams request demonstrations of the design sprint concept and how they could use it, SLT asked when the design we had created in a week could be implemented. The quick prototype, though it still needed refinement, was such a step forward, and so validated by customer feedback, that SLT wanted it immediately. Knowing it was not done, we continued to iterate on the design, always with customer feedback.

 

In addition to the customization tool itself, we used a design sprint to improve the customization customer journey. Spending the week finding pain points in the discoverability of the customization option, we created multiple ways to educate the customer on this big YETI.com differentiator. We learned quite a bit from one week:

 

  1. Education on laser-marked quality. One key learning was that customers believed our customization was merely a sticker that would quickly deteriorate. Displaying a video of the laser-marking process, paired with the language “laser-marked,” immediately lit a bulb in the customer's mind: this was a quality marking that would not fade, worth the investment in YETI drinkware over other brands. Simple value props calling out benefits such as dishwasher safe, 5-year warranty, and exclusive to YETI.com added peace of mind.

  2. Inspiration was needed. Our customization offering was broad and vast. While we saw this as room for customer creativity, customers were often overwhelmed by the quantity of options. Narrowing the options to relevant inspiration (best sellers in their area, clear categories, and other customers' creations showcasing the capabilities) inspired them. A couple of customers saw a handwritten note in one of the social posts we included, immediately equated it with writing a note to a loved one, and changed their tune from no desire to customize to excitement.

  3. The option to customize was difficult to find. When we transitioned to the new site, we changed the design of the customize button and incorporated it within the swatches. What we did not realize was that, embedded among the color swatches, customers equated it not with the laser-marking customization we offered but with customizing the color. When asked to find the button, customers struggled, and two gave up completely. This meant we had a problem to solve: how do we educate the customer about the premium quality and benefit of customization while keeping it easy to discover? A good candidate for A/B testing.
