Skills: Correspondence with customers, writing interview and survey questions, conducting interviews, analyzing feedback, prepping presentations, Usabilla feedback platform

Our team decided to gather insights from customers (current, potential, and lost) about the company’s current digital experience, then use those insights to spark and support ideas for new website designs and features.

The company is looking to increase overall customer conversion and would like to gather user feedback to ground ideas for business decisions and website design.

Users and Audience
The audiences we focused on for research were current B2C customers who shopped in-store rather than online, lost military customers, and lost student customers. These audiences had already been offered an e-commerce experience on the website, so we wanted to determine either why they went in-store or why we lost them as customers. Answers could range from website-related reasons to business-related ones.

My Role
My role was to implement our customer research process through phone interviews as well as gathering feedback through various methods on the website. See “Process” for more.

While we had a large user base to draw from for our phone interviews, we received limited responses, so we had only 11 interviews to analyze. From a qualitative point of view, the feedback was extremely valuable. From a quantitative point of view, however, the sample was too small to accurately represent the majority of customers’ thoughts. We also did not have a streamlined process for presenting our feedback to higher-level stakeholders and executives (we did, however, have a decent process for presenting to our customer service representatives so they could follow up with customers as needed).


Phone Interviews: I was tasked with conducting customer interviews by phone. This included writing and editing interview questions and scheduling, via email, 11 qualitative phone interviews across the three customer types (see “Users and Audience”). My team member and I divided the questions into broad groups, such as “customer service experience” and “website experience.” The questions themselves were generally open-ended, and the interviews were casual.

I analyzed my interview transcripts for key takeaways and created a report that outlined and summarized them, supported by direct quotes. One colleague presented these results to the marketing team, and another presented them to our customer service team.

Website Feedback: I was also in charge of gathering feedback through our website via a feedback button, exit surveys, and slide-out surveys on specific pages. I wrote the survey questions, set up the surveys through a platform called Usabilla, and worked with our Usabilla representative to design the look and feel of the surveys. I regularly reviewed this feedback and forwarded it to the customer service team. I also analyzed it and created monthly reports, which my colleague then presented to the marketing team and other stakeholders.
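As a rough illustration of the kind of monthly roll-up this involved (the entries and categories below are hypothetical, and Usabilla’s actual export format differs), tagged feedback can be tallied like so:

```python
from collections import Counter

# Hypothetical feedback entries, each tagged with a category
# during manual review. Real data would come from a Usabilla export.
feedback = [
    {"page": "/checkout", "category": "pricing"},
    {"page": "/account", "category": "e-commerce"},
    {"page": "/checkout", "category": "e-commerce"},
    {"page": "/support", "category": "customer service"},
]

# Count how often each category appears, for the monthly report.
by_category = Counter(entry["category"] for entry in feedback)

for category, count in by_category.most_common():
    print(f"{category}: {count}")
```

Even a simple tally like this makes it easier to spot which themes recur month over month before drilling into individual comments.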

Feedback Button Survey

Screenshot of feedback button survey on website.

Across both methods, three conclusions stood out: we should implement more e-commerce functionality, such as letting customers update their credit card information online; both current and lost customers reported good experiences with our customer service; and pricing was well received in most markets, though students felt it was too high.

Company X Feedback Button Analysis

Lessons Learned
One of my colleagues inputs all feedback into Salesforce for reporting purposes, but as it stands, our feedback is too qualitative for accurate, automated reporting. Moving forward, we are looking to add more quantifiable elements to our interviews and surveys (for example, rating-scale questions) to enable easier and more accurate reporting through Salesforce. I would also like to establish a process for presenting business-oriented feedback to higher-level stakeholders so they can make informed business decisions.