This project was for Fidelity and took place in January 2017. It was a large-scale qualitative study with myriad complications.
Background
Fidelity is revisiting its service-model design for a premier product line, specifically revising which customer-service touchpoints are primarily digitally enabled vs. advisor-led. Product stakeholders had created a proposed solution that needed to be tested.
Since the solution had been developed by the business over several months without rigorous user input, I successfully advocated for making the study's primary goal understanding user needs around touchpoints, with evaluation of the proposed solution bumped to secondary status.
Process
Issue #1: Selecting a method
Due to the high-profile nature of the product line and potential implications of the findings, it was tempting to structure this as a quantitative study with statistically significant data. But the study goals would require deeper insights into user motivations and emotions than could be captured in a quant study.
We settled on a qualitative study of unusual size. Twenty hour-long interview sessions would run in two labs concurrently for two days. This meant two moderators, two note-takers, and double all the usual logistical prep work.
Issue #2: Recruit
At the outset, the team knew we wanted to hear from three user groups. I had run studies with all of these groups previously, so I created a screener using standard questions I had developed earlier in the project.
But after the recruit launched, a key stakeholder pointed out that—based on very recent developments at the corporate level—we needed to change the approach to include a new persona. I scrambled, cashed in some relationship equity with the recruiter, and worked with the creators of the new persona to create a set of questions that would (we hoped) serve as a proxy for identifying our target participants.
Issue #3: Study flow
The business team had been planning to create a customer journey map in video format to communicate the new service model to internal stakeholders, and video seemed like a quick and engaging way to introduce the service to participants in the study too. Fortunately, video production had not yet begun, and the business team welcomed my feedback on the script and animations so it would be optimized for the study.
I didn’t want to begin the interview sessions with the proposed-solution video, because that would create a framing bias. Instead, I needed to understand user needs in a way that would be unbiased, generate conversation, and provide structured data for efficient analysis (I did not want to watch 20 hours of video footage to compile the findings!). After exploring a few options, I settled on an electronic card sort with a follow-up interview designed to understand why participants made their selections. Additionally, I created a note-taking template in Excel that mirrored the interview topics, which would help structure the post-study analysis.
The final study flow was:
- Session introduction and warm-up
- Card sort and follow-up interview
- Video viewing and follow-up interview
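The card sort in the flow above produces structured data by design: each participant assigns each card (a touchpoint) to a category, so selections can be tallied into a frequency matrix before the interview notes are even reviewed. As an illustrative sketch only, with hypothetical card and category names that are not the actual study content, the aggregation might look like:

```python
from collections import Counter, defaultdict

# Hypothetical card-sort results: participant -> {card: chosen category}.
# Card and category names here are invented for illustration.
results = {
    "P01": {"Account opening": "digital", "Rebalancing": "advisor-led"},
    "P02": {"Account opening": "digital", "Rebalancing": "digital"},
    "P03": {"Account opening": "advisor-led", "Rebalancing": "advisor-led"},
}

# Build a card -> Counter(category) frequency matrix.
matrix = defaultdict(Counter)
for selections in results.values():
    for card, category in selections.items():
        matrix[card][category] += 1

for card, counts in sorted(matrix.items()):
    print(card, dict(counts))
```

A matrix like this shows at a glance which touchpoints participants lean digital vs. advisor-led on, leaving the follow-up interviews to explain the *why* behind each placement.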
Issue #4: Video storage and sharing
At least half of the participants in this study would be remote, and we would use WebEx to screenshare during the sessions. This posed unique logistical and compliance issues.
On the logistical side, I was concerned that WebEx would be unable to render video properly for participants with slower connections.
The workaround would be to load the video onto an external Fidelity server and share the link with remote participants via email prior to the study (with a note not to watch it before the session). But this posed a compliance issue: the video would include confidential information about a competitive product, and sharing a public link via email could compromise that confidentiality.
In the end, the video was uploaded to the external server, but the moderators did not send the link out prior to the study. Instead, we planned to share the video via WebEx as Plan A. If that didn’t work, Plan B was to email the link during the session and be sure to remove the video from the server as soon as the sessions ended. This would limit our compliance vulnerability while ensuring that we could run the study as planned.
Results
At the time of writing, analysis for this study had not yet begun. The execution of the lab sessions, however, went very smoothly. The recruiter pulled off a miracle on short notice, we captured deep insights on user needs in a way that will enable efficient data analysis, and we were able to share the video via WebEx in all remote sessions.