The What & Why of Continuous Discovery

Product Direction
UXDX USA 2021

Most product teams are starting to adopt discovery best practices (e.g., interviewing customers, usability testing, experimenting). However, many of us are still stuck in a project world. We do research to kick off a project, we usability test right before we hand off to engineers, and our primary means of experimenting is A/B testing. These methods are better than nothing, but the best product teams are shifting from a project mindset to a continuous mindset.
In this talk, we’ll explore the key differences between project-based discovery and continuous discovery and give your team a clear benchmark to aspire to.

Hi, everyone. Welcome to The What and Why of Continuous Discovery. I'm Teresa Torres. I work as a product discovery coach, and I've been lucky to work with teams all over the world, from early-stage startups as small as two founders up to really large multinational companies with hundreds of thousands of employees. What's great about this is that I've been able to collect best practices and really iterate on techniques that have helped teams adopt a Continuous Discovery cadence. And that's what we're going to talk about today: what is Continuous Discovery, and why does it matter? So, we're going to start at the very beginning. I like to just define Discovery, and the easiest way to do this is in contrast to delivery. All product teams have to discover what to build, and they also have to deliver that product. So, we look at Discovery as all the activities that we're doing to decide what to build. Think about all the decisions that we're making: how are we making those decisions? What data are we collecting? Who are we interacting with? Whereas delivery is the work that we're doing to build and ship scalable, production-quality products. Now, especially in the last 20 years, we've seen a big evolution in product Discovery practices, and most of us are getting to the point where we're starting to adopt modern techniques: we're interviewing customers, we're usability testing our designs, maybe we're A/B testing a feature once we release it. This is great. We're seeing a dramatic improvement in the way that teams make decisions about what to build. But as we see our delivery practices move towards a continuous cadence, where we're continuously deploying code to customers, we're seeing the same shift happening in Discovery, where we move away from a project mindset, in which we do a bunch of Discovery upfront and then rely on that Discovery while we implement the project.
We're seeing teams shift to a more continuous mindset with Discovery as well, where instead of doing Discovery up front, we're doing Discovery continuously, from the initiation of the project or product all the way through as we develop our first iterations, and then as we continue to iterate from there. Now, this shift from a project mindset to a continuous mindset is particularly important because we're seeing that digital products are never done. We ship a mobile product; we iterate on it, we update it. With our websites, we're constantly evolving our products and services. We see this even in SaaS software: it's continuously evolving. And even with software that lives outside of our building, like hospital software or the software in your car, we're getting to the point where it's easier and easier to update that software over time, and so our jobs are never done. With this continuous development of products, we're looking at how to adopt a Continuous Discovery model to make sure that we're always building the right things. And 2020 was a really good lesson in why this is so important. We're probably not going to face global pandemics every year of our lives, hopefully, but we did see pretty big shifts in the market in 2020, where suddenly a lot of people started working from home and different needs rose to the top. Teams that had a continuous cadence of Discovery were able to quickly adapt to these changes in the marketplace. Now, like I said, hopefully we're not going to see global pandemics every year, but our markets can be disrupted all the time. We see new competitors enter the field. We see technology disrupt what's possible. And with every new version of our product that we release, our customers' needs, pain points, and desires evolve as our product evolves. This is really the argument for why a continuous cadence matters so much.
Now, I know from working with teams that a lot of teams think they've already adopted a continuous cadence when really what they're doing is project-based Discovery. So, I want to start with a really clear definition of what I mean by Continuous Discovery. I define Continuous Discovery as, at a minimum, weekly touch points with your customers, by the team building the product, where that team is conducting small research activities, in pursuit of a desired product outcome. Each line of this definition is pretty critical, so we're going to break it down line by line, starting with the first one: why do we need to engage with customers every week? Most of us on product teams are making product decisions every day. Some of them are big strategic decisions, like what goes on our roadmap, what customers we're going to serve, and what opportunities we're going to pursue. Others are smaller, more daily decisions, like where do we expose this feature in the interface, or what do we label this button? We're also making medium-sized decisions: how should the data model work? What should be exposed through the API? These are all decisions that can benefit from customer input. The more we engage with our customers, the more we keep this continuous drip of interaction with them, the more likely we are to get feedback on most of these decisions. In a project world, by contrast, we tend to do a bunch of project research before those big strategic decisions, and then we just do the best we can on those daily little decisions. Whereas if we talk with our customers every week, we can engage with them continuously and get feedback on many more of our decisions. And it's not just more of our decisions; we're also able to get feedback much earlier in the process. A lot of us have adopted what I call a validation mindset, where we do most of the work upfront and only seek feedback when we're all done.
When we've got the prototype ready to go and we've designed everything, we put it in front of customers and validate it: did we get it right? We do need to do that validation research, but the challenge with the validation mindset is that we're often getting feedback too late. At the point that we're getting feedback, our engineers are waiting for it to go into the next sprint. We have very little time to make changes, so we tend to make only the small cosmetic changes, and we don't have time to make the big changes that would dramatically improve the product based on our customers' feedback. So, what I like to see teams do is get feedback on their ideas when they're still half-baked, when they're still pencil drawings or whiteboard drawings. This can feel a little uncomfortable until you get used to it, but teams that do this start to co-create with customers. They start to adopt what I call a co-creation mindset, where we're inviting our customers to create with us. Now, inevitably, when I share this, somebody brings up that Henry Ford quote: "If I had asked customers what they wanted, they would have said a faster horse." So, I want to be really clear: we're not going to ask our customers what we should build. The way we're going to co-create with them is by taking our knowledge, our expertise about what's possible with technology, and combining it with our customers' knowledge about their own world, their own needs, their own pain points. This combined knowledge is what's really going to empower us to create better products. So, our customers do need to play a role, and they need to play it continuously, because that's what's going to ensure that we stay on track and keep building a product that works for our customers over time. That's the first line of the definition. The second is "by the team building the product." So, let's get into what I mean by this.
It starts with this idea of a product trio. This is becoming a more common model; some people call it a triad, some people call it a three-legged stool. It's the idea of the product manager, the design lead, and the tech lead being jointly responsible for those Discovery decisions. Remember, Discovery is just the work that we're doing to decide what to build. Historically, product managers have owned these decisions. They write up requirements and hand them off to their designers; the designers do all the design work and hand it off to engineers. The problem is we see a lot of rework in this handoff model: the designer gets the requirements, runs into some problems, and the requirements have to evolve; the design is done, it gets handed to engineers, they run into challenges, and now we have to redo the design and redo the requirements. With a trio-based model, we're basically saying: let's have these three roles work together from the beginning. Let's have them drive our Discovery decisions and jointly decide what to build. Now, you probably have more people on your team. Depending on your DevOps strategy, you probably have some QA folks on your team. You definitely have more engineers on your team. You might have some business folks. If you're working on a data-heavy product, you might have data analysts or data scientists on your team. If you have the luxury of a user researcher, they might be embedded on your team. Here's the thing: it can be easy to fall into the trap of including everybody in Discovery. The problem with that is the more people involved in every decision, the slower you're going to go. So, when we think about this idea of a trio, it's not a hard and fast rule; on some teams, the trio is a quad. I want you to think about the idea behind a trio, which is to have the cross-functional roles represented for each decision where they're needed.
So, in most Discovery decisions we need at least these three roles represented. But for some of your Discovery decisions, for example if they're about your go-to-market strategy and you're trying to learn the best way to bring this product to market, you might invite your product marketing manager to be part of those conversations and decisions. Or if you're working on your data model, you're probably going to invite your data analyst or your data scientist to be part of those decisions. So, this idea of a trio can flex; just know that the more people you invite, the slower you're going to go. Think about bringing in the right people for the right decisions rather than having everybody in every decision. But we do want to see equal footing, true collaboration between the product manager, the designer, and the tech lead. That's your starting point: make sure that those three people are driving your Discovery decisions. So, we've covered weekly touch points with customers, by the team building the product. Let's talk about the last two lines: conducting small research activities, in pursuit of a desired product outcome. This is where we get into the heart of the definition, and these two lines work really well together. Because when we talk about Continuous Discovery, if your understanding of an interview is an hour-long interview, and you think that when you interview you've got to interview 12 people, you can't do that every week. That's not sustainable. So, we've got to turn our research activities into smaller, bite-sized activities that we can do continuously over time. Similarly, I see a lot of teams forget that our job as product teams is to create customer value in a way that creates business value. Our job is not just to serve the customer, but to serve the customer in a way that creates value for our business.
And that's where this fourth line becomes so critical: we're not doing research for research's sake, we're doing research to help us reach our desired outcome. So, let's talk about how we do this. I like to use a visual I created called an Opportunity Solution Tree. It's designed to help the trio chart out their best path to a desired outcome. I argue that good Discovery starts with a clear, desired outcome. This is different from what a lot of teams do. A lot of product teams are still in a roadmap world, where they start with a list of fixed outputs. That's an output-led way of doing product management, where they're just delivering outputs. What we've seen over the last 10 to 20 years is teams shifting from an output mindset to an outcome mindset. Instead of asking what things we should build, we're asking what impact those things should have. And I want to be clear: your desired outcome should represent a business need. Now, I do distinguish between business outcomes, which tend to be your financial metrics, like increase revenue or grow market share, and product outcomes, which are metrics that measure behavior in the product that, if you drove those metrics, would drive your business. I want to see a product outcome at the top of your tree. So, again, a product outcome is a metric that measures a behavior in your product, but it needs to be tied directly to a business outcome, one of those financial metrics that creates value for the business. This is going to ensure that your team is not just creating customer value, but creating customer value that drives business value. Now, once we have an outcome in place, we need to discover the opportunities that will drive that outcome. Opportunities are customer needs, pain points, and desires.
So, this is where we look at how we create customer value. And again, there are lots of opportunities we could address, but we're only going to consider the ones that have the potential to drive our desired outcome. This is a big deal because it allows us to resolve the tension between business needs and customer needs. We see this all over the place, where a product team is constantly having to decide what's more important, the business need or the customer need. An Opportunity Solution Tree helps you resolve that by looking at the customer needs that will drive those business needs. Once you discover those opportunities, you of course need to discover the solutions that will deliver on those opportunities, and we run experiments to help us figure out which solutions those should be. We're going to break this down even further, starting at the top. When we're setting an outcome, it should be a two-way negotiation between the product leader and the product team. What do I mean by those terms? By product leader, I mean your chief product officer or your VP of product; if you're at a really large company, it might be the head of your business unit. It's the product leader who has the across-the-business view of what the business needs right now. So, the product leader is saying: given our strategic context, given our company goals, this is what we need your product team to deliver to create value for the business. On the product team's side of this negotiation is that product trio, the product manager, the designer, and the engineer, and they're communicating to the leader what they know about that outcome, what impact they think they can have on it, and in what time period. It's really important that this be a two-way negotiation, because the team and the leader need to be aligned on the best way for that team to create value for the business.
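The structure described above (a product outcome at the root, opportunities beneath it, and candidate solutions with their assumptions under each opportunity) can be sketched as a simple data structure. This is purely an illustration of the shape of the tree; the class names and the streaming-style example values are hypothetical, not anything prescribed in the talk.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Solution:
    idea: str
    # Assumptions are what we actually test, via small, fast experiments.
    assumptions: List[str] = field(default_factory=list)

@dataclass
class Opportunity:
    need: str  # a customer need, pain point, or desire
    solutions: List[Solution] = field(default_factory=list)
    # Opportunities can nest: broad needs break down into narrower ones.
    children: List["Opportunity"] = field(default_factory=list)

@dataclass
class OpportunitySolutionTree:
    # A product outcome: a behavior metric tied to a business outcome.
    product_outcome: str
    opportunities: List[Opportunity] = field(default_factory=list)

tree = OpportunitySolutionTree(
    product_outcome="Increase weekly active viewers",
    opportunities=[
        Opportunity(
            need="I can't decide what to watch",
            solutions=[
                Solution("Personalized top-10 list",
                         assumptions=["Viewers trust our recommendations"]),
                Solution("Friends' watch lists",
                         assumptions=["Viewers want social recommendations"]),
            ],
        )
    ],
)
```

Note how the structure itself enforces the discipline in the talk: every solution hangs off an opportunity, and every opportunity hangs off the outcome, so there is no place to park an idea that doesn't trace back to business value.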
Once that outcome is in place, we're going to kick off our continuous cadence. We're going to look at a few core habits that help teams chart their best path to a desired outcome. Reaching an outcome is a really undefined problem; it's wide open. One of the best ways we can add structure to this ill-structured problem is to start to map out the opportunity space. So, again, opportunities are customer needs, pain points, and desires. I like to see teams interview to discover opportunities. There are other ways to discover them: we hear about them from our sales teams and from our support tickets, and if we have the luxury of observing our customers, we can uncover opportunities that way. But one of the easiest ways to have a continuous drip, to be continuously listening for opportunities, is to just interview at least one customer every week. And remember, the goal of that interview is to uncover customer needs, pain points, and desires. This is different from what a lot of teams do. A lot of teams use their interviews to get feedback on their solutions. In the later part of this talk, we'll cover a better way to get feedback on your solutions; think about interviewing as a way to discover opportunities. As we discover those opportunities, I encourage teams to take the time to map out the opportunity space: visualize, using this tree structure, all the opportunities you could consider pursuing to reach your desired outcome. This gives the team a big-picture view of all the different paths they could take, and that's what helps them make more strategic decisions. Now, to be able to take an inventory of all those opportunities, we need to start this critical habit of interviewing every week. And the reason a lot of teams struggle to develop that habit is that they don't know how to recruit continuously.
When we do project-based research, we can just send an email to 500 people, hope 12 respond, book 12 interviews over the next two weeks, and we're good. That's not sustainable; we can't do that week over week. So, the key to making continuous interviewing work is to automate the recruiting process. There are a number of ways to do this, and it's going to look a little different depending on your industry, your product, and your target customer. But I will remind you that I have put this into practice in a lot of organizational contexts, so I promise you there is a way to do it for your team. It's just going to take a little bit of experimentation and iteration. One of the most common ways, which works for the vast majority of products, is to recruit people while they're visiting your product. What does this mean? If you're a consumer company, or an enterprise SaaS company where people are in your product all day, every day, then you can recruit them while they're visiting your product or service. It's as simple as popping up a message, like you're seeing in the visual here, or you could do it in a more subtle way: you could include it in your newsletters, you could include it on their account page, anywhere your customer is going to see the ask while they're using your service. What's happening here is we're just saying, "Hey, we'll give you a $20 Amazon gift card in exchange for 20 minutes of your time." The way to make this effective is to make a small ask in exchange for a large reward. The example you're looking at, Snagajob, is a job board that focuses on retail workers and restaurant workers here in the US. These types of employees don't typically make $20 an hour, so giving them $20 for 20 minutes of their time is a small ask in exchange for a big reward. It's going to take some experimentation to get this offer right.
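Automating the in-product ask usually comes down to a small piece of targeting logic: decide who sees the prompt, and throttle how often it appears. The sketch below is a hypothetical illustration of that idea; the function name, thresholds, and reward copy are all assumptions for the example, not details from the talk.

```python
import random

def should_show_interview_ask(sessions_last_30_days: int,
                              already_interviewed_recently: bool,
                              sample_rate: float = 0.1) -> bool:
    """Show the recruiting prompt only to engaged customers we haven't
    asked lately, and only to a small random sample so it stays unobtrusive."""
    if already_interviewed_recently:
        return False              # don't re-ask the same people every week
    if sessions_last_30_days < 3:
        return False              # target reasonably active users
    return random.random() < sample_rate  # throttle prompt frequency

# Small ask, large reward: the copy matters as much as the targeting.
INTERVIEW_ASK = ("We'd love your feedback! Get a $20 Amazon gift card "
                 "in exchange for 20 minutes of your time.")
```

As the talk notes, the eligibility rules and the offer itself are things to iterate on; the point is that once a rule like this is live, participants flow in week over week without a manual email blast.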
So, you're not going to just launch this and immediately get a continuous flow of participants; you're going to have to experiment a little bit, but this is one of the most effective ways to automate your recruiting process. Let's look at the second way. If you're an enterprise company and your end users or customers don't spend a lot of time in your product, what you're going to want to do is look at your customer-facing teams, your sales teams, your account management teams, your tech support teams, and figure out how they can help you recruit for interviews. This can be as simple as defining some triggers for them: if you talk to a customer who exhibits this behavior, ask them if they'll participate in an interview. The same principle applies here: have your customer-facing teams offer a big reward in exchange for a small ask. Now, for enterprise companies, cash is not usually an appropriate reward; a lot of employees can't take gifts. So, you want to think about what you can offer that creates value. This could be as simple as a discount on your product, a free month of your service, or access to a premium helpline. You could invite them to a webinar that teaches them a new skill. There are a lot of ways to think about this, but you can use your customer-facing teams to help you recruit. Now, I have worked with a number of companies that have teeny-tiny markets. When I say teeny-tiny markets, I mean they sell to the movie studios in the US, and there are six of them, or they work with Canadian medical schools, and there are only a few dozen of them. If that's the case, if you're working with a teeny-tiny market, what you're going to want to do is set up a customer advisory board. Don't think about this as a focus group; think about it as building long-term relationships with your customers, where you're engaging with them week over week and working closely with them to develop your product.
I still want you to meet with them one-on-one and to think about it as an interview, but you're basically setting up a long-term relationship with a small number of customers. Once you have your customers in the room, we've got to look at how to ask the right questions. Remember, we're trying to uncover customer needs, customer pain points, and customer desires. The number one thing I want you to avoid is speculation. We tend to ask our customers direct who, what, why, and how questions, and these direct questions are challenging because all humans are pretty bad at answering them accurately. We're really bad at summarizing our own behavior; we're not quite sure what we do, how often, or when. We don't actually know what our needs or pain points are; it's really hard for us to verbalize them. So, what we want to do instead is collect specific stories. We want to ask our customers about specific instances in which they do things. This could be as simple as: if I work at Netflix, I might ask, "Tell me about the last time you watched Netflix." If I work in an enterprise context, I might ask somebody, "Tell me about the last time you had to do this workflow." The key is to keep them grounded in that specific context so you start to collect reliable feedback. Those stories are what's going to help you identify those opportunities. Now, once you've started to map out the opportunity space, you're going to very quickly want to prioritize one of those as a target opportunity. This is a strategic decision: what customer needs do we want to address? You want to look at which ones are most likely to drive your desired outcome. Once you have a target opportunity, you're going to generate as many solutions as you can. The reason for this is that we want to work with sets of solutions, comparing and contrasting our top solutions against each other.
Now, most product teams are overwhelmed with ideas, but they're not overwhelmed with ideas for the same opportunity. That's the key difference. You're choosing a target opportunity and then working with a set of solutions within that opportunity. Why does this matter? When we work with one idea at a time, we tend to set up what's called a "whether or not" decision, where we ask: is this idea good or not? The problem with this type of question is that it sets us up for confirmation bias. We're going to see all the evidence that says our idea is good, and we're going to miss all the evidence that says it might be flawed. So, instead, we're going to set up a compare-and-contrast decision. We're going to ask which of these ideas looks best for addressing this target opportunity. Remember, we're comparing solutions that address the same target opportunity, and as we experiment, which we'll talk about in a minute, we're looking for a clear front-runner. So, what does that look like? This is a picture of Usain Bolt; I want you to keep this metaphor in mind. Usain Bolt was at one point the world's fastest hundred-meter runner. If you saw him running around a track and I asked you, "Is he fast?", that would be a hard question to answer. Fast relative to what? Is he fast relative to a cheetah? Probably not. Is he fast relative to a Tesla in the first hundred meters? I actually want to see that race; I don't know who would win. But if I ask you whether he's fast relative to other humans, the answer is clear: absolutely, he's definitely fast. And this is what we're looking for: a clear front-runner when we compare and contrast our solutions against each other. So, how are we measuring them? That's our last habit: we're going to break our solutions up into their underlying assumptions. That's what allows us to go from big, project-sized experiments down to continuous, bite-sized experiments.
Testing assumptions is a lot faster than testing whole ideas. So, what do I mean by assumptions? Each of our ideas is based on all kinds of assumptions. We have desirability assumptions: we're making assumptions about why our customers might want it, and whether they're willing to do what we need them to do. We're making assumptions about whether it's good for our business, and about whether it's even possible. We're making usability assumptions: can people find it? Do they understand it? Can they use it? We're making ethical assumptions: is there any potential harm in building this solution? There are a lot of types of assumptions. So, what I recommend teams do is take their top three ideas, break them down into their underlying assumptions, and then rapidly test those assumptions. And when I say rapidly, I mean: take three ideas on Monday, break them down into their underlying assumptions, take the top two assumptions from each idea, and collect data on all six in the same week, so that by Friday you're able to start to see which of these might be your front-runner. This really is a continuous, fast cadence. Now, we went through this really quickly. We started with defining a clear outcome. We talked about interviewing to discover opportunities. We talked about comparing and contrasting solutions by testing assumptions. There's a lot more to get into here, and if you really want to learn more, I have a book out called Continuous Discovery Habits. It's designed to be a product trio's guide to a structured and sustainable approach to Continuous Discovery. It will teach you how to adopt simple habits that you can do week over week, habits that will help you run this entire process in a way that's sustainable alongside your busy delivery schedule. If you want to learn more about the book, go to continuousdiscoveryhabits.com for a nice overview. All right, thank you, everybody. I would love to keep the conversation going. Feel free to learn more about me at producttalk.org or reach out on Twitter @TTorres. Thanks.

More like this?

Tue, Jun 15, 9:30 PM UTC

Breaking Down Complex Problems - Implementing Change
Rory Madden
Founder, UXDX

Wed, Jun 16, 5:55 PM UTC

Continuous Discovery In Practice
Teresa Torres
Internationally Acclaimed Author, Speaker & Coach, ProductTalk

Flavia Neves
Director of Product Management, Spotify

Kevin Newton
Manager, UX Research

Thu, Jun 17, 6:25 PM UTC

Creating a Continuous Learning Culture
Marc Majers
UX Lead, Progressive Insurance

Ryan Leffel
Head of Design, Priceline

Subhasree Chatterjee
Lead Data Analyst, LexisNexis

Jennifer Cardello
VP and Head of UX Research & Insights, Fidelity Investments