Democratizing Research

Continuous Discovery
UXDX USA 2022

The need for organizations to make more user-centered decisions means that who conducts research is less mission-critical than making sure company-wide insights are gathered. Our panelists will discuss what productive, responsible, and effective research democratization looks like today, including how:

  • Cross-functional teams can apply research
  • Researchers can facilitate insights between users, product, and marketing teams to be more effective
  • In democratizing research, whether the time versus the value delivered is worth it to the organization

Rima Campbell
Today's topic is democratization of research, right? What does it really mean? Democratization of research really means giving everyone in the company access to customer data, empowering them, giving them the ability to run their own research and collect data, and building the culture of a data-driven organization. As we all know, the demand for talented user researchers these days is rapidly growing. Finding those talents in the industry and hiring them is becoming a little bit challenging for us, because right now, according to Nielsen Norman Group, we have 1 million UX professionals in the workforce, and by 2050 we're going to need around 100 million more user researchers to actually cope with the demand. That's a good problem to have, right? So with that, democratization of research was born. And we have leaders from different sizes of organizations who are actually leading democratization programs in their organizations, whether centralized or decentralized. So it'll be a great opportunity to hear from our distinguished leaders how they support democratization in their organizations today. Maybe we'll start with Jessa.

Jessa Parette
Yeah. Hi, thank you. It's so great to be here. My name is Jessa. I currently work at Capital One, where I lead research and design for all of Capital One's auto finance, along with our research and design strategy and our design system. For democratization, just to lay the groundwork, you have to define what it means and what it doesn't mean. That is the way you figure out how to scale, and that's the way we've set up the different structures in our team. One size does not fit all; you will have to adjust as your company grows. But it starts with really understanding and deciding the battles that you fight, in terms of what democratization means, what it does not mean, and when it happens, because that will tell you things like budget, headcount, and team size. I'll pass it over to my fellow panelists.

Erin Howard
Sure. I'm Erin, and I lead research and design at Charles River. We're in a different phase of our organizational journey through democratization: right now I actually don't have a user research function. My product designers do all generative research, and we outsource some of the research we do to strategic partners for the very skills-based work like survey design. But we're in the phase of hiring generalists so that we can create potential career paths around research in the future, bring research into our organization through our whole discovery lifecycle, embed it alongside product creation, and adopt a continuous discovery model as the way we work moving forward.

Kendall Avery
I'm Kendall. I lead the Rider and Maps research team at Uber. Our democratization journey has kind of gone from research doing everything — the research team owned all research processes, partly because we had a really large team. Then, given the types of problems and projects we were getting pulled onto, we realized that with a smaller team size we didn't have the ability to staff and support all the different projects and areas that we really needed to. So we're in this mode of actually introducing more democratization, starting with defining it: really understanding what is it that we are democratizing? What is it that we are not? What are the guardrails? What's the governance that we have to incorporate there? And then really making sure that we're setting those expectations with our partners to say: this is what research means, this is the brand of research — and how do we make sure that we maintain that brand quality of research, even if researchers themselves are not the ones leading it?

Rima Campbell
That's terrific. Thank you, ladies. I hear this often from our customers at UserZoom, and I had my own experience back at Citi, where democratization of research was really a no-no — that's back, like, six, seven years ago; only specialized researchers should be doing research, whether generative research or just purely validating design across the design lifecycle. So I guess the question here would be: as you establish democratization as a practice, how do you run that program? Some customers have asked us to run boot camps to teach Research 101 to product designers — lone designers, purely non-researchers, other than what they get in the training program. So maybe talk a little bit about that: as you established that program, how did you do it?

Jessa Parette
Yeah, so I'll speak to my most current experience. Capital One being a larger organization, we have the luxury of being a larger organization, but also some of the challenges: the demand for what people want and need will always surpass however many researchers or designers you will ever have. So in the way we are structured, I've actually split research operations and researchers into two different functions. Research operations is centralized. The reason we do that is that at the size and scale of our company, there are certain legal ramifications — tools, process access, licensing — where you really do need a central area of control in order to allow product teams to self-serve, because not all product teams will have researchers.

Jessa Parette
Now, researchers still roll up to me, but they are embedded in the lines of business. They work with the product owners, they work with the product teams to prioritize. So we have a little bit of a combination of both: a self-service model where product teams can do their own research, and our operations team supports the legal side of it, making sure it's compliant — because being a bank, I promise you, we have more regulations than you could ever think possible. I mean, it's really fun. Then there are researchers in those product lines. So that's how we're set up. And there is training that comes with both of those aspects. Managing it will always be a conversation of demand and budget versus size and scale. And it really comes down to: you're saying you want to do this, it will take X number of people — do I get X number of people? No? Then you don't get what you want. Which is not always an easy conversation to have. But that's how we're set up. We balance it carefully, because we had an average of 76 project demands in our first quarter — a 59% increase in the demand for researchers — with zero additional headcount added. So those are some of the challenges you have to be able to manage at that scale.

Erin Howard
Yeah, I mean, for us, our growth is pretty rapid. We've gone from one designer to 10 in the last 15 months. And so with that we're actually using each other as our network for standards and the creation of processes. We're building the plane while flying it, if you will, in many ways. But we do a lot where we co-work with each other on creating our frameworks for the conversations that we're having, or the outcome and results guidance documents that we're creating, or how we're doing the analysis on unmoderated testing — we use each other as guides and trials there. We're not yet empowering that activity directly in our product teams or engineering teams; we're not at that training point, we're still owning that ourselves. But we are working on how we figure that out as part of our agile model and our continuous discovery model, and how we do that at scale and speed. So I see our lives changing and looking a little bit more structured in the future. We're just not quite there yet — we're all kind of forming as we go, and normalizing processes where we can along the way.

Kendall Avery
Yeah, I think from our standpoint, we're looking at how we add the right process around what researchers should be taking on and what should be democratized. Part of that is that we very recently started looking at what the research engagement model is. On a matrix of business risk — high business risk to low risk — against the level of uncertainty in the designs or questions that we have, we've broken it out into four different buckets. We're kind of calling it a buffet: take it, manage it, enable it, pass it. Based on where the business risk and uncertainty fall, that's where we're able to make the call from a researcher standpoint. Okay, is this something that needs to be owned by researchers in house — we're the ones with the expertise, we have to be the ones executing it? Or is it something where we don't have the capacity to support it in house, but it does need dedicated, qualified research support to own it and get us the insights? Maybe that's where we pull in a vendor or contractor, or we look at other ways we can get this data.

Kendall Avery
The enablement bucket is really where we're starting to experiment with democratizing. So even within this larger guardrail of our engagement model for researchers, then within "enable it": okay, how do we enable it? What are the guardrails within that? So we're starting to build out templates, we're starting to build out playbooks — things that are very specific, and don't allow a ton of wiggle room. If you're going to enable it, you're going to follow these standards and practices that have been agreed upon across the entire research org. So we're trying to centralize some of the standards, and the way that we as researchers work.
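Kendall's engagement model can be sketched as a small routing function. The quadrant-to-bucket mapping below is an illustrative assumption based on her description — it is not Uber's actual rubric:

```python
# Illustrative sketch of a research engagement model: a 2x2 of business
# risk vs. uncertainty that routes a request to one of four buckets.
# Which quadrant gets which bucket is an assumption for illustration.

def engagement_bucket(business_risk: str, uncertainty: str) -> str:
    """Route a research request; both arguments are "high" or "low"."""
    if business_risk == "high" and uncertainty == "high":
        return "take it"    # researchers own and execute it in house
    if business_risk == "high" and uncertainty == "low":
        return "manage it"  # qualified outside support, under researcher oversight
    if business_risk == "low" and uncertainty == "high":
        return "enable it"  # democratize: partners run it with templates and guardrails
    return "pass it"        # low risk, low uncertainty: deprioritize or reuse existing data
```

The "enable it" quadrant is the one the panel focuses on: low-stakes, high-uncertainty questions where templates and playbooks let non-researchers run the work safely.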

Rima Campbell
I love that you brought up standards — right? Like how the research questions are asked. How do you establish those, and how do you share them? And also, I wonder, how do you know they're applied correctly?

Kendall Avery
Yes — great question. The way it ended up actually working was a little serendipitous. One of the designers on our team had experience leading some research at a previous company. She wanted to unblock herself and just be able to lead research, and so she was empowered by her manager to take on what it looks like, from a designer's point of view, to run self-service research. Well, we found out that there was a researcher in another org who was trying to do the exact same thing. So we were actually able to bring the two of them together and say, from a design and research point of view, what does it take for a designer to feel confident within the research? Because I think one of the things that sometimes happens is that when researchers are the ones democratizing and leading the process, we forget how much we know, and so we forget what's important to teach. So bringing in that designer to say, hey, if I'm the one executing, what are the questions I actually have? And then having that researcher there to make sure: hey, you should add … something at the end of it to make sure it's a standardized questionnaire and you're asking the same thing of everybody; this is how you make sure it's not a leading question. So there was some guidance over it. But I think what's really making it successful is that it's not just prescriptive from a researcher's point of view — it's really collaborative, from research and design together.

Rima Campbell
Love it, love it. That's pretty cool. Now, I wonder if Erin or Jessa can add to that: what about the quality of the insights coming out of those types of research, where you decide to empower a designer or non-researcher to actually run the study?

Erin Howard
I mean, I think quality is a difficult thing, right? Because for us, we don't have a baseline to measure against — we're establishing it. So we're starting to learn and pressure-test our own assumptions about quality, and work against our own previous experience: okay, how I did this last time — did it work? Did it not work to give me the answer I needed? Okay, what do I need to change? And we sort of continuously improve from that perspective. But at the end of the day: did it help you move your product forward? Did it help them make a decision? Did it connect to the business need? Did you feel informed about what you went to research? I think some of that is where the quality comes in. And to your point, the level of severity of getting it wrong impacts whether or not you feel like you can measure that quality. When you start to get into those more severe problems, that's where you tighten those guardrails, right? Sometimes we have more than one designer on those more tightly guardrailed calls or sessions, or we engage a preferred service provider for a survey product, because we're not survey writers — we know we can't achieve that ourselves, so we know we have to outsource it and bring a vendor in to support that piece. So there's a little bit of that.

Jessa Parette
Yeah, to add to what you're saying, one of the things that I've found helpful to define with your research team is a two-by-two box model — two squares by two squares, four altogether. Understand: what is quality, defined? What is quality, undefined? And then on the other axis: what quality is allowed, and what quality is not allowed? Then you can start to bucket things. For example, quality needs to be defined for the number of users in a usability test — you can't just talk to three people and then make a product decision that's worth $2 million. So, okay, we're going to define quality in terms of the usability test. Quality can be a little bit undefined with the product team when it comes to things like, say, color template format — I'm just using that as an example. When does quality really matter, and when is it okay to be a little bit loose? Because in democratization, you're going to have to choose what battles you fight, what you're allowed to let go of, what you can give up control on. Like, maybe you want a standardized template for everything — that'd be really, really great.

Jessa Parette
But the bigger battle you're now fighting is the fact that you have teams going out into the field and talking to customers without having had any sort of consent form signed. Oh my goodness, that's a bigger risk. So what can help is to define those buckets. Then you can start to say: okay, here are the things I'm going to let product teams have in this undefined bucket — go nuts, go for it, not a problem. And here are the things we absolutely need a product roadmap, or to be working towards, to really clearly define — and, oh by the way, I'm going to need a full-time headcount to manage that. At some point, I have found that helps as well. Because when you grow out of that "hey, now we're starting to define" stage, you're going to hit a critical point where the ramifications are not just preference: the ramifications will hit your product, the ramifications will hit your platforms and your services, and, heaven forbid, the ramifications will hit the media, because of something that happens.

Jessa Parette
That's when you find out really, really quickly that a small decision that should have been made before was not made. So, just to add to that: it's a helpful, quick back-of-the-napkin exercise you can do with your research team or with your designers — come to an agreement, and then gather agreement with your product team, if you're kind of in a pinch.

Rima Campbell
Lovely — loving the risk aspect of it, and balancing all of that based on expectations in different organizations, right. Now, where do you store all this data? How do you make it accessible? What kind of tools do you use? What kind of reporting do you make accessible — is it snippets, is it highlights? Kendall, tell me a little bit about that.

Kendall Avery
So, building off of the research and design ops conversation from earlier — they're an invaluable resource. If you're able to grow a research ops team, it's been incredibly helpful for us. They help us manage things like report templates that we, for the most part, use for all of our research reports. And one thing that I've learned is that it's not just the quality of the insights — it's the quality of the storytelling, how you're actually sharing those insights, and whether or not it's going to land. So bring in design partners to help create those templates; make sure that you're able to tell a beautiful story that is accurate, data-driven, and something that's going to move the business forward. So one thing we try to do is templatize wherever possible. We also have an in-house research repository that is really useful. And we've been really encouraging the team and finding efficiencies wherever possible. We found that some of the teams were already kind of keeping their own research repository in a spreadsheet, and instead of asking them to pick up how they were working and change to a different model, we just added a Google Apps Script that could pull everything in.

Kendall Avery
Basically: how can we make sure, in the most efficient way possible, that all of the insights are in one centralized location without adding additional workload onto the researchers themselves — finding those efficiencies up front so that they just become part of the process, they just become standard? I think that's part of democratization: the scale piece is really where it comes into play. It's not just scaling the skills of research across your team, but giving people the tools so they can scale efficiently and effectively, and not feel like they have to learn how to create a research report, how to do the research, and how to do all the legal stuff at the same time. The more we can just give it to them up front — so it's a little more plug-and-play, with research oversight and guidance — the more efficient your teams will be.
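The aggregation pattern Kendall describes — teams keep working in their own spreadsheets while a script pulls every row into one central repository — can be sketched roughly as below. Uber used a Google Apps Script over Google Sheets; this Python/CSV version, with assumed column names, just illustrates the idea:

```python
# Sketch: merge per-team insight spreadsheets (here, CSV files) into one
# central repository without changing how each team works. The schema
# ("date", "team", "insight", "tags") is an invented example.
import csv
from pathlib import Path

REQUIRED_COLUMNS = ["date", "team", "insight", "tags"]

def aggregate_repositories(team_csvs: list, out_csv: Path) -> int:
    """Merge per-team insight files into one central CSV.

    Returns the number of insight rows written.
    """
    rows = []
    for path in team_csvs:
        with Path(path).open(newline="") as f:
            for row in csv.DictReader(f):
                # Normalize to the shared schema; teams may have extra columns.
                rows.append({col: row.get(col, "") for col in REQUIRED_COLUMNS})
    with out_csv.open("w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=REQUIRED_COLUMNS)
        writer.writeheader()
        writer.writerows(rows)
    return len(rows)
```

The design choice here mirrors the one in the discussion: the script adapts to the teams' existing files rather than forcing teams onto a new tool, so centralization adds no workload for researchers.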

Rima Campbell
Excellent. Erin, any perspective?

Erin Howard
Only that I think if my teammates are watching this, they're going to go: yes, Kendall, all of that. Because we're on that precipice — we're at that moment of scale. So now is the time that we're beginning to look at that research repository tool, that ability to cross-share. Right now, the sharing of our findings, our discoveries, our direction, and our connection to the business happens through our demos in our agile process. It's part of the demo structure; it's how it gets out to the organization. But we're finding the opportunity now to really help cross-pollinate each other, and that research repository is part of our next step. They'll probably want me to go talk to Dovetail really quickly. But, you know, there are a lot of tools out there, and we just haven't embraced exactly what that looks like for us yet. We know it's on the horizon as part of our next growth, though, and I truly believe it will be an unlock for us in how we partner across our teams, our sprint and scrum teams, and sort of decouple the independent silos that's creating.

Jessa Parette
Yeah. So I've gone through different approaches to managing insights with the different teams I've worked with, and you'll probably find that you'll start something and then stop it, or it doesn't get off the ground, or it starts and then someone else goes and builds something else. That's a really, really good reason why you need research operations. Because guess what: when people don't have something, they might assume it doesn't exist, and they're going to go build their own — and then you have someone doing duplicate work. We've used everything.

Jessa Parette
So, starting with some homegrown things: what is accessible to everyone in your enterprise? That's a really big question. Because you might have an instance where you have a centralized licensing platform where only certain people get a license to a tool, and if you use that to house your insights, suddenly your product owners may not get access to it. That's a problem. So what is central to your organization that you can glue together, or use, that everybody has access to? And then how do you manage it? We've used everything from Airtable to, in smaller teams, Excel spreadsheets, Google Docs, and Google Drive. We even used JIRA at some point to understand what insights were requested. And then of course there are more standardized tools, which are great — Handrail, Dovetail, and some others — systems that house insights for those research tools.

Jessa Parette
The challenge is: how do you manage all of that? The skill set that comes with it is not just a researcher's — it's actually program managers, process managers, specific information-architecture skill types. How do you standardize tagging a Google presentation so that it is actually searchable across the enterprise, so that if you have it in a centralized location, anybody can look for it? One of the things we've done is really try to understand how people search for insights. A product owner might ask: do people really like eating in the cafeteria? Well, you need to understand semantics and how people search for things, because if your documents aren't tagged correctly and managed correctly, no one will ever be able to find them. Right?
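The tagging-and-semantics problem Jessa raises can be illustrated with a toy normalization layer: searches only work if tags are normalized and common synonyms map to the same canonical term. The synonym map and tags here are invented examples:

```python
# Toy sketch of tag normalization for an insight repository: map the
# terms people actually search with onto canonical tags, so documents
# are findable however the question is phrased. All terms are examples.

SYNONYMS = {
    "cafeteria": "dining",
    "canteen": "dining",
    "sign-in": "login",
    "log in": "login",
}

def normalize(term: str) -> str:
    """Lowercase, trim, and fold known synonyms onto a canonical tag."""
    term = term.strip().lower()
    return SYNONYMS.get(term, term)

def search(documents: dict, query: str) -> list:
    """Return names of documents whose tags match the normalized query."""
    q = normalize(query)
    return [name for name, tags in documents.items()
            if q in {normalize(t) for t in tags}]
```

For example, a study tagged "dining" is still found by a product owner searching for "cafeteria" — which is the point Jessa makes about semantics: the tagging discipline, not the storage tool, is what keeps a repository searchable.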

Jessa Parette
So it takes trial, it takes practice, but then it takes the discipline of someone maintaining and sustaining that system — or, I promise you, I absolutely promise you with every fiber of my being, it will become wild, it will become unruly, and then it will become unusable. And then you'll have someone going, "we don't have anything for that insight, we'll make something," and you're like, oh my gosh, it's like Groundhog Day — we're just doing the work all over again. So experiment, yes, but also really clearly understand when that trigger moment is: okay, come hell or high water, we've got to get someone to manage this way of doing insights.

Rima Campbell
I love it. I mean, the rise of research operations as a role is becoming really critical, right? As the team matures — making sure the data is accessible, how you tag all of that — because you're a consumer of insights, right? What about the shelf life of that data? We know from what we've been through in the past couple of years that older data really isn't as valuable anymore, because consumer behavior has changed; everything has shifted to digital. So how do you manage the lifecycle of that data?

Jessa Parette
So I look at it in three ways, and this is a rule of thumb, not a do-it-every-single-time. Insights and data — you need to bucket them and define types, or else you're going to end up with a generalization. So, qualitative insights: where maybe they had a longer shelf life of around 12 months prior to the pandemic, I've been talking to some researchers in the field, my friends, and what we're finding is that those insights now actually have a shelf life of maybe six to eight months. Qualitative insights like empathy studies and user interviews — we are changing faster, and the demographics are changing faster, due to some very turbulent times, to put it in the gentlest, mildest way.

Jessa Parette
So your qualitative probably has a shorter shelf life than before. Quantitative information then falls into two buckets. There's the automated kind, where it's part of your system — you are measuring and automating, and the insight is a by-product of your system, like bounce rates, number of users in a system, or click-through rates. Those tend to have a bit of a longer shelf life: dust them off every 12 months, make sure your systems are still integrated and your formulas are still correct. But then there's moment-in-time quantitative. It becomes outdated the minute you deploy your next iteration.

Jessa Parette
You can't say, "hey, this was the benchmark and this is how much it improved," and then six months down the road pull that study and go, "we improved it this much, and this is what users said." I don't know — you've had two deployments since then; your fundamental landscape has changed. You can't take a moment in time and use it to determine something 12 months later. So that becomes outdated the minute you deploy something new. But I'll defer to the others.
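Jessa's rules of thumb could be sketched as a quick staleness check. The thresholds below are the hedged figures from the discussion — qualitative six to eight months, automated quantitative revisited roughly every 12 months, moment-in-time quantitative expiring at the next deployment — not firm rules:

```python
# Back-of-the-napkin staleness check for stored insights, following the
# three buckets described above. Thresholds are illustrative, not policy.
from datetime import date

def is_stale(insight_type: str, collected: date, today: date,
             deployments_since: int = 0) -> bool:
    """Rough check of whether a stored insight has outlived its shelf life."""
    age_months = (today.year - collected.year) * 12 + (today.month - collected.month)
    if insight_type == "qualitative":
        return age_months > 8          # interviews, empathy studies: ~6-8 months
    if insight_type == "quant_automated":
        return age_months > 12         # bounce/click-through rates: dust off yearly
    if insight_type == "quant_moment":
        return deployments_since > 0   # outdated the minute you deploy something new
    raise ValueError(f"unknown insight type: {insight_type}")
```

A repository could run a check like this at prioritization time, which is exactly when Kendall's team (below) asks whether an old study still holds or a question needs to be re-asked.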

Erin Howard
I mean, I have a tendency to just agree. But I think with some of the things around sentiment and empathy, it depends a little bit on your industry, too. We're B2B versus B2C, and I think our B2B landscape tends to change a little bit slower than B2C. So we probably have a little more length of time with that data, especially if we're using it more directionally and then re-diving in based on that direction, versus treating it as law. But not much longer — maybe a year. So I do think there's a slight tweak depending on your industry.

Jessa Parette
And I think there's also a caveat to add to that: there's a difference between a trend and an insight, and there's a difference between an insight and a fact. Those can get pretty convoluted sometimes. Trends tend to last a little longer than insights, which tend to be more moment-in-time, or maybe have a half-life of understanding of a product. And there's definitely a difference. So I would totally agree that it depends on your industry — and it depends on whether you're calling something a trend or an insight when it actually isn't.

Kendall Avery
We look at what past research we have when we're in the prioritization process. So if a new research question comes up: is this actually, truly a new research question, or do we have research from the past? Sometimes we have studies from, like, 2018, and we look at them: does this still feel right? Does it feel like something is fundamentally different? Because I think when you have a smaller team, you have to be ruthless in your prioritization. So what we've tried to do with our partners is say: if they're requesting, basically, the 2018 study redone for 2022, is there a new question we can add to the mix? Can we add a little bit extra? We rely on things like lit reviews — taking a look at all of the research and saying, yes, let's double-check that these things are either still true or maybe no longer true, but making sure that you're still moving forward with everything. And we always try to identify that net-new question yet to be asked, to make sure that you're constantly bringing additional value to the research.

Rima Campbell
Great points, great points. Let's talk about scalability a little bit, as teams grow. How, and at what moment in time, do you say "this model doesn't work, it's not scalable"? Talk about that a little bit.

Erin Howard
I mean, for me, that moment of scale is relative to the size of your organization — scale for you could be going from 1 to 2 versus 10 to 20. That moment is really, I think, sometimes about prioritization: how much are you able to get done? How effective are you? How are you able to connect with the problems? How well do you understand the business strategy? If that's starting to get stretched, then you're at a scale moment. And that could be a small scale moment, or a big scale moment, or a big adjustment to the way your organization is designed, how you support teams, or whether you are a service offering or an embedded service — some of that starts to come into play. So I don't see scale as needing to be this massive growth moment. It's small triggers: recognizing that something's changing in the way you're working, and that you now need to readjust and recalibrate — maybe ask for more resources, or reprioritize, or move away from a project, or look for a tool. Scale is sort of relative.

Kendall Avery
Couldn't agree more. I think it's really about identifying the reflection moment: looking back and saying, okay, take the last six months — how many projects have we been able to do? What types of projects? What are the things we couldn't support? Why couldn't we support them? Was it the right decision not to support them? One of the reasons democratization is worth investing in for our team is that sometimes there are a lot of smaller questions — low business risk but high uncertainty — that just don't fall into our camp, given all the other things we have to prioritize. But one of the things that leads to is that the designers maybe don't have the confidence they need to feel competent in their work and move it forward. So we have to introduce democratization, because that's a scale gap for us. We aren't in a position where we can support designers and their confidence — making sure their designs are getting tested, or whatever it might be for them — because we have so many other generative and foundational efforts that we're taking on. So recognize not only what types of research you're doing, but take a look at your team and ask: are there areas where my team is not being supported? What are the things that I need to do as a researcher to help support them, and to really embed myself as a research partner — not just a resource who owns the research and then kind of disappears? It's really: how do I make myself a partner to all these different teams, and figure out how I need to scale our practice to support all these individual functions?

Jessa Parette
Yeah, there's the scaling of the practice, and then there's the scaling of the things that help the practice. And it is very tempting for companies not to recognize the latter. Because oftentimes we will ask people to take on more and more and more, and then you sit back and realize 50% of your researchers' time is actually spent creating JIRA stories, or figuring out the tool they need to use, or talking to legal about compliance, and you're like, hang on — that might be a moment where you need to talk about what needs to scale. So indications can come from looking at job expectations versus what people are actually doing. And it's harder than you think; it means having some very interesting conversations with your people, and really looking at workload and burnout. So, for example, going back to defining what you need to democratize and what you want to democratize: are you going to democratize the governance of insights? Is that a decision you're going to make? Can anybody just write an insight, or is that something where you say, no, that needs to fall under highly controlled quality, highly managed? Well, when your company starts to scale, and you start adding more and more products, more and more developers, there's going to be a point where researchers will not have the capacity to maintain that same level of quality. That's an indicator.

Jessa Parette
We decided that the governance of insights was something we were going to democratize, and we're now at a scaling moment where we need to bring in operations to actually manage that full time. So that's an indicator, and it's also an indicator of scale: what do you need to scale? Is it doing the research? Is it understanding the research? Is it managing the tools? Is it managing the platforms? What is it that needs to scale? Because while you will always feel the pressure, all at once, all at the same time, ask: what would relieve the pressure enough that you have a leeway of about six to eight months, or maybe longer, to continue as is? It might be, hey, we actually just need another researcher; we don't need to scale our operations, we just need another person. Or it might be the flip side: the skill sets our researchers have, quantitative, generative, evaluative, no longer fit the type of work we have to do 50% of the time, and we actually have to add a program manager for research.

Jessa Parette
So it's not an either-or, but it does take careful management: understanding the strategy of your team and identifying what I call the triggers. When this happens, it's my trigger that I need to have this discussion about X. Don't just let it happen to you, because otherwise you will be controlled by the tides of the product, versus helping the product teams mature toward being able to self-serve.

Rima Campbell
You actually answered my next question. I was literally going to ask, what are the triggers for research, and what are the biggest challenges? And I think you've covered all that. Thank you so much. This is great. I wanted to turn to our audience and see if there's any question that I may have failed to ask and that you would like to ask one of our panelists. Anyone?

Speaker
Hi there, you made a really interesting comment, actually, about defining insights versus other things that are maybe conflated with insights, like facts or trends or recommendations. How do you define an insight?

Jessa Parette
An insight is an amalgamation of three different data points that come together into a unique understanding of a situation. So it is not a trendline of, oh, we're seeing more and more people buy milk, that means X. Hang on: we're seeing more and more people buy milk, and more and more people are doing X, and there's a decrease of this. The insight is that new thing. So it's two to three different points of data coming together that generate a new thought; that is an insight. An insight is not a trendline. That's how I would define it. But feel free to absolutely disagree with me.

Kendall Avery
I think I've heard this a lot: what's the difference between an insight and a finding? A finding exists; there's proof of existence. Let's say I talk to six people, and five of them roughly say the same thing and one person doesn't. What that one person said is a finding; they said that thing, it exists. But I don't know if I'd call it an insight, because I only heard it from that one person. And so one thing that I'm thoughtful of, and that I'm actually thinking we might incorporate into our democratization of the synthesis piece, is making sure that you're including multiple data points, multiple voices. One thing that I think is super strong is, when I'm crafting my insight, if I have three different quotes from three different people roughly saying the same thing, that's really hard to argue with. What are the chances that three people who've never met each other, living all across the world, are saying roughly the same thing? There's something there. And then, to strengthen it further?

Kendall Avery
Is there, you know, quantitative data that I can bring in? How do I triangulate this to really strengthen it? But that's not to say the finding doesn't hold any weight; it just needs to be further investigated. I think that's one of the risks in democratizing: what's the governance around findings versus insights? How do you make sure that you can maintain that quality brand of research that people can trust, the level of work that's coming out, while also letting people kind of discover those insights themselves, because that's where the lightbulb moment happens for a lot of folks?

Jessa Parette
It is interesting. So I'm the only non-researcher on the stage, I think, because I don't actually have a formal training background in it. But what I would say is that insights are what connect you to the business strategy. They're where you have the opportunity to be influential. A finding, an observation, a theory: that's all great, that's all really valuable. That sparks valuable conversation and valuable ideation. But an insight is evidence that can inform business strategy. And evidence requires different aspects of facts, different pieces of the equation, whether that's what the customer said, what the user said, what the business needs; all of that comes together into one point. And those are the pieces through which I feel research and the activity of research can be super influential.

Jessa Parette
Yeah. And I also think of it this way: talking about an insights library where you're putting your research findings, what's more helpful, having just a list of statistics that you then have to interpret yourself, or having a very thoughtful, well-crafted "here are three different things happening, here's an interpretation of those three things, and here are possible outcomes"? That's an insight; that's different. So here's a thought experiment I'll do. Okay, there's a rise in automated self-driving vehicles. With the rise in self-driving vehicles, consider traditional vehicle servicing organizations, like, you know, Walmart's servicing organization, whose workforce is mostly made up of skilled mechanics.

Jessa Parette
But with the rise of self-driving vehicles, and with the state of the skilled mechanic field, are you going to have an issue? Are you going to have a moment in time where you're going to have to start hiring engineers to staff your Walmart tire and automobile sections? Now, we're missing a third piece, the last data point; I still don't have that research. But that kind of gives you an example of how two different facts coming together give you an insight, an understanding: do we need to drive the business strategy in auto and tire? Does, you know, O'Reilly's auto need to start hiring engineers to be their service people instead of just mechanics? That's the difference between an insight and just a kind of fact or finding.

Rima Campbell
That's excellent. Excellent. Do we have time for one more question? Let's try. Anyone? Go ahead.

Kendall Avery
I think if you have a research operations team, if you have operations people who can help develop that framework, absolutely lean into it. If that's not the case, I do think it falls on the researcher, because the work isn't done just because the report is done. So how do you make sure you're maintaining what I call stickiness? What are the different ways you're making sure your research is sticking with the people it needs to? And if that means creating a repository on your own, I think this goes back to the insights piece: making sure you have a point of view that goes along with the research insights. So not just saying, this is what we found, but that next level of, this is what it means to the business. And then one thing I'm starting to implement with my team is what we're calling point-of-view docs. It's kind of a combination of a lit review, here's all the past research we've done on this topic.

Kendall Avery
So I have it all consolidated in one place, but then: what's my point of view based on all these findings? What do I think the business should do moving forward? From a lot of product folks I've spoken with, that's what they're looking for: okay, we learned this, what's next? A repository can be a place to get that, but I think the more valuable piece is the commentary and narrative that comes with the insights. That typically falls on the researcher, along with creating a good alliance of partners who can also advocate for the impact of research and how it can move things forward.

Rima Campbell
That's an excellent point, how you make that insight actionable and contextual in that moment in time. Terrific. Thank you so very much. I hope everyone enjoyed the conversation and learned something they can go apply. I appreciate it. Great discussion. Thank you all so much.