A Practical Guide To Mixed Methodologies For UX Research
We've all heard it: the best UX research method is the mixed method. By combining qualitative and quantitative data, you can understand your users better. But is there such a thing as too much data?
In this session, Alina will talk through how to turn user insights into tangible actions and plans for your team. She will cover:
- How user insights at Allegro are collated through research, big data and behavioural science, and what happens next;
- How to prioritise your data/insights;
- What challenges you can encounter and how to solve them; and
- What best practices she uses to ensure the team is aligned in understanding these insights.
Hi, everyone. My name is Alina. I am the Head of User Research at Allegro.
In this talk, I'll highlight some ideas about the Mixed Methods Framework. Much has been said about mixing methods, but some points are still to be covered. So, let's go.
I'll start with the idea that research is like cooking: to prepare some juicy insights, we need good-quality ingredients, and what's more, we need to know how to combine them. Our ingredients range from big data, large and complex, to small data, meaning the outcomes of qualitative and quantitative user research. I will share some ideas on how to prepare and serve them.
We deal with the enormity of data that surrounds us today to get the most valuable insights that drive evidence-based product and business decisions. Companies grow and gather even more data. Allegro grew from a startup to one of the largest e-commerce websites in Europe. Can you imagine how many petabytes of data this is? It's a heaven for data analysts and researchers. To give you a better picture: we are the favourite shopping destination for Poles and the place to do business for over 120,000 companies. Each month, 20 million customers visit our platform. This is the equivalent of 80% of all internet users in Poland.
As we grow, we collect more and more diverse data. Our job is to know how to manage it and get knowledge out of it. On the one hand, we have experiments, statistics, algorithms and much more. On the other, user research with interviews, surveys and other methods. My perspective is that of a user researcher. Don't worry, you won't find any pie charts in this presentation. So, the key question is: do we need more data? And the obvious answer is yes, yes, yes. But on one condition: we need to know how to manage it properly. Now, let's turn to the central part of my talk, which is the framework. I will share with you the framework that helped me organise my team's work. It is spelled out in three steps, and each of them comes with a challenge.
So, the steps are:
- Setup: what answer do you need to your research question, and which methods to mix?
- Plan: which scenario to choose, and how to put it all together?
- Manage: how to deal with data diversity, and how to cooperate with other teams?
So, let's go through all three steps and shed some light on how you can approach the challenges.
Step one comes with three main questions to which you must find an answer before going further. Think about some ongoing project in your company and try to answer them. First: where are you in the product development process? On the slide, you can see how we visualise the steps of the process. Before that, of course, there is future thinking, with trend spotting and analysis, but that's not our topic today. Today is about the data that we collect at each step. Answering where you are in the process will help you determine which methods are at your disposal. Let's take, for example, exploration. Not only can you use small data, but you can also dig deeper with your analytical team to find some behavioural patterns. So, mix both small and big data at each step.
The next question is: what knowledge do you need? Depending on the research question, you need to know how deep you want to go with your research. On the surface, you can get a quick answer to what people think or say, so simple qualitative methods are enough; but remember, that's just an opinion. If you want to explore how people behave and why, this is the starting point for mixing different methods.
And finally, after you have made sure where you are in the product development process and how deep you want to go with your research, it's time to pick suitable methods. Standard method charts show you the qualitative and quantitative dimensions, as well as the self-reported versus observational nature of the data. But I want to leave you with a perspective on mixing small and big data: combining the power of big data with the search for deep desires. Once you've chosen the proper methods to answer your research question, it is time to plan it all out. Now we come to the point where you need to choose the right scenario. We have explanatory research, exploratory research, and one more: dynamic research.
So, first, explanatory research: when the data says one thing and users say another. A few weeks ago, one of our product teams decided to resign from one of the filters. It was a filter that helped select offers listed at a given time. Analytical data showed that only a tiny percentage of customers selected this filter when searching for the right offer. So we removed this filter, and then it turned out we were wrong. Customers wrote that Allegro doesn't respect their opinions: give back this filter. This is of course not true, because we do a ton of research. It turned out that customers from two main categories, automotive and collections, needed this information. We have re-introduced this feature in those categories, and we will keep checking what was missing here. When you instead want to verify user insights with quantitative data, that is exploratory research.
Now, an example from the listing page. On Allegro, merchants can list offers and products, and several offers can be linked to one product. We have a form that is supposed to help merchants go through all the listing steps. After a quick concept test with users, we didn't have a clear answer on how to design this form, because the user behaviour was undetermined. So, we moved to a testing phase with several alternative versions and looked at what the data told us. The data told us not to make changes for now, and the customers during the qualitative test only confirmed that.
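As an aside, "looking at what the data tells us" across several alternative versions typically means an online experiment. Below is a minimal, hypothetical sketch of how such a comparison could be checked; the traffic numbers and the pooled two-proportion z-test are my own illustration, not Allegro's actual tooling:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) for the difference between two
    conversion rates, using a pooled two-proportion z-test."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical numbers: variant B converts 2.1% vs 2.0% for variant A.
z, p = two_proportion_z_test(conv_a=400, n_a=20000, conv_b=420, n_b=20000)
if p > 0.05:
    print("No significant difference; keep the current form for now.")
```

With samples of this size, a 0.1-point difference is well within noise, which matches the "don't change anything for now" conclusion from the talk.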
And the most exciting scenario to me, dynamic research, is when small data and big data work together within one research study. Back to the previous problem with listing offers and products, but from the perspective of the customer. Buyers can find multiple offers of the same product on Allegro. We are working on making all the necessary information readily available, but sometimes something is missing and customers need to contact merchants, often via chat.
So, machine learning helps us understand the reasons for contact and categorise those reasons, and thanks to that we get quick feedback from customers.
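The talk doesn't detail the model, so here is only a toy stand-in: a keyword-based categoriser that illustrates the idea of turning chat contacts into countable reasons. The categories, keywords and messages are all invented for illustration, not Allegro's actual taxonomy:

```python
from collections import Counter

# Illustrative only: a toy keyword categoriser standing in for the real
# machine-learning model; categories and keywords are invented examples.
REASON_KEYWORDS = {
    "shipping": ["delivery", "ship", "courier", "tracking"],
    "product_details": ["size", "colour", "material", "compatible"],
    "returns": ["return", "refund", "broken", "damaged"],
}

def categorise(message: str) -> str:
    """Assign a chat message to the first matching reason category."""
    text = message.lower()
    for reason, keywords in REASON_KEYWORDS.items():
        if any(kw in text for kw in keywords):
            return reason
    return "other"

chats = [
    "When will the courier deliver my parcel?",
    "Is this case compatible with my phone model?",
    "The item arrived damaged, I want a refund.",
]
# Aggregate reasons across all chats to see what information is missing.
print(Counter(categorise(m) for m in chats))
```

The aggregated counts then point at which product information is most often missing from the listing page.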
The final step is to manage all these matters. The job to be done here is to make sure that you collect user insights from different sources. Transparency is essential: you need to know who is involved, what kind of data they have, and simply how to get it. Don't forget to use data from outside your company; maybe some reports already exist and you will save yourself time.
There should be a person who is an insight leader, the owner of the research process, making sure all perspectives are taken into account and often confronted. In other words, the research lead connects the dots, and the role of a research manager is to ensure collaboration and connection between teams, between the dots.
Last but not least, you should also have one place where you gather all the details about your research projects. You can use Data Studio or Google Docs, whatever. The point is to keep everything in one place, no matter what source it comes from.
Finally, I'd like to address the problem of prioritising insights. I think we all know how to prioritise insights on the experience level, using, for example, a Severity Scale. Now we should go a step further and be able to assess the overall impact on the business as well as the cost and risk. Of course, we don't do it alone, but in cooperation with business and technology. The point is that prioritisation is more complex, especially when you prioritise strategic insights. I know, it already looks like the evaluation of solutions. It actually is.
In my opinion, we should have this broader view of a given insight at an early stage, according to the Fail Fast principle. For example, you might have an insight that is slightly less important but will significantly affect your company's turnover. You might also encounter equivalent insights where, on further analysis, one of them is less expensive to act on. Visualisation is secondary here; you might as well show it in the form of a matrix or a table.
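To make the idea concrete, here is a hypothetical scoring sketch. The fields, weights and formula are my own illustration of weighing severity and business impact against cost and risk, not an established model from the talk:

```python
from dataclasses import dataclass

# Hypothetical scoring model: all fields and weights are illustrative.
@dataclass
class Insight:
    name: str
    severity: int         # experience-level severity, 1 (low) .. 5 (high)
    business_impact: int  # estimated effect on turnover, 1 .. 5
    cost: int             # estimated cost of acting on it, 1 .. 5
    risk: int             # estimated risk of acting on it, 1 .. 5

def priority(i: Insight) -> float:
    # Value of the insight relative to what acting on it would cost.
    return (i.severity + 2 * i.business_impact) / (i.cost + i.risk)

insights = [
    Insight("minor UI glitch, big turnover effect", 2, 5, 2, 1),
    Insight("severe issue, but costly to fix", 5, 2, 4, 3),
]
for i in sorted(insights, key=priority, reverse=True):
    print(f"{i.name}: {priority(i):.2f}")
```

Note how the less severe insight wins here because of its turnover effect and low cost, which is exactly the kind of trade-off the business-level view is meant to surface.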
Let's turn to the next issue. There is a lot of talk about research democratisation. Looking at it, I have to tell you that the problem often lies in the ineffective distribution of knowledge. As a cognitive psychologist, I spread the idea of cognitive economy; its expression in the user research area is zero-waste research. Before you start doing research, make sure that no one else is already working on a similar topic, and vice versa: if you are doing research, make sure it reaches all interested stakeholders.
So, what do we do to ensure that knowledge is well distributed? I will give you three hints. We have a knowledge base that connects all the sources in the company: a simple website with a really well-working search, so stakeholders can find a report or some research that may be helpful. We have a research newsletter, thanks to which the right people get a dose of essential insights every month. We also run a research demo, a special meeting for all interested stakeholders. And last but not least, we often obtain insights from research that do not relate directly to the study at hand; we then collect them in the so-called Trash Book for stakeholders.
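The core of such a knowledge base is just searchable report metadata. A minimal sketch of the idea, with invented report titles and summaries standing in for a real index:

```python
# Toy knowledge base: report titles mapped to short searchable summaries.
# All titles and summaries here are invented examples.
reports = {
    "Filter usage on category pages": "automotive collections filter search",
    "Merchant listing form concept test": "merchants form listing offers",
    "Chat contact reasons": "chat contact merchants product questions",
}

def search(query: str) -> list[str]:
    """Return titles of reports whose summary contains every query term."""
    terms = query.lower().split()
    return [title for title, summary in reports.items()
            if all(t in summary.lower() for t in terms)]

print(search("listing"))
```

A real implementation would index full report text and rank results, but even this shape lets stakeholders check for existing work before commissioning a new study, which is the zero-waste point.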
Okay. I'll stop here. It's enough to digest for today. Thank you for taking time out to listen to my presentation. I hope the framework will be helpful to you as it is to me.
Let's go out and create opportunities for the next-level mixed-methods approach.