March 13, 2024

Seven Deadly Sins of User Research with Els Aerts

Welcome to nohacks.show, a weekly podcast where smart people talk to you about better online experiences!

In this episode I talked to Els Aerts, user research advocate and co-founder of AGConsult. We explored the deep nuances of user research, the strategic decision of AGConsult to refrain from working with political parties, and the pivotal role of carefully crafted research questions in yielding actionable insights.

Els shared her expert perspective on avoiding common research pitfalls and emphasized the importance of evidence-based decision-making. The conversation also touched on the "seven deadly sins of user research," offering a comprehensive understanding of the challenges and best practices in user testing.

Through engaging anecdotes and a wealth of experience, Els shared valuable lessons on conducting meaningful research that truly enhances user experience.

This episode is produced as part of our partnership with Experimentation Elite, the UK's premier experimentation and conversion rate optimisation event, on December 7th in Birmingham, UK.

https://experimentationelite.com/
https://www.linkedin.com/in/elsaertsuserresearch/
https://www.agconsult.com/en/


---
Tune in for an enlightening conversation and don't forget to rate and review the episode!

nohacks.show
YouTube
LinkedIn

Transcript

[00:00:15] Sani: Welcome to nohacks.show, a weekly podcast in which smart people talk to you about better online experiences. My guest today is the co-founder and managing partner of AGConsult. She's a beloved keynote speaker at your favorite conversion conference, whatever that conference is. And I've heard others refer to her as the queen of user research and the queen of user testing.

So if I tell you that you will learn a lot about user research today, you'd better believe it. Els Aerts, welcome to the show.

[00:00:41] Els: Thank you so much for this very embarrassing intro. I'm sure some people only said it once, in a very unguarded, perhaps half-drunk moment. But I'll take it.

[00:00:54] Sani: It counts. It counts. Yes. And some of the people who said it were on this podcast too, so it counts. It really does, especially here. So, almost 23 years of running your own consultancy business with AGConsult. First of all, wow, that is incredible. In terms of age, that's almost Google for a company in the digital space. That is unreal.

[00:01:17] Els: It's true. We have been around for longer than Google Analytics, which is very funny. I think it says something about our tenacity as people: once we start something, we don't give up, we keep going.

[00:01:35] Sani: That's optimization in a nutshell, basically. I have a very 2024 question about the company. This being the election year in 64 countries worldwide, covering 49 percent of the world's population, which is just mind-blowing, and a disaster waiting to happen if you ask me: your website explicitly states that you do not work with political parties.

So why is that?

[00:02:01] Els: Yeah, well, with political parties, I feel like, one, there are political parties I would just never work for in the first place, because I disagree violently with their take on things. Some parties are just outright racist. Who wants to work with them? I don't know. But there are also parties that might be closer to my real political affiliation.

You never know what might happen, which way they might go. These are people with enormous power over other people's lives, and I don't want to be responsible for making something for these people which, you know, could have that type of impact. Basically, I guess, too long, didn't read: I don't trust politicians, period.

[00:02:52] Sani: Beautiful. Neither do I, not a single one. If you start the relationship with a political party or politician assuming they're lying, you're doing a great thing, if you ask me. Okay. So today, speaking of politicians, we're talking about sins: the seven deadly sins of user research.

This is something you talked about at the conference formerly known as Conversion Hotel, a few months ago, in November, right? First of all, tell me about the conference. I heard amazing things. While it was going on, for a week my LinkedIn was nothing but Conversion Hotel.

And that tells you about my connections, I guess.

[00:03:34] Els: Yeah, I think it might say something about your connections, but it also says something about the conference, because it is simply, I think, one of the oldest conversion conferences around, and one of the best, if not the best. My friend Ton Wesseling just does an amazing job of keeping it fresh every year and bringing in not just the best speakers, because that would be a terrible thing for me to say, since I've just spoken there.

But the real value of Conversion Hotel, or, as we also like to call it, Experimentation Island, because it takes place on a Dutch island, is that it's a great mix of everything. There are keynotes, but the audience is really asked to participate. And because you're locked away on this island for a whole weekend, and it is a weekend, you really have to love what we do to sacrifice your spare time away from your friends and family.

Which means that before karaoke at two-thirty, you're still talking about research. You're still talking about A/B testing. So it is one of the funnest weekends of the year for me.

[00:04:57] Sani: That sounds incredible. And Ton told me, I think, that the last ferry leaves before the conference is over, to make sure that everybody stays overnight. Which is brilliant, absolutely brilliant. Okay. So it may or may not be happening this year; I may or may not have spoken to Ton about this in our episode that is coming out very, very soon.

So let's talk about the seven deadly sins of user research. First of all: why do people commit sins when doing user research? Is it not knowing? Is it going for the easy option? Or, you know, the HiPPO factor, the highest-paid person's opinion? What is it?

[00:05:35] Els: Well, I would say it's mostly ignorance over malice, which is how I try to always look at things to start with. I think there is indeed still a great ignorance about what good user research really looks like, and there's this misconception that it's easy.

Just as a side note, I had a conversation this week with one of our clients, who said: yeah, this survey stuff, it really takes a lot of time to process; can't we get some student in to do it for us? Not really. Not if you want to get the maximum insights out of the data we've collected; it was an extensive survey, with lots of open answers.

You need somebody who knows your business, knows what we're researching, knows how to deal with this amount of data, and knows how to gather the insights and put them together with the other pieces of research that we've done, etc. So it's not as easy as it looks.

[00:06:41] Sani: It never is as easy as it looks. And to the people listening to this who are not really into conversion optimization: everybody will tell you it's 80 percent or more research before you do anything else. Yet everyone who wants to pay for conversion optimization asks: when do we start testing? When do we do the first A/B test?

That's probably not your experience, because you operate at a higher level of conversion optimization, where the clients asking you questions know what it is. But when a client wants to do conversion optimization with a single person, which is sort of where I operate, the first question is: when are we running our first test?

Like: why? What are you going to test? And I guess all those blog posts about button colors in Google Optimize did have a very negative effect on client education. So let's talk about the seven sins one by one. I have my iPad here, I have my notes here. I'm going to read them, and then we'll go two or three minutes on each one, maybe four, depending on the question, the sin.

And let's see what you have to say. So: business needs should drive the research questions. That's the first one. Tell me about that.

[00:07:52] Els: Again, all the sins are based on my experience with the type of questions people come to me with. Sometimes people just come and say: hmm, we want to do user testing. And I'm like: great, I'm a big fan of user testing. What exactly is it that you want to find out? Well, there's this section on our website, and we just want to know whether people find it easy to use this, that, and the other.

And I go: okay, great. How important is this section of your website for your business? Because if it's only visited by maybe 5 percent of people, and it is responsible for maybe 1 percent of your revenue, is this really our priority? Should we really be digging our research heels into this topic, or should we spend our money elsewhere?

And I'm saying this as somebody who works not only with larger companies; we also work with smaller companies, and I try to always make the most of their research budget. Which means: do research that will get you insights that enable you to really move the needle for your business.

Which means let's not waste our research efforts on things in the margins. So yeah, make sure that it really ties back to a business KPI, basically.
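(A back-of-the-envelope version of that prioritization, sketched in Python. The revenue figure and plausible-lift numbers are invented for illustration; only the 5-percent-of-visitors, 1-percent-of-revenue example comes from Els.)

```python
# Rough research-budget sanity check: how much could fixing each
# candidate area plausibly move the needle? All figures are hypothetical.
annual_revenue = 2_000_000  # invented

candidates = [
    # Els's example: a section seen by ~5% of visitors, driving ~1% of revenue
    {"name": "niche site section", "revenue_share": 0.01, "plausible_lift": 0.10},
    # versus a funnel step that most revenue flows through
    {"name": "main funnel step",   "revenue_share": 0.60, "plausible_lift": 0.05},
]

for c in candidates:
    upside = annual_revenue * c["revenue_share"] * c["plausible_lift"]
    print(f"{c['name']}: potential upside ~ {upside:,.0f} per year")
# niche site section: ~2,000/year; main funnel step: ~60,000/year.
# Same research effort, thirty times the ceiling: tie research to a KPI.
```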

[00:09:30] Sani: Yeah. How common is it that the CEO just wants to validate what they already think is right, and does the research just to prove it? Does that ever happen?

[00:09:39] Els: Of course that happens. And when somebody says that to me, like: could you validate this? I say: I could not. I can do the research for you, but there is no guarantee that it will be validated. It's the same as asking: how long will the research take, and then how long will whatever we have to change take?

I don't know. I mean, I know how long the research will take. We'll do the interviews; let's start with six and see how far we get, but I would say at a maximum we need to do 12. So that's X amount of time. What exactly will you have to change? Well, sweetie, I don't know.

That's the whole point of doing the research. If I knew that, and I could give you a time frame or a roadmap, why do you think we would be talking to these people? The research insights dictate what the next steps will be. And if you have a fixed amount of time for that next step, I totally get that.

That's fine. Then we prioritize and see: okay, what's the most important thing to do in the three weeks we still have before we have to launch, or whatever it is.

[00:10:57] Sani: I'm just thinking: synthetic users, and user testing, and everything that you said this will take... No, no, I'm not saying it's a good thing. I'm just saying that someone will think: hey, we can do all that with synthetic AI users in 15 seconds, and believe that's the way to go. That's all.

Let's not go there. Let's just save it for a repeat episode.

[00:11:16] Els: Are we going to keep it nice? Are we going to keep it nice?

[00:11:19] Sani: Today, yes. Today. But if we do a repeat episode, let's talk about that, and let's make that one crazy. But I have a feeling there will be people who think this is the way to do it. That's all I'm going to say.

I'm not going to say they're right.

[00:11:36] Els: Okay, that's good. Then I shall park your remark, because the weekend is upon me, and I will bring all my positivity moving forward and ignore that you said that.

[00:11:53] Sani: There we go. But we agree, and that's the most important thing: we agree on how bad that is. Second sin: choose research methods carefully, based on what specific insights are needed. So attitudes versus behaviors, qualitative versus quantitative; avoid common mistakes like asking about future feelings, seeking data that's available elsewhere, or using the wrong research approach.

So basically: choose your tools right, choose the way you're doing research. But how does a brand know which research method they should be using? Just go to an expert and have the expert tell them?

[00:12:32] Els: Well, if you're not an expert, going to the expert is always a good idea, in my opinion. And hopefully that expert is aware of the variety of tools that we have and when to use which. Personally, I think the grid that Christian Rohrer made for NNGroup is a really good starting point.

I think every user research professional more or less has their own version of it, because there are sometimes nuances to be made in where exactly you place certain methods. But essentially, it's a graph that plots research methods on two axes. What do you want to find out?

Do you want to find out about facts, or do you want to find out about feelings? Facts being behavior: what are people actually doing? Feelings: what are people thinking, how do they relate to your brand? And: do we need quantity, or are we going qualitative, do we need deeper insights?

Do we need deeper insights? And so this, uh, Plots, different research methods, surveys, moderated user testing, tree structure testing, etc. Plots that on the axis. I think that's a great way also to explain to a customer who might have a question, like actually your research question means X. Um, what I find is that actually getting to your research question to start with.

What I find, though, is that getting to your research question in the first place is challenge number one, because people sometimes phrase their research question in such a way that it's actually very misleading, and then you're tempted to choose the wrong method. People come to us like: we want to know whether people prefer design A or design B. You know, our agency has made these two new designs for the homepage.

And we want to know which one people prefer. Actually, no, you don't want to know which one people prefer, and "prefer" is also the wrong word, because it's a page on your website; it has to do something. I have clients whose homepage is extremely important for them; it's actually the first step of the funnel.

So, what do people like best? Not important. Where do people actually initiate that action best? Aha: behavior. And it's also not "which", it's "how many". How many people initiate this behavior on A versus on B? "How many" means it's quantitative. So your idea of doing six interviews with people to ask them which one they like? No, this is something we should be using

an A/B test for, I would say. But you see, sometimes, with somebody who doesn't know research, the way they phrase the question is in itself very confusing. And if you're not an expert, if you can't get to the bottom of what it is we actually want to find out, then you might be tempted to choose the wrong research method.
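(A minimal sketch of that quantitative route in Python: count how many visitors initiate the action on each design, then check whether the difference is bigger than noise with a pooled two-proportion z-test. The visitor and conversion counts are made up.)

```python
from math import sqrt

# Hypothetical A/B test: visitors who initiated the desired action per design.
visitors_a, conversions_a = 10_000, 520
visitors_b, conversions_b = 10_000, 585

p_a = conversions_a / visitors_a
p_b = conversions_b / visitors_b

# Pooled two-proportion z-test: is the observed difference bigger than noise?
p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
z = (p_b - p_a) / se

print(f"A: {p_a:.2%}  B: {p_b:.2%}  z = {z:.2f}")
# |z| > 1.96 roughly corresponds to p < 0.05, two-sided.
```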

[00:16:12] Sani: That's a great explanation. The list also mentions future feelings. By the way, the notes I'm reading are from Kevin Anderson's newsletter write-up of your session at Conversion Hotel, so shout out to Kevin Anderson; he was on the podcast a few weeks ago.

So, future feelings, asking about future feelings. What does that mean? Give me an example.

[00:16:34] Els: Well, thank you, Kevin, for the great write-up. Yes, future feelings. I see this mainly happening in surveys, but also sometimes in interviews and, God forbid, user testing. I call it asking people to predict the future. Like: you have an app, you have a roadmap for it, and you ask: if we added feature XYZ, would you use our website more often?

Hmm, no. Also, I'm personally not the biggest fan of the traditional NPS survey question: would you recommend us to a friend or colleague? It's again predicting future behavior; not happy. And it also doesn't work great for every type of product. I always say, if you're the type of person that recommends cloud software to friends in a bar...

[00:17:35] Els: You know, yeah, you probably don't have a lot of friends.

So, no, I prefer not to ask about future behavior. I think the best predictor of future behavior is past behavior. So let's look at the data that we have, rather than just asking outright about future behavior.

[00:17:59] Sani: That makes a ton of sense. So, sin number three: beware that various biases can skew results, like sampling bias, relying on a single data source, survey response biases, and leading questions or advice-seeking in user research. So basically loaded questions, or any kind of issue that can mess up the results.

Right.

[00:18:20] Els: Yeah, and there are just so many things that can go wrong. I'm not making it sound sexy, am I? But it's true. If you don't get it right, the data is not right. I like to compare it to a faulty setup of your analytics.

If you're measuring every event twice by accident, that's not great. And it's the same thing if you ask loaded questions or leading questions: you will get answers. It is very easy. I could make a survey for you tomorrow and get you all the answers exactly the way you want them, but they will not be the truth.

It is really, really easy to manipulate people's answers. And again, a lot of people don't do it out of malice; they do it because they don't know. For example, if you ask people: on a scale of five, how much do you like our new feature?

In essence, you've just told people they like the new feature. All they have to tell you is how much they like it.

[00:19:42] Sani: Yep, but you know...

[00:19:45] Els: No, go ahead.

[00:19:47] Sani: You know who does that? Leading questions, loaded questions: political parties, when they're doing surveys, when they're doing research on how many people support them or their ideas. This is exactly the opposite of how it should be, isn't it?

[00:20:04] Els: I would also say that those surveys used by political parties aren't actually surveys but secret propaganda tools, meant to influence your way of thinking one way or another. The word "research", as far as I'm concerned, is also bandied about a bit too loosely these days, because suddenly every single question you ask is research.

Let's just relax. But yeah, it is really easy to manipulate people. And again, as I say, some people do it out of malice, and that's very, very bad. And some people do it because they simply don't realize. It takes knowledge to be aware of how to formulate a neutral question.

So instead of asking "how much do you like our new feature?": if you have a positive verb or a positive adjective in your question, always add a negative counterpoint. How much do you like or dislike? Or: how would you rate our new feature? "Rate" is a neutral verb. And this neutrality is just very hard.

It's hard in surveys, and it's also hard when you're doing an interview: to be aware, to not steer the interviewee, to not steer the people answering your surveys in a direction that will lead to insights that simply aren't the truth.
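(To make the rule concrete, a few invented before/after rewrites following Els's advice: pair every positive term with its negative counterpoint, or switch to a neutral verb. Only the first question comes from her example.)

```python
# Hypothetical examples of leading survey questions and neutral rewrites.
LEADING_TO_NEUTRAL = {
    "How much do you like our new feature?":
        "How much do you like or dislike our new feature?",
    "How easy was it to find what you were looking for?":
        "How easy or difficult was it to find what you were looking for?",
    "How satisfied are you with our support?":
        "How would you rate our support?",  # 'rate' is a neutral verb
}

for leading, neutral in LEADING_TO_NEUTRAL.items():
    print(f"Leading: {leading}\nNeutral: {neutral}\n")
```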

[00:21:35] Sani: And it's not always easy, so you need to be extremely careful. So, the next one. I'm just looking at the list, and there's one we'll have a laugh about that's AI-related, but the next one is: do not jump to conclusions from limited user testing data; look for patterns instead.

[00:21:54] Els: Yeah, this harks back to basically sin number one or two: which research method do you choose? If you're doing user testing, that's usually six to twelve people. It can be very, very tempting, if one person says something that you wholeheartedly agree with, to just pick that, put it in the report, and go: see, I told you so. And no.

What one person says is not a pattern. What one person says is an anecdote at best, and can be really misleading at worst. So if you're reporting on user testing, you're not reporting that one person out of five did this. What one person out of five, or one out of six, did is really not very interesting.

What are the larger patterns that you saw? Did you see that three or four out of five people, and it doesn't even matter whether it's three or four, had problems going from step one to step two in your funnel because of unclear copy X? There's your problem to fix.

And the exact number doesn't matter. But one person? No. Never rely on things that one person says, no matter how much you agree with them; there's the bias again. Because sometimes during user testing, people say things that will look great in a report, or make for amazing quotes. If it's just one person, ignore it. If you have no supporting data elsewhere, ignore it.
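(Some rough arithmetic on why one voice out of five is an anecdote: with samples that small, the plausible range for the underlying rate is enormous. A sketch using a 95% Wilson score interval; the counts are illustrative.)

```python
from math import sqrt

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    margin = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - margin, centre + margin

# One participant out of five vs. a pattern of four out of five.
for hits, n in [(1, 5), (4, 5)]:
    lo, hi = wilson_interval(hits, n)
    print(f"{hits}/{n}: plausible underlying rate {lo:.0%} to {hi:.0%}")
# 1/5 spans roughly 4% to 62%: it may be almost nobody. That is an anecdote.
```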

[00:23:43] Sani: Absolutely. One of my favorite rabbit holes on Wikipedia is the "List of cognitive biases" page. Just that page is a fascinating read, but then you follow all the links and read all the crazy things. Oh, you need to leave all of those at the door when you're doing any kind of research, basically.

Otherwise, I mean, it's impossible to completely detach yourself from what your brain thinks, and that's just a fact. But if you're conscious of it and try to do the best you can, I think you're doing a better job than without trying, basically. So, the next one is: pay attention to smaller response patterns in surveys, not just the big, obvious ones.

That's basically this one, right?

[00:24:27] Els: Well, here's the thing: in qualitative and in quantitative research, we're looking for patterns. So in qualitative research, if one person says something during a series of interviews or during user testing, that's not good enough. However, when you do a survey... we've got big clients where we get thousands, tens of thousands of answers.

Which is great. But we also help smaller clients sometimes, where we get maybe a hundred answers on a page. If you see ten people saying something, you might think: well, that's not a lot, ten people out of a hundred. But if they all say the same thing, then even though it's a weak signal, it is a signal.

It is not an anecdote. It is a signal. And so having a look at what they're saying, getting the context for their comments, looking at your app, your website, your funnel, the rest of your data, that is an interesting thing. Because a lot more people come to this page than the hundred answers you get.

So these ten people are actually signaling something that is important for a larger group. And if it's an easy fix, it can actually make a difference. We have seen that in follow-up A/B testing, where we went with a small signal that turned out to have a very big impact.
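(The arithmetic behind "a weak signal is still a signal": ten identical comments out of a hundred answers, extrapolated to a page's real traffic. The traffic number is invented.)

```python
# Hypothetical numbers: 100 survey answers on a page, 10 reporting
# the same problem, on a page with far more visitors than respondents.
answers = 100
same_complaint = 10
monthly_visitors = 50_000  # invented traffic figure

rate = same_complaint / answers
print(f"Complaint rate among respondents: {rate:.0%}")
print(f"Naive extrapolation: ~{rate * monthly_visitors:,.0f} affected visitors/month")
# Even if the true rate is half that (respondents self-select), an easy
# fix here touches thousands of visitors, not ten.
```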

[00:26:28] Sani: Right. And even if it only affects those ten people out of a hundred in the survey, maybe the other ninety just don't mention it, or aren't frustrated enough to tell you about it. So that's a great point. Are we ready for number six? It's: don't over-rely on AI. Are we going to talk about that today or not? If we're doing a follow-up episode in the future, we're not talking about AI today. Let's just put it that way.

[00:27:02] Els: Your call. Will you invite me back?

[00:27:06] Sani: Sin number seven...

[00:27:08] Els: Okay. Okay.

[00:27:10] Sani: Check research sources carefully for credibility before believing reported results. So if you're doing the research yourself, I guess that doesn't really count, right? Or does it?

[00:27:25] Els: No, no, hopefully you can trust yourself. Hopefully you know that you did this research at a moment when you weren't inebriated or otherwise engaged. No, I'm talking about people who go to LinkedIn, you know, this wonderful fountain of knowledge.

I'm kidding. There is great knowledge on LinkedIn, and there's also stuff that you really, really need to look at twice. One of the things that springs to mind was somebody who said: oh, in a survey, 96 or 98 percent of people say they want to work remotely.

When will companies listen? And you're like: sounds dodgy. Any sweeping statement like that, 96 or 98 percent of people say they want to work remotely, sounds a little weird, right? So then you go look at the survey, and you go: oh. You asked people who already work remotely.

Sampling bias. Of course they love working remotely; it's what they do, right?
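(A toy simulation of that sampling bias: in this invented population, well under half of all workers prefer remote work, but surveying only people who already work remotely all but guarantees the headline number.)

```python
import random

random.seed(0)

# Invented population: 95% of remote workers prefer remote work,
# but only 25% of on-site workers do.
population = (
      [{"remote": True,  "prefers_remote": random.random() < 0.95} for _ in range(3_000)]
    + [{"remote": False, "prefers_remote": random.random() < 0.25} for _ in range(7_000)]
)

def pct_prefer(people):
    return sum(p["prefers_remote"] for p in people) / len(people)

remote_only = [p for p in population if p["remote"]]  # the biased sampling frame

print(f"Whole population:         {pct_prefer(population):.0%} prefer remote")
print(f"Remote-only survey frame: {pct_prefer(remote_only):.0%} prefer remote")
```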

[00:28:40] Sani: I think it was an Upwork survey or something like that. Okay.

[00:28:43] Els: Also, I'm not sure whether it was Upwork, but it was a company that basically sold tools to companies that promote remote work. And you're like: really? Don't do that. Have a look at what the source of this information is. Who benefits

from this? It's like dog food companies publishing dog food survey results. I would really try to find that information elsewhere; of course they're going to say their dog food is the best dog food. And that's pretty much what a lot of these surveys are. Which is what I said before:

That is not research; that is marketing. A lot of the time these days it's very smart marketing, too. But you, as an optimizer, as a researcher, have to be able to tell the difference between research that is done for marketing purposes and research that is done to give you actual insights and data you can rely on.

[00:30:06] Sani: Don't trust everything you read online, basically, is what this one says. Okay. And that's not just research; that's just life.

[00:30:15] Els: I was about to say: the fact that we're talking about this makes me feel a little preachy, like everybody should really know this. Yeah.

[00:30:26] Sani: Not everyone does, though. But we survived the seven sins without talking about AI, which in 2024 is a big deal; not everyone gets to do that. And we will do a follow-up about AI, absolutely. And thank you for being such an amazing guest. I look forward to meeting you in person later this year, hopefully at Experimentation Elite in June.

And to everyone listening to this episode: I know you enjoyed it. So please consider rating, liking, reviewing, sharing, whatever you do with podcasts, and I will talk to you next week.

[00:30:58] Els: Thank you so much for having me. Lovely talking to you. AI, we're coming for you next.