On Wednesday, January 25th, 2023 at 5:00pm Central, Meena Kothandaraman joined us for a live Q&A session called “There is More to Research than Asking a Question.”
Session Transcript
[00:00:38] Chicago Camps: How do you find user research as a calling, and what are the types of research that really get you excited?
[00:00:45] Meena: So I’ve always been curious about people. I’m a people nerd. I love human stories. I love hearing about people. I always wanted to go into psychology, and I started out thinking, okay, psychology it is.
[00:00:57] And then I somehow veered toward architecture and ended up in computer science, which was a bit bizarre, sort of an odd path. But the best part is what happened in one of my first co-op terms. It’s amazing that I even had a chance to do co-op; I was in one of the first co-op cohorts at my university.
[00:01:13] One of my mentors there was basically a user experience researcher, and my first project was researching a new design for a payphone in Canada. So I started, and I never looked back, because I just thought, this is so much fun. I love mixing drinks. It’s like being a bartender, but getting paid a lot more.
[00:01:34] So it was just a fantastic joy to be able to meet people, learn from them, and hear their stories, and when people start to tell you things that are emotional to them or that they’re passionate about, I just love listening. So it seemed like the right path, and I’ve never looked back, to be honest.
[00:01:54] Chicago Camps: That’s really great. And I have a follow-up to that: how has payphone design changed in Canada?
[00:02:02] Meena: I did that project very long ago. I’m not even gonna give you a date, because that dates me. But basically, the one thing we figured out was that people were constantly hitting the wrong button because they weren’t getting enough feedback on exactly which button they were touching. I don’t know if you remember the buttons that were sort of brown and angular. When you clicked them, it was almost like a tap; you wouldn’t really get the tactile feel that you’d actually pressed the button, and then people were charged money.
[00:02:32] They didn’t like it. They had to put in another quarter, or whatever it was back then, to make the call again. And it started to get people very irate. Bell Canada, which was the organization we were working for, actually got a lot of negative feedback, with people saying, you’re trying to steal my money.
[00:02:47] That kind of stuff, and of course plenty of people tried to steal money from the payphones too, so I think it goes both ways. But basically, we wanted to make the payphone a little bit more inviting and give a little bit more feedback to people, so that they could actually say, this is the number that I’m dialing, and do I agree that this is the number I want to dial?
[00:03:05] The new buttons actually had a lot of spring to them, so for older people who didn’t have as good fine motor skills, it was much easier. It was that really fun kind of study, actually observing how people use a payphone and gathering all sorts of feedback. So it was a lot of fun to study that and understand exactly how we could make it better.
[00:03:25] And we must have done such an amazing, kick-ass job, because it’s the same payphone that we designed. It hasn’t changed. There are still payphones everywhere in Canada, which is amazing. I know it’s sort of funny, because every time I land at the airport, I go and look for the payphone. Every time I go home I’m like, oh my God, is the payphone still there?
[00:03:44] Yes, it is. I take a picture of myself standing next to it; I have a whole bunch of pictures standing next to the payphone. Everybody has some sort of mobile phone or mobile technology with them now, but the payphone is still used. I’ve seen people use it and thought, wow.
[00:04:02] Especially for international calls, I think. Sometimes it’s just cheaper to do that than to try to get onto the wifi, which can be a real pain, and so on.
[00:04:10] Chicago Camps: I love this next question, and I think it’s relevant because of current events. It’s true that everybody can ask a question. However, it’s important for people to understand that asking questions doesn’t represent all the work that goes into getting to the point of asking the question. There’s a lot of labor that happens just to get to the point of the question. So what’s important for the team and stakeholders to know beyond just asking questions?
[00:04:37] Meena: Absolutely. That is one of my favorite questions, because I think people think, oh, well, I can draw, that means I can design, what’s the big deal? It’s the same sort of thing: I can ask a question, so what’s the big deal about being a researcher? And to be very honest with you, Russ, I think part of this fable that’s out there has been created by researchers ourselves, because we tend to get an ask.
[00:05:03] We don’t really challenge it, we just go with it. We scurry away, we solve for it, we answer a question, and then we come back and, poof, we have answered the question. And unfortunately, there has not been enough transparency in our field, in researchers actually revealing the amount of effort it takes to do the work of answering a question, and it’s more than just answering a question.
[00:05:28] The research team, in my mind, really bears the responsibility of showing what that potential could be: for getting an answer in a way that has high confidence, that creates high integrity in the data that you’re bringing, and that really starts to show that research has the potential to be a strategic tool for people to use and move forward with.
[00:05:50] But when you just make it, “I’m gonna ask a question”... even in, sorry to say, a lot of the literature that’s out there, people are always asking, what is the right question, and how do you ask the right question? Before we even get to the right question, we have to expose the importance of alignment on the team.
[00:06:07] Are we all aligned on what our learning objective is? You know as well as I do that stakeholders are running from meeting to meeting to meeting, and they don’t get a chance to really think in between, to stew or mull over or reflect on what it is they’re trying to learn and how that learning is actually going to help move them, their team, and their design forward.
[00:06:28] So the research team really has multiple responsibilities beyond the ask of a question. The first is to ascertain what the aligned objective is: make sure we understand what the intent is, how research can now help solve for that objective, and what service research is going to provide in a given moment. There are different patterns we’ve observed over time, where we see certain companies doing certain things, and if we don’t call people out to that moment, often they don’t even realize it before you try to answer the question.
[00:07:02] We have to sort of have a little self-awareness session first. Where are we? What are we doing? How do we need this information? Why do we need this information? What do we already know and what do we not know? And sort of get all of that detail aligned before you then even ascertain that learning objective, and then move forward into recruiting, taking time to recruit.
[00:07:24] Now, recruiting is such an important topic, more so than it has ever been, because when we say we want a balance of people, we need to make sure it’s a balance across populations of different races, ethnicities, genders, whatever the focus is. We need to truly attempt a balance and not just do the gratuitous balance.
[00:07:47] And it’s really important for us to tell stakeholders that that takes time. That’s more effort. It takes a moment where we have to actually think about how to approach these populations and where we can find them. Then we think about the protocol: how do we engage our stakeholders to help us think about ways people can articulate the thoughts they wanna share with us?
[00:08:09] So there’s work every step of the way. I know we’ve talked about this in prior conversations: at Twig + Fish, we have a five-phase process, and literally in every phase we need to make sure that we are exposing the work that’s being done and sharing it with everybody who’s a part of it.
[00:08:29] It is not just the act of gathering information, it’s everything leading up to it: stakeholders being a part of it and listening to those stories just as much as we do, having the whole team analyze the data with us, and then really thinking about ways to translate that data into actionable steps to move forward.
[00:08:50] I know it’s a very long answer to your question, but it’s imperative that we not just focus on what’s the question and what’s the answer, because it’s not really one-to-one, unfortunately. Not even in quantitative data is it one-to-one. There’s always more work than it looks like.
[00:09:06] Chicago Camps: I’d love for you to go a little bit deeper here. What are some of the key considerations we should be keeping in mind as we’re hearing responses from our participants and what do we do with those responses?
[00:09:17] Meena: One, if we focus our attention on our own team, on our stakeholders, we need to involve them and make sure that they are listening with us. Researchers, even if they’re internal to an organization, more than anything else are the bearers of knowledge about who the participants are, who the consumers are of the designs that you’re putting out to the world.
[00:09:37] But that doesn’t mean that they should be the sole owners of that. Everybody in the organization has to hear those human stories. Everybody in the organization has to partake of them. And frankly, the more ears we have listening to those stories, the more perspectives we appreciate. Hence also the importance of a diverse team, because all of us have different lived realities ourselves.
[00:10:00] We each hear and listen for different things in the stories that people share with us. So that’s one angle of what we need to emphasize: make sure everyone is involved and everyone knows to listen. Even if they listen to just one session, Russ, that’s more than fine, but they need to be a part of it.
[00:10:17] Then when we do the analysis, everybody has to listen to that story. That’s where it’s imperative for the whole team to participate, listen, and be fully present to engage in those stories, so that they know how to leverage them to move forward. If we look at the other half of the coin, we think about the participants themselves.
[00:10:38] I always joke when I talk to my own clients. If I’m working for a financial investment company, I’ll say, nobody gets up in the morning thinking, just as they get outta bed, how am I gonna invest today? What did I think about when I invested last, and why do I invest?
[00:10:56] And nobody jumps out of bed thinking about one particular company or one particular product. They’re often plagued with so many other things that they’re thinking about. So we have to be empathic to the people who are answering our questions, and we need to make sure they are given a chance to answer.
[00:11:15] They’re given a chance to think a little more deeply about why they do things and what they do them for. What is that motivation? Is there a particular attitude that they’ve always maintained? We know attitudinal and emotional data lasts so much longer, so let’s go after that a little bit more and get a sense for what drives some of those deep-seated behaviors.
[00:11:37] When we think about this, it’s important to not just ask the question; it’s really important to provide some sort of basis for participants to be able to share information back and forth with us. As soon as they tell us their story, we can leverage that to make it that much more powerful and dive in a little more deeply.
[00:11:57] But I have to say a lot of times, the questions that I get are like, well, how do you know you can trust that answer? And how do you know they’re telling you the truth? Or maybe they’re just sort of making it up to make you feel good. That really never happens if you give somebody a chance to answer a question.
[00:12:13] Very few people go into a session going, how am I gonna trick the researcher this time? That’s not really a strategy people put effort into; they earnestly want to answer your question, and answer it well, fully, honestly, sincerely, and truthfully. So if we help them reflect, the best compliments we get are when people leave the session saying, I never really thought about car insurance as much as I did just now.
[00:12:41] Wow, it’s interesting that I make decisions this way; I’ve never thought about it. It’s almost like they have a little revelation themselves, and they have a bit of fun with it. These are things we really do need to bring to the forefront when we’re listening in on what participants are sharing.
[00:12:55] Chicago Camps: Thank you for that amazingly detailed answer. There has been a lot of change in the world over the past few years; the pace of change has accelerated, and the ways in which we work have changed dramatically. Given that, what’s important to keep in mind in today’s world as we conduct research studies?
[00:13:13] Meena: Revealing the effort. It’s important for the research team to assert a process and to make sure that the process is clear to everyone involved.
[00:13:21] So it’s not a stakeholder telling me, well, you’re gonna go run focus groups, you’re gonna do this in a week, and you’re gonna get this back to us. That’s not the way it works. You have a process, and that’s what brings, again, integrity to the way you actually collect information, and it’s how you bring everybody into that process and make sure they’re a part of it.
[00:13:41] Staying transparent, being respectful of the other timeframes that are in play, but having people be respectful of your timeframes as well, as a researcher. Doing it that way, quite honestly, the thing we keep in mind is that at the end of the day, we want high-confidence answers, and the more organized we are, the more transparency we provide, and the more structure we provide, the higher that confidence is.
[00:14:04] If there’s a clear process and I can actually see exactly where you are at a given moment in time, Russ, I will do a much better job myself of engaging with you, and I will also have higher belief in the outcomes of that particular study. It’s interesting to say this out loud.
[00:14:26] The more I tell this to the people I work with, the more it’s almost like I’ve given them permission to push back a little and say, hey, you know what, we’re doing this because it’s part of our process, not because it’s a waste of time. It’s being given that permission, I think, that’s incredibly important for people to feel like, yes, there’s a reason we’re doing this.
[00:14:48] If we only have good-enough answers, who wants to build anything based on good enough? So that would be the emphasis I place, and it’s what I emphasize with a lot of the people I work with and with my students at Bentley. I really try to tell them to draw that box and say, this is the process and this is the way it has to be, not so dogmatic that you can’t be flexible, but have that process in place, because it becomes something people can respect, engage in, and become comfortable with the language of.
[00:15:21] Chicago Camps: And now for a question from our live studio audience. David asks, “When do you have a good idea about when you’ve reached saturation when you’re doing research?”
[00:15:31] Meena: A great question. So David, my answer would be that it depends on, again, the intent behind the research.
[00:15:38] And David, if Russ is okay with it, I’m happy to share some links to a framework we’ve put together that would really help answer this. But you’ll notice that when you have exploratory or discovery-type questions, which are much more human-based, much more focused on understanding people’s lived realities, or the aspirational details they’re after, or the workarounds they’ve considered when they’re trying to solve a particular task...
[00:16:09] When you have those kinds of questions, our goal is actually just to be inspired by people, to talk to them and learn from them. So if I talk to Russ and say, Russ, tell me why you chose that particular mic in front of you, he’s gonna tell me all the details of what he chose and why he chose it, and that mic might not even be the mic that I represent with my company.
[00:16:32] It might be a competitor’s mic, but it doesn’t matter to me. I’m trying to be inspired by Russ’s thinking, by what he compared against and the criteria that were important to him. When you are looking for inspirational data, which is often what pure qualitative data is, thick, rich data, it’s actually binary.
[00:16:55] Believe it or not, you have either been inspired or you haven’t. So to come back to your question about understanding saturation: numbers don’t have to be large for that type of data. You will gain inspiration and you will say, oh my gosh, we’ve got five or six different points to figure out.
[00:17:15] Let’s chase those down before we move forward. On the other side of the fence, though, if you are doing more ideation or validation types of research, you will find that saturation comes pretty quickly, because what matters is that N of five, quite honestly, per group that is important to you, and those groups need to be carefully defined.
[00:17:42] You will find that it hits saturation quickly with that five-to-eight number that’s always tossed around, and you really don’t need to go further than that. Why? You’re not introspecting about somebody’s use; you’re simply seeing whether or not your product works, which is, again, sort of binary in its own way.
[00:17:59] You’re answering the question, does this work or not? Where people get stuck, where they wonder how this makes sense, is when you conflate the two and think you’re asking exploratory questions in a validative setting. That’s when things go a little south. Are you familiar with the nCredible framework?
[00:18:19] Oh my goodness, that is so exciting. Yes, that is exactly it, David, I am referring to exactly that. It will help you describe this to people, because a lot of times stakeholders struggle with the concept. They’re not researchers, and they don’t need to be researchers; they’re doing their function, they’re doing their job.
[00:18:41] Oh, how lovely. This is so great. So basically, when you’re looking at the two sides of the framework, it makes it so much easier to describe to stakeholders the visual distance, if it’s fair to say, between the types of questions. That helps parse apart a little bit more about sample sizes.
[00:19:04] How many people should I be speaking to? It really starts to uncover and reveal a little bit more of that information in a way that lets people see why small numbers are okay on the left side, and big numbers are often a little more important on the right-hand side.
[00:19:18] Chicago Camps: Jared Spool talks about how, and I’m paraphrasing, when you stop being surprised by responses, or feel like you can predict them to a degree, that’s the point of least astonishment, and it’s a good time to evaluate the benefits of continuing the process and/or consider moving into your synthesis and analysis phase.
[00:19:38] Meena: The only caveat to what Jared said is that I would apply it a little bit more to validative research, because not learning anything new is very often associated with testing. When you’re talking exploratory, it’s a little different. There is somewhat the same element of not learning anything new, but not quite as much, because your numbers, again, are not so large.
[00:20:00] You’re gonna come out with a lot of different learnings, and the question is how to make sense of them, how to find the patterns across them, which is a different challenge, if that’s fair to say.
[00:20:08] Chicago Camps: We have a multi-part question from our live studio audience: “What are some of the ways that research teams are enabled to do their best work? How are they succeeding the most? How are they being set up for success and what does that look like in an organization?”
[00:20:24] Meena: It comes back a little bit to what I said before, what I encourage all researchers I work with to do. Having been around the block enough times and around enough turns of the sun, I feel like it’s really important to assert the effort that goes into doing the work.
[00:20:42] I think we assert that when we say, this is the process and this is why we have to do it. Again, not being dogmatic and saying, well, this is the way it has to be done and too bad if you don’t like it, but explaining and rationalizing why we need to do things the way we need to do them.
[00:20:59] It’s a bit of coaching. I feel like the research team has to invite people in, but coach in a way that people become much more sophisticated consumers of what we put out to them. It’s a joint education, but we also have to be empathic to our stakeholders. So the question is about enabling us to do our best work.
[00:21:22] Well, if we fit in, help, and are empathic to what other people have to do, and provide structures and frameworks and project timelines where we’re very clear about what’s going on when, down to the date, which is very possible, by the way, in any research study, then we can actually fit into somebody else’s world where they are pushed up against a deadline, and we can tell them what is feasible and what’s not feasible.
[00:21:51] I have a bit of a rule in terms of enabling us to do our best work: whenever we present a study design to any stakeholders, we always start with the ideal. We always start with the ideal of, this is the right way, this is the correct way to answer the question. And I’m not saying we blow it outta proportion and say this needs to be a 12-month project and we’re all flying to Tahiti.
[00:22:14] That’s not what I’m talking about; it’s within reason. What is the best way to get answers for this particular objective? Often, as you can imagine, research studies, and research in general, is one area where constraints hit us, and we are hemmed in by constraints: often it’s less budget, less time.
[00:22:36] You don’t have enough people to do the work. In those moments, what we need to do is not just go along with, or assume, those constraints. Always present the ideal regardless of what is put in front of you, and then apply the constraints and tell people what they are. Again, we don’t make that clear.
[00:22:55] We don’t make that clear to stakeholders. So if we did it last time in half the time but killed ourselves in the process, that’s actually a bit of our fault. We need to be transparent and say, okay, well, we can’t do an N of 48 in one week. We can’t even do an N of 24 in one week. We can do an N of 12, but by getting one quarter of the data, this is what you might lose.
[00:23:20] We need to be very clear about what is lost, because then either the stakeholder will say, yes, this is what we’re gonna do, and I’m sorry we’re losing that, but it’s what we have to do; or, at the very least, they’re now made aware that there’s a better way to do things, so they themselves can learn and maybe ask for more time or more budget or whatever.
[00:23:40] Exactly. It can totally be ambitious. But the key is that we can be very clear about what is lost, and as soon as that conversation happens, we are respecting the stakeholders by saying, we understand you’re pushed up against the wall, but we’re also preparing them for the next time. Because when they come back and say, we’re pushed up against the wall again, you can go, that’s because you’re not a very good planner, and you’re now taking your baggage and putting it onto my plate.
[00:24:08] And as a researcher, I’m not okay with that. So it’s, again, creating some sort of guideline, but making sure that we can indeed be transparent, have a conversation, and show why things are being done a certain way, what you will get out of it, and what you will lose by not doing it.