Chapter 2: Choose an Approach

Once you’ve conducted your kickoff session, stakeholder interviews and desk research, you’re ready to design your research methodology.

The research cycle: design phase

Designing a research project is easy once you’ve grasped a few core principles. In this chapter, we’ll explain those principles and show you a useful tool for applying them, so you can work out:

  • Which research methods to use
  • How to use different methods together
  • How many participants to include in your research

In addition to choosing the right approach, there’s another big benefit to understanding how to design a research project. When you have to justify the need for research, or when your stakeholders are challenging your findings, you’ll be able to argue your case with confidence.

The Core Concepts

This next section is going to get a bit theoretical. Don’t worry: we’ll show you how to apply it later in the chapter. For now, though, you need the basic building blocks of research design.

In this section, we’re going to run through 10 concepts. Some may already be familiar to you, others less so. They are:

  • What is data?
  • Qualitative vs. quantitative
  • Discovery vs. validation
  • Insight vs. evidence vs. ideas
  • Validity and representativeness
  • Scaling your investment
  • Multi-method approaches
  • In-the-moment research
  • Ethics
  • Research as a team sport.

What is Data?

The research process involves collecting, organising and making sense of data, so it’s a good idea to be clear what we mean by the word ‘data’. Actually, data is just another word for observations, and observations come in many forms, such as:

  • Seeing someone behave in a certain way, or do something we’re interested in (such as click on a particular button)
  • Hearing someone make a particular comment about your product
  • Noting that 3,186 people have visited your Contact Us page today

But how do you know what’s useful data, and what’s just irrelevant detail? That’s what we’ll be covering in the first few chapters, where we’ll talk about how to engage the right people, and how to ask the right questions in the right way.

And how do you know what to do with data when you’ve got it? We’ll be covering that in the final two chapters about analysis and sharing your findings. In particular, we’ll be showing you how to transform raw data into usable insight, evidence and ideas.

Qualitative vs. Quantitative

When it comes to data analysis, the approaches we use can be classified as qualitative or quantitative.

Qualitative questions are concerned with impressions, explanations and feelings, and they tend to begin with why, how or what. For example:

  • “Why don’t teenagers use the new skate park?”
  • “How do novice cooks bake a cake?”
  • “What’s the first thing visitors do when they arrive on the homepage?”

Quantitative questions are concerned with numbers. For example:

  • “How many people visited the skate park today?”
  • “How long has the cake been in the oven for?”
  • “How often do you visit the website?”

Because they answer different questions, and use data in different ways, we also think of research methods as being qualitative or quantitative. Surveys and analytics are in the quantitative camp, while interviews of all sorts are qualitative. In general, you’ll be leaning on qualitative research methods more, so that will be the focus of this book.

Discovery vs. Validation

The kind of research you do will depend on where you are in your product or project lifecycle.

If you’re right at the beginning (in the ‘discovery’ phase), you’ll need to answer fundamental questions, such as:

  • Who are our potential users?
  • Do they have a problem we could be addressing?
  • How are they currently solving that problem?
  • How can we improve the way they do things?

If you’re at the validation stage, you have a solution in mind and you need to test it. This might involve:

  • Choosing between several competing options
  • Checking the implementation of your solution matches the design
  • Checking with users that your solution actually solves the problem it’s supposed to.

What this all means is that your research methods will differ, depending on whether you’re at the discovery stage or the validation stage. If it’s the former, you’ll want to conduct more in-depth, multi-method research with a larger sample, using a mix of both qualitative and quantitative methodologies. If it’s the latter, you’ll use multiple quick rounds of research with a small sample each time.

At the risk of confusing matters, it’s worth mentioning that discovery continues to happen during validation – you're always learning about your users and how they solve their problems, so it's important to remain open to this, and adapt earlier learnings to accommodate new knowledge.

Insight, Evidence and Ideas

Research is pointless unless it’s actually used. In some cases, the purpose of research is purely to provide direction to your team; the output of this kind of project is insight. Perhaps you want to understand users’ needs in the discovery phase of your project. If so, you need insight into their current behaviour and preferences, which you’ll refer to as you design a solution.

Often, though, you need research to persuade other people, not just enlighten your immediate team. This might be because you need to make a business case, because your approach faces opposition from sceptical stakeholders, or because you need to justify the choices you’ve made. When you need to persuade other people, what you need is evidence.

And sometimes, your main objective is to generate new ideas. Where that’s the case, rigorous research is still the best foundation, but you’ll want to adjust things slightly to maximise the creativity of your outputs.

Research is great at producing insight, evidence and ideas. But methodologies that prioritise one are often weaker on the others. It’s much easier if you plan in advance what you’ll need to collect, and how, rather than leaving it till the end of the project. Takeout: think about the balance of insight, evidence and ideas you’ll need from your project, and plan accordingly.

When it comes to planning your approach, bear in mind your analysis process later on. If you give it thought at this stage, you’ll ensure you’re collecting the right data in the right way. We talk about this more in Chapter 8.

Validity

Validity is another way of saying, “Could I make trustworthy decisions based on these results?” If your research isn’t valid, you might as well not bother. At the same time, validity is relative: every research project is a tradeoff between being as valid as possible and being realistic about what’s achievable within your timeframe and budget. Designing a research project often comes down to a judgement call between these two considerations.

Let’s look at an example. You want to understand how Wall Street traders use technology to inform their decision-making. If you were prioritising validity, you might aspire to recruit a sample of several hundred, and use a mix of interviewing and observation to follow their behaviour week by week over several months. That would be extremely valid, but it would also be totally unrealistic:

  • Wall Street traders are rich and busy. They’re unlikely to want to take part in your research.
  • A sample of several hundred is huge. You’re unlikely to be able to manage it and process the mountain of data it would generate.
  • A duration of several months is ambitious. You would struggle to keep your participants engaged over such a long period.
  • Even if the above weren’t issues, the effort and cost involved would be huge.

Undaunted, you might choose to balance validity and achievability in a different way, by using a smaller number of interviews, over a shorter duration, and appealing to traders’ sense of curiosity rather than offering money as an incentive for taking part. It’s more achievable, but you’ve sacrificed some validity in the process.

Validity can take several forms. When you design a research project, ask yourself whether your approach is:

  • Representative: Is your sample a cross-section of the group you’re interested in? Watch out for the way you recruit and incentivise participants as a source of bias.
  • Realistic: If you’re asking people to complete a task, is it a fair reflection of what they’d do normally? For example, if you’re getting them to assess a smartphone prototype, don’t ask them to try it on a laptop.
  • Knowable: Sometimes people don’t know why they do things. If that’s the case, it’s not valid to ask them! For example, users may not know why they tend to prefer puzzle games to racing games, but if you ask, they’ll probably take a guess anyway, and that guess isn’t data you can trust.
  • Memorable: Small details are hard to remember. If you’re asking your participants to recall something, like how many times they’ve looked at their email in the past month, they’ll be unlikely to remember, and therefore your question isn’t valid: you need a different approach, such as one based on analytics. If you were to ask them how many times they’ve been to a funeral in the past month, you could put more trust in their answer.
  • In the moment: If your question isn’t knowable or memorable, it’s still possible to tackle it ‘in the moment’. We’ll say more about this below.

Takeout: You want your research approach to be as valid as possible (ie, representative and realistic, as well as focused on questions that are knowable and memorable) within the constraints of achievability. Normally, achievability is a matter of time and budget, which leads us to…

Scaling Your Investment

Imagine you were considering changing a paragraph of text on your website. In theory, you could conduct a six-month contextual research project at vast expense, but it probably wouldn’t be worth it. The scale of investment wouldn’t be justified by the value of the change.

On the other hand, you might have been tasked with launching a game-changing new product on which the future of your organisation depended. You could go and ask two people in the street for their opinion, but that would be a crazy way to inform such a major decision. In this case, the scale of the risk and opportunity justifies a bigger research project.

So when you look at your research project, ask yourself: what’s the business value of the decisions made with this research? What’s the potential upside? What’s the potential risk? Then scale your research project accordingly. Incidentally, it’s also good practice to refer back to the business impact as a project KPI. You’ll find it much easier to justify the value of your research later on if you can show how it’s made a difference to the business numbers your colleagues care about, such as revenue, conversion rate or Net Promoter Score.

Multi-Method Approaches

You’ll sometimes hear people talking about qualitative and quantitative methods as if they’re in opposition. Not so: they’re friends. And your research projects will always be better if you can combine both, because they counteract each other’s blind spots.

In fact, all research methods have blind spots. You’ve got to make a judgement call about which method to use in any given situation, and you should always be aware of its limitations. The best way to overcome those limitations is to team the method up with another approach, so you can have the best of both worlds. Kristy Blazo from U1 Group describes the cycle of qualitative and quantitative stages as a spiral. Each stage builds on the last as you work round it, until you get to the point where the benefits of increased certainty are outweighed by the costs of further research.

The spiral of qualitative and quantitative stages

In-The-Moment Research

Earlier, we talked about research questions needing to be knowable and memorable in order to be valid. Actually, that’s not always the case. If you can be present when the event you’re interested in is actually happening, you don’t need to rely on participants’ patchy memories and interpretations to figure out what’s going on.

Imagine you’re interested in the experience of sports fans at a game. You could interview them afterwards, but it would be more insightful to be there at the event. That way, you could look at the features you’re interested in, and compare your observations to visitors’ own comments. Rather than asking them to recall the state of the toilets and the quality of the catering, you could observe for yourself and interview people there and then.

In-the-moment research, then, gives a more realistic view of events than asking people afterwards. The main methods for in-the-moment research are contextual interviewing and observation, diary studies and analytics, which we’ll talk more about later in this chapter.

In-the-moment research

Takeout: If you’re interested in events and behaviour that people aren’t likely to recall accurately afterwards, you should consider in-the-moment methods, instead of approaches that involve asking them about their experiences weeks or months later, such as depth interviews and surveys.

Taking Care

Research has the power to do harm.

  • By revealing participants’ identities, you could expose them to consequences in their work or community. Because of this, we hide people’s identities by default.
  • Depending on what you’re researching or testing, you risk upsetting people, particularly if they’re young or vulnerable. Because of this, we take care to set up interviews in as unthreatening a way as possible, and ensure participants know they can leave at any point.
  • For researchers themselves, there are risks. Visiting participants in their homes requires care. Working in a state of deep empathy, sometimes on distressing subjects, can be emotionally hard to deal with, and researchers can and do get burned out as a result. Because of this, we take care with physical safety, and make sure we’re managing the emotional burden together.

Takeout: When you design your research project, consider the impact it may have on both participants and the project team. If you’re working with adults on a shoe retail website, then this isn’t something you need to worry about too much. But if you’re working with vulnerable teenagers to create an app about domestic violence, then it’s a different story.

Research as a Team Sport

Research is most effective when the whole team’s involved. Consider the difference: a project where a researcher takes a brief, goes away for a few weeks, then comes back with a report, versus a project where the whole team decides on the approach together, takes turns interviewing, observes all the interviews, and analyses the data collectively. In the latter, you’re going to have better understanding, greater buy-in, and quicker, more effective results. Research isn’t just about generating insight, evidence or ideas: it’s also about building consensus among a multidisciplinary group who are about to tackle a problem together. The UK’s Government Digital Service calls this ‘research as a team sport’, and that’s the way we think it should be played, too.

We talk about how to work as a team in Chapter 2, and how to engage and activate the research with your wider group of stakeholders in Chapter 9.

Research Methods

As you can see, there’s a lot to consider when you design a research project. Don’t panic, though! Later in this chapter we’ll show you how to bring this information together to choose the right approach. First, now that you’ve been introduced to the core concepts of research, it’s time to walk through the main methods.

There are a great many research methods out there. The good news is you only need a couple of them to be able to do effective research. What’s more, the rest are mainly just variations on a theme. So if you want to branch out later on, you’ll find that they’re easy to pick up.

Depth Interviews

  • How it works. Asking questions in a relatively unstructured conversation. Normally one-to-one.
  • Type of data. Qualitative.
  • Discovery, validation or post-launch? Discovery.
  • Insight vs. evidence vs. ideas. Mainly insight, but also evidence and ideas.
  • Investment. Medium.
  • In-the-moment? No.

User Testing

  • How it works. Observing users while they complete a series of tasks with the product being tested. Also includes elements of qualitative interviewing.
  • Type of data. Qualitative, although task data can sometimes be quantitative.
  • Discovery, validation or post-launch? Validation.
  • Insight vs. evidence vs. ideas. Insight and evidence.
  • Investment. Medium.
  • In-the-moment? Yes, although the moment is artificial.

Guerrilla Interviews

  • How it works. Stopping people in a public place to conduct short (5-15 minute) interviews or tests.
  • Type of data. Qualitative.
  • Discovery, validation or post-launch? Discovery and validation.
  • Insight vs. evidence vs. ideas. Insight and evidence.
  • Investment. Low.
  • In-the-moment? No.

Contextual Research

  • How it works. A mix of observing behaviour in context and conducting short spontaneous interviews.
  • Type of data. Qualitative.
  • Discovery, validation or post-launch? Discovery.
  • Insight vs. evidence vs. ideas. Mainly insight, but also evidence and ideas.
  • Investment. High.
  • In-the-moment? Yes.

Web Analytics

  • How it works. Exploring, monitoring and testing hypotheses using a tool such as Google Analytics.
  • Type of data. Quantitative.
  • Discovery, validation or post-launch? Discovery, and also monitoring post-launch.
  • Insight vs. evidence vs. ideas. Insight and evidence.
  • Investment. Low.
  • In-the-moment? Yes.

Co-design

  • How it works. Bringing together a group of stakeholders and users to work on creative group exercises, to both define a problem and explore solutions.
  • Type of data. Qualitative.
  • Discovery, validation or post-launch? Discovery.
  • Insight vs. evidence vs. ideas. Ideas. While co-design does provide some insight, it should always be checked with another methodology.
  • Investment. Medium-high.
  • In-the-moment? No.

Card Sorting

  • How it works. Types of content are written on cards, which are then sorted into groupings based on their conceptual similarity. Can be done online or in-person.
  • Type of data. Both qualitative and quantitative.
  • Discovery, validation or post-launch? Discovery.
  • Insight vs. evidence vs. ideas. Insight.
  • Investment. Medium.
  • In-the-moment? No.

Tree Testing

  • How it works. Users are asked to find particular items of content within a tree structure, eg, “Which aisle are the bread rolls in?” Mainly done online but sometimes in-person.
  • Type of data. Usually quantitative, although can be used for qualitative research too.
  • Discovery, validation or post-launch? Validation.
  • Insight vs. evidence vs. ideas. Mainly evidence, with some insight.
  • Investment. Medium.
  • In-the-moment? No.

Surveys

  • How it works. Large numbers of users are asked to fill in a structured set of questions, usually online.
  • Type of data. Quantitative, although some qualitative data may also be gathered.
  • Discovery, validation or post-launch? Discovery and post-launch.
  • Insight vs. evidence vs. ideas. Insight and evidence.
  • Investment. Medium.
  • In-the-moment? Yes, if the survey is an intercept (e.g. website popup). Otherwise no.

A/B testing

  • How it works. The behaviour of a test group and a control group is compared on a live product, to see which scores best against a specific metric.
  • Type of data. Quantitative.
  • Discovery, validation or post-launch? Post-launch.
  • Insight vs. evidence vs. ideas. Insight and evidence.
  • Investment. Low.
  • In-the-moment? Yes.

Eyetracking

  • How it works. Like a user test, but participants’ eye movements are monitored to see where their gaze is moving across a web page, picture or room.
  • Type of data. Qualitative, although some quantitative data may also be gathered.
  • Discovery, validation or post-launch? Validation.
  • Insight vs. evidence vs. ideas. Mainly evidence, with some insight.
  • Investment. Medium-high.
  • In-the-moment? Yes.

Diary Studies

  • How it works. Participants keep a record of their behaviour, thoughts or feelings over a period of time, typically a few days to a couple of weeks.
  • Type of data. Qualitative, although some quantitative data may also be gathered.
  • Discovery, validation or post-launch? Discovery.
  • Insight vs. evidence vs. ideas. Mainly evidence, with some insight.
  • Investment. High.
  • In-the-moment? Yes.

Focus Groups

  • How it works. A group of customers are gathered to answer questions about a particular subject. Similar to co-design, but with more emphasis on talking rather than activities.
  • Type of data. Qualitative, although some quantitative data may also be gathered.
  • Discovery, validation or post-launch? Discovery.
  • Insight vs. evidence vs. ideas. Ideas. While focus groups do provide some insight, it should always be checked with another methodology.
  • Investment. Medium.
  • In-the-moment? No.

Necessary Skills

When you’re starting out in user research, the two most important skill sets for you to develop are:

  • One-to-one qualitative research. We’ll spend most of the rest of this book taking you through this. Once you’ve got the hang of this core skill, you’ll find you can apply it in a number of variants: depth interviews, user testing, guerrilla research and contextual research.
  • Analytics. This is covered in Luke Hay’s recent book (https://www.sitepoint.com/premium/books/researching-ux-analytics), so we won’t say much more about it here.

Once you’ve picked up these two approaches, you may be curious about others. We’re not going to go into each of these in detail, but it’s useful to know what else is in the toolkit. After one-to-one interviewing and analytics, we think the next two methodologies to learn are:

  • Co-design
  • Card sorting

And after that:

  • A/B testing
  • Surveys
  • Tree testing
  • Eyetracking
  • Diary studies

There are plenty more (especially if you include all the remote testing tools out there), but by now we’re getting pretty niche. The important thing to remember is that most design questions are answerable with one-to-one qualitative research or analytics. So to get started, that’s all you need in your toolkit.

How to Choose Research Methods

Choosing research methods can seem complicated, but actually it’s pretty simple if you refer to the principles we talked about earlier in the chapter. To make it easier, we’ve provided the method summaries above. So when you’re thinking about your approach, ask yourself these questions:

  • What stage of the project are we at? Discovery or validation?
  • How important is the question I’m trying to answer? What’s the appropriate scale of investment?
  • What mix of insight, evidence and ideas do I need? Do I need to convince anyone else (in which case evidence is important), or is it just to inform me and my immediate colleagues (in which case, focus on insight)? Or do I need to focus on coming up with some fresh ideas?
  • Do I need qualitative answers (ie, to understand things from a user’s perspective), or do I need quantitative answers (ie, an idea of how many, how often or how much)? Or both?
  • Is the behaviour I’m interested in something that people can remember accurately? If not, I may need an in-the-moment method.

Once you’ve got answers to those questions, you’re ready to choose your methods. If the answer to any of the questions above is “it depends” or “both”, then you may need to use a multi-method approach.
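
If it helps to see these questions in action, here’s a minimal sketch in Python that encodes the method summaries above as data and filters them by your answers. It’s purely illustrative: the data structure and function names are our own, not part of any research tool, though the attribute values come straight from the summaries earlier in this chapter.

    # Illustrative only: a rough filter over the method summaries in this chapter.
    # Stage, data type and in-the-moment values mirror the summaries above.
    METHODS = {
        "Depth interviews":     {"stage": {"discovery"}, "data": {"qual"}, "in_the_moment": False},
        "User testing":         {"stage": {"validation"}, "data": {"qual", "quant"}, "in_the_moment": True},
        "Guerrilla interviews": {"stage": {"discovery", "validation"}, "data": {"qual"}, "in_the_moment": False},
        "Contextual research":  {"stage": {"discovery"}, "data": {"qual"}, "in_the_moment": True},
        "Web analytics":        {"stage": {"discovery", "post-launch"}, "data": {"quant"}, "in_the_moment": True},
        "Co-design":            {"stage": {"discovery"}, "data": {"qual"}, "in_the_moment": False},
        "Card sorting":         {"stage": {"discovery"}, "data": {"qual", "quant"}, "in_the_moment": False},
        "Tree testing":         {"stage": {"validation"}, "data": {"quant"}, "in_the_moment": False},
        "Surveys":              {"stage": {"discovery", "post-launch"}, "data": {"quant"}, "in_the_moment": False},  # intercept surveys are the exception
        "A/B testing":          {"stage": {"post-launch"}, "data": {"quant"}, "in_the_moment": True},
        "Eyetracking":          {"stage": {"validation"}, "data": {"qual"}, "in_the_moment": True},
        "Diary studies":        {"stage": {"discovery"}, "data": {"qual"}, "in_the_moment": True},
        "Focus groups":         {"stage": {"discovery"}, "data": {"qual"}, "in_the_moment": False},
    }

    def candidate_methods(stage, data_needed, needs_in_the_moment=False):
        """List methods that match the project stage, data type(s) and in-the-moment requirement."""
        return [
            name for name, attrs in METHODS.items()
            if stage in attrs["stage"]
            and data_needed & attrs["data"]
            and (attrs["in_the_moment"] or not needs_in_the_moment)
        ]

    # Example: discovery-stage research needing qualitative, in-the-moment data.
    print(candidate_methods("discovery", {"qual"}, needs_in_the_moment=True))
    # -> ['Contextual research', 'Diary studies']

Treat the output as a shortlist to weigh against your scale of investment and the insight, evidence and ideas balance, not as a final answer.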

How Many People?

We’re often asked, “How many people should I include in my research?” Here are three simple rules of thumb to help you size your sample:

  • Firstly, how confident do you need to be in the answer? The more important the outcome of the research, the bigger the sample.
  • Secondly, how many different sub-groups do you have? Often there are a couple – say, ‘customers’ and ‘non-customers’. For each sub-group, include at least three people, so in this case our sample size would be six.
  • Finally, what stage of the project are you at? If it’s discovery research, you’ll want a larger sample. 20 people is ideal – not too big to manage, but enough that you’re getting a real cross-section of experiences.

Using these rules, the numbers will look something like this:

  • Discovery: 5-20 people in a single round
  • Validation: 5-12 people per round, in multiple rounds
  • At least three people from each sub-group in every round of testing.

Even the smallest qualitative study should include at least five people; otherwise you’re running the risk of your data misleading you.
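
To make the arithmetic concrete, here’s one rough way to combine these rules of thumb in Python. It’s purely illustrative, and the numbers come straight from the guidelines above: at least three people per sub-group, never fewer than five overall, with an upper end of about 20 for discovery and 12 per round for validation.

    def qualitative_sample_size(num_subgroups, stage="discovery"):
        """Suggest a per-round sample size range from the rules of thumb above (illustrative only)."""
        low, high = (5, 20) if stage == "discovery" else (5, 12)  # discovery: one round; validation: per round
        floor = max(low, 3 * num_subgroups)                       # at least three people per sub-group
        return floor, max(floor, high)

    # Two sub-groups (say, customers and non-customers) in discovery:
    print(qualitative_sample_size(2, "discovery"))  # -> (6, 20): six at a minimum, up to 20 for a full cross-section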

Quantitative research follows different rules, but here you should be thinking in the hundreds rather than single figures. 300-500 is plenty for most basic quantitative research, although more advanced techniques require a couple of thousand participants. Just like qualitative research, you need to ensure you have enough people from each sub-group, but this time we tend to use a minimum of 100 people per audience, rather than three.
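
If you want to sanity-check those numbers, the standard margin-of-error formula for a proportion shows why a few hundred responses is usually enough. This is generic statistics rather than anything specific to user research; the sketch below assumes a 95% confidence level and the worst case of a 50/50 split in answers.

    import math

    def margin_of_error(n, p=0.5, z=1.96):
        """95% margin of error for a proportion p measured on a simple random sample of n people."""
        # p=0.5 is the worst case (widest interval); z=1.96 is the 95% confidence multiplier.
        return z * math.sqrt(p * (1 - p) / n)

    for n in (100, 300, 500, 2000):
        print(f"n={n}: about ±{margin_of_error(n) * 100:.1f} percentage points")
    # n=100: ±9.8, n=300: ±5.7, n=500: ±4.4, n=2000: ±2.2

In other words, at 300-500 responses your headline figures are accurate to within roughly five percentage points, which is fine for most product decisions; it takes thousands of responses to tighten that much further.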

Summary

Like any kind of design, research design is about understanding the problem before you apply a solution.

  • Ask yourself the questions we showed you in the ‘How to Choose Research Methods’ section.
  • Look up the most suitable option (or options) in the method summaries.
  • Work out your sample size using the rules above.
  • And if in doubt, remember that in the majority of cases, the most flexible, all-round-useful approach is one-to-one qualitative interviewing.