Should you or shouldn’t you listen to your customers? Do they know what they want or just what they think they want? Famously, in the pre-automobile era, people didn’t want a car, they wanted faster horses. So how do you know when to listen to your customer feedback and when not to?
In this talk from BoS Europe 2020, Sally Foote (VP of ECommerce at GoCompare) shares her experience and expertise working with some high-profile digital businesses operating at scale to manage profitable growth and innovation in their organisations.
She will explain:
- how she uses data-driven decision-making frameworks to identify profitable opportunities that can have a potentially transformational impact
- why listening to your customers isn’t always a good idea
- how to know if you’re asking the right questions
Slides
Want more of these insightful talks?
At BoS we run events and publish highly-valued content for anyone building, running, or scaling a SaaS or software business.
Sign up for a weekly dose of the latest actionable and useful content.
Unsubscribe any time. We will never sell your email address. It is yours.
Transcript
Mark Littlewood
Welcome.
Sally Foote
Hey, everyone. Thank you so much for having me, Mark, really great to see that you’ve managed to pull this together, despite all the circumstances. So, I’m very pleased to be a part of it still.
Mark Littlewood
And I’m sorry we’re not all there in person. But this is much, much better for introverts, is one of the messages I’m getting. And it’s also much, much better for nosy people, because you get to see inside people’s houses, like some weird version of Through the Keyhole.
Sally Foote
Yeah, five minutes ago, this was a laundry.
Mark Littlewood
So, Sally, why don’t you introduce yourself, just say who you are and what we’re going to do, and then you’re going to show your talk, which runs at just over 25 minutes. I’d like to say thank you so much for taking time out of your day, not just now, but when we actually recorded this particular talk at about half past eight or something on Friday evening, Sally.
You don’t want to know, well, actually you do want to know what goes on behind the scenes with these things. So there’s Sally, and yours was set up in one of your spare bedrooms.
Sally Foote
We’re fortunate enough to have one, yeah.
Mark Littlewood
With a laptop piled up on some suitcases and piles of paper. Well, you’ll see in a minute, but you won’t get any sense of what it was like, because you gave me the wide view of the room and it looks super pro, and I have to say you did well. I’ve been having a lot of conversations with people, and we’ve videoed and recorded a bunch of talks, and you’ve got a second career, lady, in outside broadcasting in difficult environments.
So, say Hi, who you are, and roll the video.
Sally Foote
Hi, everyone. Thanks so much, Mark. So, first, I’m the Vice President of Ecommerce at GoCompare, where I look after all of our digital marketing and our product development; at the moment, that’s the website.
I’ve spent my career working in digital product, primarily across news organizations like The Times, The Sunday Times and the Guardian, and I spent a bit of time at Sky. Prior to my current role, I was Chief Innovation Officer at Photobox. In the talk we’re about to play in a moment, I talk a lot about my experience of working directly with customers over the years to understand what was going on for them and to build product concepts out of that.
So I hope you enjoy it, and I’m looking forward to hearing your questions.
Mark Littlewood
Right, without any further ado, roll video. What I find amazing is that I’ve got no hands here, no tricks.
Sally Foote
Everyone, lovely to virtually meet you all. So I want to talk to you today about how to talk to your customers. And I want to start off by saying that I think probably everyone agrees that the number one reason products fail is because there hasn’t been enough market research or customer testing done, right?
When Colgate launched dinner entrees in 1982, this probably seemed like a really good idea. These products were at the peak of their market, and you can just imagine an executive board sitting around and thinking that we should definitely get into this space. But in retrospect, it seems highly unlikely that they ever sat an actual living human being who didn’t work for Colgate in front of one of these products and asked them whether they would like their dinner produced by their toothpaste company. Or how about when Clairol launched a Touch of Yogurt shampoo? Again, this was following the market; it was very much a trend at the time to start including natural ingredients in your shampoo, like honey or herbs or, you know, yogurt. Believe it or not, there were actually two things wrong with this product. The first was that people didn’t want their hair to smell like yogurt. And the second was that there were actually some reported cases of people eating the shampoo, something you’d think might have shown up in a focus group.
But let’s not limit this gallery of botches to mash-ups between food and cosmetics. Much more recently, Google Glass was mothballed in 2015, so not even that long ago. The whole idea behind this product was that it was supposed to be technology that worked for you. But it turned out that people didn’t really want technology to work for them on their face. And they definitely didn’t want it to work for them at that kind of price point, or with those kinds of security concerns. Did Google really test this technology with any non-Glassholes?
From the outside, it doesn’t seem to us that anyone involved in developing these products spent any time talking to real customers about whether these were the types of things that they wanted. It seems highly unlikely that if they’d done that, these products would have got to market, true? Well, I’m actually not sure. I think there’s a very strong possibility that many of these products were informed by some level of testing or market research. And I’m saying that because I’ve seen firsthand how very easy it is to misinterpret what your customers are saying, or to take what they’re saying out of context, or even to be asking them the wrong question altogether.
I think what’s more likely to happen here is that the product developers or market researchers that were working on these products, started to see a positive result in their tests and just ran with it.
They didn’t question what they were doing enough; they basically wanted to find an answer, they started to see it, and so they ran after it.
In my previous role, I was Chief Innovation Officer at Photobox. And in my time there, I got to work on some really interesting projects, one of which was helping customers to make their photobooks more easily.
And a few years ago, when we started looking at this problem, it seemed a very natural area for our AI team to focus on. That was partly because they had already developed some very clever photo science: the algorithm could look at an image and understand what that image contained. But we also had about 20 years’ worth of photo albums that customers had made that we could use to train an algorithm on, and we could get to a point where we’d be able to take new sets of photographs, new customers’ photos, and make albums based on what other people had done in the past. Seemed like a really good idea, right? So we spent quite a lot of time and energy developing the software, and when it was ready, we tested it with customers. We did a diary study: we got customers to make a photo book at home, and we got them to record how long it took them to do so, and all the tasks involved. Then we brought them in, sat down with them, got them in front of our new piece of software, got them to make their book again, and we timed it. And what do you know, it was an 80% reduction in the amount of time it took to make the photobook. This was really significant, because until this point our customers had taken, on average, about eight hours to create a photo book. And that isn’t elapsed time; the elapsed time can run over weeks or months. At Photobox we even had customers who would start their album in January and finish it just in time for Christmas. So this was a significant improvement in the journey for customers making their books. We were really, really pleased with this. We were delighted. We put it live.
And guess what? No one used it.
So we had this pop-up that came up as you started to make your book. And on one side it said, hey, use our amazing new AI to create your book and it will be really fast and super easy and it won’t take any time at all. And on the other side, it said, oh, you can just carry on doing it the really hard way.
And of course, everyone just carried on doing it the really hard way. We almost instantly realized what we had done wrong in our research. What we had done was test a solution: we had designed a piece of software, and we had tested that it functionally worked the way we had designed it. What we hadn’t done was go back and really test that user problem. We hadn’t checked that what we were solving for was really what customers wanted.
So whenever you’re doing testing, you need to make sure that you’re testing your understanding of the problem long before you begin to test the solution to that problem. When we went back and worked through it again with customers, we found that they did want help automating the creation of their photobooks; they just didn’t want it all automated. They wanted help with the boring bits, like deleting duplicates, organizing photos from the same event onto the same pages, or choosing layouts that would work for the shape and structure of their pictures.
We built a lot of that into the next version of the software, which you might have seen the ads for, and it’s been very, very successful for customers.
After years of working in digital product development, I’m more reliant on research than ever before. I’ve recently started a new job at GoCompare. For those of you who are not deeply familiar, we’re the ones with the moustache, not the meerkats. And I’m new to not only the company but the sector, the industry; basically everything about this is new to me. So I have spent a lot of time over the last few weeks getting my hands on whatever I can to understand our customers and what they need. That includes digging out any existing research. I’ve spent time in the call centers listening to customers on the phones, and my teams and I have been out watching customers using our products and looking at our websites.
I don’t believe that you can do product development without research. But I also don’t believe that you can do research without questioning what it is you believe from the results.
What the hell do I mean by that?
Well, I mean that you need to question everything you see, everything you think you’ve proven, every conclusion you draw; test it and test it again, making sure that you’ve correctly interpreted what is going on in that research and the results that you think you’re seeing.
Is everyone familiar with the board game Jeopardy? It’s based on a very famous American television quiz show, and the basic premise is that instead of the quizmaster asking you a question, they provide an answer, and the contestants work out what the question is. And it occurred to me that it’s so often the case that we do that when we’re conducting research: we set out with the answer that we want, and we frame and set up all our questions in order to reach that answer. And of course, what we need to be doing is making sure that we are asking the right question.
And of course, once I’d made that connection between board games and research, and I was trying to pull together this talk to keep you all entertained with a nice string of tidy metaphors and keep it all well connected together, I started to wonder whether we could use board games as the metaphor to stitch together some of the experiences and learnings about how to get great results out of talking to your customers. So let’s roll that dice.
Many people will be familiar with the rule in Scrabble that if you end up with a rack full of consonants (and you’re not playing the Welsh version, which does exist), you can take those consonants and throw them back in the bag. It’s a risk, obviously. But the greater risk in some ways is to hang on to all of them in the hope that you might get enough out to be able to do something with them on your next turn. You might lose some points by throwing them back, but you’re basically taking a calculated risk: you’re throwing back what you’ve got because it doesn’t work for you, and it’s much harder to make some sense out of what you’ve got there than to try again.
And again, this is so often the case with research: you don’t have to use everything that you get. It’s much better to try and work out what’s important from what you’re seeing than to try and use everything that you’re hearing from your customers.
A few years ago, well, actually many years ago now, I was working for The Times and The Sunday Times, and we were redesigning the homepage of Times Online, as it was at the time. As you can well imagine with a news website, the homepage is incredibly important. Most people who love and know the brand just come straight to that page every day; they don’t necessarily come to it through search. They use it as the main way to get to everything: all of the journalism The Times produced would be linked from that page, and that’s what they would use to get around the site. So we were having a lot of debate about what should be on that page and how it should be structured, and of course we went out to customers to test this. What we did was a thing called a card sorting exercise, which is where you take all the features and ideas and things that you might want to put onto the homepage, you write them on cards, you put them in front of people, and you get them to order them by priority in terms of the things that are most important to them.
And we must have done this with about 20 customers.
Every single one of them put weather in the top three. We just couldn’t believe it. How is this? What is going on here? Why are people telling us that the weather is more important to them than all of our opinion journalism, or all of our travel journalism, and all of these amazing pieces of work we do? The weather, which we buy in as a cheap feed from the cheapest place we can get it, is the most important thing?
Now, reflecting on it, you probably don’t have to do a study to understand that weather is important to the British public, that it’s something we care about deeply and spend time talking about. And what was going on for people is that they were looking at all of these things that we put in front of them, they were looking at weather knowing that they cared about it, and they were weighting it quite highly in terms of their priority. What they weren’t doing, which was actually our job, was to filter that through the context of The Times: what should The Times be focusing on? What was important for The Times?
So we changed the test, just to be sure, and we did two further tests. The first one we did was we drew out the page with all the kind of blocks and spaces on it, and then we put the cards next to it, and we got people to connect what they were interested in to spaces on the page. And of course, weather got really small and tiny and went right up the top.
And then we did a second test where we removed the weather card. And we just put in some blank cards in case people felt there were things that were missing that we hadn’t included. And of course, no one wrote weather.
And you can see my point here: it was our job to work out what was important for customers and to interpret what they were saying in the context of our business.
Don’t do everything your customers tell you, it’s your job to work out what’s right for your customers.
Obviously, with those iterations of the test, we got a much better result, a much better design, and a much better page overall.
At Photobox, I was looking after a lot of physical products, and we made a number of changes. One of those changes was to our calendars, and I’ve got one here to show you. We removed this plastic cover. There were two reasons for it. Mainly because, although it was very high quality plastic, it was unrecyclable, and we got a better product at the end of it by removing it. But also because, whenever we took this calendar into customer testing, we’d seen that what people would do is just turn it over and hang it on the wall. So we knew that there was no real functional value to this plastic cover, and we got a much better, recyclable product by taking it off. And as soon as we released that, without having done any testing, we started to get feedback from our customers. Of course, some people noticed that the cover was missing, but we also started to get feedback on some other things: people started saying, oh, we don’t like the new paper, or, the new printing isn’t so good. And we hadn’t changed any of that; the only thing that we had done was remove this plastic cover.
So we went back to testing to try and understand what was going on for our customers. But this is quite a difficult thing to test, right? Because if you just take two calendars and put them next to each other, and you ask people what the difference is, they’ll tell you that the cover has been removed. But what we were trying to understand was, what was different, what was the thing that the cover was doing for customers, that meant they felt differently about the whole calendar. And so we redesigned the test again.
And what we did instead was put a whole bunch of products into the test, including these two calendars, even though we weren’t very interested in the other ones. And we asked people to rate them on various criteria, like, for example, how high quality they thought each one was. And we started to notice some other things coming out that told us a little bit more about what was going on. So, for example, people noticed that the calendar without the cover was lighter, and that was playing into their quality judgment. But they also rated it as more likely to get damaged in the post. That was also very interesting.
There was something going on as well that customers couldn’t tell us about, and this took us a long time to work out, because we worked it out by stitching together quite a lot of research over the years about how people thought about their photographs. What we realized was that the cover had been respecting the photograph. Let me explain.
So Photobox customers print photos of people that they love: their friends, their families, their pets, so sometimes not people. And those pictures are precious to them; they talk about how it’s really difficult for them to throw them away, and how they don’t like putting pins through them, for example. All of these things are really interesting and tell you the value of those pictures. That’s another reason why Photobox don’t make certain kinds of products, like suitcases, floor mats and table mats, because bad things happen to those photographs. And what we realized was going on with this calendar was that the plastic cover had been separating the picture on the front of the calendar from the outer packaging in which the calendar was posted, and removing it was making people feel a little bit uncomfortable. We knew we were onto the right hypothesis because when we started doing things like putting a paper cover on, or wrapping it before putting it back in the packaging, people started to feel much more comfortable about the calendar as a whole. We knew we’d finally found the insight into what was really bothering people.
And the point here is that it took a lot of thinking and hard work and hypothesis-forming to work out what was going on. These insights are hard to come by; you have to stretch yourself intellectually, and then you really have to take the effort to stitch together these insights and test them, creatively finding ways to work out which one is the actual cause of the problem, or sometimes the opportunity you’re testing out.
At Photobox, I also launched this little product, which is a baby book. It’s like a board book: it’s got nice thick pages and rounded corners and all that sort of thing. And we put this into testing again, and we started getting some interesting feedback quite early on, which was along the lines of, oh, we’re not sure if it’s robust enough, and we’re a little bit worried about whether it’s strong enough. And we couldn’t understand this, because, A, we’d developed it to European standards for baby products, and, B, we’d really stress tested it. We’d thrown it about, we’d tried to tear it; I left one in my daughter’s nursery for three months and took it home intact.
So we had to work out what was going on. Eventually, we noticed that it was at a particular point in the research that this started to emerge. I’m going to ask you to do something slightly strange, particularly because we’re all on video conference, but I’m going to ask you to close your eyes for a moment, because I really need you to listen. So if you do just that, I’m going to open this book now.
Did you hear that?
Right? You heard it? Great. That is actually a completely natural sound. What happens is, when the pages are glued together, you get a little bit of glue on the top of the spine here, and then when the cover is added on, the cover adheres to the pages. When you open it, it separates, and you get a slight snapping sound or a creak in the spine. This is what customers were hearing, and this is why they were worrying that the book wasn’t robust enough.
But you know what, no one said that. They weren’t able to tell us what was actually going on. We had to get there by asking them all sorts of questions: we think it might be this; is it the corners? Is it the pages? Is it the bending? Is it the sound that you can hear? They weren’t able to tell us exactly what was going on, and that’s often true: customers can’t necessarily articulate what is going on for them. It’s only through this kind of detailed working through of the scenarios that you get to what’s really going on.
So far, I’ve been talking about one main kind of research: face-to-face, interview-style, focus-group-style. These are incredibly powerful for getting deep behavioral insights into what’s going on for your customers, understanding their motivations, understanding their thinking, just getting a real sense of what’s going on for them. However, they have one major flaw. And that is that they are not real.
How people behave in real life does not necessarily reflect how they behave in testing; how they say they’re going to behave and how they really behave can often be two entirely different things.
Recently, my team and I were out at a research session looking at GoCompare’s mobile site, and we were watching people use the site on their phones. We had about six participants, and every single one of them got stuck on one field, which was the job title field, where you have to fill in your job title. No one could find a matching job title. We’ve since fixed this, obviously, but it was a very interesting study nonetheless.
Anyway, of the people we had, about half of them weren’t bothered; they were fairly confident insurance buyers, and they just found something that they were happy with and cracked on. But the other half got completely stuck. They really didn’t want to choose something that wasn’t exactly right, because they were worried that they would invalidate their insurance, or they didn’t quite know, and they didn’t know how to get help. And because there was a researcher sitting over their shoulder prompting them to continue, they went on and continued. I can tell you right now that in real life, not one single one of those people would have completed that journey; they would have left our site, or tried another site, or stayed with their current insurer. And that’s a really great example of where a test situation isn’t going to give you a strong read on what is actually going to happen in real life. I know this as well because when we were working at The Times and we launched The Times and The Sunday Times paywall, we surveyed, well, probably not hundreds, but a lot of people on whether they would pay for The Times or not.
And I can’t remember a single focus group in which there were more people who said they would pay for The Times than wouldn’t. And yet, six months after launch there were over 100,000 subscribers, and four years later, a million subscribers. Another great example of the fact that things don’t always work out in real life the way that they do in a test scenario.
So how do you get around that? Well, you need to make sure that you are finding a way to test in as real a situation as you possibly can. So if you’re testing an ecommerce solution, whatever it is, get the person there with their credit card and take them all the way through that journey.
And that usually means getting out of the office. It means getting away from a test scenario, it means getting out from behind that one-way glass, and going to where things are really being done. The more you can get into the real situation where this task is going to be done, the better for your results. Another great example of this is a company I worked with many years ago called The Saleroom, a marketplace that connects auction houses with people buying antiques online. So it’s a really high-volume, high-interaction site, and it’s really important that people have the information they need, etc. We were doing a piece of work to look at how we could evolve their solution, and so what we did was go and sit at the desks of the people who were participating in these auctions and buying in these auctions. And we found all of these nuggets of information. We didn’t script the interviews, we didn’t set up particular steps that we wanted to go through; we just wanted to watch what they were doing. You could look at what browser tabs they had open alongside the Saleroom, you could look at the Post-it notes on the sides of their computers, the piles of books and reference material that they had alongside their computers. We understood what their needs were through this transaction, and we found nuggets of ideas, ways that we might be able to support them better.
The more you can get out to where people are doing things, the better.
So far, everything I’ve been talking about has fallen very much into the category of qualitative research: working with smallish numbers of customers in a very hands-on, detailed way. But even once you’ve got those really interesting nuggets of insight, before you start investing in them and really developing against them, you need to be sure that they’re going to scale. You need to make sure that those insights are relevant to a much wider portion of your customer base, or a much wider slice of the population. And of course, the way to do that is with some quant research. Ideally, with your quant research, you want to be able to test a much larger number of customers at a lower cost and much faster. So think about the hypotheses that you’ve formulated out of the qual and find ways to test them at volume. Now, there are a number of different ways to do this, and if you already have a live site, that’s the place to start. For example, Google Analytics can help you, but more commonly we use dedicated tools. At Photobox, for example, we used a tool called Usabilla, which plugs into the website. People use it to leave feedback in the middle of the journey as they’re doing stuff; they’ll give us feedback on what’s going on for them. But we could also use Usabilla to build a panel of our customers that we could survey on various options, and whenever we were looking at launching something new, we could insert prompts into the journey to say, hey, give us your quick five-minute feedback on this thing X, Y or Z that we’re looking at.
You can also incorporate other types of tools into your website. For example, at Photobox we used a tool called Quantum Metric, and what this does is basically allow us to watch any session on the site live; we can filter that by device, by location, by part of the site, or by a particular type of interaction.
So, for example, we can find anyone who is rage clicking, who’s got frustrated with what they’re trying to do, either because there’s a bug or because the user experience is a bit broken and it’s not clear to them how to proceed. It’s a great way for us to find those types of problem areas, and then we can go and look at a large number of customers using that part of the site and see how they’re behaving. If you’re trying to launch a new feature or something new on your site, then of course A/B or multivariate testing is a great way to do that: putting out different versions of the same feature and looking at how different segments of your customer population respond to those features. It’s incredibly powerful for ecommerce, to see whether a change that you’re making is actually going to work for your customers at scale.
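To make the mechanics behind that kind of A/B test concrete, here is a minimal sketch, with made-up names and numbers (the assignVariant helper and the 5,000-user arms are illustrative assumptions), of how you might deterministically bucket visitors into variants and then check whether a difference in conversion rate looks real. This is not GoCompare’s or Optimizely’s implementation, just the general idea.

```typescript
// Minimal A/B test sketch: deterministic variant assignment plus a
// two-proportion z-test to compare conversion rates between the arms.
// All names and numbers here are illustrative, not a real implementation.

type Variant = "control" | "treatment";

// Bucket a user by hashing their ID (FNV-1a), so the same user always
// sees the same variant across sessions.
function assignVariant(userId: string, treatmentShare = 0.5): Variant {
  let hash = 0x811c9dc5;
  for (let i = 0; i < userId.length; i++) {
    hash ^= userId.charCodeAt(i);
    hash = Math.imul(hash, 0x01000193);
  }
  const bucket = (hash >>> 0) / 0xffffffff; // map to [0, 1]
  return bucket < treatmentShare ? "treatment" : "control";
}

// Two-proportion z-test: is the difference in conversion rate likely real?
function zScore(convA: number, totalA: number, convB: number, totalB: number): number {
  const pA = convA / totalA;
  const pB = convB / totalB;
  const pooled = (convA + convB) / (totalA + totalB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / totalA + 1 / totalB));
  return (pB - pA) / se;
}

// Example with made-up numbers: 5,000 users per arm, 8.0% vs 9.2% conversion.
console.log(assignVariant("user-12345")); // stable assignment
const z = zScore(400, 5000, 460, 5000);
console.log(`z = ${z.toFixed(2)}`); // |z| > 1.96 is roughly significant at 95%
```

A platform like Optimizely handles the assignment, exposure logging and statistics for you; the point of the sketch is simply that a comparison like the one at the bottom is what “responding differently at scale” boils down to.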
If you’re testing something that isn’t quite live yet, then again there are a number of different ways to do that, and two of the most common are the Wizard of Oz test and the false door test. A Wizard of Oz test is where you put something live and pretend that it’s automated, while in the meantime there are a ton of people running around in the background making it happen. With our photobook automation tool, for example, one of the ways that we might have tested it was with a Wizard of Oz test: saying, hey, we’ll create your book for you, and then doing it by hand in the background within 48 hours and sending it through, to really test the appetite for that product from a wider base before we started to build it at scale.
Another common test in this space is called a false door test. This is where you advertise something that isn’t live yet. So you might say, hey, click here to download our app, and someone might go through to a page that says coming soon, give us your email address and we’ll be in touch with you about it. That has two benefits. One, you get a volume test on the number of people who are interested in downloading the app, plus a ton of other related data. But you also get a prospect pool of people that you can get in touch with when you’re ready. Those are two great examples of ways to test, on a live site, something new that you’re starting to look at.
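As an illustration of what the false door itself can look like, here is a minimal client-side sketch. The element ID, the /analytics/false-door endpoint and the coming-soon URL are all hypothetical, not anything from GoCompare or Photobox.

```typescript
// Minimal false door sketch: a call-to-action for a feature that does not
// exist yet. Clicks are recorded for the demand read, then the visitor is
// sent to a "coming soon" page that collects an email address.
// The element ID, endpoint and URL below are assumptions for illustration.

const cta = document.getElementById("download-app-cta");

cta?.addEventListener("click", (event) => {
  event.preventDefault();

  // Record the click so interest can be measured at volume.
  void fetch("/analytics/false-door", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ feature: "mobile-app", action: "cta_click", ts: Date.now() }),
  });

  // Send the visitor to the "coming soon" page with the email capture form.
  window.location.href = "/coming-soon?feature=mobile-app";
});
```

The recorded clicks give you the volume read on demand, and the email form on the coming-soon page builds the prospect pool described above.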
So, back to Jeopardy to finish off: a final round of Jeopardy. When you come back to this question and think about everything I’ve been talking about today, if the answer is that it’s all about using lots of different tests to really validate your understanding of your customers and their problems and needs, then the question, of course, is: how should you listen to your customers? Don’t be lazy. It isn’t your customers’ job to tell you the answer. It’s your job to work it out from what they tell you. Thank you so much.
Q&A
Mark Littlewood
Drop us an email if you’d like to sponsor the commercial breaks. Fantastic. Now, I have one comment to make about people closing their eyes. I’d be really curious, and let us know in the chat: number one, whether you closed your eyes, and number two, when you heard that crack, did you open them? Because I’ve seen that bit three times now, and every time I’ve closed my eyes (laughs).
Great. So keep the questions coming.
Sally Foote
It’s okay to open your eyes now.
Mark Littlewood
Don’t forget the format here. There’s a little Q&A button that you can put questions into; we’ll take them one by one.
Firstly, just thank you very much for the talk. Thank you for having the forethought to make it so board game oriented, because that’s given us a few ideas about ways to spend more time with our family. Profitably. Right.
So Sally, please just tell them all to open their eyes. I think this is the challenge we have.
But yeah, some fantastic thoughts and some really, really interesting things. Going to dive into questions straight away.
This is Ed, who I think is in Brighton. I really do look everybody up on LinkedIn when they register; it’s nice to know who they are and where they’re talking from. So I think Ed’s in Brighton.
Sally Foote
Amazing.
Mark Littlewood
Well, that’s basically Brighton; anywhere is, in my highly parochial view these days. Just because I did geography doesn’t mean I was there for that lesson.
“What percentage of your time/budget do you spend on user research? How do you convince the business to start investing in user research? And how do you demonstrate ROI?”
And I’m going to caveat that with a couple of things. Percentages maybe aren’t the right way to think about this, just because I think the budget for something like GoCompare is essentially infinity compared to a 10-person company’s.
Sally Foote
Not quite.
Mark Littlewood
But it’s closer to infinity.
Sally Foote
Well, it’s actually not. Yeah.
Mark Littlewood
That’s a mathematical and philosophical discussion. But yeah, how do you sort of think about that?
Sally Foote
Yeah, I think there are a couple of things, right. So I’ll start with how we look at it: in the annual budget cycle, we include a line for research, and then we decide how we’re going to allocate that across the year based on how our strategy evolves or what we’re particularly looking at. But we also balance that line across different types of spend. A lot of the research I’m talking about there you might do with an agency, or you might do yourselves, but we also budget within our research line for tools that we might use, so for example for specific UX testing, where you’re specifically looking at someone using your site, versus the more behavioral type of stuff that I was talking about, where you might use more of an agency. I think the key thing here is to not feel constrained by your budget.
There’s so much that you can do that is almost zero to very low cost, from using platforms to watch people, to building panels from your customers that you can then talk to and watch what they’re doing. Use your teams to do it: if you haven’t got a big budget allocation, then encourage your teams to do it. I’ve also been through a kind of transformation, from believing initially that research was best conducted by researchers to the complete other side: I’m now very, very insistent that designers and product people are deeply engaged with the research, because I think they often get different insights out of it than someone who has designed a script might get out of it, and it’s really, really important that they’re closely involved. In terms of demonstrating ROI, it’s often easier to demonstrate the ROI of not having done the research: projects that fail or don’t develop, where it’s clear that you’ve not landed. All you need is one of those and you’ve got a very good case. Then there are the immeasurable bits, the successes that you’ve missed out on because you weren’t working closely enough with your customers. Generally, you’ll find it’s a tough thing to do, but it’s those examples of where something really hasn’t landed, and it’s because you haven’t done your work.
Mark, you’re on mute.
Mark Littlewood
You were right. Great. We’ve got loads of questions coming in, and we’re going to skip through some of these. “Can you tell us a bit about how your research team is set up and how you balance in-house versus agency research?”
Sally Foote
Great. Sure. So we have a head of what we call customer experience, and she heads up our design and research team. We currently have one internal researcher, a customer research manager, within my team, and she looks after all of our agency engagement plus all of our internal tooling. But her main responsibility is making sure that we’re getting the right insights from all of those places, that we’re cross-fertilizing insights so that we’re connecting the dots between the stuff we’re seeing, and also that we’re interpreting what we’re seeing, communicating it back into the teams, and taking action on it. Thinking a little bit more about that ROI question: I didn’t talk about this in my talk, but one of the worst things you can do is to spend money on research and then not use it. So focus what you’re doing on a specific problem that you know you’ve got on the roadmap and that you know you’re going to be tackling, focus your research on that, and then action it to make sure that you’re making the most out of it.
Mark Littlewood
These are great. “How does luck play a part in getting the product right for your customers? Can you leverage what we understand about the part luck plays in more substantive ways than doing more and better research?” This reminds me of David Brent’s approach to employment, which is avoiding employing unlucky people by throwing half your CVs away before you’ve read them.
Sally Foote
About that one, yeah. I’m slightly less of a believer in luck as an element in this, and I probably would say that given what I do and my career. I think the key thing to do is to break your research into multiple phases, and this also partly answers the question about how you ask customers about something new. The first phase is the behavioral research, so really understanding what’s going on for customers and, out of those insights, understanding the problems that they’re trying to solve or what they’re trying to do. The bit that comes in the middle, the bit that I would substitute for luck, is the creativity and the ideas that you generate off the back of that, before the next phase of research, which is more of a concept test: where you’ve got a bunch of ideas, you take those in and test them, in multiple phases through your development. So I’d argue that this is really a completely anti-luck policy. It’s about not using luck to do product development; it’s about rigorously moving through a series of options and working through those to get to a result that you feel much more confident about.
Mark Littlewood
So many good questions. “Testing takes time; that board book was in your daughter’s nursery for three months. How do you balance the need for research with the desire for early-entrant advantage?”
Sally Foote
Yeah, that’s a great question. I think it has to be proportionate to what you’re trying to do. That book very quickly became one of Photobox’s biggest products for the year, and there was another one that launched with it, so it basically introduced two new lines. So it was absolutely worth the amount of effort that we spent testing it. It was also pretty new to the European market; there were American providers doing a book like that, but we knew we were going to be broadly first to market within the UK, and because it was going to be such a key product, we had to spend the time. I mean, physical products are a little bit different, right? With digital products, the sooner you can get something live to your customers and get actual customers on it, the better. With physical products, you have one chance to set up your production line, and you want your production line to be set up right from the start, so it’s a slightly different process there. With digital products, I’m much more in the camp of using false door tests or Wizard of Oz tests or anything else you can think of, email campaigns, etc., to gather early feedback and to get people onto your product or onto your concept as early as possible, rather than the kind of long lead times that you might see with a physical product.
Mark Littlewood
Brilliant. We’re going to crack through, there are just more and more coming in. “If you’re looking at a new market, what would you recommend for starting the market mapping process? There are so many ways you can cut it, you just get lost in data and end up down a rabbit hole.” That’s from Flopsy Hornby.
Sally Foote
Yeah, a complicated question. There are loads of great tools and techniques out there for this. I’d recommend Blue Ocean Strategy for looking for a space with low competition, or where you can gain competitive advantage, and then obviously mapping that to the core of what your business does. If you connect with me on Twitter or LinkedIn, I can send you some other frameworks that I’ve used with other businesses before; I’m happy to share some of that stuff. But it’s probably another whole talk just on that question. Yeah.
Mark Littlewood
Angling to get invited back?
Sally Foote
I’ll be too busy with my broadcast career. Yeah.
Mark Littlewood
Well, you know, I’m your agent now. So, okay, thank you. We’ve got loads here. I hope maybe we could set something up in the break, potentially, Sally, so people can come in.
Sally Foote
Oh, send me a message on Slack if we don’t get through your question. I’ll do my best to answer; I can tell you we’re not going to get through them all, even in five hours.
Gosh, how do you encourage developers to get involved in research?
Tomorrow, or rather Wednesday and Thursday, we’ve got two days of research, and our whole squads are invited, and I’m really happy to see loads of them there. We’ve also opened it up to the wider company, so the brand teams come, and our partnerships teams, who work with our insurers, come. It’s a really useful opportunity for anyone to get close to their customers. And the fact that we’re doing it all over video conference makes it easier for people to do it from their desks, and they don’t have to go to a research venue and stuff like that.
They really like it. Are they up for it, are they excited about it? Yeah.
We’re moving to a squad model, and this is very different from kind of pure software teams. Everyone in the team is responsible for thinking about and being involved in the decision about what to do next, right? Because that is a balance between the problem you want to solve for customers, the best way to execute the solution to that problem, and then how difficult and complex it is to develop that solution. So we want everyone in our teams to feel like they are there to solve for customers, and they get goaled on a customer goal, a customer OKR, and that’s what they’re all working towards. So yeah, they are very interested in understanding, and a lot of the great ideas in our teams come from our developers and software engineers.
Mark Littlewood
Great. Again, cracking on. “Completely agree with not slavishly following the outcome of tests. Any tips on how you can handle the absurd accusation”, and I think you mean from a colleague or manager, “that you’re not being objective, that you’re picking and choosing until you get the outcome you want?”
Sally Foote
I don’t think this kind of research can necessarily be fully objective. You are trying to solve for the balance between what people say and what your business does, and what you’re trying to find is the intersection between what you can do, what you believe you should be doing in line with your strategy, and the section of the population or your customers that that works for. But ultimately, it’s all in the results. So, particularly with software, the sooner you can get people onto your idea in a live kind of way, the better you’ll be able to demonstrate that the thing you’re proposing is correct. The other question that comes up a lot is around the volume of people that you talk to. This is certainly one of the challenges that I’ve had in many organizations: you’ll say, oh, we’ve drawn this conclusion and this is what we’re going to do, and then people will say, but you spoke to six people, how can you have drawn that conclusion? And I think, again, it’s about being very clear within your organization, explaining that when you’re looking at the behavioral side of things, for opportunities to explore, you probably only need to speak to six or ten people, because the same behaviors start to emerge and you can see the patterns. Then the key thing is to develop some prototype or idea, that concept, and then to start developing the concept. And that’s why early low-volume research, and then higher volume on the concept once it’s a bit further down the line, is so, so important. Right?
Mark Littlewood
Apologies if you have questions; I’ve got a little button which means I get to dismiss them, and it’s not because I dismiss them as unimportant or uninteresting questions, they all are interesting. I’m triaging, and there’ll be some follow-up here. “What are your thoughts on product testing when it comes to two-sided markets? Should you test both markets at the same time?”
Sally Foote
Absolutely, absolutely. I have done a number of big B2B projects, primarily for B2B publishers, and we’ve done two things. One is we’ve worked directly with the people who are going to be the consumers of that product, and in a B2B setting you can go and sit at their actual desk and look at how they’re using it and what they need to be doing, which is actually really great. We also did a number of projects where we chose organizations to be our build partners on that project. And this is certainly something that we would be doing at GoCompare: working with a small number of our insurers who are interested in innovation, and working on a concept with them to bring it to market, while at the same time we would absolutely be testing that with consumers. We are a marketplace, and whenever you’re looking at that situation, you have to be working with both sides, because in the same way that we wouldn’t do anything that didn’t work for our insurers, it also wouldn’t be successful if it was just solving for our customers. Our job at GoCompare is to find the balance between those two things.
“Do you do a lot of A/B testing at GoCompare?” We do.
At any one time, we’re running two or three different A/B tests, depending on their size and scale and the volumes of the audience. We use Optimizely as our platform; it’s really good, and I’ve used it elsewhere. We’ll run tests on everything from UX changes to new features when we’re ready to launch them. We also test our PPC ads and how they land onto our main landing pages through the A/B testing tool; we might test different versions of copy and that type of thing. Cool.
Mark Littlewood
Okay. I have one final question to leave you with, but I think there are a couple of things I’d like to say here. I’d love to get you back to do another session. I mean, I know there’s loads going on, but there are clearly lots of people that have things to ask you about, and lots of things kind of flying around. So one of the things that we’ll be doing post this event is actually running a series of, well, maybe we’ll become a SaaS-based subscription business where we have kind of weekly or monthly or bi-weekly, or whatever it is, sessions like this. So I think it’d be awesome to get you back in and take some questions with a bit more time.
But our final question, from Chiara: “How do you ask questions or do research when you’re leading customers into a new area where they haven’t had a solution before?” It’s a great question: how do you choose between A or B when customers don’t know what A or B is?
Sally Foote
The key thing is to split out the behavioral research. So when you’re looking at a new area, what’s the thing you’re solving for? Really understand what’s going on for customers, and then separate that out from the concept testing. For example, we’re doing a piece of research tomorrow on renewals around car insurance: how people think about that, what’s going on for them, what information they have available to them, how they go and look at who they’re currently insured with, how they think about the price, etc. And what we will not do in that is test any of the ideas that we’ve got, because we don’t know that those are good ideas yet; we’re just trying to understand what the space looks like and where the big problem areas are. Then we’ll almost certainly go back for a second round, where we will basically say, here are some ideas that we can work through (they might just be pen sketches at that point), and we’ll start to look at where some of the concepts are. And then there’ll probably be a third round of testing where we evolve and develop that.
And then ideally, we would quickly start to try and get people onto some of the big concepts that we might be looking at. That’s the key thing when you’re trying to do something new: the worst thing you can do is go in and ask people, do you want this? Do you want this? Do you want this? You have to start with, okay, you’re in this moment, you’re trying to do this thing, what’s going on for you? What have you got in front of you? What information have you got? What’s hard, what’s easy? Really understanding their behaviors. Brilliant.
Mark Littlewood
Okay, Sally, thank you so, so much. I can’t say what a privilege and a pleasure it’s been. I can’t wait to get you back, and if we can set you up in a Hangout room later on, for the lunch break or something, I know there’ll be people that will be dying to come in and say hello. Everyone’s watching on their own, so you’re probably not going to hear this, but everyone’s just going to applaud.
Sally Foote
Thanks so much. Um, please do feel free to send me messages on the Slack channel or on LinkedIn or Twitter. And yeah, I’ll do my best to come back to you all over the next few days. Thank you so much. Really appreciate the time and thanks Mark for keeping this all together.
Mark Littlewood
Thank you. Such a pleasure, such a privilege. Thank you so much. That was Sally Foote, ladies and gentlemen.
Sally Foote
Now VP of Ecommerce at GoCompare, Sally has been part of the team behind some high-profile digital product launches at the Guardian, the award-winning Times and Sunday Times paywall, BT Sport and Photobox.
Sally has a product background and leads eCommerce at GoCompare overseeing digital marketing and product development. She loves working at the strategic execution level – creating business visions and goals and translating these into executable products that deliver tangible value for organisations and their customers.
Next Events
BoS Europe 2025
31 March – 1 April 2025
Cambridge, UK
Grab early bird tickets until TODAY
Spend time with other smart people in a supportive community of SaaS & software entrepreneurs who want to build great products and companies.
BoS USA 2025
To be announced soon
Raleigh, NC
Learn how great software companies are built at an extraordinary conference run since 2007 to help you build long term, profitable, sustainable businesses.