In David’s 2011 talk at BoS he covers, amongst other things:
- Why is it important to create a data driven team? I think that if you develop a culture within your company around measuring and having clear feedback loops, everyone knows how what they’re doing is impacting their business.
- Being data driven allows teams to fail faster, and to know when you're starting to fail with your products and services. Why is failing good? The guys from Rovio created Angry Birds. Angry Birds is a phenomenal success by any measure: financially, culturally, a huge success. But Rovio failed fifty-one times before having the success with Angry Birds. Everyone talks about Angry Birds as an overnight success; it actually took fifty-one failures before it got to this.
- Data allows everyone in the business to see how the things that they’re doing have affected the key things that we care about – trying to get someone from being a prospect to being a customer.
Want more of these insightful talks?
At BoS we run events and publish highly-valued content for anyone building, running, or scaling a SaaS or software business.
Sign up for a weekly dose of the latest actionable and useful content.
Unsubscribe any time. We will never sell your email address. It is yours.
Transcript
So thanks for having me. Thanks for that intro, it’s the best intro ever; I’m going to have to play that for my wife.
So, I like to start every talk with this quote and, aka, prove that shit, right? So, it’s my favorite quote. So my name is David Cancel, you can find me online as DCancel, dcancel.com, Twitter DCancel. And uh, I’ve been very lucky too, it’s been my whole life basically creating things. Everything from companies to furniture, right I’m into woodworking, to kids. I have two great kids. A nine week old and a six year old, so I’m up late nights, which is fantastic.
So, I like to start every talk with this quote, a.k.a. 'prove that shit', right? It's my favorite quote. So my name is David Cancel, you can find me online as DCancel, dcancel.com, Twitter DCancel. And I've been very lucky; I've spent my whole life basically creating things. Everything from companies to furniture, right, I'm into woodworking, to kids. I have two great kids, a nine-week-old and a six-year-old, so I'm up late nights, which is fantastic.

And, lots of different companies. Most recently I was CEO and founder of a company called Performable, which is a marketing software company. When I first met Mark and started to talk about doing this talk, we were still Performable. If you've been here for the full three days, you know that Dharmesh has been going after every company that's speaking. So now, we got acquired in June and we are part of HubSpot, as is 140 who spoke yesterday. Things there have been going great. I've always been considered kind of like the data guy. Most of my companies have had something to do with data or marketing or analytics. Now being at HubSpot has taken it to a level that I haven't seen before. They're so data driven that it kind of scares me a little bit, and I have to pull back.
So basically, data geek, I like creating companies that have kind of large data at their core. The easiest way to do that, for me, has been marketing-type analytics companies. Before we get started and start to talk about how you can create a data driven company, let's just set some ground rules. Everyone has come here to learn. There are some great books out there. I read Derek Zebra's book, which was awesome. I read it a couple of times, actually on my phone, so I'm happy to have a copy. I love reading, I love learning. But when you leave here today, you have to remember to implement these things; you have to act. It's not enough to read, to talk. This sign was actually in one of our Performable offices and it's somewhere in HubSpot now: you actually have to do things. This is Herb Kelleher from Southwest Airlines, it's a great quote. Talking and reading and dreaming alone are worthless, you have to act.
Which brings me to my next favorite. It's not a quote, but it's my favorite hashtag on Twitter: JFDI.
For those of you who don’t know, it stands for
‘Just Fucking Do It’, right?
Say it over and over and over again. I love tweeting about start-ups and about data and this and that. But most of the people that I talk to after these talks, or on Twitter, become so obsessed with just the material and reading and trying to discover the secret. So at the end of the day, for me, I just kept emphasizing that you just have to do something. Even though it's scary, you have to actually do something. Even though it's painful, you just have to, just fucking do it. [laughter]
So if you remember anything along with my face, it's 'Just Fucking Do It.' [laughter] And now my teams, from Performable and now at HubSpot, which are product engineering teams, could basically create a chat bot for everything I've said up to now. 'Just Fucking Do It', 'Do it Faster', 'Just Ship It', 'Do Both', that's a favorite one of our engineers. 'Should we do this or that?' We have to do both, right? It's not a perfect world. We have to do both. But just fucking do it.
The other thing you should walk away with is that even though we are going to talk about data, and I love data, data alone is useless. Right? The point of creating a data driven team is not just to collect data; it's about optimizing your business for learning, not data. Data is what influences your learning. Data is all about validating your assumptions. We did a lot of A/B testing stuff at Performable, we had a site called abtests.com, and I spent a lot of time talking to people about testing and found that a lot of folks thought that testing was going to give them the magic answer. Right? At the end of the day, testing is nothing new, and it's all about validating your assumptions. You have to bring assumptions that you then prove or disprove. That's what testing is good for. It's not going to make decisions for you. So enough ranting. I'm good at ranting. I could do that forever.
Let’s talk about data.
So why is it important to create a data driven team? I am obsessed with data driven teams, and the version I'm thinking a lot about now is customer driven teams. I'll talk a little bit about that. But I think data driven teams are stronger. I think that if you develop a culture within your company around measuring and having clear feedback loops, everyone knows how what they're doing is impacting the business. If they don't today, then you have to develop the right metrics for them to understand what they're doing and how that impacts the business. Whether it's support, sales, marketing, everyone is measured and everyone knows how they're impacting the business.
Again, the people that hate this the most: engineers. Engineers do not like to get measured. This is a hard one. And, you know, lines of code and all that kind of stuff doesn't really work. For us, what we do is tie ourselves to churn, since we're a SaaS business, and then product usage, since we're online software. We look a lot at how the things that we're doing are impacting product usage. Are they impacting churn? The crappy part is that those two things take a long time for results to show up; it is not that immediate feedback. But everything else that we tried and started to look at, how many bugs we are closing out, what support counts are coming in, the metrics we came up with for trying to measure happiness, all of those things kept climbing up and to the right, and we kept celebrating. We kept thinking that we were doing a good job. But if you looked at churn and you looked at product usage, those things were flat. Even though we spent months developing new features that we thought were great and we were celebrating, there weren't any more people using the features, and the people who were using them weren't using them any more than before. At the end of the day, if it's online software that you're selling, customers have to use it for it to be valuable. And it's the same with churn; churn was kind of flat.
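Both of the metrics he anchors on here come down to simple arithmetic. A minimal sketch in Python, with all names and figures invented for illustration rather than taken from Performable or HubSpot:

```python
# Minimal sketch of the two metrics discussed above: monthly customer
# churn and product usage. All figures are illustrative.

def monthly_churn_rate(customers_at_start: int, customers_lost: int) -> float:
    """Fraction of customers at the start of the month who cancelled during it."""
    return customers_lost / customers_at_start

def active_usage_rate(active_users: int, total_users: int) -> float:
    """Share of paying users who actually used the product in the period."""
    return active_users / total_users

# Example: 1,000 customers at the start of the month, 25 cancelled,
# and 620 of the remaining 975 logged meaningful activity.
print(f"churn: {monthly_churn_rate(1000, 25):.1%}")   # churn: 2.5%
print(f"usage: {active_usage_rate(620, 975):.1%}")    # usage: 63.6%
```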
So those are the two metrics we try to spend a lot of time looking at. Now everyone, even engineers, understands what they're doing and how that impacts customers and, in the long run, churn and product usage. So, I mentioned A/B testing. There's the site, ABtests.com, a community site where you can upload tests. Great thing, right? Online testing is awesome, it's not new, it's been around, the direct mail guys have been doing it forever. We're just rediscovering it online. It's a great thing, I totally encourage testing. But again, testing alone is not going to lead you to the answers, right? Testing is just for proving and disproving your assumptions, and validating. So, I think you should always be testing, especially if you have online software, but just don't be stupid and think that it is going to solve your business. The good news about testing is that it is hard to be worse than the average, because the average conversion rate is two percent. No matter what anyone tells you, that is the average. Whether you are selling iPods or elephants, as Avinash from Google Analytics says (great quote), that's the average; that's what you are competing against. Any incremental change that you can make will get you above that.

So the other thing that I love about being data driven is failing. Failing, especially if you follow a lot of the stuff online, is like the big meme and people like to talk about it. Again, it's a thing that people talk about, but it sucks to go through. I've failed many, many times and I will continue to fail. Easy to talk about, hard to go through in business. But the thing that I like about being data driven is that it allows teams to fail faster, and you know when you're starting to fail with your products and your services. So why is failing good? Does everyone know what this is? Good. Especially if you have a kid, you know what this is. The guys from Rovio, who created Angry Birds. Angry Birds is a phenomenal success by any measure: financially, culturally, a huge success. But Rovio failed fifty-one times before having the success with Angry Birds. They're an 'overnight success', everyone talks about Angry Birds, but it actually took fifty-one failures before it got to this.
So this one, everyone should know. 409. Does anyone know why it's called '409'? Exactly. That was the 409th try before they got the formula right. And again, you will fail, especially in product development and in your companies, you will fail a lot. But if you have a way to measure that, and your team's maniacal about doing that, then you'll know when you're failing, and you will learn from each failure. And you'll know when to cut bait.
So, Harmonix, a little company: Guitar Hero. A lot of rockers here, it looks like. [Laughter] I see a couple of people in the crowd who have definitely taken the guitar from their kids. Guitar Hero, again, a phenomenal success. The part that no one talks about with Harmonix is that it took them ten years of failures before they came up with Guitar Hero. Since then, they've had a string of successes with subsequent releases, but ten years of failure.
Does anyone know what this is? This is actually, yeah, it's the wrong iRobot, whoops, that's the other one. This is a local company called iRobot; some of you might have the Roomba. Great company, they do a lot of military and commercial products. But it took them eleven, twelve years of failing before they became an overnight success with the Roomba. That was a long process of constant failure with their products before they came to success.
So the one thing I urge: if you do not have the discipline of being data driven in your company, you can start today.
And we will look at some ways you can start. To me the operating dashboard is the easiest way. I talk about this a lot. It's just a place where everyone in your business, again most people are uncomfortable with this, but being highly transparent, everyone, whether it's an intern or a VP, has access to the metrics that matter about your business. If you're a SaaS business, they kind of look like this: there are visitors to your site, there's usage, there's churn, there's session length and that kind of stuff. Depending on your business, it's going to be totally different. You don't have to spend a lot of time on the BI side or come up with fancy dashboards or whatever. For most of my businesses, we've used an Excel spreadsheet, right? It totally sucks, and everyone hates having to update that thing, especially as you get bigger, it gets totally nuts. But it's the quickest way to start, and only once everyone is fully on board with reviewing this every day do we then move on to creating real dashboards. But that takes years before we do that. So this is company-wide, everyone sees this, and that's the easiest way to start. If you're not doing that in your business then you should start today. And all of your key initiatives, back to being data driven, should map to the operating metrics that you're looking at in your dashboards.

So, funnels. Again, especially in online businesses, sales funnels kick ass. They're awesome. There's lots of neat ways that you can look at them.
You know, at HubSpot, we're obsessed with looking at funnels. Crazy obsessed. I thought I was obsessed with looking at funnels, but then I came to HubSpot. Crazy, crazy, there are funnels everywhere. This is an example. But again, for most of my businesses, they've kind of looked like this. We've made them in Excel or Google Docs, totally free, shared within the company, and we've updated them manually, but everyone in the business sees how the things that they're doing have affected the key things that we care about in trying to get someone from being a prospect to being a customer.
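The arithmetic behind a spreadsheet funnel like this is just stage-to-stage conversion. A rough sketch, with stage names and counts made up for illustration:

```python
# Illustrative funnel: counts per stage are invented, not real HubSpot data.
funnel = [
    ("visitors",  50_000),
    ("leads",      2_500),
    ("trials",       600),
    ("customers",    120),
]

prev_count = None
for stage, count in funnel:
    if prev_count is None:
        print(f"{stage:<10} {count:>7}")
    else:
        step = count / prev_count          # conversion from the previous stage
        overall = count / funnel[0][1]     # conversion from the top of the funnel
        print(f"{stage:<10} {count:>7}  step {step:.1%}  overall {overall:.2%}")
    prev_count = count
```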
So this is something that I had never done before coming to HubSpot, and I couldn't believe it when I saw it, because it is kind of the ultimate in accountability and transparency. We have an inside sales team, a very large inside sales team, and we share with everyone who wants to see it in the company, on a day to day basis, the performance of every single rep. So this is a blacked-out version. An email goes out that you can subscribe to: every single sales rep, what their quota is, what they sold month to date, how much above quota they are, how many units they delivered. Those are some product lines that we have, so there's a breakdown of each product line sold, and we have sixty-some odd sales reps. Everyone in the company can see exactly how every single sales rep is doing, including the ones who are not meeting quota and the ones who are way over quota. I've never taken it to this degree, and this is actually the best thing that I have seen. So everyone has this pressure internally on meeting their numbers, because it's not something that gets talked about three months down the line, or at the end of the month; it's something that people are looking at every single day. If I were ever to have another inside sales team, this is the first thing that I would do.
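A rough sketch of what a daily digest like that boils down to; the rep names, quotas, and figures below are entirely hypothetical:

```python
# Sketch of a daily rep-performance digest like the one described above.
# Rep names, quotas, and sales figures are all hypothetical.
reps = [
    {"rep": "Rep A", "quota": 30_000, "sold_mtd": 36_500, "units": 14},
    {"rep": "Rep B", "quota": 30_000, "sold_mtd": 21_000, "units": 8},
    {"rep": "Rep C", "quota": 25_000, "sold_mtd": 25_400, "units": 11},
]

print(f"{'rep':<6} {'quota':>8} {'sold MTD':>9} {'vs quota':>9} {'units':>6}")
for r in sorted(reps, key=lambda r: r["sold_mtd"] / r["quota"], reverse=True):
    pct = r["sold_mtd"] / r["quota"] - 1   # above (+) or below (-) quota, month to date
    print(f"{r['rep']:<6} {r['quota']:>8,} {r['sold_mtd']:>9,} {pct:>9.0%} {r['units']:>6}")
```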
So, the next thing that I would like to spend a lot of time on is churn performance.
Again, for an online SaaS business, this is something I've done with every business: sharing this data with everyone on the team so they can see how we're doing on churn. Back to engineers, this is what we look at, this and product usage, to understand whether what we're doing is affecting the life of the business.
So, does everyone know what NPS surveys are? They're Net Promoter Score surveys. We use them a lot, both internal and external: we have customer NPS surveys, and we have internal team NPS surveys about the team itself and about the product itself. So, would you recommend this place to a friend as a place to work? If not, why not? And also, how do the individual teams feel about the product? And of course, you know, the engineers are always down on everything. But the surveys are pretty simple to instrument. You could use Google Docs, like in this example, this is Raleigh software, where you just send out a very simple survey that says 'How likely are you to recommend this software to a friend?' And some of the qualitative feedback you get on why people rated it the way they did is some of the best feedback that I've seen. We use this constantly. We're doing it quarterly right now and trying to move to a monthly survey, and you start to look at your business like this: how is it doing in terms of how customers feel? And we look at this not only for external customers; again, we look at it on a team by team basis. So, is the IT team feeling better over time about the company? Is the dev team feeling better? Is the sales team feeling better? And then we start to dig in, we see outliers in that, and we look at the qualitative feedback to see why they're feeling what they are feeling.

So this is my favorite thing. Does everyone know what a cohort analysis is? No? This is the best thing to do once you have enough data; you can't really do it until then. What you're looking at here is basically how the new customers who are coming on over time are acting. If you haven't seen David Skok's blog, it's a fantastic blog; this is a cohort analysis example from his blog. It's just showing: are we getting better with new customer acquisition? Are they churning less? And we're looking at the aging of customers and how churn is changing there. I highly recommend doing that. Again, you don't need fancy software to do it, you can just do it in Google Docs or Excel. But if you are not doing that, then you do not know if the new customers that you are bringing in are performing any better. Are you getting better as a company or not? You have to look at it cohort by cohort. And we take this to a crazy extreme at HubSpot, heat maps and cohorts on everything, but we're looking at, you know, are we getting better as we progress with the new customers that are coming on? And this is a great thing to look at for your sales team, because you're looking at whether they are bringing in the right type of customers. We've seen times where they were really pushing and you could see the new types of customers that they were bringing in were actually worse, and then we went back and corrected what they were doing.
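Both of those calculations are simple enough to run in a spreadsheet or a few lines of code. A hedged sketch of the NPS arithmetic and a tiny cohort retention table; every response and retention figure below is invented:

```python
# NPS: % promoters (9-10) minus % detractors (0-6), on a 0-10 scale.
def nps(scores):
    promoters  = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

sample = [10, 9, 9, 8, 7, 7, 6, 4, 10, 9]   # hypothetical survey responses
print(f"NPS: {nps(sample):+.0f}")            # NPS: +30

# Cohort retention: each row is a signup month, each column is how many of
# that cohort are still active N months later (invented numbers).
cohorts = {
    "2011-06": [100, 91, 85, 80],
    "2011-07": [120, 112, 104],
    "2011-08": [140, 133],
}
for month, counts in cohorts.items():
    retention = [c / counts[0] for c in counts]
    print(month, " ".join(f"{r:.0%}" for r in retention))
```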
But you know, all of these things are great and I think you should implement all of them, and I am happy to talk more about any one of them, but what really matters is not just the data, not just the learning; it is also about creating a fantastic company like Zendesk that really cares about customers. It's about creating a company like Zappos, which is so customer driven around happiness. At the end of the day, it's really about creating happy customers. That's what being data driven allows you to do. Those are my two obsessions: being data driven and customer focused. So, this is what success will look like. Happy to take any questions, thanks for having me out. Just remember, my face and JFDI, Just Fucking Do It, thanks. [Applause] Thank you.
Audience: So you are saying that you shouldn't do anything that isn't going to affect the performance of the business, or the metrics. How would you treat disruptive technologies in a business like that, as they're going to be in direct contradiction to the metrics?
David Cancel: Absolutely, those examples there of Harmonix and iRobot, right, those were disruptive technologies that took years to figure out. But the thing that is important about being data driven, even for those businesses, is understanding when things are failing worse than you would have thought. So, looking for outliers, again, both things that are failing worse and pockets where things are succeeding better than you thought, and then digging into those pockets and spending time with those customers or those prospects to understand what aspects of the product are working best for them. So it's really a mixture of the quantitative stuff, but then following up with qualitative feedback to understand why things are working in different pockets. So I think, even with those businesses that took ten years, there was constant iteration and constant product development happening there, and they needed some level of data to understand when to cut bait on certain things and when to shift certain things they were doing and pivot into something new.
Audience: How often do you run your NPS survey with a customer and how long would it be before you would ask them again?
David Cancel: We ask each customer quarterly, but the surveys actually go out monthly: we have different cohorts, and each cohort is measured quarterly, but because we have multiple cohorts that we are surveying, we have surveys happening every month. And we keep looking at differences between cohorts.
Audience: David, over here David, thank you for that. You've shown us a lot of data driven metrics that you've used, and what I'd like to ask is, at the end of the day, what are the things that you and Dharmesh actually look at each day? Where is the effort? If you only have half an hour a day to put something together, what is the one metric that we should look at to try to drive our business?
David Cancel: Probably bad examples, but we look at all of these. We look at the operating metrics each day. We look at sales performance, that's key, so we can see when things are going off forecast. And then we're looking at churn. Churn we look at more weekly or monthly, but we're looking at usage metrics of the products each day, we're looking at the operating metrics, the ones that change daily, and we're looking at sales performance probably the closest each day, because we are a sales driven team.
Audience: David, in February this year, your NPS cratered down 6.3 or whatever.
David Cancel: Oh, that wasn’t ours, that was an example.
Audience: Okay, well, let's say that was yours. What would you do to figure out how that happened? And then what corrective actions would you take, if you could speak with some texture?
David Cancel: Sure, we have actually had differences and huge swings in NPS where we've had to do this. We immediately go out and start to talk to the customers who are part of those panels, the panels that we use. First we look at the qualitative feedback and see if there is any pattern that falls out; there's almost always a pattern, whether it's that the site is getting slow, there's downtime, something's not working, we see that come out there. Then our second step is to actually spend time with customers: we have user groups for our company around the country right now, and we actually send PMs out to the user group meetings, and they spend time with customers. We send out tech leads now; that's a new thing that I've done, where tech leads and engineers go out and spend time with customers. Again, something we did from day one at Performable, probably the best thing we ever did, was that every engineer was customer facing. They answered support, we rotated support, everyone did support, interns, me, everyone. Now we're trying to instill that practice at HubSpot and we've started by following up, mostly on NPS results, when we see differences in there. One thing that's great about the NPS stuff is having different cohorts, so you see them act pretty differently.
Audience: Related to what you were just talking about, when you send a survey, how many people respond and do you see a bias in people bitching about things, rather than saying yes, we’re happy in general?
David Cancel: Yeah. Data is all about biases; it’s all about trying to figure out the bias. We do, we see differences, and this is why we have multiple cohorts that we are using for the NPS surveys instead of one. Luckily, we have enough customers so that each pool is in the low hundreds of respondents so, we’re fortunate there, in that we have, I think the last one is two-hundred and some odd respondents. I’m not sure how many people they actually fielded it out to, that survey.
Audience: Can you talk a little bit more about the scale of data that you need to have, in terms of the size of the business, so your AB analysis and cohort analysis can actually reveal some sort of result with some confidence in a short period of time?
David Cancel: On the testing side, I mean sample size, the biggest thing that decides sample size and whether you can declare something a winner is how many variations you're testing. For most smaller businesses that don't have traffic, which is most, you can't really do things like multivariate testing. You read about multivariate testing with Amazon and Google and blah, blah, blah, but those are kind of useless and lead to a shitty experience for most small companies that I've talked to, because they don't have enough sample. They're totally pumped about doing a multivariate test, you talk to them, it's like, how long has that test been running? Seven months. What good is that test? A seven month test where there is no declared winner yet, because your sample is too low, because you've ended up testing thirty-five variations of something. What you really want to do is stick to a simple A/B type test where you don't have to worry so much about your sample size. There are plenty of free calculators online where you can calculate what size sample you need to declare a winner. Then, on the cohort analysis, that's really something where you need a good amount of customer information before it becomes useful, and it's going to really depend on your business. I'd say it's most important once you start to get into a steady, repeatable pattern in your business; that's when you can start to look at whether cohorts are changing over time or not. Sample size is an issue there, but not as big as on the testing side.
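The free calculators he mentions are roughly doing the standard two-proportion sample size estimate. A sketch using the normal approximation; the baseline rate and hoped-for lift below are assumptions for illustration:

```python
from math import sqrt
from statistics import NormalDist

def sample_size_per_variation(p1, p2, alpha=0.05, power=0.8):
    """Approximate visitors needed per variation for a two-proportion A/B test
    (normal approximation). p1 = baseline conversion rate, p2 = rate to detect."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # ~1.96 for a 95% confidence test
    z_b = NormalDist().inv_cdf(power)           # ~0.84 for 80% power
    p_bar = (p1 + p2) / 2
    num = (z_a * sqrt(2 * p_bar * (1 - p_bar))
           + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return num / (p2 - p1) ** 2

# Example: the "average" 2% conversion rate he cites, hoping to detect a lift to 3%.
print(round(sample_size_per_variation(0.02, 0.03)))   # roughly 3,800 per variation
```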
Audience: How many responses do you need to consider the NPS valid in terms of the statistics, and to get that many, how many requests do you send out to fill out the NPS?
David Cancel: It all depends, again, on sample size. For us, we have fifty-five hundred, six thousand customers, something like that, and we'll field it out so that we have over a hundred respondents before we start to feel good about it, and that's really just looking at a percentage of total customers. I don't know what the number is on how many they have to field to get that response rate, but it's actually a pretty responsive group, I'm surprised. Customers want to give you feedback, and if you're doing great, that's awesome; if you're sucking, they like to give that feedback too. One thing to be careful about with your samples is to make sure they represent your total customer base. We have a wide range of customers, from small businesses to large businesses; they look very different, the buyers are very different. We are very careful to make sure that sample is as representative as possible of our customer base. So you don't want to survey just a bunch of the smaller ones, or just a bunch of the larger ones; you want to try to balance that out.
Audience: So you say one hundred is enough to consider statistically?
David Cancel: For our business, we think that’s enough and that’s usually what we’ve been doing month over month. We have some larger panels in the new cohort that we’re tracking, but it’s been fine for us.
Audience: Hi. I've seen several curves this week that go zero to four million, they all have the same shape-
David Cancel: Yeah. Everyone likes that curve.
Audience: Those of us who are at the beginning, who are aspiring to the two percent conversion rate, are bringing the average down. The elephant sales guys are making six, and we are making the average two. How do you begin the AB testing process when you're at the very early stages of the decision tree, when you're trying to decide very basic, very early things? How do you approach AB testing in that, and how do you design an AB test when you are talking about something at such an early (?) you know? [coughing]
David Cancel: The very early stage is actually one of my favorite times to test, because you are searching for the right product, the product market fit; you're testing some pretty big assumptions there. What I recommend is sticking to simple message testing. That's usually what has the biggest impact. Everyone talks about the red versus green button, or whatever color of something on a page. Those things don't usually have a radical effect. It's messaging. At the end of the day your customer has to connect with the message or not; it has to resonate. And so I design those tests around really simple big messages, a big call to action, and test just one variable at a time. So if you're at the very beginning and testing the business model, I'm sure you have different markets you are looking at, different angles to the product. Start by testing one, then the other, where the differences between the two are pretty stark. Then go out and try to recruit some sample of customers, whether that's by buying some through advertising, or, I'm not sure how your business works, trying to recruit them online to give you feedback. What I'm saying is don't do that testing on your homepage; do that somewhere else, a dedicated place, because again it's all about trying to control that sample. You don't want people who already know your business to be part of that test. You want people who have no idea what this is, but are probably the typical buyer that you're looking for, and see how they respond. It's really a big, simple message test: do they click to learn more? That's really enough to understand. Do they care or not? That's really the care meter. Do they care or not?
Audience: Thanks a lot for the presentation, it was really good. I have one quick question. Do you ever advocate giving an incentive for people to respond to surveys? Like a five dollar gift card or something. If you don’t have thousands of customers, we have almost two hundred, would you ever say, okay just give them all a five dollar gift card or does that kind of skew things?
David Cancel: Doesn't work. Doesn't work. It might work sometimes if it's your customer base, but if you're recruiting a sample that's outside of your customer base, I definitely wouldn't do it. Those are professional survey takers who are looking to do that kind of stuff. And the type of customer that you want to answer this is usually not going to be motivated by a twenty dollar Amazon gift certificate.
Audience: So usually, I’m more interested in the customer base.
David Cancel: Yeah, exactly. So with the customer base, I wouldn't spend any time trying to do that. You want to try to recruit the people who are passionate enough about what you are doing, even if you suck at this point, to give you feedback, because they need you. And I think that's a good sign: once you start to have people who are willing to give you a lot of feedback, even in these survey recruitment things, there's some aspect of your business that's resonating with them, that they care enough about, even if you're not fulfilling that now. For me, it's all about launching products as quickly as possible, even if they fail, just to get that 'do people even care enough?' The biggest hurdle that we face in our businesses and our products, that no one talks about, is that most people don't care enough to switch what they're doing now, to switch their daily habits, to use your stuff. So the first thing to test is: do they give a shit, or not?
Audience: Going back to the importance of, say, meeting your customers, you were saying that (?), and I believe it was probably this morning you talked about meeting customers. If they come back and say all the customers love this feature, but they want to do blah, is the next step to go and quantify that, or are you happy to take customers at their word?
David Cancel: So this is a common question actually, because people think using customer feedback, or being customer driven, means the customer is designing software. Customers can't design software. That's your job, or products, or whatever you do. Being customer driven means we're trying to spend less time talking about features with customers and more time looking at how they work. For us, we're creating marketing software, so even though we take their feedback, we're really there to say: show me how you do this today, having nothing to do with what we do. Because that's actually when we have the great learnings, like, oh wow, they say it's easy, but for marketers it actually takes them fifteen steps, thirteen exports, this, that and the other, and that's what we're trying to fix in our product line.
Audience: So that makes perfect sense, but if the team comes back and says it turns out that it's taking fifteen steps and it's a real pain in the ass, do you take that, or is there still a part of you that wants to say, get the data on how many customers are suffering from this, or will you take them at their word and say we need to revise this?
David Cancel: We'll usually take that, then go to another user group, or another set of people whose attention we have, and actually validate it with them. It's less about data; this is more qualitative, kind of validating it, and then it's more about prototyping and getting something out there as soon as possible. For us it's easy to update, it's online software, and we see if it's actually having an impact. That's when the quantitative stuff comes in. Are they actually using it? How much are they using it? Up until then, it's really qualitative stuff that you're looking for. I think one last point is, you need to get it out there to get that quantitative stuff, because if you try to quantify before then, it's free. Everyone's just like, 'Would you like this feature?' 'Yes, absolutely, couldn't live without it.' 'It would cost you ten cents.' 'I don't need it.' Every time. [laughter] I don't know how many times I've done that. It's just like, 'If this feature is not here, I could not use this product.' 'Okay, it will cost you one dollar', whatever, just make up something. 'I don't need it anymore.' Okay, next. That's the easiest way to validate a feature: charge a dollar for it, and no one wants it. Were there any other questions?
Audience: So I guess my question is, what role does good taste play in this? Someone talked yesterday about how all logical decisions should go to the creative person. So if, for example, you found that setting HubSpot in Comic Sans gave you an increase in conversion rate, would you do that?
David Cancel: No. [laughter, clapping] No, we wouldn't do it. That's part of it; we have a great user experience team, and every one of our engineers, and everyone on the product side and everyone on the sales and marketing side, wants to create a product that they're proud of. So we're very maniacal about the quality of our product, because it reflects back on the team, both in the confidence of selling it and in how we feel day to day. So we won't do things like that, you know, changing all of our links to red or something like that just because it's going to give a big increase. There's a point at which good taste and discretion come into play.
Audience: A conference first, actually: a question from the live stream on Twitter. 'How many visits do you consider to be significant for AB tests?'
David Cancel: That totally depends on the variation between the two things that you are testing. There are free AB testing calculators out there, and it's easy to calculate in Excel. It really depends on how big the delta between the two tests is.
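In practice, 'enough visits' is usually judged with a two-proportion z-test on the observed difference. A minimal sketch; the visit and conversion counts below are hypothetical:

```python
from math import sqrt
from statistics import NormalDist

def ab_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for the difference between two conversion rates
    (pooled two-proportion z-test, normal approximation)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical: variation A converts 40/2000 visits, variation B converts 62/2000.
print(f"p = {ab_p_value(40, 2000, 62, 2000):.3f}")   # a small p suggests a real difference
```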
Audience: Hey Dave. I feel like for my product, we don’t have nearly enough insight in terms of what features people are using, or how much they are using, et cetera. But, when I think about tracking that, I’m thinking, there’s so many things to track, every single page, every single click, would your advice be to just start recording every damn thing, to see if it’s being used?
David Cancel: No. No, I think that's another place where people go off. It's not about recording every single click, it's about the flows that you're trying to encourage, or trying to solve for. At the end of the day, it's really flows that you're trying to solve for: my customer is trying to do X, my software involves these three steps, these three screens, and I start to measure those things and how successful people are, back to funnels, at getting through that process. You don't want to measure everything on a page. That's only good for some heat map analysis. That's fun sometimes, but it's not something you are going to look at every single day. It's really about measuring the flows that you care about through the software.
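His suggestion amounts to instrumenting the handful of named steps in each flow rather than every click. A rough sketch of what that could look like; the flow name, step names, and in-memory storage are placeholders, not any particular analytics product:

```python
# Sketch: log only the named steps of the flows you care about, then read
# the completion rate per flow. All names and storage are placeholders.
from collections import defaultdict

FLOWS = {"create_campaign": ["opened_editor", "added_content", "published"]}
events = defaultdict(list)   # user_id -> list of step names recorded for that user

def track(user_id, step):
    """Record that a user completed one named step."""
    events[user_id].append(step)

def completed(user_id, flow):
    """True if the user has hit every step of the named flow."""
    return all(step in events[user_id] for step in FLOWS[flow])

track("u1", "opened_editor"); track("u1", "added_content"); track("u1", "published")
track("u2", "opened_editor")

done = sum(completed(u, "create_campaign") for u in events)
print(f"{done}/{len(events)} users completed the create_campaign flow")   # 1/2
```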
Audience: So I guess the second part to my question would be, I’d be concerned that there is some correlation that I’m completely missing that users who do X are way more likely to convert. Is worrying about that just me being way too analytical or should I be?
David Cancel: Yes, I wouldn't worry too much about that. At the end of the day everything is correlated in some way, so it's easy to get lost there. I'd spend more time focusing, again, on solving these flows or pain points for your customers and measuring that funnel, and worry less about how this one independent feature is related to that feature, unless that is part of your flow. I don't know what your software is. It really is flow driven.
Thanks. I think that’s it, thank you. [applause]
David Cancel
David Cancel is a 5x founder and 2x CEO with four successful exits to date. Currently, he’s co-founder and CEO of Drift, where he’s helping marketers transform their relationships with customers to drive retention & growth.