S: Welcome to episode number 121. Conversion rate optimization is today’s topic, with a big emphasis on testing. That’s right, A/B tests or split tests are going to be your new best friend, thanks to today’s guest, Justin Rondeau. Justin leads the marketing and growth team at DigitalMarketer, the digital marketing BMF that produces the Traffic & Conversion Summit. Justin, welcome to the show.
J: Thanks for having me, Stephan.
S: Let’s start by talking about testing, because there’s a lot to conversion rate optimization, and testing seems to be kind of the foundation of it: figuring out what things to test, and in fact whether to even run a test. I think it’s something that people need to think about too. Do I even need to run this test, or should I just do the thing that I think is common sense?
J: Yeah. I think testing is very much a foundation of optimization, but testing in itself, if we’re gonna be doing it to any sort of statistically significant level, requires time. A lot of people don’t have as much time as they need to get these things done, which is why you do actually have to rely on things like common sense. What you’re looking at is: is this something that you test, or is this something that you just apply? One of the things I like to do when I’m thinking about this is going, “What’s the upside of this change that I’m doing? What’s the potential improvement?” Specifically if you’re not doing any sort of reactive type of response. Say all things are looking pretty good on your landing pages or within your funnel, and you go, “I feel like changing this,” or, “I wanna try something new,” or you have a hypothesis like, “Oh, if I change this, it should improve this, and I’ll know it when I see this.” Before you allocate resources to it, you have to take a look at the upside. Is the change worth the amount of effort put in? Is that change even worth the effort of splitting traffic and potentially sending 50% of the traffic to a worse-off version of the page? I think people need to get a lot better at recognizing what is worth testing. Generally, it comes down to areas where there’s a bit of ambiguity or risk: you don’t know what the actual outcome will be, or you don’t have a good enough data set to predict what that outcome would be. Yet a lot of people will test fully foundational things, which are things you should just change overall.
Whether it be something within messaging, where you can look between different levels of targeting to see if there’s a message mismatch. You don’t test, “Oh, our headline is different than the promise that was put out in this advertisement.” Of course you’re gonna have an issue there because there’s no congruence. You don’t test that, you just change it.
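Justin’s point that statistically significant testing requires time can be made concrete. The sketch below uses the standard two-proportion sample-size approximation from textbook statistics (not anything DigitalMarketer-specific) to estimate how many visitors each variant of a split test needs before you can trust the result:

```python
from statistics import NormalDist

def sample_size_per_variant(p1: float, p2: float,
                            alpha: float = 0.05, power: float = 0.80) -> int:
    """Visitors needed in EACH arm of a two-variant split test to detect
    a move from baseline conversion rate p1 to rate p2 (two-sided test)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # ~1.96 for alpha = 0.05
    z_beta = z.inv_cdf(power)            # ~0.84 for 80% power
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
                 + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return int(numerator / (p2 - p1) ** 2) + 1

# Detecting a 20% relative lift on a 5% baseline conversion rate:
n = sample_size_per_variant(0.05, 0.06)
```

Detecting that 20% relative lift takes roughly 8,000 visitors per variant; smaller expected lifts push the requirement far higher. That is the “is the upside worth splitting traffic” calculus Justin describes.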
S: You basically fix things that are broken. You don’t run a test to confirm, “Oh, yeah, I should fix that broken thing.” If it’s against best practice, and it’s just not a good situation for the user, or it creates confusion or what have you, just fix it.
J: Yeah, exactly.
S: What would be some of the most obvious kinds of tests that you would recommend people do? For somebody who’s not into the realm of conversion rate optimization, what are some basic tests that they would start off with?
J: Generally, I think there are really four things that break down when you’re looking at any sort of optimization. Any time that you’re seeing lower conversions on a page, there are only four ways to really influence it: your offer, the articulation of that offer, the targeting, or the design of the page. It turns into an offer problem more often than not. People need to look at whether their offer is actually attractive and useful, and whether they’re articulating it that way. I’d be testing different messaging and different things there. If you’re looking for a just-fix-it item and you don’t wanna do any sort of research whatsoever, one of the things I’d recommend doing is taking a look at the form fields on your lead gen forms as well as in your carts, and using different mobile input types to make them easier and more accessible. If they’re typing in a phone number, it doesn’t bring up a standard QWERTY keyboard, it brings up just numbers; if it’s a credit card, it just brings up numbers; or if it’s an email address, you have the @ and . on the first screen of the keyboard versus having to hit shift or something like that. That’s just an easy win. But generally, it comes down to making sure that your offer and your articulation of that offer are correct, which is what people generally test, I think, most of the time.
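The mobile input-type win Justin mentions maps to a handful of standard HTML attributes. A minimal sketch (the field kinds and the rendering helper are hypothetical, but the `type`/`inputmode` values are real HTML that triggers the keyboards he describes):

```python
# Map a logical field kind to the HTML attributes that trigger the right
# mobile keyboard. The field names here are illustrative.
KEYBOARD_HINTS = {
    "phone": {"type": "tel"},                          # dial pad
    "card":  {"type": "text", "inputmode": "numeric",  # digits-only keyboard
              "autocomplete": "cc-number"},
    "email": {"type": "email"},                        # keyboard with @ and .
}

def render_input(name: str, kind: str) -> str:
    """Render an <input> tag with the keyboard hints for its field kind."""
    attrs = {"name": name, **KEYBOARD_HINTS.get(kind, {"type": "text"})}
    attr_str = " ".join(f'{k}="{v}"' for k, v in attrs.items())
    return f"<input {attr_str}>"

print(render_input("phone", "phone"))
# <input name="phone" type="tel">
```

On iOS and Android, `type="tel"` and `inputmode="numeric"` swap the QWERTY keyboard for a numeric one, and `type="email"` surfaces the @ and . keys, exactly the friction reduction described above.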
S: What would be an example of a test on an offer? Let’s say that you’re providing some sort of subscription service. Would you test first month free, first month $1, a free trial, or double your money back? What sort of things would you test with an offer?
J: You can try all sorts of things there. You hit a bunch of them: there’s no trial, so let’s start a 14-day trial. How about 30 days? Just start for $1 and then pay the full price point going forward, or anything like that. Those are different kinds of offer tests. You can look at different types of bundles as well. Something that you think is inherently valuable, or that has a very high value to you as the person who runs the business, might not have that same perceived value, so you use different bundle options to make it more attractive. Also, if it’s a perception issue that you’re dealing with, then that’s page layout and message articulation that you need to look at, rather than just the offer itself.
S: Yes. There’s this concept of risk reversal that I think is important for our listeners to understand. They’re taking a risk as a potential buyer of your products or services. They’re risking, I don’t know, what’s gonna happen with that credit card number once I put it into your system? Are you gonna rip me off, or am I gonna have to spend hours on hold trying to get a refund, or what have you? Ways to reverse the risk, so that it feels like the risk is on the vendor, are really, really powerful and important. One of my favorite examples of this is a guy named Clint Arthur, who has an amazing risk reversal. Not only will he give you a full refund if you buy his Celebrity Launchpad, which at $10,000 is a big investment, but if you don’t get at least three TV appearance bookings during that weekend of doing the work with him, guaranteed, he will give you all your money back plus $1,000.
J: That is a heck of a risk reversal. That is compelling.
S: That is super compelling. Any words of wisdom around risk reversal?
J: In terms of risk reversal, I don’t think I’ve dabbled too much in reversal. One thing I try to do is either mitigate the risk or take risk off the mind. I’m not working with deals at the $10,000 mark in the acquisition funnels and optimization campaigns I run. The risk threshold is fairly low at that point. We have to show people that they can trust us, which is where you bring in different trust signals: “Here’s our guarantee. We’re in the Better Business Bureau,” all this other stuff showing them that they can trust us. But the risk threshold is fairly low. I guess the highest risk, even for a product that’s $49 a month, is really whether your data is safe. You kind of have to hit that point. If you front-load risk too much, people might not take you that seriously. I’ve seen lots of split tests where people start using old school terms like “risk free.” When you start saying things like that, you’re pinging something in someone’s head. They’re like, “Oh, I didn’t even realize there was a risk to begin with.”
S: I agree. Yeah, that makes sense.
J: Especially at the lower risk thresholds. For the higher ones, absolutely you have to make a case, but for something at lower price points, it’s kind of a no-brainer.
S: When you say $49, that’s what you’re selling the DM lab for, DigitalMarketer Lab?
J: Yup. That’s our DigitalMarketer Lab product. That’s kind of our flagship subscription product. It’s really kind of been the backbone for DigitalMarketer since our inception.
S: You offer a $1 trial, correct?
J: Yep. It’s actually funny how the whole $1 trial started. We wanted to do a free trial, but due to tech limitations we weren’t able to offer something for free and then bill somebody a month later. We had to get them to spend something with us. Luckily, it worked out great for us despite that little tech snafu. It also fell in line with one of our main core tenets and things we teach here at DigitalMarketer, which is that whenever you’re moving people through the customer value journey, you’re changing your relationship with the customer. They’re a visitor, then they’re a lead of some kind. If you can get them with an entry point offer, like a low $1 offer, whether that be a $7 report or jumping into Lab for $1, just the act of spending any amount of money with somebody turns them into a customer. That dramatically changes the relationship.
S: In what ways? What happens? Because they spent that $1 with you, are they more willing to spend, let’s say, thousands of dollars with you to attend your conferences or go through your certification programs? What tangibly changes in that relationship?
J: As a customer, to go back to how we’re looking at risk reversal or risk aversion, at that point they’ve seen nothing bad happen, and they’ve also seen value. The whole idea is leading with value so that they wanna keep coming back and will continue to purchase. All roads go through our Lab product. They go through Lab, then they’ll purchase certifications, they’ll purchase tickets to our events. Some, if they identify as agencies or consultancies, are willing to become one of our certified partners. With all roads leading through Lab, the $1 trial has proven to be a very valuable tool in how we bring people through the customer value journey.
S: Right. You also sell these execution plans separately. They’re all part of the eat-as-much-as-you-want sort of membership site, but you can buy them individually beforehand at, what, $7, I think?
J: Yup. Actually, we used to run those as $7 funnels. We ran some tests for a while, and what we realized was, knowing that all roads need to go through Lab in order to make somebody a full-fledged DigitalMarketer advocate, we were shooting ourselves in the foot by just leading with that $7 report. So we tried a few different variants where we moved the offer of the DigitalMarketer Lab product up to compete with these reports. We had the full price report at $27. We had the $7 report as well, with the option that you can also get this report, as well as 35 others, access to our community, and some other things like that, if you just sign up for $1 today. The old funnel was the bad offer there, because if people didn’t purchase the report, they never even knew DigitalMarketer Lab existed. We wanted to make sure that they knew that product existed, because then we would be free to talk to them about it later on.
S: Yeah, that makes sense. The way that you pitch it makes a lot of difference in the value being portrayed. For example, with Amazon, if I go to a Kindle ebook page, along with the product description it will say if the book is part of Kindle Unlimited, which a lot of times it will be. It shows the price for the Kindle edition, the price for paperback, and then $0 if you sign up for Kindle Unlimited. It’s a great soft sell on their Kindle Unlimited plan. Did you do something like that with these execution plans? Like, “Okay, it’s $27. We’ll give it to you for $7, but it’s $0 if you’re part of our DM Lab, and this first month is a $1 trial.”
J: Yeah. Actually, we got the idea for this model because we had seen something similar when we were looking at Audible. Audible had an interesting page where I think everything was free when you created a subscription and got a trial. But since we’ve always been in that $1 trial area, we kept it as just ‘get it for $1 today’. I think some of the language was like, ‘you get all of this in addition to it’ or ‘this is included with your subscription’. But I think we steered a little clear of saying too many things like, “Oh, it’s free,” because we knew they’d have to spend at least $1. We wanted to make sure we were setting the expectations right upfront.
S: Okay, got it. Is there a sweet spot of price range for a tripwire? That’s the terminology you guys use for low-price but high-value initial offers like the $1 trial. Is there a point where it clearly breaks down in the testing, like, “Okay, over $17 and forget about it. Nobody’s gonna want to spend that kind of money”? What does the testing show in regards to the price point?
J: Yeah. It’s all a balancing act. It comes down to the perception of it. For our reports, we still see a decent amount of sales, though we don’t see the volume of people purchasing that we saw at $7 now that we have them at $27. But we are seeing around 20% more people taking the $1 trial of Lab than before. That was kind of a big deal; we took the hit on the revenue side there. I did find that if we put the price point at $47, the take rate was almost non-existent in an acquisition funnel. You can see the sequencing is all wrong there: “Hey, here’s something free. Now, pay me $50.” I think there’s just a breakage point there, and it can change between industries. You have to think about what feels almost like a throwaway expense, something that feels a little impulsive, where you’re like, “Yeah, I can get that.” That comes down to how we position everything on the page when we start looking at the price, which comes back to articulation. When we’re positioning these things at $7, we can say, “It’s just your large, fancy Starbucks coffee, or you could invest in your business.” It’s an easy way to make turning it down look ludicrous, because there’s so much value in it. Whereas when you start saying $47, it’s like, “Oh, your week’s worth of coffee,” or something like that. It gets a little bit harder to show that saying no is…
S: Kind of trivial.
J: Yeah, exactly.
S: Do you do a lot of comparison charts or matrices? “Here’s the best value version,” and you make that one bold, in a different color, or a larger size?
J: Yup. We’ve done that a few times, and it just works. There are a few other tricks that people like to try out, like ordering from the highest price point down to the lowest to avoid sticker shock. I haven’t seen that move the needle. Generally, everything seems kind of the same now when it comes to pricing tables: the middle option will always be the most popular, the highest option will always be the best value, and those types of things. But those types of visual cues help people not think. By the time they’re on your website and trying to make a decision, they’ve likely made more than enough decisions for the day, and they just need to be told what to think sometimes. It’s a matter of, “Okay, let’s make this as easy as possible for their brain.” Could you imagine, at the end of a long day… Isn’t there a number of decisions people can make in a single day before they’re just like, “Ahh, I don’t care anymore”?
S: Yeah, I know what that is. There is decision fatigue, and there’s also a finite amount of willpower that people have. If you keep the cookie jar out, it will eventually wear you down.
J: Mix those two together, my gosh. But it really does come down to this: you need to make it easy to make a decision, because you don’t know what that person’s day or experience has been like when they access your website. Sure, you might know what device they’re on or what time of day they’re looking at it, but you have no idea what their mindset is. You have to make it as simple as possible for people to understand what it is they’re looking at, what it means for them, and what to do next.
S: Yeah. You said try not to make the user think too much. That reminds me of a great book by Steve Krug, Don’t Make Me Think, are you familiar with that book?
J: Yes, I am.
S: Yeah. There’s a whole chapter in that dedicated to focus groups, which might as well segue to my next point: what are you doing in terms of focus groups? They can be impromptu: just spend $50 or $100 buying pizza for people, get them in a room for an hour or two, and watch them trying to navigate your website and purchase things. What kind of stuff are you doing in terms of focus groups to inform your decisions?
J: Yeah. I don’t use focus groups a lot. I break down the concept of gathering data, specifically user data, into two main buckets: active user data and passive user data. I’ll define them first. Passive user data is data where the user is unaware that they’re in a tested environment, whereas active user data is data where they are aware that they’re in a tested environment. Whenever people are aware they’re in a tested environment, they’re gonna act very differently than when they’re not aware of that fact. I’ve always found that if you are incentivizing people to give you information about a product, it’s generally not that useful. Furthermore, they don’t know what they don’t know, if you get what I’m saying there. There are two issues. One, if I’m putting the questions together for the focus group, or really anything like that, I can easily frame it any sort of way. I’m not malicious or anything; maybe I’m just unaware of the fact that I’m framing things at all, which is not great. On top of that, people in those scenarios will generally try to appease the person running it. It’s called the Hawthorne effect: they’re gonna give the response that feels most likely to be the correct one to whoever’s administering the survey or focus group. You also get people that are just there for the reward. Finally, people in general don’t know what they don’t know. In some regards, when you’re doing product work or anything like that, you still need to be the innovator. That’s why this stuff is difficult. You can’t just rely on listening to a group of 20 people and go, “Okay, I got it.”
J: You need some level of innovation, and you have to take risks, but you take mitigated risks.
S: Yeah, that makes sense. But you do do some surveying of your users?
J: Really rarely. I actually have a whole thing on this. Surveys, the way they’re generally done, kind of just suck. It comes back down to that active and passive testing. I will do surveys that I can quickly connect to quantifiable data, or that are highly factual and very, very clear. I try to keep out any level of interpretation, by both the person administering the survey and the people taking part in it. For example, I wanted to know if there was a lifetime value difference between people who are in DigitalMarketer Lab and our private Facebook group versus people who weren’t in the group. There are a lot of community managers out there saying things like, “Hey, you need to invest in community because people spend more with you and do all sorts of great things with your company,” and then you go, “Well, do they?” You’d assume they would because they’re more connected to you, but do they actually? We wanted to figure that out. We ran a survey that just waited for people who logged into DigitalMarketer Lab; a little thing popped up in the corner and asked them one question: “Are you in DigitalMarketer Engage?” In parentheses: our private Facebook group. It’s a yes or no, no interpretation there; they’re aware of Engage, it’s fine. If they said yes, it just said, “Hey, real quick, could you just type in your email address?” If they said no, the same question was asked, but based on whether they said yes or no, a different tag was applied in our CRM, and then we were able to look up the lifetime value of these people. We did find that overall, they do spend more money with us and they stick around longer, which is nice. It did show the power of community. But it wasn’t really a traditional survey in that regard. Another one was from when we were launching a new product. I was trying to figure out what to charge for it, because pricing would be based off of different web sessions.
I didn’t know what the most popular cohort would be. I took a look at DigitalMarketer Engage again, then just went on digitalmarketer.com and put up a survey that asked, “How much monthly unique traffic do you get?” Then I gave them ranges. I didn’t say “a lot” or “a whole lot” or anything like that. I gave them ranges from 0 to 10,000, going upwards to 200,000+ or 500,000+, something like that. It was funny. I wish I’d taken the results more seriously when I started creating the different pricing plans, because 80% of the people that answered that survey said they got between 0 and 10,000 unique visitors. When I made the pricing plans, I didn’t need to be as generous as I was with the page view allowance, but I was very generous at the start. It turned out that 80% of people are in the lowest plan because it fit their traffic numbers. I could’ve used that a little bit better.
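The Engage survey analysis Justin describes, tag each member by their yes/no answer, then compare lifetime value per cohort, can be sketched in a few lines. The emails, tags, and order amounts below are made up for illustration; only the shape of the analysis comes from the interview:

```python
from collections import defaultdict

survey_tags = {            # email -> tag the micro survey applied in the CRM
    "a@x.com": "in_engage",
    "b@x.com": "in_engage",
    "c@x.com": "not_in_engage",
}
purchases = [              # (email, order total) pulled from order history
    ("a@x.com", 490), ("a@x.com", 995),
    ("b@x.com", 490),
    ("c@x.com", 490),
]

def avg_ltv_by_tag(tags, orders):
    """Average lifetime value (sum of orders) for each survey cohort."""
    totals = defaultdict(float)
    for email, amount in orders:
        totals[email] += amount
    by_tag = defaultdict(list)
    for email, tag in tags.items():
        by_tag[tag].append(totals.get(email, 0.0))
    return {tag: sum(vals) / len(vals) for tag, vals in by_tag.items()}

ltv = avg_ltv_by_tag(survey_tags, purchases)
# {'in_engage': 987.5, 'not_in_engage': 490.0}
```

The key design point is the one Justin makes: the survey collects a single factual yes/no, and everything else (LTV, retention) is joined in from quantifiable CRM data rather than self-reported answers.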
S: Interesting. Are you using a particular survey software for that? You’re running Infusionsoft on the backend, right?
J: Not anymore. I think some of our legacy funnels still do some stuff with Infusionsoft. We’ve been using LimeLight for our processing and cart, then Maropost for all of our automation and those types of things. We moved off of Infusionsoft last year; it’s still selling some of our products, though. What we use for surveying and those things is actually a DigitalMarketer product called True Conversion. We ended up buying it about a year ago. It does all the passive testing stuff that I was talking about: heat maps, session recordings, funnel analysis, form analytics, and those things. But it also has micro surveys, which are those surveys that pop up in the corner, the really nice and easy yes/no ones that you can just move on from, and then longer form surveys as well.
S: Cool. How does that compare with its competitors? There’s a bunch of software out there that will do…
J: Oh, yeah. There are tons of different tools like that. I’m super biased because I helped make this tool, so I think it’s better. We definitely invested way more on the passive testing side of things, where the level of segmentation and granularity you can get to is actually more useful. We’ve also built, well, this one’s not released yet, our smart funnel tool, which makes funnel tracking a lot easier. What we have there is, I think, probably the most powerful thing on the market. Say you were looking at any type of heat map: normally, you can’t really break it down any further than the average data set. We’ll actually let you choose different segments to draw it from, based off of the inputs in the data set that we have, so you can get a better look at what’s happening, which I think is pretty neat. We’ve also included a fully custom-branded agency model, where any email coming from the application looks like it’s coming from whoever did the custom branding. It’s great for our agency partners, who happen to be one of our main customer avatars at this point.
S: That’s the smart funnel tool that you’re talking about?
J: No, anything within the True Conversion tool itself. It could be any in-app report, where it looks like it was built and created by whoever is branding it, or anything further down.
S: Nice, alright. Speaking of smart funnels, let’s talk about different kinds of funnels and how you might optimize each of them. Let’s say that you’re a consultant, a coach, or some sort of service provider, and you need to get people on the phone in order to make the sale. Maybe the price point is too high, or the offering is too custom, or maybe you just get a much higher conversion rate because you give them a free session before you sign them up. That would call for an appointment funnel, for example. What would be some of the ways to optimize a funnel where you’re just trying to get a phone appointment with the prospect?
J: There’s a lot of ways you can do that. When people are looking at optimizing conversions, it always comes down to the quality versus quantity thing. But I can tell you the tactic we use for one of our particular customer avatars, optimized for traffic we’re already getting. We don’t ask for the phone number up front. What we do for any of our lead magnets, anything that is for lead generation purposes, just information, things like our Facebook ads pamphlets, is ask people for first name, last name, and email, then two binary segmentation questions. Those are self-identifying questions that are just a yes or a no. We ask them, “Are you an agency serving small businesses?” Yes or no. These are both optional, by the way. Then we ask another one that says, “Do you manage a sales and/or a marketing team?” Yes or no. Once they put their information in, they’re gonna go through the Lab funnel, because we believe that all roads must go through Lab. But say they say they’re an agency or consultancy serving small businesses, our highest value customer avatar. On the backend, they’re routed into a sequence in HubSpot where an email comes from one of our sales reps asking them to reply. Within that reply, the rep sets up a Calendly link, and then they just get on the phone. Rather than doing what is the general conversion killer when you’re looking at quantity, which is asking for a phone number upfront, we went for a more passive approach with leads we were already getting. We were able to really, really boost lead flow for our MDRs.
S: Right. When you say that it’s a conversion killer to ask for the phone number upfront, to what degree? Is it gonna kill 90% of the potential conversions that would’ve come in, with people not filling out the form because you’re asking for a phone number?
J: I don’t have a good percentage drop for that, because all marketing is is messaging and sequencing. If you’re asking for the phone number at the right time, they’re gonna put it in. If there’s a high enough perceived value, a low enough perceived risk, and a high enough reward on the other side, that’s when they’ll put in more information. The number of form fields you have, as well as the type of form fields you have, are directly connected to the perceived value and perception of risk.
S: I really like your quote, by the way. You said that all marketing is just messaging and sequencing. Is that something that you came up with on your own, or is that something that you heard from somewhere else?
J: That is from Ryan Deiss; I’ve heard him say that. I think I heard him say it a couple of years ago at one of the events we were at, and it popped into my head on Friday when someone at an event asked a similar question. I was like, “Gosh.” I think it’s very, very true that it just comes down to those two things.
S: Yeah. It kind of reminds me of something Tony Robbins says that, “All business can be broken down to marketing and innovation.” I like that one too.
J: That is a good one.
S: Yeah. What do you guys use HubSpot for?
J: We tend to dabble in lots of different tech. But we’ve been using HubSpot because we’ve invested in a sales force at the company for the last 14 months, building out and training a sales team here. That’s what we’ve been using HubSpot for: their sales enablement tools, plus some of their marketing tools to do sales-enabled marketing. Take the thing I was just explaining to you with the binary segmentation questions: that’s going into HubSpot now. We’ve built marketing campaigns in the backend there that then connect into their sales funnels and pipelines.
S: Yup, got it. I’m curious why you guys migrated away from InfusionSoft? I’m using InfusionSoft myself. It’s such a robust although kind of confusing platform. What was the driver for switching off of that?
J: We just kind of reached limitations. We grew up. Infusionsoft was great for us as a growing company, as we were exiting our teen years. But we knew we were hitting limitations in what we needed to do with some of the data, and in how things would play with some of the internal tech that we’ve built ourselves. We had to find things that were a bit more open at that point.
S: That’s cool.
J: Nothing wrong with Infusionsoft. It’s great. We were on it forever, and if we were to launch anything else again, we’d likely just use it. But as you get larger, it does lose some of, I think, the power and ease.
S: Let’s go back to this DM Lab situation now. You’ve got somebody who opted in for the $1 trial. They’re happy, they’re staying with it, it’s $49 a month. What are you doing to make sure that they stay, that you retain them? You might do a short micro survey asking, “Are you participating in the Facebook group or not?” If they are, then they’re getting ongoing value, but if they’re not, their retention’s probably gonna go down. What are you doing to actively retain these ongoing members?
J: That is the million dollar question. That’s actually one of our biggest initiatives in 2018. We’ve always had some early, low-fi email sequences that would go, “Hey, first do this. Then do this.” But we do wanna bring in onboarding and in-app walkthroughs. We also wanna go really deep into the user data and try to catch people before they start showing signs of potentially cancelling, whether that be not logging in for an extended period of time, and reward people more when they’re actually doing things. If they’re completing a lot of execution plans in there, maybe give them a little bit of love, point out that they’re doing it, or suggest moving to the next spot. That’s a big part of 2018 for us. We’re all about digging into the actual user behavior data and using it as a predictive measure. That’s the big thing.
S: Are you gonna gamify the experience more?
J: That’s not off the table. We have these monthly growth meetings that I lead with the heads of each department. We take in ideas from people within the company, who have to write them down, give a detailed explanation, build out a hypothesis, and score it based off of its impact, its confidence, and its ease. Finally, they have to say whether it impacts acquisition, activation, monetization, or retention. Those are multiplying factors for the score, which gives a, not exact, but objective result; there’s no subjectivity in terms of why different projects get picked. When it comes to the stuff we’re looking at for retention, one idea that continuously comes back is gamification. But to what level we go, I’m not too sure. It’s still a little early to figure that one out.
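The growth-meeting scoring Justin outlines is a variant of ICE (impact, confidence, ease) with funnel-stage flags as multipliers. A sketch of how such a score might be computed; the 1-10 scales and the per-stage weight are assumptions, since the interview doesn’t give the exact formula:

```python
def score_idea(impact: int, confidence: int, ease: int,
               stages: set[str]) -> float:
    """Score a growth idea.

    impact/confidence/ease: assumed 1-10 scales (ICE-style).
    stages: which of {'acquisition', 'activation', 'monetization',
    'retention'} the idea touches; each touched stage multiplies the
    score (the 0.25 weight per stage is a made-up illustration).
    """
    base = impact * confidence * ease
    multiplier = 1 + 0.25 * len(stages)
    return base * multiplier

# A hypothetical gamification idea touching retention and activation:
gamification = score_idea(7, 5, 4, {"retention", "activation"})
# 140 * 1.5 = 210.0
```

Whatever the exact weights, the point of the mechanism is the one Justin makes: every idea gets a number computed the same way, so project selection isn’t a subjective debate.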
S: Okay. It’s a big opportunity for you guys.
J: Oh, absolutely.
S: Let’s talk about your process for defining customer avatars. Just to give you an example of taking this to the nth degree, it’s so powerful. I don’t remember if it’s Adidas or Nike that does this, but they have full size, life size posters of each of their avatars. Let’s say they’re teenagers, so they have a life size poster of each, and then they have their locker next to that poster. It’s an actual high school locker. You can open it up and it’s decorated. Say it’s a girl’s locker, it’s decorated inside or outside or both. There’s all their gear, their sports gear, their book bag, and their lunch. All of their stuff is right there, so you can really get immersed in their world just by opening their locker. It’s so cool. I just think it’s such a powerful example of defining your customer avatar to a whole new level. What do you guys do?
J: In terms of avatars, we don’t go to that level of having them up all around the office. The big things we have up all around the office are old homages to direct response marketing, as well as our core values as a team and organization. In terms of avatars, that’s actually something we probably should get up on the wall. One of our trainings is actually built around how to develop a customer avatar. Russ is probably a lot better at that than I am, and Ryan too. Avatars, I’d say, aren’t my strength in terms of developing them. I can understand them once they’re put together. The big thing for us when we’re building very specific avatars is we like to ask: what are the interests or details that only a hyper evangelist, a really hardcore person within that customer avatar, would understand? If you’re going after golf, most people would pick Tiger Woods and think, “Let’s start here.” My grandmother likes Tiger Woods and she’s not a golfer. You have to get more specific. I don’t know any other golfers, so I’m not fit to finish this metaphor, but I think you get what I’m saying. It’s about getting into specificity. Whenever we build out our avatars, we like to look at, based on the avatar we have, what’s the before and after state of interacting with our brand and our products.
S: What would a before and after state look like for an avatar?
J: First, we break down before and afters by have, feel, average day, and status. What do they have? Let’s say it’s a marketing manager or marketing director. For the “have,” before, they have an unorganized team that can’t communicate, or a semi-trained team that can’t communicate, something like that. That’s what they have, what they’ve inherited. After they use the product, they have a team that’s trained, uses the same vocabulary, and can do the tasks assigned. That’s the “have.” When you break these down by have, feel, average day, and status, then when you’re writing the copy or articulating your offer, it gets that much more powerful. Have versus status, there’s a multiplying factor in terms of persuasiveness there. Then you look at feel: before, you’re frustrated that you can’t get your team on all of your projects, because they don’t know the terminology well enough, or they’re slowing things down because they don’t know how to do the job. Once things are streamlined, you’re feeling a lot more confident in your team and in yourself for growing your organization. Your average day changes too. Before, you’re digging through reports that are done haphazardly, that you have to constantly double check and triple check, so you’re essentially babysitting and can’t do your own job. After, you’ve streamlined communication and projects, and you can start doing what you’re paid to do: innovate and come up with new ways to grow the company. Finally, there’s status, and this one’s not gonna be that great, but you go from being the boss to kind of the super boss. You have everything together. You have evangelists within your organization who have seen what you’ve been able to do. You’re able to do all of that with that product.
S: Yeah. You get to go from a manager to a leader.
J: There it is. That’s the one.
S: That all goes into the copy to inspire them, to poke at their pain points, and to stretch that gap between where they are currently and where they would like to be or where you can picture them or have them picture themselves.
S: Yeah. It sounds very similar to the four forces exercise that I learned from Taki Moore. You’ve got four quadrants: a frustrations quadrant, a fears quadrant, another quadrant for wants, and another for aspirations. The idea is that one axis is whether it’s in the here and now or in the potential future, and the other axis is whether it’s something they’re moving towards or moving away from. Fears are in the future: unrealized potential that hasn’t come to pass, and something they’re moving away from. Whereas a want is something in the here and now that they’re moving towards. You identify what they’re frustrated by, what they’re wanting, what they’re aspiring towards, etcetera, until you’ve populated items in all four of those quadrants, and you work all that into the copy. That triggers them in all sorts of different ways. If you’re not doing this, you’re probably thinking only in terms of their frustration, or maybe just their fear, and you’re not hitting them from all angles. Have you heard about this particular exercise?
J: No, I haven’t.
S: It’s really cool. It’s very powerful. Awesome. Let’s go back to where we started on the testing. Let’s share some more of the quick hit sort of test and things. I know you present a lot at Traffic and Conversion Summit. I’m sure you present at other conferences as well. I’ve heard you speak multiple times about some easy wins, kind of obvious tests, some things like that. I’m wondering if we could delve a bit deeper into some of these tests. You’ve already given a couple of ideas at the beginning of things that we could test. You mentioned trust seals, something that you could incorporate into the site to do some testing. By the way, are you familiar with SafeSite Certified?
J: Yes, I am.
S: Yeah. I know Chuck Mullins. He shared a statistic that was pretty compelling about just adding some of this SafeSite seals to really increase your conversion rate. What have you guys seen as far as impact of incorporating trust seal?
J: I think with trust seals, there have been some really interesting studies done, both from a testing perspective and an academic perspective. One of the big findings is that only three seals are really noticeable by consumers. They tested which ones seem most trustworthy and which ones people notice. It was McAfee, Norton, and a couple of others that hit the list. I’ve also seen cases where, if you’re not using a well-known one, it can hurt things, because people go, “Oh, what is this?” Two, if you’re using the wrong seal at the wrong time in the customer journey. I saw someone use a traditional ecommerce trust seal on a lead gen form once and it completely tanked conversions. It was a payment-type seal used for lead generation, which made no sense. People who recognized that seal normally identified it with something that had to do with a payment, so it goes back to that whole, “Oh, risk free. Wait, there’s risk,” scenario.
J: Then also, if you’re just using too many of them. If you have a metric ton of them down the page, it’s like you’re saying, “Trust me. No, seriously, trust me. Come on, trust me, trust me.” I very rarely trust the person that says that four times.
S: Right. Or the person who brings up how much integrity they have. If they bring that up then you know they don’t have any.
J: Exactly. It’s like, “Oh, I’m so smart.” It’s like, “ Nah. I don’t think you are. Smart people don’t say that.”
J: Trust seals are important. Really, when I’m breaking down landing pages or anything like that, it comes back down to looking at the offer, your form, and your CTA. There are really two types of landing pages. There’s the click-through page that gets them to take an action: add something to a cart, fill in an order form, or move further down into the funnel. Then there’s lead generation, which is a form. There you’re looking at form optimization: the right types of form fields and the right number of form fields. Generally, if you know you’re gonna need more than three fields on your form for a qualified lead, you can probably get up to as many as seven bits of information before the conversion rate really starts to tank. From what people have seen, conversion rates decline from three to four fields, but stay relatively stable until the seventh field. You can probably get higher quality leads if you’re running into those problems. Again, you should test that out, but it’s worth noting. If you’re ever having a problem with messaging, or you’re just not sure why things aren’t converting, I use a tool from UsabilityHub called Five Second Test. It shows your site, or whatever page you wanna show, to some people, and then asks them a series of questions. I generally wouldn’t go more than two, because five seconds is not a long time to digest anything. Generally, I wanna ask, “What did you see? What was in it for you? What do you do next?” That will tell you really quickly if you have an articulation issue. It’s super valuable, it works with mockups, and if you’re in prelaunch, it’s definitely worth using.
S: Oh, that’s nice. What you’re saying is, when people are looking at a web page for the first time, they’re thinking WIIFM: what’s in it for me? They’re also thinking, what’s the next step? What do I do next? If there are too many potential next steps, then you have the paradox of choice. There are just too many things to choose from, decision fatigue sets in, they don’t make any decision, and then they leave.
J: Exactly. Definitely recommend doing that. We haven’t launched anything like super, super brand new recently where I’ve had to do that. But I know when we were launching a new variation of the blog, I tried a few of those things on there to make sure that we were on point while we were still in development mode.
S: How do you get statistical significance with a test or survey like that, with, say, Five Second Test? You can’t just run it on a small amount of traffic.
J: No. I look for between 100 and 200 responses. Then you look at the word cloud and start seeing, “Am I starting to see similarities?” If it’s spread out way too far, generally what that tells you is no one knows what they’re doing, so that actually helps. If you’re not getting any consistency across the board, then you say, “Okay, I’m definitely not articulating anything on this page,” but if you’re getting a single answer or similar threads, then you’re likely doing it well enough. Really, the problem comes when you have two equally opposing ideas with an equal amount of representation in that experiment, which should be unlikely. Generally, you don’t want small numbers, but the whole game of experimentation at companies is risk versus reward. Whatever experiment you run, whether it be purely qualitative or hugely quantitative like a split AB test, you have a clear exploration period and a clear exploitation period. The longer your exploration period, the shorter the exploitation period, and vice versa. People will spend a lot of time trying to reach a higher level of statistical significance that feels less risky, but if you do that, you might have a shorter period of time to actually reap the rewards, because designs change, traffic influx changes, all sorts of things change over time. Time is ever present. Be aware of that when you’re thinking about levels of statistical significance. We’re not research institutions, we’re businesses.
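Checking whether 100-200 free-text responses converge on a theme can be as simple as a frequency count. This is a hedged sketch of that idea, not a UsabilityHub feature; the 40% consensus threshold is an assumption:

```python
from collections import Counter

def dominant_theme(responses, threshold=0.4):
    """Return the most common normalized response if it accounts for
    at least `threshold` of all answers, else None (no consensus)."""
    normalized = [r.strip().lower() for r in responses]
    counts = Counter(normalized)
    top, n = counts.most_common(1)[0]
    return top if n / len(normalized) >= threshold else None

# Consistent answers suggest the page articulates its offer clearly;
# a flat spread suggests "no one knows what they're doing" on the page.
responses = ["a free course", "A free course", "some marketing thing",
             "a free course", "newsletter signup"]
theme = dominant_theme(responses)  # "a free course" (3 of 5 = 60%)
```

In practice you would also normalize synonyms before counting, but even this crude tally makes the “single thread versus scattered answers” judgment Justin describes less subjective.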
S: Right. You shorten the potential runway by taking too long with these AB tests and so forth.
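On the quantitative side, one standard way to call a split test is a two-proportion z-test. This is a minimal stdlib-only sketch with made-up example numbers, and as Justin notes, the significance cutoff you pick is a business decision, not an academic one:

```python
import math

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for conversions: returns (z, two-sided p-value).
    conv_a/n_a are control conversions/visitors; conv_b/n_b the variant."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal distribution via erfc.
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical test: 300/3000 conversions on control, 360/3000 on the variant.
z, p = ab_significance(300, 3000, 360, 3000)
# A p-value below your chosen cutoff (often 0.05) ends the exploration
# period; a stricter cutoff means exploring longer and exploiting less.
```

This formalizes the exploration/exploitation tradeoff: demanding a lower p-value buys confidence at the cost of runway to exploit the winner.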
S: Makes sense. What’s your feeling about sliders, carousels?
J: They’re terrible. They’re a lazy solution to a complex merchandising problem, or a complex offer articulation problem. A lot of those come out of people fighting for spots on the homepage. You’ll see it in larger companies too. Finally, you can appease everybody by saying, “Hey, we got you on the homepage, you’re the 7th slide.” There’s a five second interlude between them. I think the average slideshow is about five slides, at about 3.5 seconds per slide. If you’re trying to consume information and there is motion happening on the page, you’re not consuming anything. If you’re making multiple offers or trying to make multiple cases on the page, you’re not making a case at all. If you want people to consume content and find out where they’re going, do that in a less distracting way. I’ve even been playing around with, not traditional microsites, but self-segmentation on the front end, where it’s like, “What are you interested in?” At DigitalMarketer that would be: for marketers, for businesses, for agencies, pick which one you are, and that moves people to the right type of content. There’s an old, old-school test from Dell that always stands out in my mind. They had a slider with five potential panes on it. What they ended up doing was they put a grid on their homepage and let people pick. They turned their homepage into pretty much a category page for the product types. It won hand over fist.
S: Interesting. I’ve heard from so many experts in conversion rate optimization that sliders, carousels are death to conversion. How about videos?
J: Videos work well.
S: What about when you’re on a homepage and the background is a video playing without sound?
J: Oh. I haven’t tested those, actually. Those have kind of gone out of style. I don’t see those that much anymore. You’ll see them on video sites.
S: I don’t think so. I just saw one today.
J: You see them on video sites. I see them on video provider sites. Wistia, I think has that.
S: Well, it’s also like tonyrobbins.com.
J: Okay, yeah. There’s a few that do it. It’s a major distraction, but likely, it’s a unified distraction. It has a sole purpose.
J: That’s where I think that would ultimately beat out the slider.
S: Yeah. Speaking of things that are going out of style, what’s your feeling about parallax?
J: I still think it looks cool, but I haven’t seen a case for it. Like all design things, it goes in and out of style. That’s what I was saying: whenever you’re going through that explore, exploit period, you wanna be very time conscious. Parallax, I don’t really care for it, but it looks cool. I think it’s still somewhat of a distraction. It’s a fun feature but not very necessary. It doesn’t really help with anything. I probably wouldn’t use it.
S: I think it’s just kind of peacocking. Whatever. If you don’t have something really substantial, then I guess go for the flashy. Another thing that has thankfully gone out of style is the button that’s just an outline until you mouse over it, and only then does it fill in with a more vibrant color.
J: Oh, the ghost button?
S: Yes, a ghost button.
J: Ghost buttons work. They work so far as a secondary call to action. Similar to when people would have lines of text underneath the button, that would be optional. You’d be like, “Hey, get this thing now or learn more,” in text. If you’re gonna have secondary calls to action, totally cool, but if it’s your primary call to action, it’s not doing its thing.
S: Right. If it’s your primary call to action, like you want them to add to cart or watch a video, and it’s a ghost button, it kind of blends into the woodwork and they have to mouse over for it to pop. That completely misses the point, I think.
S: What about social proof? What are some of your tips for testing on social proof?
J: Testimonials are very, very important. If you’re in ecommerce, having reviews, and genuine reviews at that, is super important. But it really comes down to this: make sure it’s not just a quote without a face. You need the face of the person. If you’re B2B, say why they’re relevant, who they are, what company they’re from, if you’re allowed to give out that information. Don’t go too overboard with the quotes. You wanna seem genuine. I’ve found that people make most of their decisions off of reviews and quotes that fall within the two to four star range, versus the huge advocates at five stars or the huge pessimists at one. It’s worth knowing that. But you need some level of testimonials on your site. That’s what’s gonna really move things. There are different tools out there. There’s this tool called Proof that shows things like, “Hey, such and such person bought this two minutes ago.” That’s a form of social proof. People know that other people are buying. They don’t wanna be the only person, because it goes back to that fear of risk and loss: they don’t wanna be the only ones getting duped, buying something that no one else has, like a piece of garbage.
S: Yup, good stuff. What about video testimonials? Are those important? Do people actually watch those?
J: People do watch those. It goes back to looking at how many persuasive factors you’re going to need on a page, based off of what you’re offering. For larger ticket items, like our HQ product, we have video testimonials and customer stories on there. They’re very, very persuasive, and a lot of people watch them. In general, we don’t have that for our flagship product, because it’s only $49 a month. There, we just need a list of testimonials from people who have been using the product and have seen success with it. It’s weighing the price and the value, and what you need to push people over the edge to make a decision.
S: Yeah. For something like DM Lab, you might use shorter written testimonials. Would you do those as like Facebook screenshots, so if somebody posted…
J: Yeah. We’ve done Facebook screenshots on some of our product pages, we’ll do things like we’ll get them stylized where it’s not Facebook anymore. Traditionally, in all of our onboarding sequences and some of our pages still, we go with just the Facebook screenshots because everybody knows it.
S: It does have some sort of cachet when you take a screenshot from Facebook and use that instead of just copying and pasting the text.
S: One other little tip here that I’ll share with listeners when it comes to video testimonials: because not everybody will watch a video testimonial, I take the text of the testimonial. I have the video transcribed, and in that text, I highlight certain bits in yellow. If the person doesn’t read the whole testimonial transcript, they can at least look at the yellow highlighted bits that convey the key points.
J: Oh, that’s great.
S: Cool. Do you wanna leave our listeners with one last gold nugget and then we’ll close the episode out?
J: Yeah. The last thing I wanna say, and we touched upon it a little bit earlier, is kind of optimization sacrilege, and I’m glad you picked up on it. It’s something I really believe. The whole concept of best practices has generally been shunned by the optimization community, which I think is stupid, because best practices are really just common practices. It’s what’s common to your user and what’s common to their experience. You don’t wanna just go out there and say, “You can’t know what works,” because the fact of the matter is there are things like templates and basic user experiences that people are used to, so you can stick with those and start from a baseline. If you ever hear someone say best practices don’t exist, they’re a liar.
S: I agree totally. You’re not gonna design a car that has the steering wheel in the backseat just to be different, like, “Oh, we’ve got to test it and see if the steering wheel really belongs on the driver’s side in the front.”
J: Yeah. You absolutely don’t need to do that and if you did, you’d be fired.
S: Awesome. Where would folks go to connect with you and also sign up for DM Lab, because it is an awesome program.
J: You can just go to digitalmarketer.com. You’ll see right when you get there that you can sign up for an invitation to Lab. We actually do keep the community closed because we wanna make sure that we’ve got the right people coming to it, and we wanna make sure that there’s value at all times. You can request an invitation at digitalmarketer.com. If you wanna chat with me more, you can find me on Twitter, it’s @jtrondeau, and in the show notes you can link to my LinkedIn, I think it’s /in/jtrondeau.
S: Yup. I will include that in the show notes for sure.
S: Alright, awesome. Thank you, Justin. Thank you, listeners. Now, it’s time to take what you’ve learned and actually apply it in your business, on your website. Not only will we have the show notes for this episode on marketingspeak.com, but also an action list, a checklist of things from this episode that you can apply to your website to improve your conversions. Check all that out at marketingspeak.com. Thank you so much. We’ll catch you on the next episode of Marketing Speak. This is your host, Stephan Spencer, signing off.