This Week’s Guest:
Customers can be hesitant to sign up for your services if they perceive that there’s too much risk involved. However, using terms like “risk-free” can create more problems than it solves. Customers don’t always consciously worry about risk. Using this terminology makes them realize that they should be considering it as a factor, which works against you rather than for you. Avoiding the acknowledgement of risk is a no-brainer, especially at lower price points.
Joining me today to share many tips and insights is the remarkable Justin Rondeau. He’s here to dive into the depths of conversion rate optimization with an emphasis on testing. After listening to this conversation, you’ll be inspired and empowered to use A/B tests or split tests to help optimize your business. Justin is a uniquely qualified expert. He’s the leader of the marketing and growth team at DigitalMarketer, which produces the Traffic & Conversion Summit.
Find Out More About Justin Here:
In This Episode:
- [01:09] – Justin jumps straight into talking about testing, discussing when one should use testing and when common sense is enough.
- [03:36] – For someone who isn’t too familiar with the realm of conversion rate optimization, what are some of the most obvious kinds of basic tests that Justin would recommend that people do?
- [05:04] – Justin talks about some of the things you could try out with offer tests, such as 14-day or 30-day free trials. Stephan then talks about the concept of risk reversal, and offers a strong example.
- [07:05] – Justin hasn’t dabbled much in risk reversal. Instead, he either mitigates risk or tries to take it off of people’s minds.
- [08:41] – We hear the story of how the $1 trial for DigitalMarketer Lab started, when they actually wanted to do a free trial.
- [09:51] – What happens when someone spends $1? Is that customer then more willing to spend more in the future?
- [11:37] – Stephan explains that the way you pitch something makes a lot of difference in terms of the value being portrayed. He uses an example, and inquires whether Justin has done anything similar.
- [13:38] – Justin talks about what testing showed in regards to the price range that works for a tripwire.
- [15:15] – We hear some tricks that people use to encourage customers with pricing tables. Those visual cues help people not have to think, Justin explains, and make the decision as easy as possible for a customer’s brain.
- [16:52] – Stephan mentions the book Don’t Make Me Think, then asks Justin what he’s doing in terms of focus groups.
- [19:31] – Does Justin do any sort of surveying of his users?
- [22:27] – Justin lists some of the tools that they use on the backend after having moved away from Infusionsoft.
- [25:19] – What would be some of the ways to optimize a funnel where you’re just trying to get a phone appointment with a prospect?
- [27:56] – Justin credits Ryan Deiss with the idea that all marketing is just messaging and sequencing.
- [28:30] – What is Justin using HubSpot for?
- [30:17] – We hear more about why Justin migrated away from Infusionsoft.
- [31:33] – Justin explains the techniques he uses to retain customers after they stay beyond the trial period.
- [34:23] – Justin discusses avatars, which he says aren’t up all around the office. He then discusses what a before and after state would look like for an avatar.
- [38:02] – Stephan talks about a four forces exercise that he learned from Taki Moore.
- [40:17] – What has Justin seen in terms of impact from incorporating trust seals?
- [44:10] – Justin discusses how to reach statistical significance with tests like the ones he’s been describing, which can’t be run on a small amount of traffic.
- [46:09] – Sliders and carousels are a lazy solution to a complex merchandising problem or offer articulation problem, Justin explains. He and Stephan then discuss videos in the background on homepages, which have gone out of style, as well as several other design features that are no longer in style.
- [50:20] – What are some of Justin’s tips for testing on social proof?
- [52:53] – Stephan shares a tip about video testimonials, which addresses the fact that not everyone will watch the video.
- [53:29] – Justin shares a final piece of information for listeners related to best practices, and points out that best practices do exist.
- [54:46] – Where can people go to connect with Justin?
Links and Resources:
- Justin Rondeau on LinkedIn
- Justin Rondeau at DigitalMarketer
- @Jtrondeau on Twitter
- Justin Rondeau on Facebook
- Traffic & Conversion Summit
- Clint Arthur on the Optimized Geek
- DigitalMarketer Lab
- Don’t Make Me Think by Steve Krug
- Ryan Deiss
- Tony Robbins
- Taki Moore on Marketing Speak
- SafeSite Certified
Your Checklist of Actions to Take
☑ Invest some time in split testing to see what truly works in my CRO. Not every conversion strategy will work for my business.
☑ Evaluate my current CRO before spending more on resources. Eliminate strategies that are not successful.
☑ Understand the 4 influences of optimization: what you offer, how you offer it (articulation), targeting and web design.
☑ Optimize my web form fields. Make it easy for people to navigate when they subscribe or purchase.
☑ Offer products or services that don’t require big commitments. Use free trials, discounted memberships, or limited-time offers to increase conversions.
☑ Add a compelling risk-reversal tactic to my campaign to build more trust with my customers. Offer money-back guarantees or full refunds for unsatisfied customers.
☑ Use deliberate language throughout my sales process. Test headlines and calls to action to see what works best for my customers.
☑ Use charts and infographics to help site visitors easily understand my offer. The faster they understand my offer, the sooner they can buy it.
☑ Keep my message simple and highlight only the best products or offers on landing pages.
☑ Place social proof all over my site. Use testimonials, reviews, “as seen on” logos as well as verified payment methods.
S: Welcome to episode number 121. Conversion rate optimization is today’s topic with a big emphasis on testing. That’s right, A/B tests or split tests are going to be your new best friend, thanks to today’s guest, Justin Rondeau. Justin leads the marketing and growth team at DigitalMarketer, the digital marketing BMF that produces the Traffic & Conversion Summit. Justin, welcome to the show.
J: Thanks for having me, Stephan.
S: Let’s start by talking about testing because there’s a lot to conversion rate optimization and testing seems to be kind of the foundation to it, figuring out what things to test, in fact whether to even run a test. I think it’s something that people need to think about too. Do I even need to run this test or should I just do the thing that I think is common sense?
J: Yeah. I think testing is very much a foundation of optimization, but testing in itself, if we’re gonna be doing it to any sort of statistically significant level, requires time. A lot of people don’t have as much time as they need to get these things done, which is why you do actually have to rely on things like common sense. What you’re looking at is: is this something that you test out, or is this something that you just apply? One of the things I like to do when I’m thinking about this is going, “What’s the upside of this change that I’m doing? What’s the potential improvement?” Specifically if you’re not doing any sort of reactive type of response. If all things are looking pretty good on your landing pages or within your funnel, and you go, “I feel like changing this,” or, “I wanna try something new,” before you allocate resources to it, or you have this hypothesis like, “Oh, if I change this, this should improve that, and I’ll know it when I see this,” you have to take a look at the upside. Is the change worth the amount of effort put in? Is that change even worth the effort of splitting traffic and potentially sending 50% of the traffic to a kind of worse-off version of the page? I think people need to get a lot better at recognizing what is worth testing. Generally, it comes down to areas where there’s a bit of ambiguity or risk.
You don’t know what the actual outcome will be, or you don’t have a good enough data set to predict what that outcome would be. Meanwhile, a lot of people will test the fully foundational things, which are things you should just change overall. Whether it’s something within messaging, where you can look between different levels of targeting to see if there’s a message mismatch or anything like that – you don’t test, “Oh, our headline is different than the promise that was put out in this advertisement.” Of course you’re gonna have an issue there because there’s no congruence. You don’t test that, you just change it.
S: You basically fix things that are broken. You don’t test that to see that, “Oh, yeah, I should fix that broken thing,” if it’s against best practice and it’s just not a good situation for the user or it creates confusion or what have you, just fix it.
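Justin’s point that statistically significant testing takes time and traffic can be made concrete with a quick calculation. The sketch below is a minimal two-proportion z-test in Python; the visitor and conversion numbers are hypothetical, not from the episode, and real testing tools handle this math for you:

```python
# Minimal sketch: checking whether an A/B test result is statistically
# significant, using a two-proportion z-test (hypothetical numbers).
import math

def z_test(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) for conversion counts of two variants."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF, via math.erf.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical split: control converts 50/1000 (5%), variant 70/1000 (7%).
z, p = z_test(50, 1000, 70, 1000)
print(f"z = {z:.2f}, p = {p:.3f}")
# With only 1,000 visitors per arm, p lands just above 0.05 -> not yet
# significant at the 95% level, even though the lift looks like 40%.
```

A 2-point absolute lift that isn’t significant on 2,000 total visitors illustrates why Justin leans on common sense for small-traffic changes and reserves formal tests for cases with real ambiguity.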