Episode 112

SEO Audits Done Right with Bill Hartzer

Search engine optimization is one of those things that many people know a little bit about, but very few people know a lot about. If you run a small business with an online presence, you’ve probably heard about things like keyword density. But you may not know much beyond that. If you’re already an SEO professional, you know that there’s always more to learn! This week’s guest explores various SEO topics in a way that is sure to bring value whether you’re a beginner or an expert.

Bill Hartzer joins me today to talk about the nitty gritty of SEO audits. A good audit is a deep dive into the SEO of your site that results in a roadmap for success. It can give you insights into all the changes you need to make and the opportunities you’ve missed. Bill is an independent SEO consultant who has been working in the industry for over two decades. He’s a co-founder of DFWSEM and was a recent speaker at Pubcon on the topic of SEO audits.

In this Episode

  • [01:35] – Bill talks about some of the big gotchas and lessons from the topic of SEO auditing. He discusses how he starts the process of his audits.
  • [03:38] – There are certain types of information you need to gather when assessing why traffic might have dropped off.
  • [04:09] – Does Bill bother with Bing Webmaster Tools?
  • [04:52] – Bill walks us through more of his process. Stephan then clarifies what Screaming Frog and DeepCrawl are for listeners who may not be familiar with the tools, and Bill discusses the differences between DeepCrawl and OnCrawl.
  • [08:06] – Bill recommends using Google’s tools for daily issues, but the other tools he’s been describing for more complex cases, such as when traffic is going down or you have an issue that you want to diagnose.
  • [09:24] – We hear what Bill does next after using spidering tools.
  • [11:36] – Duplicate content is the bane of the SEO practitioner’s existence, Stephan explains. He and Bill then discuss DMCA issues.
  • [16:38] – Bill talks about his favorite page speed analysis tools.
  • [19:11] – We move on to analyzing HTML coding. A lot of what Bill does in this regard is manual, he explains.
  • [21:44] – Are there other favorite tools that will do HTML analysis? Bill can’t think of any, but lists a couple other potential issues that webmasters should be aware of.
  • [24:51] – What does Bill use for heatmap analysis? He answers, then walks people through how to use Google Analytics for this purpose.
  • [26:59] – Bill talks about some of the work that he does within the developer tools in Chrome.
  • [29:12] – We move on to another aspect of the audit process: external link analysis. Bill discusses how he handles this.
  • [32:47] – Are there any other tools that Bill is incorporating into his link audit?
  • [35:20] – Stephan recommends looking for lack of diversity when you’re doing a link audit.
  • [38:30] – Bill differentiates between manual and algorithmic penalties for listeners who may not understand why they’re so different.
  • [40:06] – Are there any other algorithmic penalties that Bill wants to mention other than Panda, Penguin, and Fred?
  • [43:58] – We hear about how big Bill’s reports tend to be, and how long it takes for him to complete the audits he does. He then talks about pricing, which typically runs from $500 to $1,500 for small business websites.


Every site needs an SEO audit, I don’t care how long it’s been around, how big the site is, whether it’s ecommerce or brochure, you need an SEO audit. It’s a deep-dive forensic analysis, it will roadmap out all the changes that you need to make, all the fixes, all the missed opportunities, it really is an essential component to success. It’s usually the first step in an SEO engagement, I know it is with most of my clients. Today’s guest is Bill Hartzer, he’s an independent SEO consultant, he’s been doing SEO for decades, since 1996 in fact. He’s co-founder of DFWSEM, the Dallas/Fort Worth Search Engine Marketers Association. I recently heard Bill speak at PubCon where I was also a speaker, his panel was on SEO audits. He did such a great job, I thought let’s get him on the show to geek out about SEO audits. Here he is, Episode 112 of Marketing Speak. Bill, it’s great to have you on the show.

Great to be here.

Let’s talk about auditing. You’ve done a lot of audits lately, SEO audits. You just presented on that topic at Pubcon. I was in the audience, I heard you speak and you did a great job. I’d love for us to share with our listeners some of the big gotchas, big lessons in the topic of SEO auditing.

Sure. I have a process that I go through typically to audit a site. I still like to start on some real high level and talk specifically with the site owner, or if it’s a client and so forth, and just get overall a good picture of what they’ve done in the past, and what their concerns are, and what their overall goals for the website are.

Is there a questionnaire or a kickoff call that you do?

It can be a kickoff call. It could be primarily a conversation, just to get an idea, and then I typically take notes by hand. Certainly, even if you’re auditing your own website, I will recommend that you start with just the higher overall goals and where you want to be. Is the goal more traffic or more leads? Just put down a couple of different goals and have those written down if you can. Then, it gives you a high-level picture of where you want to be as far as traffic or leads go in six months or three years. As well as any concerns: if traffic is down, if you’re not getting people to fill out your forms, if you’re not getting enough leads, something like that. These seem like higher-level issues, but as you go through the audit, you can keep them in mind and look for certain telltale signs that might lead you down the path. I’ve had cases where a client wasn’t getting many leads, and we figured out that only some of the forms on the website worked and some didn’t. If you have some of these high-level issues written down, then you can start looking for those answers.

Right. Let’s say the traffic has dropped off and they have no idea what happened, whether that was because of an algorithm update or because of something they did inadvertently, and you can start looking for those telltale signs.

Yes, exactly. There are certain types of information you’ll need to gather. Certainly, I like to start with access to the site, and access to the website’s log files if you can, because there’s a lot of information in log files that is not provided, say, in Google Analytics and so forth. And then also access to Google Search Console and Google Analytics, or whatever analytics the site is using.

Do you bother with Bing Webmaster Tools access as well?

Actually, yes. I do actually prefer to have that because they will give you a little bit more and different information. A lot of the information is the same from Google Search Console to Bing Webmaster Tools, but there are different reports. They give you different data and it’s from a different perspective from their crawlers and so forth.

Let’s walk through your process. After you gather this data from a kickoff call or an introductory conversation, you take notes, now, what tools are you using, what stepwise process do you go through to make sure you comprehensively audited all the SEO elements?

It’s a good question. Typically, I go into a gathering phase. I actually prefer to have either Microsoft Word or Notepad open while I’m starting to gather all the data. Essentially, there are a couple of things that I’ll initially run, and certainly the crawlers come first. There are several different crawlers out there for log files and so forth. Typically, I use OnCrawl. OnCrawl actually combines a crawl of the site with your Google Analytics data plus your website’s log files, compares all of those, and gives you certain reports and information that way. Certainly, DeepCrawl is another one that I’ll typically use.


Yep, I use that one too. I love it.

Depending on the size of the site, if it’s a really, really huge site, it might be a little bit more difficult, but running Screaming Frog SEO Spider works for most websites. I have used it to crawl a site that was 10 to 12 million pages, and that required some extra hardware to get done, but for sites typically under 100,000 pages, Screaming Frog is a pretty good tool to give you a good idea of what’s going on with the website.

Just to make sure our listeners are clear on this, Screaming Frog is a desktop program for spidering a website, looking for things like redirects, 404 errors, and all that stuff. DeepCrawl is “software as a service” that can scale to very, very large sites, but it’s also priced on a per-URL basis. If you have a very large website with lots and lots of URLs, that can get quite expensive with DeepCrawl, whereas Screaming Frog is just a licensing fee for the software, and then you can run it on very large sites and very small sites. The pricing doesn’t change.
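For listeners who like to script, the kind of summary these crawlers produce can be sketched in a few lines of Python. This is an illustrative toy, not how Screaming Frog or DeepCrawl actually work; the URL and status rows are hypothetical sample data such as you might export from a crawl:

```python
from collections import defaultdict

def summarize_crawl(rows):
    """Bucket crawled URLs by issue type, the way a crawler report does:
    redirects, client errors (404s and friends), and server errors."""
    report = defaultdict(list)
    for url, status in rows:
        if 300 <= status < 400:
            report["redirect"].append(url)
        elif 400 <= status < 500:
            report["client_error"].append(url)
        elif status >= 500:
            report["server_error"].append(url)
        else:
            report["ok"].append(url)
    return dict(report)

# Hypothetical crawl export rows: (URL, HTTP status code)
rows = [
    ("https://example.com/", 200),
    ("https://example.com/old-page", 301),
    ("https://example.com/missing", 404),
]
print(summarize_crawl(rows))
```

In a real audit the rows would come from a crawler export rather than being typed in by hand.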

Yes. OnCrawl, which I’ve been using also, is a little bit different from DeepCrawl. They do crawl and it’s software as a service, but with OnCrawl you pay just one monthly fee, and whether you have a large site like eBay or a site that’s 50 or 100 pages, you basically pay about the same price. As an SEO guy who does a fair amount of crawling of a lot of websites during audits, I use OnCrawl across lots of websites. Not only that, for my own personal websites, I actually use OnCrawl and have the site crawled every day. If there are any changes in crawling activity from the search engine spiders, I can see a lot of that from day to day.

I’m guessing that some of our listeners are thinking to themselves, “I get Google Search Console data. I get reports on crawling activity. Isn’t that sufficient for my purposes? Why do I need a separate crawl? Googlebot goes in and does its thing, and then I get reporting data on what’s happening through Search Console.”

For daily things and monitoring, it’s good to have the Google Search Console data and Webmaster Tools and so forth. I think when you start to get into a situation where your traffic has been going down or you have an issue and you want to diagnose it, then basically for example, when you have a crawler, especially a crawler that will look at your log file data and crawl your site and so forth, that goes a little bit further into helping you diagnose some of the problems.

It’s essentially giving you the opportunity to control the spider and look at stuff that is not necessarily available inside the Search Console. It’s also a more complete crawl, because you can’t assume that Googlebot is crawling everything and reporting on everything, every single page of your site, especially if you have a large website. Okay, let’s move on from the spidering tools to what else goes into your process. You’re taking notes and figuring out what to do here in terms of analyzing further after the crawler or the spider has done its thing. What’s next?

One other tool worth mentioning that is a little bit different is Siteliner. Siteliner is by the guys who do Copyscape. Copyscape checks for duplicate content on the web; Siteliner does a crawl of your website and checks for duplicate content on your own website. To give you an example, I ran Siteliner on one particular site and it turned out this website had a bunch of testimonials and reviews, and some of those were pretty long. They had put that section of testimonials and reviews in the sidebar of every single page, and also had some of the testimonials and reviews in the footer of the site. That ended up being maybe a thousand words of content, and it turns out that their typical blog posts were, say, 300 or 400 words. What ended up happening is all this extra content of reviews and testimonials in the sidebar and the footer was just duplicated over and over and over again.

There are issues like that that Siteliner will find that you may not be aware of, where you’re using the same text on several different web pages on the site. It gives us an idea of what type of content we’re using, and certainly what type of content is in our themes and our templates that may be causing issues. In that particular case, we identified all these extra reviews and testimonials on the site, and just by changing that, we were able to basically double the organic traffic on the site. There are other duplicate content issues that Siteliner will find that can make a huge difference.
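A rough way to see the problem Bill describes is to measure text overlap between pages. The sketch below uses word shingles and Jaccard similarity; it is an illustration of the idea, not Siteliner’s actual algorithm:

```python
def shingles(text, k=5):
    """The set of k-word shingles in a page's visible text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b, k=5):
    """Jaccard similarity of two pages' shingle sets: a crude stand-in
    for the duplicate-content percentage a tool reports."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not (sa or sb):
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

Two pages that share a thousand-word testimonial block and differ only in a 300-word post would score very high here, which is exactly the signal that flags sidebar and footer duplication.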

For our listeners who didn’t quite understand why this is an issue: duplicate content is the bane of the SEO practitioner’s existence, because you can get caught up in Google’s duplicate content filter. If a lot of pages look like duplicates of each other within your own website, then you end up competing with yourself. The pages that you most want to rank might not be the ones that get favored by Google. It’s gonna pick the one that it thinks is the one to present, and the other duplicates are filtered away, not presented high up in the search results. What you’re saying, Bill, is that when you had this client with so much duplicated content on the page, maybe 1,000 words duplicated across a whole bunch of pages, and the blog posts were only 300 or 400 words, the duplicate content overwhelmed the unique content and made these pages look very similar to each other.

Yes, exactly.

Siteliner sounds like a great tool. Copyscape is also a great tool for just identifying scraper sites that have stolen your content. Is that something that you recommend people chase up, go after the scraper sites, or content thieves that are scraping your content?

In a lot of cases, what I have found over the years is that, say for example, you have a lot of products and you write product descriptions, and other websites selling the same products are constantly copying your descriptions and using them on their own websites, or the same thing with photos and so forth. It’s a constant issue; you’re always chasing them down, because people want to copy your content. If you do have an issue, there are some tools out there that will help you identify and actually take care of people who copy your content. There’s one called Blasty. Blasty will give you an alert if somebody’s copied your content, and then you can literally pull up a list of all the pages that have been copied. In a search-results-type format, you click one link and it will automatically send a DMCA takedown request to Google and to their web host and so forth, to get that content removed. That’s all compliant with the DMCA. There have been situations where I’ve had clients in an ecommerce situation who would write unique descriptions for their products, and then everybody else would copy them. They’re using Blasty to take care of that.

Yep, very, very cool. DMCA, for our listeners who don’t know what that is, is the Digital Millennium Copyright Act, and there is a protection in there for copyright owners to make a request. Well, it’s not really a request, it’s more like a notice: “Hey, this is my material. This is my copyrighted content, and you need to take it down.” That notice would not necessarily go to the owner of the website, because they’re not gonna comply. They know they’re stealing your stuff. Usually, it will go to the web host, the hosting company, and/or to the search engines.

Yes, exactly.


We’re talking about the process. In those particular situations, we think about those tools, the crawling and so forth. As you crawl and as you look at that information, typically I have my notes out there. If I’m using Siteliner, if I’m using a crawler, if I’m using Screaming Frog, there will additionally be some issues that come up. Maybe it’s content issues, maybe it’s lots of redirects. Some of these tools will also tell you how fast your site is loading, and how fast your pages are loading individually. They give you traffic information. You can start to identify some areas that you need to dig into further. That’s where you use those three or four tools that we mentioned initially, and then figure out, “Okay, what are the issues that we really need to dig into a little bit further?”

Page speed, that’s an important issue not just for SEO but for conversion. If you have a slow-loading site, that’s gonna really hurt your conversion rate. It could be quite detrimental to your SEO as well. What are your favorite page speed analysis tools? I’m guessing PageSpeed Insights, GTmetrix, webpagetest.org, and whatever others?

Yes. Actually, I’ll even start with Google Analytics, because Google Analytics will give you some data, reported through Google Chrome, on how visitors were experiencing your site. There’ll be certain pages on the site that will typically load a lot slower than others. It could be that one page has a large image that other pages don’t have, or it could have other types of issues. Then I use some of the ones that you’ve mentioned. Certainly, Google has their tools, and then WebPagetest is one that will get me started and identify which parts of a page need to be sped up.

Yep. The waterfall report or graph will show what’s loading at each point in time as the page is painting, and why everything is taking so long. It could be that there’s one script or one particular resource that’s holding everything up until it’s finished loading.
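One common culprit the waterfall exposes is a render-blocking script in the page head. As a simple offline stand-in for reading a waterfall chart, this sketch (standard library only; the sample markup is hypothetical) flags external scripts in the head that have neither `async` nor `defer`:

```python
from html.parser import HTMLParser

class BlockingScriptFinder(HTMLParser):
    """Collect <script src=...> tags inside <head> that lack async/defer --
    the classic 'one resource holds up the whole page' pattern."""
    def __init__(self):
        super().__init__()
        self.in_head = False
        self.blocking = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "head":
            self.in_head = True
        elif tag == "script" and self.in_head and "src" in attrs:
            if "async" not in attrs and "defer" not in attrs:
                self.blocking.append(attrs["src"])

    def handle_endtag(self, tag):
        if tag == "head":
            self.in_head = False

def find_blocking_scripts(html):
    finder = BlockingScriptFinder()
    finder.feed(html)
    return finder.blocking
```

Anything this returns is a candidate for deferring, moving to the end of the body, or loading asynchronously.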

Yes, exactly. Just a tip there: what I have seen is that you can use content delivery networks and services like Cloudflare, or even Amazon Web Services, to essentially cache your pages so they load faster. I have even seen one website use several different services and different servers, so that when a page loads, it’s pulling images from one particular service and JavaScript from another. There are ways to do it, and there are experts out there who just work on page load speed and getting your website to load faster.

In fact, there’s a service W3 Edge offers where they’ll do a performance audit and then do server tuning if you’re running a WordPress site. They are actually the creators of W3 Total Cache WordPress plugin which is a great plugin that a lot of websites use for caching. Most people don’t know that you can actually hire the company that created that plugin to tune your web server, if it’s a WordPress server, to improve its page speed.

That’s good to know.

Alright. We’ve gone into page speed optimization and some of the tools that you look at for that aspect of it. What about analyzing the HTML code, Schema.org markup, all that stuff?

A lot of that is basically from a checklist that I typically go through. Actually, a lot of it is manual. Once I’ve identified certain pages or certain issues, I’ll take a further look into those issues. If the website is a local business, certainly, we would want to focus on making sure the Schema markup is correct and so forth. Typically, for example, I just use Google’s own tools to check for markup. In most cases I’ve found that if the website is doing any Schema markup, most of the time it is correct, because a developer has implemented it. But in a lot of cases, it just doesn’t exist. If it’s a local business, they just didn’t know that they need to mark up their name, address, and phone number on the site. Some of it is pretty basic: it either exists or it doesn’t. That’s what I’m running into.
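Bill’s “it either exists or it doesn’t” check can be automated for JSON-LD markup. This is a minimal sketch using only the standard library; `LocalBusiness`, `name`, `address`, and `telephone` are real Schema.org terms, but the HTML sample and the exact fields checked are illustrative, not a substitute for Google’s own testing tool:

```python
import json
from html.parser import HTMLParser

class JsonLdCollector(HTMLParser):
    """Collect the contents of <script type="application/ld+json"> blocks."""
    def __init__(self):
        super().__init__()
        self._in_jsonld = False
        self._buf = []
        self.blocks = []

    def handle_starttag(self, tag, attrs):
        if tag == "script" and dict(attrs).get("type") == "application/ld+json":
            self._in_jsonld = True

    def handle_data(self, data):
        if self._in_jsonld:
            self._buf.append(data)

    def handle_endtag(self, tag):
        if tag == "script" and self._in_jsonld:
            self.blocks.append("".join(self._buf))
            self._buf = []
            self._in_jsonld = False

def has_local_business_markup(html):
    """True if any JSON-LD block declares a LocalBusiness with the
    basics Bill mentions: name, address, and phone."""
    collector = JsonLdCollector()
    collector.feed(html)
    for block in collector.blocks:
        try:
            data = json.loads(block)
        except ValueError:
            continue
        if isinstance(data, dict) and data.get("@type") == "LocalBusiness" \
                and all(k in data for k in ("name", "address", "telephone")):
            return True
    return False
```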

Right. When you mentioned Google’s own tools for this, there is the Structured Data Testing Tool. We’ll provide links to all these different resources and tools in the show notes, listeners, so check that out at www.marketingspeak.com. Another tool that I really like using is Ryte. It’s a combination of a crawler and an HTML analysis tool. It’s got other capabilities in it as well, but it will identify not just structured data markup issues but other issues too. For example, let’s say the canonical tag is pointing to a wrong location. I actually had this with a client. They didn’t update it, and the canonical tag was pointing to localhost. It actually said localhost in the URL. That was a big mess-up that they didn’t catch, but Ryte found it. Or let’s say that the redirects are chains of four or five redirects, one after another: instead of one hop, you’ve got four. That kind of thing can be identified with a tool like Ryte, or also with OnCrawl and some of these other tools. Are there favorite tools that will do HTML analysis for you that we haven’t talked about yet?
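The redirect-chain problem Stephan describes is easy to detect once you have a crawl export mapping each URL to where it redirects. This is a small offline sketch; the URLs are hypothetical sample data:

```python
def redirect_chains(redirects, max_hops=10):
    """Given a mapping {url: redirect_target} (e.g. pulled from a crawl
    export), return the full chain for each starting URL that takes
    more than one hop, and stop on loops."""
    issues = {}
    for start in redirects:
        chain = [start]
        seen = {start}
        url = start
        while url in redirects and len(chain) <= max_hops:
            url = redirects[url]
            if url in seen:          # redirect loop detected
                chain.append(url)
                break
            chain.append(url)
            seen.add(url)
        if len(chain) > 2:           # more than one hop: worth fixing
            issues[start] = chain
    return issues

# Hypothetical crawl data: /a -> /b -> /c -> /d is a three-hop chain.
redirects = {"/a": "/b", "/b": "/c", "/c": "/d"}
print(redirect_chains(redirects))  # flags /a and /b as multi-hop chains
```

Anything flagged is a candidate for collapsing into a single redirect straight to the final destination.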

Let’s see. Looking at my list here, other than what we mentioned, Bing Webmaster Tools will show you some issues, same with Google Search Console. I did have a case once where, and it was odd to me, the canonical tag was put on every single internal link on the site. They were using the canonical tag and putting it in every link, rather than just in the header of the page. Basically, when they decided to link out to another site, there was a canonical tag out to that other site. They were losing a lot of value because the canonical tag was part of every href link on the site. The other thing is really identifying internal links and internal link structure. Some of the crawlers, like I said, will identify those issues. OnCrawl, in particular, will help with the internal navigation. Certainly, one of the basic rules of thumb is to have more internal links on your site pointing to your homepage, because your homepage is typically the most important page on your website. If you have a lot of links from other pages pointing to your About page or your Contact page, for example, that will make the Contact page or the About page look like the most popular and most important page. There are different internal linking issues that can be found. That’s a combination of using crawlers and actually physically looking at the website. One tool you can use is part of Google Analytics: In-Page Analytics. Typically, in the Google Chrome browser, you can run the analytics there and it will show you a wide view of your homepage, for example. When someone comes to your homepage, you can actually see what percentage of people click on which links on the website. There are issues like that you can look at.
Basically, it’s a user experience type of issue where we’re really looking to see, when someone comes to your homepage, where they’re clicking, what pages they’re going to, and what pages they’re not going to. That’s more of a manual process: some manual and some using tools.
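The internal-link rule of thumb above (the homepage should be the best-linked page) can be checked from crawl data. A minimal sketch, with a hypothetical four-page site:

```python
from collections import Counter

def inlink_counts(pages):
    """pages maps each page URL to the list of internal links it
    contains; returns how many pages link to each target -- a quick
    way to see whether the homepage really is the best-linked page."""
    counts = Counter()
    for source, links in pages.items():
        for target in set(links):   # count each linking page once
            counts[target] += 1
    return counts

pages = {
    "/": ["/about", "/contact"],
    "/about": ["/", "/contact"],
    "/contact": ["/"],
    "/blog": ["/", "/about"],
}
print(inlink_counts(pages).most_common())
```

If a secondary page such as `/about` outranks `/` here, the navigation and templates are worth a closer look.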

The tools for looking at the heat map of where people are clicking on your page, whether to go to another page on your site or off to another site that you’re linking to: things like Crazy Egg would be examples of those heat-mapping tools. Is that one that you use, or do you use something like Hotjar? What do you use for heat map analysis?

Actually, I have used both of those. If it’s a website that has not had those, they have to be installed on the site first, and you have to let them run for a certain period of days or a certain number of visitors. Google Analytics has that built in, so you can look back over the last six months or the last three months of data without having to install anything.

Why don’t you walk our listeners through how they could find that on Google Analytics, because I’m assuming that most of our listeners do have Google Analytics installed on their websites.

Typically, I use it within Google Chrome and there’s a Google Analytics Chrome plugin and I typically will turn it on while visiting the website.

We can include a link to that Chrome extension that you mentioned in the show notes as well for our listeners.

Yes. It is the Google Analytics Chrome extension. Basically, you go to the website, www.marketingspeak.com, and you press the little Google Analytics button in the toolbar. You turn it on and it will show you the data right there.

Just for the site that you own, not for any site on the internet.

Yes. As long as you have Google Analytics access, then you will have access to the data.

Awesome. Speaking of the Chrome browser, there is just a wealth of data analysis inside of Chrome that a lot of people are not even aware of. Inside of the developer tools, you can turn on the mobile emulator, for example. You can use PageSpeed Insights inside of the developer tools. You can use third-party tools that integrate with the developer tools inside Google Chrome. RankSense is an example of a tool I use inside of the developer tools of Chrome, not just generally on the toolbar. Those are some off-the-top-of-my-head examples of great resources inside of developer tools. Do you have any favorite tools or analysis that you do inside of the developer tools in Chrome?

Actually, yes. One thing in particular that I think is important is to realize all the different types of screens that people are using to access websites now. For example, we have the iPhone and iOS, we could have an Android phone, a tablet, and so forth. What’s becoming more and more popular is that people are actually viewing websites on, for example, 4K TVs and monitors that have really, really good quality. There are situations where your website will look good on a mobile phone, look good on most tablets and laptops, and so forth, but when it’s projected or shown on a 4K TV, the graphics may not be optimized and might look horrible on that 4K TV. There is a way in developer tools to look at various screen sizes. You can actually emulate: you can look at it on an iPhone 7 versus an iPhone 7 Plus, or an iPad Pro. You can look at all the different screen sizes. That is really, really key because, for example, if it’s a lead gen site where someone’s gonna fill out a form on your website or do something, you may have identified in Analytics, or even through some of the crawler data, that you’re not converting as well on certain screen sizes. This could be one area to look at.


Right. You don’t even need to use a third-party tool like Browsershots to look for compatibility issues between different browsers and versions and so forth. You can just do that inside of Chrome. Is that true?



Yes, right inside of Google Chrome.

Let’s move on to another aspect of the audit process. How about external link analysis, inbound links that could potentially even be harming your site more than they’re helping? What analysis do you do there? What tools do you use?

I have done a lot of link audits. One thing I specialize in, and have done over the years, is when a website gets penalized for bad links and receives a manual action in Google. There’s a whole process to get a manual penalty from Google lifted or revoked. To be clear, I am a Majestic brand ambassador, so I typically use www.majestic.com for all the link data.

Yep. I love Majestic.

I do use Majestic. They’re a little different because recently they’ve started reporting not only the incoming links to a webpage but also the outgoing links on that page. You may have 50 links coming into a webpage, and you have a link on that page, but if there are 150 links going out, it may not be as good a link, because that page is sharing a lot of the link juice, if you will, and pointing it out to a lot of other links. It’s a good idea to have a good sense of all the links coming in and all the links going out of all the web pages.

Right. Majestic has a couple of metrics that are critically important – trust flow and citation flow. Those are metrics that help you gauge whether a domain or a specific page is highly trusted or not so trusted, whether it’s very important or not that important. You wanna give any additional context around that?

Basically, there are two numbers that you’ll see initially: the trust flow and the citation flow. The trust flow, basically, is like the old PageRank; you get trust from other websites. It’s a number between 0 and 100, and you want your number to be as high as possible. If trusted websites like CNN or www.marketingspeak.com have a higher number and they link to you, they will help boost your trust a little bit. Citation flow, on the other hand, is really just the sheer number of links from other websites that are pointing to you. There’s not really a value judgment on those. Basically, you want your trust flow to be high and your citation flow to be as low as possible.

The way I think about it is you want to be more trusted than important. If you have a high trust flow and a lower citation flow, that’s the right permutation, whereas if it’s the other way around, if your citation flow is higher than your trust flow, that means you’re less trusted than you are important. Importance without trust is not good, because it doesn’t convey to Google that you’re a trustworthy website. Sites that aren’t trusted don’t rank well.
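Stephan’s trusted-versus-important rule reduces to a simple ratio check. The thresholds below are illustrative assumptions for the sketch, not an official Majestic guideline:

```python
def link_profile_health(trust_flow, citation_flow):
    """Rule of thumb from the conversation: you want to be more trusted
    than important. Flags a profile whose citations outstrip its trust.
    The 1.0 and 0.5 cut-offs are illustrative, not Majestic's."""
    if citation_flow == 0:
        return "no link data"
    ratio = trust_flow / citation_flow
    if ratio >= 1.0:
        return "healthy: more trusted than important"
    if ratio >= 0.5:
        return "watch: trust lagging behind citations"
    return "suspect: heavily cited but little trust"
```

For example, a site with trust flow 40 and citation flow 30 comes out healthy, while trust flow 20 against citation flow 60 is the suspicious permutation.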


Let’s go from Majestic to some of the other tools that you use for link analysis. Any others that you’re incorporating into your link audit? The Link Research Tools toolset, for example, or Open Site Explorer from Moz, or Ahrefs, or some other tool?

Sure. To do a really proper link audit, there are a lot of different sources: Majestic, Ahrefs, Open Site Explorer from Moz, as well as the Bing Webmaster Tools links and even the Google Search Console links. There will be some links mentioned in one source that won’t be mentioned in another. What you wanna do is gather all those links from all those sources, put them into a spreadsheet like Microsoft Excel, and remove the duplicates. Then you can get a truer picture of all the links. Even that won’t typically show every link, because there are reasons why certain tools will show links that others won’t, but we wanna gather all the links we can and then go through them from there. Typically, if there’s a link problem on a website, some of the issues I’ve run into are around anchor text. Ahrefs, Majestic, and now SEMrush, which is using Majestic data, will all show you the anchor text of the text links pointing to your website. What’s natural is for most people to link to you with your company name, your brand name, or maybe your product name. It’s typically not as natural-looking to have somebody link to you with a keyword that you wanna rank for. If you pull all the anchor text of the links pointing to you, and your brand, website name, company name, personal name, or product name are not mentioned as often as, say, a ranking keyword, you could run into an issue. We wanna have people talk about us and link to us with our company name or our product name. In my case, most people link to my website with “Bill Hartzer.” That’s more natural. So those are the issues: unnatural anchor text, and having a lot of links without much trust linking to you.
Those are some of the issues that we’re looking for when it comes to links.
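The merge-and-dedupe step Bill describes can be sketched in a few lines of Python. This is a hypothetical illustration: the tuples below stand in for rows from each tool’s CSV export, and real column layouts vary from tool to tool.

```python
from collections import Counter

def dedupe_links(*exports):
    """Merge backlink exports from several tools, keeping one row per URL."""
    seen = {}
    for export in exports:
        for url, anchor in export:
            # Normalize the URL so the same link reported by two tools counts once.
            seen.setdefault(url.strip().lower(), anchor.strip())
    return seen

# Toy rows standing in for CSV exports from Majestic, Ahrefs, Moz, etc.
majestic = [("https://example.com/a", "Bill Hartzer"),
            ("https://example.com/b", "cheap widgets")]
ahrefs = [("https://example.com/a", "Bill Hartzer"),   # duplicate of Majestic's row
          ("https://example.com/c", "billhartzer.com")]
moz = [("https://example.com/d", "Bill Hartzer")]

merged = dedupe_links(majestic, ahrefs, moz)
anchors = Counter(merged.values())
print(len(merged))              # 4 unique linking URLs after deduplication
print(anchors.most_common(1))   # the brand anchor should dominate "money" keywords
```

In a real audit you would load each tool’s CSV export instead of the toy lists, but the principle is the same: one merged, deduplicated sheet before any anchor-text analysis.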

Right, right. If you have a lot of money anchor text versus brand anchor text, that looks suspicious and potentially spammy to the search engines.

Yes. Another issue that I will look at: using Majestic and some of the other tools, you can see which pages on your website have the most links, which pages are your strongest, meaning it could be that you wrote a blog post and it went viral or got a lot of notice on social media, and other websites linked to it. I’ve run into situations where, six months later, they redesigned the website and removed that page or that blog post. There’s an opportunity there to look at all the pages that are important because they’re linked to, and make sure those pages still exist or are redirected properly on your website.

Link reclamation.
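The reclamation check Bill describes, finding heavily linked pages that no longer resolve, can be sketched like this. The status codes below are hypothetical stand-ins for what a crawler or an HTTP client would return; the network calls themselves are omitted.

```python
def reclamation_candidates(page_status):
    """Flag heavily-linked URLs that no longer resolve to a live page.

    page_status maps URL -> HTTP status code, e.g. gathered with a
    crawler or an HTTP client (network calls omitted in this sketch).
    """
    needs_fix = []
    for url, status in page_status.items():
        if status == 200:
            continue                      # page is live, nothing to do
        if status in (301, 308):
            continue                      # already permanently redirected
        needs_fix.append((url, status))   # 404/410/302 etc.: redirect or restore
    return needs_fix

# Hypothetical statuses for a site's most-linked pages.
statuses = {
    "https://example.com/viral-post": 404,   # removed in a redesign
    "https://example.com/products":   200,
    "https://example.com/old-guide":  302,   # temporary redirect; should be 301
}
print(reclamation_candidates(statuses))
```

Feeding this the URLs of your most-linked pages (from Majestic or similar) turns link reclamation into a simple triage list.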


Another thing that I think is good to look for when you’re doing a link audit is lack of diversity. If you look at the themes and types of the sites that are linking, like CMS-type websites and forum-type websites and so forth, and there’s a lack of diversity there, or if, among the top-level domains linking to you, there’s an overabundance of .info or .biz links and very few of the other TLDs, that looks suspicious. That looks engineered.

Yes. Typically, we want everything to look natural. When it comes to links, we don’t want all of our links to be one type of link. We want a good mix. If you’re a business, you’re gonna have links from press releases, from news and media websites. You’re gonna have links from industry magazines, industry directory websites. You’re gonna be mentioned on social media and so forth. It will be a good combination of links. That’s what we’re really looking for.
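The TLD-mix check raised above can be approximated with a quick count over the referring URLs. This is a naive sketch, splitting on the last dot rather than using a proper public-suffix list, and the URLs are made up for illustration.

```python
from collections import Counter
from urllib.parse import urlsplit

def tld_distribution(referring_urls):
    """Rough TLD mix of referring URLs (naive last-dot split;
    a production audit would use a public-suffix list)."""
    tlds = Counter()
    for url in referring_urls:
        host = urlsplit(url).hostname or ""
        tlds["." + host.rsplit(".", 1)[-1]] += 1
    return tlds

# Hypothetical backlink sample.
links = [
    "https://news.example.com/story",
    "https://forum.example.org/thread/9",
    "http://cheap-links.info/page1",
    "http://more-links.info/page2",
    "http://spam.biz/dir",
]
dist = tld_distribution(links)
share_info = dist[".info"] / len(links)
print(dist.most_common())
print(f".info share: {share_info:.0%}")   # a 40% share of one obscure TLD looks engineered
```

There is no official threshold for what counts as an "overabundance"; the point is simply to surface a skewed mix so a human can judge it.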

Also, there are gonna be cases that happen naturally where you’re mentioned by name, or your brand, or your company, but there’s no link there. If every single appearance of Bill Hartzer on the web includes a link, that looks weird. That doesn’t look natural.


That’s another thing. If you do go overboard with over-engineered-looking links, that can cause you problems in terms of a penalty situation: a manual action or an algorithmic penalty. Can you differentiate manual versus algorithmic for our listeners? I know you briefly touched on it, but I think it’s important for people to understand the difference. The approach for doing a clean-up is very different depending on whether it’s a manual action or an algorithmic penalty, because one involves a big red button that somebody pressed on your site and the other is just a set of rules that were tripped. Wanna give us some more insight on that?

Sure. If you’ve identified by looking at your links that there may be an issue, those issues can be very different. Sometimes it’s that you’ve bought links, or maybe you have a widget on your website that you’ve distributed out to all sorts of other websites. There could be a whole series of different issues. The first thing you’ve gotta do is look at Google Search Console and Bing Webmaster Tools to make sure you don’t have a penalty. There is a section under Search Traffic in Google Search Console, the manual actions area, that will tell you whether or not you have a penalty.

But that would only be for manual action-type penalties, so for the algorithmic ones, you’re not getting notified.

No, you’re not. Those are actually a little bit more difficult to diagnose and clean up. At least with a manual action, in a roundabout way, they will tell you what the issue is so you can deal with it.

People probably will have heard of Penguin and Panda, but there are a bunch of other algorithmic penalties that you can get hit with. The Fred update hurt some sites that were over the top in terms of too many ads. Any other algorithmic penalties you wanna mention besides Panda, Penguin, and Fred?

Recently, there have been a lot of penalty issues with Schema markup. There are websites that have been marking up their code with too many events, or marking things up improperly; similarly, a lot of recipe websites have been hit for using certain Schema markup plugins. There can essentially be internal link penalties as well, so these penalties and algorithmic-related issues can come not only from other websites linking to you, but from within your own site too.

Schema.org markup can be used legitimately to do things like add star ratings and review counts to your search listings in Google, but it can also be abused. Let’s say you don’t have that number of reviews and you’re over-inflating the counts or the scores and reporting that through the schema.org markup; that’s not legitimate. Or you’re using reviews and ratings that don’t belong to you; that’s violating the terms of use. For example, you can’t copy and paste reviews from Yelp onto your own website; that’s against Yelp’s terms of service. There are a lot of potential gotchas. I think the bottom line for our listeners is: if it’s something you would feel embarrassed or uncomfortable sharing with a Google engineer in a conversation over a beer, don’t do it. It’s too high-risk.
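As a contrast to the abuse just described, legitimate rating markup is only ever derived from reviews you actually hold. A minimal Python sketch, with the product name and scores made up for illustration:

```python
import json

def aggregate_rating_jsonld(name, ratings):
    """Build schema.org AggregateRating JSON-LD from *actual* review scores.

    Reporting counts or averages you don't really have is exactly the
    kind of structured-data abuse discussed above.
    """
    if not ratings:
        raise ValueError("no reviews: omit the markup rather than invent one")
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "aggregateRating": {
            "@type": "AggregateRating",
            "ratingValue": round(sum(ratings) / len(ratings), 1),
            "reviewCount": len(ratings),  # must equal reviews you can actually show
        },
    }, indent=2)

print(aggregate_rating_jsonld("Example Widget", [5, 4, 4, 5, 3]))
```

The resulting block would go in a `<script type="application/ld+json">` tag; the key point is that both numbers are computed from real reviews, never hand-entered.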

One issue that’s also worth mentioning is negative SEO. I’ve been involved with cleaning up, or trying to help clean up, a lot of situations where a competitor has created all sorts of links to their competitors. Typically, they will be links related to pills, or adult links, and so forth. Some people who are not necessarily playing by the rules will just buy a bunch of bad links pointing at their competitors and flood them with thousands and thousands of completely off-topic links, thinking that will hurt them. It can be frustrating. In a lot of cases, we’ll just identify those links and disavow them, telling Google that those are not links we agree with. Then there’s the whole issue of having a hacked site. I’ve had situations where I’ve come in to clean up sites that have been hacked. You’ll go through and clean up your website, but there will still be a lot of links out there to pages on your site that don’t exist anymore. There are certainly ways to identify those links, disavow them, and try to get them removed, but that can be a hassle as well.
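The disavow step can be sketched as generating a file in the format Google’s disavow tool expects: comment lines starting with `#`, `domain:` lines for whole referring domains, and bare URLs for individual pages. The domains and URLs here are hypothetical.

```python
def build_disavow(spam_domains, spam_urls=()):
    """Emit a file in Google's disavow format: one `domain:` line per
    bad referring domain, plus optional individual URLs. The result is
    uploaded via Search Console's disavow tool."""
    lines = ["# Links identified as negative SEO during the audit"]
    lines += [f"domain:{d}" for d in sorted(set(spam_domains))]
    lines += sorted(set(spam_urls))
    return "\n".join(lines) + "\n"

# Hypothetical spam sources found in the link audit (duplicates are collapsed).
spam = ["cheap-pills.example", "casino-links.example", "cheap-pills.example"]
print(build_disavow(spam, ["http://hacked.example/old-page"]))
```

Disavowing whole domains rather than individual URLs is usually the safer call for obvious spam networks, since the same domain tends to keep spawning new linking pages.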

We covered that topic pretty well in the Christoph Cemper interviews; that’s a two-part episode that’s definitely worth listening to, listeners, if you care about link cleanups, negative SEO, and link audits. Christoph is the founder of LinkResearchTools. It’s a great tool set. So many favorite tool sets! I love them all: SEMrush, Majestic, LinkResearchTools. There are just so many. We’re running short on time, so I wanna wrap things up here. I wanted to get a sense of the size and scope of these audit engagements that you do. How big is a report, and how long does it take you to present all your findings and recommendations?

That’s a good question. It depends on the size of the site. For a small business website, if it’s 50 pages or under 100 pages, it might take me a day or two to go through, find the majority of the issues, and make recommendations. Typically, the final audit documents would be anywhere from 30 or 40 pages up to 100 pages or so. I’ve done large audits of sites that have millions of pages; those take much longer, a month to a month and a half to complete and have everything written up.

Cool. How would people reach you if they wanted to work with you?

The best way is through my website www.billhartzer.com.

Perfect. Thank you so much, Bill. This was just awesome information, lots and lots of great tools and resources. We’ll include all those in the show notes. Listeners, do check that out. We’ll create a checklist of actions to take from things that we discussed in this episode. That’s all at www.marketingspeak.com. Also, go to www.billhartzer.com to talk with Bill and potentially work with him. Thank you, listeners. We’ll catch you on the next episode of Marketing Speak. This is your host, Stephan Spencer, signing off.


Your Checklist of Actions to Take

☑ Regularly conduct SEO audits regardless of the website’s size. Audits are the first step in SEO client engagement.

☑ Get to know my client’s goals before I start optimizing their website. This will give me a clear idea of what SEO procedures to prioritize.

☑ Write down specific goals before auditing my own site. Order priorities from most to least important to set up a detailed timeline.

☑ Utilize tools like Google Analytics to analyze data and reports. I can also use Bing Webmaster Tools to compare results.

☑ Run data through a set of crawlers to get more extensive information. Resources include OnCrawl, DeepCrawl and Screaming Frog.

☑ Use Siteliner to find duplicate content on my website. Duplicate content is a big no-no in SEO.

☑ Look for companies who copy my content with the help of Blasty and Copyscape. Copyright infringement is illegal and can hinder my own site’s SEO.

☑ Check the site’s page speed and make sure every page loads fast enough for visitors. Use tools like PageSpeed Insights, GTmetrix, and WebPageTest.

☑ Redirect as many internal pages to the homepage as possible. The homepage is the most important page of a website.

☑ Utilize heat maps like Crazy Egg or Hotjar to determine which part of a webpage has the most activity. I can strategize content and CTA buttons based on this data.

About Bill Hartzer

Bill Hartzer is an independent SEO consultant who has practiced SEO for over 20 years. He’s published case studies on new gTLD domain names and their impact on SEO and PPC.
