In SEO, the Google Algorithm is king. If your site winds up on the wrong end of a Google Algorithm update, watch out. You could see your rankings, traffic and revenue plummet. You may have been hit with a Google penalty. Like me, my guest on this episode number 162 has spent years studying Google’s updates and helping her clients stay on Google’s good side. Marie Haynes is a Search Marketing Consultant and expert on Google penalties. She has written extensively on Google’s site quality guidelines and major Google updates like Panda, Penguin and Medic. While Google updates are unpredictable, there are plenty of things you can do to minimize your risk of being affected by an update. Stay tuned as we reveal tips and best practice techniques for avoiding the wrath of Google.
Marie Haynes, it’s great to have you on the show.
Thank you so much for having me. I’m excited to do this.
It was a pleasure meeting you. We met in Helsinki in the summer at the exclusive SEMrush Summer Jam event. It was your first time and my first time as well. Let’s start by picking up on a conversation we had in Helsinki about E-A-T, Google’s Quality Raters’ Guidelines and how important they are. That was before the August 1st update hit, which made it such a timely discussion: your focus on how important E-A-T, Expertise, Authoritativeness and Trustworthiness, is played out so well in the next month’s update. Why don’t we describe E-A-T for our readers and why it’s important, and then go into August 1st?
We’ve been talking about this in our company for a couple of years now. For a lot of SEOs, this is a brand-new concept. I’ve had some arguments with some people on Twitter saying, “E-A-T is an SEO-made concept,” but I’m 100% convinced that this is a major factor in Google’s algorithms. E-A-T stands for Expertise, Authoritativeness and Trustworthiness. In my opinion, it’s Google’s attempt to only rank businesses that are known as the experts, the authorities and also trustworthy sites. Back in February of 2017, there was this big algorithm update and we didn’t even give it a name. Five days after this algorithm update happened, Gary Illyes from Google tweeted this random reminder saying, “Don’t forget to read the Quality Raters’ Guidelines.” I asked him, “Was that random? Are you trying to give us a hint that things that changed in the algorithm are connected to the guidelines?” He said, “You’re right, it was connected.”
What I saw back at this time was sites that were losing rankings. One of them was a site that had good medical information, but there were no doctors at all on the site. Google, in my opinion, was recognizing, “Even though these are well-written articles, the author doesn’t have the E-A-T to write them.” What we’ve been seeing is that almost every algorithm update that’s happened from that point on, up to and including the August 1st update, addressed something that was in Google’s Quality Raters’ Guidelines, and most of the time this has to do with E-A-T.
The Quality Raters’ Guidelines, let’s give a bit more background on that for our readers who are not familiar with that. That used to be a secret. It was a confidential document that kept getting leaked by different people who got their hands on it. I’m assuming, the human reviewers, the quality raters that worked for Google, would leak that document and then we SEO people would go crazy analyzing it. Do you want to go into some detail about this?
Years ago, I bought the domain name LeakedQualityRatersGuidelines.com. I had one of those leaked copies up on that site and got a bunch of links to it. Yeah, you’re right, it used to be a secret. Let’s back it up even further and talk about who the quality raters are. Google contracts out to somewhere between 10,000 and possibly even 100,000 people. Anybody can become a quality rater. You can apply online. I got approved to be a quality rater, and then it turned out you have to commit to giving something like fifteen hours a week, which was something I couldn’t do at that point. What you do is learn from this guide, which anybody can find online now. If you search for Google Quality Raters’ Guidelines, you’ll see a PDF document that comes up. It’s about 160 pages. Some of it is interesting reading; other parts are a little bit tough to get through. It’s an instruction manual for these raters, who are contracted by Google, telling them what is considered high quality in a website.
Anytime we ask a Google employee, “How can I rank better? How can I have a site that Google wants to rank well?” The answer is always, “Create great content.” Google put out a blog post a few years ago and they said, “We are publishing the Quality Raters’ Guidelines so that webmasters can know what it is that Google looks for as quality in a website.” That’s all I need from Google to say, “It’s definitely something important to pay attention to.” The important things to know are that these quality raters are not the same as Google’s webspam team. If somebody from the webspam team looks at your website and determines that there’s a quality issue, they can flag it, they can give you a manual action. They can do all these things.
The quality raters don’t have that power. What we believe happens is that they have straightforward yes or no questions to answer. For example, Google might say, “Do a search on this particular topic and tell us the site that’s ranked number one. Would you give your credit card info to this site? Would you trust the site with medical information?” They get all of this information from thousands and thousands of people. Let’s say there’s a problem. Let’s say that the site that’s ranking well for this top medical query is super low quality for some reason. That feeds back to Google’s engineers and then the engineers try to tweak the algorithm so that type of thing doesn’t happen again. When we get an algorithm update like the August 1st update we had, that’s basically the result of this whole process that the quality raters determined, “There’s a problem here.” The engineers said, “Let’s try to fix it by tweaking these signals,” and then when they push that out we see the algorithm update.
Let’s say this small army of tens of thousands of people are beavering away, looking at all of these different websites and trying to ascertain if they’re trustworthy, if they’re authoritative, if they show expertise and so forth. Presumably, this is not going to continue to scale to a point where there are a million of these folks. It’s going to be baked into an algorithm at some point.
The quality raters don’t have an effect directly on websites. If you think of how much is involved in getting this many people doing the work for you, this is something that Google can’t do forever. I personally believe that at some point, they’ll use this information for machine learning. There have been some statements from Google employees saying that’s not happening at this point, which surprised me. That’s probably what they’ll end up doing in the future: they get this information and then they use it as a learning set. They can say, “We’ve learned that this particular element tends to be related to low quality,” and then apply that to every site on the web or every site in a particular category. Who knows how long Google will keep doing this process? It’s fascinating to me that they mostly use the raters to see, “Are the algorithms doing what we want them to do?”
If you read through this 160-page document, your eyes could glaze over after a while. There are a lot of acronyms thrown around throughout the document, like PQ for page quality, MC for main content and so forth. They seem to like acronyms over at Google, at least the people who wrote that document. What level of skill does a business owner, a marketer or an SEO need in order to get adequate value out of reading through the Quality Raters’ Guidelines?
The guidelines are written to the level of people who have no SEO experience or no marketing experience. I don’t find that they’re technically difficult to read through. There’s a lot of repetition in places. They’ll have examples of sites that are high or low quality depending on whatever that particular topic is talking about. You can click on a link in the PDF document and it will open up a screenshot of a particular site saying, “We consider this to be of the highest quality because it has these features,” and that can take some time to go through. I’ve probably been through these guidelines 100 times and it can take a lot of time to read it. In terms of skill level, anybody who’s reading this blog can get good benefit out of reading the guidelines.
The question though for our readers is, do you have the dedication to go through a 160-page document and then follow all the examples on the web to see those page quality guidelines and so forth in action so you get the gist of it? It’s a commitment.
We have a whole team of people who spend time doing this. If you were running a business not related to SEO, you’d have to be a certain type of person to want to read through this thing in your spare time. I know there are some people out there who get pleasure out of this type of thing. There are many tidbits in there. Every time I go through it I say, “I didn’t see that before, and that explains why this site saw a drop,” or something like that. There are many good tips in there.
A good starting point certainly would be The SEM Post by Jennifer Slegg on Key Takeaways from the Latest Quality Raters’ Guidelines, which came out in July. Are there any other resources that you would recommend?
Jen has got a couple of really good pages. Every time there is a change in the guidelines, she will write a post on what’s changed. Often there will be something that changed and then there’s an algorithm update that reflects that change. For example, one of the phrases that was added to the guidelines back in late July 2018 was the safety of users. That was in the whole trust section of E-A-T. August 1st was about user safety. Jen has that written out: “Here are the changes and here is why they might be important for businesses.” I also have a summary that I’ve made. It’s a checklist that my team and I use when we’re assessing sites in the eyes of the Quality Raters’ Guidelines. What we’ve done is we’ve summarized, “Here are all the things that they talked about and here’s how we would manually check them ourselves.” You can get that at MarieHaynes.com/book. It’s something that we give out free to our paid newsletter subscribers. That’s a resource. There’s not a lot else out there that’s written about the Quality Raters’ Guidelines. I predict that there will be. This is one of those topics that a lot of people will be talking about as they realize in the months to come how important it is.
You also have a great resource on your site about the August 1st update. You refer to YMYL, Your Money or Your Life websites in the title of the post and how those seem to have been targeted specifically with the August 1st update. YMYL isn’t just an acronym you made up. That’s in the Quality Raters’ Guidelines.
That’s all throughout the guidelines. YMYL stands for Your Money or Your Life. Most of the sites that we do site reviews on, we’d consider YMYL. Any site that is medical, legal or financial is automatically considered YMYL. Other sites would be sites that take transactions. If you take credit card transactions then you’re also YMYL, as is any site that helps people make serious decisions. One of the examples in the Quality Raters’ Guidelines is a site about adoptions. You could say technically maybe that’s a legal site, but you want somebody who’s writing that content to have expertise in that area because it’s important. The reason why it’s important for us to classify sites as YMYL is that the guidelines say those sites are held to a higher standard of quality. If you are writing about a YMYL topic and you don’t have E-A-T for that topic, you’re probably going to have trouble ranking well for it, which is why it’s so significant.
It’s high stakes essentially. If you’re writing about home remedies, let’s say somebody who’s not qualified writes about this for your site. They send this reader, this user, down a rabbit hole with a bad remedy that doesn’t work, and then they die from it. That’s as high stakes as you can get. It seems like a disproportionately large percentage of medical sites, especially home remedies, alternative therapies and so forth, were targeted with this August 1st update. It’s because it’s such a high-stakes thing, and maybe it relates also to user safety.
When I first wrote that post on the August 1st update, it was very clear that a large number of medical sites were affected. Now that we’ve had time to review a number of sites, it wasn’t just medical sites. There were sites of all kinds that were affected although they all seemed to be Your Money or Your Life sites. One of the things that we noticed is that almost every site that we’ve reviewed that saw drops had some type of trust-related issue. Medical sites that made claims that went against scientific consensus, and we’re seeing that some of the sites that are ranking well after the August 1st updates are sites that do a good job of referencing their medical points. For example, if I wrote an article on diabetes and I have references all throughout my article to the latest scientific research on diabetes, that’s probably seen as a sign of trust.
We’ve reviewed some sites where there were medical claims that are pretty dubious and there’s nothing to back them up. That type of thing was demoted, in our opinion. One of the things that’s in the Quality Raters’ Guidelines is they instruct the quality raters to see whether medical citations are appropriate. Have they referenced medical articles? It doesn’t matter how you do it, whether you have links at the end of your article or scattered throughout it, but you want to show that it’s up-to-date with the most recent research that you can.
Any time that you make a claim, you need to substantiate it. It’s like Wikipedia: the Wikipedia guidelines say that it is not the place for original thought, and that everything needs to be referenced and substantiated. This is good practice for any blog post or article.
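Marie’s point about referencing claims can even be spot-checked programmatically. Here’s a minimal sketch, using only the Python standard library, that pulls the outbound links out of an article’s HTML as a rough proxy for whether it cites any outside sources. The article snippet, domains and function names are invented for illustration; this is not any tool Marie’s team described.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCollector(HTMLParser):
    """Collects the href value of every anchor tag in the HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def external_references(article_html, own_domain):
    """Return outbound links pointing away from your own domain --
    a rough proxy for whether an article cites outside sources."""
    collector = LinkCollector()
    collector.feed(article_html)
    return [h for h in collector.links
            if urlparse(h).netloc not in ("", own_domain)]

# Hypothetical article snippet: one outside citation, one internal link
article = ('<p>See <a href="https://www.nejm.org/some-study">this study</a> '
           'and <a href="/about">our about page</a>.</p>')
print(external_references(article, "example.com"))
# -> ['https://www.nejm.org/some-study']
```

An article on a medical topic that comes back with an empty list here has no outbound citations at all, which is exactly the kind of page the discussion above suggests is at risk.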
We had one, and this wasn’t with the August 1st update but it was related to a quality update, where the site had this huge part that was dedicated to the zombie apocalypse. You can argue whether that’s going to happen, but it was presented as fact. I don’t know how Google figures this out algorithmically, but in the guidelines it says, “If it’s satire, it has to be clearly labeled as such.” I’m not sure if satire is the right word for the zombie apocalypse, but the point is, if you’re presenting some out-there theories as true fact, then you may have trouble ranking for the true-fact terms. If you have parts of your site that have to be read with a sarcasm font, then that could cause you to have a ding in quality in Google’s eyes.
If you’re going to be tongue-in-cheek, the whole site should be that way and it should be pretty clear that it’s that way. If I can’t distinguish between a section that is satirical or tongue-in-cheek and a section that is not or an individual article, I don’t know what’s credible and when you’re messing with me.
Google’s been under a lot of fire over the last couple of years about fake news. It’s hard to algorithmically determine what truth is. That’s one aspect of some of these quality updates. Often that’s hard for us to determine on a site quality review because we don’t know which sections of the site are potentially on the edge in terms of truth. It’s worth taking a good hard look at your own content and asking yourself, “Is there anything on our site that could be considered controversial?” If you remember back when the Panda algorithm came out, Google gave us this list of 23 questions from Amit Singhal. These, I believe, are closely related to the Quality Raters’ Guidelines. There are questions that say, “Would you trust the information from this article? Is this something that’s widely known as fact or are you trying to present a different theory?” There’s nothing wrong with that. There are all sorts of sites that talk about alternate theories, but it has to be clear. The ultimate question is, “Are you tricking users?” and if that’s the case then that has to stop.
There are a couple of factors or biases that we should bring up in this episode so that folks know why fake news can be effective: cognitive bias and the backfire effect. Do you want to share anything about either of those? There’s a fun resource about the backfire effect. It’s a cartoon or comic from The Oatmeal that talks about our biases and how our amygdala guards our brain from intellectual onslaughts, not just physical ones, when our core beliefs are being challenged, at risk or in danger. That’s the backfire effect. Let’s talk about these biases.
I’ve seen that from The Oatmeal and it makes sense. We consume so much information on the internet. The real question is, who determines what is the truth? We can get philosophical with this. At this point, we’re in very early stages of Google and other search engines, or even Facebook, determining what is good for us to rank well. The types of things that we’re seeing drop are very obvious, like, “This website tells us that I can eat carrots and cure my cancer.” That type of thing Google is taking out of the top-ranking ability, but they have a long way to go. We can read enough things about a topic and start changing the biases in our minds, and how is Google going to determine what is true for everybody? I don’t know. I don’t have the answer to that.
We’re basically looking for reinforcement to our already deeply-held core beliefs and we toss out the things that are against that, even if they’re well-sourced. If they’re not well-sourced, that’s something that an algorithm can pick up on. The cognitive biases thing, that’s a little trickier for an algorithm.
It’s not black and white, like Google says, “You made this claim and you didn’t give us a scientific reference, so we’re not going to rank you well.” We’ve strayed off the topic of E-A-T, but it’s all connected. If I’m reading an article and it’s written by somebody who is world-renowned as an expert in the subject, and they’ve made a claim that perhaps is controversial, Google may allow that. Whereas if somebody nobody’s ever heard of makes the same claim, that might be more likely to be seen as fake news.
You gave an example where you were talking about a site offering medical advice without a single doctor on staff or as a contributing writer. That site lacks the E in E-A-T; the expertise is missing. Let’s say, regardless of whether you have doctors on staff or as contributing writers, you believe you got hit with the August 1st update. What would be a way to build that expertise or show it to an algorithm?
That particular client is one of our favorite success stories. They were hit in February of 2017. We worked with them to fix a great number of things on their website. In terms of E-A-T, we said, “Why don’t you see if you could go out and hire some physicians?” I know this is not a feasible thing for every business to do because I would imagine it costs them a crazy amount of money to do this, but they did hire physicians. They hired physicians who were known as authorities in their particular medical space. Every post had the author bio of the journalist who wrote the post and we worked with them to even boost up the journalists’ E-A-T.
If we were writing about diabetes, let’s say the person who wrote the post is not a physician; they don’t have any actual medical experience. I don’t know if this helps or not, but it certainly won’t hurt: we wrote in their author bio, “So and so has been writing about diabetes for the last ten years ever since her diagnosis,” trying to get some real-world expertise that counts for something. What we did was we had them also create an author bio for the physician and we said, “This post was medically fact-checked by Dr. so and so,” and they did have the doctors fact-check their posts. They had bio pages for these doctors that greatly extolled their E-A-T.
They basically said, “This doctor graduated from this medical school. He has been in practice and has been published in these authoritative places.” We did everything we could to build up the message, “This is why you should trust the information that’s on this site: because it’s been fact-checked by doctors.” We also made sure that all of their claims were referenced. This particular site saw some increase with a quality update, and then with the August 1st update they regained all of their lost rankings. It took almost a year and a half for them to regain these rankings, but it can be done. The reality, though, is that it’s difficult to fake E-A-T. We’ve had some other nice success stories. If you’re lacking E-A-T, sometimes one of the answers is to hire people who can collaborate with you and who do have the E-A-T that you need.
As a starting point, at least stop hiding behind a persona, stop hiding behind an illustration or a comic version of yourself. I presented a webinar to a bunch of people who were hit by the August 1st update in the medical space, like affiliates. I was doing live site reviews in this webinar. One of the affiliates that I was critiquing, the guy didn’t use his last name. He didn’t even use his correct first name. He changed it, and he had a comic illustration of himself. It was a caricature. It wasn’t a real photo. I was like, “This is the opposite of E-A-T. This is hiding behind a persona and that’s not credible. That looks so not legitimate,” and he was like, “That’s a good point. I’d never even thought about that.” This guy is not a medical doctor, but he’s got a lot of expertise in weightlifting, bodybuilding, supplements and all that. He’s got a credible opinion, but not when he presents as a persona rather than a real human.
One of the first things the quality raters are supposed to do when they’re assessing a site is to determine who’s responsible for the content on the site. In some cases, it’s obvious. For example, on an attorney’s website you don’t necessarily need a bio with every single article that says, “This is written by this particular attorney.” It can be fairly obvious from the about page that the attorneys in that business are responsible for the content. Specifically, when you have sites that have multiple different types of authors and articles, you really do want to make sure that it’s clear who’s responsible. Who has written the content? The part that people get confused about is authority.
I’ve had people write to me and say, “I am a doctor and yet my site dropped with the August 1st update.” When I search for their name, I can’t see anything other than their own site. Who knows if Google checks databases or LinkedIn? They can check all these external sources to confirm, “This person is a doctor,” but what it ties into is the whole concept of link authority. Often what we will do for a review, when a site has dropped in rankings for a term, is look at who’s ranking well. Then we will look up that author, and we can see they’ve been quoted in the New York Times, in Forbes, in this medical journal, and everywhere we look they’re known as the authority in that space. You can’t fake that. There are ways you can display it better. If you’ve written a book or you’ve won awards, you need to brag about those things on your website. If you don’t have authority in your space, that’s where you need to start working on it.
I had somebody say, “It’s like a catch-22. You can’t become an authority unless you’re recognized in these authoritative places and these authoritative places won’t recognize you unless you’re already in authority.” Look at my own story and most of us who are in SEO. I wasn’t known as an authority in SEO many years ago, but I wrote things, I got things published. I got interviewed in places and you gradually build up this authority. The point is that in order to rank well now, Google wants to see that you’re recognized as a leader in your space.
That’s different from showing expertise. You could put in what Malcolm Gladwell refers to as the 10,000 hours to build that mastery; Outliers talks about 10,000 hours being a magic number. You may be an expert, but if there’s no consensus recognizing that your contributions are valid, proven and relevant, then it doesn’t matter. You’ve got expertise but you have no authoritativeness; nobody is looking to you as the authority. There are many ways to establish or present that authority beyond links. Most people talk about authority in the context of links. Let’s say that you’re Facebook verified, you’re Twitter verified, you have a Wikipedia article that’s well-referenced. These sorts of things will help establish your position of authority, as will authoritative links pointing to your site from high-PageRank websites.
At Pubcon Austin I asked Gary Illyes from Google, “How does Google algorithmically determine E-A-T?” I was surprised he gave me an answer. He said, “It’s primarily based on offsite links and mentions. Google knows which parts of the web to count. If you get a mention in Forbes, Google knows which parts of Forbes are easy to buy your way into and which parts are truly legitimate press mentions.” That’s what we need to be striving for. The Quality Raters’ Guidelines have several places where they talk about the importance of having a Wikipedia page. We’ll do this in our reviews. We finished one up where we could see that the sites ranking number one for their main keyword all have their own Wikipedia pages. This particular business we were reviewing did not have a Wikipedia page. In order to get a Wikipedia page, you have to be an authority. It all feeds back into itself. The thing you need to be working on is how to get true mentions on the sites that are recognized as authorities in your space.
With Wikipedia specifically, it goes beyond being an authority. In order to warrant your own Wikipedia article, either for you personally or for your company, you have to be encyclopedic, which means you have to be notable. There is a whole definition around notability in Wikipedia’s guidelines, and there are other criteria you have to meet. You have to do all this without being a player in the creation of your own Wikipedia article, because that violates Wikipedia’s conflict of interest guidelines. It’s very tricky.
I do think that that’s factored into the algorithm. It’s not black and white. It’s not like, “If you don’t have a Wikipedia page, you’ll never rank for anything.” It’s one of the things that Google looks for as a sign of recognized authority. There are some niches where you’re never going to get a Wikipedia page. If none of your competitors have them, that’s probably not as big of a deal. It’s very hard to spam your way into Wikipedia. I’m sure there have been people who have done it. You have to be notable enough for other people to want to be talking about your business, which makes it hard. Anybody who’s been doing SEO since pre-Penguin days, pre-2012, remembers that you could take pretty much any business and, if you got enough links pointing to it, it would rank well. That’s what Google is trying to erase. They’re trying to erase the ability for us to SEO our way to the top. There’s still tons of room for SEO, provided that you are a legitimate business that’s well-respected and recognized.
Specific to E-A-T, if we were to differentiate authoritativeness and trustworthiness: there’s authority and there’s trust in links, and those are different things. We should distinguish those for our readers. Authority is about importance, while trust is something else: the distance from a trusted seed site. I can establish trust by getting a link from Stanford University, from an area of that site that is not gameable, not a user page. Famously, the Stanford newspaper took money for dropping links in its footer for years; Google knew about this for a long time and zeroed out any benefit those links gained or could have gained. Trust is about distance from trusted seed sites from a link perspective, and authority is more about the importance of the sites that are linking to you. Not every link is created equal or weighted equally. It’s a meritocracy, not a democracy. There’s more to it than that, though. As the algorithms evolve beyond links to expert systems that are better than humans at sniffing out stuff that isn’t authoritative or trustworthy, we’re going to see a lot of evolution in that space. What are your thoughts around all this?
The whole seed site thing is interesting. I’m sure what you’re referencing is a Google patent that was a continuation of the original PageRank patent.
It’s from Yahoo. The TrustRank patent was filed by Yahoo. I’m sure that the algorithm got baked into Google’s PageRank algorithm, so when they refer generically to PageRank over at Google, there’s a trust component in there.
There are two different things that we’re talking about. There’s link trust and then there’s something different that Google did in terms of trust with this latest update. I’ll mention the seed site thing first and then we’ll get to the second element of trust that is important. The patent that I’m thinking of was a continuation of Google’s PageRank. I do think PageRank initially was all connected with the whole Yahoo TrustRank and things. When PageRank first came out, Larry Page said that it was unspammable. We developed into SEOs at this time by figuring out, “In order to rank well, I need to get links from authoritative sites,” and then we found ways to buy links, to create our own authoritative sites and to manipulate the system strongly.
PBNs and all that.
Some of those can still work, but Google’s getting good at figuring out which sites truly are trustworthy. One of the things that it seems like they do is keep a manually-curated list of seed sites that are trusted. One of the examples in the patent is The New York Times. If you had a link from The New York Times, that’s a good link, and we’d all take a followed link from The New York Times. They said, “Look at the distance from that.” If I got a link from a site down the chain, where it started at The New York Times, then The New York Times linked to this other news site, and then that site linked to me, that’s the type of link that could count. Google got better at figuring out what these trusted sites are. Not a lot of people talked about May 24th, 2018 as being a big update. This was a big link-related update where they got better at figuring out which links to count and which links matter. That’s one factor.
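The seed-and-distance idea described here can be illustrated with a toy breadth-first search over a link graph: start every trusted seed at distance zero, and each link hop adds one. This is only a sketch of the concept from the patent discussion, not Google’s actual implementation; the graph, seed list and function name are invented.

```python
from collections import deque

def trust_distance(link_graph, seeds):
    """Shortest number of link hops from any trusted seed site
    to every reachable site in the graph (toy model of the idea)."""
    dist = {s: 0 for s in seeds}
    queue = deque(seeds)
    while queue:
        site = queue.popleft()
        for linked in link_graph.get(site, []):
            if linked not in dist:          # first visit = shortest path
                dist[linked] = dist[site] + 1
                queue.append(linked)
    return dist

# Hypothetical link graph: nytimes.com links to a news site,
# which in turn links to mysite.com.
graph = {
    "nytimes.com": ["somenews.com"],
    "somenews.com": ["mysite.com"],
    "randomblog.com": ["mysite.com"],
}
print(trust_distance(graph, ["nytimes.com"]))
# mysite.com ends up two hops from the seed, so its inbound link
# still carries some trust; randomblog.com is unreachable from any
# seed and gets no trust distance at all.
```

The intuition matches what Marie describes: a link two hops down the chain from The New York Times can still count, while a site with no path back to any trusted seed gets nothing from this signal.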
Something different happened with August 1st, in my opinion. In July of 2018, Google made a change to the Quality Raters’ Guidelines. One of the things they added was the safety of users, which they wanted to emphasize. One of the things that we’re seeing is that pretty much every site that saw a drop had some type of element where they were tricking people. For one site, if you go to their Better Business Bureau listing, there’s a big red banner at the top that says, “Warning, people have had issues getting refunds from this company,” and that’s in the Quality Raters’ Guidelines. It says that if you have a negative or a low Better Business Bureau rating, that’s a sign of low quality.
We saw several sites where we looked at their review profiles around the web and could see lots of people had trust issues with the business. Another thing that’s in the Quality Raters’ Guidelines is how important it is for Google to be able to see your terms and conditions page. That’s something we don’t talk about a lot. A lot of sites will noindex those pages, and yet that’s what Google wants to see. They want to see that if somebody wants to get a refund, it’s easy for them to figure out how to do that, if refunds are applicable in your space.
We saw sites drop where it was clear their competitors had friendlier refund policies. For the client we were reviewing, you had to dig hard to find the refund information, and then it said, “No, we’re not going to refund anybody.” I’ll give another example: one site had above-the-fold content on their home page that said, “This is a free product. You get free access to this, this and this.” Then when we read the reviews, it was all people saying, “I thought it was free. They wanted my credit card info and then they charged me.” Trust issues like that are somehow being factored into the algorithm by Google.
There’s all sorts of stuff in the guidelines about ads. If you run a site that runs ads, then you should go through the guidelines to see what they’ve said about ads. One of the things is that ads need to be clearly labeled as such. We’ve all seen those sites, maybe a directory listing, where you go to click on something and you’re like, “I didn’t even know that was an ad,” and you realize you just clicked on somebody’s ad. That type of thing can cause demotions in terms of quality. Another thing is ads that are annoying to people. We know that there’s an algorithm specifically for mobile interstitials. If the first thing people see when they land on your site is a big huge pop-up ad that you can’t close, that can potentially cause a demotion. One of the things in the guidelines too is that even if you have ads with objectionable content and you’re not an adult site, it can lead to a reduction in rankings. One of the things we love to do is get real people navigating through the site and seeing where the annoying factors are. If people get annoyed trying to navigate through your site, then those are things that need to be changed for sure.
Those annoying pop-up ads that get in the way of seeing any of the page content behind them, Google refers to those as intrusive interstitials.
The problem is that those things convert well in some cases. I look at some of them and go, “How would anybody ever click on one of these?” There’s a reason why marketers use them because often they’ll have a higher conversion rate or they’ll make more money for the business. Often when we do our site reviews, we have this real argument between the SEO team and the monetization team because a lot of monetization methods can hurt you in Google’s algorithms.
You mentioned that you do site quality reviews. You had mentioned that you had just done one. What’s involved in a site quality review? How much does it cost? What’s it look like?
Back in 2012 when Penguin first came out, I was a veterinarian. I was interested in SEO. I did it as a hobby. People started asking me because I was in forums and I would talk about my thoughts on Penguin. People asked me for a site review and I said, “No, I’m not a marketer. I’m not going to take your money for a review.” I created this report, and all it did at the time was look at your analytics and tell you whether you had been hit by Penguin or Panda. There was no recovery advice or anything. I charged $89 for this report. Throughout the years, I’ve been improving upon that report dramatically. Now we offer a combination of things: it includes a site audit, so while we don’t do an entire technical audit, we’re looking for anything that could be a problem area.
We have a huge component that looks at the Quality Raters’ Guidelines, so it asks, “Do you have E-A-T compared to your competitors? Is there something you could be doing to be getting better press? Are there trust issues that maybe you haven’t noticed?” We also look at link quality. The final thing is looking at how well you serve users as compared to the sites that are ranking well. The whole report takes my team and me one to two weeks to complete. They’re very thorough. Our current pricing is $4,000 US. I say current pricing because we have a large waiting list and so we may end up bumping up the price depending on what the demand is.
I was a solo consultant and now we have ten people on the payroll. I’m training all of them to do site quality reviews. I’m still heavily involved in every review that we do. Our goal is to have a whole team of people who are really good at assessing site quality. I can’t say that every site we review has seen improvements. In some cases, there are sites that will never rank again. Maybe they existed on tricks or tactics and Google’s closed those loopholes. A good number of the sites that we’ve reviewed have seen really nice improvements. The process that we have is almost completely manual. We use a few tools, but we do a lot of assessments based on, “Here’s what Google has said.” Every week, we comb through everything a Google employee has said, plus Google blog posts and help forum threads, trying to get little hints like, “Google hinted that this could be a quality issue,” and so we’ll add that into the report.
You started off as a veterinarian and you’re not doing that anymore at all. This is 100% of your focus. The SEO world is your world now. What was it that got you into SEO in the first place? It seems like a strange hobby to have for a veterinarian in your off-hours. How did that happen?
My poor dad, every time I see him he’s like, “Are you ever going to be a vet again?” because I work hard. I was a good veterinarian. I was one of the vets for Stephen Harper when he was Prime Minister of Canada and did a good job and I loved it. I hurt my back in 2008. While I was on bed rest I was like, “I’ve always wanted to learn how to make a website,” and this was before WordPress was popular. I think WordPress was around, but it wasn’t widespread yet. I learned HTML and CSS and I built this website where people could ask me a veterinary question. I’d have all these clients that would come in and say, “I read on Yahoo Answers that I should give garlic for fleas,” or something that was bad advice. I thought, “I’d like to get good advice on the internet.”
I created this website. I was getting 30 people a day coming to it. I was trying to figure out, “How do I get more visitors from Google?” My first forum post, in the SEO Chat Forums years ago, was something about keyword density. I wanted to know how many times to have the phrase “Ask a vet” on my home page. I learned how to improve my title tags and how to create content that people want to engage with. At its peak, this site was getting about 20,000 visitors a day. It’s sadly neglected now. It’s one of those things that I need to pick up again. The embarrassing thing is it’s been hit by quality updates because I haven’t kept it updated. That was how I got interested in SEO.
I hung out in the SEO Chat Forums, sometimes many hours a day. In 2012, I was pregnant with our second child and on bed rest again. That’s when Penguin came out. I was so obsessed with it that it was all I could think about. I would have dreams at night. I had a dream once that Matt Cutts called me into his office and said, “We’d like you to announce the next Penguin update.” Normal people don’t have dreams like that. When it was time for me to go back to work as a veterinarian, I was already doing these site quality reviews. They were basic compared to what we do now. I started getting into Google penalty work and became known as an expert there. By the time I was due to go back to work, I was making more money working from home. I was home with my infant daughter at the time. It blows my mind that I can do things like this, that people want to interview me and talk about SEO. It’s a pretty cool story. I’m still licensed as a veterinarian, but I think I’m in this profession for the long run.
Penalty work, what did that entail? Were you doing outreach to toxic websites that were linking to your clients, asking them to remove the link? Were you doing disavows? Were you creating a response for your clients who got hit with manual actions for reconsideration request? What were you doing as far as penalty work?
All of that. There was a guy in the Chat Forums who messaged me and said, “Can I pay you to remove a link-related penalty for my site?” He had a penalty for unnatural links. I said, “I’m a vet. I’m not an SEO. Maybe I can recommend somebody for you from the forums.” He came up with this idea; he said, “Why don’t you do it and I’ll pay you $300 if you succeed?” I thought, “That’s a cool challenge.” I went through the whole process of manually reviewing the entire site’s links and figuring out which links Google had an issue with. It was hard to get this penalty lifted because at the time, there was nothing written about how to do it. We got the penalty lifted. He paid me the $300. I started to write articles along the lines of, “Here’s how I got a penalty lifted,” and I was published on Moz a couple of times. That got me more business. I started to charge more for the whole process. At one point, I got twenty of my friends from a church I was going to at the time and said, “Come to the house and I’ll pay you $20 an hour. I’ll teach you how to audit links.” We spent all day long auditing links and contacting site owners for link removal. I’m proud to say I have a 100% success rate in getting unnatural link penalties removed, and some of them were really challenging.
We don’t do a whole lot of that these days. We do the odd one, but Google’s dealing with most things algorithmically. A couple of years into doing penalty work, I could see the writing on the wall that the manual actions were getting fewer and fewer. That’s when I stepped up my game in terms of being able to assess site quality. Penalties were my in to the industry. I think it’s what gave me the A in E-A-T. Most people, if you mention my name, if they know me in SEO they know me for penalty work. It was fun times. I thoroughly enjoyed link audits. I’m sad we don’t do as many of them these days.
Did you utilize Link Detox as part of that link auditing process? Do you utilize now the tool from Link Research Tools?
I’m not a fan of automated link auditing tools. I’ve used a couple of them initially to help organize my link audit spreadsheets. What I decided to do was create my own link auditing tool. I spent an entire summer a few years ago trying to figure out how to program this. What I realized was that you cannot algorithmically determine link quality. I have a blacklist that I created: every time I do a disavow, if there are sites I come across where I’m like, “I would always want to disavow this site,” then I put them in my blacklist. You can check a domain against the blacklist at MarieHaynes.com/blacklist. It will tell you whether I’ve disavowed it in the past.
What I found was that I could not algorithmically do as good a job as I would by manually reviewing links. I used that software to create my own link spreadsheet. What we do is manually look at one link from every single site that’s linking to the site. Some of them we can eliminate pretty quickly when we see patterns like, “All these spam sites linked with this particular keyword,” so we can get through those fast, but we do it manually. It’s a lot of work and it takes an awful lot of time. That way I can feel comfortable that I’m not making mistakes and disavowing good domains, because the automated tools will make recommendations that are way off. If you know what you’re doing, the automated tools can be a good adjunct to help you. I get frustrated with some of them because I’ve seen people say, “I ran it through this tool and I filed that disavow,” and they’ve disavowed really good links. It’s one of my pet peeves.
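The triage described here, quickly eliminating known-bad domains so that human review time goes to the ambiguous ones, could be sketched roughly as follows. The blacklist contents and domain names are made up for illustration; in the workflow described above, everything not already on the list would still get a hand check.

```python
# Hypothetical always-disavow blacklist, e.g. loaded from past audits
blacklist = {"spamdirectory.example", "linkfarm.example"}

def triage_links(linking_domains, blacklist):
    """Split a link profile into domains already known to be bad
    and domains that still need a manual look."""
    known_bad = [d for d in linking_domains if d.lower() in blacklist]
    needs_review = [d for d in linking_domains if d.lower() not in blacklist]
    return known_bad, needs_review

bad, review = triage_links(
    ["spamdirectory.example", "nytimes.com", "localblog.example"],
    blacklist,
)
# bad contains only the previously-disavowed domain;
# everything else goes to the manual-review pile
```

The point of the sketch is the split itself: automation only pre-sorts, and the final disavow decision stays with a human reviewer.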
I would recommend you check out Link Detox because it’s a good tool and it’s not all automated. It allows you to scale across a large link profile, which is super hard to do fully manually. For example, there’s a Link Detox Screener as part of the tool that allows you to hand-check links quickly. The linking pages that are identified as probably toxic are flagged with a tag like TOX1 or SUSP3, along with the reason. Each one means something different. TOX1 means, “This page has been de-indexed from Google or is not indexed in Google.” TOX2 means, “The site has been identified as dangerous; it’s got malware or other malicious software installed on it.” You can still hand-check all of them, but you get that information.
I’ve certainly seen the reports. For somebody who knows what they’re doing and is using them as an adjunct to a manual review, they can be helpful. I’ve seen case after case of people who relied on them too heavily. The software that I created does pretty much the same thing. We crawl through the links and put together reasoning like, “We think this is unnatural because it’s got this factor and this factor,” but then we still want to manually check that as well. Whether we still need to be doing heavy link auditing is a whole other topic. We do some, but only for sites that have been heavily involved in manipulative link building. I think that in the vast majority of cases now, Google is ignoring those unnatural links. That’s a controversial subject as well.
My opinion is that you can still get damaged by a competitor or somebody who has targeted you by doing negative SEO against you and building low-quality links. There’s another strategy they’re utilizing, which is even more evil than just buying a bunch of low-quality links to point at you. They will pretend to be you, contact legitimate site owners, and insist that the link to your site be removed, threatening to report the site owner to Google if they don’t comply. It’s actually the competitor writing, and it’s a legitimate link.
I think some of that stuff is criminal activity. There are always going to be black hat, unethical people out there. Can spammy links hurt you? We’re going to be debating that for a long time. I know Gary Illyes from Google shared with me that there are algorithms outside of Penguin that look at link quality. I believe that if you’ve really been active in building unnatural links, Google can look at your entire link profile and say, “Even with these good links, we can’t tell what’s good and what’s bad.” In cases like that, we’ve seen some uplift after filing a disavow. It’s tricky because for a lot of the sites that we file disavows for, we’ve also made tons of other improvements. It’s often hard to know, did the disavow help? We’ve got a couple of cases where we filed a thorough disavow and then within a few weeks started seeing a nice increase in rankings. It is something that some sites still need to do. If you’re the average small business owner who’s never paid for SEO link building and you’re seeing weird spammy links pointing to your site, you probably can ignore those.
If you have a Link Detox type of tool that you can run to see if you have a problem, then you can get some expertise, some help, to ascertain what level of toxicity you have. I hate to have readers assume they don’t have a problem just because they haven’t been ranking at the top already. Maybe they don’t know that anybody has done anything nefarious to their website, or that a previous SEO crossed the line and did some stuff to get results quickly. More information is better. For our readers who want to learn more SEO, where would you suggest they start? You started with forums like SEO Chat and reading different websites and blogs. Where are the best places to go these days?
I find that a lot of the forums that used to be good aren’t high quality now. There’s a lot of spam in them. One of the forums that I like is the Local Search Forum. It was purchased by Joy Hawkins, who’s well-known in the local space. If you’re a small business that has general questions about SEO, there are good practitioners in that forum who do local SEO day in and day out. That’s a good place to ask questions. In terms of staying up on advice, there’s The SEM Post by Jennifer Slegg and also Search Engine Roundtable. There’s a lot of info there. Barry Schwartz will write probably four or five posts a day on the latest things that Google has done. One of the best ways to learn SEO is to hang out in the Google Help Hangouts that John Mueller from Google does, approximately once a week. He’ll do a Hangout where you can ask any question. We’ve learned tons of stuff. Every week when he does it, I have one of my staff members transcribe the meeting, and then we talk about, “What did we learn from this?” and there’s always something.
I also have a newsletter. You can go to MarieHaynes.com/newsletter. There are two versions. There’s a free version which tells you, “Here are all the things that Google announced or changed publicly that you need to know about.” Then there’s a paid version that also curates good tips. We’re scouring Twitter all day for somebody who said, “We tried this experiment and this worked,” plus things we’ve gleaned from Help Hangouts and so on. That’s in the paid version. If you want to learn SEO, by far the best way is to get your own site and start trying to rank it. I see many people who want to offer SEO services and they’ve never ranked a site before. That’s how I learned: I created a site and then I figured out how to get it to rank. By far, that was the best learning that I could do.
If you don’t want to learn this yourself, you can delegate it to somebody else and have them learn it, or bring in an expert who has already done all that learning. But you still need at least some level of expertise to ask the right questions and make sure you don’t end up getting snookered by somebody who talks a good game but doesn’t have the chops. There are trick questions you can work into the interview. I created an SEO BS Detector, a free PDF download. It has a bunch of trick questions that have only one right answer. If they answer wrong, then you know to show them the door. There are even things that aren’t necessarily trick questions but are difficult for somebody who doesn’t know what they’re talking about to answer, things like, “Tell me the difference between Penguin and Panda.” If they get it wrong and say, “Panda’s the link one and Penguin’s the content one,” then it’s, “That is so basic and you got that wrong. Goodbye.”
It’s frustrating to me that there are many SEO companies out there that are good at presenting themselves as experts, but a lot of them are relying on tactics that don’t work anymore. One of the things that I recommend if you’re trying to hire an SEO company, if you’re serious about it, is asking for references. Most SEO companies, if they’ve done good work, have a client base of people who are willing to be a reference. With that in mind, I would only do that if you’re serious. If everybody who wanted to work with me asked me for a reference, that would take a long time. There are many people charging good money for a service that doesn’t work anymore. There are some really smart, good SEOs out there as well. It’s knowing which questions to ask.
Your website MarieHaynes.com is where people would go for the newsletter, for the blacklist, for the checklist, for the blog post about the August 1st update. All of this amazing stuff, all these great resources all at that website. If people wanted to reach out to you directly on a social platform, what’s your favorite platform?
I’m on Twitter, @Marie_Haynes.
Thank you, Marie. Thank you, readers. We’ll catch you on the next episode.
Your Checklist of Actions to Take
☑ Recognize the importance of E-A-T: Expertise, Authoritativeness and Trustworthiness. Marie says that this is a major factor in Google’s algorithms.
☑ Hire credible people who are experts in their field to boost the E-A-T of my website’s content.
☑ Be well-informed about the Google Quality Raters’ Guidelines. It’s an instruction manual for these raters who are contracted out by Google to tell them what is considered high quality in a website.
☑ Rank better by creating great content. Substantiate claims with trusted resources and up-to-date research.
☑ Understand the distinction between quality raters and Google’s webspam team. A site may rank well, but if a quality issue is found, it can be flagged by Google and lose its rankings.
☑ Make sure to read The SEM Post by Jennifer Slegg on Key Takeaways from the Latest Quality Raters’ Guidelines.
☑ Visit MarieHaynes.com/book and get access to the free checklist that Marie and her team use when evaluating sites based on the Quality Raters’ Guidelines.
☑ Check Marie’s blog post that discusses the effects of the August 1st update on YMYL, Your Money or Your Life sites.
☑ Consider utilizing Link Detox as part of your link auditing process, and check domains against Marie’s blacklist at MarieHaynes.com/blacklist.
☑ Learn more about SEO from these useful resources: Local Search Forum, The SEM Post by Jennifer Slegg, Search Engine Roundtable and Google Help Hangouts.
About Dr. Marie Haynes
Dr. Marie Haynes is completely obsessed with understanding Google’s algorithm updates. She speaks and writes regularly on the topic of Panda, Penguin, link quality and the Quality Raters’ Guidelines.