Dan, it’s so great to have you on the show.
Thank you so much for having me.
Let’s talk about content and the intersection of SEO with content. Because there are some folks that still say that SEO breaks content because you should just, from a stream of consciousness, write whatever awesome stuff comes out of your head. But now, you gotta start injecting keywords in and it’s all artificial, unnatural, and it’s made for search engines instead of for real humans. What do you tell people like that?
I’ve been positioning this by calling it writing content for a search audience. I prefer to say “a search audience” in place of SEO or SEO content. That’s because people searching Google or another search engine are actual people looking for content, and they have a certain psychology, goals, emotions, and mindset when they’re searching Google versus signing up for your email newsletter or browsing social media. That’s where I begin this argument.
I have a lot of clients say to me, “Well, you gave us a lot of great topics to write about but we don’t want it to ruin our content. We think SEO is going to make our content look stale and stuffed with keywords, etc.” But I try to reframe that as, “Well, let’s talk about creating content for the psychology and needs of a search audience,” and that’s how I begin that conversation.
That makes a lot of sense. What is the psychology of a search audience and how does it differ from the psychology of a Facebook user?
I’d say probably the number one thing across all industries and content types, B2B and B2C, is speed. That not only means how fast your page loads technically but how quickly the user can find the answer to what they’re asking. That can involve things like having really great headlines, not burying your content below the fold with big images, ads, or calls to action, and, aesthetically and UX-wise, having a website design that makes the content very easy to find, browse through, and consume. Maybe it involves a table of contents at the top or a sidebar menu with jump links. Speed is really number one.
Number two, that’s where we start to get into what type of audience, specifically, and what topic, because a health search audience is very different from, say, a food-history search audience. I just did a topic with a client that was the history of bacon. Two very different psychologies, but at the end of the day, they’re both looking to find information as quickly as possible. So the second thing is structure. Everyone wants content that is very well structured. I think that’s something very common to content that is successful in search, which is a lot different than content that’s successful on social platforms.
To your point earlier about the concern about people stuffing in keywords and manipulating the content in that way, I like to use SEO research to inform the structure of content above and beyond anything else. I’m going to give you a real example. I hardly ever create content anymore, but I’m doing some SEO work for a music company, and I’m also a musician myself, so this is an area of expertise for me. I created a piece of content centered around, “What is the best tempo for Trap hip hop music?” I put a lot of research into it, and I used my keyword research to help dictate the structure of that piece of content.
I’ll give you a link you can put in your show notes, or everybody can Google BeatStars Trap BPM (that’s BPM as in beats per minute) to see the exact piece of content that I’m talking about. You’ll see a very logical heading and document structure, which is very important for success in search, but that’s because Google is trying to go after what the audience wants. What the audience wants is, “Okay, I’m going to type Trap BPM into Google,” and that’s a user who might not know exactly the specifics or nuances of what they’re asking or what they want to find. They’re depending on Google and the content creator to deliver content structured in a way that walks them through the questions they didn’t know to ask.
I think for a user, searching Google is their starting point, and they’re almost asking Google, “Tell me what to ask next.” You see this reflected in the People Also Ask area, and that little drop-down when you click on a result on Google that says, “People also searched…” That’s Google trying to help you along your journey of learning about this topic or whatever you’re researching.
Right. In fact, can’t you structure a whole section of your site, not just the document itself based on your keyword research? An example that I’ll share is from my oldest daughter, she started doing SEO and blogging when she was 14 years old. At the time what was very popular and what she was really into was a kid’s gaming site called Neopets owned by Nickelodeon, the TV network. She would research what sort of keywords were relevant and popular related to Neopets, Neopets cheats, Neopets avatars, Neopets points, and all sorts of different things that were relevant.
Then, based on those most popular keywords, she would structure entire sections of her site; the research informed her decisions about which things to feature in her top nav and whether to create a whole page around a topic or just cover it within other documents on the site. She made this whole blog that ended up earning a fair amount of money for a kid, like $1,000 a month in passive income from Google AdSense, by doing a lot of her thinking and research upfront, so what she launched out of the gate was really strong.
So, there’s an example. Do you have an example on your side maybe it’s even the same client?
I have an example, actually, of how to fill in those topical gaps. There are a couple of new tools out there. One that I’ve been hesitant to talk about and share a lot because it’s not optimally named: it’s called the TF-IDF tool. I’m sure a lot of listeners know TF-IDF, term frequency-inverse document frequency. It’s not a ranking factor as far as we know; it’s not something Google uses. It’s a very primitive way to analyze a piece of content for the terms in it, like keyword density. It should not be looked at as a score in Google.
However, I think the mistake this tool is making is they’re positioning it like that. But with that said, this tool, what it does is it scans the Top 20 results for a keyword. I’ve input Trap BPM into this tool, and it gives you back the most commonly occurring, one, two, and three-word phrases found across all of the Top 20 documents in Google for that phrase.
If you set aside the fact that Google is not using TF-IDF, you realize this is a super powerful tool that just instantly gives you all of the topics that are commonly mentioned when you search for a particular topic. Now, you can pull out that list and not think of it as keywords but as subtopics to address. When I did this, I found producer names, song names, and co-occurring topics like Latin Trap, etc. Then I went, “Okay. Now, I know how to be inclusive and put all of these topics in there.”
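The mechanics behind a tool like that are simple enough to sketch. Here is a rough, illustrative Python version of the idea (not the actual tool), using a few made-up snippets of text in place of scraped top-20 results:

```python
# Illustrative sketch: score 1-3 word phrases across a set of documents with
# a TF-IDF-style weighting, surfacing phrases that co-occur across many of
# the top-ranking pages. The sample "documents" below are invented.
import math
import re
from collections import Counter

def ngrams(text, n):
    words = re.findall(r"[a-z0-9']+", text.lower())
    return [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]

def common_subtopics(docs, max_n=3, top_k=5):
    """Score phrases by TF-IDF summed across documents."""
    doc_freq = Counter()
    term_freqs = []
    for doc in docs:
        terms = Counter(t for n in range(1, max_n + 1) for t in ngrams(doc, n))
        term_freqs.append(terms)
        doc_freq.update(terms.keys())
    n_docs = len(docs)
    scores = Counter()
    for terms in term_freqs:
        for term, tf in terms.items():
            # Smoothed IDF so terms present in every document still score > 0.
            idf = math.log(n_docs / doc_freq[term]) + 1.0
            scores[term] += tf * idf
    return [term for term, _ in scores.most_common(top_k)]

docs = [
    "trap music usually sits around 140 bpm with hi hat rolls",
    "the tempo of trap beats is often 140 bpm or half time 70",
    "latin trap producers favor a slower tempo near 70 bpm",
]
print(common_subtopics(docs))
```

The real tools add scraping, stop-word handling, and smarter weighting on top, but the core of “surface the phrases that co-occur across top-ranking pages” is about this simple.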
Another example that I’m sure listeners can relate to a little bit more: when I was researching for another project, I put in Flint water, of course having to do with the Flint water crisis, and one of the top five terms it spat back was 9/11. It makes this topical jump, but it’s doing that by scanning the Top 20 results in Google and pulling out the co-occurring terms. A very interesting psychological realization, to connect two topics like that together.
The other tool which I just played around with a tiny bit and in fact, Rand Fishkin formerly of Moz, he tweeted this last week, it’s called INK. It’s an AI topical semantic analysis tool with something like a Hemingway app built into it. You can analyze a piece of content topically and for grammar and good sentence structure. It’s a very powerful tool and I believe you can download it for free right now for Mac and PC.
I put in the text from the content I created for this client—which by the way is already ranking on Page 1 for its term—and it was 94% topically complete. I got that score after I created that content, so I was very pleased to see that my process and what you just spoke to was actually working. I think that’s, at least, one reason I speculate that that piece of content is already ranking well in Search.
That’s pretty slick. I’ll have to check that tool out. What you’re describing reminds me of the Searchmetrics content suite, specifically the Topic Explorer and Content Editor. The Topic Explorer allows you to see topics or entities and how they relate to each other, explode out subtopics within each of those topics, and then add them to a list, which you can then work on inside the Content Editor; I’ll get to that in a moment. With the Topic Explorer, you can also slice and dice how these different topics relate to each other. There’s a distance between each of these bubbles based on how related, how semantically associated, they are, but you can also look at things like seasonality and competitiveness.
If you’re already ranking and you have a website and you put that into the tool, it will look for your rankings in relation to those topics and the keywords associated with each of them. You can also look at where in the sales funnel each of these keywords sits. Is it more at the top of the funnel or more toward the bottom, where you have purchase and even retention? Is it more of a transactional, informational, or navigational keyword? All of this is available via different tabs, and everything is color-coded based on which tab you’re in. It’s pretty darn slick.
Once you’ve selected a handful of topics that you want to cover in a content piece, you go to the Content Editor. Those topics get exploded out into must-have keywords, recommended keywords, and additional keywords. You might have three topics and 30-40 different keywords. Then as you’re writing the content, or let’s say you’re editing an existing piece and you just copy and paste it into the Editor, it does all the scoring.
Not just across the entire document to give you an SEO score, readability, and that sort of thing, but also how many occurrences you have of each of those keywords in those categories (must-have, recommended, additional, etc.). If you go over the top, it marks it red. Besides giving you the number of keyword occurrences in your document so far, it will also show you what its recommended numbers are. You might have 0 out of 8, or you might have 16 out of 4; of course, that’s red and you’ve way overdone it.
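The counting side of what such an editor does can be sketched in a few lines of Python. This is purely illustrative; the keyword ranges below are invented, whereas real tools derive them from competitor analysis:

```python
# Toy sketch of a content-editor-style check: count occurrences of each
# target keyword and flag under- or over-use against a recommended range.
import re

def keyword_report(text, targets):
    """targets maps keyword -> (min_recommended, max_recommended)."""
    report = {}
    lowered = text.lower()
    for kw, (lo, hi) in targets.items():
        count = len(re.findall(r"\b" + re.escape(kw.lower()) + r"\b", lowered))
        if count < lo:
            status = "under"   # e.g. 0 out of 8
        elif count > hi:
            status = "over"    # e.g. 16 out of 4, flagged red
        else:
            status = "ok"
        report[kw] = (count, status)
    return report

text = ("Trap BPM varies. Most trap beats sit near 140 BPM, "
        "with half-time feels at 70.")
print(keyword_report(text, {"trap": (1, 4), "bpm": (1, 3), "latin trap": (1, 2)}))
```

A report like this is just the raw signal; the editorial judgment about how to weave missing subtopics into the prose is still on the writer.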
This is really powerful because the idea of repeating keywords or related close synonyms is so 2006 SEO. You’ve got to really have a comprehensive overview of the topic. If you’re talking about a certain type of electronic music, for example, trance, and you’re not also mentioning other related things.
Like EDM, or maybe house, or acid, or whatever, these are related terms that are important. If you’re trying to sell lawnmowers and you’re just talking about lawnmowers in the document but nothing about yards, lawn clippings, lawn care, weed whackers, landscaping, and all those other keywords, nothing but lawnmowers over and over, that looks like thin content to Google. Not thin content as in 200 words, which is a pretty minimal amount of content, but content that’s repetitious and surface level.
That was a bit of a diatribe, but the point is that it’s really important for people to get what they need from a great tool that gives them insight into which topics or entities to cover. Then they need to do a comprehensive, solid job of the research and the layout: what’s the storytelling arc, and what makes this comprehensive and deep instead of thin?
I think there’s one very important thing I want to point out if people do want to look at this article. I’ll speak to how I catered to the search audience but also to this company’s existing audience, because I always run into this dichotomy when I’m working with clients: we’ll develop a topic from a search perspective, but then they all cringe when they think about sharing it with their existing audience, and there’s often a lot of friction there.
If you look at this article, the top four paragraphs, before it says, “What is the BPM of Trap?”, were all written for the existing audience: their newsletter or people who follow them on social. That content is not for search engines, although I do work the keyword into it. But the heading that says, “What is the BPM of Trap?” is where the search engine content begins.
It’s not buried too deep. I see people complaining about recipe content where somebody will write a recipe but then have nine paragraphs of some story leading up to where the actual recipe begins. A lot of users coming from search are the ones who want the recipe to start at the top of the content, but I get why recipe bloggers put in a story: they’re writing with their existing audience in mind.
You have that dichotomy, but the way I’ve tried to solve for it is by putting a quick little intro at the top that’s for the non-search audience, and then where it says, “What is the BPM of Trap?” is where the search audience content begins, because this is a key thing too: the search term is Trap BPM. What I’ve really been trying to do is just answer the question. The inverted pyramid, the don’t-bury-the-lead idea, means giving the literal answer to the question right at the top, even if you’re just summarizing the post. That’s super key, especially in this day and age of featured snippets.
That entire section below “What is the BPM of Trap?” is written to be the beginning of the content, but also, hopefully, if and when they win the featured snippet, that’s the content that should appear there as well. I want to point that structure out too because it was very intentionally designed with that in mind.
How does that pan out for you in terms of targeting featured snippets assuming that this isn’t the first time that you’ve targeted a featured snippet with a piece of content? Are you pretty successful normally in stealing the featured snippet or is that something pretty elusive?
Pretty successful. It’s not something I get into the weeds on so much because I’m usually consulting at a higher level with clients. Occasionally, we’ll focus on featured snippets, but usually what I do is set up the content for featured-snippet success. I’ll spell out that structure, and oftentimes when I go back and check, the client does have the featured snippet.
For this one, it’s structured, “What is the BPM of Trap?” Google and users just want that answer. The biggest thing that I do is put the noun in the question and then begin the answer with that noun. It’s called, “What is the BPM of Trap?” “The average tempo of Trap music is…” It’s just a very clear structure.
One thing I do once in a while is throw the content into the free version of Google’s natural language tool and look at the sentence structure. I don’t pretend to know everything about what’s going on in that tool, in that cool diagram with all those lines connecting the words together, if maybe some of the listeners have seen that, but if I step back and look, it shows me how complicated a sentence is. When structuring for a snippet, I just want to make the first sentence, the answer, as simple as possible: “This thing is this,” connected with the word “is.” In that piece of content, I’m also bolding the answer, just like Google would do in the snippet itself.
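That noun-first, “X is Y” pattern is simple enough to check mechanically. Here is a toy Python sketch of the idea, illustrative only, not anything Google or the tools mentioned here actually run:

```python
# Toy check of the snippet-friendly pattern described above: the heading
# poses "What is X?", and the first sentence of the answer leads with X
# (or its head noun) followed by a simple "is".
import re

def snippet_friendly(heading, answer):
    m = re.match(r"what (?:is|are) (?:the )?(.+?)\??$", heading.strip().lower())
    if not m:
        return False
    noun = m.group(1)
    first_sentence = re.split(r"(?<=[.!?])\s", answer.strip().lower())[0]
    # Require the noun phrase before the "is" and a simple "... is ..." shape.
    return (noun.split()[-1] in first_sentence.split(" is ")[0]
            and " is " in first_sentence)

print(snippet_friendly("What is the BPM of Trap?",
                       "The BPM of Trap music is typically around 140."))
```

A real editorial checklist would be looser than this regex, but it captures the structural habit: restate the question’s noun, then answer it with “is” in one plain sentence.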
Right. Keywords and context.
Exactly. I think that’s important to do that as well.
The natural language tool is part of Google Cloud. It’s an AI-based tool that you can pay for, but there’s a free demo where you just paste in a bit of text, like text from an article or your home page, and it does things like calculate what the topics or entities on the page are and what your salience score is for each of those topics. Do you have any recommendations around what’s a good salience score and when to freak out that you’ve gotten everything wrong when you get the results back from the tool?
Here’s how I use that tool. I made a video about this earlier in the year that a couple of SEOs criticized a little bit because, technically, the tool is supposed to be used for analyzing sets of documents, and the free version is not really a tool, it’s just a demo, even though I call it a tool. But keeping that context in mind, one way I like to use it is to put your content in there and go to the Categories tab. I did this as I was comparing two health articles: one on Healthline and one on WebMD, about green tea or something like that. The one that was ranking really well came back categorized as something like health and nutrition, but the other one, which wasn’t ranking as well, got categorized as, I think, food and beverages.
I found that very interesting, and I do this a lot, analyzing my own content or just looking at the categorization of content that ranks well. If you’re really trying to rank for something that’s more of a health and nutrition-related search, then your content should hopefully be categorized like that.
So number one, you can check the categorization, how Google sees your content. Number two, what I really like doing, getting back to making sure you cover all the related topics, is taking high-ranking articles for the search you’re trying to rank for and putting their content into the tool to see all the entities and such that show up for articles that are ranking really well. I don’t do this in a super scientific or calculated way. I’ll just put the text in there and scan the results. As a writer or content creator, I use that to be inclusive and make sure I’ve covered all the appropriate subtopics or related concepts in a given article. Those are two ways I really like to use that tool.
That’s awesome. I’ll include the video if you can send me the link afterward. I’ll include it in the show notes.
Perfect. Do you ever use Majestic and the topical Trust Flow list of categories or topics to see how well your link graph reinforces your primary topic or maybe not so much?
I haven’t really used Majestic a whole lot to be honest. My go-to link tools are AHREFs and Moz, but not Majestic but that sounds cool. That’s something maybe I’ll dig into a little bit.
I’ll include a couple of relevant articles: one relating to Majestic’s topical Trust Flow and one relating to the natural language tool from Google Cloud. What are your thoughts on E-A-T, Expertise, Authoritativeness, and Trustworthiness, and how important is it to get that right? If you’re a content writer or a website owner and you’re not really sure what this whole thing is about, and you’re not really worried about it, should you be worried?
Okay. We’re going to have to peel this apart a little bit; there’s a lot to unpack here. E-A-T comes from the Quality Raters Guidelines. I think it’s a given that you should have expertise, authority, and trust in everything you’re doing. I don’t think there’s any secret to that. But where I think SEOs have gone astray a little bit is in how they’re interpreting the Quality Raters Guidelines and what it means in relation to the algorithm and the latest core updates. I think of there as being human E-A-T and algorithmic E-A-T.
What I mean by that is the Quality Raters Guidelines spell out the things they want their human raters to look for. I think the reason they do this is that their algorithms can’t. Why else would they have 10,000 human Quality Raters around the world rate content, look for links from the Better Business Bureau or good ratings, look at an author page to determine whether that person is an expert in that thing, or do things like searching the domain name minus the site itself to find other reviews or sentiment around the web? That’s what they have the humans do: QA against what their algorithms are doing.
In my opinion, the algorithmic side of E-A-T goes back to a lot of the things we’ve been talking about: topical analysis of a piece of content. In other words, if you have a piece of content that’s supposed to be about the health benefits of green tea, Google’s machine learning is probably getting really good at figuring out what a high-quality piece of content about topic X should include. What other co-occurring words and themes? What sort of writing quality? What sentiment? What are all the things a single document should include to make it high quality and relevant? Number two is the website as a whole, and we talked about this a little bit with the trust graph of links: making sure that your website has content related to the topic you’re trying to rank for, not just that single topic itself.
I think a lot of SEOs and companies are pointing fingers at E-A-T as the reason they’ve lost rankings or traffic in the core updates. Like I said, expertise, authority, and trust have to do with how well you’re going to perform in search to a certain degree, but I don’t think the Quality Raters Guidelines form of E-A-T is directly related to algorithm updates at all.
My position or opinion on this is that all the data the army of human reviewers, the manual raters working for Google, produce is then used as fodder, as training data, for machine learning algorithms, because the more training data an algorithm has, the better its results are going to be.
Over time, I’m sure we will see machine learning algorithms that are better than humans at sniffing out the surface-level or fake content out there and determining whether something really does have expertise and credibility and so forth. That’s, I think, the bigger picture, but there seems to be, from a lot of SEOs’ perspective, this correlation between E-A-T and huge ranking drops during some of these major updates such as the Google Medic update, the Google birthday update, and some of the core updates. Would you say that correlation is real, or imagined and extrapolated a bit too much?
I’ll step back just a quick second because I think the paradigm through which people are looking at Google updates is very off. In the past, when there was a Penguin or Panda update, or any of those older updates (pre-Medic, essentially), it was generally a domain-wide de-ranking. You would look at your domain as a whole and see the traffic going up or down; Google was turning the dial up or down for your entire domain. But pretty much no site I’ve analyzed for Medic, or for the core updates since Medic, has been like that at all.
The paradigm I see is that Google is adjusting relevance and rankings for certain query spaces, and it’s not tied to any particular domain. You see sites where it looks like the entire site has gone up or down, but it’s just that 90% of their content happens to be the same type of document targeting the same type of query.
I think the first thing that SEOs and people struggling with a core update need to realize is that when Google and Danny Sullivan say there’s nothing to “fix,” I believe they really mean it. In the past, with the Penguin update, you just needed to submit a disavow or improve your link profile, or with Panda you needed to improve the quality of your content overall, or deal with thin content across the website, and everything would go back up together. But people are trying to say, “Oh, let me just fix my website. There’s some red flag that Google sees, and then all my traffic is going to go back up.” I don’t think that’s the way to approach it at all.
Instead, what I have done with clients is approach it at a page-type and query-type level, and even at a single-page or single-query level. When I go into clients’ Search Console accounts, you’ll typically see a handful of pages driving the topmost traffic. If a few of those happen to go down because Google adjusted relevance and rankings for that set of keywords, it’s going to look like your whole site is going down. But really, Google is just adjusting the relevance and rankings for a few pieces of content, which makes it look like your whole site is going down when it’s not.
That’s a great point. You need to go into tools like Google Search Console and tease apart how much of a traffic drop in terms of clicks a particular piece of content or section of a site contributed to that traffic loss.
I’ve been wanting to share more about this or blog about it, but I think it’s a mistake for SEOs to be calling sites winners and losers in core updates and to be sharing screenshots that only show visibility. Traffic reports are bad enough, but visibility reports are even worse because they don’t show much of anything. I think it’s a mistake to share an overall traffic report and then call a site a winner or loser, because that’s not what’s really happening. For pretty much every site that “lost” in a Google update, if you go into Search Console, they actually had some content that gained traffic. It’s just that they lost more than they gained.
Here’s the issue I see with visibility reports. I remember back in March, there was an update, and one of the visibility tools showed a whole list of sites that had “lost” traffic, and one of them was a client of mine, one of the top publisher sites, whose internal Search Console I had access to. I went to their Search Console, and they had not lost 1% of traffic; all they had lost was impressions. Of course, impressions feed into these tools because that’s what the tools look at for visibility. A tool sees, “Oh, you lost 10,000 impressions for this one random keyword that you weren’t getting clicks for,” where maybe you were at the bottom of Page 1 or something. They had lost impressions like that across thousands of keywords, so it looked like they had lost 60% of their visibility when, in fact, they had not lost one click at all. I think that’s the danger of relying on visibility reports.
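The check described here is easy to reproduce against a per-query Search Console export. A minimal Python sketch, with invented numbers and data layout, that separates real click losses from impression-only “visibility” losses:

```python
# Minimal sketch: given per-query (clicks, impressions) for two periods,
# split queries into genuine click losses vs. impression-only declines.
# The data shape and figures below are invented for illustration.
def diagnose(before, after):
    """before/after: {query: (clicks, impressions)}."""
    lost_clicks, lost_impressions_only = [], []
    for q in before:
        b_clicks, b_imps = before[q]
        a_clicks, a_imps = after.get(q, (0, 0))
        if a_clicks < b_clicks:
            lost_clicks.append(q)
        elif a_imps < b_imps:
            lost_impressions_only.append(q)
    return lost_clicks, lost_impressions_only

before = {"trap bpm": (120, 2000), "flint water": (0, 10000), "green tea": (300, 5000)}
after  = {"trap bpm": (125, 1900), "flint water": (0, 1000),  "green tea": (150, 4000)}
clicks_down, imps_only = diagnose(before, after)
print(clicks_down)
print(imps_only)
```

In this toy data, only “green tea” is a real traffic loss; “flint water” lost 9,000 impressions without a single click lost, and “trap bpm” actually gained clicks while shedding impressions, which is exactly the pattern that makes visibility tools cry wolf.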
Good point. That’s a great little analysis you did to confirm you were still in the green zone even though supposedly a big loss had happened. Are you using any kind of competitive intelligence tools to see how your competitors are doing, since you don’t have access to everybody’s internal Search Console and Google Analytics and so forth? Are you relying on SimilarWeb, AHREFs, SEMrush, or all of the above?
I’m a big SEMrush user for the competitive reports. But the one thing I really like doing in any of these competitive reports is not just relying on the overall visibility chart but diving into the pages report or the positions report to get a sense of, “Okay, if it says they lost traffic, what specific keywords, keyword types, or content types did they gain or lose traffic from?” I’m essentially trying to do the same thing I do in Search Console, obviously without access to that data. The one thing I like about AHREFs is that their keyword report will show you a ranking history of, I believe, the Top 5 or Top 10 positions over a period of time, so you can actually see how the different sites ranking 1-5 trended over time.
You can see that in SEMrush too. There’s a little greater-than symbol in front of each keyword in the organic research tab when you’re looking at the report of all the keywords; you click on that, it expands that particular keyword, and you can see the last 12 months of rankings for it.
Both those tools are handy. One thing I want to circle back to: if anyone is trying to recover from a core update or just make sense of it, I see a lot of companies skipping over the basics of the web and SEO. What I mean by that is, here’s a little trick you can do: install the Web Developer plugin for Chrome or Firefox, turn off CSS, and just look at your document as plain text. Going back to what we talked about earlier, if your document is not structured well, and your site is not structured well with internal linking to be relevant for that search, that’s where you need to begin.
You go into Search Console, you look at where you lost traffic and from what queries, you find a particular keyword where you lost a lot of traffic, and you search it in Google. Look at the other content there. Look at your content with CSS turned off, and you can look at your competitors’ content with CSS turned off as well. Look at your headings, look at structure, throw it into the Hemingway app or some other tool to check the grammar and whether it’s easy to read. If your content, just the plain text, is not well created and well structured to begin with, that’s where you need to begin, because no amount of adding author links or ratings is going to help if the content itself is not good to begin with.
If somebody wants to get a sense of what these different core updates mean (the March update versus the June update, and some of the updates from last year and so forth), what would you tell somebody who wants to know, “What does this June update mean? What does the March update mean? Was that one about E-A-T, and is this one about E-A-T as well?” It’s all very opaque when you just look at the tweets from Danny Sullivan and Gary Illyes and so forth, but we try to make meaning out of this and create our own stories around these updates. And there are a lot of updates that happen that we don’t even notice.
For the core updates, I would always begin by looking at whether they are relevance updates. The punchline, for me, is that in many of the updates I’ve analyzed, relevance is the top thing. Of course, that’s very broad as well, but a talk I like to share a lot is one that Paul Haahr, a ranking engineer at Google, gave at SMX Advanced 2016.
I loved that. I was there in person.
That’s amazing. In that talk, he mentioned relevance as Google’s top-line metric, the one they measure the success of their search engine on. Let’s take it back for a second. A core update is a core update. E-A-T is an element of the algorithm, but when you think about just the core of Google’s algorithm, what do you have? You have a search engine that tries to rank relevant, authoritative, quality documents for a query.
I would urge everyone, if you’re trying to make sense of the core updates, to use your own data and your own analysis. Go to Search Console, look at the keyword spaces or types of content that have lost traffic, and then start by asking, “Are you truly the most relevant result?” I was doing this for another client; I forget the topic space, but I went in and looked at a keyword they were ranking for that they weren’t really relevant for. They just happened to be sitting at position two or three in Google for a little while, out of dumb luck rather than because they should have been ranking there.
Very often, I’ve seen a lot of sites and site owners complaining about losing traffic, but when I’ve gone in and looked at the keywords they’ve lost traffic on, their document is not really the most relevant result for that query in many cases. I would begin with relevance and then work toward domain authority or link authority, because I’ve seen that as a huge factor as well.
If I had to rank them, I’d say relevance, then your overall site authority, and then your site’s topical authority. Maybe that SearchMetrics tool is something you can use, but I notice a lot of niche websites with a lower domain authority ranking really well, even for high-volume stuff, because they’re more topical experts and probably have more topical authority around those things.
What sort of metrics do you like to use specifically? When you say domain authority, are you referring to DA from Moz?
I use that as a general gauge. I just like to know in general, “Is a website a 40 or an 80?” I’m just looking for a ballpark because, granted, not all link metrics are great, and certainly with Moz, there’s really no difference between a 79, an 80, or even an 83 or something like that, so I’m just looking for ballpark domain authority.
For Moz, they also have MozRank and MozTrust which are approximations of page rank and trust rank. Are you paying any attention to those metrics as well?
Not really. I’d say the second thing that I do after domain authority is try to gauge the topical authority of a site. There is a site called Business Chron. I’ll get you the link, but my point with that site is that this is a website that writes about everything from business to cats. The topics are all over the place, yet they have a domain authority of 90. If I see them ranking for something, I know that I can outrank them with almost any other website because their topics are all over the place. They are the Walmart of websites. It’s an “everything for everybody” sort of thing.
What’s the expression, “Jack of all trades, master of none.”
Totally. That’s exactly what that website is, and it’s ugly looking, etc. That’s an example of a site where you look at the number and you have a high domain authority, but topically, you’re not really an expert in anything. That’s the other thing I do: a really quick gut check, then diving deeper by getting a sense of how much of an expert a site is in its topic. You can crawl the site, or do a site: search with intitle: for the topic or the general keyword, and get a sense of how much content they have around that topic. Or you can go to SEMrush and see how many keywords they rank for around that topic. There are a lot of ways, but it’s just trying to get a general sense of how authoritative they are on a given subject.
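[Editor’s note: The gut check Dan describes uses Google’s real advanced operators, site: and intitle:. A minimal sketch of how you might assemble those query strings for a quick topical-authority check—the function name and domain are illustrative, not from any tool:]

```python
def topical_queries(domain: str, topic: str) -> dict:
    """Build Google advanced-operator queries for a quick
    topical-authority gut check on a domain."""
    return {
        # Roughly how many indexed pages mention the topic at all?
        "indexed_pages": f'site:{domain} "{topic}"',
        # How many pages target the topic in the title tag?
        "title_pages": f"site:{domain} intitle:{topic}",
    }

queries = topical_queries("example.com", "creatine")
print(queries["indexed_pages"])  # site:example.com "creatine"
print(queries["title_pages"])    # site:example.com intitle:creatine
```

Paste each string into Google manually and compare result counts across competing sites; a deep content hub will show far more intitle: matches than a “Walmart of websites” that mentions the topic in passing.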
You’re not relying on any particular metric for that, like Majestic’s Topical Trust Flow, or anything inside of, let’s say, linkresearchtools.com or anything like that.
Got it. I love that you are using intitle:. People don’t use that enough. I have a whole book all about Google advanced query operators and things like that; it’s called Google Power Search. It’s a topic near and dear to my heart.
What do you think about bias? Is there an inherent bias in Google’s algorithms? One thing I’m thinking of in particular is this thing that Google internally refers to as ML Fairness—Machine Learning Fairness. There’s some stuff out there about this. There are some secretly recorded discussions with Google executives and things like that. There’s definitely some bias in there that gets worked into the algorithm, maybe not intentionally, but then there’s also manipulation that happens after the fact to skew things in one direction or another.
I just heard a talk from an ex-Googler, Kaspar Szymanski, who is essentially a whistleblower. This was a pretty fascinating talk that I attended. Anyway, I might actually have him on the show to talk about these manipulations, how they work, and why Google is doing it. But what’s your take on this whole idea of bias in the algorithm? These holistic health folks have said that they are being targeted unfairly—holistic, homeopathy, alternative therapy websites like Dr. Mercola, Dr. Axe, and so forth. They’ve taken a real hit from some of these more recent updates. I’d love to hear your thoughts on that.
Sure. I’ll circle back to the health thing in a moment. One site that I have intimate knowledge of is examine.com; that’s one of the health sites you might be referring to. I’m a little out of my league when it comes to machine learning, but from my personal, anecdotal standpoint, my more-than-casual knowledge of search engines, and just what I’m seeing in Google, I believe there’s some bias.
I’ll do a search for something that I’m very familiar with. Just last night, I was searching for “top rappers 2019.” As you can see, I try to stay on top of hip-hop. I’ll look at the results and I’ll be like, “Yeah, there are some people in there who are not even very well-known anymore.” You look at some results in Google, and if you’re a topical expert, you’re like, “I don’t know about those results. I think some people are jumping to the top of the list just because they are popular for some reason, but they don’t really match that search.”
I do believe there’s some bias, but I think when it comes to the health sites in particular—and I’ll speak about examine.com in a moment—first and foremost, the algorithm is looking for relevance to the topic and just how well the content is structured to fit that. With examine.com, I’ll get specific. Just full transparency and disclaimer: Sol Orwell, who’s one of the owners there, and I have talked at length, and I’m allowed to talk about this, but I help them with SEO.
I did a three-month project with them a couple of years ago, or maybe 19 months ago. At that time, they were already having trouble with rankings, with the algorithm, and with SEO, but when you look at their set of content—this is one of the sites I was referring to—90% of their content is just this one type of document targeted at one-word supplement names. Creatine, for example, is one they lost a lot of traffic on.
When you look at that, number one, that’s not a great website strategy—to put all your eggs in one basket, with just one type of content that you want to rank for these super-competitive, high-volume keywords. But then you couple that with, when you actually look at their content—going back to structure—it’s not structured in the right way to rank for those things for a search audience. Commonly, when you’re searching for a supplement—you spoke to this surely with the video game example—you’ll see things like creatine benefits, pros and cons, ingredients, uses, nutrition facts; the list goes on and on of these subtopics that always come after these supplement terms in Google’s search suggestions.
When you look at the content, it’s really good, it’s very well-researched, but it’s not structured for the search audience. It’s written, I believe, for somebody who wants deeper research, but it doesn’t give that start-at-square-one approach to those supplement topics that, I believe, users want. That’s unfortunately why sites like WebMD or Healthline will do better in search—not only because they have higher domain authority, but because they structure the content for that. Unfortunately, it’s a watered-down version of a medical topic. Most people, when they’re searching Google for creatine benefits, don’t want 6000 words with the most in-depth research cited. They just want their answers summarized on a site that they trust.
Would you say that this relates to the customer journey and whether they’re in their research mode or buying mode? They’ve already bought the product versus they want to collect information, do some research, maybe not even to buy the product at all but to write a blog post about it.
I think the customer journey is exactly the right way to frame that. Examine.com is great for people that are doing a certain type of research around those supplements but it’s not great for that person that just wants to know, “What is creatine? Give me a basic overview of how to use it,” etc.
On the flip side, I do try to question my assumptions. I worked with Sol on Examine; we did a project. I did an audit and gave them some strategies, not all of which—or even many of which—they implemented, which is, I believe, why they’re still not doing well in search, but I don’t have intimate knowledge of the site anymore. One thing I will say is that I was doing some searches where I was trying to see if there is some ranking suppression happening on Examine. One odd thing that I found is, I took the first sentence of an article of theirs and put it in Google without quotes. They weren’t ranking number one for that.
On the other hand, even though I believe they need to begin with restructuring their content if they really want to do well, it does look like there’s some sort of potential ranking suppression happening with that site. You take the first sentence—the exact copy—and throw it in Google; you should rank for that. Not only are they not ranking, but other websites are getting the featured snippets and things like that, and it’s something I literally only spent 10 minutes on. Obviously, it’s tough to really dig in deep without actually doing a project on something.
Did they rank when putting quotes around the entire phrase?
They did. But they’ve got all kinds of sites scraping their content—even supplement eCommerce sites scraping their content and putting it in their product descriptions. Maybe they have an issue on that side of things. One thing that Paul Haahr talked about was how Google tries to filter out duplicate content. He mentioned a separate algorithm that runs after the main algorithm, which I think is very important for anybody doing SEO to realize: Google has their main algorithm, and then they have filtering algorithms that come after it. One reason I try to do searches like that is to trick Google. I try to trip up the algorithm so it gives you something where it’s kind of breaking, and you’re seeing a little bit behind the curtain of what might be going wrong in Google.
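[Editor’s note: Dan’s 10-minute suppression check can be written down as a repeatable procedure. A sketch, assuming all you need are the two query variants (unquoted vs. exact-match quoted) built from a page’s opening sentence—the function name is illustrative:]

```python
def suppression_check_queries(first_sentence: str) -> tuple:
    """Return (unquoted, quoted) Google queries built from a page's
    opening sentence. If the quoted search ranks the page but the
    unquoted one surfaces scrapers or other sites instead, a
    post-ranking filter (e.g., duplicate filtering) may be at work."""
    cleaned = " ".join(first_sentence.split())  # normalize whitespace
    return cleaned, f'"{cleaned}"'

unquoted, quoted = suppression_check_queries(
    "Creatine is  one of the most researched supplements."
)
print(unquoted)  # Creatine is one of the most researched supplements.
print(quoted)    # "Creatine is one of the most researched supplements."
```

Run both by hand in Google and compare who ranks and who gets the featured snippet; the difference between the two result sets is the “behind the curtain” signal Dan is describing.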
You must get the little are-you-a-robot question all the time. I get that.
I get that often. There might be bias. Unfortunately, when you step back as an SEO and you look at WebMD versus Examine from a pure domain authority standpoint, WebMD is going to win. You look at Examine’s link profile, and it hasn’t grown a lot, or in a very great way, in the last few years. In fact, I had Sol on my podcast a couple of years ago—that’s how we first met and connected—and the literal title of that episode was How to Rank Above Wikipedia.
Examine was outranking Wikipedia and WebMD for a while, but I think these relevance algorithms and maybe some of these authority things kicked in, and that’s why they’re not ranking as well. Or Google is spending time collecting machine learning data on user behavior—they rank a website, look at the user metrics, and if those metrics don’t come back good, they’ll de-rank that website after some time.
You mentioned they haven’t been accumulating much link authority lately, so maybe a slowed-down link velocity trend is contributing as well. Maybe it was pretty good a few years ago, but they just haven’t kept up with the competition in terms of link velocity. There’s a tool inside of linkresearchtools.com, or a metric called LVT, which gives you a sense of whether the link velocity is going up or down.
I’ve never mentioned this before, but I think about it a lot. I hypothetically believe that Google might have something where they look at, “If we rank you number one, two, or three, how does that contribute to your link acquisition compared to the other sites ranking one, two, or three, or on page one?” Let’s say they rank 10 results for creatine. How many of the websites that have been ranking on page one for creatine have, after a year, accumulated X amount of links, and from where? Because I look at ranking in Google as itself a strategy for link acquisition.
If you rank for what I would call a high-intent-to-link search—statistics searches are great for that, like “marketing statistics”—and Google sees you’ve ranked for marketing statistics for a year but haven’t gotten many backlinks from it, maybe they’re going to de-rank you. Maybe that’s a correlated metric they look at.
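[Editor’s note: Dan is explicit that this is a hypothesis, not a confirmed Google signal. Still, the quantity he describes—links accumulated over time while a page holds a ranking—is easy to compute from your own backlink exports. A sketch with made-up numbers; the function and data are illustrative, not any tool’s API:]

```python
def link_velocity(snapshots: list) -> float:
    """Average net links gained per interval, given a time-ordered
    list of referring-domain counts (e.g., monthly exports from a
    backlink tool). A hypothetical metric, per the discussion above."""
    if len(snapshots) < 2:
        return 0.0  # need at least two snapshots to measure change
    gains = [b - a for a, b in zip(snapshots, snapshots[1:])]
    return sum(gains) / len(gains)

# Made-up monthly referring-domain counts for a page ranking on page one:
print(link_velocity([120, 125, 126, 126]))  # 2.0 new domains/month on average
```

A flat or declining trend for a page that has held page one for a year would be exactly the “ranked but earned no links” pattern Dan speculates Google could correlate against.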
Going back to the algorithm, you have to think about things Google can easily track without noise. Paul Haahr mentioned in his talk, and Gary Illyes too—they all talk about how it’s hard to track something like clickthrough rate directly because it’s noisy. But I wouldn’t be surprised if they’re looking at how well you have ranked and whether that contributed to links and linking behavior, like, “Okay, we saw you ranking number one in Google. We looked at your page, and we trust you enough to link to you.” I think it’s a very interesting thing that I, or somebody, should explore further.
Interesting. Just to clarify on this thing of clickthrough rate: Google is tracking all these listings; the search results have click-tracked URLs. They know if somebody is pogo-sticking—jumping right back from a particular result to the SERP and then choosing another listing. What is apparently pretty noisy is whether that is a good indicator of the quality or relevance of the document, because folks do manipulate it, by getting people, probably in Third World countries, to click on stuff to try to make it artificially rise in the search results.
I pogo-stick all the time, often with great results. Every user out there probably does. I mean, that’s what’s noisy about it. Because if you Google something like “marketing strategies,” I’m going to click on almost every single one of those, open them all up in tabs, and look through each one for ideas. That’s something I’m looking for.
Right. You might add something to your Pocket. I use Pocket all the time, I love it, getpocket.com. You just think, “I’m going to read that later.” I tag it to read and now, I’m off to my next search or checking another search listing. There’s a lot of noise in there.
Now, one thing I want to circle back on that we touched on very briefly—and this relates to the customer journey—is something that I first heard from Frank Kern. He talked about how he had this guitar site where you learn how to play the guitar, and he would give people results in advance—that’s specifically how he termed it—by teaching people how to play the F chord, which apparently is pretty hard. I don’t play the guitar.
It’s hard. If you can master that, and you got it for free, you have this feeling of mastery, and you feel a little indebted to the website or to the teacher, so you’re more inclined to buy the product then, which I think is brilliant. This relates, I think, to what we’re discussing about content, quality, and where people are in the buyer journey: think about what the endgame is for that particular visitor. Is it just to collect information about creatine so they can write a school report, or is it to buy a product and maybe start taking supplements? Is it to confirm that the emotional decision they made is now backed up by logic, and they already have creatine in their pantry, or what? What are your thoughts on that?
I like that a lot. In fact, one thing I always thought was interesting—I don’t know if it still says this today—is that in the past, Google’s main Webmaster docs or something said that their goal is to rank websites that are trustworthy and useful. The word useful always really stuck out, because you think of all the ways people can use content.
I think that’s a little bit of what you’re getting at there with the journey: think of all the ways people can use content. They can save it to Pocket, like you said; they can link to it; they can email it to somebody; they can share it on social media; they can set it aside and come back to it later. There are all kinds of ways people can use content that all tie into the different stages of their journey.
I also think this ties into really thinking about, when you’re ranking for a keyword, or if you want to rank for a keyword, does that actually tie into what you want people to do with regard to your business? With your example of creatine, I see this a lot, where something gets a lot of search volume and traffic, but it’s probably all college students or people in high school doing a paper. Is that really going to add to your business and your bottom line and help with what you’re trying to achieve?
That’s why I always advocate going for searches that are more specific—mid-tail or long-tail. For anybody selling a service, I always say, “Sure, take your topic.” I’m trying to think of an example—social media marketing. If that’s your topic, add a word like “service,” “agency,” “company,” or “provider” to it. Now, that’s really the search volume and the high-quality traffic that you should be aiming for. Adding more words and creating a longer keyword that’s more specific, with more intent behind it, is a really smart thing to do as well.
I’m not sure if I went a little off the path of your question, but getting back to it: I’m a huge believer in the user journey and thinking about how that can tie into search. Unfortunately, the flip side of what I said is that you can’t always tell the intent, the demographic, or the point in the journey of the user when they’re searching for a keyword. As SEOs, at the end of the day, all we have is a keyword. We have no idea who that person is, their demographics, or what they’re trying to do.
The best that you can do is try to make a really good educated guess, and then as a business, be aware of the fact that even if it’s a student searching for that, all of that search behavior, and people clicking, and visiting your website, and getting in their browser history, and maybe some people linking to you, all those things have secondary benefits that then go back into helping your site rank better overall.
And that’s a big thing I’m an advocate of, because a lot of businesses will say to me, “Well, it’s great if we rank for ‘history of bacon,’ but we’re trying to sell meat. We don’t care about people looking up just the history.” I’m like, “Actually, when you start ranking well and getting clicks and links to this piece of content about bacon, you’re going to start ranking better for all of your product pages about bacon as well.”
It’s the rising tide that lifts all boats.
Alright. Awesome. That’s a great way to end this. If folks wanted to learn more about you, potentially work with you and your agency, how do they get in touch?
I’ll point listeners first to my podcast, called Experts on the Wire. It’s probably a bit similar to this show, so some people might like to check it out. I also interview people who are doing medium to advanced SEO—everything across technical and content and things like that. I try to get, as I know you do as well, real practitioners, not just people who are SEO famous, as they say. You can check out my site, evolvingseo.com, for stuff about SEO services and things like that. I’m also easy to get a hold of on Twitter; it’s @dan_shure.
Awesome. Well, thank you so much, Dan. Thank you, listeners. Now, get out there and do something with this knowledge. This hopefully isn’t just edutainment; this is something that is going to pay off in serious ROI. Have fun, have a great week. This is your host, Stephan Spencer, signing off.
If you love this SEO focused episode, I also recommend you listen to the interview that Jay Abraham did of me about SEO, which is episode number 62.
- Dan Shure
- Experts on the Wire
- How to Rank Above Wikipedia
- Twitter – Dan Shure
- Google+ – Dan Shure
- Instagram – Dan Shure
- LinkedIn – Dan Shure
- YouTube – Evolving SEO
- TRAP BPMS: The Ultimate Guide For Beatmakers
- 5 Ways To Use Google’s Natural Language Tool for SEO (this thing is gold!)
- Google Power Search
- The Art of SEO
- Kaspar Szymanski – previous episode
- Jay Abraham – previous episode
- REALLY Understanding Topical Trust Flow
- Google’s Natural Language Processing API Tool for SEO
- WGBH in Boston
- TF-IDF tool
- Rand Fishkin
- Hemingway app
- Google Cloud
- Quality Raters Guidelines
- Better Business Bureau
- Google Search Console
- Web Developer Extension
- Chris Pederick
- Danny Sullivan
- Paul Haahr
- SMX West 2016 – How Google Works: A Google Ranking Engineer’s Story
- Business Chron
- Dr. Mercola
- Dr. Axe
- Sol Orwell
Your Checklist of Actions to Take
Write content for a specific search audience. Put myself in their shoes and list the different ways in which my audience would search for something they need answers to.
Make it as convenient as possible for my audience to consume my content. Improve page speed and offer different varieties of content, such as listicles, infographics, and videos.
For long articles, provide a table of contents with jump links so that readers don’t have to scroll to find answers.
Analyze my content and find out what the competitors are doing by utilizing an app called INK, a web content optimization tool for writers.
Categorize my published content, especially if I cover several topics. Not only will it be easier for my audience to find topics of interest, but Google’s algorithm can also find relevant content quickly.
Utilize link-analysis tools such as Majestic and Ahrefs to find out how competitive the keywords I am targeting are.
Take note of E-A-T—expertise, authoritativeness, trustworthiness—when publishing content. Read Stephan’s article to learn more about it.
Create headlines with a strong punch, but be careful not to clickbait. Headlines serve as first impressions; when the reader clicks on the link, I should deliver what was promised.
Use keyword-rich phrases naturally throughout my article so that it doesn’t seem like it was made for bots.
Check out Dan Shure’s website, Evolving SEO and tune in to his podcast, Experts on the Wire.
About Dan Shure
Dan Shure is the co-owner of Evolving SEO and the host of the Experts on the Wire podcast, which has over 400,000 downloads. He has helped companies like WGBH and Sumo with SEO.