Episode 145

JavaScript and Other Technical SEO Conundrums with Bartosz Góralewicz

 

These days, most websites are running JavaScript, yet not all JavaScript is search engine friendly. It’s very important that you understand the implications of JavaScript for technical SEO. We have one of the world’s top experts on that topic, Bartosz Góralewicz. He’s the Co-founder of Elephate, which was recently awarded Best Small SEO Agency at the 2018 European Search Awards. Bartosz leads a highly specialized team of technical SEO experts who work on the deep technical optimization of large international websites. Technical SEO isn’t just Bartosz’s job, it’s also one of his biggest passions, which is why he enjoys traveling all around the world to share his enthusiasm with like-minded SEO folks.

Bartosz, it’s great to have you on the show.

Same here, Stephan. Thank you for the invite. I heard a lot of good stuff about you and the podcast. I’m excited to be one of your guests.

I’m excited to geek out with you because you are so skilled at the technical SEO stuff, things that would just make other people’s heads spin. I don’t know if everyone is going to be able to digest everything that we talk about, but certainly their minds will be expanded by the end of this episode. I’m quite confident of that. Why don’t we start with a topic that you had just spoken about, and I know you did a bunch of speaking. Where were you? It was like three different conferences in one week, right?

Yeah, it was quite a marathon. I’m in this happy space where I’ve finished all my talks for the off season for the SEO conferences. I figured I would do these three conferences within five or six days on two continents, and after that I’m done for two and a half months or so. I did Search Elite in London. That’s actually a funny story. I got to London, and it turns out that my hotel had closed at 1:00 AM, so I had to find something else, and I did. I had someone else book a hotel for me for the first time, and it’s never a good idea; check it yourself. I woke up after three hours of sleep. I did my talk at Search Elite, this new conference in London. I finished at 1:00 PM. I had to run to the airport to catch my flight to Boston. I got to the airport on time just to find out that my flight was canceled. They said to pick another flight, any flight I could find. It was $2,000 for economy, but they were cool enough to say, “Get it, we’ll make it happen.” I had to change airports. That was the most exciting thing I’d done in a while.

It turns out that you can’t leave the airport just like that because it’s international ground. The only people who can let you out are the people from the airline you had the ticket with. I had to find the people from Norwegian to let me out of the airport with a bunch of other people, because the line to customer service was a thousand people long. I finally got out and there was this very nice guy called Jens, the CEO of some medical company in Boston. We got into a cab that cost like £200, changed airports, arrived literally one minute before the gate closed, and I got to Boston.

The next day, I went on stage and did my talk in Boston. It turns out I caught some weird stomach bug on the plane and I was sick. For three days in Boston, I lay in bed sick. I got my flight back to London for UnGagged, where I was keynoting the first day. I hadn’t finished my deck, which was very unfortunate because I was supposed to do that in Boston. I got the overnight flight, got one hour of sleep on the plane, and got to London. I met Christian from my team and we built the deck. It was very nice, very cool. I was rated the second top speaker at UnGagged, so it must have been a productive day. We did 305 slides; I usually have a lot.

I never drink coffee, but I ordered a decaf. Probably because of my accent, they didn’t understand; it’s 11:00 PM and they got me a large venti Starbucks coffee that was caffeinated instead of decaf. I never drink coffee, or rarely do. I stayed up until 3:00 AM, got on stage, and did my talk. It went very well. I’m surprised I was smiling and very energetic on stage; I have no idea how I did that. I went back to my room and slept until 4:00 PM. That was my last marathon: Search Elite in London, the SearchLove conference in Boston, and UnGagged London. Then I went home to Poland on Tuesday and on Wednesday ran a half marathon. I did all that, and now I feel like the weight is off my shoulders, so I’m very happy to jump into this podcast with you in a relaxed state.

Clearly you are in high demand, that you would do three different conferences in a five or six-day period and that they’d be so driven to get you there that they would spend an extra $2,000 on a ticket.

I guess I was a little bit lucky, because JavaScript started to be a problem for some websites around 2015. I had this question I kept going through with all the developers: Google had published a blog article saying they had stopped supporting the escaped fragment crawling scheme, and none of my friends, developers, or SEOs could explain it to me. If Google doesn’t use the escaped fragment, what does that mean for SEOs? I started investigating, and we had a client whose website was not ranking well, and they were on Angular.

Was this Angular 1 or 2? This was back in the day, so probably Angular 1.

I don’t remember, but we found out that they did it wrong. There was this little rotating wheel rendered by JavaScript: when you opened the website, while it was still loading, there would be a rotating wheel. When we removed it, the website exploded. I have the code somewhere because I saved it. It was twenty lines of code. I was like, “This is massive. That’s a problem for Google, we need to investigate.” We built this experiment, a website where every single page used a different framework. We did it out of curiosity. At the time people were like, “This dude is crazy. Who cares about JavaScript?”

When you open a website, while it was still loading, there would be a rotating wheel. When we removed it, the website exploded. Click To Tweet

For our non-geeky readers, let’s briefly describe this experiment. You had different versions of JavaScript that you were testing to see whether links would get picked up by Google, whether text would get picked up. You were using Angular 1, Angular 2, React, all these different JavaScript frameworks.

We used twelve or more different frameworks. For all of you who don’t know what JavaScript SEO is, the key problem with JavaScript SEO, or client-side JavaScript in general, is that your computer or browser has to process all the JavaScript into HTML. When you look at the source code, in many cases you don’t see the content. That’s tricky because Google, and we as SEOs, are used to looking at view source or inspect element, which were usually the same back then. We could see there’s this paragraph tag with this content, or there’s an H1 tag with this content. In the case of JavaScript, that’s not the case. You have different parts of code, but they’re not human-readable. That’s the key problem, and that’s something you need to understand about JavaScript SEO.
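To make that concrete, here is a minimal, hypothetical sketch (the page, product name, and file paths are invented for illustration) of why view source and the rendered page diverge on a client-side rendered site:

```javascript
// The raw HTML ("view source") of a hypothetical client-side-rendered page.
// Note: no product text appears anywhere; the bundled script fetches and
// injects it at runtime.
const viewSource = `
<!doctype html>
<html>
  <head><title>Product page</title></head>
  <body>
    <div id="app"></div>
    <script src="/bundle.js"></script>
  </body>
</html>`;

// What the user's browser eventually renders after bundle.js runs and
// fetches the product data:
const renderedDom = `
<div id="app">
  <h1>Blue Widget</h1>
  <p>In stock, ships tomorrow.</p>
  <a href="/widgets/red">Red Widget</a>
</div>`;

// A crawler reading only the raw HTML sees none of the content or links:
console.log(viewSource.includes('Blue Widget'));   // false
console.log(renderedDom.includes('Blue Widget'));  // true
```

The point is that the heading, copy, and internal link only exist after script execution, which is exactly what a non-rendering crawler never performs.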

What’s happening is that JavaScript could be preventing some content or some links from being accessed, and that could be preventing the entire page from getting indexed. There are all sorts of potential issues when you start using JavaScript or AJAX, which some people encounter as lazy loading or infinite scrolling. Stuff is being pulled in asynchronously, so the URL isn’t changing but the content is coming in. Some of that content may not be accessible, and if it’s not accessible to Googlebot, then it doesn’t end up in the index.

If you looked at HTML, you could imagine what you’d see on screen. It gets much more complex with JavaScript, because it’s not only JavaScript but all these frameworks. There is Angular; even Basecamp has their own JavaScript framework. Everyone started building frameworks. There was a huge boom, much like cryptocurrencies, there are so many right now. There are three key frameworks: Angular, React, and Vue. Angular is supported by Google, React is supported by Facebook, and Vue doesn’t have any corporate support.

There are hundreds of different frameworks, and it’s difficult for Google because of one key problem. Google has all the technology to process them, but it takes so much processing power. It gets tricky because we as SEOs don’t know if website X is indexable with this framework or not. That goes a little bit beyond the documentation. In 2017, Ilya Grigorik from Google said something that was a game changer: if you download Chrome 41, a two-and-a-half-year-old version of Chrome, and you open the website and it loads and you see all the content, that means Google will be able to do the same, because they use Chrome 41. It’s an old version, and they’ll work on updating it. That’s going to be a huge game changer, and not only from the JavaScript perspective.

What you’re saying is that they’re still running on a super old version of Chrome. That’s how they’re processing the fancy JavaScript stuff that you’re adding to your website. For anybody adding that functionality, it seems to work fine on all the modern browsers, but Googlebot and the indexer aren’t a normal browser; they’re an old browser.

They also simplified some of the functions of JavaScript. One interesting example: if you use a random number, the number is always the same. Google tries to make it as efficient as possible, because processing JavaScript is extremely expensive even for a giant like Google. You can see that on your laptop if you go through my experiments and examples. Even for your machine, regardless of how expensive your computer is, you can see that some of the websites will slow you down a lot. That’s how massive the problem is.
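As an illustration only (this is not Google’s actual renderer code, and the variants are invented), here is how pinning the random source collapses “randomized” content to a single variant:

```javascript
// A page that rotates headlines using a source of randomness.
function pickHeadline(randomFn) {
  const variants = ['Sale today!', 'Free shipping!', 'New arrivals!'];
  return variants[Math.floor(randomFn() * variants.length)];
}

// A normal browser passes Math.random, so users see different variants.
// A renderer that freezes the random source always returns the same value,
// so it only ever sees one variant:
const frozenRandom = () => 0.5; // stand-in for a deterministic renderer
console.log(pickHeadline(frozenRandom)); // 'Free shipping!' every time
```

This is one reason content that rotates per visit may look static, or only partially discoverable, to a deterministic rendering pipeline.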

The key is that Google is using Chrome 41 to process JavaScript. For example, if you open Hulu.com in a current browser, it will load, but to open it in Chrome 41 as mobile, you need to go into the developer tools to switch it to a mobile viewport. Hulu won’t load on mobile in Chrome 41, and content that doesn’t load is not indexed in Google. That’s huge. You can see that a big chunk of Hulu.com is not indexed in Google, because Google is crawling it mobile first and can’t see the content because of the JavaScript. That’s the moment when stuff gets real, because if you google Casual, the TV show, you will find torrent websites. This is where stuff gets exciting even for non-geeks.

I wrote two articles about Hulu, and I even tweeted them. In my last deck, I had this funny statement that I would help them for free just to fix it. This is a huge challenge for them. They were one of the first who decided to dive into JavaScript, and they’re a little bit overwhelmed by it. Netflix is doing the same thing, but Netflix is one of the thought leaders for JavaScript frameworks, for React, and they are doing amazing things with JavaScript. If everyone had the same approach toward JavaScript as Netflix, we wouldn’t have most of these problems. Even though I haven’t looked at it for a while, it’s still up to date. Netflix is very good; you can see that if you look at their visibility in Searchmetrics. For the first time in history, they took over all the visibility from Hulu, and Hulu is going down because of JavaScript.

Netflix is doing a cutting-edge, smart implementation of JavaScript as far as SEO is concerned. Hulu is doing just a tragic job of it. You have tweeted at them, you have used them as a case study of what not to do in various keynote presentations, and you haven’t heard anything from them. You even offered to help them for free and they’re still screwed up. Their traffic is heading south; it’s looking bad on the graphs in tools like Searchmetrics and SEMrush. They’re not fixing it.

MS 145 | Technical SEO
Netflix is doing cutting edge, smart implementation of JavaScript as far as SEO is concerned.

I don’t want to be their spokesperson, but we work with some fairly large brands, and sometimes it’s complex to fix once you’ve taken a step in the wrong direction. For them it was client-side rendered JavaScript; they didn’t fully understand it, and it may be difficult to take a step back. It’s been two years, and there is some major problem they’ve tried to fix. I have a lot of compassion for them as well. Looking at it from the perspective of a developer on that Hulu team, or whoever is the head of marketing, that’s also a huge challenge. There are not too many people who know how to fix it. We’re still growing as an SEO community to be better at that.

What you’re saying is that the way Hulu implemented client-side rendered JavaScript was suboptimal. They could potentially have done client-side rendering in a way that was search engine friendly; they just didn’t know any better. Or does all client-side rendered JavaScript just wreak havoc on your SEO?

It’s complicated. Let’s say Hulu contacts you tomorrow and says, “We need to fix it as soon as possible. What do we do?” A quick fix is to start prerendering. If a client contacts you and you work in an SEO agency or as a freelancer, or a boss comes to you, you can go with prerendering, which is a whole setup of headless browsers: browsers that crawl your website, build an HTML snapshot of the JavaScript, and then cache it. When Googlebot visits your page it gets HTML, and users get JavaScript. That’s a Band-Aid solution, though sometimes it’s a long-term solution. It’s something you can do even in a large-scale corporation within a few months. Within a small organization, you can do it in maybe a week or so. There are a lot of ready-to-go solutions. Google has a lot of documentation about it, and there are services like Prerender.io that try to do it for you. That would be the easiest fix for Hulu.com.

Prerendering will create an HTML version of the content that was trapped inside the JavaScript. That requires executing the JavaScript, which requires processing power, to figure out what should be happening with the links, copy, and images. The prerendering is done by headless browsers, just as Googlebot is essentially a headless browser, Google’s Chrome 41 in a mobile configuration. You run headless browsers across your JavaScript-based, client-side rendered website, and they create prerendered HTML versions of the content. That becomes the version of your page visible to Googlebot that can get indexed and ranked.
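A minimal sketch of the serving side of this idea, assuming a hypothetical in-memory snapshot cache and a simplified bot check (real setups use a headless browser such as the one described above to produce the snapshots, and a proper middleware):

```javascript
// Bots whose requests get the cached HTML snapshot instead of the JS shell.
// The pattern and cache contents here are invented for illustration.
const BOT_PATTERN = /googlebot|bingbot|baiduspider|yandex/i;

// Pretend cache of snapshots produced earlier by a headless browser run.
const snapshotCache = new Map([
  ['/widgets/blue', '<html><body><h1>Blue Widget</h1></body></html>'],
]);

// The normal client-side app shell every human visitor receives.
const appShell =
  '<html><body><div id="app"></div><script src="/bundle.js"></script></body></html>';

function serve(path, userAgent) {
  if (BOT_PATTERN.test(userAgent) && snapshotCache.has(path)) {
    return snapshotCache.get(path); // prerendered HTML for the crawler
  }
  return appShell;                  // JavaScript app for everyone else
}

console.log(serve('/widgets/blue', 'Googlebot/2.1').includes('<h1>')); // true
```

The fragility Bartosz mentions next lives exactly here: if the snapshot cache goes stale or the prerender job errors out, only the bot branch breaks, and the site still looks fine to humans.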

Recently at Google I/O, that was one of the recommended solutions, but it’s a temporary thing. We work with a lot of clients who struggle with prerendering. The problem is that once you have a website that’s prerendered for Googlebot, you’re dealing with two websites in a way, because you have one website for users and one website for Googlebot. You have to manage and optimize both of them, and that gets tricky. If your prerendering engine breaks, you will return a 404, 500, or some other status code that’s not very good, and only to Googlebot. You’re sometimes not aware of it because your website works for users. There are a lot of downsides. If your website is extremely dynamic, imagine trying to prerender Facebook or Twitter, it’s almost impossible because the content is changing all the time. I would do it for eCommerce stores that are not as dynamic, or blogs, or websites that don’t change a hundred pages per hour.

I had advised a client of mine, and you may know about this company, called Focus@Will. Imagine Spotify but for focused, flow-state type of music. If you want to get into a flow state and work at 200%, 500% of what you normally do. You know when you’re in flow, things just happen and you’re cranking. You can stay in the flow state for a longer time period if you listen to what’s referred to as streamlined music. Not regular music; this is a game changer. If you like listening to music, you’re listening to the wrong music, I guarantee you.

Focus@Will. Imagine Spotify but for focused, flow state type of music Click To Tweet

Most music that you enjoy will knock you out of a flow state instead of keeping you in it. This is a client of mine and this is the coolest technology, it’s very inexpensive to sign up for a subscription to it. They have a JavaScript-based website. I advised them that this is a disaster for SEO. They implemented prerendering. It took a long time for it to get picked up by Google, it took months. I think everyone should sign up for it.

You’re into biohacking, Stephan?

I’m big into biohacking. I have a whole podcast, probably half of it dedicated to biohacking. That’s not this show obviously, that’s The Optimized Geek, my other podcast, where I’ve had Dave Asprey on, the Bulletproof coffee guy. It is not an SEO podcast, even though it might sound like it; it’s biohacking, life hacking. I’ve had Tim Ferriss on as well, and specialists in certain areas of biohacking like EMF exposure; I had Brian Hoyer on. I’ve had Dr. Daniel Kraft from Singularity University. I just thought of the person I interviewed about a 21-day water fast, Lisa Betts-LaCroix; she was talking about quantified self, QS. We talked about keto, and that’ll be a fun episode. The key is to scale yourself, and you can’t do that by adding hours to the day. Nobody has more than 24 hours in a day, but what you can do is delegate everything and systematize everything.

I’m going to do less public speaking because I outsource that to my team, and I have very good public speakers on my team who are doing a lot for me now. My goal is to travel less and work a little bit less as well, because I’ve got the six-hour work day.

If you think about how you scale yourself, how much time would you estimate that you’ve been spending on email per day?

I would say one hour a day. I’m way behind.

When you think about how many hours are spent over the course of a year, the fact that you have not delegated that is crazy.

I delegate as much as I can; it’s just difficult. I’ve got 32 people here, I think it’s 34 in the agency. They call me the master of outsourcing because I outsource even the little things. I find it difficult with email. You need to show me your tricks then, because I’m doing a lot. My email address is nowhere to be found, yet I still get quite a lot.

I delegate as much as I can, it’s just difficult. They call me master of outsourcing because I outsource even the little things.

For readers who want to implement something like this, I’ve got some great episodes on The Optimized Geek where we go into inbox zero. I’ve had David Allen on, the creator of the GTD methodology and author of Getting Things Done. Listen to Trivinia Barber; that episode was all about VAs. My email is set up with the GTD model. I have an @action folder, an @read review folder, and an @waitingfor folder. I use Google Apps for business and I’ve got those labels, but I’m using IMAP to connect to the email. I’m using Mail.app on my Mac.

I have my team doing the same thing. At least two or three people who are trusted enough in my organization have access to my email. I also have a private email address in addition to my Stephan@ one. I send lost passwords there, in case somebody gets into my regular email. I always turn on two-factor authentication. Turn off using your phone’s text messaging as one of the two-factor authentication methods, because that can easily get spoofed or stolen and you’re screwed.

Use the Google Authenticator app, which will generate those six-digit numbers every 30 seconds. Even if you’re not connected to the internet, the numbers work. For a backup, generate a bunch of one-time-use codes that you store in a very safe location. That’s securing your email setup. I have a few trusted team members who have access to my email, personal and business. They put everything that’s in the inbox into the appropriate folders. They respond to a lot of things: travel stuff, client requests, prospects wanting to set up a call, and all that.

They’ll handle as much of that as they can. Some stuff they can’t handle, and that goes into @action. I stay out of my inbox. A lot of times it’s at inbox zero, but even if they haven’t gotten to it in the last hour or two, I don’t pay attention to it. I go straight to my action folder and that’s where I live. I also check my read review folder, but not as often, maybe once or twice a day maximum, just to see what the FYIs are. That cuts my email time by 90%; it’s 10% of what it used to be.

I feel very bad about my workflow, even though I felt I was very good at it. I do a to-do list. I have two inboxes, which is probably not as good.

There’s always somebody who’s better than you, smarter than you, more advanced than you in a particular area. I learned early on in my career to surround yourself with people smarter than you. Put your ego aside and bring in somebody who is world class at the thing you want to be world class at. Let’s say it’s taking your productivity to the next level, getting into flow states more often, getting more organized, putting all these systems in place, and using a to-do list in an even more advanced way.

You could hire somebody like Mike Vardy. He’s got a great podcast called The Productivityist. He’s a great guy and he’s one of my coaches. I learned from him; we go through my stuff and I use an app called Things, which is my trusted system. He gives me feedback like, “This looks like a hornet’s nest here. You need to do this and that and you need to fix this.” I may sound like I have it all together in terms of my productivity and my systems, but we all have our own hornet’s nests and horrible things that we’ve let get messy. You bring somebody like that in and they’ll coach you and advise you. It’s a game changer. He had a great episode on my other podcast, The Optimized Geek.

Back to what we were talking about in terms of JavaScript. Focus@Will, this client, had a big problem where their entire site was practically invisible. Only a couple of pages were getting indexed. They implemented prerendering. It took months, but then they finally got hundreds of pages indexed. Their blog, for whatever reason, is a WordPress blog but with this JavaScript wrapper around it. That wasn’t even getting indexed until they set up the prerendering, and then finally the blog posts and the category pages from the blog were showing up in Google’s index.

A quick fix is to start prerendering. Browsers that crawl your website and build an HTML snapshot of JavaScript and then cache it Click To Tweet

I get that it’s a Band-Aid, but it’s probably going to be a long-term Band-Aid for them, because getting them to implement it was a huge win. I can see where it would potentially go off the rails if they’re not monitoring, not just the status of the website, but also the status codes being served to Google, using the Fetch and Render tool in Google Search Console, and checking the Google cache. This is an important distinction we need to talk about: if somebody is looking at the Google cache, they’re probably thinking that this is what Google sees, and stuff doesn’t load. The lazy loading or the infinite scrolling doesn’t work, or a bunch of images aren’t loading, or functionality isn’t present.

It looks broken and you might think, “That’s not available to Googlebot,” but that’s not true. You might go instead to Google Search Console, under the Fetch and Render tool, and see that stuff’s not loading: this is what your page looks like, and here’s what Googlebot sees. You see that images aren’t loading, or the page content gets cut off partway down the page. You think, “We’ve got a problem with our content not getting into Google.” Instead, I teach my clients to use info: as an advanced query operator in Google. They put in the exact URL of the page that’s in Google, plus a keyword or phrase in quotes from the part of the page that doesn’t look like it’s loading according to Fetch and Render or the Google cache. Then they see the truth: whether it was picked up by Google or not.

First of all, the cache is a very bad way to look at anything, because Google’s cache doesn’t work very well with JavaScript. I wouldn’t use Google’s cache for anything. Search Console, on the other hand, is the ultimate resource. There is one more place where you can see how a website is rendered by Google, and you don’t need access to Google Search Console for it: the Mobile-Friendly Test. It will show you a snapshot of the rendered website. That’s how I know that Hulu has problems with their mobile website. You can enter any URL into search.google.com/test/mobile-friendly and see whether Google saw the content. You can see that USA Today’s mobile version is invisible to Google, and you can see a lot of very big brands, like AliExpress. As soon as those websites move to mobile-first indexing, like Hulu, they may see massive problems. You can check any website out there and how it is doing with mobile first and JavaScript.

Mobile-first indexing, which is something that not every reader will be aware of, is Google’s big shift to focusing on the mobile version of your website as the definitive version to base the rankings on and to pull information from, versus the desktop version of the site. If you’re running your site as a responsive website, the potential for issues is a lot less; you’re probably okay. If you’re running dynamic serving, then there are all sorts of potential problems you could be having if you’re not using the Vary header, or if Google thinks that you’re cloaking, or if you’re screwing up the way you’re doing the dynamic serving; that can create all sorts of problems. The third approach is to have a separate mobile website with different URLs. That is the worst-case scenario and the one that’s going to be the least scalable and future-proofed for this new world of mobile-first indexing. What am I leaving out that’s important about mobile first?
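To sketch what the dynamic-serving approach looks like in practice (the handler shape and markup here are hypothetical, not any particular framework’s API), the same URL returns different markup by User-Agent and advertises that with a `Vary: User-Agent` response header, so caches and crawlers know the response depends on the requesting device:

```javascript
// Simplified dynamic serving: one URL, two renderings, plus the Vary
// header that declares the User-Agent dependence.
function handle(userAgent) {
  const isMobile = /Mobile|Android|iPhone/i.test(userAgent);
  return {
    headers: { 'Content-Type': 'text/html', 'Vary': 'User-Agent' },
    body: isMobile
      ? '<html><body>mobile layout</body></html>'
      : '<html><body>desktop layout</body></html>',
  };
}

const res = handle('Mozilla/5.0 (iPhone) Mobile Safari');
console.log(res.headers['Vary']); // 'User-Agent'
```

Omitting that header is one of the mistakes described above: intermediaries may cache one variant for everyone, and Google has no signal that a mobile rendering exists.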

With mobile first, it gets complicated at this point. Most of us would test a mobile website on, for example, an iPhone X or a Samsung Note 8, which are on the higher end of the spectrum, when the median phone is a Motorola G4, which I had never held in my hand. Just to give you an idea, that phone is ten times slower than the iPhone X, because the iPhone X’s processing power is better than some of the MacBook Pros on the market right now. For some reason, Apple puts an extremely powerful CPU into the iPhone X that’s better than the entry-level thirteen-inch MacBook Pro.

If you’re opening a website like The Guardian, which is good performance-wise, it will load within one or two seconds regardless, in a way that you see the content. But the difference between a fully loaded The Guardian on an iPhone X and on a Motorola G4 is up to twenty seconds. If you look at CNN.com, there’s a nine-second difference in how the website loads between an iPhone 8 and a Motorola G4. This shows a problem that Google is also struggling with. First of all, we see massive performance issues on a 3G connection, and 75% of people worldwide are on 3G or 2G. I think 50% of people worldwide are on a 2G connection, and with mid-performance phones.

If you add those two things together, and your website loads very fast in your office through Wi-Fi or a 4G connection on your iPhone X, that’s a huge difference from what most users see. That’s why, I believe, Google started CrUX, the Chrome User Experience Report, where you can see how your website is doing on your users’ devices, split by desktop, tablet, and phone, and it’s a public database. This is where it gets interesting, because I was thinking that our website, Elephate.com, would load extremely quickly because it is nicely optimized. It’s still fast, but it’s not as fast as we thought on some mobile devices. We see that there is a lot to improve in this area.

That’s where it gets interesting, because first of all, mobile performance becomes a ranking factor. Secondly, all of this JavaScript can make your website just like USA Today: Google can’t see USA Today on mobile. Let’s imagine that USA Today is crawled mobile first. Google is only crawling the mobile version and the content is invisible. The website is just going to plummet in search results badly. One more thing I wanted to touch on that Stephan mentioned is cloaking. This is a little bit interesting, because it’s not something I’d recommend, but we ran some tests here at Elephate.

Mobile performance becomes a ranking factor. All of this JavaScript can make your website just like USA Today, Google can’t see USA Today on mobile. The website is just going to plummet in search results badly.

We have this very exciting research and development team, and you can do as much cloaking with JavaScript as you like. I believe that unless you’re manually flagged, which is very difficult to do at scale, Google has no technology to find out that, for example, your website serves different content with JavaScript than with HTML. This is a huge gap for all the black hats out there that should be addressed pretty quickly, because you can serve Wikipedia content in the HTML and casino content with the JavaScript. That’s something we tested, and it’s a total Wild West.
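A naive sketch of how one might flag the HTML-versus-JavaScript cloaking described here: compare the word sets of the raw HTML text against the rendered text, and treat a low overlap as suspicious. The function, threshold, and sample texts are all invented for illustration:

```javascript
// Fraction of the rendered page's words that also appear in the raw HTML.
// 1.0 means the rendered text was all present in the source; near 0 means
// JavaScript replaced the server-sent content entirely.
function sharedWordRatio(rawText, renderedText) {
  const words = s => new Set(s.toLowerCase().match(/[a-z]+/g) || []);
  const raw = words(rawText);
  const rendered = words(renderedText);
  const shared = [...rendered].filter(w => raw.has(w)).length;
  return rendered.size ? shared / rendered.size : 1;
}

// Honest page: rendered text overlaps the raw HTML text completely.
console.log(sharedWordRatio('blue widget in stock', 'blue widget in stock')); // 1
// Cloaked page: encyclopedia text in HTML, casino text after JS runs.
console.log(sharedWordRatio('history of ancient rome', 'play slots win big')); // 0
```

A real check would of course need an actual renderer to obtain the post-JavaScript text, which is exactly the expensive step Bartosz says Google avoids doing at scale.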

You had this theory that mobile first, performance issues, and JavaScript SEO were all converging together. Why don’t you share what that theory was and how it ended up playing out to be true?

Back in the day, I was wondering: Google crawls mobile websites only; they disregard the desktop. I was thinking there are different challenges for Google with mobile. First of all, they can’t measure much of the performance during the crawl because it’s specific to the niche. We have clients that still have 90% desktop traffic, and we have clients with 70% to 80% mobile traffic; that’s one. Secondly, we have a client, Jiffy in Poland, and for them the devices used to enter the website are usually pretty low-end. We work with Oracle Commerce Cloud in looking at the data. It’s a little bit different for each of the stores.

From Google’s point of view, with desktop it was pretty straightforward. Most desktops were powerful enough to process all of the JavaScript and HTML, and connection speed on a desktop was okay; no one had a 2G connection on a desktop. With mobiles, the speed is different, and the CPU and performance of the phones are different. They need to crowdsource a lot of the measurements, especially with JavaScript, because of how difficult it is for different mobiles to process all of the JavaScript. Take The Guardian: for me it may be a little bit worse because I’m on a Samsung Note 8, but someone with an iPhone 6 or a Motorola G4 will load the website nine or ten seconds slower.

We tested that afterward because someone wanted to benchmark my theory. We sat down in a hotel lobby with different mobiles and started loading websites. We had three different mobiles: I think it was an iPhone X, a Samsung Note 8, and a third one that was very cheap, an Asus phone or something like that. The differences were huge. The iPhone X was finished when you still wouldn’t see any content on the Asus phone. That’s related to JavaScript. This is exactly the same issue that Google is going to have with JavaScript, because they use Chrome 41 and they have limited resources, and that brings us to the two waves of indexing that we touched on.

What do you mean by that?

Google gave a presentation at Google I/O with John Mueller and Tom Greenaway. They had this peek behind the curtain of how Google processes JavaScript. I highly recommend watching it. The key thing from the whole presentation was that there are two waves of indexing. The first wave checks all the HTML content, your status codes like 200, and all your metadata, like canonicals, noindex tags, and whatever. The second wave is JavaScript, and that's where it gets difficult. For the second wave, you need to wait, and you can wait up to a week; in our experiments it was even a month. You can have a blank page indexed while you wait for the second wave, the JavaScript processing, to happen over a period of a week or a month, because Google is waiting for resources. There is probably some queue that you jump if you're a higher-authority site; that's just something I imagine would be logical to happen.

This is why it took Focus@Will such a long time for their prerendered content to get picked up.

If you're stuck between those two waves, your HTML is indexed and it's only your navigation. In the case of Hulu, the first wave, or the only wave because the JavaScript is not accessible to Googlebot, is only the navigation and some statement at the bottom of the page. You can play with the Mobile-Friendly Test and see what I mean by that. They're stuck in the first wave. In the case of some of the other websites we've looked at, there is what I call partial indexing. Half of the website is indexed, half of the website is totally invisible, because it's waiting for the second wave. JavaScript crawling is extremely slow. We ran an experiment with a JavaScript site six pages deep; to get to page three, you had to go through page two, and so on. It was never indexed.

An exact copy in HTML was indexed within ten or fifteen minutes. Google is saving resources, because CPU power comes with restrictions. Let's use imaginary numbers here: if crawling and indexing HTML costs one, then crawling and indexing JavaScript costs somewhere between 20 and 100. It's so much more computing power. Google can't expand their already huge server base, or however you call it, 20 or 100 times. What they're doing now is working on the technology, including a new Chrome engine and some simplifications in crawling and indexing JavaScript, to make it much more efficient, because just throwing servers at it would be an infinite job.
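To make the two waves concrete, here is a small sketch of what an HTML-only first pass sees for a client-side rendered page versus a server-rendered one. All of the markup is invented for illustration, and the tag-stripping function is only a crude stand-in for the first indexing wave:

```javascript
// Two sample responses for the same page: a client-side rendered (CSR)
// shell and a server-side rendered (SSR) document.
const csrHtml = `
  <html><head><title>Shows</title></head>
  <body>
    <nav><a href="/">Home</a></nav>
    <div id="root"></div>           <!-- content arrives only after JS runs -->
    <script src="/bundle.js"></script>
  </body></html>`;

const ssrHtml = `
  <html><head><title>Shows</title></head>
  <body>
    <nav><a href="/">Home</a></nav>
    <div id="root"><h1>Popular shows</h1><p>Full listing here.</p></div>
  </body></html>`;

// Crude stand-in for the first wave: read only the raw HTML, run no
// JavaScript, strip the tags, and keep whatever text is left.
function firstWaveText(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, '') // scripts never execute here
    .replace(/<!--[\s\S]*?-->/g, '')
    .replace(/<[^>]+>/g, ' ')
    .replace(/\s+/g, ' ')
    .trim();
}

console.log(firstWaveText(csrHtml)); // "Shows Home" (navigation only)
console.log(firstWaveText(ssrHtml)); // "Shows Home Popular shows Full listing here."
```

The CSR shell yields nothing but the title and navigation until the second, JavaScript wave runs, which is exactly the "navigation only" symptom described above.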

Especially as people are starting to adopt JavaScript more and more in their website. There are so many websites out there that are still very old school and they’ll eventually incorporate JavaScript.

This brings us to a hypothetical scenario that I have in mind all the time. Imagine you're Bing today and you're struggling to index the web as it is. Regardless of JavaScript, Bing is not a very good crawler, because getting something indexed takes a long time, at least for our clients and even for big brands. When Bing has to face the problem of crawling the entire internet, including JavaScript, the cost for Bing, with only a small percentage of the market, is huge compared to the cost for Google. Looking at the cost versus the income, Bing will never fully index JavaScript, and they don't right now.

They have some of the technology; I found a few JavaScript pages indexed. But Bing Webmaster Tools and their blog are terrible at explaining this, because they compare JavaScript to Flash. They say, don't hide content in JavaScript or Flash. Their guidelines are very outdated. I believe that JavaScript will kill Bing, because if you can't find any of the Hulu shows in Bing, you'll go find Hulu in Google anyway, and this problem will only get bigger because you don't realize how many websites are JavaScript websites; until recently, even I didn't realize it. I believe that Google is going to master JavaScript crawling and indexing, and they will because they are the creator of Angular and they push that technology very hard. They will starve Bing completely, and other search engines for that matter.


When they upgrade to the next version of Chrome (the headless browser and rendering engine they use in Googlebot is currently Chrome 41), what are they going to go to? Chrome 52 or something? How long will we have to wait for them to update from whatever version they move to next? Is it going to be another two years before they get to a more recent one? Or are they going to keep it updated in real time, the way WordPress installations update automatically as soon as a new version comes out?

I'm guessing that they will go for Chrome 59, which is the first headless Chrome. That would be the logical approach for me, because they have to start somewhere to make it happen. My guess, and I would say this is a 60% guess, is that they will go with Chrome 59, but that's just one thing. The second thing is that the Googlers, John Mueller and Tom, mentioned at Google I/O that they are working on making the indexing process seamless, just one step and instant. They want to move away from the two waves of indexing, which cause a lot of problems for all of us.

I wouldn't expect those two steps to happen at the same time, because they will first launch a new Chrome engine to power Googlebot, which will improve performance. I actually talked to Ilya Grigorik when consulting him about one of my articles. He said that they planned on releasing that at the end of this year or the beginning of next year; he mentioned, with a smiley face, that it may change. I'm guessing it's a huge project to build a new engine for Googlebot. It can go bad in so many ways; we can imagine how difficult that task is. I don't believe it will come soon enough to solve all of our issues. The processing power at Google is not enough, and webmasters are not fully aware of how they should build JavaScript websites.

Let me give you an example. There is a video from an Angular conference where one of the creators of Angular says that you should never ship an Angular website client-side rendered if it's Google-facing; you should use server-side rendering for that. From a development perspective, it is bad practice to put any client-side rendered Angular site live. At the same time, looking at all of the websites, this is what everyone is doing because it's so much easier. If you're a developer building a website, I'm guessing this is one of the quicker and simpler solutions.

Google and Angular never supported pushing client-side rendered websites. I bring that up because I believe there are two movements. One, Google is improving the efficiency of crawling. The second movement was started by, for example, Netflix: they removed all of their React code from the frontend for performance, leaving only plain vanilla JavaScript, and that improved their performance by 50%. It will slowly become a two-way movement. On one end, developers will get their act together and start using server-side rendering or isomorphic rendering and all of these cool features of JavaScript, and fully use them. On the other hand, Google is going to try to be a little bit more efficient at indexing the websites that don't know they should do it like that.
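The server-side/isomorphic idea comes down to one render function that produces the markup on the server, so crawlers get real HTML in the initial response, while the same function can run again in the browser. A minimal sketch, with an invented component and data:

```javascript
// One template function, usable on both server and client.
function renderShowList(shows) {
  const items = shows.map(s => `<li>${s.title} (${s.year})</li>`).join('');
  return `<ul id="shows">${items}</ul>`;
}

const shows = [
  { title: 'Show One', year: 2017 },
  { title: 'Show Two', year: 2018 },
];

// Server side: embed the rendered markup in the response body, so the
// first (HTML-only) indexing wave already sees real content.
const serverHtml = `<body><div id="root">${renderShowList(shows)}</div></body>`;

// Client side (sketch): the same function re-renders into the live DOM.
// document.getElementById('root').innerHTML = renderShowList(shows);

console.log(serverHtml.includes('Show One')); // true
```

Frameworks do the same thing with more machinery (hydration, routing), but the principle is identical: the crawler-facing HTML already contains the content.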

What would you say is the most important takeaway for someone who's not super technical and who is using JavaScript or plans to use it? What should their action be from this episode?

When we started with JavaScript in 2015 or '16, it was just me and a few of the guys from the agency. We had nothing; we were in the dark. Right now, there is the Mobile-Friendly Test. Make sure you use it because it's so powerful, along with fetch and render in Search Console if you can, and Chrome 41. Use all of those tools, and watch the YouTube videos about the framework you want to use. If you want to go with Angular, find out the best practice for making your framework properly indexable. Most of the big frameworks fully support isomorphic and dynamic rendering and all of the new cool things. If you're in the comfortable position of planning a new website or choosing a framework, make sure you look at that first and use all of the tools I mentioned. You can't go wrong.

We will also have an action-based checklist of items that you can go through to improve your technical SEO, especially around JavaScript. If somebody wanted to work with you, Bartosz, and hire your agency to do technical SEO or to advise on a site revamp, how would they get in touch?

The easiest way would be through any of the social media channels, but you can also find me at BartoszGoralewicz.com.

Thank you, Bartosz.

Thank you very much.

 

Your Checklist of Actions to Take

☑ Make sure that devices are able to smoothly process your JavaScript or HTML website. Slow loading times lead to high bounce rates.

☑ Take note of the three key JavaScript frameworks: Angular, React and Vue. Angular is supported by Google, React is supported by Facebook, and Vue doesn’t have any corporate support.

☑ Download Chrome 41 to test a JavaScript website. Google uses Chrome 41, so if the content appears there, Google can see it and crawl it.

☑ Prerender my website with the help of Prerender.io. When Googlebot visits, my page gets HTML and users get JavaScript. This can be a Band-Aid solution or even a long-term solution.

☑ Regularly monitor the status codes served up to Google.  Utilize the fetch and render tool in Google Search Console and check the Google cache.

☑ Use auditing tools like SearchMetrics and SEMrush to keep a close eye on my ranking progress.

☑ Use info: as an advanced query operator in Google. This query identifies whether or not my site content is picked up by Google.  

☑ Prioritize a mobile-friendly website since most online users are using their phones when they want to search for something online.

☑ Try entering any URL into https://search.google.com/test/mobile-friendly to check if my website is mobile-friendly.

☑ Utilize the Chrome User Experience Report, a Google tool that provides user experience metrics and shows how real-world Chrome users experience popular destinations on the web.
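The prerendering item above (Googlebot gets HTML while users get JavaScript) boils down to a user-agent check in front of the app. A minimal sketch, with an illustrative, non-exhaustive bot list and a hypothetical request handler; real middleware such as Prerender.io's handles the snapshotting and caching for you:

```javascript
// Illustrative list of crawler user-agent substrings (not exhaustive).
const BOT_AGENTS = ['googlebot', 'bingbot', 'yandex', 'duckduckbot', 'baiduspider'];

function isBot(userAgent) {
  const ua = (userAgent || '').toLowerCase();
  return BOT_AGENTS.some(bot => ua.includes(bot));
}

// Hypothetical handler: crawlers get the static snapshot, everyone else
// gets the normal client-side bundle.
function handleRequest(req) {
  return isBot(req.headers['user-agent'])
    ? 'prerendered-html'   // static snapshot, e.g. served via Prerender.io
    : 'javascript-app';    // normal client-side rendered app
}

console.log(isBot('Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)')); // true
console.log(isBot('Mozilla/5.0 (Windows NT 10.0) Chrome/66.0')); // false
```

Note that the snapshot and the user-facing page should show the same content; serving different content to crawlers than to users is cloaking.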

Important Links:

About Bartosz Góralewicz

We are an SEO and content marketing agency. Elephate started as a 100% self-funded "garage startup", entirely bootstrapped by Bartosz Góralewicz and Wojciech Mazur. We originate from Poland, where most of our team is presently located. Our team is made up of experts from all around the world. In fact, our Polish office includes people from four different continents and nine different countries.
No matter what we do, we always put great emphasis on data-driven work and on optimizing our performance. We invest heavily in training, tools, and most importantly, people. We are working on building the best possible SEO and content marketing agency, without cold calling or employing even one sales person. We simply attract clients who want to work with us, because of our knowledge, experience and the results we deliver.
