Core Web Vitals
Q. The weight of Core Web Vitals doesn’t change depending on what kind of website is being assessed.
- (00:50)Google doesn’t evaluate what kind of website it’s assessing and decide that some Core Web Vitals indicators are more important in a particular case. In some search results the competition is strong and everyone scores similarly well, so it might look like a particular indicator carries more weight, but that is not actually the case.
Reviews from applications
Q. Google doesn’t pick up reviews left on Android and iOS applications
- (05:44)John says that, at least for web search, Google doesn’t take Android and iOS application reviews into account. Google doesn’t have a notion of a quality score when it comes to web search. Indirectly these reviews might be picked up and indexed if they are published somewhere on the web, but if they only live in app stores, Google probably doesn’t even see them, either for web search or for other kinds of searches.
Crawl Request
Q. The number of crawl requests depends on two things: crawl demand and crawl capacity
- (07:29)When it comes to the number of requests that Google makes on a website, it has two things to balance: crawl demand and crawl capacity. Crawl demand is how much Google wants to crawl from a website. When a website is reasonable, the crawl demand usually stays pretty stable. It can go up if Google sees there is a lot of new content, or it can go down if there is very little content, but these changes happen slowly over time.
Crawl capacity is how much Google thinks the server can support in terms of crawling without causing any problems, and that is evaluated on a daily basis, so Google reacts quickly if it thinks there is a critical problem on the website. Critical problems include lots of server errors, Google not being able to access the website properly, and the server speed going down significantly (not the time to render a page, but the time to access the HTML files directly) – those are the three aspects that play into it. For example, if the speed goes down significantly and Google decides that it’s from crawling too much, crawl capacity will be scaled back fairly quickly.
5xx errors are also considered more problematic than 4xx errors, as the latter basically mean the content doesn’t exist, so a page disappearing doesn’t cause problems.
Once these problems are addressed, the crawl rate usually goes back to what it was step by step within a couple of days.
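As a purely illustrative aside, the back-and-forth John describes can be sketched in a few lines of TypeScript: capacity drops quickly when the server shows stress (lots of 5xx errors or noticeably slower HTML fetches) and recovers step by step once things look healthy again. The thresholds, field names and numbers below are invented for the example and are not Google’s actual values.

```typescript
// Toy model only: not Google's implementation, just the behaviour described above.
interface DailyCrawlStats {
  serverErrorRate: number;   // share of requests answered with 5xx
  medianFetchMs: number;     // time to fetch the raw HTML, not to render the page
  baselineFetchMs: number;   // what the server normally manages
}

function nextCrawlCapacity(current: number, stats: DailyCrawlStats): number {
  const serverIsStruggling =
    stats.serverErrorRate > 0.05 ||                  // lots of 5xx responses
    stats.medianFetchMs > stats.baselineFetchMs * 2; // HTML fetches much slower than usual

  if (serverIsStruggling) {
    // React quickly: scale crawling back to avoid causing problems.
    return Math.max(1, Math.floor(current * 0.5));
  }
  // Otherwise recover gradually over a couple of days, not all at once.
  return Math.ceil(current * 1.2);
}

// Example: capacity halves on a bad day, then climbs back step by step.
let capacity = 1000; // fetches per day
capacity = nextCrawlCapacity(capacity, { serverErrorRate: 0.12, medianFetchMs: 900, baselineFetchMs: 300 });
console.log(capacity); // 500
capacity = nextCrawlCapacity(capacity, { serverErrorRate: 0.01, medianFetchMs: 320, baselineFetchMs: 300 });
console.log(capacity); // 600
```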
Search Console Parameter Tool
Q. The parameter tool acts differently from robots.txt
- (15:32)The parameter tool is used as a way of recognising pages that shouldn’t be indexed and picking better canonical choices. If Google has never seen a page that is listed in the tool, nothing will get indexed; if it has seen the page before and there was a rel canonical on it, the tool helps Google understand that the website owner doesn’t want it indexed, so Google doesn’t index it and follows the rel canonical instead.
Random increase in keyword impressions
Q. Random keyword impression increases in Search Console can be caused by bots and scrapers
- (18:42)Google tries to filter and block bots and scrapers at different levels in the search results, and it can certainly happen that some of these get through into Search Console as well.
It’s a strange situation: if someone runs these scrapers to see what their position or ranking would be on these pages, they’re getting some metrics, but they’re also skewing other metrics, and that is discouraged by Google’s terms of service. It’s better to ignore these kinds of things when they happen, because it’s not something you can filter out in Search Console or manually do anything about.
Internal Linking
Q. Internal linking is about giving a relative importance to certain pages on a website
- (20:37)Internal linking can be used to spread the value of external links pointing to one page to other pages on the website, but only in a relative sense, meaning that Google understands which pages you think are important and takes that feedback on board. For example, if all the external links go to the homepage, that’s where all of the signals get collected, and if the homepage has absolutely no links, then Google can focus purely on the homepage. As soon as the homepage links to other pages as well, Google in a way distributes that out across all of these links. Depending on how the website’s internal linking is set up, certain places within the website are, relatively speaking, more important based on the internal linking structure, and that can have an impact on rankings – at the very least it tells Google what is important to you. It’s not a one-to-one mapping of internal linking to ranking, but it does give a sense of relative importance within the website. From that point of view it makes sense to link to important and new things on the website – Google will pick them up a little faster and might give them a little more weight in the search results. That doesn’t mean they will automatically rank better; it just means that Google will recognise their importance to you and try to treat them appropriately.
Website Speed and Core Web Vitals
Q. It takes about a month for the Core Web Vitals to catch up with changes in website speed
- (26:28)For the Core Web Vitals, Google takes into account data that is delayed by 28 days or so. That means if significant speed changes are made on the website that affect the Core Web Vitals, and accordingly the page experience ranking factor, it should be expected to take about a month for them to become visible in the search results. So changes in search happening the next day wouldn’t be related to speed changes made the previous day; likewise, after big speed changes it will take about a month to see any effects.
Nested Pages for FAQ
Q. FAQ doesn’t have to be nested as long as the script is included in the page header and the data can be pulled out
- (28:35)FAQ structured data doesn’t necessarily have to be nested. If there’s an FAQ on the page, John suggests using the appropriate structured data testing tools to make sure the data can be pulled out. The testing tools essentially do what Google would do for indexing and tell the website owner whether everything is fine.
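For reference, this is roughly what FAQ structured data looks like as JSON-LD; the questions and answers below are placeholders, and the sketch assumes the markup is injected client-side into the page header (it could just as well be embedded directly in the HTML and then checked with the Rich Results Test).

```typescript
// A minimal schema.org FAQPage example. The question/answer text is placeholder content.
const faqStructuredData = {
  "@context": "https://schema.org",
  "@type": "FAQPage",
  mainEntity: [
    {
      "@type": "Question",
      name: "Do you ship internationally?",
      acceptedAnswer: {
        "@type": "Answer",
        text: "Yes, we ship to most countries worldwide.",
      },
    },
  ],
};

// Inject the JSON-LD as a script tag in the document head.
const faqScript = document.createElement("script");
faqScript.type = "application/ld+json";
faqScript.text = JSON.stringify(faqStructuredData);
document.head.appendChild(faqScript);
```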
Delayed loading of non-critical JavaScript elements
Q. It’s perfectly fine to delay loading of non-critical JavaScript until the first user interaction
- (30:17)If someone delays loading the functionality that takes place when a user starts to interact with the page, and not the actual content, John says it’s perfectly fine (a short sketch of this pattern follows below). That’s similar to what is called “hydration” on JavaScript-based sites, where the content is loaded from HTML as a static HTML page, and then the JavaScript functionality is added on top of that.
From Google’s point of view, if the content is visible for indexing then it can be taken into account, and Googlebot will use it. It’s not the case that Googlebot will go off and click on different things; it essentially just needs the content to be there. The one place where clicking on different things might come into play is with regard to links on a page: if those links are not loaded as link elements, Google won’t be able to recognise them as links.
John refers to an earlier question about lazy loading of images on a page. If the images are not loaded as image elements, Google doesn’t recognise them as images for image search. For that it’s good to have a backup in the form of structured data or an image sitemap file. That way, Google understands that even if those images are currently not loaded on the page, they should be associated with that page.
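As a minimal sketch of the pattern described above, the snippet below keeps the content and links in the static HTML and only defers optional functionality until the first user interaction. The chat widget module path is hypothetical.

```typescript
// Defer non-critical JavaScript until the user first interacts with the page.
// The indexable content and links are already in the initial HTML.
function loadOnFirstInteraction(load: () => void): void {
  const events = ["pointerdown", "keydown", "scroll"] as const;
  const onFirstInteraction = () => {
    // Clean up the remaining listeners, then load the deferred functionality.
    events.forEach((type) => window.removeEventListener(type, onFirstInteraction));
    load();
  };
  events.forEach((type) =>
    window.addEventListener(type, onFirstInteraction, { once: true, passive: true })
  );
}

loadOnFirstInteraction(() => {
  // Non-critical functionality only (hypothetical module); content stays in the HTML.
  import("./chat-widget.js").then((widget) => widget.init());
});
```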
Out of stock products
Q. There are different ways to handle temporarily out of stock products from the SEO point of view: structured data, internal linking, Merchant Center
- (33:38)There can be situations where some or many products are out of stock on the website, and the situation needs handling on the SEO side. For those situations, John suggests, it’s best to keep the URL online for things that are temporarily out of stock, in the sense that the URL remains indexable and structured data indicates the product is currently not available. In that case, Google can at least keep the URL in the index and keep refreshing it regularly to pick up the change in availability as quickly as possible. However, if the website owner decides to noindex these kinds of pages or to just remove the internal linking to them, then when that state changes back, Google should still try to pick that up fairly quickly. Google will try to understand these state changes through things like sitemaps and internal links, so especially if the product is added back and suddenly has internal links again, that helps Google pick it up. This process can be sped up a little by setting up internal links deliberately. For example, these products can be linked from the homepage, as Google views internal links from the homepage as a little more important. It’s a good idea to add the products back and add a link on the homepage saying that these things are in stock again.
Another thing that can be done for out of stock products is to hedge the website’s SEO with product search: if a Merchant Center feed is submitted, those products can be shown within the product search sidebar. That way Google doesn’t necessarily have to recrawl the individual pages to recognise that the products are back in stock; it can recognise that from the submitted feed.
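For illustration, product structured data along these lines can keep a temporarily out-of-stock URL indexable while signalling availability. The product details and URL are placeholders, and the same JSON-LD could equally be rendered server-side into the page.

```typescript
// A schema.org Product example with availability. When the item is back,
// availability switches to "https://schema.org/InStock".
const productStructuredData = {
  "@context": "https://schema.org",
  "@type": "Product",
  name: "Example Trail Shoe",       // placeholder product
  sku: "EX-123",                    // placeholder SKU
  offers: {
    "@type": "Offer",
    priceCurrency: "EUR",
    price: "89.00",
    availability: "https://schema.org/OutOfStock",
    url: "https://www.example.com/products/example-trail-shoe", // placeholder URL
  },
};

const productScript = document.createElement("script");
productScript.type = "application/ld+json";
productScript.text = JSON.stringify(productStructuredData);
document.head.appendChild(productScript);
```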
Security Vulnerabilities
Q. Security vulnerabilities that can be found by using Lighthouse, for example, don’t affect SEO directly
- (37:28)John says that security vulnerabilities are not something Google would flag as an SEO issue. But if these are real vulnerabilities in scripts that are being used, and the website ends up getting hacked as a result, then the hacked state of the website would be a problem for SEO. The mere possibility that it might be hacked is not an issue with regard to SEO.
Authorship and E-A-T
Q. E-A-T mostly matters for medical and finance-related websites and not for more generic content
- (38:48)E-A-T, which stands for Expertise, Authoritativeness, Trustworthiness, basically applies to sites that are really critical – essentially websites where medical or financial information is given. In those cases it’s always better to make sure that an article is written by someone who is trustworthy or has authority on the topic. When it comes to something more general, like theatre or SEO news or anything random on the web, the trustworthiness of the author is not necessarily a big issue. For some businesses it might even be fine to say there is no individual author and that a piece of content is written by the website.
The one place where the author name does come into play is in some types of structured data that have a field for the author. That information might be shown in the rich results for a page, so from that point of view it’s better to make sure there’s a reasonable name there.
Impressions and Infinite Scroll
Q. Impressions work the usual way with infinite scroll; the difference is that some websites will probably get a few more impressions
- (45:51)From Google’s side, even with infinite scroll, the search results are still loaded in groups of 10, and as a user scrolls down, the next set of 10 results is loaded. When that set of 10 results is loaded, it counts as an impression. That basically means that when a user scrolls down and starts seeing page two of the search results, Google sees it as page two, and the page gets impressions just as if someone had clicked through to page two directly. From that point of view not much changes. John suggests that what will change is that users will probably scroll to page two, three or four a little more easily, and based on that, the number of impressions a website gets in the search results will probably go up a bit. He also suggests that the click-through rate will look a little weird: it will probably go down slightly, and that may be due to the number of impressions going up rather than something being done wrong on the website.
Average Response Time
Q. Average response time can affect crawling
- (52:26)There is no fixed number for the average response time; however, John recommends keeping it to 200 milliseconds at most. It affects how quickly Google can crawl the website. If Google wants to crawl 100 URLs from the website and thinks it can open five connections in parallel, then the response time determines how long those 100 URLs take, and with slow responses Google won’t be able to crawl as much per day. That’s the primary effect of average response time on crawling.
Average response time is about the HTTP requests that Google sends to the website’s server. If a page has CSS, images and things like that, the overall loading time goes into the Core Web Vitals, but the individual HTTP requests go into the crawl rate, and that doesn’t affect rankings – it’s purely a technical matter of how much Google can crawl.
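To put rough, purely illustrative numbers on John’s example: with five parallel connections, 100 URLs works out to about 20 sequential fetches per connection. At a 200 millisecond response time that is roughly 20 × 0.2 s = 4 seconds of fetching; at 1 second per response it becomes about 20 seconds, so over a day the same crawling effort covers far fewer URLs on a slow server.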
FAQ not showing in the search results
Q. FAQ not showing in the results might be due to its quality or technical issues, and there is a way to check that
- (52:54)The person asking the question is concerned that after their customer redesigned their website, all the FAQ schemas stopped being displayed in Google search results. John says there are two things that might have happened. The first is that the website might have been re-evaluated in terms of quality at about the same time the changes were made. If that coincidence did take place, Google is probably not so convinced about the quality of the website anymore, in which case it wouldn’t show any rich results, including FAQs. One way to double-check is to do a site: query for the individual pages and see if rich results show up. If they do, they are technically recognised by Google, but Google doesn’t want to show them – a hint that the quality needs to improve. If they don’t show up, something technical is still broken.