Same Language Content for Different Countries
Q. If the content is essentially the same, it’s more reasonable to have one version of a website targeting different countries that share a language, rather than separate country versions of the same-language website.
- (00:47) The person asking the question has a website with almost identical but separate versions for the UK and the USA, and he is not sure what the best strategy for managing them is. John says that having English-US and English-UK versions means Google will swap in the URL of the appropriate version depending on the user’s country. So if there is different content for the two versions, even if that’s just something like a contact address or currencies, then it makes sense to keep them on separate URLs. If it’s all the same content, if it’s really just a text article, then it’s more reasonable to make it one English version. The content can’t be limited to those two countries anyway, so having one version is an easy solution. Another advantage of having one version, apart from less maintenance work, is that it’s a lot easier for Google to rank that one page than multiple pages with the same content.
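The recording doesn’t spell out the mechanism, but the URL swapping John describes is what hreflang annotations are for. A minimal sketch of how the two country versions plus a generic English fallback might be annotated (the URLs are hypothetical):

```html
<!-- Placed in the <head> of every version of the page -->
<link rel="alternate" hreflang="en-gb" href="https://example.com/uk/article" />
<link rel="alternate" hreflang="en-us" href="https://example.com/us/article" />
<!-- Generic English version for users outside the two targeted countries -->
<link rel="alternate" hreflang="en" href="https://example.com/article" />
<!-- Default for users who match none of the above -->
<link rel="alternate" hreflang="x-default" href="https://example.com/article" />
```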
Image Formats and Image Search
Q. Whether changing the format of the images on a website can affect rankings depends on how the change is made, in particular on whether the image URLs change.
- (06:45) The person asking the question is redesigning her website with AMP pages and converting all her images to WebP format, and she’s trying to use the same URLs. She is concerned that converting JPEG images to WebP might affect her rankings. John agrees that it could potentially affect the rankings. He mentions that he has seen some people keep the same image file extensions while serving WebP files. If that works, it would lower the amount of work that needs to be done, because then only the content would be swapped out, the URLs would stay the same, and everything would continue to work. Whereas if the image URLs are changed, or if the URLs of the landing pages for the images are changed, image search takes a little bit longer to pick that up. The thing to keep in mind is that not all sites get significant traffic from image search, so while these changes are theoretically a problem because they take time to be processed, in practice that may not matter much.
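One hedged alternative, not taken from the recording, is to leave the already-indexed JPEG URLs untouched and offer WebP in addition through the picture element, so only browsers that support WebP receive it (the file names are hypothetical):

```html
<picture>
  <!-- Served to browsers that support WebP -->
  <source srcset="/images/product.webp" type="image/webp" />
  <!-- The existing, already-indexed JPEG stays in place as the fallback -->
  <img src="/images/product.jpg" alt="Red ceramic coffee mug" width="800" height="600" />
</picture>
```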
Deleting JavaScript from the Website
Q. When deleting JavaScript from a page to simplify it for Googlebot, it’s important not to end up giving Googlebot and users very different page experiences.
- (11:30) If JavaScript is not required for the pages, for the content and for the internal linking, then deleting it probably doesn’t change anything. John says to be cautious about going down the route of making more simplified pages for Googlebot, because it’s easy to fall into a situation where Googlebot sees something very different from what users usually see, and that can make issues very hard to diagnose. If there’s a bug that only affects the Googlebot version of the site, the website owner wouldn’t notice it, because users always see a working website.
Another thing to watch out for is that in product search, Google sometimes checks what happens when users add something to their carts, just to double-check that the pricing and similar details are the same. If, for example, the Add to Cart functionality is removed completely for crawlers, that might affect the checks that product search does. However, John says that he doesn’t know the details of what exactly product search is looking for.
In general, Google tries to render the page to see if something is missing, but it doesn’t interact with the page, as it would take far too much time to crawl the web if it had to click everywhere to see what happens. So Googlebot’s experience is already different from what users see, and removing JavaScript might affect that.
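One way to keep the two experiences aligned is progressive enhancement: the content and internal links live in the static HTML, and JavaScript only adds behaviour on top. A minimal sketch under that assumption (the element names, URLs and script are hypothetical):

```html
<!-- Content and links are in the HTML, so Googlebot and users see the same page -->
<article>
  <h1>Blue Widget</h1>
  <p>Price: $19.99</p>
  <a href="/category/widgets">More widgets</a>
  <button id="add-to-cart" data-sku="WIDGET-1">Add to cart</button>
</article>
<script>
  // Enhancement only: the page is complete even if this script never runs
  document.getElementById('add-to-cart')?.addEventListener('click', function () {
    // e.g. POST the SKU to a cart endpoint here
  });
</script>
```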
Featured Snippet
Q. Sometimes it’s hard for Google to determine from the search query whether the user needs local results or ones in a more global context. This applies both to regular search results and to Featured Snippets.
- (16:44) From Google’s point of view, a Featured Snippet is essentially a normal search result that has a slightly bigger snippet and a little more information; otherwise it’s a normal search result. When it comes to searches, Google tries to do two things. On the one hand, it tries to recognise when a user wants to find something local, and when it recognises that, it uses the geotargeting information it has from websites to figure out which results are likely the more local ones relevant for the user. The local aspect helps to promote local websites, but it doesn’t mean they will always replace anything that is global (global in this context might mean bigger websites). So Google sees these global websites on the one hand and local results from the same country on the other, and depending on how it understands the query, it might show more local results or more global ones. For example, when someone includes “Switzerland” in the search, Google of course recognises that the user wants something from Switzerland, and it can strongly promote local results. But without that addition, it is sometimes hard for Google to determine whether the local context is critical for that particular query, and it will sometimes just show global results in a case like that. That’s not really something a website owner can influence.
Website Authority
Q. Google doesn’t keep a long-term memory of technical issues on a website, but quality assessments take much longer to be refreshed.
- (22:38) Google pretty much has no memory for technical issues on websites, in the sense that if it can’t crawl a website for a while, or if some content goes missing for a while and then comes back, Google can pick that information up again and show it, and that happens pretty quickly. Google has to work that way because the Internet is sometimes very flaky: sites go offline for a week or even longer and then come back as if nothing had changed, because the owners fixed their servers. Google has to deal with that, since users are still looking for those websites.
It’s a lot trickier when it comes to things around quality in general. Assessing the overall quality and relevance of a website is not easy, and it takes a lot of time for Google to understand how a website fits in with regard to the rest of the Internet. That means, on the one hand, that it takes a lot of time for Google to recognise that maybe something is not as good as it thought it was, and similarly, it takes a lot of time for Google to learn the opposite. That can easily take a couple of months, half a year, sometimes even longer. So compared to technical issues, it takes a lot longer for things to be refreshed in that regard.
John also points out that there are very rare situations where a website gets stuck in a weird in-between stage in Google’s systems: at some point the algorithms reviewed the website and found it to be absolutely terrible, and for whatever reason those parts of the algorithms took a very long time to be updated again, sometimes several years.
It happens extremely rarely, especially now, says John. But he suggests that if someone is struggling and really sees that they’re doing a lot of things right and nothing seems to be working, it is worthwhile to reach out to Google staff and see if there is something about the website that might be stuck.
Alt Text and Lazy Load Images
Q. It is not problematic to add alt text to an image that is lazy-loaded, even if only a placeholder image is shown at first.
- (26:31) When Google renders the page, it tries to trigger lazy loading of the images as well, because it loads the page with a very tall viewport, and that triggers lazy loading. Usually that means Google can pick up the alt text and associate it with the right images. If the alt text is already in place and a placeholder image is currently shown, and Google just sees that, then that shouldn’t be a problem per se. It’s like giving information about an image that is unimportant, but it’s not that the rest of the website gets a worse standing from that point of view. The thing to watch out for here is more whether Google can actually load the images that are supposed to be lazy-loaded. In particular, Google doesn’t look at things like a data-src attribute; it essentially needs to see the image URL in the actual src attribute of the img tag so that it can pick it up as an image.
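A minimal sketch of the difference, assuming the browser-native loading attribute is used for lazy loading (the file names are hypothetical):

```html
<!-- Native lazy loading: the real URL stays in src, so Googlebot can pick it up -->
<img src="/images/garden-bench.jpg"
     alt="Wooden garden bench in front of a hedge"
     loading="lazy" width="1200" height="800" />

<!-- Risky pattern: the real URL only lives in data-src and a script must copy it into src -->
<img data-src="/images/garden-bench.jpg" src="/images/placeholder.gif"
     alt="Wooden garden bench in front of a hedge" />
```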
Google Analytics Traffic
Q. Traffic from Google Discover can show up in website analytics as spikes of direct traffic at seemingly random times.
- (32:02) If a huge amount of traffic suddenly starts dropping into the direct channel, one of the things that could be playing a role is Google Discover. In particular, Google Discover traffic is mostly reported as direct traffic in Google Analytics. And Google Discover is sometimes very binary, in the sense that a website either gets a lot of traffic from it or hardly any. So if the website owner only looks at analytics, there might be spikes of direct traffic showing up there. In Search Console there’s a separate report for Google Discover, so this kind of thing can be double-checked there.
Shop Ratings
Q. For an e-commerce shop it’s more advisable to use product ratings, while for a directory of other e-commerce sites it’s fine to have ratings on those individual sites.
- (34:33) When it comes to shop ratings, Google wouldn’t try to show them if they’re on the shop’s own website. Essentially, for local businesses, the idea is that if there is a local directory of businesses, then putting ratings on those listed businesses is fine, and that’s something Google might show in search. But for the local businesses themselves, a rating on their own website is not an objective rating; it can be manipulated to look a little more legitimate, and that’s not something Google can trust and show in the search results. From that point of view, it’s better for an e-commerce shop to use product ratings, because individual products can be reviewed either by the site itself – the shop can clearly specify that the review was written by the shop – or through aggregate ratings, which come from users. For a directory of other e-commerce sites, on the other hand, having ratings on those individual sites would of course be an option.
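For product pages, the usual way to mark up such ratings is Product structured data with an aggregate rating from users; a minimal JSON-LD sketch with hypothetical values:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Blue Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "87"
  }
}
</script>
```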
Search Console
Q. If the number of good pages goes down while the number of bad pages goes up in Search Console, that’s a sign of a problem on the website. If the overall number of pages simply goes down, it means there isn’t enough data for Google to draw a conclusion, and that’s perfectly fine.
- (37:35) Essentially, for the Core Web Vitals and speed in general, Google tracks the data based on what it sees from users – a specific kind of user, which is documented on the Chrome side. Only when Google has sufficient data from users can it use that in Search Console. Additionally, Search Console creates groups of pages that it thinks are essentially the same, and if Google has enough data for such a set of pages, it will use that. That also means that if there is just barely enough data for a set of pages, there can be situations where sometimes Google has enough data and shows it, and sometimes it doesn’t, which can show up as a drop in the number of good pages. That doesn’t mean the website is bad; it just means there’s not enough data to tell. So if just the overall number of pages goes down in Search Console and over time goes back up again, it means there is only just about enough data for Google to use. If the number of good pages goes down and the number of bad ones goes up, that’s a sign of a problem that Google sees. But if just the overall number goes down, it can be ignored; that’s perfectly fine.
Out Of Stock Pages
Q. How Google treats out-of-stock product pages depends on whether it sees them as soft 404 pages or not.
- (44:29) Google tries to understand when a page is no longer relevant based on the content of the page. The common example is a soft 404 page, where a page looks like it could be a normal page but is essentially an error page saying “This page no longer exists”. Google tries to pick up things like that for e-commerce as well. When out-of-stock product pages are seen as soft 404 pages, Google drops them completely from search. If it keeps a page indexed despite the product being out of stock, the ranking of the page will not change; it will still be ranked normally. It’s also still ranked normally if the structured data is changed to say that something is out of stock. So it’s not that the page would drop in ranking; it’s more that either it’s seen as a soft 404 page or it’s not, and if it’s not, it’s still a normal page.
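Changing the structured data to say that something is out of stock, as mentioned above, would typically mean setting the availability of the Offer; a minimal JSON-LD sketch with hypothetical product details:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Blue Widget",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/OutOfStock"
  }
}
</script>
```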
User Reviews
Q. There are different ways to show customers’ reviews of a particular business, but the business itself still can’t get those reviews shown as stars in its own search results.
- (54:03) Customer reviews can be put on the homepage; Google would see these more as testimonials, because the website owner is picking and choosing what to show. Using review structured data for that on the site’s own pages is something Google wouldn’t like to see – it would probably ignore it. Sending users to a third-party review site where they can leave reviews is the best approach here, because if Google shows the listing from that third-party review site, it can show the structured data for the business there. For example, if the business is listed on Yelp and customers leave reviews there, then when that Yelp listing is shown in search, Google knows that this is essentially a list of businesses, so it can show the structured data about the business and use that review markup to show stars in the search results. So if Yelp has structured data, that’s something Google could pick up. Showing the reviews on the business’s own website in textual form is perfectly fine as well.
Testimonials are very common and popular; it’s just that Google won’t show stars in the search results for them.
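What such a third-party review site or directory might have in place is ordinary review markup on the business listing; a minimal JSON-LD sketch with a hypothetical business and rating values:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Bakery",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "reviewCount": "132"
  }
}
</script>
```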
Sign up for our Webmaster Hangouts today!