Blog Pages are Ranking, while Product Pages are not
Q. If certain types of pages on a website rank well while others don’t, it may come down to how Google interprets the queries that land users on the website
- (03:18) The person asking the question describes a situation where the blog posts on their website get attention while the product pages don’t really rank. John explains that for some types of queries Google tries to understand the intent behind the query and to figure out whether someone is looking for information about a product or looking to buy a product. For this particular website, it might be that Google is interpreting the queries landing on those pages as information-seeking rather than transactional. Since this is Google’s assumption about what the user is asking for, it’s not easy to change. What John suggests is making it as easy as possible for people to get to the products, so that there is a clear call to action in the blog post. If people landing on the blog pages don’t want to buy products and are only searching for information, then that’s really out of the SEO’s control.
John advises improving the website’s visibility by making the overall quality good and encouraging people to buy the products, review them, and recommend the website to others. Over time this converts into better visibility, and an easy-to-follow call to action from the blog pages to the product pages will in turn convert that visibility into valuable results.
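As a minimal sketch of what such a call to action could look like inside a blog post (the product URL, class names, and copy here are invented for illustration, not taken from the hangout):

```html
<article>
  <h1>How to Brew Jasmine Green Tea</h1>
  <p>…article content…</p>

  <!-- Clear, easy-to-follow call to action pointing at the product page -->
  <aside class="cta">
    <p>Want to try it yourself?</p>
    <a class="cta-button" href="/products/jasmine-green-tea">Shop our Jasmine Green Tea</a>
  </aside>
</article>
```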
Building Visibility
Q. It’s more reasonable to build visibility by first trying to rank for specific, unique keywords rather than general ones
- (10:34) Even with time and continued effort, it might be too hard to rank for general keywords like “green tea” and get visibility from them. John suggests finding more specific queries and more specific kinds of products where there is less competition, like a particular type of green tea or a special leaf type – something unique where you can stand out. Those kinds of queries don’t get traffic comparable to the general keywords, but they usually get enough for a website that is just starting out. From there, you can take on the next specific keyword and keep expanding.
Reviews
Q. Reviews are an indirect factor in website assessment
- (13:54) Reviews gathered over time on the products sold on the website are not a direct signal that Google takes into account, but seeing people engage with the products is a good sign, and it means that other signals might build up over time.
Page Experience Update
Q. If there is a drop in website traffic right after the Page Experience Update rollout, it might not be due to the update
- (14:39) The Page Experience Update started rolling out in July and finished at the end of August, and it was applied on a per-page basis. That means that if Google saw a website was slow on Core Web Vitals, there would be a gradual change in traffic over time. So if some kind of drastic change, whether gains or losses, happened right around those dates, it might mean that something else is causing it, not the update.
More Pages or Fewer Pages?
Q. It’s better to create fewer but stronger pages for the areas where there is more competition, and vice versa
- (16:54) Having pages on a website is all about balancing more general and more specific pages. John says that when there are fewer, more general pages, those pages tend to be a little bit stronger, whereas if there are a lot of pages, the value is in a way spread out across them. If a specific topic has stronger competition, it’s better to have fewer but very strong pages, and if the targeted area doesn’t have high competition, having more pages is fine. So, when starting out, it’s generally wiser to have fewer, very strong pages so that the website can be as strong as possible in its area, and over time, as the website consolidates itself there, those pages can be split off into more niche topics.
Internal Linking
Q. The way to signal to Google which pages are more of a priority is internal linking
- (18:48) There isn’t really a way to give one page priority over the others directly, but internal linking can help. Within the website it is possible to highlight certain pages by making sure they are well linked internally, and it may also be a good idea to leave non-priority pages a little less well linked. John suggests linking to important pages from the homepage, and to the less important ones from category and subcategory pages. When Google looks at the website, it knows the homepage is very important, and the pages the homepage points to are also important. Google doesn’t always follow that, but it’s a way to pass along that kind of information.
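A rough sketch of this structure, with invented URLs, could be a homepage that links directly to priority pages and leaves niche pages to be linked from category pages:

```html
<!-- Homepage: direct links suggest these pages matter most -->
<nav>
  <a href="/green-tea/">Green Tea</a>
  <a href="/oolong-tea/">Oolong Tea</a>
  <a href="/best-sellers/">Best Sellers</a>
</nav>

<!-- Less important pages are linked only one level down instead,
     e.g. /green-tea/ links to /green-tea/sencha-sampler/ -->
```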
Canonical URL
Q. Setting up canonical URLs for internal linking is not necessary
- (20:39) Canonical URLs are important when multiple URLs show the same content, for example, when there are tracking URLs within a blog – in that case, the canonical URL helps Google understand which page is the primary one. But for a normal website that simply links to different things, there is no critical need for canonical URLs – it’s good practice to have them, but there are no SEO benefits.
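For the tracking-URL case John mentions, the canonical is a single link element in the page head; the URLs below are placeholders:

```html
<!-- Served on both /blog/post and /blog/post?utm_source=newsletter,
     telling Google that the clean URL is the primary one -->
<link rel="canonical" href="https://www.example.com/blog/post" />
```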
New Language Version of Website
Q. If there is a new language version of a website, it’s good to add a JavaScript-based banner to direct users to the right version of the website
- (21:26) John suggests adding a JavaScript-based banner to the pages of the website that have another language version, to try to recognise when a user is on the wrong version of a page, based on the browser language or the user’s location, if possible. The banner at the top should say that there is a better version for this user and that they can follow a link to the right version. Using a banner like that means Google can still index all of these pages, while users can be guided to the appropriate one a little bit faster. If this were done on the server side, for example with a redirect, the problem could be that Googlebot never sees the other version because it always gets redirected. The banner is a backup plan: usually hreflang will help where geotargeting is set up, but hreflang and geotargeting don’t guarantee that only the right users land on these pages, so the banner helps users find the right version.
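A minimal sketch of such a banner, assuming an English page with a hypothetical German alternate at an invented URL, might look like the following. It only reads the browser language and never redirects, so Googlebot can still crawl and index every version:

```html
<div id="lang-banner" hidden>
  Diese Seite gibt es auch auf Deutsch:
  <a href="https://example.com/de/seite">Zur deutschen Version</a>
</div>
<script>
  // Show the banner only if the browser prefers German;
  // crucially, no redirect happens, so indexing is unaffected.
  var lang = (navigator.language || "").toLowerCase();
  if (lang.indexOf("de") === 0) {
    document.getElementById("lang-banner").hidden = false;
  }
</script>
```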
Wrong Publish Date in the Search Results
Q. There are two important things about publish dates: date alignment across the page and time zones
- (27:47) When it comes to dates in the Search Results, Google tries to find the date that best aligns across all of the signals it gets. That means Google looks at things like the structured data, the page text, etc. to understand what the date might be. If Google can’t recognise that the same date and time are used across multiple locations, it tries to figure out which one might be the most relevant. When the visible part of the article shows things like “10 minutes ago” or “5 hours ago” instead of a date, Google can’t match that, because it doesn’t know what exactly is meant. Making the date and time visible in the article, consistent with the structured data, is a good way to get Google to use them. Watching out for things like time zones is also important, as they are one of the things that usually go wrong.
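One way to keep the visible date and the structured data aligned, including an explicit time-zone offset in ISO 8601 format, is sketched below with placeholder values:

```html
<p>Published: <time datetime="2021-09-10T09:30:00+02:00">September 10, 2021, 9:30 CEST</time></p>

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example article",
  "datePublished": "2021-09-10T09:30:00+02:00",
  "dateModified": "2021-09-10T09:30:00+02:00"
}
</script>
```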
Spam Traffic
Q. Google has a fairly good understanding of spam traffic and it doesn’t end up causing problems for websites
- (30:02) Google sees lots of weird spam traffic on the web over time and has a good understanding of it. There are certain things that Google watches out for, and it filters out the usual spam traffic, so that shouldn’t be causing any problems for websites.
E-A-T
Q. E-A-T is not determined by specific technical factors on a website
- (33:47) E-A-T stands for Expertise, Authoritativeness and Trustworthiness, and it comes from Google’s Quality Rater Guidelines. The Quality Rater Guidelines are not a handbook to Google’s algorithms, but rather something Google gives to the people who review the changes it makes to the algorithms. E-A-T is specific to certain kinds of sites and certain kinds of content – there is no E-A-T score based on a specific number of links or anything like that. It’s more about Google improving its algorithms and the Quality Raters reviewing those improvements; there aren’t specific technical elements that combine into an SEO factor. John suggests looking into E-A-T if the website maps into one of the broad areas where Google has mentioned E-A-T in the Quality Rater Guidelines.
Recognising Sarcasm
Q. Google is not adept at recognising sarcasm, so it’s better to make important messages very clear
- (36:40) There is always a risk that Google misunderstands things, and it doesn’t understand when there’s sarcasm on a page. Especially when it’s really critical to get the right message across to Google and to all users, making the message as clear as possible is important. It’s generally better to avoid sarcasm when, for example, talking about medical information; when writing about an entertainment topic, it’s probably less of an issue.
Captcha
Q. If content is visible without the need to fill out a captcha, Google is okay with that; otherwise it might be a problem
- (41:57) Googlebot doesn’t fill out any captchas, even Google-based ones. If the captcha needs to be completed for the content to become visible, Google won’t have access to the content; but if the content is available without doing anything and the captcha is just shown on top, that is usually fine for Google to crawl and index the page. To test this, John suggests using the URL Inspection tool in Search Console and fetching those pages to see what comes back: on the one hand, the visible page, to make sure it matches the visible content, and on the other hand, the rendered HTML, to make sure it includes the content that should be indexed. He restates that, from a policy point of view, serving the full content while requiring the captcha on the user side is fine – basically, if things are done slightly differently for Googlebot or other search engines compared to an average user, that would be fine.
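The acceptable pattern John describes could be sketched like this – the full content is present in the served HTML and the captcha is only an overlay for users (the element names and widget here are placeholders):

```html
<!-- Full article text is in the HTML that Googlebot receives -->
<article id="content">
  <h1>Example article</h1>
  <p>The complete content is served to everyone, including Googlebot.</p>
</article>

<!-- Captcha overlay shown to users on top of the content;
     client-side JS removes it once the captcha is solved.
     The problematic variant would be the reverse: an empty
     <article> filled in only after the captcha is completed. -->
<div id="captcha-overlay">…captcha widget…</div>
```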
One Author Writing for Different Websites
Q. There are no guidelines on one person writing content for different websites, but it’s better to help Google recognise it’s the same person
- (43:41) From Google’s point of view, there are no guidelines on where people can write or what kind of content they can create, and people creating content on multiple sites is perfectly fine. From a practical point of view, it helps if the author creates something like a profile page that collects all the information about the things they do. Pointing to an author page or profile page like that is a good way to make sure search engines understand that this is one specific person who writes for certain pages. It’s not something that must be done according to any policy or guideline, but it’s good practice.
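One possible way to point search engines at such a profile page is author markup in structured data; the names and URLs below are invented for illustration:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example article",
  "author": {
    "@type": "Person",
    "name": "Jane Doe",
    "url": "https://janedoe.example/about",
    "sameAs": [
      "https://site-a.example/authors/jane-doe",
      "https://site-b.example/contributors/jane"
    ]
  }
}
</script>
```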
Recrawling Thin Content
Q. Updated and expanded existing content might take longer to be recrawled and re-indexed, and trying to push that by submitting URLs manually might not be the best strategy
- (46:10) When thin content gets published and later updated, it sometimes takes a little bit longer for Google to recrawl the page and pick up the new version of the content, so John suggests trying to avoid publishing thin content on a regular basis. Sometimes updating an article is necessary, and as it expands over time, Google tries to pick that up over time.
For the person asking the question, who is primarily worried about Google not recrawling and re-indexing his recent article updates, the issue, John says, might actually be that he manually submits the links of every article after publishing. That makes Google a little bit nervous and pickier about the content of the website, because usually, if there is fantastic content on a website, Google goes off and crawls the website regularly, and there is no need to submit everything manually.
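Instead of manual submission, a common alternative (not discussed in the hangout, so only a suggestion) is to list updated articles in an XML sitemap with an accurate lastmod date and let Google recrawl on its own schedule; the URL and date below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/articles/green-tea-guide</loc>
    <!-- lastmod reflects the most recent significant content update -->
    <lastmod>2021-09-10</lastmod>
  </url>
</urlset>
```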
Content Author’s Qualifications
Q. The author’s qualifications are not a direct factor in terms of SEO
- (51:06) The authority of the person writing the content (citations in journals, etc.) doesn’t really play a role as a ranking factor, John says. However, he points out that associating with a strong author might come into play in the bigger picture of the website, so it’s a long-term thing rather than an SEO factor.
Discovered, not Indexed
Q. If a website often runs into the problem of Googlebot discovering new content but not indexing it, the problem might be the overall quality of the website
- (54:54) Sometimes Google might not really be sure about the quality of a website, and when new things get published, Google understands that new content exists and discovers it, but ends up not indexing it. The main way to solve this problem is to increase the overall quality of the website. Sometimes that means removing some old things and making sure that everything being published is fantastic. If the system is convinced of the quality of the website, it will crawl and index more and will try to pick things up as quickly as possible; if the system is not 100 percent sure, it works sometimes and sometimes it doesn’t.
Sign up for our Webmaster Hangouts today!