
Is your agency focusing on ROAS and not Revenue?

As the cost of paid media advertising has increased over the last 2-3 years of elevated eCommerce usage, we have seen Cost Per Clicks (CPCs) rise at an exponential rate as businesses scramble and fight to acquire market share & increase top-line revenue. As a result, the ability to maintain a decent Return On Ad Spend (ROAS) has been put in a compromising position, where to maintain the ROAS, the business may have to sacrifice top-line & paid media revenue contributions.

This has created a fork in the road that feels more like an ultimatum for the business: it must either choose an acceptable ROAS with slower revenue growth of 5-10% YoY, or a historically below-average ROAS with strong revenue growth of 20-30% YoY. A common sign that you may be focusing on the wrong goal is when your ad account ROAS is growing, but your full-site revenue is stagnant or even in decline over the same period.

Here at LION Digital, we have seen both of these scenarios play out & neither of them is good or bad, right or wrong. Your agency should be having an in-depth conversation with you monthly/quarterly about what your true business goals are overall. It needs to be about more than just monthly budgets, monthly revenue targets, new product launches & increasing awareness of your product/service. We need to be setting goals for 1, 2 or 3 years into the future & then reverse engineering them to work out micro checkpoints that will lead you to your macro goal. This may seem like common knowledge, but somewhere along the line, digital marketing has been misconstrued to mean that more money & big changes equal big returns & big improvements. Sadly, this is not true.

This puts digital media agencies in an interesting position where they can no longer pump out a one-size-fits-all approach, & even the small businesses entering the online market need to have an in-depth understanding of what they are looking to get out of their marketing.

In this article, we’ll cover the most important points to consider when identifying goals for your business’s growth in 2022 & beyond.

Firstly, let’s break down what Return On Ad Spend (ROAS) is & whether it is the right goal for the business at this stage in its life cycle.

When we have a new client start at LION, the most common strategy we implement is to spend 1-3 months optimising the account to an acceptable ROAS we have set with the client & completing an in-depth technical audit, to ensure that when it comes time to scale, we don’t have any issues behind the scenes that will hamper our ability or cause amplified inefficiencies. This is the most common issue we uncover in new accounts we onboard from other agencies: they have tried to scale in the past, but the agency hasn’t done anything apart from putting the budgets up & hoping for the best. Although some technical aspects may be time-consuming, LION lays a solid foundation by mastering the basics; we have found this is the only way to ensure long-term success.

Once we have created a stable foundation, we begin to set multiple KPIs, benchmarks & goals to work towards in manageable steps. One of those KPIs is ROAS & it can be looked at in two ways based on your business/industry/season/market conditions:

  • Firstly, you may use ROAS as an indicator of inefficiency to protect your margins. This has been especially important over the last 2-3 years, with supply issues increasing, logistics costs rising & margins getting thinner.
  • Secondly, you may use ROAS as a floor metric for performance that you do not wish to drop below while trying to maximise revenue.

Neither of these approaches is good or bad, right or wrong. You simply need to be honest about what would benefit the business in the short vs long term.
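
To make these two uses concrete, here is a minimal TypeScript sketch with made-up numbers (not client data), treating ROAS as revenue attributed to ads divided by ad spend and checking each channel against an agreed floor. It also shows the trade-off discussed above: adding a lower-ROAS channel drags the blended ROAS down while lifting total paid revenue.

```typescript
// Illustrative sketch only: ROAS = revenue attributed to ads / ad spend.
// The numbers below are made up for demonstration.
interface ChannelResult {
  name: string;
  adSpend: number;  // total paid media cost
  revenue: number;  // revenue attributed to the ads
}

const roas = (c: ChannelResult): number => c.revenue / c.adSpend;

const channels: ChannelResult[] = [
  { name: "Brand Search", adSpend: 5_000, revenue: 60_000 },    // efficient, low headroom
  { name: "Generic Search", adSpend: 20_000, revenue: 70_000 }, // lower ROAS, adds new revenue
];

const ROAS_FLOOR = 4; // example floor agreed with the client

for (const c of channels) {
  const r = roas(c);
  console.log(`${c.name}: ROAS ${r.toFixed(1)}x ${r < ROAS_FLOOR ? "(below floor)" : ""}`);
}

// Blended view: adding the lower-ROAS channel drops blended ROAS from 12x to ~5.2x,
// but total paid revenue grows from $60k to $130k: the ROAS vs revenue trade-off.
const totalSpend = channels.reduce((sum, c) => sum + c.adSpend, 0);
const totalRevenue = channels.reduce((sum, c) => sum + c.revenue, 0);
console.log(`Blended ROAS: ${(totalRevenue / totalSpend).toFixed(1)}x`);
```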

The approach you choose & what the ROAS could look like are vastly different from one industry to another. The main points that drive differences are:

  • Cost Per Click (CPC), driven by the level of competition
  • Average Order Value (AOV) of your online business
  • Total Ad Spend Budget (Cost) Allocated across the account
  • Types of campaigns running, Brand/PMAX/Shopping/Remarketing/Display/Search

It’s important we cover this, as these metrics will heavily influence your ROAS & overall results. The most overlooked point is simply what types of campaigns are running & whether they align with the goals/approach you’re working towards. This point is the bread & butter of why there is a trade-off between ROAS & Revenue.

For example, if we were to focus on ROAS, the budget split would be more towards retaining market share, retention, loyalty & efficiency of ad spend. This would likely skew the spend towards PMAX/Shopping/Remarketing & maybe some niche search campaigns.

On the flip side, the more the ad account pushes into generic search keyword territory, the higher the competition is going to be, the lower the conversion rate will be & the more ad costs are likely to increase. The results will be similar if the ad account skews more towards Display ads, YouTube & broad awareness marketing.

However, these campaigns still have value & will likely contain the highest % of people who haven’t yet bought from you. It’s imperative that if you do decide to spend money on these campaign types, you have price competitiveness, stock availability, unique selling points, reasons to buy from you over a competitor & lastly, a suitable budget for a minimum of 1-3 months of ad run time.

That last point is crucial, as paid search is vastly different from social ads in that the longer your ads run, the better your reputation, expected clickthrough rate & bounce rate become. This will improve your ad relevance in the eyes of Google & allow you to creep up the paid rankings over time. Unfortunately, in paid search in 2022, there is very little likelihood that dropping $10k into broad keywords on Black Friday is going to deliver the business any type of result if you don’t have any form of foundation from previous months’ work.

Article by

Sam McDonough –
Paid Media Director

WebMaster Hangout – Live from SEPTEMBER 07, 2022

A site that connects seekers and providers of household-related services.

LIZZI SASSMAN: (01:03) So the first question that we’ve got here is from Dimo, a site that connects seekers and providers of household related services. Matching of listings is based on zip code, but our users come from all over Germany. The best value for users is to get local matches. Is there some kind of schema markup code to tell Google algorithm show my site also in the Local Pack. Please note we do not have local businesses and cannot utilise the local business markup code. The site is… and I’m going to redact that.

  • A: (01:36) Yes, so Dimo– as you noted, local business markup is for businesses with physical locations. And that means that there’s typically one physical location in the world for that place. So it should only show up for that city. And your other question, currently, there’s no rich result feature for online services only that you could use structured data for, but you can claim your business profile with Google Business Profile manager and specify a service area there. So I think that that could help.

Is there a way to measure page experience or the core web vitals on Safari browsers, and is there a way to improve the experience?

MARTIN SPLITT: (02:12)  Indra asks, Google Search Console shows excellent core web vital scores for our site, but I understand that it only shows details of Chrome users. A majority of our users browse using Safari. Is there a way to measure page experience or the core web vitals on Safari browsers, and is there a way to improve the experience?

  • A: (02:35) Well, you can’t really use Google Search Console for this, but you can definitely measure these things yourself with the browser developer tools in a Safari browser and maybe ask around if you have any data from Safari users through analytics, for instance. There’s nothing here that we can do for the page experience or Search Console’s page experience resource because the data is just not available.

How can I best switch from one domain to a new one?

JOHN MUELLER: (03:01) For the next question, I’ll paraphrase. How can I best switch from one domain to a new one? Should I clone all the content or just use 80% of the content? What is the fastest way to tell Google that they’re both my sites?

  • A: (03:17) We call this process a site migration. It’s fairly well documented, so I would look up the details in our documentation. To simplify and leave out a lot of details, ideally, you’d move the whole website, 1 to 1, to the new domain name and use permanent 301 redirects from the old domain to the new one. This is the easiest for our system to process. We can transfer everything directly. If you do other things like removing content, changing the page URLs, restructuring, or using a different design on the new domain, that all adds complexity and generally makes the process a little bit slower. That said, with a redirect, users will reach your new site, regardless of whether they use the old domain or the new one.
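
As a rough illustration of the whole-site, 1 to 1 move described here, the sketch below (TypeScript/Node, with hypothetical domain names) answers every request on the old domain with a permanent 301 redirect to the same path on the new domain.

```typescript
// Minimal sketch: serve the OLD domain with nothing but permanent redirects
// that preserve the path and query string, e.g.
//   https://old-domain.example/shoes?size=9 -> https://new-domain.example/shoes?size=9
import { createServer } from "node:http";

const NEW_ORIGIN = "https://new-domain.example"; // hypothetical new domain

createServer((req, res) => {
  const location = new URL(req.url ?? "/", NEW_ORIGIN).toString();
  res.writeHead(301, { Location: location }); // 301 = permanent, the signal described above
  res.end(`Moved permanently to ${location}`);
}).listen(8080);
```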

Do you support the use and the full range of schema.org entities when trying to understand the content of a page, outside of use cases such as rich snippets?

LIZZI SASSMAN: (04:04) And our next question is from IndeQuest1. Do you support the use and the full range of schema.org entities when trying to understand the content of a page outside of use cases such as rich snippets? Can you talk about any limitations that might exist that might be relevant for developers looking to make deeper use of the standard?

  • A: (04:26) So, to answer your question, no, Google does not support all of the schema.org entities that are available on schema.org. We have the search gallery which provides a full list of what we do support for rich snippets, like you mentioned, in Google Search results. But not all of those things are visual. We do talk about certain properties that might be more metadata-like, and that aren’t necessarily visible as a rich result. And that still helps Google to understand things, like authors or other metadata information about a page. So we are leveraging that kind of thing.

What could be the reason that the sitemap cannot be read by the Googlebot?

GARY ILLYES: (05:07) Anton Littau is asking, in Search Console, I get the message “sitemap could not be read” in the sitemap report. No other information is provided. What could be the reason that the sitemap cannot be read by the Googlebot?

  • A: (05:21) Good question. The “sitemap could not be read” message in Search Console may be caused by a number of issues, some of them technical, some of them related to the content quality of the site itself. Rarely, it may also be related to the hosting service, specifically, if you are hosting on a free domain or subdomain of your hoster, and the hoster is overrun by spam sites, that may also cause issues with fetching sitemaps.

We’ve got guides and tips that are illustrated on our website, and they’re not performing well in the SERP.

LIZZI SASSMAN: (05:53) Our next question is from Nicholas. We would like to know how algorithms treat cartoon illustrations. We’ve got guides and tips that are illustrated on our website, and they’re not performing well in the SERP. We tried to be unique, using some types of illustrations and persona to make our readers happy. Do you think we did not do it right?

  • A: (06:18) I don’t know because I don’t think I’ve ever seen your cartoons, but I can speak to how to improve your cartoon illustrations in SERP. So our recommendation would be to add text to the page to introduce the cartoons, plus alt text for each of the images. Think about what people will be searching for in Google Images to find your content. And use those kinds of descriptive words versus just saying the title of your cartoon. Hope that helps.

Does posting one content daily increase rankings?

GARY ILLYES: (06:46) Chibuzor Lawrence is asking, does posting one content daily increase rankings?

  • A: (06:53) No, posting daily or at any specific frequency, for that matter, doesn’t help with ranking better in Google Search results. However, the more pages you have in the Google index, the more your content may show up in Search results.

Does Google agree with the word count or not?

LIZZI SASSMAN: (07:09) OK, and the next question is from Suresh. About the helpful content update, that only 10% write quality content, and the rest, 90%, don’t write quality content, lengthy content, but how should they write quality content? Does Google agree with the word count or not?

  • A: (07:29) Well, nope, content can still be helpful whether it’s short or long. It just depends on the context and what that person is looking for. It doesn’t matter how many words, if it’s 500, 1,000. If it’s answering the user’s intent, then it’s fine. It can be helpful. These are not synonymous things.

When using words from a page title in the URL, should I include stop words too?

JOHN MUELLER: (07:49) I’ll paraphrase the next question, hopefully correctly. In short, when using words from a page title in the URL, should I include stop words too? For example, should I call a page whyistheskyblue.HTML or whyskyblue.HTML?

  • A: (08:08) Well, thanks for asking. Words in URLs only play a tiny role in Google Search. I would recommend not overthinking it. Use the URLs that can last over time, avoid changing them too often, and try to make them useful for users. Whether you include stop words in them or not or decide to use numeric IDs, that’s totally up to you.

Do different bot types, image and desktop, share crawl budget?

GARY ILLYES: (08:31) Sanjay Sanwal is asking: do different bot types, image and desktop, share crawl budget? And what about different hosts?

  • A: (08:40) Fantastic question. The short answer is yes, Googlebot and its friends share a single crawl budget. What this means for your site is that if you have lots of images, for example, Googlebot Images may use up some of the crawl budget that otherwise could have been used by Googlebot. In reality, this is not a concern for the vast majority of sites. So unless you have millions of pages and images or videos, I wouldn’t worry about it. It’s worth noting that the crawl budget is per host. So, for example, if you have subdomain.example.com and another, different subdomain of example.com, they have different crawl budgets.

Request to 301 redirect the subdirectory to their new German site. Would you advise against it?

JOHN MUELLER: (09:24) Christopher asks: we’ve sold the German subdirectory of our website to another company. They request us to 301 redirect the subdirectory to their new German site. Would you advise against it? Would it hurt us?

  • A: (09:40) Well, on the one hand, it all feels kind of weird to sell just one language version of a website to someone else. On the other hand, why not? I don’t see any problems redirecting from there to a different website. The only thing I would watch out for, for security reasons, is that you avoid creating so-called open redirects, where any URL from there is redirected to an unknown third party. Otherwise, that sounds fine.
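
To illustrate the open-redirect risk mentioned here, a hedged TypeScript sketch (hostnames are hypothetical): the sold /de/ subdirectory is only ever forwarded to one fixed, agreed destination, and any destination that comes from user input is validated against an allowlist first.

```typescript
// Sketch: guard against open redirects by only forwarding to known hosts.
// Hostnames here are hypothetical.
const ALLOWED_HOSTS = new Set(["german-site.example", "www.german-site.example"]);

function safeRedirectTarget(requestPath: string): string | null {
  // The sold subdirectory always goes to the fixed, agreed destination.
  if (requestPath === "/de" || requestPath.startsWith("/de/")) {
    return new URL(requestPath.replace(/^\/de/, "") || "/", "https://german-site.example").toString();
  }
  return null; // everything else is served locally, never redirected
}

// An open redirect would instead look like: redirect(req.query.next) with no check.
// If a destination ever does come from user input, validate its host first:
function isAllowedDestination(raw: string): boolean {
  try {
    return ALLOWED_HOSTS.has(new URL(raw).hostname);
  } catch {
    return false; // not an absolute, parseable URL
  }
}

console.log(safeRedirectTarget("/de/kontakt"));                  // https://german-site.example/kontakt
console.log(isAllowedDestination("https://attacker.example/x")); // false
```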

Can I expect to see clicks and impressions from this in the search appearance filter as we can see with some other rich results?

LIZZI SASSMAN: (10:08) Sam Gooch is asking: I’m experimenting with the new learning video rich result and can see it’s being picked up in Google Search Console. Can I expect to see clicks and impressions from this in the search appearance filter, as we can see with some other rich results?

  • A: (10:23) Well, to answer this question specifically, there’s no guaranteed time that you’ll be able to see a specific rich result in Google Search after adding structured data. But I think what you’re asking about here is for a specific thing to be added to Search Console, and we’ll have to check with the team on the timeline for that. And we don’t pre-announce when certain things will be added to Search Console. But you can check the rich result status report for the learning video and make sure that you’re adding all of the right properties and that it’s valid and ready to go for Google to understand what it needs in order to generate a rich result. Hope that helps.

How big is the risk of penalising action if we use the same HTML structure, same components, layout, and same look and feel between the different brands?

JOHN MUELLER: (11:02) Roberto asks: we’re planning to share the same backend and front end for our two brands. We’re ranking quite well with both of them in Google. How big is the risk of penalising action if we use the same HTML structure, same components, layout, and same look and feel between the different brands? What would be different are the logos, fonts, and colours. Or would you suggest migrating to the same front end but keeping the different experience between the two brands?

  • A: (11:33) Well, this is a great question. Thanks for submitting it. First off, there’s no penalty or web spam manual action for having two almost identical websites. That said, if the URLs and the page content are the same across these two websites, then what can happen for identical pages is that our systems may pick one of the pages as a canonical page. This means we would focus our crawling, indexing, and ranking on that canonical page. For pages that aren’t identical, we generally index both of them. For example, if you have the same document on both websites, we’d pick one and only show that one in Search. In practice, that’s often fine. If you need both pages to be shown in Search, just make sure they’re significantly different, not just with a modified logo or colour scheme.

JavaScript SEO, what to avoid along with JavaScript links?

MARTIN SPLITT: (12:23) Anna Giaquinto asks, JavaScript SEO, what to avoid along with JavaScript links?

  • A: (12:30) Well, the thing with links is that you want to have a proper link, so avoid anything that isn’t a proper link. What is a proper link? Most importantly, it’s an HTML tag that has an href that lists a URL that is resolvable, so not like a JavaScript colon URL. And that’s pretty much it. If you want to learn more about JavaScript-specific things for Search, you can go to the JavaScript beginner’s guide on developers.google.com/search and see all the things that you might want to look out for.
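
To make “a proper link” concrete, here is a small TypeScript/DOM sketch (element IDs and URLs are made up) contrasting a crawlable anchor with the javascript: style of link warned about above.

```typescript
// A proper, crawlable link: an <a> element whose href is a resolvable URL.
const link = document.createElement("a");
link.href = "/products/blue-widget";   // real URL a crawler can follow
link.textContent = "Blue widget";
document.querySelector("#product-list")?.append(link); // hypothetical container

// Patterns to avoid (not proper links for crawlers):
//   <a href="javascript:openProduct(42)">Blue widget</a>
//   <span onclick="location.href='/products/blue-widget'">Blue widget</span>

// If an SPA needs a click handler, keep the real href in place as well:
link.addEventListener("click", (event) => {
  event.preventDefault();
  // hand off to the client-side router here; crawlers still see the href
});
```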

I research a keyword that has no volume or keyword density, but we are appearing for those keywords on the first page. Should we target that keyword?

LIZZI SASSMAN: (13:05) Our next question is from Sakshi Singh. Let’s say I research a keyword that has no volume or keyword density, but we are appearing for those keywords on the first page. Should we target that keyword?

  • A: (13:19) Well, Sakshi, you can optimise for whatever keywords you want, and it’s not always about the keywords that have the most volume. I would think about how people should find your page and target those keywords.

Will audio content be given more priority and independent ranking following the helpful content algorithm update?

GARY ILLYES: (13:32) Kim Onasile is asking, hello, you previously advised that there are no SEO benefits to audio versions of text content and that audio-specific content doesn’t rank separately like video content. However, given you also said it might be that there are indirect effects like if users find this page more useful and they recommend it more, that’s something that could have an effect. Will audio content be given more priority and independent ranking following the helpful content algorithm update?

  • A: (14:07) This is an interesting question. And ignoring the helpful content algorithm update part, no, audio content, on its own, doesn’t play a role in the ranking of text results.

Is it OK to fetch meta contents through JavaScript?

MARTIN SPLITT: (14:33) Someone asked, is it OK to fetch meta contents through JavaScript? I think that means: is it OK to update meta tag data with JavaScript?

  • A: (14:44) While that is possible to do, it is best not to do that. It may give Google Search mixed signals, and some features may not pick up the changes. Like, some specific search result types might not work the way you expect them to. Or it might have incorrect information, or it might miss something. So I would suggest not doing that.
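
For clarity, this is the sort of client-side meta update the question refers to, sketched in TypeScript with hypothetical values; as the answer notes, it is safer to ship the final title and description in the server-rendered HTML where possible.

```typescript
// The pattern being discussed: rewriting meta data with client-side JavaScript.
// Per the answer above, prefer putting the final values in the server-rendered HTML,
// because JS-injected changes can reach Google late or inconsistently.
function setMetaDescription(text: string): void {
  let tag = document.querySelector<HTMLMetaElement>('meta[name="description"]');
  if (!tag) {
    tag = document.createElement("meta");
    tag.name = "description";
    document.head.appendChild(tag);
  }
  tag.content = text;
}

document.title = "Blue widget | Example Shop";                     // hypothetical title
setMetaDescription("Hand-made blue widgets, shipped worldwide.");  // hypothetical copy
```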

Both of my websites have been hit by different updates, around 90% drops, and are suffering from some type of flag that is suppressing our sites until the soft penalty is lifted.

GARY ILLYES: (15:08) Anonymous is asking, both of my websites have been hit by different updates, around 90% drops, and are suffering from some type of flag that is suppressing our sites until the soft penalty is lifted. Or is there even a soft penalty?

  • A: (15:26) Good question. No, the named updates that we publish on the Rankings Updates page on Search Central are not penalties in any shape or form. They are adjustments to our ranking algorithms, so they surface even higher quality and more relevant results to Search users. If your site has dropped in rankings after an update, follow our general guidelines for content, take a look at how you could improve your site as a whole, both from content and user experience perspective, and you may be able to increase your rankings again.

When would be the next possible update for the Search results?

JOHN MUELLER: (16:03) Ayon asks, when would be the next possible update for the Search results?

  • A: (16:09) Well, on our How Search Works site, we mentioned that we did over 4,000 updates in 2021. That’s a lot of updates. Personally, I think it’s critical to keep working on things that a lot of people use. Our users and your users expect to find things that they consider to be useful and relevant. And what that means can change over time. Many of these changes tend to be smaller and are not announced. The bigger ones, and especially the ones which you, as a site owner, can work on, are announced and listed in our documentation. So in short, expect us to keep working on our systems, just like you, hopefully, keep working on yours.

Does having a star aggregated ranking on recipes improve its position?

LIZZI SASSMAN: (16:54) And our next question is from Darius. So Darius is asking, does having a star aggregated ranking on recipes improve its position?

  • A: (17:05) I think what Darius is asking about is the stars that show up for recipes and with structured data and whether or not that has an effect on ranking. So while the stars are more visual and eye-catching, structured data in and of itself is not a ranking signal. And it isn’t guaranteed that these rich results will show up all the time. The Google algorithm looks at many things when it’s creating what it thinks is the best Search experience for someone. And that can depend on a lot of things, like the location, language, and device type.

When I don’t set a rel-canonical, then I can see the internal links in the search console in the links report. Is this normal?

JOHN MUELLER: (17:37) Christian asks: I have set the rel-canonical together with a noindex meta tag. When Google does not accept a canonical at all, all internal links are dropped. When I don’t set a rel-canonical, then I can see the internal links in the search console in the links report. Is this normal?

  • A: (17:55) Well, this is a complex question since it mixes somewhat unrelated things. A noindex says to drop everything and the rel-canonical hints that everything should be forwarded. So what does using both mean? Well, it’s essentially undefined. Our systems will try to do the best they can in a conflicting case like this, but a specific outcome is not guaranteed. If that’s fine with you, for example, if you need to use this setup for other search engines, then that’s fine with us too. If you want something specific to happen, then be as clear as possible for all search engines.

If a video is indexed in the video indexing report, is it still worth adding the video structured data on that page and why?

LIZZI SASSMAN: (18:33) And our next question is from Thijs.  If a video is indexed in the video indexing report, is it still worth adding the video structured data on that page and why?

  • A: (18:47) Well, yes. Just because something’s indexed doesn’t mean that there’s not an opportunity to improve how it appears. Structured data helps Google understand more about your video, like what it’s about, the title, interaction statistics, and that kind of stuff. And adding structured data can make your videos eligible for other video features, like key moments. So it’s not just, oh, get your video indexed, and that’s it. There are other things that you can do to improve how your content appears on Google.
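
As a rough sketch of what that markup can carry, a hypothetical VideoObject might look like the following; Google’s video structured data documentation lists which properties are required versus recommended.

```typescript
// Hypothetical VideoObject structured data; the video and URLs are made up.
const videoStructuredData = {
  "@context": "https://schema.org",
  "@type": "VideoObject",
  name: "How to repot a monstera",
  description: "Step-by-step repotting guide.",
  thumbnailUrl: ["https://www.example.com/thumbs/repot.jpg"],
  uploadDate: "2022-09-01",
  duration: "PT4M30S",
  contentUrl: "https://www.example.com/videos/repot.mp4",
};

// Embed it in the page template as a JSON-LD script block:
const jsonLdBlock =
  `<script type="application/ld+json">${JSON.stringify(videoStructuredData)}</script>`;
console.log(jsonLdBlock); // paste into the <head> of the video page
```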

Can I cloak a list with lots of products to Googlebot and show users a Load More button?

MARTIN SPLITT: (19:20) Tamás asks, can I cloak a list with lots of products to Googlebot and show users a Load More button?

  • A: (19:26) I think this is not cloaking, as what users see when they click on the Search result roughly matches what Googlebot sees. And if you have a Load More button, users will click that if they don’t see the product they are expecting there. So I don’t think this is cloaking, and that’s a solution that I think works from a crawling point of view.

Why are customer reviews important for eCommerce, and how can they be managed efficiently?

There is a general understanding among eCommerce business owners that reviews are important. However, the true significance of client reviews frequently eludes them due to priorities that, at first glance, seem more essential in overall business processes. In a year-on-year comparison, more consumers read online reviews while searching for products and services. According to the 2022 State of Reviews report from LION’s partner REVIEWS.io, 94% of users say that reviews left by previous customers influence their purchase decisions. Moreover, 62% of respondents say that reviews significantly impact them, and only 6% report no impact.

Where to find eCommerce business, product and service reviews?

Although eCommerce clients use multiple channels on the internet, three touchpoints are the most influential for review management:

  • Google. Google remains the first place 75% of consumers go to search for new businesses. Google Seller Ratings and Google My Business help to build trust at the first point of contact for paid and organic channels. Google Seller Ratings help improve the performance of Google paid marketing by increasing an ad’s click-through rate, thus lowering the cost-per-click (CPC) of SEM. Meanwhile, Google My Business and, when properly integrated through data markup, Google 5-star ratings for individual products and services help businesses stand out in organic SERP results and capture top-of-funnel traffic before competitors (see the markup sketch after this list).
  • Social Media. Only 36% of customers go on social media to directly search for products and services, less than half the share that Google commands. Nevertheless, the speed at which information spreads and the amount of time users dedicate to these platforms every day make social media one of the most critical sources of client reviews.
  • Review sites and marketplaces. For some businesses, Yelp and other specialised review sites, as well as Amazon and similar marketplaces, can be the core source of customer reviews and can’t be neglected.
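
As referenced in the Google point above, the star ratings that appear in organic results depend on review markup on the page. Below is a hedged sketch of what that JSON-LD can look like for a single, hypothetical product; the exact requirements are in Google’s review snippet documentation.

```typescript
// Hypothetical product review markup; names and numbers are illustrative only.
const reviewMarkup = {
  "@context": "https://schema.org",
  "@type": "Product",
  name: "Blue widget",
  aggregateRating: {
    "@type": "AggregateRating",
    ratingValue: 4.6,  // average star rating shown in the snippet
    reviewCount: 182,  // should reflect real, verifiable reviews
  },
  review: [
    {
      "@type": "Review",
      author: { "@type": "Person", name: "A. Customer" },
      reviewRating: { "@type": "Rating", ratingValue: 5 },
      reviewBody: "Arrived quickly and works as described.",
    },
  ],
};

// Embed the serialised object in a <script type="application/ld+json"> block
// on the product or service page.
console.log(JSON.stringify(reviewMarkup, null, 2));
```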

Trends and eCommerce customer reviews management

New requirements for trust

The preceding years of fake review generation have finally borne fruit: eCommerce prospects now frequently question whether companies fake their reviews and are sceptical of a perfect picture of 5-star ratings. They inspect reviews’ relevance, authenticity, recency and consistency through a critical lens; 81% of respondents claim that reviews should be recent and contain relevant information to have significant influence. Hence, customers expect more balanced ratings and quality reviews from verified sources with factual insight into a business, its products and services.

Average rating matters

Even if it is only part of the bigger picture, 68% of respondents say that before engaging with a business, they prefer to look for a company with an average star rating of at least 4. In contrast, only 3% of consumers will approach companies with 1- or 2-star average ratings. At the same time, 68% at least somewhat agree that a high rating can only be trusted if a significant number of reviews support it.

Reviews before price

With the massive spread of online shopping, and of eCommerce businesses responding to that demand, shoppers’ behaviour is also becoming more sophisticated. In fact, reviews have become the most influential aspect of the decision-making process when choosing an online store, cited by 40% of respondents, overtaking even price at 27%, delivery time at 20% and free returns at 13%. Interpreting the numbers, this is an opportunity for retailers with higher prices to sell more than those with lower prices for the same goods, simply by having better reviews.

Fewer purchases proceed solely on the basis of the product messaging a company pushes through its own marketing channels; more people instead rely on the experience of others. Reviews increase the probability of unknown brands being discovered by customers and competing with top brands in their categories. At the same time, the competitiveness of the eCommerce market means shoppers no longer have to put up with poor customer experiences, which amplifies the importance of client reviews.

Company’s response to feedback

If there is anything in eCommerce client feedback management as important as past client experiences wrapped into words and images, it is the company’s response, especially the response to negative feedback: the question “Do you read replies to negative reviews?” received “Yes” as the answer from 90% of the eCommerce users that were approached. Most merchants seem to understand the importance of feedback, and 62% claim that they respond to all or most of the reviews they receive, in contrast to 15% who say they never or rarely respond to online reviews.

Negative reviews first

Research shows that the first thing e-shoppers now do when studying reviews is apply a 1-star filter to check possible cons and evaluate the risks. Compared with the past, when an unsatisfied customer could typically only influence people from their inner circle, a negative eCommerce review placed immediately alongside the goods and services descriptions can abruptly change the intention of any user who lands on the page. Thus, responding to negative customer feedback promptly and adequately increases the positive impact on the client’s decision-making process even more.

Review collection strategy

By nature, people are most eager to share their opinions at the extremes of perception, when an experience exceeds or falls below expectations. Therefore, an average customer with an intermediate level of satisfaction with the product or service is usually not inclined to leave a review without encouragement. For instance, over half of respondents admit to leaving online reviews four times a year or less, and 26% have never left a review at all. At the same time, only 5% of consumers say they never leave reviews based on a positive experience. Thus, eCommerce businesses should focus on an effective review collection strategy that includes a 360-degree view and engagement motivation at the final customer journey stages.

Review collecting systems

According to 81% of businesses that participated in the study, review collection systems provide a profitable return on investment.

REVIEWS.io provides tools for collecting and managing company and product reviews, user-generated content and other reputation management technologies. The system integrates with all popular eCommerce solutions, including Shopify, Google, WooCommerce, Klaviyo, Magento and many more. REVIEWS.io is trusted by more than 8,200 brands, such as Cake Vaay, BoxRaw and Bloom & Wild, helping businesses to grow through customer trust & advocacy.

Article by

Asselya Sekerova – Marketing & Project Director

6 Tips for Google Merchant Center

Introduction

ALAN KENT: (00:07) Google Merchant Center is a great way to share data about your eCommerce business with Google. Hi. My name is Alan Kent, and I’m a developer advocate at Google. In this episode, I’m going to share six tips on how to get the most out of Merchant Center for your presence in search results. The most common use for Merchant Center is to upload product data via structured feeds. Because feeds are designed to be read by computers, data is extracted more reliably than Googlebot crawling your site and extracting data from web page markup. If you’re familiar with structured data, you may wonder whether to embed structured data in web pages or provide a feed to the Merchant Center. Google’s recommendation is to do both. Google may cross-check feed data against your website. So product-structured data in web pages is still recommended even if you also provide Merchant Center feeds. If you have physical stores, you can also share inventory location data with Google. This can then be used by Google when answering queries for products near me.

Tip 1. Ensure products are indexed

(01:50) The Googlebot web crawler attempts to locate all products on your site by following links between pages. Googlebot, however, may miss pages in some circumstances. For example, you may have some products only reachable from on-site search results. Google typically does not enter search terms into the on-site search box to discover new pages. If you have a product page and are unsure if it is indexed, you can use the URL Inspection tool. This will report what Google Search knows about your page. You can also use a site: query for that specific URL. In a previous episode, I described creating a sitemap file to list the important pages to index on your site. The sitemap file is used by the Googlebot crawler to find pages on your site without relying solely on links between pages. But there is another way. Creating a Merchant Center product feed will help Google discover all the product pages on your website. These product page URLs are shared with the Googlebot crawler to potentially use as starting points for crawls of additional pages. It is, however, important to note that this and some other Merchant Center features are not available in all countries. Please refer to the Merchant Center Help Center for an up-to-date list of the countries where features are available.
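
For reference, the sitemap file mentioned here is simply an XML list of the URLs you want crawled. A minimal TypeScript sketch that generates one (the URLs are hypothetical) might look like this:

```typescript
// Minimal sketch: generate a sitemap.xml listing product page URLs.
// URLs are hypothetical; real sitemaps are capped at 50,000 URLs / 50 MB each.
const productUrls = [
  "https://www.example.com/products/blue-widget",
  "https://www.example.com/products/red-widget",
];

const sitemapXml =
  `<?xml version="1.0" encoding="UTF-8"?>\n` +
  `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
  productUrls.map((url) => `  <url><loc>${url}</loc></url>\n`).join("") +
  `</urlset>\n`;

console.log(sitemapXml); // serve this at https://www.example.com/sitemap.xml
```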

Tip 2. Check your prices are correct in the Search results

(03:26) The second tip is to check the accuracy of the product pricing data used by Google. If Google incorrectly extracts pricing data from your product pages, it may show your original price instead of your discounted price in search results. To check whether Google is extracting price data accurately, quickly test a sample of results: search using a site: query for your product page to return the web page as a search result, and check the price displayed if rich results are shown. To accurately provide product information, such as list price, discounts, and net price, it is recommended to add structured data to your web pages and provide Merchant Center with structured feeds of your product data. This will help Google correctly interpret the pricing shown on product pages.
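
As a hedged illustration of the on-page half of that recommendation, product structured data exposing the current price might look like the sketch below (the product, URLs and values are made up); the Merchant Center feed should carry the same price.

```typescript
// Hypothetical product markup exposing the current (discounted) price.
const productMarkup = {
  "@context": "https://schema.org",
  "@type": "Product",
  name: "Blue widget",
  sku: "BW-001",
  image: ["https://www.example.com/img/blue-widget.jpg"],
  offers: {
    "@type": "Offer",
    url: "https://www.example.com/products/blue-widget",
    price: 79.0,                 // the net/sale price, not the crossed-out list price
    priceCurrency: "AUD",
    availability: "https://schema.org/InStock",
  },
};

// Keep this in sync with the price sent in the Merchant Center feed,
// otherwise the item can be flagged for a price mismatch.
const jsonLd = `<script type="application/ld+json">${JSON.stringify(productMarkup)}</script>`;
console.log(jsonLd);
```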

Tip 3. Minimise price and availability lag

(04:24) Tip number 3 is to minimise inconsistencies in pricing and availability data between your website and Google’s understanding of your site due to timing lags. For example, Google crawls web pages on your site according to its own schedule. Changes on your site may not be noticed until the next Googlebot crawl. On the other hand, Merchant Center can be updated on a more consistent schedule, such as once a day or even once an hour. These delays can result in Merchant Center and search indexes lagging behind site changes, such as when a product goes out of stock. I described how to check Google’s understanding of your pricing data in the previous tip using a site: query. In addition, Merchant Center may identify products whose pricing data differs from your website due to delays in processing. This can negatively impact your products’ search results until the discrepancy is resolved. Merchant Center also allows you to download all pricing data in bulk if you want to do a more exhaustive reconciliation of pricing data in Merchant Center against your website. To reduce lag, you can request Merchant Center to process your feeds more frequently. This can reduce the time lag between the product data changing on your website and Google becoming aware of it. Another approach is to enable automated item updates in Merchant Center. This causes Merchant Center to automatically update collected pricing and stock-level data based on web page contents when discrepancies are detected. This is based on the assumption that your website updates in real time when pricing or availability changes.
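
A bulk reconciliation like the one described can be as simple as diffing the downloaded feed prices against what the site currently serves. A rough TypeScript sketch, with assumed data shapes rather than any Merchant Center API:

```typescript
// Rough sketch: compare prices from a Merchant Center export against the website.
// Both maps are assumed to be built elsewhere (feed download + site crawl/DB query).
type PriceMap = Map<string, number>; // offer id -> price

function findPriceMismatches(feedPrices: PriceMap, sitePrices: PriceMap): string[] {
  const mismatched: string[] = [];
  for (const [id, feedPrice] of feedPrices) {
    const sitePrice = sitePrices.get(id);
    if (sitePrice === undefined || Math.abs(sitePrice - feedPrice) > 0.001) {
      mismatched.push(id); // candidate for a feed refresh or automated item updates
    }
  }
  return mismatched;
}

// Illustrative data only.
const feed: PriceMap = new Map([["BW-001", 79], ["RW-002", 45]]);
const site: PriceMap = new Map([["BW-001", 69], ["RW-002", 45]]); // BW-001 was discounted

console.log(findPriceMismatches(feed, site)); // ["BW-001"]
```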

Tip 4. Ensure your products are eligible for rich product results

(06:18) Tip number 4 is to check that your products are getting rich results treatment in search results. Rich results are displayed at Google’s discretion but rely on Google having rich product data. To check if your product pages are receiving the rich results presentation treatment, you can use a site: query to search for a specific web page. If not found, the page may not be indexed. You can also use the Google Search URL Inspection tool to verify whether Google is indexing your product page. To get the special rich product presentation format, it is recommended to provide structured data in your product pages and a product feed to Merchant Center. This will help ensure that Google correctly understands how to extract the product data needed for rich product results from your product pages. Also, check for error messages in Google Search Console and Merchant Center.

Tip 5. Share your product inventory data

(07:18) Tip number 5 is to ensure, if you have physical stores, that your products are being found when users add phrases such as “near me” to the queries. To test if locality data is being processed correctly, you may need to be physically near one of your physical stores and then search for your product with “near me”, or similar added. Register your physical store locations in your Google Business Profile, and then provide a local inventory feed to Merchant Center. The local inventory feed includes product identifiers and store codes, so Google knows where your inventory is physically located. You might also like to check out Pointy from Google. Pointy is a device that connects to your in-store point of sale system and automatically informs Google of inventory data from your physical store.
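
As a rough sketch of what a local inventory feed contains, the snippet below builds a simple tab-separated feed in TypeScript; the store codes and IDs are hypothetical and the attribute set is abbreviated, so confirm the exact required attributes in the Merchant Center Help Center.

```typescript
// Illustrative local inventory rows; store codes and IDs are hypothetical, and the
// attribute set is abbreviated. Confirm the required columns in the Merchant Center help.
interface LocalInventoryRow {
  store_code: string;   // matches the store code in your Business Profile
  id: string;           // same offer id as in the primary product feed
  price: string;
  availability: "in_stock" | "out_of_stock" | "limited_availability";
  quantity: number;
}

const rows: LocalInventoryRow[] = [
  { store_code: "SYD-01", id: "BW-001", price: "79.00 AUD", availability: "in_stock", quantity: 12 },
  { store_code: "MEL-02", id: "BW-001", price: "79.00 AUD", availability: "out_of_stock", quantity: 0 },
];

const header = ["store_code", "id", "price", "availability", "quantity"].join("\t");
const feedTsv = [
  header,
  ...rows.map((r) => [r.store_code, r.id, r.price, r.availability, String(r.quantity)].join("\t")),
].join("\n");

console.log(feedTsv); // register and upload as a local inventory feed in Merchant Center
```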

Tip 6. Sign up for the Shopping tab

(08:15) The final tip is related to the Shopping tab. You may find your products are available in search results but do not appear on the Shopping tab. To see if your products are present, the easiest way is to go to the Shopping tab and search for them. To be eligible for the Shopping tab, provide product data feeds via Merchant Center and opt in to Surfaces Across Google. Structured data on product pages alone is not sufficient to be included in the Shopping tab search results.

Conclusion

(08:45) This is the final episode in the current series on improving the presence of your eCommerce website in search results. If you have topics you would like to see included in a future series, please leave a comment. If you have found the series useful and want to see more similar content, make sure to like and subscribe. Google Search Central publishes new content every week. Until next time, take care.

Is your business suffering from the September slump?

Seasonality is an unavoidable challenge for any business. Whether your business sells products geared toward Winter or Summer pursuits, or a product that is in demand all year round, we all have to weather ups and downs throughout the calendar year.

The majority of ecommerce clients’ peak season is unsurprisingly Oct-Dec, with Black Friday, Cyber Monday and the holiday gift-giving period boosting conversion rates and driving up revenue. However, this often means dealing with a much softer market in the month of September. The IMRG Online Retail Index noted a 12.5% drop in online sales YoY in 2021, and we can see a similar trend across the majority of clients this year.

On average, across our accounts, we can see conversion rates dropping by a full percentage point or more compared to August, which has affected performance across a wide range of industries; however, CTRs are up by 25% on average, suggesting that consumers are in a stage of “browsing, not buying”.

Considering the current economic climate, with consumers seeing a constant barrage of news around supply chain issues, rising inflation rates, and reports of an impending recession, this drop in performance is, of course, a concern to many. However, it’s not all bad news.

“Early data from Morning Consult, a global intelligence company, finds that people plan to spend about the same amount on gifts as they did last year.” Inflation rates and concerns around cost saving, however, mean they will be in the market for deals and discounts.

With this in mind, here are a few tips from the LION team on how to weather the storm and win in the holiday season:

  1. Capitalise on any low-cost traffic to the site now. Consumers who are visiting your site have put you in their consideration set and may come back to purchase in the following months. Invest in owned channels like SEO, Email and CRO, make the most of the visitors you already have and work out how you can expand this.
  2. Start planning for sales and promotions now, and talk to the team about the best way to market these. You might want to consider adding retargeting to your strategy to let people know about discounts or flesh out your email strategy to capture low-hanging fruit. Think creatively about how you will stand out from the crowd during Black Friday and other upcoming holiday sales periods.
  3. Consider your ROAS thresholds carefully. While we don’t recommend going dark during this time, don’t spend at the cost of margin to the business when the money can be better used later in the year.
  4. Leverage new formats like YouTube shopping and awareness channels to bring new customers to the brand.

Reach out to the team at LION for advice and strategy tips that are personalised to your business.

Article by

Leonidas Comino – Founder & CEO

Leo is a Deloitte award-winning and Forbes-published digital business builder with over a decade of success in the industry working with market-leading brands.

SEOs & web developers: Why we need to talk about audits

Do devs listen to SEOs’ recommendations?

(1:13) MARTIN: So, Bartosz, there is a thing that I want to talk to you about, and that is that I do hear, and I do observe that SEOs are often struggling actually to get developers to do stuff. Is that an experience that you share? It’s like you make recommendations, you give them input, and then it just doesn’t happen? Or is that not something that you would say is a particular problem?

BARTOSZ: I think that you might have touched on a very sensitive topic in the industry, and I know where you’re coming from with that. So, in general, this is a problem. Some agencies solved this problem and kind of went past it. We cannot complain at all about our relationship with devs. At the same time, there are so many ways that things are done in the web development and the SEO space that don’t help.

Why we shouldn’t throw PDF reports over the fence

(2:10) BARTOSZ: For example, PDF audits are one of the things, just to name, I think, the elephant in the room. So if you’re going to create a PDF audit explaining how to fix Core Web Vitals, not knowing their stack, not knowing their technology, not knowing their resources. How many devs are there in the team? Is it a Shopify-based platform, or is it a custom CMS? And in our experience, when you create a PDF audit, and the dev is going to run into a problem, they’ll skip that because there’s no fallback into what has to be done. So this can be unpacked in so many ways.

MARTIN: But I know exactly what you mean. So I come from a developer background, and I have worked with SEOs on both sides, both as a Google developer advocate, basically helping SEOs to do the right things and identify problems and solve problems, and also from the perspective of the developer working in the team. And the thing with the PDF report really struck a chord with me. Because I remember being a developer. I had so many different things on my plate already. And then, out of the blue, in the middle of a sprint, someone from the SEO department descended upon me and said, “Martin, here is a PDF with all the things that are wrong.” Bye. And then they just flew off. And I’m like, uh, OK. It says we are using JavaScript, which is very accurate because what we are actually building right now is a VR application that runs on your phone in the browser. You have to use JavaScript to do that. And the recommendation is to not use JavaScript? So that’s not really a thing we can do because then we don’t have VR capabilities. And because that’s our product, we kind of have to build our product with the technologies that enable the features of our product. So a lot of these things are so unhelpful and so not reflective of the circumstances in which I, as a developer, work.

Why do SEOs advise against JavaScript?

(4:21) BARTOSZ: So you work for Google. So I can ask you before we get into how to solve this problem, let me ask you a question. Is Google OK with JavaScript?

MARTIN: Yeah. We are OK with JavaScript.

BARTOSZ: So if I have a news website that’s 100% CSR, you’re going to be OK with that?

MARTIN: We’re going to be OK with it. It might sometimes take a little longer than you would like because we have to actually render everything. And if what you are rendering is specifically badly designed, then that might take us a while. But in general, if you are doing things right, following the best practices and making sure that you test your setup, we would be fine with that, yeah.

BARTOSZ: So I don’t want to argue with that statement. Obviously, this is not that kind of video. But what I’m trying to say is there are so many complexities on all these fronts. So we have clients coming in with a news website that’s 100% JavaScript. And there is this kind of demon in the industry where all the SEOs would say that JavaScript is evil and that JavaScript is so bad. But on the other side of things, there is Google saying we’re OK with JavaScript. And then there is this kind of reality where there are a lot of websites packed with JavaScript, and Google is not picking them up properly. They’re not getting rendered, with all these other problems. So then we hop on a call with the dev team of our client. And they’re like, but Martin said that JavaScript is OK. Why do you want us to do server-side rendering? How do you answer? You know what I mean. This makes us look like SEO wizards. So these are the problems that require a lot of knowledge. We don’t struggle with them as much anymore because we have so much research. We have our experiments, whatever. But if you think about getting a new SEO to a level where he or she can handle those questions, it is going to take years. And for us, that slows down growth. And at the same time, if we look at the whole internet, asking all of the SEO agencies to be as advanced as we are, when we only do technical SEO and we specialize in JavaScript SEO, is difficult. So this is a crack, and I don’t have any solutions, and I’m not blaming Google or anything like that. I’m just saying that this is a change that’s happening, but it requires time. So there are some, maybe more than some, moments when it basically requires goodwill from both ends. So if devs want to understand it, and we want to explain it, we’ll make it happen.

MARTIN: I think there are lots of touchpoints where you can actually create this understanding and this cooperation. Because, as you said, there are lots of complexities and lots of background and lots of considerations. So if you are asking me, and this is also tricky for us Googlers, because if you’re asking me a question like is JavaScript OK? Then, in general, yes. Is it the best idea? No, not necessarily. If you can do it without JavaScript, do it without JavaScript. Server-side rendering is a recommendation that we give out as well. We need to make that more prominent in our docs. I’ve taken that point. But I really like the point where you said like, oh, so SEOs have this challenge that when you get a new SEO to join your team, they need to spend so much time on actually getting the knowledge that they need to work. Developers have the exact same challenge. Because the entire industry, the entire ecosystem, just keeps moving and keeps changing. So someone who becomes a developer today sees everyone else working with so many tools and so many things. And there is a tendency to skip the understanding. Because most developers who have been around a couple of years have started with some tool, learned the things that this tool does well and the things that this tool does not do so well. And then they might be like, oh, you’re building a news website. I think in that case, with the interactivity that you described to me, we might be able to just do this better with server-side rendering. Whereas, if you want a highly engaging social network, you might actually want to use client-side rendering for all the interactivity that is embedded, and that is not necessarily impairing your performance in search. So they learn these tools, and they learn the trade-offs, and then they make better decisions as they grow. But then people come in and might skip the entire learning process and go like, oh, everyone uses this framework, so I’ll build everything in this framework. Because if everyone uses it, it must be fantastic, without understanding the decision-making process behind it. And I think the problem is then exacerbated if an SEO who cargo-cults recommendations that they read or heard somewhere, without understanding the background and the complexities they encompass, hands them to a developer who does the same thing; then there must be a clash. Because now they are running into territory where they think they know what they are doing when they actually don’t really know what they are doing. Would you say that might be a challenge that we are seeing playing out?

Complexities and differences between technical SEO and content marketing

(10:25) BARTOSZ: So let me unpack that one by one. There are a few statements within that. So, for example, the way you described the news website: you and I have known each other for a few years, so I know you’re not going to take that the wrong way, but most of our clients wouldn’t understand what you said. So if you’re talking to the key stakeholders, I’m assuming maybe not the CEO, but you know, the CMO, someone along those lines who’s making that decision, you will have this conversation. This is one of the problems we had back in the days when you would hop on a call with five people from our client’s company. And we would start talking to the dev, and you would lose everyone else. So simplifying that as much as possible just has to happen, so then everyone is kind of included in this conversation. But secondly, what you mentioned about dev teams is that this is so dynamic. This is also what SEO looks like. Maybe some SEO agencies didn’t realize it. I don’t think too many. So if someone is coming to us with a question, can you do JavaScript SEO, web performance, and a little bit of content marketing? This is extremely difficult to pack into one agency and do all these things well. So I think that we slowly need to normalize using a technical SEO agency for one thing, using a content marketing agency for something else, and just trying to branch out so then everyone understands their goals. And then onboarding that one person is easier because that junior SEO, she only has to learn, like, JavaScript SEO, web performance, whatever, crawl budget, and understand those technical aspects. At the same time, some agencies want those people to also do link-building and all these other aspects. So just like with devs, it got so complex. Sometimes I’m looking at job offers for devs, and I’m like, what does it even mean?

MARTIN: Basically, one job offer is an entire IT department. I love it.

BARTOSZ: Basically, you need to divide organizations into those that are aware in the web space and those that are not aware. So if someone is aware of how SEO works and what the difference between CSR and SSR looks like, and I’m assuming a lot of even high-level people in some organizations are; for example, Germany is pushing a lot of people in management positions to know a lot about development, which I love. Talking to companies from Germany, most of the time, they’re just so aware of that. Some other companies would come to you and say, so if we’re going to fix this problem with rendering or with web performance, how much traffic can we expect and when? And that’s the main topic of discussion. And this is something I have to answer daily, two-three times a day. And this usually shows me that there are so many ways to answer this question. Like, I never do it the same way. But anyhow, this shows me that maybe they need a little bit more help understanding what has to be done and why it’s done. Sometimes it’s just beyond our scope of work, let’s call it that, and we cannot push them.

Web performance metrics and reaching stakeholders needs and wants

(13:35) BARTOSZ: So now that we have that, let’s assume we have someone that’s aware of or willing to learn about why we’re doing that, about– that “why” is kind of important here. Because if they only do that for traffic, and that’s the only KPI they look at, it’s very difficult sometimes not to skip a lot of important metrics. If you look at that, you can have a ton of traffic, and this is still going to be a terrible website in theory.

MARTIN: So are you saying that sometimes you literally have to rethink an organization’s KPIs?

BARTOSZ: Yeah. Very often, we would have just to give you an example. We have a call with a massive company, and they would be asking us what do we have to do to rank for the term “houses”? And just this question lets you unpack so many problems with the whole organization. And then we usually don’t want to offend anyone. You don’t want to get their ego involved in that. But at the same time, you want to explain that. And so that’s one. Let’s assume we have the stakeholders sold on the idea of what we want to do. They understand that. That’s amazing. That’s usually when things start to go well.

MARTIN: That is great. Because that also unlocks the possibility to basically have them on board with whatever the dev team will be doing about it. These things have basically been invisible to me. But as a developer, I just noticed that the stakeholders, the key stakeholders in the company that have an influence on the dev team, told me to do one thing, and then SEOs or marketing told me to do another thing. And then usually I picked the organization goals because that’s what I was measured by. So you are saying by bringing in the key stakeholders and making sure that they understand what they need to look for and adjusting their KPIs, you unlock the key to actually getting the development team on board with what you are trying to accomplish?

BARTOSZ: Yes and no. So usually, we don’t really– like as a technical SEO agency, we don’t struggle that much to get devs on board. This is not that big of a deal with us. The problem is for the stakeholders to understand what we want to do with them. Because sometimes, it’s like this is this technical SEO agency, and this is our dev team. Let them have fun with this project. And like almost literally, that’s how that might look in some cases. And this is usually a problem. But then if we know, OK, we want to get here. And this is the umbrella term. We want to have amazing web performance. We want to get rendered, indexed quickly, whatever. And they know that. I don’t even imagine how this could go wrong because the whole organization is just growing in one direction. And this is our Holy Grail, and this is happening very, very often.

MARTIN: How do you get that to happen? Because I have been in so many organizations where that did not happen.

BARTOSZ: This is a very simple answer. We did it wrong so many times. We tried for years. When we started back in, I think, 2013 or ’14-ish, my team and I wanted to focus only on the technical aspects. People would make fun of us. They would be like, OK, there are white hat SEOs, and there are people who have traffic, and all of these kinds of amazing jokes coming our way. I could create a stand-up show for you with just the feedback we would get from the SEO community back in the day for moving to the technical side of things only.

MARTIN: Bonus episode right there.

Meeting with stakeholders, finding problems, and SEOs listening to devs to find solutions

(17:22) BARTOSZ: This was very weird for a second. Usually, we start with stakeholders. I’m going to condense this really quickly. We talk to the stakeholders. We hop on a call before creating any offers or anything, and we talk about: what’s the KPI? What’s the problem? What are the challenges? Why are we even doing this? Why is it so important? Why do you want to fix that? Because if traffic is the only metric, we will still work with them, but we know how that might go. So we’re going to start with that. Then after the call, we look into their website and create a statement of work. We tell them, OK, this is what we’re going to do. This is the list of problems we’re seeing with your website. This is how we want to fix it and prioritize it. So in the first month, we solve all of the most terrible aspects, like 404s or, I don’t know, 10 seconds to load a page, whatever. And with that, it’s extremely transparent, because we tell them, OK, this project is going to take four months. And now, spoiler alert: we’re going to hop into a PM tool, like Jira or Trello, with your dev team, and we’re going to make it happen.

MARTIN: OK. So you meet the dev team where they are anyway.

BARTOSZ: And we adjust to whatever solution they go with. So if they work in sprints, we try to join that. We had to kind of morph into a team that joins them without any interruptions. This is the only way. We are aware that, in a medium or large company, the dev team is seriously the most important part of the business. So then we tell them, OK, this has to be fixed. But we have to understand their tech stack. We have to understand all these boring aspects (boring, but we loved it). Usually, during that call with stakeholders, they don’t really want to talk about it, or they don’t know. Very often, stakeholders have no idea what kind of tech stack they are running.

MARTIN: To be fair, they should not have to, right? That’s something unless it’s like the CTO. I don’t think the CEO needs to know which tech stack they’re on as long as they know what their core business is, how it works, and what’s the vision? What’s the mission of the organization? I think that’s exactly what you have a development team for, to define these things based on requirements coming from elsewhere.

BARTOSZ: One more conversation that we have to schedule. But let’s move forward with that. Then we hop into Jira, Trello, whatever. We give them tasks. And they come back to us, like, we cannot really do that, we have a custom solution around this one that doesn’t allow it. So then, again, we have a team that’s extremely technical. This is something we have been building for a few years. And they hop on a call. Sometimes, when the devs really don’t get it, and this is an edge case, we write a snippet of code just to show them how to optimize the CSS or whatever. But most of the time, we just go and talk to them. And devs, in my opinion, are very hardworking people. And they would tell us, we cannot do this, we have so many limitations. And we try to work within those limitations. If we hit a wall, we go back to the stakeholders and say, maybe we could try this. And devs appreciate that, because it’s not only them coming to stakeholders for budget, for more resources, whatever. We’re going to come in and say, OK, guys, this is going to be difficult with so many places where you’ve cut corners.

MARTIN: So you would say that you would also have to somehow support developers to get the right resources and to get the right environment to work in sometimes? That’s interesting.

BARTOSZ: Sometimes. Sometimes. This is going to sound funny, but sometimes we have clients who have 50 devs, and not in a country where that would be cheaper, but 50 devs in a very high-earning city somewhere in the world. And they would listen to us rather than to them, because they are like, oh, we’re paying your invoice, we want to get the most bang for our buck. Because of that, and they would tell us this openly, they really want to move the project forward, and that’s why they would change things around in the dev team. And I guess you know this better than anyone: sometimes when you work in a team, you come to your manager every day and say, OK, this is a problem. They won’t listen to you until someone from the outside comes in and says, like, dude.

MARTIN: I’ve been there. I’ve done that. I’ve been on both sides of this. I’ve been the consultant that came in and basically just sat down with the development team, listened to them for a day, and then presented what I heard to the stakeholders. And they’re like, oh, these are really valuable insights. And I always thought, I’m billing you for this, but you could have just listened to your developers.

BARTOSZ: Exactly. And I guess every single dev listening to or watching this video series right now is going to have a story of this kind.

MARTIN: I’ve been on the developer side of that, too. It was like, hey, we need to do this. Oh, I don’t think so. And then I was like, OK. And then the consultant came in and said the same thing.

BARTOSZ: And also, just to elaborate on this story, usually once a quarter we have someone reaching out to us, usually dev teams we know, saying, “Partners, we need an audit around Core Web Vitals, but don’t go too deep. We just want them to hear what we told them from someone else.” And they’re willing to spend the budget just to have a backup document that says, we need to fix this.

Always ask questions

(23:36) MARTIN: But that is so smart. I really like that. Because so many developers are like, ugh, I don’t want to work with these people because they just tell them what we told them already, and they charge them for it. Why don’t developers more often leverage these external voices like they do in your case, where it’s like, Bartosz, I need this audit. I know the result of the audit, but please tell them what we already told them because they don’t listen. That is smart. I like that.

BARTOSZ: If there’s any SEO agency listening to this, or any SEO frustrated with dev teams: 100% honest, we don’t struggle with devs. Talk to them openly, speak their language. You might have to get a little bit technical, or maybe just have one or two people on your team who can get the message across.

MARTIN: Oh, and just to add on to your last point, and that goes to all the SEOs out there listening who are struggling with developers: lose your ego. It is not a problem to ask us developers questions. If you think that you need to know everything, no, you don’t. You’re not a developer. It’s OK. Developers don’t know about SEO. You don’t have to know everything about development. So if you don’t understand why they can’t do what you ask them to do, remember that developers are intrinsically motivated to solve problems. That’s what they love. That’s what they want to do. So if you give them something to solve, they’ll be excited to solve it. And then they hit the limitations, the limits of what they can do in the tech stack and the environment that they are in. And then they tell you, I can’t do that because of XYZ. If you don’t understand XYZ, that is OK. Ask clarifying questions until you get there because, and I think, Bartosz, you said that very nicely, you may have to simplify the message that comes from the developers so that other people who are not developers in the organization you work with understand why it doesn’t work.

BARTOSZ: Let me just build on that. And this is something that all the SEOs are going to love. We had this vision years back that we had to learn all of the frameworks, that we have to know front end and back end and whatever. It was so stressful. I was trying to know it all. But this is something to leave to the devs. What we have to do as technical SEOs is have an in-depth, massive understanding of how rendering works, how a browser works, how Google renders and what it renders, the rules around that. Chrome DevTools has to be your go-to place. You need to understand what’s happening, and once you understand that, you add a little bit of documentation from different frameworks, from different technologies, but don’t learn how to write all of this code. Obviously, basics are OK, but just basics. This is what they pay you to do. They don’t want you to know what they know. And this is complex enough: understanding how a browser works step by step is something you can do for the rest of your life and never run out of things to learn. I just wanted to add, we were talking about ego from the SEO standpoint. If you own the company, or if you’re on a dev team, go through a lot of agencies. Talk to them. Hop on a call with them. Ask them questions. See if you understand what they’re trying to do. If you’re talking to an agency that will tell you nothing about the scope of work, what they’re going to do and how they’re going to do it, that would raise a red flag. If you take your car to be fixed and they won’t tell you what they did, I would be worried about driving that car. So basically, go through that. Talk to as many people as you can, and as soon as you feel the vibe, “OK, these guys understand what we want to accomplish”, get the conversation going.

MARTIN: And the same the other way is absolutely true. Developers really don’t like it when someone comes like, hey, you need to do this. And then they ask you why? And then they don’t get a proper answer. You can do the same thing with developers. If you say, hey, I want you to implement the canonicals. And then they say, we can’t do that. Then ask them why. If they say our solution doesn’t allow it. Then it’s like, why does it not allow you, like, does it not allow you to add anything to the head? Oh, it does, but not the canonicals. Why not the canonical? It might be just a knowledge gap on their side, or it might be an actual hard technical limitation of the environment and the stack and the platform they’re working with. But they need to be able to explain this to you in simple terms. If they are like, oh, it’s algorithmically impossible, that’s just a developer’s way of saying, bugger off. Ask questions. Don’t think like, “oh, they said something that I don’t understand, so I must stop questioning here”. No. If they can’t express it in simple terms so that you understand what they mean, they haven’t done the work themselves either. So hold them accountable, but be ready to be held accountable, too.

BARTOSZ: Yeah. Just wanted to hop on that. Very, very often we don’t know the stack. We have five calls per day with five different stacks, some we’ve never heard about. So if we don’t know something, we’re very open about it. We had a WAMP PWA recently, and I was like, I’ve never heard of WAMP. We just went and read the documentation, came back, and scheduled a call. Now we can talk the same language. Again, this is an ego thing. You shouldn’t assume that you need to know everything. I’m very open about things I don’t know. There are tons of them.

MARTIN: I mean, even I was asked questions recently at a Core Web Vitals session that I didn’t know the answer to. I won’t give you a random non-answer or try to muddle my way through. I just say I don’t know, but I can check.

BARTOSZ: Yeah. And one thing that I feel is a deal-breaker between devs and SEOs a lot of the time, and we were guilty of it back in the day as well, is that devs will ask you a question like, why shouldn’t we just point canonicals to the page that’s most important, so we sculpt things and push the most important page with different canonicals, because a canonical is just like a link? Both SEOs and devs have all these ideas on how to cheat the search engine. As an SEO, you have to explain that step by step. But if we’re talking about SEOs and devs, we need to leave all of the conspiracy theories and urban legends behind. Just fall back on documentation, and that’s it. Because as soon as you open the door to “if we point some canonicals here, or if we do this, or if we do that, some things might happen, we heard about it, we tested it”, this is going to put you in the shoes of those snake oil salesmen. So be technical. If you’re talking to devs, be technical. Drop all of those. Even if you deeply believe it’s the case, I think it opens the door to exactly what we as SEOs want to run away from.

MARTIN: Another thing: if you are not very technical, that is perfectly OK. But then don’t try to find solutions, because you are in a territory where you are not necessarily experienced. Just present the problem and then work with the development team to solve it. Don’t try to come up with a solution yourself, because it’s likely that your solution will not work in the tech stack or the environment that your development team works in. And if you get attached to that solution, you’re like, but why can’t we just do it like this? It might not get through to the development team the right way, and it might feel like you are just obsessed with something rather than actually trying to solve the problem. And solving the problem is what development teams need to do. So you want to be on their side. It’s OK to go down the path together, to research options, to experiment with things together. But trying to hand development teams a turnkey solution usually backfires.

The start of technical SEO and web developers working together

(32:30) BARTOSZ: Just one thing that I hope is going to clear the air a little bit. Technical SEO is fairly new. It’s maybe three, four, or five years old. Obviously, some will argue with that, saying “I was doing technical SEO in 1993.” But it’s fairly new in the sense that it’s getting so popular, so needed, almost essential. So this is a brand new field. And if you look at it that way and drop all of the history, this is going to get really exciting. Because now, I would assume that devs and SEOs will only get closer over the next few years, because obviously we’re all seeing the need for technical SEO.

MARTIN: I think that would clear the air. And for all of those who are scared and confused now on the SEO side, don’t be. You get to choose if you want to get into technical SEO or not. Technical SEO is a field of its own. It is complex. It is big. It is broad. It is new. It is fresh. But I think content still is an important field, and all the other things, all the other aspects inside SEO, will not become irrelevant or anything. It will continue to be a broad field. You get to pick your niche. But if you want to go technical, do it right. I think that’s a very, very nice way of looking at it.

BARTOSZ: And go technical.

MARTIN: Go technical. Awesome. Thank you so much, Bartosz, for joining me in this conversation. I think this was really interesting and insightful. And I hope to see more from you guys and also to hear what the community is saying about these things as well looking forward. Stay safe, stay healthy, everybody.

Sign up for SEO & Web Developers today!

GET IN CONTACT TODAY AND LET OUR TEAM OF ECOMMERCE SPECIALISTS SET YOU ON THE ROAD TO ACHIEVING ELITE DIGITAL EXPERIENCES AND GROWTH 

What Happens When you Stop Doing SEO?

The 2020s are shaping up to be quite the outlier from decades past. In the last two years, we saw significant growth and focus on digital channels off the back of at-home pandemic buying; now with consumer confidence dipping, we’re seeing growth slow and marketers are increasingly focused on the effectiveness of their channel mix and may be considering where they can consolidate or reduce spend.

Business owners and marketers alike frequently wonder where SEO should sit – is it eCommerce, marketing or IT? Should it be a sustained marketing cost once we are happy with our visibility for core search terms? When is SEO’s job done?

SEO can sit anywhere in an organisation, but it makes the most sense to sit close to content, technical implementation and website changes, and new product launches. SEO is the art of being the least imperfect player in the search results so there is always work to do. Following core category content optimisation and technical audit fixes, research can uncover opportunities to develop content earlier in the journey to capture more users for paid search and email audiences to nurture them into being customers down the line.  In this way, the SEO job is never done and its absence in either activity or advisory can see good growth come undone. 

HERE ARE A FEW WATCH OUTS WORTH CONSIDERING IF SEO IS NEGLECTED:

  1. You may lose keyword growth momentum – Google values freshness and algorithm updates happen all the time. If you’re not pruning and cultivating a healthy website with fresh content, you may fall out of favour and see rankings you worked hard on decline.
  2. Competitors can outperform your website by continuing optimisation work – as we said before, SEO is about being the least imperfect, so if you’re not investing time and effort, you can expect competitors who are to overtake you.
  3. You may take a significant hit to your organic revenue – if you lose crucial Page 1 keywords to a competitor, their brand may be chosen over yours, and this can affect your bottom line, as organic commonly generates 35-60% of a company’s revenue.
  4. All websites aren’t created equal, and neither are their budgets – unlike paid search, it’s difficult to gauge how much your competitors are investing in SEO. Content and link velocity, alongside internal team growth, are a good way to compare yourself to your competitors. A good SEO partner should be able to provide you with this view and help you outsmart your competitors where you can’t outspend them.

Get in touch for an obligation-free discussion with our growth strategists to find out how we can make your company take the LION’s share of the market online. We have achieved great results in the form of visibility, visitation and revenue growth that you can find on the case studies section of our website. 

GET IN CONTACT TODAY AND LET OUR TEAM OF ECOMMERCE SPECIALISTS SET YOU ON THE ROAD TO ACHIEVING ELITE DIGITAL EXPERIENCES AND GROWTH

Contact Us

Article by

Leonidas Comino – Founder & CEO

Leo is a Deloitte award-winning and Forbes-published digital business builder with over a decade of success in the industry, working with market-leading brands.

Like what we do? Come work with us

Shopify Announces Launch of YouTube Shopping

Shopify announced the launch of YouTube shopping this week, outlining benefits including:

  • Customers can buy products right when they discover them
  • Instantly sync products to YouTube
  • Create authentic live shopping experiences
  • Sell on YouTube, manage on Shopify

What does this mean for our clients?

There are some eligibility restrictions for this product at the moment. You must already have 1000 subscribers to your YouTube channel and at least 4000 hours of annual watch time. This means as a brand, you will need to have an already well-established YouTube channel or look to start working with content creators who do.

Consider content creators who align with your brand or category and research their channels and content. There are specific websites and agencies that can help source content creators for a fee, including theright.fit and hypeauditor.com

YOUTUBE FOR ACTION WITH PRODUCT FEEDS

For clients who don’t meet the eligibility requirements, but still want to explore video for retail, there is another option. YouTube for action campaigns allow us to promote videos on the YouTube network, and attach a product feed through the merchant centre, creating a virtual shop front for the watcher, with an easy “shop now” functionality.

This powerful format allows brands to generate both awareness and engagement with their brand, whilst also driving bottom line sales. This can be managed through your Google Ads account allowing you to optimise towards the same conversions and use the same audience signals as your other Google campaigns.

What is YouTube for Action?

Previously named TrueView for Action, this product allows users to buy video ads on the YouTube network which are optimised towards a performance goal rather than pure reach or video views.

You can optimise towards:

  • Website traffic
  • Leads
  • Sales/Purchases

And you have the option to choose your bid strategy based on:

  • Cost per View
  • Cost per Action
  • Maximise Conversions
  • Cost per thousand impressions

Who can I target?

YouTube and Google’s shared data provide a wealth of information to help us build audience segments that will fit your brand and services. The options include but are not limited to:

  • Demographic targeting: Age, gender, location – based on signed-in user data
  • Affinity audiences: Pre-defined interest/hobby and behavioural data based on users’ browsing history
  • In-market audiences: Users deemed to be “in-market” for a product or service based on their search behaviour and browsing history
  • Life events: Based on what a user is actively researching or planning, e.g. graduation, retirement etc.
  • Topics: Align your content with video content on similar themes across the YouTube network
  • Placements: Align your content with specific YouTube channels, specific websites, or content on those channels/websites
  • Keywords: Similarly to search, build portfolios of keywords to target specific themes on YouTube

The team at LION will work with you to select and define the right audiences to test and optimise to get the best results.

What content should I use?

Like any piece of content, there is no right or wrong answer, and what works for some brands may not for others. Your video should align with your brand tone of voice and guidelines. 

Think about what action you want the users to take and ensure the video aligns with this, e.g. if you want users to buy a specific product, show the product in the video and talk about its benefits. Testing multiple types of video content is the best way to learn about what your potential customers like and do not like.

What do I need to get started?

  1. At least one video uploaded to YouTube (we recommend 30 seconds in length)
  2. A Google merchant centre account & Google Ads account
  3. A testing budget of at least $1,000

YOU CAN CHAT WITH THE TEAM AT LION DIGITAL AND WE CAN HELP YOU TO SELECT AND DEFINE THE RIGHT AUDIENCES TO TEST AND OPTIMISE TO GET THE BEST RESULTS

LION stands for Leaders In Our Niche. We pride ourselves on being true specialists in each eCommerce marketing channel. LION Digital has a team of certified experts and department heads with 10+ years of experience in eCommerce and SEM. We follow an ROI-focused approach in paid search, backed by seamless coordination and detailed reporting, helping our clients meet their goals.

GET IN CONTACT TODAY AND LET OUR TEAM OF ECOMMERCE SPECIALISTS SET YOU ON THE ROAD TO ACHIEVING ELITE DIGITAL EXPERIENCES AND GROWTH

Contact Us

Article by

Leonidas Comino – Founder & CEO

Leo is a Deloitte award-winning and Forbes-published digital business builder with over a decade of success in the industry, working with market-leading brands.

Like what we do? Come work with us

WEBMASTER HANGOUT – LIVE FROM JULY 01, 2022

Which number is correct, Page Speed Insights or Search Console?

Q: (00:30) Starting off, I have one topic that has come up repeatedly recently, and I thought I would try to answer it in the form of a question while we’re at it here. So, first of all, when I check my PageSpeed Insights score on my website, I see a simple number. Why doesn’t this match what I see in Search Console and the Core Web Vitals report? Which one of these numbers is correct?

  • (01:02) I think maybe, first of all, to get the obvious answer out of the door, there is no correct number when it comes to speed when it comes to an understanding of how your website is performing for your users. In PageSpeed Insights, by default, I believe we show a single number that is a score from 0 to 100, something like that, which is based on a number of assumptions where we assume that different things are a little bit faster or slower for users. And based on that, we calculate a score. In Search Console, we have the Core Web Vitals information based on three numbers: speed, responsiveness, and interactivity. And these numbers are slightly different because it’s three numbers, not just one. But, also, there’s a big difference in the way these numbers are determined. Namely, there’s a difference between so-called field data and lab data. Field data is what users see when they go to your website. And this is what we use in Google Search Console. That’s what we use for search, as well, whereas lab data is a theoretical view of your website, like where our systems have certain assumptions where they think, well, the average user is probably like this, using this kind of device, and with this kind of a connection, perhaps. And based on those assumptions, we will estimate what those numbers might be for an average user. And you can imagine those estimations will never be 100% correct. And similarly, the data that users have seen will change over time, as well, where some users might have a really fast connection or a fast device, and everything goes really fast on their website or when they visit your website, and others might not have that. And because of that, this variation can always result in different numbers. Our recommendation is generally to use the field data, the data you would see in Search Console, as a way of understanding what is kind of the current situation for our website, and then to use the lab data, namely, the individual tests that you can run yourself directly, to optimise your website and try to improve things. And when you are pretty happy with the lab data you’re getting with your new version of your website, then over time, you can collect the field data, which happens automatically, and double-check that users see it as being faster or more responsive, as well. So, in short, again, there is no absolutely correct number when it comes to any of these metrics. There is no absolutely correct answer where you’d say this is what it should be. But instead, there are different assumptions and ways of collecting data, and each is subtly different.
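To make the field-versus-lab distinction above concrete, here is a minimal sketch of how a site could collect its own field data in the browser. It assumes the open-source web-vitals npm package (v3-style onCLS/onFID/onLCP API); the /collect-metrics endpoint is a made-up placeholder for whatever analytics collector you use. Lab tools such as Lighthouse simulate a device and connection instead, which is one reason the two sets of numbers rarely match.

```ts
// Sketch: report real-user ("field") Core Web Vitals, assuming the
// web-vitals package (v3 API). Values collected this way reflect the devices
// and connections of actual visitors, like the Search Console report does.
import { onCLS, onFID, onLCP, Metric } from 'web-vitals';

function sendToAnalytics(metric: Metric): void {
  // Beacon the name, value and id so the metrics can be aggregated later.
  const body = JSON.stringify({
    name: metric.name,   // 'CLS' | 'FID' | 'LCP'
    value: metric.value, // the measured value for this page view
    id: metric.id,       // unique id, useful for deduplication
  });
  navigator.sendBeacon('/collect-metrics', body); // hypothetical endpoint
}

onCLS(sendToAnalytics);
onFID(sendToAnalytics);
onLCP(sendToAnalytics);
```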

How can our JavaScript site get indexed better?

Q: (04:20) So, first up, we have a few custom pages using Next.js without a robots.txt or a sitemap file. Simplified, theoretically, Googlebot can reach all of these pages, but why is only the homepage getting indexed? There are no errors or warnings in Search Console. Why doesn’t Googlebot find the other pages?

  • (04:40) So, maybe taking a step back, Next.js is a JavaScript framework, meaning the whole page is generated with JavaScript. But as a general answer, as well, for all of these questions like, why is Google not indexing everything? It’s important first to say that Googlebot will never index everything across a website. I don’t think it happens to any kind of non-trivial-sized website where Google would completely index everything. From a practical point of view, it’s impossible to index everything across the web. So that kind of assumption that the ideal situation is everything is indexed, I would leave that aside and say you want Googlebot to focus on the important pages. The other thing, though, which became a little bit clearer when, I think, the person contacted me on Twitter and gave me a little bit more information about their website, was that the way that the website was generating links to the other pages was in a way that Google was not able to pick up. So, in particular, with JavaScript, you can take any element on an HTML page and say, if someone clicks on this, then execute this piece of JavaScript. And that piece of JavaScript can be used to navigate to a different page, for example. And Googlebot does not click on all elements to see what happens. Instead, we go off and look for normal HTML links, which is the kind of traditional way you would link to individual pages on a website. And, with this framework, it didn’t generate these normal HTML links. So we could not recognise that there’s more to crawl and more pages to look at. And this is something that you can fix in how you implement your JavaScript site. We have a tonne of information on the Search Developer Documentation site around JavaScript and SEO, particularly on the topic of links because that comes up now and then. There are many creative ways to create links, and Googlebot needs to find those HTML links to make them work. Additionally, we have a bunch of videos on our YouTube channel. And if you’re watching this, you must be on the YouTube channel since nobody is here. If you’re watching this on the YouTube channel, go out and check out those JavaScript SEO videos on our channel to get a sense of what else you could watch out for when it comes to JavaScript-based websites. We can process most kinds of JavaScript-based websites normally, but some things you still have to watch out for, like these links.
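As an illustration of the linking issue described above, here is a hypothetical Next.js (pages router) component in TypeScript. The component name and the /products/42 URL are made up; the point is simply that Googlebot follows real `<a href>` links but does not click elements whose navigation only happens inside a JavaScript handler.

```tsx
import React from 'react';
import Link from 'next/link';
import { useRouter } from 'next/router';

// Hypothetical teaser contrasting a non-crawlable and a crawlable link.
export function ProductTeaser() {
  const router = useRouter();
  return (
    <>
      {/* Not crawlable: navigation happens only in a click handler,
          so there is no <a href> for Googlebot to discover. */}
      <div onClick={() => router.push('/products/42')}>View product</div>

      {/* Crawlable: in recent Next.js versions <Link> renders a real
          <a href="/products/42">, the normal HTML link Googlebot follows. */}
      <Link href="/products/42">View product</Link>
    </>
  );
}
```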

Does it affect my SEO score negatively if I link to HTTP pages?

Q: (07:35) Next up, does it affect my SEO score negatively if my page is linking to an external insecure website?

  • (07:44) So on HTTP, not HTTPS. So, first off, we don’t have a notion of an SEO score, so you don’t have to worry about any kind of SEO score. But, regardless, I understand the question as, is it wrong if I link to an HTTP page instead of an HTTPS page. And, from our point of view, it’s perfectly fine. If these pages are on HTTP, then that’s what you would link to. That’s what users would expect to find. There’s nothing against linking to sites like that. There is no downside for your website, and no need to avoid linking to HTTP pages just because they’re kind of old or crusty and not as cool as HTTPS. I would not worry about that.

Q: (08:39) With semantic and voice search, is it better to use proper grammar or write how people actually speak? For example, it’s grammatically correct to write, “more than X years,” but people actually say, “over X years,” or to write a list beginning with, “such as X, Y, and Z,” but people actually say, “like X, Y, and Z.”

  • (09:04) Good question. So the simple answer is, you can write however you want. There’s nothing holding you back from just writing naturally. And essentially, our systems try to work with the natural content found on your pages. So if we can crawl and index those pages with your content, we’ll try to work with that. And there’s nothing special that you need to do there. The one thing I would watch out for, with regards to how you write your content, is just to make sure that you’re writing for your audience. So, for example, if you have some very technical content, but you want to reach people who are non-technical, then write in the non-technical language and not in a way that is understandable to people who are deep into that kind of technical information. So kind of the, I would guess, the traditional marketing approach of writing for your audience. And our systems usually are able to deal with that perfectly fine.

Should I delete my disavow file?

Q: (10:20) Next up, a question about links and disavows. Over the last 15 years, I’ve disavowed over 11,000 links in total. I never bought a link or did anything disallowed, like sharing. The links that I disavowed may have been from hacked sites or from nonsense, auto-generated content. Since Google now claims to have better tools to not factor these types of hacked or spammy links into their algorithms, should I just delete my disavow file? Is there any risk, upside, or downside to just deleting it?

  • (10:54) So this is a good question. It comes up now and then. And disavowing links is always kind of one of those tricky topics because it feels like Google is probably not telling you the complete information. But, from our point of view, we do work hard to avoid taking this kind of link into account. And we do that because we know that the disavow links tool is a niche tool, and SEOs know about it, but the average person who runs a website doesn’t know about it. And all those links you mentioned are the links that any website gets over the years. And our systems understand that these are not things you’re trying to do to game our algorithms. So, from that point of view, if you’re sure that there’s nothing around a manual action that you had to resolve with regards to these links, I would just delete the disavow file and move on with life and leave all of that aside. I would personally download it and make a copy so that you have a record of what you deleted. But, otherwise, if you’re sure these are just the normal, crusty things from the internet, I would delete it and move on. There’s much more to spend your time on when it comes to websites than just disavowing these random things that happen to any website on the web.

Can I add structured data with Google Tag Manager?

Q: (12:30) Adding schema markup with Google Tag Manager: is that good or bad for SEO? Does it affect ranking?

  • (12:33) So, first of all, you can add structured data with Google Tag Manager. That’s an option. Google Tag Manager is a simple piece of JavaScript you add to your pages, and it then does something on the server side, and it can modify your pages slightly using JavaScript. For the most part, we’re able to process this normally. And the structured data you add that way can generally be counted, just like any other structured data on your web pages. And, from our point of view, structured data, at least the types that we have documented, is primarily used to help generate what we call rich results, which are these fancy search results with a little bit more information, a little bit more colour or detail around your pages. And if you add your structured data with Tag Manager, that’s perfectly fine. From a practical point of view, I prefer to have the structured data on the page or on your server so that you know exactly what is happening. It makes it a little bit easier to debug things. It makes it easier to test things. So trying it out with Tag Manager, from my point of view, I think, is legitimate. It’s an easy way to try things out. But, in the long run, I would try to make sure that your structured data is on your site directly, just to make sure that it’s easier to process for anyone who comes by to process your structured data, and it’s easier for you to track and debug and maintain over time, as well, so that you don’t have to check all of these different separate sources.
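For context, this is roughly what a tag manager does when it adds structured data: it injects a JSON-LD script with JavaScript after the page loads. The sketch below uses a made-up Product example; as the answer notes, serving the same block directly in the page HTML is usually easier to test and debug.

```ts
// Sketch: inject JSON-LD structured data client-side, the way a Tag Manager
// custom HTML tag effectively does. Product details are illustrative only.
const productJsonLd = {
  '@context': 'https://schema.org',
  '@type': 'Product',
  name: 'Example Running Shoe',
  offers: { '@type': 'Offer', price: '129.00', priceCurrency: 'AUD' },
};

const script = document.createElement('script');
script.type = 'application/ld+json';
script.textContent = JSON.stringify(productJsonLd);
document.head.appendChild(script);
```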

Is it better to block by robots.txt or with the robots meta tag?

Q: (14:20) Simplifying a question a little bit, which is better, blocking with robots.txt or using the robots meta tag on the page? How do we best prevent crawling? 

  • (14:32) So this also comes up from time to time. We did a podcast episode recently about this, as well. So I would check that out. The podcasts are also on the YouTube channel, so you can click around a little bit, and you’ll probably find them quickly. In practice, there is a subtle difference here where, if you’re in SEO and you’ve worked with search engines, then you probably understand that already. But for people who are new to the area, it’s sometimes unclear exactly where these lines are. With robots.txt, which is the first one you mentioned in the question, you can essentially block crawling. So you can prevent Googlebot from even looking at your pages. And with the robots meta tag, you can do things like blocking indexing when Googlebot looks at your pages and sees that robots meta tag. In practice, both of these result in your pages not appearing in the search results, but they’re subtly different. If we can’t crawl, we don’t know what we’re missing. And it might be that we say, well, there are many references to this page, maybe it is useful for something, we just don’t know. And then that URL could appear in the search results without any of its content because we can’t look at it. Whereas with the robots meta tag, if we can look at the page, then we can look at the meta tag and see if there’s a noindex there, for example. Then we stop indexing that page and drop it completely from the search results. So if you’re trying to block crawling, then definitely, robots.txt is the way to go. If you just don’t want the page to appear in the search results, I would pick whichever is easier for you to implement. On some sites, it’s easier to set a checkbox saying that I don’t want this page found in Search, and then it adds a noindex meta tag. For others, maybe editing the robots.txt file is easier. It kind of depends on what you have there.
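To spell out the two levers being compared, here is a small sketch (paths and values are illustrative only): robots.txt controls crawling, while a robots meta tag controls indexing and only works if the page remains crawlable.

```ts
// 1) robots.txt blocks crawling: Googlebot never fetches matching URLs, so a
//    blocked URL can still show up in results without its content.
const robotsTxt = [
  'User-agent: *',
  'Disallow: /internal-search/', // hypothetical section we don't want crawled
].join('\n');

// 2) A robots meta tag blocks indexing: the page must stay crawlable so the
//    tag can be seen, and the page is then dropped from the results.
const noindexTag = '<meta name="robots" content="noindex">';
```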

Q: (16:38) Are there any negative implications to having duplicate URLs with different attributes in your XML sitemaps? For example, one URL in one sitemap with an hreflang annotation and the same URL in another sitemap without that annotation.

  • (16:55) So maybe, first of all, from our point of view, this is perfectly fine. This happens now and then. Some people have hreflang annotations in separate sitemap files, and then they have a normal sitemap file for everything, and there is some overlap there. From our point of view, we process these sitemap files as we can, and we take all of that information into account. There is no downside to having the same URL in multiple sitemap files. The only thing I would watch out for is that you don’t have conflicting information in these sitemap files. So, for example, if with the hreflang annotations you’re saying, oh, this page is for Germany, and then in the other sitemap file you’re saying, well, actually this page is also for France or in French, then our systems might be like, well, what is happening here? We don’t know what to do with this kind of mix of annotations, and then we may pick one or the other. Similarly, if you say this page was last changed 20 years ago, which doesn’t make much sense, but say you say 20 years, and in the other sitemap file you say, well, actually, it was five minutes ago, then our systems might look at that and say, well, one of you is wrong. We don’t know which one. Maybe we’ll follow one or the other. Maybe we’ll just ignore that last modification date completely. So that’s kind of the thing to watch out for. But otherwise, if the URL is just mentioned in multiple sitemap files and the information is either consistent or works together, in that maybe one has the last modification date and the other has the hreflang annotations, that’s perfectly fine.
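As a quick sketch of the consistent, non-conflicting setup described above, the same (made-up) URL can appear in an hreflang sitemap and a general sitemap, as long as neither file contradicts the other:

```ts
// Hypothetical entries for the same URL in two sitemap files. One carries the
// hreflang annotations, the other the lastmod date; nothing conflicts.
// (The hreflang sitemap's <urlset> would also declare the xhtml namespace.)
const pageUrl = 'https://www.example.com/de/schuhe';

const hreflangSitemapEntry = `
  <url>
    <loc>${pageUrl}</loc>
    <xhtml:link rel="alternate" hreflang="de" href="${pageUrl}"/>
    <xhtml:link rel="alternate" hreflang="en" href="https://www.example.com/en/shoes"/>
  </url>`;

const generalSitemapEntry = `
  <url>
    <loc>${pageUrl}</loc>
    <lastmod>2022-07-01</lastmod>
  </url>`;
```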

How can I block embedded video pages from getting indexed?

Q: (19:00) I’m in charge of a video replay platform, and simplified, our embeds are sometimes indexed individually. How can we prevent that?

  • (19:10) So by embeds, I looked at the website, and basically, these are iframes that include a simplified HTML page with a video player embedded. And, from a technical point of view, if a page has iframe content, then we see those two HTML pages. And it is possible that our systems indexed both HTML pages because they are separate. One is included in the other, but they could theoretically stand on their own, as well. And there’s one way to prevent that, which is a reasonably new combination with robots meta tags that you can do, which is with the indexifembedded robots meta tag and a noindex robots meta tag. And, on the embedded version, so the HTML file with the video directly in it– you would add the combination of noindex plus indexifembedded robots meta tags. And that would mean that, if we find that page individually, we would see, oh, there’s a noindex. We don’t have to index this. But with the indexifembedded, it essentially tells us that, well, actually, if we find this page with the video embedded within the general website, then we can index that video content, which means that the individual HTML page would not be indexed. But the HTML page embedded with the video information would be indexed normally. So that’s kind of the setup that I would use there. And this is a fairly new robots meta tag, so it’s something that not everyone needs. Because this combination of iframe content or embedded content is kind of rare. But, for some sites, it just makes sense to do it like that.
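For reference, here is a sketch of the combination described above, placed on the standalone embed page (the HTML document loaded inside the iframe). The tag syntax follows Google's documented robots meta rules, so check the current documentation before relying on it.

```ts
// Robots meta tags for the embed-only page: noindex stops the standalone URL
// from being indexed, while indexifembedded still allows its content to be
// indexed as part of the page that embeds it in an iframe.
const embedPageRobotsTags = `
  <meta name="googlebot" content="noindex">
  <meta name="googlebot" content="indexifembedded">
`;
```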

Q: (21:15) Another question about HTTPS, maybe. I have a question around preloading SSL via HSTS. We are running into an issue with implementing HSTS and the Google Chrome preload list. And the question kind of goes on with a lot of details. But what should we search for?

  • (21:40) So maybe take a step back. When you have HTTPS pages and an HTTP version, usually you would redirect from the HTTP version to HTTPS. The HTTPS version would then be the secure version because it has all of the properties of the secure URLs, and the HTTP version, of course, would be the open one, a little bit vulnerable. And if you have this redirect, theoretically, an attacker could take that into account and kind of mess with that redirect. With HSTS, you’re telling the browser that once it has seen this redirect, it should always expect that redirect, and it shouldn’t even try the HTTP version of that URL. And, for users, that has the advantage that nobody even goes to the HTTP version of that page anymore, making it a little more secure. The preload list for Google Chrome is a static list that is included, I believe, in Chrome, probably in all of the updates, or I don’t know if it’s downloaded separately, not completely sure. But, essentially, this is a list of all of these sites where we have confirmed that HSTS is set up properly and that the redirect to the secure page exists, so that no user ever needs to go to the HTTP version of the page, which makes it a little bit more secure. From a practical point of view, this difference is very minimal. And I would expect that most sites on the internet just use HTTPS without worrying about the preload list. Setting up HSTS is always a good practice, and it’s something that you can do on your server. And as soon as the user sees that, their Chrome version keeps it in mind automatically anyway. So from a general point of view, I think using the preload list is a good idea if you can do that. But if there are practical reasons why that isn’t feasible or possible, then, looking only at the SEO side of things, from my point of view, I would not worry about it. When it comes to SEO, for Google, what matters is essentially the URL that is picked as the canonical. And, for that, it doesn’t need HSTS. It doesn’t need the preload list. Those do not affect how we pick the canonical at all. Rather, for the canonical, the important part is that we see the redirect from HTTP to HTTPS, and that we can get confirmation within your website, through the sitemap file, the internal linking, all of that, that the HTTPS version is the one that should be used in Search. And if we use the HTTPS version in Search, it automatically gets all of those subtle ranking bonuses from Search. The preload list and HSTS are not necessary there. So that’s the part that I would focus on there.
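A minimal sketch of the two pieces described above, written as hypothetical Express-style middleware in TypeScript: a permanent redirect from HTTP to HTTPS (the part that matters for canonicalisation) plus an HSTS header so returning browsers skip the HTTP attempt. The preload directive only matters if the site is also submitted to the Chrome preload list.

```ts
import express from 'express';

const app = express();
// (TLS termination itself would be handled by your HTTPS server or proxy;
//  req.secure may also require 'trust proxy' behind a load balancer.)

app.use((req, res, next) => {
  if (!req.secure) {
    // 301 to the HTTPS version: this redirect (plus consistent internal links
    // and sitemaps) is what helps Google pick the HTTPS URL as canonical.
    return res.redirect(301, `https://${req.headers.host}${req.originalUrl}`);
  }
  // HSTS: tell the browser to go straight to HTTPS next time.
  res.setHeader(
    'Strict-Transport-Security',
    'max-age=63072000; includeSubDomains; preload'
  );
  next();
});
```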

How can I analyse why my site dropped in ranking for its brand name?

Q: (25:05) I don’t really have a great answer, but I think it’s important to at least mention as well: what are the possible steps for investigation if a website owner finds their website is not ranking for their brand term anymore, they have checked all of the things, and it doesn’t seem to be related to any of the usual issues?

  • (25:24) So, from my point of view, I would primarily focus on the Search Console or Search Central Help Community and post all of your details there. Because this is where all of those escalations go, and the product experts in the Help forum can take a look at that. And they can give you a little bit more information. They can also give you their personal opinion on some of these topics, which might not match 100% what Google would say, but maybe they’re a little bit more practical. For example, probably not relevant to this site, but you might post something and say, well, my site is technically correct, and post all of your details, and one of the product experts looks at it and says, it might be technically correct, but it’s still a terrible website, you need to get your act together and create better content. From our point of view, we would focus on technical correctness, and you need someone to give you that, I don’t know, personal feedback. But anyway, in the Help forums, if you post the details of your website with everything that you’ve seen, the product experts are often able to take a look and give you some advice on, specifically, your website and the situation that it’s in. And if they’re not able to figure out what is happening there, they also have the ability to escalate these kinds of topics to the community manager of the Help forums. And the community manager can bring things back to the Google Search team. So if there are things that are really weird, and now and then something really weird does happen with regards to Search, it’s a complex computer system and anything can break, the community managers and the product experts can bring that back to the Search team. And they can look to see if there is something that we need to fix, or is there something that we need to tell the site owner, or is this just the way that it is, which, sometimes, it is. But that’s generally the direction I would go for these questions. The other thing subtly mentioned here is that the site does not rank for its brand name. One of the things to watch out for, especially with regards to brand names, is that it can happen that you say something is your brand name, but it’s not a recognised term for users. For example, you might, I don’t know, call your website bestcomputermouse.com. And, for you, that might be what you call your business or what you call your website, Best Computer Mouse. But when a user goes to Google and enters “best computer mouse,” that doesn’t necessarily mean they want to go directly to your website. It might be that they’re looking for a computer mouse. And, in cases like that, there might be a mismatch between what we show in the search results and what you think you would like to have shown in the search results for those queries, if it’s more of a generic term. And these kinds of things also play into the search results overall. The product experts see these all the time, as well, and they can recognise that and say, actually, just because you call your website bestcomputermouse.com, and I hope that site doesn’t exist, it doesn’t necessarily mean it will always show on top of the search results when someone enters that query. That’s something to watch out for. But, in general, I would go to the Help forums here and include all of the information you know that might play a role.
So if there was a manual action involved and you’re kind of, I don’t know, ashamed of that, which is kind of normal, include it anyway. All of this information helps the product experts better understand your situation and give you something actionable that you can take as a next step, or to understand the situation a little bit better. So the more information you can give them from the beginning, the more likely they’ll be able to help you with your problem.

Sign up for our Webmaster Hangouts today!

GET IN CONTACT TODAY AND LET OUR TEAM OF ECOMMERCE SPECIALISTS SET YOU ON THE ROAD TO ACHIEVING ELITE DIGITAL EXPERIENCES AND GROWTH 

SEO & web developers: friends or foes?

Is SEO just another constraint?

(0:59) MARTIN: So you are saying that for you as a web developer, SEO is just another constraint. What do you mean by that?

SURMA: I mean, as far as web development goes, and here I am pretending to be a representative of all web developers, which I’m clearly not, the usual struggle involves stuff like, how many browsers do I support, or how far back in the browser versions do I go? Do I support IE11? Do I not support IE11? How do I polyfill certain things? Do I polyfill them, or do I use progressive enhancement? What kind of implications do both of these choices have? Do we actually make the design fully responsive? Do we do a separate mobile site? There are all these choices that I already have to make. And then frameworks come along and are like, we’re just going to do most of the stuff on the client side because we want to write a single-page app. And then you either have to say, do I set up something for server-side rendering or static rendering at build time, or do I go all-in on the single-page app, and everything just happens on the client? What do search engines think about that? And then search engines come in like, well, you should be doing this, and you should not be doing that because that gets penalised. And actually, we don’t even understand what you’re doing here because search engines are running Chrome 12 or something. And it’s just yet another constraint that I have to balance and make decisions on, whether following their restrictions is more important than following my business restrictions, my marketing restrictions, or my project management restrictions. And I feel like some people are frustrated about that sometimes.

MARTIN: I remember when I was a web developer, there was also this entire “user first, no, mobile first, no, content first, no, this first, no, that first” thing. That’s probably also going in the same direction, and I understand the frustration. And I see that there are lots of requirements, and sometimes these requirements might even contradict each other. But I think as developers, we should understand what SEOs are trying to help us with and what search engines, separately from what we are building and doing, are actually trying to accomplish. And then we would see that basically all of these requirements are important, but maybe some of them are more important than others, and they are important at different stages, I would say. So, for instance, you mentioned responsive design versus having separate mobile versions, right? I would say that’s a decision that you need to make relatively early on in the technology process, whereas should I use this feature, should I polyfill this feature, should I not use this feature because I need to support an old browser that doesn’t support it and the polyfill is tricky, that’s something that probably happens a little later in development, right?

SURMA: Yeah, I think I agree with that. It depends on how much flexibility you’re given as a developer. I think we all may or may not have lived through the experience of working with a designer who insists on a pixel-perfect design, which is just not something that works on the web, but sometimes you’re given a task, and your job is to complete it and not have an opinion. I don’t want to go down the “it depends” route. But in the end, whatever we end up talking about, we probably won’t find a definitive answer. Context matters, and everyone has different constraints. And I think that’s really what it’s about: you need to just be aware of the constraints that you have and make decisions in the context of your situation.

SEO – friend, not foe

(04:41) SURMA: You mentioned something that I find quite interesting. You said SEOs are trying to help us with something because often they’re kind of like villains, almost like the people who just try to get you to the top of the rankings, whether you deserve it or not. But in the end, I feel like there is help going on. Both search engines, as well as the people that want you to do well in the search results, actually are trying to make your site better in the end. Like no search engine actually wants to give you results that are bad. That just doesn’t make sense. In the end, search engines are trying to put the best results on top. And if an SEO helps you get on top, then ideally, what that means is your site has gotten better. 

MARTIN: Yes, exactly. And I love that you are saying, like, oh yeah, you have to look at the context, you have to understand the constraints. That’s actually something that a good SEO will help you with, because if you look at it from a purely SEO standpoint, depending on what you’re building and how you’re building it, you might have different priorities. So, for instance, if you’re saying, oh, this is a test version of a landing page, we just want to see if the customer segment is even interested in what we are potentially building later on, then you don’t want to build for the bin, right? You don’t want to build something that, later on, you find out doesn’t actually work because there’s no interest in it. So for these things, SEO might be relatively important because you definitely want people to find it, so that you get enough data to make decisions later on. But you might not be so constrained in terms of, oh yeah, this has to be client-side versus server-side, we don’t really have to make this super fast, we just have to get this in front of people, especially through search engines, so that we get some meaningful data to make decisions later on. That’s different from building and improving on an existing product, which should be longer-lived.

Building better websites for everyone

(6:33) MARTIN: So, a good SEO helps you understand what kind of requirements you should take into account. And SEO is a gigantic field, and they should pick the bits and pieces that actually matter for your specific context. So you said like, oh, we want to build a single page application. Maybe. Maybe you do, maybe you don’t. Maybe it’s fine to build a client-side rendering, but maybe consider doing some bits and pieces of server-side rendering because you reap some performance benefits there. And that also influences SEO because, as you say, search engines want to find the good things. So making your site better includes making it faster but also making it accessible because if you think about it, search engines are computers interacting with your website, working through your website and trying to understand what your website says. So they have basic accessibility needs. They don’t necessarily interact with things. They don’t click on stuff. And yet they need to work with your content. So it should be accessible to them. And SEOs will point these things out.

SURMA: That’s really interesting that you bring that up because I was just thinking about both performance, like loading performance, for example, and accessibility. On the one hand, it’s kind of accepted that loading performance is important. But now, for example, we have Core Web Vitals, and one of their core statements is that they don’t want to just put a number on a metric or something that’s measurable. They want to measure things that are important to user experience. And so the Core Web Vitals that we currently have, which is just three metrics, LCP, CLS, and FID, all of these are statistically correlated with users enjoying the site more or staying on your site longer. And that means if you optimise for those, you actually will get something out of it. You will get users that stay longer. And now that Search is looking at those, it means optimising for those metrics not only potentially gets you higher in the rankings, but also the people that do see your site will most likely stay longer or engage with it more, because we know that these metrics correlate with user behaviour. And I think that’s a really interesting approach, where, in the end, search engines are actually helping you do the right thing. And now I’m wondering, and I don’t even know, accessibility is something which we keep talking about, and we know it’s important. And yet it feels like it always falls off the truck. In many projects, it’s an afterthought, and many people know that it needs to be considered from the very beginning of a project because it’s hard to shoehorn in at the end. It needs to be something that works from the start. Has any search engine ever done anything in this space to help developers be better with accessibility?

MARTIN: We are more or less going in that direction, not necessarily from a purely accessibility standpoint, but because search engines need to semantically understand what the site is about. We don't just take the text as plain text. We try to figure out, oh, so this is a section, this is another section, this is the section that is most important on the page, this is just an explainer for the previous section, and so on and so forth. For that, we need the semantics that HTML gives us. And these semantics are often important for accessibility too, because people need to be able to navigate your content differently, maybe with a keyboard, maybe with a screen reader. For that, the semantics on the page need to be in place from the get-go. So in that direction, having better semantics does help search engines understand your content, and as a byproduct it also helps people with additional navigational needs navigate your content. So you could say search engines are a little involved in terms of accessibility. That does not cover accessibility as a whole; there is so much more to accessibility than just that. But at least the core semantics of the web are taken care of here.
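
To make this concrete, here is a minimal sketch (not from the episode; the function name and content strings are invented) of a server-rendered page fragment using the kind of semantic structure Martin describes. The same landmarks and headings that let a crawler weigh sections also give keyboard and screen-reader users something to navigate by:

```ts
// Renders an article body with explicit landmarks and a heading hierarchy,
// rather than a pile of undifferentiated <div> elements.
function renderArticle(title: string, intro: string, body: string): string {
  return `
    <main>
      <article>
        <h1>${title}</h1>
        <section aria-labelledby="intro-heading">
          <h2 id="intro-heading">Introduction</h2>
          <p>${intro}</p>
        </section>
        <section aria-labelledby="details-heading">
          <h2 id="details-heading">Details</h2>
          <p>${body}</p>
        </section>
      </article>
    </main>`;
}
```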

Keeping up with web development trends is important for SEOs

(10:37) MARTIN: Another thing that I found really interesting is where you say, oh, you know, SEOs are often seen as just coming in with all of these additional constraints and requirements. What could they do differently that you think would help you and other developers understand where they're coming from, have a meaningful discussion about these things, and turn that into positive, constructive input?

SURMA: I don't know if this is the answer you're looking for, but one thing I have seen is that some SEOs need to put a bit more effort into staying up to date on what is and isn't good guidance, or more specifically, what search engines are and are not capable of processing correctly. I know that you have been fighting the "no, JavaScript is fine" fight for a long time now, but to this day there are still some SEOs out there who insist that anything in JavaScript is invisible to the search engine. In general, I think it goes back to the trade-off thing: web developers need to realise that SEOs are trying to help them do better, and SEOs need to realise that they can't give advice in an "either you do this or you're screwed" kind of way. It's a trade-off. You can say this is one way you can make a site better, this is another way, and this is yet another thing you can do, and all of these things accumulate to make your site better, ideally resulting in a higher rank. But it's not an all-or-nothing approach. Sometimes certain constraints just outweigh other constraints, and you then make a decision to go with plan A rather than plan B, or stick with what you've currently got. We have recently seen a lot of shifts from purely client-side rendering to this hybrid approach, where the app is rendered on the server side or even at build time but then turns into a single-page app once loaded on the client, and that has upsides and downsides. We know that statically rendered content is very good for your first render; your loading times all go down. But now we have this additional blob of JavaScript state that is somehow inserted into the document, and then often a full dynamic client-side re-render happens, which can create an iffy user experience at times. All these things work for or against you in certain aspects. And I think that's just something SEOs need to be mindful of as well: the developer cannot just follow everything they say, because SEOs are not the only deciding force on a project. I'm not saying all SEOs behave like this, of course, because I'm honestly quite inexperienced in working with an SEO directly. But just based on stories I hear and people I see on Twitter, it's all a trade-off. People just need to realise that, in 90% of cases, everyone is trying to do the best they can and do their job well. Keep that in mind, and then find a solution that works for both sides or is a compromise.
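
As a concrete sketch of the hybrid approach Surma describes (not from the episode; it assumes a React app, and the App component and file names are hypothetical): the markup is produced on the server or at build time, and the same component tree is then hydrated on the client so event handlers attach to the already-rendered HTML.

```tsx
// --- server.tsx: produce HTML on the server or at build time ---
import { renderToString } from 'react-dom/server';
import { App } from './App'; // hypothetical application component

export function renderPage(): string {
  // The HTML ships fully rendered, so the first paint doesn't wait for JS.
  const appHtml = renderToString(<App />);
  return `<div id="root">${appHtml}</div>`;
}

// --- client.tsx: hydrate the server-rendered markup in the browser ---
import { hydrateRoot } from 'react-dom/client';
import { App } from './App';

// Attaches listeners and state to the existing DOM instead of re-creating it.
hydrateRoot(document.getElementById('root')!, <App />);
```

The trade-off Surma points at lives in that second step: the serialized state and the hydration pass are extra work the browser does on top of the HTML it already received.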

Change your perspective

(13:57) MARTIN: Yeah. No, that makes perfect sense. And I wish that both SEOs and developers would look at it from a different perspective. Both SEOs and developers want to build something that people actually use, right? You don't want to build something that no one uses. That's not going to pay your bills for very long, and it doesn't give you the satisfaction of seeing, oh yeah, we built something that helps many people. That's true for me: when I was a developer, I wanted to build things that have an impact, and that means they need to be used by someone. And if we are building something we are genuinely convinced is a good thing, then that should be reflected by search engines agreeing and saying, oh yeah, this is a good solution to a problem or challenge that people might face, and thus wanting to showcase your solution. But for that, there needs to be something that search engines can grasp, understand, look at, and put into their index accordingly. Basically, they need to understand what the page is about, what it offers users, whether it is fast or slow, whether it is mobile-friendly, all these kinds of things. As a developer, you are focused on making it work in all the browsers it needs to work in, making it fast, using all the best practices, tree shaking, bundle splitting, all that kind of stuff. SEOs then come in and help you make sure that search engines understand what you're building and can access what you're building, and that you are following the few best practices you might not necessarily be aware of yet. But you are right: for that, SEOs need to follow up-to-date best practice guidance, and not all of them do. At the beginning of 2021, I ran a poll in one of the virtual events asking if people were aware that Googlebot now uses an evergreen Chromium, meaning we keep updating the Chromium instance that is used to render pages. And I think about 60% of the people said, oh, I didn't know that, even though we announced it at Google I/O in May 2019.
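
For reference, the "bundle splitting" Martin mentions usually boils down to dynamic imports like the sketch below (not from the episode; the module path, export, and element ID are hypothetical). Bundlers emit the dynamically imported module as a separate chunk that only loads when the user actually needs it:

```ts
// Keep the heavy editor code out of the initial bundle so the first load
// stays small; the chunk is fetched on demand when the button is clicked.
const button = document.querySelector<HTMLButtonElement>('#open-editor');

button?.addEventListener('click', async () => {
  // './image-editor' is a hypothetical module used for illustration.
  const { openEditor } = await import('./image-editor');
  openEditor();
});
```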

SURMA: How was that?

MARTIN: That was amazing. Launching it has been a great initiative. But I'm surprised that, while we seem to have gotten developers to notice, not all SEOs have. And there are things that are not necessarily easy, or not even your job as a developer, where SEOs can really help you or at least make the right connections for you. For instance, I know you built squoosh.app, right?

SURMA: Well, not just me, but I was part of the team that built it.

MARTIN: Right. You were part of the team who built squoosh.app. And I think squoosh.app is awesome. For those who don't know it, it's a web app that lets you experiment with different image compression settings and then get back a compressed version of the image you put in, all in your browser. You don't have to install anything, and you can basically find the best settings for the biggest gains in terms of file size, right? That's roughly what it does.

SURMA: Yeah. It's an image compressor, and you can fiddle with all the settings and try out the next-generation codecs that are coming to the web now. But yeah, you have more control than in any other image compression app that I know of.

MARTIN: And it's really, really cool, and I really admire the work that the engineers, all the developers, put into making it work so smoothly, so quickly, so nicely. It implements lots of best practices. But from a search engine's perspective, if you were to sell that as a product, it might not look very good. And that's because, if you look at it, it's an interface that allows me to drag and drop an image into it, and then there's a bunch of user interface controls to fine-tune settings. But if I were a robot accessing that page, it's a bunch of HTML controls and not much content, right?

SURMA: Agreed.

MARTIN: So would you want to have to sit down and figure out how you would describe all of this for a search engine? You probably don't want to do all that work by yourself. You want to focus on building cool stuff with the new image algorithms and on fine-tuning the settings to be better, more accessible, or easier to understand, right? That's where you want to focus.

SURMA: Yeah. And I think I actually would like to get help from someone who knows. I think our loading performance is excellent, because we spent lots of time making it good and trying to pioneer some techniques, but I wouldn't have been able to tell you whether the site gets a good ranking from a search bot or a bad one, to be honest. I mean, the name is unique enough that it's very Google-able, so I think even if it didn't do so well, people would probably find it. But in the end it's actually a very interesting example, because you're completely right. The opening page, because it's about images, mostly consists of images. The only text we have on the landing page is the name, the file sizes of the demo images, and the licensing link. So there's not much going on for a bot to understand what the site does, especially because, for something this specific, there's not even much you can do with semantic markup, as you said. Right, OK, cool, there's an image and an input tag, you can drag and drop an image. But even the drag and drop is actually only communicated via the user interface, not via the markup. So yeah, that's a really interesting example. I would have no idea how to optimise it. I would probably have said the meta description tag, I don't know. And then John Mueller told me that apparently we don't pay attention to the meta description tag anymore.

MARTIN: Well, we do. It’s the keywords that we don’t.

SURMA: Oh, the keywords are the ones. OK, I take that back. Yeah, exactly. So I think you're right that it's very easy for developers to guess at what is good for SEO and what is bad, and better to actually get input from someone who has put in the time to learn what is actually going on and who keeps up to date with the most recent updates. As you say, people apparently don't even know that Googlebot is now evergreen Chromium, which is amazing. So there are probably a lot of SEOs who go around saying, no, no, no, you can't use Shadow DOM or something like that, even though that JavaScript actually works. I agree. Get someone who knows.
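
As a purely illustrative sketch (not something the Squoosh team has said they do; the description text, app name, and URL are invented), one low-effort way to give a crawler something to work with on an app-like page with little visible text is a meta description plus schema.org WebApplication structured data, emitted when the page is rendered:

```ts
// Builds a <head> fragment with a human-readable description and JSON-LD
// structured data describing the app, so a bot has text to understand
// even if the visible UI is mostly images and controls.
const head = `
  <title>Image compressor in your browser</title>
  <meta name="description"
        content="Compress and compare images with different codecs, right in your browser.">
  <script type="application/ld+json">
    ${JSON.stringify({
      '@context': 'https://schema.org',
      '@type': 'WebApplication',
      name: 'Example Image Compressor',
      applicationCategory: 'MultimediaApplication',
      operatingSystem: 'Any',
    })}
  </script>`;
```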

Making things on the web is a team sport

(20:26) SURMA: I mean, I've been saying that even as a very enthusiastic experimenter and web developer, one single person cannot really understand and use the entire web platform. It's now so incredibly wide in the areas it covers. You can do Web Audio, WebAssembly, WebUSB, MIDI, and all these things, and you will not have experience in all of them. And some of them, like WebGL, are huge rabbit holes to fall into. So pick some stuff, get good at it, and for the things you don't know, get help, because otherwise you're going to work on half-knowledge that will very likely end up being counterproductive to what you're trying to achieve.
