Why you shouldn't let your technical SEO slip after the Helpful Content Update

Andy Frobisher

Andy Frobisher believes that you shouldn't let your technical SEO slip after the Helpful Content Update.

@andy_frobisher  

Andy says: “The Helpful Content Update was a really, really big Google update in terms of how it impacted a lot of sites that were churning out content – particularly at scale. It was really going after low-quality content, or content that wasn't particularly helpful. It really honed in again on the ‘relevance’ that a lot of people are talking about in SEO.

Do you have the expertise and authority to talk about that subject? Is that content generally seen as helpful? Are people engaging with the content on the page? Are they bouncing back to Google or are they actually staying on your website? A lot of this caused quite a scare in the industry, obviously, because a lot of content-led sites were impacted – and by subsequent updates following that as well, since there have been other updates centred around content quality.

It's easy for people to automatically assume that the decline in visibility and dropping rankings they’re seeing are down to the Helpful Content Update, or down to content. That's not always the case.

My opinion is that we should really be thinking about our technical SEO first before we decide whether it's actually content or it's links. Technical SEO is like the foundations of your house. You can't build your house without the foundations. If there's something structurally wrong with your house, check the foundations. Is there a problem that might be causing the house to fall down? It's that analogy.

You start with the technical SEO. Is there anything there that could be the root cause of the ranking declines, or of the loss in impressions and traffic, before you determine whether it is actually the Helpful Content Update or content quality itself?”

So, if the technical health of your website is not up to a decent standard, should you set the Helpful Content Update to one side, because you're going to get mixed messages and won’t be diagnosing the correct issue?

“Yeah, exactly. You can be pulling your hair out, you can be doing all the right things that you've been told to do, and you're not really seeing anything change.

Yes, Google have said that sites that are actually impacted might only see a difference at the next Core Update – it’s not a quick thing – but we have seen sites recover from it. If you're pulling your hair out, wondering why on Earth you've made all these changes – you've done these things, but nothing's happening – it could be because there's a fundamental underlying technical problem that's actually the root cause.

If your rankings haven't completely dropped out of the top 100 (you're used to having rankings in the top 10-20, but now you're struggling to get them into the top 50), that could be a signal that there's a technical problem with Google being able to discover, crawl, or render pages on your website.”

What are the key aspects of technical SEO that you shouldn't let slip?

“For me, it's essentially a few areas. You start with just making sure that Googlebot and other search engine crawlers are not blocked from accessing important pages – whether that's accidentally blocking them via robots.txt, incorrect canonicals that cause Google to ignore a URL, soft 404s, or other things like that.
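That first check is easy to script. A minimal sketch using only Python's standard library; the domain and paths here are hypothetical placeholders:

```python
# Check whether Googlebot is allowed to fetch important URLs,
# according to the site's live robots.txt.
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"                 # hypothetical placeholder
IMPORTANT_PATHS = ["/", "/products/", "/blog/"]  # hypothetical placeholders

parser = RobotFileParser()
parser.set_url(f"{SITE}/robots.txt")
parser.read()  # fetch and parse the live robots.txt

for path in IMPORTANT_PATHS:
    url = f"{SITE}{path}"
    status = "OK" if parser.can_fetch("Googlebot", url) else "BLOCKED for Googlebot"
    print(f"{status}: {url}")
```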

Make sure that Google can fetch and render the page in full in a timely fashion. There is a bit of a rule of thumb that if it takes Google longer than 5 seconds to fetch and render a page, it's going to move on. Google doesn't have an infinite amount of resources. It's not going to wait around forever to fully understand that page. It will move on to the next one, so don't give them a reason to do that. Site speed and good server response times are really important.
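That roughly 5-second rule of thumb is easy to sanity-check yourself. A small sketch, assuming the third-party requests library; the URL is a hypothetical placeholder:

```python
# Rough server response check against the ~5 second rule of thumb.
import requests

url = "https://www.example.com/important-page"  # hypothetical placeholder
resp = requests.get(url, timeout=10)

# `elapsed` measures time until the response headers arrived; a full
# render in a real browser will take longer still.
seconds = resp.elapsed.total_seconds()
print(f"Status {resp.status_code}, time to response: {seconds:.2f}s")
if seconds > 5:
    print("Warning: slower than the ~5s rule of thumb, before any rendering")
```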

If your site's JavaScript-heavy as well, make sure Googlebot can easily fetch and render the content that's provided by JavaScript, as well as HTML. If you're using client-side JavaScript rendering, using something like Prerender that will allow Googlebot to see a cached HTML version could be a nice temporary measure to make sure you're not seeing any unwanted issues with long load times to render that JavaScript.

Then just make sure there's a really, really good internal linking structure in place so Googlebot can easily follow and understand your site structure. That is quite often forgotten about. When I've worked across websites, one of the first things you have to do is an internal linking audit.”

Could you explain why you recommend a temporary renderer like Prerender?

“Prerender.io is a temporary solution that’s out there. If you’re having issues with Googlebot not being able to render JavaScript through client-side rendering, Prerender will actually cache a full HTML version of that page and provide that for bots and crawlers only.

The advantages there are that client-side rendering does have an improved UX for users because you're essentially caching that JavaScript file in the user's browser. So, as they're searching around pages, it loads much faster. Whereas, with server-side rendering (although that's much, much better for Googlebot because it gets a full HTML view of the page, delivered to them straight away), it does take a little bit longer from a user perspective for that entire thing to be loaded.

Google recommends that it should only be a temporary solution because they don't want it to be seen as cloaking in the long term. Cloaking – serving bots and crawlers different content from what users see – is essentially against Google's policies.

It's a temporary option if you've got this issue. It's a quick fix in a way, because longer-term solutions (maybe going down the server-side rendering route with a CDN to cache the website) could actually be a lot of work for website developers and website builders, or it could be quite an investment. You can put Prerender.io in place right now, and then work towards that longer-term solution.”
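For illustration, the general pattern behind services like Prerender.io – serving a cached HTML snapshot to known crawlers while regular users get the client-side app – might look like the sketch below. This is not Prerender.io's actual integration, just the idea, shown with Flask and hypothetical file paths:

```python
# Minimal sketch of the "prerender for bots" pattern using Flask.
from flask import Flask, request, send_file

app = Flask(__name__)

BOT_MARKERS = ("googlebot", "bingbot", "duckduckbot")  # illustrative list

def is_bot(user_agent: str) -> bool:
    ua = (user_agent or "").lower()
    return any(marker in ua for marker in BOT_MARKERS)

@app.route("/products")
def products():
    if is_bot(request.headers.get("User-Agent", "")):
        # Bots get a pre-rendered, fully populated HTML snapshot
        # (hypothetical cached file).
        return send_file("cache/products.html")
    # Regular users get the JavaScript shell that renders client-side.
    return send_file("static/spa_shell.html")
```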

What other software do you use and recommend for tracking/measuring to see if there are issues, and providing alerts when issues do occur?

“For JavaScript specifically, there are a few free tools out there you can use.

Obviously, if you've got access to Google Search Console, you can put in a URL from your website, inspect that URL, and it will give you a snapshot of the HTML that it sees along with a screenshot of the page. You can see there whether Google is struggling to bring in some of that content.

If you don't have access to Google Search Console, you can also do the exact same thing in the Rich Results Test. That will also show you all the HTML of the page and it will show you that same snapshot. That's a handy little trick if you don't have access to the client’s or business’s Google Search Console yet, but you want to check.

Another way to do that is by comparing the DOM against the source code. The DOM is essentially everything that Google has rendered. If there are discrepancies between the source code and the DOM, then there could be issues with JavaScript there as well.
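That DOM-versus-source comparison can be automated with a headless browser. A sketch assuming the third-party requests and playwright packages (with Chromium installed via `playwright install chromium`); the URL is a hypothetical placeholder:

```python
# Compare raw HTML source against the JavaScript-rendered DOM.
import requests
from playwright.sync_api import sync_playwright

url = "https://www.example.com/js-heavy-page"  # hypothetical placeholder

raw_html = requests.get(url, timeout=10).text  # what "view source" shows

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    page.goto(url, wait_until="networkidle")
    rendered_dom = page.content()  # serialised DOM after JS execution
    browser.close()

# A large gap suggests a lot of content is delivered by JavaScript.
print(f"Raw source:   {len(raw_html):>10,} chars")
print(f"Rendered DOM: {len(rendered_dom):>10,} chars")
```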

Then you have some paid options like Screaming Frog or Sitebulb. They have really good JavaScript crawling capabilities where you can, again, compare JavaScript versus HTML and check to see what level of content is being delivered by JavaScript versus HTML, etc. You can begin to get a picture of whether there's definitely content that's being brought in by JavaScript on the page, and then begin to investigate further from there.”

Another potential issue that you highlighted was internal linking. How do you determine that that is potentially an issue and how do you go about solving it?

“I would always recommend that this is done on a manual basis. It's really hard to automate internal linking because the anchor text of the internal link is really critical, as is its placement on the page.

For example, whether you're internally linking to something higher up the page versus right at the bottom where someone's not likely to scroll right to the end and read the full article – because we know that readers skim read. They want to find that answer. Once they've found that answer, they're either done or maybe they'll convert onto a different page. Internal links at the bottom don't really offer the same importance as an internal link at the top would.

I would always recommend that it's done manually. Make sure that the anchor text makes sense in terms of what page you're following on from. Google uses that anchor text to determine a little bit more about that page. If your anchor text isn't relevant – it doesn't make sense – then it's not really going to provide the benefit that you expect.

But also, with internal linking, it can be on a global level. How many clicks does it take to get to a page? How easy is it to access a page? Is that page too hidden? Does a user have to click on various points to get to that page? How can we make it much quicker for a user to get from their landing page to the page they want to access? If we can do that in 3 clicks or less, that’s absolutely fantastic. If it's taking 5 or 6 clicks to get to that page, that's 5 or 6 times Google has to follow links in order to find that page. Again, the longer that takes, the less importance you’re putting on that page.”

Is the number of clicks it takes for the search engine/user to discover the page the initial opportunity that you tend to focus on?

“That's one of them: how many clicks does it take to get to that page? Is the click depth too large? How many internal links are pointing to important pages? There are plenty of studies out there showing that, theoretically, the more links you have pointing to a page, the more importance you're giving to it.

Obviously, you want those links to be from relevant content. If you've got a hub page that's about selling cars, you want relevant content around questions or answers that will help around that topic – topically relevant internal links to that page.

You want lots of those. If there's a lack of them, then you're looking to add internal links. Likewise, if there are loads that aren't actually relevant, then you would probably look to remove those.

Then it's also the anchor text. If the page you're linking to is about second-hand Ford cars for sale, but your anchor text just says ‘second-hand cars’, it's not really the right target for that page. It needs to mention ‘second-hand Ford cars for sale’ or something more relevant to that. The use of the anchor text is important there as well.”
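Click depth and internal link counts of the kind Andy describes can be estimated with a simple breadth-first crawl from the homepage. A minimal sketch, assuming the third-party requests and beautifulsoup4 packages; the start URL is a hypothetical placeholder and the crawl is capped for politeness:

```python
# Breadth-first crawl: shortest click depth from the homepage,
# plus a count of internal links pointing at each URL.
from collections import Counter, deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START = "https://www.example.com/"  # hypothetical placeholder
MAX_PAGES = 200                     # keep the sketch polite

host = urlparse(START).netloc
depth = {START: 0}   # BFS discovers each page at its shortest click depth
inlinks = Counter()  # internal links pointing at each URL
queue = deque([START])

while queue and len(depth) < MAX_PAGES:
    url = queue.popleft()
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException:
        continue
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        link = urljoin(url, a["href"]).split("#")[0]
        if urlparse(link).netloc != host:
            continue  # only internal links count here
        inlinks[link] += 1
        if link not in depth:
            depth[link] = depth[url] + 1
            queue.append(link)

for url in sorted(depth, key=depth.get):
    flag = "  <-- deeper than 3 clicks" if depth[url] > 3 else ""
    print(f"depth {depth[url]}  inlinks {inlinks[url]:>3}  {url}{flag}")
```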

What's something that SEOs shouldn't be doing in 2024?

“Stop purely relying on tools to handle your technical SEO health.

Yes, there are lots of brilliant tools out there – Majestic is a brilliant tool – but just running an audit on a website and having it give you a score and a list of recommendations isn't enough. That's the kind of thing I was used to seeing 10 years ago. You have to use your own evaluation of those.

Some things are going to be more important than others. Some things will make a larger difference than others. Some things may be more difficult to achieve than others. You've got to use your own experience to prioritise what is needed and what isn't needed. What should be done first over other things? What's going to make the most difference to help towards your targets and KPIs?

Also, these tools will sometimes give you a hint at what could be a bigger, deeper problem, and your experience can then help you figure out a much wider problem than what the tool is suggesting.

Always use your own initiative, experience, and skills and don't be afraid to delve into things yourself. Use various other tools and software as well, to see if that hunch or that gut feeling is right, and stop being afraid to investigate those things. Yes, tools are going to be great for showing ongoing health. Are you improving things as you go along? It's going to spark other things, but don't be afraid to do a lot of that evaluation yourself and make sure that you're doing the right things that are going to make a difference.

Bringing it back to the house analogy, you wouldn't just give a novice a set of tools and say, ‘Go and build a loft extension.’ You actually need to know how to use these tools, and that will place you head and shoulders above your competitors in your industry.”

Andy Frobisher is a Freelance SEO Consultant, and you can find him over at AndyFrobisher.com.
