Google’s Link Spam Update: What to Do and What Not to Do?


On December 14, 2022, Google began rolling out a link spam update designed to nullify the impact of irrelevant and unnatural links on search results. The update is implemented through SpamBrain, Google's AI-based spam-prevention system, which has been trained not only to detect spam links directly but also to detect both sites buying links and sites used to pass outgoing links.

If you saw a drop in your Google rankings during the rollout, this link spam update could be the cause. Make sure you follow Google's webmaster guidelines and use only natural links, and work on improving your site so it attracts new links naturally over time.

Our team will continue to provide data-backed insights and recommendations to navigate this evolving landscape.

Understanding Google’s Link Spam Update

A link spam update attempts to neutralize, or stop counting, links that Google deems spammy and against its rules. This launch "may change when spammy links are neutralized and any credit passed by these unnatural links is lost," according to a statement from Google. All languages are affected by this launch.

The update leverages what Google calls SpamBrain, which Google first referenced in its 2018 webspam report; in the spam-trends section of that report, Google described using its "machine learning systems" to improve search spam detection.

Google added that SpamBrain can not only “detect spam directly, but it can also now detect both sites buying links, and sites used for the purpose of passing outgoing links.”

In the previous link spam update, Google used the term "neutralize" for such links; this time it says "nullify." Neither term necessarily means "penalize"; rather, it means "ignore" or simply "not count." Since the 2016 release of Penguin 4.0, Google's approach to combating link spam has been to ignore spammy links rather than count them.

Similarly, the update's focus isn't on guest posts in general, but rather on "commercial" or promotional posts. These advertorial-style posts, explicitly crafted to sell products or services, are the ones Google aims to address.

However, Google emphasizes that guest posts informing users, educating audiences, or raising awareness for a cause or company are encouraged and remain valuable.

This Update Is Not Intended To Penalize

Google has said it ignores unnatural links, so it isn't clear why it would hit websites for spam if it simply ignores those links; there is a lot of contradictory information out there. Be very careful when using the disavow tool: it can tank your site further if you disavow links that are actually keeping you afloat.

There’s a good chance some spammy links were previously giving you a ranking boost. It’s likely that Google got better at ignoring certain patterns with the December update and caught a bunch of old links that were passing value.

It's also worth noting that a helpful content update began rolling out around December 5, and the link spam update began rolling out on December 14. Both the HCU and the link spam update completed on January 12, 2023. So ask yourself: did you get hit in early December, or after the 14th?

The best thing you can do is make sure your content is as good as possible, and aligned with search intent. Also look at your internal links to make sure you’re following best practices there.

However, genuine outreach efforts to acquire high-quality backlinks from relevant sites featuring valuable content remain unaffected by this update.

Don’t Drop Your Guest Post Strategy

While the recent Google update may raise concerns about the future of guest posting as an SEO strategy, the reality is not as dramatic. The update specifically targets low-quality, spammy links, not guest posting in general. This means that high-quality guest posts with relevant and valuable content will continue to be rewarded by Google.

I suspect it has to be something about the language in the text. I have written 400+ skyscraper content pages, and some got hit by the spam update, even though my texts are not spam but genuinely helpful, well-researched content.

I think the problem could be that the content is built too directly from keyword research: the highest-volume, most relevant keyword becomes the H1, lower-volume keywords go in the subheaders (either alone or in a sentence), and keywords are sprinkled naturally throughout the text.

I suspect the problem is that I don't bring anything new to the text, since I base it on search volumes. Most content I create ends up in Google's top 10, but I lost quite a few #1 positions with the spam update.

Guest Posts Continue To Drive Results

While algorithm changes can stir up confusion and speculation, our ongoing data analysis reveals the continued effectiveness of guest posting in achieving SEO goals. Let’s take a look at real-world examples from our campaigns since the latest update.

  • Plumber Client: This client engaged in various link building and content creation campaigns, including a DA40+ guest post. Within a month, their target keyword ranking (medium difficulty) skyrocketed and remains strong. They now hold a top 3 position for their primary money keyword.
  • News Media Website: By strategically utilizing a DA30+ guest post alongside other SEO strategies (content and foundational link building), this website achieved page 1 ranking for a new keyword they were targeting.
  • Home Improvement Website: This website leveraged a DA40+ guest post to climb the SERPs steadily, reaching position #6 even during the Google update rollout.
  • Plumbing Parts E-commerce: After experiencing a slowdown in growth, this e-commerce store utilized diverse guest posts to regain momentum and recover rankings while the Google update was underway.

These successful cases demonstrate that high-quality guest posting remains a powerful SEO driver, aligning perfectly with Google’s updated guidelines.

The Big Picture

I work quite a bit with Python and the Google Search Console API, and I pulled much of this data that way.
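For readers who want to replicate this kind of pull, here is a minimal sketch of a Search Analytics query using google-api-python-client. The property URL, date range, and key.json credential file are placeholders, and it assumes a service account with read access to the Search Console property.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build  # pip install google-api-python-client

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]

# Placeholder credential file; the service account needs read access to the property.
creds = service_account.Credentials.from_service_account_file("key.json", scopes=SCOPES)
service = build("searchconsole", "v1", credentials=creds)

# Pull daily query-level data spanning the HCU and link spam update rollouts.
request = {
    "startDate": "2022-12-01",
    "endDate": "2023-01-12",
    "dimensions": ["date", "query"],
    "rowLimit": 1000,
}
response = service.searchanalytics().query(
    siteUrl="https://example.com/",  # placeholder property
    body=request,
).execute()

for row in response.get("rows", []):
    print(row["keys"], row["clicks"], row["impressions"], round(row["position"], 1))
```

Comparing click and position curves before and after each rollout date is what makes it possible to attribute a drop to one update rather than the other.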

Many sites have benefited from the update, but it's difficult to judge what really contributed to which change. This is not a statistically significant study, nor something that can be measured cleanly, since the sites are in different niches, were created by different writers, and differ in keyword-research methodology, on-site SEO, and so on.

I’m going with my gut feeling by looking at the sites that have benefited the most vs. those that have been hit the most.

And one more thing: after the update, I created many programmatic sites to test different hypotheses about spam sites.

Things that I’m 100% certain of:

  • Just because one website has been hit doesn’t mean other websites under the same Google Search Console account will be affected
  • If you start a site today, blast 10,000 AI articles onto it on day 1, and force indexation using GSC or the Instant Indexing API, this can happen: https://i.imgur.com/dhjenAN.png (a rough sketch of that forced-indexing workflow follows this list)
  • If you do the same but let Google index the site slowly, you won’t get penalized (at least so far, on every site I’ve tried)
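For context, "forcing indexation" through Google's Indexing API looks roughly like the sketch below. This is an illustration of the workflow the test used, not a recommendation: Google officially limits this API to JobPosting and BroadcastEvent pages, and the key file and URL list here are placeholders.

```python
from google.oauth2 import service_account
from google.auth.transport.requests import AuthorizedSession

SCOPES = ["https://www.googleapis.com/auth/indexing"]
ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

# Placeholder service-account key; the account must be an owner of the GSC property.
creds = service_account.Credentials.from_service_account_file("key.json", scopes=SCOPES)
session = AuthorizedSession(creds)

urls = ["https://example.com/post-1/", "https://example.com/post-2/"]  # placeholder URLs
for url in urls:
    # URL_UPDATED tells Google the page is new or changed and should be (re)crawled.
    resp = session.post(ENDPOINT, json={"url": url, "type": "URL_UPDATED"})
    print(url, resp.status_code)
```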

Things I’m 90% certain of:

  • Google is not detecting AI content per se, but penalizing thin content, most likely by using simple NLP (Natural Language Processing) algorithms to detect keyword stuffing, repetition, etc. (a toy illustration of such checks follows this list).
  • Google is not penalizing sites that have very strong backlinks, aged domains, and organic traffic, even if the content is thin, provided the number of pages correlates well with traffic (see the next point).
  • Google is targeting websites where the ratio of traffic to indexed pages is poor. One million indexed pages and only 500k monthly visits works out to 0.5 visits per page; that site is probably going to get hit.
  • On white-hat sites, removing thin content and adding solid content, then building links to it, already seems to help lift the penalization.
  • Sites in very valuable and YMYL niches have been hit harder: tech, pets, VPNs, etc. It seems Google has handed these positions to big, well-funded “network” publishers like Dotdash Meredith.
  • Low-effort AI sites in niches with zero competition are doing fine.
  • Google has updated its algorithm to affect non-English websites too; there are no more easy rankings with very low-quality content in French, German, Italian, etc.
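Nobody outside Google knows what signals SpamBrain actually computes, but the kind of simple frequency check described in the first bullet above is easy to illustrate. Below is a toy sketch; the thresholds are invented purely for the example.

```python
import re
from collections import Counter


def stuffing_signals(text: str, top_n: int = 5) -> dict:
    """Crude thin-content heuristics: top-term concentration and sentence repetition."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return {"top_term_share": 0.0, "duplicate_sentence_ratio": 0.0}
    counts = Counter(words)
    # Share of all words taken up by the top N terms; high values suggest stuffing.
    top_share = sum(c for _, c in counts.most_common(top_n)) / len(words)
    sentences = [s.strip() for s in re.split(r"[.!?]+", text) if s.strip()]
    # Fraction of sentences that are exact duplicates of an earlier one.
    dup_ratio = 1 - len(set(sentences)) / len(sentences) if sentences else 0.0
    return {"top_term_share": top_share, "duplicate_sentence_ratio": dup_ratio}


# Invented thresholds, purely for illustration.
signals = stuffing_signals(open("article.txt").read())
if signals["top_term_share"] > 0.35 or signals["duplicate_sentence_ratio"] > 0.2:
    print("Possible keyword stuffing / repetition:", signals)
```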

Things that I’m guessing:

My guess is that penalized domains are somehow marked as “bad,” even though no manual penalty appears in Google Search Console, and that a lot of bad articles alongside a few good articles on a site will drag down the performance of the good articles.

Sites with weak E-A-T, ugly themes, and stock pictures have been hit harder. Possibly because of manual quality ratings?

What Can Business Sites Do? 

  • Following Google’s spam update, business sites are responsible for tagging any links that have been placed in exchange for money. 
  • Different rel attribute values (sponsored, nofollow, ugc) exist to flag different kinds of non-editorial links; a small audit sketch follows this list. 
  • Business sites that rely mostly on guest blogs should diversify their link profile. 
  • After the update, monitor your business site’s rankings. If the damage is severe, consult a qualified SEO professional. 
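As a quick illustration of the tagging point, the sketch below fetches a page and flags outbound links to known paid partners that are missing a sponsored or nofollow value in their rel attribute. PAID_DOMAINS and the page URL are placeholders you would supply.

```python
import requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4

# Placeholder: domains you know you were paid to link to.
PAID_DOMAINS = {"sponsor-example.com"}


def audit_outbound_links(page_url: str) -> None:
    """Print paid links on the page that lack a sponsored/nofollow rel value."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for a in soup.find_all("a", href=True):
        rel = set(a.get("rel") or [])  # rel is a multi-valued attribute
        href = a["href"]
        if any(domain in href for domain in PAID_DOMAINS) and not rel & {"sponsored", "nofollow"}:
            print(f"Untagged paid link: {href} (rel={sorted(rel) or 'none'})")


audit_outbound_links("https://example.com/blog/post/")  # placeholder URL
```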

Your business can get help from LEIS to curate websites that always comply with Google updates.

Author: Christopher Smith

SEO and link-building expert with more than 7 years of experience in website search engine optimization and a specialty in backlink promotion. He is head of link-building products at GREAT Guest Posts, a global link-building platform. He regularly speaks at SEO conferences and hosts webinars on website optimization, marketing tools, and backlink-promotion strategies and trends.
