Data Seemingly Proves Googlebot Crawling has Slowed


A lot of data circulating in the SEO community suggests that Googlebot has reduced its website crawling.

For the past few weeks, Googlebot's slowed crawling has been a hot topic across social media.


The indexing downturn doesn't affect every site, but many users on Twitter and Reddit have noticed changes in Google's crawling behavior. They support their claims with screenshots of Googlebot activity.

Supporting Proofs

Informal reports of Googlebot crawling anomalies have been piling up on social media. But when it comes to reliability, social media is not a dependable source of information: anyone can post an observation, and people will support it.

That said, anecdotal evidence is not always useless. But data-backed analyses are better proof for any claim, and Twitter has been filling up with those recently.

The founder of Seolyzer, a crawling and log analysis service, posted a graph of Google crawling activity capturing the dramatic drop in crawling that began on November 11.

The post read:

“Googlebot is on strike! Googlebot has drastically reduced its crawl activity on many large sites since November 11 at 6 PM (GMT).”

304 Server Response Code And Crawling Of Googlebot

Some noticed that Googlebot is not crawling pages that serve a 304 server response code. 

A 304 (Not Modified) response code is returned by a server when a browser requests a page it has already cached and the page has not changed since.


This means the browser (or Googlebot) tells the server that it has saved a copy of the page in its cache, and asks the server not to resend the page unless it has been modified.

Definition Of The 304 (Not Modified) Response Code By The HTTP Working Group:

“The 304 (Not Modified) status code indicates that a conditional GET or HEAD request has been received and would have resulted in a 200 (OK) response if it were not for the fact that the condition evaluated to false.

In other words, there is no need for the server to transfer a representation of the target resource because the request indicates that the client, which made the request conditional, already has a valid representation;

the server is therefore redirecting the client to make use of that stored representation as if it were the payload of a 200 (OK) response.”
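In practice, the client sends an `If-None-Match` (ETag) or `If-Modified-Since` header, and the server answers 304 with no body when nothing has changed. Here is a minimal sketch of that server-side decision; the function name and the simplified matching are illustrative, not Googlebot's actual logic:

```python
from email.utils import parsedate_to_datetime

def conditional_get_status(etag, last_modified,
                           if_none_match=None, if_modified_since=None):
    """Decide between 200 and 304 for a conditional GET.

    etag / last_modified describe the server's current resource;
    if_none_match / if_modified_since come from the client's request.
    Per RFC 7232, If-None-Match takes precedence over If-Modified-Since.
    (Hypothetical helper for illustration only.)
    """
    if if_none_match is not None:
        # "*" matches any current representation; otherwise compare ETags.
        candidates = [t.strip() for t in if_none_match.split(",")]
        if if_none_match == "*" or etag in candidates:
            return 304
        return 200
    if if_modified_since is not None and last_modified is not None:
        # Not modified since the client's cached copy -> 304.
        if parsedate_to_datetime(last_modified) <= parsedate_to_datetime(if_modified_since):
            return 304
    return 200
```

A crawler that receives 304 can keep its cached copy and skip re-fetching the body, which is why this status code is central to the crawl-budget discussion here.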

304 Response Codes Linked To Reduced Googlebot Crawling

One person tweeted (in French) that they had seen a drop in crawling across various AMP pages that responded with a 304 response code.

Seolyzer's founder then responded with another graph showing how Googlebot had almost stopped crawling pages that responded with a 304 server response code.

This started a thread in which some users noticed reduced crawling for pages that respond with a 304 response code, while others noticed that crawling had decreased for travel pages and increased for e-commerce pages. More data backing up the subject is still coming in.

304 Response Code And Crawling 

Google's documentation on Googlebot crawling states that a 304 response code should not affect crawling:

“Googlebot signals the indexing pipeline that the content is the same as last time it was crawled.

The indexing pipeline may recalculate signals for the URL, but otherwise, the status code has no effect on indexing.”

It's possible that Google has since changed this behavior.

What Does Cookie Consent Theory Say?

The 304 response theory is one of many theories attempting to explain the reduced crawling.


Another theory that points in the same direction is the cookie consent theory.

One person posted: 

Google not crawling and indexing new pages anymore? I had the same problem and removed the cookie consent bar (Cookiepro) to test.

Guess what – problem solved. @JohnMu – any ideas why Google might not crawl and index new pages with a cookie-consent popup?

So perhaps the cookie consent bar triggered the 304 response code.

Reddit Threads

Reddit also discussed Googlebot's reduced crawling. A Redditor mentioned how Google had previously been quick to index articles on their successful sites.

And now the situation is: 

“For whatever reason now, less than half of our new articles are indexing, even with me manually submitting them all right after publishing.”

The thread continued with:

“A lot of people are experiencing similar right now… Something seems to be going on with Google.”

“Something is up with Google indexing new posts….”

“My website is 17 years old… suddenly, the latest article took weeks to get indexed.”

Google’s Response

Google's John Mueller said:

“I don’t see anything broken in the way Google indexes stuff at the moment. I do see us being critical about what we pick up for indexing, though, as any search engine should.”

See also: Google Changes More Structured Data Requirements

Something New?

As we know, Microsoft and Yandex announced IndexNow for crawling and indexing, and Google confirmed it is testing the new protocol for sustainability reasons. So maybe that's what's causing the issue.
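For context, IndexNow lets a site push changed URLs to participating search engines instead of waiting to be crawled. A sketch of the bulk-submission body described by the public IndexNow specification follows; the host, key, and URLs are placeholder values, and actually sending the HTTP POST is omitted:

```python
import json

def build_indexnow_payload(host, key, urls, key_location=None):
    """Build the JSON body for a bulk IndexNow submission.

    Per the IndexNow spec, the body names the host, the site's
    verification key, and the list of changed URLs; keyLocation
    is optional. Values below are illustrative placeholders.
    """
    payload = {"host": host, "key": key, "urlList": list(urls)}
    if key_location is not None:
        payload["keyLocation"] = key_location
    return json.dumps(payload)

# Example payload for a hypothetical site:
body = build_indexnow_payload(
    "www.example.com",
    "abc123",  # placeholder key, hosted as a .txt file on the site
    ["https://www.example.com/new-article"],
)
```

The resulting JSON would be POSTed to a participating engine's `/indexnow` endpoint with a `Content-Type: application/json; charset=utf-8` header.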

Although many claim that Google has stopped indexing their pages, that's not entirely true. But there is a significant amount of data supporting changes in Googlebot's indexing patterns.
