Data Seemingly Proves Googlebot Crawling Has Slowed
A lot of data has been circulating in the SEO community suggesting that Googlebot has reduced its website crawling.
For the past few weeks, Googlebot’s slowed crawling has been a hot topic across social media.
The slowdown doesn’t affect every site, yet many people on Twitter and Reddit have noticed changes in Google’s crawling behavior, and they support their claims with screenshots of Googlebot activity.
Contents
- 1 Supporting Evidence
- 2 304 Server Response Code And Crawling Of Googlebot
- 3 Definition Of 304 (Not Modified) Response Code By The HTTP Working Group
- 4 304 Response Codes Linked To Reduced Googlebot Crawling
- 5 304 Response Code And Crawling
- 6 What Does Cookie Consent Theory Say?
- 7 Reddit Threads
- 8 Google’s Response
- 9 Something New?
Supporting Evidence
Informal reports of slowed Googlebot crawling have been piling up on social media. But social media is not a very reliable source of information: anyone can post an observation, and others will chime in to support it.
That said, anecdotal evidence is not always useless. Data-backed analyses make for stronger proof of any claim, though, and Twitter has been filling up with those recently.
The founder of Seolyzer, a crawl and log analysis service, posted a graph of Google crawling activity showing a dramatic drop in crawling beginning on November 11.
The post reads:
“Googlebot is on strike! Googlebot has drastically reduced its crawl activity on many large sites since November 11 at 6 PM (GMT).”
304 Server Response Code And Crawling Of Googlebot
Some noticed that Googlebot is no longer crawling pages that serve a 304 server response code.
A 304 is a response code a server returns when a browser (or Googlebot) makes a conditional request for a page.
In effect, the client tells the server that it already has a copy of the page in its cache and asks for the full page only if it has been modified since; if it has not, the server answers with 304 and sends nothing new.
Definition Of 304 (Not Modified) Response Code By The HTTP Working Group
“The 304 (Not Modified) status code indicates that a conditional GET or HEAD request has been received and would have resulted in a 200 (OK) response if it were not for the fact that the condition evaluated to false.
In other words, there is no need for the server to transfer a representation of the target resource because the request indicates that the client, which made the request conditional, already has a valid representation;
the server is therefore redirecting the client to make use of that stored representation as if it were the payload of a 200 (OK) response.”
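To make the mechanism concrete, here is a minimal sketch of a conditional GET using the Python requests library. The URL is a placeholder, and Googlebot’s own fetching stack is not public; this only illustrates the protocol exchange described above.

```python
# Minimal sketch of a conditional GET (hypothetical URL, not Googlebot's actual client).
import requests

url = "https://example.com/some-article"

# First fetch: the server answers 200 and (ideally) includes a Last-Modified header.
first = requests.get(url)
last_modified = first.headers.get("Last-Modified")

# Repeat the fetch the way a cache or crawler would: ask for the page only if it changed.
headers = {"If-Modified-Since": last_modified} if last_modified else {}
second = requests.get(url, headers=headers)

if second.status_code == 304:
    print("304 Not Modified: reuse the cached copy; no body was transferred.")
else:
    print(f"Status {second.status_code}: content changed, or the server ignored the condition.")
```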
304 Response Codes Linked To Reduced Googlebot Crawling
One person tweeted (in French) that they had seen a crawl drop on various AMP pages that responded with a 304 response code.
The Seolyzer founder then replied with another post containing a graph showing that Googlebot had almost stopped crawling pages that responded with a 304.
This started a thread in which some reported reduced crawling of pages responding with a 304, while others noticed that crawling had decreased for travel pages and increased for e-commerce pages. More data backing up the observation keeps coming in.
304 Response Code And Crawling
Google’s documentation on how Googlebot handles HTTP status codes states that a 304 response should not affect how a page is indexed.
“Googlebot signals the indexing pipeline that the content is the same as last time it was crawled.
The indexing pipeline may recalculate signals for the URL, but otherwise, the status code has no effect on indexing.”
It is possible, though, that Google has changed how this is handled.
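For reference, this is roughly how a site ends up serving a 304 in the first place: a simplified sketch using Python’s standard http.server, with a hard-coded modification date and page body standing in for real content.

```python
# Simplified sketch of server-side 304 handling (hard-coded date and body are placeholders).
from datetime import datetime, timezone
from email.utils import format_datetime, parsedate_to_datetime
from http.server import BaseHTTPRequestHandler, HTTPServer

LAST_MODIFIED = datetime(2021, 11, 1, tzinfo=timezone.utc)  # when the page last changed
BODY = b"<html><body>Article content that has not changed.</body></html>"

class ConditionalHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        since = self.headers.get("If-Modified-Since")
        if since:
            try:
                if parsedate_to_datetime(since) >= LAST_MODIFIED:
                    # The client's cached copy is still current: answer 304 with no body.
                    self.send_response(304)
                    self.end_headers()
                    return
            except (TypeError, ValueError):
                pass  # Unparseable date: fall through and serve the full page.
        self.send_response(200)
        self.send_header("Last-Modified", format_datetime(LAST_MODIFIED, usegmt=True))
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(BODY)))
        self.end_headers()
        self.wfile.write(BODY)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), ConditionalHandler).serve_forever()
```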
What Does Cookie Consent Theory Say?
The 304 response theory is one of several theories put forward to explain the reduced crawling.
Another is the cookie consent theory.
One person posted:
Google not crawling and indexing new pages anymore? I had the same problem and removed the cookie consent bar (Cookiepro) to test.
Guess what – problem solved. @JohnMu – any ideas why Google might not crawl and index new pages with a cookie-consent popup?
So, perhaps the cookie consent bar is what triggered the 304 response code.
Reddit Threads
Reddit has also discussed Googlebot’s reduced crawling. A Redditor described how Google used to be quick to index articles on their well-performing sites.
Now the situation is:
“For whatever reason now, less than half of our new articles are indexing, even with me manually submitting them all right after publishing.”
The thread continues with:
“A lot of people are experiencing similar right now… Something seems to be going on with Google.”
“Something is up with Google indexing new posts….”
“My website is 17 years old… suddenly, the latest article took weeks to get indexed.”
Google’s Response
Google’s John Mueller said:
“I don’t see anything broken in the way Google indexes stuff at the moment. I do see us being critical about what we pick up for indexing, though, as any search engine should.”
Something New?
As we know, Microsoft Bing and Yandex recently announced IndexNow, a protocol for pushing new and updated URLs directly to search engines, and Google confirmed it is testing the protocol for sustainability reasons. Maybe that is what’s behind the change.
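For illustration only, this is roughly what an IndexNow submission looks like under the published protocol; the host, key, and URL below are placeholders, and whether Google will accept such pings is exactly what its test is meant to determine.

```python
# Hypothetical IndexNow ping (placeholder host, key, and URL).
import json
import urllib.request

payload = {
    "host": "www.example.com",
    "key": "your-indexnow-key",  # the key must also be served as a .txt file on the site
    "keyLocation": "https://www.example.com/your-indexnow-key.txt",
    "urlList": ["https://www.example.com/new-article"],
}

request = urllib.request.Request(
    "https://api.indexnow.org/indexnow",  # shared endpoint; participating engines share submissions
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json; charset=utf-8"},
    method="POST",
)

with urllib.request.urlopen(request) as response:
    print(response.status)  # 200/202 indicates the submission was accepted
```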
Although many claim that Google has stopped indexing their pages outright, that isn’t entirely true. There is, however, a significant amount of data supporting a change in Googlebot’s crawling and indexing patterns.