
Google Search Leak — What We Learned

About a week ago, a massive leak of Google Search documentation was discovered. Thousands of pages of internal documentation on the Google ranking algorithm had been accidentally published by one of Google's own automated bots. The documentation does not include code, but it does include variable names and brief descriptions of what they do.

Some have said the leak could have negative consequences, but I think it actually revealed that Google Search is better than we thought at filtering out bad content. It also surfaced useful information that honest SEO developers can put to work, and it may push Google to be more transparent about its ranking methods in the future.

Google API Data Leak Docs
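
If you want to dig through the leaked attribute documentation yourself, a little scripting goes a long way. Below is a minimal sketch, assuming you have exported the docs locally as JSON files with "name" and "description" fields per attribute. That layout is my assumption for illustration, not the format the leak actually shipped in, so the loading step would need to match however you saved your copy.

```python
# Illustrative sketch only. Assumes the leaked module docs were exported
# locally as JSON files shaped like
# {"attributes": [{"name": "...", "description": "..."}, ...]}.
# That layout is an assumption, not the leak's actual format.
import json
from pathlib import Path


def find_attributes(docs_dir: str, keyword: str):
    """Yield (file, attribute name, description) for every attribute whose
    name or description mentions the keyword, case-insensitively."""
    needle = keyword.lower()
    for path in Path(docs_dir).glob("*.json"):
        data = json.loads(path.read_text(encoding="utf-8"))
        for attr in data.get("attributes", []):
            name = attr.get("name", "")
            desc = attr.get("description", "")
            if needle in name.lower() or needle in desc.lower():
                yield path.name, name, desc


if __name__ == "__main__":
    # For example, look up the smallPersonalSite attribute discussed below.
    for file_name, name, desc in find_attributes("leak_docs", "smallPersonalSite"):
        print(f"{file_name}: {name}: {desc}")
```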

A Note About Small Site Ranking Concerns Caused by the Google Search Leak

Questions have increasingly been raised about smaller sites losing more and more traffic, and some people believe those concerns were confirmed by the fact that the leak references a variable called smallPersonalSite.

smallPersonalSite variable from the Google Search leak
This attribute appears only in the section covering Product Review sites

I do not necessarily believe that is true across the board. Product reviews are one area where it definitely is, as large media outlets keep publishing lists of “best [product]” and often recommend either random products or products that pay them money.

However, I haven’t seen evidence of it myself for other websites like Info Toast that aren’t in that business. I also know that small business sites almost always outrank their Yelp listings or any other third-party page about that business.

Things We Learned from the Google Search Data Leak

Google is Punishing AI-Generated Content

There is a variable called contentEffort, which appears to gauge how much effort went into writing a given article.

This likely utilizes their AI content detection algorithms.
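
Nothing in the leak says how contentEffort is actually computed, so treat the following as a toy illustration of the kind of surface signals an “effort” estimate could combine: length, vocabulary variety, and sentence structure. Every feature and weight here is my own assumption, not Google’s.

```python
import re


def effort_score(text: str) -> float:
    """Toy 0-1 'effort' estimate built from simple text statistics.
    This is NOT Google's contentEffort; the features and weights are
    arbitrary choices made purely for illustration."""
    words = re.findall(r"[A-Za-z']+", text)
    if not words:
        return 0.0
    word_count = len(words)
    unique_ratio = len({w.lower() for w in words}) / word_count
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    avg_sentence_len = word_count / max(len(sentences), 1)

    length_component = min(word_count / 1500, 1.0)        # saturates around 1500 words
    vocab_component = min(unique_ratio / 0.6, 1.0)        # ~60% unique words scores full marks
    sentence_component = 1.0 - min(abs(avg_sentence_len - 18) / 18, 1.0)  # prefer ~18-word sentences

    return round(0.4 * length_component
                 + 0.35 * vocab_component
                 + 0.25 * sentence_component, 3)


if __name__ == "__main__":
    thin = "Best blender 2024. Buy now. Great blender. Buy now."
    print(effort_score(thin))  # short, repetitive copy gets a low score
```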

Pages Should be Regularly and Meaningfully Updated

In years past, it was believed that simply changing a page’s “last updated” date could help it rank. However, the Google Search leak includes a variable called lastSignificantUpdate, which appears to carry more weight in ranking than the most recent cosmetic “update”.

Class from the Google Search leak: object for determining crawl times
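
We do not know how Google decides what counts as “significant”, but the general idea is easy to demonstrate: compare the actual body text of the two versions and ignore cosmetic edits such as a swapped date. Here is a minimal sketch using Python’s difflib; the 0.9 similarity threshold is my own placeholder, not a number from the leak.

```python
from difflib import SequenceMatcher


def is_significant_update(old_body: str, new_body: str, threshold: float = 0.90) -> bool:
    """Treat an edit as significant only when the body text similarity drops
    below the threshold, i.e. the content changed, not just a date stamp.
    The threshold is an arbitrary placeholder for illustration."""
    similarity = SequenceMatcher(None, old_body, new_body).ratio()
    return similarity < threshold


if __name__ == "__main__":
    old = ("Our complete guide to home espresso machines covers grinders, "
           "tampers, and milk-frothing technique in detail. Updated May 2023.")
    date_only = old.replace("May 2023", "June 2024")   # cosmetic refresh
    rewrite = old + (" We retested every machine on this list and added "
                     "three new budget picks with sample shot timings.")

    print(is_significant_update(old, date_only))  # False: a date swap alone is not significant
    print(is_significant_update(old, rewrite))    # True: meaningful new content was added
```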

Purchased Backlinks Can Get You Penalized

Backlink “packages” you can buy on the internet can be very dangerous, because Google can buy those same packages and punish the sites that purchased them. We already suspected this, but since the Google Search leak we know it without a doubt. Purchased link schemes can seriously hurt your website and should be avoided.

I recommend teaming up with other SEO strategists to create a private backlink-sharing group. Alternatively, you can use the Backlink Software I will be publishing to Info Toast Tools in a few months, which partners local sites with one another to build backlinks.

Conclusions

Over the past year, SEO professionals have been pushing the idea that the answer to ranking higher in search results is simply creating good content that helps people.

The Google Algorithm Leak showed us what Google is doing to enforce content quality. Making sure high-quality content rises to the top of the search results matters not only for Google’s ability to compete with other search engines, but also for the people doing the searching.
