Spam sites attempt to game their way to the top of search results through techniques like repeating keywords over and over, buying links that pass PageRank, or putting invisible text on the screen. This is bad for search because relevant websites get buried, and it’s bad for legitimate website owners because their sites become harder to find. The good news is that Google’s algorithms can detect the vast majority of spam and demote it automatically. For the rest, we have teams who manually review sites.

Identifying Spam

Spam sites come in all shapes and sizes. Some are automatically generated gibberish that no human could make sense of; others use subtler techniques. Check out these examples of “pure spam,” sites that use the most aggressive spam techniques. This is a stream of live spam screenshots that we’ve manually identified and recently removed from appearing in search results.

*We’ve removed some pornographic content and malware from this demo, but otherwise this is an unfiltered stream of fresh English examples of “pure spam” removals.

Types of Spam

In addition to the spam shown above, here are some other types of spam that we detect and take action on.

Cloaking and/or sneaky redirects

Site appears to be cloaking (showing different content to human users than to search engines) or redirecting users to a different page than the one Google saw.
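
To make the idea concrete, here is a minimal sketch of how cloaking and sneaky redirects could be detected in principle: fetch the same URL as a browser and as a crawler, then compare the final URLs and page contents. This is an illustration only, not our production system; the user-agent strings and the 0.8 similarity threshold are assumptions.

```python
# Minimal sketch of cloaking/sneaky-redirect detection: fetch the same URL
# with a browser-like and a crawler-like User-Agent, then compare where we
# ended up and what we got. Illustrative only; the user-agent strings and
# the 0.8 similarity threshold are assumptions, not production values.
import urllib.request
from difflib import SequenceMatcher

BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
CRAWLER_UA = "Googlebot/2.1 (+http://www.google.com/bot.html)"

def fetch(url: str, user_agent: str) -> tuple[str, str]:
    """Return (final URL after redirects, decoded page body)."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.geturl(), resp.read().decode("utf-8", errors="replace")

def looks_cloaked(url: str, threshold: float = 0.8) -> bool:
    """Flag the page if the two fetches land on different URLs or
    return substantially different content."""
    browser_url, browser_body = fetch(url, BROWSER_UA)
    crawler_url, crawler_body = fetch(url, CRAWLER_UA)
    if browser_url != crawler_url:          # sneaky-redirect suspect
        return True
    similarity = SequenceMatcher(None, browser_body, crawler_body).ratio()
    return similarity < threshold           # cloaking suspect

if __name__ == "__main__":
    print(looks_cloaked("https://example.com/"))
```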

Hacked site

Some pages on this site may have been hacked by a third party to display spammy content or links. Website owners should take immediate action to clean their sites and fix any security vulnerabilities.

Hidden text and/or keyword stuffing

Some of the pages may contain hidden text and/or keyword stuffing.
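
As a rough illustration of how these two techniques might be spotted (not the signals we actually use), the sketch below flags text hidden with inline styles and measures how much of a page a single keyword accounts for; the style patterns and the 5% density cutoff are assumptions.

```python
# Toy heuristics for hidden text and keyword stuffing. The inline-style
# patterns and the 5% density cutoff are illustrative assumptions; this
# sketch also ignores external CSS and unclosed void tags like <br>.
import re
from collections import Counter
from html.parser import HTMLParser

HIDING_STYLES = ("display:none", "visibility:hidden",
                 "font-size:0", "text-indent:-9999")

class HiddenTextFinder(HTMLParser):
    """Collect text inside elements whose inline style hides them."""
    def __init__(self):
        super().__init__()
        self.stack = []        # one flag per open tag: does it hide text?
        self.hidden_text = []

    def handle_starttag(self, tag, attrs):
        style = (dict(attrs).get("style") or "").replace(" ", "").lower()
        self.stack.append(any(p in style for p in HIDING_STYLES))

    def handle_endtag(self, tag):
        if self.stack:
            self.stack.pop()

    def handle_data(self, data):
        if any(self.stack) and data.strip():
            self.hidden_text.append(data.strip())

def keyword_density(text: str) -> tuple[str, float]:
    """Return the most frequent word and its share of all words."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return "", 0.0
    word, count = Counter(words).most_common(1)[0]
    return word, count / len(words)

page = '<p style="display:none">cheap watches cheap watches cheap watches</p>'
finder = HiddenTextFinder()
finder.feed(page)
word, share = keyword_density(" ".join(finder.hidden_text))
print(finder.hidden_text, word, share > 0.05)  # flag if one word dominates
```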

Parked domains

Parked domains are placeholder sites with little unique content, so Google doesn’t typically include them in search results.

Pure spam

Site appears to use aggressive spam techniques such as automatically generated gibberish, cloaking, scraping content from other websites, and/or repeated or egregious violations of Google’s Webmaster Guidelines.

Spammy free hosts and dynamic DNS providers

Site is hosted by a free hosting service or dynamic DNS provider that has a significant fraction of spammy content.

Thin content with little or no added value

Site appears to consist of low-quality or shallow pages which do not provide users with much added value (such as thin affiliate pages, doorway pages, cookie-cutter sites, automatically generated content, or copied content).
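
Copied content in particular lends itself to a well-known textbook check: break two pages into overlapping word “shingles” and measure their Jaccard overlap. The sketch below uses an assumed shingle width of four words and a 0.9 threshold, not our actual parameters.

```python
# Textbook near-duplicate check for copied content: w-shingling plus
# Jaccard similarity. The shingle width (4 words) and the 0.9 threshold
# are assumptions for illustration, not our actual parameters.
import re

def shingles(text: str, w: int = 4) -> set[tuple[str, ...]]:
    """Break text into overlapping runs of w consecutive words."""
    words = re.findall(r"\w+", text.lower())
    return {tuple(words[i:i + w]) for i in range(len(words) - w + 1)}

def jaccard(a: set, b: set) -> float:
    """Fraction of shingles the two pages share: |A & B| / |A | B|."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def looks_copied(page: str, source: str, threshold: float = 0.9) -> bool:
    return jaccard(shingles(page), shingles(source)) >= threshold

print(looks_copied("the quick brown fox jumps over the lazy dog",
                   "the quick brown fox jumps over the lazy dog"))  # True
```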

Unnatural links from a site

Google detected a pattern of unnatural, artificial, deceptive or manipulative outbound links on this site. This may be the result of selling links that pass PageRank or participating in link schemes.

Unnatural links to a site

Google has detected a pattern of unnatural, artificial, deceptive or manipulative links pointing to the site. These may be the result of buying links that pass PageRank or participating in link schemes.
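
Both of these categories turn on links that “pass PageRank.” As a rough sketch of why a bought link matters, here is the classic PageRank power iteration on a toy three-page graph; the damping factor 0.85 comes from the original PageRank paper, while the graph and iteration count are made up for illustration.

```python
# Toy PageRank power iteration showing how a link "passes" score from one
# page to another. 0.85 is the damping factor from the original PageRank
# paper; the graph and iteration count are made up for illustration.
def pagerank(links: dict[str, list[str]], d: float = 0.85,
             iters: int = 50) -> dict[str, float]:
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1.0 - d) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                 # dangling page: spread evenly
                for p in pages:
                    new[p] += d * rank[page] / n
            else:                            # each outlink passes a share
                share = d * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
        rank = new
    return rank

# "seller" links to "buyer", handing it a share of rank every iteration;
# this flow is exactly what paid links and link schemes try to manipulate.
graph = {"buyer": [], "seller": ["buyer"], "bystander": ["seller"]}
print(pagerank(graph))
```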

User-generated spam

Site appears to contain spammy user-generated content. The problematic content may appear on forum pages, guestbook pages, or user profiles.
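
As a toy illustration of the kind of signal involved (not our actual filters), a comment or profile can be scored by how many links it carries and whether it matches common spam phrases; the link limit and the phrase list below are assumptions.

```python
# Toy heuristic for user-generated spam: flag comments or profiles that
# carry many links or match common spam phrases. The link limit and the
# phrase list are illustrative assumptions, not our actual filters.
import re

SPAM_PHRASES = ("buy now", "free download", "cheap pills")  # assumed list

def looks_like_comment_spam(comment: str, max_links: int = 2) -> bool:
    links = re.findall(r"https?://\S+", comment)
    lowered = comment.lower()
    return len(links) > max_links or any(p in lowered for p in SPAM_PHRASES)

print(looks_like_comment_spam(
    "Nice post! http://a.example http://b.example http://c.example"))  # True
```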

Taking Action

While our algorithms handle the vast majority of spam, we take manual action on the rest to prevent it from affecting the quality of your results. This graph shows the number of domains affected by a manual action over time, broken down by spam type. The numbers may look large out of context, but the web is a really big place. A recent snapshot of our index showed that about 0.22% of domains had been manually marked for removal.

Manual Action by Month

Milestones for manual spam fighting

February 2005

We expanded our manual spam-fighting team to Hyderabad, India.

March 2005

We expanded our manual spam-fighting team to Dublin, Ireland.

April 2006

We expanded our manual spam-fighting team to Tokyo, Japan.

June 2006

We expanded our manual spam-fighting team to Beijing, China.

October 2007 - Legacy

In the fall of 2007, we changed our classification system to keep data in a more structured format based on the type of webspam violation (which allowed us to create this chart). Actions that couldn't be categorized appropriately into the new system are in the “legacy” category. We were still taking action on spam types like thin affiliates and cloaking prior to this time, but the breakdown by spam type isn't readily available for the older data.

October 2009 - Unnatural links from your site

Improvements in our systems allowed us to reduce the number of actions taken on sites with unnatural outbound links.

November 2009 - Hacked sites

We noticed an increase in hacked sites and increased our efforts to prevent them from affecting search results.

February 2011 - Spammy free hosts and dynamic DNS providers

We increased enforcement of a policy to take action on free hosting services and dynamic DNS providers when a large fraction of their sites or pages violate our Webmaster Guidelines. This allows us to protect our users from seeing spam when taking action on the individual spammy accounts would be impractical.

October 2011 - Cloaking and/or sneaky redirects

We made a change to our classification system so that the majority of cloaking and sneaky redirect actions were labeled as “Pure spam.” Actions related to less egregious violations continue to be labeled separately.

October 2011 - Parked domains

We reduced our efforts to manually identify parked domains due to improvements in our algorithmic detection of these sites.

April 2012

We launched an algorithmic update codenamed “Penguin,” which decreases the rankings of sites that use webspam tactics.

Notifying Website Owners

When we take manual action on a website, we try to alert the site’s owner to help them address the issues. We want website owners to have the information they need to get their sites in shape. That’s why, over time, we’ve invested substantial resources in webmaster communication and outreach. The following graph shows the number of spam notifications sent to site owners through Webmaster Tools.

Messages by Month

History of webmaster communication

May 2007

At the time, we sent notifications only via email. In May 2007, webmasters reported receiving fake notifications of Webmaster Guidelines violations, and we temporarily paused our notifications in response while we worked on a new notification system.

July 2007

With the launch of the Message Center feature in Webmaster Tools, we resumed sending notifications after the May pause caused by email spoofing.

March 2010

We began using a new notification system which enabled us to more easily send messages to the Message Center of Webmaster Tools when we found spam. The first category of spam to use this new system was hacked sites.

July 2010

A bug in our hacked sites notification system reduced the number of messages that we sent to hacked sites.

November 2010

We upgraded our notification system. With this update, we fixed the hacked sites notification bug and began experimenting with sending messages for additional categories of spam such as unnatural links from a site.

February and March 2011

We expanded notifications to cover additional types of unnatural links to a site.

June 2011

We expanded the set of languages in which we send many of our messages.

September 2011

We changed our classification system for spam. Messages for some categories of spam were temporarily not sent while we created and translated new messages to fit the new categories.

November 2011

A bug in our hacked sites notification system reduced the number of messages that we sent to hacked sites.

December 2011

We expanded the categories of spam that we send notifications for to include pure spam and thin content.

February 2012

The bug affecting our hacked sites notifications was fixed.

Listening for Feedback

Manual actions don’t last forever. Once website owners clean up their sites to remove spammy content, they can ask us to review the sites again by filing a reconsideration request. We process all of the reconsideration requests we receive and communicate along the way to let site owners know how it’s going.

Historically, most sites that have submitted reconsideration requests are not actually affected by any manual spam action. Often these sites are simply experiencing the natural ebb and flow of online traffic, an algorithmic change, or perhaps a technical problem preventing Google from accessing site content. This chart shows the weekly volume of reconsideration requests since 2006.

Reconsideration Requests by Week

Notable moments for reconsideration requests

December 2006

A bug prevented us from properly storing reconsideration requests for about a week. On December 25th (Christmas), we submitted requests on behalf of sites affected by the bug, creating a small spike at the end of the year.

May/June 2007

Many webmasters received fake notifications of Webmaster Guidelines violations, leading an unusual number to file reconsideration requests.

December 2007

Every year webmasters submit fewer reconsideration requests during the late December holidays.

April 2009

We released a video with tips for reconsideration requests.

June 2009

We started sending responses to reconsideration requests to let webmasters know that their requests had been processed.

October 2010

We upgraded our notification system and started sending out more messages.

April 2011

We rolled out the Panda algorithm internationally. Site owners often file reconsideration requests when they see traffic changes that aren’t actually due to manual action.

April to September 2011

We started sending reconsideration responses with more information about the outcomes of reconsideration requests.

June 2012

We began sending messages for a wider variety of webspam issues. We now send notifications for all manual actions by the webspam team that may directly affect a site’s ranking in web search results.