We want to organize the world’s information. But what about malware? What about credit card numbers? There are many tricky issues we think about on a daily basis. Here you’ll find a list of policies organized around particular topic areas. We’re starting with policies related primarily to content removals, but this is a living document and we plan to update it over time. We’d love to get your feedback and suggestions.

Access to Information Comes First

We believe in free expression and the free flow of information. We try hard to make information available except in narrowly defined cases, such as spam, malware, legal requirements, and identity theft prevention.

Illustration of Access to Information Comes First

Algorithms Over Manual Action

The relevance and comprehensiveness of our search results are central to helping you find what you’re looking for. We prefer machine solutions to manually organizing information. Algorithms are scalable, so when we make an improvement, it makes things better not just for one search results page, but for thousands or millions. However, in certain cases we use manual controls when machine solutions aren’t enough. Learn more about Algorithms.

Illustration of Algorithms Over Manual Action

Exceptions Lists

Like most search engines, our algorithms sometimes misidentify sites, and we make limited exceptions to improve our search quality. For example, our SafeSearch algorithms are designed to protect children from adult content online. When one of these algorithms misidentifies a website (for example, essex.edu), we sometimes make a manual exception to prevent the site from being classified as pornography.

Illustration of Exceptions Lists

Fighting Spam and Malware

We hate spam as much as you do. It hurts users by cluttering search results with irrelevant links. We have teams that work to detect spammy websites and remove them from our results. The same goes for phishing websites and malware. Learn more about Fighting Spam.

Illustration of Fighting Spam and Malware

Transparency for Webmasters

We have clear Webmaster Guidelines defining best practices and spammy behavior. Whenever our manual spam team takes action that may directly affect a site’s ranking, we’re transparent about it and do our best to notify the webmaster. If we’ve taken manual action, webmasters can fix the problem and file a reconsideration request.

Illustration of Transparency for Webmasters

Preventing Identity Theft

Upon request, we’ll remove personal information from search results if we believe it could make you susceptible to specific harm, such as identity theft or financial fraud. This includes sensitive government ID numbers like U.S. Social Security Numbers, bank account numbers, credit card numbers and images of signatures. We generally don’t process removals of national ID numbers from official government websites because in those cases we consider the information to be public. We sometimes refuse requests if we believe someone is attempting to abuse these policies to remove other information from our results.

Illustration of Preventing Identity Theft

Legal Removals

Sometimes we remove content or features from our search results for legal reasons. For example, we’ll remove content if we receive valid notification under the Digital Millennium Copyright Act (DMCA) in the US. We also remove content from local versions of Google consistent with local law, when we’re notified that content is at issue. For example, we’ll remove content that illegally glorifies the Nazi party on google.de or that unlawfully insults religion on google.co.in. Whenever we remove content from our search results for legal reasons, we display a notification that results have been removed, and we report these removals to chillingeffects.org, a project run by the Berkman Center for Internet and Society, which tracks online restrictions on speech. We also disclose certain details about legal removals from our search results through our Transparency Report.

Illustration of Legal Removals

Fighting Child Exploitation

We block search results that lead to child sexual abuse imagery. This is a legal requirement and the right thing to do.

Illustration of Fighting Child Exploitation

Shocking Content

We want to make sure information is available when you’re looking for it, but we also want to be careful not to show potentially upsetting content when you haven’t asked for it. Accordingly, in a few narrowly defined categories, we may not trigger certain search features for queries where the results could be offensive.

Illustration of Shocking Content


SafeSearch

When it comes to information on the web, we leave it up to you to decide what’s worth finding. That’s why we have a SafeSearch filter, which gives you more control over your search experience by helping you avoid adult content if you’d rather not see it.

Illustration of SafeSearch

If you’d like to learn more about our policies, you can find more details here.