After Google's algorithm was questioned in the Google Webmaster Forum earlier this month, the search engine giant reported that it does not, in fact, maintain a "white-list for algorithms" — but a recent statement from Matt Cutts suggests otherwise.
Last week at SMX West, Barry Schwartz reported that the head of Google's webspam team, Matt Cutts, answered this very same question during the Spam Police panel. Schwartz describes his answer as long and detailed, but the bottom line is that Google does have whitelists, known within Google as "exception lists".
Since no search engine algorithm is 100% perfect, Google has created exception lists on a per-algorithm basis, which means that no site is ever completely secure in the SERPs. Google later addressed the issue in an official statement:
Our goal is to provide people with the most relevant answers as quickly as possible, and we do that primarily with computer algorithms. In our experience, algorithms generate much better results than humans ranking websites page by page. And given the hundreds of millions of queries we get every day, it wouldn't be feasible to handle them manually anyway. That said, we do sometimes take manual action to deal with problems like malware and copyright infringement. Like other search engines (including Microsoft's Bing), we also use exception lists when specific algorithms inadvertently impact websites, and when we believe an exception list will significantly improve search quality. We don't keep a master list protecting certain sites from all changes to our algorithms.
The most common manual exceptions we make are for sites that get caught by SafeSearch, a tool that gives people a way to filter adult content from their results. For example, "essex.edu" was incorrectly flagged by our SafeSearch algorithms because it contains the word "sex." On the rare occasions we make manual exceptions, we go to great lengths to apply our quality standards and guidelines fairly to all websites.
Of course, we would much prefer not to make any manual changes and not to maintain any exception lists. But search is still in its infancy, and our algorithms can't answer all questions.
Although Google's explanation seems quite plausible, it's important to note that Bing's exception list, as the statement above points out, is constantly revised and cleaned out. Bing does not keep a long-running list of "exceptions" as Google appears to; when Bing updates its algorithm, its exception list gets updated as well, so the list is used fairly and efficiently to resolve old problems and address new ones as they come in.
What's your take on Google admitting to whitelists & exception lists?