On or about November 17, 2003, countless English-language ecommerce sites stopped appearing near the top of Google's rankings when their owners searched for the terms they considered most important. Four days later, I discovered that adding a nonsense exclusion term shifted the links returned by Google dramatically, and the results were very close to what these site owners had come to expect over the previous few months.
A filter was in place, and it could be defeated by adding one, or sometimes two, exclusion terms. If an exclusion term consists of characters that would never be found on a web page, then adding it to your usual terms should normally make little or no difference. Under normal circumstances, a search for callback service should return the same links as a search for callback service -qwzxwq, because no sane web page contains the term qwzxwq.
By Wednesday of the following week, the trick was still working. I hadn't expected this, so I started this Scroogle site using a script that compared the top 100 results from Google with exclusion terms against the top 100 results for the same terms without the exclusion. I began recording the terms entered by visitors to the site, along with the "casualty rate" for those terms. This rate is the number of links in the unfiltered top 100 that were missing from the filtered top 100.
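The comparison at the heart of that script can be sketched in a few lines. This is a minimal illustration, not the actual Scroogle code: the function name, the sample URLs, and the way results are represented (plain lists of URL strings) are all assumptions for the example.

```python
# Hypothetical sketch of the Scroogle-style comparison: given the top-100
# result URLs for a query run with a nonsense exclusion term (unfiltered)
# and without it (filtered), the "casualty rate" is the number of links
# in the unfiltered top 100 that are missing from the filtered top 100.

def casualty_rate(unfiltered: list[str], filtered: list[str]) -> int:
    """Count links present in the unfiltered top 100 but absent
    from the filtered top 100."""
    filtered_set = set(filtered[:100])  # set lookup avoids O(n^2) scans
    return sum(1 for url in unfiltered[:100] if url not in filtered_set)

# Illustrative data: 100 unfiltered results, of which the first 37
# disappear from the filtered list and are replaced by other links.
unfiltered = [f"http://example{i}.com/" for i in range(100)]
filtered = unfiltered[37:] + [f"http://other{i}.com/" for i in range(37)]
print(casualty_rate(unfiltered, filtered))  # prints 37
```

A rate of 0 would mean the exclusion term changed nothing, which is what one would expect if no filter were in place; a high rate suggests the filter suppressed many of the links that the exclusion term restores.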
It's a mess. Google's integrity is on the line. If they keep this up, all their dreams of riches from stock options will vanish. Who's in charge at the Googleplex anyway? There isn't much time.
Google Watch - A look at how Google's monopoly, algorithms, and privacy policies are undermining the Web.