(Just as a reminder: while I am a Google employee, the following post is my personal opinion.)

Recently I read a fascinating essay that I wanted to comment on. I found it via Ars Technica, and it discusses “search neutrality” (PDF link, but I promise it’s worth it). It’s written by James Grimmelmann, an associate professor at New York Law School.

The New York Times called Grimmelmann “one of the most vocal critics” of the proposed Google Books agreement, so I was curious to read what he had to say about search neutrality.

What I discovered was a clear, cogent essay that calmly dissects the idea of “search neutrality” that was proposed in a New York Times editorial. If you’re at all interested in search policies, how search engines should work, or what “search neutrality” means when people ask search engines for information, advice, and answers–I highly recommend it.

Grimmelmann considers eight potential meanings of “search neutrality” over the course of the article. As Grimmelmann says midway through the essay, “Search engines compete to give users relevant results; they exist at all only because they do. Telling a search engine to be more relevant is like telling a boxer to punch harder.”

On the notion of building a completely transparent search engine, Grimmelmann says:

A fully public algorithm is one that the search engine’s competitors can copy wholesale. Worse, it is one that websites can use to create highly optimized search-engine spam. Writing in 2000, long before the full extent of search-engine spam was as clear as it is today, Introna and Nissenbaum thought that the “impact of these unethical practices would be severely dampened if both seekers and those wishing to be found were aware of the particular biases inherent in any given search engine.”

That underestimates the scale of the problem. Imagine instead your inbox without a spam filter. You would doubtless be “aware of the particular biases” of the people trying to sell you fancy watches and penis pills–but that will do you little good if your inbox contains a thousand pieces of spam for every email you want to read. That is what will happen to search results if search algorithms are fully public; the spammers will win.
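Grimmelmann’s point is easy to demonstrate. Here’s a toy sketch in Python (the naive `public_score` function and the example pages are my own invention for illustration, not anything from the essay or from any real search engine): once the exact ranking formula is public, a spammer can mechanically construct the highest-scoring page.

```python
# A toy, fully transparent "search engine" ranking: count how often the
# query terms appear on the page. This is deliberately naive -- it
# illustrates the argument, not any real engine's algorithm.

def public_score(page_text: str, query: str) -> int:
    words = page_text.lower().split()
    return sum(words.count(term) for term in query.lower().split())

query = "fancy watches"

honest_page = "We review fancy watches and compare prices from real sellers."
# Because the formula is public, a spammer can build a page that
# maximizes it exactly, with no useful content at all.
spam_page = "fancy watches " * 500

print(public_score(honest_page, query))  # 2
print(public_score(spam_page, query))    # 1000 -- the spam page wins
```

Real ranking systems are vastly more complex, but the dynamic is the same: full disclosure of the formula doubles as a blueprint for gaming it.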

And Grimmelmann independently hits on the reason that Google is willing to take manual action on webspam:

Search-engine optimization is an endless game of loopholing. … Prohibiting local manipulation altogether would keep the search engine from closing loopholes quickly and punishing the loopholers–giving them a substantial leg up in the SEO wars. Search results pages would fill up with spam, and users would be the real losers.
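To extend the toy sketch above (again, my own illustration, not a description of Google’s actual process): closing a loophole means patching the ranking function itself, which takes time, so a manual action lets the engine demote a known manipulator in the meantime. A strict “no local manipulation” rule would forbid exactly that.

```python
# Continuing the toy sketch: a hypothetical manual-action list that demotes
# known manipulators while the underlying scoring loophole is being fixed.
# (Reuses public_score from the sketch above.)

MANUALLY_FLAGGED = {"spam.example.com"}  # hypothetical flagged hosts

def score_with_manual_action(host: str, page_text: str, query: str) -> int:
    if host in MANUALLY_FLAGGED:
        return 0  # demoted by hand until the loophole itself is closed
    return public_score(page_text, query)
```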

I don’t believe all search engine optimization (SEO) is spam. Plenty of SEOs do a great job making their clients’ websites more accessible, relevant, useful, and fast. Of course, there are some bad apples in the SEO industry too.

Grimmelmann concludes:

The web is a place where site owners compete fiercely, sometimes viciously, for viewers, and users turn to intermediaries to defend them from the sometimes-abusive tactics of information providers. Taking the search engine out of the equation leaves users vulnerable to precisely the sorts of manipulation search neutrality aims to protect them from.

Really though, you owe it to yourself to read the entire essay. The title is “Some Skepticism About Search Neutrality.”
