For many years, while Google was under threat of regulatory action for manipulating its search results for its own commercial gain, the company used every trick in the book, pleading ignorance and incompetence, invoking safe harbor, fair use and the First Amendment, and even claiming web traffic beneficence, to deflect criticism in the press and investigation by regulators.
Above all, and despite many examples to the contrary, Google appealed to manifest impartiality: its search results, the company insisted, were algorithmically derived, untouched by human bias and therefore fair. The list of grandiose promises and statements from Google that turned out to be false or hypocritical is uncomfortably long. Unfortunately for the rest of us, regulatory capture being what it is, and the rare penalties being laughable for a $275 billion company, there isn't much of a black cloud left hanging over Google, especially under the current U.S. administration.
So perhaps Google now feels freshly emboldened to tell it like it is. In any case, I was impressed by this frank admission in The New York Times:
Even at Google, where algorithms and engineers reign supreme in the company’s business and culture, the human contribution to search results is increasing. Google uses human helpers in two ways. Several months ago, it began presenting summaries of information on the right side of a search page when a user typed in the name of a well-known person or place, like “Barack Obama” or “New York City.” These summaries draw from databases of knowledge like Wikipedia, the C.I.A. World Factbook and Freebase, whose parent company, Metaweb, Google acquired in 2010. These databases are edited by humans.
When Google’s algorithm detects a search term for which this distilled information is available, the search engine is trained to go fetch it rather than merely present links to Web pages.
“There has been a shift in our thinking,” said Scott Huffman, an engineering director in charge of search quality at Google. “A part of our resources are now more human curated.”
Not a shift, I'd say, but a fresh admission of an ongoing reality. Let's hope for Scott Huffman's sake that he ran this by Google legal before it was published. Or better yet, let's hope Google now drops the unbecoming pretension of being philosophically open and algorithmically impartial.