Google has gone gaga over its Instant and 'Autocomplete' features, which it says are the future of search. Thanks to 'autocomplete', people can see results appear as they type their queries.
With Instant, the suggestion box updates with every letter typed and simultaneously throws up 10-15 search results.
However, a few users might have noticed that some 'selected' words don't produce these 'instant' results. For such words, Google waits for you to type the whole query and hit search, instead of displaying results instantly. Try the words lesbian or Aryan.
Recently, the hacker publication 2600 compiled a list of the words Google has blacklisted. Here they are.
Altogether, Google Instant has blacklisted 12 words beginning with A, including Aryan and ass; 29 that begin with B, including biseXual and bitcH; 21 that start with C, including cum and clit (both blacklisted only after a space); 19 from D, such as dominatIon and deeptHroat; and four from E (eroT).
Similarly, 14 words from F (fetIsh, fuCk), 18 from G (google is Evil, gay man), 16 from H (how to kilL, how to murdeR), four from I (i haTe, inceSt), 13 from J (jack ofF, jacobs ladder piercing), 10 from L (Lesbian, latinA), seven from M (missionary Position), 12 from N (negrO, neonaZi, nigGer, nuDe), five from O (orgY), 17 from P (paedoPhile, pissIng), one from Q, nine from R (rapE), 37 from S (sex, seduceD), 18 from T (teeN, towelHead), three from U (undressIng), four from V (vibratoR), six from W and two from Y are also blacklisted.
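To make the described behaviour concrete, here is a minimal, purely illustrative sketch of how an autocomplete blacklist check might behave. It is not Google's actual implementation; the term sets and the show_instant_results function are hypothetical, seeded with a handful of terms from the article, and it only mimics the two behaviours described above: suppressing instant results once a blacklisted word is fully typed, and suppressing certain terms only after a trailing space.

```python
# Purely illustrative sketch -- NOT Google's code. The term sets and the
# show_instant_results() helper are hypothetical, using a few terms
# mentioned in the article.

BLACKLISTED_TERMS = {"lesbian", "aryan"}   # suppressed once fully typed
SPACE_TRIGGERED = {"cum", "clit"}          # reportedly suppressed only after a space

def show_instant_results(query: str) -> bool:
    """Return True if instant results would still be shown for this partial query."""
    q = query.lower()
    words = q.split()

    # A fully typed blacklisted word blanks the instant results.
    if any(word in BLACKLISTED_TERMS for word in words):
        return False

    # Some terms only trigger once a trailing space confirms the word is complete.
    if q.endswith(" ") and words and words[-1] in SPACE_TRIGGERED:
        return False

    return True

print(show_instant_results("lesb"))       # True  -- the word is not complete yet
print(show_instant_results("lesbian"))    # False -- instant results are withheld
print(show_instant_results("cum"))        # True  -- no trailing space yet
print(show_instant_results("cum "))       # False -- the space completes the term
```

As Goel notes below, the real system is algorithmic rather than a simple term list, so a sketch like this only captures the observable behaviour, not the actual mechanism.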
Vinay Goel, head of products, Google India, said, “There are a number of reasons you may not be seeing search queries for a particular topic. Among other things, we apply a narrow set of removal policies for pornography, violence, and hate speech.”
However, in contrast to the blacklisted words, a few queries that could fall under the same categories do not seem to bother Google, such as 'how to commit suicide', 'ku klux klan', 'murder', 'nazi', 'homosexual' and 'fascist'.
“It's important to note that removing queries from 'autocomplete' is a hard problem, and not as simple as blacklisting particular terms and phrases. In search, we get more than one billion searches each day. Because of this, we take an algorithmic approach to removals, and just like our search algorithms, these are imperfect. We will continue to work to improve our approach to removals in 'autocomplete', and are listening carefully to feedback from our users,” adds Goel.