Google Instant brings up racist, sexist, homophobic suggestions: Study

An example of Google's auto-complete suggestions. (SCREENSHOT)

QMI Agency, Last Updated: 1:03 PM ET

It can be very helpful when Google guesses what you're searching for, but a British study says the service perpetuates racist, sexist and homophobic stereotypes.

Researchers at Lancaster University used several terms to see how the Google Instant function attempted to automatically complete them. What often came up were offensive responses.

The researchers typed "Why do gay" and Google Instant completed the query with several suggestions including "men have high voices," "men get AIDS," and "people exist."

Google Instant is powered by an algorithm that draws on previously asked questions and popular searches to suggest what you might be looking for.
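Google has not published how its ranking works, but the popularity-driven behaviour the article describes can be illustrated with a minimal sketch: completions for a typed prefix are ranked purely by how often each full query appears in a (hypothetical) search log. All names and data below are invented for illustration.

```python
from collections import Counter

def build_suggester(query_log):
    """Count how often each full query appears in a hypothetical search log."""
    counts = Counter(query_log)

    def suggest(prefix, limit=3):
        # Rank completions of the prefix by raw popularity, most frequent first.
        matches = [(q, n) for q, n in counts.items() if q.startswith(prefix)]
        matches.sort(key=lambda item: -item[1])
        return [q for q, _ in matches[:limit]]

    return suggest

# Toy log: whatever is typed most often dominates the suggestions.
log = (["why do cats purr"] * 5
       + ["why do cats sleep so much"] * 3
       + ["why do birds sing"] * 2)
suggest = build_suggester(log)
print(suggest("why do cats"))
# → ['why do cats purr', 'why do cats sleep so much']
```

A ranker this naive simply mirrors its input: if offensive queries are common in the log, they surface as "top" suggestions, which is the dynamic the researchers describe.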

"It seems as though humans may have already shaped the Internet in their image, having taught stereotypes to search engines and even trained them to hastily present these as results of 'top relevance,'" Paul Baker and Amanda Potts wrote in the study, which appears in the current issue of Critical Discourse Studies.
