Google's autocomplete feature guesses what you're searching for before you've finished typing. Sometimes it saves time, and other times it's hilariously off the mark, but it should never be hateful. However, the company ran into that very issue with two suggested searches that appeared when people Googled "women" and "jews."
For both terms, when the query began with "are women" or "are jews," Google suggested "evil." Or rather, Google's algorithm did. A representative told CNET that the suggestions are based on users' interests and search history.
"Terms that appear in Autocomplete may be unexpected or unpleasant," the representative continued. "We do our best to prevent offensive terms, like porn and hate speech, from appearing, but we acknowledge that autocomplete isn't an exact science and we're always working to improve our algorithms."
The company confirmed to the Guardian and the Telegraph that the suggestions have since been removed.
The issue with the suggestions isn't just that they're a jarring surprise in the middle of what was likely a perfectly innocent query; it's that they essentially encourage users to search for and consume misogynistic and anti-Semitic content. While people are free to Google whatever they like, let's not make bigotry any easier to access.