Ask Google to complete the phrase "Maori are", and you get some pretty nasty suggestions. Whose fault is that (hint: look at the people around you)?

[Pre-publication update: Wow - that was quick! In the midst of writing this post (started at 11 am), it appears that the "problem" it discusses has been addressed by Google disabling the predictive search function for certain "high risk" searches. But seeing as I had pretty much finished writing, and I now have to run off to play indoor soccer, I'm posting it anyway.]

I'm not usually comfortable writing posts that defend huge multi-national tax-dodging technology companies that have plans to dominate the entire internet and then use their power to enslave humanity in a never-ending nirvana of liberal goodness. (Guess who read Dave Eggers' The Circle last year?)

But this NZ Herald story on the fact that Google's predictive search function throws up lots of nasty and insulting suggestions when you start looking for information on Maori, Tongans, Australians, or just about any national or ethnic group seemed a bit unfair on the company.

First off, when the story begins by saying:

Kiwis are dumb, racist and stupid - that's according to Google's predictive search function.

And the English are boring and rude, Irish are drunk and thick and the Chinese are "everywhere".

it's spouting nonsense. Google's search function isn't making any claims that Kiwis (or anyone else) are anything. Rather, it's predicting what it thinks you are going to want to ask about, based on what you've typed so far and looking at what other people who've typed the same thing in the past then go on to type.

So, for example, when I type into the Google search field "Dunedin is", and Google's second occurring predictive term is "Dunedin is for lovers", that doesn't mean Google is telling the world that the city actually is "for lovers". Rather, it means that in the past some people have gone to Google to try and find the URL for this blog (which actually is rather charming), and so Google thinks there's a fair chance that I'll want to do the same thing.
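The mechanism described above can be sketched in a few lines. This is a toy illustration only, not Google's actual system (which ranks with far more signals than raw frequency): suggestions are simply the most common past queries that start with what you've typed so far. The query log below is invented for the example.

```python
from collections import Counter

def build_autocomplete(past_queries):
    """Count how often each full query appears in a (hypothetical) query log."""
    return Counter(past_queries)

def suggest(counts, prefix, n=3):
    """Return up to n past queries starting with the prefix, most frequent first."""
    matches = [(q, c) for q, c in counts.items() if q.startswith(prefix)]
    matches.sort(key=lambda qc: -qc[1])
    return [q for q, _ in matches[:n]]

# An invented log: two people searched for this blog, one for something else.
log = [
    "dunedin is for lovers",
    "dunedin is for lovers",
    "dunedin is cold",
    "dunedin weather",
]
counts = build_autocomplete(log)
print(suggest(counts, "dunedin is"))
```

The point is that the suggestions say nothing about Dunedin itself; they only reflect what earlier searchers typed.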

Which actually is a pretty useful function - as Google says about itself, on average you can complete your search 2-5 seconds more quickly with it operating. Not quite as earth-shattering a technological breakthrough as commercial jet travel, perhaps ... but still a "nice-to-have".

(Of course, if you don't like the function, you can always turn it off.)

The potential downside is the one we're seeing in the Herald's article. Because Google predicts what you might want to search for based on what others before you have wanted to search for, if your search uses words that a whole lot of nasty, racist bigots have also used, you get presented with their searches.

Which gives us a real insight into how people (or, at least, those who use Google) think and talk about various ethnic groups in the comparative privacy of their own keyboards. As the Herald article quotes the PhD student who "uncovered" this issue: "Mr Elers said that given the results for Maori were based largely on what New Zealanders put in Google searches, it raised the issue of how Kiwis viewed Maori."

So this isn't really Google's problem. It's our problem.

Of course, you may say, Google could still take ownership of this issue by somehow vetting its predictive search function so as to remove all the nasty stuff (which it already does for some terms). But that then gets tricky. Because in and of themselves, some of the words that cause offence when applied to particular ethnic groups ("lazy", "dirty", "scum") are not offensive. Take, for example, a search for "how do I clean scum from my shower". So how does Google tell a "good" use of a term from a bad one?
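To see why word-level vetting is tricky, consider the most naive approach: blocking any suggestion containing a word from a list of "offensive" terms. This sketch is purely illustrative (the blocklist and queries are invented for the example), and it immediately shows the problem the shower-scum search raises - the same word is innocent in one context and nasty in another.

```python
# A naive, purely illustrative blocklist of words that offend in some contexts.
BLOCKLIST = {"lazy", "dirty", "scum"}

def naive_filter(suggestion):
    """Allow a suggestion only if it contains no blocklisted word - ignoring context."""
    return not any(word in suggestion.split() for word in BLOCKLIST)

# A harmless household query gets blocked along with the genuinely nasty one.
print(naive_filter("how do i clean scum from my shower"))  # prints False
print(naive_filter("maori are awesome"))                   # prints True
```

A context-blind filter either blocks innocent searches like this one or lets the offensive ones through, which is presumably why Google's actual response was the blunter one of disabling suggestions for whole categories of query.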

[Update: For the moment, anyway, Google seems to have responded by disabling entirely its predictive search function for (almost all) ethnic groups. So if you type in "Maori are", you get no results at all. Same for "Australians are". But type in "Russians are", and you get this!] 

One last point to note. The Herald story comes just a couple of days after the Daily Show did a pretty funny bit on the same topic. It's worth watching.

Comments (4)

by Ross on March 07, 2014
Ross

Shooting the messenger is a little silly. I think Google's predictive search function is helpful and, as you say, you can turn it off if you don't like it. Mobile phones have predictive text, which I assume has similar issues, but those can also be overcome by turning off the function.

by Andrew Osborn on March 07, 2014
Andrew Osborn

The Herald copied it from Whaleoil: An indication of how pathetic the Herald is.


by Che Nua on March 09, 2014
Che Nua

Maori are awesome!

by DeepRed on March 09, 2014
DeepRed

Is this yet another case of moral panic being whipped up about the latest technologies? The MPAA's Jack Valenti likening the VCR to the Boston Strangler, video games being accused of making people go on spree shootings, you name it.
