As Paul Baker reported yesterday, a paper that we co-authored entitled “‘Why do white people have thin lips?’ Google and the perpetuation of stereotypes via auto-complete search forms” (published 2013 in Critical Discourse Studies 10:2) has recently been garnering some media attention, being cited in the Mail Online and the 18 May 2013 print issue of The Daily Telegraph (image below). Our findings — that “the auto-complete search algorithm offered by the search tool Google can produce suggested terms which could be viewed as racist, sexist or homophobic” — come as a German court “said Google must ensure terms generated by auto-complete are not offensive or defamatory” (BBC News, 14 May 2013). Similar, earlier cases of (personal) libel and defamation were recalled by both Paul and me during the course of our investigation, but — serious as it may be — the thrust of this study was not the potential for damage to individuals, but rather to entire social groups. We found that:
“Certain identity groups were found to attract particular stereotypes or qualities. For example, Muslims and Jewish people were linked to questions about aspects of their appearance or behaviour, while white people were linked to questions about their sexual attitudes. Gay and black identities appeared to attract higher numbers of questions that were negatively stereotyping.”
The nature of Google auto-complete is such that the content presented appears because a relatively high number of previous users have typed these strings into the search box. We argue, then, that the appearance of such a high frequency of (largely negatively) stereotyping results indicates that “humans may have already shaped the Internet in their image, having taught stereotypes to search engines and even trained them to hastily present these as results of ‘top relevance’.” This finding has been somewhat misinterpreted by the press; the short title revealed in the URL for the Mail Online article and used in the top ticker — ‘Is Google making us RACIST?’ — actually reverses the agency in this process, as we have argued that, in fact, users may have made Google racist.
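The mechanism described above — suggestions surfacing because many previous users typed the same strings, ranked by popularity — can be illustrated with a minimal sketch. This is not Google's actual algorithm (which is proprietary and far more complex); it is a toy model assuming a simple prefix-matching index ranked by raw query frequency:

```python
from collections import Counter

class AutocompleteIndex:
    """Toy model of frequency-ranked auto-completion: a string becomes a
    suggestion because many previous users searched for it, and suggestions
    are ordered by how often they were typed."""

    def __init__(self):
        self.counts = Counter()

    def record(self, query):
        # Each submitted search increments that string's popularity.
        self.counts[query.lower()] += 1

    def suggest(self, prefix, k=3):
        # Return the k most frequently searched past queries
        # that begin with the typed prefix.
        prefix = prefix.lower()
        matches = [(q, n) for q, n in self.counts.items() if q.startswith(prefix)]
        matches.sort(key=lambda qn: -qn[1])
        return [q for q, _ in matches[:k]]
```

On this model, whatever a critical mass of users has typed — stereotyping questions included — is fed straight back to the next user as a “top relevance” completion, which is the sense in which users, not the engine, supply the content.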
This ties in to the main suggestion that we make in the conclusion of the article, that “there should be a facility to flag certain auto-completion statements or questions as problematic”, much the same as the ‘down-votes’ utilised in the Google-owned and -operated site YouTube. The argument here is: if auto-complete results have been crowd-sourced from Google users, why not empower the same users to work as mass moderators?
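One way the flagging facility we proposed might work is sketched below. The threshold value and the ratio-based rule are my own assumptions for illustration, not part of the article: a suggestion is suppressed once enough of the users who encounter it flag it as problematic, relative to how often it is searched:

```python
from collections import Counter

# Hypothetical sketch of crowd-moderated auto-complete: the same users whose
# searches generate the suggestions can flag a suggestion as problematic.
FLAG_THRESHOLD = 0.2  # assumed cut-off for illustration only

query_counts = Counter()  # how often each suggestion has been searched
flag_counts = Counter()   # how often users have flagged it as problematic

def flag(suggestion):
    # A user marks a presented suggestion as problematic (a 'down-vote').
    flag_counts[suggestion] += 1

def visible_suggestions(candidates):
    # Keep a suggestion only while its flags stay below the threshold
    # relative to how often it has been searched.
    visible = []
    for s in candidates:
        searches = query_counts[s]
        if searches == 0 or flag_counts[s] / searches < FLAG_THRESHOLD:
            visible.append(s)
    return visible
```

The design choice mirrors YouTube's down-votes: moderation is distributed across the same crowd that produced the content, rather than centralised in the search provider.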
The other main point in our conclusion section was that this was not (and could not have been) a reception study “in that we are unable to make generalisations about the effects on users of encountering unexpected auto-complete question forms in Google”, but that this was an area ripe for further research.
“Hall’s (1973) notion of dominant, oppositional and negotiated resistant readings indicates that audiences potentially have complex and varying reactions to a particular ‘text’. As noted earlier, we make no claim that people who see questions which contain negative social stereotypes will come to internalise such stereotypes. A similar-length (at least) paper to this one would be required to do justice to how individuals react to these question forms. And part of such a reception study would also involve examining the links to various websites which appear underneath the auto-completed questions. Do such links lead to pages which attempt to confirm or refute the stereotyping questions?”
In short, we had found that Google auto-complete did offer a high frequency of (largely negative) stereotyping questions, and did not offer a way for users to problematise these at the point of presentation. What we did not find was that “Google searches ‘boost prejudice’”, though we did hope to spark a discussion on the topic, and to indicate that the field is open for researchers willing to conduct reception studies.
Nic Subtirelu, a PhD student in the Department of Applied Linguistics and ESL at Georgia State University, wrote an interesting blog post on his site Linguistic Pulse beginning to do just that. After following the links presented from a sample search of “why do black people have big noses”, he says:
“So what happens when you do type in these searches? Well if you’re genuinely interested in the question enough to actually read some of the first results you find, my own experience here suggests that what you’ll be exposed to are sources that would not be considered credible in academic communities (and whose scholarly merits may be questionable) but nonetheless contain information designed to answer the question honestly using scientific theories (in this case evolutionary biology) and which often also acknowledge the over-generalization of the original question or the ideological norm that the question assumes (that is the question assumes Africans have ‘big’ noses only because they are being implicitly compared to ‘normal’ European noses).”
Nic does come across some traces of pseudo-scientific, white supremacist discourse, and misogynistic ideologies in the websites linked by auto-suggestion, but summarises that “While [Google auto-complete] clearly suggests we live in a world of stereotyping and particularly negative stereotyping in the case of historically oppressed groups, it may also indicate the potential for challenging these stereotypes” and offers his own suggestion for further work, urging that:
“people who generate content critical of racist, homophobic, or sexist ideologies should attempt to make that content searchable by popular questions like ‘Why do black people have big noses?’ as well as accessible to broad audiences so that audiences relying on these stereotypes can have them challenged.”