While searching for information on the web, Paul Baker and Amanda Potts noticed that Google's auto-completion algorithm was inadvertently reproducing stereotypes. Troubled by this phenomenon, they set out to investigate which social groups elicited more stereotyping questions than others, and how these questions differed in nature.
In 2010 Google added auto-completion algorithms to offer a list of suggestions when users type words into the search box. The predicted text shown in a drop-down list has either been entered by previous users or appears on the web. While this process can save time, it also has some unintended consequences. For example, when one types 'why do gay' into the search box, a list of auto-completed questions appears.
These suggestions are stereotypical; they ascribe characteristics to people on the basis of their group membership, reducing them to certain (often exaggerated) traits.
To carry out their study, the researchers first created a list of identity groups to investigate. They found that the terms which produced the most questions were related to ethnicity, gender, sexuality and religion. The groups eventually chosen were: black, Asian, white, Muslim, Jewish, Christian, men, women, gay, lesbian and straight. In each category, a selection of similar terms was used, so, for example, the male category included men, boys and guys. The researchers also chose 'people' as a control group to show how humans are characterised when they are not associated with particular identities.
Next, they paired these terms with question forms in order to elicit auto-suggestions. The question forms included items such as 'why do', 'how do' and 'where do', as well as questions beginning with 'do', 'should' and 'are'. Each question was entered individually and the top suggestions were recorded.
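As a rough illustration of this elicitation step, the short Python sketch below pairs each question form with each identity term to build the list of query prefixes. The term list is abbreviated here, and the exact wordings are assumptions rather than the paper's actual search strings; the researchers recorded Google's suggestions by hand, so no fetching is shown.

```python
# A minimal sketch of the query-elicitation step, assuming these
# wordings; the real study's exact search strings may have differed.
from itertools import product

identity_terms = ["black people", "asian people", "white people",
                  "muslims", "jews", "christians", "men", "women",
                  "gay people", "lesbians", "straight people", "people"]

question_forms = ["why do", "how do", "where do", "do", "should", "are"]

def build_queries(forms, terms):
    """Pair every question form with every identity term."""
    return [f"{form} {term}" for form, term in product(forms, terms)]

for query in build_queries(question_forms, identity_terms):
    print(query)  # e.g. "why do black people", "should muslims", ...
```

With six question forms and twelve group terms, this yields 72 prefixes per category variant; in the study, each was typed into the search box and the drop-down suggestions noted.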
Some of the elicited questions did not refer to social groups and were excluded. In the end, 2,690 queries were analysed, and the groups which elicited the most queries were 'Jewish', 'black' and 'Muslim', with over 300 results each, whereas 'people' elicited only 70 queries and 'lesbian' a mere 41.
The questions were then divided into categories according to their subject matter.
Each question was also rated with regard to its evaluation, and was classed as positive, negative or neutral. While the majority of the questions were classed as neutral, most groups tended to have more of their questions categorised as negative than positive. Surprisingly, the control category 'people' elicited proportionally the most negative questions, which tended to be about why people engage in hurtful behaviours such as bullying and self-harming.
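To make this proportional comparison concrete, the sketch below tallies each group's share of negatively rated questions. The data structure and the sample ratings are invented for illustration and are not taken from the paper.

```python
# A toy illustration of comparing negative-question proportions per
# group; the (group, rating) pairs here are made up, not study data.
from collections import Counter

rated_queries = [
    ("people", "negative"), ("people", "neutral"),
    ("black", "negative"), ("black", "positive"), ("black", "neutral"),
]

def negative_proportion(rows):
    """Return each group's share of negatively rated questions."""
    totals, negatives = Counter(), Counter()
    for group, rating in rows:
        totals[group] += 1
        if rating == "negative":
            negatives[group] += 1
    return {group: negatives[group] / totals[group] for group in totals}

print(negative_proportion(rated_queries))
# e.g. {'people': 0.5, 'black': 0.333...}
```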
The relatively high proportion of negative questions for three groups was particularly concerning. For black people, these involved constructions of them as lazy, criminal, cheating, under-achieving and suffering from various conditions such as fibroids. Gay people were constructed as contracting AIDS, going to hell, not deserving equal rights or talking like girls. The negative questions for males positioned them as catching thrush, under-achieving and treating females poorly.
At the same time, all ethnic groups also elicited many positive questions. Black people were constructed as stronger, more attractive and more virile. Asians were viewed as smart, slim and attractive, while white people were viewed as attractive and 'ruling' other groups.
In general, race and gender searches elicited questions about one group's level of interest in another. Indeed, top results for both genders featured the opposite sex. However, sexual fulfilment and references to orgasm appeared more frequently in the female questions.
The 'straight' category included fewer questions (perhaps because heterosexuality is seen as the 'norm'). These tended to be about whether straight men enjoyed stereotypically gay entertainment, such as the TV show Glee or the singer Cher, and whether straight men could 'turn' gay or have homosexual thoughts.
Whereas the 'gay' results included questions about whether gay people should be allowed to marry, adopt, join the military or give blood, the 'lesbian' questions included negative stereotypes (acting or looking like men), questions about sexual and emotional behaviour towards men, and questions about the mechanics of lesbian relationships.
While the researchers do not claim that people who are exposed to such questions will be influenced by the stereotypes they encounter, it is important to acknowledge that these stereotypes exist. In addition, the paper raises the moral question of whether content providers should 'protect' their users by removing offensive auto-suggestions (and, indeed, who decides what is inappropriate?) or simply reflect the phenomena that people are interested in.
Baker and Potts warn that auto-complete suggestions could perpetuate stereotypes, and that not all users would view them with a critical eye. They therefore recommend that there should be a facility to flag particular auto-completion suggestions as problematic, and that Google should consider removing those that are consistently flagged.
---------------------------------------------------------------
Baker, Paul & Amanda Potts (2013) ''Why do white people have thin lips?' Google and the perpetuation of stereotypes via auto-complete search forms', Critical Discourse Studies, 10(2), 187-204. doi:10.1080/17405904.2012.744320
This summary was written by Danniella Samos