when_biometrics_fail_shoshana_amielle_magnet [2022/11/19 17:21] alexz
\\
Notes on When Biometrics Fail, Shoshana Amielle Magnet
\\
https://
\\
Defining Biometrics (p21)
\\
A biometric attribute is defined as “a physical or psychological trait that can be measured, recorded, and quantified” (P. Reid 2004:5). The process of acquiring the information about the physical or psychological trait — whether a digital fingerprint, […]
\\
\\
(p.39) These experiments represent only a few of the many biometric studies that classify individuals based on bodily identity, each using slightly different methods or a different data set. The initial labeling of the images by gender or race in order to train the computer system must be done manually by the scientists themselves. The following statement is standard, demonstrating that so-called physiognomic differences are key: “The ground truth label for gender and ethnic origin were determined by visual inspection after the images were collected” (Gutta et al. 2000:198). That is, scientists themselves decide on the gender and race of an individual before using algorithms to train their computers to do the same. […] The ways that biometric technologies identify racialized and gendered bodies differently are known to scientists. From comparisons of the impact of biometric scanners that could identify some “races” better than others (Tanaka et al. 2004) to attempts to teach biometric systems to classify gender (Ueki et al. 2004), these studies help to explain biometric failures and their connection to race and gender identities. The papers just reviewed rely on racist research that has long been debunked, while ignoring the host of empirical work in the past century on the complexity of bodily identity.
\\
(p.46) As these scientists label the images according to their understandings of their own biological race and gender identities, preconceptions about gender and race are codified into the biometric scanners from the beginning. These assumptions that women are more likely to have long hair and men are more likely to wear ties, or that the “Mongolian race” has a certain type of eyelid, are reified and then programmed into what David Lyon (2003) calls a “technology of social sorting”. Biometric scientists base their studies on dangerous, racist understandings of identity from the 1930s and earlier and ignore the past thirty years of research that has definitively disproved any biological theory of race.
\\
\\
Summary (p.49)
\\
Scientific studies that rely upon biological understandings of gender and race represent some of the most egregious failures of biometric technologies. Any technology that takes as its premise the assumption that bodies are stable entities that can be reliably quantified is problematic. Relying upon erroneous biological understandings of race and gender in the development of biometric technologies has a number of ramifications, […]
\\
Using technology to tell us “truths” about the body never reveals the stable narratives we are hoping for. Biometric technologies cannot be counted on effectively and definitively to identify //any// bodies. However, as these technologies are specifically deployed to identify suspect bodies, the impact of technological failure manifests itself most consistently in othered communities. Representing these new identification technologies as able to circumvent cultural assumptions and subjective human judgment does not make it so. Rather, biometric technologies are always already inflected by the cultural context in which they are produced. Biometrics are marketed as able to eliminate systemic forms of discrimination at the same time as they are produced in a context marked by the persistence of problematic assumptions about difference. That is, the rhetoric of scientific neutrality masks their racist, sexist, homophobic, ableist, and classist practices. Given the context for which they are developed, it is unsurprising that biometric technologies are imagined as able definitively to identify suspect bodies. Nor is it surprising, given cultural assumptions about othered bodies, that these assumptions are both explicitly and implicitly coded into the technologies. Biometrics fail most often and most spectacularly at the very objective they are marketed as able to accomplish. Race and gender identities are not nearly as invisible to new identification technologies as is claimed. The technological fallibility of biometrics manifests itself practically in their disproportionate failure at the intersection of racialized, queered, gendered, classed, and disabled bodies, as they represent the latest attempt to definitively tie identity to the body. Rather than telling stories of mechanical objectivity, […]
\\
\\
//Biometrics and policing are not strangers to each other.//
\\
Ann Cavoukian, Privacy Commissioner of Ontario, Biometrics and Policing: Comments from a Privacy Perspective. (p.51: I-Tech and the beginnings of biometrics)
\\
\\
(p.92)
\\
Although the border between Canada and the United States is described as self-evident, […]
\\
\\
(p.123)
\\
An example of corporeal fetishism (Donna Haraway): biometric technologies make the body into a “thing”, […]
\\
\\
The knowledge generated by the use of biometrics to test identity is asked to do the cultural work of stabilizing identity […]
\\
\\
Offering to redefine social problems as scientific ones (Harding 1993:15), biometrics portray old inequities in new ways.
\\
\\
[…] shaped by a discourse of technological neutrality and efficiency.
\\
[…] the process of making something visible is never as straightforward as the proponents of visualization technologies, […]
\\
\\
(p.128)\\
Biometric discourse produces new forms of […]
\\
\\
Using biometric technologies as a form of surveillance of suspect bodies and then reducing those suspect bodies into their component parts helps to allay the anxieties created around security through surveillance and dismemberment.
\\
\\
(p. 131)\\
//Witwer: Flaws//
\\
\\
[…] understanding scientific practice as an interpretative act.
\\
\\
(p. 133)\\
the […]
\\
\\
Like the biometric image itself, a satisfyingly contained series of zeroes and ones, the biometric match seems to guarantee our security—closing up security holes with a satisfying clink—serving simultaneously to smooth and to identify. Here surveillant scopophilia is a visual process that serves to smooth our anxieties about security, as these visual representations suggest that those bodies that threaten our security will be reliably identified.
\\
\\
(p. 150)\\
With this new search technology, bodies are imagined as stable entities that can reliably give us definitive proof of identity, creating processes of social stratification in which “material and technological infrastructures divide populations” by gender, race, class, and other axes of identity (Monaghan 2010:83). Yet biometric mismatches due to mechanical failures and the technology’s inability to work objectively dispute such stability.
\\
\\
(p. 151)\\
We must also think of the intensification of existing inequalities as failures.
\\
\\
Given the devastating consequences of biometric errors for human lives, we need a language that is not restricted to technical terms. Biometric failures as //complex sorrows// is a beginning.
\\
\\
[…] what Haraway and Goodeve (2000:115) call “denaturalization… when what is taken for granted can no longer be taken for granted precisely because there is a glitch in the system.”
\\
\\
[…] while cultural theorists generally understand the basics of the scientific objects they study, the same cannot be said for scientists’ knowledge of cultural theory.
\\
\\
[…] within biometric science we are witnessing a return to the long-discredited fields of anthropometry and physiognomy, […]
\\
\\
before formulating further policies regulating biometric technologies, […]
\\
\\
We may also understand biometric representations of the body as a map of the contemporary social moment, both producing and reflecting its enduring inequalities, […]
\\
We do not need a perfect technology for representing the body—there is no such technology. Rather, we need to think critically about how security imperatives and the development of new technologies are trumping a commitment of the state to address poverty and the perpetuation of intersecting forms of discrimination. In speaking about the consequences of corporeal fetishism, Donna Haraway (1997:160) argues, “We need a critical hermeneutics of genetics as a constitutive part of scientific practice more urgently than we need better map resolution for genetic markers.” The same is true for biometric technologies. Technological solutions to social problems have tended to take an approach characterized by the prioritization of security over substantive equality, global lockdown over emancipation, […]