Notes on When Biometrics Fail, Shoshana Amielle Magnet
https://www.dukeupress.edu/when-biometrics-fail

Defining Biometrics (p21)
A biometric attribute is defined as “a physical or psychological trait that can be measured, recorded, and quantified” (P. Reid 2004:5). The process of acquiring the information about the physical or psychological trait — whether a digital fingerprint, iris scan, or distinctive gait — and then storing that information digitally in a biometric system is called enrollment (P. Reid 2004:6; Nanavati, Thieme, and Nanavati 2002:17). A template is the digital description of a physical or psychological trait, usually containing a string of alphanumeric characters that expresses the attributes of the trait (P. Reid 2004:6). Before the biometric data are converted to a digital form, they are referred to as raw data (Nanavati, Thieme, and Nanavati 2002:17). Raw biometric data are not used to perform matches — they must first be translated into a biometric template before they can be utilized — a process that is achieved with the help of a biometric algorithm. Biometric algorithms are frequently described as recipes for turning a person’s biological traits into a “digital representation in the form of a template” (P. Reid 2004:6). This recipe is usually proprietary, and it is what a biometric technology company sells, arguing that its recipe for creating a template from raw biometric data is better than another company’s recipe.
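A minimal sketch may help fix this vocabulary. Everything below is hypothetical: extract_features stands in for a vendor’s proprietary recipe, and the raw readings and template format are invented for illustration rather than drawn from any real system.

```python
# Toy sketch of enrollment: raw data -> template -> stored record.
# extract_features is a hypothetical stand-in for a proprietary biometric algorithm.

def extract_features(raw_scan: list) -> list:
    """Toy feature extraction: normalize raw sensor readings to the range [0, 1].
    Real algorithms derive minutiae points, iris codes, gait parameters, etc."""
    return [value / 255 for value in raw_scan]

def make_template(raw_scan: list) -> str:
    """Encode the extracted features as a short alphanumeric template string."""
    return ",".join(f"{x:.3f}" for x in extract_features(raw_scan))

# Enrollment: acquire raw biometric data, convert it to a template, store it digitally.
database = {}

def enroll(user_id: str, raw_scan: list) -> None:
    database[user_id] = make_template(raw_scan)

enroll("alice", [12, 200, 37, 148])   # invented raw fingerprint readings
print(database["alice"])              # "0.047,0.784,0.145,0.580"
```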

Vendors represent biometric technologies as able to answer two questions. The first question refers to identification and asks, Who am I? Often described as a 1:N matching process, the presentation of a biometric template created in real time (called a live biometric) is checked against a database of stored biometric templates. Used more commonly in security and law enforcement applications, this process allows for one person to be checked against a list of persons (P. Reid 2004:14). The second question that biometric technologies are imagined to be able to answer concerns verification: Am I who I say I am? Referred to as a 1:1 matching process, verification checks the presentation of the live biometric against the person’s template stored in the database to determine if they match. If the live biometric is the same as the stored biometric, there is a match and the identity is verified. Verification is held up in biometric discourse as a much quicker process than identification, since it must check only one live biometric against one stored biometric, rather than checking a particular biometric against an entire database. Biometric technologies that rely on verification are more commonly used in physical and informational access applications, including secure building and computer network access (14).
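Continuing the toy sketch above, the operational difference is how many stored templates the live biometric is compared against: one for verification, the whole database for identification. The distance measure and threshold are arbitrary illustrative choices, not how any deployed matcher works.

```python
def parse(template: str) -> list:
    return [float(x) for x in template.split(",")]

def distance(a: str, b: str) -> float:
    """Mean absolute difference between two templates (toy similarity measure)."""
    xs, ys = parse(a), parse(b)
    return sum(abs(x - y) for x, y in zip(xs, ys)) / len(xs)

THRESHOLD = 0.05  # arbitrary tolerance for sensor noise between scans

def verify(user_id: str, live_template: str) -> bool:
    """1:1 verification: compare the live biometric against one stored template."""
    return distance(live_template, database[user_id]) <= THRESHOLD

def identify(live_template: str):
    """1:N identification: compare the live biometric against every stored template
    and return the closest enrolled identity within the tolerance, if any."""
    closest = min(database, key=lambda uid: distance(live_template, database[uid]))
    return closest if distance(live_template, database[closest]) <= THRESHOLD else None

live = make_template([14, 198, 35, 150])   # a fresh, slightly noisy scan of the same finger
print(verify("alice", live))               # True: one comparison against one record
print(identify(live))                      # "alice": the whole database is searched
```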

Scientific Representations of Biometrics (p.32)
The number of reported failures in biometric technologies demonstrates the fallibility of any technology that takes as its starting assumption the consistency of bodily identity.
[…]
It is necessary to investigate those instances when biometric technologies fail and to ask what their failures tell us. The moments in which they fail are useful for identifying the assumptions upon which they rely and the cultural context they encode.
(p.39) These experiments represent only a few of the many biometric studies that classify individuals based on bodily identity, each using slightly different methods or a different data set. The initial labeling of the images by gender or race in order to train the computer system must be done manually by the scientists themselves. The following statement is standard, demonstrating that so-called physiognomic differences are key: “The ground truth labels for gender and ethnic origin were determined by visual inspection after the images were collected” (Gutta et al. 2000:198). That is, scientists themselves decide on the gender and race of an individual before using algorithms to train their computers to do the same. […] The ways that biometric technologies identify racialized and gendered bodies differently are known to scientists. From comparisons of the impact of biometric scanners that could identify some “races” better than others (Tanaka et al. 2004) to attempts to teach biometric systems to classify gender (Ueki et al. 2004), these studies help to explain biometric failures and their connection to race and gender identities. The papers just reviewed rely on racist research that has long been debunked, while ignoring the host of empirical work of the past century on the complexity of bodily identity.
(p.46) As these scientists label the images according to their own understandings of biological race and gender identities, preconceptions about gender and race are codified into the biometric scanners from the beginning. These assumptions that women are more likely to have long hair and men are more likely to wear ties, or that the “Mongolian race” has a certain type of eyelid, are reified and then programmed into what David Lyon (2003) calls a “technology of social sorting”. Biometric scientists base their studies on dangerous, racist understandings of identity from the 1930s and earlier and ignore the past thirty years of research that has definitively disproved any biological theory of race.
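The mechanism the passage describes, hand-assigned “ground truth” becoming what the system reproduces, can be made concrete with a deliberately tiny sketch. The data, labels, and nearest-centroid classifier below are all hypothetical; the point is only that a trained classifier can never output anything other than the categories the labelers decided on by visual inspection.

```python
# Hypothetical illustration only: a toy nearest-centroid classifier trained on
# labels assigned "by visual inspection." Whatever partition the labelers chose
# is the only thing the trained system can ever reproduce.
import numpy as np

def train_centroids(features: np.ndarray, labels: list) -> dict:
    """Compute one mean feature vector per hand-assigned label."""
    return {
        label: features[np.array([l == label for l in labels])].mean(axis=0)
        for label in set(labels)
    }

def classify(centroids: dict, x: np.ndarray) -> str:
    """Return the hand-assigned label whose centroid is closest to x."""
    return min(centroids, key=lambda label: np.linalg.norm(x - centroids[label]))

# Invented image features; the labels record the researchers' judgment,
# not a fact about the people photographed.
features = np.array([[0.2, 0.9], [0.3, 0.8], [0.7, 0.1], [0.8, 0.2]])
labels = ["group_A", "group_A", "group_B", "group_B"]
centroids = train_centroids(features, labels)
print(classify(centroids, np.array([0.25, 0.85])))  # echoes the labelers' scheme: "group_A"
```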

Summary (p.49)
Scientific studies that rely upon biological understandings of gender and race represent some of the most egregious failures of biometric technologies. Any technology that takes as its premise the assumption that bodies are stable entities that can be reliably quantified is problematic. Relying upon erroneous biological understandings of race and gender in the development of biometric technologies has a number of ramifications, from the marginalization of transgendered bodies to facilitating forms of mechanized racial profiling. Like other identification technologies before them, biometric technologies are deployed in ways that remind us of other racist regimes premised on similar strategies of racialized and gendered classification.
Using technology to tell us “truths” about the body never reveals the stable narratives we are hoping for. Biometric technologies cannot be counted on effectively and definitively to identify any bodies. However, as these technologies are specifically deployed to identify suspect bodies, the impact of technological failure manifests itself most consistently in othered communities. Representing these new identification technologies as able to circumvent cultural assumptions and subjective human judgment does not make it so. Rather, biometric technologies are always already inflected by the cultural context in which they are produced. Biometrics are marketed as able to eliminate systemic forms of discrimination at the same time as they are produced in a context marked by the persistence of problematic assumptions about difference. That is, the rhetoric of scientific neutrality masks their racist, sexist, homophobic, ableist, and classist practices. Given the context in which they are developed, it is unsurprising that biometric technologies are imagined as able definitively to identify suspect bodies. Nor is it surprising, given cultural assumptions about othered bodies, that these assumptions are both explicitly and implicitly coded into the technologies. Biometrics fail most often and most spectacularly at the very objective they are marketed as being able to accomplish. Race and gender identities are not nearly as invisible to new identification technologies as is claimed. The technological fallibility of biometrics manifests itself practically in their disproportionate failure at the intersection of racialized, queered, gendered, classed, and disabled bodies, as they represent the latest attempt to definitively tie identity to the body. Rather than telling stories of mechanical objectivity, race neutrality, and the guaranteed detection of formerly invisible bodies, biometric technologies continue to tell stories heavily inflected by the intersection of bodily identities.

Biometrics and policing are not strangers to each other.
Ann Cavoukian, privacy commissioner of Ontario, Biometrics and Policing: Comments from a privacy perspective. (p.51: I-Tech and the beginnings of biometrics)

(p.92)
Although the border between Canada and the United States is described as self-evident, in fact attempts to use biometric technologies to make the border visible demonstrate that the line between the two nations is not as straightforward as it is imagined to be. Like the construction of biometric technologies themselves, the process of making the border visible depends upon practices of inscription, reading, and interpretation that are assumed to be transparent and yet remain complex, ambiguous, and inherently problematic.

(p.123)
An example of corporeal fetishism (Donna Haraway), biometric technologies make the body into a “thing”, a governable entity whose compliance is inevitable. Biometric technologies usefully render “the body as a kind of accessible digital map, something easily decipherable, understandable, containable—a body that is seemingly less mysterious than the body that is popularly conceived and individually experienced” (Sturken and Cartwright 2001:302). Standardizing bodies into binary code in a process of corporeal fetishism, biometric scientists construct a simplified material body that does not acknowledge the ways that this binary map of the body reflects the cultural context in which it was developed. And yet biometric representations of the body reflect the contemporary obsession with digital representations of the body, including a contemporary aesthetics of digitization, as well as a contemporary cultural moment preoccupied with identifying suspect Canadian bodies in the name of security. Rendering these bodies as code additionally suits the needs of what the communications theorist Dan Schiller (1999) calls “digital capitalism”. The flimsy material body is rendered rugged as biometric technologies make bodies replicable, transmittable, and segmentable—breaking the body down into its component parts (from retina to fingerprint) in ways that allow it to be marked more easily in the transnational marketplace, either as a security risk or a desirable consumer.

The knowledge generated by the use of biometrics to test identity is asked to do the cultural work of stabilizing identity […]

Offering to redefine social problems as scientific ones (Harding 1993:15), biometrics portray old inequities in new ways.

[…] shaped by a discourse of technological neutrality and efficiency.
[…] the process of making something visible is never as straightforward as the proponents of visualization technologies, whether cartographers, doctors, or biometric industry representatives, claim it to be. Representations of biometric technologies tend to depict them working perfectly, giving credence to industry assertions that biometric technologies can reliably identify individual identities beyond the shadow of a doubt. Yet rather than operating under the aegis of mechanical objectivity, biometric technologies bring to life assumptions about bodily identity, including race, class, gender, sexuality, and disability.

(p.128)
Biometric discourse produces new forms of scopophilia, pleasure in looking. I coin the term surveillant scopophilia to show that the new forms of pleasure in looking produced by biometric technologies are tied both to the violent dismembering of bodies marked by racialized, gendered, classed, and sexualized identities and to pleasure in having anxieties about security resolved by biometric surveillance.

Using biometric technologies as a form of surveillance of suspect bodies and then reducing those suspect bodies into their component parts helps to allay the anxieties created around security through surveillance and dismemberment.

(p. 131)
Witwer: Flaws

[…]understanding scientific practice as an interpretative act.

(p. 133)
the technofantastic

Like the biometric image itself, a satisfyingly contained series of zeroes and ones, the biometric match seems to guarantee our security—closing up security holes with a satisfying clink—serving simultaneously to smooth and to identify. Here surveillant scopophilia is a visual process that serves to smooth our anxieties about security, as these visual representations suggest that those bodies that threaten our security will be reliably identified.

(p. 150)
With this new search technology, bodies are imagined as stable entities that can reliably give us definitive proof of identity, creating processes of social stratification in which “material and technological infrastructures divide populations” by gender, race, class, and other axes of identity (Monaghan 2010:83). Yet biometric mismatches due to mechanical failures and the technology’s inability to work objectively dispute such stability.

(p. 151)
We must also think of the intensification of existing inequalities as failures.

Given the devastating consequences of biometric errors for human lives, we need a language that is not restricted to technical terms. Biometric failures as complex sorrows is a beginning.

[…]what Haraway and Goodeve (2000:115) call “denaturalization… when what is taken for granted can no longer be taken for granted precisely because there is a glitch in the system.”

[…]while cultural theorists generally understand the basics of the scientific objects they study, the same cannot be said for scientists’ knowledge of cultural theory.

[…] within biometric science we are witnessing a return to the long-discredited fields of anthropometry and physiognomy, serving as a reminder that the pursuit of scientific knowledge is always already bound up with culture.

before formulating further policies regulating biometric technologies, it is imperative that we critically interrogate the assumptions upon which these technologies are based, the limits of any technology to address the larger context of inequality, and the complex relationships of these technologies to their existing cultural context. Moreover, the conditions under which they are produced need to be examined further, as does who gets to participate in conversations about their expansion and development and who is notably absent from these discussions. […] We need to continue to investigate the investment of billions of dollars in technological solutions to social problems.

We may also understand biometric representations of the body as a map of the contemporary social moment, both producing and reflecting its enduring inequalities, prejudices, and competing values, as well as engraved with continued resistance and hope.
We do not need a perfect technology for representing the body—there is no such technology. Rather, we need to think critically about how security imperatives and the development of new technologies are trumping a commitment of the state to address poverty and the perpetuation of intersecting forms of discrimination. In speaking about the consequences of corporeal fetishism, Donna Haraway (1997:160) argues, “We need a critical hermeneutics of genetics as a constitutive part of scientific practice more urgently than we need better map resolution for genetic markers.” The same is true for biometric technologies. Technological solutions to social problems have tended to take an approach characterized by the prioritization of security over substantive equality, global lockdown over emancipation, and an uncritical “more is better” approach to new technologies. We need the formulation of technological policy based on principles of inclusiveness and which facilitates substantive claims to equality. Otherwise, as we have seen, in offering to redefine social problems as scientific ones, biometric discourse will simply portray old inequalities in new ways.