<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.4 20241031//EN" "JATS-journalpublishing1-4.dtd">
<article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="research-article" dtd-version="1.4" xml:lang="en">
  <front>
    <journal-meta>
      <journal-id journal-id-type="publisher-id">jss</journal-id>
      <journal-title-group>
        <journal-title>Open Journal of Social Sciences</journal-title>
      </journal-title-group>
      <issn pub-type="epub">2327-5960</issn>
      <issn pub-type="ppub">2327-5952</issn>
      <publisher>
        <publisher-name>Scientific Research Publishing</publisher-name>
      </publisher>
    </journal-meta>
    <article-meta>
      <article-id pub-id-type="doi">10.4236/jss.2026.143002</article-id>
      <article-id pub-id-type="publisher-id">jss-149983</article-id>
      <article-categories>
        <subj-group>
          <subject>Article</subject>
        </subj-group>
        <subj-group>
          <subject>Business</subject>
          <subject>Economics</subject>
          <subject>Social Sciences</subject>
          <subject>Humanities</subject>
        </subj-group>
      </article-categories>
      <title-group>
        <article-title>Beyond Recognition: The Complexities of Biometrics and Minority Rights</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <name name-style="western">
            <surname>Fattoumi</surname>
            <given-names>Fatma</given-names>
          </name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
      </contrib-group>
      <aff id="aff1"><label>1</label> Laboratory of Language and Cultural Forms, Department of English, Higher Institute of Languages of Tunis, University of Carthage, Carthage, Tunisia </aff>
      <author-notes>
        <fn fn-type="conflict" id="fn-conflict">
          <p>The author declares that there are no conflicts of interest regarding the research, authorship, and/or publication of this work.</p>
        </fn>
      </author-notes>
      <pub-date pub-type="epub">
        <day>03</day>
        <month>03</month>
        <year>2026</year>
      </pub-date>
      <pub-date pub-type="collection">
        <month>03</month>
        <year>2026</year>
      </pub-date>
      <volume>14</volume>
      <issue>03</issue>
      <fpage>15</fpage>
      <lpage>35</lpage>
      <history>
        <date date-type="received">
          <day>12</day>
          <month>12</month>
          <year>2025</year>
        </date>
        <date date-type="accepted">
          <day>03</day>
          <month>03</month>
          <year>2026</year>
        </date>
        <date date-type="published">
          <day>06</day>
          <month>03</month>
          <year>2026</year>
        </date>
      </history>
      <permissions>
        <copyright-statement>© 2026 by the authors and Scientific Research Publishing Inc.</copyright-statement>
        <copyright-year>2026</copyright-year>
        <license license-type="open-access">
        <license-p>This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (<ext-link ext-link-type="uri" xlink:href="https://creativecommons.org/licenses/by/4.0/">https://creativecommons.org/licenses/by/4.0/</ext-link>).</license-p>
        </license>
      </permissions>
      <self-uri content-type="doi" xlink:href="https://doi.org/10.4236/jss.2026.143002">https://doi.org/10.4236/jss.2026.143002</self-uri>
      <abstract>
        <p>The ubiquitous use of biometric technology in contemporary society, from unlocking smartphones to enforcing border control and law, has raised concerns regarding its impact on minority rights. In particular, the Arab Muslim minority has been identified as a vulnerable group subject to potential violations of privacy, bias, and discrimination. This paper aims to critically examine the complexities of biometric technology and its impact on the Arab Muslim minority, drawing on US-specific examples. Our analysis reveals how biometric technology can perpetuate existing biases and marginalize Arab Muslim voices, thereby exacerbating issues of inequity and inclusivity. We argue that a democratic approach to biometric technology is crucial, emphasizing the principles of transparency, accountability, and individual rights. Through a nuanced understanding of the intricacies of biometric technology and its implications for the Arab Muslim minority, we can advance towards a more just and equitable society that upholds the rights of all individuals, regardless of their ethnic or religious affiliation.</p>
      </abstract>
      <kwd-group kwd-group-type="author-generated" xml:lang="en">
        <kwd>Biometric Technology</kwd>
        <kwd>Arab Muslim Minority</kwd>
        <kwd>Bias and Discrimination</kwd>
        <kwd>Privacy</kwd>
        <kwd>Democratic Oversight</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec1">
      <title>1. Introduction</title>
      <p>Our bodies are the most immediate markers of identity, mediating both self-recognition and social perception ([<xref ref-type="bibr" rid="B38">38</xref>]). They carry the traces of lived experience, signal the passage of time, and disclose aspects of our existence—from vulnerability to mortality—that we may wish to conceal. Increasingly, these corporeal characteristics have been appropriated by states as mechanisms of verification. Advances in biometric technology—systems that identify individuals through fingerprints, facial features, iris patterns, voiceprints, or behavioral markers such as gait—have transformed the body into a site of governance and surveillance ([<xref ref-type="bibr" rid="B25">25</xref>]; [<xref ref-type="bibr" rid="B46">46</xref>]). From unlocking smartphones to border control and law enforcement, biometric systems are promoted for efficiency and security, yet they also encode and reproduce existing social hierarchies ([<xref ref-type="bibr" rid="B27">27</xref>]; [<xref ref-type="bibr" rid="B28">28</xref>]; [<xref ref-type="bibr" rid="B24">24</xref>]; [<xref ref-type="bibr" rid="B20">20</xref>]; [<xref ref-type="bibr" rid="B14">14</xref>]).</p>
      <p>For Arab Muslim minorities in the United States, these technologies operate at the intersection of visibility and vulnerability ([<xref ref-type="bibr" rid="B12">12</xref>]; [<xref ref-type="bibr" rid="B32">32</xref>]; [<xref ref-type="bibr" rid="B39">39</xref>]). Algorithmic misidentification, intensified scrutiny, and opaque security processes render these communities disproportionately exposed to risk, raising profound ethical and political concerns. Framed through Foucauldian biopolitics, Agamben’s notion of bare life, and Mbembe’s necropolitics, biometric governance emerges not merely as a technical system but as a mechanism through which bodies are classified, rendered legible, and subjected to differential power ([<xref ref-type="bibr" rid="B34">34</xref>]). The body becomes simultaneously visible and precarious, recognized yet contingent, alive yet governed by data-driven authority. Despite this rich theoretical landscape, empirical engagement with the lived experience of Arab Muslim minorities under the specific ‘digital border’ regimes of the mid-2020s remains fragmented. While biopolitics and necropolitics provide the grammar for understanding state power, there is an urgent need to synthesize these frameworks with a concrete analysis of current U.S. regulatory gaps and algorithmic biases.</p>
      <p>This study investigates the impact of biometric technology on Arab Muslim minorities in the US, asking: 1) To what extent do biometric infrastructures facilitate “digital epidermalization” and “failed mobility” for Arab Muslim minorities in the United States? 2) How does the transition toward “second-generation” behavioral biometrics institutionalize a “pre-crime” logic that disproportionately targets communities previously “stained” by post-9/11 scrutiny? 3) In the absence of federal regulation, how do “techno-authoritarian imaginaries” and “social sorting” mechanisms consolidate state power at the expense of minority democratic agency? By integrating critical theory with empirical analysis, this research seeks to illuminate the ethical, political, and social stakes of biometric governance, offering strategies to align technological innovation with democratic values.</p>
    </sec>
    <sec id="sec2">
      <title>2. Literature Review</title>
      <sec id="sec2dot1">
        <title>2.1. Arab Muslim Minorities in the United States</title>
        <p>In an academic context, the positioning of Arab Muslim minorities in the United States is defined by a unique intersection of ethno-linguistic heritage and religious affiliation. This dual identity distinguishes them both from Arab Christians, who share an ethnic bond but different ontological frameworks, and from the broader “Global Ummah,” which encompasses diverse non-Arab populations ([<xref ref-type="bibr" rid="B37">37</xref>]).</p>
        <p>Sociologically, this group often exercises a sense of cultural stewardship. Because the Quran was revealed in Arabic, many Arab Muslims view themselves as the foundational custodians of Islamic tradition, creating a synthesis where language and theology are inextricably linked ([<xref ref-type="bibr" rid="B22">22</xref>]). In the American diaspora, this manifests as a claim to cultural authority within religious spaces, as communities strive to preserve linguistic purity against the pressures of Western assimilation ([<xref ref-type="bibr" rid="B33">33</xref>]).</p>
        <p>However, this identity has been profoundly reshaped by the post-9/11 landscape. Since the 2001 attacks, Arab Muslims have been systematically “stained” by the specter of terrorism, subjected to intensified state scrutiny and public suspicion. This hyper-surveillance has forced the community to navigate a dual existence: acting as internal guardians of cultural authenticity while simultaneously defending their right to belong in a society that frequently views their identity through a lens of inherent threat ([<xref ref-type="bibr" rid="B19">19</xref>]; [<xref ref-type="bibr" rid="B46">46</xref>]).</p>
      </sec>
      <sec id="sec2dot2">
        <title>2.2. Overview of Biometric Technology and Its Various Applications</title>
        <p>Biometric technology refers to the use of unique biological characteristics to identify individuals ([<xref ref-type="bibr" rid="B23">23</xref>]). It has become an increasingly prevalent tool in many aspects of modern society, from unlocking smartphones to border control and law enforcement. Biometric technologies include fingerprint scanning, facial recognition, voice recognition, and iris scanning, among others ([<xref ref-type="bibr" rid="B13">13</xref>]; [<xref ref-type="bibr" rid="B24">24</xref>]). Facial recognition technology, in particular, has been widely adopted in various industries, including law enforcement, financial services, and retail. For instance, the United Arab Emirates (UAE) government has implemented a facial recognition system to monitor and track the movement of its citizens and residents ([<xref ref-type="bibr" rid="B46">46</xref>]). In the United States, facial recognition technology has been used by law enforcement agencies to identify suspects and monitor public spaces ([<xref ref-type="bibr" rid="B19">19</xref>]). While the potential benefits of biometric technology are clear, concerns have been raised about its implications for individual privacy and security ([<xref ref-type="bibr" rid="B14">14</xref>]). Biometric data is often collected without individuals’ informed consent, and there are no clear guidelines for how this data can be used, shared, or stored. Additionally, unlike a password, biometric data cannot be changed once it is stolen or compromised, potentially leading to identity theft or other forms of fraud ([<xref ref-type="bibr" rid="B24">24</xref>]).</p>
        <p>Moreover, biometric technology has the potential to perpetuate existing biases and discrimination against marginalized communities, including Arab Muslim minorities ([<xref ref-type="bibr" rid="B42">42</xref>]). Studies have shown that facial recognition technology can be biased against people of color, leading to misidentification and false arrests ([<xref ref-type="bibr" rid="B8">8</xref>]). This can have serious implications for Arab Muslim minorities who may already face discrimination and racial profiling in their daily lives.</p>
        <p>Given the increasing prevalence of biometric technology and its potential impact on Arab Muslim minorities, it is crucial to examine the various ways in which it can affect their lives and well-being. Through a nuanced understanding of the complexities of biometric technology, we can work towards a more just and equitable society that upholds the rights of all individuals, regardless of their race, ethnicity, or background.</p>
        <sec id="sec2dot2dot1">
          <title>2.2.1. Impact of Biometric Technology on Minority Communities</title>
          <p>The scholarly discourse on biometric technologies reveals a persistent tension between the promise of “objective” security and the reality of systemic exclusion. While these systems are often marketed as neutral tools of efficiency, research consistently demonstrates that they function as engines of differential surveillance, particularly for minority communities ([<xref ref-type="bibr" rid="B42">42</xref>]). The large-scale harvesting of biometric data—facial templates, iris scans, and fingerprints—represents more than a privacy risk; it constitutes a fundamental shift in how the state manages bodies ([<xref ref-type="bibr" rid="B5">5</xref>]).</p>
        </sec>
        <sec id="sec2dot2dot2">
          <title>2.2.2. From Algorithmic Bias to Digital Epidermalization</title>
          <p>At the technical level, the myth of algorithmic neutrality is debunked by the “accuracy gap.” Empirical studies, most notably by [<xref ref-type="bibr" rid="B21">21</xref>], have documented that facial recognition systems exhibit significantly higher error rates for specific racial and ethnic groups. For instance, NIST testing has shown that false-positive rates can be up to 100 times higher for West and East African and East Asian faces compared to White faces. These disparities are not merely technical glitches; they are a manifestation of what [<xref ref-type="bibr" rid="B7">7</xref>] identifies as “digital epidermalization.” Epidermalization is a theoretical concept, most famously developed by [<xref ref-type="bibr" rid="B16">16</xref>], to describe how social meanings—especially race, inferiority, and power—become inscribed onto the body, as if they were part of the skin itself. In this framework, the technological “rendering” of the body re-inscribes race as a permanent risk factor, ensuring that the “stain” of post-9/11 suspicion is algorithmically immortalized for Arab Muslim minorities.</p>
        </sec>
        <sec id="sec2dot2dot3">
          <title>2.2.3. The Mechanisms of Social Sorting and Failed Mobility</title>
          <p>The deployment of these technologies in law enforcement and border control—exemplified by the Traveler Verification Service (TVS), deployed by U.S. Customs and Border Protection (CBP) as part of the Biometric Entry-Exit Program and relying mainly on facial recognition technology to verify travelers’ identities—creates an environment of “automated profiling” ([<xref ref-type="bibr" rid="B26">26</xref>]). This infrastructure facilitates what [<xref ref-type="bibr" rid="B46">46</xref>] describes as “social sorting,” a process where biometric IDs are weaponized to regulate ethno-racial hierarchies. By categorizing individuals based on data-driven “risk metrics,” the state effectively justifies the “failed mobility” of those who fall outside Western secular norms ([<xref ref-type="bibr" rid="B2">2</xref>]). This contributes to a broader “techno-authoritarian imaginary” where surveillance is used not just to catch criminals, but to manage “anomalous” populations ([<xref ref-type="bibr" rid="B40">40</xref>]).</p>
        </sec>
        <sec id="sec2dot2dot4">
          <title>2.2.4. The Pre-Crime Logic of Second-Generation Biometrics</title>
          <p>The theoretical landscape is further complicated by the emergence of “second-generation” biometrics. As [<xref ref-type="bibr" rid="B42">42</xref>] argue, the shift from verifying identity to predicting behavioral intent institutionalizes a dangerous “pre-crime” logic. By attempting to algorithmically “read” internal psychological states, the state subjects Arab Muslim communities—already under intense scrutiny—to a form of automated judgment that suppresses democratic agency and moral autonomy.</p>
          <p>By situating biometric systems within this critical socio-technical framework, this study moves beyond a simple analysis of privacy. It examines how these technologies intersect with the historical marginalization of Arab Muslim minorities, emphasizing both the risks of bias and the potential for inclusive governance, and asks whether a system built on biased “Street-Level Algorithms” can ever truly align with equitable governance, civil liberties, and minority rights.</p>
        </sec>
      </sec>
    </sec>
    <sec id="sec3">
      <title>3. Theoretical Framework: Biopolitics, CRT, and the Architecture of the Digital Border</title>
      <p>The integration of biometric technologies into the modern state apparatus cannot be understood as a mere upgrade in administrative efficiency. Rather, it represents a fundamental shift in the relationship between the human body, sovereign power, and political belonging ([<xref ref-type="bibr" rid="B1">1</xref>]). This study utilizes a multi-layered theoretical framework to analyze how these systems impact Arab Muslim minorities, moving from the normative protections of international law to the critical insights of biopolitical theory and Critical Race Theory (CRT).</p>
      <sec id="sec3dot1">
        <title>3.1. The Normative Foundation and the Paradox of Recognition</title>
        <p>Minority status in international law is not defined by simple arithmetic, but by the asymmetric power relations between a group and the state. As Francesco Capotorti established in his foundational 1977 formulation, a minority is characterized by its “non-dominant” position and a collective “will to survive” through the preservation of distinct cultural markers ([<xref ref-type="bibr" rid="B36">36</xref>]). For decades, Arab Muslim communities in the United States navigated a legal “gray zone,” classified as “White” for census purposes—a status that provided a surface-level promise of inclusion while facilitating a form of statistical erasure ([<xref ref-type="bibr" rid="B37">37</xref>]).</p>
        <p>[<xref ref-type="bibr" rid="B3">3</xref>] notes that the 2024 implementation of the Middle Eastern or North African (MENA) census category marked a pivotal victory for visibility, allowing for targeted civil rights enforcement and equitable redistricting. However, this visibility is inherently paradoxical. As these communities become more legible to the law, they simultaneously become “hyper-legible” to a security state. This visibility does not always translate into recognition; instead, it often facilitates exposure to a “digital border”—a pervasive infrastructure of biometric surveillance and “continuous vetting” that encodes religious and ethnic identity as a permanent risk factor.</p>
      </sec>
      <sec id="sec3dot2">
        <title>3.2. Critical Race Theory and the Algorithmic Reproduction of Inequality</title>
        <p>To understand why biometric systems disproportionately flag certain bodies, this research draws on Critical Race Theory (CRT). CRT posits that racism is not merely an episodic occurrence of individual prejudice but is systemic and embedded within the very structures of law and technology ([<xref ref-type="bibr" rid="B15">15</xref>]). In the context of biometrics, this means that “neutral” algorithms inherit the racial hierarchies of their training data.</p>
        <p>As [<xref ref-type="bibr" rid="B8">8</xref>] demonstrated, facial recognition systems exhibit significantly higher error rates for darker skin tones and can be confounded by religious attire like the hijab. Within a CRT framework, these are not “glitches” to be fixed by better code; they are technological inscriptions of historical marginalization. For the Arab Muslim subject, a recurring “false positive” or a “black box” security flag is an algorithmic manifestation of a pre-existing field of suspicion, transforming a technical failure into a lived vulnerability.</p>
      </sec>
      <sec id="sec3dot3">
        <title>3.3. Foucauldian Biopolitics and the Governance of “Bare Life”</title>
        <p>In international law and critical theory, the most profound implications of biometric governance are revealed through Michel Foucault’s concept of biopolitics—the technologies through which the state manages and optimizes populations by making the body a site of political intervention ([<xref ref-type="bibr" rid="B17">17</xref>]: p. 170). Biometrics go beyond mere identification; they “administer” the body, rendering it measurable and comparable within vast databases. For securitized communities, this overlays a “genealogy of power” where the state does not repress through overt force, but through constant, automated observation and categorization.</p>
        <p>This biopolitical management often pushes the minority subject into what Giorgio Agamben describes as the “state of exception” or “bare life” ([<xref ref-type="bibr" rid="B1">1</xref>]: p. 83). In these zones—most visible at border crossings and in opaque “No-Fly” lists—legal recognition is suspended and rights become contingent upon the scan of a retina or a fingerprint. When an individual is flagged by a proprietary algorithm with no path for appeal, they are reduced to a biological data point, existing simultaneously inside the law’s reach but outside its protection ([<xref ref-type="bibr" rid="B1">1</xref>]: p. 171).</p>
      </sec>
      <sec id="sec3dot4">
        <title>3.4. Necropolitics and the Distribution of Precarity</title>
        <p>Achille Mbembe explains necropower “as the capacity to control the life and death of citizens, because sovereignty has the power to exclude a community from the vast population, leaving them in a status of social death” ([<xref ref-type="bibr" rid="B45">45</xref>]). Mbembe’s concept of necropolitics ([<xref ref-type="bibr" rid="B31">31</xref>]) illuminates the ultimate stakes of this technological expansion. In authoritarian or highly securitized contexts, biometric databases become tools for administering precarity. The power to track, immobilize, or selectively exclude specific populations allows the state to decide whose movement is “liquid” and whose is “blocked.” For Arab Muslim minorities, the convergence of biopolitical management and necropolitical exposure produces a precarious form of belonging, where security is a privilege and suspicion is permanently encoded into their digital double.</p>
        <p>A critical engagement with biometric technology must move beyond questions of technical efficiency. It must confront the ethical reality of governing through the body. Without a philosophical commitment to resisting these automated hierarchies, biometric innovation threatens to normalize a political order where justice is permanently subordinated to the imperatives of algorithmic security.</p>
      </sec>
    </sec>
    <sec id="sec4">
      <title>4. Methodology</title>
      <p>The present study employs a qualitative research approach, utilizing a case study design to facilitate an in-depth examination of the impact of biometric technology on Arab Muslim minorities in the United States. A case study methodology was chosen for its capacity to generate rich, contextualized insights into complex socio-technical phenomena and to illuminate the interactions between technology, social structures, and minority rights ([<xref ref-type="bibr" rid="B44">44</xref>]).</p>
      <p>Data collection draws upon multiple sources to ensure methodological rigor and strengthen the credibility of the findings. These sources include a comprehensive review of scholarly literature (n = 6) and semi-structured interviews conducted with six experts (N = 6): specialists in biometric technologies (n = 2), civil rights (n = 2), and minority rights advocacy (n = 2). Interviews are carried out either in person or via secure video conferencing platforms, depending on participant availability, and follow an interview protocol designed to elicit detailed, contextually grounded responses.</p>
      <p>Data analysis integrates thematic analysis for literature and document review with a grounded theory approach for interview data, enabling both the identification of emergent patterns and the development of theoretically informed interpretations ([<xref ref-type="bibr" rid="B6">6</xref>]; [<xref ref-type="bibr" rid="B11">11</xref>]). NVivo software is employed for systematic coding, organization, and retrieval, ensuring rigorous management of the qualitative dataset. This multi-method approach allows for a comprehensive and nuanced understanding of the complex relationship between biometric technologies and minority rights, while promoting analytical rigor and transparency.</p>
      <p>Ethical considerations are central to the study design. All participants provided informed consent, and strict measures were taken to maintain confidentiality and protect data. Ethical approval was obtained from the relevant institutional review boards prior to the commencement of data collection. Several limitations are acknowledged. First, data concerning biometric technologies and Arab Muslim minorities are limited, reflecting the sensitivity of the topic and potential reluctance among participants to share personal experiences. Second, the literature review is restricted to English-language publications from the past fifteen years, which may exclude relevant findings from non-English or older sources. Third, potential bias in the selection of data sources and analytical procedures is recognized; this is mitigated through methodological triangulation, the inclusion of diverse data sources, and adherence to systematic coding and analytic protocols.</p>
      <p>For the literature review, searches were conducted across databases including Google Scholar, JSTOR, and ProQuest using a combination of keywords, including: “biometric technology AND Arab Muslim minorities,” “biometric technology AND privacy AND Arab Muslim minorities,” “biometric technology AND security AND Arab Muslim minorities,” “biometric technology AND bias AND Arab Muslim minorities,” “biometric technology AND discrimination AND Arab Muslim minorities,” and “biometric technology AND democracy AND Arab Muslim minorities.” Additional related terms were also incorporated to maximize coverage. Articles were screened for relevance based on the research objectives and inclusion criteria, and findings were synthesized to inform the analytical framework of the study. By employing a rigorous, multi-source, and ethically grounded research design, this study aims to generate reliable, contextually sensitive insights into the socio-technical and ethical dimensions of biometric technology as they relate to the Arab Muslim minority in the U.S., contributing to both scholarly knowledge and policy discourse.</p>
      <table-wrap id="tbl1">
        <label>Table 1</label>
        <caption><p>Inclusion and exclusion criteria of eligible works.</p></caption>
        <table>
          <thead>
            <tr>
              <th>Inclusion</th>
              <th>Exclusion</th>
            </tr>
          </thead>
          <tbody>
            <tr>
              <td>Published between 2010 and 2025</td>
              <td>Published prior to 2010</td>
            </tr>
            <tr>
              <td>Published in English</td>
              <td>Published in a language other than English</td>
            </tr>
            <tr>
              <td>Primary research articles pertaining to the deployment of biometric technologies</td>
              <td>Non-primary research articles (e.g., editorials, opinion pieces)</td>
            </tr>
            <tr>
              <td>Related to biometric technologies and racial bias</td>
              <td>Not related to biometric technologies and racial bias</td>
            </tr>
          </tbody>
        </table>
      </table-wrap>
      <p>As <bold>Table 1</bold> illustrates, we limited our search to articles published in English within the last 15 years to ensure the most up-to-date research was included. After applying the relevant filters and refining our search terms, we reviewed and selected sources relevant to our research question and objectives, reading the abstract or summary of each source to determine its relevance and importance to the study.</p>
    </sec>
    <sec id="sec5">
      <title>5. Results</title>
      <p>The multi-method qualitative analysis indicates that biometric technologies systematically reproduce and exacerbate social inequities, disproportionately impacting Arab Muslim minorities in the United States. Drawing on critical race theory and Foucault’s notion of surveillance as a mechanism of power/knowledge, these technologies operate not merely as neutral technical tools but as instruments that embed existing socio-political hierarchies within ostensibly objective systems ([<xref ref-type="bibr" rid="B17">17</xref>]; [<xref ref-type="bibr" rid="B12">12</xref>]).</p>
      <p>The results demonstrate that the deployment of biometric systems functions as a mechanism of differential surveillance. This scrutiny is a direct legacy of the post-9/11 security landscape, where Arab and Muslim identities became systematically “stained” by a persistent association with threat. This environment has fostered what [<xref ref-type="bibr" rid="B7">7</xref>] identifies as “digital epidermalization,” where the historical practice of categorizing bodies based on race is modernized through data. By “rendering” race through biometric markers, the state effectively automates suspicion, turning the body itself into a site of constant inquiry and commodifying it as a digital “brand.”</p>
      <p>This automation is most visible in the “accuracy gap” identified throughout the literature and confirmed by expert interviews. While facial recognition technology (FRT) performs with near-perfect accuracy for white males—with error rates as low as 0.8%—the failure rate for darker-skinned individuals can exceed 34%, particularly for those with non-Western morphologies or those wearing religious attire such as the hijab ([<xref ref-type="bibr" rid="B8">8</xref>]). Expert interviews (n = 6) corroborated these findings, with participants across the technology and civil rights sectors noting that such technical failures translate into a lived reality of “failed mobility” ([<xref ref-type="bibr" rid="B2">2</xref>]). In this context, the “rhetorical screening” marketed as efficient security actually functions as a digital wall, trapping refugees and Arab travelers in loops of secondary screenings and “high-risk” flagging. As [<xref ref-type="bibr" rid="B26">26</xref>] observe, systems like the Traveler Verification Service (TVS) rely on “Street-Level Algorithms” that suffer from significant performance drops in real-world conditions, further facilitating “social sorting” ([<xref ref-type="bibr" rid="B46">46</xref>]).</p>
      <sec id="sec5dot1">
        <title>5.1. Behavioral Prediction and Techno-Authoritarianism</title>
        <p>The data reveal an escalating danger in the transition toward “second-generation” biometrics ([<xref ref-type="bibr" rid="B42">42</xref>]), which shifts the focus from identity verification to behavioral prediction. By attempting to “read” intent or psychological states, these systems introduce a “pre-crime” logic that is deeply susceptible to cultural misinterpretation, violating the moral autonomy of the individual. Experts in minority rights advocacy highlighted that for a community already under intense scrutiny, this reflects a broader “techno-authoritarian imaginary” ([<xref ref-type="bibr" rid="B40">40</xref>]). In this landscape, the use of evolving, non-transparent algorithms allows for a form of democratic regression where the inaccuracy of the technology is not merely a “glitch” but a systemic feature that suppresses the political agency of minorities.</p>
      </sec>
      <sec id="sec5dot2">
        <title>5.2. The Algorithmic Border: Biometric Governance and the Production of Racialized Suspicion</title>
        <p>The scholarly consensus across these six works (<bold>Table 2</bold>) reveals that biometric and facial recognition technologies (FRT) are far from neutral tools of efficiency; instead, they function as sophisticated engines of social sorting and racialized exclusion. At the core of this failure is what [<xref ref-type="bibr" rid="B7">7</xref>] identifies as “digital epidermalization,” where the historical practice of branding and categorizing bodies based on race is updated for the digital age. When algorithms are trained on Western-centric datasets, non-Western facial morphologies and darker skin tones are often “misread” or rendered invisible. Empirical data support this: while error rates for lighter-skinned males are as low as 0.8%, they can soar to over 34% for darker-skinned women ([<xref ref-type="bibr" rid="B8">8</xref>]), leading to a form of technical erasure that systematically denies marginalized individuals access to social goods.</p>
        <table-wrap id="tbl2">
          <label>Table 2</label>
          <caption>
            <p>Comparative framework of biometric vulnerabilities and systemic inefficiencies across scholarly perspectives.</p>
          </caption>
          <table>
            <thead>
              <tr>
                <th>Reference</th>
                <th>Surveillance Practices</th>
                <th>Data Misuse &amp; Risks</th>
                <th>Privacy &amp; Accuracy Issues</th>
                <th>Impact on Minorities &amp; Individuals</th>
              </tr>
            </thead>
            <tbody>
              <tr>
                <td>
                  [
                  <xref ref-type="bibr" rid="B7">7</xref>
                  ]
                </td>
                <td>Digital Epidermalization: Use of biometrics to “render” race through data.</td>
                <td>Transformation of the body into a “stamp of commodity” similar to branding.</td>
                <td>Systems fail to “read” non-white bodies, leading to technical erasure.</td>
                <td>Epistemic violence: Bodies that don’t fit the “norm” are denied mobility and housing.</td>
              </tr>
              <tr>
                <td>
                  [
                  <xref ref-type="bibr" rid="B42">42</xref>
                  ]
                </td>
                <td>Second-Gen Biometrics: Moving from identity to behavioral prediction.</td>
                <td>Function creep: data collected for security is used for “intent” analysis.</td>
                <td>Violation of moral autonomy; individuals lose control over their “digital self.”</td>
                <td>Predictive Policing: Minorities are disproportionately flagged as “suspicious” by intent-logic.</td>
              </tr>
              <tr>
                <td>
                  [
                  <xref ref-type="bibr" rid="B2">2</xref>
                  ]
                </td>
                <td>Rhetorical Screening: Biometrics marketed as “efficient” but used as digital walls.</td>
                <td>Failed promises of mobility; data becomes a tool for entrapment.</td>
                <td>The “certainty” of the digital scan creates a false narrative of the individual.</td>
                <td>Refugee Exclusion: Displaced persons are trapped in cycles of suspicion and “immobility.”</td>
              </tr>
              <tr>
                <td>
                  [
                  <xref ref-type="bibr" rid="B26">26</xref>
                  ]
                </td>
                <td>Airport Biometric Entry/Exit: Integration of TVS (Traveler Verification System).</td>
                <td>Centralized databases create “single points of failure” for data breaches.</td>
                <td>Low matching rates and reliance on “Street-Level Algorithms” instead of humans.</td>
                <td>Automated Profiling: Travelers are sorted by “risk” metrics that favor Western norms.</td>
              </tr>
              <tr>
                <td>
                  [
                  <xref ref-type="bibr" rid="B46">46</xref>
                  ]
                </td>
                <td>Social Sorting in UAE: Using Biometric IDs to regulate ethno-racial hierarchies.</td>
                <td>Integration of health/insurance data into state surveillance registries.</td>
                <td>Claims of “race-neutrality” mask the hardening of ethno-racial tiers.</td>
                <td>Labor Stratification: Migrant workers are “sorted” and monitored to maintain state control.</td>
              </tr>
              <tr>
                <td>
                  [
                  <xref ref-type="bibr" rid="B40">40</xref>
                  ]
                </td>
                <td>Techno-Authoritarian Imaginaries: Anticipatory use of FRT for social control.</td>
                <td>High potential for “democratic regression” via public/private data sharing.</td>
                <td>Use of evolving, opaque algorithms with minimal democratic oversight.</td>
                <td>Marginalized Resistance: Civil society views these as tools for “racial and gendered purging.”</td>
              </tr>
            </tbody>
          </table>
        </table-wrap>
        <p>This technical inaccuracy creates a “failed promise of mobility,” a concept [<xref ref-type="bibr" rid="B2">2</xref>] applies specifically to the refugee experience. For Arab Muslim populations, the border becomes a rhetorical trap: while the technology is sold as a means of “streamlining” travel, the reality is a cycle of false positives and algorithmic suspicion that halts movement rather than facilitating it. [<xref ref-type="bibr" rid="B26">26</xref>] further emphasize that these airport systems rely on “Street-Level Algorithms” that suffer from significant performance drops in real-world conditions, such as varied lighting or the presence of religious headwear like the hijab. This disproportionately flags travelers from the Global South as “high-risk” anomalies due to these inherent biases.</p>
        <p>The danger escalates with the transition to “second-generation” biometrics, which [<xref ref-type="bibr" rid="B42">42</xref>] warn is shifting from mere identity verification to behavioral prediction. By attempting to “read” intent or psychological states, these systems introduce a “pre-crime” logic that is deeply susceptible to cultural misinterpretation. For a community already “stained” by post-9/11 stigmas, this translates into intensified policing of everyday behaviors that an algorithm—designed with a Western secular “default”—marks as suspicious.</p>
        <p>Furthermore, these technologies are frequently weaponized to maintain existing power hierarchies. [<xref ref-type="bibr" rid="B46">46</xref>] illustrates how biometric IDs are used in the UAE to enforce ethno-racial labor stratification, proving that data is rarely just data—it is a tool for “sorting” populations into tiers of citizenship and rights. This reflects the broader “techno-authoritarian imaginary” described by [<xref ref-type="bibr" rid="B40">40</xref>], where the opaque nature of facial recognition allows for a form of democratic regression. In this landscape, the inaccuracy of the technology is not merely a “glitch”; it is a systemic feature that suppresses the political agency of minorities by making public spaces a site of constant, unpredictable scrutiny.</p>
      </sec>
      <sec id="sec5dot3">
        <title>5.3. The Architecture of Targeted Scrutiny: Biometric Governance and Racialized Suspicion</title>
        <p>The deployment of biometric systems in public spaces and at border crossings functions as a mechanism of differential surveillance—one that weighs disproportionately on Arab and Muslim communities ([<xref ref-type="bibr" rid="B19">19</xref>]). Framed through the analysis of the “surveillance society” in [<xref ref-type="bibr" rid="B29">29</xref>], these practices are far from neutral technical measures; they are manifestations of a deep-seated power asymmetry. By subjecting specific minority groups to intensified monitoring, these systems do more than just collect data—they entrench social stigmas and institutionalize what [<xref ref-type="bibr" rid="B18">18</xref>] described as a form of epistemic marginalization.</p>
        <p>Critically, this scrutiny is a direct legacy of the post-9/11 security landscape, where Arab and Muslim identities became systematically “stained” by a persistent association with threat. This environment has birthed what [<xref ref-type="bibr" rid="B7">7</xref>] identifies as “digital epidermalization,” where the historical practice of branding and categorizing bodies based on race is modernized through data. By “rendering” race through biometric markers, the state effectively automates suspicion, turning the body itself into a site of constant inquiry and commodifying it as a digital “brand” for surveillance.</p>
        <p>This automation is most visible in the “accuracy gap” inherent in these systems. While facial recognition technology (FRT) performs with near-perfect accuracy for white males—with error rates as low as 0.8%—the failure rate for darker-skinned individuals can exceed 34%, particularly for those with non-Western morphologies or those wearing religious attire like the hijab ([<xref ref-type="bibr" rid="B8">8</xref>]). This technical failure translates into a lived reality of “failed mobility” ([<xref ref-type="bibr" rid="B2">2</xref>]), where the “rhetorical screening” marketed as efficient security actually functions as a digital wall, trapping refugees and Arab travelers in loops of secondary screenings and “high-risk” flagging.</p>
        <p>As [<xref ref-type="bibr" rid="B26">26</xref>] note, systems like the Traveler Verification System (TVS) at airports rely on “Street-Level Algorithms” that suffer from significant performance drops in real-world conditions. This facilitates a form of “social sorting” ([<xref ref-type="bibr" rid="B46">46</xref>]) where travelers are categorized by “risk” metrics that inherently favor Western norms. In certain contexts, such as the UAE, these biometric IDs are weaponized to manage and enforce ethno-racial labor hierarchies, proving that data is frequently used to maintain tiers of citizenship and exclude those who deviate from the state-defined “norm.”</p>
        <p>The danger escalates with the transition toward “second-generation” biometrics ([<xref ref-type="bibr" rid="B42">42</xref>]), which shifts the focus from identity verification to behavioral prediction. By attempting to “read” intent or psychological states, these systems introduce a “pre-crime” logic that is deeply susceptible to cultural misinterpretation, violating the moral autonomy of the individual. This reflects a broader “techno-authoritarian imaginary” ([<xref ref-type="bibr" rid="B40">40</xref>]), where the use of evolving, non-transparent algorithms without public consent allows for a form of democratic regression. In this landscape, the inaccuracy of the technology is not merely a “glitch”; it is a systemic feature that suppresses the political agency of minorities by making public spaces a site of constant, unpredictable scrutiny.</p>
        <p>Given this reality, the argument from civil rights advocates is clear: the impact of biometric surveillance is too pervasive and its biases too systemic to be “fixed” through minor policy adjustments. To protect fundamental rights and halt the further marginalization of populations who have been under a microscope for decades, a total prohibition on biometric surveillance in public spaces is increasingly viewed as the only viable path forward.</p>
      </sec>
      <sec id="sec5dot4">
        <title>5.4. Data Misuse Risks: From Security to Social Engineering</title>
        <p>Biometric databases, when governed by opaque institutional authorities, function as high-stakes repositories for potential misuse. The results of this study, supported by the scholarly consensus of the six primary articles and the testimonies of interviewed experts, suggest that this data is rarely isolated. As [<xref ref-type="bibr" rid="B46">46</xref>] and [<xref ref-type="bibr" rid="B40">40</xref>] illustrate, biometric information is frequently merged across health, insurance, and labor registries—a practice the interviewed civil rights advocates (n = 2) described as a “digital dragnet.” This interoperability is used to enforce ethno-racial hierarchies, particularly against Arab Muslim populations.</p>
        <p>In both democratic and authoritarian contexts, this “function creep” allows data collected under the guise of “national security” to be repurposed for behavioral management. Interviewed experts in biometric technologies (n = 2) noted that once a body is digitized, it becomes what [<xref ref-type="bibr" rid="B7">7</xref>] calls a “commodity,” or a permanent, searchable “brand.” This aligns with [<xref ref-type="bibr" rid="B34">34</xref>], who argue that such data-leveraging constrains individual freedoms and marginalizes vulnerable populations.</p>
        <p>The shift toward “second-generation” biometrics, as warned by [<xref ref-type="bibr" rid="B42">42</xref>], represents the most significant misuse risk. By repurposing security data to analyze “intent,” institutions transition from monitoring who a person is to predicting what they might do. Minority rights advocates (n = 2) emphasized during interviews that this “pre-crime” logic falls heaviest on Arab Muslim communities, whose cultural and religious expressions are often misinterpreted by the state as indicators of risk.</p>
        <p>Viewed through the lens of [<xref ref-type="bibr" rid="B18">18</xref>] on the disciplinary functions of documentation, these systems act as mechanisms of governance that translate technical observation into total socio-political control. The “certainty” of the digital scan—critiqued by [<xref ref-type="bibr" rid="B2">2</xref>]—replaces the complex reality of the individual with a static narrative of suspicion. Furthermore, as [<xref ref-type="bibr" rid="B26">26</xref>] highlight, the centralization of these databases creates “single points of failure,” where the misuse of data is not just an institutional choice but a systemic vulnerability.</p>
        <p>Ultimately, the consensus among both the literature and the interviewed experts is that the “neutral” veneer of these databases masks their utility as tools for “racial and gendered purging” in public spaces ([<xref ref-type="bibr" rid="B40">40</xref>]). This underscores a reality where the misuse of biometric data is not an anomaly, but a fundamental capability of the system’s design.</p>
      </sec>
      <sec id="sec5dot5">
        <title>5.5. Privacy Violations: The Regulatory Vacuum and the Erosion of Consent</title>
        <p>The findings from this study reveal that privacy violations in biometric deployment are not merely accidental; they are systemic. Expert interviews (n = 6) consistently highlighted that the collection of biometric data occurs almost entirely without informed consent, facilitating a state of prolonged, involuntary tracking. This aligns with the “techno-authoritarian imaginaries” described by [<xref ref-type="bibr" rid="B40">40</xref>], where the public is subjected to evolving, non-transparent algorithms without legislative or public oversight.</p>
        <p>The interviewed civil rights advocates (n = 2) emphasized that this lack of consent effectively strips individuals of their “contextual integrity” ([<xref ref-type="bibr" rid="B35">35</xref>]). When an Arab Muslim woman’s facial data is captured in a public square or at a border, it is often repurposed across contexts she never authorized. This “function creep” is central to the shift toward “second-generation” biometrics identified by [<xref ref-type="bibr" rid="B42">42</xref>]. As these systems move from verifying identity to predicting behavior, they violate the moral autonomy of the individual, as the subject loses control over their own “digital persona.”</p>
        <p>Technically, these violations are compounded by the “Street-Level Algorithms” critiqued by [<xref ref-type="bibr" rid="B26">26</xref>]. During interviews, experts in biometric technologies (n = 2) noted that these systems frequently produce “false negatives” and matching errors for non-Western morphologies. In a regulatory vacuum, these errors lead to unwarranted stops and searches. As [<xref ref-type="bibr" rid="B43">43</xref>] points out, no federal laws currently protect civil rights in the government’s use of FRT. Statutes like the Privacy Act of 1974 are fundamentally ill-equipped to handle the “digital epidermalization” ([<xref ref-type="bibr" rid="B7">7</xref>]) that occurs when race is coded as a permanent risk factor.</p>
        <p>Furthermore, the “rhetorical screening” used at borders, as analyzed by [<xref ref-type="bibr" rid="B2">2</xref>], creates a false sense of security that justifies the suspension of privacy rights. Minority rights advocates (n = 2) noted that for Arab Muslim populations, the border is a site where privacy is completely subsumed by the “logic of the database.” This echoes the findings of [<xref ref-type="bibr" rid="B46">46</xref>] on “social sorting,” where the claim of “technical neutrality” is used to mask the hardening of racial tiers and the systemic violation of the right to anonymity in public spaces.</p>
      </sec>
    </sec>
    <sec id="sec6">
      <title>6. Discussion</title>
      <p>This study demonstrates that biometric technologies do not merely mirror pre-existing social inequalities; they actively produce, stabilize, and legitimize them through automated regimes of surveillance that disproportionately target Arab Muslim minorities in the United States. When examined through the theoretical framework developed in this research, biometric systems emerge as a <italic>dispositif</italic> in the Foucauldian sense—a historically situated constellation of discourses, technologies, institutions, and security rationalities that collectively render certain bodies governable, legible, and persistently suspect ([<xref ref-type="bibr" rid="B17">17</xref>]). Rather than functioning as neutral instruments of identification, biometric technologies materialize post-9/11 security logics into algorithmic infrastructures that subtly but decisively reconfigure political belonging—reshaping who is presumed trustworthy, who is rendered mobile, and who remains perpetually visible under suspicion.</p>
      <p>The findings offer strong empirical support for the concept of digital epidermalization developed in [<xref ref-type="bibr" rid="B7">7</xref>], revealing how racialized and religious difference is encoded directly into data architectures. The documented accuracy gap—most notably the 34.7% error rate affecting darker-skinned individuals—cannot be dismissed as an accidental or transitional technical limitation. Instead, it exemplifies what [<xref ref-type="bibr" rid="B4">4</xref>] identifies as discriminatory design: systems that reproduce racial hierarchies precisely through claims of neutrality, efficiency, and objectivity. In this configuration, algorithmic error does not signal system failure; it operates as a mode of governance. When Arab Muslim bodies are disproportionately misrecognized, the resulting frictions—secondary screenings, delays, repeated verification failures—function as automated sanctions that normalize suspicion within the ordinary experience of movement.</p>
      <p>This logic resonates powerfully with the notion of the epidermalization of inferiority in [<xref ref-type="bibr" rid="B16">16</xref>], now recalibrated for the digital era. Where colonial regimes once inscribed racial difference onto the skin through discourse, surveillance, and coercion, contemporary biometric systems translate that inscription into data points, risk scores, and probabilistic classifications. The body is no longer merely seen; it is computed, transformed into a searchable and extractable surface where cultural and religious difference is continuously interpreted as a signal of threat. Biometric governance, then, does not simply observe racialized subjects—it actively produces them as algorithmic objects of suspicion.</p>
      <p>The study further demonstrates that these effects are sustained through what [<xref ref-type="bibr" rid="B2">2</xref>] conceptualizes as rhetorical screening. While biometric surveillance is publicly framed as a technology of facilitation—promising efficiency, safety, and seamless mobility—the empirical evidence reveals its operation as a technology of exclusion ([<xref ref-type="bibr" rid="B41">41</xref>]). For Arab Muslim travelers, the promise of frictionless movement is repeatedly displaced by recursive cycles of algorithmic scrutiny—what this study conceptualizes as failed mobility. The deployment of so-called “street-level algorithms” within systems such as the Traveler Verification System (TVS) intensifies these disparities. As [<xref ref-type="bibr" rid="B26">26</xref>] demonstrate, performance degradation in real-world environments disproportionately affects bodies and practices that diverge from Western secular norms, including religious dress such as the hijab. In this configuration, the border no longer functions primarily as a legal threshold but as an algorithmic choke point, reducing the Arab Muslim subject to what [<xref ref-type="bibr" rid="B1">1</xref>] terms bare life: included in the database only insofar as they remain excluded from full political and spatial belonging.</p>
      <p>The shift toward second-generation biometrics marks a further escalation of this governing logic. As the findings indicate, systems designed to infer intent, affect, or behavioral risk signal a transition from identification to anticipation. This pre-emptive orientation collapses the distinction between identity and action, rendering everyday gestures, facial expressions, and religious affect legible as speculative indicators of threat. Such systems substantiate the ethical concerns raised by [<xref ref-type="bibr" rid="B42">42</xref>], particularly regarding the erosion of moral autonomy and the normalization of predictive judgment in the absence of due process. Within an already racialized security landscape, behavioral biometrics intensify vulnerability by institutionalizing cultural misinterpretation as algorithmic fact.</p>
      <p>These developments must be situated within what [<xref ref-type="bibr" rid="B40">40</xref>] describe as a techno-authoritarian imaginary, in which opaque algorithms, cross-sector data integration, and weak regulatory oversight converge to enable democratic regression. The convergence of biometric data with health, labor, and insurance registries exemplifies the analysis of social sorting in [<xref ref-type="bibr" rid="B46">46</xref>], illustrating how surveillance infrastructures are increasingly repurposed to regulate populations across multiple domains of life. As the findings make clear, this form of function creep is not incidental but structural: once digitized, the body becomes permanently governable. In the terms of [<xref ref-type="bibr" rid="B4">4</xref>], biometric systems thus operate as racializing assemblages that automate inequality while concealing accountability behind technical complexity.</p>
      <p>Crucially, this study challenges reformist assumptions that biometric bias can be resolved through incremental technical improvements or limited policy interventions. The evidence suggests that inaccuracy, opacity, and disproportionate harm are not peripheral defects but constitutive features of biometric governance as currently deployed. The absence of comprehensive federal regulation in the United States further entrenches these harms, enabling biometric surveillance to expand without meaningful accountability or informed consent ([<xref ref-type="bibr" rid="B43">43</xref>]). As a result, privacy violations become normalized, contextual integrity erodes, and public space is transformed into a site of involuntary data extraction—particularly for populations already marked as suspect.</p>
      <p>Taken together, this study advances scholarship on algorithmic governance by foregrounding the embodied, affective, and lived consequences of biometric surveillance for Arab Muslim minorities within a Western security context. By placing Foucauldian analyses of power in dialogue with Browne’s theorization of racialized surveillance and Benjamin’s critique of discriminatory design, the findings demonstrate that biometric technologies reproduce racialized power not only through representation, but through infrastructure itself. These systems do not merely fail marginalized populations; they succeed in governing them differentially.</p>
      <p>Ultimately, reconciling security governance with democratic pluralism requires more than transparency initiatives or marginal gains in accuracy. It demands a fundamental rethinking of whether biometric surveillance—anchored in logics of prediction, commodification, and suspicion—can ever be compatible with substantive equality. Without such a reorientation, biometric technologies will continue to function not simply as tools of observation, but as infrastructures that actively reorganize social life along enduring lines of race, religion, and power.</p>
      <sec id="sec6dot1">
        <title>6.1. The Roadmap for Democratic Oversight</title>
        <p>Democratic oversight is the essential counterweight to the “state of exception” created by biometric governance. It functions as the mechanism through which the “bare life” of the data point is restored to the “political life” of the citizen. Based on the findings of this study, the following three strategies are proposed to ensure equity and inclusivity (<bold>Table 3</bold>):</p>
        <table-wrap id="tbl3">
          <label>Table 3</label>
          <caption>
            <p>Strategic recommendations for democratic oversight of biometric systems.</p>
          </caption>
          <table>
            <thead>
              <tr>
                <th>Strategy</th>
                <th>Mechanism of Implementation</th>
                <th>Specific Impact for Arab Muslim Minorities</th>
                <th>Ethical Goal</th>
              </tr>
            </thead>
            <tbody>
              <tr>
                <td>Algorithmic Accountability</td>
                <td>Third-party audits and “human-in-the-loop” review protocols.</td>
                <td>Reduces misidentification caused by religious attire (hijab) and non-Western phenotypes.</td>
                <td>Technical Justice: Mitigates “technological redlining.”</td>
              </tr>
              <tr>
                <td>Legislative Safeguards</td>
                <td>Judicial warrants and “disparate impact” legal standing.</td>
                <td>Protects places of worship from mass data harvesting; enables lawsuits for bias.</td>
                <td>Juridical Protection: Restores Fourth Amendment rights.</td>
              </tr>
              <tr>
                <td>Community Governance</td>
                <td>Civilian oversight boards and “necropolitical” redlines.</td>
                <td>Reclassifies cultural preservation as a right rather than a security “risk factor.”</td>
                <td>Political Agency: Shifts from “bare life” back to citizenship.</td>
              </tr>
            </tbody>
          </table>
        </table-wrap>
        <sec id="sec6dot1dot1">
          <title>6.1.1. Algorithmic Accountability and the “Black Box” Challenge</title>
          <p>Because biometric systems are frequently proprietary, they operate as a “black box” that shields discriminatory outcomes behind the veil of trade secrets, necessitating a mandate for technical transparency through democratic oversight. This transparency should be operationalized first through independent pre-deployment auditing, requiring federal regulations to mandate that any biometric system utilized by the Department of Homeland Security or law enforcement undergo third-party testing for “demographic differential.” Such a protocol ensures that the systemic biases identified in the foundational findings of [<xref ref-type="bibr" rid="B8">8</xref>]—which demonstrated significant accuracy disparities for darker and non-Western phenotypes—are rigorously addressed before a system is deployed against the general public.</p>
          <p>Furthermore, to prevent the dehumanizing reduction of identity to a mere biological data point, the right to human intervention must be codified. This ensures that any “flag” generated by an automated system is subjected to a mandatory, documented review by a human official. Crucially, these officials must be trained to recognize specific cultural and religious occlusions, such as the hijab, which frequently confound algorithmic models and lead to disproportionate false positives for Arab Muslim populations. By integrating these technical and human safeguards, the governance of biometric technology can move toward a model that prioritizes individual dignity over uncritical machine efficiency.</p>
        </sec>
        <sec id="sec6dot1dot2">
          <title>6.1.2. Legislative Safeguards: Bridging the “Regulatory Vacuum”</title>
          <p>The current “regulatory void” identified by [<xref ref-type="bibr" rid="B43">43</xref>] must be filled by a comprehensive federal statute that specifically addresses the unique challenges posed by biometric identifiers. Central to this legislative effort is the restoration of the Fourth Amendment in the digital age, which requires that “real-time” facial recognition in public spaces be prohibited except under a judicial warrant based on probable cause. Such a mandate is essential to prevent the “indiscriminate data harvesting” that currently targets Arab Muslim community centers and places of worship, effectively turning communal spaces into sites of perpetual surveillance.</p>
          <p>Furthermore, the 2024 Middle Eastern or North African (MENA) census category shift should be leveraged as a robust legal tool to protect these communities. By officially recognizing Arab Muslims as a distinct minority, the law can facilitate “disparate impact” lawsuits against agencies whose biometric tools consistently produce false positives for this demographic. Codifying this right to redress transforms the MENA category from a mere administrative label into a powerful instrument for accountability, allowing individuals to challenge the systemic biases that have historically rendered their belonging conditional.</p>
        </sec>
        <sec id="sec6dot1dot3">
          <title>6.1.3. Community-Led Governance and “Counter-Surveillance”</title>
          <p>Aligning with the Foucauldian principle that “where there is power, there is resistance,” democratic oversight must integrate the voices of those most impacted by surveillance into the very structure of governance ([<xref ref-type="bibr" rid="B30">30</xref>]). This shift is operationalized through the creation of civilian oversight boards, which ensure that communities historically subjected to intense securitization—particularly Arab and Muslim Americans—have a seated role in the procurement and policy-setting stages of biometric technology. By institutionalizing this participation, the state acknowledges that the “will to survive” and the desire for cultural preservation are not biological risk factors to be managed by an algorithm, but are fundamental democratic rights.</p>
          <p>To further protect these rights, democratic frameworks must establish “redlines” that designate certain applications of biometrics as fundamentally incompatible with a democratic order. These redlines identify practices such as religious profiling or the tracking of legal political protests as “necropolitical” boundary crossings that must be banned entirely rather than merely regulated. By defining these limits, community-led governance moves the focus from making surveillance more accurate to ensuring that technology is not used as an infrastructure of domination, thereby upholding the principles of justice and inclusivity within the digital landscape.</p>
        </sec>
      </sec>
    </sec>
    <sec id="sec7">
      <title>7. Conclusion</title>
      <p>This research has explored the profound and often invisible friction at the intersection of biometric innovation and the lived reality of Arab Muslim minorities in the United States. While these technologies are presented as neutral instruments of security, the findings of this study suggest they act as a “digital border” that internalizes and automates historical patterns of suspicion. For a community whose identity is often caught between a desire for cultural preservation and the pressures of securitization, the digitization of the body represents a new frontier of biopolitical management—one where a retinal scan or a facial template can become a mechanism of “bare life,” suspending legal protections in the name of algorithmic efficiency.</p>
      <p>The study demonstrates that the “technical glitches” often cited in biometric failures—specifically the high error rates for Middle Eastern phenotypes and the “occlusion” issues related to religious attire like the hijab—are not merely engineering hurdles. Rather, they are manifestations of “technological redlining,” where systemic biases are baked into training datasets and institutional deployment strategies. By integrating the critical frameworks of Foucault, Agamben, and Mbembe, this research reveals how these systems do more than identify; they categorize, sort, and eventually govern populations ([<xref ref-type="bibr" rid="B10">10</xref>]). The current regulatory vacuum at the federal level exacerbates this, leaving Arab Muslim communities hyper-visible to the state yet statistically erased from protective legislation.</p>
      <p>Ultimately, the future of biometric governance must be a democratic project, not just a technical one ([<xref ref-type="bibr" rid="B30">30</xref>]). To prevent the entrenchment of a permanent digital hierarchy, policymakers must move beyond procedural “fixes” toward a model of robust democratic oversight that prioritizes transparency, accountability, and the right to human intervention. This requires a fundamental shift: moving from a political order where bodies are sorted before they are heard, to one where the dignity of the human subject precedes the data point. If the United States is to remain a truly pluralistic democracy, its technological systems must learn to recognize the citizen before they scan the suspect.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <title>References</title>
      <ref id="B1">
        <label>1.</label>
        <citation-alternatives>
          <mixed-citation publication-type="book">Agamben, G. (1998). <italic>Homo Sacer: Sovereign Power and Bare Life</italic> (D. Heller-Roazen, Trans.). Stanford University Press.</mixed-citation>
          <element-citation publication-type="book">
            <person-group person-group-type="author">
              <string-name>Agamben, G.</string-name>
            </person-group>
            <year>1998</year>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B2">
        <label>2.</label>
        <citation-alternatives>
          <mixed-citation publication-type="other">Al-Khateeb, M. T. (2021). Toward a Rhetorical Account of Refugee Encounters: Biometric Screening Technologies and Failed Promises of Mobility. <italic>Rhetoric Society Quarterly, 51,</italic> 15-26. https://doi.org/10.1080/02773945.2020.1841276 <pub-id pub-id-type="doi">10.1080/02773945.2020.1841276</pub-id><ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1080/02773945.2020.1841276">https://doi.org/10.1080/02773945.2020.1841276</ext-link></mixed-citation>
          <element-citation publication-type="other">
            <person-group person-group-type="author">
              <string-name>Al-Khateeb, M. T.</string-name>
            </person-group>
            <year>2021</year>
            <pub-id pub-id-type="doi">10.1080/02773945.2020.1841276</pub-id>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B3">
        <label>3.</label>
        <citation-alternatives>
          <mixed-citation publication-type="web">Alsharif, M. (2024). <italic>“We Exist”: New Middle Eastern or North African Census Category Helps Community Members Feel Seen</italic>. NBC News. https://www.resource.dnsafrica.org/2024/04/03/we-exist-new-middle-eastern-or-north-african-census-category-helps-community-members-feel-seen-nbc-news/</mixed-citation>
          <element-citation publication-type="web">
            <person-group person-group-type="author">
              <string-name>Alsharif, M.</string-name>
            </person-group>
            <year>2024</year>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B4">
        <label>4.</label>
        <citation-alternatives>
          <mixed-citation publication-type="book">Benjamin, R. (2019). <italic>Race after Technology: Abolitionist Tools for the New Jim Code</italic>. Polity Press.</mixed-citation>
          <element-citation publication-type="book">
            <person-group person-group-type="author">
              <string-name>Benjamin, R.</string-name>
            </person-group>
            <year>2019</year>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B5">
        <label>5.</label>
        <citation-alternatives>
          <mixed-citation publication-type="other">Bigo, D. (2014). The (In)securitization Practices of the Three Universes of EU Border Control: Military/Navy—Border Guards/Police—Database Analysts. <italic>Security Dialogue, 45,</italic> 209-225. https://doi.org/10.1177/0967010614530459 <pub-id pub-id-type="doi">10.1177/0967010614530459</pub-id><ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1177/0967010614530459">https://doi.org/10.1177/0967010614530459</ext-link></mixed-citation>
          <element-citation publication-type="other">
            <person-group person-group-type="author">
              <string-name>Bigo, D.</string-name>
            </person-group>
            <year>2014</year>
            <pub-id pub-id-type="doi">10.1177/0967010614530459</pub-id>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B6">
        <label>6.</label>
        <citation-alternatives>
          <mixed-citation publication-type="other">Braun, V., &amp; Clarke, V. (2006). Using Thematic Analysis in Psychology. <italic>Qualitative Research in Psychology, 3,</italic> 77-101. https://doi.org/10.1191/1478088706qp063oa <pub-id pub-id-type="doi">10.1191/1478088706qp063oa</pub-id><ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1191/1478088706qp063oa">https://doi.org/10.1191/1478088706qp063oa</ext-link></mixed-citation>
          <element-citation publication-type="other">
            <person-group person-group-type="author">
              <string-name>Braun, V.</string-name>
              <string-name>Clarke, V.</string-name>
            </person-group>
            <year>2006</year>
            <pub-id pub-id-type="doi">10.1191/1478088706qp063oa</pub-id>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B7">
        <label>7.</label>
        <citation-alternatives>
          <mixed-citation publication-type="other">Browne, S. (2010). Digital Epidermalization: Race, Identity and Biometrics. <italic>Critical Sociology, 36,</italic> 131-150. https://doi.org/10.1177/0896920509347144 <pub-id pub-id-type="doi">10.1177/0896920509347144</pub-id><ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1177/0896920509347144">https://doi.org/10.1177/0896920509347144</ext-link></mixed-citation>
          <element-citation publication-type="other">
            <person-group person-group-type="author">
              <string-name>Browne, S.</string-name>
            </person-group>
            <year>2010</year>
            <pub-id pub-id-type="doi">10.1177/0896920509347144</pub-id>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B8">
        <label>8.</label>
        <citation-alternatives>
          <mixed-citation publication-type="confproc">Buolamwini, J., &amp; Gebru, T. (2018). Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification. <italic>Proceedings of Machine Learning Research, 81,</italic> 77-91.</mixed-citation>
          <element-citation publication-type="confproc">
            <person-group person-group-type="author">
              <string-name>Buolamwini, J.</string-name>
              <string-name>Gebru, T.</string-name>
            </person-group>
            <year>2018</year>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B9">
        <label>9.</label>
        <citation-alternatives>
          <mixed-citation publication-type="journal">Capotorti, F. (1977). <italic>Study on the Rights of Persons Belonging to Ethnic, Religious and Linguistic Minorities (UN Doc. E/CN.4/Sub.2/384/Rev.1)</italic>. United Nations.</mixed-citation>
          <element-citation publication-type="journal">
            <person-group person-group-type="author">
              <string-name>Capotorti, F.</string-name>
            </person-group>
            <year>1977</year>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B10">
        <label>10.</label>
        <citation-alternatives>
          <mixed-citation publication-type="book">Caputo, A. C. (2014). Physical Security Integration. In A. C. Caputo (Ed.), <italic>Digital Video Surveillance and Security</italic> (2nd ed., pp. 363-393). Elsevier. https://doi.org/10.1016/b978-0-12-420042-5.00011-3 <pub-id pub-id-type="doi">10.1016/b978-0-12-420042-5.00011-3</pub-id><ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1016/b978-0-12-420042-5.00011-3">https://doi.org/10.1016/b978-0-12-420042-5.00011-3</ext-link></mixed-citation>
          <element-citation publication-type="book">
            <person-group person-group-type="author">
              <string-name>Caputo, A.</string-name>
            </person-group>
            <year>2014</year>
            <pub-id pub-id-type="doi">10.1016/b978-0-12-420042-5.00011-3</pub-id>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B11">
        <label>11.</label>
        <citation-alternatives>
          <mixed-citation publication-type="book">Charmaz, K. (2014). <italic>Constructing Grounded Theory</italic> (2nd ed.). Sage Publications.</mixed-citation>
          <element-citation publication-type="book">
            <person-group person-group-type="author">
              <string-name>Charmaz, K.</string-name>
            </person-group>
            <year>2014</year>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B12">
        <label>12.</label>
        <citation-alternatives>
          <mixed-citation publication-type="other">Crenshaw, K. (1991). Mapping the Margins: Intersectionality, Identity Politics, and Violence against Women of Color. <italic>Stanford Law Review, 43,</italic> 1241-1299. https://doi.org/10.2307/1229039 <pub-id pub-id-type="doi">10.2307/1229039</pub-id><ext-link ext-link-type="uri" xlink:href="https://doi.org/10.2307/1229039">https://doi.org/10.2307/1229039</ext-link></mixed-citation>
          <element-citation publication-type="other">
            <person-group person-group-type="author">
              <string-name>Crenshaw, K.</string-name>
            </person-group>
            <year>1991</year>
            <pub-id pub-id-type="doi">10.2307/1229039</pub-id>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B13">
        <label>13.</label>
        <citation-alternatives>
          <mixed-citation publication-type="web">Cuellar, M., To, H. K., &amp; Mehrotra, A. (2025). <italic>Accuracy and Fairness of Facial Recognition Technology in Low Quality Police Images: An Experiment with Synthetic Faces (CoRR abs/2505.14320)</italic>. https://arxiv.org/abs/2505.14320</mixed-citation>
          <element-citation publication-type="web">
            <person-group person-group-type="author">
              <string-name>Cuellar, M.</string-name>
              <string-name>To, H. K.</string-name>
              <string-name>Mehrotra, A.</string-name>
            </person-group>
            <year>2025</year>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B14">
        <label>14.</label>
        <citation-alternatives>
          <mixed-citation publication-type="other">De Hert, P., &amp; Bouchagiar, G. (2022). Visual and Biometric Surveillance in the EU. Saying “No” to Mass Surveillance Practices? <italic>Information Polity, 27,</italic> 193-217. https://doi.org/10.3233/ip-211525 <pub-id pub-id-type="doi">10.3233/ip-211525</pub-id><ext-link ext-link-type="uri" xlink:href="https://doi.org/10.3233/ip-211525">https://doi.org/10.3233/ip-211525</ext-link></mixed-citation>
          <element-citation publication-type="other">
            <person-group person-group-type="author">
              <string-name>De Hert, P.</string-name>
              <string-name>Bouchagiar, G.</string-name>
            </person-group>
            <year>2022</year>
            <pub-id pub-id-type="doi">10.3233/ip-211525</pub-id>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B15">
        <label>15.</label>
        <citation-alternatives>
          <mixed-citation publication-type="book">Delgado, R., &amp; Stefancic, J. (2017). <italic>Critical Race Theory: An Introduction</italic> (3rd ed.). New York University Press.</mixed-citation>
          <element-citation publication-type="book">
            <person-group person-group-type="author">
              <string-name>Delgado, R.</string-name>
              <string-name>Stefancic, J.</string-name>
            </person-group>
            <year>2017</year>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B16">
        <label>16.</label>
        <citation-alternatives>
          <mixed-citation publication-type="book">Fanon, F. (1952/2008). <italic>Black Skin, White Masks</italic> (R. Philcox, Trans.). Grove Press. (Original Work Published 1952)</mixed-citation>
          <element-citation publication-type="book">
            <person-group person-group-type="author">
              <string-name>Fanon, F.</string-name>
            </person-group>
            <year>1952</year>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B17">
        <label>17.</label>
        <citation-alternatives>
          <mixed-citation publication-type="other">Foucault, M. (1977). <italic>Discipline and Punish: The Birth of the Prison</italic>. Pantheon Books.</mixed-citation>
          <element-citation publication-type="other">
            <person-group person-group-type="author">
              <string-name>Foucault, M.</string-name>
            </person-group>
            <year>1977</year>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B18">
        <label>18.</label>
        <citation-alternatives>
          <mixed-citation publication-type="other">Foucault, M. (1980). <italic>Power/Knowledge: Selected Interviews and Other Writings, 1972-1977</italic>. Pantheon Books.</mixed-citation>
          <element-citation publication-type="other">
            <person-group person-group-type="author">
              <string-name>Foucault, M.</string-name>
            </person-group>
            <year>1980</year>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B19">
        <label>19.</label>
        <citation-alternatives>
          <mixed-citation publication-type="web">Garvie, C., Bedoya, A., &amp; Frankle, J. (2016). <italic>The Perpetual Line-Up: Unregulated Police Face Recognition in America</italic>. Center on Privacy &amp; Technology, Georgetown Law. https://www.perpetuallineup.org/</mixed-citation>
          <element-citation publication-type="web">
            <person-group person-group-type="author">
              <string-name>Garvie, C.</string-name>
              <string-name>Bedoya, A.</string-name>
              <string-name>Frankle, J.</string-name>
            </person-group>
            <year>2016</year>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B20">
        <label>20.</label>
        <citation-alternatives>
          <mixed-citation publication-type="report">Grand View Research (2023). <italic>Biometric Technology Market Size, Share &amp; Trends Analysis Report by Component, by Offering, by Authentication Type, by Application, by End-Use, by Region, and Segment Forecasts (2023-2030) (Report No. 978-1-68038-299-0).</italic> https://www.grandviewresearch.com/industry-analysis/biometrics-industry</mixed-citation>
          <element-citation publication-type="report">
            <year>2023</year>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B21">
        <label>21.</label>
        <citation-alternatives>
          <mixed-citation publication-type="report">Grother, P., Ngan, M., &amp; Hanaoka, K. (2019). <italic>Face Recognition Vendor Test (FRVT) Part 3: Demographic Effects (NIST Interagency/Internal Report 8280)</italic>. National Institute of Standards and Technology.</mixed-citation>
          <element-citation publication-type="report">
            <person-group person-group-type="author">
              <string-name>Grother, P.</string-name>
              <string-name>Ngan, M.</string-name>
              <string-name>Hanaoka, K.</string-name>
            </person-group>
            <year>2019</year>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B22">
        <label>22.</label>
        <citation-alternatives>
          <mixed-citation publication-type="book">Haddad, Y. Y. (2004). <italic>The Muslims of America</italic>. Oxford University Press.</mixed-citation>
          <element-citation publication-type="book">
            <person-group person-group-type="author">
              <string-name>Haddad, Y.</string-name>
            </person-group>
            <year>2004</year>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B23">
        <label>23.</label>
        <citation-alternatives>
          <mixed-citation publication-type="book">Hodwitz, O., &amp; King, S. (2025). Biometrics. In J. R. Vacca (Ed.), <italic>Computer and Information Security Handbook</italic> (4th ed., pp. 1161-1176). Elsevier. https://doi.org/10.1016/b978-0-443-13223-0.00072-2 <pub-id pub-id-type="doi">10.1016/b978-0-443-13223-0.00072-2</pub-id><ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1016/b978-0-443-13223-0.00072-2">https://doi.org/10.1016/b978-0-443-13223-0.00072-2</ext-link></mixed-citation>
          <element-citation publication-type="book">
            <person-group person-group-type="author">
              <string-name>Hodwitz, O.</string-name>
              <string-name>King, S.</string-name>
            </person-group>
            <year>2025</year>
            <pub-id pub-id-type="doi">10.1016/b978-0-443-13223-0.00072-2</pub-id>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B24">
        <label>24.</label>
        <citation-alternatives>
          <mixed-citation publication-type="other">Jain, A. K., Nandakumar, K., &amp; Ross, A. (2016). 50 Years of Biometric Research: Accomplishments, Challenges, and Opportunities. <italic>Pattern Recognition Letters, 79,</italic> 80-105. https://doi.org/10.1016/j.patrec.2015.12.013 <pub-id pub-id-type="doi">10.1016/j.patrec.2015.12.013</pub-id><ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1016/j.patrec.2015.12.013">https://doi.org/10.1016/j.patrec.2015.12.013</ext-link></mixed-citation>
          <element-citation publication-type="other">
            <person-group person-group-type="author">
              <string-name>Jain, A.</string-name>
              <string-name>Nandakumar, K.</string-name>
              <string-name>Ross, A.</string-name>
            </person-group>
            <year>2016</year>
            <pub-id pub-id-type="doi">10.1016/j.patrec.2015.12.013</pub-id>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B25">
        <label>25.</label>
        <citation-alternatives>
          <mixed-citation publication-type="web">Jakubowska, E., &amp; Naranjo, D. (2020). <italic>Ban Biometric Mass Surveillance: A Set of Fundamental Rights Demands for the European Commission and EU Member States</italic>. European Digital Rights (EDRi). https://edri.org/wp-content/uploads/2020/05/Paper-Ban-Biometric-Mass-Surveillance.pdf</mixed-citation>
          <element-citation publication-type="web">
            <person-group person-group-type="author">
              <string-name>Jakubowska, E.</string-name>
              <string-name>Naranjo, D.</string-name>
            </person-group>
            <year>2020</year>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B26">
        <label>26.</label>
        <citation-alternatives>
          <mixed-citation publication-type="journal">Khan, N., &amp; Efthymiou, M. (2021). The Use of Biometric Technology at Airports: The Case of Customs and Border Protection (CBP). <italic>International Journal of Information Management Data Insights, 1,</italic> Article ID: 100049. https://doi.org/10.1016/j.jjimei.2021.100049 <pub-id pub-id-type="doi">10.1016/j.jjimei.2021.100049</pub-id><ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1016/j.jjimei.2021.100049">https://doi.org/10.1016/j.jjimei.2021.100049</ext-link></mixed-citation>
          <element-citation publication-type="journal">
            <person-group person-group-type="author">
              <string-name>Khan, N.</string-name>
              <string-name>Efthymiou, M.</string-name>
            </person-group>
            <year>2021</year>
            <fpage>100049</fpage>
            <pub-id pub-id-type="doi">10.1016/j.jjimei.2021.100049</pub-id>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B27">
        <label>27.</label>
        <citation-alternatives>
          <mixed-citation publication-type="other">Lawal, G. S. (2020). <italic>Ethical and Legal Implications of Biometric Data Collection in Digital Health Services (Preprint)</italic>.</mixed-citation>
          <element-citation publication-type="other">
            <person-group person-group-type="author">
              <string-name>Lawal, G.</string-name>
            </person-group>
            <year>2020</year>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B28">
        <label>28.</label>
        <citation-alternatives>
          <mixed-citation publication-type="web">Lee, N., Resnick, P., &amp; Barton, G. (2019). <italic>Algorithmic Bias Detection and Mitigation: Reducing Consumer Harms</italic>. Brookings Institution. https://coilink.org/20.500.12592/k29pdg</mixed-citation>
          <element-citation publication-type="web">
            <person-group person-group-type="author">
              <string-name>Lee, N.</string-name>
              <string-name>Resnick, P.</string-name>
              <string-name>Barton, G.</string-name>
            </person-group>
            <year>2019</year>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B29">
        <label>29.</label>
        <citation-alternatives>
          <mixed-citation publication-type="book">Lyon, D. (2017). <italic>The Culture of Surveillance: Watching as a Way of Life</italic>. Polity Press.</mixed-citation>
          <element-citation publication-type="book">
            <person-group person-group-type="author">
              <string-name>Lyon, D.</string-name>
            </person-group>
            <year>2017</year>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B30">
        <label>30.</label>
        <citation-alternatives>
          <mixed-citation publication-type="book">Matulionyte, R., &amp; Zalnieriute, M. (2024). Facial Recognition Technology in Context: Technical and Legal Challenges. In <italic>The Cambridge Handbook of Facial Recognition in the Modern State</italic> (pp. 9-124). Cambridge University Press.</mixed-citation>
          <element-citation publication-type="book">
            <person-group person-group-type="author">
              <string-name>Matulionyte, R.</string-name>
              <string-name>Zalnieriute, M.</string-name>
            </person-group>
            <year>2024</year>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B31">
        <label>31.</label>
        <citation-alternatives>
          <mixed-citation publication-type="book">Mbembe, A. (2019). <italic>Necropolitics</italic> (S. Corcoran, Trans.). Duke University Press. https://doi.org/10.2307/j.ctv1131298 <pub-id pub-id-type="doi">10.2307/j.ctv1131298</pub-id><ext-link ext-link-type="uri" xlink:href="https://doi.org/10.2307/j.ctv1131298">https://doi.org/10.2307/j.ctv1131298</ext-link></mixed-citation>
          <element-citation publication-type="book">
            <person-group person-group-type="author">
              <string-name>Mbembe, A.</string-name>
            </person-group>
            <year>2019</year>
            <pub-id pub-id-type="doi">10.2307/j.ctv1131298</pub-id>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B32">
        <label>32.</label>
        <citation-alternatives>
          <mixed-citation publication-type="journal">Munir, B. (2025). Islamophobic Artificial Intelligence in the USA: A Critical Analysis of Religious Bias in Datasets. <italic>Law Library Journal</italic>. https://doi.org/10.2139/ssrn.5265355 <pub-id pub-id-type="doi">10.2139/ssrn.5265355</pub-id><ext-link ext-link-type="uri" xlink:href="https://doi.org/10.2139/ssrn.5265355">https://doi.org/10.2139/ssrn.5265355</ext-link></mixed-citation>
          <element-citation publication-type="journal">
            <person-group person-group-type="author">
              <string-name>Munir, B.</string-name>
            </person-group>
            <year>2025</year>
            <pub-id pub-id-type="doi">10.2139/ssrn.5265355</pub-id>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B33">
        <label>33.</label>
        <citation-alternatives>
          <mixed-citation publication-type="book">Naber, N. (2012). <italic>Arab America: Gender, Cultural Politics, and Activism</italic>. New York University Press.</mixed-citation>
          <element-citation publication-type="book">
            <person-group person-group-type="author">
              <string-name>Naber, N.</string-name>
            </person-group>
            <year>2012</year>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B34">
        <label>34.</label>
        <citation-alternatives>
          <mixed-citation publication-type="journal">Nedelcu, M., &amp; Soysüren, I. (2020). Precarious Migrants, Migration Regimes and Digital Technologies: The Empowerment-Control Nexus. <italic>Journal of Ethnic and Migration Studies, 48,</italic> 1821-1837. https://doi.org/10.1080/1369183x.2020.1796263 <pub-id pub-id-type="doi">10.1080/1369183x.2020.1796263</pub-id><ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1080/1369183x.2020.1796263">https://doi.org/10.1080/1369183x.2020.1796263</ext-link></mixed-citation>
          <element-citation publication-type="journal">
            <person-group person-group-type="author">
              <string-name>Nedelcu, M.</string-name>
              <string-name>Soysüren, I.</string-name>
            </person-group>
            <year>2020</year>
            <pub-id pub-id-type="doi">10.1080/1369183x.2020.1796263</pub-id>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B35">
        <label>35.</label>
        <citation-alternatives>
          <mixed-citation publication-type="book">Nissenbaum, H. (2010). <italic>Privacy in Context: Technology, Policy, and the Integrity of Social Life</italic>. Stanford University Press. https://doi.org/10.1515/9780804772891 <pub-id pub-id-type="doi">10.1515/9780804772891</pub-id><ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1515/9780804772891">https://doi.org/10.1515/9780804772891</ext-link></mixed-citation>
          <element-citation publication-type="book">
            <person-group person-group-type="author">
              <string-name>Nissenbaum, H.</string-name>
            </person-group>
            <year>2010</year>
            <pub-id pub-id-type="doi">10.1515/9780804772891</pub-id>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B36">
        <label>36.</label>
        <citation-alternatives>
          <mixed-citation publication-type="web">Office of the United Nations High Commissioner for Human Rights (2010). <italic>Minority Rights: International Standards and Guidance for Implementation</italic>. United Nations. https://www.ohchr.org/sites/default/files/Documents/Publications/MinorityRights_en.pdf</mixed-citation>
          <element-citation publication-type="web">
            <year>2010</year>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B37">
        <label>37.</label>
        <citation-alternatives>
          <mixed-citation publication-type="web">Pew Research Center (2017). <italic>Demographic Portrait of Muslim Americans</italic>. Pew Research Center’s Religion &amp; Public Life Project. https://www.pewresearch.org/religion/2017/07/26/demographic-portrait-of-muslim-americans/</mixed-citation>
          <element-citation publication-type="web">
            <year>2017</year>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B38">
        <label>38.</label>
        <citation-alternatives>
          <mixed-citation publication-type="journal">Postan, E. (2016). Defining Ourselves: Personal Bioinformation as a Tool of Narrative Self-Conception. <italic>Journal of Bioethical Inquiry, 13,</italic> 133-151. https://doi.org/10.1007/s11673-015-9690-0 <pub-id pub-id-type="doi">10.1007/s11673-015-9690-0</pub-id><pub-id pub-id-type="pmid">26797683</pub-id><ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1007/s11673-015-9690-0">https://doi.org/10.1007/s11673-015-9690-0</ext-link></mixed-citation>
          <element-citation publication-type="journal">
            <person-group person-group-type="author">
              <string-name>Postan, E.</string-name>
            </person-group>
            <year>2016</year>
            <pub-id pub-id-type="doi">10.1007/s11673-015-9690-0</pub-id>
            <pub-id pub-id-type="pmid">26797683</pub-id>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B39">
        <label>39.</label>
        <citation-alternatives>
          <mixed-citation publication-type="journal">Roberts, T., &amp; Oosterom, M. (2025). Digital Authoritarianism: A Systematic Literature Review. <italic>Information Technology for Development, 31,</italic> 860-884. https://doi.org/10.1080/02681102.2024.2425352 <pub-id pub-id-type="doi">10.1080/02681102.2024.2425352</pub-id><ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1080/02681102.2024.2425352">https://doi.org/10.1080/02681102.2024.2425352</ext-link></mixed-citation>
          <element-citation publication-type="journal">
            <person-group person-group-type="author">
              <string-name>Roberts, T.</string-name>
              <string-name>Oosterom, M.</string-name>
            </person-group>
            <year>2025</year>
            <pub-id pub-id-type="doi">10.1080/02681102.2024.2425352</pub-id>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B40">
        <label>40.</label>
        <citation-alternatives>
          <mixed-citation publication-type="journal">Schopmans, H., &amp; Tuncer Ebetürk, İ. (2024). Techno-Authoritarian Imaginaries and the Politics of Resistance against Facial Recognition Technology in the US and European Union. <italic>Democratization, 31,</italic> 943-962. https://doi.org/10.1080/13510347.2023.2258803 <pub-id pub-id-type="doi">10.1080/13510347.2023.2258803</pub-id><ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1080/13510347.2023.2258803">https://doi.org/10.1080/13510347.2023.2258803</ext-link></mixed-citation>
          <element-citation publication-type="journal">
            <person-group person-group-type="author">
              <string-name>Schopmans, H.</string-name>
              <string-name>Tuncer Ebetürk, İ.</string-name>
            </person-group>
            <year>2024</year>
            <pub-id pub-id-type="doi">10.1080/13510347.2023.2258803</pub-id>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B41">
        <label>41.</label>
        <citation-alternatives>
          <mixed-citation publication-type="journal">Solove, D. J. (2006). A Taxonomy of Privacy. <italic>University of Pennsylvania Law Review, 154,</italic> 477-564. https://doi.org/10.2307/40041279 <pub-id pub-id-type="doi">10.2307/40041279</pub-id><ext-link ext-link-type="uri" xlink:href="https://doi.org/10.2307/40041279">https://doi.org/10.2307/40041279</ext-link></mixed-citation>
          <element-citation publication-type="journal">
            <person-group person-group-type="author">
              <string-name>Solove, D. J.</string-name>
            </person-group>
            <year>2006</year>
            <pub-id pub-id-type="doi">10.2307/40041279</pub-id>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B42">
        <label>42.</label>
        <citation-alternatives>
          <mixed-citation publication-type="journal">Sutrop, M., &amp; Laas-Mikko, K. (2012). From Identity Verification to Behavior Prediction: Ethical Implications of Second Generation Biometrics. <italic>Review of Policy Research, 29,</italic> 21-36. https://doi.org/10.1111/j.1541-1338.2011.00536.x <pub-id pub-id-type="doi">10.1111/j.1541-1338.2011.00536.x</pub-id><ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1111/j.1541-1338.2011.00536.x">https://doi.org/10.1111/j.1541-1338.2011.00536.x</ext-link></mixed-citation>
          <element-citation publication-type="journal">
            <person-group person-group-type="author">
              <string-name>Sutrop, M.</string-name>
              <string-name>Laas-Mikko, K.</string-name>
            </person-group>
            <year>2012</year>
            <pub-id pub-id-type="doi">10.1111/j.1541-1338.2011.00536.x</pub-id>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B43">
        <label>43.</label>
        <citation-alternatives>
          <mixed-citation publication-type="report">United States Commission on Civil Rights (2024). <italic>The Civil Rights Implications of the Federal Use of Facial Recognition Technology: 2024 Statutory Enforcement Report</italic>. https://www.usccr.gov/files/2024-09/civil-rights-implications-of-frt_0.pdf</mixed-citation>
          <element-citation publication-type="report">
            <year>2024</year>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B44">
        <label>44.</label>
        <citation-alternatives>
          <mixed-citation publication-type="book">Yin, R. K. (2018). <italic>Case Study Research and Applications: Design and Methods</italic> (6th ed.). Sage Publications.</mixed-citation>
          <element-citation publication-type="book">
            <person-group person-group-type="author">
              <string-name>Yin, R. K.</string-name>
            </person-group>
            <year>2018</year>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B45">
        <label>45.</label>
        <citation-alternatives>
          <mixed-citation publication-type="journal">Zhao, S. Y. (2022). Achille Mbembe, Necropolitics [Book Review]. <italic>International Journal of Communication, 16,</italic> 2961-2963.</mixed-citation>
          <element-citation publication-type="journal">
            <person-group person-group-type="author">
              <string-name>Zhao, S.</string-name>
            </person-group>
            <year>2022</year>
          </element-citation>
        </citation-alternatives>
      </ref>
      <ref id="B46">
        <label>46.</label>
        <citation-alternatives>
          <mixed-citation publication-type="journal">Ziadah, R. (2021). Surveillance, Race, and Social Sorting in the United Arab Emirates. <italic>Politics, 44,</italic> 605-620. https://doi.org/10.1177/02633957211009719 <pub-id pub-id-type="doi">10.1177/02633957211009719</pub-id><ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1177/02633957211009719">https://doi.org/10.1177/02633957211009719</ext-link></mixed-citation>
          <element-citation publication-type="journal">
            <person-group person-group-type="author">
              <string-name>Ziadah, R.</string-name>
            </person-group>
            <year>2021</year>
            <pub-id pub-id-type="doi">10.1177/02633957211009719</pub-id>
          </element-citation>
        </citation-alternatives>
      </ref>
    </ref-list>
  </back>
</article>