Facial recognition system

In 1970, [[Takeo Kanade]] publicly demonstrated a face-matching system that located anatomical features such as the chin and calculated the distance ratio between facial features without human intervention. Later tests revealed that the system could not always reliably identify facial features. Nonetheless, interest in the subject grew and in 1977 Kanade published the first detailed book on facial recognition technology.<ref>{{Cite book|title=The History of Information Security: A Comprehensive Handbook|last1=de Leeuw| first1=Karl| last2=Bergstra| first2=Jan| publisher=Elsevier| year=2007| isbn=9780444516084|pages=266}}</ref>
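Early geometric systems of this kind reduce each face to a small set of landmark coordinates and compare scale-independent ratios of the distances between them. The following is a minimal, purely illustrative Python sketch of that idea; the landmark names, the choice of ratios and the matching tolerance are hypothetical and do not reproduce Kanade's actual method.

<syntaxhighlight lang="python">
import math

def distance(p, q):
    """Euclidean distance between two landmark points (x, y) in pixels."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def feature_ratios(landmarks):
    """Scale-invariant ratios of distances between facial landmarks."""
    eye_span   = distance(landmarks["left_eye"], landmarks["right_eye"])
    nose_chin  = distance(landmarks["nose_tip"], landmarks["chin"])
    mouth_chin = distance(landmarks["mouth"], landmarks["chin"])
    # Dividing by the inter-eye distance removes the dependence on image scale.
    return (nose_chin / eye_span, mouth_chin / eye_span)

def same_face(a, b, tolerance=0.05):
    """Crude match test: every ratio agrees within a fixed tolerance."""
    return all(abs(x - y) <= tolerance
               for x, y in zip(feature_ratios(a), feature_ratios(b)))

# Hypothetical landmark sets; the second is roughly a half-scale copy of the first.
probe   = {"left_eye": (120, 80), "right_eye": (180, 82),
           "nose_tip": (150, 120), "mouth": (150, 150), "chin": (150, 185)}
gallery = {"left_eye": (60, 40), "right_eye": (90, 41),
           "nose_tip": (75, 60), "mouth": (75, 75), "chin": (75, 93)}
print(same_face(probe, gallery))  # True: the ratios survive the change in scale
</syntaxhighlight>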
 
In 1993, the [[Defense Advanced Research Projects Agency]] (DARPA) and the [[Army Research Laboratory]] (ARL) established the face recognition technology program [[FERET (facial recognition technology)|FERET]] to develop "automatic face recognition capabilities" that could be employed in a productive real-life environment "to assist security, intelligence, and law enforcement personnel in the performance of their duties." Face recognition systems that had been trialled in research labs were evaluated. The FERET tests found that while the performance of existing automated facial recognition systems varied, a handful of existing methods could viably be used to recognize faces in still images taken in a controlled environment.<ref>{{Cite book|title=Our Biometric Future: Facial Recognition Technology and the Culture of Surveillance|last1=Gates| first1=Kelly| publisher=NYU Press| year=2011| isbn=9780814732090|pages=48–49}}</ref> The FERET tests spawned three US companies that sold automated facial recognition systems. Vision Corporation and Miros Inc were both founded in 1994 by researchers who used the results of the FERET tests as a selling point. [[Viisage Technology]] was established by an [[identification card]] defense contractor in 1996 to commercially exploit the rights to the facial recognition algorithm developed by [[Alex Pentland]] at [[MIT]].<ref>{{Cite book|title=Our Biometric Future: Facial Recognition Technology and the Culture of Surveillance|last1=Gates| first1=Kelly| publisher=NYU Press| year=2011| isbn=9780814732090|pages=49–50}}</ref>
 
Following the 1993 FERET face-recognition vendor test, the [[Department of Motor Vehicles]] (DMV) offices in [[West Virginia]] and [[New Mexico]] became the first DMV offices to use automated facial recognition systems to prevent people from obtaining multiple driving licenses using different names. [[Driver's licenses in the United States]] were at that point a commonly accepted form of [[photo identification]]. DMV offices across the United States were undergoing a technological upgrade and were in the process of establishing databases of digital ID photographs. This enabled DMV offices to deploy commercially available facial recognition systems to search the photographs taken for new driving licenses against the existing DMV database.<ref>{{Cite book|title=Our Biometric Future: Facial Recognition Technology and the Culture of Surveillance|last1=Gates| first1=Kelly| publisher=NYU Press| year=2011| isbn=9780814732090|pages=52}}</ref> DMV offices became one of the first major markets for automated facial recognition technology and introduced US citizens to facial recognition as a standard method of identification.<ref>{{Cite book|title=Our Biometric Future: Facial Recognition Technology and the Culture of Surveillance|last1=Gates| first1=Kelly| publisher=NYU Press| year=2011| isbn=9780814732090|pages=53}}</ref> The increase of the [[Incarceration in the United States|US prison population]] in the 1990s prompted U.S. states to establish connected and automated identification systems that incorporated digital [[biometric]] databases; in some instances these included facial recognition. In 1999, [[Minnesota]] incorporated the facial recognition system FaceIt by Visionics into a [[mug shot]] booking system that allowed police, judges and court officers to track criminals across the state.<ref>{{Cite book|title=Our Biometric Future: Facial Recognition Technology and the Culture of Surveillance|last1=Gates| first1=Kelly| publisher=NYU Press| year=2011| isbn=9780814732090|pages=54}}</ref>
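Duplicate-license checking of this sort is a one-to-many ("1:N") search: the system extracts a numerical template from each license photo and compares a new applicant's template against every enrolled template, so that close matches filed under different names can be flagged for human review. The sketch below illustrates the principle in Python using cosine similarity over short hypothetical templates; the commercial systems DMV offices deployed in the 1990s used proprietary and generally quite different matching methods.

<syntaxhighlight lang="python">
import math

def cosine_similarity(a, b):
    """Similarity between two face templates (fixed-length numeric vectors)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def find_possible_duplicates(new_template, enrolled, threshold=0.9):
    """Return license records whose templates closely match the applicant's,
    so staff can check whether the same person already holds another license."""
    return [license_no
            for license_no, template in enrolled.items()
            if cosine_similarity(new_template, template) >= threshold]

# Hypothetical enrolled database: license number -> face template.
enrolled_db = {
    "WV-1001": [0.12, 0.80, 0.55, 0.07],
    "WV-1002": [0.90, 0.10, 0.30, 0.25],
}
applicant = [0.11, 0.79, 0.56, 0.08]  # photo for a new license under a different name
print(find_possible_duplicates(applicant, enrolled_db))  # ['WV-1001'] is flagged for review
</syntaxhighlight>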
==== United Kingdom ====
Police forces in the United Kingdom have been trialling live facial recognition technology at public events since 2015.<ref name=":0">{{Cite web|url=https://bigbrotherwatch.org.uk/wp-content/uploads/2018/05/Face-Off-final-digital-1.pdf|title=Face Off: The lawless growth of facial recognition in UK policing|website=Big Brother Watch}}</ref> In May 2017, a man was arrested with the aid of an automatic facial recognition (AFR) system mounted on a van operated by the South Wales Police. [[Ars Technica]] reported that "this appears to be the first time [AFR] has led to an arrest".<ref>{{cite web|url=https://arstechnica.com/tech-policy/2017/06/police-automatic-face-recognition/|title=UK police arrest man via automatic face-recognition tech|first=Sebastian|last=Anthony|date=June 6, 2017|website=Ars Technica}}</ref> However, a 2018 report by [[Big Brother Watch]] found that these systems were up to 98% inaccurate.<ref name=":0" /> The report also revealed that two UK police forces, [[South Wales Police]] and the [[Metropolitan Police]], were using live facial recognition at public events and in public spaces.<ref name="Rees">{{Cite news|url=https://www.bbc.com/news/uk-wales-49565287|title=Police use of facial recognition ruled lawful|last=Rees|first=Jenny|date=September 4, 2019|access-date=November 8, 2019|language=en-GB}}</ref>
In September 2019, the South Wales Police's use of facial recognition was ruled lawful.<ref name="Rees"/> Live facial recognition has been trialled in the streets of London since 2016 and has been used on a regular basis by the [[Metropolitan Police]] since the beginning of 2020.<ref>{{Cite magazine|url=https://www.wired.co.uk/article/london-met-police-facial-recognition|title=The Met Police will start using live facial recognition across London|last=Burgess|first=Matt|date=January 24, 2020|magazine=Wired UK|access-date=January 24, 2020|issn=1357-0978}}</ref> In August 2020 the [[Court of Appeal (England and Wales)|Court of Appeal]] ruled that the way the facial recognition system had been used by the South Wales Police in 2017 and 2018 violated human rights.<ref>{{cite web|url=https://techxplore.com/news/2020-08-uk-court-recognition-violates-human.html |author=Danica Kirka |title=UK court says face recognition violates human rights| date=August 11, 2020|access-date=October 4, 2020 |website=Tech Xplore}}</ref>
 
However, by 2024 the Metropolitan Police were using the technique with a database of 16,000 suspects, leading to over 360 arrests; those arrested included rapists and a person who had been wanted for [[grievous bodily harm]] for eight years. The force claims a [[False positives and false negatives|false positive]] rate of only 1 in 6,000. The photos of those not identified by the system are deleted immediately.<ref>{{cite news |last1=Sylvester |first1=Rachel |title='No human could do this': how facial recognition is transforming policing. |url=https://www.thetimes.com/article/86256fe3-a218-4c00-97c3-f1021a8f8c11 |access-date=5 October 2024 |work=The Times |date=5 October 2024}}</ref>
 
==== United States ====
[[File:Facial recognition technology at gate (44275734970).jpg|thumb|Flight boarding gate with "biometric face scanners" developed by [[U.S. Customs and Border Protection]] at [[Hartsfield–Jackson Atlanta International Airport]] ]]
The [[U.S. Department of State]] operates one of the largest face recognition systems in the world, with a database of 117 million American adults, with photos typically drawn from driver's licenses.<ref>{{cite web|url=http://fortune.com/2016/10/18/facial-recognition-database/|title=Here's How Many Adult Faces Are Scanned From Facial Recognition Databases|date=2016-10-18|website=Fortune}}</ref> Although the system is still far from complete, it is being put to use in certain cities to give clues as to the identity of people in photographs. The FBI uses the photos as an investigative tool, not for positive identification.<ref name="phys.org2">{{Cite web|url=https://phys.org/news/2016-12-facial-recognition-technology-real-world.html|title=The trouble with facial recognition technology (in the real world)|first1=Robin|last1=Kramer|first2=Kay|last2=Ritchie|date=2016-12-14|website=phys.org}}</ref> {{As of|2016|post=,}} facial recognition was being used to identify people in photos taken by police in [[San Diego]] and Los Angeles (not on real-time video, and only against booking photos)<ref>{{cite web|url=https://www.npr.org/2018/05/10/609422158/real-time-facial-recognition-is-available-but-will-u-s-police-buy-it|title=Real-Time Facial Recognition Is Available, But Will U.S. Police Buy It?|date=2018-05-10|website=NPR.org |publisher=NPR}}</ref> and use was planned in [[West Virginia]] and [[Dallas]].<ref>{{cite web|url=https://www.npr.org/2016/10/23/499042369/police-facial-recognition-databases-log-about-half-of-americans|title=Police Facial Recognition Databases Log About Half Of Americans|date=2016-10-23|website=NPR.org |publisher=NPR}}</ref>
 
In recent years, Maryland has used face recognition by comparing people's faces to their driver's license photos. The system drew controversy when it was used in Baltimore to arrest unruly protesters after the [[death of Freddie Gray]] in police custody.<ref>{{cite web|url=http://www.baltimoresun.com/news/maryland/crime/bs-md-facial-recognition-20161017-story.html|title=Maryland's use of facial recognition software questioned by researchers, civil liberties advocates|last1=Rector|first1=Kevin|last2=Knezevich|first2=Alison|date=2016-10-17|work=The Baltimore Sun}}</ref> Many other states are using or developing a similar system; however, some states have laws prohibiting its use.
 
The [[Federal Bureau of Investigation|FBI]] has also instituted its [[Next Generation Identification]] program to include face recognition, as well as more traditional biometrics like [[fingerprint]]s and [[Iris recognition|iris scans]], which can pull from both criminal and civil databases.<ref>{{Cite web|url=https://www.fbi.gov/about-us/cjis/fingerprints_biometrics/ngi|title=Next Generation Identification|website=FBI|access-date=April 5, 2016}}</ref> The federal [[Government Accountability Office]] criticized the FBI for not addressing various concerns related to privacy and accuracy.<ref name="ICE" />
 
Starting in 2018, [[U.S. Customs and Border Protection]] deployed "biometric face scanners" at U.S. airports. Passengers taking outbound international flights can complete the check-in, security and boarding process after having their facial images captured and verified against the ID photos stored in CBP's database. Images captured of travelers with U.S. citizenship are deleted within 12 hours. The [[Transportation Security Administration]] (TSA) had expressed its intention to adopt a similar program for domestic air travel during the security check process in the future. The [[American Civil Liberties Union]] is one of the organizations opposed to the program, expressing concern that it will be used for surveillance purposes.<ref>{{cite news|title=Facial recognition at airports: Everything you need to know|url=https://www.usatoday.com/story/travel/airline-news/2019/08/16/biometric-airport-screening-facial-recognition-everything-you-need-know/1998749001/|work=USA Today|date=August 16, 2019}}</ref>
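The boarding-gate check can be modelled, in simplified form, as a one-to-one ("1:1") comparison between the live capture and the ID photo held for that traveler, accepted when the similarity clears a threshold. The Python sketch below shows only that decision step; the embedding representation, the distance threshold and the retention flag are illustrative assumptions rather than details of CBP's actual system.

<syntaxhighlight lang="python">
import math
from dataclasses import dataclass

@dataclass
class GateDecision:
    boarding_allowed: bool
    distance: float      # smaller means the two images are more similar
    retain_image: bool   # crude stand-in for the stated retention policy

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def verify_at_gate(live_embedding, id_embedding, is_us_citizen, max_distance=0.6):
    """1:1 verification: compare the embedding of the live camera capture
    with the embedding of the traveler's stored ID photo."""
    d = euclidean(live_embedding, id_embedding)
    return GateDecision(
        boarding_allowed=(d <= max_distance),
        distance=d,
        # Citizens' captured images are reportedly deleted within 12 hours;
        # modelled here only as a flag, not as an actual deletion mechanism.
        retain_image=not is_us_citizen,
    )

decision = verify_at_gate([0.10, 0.70, 0.20], [0.12, 0.68, 0.22], is_us_citizen=True)
print(decision.boarding_allowed)  # True for these nearby illustrative embeddings
</syntaxhighlight>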
 
In 2019, researchers reported that [[Immigration and Customs Enforcement]] (ICE) uses facial recognition software against state driver's license databases, including for some states that provide licenses to undocumented immigrants.<ref name="ICE">{{Cite news |date=2019-07-08 |title=ICE Uses Facial Recognition To Sift State Driver's License Records, Researchers Say |language=en |work=NPR.org |url=https://www.npr.org/2019/07/08/739491857/ice-uses-facial-recognition-to-sift-state-drivers-license-records-researchers-sa |access-date=2022-12-09}}</ref>
 
In December 2022, 16 major domestic airports in the US began testing facial recognition technology, with camera-equipped kiosks checking the photos on travelers' IDs to confirm that passengers are not impostors.<ref>{{Cite news |date=2022-12-02 |title=TSA is adding face recognition at big airports. Here's how to opt out. |language=en-US |newspaper=Washington Post |url=https://www.washingtonpost.com/technology/2022/12/02/tsa-security-face-recognition/ |access-date=2022-12-09 |issn=0190-8286}}</ref>
 
==== China ====
{{See also|Mass surveillance in China}}
In 2006, the "Skynet" (天網))Project was initiated by the Chinese government to implement [[Closed-circuit television|CCTV]] surveillance nationwide and {{as of|lc=y|2018|post=,}} there have been 20 million cameras, many of which are capable of real-time facial recognition, deployed across the country for this project.<ref>{{cite web |last1=Shen |first1=Xinmei |title="Skynet", China's massive video surveillance network |url=https://www.scmp.com/abacus/who-what/what/article/3028246/skynet-chinas-massive-video-surveillance-network |work=South China Morning Post |date=October 4, 2018 |access-date=December 13, 2020}}</ref> Some official claim that the current Skynet system can scan the entire Chinese population in one second and the world population in two seconds.<ref>{{cite web |last1=Chan |first1=Tara Francis |title=16 parts of China are now using Skynet |url=https://www.businessinsider.com.au/china-facial-recognition-technology-works-in-one-second-2018-3 |work=Business Insider |date=March 27, 2018 |access-date=December 13, 2020}}</ref>
[[File:Entrance faregates at waiting room 10 of Beijing West Railway Station (20190908184801).jpg|thumb|Boarding gates with facial recognition technology at [[Beijing West railway station]] ]]
==== Latin America ====
In the [[2000 Mexican presidential election]], the Mexican government employed face recognition software to prevent [[voter fraud]]. Some individuals had been registering to vote under several different names, in an attempt to place multiple votes. By comparing new face images to those already in the voter database, authorities were able to reduce duplicate registrations.<ref>{{cite news|url=http://www.thefreelibrary.com/Mexican+Government+Adopts+FaceIt+Face+Recognition+Technology+to...-a062019954|title=Mexican Government Adopts FaceIt Face Recognition Technology to Eliminate Duplicate Voter Registrations in Upcoming Presidential Election|date=May 11, 2000|access-date=June 2, 2008|publisher=Business Wire|archive-date=March 5, 2016|archive-url=https://web.archive.org/web/20160305151832/http://www.thefreelibrary.com/Mexican+Government+Adopts+FaceIt+Face+Recognition+Technology+to...-a062019954}}</ref>
 
In Colombia, public transport buses are fitted with a facial recognition system by [https://www.facefirst.com/ FaceFirst Inc] to identify passengers who are sought by the [[National Police of Colombia]]. FaceFirst Inc also built the facial recognition system for [[Tocumen International Airport]] in Panama. The face recognition system is deployed to identify individuals among the travellers who are sought by the [[Panamanian National Police]] or [[Interpol]].<ref name="Cambridge University Press">{{Cite book|title=The Cambridge Handbook of Consumer Privacy|last1=Selinger|first1=Evan| last2= Polonetsky| first2= Jules | last3= Tene| first3=Omer|publisher=Cambridge University Press|year=2018|isbn=9781316859278|pages=112}}</ref> Tocumen International Airport operates an airport-wide surveillance system using hundreds of live face recognition cameras to identify wanted individuals passing through the airport. The face recognition system was initially installed as part of a US$11&nbsp;million contract and included a [[computer cluster]] of sixty computers, a [[fiber-optic cable]] network for the airport buildings, as well as the installation of 150 surveillance cameras in the [[airport terminal]] and at about 30 [[airport gate]]s.<ref>{{cite news|url=http://www.ihsairport360.com/article/4812/panama-puts-names-to-more-faces|title=Panama puts names to more faces|last=Vogel|first=Ben|access-date=October 7, 2014|archive-url=https://web.archive.org/web/20141012185226/http://www.ihsairport360.com/article/4812/panama-puts-names-to-more-faces|archive-date=October 12, 2014|url-status=live|publisher=IHS Jane's Airport Review}}</ref>
 
At the [[2014 FIFA World Cup]] in Brazil, the [[Federal Police of Brazil]] used face recognition [[goggles]]. Face recognition systems "made in China" were also deployed at the [[2016 Summer Olympics]] in Rio de Janeiro.<ref name="Cambridge University Press"/> [[Nuctech Company]] provided 145 inspection terminals for [[Maracanã Stadium]] and 55 terminals for the [[Deodoro Olympic Park]].<ref>{{cite news|url=http://english.www.gov.cn/news/photos/2016/08/15/content_281475417902847.htm|title='Made-in-China' products shine at Rio Olympics |access-date=November 14, 2020|date=August 15, 2016|publisher=The State Council, The People's Republic of China}}</ref>
 
==== European Union ====
 
Although high degrees of accuracy have been claimed for some facial recognition systems, these outcomes are not universal. Accuracy is consistently lowest for subjects who are 18 to 30 years old, Black, and female.<ref name=":17" />
 
Facial recognition systems have been criticized for upholding and judging based on assumptions about people of colour<ref>{{Cite news |last=Buolamwini |first=Joy |date=July 2023 |title=AI replicates anti-Blackness |pages=34 |work=[[The Voice (British newspaper)|The Voice]]}}</ref> and for relying on a [[Gender binary|binary gender]] assumption.<ref name=":1">{{Cite web|title=Facial recognition software has a gender problem|url=https://www.nsf.gov/discoveries/disc_summ.jsp?cntn_id=299486|access-date=May 9, 2021|website=nsf.gov| date=November 2019 |language=English}}</ref><ref>{{Cite web|last=Rehnman|first=Jenny|title=The role of gender in face recognition|url=https://www.diva-portal.org/smash/get/diva2:196786/FULLTEXT01.pdf|pages=69|website=www.diva-portal.org}}</ref><ref>{{Cite journal|last1=Mishra|first1=Maruti V.|last2=Likitlersuang|first2=Jirapat|last3=Wilmer|first3=Jeremy B.|last4=Cohan|first4=Sarah|last5=Germine|first5=Laura|last6=DeGutis|first6=Joseph M.|date=November 29, 2019|title=Gender Differences in Familiar Face Recognition and the Influence of Sociocultural Gender Inequality|journal=Scientific Reports|language=en|volume=9|issue=1|pages=17884|doi=10.1038/s41598-019-54074-5|pmid=31784547|pmc=6884510|bibcode=2019NatSR...917884M|issn=2045-2322|doi-access=free}}</ref><ref>{{Cite web|date=August 27, 2020|title=Facing gender bias in facial recognition technology|url=https://www.helpnetsecurity.com/2020/08/27/facial-recognition-bias/|access-date=May 9, 2021|website=Help Net Security|language=en-US}}</ref><ref>{{Cite journal|last1=Palmer|first1=Matthew A.|last2=Brewer|first2=Neil|last3=Horry|first3=Ruth|date=March 2013|title=Understanding gender bias in face recognition: Effects of divided attention at encoding|url=https://www.sciencedirect.com/science/article/pii/S0001691813000231|journal=Acta Psychologica|language=en|volume=142|issue=3|pages=362–369|doi=10.1016/j.actpsy.2013.01.009|pmid=23422290|s2cid=205260206 |issn=0001-6918}}</ref><ref>{{Cite web|title=Why Gender-Neutral Facial Recognition Will Change How We Look at Technology|url=https://www.technologynetworks.com/informatics/articles/why-gender-neutral-facial-recognition-will-change-how-we-look-at-technology-332962|access-date=May 9, 2021|website=Informatics from Technology Networks|language=en}}</ref><ref>{{Cite web|title=Facial Recognition {{!}} Gendered Innovations|url=https://genderedinnovations.stanford.edu/case-studies/facial.html|access-date=May 9, 2021|website=genderedinnovations.stanford.edu}}</ref><ref>{{Cite journal|last=Mason|first=Susan E.|date=September 27, 2007|title=Age and gender as factors in facial recognition and identification|url=https://www.tandfonline.com/doi/abs/10.1080/03610738608259453|journal=Experimental Aging Research|volume=12|issue=3|pages=151–154|language=en|doi=10.1080/03610738608259453|pmid=3830234}}</ref><ref name=":23">{{Citation|title=Facial recognition software has a gender problem|url=https://www.eurekalert.org/pub_releases/2019-10/uoca-frs102919.php|language=en|access-date=May 9, 2021}}</ref> When classifying the faces of [[cisgender]] individuals as male or female, these systems are often fairly accurate,<ref name=":1" /> but they are typically confused by, or unable to determine, the [[gender identity]] of [[transgender]] and [[Non-binary gender|non-binary]] people.<ref name=":1" /> The systems uphold [[Gender role|gender norms]] to such an extent that, when shown a photo of a cisgender man with long hair, algorithms were split between following the norm that men have short hair and responding to the [[Masculinity|masculine]] facial features, and became confused.<ref name=":1" /><ref name=":23" /> This accidental misgendering can be very harmful for people who do not identify with their [[Sex assignment|sex assigned at birth]], because it disregards and invalidates their gender identity. It is also harmful for people who do not adhere to traditional gender norms, because it invalidates their [[gender expression]], regardless of their [[gender identity]].
 
=== Ineffectiveness ===
* [https://uwe-repository.worktribe.com/output/1024266 ''A Photometric Stereo Approach to Face Recognition''] (master's thesis). The [[University of the West of England, Bristol]].
 
{{Artificial intelligence (AI)}}
{{Differentiable computing}}
{{Authority control}}