AUTOMATED FACIAL RECOGNITION SYSTEM – TECHNOLOGY’S TRYST WITH CRIMINAL INVESTIGATION
The Automated Facial Recognition System, more commonly known as the AFRS, was developed by the National Crime Records Bureau (NCRB) to extract, digitise and compare the spatial and geometric distribution of facial features, assisting in the compilation and maintenance of a comprehensive list of suspects and criminals. The NCRB has called for the collection of CCTV footage, photographs from newspapers and sketches in order to create a database of information. This database, which contains biometric and facial information of individuals, is then used to match facial data and fingerprints obtained from crime scenes for the purpose of criminal identification. The AFRS has even been dubbed a better alternative to traditional methods of criminal identification such as the Test Identification Parade. In this piece, the author analyses the validity of the AFRS on the touchstone of the Indian Constitution, weighing its benefits against the privacy concerns it raises.
AFRS as a Tool for Criminal Identification
The AFRS has been lauded as an efficient means of criminal identification for a number of reasons – reduced costs, extensive coverage and relatively little delay. It saves time because there is no longer any need to manually sift through files, pictures and text to spot a suitable match. The efficiency of the AFRS was reflected in the Delhi Police locating almost 3,000 missing children over the span of four days during its trial run. The system is purportedly to be hosted in the NCRB’s Data Centre in Delhi while being made accessible to police stations across the country.
The existing database for criminal identification in India is the Crime and Criminal Tracking Network and Systems (CCTNS), introduced after the 2008 Bombay terror attacks to form an integrated database of crime incidents and suspects on the basis of FIRs, investigations and charge sheets. The Ministry of Home Affairs, however, hopes to further strengthen this tracking system by introducing the AFRS and integrating biometric solutions such as Iris and AFIS.
For the purpose of effectively implementing the AFRS, the NCRB has stipulated that it shall have access across databases – criminal, passport and Aadhaar. The collection of digital images for the issuance of documents such as Aadhaar and passports, along with the extensive installation of CCTV cameras in public spaces like roads, parks and airports, facilitates a regime of mass surveillance and creates a large reservoir of people’s data, such as their facial images. This data, compiled over a long period of time, is then compared with the information of a suspect at hand in the process of criminal identification.
This automated system, with the help of numerous algorithms, identifies distinctive facial features of individuals known as ‘facial landmarks’, which are then converted into a ‘face template’. This template is matched against similar information on the database to identify the criminal. However, experiments across the world have indicated that such an automated system can be tricked, as it fails to account for eyewear or different types of make-up. A similar software used by the UK’s Metropolitan Police returned false readings in 98% of instances. Therefore, in addition to the privacy concerns it poses, the AFRS could also be prone to yielding false results.
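The landmark-to-template-to-match pipeline described above can be illustrated with a minimal sketch. This is a hypothetical toy model, not the NCRB’s actual algorithm: real systems use learned deep-network embeddings rather than raw landmark geometry, but the structure – convert landmarks into a numeric template, then score similarity against a database with a threshold – is the same. The function names, the pairwise-distance template and the 0.99 threshold are all illustrative assumptions.

```python
import math

def face_template(landmarks):
    """Convert facial landmarks ((x, y) points) into a toy 'face template':
    the vector of pairwise distances between landmarks, normalised by the
    largest distance so the template is invariant to image scale.
    (Illustrative only; deployed systems use learned embeddings.)"""
    dists = []
    for i in range(len(landmarks)):
        for j in range(i + 1, len(landmarks)):
            (x1, y1), (x2, y2) = landmarks[i], landmarks[j]
            dists.append(math.hypot(x2 - x1, y2 - y1))
    scale = max(dists) if dists else 1.0
    return [d / scale for d in dists]

def similarity(t1, t2):
    """Cosine similarity between two templates; 1.0 means identical geometry."""
    dot = sum(a * b for a, b in zip(t1, t2))
    n1 = math.sqrt(sum(a * a for a in t1))
    n2 = math.sqrt(sum(b * b for b in t2))
    return dot / (n1 * n2)

def match(probe, database, threshold=0.99):
    """Return identities whose stored template resembles the probe.
    The threshold trades false positives against false negatives -
    set too low, it produces the kind of false readings noted above."""
    return [name for name, tmpl in database.items()
            if similarity(probe, tmpl) >= threshold]
```

The threshold choice is where the 98% false-reading problem lives: a lenient threshold flags many innocent near-matches, while a strict one misses genuine ones, and glasses or make-up shift the landmark geometry enough to move a face across that line.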
The Introduction of AFRS: A Cause of Concern
The Indian judiciary was initially reluctant to recognise the right to privacy as fundamental and inalienable. This dictum, first laid down in M.P. Sharma v Satish Chandra[i], was later followed in the Kharak Singh[ii] case. However, Justice Subba Rao, in his dissenting opinion, leaned towards an expansive interpretation of Article 21 that would include the right to privacy. Further, the invalidation in the Shreya Singhal[iii] judgment of Section 66A of the IT Act 2000 (which allowed the State to control and prohibit the dissemination of information) is a notable landmark against the use of technology to facilitate public surveillance.
In the 2017 judgment of K.S. Puttaswamy v Union of India[iv], the Supreme Court affirmed the right to privacy as a fundamental right within the purview of the right to life under Article 21 of the Indian Constitution. Article 12 of the UDHR and Article 17 of the ICCPR, which prohibit ‘arbitrary interference’ with a person’s privacy, ensure that the right to privacy also enjoys a robust international framework. The creation of the database requisite for the functioning of the AFRS raises the question of whether the consent of the person was obtained in acquiring the image stored on the database. Multiple reports have indicated that facial recognition tools such as this one use images extracted from the internet and CCTV visuals without the consent of the individuals concerned.
Further, any interference with the privacy of individuals ought to satisfy the three-fold test of legality, legitimate aim and proportionality laid down in the Puttaswamy judgment. There must be an existing legal framework regulating such an encroachment into a person’s privacy. In the absence of specific laws governing the AFRS and of personal data protection laws in India, such a system of criminal identification is bound to fail the constitutionality test. The Puttaswamy test further mandates that such an incursion into a person’s privacy be in furtherance of a legitimate state interest, and that the intervention not be disproportionate to its purpose. The determination of a legitimate state interest, and of whether the incursion meets the proportionality test, therefore requires a functioning regulatory framework to be in place.
In the Aadhaar[v] judgment, the Supreme Court allowed the accumulation of demographic details and facial pictures, finding that such collection did not violate a person’s ‘reasonable expectation of privacy’, as the data on its own could not reveal much about an individual. However, the mosaic theory of privacy holds that data about an individual collected over a long period of time allows information to be analysed on a qualitatively larger scale than smaller bits of information would permit. The AFRS database comprises data about an individual collected over a long period, bridging and integrating information and thereby making surveillance much easier. The mode of data collection under the AFRS is therefore not a single step but a sequence of steps in collecting information about an individual.
These problems arising from the introduction of the AFRS are mainly due to the lack of rules or guidelines accompanying it. In the absence of a regulatory framework, it is prone to arbitrary misuse of power and the unfair targeting of specific sets of individuals through a dangerously uncontrolled extent of profiling. While it was initially intended to trace missing children and curb such criminal activities, instances have been reported where the AFRS has been employed to target protestors and dissenters at rallies. Its ability to collect and store information about an individual may be manipulated by the police to unfairly target and label persons as ‘habitual protestors’ or ‘rowdy elements’, as was seen in Hong Kong and during the recent protests in Delhi.
A facial recognition software such as the AFRS, if misused, may serve as a tool of control and could also yield inaccurate results that strike at the edifice of our criminal justice system. Left unregulated, the implementation of the AFRS can have a chilling effect on people, curtailing the fundamental rights of expression and assembly along with that of privacy, as Daragh Murray has observed. In view of this ‘chilling effect’ on the general populace, San Francisco has already banned facial recognition systems, while the EU has proposed a temporary ban on the technology. Even if it were conceded that the AFRS could be implemented with efficient safeguards, the harm it poses to society in curtailing the fundamental rights of expression and privacy outweighs the benefits it provides.
[i] M.P. Sharma v. Satish Chandra, AIR 1954 SC 300.
[ii] Kharak Singh v. State of U.P., AIR 1963 SC 1295.
[iii] Shreya Singhal v. Union of India, (2015) 5 SCC 1.
[iv] K.S. Puttaswamy v. Union of India, (2018) 1 SCC 809.
[v] K.S. Puttaswamy v. Union of India, (2018) 1 SCC 809.
Title Image Source: BiometricToday
The article has been written by Philip Ashok Alex who is a third-year student at National Law University Delhi. He is passionate about International Law, world affairs and the growing importance of tech law as well as the evolution of IPR in India along with the issues concerning intellectual property rights in the TRIPS regime.