
AUTOMATED FACIAL RECOGNITION: IS THE NEW FACE OF INNOVATIVE POLICING IN INDIA CONSTITUTIONAL?


Introduction

During the pandemic, when the nation’s attention was fixed upon the Environmental Impact Assessment (EIA) Draft, 2020 and the National Education Policy (NEP), 2020, the National Crime Records Bureau (NCRB), through a little advertisement tucked away within newspapers, invited applicants to bid for the implementation of a nationwide Automated Facial Recognition System. Unlike the EIA Draft, 2020 and the NEP, 2020, the 172-page Request for Proposal (RFP) released by the Bureau, shrouded in secrecy within the folders of its website, garnered no attention even though its legal implications are wide-ranging and severe.

While this national-level project remains in its nascent stages, police departments are already in possession of Facial Recognition Technology (hereinafter referred to as ‘FRT’) developed by public and/or private developers. One such example is Facetagr, a face-recognition mobile app developed in 2017 by Tamil Nadu IPS officer P. Aravindhan and his team. The application has helped police officials identify and catch criminals within the state and was awarded the prestigious Skoch Award. Similarly, Punjab Police won the distinguished FICCI Smart Policing Award for developing the Punjab Artificial Intelligence System (PAIS), a mobile application to curb the activities of gangsters and other criminals. Other states such as Telangana, Uttar Pradesh, Uttarakhand, Delhi and Maharashtra have similar apps.

The objective of this article is to discuss the ramifications of the use of the Automated Facial Recognition System by the Bureau without a proper legal framework. The article intends to present a clear picture of the accuracy and effectiveness of Automated Facial Recognition Technology, and the many ways in which it falls short of satisfying the tests laid down by the Supreme Court in the Puttaswamy right to privacy case.

What is Facial Recognition Technology?

To understand the problems associated with FRT implementation, we must first understand how FRT works. As Smriti Parsheera explains in her recent paper, its working can be summarised in the following steps (a simplified sketch follows the list):

  1. Detection of the face in a picture and/or video.

  2. Extraction of particular features from the detected image.

  3. Recognition, which results in either the identification or the verification of a person.
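To make these three steps concrete, the following is a minimal, purely illustrative sketch of the detection–extraction–recognition pipeline. It assumes the open-source `face_recognition` Python library (which is not mentioned in the NCRB tender and has no connection to the proposed system); the file names and the matching threshold are placeholders chosen for illustration only.

```python
# Illustrative sketch of the FRT pipeline: detection, extraction, recognition.
# The library choice, file names and threshold are assumptions, not the NCRB system.
import face_recognition

# 1. Detection: locate faces in a probe image (e.g., a CCTV still).
probe_image = face_recognition.load_image_file("probe.jpg")        # hypothetical file
face_locations = face_recognition.face_locations(probe_image)

# 2. Extraction: convert each detected face into a numerical feature vector
#    (here, a 128-dimensional encoding).
probe_encodings = face_recognition.face_encodings(probe_image, face_locations)

# 3. Recognition: compare the probe encodings against a gallery of known
#    encodings to identify (1:many) or verify (1:1) a person.
gallery_image = face_recognition.load_image_file("gallery_photo.jpg")  # hypothetical file
gallery_encodings = face_recognition.face_encodings(gallery_image)

for encoding in probe_encodings:
    matches = face_recognition.compare_faces(gallery_encodings, encoding, tolerance=0.6)
    distances = face_recognition.face_distance(gallery_encodings, encoding)
    print("Match found:", any(matches), "| distances to gallery faces:", distances)
```

The point of the sketch is simply that the output of step 3 is a probabilistic similarity judgment against a threshold, not a certainty; this is what makes the accuracy and error-rate questions discussed below legally significant.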

What Is The Cause Of Concern With The Request For Proposal By NCRB?

One of the biggest causes for alarm is the Request for Proposal (RFP) issued by the National Crime Records Bureau. The RFP invites bids for the creation of a National Automated Facial Recognition System (AFRS), which would be used to create a national database of photographs to help swiftly identify criminals by gathering data from various other databases such as Passport, CCTNS, ICJS, Prisons, the WCD Ministry's KhoyaPaya, NAFIS, or other image databases available with the police or other entities. The rapid deployment of FRTs, without accompanying guidelines or legal frameworks, raises a number of questions. These concerns revolve around the lack of transparency around facial recognition systems; their implications for privacy and civil liberties; and the evidence of bias and discrimination in their outcomes.

Equally, we also need to question the accuracy and effectiveness of facial recognition systems, namely, their ability to achieve what they claim to do and their suitability for the specific context in which the technology is sought to be deployed.

While all these concerns hold true for the use of FRTs by the government as well as by private entities, the imbalance of power between the citizen and the state, and the likely consequences of an abuse of that power, make this a particularly daunting problem in the context of law enforcement. FRT is being introduced in India without paying heed to the problems which may arise from its implementation. While the obvious concern is that a faulty system with a high error rate may be implemented, the implementation of even an accurate FRT system raises serious privacy concerns.

Analysing Facial Recognition System Through The Puttaswamy Lens

The legitimacy of any state-led intervention involving the use of FRTs will necessarily have to pass muster under the tests laid down by the Supreme Court in K.S. Puttaswamy v. Union of India. This was followed by the decision in the Aadhaar case, where we saw the application of these tests in the context of biometric technologies. The Internet Freedom Foundation’s Project Panoptic, through a legal notice to the NCRB, has addressed how the RFP issued by the NCRB fails the proportionality standard laid down in the Puttaswamy decisions. The Hon’ble Supreme Court has propounded an interpretation of proportionality which requires the Court to consider it in the following manner[1]:

1. Is the State pursuing a legitimate aim?

A. The State’s aim must be legitimate, not necessarily compelling.

The stated objectives of the NAFRS, which include the identification of criminals, missing children and adults, and unidentified dead bodies, all lie well within the bounds of legitimate state objectives. Therefore, it would be reasonable to expect that courts in India will not find it hard to agree that the proposed NAFRS satisfies this first test. However, the mechanism proposed under the NCRB tender is a prima facie violation of the first test of legality, as the system being proposed under it does not have any statutory basis; neither is it created under any rules or regulations which might in turn have statutory backing. This is in direct contrast to situations where there exists a statutory basis for access to personal data, such as the provisions on the interception of telephonic messages and data under the Telegraph Act and the Information Technology Act, respectively, or the collection of fingerprints under the Identification of Prisoners Act.

In the Aadhaar case, we saw the Supreme Court strike down certain requirements of mandatory linking of Aadhaar precisely for the reason that such actions did not have a legal basis. This was the case with the administrative circular on mandatory verification of SIM card ownership as well as the linkage with various scholarship schemes by bodies such as the Central Board of Secondary Education and the University Grants Commission. The lack of any enabling legal provision in the case of NAFRS would therefore be the first barrier to its legitimate adoption. Even if the government were to subsequently enact such a law, or argue that the basis for the adoption of FRTs flows from general investigation powers under criminal law, the design and implementation of NAFRS would of course still have to pass muster under the other layers of the Puttaswamy tests.

2. Are the means used to achieve this aim reasonable or suitable?

It is going to be much harder to justify how the deployment of FRTs over large segments of the population, without their consent, can be regarded as a proportionate response for meeting the desired goals. While rejecting the justification of countering black money as the basis for the mandatory linkage of Aadhaar with bank accounts, the majority verdict in the Aadhaar case[2] noted that imposing such a restriction on the entire population, without any evidence of wrongdoing on their part, would constitute a disproportionate response. In the words of the court, “under the garb of prevention of money laundering or black money, there cannot be such a sweeping provision which targets every resident of the country as a suspicious person”. Such a “presumption of criminality” would be treated as disproportionate and arbitrary. The lack of any data minimisation norms or mechanisms to ensure purpose limitation will also make it harder for the state to justify the reasonableness of the selected mechanism. For instance, there are no effective limitations on the sources of images that can legitimately be used by the system, the gravity of offences that might qualify for its use, or checks against any further mission creep in the purposes for which the NAFRS may be used. Hence, the NAFRS fails the second test.

3. Is there a less intrusive way to achieve the State objective?

This enquiry includes:

A. Identifying alternatives to the measure adopted by the State.

B. Asking how effective each of these alternative measures are. Do they achieve the State objective in a ‘real and substantial manner’?

C. What is the impact of each of these measures on the infringed right?

D. The Court will undertake a ‘balancing exercise’ at this stage.

The third prong of the proportionality test is a fact-based test, as it necessarily requires the Court to examine the various alternative measures that could be adopted to achieve the intended goal of the state. After such examination, the Court should choose the least restrictive but equally effective measure. Here, the stated objective of the system is “to act as a foundation for a national level searchable platform of facial images, and to improve outcomes in the area of criminal identification and verification by facilitating easy recording, analysis, retrieval and sharing of information between different organizations”.

In the IMAI case[3], the Court established a new standard that had not been seen before: the state must establish, through ‘empirical data’, why the action it took was the only possible measure and why no alternative measure could have been adopted. Going by the available empirical data, FRTs will fail this third prong. Claims concerning the accuracy of FRT systems are routinely exaggerated. For instance, in 2018, the Delhi Police reported that its FRT system had an accuracy rate of 2%, which fell to 1% in 2019, with the system failing to distinguish between girls and boys. While there have been claims of a perfectly accurate FRT system, none of these claims has been corroborated by an independent review and audit. The National Institute of Standards and Technology (NIST) has extensively tested FRT systems for 1:1 verification and 1:many identification, and has examined how the accuracy of these systems varies across demographic groups. These independent studies have concluded that, currently, no FRT system is 100% accurate. The implementation of such faulty FRT systems would lead to high rates of false positives and false negatives in the recognition process, which in turn may lead to the arbitrary exclusion of individuals from government schemes and benefits. The failure of biometric-based authentication under Aadhaar has already led to many people being excluded from essential government services and has even led to starvation deaths.
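To see why even a small per-comparison error rate translates into a large number of false matches at population scale, consider the following back-of-the-envelope sketch. Every figure in it (gallery size, false match rate, search volume) is a hypothetical assumption used purely for illustration, not a claim about any deployed or proposed system.

```python
# Hypothetical illustration of false matches in a 1:many identification search.
# All numbers below are assumptions for illustration only.

gallery_size = 10_000_000      # assumed size of a national photo gallery
false_match_rate = 1e-6        # assumed chance of a wrong match per comparison
searches_per_day = 1_000       # assumed number of probe searches run per day

false_matches_per_search = gallery_size * false_match_rate
false_matches_per_day = false_matches_per_search * searches_per_day

print(f"Expected false matches per search: {false_matches_per_search:.0f}")
print(f"Expected false matches per day:    {false_matches_per_day:.0f}")

# Even with an assumed per-comparison error rate of 1 in a million, a single
# search against a 10-million-image gallery is expected to return about 10
# wrong hits, i.e. roughly 10,000 innocent "matches" per day at this assumed
# search volume.
```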

4. Balancing the State objective on the one hand with the importance of the right and the extent of the intrusion on the right on the other.

A. This balancing is best done by following bright-line rules which are either established or need to be created.

The implementation of the FRT system would also violate fundamental rights by facilitating mass surveillance. For instance, there would be a chilling effect on the right to freedom of speech and expression, because people will be wary of being prosecuted if they express anti-government sentiments; one such example is the use of FRT during the Delhi protests. Further, the right to freedom of movement would be hampered, as mass surveillance would allow the government to track the movements of individuals in real time across the country. Finally, the right to privacy would be violated, as the sensitive personal data collected by these FRT systems would be used by the government without the informed consent of the individual. This would also prevent individuals from exercising the liberty to share their information in some contexts and remain anonymous in others, according to their individual choice. This extent of intrusion into fundamental rights cannot be balanced against or justified by the objectives of the Bureau’s RFP.

Conclusion

An analysis of the NCRB’s tender demonstrates how the proposed design of the facial recognition system falls grossly short of satisfying constitutional safeguards. In short, the system lacks legal authorisation; it does not constitute a necessary and proportionate intervention to meet the desired objectives; and there is a complete absence of procedural safeguards to ensure its fair and reasonable application. Before law enforcement bodies in India can make use of FRTs, Parliament has to authorise the same through an appropriate legal framework; a mere cabinet note will not suffice. Some of the basic checks and balances of such a framework would include narrow tailoring of the purposes for which the system may be deployed and of the persons whose images may be used for the probe and gallery databases; prior judicial approval for the use of the system; and inbuilt mechanisms for independent analysis and verification of the system’s performance.

[1] Modern Dental College & Research Centre v. State of M.P., (2016) 7 SCC 353 : 2016 SCC OnLine SC 373, at page 414; K.S. Puttaswamy v. Union of India, (2017) 10 SCC 1.
[2] K.S. Puttaswamy v. Union of India, (2019) 1 SCC 1.
[3] Internet and Mobile Association of India v. Reserve Bank of India, 2020 SCC OnLine SC 275.

Title Image Source: The Economic Times


This article has been written by BALAJI A.P, who is a law student at the School of Excellence in Law, Chennai.