
NITI Aayog calls for policy and legal reforms to regulate use of facial recognition technology (FRT) in India

NITI Aayog has proposed fixing liability, and the extent of liability, for any harms or damages caused by the use of facial recognition technology (FRT) systems. It has also suggested the need for an ethical committee to address issues of transparency, accountability, and bias arising from the use of such systems in India.

“These issues warrant separate regulation, either through codes of practice, industry manuals and self-regulation, or through more formal modes like statutes and rules made thereunder,” it said.

“The objective is to create a holistic governance framework addressing the multifaceted challenges posed by FRT systems,” it said in a report, prepared by the Vidhi Centre for Legal Policy on behalf of the Aayog.

The report calls for policy and legal reforms to regulate the use of facial recognition technology in India.

According to the report, organisations deploying an AI system can constitute an ethical committee to assess the ethical implications and oversee mitigation measures.


“Specifically, for FRT systems, it is imperative that such committees are constituted and given adequate autonomy to prescribe guidelines and codes of practice to ensure compliance,” it said, adding that this is also crucial for India to develop thought leadership around FRT governance and regulation at the international level.

The report also suggests the need for transparency around the deployment of FRT systems in the public domain, at both the central and state levels. “This is necessary for individuals to exercise their informational autonomy (and the right to privacy) as well as securing public trust in the development and deployment of such systems, which is intrinsic to the concept of responsible AI,” it added.

On the need for a data protection regime, the report noted that the Ministry of Electronics and Information Technology is involved in establishing the data protection framework.

“Any ongoing or future application of FRT systems by governments in India must be compliant with the three-pronged test of legality, reasonability and proportionality, as set out by the Supreme Court, in order to ensure constitutional validity,” it added.

In a case study on Digi Yatra, a digital platform that verifies air travellers using biometric data, the report said the platform's policy needs to spell out all the rules related to deletion of passenger information from the database once travel is complete.

There have been privacy concerns in various quarters about Digi Yatra user data.

Based on facial recognition technology, Digi Yatra provides contactless and seamless movement of passengers through various checkpoints at airports.

Though the Digi Yatra policy provides for deletion of facial biometrics from the local airport's database 24 hours after the departure of the passenger's flight, the study said, “the rules related to deletion of other information collected from the passengers, as well as any facial biometrics that are stored in other registries, must be clearly set out in the policy.”