
A Looming Panopticon in the Civil Aviation Sector: The Immediate Need to Address Privacy Concerns Arising out of DigiYatra's Data Storing Policy



Written by Souhardya Roy, a law student pursuing B.A. LL.B. at The West Bengal National University of Juridical Sciences, Kolkata.


Introduction

With the advent of Artificial Intelligence (“AI”) in the mainstream, the scope of its application has grown exponentially. In synthesizing big data analytics in particular, AI has proved to be a significant game-changer, allowing both private and state entities to examine, analyze, and substantially influence public behavior. This also holds for the State's employment of Facial Recognition Technology (“FRT”). FRT is an AI system whose underlying algorithms identify and verify an individual from digital media such as images or videos. This involves accessing and synthesizing biometric data such as facial images, which can be captured with individual consent, through public channels like social media, or from databases that store biometric information, such as those maintained by the UIDAI. Accessing such sensitive information, with or without the individual's consent, and especially for purposes the individual did not envision, constitutes a significant violation of liberty. An important instance of FRT being employed in an unregulated manner is the opt-in DigiYatra ecosystem, which is meant to be governed by the Digi Yatra Biometric Boarding System (“DY-BBS”) policy. Even at its nascent stage of implementation across thirteen airports, with plans to extend it to fourteen more, this system poses severe threats to the right to privacy and liberty, which must be addressed immediately before it becomes a mandatory requirement for every airline passenger in India.


What is DigiYatra, and How Does it Use FRT?

Introduced in 2017, DigiYatra (“DY”) was set up as an opt-in service by the Ministry of Civil Aviation (“Ministry”). DY is a biometric boarding system (“BBS”) that relies on passenger identity management to deliver the services necessary for boarding, baggage check-in, security checks, and so on. The preliminary adjustments took the form of enhancing airport infrastructure, digitizing manual operations, improving security standards, and lowering the overall cost of operations. The primary agenda was to digitize and automate the airport screening process so that passengers could board their flights in a “seamless, contact-free, hassle-free, and paperless” manner. The service is available at select Indian airports (as of June 2024) and uses FRT and Aadhaar-linked credentials to authenticate passengers as a substitute for in-person authentication at the various security checkpoints in the airport. The objective was clear from the outset – to streamline the operations necessary for air travel and deliver significant benefits to the civil aviation sector in India.


The employment of FRT in the DY-BBS framework operates on a 1:1 basis, wherein the FRT verifies a specific person’s live facial capture against their previously enrolled facial data, and this verification is used to authenticate individuals for disbursing services and other benefits. FRT in the DY-BBS ecosystem thus involves two components – the creation and authentication of a digital identity for a user/passenger, and the subsequent digital verification of this identity at the various checkpoints in the boarding process. This identity management system stands in for the present practice of manual verification at security checks and, prima facie, has the potential to supplement or even substitute manual verification altogether in the future. When contrasted with the physical verification system in place, this opt-in system appears a lucrative option for both the State and passengers.
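A minimal sketch of how such 1:1 verification typically works is given below. It is not drawn from DigiYatra’s actual implementation; the embedding size, threshold, and function names are assumptions made purely for illustration, to show how a single enrolled template is matched at each checkpoint rather than searched against an entire gallery.

```python
# Illustrative sketch of 1:1 (one-to-one) face verification, the matching
# mode the DY-BBS policy describes. This is NOT the actual DigiYatra
# implementation; the embedding model, dimensions, and threshold below are
# hypothetical stand-ins.
import numpy as np

EMBEDDING_DIM = 512          # typical size for face-embedding models (assumption)
MATCH_THRESHOLD = 0.6        # illustrative cosine-similarity cut-off

def extract_embedding(face_image: np.ndarray) -> np.ndarray:
    """Stand-in for a face-embedding model; returns a unit-norm vector."""
    vec = np.resize(face_image.astype(np.float64).flatten(), EMBEDDING_DIM)
    return vec / (np.linalg.norm(vec) + 1e-9)

def enrol(face_image: np.ndarray) -> np.ndarray:
    """Step 1: create the passenger's digital identity at registration."""
    return extract_embedding(face_image)

def verify(enrolled: np.ndarray, live_capture: np.ndarray) -> bool:
    """Step 2: 1:1 check at a checkpoint - the live face is compared only
    against the single enrolled template, not searched across a gallery."""
    live = extract_embedding(live_capture)
    similarity = float(np.dot(enrolled, live))   # cosine similarity of unit vectors
    return similarity >= MATCH_THRESHOLD
```

Even in this toy form, the design choice is visible: verification needs only the passenger’s own facial template, which is why the extensive collection of additional data discussed below has no technical nexus with the authentication function.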


As of now, DY-BBS has been implemented as an opt-in service by the Ministry pursuant to its announcement of the DY-BBS policy in 2018, which set out the passenger process and the technical features of DY-BBS. The Digi Yatra Foundation (“DYF”) was subsequently established in 2022 to implement the DY-BBS ecosystem. It is registered as a not-for-profit company under Section 8 of the Companies Act, 2013, and continues to govern the implementation process.


Evaluation of the DY-BBS Policy

Although the DY-BBS policy envisages an opt-in model that requires a passenger’s voluntary consent before their biometrics are accessed, there have been instances where state actors at airports coerced passengers into enrolling or represented the service as mandatory. Furthermore, the DY-BBS policy lacks legal backing: there is no legislation or legal framework that gives the policy statutory force. As a result, there are no prescribed redressal mechanisms for any infringement by state or non-state actors. This is particularly true for determining what constitutes free and “informed” consent from the users and non-users of the DY-BBS ecosystem.


When it comes to the data-storing policy, the DY-BBS policy contains a broad and vague exemption which states that passenger data can be shared with government authorities, without providing an exhaustive list of circumstances in which such sharing may occur. This creates significant potential for the state to misuse such sensitive biometric data in ways that violate free speech and liberty under Article 19 of the Indian Constitution; for instance, the FRT system may be used to conduct additional screening of specific demographics at the whims of the State. Another aspect that raises significant concern is the range of data that can be collected from users, which extends from identity and biometrics to passwords and other media collected at the various checkpoints in the boarding and screening process. Furthermore, there is no limitation on the scope of use of the collected data, which the policy states may be used for “improvement of products”, “conducting surveys”, and the like. Such extensive data collection has no nexus with providing authentication services, which primarily require facial biometrics. This directly contradicts the privacy principle of data minimization in the processing of data, which is the currently accepted standard.


Another problematic clause in the DY-BBS policy permits the sharing of a user’s personal information with connected service providers such as hotels and cab-hailing services. This inherently contradicts the claim that other entities cannot use data collected through DY-BBS because it is encrypted. It also calls into question the policy of localized data storage, under which the Ministry has claimed that data is stored on the user’s device, that there is no centralized storage, and that the sensitive biometric information is removed 24 hours after departure. This claim has remained unaddressed since the “Dataevolve” scandal, when it was found that data authentication for DY-BBS takes place on the Amazon Web Services cloud platform. A further problem with the present framework is the lack of transparency and accountability from the state and the DYF: as a private company, the DYF does not come within the purview of the Right to Information Act, 2005, and is therefore not accountable to the general populace.
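To make the localized-storage claim concrete, the sketch below shows what an auditable, time-bound deletion rule of the kind the Ministry describes – purging biometric records 24 hours after departure – might look like. The record structure and function names are hypothetical and do not reflect DigiYatra’s actual codebase; the point is only that such a rule is straightforward to express and verify, yet nothing in the policy legally binds the DYF or its vendors to it.

```python
# Hypothetical illustration of a 24-hour retention rule for biometric
# records, as claimed in the DY-BBS data-storage policy. Not DigiYatra code.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(hours=24)   # the "24 hours after departure" window

@dataclass
class BiometricRecord:
    passenger_id: str
    encrypted_template: bytes     # assumed encrypted facial template
    departure_time: datetime

def purge_expired(records: list[BiometricRecord],
                  now: datetime | None = None) -> list[BiometricRecord]:
    """Drop every record that is more than 24 hours past departure."""
    now = now or datetime.now(timezone.utc)
    return [r for r in records if now - r.departure_time <= RETENTION]
```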


These issues arising out of the ambiguous DY-BBS policy compel an application of the Puttaswamy test, developed by the Supreme Court in K.S. Puttaswamy v. Union of India [(2017) 10 SCC 1] and refined in Justice K.S. Puttaswamy (Retd.) v. Union of India [(2019) 1 SCC 1]. These tests are equipped to deal with the validity of this consent-based opt-in framework: the three-fold test of legality, legitimate state aim, and proportionality will be invoked, and the four-prong test of proportionality will also come into play. The answers would be negative on all three prongs. First, there is no anchoring legislation or operative data protection rules upon which the policy may survive. Second, there may be a legitimate interest in ensuring streamlined operations, but convenience can never amount to a necessity justifying the curbing of civil liberties enshrined under Article 21 of the Constitution of India. Third, on proportionality – the policy may have a legitimate goal, but it does not employ a suitable means to further that goal, nor is it the least restrictive means; and it has the potential to cause a disproportionately adverse impact on passengers by exposing their sensitive, confidential data not just to the state but also to third-party vendors.


An effort to anchor the DY-BBS policy, amongst other FRT systems, is being made under the Digital Personal Data Protection Act, 2023 (“DPDPA”). Since the DPDPA has not yet come into effect, FRT systems like DY-BBS remain without a legislative anchor from which they can derive their authority. However, even after the DPDPA comes into effect, it will not unequivocally address the issues arising out of the DY-BBS framework. Sections 3 and 4 of the DPDPA govern the application of the Act and set out the grounds on which a data fiduciary may process personal data. The difficulty lies in the ambiguous terminology of Section 4(1)(b), which allows the data fiduciary to process data in accordance with the Act for “certain legitimate uses”. This ambiguity leaves room for a significant degree of state involvement, especially since Section 17 of the DPDPA exempts data fiduciaries from the application of the Act in specific circumstances, notably where state actors such as law enforcement require access to such personal data for investigation or in furtherance of any of the reasonable restrictions contemplated under Article 19 of the Constitution.


Although such an exemption is subject to a central government notification, the preservation of civil liberties cannot hinge upon the discretion of a state that is capable of suppressing free speech and dissent. Sensitive personal data must be safeguarded through more accountable mechanisms than policies like the DY-BBS privacy policy, which may not be able to provide the requisite safeguards against DY-BBS’s data processing and sharing mechanisms. Furthermore, the DPDPA contains no classification of “sensitive personal data” as a distinct category requiring additional layers of secure processing. This is in contrast to frameworks like the IT (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011, or global standards like the Council of Europe’s “Guidelines on Facial Recognition”, which specify the kinds of information classified as sensitive personal data – including biometrics such as facial information – and require additional care and security in their processing. The DPDPA, 2023, thus conspicuously lacks any classification delineating sensitive personal data from general personal data, and a way to remedy this may lie in drawing upon the IT Rules of 2011.


The Way Forward

Amidst such severe criticism, it is undeniable that AI is the future, and preventing its implementation would be futile. Hence, instead of actively opposing the presence of AI in the civil aviation sector, it is necessary to reform the present legislative framework so as to regulate such unchecked application of AI by state entities. The DY-BBS framework holds significant promise and may yet achieve its envisioned goals, but before that, the framework set out in the policy has to be reformed. This may be done by adhering to the Puttaswamy test – specifically, by constructing a comprehensive legislative framework to govern data protection in an FRT system, binding upon both state and private actors. Such legislation must clearly define which categories of data may be collected, how they are to be used to create the digital identity, and how the data collected thereafter is processed for authorization. Further, to safeguard the users’ right to privacy, the data collected must be purged without being made accessible to any third party, which may be achieved through an opt-in system for any such sharing, accompanied by disclosure of the purpose of collection. Also, judicial authorization must be made mandatory before the state may access any user records, along the lines of the safeguards recognized with respect to Section 69A of the Information Technology Act, 2000, in Shreya Singhal v. Union of India [(2013) 12 SCC 73]. These measures would ensure that DY-BBS complies with India’s privacy jurisprudence; otherwise, surveillance risks being aggravated to the extent of a panopticon.

