{"id":165,"date":"2022-09-21T13:27:49","date_gmt":"2022-09-21T13:27:49","guid":{"rendered":"https:\/\/publications.clpr.org.in\/the-philosophy-and-law-of-information-regulation-in-india\/?post_type=chapter&#038;p=165"},"modified":"2022-10-03T08:13:41","modified_gmt":"2022-10-03T08:13:41","slug":"automated-administration-administrative-law-and-algorithmic-decision-making-in-india","status":"publish","type":"chapter","link":"https:\/\/publications.clpr.org.in\/the-philosophy-and-law-of-information-regulation-in-india\/chapter\/automated-administration-administrative-law-and-algorithmic-decision-making-in-india\/","title":{"rendered":"Automated Administration: Administrative Law and Algorithmic Decision-Making in India"},"content":{"raw":"<div style=\"font-weight: 400;\">\r\n\r\n<strong>Introduction[footnote]PhD Candidate, Faculty of Laws, University College London. The author would like to thank Kruthika R. for her inputs and discussions which are invaluable to this paper.[\/footnote]<\/strong>\r\n\r\nWith the ubiquity of digital information and computational tools, there has been a concomitant proliferation in the use of computers to analyse information and produce specific outputs on the basis of encoded rules and logics. 
Such computational tools, which we will refer to as \u2018algorithmic systems\u2019, have implications not only for their use in particular domains (like healthcare or policing), but also for their systemic effects on the manner in which knowledge about individuals and societies is parsed and acted upon.[footnote]Tarleton Gillespie, \u2018The Relevance of Algorithms\u2019 in Tarleton Gillespie, Pablo J Boczkowski and Kirsten A Foot (eds), <em>Media Technologies<\/em> (The MIT Press 2014) &lt;<a href=\"http:\/\/mitpress.universitypressscholarship.com\/view\/10.7551\/mitpress\/9780262525374.001.0001\/upso-9780262525374-chapter-9\">http:\/\/mitpress.universitypressscholarship.com\/view\/10.7551\/mitpress\/9780262525374.001.0001\/upso-9780262525374-chapter-9<\/a>&gt; accessed 29 July 2020.[\/footnote]\u00a0In this paper, I focus on automated decision-making in the public sector, a subset of algorithmic systems which are used within decision-making processes in public administration, either producing knowledge as outputs to be acted upon by human agents, or directly triggering particular actions as an outcome of an algorithmic process.\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nAlgorithmic systems are assuming an increasingly prominent role in public administration in India. Decisions ranging from policy formulation and rule-making, to quasi-judicial functions of evaluating specific claims are now delegated, in varying degrees, to computer algorithms which function with some degree of autonomy and without requiring direct human involvement. 
Algorithmic systems have been used in bureaucratic processes in India since at least the 1980s, when \u2018rule-based\u2019 systems were piloted within tax and healthcare administration.[footnote]Patrick Saint-Dizier, \u2018The Knowledge-Based Computer System Development Program of India: A Review\u2019 (1991) 12 AI Magazine 33.[\/footnote] Contemporary administrative use of algorithmic systems includes the proliferation of \u2018machine learning\u2019 systems, which seek to create their own logics and patterns of understanding based on analysis of vast underlying datasets, in order to optimise for specific outcomes.[footnote]Michael Veale and Irina Brass, \u2018Administration by Algorithm?: Public Management Meets Public Sector Machine Learning\u2019, <em>Algorithmic Regulation<\/em> (Oxford University Press 2019) &lt;<a href=\"https:\/\/oxford.universitypressscholarship.com\/10.1093\/oso\/9780198838494.001.0001\/oso-9780198838494-chapter-6\">https:\/\/oxford.universitypressscholarship.com\/10.1093\/oso\/9780198838494.001.0001\/oso-9780198838494-chapter-6<\/a>&gt;[\/footnote]\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nAs the use of algorithmic systems in society has proliferated, there has been a substantial body of literature engaging with questions about information processing within algorithmic systems, and its legal consequences, particularly under public law. 
Scholars have examined how the move towards data-driven decision-making systems fundamentally impacts concepts of the rule of law and justice, which are at the root of constitutional democracies.[footnote]Mireille Hildebrandt, <em>Smart Technologies and the End(s) of Law: Novel Entanglements of Law and Technology<\/em> (Paperback edition, Edward Elgar Publishing 2016).[\/footnote][footnote]Danielle Keats Citron, \u2018Technological Due Process\u2019 (2007\u20132008) 85 Washington University Law Review 1249.[\/footnote] Scholarship has also dwelled on the impact of algorithmic systems on privacy and data protection law, particularly on the aspect of privacy which preserves individual self-determination and selfhood.[footnote]Helen Nissenbaum, <em>Privacy in Context: Technology, Policy, and the Integrity of Social Life<\/em> (Stanford University Press 2009); Mireille Hildebrandt, \u2018Privacy as Protection of the Incomputable Self: From Agnostic to Agonistic Machine Learning\u2019 (2019) 20 Theoretical Inquiries in Law 83.[\/footnote] A related branch of studies has contended with algorithmic fairness, transparency and accountability, and their implications for legal systems concerned with, for example, the right to information, rights against discrimination and liability for wrongful conduct.[footnote]Solon Barocas and Andrew D Selbst, \u2018Big Data\u2019s Disparate Impact\u2019 (2016) 104 California Law Review 671.[\/footnote]
Early engagement with this subject examined the impact of rule-based expert systems within government and the rise of the \u2018data processing model of bureaucracy\u2019 on concepts of administrative law, including reasonableness and fairness in administrative decision-making and public participation in policy processes.[footnote]Paul Schwartz, \u2018Data Processing and Government Administration: The Failure of the American Legal Response to the Computer\u2019 (1991) 43 Hastings LJ 1321; Citron (n 5).[\/footnote] More recent engagement incorporates concerns relating to developments in big data analysis and machine learning systems as well as the increasing autonomy attributed to algorithmic decision-making systems, including its impact on administrative discretion and processes of adjudication.[footnote]Michael Veale, Max Van Kleek and Reuben Binns, \u2018Fairness and Accountability Design Needs for Algorithmic Support in High-Stakes Public Sector Decision-Making\u2019 [2018] Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems 1; Deirdre K Mulligan and Kenneth A Bamberger, \u2018Procurement as Policy: Administrative Process for Machine Learning\u2019 (2019) 34 Berkeley Technology Law Journal 773.[\/footnote]\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nLegal scholarship engaging with administrative law and algorithmic systems has mostly been within the United States (\u201cU.S.\u201d) and European contexts. 
In India, while there has been renewed and multi-disciplinary scholarly attention paid to information systems utilised within government administration, largely as a result of large-scale projects like Aadhaar,[footnote]Reetika Khera, \u2018Impact of Aadhaar in Welfare Programmes\u2019 (2017) SSRN Scholarly Paper ID 3045235 &lt;<a href=\"https:\/\/papers.ssrn.com\/abstract=3045235\">https:\/\/papers.ssrn.com\/abstract=3045235<\/a>&gt;[\/footnote] legal scholarship as well as judicial and policy attention has approached administrative information processing activities primarily from the lens of informational privacy and data protection. While the lens of privacy and data protection law can and should inform the regulation of algorithmic systems, it is not sufficient to respond to the specific questions that these systems pose within the context of administration and bureaucracy.\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nThe use of algorithmic systems for administrative decision-making is a subject which should concern legal and regulatory scholarship for two interrelated reasons. First, the use of algorithmic systems requires deliberating trade-offs between their presumed benefits, (for example, in reducing costs and increasing efficiency, or curtailing arbitrariness) and perceived harms, (for eg. increasing opacity and reducing accountability). These trade-offs must be deliberated within the context of specific legal frameworks, including constitutional rights, which place constraints on state action, and consequently, on the deployment of algorithmic systems. Second, algorithmic systems pose questions of normative and institutional change for administrative agencies which must be contended with. 
Algorithmic systems substantially impact norms of administrative decision-making, ranging from the role of bureaucratic discretion in the application of statutory rules and standards, to the norms governing procedural fairness in formulating administrative policies and decisions \u2013 questions which are fundamental to administrative law.\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nThere is a long history of administrative law jurisprudence in India, the goal of which is to ensure administrative action ascribes to constitutional principles \u2013 including rights against arbitrary state action, administrative and procedural fairness and equality before the law. This jurisprudence addresses aspects of administrative action from the delegation of legislative powers and administrative rule-making, to public involvement in policy processes, to administrative procurement processes and individual decision-making. Even as algorithmic systems fundamentally alter the characteristics of each of these forms of administrative action, there has been little consideration given to legal or regulatory responses to ensure adherence to recognised principles of administrative law.\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nThis article seeks to explore how algorithmic systems are impacting the function and role of government administration in India, and what this implies for the areas of law which are concerned with the regulation and governance of administrative decision-making within government agencies \u2013 broadly categorised as administrative law. 
This will also illuminate broader questions about the philosophy of information regulation in India, including how information collection and processing activities within algorithmic systems are mediating the citizen-state relationship.\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nThis paper will locate the debates about normative and institutional change brought about by the use of algorithmic systems in the Indian administrative law context. This provides a valuable contribution to the existing literature on the subject for two reasons. First, it provides a framework to engage with administrative algorithmic decision-making within the contours of Indian law and jurisprudence. Second, understanding the effects of the use of algorithmic systems within the context of administrative systems in the particular context of India can inform literature on questions of algorithmic fairness, transparency, accountability and ethics more broadly.\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nPart I will review the literature around the operations of algorithmic systems and their implications for important public values. Part II of the paper will briefly outline the history and the political economy of the contemporary era of \u2018government-by-algorithm\u2019, and review jurisprudence and literature on its implications for the law of public administration. Part III will examine how public agencies in India are utilising automated or algorithmic systems for decision-making. 
Part IV will examine the implications of automated decision-making for administrative legal principles under Indian law.\r\n\r\n&nbsp;\r\n\r\n<strong>Fairness, Accountability and Transparency in Algorithmic Decision-Making<\/strong>\r\n\r\nPublic administration today is increasingly characterised by the use of computational and digital systems to integrate and analyse information or data through algorithmic logics. In particular, there is a rise in the use of so-called \u2018Artificial Intelligence\u2019 (\u201cAI\u201d) and \u2018Big Data\u2019 technologies, propelled by the use of Machine Learning (\u201cML\u201d) systems, which utilise statistical methods to draw inferences from large sets of data, or optimise mathematical functions in order to make \u2018predictions\u2019 for future instances of data. 
This section briefly examines how algorithmic systems impact the values of fairness, transparency and accountability, which are also normative values upheld by administrative law and regulation, as well as values that (nominally) motivate government administration at large.\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nThe term algorithm describes a series of steps through which particular inputs can be turned into outputs.[footnote]Thomas H Cormen and others, <em>Introduction to Algorithms<\/em> (MIT Press 2009).[\/footnote] An algorithmic system is a system which uses one or more algorithms, usually as part of computational software, to produce outputs which may be used for making decisions. Algorithmic systems are characterised not only by the underlying technologies used to compute information, but equally by the social, cultural, legal and institutional contexts where algorithms are embedded, which are crucial determinants of how these systems are used and governed.[footnote]Tarleton Gillespie, \u2018Algorithm\u2019 (Princeton University Press 2016) &lt;https:\/\/www.degruyter.com\/document\/doi\/10.1515\/9781400880553-004\/html&gt; accessed 26 November 2021.[\/footnote] These algorithmic systems, and their implications for public administration and legal and constitutional rights, are the socio-technical systems that this paper focusses on.\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nThe proliferation of these systems in a number of socially consequential areas, such as policing, education, finance and healthcare, both within and external to government, has spurred substantial debates on their implications for important public values, centred largely around the values of transparency, fairness and accountability. 
This framing, while not exhaustive of the range of implications posed by the widespread use of automated decision-making systems and algorithmic technologies, emphasises how algorithmic decision-making systems challenge important assumptions and expectations about consequential decision-making concerning people: how transparent the making of a decision is, how \u2018fair\u2019 such a decision is, and who should be accountable for these decisions.[footnote]Rob Kitchin, \u2018Thinking Critically about and Researching Algorithms\u2019 (2017) 20 Information, Communication &amp; Society 14.[\/footnote] Each of these concepts is highly contested, highly context-specific, and escapes universal definition, yet together they broadly describe the anxieties that algorithmic decision-making has given rise to in various contexts, and which are relevant for our study.\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nTransparency, in the context of algorithmic decision-making, may broadly be described as \u201ca system of observing and knowing that promises a form of control\u201d.[footnote]Mike Ananny and Kate Crawford, \u2018Seeing without Knowing: Limitations of the Transparency Ideal and Its Application to Algorithmic Accountability\u2019 (2018) 20 New Media &amp; Society 973.[\/footnote] Transparency is instrumental in understanding and demanding accountability for a decision. 
Algorithmic decision-making gives rise to challenges of transparency owing to the intrinsic technological inscrutability of some novel forms of algorithmic systems \u2013 such as complex machine learning systems which utilise data with a high number of characteristics,[footnote]Jenna Burrell, \u2018How the Machine \u201cThinks\u201d: Understanding Opacity in Machine Learning Algorithms\u2019 (2016) 3 Big Data &amp; Society 205395171562251.[\/footnote] or which compute data in a manner unintelligible to the audience demanding transparency.[footnote]Jakko Kemper and Daan Kolkman, \u2018Transparent to Whom? No Algorithmic Accountability without a Critical Audience\u2019 (2019) 22 Information, Communication &amp; Society 2081.[\/footnote] However, transparency is also a function of how these systems are integrated into and engage with existing social, institutional or organisational contexts.[footnote]Frank Pasquale, <em>The Black Box Society<\/em> (Harvard University Press 2015).[\/footnote] For example, a major factor inhibiting transparency of algorithmic systems used in the public sector is the reluctance of governments or private contractors to reveal the details of trade-sensitive information.[footnote]<em>Id<\/em>.[\/footnote]\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nFairness, in the context of algorithmic decision-making, is implicated both in the manner in which decisions are made and in their effects on particular individuals or groups, concerning both the intrinsic quality of a decision-making process and the broader distributive implications of the decisions made.[footnote]Solon Barocas, Moritz Hardt and Arvind Narayanan, \u2018Fairness and Machine Learning\u2019 (fairmlbook.org) 253.[\/footnote] Several studies of algorithmic systems used in different social contexts have shown how the impacts of these systems are distributed in ways that are 
considered \u2018unfair\u2019 \u2013 for example, indicating statistical bias based on particular characteristics like class, race or caste (which are often characteristics legally protected against discrimination).[footnote]Barocas and Selbst (n 7).[\/footnote] Bias or discrimination can arise owing to a number of elements in the decision-making process, including (1) the kinds of historical data that a Machine Learning algorithm might take into account, which may include protected characteristics; (2) how the data is processed and whether the processing itself produces (statistically) biased or arbitrary results; or (3) whether the context in which a decision-making system is used is consistently biased towards a particular group.[footnote]Barocas, Hardt and Narayanan (n 19).[\/footnote] Owing to the scale at which algorithmic systems are often deployed, implicit or explicit biases in algorithmic decision-making can lead to systematic discrimination at socially consequential scales.[footnote]ibid.[\/footnote]\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nAccountability in the context of algorithmic decision-making refers to the ability of various actors involved in the production of a decision through an algorithmic system to be held to account for such decisions, including \u201cthe obligation to explain and justify their use, design, and\/or decisions of\/concerning the system and the subsequent effects of that conduct.\u201d[footnote]Maranke Wieringa, \u2018What to Account for When Accounting for Algorithms: A Systematic Literature Review on Algorithmic Accountability\u2019, <em>Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency<\/em> (ACM 2020) &lt;<a href=\"http:\/\/dl.acm.org\/doi\/10.1145\/3351095.3372833\">http:\/\/dl.acm.org\/doi\/10.1145\/3351095.3372833<\/a>&gt; accessed 29 July 2020.[\/footnote] Algorithmic systems within governments are often 
complex systems \u2013 assemblages of data, computational techniques and varying institutional or organisational contexts \u2013 involving different actors responsible for different elements of the system (for example, the developer of the software, the agency responsible for procuring the system and the agency responsible for using it).[footnote]European Parliament, Directorate General for Parliamentary Research Services, <em>A Governance Framework for Algorithmic Accountability and Transparency<\/em> (Publications Office 2019) &lt;<a href=\"https:\/\/data.europa.eu\/doi\/10.2861\/59990\">https:\/\/data.europa.eu\/doi\/10.2861\/59990<\/a>&gt;[\/footnote] This complexity makes it difficult to attribute responsibility for the ultimate decision taken through the use or aid of an algorithmic system to a single actor or organisation, in many cases undermining effective accountability.[footnote]Madeleine Clare Elish, \u2018Moral Crumple Zones: Cautionary Tales in Human-Robot Interaction\u2019 (2019) 5 Engaging Science, Technology, and Society 40.[\/footnote]\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nThe admittedly broad values of fairness, accountability and transparency offer but one frame of analysis for the consequences of algorithmic systems on public values. Algorithmic systems also portend structural effects on, for example, democratic participation and human agency, and their impacts may be usefully analysed from a number of normative lenses or frameworks. 
However, this framing is particularly useful in the context of the aims of this chapter \u2013 to highlight the impact of algorithmic systems on public administration and the values, norms and laws that guide or govern public administration.\r\n\r\n&nbsp;\r\n<div style=\"font-weight: 400;\">\r\n\r\n<strong>Algorithmic Administrative Decision-Making in India\u00a0<\/strong>\r\n\r\nThe use of algorithmic systems and logics for decision-making is hardly a novel phenomenon. Information systems have long played a part in public administration, even within jurisdictions like India which have seen relatively delayed adoption of computers and digital technologies. Historically, digital systems were implemented in order to automate routine and clerical tasks of administration.[footnote]Saint-Dizier (n 2).[\/footnote] Although there is some evidence of the use of more complex systems, such as knowledge-based expert systems (an early form of \u2018artificial intelligence\u2019 which relied on programming syntactic rules to aid in tasks like legal interpretation and analysis), it is only in the past two decades that the implementation of digital systems within public administration has emerged as a transformative phenomenon in India. 
Despite the highly fragmented nature of digital technology use in India, governments at both the Central and the State level have been eagerly adopting these technologies in order to augment and supplant their decision-making capabilities.\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nIn this part, we use three case studies to examine how algorithmic technologies intersect with administrative decision-making processes at different stages, and explore their implications for the administrative law principles discussed previously.\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<p style=\"padding-left: 40px;\"><strong>1. Tax Assessment and Case Allocation under the Income Tax Act<\/strong><\/p>\r\nIn 2019, the Government of India introduced a scheme to replace the manual assessment of income tax returns selected for additional scrutiny with an automated system known as the Faceless Assessment Scheme (\u201cFAS\u201d). In 2020, the Indian parliament amended certain provisions of the Income Tax Act (\u201cTax Amendment Act\u201d)[footnote]The Taxation And Other Laws (Relaxation And Amendment Of Certain Provisions) Act, 2020.[\/footnote] to incorporate the FAS, which, inter alia, includes provisions for an \u2018automated allocation tool\u2019 and an \u2018automated examination tool\u2019, which are defined as algorithmic systems for the randomised allocation of cases and the standardised assessment of draft orders, respectively.\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nAs per the Tax Amendment Act,\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n\u201c\u2018Automated allocation tool\u2019 means an algorithm for randomised allocation of cases, by using suitable technological tools, including artificial intelligence and machine learning, with a view to optimise the use of 
resources.\u201d[footnote]S.4 (XXIV), The Taxation And Other Laws (Relaxation And Amendment Of Certain Provisions) Act, 2020.[\/footnote] and\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n\u201c\u2018Automated examination tool\u2019 means an algorithm for standardised examination of draft orders, by using suitable technological tools, including artificial intelligence and machine learning, with a view to reduce the scope of discretion.\u201d[footnote]S.4 (XXIV), The Taxation And Other Laws (Relaxation And Amendment Of Certain Provisions) Act, 2020.[\/footnote]\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nUnder these provisions of the Tax Amendment Act, decisions about the \u2018randomised allocation\u2019 of tax assessments, as well as the examination of draft assessment orders, are to be automated through suitable technological tools, including \u201cartificial intelligence and machine learning\u201d, in order to optimise resources and reduce discretion, respectively (echoing the standard justifications for automating administrative decisions which we noted in the previous section).\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nAutomation enters the tax assessment system at two points. The Automated Allocation algorithm is used by the tax authorities in order to identify specific cases for tax assessments, and to allocate the scrutiny of tax returns to a specific regional assessment centre, ostensibly to reduce bias and increase transparency in the selection and allotment of cases for further scrutiny. 
After the initial assessment, a draft assessment order is prepared by the authority, which is then analysed by the Automated Examination Tool, an algorithmic system, and the taxpayer is intimated of the final assessment.\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nThe details of the algorithms used, the statistical techniques applied or the data on which the Machine Learning system is supposed to work have not been made publicly available, and the considerations that an algorithmic system for allocation or examination must take into account are not specified in the primary legislation (the Income Tax Act) or in the rules made by the tax administrative authority (the Central Board for Direct Taxes).\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nIn addition to automating and augmenting manual allocation and assessment of draft orders, the FAS resulted in assessments being conducted without providing a hearing to affected persons. Consequently, a number of challenges were raised before various High Courts arguing that proceedings finalising assessments were conducted without granting the right to a personal hearing before the adjudicating officers.[footnote]Chander Arjandas Manwani, Bombay High Court, Writ Petition No. 3195 of 2021, order dated 21st September 2021; RMSI Private Ltd. v. National E-Assessment Centre, Delhi High Court, W.P.(C) 6482\/2021, order dated 14\/07\/2021.[\/footnote]\r\n\r\n&nbsp;\r\n<p style=\"padding-left: 40px;\"><strong>2. 
<span class=\"TextRun MacChromeBold SCXW129818391 BCX0\" lang=\"EN-IN\" xml:lang=\"EN-IN\" data-contrast=\"auto\"><span class=\"NormalTextRun SCXW129818391 BCX0\" data-ccp-parastyle=\"Normal (Web)\">Voter Roll \u2018Deduplication\u2019 by the Electoral Commission of India\u00a0<\/span><\/span><span class=\"EOP SCXW129818391 BCX0\" data-ccp-props=\"{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559740&quot;:276}\">\u00a0<\/span><\/strong><\/p>\r\nRecent exercises undertaken by the Electoral Commission of India (\u201cECI\u201d) to \u2018clean\u2019 voter rolls through digital deduplication algorithms are another important example of algorithmic decision-making disturbing individual rights in novel ways.\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nIn 2015, the ECI launched the National Electoral Roll Purification and Authentication Programme (\u201cNERPAP\u201d) with the objective of \u201cbringing a totally error-free and authenticated electoral roll\u201d, through linking electoral databases with the database of India\u2019s national biometric resident database \u2013 UID or Aadhaar. The process of \u2018linking\u2019 databases was implemented through a computer software programme which was used to algorithmically \u2018deduplicate\u2019 \u2013 i.e., remove multiple copies of the same data from a database \u2013 voter lists, ostensibly in order to ensure that there is no voter fraud due to the possession of multiple voter ID cards. This was achieved by comparing Aadhaar data \u2013 deemed to be a unique reference, with the demographic details of individuals enrolled on voter lists. 
Ostensibly, if the Aadhaar data mapped to more than one voter record, it would be deemed to be a \u2018duplicate\u2019 and removed from the voter rolls.[footnote]\u2018Linking of Electoral Data with Aadhaar: All You Need to Know\u2019 <em>The Times of India<\/em> (21 December 2021) &lt;<a href=\"https:\/\/timesofindia.indiatimes.com\/business\/india-business\/linking-of-electoral-data-with-aadhaar-all-you-need-to-know\/articleshow\/88408171.cms\">https:\/\/timesofindia.indiatimes.com\/business\/india-business\/linking-of-electoral-data-with-aadhaar-all-you-need-to-know\/articleshow\/88408171.cms<\/a>&gt;.[\/footnote]\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nThe NERPAP process was trialled across a number of jurisdictions, most prominently perhaps in Telangana, where 30,00,000 people were reportedly removed from the voter rolls without following the established procedure, thereby preventing them from participating in the state elections.[footnote]\u2018Democracy at Stake: Why Many Eligible Voters Might Not Vote in Telangana on Dec 7 | The News Minute\u2019 &lt;<a href=\"https:\/\/www.thenewsminute.com\/article\/democracy-stake-why-many-eligible-voters-might-not-vote-telangana-dec-7-92706\">https:\/\/www.thenewsminute.com\/article\/democracy-stake-why-many-eligible-voters-might-not-vote-telangana-dec-7-92706<\/a>&gt;[\/footnote]\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nA challenge to the NERPAP Scheme and the use of software to automate voter deduplication was filed before the Telangana High Court, claiming, among other things, that the ECI deployed an \u201c<em>algorithm \u2026 which is neither transparent nor public, to carry out its statutory and constitutional duty of preparing and maintaining the voter rolls in India generally and Andhra Pradesh and Telangana in particular, which led to the 
deletion of almost 27 lakh voters in Telangana and 19 lakh voters in Andhra Pradesh in violation of the procedure established by law and declared by the Supreme Court of India.<\/em>\u201d[footnote]Srinivas Kodali v. Election Commission Of India, Through Secretary And Others, Telangana High Court, (PIL No. 374 \/ 2018)[\/footnote]\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nAs with the tax administration, the claims made before the High Court regarding the NERPAP automation of voter deduplication relate to the opacity of the software and logic employed, as well as the lack of due process followed when making a decision that disturbed the rights of affected persons.\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n<p style=\"padding-left: 40px;\"><strong>3. Fraud Analytics in Healthcare Administration<\/strong><\/p>\r\nIn 2018, the Government of India launched a national public health insurance scheme termed the Pradhan Mantri Jan Arogya Yojna (PMJAY), which, among other things, aims to provide health insurance coverage to poor households. 
Over the course of implementation of the scheme, the Government of India has entered into various partnerships with private firms for fraud detection and analysis of transactions and claims made through the scheme.[footnote]\u20185 Analytical Firms Look for Fraud in Ayushman Bharat PMJAY - Health News, Medibulletin\u2019 &lt;<a href=\"https:\/\/medibulletin.com\/5-analytical-firms-look-for-fraud-in-ayushman-bharat-pmjay\/\">https:\/\/medibulletin.com\/5-analytical-firms-look-for-fraud-in-ayushman-bharat-pmjay\/<\/a>&gt;.[\/footnote]\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nAccording to public documentation about the scheme released by the National Health Authority, a \u2018Fraud Analytics Control and Tracking System\u2019 (\u201cFACTS\u201d) has been implemented, which will ostensibly use Artificial Intelligence and Machine Learning in order to \u201c<em>identify suspect transactions &amp; entities. Using advanced tools such as Natural Language Processing and Optical Character Recognition and Image Analytics, unstructured data such as images, documents and clinical notes submitted are analysed to detect cases of potential fraud and abuse.<\/em>\u201d[footnote]Ayushman Bharat PM-JAY Annual Report, 2020-2021, National Health Authority, &lt;<a href=\"https:\/\/nha.gov.in\/img\/resources\/Annual-Report-2020-21.pdf\">https:\/\/nha.gov.in\/img\/resources\/Annual-Report-2020-21.pdf<\/a>&gt;.[\/footnote] As per guidelines for the scheme, a finding of prima facie fraud from the algorithm can trigger an investigation which can result in the rejection of an insurance claim as well as further disciplinary action against the identified entity.\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nAs with the above cases of using automation in administrative decisions, the algorithmic system utilised for identifying and making the 
initial decision about \u2018fraudulent claims\u2019 is not made public, nor is there information about the basis on which it operates, apart from the fact that it is based on Machine Learning techniques.\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nThe algorithmic technique that the FACTS system reportedly uses, known as Machine Learning, or ML, is based on analysing large datasets to find patterns in the data, and imposing that logic or pattern on future instances of data. As we will discuss in the next section, apart from the general concerns posed by automated decision-making, ML introduces distinct challenges for the purpose of reviewing the propriety of administrative action from the lens of administrative law.\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nIn the subsequent section, we explore how these legal-ethical considerations around fairness, accountability and transparency have emerged specifically in the context of public administration in India, and briefly review the jurisprudence and literature pertaining to algorithmic decision-making and public administrative law.\r\n\r\n<\/div>\r\n&nbsp;\r\n<div style=\"font-weight: 400;\">\r\n\r\n<strong>Public Administration in the Age of Automation\u00a0<\/strong>\r\n\r\nThe emerging centrality of information technologies, and automated decision-making systems, within public administration is as much a phenomenon that concerns organisational changes in government, and wider political and economic trends, as it is one of technological change.[footnote]Helen Margetts and Patrick Dunleavy, \u2018The Second Wave of Digital-Era Governance: A Quasi-Paradigm for Government on the Web\u2019 (2013) 371 Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 20120382.[\/footnote] Scholars of public administration 
have theorised how these technological transformations fundamentally alter the context within which policy choices are made and public administration takes place. In particular, scholars have noted how contemporary public administration around the world, including in India, has been characterised by \u2018New Public Management\u2019, or NPM, a \u2018market-based\u2019 model of governance emphasising efficiency, innovation and service-delivery, in turn encouraging deregulation, public-private partnerships, and technification of government administration.[footnote]Baru RV and Nundy M, \u2018Blurring of Boundaries: Public-Private Partnerships in Health Services in India\u2019 (2008) 43 Economic and Political Weekly 62.[\/footnote] As Margetts and Dunleavy note, principles of NPM laid the foundation for the contemporary technification and digitisation of public administration, leading to what they identify as \u2018Digital Era Governance\u2019, which places the use and integration of previously siloed government information systems at the very heart of public administration functions, driving transformations in the organisation and culture of public administration at large by influencing public sector values and changing the role of judgement and discretion which are at the heart of administrative decisions.[footnote]Margetts and Dunleavy (n 36).[\/footnote]\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nCuellar, similarly, argues that algorithmic systems are bringing about both complex and subtle organisational changes within the administrative state, with the increasing adoption of opaque data-modelling and data science techniques in administrative decision-making requiring specific trade-offs between optimising social welfare concerns and \u2018political pragmatism and procedural constraints\u2019, and restructuring administrative functions and organisation in the 
process.[footnote]Mariano-Florentino Cu\u00e9llar, \u2018Cyberdelegation and the Administrative State\u2019 in Nicholas R Parrillo (ed), <em>Administrative Law from the Inside Out: Essays on Themes in the Work of Jerry L. Mashaw<\/em> (Cambridge University Press 2017).[\/footnote]\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nThe emergence of these technologies as crucial elements in the administrative establishment of the state has attracted some degree of interest from courts and regulators, as well as within legal scholarship attempting to explain and account for the implications of algorithmic technologies for public administration and the citizen-state relationship. Before turning to the analysis of algorithmic decision-making in the context of Indian administrative law, it is useful to examine how this interaction has been analysed in some common law jurisdictions.\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nScholars of public law in the U.S. have written about the potential implications of computerisation and digitisation on administrative procedure since the early 1990s. Schwartz\u2019s germinal paper on data processing and government administration notes how bureaucracy in the U.S. was transforming into an \u2018information processing\u2019 system, and examines its implications for \u2018bureaucratic justice\u2019 \u2013 the accuracy, efficiency and dignity of the participant in an administrative process \u2013 particularly owing to the non-transparent nature of relying upon computer operations. 
Schwartz argues for building in both procedural safeguards through data protection regulation and an independent oversight mechanism for such decision-making within public administration.[footnote]Schwartz (n 8).[\/footnote]\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nDanielle Citron has also argued for revamping procedural rights in the US context. Citron argues that computerised decision-making nullifies distinctions between administrative rule-making and adjudication functions, without providing the adequate safeguards offered by administrative law for either function. Administrative decision-making usually assumes procedural safeguards such as notice and hearing mechanisms in the case of individualised adjudications, or notice-and-comment and, more generally, public transparency and participation mechanisms for rulemaking and delegated legislative functions. Citron argues, however, that contemporary algorithmic and data-driven systems combine rulemaking and adjudication functions in ways that obscure the specific procedural protections of either of these, resulting in a procedural void as far as administrative law and regulation are concerned. This is particularly true where \u2018data-based decisions\u2019 lead computer systems both to create new rules for processing individual cases and to apply those rules to particular cases.[footnote]Citron (n 5).[\/footnote]\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nDeirdre Mulligan and Kenneth Bamberger have also argued for the application of administrative law protections to administrative decision-making which involves algorithmic systems. 
Their analysis is particularly important for taking into account the organisational and institutional context of the modern administrative state, where software for administrative functions is often outsourced to private actors, thereby also outsourcing the policymaking functions that algorithmic systems displace. They argue that such \u2018policy-by-procurement\u2019 should be restructured to incorporate specific rules of administrative accountability, including public input and expert deliberation, into algorithmic processes.\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nThe legal implications of automated decision-making systems were also considered by the Australian Administrative Review Council (\u201cARC\u201d) as far back as 2004, when it provided specific guidance for administrative agencies to consider the legality of the use of automated decision systems, in line with administrative law values of \u2018lawfulness, fairness, rationality, openness (or transparency) and efficiency\u2019.[footnote]Administrative Review Council (Australia), <em>Automated Assistance in Administrative Decision Making: Report to the Attorney-General<\/em> (AGPS 2005).[\/footnote] The ARC guidance notes that administrative law principles governing the legality of administrative decisions, the use of discretion and natural justice are at stake in any consideration of whether to use or rely upon automated \u2018expert systems\u2019 which make decisions or aid in human decision-making.[footnote]<em>Id.<\/em>[\/footnote]\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nIn the United Kingdom, legal scholars have scrutinised specific executive actions in the administrative legal context, arguing for greater scrutiny through judicial review as well as re-framing administrative law principles in light of automated decision-making. 
Marion Oswald examines, in particular, the impact of machine learning and so-called \u2018predictive\u2019 tools in administrative decisions. She argues that automated decision-making changes the nature and meaning of duties of administrative agencies to \u2018give reasons\u2019 for decisions, as well as the standard of \u2018relevance\u2019 of fact and reasonableness of executive decision-making.[footnote]Marion Oswald, \u2018Algorithm-Assisted Decision-Making in the Public Sector: Framing the Issues Using Administrative Law Rules Governing Discretionary Power\u2019 (2018) 376 Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 20170359.[\/footnote] Similarly, Jennifer Cobbe analyses how the (largely uncodified) principles of English administrative law might be applied to a range of automated decision-making systems in particular contexts. Cobbe draws from data protection regulation and standards to argue that machine learning tools, in particular, might fall foul of certain principles, including the requirement to provide adequate legal justification for decisions, the duty of a delegate not to fetter the discretionary power granted to them, and the requirement to only consider relevant facts in administrative adjudications.[footnote]Jennifer Cobbe, \u2018Administrative Law and the Machines of Government: Judicial Review of Automated Public-Sector Decision-Making\u2019 (2019) 39 Legal Studies 636.[\/footnote]\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nCourts in most common law jurisdictions have not had much opportunity to consider the specific legal implications of administrative use of algorithmic systems.[footnote]Nb. 
Courts have had the opportunity to consider algorithmic systems implicated in challenges to administrative action, but few have commented on the specific implications of the use of automated systems and similar technology. Cf. Peter Whiteford, \u2018Debt by Design: The Anatomy of a Social Policy Fiasco \u2013 Or Was It Something Worse?\u2019 (2021) 80 Australian Journal of Public Administration 340.[\/footnote] A notable exception is the algorithmic system that was at issue in <em>State v Loomis<\/em> before the Wisconsin Supreme Court. Here, the use of an algorithmic risk assessment system known as COMPAS in sentencing decisions was challenged as being contrary to due process requirements. However, the court noted that the algorithmic system\u2019s outputs were not making individualised adjudications in a manner which was sufficient to attract the due process requirement under U.S. administrative law, and the court proceeded to allow its use, among other things, on the grounds that sentencing decisions were not <em>relying upon<\/em> the COMPAS system, but were merely considering it. The distinction between these two was not clearly articulated \u2013 an issue we will discuss later \u2013 and the decision has been criticised subsequently for failing to take into account due process requirements in algorithmic decision-making.[footnote]Katherine Freeman, \u2018Algorithmic Injustice: How the Wisconsin Supreme Court Failed to Protect Due Process Rights in State v. 
Loomis\u2019 18 33.[\/footnote]\r\n\r\n&nbsp;\r\n<div style=\"font-weight: 400;\">\r\n\r\n<strong>Administrative Law in India and Automated Decision-Making\u00a0<\/strong>\r\n\r\nAdministrative law in India is a largely uncodified field based substantially on principles of constitutional law and the bill of fundamental rights in Part III of the Constitution of India.[footnote]Sujit Choudhry, Madhav Khosla and Pratap Bhanu Mehta (eds), <em>The Oxford Handbook of the Indian Constitution<\/em> (Oxford University Press 2016); Raeesa Vakil, \u2018Constitutionalizing Administrative Law in the Indian Supreme Court: Natural Justice and Fundamental Rights\u2019 (2018) 16 International Journal of Constitutional Law 475.[\/footnote] Legal review of administrative action is based on a mix of reinterpreted English common law principles and analysis of constitutional principles under Article 14 of the Constitution, which establishes the right to equality, encompassing, among other things, the concept of reasonableness of administrative action.\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nBroadly, the grounds for legal review (and the permissible limits of administrative action) were laid down in the Supreme Court\u2019s judgement in <em>Tata Cellular v Union of India<\/em>,[footnote]1994 SCC (6) 651.[\/footnote] whereby the court noted that there are three broad grounds for challenging administrative action, namely: illegality of the action \u2013 exceeding or not giving effect to the statutory or legal provision from which a decision-maker derives power; irrationality or unreasonableness, which governs the exercise of discretionary power; and procedural impropriety, more broadly framed as rules of natural justice.\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nIn this section, we will examine how these 
rules of judicial review, or regulations and limitations on administrative action, might apply to the administrative use of automated decision-making systems outlined in the case studies above.\r\n\r\n<\/div>\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<strong>Rules of Discretion\u00a0\u00a0<\/strong>\r\n\r\nA central concern of administrative law and regulation is the control over discretionary executive action. Effective administration is only possible by providing a large degree of discretionary power to execute legislative policy, and in particular, as Upendra Baxi argues, \u2018<em>discretion is a tool for the individualisation of justice<\/em>\u2019 allowing for the operation of a socio-economic welfare state like India.[footnote]Upendra Baxi, \"<em>Development in Indian Administrative Law<\/em>\" in A.G. Noorani (ed.), Public Law India (1982).[\/footnote] Administrative law is therefore concerned with balancing the imperative of delegating discretionary power to the executive with concerns around its appropriate and rights-conforming use.\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<strong>Improper Delegation of Discretionary Power\u00a0\u00a0<\/strong>\r\n\r\nWhen examining delegated discretionary power, the courts must assess whether the delegation is legal. Judicial review of legislative action here examines whether the power conferred on the executive has been \u2018properly\u2019 delegated, namely, whether it falls within the constitutional bounds of legislative delegation, which are assessed primarily against the trinity of rights under Articles 14, 19 and 21. In a number of cases, the Supreme Court of India has held that statutes that are so vague as to provide no guidance to those enforcing the law, and no safeguard against its arbitrary exercise, must be struck down as discriminatory.[footnote]Shreya Singhal v. 
UOI, (2015) 5 SCC 1.[\/footnote]\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nIf we apply this standard of scrutiny to the language of the Taxation (Amendment) Act, 2020, it could be argued that the statute confers broad discretionary power to utilise automated tools for the allotment and examination of tax assessments. In defining the mechanism that should be utilised for such assessments, the statute provides no guidance on how or on what principles such assessment should take place, apart from through \u2018suitable technological tools including Artificial Intelligence and Machine Learning\u2019. As explained previously, the scope of these words, and the technologies that they incorporate, is incredibly broad and covers a wide range of activities, processes and technologies. Machine Learning, for example, may be utilised to incorporate any number of factors, relevant or not, on the basis of which a tax return could be allotted or examined. Delegating administrative power on the basis of the use of these particular technologies for automated assessment may, therefore, fall foul of the standard against statutory vagueness, since the statute provides no guidance for the executive on how such technologies may be utilised and what factors they might consider in coming to decisions that affect the rights of legal persons, nor does it provide procedural safeguards to ensure against arbitrary exercise of such power.[footnote]Indeed, the lack of a requirement to provide a personal hearing in the FAS has been challenged before multiple High Courts, as of the time of writing.[\/footnote] This example may usefully be extended to other areas where delegated power may be sought to be conferred through the use of technological tools for automated decision-making. 
Consider, for example, a requirement that executive authorities examine illegal speech on online platforms through the use of \u2018machine learning tools\u2019 or \u2018automated tools\u2019, without laying down the criteria on which such analysis must be based; such a requirement would also likely fall foul of the rule against vagueness.[footnote]This example is consciously borrowed from a similar rule incorporated in the IT Act (Intermediary Guidelines) Rules, 2021.[\/footnote]\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<strong>Improper Exercise of Delegated Power\u00a0<\/strong>\r\n\r\nThe exercise of delegated power must also conform to certain legal principles. When the law confers discretionary power on an administrative authority, the authority must ensure that (1) the discretion is not abandoned or fettered; and (2) the discretion is exercised \u2018properly\u2019.[footnote]I.P. Massey, <em>Administrative Law<\/em>, (10th Edition, Eastern Book Company, 2017)[\/footnote]\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nThe rule against fettering discretion implies that when discretion is conferred on an authority, the authority must itself exercise such discretion, and must not sub-delegate its powers (without legal authority), place the power to take a decision on another body, blindly follow the dictation of a third party, or follow a procedure in exercising discretion whereby it is unable to take into account the merits and circumstances of a particular case.[footnote]Indian Rly. Construction Co. Ltd. v. Ajay Kumar, (2003) 4 SCC 579.[\/footnote] The rule against fettering discretion is particularly relevant when considering how human agents and automated decision-making systems interact and the contexts in which administrative decisions are formally \u2018assisted\u2019 by automated systems. 
As indicated above, even the most complex algorithmic system is incapable of exercising its own discretion. Algorithmic systems are by definition bound by specific rules (although the rule-base of certain contemporary systems may constantly evolve or be incredibly vast).[footnote]Cormen and others (n 11).[\/footnote] As such, the wholesale exercise of an administrative power by an algorithmic system, or in other words, a situation in which an algorithmic system directly makes and effects an administrative decision, would be a clear violation of the rule against fettering discretion.\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nHowever, in most cases, there is (at least formally) a human agent making a \u2018final decision\u2019, usually \u2018assisted\u2019 by an automated system. Consider, for example, the case of the NERPAP algorithm. A simple calculation of the numbers involved in the voter removal exercise and its timeline makes it apparent that human decision-makers would not have been able to apply their discretion in any meaningful manner. It is more likely that they merely proceeded on the basis of the \u2018decision\u2019 that was provided to them by the software used ostensibly for deduplication, without application of their own discretion. 
This is commonly referred to as \u2018automation bias\u2019 in the literature studying the interaction between human agents and computer systems \u2013 namely, where, for multiple reasons, a decision-maker would choose to rely on an automated system instead of considering countervailing evidence or exercising their own discretion.[footnote]Ben Green and Yiling Chen, \u2018Disparate Interactions: An Algorithm-in-the-Loop Analysis of Fairness in Risk Assessments\u2019, <em>Proceedings of the Conference on Fairness, Accountability, and Transparency<\/em> (ACM 2019) &lt;<a href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3287560.3287563\">https:\/\/dl.acm.org\/doi\/10.1145\/3287560.3287563<\/a>&gt;.[\/footnote] Automation bias is merely one example of the ways in which complex algorithmic systems interact with human agents and oversight. However, it indicates that the exercise of discretion by administrative authorities is substantially challenged by the use of automated systems, and that the mere fact that the final decision is made by a human being should not preclude scrutiny of whether the decision was made without application of mind or in violation of the rule against fettering discretion.\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nThe proper exercise of discretionary power concerns the manner in which discretion is exercised, and the factors that any administrative decision must take into account. Needless to say, administrative action can always be reviewed on grounds of its unconstitutionality or violation of fundamental rights. However, for the purpose of this paper, we will examine the tenets of administrative law relating to the procedure, and not the effect, of administrative decision-making. 
The popular formulation of administrative propriety in decision-making under English common law is the Wednesbury test for reasonableness or \u2018irrationality\u2019 of decision-making, which has also been imported into jurisprudence in Indian High Courts and the Supreme Court.[footnote]<em>G.B. Mahajan v. Jalgaon Municipal Council<\/em>, [1991] 3 SCC 91[\/footnote] The standard of rationality applied in judicial review is that the decision must not be \u2018in outrageous defiance of logic or moral standards\u2019, must not take into account irrelevant or extraneous factors, and must not fail to take into account relevant facts.[footnote]Indian Railway Construction Co. Ltd. v. Ajay Kumar (2003 (4) SCC 579)[\/footnote]\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nThe standards of relevance and rationality of a decision are clearly implicated in the process of automated decision-making. The relevance of the material facts taken into consideration is particularly implicated in automated systems that incorporate large data sets in order to find patterns in the data and establish links between underlying data and a specified outcome. Consider the example of the FACTS fraud analytics system. Hypothetically, the algorithm by which the data analytics system decides whether a hospital or a beneficiary is \u2018fraudulent\u2019 may take into account a number of factors, including transactional information about health purchases, but also factors such as social media behaviour, consumer consumption data, etc.[footnote]The hypothetical is not too far from reality. Data from social media is widely used in algorithmic determinations of credit scores in India and elsewhere. 
See \u2018Not CIBIL, This Lender Uses Your Social Media Behaviour for Loan up to Rs 2 Lakh!\u2019 (<em>Financialexpress<\/em>) &lt;<a href=\"https:\/\/www.financialexpress.com\/money\/not-cibil-this-lender-uses-your-social-media-behaviour-for-loan-up-to-rs-2-lakh\/1761934\/\">https:\/\/www.financialexpress.com\/money\/not-cibil-this-lender-uses-your-social-media-behaviour-for-loan-up-to-rs-2-lakh\/1761934\/<\/a>&gt;.[\/footnote] The former is arguably relevant to a determination of fraud, while the latter is likely to have little to no bearing on whether a person commits fraud in this particular scheme. As such, if courts were to examine the facts on which such a system made decisions that were subsequently relied upon by administrative agencies, they may find that the decisions do not satisfy the doctrine of reasonableness. Similarly, algorithmic systems may incorporate logical rules which do not satisfy the reasonableness or rationality criterion. In particular, algorithmic systems which are based on drawing inferences between categories of information are intended to optimise particular functions without consideration of any underlying logic. In doing so, they can not only reproduce historically prejudiced action, but also confuse correlation with causation and establish rules of decision-making which are wholly illogical or arbitrary. For example, to revisit the FACTS system, the algorithm may establish a rule (based on available statistical information) that persons who suffer from particular disabilities are more likely to commit fraud. 
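To make the correlation-versus-causation concern concrete, the following deliberately naive sketch shows how a pattern-finding routine can seize on a statistically over-represented but causally irrelevant attribute. The data, attribute names and rule-learning method below are invented for illustration; nothing about the actual FACTS model or its inputs is publicly known.

```python
from collections import Counter

def learn_fraud_rule(history):
    """Naive 'pattern finder': return the single attribute-value pair most
    over-represented among past claims labelled fraudulent."""
    fraud_counts = Counter()
    total_counts = Counter()
    for claim in history:
        for attr, value in claim["features"].items():
            total_counts[(attr, value)] += 1
            if claim["fraud"]:
                fraud_counts[(attr, value)] += 1
    # Historical fraud rate per attribute-value pair.
    rates = {k: fraud_counts[k] / total_counts[k] for k in total_counts}
    return max(rates, key=rates.get)

# Synthetic history in which a medically irrelevant attribute happens to
# co-occur with past fraud labels (attribute names are hypothetical).
history = [
    {"features": {"hospital": "A", "disability": True}, "fraud": True},
    {"features": {"hospital": "B", "disability": True}, "fraud": True},
    {"features": {"hospital": "A", "disability": False}, "fraud": False},
    {"features": {"hospital": "B", "disability": False}, "fraud": False},
]
rule = learn_fraud_rule(history)
# rule == ("disability", True): the learned 'rule' keys on the disability
# flag, a pure correlation in the training data, not a cause of fraud.
```

A system built on this kind of inference would treat every future claimant with that attribute as suspect, which is precisely the arbitrariness that the rationality standard is meant to guard against.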
Where the law requires that particular facts be taken into account, or that irrelevant factors are not taken into account, or that the logic of decision-making should adhere to certain normative standards in administrative decisions, various kinds of formulations and \u2018data-based\u2019 analytics which are based on processing large volumes of diverse information may be implicated.\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n<strong>Rules of Adjudication and Principles of Natural Justice\u00a0<\/strong>\r\n\r\nA third, and particularly important, consideration in administrative decision-making is its procedural propriety. While adherence to specific procedural norms runs across administrative decision-making, it is particularly important when an authority is in a \u2018quasi-judicial\u2019 role, namely, where it must make a determination on facts and the application of standards or rules, which can prejudice the rights of an individual or a group.[footnote]Although the distinction between \u2018quasi-judicial\u2019 and administrative action is increasingly waning insofar as procedural propriety is concerned. See A.K. Kraipak v. Union of India, (1969) 2 SCC 262.[\/footnote]\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nWhere an administrative action prejudicially affects the rights of a person, the principles of natural justice are applicable to such a decision. Broadly, these principles may be classified as \u2013 (1) the rule against bias, and (2) the right to a fair hearing.[footnote]D.K. Yadav v. J.M.A.
Industries Ltd., (1993) 3 SCC 259.[\/footnote]\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nThe rule against bias requires that, where a fair adjudication of facts is required, the issue should not be prejudiced or pre-determined by biases that might arise in various contexts. Bias generally depends upon the individual circumstances of a case, including the decision-making body or institutional context and their pre-conceived notions. The standard for determining bias is whether a \u201c<em>reasonable man, in possession of relevant information, would have thought that bias was likely and whether the authority concerned was likely to be disposed to decide the matter in a particular way.<\/em>\u201d Therefore, the fact of bias does not need to be proven, and a reasonable likelihood of bias is a sufficient ground to challenge a decision.[footnote]Jiwan K. Lohia v. Durga Dutt Lohia, (1992) 1 SCC 56.[\/footnote] The rule against bias has generally operated where there is a personal or pecuniary interest of the decision-maker, but its broader formulation cautions against situations in which decisions cannot be taken objectively. As noted above, algorithmic systems exhibit bias and discrimination in many ways, which could systematically preclude an objective assessment in certain contexts. For example, a system that takes into account historical information may inherit historical biases on the basis of caste, class, gender or sexuality, or their proxies, which are then used as part of the decision-making matrix.
In such cases, the decisions relying upon such systems may not only be substantively discriminatory and violative of Articles 14, 15 or 16, but could also give rise to a reasonable likelihood of bias that violates the procedural norms of natural justice.\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nIt is unclear what a judicial analysis of the rule against bias in administrative decisions might look like in the context of algorithmic decision-making. While algorithmic systems have been shown to exhibit discriminatory \u2018biases\u2019 \u2013 bias in the training data, statistical biases of the model used, or bias in the choice of application \u2013 it might prove difficult for an affected party to challenge a decision on the basis that it violates the rule against bias, without sufficient material on which to make such a claim.[footnote]Cobbe (n 45).[\/footnote] The burden of proof to show that there is a \u2018real likelihood of bias\u2019 normally falls on the affected person or the person making the claim. However, under the present conditions of non-transparency about how decisions utilising algorithmic systems are made, it might prove challenging to sustain such a claim.\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n<em>The right to a fair hearing<\/em> encompasses a number of principles that ensure that a person suffering the consequences of an administrative adjudication has the ability to present their case and change the outcome of a decision.[footnote]Keshav Mills Co. Ltd. v. Union of India, (1973) 1 SCC 380.[\/footnote] This rule, often captured in the phrase <em>audi alteram partem<\/em>, or \u2018hear the other side\u2019, requires an administrative authority to satisfy a number of procedural conditions in coming to a decision.
Broadly, these include the requirement to provide notice that a hearing will take place, a right of the affected person to know the evidence used against them, including a right to inspect the evidence available before the authority, and the right to present evidence and cross-examine the evidence presented against them. In some cases, there is also a duty to provide reasons for coming to a particular decision (although there is no general duty to provide reasons), linking the materially relevant facts with the final decision.[footnote]Gurdial Singh Fijji v. State of Punjab, (1979) 2 SCC 368; Kranti Associates (P) Ltd. v. Masood Ahmed Khan, (2010) 9 SCC 496.[\/footnote] As per the Supreme Court, the rationale for providing reasons is linked to the transparency of the decision-making process for the affected persons as well as for the purpose of judicial or appellate review.\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nIt is apparent from the case studies discussed previously that the use of automated decision-making systems challenges many aspects of natural justice as laid down by the Supreme Court. In particular, challenges arise when decision-making processes are unable to provide sufficient justification or rationale for a decision, and are unable to consider any additional or extenuating evidence presented by parties to the decision. As we noted previously, the outputs of an algorithmic system are often inscrutable or opaque for a number of reasons, including the nature of the mathematical operations involved or the confidentiality of the algorithmic system or the data. This implies that the duty to provide reasons cannot always be suitably satisfied in cases where automated systems make or assist in making decisions. In each of the examples above, the algorithmic systems used have not been made transparent to affected persons in any meaningful way.
It is unclear what data is used in the system, or what logical process is followed by the algorithm in order to arrive at a conclusion. Similarly, the system itself is unable to consider additional evidence in its decision-making process. In the case of the FAS and the NERPAP, it has also been alleged in court proceedings that personal hearings were dispensed with, owing to reliance on the automated system for expediency, further implying that many of these decisions may fall foul of important conditions that the principles of natural justice require to be satisfied.\r\n\r\n&nbsp;\r\n<div style=\"font-weight: 400;\">\r\n\r\n<strong>Conclusion<\/strong>\r\n\r\nThis paper has argued that algorithmic systems \u2013 assemblages of computational and data-based tools \u2013 are being used in the context of public sector administrative decision-making in India in a manner that implicates important norms that regulate administrative conduct. These include norms that place limits on the delegation of power to the executive branch, as well as norms about how administrative power should be exercised in order to protect certain important constitutionally guaranteed rights, including non-discrimination and equality, as well as the concept of \u2018natural justice\u2019, also read into constitutional guarantees.\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nNew digital technologies, particularly computational and data-based systems, are likely to remain mainstays of government administration, offering improvements in administrative efficiency and certainty. In the process, digital technologies are also systematically changing the norms and values of the public sector. This raises an important question about the evolution of legal systems in conjunction with these changes in administrative procedure.
In particular, administrative law faces distinct challenges \u2013 how should the law balance the values which potentially conflict with the use of automated decision-making systems? Should bureaucratic efficiency be accorded greater leeway as against individualised adjudication and procedural justice? Should the scope of administrative discretion be expanded, as large-scale information systems allow for a greater role of the administrative state?\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\n&nbsp;\r\n\r\n<\/div>\r\n<div style=\"font-weight: 400;\">\r\n\r\nI have argued that the manner in which the Indian state is uncritically deploying and relying upon algorithmic systems in administration today requires us to urgently address these questions, particularly in asking whether this use comports with established legal norms and principles that guide and regulate administrative conduct. A bare assessment of a sample of algorithmic systems deployed indicates that they do not fulfil important criteria on the basis of which we judge the legality and constitutionality of administrative decision-making \u2013 they ignore established limits on the delegation of power, occlude protections on transparency and accountability about the manner in which administrative discretion is exercised, and override procedural protections which form the basis for the delivery of individualised justice in administrative proceedings. There is an urgent need for a legal response that understands the implications of these technologies.
Considering the largely uncodified basis of Indian administrative law and its roots in the Indian Constitution, it is likely that such a response would need to come from higher courts in India, which must re-assert the application of administrative legal principles in scrutinising administrative conduct guided by automated decision-making systems.
Such computational tools, which we will refer to as \u2018algorithmic systems\u2019 have implications not only for their use in particular domains (like healthcare or policing), but also in their systemic effects on the manner in which knowledge about individuals and societies is parsed and acted upon.<a class=\"footnote\" title=\"Tarleton Gillespie, \u2018The Relevance of Algorithms\u2019 in Tarleton Gillespie, Pablo J Boczkowski and Kirsten A Foot (eds), Media Technologies (The MIT Press 2014) &lt;http:\/\/mitpress.universitypressscholarship.com\/view\/10.7551\/mitpress\/9780262525374.001.0001\/upso-9780262525374-chapter-9&gt; accessed 29 July 2020.\" id=\"return-footnote-165-2\" href=\"#footnote-165-2\" aria-label=\"Footnote 2\"><sup class=\"footnote\">[2]<\/sup><\/a>\u00a0In this paper, I focus on automated decision-making in the public sector, a subset of algorithmic systems which are used within decision-making processes in public administration, either producing kinds of knowledge as outputs to be acted upon by human agents, or directly triggering particular actions as an outcome of an algorithmic process.<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>Algorithmic systems are assuming an increasingly prominent role in public administration in India. Decisions ranging from policy formulation and rule-making, to quasi-judicial functions of evaluating specific claims are now delegated, in varying degrees, to computer algorithms which function with some degree of autonomy and without requiring direct human involvement. 
Algorithmic systems have been used in bureaucratic processes in India since at least the 1980s, when \u2018rule-based\u2019 systems were piloted within tax and healthcare administration.<a class=\"footnote\" title=\"Patrick Saint-Dizier, \u2018The Knowledge-Based Computer System Development Program of India: A Review\u2019 (1991) 12 AI Magazine 33.\" id=\"return-footnote-165-3\" href=\"#footnote-165-3\" aria-label=\"Footnote 3\"><sup class=\"footnote\">[3]<\/sup><\/a> Contemporary administrative use of algorithmic systems includes the proliferation of \u2018machine learning\u2019 systems, which seek to create their own logics and patterns of understanding based on analysis of vast underlying datasets, in order to optimise for specific outcomes.<a class=\"footnote\" title=\"Michael Veale and Irina Brass, \u2018Administration by Algorithm?: Public Management Meets Public Sector Machine Learning\u2019, Algorithmic Regulation (Oxford University Press 2019) &lt;https:\/\/oxford.universitypressscholarship.com\/10.1093\/oso\/9780198838494.001.0001\/oso-9780198838494-chapter-6&gt;\" id=\"return-footnote-165-4\" href=\"#footnote-165-4\" aria-label=\"Footnote 4\"><sup class=\"footnote\">[4]<\/sup><\/a><\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>As the use of algorithmic systems in society has proliferated, there has been a substantial body of literature engaging with questions about information processing within algorithmic systems, and its legal consequences, particularly under public law. 
Scholars have examined how the move towards data-driven decision-making systems fundamentally impacts concepts of the rule of law and justice, which lie at the root of constitutional democracies.<a class=\"footnote\" title=\"Mireille Hildebrandt, Smart Technologies and the End(s) of Law: Novel Entanglements of Law and Technology (Paperback edition, EE Edward Elgar Publishing 2016).\" id=\"return-footnote-165-5\" href=\"#footnote-165-5\" aria-label=\"Footnote 5\"><sup class=\"footnote\">[5]<\/sup><\/a><a class=\"footnote\" title=\"Danielle Keats Citron, \u2018Technological Due Process\u2019 (2007\u20132008) 85 Washington University Law Review 1249.\" id=\"return-footnote-165-6\" href=\"#footnote-165-6\" aria-label=\"Footnote 6\"><sup class=\"footnote\">[6]<\/sup><\/a> Scholarship has also dwelled on the impact of algorithmic systems on privacy and data protection law, particularly on the aspect of privacy which preserves individual self-determination and selfhood.<a class=\"footnote\" title=\"Helen Nissenbaum, Privacy in Context: Technology, Policy, and the Integrity of Social Life (Stanford University Press 2009); Mireille Hildebrandt, \u2018Privacy as Protection of the Incomputable Self: From Agnostic to Agonistic Machine Learning\u2019 (2019) 20 Theoretical Inquiries in Law 83.\" id=\"return-footnote-165-7\" href=\"#footnote-165-7\" aria-label=\"Footnote 7\"><sup class=\"footnote\">[7]<\/sup><\/a> A related branch of studies has contended with algorithmic fairness, transparency and accountability, and their implications for legal systems concerned with, for example, the right to information, rights against discrimination and liability for wrongful conduct.<a class=\"footnote\" title=\"Solon Barocas and Andrew D Selbst, \u2018Big Data\u2019s Disparate Impact\u2019 (2016) 104 California Law Review 671.\" id=\"return-footnote-165-8\" href=\"#footnote-165-8\" aria-label=\"Footnote 8\"><sup class=\"footnote\">[8]<\/sup><\/a><\/p>\n<\/div>\n<div style=\"font-weight:
400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>Within this broader field of algorithmic studies, there is a specific body of literature which has engaged with the effects of algorithmic systems in government administrations. Early engagement with this subject examined the impact of rule-based expert systems within government and the rise of the \u2018data processing model of bureaucracy\u2019 on concepts of administrative law, including reasonableness and fairness in administrative decision-making and public participation in policy processes.<a class=\"footnote\" title=\"Paul Schwartz, \u2018Data Processing and Government Administration: The Failure of the American Legal Response to the Computer\u2019 (1991) 43 Hastings LJ 1321; Citron (n 5).\" id=\"return-footnote-165-9\" href=\"#footnote-165-9\" aria-label=\"Footnote 9\"><sup class=\"footnote\">[9]<\/sup><\/a> More recent engagement incorporates concerns relating to developments in big data analysis and machine learning systems as well as the increasing autonomy attributed to algorithmic decision-making systems, including its impact on administrative discretion and processes of adjudication.<a class=\"footnote\" title=\"Michael Veale, Max Van Kleek and Reuben Binns, \u2018Fairness and Accountability Design Needs for Algorithmic Support in High-Stakes Public Sector Decision-Making\u2019 [2018] Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems 1; Deirdre K Mulligan and Kenneth A Bamberger, \u2018Procurement as Policy: Administrative Process for Machine Learning\u2019 (2019) 34 Berkeley Technology Law Journal 773.\" id=\"return-footnote-165-10\" href=\"#footnote-165-10\" aria-label=\"Footnote 10\"><sup class=\"footnote\">[10]<\/sup><\/a><\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>Legal scholarship engaging with administrative law and algorithmic systems has mostly been within the United States 
(\u201cU.S.\u201d) and European contexts. In India, while there has been renewed and multi-disciplinary scholarly attention paid to information systems utilised within government administration, largely as a result of large-scale projects like Aadhaar,<a class=\"footnote\" title=\"Reetika Khera, \u2018Impact of Aadhaar in Welfare Programmes\u2019 (2017) SSRN Scholarly Paper ID 3045235 &lt;https:\/\/papers.ssrn.com\/abstract=3045235&gt;\" id=\"return-footnote-165-11\" href=\"#footnote-165-11\" aria-label=\"Footnote 11\"><sup class=\"footnote\">[11]<\/sup><\/a> legal scholarship as well as judicial and policy attention has approached administrative information processing activities primarily through the lens of informational privacy and data protection. While the lens of privacy and data protection law can and should inform the regulation of algorithmic systems, it is not sufficient to respond to the specific questions that these systems pose within the context of administration and bureaucracy.<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>The use of algorithmic systems for administrative decision-making is a subject which should concern legal and regulatory scholarship for two interrelated reasons. First, the use of algorithmic systems requires deliberating trade-offs between their presumed benefits (for example, reducing costs and increasing efficiency, or curtailing arbitrariness) and perceived harms (for example, increasing opacity and reducing accountability). These trade-offs must be deliberated within the context of specific legal frameworks, including constitutional rights, which place constraints on state action, and consequently, on the deployment of algorithmic systems. Second, algorithmic systems pose questions of normative and institutional change for administrative agencies which must be contended with.
Algorithmic systems substantially impact norms of administrative decision-making, ranging from the role of bureaucratic discretion in the application of statutory rules and standards, to the norms governing procedural fairness in formulating administrative policies and decisions \u2013 questions which are fundamental to administrative law.<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>There is a long history of administrative law jurisprudence in India, the goal of which is to ensure that administrative action adheres to constitutional principles \u2013 including rights against arbitrary state action, administrative and procedural fairness and equality before the law. This jurisprudence addresses aspects of administrative action from the delegation of legislative powers and administrative rule-making, to public involvement in policy processes, to administrative procurement processes and individual decision-making. Even as algorithmic systems fundamentally alter the characteristics of each of these forms of administrative action, there has been little consideration given to legal or regulatory responses to ensure adherence to recognised principles of administrative law.<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>This article seeks to explore how algorithmic systems are impacting the function and role of government administration in India, and what this implies for the areas of law which are concerned with the regulation and governance of administrative decision-making within government agencies \u2013 broadly categorised as administrative law.
This will also illuminate broader questions about the philosophy of information regulation in India, including how information collection and processing activities within algorithmic systems are mediating the citizen-state relationship.<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>This paper will locate the debates about normative and institutional change brought about by the use of algorithmic systems in the Indian administrative law context. This makes a valuable contribution to the existing literature on the subject for two reasons. First, it provides a framework to engage with administrative algorithmic decision-making within the contours of Indian law and jurisprudence. Second, understanding the effects of algorithmic systems on administrative systems in the particular context of India can inform literature on questions of algorithmic fairness, transparency, accountability and ethics more broadly.<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>Part I will review the literature around the operations of algorithmic systems and their implications for important public values. Part II of the paper will briefly outline the history and the political economy of the contemporary era of \u2018government-by-algorithm\u2019, and review jurisprudence and literature on its implications for the law of public administration. Part III will examine how public agencies in India are utilising automated or algorithmic systems for decision-making.
Part IV will examine the implications of automated decision-making on administrative legal principles under Indian law.<\/p>\n<p>&nbsp;<\/p>\n<p><strong>Fairness, Accountability and Transparency in Algorithmic Decision-Making\u00a0<\/strong><\/p>\n<p>Public administration today is increasingly characterised by the use of computational and digital systems to integrate and analyse information or data through algorithmic logics. In particular, there is a rise in the use of so-called \u2018Artificial Intelligence\u2019 (\u201cAI\u201d) and \u2018Big Data\u2019 technologies, propelled by the use of Machine Learning (\u201cML\u201d) systems, which utilise statistical methods to draw inferences from large sets of data, or optimise mathematical functions in order to make \u2018predictions\u2019 for future instances of data.
This section briefly examines how algorithmic systems impact upon the values of fairness, transparency and accountability, which are also normative values upheld by administrative law and regulation, as well as values that (nominally) motivate government administration at large.<\/p>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>The term \u2018algorithm\u2019 describes a series of steps through which particular inputs can be turned into outputs.<a class=\"footnote\" title=\"Thomas H Cormen and others, Introduction to Algorithms (MIT press 2009).\" id=\"return-footnote-165-12\" href=\"#footnote-165-12\" aria-label=\"Footnote 12\"><sup class=\"footnote\">[12]<\/sup><\/a> An algorithmic system is a system which uses one or more algorithms, usually as part of computational software, to produce outputs which may be used for making decisions. Algorithmic systems are characterised not only by the underlying technologies used to compute information, but equally by the social, cultural, legal and institutional contexts where algorithms are embedded, which are crucial determinants of how these systems are used and governed.<a class=\"footnote\" title=\"Tarleton Gillespie, \u20182. Algorithm\u2019, 2.
Algorithm (Princeton University Press 2016) &lt;https:\/\/www.degruyter.com\/document\/doi\/10.1515\/9781400880553-004\/html&gt; accessed 26 November 2021.\" id=\"return-footnote-165-13\" href=\"#footnote-165-13\" aria-label=\"Footnote 13\"><sup class=\"footnote\">[13]<\/sup><\/a> These algorithmic systems, and their implications for public administration and legal and constitutional rights, are the socio-technical systems that this paper focusses on.<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>The proliferation of these systems in a number of socially consequential areas, such as policing, education, finance and healthcare, both within and external to government, has spurred substantial debates on their implications for important public values, centred largely around values of transparency, fairness and accountability of these systems. This framing, while not exhaustive of the range of implications posed by the widespread use of automated decision-making systems and algorithmic technologies, emphasises how algorithmic decision-making systems challenge important assumptions and expectations about consequential decision-making concerning people: the transparency of how a decision is made, the \u2018fairness\u2019 of such a decision, and who should be accountable for these decisions.<a class=\"footnote\" title=\"Rob Kitchin, \u2018Thinking Critically about and Researching Algorithms\u2019 (2017) 20 Information, Communication &amp; Society 14.\" id=\"return-footnote-165-14\" href=\"#footnote-165-14\" aria-label=\"Footnote 14\"><sup class=\"footnote\">[14]<\/sup><\/a> Each of these concepts is highly contested, highly context-specific, and escapes universal definition, yet together they broadly describe the anxieties that algorithmic decision-making has given rise to in various contexts, which are relevant for our study.<\/p>\n<\/div>\n<div style=\"font-weight:
\n<p>&nbsp;">
400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>Transparency, in the context of algorithmic decision-making, may broadly be described as \u201ca system of observing and knowing that promises a form of control\u201d.<a class=\"footnote\" title=\"Mike Ananny and Kate Crawford, \u2018Seeing without Knowing: Limitations of the Transparency Ideal and Its Application to Algorithmic Accountability\u2019 (2018) 20 New Media &amp; Society 973.\" id=\"return-footnote-165-15\" href=\"#footnote-165-15\" aria-label=\"Footnote 15\"><sup class=\"footnote\">[15]<\/sup><\/a> Transparency is instrumental in understanding and demanding accountability about a decision. Algorithmic decision-making gives rise to challenges of transparency owing both to the intrinsic technological inscrutability of some novel forms of algorithmic systems \u2013 such as complex machine learning systems utilising data with a high number of characteristics,<a class=\"footnote\" title=\"Jenna Burrell, \u2018How the Machine \u201cThinks\u201d: Understanding Opacity in Machine Learning Algorithms\u2019 (2016) 3 Big Data &amp; Society 205395171562251.\" id=\"return-footnote-165-16\" href=\"#footnote-165-16\" aria-label=\"Footnote 16\"><sup class=\"footnote\">[16]<\/sup><\/a> or which compute data in a manner unintelligible to the audience demanding transparency.<a class=\"footnote\" title=\"Jakko Kemper and Daan Kolkman, \u2018Transparent to Whom?
No Algorithmic Accountability without a Critical Audience\u2019 (2019) 22 Information, Communication &amp; Society 2081.\" id=\"return-footnote-165-17\" href=\"#footnote-165-17\" aria-label=\"Footnote 17\"><sup class=\"footnote\">[17]<\/sup><\/a> However, transparency is also a function of how these systems are integrated into and engage with existing social, institutional or organisational contexts.<a class=\"footnote\" title=\"Frank Pasquale, The Black Box Society (Harvard University Press 2015).\" id=\"return-footnote-165-18\" href=\"#footnote-165-18\" aria-label=\"Footnote 18\"><sup class=\"footnote\">[18]<\/sup><\/a> For example, a major factor inhibiting the transparency of algorithmic systems used in the public sector is the reluctance of governments or private contractors to reveal the details of trade-sensitive information.<a class=\"footnote\" title=\"ibid.\" id=\"return-footnote-165-19\" href=\"#footnote-165-19\" aria-label=\"Footnote 19\"><sup class=\"footnote\">[19]<\/sup><\/a><\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>Fairness, in the context of algorithmic decision-making, is implicated both in the manner in which decisions are made and in their effects on particular individuals or groups, concerning both the intrinsic quality of the decision-making process and the broader distributive implications of the decisions made.<a class=\"footnote\" title=\"Solon Barocas, Moritz Hardt and Arvind Narayanan, \u2018Fairness and Machine Learning\u2019 253, (fairmlbook.org).\" id=\"return-footnote-165-20\" href=\"#footnote-165-20\" aria-label=\"Footnote 20\"><sup class=\"footnote\">[20]<\/sup><\/a> Several studies of algorithmic systems used in different social contexts have shown how the impacts of these systems are distributed in ways that are considered \u2018unfair\u2019 \u2013 indicating statistical bias based on particular characteristics like class, race or caste
(which are often characteristics legally protected against discrimination).<a class=\"footnote\" title=\"Barocas and Selbst (n 7).\" id=\"return-footnote-165-21\" href=\"#footnote-165-21\" aria-label=\"Footnote 21\"><sup class=\"footnote\">[21]<\/sup><\/a> Bias or discrimination can arise owing to a number of elements in the decision-making process, including (1) the kinds of historical data that a Machine Learning algorithm might take into account, which may include protected characteristics; (2) how the data is processed and whether the processing itself produces (statistically) biased or arbitrary results; or (3) whether the context in which a decision-making system is used is consistently biased towards a particular group.<a class=\"footnote\" title=\"Barocas, Hardt and Narayanan (n 19).\" id=\"return-footnote-165-22\" href=\"#footnote-165-22\" aria-label=\"Footnote 22\"><sup class=\"footnote\">[22]<\/sup><\/a> Owing to the scale at which algorithmic systems are typically deployed, implicit or explicit biases in algorithmic decision-making can lead to systematic discrimination at socially consequential scales.<a class=\"footnote\" title=\"ibid.\" id=\"return-footnote-165-23\" href=\"#footnote-165-23\" aria-label=\"Footnote 23\"><sup class=\"footnote\">[23]<\/sup><\/a><\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>Accountability in the context of algorithmic decision-making refers to the ability of various actors involved in the production of a decision through an algorithmic system to be held to account for such decisions, including \u201cthe obligation to explain and justify their use, design, and\/or decisions of\/concerning the system and the subsequent effects of that conduct.\u201d<a class=\"footnote\" title=\"Maranke Wieringa, \u2018What to Account for When Accounting for Algorithms: A Systematic Literature Review on Algorithmic Accountability\u2019, Proceedings of the 2020 Conference on 
Fairness, Accountability, and Transparency (ACM 2020) &lt;http:\/\/dl.acm.org\/doi\/10.1145\/3351095.3372833&gt; accessed 29 July 2020.\" id=\"return-footnote-165-24\" href=\"#footnote-165-24\" aria-label=\"Footnote 24\"><sup class=\"footnote\">[24]<\/sup><\/a> Algorithmic systems within governments are often complex assemblages of data, computational techniques and varying institutional or organisational contexts, involving different actors responsible for different elements of the system (for example, the developer of the software, the agency responsible for procuring the system, and the agency responsible for using it).<a class=\"footnote\" title=\"European Parliament. Directorate General for Parliamentary Research Services., A Governance Framework for Algorithmic Accountability and Transparency. (Publications Office 2019) &lt;https:\/\/data.europa.eu\/doi\/10.2861\/59990&gt;\" id=\"return-footnote-165-25\" href=\"#footnote-165-25\" aria-label=\"Footnote 25\"><sup class=\"footnote\">[25]<\/sup><\/a> This complexity makes it difficult to attribute responsibility for the ultimate decision taken through the use or aid of an algorithmic system to a single actor or organisation, in many cases undermining effective accountability.<a class=\"footnote\" title=\"Madeleine Clare Elish, \u2018Moral Crumple Zones: Cautionary Tales in Human-Robot Interaction\u2019 (2019) 5 Engaging Science, Technology, and Society 40.\" id=\"return-footnote-165-26\" href=\"#footnote-165-26\" aria-label=\"Footnote 26\"><sup class=\"footnote\">[26]<\/sup><\/a><\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>The admittedly broad values of fairness, accountability and transparency offer but one frame of analysis for the consequences of algorithmic systems on public values. 
Algorithmic systems also portend structural effects on, for example, democratic participation and human agency, and their impacts may usefully be analysed through a number of normative lenses or frameworks. However, this framing is particularly useful in the context of the aims of this chapter \u2013 to highlight the impact of algorithmic systems on public administration and the values, norms and laws that guide or govern public administration.<\/p>\n<p>&nbsp;<\/p>\n<div style=\"font-weight: 400;\">\n<p><strong>Algorithmic Administrative Decision-Making in India\u00a0<\/strong><\/p>\n<p>The use of algorithmic systems and logics for decision-making is hardly a novel phenomenon. Information systems have long played a part in public administration, even within jurisdictions like India which have seen relatively delayed adoption of computers and digital technologies. Historically, digital systems were implemented to automate routine and clerical tasks of administration.<a class=\"footnote\" title=\"Saint-Dizier (n 2).\" id=\"return-footnote-165-27\" href=\"#footnote-165-27\" aria-label=\"Footnote 27\"><sup class=\"footnote\">[27]<\/sup><\/a> Although there is some evidence of the use of more complex systems, such as knowledge-based expert systems (an early form of \u2018artificial intelligence\u2019 which relied on programmed syntactic rules to aid in tasks like legal interpretation and analysis), it is only in the past two decades that the implementation of digital systems within public administration has emerged as a transformative phenomenon in India. 
Despite the highly fragmented nature of digital technology use in India, governments at both the Central and the State level have been eagerly adopting these technologies to augment and supplant their decision-making capabilities.<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>In this part, we use three case studies to examine how algorithmic technologies intersect with administrative decision-making processes at different stages, and explore their implications for the administrative law principles discussed previously.<\/p>\n<p>&nbsp;<\/p>\n<\/div>\n<p style=\"padding-left: 40px;\"><strong>1. Tax Assessment and Case Allocation under the Income Tax Act<\/strong><\/p>\n<p>In 2019, the Government of India introduced a scheme to replace the manual assessment of income tax returns selected for additional scrutiny with an automated system known as the Faceless Assessment Scheme (\u201cFAS\u201d). In 2020, the Indian parliament amended certain provisions of the Income Tax Act (\u201cTax Amendment Act\u201d)<a class=\"footnote\" title=\"The Taxation And Other Laws (Relaxation And Amendment Of Certain Provisions) Act, 2020.\" id=\"return-footnote-165-28\" href=\"#footnote-165-28\" aria-label=\"Footnote 28\"><sup class=\"footnote\">[28]<\/sup><\/a> to incorporate the FAS, which, inter alia, includes provisions for an \u2018automated allocation tool\u2019 and an \u2018automated examination tool\u2019, defined as algorithmic systems for the randomised allocation of cases and the standardised examination of draft orders, respectively.<\/p>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>As per the Tax Amendment Act,<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>\u201c\u2018Automated allocation tool\u2019 means an algorithm for randomised allocation of cases, by using suitable technological tools, 
including artificial intelligence and machine learning, with a view to optimise the use of resources.\u201d<a class=\"footnote\" title=\"S.4 (XXIV), The Taxation And Other Laws (Relaxation And Amendment Of Certain Provisions) Act, 2020.\" id=\"return-footnote-165-29\" href=\"#footnote-165-29\" aria-label=\"Footnote 29\"><sup class=\"footnote\">[29]<\/sup><\/a>; and<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>\u201c\u2018Automated examination tool\u2019 means an algorithm for standardised examination of draft orders, by using suitable technological tools, including artificial intelligence and machine learning, with a view to reduce the scope of discretion.\u201d<a class=\"footnote\" title=\"S.4 (XXIV), The Taxation And Other Laws (Relaxation And Amendment Of Certain Provisions) Act, 2020.\" id=\"return-footnote-165-30\" href=\"#footnote-165-30\" aria-label=\"Footnote 30\"><sup class=\"footnote\">[30]<\/sup><\/a><\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>Under these provisions of the Tax Amendment Act, decisions about the \u2018randomised allocation\u2019 of tax assessments, as well as the examination of draft assessment orders, are to be automated through suitable technological tools, including \u201cartificial intelligence and machine learning\u201d, in order to optimise resources and reduce discretion, respectively (echoing the standard justifications for automating administrative decisions which we noted in the previous section).<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>Automation enters the tax assessment system at two points. 
The Automated Allocation algorithm is used by the tax authorities to identify specific cases for tax assessment, and to allocate the scrutiny of tax returns to a specific regional assessment centre, ostensibly to reduce bias and increase transparency in the selection and allotment of cases for further scrutiny. After the initial assessment, a draft assessment order is prepared by the authority, which is then analysed using the Automated Examination Tool, and the taxpayer is intimated of the final assessment.<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>The details of the algorithms used, the statistical techniques applied or the data on which the Machine Learning system is supposed to work have not been made publicly available, and the considerations that an algorithmic system for allocation or examination must take into account are not specified in the primary legislation (the Income Tax Act) or in the rules made by the tax administrative authority (the Central Board for Direct Taxes).<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>In addition to automating and augmenting the manual allocation and assessment of draft orders, the FAS also resulted in assessments being conducted without providing a hearing to affected persons. Consequently, a number of challenges were raised before various High Courts arguing that assessments were finalised without granting the right to a personal hearing before the adjudicating officers.<a class=\"footnote\" title=\"Chander Arjandas Manwani, Bombay High Court, (Writ Petition no. 3195 of 2021) order dated 21st September 2021; RMSI Private Ltd. v. 
National E-Assessment Centre, Delhi High Court, W.P.(C) 6482\/2021, order dated 14\/07\/2021.\" id=\"return-footnote-165-31\" href=\"#footnote-165-31\" aria-label=\"Footnote 31\"><sup class=\"footnote\">[31]<\/sup><\/a><\/p>\n<p>&nbsp;<\/p>\n<p style=\"padding-left: 40px;\"><strong>2. Voter Roll \u2018Deduplication\u2019 by the Electoral Commission of India<\/strong><\/p>\n<p>Recent exercises undertaken by the Electoral Commission of India (\u201cECI\u201d) to \u2018clean\u2019 voter rolls through digital deduplication algorithms are another important example of algorithmic decision-making disturbing individual rights in novel ways.<\/p>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>In 2015, the ECI launched the National Electoral Roll Purification and Authentication Programme (\u201cNERPAP\u201d) with the objective of \u201cbringing a totally error-free and authenticated electoral roll\u201d, by linking electoral databases with India\u2019s national biometric resident database \u2013 UID or Aadhaar. The \u2018linking\u2019 of databases was implemented through a computer software programme used to algorithmically \u2018deduplicate\u2019 \u2013 i.e., remove multiple copies of the same data from a database \u2013 voter lists, ostensibly in order to ensure that there is no voter fraud due to the possession of multiple voter ID cards. 
This was achieved by comparing Aadhaar data \u2013 deemed to be a unique reference \u2013 with the demographic details of individuals enrolled on voter lists. If the Aadhaar data mapped to more than one voter record, the record would be deemed a \u2018duplicate\u2019 and removed from the voter rolls.<a class=\"footnote\" title=\"\u2018Linking of Electoral Data with Aadhaar: All You Need to Know\u2019 The Times of India (21 December 2021) &lt;https:\/\/timesofindia.indiatimes.com\/business\/india-business\/linking-of-electoral-data-with-aadhaar-all-you-need-to-know\/articleshow\/88408171.cms&gt;.\" id=\"return-footnote-165-32\" href=\"#footnote-165-32\" aria-label=\"Footnote 32\"><sup class=\"footnote\">[32]<\/sup><\/a><\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>The NERPAP process was trialled across a number of jurisdictions, most prominently perhaps in Telangana, where 30,00,000 people were reportedly removed from the voter rolls without following the established procedure, thereby preventing them from participating in the state elections.<a class=\"footnote\" title=\"\u2018Democracy at Stake: Why Many Eligible Voters Might Not Vote in Telangana on Dec 7 | The News Minute\u2019 &lt;https:\/\/www.thenewsminute.com\/article\/democracy-stake-why-many-eligible-voters-might-not-vote-telangana-dec-7-92706&gt;\" id=\"return-footnote-165-33\" href=\"#footnote-165-33\" aria-label=\"Footnote 33\"><sup class=\"footnote\">[33]<\/sup><\/a><\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>A challenge to the NERPAP Scheme and the use of software to automate voter deduplication was filed before the Telangana High Court, claiming, among other things, that the ECI deployed an \u201c<em>algorithm \u2026 which is neither transparent nor public, to carry out its statutory and constitutional duty of preparing and maintaining the voter rolls in 
India generally and Andhra Pradesh and Telangana in particular, which led to the deletion of almost 27 lakh voters in Telangana and 19 lakh voters in Andhra Pradesh in violation of the procedure established by law and declared by the Supreme Court of India.<\/em>\u201d<a class=\"footnote\" title=\"Srinivas Kodali v. Election Commission Of India, Through Secretary And Others, Telangana High Court, (PIL No. 374 \/ 2018)\" id=\"return-footnote-165-34\" href=\"#footnote-165-34\" aria-label=\"Footnote 34\"><sup class=\"footnote\">[34]<\/sup><\/a><\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>As with the tax administration, the claims made before the High Court regarding the NERPAP automation of voter deduplication relate to the opacity of the software and logic employed, as well as the lack of due process followed when making a decision that disturbed the rights of affected persons.<\/p>\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p style=\"padding-left: 40px;\"><strong>3. Fraud Analytics in Healthcare Administration<\/strong><\/p>\n<p>In 2018, the Government of India launched a national public health insurance scheme termed the Pradhan Mantri Jan Arogya Yojana (PMJAY), which, among other things, aims to provide health insurance coverage to poor households. 
Over the course of implementation of the scheme, the Government of India has entered into various partnerships with private firms for fraud detection and analysis of transactions and claims made through the scheme.<a class=\"footnote\" title=\"\u20185 Analytical Firms Look for Fraud in Ayushman Bharat PMJAY - Health News, Medibulletin\u2019 &lt;https:\/\/medibulletin.com\/5-analytical-firms-look-for-fraud-in-ayushman-bharat-pmjay\/&gt;.\" id=\"return-footnote-165-35\" href=\"#footnote-165-35\" aria-label=\"Footnote 35\"><sup class=\"footnote\">[35]<\/sup><\/a><\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>According to public documentation about the scheme released by the National Health Authority, a \u2018Fraud Analytics Control and Tracking System\u2019 (\u201cFACTS\u201d) has been implemented, which will ostensibly use Artificial Intelligence and Machine Learning in order to \u201c<em>identify suspect transactions &amp; entities. 
Using advanced tools such as Natural Language Processing and Optical Character Recognition and Image Analytics, unstructured data such as images, documents and clinical notes submitted are analysed to detect cases of potential fraud and abuse.<\/em>\u201d<a class=\"footnote\" title=\"Ayushman Bharat PM-JAY Annual Report, 2020-2021, National Health Authority, &lt;https:\/\/nha.gov.in\/img\/resources\/Annual-Report-2020-21.pdf&gt;.\" id=\"return-footnote-165-36\" href=\"#footnote-165-36\" aria-label=\"Footnote 36\"><sup class=\"footnote\">[36]<\/sup><\/a> As per guidelines for the scheme, a finding of prima facie fraud by the algorithm can trigger an investigation which can result in the rejection of an insurance claim as well as further disciplinary action against the identified entity.<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>As with the above cases of using automation in administrative decisions, the algorithmic system utilised for identifying and making the initial decision about \u2018fraudulent claims\u2019 is not made public, nor is there information about the basis on which it operates, apart from the fact that it is based on Machine Learning techniques.<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>The algorithmic technique that the FACTS system reportedly uses, Machine Learning (ML), is based on analysing large datasets to find patterns in the data and imposing that logic or pattern on future instances of data. 
As we will discuss in the next section, apart from the general concerns posed by automated decision-making, ML introduces distinct challenges for reviewing the propriety of administrative action through the lens of administrative law.<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>In the subsequent section, we explore how these legal-ethical considerations around fairness, accountability and transparency have emerged specifically in the context of public administration in India, and briefly review the jurisprudence and literature pertaining to algorithmic decision-making and public administrative law.<\/p>\n<\/div>\n<p>&nbsp;<\/p>\n<div style=\"font-weight: 400;\">\n<p><strong>Public Administration in the Age of Automation\u00a0<\/strong><\/p>\n<p>The emerging centrality of information technologies, and automated decision-making systems, within public administration is as much a phenomenon of organisational change in government, and wider political and economic trends, as of technological change.<a class=\"footnote\" title=\"Helen Margetts and Patrick Dunleavy, \u2018The Second Wave of Digital-Era Governance: A Quasi-Paradigm for Government on the Web\u2019 (2013) 371 Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 20120382.\" id=\"return-footnote-165-37\" href=\"#footnote-165-37\" aria-label=\"Footnote 37\"><sup class=\"footnote\">[37]<\/sup><\/a> Scholars of public administration have theorised how these technological transformations fundamentally alter the context within which policy choices are made and within which public administration takes place. 
In particular, scholars have noted how contemporary public administration around the world, including in India, has been characterised by \u2018New Public Management\u2019, or NPM, a \u2018market-based\u2019 model of governance emphasising efficiency, innovation and service-delivery, in turn encouraging deregulation, public-private partnerships, and the technification of government administration.<a class=\"footnote\" title=\"Baru RV and Nundy M, \u2018Blurring of Boundaries: Public-Private Partnerships in Health Services in India\u2019 (2008) 43 Economic and Political Weekly 62.\" id=\"return-footnote-165-38\" href=\"#footnote-165-38\" aria-label=\"Footnote 38\"><sup class=\"footnote\">[38]<\/sup><\/a> As Margetts and Dunleavy note, principles of NPM laid the foundation for the contemporary technification and digitisation of public administration, leading to what they identify as \u2018Digital Era Governance\u2019. This places the use and integration of previously siloed government information systems at the very heart of public administration functions, driving transformations in the organisation and culture of public administration at large by influencing public sector values and changing the role of judgement and discretion which lie at the core of administrative decisions.<a class=\"footnote\" title=\"Margetts and Dunleavy (n 36).\" id=\"return-footnote-165-39\" href=\"#footnote-165-39\" aria-label=\"Footnote 39\"><sup class=\"footnote\">[39]<\/sup><\/a><\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>Cuellar, similarly, argues that algorithmic systems are bringing about both complex and subtle organisational changes within the administrative state, with the increasing adoption of opaque data-modelling and data science techniques in administrative decision-making requiring specific trade-offs between optimising social welfare concerns and \u2018political pragmatism and procedural constraints\u2019, 
and restructuring administrative functions and organisation in the process.<a class=\"footnote\" title=\"Mariano-Florentino Cu\u00e9llar, \u2018Cyberdelegation and the Administrative State\u2019 in Nicholas R Parrillo (ed), Administrative Law from the Inside Out: Essays on Themes in the Work of Jerry L. Mashaw (Cambridge University Press 2017).\" id=\"return-footnote-165-40\" href=\"#footnote-165-40\" aria-label=\"Footnote 40\"><sup class=\"footnote\">[40]<\/sup><\/a><\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>The emergence of these technologies as crucial elements in the administrative establishment of the state has attracted some degree of interest from courts and regulators, as well as within legal scholarship attempting to explain and account for the implications of algorithmic technologies for public administration and the citizen-state relationship. Before turning to the analysis of algorithmic decision-making in the context of Indian administrative law, it is useful to examine how this interaction has been analysed in some common law jurisdictions.<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>Scholars of public law in the U.S. have written about the potential implications of computerisation and digitisation on administrative procedure since the early 1990s. Schwartz\u2019s germinal paper on data processing and government administration notes how bureaucracy in the U.S. was transforming into an \u2018information processing\u2019 system, and examines its implications for \u2018bureaucratic justice\u2019 \u2013 the accuracy, efficiency and dignity afforded to participants in an administrative process \u2013 particularly owing to the non-transparent nature of computer operations. 
Schwartz argues for building in procedural safeguards through data protection regulation, as well as an independent oversight mechanism for such decision-making within public administration.<a class=\"footnote\" title=\"Schwartz (n 8).\" id=\"return-footnote-165-41\" href=\"#footnote-165-41\" aria-label=\"Footnote 41\"><sup class=\"footnote\">[41]<\/sup><\/a><\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>Danielle Citron has also argued for revamping procedural rights in the US context. Citron argues that computerised decision-making nullifies distinctions between administrative rule-making and adjudication functions, without providing the adequate safeguards offered by administrative law for either function. Administrative decision-making usually assumes procedural safeguards such as notice and hearing mechanisms in the case of individualised adjudications, or notice-and-comment, and more generally public transparency and participation mechanisms, for rulemaking and delegated legislative functions. Citron argues, however, that contemporary algorithmic and data-driven systems combine rulemaking and adjudication functions in ways that obscure the specific procedural protections of either, resulting in a procedural void as far as administrative law and regulation are concerned. 
This is particularly true in cases where \u2018data-based decisions\u2019 lead to the creation by computer systems of new rules by which to process individual cases, as well as the application of these rules to particular cases.<a class=\"footnote\" title=\"Citron (n 5).\" id=\"return-footnote-165-42\" href=\"#footnote-165-42\" aria-label=\"Footnote 42\"><sup class=\"footnote\">[42]<\/sup><\/a><\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>Deirdre Mulligan and Kenneth Bamberger have also argued for the application of administrative law protections to administrative decision-making which involves algorithmic systems. Their analysis is particularly important for taking into account the organisational and institutional context of the modern administrative state, where software for administrative functions is often outsourced to private actors, thereby also outsourcing the policymaking functions that algorithmic systems displace. 
They argue that such \u2018policy-by-procurement\u2019 should be restructured to incorporate specific rules of administrative accountability, including public input and expert deliberation, into algorithmic processes.<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>The legal implications of automated decision-making systems were also considered by the Australian Administrative Review Council (\u201cARC\u201d) as far back as 2004, when it provided specific guidance for administrative agencies to consider the legality of the use of automated decision systems, in line with administrative law values of \u2018lawfulness, fairness, rationality, openness (or transparency) and efficiency\u2019.<a class=\"footnote\" title=\"Administrative Review Council (Australia), Automated Assistance in Administrative Decision Making: Report to the Attorney-General (AGPS 2005).\" id=\"return-footnote-165-43\" href=\"#footnote-165-43\" aria-label=\"Footnote 43\"><sup class=\"footnote\">[43]<\/sup><\/a> The ARC guidance notes that administrative law principles governing the legality of administrative decisions, the use of discretion and natural justice are at stake when deciding whether to use or rely upon automated \u2018expert systems\u2019 which make decisions or aid in human decision-making.<a class=\"footnote\" title=\"Id.\" id=\"return-footnote-165-44\" href=\"#footnote-165-44\" aria-label=\"Footnote 44\"><sup class=\"footnote\">[44]<\/sup><\/a><\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>In the United Kingdom, legal scholars have scrutinised specific executive actions in the administrative legal context, arguing for greater scrutiny through judicial review as well as re-framing administrative law principles in light of automated decision-making. 
Marion Oswald examines, in particular, the impact of machine learning and so-called \u2018predictive\u2019 tools in administrative decisions. She argues that automated decision-making changes the nature and meaning of duties of administrative agencies to \u2018give reasons\u2019 for decisions, as well as the standard of \u2018relevance\u2019 of fact and reasonableness of executive decision-making.<a class=\"footnote\" title=\"Marion Oswald, \u2018Algorithm-Assisted Decision-Making in the Public Sector: Framing the Issues Using Administrative Law Rules Governing Discretionary Power\u2019 (2018) 376 Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 20170359.\" id=\"return-footnote-165-45\" href=\"#footnote-165-45\" aria-label=\"Footnote 45\"><sup class=\"footnote\">[45]<\/sup><\/a> Similarly, Jennifer Cobbe analyses how the (largely uncodified) principles of English administrative law might be applied to a range of automated decision-making systems in particular contexts. 
Cobbe draws from data protection regulation and standards to argue that machine learning tools, in particular, might fall foul of certain principles, including the duty to provide adequate legal justifications for certain decisions, the duty of a delegate not to fetter the discretionary power granted to them, and the requirement to consider only relevant facts in administrative adjudications.<a class=\"footnote\" title=\"Jennifer Cobbe, \u2018Administrative Law and the Machines of Government: Judicial Review of Automated Public-Sector Decision-Making\u2019 (2019) 39 Legal Studies 636.\" id=\"return-footnote-165-46\" href=\"#footnote-165-46\" aria-label=\"Footnote 46\"><sup class=\"footnote\">[46]<\/sup><\/a><\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>Courts in most common law jurisdictions have not had much opportunity to consider the specific legal implications of administrative use of algorithmic systems.<a class=\"footnote\" title=\"Nb. Courts have had the opportunity to consider algorithmic systems implicated in challenges to administrative action, but few have specifically commented on the implications of the use of automated systems and similar technology. Cf. Peter Whiteford, \u2018Debt by Design: The Anatomy of a Social Policy Fiasco \u2013 Or Was It Something Worse?\u2019 (2021) 80 Australian Journal of Public Administration 340.\" id=\"return-footnote-165-47\" href=\"#footnote-165-47\" aria-label=\"Footnote 47\"><sup class=\"footnote\">[47]<\/sup><\/a> A notable exception is the algorithmic system at issue in <em>State v Loomis<\/em>, before the Wisconsin Supreme Court. Here, the use of an algorithmic risk assessment system known as COMPAS for making bail decisions was challenged as being contrary to due process requirements. 
However, the court noted that the algorithmic system\u2019s outputs were not making individualised adjudications in a manner sufficient to attract the due process requirement under U.S. administrative law, and proceeded to allow its use on the grounds, among other things, that bail decisions were not <em>relying upon<\/em> the COMPAS system, but were merely considering it. The distinction between these two was not clearly articulated \u2013 an issue we will discuss later \u2013 and the decision has subsequently been criticised for failing to take into account due process requirements in algorithmic decision-making.<a class=\"footnote\" title=\"Katherine Freeman, \u2018Algorithmic Injustice: How the Wisconsin Supreme Court Failed to Protect Due Process Rights in State v. Loomis\u2019 18 33.\" id=\"return-footnote-165-48\" href=\"#footnote-165-48\" aria-label=\"Footnote 48\"><sup class=\"footnote\">[48]<\/sup><\/a><\/p>\n<p>&nbsp;<\/p>\n<div style=\"font-weight: 400;\">\n<p><strong>Administrative Law in India and Automated Decision-Making\u00a0<\/strong><\/p>\n<p>Administrative law in India is a largely uncodified field, based primarily on principles of constitutional law and the bill of fundamental rights in Part III of the Constitution of India.<a class=\"footnote\" title=\"Sujit Choudhry, Madhav Khosla and Pratap Bhanu Mehta (eds), The Oxford Handbook of the Indian Constitution (Oxford University Press 2016); Raeesa Vakil, \u2018Constitutionalizing Administrative Law in the Indian Supreme Court: Natural Justice and Fundamental Rights\u2019 (2018) 16 International Journal of Constitutional Law 475.\" id=\"return-footnote-165-49\" href=\"#footnote-165-49\" aria-label=\"Footnote 49\"><sup class=\"footnote\">[49]<\/sup><\/a> Legal review of administrative action is based on a mix of reinterpreted English common law principles and analysis of constitutional principles under Article 14 of the Constitution, which establishes the
right to equality, encompassing, among other things, the concept of reasonableness of administrative action.<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>Broadly, the grounds for legal review (and the permissible limits of administrative action) were laid down in the Supreme Court\u2019s judgement in <em>Tata Cellular v Union of India<\/em>,<a class=\"footnote\" title=\"1994 SCC (6) 651.\" id=\"return-footnote-165-50\" href=\"#footnote-165-50\" aria-label=\"Footnote 50\"><sup class=\"footnote\">[50]<\/sup><\/a> whereby the court noted that there are three broad grounds for challenging administrative action, namely: illegality of the action \u2013 exceeding, or failing to give effect to, the statutory or legal provision from which a decision-maker derives power; irrationality or unreasonableness, which governs the exercise of discretionary power; and procedural impropriety, more broadly framed as the rules of natural justice.<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>In this section, we will examine how these rules of judicial review, or regulations and limitations on administrative action, might apply in the context of the administrative use of automated decision-making systems outlined in the case studies above.<\/p>\n<\/div>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<p><strong>Rules of Discretion\u00a0\u00a0<\/strong><\/p>\n<p>A central concern of administrative law and regulation is the control over discretionary executive action.
Effective administration is possible only by providing a large degree of discretionary power to execute legislative policy; in particular, as Upendra Baxi argues, \u2018<em>discretion is a tool for the individualisation of justice<\/em>\u2019, allowing for the operation of a socio-economic welfare state like India.<a class=\"footnote\" title=\"Upendra Baxi, &quot;Development in Indian Administrative Law&quot; in A.G. Noorani (ed.), Public Law India (1982).\" id=\"return-footnote-165-51\" href=\"#footnote-165-51\" aria-label=\"Footnote 51\"><sup class=\"footnote\">[51]<\/sup><\/a> Administrative law is therefore concerned with balancing the imperative of delegating discretionary power to the executive with concerns around its appropriate and rights-conforming use.<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<p><strong>Improper Delegation of Discretionary Power\u00a0\u00a0<\/strong><\/p>\n<p>When examining delegated discretionary power, the courts must assess whether the delegation is legal. Judicial review of legislative action here examines whether the power conferred on the executive has been \u2018properly\u2019 delegated, namely, whether it falls within the constitutional bounds of legislative delegation, which are assessed primarily against the trinity of rights under Articles 14, 19 and 21. In a number of cases, the Supreme Court of India has held that statutes that are so vague as to provide no guidance to those enforcing the law, and thereby fail to prevent its arbitrary exercise, must be struck down as discriminatory.<a class=\"footnote\" title=\"Shreya Singhal v.
UOI, (2015) 5 SCC 1.\" id=\"return-footnote-165-52\" href=\"#footnote-165-52\" aria-label=\"Footnote 52\"><sup class=\"footnote\">[52]<\/sup><\/a><\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>If we apply this standard of scrutiny to the Taxation (Amendment) Act, 2020, it could be argued that the language of the statute confers broad discretionary power to utilise automated tools for the allotment and examination of tax assessments. In defining the mechanism that should be utilised for such assessments, the statute provides no guidance on how or on what principles such assessment should take place, apart from through \u2018suitable technological tools including Artificial Intelligence and Machine Learning\u2019. As explained previously, the scope of these words, and of the technologies that they incorporate, is incredibly broad, encompassing a wide range of activities, processes and technologies. Machine Learning, for example, may be utilised to incorporate any number of factors, relevant or not, on the basis of which a tax return could be allotted or examined.
Delegating administrative power on the basis of the use of these particular technologies for automated assessment may therefore fall foul of the standard against vagueness, since the statute provides no guidance for the executive on how such technologies may be utilised and what factors they might consider in coming to decisions that affect the rights of legal persons, nor does it provide procedural safeguards to ensure against the arbitrary exercise of such power.<a class=\"footnote\" title=\"Indeed, the lack of a requirement to provide a personal hearing in the FAS has been challenged before multiple High Courts, as of the time of writing.\" id=\"return-footnote-165-53\" href=\"#footnote-165-53\" aria-label=\"Footnote 53\"><sup class=\"footnote\">[53]<\/sup><\/a> This example may usefully be extended to other areas where delegated power may be sought to be conferred through the use of technological tools for automated decision-making. A requirement, for example, that executive authorities examine illegal speech on online platforms through the use of \u2018machine learning tools\u2019 or \u2018automated tools\u2019, without laying down the criteria on which such analysis must be based, would also likely fall foul of the rule against vagueness.<a class=\"footnote\" title=\"This example is consciously borrowed from a similar rule incorporated in the IT Act (Intermediary Guidelines) Rules, 2021.\" id=\"return-footnote-165-54\" href=\"#footnote-165-54\" aria-label=\"Footnote 54\"><sup class=\"footnote\">[54]<\/sup><\/a><\/p>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<p><strong>Improper Exercise of Delegated Power\u00a0<\/strong><\/p>\n<p>The exercise of delegated power must also conform to certain legal principles.
When the law confers discretionary power on an administrative authority, the authority must ensure that (1) the discretion is not abandoned or fettered; and (2) the discretion is exercised \u2018properly\u2019.<a class=\"footnote\" title=\"I.P. Massey, Administrative Law, (10th Edition, Eastern Book Company, 2017)\" id=\"return-footnote-165-55\" href=\"#footnote-165-55\" aria-label=\"Footnote 55\"><sup class=\"footnote\">[55]<\/sup><\/a><\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>The rule against fettering discretion implies that when discretion is conferred on an authority, the authority must itself exercise such discretion, and must not sub-delegate its powers (without legal authority), place the power to take a decision on another body, blindly follow the dictation of a third party, or follow a procedure in exercising discretion whereby it is unable to take into account the merits and circumstances of a particular case.<a class=\"footnote\" title=\"Indian Rly. Construction Co. Ltd. v. Ajay Kumar, (2003) 4 SCC 579.\" id=\"return-footnote-165-56\" href=\"#footnote-165-56\" aria-label=\"Footnote 56\"><sup class=\"footnote\">[56]<\/sup><\/a> The rule against fettering discretion is particularly relevant when considering how human agents and automated decision-making systems interact, and the contexts in which administrative decisions are formally \u2018assisted\u2019 by automated systems. As indicated above, even the most complex algorithmic system is incapable of utilising its own discretion.
Algorithmic systems are by definition bound by specific rules (although the rule-base of certain contemporary systems may constantly evolve or be incredibly vast).<a class=\"footnote\" title=\"Cormen and others (n 11).\" id=\"return-footnote-165-57\" href=\"#footnote-165-57\" aria-label=\"Footnote 57\"><sup class=\"footnote\">[57]<\/sup><\/a> As such, the wholesale exercise of an administrative power by an algorithmic system \u2013 in other words, where an algorithmic system directly makes and effects an administrative decision \u2013 would be a clear violation of the rule against fettering discretion.<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>However, in most cases, there is (at least formally) a human agent making a \u2018final decision\u2019, usually \u2018assisted\u2019 by an automated system. Consider, for example, the case of the NERPAP algorithm. From a simple calculation of the numbers involved in the voter removal exercise and the timeline, it is apparent that human decision-makers would not have been able to apply their discretion in any meaningful manner. It is more likely that they merely proceeded on the basis of the \u2018decision\u2019 that was provided to them by the software used ostensibly for deduplication, without application of their own discretion.
This is commonly referred to as \u2018automation bias\u2019 in the literature studying the interaction between human agents and computer systems \u2013 namely, where, for multiple reasons, a decision-maker would choose to rely on an automated system instead of considering countervailing evidence or exercising their own discretion.<a class=\"footnote\" title=\"Ben Green and Yiling Chen, \u2018Disparate Interactions: An Algorithm-in-the-Loop Analysis of Fairness in Risk Assessments\u2019, Proceedings of the Conference on Fairness, Accountability, and Transparency (ACM 2019) &lt;https:\/\/dl.acm.org\/doi\/10.1145\/3287560.3287563&gt;.\" id=\"return-footnote-165-58\" href=\"#footnote-165-58\" aria-label=\"Footnote 58\"><sup class=\"footnote\">[58]<\/sup><\/a> Automation bias is merely one example of the ways in which complex algorithmic systems interact with human agents and oversight. However, it indicates that the exercise of discretion by administrative authorities is substantially challenged by the use of automated systems, and that the mere fact that the final decision is made by a human being should not shield it from scrutiny as to whether it was made without application of mind or in violation of the rule against fettering discretion.<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>The proper exercise of discretionary power concerns the manner in which discretion is exercised, and the factors that any administrative decision must take into account. Needless to say, administrative action can always be reviewed on grounds of its unconstitutionality or violation of fundamental rights. However, for the purpose of this paper, we will examine the tenets of administrative law relating to the procedure, and not the effect, of administrative decision-making.
The popular formulation of administrative propriety in decision-making under English common law is the Wednesbury test for reasonableness or \u2018irrationality\u2019 of decision-making, which has also been imported into the jurisprudence of Indian High Courts and the Supreme Court.<a class=\"footnote\" title=\"G.B. Mahajan v. Jalgaon Municipal Council, [1991] 3 SCC 91\" id=\"return-footnote-165-59\" href=\"#footnote-165-59\" aria-label=\"Footnote 59\"><sup class=\"footnote\">[59]<\/sup><\/a> The standard of rationality applied in judicial review asks whether the decision is \u2018in outrageous defiance of logic or moral standards\u2019, takes into account irrelevant or extraneous factors, or fails to take into account relevant facts.<a class=\"footnote\" title=\"Indian Railway Construction Co. Ltd. v. Ajay Kumar (2003 (4) SCC 579)\" id=\"return-footnote-165-60\" href=\"#footnote-165-60\" aria-label=\"Footnote 60\"><sup class=\"footnote\">[60]<\/sup><\/a><\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>The standards of relevance and rationality of a decision are clearly implicated in the process of automated decision-making. The relevance of the material facts taken into consideration is implicated particularly in automated systems that incorporate large data sets in order to find patterns and establish links between underlying data and a specified outcome. Consider the example of the FACTS fraud analytics system. Hypothetically, the algorithm by which the data analytics system decides whether a hospital or a beneficiary is \u2018fraudulent\u2019 may take into account a number of factors, including transactional information about health purchases, but also factors such as social media behaviour, consumer consumption data, etc.<a class=\"footnote\" title=\"The hypothetical is not too far from reality.
Data from social media is widely used in algorithmic determinations of credit scores in India and elsewhere. See \u2018Not CIBIL, This Lender Uses Your Social Media Behaviour for Loan up to Rs 2 Lakh!\u2019 (Financialexpress) &lt;https:\/\/www.financialexpress.com\/money\/not-cibil-this-lender-uses-your-social-media-behaviour-for-loan-up-to-rs-2-lakh\/1761934\/&gt;.\" id=\"return-footnote-165-61\" href=\"#footnote-165-61\" aria-label=\"Footnote 61\"><sup class=\"footnote\">[61]<\/sup><\/a> The former is arguably relevant to a determination of fraud, while the latter is likely to have little to no bearing on whether a person commits fraud in this particular scheme. As such, if courts were to examine the facts on which such a system made decisions that were subsequently relied upon by administrative agencies, they may find that the decisions do not satisfy the doctrine of reasonableness. Similarly, algorithmic systems may incorporate logical rules which do not satisfy the reasonableness or rationality criterion. In particular, algorithmic systems which are based on drawing inferences between categories of information are intended to optimise particular functions without consideration of any underlying logic. In doing so, they can not only reproduce historically prejudiced action, but also confuse correlation with causation and establish rules of decision-making which are wholly illogical or arbitrary. For example, to revisit once again the FACTS system, the algorithm may establish a rule (based on available statistical information) that persons who suffer from particular disabilities are more likely to commit fraud.
Where the law requires that particular facts be taken into account, that irrelevant factors not be taken into account, or that the logic of decision-making adhere to certain normative standards in administrative decisions, various kinds of \u2018data-based\u2019 analytics that process large volumes of diverse information may be implicated.<\/p>\n<p>&nbsp;<\/p>\n<\/div>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p><strong>Rules of Adjudication and Principles of Natural Justice\u00a0<\/strong><\/p>\n<p>A third, and particularly important, consideration in administrative decision-making is its procedural propriety. While adherence to specific procedural norms applies across administrative decision-making, it is particularly important when an authority is in a \u2018quasi-judicial\u2019 role, namely, where it must make a determination on facts and the application of standards or rules, which can prejudice the rights of an individual or a group.<a class=\"footnote\" title=\"Although the distinction between a \u2018quasi-judicial\u2019 and administrative action is increasingly waning inasmuch as procedural propriety is concerned. See A.K Kraipak v. Union of India 1969 2 SCC 262\" id=\"return-footnote-165-62\" href=\"#footnote-165-62\" aria-label=\"Footnote 62\"><sup class=\"footnote\">[62]<\/sup><\/a><\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>Where an administrative action prejudicially affects the rights of a person, the principles of natural justice are applicable to such a decision. Broadly, these principles may be classified as (1) the rule against bias, and (2) the right to a fair hearing.<a class=\"footnote\" title=\"D.K. Yadav vs J.M.A.
Industries Ltd, 1993 SCC (3) 259.\" id=\"return-footnote-165-63\" href=\"#footnote-165-63\" aria-label=\"Footnote 63\"><sup class=\"footnote\">[63]<\/sup><\/a><\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>The rule against bias requires that, where a fair adjudication of facts is required, the issue should not be prejudiced or pre-determined by biases that might arise in various contexts. Bias generally depends upon the individual circumstances of a case, including the decision-making body or institutional context and their pre-conceived notions. The standard for determining bias is whether a \u201c<em>reasonable man, in possession of relevant information, would have thought that bias was likely and whether the authority concerned was likely to be disposed to decide the matter in a particular way.<\/em>\u201d Therefore, the fact of bias does not need to be proven, and the reasonable likelihood of bias is sufficient grounds to challenge a decision.<a class=\"footnote\" title=\"Jiwan K. Lohia v. Durga Dutt Lohia, (1992) 1 SCC 56.\" id=\"return-footnote-165-64\" href=\"#footnote-165-64\" aria-label=\"Footnote 64\"><sup class=\"footnote\">[64]<\/sup><\/a> The rule against bias has generally operated where there is a personal or pecuniary interest of the decision-maker, but its broader formulation cautions against situations in which decisions cannot be taken objectively. As noted above, algorithmic systems exhibit bias and discrimination in many ways, which could systematically preclude an objective assessment in certain contexts. For example, a system that takes into account historical information may inherit historical biases on the basis of caste, class, gender or sexuality, or their proxies, which are then used as part of the decision-making matrix.
In such cases, the decisions relying upon such systems may not only be substantively discriminatory and violative of Articles 14, 15 or 16, but could also give rise to a reasonable likelihood of bias that violates the procedural norms of natural justice.<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>It is unclear what a judicial analysis of the rule against bias in administrative decisions might look like in the context of algorithmic decision-making. While algorithmic systems have been shown to exhibit discriminatory \u2018biases\u2019 &#8211; bias in the training data, statistical biases of the model used, or bias in the choice of application &#8211; it might prove difficult for an affected party to challenge a decision on the basis that it violates the rule against bias, without sufficient material on which to make such a claim.<a class=\"footnote\" title=\"Cobbe (n 45).\" id=\"return-footnote-165-65\" href=\"#footnote-165-65\" aria-label=\"Footnote 65\"><sup class=\"footnote\">[65]<\/sup><\/a> The burden of proof to show that there is a \u2018real likelihood of bias\u2019 normally falls on the affected person or the person making the claim. However, under the present conditions of non-transparency about how decisions utilising algorithmic systems are made, it might prove challenging to sustain such a claim.<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p><em>The right to a fair hearing<\/em> encompasses a number of principles that ensure that a person suffering the consequences of an administrative adjudication has the ability to present their case and change the outcome of a decision.<a class=\"footnote\" title=\"Keshav Mills Co. Ltd. v.
Union of India, (1973) 1 SCC 380.\" id=\"return-footnote-165-66\" href=\"#footnote-165-66\" aria-label=\"Footnote 66\"><sup class=\"footnote\">[66]<\/sup><\/a> This rule, often captured in the phrase <em>audi alteram partem<\/em>, or \u2018hear the other side\u2019, requires an administrative authority to satisfy a number of procedural conditions in coming to a decision. Broadly, these include the requirement to provide notice that a hearing will take place, a right of the affected person to know the evidence used against them, including a right to inspect the evidence available before the authority, and the right to present evidence and cross-examine the evidence presented against them. In some cases, there is also a duty to provide reasons for coming to a particular decision (although there is no general duty to provide reasons), linking the materially relevant facts with the final decision.<a class=\"footnote\" title=\"Gurdial Singh Fiji v. State of Punjab, (1979) 2 SCC 368; Kranti Associates (P) Ltd. v. Masood Ahmed Khan, (2010) 9 SCC 496.\" id=\"return-footnote-165-67\" href=\"#footnote-165-67\" aria-label=\"Footnote 67\"><sup class=\"footnote\">[67]<\/sup><\/a> As per the Supreme Court, the rationale for providing reasons is linked to the transparency of the decision-making process for the affected persons, as well as to the purpose of judicial or appellate review.<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>It is apparent from the case studies discussed previously that the use of automated decision-making systems challenges many aspects of natural justice as laid down by the Supreme Court. In particular, challenges arise when decision-making processes are unable to provide sufficient justification or rationale for a decision, and are unable to consider any additional or extenuating evidence presented by parties to the decision.
As we noted previously, the outputs of an algorithmic system are often inscrutable or opaque for a number of reasons, including the nature of the mathematical operations involved, or the confidentiality of the algorithmic system or the data. This implies that the duty to provide reasons cannot always be suitably satisfied in cases where automated systems make or assist in making decisions. In each of the examples above, the algorithmic systems used have not been made transparent to affected persons in any meaningful way. It is unclear what data is used in the system, or what logical process is followed by the algorithm in order to arrive at a conclusion. Similarly, the system itself is unable to consider additional evidence in its decision-making process. In the case of the FAS and the NERPAP, it has also been alleged in court proceedings that personal hearings were dispensed with, owing to reliance on the automated system for expediency, further implying that many of these decisions may fall foul of important conditions that the principles of natural justice require to be satisfied.<\/p>\n<p>&nbsp;<\/p>\n<div style=\"font-weight: 400;\">\n<p><strong>Conclusion<\/strong><\/p>\n<p>This paper has argued that algorithmic systems \u2013 assemblages of computational and data-based tools \u2013 are being used in the context of public sector administrative decision-making in India in a manner that implicates important norms that regulate administrative conduct.
These include norms that place limits on the delegation of power to the executive branch, as well as norms about how administrative power should be exercised in order to protect certain important constitutionally guaranteed rights, including non-discrimination and equality, and the concept of \u2018natural justice\u2019, also read into constitutional guarantees.<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>New digital technologies, particularly computational and data-based systems, are likely to remain mainstays of government administration, offering improvements in administrative efficiency and certainty. In the process, digital technologies are also systematically changing the norms and values of the public sector. This raises an important question about the evolution of legal systems in conjunction with these changes in administrative procedure. In particular, administrative law faces distinct challenges \u2013 how should the law balance the values which potentially conflict with the use of automated decision-making systems? Should bureaucratic efficiency be provided greater leeway as against individualised adjudication and procedural justice? Should the scope of administrative discretion be expanded, as large-scale information systems allow for a greater role of the administrative state?<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>I have argued that the manner in which the Indian state is uncritically deploying and relying upon algorithmic systems in administration today requires us to urgently address these questions, particularly in asking whether this use comports with established legal norms and principles that guide and regulate administrative conduct.
A bare assessment of a sample of the algorithmic systems deployed indicates that they do not fulfil important criteria on the basis of which we judge the legality and constitutionality of administrative decision-making \u2013 they ignore established limits on the delegation of power, occlude protections on transparency and accountability about the manner in which administrative discretion is exercised, and override procedural protections which form the basis for the delivery of individualised justice in administrative proceedings. There is an urgent need for a legal response that understands the implications of these technologies. Considering the largely uncodified basis of Indian administrative law and its roots in the Indian constitution, it is likely that such a response would need to come from higher courts in India, which must re-assert the application of administrative legal principles in scrutinising administrative conduct which is guided by automated decision-making systems.<\/p>\n<\/div>\n<div style=\"font-weight: 400;\">\n<p>&nbsp;<\/p>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<hr class=\"before-footnotes clear\" \/><div class=\"footnotes\"><ol><li id=\"footnote-165-1\">PhD Candidate, Faculty of Laws, University College London. The author would like to thank Kruthika R. for her inputs and discussions which are invaluable to this paper. <a href=\"#return-footnote-165-1\" class=\"return-footnote\" aria-label=\"Return to footnote 1\">&crarr;<\/a><\/li><li id=\"footnote-165-2\">Tarleton Gillespie, \u2018The Relevance of Algorithms\u2019 in Tarleton Gillespie, Pablo J Boczkowski and Kirsten A Foot (eds), Media Technologies (The MIT Press 2014) &lt;<a href=\"http:\/\/mitpress.universitypressscholarship.com\/view\/10.7551\/mitpress\/9780262525374.001.0001\/upso-9780262525374-chapter-9\">http:\/\/mitpress.universitypressscholarship.com\/view\/10.7551\/mitpress\/9780262525374.001.0001\/upso-9780262525374-chapter-9<\/a>&gt; accessed 29 July 2020.
<a href=\"#return-footnote-165-2\" class=\"return-footnote\" aria-label=\"Return to footnote 2\">&crarr;<\/a><\/li><li id=\"footnote-165-3\">Patrick Saint-Dizier, \u2018The Knowledge-Based Computer System Development Program of India: A Review\u2019 (1991) 12 AI Magazine 33. <a href=\"#return-footnote-165-3\" class=\"return-footnote\" aria-label=\"Return to footnote 3\">&crarr;<\/a><\/li><li id=\"footnote-165-4\">Michael Veale and Irina Brass, \u2018Administration by Algorithm?: Public Management Meets Public Sector Machine Learning\u2019, <em>Algorithmic Regulation<\/em> (Oxford University Press 2019) &lt;<a href=\"https:\/\/oxford.universitypressscholarship.com\/10.1093\/oso\/9780198838494.001.0001\/oso-9780198838494-chapter-6\">https:\/\/oxford.universitypressscholarship.com\/10.1093\/oso\/9780198838494.001.0001\/oso-9780198838494-chapter-6<\/a>&gt; <a href=\"#return-footnote-165-4\" class=\"return-footnote\" aria-label=\"Return to footnote 4\">&crarr;<\/a><\/li><li id=\"footnote-165-5\">Mireille Hildebrandt, <em>Smart Technologies and the End(s) of Law: Novel Entanglements of Law and Technology<\/em> (Paperback edition, EE Edward Elgar Publishing 2016). <a href=\"#return-footnote-165-5\" class=\"return-footnote\" aria-label=\"Return to footnote 5\">&crarr;<\/a><\/li><li id=\"footnote-165-6\">Danielle Keats Citron, \u2018Technological Due Process\u2019 (2007\u20132008) 85 Washington University Law Review 1249. <a href=\"#return-footnote-165-6\" class=\"return-footnote\" aria-label=\"Return to footnote 6\">&crarr;<\/a><\/li><li id=\"footnote-165-7\">Helen Nissenbaum, <em>Privacy in Context: Technology, Policy, and the Integrity of Social Life<\/em> (Stanford University Press 2009); Mireille Hildebrandt, \u2018Privacy as Protection of the Incomputable Self: From Agnostic to Agonistic Machine Learning\u2019 (2019) 20 Theoretical Inquiries in Law 83. 
<a href=\"#return-footnote-165-7\" class=\"return-footnote\" aria-label=\"Return to footnote 7\">&crarr;<\/a><\/li><li id=\"footnote-165-8\">Solon Barocas and Andrew D Selbst, \u2018Big Data\u2019s Disparate Impact\u2019 (2016) 104 California Law Review 671. <a href=\"#return-footnote-165-8\" class=\"return-footnote\" aria-label=\"Return to footnote 8\">&crarr;<\/a><\/li><li id=\"footnote-165-9\">Paul Schwartz, \u2018Data Processing and Government Administration: The Failure of the American Legal Response to the Computer\u2019 (1991) 43 Hastings LJ 1321; Citron (n 5). <a href=\"#return-footnote-165-9\" class=\"return-footnote\" aria-label=\"Return to footnote 9\">&crarr;<\/a><\/li><li id=\"footnote-165-10\">Michael Veale, Max Van Kleek and Reuben Binns, \u2018Fairness and Accountability Design Needs for Algorithmic Support in High-Stakes Public Sector Decision-Making\u2019 [2018] Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems 1; Deirdre K Mulligan and Kenneth A Bamberger, \u2018Procurement as Policy: Administrative Process for Machine Learning\u2019 (2019) 34 Berkeley Technology Law Journal 773. <a href=\"#return-footnote-165-10\" class=\"return-footnote\" aria-label=\"Return to footnote 10\">&crarr;<\/a><\/li><li id=\"footnote-165-11\">Reetika Khera, \u2018Impact of Aadhaar in Welfare Programmes\u2019 (2017) SSRN Scholarly Paper ID 3045235 &lt;<a href=\"https:\/\/papers.ssrn.com\/abstract=3045235\">https:\/\/papers.ssrn.com\/abstract=3045235<\/a>&gt; <a href=\"#return-footnote-165-11\" class=\"return-footnote\" aria-label=\"Return to footnote 11\">&crarr;<\/a><\/li><li id=\"footnote-165-12\">Thomas H Cormen and others, <em>Introduction to Algorithms<\/em> (MIT press 2009). <a href=\"#return-footnote-165-12\" class=\"return-footnote\" aria-label=\"Return to footnote 12\">&crarr;<\/a><\/li><li id=\"footnote-165-13\">Tarleton Gillespie, \u2018Algorithm\u2019
(Princeton University Press 2016) &lt;https:\/\/www.degruyter.com\/document\/doi\/10.1515\/9781400880553-004\/html&gt; accessed 26 November 2021. <a href=\"#return-footnote-165-13\" class=\"return-footnote\" aria-label=\"Return to footnote 13\">&crarr;<\/a><\/li><li id=\"footnote-165-14\">Rob Kitchin, \u2018Thinking Critically about and Researching Algorithms\u2019 (2017) 20 Information, Communication &amp; Society 14. <a href=\"#return-footnote-165-14\" class=\"return-footnote\" aria-label=\"Return to footnote 14\">&crarr;<\/a><\/li><li id=\"footnote-165-15\">Mike Ananny and Kate Crawford, \u2018Seeing without Knowing: Limitations of the Transparency Ideal and Its Application to Algorithmic Accountability\u2019 (2018) 20 New Media &amp; Society 973. <a href=\"#return-footnote-165-15\" class=\"return-footnote\" aria-label=\"Return to footnote 15\">&crarr;<\/a><\/li><li id=\"footnote-165-16\">Jenna Burrell, \u2018How the Machine \u201cThinks\u201d: Understanding Opacity in Machine Learning Algorithms\u2019 (2016) 3 Big Data &amp; Society 205395171562251. <a href=\"#return-footnote-165-16\" class=\"return-footnote\" aria-label=\"Return to footnote 16\">&crarr;<\/a><\/li><li id=\"footnote-165-17\">Jakko Kemper and Daan Kolkman, \u2018Transparent to Whom? No Algorithmic Accountability without a Critical Audience\u2019 (2019) 22 Information, Communication &amp; Society 2081. <a href=\"#return-footnote-165-17\" class=\"return-footnote\" aria-label=\"Return to footnote 17\">&crarr;<\/a><\/li><li id=\"footnote-165-18\">Frank Pasquale, <em>The Black Box Society<\/em> (Harvard University Press 2015). <a href=\"#return-footnote-165-18\" class=\"return-footnote\" aria-label=\"Return to footnote 18\">&crarr;<\/a><\/li><li id=\"footnote-165-19\"><em>Id<\/em>.
<a href=\"#return-footnote-165-19\" class=\"return-footnote\" aria-label=\"Return to footnote 19\">&crarr;<\/a><\/li><li id=\"footnote-165-20\">Solon Barocas, Moritz Hardt and Arvind Narayanan, \u2018Fairness and Machine Learning\u2019 (fairmlbook.org) 253. <a href=\"#return-footnote-165-20\" class=\"return-footnote\" aria-label=\"Return to footnote 20\">&crarr;<\/a><\/li><li id=\"footnote-165-21\">Barocas and Selbst (n 7). <a href=\"#return-footnote-165-21\" class=\"return-footnote\" aria-label=\"Return to footnote 21\">&crarr;<\/a><\/li><li id=\"footnote-165-22\">Barocas, Hardt and Narayanan (n 19). <a href=\"#return-footnote-165-22\" class=\"return-footnote\" aria-label=\"Return to footnote 22\">&crarr;<\/a><\/li><li id=\"footnote-165-23\">ibid. <a href=\"#return-footnote-165-23\" class=\"return-footnote\" aria-label=\"Return to footnote 23\">&crarr;<\/a><\/li><li id=\"footnote-165-24\">Maranke Wieringa, \u2018What to Account for When Accounting for Algorithms: A Systematic Literature Review on Algorithmic Accountability\u2019, <em>Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency<\/em> (ACM 2020) &lt;<a href=\"http:\/\/dl.acm.org\/doi\/10.1145\/3351095.3372833\">http:\/\/dl.acm.org\/doi\/10.1145\/3351095.3372833<\/a>&gt; accessed 29 July 2020. <a href=\"#return-footnote-165-24\" class=\"return-footnote\" aria-label=\"Return to footnote 24\">&crarr;<\/a><\/li><li id=\"footnote-165-25\">European Parliament, Directorate-General for Parliamentary Research Services, <em>A Governance Framework for Algorithmic Accountability and Transparency<\/em>
(Publications Office 2019) &lt;<a href=\"https:\/\/data.europa.eu\/doi\/10.2861\/59990\">https:\/\/data.europa.eu\/doi\/10.2861\/59990<\/a>&gt; <a href=\"#return-footnote-165-25\" class=\"return-footnote\" aria-label=\"Return to footnote 25\">&crarr;<\/a><\/li><li id=\"footnote-165-26\">Madeleine Clare Elish, \u2018Moral Crumple Zones: Cautionary Tales in Human-Robot Interaction\u2019 (2019) 5 Engaging Science, Technology, and Society 40. <a href=\"#return-footnote-165-26\" class=\"return-footnote\" aria-label=\"Return to footnote 26\">&crarr;<\/a><\/li><li id=\"footnote-165-27\">Saint-Dizier (n 2). <a href=\"#return-footnote-165-27\" class=\"return-footnote\" aria-label=\"Return to footnote 27\">&crarr;<\/a><\/li><li id=\"footnote-165-28\">The Taxation and Other Laws (Relaxation and Amendment of Certain Provisions) Act, 2020. <a href=\"#return-footnote-165-28\" class=\"return-footnote\" aria-label=\"Return to footnote 28\">&crarr;<\/a><\/li><li id=\"footnote-165-29\">S.4 (XXIV), The Taxation and Other Laws (Relaxation and Amendment of Certain Provisions) Act, 2020. <a href=\"#return-footnote-165-29\" class=\"return-footnote\" aria-label=\"Return to footnote 29\">&crarr;<\/a><\/li><li id=\"footnote-165-30\">S.4 (XXIV), The Taxation and Other Laws (Relaxation and Amendment of Certain Provisions) Act, 2020. <a href=\"#return-footnote-165-30\" class=\"return-footnote\" aria-label=\"Return to footnote 30\">&crarr;<\/a><\/li><li id=\"footnote-165-31\">Chander Arjandas Manwani, Bombay High Court, (Writ Petition no. 3195 of 2021) order dated 21st September 2021; RMSI Private Ltd. v. National E-Assessment Centre, Delhi High Court, W.P.(C) 6482\/2021 (Delhi HC), order dated 14th July 2021.
<a href=\"#return-footnote-165-31\" class=\"return-footnote\" aria-label=\"Return to footnote 31\">&crarr;<\/a><\/li><li id=\"footnote-165-32\">\u2018Linking of Electoral Data with Aadhaar: All You Need to Know\u2019 <em>The Times of India<\/em> (21 December 2021) &lt;<a href=\"https:\/\/timesofindia.indiatimes.com\/business\/india-business\/linking-of-electoral-data-with-aadhaar-all-you-need-to-know\/articleshow\/88408171.cms\">https:\/\/timesofindia.indiatimes.com\/business\/india-business\/linking-of-electoral-data-with-aadhaar-all-you-need-to-know\/articleshow\/88408171.cms<\/a>&gt;. <a href=\"#return-footnote-165-32\" class=\"return-footnote\" aria-label=\"Return to footnote 32\">&crarr;<\/a><\/li><li id=\"footnote-165-33\">\u2018Democracy at Stake: Why Many Eligible Voters Might Not Vote in Telangana on Dec 7 | The News Minute\u2019 &lt;<a href=\"https:\/\/www.thenewsminute.com\/article\/democracy-stake-why-many-eligible-voters-might-not-vote-telangana-dec-7-92706\">https:\/\/www.thenewsminute.com\/article\/democracy-stake-why-many-eligible-voters-might-not-vote-telangana-dec-7-92706<\/a>&gt; <a href=\"#return-footnote-165-33\" class=\"return-footnote\" aria-label=\"Return to footnote 33\">&crarr;<\/a><\/li><li id=\"footnote-165-34\">Srinivas Kodali v. Election Commission of India, through Secretary and Others, Telangana High Court, (PIL No. 374\/2018). <a href=\"#return-footnote-165-34\" class=\"return-footnote\" aria-label=\"Return to footnote 34\">&crarr;<\/a><\/li><li id=\"footnote-165-35\">\u20185 Analytical Firms Look for Fraud in Ayushman Bharat PMJAY - Health News, Medibulletin\u2019 &lt;<a href=\"https:\/\/medibulletin.com\/5-analytical-firms-look-for-fraud-in-ayushman-bharat-pmjay\/\">https:\/\/medibulletin.com\/5-analytical-firms-look-for-fraud-in-ayushman-bharat-pmjay\/<\/a>&gt;.
<a href=\"#return-footnote-165-35\" class=\"return-footnote\" aria-label=\"Return to footnote 35\">&crarr;<\/a><\/li><li id=\"footnote-165-36\">Ayushman Bharat PM-JAY Annual Report, 2020-2021, National Health Authority, &lt;<a href=\"https:\/\/nha.gov.in\/img\/resources\/Annual-Report-2020-21.pdf\">https:\/\/nha.gov.in\/img\/resources\/Annual-Report-2020-21.pdf<\/a>&gt;. <a href=\"#return-footnote-165-36\" class=\"return-footnote\" aria-label=\"Return to footnote 36\">&crarr;<\/a><\/li><li id=\"footnote-165-37\">Helen Margetts and Patrick Dunleavy, \u2018The Second Wave of Digital-Era Governance: A Quasi-Paradigm for Government on the Web\u2019 (2013) 371 Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 20120382. <a href=\"#return-footnote-165-37\" class=\"return-footnote\" aria-label=\"Return to footnote 37\">&crarr;<\/a><\/li><li id=\"footnote-165-38\">RV Baru and M Nundy, \u2018Blurring of Boundaries: Public-Private Partnerships in Health Services in India\u2019 (2008) 43 Economic and Political Weekly 62. <a href=\"#return-footnote-165-38\" class=\"return-footnote\" aria-label=\"Return to footnote 38\">&crarr;<\/a><\/li><li id=\"footnote-165-39\">Margetts and Dunleavy (n 36). <a href=\"#return-footnote-165-39\" class=\"return-footnote\" aria-label=\"Return to footnote 39\">&crarr;<\/a><\/li><li id=\"footnote-165-40\">Mariano-Florentino Cu\u00e9llar, \u2018Cyberdelegation and the Administrative State\u2019 in Nicholas R Parrillo (ed), <em>Administrative Law from the Inside Out: Essays on Themes in the Work of Jerry L. Mashaw<\/em> (Cambridge University Press 2017). <a href=\"#return-footnote-165-40\" class=\"return-footnote\" aria-label=\"Return to footnote 40\">&crarr;<\/a><\/li><li id=\"footnote-165-41\">Schwartz (n 8). <a href=\"#return-footnote-165-41\" class=\"return-footnote\" aria-label=\"Return to footnote 41\">&crarr;<\/a><\/li><li id=\"footnote-165-42\">Citron (n 5).
<a href=\"#return-footnote-165-42\" class=\"return-footnote\" aria-label=\"Return to footnote 42\">&crarr;<\/a><\/li><li id=\"footnote-165-43\">Administrative Review Council (Australia), <em>Automated Assistance in Administrative Decision Making: Report to the Attorney-General<\/em> (AGPS 2005). <a href=\"#return-footnote-165-43\" class=\"return-footnote\" aria-label=\"Return to footnote 43\">&crarr;<\/a><\/li><li id=\"footnote-165-44\"><em>Id.<\/em> <a href=\"#return-footnote-165-44\" class=\"return-footnote\" aria-label=\"Return to footnote 44\">&crarr;<\/a><\/li><li id=\"footnote-165-45\">Marion Oswald, \u2018Algorithm-Assisted Decision-Making in the Public Sector: Framing the Issues Using Administrative Law Rules Governing Discretionary Power\u2019 (2018) 376 Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 20170359. <a href=\"#return-footnote-165-45\" class=\"return-footnote\" aria-label=\"Return to footnote 45\">&crarr;<\/a><\/li><li id=\"footnote-165-46\">Jennifer Cobbe, \u2018Administrative Law and the Machines of Government: Judicial Review of Automated Public-Sector Decision-Making\u2019 (2019) 39 Legal Studies 636. <a href=\"#return-footnote-165-46\" class=\"return-footnote\" aria-label=\"Return to footnote 46\">&crarr;<\/a><\/li><li id=\"footnote-165-47\">Nb. Courts have had the opportunity to consider algorithmic systems implicated in challenges to administrative action, but few have commented specifically on the implications of the use of automated systems and similar technology. Cf. Peter Whiteford, \u2018Debt by Design: The Anatomy of a Social Policy Fiasco \u2013 Or Was It Something Worse?\u2019 (2021) 80 Australian Journal of Public Administration 340.
<a href=\"#return-footnote-165-47\" class=\"return-footnote\" aria-label=\"Return to footnote 47\">&crarr;<\/a><\/li><li id=\"footnote-165-48\">Katherine Freeman, \u2018Algorithmic Injustice: How the Wisconsin Supreme Court Failed to Protect Due Process Rights in State v. Loomis\u2019 18 33. <a href=\"#return-footnote-165-48\" class=\"return-footnote\" aria-label=\"Return to footnote 48\">&crarr;<\/a><\/li><li id=\"footnote-165-49\">Sujit Choudhry, Madhav Khosla and Pratap Bhanu Mehta (eds), <em>The Oxford Handbook of the Indian Constitution<\/em> (Oxford University Press 2016); Raeesa Vakil, \u2018Constitutionalizing Administrative Law in the Indian Supreme Court: Natural Justice and Fundamental Rights\u2019 (2018) 16 International Journal of Constitutional Law 475. <a href=\"#return-footnote-165-49\" class=\"return-footnote\" aria-label=\"Return to footnote 49\">&crarr;<\/a><\/li><li id=\"footnote-165-50\">1994 SCC (6) 651. <a href=\"#return-footnote-165-50\" class=\"return-footnote\" aria-label=\"Return to footnote 50\">&crarr;<\/a><\/li><li id=\"footnote-165-51\">Upendra Baxi, \u2018Development in Indian Administrative Law\u2019 in A.G. Noorani (ed), <em>Public Law in India<\/em> (1982). <a href=\"#return-footnote-165-51\" class=\"return-footnote\" aria-label=\"Return to footnote 51\">&crarr;<\/a><\/li><li id=\"footnote-165-52\">Shreya Singhal v. Union of India, (2015) 5 SCC 1. <a href=\"#return-footnote-165-52\" class=\"return-footnote\" aria-label=\"Return to footnote 52\">&crarr;<\/a><\/li><li id=\"footnote-165-53\">Indeed, the lack of a requirement to provide a personal hearing in the FAS has been challenged before multiple High Courts as of the time of writing. <a href=\"#return-footnote-165-53\" class=\"return-footnote\" aria-label=\"Return to footnote 53\">&crarr;<\/a><\/li><li id=\"footnote-165-54\">This example is consciously borrowed from a similar rule incorporated in the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.
<a href=\"#return-footnote-165-54\" class=\"return-footnote\" aria-label=\"Return to footnote 54\">&crarr;<\/a><\/li><li id=\"footnote-165-55\">I.P. Massey, <em>Administrative Law<\/em> (10th Edition, Eastern Book Company, 2017). <a href=\"#return-footnote-165-55\" class=\"return-footnote\" aria-label=\"Return to footnote 55\">&crarr;<\/a><\/li><li id=\"footnote-165-56\">Indian Rly. Construction Co. Ltd. v. Ajay Kumar, (2003) 4 SCC 579. <a href=\"#return-footnote-165-56\" class=\"return-footnote\" aria-label=\"Return to footnote 56\">&crarr;<\/a><\/li><li id=\"footnote-165-57\">Cormen and others (n 11). <a href=\"#return-footnote-165-57\" class=\"return-footnote\" aria-label=\"Return to footnote 57\">&crarr;<\/a><\/li><li id=\"footnote-165-58\">Ben Green and Yiling Chen, \u2018Disparate Interactions: An Algorithm-in-the-Loop Analysis of Fairness in Risk Assessments\u2019, <em>Proceedings of the Conference on Fairness, Accountability, and Transparency<\/em> (ACM 2019) &lt;<a href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3287560.3287563\">https:\/\/dl.acm.org\/doi\/10.1145\/3287560.3287563<\/a>&gt;. <a href=\"#return-footnote-165-58\" class=\"return-footnote\" aria-label=\"Return to footnote 58\">&crarr;<\/a><\/li><li id=\"footnote-165-59\">G.B. Mahajan v. Jalgaon Municipal Council, (1991) 3 SCC 91. <a href=\"#return-footnote-165-59\" class=\"return-footnote\" aria-label=\"Return to footnote 59\">&crarr;<\/a><\/li><li id=\"footnote-165-60\">Indian Railway Construction Co. Ltd. v. Ajay Kumar, (2003) 4 SCC 579. <a href=\"#return-footnote-165-60\" class=\"return-footnote\" aria-label=\"Return to footnote 60\">&crarr;<\/a><\/li><li id=\"footnote-165-61\">The hypothetical is not too far from reality. Data from social media is widely used in algorithmic determinations of credit scores in India and elsewhere.
See \u2018Not CIBIL, This Lender Uses Your Social Media Behaviour for Loan up to Rs 2 Lakh!\u2019 (<em>Financialexpress<\/em>) &lt;<a href=\"https:\/\/www.financialexpress.com\/money\/not-cibil-this-lender-uses-your-social-media-behaviour-for-loan-up-to-rs-2-lakh\/1761934\/\">https:\/\/www.financialexpress.com\/money\/not-cibil-this-lender-uses-your-social-media-behaviour-for-loan-up-to-rs-2-lakh\/1761934\/<\/a>&gt;. <a href=\"#return-footnote-165-61\" class=\"return-footnote\" aria-label=\"Return to footnote 61\">&crarr;<\/a><\/li><li id=\"footnote-165-62\">Although the distinction between \u2018quasi-judicial\u2019 and administrative action is increasingly waning insofar as procedural propriety is concerned. See A.K. Kraipak v. Union of India, (1969) 2 SCC 262. <a href=\"#return-footnote-165-62\" class=\"return-footnote\" aria-label=\"Return to footnote 62\">&crarr;<\/a><\/li><li id=\"footnote-165-63\">D.K. Yadav v. J.M.A. Industries Ltd., 1993 SCC (3) 259. <a href=\"#return-footnote-165-63\" class=\"return-footnote\" aria-label=\"Return to footnote 63\">&crarr;<\/a><\/li><li id=\"footnote-165-64\">Jiwan K. Lohia v. Durga Dutt Lohia, (1992) 1 SCC 56. <a href=\"#return-footnote-165-64\" class=\"return-footnote\" aria-label=\"Return to footnote 64\">&crarr;<\/a><\/li><li id=\"footnote-165-65\">Cobbe (n 45). <a href=\"#return-footnote-165-65\" class=\"return-footnote\" aria-label=\"Return to footnote 65\">&crarr;<\/a><\/li><li id=\"footnote-165-66\">Keshav Mills Co. Ltd. v. Union of India, (1973) 1 SCC 380. <a href=\"#return-footnote-165-66\" class=\"return-footnote\" aria-label=\"Return to footnote 66\">&crarr;<\/a><\/li><li id=\"footnote-165-67\">Gurdial Singh Fiji v. State of Punjab, (1979) 2 SCC 368; Kranti Associates (P) Ltd. v. Masood Ahmed Khan, (2010) 9 SCC 496.
<a href=\"#return-footnote-165-67\" class=\"return-footnote\" aria-label=\"Return to footnote 67\">&crarr;<\/a><\/li><\/ol><\/div>","protected":false},"author":4,"menu_order":6,"template":"","meta":{"inline_featured_image":false,"pb_show_title":"on","pb_short_title":"","pb_subtitle":"","pb_authors":["divij-joshi"],"pb_section_license":""},"chapter-type":[],"contributor":[65],"license":[],"part":3,"_links":{"self":[{"href":"https:\/\/publications.clpr.org.in\/the-philosophy-and-law-of-information-regulation-in-india\/wp-json\/pressbooks\/v2\/chapters\/165"}],"collection":[{"href":"https:\/\/publications.clpr.org.in\/the-philosophy-and-law-of-information-regulation-in-india\/wp-json\/pressbooks\/v2\/chapters"}],"about":[{"href":"https:\/\/publications.clpr.org.in\/the-philosophy-and-law-of-information-regulation-in-india\/wp-json\/wp\/v2\/types\/chapter"}],"author":[{"embeddable":true,"href":"https:\/\/publications.clpr.org.in\/the-philosophy-and-law-of-information-regulation-in-india\/wp-json\/wp\/v2\/users\/4"}],"version-history":[{"count":5,"href":"https:\/\/publications.clpr.org.in\/the-philosophy-and-law-of-information-regulation-in-india\/wp-json\/pressbooks\/v2\/chapters\/165\/revisions"}],"predecessor-version":[{"id":172,"href":"https:\/\/publications.clpr.org.in\/the-philosophy-and-law-of-information-regulation-in-india\/wp-json\/pressbooks\/v2\/chapters\/165\/revisions\/172"}],"part":[{"href":"https:\/\/publications.clpr.org.in\/the-philosophy-and-law-of-information-regulation-in-india\/wp-json\/pressbooks\/v2\/parts\/3"}],"metadata":[{"href":"https:\/\/publications.clpr.org.in\/the-philosophy-and-law-of-information-regulation-in-india\/wp-json\/pressbooks\/v2\/chapters\/165\/metadata\/"}],"wp:attachment":[{"href":"https:\/\/publications.clpr.org.in\/the-philosophy-and-law-of-information-regulation-in-india\/wp-json\/wp\/v2\/media?parent=165"}],"wp:term":[{"taxonomy":"chapter-type","embeddable":true,"href":"https:\/\/publications.clpr.org.in\/the-philosophy-
and-law-of-information-regulation-in-india\/wp-json\/pressbooks\/v2\/chapter-type?post=165"},{"taxonomy":"contributor","embeddable":true,"href":"https:\/\/publications.clpr.org.in\/the-philosophy-and-law-of-information-regulation-in-india\/wp-json\/wp\/v2\/contributor?post=165"},{"taxonomy":"license","embeddable":true,"href":"https:\/\/publications.clpr.org.in\/the-philosophy-and-law-of-information-regulation-in-india\/wp-json\/wp\/v2\/license?post=165"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}