Amnesty International is today releasing a technical explainer on the underlying technology behind Samagra Vedika, an algorithmic system used in the Indian state of Telangana since 2016, as part of its work to ensure that automated social protection systems are fit for purpose and do not prevent eligible people from receiving welfare benefits.
The technical explainer describes the human rights risks of Samagra Vedika and sheds light on its use of a technological process known as “entity resolution”, which uses machine learning algorithms to merge databases with the aim of assessing the eligibility of welfare applicants and detecting fraud and duplication among beneficiaries of social protection programmes.
Automated decision-making systems like Samagra Vedika are opaque, and flatten people’s lives by quantifying them using artificial intelligence (AI) and algorithms. Amid regulatory gaps and a lack of transparency, it is extremely difficult to investigate the human rights impact of such systems.
David Nolan, Senior Research Fellow, Amnesty Tech
The publication of this technical explainer follows media reports alleging that Samagra Vedika has excluded thousands of people from accessing social protection measures, including those related to food security, income and housing. A 2024 investigation published by Al Jazeera found that thousands of families were denied vital benefits due to errors in the system, which merges personal data from multiple government databases, raising serious human rights concerns over the right to social security. It was through this investigation that these concerns first came to light.
“Automated decision-making systems like Samagra Vedika are opaque, and flatten people’s lives by using artificial intelligence (AI) and algorithms to quantify them. Amid a regulatory vacuum and a lack of transparency, it is extremely difficult to investigate the human rights impacts of such systems,” said David Nolan, Senior Research Fellow at Amnesty Tech.
The use of entity resolution represents a new class of welfare technology. It relies on a complex process, often incorporating AI and machine learning, to systematically compare pairs of individual records across large datasets and determine whether they match, in order to assess applicants’ eligibility against a range of criteria and to detect fraudulent or duplicate recipients.
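To illustrate the kind of pairwise comparison at the heart of entity resolution, the minimal Python sketch below matches records across two hypothetical databases. The record fields, similarity weights and match threshold are assumptions chosen purely for illustration; they do not reflect the actual design of Samagra Vedika or Posidex’s software.

```python
# A minimal, hypothetical sketch of pairwise record matching, the core step in
# entity resolution. Field names, weights and the threshold are illustrative
# assumptions only, not Samagra Vedika's actual implementation.
from dataclasses import dataclass
from difflib import SequenceMatcher
from itertools import product


@dataclass
class Record:
    name: str
    address: str
    id_number: str  # e.g. a government ID; may be missing or inconsistently entered


def field_similarity(a: str, b: str) -> float:
    """Return a similarity score in [0, 1] for two string fields."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()


def match_score(r1: Record, r2: Record) -> float:
    """Weighted average of per-field similarities (weights are assumptions)."""
    return (
        0.4 * field_similarity(r1.name, r2.name)
        + 0.3 * field_similarity(r1.address, r2.address)
        + 0.3 * field_similarity(r1.id_number, r2.id_number)
    )


def link_records(db_a: list[Record], db_b: list[Record], threshold: float = 0.85):
    """Compare every pair of records across two databases and flag likely matches.

    A production system would add 'blocking' to avoid the quadratic number of
    comparisons and would typically use a trained classifier rather than fixed
    weights, but the basic pairwise structure is the same.
    """
    for r1, r2 in product(db_a, db_b):
        score = match_score(r1, r2)
        if score >= threshold:
            yield r1, r2, score


if __name__ == "__main__":
    welfare_rolls = [Record("A. Kumar", "12 MG Road, Hyderabad", "XX123")]
    land_registry = [Record("Kumar A", "12 M G Road Hyderabad", "XX123")]
    for r1, r2, score in link_records(welfare_rolls, land_registry):
        print(f"Possible match (score {score:.2f}): {r1.name} <-> {r2.name}")
```

Even in this simplified form, the risk is visible: a false positive, such as two different people with similar names and addresses being merged into a single “entity”, can wrongly flag an applicant as a duplicate or as holding assets they do not have. Errors of this kind in the merging of government databases are what the Al Jazeera investigation alleges led to wrongful exclusions from welfare.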
Amnesty International spent a year attempting to design and conduct an audit of the Samagra Vedika system. Despite these efforts, the audit remains incomplete due to challenges in accessing the underlying system and an overall lack of transparency from the developers and adopters of the system. Nevertheless, by embarking on this process, Amnesty International has gathered important methodological learnings and insights into the nascent field of algorithmic investigation. By sharing these, Amnesty International aims to strengthen the collective capacity of civil society, NGOs and journalists to conduct future research in this area.
“The government needs to recognize that real lives are at risk here,” said David Nolan.
“Governments’ outsourcing of these systems to private companies raises barriers for civil society and journalists investigating the technical configuration of digitalized social protection. As a result, the private and public actors responsible for their design and implementation are able to escape accountability, while those affected by these systems are left stuck in a bureaucratic maze with little or no access to redress.”
The Samagra Vedika case in Telangana is emblematic of governments’ increasing reliance on AI and automated decision-making (ADM) systems to manage social protection programmes. This trend often results in unjust outcomes for already marginalized groups, such as exclusion from social security benefits, without sufficient accountability, transparency or redress.
It is essential that any introduction of technology into social protection systems is preceded and accompanied by appropriate and robust human rights impact assessments throughout the system lifecycle, from design to deployment, along with effective mitigation measures as part of human rights due diligence procedures.
The government needs to recognize that real lives are at stake here.
Engagement with affected communities is essential and changes to critical support systems must be clearly and easily communicated. Ultimately, if a system is found to pose a significant risk to human rights that cannot be adequately mitigated, it should not be implemented.
Background
The technical explainer builds on the findings of an investigation published in 2024 by Al Jazeera, in collaboration with the Pulitzer Center’s Artificial Intelligence (AI) Accountability Network. The investigation revealed flawed implementation patterns in the Samagra Vedika system, resulting in the arbitrary denial of access to welfare for thousands of people.
Prior to the publication of this technical explainer, Amnesty International wrote to Posidex Technologies Private Limited, the private company that provides the entity resolution software on which the Samagra Vedika system relies. Amnesty International had not received a response at the time of publication.
Amnesty International’s 2023 research, Trapped by Automation: Poverty and Discrimination in Serbia’s Welfare State, documented how many people, particularly Roma and people with disabilities, struggled to pay their bills and put food on the table after being removed from social assistance support following the introduction of the social card registry system.
In 2021, Amnesty International documented how an algorithmic system used by the Dutch tax authorities racially profiled recipients of childcare benefits. The tool was supposed to check whether benefit claims were genuine or fraudulent, but the system unfairly penalized thousands of parents from low-income and immigrant backgrounds, leaving them with exorbitant debts and in poverty.