This series was supported by the Pulitzer Center.
The community health worker lifts her smartphone toward the woman's face and waits. The screen flashes: "Face not matched". She tries again, adjusting the angle, asking the woman to remove her headscarf, to wipe the sweat from her forehead, and to stand closer to the doorway where the light is better. Ten minutes pass. The line of women waiting behind her grows restless. Finally, the phone accepts the image and the worker can hand over the ration.
But the technology doesn’t work for many. Pregnant women are turned away because an AI algorithm cannot recognise their changing faces.
Across India, pregnant women and nursing mothers are being denied emergency food rations because facial recognition technology fails to match their current appearance to years-old identity photographs, a Decode investigation has found.
Since July 2025, beneficiaries of the Integrated Child Development Services (ICDS), the world's largest maternal and child nutrition programme serving 4.73 crore people, must pass AI-powered facial recognition scans before collecting supplies. The system was mandated by the Ministry of Women and Child Development despite warnings from workers' unions that it would fail vulnerable women whose faces change during pregnancy, illness or ageing.
Government data shows that nearly half of eligible beneficiaries had not received rations through the system by the end of 2025, with authentication failures leaving thousands of pregnant women, nursing mothers and young children without food.
Google’s Face-Detection AI Decides Welfare
The facial recognition system runs on Google's ML Kit—a revelation that emerged only after technical analysis of the government's Poshan Tracker app by a researcher working with Decode.
Anoop, a researcher at Aalto University in Finland who analysed the technical architecture of the Poshan Tracker app for Decode, said a review of the app’s installation file (APK) shows that it includes components of Google ML Kit used for face detection.
He found that the app contains references to the ML Kit Face Detection library, a tool that enables phones to detect faces, not identify individuals, directly on the device. These references appear in the app’s configuration files and internal assets, indicating that the feature is built into the application itself.
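An APK is an ordinary ZIP archive, so a check of this kind can be reproduced with a short script. The sketch below is illustrative, not the researcher's actual tooling: the marker strings are assumptions about how ML Kit's face-detection package typically appears inside a packaged Android app, and the exact strings present depend on how the app was built.

```python
import zipfile

# Strings that typically betray the presence of Google's ML Kit
# face-detection library inside a packaged Android app. The exact
# form depends on the build, so several variants are checked.
MLKIT_FACE_MARKERS = (
    b"com.google.mlkit.vision.face",    # dotted Java package name
    b"com/google/mlkit/vision/face",    # slash-separated class path (DEX)
    b"com.google.mlkit:face-detection", # Gradle artifact coordinate
)

def find_mlkit_references(apk_path):
    """Return the names of APK entries that contain ML Kit face-detection strings."""
    hits = []
    with zipfile.ZipFile(apk_path) as apk:  # an APK is just a ZIP archive
        for name in apk.namelist():
            data = apk.read(name)
            if any(marker in data for marker in MLKIT_FACE_MARKERS):
                hits.append(name)
    return hits
```

Run against an installation file pulled from a device, a non-empty result would indicate that face-detection components ship inside the app itself, consistent with what the researcher reported.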
Static analysis revealing Google ML Kit in Poshan Tracker
The government has never publicly disclosed that it uses Google's technology to ration food. ML Kit is an artificial intelligence tool commonly used in consumer apps for tasks like unlocking phones or tagging faces in photos. The same technology is now determining whether India's poorest pregnant women can access nutrition.
Responding to Decode’s query, Google said that ML Kit “does not have facial recognition capabilities” and is not designed to identify specific individuals. “It is offered as a freely-available API to developers globally, and since it runs on-device, Google does not have visibility into the applications that use it,” the statement read.
The company said that the responsibility for how the technology is designed, deployed, and kept compliant with the law lies with the developers who integrate it.
The Ministry has not disclosed whether the technology was assessed for accuracy on pregnant women before deployment, nor has it published any data on how many fraudulent claims the system has prevented.
An RTI response revealed that the Poshan Tracker—the government app that houses the facial recognition system—has a five-year budget of Rs 53.79 crore. In the first year itself, Rs 21.37 crore was allocated. The RTI did not specify how much of this was spent on the facial recognition system.
The response outlined several technical challenges encountered during the pilot phase in August 2024, including smartphone specifications (2–3 GB RAM), difficulties in receiving OTPs, and the time required for facial scans. It stated that the app was subsequently optimised to address the authentication issue.
When Pregnancy Triggers Exclusion
Ten weeks after giving birth in August, Renu (name changed to protect her identity) was cut off from supplementary nutrition because the facial recognition software could not match her face to a photograph taken eight years earlier, when she was 18.
"I don't look like my Aadhaar photo anymore," said the 26-year-old mother of two from Chandankyari village in Jharkhand. "And the phone linked to my Aadhaar is with my husband. He works at a construction site in Delhi. I cannot receive the one-time password they send."
With the ration cut off, Renu says she has struggled to feed herself and her newborn adequately. She fears the lack of nutrition could affect his growth as he is at a crucial age of development.
Without completing electronic verification, Renu was removed from the welfare rolls. Her supplementary nutrition stopped. Her four-year-old elder son's access to pre-school education, meals and health monitoring was also jeopardised because his records are tied to hers.
“I still go to the centre,” Renu says. “But they tell me my name is not there. They say I need to fix the Aadhaar. But how?”
‘My Face Will Change Again…’
Preeti (name changed) was nine months pregnant in November 2025 when the reporter first met her. She had been denied food rations for four months because the AI system could not match her current appearance to an Aadhaar photograph taken before her marriage, when she was noticeably younger.
She lived with her elderly mother-in-law in Karpi, a village in Bihar, while her husband worked away. Her doctor at the local primary health centre had warned her about the risks of poor nutrition late in pregnancy, especially because she was anaemic. She had been advised to eat more protein and iron-rich food.
"They tell me to eat properly, to take care," Preeti said at the time. "But how do I do that if the ration doesn't come?"
Facial authentication fails while matching live photo with the Aadhaar (Image Credit: Tej Bahadur Singh)
Without the ICDS take-home ration, her diet was largely limited to rice and seasonal vegetables. Milk, eggs and pulses were too expensive. To manage her medical visits and ensure access to nutritious food, her husband had begun taking on additional shifts, sending back whatever he could so she could "eat properly" during her final months of pregnancy.
Preeti had attempted to update her Aadhaar photograph twice, travelling four kilometres to the block office each time. "Both times they told me the server was down," she said.
Even if she succeeded, Preeti questioned what would happen next.
"My face will change again after delivery. Pregnancy changes a woman's body. Then the same thing will start all over again."
Her fear was medically valid. Dr Karishma Thariani, a gynaecologist based in Delhi, explained that facial changes during and sometimes after pregnancy are common. "There can be pigmentation during pregnancy, and sometimes it continues after," she said. Weight loss, changes in complexion and hair can all alter facial appearance. These shifts, Thariani noted, tend to be sharper in rural and low-income settings where malnutrition is more prevalent.
What medicine recognises as a normal physiological process, the system treats as a mismatch.
A Fundamental Design Flaw
Anoop, the tech researcher, said the problem goes beyond implementation and points to a fundamental design flaw. “At its core, ML Kit is designed to detect faces, not reliably recognise people over time,” he explained.
In welfare settings, this distinction becomes critical. The system’s performance depends heavily on the quality and recency of the reference image—which, in ICDS, is an Aadhaar photograph taken years earlier. When that image is outdated or low-quality, Anoop noted, the likelihood of failure increases sharply.
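The failure mode Anoop describes can be illustrated with a toy sketch. Face-matching pipelines typically convert each photo into a numerical "embedding" and accept a match only when the similarity between the live image and the reference crosses a fixed threshold. The vectors, dimensions, and threshold below are invented for illustration and are not drawn from the Poshan Tracker system:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

MATCH_THRESHOLD = 0.80  # hypothetical fixed cut-off

def authenticate(live_embedding, reference_embedding):
    """Accept the match only if similarity clears the fixed threshold."""
    return cosine_similarity(live_embedding, reference_embedding) >= MATCH_THRESHOLD

# Toy 3-dimensional "embeddings" (real systems use hundreds of dimensions).
live_face      = [0.9, 0.4, 0.1]     # how the woman looks today
recent_photo   = [0.85, 0.45, 0.15]  # a recently taken reference image
outdated_photo = [0.3, 0.9, 0.4]     # a years-old photo after facial changes

print(authenticate(live_face, recent_photo))    # True: recent reference matches
print(authenticate(live_face, outdated_photo))  # False: stale reference fails
```

The point of the sketch is that the threshold is applied to the reference image as it was, not the person as she is: when the stored photo drifts far enough from the living face, the same fixed cut-off that admits one woman rejects another, with no notion of why.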
“If the state mandates biometric compliance for accessing food or care, it also has a responsibility to guarantee the infrastructure, accuracy, and fail-safes needed to make that system work,” he said.
Without those safeguards, biometric authentication becomes an accountability mechanism that pushes the cost of failure onto women, children, and frontline workers. From a systems perspective, the researcher warned that embedding facial recognition into welfare delivery sets a “troubling precedent”. Normalising such technologies without transparency, consent, or meaningful redress, he said, risks turning exceptional surveillance into routine governance.
The system was introduced, the Ministry of Women and Child Development has said, to prevent "duplication and leakages" in ICDS, a 50-year-old programme that provides supplementary nutrition, health monitoring and pre-school education to pregnant women, nursing mothers and children under six.
But interviews with more than 15 Anganwadi workers across Bihar, Jharkhand and Karnataka (part of the 1.4 million frontline women who deliver the programme across India), together with documents reviewed by Decode, reveal no evidence that the system has prevented significant fraud, even as it excludes genuine beneficiaries at scale.
Government figures show that of 4.73 crore eligible beneficiaries, 4.51 crore (over 91%) had completed initial documentation and were awaiting face matching as of 31 December 2025. Yet only 2.79 crore (52.7%) had successfully received rations through facial recognition. Nearly half had been locked out.
In Bihar, 71.5% of eligible beneficiaries had received rations through facial recognition by August 2025, according to government data. In Jharkhand, the figure was 58.5%.
The Ministry of Women and Child Development did not respond to Decode’s questions about where the missing beneficiaries are, how many have been permanently removed from welfare rolls, or what evidence exists that the system has prevented fraud.
The Smartphone Gatekeeper
Inside an Anganwadi centre in Karpi, Bihar, women queue holding toddlers. Neelam, an Anganwadi worker, lifts her smartphone towards each face. Some women leave with sacks of rice and lentils. Others are turned away.
Before the AI can scan a face, the women must first complete Aadhaar electronic verification, requiring updated Aadhaar details and access to a mobile number linked to the ID to receive a one-time password. This process is called Aadhaar eKYC.
For many women, this is where the process breaks down. Mobile numbers are often outdated, inactive, or tied to phones that travel with husbands to distant worksites. Without the password, verification stalls and the beneficiary drops out before reaching the facial recognition stage.
The mandate has reshaped not only who receives food, but how the system is run. India’s ICDS rests on nearly 1.4 million Anganwadi workers—women drawn from the same communities they serve, paid modest honorariums, and tasked with turning welfare policy into daily practice. The new system has turned routine service delivery into a series of high-stakes decisions, where technical failure can mean denying food to someone they know needs it.
"Still no match," Neelam said, stepping closer to the doorway to catch a stronger mobile signal. "Remove the dupatta from your head," she told the woman in front of her.
Neelam conducts door-to-door facial authentication for women unable to travel to the centre (Image Credit: Tej Bahadur Singh)
Neelam is trying to make the living face resemble a frozen photograph. Sometimes, she asks women to remove lipstick or wipe sweat. Sometimes she adjusts the angle or lighting.
"Keep face straight. Blink once," the screen instructs.
Nearly 10 minutes later, the AI accepts the face. Neelam hands over the ration packets. The line has barely moved. In weeks when facial recognition (FRS) verification is due, Neelam spends over four hours each day getting the AI system to recognise the women’s faces.
"No one taught us this," Neelam said. "We are just taking blind shots."
At Neelam's centre alone, three pregnant women and five nursing mothers have been removed because their faces no longer match old Aadhaar photos.
While Neelam’s centre is only one among 1.4 million Anganwadis across India, the scale is sobering. Even if just one in ten centres sees similar facial recognition failures, the resulting exclusions could affect more than a million pregnant and lactating women.
When There Is No Food To Give
In Jharkhand, the crisis has a surreal dimension. Many Anganwadi centres have not received supplies for two months, yet women are still required to present themselves for facial recognition, to remain visible in the system.
In practice, facial recognition has become a mandatory check-in to avoid being deleted from welfare rolls entirely, regardless of whether any food is actually being distributed. This reveals a function beyond ration distribution: the technology is being used to enforce periodic biometric authentication as a condition of remaining in the system at all.
"These are pregnant and lactating women," said Shyamla Devi, an Anganwadi worker since 2007 in Chandankyari village. "It doesn't make sense to ask them to get their face scanned when there is no ration. But if they don't come, the system removes them."
Although the ICDS scheme does not explicitly mandate periodic facial authentication to maintain eligibility, Anganwadi workers report that in practice, prolonged absence from facial recognition checks triggers automatic removal from beneficiary lists.
Many have stopped coming, Devi said. "They ask why stand in line and blink at a phone when there's nothing at the end."
Repeated facial authentication failures leave women at Shyamla Devi’s centre frustrated. (Image Credit: Tej Bahadur Singh)
Since facial recognition was introduced, Devi has lost nearly 10 women beneficiaries whose Aadhaar photos are years old. "Some just give up. Why come and fail again and again in front of a camera?"
The mandate is expanding. Facial recognition has now been extended to children aged three to six. For children who do not yet have Aadhaar numbers, the system uses a parent's identity, usually the mother's. When verification fails, a child is dropped from the rolls. They are cut off from food, pre-school education, health monitoring and daily meals.
Devi was instructed to remove at least four children after repeated facial recognition failures linked to their mothers' records. "They still come for morning snacks and to study," she said.
"I can't turn them away just because the app didn't recognise their mother's face. I adjust whatever little allocation we get and feed them anyway."
The Workers Who Bear The Cost
As beneficiaries disappear from welfare lists, anger falls on Anganwadi workers, the last human interface in an AI-driven system they do not control.
"They think we are keeping the ration," Neelam said. "But if the app doesn't show a name, I can't give anything."
Most workers remain in the same village until retirement, absorbing blame for policies made elsewhere. "The government makes five-year rules," Devi said. "We have to live here forever, facing the consequences."
The pressure is also financial. Anna, a 59-year-old Anganwadi worker in Jharkhand’s Kodekel village, spends nearly Rs 1,000 every month escorting beneficiaries to the block office 12 kilometres away, where mobile connectivity is strong enough for verification.
“Sometimes I pay for everyone,” Anna said.
She earns Rs 10,500 a month—Rs 4,500 from the Centre and Rs 6,000 from the state. The state share arrives on time; the central payment is often delayed by months, she says.
“No worker has ever received the full salary together,” she said. “And after retirement at 62, there is nothing—no pension, no security.”
Workers also bear the cost of running the digital system. Government-issued phones stopped working years ago, forcing many to buy their own. Internet reimbursements of Rs 2,000 a year are delayed and inadequate, with monthly recharges alone costing around Rs 300.
In effect, some of India’s poorest women are paying out of pocket to operate a facial recognition system that denies food to other poor women.
‘Snake Oil For Accountability’
Anganwadi workers say they face sustained pressure from supervisors to display “100% FRS completion” on the government dashboard. To achieve this, workers say they are often instructed by their supervisors to remove beneficiaries who repeatedly fail facial authentication—at least three to four women per centre—and replace them with new names who can pass the verification process.
This shifts the focus from addressing authentication failures to managing numbers, workers allege. In some cases, beneficiaries with greater nutritional needs are dropped, while others who can clear facial checks are added.
A message from a supervisor to Anganwadi workers stated: “Those distributing rations to children in the 3–6 years age category without FRS will have to stop. From next month, ration quantities will be released based on the number of beneficiaries authenticated via FRS.”
Representational image of the supervisor’s message on FRS verification. (Recreated with AI)
Several workers shared messages from supervisors urging them to “clear pending FRS cases” and warning that incomplete dashboards would require explanations or could lead to disciplinary action.
Economist Reetika Khera, who has studied welfare delivery and exclusion for years, describes such technologies as “snake oil for accountability”. “We don’t have rigorous, reliable evidence of how much is being siphoned off,” she said. Where leakage exists, she acknowledged, corrective measures are needed, but facial recognition is not the answer.
Instead, Khera said, these systems create new and arbitrary hurdles for beneficiaries. “What these technologies are doing is creating fresh barriers for people to access their entitlements. Access to nutrition is now being made conditional on some random technology that is neither transparent nor consistently reliable, one that can be switched off by a failed scan or a clunky app.”
Beyond accuracy or efficiency, Khera said, the use of biometrics in welfare should be recognised as surveillance and scrutinised from that point of view. “At every drop of a hat, if you want to start taking photos and surveying people, this is something that needs to be challenged at a more fundamental level,” she said.
Sameet Panda of LibTech, a research and advocacy collective working on welfare transparency and accountability, said that under the National Food Security Act (NFSA), access to food cannot legally be made conditional without a legislative overhaul. “Yet exclusionary technologies like facial recognition are effectively doing just that. The tech is negating the objective,” he added.
According to Advocate Dipika Sahni, the constitutional right most directly implicated when access to food and nutrition is made conditional on biometric or facial authentication is Article 21, which guarantees the Right to Life and Personal Liberty.
"The Supreme Court has consistently held that the right to life includes the right to live with human dignity, and that encompasses the right to food," she said.
She added that Article 14, which guarantees the Right to Equality, may also be violated if the technology results in arbitrary exclusions or creates irrational classifications between beneficiaries.
Facial recognition is also straining Anganwadi workers’ relationships with communities, Panda said. “Frontline workers usually work on trust, but failed authentications make them appear complicit in denial, breaking years of trust.”
Any technological intervention in welfare delivery, he argued, must have a non-technological alternative.
“Without this safeguard, a rights-based system is being reduced to a relationship between a benevolent government and beneficiaries, instead of functioning as a guaranteed right.”
‘Remove It Once And For All….’
In December, nearly 10,000 Anganwadi workers protested across Bengaluru, Tumakuru, Mandya, and Hubballi in Karnataka, calling for the removal of facial recognition, recognition of ICDS as a full government department, and exemption from election duties.
"We are called 'honorary' workers," said S Varalakshmi, president of the workers' union. "But the food and education we provide are rights, not charity. The people we serve are rights holders, not beneficiaries. We want recognition."
On facial recognition, Varalakshmi was unequivocal: "Remove it once and for all. The government claims it will reduce corruption. If that is the problem, fix the system instead of shifting the burden onto us and excluding the very people the scheme is meant to serve."
On December 3, a delegation of ten workers from Karnataka met officials of the Ministry of Women and Child Development in New Delhi. Officials promised corrective steps, including allowing alternative identity documents in place of Aadhaar for facial authentication and addressing concerns around election duties.
Thousands of Anganwadi workers gathered at protest sites, shutting down their centres. (Image Credit: Tej Bahadur Singh)
On February 11, an app update introduced a limited alternative identity provision, applicable only to pregnant and lactating women. The update allows workers to upload documents such as voter ID cards, bank passbooks, passports, or ration cards. However, this change allows a woman’s name to be added to the system only after the Anganwadi supervisor verifies and approves the alternate identity document.
For infants in the 0–6 months category, the updated rules state that apart from the mother’s Aadhaar number, other proof of identity may also be accepted. However, the app specifies that when registration is done using the father’s or guardian’s Aadhaar, eKYC and facial capture become mandatory. As the updated process on the app notes, “While registering with Father’s or Guardian’s Aadhaar, it is mandatory to perform eKYC and Face Capture.”
The workaround still requires facial recognition, leaving the fundamental problem unresolved.
Globally, governments have been cautious about using facial recognition in public services. In Sweden, authorities fined a municipality in 2019 for using it to track student attendance, calling it disproportionate. In the Netherlands, the System Risk Indication (SyRI) welfare-fraud system was struck down by a court for violating privacy and targeting vulnerable groups. In Indonesia, biometric tools for welfare distribution have largely remained limited to small pilots due to concerns over exclusion and access.
In India, however, facial recognition has quietly entered welfare delivery without public debate or legislative authorisation, now mediating access to food under a rights-based programme.
While Members of Parliament have raised concerns in the Rajya Sabha about authentication failures, added burdens on Anganwadi workers, and the risk of exclusion, there has been no dedicated parliamentary debate or standalone committee review examining the constitutional, privacy, or proportionality implications of mandating facial recognition in ICDS.
Advocate Dipika Sahni reiterated that benefits under ICDS are not charity, but a statutory entitlement flowing from the National Food Security Act and from the constitutional right to life under Article 21.
“Any invasion of privacy, including compulsory biometric collection, must satisfy the proportionality test laid down by the Supreme Court,” she said. This means the state must show that the measure serves a legitimate aim, is necessary, that no less intrusive alternative exists, and that the harm caused, such as exclusion from food, is not disproportionate to its intended benefit.
The consequences of that test are already visible on the ground.
Waiting For The Algorithm’s Approval
Preeti gave birth to a healthy baby in December. She has been recovering steadily, but her name remains missing from the welfare system. As a nursing mother, she is entitled to supplementary nutrition, yet she has been unable to access it.
While she says she can “somehow manage” her own health, she worries about what lies ahead for her child.
“My concern is not just now,” Preeti says. “When he grows, will he get the food and the schooling he is supposed to? If my name itself is not there, how will he be counted?”
Asked whether she filed a complaint, Preeti shook her head. “I don’t know where to go or whom to approach.” While the Ministry operates a toll-free grievance redressal helpline (1515), Preeti and other women facing exclusion told Decode they were unaware of its existence.
For Preeti, the failure is measured in missed meals and mounting anxiety about whether her child will begin life already at a disadvantage.
Women like Preeti are learning a new reality. In the world's largest democracy, access to food for your newborn child depends on whether a Google algorithm, deployed without public disclosure, tested inadequately, and designed for consumer apps, recognises your face.
This story has been edited by Adrija Bose.