      Explainers

      Mass Surveillance: Indian Railways Use Facial Recognition Tech And AI

      Herta Security is supplying its advanced facial recognition technology for the Indian railways project. Its systems can detect emotions, micro-expressions, and even gaze directions.

By Hera Rizwan | Published 5 Aug 2024 12:22 PM IST

      Indian Railways Implements Facial Recognition Technology To Combat Crime

      • Indian Railways is integrating AI and facial recognition-enabled CCTV cameras into train coaches across the country to enhance security and curb crime.
      • IDIS Global and Herta Security are supplying the technology. IDIS provides 4K cameras with AI video analytics, while Herta offers advanced facial recognition technology.
      • There are concerns that increased surveillance could deter public protests and other forms of democratic participation, as seen in previous instances of using surveillance for policing protests.

      India's rail networks are adding AI and facial recognition-enabled CCTV cameras inside train coaches nationwide. The Ministry of Railways will be using face-cropping tools and face-matching servers to monitor and identify individuals with the aim of 'curbing crime'.

      According to a blog post from IDIS Global, the South Korean video surveillance manufacturer handling the project, Indian Railways will be equipped with a 4K camera system integrated with AI video analytics and facial recognition technology from Herta Security. In May 2024, cameras were installed at hundreds of platforms across 230 busy stations in the eastern region, marking the first phase of the rollout.

      In January, the Centre for Railway Information Systems, an arm of the Ministry of Railways, had floated a tender for the project. As per the ambitious plan, 38,255 coaches will be equipped with 8 cameras each, 2,744 coaches with 5 cameras, 2,079 coaches with 4 cameras, and 960 coaches with 6 cameras.
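Taken together, the tender figures quoted above imply a very large deployment; a quick back-of-the-envelope tally (our own arithmetic, not a figure from the tender) can be sketched as:

```python
# Tally the coach and camera counts from the tender figures quoted above.
# Mapping: cameras per coach -> number of coaches in that category.
allocation = {8: 38255, 5: 2744, 4: 2079, 6: 960}

total_coaches = sum(allocation.values())
total_cameras = sum(cams * coaches for cams, coaches in allocation.items())

print(total_coaches)  # 44038 coaches
print(total_cameras)  # 333836 cameras
```

That is roughly 44,000 coaches and over 330,000 onboard cameras if the plan is implemented in full.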

      The CCTV systems will use video analytics and facial recognition technology (FRT). Each train will have four cameras at entry and exit points. These cameras will crop face images from the live feed and send the data to a central server in real-time, storing facial data of all passengers, including children.
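The crop-and-forward flow described above can be illustrated with a minimal sketch. Everything here is hypothetical: the detection step is stubbed out and none of the names correspond to the vendors' actual APIs, which are not public.

```python
from dataclasses import dataclass

# Illustrative sketch of the pipeline the article describes: detect faces in
# a frame, crop the face regions, and forward the crops to a central server.
# All names are hypothetical; the real IDIS/Herta interfaces are not public.

@dataclass
class BoundingBox:
    x: int  # left edge of the detected face
    y: int  # top edge of the detected face
    w: int  # width of the face region
    h: int  # height of the face region

def crop(frame, box):
    """Crop a face region from a frame represented as a 2D list of pixels."""
    return [row[box.x:box.x + box.w] for row in frame[box.y:box.y + box.h]]

def process_frame(frame, detect, send):
    """Run the detector, crop each detected face, and forward the crops."""
    for box in detect(frame):
        send(crop(frame, box))

# Toy usage: a 4x4 "frame", a stubbed detector, and an in-memory "server".
frame = [[r * 4 + c for c in range(4)] for r in range(4)]
sent = []
process_frame(frame, detect=lambda f: [BoundingBox(1, 1, 2, 2)], send=sent.append)
print(sent)  # [[[5, 6], [9, 10]]]
```

In a real deployment the `send` step would stream the crops to the face-matching server in real time, which is exactly where the data-retention questions raised below arise.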

      The project adds to the growing list of government undertakings relying on citizens' sensitive personal data, implemented while India's Digital Personal Data Protection Act remains in limbo.

      Also Read: What Is The Fast Track Immigration Programme And Why Is It Raising Concerns?

      What do we know about IDIS Global and Herta Security?

      Founded in 1997, IDIS Global is a security technology company headquartered in Seoul, South Korea. It operates globally through regional offices and strategic partners. IDIS has over two million recorders installed worldwide, and its technology is used in more than 16.5 million cameras.

      The company will aid Indian Railways with its AI threat alerts. Its blog says the alerts will "drive efficiency and reduce pressure on monitoring teams, for maximum consistency of surveillance operations".

      Additionally, the system’s video management platform provides mobile surveillance via its IDIS Mobile Plus app, allowing images and videos captured from smartphones to be integrated into the system.

      According to the company, this adds "hugely" to the potential surveillance reach, allowing "local staff to send on-site reports of damage or incidents, channeling them directly to local and central command centres".

      Referring to the Indian Railways project, the company said it will be integrating Herta Security’s facial recognition technology, which has "advanced algorithms to provide real-time identification and alerts to persons-of-interest".

      Headquartered in Barcelona with offices in Madrid, London, and Los Angeles, Herta provides video surveillance, access control, and marketing solutions. Its services are currently used for surveillance in various international projects, including safe cities, airports, train and metro stations, prisons, banks, casinos, sports stadiums, shopping malls, military, police, and forensic applications.

      As per its website, Herta's facial recognition technology works in both crowded and non-crowded areas. It claims to be adept at high-speed analysis of video recordings and images.

      Herta's algorithms are capable of detecting and identifying an individual by "automatically extracting and encoding the most relevant information from the face". Moreover, it can identify a user from access control lists and verify their identity, thereby controlling individuals' access to physical locations.

      It claims to yield accurate results "despite partial occlusions of the face, the use of glasses, scarfs or caps, changes of facial expression, and moderate rotations of the face".

      Notably, Herta has a unique FRT solution called BioObserver. This solution can recognise basic emotions (joy, sadness, anger) as well as micro-expressions (frown, blink, eyebrow raise). "BioObserver also allows to extract the direction of the gaze and the orientation of the head, to monitor behavioral metrics such as the degree of attention of the individual," the website reads.

      Also Read: Smile, You're On Camera: The AI Surveillance At Ram Mandir In Ayodhya

      FRT concerns around privacy and democracy

      Experts told BOOM that the breach of facial data, one of our permanent identifiers, can cause irreparable damage. Unlike passwords, facial data cannot be changed to stop the harm. As a result, programmes based on biometrics (unique physical traits) are more vulnerable to risks.

      Sangeeta Mahapatra, a research fellow studying digital authoritarianism at the German Institute for Global and Area Studies, Hamburg, highlighted that the Digital Personal Data Protection (DPDP) Act, in its current form, also cannot uphold data privacy. In her view, the Indian Railways' implementation of FRT raises more questions than it answers.

      She said, "There should be a legislative basis for such a system. We don't know its intent, usage or impact assessment. We don't have answers to questions like: Who will have access? How long will the data be stored? Will it be stored as classified data? Will it be integrated with the crime data?"

      According to Mahapatra, FRT is so dangerous that it requires a special law for itself. She warned of the harm this tech, based on our facial data, can cause. She highlighted, "In the European Union (EU), despite several countries using FRT, there was a general consensus on banning its use in public spaces, with exemptions for terrorism and locating missing children."

      "If our justification is security, then only installing FRT-enabled CCTV cameras does not hold water. That should be accompanied with an aware society, ground policing and neighbourhood watch," she added.

      Mahapatra opines that we are at a juncture where the conversation should move beyond accuracy to the intent of such technologies. "Accuracy can be improved over time but for FRT, the intent must remain the primary focus. It should not be just privacy by design but privacy by default."

      Addressing concerns about potential repercussions of FRT, independent researcher Shivangi Narayan sees such surveillance exercises as "a first step towards a totalitarian system".

      She said, "Currently, the issue is not immediate individual privacy invasion, but the deployment of FRT in Railway networks will create a structure that will enhance system accuracy, as where else would we get such volumes of data. This will make identification easier, potentially deterring democratic participation, such as protests."

      Profiling, as seen during the farmers' protests in Haryana, will deter people from protesting, she explained. Even those merely passing by can be identified, leading to a decrease in participation.

      In February, Ambala Police had announced that they had begun the process of cancelling the passports and visas of those caught causing disturbances or breaking barricades on CCTV or drone cameras during the farmers’ protests. In light of this, civil groups had questioned whether the drones used for surveillance at the protests had facial recognition capabilities, labelling the police's cancellation of visas and passports an "extreme act".

      Opposing the idea of using FRT for societal purposes like criminal investigation or surveillance, Narayan said, "Our criminal justice system is colonial in nature. Shifting bureaucratic accountability to potentially flawed technology like FRT can lead to wrongful identification. In a system where the process itself is punitive, arbitrary arrests based on FRT assumptions could take years to overturn."

      Citing the problematic use of FRT in investigations of the 2020 Delhi riots, which she has researched, Narayan said, "Arrests were made based solely on individuals being present at the site, as identified by the FRT, and not because they were necessarily indulging in problematic behaviour. Thus, such technology-oriented solutions are short-term and mostly directed at a specific strata which does not have the same negotiating power in society as the privileged ones."

      We contacted the Ministry of Railways multiple times for a response; the story will be updated if and when they respond.

      Also Read: Voluntary Or Mandatory? DigiYatra Trend Puzzles Airport Passengers


      Tags: facial recognition, surveillance, Indian Railway