
Govt Wants All Defence PSUs To Initiate AI Projects, BEL To Invest Up To ₹50 Crore PA


In a move that clearly showcases the Central Government's focus on emerging technologies, it is now mandatory for defence public sector units (DPSUs) to initiate projects in artificial intelligence.

Gowthama MV, the chairman and managing director of Bharat Electronics Limited (BEL), a DPSU, told a national daily, “The Ministry of Defence has made it clear that this year onwards all DPSUs must focus on tapping potential business in the area of AI. Every year, we sign a memorandum of understanding with the Government of India on performance targets. Now, as part of this MoU, we have taken up the challenge to bring out AI-enabled products in the current financial year.” He added that BEL would be investing around ₹40-50 crore per annum on AI-related projects over the next three years.

In fact, BEL and the Central Research Laboratory (CRL) are already working on a “first responder robot” to secure Indian borders. Gowthama said that their scientists, who have been working on the project since December 2018, have now started putting together preliminary requirements for its design and implementation. He also added that they have already conducted training programmes to build competency in AI.

Currently, BEL is working on five major projects involving AI:

  1. Facial recognition for security applications
  2. IoT-based platform maintenance
  3. Social networking analysis
  4. Robotics surveillance platform
  5. Automated information extraction and synthesis

Gowthama told the leading newspaper, “Currently, we do not have any orders for AI. It is with our own internal investment that we are developing AI-enabled products, which shall undergo internal evaluation by December this year and later be demonstrated to the users for feedback. Some of the systems, like facial recognition, may not require much infrastructure investment. However, IoT-based systems need networkable hardware.”

In June 2018, the Ministry of Defence had initiated the process of preparing Indian defence forces for the use of AI and examining how these capabilities could be developed in the country. To study the whole gamut of issues surrounding the strategic implications of AI for national security, in the global context, a multi-stakeholder Task Force, with members drawn from the Government, the Services, academia, industry professionals and startups, was constituted in February 2018 under the chairmanship of N Chandrasekaran, Chairman of Tata Sons.

Prajakta Hebbar
