India readies own generative AI tool to rival DeepSeek, ChatGPT
The India AI Compute facility has secured about 18,000 GPUs to drive the development of a large language model built for India, the IT minister said

India is developing its own generative AI tool along the lines of OpenAI’s ChatGPT and China’s DeepSeek, union information technology minister Ashwini Vaishnaw said.
The development comes in the same week that Chinese AI startup DeepSeek sent shockwaves through Silicon Valley by releasing AI tools that matched ChatGPT maker OpenAI’s most advanced models at a fraction of their cost.
Speaking at the Utkarsh Odisha Conclave, Vaishnaw said the government has chosen 18 proposals that it will back with computing infrastructure, data and funding to develop AI-related applications across domains including climate change and farming.
“The foundational models made in India will be able to compete with the best of the best in the world,” Vaishnaw said, adding: “We believe there are at least six major developers who can develop AI models within six to eight months at the outer limit, and in four to six months on a more optimistic estimate.”
A foundational model is an AI model trained on vast datasets to perform a wide range of tasks. These are designed to adapt and can be fine-tuned for specific applications, making them versatile.
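To illustrate the idea of fine-tuning, the sketch below adapts a small open pretrained model to a sentiment-classification task using the Hugging Face transformers library. The model, dataset and task are placeholders chosen for illustration and are not connected to the proposals or models described in this article.

```python
# Illustrative sketch: adapting a general-purpose pretrained model to one
# specific task (sentiment classification). Model and dataset are stand-ins,
# not anything from the India AI Mission.
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)
from datasets import load_dataset

model_name = "distilbert-base-uncased"  # a small pretrained foundation model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# A small public dataset standing in for a domain-specific corpus
# (e.g. climate or agriculture text).
dataset = load_dataset("imdb", split="train[:2000]")
dataset = dataset.map(
    lambda x: tokenizer(x["text"], truncation=True, padding="max_length", max_length=256),
    batched=True,
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", num_train_epochs=1,
                           per_device_train_batch_size=8),
    train_dataset=dataset,
)
trainer.train()  # adapts the general-purpose model to the specific task
```

The same pretrained weights could be fine-tuned on different data for a different application, which is what makes foundation models versatile.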
Researchers, startups, and academic institutions need high-end computational infrastructure to advance AI development, and as part of the India AI Mission, the government has prioritized establishing a shared computing resource.
“A common compute facility is the most important component for a robust AI ecosystem,” Vaishnaw said.
The initiative will be powered by the India AI Compute facility, which has secured about 18,000 graphics processing units (GPUs) to drive the development of a large language model (LLM) built for India, Vaishnaw said. He noted that the facility exceeded initial expectations, securing more GPUs than the originally planned 10,000.
These include 12,896 Nvidia H100 GPUs and 1,480 Nvidia H200 GPUs, which are among the most powerful AI chips available, Vaishnaw said. About 10,000 of these are ready for immediate use, and the rest will be deployed gradually.
India will fund 40% of the computing cost of the selected proposals, Vaishnaw added.