India on Thursday outlined its global AI ambitions with plans to build its own ‘foundational model’ that could take on the might of ChatGPT, DeepSeek R1 and others, as it lined up a “most affordable” common compute facility powered by 18,693 GPUs for use by startups and researchers.
India’s bold move comes at a time when Chinese company DeepSeek has turned heads after its AI model overtook ChatGPT as the top-ranked free app on Apple’s App Store, challenging the AI dominance so far concentrated among US firms, particularly Silicon Valley frontrunner OpenAI.
IT Minister Ashwini Vaishnaw exuded confidence that India will build a foundational model that is world-class and able to compete with the best models across the globe. The government announced the next steps in its AI blueprint, among them 18,693 Graphics Processing Units (GPUs) on offer from the empanelled bidders (a list that includes Jio Platforms, CMS Computers, Tata Communications, E2E Networks, Yotta Data Services, and others), and the launch of an AI safety institution, with eight projects approved under it.
The government is also calling for proposals to develop India’s own foundational models aligned to the Indian context, Indian languages and culture, with datasets that are “for our country, of our country and for our citizens” and from which “biases are removed”.
Put simply, foundation models in generative AI are large, pre-trained models that form the base for a variety of AI applications.
The common compute facility (powered by 18,693 GPUs) would be made available at a fraction of global cost benchmarks, Vaishnaw assured. The facility will be the “most affordable”, he said, with rates coming in at significantly less than one dollar per GPU hour after the government bears 40 per cent of the cost.
“Making modern technology accessible to everyone, that is the economic thinking of our PM... Ours is the most affordable compute facility, at this point of time,” Vaishnaw said.
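To illustrate the subsidy arithmetic (the unsubsidised tariff has not been disclosed, so the base rate below is purely an assumed figure), a 40 per cent government contribution would bring a notional rate of USD 1.50 per GPU hour down to USD 0.90, under the one-dollar mark cited:

```python
# Hypothetical sketch of the subsidy arithmetic; the base rate is an assumption,
# not a figure disclosed under the IndiaAI Mission.
def subsidised_rate(base_rate_usd: float, subsidy_share: float = 0.40) -> float:
    """Price paid per GPU hour after the government bears `subsidy_share` of the cost."""
    return base_rate_usd * (1.0 - subsidy_share)

assumed_base = 1.50  # assumed unsubsidised rate, USD per GPU hour
print(f"Effective rate: ${subsidised_rate(assumed_base):.2f} per GPU hour")  # -> $0.90
```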
The Minister said there are at least six major developers/startups that would be able to build foundational models within 8-10 months at the outer limit, or within 4-6 months on a more optimistic estimate.
“Algorithmic efficiency matters a lot, it can deliver a model at much lower cost and less time than the world has seen today,” the Minister said, expressing optimism that India will have a “world class foundational model” in the coming few months.
The government, under the IndiaAI Mission, had approved a Rs 10,372 crore outlay to strengthen India’s AI ecosystem - a key pillar of which was enabling 10,000 GPUs for AI compute infrastructure. The 18,693 GPUs would offer large compute power, considering that DeepSeek was trained on 2,000 GPUs and GPT-4 on 25,000 GPUs.
On applications under the IndiaAI Mission, India’s focus has been on using the power of Artificial Intelligence to solve population-scale problems in areas such as healthcare, education, agriculture, logistics and weather forecasting.
The projects approved for AI safety cover machine unlearning (IIT Jodhpur), synthetic data generation (IIT Roorkee), an AI bias mitigation strategy, an explainable AI framework (Defence Institute of Advanced Technology, Pune), a privacy-enhancing strategy (IIT Delhi, IIIT Delhi, TEC), an AI ethics certification framework, an AI algorithm auditing tool, and an AI governance testing framework.