New NVIDIA Models Help Robots Learn, Reason, and Act in the Real World https://analyticsindiamag.com/ai-news-updates/new-nvidia-models-helps-robots-learn-reason-and-act-in-the-real-world/ Tue, 30 Sep 2025 08:59:37 +0000 https://analyticsindiamag.com/?p=10178528

NVIDIA also unveiled the GB200 NVL72 system, RTX PRO servers and Jetson Thor for real-time on-robot inference.

NVIDIA has released new open models and simulation libraries to support global robotics research and development. The announcement, made at the Conference on Robot Learning (CoRL) in Seoul, includes the open-source Newton Physics Engine, the Isaac GR00T N1.6 foundation model and new Cosmos world foundation models.

The updates aim to accelerate the way robots learn, reason, and transfer skills from virtual to real environments. Rev Lebaredian, vice president of Omniverse and simulation technology at NVIDIA, said, “Humanoids are the next frontier of physical AI, requiring the ability to reason, adapt and act safely in an unpredictable world.”

Newton Sets New Benchmark 

The Newton Physics Engine, developed with Google DeepMind and Disney Research, is now available in NVIDIA Isaac Lab. Managed by the Linux Foundation, the engine supports complex actions such as walking on uneven ground and handling delicate objects.

ETH Zurich, Technical University of Munich and Peking University are among the first adopters. 

NVIDIA has also announced Isaac Lab – Arena, an open-source policy evaluation framework codeveloped with Lightwheel. The tool will enable large-scale testing of robotic skills in diverse simulated environments.

Robot Reasoning and Training

The Isaac GR00T N1.6 model integrates NVIDIA’s Cosmos Reason, which helps robots turn vague instructions into step-by-step plans using prior knowledge and physics-based reasoning. The model, available on Hugging Face, also supports multi-task operations such as opening heavy doors.

NVIDIA’s Cosmos world foundation models, which have been downloaded more than 3 million times, have been updated to generate large-scale training data. Cosmos Predict 2.5 and Cosmos Transfer 2.5, due soon, will offer longer video generation, multi-view outputs and faster synthetic data creation.

Boston Dynamics used NVIDIA’s new grasping workflow in Isaac Lab 2.3 to train its Atlas robots to improve manipulation skills. Other companies adopting NVIDIA’s Isaac and Omniverse platforms include Agility Robotics, Figure AI, Franka Robotics, Techman Robot and Solomon.

NVIDIA also unveiled new AI infrastructure, including the GB200 NVL72 system, RTX PRO servers and Jetson Thor for real-time on-robot inference. These tools are being adopted by partners such as Figure AI, Meta, Google DeepMind and the RAI Institute.

Lebaredian added, “With these latest updates, developers now have the three computers to bring robots from research into everyday life, with Isaac GR00T serving as the robot’s brain, Newton simulating their body and NVIDIA Omniverse as their training ground.”

UST, Kaynes Semicon Announce ₹3,330 Cr JV for OSAT Facility in Sanand https://analyticsindiamag.com/ai-news-updates/ust-kaynes-semicon-announce-%e2%82%b93330-cr-jv-for-osat-facility-in-sanand/ Tue, 30 Sep 2025 08:53:20 +0000 https://analyticsindiamag.com/?p=10178523

The partnership combines UST’s digital engineering and AI expertise with Kaynes Semicon’s manufacturing experience.

California-based UST and Kaynes Semicon, a subsidiary of Kaynes Technology, will set up a ₹3,330 crore semiconductor manufacturing facility in Sanand, Gujarat. The joint venture will support India’s ‘Make in India’ initiative and focus on outsourced semiconductor assembly and testing (OSAT).

The partnership combines UST’s digital engineering and AI expertise with Kaynes Semicon’s manufacturing experience. Both companies stated that the venture aims to enhance India’s semiconductor ecosystem, serving sectors such as electric vehicles, renewable energy, and consumer technology.

Krishna Sudheendra, CEO of UST, said, “Together, our two great companies will harness the strengths of the Indian market and build a formidable foundation for the country to become a key player in the global semiconductor industry.”

Kaynes Semicon CEO Raghu Panicker recently told AIM that the company's pilot chips will be launched in the first week of October from the Sanand OSAT facility, advancing India's semiconductor ambitions at a time when the industry is moving towards self-reliance and global competitiveness.

Ramesh Kannan, executive vice chairman of Kaynes Technology India Ltd, said, “Kaynes Semicon’s collaboration with UST is a proud milestone for the ‘Make in India’ mission. Together, we are creating a platform that will set benchmarks for semiconductor assembly, testing, and innovation, not just for India but for the global market.”

Recently, Kaynes Semicon also entered into a partnership with Emerson to deploy Emerson’s NI Semiconductor Test System (STS) as its preferred platform across its test facilities in India. The platform will standardise testing of analogue, mixed-signal, RF, power and MEMS devices, helping to accelerate production and cut time-to-market. 

In parallel, Kaynes Electronics, another subsidiary of Kaynes Technology, plans to invest ₹352 crore in a manufacturing plant in Bhopal, which is expected to create more than 1,000 jobs. 

On the overseas front, Kaynes Semicon has also inaugurated its first international chip design centre in Muscat, Oman, focusing on very-large-scale integration (VLSI) design. The centre, funded by Oman's government and established in partnership with its ministries, will support both front-end and back-end chip design and train 80–100 students annually in advanced semiconductor design.

India’s Industrial AI Moment: Why VCs Need to Partner with Universities and Startups Now https://analyticsindiamag.com/ai-startups/indias-industrial-ai-moment-why-vcs-need-to-partner-with-universities-and-startups-now/ Tue, 30 Sep 2025 08:25:39 +0000 https://analyticsindiamag.com/?p=10178520

Structured collaboration offers a defensible sourcing advantage, providing access to proprietary technologies before they enter the open market.

India's universities and technology institutes have long produced cutting-edge industrial research. From predictive modelling for polymers to AI for cybersecurity, some of the most ambitious industrial AI innovations are taking shape inside their labs.

These hubs of innovation, though, are faced with a glaring question: how to scale these breakthroughs into viable, globally competitive businesses?

Collaborations between academic institutions, startups, and venture capitalists could bridge the stubborn “lab-to-market” gap and aid India in emerging as a hub for industrial AI.

Research to Returns

The partnership between the TCG Centre for Research and Education in Science and Technology (CREST) and Haldia Petrochemicals is an example of research translating into industrial value. The project aimed to address a long-standing challenge in polymer production: predicting the Melt Flow Index (MFI), a critical quality metric, in real-time.

Professor Goutam Mukherjee, director, Institute of Advancing Intelligence at TCG CREST, said, “Approximately 99% of the data records are computed using effective imputation techniques.” The final prediction combined MFI forecasts with predicted error corrections to produce robust outcomes, he explained, adding the project not only eliminated the four-hour delay, but also improved profitability and agility.

What’s notable is not just the technical achievement, but the model of collaboration itself. As Mukherjee put it: “Theoretical research is good, but at the same time, we must explore its utility for the society and the business.”

Detect Technologies, incubated from IIT Madras, has developed its flagship product T-Pulse, which provides real-time health monitoring of assets in heavy industries such as oil & gas and steel. It is frequently mentioned in lists of the top AI industrial automation companies in India.

Chakr Innovation, founded by IIT Delhi alumni, developed retrofit emission control devices that reduce diesel generator emissions by up to 90%. The company holds multiple patents and has received policy approvals.

Why Industrial AI Is Harder to Scale

Haldia’s case shows how academic collaboration can yield immediate benefits. But scaling such models across industries requires confronting the structural challenges of industrial AI.

Unlike consumer apps or SaaS tools, industrial AI solutions are deeply contextual. They demand domain knowledge in areas as varied as chemical engineering, power systems, automotive manufacturing, and logistics. They also require significant capital to build prototypes, run pilots, and integrate into real-world plants where downtime is expensive.

This is where venture capitalists often hesitate. As Shashank Randev, founder and general partner at 247VC, said at Cypher 2025, some founders underestimate the life cycle of the sales process: “From a paid proof of concept to actually generating revenue, and then figuring [out] the efficacy of that product at the enterprise level; that cycle is what we are essentially trying to shorten for our portfolio.”

For many startups, that “pilot purgatory” becomes a graveyard. 

The Academic Spinout Opportunity

Potential AI ventures are plentiful at academic institutions but taking them to market remains a challenge.

Aditya Singh Gaur, deputy manager at C3iHub, IIT Kanpur, said that a significant ‘lab-to-market’ gap prevents breakthroughs from reaching venture-backed scale. “The core challenge lies in a scarcity of structured commercialisation pathways that can de-risk early-stage technology,” he added. 

Gaur advocates for dedicated translational research platforms and deep-tech incubators that provide patient capital, shared infrastructure, and industry partnerships. Equally important are university spinout mechanisms with clear, founder-friendly IP policies. Without these, researchers often lack the incentives or legal clarity to convert their work into startups.

Prasanjeet Sinha, incubation manager at C3iHub, IIT Kanpur, added that VCs want more than surface-level engagements like hackathons and demo days, as these are now seen as insufficient for generating high-quality deal flow. “The industry is leaning towards a deeper, more strategic engagement that provides proprietary access to defensible technology and high-potential founding teams,” Sinha said.

From Hackathons to Real Industry Pilots

So what does “deeper engagement” look like? According to Sinha, models that work include university spinouts with clear IP licensing frameworks, cofunded pilot programs in authentic industrial settings and early access to entrepreneurial talent nurtured into founding teams.

The barrier, he notes, lies in the absence of standardised frameworks. Ambiguity around IP, a lack of co-investment pathways, and weak startup readiness programs hinder collaborations from being repeatable rather than ad hoc.

For VCs, the payoff of solving this is huge. Structured collaboration offers a defensible sourcing advantage, providing access to proprietary technologies before they enter the open market. For universities, it creates a culture where research is not only publishable, but also buildable.

Ecosystem Gaps India Must Solve

For India to genuinely advance in industrial AI, it is imperative to address several critical gaps in the ecosystem. 

Gaur highlights four urgent needs: specialised AI talent for sectors like manufacturing, enhanced access to large-scale GPU/TPU clusters for startups and researchers, the establishment of industrial testbeds for real-world experimentation, and global partnerships for collaborative strategies and data access. Addressing these will position India as a leader in industrial AI, he said. 

Without these, India risks falling behind countries where universities, corporations, and investors are already closely aligned in their pursuit of industrial AI, he added. 

Why VCs Should Care

From a VC perspective, the incentive is not just patriotic; it's financial. Industrial AI startups may take longer to mature, but once entrenched, they become deeply defensible businesses. Enterprise clients are sticky, integration is complex, and switching costs are high.

As Randev highlighted, the challenge is ensuring these startups can scale beyond one or two enterprise customers. He said that, while evaluating, VCs ask whether the startup can find enterprise customers, whether the model will work, and whether it can be replicated for 10–15 others.

For VCs willing to engage early with institutional platforms, the upside is privileged access to startups that can dominate global industrial niches. They need to grow from being financiers to active co-creators in the lab-to-market pipeline. 

Mukherjee reflected on his own journey since joining TCG CREST and said he had realised that, “if you work with a problem which comes up from a business point of view, it gives you more problems for your academic institution.”

In other words, collaboration doesn’t just transfer knowledge outward; it deepens the research itself. For investors, that is perhaps the biggest reason to get involved.

This Ahmedabad Startup is Building a Thermometer for Earth https://analyticsindiamag.com/deep-tech/this-ahmedabad-startup-is-building-a-thermometer-for-earth/ Tue, 30 Sep 2025 04:49:33 +0000 https://analyticsindiamag.com/?p=10178507

Rising heat is stressing water resources, and thermal data can identify it a month before it becomes visible to the human eye.

Heat is now one of the most closely watched markers of climate change. The World Meteorological Organisation notes that the past 10 years have been the warmest on record, with 2023 alone bringing unprecedented heatwaves across Asia, Europe, and North America. 

July 2025 was the third-warmest July globally (after July 2023 and 2024), according to the EU’s Copernicus Climate Change Service. The average sea surface temperature was also the third highest on record. 

Rising temperatures threaten crop yields, accelerate water stress, and intensify urban heat islands, yet monitoring systems remain patchy. The United Nations Secretary-General’s Early Warnings for All (EW4ALL) initiative aims to ensure every person on Earth is protected by early warning systems by 2027. 

Thermal imaging, often described as taking the Earth’s temperature from above, provides a means to track these changes in real time and offer data to governments, farmers, and planners, so that they can act before crises unfold.

India’s SatLeo Labs, a space tech startup based out of Ahmedabad, is working on a satellite constellation designed to act like a thermometer for the planet. The company plans to use thermal imaging from space to monitor rising temperatures, greenhouse gases, and water stress, problems it says are already reshaping economies and communities.

What is the Urgency?

The Indian government has also moved to back private space companies through policy and funding, introducing a ₹1,000 crore VC fund for the sector. Gujarat, too, has recently announced a 25% subsidy for space startups.

Shravan Bhati, co-founder and CEO of SatLeo Labs, in an exclusive interaction with AIM, said that the company’s technology is designed to address pressing challenges. “Four lakh people lost their lives last year just because of the heat islands, 50,000 people lost their lives, just in Europe, right?” said Bhati, adding that SatLeo’s data identifies where such heat islands are developing and can even predict where it could happen. 

For Bhati, the goal is simple: space data should be practical. “Space tech is a very interesting area, but I think we should also focus on the challenges, what problem we are solving, right? That’s very important,” he said.

He mentioned the ‘National Geospatial Policy’ that allows private (Indian) players to map areas with one-metre resolution, a shift that has encouraged more startups to enter the geospatial sector. He added that ISRO continues to support young firms by providing access to testing and manufacturing facilities.

SatLeo has signed agreements with government and private bodies, and will initially focus on the Indian and the Middle East markets, before expanding to Europe and the Americas.

Pilots on the Ground

Bhati explained that rising heat is reducing crop yields and stressing water resources. “By 2050, it [world population] will be around 9.7 billion, and we will need 70% more food. Because of rising temperatures, 30% of the yield will be reduced, and we still do not have any monitoring system of the temperature,” he said.

Thermal data can identify water stress more than a month before it becomes visible to the human eye. This, he argued, will be key for farming and food security.

SatLeo has also initiated pilot projects with municipal authorities. “We are working closely with Tumkur Municipal Corporation, where we are using this data to identify and solve multiple problems, to start with solid waste dumps,” Bhati said.

Temperature data for identifying heat hotspots in the Tumkur region.

The company's AI platform, the SatLeo Insight Hub portal, analyses thermal views of urban areas to flag issues. Through this platform, the Tumkur Municipal Corporation can identify greenhouse gas emissions around its solid waste sites. “We are also identifying the thermal patterns of the city and that's very important for planning,” he said.

Bhati explained that the technology relies on multiple thermal bands, including mid-wave infrared and long-wave infrared (MWIR & LWIR), which can cut through thin clouds and haze. This approach improves the data accuracy, making it reliable for urban planning and agriculture.

SatLeo is also in talks with other cities to provide them similar applications, from tracking heat islands to improving plantation strategies.

Building the Satellites

The startup recently closed a pre-seed round of $3.3 million in May to develop its high-resolution thermal and visible imaging technology from Low Earth Orbit (LEO).

“We will be using this fund for the manufacturing of our two platforms,” Bhati said. The first, Tapas, is a CubeSat platform for thermal analytics, while the second, Pyro, is a larger 100-kg satellite. 

Tapas is scheduled to launch in February 2026, with Pyro to follow in Q4 2026.

The company wants to complete its constellation within three years. “We do not want to consume a lot of time,” he said.

Talent is central to the company’s technology as SatLeo’s team combines experienced scientists with young engineers, a balance that helps the startup manage challenges such as atmospheric noise in thermal data and improve accuracy in satellite analytics.

He added that the satellites will also use edge computing to process data onboard. By reducing transmission time, the system can support faster responses in disaster management and defence scenarios.

But, raising capital for a deep tech venture has been a steep climb, recalled Bhati. “We pitched to around 40 to 50 VCs. Some were asking irrelevant questions, some were interested, but most did not have much understanding of deep tech.”

Additionally, despite making 90% of its satellite components in India, SatLeo Labs still faces hurdles with imports. Bhati said there is still room for improvement in the customs department. 

One critical part is the detector, used in Earth observation satellites, which is not manufactured domestically and must be sourced from abroad. He explained that while draft rules exist to exempt space-based items from customs, authorised bodies to certify components are missing, creating delays and uncertainty.

Users Can Shop From Etsy and Shopify in ChatGPT as OpenAI Launches New Agentic Commerce Protocol https://analyticsindiamag.com/ai-news-updates/users-can-shop-from-etsy-and-shopify-in-chatgpt-as-openai-launches-new-agentic-commerce-protocol/ Mon, 29 Sep 2025 19:17:30 +0000 https://analyticsindiamag.com/?p=10178503

Agentic Commerce Protocol (ACP), an open standard co-developed with Stripe, allows programmatic commerce flows between buyers, AI agents, and businesses.

OpenAI’s ChatGPT now lets users buy products directly in chat through its Instant Checkout feature, currently available for US users shopping on Etsy and soon expanding to Shopify merchants such as Glossier, Vuori, Spanx, and SKIMS.

Instant Checkout is now available for US ChatGPT Pro, Plus, and free accounts.

“Shopify merchants will be able to sell directly in ChatGPT,” said Shopify CEO Tobi Lütke in a post on X.  “We’ve been working with OpenAI for quite some time so people can search and buy products in chat, and it’s something we’ve had a hard time keeping quiet.”

The rollout is powered by the Agentic Commerce Protocol, an open standard co-developed with Stripe. ACP allows businesses to maintain control over transactions while enabling AI agents to securely facilitate purchases. It connects with any commerce backend or payment system and supports physical and digital goods, subscriptions, asynchronous purchases, multi-merchant carts, and in-store pickup options.

The ACP specification is available for businesses and AI agents to implement immediately. “Customers should be able to securely buy where they discover; businesses should be able to sell through new channels without giving up trust, brand, or control; and AI agents should be able to enable transactions without exposing customer credentials,” said Stripe in its blog post.

Because the protocol connects with any commerce backend and payments infrastructure, businesses can integrate once and distribute to any ACP-compatible AI agent.

“Trust is essential. With AI agents now capable of initiating transactions on behalf of buyers, businesses need a way to confirm purchases, securely accept payment credentials, respond to new fraud signals, and update their risk models,” said Stripe.

Under ACP, the transaction process works as follows. The buyer selects a product and payment method. The AI agent collects payment details and requests checkout from the business. The business reviews and processes the transaction as the merchant of record, while the payment provider relays credentials securely through a tokenised system.
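
To make the division of roles concrete, the sketch below walks through that flow in Python. It is illustrative only: the class names, fields, and token handling are assumptions made for this example, not the published ACP specification, which defines the actual message formats.

```python
# Illustrative sketch of the ACP roles described above; all names and fields
# are assumptions for this example, not the published specification.
from dataclasses import dataclass
import uuid


@dataclass
class PaymentToken:
    """Stands in for a tokenised credential; the agent never sees raw card data."""
    token_id: str
    provider: str


@dataclass
class CheckoutRequest:
    buyer_id: str
    merchant_id: str
    line_items: list
    payment_token: PaymentToken


def agent_requests_checkout(buyer_id: str, merchant_id: str, items: list) -> CheckoutRequest:
    # Steps 1-2: the buyer picks a product and payment method; the payment
    # provider issues a token and the AI agent forwards the order to the business.
    token = PaymentToken(token_id=str(uuid.uuid4()), provider="example-psp")
    return CheckoutRequest(buyer_id, merchant_id, items, token)


def business_processes(request: CheckoutRequest) -> dict:
    # Step 3: the business reviews the order and, as merchant of record, charges
    # the tokenised credential through its payment provider.
    return {
        "order_id": str(uuid.uuid4()),
        "merchant": request.merchant_id,
        "status": "confirmed",
        "charged_token": request.payment_token.token_id,
    }


if __name__ == "__main__":
    order = agent_requests_checkout("buyer-1", "example-shop", [{"sku": "mug-01", "qty": 1}])
    print(business_processes(order))
```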

The protocol is open source under an Apache 2.0 license, allowing any business to implement it with compatible AI agents and payment providers.

Anthropic Launches Claude Sonnet 4.5, Touts It as ‘Best Coding Model in the World’ https://analyticsindiamag.com/ai-news-updates/anthropic-launches-claude-sonnet-4-5-touts-it-as-best-coding-model-in-the-world/ Mon, 29 Sep 2025 17:22:08 +0000 https://analyticsindiamag.com/?p=10178501

Claude Sonnet 4.5 achieved top scores on the SWE-bench Verified evaluation, which tests real-world software coding skills.

Anthropic on Monday announced the release of Claude Sonnet 4.5, its latest AI model for coding and agent-based tasks. The company said the model demonstrates improvements in reasoning, math, and long-duration task management.

“Claude Sonnet 4.5 is the best coding model in the world. It’s the strongest model for building complex agents,” the company said in its blog post. “It’s also the best model at using computers and shows substantial gains in reasoning and math.”

The model is available via the Claude API at the same pricing as Sonnet 4: $3 per million input tokens and $15 per million output tokens.

Anthropic said that the Claude API has added context editing and memory tools to support longer tasks, and the Claude apps now allow code execution and file creation directly within conversations. Anthropic also released the Claude for Chrome extension for Max users on the waitlist.

Claude Sonnet 4.5 is also integrated into Claude Code, which now includes checkpoints to save progress and roll back to previous states, a refreshed terminal interface, and a native VS Code extension. 

Developers can access the Claude Agent SDK, which provides the infrastructure used internally to build Claude Code. “The Agent SDK gives you the same foundation to build something just as capable for whatever problem you’re solving,” the company said.

Claude Sonnet 4.5 achieved top scores on the SWE-bench Verified evaluation, which tests real-world software coding skills. On OSWorld, a benchmark for real-world computer tasks, the model scored 61.4%, up from 42.2% for Claude Sonnet 4. Early users reported improved performance across finance, law, medicine, and STEM domains.

The company emphasised safety and alignment improvements, noting reductions in misaligned behaviour such as sycophancy, deception, and power-seeking. The model is released under Anthropic’s AI Safety Level 3 framework, which includes classifiers to flag potentially dangerous content.

Anthropic also introduced a temporary research preview, “Imagine with Claude,” which allows users to see the model generate software in real time. It is available to Max subscribers for five days at claude.ai/imagine.

Microsoft Brings ‘Vibe Working’ to 365 Copilot With Agent Mode and Office Agent https://analyticsindiamag.com/ai-news-updates/microsoft-brings-vibe-working-to-365-copilot-with-agent-mode-and-office-agent/ Mon, 29 Sep 2025 16:28:20 +0000 https://analyticsindiamag.com/?p=10178499

Office Agent in Copilot chat, powered by Anthropic models, brings presentation and document creation into a chat-first interface.

Microsoft has launched Agent Mode in Excel and Word and Office Agent in Copilot chat, introducing what the company calls “vibe working” to Microsoft 365 Copilot. The features are designed to help users create spreadsheets, documents, and presentations through iterative, prompt-based collaboration with AI.

The rollout begins with limited availability. Agent Mode in Excel and Word is available starting today for Microsoft 365 Copilot-licensed customers, as well as Microsoft 365 Personal or Family subscribers in the Frontier program.

Office Agent in Copilot chat is currently offered to Microsoft 365 Personal or Family subscribers in the United States, where it works in English on the web.

“Agent Mode delivers AI that can ‘speak Excel’ natively,” said Sumit Chauhan, corporate vice president of Microsoft’s Office Product Group, in a blog post. She added that it combines the depth of Excel’s data structures with OpenAI’s latest reasoning models, aiming to make advanced modeling accessible to a wider range of users beyond experts.

In Excel, Agent Mode enables the AI to generate outputs, validate results, and repeat tasks until outcomes are verified. Microsoft says it democratises access to expert-level modelling, with tasks such as financial analysis, loan calculations, and household budgeting. According to Microsoft’s evaluation of the SpreadsheetBench benchmark, Copilot in Excel Agent Mode achieved 57.2% accuracy.

In Word, Agent Mode turns document creation into what Microsoft calls “vibe writing.” Users can issue prompts such as updating reports, cleaning up formatting, or drafting project summaries, while Copilot refines drafts, suggests edits, and asks clarifying questions.

Office Agent in Copilot chat, powered by Anthropic models, brings presentation and document creation into a chat-first interface. The tool can clarify user intent, conduct research, and generate presentations or documents. Microsoft says Office Agent addresses earlier gaps in AI slide creation.

For example, prompts can include creating a market trends deck, planning a restaurant pop-up kitchen, or preparing retirement savings plan slides. “Office Agent creates tasteful, well-structured PowerPoint decks and ready-to-use Word documents,” Microsoft noted.

Lovable Introduces Cloud and AI Features to Build Full-Stack Apps https://analyticsindiamag.com/ai-news-updates/lovable-introduces-cloud-and-ai-features-to-build-full-stack-apps/ Mon, 29 Sep 2025 16:14:05 +0000 https://analyticsindiamag.com/?p=10178497

Lovable AI, powered by Google’s Gemini models, allows users to add artificial intelligence features to their apps without setup, API keys, or separate billing. 

Lovable, a platform for building apps through prompts, launched Lovable Cloud and Lovable AI on Monday, aiming to make full-stack app creation more accessible.

The company said in a post on X that with the new update, “Anyone can now build apps with complex AI and backend functionality, just by prompting.”

Lovable AI, powered by Google’s Gemini models, allows users to add artificial intelligence features to their apps without setup, API keys, or separate billing. “Just ask Lovable to add AI capability to your app and it will do it for you,” the company said. The AI feature is free for all users until October 5.

Lovable Cloud provides a built-in backend that covers logins, databases, file uploads, and other behind-the-scenes requirements. According to the company, this integration means that “building full-stack apps is easier than ever.”

The company said that developers and entrepreneurs are already using Lovable to create revenue-generating businesses. Examples cited include Sabrine, who reached $456,000 ARR in three months with a women’s safety app; Lumoo, with €700,000 ARR in seven months from an AI content platform; and Rory, who reported £100,000 ARR in six months from a marketplace.

“Because now, with Lovable Cloud & AI, the hardest parts of building real apps are handled for you,” Lovable said. The team added that it had spent months developing the update before making it available to all users this week.

In July, the company raised $200 million in a Series A round led by Accel, valuing it at $1.8 billion and making it an AI unicorn.

NIT Rourkela Patents AI Model to Boost Road Safety Through Vehicle-to-Vehicle Communication https://analyticsindiamag.com/ai-news-updates/nit-rourkela-patents-ai-model-to-boost-road-safety-through-vehicle-to-vehicle-communication/ Mon, 29 Sep 2025 12:37:33 +0000 https://analyticsindiamag.com/?p=10178373

This could pave the way for safer roads, smarter traffic management and the future of autonomous mobility in India.

Researchers at the National Institute of Technology (NIT) Rourkela have patented an AI model designed to improve vehicle-to-vehicle communication for making roads safer.

The patent, titled “Adaptive Contention Window Optimisation in VANETs using Multi-Agent Deep Reinforcement Learning for Enhanced Performance Model”, was filed by Dr Arun Kumar, assistant professor; Prof Bibhudatta Sahoo; and Dr Lopamudra Hota, research graduate, all from the department of computer science & engineering.

The model addresses a key challenge in Vehicular Ad-Hoc Networks (VANETs) — when multiple vehicles send messages simultaneously, congestion can delay or block crucial alerts. The AI-driven system uses multi-agent deep reinforcement learning to sequence communications, ensuring that time-sensitive messages, such as sudden braking alerts or emergency notifications, reach other vehicles without delay.
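
For intuition only, the toy sketch below (in Python) shows a single agent learning which contention window to use from an observed congestion level, using a simple bandit-style update. The patented model relies on multi-agent deep reinforcement learning, so this stand-in captures only the basic idea of adapting the contention window to network conditions; the state, action set, and reward here are assumptions.

```python
# Toy, single-agent stand-in for contention-window (CW) adaptation; the actual
# patent describes multi-agent deep RL. Congestion levels, CW choices and the
# reward signal below are assumptions for illustration only.
import random
from collections import defaultdict

CW_CHOICES = [15, 31, 63, 127, 255]   # candidate contention windows, in slots
q_values = defaultdict(float)         # (congestion_level, cw) -> estimated value


def choose_cw(congestion_level: int, epsilon: float = 0.1) -> int:
    """Mostly exploit the best-known CW for this congestion level, sometimes explore."""
    if random.random() < epsilon:
        return random.choice(CW_CHOICES)
    return max(CW_CHOICES, key=lambda cw: q_values[(congestion_level, cw)])


def update(congestion_level: int, cw: int, reward: float, lr: float = 0.1) -> None:
    """Reward could combine low delay and few collisions for safety messages."""
    key = (congestion_level, cw)
    q_values[key] += lr * (reward - q_values[key])
```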

“In 2023, India reported around 480,000 road accidents and 172,000 deaths, many of which could be prevented using modern technologies. Our work is a step towards building safer roads and smarter cities,” Kumar said.

Sahoo added, “The patent represents a practical move toward vehicle-to-vehicle communication in India. By reducing congestion in VANETs, the model lays the groundwork for safer and more efficient traffic management. We invite researchers across institutions to collaborate with our lab on autonomous vehicle technologies.”

Potential applications of the technology include electronic brake lights, platooning, real-time traffic updates, emergency alerts, and on-the-move services such as restaurant or toll information. Experts say VANETs could also support future smart city systems by enabling autonomous vehicles to coordinate in real time.

Chennai Startup Agnikul Reveals Plans for its Fully Reusable Rockets https://analyticsindiamag.com/ai-news-updates/chennai-startup-agnikul-reveals-plans-for-its-fully-reusable-rockets/ Mon, 29 Sep 2025 12:23:17 +0000 https://analyticsindiamag.com/?p=10178369

The company said it will ensure no part of its rockets is left behind or expended.

Agnikul Cosmos, a Chennai-based space tech startup, has announced plans to make its launch vehicles fully reusable, revealing the development at the International Astronautical Congress (IAC) 2025 in Sydney. 

The announcement follows the company’s validation of its engine design, autopilot, avionics and aerodynamic tests in its maiden controlled-ascent launch last year. Agnikul said its move aligns with the IAC theme of “Sustainable Space: Resilient Earth” and builds on patents secured in the United States, Europe and India.

The company said it will ensure no part of its rockets is left behind or expended.

Central to Agnikul’s plan is its patent on a combined launch vehicle and satellite system, supported by semi-cryogenic propellant technology. The company said these technologies will allow efficient refurbishment and cost-effective re-flights.

“We have consistently designed our vehicles to ensure that affordability and flexibility are never afterthoughts but are built in from day one,” said Srinath Ravichandran, co-founder and CEO of Agnikul Cosmos. 

He added that support from IN-SPACe and ISRO has enabled the company to explore rocket stage recovery and reuse.

Scaling for Commercial Use

Agnikul said its in-house facilities are driving efforts to make launch services more affordable and customisable, while ensuring economic scalability. 

“Our newly planned strategy enables cost efficiencies at scale, allowing us to deliver launch services at globally competitive prices for all small satellite missions,” said Moin SPM, co-founder and COO.

Last week, the company also opened an additive manufacturing facility in Chennai dedicated to aerospace and rocket systems. Agnikul stated that the facility will reduce production costs for space systems by 50% and support India’s position in the global space economy.

Agnikul Cosmos, incubated at IIT Madras, builds space transportation systems under the name Agnibaan, designed to carry small satellites into orbit on demand. 

The company launched India’s first private launchpad and completed its maiden controlled-ascent launch last year using in-house 3D-printed engines and autopilot algorithms.

WhatsApp Vs Arattai – Can Made in India Messenger Win? https://analyticsindiamag.com/videos/whatsapp-vs-arattai-can-made-in-india-messenger-win/ Mon, 29 Sep 2025 11:48:15 +0000 https://analyticsindiamag.com/?p=10178364

India’s swadeshi tech push faces its biggest challenge: can homegrown apps like Arattai rival WhatsApp? This debate explores patriotic sentiment, privacy concerns, and network effects. Featuring ministerial endorsements and Zoho’s swadeshi success, it examines whether India can build globally competitive digital products while riding the Make in India wave.

Sahamati Labs, Google Cloud to Bring AI into India’s Account Aggregator Network https://analyticsindiamag.com/ai-news-updates/sahamati-labs-google-cloud-to-bring-ai-into-indias-account-aggregator-network/ Mon, 29 Sep 2025 11:27:13 +0000 https://analyticsindiamag.com/?p=10178358

The partnership will use generative AI to build predictive insights, personalised recommendations, and fraud detection systems.

Sahamati Labs, the innovation arm of the Account Aggregator (AA) industry alliance, has partnered with Google Cloud India to embed AI into the AA framework. The collaboration marks the launch of an AI Centre of Excellence aimed at setting a global benchmark for financial inclusion.

The centre will focus on scaling inclusion through Indic language tools, embedding trust with fraud and identity safeguards, and boosting efficiency to make financial services faster and cheaper.

“At Google Cloud, we believe technology should be inclusive. By bringing our AI into India’s Account Aggregator framework, we are helping make financial services safer, more accessible, creating a model that can inspire financial ecosystems worldwide,” Sashi Sreedharan, managing director of Google Cloud India, said.

The partnership will use generative AI to build predictive insights, personalised recommendations, and fraud detection systems. It will also leverage Google Cloud’s privacy-preserving technologies to protect sensitive financial transactions. 

A key focus will be multilingual access, with Google Cloud enabling speech-to-text and text-to-speech features in multiple Indian languages to break literacy and language barriers.

The AA framework, seen as India’s next big digital infrastructure after UPI and Aadhaar, already supports consent-based data sharing for over 2.12 billion accounts.

“Its success depends on trust, inclusion, and innovation. By embedding AI at its core with Google Cloud, we are setting the stage for India to once again define a new global benchmark, this time for financial inclusion at scale,” BG Mahesh, chief executive of Sahamati, said.

The companies said India aims to make the AA network the world’s strongest example of how AI and digital public goods can combine to transform finance, giving every citizen, regardless of language, literacy, or location, access to the digital economy.

How Pure Storage India R&D Centre Built Pure KVA  https://analyticsindiamag.com/gcc/how-pure-storage-india-rd-centre-built-pure-kva/ Mon, 29 Sep 2025 10:19:06 +0000 https://analyticsindiamag.com/?p=10178350

Pure KVA, one of Pure Storage’s flagship innovations from India, helps enterprises reduce AI infrastructure costs by optimising GPU usage.

As the AI revolution accelerates, the demand for Graphics Processing Units (GPUs) has skyrocketed, triggering a severe shortage that shows no signs of easing. 

NVIDIA, the dominant player in high-end GPUs, has seen its Blackwell series sell out through 2025, with lead times stretching into years. 

Analysts predict that by 2030, GPU supply could fall short by as much as 43% of projected demand, with training costs for individual AI models reaching $1 billion by 2027. This challenge is not limited to big tech—it’s a major barrier for startups, researchers, and enterprises seeking to innovate cost-effectively.

Amid this landscape, Bangalore’s tech ecosystem has become a critical hub for innovation. Pure Storage’s R&D and global capability centre (GCC) in India is spearheading enterprise AI initiatives, including the development of the Pure Key-Value Accelerator (KVA)—a protocol-agnostic, high-performance key-value caching solution designed to optimise large language model (LLM) inference.

By persisting and reusing precomputed attention states across sessions, Pure KVA eliminates redundant computation, delivering substantial performance gains without requiring changes to the underlying model or infrastructure.

The core idea behind Pure KVA is that, instead of discarding the key and value tensors after each inference session, the system captures these intermediate states, compresses them, and stores them on a high-performance Pure Storage NFS or S3 backend.

When the same prompt or context is reused, the stored tensors are quickly reloaded, bypassing unnecessary recomputation and dramatically improving efficiency for AI workloads.
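
As a rough illustration of that capture-and-reload pattern, the Python sketch below hashes a prompt prefix and persists compressed key and value arrays to a shared filesystem path. The cache location, compression choice, and function names are assumptions made for this example and are not Pure Storage's implementation.

```python
# Minimal sketch of prefix KV-cache reuse over a shared filesystem (e.g. an NFS
# mount); illustrative only, not Pure Storage's implementation.
import hashlib
import pickle
import zlib
from pathlib import Path

import numpy as np

CACHE_DIR = Path("/mnt/kv-cache")  # assumed shared mount point


def _cache_path(prompt_prefix: str) -> Path:
    return CACHE_DIR / (hashlib.sha256(prompt_prefix.encode()).hexdigest() + ".kv")


def save_kv(prompt_prefix: str, keys: np.ndarray, values: np.ndarray) -> None:
    """Compress and persist the attention key/value tensors for a prompt prefix."""
    CACHE_DIR.mkdir(parents=True, exist_ok=True)
    blob = zlib.compress(pickle.dumps((keys, values)))
    _cache_path(prompt_prefix).write_bytes(blob)


def load_kv(prompt_prefix: str):
    """Return cached (keys, values) if this prefix was seen before, else None."""
    path = _cache_path(prompt_prefix)
    if not path.exists():
        return None  # cache miss: the model must recompute the prefix
    return pickle.loads(zlib.decompress(path.read_bytes()))
```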

GCC Growth and Investment in India

Talking to AIM, Nirav Sheth, VP – WW sales & customer success engineering at Pure Storage, highlighted the importance of India for the company: “I believe we have about 25% of our R&D function here in the GCC in India. Roughly, the team size is about 500 to 600.”

He added that the GCC is treated as a true hub for product development and innovation, not just a back office: “We’re seeing a tremendous amount of innovation. This Pure KVA, a fantastic optimisation opportunity for any customer looking at AI, is being developed in India.”

Pure KVA, one of Pure Storage’s flagship innovations from India, helps enterprises reduce AI infrastructure costs by optimising GPU usage.

“A very large cost of AI is actually within the GPU, which is further compounded by GPU availability. KVA helps customers reduce the cost of AI by optimising GPU utilisation,” Sheth said, adding that it leads customers to utilise their existing infrastructure more efficiently.

On its impact, Sheth mentioned that “for inferencing, based on some of the calculations we’ve seen, it could be 20X optimisation.”

As much as 70% of Pure KVA was developed in India. The company has established an AI Centre of Excellence within the GCC, staffed with data scientists and prioritising AI skill sets.

Ajeya Motaganahalli, VP engineering and MD of India R&D, added that the company has incubated a Gen AI team in India. “We don’t have a similar team anywhere else. This team constantly looks at ways to make things better, faster, more cost-efficient for AI users.”

AI Partnerships

Pure Storage has built strong AI partnerships and leverages open-source large language models for enterprise needs.

“We have a world-class partnership with NVIDIA. We were the first in the industry to have a reference architecture with NVIDIA back in 2017. We also partner with RunAI, Weights & Biases, vector database providers like MongoDB, and infrastructure partners like Cisco and Arista,” Sheth said.

On building models for enterprise AI, Motaganahalli explained that the approach involves taking existing large language models, like Llama or Claude, and tuning them on internal data.

“We’re building small-scale models, while leveraging enterprise AI infrastructure for tuning and training. Open-source models have democratised AI, allowing enterprises to adapt and train models to their specific needs,” Motaganahalli said.

Talent Strategy

Pure Storage also emphasises nurturing talent through internships, ensuring strong retention and building a skilled workforce. 

Motaganahalli explained, “When you come to a deep tech company like us, you spend a lot of time understanding how the development processes work, what the code base looks like, and how to write your unit tests. Internship gives this ability.” Most interns end up joining them as employees as the interview process remains the same for both, he added. 

Elaborating on the GCC hiring ecosystem, Sheth said that Pure Storage GCC is inviting internships, hiring fresh graduates, as well as staff at junior and senior levels. 

Meanwhile, Pure Storage recently announced its expansion in Enterprise Data Cloud to streamline AI workflows across on-premises and cloud environments. 

Key updates include Pure Storage Cloud Azure Native, enabling seamless VMware workload migration; Portworx integration with Pure Fusion for unified data management; Pure1 AI Copilot for natural-language storage management; and Key Value Accelerator with NVIDIA Dynamo to speed up AI inference. Next-gen FlashArray and Purity Deep Reduce further optimise performance and efficiency.

Spencer Kimball on How CockroachDB Will Power Efficient AI Agents https://analyticsindiamag.com/global-tech/spencer-kimball-on-how-cockroachdb-will-power-efficient-ai-agents/ Mon, 29 Sep 2025 08:24:37 +0000 https://analyticsindiamag.com/?p=10178329

Spencer Kimball sees AI agents outnumber humans and CockroachDB is building the database backbone to handle that.

The database industry is undergoing a fundamental transformation as AI reshapes how organisations design and operate their data systems. With billions of human interactions soon to be joined by trillions of AI-driven agents, the demands on infrastructure are unprecedented. 

At the centre of this change is a new emphasis on scalability, resilience, and affordability, all while navigating the complexities of multi-cloud environments.

In a conversation with AIM, Spencer Kimball, CEO of CockroachDB, shared his perspective on the company’s current standing in a rapidly evolving database market. 

While acknowledging the competition from other players in the space, like MongoDB and AWS, Kimball emphasised CockroachDB’s focus on solving scale, resilience, and multi-cloud challenges.

Scaling for an AI-driven world

Kimball underlined that the next big transformation is the sheer demand created by AI agents. “All of the activity that databases have had to deal with up till now has been humans. Now it’s going to be agents… You could have a trillion. And that’s going to happen.” 

He explained that CockroachDB’s distributed design positions it to handle such explosive growth, which traditional databases might struggle to handle.

But scale is not just about growth; it is about efficiency as well. “Cockroach needs to become the most efficient and performant database at scale,” he said, adding that optimisation is underway. 

Version 25.2 improved performance by 50%, and the long-term vision is to become “the cheapest database available at scale.” 

For enterprises already straining under AI workloads, the promise of efficiency without compromising resilience is an attractive proposition.

Kimball also gave a glimpse into the architectural choices that enable such efficiency. He highlighted that CockroachDB uses data triplication instead of duplication, ensuring automatic self-healing in the event of failures. 

While this is more expensive than traditional duplication, it prevents costly data loss and operational chaos. 

“If that happens frequently, your teams burn out, your customers leave you, you get reputational brand damage, and the dollars and cents add up,” he warned, underscoring why resilience is worth the investment.
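
A minimal way to see why three copies can self-heal while two cannot is the majority rule sketched below. It is a generic illustration of quorum-based replication, not CockroachDB's actual Raft-based internals.

```python
# Generic quorum sketch: with replication factor 3, a write commits once a
# majority (2 of 3) acknowledges, so one failed node neither blocks writes nor
# loses data. Not CockroachDB's Raft internals, just the majority arithmetic.
def is_durable(replica_acks: int, replication_factor: int = 3) -> bool:
    """A write is durable once a majority of replicas acknowledge it."""
    return replica_acks >= replication_factor // 2 + 1


# Triplication tolerates one failure; plain duplication cannot form a majority
# once either copy is lost.
print(is_durable(replica_acks=2, replication_factor=3))  # True
print(is_durable(replica_acks=1, replication_factor=2))  # False
```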

Competing, Collaborating, and Differentiating

Kimball acknowledged MongoDB as “a smart company” with a developer-friendly model but noted that CockroachDB rarely goes head-to-head with it. 

“The team will typically know in advance whether they want a relational database or a document database,” he said.

He sees AI reshaping that dynamic, since rational, AI-driven decisions about architecture will favour capability over familiarity. Developers might have reservations about CockroachDB’s additional complexity, but AI tools are expected to render the adoption curve irrelevant in the long run, and may help developers realise the additional benefits it offers.

On partnerships, Kimball highlighted CockroachDB’s complicated, but fruitful relationship with AWS. 

“In many ways, we think of AWS as our true north competitor. But they’re also one of our best partners,” he said.

In a $100 billion and growing operational database market, there is enough room to collaborate, compete, and still thrive. AWS itself sometimes may recommend CockroachDB to customers who need scale beyond Aurora or multi-cloud support, showing how competition and cooperation coexist.

Kimball also pointed to the broader market reality. “The hardest thing isn’t beating the competitor. It’s just trying to find a way to help the customer migrate, which is very difficult right now. But AI, I think, can substantially change those costs.” 

Migration complexity remains a bottleneck, and Kimball believes AI will play a critical role in easing that process.

India As a Proving Ground

Discussing India, Kimball stressed that the market’s scale and regulatory environment align well with CockroachDB’s strengths. 

He pointed to UPI and brokerage platforms as prime examples of workloads requiring resilience, compliance, and massive scalability. 

“India actually does match up extraordinarily well with Cockroach’s differentiators. So that does help us command a premium price because we’re bringing such value.”

One example is Groww, one of India’s leading brokerage platforms, which relies on CockroachDB to manage enormous transaction volumes at scale. With only 5% market penetration so far, the growth opportunity is immense and a perfect match for CockroachDB’s distributed architecture, Kimball noted.

He added that India’s financial sector, with rapid digitisation and strict regulatory oversight, creates strong demand for distributed systems that ensure regional survivability. 

“Indian regulators are very concerned about the cloud provider risks. And that is also true in Europe. It’s not so true in the United States.” This makes CockroachDB’s ability to guarantee region survivability a natural fit for Indian enterprises.

Kimball noted CockroachDB’s commitment to Bengaluru as its APAC hub, and described India as an early, yet natural choice for enterprise-scale innovation. 

“We’re very committed to Bangalore, despite the traffic situation,” he quipped, adding that the company is scaling operations and visibility across the region. Over the past three years, India has grown from an experiment to the anchor of their Asia-Pacific strategy.

The Age of Agentic AI

Looking ahead, CockroachDB is aligning itself with the rise of agent-based systems. Kimball emphasised CockroachDB’s decisive move towards becoming the optimal solution for agentic AI in the enterprise, highlighting features like multi-tenancy, bring-your-own-cloud, and Kubernetes operator for enterprise flexibility and control.

Kimball also discussed how developers at CockroachDB are adapting to the AI-driven pace of change. While workloads have intensified, AI itself has become a productivity multiplier. He emphasised that the best engineers are those who learn to “manage AI” as a resource, turning it into a career-defining capability.

As Kimball summed up, the database wars will not be won on features alone but on efficiency, resilience, and cost-effectiveness at unprecedented scale.

CockroachDB is betting on a future where distributed systems are not a luxury but a necessity, as billions of humans and trillions of agents demand reliable data infrastructure.

Dukaan Moved to Bare Metal to Prevent $80,000 Monthly Cloud Bills  https://analyticsindiamag.com/ai-news-updates/dukaan-moved-to-bare-metal-to-prevent-80000-a-monthly-cloud-bills/ Mon, 29 Sep 2025 05:46:01 +0000 https://analyticsindiamag.com/?p=10178305

The company carried out a ‘Strangler Fig Pattern’ migration, gradually shifting from AWS to self-hosted infrastructure without downtime.

Subhash Choudhary, the co-founder and CTO of the Indian e-commerce startup Dukaan, revealed how the company transitioned from cloud services to bare metal servers in the interest of cost savings. 

Choudhary, in his book titled ‘The Accidental CTO’, stated that Dukaan drastically reduced its $80,000 per month bill by migrating from Amazon Web Services (AWS) to bare metal. 

Choudhary noted that bare metal can be 10 to 20 times cheaper than cloud solutions for the same resources. “That $80,000 AWS bill could become a $5,000 bare-metal bill,” he added.

While he stated that AWS provided the company with ease of maintenance, power, and speed, it came at a significant cost. “At every layer of our stack, we were paying a ‘convenience tax.’ We were paying AWS for the privilege of not having to manage the underlying hardware,” Choudhary said in the book. 

“In the early days, this tax was worth it. It allowed us to move incredibly fast. But now, that convenience was threatening to bankrupt us,” he added. 

The company carried out a ‘Strangler Fig Pattern’ migration, which involved gradually shifting traffic from AWS to self-hosted infrastructure without downtime.

He revealed that Dukaan was able to complete a ‘zero-downtime migration’ because the startup owned its own IP address space. 

“This IP address was our permanent, portable address on the internet. We could move house, but our address would stay the same. This was the key that unlocked our ability to migrate without our users ever knowing,” he said.

Dukaan then utilised the Strangler Fig Pattern, which involves gradually replacing a legacy system rather than performing a complete migration in one go. 

The startup chose Hetzner, a German bare-metal provider, and rented physical servers in data centres geographically close to its existing AWS regions. As an example, Choudhary explained how the company chose a data centre in Helsinki, Finland, to replace its AWS cluster in Frankfurt, Germany.

He also revealed that they utilised tools like k3s, a lightweight and powerful Kubernetes distribution, to simplify the process. “Over several weeks, we painstakingly built and tested our new, self-hosted Kubernetes clusters in nine new data centres around the world,” said Choudhary. 

Initially, the company shifted just 1% of its European traffic away from AWS to its bare metal cluster in Helsinki, while the remaining 99% continued to be served by the stable AWS infrastructure. 

After extensive monitoring of the system, the company gradually increased the share of traffic on its bare metal servers, first to 10%, then to 25%, and eventually moved 100% of it away from AWS. 
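
None of Dukaan's routing code is public, but the general shape of such a staged cutover is easy to sketch. The snippet below is a minimal illustration rather than Dukaan's implementation: a weighted router sends a configurable share of requests to the new backend and can be flipped back to the legacy one instantly. In production this weighting would normally live in a load balancer or DNS layer, and all names here are placeholders, while the ramp steps mirror those described in the article.

```python
import random

# Minimal sketch of a Strangler Fig-style traffic ramp (not Dukaan's actual code).
# A configurable share of requests goes to the new bare-metal backend; everything
# can be switched back to the legacy cloud backend instantly if problems appear.

LEGACY_BACKEND = "aws-eu-frankfurt"   # placeholder names for illustration
NEW_BACKEND = "hetzner-helsinki"

class WeightedRouter:
    def __init__(self, new_share: float = 0.01):
        self.new_share = new_share    # start at 1%, as described in the article
        self.rollback = False         # emergency switch back to the legacy backend

    def pick_backend(self) -> str:
        if self.rollback:
            return LEGACY_BACKEND
        return NEW_BACKEND if random.random() < self.new_share else LEGACY_BACKEND

    def ramp_to(self, share: float) -> None:
        # Raise the share only after monitoring shows the new cluster is healthy.
        self.new_share = share

router = WeightedRouter()
for step in (0.10, 0.25, 1.0):        # the 10%, 25%, 100% steps from the article
    router.ramp_to(step)
print(router.pick_backend())
```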

“Each step was carefully monitored. If we saw any issues, we could instantly shift all the traffic back to AWS with zero downtime,” Choudhary said.

“We then repeated this incredibly careful and deliberate process, region by region, over the next two months, until our entire global application was running on our own self-hosted, bare-metal infrastructure,” he added. 

Choudhary published his book openly on GitHub, sharing his experiences as Dukaan’s CTO along with several case studies related to the company’s tech stack. 

In addition to Dukaan, various companies over the years have reported significant cost savings by switching to bare metal servers. 

For example, OneUptime, the observability platform, estimated annual savings of $230,000 after moving to bare metal servers. In 2022, Basecamp projected total savings of $10 million over five years following the exit from cloud servers. 

Swiss Chip Startup Corintis Raises $24M Post Microsoft Deal https://analyticsindiamag.com/ai-news-updates/swiss-chip-startup-corintis-raises-24m-post-microsoft-deal/ Mon, 29 Sep 2025 04:28:34 +0000 https://analyticsindiamag.com/?p=10178295

“Corintis is unlocking the next wave of performance by making cooling a design feature, not an afterthought.” 

The post Swiss Chip Startup Corintis Raises $24M Post Microsoft Deal appeared first on Analytics India Magazine.

]]>

Switzerland-based Corintis, a semiconductor startup specialising in advanced chip cooling solutions, has secured $24 million in a Series A funding round. The funds will help the company scale its microfluidic cooling technology, which Microsoft is already testing in its latest cooling system, one the company claims performs three times better than existing approaches.

The new investment, led by BlueYard Capital, will support Corintis’ expansion as it looks to address the thermal limitations of increasingly powerful chips. With AI technology demanding more energy and producing more heat, effective cooling solutions have become critical. 

Alongside the round, the startup has added Lip-Bu Tan, the CEO of Intel, to its board. The startup claims that its technology can improve cooling efficiency by up to 10 times compared to traditional methods, a breakthrough that could enable the next generation of AI chips.

Expansion and Future Growth

With the new funding, Corintis plans to expand its workforce from 55 to 70 employees by the end of the year and ramp up its production capacity. The company aims to manufacture over a million microfluidic cold plates annually by 2026, with the potential to scale further as demand for advanced AI chips grows.

Remco van Erp, co-founder and CEO of Corintis, explained that every chip is unique, like a cityscape of billions of transistors. Current cooling designs are too simplistic, relying on generic methods like copper blocks with parallel fins. 

Corintis’ solution uses precisely shaped micro-scale channels tailored to each chip, directing coolant to the most critical regions. This approach addresses the challenge of developing more efficient cooling systems within tight deadlines. 

With its innovative platforms such as Glacierware and Therminator, Corintis enables chip designers to optimise thermal performance. The company’s technology is poised to play a pivotal role in AI’s growing demand for computational power.

Corintis has already shipped over 10,000 cooling systems and secured eight-digit revenue since its founding in 2022. The company expects to significantly increase its revenue through early deployments with its technology customers.

"Cooling is one of the biggest challenges for next-generation chips," said Tan. "Corintis is fast becoming the industry leader in advanced semiconductor cooling solutions to address the thermal bottleneck."

Indian IT Firms Confront Americas Reliance Amid Gradual Diversification https://analyticsindiamag.com/it-services/indian-it-firms-confront-americas-reliance-amid-gradual-diversification/ Mon, 29 Sep 2025 03:08:17 +0000 https://analyticsindiamag.com/?p=10178275

Slower economic growth in the US and Europe has tempered IT spending, encouraging Indian companies to pursue new markets

The post Indian IT Firms Confront Americas Reliance Amid Gradual Diversification appeared first on Analytics India Magazine.

]]>

India’s $250 billion IT services sector, the world’s largest outsourcing destination, faces a growing challenge: its heavy dependence on the Americas. For most of the country’s top software exporters, North America accounts for around half of revenues, while Europe adds another third. 

That leaves little room for cushioning when shocks hit.

Indian IT companies have marginally reduced their exposure to the Americas in Q1 2026 compared to last year. However, the region remains their largest and most critical revenue driver, underscoring sustained dependence despite diversification efforts.

Even as firms look to Asia-Pacific for growth, the Americas still account for over half of revenues for most players. This highlights both the opportunities and vulnerabilities tied to the US market.

Any adverse development in the North American region tends to impact Indian IT, said Ananya Roy, founder of Credibull Capital. “Take for instance, the collapse of small banks in the US. US BFSI’s IT spends crashed in the months which followed and Indian IT players concentrated on US BFSI had suffered as a result.”

Roy added that macro headwinds are compounding the problem. “The US GDP growth clocked 2.9% in 2023, it is expected to slow down to 1.7% in 2025. The Euro area is expected to post sub-1% growth. With geopolitical and economic uncertainty, the long-term prospects for these regions look uncertain as well. IT spends in these regions seem to have plateaued.”

A Gradual Diversification

There are signs of change. “Over the years, Indian IT’s exposure to North America has declined,” Roy said, mentioning a reduction of about 10% in overall business coming from North America. 

“At the same time, exposure to other regions (including India itself) has increased. The numbers prove that Indian IT has been successfully diversifying into other regions. Considering the recent bout of policy uncertainty, I am sure the urgency towards diversifying would have increased further,” she added.

However, Chokkalingam G, founder of Equinomics Research, finds the diversification slow and uneven. “Combined GDP of Southeast Asian countries will not be even one-fourth or one-fifth of America. So the corporate world is much bigger in Europe and America. They only need IT. Southeast Asian economies cannot help to mitigate the risk coming from the developed world,” he opined. 

Alternatives are limited, according to Chokkalingam, who said that only amicable trade between India and the US can resolve this quagmire. He noted that annual growth in IT exports had slowed down to 2–4% in dollar terms, down from 50–60% in the 1990s. 

The AI Shake-up

Artificial intelligence (AI), once seen as an opportunity, has emerged as both a disruptor and a catalyst. “With GCCs and mid-tier IT players leveraging AI more extensively and passing on the cost-benefits to clients, clients have started asking even the larger players to step up,” Roy said. 

Chokkalingam agreed AI is reshaping the economics of the sector, but in ways that also threaten demand. “The purpose of AI is to cut down the human cost, which is happening now. It reduces the cost of the base level of IT services. That is also slowly emerging as a threat,” he said.

Capital Allocation Questions

At the same time, industry heavyweights have preferred buybacks to aggressive investment in innovation. Infosys and others have returned billions to shareholders.

“Financially, it is sustainable,” said Chokkalingam. “But it also shows that you are not deploying that money for CapEx and expanding the business or IT service business. When the IT service industry growth outlook is not very robust, they should use it for acquiring mid-sized IT companies… or diversification into IT-enabled services.”

Roy flagged the lack of research spending. “India Inc.’s R&D expenses as a percentage of revenues is less than 1%, and lags significantly behind other economies. Even more alarming is the fact that in India, IT does not feature among the sectors with the highest R&D spending. For a sector that is at a risk of being rendered irrelevant, innovation is critical.” 

Green Shoots

Still, some see opportunities ahead. 

Pareekh Jain, founder of EIIRTrend, an information platform focusing on engineering, IoT, Industry 4.0, and R&D, said Indian IT players are slowly aligning with the government's broader push to diversify exports. 

"A few years back, we wrote this note on $50 billion IT services opportunities in Japan, France, Korea. This is even more relevant now," he said in a LinkedIn post. "In the last couple of years, one IT service provider has had a 30% CAGR growth primarily due to Japan and South Korea."

For Jain, the lesson is that while the Americas will remain the backbone, new geographies can fuel incremental growth. “The time has come for the Indian providers to strategise this opportunity,” he said.

For now, the chart tells the story: Indian IT’s fortunes remain tied to the Americas. Whether diversification and AI adoption can tilt that balance will decide how resilient the industry is in the years to come.

Enterprises Beware: Agent-Washing Clouds the Future of AI https://analyticsindiamag.com/ai-features/enterprises-beware-agent-washing-clouds-the-future-of-ai/ Sat, 27 Sep 2025 10:30:00 +0000 https://analyticsindiamag.com/?p=10178265

Vendors mislabel copilots as agents, raising regulatory and operational risks for firms chasing the promise of agentic AI.

The post Enterprises Beware: Agent-Washing Clouds the Future of AI appeared first on Analytics India Magazine.

]]>

Most vendors are mislabeling their products as “agentic AI,” setting unrealistic expectations around tools that are essentially copilots or intelligent automation with a chat interface, according to new research from HFS.

This “agentic-washing” — the gap between what is marketed and what is actually sold — has become the next big trust issue in enterprise AI. Vendors are rebadging copilots as “agents” to imply autonomy and business impact, according to the research authored by Hansa Iyengar, practice leader (BFS & IT Services) at HFS Research. 

A report by Research and Markets on AI Agents projected the AI Agents market to grow from $5.1 billion in 2024 to $47.1 billion in 2030, with a CAGR of 44.8% during 2024-2030. 
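
As a quick sanity check on those figures, the compound annual growth rate can be recomputed directly from the start and end values; this is simply the standard CAGR formula applied to the numbers quoted above.

```python
# Recompute CAGR from the quoted figures: $5.1B in 2024 growing to $47.1B in 2030.
start_value, end_value, years = 5.1, 47.1, 2030 - 2024
cagr = (end_value / start_value) ** (1 / years) - 1
print(f"CAGR 2024-2030: {cagr:.1%}")  # about 44.9%, consistent with the reported 44.8%
```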

Surveying over 1,300 professionals to "learn about the state of AI agents", the report found that 51% of respondents were already using AI agents in production, 63% of mid-sized companies had deployed agents in production, and 78% had active plans to integrate AI agents. 

The HFS report said regulators on both sides of the Atlantic are already targeting false claims, setting up a collision between hype and compliance.

Gartner forecasts that 40% of enterprise applications are expected to feature task-specific AI agents by the end of 2026, up from less than 5% in 2025. These agents will evolve from AI assistants, currently embedded in most enterprise apps by end-2025, to autonomous, task-capable systems that enhance productivity, collaboration, and workflow orchestration. 

Gartner predicts that agentic AI could account for around 30% of enterprise application software revenue by 2035, surpassing $450 billion. 

“We are still seeing AI assistants being deployed which are agent washed,” Anushree Verma, senior director analyst, Gartner, told AIM.

She added that the rapid growth in popularity of Agentic AI in India is largely driven by hype, while adoption remains very low for now, with low 'AI agency' use cases. Early examples, according to her, take the form of virtual assistant software architectures, which creates even further confusion. 

“Customer service and knowledge management remain the top use cases which have advanced the level of ‘AI agency’ in these implementations. We do have some other emerging use cases, for example, SOC agents, Agents for SDLC, Simulation, etc,” she said.

Devil is in the Details

HFS clarifies the differences. 

Copilots are assistants confined to a single app or workflow, triggered by a user, with limited memory and no autonomous planning or open tool choice. 

AI agents are individual systems executing specific tasks with policies, telemetry, and rollback.

Agentic AI refers to orchestrated, autonomous systems that coordinate multiple agents, maintain context, and adapt dynamically to achieve broader business outcomes. 

If a vendor's AI can't decompose goals, choose tools across systems, remember context, and recover from failure, HFS says it is selling AI-assisted workflows, not agentic AI.

The research referred to UiPath’s Autopilot and Automation Anywhere’s Co-Pilot to illustrate the rebadging trend. 

Both products deliver productivity gains through text-to-automation or natural-language prompts, but they operate within bounded stacks, not open-world autonomy. 

ServiceNow positions its AI Agents as skills-based orchestrators across IT and HR workflows, but again, scope is defined by policy guardrails and configured skills.

The three companies did not respond to AIM‘s queries.

Verma explained that Agentic AI refers to a class of systems developed using various architectures, design patterns and frameworks, encompassing both single-AI-agent and multiagent designs. These systems are capable of performing unsupervised tasks, making decisions and executing end-to-end processes. 

AI agents, by contrast, are autonomous or semiautonomous software entities that use AI techniques to perceive, make decisions, take actions and achieve goals in their digital or physical environments.

“It effectively means that Agentic AI practice is used for creating AI agents,” she said. 

Still an Aspiration

Most deployments today remain at Levels 1 and 2 of HFS’ “five levels of agentic maturity.” Copilots handle departmental tasks under human oversight. A smaller group reaches Level 3, where processes are coordinated across bounded systems.

Levels 4 and 5, where multi-agent systems own business outcomes and evolve with minimal human input, remain aspirational. 

Roadmaps such as Intuit’s GenOS describe “done-for-you agentic experiences,” but HFS classifies them as emerging claims pending production-grade evidence.

The risks of overstatement are growing. 

The US Federal Trade Commission launched “Operation AI Comply” in September 2024, warning that deceptive AI marketing falls under consumer-protection laws. 

In parallel, the Council of Europe’s legally binding AI treaty requires lifecycle transparency, impact assessment, and oversight.

Enforcement has already begun. DoNotPay, which marketed itself as the “world’s first robot lawyer,” faces FTC action for deceptive autonomy claims and has been ordered to compensate customers. 

Rytr, an AI writing assistant, enabled the mass production of fabricated reviews, falling foul of consumer-protection standards. 

Delphia and Global Predictions, the latter of which claimed to be the "first regulated AI financial advisor," paid a combined $400,000 in penalties after regulators found their claims misleading.

Check Before Subscribing

HFS recommends CIOs use its “two-gate Agentic Reality test” before buying into vendors’ claims: 

Gate one asks whether the system demonstrates agency, goal decomposition, tool use, memory, policy guardrails, and telemetry. 

Gate two tests readiness to scale, requiring multi-agent coordination, API execution, fraud prevention, compliance hooks, and lifecycle support.

If two or more Gate 1 items fail, buyers are looking at an assisted workflow, not an agent.
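
One way to operationalise the test is to encode it as a simple scoring function that a buying team runs against vendor evidence. The sketch below is not HFS tooling; the criterion names paraphrase the Gate 1 and Gate 2 items above, the "two or more Gate 1 failures" rule follows the report, and the data structure is an illustrative assumption.

```python
# Illustrative encoding of HFS' two-gate "Agentic Reality test" (not HFS's own tool).
GATE_1 = ["agency", "goal_decomposition", "tool_use", "memory",
          "policy_guardrails", "telemetry"]
GATE_2 = ["multi_agent_coordination", "api_execution", "fraud_prevention",
          "compliance_hooks", "lifecycle_support"]

def agentic_reality_test(evidence: dict) -> str:
    gate1_failures = [item for item in GATE_1 if not evidence.get(item, False)]
    if len(gate1_failures) >= 2:
        # Per the report: two or more Gate 1 failures means an assisted workflow.
        return "assisted workflow (failed: " + ", ".join(gate1_failures) + ")"
    if all(evidence.get(item, False) for item in GATE_2):
        return "agentic AI, ready to scale"
    return "agent-like, but not yet ready to scale"

# Example: a 'copilot' that shows tool use and telemetry but no planning or memory.
print(agentic_reality_test({"agency": True, "tool_use": True, "telemetry": True}))
```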

CIOs should also enforce claims contractually — write “agent” into agreements, demand telemetry, set governance thresholds, define KPIs, require architecture disclosure, and link payments to performance. 

“The bottom line: if a vendor wants a premium for agentic AI, they must earn it with evidence,” HFS said. 

“If a product can’t plan, pick tools across systems, remember context, and recover from failure, it’s a copilot. Label it, limit it, and buy useful assistance at assistant rates.” 

Ashish Kumar, the chief data scientist at Indium, has said that the tech works, but the skill gap is real: agentic AI needs more than prompts and APIs. It requires thoughtful design, orchestration, modularity, and people who understand both software and business logic.

LTTS, Siemens Partner for AI-led Transformation in Process Engineering & Smart Manufacturing https://analyticsindiamag.com/ai-news-updates/ltts-siemens-partner-for-ai-led-transformation-in-process-engineering-smart-manufacturing/ Sat, 27 Sep 2025 08:44:13 +0000 https://analyticsindiamag.com/?p=10178268

The deal will deliver simulation-driven automation and AI-enabled solutions for diverse sectors.

The post LTTS, Siemens Partner for AI-led Transformation in Process Engineering & Smart Manufacturing appeared first on Analytics India Magazine.

]]>

L&T Technology Services has announced an expanded partnership with Siemens Limited.

The deal aims to advance machine & line simulation and IIoT (Industrial Internet of Things) technology, setting a new benchmark for innovation within LTTS’ sustainability segment, which encompasses process engineering, discrete manufacturing and industrial products.

LTTS said in a release that through the collaboration it will utilise the digital technology portfolio of Siemens Limited to deliver simulation-driven automation and IIoT-enabled solutions for diverse sectors including automotive & transportation, industrial products, and process & plant engineering. 

By combining Siemens' flagship platforms, the TIA (Totally Integrated Automation) Portal, Industrial Edge and Tecnomatix, with LTTS' AI-driven engineering expertise, the partnership will accelerate digital adoption, improve precision in system design, and drive faster, smarter decision-making across manufacturing ecosystems.

From enhancing design accuracy to enabling predictive and sustainable production at scale, the strengthened partnership positions LTTS at the forefront of creating intelligent and environmentally responsible industrial ecosystems worldwide, the company said. 

In the statement, Alind Saxena, president & executive director – mobility & tech, LTTS, said, “By focusing on robust solutions such as Machine & Line Simulation and IIoT Technology, we are empowering industries to achieve greater agility, actionable insights, and measurable business outcomes.”

Suprakash Chaudhuri, head of digital industries, Siemens Limited, said combining deep domain expertise with cutting-edge digital solutions can help co-create scalable, future-ready innovations that empower industries to thrive in a rapidly evolving world.

How Neysa Stands Out in the IndiaAI GPU Race https://analyticsindiamag.com/ai-features/how-neysa-stands-out-in-the-indiaai-gpu-race/ Sat, 27 Sep 2025 04:30:00 +0000 https://analyticsindiamag.com/?p=10178215

Unlike other providers focused on GPU allocation, Neysa claims to deliver an end-to-end AI cloud platform.

The post How Neysa Stands Out in the IndiaAI GPU Race appeared first on Analytics India Magazine.

]]>

India’s AI cloud market is crowded with multiple providers vying for the attention of startups, IITs, and enterprises. The IndiaAI Mission has empanelled over 34,000 GPUs, with another 6,000 on the way. 

Around 72% of these GPUs have been allocated to startups building foundational models, providing a boost to the nation’s AI ambitions.

Yotta Data Services, NxtGen, E2E Networks, and others like Jio, CtrlS, Netmagic, Cyfuture, Sify, Vensysco, Locuz, and Ishan Infotech have carved their own slices of this GPU pie. But, Neysa is staking a distinct claim. 

The Mumbai-based AI acceleration cloud system provider is focussed on the problem that most AI teams face: the AI trilemma, as its chief product officer Karan Kirpalani terms it. 

At Cypher 2025, one of India's largest AI conferences organised by AIM in Bengaluru, Kirpalani defined this trilemma: building a product with the right unit economics, speed to market, and product-market fit, all while scaling trust, a combination that rarely works in practice. 

“You can build a product at the right cost with speed to market but may fail to align with market needs, or any two of the other criteria. It’s the apartment problem. Pick any two, but you can’t have all three,” he said.

Traditional cloud providers — AWS, Google Cloud, Azure — can solve parts of the problem but rarely all three. “AWS will charge you four times what the prevalent market rate is for an H100 GPU. You get speed, yes, but you miss unit economics. You pivot the other way, buy your own GPUs, and now you’re stuck on speed and scale. No one has solved all three,” Kirpalani elaborated.

Enter Velocis

Velocis Cloud aims to tackle the trilemma. Unlike other providers focused on GPU allocation, Neysa delivers an end-to-end AI cloud platform. From Jupyter notebooks and containers to virtual machines and inference endpoints, everything is pre-integrated and accessible with a click on Velocis Cloud. 

Enterprises get flat-fee pricing, granular observability, and dedicated inference endpoints for models like OpenAI’s GPT-OSS, Meta’s Llama, Qwen, and Mistral. Startups get credit programs to avoid “project-killing” hyperscaler bills. 
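
Neysa has not published Velocis API details in this article, but dedicated endpoints for open-weights models are commonly exposed over an OpenAI-compatible interface, so a client call might look roughly like the sketch below. The base URL, model name and key are placeholders and assumptions, not Velocis specifics.

```python
from openai import OpenAI

# Assumed pattern only: many GPU clouds front open-weights models with an
# OpenAI-compatible API. Endpoint, model name and key below are placeholders.
client = OpenAI(
    base_url="https://inference.example-cloud.ai/v1",
    api_key="YOUR_API_KEY",
)

response = client.chat.completions.create(
    model="llama-3-8b-instruct",  # placeholder open-weights model name
    messages=[{"role": "user", "content": "Summarise the AI trilemma in one line."}],
)
print(response.choices[0].message.content)
```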

“Clients appreciate it more than GPUs. Bare metal, virtual machines, containers, Jupyter notebooks, inference endpoints — you can do all of it with a click, and at far better unit economics than hyperscalers,” Kirpalani said during a podcast at Cypher 2025.

Contrast that with Yotta. CEO Sunil Gupta has ordered 8,000 NVIDIA Blackwell GPUs to expand capacity for IndiaAI projects. Yotta already operates 8,000 H100s and 1,000 L40s, supporting Sarvam, Soket, and other large-scale AI models. “Most large-scale AI model development in India today is happening on Yotta’s infrastructure,” Gupta earlier told AIM

Yotta’s strength is sheer scale, with a platform-as-a-service API layer for enterprise access. At the same time, Yotta also offers similar services, from training on bare metal hardware to deploying custom models and inference on its Shakti AI Cloud platform.

NxtGen takes a long-term, trust-driven approach to AI and cloud. Unlike Neysa, which focuses on end-to-end platform usability and flexibility, NxtGen leverages its legacy as one of India’s first cloud players and government contracts to build enterprise inference and sovereign AI at scale. 

“The first difference is that we have a lot of trust with our customers,” CEO AS Rajgopal told AIM earlier, emphasising that NxtGen is not just providing GPUs but creating an enterprise-grade inference market with open-source, agentic AI platforms. Its philosophy blends early adoption, infrastructure investment, and operational sovereignty.

Standing Out

So where does Neysa fit in this crowded domain? It’s not about who has the most GPUs or the biggest contracts. It’s about usability, predictability, and sovereignty. Kirpalani emphasised India’s need to reduce dependency on foreign models and data centres. 

“For India, investing across the stack and reducing dependency on foreign models, hardware, and data centres is vital,” he said. Neysa’s strategy is to offer variety — supporting multiple open-weights models — and control, ensuring enterprises can fine-tune, self-host, and manage token performance without surprises.

Hardware scale is a consideration, but Neysa is pragmatic. “Seeing a homegrown NVIDIA in five years? Not realistic. Manufacturing silicon is complex. A more realistic approach is to incentivise global manufacturers and ODMs to produce in India,” Kirpalani noted. The focus is on accessible infrastructure and a strong supply chain rather than building chips from scratch.

While Yotta, E2E, NxtGen, and others are racing to deploy GPUs and secure large contracts, Neysa is carving a niche for operational simplicity and sovereign AI. Its Velocis Cloud is designed to let AI teams focus on product development rather than cloud headaches. 

IndiaAI’s GPU push is impressive — 40,000 units and counting — but sheer capacity alone doesn’t solve the trilemma. That’s Neysa’s take.

A Day in the Life of an Indian Developer  https://analyticsindiamag.com/ai-highlights/a-day-in-the-life-of-an-indian-developer/ Fri, 26 Sep 2025 13:33:37 +0000 https://analyticsindiamag.com/?p=10178259

The campaign highlights something developers have always known: the line between work and life is porous.

The post A Day in the Life of an Indian Developer  appeared first on Analytics India Magazine.

]]>

In the tech world, developers are often celebrated for their problem-solving brilliance. Yet, behind the screen, life is messy, unpredictable, and full of tiny victories and epic fails. 

Amazon Web Services (AWS) is now changing that narrative with a refreshing series of promotional videos that lean into humour, empathy and everyday relatability. This series is presented as a micro drama with multiple episodes, following developers as they navigate multi-tasking, late-night deployments, and the constant challenge of explaining their jobs to curious five-year-olds, skeptical parents, and everyone in between. 

The campaign strings together everyday situations that every developer—and their loved ones—will instantly recognise. Take Nikhil, an engineer who builds a mood-analysing app to better understand his girlfriend’s emotions—a playful reminder that sometimes technology is less about business optimisation and more about personal survival. 

Another episode shows him explaining coding to his niece, highlighting how mentorship and inspiration can come from unexpected corners of family life. 

One of the standout moments comes when his in-laws, typically the toughest audience in any household, are won over by Nikhil, not by his resume or paycheck, but by his clever use of Alexa. AI becomes a household entertainer, turning invisible developer labour into tangible admiration. 

Of course, no developer's journey is complete without the all-too-real scenario of burning the midnight oil. One scene shows Nikhil, shoulder-to-shoulder with a colleague, pushing through the night to finish an app deployment, a mix of camaraderie, caffeine and cloud.

It’s a nod to the countless unsung moments where creativity and persistence fuel innovation at odd hours. 

In a charming twist, Alexa isn't just a silent helper in these stories—she has a personality. She teases the developer about debugging struggles, makes cheeky remarks about neglected fitness routines and even pokes fun at laundry chaos piling up in the background. This quirky AI companion mirrors the highs and lows of a developer's life, blending humour with human resilience. 

At its core, this campaign highlights something developers have always known: the line between work and life is porous. Deployments and deadlines bleed into family dinners, and late-night bug fixes run parallel to everyday chores. 

Why does this matter? Because the future of cloud and AI isn’t just about scale and security, it’s about people. Developers are the heartbeat of innovation, and they live in a world where solving one problem often creates three new ones—both on the screen and off. 

This campaign resonates because it mirrors the stories developers tell themselves and each other. Everyone wants an app to decode emotions, the unexpected family praise moment, or the late-night debugging war story. By wrapping these moments into episodic micro dramas, AWS offers a fresh lens on what it means to be a developer today—resilient, creative, slightly overwhelmed, but always moving forward.

Keep an eye on our LinkedIn handle for the mini-series.

Start building with AWS here.

Two Indian Engineers on a Mission to Automate Home Cooking for the World https://analyticsindiamag.com/ai-features/two-indian-engineers-on-a-mission-to-automate-home-cooking-for-the-world/ Fri, 26 Sep 2025 11:30:00 +0000 https://analyticsindiamag.com/?p=10178244

In a live demonstration for AIM, Posha prepared paneer tikka masala in approximately 25 minutes

The post Two Indian Engineers on a Mission to Automate Home Cooking for the World appeared first on Analytics India Magazine.

]]>

Building a robot that performs mechanical cooking actions is straightforward engineering. Creating one that thinks, perceives, and improvises like a human cook presents an entirely different challenge.

Is the tomato purée thick enough? Do the onions need a few more seconds of sautéing?

Posha, a San Francisco-based startup founded by two Indian engineers, Rohin Malhotra and Raghav Gupta, is pursuing this challenge. 

In an interaction with AIM, co-founder and CTO Rohin Malhotra outlined how the company’s appliance transforms raw ingredients into ready meals. 

For users overwhelmed by AI products that just write emails and generate images, Posha represents a different league: AI applied to physically demanding tasks that people may want to avoid. 

The startup offers an early glimpse of AI handling domestic work that requires real-world perception and judgment.

An Attempt to Bridge the Gap

Posha’s hardware features mechanical arms that pour and stir ingredients, as well as dispense spices through multiple pods, along with an induction pan and integrated oil and water tanks. There is also a display through which users can interact with the appliance, view setup instructions, recipes, and more. 

“The first step would be to choose how many people you’re cooking for,” said Malhotra, as that would guide Posha about the quantity of ingredients to be fed in. The appliance can churn out up to 600 pre-programmed recipes for up to four people. 

Thanks to AI models equipped with computer vision hardware, Posha can ‘watch’ food change during cooking and make real-time decisions about when to adjust the heat, add more ingredients, or proceed to the next step. 

Besides following the recipe, Posha can also intelligently accommodate missing ingredients or specific dietary restrictions. 

“This happens a lot when some of our customers don’t have enough time, or the lack of skills to prepare the ingredients the right way — Posha can detect that and ensure the final recipe turns out the same way,” said Malhotra. 

The appliance is priced at a one-time fee of $1,499 and is being sold in the United States in limited quantities. The company recently raised $8 million in Series A funding led by Accel Ventures. 

In a live demonstration for AIM, Posha prepared paneer tikka masala, a traditional Indian main course made with cottage cheese, in approximately 25 minutes. 

Where’s the Training Data?

Essentially, the company needed to train the computer vision models to cook all 600 recipes in its database. “One of the challenges was that there was no data on which these models could have been trained. We had to create our own datasets,” said Malhotra. 

The team had to break cooking into component skills that function like “Lego blocks” for recipes. For example, to teach the system frying, engineers cooked 10 different ingredients from raw to burnt, training the camera to recognise colour states from raw to golden brown to black.

Similarly, when cooking ingredients that shrink in size, such as mushrooms, the camera can calculate the percentage by which ingredients’ size decreases, said Malhotra. 
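
Posha's vision pipeline is proprietary, but the kind of measurement described can be sketched simply: assuming a segmentation model has already produced binary masks of the ingredient at two points in time, the apparent shrinkage is just the change in visible area. The helper below is an illustration of that idea, not Posha's code.

```python
import numpy as np

def shrinkage_percent(mask_before: np.ndarray, mask_after: np.ndarray) -> float:
    """Percentage by which the ingredient's visible area has decreased."""
    area_before = int(mask_before.sum())
    area_after = int(mask_after.sum())
    if area_before == 0:
        return 0.0
    return 100.0 * (area_before - area_after) / area_before

# Toy example: a mushroom region of 12,000 pixels shrinks to 7,800 pixels.
before = np.zeros((200, 200), dtype=np.uint8); before[:120, :100] = 1
after = np.zeros((200, 200), dtype=np.uint8); after[:78, :100] = 1
print(f"{shrinkage_percent(before, after):.1f}% reduction")  # 35.0% reduction
```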

“We had to get a large number of ingredients initially and then create a substantial amount of synthetic data. We also had to cook them and gather all the necessary data before training our models on that,” he added. 

Each time a user cooks a recipe, Posha also collects camera vision data to enhance the models’ efficiency, Malhotra said. 

To achieve the desired outcome, individuals from various fields of expertise were assembled. The company collaborates with mechanical, electronics, manufacturing, software, and AI engineers — and also, chefs. “A few people who work here were chefs in their previous jobs. And now, for their living, teach robots how to cook.” 

Taste Over Tech

None of the effort that went into development would matter if the food didn't taste good; the polish of the finished appliance and its capabilities would count for little. 

“One of the most important aspects which we discovered was that we need to get people to taste the food,” said Malhotra. “What users are really sceptical about is whether Posha can cook good food.” 

“The easiest way to do that is to do a lot of demos. We invited people to our office, and a lot of them became our customers after seeing a demo and tasting the food,” added Malhotra. 

The company’s design philosophy, he said, focuses on providing the user a clear view of how the appliance is cooking their food. “It helps build a level of trust.”

Furthermore, Posha's goal is not only to save time and effort but also to promote a healthy diet. 

“The amount of time people spend cooking food at home [in the US] is extremely low. That is increasingly being replaced by processed food from supermarkets or food delivery apps,” said Malhotra. Posha’s library of recipes also contains a long list of health-based dishes across various cuisines and dietary preferences. 

Having said that, Posha isn’t the only company in the smart cooking or cooking technology sector. Startups such as Tovala, June Oven, Anova Culinary, and Impulse Labs offer a variety of products and appliances in the smart kitchen market. 

While most leading AI-enabled cooking solutions are offered by startups and modern companies, it would be interesting to see how mainstream appliance manufacturers respond to both the opportunity and the challenge.

The question that will drive innovation remains the same: how long will it take for incumbents to develop such solutions at scale, and at a fraction of the cost?

Karnataka to Democratise AI Access for Students, Startups https://analyticsindiamag.com/ai-news-updates/karnataka-to-democratise-ai-access-for-students-startups/ Fri, 26 Sep 2025 11:23:49 +0000 https://analyticsindiamag.com/?p=10178254

Kharge hinted at a possible extension of the Nipuna scheme to fund and retain advanced research talent.

The post Karnataka to Democratise AI Access for Students, Startups appeared first on Analytics India Magazine.

]]>

In a move set to reshape access to AI-enabled devices in India, Priyank Kharge, Karnataka’s minister for electronics, IT, BT, and rural development & panchayat raj, announced the state’s plan to prioritise affordable AI machines for students and early-stage founders.

The government will work within procurement rules to launch credible pilots, Kharge said during the Startup Vision event at Cypher 2025, India's biggest AI summit. "You'll see what Karnataka has pushed for to ensure AI computers for all. Just wait until 18 November when we launch at the Bengaluru tech summit," he added.

The initiative aims to democratise AI by providing students, researchers, and startups with affordable access to AI devices. 

Startups like Revrag, Skylark Drones, SatSure, and Soket AI participated in the event, pitching innovative ideas, including developing multilingual LLMs and drones for governance, and AI agents for streamlining BFSI onboarding. They expressed interest in collaborating with the government.

The conversation also touched on the challenges of compute infrastructure. A founder highlighted that on-device models running on consumer-grade hardware could reduce costs and latency for Indian use cases, but teams need a baseline kit to experiment.

“Technology isn’t the problem, but infrastructure is,” said one founder. They pointed out the GPU shortage, the friction in government procurement, and Bengaluru’s overstretched roads and metros as significant bottlenecks.

Startup Meets Government Realism

Founders advocated for a simpler path to proofs of concept, noting that many have developed agents for citizen services, triage tools for hospitals, and safety monitoring systems for factories. However, they lamented the bureaucratic hurdles that delay pilot projects.

Kharge acknowledged these constraints, explaining that the state must comply with transparency and procurement laws. “Any solution must prove three things—technical merit, price discipline, and scalability across thousands of gram panchayats,” he said.

However, Kharge left the door open for time-bound trials. “The government of Karnataka is happy to be your first customer, either through a pilot or procurement,” he stated, emphasising that pilot projects and public-procurement carve-outs could be used for well-defined and measurable innovations.

A Strategic Push for R&D

Kharge also highlighted Karnataka’s focus on structured research initiatives, including a possible extension of the Nipuna scheme to fund and retain advanced research talent. “This will position Bengaluru not just as a services hub but as a global centre for frontier AI R&D,” he noted.

When asked about data-sharing initiatives, Kharge acknowledged Karnataka’s collaboration with mobility agencies and city utilities to release operational data for unified commuter apps. However, he cautioned that some datasets would remain closed or tightly controlled. He encouraged startups to propose specific datasets like district-level crop-loss histories or claims gaps for evaluation.

Addressing Bengaluru’s Infrastructure Bottleneck

When Himanshu Upreti, cofounder & CTO at AI Palette, raised concerns about Bengaluru’s traffic, Kharge pointed to the central–state funding imbalance. “For every ₹100 Karnataka gives the Centre, we get back only ₹12,” he said. “Give me ₹25 or ₹50, and I’ll build better roads and create more jobs.”

AWS Space Accelerator Program to Support 42 Indian Space Startups https://analyticsindiamag.com/ai-news-updates/aws-space-accelerator-program-to-support-42-india-space-startups/ Fri, 26 Sep 2025 10:28:25 +0000 https://analyticsindiamag.com/?p=10178246

The selected startups are developing solutions across geospatial analytics, satellite propulsion, space sustainability, and more.

The post AWS Space Accelerator Program to Support 42 Indian Space Startups appeared first on Analytics India Magazine.

]]>

Amazon Web Services (AWS) has selected 42 Indian startups to participate in its AWS Space Accelerator: APJ 2025 program. The 10-week virtual initiative aims to help these startups advance their innovations in the space technology sector. 

This is part of a larger cohort of 67 startups from Australia, India, and Japan. The program will conclude on November 28, with startups showcasing their solutions to space agencies, investors, and industry leaders.

The program offers support, including up to $100,000 in AWS credits, business coaching, technical guidance, and mentorship. Clint Crosier, director of the aerospace and satellite business at AWS, highlighted the accelerator’s role in helping startups leverage cloud technologies to address complex challenges in space. 

“We’re proud to support these visionary companies as they leverage cloud technologies to solve complex challenges in space and back on earth, from designing new launch systems to climate resilience, space sustainability, and data accessibility,” he said.

India’s Space Sector 

The 2025 cohort reflects the growing significance of India’s space sector, which has been bolstered by government initiatives, such as the Indian National Space Promotion and Authorisation Centre’s (IN-SPACe) funding of approximately 5 billion rupees ($57.58 million) for space innovators. 

The selected startups are developing solutions across diverse space segments, such as geospatial analytics, satellite propulsion, and space sustainability.

These startups include Bengaluru-based SkyServe, which offers in-space edge computing with its EdgeAI Suite to process Earth observation data onboard satellites, providing faster and lower-cost insights. 

Another participant, Axial Aero, is creating simulators for astronaut and pilot training, while Quantumspace is securing satellite communications with compact, quantum-safe Quantum Key Distribution (QKD) modules that integrate easily into existing spacecraft without costly redesigns. 

Cosmoserv Space is another company tackling space debris with a dual-spacecraft system that utilises AI, robotics, and refuelling depots to deliver scalable active debris removal.

The accelerator is delivered in collaboration with key partners, including IN-SPACe, the Australian Space Agency, and SKY Perfect JSAT Corporation, ensuring startups receive both technical and business expertise. 

Overall, the program operates in three primary space segments: geospatial applications (51%), launch and space infrastructure (42%), and simulation (7%).

BharatGen and the Pursuit of Sovereign, Scalable AI for India https://analyticsindiamag.com/ai-features/bharatgen-and-the-pursuit-of-sovereign-scalable-ai-for-india/ Fri, 26 Sep 2025 09:58:16 +0000 https://analyticsindiamag.com/?p=10178237

“Knowledge-driven components are important because we don't want everything to be just algorithmic innovation.”

The post BharatGen and the Pursuit of Sovereign, Scalable AI for India appeared first on Analytics India Magazine.

]]>

Generative AI is evolving beyond the race for larger models, focusing on sovereignty, data ownership, and cultural alignment. For India, where multilingual diversity defines daily life, the challenge lies in building AI that reflects these realities while remaining scalable and cost-efficient.

The answer may lie in BharatGen, a consortium-led effort to create multilingual and multimodal AI that is sovereign, frugal, and rooted in India’s priorities.

At Cypher 2025, Ganesh Ramakrishnan, professor at the department of computer science and engineering, IIT Bombay, said, “India’s AI opportunity, converting the diversity into a strength by leveraging the similarity across languages, getting back our skilled engineers and researchers to work together.”

The project brings together IITs and other institutions under a not-for-profit structure, combining academic research with practical applications. Initially supported by the Department of Science and Technology, BharatGen recently received a significant boost in the form of a ₹900 crore grant under the IndiaAI Mission.

This whole-of-government approach, with the Ministry of Electronics and IT stepping in alongside earlier support, aims to scale the models towards the trillion-parameter range and enable the creation of agentic systems for Bharat.

As Ramakrishnan explained, this is a leapfrogging opportunity to shift India from being a "use case capital" to an IP producer, while reinforcing privacy and cultural preservation.

Models Born from India’s Context

BharatGen has already released models ranging from 500 million to 7 billion parameters. Among them is Param-1, a 2.9 billion-parameter language model pre-trained from scratch with 33% Indian data, including 25% Hindi.
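
BharatGen has not described its data pipeline here, but a stated mix like this is typically enforced by weighted sampling over corpora during pretraining. The sketch below illustrates that general idea only; the corpus names, the assumption that the remaining 8% of Indian data spans other Indian languages, and the toy documents are all illustrative, not BharatGen specifics.

```python
import random

# Illustrative corpus-weighted sampling for a target pretraining mix
# (33% Indian data overall, 25 points of which are Hindi), not BharatGen's pipeline.
TARGET_MIX = {
    "hindi": 0.25,
    "other_indian_languages": 0.08,   # assumed split of the remaining Indian share
    "english_and_other": 0.67,
}

def sample_documents(corpora: dict, n: int) -> list:
    names = list(TARGET_MIX)
    weights = [TARGET_MIX[name] for name in names]
    picks = random.choices(names, weights=weights, k=n)
    return [random.choice(corpora[name]) for name in picks]

corpora = {
    "hindi": ["हिंदी दस्तावेज़ 1", "हिंदी दस्तावेज़ 2"],
    "other_indian_languages": ["ಕನ್ನಡ ದಾಖಲೆ 1"],
    "english_and_other": ["english document 1", "english document 2"],
}
print(sample_documents(corpora, 5))
```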

“We also released several domain-specific models in agriculture, legal, finance, and Ayurveda,” Ramakrishnan said, emphasising the localisation strategy.

The consortium has also launched multimodal systems. The Sooktam family powers text-to-speech, Shrutam focuses on automatic speech recognition, and Patram stands as India’s first 7 billion-parameter document vision-language model.

These systems are intended to serve Indian needs rather than mimic global templates. “This is actually the seat of India’s AI ecosystem, having our feet on the ground through applications, while also ensuring that we are building models which are not just aping the Western models,” Ramakrishnan emphasised.

Applications such as Krishisathi, accessible via WhatsApp, demonstrate how these models can reach ordinary users. From speech-to-speech systems capable of conveying emotion to compact diffusion-based voice models that work with minimal data, BharatGen’s experiments point towards a personalised, inclusive future for Indian AI.

Also Read: BharatGen’s ‘Recipe’ for Building a Trillion Parameters Indic Model

Research, Sovereignty, and Scaling Ahead

Research is central to BharatGen’s approach, with over 15 papers published in top-tier venues within a year. The consortium has collected more than 13,000 hours of speech data across Indian regions, embedding fidelity and provenance checks into its data pipelines.

Ramakrishnan described this as a “virtuous cycle” of recipes and indigenous benchmarks, ensuring models evolve from robust foundations.

Training challenges remain formidable, with even mid-sized models requiring hundreds of GPUs over weeks. Yet BharatGen’s frugal philosophy has produced compact multilingual architectures that perform competitively on benchmarks.

The recent government funding promises to accelerate this trajectory. With resources to train much larger models, the project can now aim for trillion-parameter systems, speech agents capable of handling multilingual tasks, and multimodal document models for domains such as governance, healthcare, and finance.

At its core, BharatGen is a strategic exercise in sovereignty. By embedding knowledge-driven components, focusing on explainability, and leveraging linguistic similarities across Indian languages, the initiative seeks to create AI that is not only technically strong but also aligned with India’s cultural and national priorities.

As Ramakrishnan concluded, it is about turning diversity into strength and laying the foundation for India to lead, not follow, in the age of generative AI.

Also Read: How BharatGen Took the Biggest Slice of IndiaAI’s GPU Cake

GitHub Launches Copilot CLI in Public Preview for Terminal-Based Coding https://analyticsindiamag.com/ai-news-updates/github-launches-copilot-cli-in-public-preview-for-terminal-based-coding/ Fri, 26 Sep 2025 09:17:57 +0000 https://analyticsindiamag.com/?p=10178234

The company is bringing its AI coding agent directly to the terminal with native GitHub integration, agentic capabilities, and full developer control.

The post GitHub Launches Copilot CLI in Public Preview for Terminal-Based Coding appeared first on Analytics India Magazine.

]]>

GitHub has announced the public preview of GitHub Copilot CLI, a new command-line interface that brings its AI-powered coding agent directly into developers’ terminals. The release is aimed at making Copilot’s assistance more seamless by removing the need for constant context switching between tools.

With the CLI, developers can now interact with GitHub Copilot synchronously and locally within their command line. The system integrates directly with GitHub, enabling access to repositories, issues and pull requests using natural language commands authenticated through a GitHub account.

The new tool extends beyond simple suggestions, offering agentic capabilities to plan, edit, debug and refactor code. Every action is previewed before execution, giving developers complete control. GitHub also highlighted the CLI’s MCP-powered extensibility, which allows users to run GitHub’s default MCP server and connect custom MCP servers for extended functionality.

Installation is straightforward: developers run an npm install command, authenticate with a GitHub account, and use an existing Copilot Pro, Pro+, Business or Enterprise plan. 

According to the company, the CLI is particularly useful when exploring new codebases, implementing features directly from tracked issues or debugging projects locally. It positions Copilot not only as a coding assistant inside IDEs, but also as a terminal-native collaborator.

The feature is available across GitHub Copilot Pro, Pro+, Business and Enterprise plans. For organisation-managed accounts, administrators must enable the Copilot CLI policy for users to access it.

By embedding Copilot deeper into the developer workflow, GitHub continues to advance its AI strategy. The CLI can serve as an alternative to Gemini CLI or Claude Code for developers looking for a solution from Microsoft.

IBM Helps Unity Bank Cut Time to Market for New APIs by 50% https://analyticsindiamag.com/ai-news-updates/ibm-helps-unity-bank-cut-time-to-market-for-new-apis-by-50/ Fri, 26 Sep 2025 09:09:49 +0000 https://analyticsindiamag.com/?p=10178231

With IBM’s solutions, Unity Bank has established a centralised API hub to manage internal and external APIs across its hybrid cloud infrastructure.

The post IBM Helps Unity Bank Cut Time to Market for New APIs by 50% appeared first on Analytics India Magazine.

]]>

The Mumbai-based Unity Small Finance Bank has collaborated with IBM to centralise and secure its Application Programming Interface (API) ecosystem.

Using the IBM Cloud Pak for Integration and the expertise of IBM Consulting in application management, Unity Bank has established a centralised API hub to manage internal and external APIs across its hybrid cloud infrastructure. The announcement said Unity Bank deployed IBM Cloud Pak for Integration on Red Hat OpenShift.

The API hub is said to leverage an API gateway that facilitates ‘smooth integration between core banking systems, digital channels, and operational workflows’. The hub also provides standardised, secure and real-time access to business-critical data. 

This enables third-party developers, fintech partners, and corporate clients to build value-added services that leverage core banking functionalities, the company said. 

In addition, the API hub is claimed to 'drive a 50% reduction in time-to-market for new APIs and features, and a nearly 30% improvement in API issue resolution.'

“With IBM’s support, we now have a powerful, secure application backbone that allows us to innovate faster, scale effectively, and respond to customer needs in real time. This transformation enables our teams to focus on building differentiated customer journeys rather than managing complex integrations,” said Yusuf Roopawalla, Chief Information Officer, Unity Small Finance Bank.

Rishi Aurora, managing partner at IBM Consulting, India and South Asia, said that India's banking sector is evolving rapidly with the emergence of AI and new technologies, which requires banks to manage the complexities of their applications and APIs. 

“Unity Bank is addressing this uniquely and effectively through the collaboration, by leveraging automation and centralised API governance to build a secure, scalable, and modern application backbone,” he added. 

ServiceNow University Launches in India to Train 1 Million in AI by 2027 https://analyticsindiamag.com/ai-news-updates/servicenow-university-launches-in-india-to-train-1-million-in-ai-by-2027/ Fri, 26 Sep 2025 08:04:45 +0000 https://analyticsindiamag.com/?p=10178212

The launch comes as new research from ServiceNow estimates that Agentic AI will reshape more than 10 million jobs in India by 2030.

The post ServiceNow University Launches in India to Train 1 Million in AI by 2027 appeared first on Analytics India Magazine.

]]>

ServiceNow has launched ServiceNow University in India, with the goal of skilling one million learners in AI by 2027. The announcement was made at the company’s first AI Skills Summit in Hyderabad, which saw 1,200 students attend in person and more than 20,000 join virtually.

The initiative is part of ServiceNow's larger ambition to train 3 million people globally. India, one of the company's fastest-growing markets, is at the centre of this plan. 

Currently, the platform has 318,000 active learners and 116,000 certified professionals, with programs already feeding talent pipelines for its partners and customers.

“Organisations are adopting AI at record speed, but there simply aren’t enough skilled professionals to power this transformation,” said Sumeet Mathur, senior vice president & MD, ServiceNow India technology & business centre. “That uncertainty is exactly what ServiceNow University is designed to address.”

The launch comes as new research from ServiceNow estimates that Agentic AI will reshape more than 10 million jobs in India by 2030. At the same time, India is expected to add three million new tech workers in the next five years, creating both an opportunity and a challenge. 

ServiceNow University aims to address the skills gap in India through free, gamified and AI-personalised learning. Learners can take bite-sized courses, earn digital credentials, and follow India-focused career pathways such as software development, data engineering, and security operations. 

The platform also integrates with universities, government skill programs, and regulatory bodies like AICTE to make courses more widely accessible.

Speaking earlier with AIM, ServiceNow's MD and GVP for India and SAARC, Ganesh Lakshminarayanan, said that AI may shift roles away from pure coding towards "business-tech-commercial" skills, but that demand will be massive. He is backing this belief with ServiceNow University's goal of training one million Indian learners in AI skills by 2027, part of a global target of three million.

Read: ServiceNow India Head Says AI Agents Can Shrink the Indian IT Bench

ChatGPT’s New Background Agent ‘Insanely Useful’ https://analyticsindiamag.com/ai-news-updates/chatgpts-new-background-agent-insanely-useful/ Fri, 26 Sep 2025 07:02:27 +0000 https://analyticsindiamag.com/?p=10178210

ChatGPT Pulse presents a feed of cards filled with content you’ve asked about before.

The post ChatGPT’s New Background Agent ‘Insanely Useful’ appeared first on Analytics India Magazine.

]]>

OpenAI announced a new feature for the $200 monthly ChatGPT Pro plan, called ‘ChatGPT Pulse,’ on September 25. 

The feature offers a feed-like experience on ChatGPT, delivering updates to users based on their chat history and drawing information from all the connected apps. 

Pulse consists of visual cards on various topics that can be expanded for more detailed information. Users receive a new Pulse with a fresh set of cards every day. 

“This is the first step toward a more useful ChatGPT that proactively brings you what you need, helping you make more progress so you can get back to your life,” said the company. 

OpenAI said that, based on feedback from Pro users, the feature will be rolled out to the $20 monthly ChatGPT Plus plan, with the eventual goal of making it available to everyone. 

Greg Brockman, the president and co-founder of OpenAI, described it as a ‘background agent’ that provides daily updates to users on topics of interest. 

Initial reactions from users have been mainly positive. Simon Smith, the executive vice president of generative AI at Klick Health, said in a post on X, “This is going to be insanely useful.” 

“You get content on very specific things you’ve been discussing with ChatGPT. For example, I was chatting with it the other day about AIs competing with each other, so it presented some content on that,” he added, in his review of the feature.

He also added that it curates content based on things that users have mentioned in passing and are genuinely interested in. “It noticed I’ve been asking about Toronto’s crashing condo market, for example, so found me an article on that,” he added. 

However, some users also expressed concerns about the future of ChatGPT. "Feels like an attempt to build a ‘feed’ for ‘scrolling’. Feels very passive vs utilising the conversational strengths of LLMs. and yeah, mostly news (sic)," said Kevin Xu, CEO of Alpha AI, in a post on X.

The release of Pulse also coincides with reports that OpenAI is hiring for an advertising role, one that will report to Fidji Simo, the company's CEO of Applications.

The post ChatGPT’s New Background Agent ‘Insanely Useful’ appeared first on Analytics India Magazine.

]]>
How Pradhi AI Embeds Emotional Intelligence in Voice AI https://analyticsindiamag.com/ai-features/how-pradhi-ai-embeds-emotional-intelligence-in-voice-ai/ Fri, 26 Sep 2025 06:30:00 +0000 https://analyticsindiamag.com/?p=10178175

As businesses recognise the potential of voice-driven tech, Pradhi AI is laying the foundation for an empathetic, responsive AI ecosystem.

The post How Pradhi AI Embeds Emotional Intelligence in Voice AI appeared first on Analytics India Magazine.

]]>

Text-to-speech-based generative models have propelled the industry towards faster and more efficient consumer engagement and hiring. Yet replicating human tone, emotion, and subtlety remains far off. Voice AI holds unique promise, but it brings complexities as well.

Pradhi AI Solutions, a Hyderabad-based startup, has been focused on pioneering a voice-driven, emotionally intelligent AI platform. 

From Multigraphs to Market Models

The company’s roots lie in deep tech research. CEO & co-founder Vijayalaksmi Raghavan, in a conversation with AIM, recalled how her team translated abstract mathematical concepts, such as multigraphs, into deployable predictive models. These could monitor real-time metrics such as the melt flow index. 

Their system is built in layers. It extracts over a hundred voice-based measures, refines them through network graph theory and statistical techniques, and delivers actionable insights.

This is more than academic. The model is designed for enterprise deployment, embedding intelligence into everyday customer and sales conversations. By focusing on extracting significance from voice, Pradhi AI allows organisations to see far beyond what traditional text-based interfaces can capture.
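To make the layered idea concrete, here is a minimal Python sketch of such a pipeline: a feature-extraction layer, a graph layer that relates the measures to one another, and an insight layer that ranks them. The specific features, edge weights and the networkx-based graph construction are illustrative assumptions, not Pradhi AI's actual implementation.

```python
# Minimal sketch of a layered voice-analytics pipeline of the kind described above.
# Feature names, weights and the graph construction are illustrative assumptions.
from dataclasses import dataclass

import networkx as nx
import numpy as np


@dataclass
class Insight:
    label: str
    score: float


def extract_features(audio: np.ndarray) -> dict[str, float]:
    """Layer 1: derive simple per-utterance measures from raw audio."""
    return {
        "rms_energy": float(np.sqrt(np.mean(audio ** 2))),        # loudness proxy
        "peak_amplitude": float(np.max(np.abs(audio))),
        "zero_crossing_rate": float(np.mean(np.abs(np.diff(np.sign(audio))) > 0)),
    }


def refine_with_graph(features: dict[str, float]) -> nx.Graph:
    """Layer 2: relate the measures to one another in a weighted graph."""
    g = nx.Graph()
    names = list(features)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            # Edge weight here is simply the product of the two values.
            g.add_edge(a, b, weight=features[a] * features[b])
    return g


def derive_insights(g: nx.Graph) -> list[Insight]:
    """Layer 3: rank measures by how strongly they connect to the rest."""
    weighted_degree = dict(g.degree(weight="weight"))
    ranked = sorted(weighted_degree.items(), key=lambda kv: -kv[1])
    return [Insight(label=name, score=score) for name, score in ranked]


if __name__ == "__main__":
    audio = np.random.default_rng(0).normal(size=16_000)   # stand-in for a real clip
    print(derive_insights(refine_with_graph(extract_features(audio))))
```

In a real deployment the first layer would cover the hundred-plus measures the company describes, but the three-stage flow stays the same.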

Moving Beyond Sentiment Analysis

Voice AI has long been associated with sentiment detection: happy, sad, or angry tones tagged at a superficial level. Raghavan argues this is shallow.

“Speech emotion recognition is an evolving field. Today, semantic analysis only tells you so much. A human can distinguish the difference between a flat ‘okay’ and an enthusiastic ‘okay!’, but a large language model can’t,” she explained.

Pradhi AI, in collaboration with IIT Delhi, is researching prosody: the rhythm, stress, and intonation of speech. These markers can indicate tension, hesitation, or emphasis, and when modelled correctly they can expand AI's interpretive depth.
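As a rough illustration of what prosodic markers look like in practice, the sketch below pulls pitch, energy and a speaking-rate proxy from an audio clip using the open-source librosa library. The chosen features and the file path are assumptions for illustration; production systems feed far richer representations into learned models.

```python
# A minimal sketch of prosodic feature extraction, assuming librosa is installed.
# The markers below are illustrative only, not Pradhi AI's feature set.
import librosa
import numpy as np


def prosody_markers(path: str) -> dict[str, float]:
    y, sr = librosa.load(path, sr=16_000)

    # Pitch contour (fundamental frequency) via the pYIN algorithm.
    f0, voiced_flag, voiced_probs = librosa.pyin(
        y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"), sr=sr
    )
    f0 = f0[~np.isnan(f0)]

    # Energy envelope and a rough speaking-rate proxy from onset density.
    rms = librosa.feature.rms(y=y)[0]
    onsets = librosa.onset.onset_detect(y=y, sr=sr)

    return {
        "pitch_mean_hz": float(np.mean(f0)) if f0.size else 0.0,
        "pitch_variability": float(np.std(f0)) if f0.size else 0.0,  # flat vs. animated
        "energy_mean": float(np.mean(rms)),
        "onsets_per_second": float(len(onsets) / (len(y) / sr)),     # tempo proxy
    }
```

Under this crude scheme, a flat "okay" and an enthusiastic "okay!" would differ mainly in pitch variability and mean energy, which is exactly the kind of distinction plain semantic analysis misses.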

The implications are vast. In customer service, for instance, the need may not be to identify anger but to equip a bot with empathetic responses that mirror human interaction. “We’re very far away from that reality,” Raghavan admitted, “but our work is laying the groundwork to get there.”

India’s Multilingual Reality

One of the toughest challenges for voice AI is handling Indic languages. Unlike English, which has benefited from decades of corpus development and tokenisation research, Indian languages lack extensive digital datasets.

Some large models, such as Google’s Gemini stack, perform significantly better with Indic languages than others, Raghavan pointed out. Consequently, Pradhi AI has adopted a hybrid approach that combines augmented models with datasets, leverages heuristic methods for recognising dialects, and introduces a human-in-the-loop mechanism to achieve fine-tuned accuracy.

“Achieving the 98-99% accuracy levels for Indian languages will take time,” she said. “But until we go under the hood, embedding-level improvements, tokenisation, and larger datasets, the gap will persist.”

Privacy as a Cornerstone

If emotion recognition is a research challenge, data privacy is the commercial one. Enterprise customers are often hesitant to transmit voice data outside their controlled environments. Raghavan is acutely aware of this barrier.

She revealed a significant breakthrough. “From a data privacy standpoint, we do not make any API calls. That means the data remains within your environment only. All our models are locally installed. Plus, our data is protected with an advanced cryptography solution. The breakthrough is no API calls.”

Pradhi AI uses elliptic-curve encryption to secure audio both in transit and at rest. Clients can run the stack on-premises, ensuring compliance with strict privacy standards.
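The exact scheme has not been published, but a common way to pair elliptic-curve cryptography with bulk audio encryption is an ECDH key agreement that derives an AES-GCM key. The sketch below shows that generic pattern using the Python cryptography package; it is an assumption about the approach, not Pradhi AI's implementation.

```python
# A minimal sketch of ECDH + AES-GCM hybrid encryption for audio at rest or in
# transit. This is a standard pattern, not Pradhi AI's published scheme.
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF


def encrypt_audio(audio_bytes: bytes, recipient_public_key: ec.EllipticCurvePublicKey):
    # Ephemeral key pair: its public half travels alongside the ciphertext.
    ephemeral_key = ec.generate_private_key(ec.SECP256R1())
    shared_secret = ephemeral_key.exchange(ec.ECDH(), recipient_public_key)

    # Derive a 256-bit AES key from the shared secret.
    aes_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"audio-at-rest").derive(shared_secret)

    nonce = os.urandom(12)
    ciphertext = AESGCM(aes_key).encrypt(nonce, audio_bytes, None)
    return ephemeral_key.public_key(), nonce, ciphertext


if __name__ == "__main__":
    recipient = ec.generate_private_key(ec.SECP256R1())
    eph_pub, nonce, ct = encrypt_audio(b"\x00" * 320, recipient.public_key())
    print(len(ct), "bytes of ciphertext")
```

Because everything here runs as local library calls, the same pattern works entirely on-premises, which is the property enterprise customers are asking for.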

The Indian AI Ecosystem

Voice-first platforms in India face infrastructure barriers. Many enterprises are still adapting their user interfaces, designed originally for text, to accommodate voice inputs.

Raghavan likens the process to plumbing: “You can’t install a fancy tap until the pipes are laid down. For us, the first work often is just putting that infrastructure in place.”

Yet the enthusiasm remains high. Once businesses see the potential for natural and accurate interactions, they recognise the long-term value.

The funding climate for AI has shifted dramatically since 2023. Back then, a startup could attract investors merely by mentioning "AI" in its pitch deck. Now, says Raghavan, investors demand a clear path to revenue.

Voice AI has received significant investment on the front end, in making voices sound human, Indian, or empathetic. But voice as a data input, and voice analytics more broadly, remain underfunded. That is precisely where Pradhi AI is carving its niche.

“We will look for institutional funding soon,” Raghavan said, “but right now our focus is on traction and building an ARR-driven business model.”

Her career reflects the startup’s layered approach. She worked in the corporate sector, transitioned to the non-profit world, and ultimately embraced entrepreneurship.

“Corporate impact can feel limiting,” she said. “Non-profits deliver scale but often depend on stakeholders and grants. As an entrepreneur, I can shape not just my organisation but also the behaviours of others through the products we create.”

This philosophy underpins Pradhi AI’s dual vision: advancing research in emotion recognition while creating tangible impact for enterprises.

Future of Voice AI with Emotional Intelligence 

Pradhi AI is tackling the gap between human emotion and machine intelligence.

Its layered model stands apart in a domain still in its early stages. 

The aim is not to replace human agents but to augment them, providing better assessments of conversations and stronger decision-making for enterprises.

As Raghavan put it: “The trick is not in solving 80% of the use cases, but in knowing when the AI is dealing with the critical 20% it can’t handle yet. That’s why we always keep the human in the loop.”
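In engineering terms, that usually reduces to a confidence gate: act automatically when the model is sure, escalate to a person when it is not. The sketch below shows the pattern; the 0.8 threshold and the callback names are illustrative assumptions, not details Pradhi AI has disclosed.

```python
# A minimal sketch of the "critical 20%" idea: route low-confidence model
# outputs to a human reviewer instead of acting on them automatically.
from dataclasses import dataclass
from typing import Callable


@dataclass
class Assessment:
    label: str
    confidence: float


def route(assessment: Assessment,
          act: Callable[[Assessment], None],
          escalate: Callable[[Assessment], None],
          threshold: float = 0.8) -> None:
    """Act automatically only when the model is confident enough."""
    if assessment.confidence >= threshold:
        act(assessment)
    else:
        escalate(assessment)   # the human-in-the-loop path


if __name__ == "__main__":
    route(Assessment("customer_frustrated", 0.62),
          act=lambda a: print("auto-response:", a.label),
          escalate=lambda a: print("sent to human review:", a.label))
```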

In doing so, Pradhi AI is shaping not just India’s AI story but also the global debate on how machines can truly learn to listen.

The post How Pradhi AI Embeds Emotional Intelligence in Voice AI appeared first on Analytics India Magazine.

]]>