Interviews and Discussions News, Stories and Latest Updates 2025
https://analyticsindiamag.com/news/interviews-and-discussions/

ServiceNow India Head Says AI Agents Can Shrink the Indian IT Bench
https://analyticsindiamag.com/it-services/servicenow-india-head-says-ai-agents-can-shrink-the-indian-it-bench/
Tue, 19 Aug 2025 11:16:34 +0000

“You can actually build AI which learns on both sides and creates personalised recommendations for each employee.”

The post ServiceNow India Head Says AI Agents Can Shrink the Indian IT Bench appeared first on Analytics India Magazine.

India isn’t just a market for ServiceNow — it’s a core R&D hub. About 40% of its global product engineering happens here. That’s not limited to product creation; engineering teams here often work directly on large-scale deployments.

The platform-as-a-service company could help resolve one of Indian IT’s chronic problems: the bench.

Reflecting on the demand-supply conundrum of the Indian IT industry, ServiceNow’s MD and GVP for India and SAARC, Ganesh Lakshminarayanan, said that AI could match skills in real time, recommend training, and move people into billable work faster.

Speaking with AIM, he said that access to data on both ends, along with the intervention of AI agents, would be critical to designing such workflows.

“Assume that there are AI agents working on both sides. Somebody who’s coming up with personalised recommendations and somebody who’s pushing the workflow. Suddenly, you’re solving the most important problem SIs are having,” Lakshminarayanan said, talking about reducing the numbers on the bench.

This is where a platform approach beats siloed tools, he argued. It’s also why ServiceNow is investing in an “AI Control Tower” that can orchestrate not just its own agents, but those built on Microsoft, ERP systems, or other vendors. “Our approach has never been proprietary,” he said. 

“We work with multiple LLMs. The value is in the data and workflows.”

This will help companies train and upskill employees with new skills in real time, without a human push. “You can build AI which learns on both sides and creates personalised recommendations for each employee,” he explained.
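The two-sided matching Lakshminarayanan describes can be sketched in a few lines. The code below is a hypothetical illustration, not ServiceNow’s implementation: employee skill profiles are compared against open demand, and the remaining skill gap doubles as a personalised training recommendation. All names and data shapes are assumptions.

```python
# Hypothetical sketch of two-sided skill matching for bench deployment.
# Names and data shapes are illustrative assumptions, not ServiceNow's API.

def skill_gap(employee_skills, demand_skills):
    """Return the skills the employee still needs for a given demand."""
    return set(demand_skills) - set(employee_skills)

def recommend(employee_skills, open_demands):
    """Rank open demands by smallest skill gap; the gap doubles as a
    personalised training recommendation."""
    ranked = sorted(open_demands.items(),
                    key=lambda kv: len(skill_gap(employee_skills, kv[1])))
    role, needed = ranked[0]
    return role, skill_gap(employee_skills, needed)

demands = {
    "cloud-migration": ["python", "aws", "terraform"],
    "data-platform": ["python", "sql", "spark"],
}
role, training = recommend(["python", "sql"], demands)
print(role, sorted(training))  # best-fit role plus the skills to close the gap
```

In a real deployment, the skill profiles and demand records would come from the HR and resourcing systems of record rather than hard-coded dictionaries, which is exactly the data access Lakshminarayanan calls critical.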

Lakshminarayanan said that the Indian market, by all accounts, is heating up fast. “The country is going through a major AI uptick. We are in the right stage of growth.” The proximity to both customers and developers, he said, is a “boon”, allowing rapid innovation and problem-solving.

ServiceNow’s Big Bets on India

While Indian IT and consulting giants are busy promising AI transformation, ServiceNow is already delivering it. In a market filled with pilots, the company’s AI subscription revenue has risen to $3 billion, and AI deal volume has increased by 50% in a single quarter, as per its Q2 earnings results.

ServiceNow’s own research predicts AI will create 10.3 million jobs in India by 2030. Lakshminarayanan has lived through three tech booms—the internet, BPO, and now AI—and believes each has been a net job creator. 

The roles may shift away from pure coding to “business-tech-commercial” skills, but the demand will be massive. He’s backing that belief with a goal to train one million Indian learners in AI skills by 2027, part of ServiceNow University’s global goal of three million.

“India needs AI, but AI needs India,” he said, pointing to the country’s unmatched scale in banking, telecom, retail, and IT services. “If a particular AI company wants to make their AI agents useful for banking, where else are you going to find the biggest banks in terms of volume of transactions? You’re going to find them in India.”

Most importantly, Lakshminarayanan is clear that the future belongs to platforms that can bring together data, workflows, and AI agents. ServiceNow’s Workflow Data Fabric (WDF) is designed to do exactly that—connect multiple systems of record without copying data, making it instantly accessible to AI agents with minimal latency.

This is similar to what Sumeet Mathur, SVP and MD of ServiceNow India Technology & Business Centre, told AIM earlier.

“Our one-platform approach gives us a significant edge over competitors who rely on acquisitions to piece together solutions,” Mathur explained. ServiceNow stands out with its commitment to providing a single platform, data model, and architecture. This strategy ensures an integrated and seamless user experience across an enterprise’s various functions.

Paul Smith, former president of global customer and field operations at ServiceNow, told AIM in an earlier interaction: “One in five of all ServiceNow employees is based in India. And 85% of the resources that we have in ServiceNow India are in R&D and engineering roles — especially core engineering.”

Smith went on to highlight that a significant number of these engineers are working on leading-edge research around AI, underlining India’s role in building the future of enterprise automation. He is currently the chief commercial officer at Anthropic. 

The Service Deficit

Infosys and TCS, two of ServiceNow’s biggest Indian partners, are also building AI platforms. Lakshminarayanan sees their role as complementary. “AI value comes from data plus workflows. We’ve been doing IT workflows for 20 years, which is why we can achieve 78% automation already with agentic AI.”

Lakshminarayanan is quick to draw a line between ServiceNow’s positioning and that of other IT vendors. “We are an AI platform company,” he said. This shift is deliberate. 

ServiceNow’s long-standing IT workflows, HR automation, and service management tools have given it something many AI startups and consultancies don’t have: deep, structured, enterprise-wide data. 

Lakshminarayanan pushes back on the narrative that AI adoption will reduce jobs. Instead, he sees Indian enterprises using AI to bridge what he calls “the service deficit.”

He recalls a conversation with the ED of a large public-sector bank. The executive asked: Why should only a small percentage of accounts get a dedicated relationship manager? Why not use AI agents to make that experience available to every account holder?

It’s a theme he sees across industries — from consulting firms looking to give every employee a digital executive assistant, to HR leaders wanting a dedicated recruiter for every candidate. “It’s about hyper-personalised service, so people can focus on meaningful work,” he said.

With OpenAI now hiring forward-deployed engineers in a Palantir-style move, is there a threat to ServiceNow’s position? Lakshminarayanan doesn’t think so. 

“Yes, you can deploy forward engineers, but without existing data and workflows, you’re starting from scratch. We have 150 AI agents out-of-the-box, going to a thousand in six months. Time to value matters.”

India Has Just 5-10 Years to Catch Up in Global Quantum Race, Says IISc Professor
https://analyticsindiamag.com/deep-tech/india-has-just-5-10-years-to-catch-up-in-global-quantum-race-says-iisc-professor/
Mon, 18 Aug 2025 11:53:54 +0000

There are already 80 to 90 quantum entrepreneurs in the state, NS Boseraju told AIM.


India needs to establish a self-reliant quantum ecosystem within the next decade if it wants to avoid falling behind and becoming a mere consumer of foreign technology, according to Arindam Ghosh, professor of physical sciences at the Indian Institute of Science (IISc), Bengaluru.

In an exclusive chat with AIM, Ghosh warned against the traditional Indian approach of importing and assembling components, emphasising instead a strategic, homegrown approach to developing quantum systems. He sees this not just as a matter of technological progress but of national security and economic sovereignty as well.

“What worries me,” Ghosh explained, is that Indian quantum systems could end up “assembled from imported components”, even as the country yearns for a “truly self-reliant quantum future”.

An industry analyst from Bengaluru who works on semiconductor solutions, but did not wish to be named, echoed this concern. According to him, even when some visible components, like the outer shell, are made domestically, critical subsystems are almost always imported, often from countries like the Netherlands or Finland. This dependency, he said, makes it unlikely for India to become a global player without a coordinated policy push. 

Ghosh said the policy push must take a “holistic” approach, developing supporting components, such as semiconductors, fans and hard drives, alongside the core hardware. A multifaceted approach will ensure India has the infrastructure and expertise to maintain and evolve quantum systems independently.

N S Boseraju, Karnataka’s minister for science and technology, told AIM in an exclusive interview that the state government has provided a grant of ₹48 crore to IISc to support the development of quantum technology.

The state already has 80 to 90 entrepreneurs working on quantum computing, he said, adding that the government is preparing a platform to support these companies with infrastructure like land, water, and power. Boseraju added that the government intends to provide “prime land” in Bengaluru for a quantum chip foundry, which would be part of a larger plan to establish the “quantum city of Bengaluru”.

Security Stakes Are High

Quantum computing has the power to break existing encryption, which could leave critical infrastructure vulnerable. Given the national security implications, this amplifies the urgency of developing sovereign quantum capability.

“Tomorrow, if someone has a 10,000-qubit computer, they will first hack our national security,” said Ghosh.

Ajai Chowdhry, chairman of the National Quantum Mission (NQM), told AIM that adversaries can “harvest today and use later”, meaning they can collect data now and decrypt it once powerful quantum computers are available.

He said that acting on this front, the government has created a task force to formulate a policy for making India “quantum secure” within the next three months. This task force will advise critical institutions, such as the Reserve Bank of India, power grids, and telecom operators on how to protect themselves.

Sanjeev Gupta, CEO of the Karnataka Digital Economy Mission (KDEM), told AIM that quantum technology is converging with many other domains. Pointing to cybersecurity, communications, space, defence, and healthcare, he asked whether the country’s digital economy is ready for ‘Q-day’, which, according to him, is the day “a quantum hacker attacks you.”

Market Gaps or Manufacturing? 

Talking about the commercial aspects of quantum computers, the analyst said that without a domestic market beyond the government, investors would see little incentive in the sector. 

A lack of broad commercial demand, he argued, could leave India overly reliant on public-sector procurement and vulnerable to being outpaced by countries where corporate giants, from Microsoft to Google, are driving large-scale quantum projects with multi-billion-dollar budgets.

Professor Ghosh re-emphasised that related products and components must be developed domestically to global standards, so that India can exchange quantum technology with other industry leaders.

Incentivising a Homegrown Ecosystem

India’s quantum journey is underway, with a 6-qubit system at TIFR, a 7-qubit system at IISc, and a 25-qubit machine at QpiAI. But Chowdhry admitted this is “not good enough,” and the goal is a 1000-qubit computer. Ghosh said India could also compete in quantum sensing and materials if it invests strategically.

The analyst cautioned that the path to parity with global leaders is lengthy and resource-intensive. The United States’ quantum story spans over 15 years, beginning with early companies like D-Wave, and backed by billions in funding from tech giants.

“It doesn’t matter if you’re building in the US or India, the hardware, the chip, the programming, everything costs the same,” he said. Current funding levels in India, he added, are “peanuts” compared to what’s needed.

Avoiding the Tech Gap

Beyond money, talent is a bottleneck. “We possibly need 100x more PhDs,” the analyst said, adding that it is difficult to find good talent in India. “Most of our best brains go to the US.”

He argued for bold government intervention akin to China and Korea’s industrial policies. There needs to be some amount of liquidity to incentivise people to do something they otherwise wouldn’t, he said. 

Eric Holland, general manager of Keysight Technologies’ quantum engineering solutions, told AIM that while building an entirely homegrown quantum computer is “possible, yes… sustainable, unclear.” 

He noted: “I don’t think that it’s possible for one country to own the full supply chain, it’s so diverse and so across the spectrum.” Instead, he advocates leveraging global strengths and partnerships, pointing out that “it’s to your benefit to leverage other areas that are driving profitability rather than saying, hey, we have to start from scratch.”

Wipro Bets Big on Agentic AI
https://analyticsindiamag.com/ai-features/wipro-bets-big-on-agentic-ai/
Thu, 14 Aug 2025 07:43:00 +0000

Wipro Limited has teamed up with Google Cloud to roll out 200 production-ready generative AI agents.


While Agentic AI has yet to become fully mainstream, momentum is clearly building among the leading Indian IT firms.

According to Deloitte’s fourth wave of the State of GenAI report (India perspective), over 80% of Indian organisations are exploring the development of autonomous agents, signalling a significant shift towards Agentic AI.

Yet, despite the growing appetite for adoption, full-scale integration remains gradual. The same Deloitte report noted that with AI evolving at an unprecedented pace, 28% of firms worry their current AI solutions could become obsolete within just two years.

Mukesh Bansal, founder of Nurix, CureFit, and Myntra, pointed out the disconnect between the AI agent hype and its actual production deployment in a recent LinkedIn post.

“Everyone is building AI agents, and yet so few AI agents are in production doing real work. LLMs keep getting better and better, and yet P&Ls are not changing by even basis points! What’s going on?” Bansal wrote.

For him, the reason is simple. Building real, enterprise-grade AI agents is not the same as coding an app or selling a SaaS tool. “Agentic companies are not like a traditional software or platform company at all. It is a cross between McKinsey and Infosys,” he said.

Amidst this, Wipro is placing a big bet on Agentic AI systems that can set goals, make plans, and take action on their own. 

Wipro’s Take on Agentic AI

Recently, the company teamed up with Google Cloud to roll out 200 production-ready generative AI agents across sectors such as healthcare, banking, insurance, retail, manufacturing, and IT solutions.

The agents are designed to improve customer experiences, streamline business processes, and accelerate digital transformation using Google’s Gemini models and the Vertex AI platform.

“Wipro is demonstrating how Google’s Gemini models and Vertex AI can be utilised to build powerful, industry-specific agents that transform everyday work across industries,” said Victor Morales, VP of global system integrators partnerships, Google Cloud, in a statement. 

Meanwhile, Pushpa Ramachandran, VP and global head of AI at Wipro Limited, told AIM that these agents are capable of learning, planning, and acting independently within complex business environments. 

The strategy is to build modular, domain-specific agents that plug into business processes across industries. 

In healthcare, Wipro is creating agents that handle tasks like provider credentialing and onboarding. In energy and utilities, AI agents can spot equipment faults and adjust maintenance schedules without human intervention. In retail, they can process refunds, resolve complaints, and update prices and recommendations in real time.

One global retailer, as Ramachandran mentioned, used Wipro’s agentic AI to automate free trade agreement checks and customs paperwork across multiple countries. As a result, it reduced port delays, minimised duty leakage, and saved significant costs while speeding up product launches, the company claimed. 

Ramachandran further said that AI is now part of almost every client conversation. “We are in an era where we have AI-led delivery in all the client conversations,” the executive noted, adding that industries such as BFSI, consumer goods, telecom, and manufacturing are rapidly advancing in this space.

What Are Others Doing?

Meanwhile, Tata Consultancy Services has built one of the largest enterprise AI stacks, quietly powering over 150 workflows with autonomous AI agents.

On the other hand, HCL Tech’s patented GenAI platform, AI Force, which revitalises the software development and IT-Ops lifecycle, has now been enhanced with Agentic AI capabilities. 

Alongside this, its Talent Navigator solution leverages GenAI to search through vast resume databases, match candidates to job descriptions, simulate interview experiences, schedule interviews, and set up personalised onboarding sessions.

Infosys, too, is in the race and has launched over 200 enterprise AI agents powered by Infosys Topaz AI offerings and Google Cloud’s Vertex AI Platform.

Why is Wipro Focusing on SLMs?

Wipro is also working on small language models (SLMs), which are lighter, task-specific AI models trained on industry-specific data.

In the emerging Agentic AI trend, AI agents often don’t need a gigantic general model. They need lightweight, task-specific brains that can work in concert.

SLMs fit perfectly into this architecture. 
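The architecture described here can be sketched as an SLM-first routing loop: try the small, domain-specific model and escalate to a larger general model only when confidence falls short. The sketch below is a hypothetical illustration of the pattern, not Wipro’s implementation; all model names and the confidence threshold are assumptions.

```python
# Hypothetical sketch of SLM-first routing in an agentic pipeline:
# try a small, domain-specific model and escalate to a larger general
# model only when confidence is low. Model names are placeholders.

def route(task, slm_registry, confidence_threshold=0.8):
    """Pick the domain SLM for the task; fall back to the general LLM."""
    model = slm_registry.get(task["domain"], "general-llm")
    answer, confidence = call_model(model, task["prompt"])
    if model != "general-llm" and confidence < confidence_threshold:
        # Specialist was unsure: escalate to the big general model.
        answer, confidence = call_model("general-llm", task["prompt"])
        model = "general-llm"
    return model, answer

def call_model(model, prompt):
    # Stub standing in for a real inference call.
    return f"{model} answer to: {prompt}", 0.9 if model != "general-llm" else 0.99

registry = {"insurance-claims": "claims-slm-1b", "retail-returns": "returns-slm-3b"}
model, answer = route({"domain": "insurance-claims", "prompt": "triage claim"}, registry)
print(model)
```

The design choice this illustrates is why SLMs suit agentic systems: most requests stay on the cheap, fast specialist, and the expensive general model is reserved for the cases the specialist cannot handle.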

Wipro also collaborates with Wipro Ventures, which brings together a network of startups focused on industry- and domain-specific solutions, including SLMs.

Each startup offers deep specialised expertise, enabling Wipro to accelerate SLM development and deliver business impact for clients. In addition, the company partners with the research arms of universities to address complex business challenges and broader ecosystem opportunities.

Skills remain a primary focus, with over 87,000 employees now certified in AI, covering areas such as prompt engineering, model fine-tuning, and AI ethics and governance. 

Wipro is also expanding training to frontline roles like customer service agents, field technicians, and plant operators, so they can work alongside AI systems and ensure adoption at every level.

“This democratisation of AI knowledge is core to our AI-first transformation strategy,” Ramachandran mentioned.

The Double Thank You Moment Between Kubernetes and LLMs
https://analyticsindiamag.com/ai-features/the-double-thank-you-moment-between-kubernetes-and-llms/
Tue, 12 Aug 2025 10:54:41 +0000

Kubernetes’ orchestration efficiency and its evolution under the CNCF make it critical infrastructure.


Large language models (LLMs) may dominate AI-related headlines, but the underlying infrastructure that makes them work reliably at scale rarely does. 

Kubernetes, an open source container cluster manager, is not only enabling the AI era by orchestrating inference at scale, but also evolving through the demands of AI workloads, a mutually reinforcing cycle between the two, according to Jonathan Bryce, executive director of the Cloud Native Computing Foundation (CNCF).


“We are in the middle of what I think is a huge shift from the traditional workloads of applications to AI applications,” Bryce told AIM in an interview. For context, Kubernetes is maintained by CNCF. 

While strong performance, efficient response times and uptime remain priorities, hardware requirements have evolved to suit AI, said Bryce, referring to GPU utilisation.

The high costs of GPUs make orchestration efficiency a critical factor. This is also why recent developments within the cloud native community have prioritised GPU scheduling, allocation, and workload placement. It also puts a spotlight on the networking capabilities Kubernetes offers.

“Kubernetes has always had these networking concepts available to applications that allow containers to move within pods, and that’s becoming really key to building high-performance AI applications, specifically inference,” said Bryce. 

Bryce, like many voices in the industry today, highlighted how inference is set to become the most critical workload in AI, surpassing model training for the time being. Not all AI inference runs in GPU datacentres; some of it will also need to operate on devices like laptops, phones, cars, and other edge systems, where orchestration becomes a priority.

Several open source frameworks are exploring how to run these workloads efficiently, such as Ray, Red Hat’s new llm-d, and ByteDance’s AIBrix.

“The common factor: Kubernetes and the Kubernetes primitives,” Bryce said. 

Kubernetes remains the core foundation for orchestrating LLM workloads, handling deployment, scaling, fault‑tolerance, and hardware abstraction. And recent developments across the Kubernetes ecosystem further extend its capabilities for LLM inference and serving.

In June, the Gateway API Inference Extension was introduced to Kubernetes. Unlike generic HTTP load balancers, this extension enables inference-aware routing that takes session state, model identity, and resource usage into account, tailored for long-lived, GPU-intensive LLM requests. Released in July, Google Cloud’s GKE Inference Gateway embodies these capabilities.

It routes LLM requests based on GPU-specific metrics like KV cache usage, enabling better throughput and lower latency. It supports multiple models behind a single endpoint and allows autoscaling based on AI workload patterns, optimised for vLLM and various GPU types.
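Inference-aware routing of this kind can be sketched as scoring model-server replicas by GPU-side metrics and sending each request to the least-loaded one. The sketch below is a simplified illustration of the idea, not the Gateway API’s actual schema; the metric names and weights are assumptions.

```python
# Hypothetical sketch of inference-aware routing: score model-server
# replicas by GPU-side metrics (KV-cache utilisation, queue depth) and
# pick the least-loaded replica. Metric names and weights are
# illustrative assumptions, not the Gateway API's actual schema.

def score(replica):
    """Lower is better: a nearly full KV cache or a deep request queue
    means slower time-to-first-token for a new long-lived LLM request."""
    return 0.7 * replica["kv_cache_utilisation"] + 0.3 * (replica["queue_depth"] / 10)

def pick_replica(replicas):
    return min(replicas, key=score)

replicas = [
    {"name": "vllm-0", "kv_cache_utilisation": 0.92, "queue_depth": 4},
    {"name": "vllm-1", "kv_cache_utilisation": 0.35, "queue_depth": 7},
    {"name": "vllm-2", "kv_cache_utilisation": 0.60, "queue_depth": 1},
]
print(pick_replica(replicas)["name"])
```

This is the contrast with a generic HTTP load balancer: a round-robin balancer would ignore these metrics entirely and could send a long request to a replica whose KV cache is already nearly full.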

Furthermore, Red Hat AI recently announced llm-d, a Kubernetes-native distributed inference framework built on top of vLLM, one of the most widely used open-source frameworks today to accelerate AI inference. 

In a dual-node NVIDIA H100 cluster, llm-d achieved ‘3x lower time-to-first-token’ and ‘50–100% higher QPS (queries per second) SLA-compliant performance’ compared to a baseline. 

Google Cloud, a key contributor in the llm-d project, said, “Early tests by Google Cloud using llm-d show 2x improvements in time-to-first-token for use cases like code completion, enabling more responsive applications.”

How AI Is Shaping Kubernetes Development

While Kubernetes enables AI workloads, AI is also influencing Kubernetes’ own evolution, and the open source community’s responsiveness to real-world needs is a big part of that. 

“What’s been happening with Kubernetes is as people are taking AI to production, they’re realising Kubernetes has the right types of workload orchestration to manage these complex environments — with networking, with different types of hardware, with different SLAs [service level agreements], and they’re pushing updates into Kubernetes or related projects,” Bryce explained.

One example is improved workload placement control. For high-performance inference, simply moving a model between GPUs can be inefficient because of context and key-value (KV) cache portability issues. 

“How do I make sure that requests are going to a GPU that has its cache? How do I share or distribute the cache using things like LM cache?” Bryce said. “These are differences Kubernetes didn’t support a year ago and does now because of the work that folks are writing.”

Bryce said that these aren’t just experiments or fun ideas developers are exploring; they are driven by real-world needs.

The changes have made Kubernetes a more capable platform for LLM inference workloads, accommodating mixed hardware environments, tighter performance constraints, and more complex service-level requirements.

Opportunities for Devs in Cloud Native AI Era

For developers considering where to focus their skills in the AI age, Bryce sees the cloud-native ecosystem as a long-term bet. 

While Kubernetes is still the most prominent project at the CNCF, Bryce points to OpenTelemetry as the fastest-growing. 

With AI systems acting as partial black boxes, instrumentation is critical. “You’re going to need more information out of these systems to understand what’s really happening,” Bryce said. 

“Being able to instrument them and get better data out of them… that’s going to be a huge area of innovation,” he added. 

Another growth area Bryce points to is platform engineering, which involves designing internal platforms to make developers more effective. 

“Almost all [companies in the CNCF ecosystem] are implementing platform engineering in some way… If you can learn a skill set that makes your organisation’s developers more effective, that’s hugely valuable,” Bryce said.

Having said that, it is crucial to recognise that LLMs are only as effective as the infrastructure supporting them at scale. Kubernetes, with its advancing GPU orchestration, application-aware networking, and real-world-driven feature development, is emerging, if it has not already established itself, as the backbone for inference workloads. 

“It’s underappreciated how much the CNCF is actually at the centre of AI innovation… our community makes technology work reliably at scale,” added Bryce.

Wadhwani Foundation is Building ‘ChatGPT Plus Plus Plus’ for Indian MSMEs
https://analyticsindiamag.com/ai-features/wadhwani-foundation-is-building-chatgpt-plus-plus-plus-for-indian-msmes/
Tue, 05 Aug 2025 12:32:16 +0000

“Entrepreneurship shouldn’t be a privilege. It should be a path to prosperity—for anyone, anywhere.”


For most founders in India’s smaller cities, entrepreneurship is less about ambition and more about survival. But once you take the first step—setting up a business—the real struggle begins. 

“You’re trying to grow, but who’s going to help you?” asked Meetul Patel, president at Wadhwani Foundation. “You don’t have the right resources. You don’t know what you don’t know.”

The Wadhwani Foundation wants to change this.

The nonprofit, focused on large-scale job creation, is building a virtual AI growth accelerator—a national platform that combines curated knowledge, human mentorship and AI-powered guidance to help small businesses and startups grow. 

Wadhwani’s goal is to turn every company into an AI-first organisation, embedding AI across all its startup and skilling initiatives. The aim is simple but ambitious: to support millions of entrepreneurs and MSMEs with personalised, contextual and real-time help, without needing millions of human experts.

“AI lets us reach thousands of entrepreneurs simultaneously, but with the intimacy of one-on-one support,” Patel said.

Ignite, Liftoff and Accelerate

Nowhere is this vision more visible than in Gujarat, where the foundation recently launched Accelerate, its flagship venture-scaling programme, in Ahmedabad and Vadodara.

India has no shortage of startups. What it lacks, though, is structure. Most entrepreneurs drop out between initial excitement and sustainable growth. In Gujarat, this pipeline is already taking shape. Over the past year, Wadhwani Foundation has engaged with over 5,000 students across around 50 institutions (including GTU, Parul University and GLS) with its Ignite initiative.

Furthermore, with Liftoff, Wadhwani’s programme aimed at helping ventures build at zero cost and zero equity, over 150 startups and MSMEs have received venture-building support, and at least 20 scale-ready ventures have joined the new Accelerate programme in Gujarat.

“Most founders don’t need another pitch deck workshop,” Patel said. “They need someone to tell them where their real growth gap is, and help them fix it.”

That’s where AI steps in. 

Unlike general-purpose tools that answer questions, Wadhwani’s AI system, with the help of diagnostics, intelligent workflows and curated datasets, identifies what’s missing in a venture’s growth journey—whether it’s product-market fit, investor readiness or export compliance. 

It then nudges founders toward expert advice, market connections or even one-on-one coaching. Whether it’s regulatory compliance, fundraising readiness or finding the right market, the system is designed to walk founders through their entrepreneurial journey, step by step.

“It’s ChatGPT plus plus plus,” Patel quipped.

This isn’t just a chatbot dressed up with a slick interface; it’s a learning system trained on the actual journeys of Indian startups and MSMEs, including the problems they couldn’t solve. Most importantly, it’s free, not just to use but also to build on. “It’s a shared digital public good,” Patel said. “We’re taking out the capital cost from the entire ecosystem.”

Fixing the Systemic Gaps

While India has over 63 million MSMEs, most struggle with productivity, digital adoption, and access to markets. Meanwhile, only 15-20% of startups survive beyond a few years, and even fewer scale.

Wadhwani’s AI-led programmes are designed to change that by introducing execution discipline, real-time feedback and tailored interventions. Every week, the system gets smarter, pulling in lessons from founder journeys, expert sessions and partner inputs.

This feedback loop, where expert interactions are not lost in one-off conversations but fed back into the system, is what sets the platform apart.

“It’s not just about tech. It’s about how the tech is applied—at the right time, in the right way, with the right partner,” Patel said.

At the heart of it all is one mission: to create jobs. By helping startups scale and MSMEs become globally competitive, the foundation is working to generate millions of livelihoods, especially in India’s Tier 2 and Tier 3 cities, where opportunities are scarce but ambition runs deep.

With AI as its backbone, the Wadhwani Foundation is showing what a modern startup ecosystem can look like: intelligent, inclusive and built for scale.

“Entrepreneurship shouldn’t be a privilege,” Patel said. “It should be a path to prosperity—for anyone, anywhere.”

It’s not surprising then that many ecosystem players, from incubators to industry associations, are signing up. “There’s no commercial incentive here. But there’s a shared mission. If you’re helping students, startups, or small businesses, we want to help you,” Patel said.

While the final version of the platform is still being iterated, the foundation has already released an early version for partners to explore. With rapid weekly updates and feedback loops, Patel likens their approach to a startup itself—fail fast, ship faster.

The post Wadhwani Foundation is Building ‘ChatGPT Plus Plus Plus’ for Indian MSMEs appeared first on Analytics India Magazine.

]]>
Zoho Says It Doesn’t Have the Muscle to Compete With ChatGPT https://analyticsindiamag.com/ai-features/zoho-says-it-doesnt-have-the-muscle-to-compete-with-chatgpt/ Mon, 21 Jul 2025 09:18:08 +0000 https://analyticsindiamag.com/?p=10173737

The company believes AI should serve business needs, not chase internet hype.

The post Zoho Says It Doesn’t Have the Muscle to Compete With ChatGPT appeared first on Analytics India Magazine.

]]>

Zoho recently rolled out a range of AI releases, including a proprietary LLM stack and speech recognition models, without ever invoking the usual ChatGPT comparisons. There were no open betas, no claims of general-purpose intelligence, and certainly no talk of disrupting humanity.

Instead, Zoho reiterated what it has quietly believed for years: AI, like software, should serve business needs, not chase internet hype.

While competitors chase benchmark bragging rights and flashy multimodal showcases, Zoho is staying firmly within its lane. 

“We don’t have the muscle to compete with the likes of ChatGPT or Gemini, to be honest,” Ramprakash Ramamoorthy, director of AI research at Zoho, told AIM. “That’s the truth.”

Enterprise Use-Case Only, No Consumer Focus

The company has no plans to open its new Zia LLM models, in 1.3B, 2.6B, and 7B parameter versions, to the public or consumers. These models are not for generic prompting or idle exploration. 

“We want to keep it focused on the enterprise user, and we have a lot of work to be done,” Ramamoorthy said.

Zoho had previously integrated open-source models, such as Llama, Mistral, and DeepSeek, into its stack but found the approach to be insufficient. 

“I think those are very generic; they are built for the consumers, and you are putting that into an enterprise stack,” he added.

The shift to building its own LLMs was as much about long-term control as it was about contextual performance. Ramamoorthy believes LLM providers will eventually evolve into SaaS competitors, which makes it essential for Zoho to keep investing in models of its own.

Building and sustaining proprietary AI models inevitably requires robust infrastructure, and the need for top-tier GPUs is critical. The IndiaAI mission, a government initiative, aims to strengthen India’s AI ecosystem by offering grants, shared compute resources like GPUs, and research partnerships—primarily to startups and academic institutions.

When asked about Zoho’s participation, Ramamoorthy clarified that Zoho is not involved in the initiative, saying the company is large enough not to need government grants. Still, he expressed appreciation for the government’s efforts to encourage innovation, despite the export cap on GPUs and the associated costs. “The government playing its part to spur innovation is really appreciable,” he said.

AI That Lives Inside the Stack—and Stays There

Zia LLM and Zoho’s new automated speech recognition (ASR) engine are designed to work seamlessly within the company’s own applications—CRM, Desk, Books, Voice, and others. Nothing is routed through third-party APIs. No customer data is being sent to cloud providers for inference.

When asked what differentiates the company from its competitors, Ramamoorthy told AIM, “This is within the stack, you do not need an extension, you are not sharing your data with another third party. So that is the power of our platform, which makes us a key differentiator, whereas a model service provider might not have the platform; they will have to plug it into something else.”

The LLMs will be bundled into Zoho’s subscription plans, with usage caps by edition, and will not be priced separately. AI agents, including revenue analysers and support bots, will remain free during the adoption phase.

“Agent adoption rates are still poor across the industry, not just in Zoho. It is just a lot of hype around it. So we want customers to see value out of it.”

He added that the AI agent and LLM offerings will be priced only once customers see value in them.

Ramamoorthy explained that they do not want anyone to be put off running experiments because of the pricing. “So we are playing a wait-and-watch game here; it is a very conscious decision,” he added.

An Intentional Detour from the AI Arms Race

Zoho’s AI play avoids much of what’s fashionable. The models are relatively small, the training data is a mix of public and proprietary sources, and the focus is purely enterprise. “Nobody is going to plan vacations on a CRM system,” Ramamoorthy said.

Instead, Zoho is building a private backbone for enterprise AI, one where governance, context, and cost control matter more.

Even the speech models, trained in English and Hindi, have only 0.2B parameters, a fraction of the size of OpenAI Whisper V3, yet outperform it in word error rate on standard benchmarks.
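Word error rate, the metric behind that claim, is straightforward to compute: the word-level edit distance between a reference transcript and the model's hypothesis, divided by the number of reference words. A minimal sketch (illustrative only; this is not Zoho's evaluation code):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming edit-distance table
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + cost)   # substitution
    return d[len(ref)][len(hyp)] / len(ref)

print(wer("the cat sat on the mat", "the cat sat on mat"))  # 1/6: one deletion over six reference words
```

Benchmark numbers like those cited above are this ratio averaged over a full test set.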

Where most players view AI as a product to launch, Zoho sees it as infrastructure to build, and these releases make that difference tangible. While others lead with hype, Zoho seems to be leading with integration, embedding AI where it quietly improves outcomes without adding complexity.

It’s not the most headline-grabbing strategy. However, it could be what matters most to businesses that run on Zoho or on platforms that embed it.

In refusing to chase ChatGPT, Zoho might be building something great; only time will tell.

The post Zoho Says It Doesn’t Have the Muscle to Compete With ChatGPT appeared first on Analytics India Magazine.

]]>
Wipro Deploys 200+ AI Agents Across Functions, Says Rishad Premji https://analyticsindiamag.com/ai-news-updates/wipro-deploys-200-ai-agents-across-functions-says-rishad-premji/ Thu, 17 Jul 2025 04:03:31 +0000 https://analyticsindiamag.com/?p=10173518

“AI, especially generative and agentic AI, is becoming a game-changer. It’s helping us rethink how we work, uncover new growth opportunities and deliver great value to our clients,” said Rishad Premji.

The post Wipro Deploys 200+ AI Agents Across Functions, Says Rishad Premji appeared first on Analytics India Magazine.

]]>

Ahead of Q1 FY26 results at Wipro’s 79th Annual General Meeting, chairman Rishad Premji underscored the company’s commitment to agentic and generative AI, describing the technologies as central to Wipro’s future. 

As the company nears its 80th anniversary, Premji said that AI is reshaping internal operations and client offerings alike.

“AI, especially generative and agentic AI, is becoming a game-changer. It’s helping us rethink how we work, uncover new growth opportunities and deliver great value to our clients,” said Premji.

Agentic AI is already being integrated into core business processes, with more than 200 intelligent agents developed in collaboration with cloud partners. These agents are independently handling tasks across departments such as HR, finance, and legal, driving scaled efficiencies and outcomes.

Premji noted that Wipro is moving from AI-augmented workflows to fully autonomous systems, marking a new phase in the company’s digital evolution. “Agentic AI is also making a tangible impact across Wipro’s internal operations,” he said.

Alongside this shift, the IT major is nurturing an AI-first workforce. Most employees have completed foundational training in generative AI, and more than 87,000 have received advanced upskilling tailored to their roles, including teams in HR, sales, finance, and delivery.

Reflecting on the challenges of FY25, CEO and MD Srini Pallia highlighted the volatile business environment and global uncertainties, particularly in the latter half of the fiscal year. 

“In the fourth quarter, uncertainty deepened with new headwinds regarding the direction and quantum of tariffs,” Pallia said, noting how clients responded by reassessing their transformation initiatives.

Despite the turbulence, Wipro closed 63 large deals worth $5.4 billion, including two mega deals. Pallia emphasised that AI now plays a central role in every engagement.

During the Q4 FY25 earnings call, Pallia said that AI has been part of all deal conversations for a while. “But this year, it has actually become central to almost every opportunity, big or small,” Pallia said. 

Read: ‘AI is Part of All Deal Conversations’: Wipro Secures 17 Big Wins Worth $1.8 Bn in Q4

When asked about the cannibalisation of deals because of generative AI, Pallia said that Wipro is now incorporating generative AI into all its solutions. “GenAI was not part of the earlier deals, but in the new deals, we’re going to infuse GenAI.”

“Some of the clients would want to adopt GenAI, and some are worried about the guardrails in terms of how China is going to deploy [the AI models],” Pallia said. In 3-4 quarters, Wipro will have better clarity on that. “But we are going to infuse AI into a solution before we go to the client.”

Looking ahead to FY26, Pallia said Wipro’s priorities include scaling its consulting-led, AI-powered transformation model.

The post Wipro Deploys 200+ AI Agents Across Functions, Says Rishad Premji appeared first on Analytics India Magazine.

]]>
CP Gurnani says His 90-year-old Mother-in-law’s Best Friend is Her iPad with AI https://analyticsindiamag.com/ai-news-updates/my-90-year-old-mother-in-law-tells-me-her-ipad-with-ai-is-her-best-friend-says-cp-gurnani/ Tue, 08 Jul 2025 11:14:07 +0000 https://analyticsindiamag.com/?p=10173051

“It’s about creating the right environment for people to experiment, learn, and see the value for themselves.”

The post CP Gurnani says His 90-year-old Mother-in-law’s Best Friend is Her iPad with AI appeared first on Analytics India Magazine.

]]>

AI adoption is moving beyond enterprises and into living rooms, as seen in a personal account shared by CP Gurnani, co-founder of AIonOS.

In a recent LinkedIn post, Gurnani recounted how his 90-year-old mother-in-law now refers to her AI-enabled iPad as her “best friend,” using it daily to ask about the weather or find movie recommendations. Gurnani was speaking at the ET Edge HR Transformation Summit 2025.

“You know AI has arrived when my 90-year-old mother-in-law tells me her iPad with AI is her ‘best friend,’” Gurnani wrote. “If I want to know what’s trending, I just ask her or her AI.”

The post reflected a growing sentiment among AI practitioners that mainstream acceptance hinges less on technical readiness and more on cultural openness. 

“True adoption of AI isn’t about forcing change,” Gurnani noted. “It’s about creating the right environment for people to experiment, learn, and see the value for themselves.”

Gurnani emphasised that at AIonOS, the philosophy is to treat AI as an extension of human capability rather than a replacement. “It’s always human plus AI, not human or AI,” he said. “The real opportunity is in using these tools to free up teams for more creative, strategic work that truly moves the needle.”

As businesses and individuals alike navigate the pace of AI advancements, Gurnani urged a shift in perspective. “The question we need to ask ourselves isn’t about the timeline of adoption,” he said, “but how we can foster a culture that encourages learning and embraces rapid technological change.”

AIonOS was co-founded by Gurnani and Rahul Bhatia, founder of InterGlobe Enterprises. Launched in 2024–25, the company is headquartered in Singapore, with fulfilment centres in India (Noida, Hyderabad), the US, UK, Paris, Africa, and the Middle East.

Focused on accelerating digital transformation, AIonOS builds AI-powered solutions that enhance both human and system capabilities. Its primary sectors include Travel, Transportation, Logistics, and Hospitality (TTLH), leveraging InterGlobe’s strong presence in these industries, including IndiGo Airlines.

The company recently acquired a majority stake in Cloud Analogy, a Salesforce Summit Partner, to combine its generative AI platform with Cloud Analogy’s Salesforce expertise. The acquisition is aimed at improving AI-led customer experience transformation and expanding AIonOS’s presence across India, the US, Europe, and Asia.

The post CP Gurnani says His 90-year-old Mother-in-law’s Best Friend is Her iPad with AI appeared first on Analytics India Magazine.

]]>
Why GenAI is India’s High-Performance, Low-Cost Game Changer https://analyticsindiamag.com/ai-features/why-genai-is-indias-high-performance-low-cost-game-changer/ Mon, 23 Dec 2024 05:56:07 +0000 https://analyticsindiamag.com/?p=10144143

AWS’s Satinder Pal Singh points out that while 99% of employers envision their companies becoming AI-driven organisations by 2028, nearly 79% struggle to find skilled professionals.

The post Why GenAI is India’s High-Performance, Low-Cost Game Changer appeared first on Analytics India Magazine.

]]>

Generative AI is at the forefront of India’s tech evolution, delivering great performance without the steep price tag. This has been possible thanks to some big players who continue to up their game. With resilient infrastructure, cutting-edge technology, and strategic investments, cloud giant AWS is helping Indian businesses scale AI while redefining price-performance benchmarks. 

Satinder Pal Singh, head of solution architecture for AWS India and South Asia, leads a team of architects tasked with helping customers design scalable, efficient solutions using AWS services. During an in-depth conversation with AIM at AWS re:Invent 2024 held in Las Vegas recently, Singh shared insights into AWS’s strategic focus on India, customer needs, and groundbreaking launches that promise to redefine cloud technology in the region.

“India is a significant country for us,” Singh began, emphasising AWS’s extensive infrastructure in the country. With two operational regions—Mumbai and Hyderabad—spanning three availability zones, 33 points of presence, and nine direct connect locations, Singh noted, “We provide extensive capabilities to our customers, enabling them to leverage more than 240 services—from infrastructure to analytics, IoT, and machine learning—while keeping their data within India.”

Custom Chips: Redefining Price-Performance

AWS’s investments in custom-built chips—Graviton, Trainium, and Inferentia—have been a game-changer for its customers. Graviton chips, designed for general workloads, have already delivered significant benefits. Companies like Zomato and Paytm reported reduced costs by up to 30% and improved performance by 20-35%, a testament to the efficiency of these chips.

Trainium and Inferentia, on the other hand, cater specifically to AI workloads. Trainium delivers a 64% performance boost compared to previous offerings, while Inferentia is optimised for inference, helping global clients save up to 40% on costs. Singh acknowledged that these advancements were critical for Indian enterprises navigating the challenges of scaling AI while managing budgets.

Upskilling India: AWS AI-Ready Program

One of the most pressing challenges for Indian organisations is the AI talent gap. Singh cited a startling statistic—while 99% of employers envision their companies becoming AI-driven organisations by 2028, nearly 79% struggle to find skilled professionals. To address this, AWS launched the AI-Ready Program, a comprehensive initiative to upskill over 2 million individuals globally by 2025. 

In India alone, over 5.9 million individuals have been trained in cloud services since 2017. AWS’s investment in India’s local cloud infrastructure is projected to reach $16.4 billion by 2030, supporting 131,700 jobs annually and contributing $23.3 billion to India’s GDP by 2030.

Why Mumbai and Hyderabad?

The decision to establish data centres in Mumbai and Hyderabad was driven by performance and reliability considerations while providing multi-region availability. AWS’s infrastructure is built with resilience at its core, comprising multiple availability zones, which are essentially isolated data centres capable of functioning independently.

This design has helped AWS customers achieve exceptional reliability. For instance, ANI Technologies, the parent company of ride-hailing platform Ola, utilised this setup to create a robust failover mechanism, ensuring uninterrupted services even during system failures. 

Singh explained that such resilience is crucial for mission-critical applications, especially in industries like finance and healthcare.

He elaborated on the layered design of AWS’s infrastructure: “Each region consists of multiple availability zones. Think of one availability zone as one or more isolated data centres. If one zone fails, workloads can shift seamlessly to another.” This architecture ensures resilience even in extreme scenarios like natural disasters.

Generative AI, a key area of focus for AWS, has witnessed increasing interest from Indian enterprises. Singh highlighted two major innovations announced at re:Invent: model distillation and advancements in automated reasoning.

The model distillation technique, offered via Amazon Bedrock, allows smarter, cost-efficient models to inherit the learnings of larger, more accurate ones, significantly lowering operational costs without compromising on precision.
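At its core, distillation trains the smaller student model to match the teacher's softened output distribution rather than only hard labels. A toy sketch of the loss term follows; it is purely illustrative and not Bedrock's managed implementation:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; higher temperature softens the distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence between softened teacher and student distributions."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Zero when the student exactly matches the teacher, positive otherwise
print(distillation_loss([2.0, 0.5, 0.1], [1.5, 0.7, 0.2]))
```

In practice this KL term is minimised by gradient descent over the student's weights, typically mixed with the standard cross-entropy loss on ground-truth labels.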

Another critical advancement is AWS’s approach to minimising hallucination in AI models using automated reasoning. Singh emphasised that this enhancement ensures model reliability, making it an ideal solution for industries like insurance and healthcare, where accuracy is non-negotiable. 

These innovations are a direct response to the needs of Indian customers, who often demand high performance at lower costs. Singh detailed AWS’s use of automated reasoning to enhance model reliability. “With this, businesses like insurance companies can rest assured that their models provide accurate responses, eliminating the risk of incorrect outputs,” he said.

Driving Innovation Across Industries

AWS’s democratisation of technology has empowered organisations across diverse industries. For instance, manufacturing firms like Apollo Tyres are leveraging AWS services and reported a 9% increase in productivity, while banks like Axis and HDFC enhanced customer experiences with data-driven insights. 

Singh emphasised AWS’s commitment to innovation while ensuring inclusivity. “We democratise technology so that SMBs have the same access to cutting-edge capabilities as large enterprises,” he said. 

The importance of AWS’s partner ecosystem cannot be overstated. Globally, AWS collaborates with over 140,000 partners, with a significant presence in India. Companies like HCLTech and Persistent Systems are leveraging AWS’s capabilities to deliver innovative solutions to their clients.

The post Why GenAI is India’s High-Performance, Low-Cost Game Changer appeared first on Analytics India Magazine.

]]>
Meet Ishit Vachhrajani, Global Head of Enterprise Strategy at AWS https://analyticsindiamag.com/videos/meet-ishit-vachhrajani-global-head-of-enterprise-strategy-at-aws/ Wed, 04 Dec 2024 06:34:05 +0000 https://analyticsindiamag.com/?p=10142426

Meet Ishit Vachhrajani for a discussion on how generative AI is revolutionising industries like healthcare, uncovering transformative use cases and innovations shaping the future.

The post Meet Ishit Vachhrajani, Global Head of Enterprise Strategy at AWS appeared first on Analytics India Magazine.

]]>
Meet Ishit Vachhrajani for a discussion on how generative AI is revolutionising industries like healthcare, uncovering transformative use cases and innovations shaping the future.

The post Meet Ishit Vachhrajani, Global Head of Enterprise Strategy at AWS appeared first on Analytics India Magazine.

]]>
OpenAI’s Internal Progress Toward AGI with Pragya Misra https://analyticsindiamag.com/videos/openais-internal-progress-toward-agi-with-pragya-misra/ Sat, 30 Nov 2024 15:20:35 +0000 https://analyticsindiamag.com/?p=10142097

Meet Pragya Misra, sharing her insights into OpenAI's internal progress toward AGI. Learn what a typical week at OpenAI looks like, and what it is like to work with Sam Altman.

The post OpenAI’s Internal Progress Toward AGI with Pragya Misra appeared first on Analytics India Magazine.

]]>
Meet Pragya Misra, sharing her insights into OpenAI’s internal progress toward AGI. Learn what a typical week at OpenAI looks like, and what it is like to work with Sam Altman.

The post OpenAI’s Internal Progress Toward AGI with Pragya Misra appeared first on Analytics India Magazine.

]]>
Granite 3.0 is IBM’s Love Letter to Indian Developers https://analyticsindiamag.com/ai-features/granite-3-0-is-ibms-love-letter-to-indian-developers/ Fri, 22 Nov 2024 12:00:00 +0000 https://analyticsindiamag.com/?p=10141464

“India’s developer ecosystem is vibrant and growing,” Vishal Chahal said, “and we are committed to supporting it every step of the way.”

The post Granite 3.0 is IBM’s Love Letter to Indian Developers appeared first on Analytics India Magazine.

]]>

Small language models and AI agents are the talk of the AI town. With the release of the latest Granite 3.0 and Bee Agent Framework, IBM has shifted its focus towards smaller and more efficient models while maintaining transparency to a level previously unheard of in the AI industry.

A few days after the launch of Granite 3.0, Armand Ruiz, VP of product–AI Platform at IBM, publicly disclosed the datasets on which the model was trained. “This is true transparency. No other LLM provider shares such detailed information about its training datasets,” said Ruiz.

This is a practice IBM has adhered to even in the past with new model releases. Vishal Chahal, VP of IBM India Software Lab, told AIM that it is important for companies to be fully open about their language models and the data they’ve used to build them. 

Despite its smaller size, Granite 3.0 has been performing exceptionally well in tasks like coding and making small chatbots. The model is applicable across diverse domains, from healthcare to finance, demonstrating its adaptability. Looks like open-source models have finally caught up with their closed-source counterparts.

“You can start small,” Chahal highlighted, sharing that Granite 3.0 can even run on a Mac. This flexibility allows businesses to scale as their needs grow without initially needing extensive GPU clusters.

The open-source philosophy extends beyond Granite’s datasets to its availability on platforms like GitHub, Hugging Face, and IBM WatsonX. IBM’s AI alliance with partners such as Meta, LlamaIndex and Ollama further broadens accessibility for developers worldwide.

Open Source–The Backbone of Granite’s Success in India 

India’s developer ecosystem has embraced Granite 3.0 wholeheartedly. Thousands of developers are already leveraging it, drawn by its performance and versatility. Beyond individual developers, significant collaborations are happening at the institutional level. “India’s developer ecosystem is vibrant and growing,” Chahal said, “and we are committed to supporting it every step of the way.”

“Institutions like IIT Madras, IIT Bombay, and IIT Jodhpur have become part of IBM’s AI alliance,” Chahal shared. These institutions, along with initiatives like AI4Bharat, are using Granite to create benchmarks for Indian languages like MILU, which was released last month. 

Major players like Infosys, KissanAI, Wadhwani AI, and Sarvam AI have also integrated Granite into their AI ecosystems, demonstrating its relevance to startups and enterprises alike. This collaboration is fostering innovation tailored to India’s unique needs, from agriculture to vernacular language processing.

Chahal explained that many startups are using Granite alongside other models, such as Meta’s Llama. “Granite is exceptional for tasks like coding instructions,” he said, “but a multi-model approach ensures startups can pick the best model for each use case.” IBM’s WatsonX platform facilitates this by hosting both IBM’s and partner models, allowing developers to choose the most suitable tools for their needs.

Chahal said that IBM plans to release a multimodal model soon, which will include speech and video models.

Beyond just language models, at its recent TechXchange conference, IBM unveiled advancements in AI agentic capabilities through frameworks like the Bee Agent and MARC (Multi-Agent Resource Coordination) frameworks. “We are officially in the agentic world,” Chahal declared.

IBM offers multiple ways to build agents, from WatsonX and Watson Orchestrate to WatsonX Flows, catering to diverse business requirements. “Soon, we will see a multi-agent world, where hundreds of agents work collaboratively across enterprises,” he predicted.

Meanwhile, IBM’s Bee Agent Framework is designed to assist developers in creating powerful agents with minimal adjustments to existing implementations, particularly as it actively works to optimise for other popular LLMs.

The framework includes key features for building versatile agents. Its built-in Bee agent, configured for Llama 3.1, is ready to use, though users can also customise their own agents using built-in tools in JavaScript or Python. 

Supporting the Developers

“For developers, trust is paramount. A model should transition seamlessly across the development, testing, and production environments without requiring constant adjustments,” said Chahal, adding that the ability to retrain models efficiently is also critical for developers. 

Retraining should not require duplicating models and consuming excessive resources. 

At IBM Red Hat’s InstructLab, the team has developed an open-source tool that allows developers to retrain models using a single GPU cluster. By adding knowledge layers tailored to industry-specific needs, this tool eliminates the need for multiple copies and streamlines the retraining process. For developers, this solves a significant challenge: managing infrastructure while scaling capabilities.

“We aim to lower the barriers for developers by providing access to smaller, efficient models that work even on a single GPU or CPU cluster,” said Chahal, while adding that beyond AI models, developers also need robust data tools and pipelines. This is where IBM’s research software and systems labs collaborate with colleges and universities to share these tools and empower developers with end-to-end capabilities.

In July, IBM held an International GenAI Conclave in Kochi, where college students competed in a hackathon, and winners were recognised by dignitaries. IBM also hired some of these developers, demonstrating its commitment to nurturing talent.

The company also launched a GenAI Innovation Centre in Kochi, which allows colleges, universities, startups, and partners to experiment with business use cases and learn how GenAI is applied in real-world scenarios. This centre will soon be replicated in other locations, creating open zones for innovation.

“Building in India, for India, Bharat, and the world,” Chahal said, adding that IBM wants Indian developers to aspire to have a global impact. With IBM’s support, they can create solutions that not only address local challenges but also have global relevance.

The post Granite 3.0 is IBM’s Love Letter to Indian Developers appeared first on Analytics India Magazine.

]]>
Interview with Amit Kapur (How Lowe’s Improved OpenAI’s Models) https://analyticsindiamag.com/videos/interview-with-amit-kapur-how-lowes-improved-openais-models/ Wed, 20 Nov 2024 12:48:25 +0000 https://analyticsindiamag.com/?p=10141315

The post Interview with Amit Kapur (How Lowe’s Improved OpenAI’s Models) appeared first on Analytics India Magazine.

]]>
Learn from Amit Kapur how Lowe’s partnership with OpenAI fuelled an AI-driven transformation in the retail industry.

The post Interview with Amit Kapur (How Lowe’s Improved OpenAI’s Models) appeared first on Analytics India Magazine.

]]>
Why to Build Indic AI with Tanuj Bhojwani (PeoplePlusAI head) and Vishnu Subramanian (Founder of JarvislabsAI) https://analyticsindiamag.com/videos/why-to-build-indic-ai-with-tanuj-bhojwani-peopleplusai-head-and-vishnu-subramanian-founder-of-jarvislabsai/ Wed, 20 Nov 2024 12:34:43 +0000 https://analyticsindiamag.com/?p=10141312

Meet Tanuj Bhojwani (PeoplePlusAI head) and Vishnu Subramanian (Founder of JarvislabsAI) to learn why Indic AI should be built and why it matters for Bharat.

The post Why to Build Indic AI with Tanuj Bhojwani (PeoplePlusAI head) and Vishnu Subramanian (Founder of JarvislabsAI) appeared first on Analytics India Magazine.

]]>
Meet Tanuj Bhojwani (PeoplePlusAI head) and Vishnu Subramanian (Founder of JarvislabsAI) to learn why Indic AI should be built and why it matters for Bharat.

The post Why to Build Indic AI with Tanuj Bhojwani (PeoplePlusAI head) and Vishnu Subramanian (Founder of JarvislabsAI) appeared first on Analytics India Magazine.

]]>
What is Nikhil Malhotra’s Dream AI? https://analyticsindiamag.com/ai-features/what-is-nikhil-malhotras-dream-ai/ Thu, 14 Nov 2024 08:48:48 +0000 https://analyticsindiamag.com/?p=10140945

Dream AI aspires to create agents that “dream” by simulating environments and learning with a nuanced understanding of context, instead of relying on vast datasets alone.

The post What is Nikhil Malhotra’s Dream AI? appeared first on Analytics India Magazine.

]]>

AI is a compute-hungry beast, pushing companies to make better hardware and experiment with the existing system architectures to ease the load. In this bid, Nikhil Malhotra, the global head of Makers Lab at Tech Mahindra, coined the idea of ‘Dream AI’.

Deep Reinforced Engagement Model AI, or Dream AI, as Malhotra explains, is an architecture combining symbolic AI with deep reinforcement learning, marking a shift from conventional models and addressing fundamental limitations of today’s AI. In his LinkedIn post, Malhotra pointed out three problems with current AI systems:

1. AI of today is just pattern recognition in a narrow domain. It’s not generalised.
2. Autoregressive LLMs are divergent. If they take on a state space, it’s very difficult to bring them back.
3. They do well to pass the Turing test but have no reasoning or context of what they say.

Enter Dream AI

Highlighting the core challenge of current systems, Malhotra explained to AIM that Dream AI builds on a neurosymbolic architecture, drawing from two foundational schools in AI—connectionist (or deep learning) and symbolic (logic and symbols). 

Over time, deep learning architectures like Transformers have gained traction for their remarkable performance. However, as Malhotra explained, they do not inherently understand or reason; they excel at remembering sequences rather than forming world models. 

For instance, a Transformer model might present varying responses to slight prompt changes, which can lead to hallucinations—a symptom of shallow context alignment. 

This is where Dream AI steps in, creating a dual-loop system where symbolic reasoning informs world models while neural networks enable actions within those models. By incorporating symbolic AI, the framework empowers agents to simulate and act based on physical and logical rules, much like human cognition. 

“Symbolism helps us build world models, and Transformers enable actions based on those models,” Malhotra said. Dream AI, thus, aspires to create agents that “dream” by simulating environments and learning with a nuanced understanding of context, instead of relying on vast datasets alone.

The Role of RL in Dream AI

Speaking at Cypher 2024, Malhotra shared that his key research goal is to make AI less compute-intensive. 

He calls this the ‘min-max regret model’, which shifts away from traditional reward models and aims to empower AI to “dream” about its capabilities and understand its existence more profoundly. Malhotra said that this “dreaming model” allows AI to contemplate its own questions and aspirations. 

“Can you dream about yourself? Once you develop your dream, now come back to life and start with the life that you have,” he said, highlighting the physical aspect of existence and how it relates to AI. Drawing parallels between human cognition and AI functionality, Malhotra said that just as humans subconsciously store information in the hippocampus, AI systems will use their ‘memory’ to inform decision-making processes. 

Central to Dream AI is its use of reinforcement learning (RL) to harmonise symbolic reasoning with neural actions. This learning process allows AI agents to interact within simulated environments, learning optimal behaviours through feedback. Notably, OpenAI also moved away from RLHF and towards RL to improve the reasoning capabilities of o1.

In this hybrid architecture, reinforcement learning serves as the bridge between symbolic world models—built through simulation engines like the NVIDIA Omniverse—and the actions guided by deep neural networks. As Malhotra noted, “The AI doesn’t merely repeat learned patterns; it refines its decision-making process based on the impact of its actions within a realistic context.”

Likening it to cycling—an instinctual skill honed from childhood that remains stored in our subconscious—he said, “A lot of the data that you collect and a lot of your information still resides at the back of the hippocampus. As a result, you pull out that memory when you have to cycle.”

Efficient and Scalable AI Training

The dual learning approach makes Dream AI a powerful solution for dynamic, complex environments, enabling agents to learn contextually rather than reactively. This is particularly useful in applications where conventional AI struggles, such as real-world robotics, autonomous systems, and other domains where both physical laws and logical reasoning are essential.

By incorporating symbolic models, Dream AI aims to reduce the training burden on AI systems. Rather than relying on enormous datasets, Dream AI uses symbolic structures to “dream” or simulate scenarios, effectively minimising repetitive data input. 

Symbolic AI provides a contextual foundation, accelerating learning and reducing the dependency on real-world data for every scenario. This process not only shortens the training time but also yields models that are resilient and adaptable, allowing for faster deployment and reducing costs significantly.

Malhotra is not alone in this view. Amit Sheth, the chair and founding director of the AIISC, told AIM, “The government is focused on AI for health, cybersecurity, and education as three of the top application areas.”

That’s what Sheth conveyed to the ministry and Prime Minister Narendra Modi regarding the areas of AI India should prioritise for future investment. While generative AI has caught everyone’s attention, he outlined the relevance of neurosymbolic AI, which he believes will drive the third phase of AI.

A Leap Beyond Traditional AI Architectures

Malhotra’s Dream AI architecture offers multiple advantages over traditional AI models. Its integration of symbolic reasoning with deep learning enables a level of adaptability and contextual awareness often missing in autoregressive models. 

Traditional systems are confined to specific tasks, lacking a broader understanding of real-world contexts. Dream AI, however, allows agents to simulate world views, thereby aligning their actions with physical and logical principles. The dual-learning loop—symbolic reasoning paired with neural-driven action—fosters a self-reinforcing cycle of refinement and understanding.

By reducing reliance on extensive real-world data, Dream AI makes training both more efficient and scalable.

The post What is Nikhil Malhotra’s Dream AI? appeared first on Analytics India Magazine.

]]>
Avathon’s Industrial AI to Power Petrol Pumps, Planes https://analyticsindiamag.com/ai-features/avathons-industrial-ai-to-power-petrol-pumps-planes/ Mon, 11 Nov 2024 06:55:27 +0000 https://analyticsindiamag.com/?p=10140743

The company claims that its AI platform now powers 20% of India’s petrol pumps and ensures safety at over 17,000 retail outlets across the country.

The post Avathon’s Industrial AI to Power Petrol Pumps, Planes appeared first on Analytics India Magazine.

]]>

Avathon, an industrial AI company formerly known as SparkCognition, has rebranded with an ambitious plan to triple its workforce in India within two years and transform industrial AI in sectors such as energy, aviation, and supply chain management.

This rebranding underscores Avathon’s focus on advancing legacy infrastructure through AI, turning traditional systems into autonomous, sustainable, and resilient ecosystems. With over $100 trillion of ageing infrastructure globally under strain from supply chain disruptions, workforce shortages, and escalating security threats, Avathon is poised to tackle these challenges as a leader in industrial AI solutions.

The company claims that its AI platform now powers 20% of India’s petrol pumps and ensures safety at over 17,000 retail outlets nationwide. The company is strengthening its India presence to attract premier AI and engineering talent.

Furthermore, Avathon’s AI is contracted to enhance safety at major oil and gas facilities in India, including 83 airport terminals and 15 airport fueling stations. With key partnerships with technology leaders like NVIDIA and Qualcomm, Avathon continues to deliver cost-efficient products tailored to client needs. 

In line with this expansive mission, Avathon’s digital twin technology has been designed to operate beyond isolated factory floors, encompassing the entire supply chain. Pervinder Johar, CEO of Avathon, compared the platform to NVIDIA’s Omniverse. “While many digital twins focus solely on individual factories, our approach includes every step of the supply chain to optimise quality control and problem-solving,” he told AIM.

“We’re bridging the physical and computational aspects of AI,” Johar elaborated. Avathon is now positioned as a robust platform that integrates mechanical engineering advancements with AI innovations, a concept Johar believes is becoming increasingly relevant as industries shift toward intelligent, autonomous systems.

Digital Twin for Aviation Industry

Avathon’s digital twin solution addresses quality inspection issues not only at the factory level but also extends to tier-one suppliers and beyond. By deploying computer vision and other AI techniques, Avathon helps clients detect defects in aeroplane or car components early in the supply chain, avoiding costly recalls and logistical challenges. “If a part has a flaw, identifying it before it even reaches the assembly line saves time, resources, and ultimately, customer satisfaction,” Johar noted.

The expertise of Kunal Kislay, India president at Avathon, significantly strengthens Avathon’s computer vision and low-code AI platform. Kislay joined the company following the acquisition of his company, Integration Wizards. With a rich background in computer vision and AI engineering, Kislay now helps the company scale AI-driven industrial applications.

Johar shared broader industry insights and examples of Avathon’s impact. He illustrated how clients such as HPCL and IOCL leverage Avathon’s platform for everything from safety monitoring to productivity improvements. For HPCL, Avathon’s computer vision monitors security and safety on-site, identifying potential hazards before they occur. 

Highlighting the economic impact, Johar estimated that Avathon’s solutions have the potential to influence trillions of dollars in asset value across industries. “Just in aviation, our AI can provide predictive maintenance for fleets, allowing airlines to extend the life of their planes, which represents billions in capital investment savings,” he explained.

Even with a marginal cost improvement of 10%, clients can see significant returns on their investments. In sectors like aviation, this could mean substantial operational savings and improved efficiency for airlines managing vast fleets.

AI as a Copilot

Avathon’s emphasis on collaborative AI solutions extends to predictive and prescriptive maintenance for complex assets. The company’s AI copilots provide essential guidance, even for technicians unfamiliar with specific machinery. This capability is particularly valuable in industries like aerospace, where technicians may only specialise in certain types of equipment. 

“Parts may be interchangeable between Boeing and Airbus planes, but often mechanics aren’t trained across both,” Johar pointed out. “Our AI copilots step in to bridge that knowledge gap, saving time and avoiding the need to fly specialists worldwide.”

Looking towards the future, Johar envisions an AI-driven evolution across industries, particularly with the advent of humanoids and physical AI in manufacturing. He referenced the work of Avathon’s joint venture with Boeing, SkyGrid, which focuses on autonomous air traffic control for a future filled with autonomous aircraft. 

“As air traffic grows, we need a system that can manage the skies without relying on human controllers,” he explained. Similarly, Avathon is working toward creating autonomous supply chain planning systems that not only support human planners but could potentially automate decision-making processes entirely, as the global supply of mechanical engineers shrinks.

Sparking an Idea

Originally founded as SparkCognition 11 years ago, Avathon emerged from the University of Texas at Austin, where it began as a niche AI project led by a UT Austin PhD student and Dr. Bruce Porter, the university’s former computer science dean.

“Back then, AI was just beginning to gain traction. We were ahead of the curve, focusing on applying AI to real-world problems long before AI became the trend it is today.” This foundation in academia drove a decade of innovation, especially for large-scale Fortune 500 clients, as Avathon carved its niche in the AI-driven infrastructure space.

Over time, Johar realised that the company’s original mission— ‘sparking an idea’ —was evolving into a more comprehensive vision. With a focus on building a sustainable platform that would serve industries for decades, the company rebranded to Avathon, a name Johar explained as stemming from two Greek words that mean ‘to bind together’.

The post Avathon’s Industrial AI to Power Petrol Pumps, Planes appeared first on Analytics India Magazine.

]]>
Why India Needs a Quantum Mission https://analyticsindiamag.com/ai-features/why-india-needs-a-quantum-mission/ Sat, 09 Nov 2024 03:30:00 +0000 https://analyticsindiamag.com/?p=10140722

“The US has banned the export of quantum. They don't want to share the technology, so that's the kind of message that we are getting from many countries, that nobody really wants to share,” said Ajai Chowdhry, co-founder of HCL and Chairman-Mission Governing Board of National Quantum Mission.

The post Why India Needs a Quantum Mission appeared first on Analytics India Magazine.

]]>

In April 2023, India inaugurated its ambitious National Quantum Mission (NQM), representing a critical step forward in the nation’s journey towards harnessing quantum technology. With a target budget of ₹6,000 crore and an eight-year timeline, the mission aims to establish India as a global leader in quantum technology under the leadership of Ajai Chowdhry, co-founder of HCL and the founder and chairman of EPIC Foundation. 

“When it comes to quantum computing, we need to make a quantum computer of 1,000 qubits over the next eight years. With respect to communication, we are supposed to have on-ground communication of 2,500 km—quantum communication—and 2,500 km in space. This way, the targets are well set for each of the areas,” said Chowdhry in an exclusive interaction with AIM.

Establishing India’s Quantum Framework

By January 2024, Chowdhry and his team had established the quantum mission’s framework, beginning with a request for proposals (RFP) to attract expertise and innovative ideas across the quantum spectrum.

“We requested proposals, and within a week of our board meeting, we had already set up calls for proposals,” said the Padma Bhushan awardee. The NQM received an overwhelming response, with 385 applications submitted for review. By August, 84 proposals had been approved, covering four primary areas: quantum computing, quantum communication, sensors, and materials science.

Ambitious and Aggressive Targets 

Chowdhry referred to the mission’s goals as “very tough”, while emphasising that such high targets are essential to placing India on the quantum technology map. While goals are one part of the mission, the funding has been strategically distributed.

“It was decided that ₹4,000 crore will be directly with the National Quantum Mission and ₹2,000 crore will be spent by others, which include the departments of space, defence, and atomic energy. So, it’s a ₹6,000-crore plan,” he said.

The Need for Quantum

A critical challenge identified by Chowdhry is the need to shift researchers’ focus from academic publications to practical applications. Historically, researchers in India, much like other countries, have prioritised publishing papers over product development. NQM’s approach requires a shift to a goal-oriented mindset that aligns with technological targets rather than purely theoretical research.

“A lot of these large corporates will need quantum technology four to five years from now,” said Chowdhry. “For drug development, there’s nothing better than a quantum computer. It’ll dramatically reduce the time for drug discovery and the cost of drugs, so Indian pharma companies can go far ahead if they start thinking about quantum technologies,” he added.

Chowdhry emphasised the urgency of making India quantum-ready, especially in terms of cybersecurity. As quantum technology advances, traditional encryption methods could become obsolete, making sensitive data vulnerable to breaches. 

“When a quantum computer comes up that can crack current cybersecurity, you will be completely open to an attack, and this can happen maybe four years, five years, six years,” cautioned Chowdhry. “At a time when a powerful quantum computer is ready, it can crack your financial systems, it can crack your security systems.” 

Startup and International Collaboration

Chowdhry believes in the importance of involving startups in the quantum mission, recognising the potential they bring for rapid innovation and commercialisation. He explained that the NQM has already reached out to approximately 45-50 startups and created a groundbreaking startup policy, which grants up to ₹25 crore per startup, significantly higher than the usual government grants. 

“This is absolutely unique in the country. For the first time, we are giving startups up to ₹25 crore as support,” he said, noting that such backing is crucial considering the expensive and resource-intensive nature of quantum research.

Chowdhry also emphasised the promising nature of the Indian startup ecosystem. “There are already about three or four startups that are commercially selling their products, with one having a subsidiary in America,” he said. 

While collaboration with other nations might seem a logical path to accelerate India’s quantum mission, Chowdhry highlighted the limitations of this move. Countries with significant advancements in quantum, like the US, are often reluctant to share details about core technologies due to concerns over security and competition. 

“It’s nice to talk, but difficult to execute. The US has banned the export of quantum. They don’t want to share the technology, so that’s the kind of message that we are getting from many countries, that nobody really wants to share,” he quipped. 

India is negotiating alternative collaboration models, such as joint intellectual property (IP) development. For instance, Chowdhry proposed inviting foreign companies that have achieved intermediate quantum advancements to set up operations in India. This approach could enable India to build on existing innovations and leapfrog into more advanced stages of quantum technology development.

Quantum, AI and India 

In the long term, Chowdhry envisions a synergistic relationship between quantum computing and AI. While quantum computers are not yet advanced enough to significantly enhance AI applications, Chowdhry is optimistic about the progress of this combination in the upcoming years. 

“AI and quantum are going to be a deadly combination,” he said, forecasting that advancements in quantum computing will eventually transform AI capabilities.

Reflecting on his journey from co-founding HCL to spearheading the quantum mission, Chowdhry, commonly referred to as the ‘Father of Indian Hardware’, expressed confidence in India’s potential to lead in technology. He stressed that while software services have fuelled India’s growth over the past two decades, the time has come for India to focus on products.

“If we don’t shift towards a product-oriented mindset, we’ll never be a developed country,” he asserted. “Tomorrow’s people should be in India. This is a growth country,” he concluded. 

Ajai Chowdhry’s vision for NQM underscores India’s aspiration to become a quantum powerhouse, positioning the country among the leaders in one of the most transformative technological fields of the future. 

The post Why India Needs a Quantum Mission appeared first on Analytics India Magazine.

]]>
Lenovo’s ‘Custom-Fit’ Data Centres Pushes Against Competitors  https://analyticsindiamag.com/ai-features/lenovos-custom-fit-data-centres-pushes-against-competitors/ Wed, 30 Oct 2024 12:38:17 +0000 https://analyticsindiamag.com/?p=10139860

“We’re building exascale systems designed to be accessible to any customer, regardless of their data centre setup or size,” said Scott Tease, VP and GM of AI and HPC at Lenovo.

The post Lenovo’s ‘Custom-Fit’ Data Centres Pushes Against Competitors  appeared first on Analytics India Magazine.

]]>

At the recently held Lenovo Tech World 2024 event, the company showcased its concept of ‘Hybrid AI’, encompassing products and launches focused on personal AI, enterprise AI, and public AI. While Lenovo’s PCs and AI buddies drew much of the attention, the company’s continued push in the infrastructure domain was undoubtedly the highlight.

Scott Tease, VP and GM of AI and High-Performance Computing (HPC) at Lenovo, explained what sets Lenovo’s HPC solutions apart from competitors like Dell. “We’re building exascale systems designed to be accessible to any customer, regardless of their data centre setup or size,” he said while interacting with AIM at the sidelines of Tech World in Seattle. 

Data Centre for All-Scale Business

Unlike competitors whose HPC solutions often require specialised facilities, Lenovo’s systems are engineered to fit into standard data centre racks and operate with standard power requirements. Tease emphasised that Lenovo’s HPC technology is designed for flexibility, with options for both air and water cooling and open interconnects, making it ideal for regular data centres.

“Our competitors might offer the same building blocks, but they’re often custom-designed for very large systems,” he explained when asked about HPC solutions offered by competitors such as Dell and HPE. 

Lenovo’s approach prioritises accessibility without the need for costly infrastructure changes like reinforced floors or oversized entryways. Tease explained that, with Lenovo’s technology, there is no need to rebuild data centres or reinforce floors, as the equipment can fit through a standard 2-metre door or freight elevator. In contrast, competing solutions involve 4,000-kilogram racks that are 2.5 metres wide, exceeding typical floor support capabilities.

“We’re designing exascale for every scale,” he added, underscoring Lenovo’s mission to democratise high-end HPC technology.

Sustainable Power to Run the Show

While offering HPC as a major service, Lenovo is also managing to hit sustainability goals. Tease emphasised the importance of reducing power consumption to achieve both Lenovo’s and its clients’ ESG targets. “80% of a device’s carbon impact comes from power usage,” he said, underscoring Lenovo’s focus on lowering energy use to reduce overall emissions.

Lenovo’s sixth-generation Neptune cooling system, the ThinkSystem N1380 Neptune, unveiled at Tech World, is a water-cooling system built to support NVIDIA’s Blackwell platform and AI applications. The system is said to support server racks of 100 kW and above while achieving 100% heat removal.

“If one of these large language model nodes costs you 10 kilowatts to run the compute itself, you’re going to spend another 4 kilowatts just for air conditioning. With Neptune, we can do away with all that air movement. We can do away with all the air conditioning and the air movement in the data centre,” said Tease, adding that this could easily save 30-40% of the power bill.

Tease explained that twelve years ago, Lenovo’s first server node consumed 330 watts of power. Today, the new Grace Blackwell super node uses around 14 times more power but offers nearly 1,000 times the performance of those early systems. This focus on increasing performance per watt has become essential to supporting customers.
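
The arithmetic behind these figures is simple. The sketch below uses the 10 kW compute and 4 kW air-conditioning numbers from Tease’s example; the residual pump power of a liquid loop is an assumed figure for illustration, not from the interview:

```python
# Rough arithmetic behind the quoted numbers: 10 kW of compute plus
# ~4 kW of air conditioning per node when air-cooled.

compute_kw = 10.0
air_cooling_kw = 4.0
liquid_pump_kw = 0.5   # assumption: small residual overhead for a liquid loop

air_total = compute_kw + air_cooling_kw
liquid_total = compute_kw + liquid_pump_kw
saving = 1 - liquid_total / air_total
print(f"power saved by liquid cooling: {saving:.0%}")  # prints "25%" here;
# higher cooling overheads push this toward the 30-40% Tease cites

# Performance per watt: ~14x the power for ~1,000x the performance
perf_per_watt_gain = 1000 / 14
print(f"performance-per-watt gain: ~{perf_per_watt_gain:.0f}x")  # ~71x
```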

Lenovo’s Neptune liquid cooling system aims to help clients rethink data centre design by addressing not only the power needed to operate AI but also the substantial energy required for cooling.

Cooling systems have become the need of the hour, with a number of big tech companies partnering with cooling system providers. For instance, NVIDIA is partnering not just with Lenovo but also with Super Micro Computer for liquid cooling. NVIDIA has also announced a partnership with Elon Musk’s xAI to power Colossus, the world’s largest AI supercomputer.

HPC in India

“For HPC, you have some very big clusters going on in India, a lot of research, a lot of great universities, and we want to be a part of every one of those we possibly can,” Tease said.

Tease stated that the deployment of IT technologies in India is tailored to meet specific regional requirements. “India is like a hotbed of AI startups,” he said. He also committed to identifying and supporting the most talented Indian startups, not only within India but also on a broader scale. 

Lenovo’s Indian ambitions have been steadily growing, fuelling its hyperscaler push in the country. The company has also announced a full-stack portfolio that is completely ‘Made in India’.

Last month, Lenovo inaugurated its state-of-the-art research and development lab in Bengaluru, making it the fourth city to host an infrastructure R&D lab globally. 

The post Lenovo’s ‘Custom-Fit’ Data Centres Pushes Against Competitors  appeared first on Analytics India Magazine.

]]>
Why India Needs People+ai https://analyticsindiamag.com/ai-features/why-india-needs-peopleai/ Fri, 25 Oct 2024 12:00:00 +0000 https://analyticsindiamag.com/?p=10139413

“If you look at how much an AI solution could mean to a user, it’s much higher in India with a much larger volume,” said Tanuj Bhojwani.

The post Why India Needs People+ai appeared first on Analytics India Magazine.

]]>

When we visited People+ai a couple of weeks ago, we had no idea we would walk into a space filled with AI engineers, volunteers and researchers, and of course, walls adorned with AI-generated art, all in line with the philosophy of ‘What can (AI) do for you?’ — straight from Tanuj Bhojwani’s LinkedIn tagline. 

Under his leadership, People+ai is championing AI for India and addressing real challenges at scale with the aim to accelerate through various initiatives and government partnerships. As he put it, “We have a seat at the table… not just in building models but in creating the world’s largest use cases for AI where the difference in people’s lives is tangible”.

People+ai was born out of Nandan Nilekani, Rohini Nilekani, and Shankar Maruwada’s EkStep Foundation in June last year. This community-driven effort aims to harness AI capabilities to help a billion people. It is led by Bhojwani, who has previously been an investor, a consultant, and a co-founder.

Bhojwani explained the importance of building Indic language models and their use cases within the country. “If I put a gun to your head and tell you that you cannot use Google from next month, you will maybe push back for one or two months but then start coughing up.” Bhojwani said that everyone is accustomed to the idea of going to this magic box, typing what they think, and getting the results.

But according to him, most of the Indian population is yet to experience that type of internet because of the country’s low literacy rate. These same people will access the new internet multimodally, through voice and by pointing the camera at things. This is why it is important to take AI to the grassroots level of the country, and to do it the way India does things – frugally.

People+ai is currently working on projects like Jan Ki Baat, Sthaan, and Open Cloud Compute (OCC) to make this a reality.

“I think there are problems that are going to be unique for India. Who is going to solve that?” Bhojwani asked rhetorically, explaining that copying the West for ideas is not the right way forward. He added that the behemoth Amazon has been shaken by Zepto, Instamart, and Blinkit – the same should apply to AI, rather than playing catch-up with the West.

Nilekani, the co-founder and non-executive chairman of Infosys, previously noted that just as India benefited hugely from Digital Public Infrastructure (DPI), Aadhaar, and UPI, AI holds similar transformative potential. “AI is a very powerful technology, but it’s a technology like any other,” said Nilekani, adding that AI needs to be used with appropriate safety and guardrails.

This is the exact principle that People+ai works on. “When it comes to the infrastructural pieces, which involves DPI and policy, we step in,” said Bhojwani.

India as the AI Use Case Capital of the World

Building AI in India is a game that’s very different from that in the West. “If you look at how much an AI solution could mean to a user, it’s much higher in India with a much larger volume,” he added. In the West, it’s about acquiring enterprise customers willing to spend millions of dollars. 

But in India, it’s a high-volume, low-value game, where the AI users would not be paying so much. These people would be more comfortable using AI in their own native languages. This solution is at the population scale. For India to flourish in AI, models must be created that understand India’s linguistic nuances and cultural complexities. 

Bhojwani had earlier told AIM that the foundation had been exploring the benefits of AI for quite some time. However, it was only in the past year that they decided to establish a specialised unit focused on extensively exploring AI use cases.

He believes that in a decade, India will be the AI use case capital of the world because of its huge population, diversity, languages and the need for specialised AI tools, be it in healthcare, education or any other field. This is exactly the idea Nilekani puts forward with ‘Adbhut India’.

While enterprises and other companies are pushing the idea of building models and acquiring big customers, it is also essential to make AI reach the roots of the country, which is what AI for Bharat stands for. People+ai is involved with stakeholders to determine if better tools and resources can be developed. “For this, we are collaborating and talking to different organisations, some in Africa, because they are also facing similar problems,” Bhojwani said.

AI for Bharat

If India is going to be the AI use case capital of the world, a substantial compute infrastructure is imperative. People+ai is actively working on developing an open compute infrastructure network to meet the rising demand for compute while promoting market competitiveness.

The idea here is to help small businesses based in Tier 2 or Tier 3 cities access computing infrastructure for training or inferencing at a lower cost compared to leveraging services from the likes of AWS, Google or Azure, which could prove to be costly. 

People+ai is already working with Indian computing service providers E2E Networks, Jarvis AI Labs, NeevCloud, and Vigyan Labs and the idea is to create a network of micro data centres with interoperable standards, allowing small businesses and startups to easily plug and play, based on their specific requirements.

“However, the biggest stumbling block right now is the lack of understanding regarding the potential use cases of AI. While building chatbots is undoubtedly crucial, there is a significant amount of work that still needs to be undertaken, especially in the context of India,” Bhojwani said.

“If I had to choose one of them first, I would choose the use cases,” said Bhojwani, while explaining that every company in the world is increasingly getting interested in Indian AI models, be it OpenAI, Google, or Meta. The moat usually stands in building AI use cases as it is easier to go up that supply chain, rather than going down.

OpenAI did not build GPT-4 on day one, it took them years and several iterations to reach that level. The same would go for Indic language models built by Indian AI companies. “Being one or two generations behind the SOTA models is still good enough.” 

The long-term vision of all AI companies is the same—to have indigenously developed AI models. Given the network and access to resources that the Indian AI companies have right now, the next best move would be to build a GPT-2 level model instead of competing with the West.

Building models is getting cheaper, and catching up with SOTA a few years later would be astronomically cheaper. “What is the hurry?” asked Bhojwani, explaining that it is better to first solidify a market that could sustain the models. “For a constrained set of resources, where would you rather apply them?” he asked, adding that it is good to build models, but if one had to pick what to do first, defining the use cases is more important.

The post Why India Needs People+ai appeared first on Analytics India Magazine.

]]>
The Journey From Startups to IPO By Neha Singh – Co-Founder and CEO of Tracxn Technology https://analyticsindiamag.com/videos/the-journey-from-startups-to-ipo-by-neha-singh-co-founder-and-ceo-of-tracxn-technology/ Fri, 25 Oct 2024 04:28:21 +0000 https://analyticsindiamag.com/?p=10139358

Learn how Tracxn Technologies went from a startup to IPO in just 10 years with Neha Singh, the CEO and Co-Founder of Tracxn.

The post The Journey From Startups to IPO By Neha Singh – Co-Founder and CEO of Tracxn Technology appeared first on Analytics India Magazine.

]]>
Learn how Tracxn Technologies went from a startup to IPO in just 10 years with Neha Singh, the CEO and Co-Founder of Tracxn.

The post The Journey From Startups to IPO By Neha Singh – Co-Founder and CEO of Tracxn Technology appeared first on Analytics India Magazine.

]]>
Interview with Tushar Vashisht, CEO & Co-founder of HealthifyMe‬ https://analyticsindiamag.com/videos/interview-with-tushar-vashisht-ceo-co-founder-of-healthifyme/ Wed, 23 Oct 2024 05:19:59 +0000 https://analyticsindiamag.com/?p=10139151

The post Interview with Tushar Vashisht, CEO & Co-founder of HealthifyMe‬ appeared first on Analytics India Magazine.

]]>
Meet Ria, Healthify’s in-house AI, and learn how OpenAI integration is transforming user experience with Tushar Vashisht, CEO & Co-founder of HealthifyMe.

The post Interview with Tushar Vashisht, CEO & Co-founder of HealthifyMe‬ appeared first on Analytics India Magazine.

]]>
Raj Verma, CEO, Singlestore – The Database Market in India is $120 Billion https://analyticsindiamag.com/videos/raj-verma-ceo-singlestore-the-database-market-in-india-is-120-billion/ Wed, 09 Oct 2024 05:50:03 +0000 https://analyticsindiamag.com/?p=10137910

Watch Raj Verma, CEO, Singlestore, talk about the company's journey and his views on the Indian database market.

The post Raj Verma, CEO, Singlestore – The Database Market in India is $120 Billion appeared first on Analytics India Magazine.

]]>

What is the Database Market?

The database market encompasses the industry focused on developing, distributing, and managing database management systems (DBMS) and related services. It includes various types of databases, such as relational (RDBMS), NoSQL, NewSQL, and cloud databases. Key players in the database market include Oracle, Microsoft, AWS, and MongoDB, with demand driven by the exponential growth of data, big data analytics, and the shift toward cloud-based solutions for scalability and flexibility.

The post Raj Verma, CEO, Singlestore – The Database Market in India is $120 Billion appeared first on Analytics India Magazine.

]]>
Mukund Raghunath – Breaking Down Entrepreneurship, Data Explosion, and the GenAI Revolution https://analyticsindiamag.com/videos/mukund-raghunath-breaking-down-entrepreneurship-data-explosion-and-the-genai-revolution/ Mon, 07 Oct 2024 12:01:26 +0000 https://analyticsindiamag.com/?p=10137735

Meet Mukund Raghunath, Founder and CEO of Acies Global, sharing insights on the evolving landscape of data science and engineering.

The post Mukund Raghunath – Breaking Down Entrepreneurship, Data Explosion, and the GenAI Revolution appeared first on Analytics India Magazine.

]]>

]]>
Startups are all About Solving Problems that were not Solved Before https://analyticsindiamag.com/videos/startups-are-all-about-solving-problems-that-were-not-solved-before/ Mon, 07 Oct 2024 05:18:11 +0000 https://analyticsindiamag.com/?p=10137675

Explore the impact of AI in shaping the future of analytics with Kishore Gopalakrishna, Co-Founder and CEO at StarTree.

The post Startups are all About Solving Problems that were not Solved Before appeared first on Analytics India Magazine.

]]>

]]>
The Rise of Autonomous AI at Kyndryl https://analyticsindiamag.com/ai-features/the-rise-of-autonomous-ai-at-kyndryl/ Tue, 17 Sep 2024 04:30:00 +0000 https://analyticsindiamag.com/?p=10135574

"We believe this area is going to grow and is an important aspect for us," said Sreekrishnan Venkateswaran, CTO of Kyndryl India.

The post The Rise of Autonomous AI at Kyndryl appeared first on Analytics India Magazine.

]]>

Tech leaders and experts have been vocal in predicting that AI agents will be the next big thing. Meta CEO Mark Zuckerberg recently said there could be more AI agents in the world than humans. Google is also going big on agents, with its senior developers dropping subtle hints about how AI assistants and agents will be a game-changer. 

A tech player who is already going big on this is Kyndryl.

Spun off from IBM in 2021, Kyndryl, the world’s largest IT infrastructure services provider serving various sectors including cloud, security, network, enterprise, and applications, has been investing heavily in AI too.

“We have a lot of customers and our customers are across all verticals. They are seeking ways to ingest AI into their environments and then elicit advantage out of it,” said Sreekrishnan Venkateswaran, CTO of Kyndryl India in an exclusive interaction with AIM. He also noted that the majority of companies are using GenAI in production, not just for prototyping.

Speaking about the future of AI applications, Venkateswaran emphasised how agentic AI will bring a significant shift in how things work. 

“We often use Agentic AI when mixed problem-solving approaches are required. For example, in customer engagement, the AI system directs each query to the most suitable agent based on its content. One agent might perform retrieval-augmented generation from an internal corpus, while another accesses real-time data through external APIs,” said Venkateswaran. 

The CTO explained that their collaborative workflow, where multiple agents tackle different aspects of a problem based on the query, is a prime example of Agentic AI. He emphasised that they leverage these Agentic AI patterns to effectively address several practical challenges faced by their customers.
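The routing pattern Venkateswaran describes, where each query goes to the most suitable agent, can be sketched as a minimal dispatcher. Everything below (the keyword heuristic, the agent names, the return strings) is an illustrative assumption, not Kyndryl's actual implementation:

```python
# Illustrative sketch of agent routing: a dispatcher inspects each query
# and hands it to the most suitable agent. One agent stands in for
# retrieval-augmented generation over an internal corpus; the other for
# live lookups via external APIs.

def retrieval_agent(query: str) -> str:
    # Stand-in for RAG over internal documents.
    return f"[RAG] answer from internal docs for: {query}"

def realtime_agent(query: str) -> str:
    # Stand-in for an agent that fetches live data through external APIs.
    return f"[LIVE] external data lookup for: {query}"

def route(query: str) -> str:
    # A production router would classify the query with a language model;
    # a keyword heuristic keeps this sketch self-contained.
    if any(word in query.lower() for word in ("current", "latest", "today")):
        return realtime_agent(query)
    return retrieval_agent(query)

print(route("What is our refund policy?"))
print(route("What is the current order backlog?"))
```

In practice the classification step, not the dispatch, is where the difficulty lies: the router must understand the content of the query, which is why language models take the place of the keyword check above.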

Agentic AI is the Future

Venkateswaran explained Agentic AI as a concept that combines language models, custom code, data, and APIs to create intelligent workflows capable of solving business problems. He highlighted the significance of this development, noting that AI and data are particularly valuable in India, given its abundant data and skilled AI professionals. “We believe this area is going to grow and is an important aspect for us,” he stated.

Agentic AI represents a shift towards more autonomous decision-making systems. An “agent” in this context is a piece of code capable of perceiving its environment—through sight, sound, or text—and making decisions based on that input. 

This capability can range from simple tasks, like generating creative code and sharing it via WhatsApp, to complex functions such as managing supply chains or enhancing customer engagement systems. 

“An agent takes the initiative and makes decisions to solve problems autonomously,” said Venkateswaran.

With an illustrious career of over 28 years in technology, including 25 years at IBM before moving to Kyndryl, Venkateswaran stresses the importance of building broad technical skills and hands-on coding, cautioning against over-specialisation. As he puts it, “Leadership, yes, but you also need to bring in a C-level skill,” underscoring the need for both leadership and technical expertise. 

He also teaches computer science at BITS Pilani, emphasising that foundational skills like maths and algorithms remain crucial despite the industry’s evolving landscape.

Speaking about the progress in AI systems, Venkateswaran said that neural networks once required large datasets to learn and perform complex tasks. However, with pre-trained foundational models like GPT, this requirement has been significantly reduced. 

“The human advantage in learning from smaller datasets is no longer valid,” the CTO observed. 

Nuanced AI Support

Speaking about Kyndryl’s applications across domains, Venkateswaran highlighted how AI has helped transform customer engagement in industries such as retail, travel and transportation, and believes AI is enabling more nuanced approaches.

“Customer engagement is much more than a Q&A with a chatbot,” he said. He believes that to truly replicate or enhance live interactions, generative AI must be adept at “understanding the sentiment associated with the text or sarcasm” and responding appropriately in the detected language.

Similarly, Kyndryl has found strong use cases in education, healthcare, and finance, the last of which is one of Kyndryl’s biggest focuses. “All of them [customers] ask for ways to use this new tech to solve fraud detection, anomaly detection, suspicious transactions, and so on,” said Venkateswaran. 

He also explained how the adoption of generative AI in edutech surged post-COVID, with AI automating the evaluation of descriptive exam answers and creating tailored questions. In healthcare, AI is improving patient-provider connections by interpreting symptoms more effectively than traditional keyword searches.

“So any project where we have AI and data, it is usually a modern application project. And if you have Gen AI, it is like a modern application project, up on steroids. So it is not really, there is no real bifurcation between an AI app and a modern app,” said Venkateswaran. 

Kyndryl has partnered with top players including Google Cloud for developing responsible generative AI solutions and accelerating adoption by customers. “We have partnerships based on what we need from each of those partners, and then it goes back to what we can offer them with customers,” said Venkateswaran. 

Kyndryl has been rapidly expanding its operations in India, growing its base in Bengaluru. The company opened its third office in the city in April. Last month, Kyndryl launched a security operations centre (SOC) there, leveraging AI to provide comprehensive support and advanced protection against cyberthreats. 


AI agents are predicted to be the next major technological breakthrough, with leaders like Meta’s Mark Zuckerberg suggesting AI agents could soon outnumber humans. Google also sees AI assistants as transformative game-changers. Kyndryl, the world’s largest IT infrastructure services provider, has been a key player in this space since its spin-off from IBM in 2021, focusing on AI to boost operational efficiency and innovation across sectors.

Sreekrishnan Venkateswaran, CTO of Kyndryl India, envisions AI agents, or Agentic AI, as the future of business. These intelligent systems combine language models, custom code, and real-time data to autonomously address complex tasks. Like a self-driving car that adapts without human input, AI agents handle tasks like fraud detection and customer sentiment analysis, reducing manual intervention and enhancing efficiency in industries such as finance, healthcare, and education. Agentic AI promises to revolutionize global operations by automating supply chains and procurement in e-commerce while interacting with customers. This level of automation allows human roles to focus on strategic decision-making and innovation.

As AI continues to advance, industries will become more adaptive, efficient, and customer-centric. Kyndryl’s partnerships with companies like Google Cloud underscore its commitment to responsible AI development.

Disclaimer: The opinions expressed in this expert analysis summary are solely of the AIM council members and do not reflect the views or opinions of the organization they are affiliated with.




The post The Rise of Autonomous AI at Kyndryl appeared first on Analytics India Magazine.

]]>
Former Nutanix Founder’s AI Unicorn is Changing the World of CRM and Product Development https://analyticsindiamag.com/ai-features/former-nutanix-founders-ai-unicorn-is-changing-the-world-of-crm-and-product-development/ Mon, 19 Aug 2024 10:56:33 +0000 https://analyticsindiamag.com/?p=10132923

Backed by Khosla Ventures, DevRev recently achieved unicorn status with a $100.8 million Series A funding round, bringing its valuation to $1.15 billion.

The post Former Nutanix Founder’s AI Unicorn is Changing the World of CRM and Product Development appeared first on Analytics India Magazine.

]]>

DevRev, founded by former Nutanix co-founder Dheeraj Pandey and Manoj Agarwal, the company’s former SVP of engineering, is an AI-native platform unifying customer support and product development. It recently achieved unicorn status with a $100.8 million Series A funding round, bringing its valuation to $1.15 billion.

Backed by major investors, such as Khosla Ventures, Mayfield Fund, and Param Hansa Values, the company is on the road to proving the ‘AI bubble’ conversation wrong. “Right now, there’s a lot of talk in the industry about AI and machine learning, but what we’re doing at DevRev isn’t something that can be easily replicated,” said Agarwal in an exclusive interview with AIM.

Agarwal emphasised the unique challenge of integrating AI into existing workflows, a problem DevRev is tackling head-on. Databricks recently announced that LakeFlow Connect would be available in public preview for SQL Server, Salesforce, and Workday; DevRev is on a similar journey, but with AI at its core, it is not easily replicated.

DevRev’s AgentOS platform is built around a powerful knowledge graph, which organises data from various sources—such as customer support, product development, and internal communications—into a single, unified system with automatic RAG pipelines. 

This allows users to visualise and interact with the data from multiple perspectives, whether they are looking at it from the product side, the customer side, or the people side.

The Knowledge Graph Approach

Machines don’t understand the boundaries between departments. The more data you provide, the better they perform. “Could you really bring the data into one system, and could you arrange this data in a way that people can visually do well?” asked Agarwal. 

The Knowledge Graph does precisely that – offering a comprehensive view of an organisation’s data, which can then be leveraged for search, analytics, and workflow automation.

Agarwal describes the DevRev platform as being built on three foundational pillars: advanced search capabilities, seamless workflow automation, and robust analytics and reporting tools. “Search, not just keyword-based search, but also semantic search,” he noted.
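The distinction Agarwal draws between keyword and semantic search can be illustrated with a toy ranker. A real system would use learned embeddings; the bag-of-words cosine scoring and sample documents below are a simplified stand-in, not DevRev's implementation:

```python
# Toy illustration of ranking by overall similarity rather than exact
# keyword matches. Learned embeddings are replaced here by bag-of-words
# vectors to keep the sketch self-contained.
from collections import Counter
import math

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rank(query: str, docs: list[str]) -> list[str]:
    # Order documents by similarity to the query, most similar first.
    q = Counter(query.lower().split())
    return sorted(docs, key=lambda d: cosine(q, Counter(d.lower().split())),
                  reverse=True)

docs = [
    "reset your account password from the settings page",
    "quarterly revenue report for the sales team",
]
print(rank("how do I reset my password", docs)[0])
```

Swapping the `Counter` vectors for embedding vectors from a language model turns this same ranking loop into semantic search: queries then match documents that share meaning even when they share no words.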

On top of these foundational elements, DevRev has developed a suite of applications tailored to specific use cases, such as customer support, software development, and product management. These apps are designed to work seamlessly with the platform’s AI agents, which can be programmed to perform end-to-end tasks, further enhancing productivity.

“The AI knowledge graph is the hardest thing to get right,” admitted Agarwal, pointing to the challenges of syncing data from multiple systems and keeping it updated in real-time. However, DevRev has managed to overcome these hurdles, enabling organisations to bring their data into a single platform where it can be organised, analysed, and acted upon.

The Open Approach

The company’s focus on AI is not new. In fact, DevRev’s journey began in 2020, long before the current wave of AI hype. “In 2020, when we wrote our first paper about DevRev, it had GPT all over it,” Agarwal recalls, referring to the early adoption of AI within the company. 

Even today, DevRev primarily uses OpenAI’s enterprise version but also works closely with other AI providers like AWS and Anthropic. In 2021, the platform organised a hackathon where OpenAI provided exclusive access to GPT-3 for all the participants. 

This forward-thinking approach allowed DevRev to build a tech stack that was ready to leverage the latest advancements in AI, including the use of vector databases, which were not widely available at the time.

One of the biggest challenges that DevRev addresses is the outdated nature of many systems of record in use today. Whether it’s in customer support, CRM, or product management, these legacy systems are often ill-equipped to handle the demands of modern businesses, particularly when it comes to integrating AI and machine learning.

Not a Bubble

DevRev’s architecture is designed with flexibility in mind, allowing enterprises to bring their own AI models or use the company’s built-in solutions. “One of the core philosophies we made from the very beginning is that everything we do inside DevRev will have API webhooks that we expose to the outside world,” Agarwal explained. 

As DevRev reaches its unicorn status, Agarwal acknowledges the growing concerns about an “AI bubble” similar to the dot-com bubble of the late 1990s. “There’s so many companies that just have a website and a company,” he said, drawing parallels between the two eras. 

However, he believes that while there may be some hype, the underlying technology is real and here to stay. “I don’t think that anybody is saying that after the internet, this thing is not real. This thing is real,” Agarwal asserted. 

The key, he argues, is to distinguish between companies that are merely riding the AI wave and those that are genuinely innovating and solving real problems. DevRev, with its deep investment in AI and its unique approach to integrating it into enterprise workflows, clearly falls into the latter category.

The post Former Nutanix Founder’s AI Unicorn is Changing the World of CRM and Product Development appeared first on Analytics India Magazine.

]]>
Accel’s Prayank Swaroop on Navigating Challenges and Data Moats in Indian AI Startup Investing https://analyticsindiamag.com/ai-features/accels-prayank-swaroop-on-navigating-challenges-and-data-moats-in-indian-ai-startup-investing/ Mon, 19 Aug 2024 05:28:28 +0000 https://analyticsindiamag.com/?p=10132881

“My belief is that India is a great market, and smart founders come and keep on coming, and we'll have enough opportunities to invest in,” said Prayank Swaroop, partner at Accel.

The post Accel’s Prayank Swaroop on Navigating Challenges and Data Moats in Indian AI Startup Investing appeared first on Analytics India Magazine.

]]>

As pioneers in the startup VC ecosystem, Accel (formerly known as Accel Partners), with over four decades of experience, entered the Indian market in 2008. They placed their initial bets on a nascent e-commerce company poised to compete with Amazon. 

In 2008, Accel India invested $800,000 in seed capital into Flipkart, followed by $100 million in subsequent rounds. The VC firm went on to back some of today’s most successful ventures, including AI startups. “We’ve invested in 27 [AI] companies in the last couple of years, which basically means we believe these 27 companies will be worth five to ten billion dollars [in the future],” said Prayank Swaroop, partner at Accel, in an exclusive interaction with AIM. 

Swaroop, who joined Accel in 2011, focuses on cybersecurity, developer tools, marketplaces, and SaaS investments, and has invested in companies such as Aavenir, Bizongo, Maverix, and Zetwerk. Having placed careful bets in the AI startup space, he continues to be optimistic, yet wary, about the Indian ecosystem. 

Swaroop observed that while the Indian ecosystem has impressive companies, not all can achieve significant scale. He mentioned that they encounter companies that reach $5 to $10 million in revenue quickly, but they don’t believe those companies can grow to $400 to $500 million, so they choose not to invest in them.

Swaroop told AIM that Accel doesn’t have any kind of capital constraints and can support as many startups as possible. However, their focus is on startups with the ambition to grow into $5 to $10 billion companies, rather than those aiming for $100 million. “I think that is our ambition,” he said. 

Accel has also been clear that it has no inhibitions about investing in wrapper-based AI companies. They believe that as long as a startup can prove it will find customers by building GPT or AI wrappers on other products, it is fine.  

“The majority of people can start with a wrapper and then, over a period of time, build the complexity of having their own model. You don’t need to do it from day one,” said Swaroop.

However, he also pointed out that for a research-led foundational model, it’s crucial to stand out, and that one cannot just create a GPT wrapper and claim it’s a new innovation.

Accel has invested in a diversified portfolio including food delivery company Swiggy, SaaS giant Freshworks, fitness company Cult.fit, and insurance tech Acko. Accel has made its second highest number of investments in India with a total of 218 companies, only behind the United States with 572. In 2022, the market value of Accel’s portfolio was over $100 billion.

Accelerating AI Startups

Accel has a dedicated programme called Accel Atoms AI that looks to invest in promising AI-focused startups across early stages. The cohort of startups will be funded and supported by Accel partners and founders to help them grow faster. 

Selected startups in Accel Atoms 3.0 received up to $500k in funding, cloud service credits, including $100,000 for AWS, $150,000 for Microsoft Azure, $250,000 for Google Cloud Platform, GitHub credits, and other perks. The latest edition, Atoms 4.0, is expected to begin in a couple of months.

While these programmes are in place, Accel has been following a particular investment philosophy for AI startups. 

Accel’s Investment Philosophy

Accel’s investment philosophy for AI startups entails a number of key criteria, including the type of team. “It’s a cliched thing in VC, but we definitely look at the team,” said Swaroop, adding that founders need to have an appreciation of AI.

He emphasised that teams must embrace AI, and be willing to dive into research and seek help when needed, demonstrating both a commitment to learning and effective communication.

Accel also focuses on startups that solve real problems. Swaroop believes that founders should clearly identify their customers and show how their solution can generate significant revenue.

“We get team members who are solving great things, and we realise they are solving great things, but they can’t say that. When they can’t say that, they can’t raise funding. Basically, are you a good storyteller about it?” he explained.

Revenue Growth Struggles  

Swaroop further explained how VCs are increasingly expecting AI startups to demonstrate rapid revenue growth. 

Unlike traditional deep tech companies that may take years to generate revenue, AI firms must show significant commercial traction within 12 to 18 months. He also stated that as VC investment in AI rises, startups without clear revenue paths face growing challenges in securing funding. 

“Even to the pre-seed companies, we say, ‘Hey, you need to show whatever money you have before your next fundraiser. You need to start showing proof that customers are using you.’ Because so many other AI companies exist,” he said. 

Swaroop also highlighted how investment behaviour for AI startups has changed over the last year where investors are now asking the hard questions.

VCs Obsess Over Data Moats

Speaking about what differentiates an AI startup and constitutes its moat, Swaroop highlighted how the quality of datasets may be a deciding factor, though “not so much” with Indic datasets.

“I don’t think language datasets can be a moat, because everybody understands language. Recently, in the Bhashini project, IISc gave out 16,000 hours of audio, so it is democratic data. Everybody owns it, so what’s proprietary in it for you?” asked Swaroop.  

Proprietary datasets, such as those in healthcare or specialised fields, are valuable due to their complexity and the effort required to create them. “I think startups should pick and choose an area where they have uniqueness of data, where they will have proprietary data which is different from just democratic data. That’s the broad thing,” said Swaroop.

Irrespective of the moat, India continues to be a great market with multiple opportunities for investment. In fact, at a recent Accel summit, Swaroop jokingly mentioned how he did not invest in Zomato during its early stage, but there are no regrets. Interestingly, Accel has invested heavily in Zomato’s competitor, Swiggy.

“I think the first thing you have to let go of as a VC is FOMO, the fear of missing out, that’s why I could not think of a company that I regret not investing in, because, my belief is that India is a great market. Smart founders come and keep on coming. We’ll have enough opportunities to invest in,” concluded Swaroop, excited to meet the next generation of founders working in the AI startup ecosystem. 

The post Accel’s Prayank Swaroop on Navigating Challenges and Data Moats in Indian AI Startup Investing appeared first on Analytics India Magazine.

]]>
Meet Deepak Joy Cheenath, co-founder of Quizizz https://analyticsindiamag.com/videos/meet-deepak-joy-cheenath-co-founder-of-quizizz/ https://analyticsindiamag.com/videos/meet-deepak-joy-cheenath-co-founder-of-quizizz/#respond Sun, 11 Aug 2024 05:11:42 +0000 https://analyticsindiamag.com/?p=10132064

Meet Deepak Joy Cheenath, co-founder of Quizizz, a powerful tool for enhancing student learning through interactive and engaging quizzes.

The post Meet Deepak Joy Cheenath, co-founder of Quizizz appeared first on Analytics India Magazine.

]]>
Meet Deepak Joy Cheenath, co-founder of Quizizz, a powerful tool for enhancing student learning through interactive and engaging quizzes.

The post Meet Deepak Joy Cheenath, co-founder of Quizizz appeared first on Analytics India Magazine.

]]>
Devnagri is Building a Multilingual ‘Brain’ to Enable Companies Expand to Tier 2 & 3 Cities https://analyticsindiamag.com/ai-features/devnagri-is-building-a-multilingual-brain-to-enable-companies-expand-to-tier-2-3-cities/ https://analyticsindiamag.com/ai-features/devnagri-is-building-a-multilingual-brain-to-enable-companies-expand-to-tier-2-3-cities/#respond Sun, 04 Aug 2024 14:01:45 +0000 https://analyticsindiamag.com/?p=10131291

Devnagri's dataset is robust, comprising over 750 million data points across 22 Indian languages.

The post Devnagri is Building a Multilingual ‘Brain’ to Enable Companies Expand to Tier 2 & 3 Cities appeared first on Analytics India Magazine.

]]>

Hyperlocal content is becoming crucial for businesses to expand into tier two and tier three cities in India. Devnagri, a data-driven generative AI company, is paving the way by developing a solution, which they call the brain for Indian companies. 

Nakul Kundra, the co-founder and CEO, told AIM about the moat of the company in the era of Indic AI startups. 

“Devnagri is dedicated to helping businesses move into new markets by providing hyperlocal content. Our machine translation capabilities enable businesses to transform their digital content into multiple languages, allowing them to engage with diverse customer bases,” Kundra explained.

Based in Noida and founded in 2021, Devnagri specialises in personalising business communication for non-English speakers. The company recently raised an undisclosed amount in a Pre-Series A round led by Inflection Point Ventures. These newly acquired funds will be used for marketing, sales, technology enhancement, R&D, infrastructure, and administrative expenses.

Devnagri leverages open-source LLMs, such as the latest Llama 3, integrating it with its existing dataset and proprietary translation engine for 22 languages. It tailors business communications for diverse linguistic audiences, seamlessly integrating its technology into both private and government infrastructures. 

“We built application layers on top of our machine translation engine,” Kundra elaborated. “These layers allow customers to upload documents, select languages, and even customise the content before translation. The system understands specific tones and terminologies, ensuring that the translated content aligns with the business’ communication style.”
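The application layer Kundra describes (upload content, select languages, and customise terminology before translation) can be sketched minimally. The phrase table, glossary handling, and function names below are invented for illustration and are not Devnagri's actual system:

```python
# Hypothetical sketch of a localisation application layer: a business
# submits content, picks a target language, and supplies a glossary of
# protected terms (e.g. brand names) that must survive translation.
# The tiny phrase table stands in for a real machine-translation engine.

PHRASE_TABLE = {
    ("welcome", "hi"): "स्वागत है",
    ("thank you", "hi"): "धन्यवाद",
}

def localise(text: str, target: str, glossary=None) -> str:
    # Apply customer-specific terminology before translation.
    for term, keep_as in (glossary or {}).items():
        text = text.replace(term, keep_as)
    # Fall back to the original text when no translation is available.
    return PHRASE_TABLE.get((text.lower(), target), text)

print(localise("Welcome", "hi"))
print(localise("Acme Cloud", "hi", glossary={"Acme Cloud": "Acme Cloud"}))
```

A production engine would replace the phrase-table lookup with a trained translation model, but the surrounding workflow, glossary first, then translation, then fallback, is the shape of the application layer being described.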

Similarly, New York-based RPA firm UiPath recently partnered with Bhashini. The focus of this collaboration is the integration of Bhashini’s language models with the UiPath Business Automation Platform. This will facilitate seamless translations of documents and other essential areas, specifically targeting Indian languages supported by Bhashini.

Companies such as CoRover.ai and Sarvam AI are also building translation capabilities for businesses. Even big-tech companies, such as Microsoft and Google, are heavily focused on translation into Indic languages to cater to the Indian market. 

What’s the Moat Then?

However, Kundra said that Devnagri’s proprietary technology lies at the heart of this initiative, and also the moat of the company. “We’ve created our own machine translation capabilities from scratch,” Kundra said. “Businesses can use our APIs to integrate this technology directly into their platforms, localising content in real-time.”

Devnagri’s dataset is robust, comprising over 750 million data points across 22 Indian languages. “We initially built our models using a vast dataset, and recently we’ve incorporated SLMs and LLMs to enhance quality and address grey areas identified through customer feedback,” Kundra said. 

The goal is to create a single brain for businesses, integrating all touchpoints and datasets into a cohesive system that understands and responds in the desired tone.

“We adapt existing models and integrate them with our proprietary technology, ensuring high-quality multilingual capabilities,” Kundra added.

Collecting data for such a comprehensive system is no small feat. “Our data comes from multiple sources, including open-source dataset corpus, customer data, and synthetic datasets we create,” Kundra explained, saying that the introduction of new datasets from Bhashini also helps the company improve its models.

Devnagri’s multilingual capabilities extend beyond text to voice-based conversational bots. “We are developing multilingual bots that allow customers to interact in their preferred languages, whether it’s Marathi, Kannada, or any other language,” Kundra said, adding that they aim to reduce latency as much as possible.

The Road Ahead

When asked about Devnagri’s differentiating factor, Kundra emphasised their multilingual bots. “These bots are essential for companies operating pan-India. They handle calls in multiple languages, switching seamlessly to accommodate the caller’s preference, all with the lowest latency.”

Security and privacy are paramount, especially when dealing with government organisations and customers such as UNDP and Apollo, among several others. “All our modules are proprietary, enabling us to bundle and position them securely within enterprises or government agencies,” Kundra assured.

Devnagri’s journey has been remarkable, marked by notable milestones like their appearance on Shark Tank India 2022. “Multilingual conversation is the need of the hour, and our solutions aim to optimise costs and improve efficiency for enterprises.”

The firm has also received numerous prestigious awards, including the TieCon Award 2024 in San Francisco, the Graham Bell Award 2023, and recognition as NASSCOM’s Emerging NLP Startup of India.

“Our machine translation engine is a foundational model. It enables us to build conversational bots that understand and respond in multiple languages, tailored to specific business needs,” Kundra said.

As Devnagri looks to the future, their focus remains on building comprehensive AI solutions that cater to the diverse linguistic landscape of India. “We aim to create an ecosystem where businesses can thrive in any language, offering seamless multilingual interactions and superior customer experiences,” Kundra concluded.

The post Devnagri is Building a Multilingual ‘Brain’ to Enable Companies Expand to Tier 2 & 3 Cities appeared first on Analytics India Magazine.

]]>
Meet Reema Lunawat – Accelerating Women In Data Science and Tech https://analyticsindiamag.com/videos/meet-reema-lunawat-accelerating-women-in-data-science-and-tech/ https://analyticsindiamag.com/videos/meet-reema-lunawat-accelerating-women-in-data-science-and-tech/#respond Fri, 02 Aug 2024 08:43:35 +0000 https://analyticsindiamag.com/?p=10131225 Learn about Reema Lunawat's journey as a woman in tech and how ZS's inclusive culture and diversity initiatives made a tangible difference for her.

The post Meet Reema Lunawat – Accelerating Women In Data Science and Tech appeared first on Analytics India Magazine.

]]>

]]>