India’s Data Centre Boom Is Running Into a Talent Wall

Analytics India Magazine (Merin Susan John)

When US-based global digital infrastructure company Vertiv launched its Training Academy and Technology Excellence Center in Pune last month, the move underscored the shortage of skilled service engineers at a time when data centre infrastructure is expanding rapidly and becoming significantly more complex under AI-driven workloads.

India’s operational data centre capacity has crossed 1.5 gigawatts (GW) and is projected to reach 4.5 GW or more by the end of the decade, supported by investments estimated at $20–25 billion. Much of this growth is being driven by hyperscale cloud adoption, data localisation requirements, and the early stages of large-scale AI deployment. 

While capital expenditure and construction pipelines continue to accelerate, operators say the availability of skilled talent, particularly service engineers capable of managing high-density, AI-ready environments, has not kept pace.

Training for a new generation of infrastructure

Located at Vertiv’s Integrated Business Services (IBS) hub in Pune, the new training academy is designed to provide hands-on exposure to the systems that underpin modern data centres.

The facility features live demonstrations of coolant distribution units (CDUs), power switchgear systems, three-phase uninterruptible power supply (UPS) modules, and thermal management technologies, allowing trainees to work with production-grade equipment under simulated operating conditions.

“With the surge in AI investments in the data centre space, we expect a significant demand for a skilled, adapted, and updated workforce,” said Subhasis Majumdar, managing director, Vertiv India. He added that the academy intends to support engineers and customers working across both new and existing data centre environments.

The facility also includes a Technology Excellence Center with engineering and R&D laboratories focused on testing and validating power and thermal management solutions.

AI workloads raise the bar for service engineers

The shift from traditional enterprise IT to AI-heavy workloads has altered the operational profile of data centres. AI infrastructure typically involves GPU-dense racks, liquid cooling architectures, and power densities ranging from 100 kW to 300 kW per rack, compared with the roughly 5–15 kW racks typical of conventional facilities.
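As a rough illustration of what that density shift means in practice, the short sketch below compares how many racks a fixed block of facility capacity can power at conventional versus AI-era densities. The 100 MW campus size, the 8 kW conventional rack and the 70% share of capacity assumed to reach IT load are illustrative assumptions for this sketch, not figures from the article.

    # Back-of-envelope sketch: how rack counts and per-rack heat change as
    # densities move from conventional to AI-era levels. The 8 kW/rack
    # conventional figure and the 70% IT-load share are assumptions.

    def racks_supported(capacity_mw: float, rack_kw: float, it_share: float = 0.7) -> int:
        """Number of racks a given facility capacity can power at a per-rack density."""
        it_capacity_kw = capacity_mw * 1000 * it_share
        return int(it_capacity_kw // rack_kw)

    capacity_mw = 100  # a hypothetical 100 MW campus

    for label, rack_kw in [("conventional (~8 kW)", 8),
                           ("AI, low end (100 kW)", 100),
                           ("AI, high end (300 kW)", 300)]:
        n = racks_supported(capacity_mw, rack_kw)
        print(f"{label:22s}: ~{n:,} racks, each rejecting ~{rack_kw} kW of heat")

Even in this crude model, a single 300 kW AI rack concentrates the heat of dozens of conventional racks into one cabinet, which is why liquid cooling, and engineers who understand it, feature so prominently in operators' concerns.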

According to A S Rajgopal, managing director and CEO at NxtGen Cloud Technologies Private Limited, the most critical talent gap lies in managing this convergence of disciplines.

“The most critical gap today is at the intersection of high-density compute, power engineering, and AI infrastructure orchestration,” Rajgopal said. “Traditional data centre talent understands virtualisation, networking, and facilities, but AI workloads demand expertise in GPU fabric design, liquid cooling, ultra-high power racks, and AI workload scheduling.”

He added that there is also a growing shortage of professionals who can bridge physical infrastructure and AI platforms. “There is a sharp shortage of AI infrastructure architects and MLOps engineers who can bridge the gap between physical infrastructure and large-scale model training platforms,” he said, noting that the capability gap is widening faster than talent supply.

Converging physical infrastructure and digital intelligence

A similar assessment comes from Vipul Kumar, VP–edge and network at CtrlS Datacenters Limited, who said the industry is seeing a fundamental shift in the skills required to operate AI-ready facilities.

“The real challenge is building talent that can truly bridge physical infrastructure and digital intelligence,” Kumar said. “AI-ready data centres demand a shift in mindset from managing racks and uptime in isolation to understanding how compute density, thermal dynamics, energy efficiency, networking, and AI workload behaviour interact as one system.”

As AI workloads become more power- and heat-intensive, Kumar said, teams must take a holistic approach to performance optimisation, cooling strategies, energy usage, and workload orchestration. “This convergence of skills is now essential to delivering reliable, efficient, and scalable AI infrastructure,” he added.

CtrlS, he said, is building this capability through focused upskilling initiatives, cross-functional teams, and closer collaboration with technology partners to ensure its facilities are prepared for evolving AI infrastructure requirements.

Extensive retraining required

Despite India’s large pool of engineering graduates, operators say most new entrants are not immediately prepared for real-world data centre operations.

“Most engineering graduates are not job-ready for modern AI-era data centres,” Rajgopal said. Academic curricula, he noted, continue to focus on legacy IT systems and conventional cloud concepts. As a result, “companies end up spending six to 12 months, or more, on retraining before graduates can handle real operational responsibility.”

Kumar echoed this view, saying that while graduates often have a strong theoretical foundation, they lack exposure to live environments. “They require immersion in real-world skills such as power systems, cooling technologies, network fabrics, risk management, security protocols, and process discipline,” he said.

This training burden has become more pronounced as data centres scale rapidly. Mumbai alone accounts for over 50% of India’s installed capacity, followed by Chennai, Delhi-NCR, and Bengaluru—regions where competition for experienced engineers is particularly intense.

According to David Yao, senior director at Vertiv’s Pune Hub, the facility combines experienced trainers with live systems and digital tools to support skill development across installation, maintenance, and continuous improvement in high-density environments.

Automation does not remove skill needs

While automation and AI-driven tools are increasingly deployed for monitoring, fault detection, and predictive maintenance, operators say these technologies do not eliminate the need for specialised talent.

“Automation and AI will reduce dependence on repetitive operational roles,” Rajgopal said. “However, this does not eliminate the talent crunch.”

Instead, demand is shifting toward advanced roles such as AIOps engineers, automation architects, cybersecurity specialists, and energy optimisation experts.

Kumar described the trend as a skills transition rather than a reduction in workforce requirements. “The talent requirement is evolving, not shrinking. The industry faces a skills transition, not a workforce reduction demand,” he said.
