PUE in the Age of AI: Challenges and Opportunities for India's Data Infrastructure

Mar 25, 2026 | STT GDC India

The modern data centre is no longer a hidden, silent backbone. It is now an active engine powering intelligence. And as AI data centres expand rapidly to support everything from generative AI to real-time analytics, one metric is being tested more than ever before: PUE (Power Usage Effectiveness).

Power efficiency in a colo data centre has traditionally been measured using PUE. However, PUE requirements are evolving in the era of high-performance computing driven by AI workloads. The question is no longer just how efficiently power is used, but whether PUE alone can meet the demands of next-generation infrastructure. This shift presents India with both a challenge and an opportunity to lead in developing sustainable AI data centres.

Why AI Workloads Are Changing the PUE Equation

Traditional enterprise workloads and AI are fundamentally different. Because AI thrives on GPU-powered parallel processing, GPU data centres are far more energy-dense than traditional setups. A typical rack running older workloads draws 5 to 10 kW. AI racks, by contrast, can draw 30 to 50 kW, generating far more heat and making thermal management critical. Because more energy is needed for cooling and maintaining optimal operating conditions, this increase has a direct impact on PUE optimisation.

The relationship between AI workloads and PUE is complicated. As overall energy use rises, maintaining a low PUE becomes harder, because the supporting infrastructure, such as power distribution, airflow, and cooling, must expand disproportionately. At STT GDC India, continuous benchmarking has produced quantifiable improvements: since FY21, PUE has decreased by 3.4%, and the portfolio has improved by 2%.
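To make the metric concrete: PUE is total facility power divided by IT equipment power, so a value of 1.0 would mean every watt reaches the compute. A minimal sketch of the calculation follows; the rack counts and overhead figures are hypothetical, chosen only to illustrate how cooling overhead drives the number, not drawn from any real facility.

```python
# PUE (The Green Grid definition): total facility power / IT equipment power.
# All rack counts and overhead figures below are hypothetical illustrations.

def pue(it_kw: float, cooling_kw: float, other_kw: float) -> float:
    """Power Usage Effectiveness: total facility power over IT power."""
    return (it_kw + cooling_kw + other_kw) / it_kw

# A traditional air-cooled hall: 100 racks at ~8 kW each.
it_legacy = 100 * 8  # 800 kW of IT load
print(round(pue(it_legacy, cooling_kw=320, other_kw=80), 2))  # 1.5

# A liquid-cooled AI hall: 100 racks at ~40 kW each. Liquid cooling
# removes heat more efficiently, so cooling need not grow in proportion
# to the IT load, and PUE can improve even as total energy use rises.
it_ai = 100 * 40  # 4000 kW of IT load
print(round(pue(it_ai, cooling_kw=1200, other_kw=300), 2))  # 1.38
```

The second case shows the tension described above: the AI hall draws five times the IT power, yet its PUE is lower only because the cooling overhead grows sub-proportionally.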
With an emphasis on adapting infrastructure to AI-driven demands while preserving efficiency, all new facilities are designed to achieve a PUE of less than 1.5.

The Cooling Challenge: From Air to Liquid

Heat density is one of the most pressing cooling issues in AI data centres. Traditional air-cooling systems, though effective for earlier workloads, are becoming less viable in high-density settings. As AI adoption accelerates, the industry is moving toward liquid cooling technologies, among them:

- Direct-to-chip liquid cooling
- Liquid immersion cooling
- Rear-door heat exchangers (RDHx)
- In-row cooling

Because liquid-based systems are far more effective at removing heat, they allow compute resources to be packed more densely while consuming less power. They also reduce the need for large air-handling systems, which further improves PUE optimisation.

At STT GDC India, we have already adopted a wide range of advanced cooling technologies across our portfolio of next-generation data centres. By incorporating solutions such as direct-to-chip and immersion cooling, facilities are better prepared to manage the thermal intensity of modern GPU data centres.

The switch from air cooling to liquid cooling is more than a technical upgrade. It is a structural change in how data centre services are designed, delivered, and scaled for AI.

PUE Optimisation for Next-Gen AI Data Centres

Strong PUE optimisation in AI environments requires a multi-layered strategy that goes beyond infrastructure upgrades.

First, design matters. Energy-efficient layouts, intelligent airflow design, and modular scalability must be built in from the start, particularly in Tier 3 data centre facilities. STT GDC India's strategy of targeting sub-1.5 PUE in all new facilities reflects this design-first mentality.

Second, real-time monitoring is crucial.
Advanced energy management systems now let operators monitor power consumption at fine-grained levels across all racks and systems, enabling continuous optimisation and faster response to inefficiencies.

Third, software and hardware optimisation must work together. AI workloads can be distributed dynamically and infrastructure tuned to maximise utilisation and minimise energy waste.

Lastly, integrated thermal management techniques that combine airflow, liquid cooling, and intelligent controls keep cooling systems running at peak efficiency without overconsumption.

Together, these tactics shape the future of PUE, one in which efficiency is continuously improved rather than held constant.

Beyond the PUE Metric: Measuring True Sustainability

The rise of AI is forcing the industry to look beyond PUE-only frameworks, even though PUE remains crucial. PUE measures how efficiently a facility delivers power, but it does not account for where that power comes from or its carbon footprint. A low PUE therefore does not always mean a sustainable facility.

To build truly sustainable AI data centres, operators need to track additional metrics such as:

- Carbon Usage Effectiveness (CUE) and Water Usage Effectiveness (WUE)
- Renewable energy integration
- Infrastructure lifecycle efficiency

This broader perspective ensures that power efficiency and environmental responsibility stay aligned. STT GDC India's emphasis on advanced monitoring systems and energy-saving initiatives reflects this move toward holistic sustainability. By combining operational efficiency with forward-thinking design and energy strategies, the company is actively redefining what efficiency means in a high-performance world.

India's Opportunity to Lead in Sustainable AI Infrastructure

India is at a unique inflection point.
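The extended sustainability metrics discussed above follow the same pattern as PUE: each normalises a resource total by IT energy, with CUE in kilograms of CO2 per kWh and WUE in litres of water per kWh. A minimal sketch, using The Green Grid definitions; all facility figures here are hypothetical illustrations:

```python
# CUE and WUE (The Green Grid definitions): annual resource totals
# normalised by annual IT energy. All figures below are hypothetical.

def cue(total_co2_kg: float, it_energy_kwh: float) -> float:
    """Carbon Usage Effectiveness: kg of CO2 per kWh of IT energy."""
    return total_co2_kg / it_energy_kwh

def wue(water_litres: float, it_energy_kwh: float) -> float:
    """Water Usage Effectiveness: litres of water per kWh of IT energy."""
    return water_litres / it_energy_kwh

it_energy = 10_000_000  # annual IT energy in kWh (hypothetical)
print(cue(4_500_000, it_energy))   # 0.45 kg CO2 per kWh
print(wue(18_000_000, it_energy))  # 1.8 litres per kWh
```

Unlike PUE, both metrics fall when the grid mix or water source improves, which is why a facility can hold its PUE constant and still become meaningfully more sustainable.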
With its rapid digital adoption, rising AI innovation, and expanding demand for data centre services, the nation is well-positioned to become a global hub for AI data centres. This growth is a chance to build infrastructure that is both scalable and sustainable from the start. By investing in next-generation data centre design, deploying advanced cooling technologies, and concentrating on PUE optimisation, India can leapfrog the legacy issues that more mature markets face.

With efficient colo data centre solutions, robust Tier 3 data centre facilities, and a strong focus on energy performance, operators like STT GDC India are already setting the standard. Their commitment to innovation and ongoing PUE improvements shows how India can balance growth with sustainability. As AI continues to transform industries, the capacity to deliver efficient, high-performance infrastructure will define competitive advantage, not only for businesses but for the entire nation.

Conclusion

PUE has long been the cornerstone of data centre efficiency, but it is being redefined in the era of AI. The rise in AI workloads and the PUE challenges that come with them are pushing the industry toward more intelligent design, sophisticated cooling, and continuous optimisation. The tools to adapt are already in motion, from integrated thermal management to innovations in liquid cooling. The next chapter lies in moving beyond PUE-only thinking toward a more holistic understanding of sustainability.

For India, this is more than a technical evolution. It is a strategic opportunity to lead the world in building sustainable, scalable, future-ready AI data centres. And in that future, PUE will be more than a metric. It will be a mindset.