Energy Data: Driving Innovation in Data Center Infrastructure

As new AI load profiles strain the electrical infrastructure of today’s data centers, operators are asking how they can keep up. Janitza explored the answers to this critical question in our July 2025 “QuickChat” with Data Center Frontier.


What drives innovation in data center electrical infrastructure? For Roshan Rajeev, Janitza’s VP of Engineering in the US, the answer is clear: “Data is the only way to keep evolving these systems,” he explained during the QuickChat. “Without having insight into your electrical infrastructure, you can't evolve it to the next level.” 


Data center consultant Ken Murphy, who joined the discussion, agreed, adding that the right data allows for a better understanding of how large AI clusters behave. “By placing measurement systems in the right locations and monitoring power quality, we can learn…[and] that allows us to improve infrastructure design and reduce the impact on the grid,” Ken noted. 


The Right Data at the Right Time 

While the importance of data is clear, collecting utility health data at scale remains a significant hurdle; Ken noted the challenge of managing thousands of devices simultaneously. The solution, according to Roshan, lies in standardized, scalable components that are easy to deploy. “Open protocols are best suited here to ensure that everything works properly – regardless of what manufacturer you use,” he said. Roshan pointed to Janitza meters as an example: these devices are modular in design and support OPC UA as a base framework. MQTT-based devices are also a good fit, since the protocol lends itself to IIoT and cloud integration.
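
To make the integration path Roshan describes a bit more concrete, here is a minimal sketch of how a meter reading might be pushed to a cloud platform over MQTT. It assumes the open-source paho-mqtt Python client (1.x API); the broker address, topic structure, and measurement values are placeholders for illustration and do not represent a Janitza product interface.

```python
# Minimal sketch: publishing a power-quality reading over MQTT.
# Assumes the paho-mqtt client library (1.x API); the broker address,
# topic layout, and measurement values below are hypothetical placeholders,
# not a Janitza-specific interface.
import json
import time

import paho.mqtt.client as mqtt

BROKER_HOST = "broker.example.com"            # placeholder MQTT broker
TOPIC = "site1/row3/meter42/power-quality"    # hypothetical topic layout

client = mqtt.Client()
client.connect(BROKER_HOST, 1883, keepalive=60)
client.loop_start()

# Example payload; in practice these values would be read from the meter
# (e.g. over Modbus TCP or OPC UA) rather than hard-coded.
reading = {
    "timestamp": time.time(),
    "voltage_thd_pct": 2.4,     # total harmonic distortion, percent
    "frequency_hz": 59.98,
    "active_power_kw": 412.7,
}

info = client.publish(TOPIC, json.dumps(reading), qos=1)
info.wait_for_publish()         # block until the broker acknowledges QoS 1

client.loop_stop()
client.disconnect()
```

Because the payload is plain JSON on an open protocol, the same readings can feed an on-premises monitoring stack or a cloud service without vendor-specific tooling – the manufacturer-agnostic approach Roshan highlights.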


Standardization Enables Speed 

The availability of electricity remains one of the greatest challenges for rapidly expanding AI data center infrastructure. “The need for power is the most critical factor in building new AI data centers quickly,” said Roshan. Ken added that “time-to-capacity,” which refers to how long it takes a data center to reach full load, is a key economic driver and competitive advantage. “The faster we can deploy and spin up infrastructure to power GPU racks, the better,” he noted. 


Scalability is essential to achieving this speed. One proven approach is the use of “skid builds” – prefabricated, modular, ready-to-connect infrastructure packages for power and cooling. With these building blocks, much of the fabrication can be completed at the vendor’s factory or at offsite locations. Ken explained that this approach allows communications to be configured and tested in a standardized, scalable fashion, leading to significant efficiency gains during construction. For his part, Roshan emphasized that to be effective, these components must remain modular and standardized, enabling true plug-and-play usability for end users.


About Data Center Frontier’s QuickChat 

Roshan Rajeev is Vice President of Engineering at Janitza USA with more than a decade of experience in power quality and energy monitoring. Ken Murphy is an external data center consultant. Data Center Frontier is a leading digital publication focused on the future of data centers, cloud, and AI infrastructure. Its QuickChat format regularly spotlights emerging trends across the industry. 


To watch the full interview, click here.

Learn more about data centers
Gain deeper insights into the topic in our brief information sheet


Text: Joachim Bär