Data has become a critical asset in today’s digital world, often referred to as the new gold. It has evolved to the point where it requires its own real estate portfolio scattered across nations and oceans alike. Each facility demands tremendous resources to keep its information preserved and accessible for everyday computing needs.
This column serves as a reminder to innovators everywhere that water- or liquid-based heating and cooling remains the simplest and most elegant solution to a carbon-neutral future.
The past few years have marked an unexpected shift in energy consumption, particularly among tech companies focusing on artificial general intelligence and quantum computing technology. Traditional computing stores data in what are referred to as “bits.” Quantum computing works in “qubits,” or quantum bits, which can be far more complex in their solution responses and require considerably more energy than the traditional bit.
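To put that difference in rough perspective, a classical register of n bits stores one n-bit value at a time, while the state of n qubits is described by up to 2^n amplitudes — which hints at both their power and the engineering effort (including cooling) needed to keep them stable. A small illustrative sketch of that growth, in plain Python:

```python
# Back-of-the-envelope: describing n qubits classically requires tracking
# up to 2**n complex amplitudes, which is one way to see how quickly the
# complexity (and the supporting infrastructure) scales.
for n in (8, 16, 32, 64):
    print(f"{n:>3} qubits -> {2**n:,} amplitudes to track")
```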
In 2023, we saw a notable trend: companies such as Microsoft, Google and Meta significantly increased their energy consumption despite committing to politically charged international sustainability initiatives. In hindsight, the reason makes perfect sense.
With incredible improvements made in the artificial intelligence (AI) space these last few years, we can only expect technology companies will need to get more involved in energy solution planning if we all want to continue this progressive work.
According to the International Energy Agency, “Electricity consumption from data centers, artificial intelligence and the cryptocurrency sector could double by 2026.” That will be here in just a few short years, which often is not long enough to get a construction permit or be granted regulatory approval to build.
The forecast continues: “Data centers are significant drivers of growth in electricity demand in many regions. After globally consuming an estimated 460 terawatt hours (TWh) in 2022 … data centers’ total electricity consumption could reach more than 1,000 TWh in 2026.” That is a tremendous amount of electricity demand on an already taxed grid.
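Taken at face value, those two figures imply a steep compound growth rate. Here is a quick back-of-the-envelope check of my own, assuming a constant year-over-year rate between the IEA’s 2022 and 2026 numbers:

```python
# Implied annual growth rate if consumption rises from 460 TWh (2022)
# to 1,000 TWh (2026), assuming a constant year-over-year rate.
start_twh, end_twh, years = 460, 1_000, 4
annual_growth = (end_twh / start_twh) ** (1 / years) - 1
print(f"Implied growth: {annual_growth:.1%} per year")  # roughly 21% per year
```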
These data centers could continue to be built “off grid,” with a future option to create a microgrid or district system that generates energy locally for their neighbors and partners.
Introducing the data energy hub, an application designed for growing cities and neighborhoods looking to achieve long-term sustainability. The hubs would allow aboveground or belowground local storage and stewardship of data, along with sharing of the facility’s available thermal energy. The challenge is rooted in the massive computational power AI and quantum computing will continue to require, especially as large language models grow and supercomputers come online.
Perhaps the data energy hub could be paired with energy-independent fabrication plants, which are equally challenging to design, build and manage. I believe we will see a snowballing effect of technological needs ahead of us indefinitely. However, we can use these challenges as opportunities to be more energy-conscious, thus giving a new purpose to the modern computer age — built for the people by the people.
Consider a data energy hub project in a growing town, storing local data for a dedicated rural area below a park or farmland, preserving space and providing waste heat to its neighbors by thermal energy network (TEN) piping systems. This model could help cities everywhere gain access to low-cost home heating and cooling. It would be a gift from technology and perhaps help remove some of the apprehension around data collection.
When connected to a larger network, a data center could be a central source and sink of a city or town’s thermal needs. With greater processing demand comes the need to dissipate more heat generated by these complex machines. In terms of energy and water use, cooling costs have not gotten any cheaper, forcing owners and operators to rethink their approach to facility design and management.
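To put rough numbers on the source-and-sink idea: nearly all of the electricity a data hall draws leaves as heat, around the clock. The sketch below is a back-of-the-envelope sizing exercise using entirely illustrative assumptions of my own, not figures from any project:

```python
# Rough sizing sketch: how much home heating a data center's waste heat
# could offset. All inputs are illustrative assumptions, not project data.
it_load_mw = 1.0                  # assumed IT load; nearly all of it becomes heat
hours_per_year = 8_760
heat_mwh_per_year = it_load_mw * hours_per_year     # ~8,760 MWh of rejected heat
avg_home_heat_mwh = 12.0          # assumed annual heating demand per home (varies widely)
recovery_fraction = 0.6           # assumed share actually recoverable into a TEN loop

homes_heated = heat_mwh_per_year * recovery_fraction / avg_home_heat_mwh
print(f"~{homes_heated:.0f} homes' worth of heating per MW of IT load")
```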
Air vs. Water
Let’s talk about how cooling technologies are evolving, particularly with the rise of liquid cooling. Traditional data centers have relied on air-cooled systems for decades. These systems work by conditioning the environment to maintain optimal temperatures for high rack-density data storage. Air is circulated through large HVAC systems to keep data server rooms cool and operational, often supported by computer room air conditioners.
These units are designed to handle computer-generated heat removal, and they come with several plumbing infrastructure needs, such as floor and funnel drains, trap primers and backflow-protected make-up water, to name a few. While air cooling remains the typical design for many U.S. data facilities, its efficiency naturally diminishes as the heat load increases. The higher the density of servers in a space, the more concentrated the heat they produce, requiring more energy to manage temperature swings.
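The air-side limitation is easy to see with the standard sensible-heat rule of thumb, Q ≈ 1.08 × CFM × ΔT (BTU/h, for air at standard conditions). The short sketch below estimates the supply airflow needed as rack density climbs; the rack loads and 20 F temperature rise are my own illustrative assumptions:

```python
# Sensible-heat estimate for air cooling: Q (BTU/h) ~= 1.08 * CFM * delta_T.
# Rack loads below are illustrative; a 20 F rise is a common design assumption.
BTU_PER_KW = 3_412
delta_t_f = 20.0

for rack_kw in (5, 15, 40):                      # assumed low/medium/high rack densities
    q_btuh = rack_kw * BTU_PER_KW
    cfm_required = q_btuh / (1.08 * delta_t_f)
    print(f"{rack_kw:>2} kW rack -> ~{cfm_required:,.0f} CFM of supply air")
```

The airflow figures climb into the thousands of cubic feet per minute quickly, which is exactly where air cooling starts to strain.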
I found this to be true while working on a project on New York’s Long Island, where my team helped turn an existing laboratory into a 25,000-square-foot data center. At the time, it was one of the largest data centers in the area. As a young project executive, I was excited to learn more about what goes on behind the scenes of modern-day computing and how plumbing aided in its function.
The plumber’s work isn’t the most technical on a traditional air-cooled data center project, as detailed earlier. However, with the increased adoption of liquid-cooled computing technology and water-sourced utilities such as TENs, the plumber may one day be the most critical part of a data center project, creating an optimal space for tomorrow’s machines.
In contrast to air cooling, liquid cooling is a highly efficient solution for dealing with the massive heat generated by today’s data centers. Liquid cooling directly transfers heat from components, such as central processing units (CPUs) and graphics processing units (GPUs), to a liquid coolant or effluent, which dissipates the heat more effectively than air. This is particularly important for high-density computing facilities where rack densities can push traditional air cooling to its limits.
Liquid cooling systems are already becoming popular in consumer electronics, such as my Alienware desktop computer made by Dell; it uses liquid cooling to manage the operating temperature of its processors. These processors can generate significant heat during intense computing tasks; liquid cooling allows for more stable performance.
When scaled up to the data center level, liquid cooling increases efficiency and conserves electricity. Liquid-cooled technology, as it exists today, can handle far greater internal heat loads, making it ideal for energy-hungry AI tasks.
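The water-side counterpart of the air formula above, Q ≈ 500 × GPM × ΔT (BTU/h, for water at typical conditions), shows why: a modest coolant flow moves the same heat as thousands of cubic feet of air per minute. A quick sketch, again with an assumed rack load:

```python
# Water-side sensible heat: Q (BTU/h) ~= 500 * GPM * delta_T.
# Compare the flow needed for an assumed 40 kW rack at a 15 F coolant rise
# against the several thousand CFM the same load needs in air.
BTU_PER_KW = 3_412
rack_kw, delta_t_f = 40, 15.0        # illustrative load and coolant temperature rise

q_btuh = rack_kw * BTU_PER_KW
gpm_required = q_btuh / (500 * delta_t_f)
print(f"{rack_kw} kW rack -> ~{gpm_required:.1f} GPM of coolant")   # roughly 18 GPM
```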
CPUs, GPUs and GPTs
At the heart of every computer lies the CPU. Often described as the brain of a computer, the CPU handles the basic commands and instructions that allow the system to function, all carried out on silicon chips. Those chips are composed of billions of tiny transistors, each working much like a valve with its on/off, one/zero functionality. The CPU executes various tasks on the chip, from basic functions to operating systems and advanced software.
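To stay with the valve analogy, a byte is simply eight of those valves read together as a group. The short sketch below, purely for illustration, prints a few everyday numbers as their open/closed, one/zero patterns:

```python
# The valve analogy in miniature: each bit is a valve that is either
# open (1) or closed (0), and a byte is eight of them read as a group.
for value in (0, 5, 42, 255):
    print(f"{value:>3} -> {value:08b}")
```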
The GPU, on the other hand, has become increasingly important in today’s web experiences driven by generative pretrained transformers (GPT). For future applications such as OpenAI’s text-to-video tool, Sora, the processing power required is expected to be significantly higher.
Originally, the GPU was designed only to display graphic images and video, but it is now often tasked with energy-intensive computational work. As a result, GPUs often contribute significantly to the overall emissions of data centers or cryptocurrency mines, generating significant heat energy worthy of capture.
The relationship between CPU and GPU can be likened to that of an apprentice and a journeyman. The CPU (apprentice) may initiate tasks, but the GPU (journeyman) carries out the skilled work, monitoring output and managing to get the job done.
Modular and Submerged Data Solutions
Modular storage solutions are certainly among the most interesting innovations in data center design and have provided intriguing flexibility. Instead of building massive, permanent structures, companies are turning to container or pod-type applications for data storage. These modular data centers can be set up quickly, scaled continuously and even moved from one location to another if necessary.
The portability and flexibility of these systems make them a viable option for businesses needing adaptable data storage solutions or wanting to reduce their overall footprint in a particular geographical area, whether permanent or temporary. These container-style data centers are also easier to cool. Pods can be fabricated with liquid cooling systems, allowing for more efficient heat capture, storage and sharing by interconnection.
This modular approach to data storage may be especially beneficial for those who operate in regions with largely fluctuating temperatures or limited access to renewable energy sources.
Underwater data centers are taking modular storage down a few levels. Microsoft has been developing and experimenting with underwater data centers for nearly a decade. These data centers are housed in pill-shaped vessels and lowered to the ocean floor, where they leverage the natural cooling properties of the ocean to maintain optimal temperatures.
The concept is simple but powerful: the ambient temperature of the ocean water acts as a natural coolant, transferring heat from the data center through the cold walls of the vessel, thereby reducing the need for energy-intensive cooling systems.
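In heat-transfer terms, the vessel shell acts like a large passive heat exchanger: the heat it can reject scales roughly with Q = U × A × ΔT, where U is the overall heat-transfer coefficient of the wall, A is the wetted surface area and ΔT is the difference between the internal coolant loop and the surrounding seawater. The sketch below uses purely illustrative values of my own, not figures from Microsoft’s program:

```python
# Rough vessel-wall heat rejection estimate: Q = U * A * delta_T.
# All inputs are illustrative assumptions, not project figures.
u_w_per_m2k = 500.0       # assumed overall coefficient, water-to-water through a steel wall
area_m2 = 100.0           # assumed wetted surface area of the vessel
delta_t_k = 15.0          # assumed internal loop vs. seawater temperature difference

q_kw = u_w_per_m2k * area_m2 * delta_t_k / 1_000
print(f"~{q_kw:.0f} kW of heat rejected through the hull")   # ~750 kW in this sketch
```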
However, this innovation raises environmental questions that I would like to see researched further. Could these underwater data centers eventually contribute to oceanic litter, or could they be engineered to more permanently fit the environment and benefit marine life, such as artificial reefs? As this technology matures, research must be pursued to ensure that the benefits of innovation always far outweigh the risks.
Partnerships Drive Progress
New technologies and innovative designs, such as TENs and geothermal advancements, are reshaping the future of mission-critical systems. They are, however, forcing the tech industry to confront its growing infrastructure needs, fostering continuing partnerships between the tech and construction industries.
Right now, we have an opportunity to pool our resources, collaborate and benefit communities everywhere, especially those who contribute to the data being collected for future use.
While liquid cooling and piping systems can help to make data centers more efficient and circular, the rise of AI and quantum computing means that tech companies must carefully balance sustainable energy and technological progress.
With submergible data centers and futuristic data energy hubs, the technology and building industries are at a pivotal moment. By adopting smart water-based energy solutions and studying the measurable long-term environmental impacts, we get closer to harmony with our planet and the natural speed of advancement.
So, I challenge the engineers and industry enthusiasts to think 20 or 30 years ahead, toward the infrastructure needs of the day after tomorrow. For labor leaders and educators, think about how the skilled trade workers of the future can gear up and be a part of the technological revolution and energy transformation.
We must remain steadfast so that all of humanity’s engineered solutions are built to scale for generations to come, using sustainability as the benchmark of progress.
John A. Mullen, a fourth-generation plumber, brings nearly two decades of experience and a passion for intelligent, sustainable systems to the plumbing and mechanical industry. From apprentice to executive, he has led many complex projects and driven industry-wide safety and compliance initiatives. Mullen’s dedication to the field continues to drive forward the tradition of ensuring safe and efficient plumbing systems for the public.