For AI workloads, as a CTO, the next wave of thinking is about wringing every last bit of productivity out of every joule of energy and turning it into revenue. One example is reusing waste heat. Warm-water liquid cooling and two-phase immersion setups can reject 80 to 100 kilowatts of heat per rack at temperatures suitable for district heating networks. If you co-locate with greenhouses, aquaculture, or residential networks and sell the heat under long-term offtake agreements, the predictable revenue stream cuts your effective operating expenses and can also unlock local incentives. Another idea is pump-less seawater assist, which uses the ocean's natural motion and head pressure to drive your system: the ocean does most of the work, and mechanical pumps only have to top up. Pair this with titanium plate heat exchangers to resist corrosion. The upfront cost is high, but the continuous savings on chilling reduce your total cost of ownership, especially in hot climates. A nuclear-adjacent cousin of the traditional data center is siting right next to small modular reactors or large nuclear plants and negotiating behind-the-meter deals, which give you 24/7 low-carbon energy, grid stability, and PPAs that last for years. That appeals to premium clients like AI training facilities and banks who will pay for guaranteed uptime and green credentials, and utilities love the steady baseload demand. One more idea is to change how AI itself is run: specialized, application-specific accelerators, low-precision math, and sparsity-aware compilers, plus DPUs and SmartNICs to offload networking and storage. The return on this approach can be 20 to 40% energy savings, often with minimal software changes.
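To make the heat-reuse economics above concrete, here is a minimal back-of-envelope sketch in Python. Every input (rack count, capture efficiency, heat and electricity prices) is an illustrative assumption on my part, not a figure from any real facility; only the 80-100 kW per rack range comes from the discussion above.

```python
# Back-of-envelope sketch: how selling waste heat under a long-term
# offtake agreement offsets a data center's electricity bill.
# All inputs below are illustrative assumptions, not real-site data.

RACKS = 200                  # liquid-cooled racks (assumed)
HEAT_PER_RACK_KW = 90        # mid-point of the 80-100 kW/rack range
HOURS_PER_YEAR = 8760
CAPTURE_EFF = 0.7            # fraction of heat delivered to the network (assumed)
HEAT_PRICE_EUR_PER_MWH = 25  # offtake price for low-grade heat (assumed)
ELEC_PRICE_EUR_PER_MWH = 90  # grid electricity price (assumed)

# Heat sold per year; nearly all server power ends up as heat,
# so electricity drawn roughly equals heat rejected.
heat_mwh = RACKS * HEAT_PER_RACK_KW * HOURS_PER_YEAR * CAPTURE_EFF / 1000
elec_mwh = RACKS * HEAT_PER_RACK_KW * HOURS_PER_YEAR / 1000

heat_revenue = heat_mwh * HEAT_PRICE_EUR_PER_MWH
elec_cost = elec_mwh * ELEC_PRICE_EUR_PER_MWH

print(f"Heat sold:        {heat_mwh:,.0f} MWh/yr")
print(f"Heat revenue:     EUR {heat_revenue:,.0f}/yr")
print(f"Electricity cost: EUR {elec_cost:,.0f}/yr")
print(f"Opex offset:      {heat_revenue / elec_cost:.0%}")
```

Even with these modest assumptions the heat sales offset roughly a fifth of the power bill, which is why the "predictable revenue stream" argument holds up.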
I have personally seen how commonly overlooked heat reuse becomes an asset. One of our clients in Europe (I'll try not to give too much away, but this was in Switzerland, if that matters) took waste heat from their servers and piped it into a nearby residential neighborhood as part of a utility co-generation program. It benefited both sides, cut operating costs, and won the support of the municipality: a genuine piece of dual-use infrastructure, valuable both for sustainability and as an asset. Seawater cooling is another topic of interest. I have studied a few coastal facilities piloting seawater cooling systems near population centers. You cut pump energy use, but you have to address corrosion and biofouling at the front end of the design. Best practice is to bake a seawater system into the building's design rather than treat it as an afterthought; it is very difficult to add as a bolted-on solution. I have followed nuclear microreactors with great interest; they are small, resilient, and well suited to a remote data center campus. The paperwork is thick, though, and you can lose an entire schedule just on permitting, especially in the US. Even so, nuclear looks like a real option for hyperscalers planning 5-10 years out. I think photonic chips are exciting. I have participated in life-cycle and end-of-life testing of systems that use optical interconnects, and I have no doubt the speed and lower power are real. However, photonic chip manufacturing is not yet mature enough to scale. As interesting as these technologies are, it really comes down to the whole equation: if an innovation does not pass compliance, extend component lifespan, or reduce lifecycle cost, it will not matter. Innovations have to work in practice, but they have to work in theory first.
Assessing this challenge through the prism of my experience with technology infrastructure projects, the most promising concepts are not just the environmentally friendly ones but those with an economic rationale too.

Heat Reuse Economics
Microsoft's underwater Project Natick taught me something interesting: waste heat can drive desalination systems or district heating networks. By my calculation, data centers typically lose around 40 percent of their energy as unrecovered waste heat. Companies like Qarnot already market compute-based heaters to residential buildings and offset operating expenses by 30-50%. The business case is very sound because you are literally selling your garbage.

Seawater Cooling Reality
Google's plant in Finland uses seawater cooling, cutting energy expenditure to roughly half that of traditional systems. Maintenance issues aside, the ROI is realized in 3-4 years. Salt corrosion has to be handled with special materials, adding 15-20% to costs, but the operational benefits more than compensate.

Nuclear Micro-Reactors
Hyperscale facilities may come to rely on small modular reactors (SMRs). NuScale's designs target electricity at just $65/MWh, which is competitive with renewables. Amazon has just made a $500M investment in the space, which shows serious commercial intent.

Orbital Data Centers
SpaceX has driven Starlink production costs down to around $250,000 per satellite, opening the possibility of orbital stations costing $10-50M over a 10-year period. Space offers an uninterrupted supply of solar energy and removes grid dependence entirely.

The winners will be the concepts where the environmental benefits and the profit margins add up together.
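A tiny Python sketch of the seawater-cooling payback math quoted above (roughly halved cooling costs against a 15-20% materials premium). The dollar figures are invented purely for illustration; only the percentages and the 3-4 year payback range come from the post.

```python
# Rough payback model for seawater cooling: a capex premium for
# corrosion-resistant materials (e.g. titanium heat exchangers) repaid
# by roughly halved cooling-energy costs. Dollar figures are illustrative.

def payback_years(cooling_capex, premium_pct, annual_cooling_cost, savings_pct):
    """Years until energy savings repay the extra build cost."""
    extra_capex = cooling_capex * premium_pct
    annual_savings = annual_cooling_cost * savings_pct
    return extra_capex / annual_savings

# Hypothetical coastal facility: $50M cooling plant, 18% materials
# premium, $6M/yr conventional cooling energy bill, 50% savings.
years = payback_years(50e6, 0.18, 6e6, 0.50)
print(f"Payback: {years:.1f} years")  # lands inside the 3-4 year range quoted
```

Varying the premium across the quoted 15-20% band keeps the payback between roughly 2.5 and 4 years, consistent with the claim.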
Our AI models run continuously to generate customized paint kits, so we live in a compute-constrained world of rising cloud costs and energy impact. When we discuss alternative data center concepts, therefore, we are not discussing theory but how to keep our business efficient and viable. Conserving heat is the logical start. Servers produce heat every day, and some centers, instead of dumping it, now pipe that heat into adjacent buildings or water systems. It is already operating in parts of Europe, and it has both environmental and financial potential. Cooling is another big one. Conventional systems consume large amounts of electricity to keep machines from overheating; the natural movement of seawater can absorb a significant share of that energy expenditure, particularly in coastal areas. It is passive, dependable, and already undergoing live trials. And then there is hardware. The GPU race is expensive and power-hungry, but photonic chips, an emerging technology that uses light instead of electricity, could be a revelation: lower power, less heat, faster processing. All of this fails if it cannot be scaled or adapted to a business model. But once it can, it stops being science fiction and becomes a competitive advantage.
The data center crisis is no longer about capacity. I have been in enterprise infrastructure for the last ten years, and I can see the game-changing solutions emerging today. Heat recovery is already profitable. Microsoft's data centers in Finland route excess heat into district heating systems, cutting operations by roughly 30% and earning the company a revenue stream. The arithmetic works because you get paid twice: once for the compute and once for the heat. I have talked to companies putting this into practice, and ROI is usually realized within 18 months. Underwater data centers are past the concept phase: Microsoft's Project Natick showed 87 percent better reliability than a comparable onshore facility. The economics hold up because ocean cooling removes HVAC expense, cuts operating costs by around 40 percent, and delivers unquestionably better uptime. The obstacle is no longer technical but regulatory: getting installations on the ocean floor approved. Nuclear micro-reactors will dominate enterprise computing by 2030. Companies like Oklo are building 15MW reactors tailored to data centers, and the economics are striking: levelized costs come in well under grid power, which in many areas will exceed $100/MWh. Amazon is already spending heavily here. Photonic computing chips cut power consumption by 90% compared to traditional semiconductors; Lightmatter's processors are shipping today, aimed specifically at AI workloads. The business case is direct: lower power bills flow straight into margins. Space-based centers, by contrast, will not be economically viable because of satellite launch costs, while terrestrial solutions are available today.
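For the nuclear numbers, here is a quick Python comparison using the two LCOE figures quoted in this thread ($65/MWh SMR target vs. grid power exceeding $100/MWh in many areas). The 15 MW campus matches the Oklo reactor size mentioned above; the 90% load factor is my assumption.

```python
# Compare annual power cost of an SMR at its target LCOE vs. grid power
# for a 15 MW campus (the Oklo reactor size mentioned in the thread).
# The 90% load factor is an assumption; $/MWh figures are from the thread.

CAMPUS_MW = 15
LOAD_FACTOR = 0.9        # AI/data center loads run close to flat-out
HOURS_PER_YEAR = 8760

annual_mwh = CAMPUS_MW * LOAD_FACTOR * HOURS_PER_YEAR

def annual_cost(lcoe_usd_per_mwh):
    """Annual energy spend at a given levelized cost of electricity."""
    return annual_mwh * lcoe_usd_per_mwh

smr = annual_cost(65)    # SMR target LCOE
grid = annual_cost(100)  # grid price many regions will exceed

print(f"Annual demand:   {annual_mwh:,.0f} MWh")
print(f"SMR  @ $65/MWh:  ${smr:,.0f}")
print(f"Grid @ $100/MWh: ${grid:,.0f}")
print(f"Annual saving:   ${grid - smr:,.0f}")
```

At these inputs the gap is roughly $4M a year per campus, which is the kind of margin that explains why hyperscalers are signing behind-the-meter nuclear deals despite the permitting overhead.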