A Microsoft project touted its recent re-investment in immersive cooling technologies as not just a way to overcome the derailment of Moore’s Law by physics, but the only way. Hypothesis or hyperbole?
An “Innovation Story” published by Microsoft last April re-introduced readers to the company’s work with immersion cooling for data center servers, particularly as it relates to Azure. But then it brought up an unusual connection — one whose existence may be a question in itself. If two-phase immersion cooling were present in production data center environments, wrote Microsoft’s John Roach, the scalability limitations of processor designs at the atomic level could be overcome.
Put another way: If heat is the obstacle that derailed Moore’s Law, why not use immersion cooling to transfer heat off the chips? The observation that transistor densities could double every 18 months while keeping processors affordable and desirable in the market could then get back on track, albeit submerged.
That assertion prompted Greg Cline, Omdia’s principal analyst for SD-WAN and data center, along with Vladimir Galabov, head of Omdia’s Cloud and Data Center Research Practice, to draw a stunning conclusion. In their April 21 report titled “Immersion cooling is heating up,” they wrote that Microsoft expanded its investments in liquid cooling technologies because “it is the only way to offset the slowdown of Moore’s Law.”
By Scott Fulton III on ITProToday
Image Source: DataCentreKnowledge