The demand for computer calculation will determine the future of high-tech

Never has humanity made such colossal investments in peacetime as the astronomical sums it now devotes to the development of artificial intelligence (AI). After allocating almost 100 billion dollars up to the start of the Covid pandemic in 2019, it mobilized 91.5 billion in 2022 and another 89 billion in 2023, not counting investments in infrastructure, according to the Stanford Institute for Human-Centered AI. This year, the four high-tech giants (Amazon, Meta, Microsoft and Google) plan to invest a further fortune of 200 billion dollars in new facilities and basic services, according to a forecast by the consulting firm McKinsey. And the escalation has not ended: because of the astronomical prices commanded by specialized processors, such as those produced by semiconductor leader Nvidia, private investment in servers dedicated to AI alone is set to climb from 25 billion to 125 billion dollars per year, a fivefold jump between 2022 and 2025.
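A quick back-of-the-envelope check of those server-investment figures; this is a minimal sketch, and the implied compound annual growth rate is a derived illustration, not a figure quoted in the article:

```python
# Sanity check of the AI-server investment figures quoted above:
# from $25 billion in 2022 to $125 billion per year by 2025.
start_usd_bn, end_usd_bn = 25, 125
years = 2025 - 2022

multiple = end_usd_bn / start_usd_bn                 # 5x overall
annual_growth = multiple ** (1 / years) - 1          # implied compound annual rate
print(f"Overall increase: {multiple:.0f}x ({(multiple - 1) * 100:.0f}% growth)")
print(f"Implied annual growth: {annual_growth * 100:.0f}% per year")  # ≈ 71%
```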

What explains this phenomenon? “The demand for AI computation has multiplied by a factor of one million over the past six years, and it keeps growing tenfold every year,” Google CEO Sundar Pichai said on May 14. The image brings to mind millions of chicks with their beaks wide open just after hatching, waiting for the moment to satisfy their hunger.

The limitless escalation unleashed by AI, especially over the last 10 years, is not confined to the technological race to shrink processors and increase semiconductor capacity in response to demand from the cryptocurrency industry and the roll-out of 5G telephony. It also involves a feverish technological quest for more efficient algorithms, faster and more powerful semiconductors, and new, less energy-hungry DPCs (data processing centers, or data centers). Since no technological and industrial revolution comes at a neutral ecological “cost,” this rapid transition is not without drawbacks.

On the surface, a data center is simply a huge building, or a row of prefabricated modules, housing giant banks of computers that store critical information and run algorithms at a rate of millions of calculations per second. To ensure safe and efficient operation, the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) advises keeping these electronic farms within an optimal temperature range of 24°C ± 2°C.

It is estimated that by 2020 the world’s fleet of data centers already had a capacity of over 1,400 exabytes (one exabyte is roughly equivalent to one billion gigabytes). That volume doubles every two or three years, which means total storage capacity currently ranges between 4,500 and 5,600 exabytes. To visualize that volume, imagine that the combined storage of the roughly 8,000 data centers in the world is equivalent to about 43.75 billion smartphones, or some 5.5 phones per inhabitant of the planet, babies included.
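That comparison can be reproduced with a minimal back-of-the-envelope sketch; the 128 GB-per-smartphone capacity and the world population of roughly 8 billion are assumptions not stated in the article:

```python
# Back-of-the-envelope check of the storage comparison above.
# Assumptions (not in the article): 128 GB per smartphone,
# a world population of roughly 8 billion people.
EB_TO_GB = 1_000_000_000          # 1 exabyte ≈ one billion gigabytes, as in the text

storage_eb = 5_600                # upper end of the 4,500-5,600 EB estimate
gb_per_phone = 128
population = 8_000_000_000

phones = storage_eb * EB_TO_GB / gb_per_phone
print(f"Smartphone equivalents: {phones / 1e9:.2f} billion")    # ≈ 43.75 billion
print(f"Phones per inhabitant:  {phones / population:.1f}")     # ≈ 5.5
```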

As part of the information technology (IT) sector, data centers have become one of the most dynamic areas of the world economy. That colossal market, which emerged barely 20 years ago, was worth $215.8 billion in 2023 and, according to the market consulting firm Grand View Research, is expected to grow by 10% a year from 2024 to 2030. The top five players alone (Amazon Web Services, Microsoft Azure, Google Cloud, Equinix and Digital Realty) posted revenues of $196 billion last year.

The problem is that this rapid development, and the foreseeable demand over the next quarter of a century, require new investments and gigantic infrastructure works, for example to meet the demand for electricity: data centers consumed between 800 and 1,000 TWh (terawatt-hours) last year, a figure currently equivalent to 1.3% of global electricity demand, according to three coinciding sources: BP’s statistical yearbook of energy, the International Energy Agency (IEA) and Enerdata’s Global Energy Statistical Yearbook. Some experts, however, dismiss these calculations out of hand and estimate that digital technologies actually absorb 10% of global electricity production, and will reach 20% before 2030. As 60% of that total comes from fossil fuels, a report from the World Economic Forum (WEF) in Davos estimates that the spread of artificial intelligence will push the sector’s greenhouse gas emissions from 6-8% of the world total today to 9-10% in 2030.

The impact is also very strong, although less visible, during the industrial process. The smaller the components, the larger their manufacturing footprint: producing a 2-gram integrated circuit requires, for example, 32 kilos of raw materials. The WEF’s Global E-waste Monitor estimated that by 2023 the high-tech industry had accumulated 146 million tons of electronic waste, metals and production scrap that are almost impossible to recycle. This figure does not include some “invisible” materials, such as the 1.3 million kilometers of submarine cables laid on the ocean floor, through which 97% of the world’s internet traffic passes.
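As a rough illustration of that footprint, here is a minimal sketch using only the figures quoted above:

```python
# Rough illustration of the manufacturing footprint mentioned above:
# a 2-gram integrated circuit that requires 32 kg of raw materials.
chip_mass_g = 2
raw_materials_g = 32 * 1000    # 32 kilos expressed in grams

ratio = raw_materials_g / chip_mass_g
print(f"Raw materials needed: {ratio:,.0f} times the chip's own weight")  # 16,000x
```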

Far from freeing us from the limitations of the physical world, as John Perry Barlow imagined in his Declaration of the Independence of Cyberspace, published in 1996, digital technology is leaving us the most gigantic garbage dump in history, much of which will be impossible to recycle.

The biggest problem, however, is that each server operates like a real stove, generating temperatures of up to 60°C. A medium-sized facility requires cooling systems that add to its energy consumption and use about 600,000 cubic meters of water per year. Thanks to the constant miniaturization of boards and semiconductors developed by Nvidia and the other high-tech leaders, it has so far been possible to limit electricity consumption and the production of high temperatures, which contribute to exacerbating global warming. These companies have also begun to deploy an arsenal of new cooling methods, such as so-called free cooling, liquid cooling and the immersion of servers in liquid, a procedure devised by the Dutch company Asperitas. With the help of Naval Group, a specialist in underwater technologies, Microsoft ran a first experiment with a cylinder of 864 servers submerged at a depth of 100 meters. The Chinese giant Alibaba is studying immersing part of its servers in oil. Other companies have chosen to build data centers in the Nordic countries to take advantage of their low temperatures.

The latest idea is to borrow the technology that scientists are pursuing to cool quantum computers, which must operate at temperatures close to absolute zero (-273°C). “Cooling a data center is a real science. The next revolution in computer calculation may well come from there,” predicts Jean-Michel Rodriguez, an IBM expert. That is the great challenge, because before computing power, the future of computing depends, above all, on a cable and a plug.

Economic intelligence specialist and journalist
