
Cold thoughts under the ChatGPT craze: AI energy consumption in 2025 may exceed that of humans, and AI computing needs to improve quality and efficiency

Apr 12, 2023, 09:43 AM
Tags: ai, chatgpt

After years of development, OpenAI's generative AI systems DALL-E and GPT-3 have become popular around the world and are showcasing astonishing application potential. But this explosion of generative AI comes with a problem: every time DALL-E creates an image or GPT-3 predicts the next word, multiple inference computations are required, consuming substantial resources and electricity. Current GPU and CPU architectures cannot operate efficiently enough to meet the imminent computing demand, posing a huge challenge for hyperscale data center operators.


Research institutions predict that data centers have become some of the world's largest energy consumers, with their share of total electricity consumption rising from 3% in 2017 to a projected 4.5% in 2025. Taking China as an example, the electricity consumed by data centers nationwide is expected to exceed 400 billion kWh in 2030, or 4% of the country's total electricity consumption.
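A quick sanity check of the China figure above: if 400 billion kWh represents 4% of national consumption, the implied national total follows from simple division. This is pure arithmetic on the article's cited numbers.

```python
# Arithmetic check of the cited 2030 projection for China:
# 400 billion kWh of data-center consumption at a 4% share
# implies a national total of 10 trillion kWh.
dc_kwh = 400e9          # projected data-center consumption, 2030
share = 0.04            # stated share of national total
national_total = dc_kwh / share
print(f"implied national total: {national_total / 1e12:.0f} trillion kWh")
```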

Cloud computing providers also recognize that their data centers use large amounts of electricity and have taken steps to improve efficiency, such as building and operating data centers in the Arctic to take advantage of renewable energy and natural cooling. However, this is not enough to keep pace with the explosive growth of AI applications.

Research at Lawrence Berkeley National Laboratory in the United States found that efficiency improvements have kept data-center energy growth in check over the past 20 years, but current energy-efficiency measures may not be enough to meet the needs of future data centers, so a better approach is needed.

Data transmission is a fatal bottleneck

The root of the efficiency problem lies in how GPUs and CPUs work, especially when running AI inference and training models. Many people understand "beyond Moore's Law" and the physical limits of packing more transistors onto ever-larger dies. More advanced chips are helping to address these challenges, but current solutions have a critical weakness for AI inference: the limited speed at which data can be moved to and from random-access memory.

Traditionally, it has been cheaper to separate processor and memory chips, and for years processor clock speed was the key limiting factor in computer performance. Today, it is the interconnect between chips that holds back progress.

Jeff Shainline, a researcher at the National Institute of Standards and Technology (NIST), explained: "When memory and processor are separated, the communication link connecting the two domains becomes the main bottleneck of the system." Professor Jack Dongarra, a researcher at Oak Ridge National Laboratory in the United States, put it succinctly: "When we look at the performance of today's computers, we find that data transmission is the fatal bottleneck."

AI inference vs. AI training

AI systems use different types of computation when training a model than when using it to make predictions. AI training loads tens of thousands of image or text samples into a Transformer-based model and then starts processing them. Thousands of GPU cores process large, rich data sets such as images or video very efficiently, and if results are needed faster, more cloud-based GPUs can be rented.


Although a single AI inference requires less energy than training, when hundreds of millions of users invoke auto-completion, an enormous number of computations and predictions are needed to decide which word comes next, and this ends up consuming more energy than the long training run itself.

For example, Facebook's AI systems perform trillions of inferences in its data centers every day, a number that has more than doubled in the past three years. Research has found that running language translation inference on a large language model (LLM) consumes two to three times more energy than the initial training.
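A rough back-of-the-envelope model shows why inference at scale can overtake a one-time training run. The figures below are hypothetical placeholders chosen only to illustrate the shape of the calculation, not measured values for any real system.

```python
# Illustrative break-even calculation: one-time training energy vs.
# cumulative inference energy at large request volumes.
# All numbers are hypothetical, not measured values.

TRAINING_ENERGY_KWH = 1_000_000       # assumed one-time training cost
ENERGY_PER_INFERENCE_KWH = 0.0003     # assumed energy per request

def break_even_requests(training_kwh: float, per_request_kwh: float) -> float:
    """Number of inference requests after which cumulative inference
    energy exceeds the one-time training energy."""
    return training_kwh / per_request_kwh

n = break_even_requests(TRAINING_ENERGY_KWH, ENERGY_PER_INFERENCE_KWH)
print(f"inference overtakes training after ~{n:,.0f} requests")
```

At these assumed figures the crossover arrives after a few billion requests, which a popular consumer service can serve in days, matching the article's point that deployment, not training, dominates lifetime energy.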

Surge in demand tests computing efficiency

ChatGPT became popular worldwide at the end of last year, and GPT-4 is even more impressive. If more energy-efficient methods can be adopted, AI inference can be extended to a wider range of devices and enable new ways of computing.

For example, Microsoft's Hybrid Loop is designed to build AI experiences that dynamically leverage both cloud computing and edge devices. It lets developers defer until runtime the decision of whether to run inference on the Azure cloud platform, a local client computer, or a mobile device, in order to maximize efficiency. Facebook introduced AutoScale to help decide efficiently at runtime where to compute inference.
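A minimal sketch of what such a runtime placement decision might look like. This is a hypothetical heuristic in the spirit of systems like Hybrid Loop and AutoScale; the signals, thresholds, and function names are all illustrative assumptions, not the real systems' logic.

```python
# Hypothetical cloud-vs-edge routing heuristic for one inference request.
# Thresholds and signals are illustrative placeholders.
from dataclasses import dataclass

@dataclass
class DeviceState:
    battery_pct: float      # remaining battery, 0-100
    has_npu: bool           # local AI accelerator available
    network_ms: float       # round-trip latency to the cloud, ms

def choose_backend(state: DeviceState, model_mb: float) -> str:
    """Return 'edge' or 'cloud' for one inference request."""
    # Models too large for the device must go to the cloud.
    if model_mb > 500:
        return "cloud"
    # Prefer the device when it has an accelerator and enough battery.
    if state.has_npu and state.battery_pct > 20:
        return "edge"
    # On a slow network, local inference may still win on latency.
    if state.network_ms > 150:
        return "edge"
    return "cloud"

print(choose_backend(DeviceState(80, True, 40), model_mb=120))   # edge
```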

To improve efficiency, the obstacles hindering AI's development must be overcome and effective methods found.

Sampling and pipelining can speed up deep learning by reducing the amount of data processed. SALIENT (for Sampling, Slicing, and Data Movement) is a new approach developed by researchers at MIT and IBM to address these critical bottlenecks. It can sharply reduce the cost of running neural networks on large datasets containing 100 million nodes and 1 billion edges. But it also affects accuracy and precision, which is acceptable when selecting the next social post to display, but not when trying to identify unsafe conditions on a worksite in near real time.
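The core sampling idea can be illustrated with a toy: instead of aggregating over every neighbor of every node in a huge graph, each training step draws a small fixed-size sample of neighbors, cutting data movement at some cost in accuracy. This sketch shows the general technique only; it is not the SALIENT system, and the graph and parameters are made up.

```python
# Toy neighbor sampling for graph neural network training:
# cap the neighbors visited per node at k to bound data movement.
import random

def sample_neighbors(adj: dict[int, list[int]], node: int, k: int,
                     rng: random.Random) -> list[int]:
    """Return at most k sampled neighbors of `node`."""
    nbrs = adj[node]
    if len(nbrs) <= k:
        return list(nbrs)
    return rng.sample(nbrs, k)

adj = {0: [1, 2, 3, 4, 5], 1: [0], 2: [0]}
rng = random.Random(0)
print(sample_neighbors(adj, 0, k=2, rng=rng))  # 2 of node 0's 5 neighbors
```

With a billion-edge graph, bounding each node's fan-out this way is what turns an intractable pass over all edges into a fixed, predictable amount of work per batch.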

Tech companies such as Apple, Nvidia, Intel, and AMD have announced the integration of dedicated AI engines into their processors, and AWS is even developing a new Inferentia 2 processor. But these solutions still use the traditional von Neumann processor architecture, with integrated SRAM and external DRAM, all of which require extra power to move data in and out of memory.

In-memory computing may be the solution

In addition, researchers have found another way to break through the "memory wall": bringing compute closer to memory.

The memory wall refers to the physical barrier limiting how fast data can move into and out of memory, a fundamental limitation of the traditional architecture. In-memory computing (IMC) addresses this challenge by running AI matrix calculations directly in the memory module, avoiding the overhead of sending data across the memory bus.

IMC is well suited to AI inference because inference involves a relatively static but large set of weights that is accessed repeatedly. While some data always moves in and out, IMC eliminates much of the energy cost and latency of data movement by keeping data in the same physical unit, where it can be used and reused efficiently across many calculations.
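A toy model makes the data-movement argument concrete: compare the bytes crossing a memory bus when weights must be re-fetched for every request versus fetched once and kept resident, as IMC-style designs do. The figures are illustrative assumptions, not measurements of any real chip.

```python
# Toy data-movement model: re-fetched weights vs. resident weights.
# All sizes are hypothetical placeholders.

def bytes_moved(weight_mb: float, io_mb: float, requests: int,
                weights_resident: bool) -> float:
    """Total MB crossing the memory bus for `requests` inferences."""
    # Resident weights cross the bus once; otherwise once per request.
    weight_traffic = weight_mb if weights_resident else weight_mb * requests
    return weight_traffic + io_mb * requests

W, IO, N = 2000.0, 0.01, 1_000_000   # 2 GB weights, 10 KB I/O per request
von_neumann = bytes_moved(W, IO, N, weights_resident=False)
imc_style = bytes_moved(W, IO, N, weights_resident=True)
print(f"traffic ratio: {von_neumann / imc_style:,.0f}x")
```

Because the weight set dwarfs the per-request input and output, keeping weights stationary removes almost all bus traffic, which is exactly why inference, with its static weights, is the natural first target for IMC.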

This approach also improves scalability because it works well with modular chip designs. With such chips, AI inference can be tested on a developer's computer and then deployed to production through data centers, where large fleets of machines with many such processors can run enterprise-scale AI models efficiently.

Over time, IMC is expected to become the dominant architecture for AI inference use cases. This makes sense when users are dealing with massive data sets and trillions of calculations: no resources are wasted shuttling data across the memory wall, and the approach scales easily to meet long-term needs.

Summary:

The AI industry is now at an exciting turning point. Technological advances in generative AI, image recognition, and data analytics are revealing unique connections and uses for machine learning, but a technology solution that can meet this demand must first be built. According to Gartner's predictions, unless more sustainable options become available, AI will consume more energy than human activities by 2025. A better way needs to be found before that happens.
