
AI technology explodes exponentially: computing power has grown 680-million-fold in 70 years, across three historical eras

WBOY
Release: 2023-09-25 18:45:07

Electronic computers were born in the 1940s, and within 10 years of their emergence, the first AI application in human history appeared.

AI models have been under development for more than 70 years. Today they can not only write poetry, but also generate images from text prompts, and even help humans discover unknown protein structures.

How has AI technology achieved such exponential growth in so short a time?

A long infographic from "Our World in Data" traces the history of AI development, using the computing power used to train each AI model as its yardstick.


High-resolution version: https://www.visualcapitalist.com/wp-content/uploads/2023/09/01.-CP_AI-Computation-History_Full-Sized.html

The data comes from a paper published by researchers at MIT and other universities.


Paper link: https://arxiv.org/pdf/2202.05924.pdf

In addition to the paper, the research team also produced an interactive chart based on the paper's data; users can zoom in and out at will to examine the figures in more detail.


Interactive chart: https://epochai.org/blog/compute-trends#compute-trends-are-slower-than-previously-reported

The authors of the chart estimate each model's training compute mainly from the number of operations performed and the GPU time used. To decide which models count as important, representative systems, they apply three criteria (a rough estimation sketch follows this list):

Significance: the system had a major historical impact, significantly improved the state of the art (SOTA), or has been cited more than 1,000 times.

Relevance: only papers that contain experimental results and a key machine learning component, and whose goal is to advance the existing SOTA, are included.

Uniqueness: if another, more influential paper describes the same system, the less influential paper is removed from the dataset.
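As context for the two estimation routes mentioned above (operation counts and GPU time), here is a minimal sketch in Python. The "6 x parameters x tokens" rule of thumb and the 30% hardware utilization figure are illustrative assumptions, not values taken from the paper.

# Two common back-of-the-envelope estimates of training compute, in FLOP.
# The exact formulas used by the paper's authors may differ; the constants
# below (6 FLOP per parameter per token, 30% utilization) are assumptions.

def compute_from_operations(n_parameters: float, n_tokens: float) -> float:
    """Arithmetic estimate: roughly 6 FLOP per parameter per training token."""
    return 6 * n_parameters * n_tokens

def compute_from_gpu_time(n_gpus: int, days: float,
                          peak_flops_per_gpu: float,
                          utilization: float = 0.3) -> float:
    """Hardware estimate: GPU-seconds times sustained throughput."""
    return n_gpus * days * 24 * 3600 * peak_flops_per_gpu * utilization

# Hypothetical example: a 1-billion-parameter model trained on 20 billion tokens,
# versus 64 GPUs (312 TFLOP/s peak each) running for a week.
print(f"{compute_from_operations(1e9, 2e10):.2e} FLOP")   # ~1.2e20
print(f"{compute_from_gpu_time(64, 7, 312e12):.2e} FLOP") # ~3.6e21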

Three eras of AI development

In the 1950s, the American mathematician Claude Shannon built a robotic mouse named Theseus and trained it to navigate a maze and remember the path, which is regarded as the first instance of artificial learning.

Theseus was built on just 40 floating-point operations (FLOP). FLOP are commonly used as a measure of computation: the higher the FLOP count, the more computing power involved and the more capable the system.

The advancement of AI relies on three key elements: computing power, available training data, and algorithms. In the first decades of AI development, the demand for computing power grew in line with Moore's Law, doubling roughly every 20 months.


However, the emergence of AlexNet (an image-recognition AI) in 2012 marked the beginning of the deep learning era, and this doubling time shortened dramatically to about six months as researchers increased their investment in computation and processors.
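To illustrate what these two doubling times imply (simple arithmetic, not data from the chart): compute grows by a factor of two raised to the number of elapsed doubling periods.

# Growth factor implied by a given doubling time, over a fixed span of years.
def growth_factor(years: float, doubling_time_months: float) -> float:
    return 2 ** (years * 12 / doubling_time_months)

print(f"20-month doubling over 10 years: ~{growth_factor(10, 20):,.0f}x")  # ~64x
print(f"6-month doubling over 10 years:  ~{growth_factor(10, 6):,.0f}x")   # ~1,048,576x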

With the appearance in 2015 of AlphaGo, the computer program that defeated professional human Go players, researchers identified a third era: the era of large-scale AI models had arrived, with computational requirements greater than those of all previous AI systems.

Future progress of AI technology

Looking back over the past ten years, the growth rate of computing power has been simply incredible.

For example, the computing power used to train Minerva, an AI that can solve complex mathematical problems, was almost 6 million times that used to train AlexNet a decade ago.
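As a quick consistency check (an illustration, not a figure from the paper), a 6-million-fold increase over roughly a decade corresponds to about 22.5 doublings, i.e. a doubling time of just over five months, in line with the six-month figure above.

import math

# Implied doubling time from the Minerva vs. AlexNet compute ratio (approximate values).
growth = 6e6   # Minerva used ~6 million times AlexNet's training compute
years = 10     # roughly a decade apart
doublings = math.log2(growth)                 # ~22.5 doublings
months_per_doubling = years * 12 / doublings  # ~5.3 months
print(f"{doublings:.1f} doublings, ~{months_per_doubling:.1f} months per doubling")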


This growth in computing power, coupled with vast available datasets and better algorithms, has allowed AI to make enormous progress in an extremely short period of time. Today, AI can not only reach human performance levels, but even surpass humans in many fields.

AI capabilities will continue to surpass humans in all aspects

[Chart: AI systems' performance relative to human benchmarks across various capabilities]

As the chart above clearly shows, artificial intelligence has already surpassed human performance in many areas and will soon do so in others.

The following figure shows the year in which AI reached or exceeded human level on common abilities that people use in daily work and life.

[Chart: year in which AI reached or surpassed human level on common abilities]

AI technology still has ample development potential

It is difficult to say whether growth can continue at the same rate. Training large-scale models requires ever more computing power, and if the supply of compute cannot keep growing, progress in AI technology may slow down.

Similarly, exhausting all of the data currently available for training AI models could also hinder the development and deployment of new models.

In 2023, capital has poured into the AI industry, especially into generative AI represented by large language models. This may signal that more breakthroughs are coming, and that the three drivers of AI progress described above will be further optimized and developed.

In the first half of 2023, AI startups raised $14 billion in funding, more than the total raised over the previous four years.


A large share (78%) of generative AI startups are still at a very early stage of development, and 27% of them have not yet raised any capital at all.


Of the more than 360 generative AI companies tracked, 27% have not yet raised funds, and more than half are at their first funding round or earlier, indicating that the generative AI industry as a whole is still at a very early stage.


Because developing large language models is capital-intensive, the generative AI infrastructure category has captured over 70% of funding since Q3 2022 while accounting for only about 10% of generative AI deals. Much of this funding reflects investor interest in emerging infrastructure such as foundation models and APIs, MLOps (machine learning operations), and vector database technology.

