Neural networks are an important machine learning technique and the foundation of deep learning, currently the most popular research direction. Learning about neural networks not only gives you a powerful machine learning method, but also helps you better understand deep learning.
Let's first review how neural networks developed. Their history is full of twists and turns: at times they were praised to the skies, at other times they were cast aside and nobody cared. They have gone through several cycles of boom and bust.
From the single-layer neural network (the perceptron), to the two-layer neural network with a hidden layer, and on to the multi-layer deep neural network, there have been three rises.
The peaks and troughs in the figure above mark the rises and falls in the development of neural networks. The horizontal axis is time, in years; the vertical axis is a schematic measure of a neural network's influence. If the ten years from the proposal of the Hebb model in 1949 to the birth of the perceptron in 1958 are counted as a fall rather than a rise, then neural networks have gone through "three falls and three rises", much like Deng Xiaoping's political career. As the old saying goes, when Heaven is about to place a great responsibility on someone, it first tempers their mind and body with hardship. The success neural networks enjoy today, after so many setbacks, can be seen as the payoff of that accumulated hardship.
The greatest value of history is that it offers a reference for the present. Scientific research advances in an upward spiral; it is never smooth sailing. This history is also a warning to those who are over-enthusiastic about deep learning and artificial intelligence today, because this is not the first time people have gone crazy over neural networks. From 1958 to 1969, and again from 1985 to 1995, expectations for neural networks and artificial intelligence ran no lower than they do now, and everyone can see how those episodes ended.
Calmness is therefore the best response to the current deep learning craze. Those who rush in simply because deep learning is hot, or because they hope to make money, will end up being the only victims. The neural network community has been praised to the heavens twice before, and the higher the praise, the harder the fall. Scholars in the field should pour some cold water on the craze rather than let the media and investors overhype the technology. Fortunes turn; judging from the historical pattern above, it is entirely possible that neural networks will hit bottom again within a few years.
So why are neural networks so popular? In short, because of their powerful learning ability. As neural networks have developed, their representational power has grown ever stronger.
From the single-layer network, to the two-layer network, to the multi-layer network, the figure below illustrates how the ability to fit decision boundaries grows as the number of layers increases and the activation function is adjusted.
As the number of layers increases, the network's ability to fit nonlinear decision boundaries keeps improving. The boundaries in the figure are schematic, not actual training results.
The continued vigorous development of neural network research and applications is inseparable from this powerful function-fitting ability.
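To make this concrete, here is a minimal sketch (my own illustration, not from the original article) of a two-layer network trained with backpropagation on XOR, the textbook decision boundary that no single-layer network can fit. The hidden size, learning rate, and iteration count are illustrative assumptions.

```python
# Minimal two-layer network fitting XOR with backpropagation (illustrative sketch).
import numpy as np

rng = np.random.default_rng(0)

# XOR: the classic problem a single-layer network cannot fit.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 4 sigmoid units; sizes and learning rate are illustrative.
W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)

lr = 0.5
for _ in range(10000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)   # hidden activations
    p = sigmoid(h @ W2 + b2)   # output probabilities

    # Backward pass: with a sigmoid output and cross-entropy loss,
    # the gradient at the output pre-activation is simply (p - y).
    g_out = p - y
    g_W2, g_b2 = h.T @ g_out, g_out.sum(axis=0)
    g_h = (g_out @ W2.T) * h * (1 - h)   # chain rule through the hidden layer
    g_W1, g_b1 = X.T @ g_h, g_h.sum(axis=0)

    # Plain gradient-descent updates.
    W1 -= lr * g_W1; b1 -= lr * g_b1
    W2 -= lr * g_W2; b2 -= lr * g_b2

p = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
print(np.round(p.ravel(), 2))  # should approach [0, 1, 1, 0]
```

The single hidden layer of nonlinear units is what lets the network bend a straight decision line into the curved boundary XOR requires; stacking more such layers extends this fitting ability further.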
Of course, strong internal capability alone does not guarantee success. A successful technology needs not only internal strengths but also a favorable situation and environment. The external drivers of neural network development can be summarized as stronger computing performance, more data, and better training methods. Only when these conditions are met can the function-fitting ability of neural networks be fully realized, as shown in the figure below.
The reason Rosenblatt could not build a two-layer classifier in the single-layer era was that the computing performance of the time was insufficient, and Minsky used this point to suppress neural networks. What Minsky did not foresee was that the rapid development of computer CPUs over the following decade or two would make two-layer neural networks trainable, aided by a fast learning algorithm: backpropagation (BP). A sketch of the single-layer limitation follows.
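For contrast with the two-layer sketch above, here is a minimal illustration (again my own, with illustrative hyperparameters) of Rosenblatt's perceptron learning rule. It converges on a linearly separable problem such as AND, but can never converge on XOR, which was the heart of Minsky's critique of single-layer networks.

```python
# Minimal sketch of Rosenblatt's perceptron learning rule (illustrative).
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y_and = np.array([0, 0, 0, 1])  # AND is linearly separable
y_xor = np.array([0, 1, 1, 0])  # XOR is not

def train_perceptron(X, y, epochs=50, lr=0.1):
    """Single-layer perceptron: predict 1 when w.x + b > 0."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = float(w @ xi + b > 0)
            # Rosenblatt update: nudge the weights only on mistakes.
            w += lr * (yi - pred) * xi
            b += lr * (yi - pred)
    return w, b

for name, y in [("AND", y_and), ("XOR", y_xor)]:
    w, b = train_perceptron(X, y)
    preds = (X @ w + b > 0).astype(int)
    print(name, "predictions:", preds, "targets:", y)
# AND converges to the correct labels; XOR never can, because no single
# line separates its positive and negative points -- the limitation
# Minsky and Papert pointed out in 1969.
```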
But in the era when two-layer networks were quickly gaining popularity, limits on computing performance and training methods meant that the advantages of deeper networks could not yet show. It was not until around 2012 that researchers found that the graphics processing units (GPUs) built for high-performance computing matched the demands of neural network training almost perfectly: high parallelism, high memory throughput, and little need for complex control flow. Combined with algorithms such as pre-training, this finally let neural networks shine.
Meanwhile, the Internet era brought vast amounts of collected and organized data, and better training methods kept being discovered. Together, these satisfied the conditions for multi-layer neural networks to exercise their full power.
As the saying goes, "the times make the hero." As Hinton put it in his 2006 paper:
"... provided that computers were fast enough, data sets were big enough, and the initial weights were close enough to a good solution. All three conditions are now satisfied."
The satisfaction of these external conditions has been an important factor in the growth of neural networks from the single neuron to today's deep networks.
In addition, no technology can develop without its "Bole", the discerning champion of the Chinese idiom who can spot a fine horse. Across the long history of neural networks, it is the perseverance and sustained research of many scholars that produced today's achievements. Early pioneers such as Rosenblatt and Rumelhart did not live to see the popularity and status neural networks enjoy today, but the foundations they laid for the field will endure and never fade.