I Am Not A Mage Lord

Chapter 318: Neural Networks

Just as Lynch was engrossed in, and puzzling over, the interface module of the processor the goddess had built for him with the power of fate, he suddenly realized that the chip also taking shape in his mind was heading into a new chapter.

It was very different from the simple CPU chip model he had once conceived, and it was not the GPU chip model that had only recently changed his thinking, either.

It was an even more extreme chip model.

An AI chip.

As everyone knows, there are many kinds of chips. Sorted by role and process, personal computers and mobile phones sit at the heart of consumer electronics, so they are naturally allocated the best consumer-grade chips. Take the flagship phones launched by manufacturers every year: without the latest 870 or 880 chip, a phone hardly deserves the flagship name.

Even if some of those chips run hot or cost a fortune in pursuit of performance, the label of "newest and strongest" cannot be given up, or consumers will promptly teach the manufacturer a lesson that year.

Beyond these, there are still other categories of chips. They do not need the most advanced 5nm process at all, and can even be simple microcontrollers built around ARM cores, DSPs and the like, falling under the general term MCU.

Their process node may be as old as 28nm, but they are consumed in enormous volumes. Cars, for instance, are major consumers of such chips: even the simplest power windows need one to control the lift, to say nothing of complex functional modules like assisted driving.

The AI chip is a chip type even more specialized than the GPU.

If the GPU is simply a chip that packs far more ALU units than a CPU,

then the AI chip is a dedicated chip customized for AI algorithms, so it burns less power and runs more efficiently when executing them.

Lynch first examined the "processor" module the goddess had conjured with the "sacred word of creation" to spur his creativity, and soon spotted how it differed from an ordinary chip's structure.

Take autonomous driving: an ordinary CPU cannot keep up, because its raw compute is too weak and therefore too slow. A GPU can keep up, but its cost and power consumption are usually beyond what consumers will accept.

That is when AI chips custom-built for these application scenarios came into being. Google, for example, trained AlphaGo on graphics chips in the early days and later switched to its own self-developed AI chips.

Only then did Lynch hazily recall it.

AI chips win out because AI algorithms involve an enormous number of convolution, residual-network, and fully connected calculations.

These calculations are essentially addition and multiplication.

Much like the calculations in the spell models Lynch had come into contact with.

Bear in mind that a reasonably mature AI algorithm, executed just once, can easily amount to trillions of additions and multiplications.

Meanwhile, even an advanced multi-core CPU manages only tens of billions of operations per second.

Against workloads of trillions of operations, that gap shows up directly as time.

Take the first-generation TPU developed by Google: it performs close to a hundred trillion operations per second.

An AI algorithm costing trillions of operations per pass can therefore be run through dozens of times, even close to a hundred, within a single second.

If the GPU was split off from the CPU specifically to handle graphics computation, then the AI chip was split off specifically to handle AI algorithm computation.

All of this stems from deep learning's dependence on neural network algorithms!
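To put that arithmetic in concrete terms, here is a minimal sketch in Python of how a single fully connected layer reduces to nothing but multiply-accumulate operations; the layer sizes are made-up assumptions for illustration, not figures from any real chip or model.

```python
# Sketch: a fully connected layer is just multiply-accumulate operations.
# The sizes below are illustrative assumptions, not real model figures.

def fully_connected(inputs, weights, biases):
    """Compute outputs[j] = sum_i inputs[i] * weights[i][j] + biases[j]."""
    n_in, n_out = len(inputs), len(biases)
    outputs = []
    for j in range(n_out):
        acc = biases[j]
        for i in range(n_in):
            acc += inputs[i] * weights[i][j]   # one multiply + one add
        outputs.append(acc)
    return outputs

# Rough cost of one layer: n_in * n_out multiply-adds.
n_in, n_out = 4096, 4096                       # hypothetical layer sizes
macs_per_layer = n_in * n_out                  # roughly 16.8 million multiply-adds
print(f"one layer: {macs_per_layer:,} multiply-adds")

# Stack dozens of such layers and sweep them over a large input,
# and the total per forward pass climbs into the billions or trillions.
```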

For a long moment,

Lynch stared at the miracle that had just taken shape in his mind, unable to utter a word.

Anything more he could say would be superfluous.

The spell model itself boiled down to the most basic addition and multiplication operations.

Lynch's original plan had been to pivot toward artificial intelligence sometime in the future; he had not expected to be hoisted a step up the ladder right here and now.

At this moment he looked again at the deity on the opposite side, and the other party gazed back at Lynch with satisfaction.

Clearly, once Lynch understood the structure of the AI chip, the Orb would never be left to gather dust.

"Neural Networks!"

The deity's terrifying voice swept across Lynch's eardrums once again.

And in his mind, everything he knew about this algorithm was suddenly reorganized; even the parts he had long since forgotten, "handed back to his teachers" as the saying goes, were pieced together again.

It was like that unnameable coin: when it first appeared, people mined it with CPUs, but nowadays the work is done entirely by purpose-built mining rigs, and those rigs run on ASIC chips.

In the field of computing, the ASIC could be said to have fought its way out of the encirclement of the CPU and GPU.

Lynch curled his lips.

Spells.

Magic.

Spell models.

When it came to the most reliable way of casting, the answer was naturally to teach the processor to complete the entire casting process by itself.

Externally, the PID controller would handle the overall secret energy field parameters; internally, the AI chip would handle the computations of the spell model.

Humans?

Humans shouldn't be in that loop at all.

And teaching the chip to cast magic was only the first step.

The second step was to teach the chip to choose!

Human reaction time has been shown never to drop below 0.1 seconds, which is why in sprinting any reaction faster than that is ruled a false start.

But in the face of ever-changing spell combat, if all Lynch ever wanted was a 1v1 duel, his own reactions would truly be enough.

If he wanted to stand against thousands in a spell battle, however, he would also need an automatic spell-response mechanism.

This is also why countless wizards have to lay out their spell-combat plans before the next battle: their minds simply cannot sustain millisecond-level responses, so all they can do is draw up the most comprehensive plan possible and drill it into instinct.

Since an AI chip was about to be born in his Palace of Memory anyway, why not go all the way and develop an automatic spell response along with it?

And here, the original question had to be revisited.

When it comes to computing 1 + 1, a machine can crush everything else in the world.

But teaching a machine how to choose a spell is another matter entirely, with a long road ahead!

Autonomous driving alone, letting machines take over from human drivers, has consumed the efforts of countless manufacturers and still hovers around L2 today.

What is machine learning?

To put it simply:

Human: 1 + 1 = ?

Machine: 5

Human: 1 + 2 = ?

Machine: 7

Human: 3 + 2 = ?

Machine: 10

After countless rounds of this...

Human: 1 + 1 = ?

Machine: 2.
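That guess-and-correct loop can be sketched in a few lines of Python; the toy model, learning rate, and numbers below are assumptions chosen purely to illustrate the idea, not any particular method from the story.

```python
# Sketch: a toy model that "learns" addition by being corrected repeatedly.
# Model: guess = w1 * a + w2 * b, with the weights starting off wrong.
import random

w1, w2 = 3.0, 5.0          # deliberately bad starting guesses
lr = 0.01                  # learning rate (assumed)

for step in range(2000):
    a, b = random.uniform(-5, 5), random.uniform(-5, 5)
    guess = w1 * a + w2 * b
    error = guess - (a + b)          # how far off the machine's answer is
    # nudge each weight against the error (gradient descent on squared error)
    w1 -= lr * error * a
    w2 -= lr * error * b

print(w1, w2)              # both weights end up close to 1.0
print(1 * w1 + 1 * w2)     # "1 + 1 = ?" now answered with roughly 2
```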

The so-called artificial intelligence.

As the running joke goes, there is only as much "intelligence" as there is "artificial" human labor put in.

There is a classic example about mangoes.

Say you want to pick out mangoes but have no idea which kind tastes good. You would first have to taste every sort of mango, conclude that the dark yellow ones are the delicious ones, and from then on pick dark yellow ones whenever you buy your own.

Machine learning means letting the machine "taste" all the mangoes first and letting it work out the set of rules on its own.

All the human has to do is describe each mango's features to the machine, from its color to how soft or firm it is, and finally tell it whether that mango was delicious.

The rest is simply waiting for the machine to learn a rule that says the "dark yellow" mangoes are the delicious ones.

This learning process is machine learning, and the neural network is the most popular machine-learning method of all.
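As a minimal sketch of the mango version, the snippet below assumes two invented features per mango (yellowness and firmness) and a simple perceptron-style update whenever the machine mislabels one; the data are fabricated for illustration only.

```python
# Sketch: learning "dark yellow mangoes are delicious" from labeled examples.
# Each mango: (yellowness 0..1, firmness 0..1), label 1 = delicious, 0 = not.
mangoes = [
    ((0.9, 0.4), 1), ((0.8, 0.7), 1), ((0.85, 0.2), 1),
    ((0.3, 0.5), 0), ((0.2, 0.8), 0), ((0.4, 0.3), 0),
]

w = [0.0, 0.0]   # one weight per feature
b = 0.0          # bias

for _ in range(50):                      # a few passes over the tasting notes
    for (yellow, firm), tasty in mangoes:
        guess = 1 if w[0] * yellow + w[1] * firm + b > 0 else 0
        err = tasty - guess              # +1, 0, or -1
        # perceptron update: only changes anything when the guess was wrong
        w[0] += err * yellow
        w[1] += err * firm
        b += err

print(w, b)   # yellowness ends up with the dominant positive weight:
              # the machine has rediscovered "pick the dark yellow ones"
```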

Settling himself again, Lynch walked to the bookshelf in his Palace of Memory and silently opened the original text.

Progress had leapt ahead too quickly, so he had to cram the next body of knowledge overtime, like a cook who only starts flipping through the recipe after the oil is already smoking in the pan.

The situation was urgent in the extreme, and yet there was a certain inevitability to it.

Back then, AlphaGo had combined Monte Carlo tree search with neural network algorithms, and neural networks are a hurdle that no one working in machine learning can avoid.

That was also the piece of knowledge Lynch now had to gnaw through as quickly as possible.

At this moment he sat in his cage, working out derivations on the muddy ground with single-minded focus, untroubled by the dirt and sand, as though it were a wide blackboard laid out for his calculations.

Neural networks, as the name suggests, take their inspiration from human neurons.

Anyone who sat through high school biology can grasp the basic layout of a neuron: a roughly spherical cell body in the middle, with a fine, bushy tangle of nerve-fiber branches at one end, known scientifically as dendrites.

At the other end extends a single long fiber, scientifically called the axon.

A neuron works like this: each dendrite receives electrochemical stimulation from other neurons, and those impulses pile on top of one another. Once their combined intensity crosses a critical threshold, the neuron fires and sends a signal out along its axon.

The axon shifts its membrane potential through the exchange of sodium and potassium ions across the cell membrane, so that the electrical signal travels its full length without attenuation.

Eventually these signals reach the dendrites of other neurons, which are stimulated in turn to fire their own signals, becoming the next relay of neurons.

The human visual system, for example, takes in light through roughly 130 million photoreceptor cells and carries the information from the retina to the brain along about 1.2 million ganglion-cell axons, where it is assembled into a three-dimensional image.

And machine learning is teaching the computer how to associate the inputs it receives with the outputs we want.

Shown a picture, for example, it can work out that this is the digit "1" we are after.

This relies on the perceptron, which is also where the name "neural network" comes from.

A perceptron is an artificial imitation of a nerve cell, and each of the original biological terms gets a corresponding new name:

Weights, biases and activation functions.

A machine cannot understand a picture directly, but it can translate one into a "pixel matrix", with each dot fed in as a 0 or a 1.
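A minimal sketch of that renaming, assuming a tiny hand-made 0/1 pixel vector and arbitrary weights, bias, and threshold, just to show where each part lives:

```python
# Sketch: one perceptron = weighted sum of inputs + bias, passed through
# an activation function. All numbers here are arbitrary illustrations.

def step(x):
    """Activation function: fire (1) once the combined signal crosses 0."""
    return 1 if x > 0 else 0

pixels  = [0, 1, 1, 0, 1]             # a tiny "pixel matrix" flattened into 0/1 inputs
weights = [0.2, 0.8, -0.4, 0.1, 0.5]  # how strongly each pixel excites or inhibits
bias    = -0.6                        # shifts the firing threshold

# The dendrites gather and sum the weighted stimulation...
total = sum(p * w for p, w in zip(pixels, weights)) + bias
# ...and the axon fires only if the total crosses the threshold.
print(step(total))   # 0.8 - 0.4 + 0.5 - 0.6 = 0.3 > 0, so the neuron fires: 1
```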

Lynch silently sketched on the ground the x-y coordinate axes familiar to any junior high student, then marked four points on them; connected, the points formed a square, with one point lying in each of the four quadrants.

What machine learning has to accomplish is letting the machine figure out which quadrant a point like this should be counted in.

That calls for the "classification" capability of neural network algorithms.

Here the input is a coordinate pair, a 1-by-2 matrix: the input layer.

Set up 50 neurons and you get a 1-by-50 matrix: the hidden layer.

The result is one of quadrants one through four, a 1-by-4 matrix: the output layer.

By the rules of linear algebra, these matrices can be chained together through multiplication, so the 1-by-4 output-layer matrix can be expressed in terms of the original 1-by-2 input-layer matrix.

The work here is to bolt an activation layer and output normalization onto these matrix operations, quantify how good or bad the current network is with a cross-entropy loss, and finally optimize the parameters.

And that process has to be repeated, iteration after iteration.
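That shape chain can be sketched directly, assuming NumPy, random placeholder weights, and an arbitrarily labeled example point; this illustrates the layer sizes and the loss, not a trained network:

```python
# Sketch: 1x2 input -> 1x50 hidden layer -> 1x4 output, as in the quadrant example.
# Weights are random placeholders; the "true quadrant" label is assumed for illustration.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 50)) * 0.1, np.zeros(50)   # input -> hidden
W2, b2 = rng.normal(size=(50, 4)) * 0.1, np.zeros(4)    # hidden -> output

x = np.array([[3.0, -2.0]])          # one point, shape (1, 2)
true_quadrant = 3                    # index 3 = quadrant IV (assumed labeling)

hidden = np.maximum(0, x @ W1 + b1)  # activation layer (ReLU), shape (1, 50)
scores = hidden @ W2 + b2            # raw outputs, shape (1, 4)

# Output normalization: softmax turns scores into probabilities over 4 quadrants.
probs = np.exp(scores - scores.max())
probs /= probs.sum()

# Cross-entropy loss: how badly the network rates the correct quadrant.
loss = -np.log(probs[0, true_quadrant])
print(probs.round(3), float(loss))

# Training would now adjust W1, b1, W2, b2 to shrink this loss,
# and repeat the whole pass iteration after iteration.
```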

Having run through the whole process once more, Lynch couldn't help but sigh.

He still remembered the graduation thesis topics back at university. Each advisor drew up topics within their own specialty, and the students then signed up to claim the ones they wanted.

Lynch had been slow to choose, and all that remained in the end were a handful of "algorithm problems" that were hard to chew through.

After all, working on an algorithm ultimately comes down to optimizing its parameters so the whole computation runs faster, the results come out more accurate, and everything ends up better tuned. But year after year, generations of students have already picked up every gem lying in plain sight on the sand. What remains is the route of graduate students and doctoral students: fence off a plot of your own and keep digging downward. Spotting a diamond at a glance is pure fantasy.

After weighing it over several times, Lynch had finally chosen a genetic algorithm for a global optimization problem. Even though a complete set of mature toolkits already existed by then, Lynch dutifully wrote the functions himself and eventually scraped together a barely passable thesis.

In the end, when the review panel pressed him on where the innovation lay, all Lynch could say to their faces was that the particular combination of parameters he had used did not appear in any earlier paper, and with that he scraped a pass.

Meanwhile his roommate, who had picked a neural network algorithm, was accused on the spot of faking his simulation data and very nearly had his graduation deferred.

It was later, while helping that roommate, that Lynch came into contact with neural network algorithms for the first time.

Without question, in neural-network terms the secret energy field parameters were the "input matrix", and the result of the spell model was the "output matrix".

Straight away, Lynch wrote out a passage on the ground:

Neural Networks.

Artificial neural networks.

A large number of simple calculation units constitute a nonlinear system.

To a certain extent, it simulates the information processing, storage and retrieval functions of the brain.

The error back-propagation learning algorithm of the BP network.

The output error is used to estimate the error of the layer directly before the output layer; that error in turn updates the estimate for the layer before it, and by passing the error backward like this, estimates for every remaining layer are obtained.

...
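A minimal sketch of that backward error pass, assuming the same 2-to-50-to-4 layout as before with placeholder weights and an assumed one-hot label; it illustrates the update rule, nothing more:

```python
# Sketch of back-propagation for an assumed 2 -> 50 -> 4 layout: the output
# error is used to estimate the hidden layer's error, and every weight is
# nudged downhill. Numbers and labels are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(2, 50)) * 0.1, np.zeros(50)
W2, b2 = rng.normal(size=(50, 4)) * 0.1, np.zeros(4)
lr = 0.1

x = np.array([[3.0, -2.0]])
target = np.array([[0, 0, 0, 1.0]])           # one-hot "true quadrant" (assumed)

# Forward pass (same shape chain as before).
hidden = np.maximum(0, x @ W1 + b1)
scores = hidden @ W2 + b2
probs = np.exp(scores - scores.max()); probs /= probs.sum()

# Backward pass: error at the output layer...
d_scores = probs - target                     # gradient of softmax + cross-entropy
# ...passed back to estimate the error of the layer directly before it...
d_hidden = (d_scores @ W2.T) * (hidden > 0)   # ReLU lets only active neurons carry error
# ...and used to update every layer's weights and biases.
W2 -= lr * hidden.T @ d_scores;  b2 -= lr * d_scores[0]
W1 -= lr * x.T @ d_hidden;       b1 -= lr * d_hidden[0]
```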

He reorganized the knowledge of the entire neural network step by step.

He believed this was also exactly what the evil deity on the opposite side was waiting for.

It was also an exchange between the two of them!

An unspoken tacit understanding.