The experiment probes the weaknesses of rigidly uniform systems by letting a computer network go its own way.
In nature, strength comes from diversity, whether in the human brain or in an ecosystem. So why is artificial intelligence so standardized by comparison? A team from North Carolina State University set out to investigate.
Most artificial intelligence today takes the form of a neural network: artificial neurons that can make billions of connections with one another. Such a network usually consists of several layers, but within each layer the neurons are typically identical. The result is a static network. You can train it to be good at a task, and nowadays it can often largely do that itself, but once the system has been optimized for that task, there isn't much else you can do with it.
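To make the "identical neurons" point concrete, here is a minimal sketch of one homogeneous layer: every neuron applies the same activation rule (tanh here). The layer size, weights, and inputs are illustrative assumptions, not taken from the study.

```python
import math
import random

def homogeneous_layer(inputs, weights, biases):
    """One layer in which every neuron uses the SAME rule: a weighted
    sum of the inputs followed by tanh. This is the standard, static
    design the article describes."""
    return [
        math.tanh(sum(w * x for w, x in zip(row, inputs)) + b)
        for row, b in zip(weights, biases)
    ]

random.seed(0)
inputs = [0.5, -0.2, 0.8]
# Four identical neurons, each with its own (random) weights and bias.
weights = [[random.uniform(-1, 1) for _ in inputs] for _ in range(4)]
biases = [0.0] * 4

outputs = homogeneous_layer(inputs, weights, biases)
print(outputs)  # four values, all produced by the same tanh rule
```

Diversity in such a network lives only in the weights; the neurons themselves are interchangeable, which is exactly what the North Carolina team set out to change.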
The North Carolina team wanted to move away from those hard rules and try something closer to how computation works in nature. "Our real brain contains more than one type of neuron," explains physicist William Ditto on behalf of the research team. "So we've given our AI the ability to look at itself and decide whether the configuration of the neural network needs tweaking. The system has been handed the control knob of its own brain."
Then came training. The researchers used a standard dataset in which the network had to learn to recognize images of distorted digits. First the network attempts the task. It can then assess how well it is scoring and adjust itself: by giving certain functions more weight, by adding neurons, or by changing the type of a neuron. The system was left to figure this out on its own as much as possible. At every evaluation, the network chose more variety over more raw computing power.
Chaos beats monotony
And that pays off, as it turns out. Where the homogeneous neural network got 57% of its interpretations right, the most diverse network managed 70%. The more complex and messy the task given to the AI, the bigger the difference. For example, the networks also had to model how a star moves within a galaxy, with all the forces acting on it at the same time. The diverse network did this ten times better than the homogeneous one, says Ditto in an explanation of the study, which appeared in the journal Nature.
The results are in line with earlier findings in what has come to be called imitation of nature: the principle that artificial systems build on processes you see in nature. The researchers therefore compare their findings to arable farming, where monocultures are often at risk. Their diverse neural network is like a diversely sown field.
This does more than directly improve performance. The researchers believe that the fact that the network chose diversity itself, and an examination of the choices it made along the way, could also teach us new things about how diversity works in nature. Possibly, the more a network is allowed to go its own way, the more it will come to resemble natural processes.