The world's fastest supercomputer breaks an artificial intelligence record

On the west coast of America, the world's most valuable companies are trying to make artificial intelligence smarter. Google and Facebook boast of experiments using billions of photos and thousands of high-performance processors. But at the end of last year, a project in eastern Tennessee quietly surpassed the scale of any corporate artificial intelligence laboratory. And it was run by the US government.

US government supercomputer breaks record

The record-setting project involved the world's most powerful supercomputer, Summit, located at Oak Ridge National Laboratory. The machine claimed the crown in June of last year, reclaiming the title for the US five years after China first topped the list. As part of a climate research project, the giant computer ran a machine learning experiment that proceeded faster than any before it.

"Summit", occupying an area equivalent to twotennis courts, involved in this project more than 27,000 powerful graphics processors. He used their power to teach deep learning algorithms, the very technology that underlies advanced artificial intelligence. In the process of deep learning, algorithms perform exercises at a speed of a billion billion operations per second, known in supercomputer circles as exaflop.

"Previously, deep learning never reachedthis level of performance, ”says Prabhat, the head of the research team at the National Energy Research Center at the Lawrence Berkeley National Laboratory. His team collaborated with researchers at Summit Headquarters, Oak Ridge National Laboratory.

As you might guess, the AI training on the world's most powerful computer focused on one of the world's biggest problems: climate change. Technology companies teach algorithms to recognize faces or road signs; the government scientists taught theirs to recognize weather patterns, such as cyclones, in climate simulations that compress a century of forecasts of the Earth's atmosphere into three-hour intervals. (It is unclear, however, how much energy the project consumed and how much carbon was emitted into the air in the process.)

The Summit experiment matters for the future of both artificial intelligence and climatology. The project demonstrates the scientific potential of adapting deep learning to supercomputers, which traditionally simulate physical and chemical processes such as nuclear explosions, black holes, or new materials. It also shows that machine learning stands to benefit from more computing power, wherever it can be found, and could deliver further breakthroughs.

"We did not know that it could be done in suchscale until they did it, ”says Rajat Monga, technical director of Google. He and other googles helped the project by adapting the company's open source TensorFlow machine learning software to giant Summit scales.

Much of the work of scaling up deep learning has taken place in the data centers of Internet companies, where servers work together on problems by splitting them up, since the servers are connected relatively loosely rather than bound into one giant computer. Supercomputers like Summit have a different architecture: specialized high-speed interconnects link their thousands of processors into a single system that can work as one unit. Until recently, relatively little work had been done on adapting machine learning to this kind of hardware.
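The article does not show the project's code, but the tightly coupled style of training that such interconnects enable can be sketched. Below is a minimal, illustrative example of synchronous data-parallel gradient averaging over MPI; mpi4py, NumPy, and the stand-in `local_gradient` function are assumptions, not part of the Summit project:

```python
# Minimal sketch of synchronous data-parallel training over MPI.
# Assumes mpi4py and NumPy; local_gradient() is a stand-in for the
# gradient that backpropagation would produce on each worker's data shard.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, world = comm.Get_rank(), comm.Get_size()

weights = np.zeros(1000)  # toy model parameters, identical on every rank

def local_gradient(w):
    # Placeholder: in real training this is the gradient on this rank's batch.
    rng = np.random.default_rng(seed=rank)
    return rng.standard_normal(w.shape)

for step in range(10):
    grad = local_gradient(weights)
    # Allreduce sums the gradients across all processors over the fast
    # interconnect; dividing by the world size averages them, so every
    # rank applies the same update and the replicas stay in sync.
    summed = np.empty_like(grad)
    comm.Allreduce(grad, summed, op=MPI.SUM)
    weights -= 0.01 * (summed / world)
```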

Monga says the work of adapting TensorFlow to Summit's scale will also feed into Google's efforts to expand its internal artificial intelligence systems. Nvidia engineers also participated in the project, making sure the machine's tens of thousands of Nvidia graphics processors ran smoothly.
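The article does not describe how TensorFlow was wired into Summit. As a rough analogue, current TensorFlow exposes multi-worker data parallelism through the tf.distribute API, sketched below; the toy model and the cluster configuration (supplied via the TF_CONFIG environment variable on each worker) are assumptions:

```python
# Sketch of multi-worker data parallelism with TensorFlow's tf.distribute API.
# This illustrates the general idea of scaling training across machines;
# it is not the Summit project's actual code.
import tensorflow as tf

# Each worker runs this script; TF_CONFIG in the environment tells TensorFlow
# about its peers, and gradients are aggregated across workers every step.
strategy = tf.distribute.MultiWorkerMirroredStrategy()

with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation="relu", input_shape=(32,)),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="sgd", loss="mse")

# Stand-in data; a real job would shard a real dataset across the workers.
xs = tf.random.normal((1024, 32))
ys = tf.random.normal((1024, 1))
model.fit(tf.data.Dataset.from_tensor_slices((xs, ys)).batch(64), epochs=1)
```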

Finding ways to pour more computing power into deep learning algorithms has played an important role in the technology's recent rise. The same technology that Siri uses for voice recognition and that Waymo's cars use to read road signs became useful in 2012, after scientists adapted it to run on Nvidia graphics processors.

In an analysis published last May, scientists from OpenAI, a San Francisco research institute co-founded by Elon Musk, calculated that the amount of computing power in the largest public machine learning experiments has doubled approximately every 3.43 months since 2012, which amounts to an 11-fold increase per year. This progression helped bots from Alphabet defeat champions at demanding board and video games, and also drove a significant jump in the accuracy of Google's translator.
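The 11-fold figure follows directly from the doubling period; here is a quick check of the arithmetic (illustrative, not from the OpenAI analysis itself):

```python
# A doubling every 3.43 months compounds to roughly 11x over 12 months.
doubling_period_months = 3.43
growth_per_year = 2 ** (12 / doubling_period_months)
print(f"~{growth_per_year:.1f}x per year")  # ~11.3x
```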

Google and other companies are now creating new kinds of chips tailored for AI to keep this trend going. Google says that its "pods", which pack thousands of its AI chips, the tensor processing units, or TPUs, closely together, can provide 100 petaflops of computing power, one tenth of the speed reached by Summit.

Summit's contribution to climate science shows how giant-scale AI could improve our understanding of future weather conditions. When researchers generate a century's worth of weather predictions, reading the resulting forecast becomes a challenge. "Imagine you have a YouTube movie that is 100 years long. There is no way to find all the cats and dogs in it by hand," says Prabhat. Software is usually used to automate the search, but it is imperfect. The Summit results showed that machine learning can do the job much better, which should help in predicting storm impacts such as flooding.
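The article does not detail the detection model the researchers used. As a sketch of the general idea, a small convolutional network can classify gridded snapshots of a climate simulation; every shape, variable choice, and label below is an assumption for illustration:

```python
# Illustrative sketch: a tiny convolutional classifier that flags extreme
# weather patterns (cyclone vs. no cyclone) in gridded climate snapshots.
# Input shapes and channel choices are assumptions; the actual Summit
# model is not described in the article.
import tensorflow as tf

# Each sample: a 128x128 grid with three atmospheric variables per cell
# (say, pressure, wind speed, humidity) from one simulation snapshot.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(128, 128, 3)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # probability of a cyclone
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```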

According to Michael Pritchard, a professor at the University of California, Irvine, running deep learning on supercomputers is a relatively new idea that has arrived at a convenient moment for climate researchers. The slowdown in improvements to traditional processors has led engineers to equip supercomputers with ever more graphics chips so that performance keeps growing steadily. "The moment has come when you can no longer grow computing power the usual way," says Pritchard.

This shift has stalled traditional simulation work, which has had to adapt as a result. It also opens the door to harnessing the power of deep learning, which is a natural fit for graphics chips. Perhaps we will get a clearer picture of the future of our climate.

How would you use such a supercomputer? Tell us in our Telegram chat.