Recently, on a clear morning in Palm Springs, California, Vivienne Sze stepped onto a small stage to deliver perhaps the most nerve-racking presentation of her career. She knew the subject inside out: she was to tell the audience about the chips developed in her lab at MIT, chips that promise to bring powerful artificial intelligence to a range of devices with limited power budgets. Today, most AI computation is carried out in huge data centers. But it was the event, and the audience, that gave Sze pause.
Artificial intelligence on a chip
MARS, the venue, is an elite, invitation-only conference. Robots roll (or fly) around a luxury resort, and famous scientists mingle with science-fiction writers. Very few academics are invited to give technical talks, and those sessions are expected to be both inspiring and instructive. The gathering brought together roughly a hundred of the world's best-known researchers, executives, and entrepreneurs. MARS is hosted by none other than Amazon's founder and chairman, Jeff Bezos, who sat in the front row.
“The audience, so to speak, was of a rather high level,” Sze recalls with a laugh.
Speakers at MARS presented karate-fighting robots, insect-sized drones, and even optimistic blueprints for Martian colonies. Sze's chips might have seemed comparatively modest; to the naked eye they are indistinguishable from the chips inside any electronic device. But they were arguably more important than anything else on show at the event.
New chip features
The latest advances in chip design, like those developed in Sze's lab, may be crucial to future progress in artificial intelligence (AI), including for the kinds of drones and robots shown at MARS. Until recently, AI software has mostly relied on graphics chips, but new hardware could make AI algorithms more powerful, opening up new applications. New AI chips could make warehouse robots more ubiquitous, or let smartphones render photo-realistic augmented-reality scenery.
Sze's chips are both extremely efficient and flexible in their design, which matters in a field that is evolving rapidly.
These microchips are designed to squeeze more out of the “deep learning” AI algorithms that have already turned the world upside down, and in the process they may push the algorithms themselves to evolve. “We need new hardware because Moore's Law has slowed,” says Sze, referring to the axiom introduced by Intel co-founder Gordon Moore, who predicted that the number of transistors on a chip would double roughly every 18 months.
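To see what that doubling rate implies, here is a back-of-the-envelope sketch in Python; the starting count and time span are arbitrary illustration values, not figures from the article:

```python
def transistors(initial: int, months: int, doubling_period: int = 18) -> int:
    """Project a transistor count assuming a doubling every `doubling_period` months."""
    return initial * 2 ** (months // doubling_period)

# Starting from 1 million transistors, 9 years (108 months) gives six doublings:
print(transistors(1_000_000, 108))  # -> 64000000
```

Six doublings multiply the count 64-fold, which is why even a modest slowdown in the doubling period compounds into an enormous gap over a decade.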
That law is now increasingly running up against the physical constraints of engineering components at atomic scale, which is stimulating fresh interest in alternative architectures and approaches to computing.
The high stakes of investing in next-generation AI chips, and of maintaining America's dominance in chip making overall, are not lost on the US government. Sze's microchips are being developed with support from a DARPA program for new chip designs for artificial intelligence, a program created against the backdrop of China's rapid progress in the same area.
But innovation in chip design is being driven above all by deep learning, a very powerful way of training machines to perform useful tasks. Instead of giving a computer a set of rules to follow, the machine essentially programs itself. Training data is fed into a large simulated artificial neural network, which is then tuned so that it produces the desired result. With enough training, a deep learning system can find subtle and abstract patterns in data. The technique is being applied to a growing number of practical tasks, from face recognition on smartphones to predicting disease from medical images.
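The "tuning" described above can be sketched in a few lines of NumPy. This is a deliberately tiny toy, not any production system: a two-layer network learns XOR, a pattern no single linear rule captures, by repeatedly nudging its weights toward the desired outputs:

```python
import numpy as np

# Toy training data: the XOR function.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)   # hidden layer
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)   # output layer

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(5000):                 # repeatedly "tune" the network
    h = sigmoid(X @ W1 + b1)          # forward pass
    out = sigmoid(h @ W2 + b2)
    # Backward pass: gradient of squared error via the chain rule.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= 0.5 * h.T @ d_out; b2 -= 0.5 * d_out.sum(axis=0)
    W1 -= 0.5 * X.T @ d_h;   b1 -= 0.5 * d_h.sum(axis=0)

print(np.round(out.ravel(), 2))  # outputs typically approach [0, 1, 1, 0]
```

The inner loop is nothing but matrix multiplications and element-wise arithmetic, which is exactly the workload that AI chips are built to accelerate.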
New Chip Race
Deep learning is not particularly dependent on Moore's Law. Neural networks perform many mathematical operations in parallel, so they run far more efficiently on the specialized graphics chips built for video games, which carry out parallel calculations to render three-dimensional imagery. But chips designed specifically for the computations underlying deep learning should be more powerful still.
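A sketch of why the workload parallelizes so well (this is plain NumPy on a CPU, not Sze's hardware; the sizes are arbitrary illustration values): a neural-network layer is, at its core, one large matrix multiply, and every output neuron's dot product is independent of the others.

```python
import numpy as np

batch = np.random.rand(64, 512)      # 64 inputs, 512 features each
weights = np.random.rand(512, 256)   # a layer with 256 output neurons

# One call computes 64 * 256 independent dot products. On a GPU these
# run concurrently across thousands of cores; a deep-learning chip
# dedicates its silicon to exactly this kind of operation.
activations = batch @ weights

print(activations.shape)  # -> (64, 256)
```

Because none of those dot products depends on another, adding more arithmetic units speeds the layer up almost linearly, which is the opportunity specialized chips exploit.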
The potential of new chip architectures to improve artificial intelligence has sparked a level of entrepreneurial activity that the chip industry has not seen in decades.
Tesla has quietly developed its own AI chips for its cars.
Facebook, too, plans to build its own chips for better artificial intelligence.
Large technology companies hoping to use and commercialize AI — including Google, Microsoft, and Amazon — are working on their own deep learning chips, and many smaller companies are developing new chips as well. “It's impossible to keep track of all the companies jumping into this race for AI chips,” says Mike Demler, a microchip analyst at the Linley Group, an analyst firm. “I'm not joking: we learn of at least one every week.”
The real opportunity, Sze says, is not to build the most powerful deep learning chips possible. Energy efficiency matters, because AI also needs to run outside large data centers, relying only on the power available in a device's battery.
“AI will be everywhere — and figuring out how to make it all energy efficient will be extremely important,” says Naveen Rao, vice president of artificial intelligence products at Intel.
Sze's hardware is more efficient, for example, because it physically reduces the distance between where data is stored and where it is analyzed, and it uses clever schemes to reuse data. Before joining MIT, Sze pioneered this approach to improve the efficiency of video compression at Texas Instruments.
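The payoff of data reuse can be made concrete with a toy accounting exercise. This is an illustrative sketch, not a model of Sze's actual dataflow: it counts memory fetches for a 1-D convolution, with and without keeping the filter weights in local storage.

```python
def conv_fetches(signal_len: int, filter_len: int, reuse_weights: bool) -> int:
    """Count memory fetches for a 1-D convolution (toy cost model)."""
    positions = signal_len - filter_len + 1
    if reuse_weights:
        # Load the weights into local storage once, then reuse them
        # at every output position; inputs are still fetched per window.
        return filter_len + positions * filter_len
    # Naive: re-fetch both weights and inputs at every output position.
    return positions * 2 * filter_len

naive = conv_fetches(1024, 16, reuse_weights=False)
reused = conv_fetches(1024, 16, reuse_weights=True)
print(naive, reused)  # weight reuse nearly halves the traffic
```

Real designs go further, also reusing the overlapping input windows; since moving data costs far more energy than arithmetic, such reuse is where most of the efficiency comes from.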
In a field evolving as quickly as deep learning, the challenge for those working on AI chips is to make them flexible enough to be adapted to any application. It is easy to design a super-efficient chip that does only one thing, but such a product quickly becomes obsolete.
Sze's chip is called Eyeriss. Developed in collaboration with Joel Emer, an Nvidia research scientist and MIT professor, it was tested against a number of standard processors to see how it handles a range of different deep learning algorithms. According to a paper published last year, by combining efficiency with flexibility the new chip achieves performance 10 or even 1,000 times that of existing hardware.
Simpler AI chips are already having a significant impact. High-end smartphones now include chips optimized to run deep learning algorithms for image and voice recognition, and more efficient chips could let these devices run more powerful AI code with better capabilities. Self-driving cars need powerful computer chips too, as most current prototypes rely on a mountain of computing hardware.
Rao says the MIT chips are promising, but many factors will determine whether a new hardware architecture succeeds. One of the most important, he says, is developing software that lets programmers run code on it. “Making something usable from a compiler standpoint is probably the biggest obstacle to adoption,” he says.
Sze's lab is also exploring ways to design software that makes better use of the properties of existing computer chips — and that work goes beyond deep learning alone.
Together with Sertac Karaman of MIT's Department of Aeronautics and Astronautics, Sze developed Navion, a low-power chip that is remarkably efficient at performing three-dimensional mapping and navigation for a tiny drone. Navion shows how AI software (deep learning) and hardware (chips) are beginning to evolve together, in symbiosis.
Sze's chips may not attract as much attention as flying drones, but the fact that they were shown at MARS speaks to the importance of her technology for the future of AI. It may well be that at the next MARS conference, the robots and drones will have something new inside.
When do you think we can expect explosive growth in artificial intelligence? Let's discuss it in our Telegram chat.