ARTIFICIAL INTELLIGENCE
LeCun: ‘It’s Really Hard to Succeed with Exotic Hardware’

By Sally Ward-Foxton

At NeurIPS, a neural-net pioneer shares his view of the past and future of AI accelerator chips.

“It’s really hard to succeed with exotic hardware,” Facebook Chief AI Scientist Yann LeCun told the audience for his keynote speech at NeurIPS.

Addressing the global gathering of AI experts in Vancouver, Canada, in December, LeCun surveyed the history of specialized computing chips for processing neural-network workloads, offered a glimpse of what Facebook is working on, and made some predictions for the future of deep-learning hardware.

ANCIENT HISTORY
LeCun is a renowned visionary in the field of AI, having been at the forefront of neural-network research in the 1980s and 1990s. As a Bell Labs researcher in the late 1980s, he worked with the earliest types of dedicated neural-network processors, which comprised resistor arrays and were used to perform matrix multiplication. As neural networks fell out of favor in the late 1990s and early 2000s, LeCun was one of a handful of scientists who continued to work in the field. In his keynote, he shared some of the things he learned about hardware for deep learning during that time.

First, tools are really important. What killed neural nets (temporarily) in the ’90s was that only a few people — including LeCun — had tools to train them. LeCun and his colleagues spent a lot of time building what would now be called a deep-learning framework: a flexible piece of software that interpreted front-end languages, allowing the researchers to train and experiment with neural networks. The researchers’ work advanced the concept that deep-learning systems can be assembled from differentiable modules and then automatically differentiated. While novel at the time, this is common practice now.

The right tools gave LeCun’s team its “superpower” and were also an important factor in producing reproducible results, he said. “Good results are not enough … even if you get good results, people will still be skeptical,” he said. “Making those results reproducible is almost as important as actually producing the results in the first place.”

Along with the right tools, hardware performance is crucial to the research community, as hardware limitations can influence entire directions of research, said LeCun. “[What] the hardware community builds for research or for training actually influences what ideas people think of,” he said. “Entire ideas can be abandoned just because the hardware is not powerful enough, even though they were good ideas.”

The answer may not lie with new and novel forms of computing, he said, noting that many exotic fabrication technologies failed to take off when they didn’t fit in with the existing computing environment.

One of LeCun’s frustrations with today’s hardware solutions for AI acceleration is that most are built for matrix multiplication, not convolution, which is the key mathematical operation used in most image-processing and speech-recognition neural networks today. “[The prevailing approach] will become more and more wrong, in the sense that we are going to have bigger and bigger requirements for power,” he said. “If we build a generic piece of hardware where 95% of the cycles are spent on doing convolutions, we are not doing a good job.”
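One reason a matmul-oriented chip ends up spending so many of its cycles on convolution is that convolutions are usually lowered to matrix multiplication before they ever reach the hardware. As a rough illustration (not something LeCun presented), here is a minimal NumPy sketch of the common im2col lowering, which rewrites a 2D convolution as a single matrix product:

```python
import numpy as np

def conv2d_via_matmul(image, kernel):
    """Lower a 2D convolution to one matrix multiplication (im2col).

    Each kernel-sized patch of the input becomes one row of a matrix,
    so the whole convolution reduces to a single matrix-vector product,
    which is the operation matmul-oriented accelerators are built for.
    """
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    # Gather every kh-by-kw patch into a row (the "im2col" buffer).
    patches = np.empty((oh * ow, kh * kw))
    for i in range(oh):
        for j in range(ow):
            patches[i * ow + j] = image[i:i + kh, j:j + kw].ravel()
    # The entire convolution is now one matmul with the flattened kernel.
    return (patches @ kernel.ravel()).reshape(oh, ow)

image = np.random.rand(8, 8)
kernel = np.random.rand(3, 3)
# Reference: direct sliding-window computation (output is 6x6 here).
reference = np.array([[np.sum(image[i:i + 3, j:j + 3] * kernel)
                       for j in range(6)] for i in range(6)])
assert np.allclose(conv2d_via_matmul(image, kernel), reference)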
KILLER APP
The future, as LeCun described it, will see convolutional neural networks (CNNs) used in everything from toys to vacuum cleaners to medical equipment. But the killer app — the one application that will prove AI’s value to consumer devices — is the augmented-reality headset.

Facebook is currently working on hardware for AR glasses. It’s a huge hardware challenge because of the amount of processing required at low latency, powered only by batteries. “When you move, the overlaid objects in the world should move with the world, not with you, and that requires quite a bit of computation,” said LeCun.

Facebook envisions AR glasses that are operated by voice and interact through gestures via real-time hand tracking. While those features are possible today, they are beyond what we can do in terms of power consumption, performance, and form factor. LeCun noted a few “tricks” that can help.

For example, when running the same neural network on every frame of a video — perhaps to detect objects — it doesn’t matter if the result for one frame is wrong, because we can look at the frames before and after it and check for consistency. “So you could imagine using extremely low-power hardware that is not perfect; in other words, you can tolerate bit flips once in a while,” said LeCun. “It’s easy to do this by lowering the voltage of the power supply.”
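A minimal sketch of that consistency check, assuming the per-frame network output is a single label and using a simple majority vote over neighboring frames (the function and window size are illustrative, not details from the talk):

```python
from collections import Counter

def smooth_detections(per_frame_labels, window=1):
    """Majority-vote each frame's label against its temporal neighbors.

    A single corrupted result, say from a bit flip on an aggressively
    undervolted accelerator, is outvoted by the frames around it.
    """
    smoothed = []
    n = len(per_frame_labels)
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        votes = Counter(per_frame_labels[lo:hi])
        smoothed.append(votes.most_common(1)[0][0])
    return smoothed

# Frame 3's glitched "dog" is corrected by its neighbors.
labels = ["cat", "cat", "cat", "dog", "cat", "cat"]
print(smooth_detections(labels))  # ['cat', 'cat', 'cat', 'cat', 'cat', 'cat']
```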
NEURAL-NET DEVELOPMENTS
The rapid evolution of neural networks is a major challenge for hardware design. For example, dynamic networks — those with memory that can be trained to learn sequential or time-varying patterns — are gaining in popularity, especially for natural-language processing (NLP). However, they behave differently from many assumptions made by current hardware. The compute graph can’t be optimized at compile time; that has to be done at runtime. It’s also rather difficult to implement batching, a popular technique through which more than one sample is processed at once to improve performance.

“All the most common hardware that we have at our disposal assumes that you can batch, because if you have a batch with more than one sample, then you can turn every
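The batching idea itself is easy to picture. Below is a minimal NumPy sketch, with hypothetical layer dimensions, showing how stacking a batch of input vectors turns many separate matrix-vector products into one matrix-matrix product, the shape of work that today’s accelerators are built to keep busy:

```python
import numpy as np

# One layer's weights and a batch of input vectors (hypothetical sizes).
W = np.random.rand(256, 128)
batch = [np.random.rand(128) for _ in range(64)]

# Unbatched: 64 separate matrix-vector products, each reloading W.
unbatched = [W @ x for x in batch]

# Batched: stack the samples so the same work becomes a single
# matrix-matrix product that keeps the hardware's matmul units full.
X = np.stack(batch, axis=1)  # shape (128, 64)
batched = W @ X              # shape (256, 64)

assert np.allclose(np.stack(unbatched, axis=1), batched)
```

With a dynamic network, the samples in a batch can take different paths through the model, so they can no longer be stacked into one uniform matrix, which is why batching becomes hard in exactly the cases the article describes.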