Conventional Memory Is Key to Ubiquitous AI
high-bandwidth but low-power bulk memory, as it’s intrinsically nonvolatile. There are already moves to process AI workloads in embedded devices, such as an industrial endpoint, then shift some tasks to a local 5G-connected base station. More complex tasks would be shipped to cloud data centers. “There’s already work going on in stratifying that way, because there’s frankly not enough bandwidth going back to the core,” Ober said.

FEDERATED LEARNING

That hierarchical approach to distributed AI supports incremental training, or “federated learning,” allowing for continuous improvement, said Ober. “There’s constant retraining of neural networks and updating them. You’ve got to have some nonvolatile memory or some memory that you can push these updates out to in all of these devices — no matter how small or large.”
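What Ober describes is essentially federated averaging: each device refines the model on its local data, a coordinator merges the updates, and the merged weights are pushed back out to every endpoint. Below is a minimal sketch of that loop, assuming a toy linear model; the function names, hyperparameters, and data are invented for illustration and are not from any vendor’s stack.

```python
# Minimal federated-averaging sketch: each edge device refines the model
# on its own data, and a coordinator merges the updates before pushing
# the new weights back out. Names and hyperparameters are illustrative.
import numpy as np

def local_update(weights, x, y, lr=0.05, epochs=5):
    """One device's contribution: a few SGD steps on a linear model."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * x.T @ (x @ w - y) / len(y)   # mean-squared-error gradient
        w -= lr * grad
    return w

def fed_avg(updates, sizes):
    """Coordinator merges updates, weighted by each device's sample count."""
    total = sum(sizes)
    return sum(w * (n / total) for w, n in zip(updates, sizes))

rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0, 0.5])
global_w = np.zeros(3)
for _ in range(10):                             # ten retraining rounds
    updates, sizes = [], []
    for _ in range(4):                          # four edge devices
        x = rng.normal(size=(32, 3))
        y = x @ true_w + rng.normal(scale=0.1, size=32)
        updates.append(local_update(global_w, x, y))
        sizes.append(len(y))
    global_w = fed_avg(updates, sizes)          # push merged weights back out
print(global_w)                                 # converges toward true_w
```

The last step of each round is where memory enters the picture: the merged weights must land in storage on every device, which is why Ober ties federated learning to nonvolatile memory.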
For example, Lenovo’s ThinkEdge includes an AI-capable edge appliance. It uses high-performance DDR4 DRAM and high-capacity SSD storage to support AI and ML models such as computer vision used for tracking warehouse and logistics operations or automating manufacturing processes.

For industrial robotics and automotive use cases such as autonomous vehicles (AVs), more memory bandwidth and capacity may be necessary, but it doesn’t have to be the top of the line.

[Photo: Macronix’s Jim Yastic]

Jim Yastic, director of technical marketing at Macronix, said that AI’s hype cycle is similar to that for the internet of things, which is now doing much of the heavy lifting in automotive, industrial, and security settings. IDC predicts that by 2023, 70% of IoT deployments will include AI for autonomous or edge decision-making, with computer vision among the fastest-growing edge AI applications.

Yastic said that a distributed approach to AI makes sense, as doing everything in data centers is expensive. Just as IoT devices have taken on more processing capabilities locally, more AI operations are moving out of data centers, with local systems determining what needs to be sent back to a central cloud.

In the industrial and automotive segments, memory requirements for edge AI are being dictated by the various types of sensors, all performing some level of filtering and contributing to the creation of better ML models by sending selected data back to a central location. The new models are then downloaded.
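The loop this paragraph describes (filter locally, upload only the selected samples, download the retrained model) is short to sketch. Everything below is hypothetical: the threshold, the scoring function, and the uplink and downlink stubs are invented for illustration.

```python
# Illustrative edge-endpoint loop: score each reading on-device, send only
# the interesting samples upstream, and pull down the retrained model.
# Every name and threshold here is hypothetical.
import random

UPLOAD_THRESHOLD = 0.8        # only "surprising" readings leave the device

def score(reading):
    """Stand-in for on-device inference; returns an interest score in [0, 1)."""
    return random.random()

def send_to_cloud(batch):
    """Stub for the uplink; a real endpoint would batch and compress."""
    print(f"uploading {len(batch)} selected samples")

def fetch_model_version(current):
    """Stub for the downlink; the centrally retrained model arrives here."""
    return current + 1

readings = [random.gauss(0.0, 1.0) for _ in range(1000)]
selected = [r for r in readings if score(r) > UPLOAD_THRESHOLD]
send_to_cloud(selected)       # roughly 20% of samples go back upstream
model_version = fetch_model_version(0)
```

The bandwidth saving is the entire argument: only the selected slice of the raw stream crosses the network, while the heavy retraining stays central.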
That approach is necessary because sectors such as automotive simply can’t deal with terabytes of data over a short period, said Yastic. The local system must make some smart decisions quickly without transferring lots of data back and forth, even with the availability of 5G. In AVs, 5G supports ADAS and AI functionality.

Yastic said that the speed with which decisions must be made by different devices determines AI system architecture and, hence, memory requirements as measured in terms of performance and density. “Depending on the application, it could be just [an embedded multimedia card],” he said.
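One way to make the performance side concrete is to compare a sensor stream’s data rate against what common embedded storage interfaces can move. The interface rates below are approximate theoretical peaks, and the camera parameters are invented for the example:

```python
# Rough check of which storage interface keeps up with a sensor stream.
# Interface rates are approximate theoretical peaks; sustained real-world
# figures are lower. The camera parameters are illustrative.
rates_mb_s = {
    "eMMC 5.1 (HS400)":   400,    # the "embedded multimedia card" case
    "UFS 3.1 (2 lanes)":  2900,
    "NVMe SSD (Gen3 x4)": 3900,
}

width, height, bytes_per_px, fps = 1920, 1080, 2, 30     # one uncompressed camera
stream_mb_s = width * height * bytes_per_px * fps / 1e6  # about 124 MB/s

for name, rate in rates_mb_s.items():
    print(f"{name}: ~{int(rate // stream_mb_s)} such camera streams at peak")
```

A single modest camera sits comfortably inside eMMC’s envelope, which is Yastic’s point; stack up several high-resolution sensors and the application climbs to faster interfaces.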
MEMORY MENU

Other memory devices for automotive and industrial AI could include universal flash storage, NAND flash SSDs, DRAM, and even SRAM.

What hasn’t changed in many of these ecosystems, especially automotive, is the need for reliability, safety, and security — which is why incumbent memories will remain the first choice, even for AI tasks. As much as today’s cars are servers on wheels, they are also a collection of embedded endpoints, including sensors and cameras with on-board memory that needs to last as long as the vehicle.

Reliability and longevity are among the reasons that NOR flash will play a role in automotive AI over the long term, enabling operation in harsh environments for a decade or more, Yastic predicted. NOR flash is also favored by carmakers for its fast-bootup capabilities. For example, Macronix’s OctaFlash SPI NOR flash offers quick startup and a fast interface that can reach most endpoints in an AV.

It also comes down to cost, Yastic noted: NOR flash has been around a long time, so the price points have dropped.

Over time, all memory technologies inevitably increase in density and performance while consuming less power in a smaller form factor at a lower cost. The need for high-performance memory in data centers to crunch AI and ML workloads remains, but so, too, do opportunities for commodity memories to fulfill many AI requirements in distributed systems.

According to Steve Woo, a Rambus fellow, the history of computing is a predictor of the future of memory in AI systems over the longer term. “Today’s supercomputer is tomorrow’s smartphone,” he said.

Some of the earlier AI models that needed high-end hardware can now be handled using more mainstream memories. “It’s much more accessible now in part because the semiconductor industry has done its part to miniaturize and had to drive the cost out of that hardware,” said Woo.

Today’s HBM2 could soon become a few DDR DIMMs and other memories connected via Compute Express Link (CXL). “You’ll be able to get to the same kind of performance levels that seem so out of reach today,” he said.
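A back-of-the-envelope comparison shows why that is plausible. The figures below are published theoretical peaks (one HBM2 stack at 2.0 Gb/s per pin on a 1,024-bit interface, a 64-bit DDR4-3200 channel, and the PCIe 5.0 x16 physical layer that CXL 2.0 rides on); they ignore protocol overhead and latency, so treat the comparison as illustrative:

```python
# Back-of-the-envelope peak-bandwidth comparison. Theoretical peaks only;
# real systems lose bandwidth to protocol overhead and add latency.

hbm2_stack = 1024 * 2.0 / 8            # 1,024 pins x 2.0 Gb/s = 256 GB/s/stack
ddr4_channel = 3200e6 * 8 / 1e9        # DDR4-3200, 64-bit: 25.6 GB/s/channel
pcie5_x16 = 32 * 16 / 8                # ~64 GB/s per direction (CXL 2.0 PHY)

print(f"CXL 2.0 link (PCIe 5.0 x16): ~{pcie5_x16:.0f} GB/s per direction")
for dimms in (2, 4, 8):
    agg = dimms * ddr4_channel
    print(f"{dimms} DDR4 DIMMs: {agg:.1f} GB/s = {agg / hbm2_stack:.0%} of one HBM2 stack")
```

Faster DDR5 channels and additional CXL links narrow the remaining gap, which is in line with Woo’s prediction.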
Woo likens the mainstreaming of AI to the decade-long evolution of the smartphone. “There were all kinds of developers coming up with new ways to use the technology,” he said. With scaling, the market grew to the point where specialized memories that served the low-power market were developed as volume increased. Woo expects the same synergies for AI memories: “The costs will continue to be driven down. Specialized components will be justified now because you can do the [return on investment] for it.”

These advances are also aligned with architectural changes being made to the internet. “Data movement is becoming the bottleneck,” said Woo. Moving data to the cloud for processing consumes far too much energy, so processing locally drives down cost and improves performance while consuming less power.
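The energy asymmetry behind that statement is well documented. The per-operation estimates below are commonly cited approximations (after Horowitz’s ISSCC 2014 figures for a 45 nm process) and are meant only to show orders of magnitude:

```python
# Commonly cited per-operation energy approximations (45 nm process,
# order of magnitude, after Horowitz, ISSCC 2014). Illustrative only.
picojoules = {
    "32-bit integer add":      0.1,
    "32-bit float multiply":   3.7,
    "32-bit SRAM read (8 KB)": 5.0,
    "32-bit DRAM read":        640.0,
}

ops = 1e9  # energy for a billion of each operation
for name, pj in picojoules.items():
    print(f"{name:24s}: {ops * pj / 1e12:7.2f} J per billion ops")
# Feeding every operand from off-chip DRAM costs ~100x the arithmetic
# itself, before any network hop; moving compute to the data wins on energy.
```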
[Photo: Rambus’s Steve Woo]

Woo also sees inference and computational tasks, as well as endpoint type, determining which memories are most suitable as AI advances. Regardless, thermal characteristics and power constraints will be a factor. “You can see the tradeoffs,” he said, adding that if it’s just inference, then on-chip SRAM may be enough.
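Whether on-chip SRAM is enough is largely a sizing exercise: do the quantized weights, plus working buffers, fit in the SRAM budget? Here is a rough check under invented assumptions (a 2 MB SRAM budget, int8 weights, illustrative model sizes):

```python
# Rough feasibility check: can an int8-quantized model infer entirely from
# on-chip SRAM? The SRAM budget and model sizes are illustrative.
SRAM_BYTES = 2 * 2**20            # assume a 2 MB on-chip SRAM budget

models = {                        # approximate parameter counts
    "keyword spotter":        250_000,
    "MobileNet-class vision": 4_000_000,
    "small transformer":      30_000_000,
}

for name, params in models.items():
    weight_bytes = params         # int8 quantization: one byte per weight
    fits = weight_bytes <= 0.8 * SRAM_BYTES   # keep ~20% for activations
    verdict = "fits in SRAM" if fits else "needs external DRAM/flash"
    print(f"{name}: {weight_bytes / 2**20:.1f} MB of weights, {verdict}")
```

Models above the line push the designer back to the external DRAM and flash options discussed earlier.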
What ultimately matters for memory as AI becomes ubiquitous and distributed across different platforms is the streamlining of neural networks, that is, making them run on mainstream platforms, Woo said. AI-based applications will require supercomputing for the foreseeable future, the Rambus fellow added, but Moore’s Law scaling and other memory advances will help bring data closer to computing resources. The challenge for any new memory type is demonstrating benefits that justify replacing something that’s tried and true.

“There’s going to be some finite number of memories that are really needed in the industry,” said Woo. “There are a bunch of incumbents that appear to be good enough in a lot of cases.” ■

Gary Hilson is a contributing writer. This article was originally published on EE Times and may be viewed at bit.ly/3uiqXmn.