Memory Technologies Confront Edge AI’s Diverse Challenges
“We think LPDDR will be the most popular: A single DRAM gives more than 10 GB/s of bandwidth … yet has enough bits to store the weights/intermediate activations,” said Tate. “Any other DRAM would require more chips and interfaces, and more bits would need to be bought that aren’t used.”
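For a rough sense of where a figure like that comes from (the quote does not specify a device grade or interface width, so an LPDDR4X part at 4266 MT/s on a 32-bit interface is assumed here purely for illustration), peak bandwidth is simply the transfer rate multiplied by the bus width:

# Back-of-envelope peak bandwidth for a single LPDDR package.
# Assumed figures (not from the article): LPDDR4X at 4266 MT/s, x32 interface.

def peak_bandwidth_gb_s(data_rate_mt_s: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s: transfers per second times bytes per transfer."""
    return data_rate_mt_s * 1e6 * (bus_width_bits / 8) / 1e9

print(peak_bandwidth_gb_s(4266, 32))  # ~17.1 GB/s for a x32 device
print(peak_bandwidth_gb_s(4266, 16))  # ~8.5 GB/s for a single x16 channel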
Is there room for any emerging memory technologies here? “The wafer cost goes up dramatically when using any emerging memory, whereas SRAM is ‘free,’ except for silicon area,” he added. “As economics change, the tipping point could change, too, but it will be further down the road.”
EMERGING MEMORIES

Despite those economics, other memory types hold future possibilities for AI applications.

Magnetoresistive RAM (MRAM) stores each bit of data via the orientation of magnets controlled by an applied electrical voltage. If the voltage is lower than required to flip the bit, there is only a probability that a bit will flip. This randomness is unwanted, so MRAM is driven with higher voltages to prevent it. However, some AI applications can take advantage of this inherent stochasticity (which can be thought of as the process of randomly selecting or generating data).

Experiments have applied MRAM’s stochasticity capabilities to binarized neural networks (BNNs), a technique whereby the precision of all the weights and activations is reduced to 1 bit. This is used to reduce compute and power requirements dramatically for far-edge applications. There may be a tradeoff with accuracy, depending on how the network is retrained, but in general, the neural network can be made to function reliably despite the reduced precision.
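To make the idea concrete, here is a minimal sketch of sign binarization (the layer sizes and use of NumPy are illustrative assumptions, not a description of any vendor’s implementation). With every weight and activation reduced to +1 or -1, a multiply-accumulate collapses into counting sign agreements, which hardware can evaluate with XNOR and popcount rather than multipliers:

import numpy as np

rng = np.random.default_rng(0)

def binarize(x: np.ndarray) -> np.ndarray:
    """Reduce precision to 1 bit: every value becomes +1 or -1."""
    return np.where(x >= 0, 1, -1).astype(np.int8)

# Toy full-precision weights and activations for one layer (shapes are arbitrary).
weights = rng.normal(size=(128, 256))
activations = rng.normal(size=256)

w_bin = binarize(weights)
a_bin = binarize(activations)

# With +/-1 operands, the dot product equals (sign agreements) - (disagreements),
# so it can be computed by XNOR followed by popcount -- no multipliers needed.
out = w_bin.astype(np.int32) @ a_bin.astype(np.int32)
agreements = (w_bin == a_bin).sum(axis=1)            # popcount of XNOR
assert np.array_equal(out, 2 * agreements - a_bin.size)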
[Photo: Spin Memory’s Andy Walker]

“Binarized neural networks are unique in that they can function reliably, even as the certainty of a number being –1 or +1 is reduced,” said Andy Walker, product vice president at Spin Memory. “We have found that such BNNs can still function with high levels of accuracy, as this certainty is reduced [by] introducing what is called ‘bit error rate’ of the memory bits being written incorrectly.”

MRAM can naturally introduce bit error rates in a controlled manner at low voltage levels, maintaining accuracy while lowering power requirements. The key is determining the optimum accuracy at the lowest voltage and shortest time. That translates into the highest energy efficiency, Walker said.
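That tradeoff can be mimicked in software as a rough illustration (the error rates and layer sizes below are arbitrary assumptions, not Spin Memory’s measurements): flip each stored weight with a given probability, then check how often the layer’s output signs change.

import numpy as np

rng = np.random.default_rng(1)

def inject_bit_errors(w_bin: np.ndarray, bit_error_rate: float) -> np.ndarray:
    """Flip each stored +/-1 weight with the given probability, emulating
    low-voltage writes that occasionally store the wrong state."""
    flips = rng.random(w_bin.shape) < bit_error_rate
    return np.where(flips, -w_bin, w_bin)

# Toy binarized layer (sizes are illustrative).
w_bin = rng.choice([-1, 1], size=(128, 1024)).astype(np.int32)
a_bin = rng.choice([-1, 1], size=1024).astype(np.int32)

clean_signs = np.sign(w_bin @ a_bin)
for ber in (1e-4, 1e-3, 1e-2, 1e-1):
    noisy = inject_bit_errors(w_bin, ber)
    changed = np.mean(np.sign(noisy @ a_bin) != clean_signs)
    print(f"bit error rate {ber:.0e}: {changed:.1%} of output signs changed")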
While this technique also applies to higher-precision neural networks, it’s especially suited to BNNs because the MRAM cell has two states, matching the binary states in a BNN.

Using MRAM at the edge is another potential application, according to Walker. “For edge AI, MRAM has the ability to run at lower voltages in applications where high-performance accuracy isn’t a requirement but improvements in energy efficiency and memory endurance are very important,” he said. “In addition, MRAM’s inherent nonvolatility allows for data conservation without power.”

One application is as a so-called unified memory, “where this emerging memory can act as both an embedded flash and SRAM replacement, saving area on the die and avoiding the static power dissipation inherent in SRAM,” said Walker.

While Spin Memory’s MRAM is on the verge of commercial adoption, this specific BNN implementation would work best on a variant of the basic MRAM cell and hence remains at the research stage.

NEUROMORPHIC ReRAM

Another emerging memory for edge AI applications is ReRAM. Recent research by Politecnico di Milano using Weebit Nano’s silicon oxide (SiOx) ReRAM technology showed promise for neuromorphic computing. The ReRAM added a dimension of plasticity to neural network hardware; that is, it could evolve as conditions change, a useful quality in neuromorphic computing.

Current neural networks can’t learn without forgetting tasks they’ve been trained on, while the brain can do so quite easily. In AI terms, this is “unsupervised learning,” wherein the algorithm performs inference on datasets without labels, looking for its own patterns in the data. The eventual result could be ReRAM-enabled edge AI systems that can learn new tasks in situ and adapt to the environment around them.

Overall, memory makers are introducing technologies that offer the speed and bandwidth required for AI applications. Various memories, whether on the same chip as the AI compute, in the same package, or on separate modules, are available to suit many edge AI applications.

While the exact nature of memory systems for edge AI depends on the application, GDDR, HBM, and Optane are proving popular for data centers, while LPDDR competes with on-chip SRAM for endpoint applications.

Emerging memories are lending their novel properties to research designed to advance neural networks beyond the capabilities of today’s hardware to enable future power-efficient, brain-inspired systems. ■

Sally Ward-Foxton is editor-in-chief of EE Times Weekend.