SPECIAL REPORT: ARTIFICIAL INTELLIGENCE

Conventional Memory Is Key to Ubiquitous AI

By Gary Hilson

As the hype around artificial intelligence dies down and engineers confront new challenges, it’s becoming clearer that not all machine-learning and inference tasks will require advanced memory technology. Proven conventional memories can handle AI at the edge, while distributed AI could be the perfect solution for 5G.

Even so, basic inference operations are already becoming more complex, and overall, memory will be expected to do more for inference.
Bob O’Donnell, TECHnalysis Research president and chief analyst, sees AI as integral to realizing the promise of 5G. Only when the two are combined will new applications be realized. “The irony is everybody’s been treating each of these as separate animals: 5G is one thing, edge is another thing, [and] AI has been another thing,” said O’Donnell. “You really need the combination of these things for any of them to really live up to what they’re capable of.”
Centralized AI has already proven itself to a certain degree as edge processor development advances and memories such as LPDDR are enlisted to handle mundane AI tasks at the edge. “A camera in a room can do the very simple AI processing to detect the number of people in the room and therefore adjust the HVAC,” said O’Donnell. While not sexy, those tasks can be processed locally among a group of buildings with modest compute and memory power, eliminating the need to send data back and forth to the cloud.

[Photo caption: Edge appliances such as Lenovo’s ThinkEdge use DDR DRAM and flash SSDs to process and store data, enabling local AI and secure cloud management. (Source: Lenovo)]
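To make the camera-to-HVAC scenario concrete, here is a minimal sketch of the kind of local control loop O’Donnell describes. Everything in it (capture_frame, count_people, the setpoint policy) is a hypothetical illustration rather than any vendor’s API; the point is that inference and actuation never leave the building.

# Minimal sketch: local occupancy inference driving an HVAC setpoint.
# count_people() stands in for a small on-device vision model; frames
# never leave the building, so modest compute and LPDDR-class memory
# are enough.
import random
import time

def capture_frame():
    """Hypothetical camera read; returns an opaque frame object."""
    return object()

def count_people(frame) -> int:
    """Hypothetical stand-in for a tiny local vision model."""
    return random.randint(0, 12)  # pretend inference result

def setpoint_for(occupancy: int) -> float:
    """Illustrative policy: cool harder as the room fills up."""
    return 22.0 - min(occupancy, 10) * 0.2  # nudge cooling per person

if __name__ == "__main__":
    for _ in range(3):  # a few control iterations
        occupancy = count_people(capture_frame())
        print(f"{occupancy} people -> setpoint {setpoint_for(occupancy):.1f} C")
        time.sleep(1)  # a real system would poll far less often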
There’s also a middle ground, O’Donnell said, whereby edge devices process data locally while imbued with enough intelligence to know when to send files to a data center for “in-depth crunching.” One outcome would be improved algorithms sent back to the edge. “There’s this continuous loop of improvement,” the analyst said. “That’s where things start to get very interesting.”
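As a rough sketch of that loop, assume the edge device escalates only low-confidence results and periodically pulls back a retrained model. The threshold and function names below (CONFIDENCE_FLOOR, send_to_datacenter, fetch_updated_model) are illustrative, not any specific product’s interface.

# Sketch of the edge/data-center "continuous loop of improvement":
# run inference locally, escalate only the samples the local model is
# unsure about, and periodically pull back an improved model.

CONFIDENCE_FLOOR = 0.80  # below this, defer to the data center

def classify_locally(sample):
    """Hypothetical local model: returns (label, confidence)."""
    return "person", 0.65

def send_to_datacenter(sample):
    """Hypothetical uplink for 'in-depth crunching' of hard cases."""
    print(f"escalated {sample} for deeper analysis")

def fetch_updated_model():
    """Hypothetical download of retrained weights pushed back to the edge."""
    print("pulled updated model weights")

def handle(sample):
    label, confidence = classify_locally(sample)
    if confidence < CONFIDENCE_FLOOR:
        send_to_datacenter(sample)  # hard case: let the data center decide
    return label

handle("frame-0001")
fetch_updated_model()  # e.g., on a nightly schedule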
Memory dedicated to distributed AI applications will be relatively low-end, O’Donnell predicted, and those memory types could be used in a variety of apps, such as distributed edge devices. “My guess is that LPDDR-type memories would make the most logical sense,” he said.
But even low-power DDR could get a boost above and beyond the typical device types used in smartphones, vehicles, and various edge endpoints. During a recent update discussing progress on pushing processing-in-memory (PIM) technology into the mainstream (bit.ly/3uiqctF), Samsung noted that the technology could eventually be applied to other types of memory to enable AI workloads. That could include LPDDR5 used to bring AI to the edge inside a variety of endpoint devices without requiring data-center connectivity.

Samsung has demonstrated an LPDDR5-PIM that more than doubles performance while cutting energy usage by over 60% in applications such as voice recognition, translation, and chatbots.
AI, 5G

Some distributed AI requiring memory is helping to operate 5G base stations, said Robert Ober, chief platform architect at Nvidia. That 5G infrastructure at the edge sometimes has more bandwidth than the older infrastructure to which it’s connected, so some inference is required to manage network transactions. “It’s just too complicated to do with explicit programming,” Ober said.

Many edge use cases for AI are quite mundane, using embedded devices that require memories with a small physical and power footprint. The challenge, said Ober, is that even basic AI functions such as image recognition and classification at the edge are becoming bigger jobs. Higher-resolution images up to 4K, combined with the need for more information and context, mean these neural networks are more complex. “If it’s a video, then you have multiple frames you want to use to extract meaning over time,” said Ober. “Memory is really important there.”
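Some quick arithmetic, assuming uncompressed 8-bit RGB frames (a simplification; real pipelines compress, tile, and reuse buffers), shows why multi-frame 4K inference leans so hard on memory.

# Back-of-the-envelope: input footprint of a multi-frame 4K window,
# assuming uncompressed 8-bit RGB frames.
width, height, channels = 3840, 2160, 3      # 4K UHD, RGB
bytes_per_frame = width * height * channels  # ~24.9 MB per frame
window = 16                                  # frames pooled for temporal context

total_mb = window * bytes_per_frame / 1e6
print(f"{bytes_per_frame / 1e6:.1f} MB/frame, {total_mb:.0f} MB per {window}-frame window")
# ~24.9 MB per frame, ~398 MB per 16-frame window -- before counting
# network weights or intermediate activations.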
Nvidia is focused on data-center training workloads, in which memory capacity and bandwidth are critical even as power consumption must be held down, said Ober. Hence, different memory technologies could play a role in future AI rollouts, including voltage-controlled MRAM, which could reduce power, sustain bandwidth, and free up power for compute. “You’ll have some really interesting solutions longer term,” he said.

As memory capabilities rise to meet AI demands, so, too, will expectations, Ober added, as the exponential growth of AI complexity has been consistent. “The more knowledge you can codify, the more stuff it can do,” he said.

Training a network is essentially codifying information, and it’s no longer enough for an edge device to detect a dog. “They want to know what type of dog: What’s it doing? Is it happy? Is it sad?” said Ober. “The expectations continue to rise exponentially.”

As functions such as image detection and classification for robotics improve, AI and ML workloads in the data center will be expected to do more. Hence, there’s a continuing need for high-performance computing, Ober said, and there will always be new AI tasks that are more complex, take more time, and require more machine intelligence.

Shifting data tied to an AI task into the right memory is among the biggest challenges for AI in the data center. Another is reducing the need to send every workload back to a central cloud, thereby placing greater strain on memory resources. Ober foresees demand for new
