A compute-in-memory chip based on resistive random-access memory.

Authors: Wan Weier, Kubendran Rajkumar, Schaefer Clemens, Eryilmaz Sukru Burc, Zhang Wenqiang, Wu Dabin, Deiss Stephen, Raina Priyanka, Qian He, Gao Bin, Joshi Siddharth, Wu Huaqiang, Wong H-S Philip, Cauwenberghs Gert
Realizing increasingly complex artificial intelligence (AI) functionalities directly on edge devices calls for unprecedented energy efficiency of edge hardware. Compute-in-memory (CIM) based on resistive random-access memory (RRAM)(1) promises to meet such demand by storing AI model weights in dense, analogue and non-volatile RRAM devices, and by performing AI computation directly within RRAM, thus eliminating power-hungry data movement between separate compute and memory(2-5). Although recent studies have demonstrated in-memory matrix-vector multiplication on fully integrated RRAM-CIM hardware(6-17), it remains a goal for a RRAM-CIM chip to simultaneously deliver high energy efficiency, versatility to support diverse models and software-comparable accuracy. Although efficiency, versatility and accuracy are all indispensable for broad adoption of the technology, the inter-related trade-offs among them cannot be addressed by isolated improvements on any single abstraction level of the design. Here, by co-optimizing across all hierarchies of the design, from algorithms and architecture to circuits and devices, we present NeuRRAM, a RRAM-based CIM chip that simultaneously delivers versatility in reconfiguring CIM cores for diverse model architectures, energy efficiency that is two times better than previous state-of-the-art RRAM-CIM chips across various computational bit-precisions, and inference accuracy comparable to software models quantized to four-bit weights across various AI tasks, including 99.0 percent accuracy on MNIST(18) and 85.7 percent on CIFAR-10(19) image classification, 84.7 percent accuracy on Google speech command recognition(20), and a 70 percent reduction in image-reconstruction error on a Bayesian image-recovery task.
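The analogue matrix-vector multiplication at the heart of RRAM-CIM can be sketched numerically: weights quantized to four bits are mapped to device conductances, input activations drive the array rows as voltages, and each output column collects a current proportional to the dot product via Ohm's and Kirchhoff's laws. The sketch below is an illustrative model only; the differential conductance mapping, unit conductance value and noise term are assumptions for demonstration and are not taken from the paper.

```python
import numpy as np

def quantize_weights(w, bits=4):
    """Uniformly quantize weights to signed `bits`-bit integers (illustrative)."""
    qmax = 2 ** (bits - 1) - 1            # e.g. 7 for 4-bit signed weights
    scale = np.max(np.abs(w)) / qmax
    return np.round(w / scale).astype(int), scale

def analog_mvm(w_int, v_in, g_unit=1e-6, noise_sigma=0.0):
    """Model an in-memory matrix-vector multiply on an RRAM crossbar.

    Positive and negative weights are mapped to a differential pair of
    conductances (assumed scheme); row voltages drive the array, and each
    column current is the analogue dot product by Ohm's/Kirchhoff's laws.
    """
    g_pos = np.clip(w_int, 0, None) * g_unit    # conductances for + weights
    g_neg = np.clip(-w_int, 0, None) * g_unit   # conductances for - weights
    if noise_sigma > 0:                         # optional device variation
        g_pos = g_pos + np.random.normal(0, noise_sigma * g_unit, g_pos.shape)
        g_neg = g_neg + np.random.normal(0, noise_sigma * g_unit, g_neg.shape)
    i_out = v_in @ g_pos.T - v_in @ g_neg.T     # differential column currents
    return i_out / g_unit                       # convert back to weight units

# Usage example: compare the analogue result with an ideal digital MVM.
rng = np.random.default_rng(0)
w = rng.normal(size=(8, 16))       # 8 outputs, 16 inputs
x = rng.normal(size=16)
w_int, scale = quantize_weights(w)
y_analog = analog_mvm(w_int, x) * scale
y_digital = w @ x
print("max |analogue - digital| error:", np.max(np.abs(y_analog - y_digital)))
```

Because every weight is represented by at most a few discrete conductance levels, the analogue output differs from the ideal floating-point result only by the four-bit quantization error (plus any modelled device noise), which is consistent with the paper's claim of accuracy comparable to software models quantized to four-bit weights.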
