Computing-in-memory technology is poised to eliminate the massive data communication bottlenecks otherwise associated with performing AI speech processing at the network's edge. However, it requires an embedded memory solution that simultaneously performs neural network computation and stores weights. Microchip Technology Inc., through its Silicon Storage Technology (SST) subsidiary, has announced that its SuperFlash memBrain neuromorphic memory solution has solved this problem for the WITINMEM neural processing SoC, the first in volume production that enables sub-mA systems to reduce speech noise and recognise hundreds of command words, in real time and immediately after power-up.
The company has worked with WITINMEM to incorporate its memBrain analog in-memory computing solution, based on SuperFlash technology, into WITINMEM's ultra-low-power SoC. The SoC features computing-in-memory technology for neural network processing, including speech recognition, deep speech noise reduction, voice-print recognition, scene detection, and health status monitoring. WITINMEM, in turn, is working with multiple customers to bring products to market through 2022 based on this SoC.
“WITINMEM is breaking new ground with Microchip’s memBrain solution for addressing the compute-intensive requirements of real-time AI speech at the network edge based on advanced neural network models,” said Shaodi Wang, CEO of WITINMEM. “We were the first to develop a computing-in-memory chip for audio in 2019, and now we have achieved another milestone with volume production of this technology in our ultra-low-power neural processing SoC that streamlines and improves speech processing performance in intelligent voice and health products.”
“We are excited to have WITINMEM as our lead customer and applaud the company for entering the expanding AI edge processing market with a superior product using our technology,” said Mark Reiten, vice president of the license division at SST. “The WITINMEM SoC showcases the value of using memBrain technology to create a single-chip solution based on a computing-in-memory neural processor that eliminates the problems of traditional processors that use digital DSP and SRAM/DRAM-based approaches for storing and executing machine learning models.”
Microchip’s memBrain neuromorphic memory product is optimised to perform vector-matrix multiplication (VMM) for neural networks. It allows battery-powered and deeply embedded edge processors to deliver the highest possible AI inference performance per watt. This is achieved by storing the neural model weights as values in the memory array and using the memory array itself as the neural compute element. The result is 10 to 20 times lower power consumption than alternative approaches and a lower overall processor BOM cost, because external DRAM and NOR flash are not needed.
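The VMM operation described above is the core workload of a neural network layer. As a rough, hypothetical illustration of the math being offloaded (not Microchip's implementation: in a memBrain-style array the multiply-accumulate would happen physically, with weights stored as flash-cell conductances and currents summed on bit lines, whereas here it is simply modelled in software):

```python
# Minimal sketch of the vector-matrix multiplication (VMM) that a
# computing-in-memory array accelerates. Purely illustrative: in the real
# device each weight is a stored cell value and the accumulation is analog;
# here we just compute the equivalent arithmetic.

def vmm(weights, inputs):
    """Multiply an input vector by a weight matrix (one neural layer)."""
    return [sum(w * x for w, x in zip(row, inputs)) for row in weights]

# Toy 2x3 weight matrix ("stored in the memory array") and a 3-element
# input vector ("applied to the array's inputs").
weights = [[0.5, -1.0, 2.0],
           [1.5,  0.0, -0.5]]
inputs = [1.0, 2.0, 3.0]

outputs = vmm(weights, inputs)
print(outputs)  # [4.5, 0.0]
```

Because the weights never leave the array, no per-inference weight traffic to external DRAM or NOR flash is required, which is where the power and BOM savings come from.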
Permanently storing neural models inside the solution's processing element also supports instant-on functionality for real-time neural network processing. WITINMEM has leveraged SuperFlash technology's nonvolatile floating-gate cells to power down its computing-in-memory macros during the idle state, further reducing leakage power in demanding IoT use cases.