Embedded Technology
Speech processing challenges at the edge solved using Microchip’s analog embedded SuperFlash technology
Computing-in-memory technology is poised to eliminate the massive data communications bottlenecks otherwise associated with performing artificial intelligence (AI) speech processing at the network’s edge, but it requires an embedded memory solution that simultaneously performs neural network computation and stores weights. Microchip Technology, via its Silicon Storage Technology (SST) subsidiary, says that its SuperFlash memBrain neuromorphic memory solution has solved this problem for the WITINMEM neural processing SoC, the first in volume production to enable sub-mA systems to reduce speech noise and recognize hundreds of command words, in real time and immediately after power-up.
Microchip has worked with WITINMEM to incorporate its memBrain analog in-memory computing solution, based on SuperFlash technology, into WITINMEM’s ultra-low-power SoC. The SoC features computing-in-memory technology for neural network processing, including speech recognition, voice-print recognition, deep speech noise reduction, scene detection, and health status monitoring. WITINMEM, in turn, is working with multiple customers to bring products to market during 2022 based on this SoC.
Microchip’s memBrain neuromorphic memory product is optimized to perform vector-matrix multiplication (VMM) for neural networks. It enables processors used in battery-powered and deeply embedded edge devices to deliver the highest possible AI inference performance per watt. This is accomplished by both storing the neural model weights as values in the memory array and using the memory array as the neural compute element. The result is said to be 10 to 20 times lower power consumption than alternative approaches, along with lower overall processor bill of materials (BOM) cost, because external DRAM and NOR flash are not required.
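For readers unfamiliar with the operation being accelerated, the sketch below shows the digital equivalent of the vector-matrix multiply that such an array performs; the array sizes and weight values are illustrative assumptions, not Microchip’s API, and in the analog array each weighted sum is formed from summed cell currents rather than in a software loop.

/* Minimal VMM sketch in C: y = W * x, the core operation of a neural
 * network layer. Sizes and values are illustrative only. */
#include <stdio.h>

#define IN  4   /* number of inputs  (assumed for illustration) */
#define OUT 3   /* number of outputs (assumed for illustration) */

static void vmm(const float W[OUT][IN], const float x[IN], float y[OUT])
{
    for (int o = 0; o < OUT; o++) {
        y[o] = 0.0f;
        for (int i = 0; i < IN; i++)
            y[o] += W[o][i] * x[i];  /* weights stay resident; only x and y move */
    }
}

int main(void)
{
    const float W[OUT][IN] = {
        { 0.2f, -0.5f,  0.1f,  0.7f },
        { 0.9f,  0.3f, -0.4f,  0.0f },
        {-0.6f,  0.8f,  0.5f, -0.2f },
    };
    const float x[IN] = { 1.0f, 0.5f, -1.0f, 2.0f };
    float y[OUT];

    vmm(W, x, y);
    for (int o = 0; o < OUT; o++)
        printf("y[%d] = %.2f\n", o, y[o]);
    return 0;
}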
Permanently storing neural models inside the memBrain solution’s processing element also supports instant-on functionality for real-time neural network processing. WITINMEM has leveraged the nonvolatility of SuperFlash technology’s floating-gate cells to power down its computing-in-memory macros during the idle state, further reducing leakage power in demanding IoT use cases.
http://www.sst.com/
KIOXIA introduces UFS embedded flash memory devices supporting MIPI M-PHY v5.0
KIOXIA Europe, a specialist in memory solutions, has announced the start of sampling of what it describes as “the industry’s first” Universal Flash Storage (UFS) embedded flash memory devices supporting MIPI M-PHY v5.0. The new line-up utilizes the company’s BiCS FLASH 3D flash memory and is available in three capacities: 128GB, 256GB and 512GB. The new devices deliver high-speed read and write performance and are targeted at a variety of mobile applications, including leading-edge smartphones. The new KIOXIA devices are next-generation UFS (MIPI M-PHY v5.0), which has a theoretical interface speed of up to 23.2Gbps per lane (x2 lanes = 46.4Gbps) in HS-Gear5 mode. Sequential read and write performance of the 256GB device is improved by approximately 90 per cent and 70 per cent, respectively, over previous-generation devices. Also, the random read and write performance of the 256GB device is improved by approximately 35 per cent and 60 per cent, respectively, over previous-generation devices. This next generation of UFS is said to provide significant performance increases, enabling next-generation smartphones and other products to enhance their capabilities and end-user experiences in the 5G era and beyond.
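As a quick check of the lane arithmetic quoted above, the short C sketch below reproduces the HS-Gear5 figures; the conversion to bytes per second is a raw, back-of-envelope number that ignores line coding and protocol overhead.

/* Back-of-envelope check of the HS-Gear5 interface figures quoted above.
 * Theoretical rates only; real-world throughput is lower. */
#include <stdio.h>

int main(void)
{
    const double per_lane_gbps = 23.2;                  /* HS-Gear5 rate per lane */
    const int    lanes         = 2;
    const double total_gbps    = per_lane_gbps * lanes; /* 46.4 Gbps */
    const double raw_gbytes_s  = total_gbps / 8.0;      /* ~5.8 GB/s before overhead */

    printf("Interface rate: %.1f Gbps (about %.1f GB/s raw)\n",
           total_gbps, raw_gbytes_s);
    return 0;
}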
www.kioxia.com/en-emea
sureCore announces technology for in-memory computing
sureCore, the ultra-low-power embedded memory specialist, has announced its new technology for in-memory computing, called CompuRAM. This will enable solutions for computing at the Edge to be more power efficient. At present, sensor data often has to be sent from an IoT device to a server for processing, which creates a connectivity requirement and an unavoidable latency. For time-critical applications this is not acceptable, so there is a drive to do more computation within the device itself, i.e., AI processing at the Edge. Power is a significant design constraint in IoT devices, so any extra AI-related computation must be done in a power-efficient way. sureCore’s existing low-power memory solutions already provide a way to add the significant extra memory needed by AI applications without dramatically increasing power requirements. In-memory computing provides further power savings by reducing the need to move large amounts of data around within a chip, as the initial processing of data is carried out very close to the memory array itself.
Tony Stansfield, sureCore’s CTO, explained: “Our intimate knowledge of memory technology means that we have been able to create a solution for the next technology demand of integrating arithmetic operations within the memory. It is another example of us seeing what the industry will require in the near future and developing a solution that will be ready when the need for AI at the Edge becomes mainstream. Cutting power consumption is what we do, as we have proven with our existing technologies, such as our EverOn and PowerMiser SRAM families that enable near-threshold operation and 50 per cent dynamic power cuts respectively. Our solutions are all designed to make it possible to create products for the next generations of ultra-low power applications that could not exist without their power-reducing techniques.”
In the same way that keeping data in on-chip memory is faster and more power efficient than transporting it back and forth to off-chip memory, integrating memory and compute capability offers even more significant power-saving benefits, according to the company. sureCore’s in-memory compute technology achieves this integration by embedding arithmetic capability deep within the memory array in a way that is compatible with its existing silicon-proven, low-power memory design.
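As a rough illustration of the data-movement argument, the sketch below compares the energy of shipping a buffer off-chip with accessing it next to the memory array; the per-bit energy figures are assumed round numbers chosen for illustration, not sureCore measurements.

/* Illustrative only: per-bit energies are assumed values chosen to show the
 * ratio between off-chip movement and near-memory access, not sureCore data. */
#include <stdio.h>

int main(void)
{
    const double bits           = 1e6;    /* 1 Mbit of sensor data (assumed) */
    const double pj_per_bit_off = 100.0;  /* assumed cost of off-chip transfer, pJ/bit */
    const double pj_per_bit_on  = 1.0;    /* assumed cost of near-memory access, pJ/bit */

    const double e_off = bits * pj_per_bit_off * 1e-12;  /* joules */
    const double e_on  = bits * pj_per_bit_on  * 1e-12;

    printf("Off-chip movement:  %.1e J\n", e_off);
    printf("Near-memory access: %.1e J (%.0fx less)\n", e_on, e_off / e_on);
    return 0;
}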
www.sure-core.com