AI accelerator chip with embedded MRAM is production ready
The Lightspeeur 2802M AI accelerator was delivered by foundry partner TSMC in June 2018, the company says, and it has been preparing to discuss the new offering with customers interested in designing solutions for IoT endpoints, the cloud, autonomous vehicles, and many other use cases. The 2802M is claimed to be the first AI accelerator to deliver the benefits of MRAM, such as non-volatility and low power, along with significant advancements specific to edge AI.
The 2802M includes 40MB of memory, which can support large AI models or multiple AI models within a single chip. Examples of models that could be supported within the same chip, says the company, are image classification, voice identification, voice commands, facial recognition, pattern recognition, and many others.
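As a rough illustration of what 40MB of on-chip memory means in practice, the back-of-envelope sketch below estimates how many copies of some well-known networks would fit at 8-bit quantization. The parameter counts are published approximations for the reference networks, not figures from the company, and the one-byte-per-weight assumption is mine:

```python
# Back-of-envelope sizing sketch. Assumptions (mine, not the vendor's):
# 8-bit quantization, so ~1 byte per weight, meaning
# MB of storage ~= millions of parameters.

CHIP_MEMORY_MB = 40  # on-chip memory capacity stated for the 2802M

# Approximate published parameter counts, in millions.
models_mparams = {
    "MobileNetV1": 4.2,
    "ResNet-18": 11.7,
    "VGG-16": 138.0,
}

for name, mparams in models_mparams.items():
    size_mb = mparams  # 1 byte/weight at 8 bits
    copies = int(CHIP_MEMORY_MB // size_mb)
    print(f"{name}: ~{size_mb:.1f} MB quantized; "
          f"{copies} copy(ies) fit in {CHIP_MEMORY_MB} MB")
```

On these assumptions, several compact networks like MobileNet could share the chip, while a large network like unquantized-scale VGG-16 would not fit, which is consistent with the company's emphasis on either one large model or multiple smaller ones.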
“Our proprietary and patented MRAM engine,” says GTI Co-Founder and SVP of Engineering, Terry Torng, “allowed us to bring this first of our company’s many planned MRAM-enabled AI Accelerators to market, to open up new use cases and drive the success of our customers’ product designs.”
The 2802M, says the company, is the first in a portfolio of MRAM chips in its roadmap, and it is looking forward to working on large-scale opportunities with customers to develop specific “super AI accelerator chips” that combine the new MRAM Engine with its other patented innovations such as its Matrix Processing Engine (MPE) and AI Processing in Memory (APiM) technologies.
The MPE runs convolutional neural networks (CNNs) on top of the APiM architecture, pairing logic with memory much as the human brain does. This optimizes processing speed, achieving high TOPS performance, while also saving substantial power by avoiding the movement of data to and from discrete memory components.
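The workload being accelerated here is the multiply-accumulate (MAC) loop at the heart of every CNN layer. The pure-Python sketch below shows that loop for a tiny 2D convolution; in a conventional design, every multiply in the inner loop implies fetching a weight from a separate memory, which is exactly the traffic a processing-in-memory engine eliminates by computing next to the stored weights. The sizes and kernel are arbitrary illustrative choices:

```python
# Illustrative only: the multiply-accumulate (MAC) work of a CNN layer.
# Each inner-loop multiply touches one stored weight; an in-memory
# engine keeps those weights in place rather than fetching them
# from discrete memory on every operation.

def conv2d(image, kernel):
    """Valid-mode 2D convolution (cross-correlation) on nested lists."""
    kh, kw = len(kernel), len(kernel[0])
    oh = len(image) - kh + 1
    ow = len(image[0]) - kw + 1
    out = [[0] * ow for _ in range(oh)]
    for i in range(oh):
        for j in range(ow):
            acc = 0  # one MAC chain per output pixel
            for u in range(kh):
                for v in range(kw):
                    acc += image[i + u][j + v] * kernel[u][v]
            out[i][j] = acc
    return out

image = [
    [1, 2, 3],
    [4, 5, 6],
    [7, 8, 9],
]
edge = [[1, -1], [1, -1]]  # simple horizontal-edge kernel
print(conv2d(image, edge))  # -> [[-2, -2], [-2, -2]]
```

Even this toy example performs four multiplies per output value; at the scale of real networks, keeping those weights resident in non-volatile on-chip memory is where the claimed power savings come from.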
The MRAM line of products will leverage all of the tools and technical documentation created for the AI accelerator chips made available earlier this year, such as the SDK and the recently announced PLAI Builder, which simplifies AI model creation for development teams less familiar with developing artificial intelligence solutions. AI models created with Caffe, TensorFlow, and other tools can be supported, enabling a wide variety of neural networks, including VGG, MobileNet, ResNet, SSD, and others, as well as models trained on ImageNet.