Traditionally, separate chips process and store data, which costs extra energy as the two exchange data across a "memory wall." This complicates efforts to build AI-capable chips for battery-operated mobile devices.
"Transactions between processors and memory," says computer scientist Subhasish Mitra, senior author of a new study on the research, "can consume 95 percent of the energy needed to do machine learning and AI, and that severely limits battery life."
Now, say the researchers, they have developed new algorithms that combine several energy-efficient hybrid chips to create the illusion of one mega–AI chip. The system can run AI tasks faster, and with less energy, by harnessing eight hybrid chips, each with its own data processor built right next to its own memory storage.
The work builds on the researchers' prior development of a new memory technology, called resistive random-access memory (RRAM), that stores data even when power is switched off, like flash memory, but faster and more energy-efficiently. The RRAM advance enabled the researchers to develop an earlier generation of hybrid chips that worked alone. Their latest design incorporates a critical new element: algorithms that meld the eight separate hybrid chips into one energy-efficient AI-processing engine.
"If we could have built one massive, conventional chip with all the processing and memory needed, we’d have done so, but the amount of data it takes to solve AI problems makes that a dream," says Mitra. "Instead, we trick the hybrids into thinking they're one chip, which is why we call this the Illusion System."
The researchers built and tested a prototype with help from collaborators at the French research institute CEA-Leti and at Nanyang Technological University in Singapore. The eight-chip system, say the researchers, is just the beginning. Simulations showed that systems with 64 hybrid chips could run AI applications seven times faster than current processors, using one-seventh as much energy.