Light-powered AI chip moves closer to market

February 07, 2018 // By Peter Clarke
Lightmatter Inc. (Boston, MA), a company developing a photonic processor for neural network applications, has received $11 million in Series A funds.

The company was founded in 2017 by MIT researchers Nicholas Harris, Darius Bunandar, and Thomas Graham. The photonic processor technology that underlies Lightmatter was developed over four years at the MIT Research Laboratory of Electronics.

The round was co-led by Matrix Partners and Spark Capital. Stan Reiss from Matrix and Santo Politi from Spark have joined Lightmatter's board of directors.

The Lightmatter group won a $100,000 prize at MIT in 2017 for developing fully optical chips that compute using light, which means they can work faster and use less energy than electronic circuits.

Although simple digital operations implemented with analog optical computing were demonstrated many years ago, the complexity of digital architectures, the incomplete set of photonic equivalents to electronic circuits, and the need to move out of and back into the optical domain have held back the use of photonic computing.

Lightmatter has homed in on the artificial intelligence (AI) domain, where digital operations are more uniform than in general-purpose computing, and produced a silicon chip that uses light signals, rather than electrical signals, for matrix multiplication. The system uses heated silicon channels between Mach-Zehnder interferometers to slow down, to varying degrees, optical signals that represent weights. As the signals pass through a cascade of interferometers, the inputs are multiplied by the weight matrix to produce the required outputs.
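The idea of multiplying by cascading interferometers can be sketched numerically. The snippet below is a hypothetical illustration using the textbook transfer-matrix model of a Mach-Zehnder interferometer (two 50:50 couplers around a thermally tuned phase shifter); the function names and phase conventions are ours, not Lightmatter's, and a real mesh would use many more stages.

```python
import numpy as np

def mzi(theta, phi):
    """Transfer matrix of one Mach-Zehnder interferometer.

    theta: internal phase shift, set e.g. by heating a silicon
           waveguide to slow the light in one arm.
    phi:   external phase shifter on one input port.
    """
    # 50:50 directional coupler (beam splitter)
    bs = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)
    inner = np.diag([np.exp(1j * theta), 1.0])
    outer = np.diag([np.exp(1j * phi), 1.0])
    return bs @ inner @ bs @ outer

# A cascade of interferometer stages composes into one larger
# unitary transform -- the "weight matrix" of the optical layer.
U = mzi(0.7, 0.3) @ mzi(1.1, 2.0)

# Unitarity check: U @ U^H = I, so optical power is conserved.
assert np.allclose(U @ U.conj().T, np.eye(2))

# Matrix-multiplying a vector of input optical amplitudes:
x = np.array([1.0, 0.5j])
y = U @ x
```

Because each stage is unitary, the cascade is as well; in practice an arbitrary (non-unitary) weight matrix is obtained by combining such meshes with attenuation, which is why interferometer meshes suffice for neural-network matrix products.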

In 2017 the team reported in Nature Photonics on a silicon photonic chip containing 56 Mach-Zehnder interferometers forming a matrix of controllable waveguides, which was used to implement a neural network that recognizes four basic vowel sounds.

The system achieved 77 percent accuracy, compared with about 90 percent for electronic systems, but with the prospect of scaling up to outperform them.

Such energy-efficient photonic acceleration is particularly beneficial for neural network architectures where training is done using large datasets – and requires a lot of time and energy.

"For decades, electronic computers have been at the foundation of

