
Compact beam steering to ‘revolutionize’ AR, autonomous navigation

Technology News | By Rich Pell



The researchers say they are one of the first to demonstrate a low-power, large-scale optical phased array (OPA) at near-infrared wavelengths, and the first to demonstrate on-chip optical phased array technology at blue wavelengths, for autonomous navigation and augmented reality respectively. The researchers also developed an implantable photonic chip, based on an optical switch array operating at blue wavelengths, for precise optogenetic neural stimulation.

“This new technology that enables our chip-based devices to point the beam anywhere we want opens the door wide for transforming a broad range of areas,” says Michal Lipson, Eugene Higgins Professor of Electrical Engineering and Professor of Applied Physics at Columbia Engineering. “These include, for instance, the ability to make LiDAR devices as small as a credit card for a self-driving car, or a neural probe that controls micron-scale beams to stimulate neurons for optogenetics neuroscience research, or a light delivery method to each individual ion in a system for general quantum manipulations and readout.”

The researchers designed a multi-pass platform that reduces the power consumption of an optical phase shifter while maintaining both its operation speed and its broadband low loss, enabling scalable optical systems. The light signal is recycled through the same phase shifter multiple times, so the total power consumption is reduced by the same factor as the number of passes.

They demonstrated a silicon photonic phased array containing 512 actively controlled phase shifters and optical antennas, consuming very low power while performing 2D beam steering over a wide field of view. Their results, say the researchers, are a significant advance towards building scalable phased arrays containing thousands of active elements.

Phased array devices were initially developed at longer electromagnetic wavelengths, such as radio frequencies. By applying a different phase at each antenna, researchers can form a highly directional beam, designing constructive interference in one direction and destructive interference in the others. To steer or turn the beam’s direction, they delay the light in one emitter, shifting its phase relative to another.

Current visible-light applications for OPAs, say the researchers, have been limited to bulky table-top devices whose large pixel width restricts their field of view. Researchers who had previously demonstrated OPAs at near-infrared wavelengths faced fabrication and material challenges in extending similar work to visible wavelengths.

“As the wavelength becomes smaller, the light becomes more sensitive to small changes such as fabrication errors,” says Min Chul Shin, a PhD student in the Lipson group and co-lead author of a paper published in Optics Letters. “It also scatters more, resulting in higher loss if fabrication is not perfect—and fabrication can never be perfect.”

The researchers leveraged previous work on a low-loss silicon photonics platform to realize their new beam-steering system at visible wavelengths: the first chip-scale phased array operating at blue wavelengths, using a silicon nitride platform. A major challenge, say the researchers, was working in the blue range, which has the shortest wavelength in the visible spectrum and therefore scatters more than other colors.

Another challenge in demonstrating a phased array in blue was achieving a wide steering angle, which required placing the emitters half a wavelength apart, or at least closer than a wavelength (40 nm spacing), which was very difficult to achieve. In addition, making an optical phased array useful for practical applications requires many emitters, and scaling the design up to such a large system, they say, would be extremely difficult.

“Not only is this fabrication really hard, but there would also be a lot of optical crosstalk between waveguides that close together,” says Shin. “We couldn’t have independent phase control, and we’d see all the light coupling together rather than forming a directional beam.”

Solving these issues for blue, say the researchers, meant that they could easily do this for red and green, which have longer wavelengths.

“This wavelength range enables us to address new applications such as optogenetic neural stimulation,” says Aseema Mohanty, a postdoctoral research scientist and co-lead author of the papers published in Optics Letters and Nature Biomedical Engineering. “We used the same chip-scale technology to control an array of micron-scale beams to precisely probe neurons within the brain.”

The researchers say they are now collaborating with other researchers in the Applied Physics group to optimize the electrical power consumption because low-power operation is crucial for lightweight head-mounted AR displays and optogenetics.

“We are very excited because we’ve basically designed a reconfigurable lens on a tiny chip on which we can steer the visible beam and change focus,” says Lipson. “We have an aperture where we can synthesize any visible pattern we want every few tens of microseconds. This requires no moving parts and could be achieved at chip-scale. Our new approach means that we’ll be able to revolutionize augmented reality, optogenetics and many more technologies of the future.”

For more, see “Large-scale optical phased array using a low-power multi-pass silicon photonic platform,” “Reconfigurable nanophotonic silicon probes for sub-millisecond deep-brain optical stimulation,” and “Chip-scale blue light phased array.”

Related articles:
Photonics chip uses ‘slow light’ method for beam steering
‘Holy grail’ of LiDAR leverages beam-steering metasurfaces
Nano-optics startup raises $4.3M to miniaturize solid-state lidar

 
