3D depth sensor adds spatial awareness to any device

December 04, 2018 // By Rich Pell
Mobile computer vision startup Occipital (Boulder, CO) has announced the launch of a new sensor that "makes it easy to add spatial awareness to a new generation of advanced products."

The new sensor - called Structure Core - is designed for AR/VR simultaneous localization and mapping (SLAM), robot vision, and other embedded applications "where great depth performance matters." The sensor supports multiple platforms, including Linux and Windows, and features a self-contained design with an onboard IMU and a visible-light camera.
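In practice, a self-contained depth sensor of this kind exposes synchronized depth frames and IMU samples that an application feeds into a SLAM or robot-vision pipeline. The sketch below is illustrative only: `DepthFrame`, `ImuSample`, `MockSensor`, and `feed_slam` are assumed placeholder names, not part of Occipital's SDK, and the device is mocked so the script runs without hardware.

```python
import numpy as np
from dataclasses import dataclass

# Hypothetical containers for sensor output; the real Structure Core SDK
# defines its own types, which are not documented in this article.
@dataclass
class DepthFrame:
    timestamp: float
    depth_m: np.ndarray   # HxW depth image in meters

@dataclass
class ImuSample:
    timestamp: float
    accel: np.ndarray     # 3-axis accelerometer, m/s^2
    gyro: np.ndarray      # 3-axis gyroscope, rad/s

class MockSensor:
    """Stands in for a Structure Core device so the sketch runs anywhere."""
    def read(self):
        t = 0.0
        frame = DepthFrame(t, np.full((480, 640), 2.0, dtype=np.float32))
        imu = ImuSample(t, np.zeros(3), np.zeros(3))
        return frame, imu

def feed_slam(frame: DepthFrame, imu: ImuSample) -> None:
    # A real SLAM front end would integrate the IMU between depth frames
    # and use the depth image for tracking and mapping; here we just log.
    print(f"depth {frame.depth_m.shape} @ {frame.timestamp:.3f}s, "
          f"gyro {imu.gyro}, accel {imu.accel}")

if __name__ == "__main__":
    sensor = MockSensor()
    frame, imu = sensor.read()
    feed_slam(frame, imu)
```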

Structure Core, says the company, works perfectly with devices that its previous product, Structure Sensor - which adds 3D scanning to mobile devices using their cameras and inertial sensors - was never specifically designed for, such as robots, drones, and AR/VR headsets. Such products are expected to become commonplace within the next few years, with spatial awareness becoming a fundamental characteristic of all of them.

A key feature of the new design is a high-contrast, eye-safe laser projector, which enables the device to achieve best-in-class range while significantly reducing z-height. Structure Core also features dual global-shutter infrared (IR) cameras and a choice of either a 165° wide-vision visible-light camera or an 85° color camera. It can sense 3D detail and depth up to around 5 meters (about 16 feet) away via IR projection, and farther with its other cameras.
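To give a sense of how a depth stream with the quoted ~5-meter range is typically consumed, the sketch below unprojects a depth image into a 3D point cloud using a standard pinhole camera model. The intrinsics (`FX`, `FY`, `CX`, `CY`) and the VGA resolution are assumed placeholder values, not Occipital's published calibration.

```python
import numpy as np

# Illustrative pinhole intrinsics for a VGA-resolution depth stream.
# Placeholder values, not Occipital's published calibration.
FX, FY = 570.0, 570.0      # focal lengths in pixels
CX, CY = 320.0, 240.0      # principal point
MAX_RANGE_M = 5.0          # approximate IR-projection depth limit cited above

def depth_to_point_cloud(depth_m: np.ndarray) -> np.ndarray:
    """Unproject an HxW depth image (meters) into an Nx3 point cloud."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    valid = (depth_m > 0) & (depth_m <= MAX_RANGE_M)
    z = depth_m[valid]
    x = (u[valid] - CX) * z / FX
    y = (v[valid] - CY) * z / FY
    return np.column_stack((x, y, z))

if __name__ == "__main__":
    # Synthetic depth image standing in for a real frame from the sensor.
    fake_depth = np.full((480, 640), 2.5, dtype=np.float32)
    cloud = depth_to_point_cloud(fake_depth)
    print(cloud.shape)  # (307200, 3)
```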

The Structure Core sensor - offered both as a self-enclosed version and as a module designed to be embedded - is expected to be generally available in March 2019 for $399. It can be preordered now at $599 for delivery within two weeks, or at $499 for delivery in January.

Occipital

