The reference design integrates the Blaize Pathfinder P1600 SoM (system on a module) for embedded edge AI applications with the eYs3D Depth Camera line of 3D cameras. The camera line features stereo vision capabilities that deliver millimeter-level depth accuracy at optimal range, and it targets indoor and outdoor AI-based autonomous operation, including robotics, security, touchless control, autonomous vehicles, and smart retail.
These capabilities, the companies say, eliminate the need for costly lidar implementations in robotics and other applications.
"The Blaize and eYs3D collaboration," says Rajesh Anantharaman, Sr. Director Products, Blaize, "showcases the power of the Blaize fully programmable GSP [Graph Streaming Processor] architecture and software productivity suite to process both RGB camera data and depth data in a highly efficient manner to provide both high accuracy and high performance for 3D sensor fusion applications at the edge. The Blaize and eYs3D integration enables faster time-to-market for systems incorporating visual simultaneous location and mapping (VSLAM), facial feature depth recognition, and gesture-based commands."
Taking advantage of the Blaize Graph Streaming Processor (GSP) architecture, the combined design is claimed to offer better depth and distance sensing via the camera's 3D sensor application, which includes a sensor fusion function that combines two data streams: RGB camera data and depth data. Processing efficiency and user-friendly programmability across the entire end-to-end application are designed to improve system-level performance-per-watt metrics.
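The RGB-plus-depth fusion described above typically starts by back-projecting each depth pixel through the camera's pinhole intrinsics and attaching the matching RGB color, producing a colored 3D point cloud. The sketch below is a minimal illustration of that principle using NumPy; the function name, the toy intrinsics (`fx`, `fy`, `cx`, `cy`), and the sample data are assumptions for demonstration, not part of the Blaize or eYs3D SDKs.

```python
import numpy as np

def depth_rgb_to_point_cloud(depth, rgb, fx, fy, cx, cy):
    """Back-project a depth map through assumed pinhole intrinsics and
    attach per-pixel RGB colors, yielding an N x 6 colored point cloud
    (x, y, z, r, g, b). Pixels with zero depth are dropped."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth.astype(np.float64)
    x = (u - cx) * z / fx  # pinhole back-projection
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    colors = rgb.reshape(-1, 3).astype(np.float64)
    valid = points[:, 2] > 0  # keep only pixels with a depth reading
    return np.hstack([points[valid], colors[valid]])

# Toy 2x2 depth map (meters) and matching RGB image
depth = np.array([[1.0, 0.0],
                  [2.0, 1.5]])
rgb = np.zeros((2, 2, 3), dtype=np.uint8)
cloud = depth_rgb_to_point_cloud(depth, rgb, fx=500, fy=500, cx=1, cy=1)
print(cloud.shape)  # one row per valid depth pixel: (3, 6)
```

In a real pipeline the depth map would also be registered into the RGB camera's frame before fusion, since the two sensors have different viewpoints; that alignment step is omitted here for brevity.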
In addition, the Blaize P1600 can convert the depth camera's USB output to high-speed Ethernet connectivity for enhanced video processing. Software development kits for the reference design will accommodate a wide range of operating systems, programming languages, and development tools.
“We are excited to partner with Blaize to bring advanced computer vision capability to market, such as filtering, depth-sensing fusion, real-time 3D point cloud compression and streaming, that further enhance edge AI capability," says James Wang, eYs3D's Chief Strategy Officer. "Depth-sensing technology has been