
If you’ve driven around the Metro Phoenix area of the United States recently, you may have shared the streets and freeways with a driverless vehicle. Jaguar’s I-PACE forms the base vehicle for Waymo’s premium electric self-driving vehicles which, even without the exterior branding, are recognisable by the ‘ski-box’ sized adornment on the roof. The availability of this and other self-driving services responds to growing demand for car sharing and declining interest in car ownership, especially among those whose driving is limited to local commutes and grocery shopping. It also aims to provide a cost-effective transport solution for the 3 million Americans aged 40 and over who are blind or have low vision, as well as for senior citizens living in car-dependent communities [Reference 1].
Advanced Driver-Assistance Systems (ADAS) have become a standard feature of many modern vehicles, helping the driver handle certain traffic situations. Many automotive OEMs and their suppliers are currently focusing on Level 3 automation, where ‘hands-off’ and ‘eyes-off’ operation is possible but the driver must still intervene under certain conditions. Market disrupters looking to offer driverless car-sharing services, such as Waymo, are already playing in the Level 4 space.
Reviewing the constituent parts of automation
When broken down into its constituent parts, the autonomous driving platform is essentially made up of three blocks: sensing, computation, and actuation. All current autonomous driving solutions combine a significant number of different sensing technologies to perceive the environment around the vehicle. These include short-range sensing such as ultrasound and short-range radar, as well as long-range radar, LiDAR, and 360° cameras.
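To make the three-block split concrete, the sketch below runs one pass through a simplified sense → compute → actuate loop in Python. The type names, thresholds, and placeholder sensor values are illustrative assumptions for this sketch only, not taken from any particular platform; a production system would involve calibrated sensor fusion, trajectory planning, and safety monitoring far beyond what is shown here.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical reading from one of the sensor modalities named above
# (ultrasound, short/long-range radar, LiDAR, 360-degree camera).
@dataclass
class SensorReading:
    modality: str        # e.g. "lidar", "long_range_radar"
    range_m: float       # distance to the nearest detected object, in metres
    bearing_deg: float   # direction of that object relative to the vehicle

@dataclass
class ActuationCommand:
    throttle: float      # 0.0 (coast) to 1.0 (full)
    brake: float         # 0.0 to 1.0
    steering_deg: float  # positive = steer right

def sense() -> List[SensorReading]:
    """Sensing block: collect one snapshot from every fitted sensor."""
    # Placeholder values; a real platform would query hardware drivers here.
    return [
        SensorReading("ultrasound", 4.0, 0.0),
        SensorReading("long_range_radar", 120.0, -2.0),
        SensorReading("lidar", 35.0, 1.5),
    ]

def compute(readings: List[SensorReading]) -> ActuationCommand:
    """Computation block: fuse the readings and decide on a manoeuvre."""
    nearest = min(readings, key=lambda r: r.range_m)
    if nearest.range_m < 5.0:  # obstacle close by: brake (illustrative threshold)
        return ActuationCommand(throttle=0.0, brake=0.8, steering_deg=0.0)
    return ActuationCommand(throttle=0.3, brake=0.0, steering_deg=0.0)

def actuate(cmd: ActuationCommand) -> None:
    """Actuation block: hand the command to throttle, brake and steering."""
    print(f"throttle={cmd.throttle:.1f} brake={cmd.brake:.1f} "
          f"steer={cmd.steering_deg:.1f} deg")

# One iteration of the sense -> compute -> actuate loop
actuate(compute(sense()))
```

Running the snippet prints a single actuation command, which illustrates the role of the computation block as the mediator between raw sensing data and the vehicle’s actuators.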