The demonstrator allows immersive experiments and tests to be carried out across numerous aspects of human-machine interaction and, beyond that, of the overall driving experience. It covers a wide range of interaction use cases, from parking and refuelling/recharging to communication and entertainment functions. At the heart of the system is the collection of qualitative information and data on driver and passenger behaviour in front of a smart system that can guide users and offer new services as they move from place to place. The cockpit supports interaction design across the broad range of contexts and situations that development engineers can imagine and that can occur on the road. The client's intelligent driving system is designed accordingly, in an agile manner, and the collected data can be used to run machine learning processes and find patterns for scripting AI skills, depending on the operating system used by the manufacturer.
The multi-sensory on-board demonstrator features two displays for the driver. The first is an environment display, designed to increase safety while driving, minimise the risk of accidents and improve user engagement by presenting information matched to the user's behaviour and tastes. AutonoMIA also includes a cluster display for technical information.
The demonstrator was developed in collaboration with the Italian HMI and infotainment specialist ART, the Finnish software company Siili and the seat manufacturer Aras. In a further step, the UX will also integrate AR technology from WayRay, a Swiss-based multinational technology company.