Error recovery strategies in conversational automotive Human Machine Interfaces
Conversational user interfaces (CUIs) in vehicles offer promising advantages over conventional interfaces: hands-free, eyes-on-the-road interaction improves safety while driving. However, currently available technology faces challenges such as non-intuitive, command-based speech interfaces and poor speech recognition. This research investigates the acceptability and user experience of speech-based systems for in-vehicle interaction, with the aim of developing a more intuitive, naturalistic dialog structure between drivers and their cars. The central idea is to apply human error recovery strategies to resolve the problems of misrecognition and misunderstanding. The study uses an interactive prototype developed with Amazon’s Alexa Voice Service and IBM’s Watson cognitive computing services.