In a bid to reduce driver workload and offer a holistic human-machine interface (HMI), Continental has incorporated gesture control into the steering wheel. Until now, this technique was limited to infotainment systems.
The core element of steering wheel-based gesture control is a time-of-flight sensor built into the instrument cluster. According to the German company, placing the sensor in this unusual position enables a solution that effectively reduces driver distraction and paves the way for further enhancements toward a holistic HMI.
The time-of-flight sensor detects the motion of the hand and converts it into actions. Drivers can navigate through menus by swiping up and down and confirm a selection with a quick tapping motion. Touch-free operation is also possible for other functions: for instance, by moving the fingers up and down in a uniform motion while keeping both hands on the steering wheel, the driver can accept or reject calls. A gesture is typically a movement associated with a specific property, and with the time-of-flight sensor in the instrument cluster, the system achieves a high gesture-recognition rate. The sensor comprises a 3D camera system with an integrated 3D image sensor, which converts the detected infrared signal into a 3D image. As a result, the driver's hand positions and gestures are captured with millimeter precision and translated into actions.
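The recognition pipeline described above, from a classified gesture to a cockpit action, can be sketched as a simple dispatcher. This is a minimal illustration, not Continental's implementation: the gesture names, action labels, and confidence threshold are all assumptions chosen for the example.

```python
# Hypothetical sketch of a gesture-to-action dispatcher.
# Gesture names, actions, and the threshold are illustrative assumptions,
# not Continental's actual API.

from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class Gesture:
    name: str          # e.g. "swipe_up", as classified from the ToF 3D image
    confidence: float  # classifier confidence in [0, 1]


def make_dispatcher(threshold: float = 0.8) -> Callable[[Gesture], str]:
    """Return a function mapping a recognized gesture to an action label.

    Low-confidence gestures are ignored, mirroring the article's point that
    recognition is restricted to deliberate movements near the wheel so that
    everyday hand motions do not trigger unwanted selections.
    """
    actions: Dict[str, str] = {
        "swipe_up": "menu_previous",
        "swipe_down": "menu_next",
        "tap": "confirm_selection",
        "finger_lift": "accept_call",
    }

    def dispatch(gesture: Gesture) -> str:
        if gesture.confidence < threshold:
            return "no_action"  # below threshold: treat as incidental motion
        return actions.get(gesture.name, "no_action")

    return dispatch
```

In use, `make_dispatcher()` would be fed by the sensor's classifier; a swipe recognized at high confidence maps to a menu action, while an ambiguous movement is dropped.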
With existing solutions, the driver frequently has to take a hand off the wheel or look away from the road; the new solution keeps the action radius focused on the steering wheel itself.
According to Ralf Lenninger, head of Strategy, System Development, and Innovation in Continental's Interior Division, restricting gestures to a clearly defined area on the steering wheel reduces distraction and enhances safety. He says this narrowing down also prevents the driver from accidentally activating gesture-based control with typical everyday hand movements and thereby making unwanted selections.
Currently, the system recognizes four gestures, covering control of the on-board computer, answering calls, starting music, and browsing through apps. Test users particularly welcomed the proximity to the steering wheel, the thumb operation, and the intuitive learnability of the gestures.
Ralf Lenninger explains that the development of a holistic HMI is critical for further reinforcing the driver's confidence in the vehicle. Building up this confidence, combined with an intuitive dialog between driver and vehicle, is yet another crucial step on the road to automated driving, he adds, one that Continental is supporting with gesture-based control on the steering wheel.
The all-new operating concept integrates smoothly into the HMI and can substitute for other elements such as buttons or even touch-sensitive surfaces on the steering wheel. It employs two transparent plastic panels, free of any electronic components, behind the steering wheel, which the driver operates with the thumbs like a touchpad. The driver thus benefits from intuitive operation, while automakers benefit from optimized system costs. The panels' clear design is compatible with almost any control geometry, and new gestures can be added at any time. Moreover, the variable complexity ensures that the system can be integrated into a wide range of vehicle classes, not only the luxury segment.
Hamid Moaref has always been fascinated by cars and the automotive industry. His family has a longstanding association with the industry and has been in the tire business for the past 35 years. Raised in Dubai, Hamid attended Capilano University in Vancouver where he graduated with a BBA in marketing before attending an intensive course in magazine publishing in 2005. He has been the publisher and chief editor of Tires & Parts magazine for the past ten years.
© 2017 Morjan Media LLC. All Rights Reserved.