ADAS & Autonomous Driving at CES 2015
The International CES is the most important show in the consumer electronics field. It has grown consistently year over year, not only in physical size but also in terms of interest areas - and automotive is the biggest part of that change. What was once a small part of the Las Vegas Convention Center housing aftermarket radios and booming speakers is now the premier event for automotive innovation.
That innovation has largely resided in one of two areas. First, the connected car, both inside and out: connectivity between the brought-in and the built-in, and connectivity between the vehicle and the rest of the increasingly connected world. Second, the intelligent, cooperative and autonomous car: intelligence to become aware of its surroundings, to communicate and cooperate with the driver, and to gradually improve the safety, convenience and efficiency of the driving activity.
Where these two areas meet is the necessarily complex human machine interface (HMI) and by extension the entire in-vehicle experience itself. And it's that in-vehicle experience that was a major focus at the 2015 International CES.
In The Background And The Foreground
One of the most intriguing introductions at CES was the Tegra X1 from NVIDIA. While CES is accustomed to seeing NVIDIA in computers and mobile devices, the company was laser-focused on automotive with the TX1. The "mobile superchip" easily bests the Tegra K1 from last year's CES - used by the likes of Audi as the computing platform for their semi-automated piloted driving technology - but the more impressive news was the set of tools accompanying the TX1 that will help it power both the increasingly graphics-intensive HMI and automated driving sensors and systems.
The NVIDIA Drive CX platform includes a single TX1 and the DRIVE Studio development platform; it will allow an incredible range of visualizations on a number of in-vehicle displays, with textures so detailed they come dangerously close to driver distraction.
The NVIDIA Drive PX platform includes dual TX1 chips on a board supporting 12 camera inputs to power machine vision processing that supports auto-pilot applications in automotive's semi-autonomous near-term future. Sensor fusion, environment modeling, situational awareness and path finding are all enabled by the Drive PX platform, but perhaps the most impressive feature is machine learning.
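To make the sensor fusion idea concrete, here is a minimal, purely illustrative sketch of one of its simplest building blocks: combining two noisy range estimates (say, from a camera and a radar) by inverse-variance weighting, so the more reliable sensor counts for more. The numbers are invented; this says nothing about how Drive PX actually implements fusion.

```python
# Hypothetical sensor-fusion building block: inverse-variance weighting
# of two independent estimates of the same quantity. The more certain
# sensor (smaller variance) dominates the fused result.

def fuse(est_a, var_a, est_b, var_b):
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)          # fused estimate is more certain than either input
    return fused, fused_var

# Camera says 41 m (less certain), radar says 39.5 m (more certain).
dist, var = fuse(41.0, 4.0, 39.5, 1.0)
print(f"fused distance: {dist:.2f} m (variance {var:.2f})")
```

Note that the fused variance is always smaller than either input variance, which is the point of combining redundant sensors in the first place.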
Deep neural networks are the enabler here, alongside the massive computing power of the dual TX1 chips. At its simplest, a deep neural network enables machine learning that is exactly what it sounds like: detecting features in massive amounts of unlabeled data through iterative processing. It is far from simple technology, but giving machine vision the opportunity to improve itself - even at the cost of power draw and valuable electronic real estate - holds great promise for allowing ADAS and automated and autonomous driving systems to adapt to their environment and hopefully reduce the need for uniform infrastructure.
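The idea of learning features from unlabeled data through iteration can be sketched at toy scale. The following is a minimal, assumed example - a tiny linear autoencoder trained by gradient descent on synthetic data that secretly lies on a low-dimensional subspace - not anything resembling the networks running on the TX1:

```python
import numpy as np

# Toy sketch of unsupervised feature learning: a linear autoencoder
# discovers, by iterative gradient descent alone, that 8-dimensional
# "sensor" data really lives on a 2-dimensional subspace. All data and
# dimensions are invented for illustration.

rng = np.random.default_rng(0)

# Unlabeled data: 200 samples of 8 features on a hidden 2-d subspace, plus noise.
basis = rng.normal(size=(2, 8))
codes = rng.normal(size=(200, 2))
X = codes @ basis + 0.01 * rng.normal(size=(200, 8))

# Autoencoder weights: compress 8 -> 2 -> 8.
W_enc = rng.normal(scale=0.1, size=(8, 2))
W_dec = rng.normal(scale=0.1, size=(2, 8))

lr = 0.01
for _ in range(2000):                      # iterative processing
    H = X @ W_enc                          # learned 2-d features
    X_hat = H @ W_dec                      # reconstruction from features
    err = X_hat - X
    # Gradients of mean squared reconstruction error.
    g_dec = H.T @ err / len(X)
    g_enc = X.T @ (err @ W_dec.T) / len(X)
    W_dec -= lr * g_dec
    W_enc -= lr * g_enc

mse = float(np.mean((X @ W_enc @ W_dec - X) ** 2))
print(f"reconstruction MSE: {mse:.4f}")
```

No labels are involved anywhere: the network improves purely by repeatedly comparing its reconstruction to the raw data, which is the essence of the self-improvement the paragraph describes, scaled down enormously.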
Simplifying The HMI
While CES saw a lot of activity surrounding the HMI for infotainment purposes - gesture recognition, eye tracking and predictive, adaptive, contextual interfaces for example - the topic of HMI in ADAS and automated driving faces slightly different obstacles. If the vehicle itself is controlling the driving task and the driver can disengage mentally (to whatever extent), the HMI must call the driver back to attention, inform them about the current situation, suggest an appropriate action and then rely upon the driver to execute. In time-sensitive, safety-critical situations - precisely when the driver is most likely to be called back into the loop - the HMI has a massively complex task on its hands.
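The escalation the paragraph describes - call the driver back to attention, inform, suggest, then rely on the driver - can be sketched as a simple sequence with a safety fallback. The step names, timings, and fallback behavior below are all invented for illustration; no production takeover HMI works exactly this way:

```python
# Hypothetical takeover-request sequence: the HMI escalates through
# attention, information, and suggestion; if the driver never responds,
# it falls back to a minimum-risk stop. Timings and actions are invented.

def takeover_sequence(driver_responds_after_s):
    steps = [
        (0, "alert: chime + seat vibration"),          # call driver back to attention
        (2, "inform: show hazard on cluster display"), # explain the situation
        (4, "suggest: 'brake and steer right'"),       # propose an action
    ]
    log = []
    for t, action in steps:
        log.append(action)
        if driver_responds_after_s is not None and driver_responds_after_s <= t:
            log.append("driver takes control")
            return log
    log.append("no response: minimum-risk stop")       # safety fallback
    return log

result = takeover_sequence(driver_responds_after_s=3)
print(result)
```

The interesting design tension is visible even in this sketch: the more time-critical the situation, the fewer escalation steps the HMI can afford before the fallback must engage.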
The industry is addressing that and related issues in many ways. On one side, distilling complex information to a digestible and intuitive form is exactly what infotainment-centric HMIs must do. Many companies are tackling both infotainment and safety HMIs in similar ways; Valeo's Moebius digital interface helps manage information in both automated and driver-controlled modes, but always keeps the situational awareness interface visible so the driver can stay current on the vehicle's surroundings.
On the other side of the equation, many automotive suppliers are aiming to simplify the HMI by automating secondary functions that are lower priority than the driving task itself. That information is useful for effective or efficient driving, but its complexity means conveying it to the driver is not a simple task.
Examples of this simplification of the HMI include location-based driving tips to improve efficiency. HERE provides detailed map databases that allow its partners, such as Continental and Harman, to generate an electronic horizon that prepares the vehicle for upcoming events. That horizon may include live traffic information or topographical and road infrastructure features, for example, and it could take into account the effective driving range, as in the BMW i3 proof of concept. While the concept (and implementation) of the electronic horizon is not new, it underlines the ability of the vehicle to automate secondary tasks and allow the driver and the HMI to focus on the task at hand.
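The core of an electronic horizon can be sketched in a few lines: given map attributes keyed by distance along the route ahead, surface only the events inside a look-ahead window so the vehicle can prepare (coast before a hill, slow before a curve). The event names and distances below are invented; real systems built on map data such as HERE's are far richer.

```python
# Hypothetical electronic-horizon sketch: filter road attributes to those
# within a look-ahead window of the vehicle's current position.

def electronic_horizon(events, position_m, lookahead_m):
    """Return upcoming (distance_ahead_m, attribute) pairs inside the window."""
    upcoming = []
    for event_pos, attribute in events:
        ahead = event_pos - position_m
        if 0 <= ahead <= lookahead_m:
            upcoming.append((ahead, attribute))
    return sorted(upcoming)

# Invented road attributes keyed by distance along the route (metres).
route = [
    (500,  "speed limit drops to 50 km/h"),
    (1200, "6% uphill grade begins"),
    (2600, "sharp curve"),
    (4800, "traffic jam reported"),
]

horizon = electronic_horizon(route, position_m=400, lookahead_m=2000)
for ahead, attribute in horizon:
    print(f"in {ahead} m: {attribute}")
```

With a 2,000 m window from the 400 m mark, only the speed-limit change and the grade are surfaced; the curve and the jam stay below the horizon until the vehicle gets closer, which is exactly how the technique keeps the driver and HMI focused on what is imminent.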
HMI Inside & Out
CES provided a provocative example of the next generation of the automotive HMI when Daimler reimagined the autonomous automobile, designing the F015 Luxury In Motion Concept from the ground up and giving thought to how autonomous mobility will change society - and certainly the in-vehicle experience.
The F015 includes new interior HMI technologies such as gesture control and expansive personal displays for each user, but the most interesting human machine interfaces are outside the vehicle rather than inside. The F015 suggests solutions for one of the oft-asked questions about autonomous vehicles: how can a robot replicate the acknowledgement and non-verbal communication that frequently occur between driver and pedestrian?
When the F015 detects a pedestrian whose path crosses its own, it projects a laser crosswalk onto the road, intuitively telling the pedestrian that the vehicle sees them and intends to yield the right of way. To replace the driver's customary head nod, the F015 then scrolls the crosswalk to indicate the pedestrian can cross the vehicle's path.
Front-facing LEDs arranged in a wide field across the front of the vehicle indicate whether the vehicle is driven manually (white) or autonomously (blue), but they also track movement by following the pedestrian; the red LED field on the rear bumper replicates that tracking to pass the information on to other vehicles or drivers as well.
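The external signalling described above maps neatly onto a small state table: LED colour reflects the driving mode, and the projected crosswalk (static, then scrolling) tells a detected pedestrian first "I see you" and then "you may cross". The function below is a hypothetical reconstruction of that logic from the description alone; Daimler has not published the F015's actual control logic.

```python
# Hypothetical sketch of F015-style external signalling, reconstructed
# from the behavior described in the text. State names are invented.

def external_signals(autonomous, pedestrian_detected, yielding):
    signals = {
        "led_colour": "blue" if autonomous else "white",  # driving-mode indicator
        "project_crosswalk": False,
        "scroll_crosswalk": False,
    }
    if pedestrian_detected:
        signals["project_crosswalk"] = True     # acknowledgement: "I see you"
        if yielding:
            signals["scroll_crosswalk"] = True  # invitation: "you may cross"
    return signals

s = external_signals(autonomous=True, pedestrian_detected=True, yielding=True)
print(s)
```

The design point worth noting is that each output is redundant and unambiguous: mode, acknowledgement, and invitation are separate channels, so a pedestrian never has to infer intent from a single signal.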
Conclusion
The HMI is a complex set of technologies whose purpose is to convey a wealth of information without becoming a distraction. That is a monumental task, and it only grows more complex as the driver is allowed to disengage from the act of driving and cede control to the vehicle. Even under the most ideal circumstances, there is no getting around the fact that the HMI is among the most critical technology points in the car of the future, and it is as deserving of attention as the other technologies that enable automated and autonomous driving.
CES serves as the show of innovation for automotive more than any other industry event, and there are many innovative proposals to address some very complex issues facing the industry now and in the coming years.
By Jeremy Carlson, Senior Analyst, IHS Automotive
Posted January 22, 2015