AI Online

Ai INNOVATION, SINCE 1895

 

Studies show that drivers are often unaware of, uninterested in, or distrustful of the Advanced Driver Assistance Systems (ADAS) built into their vehicles, and are therefore reluctant to pay for additional ADAS features. Onboard dynamic maps, which show live information on road conditions and interact with both the occupants and the vehicle systems, are helping to bridge the gap.

For users to trust the system, they need to gain familiarity and feel comfortable with the car’s next move. This means the car must clearly communicate its capability along the route and its intent, such as when it plans to overtake or change lanes, so that occupants can anticipate the car’s behavior, avoid discomfort such as motion sickness, and always stay informed about upcoming maneuvers.

Automotive Industries (AI) asked Remco Timmer, SVP of Product Management and Head of Automotive Solutions at HERE Technologies, how HERE is helping consumers better understand the value of ADAS functions.

Timmer: Assisted or automated driving encompasses a massive envelope of features in just two words.

Consequently, many people are confused about what ADAS means. It starts very simply with maps carrying speed limit information, followed by traffic updates providing information about delays and safety-critical conditions. Next come features such as cruise control, radar-guided or automated cruise control, lane keeping assist, lane change assist, and hands-free lane changing.

Remco Timmer, SVP of Product Management and Head of Automotive Solutions at HERE Technologies.

These features not only provide more comfort to the driver but also allow automakers to differentiate. They help build trust and understanding as we build up to hands-off and eyes-off driving. This allows users to get accustomed to their vehicle doing the driving, without leaping immediately into full autonomy, which may be too much for many consumers.

For consumers to understand ADAS, it is critical that it becomes part of every journey.

It should not be a feature set that you have to think about activating or deactivating. It should be immediately clear what level of features is available. That is why we present the ADAS capability of the vehicle in the navigation view. This, for example, makes it clear where you can expect to be able to switch to hands-free driving, and makes that capability part of route planning and selection.

As soon as you arrive in an area that has these features available, we remind you that you can now safely activate it. We guide you through the entire navigation and guidance experience, showing when you can hand over or need to take control again.
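The idea of surfacing ADAS availability during route planning can be sketched as follows. This is an illustrative Python model, not HERE's actual data format; the segment fields and capability levels are assumptions.

```python
from dataclasses import dataclass

@dataclass
class RouteSegment:
    road_name: str
    length_km: float
    adas_level: str  # illustrative levels: "manual", "assisted", "hands_free"

def hands_free_share(route: list[RouteSegment]) -> float:
    """Fraction of the route length where hands-free driving is available."""
    total = sum(s.length_km for s in route)
    hands_free = sum(s.length_km for s in route if s.adas_level == "hands_free")
    return hands_free / total if total else 0.0

# A route annotated per segment lets navigation show where hand-over is possible.
route = [
    RouteSegment("A2 motorway", 80.0, "hands_free"),
    RouteSegment("N201 regional road", 15.0, "assisted"),
    RouteSegment("City centre", 5.0, "manual"),
]
print(f"Hands-free available on {hands_free_share(route):.0%} of the route")
```

With such per-segment annotations, route comparison and selection can factor in how much of each candidate route supports hands-free driving.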

In this way, using ADAS becomes more seamless and intuitive, making it more likely to become part of every journey. Increasing the trustworthiness and acceptance of assisted driving equipment and features amongst drivers helps automakers improve their value perception even further.

AI: What are the main benefits to the OEM of starting with simpler ADAS?

Timmer: Having elemental ADAS in a vehicle provides an opportunity to slowly but surely upgrade the feature sets. The advantage of elemental ADAS features is that they will inform the driver about the things they need to know to be as safe as possible on the road.

Road rules and regulations are not always clearly signposted and often change. Did you know that 60% of speed limits are actually implicit? Assisting drivers to adhere to speed limits and other road rules through accurate information on the map therefore helps establish trust between the driver and the OEM’s elemental ADAS technology.

AI: What is the role of mapping data in autonomous decision making?

Timmer: Maps represent our collective understanding of the environment, enabling drivers and vehicle systems to understand the context in which they drive and to make more informed decisions. Another way of looking at it is the collective experience and wisdom of everybody on the road.

HERE maps share changing road conditions, road rules and regulations, and live traffic and traffic events that are critical for the autonomous driving systems.

What is important is that the autonomous system should not rely only on what its sensors can see. You can have as many sensors as you like, but they have physical and technical limitations.

Trust in assisted or automated driving starts with interactive maps.

It is therefore critical that the map provides insight about the road ahead that complements what the sensors can see. Even if there is dense traffic around you or troublesome weather, our system enables the vehicle to see beyond that, making for a far more comfortable and safer ride by preempting changing road conditions. It also adds resilience: if the sensors are temporarily blinded, for example, the vehicle can fall back on the map.
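The fallback behavior described here can be sketched in a few lines. This is an illustrative example, not HERE's actual fusion logic; the confidence threshold and function signature are assumptions, using speed limits as the attribute being fused.

```python
from typing import Optional

def effective_speed_limit(sensor_kph: Optional[int],
                          sensor_confidence: float,
                          map_kph: int,
                          min_confidence: float = 0.7) -> int:
    """Prefer a confident sensor observation; otherwise fall back to the map."""
    if sensor_kph is not None and sensor_confidence >= min_confidence:
        return sensor_kph  # trust a recent, high-confidence camera reading
    return map_kph         # sensor missing or blinded: use the map attribute

print(effective_speed_limit(100, 0.9, 120))   # confident sensor wins: 100
print(effective_speed_limit(None, 0.0, 120))  # blinded sensor, map fallback: 120
```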

AI: How do maps and the perception stack work together in ADAS?

Timmer: The perception stack gives a view of the real-time environment of the car, but the map goes beyond that. Maps are not just a navigation tool but an essential part of the vehicle’s understanding of its surroundings. The vehicle uses the map to plan its path, adjusting to factors like road curvature, road elevation, speed limits, changing lane counts and traffic or weather conditions, the last being particularly relevant for Electric Vehicles.

This allows for smoother maneuvers, lower energy consumption and optimized range, providing an enjoyable, optimized driving experience. Maps help the car see beyond the perception stack (what the sensors can perceive).

In dense urban traffic, where the perception stack may have limited visibility, the map plays a critical role in ensuring the vehicle can safely navigate by offering details about what’s ahead. Maps see beyond what sensors can see, providing benefits to drivers and systems like contextual awareness, seamless experience and anticipation of maneuvers.

The map is also used to help prioritize the compute budget available for the perception stack. If you know that you are on a straightforward two-lane highway, and there’s no intersection in the next 20 kilometers, and there’s no pedestrian crossing, there is limited demand for compute along that part of the route.

Yet there are very different compute demands in a dense urban environment with complex crossings.
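This map-informed compute budgeting can be sketched as follows. The task names, weights and context fields are illustrative assumptions, not a real ADAS scheduler; the point is that upcoming road context from the map, rather than the sensors, decides where perception compute goes.

```python
def perception_budget(road_context: dict) -> dict:
    """Split a normalized compute budget across perception tasks
    based on the road context the map reports ahead."""
    weights = {"lane_tracking": 1.0, "object_detection": 1.0}
    if road_context.get("intersections_ahead", 0) > 0:
        weights["intersection_handling"] = 2.0  # complex crossings need more compute
    if road_context.get("pedestrian_crossings", 0) > 0:
        weights["pedestrian_detection"] = 2.0
    total = sum(weights.values())
    return {task: w / total for task, w in weights.items()}

highway = {"intersections_ahead": 0, "pedestrian_crossings": 0}
urban = {"intersections_ahead": 3, "pedestrian_crossings": 5}
print(perception_budget(highway))  # all budget on lane/object tasks
print(perception_budget(urban))    # budget shared with urban-specific tasks
```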

AI: What is the importance of transparency in the car’s understanding of its environment and how can vehicle intent and real time feedback help drivers build trust in ADAS?

Timmer: We have always had a navigation map display in the cockpit and offer our navigation stack as well as the delivery of maps for ADAS to multiple automakers. This year at CES and Auto Shanghai, we demonstrated new capabilities to create a comprehensive ADAS experience.

The HERE SDK provides the latest, complete navigation and location services experience for connected vehicles. It stands out for its multi-scenario adaptability, data accuracy, coverage breadth, technical performance, and developer-friendly features, making it ideal for high-precision mapping, real-time navigation, and cross-platform support.

For the ADAS user experience, we need to help the driver understand what the vehicle understands about its environment. The car’s sensors often produce raw data that’s hard to interpret in isolation. If you only see what the sensors see, but not in context of a map, it’s hard to understand what the car perceives.

By projecting sensor data onto a map, users can see the spatial relationship between objects and the vehicle, helping them understand whether a vehicle is ahead, in the same lane, or in another lane. This contextualization makes it easier for the user to trust the system, as it delivers a high-fidelity representation of the surroundings and the road ahead.

The map becomes a canvas to show the user what the vehicle understands of the wider environment, with the perception stack projecting its understanding of the environment.
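The projection step can be sketched minimally. This is illustrative Python, not HERE's rendering code; the coordinate convention (x forward, y left, in metres) and the fixed lane width are assumptions.

```python
import math

LANE_WIDTH_M = 3.5  # assumed constant lane width for the sketch

def to_map_frame(ego_x, ego_y, ego_heading_rad, det_x, det_y):
    """Rotate and translate a vehicle-frame detection into the map frame
    using the ego vehicle's pose."""
    mx = ego_x + det_x * math.cos(ego_heading_rad) - det_y * math.sin(ego_heading_rad)
    my = ego_y + det_x * math.sin(ego_heading_rad) + det_y * math.cos(ego_heading_rad)
    return mx, my

def lane_relative(det_y: float) -> str:
    """Classify a detection by lateral offset in the vehicle frame."""
    if abs(det_y) <= LANE_WIDTH_M / 2:
        return "same lane"
    return "left lane" if det_y > 0 else "right lane"

# A car 30 m ahead and 0.5 m to the left appears on the map in our lane.
print(to_map_frame(0.0, 0.0, 0.0, 30.0, 0.5))
print(lane_relative(0.5))   # same lane
print(lane_relative(4.0))   # left lane
```

Once detections live in the map frame, drawing them on the navigation view is what turns raw sensor output into the contextualized picture described above.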

AI: Why is it important to contextualize sensor data for the user?

Timmer: For many people it is not intuitive. If you’re an engineer and you have processed a lot of sensor data, it feels like the most normal thing. But for many average consumers, it is magic.

Putting the sensor data on the map helps them understand the context, rather than seeing cars floating here and there on a screen. In addition, it immerses drivers in a compelling, differentiated experience that automakers can monetize.

AI: How is user feedback incorporated into the improvement of ADAS?

Timmer: It is important to incorporate user feedback into the system. If drivers feel uncomfortable (for example being routed through a dangerous area at night) or take control in certain situations (which we can monitor through our software), that feedback provides valuable insight into areas where the system might need improvement.

Automakers learn from where people weren’t comfortable, and that is why usage analytics are implemented in the software: so we learn when users or the vehicle behave differently from the design intent. This feedback can help refine the vehicle’s Operational Design Domain (ODD), in which the automaker encodes what level of support the driver gets, and where.

The monitoring can begin with understanding why and when users activate and deactivate the ADAS features.

Automakers can then determine in which environments the ADAS is seen as effective and with the help of our technologies build the effective user interface to surface that. If there is a lot of disengagement in certain environments, we can help them to identify why. Is there maybe something unique about the road conditions or the infrastructure that makes the system less effective or trusted in a particular environment?
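The analytics described here can be sketched as a simple aggregation. The event fields and the flagging threshold are illustrative assumptions, not HERE's or any automaker's actual telemetry schema.

```python
from collections import Counter

def disengagement_rates(events: list) -> dict:
    """Per road context, the share of journeys where the driver took over."""
    totals, takeovers = Counter(), Counter()
    for e in events:
        totals[e["context"]] += 1
        if e["driver_took_over"]:
            takeovers[e["context"]] += 1
    return {ctx: takeovers[ctx] / totals[ctx] for ctx in totals}

events = [
    {"context": "motorway", "driver_took_over": False},
    {"context": "motorway", "driver_took_over": False},
    {"context": "roadworks", "driver_took_over": True},
    {"context": "roadworks", "driver_took_over": True},
    {"context": "roadworks", "driver_took_over": False},
]
rates = disengagement_rates(events)
# Flag contexts with unusually high disengagement for closer investigation.
flagged = [ctx for ctx, r in rates.items() if r > 0.3]
print(rates)
print(flagged)  # ['roadworks']
```

Grouping disengagements by a map-derived context tag is what lets the automaker ask whether road conditions or infrastructure, rather than the system itself, explain the takeovers.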

We work with the automaker to understand the characteristics of these scenes and what it takes for them to perfect their solution, e.g. by helping them gain access to more training data, and more validation data to perfect the system.

Of course, users can sometimes see things that are too complex for sensors to observe, such as debris, or a speed limit that has changed but has not been picked up by the perception stack or is not yet accurate in the map. This user feedback can be incorporated to make the collective understanding of the environment more robust and more intelligent.

AI: How are real-world scenarios used in simulations to train ADAS?

Timmer: Our map can be used to create the scenes in which the training happens. Our high-definition and standard-definition maps are used as source data to create what is known as an Open Scene format, which is used in simulation environments populated with real-world scenarios. Earlier this year, together with our partner Amazon Web Services, we presented SceneXtract, a tool that enables automakers to select these scenes.

There are a lot of simulation companies which either take a prerecorded scenario and play it back in the scene or have a tool that enables them to generate a scenario, sometimes based on a real-world scenario, sometimes from scratch. These scenarios can then be used to train and validate in simulation environments. With SceneXtract, automakers can reduce the scene generation from weeks to minutes.
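Scene selection of this kind can be sketched as filtering a map-derived scene library by attributes. The scene fields and the filter function are illustrative assumptions, not SceneXtract's actual interface.

```python
# A toy library of map-derived scenes with illustrative attributes.
SCENES = [
    {"id": "s1", "road": "motorway", "lanes": 2, "weather": "rain"},
    {"id": "s2", "road": "urban", "lanes": 1, "weather": "clear"},
    {"id": "s3", "road": "motorway", "lanes": 3, "weather": "clear"},
]

def select_scenes(scenes, **criteria):
    """Return the scenes whose attributes match all given criteria."""
    return [s for s in scenes
            if all(s.get(k) == v for k, v in criteria.items())]

# e.g. validating lane-change behavior on motorways in the rain:
picked = select_scenes(SCENES, road="motorway", weather="rain")
print([s["id"] for s in picked])  # ['s1']
```

The selected scenes would then be populated with scenarios, prerecorded or generated, and handed to the simulation environment for training and validation.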

SceneXtract enables automakers to select real-world scenarios for ADAS training.

AI: How does simulation help improve system performance?

Timmer: This is achieved through a feedback loop where the OEM recognizes that their system is performing very well in one environment, but maybe not as well in another, or where they want to deploy in a new environment. This is where we can give them the scenes and the scenarios to train and help validate future functionality and keep improving it and making it more robust, more resilient and more reliable.

AI: What is next for HERE?

Timmer: We are going through a transition from being a map provider to a fully integrated map and software solution provider, with enough flexibility and adaptability so that the OEM can deliver unique and exciting innovations and explore new ways of integrating maps into the digital cockpit, ADAS systems and other in-vehicle experiences. We are convinced that automakers should spend time on what they do best: innovate and differentiate.

By ensuring high integrity between the map and the software using the data in the map, we provide a faster time to market, at a much lower total cost of ownership for the OEM. We have taken on much of the costs of integration, quality management, continuous testing and validation.

At the same time, we are not prescriptive. As a B2B company we enable automakers to expand our standard software, customize it, and configure it to capture their unique brand identity and make it their own.

A good example of that is that currently we have over 10 automakers that have deployed the exact same mainline app from us, HERE Navigation, yet the look-and-feel and user experience are vastly different from one to the other. In addition, most of them have incorporated unique services on top of the common base, yet all of them benefit from each other’s progress.

There is a lot of exploration and innovation to be done to further advance the industry. Automakers should be able to focus their efforts on creating meaningful and fun innovations for their customers, and we want to help.