ADAS requirements and technologies are constantly evolving in what is proving to be protracted progress toward fully autonomous driving. To reduce costs and risks, OEMs and Tier 1s are looking for technology that will scale along with the progress towards self-driving vehicles.
One of the challenges is the processing of the terabytes of data generated by the array of sensors needed for the various levels of autonomous driving. A breakthrough is the Hailo-8™ artificial intelligence (AI) processor for edge devices, which won the Best Edge AI Processor award at the 2021 Edge AI and Vision Product of the Year Awards.
According to leading AI chipmaker Hailo, the Hailo-8 chip is built on an architecture that enables edge devices to run sophisticated deep learning applications that could previously only be processed in the cloud. The Israel-based chipmaker says its AI processor reimagines traditional computer architecture, enabling smart devices to perform sophisticated deep learning tasks such as object detection and segmentation in real time, with minimal power consumption, size, and cost.
Supported by its Hailo-8™ M.2 and Mini PCIe high-performance AI acceleration modules, the deep learning processor is designed to fit into a multitude of smart machines and devices, impacting a wide variety of sectors including automotive, Industry 4.0, smart cities, smart homes, and retail. Hailo’s AI processor is already enhancing products such as Foxconn’s BOXiedge™ and Leopard Imaging’s EdgeTuring™.
Orr Danon, Co-Founder and CEO of Hailo, says the company works with OEMs and Tier 1s that are at the forefront of automotive AI technology. The Hailo AI processor is an AEC-Q100 Grade 2 automotive AI accelerator, developed in accordance with ISO 26262 guidelines to meet ASIL-B conformance. It can handle up to 26 tera operations per second (TOPS).
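To put the 26 TOPS figure in perspective, a back-of-envelope calculation can translate a raw operations budget into a theoretical frame rate. The network cost and utilization figures below are illustrative assumptions (a ResNet-50-class model needs roughly 4 billion multiply-accumulates, i.e. ~8 GOPs, per 224×224 frame), not Hailo benchmarks:

```python
# Back-of-envelope: theoretical frame rate at a given TOPS budget.
# All workload figures are illustrative assumptions, not vendor data.

TOPS = 26                # peak throughput, per the article
ops_per_frame = 8e9      # assumed ops per inference (1 MAC = 2 ops)
utilization = 0.5        # assumed fraction of peak actually achieved

frames_per_second = TOPS * 1e12 * utilization / ops_per_frame
print(f"~{frames_per_second:.0f} frames/s theoretical upper bound")
# → ~1625 frames/s theoretical upper bound
```

Real throughput depends on the compiler, memory bandwidth, and batch size, but the exercise shows why a budget of this size can serve several camera streams at once.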
“Hailo’s redesign eliminates untenable heat dissipation issues and removes the need for active cooling systems in the automotive industry. Its advanced structure translates to higher performance, lower power, and minimal latency, enabling more privacy and better reliability for smart devices operating at the edge,” says the company.
Automotive Industries (AI) asked Yaniv Sulkes, VP Product Marketing and Automotive at Hailo, what the significance of the Hailo-8™ processor is to OEMs.
Sulkes: The pursuit of “Vision Zero” is set by Euro NCAP, the European association for car safety, as the next high-level goal for 2025. Intelligent perception technologies take the driver’s seat in this journey towards automotive AI. From vulnerable road user (VRU) protection capabilities to in-cabin driver monitoring, compute-intensive processing is needed that also meets thermal dissipation, mass-production quality, area, and price constraints.
More and more vehicles today are equipped with Advanced Driver Assistance Systems. Evolving standards and regulations like Euro NCAP’s Vision Zero and the focus on VRU protection, increasingly demand the inclusion of vision-based systems such as Automatic Emergency Steering (AES), lane support (or active lane-keeping assistance) and rear and side collision avoidance.
AI: What are the implications of the EU mandates on ADAS?
Sulkes: The EU regulation focuses on assisting the driver with braking, accelerating and steering, as well as monitoring driver attention and logging driver actions. Most of these functions correspond to SAE Level 1 and Level 2 and are gaining significant momentum in the market. As automakers compete in the global arena, ADAS functions offered by non-European automakers will need similar capabilities to be competitive and to meet increasing safety ratings and regulation in the US, China, and other regions.
In parallel to advances in ADAS functions driven by regulators, the automotive industry understands there is a big leap from Level 2 ADAS to the higher levels of autonomy (Levels 3 and 4), which gave birth to the now popular Level 2+ ADAS. These systems add safety and comfort features on top of traditional ADAS applications, while keeping the driver in control of the vehicle, unlike Level 3.
Vehicle safety ratings are helping consumers get clear rankings related to ADAS when it comes to safety. Euro NCAP has been testing ADAS features for a few years now, and it has a roadmap in place that defines testing protocols and associated ratings for advanced ADAS features. These include driver monitoring, vulnerable road user protection and others. NHTSA in the US is considering following Euro NCAP’s safety ratings for ADAS. Looking at new vehicle models and at the solution offerings of automotive Tier 1 suppliers, regulation and safety ratings are going a long way in shaping the landscape of ADAS solutions and capabilities offered in many passenger vehicles.
AI: What solutions does Hailo offer?
Sulkes: Currently largely reliant on purpose-built intelligent cameras, ADAS systems require powerful processing within a limited power and space budget, and are thus constrained by traditional processors (that is, CPUs and GPUs). The Hailo AI processor is built for efficient, high-throughput and low-power AI processing in automotive-qualified systems. Within a small power budget, it can run multiple neural networks simultaneously on one or several high-resolution video inputs in real time with low latency, making it ideal for supporting the most advanced ADAS applications.
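Running several networks concurrently over independent camera streams is, at the software level, a pipeline problem: each network consumes frames from its own queue while the accelerator multiplexes the work. The sketch below is purely illustrative (it is not the Hailo SDK; the network names and the placeholder inference call are invented for the example):

```python
# Illustrative sketch (NOT the Hailo SDK): several networks processing
# independent frame streams concurrently, one worker per network.
import queue
import threading

def run_network(name, frames, results):
    """Placeholder worker: a real accelerator call would replace the tag step."""
    while True:
        frame = frames.get()
        if frame is None:          # sentinel: stream ended
            break
        results.put((name, frame))  # stand-in for an inference result

frames_in = {"detector": queue.Queue(), "segmenter": queue.Queue()}
results = queue.Queue()
workers = [
    threading.Thread(target=run_network, args=(name, q, results))
    for name, q in frames_in.items()
]
for w in workers:
    w.start()

# Feed a few dummy frames to each network, then signal end-of-stream.
for q in frames_in.values():
    for i in range(3):
        q.put(f"frame-{i}")
    q.put(None)
for w in workers:
    w.join()

print(results.qsize())  # 6 results: 3 frames through each of 2 networks
```

In a real deployment the per-network queues would be fed by camera capture threads and drained by the vehicle’s decision logic; the accelerator’s job is to keep both inference paths fed within the latency budget.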
AI: How does this compare to older ADAS systems?
Sulkes: A few years back, the definitive ADAS systems “menu” was based on the SAE J3016 standard (which dates back to 2014 and was revised in 2016 and 2018). It defines the six levels of driving automation from no automation (Level 0) to full automation (Level 5). Looking at the market today, it is pretty clear that the menu has changed, and this standard is no longer its foundation. I would argue the emphasis has shifted from technology with a roadmap of driving automation to a pragmatic view centered on a range of effective driver assistance technologies that are making their way into most new vehicle models.
Beyond engineers dissecting driving automation and splitting it into multiple areas that require solutions, there are of course several other key factors here. One is what consumers want. Another is what regulation requires; and, finally, how automakers take these two into account in their new vehicle models.
Car buyers want safety and comfort and have come to expect more over time. Voluntary vehicle rating systems, such as Euro NCAP and C-NCAP (China), NHTSA’s Safety Ratings (US), i-VISTA for ADAS/AV (China) and others help consumers compare different vehicles, especially when it comes to safety. Regulation serves as the lower bound for active safety features. Comfort features driven by ADAS are a bit trickier and have no objective metric, but do carry weight in the design of these systems.
AI: What is your reading of the future of autonomous vehicles?
Sulkes: Automotive computational demands continue to rise as we pave the way for higher levels of autonomy. L3 autonomy and above involves complex multi-sensor systems that need to do more as the human driver is phased out. The multitude of vision and other emerging sensors, and the need to fuse them, poses challenges, and it is clear that neural processing requirements are increasing.
The processing power the Hailo AI chip offers today is such that it can support real-time AI processing of multiple streams of high-resolution video, as well as different types of sensors. The processor’s hardware architecture and software are scalable and can be optimized for different levels of neural workloads. Moreover, several chips can be combined to handle even larger workloads.
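One simple way to picture combining several chips for a larger workload is static placement: each video stream is assigned to one device so the total load is balanced. The sketch below is a hypothetical illustration of that idea (the camera names and the round-robin policy are invented for the example and do not represent Hailo's actual software):

```python
# Illustrative sketch (hypothetical, not vendor software): round-robin
# assignment of video streams across several accelerator chips.

def assign_streams(streams, num_chips):
    """Map each stream name to a chip index, balancing load round-robin."""
    return {s: i % num_chips for i, s in enumerate(streams)}

cameras = ["front", "rear", "left", "right", "driver_cam"]
placement = assign_streams(cameras, num_chips=2)
print(placement)
# → {'front': 0, 'rear': 1, 'left': 0, 'right': 1, 'driver_cam': 0}
```

Real systems would weight the assignment by each stream's resolution and network cost rather than treating all streams as equal, but the principle of scaling out by adding devices is the same.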