With Latest Intel Mobileye Chip, Automakers Can Bring Automated Driving to Cars

Intel subsidiary Mobileye plans to bring to market a new supercomputer designed to give passenger cars, trucks and SUVs autonomous driving powers.

The company showcased a new system-on-a-chip called EyeQ Ultra, specifically designed for autonomous driving, at the CES 2022 tech show on Tuesday. The company said the first silicon for the EyeQ Ultra SoC, which is capable of 176 trillion operations per second (TOPS), is expected in late 2023 with full automotive-grade production in 2025.

The company also showcased its next-generation EyeQ systems-on-a-chip for advanced driver assistance systems, called EyeQ6L and EyeQ6H, at CES. The EyeQ6L is designed to support ADAS Level 2 features and is expected to reach start of production by mid-2023. The EyeQ6H, which will not enter production until 2024, will support ADAS as well as partially autonomous vehicle capabilities. This more efficient chip will handle all advanced driver assistance functions and multi-camera processing (including parking cameras), and will host third-party applications such as parking visualization and driver monitoring.

Mobileye is perhaps best known for providing automakers with the computer vision technology that powers advanced driver assistance systems. The first EyeQ chip was released in 2004 and was used in vehicles to help avoid collisions. This has been a booming business for Mobileye, which delivered its 100 millionth EyeQ SoC at the end of last year.

In recent years, the company has pursued what appeared to be a dual strategy: supplying automakers with the chips they need for advanced driver assistance systems while developing and testing its own autonomous vehicle technology. In 2018, Mobileye even broadened its focus beyond being a supplier to becoming a robotaxi operator.

These two paths are now converging, fulfilling a long-standing strategy of Mobileye President and CEO Amnon Shashua, who describes consumer AVs as the "end game for the industry."

Mobileye has been developing automated vehicle technology for several years. Its comprehensive self-driving stack – which includes redundant sensing subsystems based on camera, radar and lidar technology – is paired with its REM mapping system and a rules-based driving policy known as Responsibility-Sensitive Safety (RSS).

Mobileye’s REM mapping system collects data by tapping into consumer and fleet vehicles equipped with its fourth-generation EyeQ4 system-on-a-chip to create high-definition maps that can support both ADAS and autonomous driving. This data is not video or imagery, but compressed data amounting to around 10 kilobits per kilometer driven. The mapping technology, which guided the development of the new EyeQ Ultra chip, is accessible via the cloud to provide up-to-date information, in real time, about the roads ahead.

Mobileye has entered into agreements with six OEMs, including BMW, Nissan and Volkswagen, to collect this data from vehicles equipped with the EyeQ4 chip, which powers their advanced driver assistance systems. On fleet vehicles, Mobileye collects data through an aftermarket product that it sells to commercial operators. Today, more than one million vehicles collect REM data – now at over 25 million kilometers per day, according to Mobileye.
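Those two figures – roughly 10 kilobits of map data per kilometer and over 25 million kilometers harvested per day – imply a surprisingly modest data volume. The back-of-the-envelope sketch below works through the arithmetic using only the numbers stated in the article; it is an illustration, not an official Mobileye calculation.

```python
# Rough sanity check of the REM data rates cited in the article.
KILOBITS_PER_KM = 10           # compressed REM map data per kilometer driven
FLEET_KM_PER_DAY = 25_000_000  # kilometers logged daily by the REM fleet

# A single 100 km drive contributes about:
trip_kb = KILOBITS_PER_KM * 1_000 * 100 / 8 / 1_000   # kilobytes
# 10 kbit/km * 100 km = 1,000 kbit = 125 KB

# Fleet-wide daily upload volume:
daily_gb = (KILOBITS_PER_KM * 1_000 * FLEET_KM_PER_DAY
            / 8 / 1_000_000_000)                      # gigabytes (decimal)

print(f"Per 100 km trip: {trip_kb:.0f} KB")    # → 125 KB
print(f"Fleet per day:   {daily_gb:.2f} GB")   # → 31.25 GB
```

In other words, a full day of crowd-sourced mapping from a million-plus vehicles amounts to only a few tens of gigabytes – which is why text-like compressed data, rather than raw video, makes the scheme practical over cellular connections.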

The EyeQ Ultra builds on previous generations of the company's SoC architecture. According to Mobileye, the EyeQ Ultra packs the processing power of 10 EyeQ5s into a single package. The company said the EyeQ Ultra, which is designed alongside Mobileye's software, is paired with additional processor cores, ISPs and GPUs, and is capable of processing input from two sensing subsystems – one camera-only, the other combining radar and lidar – as well as the vehicle's central computing system, the high-definition REM map and the RSS driving policy software.

Automakers keen to sell consumers cars, trucks and SUVs that can drive themselves would theoretically use this single chip to achieve that goal. The EyeQ Ultra does not include sensors like radar and lidar; instead, it processes all of the incoming information from them. It is up to the automaker customer to decide exactly how the EyeQ Ultra chip is used. For example, one automaker might offer new vehicles capable of driving autonomously only on highways, while another might focus on automation in urban areas.