Helm.ai Announces DNN Foundation Models for Intent Prediction and Path Planning


REDWOOD CITY, Calif., Dec. 27, 2023 /PRNewswire/ -- Helm.ai, a provider of next-generation AI software for autonomous driving and robotic automation, today announced DNN (deep neural network)-based foundation models for behavioral prediction and decision-making as part of the company's AI software stack for high-end ADAS L2/L3 and L4 autonomous driving.

Intent + Path Prediction Top 3

The company has trained DNN foundation models to make predictions about the behavior of vehicles and pedestrians in complex urban scenarios, as well as to predict the path an autonomous vehicle would take in those situations; both are critical ingredients of the decision-making capabilities for self-driving cars. Helm.ai leveraged its industry-validated surround-view, full-scene semantic segmentation and 3D detection system as the core representation for training the intent prediction and path planning capabilities. Additionally, the foundation models are trained using the company's proprietary Deep Teaching technology to achieve broad predictive capability in a scalable way.
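To make this concrete for technical readers, the sketch below shows one plausible shape for the kind of per-frame perception representation described above (a surround-view semantic segmentation map plus tracked 3D detections) that a prediction model could consume. The field names, shapes, and units are illustrative assumptions, not Helm.ai's actual data format.

```python
# Hypothetical sketch of a per-frame perception representation: a surround-view
# semantic segmentation map plus tracked 3D detections. Field names, shapes,
# and units are assumptions, not Helm.ai's actual format.

from dataclasses import dataclass, field

import numpy as np


@dataclass
class Box3D:
    center_xyz: np.ndarray      # (3,) position in the ego frame, metres
    size_lwh: np.ndarray        # (3,) length, width, height in metres
    yaw: float                  # heading angle in radians
    category: str               # e.g. "vehicle" or "pedestrian"


@dataclass
class PerceptionFrame:
    timestamp_s: float
    semantic_map: np.ndarray                            # (H, W) class index per pixel, surround-view composite
    boxes: list[Box3D] = field(default_factory=list)    # tracked 3D detections in the scene


# A prediction model would consume a short history of such frames and be
# trained to anticipate how the scene evolves next.
history: list[PerceptionFrame] = [
    PerceptionFrame(
        timestamp_s=0.1 * i,
        semantic_map=np.zeros((128, 256), dtype=np.uint8),
        boxes=[Box3D(np.zeros(3), np.array([4.5, 1.8, 1.5]), 0.0, "vehicle")],
    )
    for i in range(4)
]
```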

Helm.ai's technology learns directly from real driving data, using the company's highly accurate and temporally stable perception system to capture information about the complex behaviors of vehicles and pedestrians and about the surrounding driving environment, so the resulting DNNs automatically learn subtle yet important aspects of urban driving. The foundation models powering Helm.ai's intent and path prediction take as input a series of observed images and generate predicted video sequences representing the most likely outcomes of what happens next. The models also provide a predicted path for the autonomous vehicle that is consistent with the intent prediction. Both the intent prediction and path prediction capabilities are essential for planning the safest possible action by the autonomous vehicle.
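To illustrate the input/output structure of that description, here is a minimal, hypothetical PyTorch sketch of a model that consumes a short sequence of perception frames and emits several candidate future frame sequences ("what happens next") together with an ego path for each candidate. It illustrates the general idea only; the architecture, shapes, and names are assumptions and do not reflect Helm.ai's actual models.

```python
# Toy, illustrative model (not Helm.ai's architecture or API): observed frames
# in, N candidate future frame sequences and an ego path per candidate out.

import torch
import torch.nn as nn


class ToyIntentAndPathPredictor(nn.Module):
    def __init__(self, channels=8, hidden=64, horizon=6, num_outcomes=3, path_dim=2):
        super().__init__()
        self.channels = channels
        self.horizon = horizon
        self.num_outcomes = num_outcomes
        # Encode each observed frame (e.g., a semantic/occupancy grid) to a vector.
        self.frame_encoder = nn.Sequential(
            nn.Conv2d(channels, 16, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
            nn.Flatten(),
            nn.Linear(16 * 4 * 4, hidden),
        )
        # Aggregate the observed sequence over time.
        self.temporal = nn.GRU(hidden, hidden, batch_first=True)
        # Heads: future frames for each candidate outcome, and a waypoint path per outcome.
        self.future_head = nn.Linear(hidden, num_outcomes * horizon * channels * 16 * 16)
        self.path_head = nn.Linear(hidden, num_outcomes * horizon * path_dim)

    def forward(self, frames):
        # frames: (batch, time, channels, 16, 16)
        b, t = frames.shape[:2]
        feats = self.frame_encoder(frames.flatten(0, 1)).view(b, t, -1)
        _, h = self.temporal(feats)          # summary of the observed sequence
        h = h.squeeze(0)
        futures = self.future_head(h).view(
            b, self.num_outcomes, self.horizon, self.channels, 16, 16
        )
        paths = self.path_head(h).view(b, self.num_outcomes, self.horizon, -1)
        return futures, paths


if __name__ == "__main__":
    model = ToyIntentAndPathPredictor()
    observed = torch.randn(1, 4, 8, 16, 16)   # 4 observed frames
    futures, paths = model(observed)
    print(futures.shape)   # (1, 3, 6, 8, 16, 16): 3 candidate outcome "videos"
    print(paths.shape)     # (1, 3, 6, 2): an (x, y) waypoint path per outcome
```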

Importantly, the Helm.ai DNN foundation models for intent prediction and path planning are trained in the highly scalable Deep Teaching paradigm, enabling unsupervised learning about complex urban driving scenarios directly from real driving data. This approach circumvents cumbersome physics-based simulators and hand-coded rules, which are insufficient to capture the full complexity of real-world driving. In particular, the Helm.ai development and validation pipeline, while optimized for high-end ADAS L2/L3 mass-production software, can also be applied directly to L4 fully autonomous applications. Moreover, Helm.ai's scalable AI approach readily generalizes to robotics domains beyond self-driving.
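Helm.ai does not disclose the details of Deep Teaching, but the general idea of unsupervised learning from real driving data can be illustrated with a generic self-supervised training step: split each logged sequence into observed and future frames and train a predictor (the toy model from the sketch above) to reproduce the future from the observed, with no simulator and no hand-coded rules. The best-of-N loss and all names here are assumptions for illustration, not Helm.ai's method.

```python
# Generic, hypothetical self-supervised step on unlabeled driving logs.
# Supervision comes from the recorded future frames themselves.

import torch


def training_step(model, optimizer, logged_sequence, observed_len=4):
    # logged_sequence: (batch, time, channels, H, W) frames from real drives.
    observed = logged_sequence[:, :observed_len]
    future = logged_sequence[:, observed_len:]               # target: what actually happened
    pred_futures, _ = model(observed)                        # (batch, outcomes, horizon, C, H, W)
    horizon = min(pred_futures.shape[2], future.shape[1])
    target = future[:, :horizon].unsqueeze(1)                # broadcast over candidate outcomes
    # "Best-of-N" loss: only the closest candidate outcome is penalized, a common
    # trick for multi-modal future prediction (an assumption here, not Helm.ai's).
    per_outcome = ((pred_futures[:, :, :horizon] - target) ** 2).mean(dim=(2, 3, 4, 5))
    loss = per_outcome.min(dim=1).values.mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


if __name__ == "__main__":
    model = ToyIntentAndPathPredictor()                      # from the sketch above
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    fake_log = torch.randn(2, 10, 8, 16, 16)                 # stand-in for frames from real drives
    print(training_step(model, optimizer, fake_log))
```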

Helm.ai is building an AI-first approach to autonomous driving that is designed to scale seamlessly from high-end ADAS L2/L3 mass-production programs all the way to large-scale L4 deployments. The company's software-only platform is hardware-agnostic and vision-first, addressing the critical perception problem with vision while also incorporating sensor fusion between vision and radar/lidar as needed. The technology advancements announced today extend the value of Helm.ai's software offering by paving the way for scalable development and validation of AI-based intent prediction and path planning software for autonomous vehicles.

"At Helm we are pioneering a highly scalable AI approach that addresses high end ADAS L2/L3 mass production and large scale L4 deployments simultaneously in the same framework," said Helm CEO Vladislav Voroninski.

"Perception is the critical first component of any self-driving stack. The more comprehensive and temporally stable a perception system is, the easier it is to build the downstream prediction capabilities, which is especially critical for complex urban environments. Leveraging our industry-validated surround-view urban perception system and Deep Teaching training technology, we trained DNN foundation models for intent prediction and path planning to learn directly from real driving data, allowing them to understand a wide variety of urban driving scenarios and the subtleties of human behavior without the need for traditional physics based simulators or hand-coded "

Helm.ai closed a $55 million Series C funding round in August 2023. The round was led by Freeman Group and includes investments from venture capital firms ACVC Partners and Amplo as well as strategic investments from Honda Motor, Goodyear Ventures, and Sungwoo Hitech. This financing brings the total amount raised by Helm.ai to $102 million.

About Helm.ai
Helm.ai is building the next generation of AI software for high-end ADAS, L4 autonomous driving, and robotics. Founded in November 2016 in Menlo Park, CA, the company has re-envisioned the way AI software is built in order to make truly scalable autonomous driving a reality. For more information on Helm.ai, including its products, SDK, and open career opportunities, visit the company's website or find Helm.ai on LinkedIn.

SOURCE Helm.ai



