Ottobots Leverage Contextual AI 2.0, Bringing Embodied VLMs to Edge Robotics
Ottonomy's Contextual AI 2.0, unveiled at CES, runs VLMs on Ambarella N1 edge hardware, a major step toward general AI for robotics.
“The integration of Ottonomy's Contextual AI 2.0 with Ambarella's advanced N1 family of SoCs marks a pivotal moment in the evolution of autonomous robotics.” - Amit Badlani, Director of Generative AI and Robotics at Ambarella

LAS VEGAS, NV, UNITED STATES, January 8, 2025 /EINPresswire/ -- Ottonomy, a pioneering startup in autonomous delivery robots, unveiled its Contextual AI 2.0 at CES 2025. The advancement runs Vision Language Models (VLMs) on Ambarella's edge hardware, allowing Ottobots to make more contextually aware decisions and exhibit intelligent behaviors. It marks a significant step toward generalized robot intelligence and reshapes robot perception, decision-making, and behavior.
Contextual AI 2.0 gives Ottobots an understanding of real-world complexity: they not only detect objects but also grasp the surrounding “context”. This situational awareness lets them adapt to different environments, operational domains, and even weather and lighting conditions. Replacing purpose-built, hard-coded robot behaviors with contextually aware ones is a big leap toward general intelligence for robotics.
“The integration of Ottonomy's Contextual AI 2.0 with Ambarella's advanced N1 Family of SoCs marks a pivotal moment in the evolution of autonomous robotics. By combining edge AI performance with the transformative potential of vision language models (VLMs), we're enabling robots to process and act on complex real-world data in real-time,” said Amit Badlani, Director of Generative AI and Robotics at Ambarella.
Contextual AI and modularity have been the core fabric of Ottonomy. For customers in healthcare, intralogistics, and last-mile delivery, its robots deliver vaccines, test kits, e-commerce packages, and even spare parts across large manufacturing campuses, in both indoor and outdoor environments.
Ottonomy's CEO Ritukar Vijay says, “VLMs on edge hardware is a game-changer for moving closer to general intelligence, and that's where we plug in our behavior engine to use the deep context for the highest level of autonomous adaptability.” By equipping robots with advanced cognitive abilities, Ottonomy aims to revolutionize how items are delivered, whether in healthcare, last-mile delivery, or intralogistics workflows, increasing productivity and delivering items seamlessly in both indoor and outdoor environments.
Ambarella's single SoC supports multi-modal LLMs of up to 34 billion parameters at low power consumption. This effort uses Solo Server to deliver fast, reliable, and fine-tuned AI directly on the edge. Stanford students from EE205 used Solo to easily deploy large vision-language models and depth models for environment processing. By processing data locally, Solo eliminates the need for constant cloud connectivity, enabling seamless operation even in offline or bandwidth-limited environments.
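The release does not describe the Solo Server API or Ottonomy's behavior engine, but the general idea of running a vision-language model locally on a device, with no cloud round trip, can be illustrated with a minimal sketch using the open-source Hugging Face transformers library and a hypothetical camera frame file named frame.jpg (this is an assumption-laden example, not Ottonomy's or Ambarella's actual stack):

```python
# Illustrative sketch only: local, on-device inference with a small open-source
# vision-language model. This is NOT the Solo Server or Contextual AI 2.0 API.
from transformers import pipeline

# Load an image-captioning VLM once at startup; weights are cached on the device,
# so subsequent runs need no network connectivity.
captioner = pipeline("image-to-text", model="Salesforce/blip-image-captioning-base")

# "frame.jpg" is a hypothetical camera frame captured by the robot.
result = captioner("frame.jpg")

# The model returns a natural-language scene description, which a downstream
# behavior layer could use as additional context for decision-making.
print(result[0]["generated_text"])
```

In this sketch, all processing after the one-time model download happens locally, mirroring the offline, bandwidth-limited operation the release describes.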
About Ottonomy Inc
Ottonomy is an award-winning, VC-backed, California-based robotics company serving customers across healthcare, intralogistics, and e-commerce. The company has deployed robots in pharmaceutical, hospital, logistics, manufacturing, and smart-city settings and is scaling globally. It is committed to developing innovative and sustainable technologies that revolutionize the way goods are delivered.
Here is a link to the Media Kit
Ritukar Vijay
Ottonomy Inc
..., ...
Visit us on social media:
X
LinkedIn
Instagram
YouTube
Ottonomy's Contextual AI 2.0 - A Leap Towards Generalized Intelligence for Robots
Legal Disclaimer:
EIN Presswire provides this news content "as is" without warranty of any kind. We do not accept any responsibility or liability
for the accuracy, content, images, videos, licenses, completeness, legality, or reliability of the information contained in this
article. If you have any complaints or copyright issues related to this article, kindly contact the author above.