Arize AI Introduces AI Copilot


Industry-first AI assistant for troubleshooting AI and other new updates promise to speed development for AI engineers

SAN FRANCISCO, July 11, 2024 /PRNewswire/ -- Arize AI, a pioneer and leader in AI observability and LLM evaluation, today debuted new capabilities to help AI developers evaluate and debug LLM systems. The debut is one of many announcements at the Arize:Observe conference today, where speakers – including OpenAI, Lowe's, Mistral, Microsoft, NATO, and others – are sharing the latest research, engineering best practices, and open source frameworks.

Arize Copilot – the industry's first AI assistant for troubleshooting AI systems – is a new tool that surfaces relevant information, suggests actions, and automates complex tasks in the Arize platform, helping AI engineers save time and improve application performance. Out of the box, the Copilot can help with tasks such as surfacing model insights, optimizing prompts, building custom evaluations, and running AI search.

"Using AI to troubleshoot complex AI systems is a logical next step in the evolution of building generative AI applications, and we are proud to offer Arize Copilot to teams that want to improve the development and performance of LLM systems," said Aparna Dhinakaran, Chief Product Officer and Co-Founder of Arize.

Other new workflows debuting today in the Arize platform promise to help engineers find issues with LLM apps once they are deployed. With AI search, for example, teams can select an example span and easily discover all similar issues (e.g., finding all data points where a customer is frustrated). Teams can then save those data points into a curated dataset to apply annotations, run evaluation experiments, or kick off fine-tuning workflows.
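
To make the idea of similarity-based span search concrete, here is a minimal conceptual sketch in plain Python/NumPy: it finds spans whose embeddings are close to a chosen example span and collects them into a curated list. This is an illustration of the general technique, not Arize's own API; the span records, placeholder embedding vectors, and the `find_similar` helper are all assumptions.

```python
# Conceptual sketch of similarity-based span search (not the Arize API).
# Assumes each span already has an embedding of its text from some
# sentence-embedding model; the vectors below are illustrative placeholders.
import numpy as np

spans = [
    {"id": "span-1", "text": "I am really frustrated, this still does not work",
     "embedding": np.array([0.9, 0.1, 0.0])},
    {"id": "span-2", "text": "Thanks, that answered my question",
     "embedding": np.array([0.1, 0.9, 0.2])},
    {"id": "span-3", "text": "This is so annoying, nothing you suggest helps",
     "embedding": np.array([0.85, 0.15, 0.05])},
]

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def find_similar(example_id: str, threshold: float = 0.95) -> list[dict]:
    """Return all spans whose embedding is close to the example span's."""
    example = next(s for s in spans if s["id"] == example_id)
    return [
        s for s in spans
        if s["id"] != example_id
        and cosine(s["embedding"], example["embedding"]) >= threshold
    ]

# Start from one frustrated-customer span and gather look-alikes into a
# curated dataset for annotation, evaluation, or fine-tuning.
curated_dataset = find_similar("span-1")
print([s["id"] for s in curated_dataset])  # e.g. ['span-3']
```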

Altogether, the updates make Arize a powerhouse for experimentation as well as production observability. Leveraging Arize, AI engineers can make adjustments – editing a prompt template, for example, or swapping out the LLM they are using – and then check whether performance across a test dataset regresses or there are other impacts (e.g., on latency, retrieval, or hallucinations) before safely deploying the change to production.
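
The sketch below shows what such an experiment loop might look like in principle, assuming a hypothetical `call_llm` helper that stands in for whatever model client a team uses: two prompt templates are run over the same test dataset, accuracy and latency are compared, and a change is only promoted if it does not regress. It is illustrative only and does not reflect the Arize SDK.

```python
# Minimal prompt-experiment sketch (illustrative only, not the Arize SDK).
# `call_llm` is a hypothetical stand-in for a real model client; it is stubbed
# here so the loop runs end to end.
import time

def call_llm(prompt: str) -> str:
    # Placeholder: a real implementation would call the team's LLM of choice.
    return "Paris" if "capital of France" in prompt else "unknown"

test_dataset = [
    {"question": "What is the capital of France?", "expected": "Paris"},
    {"question": "What is the capital of Spain?", "expected": "Madrid"},
]

templates = {
    "baseline": "Answer briefly: {question}",
    "candidate": "You are a concise assistant. Answer with one word: {question}",
}

def run_experiment(template: str) -> dict:
    """Run one prompt template over the test dataset and collect simple metrics."""
    correct, latencies = 0, []
    for row in test_dataset:
        start = time.perf_counter()
        answer = call_llm(template.format(question=row["question"]))
        latencies.append(time.perf_counter() - start)
        correct += int(answer.strip().lower() == row["expected"].lower())
    return {
        "accuracy": correct / len(test_dataset),
        "avg_latency_s": sum(latencies) / len(latencies),
    }

results = {name: run_experiment(tpl) for name, tpl in templates.items()}
print(results)

# Only promote the candidate template if accuracy did not regress.
if results["candidate"]["accuracy"] >= results["baseline"]["accuracy"]:
    print("Candidate is safe to deploy.")
```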

About Arize AI
Arize AI is an AI observability and LLM evaluation platform that helps teams deliver and maintain more successful AI in production. Arize's automated monitoring and observability platform allows teams to quickly detect issues when they emerge, troubleshoot why they happened, and improve overall performance across both traditional ML and generative use cases. Arize is headquartered in Berkeley, CA.

Media Contact: David Burch, [email protected]

SOURCE Arize AI



