IT Brief India - Technology news for CIOs & IT decision-makers

Endor Labs unveils AI open source model discovery tool

Today

Endor Labs has announced AI Model Discovery, a new feature that helps organisations discover and govern the open source artificial intelligence (AI) models used within their applications.

The feature is aimed at application security professionals: it identifies open source AI models within their code, assesses the associated risks, and enforces organisation-wide policies on AI model curation and usage. It also automates detection, alerting developers to policy violations and blocking high-risk models before they reach production.

Varun Badhwar, co-founder and CEO of Endor Labs, emphasised the necessity of this development, stating, "There's currently a significant gap in the ability to use AI models safely—the traditional Software Composition Analysis (SCA) tools deployed in many enterprises are designed mainly to track open source packages, which means they usually can't identify risks from local AI models integrated into an application. Meanwhile, product and engineering teams are increasingly turning to open source AI models to deliver new capabilities for customers. That's why we're excited to launch Endor Labs AI Model Discovery, which brings unprecedented security in open source AI deployment."

Experts in the field have recognised the critical need for such tools. Katie Norton, Research Manager for DevSecOps and Software Supply Chain Security at IDC, explained, "While vendors have rushed to incorporate AI into their security tooling, they've largely overlooked a critical need: Securing AI components used in applications. IDC research finds that 60% of organisations are choosing open source models over commercial ones for their most important GenAI initiatives, so finding and securing these components is critical for any dependency management program. Vendors like Endor Labs are addressing an urgent need by integrating AI component security directly into software composition analysis (SCA) workflows, while providing meaningful remediation capabilities that don't overwhelm developers."

The capability builds on Endor Scores for AI Models, which evaluate open source AI models on four metrics: security, popularity, quality, and activity. This is particularly pertinent because most developers deploy pre-trained models from Hugging Face and adapt them for specific purposes, rather than training new models from scratch, given the significant time and cost involved.

Using Endor Scores, Endor Labs can identify and evaluate the risks associated with the more than one million open source AI models and datasets available on Hugging Face. Security teams can then set guardrails, gaining the same level of visibility and control over AI models that they have come to expect for other open source dependencies.
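To make the scoring idea concrete, here is a minimal Python sketch of a multi-factor model rating along the four dimensions the article names. The weights, formula, threshold, and function names are illustrative assumptions, not Endor Labs' actual methodology.

```python
# Hypothetical sketch: combine per-dimension metrics (each on a 0-10 scale)
# into one overall score, in the spirit of multi-factor ratings such as
# Endor Scores. Weights and threshold are invented for illustration.

WEIGHTS = {"security": 0.4, "quality": 0.3, "activity": 0.15, "popularity": 0.15}

def model_score(metrics: dict[str, float]) -> float:
    """Weighted average of the four dimensions; missing dimensions count as 0."""
    return round(sum(WEIGHTS[dim] * metrics.get(dim, 0.0) for dim in WEIGHTS), 2)

def meets_policy(metrics: dict[str, float], minimum: float = 7.0) -> bool:
    """Example guardrail: block models whose overall score is below a threshold."""
    return model_score(metrics) >= minimum
```

For example, a model rated 9 on security, 8 on quality, 6 on activity, and 7 on popularity scores 7.95 under these weights and would pass a 7.0 threshold.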

Endor Labs AI Model Discovery offers a structured approach through its key functionalities:

  • Discover: Scans for and locates local AI models already used within Python applications, creating a comprehensive inventory and tracking which teams and applications use each model.
  • Evaluate: Analyses AI models for known risk factors using Endor Scores across the security, quality, activity, and popularity dimensions, flagging models with questionable sources or practices.
  • Enforce: Lets organisations set and manage usage policies for local open source AI models based on risk tolerance, warning developers of policy violations and preventing high-risk models from being integrated into applications.
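A rough sense of the Discover and Enforce steps can be given with a short Python sketch that walks a file's syntax tree for Hugging Face's common `from_pretrained(...)` loading idiom and checks each discovered model ID against a policy. The deny list and function names are hypothetical, and this is not Endor Labs' implementation.

```python
import ast

# Hypothetical organisation policy: model IDs that must not reach production.
DENY_LIST = {"someorg/untrusted-model"}

def find_model_ids(source: str) -> list[str]:
    """Discover: collect string literals passed to *.from_pretrained(...) calls,
    the common Hugging Face idiom for loading a pre-trained model."""
    ids = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Attribute)
                and node.func.attr == "from_pretrained"
                and node.args
                and isinstance(node.args[0], ast.Constant)
                and isinstance(node.args[0].value, str)):
            ids.append(node.args[0].value)
    return ids

def violations(source: str) -> list[str]:
    """Enforce: report any discovered model IDs that the policy blocks."""
    return [m for m in find_model_ids(source) if m in DENY_LIST]
```

Running `violations` over a file that loads `"someorg/untrusted-model"` would surface that ID as a policy violation, which is the kind of alert the article describes developers receiving.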

Endor Labs AI Model Discovery is available now to current customers, who can also trial the full platform for 30 days.
