JFrog Ltd, the Liquid Software company, has announced a new integration with Amazon SageMaker, enabling companies to build, train, and deploy machine learning (ML) models.
The company has also introduced new versioning capabilities for its ML Model Management solution, integrating model development into DevSecOps workflows. These capabilities increase transparency around each model version, allowing developers, DevOps teams, and data scientists to ensure the correct, secure version of a model is utilised.
The company’s integration with Amazon SageMaker ensures that all artifacts used by data scientists to develop ML applications are saved in JFrog Artifactory. The integration is available to JFrog customers and users.
“The combination of Artifactory and Amazon SageMaker creates a single source of truth that indoctrinates DevSecOps best practices to ML model development in the cloud – delivering flexibility, speed, security, and peace of mind – breaking into a new frontier of MLSecOps,” said Kelly Hartman, SVP of Global Channels and Alliances at JFrog.
The company will host an educational webinar on January 31 to discuss best practices for introducing ML model use and development into secure software supply chain and development processes.
JFrog’s integration with Amazon SageMaker will enable organisations to maintain a single source of truth for data scientists and developers, ensuring all models are accessible, traceable, and tamper-proof.
Furthermore, the company said the integration brings machine learning closer to software development and production workflows, protects models from deletion or modification, supports the development, training, securing, and deployment of ML models, and scans ML model licenses for compliance with company policies and regulatory requirements.