Slim.ai Applies AI to Optimize Containers Before They Are Deployed
Fresh off raising $6.6 million in seed funding, Slim.ai this week launched a namesake DevOps platform that makes use of machine learning algorithms to resize and optimize containers before they are deployed in a production environment.
Slim.ai CEO John Amaral says many of the containers that developers currently look to deploy in a production environment are larger than they need to be, either because code that is not required has been encapsulated or because the code is simply organized inefficiently.
Delivered as a cloud service, Slim.ai's artificial intelligence (AI) first analyzes the contents of a container, removes unnecessary code and, when appropriate, reorganizes the code that remains, replacing the container with one optimally configured for a production environment, says Amaral. The resulting container is typically anywhere from 10 to 30 times smaller than the original, he adds.
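As a rough illustration of the scale of reduction being claimed, the sketch below uses the Docker SDK for Python to compare an original image with a slimmed replacement. The image tags are hypothetical placeholders, and the sketch assumes both images are already present on the local Docker daemon; it is not a representation of Slim.ai's own tooling.

```python
# Minimal sketch: measure how much smaller a slimmed image is.
# Assumes the Docker SDK for Python (pip install docker), a running Docker
# daemon, and two locally available images; the tags are hypothetical.
import docker

client = docker.from_env()

original = client.images.get("myapp:latest")  # image as built by the developer
slimmed = client.images.get("myapp:slim")     # optimized replacement image

orig_mb = original.attrs["Size"] / 1e6
slim_mb = slimmed.attrs["Size"] / 1e6

print(f"original: {orig_mb:.1f} MB")
print(f"slimmed:  {slim_mb:.1f} MB")
print(f"reduction: {orig_mb / slim_mb:.1f}x smaller")
```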
Slim.ai achieves that goal by first analyzing application composition and behavior to understand how a given set of containers is constructed and run. The platform then employs a combination of runtime and static container analysis capabilities, along with optimization engines, to construct a more efficient container runtime.
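The general idea behind combining static and runtime analysis can be sketched simply: compare what ships in an image with what the application actually touches while it runs, and treat the difference as candidates for removal. The sketch below assumes two hypothetical inputs, a static file manifest of the image and a runtime access trace (for example, gathered by a syscall tracer while tests run); it is a conceptual illustration, not Slim.ai's actual algorithm.

```python
# Conceptual sketch: files present in the image (static view) that were
# never accessed at runtime (dynamic view) are candidates for removal.
# Both input files are hypothetical, one file path per line.
from pathlib import Path


def load_paths(manifest: Path) -> set[str]:
    """Read one file path per line, ignoring blank lines."""
    return {line.strip() for line in manifest.read_text().splitlines() if line.strip()}


static_files = load_paths(Path("image_manifest.txt"))  # every file shipped in the image
runtime_files = load_paths(Path("runtime_trace.txt"))  # files actually opened at run time

unused = static_files - runtime_files
print(f"{len(unused)} of {len(static_files)} files were never touched at runtime")
# A slimming step could then rebuild the image without the unused files.
```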
Ideally, the replacement containers will be shared with the original developers so they can see, using a set of visualization tools provided by Slim.ai, how containers might be better constructed. However, some organizations, in the interest of time, may employ the Slim.ai platform as an optimization gate within a DevOps workflow, notes Amaral. The platform is designed from the ground up to integrate with continuous integration/continuous delivery (CI/CD) platforms, container registries and code repositories.
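One way to picture an optimization gate in a CI/CD pipeline is a step that blocks the build when an image misses a size budget. The sketch below assumes a hypothetical image tag and threshold and again uses the Docker SDK for Python; a real gate would invoke the optimization service rather than merely check the size.

```python
# Sketch of a CI/CD optimization gate: fail the pipeline stage when an
# image exceeds a size budget. Tag, budget and exit behavior are
# illustrative assumptions, not a documented Slim.ai integration.
import sys

import docker

MAX_IMAGE_MB = 200          # hypothetical size budget for production images
IMAGE_TAG = "myapp:latest"  # hypothetical image built earlier in the pipeline

client = docker.from_env()
size_mb = client.images.get(IMAGE_TAG).attrs["Size"] / 1e6

if size_mb > MAX_IMAGE_MB:
    print(f"{IMAGE_TAG} is {size_mb:.0f} MB, over the {MAX_IMAGE_MB} MB budget")
    sys.exit(1)  # non-zero exit blocks the pipeline stage

print(f"{IMAGE_TAG} is {size_mb:.0f} MB, within budget")
```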
Amaral says Slim.ai also has a role to play in advancing DevSecOps best practices by making it easier to identify code with known vulnerabilities that a developer might have inadvertently encapsulated.
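The DevSecOps angle can be illustrated with a simple dependency check: flag packages in an application's manifest that match known advisories. The advisory data below is a hypothetical stand-in for a real CVE feed, and the requirements file name is illustrative; this is a sketch of the general technique, not Slim.ai's scanner.

```python
# Sketch: flag dependencies whose pinned versions match known advisories.
# KNOWN_VULNERABLE is a hypothetical stand-in for a real vulnerability feed.
from pathlib import Path

KNOWN_VULNERABLE = {
    "examplelib": "1.0.3",  # hypothetical advisory: package -> affected version
    "oldparser": "2.2.0",
}


def parse_requirements(path: Path) -> dict[str, str]:
    """Parse simple 'name==version' lines from a requirements file."""
    deps = {}
    for line in path.read_text().splitlines():
        if "==" in line:
            name, version = line.strip().split("==", 1)
            deps[name] = version
    return deps


deps = parse_requirements(Path("requirements.txt"))
flagged = [(n, v) for n, v in deps.items() if KNOWN_VULNERABLE.get(n) == v]

for name, version in flagged:
    print(f"known vulnerability: {name}=={version}")
```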
In general, it’s not clear just yet how broadly AI will be applied to automate DevOps going forward. However, given how dependent application development and deployment still are on manual processes, the potential for machine learning algorithms to automate a wide range of tasks is significant. In fact, most organizations that adopt containers will not be able to achieve the developer productivity levels required to deploy microservices-based applications at scale without relying more on AI, Amaral says. Slim.ai is designed to eliminate the dependence on specialized knowledge of a production environment that today typically resides only in the head of a site reliability engineer (SRE), he notes.
DevOps proponents who are committed to ruthlessly automating IT to the fullest extent possible should naturally be at the forefront of AI adoption. The real challenge is gaining enough confidence in these AI platforms to trust the recommendations made and actions taken across complex tasks and processes that today require a lot of manual intervention to get right. At this point, however, it’s not a question of whether AI will be applied to DevOps so much as to what degree.