Applying Generative AI Within Cloud-Native Workflows
The new wave of AI is profoundly enhancing software development, but its power is not limited to code generation. After all, generating all the code in the world matters little if you can’t test, deploy and manage it effectively. As such, there are many interesting ways generative AI can be applied to accelerate cloud-native development and collaboration, especially for microservices-based applications built with containers and running on platforms like Kubernetes.
At KubeCon + CloudNativeCon 2023, I met with David DeSanto, chief product officer at GitLab, to understand the effects generative AI can have on cloud-native DevOps. According to DeSanto, much of the recent buzz has centered on code completion and code creation. However, he sees generative AI as having an even greater potential impact at organizational handoff points.
Applying AI to areas like code review, test generation and planning could bring a big boost to operational efficiency, he said. Below, we’ll consider some of these areas and how they could benefit from AI. We’ll also weigh the advantages of off-the-shelf versus bespoke models and consider why some technical leaders remain reluctant to incorporate AI into cloud-native application development for security reasons.
Streamlining the Code Review Process
These days, most of a developer’s time isn’t even spent coding. In fact, 75% of developers’ time is spent on tasks other than writing code, a 2023 GitLab survey found. One way to apply AI in software development is within the code review process, said DeSanto, which could potentially reduce time spent on non-programming tasks.
One such task is manually tracking down reviewers for new code changes. Automatically suggesting teammates with the right domain knowledge to review a new merge request is one candidate: a machine learning model could analyze a project’s history of code changes and commits to identify likely reviewers.
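Before reaching for a full ML model, the core idea can be sketched with simple heuristics. The snippet below is a minimal illustration, not GitLab’s actual implementation: it scores hypothetical teammates by how often their past commits touched the files changed in a new merge request.

```python
from collections import Counter


def suggest_reviewers(commit_history, changed_files, top_n=2):
    """Rank teammates by how often their past commits touched
    the files changed in a new merge request."""
    scores = Counter()
    changed = set(changed_files)
    for author, files in commit_history:
        overlap = len(set(files) & changed)
        if overlap:
            scores[author] += overlap
    return [author for author, _ in scores.most_common(top_n)]


# Hypothetical project history: (author, files touched in one commit)
history = [
    ("alice", ["api/auth.py", "api/routes.py"]),
    ("bob",   ["frontend/app.js"]),
    ("alice", ["api/auth.py"]),
    ("carol", ["api/routes.py", "tests/test_routes.py"]),
]

print(suggest_reviewers(history, ["api/auth.py", "api/routes.py"]))
# → ['alice', 'carol']
```

A production system would weight recency, review load and org structure on top of raw file overlap, but the frequency signal alone already captures the “who knows this code” intuition.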
Another experimental idea is to leverage a large language model (LLM) to produce an AI-generated commit message or a natural language summary of a code review. This technology could streamline the handoff points between contributors on a Git-based project, for example. Such capabilities can unlock velocity and get you through the review process a lot more efficiently, said DeSanto.
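The LLM call itself is platform-specific, but the prompt-assembly step is generic. As a rough sketch (the function name and input shape are assumptions, not a real GitLab API), a summary prompt could be built from data any Git hosting API can provide, then handed to whatever model the platform offers:

```python
def build_summary_prompt(merge_request_title, diff_stats):
    """Assemble an LLM prompt asking for a natural language summary
    of a code change, using title and per-file diff statistics."""
    changes = "\n".join(
        f"- {path}: +{added}/-{removed} lines"
        for path, added, removed in diff_stats
    )
    return (
        "Summarize the following merge request for a reviewer in two "
        "sentences, focusing on intent rather than mechanics.\n\n"
        f"Title: {merge_request_title}\n"
        f"Changed files:\n{changes}\n"
    )


# Hypothetical merge request data
prompt = build_summary_prompt(
    "Add token refresh to auth flow",
    [("api/auth.py", 42, 7), ("tests/test_auth.py", 18, 0)],
)
print(prompt)
```

The returned string would be sent to an LLM endpoint; keeping prompt construction separate from the model call makes it easy to swap models as requirements change.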
Other Ways AI Can Guide Cloud-Native Development
According to the GitLab study, 90% of organizations either use AI in software development today or plan to. The technology is at the point where even non-coders can write a sentence or two describing what they need, and AI can generate a detailed reference application architecture, complete with suggested open source projects and the step-by-step tasks and commands required to bring it to fruition.
In addition to coding suggestions, there are many other areas in which AI can be used to aid understanding and streamline development processes. For example, GitLab is looking into experimental AI features that encompass the following areas:
- Creating natural language descriptions of complex, intricate code bases
- Summarizing discussions between team members to get newcomers up to speed
- Automating test generation to suggest appropriate test coverage for new merge requests
- Recalling and suggesting commonly used Git commands
- Applying root cause analysis for failures and describing issues
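Some of these ideas need an LLM; others start with plain frequency analysis. For instance, recalling commonly used Git commands could begin with a sketch like the following (a hypothetical illustration, not a GitLab feature), which mines a shell history for the subcommands a developer runs most often:

```python
from collections import Counter


def frequent_git_commands(history_lines, top_n=3):
    """Count git subcommands in a shell history and return the
    most frequently used ones, so an assistant can surface them."""
    counts = Counter()
    for line in history_lines:
        parts = line.split()
        if len(parts) >= 2 and parts[0] == "git":
            counts[parts[1]] += 1
    return [cmd for cmd, _ in counts.most_common(top_n)]


# Hypothetical shell history
history = [
    "git status", "git add .", "git commit -m 'fix'",
    "ls", "git status", "git push", "git status",
]
print(frequent_git_commands(history))
# → ['status', 'add', 'commit']
```

An AI assistant layered on top could then map a natural language request (“undo my last commit”) onto the commands the developer already uses.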
Being Smart When Using AI
These days, there are many types of machine learning models to choose from — we’re spoiled for choice. You could go with in-house models, open source foundation models or paid SaaS models. “It really depends on what you’re trying to accomplish and the project requirements,” said DeSanto. For instance, some models are small enough to run without extensive GPU resources, he said.
The end goal should be ensuring everyone receives an efficiency boost while right-sizing the scope, which could mean selecting lightweight models tailored to the use case at hand. To reduce unnecessary costs in this new AI-assisted landscape, a best practice will likely be avoiding excessive model creation and processing.
Of course, there is also a delicate balance to strike between using AI on the sidelines and giving it ultimate control. Therefore, we should look for ways to leverage AI to automate things while hedging against risk, said DeSanto. “Automate low-risk areas that you have a tolerance threshold for,” he said. For example, using AI to review code, find vulnerabilities and identify incidents in production is relatively low risk. Relying on AI to automatically modify a production website, however, could be high risk.
Interest in Generative AI Isn’t Going Anywhere
There are risks inherent in LLM technologies, such as hallucinations, privacy concerns and IP leakage. As such, many executives remain reluctant to adopt AI within their development processes because of these security issues. “There is a lot of hesitance to adopt AI, but there is also pressure in the market due to competitors doing it,” said DeSanto.
He recommended starting with simple, low-impact AI features, such as suggested reviewers, and focusing on scenarios where you know the model is trained only on your data. Then, over time, you may want to experiment with AI use cases that are more “scary,” he said.
Regardless of the trepidation, the AI train has left the station. We’re seeing more and more use cases and evidence of AI-assisted development in practice. Stack Overflow’s May 2023 Developer Survey, for instance, found that 70% of developers are already using AI or plan to do so this year. GitHub’s 11th annual Octoverse report found 65,000 public generative AI projects built on GitHub, representing a 248% increase from 2022. It also found that a third of open source projects have a maintainer using Copilot.
The interest in generative AI isn’t going anywhere. To get a leg up on the competition, engineers and DevOps teams should consider where AI could be applied within their own workflows.