OCI Standardizes Container Registry Protocol

The Open Container Initiative (OCI) announced today that the way container images are distributed is about to be standardized, using the Docker Registry v2 protocol as the basis.

Chris Aniszczyk, executive director of the OCI, says the protocol will now serve as the specification for the new distribution-spec OCI project that will foster interoperability across different container registries. Docker Inc. developed the Docker Registry v2 protocol, and many other registry providers created their own equivalents. Not surprisingly, that has led to some interoperability issues that make it more challenging than necessary to construct container pipelines and supply chains at scale, says Aniszczyk.

The Docker Registry v2 protocol makes the most sense as that standard because it is already the most widely employed, he says. According to Docker Inc., the protocol has been used to pull more than 40 billion container images. Most of the protocols used to access registries today are already based on the Docker Registry v2 protocol, with some subtle differences. By contributing the Docker Registry v2 protocol to the OCI, the need for those differences goes away, and vendors that support OCI are reaffirming their commitment to reserve competition for much higher up the platform stack, says Aniszczyk.
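To illustrate what is being standardized, the sketch below builds the two HTTP requests at the heart of an image pull as defined by the Docker Registry v2 API: the version check and the manifest fetch. The endpoint paths and media types come from the published specification; the registry hostname and image name used in the usage note are only illustrative.

```python
# Minimal sketch of the request URLs a Docker Registry v2 pull involves.
# Paths follow the Registry HTTP API v2 specification; hosts are illustrative.

def api_version_url(registry: str) -> str:
    # GET /v2/ is the API version check; a conforming registry
    # answers 200 OK (or 401 if authentication is required).
    return f"https://{registry}/v2/"

def manifest_url(registry: str, name: str, reference: str) -> str:
    # GET /v2/<name>/manifests/<reference> fetches an image manifest,
    # where <reference> is a tag (e.g. "latest") or a content digest.
    return f"https://{registry}/v2/{name}/manifests/{reference}"

# Clients advertise which manifest formats they understand via the
# Accept header; these media types cover Docker v2 and OCI manifests.
MANIFEST_ACCEPT = ", ".join([
    "application/vnd.docker.distribution.manifest.v2+json",
    "application/vnd.oci.image.manifest.v1+json",
])
```

For example, `manifest_url("registry-1.docker.io", "library/alpine", "latest")` yields `https://registry-1.docker.io/v2/library/alpine/manifests/latest`. Because every conforming registry exposes the same paths, the same client code works against any of them, which is exactly the interoperability the distribution-spec project aims to guarantee.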

The distribution-spec is the latest in a series of OCI initiatives highlighting the rapid rate at which container technologies are maturing, Aniszczyk notes. In fact, rather than focusing mainly on building pipelines to construct single applications, more attention is being focused on how to manage supply chains that ultimately will consist of multiple registries and pipelines. Constructing supply chains at scale assumes a base level of interoperability between registries, enabled by the Docker Registry v2 protocol, which now can be deployed anywhere to access images in public or private clouds.

As reliance on containers continues to accelerate, there's no doubt that DevOps processes will need to evolve. DevOps processes are designed to accelerate the deployment of applications. But containers are now making it easier to build and update applications at unprecedented rates, which in turn is requiring organizations to both expand and enhance their DevOps processes. Most of those processes started with small teams that many IT organizations are now extending across the rest of the organization. But most of them assumed a rate of application updates that, in many cases, soon will be exceeded by developers who can update containerized applications almost as fast as they can write and validate code. Because of that, many organizations are simply handing more control over operations directly to developers themselves, which has significant implications for how IT operations staff are allocated and reallocated.

There is no one-size-fits-all approach when it comes to DevOps. Each organization needs to assess its own capabilities before trying to impose DevOps processes from the top down. But the rise of containers does mean that, out of necessity, the rate at which organizations find themselves moving up the DevOps maturity curve may be a lot faster than anticipated, regardless of whether they are ready.

Mike Vizard

Mike Vizard is a seasoned IT journalist with over 25 years of experience. He has also contributed to IT Business Edge, Channel Insider, Baseline and a variety of other IT titles. Previously, Vizard was the editorial director for Ziff-Davis Enterprise as well as Editor-in-Chief for CRN and InfoWorld.