New Storage Challenges Emerge as Container Adoption Increases

Containers bring unprecedented mobility, ease and efficiency for rapidly deploying and updating highly scalable and resilient applications. As a result, containers are now one of the leading technologies reshaping DevOps practices. Just as VMs transformed the world of underutilized servers and cloud computing continues to transform the virtual data center by solving the elasticity problem, containers are transforming application delivery by making continuous integration/continuous delivery (CI/CD) practical.

According to research from Gartner, more than 50 percent of companies will use container technology by the year 2020. Research conducted by DataCore Software revealed that 46 percent of respondents already had containers deployed in either production or development/testing. These trends point to rapid mainstream growth of containerized deployments, on an adoption curve far steeper than those of server virtualization or public clouds.

However, as container adoption grows, new challenges must be addressed before containers can be rolled out more broadly, especially for business-critical applications. The two main areas where container technology needs to mature are security and persistent storage. These challenges are evidenced in the recent “State of Software-Defined, Hyperconverged and Cloud Storage” report, which found that users encountered the following surprises and unforeseen consequences after implementing containers:

  • Lack of data management and storage tools.
  • Application performance slowdowns—especially for databases and other tier-1 applications.
  • Lack of ways to deal with applications such as databases that require persistent storage.

Container persistent storage for stateful applications has proven particularly difficult for the industry. Unlike monolithic applications, which reserve storage resources in perpetuity, containers and microservices bounce in and out of existence, migrating between machines at breakneck speed. Consequently, it becomes paramount to provision and release data volumes quickly, and to reattach them rapidly and securely from another server in a web-scale cluster.
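In Kubernetes terms, this provision-and-reattach cycle is expressed through PersistentVolumeClaims: a workload requests a volume declaratively, and when its pod is rescheduled to another node, the orchestrator detaches and reattaches the same volume. A minimal sketch, with the claim name and storage class invented purely for illustration:

```yaml
# Hypothetical claim for a database container; the names here are
# illustrative, not tied to any specific product. The storage back
# end provisions the volume on demand and releases it when the
# claim is deleted.
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: app-db-data
spec:
  accessModes:
    - ReadWriteOnce        # attached to one node at a time
  resources:
    requests:
      storage: 20Gi
  storageClassName: sds-tier1   # assumed SDS-backed class
```

A pod then mounts the claim by name; if the pod dies and is rescheduled elsewhere in the cluster, the same volume follows it to the new node.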

The repeating cycle of coding and testing, followed by blue/green production deployment, further exacerbates the data storage requirements for containerized applications. Data services fashioned for traditional application architectures must now serve a new, very transient consumer.

As a result, only those storage solutions capable of providing shared storage, with a consistent set of data services, to both existing application infrastructures (virtualized and bare metal) and container-native applications are likely to survive. In other words, a modern storage solution must provide DevOps teams with persistent, stateful application data; allow the consumption of storage on demand; and deliver the same level of availability and performance provided to traditional application infrastructures.

When evaluating alternatives, consider those software-defined storage (SDS) products that offer a comprehensive set of data services and shared data access across different types of applications, yet allow you to choose from a cross-section of competing storage systems. In particular, look for software stacks adhering to the Docker Volume plugin API and, more recently, the Kubernetes Container Storage Interface (CSI)—critical elements for compatibility, easy setup and freedom from vendor lock-in. The software can then provision virtual disks on the back end to give containerized applications and microservices access to the same advanced features that virtual machines and bare-metal servers enjoy.
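With a CSI-capable SDS stack, that back-end provisioning is typically wired up through a Kubernetes StorageClass that names the vendor's CSI driver. A rough sketch, assuming a hypothetical provisioner string and vendor-specific parameters:

```yaml
# Hypothetical StorageClass; the provisioner name and parameter
# keys vary by SDS vendor and are illustrative only.
apiVersion: storage.k8s.io/v1
kind: StorageClass
metadata:
  name: sds-tier1
provisioner: csi.example-sds.com     # vendor CSI driver (assumed)
parameters:
  tiering: auto                      # vendor-specific key (assumed)
  mirroring: synchronous             # vendor-specific key (assumed)
reclaimPolicy: Delete
volumeBindingMode: WaitForFirstConsumer
```

PersistentVolumeClaims that reference this class are then satisfied dynamically by the driver, with no manual volume carving on the array.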

For organizations that have already deployed a well-architected SDS technology to support tier-one, mission-critical applications, DevOps teams can tap into the same shared storage pools as they begin to explore containers, without having to stand up separate persistent storage silos.

The presentation of persistent storage takes place through familiar container orchestration platform (COP) commands, such as those found in Kubernetes, transparently leveraging advanced storage capabilities such as auto-tiering as well as data protection and data recovery features such as synchronous mirroring and continuous data protection (CDP). CDP is especially valuable in mitigating the damage from ransomware or other data corruption issues, as it enables data images to be rewound to a point in time before the attack occurred. Together, these SDS functions ensure dependable, predictable behavior at scale, accelerating the pace of container adoption for a broad spectrum of applications.
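CDP itself is delivered by the SDS layer rather than by Kubernetes, but the nearest COP-native analogue is the CSI volume snapshot API, which captures discrete point-in-time images that can serve as restore sources after corruption. A sketch, again with purely illustrative names:

```yaml
# Hypothetical VolumeSnapshot: a point-in-time image of a claim,
# usable as a restore source if the live data is corrupted.
# (CDP in the SDS layer offers finer-grained rewind points than
# discrete snapshots like this one.)
apiVersion: snapshot.storage.k8s.io/v1
kind: VolumeSnapshot
metadata:
  name: app-db-pre-incident
spec:
  volumeSnapshotClassName: sds-snapshots   # assumed snapshot class
  source:
    persistentVolumeClaimName: app-db-data # assumed claim name
```

Restoring creates a new PersistentVolumeClaim with the snapshot as its dataSource, giving teams a practical rewind path from within the orchestrator.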

Augie Gonzalez

Augie Gonzalez brings 30+ years of diverse experience architecting, marketing and managing advanced IT products and services, most recently with software-defined storage solutions for on-premises and hybrid clouds. Before joining DataCore, Gonzalez led the Citrix team that introduced simple, secure, remote access solutions for small to midsize organizations. Earlier, Gonzalez headed Sun Microsystems Storage Division’s Disaster Recovery Group. Additionally, he’s held marketing / product planning and software development roles at Encore Computers and Gould Computer Systems specializing in high-end platforms for vehicle simulation and data acquisition.
