Containers may be infiltrating the enterprise data environment at a rapid clip, but few organizations have accumulated enough practical experience with the technology to determine how containers can best be deployed and used.
Indeed, since many leading IT platforms now ship with native container support, organizations are likely to find themselves outfitted with a fully containerized data ecosystem by the time they complete their next refresh cycle, while still having only a rudimentary idea of what to do with it.
And to make matters worse, this is coming at a time of great upheaval up and down the entire data stack as digital transformation ushers in new forms of infrastructure, architecture, applications, processes and perhaps entirely new business models.
Containers in the Budget
According to container data services company Portworx, a majority of businesses have begun budgeting for containerized applications and infrastructure this year, with nearly a third earmarking $500,000 or more. Docker is the most popular container platform by far, with nearly an 80 percent share of the market, and that number is expected to grow now that Microsoft has added native Docker support to its Windows Server platform. Kubernetes is the leading container management system, running in 43 percent of enterprise environments, with nearly a third of respondents reporting that it is their primary solution. (To learn more about Docker, check out Docker – How Containers Can Simplify Your Linux Development.)
As for how containers are being used, virtually any application running on a virtual machine can be containerized, although it will likely have to be recoded to accommodate the more compartmentalized operational environment. In most cases, however, this will result in lighter-weight applications because functions like managing resource dependencies and maintaining data communication are now handled by the container. This is one of the reasons many enterprises are looking at containers to support hybrid cloud models, since the technology allows applications to easily traverse private and public resources in highly dynamic settings.
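To make that concrete, the sketch below shows roughly what packaging an existing service for Docker can look like. It is a minimal, hypothetical example: the Python base image and the app.py and requirements.txt file names are assumptions for illustration, not a prescribed migration recipe.

```dockerfile
# Minimal, hypothetical Dockerfile for containerizing a small Python service.
# File names (app.py, requirements.txt) are placeholders for illustration.
FROM python:3.11-slim

WORKDIR /app

# Install only the dependencies the service itself declares
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code and define how the container starts it
COPY app.py .
CMD ["python", "app.py"]
```

Because the resulting image bundles only the runtime and the declared dependencies, while the host kernel is shared, the packaged service stays far lighter than a full virtual machine and can move between private and public infrastructure largely unchanged.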
In all likelihood, containerized environments will give rise to native applications that leverage the technology in ways that traditional applications cannot. But before enterprises can push into the more advanced use cases for containers, it’s probably best that they get their feet wet on a more familiar level. Docker recently launched a new program aimed at migrating legacy applications to its container ecosystem where, it is hoped, they will prove to be more flexible and productive. The Modernize Traditional Applications Program works with Cisco, Microsoft, HPE and other vendor solutions to provide both the migration path and the employee training needed to support a working container environment, all without requiring the enterprise to transform long-standing applications that users have come to trust.
One of the key questions most enterprises face when examining container functionality is how containers differ from standard virtual machines. The chief difference is that a container encapsulates only the runtime environment needed to support a given application, which is the primary reason containers provide such excellent portability. A VM, by contrast, carries an entire operating system and virtual copies of hardware resources, so it requires far more CPU and RAM. It’s important to note, however, that while containers can run on bare-metal resources, they can also be deployed on virtualized infrastructure – either alongside or within virtual machines – to provide better security and even higher resource utilization.
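Assuming Docker is already installed, the quickest way to feel that difference is to start a throwaway container and notice that nothing resembling a guest operating system has to boot first:

```shell
# Run a one-off command in a minimal Alpine Linux container.
# --rm removes the container when the command exits; the image tag is just an example.
docker run --rm alpine:3.19 echo "hello from a container"
```

The command typically returns in about a second, because only the process and its runtime dependencies are launched on top of the host’s kernel, whereas a virtual machine would have to bring up an entire operating system before doing the same work.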
The Rise of Microservices
Containers also make it possible to host individual application components known as microservices. This opens up an entirely new layer of data functionality, in that each microservice can be paired with other microservices (depending on its policy restrictions) to create new services and applications on the fly. It’s for this reason that containers are expected to be a major boon to the internet of things (IoT) and the big data analytics that support it. By pushing containers and analytics to the IoT edge, designers hope to enable new classes of intelligent, automated platforms that can assemble the tools needed to fulfill all manner of user requests without the latency-inducing round trip of pulling data into centralized facilities and pushing the results back out again.
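As a rough illustration of that pairing, the hypothetical Docker Compose sketch below wires two microservices together; the service names, image names and the SENSOR_API_URL variable are placeholders, not a reference IoT design.

```yaml
# Hypothetical docker-compose.yml composing two microservices into one application.
# Image names and environment variables are illustrative placeholders.
version: "3.8"
services:
  sensor-api:
    image: example/sensor-api:latest   # ingests readings from edge devices
    ports:
      - "8080:8080"
  analytics:
    image: example/analytics:latest    # consumes sensor-api over the internal network
    environment:
      - SENSOR_API_URL=http://sensor-api:8080
    depends_on:
      - sensor-api
```

Swapping either service for another container, or adding a third, changes only this composition file rather than the services themselves, which is what makes on-the-fly recombination plausible.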
At the same time, companies like Oracle are looking to leverage containers for cloud-native database applications that provide both the scale and the flexibility to meet burgeoning IT workloads. The company recently retooled its MySQL database and WebLogic server for Docker in a bid to speed deployment of secure applications and microservices to either public or on-premises cloud infrastructure. Ultimately, says Oracle VP of software development Mark Cavage, the goal is to bring even mission-critical applications into the container fold, where they can take advantage of real-time analytics and other forward-leaning services. (For more about on-premises cloud, see Public Cloud vs. Private On-Premise Cloud.)
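For example, the official MySQL image can be pulled and started with a single command; the container name, password and version tag below are placeholders used purely for illustration.

```shell
# Launch MySQL in a container in the background (-d), exposing the standard port.
# MYSQL_ROOT_PASSWORD is required by the official image; the value here is a placeholder.
docker run -d --name mysql-demo \
  -e MYSQL_ROOT_PASSWORD=changeme \
  -p 3306:3306 \
  mysql:8.0
```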
As with any new technology, the true value of containers will lie not in the technology itself, but in the functions it performs. At the moment, containers are in the early deployment phase, where users are looking for ways to perform today’s tasks more efficiently and at lower cost. But it shouldn’t be long before containerized environments start producing new applications and services that cannot exist anywhere else on the data stack.
And that is the point at which containers will evolve from a technology that is nice to have to one the enterprise cannot live without.