As digital transformation continues to reshape the competitive environment, organizations must meet new technology demands from their stakeholders, stay ahead of competitors and capture more revenue from growing opportunities in digital business. For many, this will require new approaches to developing apps, modernized cloud-based infrastructure and greater agility in both IT and business processes. In recent years, open-source containerization solutions (like Docker) and container management platforms (like Kubernetes) have emerged as a natural fit for bringing application development, management and security into the future.
Container technology is reshaping how apps are built, deployed and managed across the enterprise. While the impact is widespread, containers and their platforms have certain unique qualities that are particularly beneficial to enterprise tech operators and security teams.
Container management for tech ops
On the application front, container management platforms are helping companies move from monolithic applications to microservices that are rapidly and independently deployable. Compared to traditional virtual machines, containers help package and deploy microservices efficiently, and their platforms provide integrated service discovery, orchestration and deployment management at scale.
In the area of infrastructure, the standard, portable packaging format of containers is helping organizations scale out cloud infrastructure to run applications reliably and stably wherever they want to deploy them, whether on physical servers, virtual machines, private clouds or public clouds.
When it comes to process improvements, container management platforms are helping tech ops teams create tighter connections with developers and be more agile with CI/CD and DevOps. As immutable app packages, containers can be promoted from dev to production without rebuilding. Tech operators can manage apps more consistently and collaborate with developers on a shared platform that addresses their core requirements.
Container management for security teams
When it comes to detecting and addressing application vulnerabilities, container technology is helping security teams do more. With a container approach, an application runs in its own isolated user space on a shared operating system. Containers are immutable by default because the filesystem of a container image consists of read-only layers stacked on top of each other. Nothing gets written to the filesystem of the original image; instead, writes go to a temporary layer identified by a hash, a unique identifier assigned to that new layer. When the container is terminated, the new layer is discarded unless it's committed, in which case a new container image is created that includes the new layer. This makes tracking changes to a container image easy and allows security teams to identify containers with vulnerabilities by filesystem hash or container image. For example, when images are checked into a registry, their layer hashes can be mapped to the packages installed in each layer. If a vulnerability is announced via Common Vulnerabilities and Exposures (CVE), a security team can quickly determine which layers contain the affected packages, and which deployed containers include those layers, so those specific containers can be removed quickly.
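As a concrete, purely hypothetical illustration, consider the Dockerfile sketch below: each filesystem-changing instruction produces one read-only layer with its own content-addressable hash, which is what lets a security team map a package named in a CVE back to a specific layer, and from there to every image and running container that includes it. The image, package and path names here are illustrative only.

```dockerfile
# Hypothetical build file; each filesystem-changing instruction
# below produces one read-only, hash-addressed layer.

# Base image layers, resolvable in a registry to a fixed digest
FROM ubuntu:22.04

# New layer: installed OS packages. If a CVE is announced for curl,
# this layer's hash identifies every image built on top of it.
RUN apt-get update && \
    apt-get install -y --no-install-recommends curl && \
    rm -rf /var/lib/apt/lists/*

# New layer: application files copied into the image
COPY app/ /opt/app/

# Image metadata only; adds no filesystem layer
CMD ["/opt/app/run.sh"]
```

Running `docker history` on the built image lists each layer alongside the instruction that created it, which is the mechanism behind tracing a vulnerable package to the specific layers and images that contain it.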
Containers are also declaratively defined. Most teams manage containers with an infrastructure-as-code pattern, so security teams can define them (in a Dockerfile or a Kubernetes manifest, for example) and check that definition into source control. Any change to that file must then follow the organization's software lifecycle policy before it is re-applied, which should include peer review as well as an integration process that validates the file against the organization's standards, rather than someone remoting onto a virtual machine and installing whatever comes up.
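As a sketch of that pattern, a Kubernetes Deployment manifest like the hypothetical one below can live in source control next to the application code; every change to it then passes through the same peer review and continuous-integration checks as any other code change before it is applied to a cluster. All names, the registry URL and the image tag are illustrative assumptions, not a real deployment.

```yaml
# Hypothetical manifest, kept in source control. Changes are
# peer-reviewed and validated in CI before being applied.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: example-app            # illustrative name
spec:
  replicas: 3
  selector:
    matchLabels:
      app: example-app
  template:
    metadata:
      labels:
        app: example-app
    spec:
      containers:
        - name: example-app
          # Pinned, immutable tag: the same image is promoted
          # from dev to production without rebuilding.
          image: registry.example.com/example-app:1.4.2
          ports:
            - containerPort: 8080
```

Because the manifest pins an exact image tag, reviewers can see precisely which immutable artifact will run, and an audit trail of every change lives in version control.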
Leaders in the container game: Docker and Kubernetes
More and more, it’s becoming clear that container-based application management will play an important role in the modern enterprise. While there are many technology providers in the marketplace, two have emerged as front runners. Docker and its container technology provide virtualization at the operating-system level, packaging software efficiently so that it can be isolated on a shared operating system. It offers a lightweight, standard and secure alternative to virtual machines, requiring only the application's libraries and settings to run, and running the same wherever it’s deployed.
Coupling a container technology, like Docker, with Kubernetes’ open-source container platform is enabling organizations to seamlessly develop, deploy, manage and scale containers at the enterprise level, in a far more automated fashion. However, while Kubernetes does have a strong and growing open-source community to help organizations manage issues, deploy tools and deliver updates, the reality for most enterprises is that a basic installation of Kubernetes is only a starting point for running a containerized application environment at the enterprise level.
It is very likely that the business would need to hire a team of Kubernetes experts to develop tools, provide support and maintain the deployment. The question organizations are left asking themselves is whether they are in the business of writing Kubernetes modules and features, or in the business of delivering apps that make their organizations better. What’s more, companies need to consider whether they are willing to risk running their infrastructure on community-supported software versus something that is hardened, secured and QA’d at the enterprise level.
The best of both worlds: Red Hat OpenShift built on open-source Kubernetes
For those running enterprise Linux environments, there is already a mature solution on the market that has solved these challenges: the Red Hat OpenShift Container Platform, which brings Docker and Kubernetes to the enterprise. OpenShift is built on top of open-source Kubernetes and adds a host of out-of-the-box modules and features, giving enterprises everything they need to develop, deploy and manage both existing and container-based applications across physical, virtual and public cloud infrastructures. Companies save the time and costs they would otherwise have spent solving problems that OpenShift has already solved.
OpenShift also provides developers with an optimal platform for provisioning, building and deploying applications and their components in a self-service fashion, and it integrates with continuous integration and delivery tools. The platform gives IT operations a secure, enterprise-grade environment that provides policy-based control and automation for container-based applications in production. It is the only container platform that combines these feature-rich add-ons with the power of running on Red Hat Enterprise Linux, and its build and deployment automation capabilities help organizations accelerate their IT groups.
A deeper look into the container opportunity
For years, Lighthouse has been helping clients move their enterprises into the future with container-based approaches. We are a close partner of Red Hat, and we are experts in deploying its OpenShift technology. Whether you are interested in learning more about container-based technology or are moving forward with OpenShift, we can help you make the right choice for your business.
We offer OpenShift workshops on securing infrastructure, deploying in a hybrid cloud scenario and modernizing and configuring apps using service-oriented architecture. We can also fully install, deploy and configure OpenShift and teach you how to maintain it. A great way to get started is to try a demo of the OpenShift platform. There are many ways we can help you move forward. We’re here to help you succeed.