In case you are new to the containers game, let's get you caught up.

For many years you installed an operating system on a server, then installed a bunch of applications, database servers, etc. on it. No doubt you ran into conflicts with versions of libraries: application A wants version 10 and application B wants version 11. So, most likely, you moved the applications to different servers to avoid the conflict.

What if you could partition your applications into their own little worlds but let them run on the same server? You can! A container isolates your application from other applications while allowing them to share the kernel and other resources of the operating system on the same server. A container's definition is packaged into a container image, which contains a base operating system layer, your application, and all of its dependencies and configuration.
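As a sketch of what that packaging looks like, here is a minimal Containerfile (also known as a Dockerfile) for a hypothetical Python web application. The base image, file names, and port are illustrative assumptions, not from this article:

```dockerfile
# Start from a base image that provides the OS userland and a Python runtime
FROM registry.access.redhat.com/ubi9/python-311

# Copy the application source and its dependency list into the image
COPY requirements.txt app.py ./

# Install the application's dependencies inside the image
RUN pip install -r requirements.txt

# Document the port the application listens on
EXPOSE 8080

# The command to run when a container is started from this image
CMD ["python", "app.py"]
```

Building this with a tool such as `podman build` or `docker build` produces an image that bundles everything the application needs to run.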

As a bonus, these container images are portable and can be moved between servers. No more figuring out how to reproduce your build in multiple server environments. Now you just push a copy of the image to the next server environment and you are guaranteed to be running the same code!

So, you're tempted to put everything you need in one container. Hold on there, partner! Best practice recommends you break your application into distinct containers based on service type: for example, one container for the web server, another for the API server, and a third for the database. Congratulations, you've just containerized your application.

Now how does the container running the API server know how to talk to the database container? I'm glad you asked. This is where container orchestration comes into play. Container orchestration solutions automate the deployment of your containers, scale them, manage the networking between them, and much more.
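To make that networking idea concrete, here is a sketch of how Kubernetes (introduced below) might wire the two together: a Service gives the database containers a stable in-cluster DNS name that the API server can use. The names and port here are hypothetical:

```yaml
# A Service that fronts the database containers; the API server can
# now reach the database at the hostname "database" inside the cluster.
apiVersion: v1
kind: Service
metadata:
  name: database
spec:
  selector:
    app: database        # route traffic to pods carrying this label
  ports:
    - port: 5432         # port the Service exposes
      targetPort: 5432   # port the database container listens on
```

The API container simply connects to `database:5432`; the orchestrator's internal DNS and proxying handle the rest.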

Kubernetes is an open-source container orchestration platform that can manage a cluster of servers to deploy and scale your application by running multiple instances of a container on multiple nodes in the cluster. Kubernetes can detect when an application instance is unhealthy, kill it, and spin up a new copy. When you need to introduce a new version of your application, Kubernetes can perform a rolling update, keeping your application accessible throughout the upgrade.
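A minimal sketch of a Kubernetes Deployment illustrates the scaling, self-healing, and rolling-update behavior just described. The names, image, and health-check path are hypothetical assumptions for this example:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: api-server
spec:
  replicas: 3                      # run three instances across the cluster
  selector:
    matchLabels:
      app: api-server
  strategy:
    type: RollingUpdate            # replace instances gradually on upgrade
    rollingUpdate:
      maxUnavailable: 1            # keep the app reachable during the rollout
  template:
    metadata:
      labels:
        app: api-server
    spec:
      containers:
        - name: api-server
          image: registry.example.com/api-server:v2  # new version to roll out
          livenessProbe:           # lets Kubernetes detect an unhealthy
            httpGet:               # instance, kill it, and start a new copy
              path: /healthz
              port: 8080
```

Changing the `image` tag and reapplying this manifest is all it takes to trigger the rolling update.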

Why should you care about Kubernetes? You care because it helps you operate at scale with maximum uptime for all of your containerized applications in the enterprise. But there is more to the story. You're going to want help load balancing your application behind a router, auto-scaling it based on traffic metrics, building your container images in a safe and secure manner, and much more. That's where Red Hat OpenShift Container Platform comes to the rescue. Red Hat OpenShift is built on Kubernetes and addresses these additional concerns above and beyond container orchestration. With Red Hat OpenShift you can easily implement DevOps practices such as continuous integration and continuous deployment on the Kubernetes cluster.


Want to learn how to do all of this?

Get started with a course from Red Hat Training, Introduction to Containers, Kubernetes, and Red Hat OpenShift (DO180), one of a series of courses that gives you hands-on practical experience with Kubernetes.

Other Resources:

Deploying Containerized Applications Technical Overview

Red Hat OpenShift Container Platform

Kubernetes Concepts



Connect with Red Hat Services

Learn more about Red Hat Consulting
Learn more about Red Hat Training
Learn more about Red Hat Certification
Subscribe to the Training Newsletter
Follow Red Hat Services on Twitter
Follow Red Hat Open Innovation Labs on Twitter
Like Red Hat Services on Facebook
Watch Red Hat Training videos on YouTube
Follow Red Hat Certified Professionals on LinkedIn