The release management cycle of enterprise applications is time consuming, involving several teams and midnight testing cycles in production. Patch management, software version upgrades, and volume provisioning are other daunting tasks for the IT Operations manager.
I implemented a PIA (PeopleSoft Internet Architecture) for a telecom customer a few years back. The PIA ran on WebLogic and Tuxedo, which connected to other middleware servers, payment gateways, prepaid platforms, value-added service providers, IVR, vendor databases, and utility and batch servers. Releasing one version of the business process involved more than 20 different deployment steps, and most of them were manual. We did well with a controlled change management process and dedicated team members who worked round the clock. Later the systems became unmanageable, the build and run books became outdated, and it was chaos.
These kinds of complex, multi-level deployments gave rise to automation tools such as IBM UrbanCode and CA Release Automation. These tools have a large footprint to manage and failed to deliver 100% automation.
Therefore, customers started looking for packaged, automated deployment solutions.
The term CI/CD (Continuous Integration and Continuous Delivery) is the norm when it comes to tools such as Jenkins, CircleCI and Bamboo. These tools are good at integrating various build tools and scheduling jobs, and the open-source community has developed several useful plugins to integrate them into the ecosystem.
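To make this concrete, a minimal declarative Jenkinsfile that chains a build step to a Docker image build and push might look like the sketch below. The Maven build command, image name, and registry URL are illustrative assumptions, not any specific product's configuration:

```groovy
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                // compile and package the application (assumes a Maven project)
                sh 'mvn -B package'
            }
        }
        stage('Docker Image') {
            steps {
                // bake the build output into an image, tagged with the Jenkins build number
                sh 'docker build -t myregistry.example.com/myapp:${BUILD_NUMBER} .'
            }
        }
        stage('Push') {
            steps {
                // publish the image so downstream environments can pull it
                sh 'docker push myregistry.example.com/myapp:${BUILD_NUMBER}'
            }
        }
    }
}
```

Each stage is just a shell step, which is why these tools integrate so easily with whatever build tooling a team already uses.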
Without Docker, these tools still follow traditional, OS-coupled application deployments, which lead to active/passive deployment even in the cloud. For example, an application with a singleton class can’t be replicated, so the associated application server or VM must be recycled in order to pick up the new version. The outage and subsequent testing during the deployment prevent the organization from achieving the 99.9% uptime SLA it has committed to its customers.
Docker with Kubernetes is the solution for that 99.9% uptime SLA commitment, and for continuous deployment as well.
The new version of the application and its dependent libraries can be packaged into a new container, orchestrated into production, tested, and wired into the other production entities (such as load balancers and proxies); then the previous version’s Docker container is destroyed. It’s a zero-downtime hot deployment method. This continuous deployment method relieves IT Operations from flipping services, changing load balancer rules, and making other database pointer changes.
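In Kubernetes terms, this zero-downtime pattern maps to a rolling update on a Deployment. A minimal sketch follows; the application name, image, port, and health endpoint are illustrative assumptions:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: myapp
spec:
  replicas: 3
  selector:
    matchLabels:
      app: myapp
  strategy:
    type: RollingUpdate
    rollingUpdate:
      maxUnavailable: 0   # never drop below full capacity during the rollout
      maxSurge: 1         # bring up one new-version pod at a time
  template:
    metadata:
      labels:
        app: myapp
    spec:
      containers:
      - name: myapp
        image: myregistry.example.com/myapp:2.0   # the new version
        ports:
        - containerPort: 8080
        readinessProbe:            # gate traffic until the new pod is healthy
          httpGet:
            path: /health
            port: 8080
```

Changing the image (for example with `kubectl set image deployment/myapp myapp=myregistry.example.com/myapp:2.0`) triggers the rollout, and `kubectl rollout undo deployment/myapp` reverts to the previous version if the new pods fail their readiness checks.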
Traditionally, patch and configuration changes in an application server environment are applied by individual teams upon receiving a Change Request (CR) ticket and through stringent CCB/PMO procedures. The process is manual, and staff often forget to apply a few patches or changes.
Docker eliminates this manual, error-prone process. A configuration change and/or patch update to a containerized application server can be captured as a golden image, and that image can then be pushed to the other virtual machines in the release management (DevOps) pipeline.
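A golden image of this kind can be expressed as a short Dockerfile layered on the existing application server image. The base image name, file paths, and patch artifacts below are hypothetical placeholders:

```dockerfile
# Start from the current application-server base image (name is illustrative)
FROM mycompany/appserver:1.4

# Apply the patch and the configuration change once, inside the image
COPY patches/security-patch.jar /opt/appserver/lib/
COPY config/server.properties /opt/appserver/conf/

# Build and publish the result as the new golden image, e.g.:
#   docker build -t myregistry.example.com/appserver:1.5 .
#   docker push myregistry.example.com/appserver:1.5
```

Because the patch is baked into the image rather than applied by hand per server, every environment in the pipeline pulls an identical, already-patched artifact.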
DevOps is a culture, and software tools are available to practice that culture.
Gene Kim, a notable practitioner of DevOps, defines DevOps with the following four principles:
- Understand the flow of work.
- Always seek to increase flow.
- Don’t pass defects downstream.
- Achieve a profound understanding of the system.
The key point is ‘Don’t pass defects downstream’.
I was in IT Operations for more than 10 years and deployed approximately 700 Java EE applications for customers during that time. That middleware exposure led me to work with front-end technologies (load balancers, firewalls, networking), back-end technologies (hardware, databases, storage), and other integrated systems such as mainframe, ERP, and CRM, alongside several cross-functional teams.
I heard the following often:
- It works on my machine.
- The class-path at QA server is wrong so the deployment failed.
- The server configuration is correct. Whenever this particular application comes into production, it breaks everything.
- He worked on an unapproved change ticket.
The defects were passed downstream every now and then.
Docker resolves some of these issues by fueling the DevOps culture. The developer can package all software dependencies into one container and push it through the DevOps pipeline without notable modifications. Kubernetes handles the container orchestration work to manage the non-functional concerns.
Cloud computing has automated the infrastructure to some extent, and it keeps growing with phenomenal innovations every day.
It’s time to automate DevOps deployments across multiple clouds. Docker and Kubernetes play a big role here.
Kuberiter is developing a CI/CD/CD SaaS-based DevOps pipeline product with Jenkins, Docker and Kubernetes. Please contact me for a free DevOps assessment at your organization.
#Docker #Kubernetes #Jenkins #DevOps #CloudComputing #HybridCloud #MultiCloud #BlueGreen #BlueGreenDeployment