Rajesh Dangi, June 2021
Most of the emerging technologies and SaaS-based solutions today practice rapid application development, enabling use cases via agile frameworks fueled by DevOps, a proven methodology for the system development life cycle. DevOps combines development, quality assurance, and IT operations into a single seamless function, managed through specialized automation and self-service and mostly leveraging varied open-source tools, often called toolchains. The movement took shape in the late 2000s, inheriting best practices from ITIL, PDCA, Lean, and Agile approaches, and holds continuous development and integration as core principles.
The key purpose of DevOps adoption is fostering tight coupling between IT operations, software development, and quality assurance, enabling better communication within teams and leveraging cross-functional expertise to improve the design, development, testing, and deployment of software through the following key tenets:
- Stable and on-demand operating environments
- Fast delivery with reusable configurations and code
- Pre-integrated collaboration of workflows
- Time optimization, particularly in fix/maintain phases embracing failures as learnings
- Ongoing innovation and automation leveraging cross-functional expertise
An organization embracing DevOps must begin by developing and fostering a DevOps culture: a way of working, built around new tools and practices, that allows teams to work more closely together, bringing greater agility to the business and notable increases in productivity.
The DevOps model allows developers and operations engineers to simplify processes and be more productive: automation reduces the number of manual actions, and iteration accelerates development thanks to faster end-user feedback and quick fixes. The most important aspect is self-service, which accelerates releases by enabling developers to deploy applications on demand by themselves, while testers run tests in tandem as code becomes ready. Broadly, DevOps accelerates time to market and fosters collaboration within teams, sharing expertise to implement Continuous Integration and Continuous Delivery. The key reasons for its widespread adoption are:
- Predictability & Maintainability - DevOps offers a significantly lower failure rate for new releases and makes failovers and fallbacks almost instantaneous, improving predictability and lowering disruptions. An effortless recovery process when a new release crashes or disables the current system guarantees maintainability, and because everything is versioned, earlier versions can be restored at any time. DevOps also enhances collaboration between development, operations, and quality assurance teams to facilitate continuous integration, testing, and delivery of software, increasing efficiency for smoother functioning and quicker issue resolution and bug fixes in the product deliverable.
- Better Time to Market, Greater Quality and Scale - DevOps can reduce time to market by up to 50% through a streamlined software delivery pipeline, particularly for digital and mobile applications, and helps teams improve the quality of application development by incorporating infrastructure issue resolution by design and allowing the full spectrum of testing with automation. DevOps tools enable enterprises to automate the software development and testing lifecycle by standardizing and automating the movement and deployment of code across different environments, with scale on demand.
- Reduced Risk & Resiliency – DevOps incorporates security into the software delivery lifecycle, reducing risk across the entire application development and deployment lifecycle and keeping the application in a continuously operational state through stable, secure, and auditable changes.
- Cost Efficiency & Agility – Adopting DevOps brings cost efficiency to the application development process, always an aspiration of organizations, and supports agile programming with a component-based approach: larger codebases are broken into smaller, manageable chunks that can be designed, developed, and integrated independently, with structured codelines and merging strategies.
DevOps engineers must be able to understand and use a wide variety of open-source tools and technologies, as well as low-code or no-code metaphors. Today's stacks comprise dozens of different software packages, source control tools, databases, and automation tools. The technical domains that converge in a successful DevOps engineer include:
- Being an excellent Sysadmin – Breathing open source and Linux/Unix – Know API ecosystems, integration, and configuration management.
- Deploying Virtualization and Containers - Hands-on expertise in networking and storage
- Coding, Testing, and Test Automation – Scripting – Full stack developers are definite winners
- Understanding Security aspects of the environments and databases and protection thereof.
The ability to make them all work together is crucial, and the ability to code and script is just as important: the role blends development and programming frameworks, IT infrastructure, automation tools, scripting, and, not least, database skills. Apart from the skills necessary for the job, it is as much about mindset, which should include:
- Ability to collaborate, communicate, and reach across different departments.
- Comfort in a fast-paced, agile, and ever-changing environment.
- Years of experience with systems, IT operations, and production engineering
- Strong emphasis on business outcomes; business acumen a plus
Since development, quality assurance, and delivery processes are tightly integrated to deliver much more than the traditional software development lifecycle (SDLC), each stage of DevOps needs awareness and the active involvement of cross-functional skills, processes, and tools, since each impacts the end objective: delivery and continuous improvement as an ongoing endeavor, without compromising the security of digital assets such as code repositories and datasets.
- Create / Code – the stage where planning, design, code development and review, source code management tools, code merging, etc. are involved.
- The code can be written in any language, but it is maintained using version control tools; maintaining the code this way is referred to as Source Code Management.
- Popular tools here include Git, SVN, Mercurial, and CVS for version control, and JIRA for issue tracking. Build tools like Ant, Maven, and Gradle may also be used in this phase to build and package the code into executable files for deployment to testing environments.
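Stripped of any particular build tool, the build-and-package step amounts to producing a versioned artifact plus a checksum that later pipeline stages can verify. A minimal, illustrative sketch (the function and artifact names are hypothetical, not from any real tool):

```python
import hashlib
import zipfile
from pathlib import Path

def package_build(source_dir: Path, out_dir: Path, version: str) -> str:
    """Zip a source tree into a versioned artifact and return its SHA-256.

    The checksum lets later stages (deploy, audit) verify they are handling
    exactly the bytes that were built. Names here are purely illustrative.
    """
    artifact = out_dir / f"app-{version}.zip"
    with zipfile.ZipFile(artifact, "w") as zf:
        # sort for a stable archive layout regardless of filesystem order
        for f in sorted(source_dir.rglob("*")):
            if f.is_file():
                zf.write(f, f.relative_to(source_dir).as_posix())
    return hashlib.sha256(artifact.read_bytes()).hexdigest()
```

A real build tool adds dependency resolution, compilation, and caching around this core, but the artifact-plus-checksum contract is the same.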
- Testing – the stage where continuous testing tools provide quick and timely feedback on business risks and the developed software is continuously tested for bugs.
- Automation testing tools like Selenium, Testsigma, TestNG, UFT, Appium, Bamboo, JUnit, Cucumber, etc. allow QAs to test multiple codebases thoroughly in parallel to ensure there are no flaws in functionality. The environments used for testing are quickly provisioned, leveraging containers or virtual instances for rapid, automated deployments.
- This entire testing phase can be automated with the help of a continuous integration tool such as Jenkins. Automated testing saves a lot of time, effort, and labor compared with executing the tests manually. Build lifecycles, automation servers, code branching, merging, etc. are managed by open-source tools like Ant, Maven, and GitHub as required.
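To make concrete what such automation actually runs on every commit, here is a minimal sketch of a unit test suite using Python's built-in unittest framework; the `apply_discount` business rule is a hypothetical unit under test, not part of any real product:

```python
import unittest

def apply_discount(price: float, percent: float) -> float:
    """Hypothetical business rule used only to illustrate a unit under test."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class DiscountTests(unittest.TestCase):
    """Checks a CI server would execute automatically on every commit."""

    def test_discount_applied(self):
        # 10% off 200.0 should yield 180.0
        self.assertEqual(apply_discount(200.0, 10), 180.0)

    def test_invalid_percent_rejected(self):
        # out-of-range input must fail loudly, not silently mis-price
        with self.assertRaises(ValueError):
            apply_discount(200.0, 150)
```

A CI job simply runs the whole suite (e.g. `python -m unittest`) and fails the build if any assertion fails, which is what makes regressions visible within minutes of a commit.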
- This stage is the core of the entire DevOps life cycle: developers commit changes to the source code frequently, on an hourly, daily, or weekly basis depending on the release strategy.
- Every commit is then built, which allows early detection of problems if they are present. Building code involves not only compilation but also code review, unit testing, integration testing, and packaging.
- The code supporting new functionality is continuously integrated with the existing code. Since software is developed continuously, updated code needs to be integrated continuously and smoothly with the systems to reflect changes to end users. An ecosystem of tools (Jenkins, Ansible, GitLab CI, Travis CI, and numerous others) has sprung up to help organizations create their own pipelines.
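The core idea shared by all these pipeline tools, stages executed in order, stopping at the first failure, fits in a few lines. A minimal sketch, where stage names and callables are illustrative stand-ins for real build and test jobs:

```python
from typing import Callable, List, Tuple

def run_pipeline(stages: List[Tuple[str, Callable[[], bool]]]) -> List[str]:
    """Run CI stages in order and stop at the first failure.

    Each stage is a (name, callable) pair returning True on success.
    Returns the log of stage results. Real CI servers wrap this core loop
    with triggers, workspaces, artifacts, and notifications.
    """
    log = []
    for name, stage in stages:
        ok = stage()
        log.append(f"{name}: {'ok' if ok else 'FAILED'}")
        if not ok:
            break  # fail fast: later stages never run on a broken build
    return log
```

For example, with a failing unit-test stage, the package stage is never reached, which is exactly the "fail fast" behavior that keeps broken builds out of later environments.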
- This is the stage where the code is deployed to the production servers, and it is important to ensure that the code is correctly deployed on all of them. The role of microservices is becoming inevitable here, since they allow granular scaling and let logical application modules be updated and deployed independently while remaining interwoven to work and scale seamlessly.
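One common pattern for deploying correctly across a fleet is a rolling deploy with health checks and rollback. A minimal sketch, where `deploy`, `health_check`, and `rollback` are hypothetical callables standing in for real SSH sessions, container restarts, or cloud API calls:

```python
from typing import Callable, List

def rolling_deploy(
    servers: List[str],
    deploy: Callable[[str], None],
    health_check: Callable[[str], bool],
    rollback: Callable[[str], None],
) -> bool:
    """Deploy to servers one at a time; roll everything back on first failure.

    Returns True only if every server came up healthy. The callables are
    illustrative stand-ins for real infrastructure operations.
    """
    deployed = []
    for server in servers:
        deploy(server)
        deployed.append(server)
        if not health_check(server):
            # undo in reverse order so the fleet returns to a known-good state
            for s in reversed(deployed):
                rollback(s)
            return False
    return True
```

Because only one server changes at a time, a bad release is caught before it reaches the whole fleet, and the rollback path restores the previous version everywhere it was touched.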
- Configuration management is the practice of releasing server installations, scheduling upgrades, and keeping all instances consistent in their configurations. It is responsible for establishing and maintaining consistency in an application's functional requirements and performance, and for on-demand deployment of the code to the prescribed environments.
- Since new code is deployed on a continuous basis, configuration management tools play an important role in executing tasks quickly and frequently. Popular tools used here include Puppet, Chef, SaltStack, and Ansible.
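The idea these tools share is idempotent convergence: check the current state and act only on drift, so the same run can be applied repeatedly and frequently without side effects. A minimal sketch, with a plain file standing in for a managed resource:

```python
from pathlib import Path

def ensure_file(path: Path, desired: str) -> bool:
    """Converge a config file to its desired content.

    Returns True if a change was made, False if already compliant. This
    'check, then change only on drift' loop is the idempotency idea behind
    tools like Puppet, Chef, SaltStack, and Ansible.
    """
    if path.exists() and path.read_text() == desired:
        return False  # already in the desired state: no action, no side effects
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(desired)
    return True
```

Running the same declaration twice changes nothing the second time, which is what makes continuous, automated configuration runs safe.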
- This is a very crucial stage of the DevOps lifecycle, where the performance of the application is continuously monitored. It provides useful information for ensuring maximum productivity and uptime, and gives the operations team accurate tooling to locate and correct bugs and flaws in the software.
- The feedback, or corrective information, is processed to confirm the proper functionality of the application, and the root cause of any issue is determined in this phase. This maintains the security and availability of the services and triggers the actions required for correction.
- This practice involves the operations team, who monitor user activity for bugs or any improper behavior of the system and of the environments the toolchains operate on. Popular tools here include Nagios, Monit, collectd, and the ELK Stack (a combination of Elasticsearch, Logstash, and Kibana), along with commercial ones like Splunk, New Relic, and Sensu.
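At its core, a monitoring stack continuously compares live metrics against alert thresholds and raises alerts on breaches. A minimal sketch of that comparison, with plain dicts (hypothetical metric names) standing in for real metric streams:

```python
from typing import Dict, List

def check_thresholds(metrics: Dict[str, float],
                     thresholds: Dict[str, float]) -> List[str]:
    """Return the metric names whose latest value exceeds its alert threshold.

    A monitoring stack (Nagios, the ELK Stack, etc.) evaluates rules like
    this continuously against live data; here the inputs are plain dicts.
    Metrics without a configured threshold are ignored.
    """
    return sorted(
        name for name, value in metrics.items()
        if name in thresholds and value > thresholds[name]
    )
```

Real systems add scheduling, alert routing, and deduplication on top, but every alerting rule reduces to a comparison like this one.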
DevSecOps, NoOps, AIOps, and way forward
Since automation and innovation are aspirational, more and more relevant streams are being unified under the DevOps model, branching out and broadening its scope in line with market dynamics. DevSecOps, often described as "security as code", focuses on integrating security at every phase of the software development lifecycle, from initial design through integration, testing, deployment, and software delivery, with security checks automated and tightly integrated into the workflow.
NoOps, by contrast, focuses on deployment without any manual effort by design: an end-to-end automated workflow from code commit to deployment, an enriched form of continuous delivery and deployment that spans from the software down to the underlying infrastructure, allocating an instance, loading the OS and architectural software components such as application servers and databases, setting up the networking layer, building the application from the latest source in the code repository, and deploying it to the configured machine. Containers are fast replacing virtual environments in NoOps, making DevOps more agile; deployments are controlled through APIs that spin up containers, configure load balancers, or scale web services and data nodes, changing the very paradigms of development methodologies and metaphors.
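The control loop behind such API-driven scaling can be sketched as a desired-versus-actual reconciliation; the action tuples below are illustrative, not any real platform's API:

```python
from typing import Tuple

def reconcile(desired: int, running: int) -> Tuple[str, int]:
    """Decide the scaling action an automated control loop would issue.

    A NoOps platform runs this comparison continuously and calls the
    container or cloud API to apply the result; no operator is involved.
    The ("action", count) return shape is a hypothetical stand-in.
    """
    if running < desired:
        return ("start", desired - running)
    if running > desired:
        return ("stop", running - desired)
    return ("noop", 0)
```

Declaring the desired state and letting a loop converge toward it, rather than scripting each step, is what removes the manual operations the "No" in NoOps refers to.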
Worth mentioning is AIOps, a topic that could merit its own elaboration, which applies artificial intelligence to IT operations and continuous monitoring via specialized AI-based correlation engines that correlate alerts to determine root causes and pinpoint broken configurations, workflows, and the like.
In summary, DevOps is a strategy for assimilating cultural philosophies, best practices, and supporting tools that increase the ability to deliver applications and services at higher velocity, and to evolve and improve products with more agility, than traditional software development and infrastructure management approaches allow. It thus helps sustain the digital agenda by bringing a holistic approach to the complete software delivery mechanism.
While some relate DevOps only to automation, it goes well beyond that erstwhile domain, enabling other information technology areas and practices: hyper-automation for better performance in the face of digital-economy challenges and opportunities, resulting in a competitive edge and commercial advantage across the overall digital transformation agenda. Coupled with emerging technologies that drive integration, automation, self-service provisioning, and cloud adoption with auto-scaling and self-healing capabilities, the DevOps ecosystem has today become a role model for modern, open-source, and distributed computing practices. In a nutshell, crowdsourced innovation and sophisticated automation within a continual cycle of discovery, design, and delivery, with real-time measurement to deliver business benefits, remain the key mantra for DevOps, isn't it?