We always talk about the DevOps pipeline as a series of tools magically linked together end-to-end to automate the entire DevOps process. But is it really a pipeline?
Not really. The tools used to implement DevOps practices primarily automate key processes and functions so that they can be performed across organizational boundaries. Let’s consider a few examples that don’t exactly fit into “pipeline” thinking:
- Source code repositories – When code is being developed, it needs to be stored somewhere. Beyond storage, these tools offer other useful features, such as tracking dependencies between modules of code and recording where those modules are used in production. Tools such as JFrog Artifactory and GitHub provide these features. Development teams can use them to share code, a continuous integration server can pull code from them for building or testing, and continuous integration tools can store their results back into the repository.
- Containers – No one doubts that containers are a key enabler of DevOps for many companies, making it easier to move applications through the DevOps processes. But a container isn’t exactly a tool in the pipeline so much as a vehicle that allows all of the tools in the pipeline to connect together more easily.
- Cloud Sandboxes – Like containers, cloud sandboxes can be used in conjunction with many other tools in the DevOps pipeline, facilitating the capture and reuse of the infrastructure and application configurations needed in development, testing, and staging.
I think an alternative view of DevOps tools is to treat them as services. Each tool turns a function into a service that anyone can use; that service is automated, and users are provided with self-service interfaces to it. From this perspective, a source code repository is a service for managing source code that gives any user or other function a uniform way to access and manage that code. A Docker container is likewise a service for packaging applications and capturing metadata about the app, providing a uniform interface through which users and other tools can access apps. Used in conjunction with other DevOps tools, containers make those tools more effective.
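The “container as a service” idea can be made concrete with a small Dockerfile: the image packages the application and carries its own metadata via LABEL instructions. This is an illustrative sketch only; the application, label keys, and paths are hypothetical.

```dockerfile
# Hypothetical example: package an app and attach metadata as labels
FROM python:3.12-slim

# Metadata about the app travels with the image itself
LABEL org.example.app="orders-service" \
      org.example.version="1.4.2" \
      org.example.team="payments"

COPY app/ /opt/app/
CMD ["python", "/opt/app/main.py"]
```

Any downstream tool can then read that metadata uniformly, for example with `docker inspect --format '{{ json .Config.Labels }}' <image>` — which is exactly what gives every consumer the same interface to the packaged app.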
A cloud sandbox is another such service. It provides a way to package a configuration of infrastructure and applications and to keep metadata about that configuration as it is used. A Quali cloud sandbox is a service that lets any engineer or developer define their own configuration (called a blueprint) and then have that blueprint set up automatically and on demand (called a sandbox). Any tool in the DevOps pipeline can be combined with a cloud sandbox so that the tool’s function is performed in the context of a sandbox configuration. Combined with development, a cloud sandbox lets developers code and debug in the context of a production-like environment; combined with continuous integration tools, it lets tests run automatically in that same context. The same benefits apply when cloud sandboxes are used with test automation, application release automation, and deployment.
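The blueprint-to-sandbox flow can be sketched as a minimal self-service interface. This is an illustrative model only — the class and method names below are invented for the sketch and are not Quali’s actual API: a blueprint describes a reusable configuration, and the service stands up isolated sandboxes from it on demand.

```python
# Illustrative sketch only: hypothetical names, not a real vendor API.
import uuid
from dataclasses import dataclass, field

@dataclass
class Blueprint:
    """A reusable description of infrastructure plus applications."""
    name: str
    components: list  # e.g. ["web-server", "database", "load-balancer"]

@dataclass
class Sandbox:
    """A live, isolated instance created from a blueprint."""
    sandbox_id: str
    blueprint: Blueprint
    metadata: dict = field(default_factory=dict)

class SandboxService:
    """Self-service interface: any engineer or tool requests a sandbox on demand."""
    def __init__(self):
        self._active = {}

    def create(self, blueprint: Blueprint, **metadata) -> Sandbox:
        # Each request gets its own isolated instance of the same blueprint
        sandbox = Sandbox(str(uuid.uuid4()), blueprint, metadata)
        self._active[sandbox.sandbox_id] = sandbox
        return sandbox

    def teardown(self, sandbox_id: str) -> None:
        self._active.pop(sandbox_id, None)

# A developer and a CI job reuse the same blueprint independently
service = SandboxService()
staging = Blueprint("staging-like", ["web-server", "database"])
dev_box = service.create(staging, owner="developer")
ci_box = service.create(staging, owner="ci-pipeline")
```

The point of the sketch is the uniform interface: development, CI, and test automation all consume the same `create`/`teardown` service rather than each building its own environment by hand.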
Using this service-oriented view, a DevOps practice is implemented by taking the most commonly used functions and turning them into automated services that any group in the organization can access and use. The development, test, release, and operations processes are then implemented by combining these services in different ways to achieve full automation.
Sticking with the pipeline view of DevOps might lead people to mistakenly think that the processes they use today cannot and should not change. A service-oriented view is more cross-functional, and it has the potential to open up organizational thinking and let true DevOps processes evolve.