DevOps and Data Integration

DevOps is the fusing of software development (Dev) with IT operations (Ops). Adoption is well under way: more than 63% of developers have implemented DevOps, and most organizations that deploy DevOps can ship applications 30 times faster with 50% fewer failures.

The value of DevOps is so well known that it’s not worth repeating here. However, it’s important to state that DevOps is really about people and processes, as well as technology. The core notion is to release high-quality code and binaries that perform well, much more rapidly than traditional approaches to development, testing, and deployment allow.

Linking DevOps and data integration means producing new code that has data integration mechanisms automatically built into the deployed applications. This automation aspect of DevOps, including the automation of application and data integration from testing through staging, is perhaps the most important part of this shift in thinking and technology.

The key concept of DevOps is “continuous,” meaning that we think in terms of how to build, test, and deploy applications, as well as how to leverage automated solutions for data integration. Continuous integration, continuous delivery, and continuous deployment are the keys to a successful DevOps program, with automated releases going out many times a day. The objective is to meet the demands of the end user with as little latency as possible, which drives the ROI and highlights the value of the data integration aspects of DevOps.

Data integration fits into DevOps by becoming part of the tools and technology that developers build into applications. Thus, when applications are coded at the beginning of the DevOps process, developers can build data integration flows directly into the application to support data exchange between applications and data stores.
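To make that concrete, here is a minimal sketch of such an embedded data integration flow, using Python and SQLite purely for illustration. The database files, the customers table, and the email-normalization transform are all assumptions for the example, not a reference to any particular tool, and the source database is assumed to already contain a customers table.

    # Minimal sketch of a data integration flow built into an application:
    # extract from a source store, apply a simple transform, load to a target.
    import sqlite3

    def sync_customers(source_db: str, target_db: str) -> int:
        src = sqlite3.connect(source_db)
        tgt = sqlite3.connect(target_db)
        tgt.execute(
            "CREATE TABLE IF NOT EXISTS customers (id INTEGER PRIMARY KEY, email TEXT)"
        )

        rows = src.execute("SELECT id, email FROM customers").fetchall()
        # Transform step: normalize emails before loading them into the target.
        cleaned = [(cid, email.strip().lower()) for cid, email in rows]

        tgt.executemany(
            "INSERT OR REPLACE INTO customers (id, email) VALUES (?, ?)", cleaned
        )
        tgt.commit()
        src.close()
        tgt.close()
        return len(cleaned)

    if __name__ == "__main__":
        print(sync_customers("source.db", "target.db"))

Because the flow ships with the application code, it goes through the same build, test, and deploy pipeline as everything else.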

This means selecting a data integration tool that can support DevOps. Data virtualization is key: it lets us work with the physical data through an abstraction layer that can redefine the schema. From there, it’s a matter of defining how data flows from source to target systems.
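As a rough illustration of that abstraction layer, the sketch below maps a logical (virtual) schema onto physical column names so consumers only ever see the redefined schema. The field names and the mapping are made up for the example.

    # Minimal sketch of a data virtualization layer: a logical schema is mapped
    # onto physical columns, so consumers never touch the physical layout.
    from typing import Dict, Iterable, List

    class VirtualView:
        def __init__(self, field_map: Dict[str, str]):
            # logical field name -> physical column name
            self.field_map = field_map

        def project(self, physical_rows: Iterable[Dict[str, object]]) -> List[Dict[str, object]]:
            """Reshape physical rows into the logical schema."""
            return [
                {logical: row[physical] for logical, physical in self.field_map.items()}
                for row in physical_rows
            ]

    # The target system sees customer_id and email, whatever the source calls them.
    view = VirtualView({"customer_id": "CUST_NO", "email": "EMAIL_ADDR"})
    print(view.project([{"CUST_NO": 42, "EMAIL_ADDR": "ada@example.com"}]))

If the physical schema changes, only the field map needs updating; the source-to-target definitions written against the logical schema stay the same.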

In the world of DevOps, data integration needs to be part of the continuous integration process, as well as continuous testing, continuous deployment, and continuous operations. Therefore, we need to write the automation scripts that ensure the data integration pieces are part of the process.
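One possible shape for such an automation script is sketched below: it builds a disposable source database, runs the integration flow, and fails the build if the target doesn’t contain what’s expected. The app_integration module and the sync_customers function are hypothetical names referring back to the earlier illustrative flow.

    # Minimal sketch of a CI gate for the data integration piece: run the flow
    # against throwaway test databases and fail the build on a mismatch.
    import os
    import sqlite3
    import sys

    from app_integration import sync_customers  # hypothetical module from the earlier sketch

    def build_fixture(path: str) -> None:
        con = sqlite3.connect(path)
        con.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT)")
        con.executemany(
            "INSERT INTO customers VALUES (?, ?)",
            [(1, "A@EXAMPLE.COM"), (2, " b@example.com ")],
        )
        con.commit()
        con.close()

    def main() -> int:
        for path in ("ci_source.db", "ci_target.db"):
            if os.path.exists(path):
                os.remove(path)
        build_fixture("ci_source.db")

        loaded = sync_customers("ci_source.db", "ci_target.db")
        if loaded != 2:
            print(f"data integration check failed: loaded {loaded}, expected 2")
            return 1
        print("data integration check passed")
        return 0

    if __name__ == "__main__":
        sys.exit(main())

Wiring a check like this into the continuous integration pipeline means a broken data flow stops a release the same way a failing unit test would.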

DevOps is a real thing. It’s being implemented in many enterprises these days, moving from tactical experimentation to full-blown operational readiness. The objective is to reduce the time it takes to bring solutions to the business entities that need them. Data integration is part of DevOps, and part of the organizations, processes, and tool sets of those who deploy it. Please make sure to address data integration, because it is a big part of the DevOps picture.

Comments

  • Kammy

    Does Informatica offer any data integration tool that’s able to support DevOps?

    • Ryan Williams

      Jenkins is an open source continuous integration tool that is commonly used in enterprise DevOps environments. You can create CI pipelines that execute a deployment when code changes in a linked repository. The tool has a lot of options and thousands of plugins, so I’m sure there would be a way to make it work with Informatica given enough research. You would just want to be sure that any object dependencies, automated deployment tests, and rollback procedures are covered in the pipeline.

  • Manikanta

    Hi everyone,
    Is there any chance of getting a job in an Informatica DevOps profile as a fresher? I was trained in a few DevOps technologies.