Informatica 9.1 Supports Best Practices For Agile Data Integration

Informatica supports Agile Data Integration for Agile BI with best practices that encourage good data governance, facilitate business-IT collaboration, promote reuse and flexibility through data virtualization, and enable rapid prototyping and test-driven development.  Organizations that want to adopt Agile Data Integration successfully should standardize on the following best practices and leverage Informatica 9.1 to streamline the data integration process, improve data governance, and provide a flexible data virtualization architecture.

1. The business and IT work efficiently and effectively to translate requirements and specifications into data services

Informatica enables analysts and developers to collaborate on source-to-target ETL mapping specifications using role-based tools and a central metadata repository.  This is also referred to as self-service data integration because it empowers the business to do more on its own and collaborate more efficiently with IT to get the information it needs quickly, while IT retains control of the process for governance and compliance.  Self-service data integration streamlines communication between the business and IT and reduces errors, so IT can focus on higher-value tasks.  For example, the Data Integration Analyst option enables analysts to quickly translate business requirements into mapping specifications and validate data integration logic without help from IT.  Analysts can easily share these specifications with developers and automatically generate mappings to increase efficiency and deliver relevant, trustworthy, and authoritative information to business users for their Agile Data Integration projects.

2. Rapid prototyping and validation of mapping specifications and business rules

Analysts can directly validate data integration specifications and data quality rules within the Informatica Analyst browser-based UI. Developers can rapidly prototype data services, starting with a logical data object connected to physical data sources through what are called read-maps.  The logical data object can be profiled and validated by all stakeholders including developers and analysts.

3. Leverage an integrated business glossary of terms and metadata for quick search, impact analysis, and optimization

It is important to have a data governance program in place for Agile Data Integration to provide information transparency, integrity, and confidence.  This requires the ability to create a business glossary of terms linked to an information catalog containing metadata about data sources and targets, source-to-target ETL mappings, and other data integration objects.  This enables analysts to quickly find data by searching on business terms and browsing the data lineage.  Data quality is critical to ensuring information integrity with consistent, complete, and accurate data.  A multi-domain MDM solution enables Agile BI organizations to have confidence in their customer, product, and other master data with a best version of the truth, free of duplicates, and enriched with valuable relationship information.

4. Data profiling is used continuously throughout the design and development phases

Column-level profiling and comparative profiling help analysts, stewards, and developers measure progress and quality frequently throughout the Agile Data Integration implementation lifecycle.  This enables continuous test-driven development and helps avoid mistakes downstream.  Model profiling in the developer tool infers both primary and foreign key relationships, confirms that join conditions will work, identifies orphan rows, and finds redundant values in other tables.
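The profiling checks described above can be sketched generically. This is a minimal illustration of the ideas (column-level statistics, primary-key candidate inference, and orphan-row detection), not Informatica's API; the table and column names are hypothetical.

```python
# Hypothetical child/parent tables used to illustrate profiling checks.
orders = [
    {"order_id": 1, "customer_id": 10},
    {"order_id": 2, "customer_id": 11},
    {"order_id": 3, "customer_id": 99},  # orphan: no matching customer
]
customers = [{"customer_id": 10}, {"customer_id": 11}]

def column_profile(rows, column):
    """Column-level profile: row count, distinct count, null count."""
    values = [r[column] for r in rows]
    return {
        "rows": len(values),
        "distinct": len(set(values)),
        "nulls": sum(v is None for v in values),
    }

def is_candidate_key(rows, column):
    """A column is a primary-key candidate if it is unique and non-null."""
    p = column_profile(rows, column)
    return p["distinct"] == p["rows"] and p["nulls"] == 0

def orphan_rows(child, fk, parent, pk):
    """Rows in the child table whose foreign key has no parent match."""
    parent_keys = {r[pk] for r in parent}
    return [r for r in child if r[fk] not in parent_keys]

print(is_candidate_key(orders, "order_id"))                       # True
print(orphan_rows(orders, "customer_id", customers, "customer_id"))
# [{'order_id': 3, 'customer_id': 99}]
```

Running the same checks before and after a transformation is the essence of comparative profiling: any change in the profile signals a potential defect early in the iteration.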

5. Data quality is built into the data integration process to deliver trusted data

With a unified development environment, it is easy to find, reuse, and include data quality rules in the data processing pipeline.  Browser-based dashboards give all Agile BI and data warehousing project stakeholders complete visibility into the quality of your most important data at every iteration.  In this way, all stakeholders can easily monitor data quality and quickly respond to trends.
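The pattern of reusable rules feeding a quality dashboard can be sketched as follows. This is an illustrative sketch only, not Informatica's rule engine; the rule names and fields are assumptions.

```python
import re

# Reusable, named data quality rules: each is a predicate over a record.
RULES = {
    "email_format": lambda r: bool(
        re.match(r"[^@\s]+@[^@\s]+\.[^@\s]+$", r.get("email") or "")
    ),
    "country_present": lambda r: bool(r.get("country")),
}

def quality_score(rows):
    """Per-rule pass rate over a batch -- the kind of metric a dashboard
    would chart at every iteration."""
    return {
        name: sum(rule(r) for r in rows) / len(rows)
        for name, rule in RULES.items()
    }

batch = [
    {"email": "ana@example.com", "country": "US"},
    {"email": "bad-address", "country": ""},
]
print(quality_score(batch))  # {'email_format': 0.5, 'country_present': 0.5}
```

Because the rules live in one shared catalog, the same checks can be embedded in any pipeline and the pass rates tracked iteration over iteration.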

6. Insulate applications from underlying data changes through data virtualization built on a model-driven data services architecture

Informatica’s adaptive data services enable Agile BI organizations to build a data virtualization architecture that can access and federate any data source, implement data governance policies (e.g., access, quality, retention, privacy, and latency), and manage data assets by hiding the underlying complexity from consuming applications, essentially insulating them from underlying data changes.

7. Minimize delivery time and maximize reuse through flexible deployment options of data services

Adaptive data services provide flexible deployment options.  You design and build data services once and deploy them any way that is needed, such as through ODBC/JDBC, web services, or a batch ETL process.  This maximizes reuse across Agile Data Integration projects and enables the business to get the data it needs faster.

8. Leverage test automation and data validation wherever possible to increase test coverage and reduce errors

Informatica provides several tools to help you decrease the time it takes to perform unit and system testing for Agile Data Integration projects.  Comparative profiling enables developers to quickly compare profiles before and after transformations.  The Data Validation option enables you to automate tests that compare actual target results with the expected results of the ETL process, reducing test cycle time by up to 20x.  Data subsetting enables you to create realistic datasets based on production data, increasing test coverage and lowering costs while reducing defects.  Data masking protects sensitive data, minimizing the risk of a security breach.
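The core of automated data validation is comparing the actual target table against expected results. A hedged sketch of that comparison, not the Data Validation option's actual interface, might look like this (table contents are hypothetical):

```python
def validate(expected, actual, key):
    """Compare expected vs. actual target rows keyed by a unique column.
    Returns a sorted list of (issue, key) discrepancies."""
    exp = {r[key]: r for r in expected}
    act = {r[key]: r for r in actual}
    issues = []
    for k in exp.keys() - act.keys():
        issues.append(("missing", k))       # row never loaded
    for k in act.keys() - exp.keys():
        issues.append(("unexpected", k))    # row should not exist
    for k in exp.keys() & act.keys():
        if exp[k] != act[k]:
            issues.append(("mismatch", k))  # loaded with wrong values
    return sorted(issues)

expected = [{"id": 1, "total": 100}, {"id": 2, "total": 250}]
actual   = [{"id": 1, "total": 100}, {"id": 2, "total": 999}]
print(validate(expected, actual, "id"))  # [('mismatch', 2)]
```

Wiring checks like this into every ETL run is what turns testing from a one-off manual step into repeatable coverage at each iteration.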

How to get started with Agile Data Integration

To get started, identify an executive or senior manager to sponsor an early-win project that proves the value of Agile Data Integration.  Remember the Lean principle to focus first and foremost on knowing your customer, eliminating waste, and optimizing the process from the customer’s perspective.  This is key to ensuring the success of your first project.  Identify change agents and people who are passionate about making it work to deliver maximum business value to your customer.  Create an “as-is” and a “to-be” value stream map of your data integration process.  Informatica Professional Services offers a Lean Integration Value Stream Mapping Workshop to give participants the experience to apply Lean principles and optimize data integration processes.  Over time, continue to standardize and leverage the full capabilities of the Informatica platform to streamline the data integration process, facilitate business-IT collaboration, improve data governance, and provide a flexible data virtualization architecture.


