Top Questions and Answers on Data Integration Productivity Advancements in v10.1

Recently, Indranil Roy and I presented a technical deep-dive webinar on Data Integration Productivity Advancements in 10.1 – the latest innovations to boost developer productivity, simplify your data integration environment, and make it more robust to evolving schema changes. It was well attended, and we got great questions from attendees. Due to time constraints, we could not address them during the webinar. I have gone through the list and covered most of the questions here.

Here’s the link to the replay, as well as the top questions and answers from the session. Feel free to reach out to your local Informatica account manager if you’d like clarification or more details about these new features.

Data Integration Productivity Advancements in v10.1 – Technical Deep Dive

What is Dynamic Mapping? What data sources are supported? 

A Dynamic Mapping is a mapping that can accommodate changes to sources, targets, and transformation logic at run time. Dynamic Mappings are highly re-usable mapping templates and can be used to manage frequent schema or metadata changes, or to handle data sources with different schemas. Dynamic Schema, Input Rules, Parameterization, and Dynamic Expressions are all features of Dynamic Mappings. Dynamic Mappings support a variety of standard sources and targets, such as relational databases, flat files, Hive, HDFS, HBase, and data warehouse appliances.

What are the benefits of Dynamic Mapping?

A customer who is standing up a new data warehouse has estimated that Dynamic Mappings will save his team 50 percent of the data integration effort.

Can we run the same mapping to load data into different target tables?

Yes, you can parameterize the target table. And by using parameter files, you can run the mapping from the command line as well.
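As an illustration, a run-time parameter file that supplies the target table might look like the sketch below. The project, mapping, and parameter names here are hypothetical, and the exact element structure may differ by version; consult the Informatica documentation for the precise parameter file schema.

```xml
<!-- Illustrative sketch of a Developer tool parameter file (names are hypothetical) -->
<root xmlns="http://www.informatica.com/Parameterization/1.0">
  <project name="DW_Project">
    <mapping name="m_Load_Dimension">
      <!-- Parameter that the mapping reads to resolve its target table -->
      <parameter name="Param_TargetTable">CUSTOMER_DIM</parameter>
    </mapping>
  </project>
</root>
```

Changing the parameter value in this file and re-running the mapping lets the same mapping load a different target table on each run, without editing the mapping itself.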

If we want to use a subset of columns from the source, will dynamic mapping work for us?

Yes. By defining Input Rules, you can include or exclude source columns based on data type, name, and so on. Input Rules can also be parameterized, and the parameter value can be changed for each run.

How many processes can I run in parallel using the same mapping?

There is no software limit on the number of mappings you can run in parallel; however, system resources need to be monitored.

How can a data source be changed at run time?

A data source can be changed at run time by supplying a different parameter value for that run.

What is SQL to Mapping? What SQL statements does it support? 

SQL to Mapping converts external hand-coded ANSI SQL (including SELECT, INSERT, UPDATE, and DELETE statements) into a PowerCenter mapping. Hand-coded SQL is difficult to maintain and re-use, and it is opaque to Metadata Manager, blocking data lineage visibility.
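For example, a hand-coded load statement along the following lines could be imported and converted into an equivalent mapping. The table and column names are hypothetical, used only to illustrate the kind of ANSI SQL involved.

```sql
-- Hypothetical hand-coded load statement of the kind SQL to Mapping can convert
INSERT INTO customer_dim (customer_id, customer_name, region_name)
SELECT c.customer_id,
       c.customer_name,
       r.region_name
FROM   customers c
JOIN   regions   r ON c.region_id = r.region_id
WHERE  c.active_flag = 'Y';
```

Once converted, the logic becomes a mapping that Metadata Manager can see, restoring data lineage visibility that the hand-coded SQL lacked.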

Will Push-down Optimization convert “SQL to Mapping” back to SQL to run against the database?

Yes. Once a mapping is generated, the execution engine performs optimization and, wherever it finds clauses that can be pushed down to the database, it does so.

Are these features available to PowerCenter customers? When will these features be available? Which package has these capabilities?

Dynamic Mapping and SQL to Mapping are available to PowerCenter customers via the Developer tool. Dynamic Mapping is available in v10.0, and SQL to Mapping will be available in v10.1. These capabilities are included in the PowerCenter Productivity Package, which is an add-on to the Standard, Advanced, and Premium Editions.