Tag Archives: PowerCenter

The Swiss Army Knife of Data Integration

Back in 1884, a man had a revolutionary idea: a compact, lightweight knife that would combine the functions of many stand-alone tools into a single tool. That idea became what the world has known for over a century as the Swiss Army Knife.

The creative thinking to solve this problem came from a request by the Swiss Army to build a soldier's knife. In the end, the solution was all about getting the right tool for the right job in the right place. In many cases soldiers didn't need industrial-strength tools; all they really needed was a compact, lightweight tool to get the job at hand done quickly.

Putting this into perspective with today's world of Data Integration, using enterprise-class data integration tools for smaller data integration projects is overkill and typically out of reach for the smaller organization. Yet these smaller projects are just as important as the larger enterprise projects, and they are often the innovation behind a new way of business thinking. The traditional hand-coding approach to the smaller data integration project is not scalable, not repeatable and prone to human error; what's needed is a compact, flexible and powerful off-the-shelf tool.

Thankfully, over a century after the world embraced the Swiss Army Knife, someone at Informatica was paying attention to revolutionary ideas. If you haven't yet heard the news, a free version of the Informatica platform called PowerCenter Express has been released. You can use it to handle an assortment of what I'd characterize as high-complexity, low-volume data integration challenges and experience a subset of the Informatica platform for yourself. I'd emphasize that PowerCenter Express doesn't replace the need for Informatica's enterprise-grade products, but it is ideal for rapid prototyping, data profiling, and developing quick proofs of concept.

PowerCenter Express provides a glimpse of the evolving Informatica platform by integrating four Informatica products into a single, compact tool. There are no database dependencies, and the product installs in just under 10 minutes. Much to my own surprise, I use PowerCenter Express quite often in the various aspects of my job at Informatica. I have it installed on my laptop, so it travels with me wherever I go. It starts up quickly, so it's ideal for getting a little work done on an airplane.

For example, I recently wanted to explore building some rules for an upcoming proof of concept on a plane ride home so I could claw back some personal time for my weekend. I used PowerCenter Express to profile some data and create a mapping. And that mapping wasn't something I needed to throw away and recreate in an enterprise version after my flight landed. Vibe, Informatica's build-once/run-anywhere, metadata-driven architecture, allows me to export a mapping I create in PowerCenter Express to one of the enterprise versions of Informatica's products such as PowerCenter, Data Quality or Informatica Cloud.

As I alluded to earlier in this article, because it is a free offering I honestly didn't expect too much from PowerCenter Express when I first started exploring it. After my own positive experiences, however, I now like to think of PowerCenter Express as the Swiss Army Knife of Data Integration.

To start claiming back some of your own personal time, get the free version of PowerCenter Express on the Informatica Marketplace at: https://community.informatica.com/solutions/pcexpress

Business Use Case for PowerCenter Express

Posted in Architects, Data Integration, Data Migration, Data Transformation, Data Warehousing, PowerCenter, Vibe

A Data Integration Love-Fest in Vegas

Question: What do American Airlines, Liberty Mutual, Discount Tire and MD Anderson all have in common?

Is it?

a) They are all top in their field.

b) They all view data as critical to their business success.

c) They are all using Agile Data Integration to drive business agility.

d) They have spoken about their Data Integration strategy at Informatica World in Vegas.

Did you reply "all of the above"? If so, then give yourself a Ding Ding Ding. Or shall we say Ka-Ching, in honor of our host city?

Indeed, data experts from these companies and many more flocked to Las Vegas for Informatica World. They shared their enthusiasm for the important role of data in their businesses, and these industry leaders discussed best practices that facilitate an Agile Data Integration process.

American Airlines recently completed a merger with US Airways, making it the largest airline in the world. In order to service critical reporting requirements for the merged airlines, the enterprise data team undertook a huge Data Integration task. This effort involved large-scale data migration and included many legacy data sources. The project required transferring over 4TB of current history data for Day 1 reporting. A major task still remains: integrating multiple combined subject areas in order to give a full picture of combined reporting.

American Airlines architects recommend the use of Data Integration design patterns in order to improve agility. The architects shared success factors for merger Data Integration. They discussed the importance of ownership by leadership from IT and business. They emphasized the benefit of open and honest communication between teams. The architects also highlighted the need to identify integration teams and priorities. Finally, the architects discussed the significance of understanding cultural differences and celebrating success. The team summarized with merger Data Integration lessons learned: metadata is key, IT and business collaboration is critical, and profiling and access to the data are helpful.

Liberty Mutual, the third-largest property and casualty insurer in the US, has grown through acquisitions. The Data Integration team needs to support this business process, and they have been busy integrating five claim systems into one, a large-scale Data Integration challenge. To add to the complexity, the business requires that each phase be completed in one weekend, that no data be lost in the process and that all finances balance out at the end of each merge. Integrating all claims in a single location was critical for smooth processing of insurance claims. A single system also reduces the cost and complexity of support and maintenance.

Liberty Mutual experts recommend a methodology of work preparation, profiling, delivery and validation. Rinse and repeat. Additionally, the company chose to use a visual Data Integration tool, which was quick and easy for the team to learn and greatly enhanced development agility.

Discount Tire, the largest independent tire dealer in the USA, shared tips and tricks from migrating legacy data into a new SAP system. This complex project included data conversion from 50 legacy systems. The company needs to combine and aggregate data from many systems, including customer, sales, financial and supply chain. This integrated system helps Discount Tire make key business decisions and stay ahead in a highly competitive space.

Discount Tire has automated its data validation process in development and in production. This reduces testing time, minimizes data defects and increases the agility of development and operations. The company has also implemented proactive monitoring to enable early detection and correction of data problems in production.

MD Anderson Cancer Center is the No. 1 hospital for cancer care in the US, according to U.S. News and World Report. They are pursuing the lofty goal of erasing cancer from existence, and Data Integration is playing an important role in this fight. To accomplish their goal, MD Anderson researchers rely on the integration of vast amounts of genomic, clinical and pharmaceutical data to facilitate leading-edge cancer research.

MD Anderson experts pursue Agile Data Integration through close collaboration between IT and business stakeholders. This enables them to meet the data requirements of the business faster and better. They shared that data insights, delivered through metadata management, offer significant value to the organization. Finally, the experts at MD Anderson believe in 'Map Once, Deploy Anywhere' as the way to accomplish Agile Data Integration.

So let’s recap, Data Integration is helping:

- An airline to continue serving its customers and running its business smoothly post-merger,

- A tire retail company to procure and provide tires to its customers and maintain its market leadership,

- An insurance company to process claims accurately and in a timely manner while minimizing costs, and

- A cancer research center to cure cancer.

Not too shabby, right? Data Integration is clearly essential to business success!

So OK, I know, I know… what happens in Vegas, stays in Vegas. Still, this was one love-fest I was compelled to share! I wish you had been there. Hopefully you will be next year!

To learn more about Agile Data Integration, check out this webinar: Great Data by Design II: How to Get Started with Next-Gen Data Integration

 

Posted in Data Integration, Data Integration Platform

Agile Data Integration in Action: PowerCenter 9.6 Demo

A Data Integration Developer, a Data Analyst and a Business Analyst go into a bar… Heard that one? You probably didn't. They never made it to the bar. They are still back at the office, going back and forth for the umpteenth time on the data requirements for the latest report…

Sound familiar? If so, you are not alone. Many IT departments are struggling to meet the data needs of their business counterparts. Spreadsheets, emails and cocktail napkins have not proven themselves effective tools for relaying data requirements from the business. The process takes too long and leaves both sides frustrated and dissatisfied with the outcome. IT does not have the bandwidth to meet the ever-increasing and rapidly changing data needs of the business.

The old-fashioned "waterfall" approach to data integration simply won't cut it anymore in today's fast-paced, data-driven world. There has to be a better way. Here at Informatica, we believe that an end-to-end Agile Data Integration process can greatly increase business agility.

We start with a highly collaborative process, whereby IT and the analyst work closely together, iterating to define data integration requirements. IT empowers the analyst with self-service tools that enable rapid prototyping and data profiling. Once the analyst is happy with the data they have accessed and combined, they can seamlessly share the output with IT for final deployment. This approach greatly reduces the time-to-data, and not just any data, the right data!

The ability to rapidly generate reports and deliver new critical data for decision-making is foundational to business agility. Another important aspect of business agility is the ability to scale your system as your needs grow to support more data, data types, users and projects. We accomplish that through advanced scaling capabilities, such as grid support and high availability, which lead to zero downtime, as well as through improved data insights via metadata management, lineage, impact analysis and a business glossary.

Finally, we need to continue to ensure agility when our system is in production. Data validation should be performed to eliminate data defects. Trying to validate data manually is like looking for a needle in a haystack, very slowly… Automating your data validation process is fast and reliable, ensuring that the business gets accurate data all the time.
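To make this concrete, here is a rough sketch of what an automated source-to-target check might look like. It's plain Python with hypothetical table and column names, not Informatica's Data Validation Option, but it captures the idea: codify the rules once and run them after every load.

```python
import sqlite3  # stand-in for connections to the real source and target systems

# Hypothetical databases and tables; in practice these would point at the
# source application and the target data warehouse.
source = sqlite3.connect("source.db")
target = sqlite3.connect("warehouse.db")

def scalar(conn, sql):
    """Run a query that returns a single value."""
    return conn.execute(sql).fetchone()[0]

# Rule 1: row counts must match after the nightly load.
src_rows = scalar(source, "SELECT COUNT(*) FROM orders")
tgt_rows = scalar(target, "SELECT COUNT(*) FROM dw_orders")
assert src_rows == tgt_rows, f"Row count mismatch: {src_rows} vs {tgt_rows}"

# Rule 2: financial totals must balance.
src_total = scalar(source, "SELECT ROUND(SUM(amount), 2) FROM orders")
tgt_total = scalar(target, "SELECT ROUND(SUM(amount), 2) FROM dw_orders")
assert src_total == tgt_total, f"Total mismatch: {src_total} vs {tgt_total}"

# Rule 3: no defects such as NULL keys should reach the target.
null_keys = scalar(target, "SELECT COUNT(*) FROM dw_orders WHERE order_id IS NULL")
assert null_keys == 0, f"{null_keys} rows loaded with a NULL order_id"

print("All validation rules passed")
```

Run a script like this at the end of every load and you have a repeatable, auditable check instead of a manual spot check of whatever subset there was time to eyeball.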

It is just as important to become more proactive and less reactive when it comes to your data in production. Early detection of data process and workflow problems through proactive monitoring is key to prevention.

Would you like to see a 5X increase in the speed of delivering data integration projects?

Would you like to provide the system reliability you need as your business grows, and ensure that your business continues to get the critical data it requires without defects and without interruption?

To learn more about how Agile Data Integration can enable business agility, please check out the demonstration of the newly released PowerCenter 9.6, featuring David Lyle, VP of Product Strategy at Informatica, and the Informatica Product Desk experts. This demo webinar is available on demand.

Deep Dive Demo: Informatica PowerCenter 9.6.

Posted in Big Data, Business Impact / Benefits, Business/IT Collaboration, Data Integration

Data Integration in Action at Informatica World

Wouldn't you like to have been a fly on the wall when American Airlines and US Airways experts got together to integrate their data systems into one cohesive post-merger system?

Now you can experience the next best thing by attending InformaticaWorld 2014 and hearing the American Airlines and US Airways data architects talk about the data challenges they faced. They will discuss the role of architecture in M&A, integrating legacy data, lessons learned, and best practices in Data Integration.

While you are at the show, you will have the opportunity to hear many industry experts discuss current trends in Agile end-to-end Data Integration.  

Agile Data Integration Development
To deliver the agility that your business requires, IT and the business must pursue a collaborative Data Integration process, with the appropriate analyst self-service Data Integration tools. At InformaticaWorld, you can learn about Agile Data Integration development from the experts at GE Aviation, who will discuss Agile Data Integration for Big Data Analytics. Experts from Roche will discuss how Agile Data Integration has led to a 5x reduction in development time, improved business self-service capabilities and increased data credibility.

Scalability
Another aspect of agility is your ability to scale your Data Warehouse to rapidly support more data, data sources, users and projects.  Come hear the experts from Liberty Mutual share challenges, pitfalls, best practices and recommendations for those considering large-scale Data Integration projects, including successful implementation of complex data migrations, data quality and data distribution processes.

Operational Confidence
The management of an enterprise-scale Data Warehouse involves the operation of a mature and complex mission-critical environment, which is commonly driven through an Integration Competency Center (ICC) initiative. You now need to inspect and adapt your production system and to expedite data validation and monitoring through automation, so that data issues can be caught and corrected quickly and resources can be freed up to focus on development.

The experts from the University of Pittsburgh Medical Center, along with Informatica Professional Services experts, will discuss best practices, lessons learned and the process of transitioning from 'analytics as a project' to an enterprise initiative through the use of an Integration Competency Center.

Hear from the Informatica Product Experts
You will have many opportunities to hear directly from the Informatica product experts about end-to-end Data Integration Agility delivered in the recent 9.6 release of PowerCenter.

See PowerCenter 9.6 in Action
Don't miss the opportunity to see live demos of the cool new features of the PowerCenter 9.6 release at the multitude of hands-on labs offered at InformaticaWorld this year.

For example, you can learn how to empower business users through self-service Data Integration with the PowerCenter Analyst tool; how to reduce the testing time of Data Integration projects through automated validation tests; and how to scale your Data Integration with High Availability and Grid.

The sessions described here are a sampling of the rich variety of Data Integration sessions that will be offered at the show. We hope that you will join us at InformaticaWorld this year in Las Vegas on May 13-15, and as you plan your visit, please check out the complete listing of sessions and labs that are focused on Data Integration.

Please feel free to leave a comment and let us know which InformaticaWorld session(s) you are most looking forward to! See you there!

Posted in Business/IT Collaboration, Data Integration, Informatica Events, Informatica World 2014, Integration Competency Centers

Agile Development tools deliver up to 5x faster data integration development

When I was seven years old, Danny Weiss had a birthday party where we played the telephone game. The idea is this: eight people sit around a table, and the first person tells the next person a little story. That person tells the next person the story, and so on, all the way around the room. At the end of the game, you compare the story the first person told to the story the eighth person tells. Of course, the stories are very different and everyone giggles hysterically… we were seven years old, after all.

The reason I was thinking about this story is that data integration development is often as inefficient as the telephone game at a seven-year-old's birthday party. The typical process is that a business analyst, using the knowledge in their head about the business applications they are responsible for, creates a spreadsheet in Microsoft Excel with a list of database tables and columns, along with a set of business rules for how the data is to be transformed as it is moved to a target system (a data warehouse or another application). The spreadsheet, which is never checked against real data, is then passed to a developer, who creates code in a separate system to move the data. The result is then checked by a QA person and checked again by the business analyst at the end of the process. That is the first time the business analyst verifies their specification against real data.

99 times out of 100, the data in the target system doesn't match what the business analyst was expecting. Why? Either the original specification was wrong because the business analyst made a typo, or the data is inaccurate. Or the data in the original system wasn't organized the way the analyst thought it was. Or the developer misinterpreted the spreadsheet. Or the business analyst simply doesn't need this data anymore; they need some other data. The result is lots of errors, just like the telephone game. And the only way to fix it is with rework, and then more rework.

But there is a better way. What if the data analyst could validate their specification against real data and self-correct on the fly before passing the specification to the developer? What if the specification were not just a specification, but a prototype that could be passed directly to the developer, who wouldn't recode it but would simply modify it to add scalability and reliability? The result is much less rework and much faster time to development. In fact, up to 5 times faster.

That is what Agile Data Integration is all about: rapid prototyping and self-validation against real data up front by the business analyst, and sharing results back and forth with the developer in a common toolset to improve the accuracy of communication.
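Here is a rough illustration of what validating a specification against real data could look like. It's a generic Python sketch with hypothetical column names and rules, not the PowerCenter Analyst tool, but the principle is the same: check the spec against a sample of the real source before anyone writes integration code.

```python
import csv

# Hypothetical mapping specification: source column -> (target column, business rule).
spec = {
    "CUST_ID":   ("customer_id",  lambda v: v.strip() != ""),
    "ORDER_AMT": ("order_amount", lambda v: float(v) >= 0),
    "ORDER_DT":  ("order_date",   lambda v: len(v) == 10),  # expecting YYYY-MM-DD
}

def validate_spec_against_sample(sample_csv, spec, max_rows=1000):
    """Check that the spec's columns exist and its rules hold on a sample of real data."""
    problems = []
    with open(sample_csv, newline="") as f:
        reader = csv.DictReader(f)
        missing = [col for col in spec if col not in (reader.fieldnames or [])]
        if missing:
            return [f"Columns in spec but not in source: {missing}"]
        for i, row in enumerate(reader):
            if i >= max_rows:
                break
            for col, (target_col, rule) in spec.items():
                try:
                    if not rule(row[col]):
                        problems.append(f"Row {i}: {col} fails the rule for {target_col}")
                except ValueError:
                    problems.append(f"Row {i}: {col} has unparseable value {row[col]!r}")
    return problems

issues = validate_spec_against_sample("orders_sample.csv", spec)
print("Spec looks good" if not issues else "\n".join(issues[:20]))
```

The analyst catches the typo, the misunderstood column or the missing data before the developer writes a line of code, which is exactly where the rework savings come from.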

Because we believe the agile process is so important to your success, Informatica is giving all of our PowerCenter Standard Edition (and higher editions) customers agile data integration for FREE!!! That's right, if you are a current customer of Informatica PowerCenter, we are giving you the tools you need to go from the old-fashioned, error-prone, waterfall, telephone-game style of development to a modern 21st-century Agile process:

  • FREE rapid prototyping and data profiling for the data analyst
  • Go from prototype to production with no recoding
  • Better communication and better collaboration between analyst and developer

PowerCenter 9.6.  Agile Data Integration built in.  No more telephone game.  It doesn’t get any better than that.

 

 

Posted in Data Integration

Dinner with my French Neighbor

My wife invited my new neighbors over for dinner this past Saturday night. They are a French couple with a super-cute 5-year-old son. Dinner was nice, and like most expats in the San Francisco Bay Area, he is in high tech. His company is a successful internet company in Europe but has had a hard time penetrating the U.S. market, which is why they moved to the Bay Area. He is starting up a satellite engineering organization in Palo Alto, and he asked me where he can find good "big data" engineers. He is having a hard time finding people.

This is a story I am hearing quite a bit from the customers I have been talking to as well. They want to start up big data teams but can't find enough skilled engineers who understand how to develop in Pig or Hive or YARN or whatever is coming next in the Hadoop/MapReduce world.

This reminds me of when I used to work in the telecom software business 20 years ago and everyone was looking at technologies like DCE and CORBA to build out distributed computing environments to solve complex problems that couldn't be solved easily on a single computing system. If you don't know what DCE or CORBA are/were, that's OK. That is kind of the point. They were distributed computing development platforms that failed because they were too damn hard, and there just weren't enough people who could understand how to use them effectively. Now, DCE and CORBA were not trying to solve the same problems as Hadoop, but the basic point still stands: they were damn hard, and the reality is that programming on a Hadoop platform is damn hard as well.

So could Hadoop fail, just like CORBA and DCE? I doubt it, for a few key reasons. First, there is a considerable amount of venture and industrial investment going into Hadoop to make it work. Not since Java has there been such a concerted effort by the industry to make a new technology successful. Second, much of that investment is in providing graphical development environments and applications that use the storage and compute power of Hadoop but hide its complexity. That is what Informatica is doing with PowerCenter Big Data Edition. We are making it possible for data integration developers to parse, cleanse, transform and integrate data using Hadoop as the underlying storage and engine. But the developer doesn't have to know anything about Hadoop. The same thing is happening at the analytics layer, at the data prep layer and at the visualization layer.

Bit by bit, software vendors are hiding the underlying complexity of Hadoop so organizations won’t have to hire an army of big data scientists to solve interesting problems.  They will still need a few of them, but not so many that Hadoop will end up like those other technologies that most Hadoop developers have never even heard of.
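As a loose conceptual sketch (plain Python, not PowerCenter Big Data Edition's actual architecture), hiding the engine boils down to expressing the transformation logic once and letting a pluggable engine decide where and how it runs:

```python
from concurrent.futures import ThreadPoolExecutor

# The transformation logic is written once and knows nothing about the engine.
def cleanse(record):
    """Trim and normalize a raw customer record."""
    return {"id": int(record["id"]), "name": record["name"].strip().title()}

def run_local(records, transform):
    """A simple single-machine engine."""
    return [transform(r) for r in records]

def run_scaled_out(records, transform, workers=4):
    """A stand-in for a scale-out engine (think of a Hadoop cluster):
    the same transform is applied across partitions of the data."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(transform, records))

raw = [{"id": "1", "name": "  alice  "}, {"id": "2", "name": "BOB"}]

# The developer picks an engine without rewriting the mapping logic.
print(run_local(raw, cleanse))
print(run_scaled_out(raw, cleanse))
```

The developer works at the level of `cleanse`, and the platform worries about where the work actually executes.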

Power to the elephant. And more later about my dinner guest and his super-cute 5-year-old son.

Posted in Big Data, Data Integration, Data Integration Platform

Power(Center) to the People

At InformaticaWorld, we made a very exciting announcement—the introduction of PowerCenter Express, our entry-level data integration and profiling tool. What is PowerCenter Express, exactly? Well, in a nutshell, it's giving the power of PowerCenter to everyone, "to the people" if you like. We made PowerCenter Express available to all attendees at InformaticaWorld, and they'll be able to install it and be up and running in less than ten minutes. Since it's PowerCenter, they'll be able to scale up to enterprise-class capabilities whenever they need to, using Vibe, our "Map Once, Deploy Anywhere" technology. Starting in July, PowerCenter Express will be generally available to everyone as a free download from the Informatica Marketplace.

What we are doing with PowerCenter Express is making sure that everyone, including departments and growing businesses, has access to PowerCenter's high-quality data integration and profiling tools. Until now, the options for these groups have been limited: hand coding or open source products. Neither option can scale to handle enterprise-class data integration requirements. That meant that, before the advent of PowerCenter Express, when these smaller organizations reached the point where they needed enterprise-class capabilities and had to migrate to an enterprise data integration tool, they had no choice but to scrap all of their prior work. We don't want that to happen anymore. We don't want anyone to have to rewrite mappings or redo work, ever. We want people to be able to map once and deploy anywhere. And that's what PowerCenter Express makes possible: any organization, no matter how small, can start with PowerCenter, the gold standard for data integration, and stay with PowerCenter, reusing those same mappings when they transition to enterprise class or when they want to deploy those mappings to Hadoop.

The reality is, as organizations' data integration complexity reaches a certain point, they end up coming to Informatica for the best products, the best support and the biggest ecosystem of developers. But in the past, starting with the fully functional PowerCenter wasn't always the best option for smaller organizations. With PowerCenter Express, organizations can start small, start now, and scale fast. PowerCenter Express offers a real choice and future protection for entry-level data integration.

If you'd like to learn more about PowerCenter Express before the public launch, shoot me an email at EBurns@Informatica.com. And start following me here; I'll be posting a lot about this exciting new product over the coming weeks and months.

Emily V. Burns
Sr. Product Marketing Manager, PowerCenter Express

Posted in Marketplace

World’s Best Entry Level Data Integration Product Has Finally Arrived!

For those of you hanging out at Informatica World, this is not news. For those of you who aren't in Vegas with us, you missed the unveiling of the world's best entry-level data integration platform. So you heard it here second, not first. Next time, if you want to hear about this kind of stuff first, you have to show up at Informatica World! <shameless plug for INFAWorld 2013 complete>

So, what is it that I am bragging about? PowerCenter Express, that's what. This is the latest addition to the Informatica PowerCenter family of products, specifically designed for entry-level data integration and data profiling. The product is downloadable over the Internet and installs in as little as 5 minutes. It is super simple to use but has all of the rich transformation functionality you are used to from Informatica. Also, you don't have to install a separate profiling product; everything is self-contained. The product comes with built-in "cheat sheets" that walk you through how to use it in a step-by-step fashion. In addition, there is complete documentation as well as video-based tutorials.

But best of all, PC Express delivers the kind of product quality you are accustomed to from Informatica. What does that mean? It means that unlike most of the entry-level data integration products available for download, PC Express just works. It doesn't crash just because your ETL job requires more memory than you have on your machine; it gracefully caches to disk.
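To illustrate what graceful disk caching means, here is a toy sketch of the general technique (not how the PowerCenter engine is actually implemented): when an aggregation outgrows memory, partial results are spilled to disk and merged at the end instead of the job simply dying.

```python
import os
import pickle
import tempfile
from collections import defaultdict

def aggregate_with_spill(records, memory_limit=100_000):
    """Sum amounts per key, spilling partial results to disk whenever the
    in-memory cache grows past memory_limit keys (toy illustration)."""
    cache = defaultdict(float)
    spill_files = []

    def spill():
        fd, path = tempfile.mkstemp(suffix=".spill")
        with os.fdopen(fd, "wb") as f:
            pickle.dump(dict(cache), f)
        spill_files.append(path)
        cache.clear()

    for key, amount in records:
        cache[key] += amount
        if len(cache) >= memory_limit:
            spill()  # cache to disk instead of failing

    # Merge the in-memory remainder with everything spilled to disk.
    result = dict(cache)
    for path in spill_files:
        with open(path, "rb") as f:
            for key, amount in pickle.load(f).items():
                result[key] = result.get(key, 0.0) + amount
        os.remove(path)
    return result

# Example: aggregate a stream of (customer_id, amount) pairs that is far
# larger than the in-memory cache is allowed to grow.
totals = aggregate_with_spill(((i % 250_000, 1.0) for i in range(1_000_000)))
print(len(totals), "keys aggregated")
```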

But wait, there's more. For the first time ever, Informatica is offering a FREE version of our market-leading PowerCenter product. There will be two versions of PowerCenter Express:

  • PowerCenter Express Personal Edition – available for FREE for a single developer at a time
  • PowerCenter Express Professional Edition- available for $8K/user per year subscription (at the time of this blog post)

And one last important point. PC Express is based on the same virtual data machine as our enterprise-class products and our cloud-based products. This means that at some later date, if you decide you need more scalability, more users, or enterprise-class features like high availability, you can easily migrate from PC Express to the other Informatica data integration product lines.

So if you are at Informatica World, you will be receiving an email outlining how you can download and try out PowerCenter Express. If you aren't at Informatica World, maybe you have a friend who will share the secret website location where you can get a sneak peek at PowerCenter Express. If you don't have any friends who went to Informatica World, well, you will just have to wait until the download site goes public in July. And next time you will know that you had better go to Informatica World if you want early access to cool stuff.

Posted in Data Integration, SaaS

Your Customized Informatica World 2013 Guide for Managing PowerCenter for Success

“We do nightly updates to our data warehouse, but we have no way to validate that the data was moved and transformed correctly in the time available. We only have time to test a subset and hope that it is the right subset.”

“We have tools to tell us about performance after an issue occurs, but nothing that helps us prevent the issue in the first place. So, we find out about failures from the end-users looking at a bad report. This causes delayed or poor business decision making, and also impacts our departments’ reputation.”

These are just a couple of the quotes I have heard from data integration end users. We have been collecting similar feedback from thousands of our data integration customers about their challenges across the data integration lifecycle. What we are hearing is that the growth of data within organizations and the ever-increasing demand for more timely data have introduced a number of threats. In turn, these new demands have created massive variability in the way customers approach their projects, which introduces a host of data integration challenges, especially in production.

So naturally, organizations have taken a variety of approaches to combat these threats, ranging from adding full-time employee or contractor teams to test and monitor workflows, to developing customized scripts, or a combination of both. OK, so there are monitoring tools out there. But the fact remains that generic monitoring tools don't uncover deep data integration issues. This was discussed at length here. Additionally, typical testing efforts such as "stare and compare" and hand-coding are manual, unrepeatable and unauditable, as discussed here.

Now recall the point about the added pressure of explosive data growth and increasing demands for timely delivery, and add to it the wide variability in how organizations approach projects: it's scary. What's more concerning is that this variability riddles production environments with errors, inefficiencies, and security threats. Manual and reactive approaches to remedy the problem only exacerbate issues by increasing complexity. The result is delays in identifying, monitoring and fixing issues, typically handled through fire drills, manual workarounds and reactive measures that never address the underlying problems.
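For the curious, here is a rough sketch of what a pre-emptive check can look like: compare each run of a workflow against its own history and raise an alert before a bad load ever reaches an end user's report. It's generic Python with hypothetical numbers, not Informatica's Proactive Monitoring product, but it shows the shift from reactive to proactive.

```python
import statistics

# Hypothetical run history: (workflow_name, duration_seconds, rows_loaded).
history = [
    ("nightly_dw_load", 3600, 1_200_000),
    ("nightly_dw_load", 3700, 1_180_000),
    ("nightly_dw_load", 3550, 1_210_000),
]

def check_run(workflow, duration, rows, history, tolerance=0.3):
    """Flag a run whose duration or row count deviates from the workflow's norm."""
    past = [(d, r) for w, d, r in history if w == workflow]
    if len(past) < 3:
        return []  # not enough history to judge
    durations, row_counts = zip(*past)
    alerts = []
    if abs(duration - statistics.mean(durations)) > tolerance * statistics.mean(durations):
        alerts.append(f"{workflow}: duration {duration}s is outside the normal range")
    if abs(rows - statistics.mean(row_counts)) > tolerance * statistics.mean(row_counts):
        alerts.append(f"{workflow}: row count {rows} is outside the normal range")
    return alerts

# A run that loaded far fewer rows than usual raises an alert before anyone
# sees a bad report the next morning.
for alert in check_run("nightly_dw_load", 3620, 400_000, history):
    print("ALERT:", alert)
```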

If any of this sounds familiar to you, you may want to take advantage of what we are doing at Informatica World 2013 to help you address these challenges. To make things convenient for you, I have prepared a customized guide to the relevant sessions, hands-on labs and booths, showing how automated, repeatable and auditable testing, along with a pre-emptive approach to defusing threats before they erupt into full-blown issues, can help you. Please feel free to sign up here for some of the breakout sessions we are hosting on this topic, or swing by one of the labs or booths:

BREAKOUT SESSIONS:

Tuesday, June 4

Wednesday, June 5

Thursday, June 6

BIRDS OF A FEATHER ROUNDTABLES (Check Agenda for Daily Timings):

  • Testing Strategies and Tools
  • Automating Administrative Maintenance Tasks

HANDS-ON LABS (Check Agenda for Daily Timings):

  • Table 42 – Informatica Data Validation
  • Table 43 – Informatica Proactive Monitoring

BOOTHS (Check Agenda for Daily Timings):

  • PowerCenter Developer Productivity and Production Manageability

 

I look forward to seeing you at Informatica World 2013.

Posted in Data Integration, Informatica Events