Tag Archives: PowerCenter

Popular Informatica Products are Now Fully Supported on AWS EC2 for Greater Agility

An increasing number of companies around the world are moving to cloud-first or hybrid architectures to process their data for new analytics applications.  In addition to adding new data sources from SaaS (Software as a Service) applications to their data pipelines, they are hosting some or all of their data storage, processing and analytics in IaaS (Infrastructure as a Service) public hosted environments to augment on-premise systems. To enable our customers to take advantage of the benefits of IaaS options, Informatica is embracing this computing model.

As announced today, Informatica now fully supports running the traditionally on-premise Informatica PowerCenter, Big Data Edition (BDE), Data Quality and Data Exchange on Amazon Web Services (AWS) Elastic Compute Cloud (EC2).  This provides customers with added flexibility, agility and faster time-to-production by enabling a new deployment option for running Informatica software.

Existing and new Informatica customers can now choose to develop and/or deploy data integration, quality and data exchange in AWS EC2 just as they would on on-premise servers.  There is no need for any special licensing as Informatica’s standard product licensing now covers deployment on AWS EC2 on the same operating systems as on-premise.  BDE on AWS EC2 supports the same versions of Cloudera and Hortonworks Hadoop that are supported on-premise.

Customers can install these Informatica products on AWS EC2 instances just as they would on servers running on an on-premise infrastructure. The same award-winning Informatica Global Customer Service that thousands of Informatica customers rely on is available and standing by to help with success on AWS EC2. Informatica Professional Services is also available to assist customers running these products on AWS EC2, just as it is for on-premise system configurations.
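
For teams that script their infrastructure, the provisioning side of this deployment option looks the same as for any other EC2 workload. Below is a minimal sketch using Python and boto3; the AMI ID, instance type, key pair, security group and volume size are placeholders you would replace with values that match your own sizing requirements. It is not an Informatica installation procedure, only an illustration of launching the instance the software would later be installed on.

```python
# Hypothetical sketch: launch an EC2 instance that will later host an Informatica
# domain node. All identifiers below are placeholders, not recommended values.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",             # placeholder: a supported Linux AMI
    InstanceType="m3.2xlarge",                   # placeholder: size to your workload
    KeyName="my-keypair",                        # placeholder key pair for SSH access
    SecurityGroupIds=["sg-0123456789abcdef0"],   # placeholder security group
    MinCount=1,
    MaxCount=1,
    BlockDeviceMappings=[{
        "DeviceName": "/dev/sda1",
        "Ebs": {"VolumeSize": 200, "VolumeType": "gp2"},  # room for installers and logs
    }],
)
print("Launched instance:", response["Instances"][0]["InstanceId"])
```

From there, the Informatica installer runs on the instance exactly as it would on an on-premise server.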

Informatica customers can accelerate their time to production or experimentation with the added flexibility of installing Informatica products on AWS EC2 without having to wait for new servers to arrive.  There is the flexibility to develop in the cloud and deploy production systems on-premise or develop on-premise and deploy production systems in AWS.  Cloud-first companies can keep it all in the cloud by both developing and going into production on AWS EC2.

Customers can also benefit from the lower up-front costs, lower maintenance costs and pay-as-you-go infrastructure pricing of AWS.  Instead of having to pay upfront for servers and manage them in an on-premise data center, customers can run Informatica products on virtual servers in AWS. Customers can use existing Informatica licenses or purchase them in the standard way from Informatica for use on top of AWS EC2.

Combined with the ease of use of Informatica Cloud, Informatica now offers customers looking for hybrid and cloud solutions even more options.

Read the press release, including supporting quotes from AWS and Informatica customer ProQuest, here.


Informatica Supports New Custom ODBC/JDBC Drivers for Amazon Redshift

Informatica’s Redshift connector is a state-of-the-art bulk-load connector that allows users to perform all CRUD operations on Amazon Redshift. It makes use of AWS best practices to load data at high throughput in a safe and secure manner, and it is available on Informatica Cloud and PowerCenter.

Today we are excited to announce support for Amazon’s newly launched custom JDBC and ODBC drivers for Redshift. Both drivers are certified for Linux and Windows environments.

Informatica’s Redshift connector will package the JDBC 4.1 driver, which further enhances our metadata fetch capabilities for tables and views in Redshift. This improves overall design-time responsiveness by over 25%. It also allows us to query multiple tables and views and retrieve the result set using primary and foreign key relationships.

Amazon’s ODBC driver enhances our Full Push Down Optimization capabilities on Redshift. Key differentiators include support for the SYSDATE variable and for functions such as ADD_TO_DATE(), ASCII(), CONCAT(), LENGTH(), TO_DATE() and VARIANCE(), which were not possible before.

Amazon’s ODBC driver is not pre-packaged, but it can be downloaded directly from Amazon’s S3 store.

Once the driver is installed, the user can change the default ODBC System DSN in the ODBC Data Source Administrator.
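
As a quick sanity check after the DSN is configured, any ODBC-capable client should be able to reach the cluster through it. Here is a minimal sketch using Python and pyodbc; the DSN name, user and password are placeholders, and this is a generic ODBC connectivity test rather than anything specific to the Informatica connector.

```python
# Hypothetical sketch: verify that the Amazon Redshift ODBC DSN resolves and the
# cluster is reachable. The DSN name and credentials below are placeholders.
import pyodbc

conn = pyodbc.connect("DSN=Amazon Redshift DSN;UID=awsuser;PWD=example-password",
                      autocommit=True)
cursor = conn.cursor()

# current_database() and version() are lightweight calls that confirm connectivity.
cursor.execute("SELECT current_database(), version();")
print(cursor.fetchone())

conn.close()
```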

To learn more, sign up for the free trial of Informatica’s Redshift connector for Informatica Cloud or PowerCenter.


How to Ace Application Migration & Consolidation (Hint: Data Management)

Myth Vs Reality: Application Migration & Consolidation (No, it’s not about dating)

Will your application consolidation or migration go live on time and on budget?  According to Gartner, “through 2019, more than 50% of data migration projects will exceed budget and/or result in some form of business disruption due to flawed execution.”1  That is a scary number by any measure. A colleague of mine put it well: ‘I wouldn’t get on a plane that had a 50% chance of failure.’ So should you be losing sleep over your migration or consolidation project? Well, that depends.  Are you the former CIO of Levi Strauss, who, according to Harvard Business Review, was forced to resign over a botched SAP migration project and a $192.5 million earnings write-off?2  If so, perhaps you would feel a bit apprehensive. Otherwise, I say you can be cautiously optimistic, provided you go into it with a healthy dose of reality: a good understanding of the potential pitfalls and how to address them, and an appreciation for the myths and realities of application consolidation and migration.

First off, let me get one thing off my chest.  If you don’t pay close attention to your data throughout the application consolidation or migration process, you are almost guaranteed delays and budget overruns. Data consolidation and migration is at least 30%-40% of the application go-live effort. We have learned this by helping customers deliver over 1,500 projects of this type.  What’s worse, if you are not super meticulous about your data, you can be assured of encountering unhappy business stakeholders at the end of this treacherous journey. The users of your new application expect all their business-critical data to be there at the end of the road. All the bells and whistles in your new application will count for naught if the data falls apart.  Imagine, if you will, students’ transcripts gone missing, or your frequent-flyer balance 100,000 miles short!  Need I say more?  Now, you may already be guessing where I am going with this.  That’s right, we are talking about the myths and realities related to your data!  Let’s explore a few of these.

Myth #1: All my data is there.

Reality #1: It may be there… but can you get it? If you want to find, access and move all the data out of your legacy systems, you must have a good set of connectivity tools to easily and automatically find, access and extract the data from your source systems. You don’t want to hand-code this for each source.  Ouch!

Myth #2: I can just move my data from point A to point B.

Reality #2: You can try that approach if you want.  However, you might not be happy with the results.  The reality is that there can be significant gaps and format mismatches between the data in your legacy system and the data required by your new application. Additionally, you will likely need to assemble data from disparate systems. You need sophisticated tools to profile, assemble and transform your legacy data so that it is purpose-fit for your new application.

Myth #3: All my data is clean.

Reality #3: It’s not. And here is a tip: you had better profile, scrub and cleanse your data before you migrate it. You don’t want to put a shiny new application on top of questionable data. In other words, let’s get a fresh start on the data in your new application!

Myth #4: All my data will move over as expected.

Reality #4: It will not.  Any time you move and transform large sets of data, there is room for logical or operational errors and surprises.  The best way to avoid this is to automatically validate that your data has moved over as intended.
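
What that automated validation looks like varies by tool, but the core idea is simple: compare the same facts about the data on both sides of the move. Below is a minimal sketch in Python, assuming pyodbc and placeholder DSNs, tables and columns; it illustrates the reconciliation concept only and is not a description of any particular Informatica product.

```python
# Hypothetical sketch of post-migration reconciliation: compare row counts and a
# numeric total between a source and a target table. All names are placeholders.
import pyodbc

def table_profile(conn_str, table, amount_col):
    """Return (row_count, total_amount) for a quick reconciliation check."""
    conn = pyodbc.connect(conn_str)
    try:
        cursor = conn.cursor()
        cursor.execute(
            f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {table}"
        )
        count, total = cursor.fetchone()
        return count, float(total)
    finally:
        conn.close()

source = table_profile("DSN=LegacySource;UID=user;PWD=secret", "orders", "order_amount")
target = table_profile("DSN=NewTarget;UID=user;PWD=secret", "orders", "order_amount")

if source == target:
    print("PASS: row counts and totals match:", source)
else:
    print("FAIL: source", source, "does not match target", target)
```

In practice you would extend this to column-level checksums and rule-based comparisons, and run it automatically after every load rather than once.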

Myth #5: It’s a one-time effort.

Reality #5: ‘Load and explode’ is a formula for disaster.  Our proven methodology recommends you first prototype your migration path and identify a small subset of the data to move over. Then test it, tweak your model, try it again and gradually expand.  More importantly, your application architecture should not be a one-time effort.  It is a work in progress and really an ongoing journey.  Regardless of where you are on this journey, we recommend paying close attention to managing your application’s data foundation.

As you can see, there is a multitude of data issues that can plague an application consolidation or migration project and lead to its doom.  These potential challenges are not always recognized and understood early on.  This perception gap is a root cause of project failure. This is why we are excited to host Philip Russom of TDWI in our upcoming webinar to discuss data management best practices and methodologies for application consolidation and migration. If you are undertaking any IT modernization or rationalization project, such as consolidating applications or migrating legacy applications to the cloud or to an ‘on-prem’ application such as SAP, this webinar is a must-see.

So what’s your reality going to be like?  Will your project run like a dream or will it escalate into a scary nightmare? Here’s hoping for the former.  And also hoping you can join us for this upcoming webinar to learn more:

Webinar with TDWI:
Successful Application Consolidation & Migration: Data Management Best Practices.

Date: Tuesday March 10, 10 am PT / 1 pm ET

Don’t miss out, Register Today!

1) Gartner report titled “Best Practices Mitigate Data Migration Risks and Challenges” published on December 9, 2014

2) Harvard Business Review: ‘Why your IT project may be riskier than you think’.


The Billion Dollar (Data Integration) Mistake

How would you like to wake up to an extra billion dollars, or maybe nine, in the bank? This happened to a teacher in India, who discovered, to his astonishment, a balance of $9.8 billion in his bank account!

How would you like to be the bank that gave a client an extra nine billion dollars? Oh, to be a fly on the wall when the IT department got that call. How do you even begin to explain? Imagine the scrambling to track down the source of the data error.

This was a glaringly obvious error, and one that was easily caught. But there is potential for many smaller data errors, which may go undetected and add up, hurting your bottom line.  How could this type of data glitch happen? More importantly, how can you protect your organization from these types of errors in your data?

A primary source of data mistakes is insufficient testing during Data Integration. Any change or movement of data carries risk to its integrity. Unfortunately, there are often insufficient IT resources to adequately validate the data. Some organizations validate the data manually, which is a lengthy, unreliable process, fraught with data errors. Furthermore, manual testing does not scale well to large data volumes or complex data changes, so the validation is often incomplete. Finally, some organizations simply lack the resources to conduct any data validation at all.


Many of our customers have been able to successfully address this issue via automated data validation testing (also known as DVO). In a recent TechValidate survey, Informatica customers told us that they:

  • Reduce costs associated with data testing.
  • Reduce time associated with data testing.
  • Increase IT productivity.
  • Increase the business trust in the data.

Customers tell us some of the biggest potential costs relate to the damage control that occurs when something goes wrong with their data. The tale above, of our fortunate man and not-so-fortunate bank, is one example. Bad data can hurt a company’s reputation and lead to untold losses in market share and customer goodwill.  In today’s highly regulated industries, such as healthcare and financial services, the consequences of incorrect data can be severe, including heavy fines or worse.

Using automated data validation testing allows customers to save on ongoing testing costs and deliver reliable data. Just as important, it prevents pricey data errors, which require costly and time-consuming damage control. It is no wonder many of our customers tell us they are able to recoup their investment in less than 12 months!


The TechValidate survey shows that customers are using data validation testing in a number of common use cases, including:

  • Regression (Unit) testing
  • Application migration or consolidation
  • Software upgrades (Applications, databases, PowerCenter)
  • Production reconciliation

One of the most beneficial use cases for data validation testing has been application migration and consolidation. Many SAP migration projects undertaken by our customers have greatly benefited from automated data validation testing.  Application migration or consolidation projects are typically large and risky. A Bloor Research study has shown that 38% of data migration projects fail, incur overages or are aborted altogether. According to a Harvard Business Review article, 1 in 6 large IT projects runs 200% over budget. Poor data management is one of the leading pitfalls in these types of projects. However, according to Bloor Research, Informatica’s data validation testing is a capability they have not seen elsewhere in the industry.

A particularly interesting example of this use case arises in an M&A situation. The merged company is required to deliver ‘day-1 reporting’. However, FTC regulations forbid the separate entities from seeing each other’s data prior to the merger. What a predicament! The automated nature of data validation testing (automatically deploying preconfigured rules on large data sets) enables our customers to prepare for successful day-1 reporting under these harsh conditions.

And what about you?  What are the costs to your business for potentially delivering incorrect, incomplete or missing data? To learn more about how you can provide the right data on time, every time, please visit www.datavalidation.me


Getting Value Out of Data Integration

This post is by Philip Howard, Research Director at Bloor Research.

Live Bloor Webinar, Nov 5

One of the standard metrics used to support buying decisions for enterprise software is total cost of ownership. Typically, the other major metric is functionality. However, functionality is ephemeral. Not only does it evolve with every new release, but while particular features may be relevant to today’s project, there is no guarantee that those same features will be applicable to tomorrow’s needs. A broader metric than functionality is capability: how suitable is this product for a range of different project scenarios, and will it support both simple and complex environments?

Earlier this year Bloor Research published some research into the data integration market, which investigated exactly these issues: how often were tools reused, how many targets and sources were involved, and for what sort of projects were products deemed suitable? We then compared these findings with the total cost of ownership figures we also captured in our survey. I will be discussing the results of our research live with Kristin Kokie, the interim CIO of Informatica, on Guy Fawkes’ Day (November 5th). I don’t promise anything explosive, but it should be interesting and I hope you can join us. The discussion will be vendor neutral (mostly: I expect that Kristin has a degree of bias).

To register for the webinar, click here.


The Swiss Army Knife of Data Integration

Back in 1884, a man had a revolutionary idea; he envisioned a compact knife that was lightweight and would combine the functions of many stand-alone tools into a single tool. This idea became what the world has known for over a century as the Swiss Army Knife.

This creative problem-solving came from a request by the Swiss Army to build a soldier’s knife.  In the end, the solution was all about getting the right tool for the right job in the right place. In many cases soldiers didn’t need industrial-strength tools; all they really needed was a compact, lightweight tool to get the job at hand done quickly.

Putting this into perspective with today’s world of Data Integration, using enterprise-class data integration tools for the smaller data integration project is overkill and typically out of reach for the smaller organization. However, these smaller data integration projects are just as important as the larger enterprise projects, and they are often the innovation behind a new way of business thinking. The traditional hand-coding approach to the smaller data integration project is not scalable, not repeatable and prone to human error; what’s needed is a compact, flexible and powerful off-the-shelf tool.

Thankfully, over a century after the world embraced the Swiss Army Knife, someone at Informatica was paying attention to revolutionary ideas. If you’ve not yet heard the news about the Informatica platform, a free version called PowerCenter Express has been released, so you can use it to handle an assortment of what I’d characterize as high-complexity / low-volume data integration challenges and experience a subset of the Informatica platform for yourself. I’d emphasize that PowerCenter Express doesn’t replace the need for Informatica’s enterprise-grade products, but it is ideal for rapid prototyping, data profiling and developing quick proofs of concept.

PowerCenter Express provides a glimpse of the evolving Informatica platform by integrating four Informatica products into a single, compact tool. There are no database dependencies, and the product installs in just under 10 minutes. Much to my own surprise, I use PowerCenter Express quite often in the various aspects of my job at Informatica. I have it installed on my laptop, so it travels with me wherever I go. It starts up quickly, so it’s ideal for getting a little work done on an airplane.

For example, I recently wanted to explore building some rules for an upcoming proof of concept on a plane ride home, so I could claw back some personal time for my weekend. I used PowerCenter Express to profile some data and create a mapping.  And this mapping wasn’t something I needed to throw away and recreate in an enterprise version after my flight landed. Vibe, Informatica’s build-once / run-anywhere metadata-driven architecture, allows me to export a mapping I create in PowerCenter Express to one of the enterprise versions of Informatica’s products, such as PowerCenter, Data Quality or Informatica Cloud.

As I alluded to earlier in this article, PowerCenter Express being a free offering, I honestly didn’t expect too much from it when I first started exploring it. However, based on my own positive experiences, I now like to think of PowerCenter Express as the Swiss Army Knife of Data Integration.

To start claiming back some of your personal time, get started with the free version of PowerCenter Express, found on the Informatica Marketplace at:  https://community.informatica.com/solutions/pcexpress


A Data Integration Love-Fest in Vegas

Question: What do American Airlines, Liberty Mutual, Discount Tire and MD Anderson all have in common?

Is it?

a) They are all top in their field.

b) They all view data as critical to their business success.

c) They are all using Agile Data Integration to drive business agility.

d) They have spoken about their Data Integration strategy at Informatica World in Vegas.

Did you reply ‘all of the above’? If so, give yourself a Ding Ding Ding. Or shall we say Ka-Ching, in honor of our host city?

Indeed, data experts from these companies and many more flocked to Las Vegas for Informatica World.  They shared their enthusiasm for the important role of data in their businesses.  These industry leaders discussed best practices that facilitate an Agile Data Integration process.

American Airlines recently completed a merger with US Airways, making it the largest airline in the world. In order to meet critical reporting requirements for the merged airlines, the enterprise data team undertook a huge Data Integration task.  This effort involved large-scale data migration and included many legacy data sources.  The project required transferring over 4 TB of current history data for Day 1 reporting. A major task still remains: integrating multiple combined subject areas in order to give a full picture of combined reporting.

American Airlines architects recommend the use of Data Integration design patterns in order to improve agility.  The architects shared success factors for merger Data Integration: ownership by leadership from both IT and business, open and honest communication between teams, the need to identify integration teams and priorities, and the importance of understanding cultural differences and celebrating success.  The team summarized the merger Data Integration lessons learned: metadata is key, IT and business collaboration is critical, and profiling and access to the data are helpful.

Liberty Mutual, the third-largest property and casualty insurer in the US, has grown through acquisitions, and the Data Integration team needs to support this business process.  They have been busy integrating five claim systems into one, a large-scale Data Integration challenge. To add to the complexity, the business requires that each phase be completed in one weekend, that no data be lost in the process and that all finances balance out at the end of each merge.  Integrating all claims in a single location was critical for smooth processing of insurance claims.  A single system also reduces the cost and complexity of support and maintenance.

Liberty Mutual experts recommend a methodology of work preparation, profiling, delivery and validation.  Rinse and repeat. Additionally, the company chose to utilize a visual Data Integration tool. This tool was quick and easy for the team to learn and greatly enhanced development agility.

Discount Tire, the largest independent tire dealer in the USA, shared tips and tricks from migrating legacy data into a new SAP system.  This complex project included data conversion from 50 legacy systems.  The company needed to combine and aggregate data from many systems, including customer, sales, financial and supply chain data.  This integrated system helps Discount Tire make key business decisions and stay ahead in a highly competitive space.

Discount Tire has automated its data validation process in both development and production. This reduces testing time, minimizes data defects and increases the agility of development and operations. The company has also implemented proactive monitoring in order to detect and correct data problems in production early.

MD Anderson Cancer Center is the No. 1 hospital for cancer care in the US according to U.S. News and World Report.  They are pursuing the lofty goal of erasing cancer from existence. Data Integration is playing an important role in this fight against cancer. In order to accomplish their goal, MD Anderson researchers rely on integration of vast amounts of genomic, clinical and pharmaceutical data to facilitate leading-edge cancer research.

MD Anderson experts pursue Agile Data Integration through close collaboration between IT and business stakeholders.  This enables them to meet the data requirements of the business faster and better. They shared that data insights, delivered through metadata management, offer significant value to the organization. Finally, the experts at MD Anderson believe in ‘Map Once, Deploy Anywhere’ in order to accomplish Agile Data Integration.

So let’s recap. Data Integration is helping:

– An airline continue to serve its customers and run its business smoothly post-merger,

– A tire retailer procure and provide tires to its customers and maintain its market leadership,

– An insurance company process claims accurately and in a timely manner while minimizing costs, and

– A cancer research center cure cancer.

Not too shabby, right? Data Integration is clearly essential to business success!

So OK, I know, I know… what happens in Vegas stays in Vegas. Still, this was one love-fest I was compelled to share! Wish you were there. Hopefully you will be next year!

To learn more about Agile Data Integration, check out this webinar: Great Data by Design II: How to Get Started with Next-Gen Data Integration

 


Agile Data Integration in Action: PowerCenter 9.6 Demo

A Data Integration Developer, a Data Analyst and a Business Analyst walk into a bar… Heard that one? You probably haven’t. They never made it to the bar. They are still back at the office, going back and forth for the umpteenth time on the data requirements for the latest report…

Sound familiar? If so, you are not alone. Many IT departments are struggling to meet the data needs of their business counterparts. Spreadsheets, emails and cocktail napkins have not proven to be effective tools for relaying the business’s data requirements. The process takes too long and leaves both sides frustrated and dissatisfied with the outcome. IT does not have the bandwidth to meet the ever-increasing and rapidly changing data needs of the business.

The old-fashioned “waterfall” approach to data integration simply won’t cut it anymore in today’s fast-paced, data-driven world. There has to be a better way. Here at Informatica, we believe that an end-to-end Agile Data Integration process can greatly increase business agility.

We start with a highly collaborative, iterative process in which IT and the analyst work closely together to define data integration requirements. IT empowers the analyst with self-service tools that enable rapid prototyping and data profiling. Once the analyst is happy with the data they have accessed and combined, they can use their tool to seamlessly share the output with IT for final deployment. This approach greatly reduces the time-to-data, and not just any data, the right data!

The ability to rapidly generate reports and deliver new critical data for decision-making is foundational to business agility. Another important aspect of business agility is the ability to scale your system as your needs grow to support more data, data types, users and projects. We accomplish that through advanced scaling capabilities, such as grid support and high availability, leading to zero downtime, as well as improved data insights through metadata management, lineage, impact analysis and business glossary.

Finally, we need to continue to ensure agility when our system is in production. Data validation should be performed to eliminate data defects. Trying to manually validate data is like looking for a needle in a haystack, very slowly… Automating your data validation process is fast and reliable, ensuring that the business gets accurate data all the time.

It is just as important to become more proactive and less reactive when it comes to your data in production. Early detection of data process and workflow problems through proactive monitoring is key to prevention.
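
The shape of such monitoring can be very simple. Below is a minimal sketch in Python that polls a hypothetical workflow run-log table and flags failures or unusually long runs; the DSN, table and column names are illustrative placeholders, and this is not a description of Informatica’s own monitoring features.

```python
# Hypothetical sketch of proactive monitoring: scan today's workflow runs for
# failures or long runtimes and print alerts. All names are placeholders.
import pyodbc

MAX_RUNTIME_MINUTES = 60  # assumed threshold for a "long-running" workflow

conn = pyodbc.connect("DSN=OpsRepository;UID=monitor;PWD=secret")
cursor = conn.cursor()
cursor.execute(
    "SELECT workflow_name, status, runtime_minutes "
    "FROM workflow_run_log WHERE run_date = CURRENT_DATE"
)

for workflow_name, status, runtime_minutes in cursor.fetchall():
    if status != "SUCCEEDED":
        print(f"ALERT: {workflow_name} finished with status {status}")
    elif runtime_minutes > MAX_RUNTIME_MINUTES:
        print(f"WARNING: {workflow_name} ran for {runtime_minutes} minutes")

conn.close()
```

In a real deployment the checks would run on a schedule and the alerts would feed email or a ticketing system rather than standard output.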

Would you like to see a 5X increase in the speed of delivering data integration projects?

Would you like to provide the system reliability you need as your business grows, and ensure that your business continues to get the critical data it requires without defects and without interruption?

To learn more about how Agile Data Integration can enable business agility, please check out the demonstration of the newly released PowerCenter 9.6, featuring David Lyle, VP of Product Strategy at Informatica, and the Informatica Product Desk experts. This demo webinar is available on demand.

Deep Dive Demo: Informatica PowerCenter 9.6.


Data Integration in Action at Informatica World

Wouldn’t you like to have been a fly on the wall when American Airlines and US Airways experts got together to integrate their data systems into one cohesive post-merger system?

Now you can experience the next best thing by attending InformaticaWorld 2014 and hearing the American Airlines and US Airways data architects talk about the data challenges they faced. They will discuss the role of architecture in M&A, integrating legacy data, lessons learned, and best practices in Data Integration.

While you are at the show, you will have the opportunity to hear many industry experts discuss current trends in Agile end-to-end Data Integration.  

Agile Data Integration Development
To deliver the agility that your business requires, IT and business must pursue a collaborative Data Integration process, with the appropriate analyst self-service Data Integration tools.  At InformaticaWorld, you can learn about Agile Data Integration development from the experts at GE Aviation, who will discuss Agile Data Integration for Big Data Analytics. Experts from Roche will discuss how Agile Data Integration has led to a 5x reduction in development time, improved business self-service capabilities and increased data credibility.

Scalability
Another aspect of agility is your ability to scale your Data Warehouse to rapidly support more data, data sources, users and projects.  Come hear the experts from Liberty Mutual share challenges, pitfalls, best practices and recommendations for those considering large-scale Data Integration projects, including successful implementation of complex data migrations, data quality and data distribution processes.

Operational Confidence
The management of an enterprise-scale Data Warehouse involves the operation of a mature, complex, mission-critical environment, which is commonly driven through an Integration Competency Center (ICC) initiative.  You now need to inspect and adapt your production system and expedite data validation and monitoring through automation, so that data issues can be caught and corrected quickly and resources can be freed up to focus on development.

The experts from University of Pittsburgh Medical Center, along with Informatica Professional Services experts, will discuss best practices, lessons learned and the process of transitioning from ‘analytics as project’ to an enterprise initiative through the use of an Integration Competency Center. 

Hear from the Informatica Product Experts
You will have many opportunities to hear directly from the Informatica product experts about end-to-end Data Integration Agility delivered in the recent 9.6 release of PowerCenter.

See PowerCenter 9.6 in Action
Don’t miss the opportunity to see live demos of the cool new features of PowerCenter 9.6 release at the multitude of hands-on labs being offered at InformaticaWorld this year. 

For example, you can learn how to empower business users through self-service Data Integration with the PowerCenter Analyst tool; how to reduce testing time of Data Integration projects through automated validation tests; and how to scale your Data Integration with High Availability and Grid.

The sessions described here are a sampling of the rich variety of Data Integration sessions that will be offered at the show.  We hope that you will join us at InformaticaWorld this year in Las Vegas on May 13-15, and as you plan your visit, please check out the complete listing of sessions and labs focused on Data Integration.

Please feel free to leave a comment and let us know which InformaticaWorld session(s) you are most looking forward to!  See you there!
