Tag Archives: PowerCenter

Introducing the Informatica Technology Partner Network

Enabling ISVs to Connect to More Data

Data is critical to application growth. Bringing additional data into your application is costly, and time spent on point-to-point integration takes time away from introducing new features.

Today, Informatica is releasing the Informatica Technology Partner Network (TPN) – an online developer portal that makes it easy for Independent Software Vendors (ISVs) to build connectors and access more application data. The Technology Partner Network provides ISVs with everything they need to fast-track cloud and hybrid connectivity with Informatica, including access to the following:

• Informatica development environment and connector toolkit
• Interactive REST API, instant API mock server and automated testing
• Technical resources, samples and adapter tester
• Developer community forum

The TPN provides developers with a development environment to load a connector toolkit (SDK) and immediately begin building their connector. The open and interactive REST API provides a space to learn, share and experience functionality without writing any code. A debugging proxy provides more detail on the request and response of the API call and can point to a mock server. These tools enable ISVs to build a prototype in a day and complete their connector development in just a couple of weeks.
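The mock-server idea above can be sketched with nothing but the standard library: stand up a throwaway local endpoint that returns canned JSON, then exercise it with a plain client call. The `/objects` route and its payload are invented for this illustration; the real TPN mock server defines its own routes and schemas.

```python
# Toy sketch of prototyping against a mock API server: a local endpoint
# serves canned JSON, and a client call exercises it without any real backend.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

CANNED = {"objects": [{"name": "Account"}, {"name": "Contact"}]}

class MockHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps(CANNED).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep request logging quiet
        pass

server = HTTPServer(("127.0.0.1", 0), MockHandler)  # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

with urlopen(f"http://127.0.0.1:{server.server_port}/objects") as resp:
    payload = json.load(resp)

server.shutdown()
print([o["name"] for o in payload["objects"]])  # → ['Account', 'Contact']
```

A debugging proxy slots in between the client and a server like this one, exposing the raw request and response of each call.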

The Informatica Vibe™ platform – a virtual data machine (VDM) – provides the underlying data management engine that allows ISVs to transform application data. Informatica is also introducing the Vibe Ready Partner Program, created exclusively for ISVs.

ISVs who join the Informatica Vibe Ready Partner Program gain access – at no cost – to the following:

• Pre-built Informatica connectors, mappings and end-user starter kit
• Informatica cloud sandbox multi-user instance
• Informatica software development not-for-resale (NFR) license Pack
• Connector developer support and certification
• Vibe Ready partner and certification logos
• 1-click activation of the connector on the Informatica Marketplace

ISVs that complete the Vibe Ready Certification can provide their customers with a 1-click trial or paid edition on the Informatica Marketplace. The Informatica Marketplace enables customers to search by application, connector or bundle. The Vibe Ready logo provides a simple way for customers to identify solutions that Informatica has certified.

Enabling a successful ISV ecosystem around the Vibe platform is a cornerstone of our business strategy. The Technology Partner Network and Informatica Vibe Ready Partner Program will enable our ISVs to make their data clean, connected and safe.

Here are some additional resources to get started developing with Informatica:

• Explore the Technology Partner Network

• Register for the Informatica Vibe Ready Partner Program

• Technology Partner Network questions? Email us


Everybody’s Doing It! Learn How at Informatica World 2015!


Learn more at Informatica World 2015

Everybody’s doing it.  And if not, they say they are doing it anyway!  Are you doing it?  We all hear the mantra: ‘Data is an asset’.  Everybody wants to get in on the action.  Hear at Informatica World 2015 how Informatica customers are using data integration agility to drive business agility.  These organizations are relying on the Informatica platform as their data foundation.  Does that sound a bit vague?  Let’s get more specific here…

Who hasn’t used PayPal to send a secure payment?  Would you like to know how PayPal is managing the data integration architecture to support and analyze 11.6 million payments per day? Hear PayPal’s Architect chat with Informatica PowerCenter Product Managers.  They will discuss Advanced Scaling, Metadata Management and Business Glossary.  Would you like to learn how these PowerCenter capabilities can benefit your business? Add this session to your IW15 registration.

Verizon’s Architect will talk about consolidating 50+ legacy applications into a new application architecture. Wow, that’s a massive data management effort!  Are you curious to learn more about how to successfully manage an application modernization effort of this scope using Informatica tools?  Add this session to your IW15 registration.

Did you know that HP boasts one of the most complex and largest Informatica installations on the face of the earth?  HP’s Informatica Shared Services architecture allows hundreds of projects throughout HP worldwide to use PowerCenter data integration capabilities.  And they do so easily and cost effectively.  Join the session to gain insight on the design, architecture, creation, support, and overall governance of this solution.  Would you like to learn more from HP on how to maximize the benefits of your Informatica investment?  Add this session to your IW15 registration.

You have probably had your tires replaced at Discount Tire.  Would you like to learn more about how Discount Tire leverages the Informatica Platform to gain business advantage?  Discount Tire’s architect will share how they test and monitor their Data Integration environment using PowerCenter capabilities, such as Data Validation Testing and Proactive Monitoring.  They will discuss the benefit of doing so across multiple use cases.  They will highlight Discount Tire’s migration to a new SAP application and how the Informatica platform supports this migration. Would you like to improve how you test and monitor your critical business processes?  Add this session to your IW15 registration.

Finally, if your organization is planning any type of application modernization or rationalization, you will not want to miss this Informatica customer panel.  This session will feature experts from Cisco, Verizon and Discount Tire.  Speakers will share their experience, insights and best practices for Application Consolidation and Migration.  As we discussed in a previous blog post, failure rates for these projects are staggeringly high.  Arm yourself with proven best practices for data management, which have been shown to increase the success of application modernization projects! To learn more, add this session to your IW15 registration.


Register now for Informatica World 2015

We hope you can join us at Informatica World 2015!  Come and enjoy the wealth of experience and insights shared by these industry experts and many others.  See you there!


Popular Informatica Products are Now Fully Supported on AWS EC2 for Greater Agility



An increasing number of companies around the world are moving to cloud-first or hybrid architectures for new systems that process their data for new analytics applications.  In addition to adding new data sources from SaaS (Software as a Service) applications to their data pipelines, they are hosting some or all of their data storage, processing and analytics in IaaS (Infrastructure as a Service) public hosted environments to augment on-premise systems. To enable our customers to take advantage of the benefits of IaaS options, Informatica is embracing this computing model.

As announced today, Informatica now fully supports running the traditionally on-premise Informatica PowerCenter, Big Data Edition (BDE), Data Quality and Data Exchange on Amazon Web Services (AWS) Elastic Compute Cloud (EC2).  This provides customers with added flexibility, agility and faster time-to-production by enabling a new deployment option for running Informatica software.

Existing and new Informatica customers can now choose to develop and/or deploy data integration, quality and data exchange in AWS EC2 just as they would on on-premise servers.  There is no need for any special licensing as Informatica’s standard product licensing now covers deployment on AWS EC2 on the same operating systems as on-premise.  BDE on AWS EC2 supports the same versions of Cloudera and Hortonworks Hadoop that are supported on-premise.

Customers can install these Informatica products on AWS EC2 instances just as they would on servers running on an on-premise infrastructure. The same award-winning Informatica Global Customer Service that thousands of Informatica customers use is now on call and standing by to help with success on AWS EC2. Informatica Professional Services is also available to assist customers running these products on AWS EC2, as it is for on-premise system configurations.

Informatica customers can accelerate their time to production or experimentation with the added flexibility of installing Informatica products on AWS EC2 without having to wait for new servers to arrive.  There is the flexibility to develop in the cloud and deploy production systems on-premise or develop on-premise and deploy production systems in AWS.  Cloud-first companies can keep it all in the cloud by both developing and going into production on AWS EC2.

Customers can also benefit from the lower up-front costs, lower maintenance costs and pay-as-you-go infrastructure pricing of AWS.  Instead of paying upfront for servers and managing them in an on-premise data center, customers can run Informatica products on virtual servers in AWS. Customers can use existing Informatica licenses or purchase them in the standard way from Informatica for use on AWS EC2.

Combined with the ease of use of Informatica Cloud, Informatica now offers customers looking for hybrid and cloud solutions even more options.

Read the press release, including supporting quotes from AWS and Informatica customer ProQuest, here.


Informatica Supports New Custom ODBC/JDBC Drivers for Amazon Redshift

Informatica’s Redshift connector is a state-of-the-art Bulk-Load type connector which allows users to perform all CRUD operations on Amazon Redshift. It makes use of AWS best practices to load data at high throughput in a safe and secure manner and is available on Informatica Cloud and PowerCenter.

Today we are excited to announce support for Amazon’s newly launched custom JDBC and ODBC drivers for Redshift. Both drivers are certified for Linux and Windows environments.

Informatica’s Redshift connector will package the JDBC 4.1 driver, which further enhances our metadata fetch capabilities for tables and views in Redshift. This improves our overall design-time responsiveness by over 25%. It also allows us to query multiple tables/views and retrieve the result set using primary and foreign key relationships.
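To make the design-time metadata idea concrete, here is a toy sketch using Python’s `sqlite3` purely as a stand-in for the Redshift driver: enumerate the tables a tool would show at design time, then read the key relationships that let it join tables and views sensibly. The schema is invented for illustration.

```python
# Stand-in sketch of the design-time metadata a database driver exposes:
# a table list and the foreign-key relationships between tables.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (
        id INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES customers(id)
    );
""")

# Enumerate tables (roughly what a JDBC driver's getTables() surfaces).
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name")]

# Enumerate foreign-key links (roughly getImportedKeys()).
fks = conn.execute("PRAGMA foreign_key_list(orders)").fetchall()

print(tables)                            # → ['customers', 'orders']
print(fks[0][2], fks[0][3], fks[0][4])   # → customers customer_id id
```

A connector uses exactly this kind of catalog information to pre-populate mappings and follow key relationships across result sets.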

Amazon’s ODBC driver enhances our full Push Down Optimization capabilities on Redshift. Key differentiating factors include support for the SYSDATE variable and for functions such as ADD_TO_DATE(), ASCII(), CONCAT(), LENGTH(), TO_DATE() and VARIANCE(), which weren’t possible before.
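Pushdown optimization means the transformation logic is rendered as SQL and executed inside Redshift rather than row by row in the integration engine. Here is a toy sketch of the idea; the function table and generator are invented for illustration and are not Informatica’s implementation.

```python
# Conceptual sketch of pushdown: translate a mapping expression into SQL
# the target database runs itself, instead of pulling rows out to transform.
PUSHDOWN_FUNCS = {
    "CONCAT": lambda a, b: f"{a} || {b}",   # Redshift concatenation operator
    "LENGTH": lambda a: f"LENGTH({a})",
}

def push_down(func, *cols):
    """Render a mapping expression as SQL for the target database."""
    return PUSHDOWN_FUNCS[func](*cols)

sql = f"SELECT {push_down('CONCAT', 'first_name', 'last_name')} FROM customers"
print(sql)  # → SELECT first_name || last_name FROM customers
```

The richer the function support in the driver, the more of a mapping can be pushed down this way instead of being computed outside the database.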

Amazon’s ODBC driver is not pre-packaged, but it can be downloaded directly from Amazon’s S3 store.

Once installed, the user can change the default ODBC System DSN in the ODBC Data Source Administrator.
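On Linux, the equivalent of the Data Source Administrator is the `odbc.ini` file. The entry below is purely illustrative: the driver path, option names and endpoint follow the pattern of Amazon’s Redshift ODBC driver documentation and may differ by driver version and platform, so check the official docs before using it.

```ini
; Illustrative System DSN for the Amazon Redshift ODBC driver (Linux 64-bit).
; Paths and option names vary by driver version; verify against Amazon's docs.
[Redshift-DSN]
Driver=/opt/amazon/redshiftodbc/lib/64/libamazonredshiftodbc64.so
Host=examplecluster.abc123xyz789.us-west-2.redshift.amazonaws.com
Port=5439
Database=dev
```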


To learn more, sign up for the free trial of Informatica’s Redshift connector for Informatica Cloud or PowerCenter.


How to Ace Application Migration & Consolidation (Hint: Data Management)


Myth Vs Reality: Application Migration & Consolidation (No, it’s not about dating)

Will your application consolidation or migration go live on time and on budget?  According to Gartner, “through 2019, more than 50% of data migration projects will exceed budget and/or result in some form of business disruption due to flawed execution.”1  That is a scary number by any measure. A colleague of mine put it well: ‘I wouldn’t get on a plane that had a 50% chance of failure’. So should you be losing sleep over your migration or consolidation project? Well, that depends.  Are you the former CIO of Levi Strauss, who, according to Harvard Business Review, was forced to resign after a botched SAP migration project and a $192.5 million earnings write-off?2  If so, perhaps you would feel a bit apprehensive. Otherwise, I say you can be cautiously optimistic, if you go into it with a healthy dose of reality. Make sure you have a good understanding of the potential pitfalls and how to address them, and an appreciation for the myths and realities of application consolidation and migration.

First off, let me get one thing off my chest.  If you don’t pay close attention to your data throughout the application consolidation or migration process, you are almost guaranteed delays and budget overruns. Data consolidation and migration is at least 30%-40% of the application go-live effort. We have learned this by helping customers deliver over 1,500 projects of this type.  What’s worse, if you are not super meticulous about your data, you can be sure to encounter unhappy business stakeholders at the end of this treacherous journey. The users of your new application expect all their business-critical data to be there at the end of the road. All the bells and whistles in your new application will matter naught if the data falls apart.  Imagine, if you will, students’ transcripts gone missing, or your frequent-flyer balance 100,000 miles short!  Need I say more?  Now, you may already be guessing where I am going with this.  That’s right, we are talking about the myths and realities related to your data!   Let’s explore a few of these.

Myth #1: All my data is there.

Reality #1: It may be there… but can you get it? If you want to find, access and move all the data out of your legacy systems, you must have a good set of connectivity tools to easily and automatically find, access and extract the data from your source systems. You don’t want to hand-code this for each source.  Ouch!

Myth #2: I can just move my data from point A to point B.

Reality #2: You can try that approach if you want.  However, you might not be happy with the results.  The reality is that there can be significant gaps and format mismatches between the data in your legacy system and the data required by your new application. Additionally, you will likely need to assemble data from disparate systems. You need sophisticated tools to profile, assemble and transform your legacy data so that it is purpose-fit for your new application.
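To illustrate the kind of gap-bridging involved, here is a toy transformation that renames fields, normalizes a legacy date format and merges values from two source systems. All field names and formats are invented for this sketch.

```python
# Toy sketch of reshaping legacy records to fit a new application's schema:
# rename fields, normalize a date format, and merge two source systems.
from datetime import datetime

legacy_crm = {"CUST_NM": "Acme Corp", "SIGNUP_DT": "03/10/2015"}
legacy_billing = {"cust_name": "Acme Corp", "balance_cents": 125000}

def transform(crm, billing):
    return {
        "customer_name": crm["CUST_NM"],
        # Legacy stores MM/DD/YYYY; the new application expects ISO 8601.
        "signup_date": datetime.strptime(
            crm["SIGNUP_DT"], "%m/%d/%Y").date().isoformat(),
        # The billing system keeps cents; the target schema wants dollars.
        "balance": billing["balance_cents"] / 100,
    }

record = transform(legacy_crm, legacy_billing)
print(record["signup_date"], record["balance"])  # → 2015-03-10 1250.0
```

Real projects have hundreds of such rules, which is why profiling and transformation tooling beats hand-coding each one.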

Myth #3: All my data is clean.

Reality #3:  It’s not. And here is a tip: better to profile, scrub and cleanse your data before you migrate it. You don’t want to put a shiny new application on top of questionable data. In other words, let’s get a fresh start on the data in your new application!

Myth #4: All my data will move over as expected.

Reality #4: It will not.  Any time you move and transform large sets of data, there is room for logical or operational errors and surprises.  The best way to avoid these is to automatically validate that your data has moved over as intended.
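A minimal sketch of what automated post-move validation means in practice, using `sqlite3` stand-ins for the source and target systems. Real validation tooling applies far richer, configurable rules; this only shows the shape of the check.

```python
# Minimal post-move validation: compare row counts and a content checksum
# between source and target instead of eyeballing the data by hand.
import hashlib
import sqlite3

def table_fingerprint(conn, table):
    rows = conn.execute(f"SELECT * FROM {table} ORDER BY 1").fetchall()
    digest = hashlib.sha256(repr(rows).encode()).hexdigest()
    return len(rows), digest

source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for db in (source, target):
    db.execute("CREATE TABLE accounts (id INTEGER, balance REAL)")
    db.executemany("INSERT INTO accounts VALUES (?, ?)",
                   [(1, 100.0), (2, 250.5)])

src_count, src_digest = table_fingerprint(source, "accounts")
tgt_count, tgt_digest = table_fingerprint(target, "accounts")
assert (src_count, src_digest) == (tgt_count, tgt_digest), "data drifted in flight"
print("validated", src_count, "rows")  # → validated 2 rows
```

Because the check is automated, it can run on every load rather than only once, which is exactly what catches the surprises described above.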

Myth #5: It’s a one-time effort.

Reality #5: ‘Load and explode’ is a formula for disaster.  Our proven methodology recommends you first prototype your migration path and identify a small subset of the data to move over. Then test it, tweak your model, try it again and gradually expand.  More importantly, your application architecture should not be a one-time effort.  It is a work in progress and really an ongoing journey.  Regardless of where you are on this journey, we recommend paying close attention to managing your application’s data foundation.

As you can see, there is a multitude of data issues that can plague an application consolidation or migration project and lead to its doom.  These potential challenges are not always recognized and understood early on.  This perception gap is a root cause of project failure. This is why we are excited to host Philip Russom of TDWI in our upcoming webinar to discuss data management best practices and methodologies for application consolidation and migration. If you are undertaking any IT modernization or rationalization project, such as consolidating applications or migrating legacy applications to the cloud or to an ‘on-prem’ application such as SAP, this webinar is a must-see.

So what’s your reality going to be like?  Will your project run like a dream or will it escalate into a scary nightmare? Here’s hoping for the former.  And also hoping you can join us for this upcoming webinar to learn more:

Webinar with TDWI:
Successful Application Consolidation & Migration: Data Management Best Practices.

Date: Tuesday March 10, 10 am PT / 1 pm ET

Don’t miss out, Register Today!

1) Gartner report titled “Best Practices Mitigate Data Migration Risks and Challenges” published on December 9, 2014

2) Harvard Business Review: ‘Why your IT project may be riskier than you think’.


The Billion Dollar (Data Integration) Mistake

How would you like to wake up to an extra billion dollars, or maybe nine, in the bank? This has happened to a teacher in India. He discovered to his astonishment a balance of $9.8 billion in his bank account!

How would you like to be the bank that gave the client an extra nine billion dollars? Oh, to be a fly on the wall when the IT department got that call. How do you even begin to explain? Imagine the scrambling to track down the source of the data error.

This was a glaringly obvious error, which is easily caught. But there is potential for many smaller data errors. These errors may go undetected and add up, hurting your bottom line.  How could this type of data glitch happen? More importantly, how can you protect your organization from these types of errors in your data?

A primary source of data mistakes is insufficient testing during data integration. Any change or movement of data harbors risk to its integrity. Unfortunately, there are often insufficient IT resources to adequately validate the data. Some organizations validate the data manually. This is a lengthy, unreliable process, fraught with data errors. Furthermore, manual testing does not scale well to large data volumes or complex data changes, so the validation is often incomplete. Finally, some organizations simply lack the resources to conduct any level of data validation altogether.


Many of our customers have successfully addressed this issue with automated data validation testing (also known as DVO). In a recent TechValidate survey, Informatica customers told us that they:

  • Reduce costs associated with data testing.
  • Reduce time associated with data testing.
  • Increase IT productivity.
  • Increase the business trust in the data.

Customers tell us some of the biggest potential costs relate to the damage control that occurs when something goes wrong with their data. The tale above, of our fortunate man and not-so-fortunate bank, is one example. Bad data can hurt a company’s reputation and lead to untold losses in market share and customer goodwill.  In today’s highly regulated industries, such as healthcare and financial services, the consequences of incorrect data can be severe, including heavy fines or worse.

Using automated data validation testing allows customers to save on ongoing testing costs and deliver reliable data. Just as important, it prevents pricey data errors, which require costly and time-consuming damage control. It is no wonder many of our customers tell us they are able to recoup their investment in less than 12 months!


The TechValidate survey shows that customers are using data validation testing in a number of common use cases, including:

  • Regression (Unit) testing
  • Application migration or consolidation
  • Software upgrades (Applications, databases, PowerCenter)
  • Production reconciliation

One of the most beneficial use cases for data validation testing has been application migration and consolidation. Many SAP migration projects undertaken by our customers have greatly benefited from automated data validation testing.  Application migration or consolidation projects are typically large and risky. A Bloor Research study has shown that 38% of data migration projects fail, incur overages or are aborted altogether. According to a Harvard Business Review article, 1 in 6 large IT projects runs 200% over budget. Poor data management is one of the leading pitfalls in these types of projects. However, according to Bloor Research, Informatica’s data validation testing is a capability they have not seen elsewhere in the industry.

A particularly interesting example of the above use case is an M&A situation. The merged company is required to deliver ‘day-1 reporting’. However, FTC regulations forbid the separate entities from seeing each other’s data prior to the merger. What a predicament! The automated nature of data validation testing (automatically deploying preconfigured rules on large data sets) enables our customers to prepare for successful day-1 reporting under these harsh conditions.
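The “preconfigured rules deployed automatically” idea can be sketched as a small rule engine: named predicates run over every row, and violations are collected instead of being checked by hand. The rule names and record fields below are invented for illustration.

```python
# Sketch of rule-driven validation: preconfigured named checks applied
# automatically to every row, collecting violations for review.
RULES = {
    "miles_non_negative": lambda r: r["miles"] >= 0,
    "member_id_present": lambda r: bool(r.get("member_id")),
}

def run_rules(rows):
    violations = []
    for i, row in enumerate(rows):
        for name, check in RULES.items():
            if not check(row):
                violations.append((i, name))
    return violations

rows = [
    {"member_id": "FF-001", "miles": 120000},
    {"member_id": "", "miles": -5},
]
print(run_rules(rows))  # → [(1, 'miles_non_negative'), (1, 'member_id_present')]
```

Because the rules are data, not hand-written test code, the same set can be deployed unchanged against very large data sets, which is what makes day-1 reporting under pre-merger restrictions feasible.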

And what about you?  What are the costs to your business for potentially delivering incorrect, incomplete or missing data? To learn more about how you can provide the right data on time, every time, please visit www.datavalidation.me


Getting Value Out of Data Integration

This post is by Philip Howard, Research Director, Bloor Research.


Live Bloor Webinar, Nov 5

One of the standard metrics used to support buying decisions for enterprise software is total cost of ownership. Typically, the other major metric is functionality. However, functionality is ephemeral. Not only does it evolve with every new release, but while particular features may be relevant to today’s project, there is no guarantee that those same features will be applicable to tomorrow’s needs. A broader metric than functionality is capability: how suitable is this product for a range of different project scenarios, and will it support both simple and complex environments?

Earlier this year, Bloor Research published some research into the data integration market that investigated exactly these issues: how often were tools reused, how many targets and sources were involved, and for what sort of projects were products deemed suitable? We then compared these with the total cost of ownership figures we also captured in our survey. I will be discussing the results of our research live with Kristin Kokie, the interim CIO of Informatica, on Guy Fawkes’ Day (November 5th). I don’t promise anything explosive, but it should be interesting, and I hope you can join us. The discussion will be vendor neutral (mostly: I expect that Kristin has a degree of bias).

To register for the webinar, click here.


The Swiss Army Knife of Data Integration


Back in 1884, a man had a revolutionary idea; he envisioned a compact knife that was lightweight and would combine the functions of many stand-alone tools into a single tool. This idea became what the world has known for over a century as the Swiss Army Knife.

This creative thinking to solve a problem came from a request from the Swiss Army to build a soldier’s knife.  In the end, the solution was all about getting the right tool for the right job in the right place. In many cases soldiers didn’t need industrial-strength tools; all they really needed was a compact and lightweight tool to get the job at hand done quickly.

Putting this into perspective with today’s world of data integration, using enterprise-class data integration tools for smaller data integration projects is overkill and typically out of reach for the smaller organization. However, these smaller data integration projects are just as important as the larger enterprise projects, and they are often the innovation behind a new way of business thinking. The traditional hand-coding approach to the smaller data integration project is not scalable, not repeatable and prone to human error. What’s needed is a compact, flexible and powerful off-the-shelf tool.

Thankfully, over a century after the world embraced the Swiss Army Knife, someone at Informatica was paying attention to revolutionary ideas. If you’ve not yet heard the news, a version of the Informatica platform called PowerCenter Express has been released, free of charge, so you can use it to handle an assortment of what I’d characterize as high-complexity / low-volume data integration challenges and experience a subset of the Informatica platform for yourself. I’d emphasize that PowerCenter Express doesn’t replace the need for Informatica’s enterprise-grade products, but it is ideal for rapid prototyping, profiling data, and developing quick proofs of concept.

PowerCenter Express provides a glimpse of the evolving Informatica platform by integrating four Informatica products into a single, compact tool. There are no database dependencies, and the product installs in just under 10 minutes. Much to my own surprise, I use PowerCenter Express quite often going about the various aspects of my job with Informatica. I have it installed on my laptop so it travels with me wherever I go. It starts up quickly, so it’s ideal for getting a little work done on an airplane.

For example, I recently wanted to explore building some rules for an upcoming proof of concept on a plane ride home so I could claw back some personal time for my weekend. I used PowerCenter Express to profile some data and create a mapping.  And this mapping wasn’t something I needed to throw away and recreate in an enterprise version after my flight landed. Vibe, Informatica’s build-once / run-anywhere metadata-driven architecture, allows me to export a mapping I create in PowerCenter Express to one of the enterprise versions of Informatica’s products, such as PowerCenter, Data Quality or Informatica Cloud.

As I alluded to earlier in this article, PowerCenter Express being a free offering, I honestly didn’t expect too much from it when I first started exploring it. However, due to my own positive experiences, I now like to think of PowerCenter Express as the Swiss Army Knife of data integration.

To start claiming back some of your personal time, get started with the free version of PowerCenter Express, found on the Informatica Marketplace at:  https://community.informatica.com/solutions/pcexpress


Business Use Case for PowerCenter Express
