Category Archives: Data Migration

The Swiss Army Knife of Data Integration

Back in 1884, a man had a revolutionary idea; he envisioned a compact knife that was lightweight and would combine the functions of many stand-alone tools into a single tool. This idea became what the world has known for over a century as the Swiss Army Knife.

This creative problem-solving came from a request from the Swiss Army to build a soldier’s knife. In the end, the solution was all about getting the right tool for the right job in the right place. In many cases soldiers didn’t need industrial-strength tools; all they really needed was a compact, lightweight tool to get the job at hand done quickly.

Putting this into perspective in today’s world of data integration, using enterprise-class data integration tools for a smaller data integration project is overkill and typically out of reach for the smaller organization. However, these smaller data integration projects are just as important as the larger enterprise projects, and they are often the innovation behind a new way of business thinking. The traditional hand-coding approach to the smaller data integration project is not scalable, not repeatable and prone to human error. What’s needed is a compact, flexible and powerful off-the-shelf tool.

Thankfully, over a century after the world embraced the Swiss Army Knife, someone at Informatica was paying attention to revolutionary ideas. If you’ve not yet heard the news, a version of the Informatica platform called PowerCenter Express has been released free of charge, so you can use it to handle an assortment of what I’d characterize as high-complexity, low-volume data integration challenges and experience a subset of the Informatica platform for yourself. I’d emphasize that PowerCenter Express doesn’t replace Informatica’s enterprise-grade products, but it is ideal for rapid prototyping, profiling data and developing quick proofs of concept.

PowerCenter Express provides a glimpse of the evolving Informatica platform by integrating four Informatica products into a single, compact tool. There are no database dependencies, and the product installs in just under 10 minutes. Much to my own surprise, I use PowerCenter Express quite often in the various aspects of my job with Informatica. I have it installed on my laptop, so it travels with me wherever I go. It starts up quickly, so it’s ideal for getting a little work done on an airplane.

For example, I recently wanted to explore building some rules for an upcoming proof of concept on a plane ride home, so I could claw back some personal time for my weekend. I used PowerCenter Express to profile some data and create a mapping. And this mapping wasn’t something I needed to throw away and recreate in an enterprise version after my flight landed: Vibe, Informatica’s build-once/run-anywhere metadata-driven architecture, allows me to export a mapping created in PowerCenter Express to one of the enterprise versions of Informatica’s products, such as PowerCenter, Data Quality or Informatica Cloud.

As I alluded to earlier in this article, because it is a free offering I honestly didn’t expect too much from PowerCenter Express when I first started exploring it. However, after my own positive experiences, I now like to think of PowerCenter Express as the Swiss Army Knife of data integration.

To start claiming back some of your personal time, get started with the free version of PowerCenter Express, found on the Informatica Marketplace at:  https://community.informatica.com/solutions/pcexpress

Business Use Case for PowerCenter Express

Posted in Architects, Data Integration, Data Migration, Data Transformation, Data Warehousing, PowerCenter, Vibe | Tagged , | Leave a comment

Comparative Costs and Uses for Data Integration Platforms – A Study from Bloor Research

For years, companies have wrestled with the promise of data integration platforms. Along the way, businesses have asked many questions, including:

  • Does Data Integration technology truly provide a clear path toward unified data?
  • Can businesses truly harness the potential of their information?
  • Can companies take powerful action as a result?

Recently, Bloor Research set out to evaluate how things were actually playing out on the ground. In particular, they wanted to determine which data integration projects were actually taking place, at what scale, and with what results. The study, “Comparative Costs and Uses for Data Integration Platforms,” was authored by Philip Howard, research director at Bloor. The study examined data integration tool suitability across a range of scenarios, including:

  1. Data migration and consolidation projects
  2. Master data management (MDM) and associated solutions
  3. Application-to-application integration
  4. Data warehousing and business intelligence implementations
  5. Syncing data with SaaS applications
  6. B2B data exchange

To draw conclusions, Bloor examined 292 responses from a range of companies. The respondents used a variety of data integration approaches, from commercial data integration tools to “hand-coding.”

Informatica is pleased to be able to offer you a copy of this research for your review. The research covers areas like:

  • Suitability
  • Productivity
  • Reusability
  • Total Cost of Ownership (TCO)

We welcome you to download a copy of “Comparative Costs and Uses for Data Integration Platforms” today. We hope these findings offer you insights as you implement and evaluate your data integration projects and options.

Posted in Data Integration, Data Integration Platform, Data Migration, Master Data Management | Tagged , , , | Leave a comment

Oracle Data Migration Best Practices: Join Us And Learn

Are you interested in Oracle Data Migration Best Practices? Are you upgrading, consolidating or migrating to or from an Oracle application? Moving to the cloud or a hosted service? Research and experience confirm that the tasks associated with migrating application data during these initiatives have the biggest impact on whether the project is considered a failure or a success. So how do your peers ensure data migration success?

Informatica will offer a full-day Oracle Migrations Best Practices workshop at the Oracle Applications Users Group’s annual conference, Collaborate 14, on April 7th in Las Vegas, NV. During this workshop, peers and experts will share best practices for avoiding the pitfalls and ensuring successful projects, lowering migration cost and risk. Our fully packed agenda includes:

  1. Free use and trials of data migration tools and software
  2. Full training sessions on how to integrate cloud-based applications
  3. How to provision test data using different data masking techniques
  4. How to ensure consistent application performance during and after a migration
  5. A review of Oracle Migration Best Practices and case studies

Case Study: EMC

One of the key case studies that will be highlighted is EMC’s Oracle migration journey. EMC Corporation migrated to Oracle E-Business Suite, acquired more than 40 companies in 4 years, consolidated and retired environments, and is now on the path to migrating to SAP. Not only did they migrate applications, they also migrated their entire technology platform from physical to virtual on their journey to the cloud. Along the way, they needed to control the impact of data growth and manage the size of their test environments while reducing the risk of exposing sensitive data to unauthorized users during development cycles. With best practices, and help from Informatica, they estimate that they have saved approximately $45M in IT costs across their migrations. Now they are deploying a new analytics platform based on Hadoop, leveraging existing skill sets and Informatica tools to ensure data is loaded into Hadoop without missing a beat.

Case Study: Verizon

Verizon is the second case study we will be discussing. They recently migrated to Salesforce.com and needed to ensure that more than 100 data objects were integrated with on-premises, back end applications. In addition, they needed to ensure that data was synchronized and kept secure in non-production environments in the cloud. They were able to leverage a cloud-based integration solution from Informatica to simplify their complex IT application architecture and maintain data availability and security – all while migrating a major business application to the cloud.

Case Study: OEM Heavy Equipment Manufacturer

The third case study we will review involves a well-known heavy equipment manufacturer facing two challenges: first, a need to separate data in an Oracle E-Business Suite application as a result of a divestiture; second, a need to control the impact of data growth on production application environments that were going through various upgrades. Using an innovative approach based on Smart Partitioning, this enterprise estimates it will save $23M over a five-year period while achieving 40% performance improvements across the board.

To learn more about what Informatica will be sharing at Collaborate 14, watch this video. If you are planning to attend Collaborate 14 this year and you are interested in joining us, you can register for the Oracle Migrations Best Practices Workshop here.

Posted in Application ILM, Data masking, Data Migration | Tagged , | Leave a comment

Was Your Last Data Migration Project On Time and On Budget?

Data migration projects are notorious for going over budget and over schedule. These large projects typically cost around $875,000, and an average of 30% of that is due to project overruns. In today’s fast-paced, big data era, organizations cannot afford these missteps. Unfortunately, many companies treat major data projects as one-off events. This approach leads to product launch delays, produces no reusable assets or best practices, and presents an outsized risk to business objectives.

I look forward to sharing how the successful organizations we work with have combated these issues using Master Data Management (MDM) as a platform for systems consolidation, migration, and upgrade projects. MDM accomplishes the following:

  • Creates authoritative, trustworthy data
  • Simplifies migration architecture using a hub-and-spoke model
  • Maintains data consistency across new and old systems post-migration
  • Enables reuse of data, mappings, and rules for the next migration project

In summary, MDM allows organizations to minimize risk and increase the speed of data migration.
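
To make the hub’s consolidation role concrete, the survivorship step at the heart of MDM can be sketched in a few lines of Python. This is an illustrative sketch only, using a hypothetical “most recent wins” rule and made-up field names, not Informatica MDM’s actual logic:

```python
def build_golden_record(records):
    """Merge per-source records for one customer into a single
    authoritative record: for each field, keep the value from the
    most recently updated source that supplies it (a simple
    'most recent wins' survivorship rule)."""
    golden = {}
    for rec in sorted(records, key=lambda r: r["updated"]):
        for field, value in rec.items():
            if field in ("source", "updated") or value in (None, ""):
                continue
            golden[field] = value  # later (newer) records overwrite
    return golden

crm = {"source": "CRM", "updated": "2014-01-10",
       "name": "J. Smith", "phone": "555-0100", "email": ""}
erp = {"source": "ERP", "updated": "2014-02-01",
       "name": "Jane Smith", "phone": None, "email": "js@example.com"}
print(build_golden_record([crm, erp]))
# name from ERP (newer), phone survives from CRM, email from ERP
```

In a real hub, survivorship rules are configurable per field (most recent, most trusted source, and so on); the point is that the hub, rather than each migration script, owns this logic, which is what makes it reusable for the next migration.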

To address this topic, I will be hosting a webinar titled “MDM as Platform for Systems Consolidation, Migration and Upgrade” on March 19th at 2:00 PM Eastern. In this webinar, you will learn about:

  • Challenges faced in systems consolidation, migration and upgrades
  • Solutions MDM brings to address these challenges in pre-migration, during-migration, and post-migration phases
  • Examples of companies using MDM to manage data migration as a repeatable process
  • Tips for expanding the use of MDM beyond data migration for operational and analytical purposes

Join me to learn how MDM works in practice and to gain understanding of how it can help make your next systems consolidation, migration, or upgrade the most efficient and effective yet. Sign up today for the webinar on Wednesday, March 19, 2014.

Posted in Data Migration, Master Data Management | Tagged , , | 1 Comment

Streamlining and Securing Application Test and Development Processes

Informatica recently hosted a webinar with Cognizant, who shared how they streamline test data management processes internally with Informatica Test Data Management and pass the benefits on to their customers. Proclaimed the world’s largest Quality Engineering and Assurance (QE&A) service provider, Cognizant has over 400 customers and thousands of testers, and is considered a thought leader in the testing practice.

We polled over 100 attendees on their top challenges with test data management, considering the data and system complexities and the need to protect their clients’ sensitive data. Here are the results from that poll:

It was not surprising to see that generating test data sets and securing sensitive data in non-production environments tied as the top two challenges. Data integrity/synchronization was a very close third.

Cognizant and Informatica have been evolving their test data management offering to focus not only on securing sensitive data, but also on improving testing efficiency by identifying, provisioning and resetting test data – tasks that consume as much as 40% of testing cycle times. Key components of this next-generation test data management platform include:

Sensitive Data Discovery – an integrated and automated process that searches data sets for exposed sensitive data. Sensitive data often resides in test copies unbeknownst to auditors. Once the data has been located, it can be masked in non-production copies.
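
As a rough illustration of how automated discovery works, the sketch below scans rows for values matching a small set of regular-expression classifiers. The patterns and function name here are hypothetical; a production discovery engine uses a far richer, locale-aware classifier catalog:

```python
import re

# Hypothetical classifiers for illustration; real tools ship many more.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def discover_sensitive_columns(rows):
    """Scan a list of row dicts and report which columns appear to
    contain sensitive values, keyed by the classifier that matched."""
    findings = {}
    for row in rows:
        for column, value in row.items():
            for label, pattern in PATTERNS.items():
                if isinstance(value, str) and pattern.search(value):
                    findings.setdefault(column, set()).add(label)
    return findings

sample = [
    {"name": "Ada", "ssn": "123-45-6789", "notes": "call back"},
    {"name": "Grace", "ssn": "987-65-4321", "notes": "ada@example.com"},
]
print(discover_sensitive_columns(sample))
```

Note how the scan flags the free-text “notes” column too – exactly the kind of place sensitive data hides in test copies unbeknownst to auditors.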

Persistent Data Masking – masks sensitive data in flight while cloning data from production, or in place on a gold copy. Data formats are preserved while the original values are completely protected.
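
The format-preserving idea can be illustrated with a tiny Python sketch that deterministically rewrites digits while leaving separators and length intact. This is a toy – it is not format-preserving encryption and not how Informatica’s masking works internally – but it shows why masked values still pass downstream format checks:

```python
import hashlib

def mask_digits(value, salt="demo-salt"):
    """Deterministically replace each digit in a string while keeping
    its format (separators, length). Illustrative only: not reversible
    and not real format-preserving encryption."""
    digest = hashlib.sha256((salt + value).encode()).hexdigest()
    stream = (int(c, 16) % 10 for c in digest)  # pseudo-random digits
    out = []
    for ch in value:
        out.append(str(next(stream)) if ch.isdigit() else ch)
    return "".join(out)

masked = mask_digits("123-45-6789")
print(masked)  # same NNN-NN-NNNN shape, different digits
```

Because the function is deterministic for a given salt, the same input always masks to the same output, which keeps referential integrity across tables.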

Data Privacy Compliance Validation – auditors want to know that data has in fact been protected, so the ability to validate and report on data privacy compliance becomes critical.

Test Data Management – in addition to creating test data subsets, clients require the ability to synthetically generate test data sets, eliminating defects by aligning data sets to optimize each test case. Also, in many cases multiple testers work in the same environment and may clobber each other’s test data sets, so the ability to reset test data becomes a key requirement for improving efficiency.
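
The reset requirement boils down to keeping an untouched gold copy and restoring from it between runs. A minimal, illustrative Python sketch (not Informatica’s implementation):

```python
import copy

class TestDataSet:
    """Minimal sketch of resettable test data: take a gold-copy
    snapshot, let tests mutate the working copy freely, then reset."""
    def __init__(self, rows):
        self._gold = copy.deepcopy(rows)  # untouched gold copy
        self.rows = copy.deepcopy(rows)   # working copy tests mutate
    def reset(self):
        self.rows = copy.deepcopy(self._gold)

ds = TestDataSet([{"id": 1, "status": "NEW"}])
ds.rows[0]["status"] = "SHIPPED"  # a test clobbers the data
ds.reset()
print(ds.rows[0]["status"])  # NEW
```

In practice the “gold copy” is a masked subset in a database rather than an in-memory list, but the workflow – snapshot, mutate, reset – is the same.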

Figure 2: Next Generation Test Data Management

When asked what tools or services had been deployed, 78% said in-house developed scripts and utilities. This is an incredibly time-consuming approach, and one with limited repeatability. Data masking had been deployed by almost half of the respondents.


Informatica and Cognizant are leading the way in establishing a new standard for test data management by incorporating test data generation, data masking, and the ability to refresh or reset test data sets. For more information, check out Cognizant’s Informatica-based offering, TDMaxim, and the white paper Transforming Test Data Management for Increased Business Value.

 

Posted in Data masking, Data Migration, Data Privacy | Tagged , , , , | Leave a comment

Reducing Business Risk with Data Integration

A study by Bloor Research put the failure rate for data migration projects at 38%. When you consider that a failed data migration project can temporarily hold up vital business processes, the news is even worse. These failures affect customer service, internal business processes, productivity and more, leading to an IT infrastructure that simply does not meet the expectations of the business.

If you own one of these dysfunctional IT infrastructures, you’re not alone. Most enterprises struggle to manage the use of data within the business. Data integration becomes an ad hoc exercise, solved when needed with whatever works at the time. Moreover, the ability to manage migration and data quality becomes a lost art, and many users distrust the information coming from the business systems they should be able to rely on.

The solution to this problem is complex. There needs to be a systematic approach to data integration, led by key stakeholders. Several business objectives should be set before creating a strategy and approach, and before purchasing key technologies. These include:

  • Define the cost of risk in having substandard data quality.
  • Define the cost of risk in not having data available to systems and humans in the business.
  • Define the cost of lost strategic opportunities, such as moving into a new product line or acquiring a company.

The idea is that, by leveraging data integration approaches and technology, we’ll reduce much of this risk, and that risk carries a real cost.
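
Each of the three risk costs above is, at bottom, expected-value arithmetic: likelihood times impact. A minimal Python sketch, using the 38% failure rate mentioned above against an illustrative $875,000 project cost (the rework figures are purely hypothetical):

```python
def expected_risk_cost(probability, impact):
    """Expected cost of a risk: likelihood of the event times its impact."""
    return probability * impact

# Bloor's 38% migration failure rate against an illustrative
# $875,000 project cost; the rework figures are hypothetical.
migration_failure = expected_risk_cost(0.38, 875_000)
quality_rework = expected_risk_cost(0.20, 250_000)
print(round(migration_failure), round(quality_rework))  # 332500 50000
```

Even this back-of-the-envelope math makes the case: if a modest data integration investment shaves even a fraction off those probabilities, it pays for itself.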

The cost of poor data quality is obvious to those inside and outside of IT, but the damage that can occur without a good data integration and data quality strategy, and supporting technology, reaches much farther than many think. The trick is to solve both problems at the same time, leveraging data integration technology that can also deal with data quality issues.

Not having data available to the end users who need it to operate the business, as well as to the machines that need to respond to changing data, adds to the risk and thus the cost. In many enterprises, there is a culture of what I call “data starvation”: it’s simply accepted that you can’t track orders with accurate data, you can’t pull up current customer sales information, and this is just the way things are. This is really an easy fix these days, and one dollar invested in creating a strategy or purchasing and implementing technology will come back to the business twentyfold, at least.

Finally, define the cost of lost strategic opportunities. This is a risk that many companies pay for, but it’s complex and difficult to quantify. The inability to get systems communicating and sharing data around a merger, for example, means the enterprise can’t easily take advantage of market opportunities.

I don’t know how many times I’ve heard of enterprises failing in their attempts to merge two businesses because IT could not figure out how to make the systems work and play well together. As with the other two risks, a manageable investment of time and money will remove this risk, and thus its cost.

Posted in Data Integration, Data Migration, Data Quality | Tagged , | 1 Comment

Get Your Data Butt Off The Couch and Move It

Data is everywhere.  It’s in databases and applications spread across your enterprise.  It’s in the hands of your customers and partners.  It’s in cloud applications and on cloud servers.  It’s in spreadsheets and documents on your employees’ laptops and tablets.  It’s in smartphones, sensors and GPS devices.  It’s in the blogosphere, the twittersphere and your friends’ Facebook timelines.

Posted in Application ILM, B2B, Big Data, Cloud Computing, Complex Event Processing, Data Governance, Data Integration, Data Migration, Data Quality, Data Services, Data Transformation, Data Warehousing, Enterprise Data Management, Integration Competency Centers | Tagged , , , | Leave a comment

Enterprise Application Projects Are Much Riskier Than You Think

IT application managers constantly work to integrate, modernize and consolidate enterprise applications to keep them efficient and to maximize the business value they provide to the corporation for their cost.

But it is important to remember that these projects carry significant risk.  An article in the Harvard Business Review states that 17% of enterprise application projects go seriously wrong, running over budget by 200% and over schedule by 70%.  The HBR article refers to these projects as “black swans.”

How can you reduce this risk of project failure?  Typically, 30% to 40% of an enterprise application project is data migration.  A recent study by Bloor Research shows that while success rates for data migration projects are improving, 38% of them still miss their schedule and budget targets.

How can you improve the odds of success in data migration projects?

  1. Use data profiling tools to understand your data before you move it.
  2. Use data quality tools to correct data quality problems.  There is absolutely no point in moving bad data around the organization, but it happens.
  3. Use a proven external methodology. In plain English, work with people who have “done it before.”
  4. Develop your own internal competence.  Nobody knows your data, or more importantly its business context, better than your own staff.  Develop the skills and engage your business subject matter experts.
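
As a concrete illustration of step 1, a profiling pass boils down to simple per-column statistics. This minimal Python sketch is illustrative only, not Informatica’s profiler:

```python
def profile_column(values):
    """Summarize one column before migration: row count, nulls,
    distinct values, and min/max length -- the basics a profiling
    tool reports so you can spot bad data before moving it."""
    non_null = [v for v in values if v not in (None, "")]
    lengths = [len(str(v)) for v in non_null] or [0]
    return {
        "rows": len(values),
        "nulls": len(values) - len(non_null),
        "distinct": len(set(non_null)),
        "min_len": min(lengths),
        "max_len": max(lengths),
    }

stats = profile_column(["OH", "CA", None, "ca", "Ohio", ""])
print(stats)  # {'rows': 6, 'nulls': 2, 'distinct': 4, ...}
```

A real profiler adds pattern analysis, value distributions and cross-column checks, but even these basics surface nulls and inconsistent codes (like “CA” versus “ca” versus “Ohio”) before they pollute the target system.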

Informatica has industry-leading tools, a proven methodology, and a service delivery team with hundreds of successful data migration implementations.

To find out more about successful data migration:

  • Informatica World:  Visit us at the Hands On Lab – Data Migration.
  • Informatica World: Informatica Presentation on Application Data Migration.

Application Data Migrations with Informatica Velocity Migration Methodology

Wednesday, June 5, 2013, 9:00 to 10:00

  • Informatica World: Data Migration Factory Presentation by  Accenture

Accelerating the Power of Data Migration

Tuesday, June 4, 2013, 2:00 to 3:00

 

Posted in Application Retirement, Data Governance, Data Migration, Data Quality, Informatica Events | Tagged , , , , | Leave a comment

Informatica World Healthcare Path

Join us this year at Informatica World!

We have a great lineup of speakers and events to help you become a data-driven healthcare organization. I’ve provided a few highlights below:

Participate in the Informatica World keynote sessions with Sohaib Abbasi and Rick Smolan, who wrote “The Human Face of Big Data.” Learn more via this quick YouTube video: http://www.youtube.com/watch?v=7K5d9ArRLJE&feature=player_embedded

With more than 100 interactive and in-depth breakout sessions spanning six tracks (Platform & Products, Architecture, Best Practices, Big Data, Hybrid IT and Tech Talk), Informatica World is an excellent way to ensure you are getting the most from your Informatica investment. Learn best practices from organizations that are realizing the potential of their data, including Ochsner Health, Sutter Health, UMass Memorial, Qualcomm and PayPal.

Finally, we want you to balance work with a little play. We invite you to network with industry peers at our Healthcare Cocktail Reception on the evening of Wednesday, June 5th, and again at our Data Driven Healthcare Breakfast Roundtable on Thursday, June 6th.

See you there!

Posted in Application Retirement, B2B, Complex Event Processing, Data Integration, Data Integration Platform, Data masking, Data Migration, Data Warehousing, Healthcare, Informatica Events, Master Data Management, Uncategorized | Tagged , | Leave a comment