Tag Archives: Data Migration

Come to Informatica World 2015 and Advance Your Career


5 Reasons Data Integration Professionals Absolutely Must Not Miss This Informatica World.

If you are a Data Integration or Data Management professional, you really cannot afford to miss this event.  This year’s theme in the Data Integration track at Informatica World is all about customers.  Over 50 customers will be sharing their experiences and best practices for succeeding with data integration projects such as analytics, big data, application consolidation and migration, and much more.

If you still need convincing, here are the five reasons:

  1. Big Data: A special Big Data Summit is part of the track.
  2. Immediate Value: Over 50 customers will be sharing their experiences and best practices. These are things you can start doing now to improve your organization.
  3. Architecture for Business Transformation: An architecture track focused on practical approaches for using architecture to enable business transformation, with specific examples and real customer experiences.
  4. Hands-On Labs: Everybody loves them. This year we have even more. Sign up early to make sure that you get your choice. They go fast!
  5. New “Meet the Experts” Sessions: These are small-group meetings for business-level discussions around subjects like big data, analytics, application consolidation, and more.

This truly will be a one-stop shop for all things data integration at Informatica World.  The pace of both competition and technology change is accelerating.  Attend this event to stay on top of what is happening in the world of data integration and how leading companies and experts are using data for competitive advantage within their organizations.

To help start your planning, here is a listing of the Data Integration, Architecture, and Big Data Sessions this year.  I hope to see you there.

QUICK GUIDE

DATA INTEGRATION AND BIG DATA at INFORMATICA WORLD 2015

Breakout Sessions, Tuesday, May 13

Session Time Location
Accelerating Business Value Delivery with Informatica Platform  (Architect Track Keynote) 10:45am – 11:15am Gracia 6
How to Support Real Time Data Integration Projects with PowerCenter (Grant Thornton) 1:30pm – 2:30pm Gracia 2
Knowledgent 11:30am – 12:15pm Gracia 8
Putting Big Data to Work to Make Cancer History at MD Anderson Cancer Center (MD Anderson) 11:30am – 12:15pm Gracia 4
Modernize your Data Architecture for Speed, Efficiency, and Scalability 11:30am – 12:15pm Castellana 1
An Architectural Approach to Data as an Asset (Cisco) 11:30am – 12:15pm Gracia 6
Accenture 11:30am – 12:15pm Gracia 2
Architectures for Next-Generation Analytics 1:30pm – 2:30pm Gracia 6
Informatica Marketplace (Tamara Strifler) 1:30pm – 2:30pm Gracia 4
Informatica Big Data Ready Summit: Keynote Address (Anil Chakravarthy, EVP and Chief Product Officer) 1:40pm – 2:25pm Castellana 1
Big Data Keynote: Tom Davenport, Distinguished Professor in Management and Information Technology, Babson College 2:30pm – 3:15pm Castellana 1
How to Test and Monitor Your Critical Business Processes with PowerCenter (Discount Tire, AT&T) 2:40pm – 3:25pm Gracia 2
Enhancing Consumer Experiences with Informatica Data Integration Hub (Humana) 2:40pm – 3:25pm Gracia 4
Business Transformation:  The Case for Information Architecture (Cisco) 2:40pm – 3:25pm Gracia 6
Succeeding with Big Data and Avoiding Pitfalls (CapGemini, Cloudera, Cognizant, Hortonworks) 3:15pm – 3:30pm
What’s New in B2B Data Exchange: Self-Service Integration of 3rd Party Partner Data (BMC Software) 3:35pm – 4:20pm Gracia 2
PowerCenter Developer:  Mapping Development Tips & Tricks 3:35pm – 4:20pm Gracia 4
Modernize Your Application Architecture and Boost Your Business Agility (Mototak Consulting) 3:35pm – 4:20pm Gracia 6
The Big Data Journey: Traditional BI to Next Gen Analytics (Johnson&Johnson, Transamerica, Devon Energy, KPN) 4:15pm – 4:30pm Castellana 1
L&T Infotech 4:30pm – 5:30pm Gracia 2
What’s New in PowerCenter, PowerCenter Express and PowerExchange? 4:30pm – 5:30pm Gracia 4
Next-Generation Analytics Architecture for the Year 2020 4:30pm – 5:30pm Gracia 6
Accelerate Big Data Projects with Informatica (Jeff Rydz) 4:35pm – 5:20pm Castellana 1
Big Data: Michael J. Franklin, Professor of Computer Science, UC Berkeley 5:20pm – 5:30pm Castellana 1
  • Informatica World Pavilion 5:15 PM – 8:00 PM

Breakout Sessions, Wednesday, May 14

Session Time Location
How Mastercard is using a Data Hub to Broker Analytics Data Distribution (Mastercard) 2:00pm – 2:45pm Gracia 2
Cause: Business and IT Collaboration Effect: Cleveland Clinic Executive Dashboard (Cleveland Clinic) 2:00pm – 2:45pm Castellana 1
Application Consolidation & Migration Best Practices: Customer Panel (Discount Tire, Cisco, Verizon) 2:55pm – 3:55pm Gracia 2
Big Data Integration Pipelines at Cox Automotive (Cox Automotive) 2:55pm – 3:55pm Gracia 4
Performance Tuning for PowerCenter and Informatica Data Services 2:55pm – 3:55pm Gracia 6
US Bank and Cognizant 2:55pm – 3:55pm Castellana 1
Analytics architecture (Teradata, Hortonworks) 4:05pm – 4:50pm Gracia 4
A Case Study in Application Consolidation and Modernization—Migrating from Ab Initio to Informatica (Kaiser Permanente) 4:05pm – 4:50pm Castellana 1
Monetize Your Data With Hadoop and Agile Data Integration (AT&T) 4:05pm – 4:50pm Gracia 2
How to Enable Advanced Scaling and Metadata Management with PowerCenter (PayPal) 5:00pm – 5:45pm Castellana 1
How Verizon is consolidating 50+ legacy systems into a modern application architecture, optimizing Verizon’s enterprise sales and delivery process (Verizon) 5:00pm – 5:45pm Gracia 6
A guided tour to one of the most complex Informatica Installations worldwide (HP) 5:00pm – 5:45pm Gracia 2
Integration with Hadoop:  Best Practices for mapping development using Big Data Edition 5:00pm – 5:45pm Gracia 4

Meet The Experts Sessions, Wednesday, May 14

Session Time Location
Meet the Expert: App Consolidation – Driving Greater Business Agility and Reducing Costs Through Application Consolidation and Migration (Roger Nolan) 12:00pm – 12:50pm, 1:00pm – 1:50pm and 2:55pm – 3:55pm Castellana 2
Meet the Expert: Big Data – Delivering on the Promise of Big Data Analytics (John Haddad) 12:00pm – 12:50pm, 1:00pm – 1:50pm and 2:55pm – 3:55pm Castellana 2
Meet the Expert: Architect – Laying the Architectural Foundation for the Data-Driven Enterprise (David Lyle) 12:00pm – 12:50pm, 1:00pm – 1:50pm and 2:55pm – 3:55pm Castellana 2
  • Informatica World Pavilion 11:45 AM – 2:00 PM

Breakout Sessions, Thursday, May 15

Session Time Location
Enterprise Architecture and Business Transformation Panel  (Cisco) 9:00am – 10:00am Gracia 6
The Data Lifecycle: From infancy through retirement, how Informatica can help (Mototak Consulting) 9:00am – 10:00am Gracia 4
How Allied Solutions Streamlined Customer Data Integration using B2B Data Exchange (Allied Solutions) 9:00am – 10:00am Gracia 2
How the State of Washington and Michigan State University are Delivering Integration as a Service (Michigan State University, Washington State Department of Enterprise Services) 9:00am – 10:00am Gracia 1
Real Time Big Data Streaming Analytics (PRA Group) 10:10am – 11:10am Gracia 1
Extending and Modernizing Enterprise Data Architectures (Philip Russom, TDWI) 10:10am – 11:10am Gracia 4
Best Practices for Saving Millions by Offloading ETL/ELT to Hadoop with Big Data Edition and Vibe Data Stream (Cisco) 10:10am – 11:10am Gracia 2
Retire Legacy Applications – Improve Your Bottom-Line While Managing Compliance (Cisco) 11:20am – 12:20pm Gracia 4
How a Data Hub Reduces Complexity, Cost and Risk for Data Integration Projects 11:20am – 12:20pm Gracia 1
Title? (Cap Gemini) 11:20am – 12:20pm Gracia 2
What’s New in PowerCenter, PowerCenter Express and PowerExchange? 2:30pm – 3:30pm Gracia 4
Title?  Keyur Desai 2:30pm – 3:30pm Gracia 2
How to run PowerCenter & Big Data Edition on AWS & connect Data as a Service (Customer) 2:30pm – 3:30pm Gracia 1
Accelerating Business with Near-Realtime Architectures 2:30pm – 3:30pm Gracia 6
  • Informatica World Pavilion 12:30 PM – 3:30 PM

Hands-On Labs

Session Time Location
General Interest
PowerCenter 9.6.1 Upgrade 1 Table 01
PowerCenter 9.6.1 Upgrade 2 (repeat) Table 02
PowerCenter Advanced Edition – High Availability & Grid Mon 1:00, 3:00
Tue 7:30, 11:45, 2:40, 4:25
Wed 10:45, 12:45, 2:55, 5:00, 7:00
Thu 9:00, 11:20, 1:15
Fri 7:30, 9:30, 11:30
Table 03a
PowerCenter Advanced Edition – Metadata Manager & Business Glossary Mon 2:00, 4:00
Tue 10:45, 1:45, 3:35
Wed 7:30, 11:45, 2:00, 4:05, 6:00
Thu 7:30, 10:10, 12:15, 2:15
Fri 8:30, 10:30
Table 03b
Data Archive Mon 1:00, 3:00
Tue 7:30, 11:45, 2:40, 4:25
Wed 10:45, 12:45, 2:55, 5:00, 7:00
Thu 9:00, 11:20, 1:15
Fri 7:30, 9:30, 11:30
Table 06a
Test Data Management Mon 2:00, 4:00
Tue 10:45, 1:45, 3:35
Wed 7:30, 11:45, 2:00, 4:05, 6:00
Thu 7:30, 10:10, 12:15, 2:15
Fri 8:30, 10:30
Table 06b
Analytics-Related
PowerCenter Big Data Edition – Delivering on the Promise of Big Data Analytics All other times not taken by 11b. Table 11a
Elastic Analytics:  Big Data Edition in the Cloud Mon 4:00
Tue 11:45, 3:35
Wed 12:45, 5:00, 7:00
Thu 9:00, 1:15, 2:15
Fri 10:30
Table 11b
Greater Agility and Business-IT Collaboration using Data Virtualization Mon 1:00, 3:00
Tue 7:30, 11:45, 2:40, 4:25
Wed 10:45, 12:45, 2:55, 5:00, 7:00
Thu 9:00, 11:20, 1:15
Fri 7:30, 9:30, 11:30
Table 12a
Boosting your performance and productivity with Informatica Developer Mon 2:00, 4:00
Tue 10:45, 1:45, 3:35
Wed 7:30, 11:45, 2:00, 4:05, 6:00
Thu 7:30, 10:10, 12:15, 2:15
Fri 8:30, 10:30
Table 12b
Democratizing your Data through the Informatica Data Lake Table 13
Enabling Self-Service Analytics with Informatica Rev Table 14
Real-time Data Integration: PowerCenter Architecture & Implementation Considerations Mon 1:00
Tue 7:30, 1:45
Wed 7:30, 2:00, 4:05
Thu 9:00, 11:20
Fri 8:30
Table 15a
Real-time Data Integration: PowerExchange CDC on z/OS Mon 2:00
Tue 10:45, 2:40
Wed 10:45, 5:00
Thu 12:15
Fri 9:30
Table 15b
Real-time Data Integration: PowerExchange CDC on i5/OS Mon 3:00
Tue 3:35
Wed 11:45, 6:00
Thu 1:15
Fri 10:30
Table 15c
Real-time Data Integration: PowerExchange CDC for Relational (Oracle, DB2, MS-SQL) Mon 4:00
Tue 11:45, 4:25
Wed 12:45, 2:55, 7:00
Thu 7:30, 10:10, 2:15
Fri 7:30, 11:30
Table 15d
Healthcare Data Management and Modernization for Healthcare Providers Table 16
Data Management of Machine Data & Internet of Things Mon 1:00, 3:00
Tue 7:30, 11:45, 2:40, 4:25
Wed 10:45, 12:45, 2:55, 5:00, 7:00
Thu 9:00, 11:20, 1:15
Fri 7:30, 9:30, 11:30
Table 17a
Handling Complex Data Types with B2B Data Transformation Mon 2:00, 4:00
Tue 10:45, 1:45, 3:35
Wed 7:30, 11:45, 2:00, 4:05, 6:00
Thu 7:30, 10:10, 12:15, 2:15
Fri 8:30, 10:30
Table 17b
Application Consolidation & Migration Related
Simplifying Complex Data Integrations with Data Integration Hub Table 18
Implementing Trading Partner Integration with B2B Data Exchange Table 19
Operationalizing and Scaling your PowerCenter Environment Mon 1:00, 2:00
Tue 7:30, 10:45, 2:40, 3:35
Wed 10:45, 12:45, 5:00, 6:00, 7:00
Thu 7:30, 9:00, 11:20, 1:15
Fri 7:30, 9:30, 11:30
Table 20a
Effective Operations Management and Administration – What’s New Mon 3:00 – 3:45pm, 4:00 – 4:45pm
Tue 11:45 – 12:30pm, 1:45 – 2:30pm, 4:25 – 5:15pm
Wed 7:30 – 8:15am, 11:45 – 12:30pm, 2:55 – 3:40pm, 4:05 – 4:50pm
Thu 10:10 – 10:55am, 12:15 – 1:00pm, 2:15 – 3:00pm
Fri 8:30 – 9:15am, 10:30 – 11:15am
Table 20b
Getting the Most out of your Data Integration & Data Quality Platform – Performance and Scalability Tips & Tricks Mon 1:00, 3:00
Tue 7:30, 11:45, 2:40, 4:25
Wed 10:45, 12:45, 2:55, 5:00, 7:00
Thu 9:00, 11:20, 1:15
Fri 7:30, 9:30, 11:30
Table 21a
Getting the Most out of your Big Data Edition – Performance Best Practices Mon 2:00, 4:00
Tue 10:45, 1:45, 3:35
Wed 7:30, 11:45, 2:00, 4:05, 6:00
Thu 7:30, 10:10, 12:15, 2:15
Fri 8:30, 10:30
Table 21b
Modernizing and Consolidating Legacy and Application Data: Leveraging Data Services, Data Quality and Data Explorer Mon 1:00
Tue 10:45, 2:40, 4:25
Wed 10:45, 2:00, 2:55, 4:05, 7:00
Thu 11:20, 2:15
Fri 9:30, 10:30
Table 22a
Connect to *: Connectivity to Long Tail of Next Generation Data Sources Mon 2:00, 3:00
Tue 7:30, 11:45, 1:45
Wed 7:30, 12:45, 5:00
Thu 9:00, 10:10, 1:15
Fri 7:30, 8:30
Table 22b
Modernizing and Consolidating Legacy and Application Data with PowerExchange Mainframe and CDC Mon 4:00
Tue 3:35
Wed 11:45, 6:00
Thu 7:30, 12:15, 2:15
Fri 11:30
Table 22c
Retire Legacy Applications and Optimize Application Performance with Informatica Data Archive Table 23
Protect Salesforce Sandboxes with Cloud Data Masking Tue 3:35, 4:25
Wed 6:00, 7:00
Thu 1:15, 2:15
Fri 7:30
Table 24a
Optimally Provision Test Data Sets with Test Data Management Mon all times
Tue 7:30, 10:45, 11:45, 1:45, 2:40
Wed 7:30, 10:45, 11:45, 12:45, 2:00, 2:55, 4:05, 5:00
Thu 7:30, 9:00, 10:10, 11:20, 12:15
Fri 8:30, 9:30, 10:30, 11:30
Table 24b

Why “Gut Instincts” Needs to be Brought Back into Data Analytics


Last fall, at a large industry conference, I had the opportunity to conduct a series of discussions with industry leaders in a portable video studio set up in the middle of the conference floor. As part of our exercise, we had a visual artist do freeform storyboarding of the discussion on large swaths of five-foot by five-foot paper, which we then reviewed at the end of the session. For example, in a discussion of cloud computing, the artist drew a rendering of clouds, raining data on a landscape below, illustrated by sketches of office buildings. At a glance, one could get a good read of where the discussion went, and the points that were being made.

Data visualization is one of those up-and-coming areas that has just begun to breach the technology zone. There are some powerful front-end tools that help users see, at a glance, trends and outliers through graphical representations – be they scattergrams, histograms, 3D diagrams or something else eye-catching.  The “Infographic” that has become so popular in recent years is an amalgamation of data visualization and storytelling. The bottom line is that technology is making it possible to generate these representations almost instantly, enabling relatively quick understanding of what the data may be saying.

The power that data visualization brings to organizations was recently explored by Benedict Carey in The New York Times, where he discussed how data visualization is emerging as the natural solution to “big data overload.”

This is much more than a front-end technology fix, however. Rather, Carey cites a growing body of knowledge emphasizing the development of “perceptual learning,” in which people working with large data sets learn to “see” patterns and interesting variations in the information they are exploring. It’s almost a return of the “gut” feel for answers, but developed for the big data era.

As Carey explains it:

“Scientists working in a little-known branch of psychology called perceptual learning have shown that it is possible to fast-forward a person’s gut instincts both in physical fields, like flying an airplane, and more academic ones, like deciphering advanced chemical notation. The idea is to train specific visual skills, usually with computer-game-like modules that require split-second decisions. Over time, a person develops a ‘good eye’ for the material, and with it an ability to extract meaningful patterns instantaneously.”

Video games may be leading the way in this – Carey cites the work of Dr. Philip Kellman, who developed a video-game-like approach to training pilots to instantly “read” instrument panels as a whole, versus pondering every gauge and dial. He reportedly enabled pilots to absorb within one hour what normally took 1,000 hours of training. Such perceptual-learning-based training is now employed in medical schools to help prospective doctors become familiar with complicated procedures.

There are interesting applications for business, bringing together a range of talent to help decision-makers better understand the information they are looking at. In Carey’s article, an artist was brought into a medical research center to help scientists look at data in many different ways – to get out of their comfort zones. For businesses, it means getting away from staring at bars and graphs on their screens and perhaps turning data upside down or inside-out to get a different picture.


What’s Driving Core Banking Modernization?


When’s the last time you visited your local branch bank and spoke to a human being? How about talking to your banker over the phone?  Can’t remember?  Well, you’re not alone, and don’t worry, it’s not a bad thing. The days of operating physical branches with expensive workers to greet and service customers are giving way to more modern, customer-friendly mobile banking applications that let consumers deposit checks from their phones, apply for a mortgage and sign closing documents electronically, and skip the trip to the ATM for physical cash by using mobile payment solutions like Apple Pay.  In fact, a new report titled ‘Bricks + Clicks: Building the Digital Branch,’ from Jeanne Capachin and Jim Marous, takes an in-depth look at how banks and credit unions are changing their branch and customer channel strategies to meet the demand of today’s digital banking customer.

Why am I talking about this? These market trends are dominating the CEO and CIO agenda in today’s banking industry. I just returned from the 2015 IDC Asian Financial Congress event in Singapore, where the digital journey for the next-generation bank was a major agenda item. According to IDC Financial Insights, global banks will invest $31.5B USD in core banking modernization to enable these services, improve operational efficiency, and position themselves to better compete on technology and convenience across markets. Core banking modernization initiatives are complex, costly, and fraught with risks. Let’s take a closer look.

Share
Posted in Application Retirement, Architects, Banking & Capital Markets, Data Migration, Data Privacy, Data Quality, Vertical | Tagged , , | Leave a comment

How to Ace Application Migration & Consolidation (Hint: Data Management)

Myth Vs Reality: Application Migration & Consolidation (No, it’s not about dating)

Will your application consolidation or migration go live on time and on budget?  According to Gartner, “through 2019, more than 50% of data migration projects will exceed budget and/or result in some form of business disruption due to flawed execution.”1  That is a scary number by any measure. A colleague of mine put it well: ‘I wouldn’t get on a plane that had a 50% chance of failure.’ So should you be losing sleep over your migration or consolidation project? Well, that depends.  Are you the former CIO of Levi Strauss who, according to Harvard Business Review, was forced to resign after a botched SAP migration project and a $192.5 million earnings write-off?2  If so, perhaps you would feel a bit apprehensive. Otherwise, I say you can be cautiously optimistic, if you go into it with a healthy dose of reality. Please ensure you have a good understanding of the potential pitfalls and how to address them.  You need an appreciation for the myths and realities of application consolidation and migration.

First off, let me get one thing off my chest.  If you don’t pay close attention to your data throughout the application consolidation or migration process, you are almost guaranteed delays and budget overruns. Data consolidation and migration is at least 30%-40% of the application go-live effort. We have learned this by helping customers deliver over 1,500 projects of this type.  What’s worse, if you are not super meticulous about your data, you can be sure you will encounter unhappy business stakeholders at the end of this treacherous journey. The users of your new application expect all their business-critical data to be there at the end of the road. All the bells and whistles in your new application will matter naught if the data falls apart.  Imagine, if you will, students’ transcripts gone missing, or your frequent-flyer balance 100,000 miles short!  Need I say more?  Now, you may already be guessing where I am going with this.  That’s right, we are talking about the myths and realities related to your data!  Let’s explore a few of these.

Myth #1: All my data is there.

Reality #1: It may be there… but can you get it? If you want to find, access and move all the data out of your legacy systems, you must have a good set of connectivity tools to easily and automatically find, access and extract the data from your source systems. You don’t want to hand-code this for each source.  Ouch!
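To make the hand-coding pain concrete, here is the kind of per-source extraction script teams end up writing and maintaining when they lack proper connectivity tooling; multiply it by every legacy database, file format and API in your landscape and the cost becomes obvious. This is only a sketch: it uses Python’s built-in sqlite3 module as a stand-in for a legacy database connection, and the table and file names are hypothetical.

```python
import csv
import sqlite3

def extract_table(db_path, table, out_csv):
    """Pull every row of one legacy table into a CSV staging file."""
    conn = sqlite3.connect(db_path)  # stand-in for an ODBC/JDBC connection to a legacy source
    try:
        cur = conn.execute(f"SELECT * FROM {table}")
        columns = [d[0] for d in cur.description]
        with open(out_csv, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(columns)   # header row
            writer.writerows(cur)      # stream the data rows to the staging file
    finally:
        conn.close()

# Hypothetical usage: one call per table, per source system, per format quirk.
# extract_table("legacy_crm.db", "CUSTOMER_MASTER", "staging/customer_master.csv")
```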

Myth #2: I can just move my data from point A to point B.

Reality #2: You can try that approach if you want.  However, you might not be happy with the results.  The reality is that there can be significant gaps and format mismatches between the data in your legacy system and the data required by your new application. Additionally, you will likely need to assemble data from disparate systems. You need sophisticated tools to profile, assemble and transform your legacy data so that it is purpose-fit for your new application.
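As a minimal illustration of the kind of profiling that surfaces those gaps and mismatches before you build mappings, the sketch below reports null rates, distinct counts, maximum lengths and date-format violations for a single column. The column name and the expected date pattern are assumptions for the example; commercial profiling tools do far more than this.

```python
import re
from collections import Counter

DATE_PATTERN = re.compile(r"\d{4}-\d{2}-\d{2}$")  # format the target application expects (assumed)

def profile_column(rows, column):
    """Report null rate, distinct count, max length and format mismatches for one column."""
    values = [r.get(column) for r in rows]
    non_null = [v for v in values if v not in (None, "")]
    mismatches = [str(v) for v in non_null if not DATE_PATTERN.match(str(v))]
    return {
        "rows": len(values),
        "null_rate": 1 - len(non_null) / len(values) if values else 0.0,
        "distinct": len(set(non_null)),
        "max_length": max((len(str(v)) for v in non_null), default=0),
        "format_mismatches": len(mismatches),
        "worst_offenders": Counter(mismatches).most_common(3),
    }

# Hypothetical usage against a staging extract loaded as a list of dicts:
# print(profile_column(legacy_orders, "ORDER_DATE"))
```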

Myth #3: All my data is clean.

Reality #3: It’s not. And here is a tip: better to profile, scrub and cleanse your data before you migrate it. You don’t want to put a shiny new application on top of questionable data. In other words, let’s get a fresh start on the data in your new application!
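Here is a rough sketch of what a scrub-and-dedupe pass can look like; the rules (trim whitespace, case-fold emails, keep only digits in phone numbers, drop repeats of the same business key) are illustrative examples only, not a complete data quality program.

```python
import re

def cleanse_customer(rec):
    """Apply simple, illustrative standardization rules to one customer record."""
    rec = dict(rec)
    rec["name"] = " ".join(rec.get("name", "").split()).title()  # trim and normalize whitespace and case
    rec["email"] = rec.get("email", "").strip().lower()          # case-fold emails
    rec["phone"] = re.sub(r"\D", "", rec.get("phone", ""))       # keep digits only
    return rec

def cleanse_and_dedupe(records, key_fields=("email",)):
    """Standardize records and drop duplicates of the same business key, keeping the first seen."""
    seen, cleaned = set(), []
    for rec in map(cleanse_customer, records):
        key = tuple(rec.get(f) for f in key_fields)
        if key not in seen:
            seen.add(key)
            cleaned.append(rec)
    return cleaned
```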

Myth #4: All my data will move over as expected.

Reality #4: It will not.  Any time you move and transform large sets of data, there is room for logical or operational errors and surprises.  The best way to avoid this is to automatically validate that your data has moved over as intended.
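To make “validate that your data has moved over as intended” concrete, the sketch below compares row counts and order-independent column fingerprints between the source extract and the target load. It is only an illustration of the idea; automated validation products (such as Informatica’s data validation tooling) apply it declaratively and at much larger scale, and the column names in the usage comment are hypothetical.

```python
import hashlib

def column_fingerprint(rows, column):
    """Order-independent fingerprint of one column: XOR of per-value hashes."""
    fingerprint = 0
    for row in rows:
        digest = hashlib.sha256(str(row.get(column, "")).encode("utf-8")).hexdigest()
        fingerprint ^= int(digest[:16], 16)
    return fingerprint

def reconcile(source_rows, target_rows, columns):
    """Compare row counts and per-column fingerprints; return a list of discrepancies."""
    issues = []
    if len(source_rows) != len(target_rows):
        issues.append(f"row count: source={len(source_rows)} target={len(target_rows)}")
    for column in columns:
        if column_fingerprint(source_rows, column) != column_fingerprint(target_rows, column):
            issues.append(f"column '{column}' differs between source and target")
    return issues

# Hypothetical usage after a trial load:
# for problem in reconcile(legacy_rows, migrated_rows, ["customer_id", "balance"]):
#     print("VALIDATION FAILED:", problem)
```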

Myth #5: It’s a one-time effort.

Reality #5: ‘Load and explode’ is a formula for disaster.  Our proven methodology recommends you first prototype your migration path and identify a small subset of the data to move over. Then test it, tweak your model, try it again and gradually expand.  More importantly, your application architecture should not be a one-time effort.  It is a work in progress and really an ongoing journey.  Regardless of where you are on this journey, we recommend paying close attention to managing your application’s data foundation.

As you can see, there is a multitude of data issues that can plague an application consolidation or migration project and lead to its doom.  These potential challenges are not always recognized and understood early on.  This perception gap is a root cause of project failure. This is why we are excited to host Philip Russom of TDWI in our upcoming webinar to discuss data management best practices and methodologies for application consolidation and migration. If you are undertaking any IT modernization or rationalization project, such as consolidating applications or migrating legacy applications to the cloud or to an ‘on-prem’ application such as SAP, this webinar is a must-see.

So what’s your reality going to be like?  Will your project run like a dream or will it escalate into a scary nightmare? Here’s hoping for the former.  And also hoping you can join us for this upcoming webinar to learn more:

Webinar with TDWI:
Successful Application Consolidation & Migration: Data Management Best Practices.

Date: Tuesday March 10, 10 am PT / 1 pm ET

Don’t miss out, Register Today!

1) Gartner report titled “Best Practices Mitigate Data Migration Risks and Challenges” published on December 9, 2014

2) Harvard Business Review: ‘Why your IT project may be riskier than you think’.


The Billion Dollar (Data Integration) Mistake

How would you like to wake up to an extra billion dollars, or maybe nine, in the bank? This has happened to a teacher in India. He discovered to his astonishment a balance of $9.8 billion in his bank account!

How would you like to be the bank that gave the client an extra nine billion dollars? Oh, to be a fly on the wall when the IT department got that call. How do you even begin to explain? Imagine the scrambling to track down the source of the data error.

This was a glaringly obvious error, one that is easily caught. But there is potential for many smaller data errors. These errors may go undetected and add up, hurting your bottom line.  How could this type of data glitch happen? More importantly, how can you protect your organization from these types of errors in your data?

A primary source of data mistakes is insufficient testing during data integration. Any change or movement of data harbors risk to its integrity. Unfortunately, there are often insufficient IT resources to adequately validate the data. Some organizations validate the data manually. This is a lengthy, unreliable process, fraught with data errors. Furthermore, manual testing does not scale well to large data volumes or complex data changes, so the validation is often incomplete. Finally, some organizations simply lack the resources to conduct any level of data validation altogether.
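To contrast automated checking with manual spot-checking, here is a toy reconciliation test of the kind that would immediately flag a $9.8 billion balance appearing out of nowhere. It is a sketch only: the account fields and tolerance are assumptions, and commercial data validation tooling expresses the same idea as reusable, preconfigured rules rather than custom code.

```python
def reconcile_balances(source_accounts, target_accounts, tolerance=0.01):
    """Flag accounts whose migrated balance drifts from the source by more than a small tolerance."""
    expected = {a["account_id"]: a["balance"] for a in source_accounts}
    errors = []
    for account in target_accounts:
        source_balance = expected.get(account["account_id"])
        if source_balance is None:
            errors.append(f"{account['account_id']}: not present in source")
        elif abs(account["balance"] - source_balance) > tolerance:
            errors.append(f"{account['account_id']}: expected {source_balance}, loaded {account['balance']}")
    return errors

# Run automatically after every load instead of eyeballing a sample of rows:
# assert not reconcile_balances(core_rows, warehouse_rows), "data validation failed"
```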


Many of our customers have been able to successfully address this issue via automated data validation testing (also known as DVO). In a recent TechValidate survey, Informatica customers told us that they:

  • Reduce costs associated with data testing.
  • Reduce time associated with data testing.
  • Increase IT productivity.
  • Increase the business trust in the data.

Customers tell us some of the biggest potential costs relate to the damage control that occurs when something goes wrong with their data. The tale above, of our fortunate man and not-so-fortunate bank, is one example. Bad data can hurt a company’s reputation and lead to untold losses in market share and customer goodwill.  In today’s highly regulated industries, such as healthcare and financial services, the consequences of incorrect data can be severe. This can include heavy fines or worse.

Using automated data validation testing allows customers to save on ongoing testing costs and deliver reliable data. Just as important, it prevents pricey data errors, which require costly and time-consuming damage control. It is no wonder many of our customers tell us they are able to recoup their investment in less than 12 months!


The TechValidate survey shows us that customers are using data validation testing in a number of common use cases, including:

  • Regression (Unit) testing
  • Application migration or consolidation
  • Software upgrades (Applications, databases, PowerCenter)
  • Production reconciliation

One of the most beneficial use cases for data validation testing has been application migration and consolidation. Many SAP migration projects undertaken by our customers have greatly benefited from automated data validation testing.  Application migration or consolidation projects are typically large and risky. A Bloor Research study has shown that 38% of data migration projects fail, incur overages or are aborted altogether. According to a Harvard Business Review article, 1 in 6 large IT projects runs 200% over budget. Poor data management is one of the leading pitfalls in these types of projects. However, according to Bloor Research, Informatica’s data validation testing is a capability they have not seen elsewhere in the industry.

A particularly interesting example of the above use case arises in an M&A situation. The merged company is required to deliver ‘day-1 reporting’. However, FTC regulations forbid the separate entities from seeing each other’s data prior to the merger. What a predicament! The automated nature of data validation testing (automatically deploying preconfigured rules on large data sets) enables our customers to prepare for successful day-1 reporting under these harsh conditions.
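Purely to illustrate what “preconfigured rules deployed automatically” can look like, the sketch below declares validation rules as data and applies the same rule set unattended to whichever extract arrives, so no one has to inspect the other party’s records. The rule names, columns and allowed values are invented for the example and are not a description of any specific product.

```python
# Rules are written once, before day 1, and then run unattended against each entity's extract.
RULES = [
    {"name": "no_null_keys",   "column": "account_id", "check": lambda v: v not in (None, "")},
    {"name": "balance_is_num", "column": "balance",    "check": lambda v: isinstance(v, (int, float))},
    {"name": "currency_code",  "column": "currency",   "check": lambda v: v in {"USD", "EUR", "GBP"}},
]

def apply_rules(rows, rules=RULES):
    """Run every preconfigured rule over every row and summarize failures per rule."""
    failures = {rule["name"]: 0 for rule in rules}
    for row in rows:
        for rule in rules:
            if not rule["check"](row.get(rule["column"])):
                failures[rule["name"]] += 1
    return failures

# Each company's data is checked by the same rule set without a human ever viewing the records.
```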

And what about you?  What are the costs to your business of potentially delivering incorrect, incomplete or missing data? To learn more about how you can provide the right data on time, every time, please visit www.datavalidation.me


Comparative Costs and Uses for Data Integration Platforms – A Study from Bloor Research

For years, companies have wrestled with the promise of data integration platforms. Along the way, businesses have asked many questions, including:

  • Does Data Integration technology truly provide a clear path toward unified data?
  • Can businesses truly harness the potential of their information?
  • Can companies take powerful action as a result?

Recently, Bloor Research set out to evaluate how things were actually playing out on the ground. In particular, they wanted to determine which data integration projects were actually taking place, at what scale, and with what results. The study, “Comparative Costs and Uses for Data Integration Platforms,” was authored by Philip Howard, research director at Bloor. The study examined data integration tool suitability across a range of scenarios, including:

  1. Data migration and consolidation projects
  2. Master data management (MDM) and associated solutions
  3. Application-to-application integration
  4. Data warehousing and business intelligence implementations
  5. Synching data with SaaS applications
  6. B2B data exchange

To draw conclusions, Bloor examined 292 responses from a range of companies. The respondents used a variety of data integration approaches, from commercial data integration tools to “hand-coding.”

Informatica is pleased to be able to offer you a copy of this research for your review. The research covers areas like:

  • Suitability
  • Productivity
  • Reusability
  • Total Cost of Ownership (TCO)

We welcome you to download a copy of “Comparative Costs and Uses for Data Integration Platforms” today. We hope these findings offer you insights as you implement and evaluate your data integration projects and options.


Oracle Data Migration Best Practices: Join Us And Learn

Are you interested in Oracle Data Migration Best Practices? Are you upgrading, consolidating or migrating to or from an Oracle application? Moving to the cloud or a hosted service? Research and experience confirm that the tasks associated with migrating application data during these initiatives have the biggest impact on whether the project is considered a failure or a success. So how do your peers ensure data migration success?

Informatica will be offering a full-day Oracle Migrations Best Practices workshop at the Oracle Applications Users Group’s annual conference, Collaborate 14, this year on April 7th in Las Vegas, NV. During this workshop, peers and experts will share best practices for how to avoid the pitfalls and ensure successful projects, lowering migration cost and risk. Our packed agenda includes:

  1. Free use and trials of data migration tools and software
  2. Full training sessions on how to integrate cloud-based applications
  3. How to provision test data using different data masking techniques
  4. How to ensure consistent application performance during and after a migration
  5. A review of Oracle Migration Best Practices and case studies

Case Study: EMC

One of the key case studies that will be highlighted is EMC’s Oracle migration journey. EMC Corporation migrated to Oracle E-Business Suite, acquired more than 40 companies in 4 years, consolidated and retired environments, and is now on its path to migrating to SAP. Not only did they migrate applications, but they also migrated their entire technology platform from physical to virtual on their journey to the cloud. They needed to control the impact of data growth along the way and manage the size of their test environments, while reducing the risk of exposing sensitive data to unauthorized users during development cycles. With best practices, and with help from Informatica, they estimate that they have saved approximately $45M in IT costs throughout their migrations. Now they are deploying a new analytics platform based on Hadoop, leveraging existing skill sets and Informatica tools to ensure data is loaded into Hadoop without missing a beat.

Case Study: Verizon

Verizon is the second case study we will be discussing. They recently migrated to Salesforce.com and needed to ensure that more than 100 data objects were integrated with on-premises, back end applications. In addition, they needed to ensure that data was synchronized and kept secure in non-production environments in the cloud. They were able to leverage a cloud-based integration solution from Informatica to simplify their complex IT application architecture and maintain data availability and security – all while migrating a major business application to the cloud.

Case Study: OEM Heavy Equipment Manufacturer

The third case study we will review involves a well-known heavy equipment manufacturer facing a couple of challenges: the first was the need to separate data in an Oracle E-Business Suite application as a result of a divestiture; the second was the need to control the impact of data growth on production application environments that were going through various upgrades. Using an innovative approach based on Smart Partitioning, this enterprise estimates it will save $23M over a 5-year period while achieving 40% performance improvements across the board.

To learn more about what Informatica will be sharing at Collaborate 14, watch this video. If you are planning to attend Collaborate 14 this year and you are interested in joining us, you can register for the Oracle Migrations Best Practices Workshop here.


Streamlining and Securing Application Test and Development Processes

Informatica recently hosted a webinar with Cognizant, who shared how they streamline test data management processes internally with Informatica Test Data Management and pass the benefits on to their customers.  Proclaimed the world’s largest Quality Engineering and Assurance (QE&A) service provider, Cognizant has over 400 customers and thousands of testers and is considered a thought leader in the testing practice.

We polled over 100 attendees on their top challenges with test data management, considering data and system complexities and the need to protect their clients’ sensitive data.  Here are the results from that poll:

It was not surprising to see that generating test data sets and securing sensitive data in non-production environments were tied as the top two biggest challenges.  Data integrity/synchronization was a very close third.

Cognizant, with Informatica, has been evolving its test data management offering to focus not only on securing sensitive data but also on improving testing efficiency by identifying, provisioning and resetting test data – tasks that consume as much as 40% of testing cycle time.  As part of the next-generation test data management platform, key components of that solution include:

Sensitive Data Discovery – an integrated and automated process that searches data sets looking for exposed sensitive data.  Many times, sensitive data resides in test copies unbeknownst to auditors.  Once the data has been located, it can be masked in non-production copies.

Persistent Data Masking – masks sensitive data in-flight while cloning data from production or in-place on a gold copy.  Data formats are preserved while original values are completely protected.

Data Privacy Compliance Validation – auditors want to know that data has in fact been protected, so the ability to validate and report on data privacy compliance becomes critical.

Test Data Management – in addition to creating test data subsets, clients require the ability to synthetically generate test data sets, eliminating defects by aligning data sets to optimize each test case. Also, in many cases multiple testers work in the same environment and may clobber each other’s test data sets, so the ability to reset test data becomes a key requirement for improving efficiency.


Figure 2: Next Generation Test Data Management
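As a rough illustration of the persistent masking idea described above (and not a description of Informatica’s actual masking algorithms), the sketch below replaces sensitive values with deterministic stand-ins so that masked test copies stay consistent across tables while the original values never leave production. The secret key, column names and email handling are assumptions made for the example.

```python
import hashlib
import hmac

SECRET_KEY = b"rotate-me-and-keep-out-of-source-control"  # hypothetical masking key

def mask_value(value, alphabet="0123456789", length=None):
    """Deterministically replace a value with same-length characters drawn from the given alphabet."""
    length = length or len(value)
    digest = hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).digest()
    return "".join(alphabet[b % len(alphabet)] for b in digest[:length])

def mask_customer(row):
    """Mask sensitive columns while leaving non-sensitive attributes usable for testing."""
    masked = dict(row)
    masked["ssn"] = mask_value(row["ssn"])  # digits in, digits out, format preserved
    masked["email"] = mask_value(row["email"].split("@")[0],
                                 "abcdefghijklmnopqrstuvwxyz") + "@example.com"
    return masked

# Because masking is deterministic, joins across masked tables still line up in every refresh.
```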

When asked what tools or services have been deployed, 78% said in-house developed scripts and utilities.  This is an incredibly time-consuming approach, and one with limited repeatability. Data masking was deployed by almost half of the respondents.


Informatica and Cognizant are leading the way to a new standard for test data management by incorporating test data generation, data masking, and the ability to refresh or reset test data sets.  For more information, check out Cognizant’s offering based on Informatica, TDMaxim, and the white paper Transforming Test Data Management for Increased Business Value.

 


Enterprise Application Projects Are Much Riskier Than You Think

IT application managers are constantly integrating, modernizing and consolidating enterprise applications to keep them efficient and delivering maximum business value to the corporation for their cost.

But it is important to remember that there is significant risk in these projects.  An article in the Harvard Business Review states that 17% of enterprise application projects go seriously wrong, running over budget by 200% and over schedule by 70%.  The HBR article refers to these projects as “black swans.”

How can you reduce this risk of project failure?  Typically, 30% to 40% of an enterprise application project is data migration.  A recent study by Bloor Research shows that while success rates for data migration projects are improving, 38% of them still miss their schedule and budget targets.

How can you improve the odds of success in data migration projects?

  1. Use data profiling tools to understand your data before you move it (a minimal sketch of steps 1 and 2 follows this list).
  2. Use data quality tools to correct data quality problems.  There is absolutely no point in moving bad data around the organization – but it happens.
  3. Use a proven external methodology. In plain English, work with people who have “done it before.”
  4. Develop your own internal competence.  Nobody knows your data, and more importantly, the business context of your data, better than your own staff.  Develop the skills and engage your business subject matter experts.
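As promised above, here is a minimal sketch of what steps 1 and 2 can look like in code: profile the columns you care about, then gate the migration on a basic completeness rule. The column names and the 98% threshold are hypothetical, and commercial profiling and data quality tools go far beyond checks like these.

```python
def profile(rows, column):
    """Basic profile for one column: completeness and distinct-value count."""
    values = [row.get(column) for row in rows]
    filled = [v for v in values if v not in (None, "")]
    return {
        "completeness": len(filled) / len(values) if values else 0.0,
        "distinct": len(set(filled)),
    }

def quality_gate(rows, required_columns, min_completeness=0.98):
    """Refuse to migrate until every required column is sufficiently populated."""
    failing = {}
    for column in required_columns:
        stats = profile(rows, column)
        if stats["completeness"] < min_completeness:
            failing[column] = stats
    return failing  # an empty dict means the data set passes the gate

# Hypothetical usage before kicking off the load:
# assert not quality_gate(source_rows, ["customer_id", "email", "country"]), "fix the data first"
```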

Informatica has industry-leading tools, a proven methodology, and a service delivery team with hundreds of successful data migration implementations.

To find out more about successful data migration:

  • Informatica World: Visit us at the Hands-On Lab – Data Migration.
  • Informatica World: Informatica Presentation on Application Data Migration.

Application Data Migrations with Informatica Velocity Migration Methodology

Wednesday June 5, 2013     9:00 to 10:00

  • Informatica World: Data Migration Factory Presentation by Accenture

Accelerating the Power of Data Migration

Tuesday June 4, 2013     2:00 to 3:00

 
