Tag Archives: data

A Data Integration Love-Fest in Vegas

Question: What do American Airlines, Liberty Mutual, Discount Tire and MD Anderson all have in common?

Is it?

a) They are all top in their field.

b) They all view data as critical to their business success.

c) They are all using Agile Data Integration to drive business agility.

d) They have spoken about their Data Integration strategy at Informatica World in Vegas.

Did you answer "all of the above"? If so, give yourself a Ding Ding Ding. Or shall we say Ka-Ching, in honor of our host city?

Indeed, data experts from these companies and many more flocked to Las Vegas for Informatica World. They shared their enthusiasm for the important role of data in their business, and these industry leaders discussed best practices that facilitate an Agile Data Integration process.

American Airlines recently completed a merger with US Airways, making them the largest airline in the world. In order to service critical reporting requirements for the merged airlines, the enterprise data team undertook a huge Data Integration task. This effort involved large-scale data migration and included many legacy data sources. The project required transferring over 4TB of current history data for Day 1 reporting, and a major task still remains: integrating multiple combined subject areas in order to give a full picture of combined reporting.

American Airlines architects recommend the use of Data Integration design patterns in order to improve agility. The architects shared success factors for merger Data Integration. They discussed the importance of ownership by leadership from IT and business, and emphasized the benefit of open and honest communications between teams. The architects also highlighted the need to identify integration teams and priorities. Finally, they discussed the significance of understanding cultural differences and celebrating success. The team summarized with merger Data Integration lessons learned: metadata is key, IT and business collaboration is critical, and profiling and access to the data are helpful.

Liberty Mutual, the third largest property and casualty insurer in the US, has grown through acquisitions. The Data Integration team needs to support this business process, and has been busy integrating five claim systems into one. They are faced with a large-scale Data Integration challenge. To add to the complexity, their business requires that each phase be completed in one weekend, that no data be lost in the process, and that all finances balance out at the end of each merge. Integrating all claims in a single location was critical for smooth processing of insurance claims. A single system also leads to reduced costs and complexity for support and maintenance.

Liberty Mutual experts recommend a methodology of work preparation, profiling, delivery and validation.  Rinse and repeat. Additionally, the company chose to utilize a visual Data Integration tool. This tool was quick and easy for the team to learn and greatly enhanced development agility.
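To make the profiling step in that methodology concrete, here is a minimal sketch of column-level profiling in Python. It is a generic illustration, not Liberty Mutual's actual tooling, and the file and field names are hypothetical.

```python
# A minimal profiling sketch: for each column in a source extract, report the null rate,
# distinct count and a few sample values, so problem fields surface before migration weekend.
import pandas as pd

def profile(df: pd.DataFrame) -> pd.DataFrame:
    """Return a simple per-column profile of a source extract."""
    rows = []
    for col in df.columns:
        series = df[col]
        rows.append({
            "column": col,
            "null_rate": series.isna().mean(),           # share of missing values
            "distinct": series.nunique(dropna=True),     # cardinality
            "sample": series.dropna().head(3).tolist(),  # a few example values
        })
    return pd.DataFrame(rows)

# Example: profile a claims extract before loading it into the consolidated system.
claims = pd.read_csv("claims_extract.csv")   # hypothetical file name
print(profile(claims))
```

A report like this, run as part of the "profiling" phase, is one lightweight way to decide how much cleansing a given source will need before the delivery and validation steps.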

Discount Tire, the largest independent tire dealer in the USA, shared tips and tricks from migrating legacy data into a new SAP system. This complex project included data conversion from 50 legacy systems. The company needs to combine and aggregate data from many systems, including customer, sales, financial and supply chain. This integrated system helps Discount Tire make key business decisions and stay ahead in a highly competitive space.

Discount Tire has automated their data validation process in development and in production. This reduces testing time, minimizes data defects and increases agility of  development and operations. They have also implemented proactive monitoring in order to accomplish early detection and correction of data problems in production.
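As a hedged illustration of what such automated validation can look like (the session did not share Discount Tire's actual implementation, and the file and column names below are assumptions), a simple source-to-target reconciliation might compare row counts and numeric totals between a legacy extract and the SAP load:

```python
# A sketch of automated source-to-target validation: compare row counts and per-column
# totals between a legacy extract and the loaded target, and flag any mismatch.
import pandas as pd

def validate_load(source: pd.DataFrame, target: pd.DataFrame, numeric_cols: list[str]) -> list[str]:
    """Return a list of human-readable validation failures (an empty list means pass)."""
    failures = []
    if len(source) != len(target):
        failures.append(f"row count mismatch: source={len(source)}, target={len(target)}")
    for col in numeric_cols:
        src_sum, tgt_sum = source[col].sum(), target[col].sum()
        if abs(src_sum - tgt_sum) > 1e-6:
            failures.append(f"{col}: source sum {src_sum} != target sum {tgt_sum}")
    return failures

# Example usage with hypothetical extracts:
legacy = pd.read_csv("legacy_sales.csv")
sap = pd.read_csv("sap_sales_load.csv")
for problem in validate_load(legacy, sap, ["quantity", "net_amount"]):
    print("VALIDATION FAILED:", problem)
```

Run in development it shortens testing cycles; run on a schedule in production it becomes the kind of proactive monitoring that catches data defects early.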

MD Anderson Cancer Center is the No. 1 hospital for cancer care in the US according to U.S. News and World Report.  They are pursuing the lofty goal of erasing cancer from existence. Data Integration is playing an important role in this fight against cancer. In order to accomplish their goal, MD Anderson researchers rely on integration of vast amounts of genomic, clinical and pharmaceutical data to facilitate leading-edge cancer research.

MD Anderson experts pursue Agile Data Integration through close collaboration between IT and business stakeholders.  This enables them to meet the data requirements of the business faster and better. They shared that data insights, through metadata management, offer a significant value to the organization. Finally the experts at MD Anderson believe in ‘Map Once, Deploy Anywhere’ in order to accomplish Agile Data Integration.

So let’s recap, Data Integration is helping:

- An airline to continue to serve its customers and run its business smoothly post-merger,

- A tire retail company to procure and provide tires to its customers and maintain its leadership,

- An insurance company to process claims accurately and in a timely manner, while minimizing costs, and

- A cancer research center to cure cancer.

Not too shabby, right? Data Integration is clearly essential to business success!

So OK, I know, I know… what happens in Vegas, stays in Vegas. Still, this was one love-fest I was compelled to share! Wish you were there. Hopefully you will be next year!

To learn more about Agile Data Integration, check out this webinar: Great Data by Design II: How to Get Started with Next-Gen Data Integration

 


Oh the Data I’ve Seen…

Eighteen months ago, I was sitting in a conference room, nothing remarkable except for the great view down 6th Avenue toward the Empire State Building. The pre-sales consultant sitting across from me had just given a visually appealing demonstration to the CIO of a multinational insurance corporation. There were fancy graphics and colorful charts sharply displayed on an iPad and refreshing every few seconds. The CIO asked how long it had taken to put the presentation together. The consultant excitedly shared that it only took him four to five hours, to which the CIO responded, “Well, if that took you less than five hours, we should be able to get a production version in about two to three weeks, right?”

The facts of the matter, however, were completely different. The demo, while running with the firm’s own data, had been running from a spreadsheet housed on the consultant’s laptop and procured after several weeks of scrubbing, formatting, and aggregating data from the CIO’s team; that does not even count the preceding data procurement process. And so, as the expert in the room, the voice of reason, the CIO turned to me wanting to know how long it would take to implement the solution. At least six months, was my assessment. I had seen their data, and it was a mess. I had seen the data flows, which were far from a model architecture, and the sheer volume of data was daunting. If it were not architected correctly, the pretty colors and graphs would take much longer to refresh. This was not the answer he wanted to hear.

The advancement of social media, new web experiences and cutting-edge mobile technology has driven users to expect more of their applications. As enterprises push to drive value and unlock more potential in their data, insurers of all sizes have attempted to implement analytical and business intelligence systems. But here’s the truth: by and large, most insurance enterprises are not in a place with their data to make effective use of the new technologies in BI, mobile or social. The reality is that data cleansing, fitting for purpose, movement and aggregation are being done in the BI layer when they should be done lower down in the stack, so that all applications can take advantage of them.

Let’s face it – quality data is important. Movement and shaping of data in the enterprise is important.  Identification of master data and metadata in the enterprise is important and data governance is important.  It brings to mind episode 165, “The Apology”, of the mega-hit show Seinfeld.  Therein George Costanza accuses erstwhile friend Jason Hanky of being a “step skipper”.  What I have seen in enterprise data is “step skipping” as users clamor for new and better experiences, but the underlying infrastructure and data is less than ready for consumption.  So the enterprise bootstraps, duct tapes and otherwise creates customizations where it doesn’t architecturally belong.

Clearly this calls for a better solution: a more robust and architecturally sustainable data ecosystem, one which shepherds the data from acquisition through to consumption and all points in between. It also must be attainable by even modestly sized insurance firms.

First, you need to bring the data under your control. That may mean external data integration, or just moving it from transactional, web, or client-server systems into warehouses, marts or other large data storage schemes and back again. But remember, the data is in various stages of readiness. This means that, through out-of-the-box or custom cleansing steps, the data needs to be processed, enhanced and stored in a way that is more in line with corporate goals for governing the quality of that data. And this says nothing of the need to change a data normalization factor between source and target. When implemented as a “factory” approach, bringing new data streams online, integrating them quickly and maintaining high standards become small incremental changes and not a ground-up monumental task. Move your data shaping, cleansing, standardization and aggregation further down in the stack and many applications will benefit from the architecture.
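To illustrate the “factory” idea, here is a minimal sketch under stated assumptions: each new data stream is onboarded by composing the same reusable cleansing steps rather than hand-building a one-off pipeline per source. The step names, field names and file name are illustrative only, not any particular vendor’s API.

```python
# A "factory" sketch: shared cleansing steps composed into a pipeline, so onboarding a
# new source is a small incremental change instead of a ground-up build.
from typing import Callable
import pandas as pd

CleansingStep = Callable[[pd.DataFrame], pd.DataFrame]

def trim_strings(df: pd.DataFrame) -> pd.DataFrame:
    # Strip stray whitespace from every text column.
    return df.apply(lambda s: s.str.strip() if s.dtype == "object" else s)

def standardize_state(df: pd.DataFrame) -> pd.DataFrame:
    # Normalize a hypothetical 'state' column to upper-case two-letter codes.
    if "state" in df.columns:
        df = df.assign(state=df["state"].str.upper().str.slice(0, 2))
    return df

def drop_duplicate_keys(df: pd.DataFrame) -> pd.DataFrame:
    # Deduplicate on a hypothetical business key when it is present.
    if "policy_id" in df.columns:
        return df.drop_duplicates(subset=["policy_id"])
    return df

def run_pipeline(df: pd.DataFrame, steps: list[CleansingStep]) -> pd.DataFrame:
    for step in steps:
        df = step(df)
    return df

# Onboarding a new stream reuses the shared steps rather than duplicating the logic.
standard_steps = [trim_strings, standardize_state, drop_duplicate_keys]
clean = run_pipeline(pd.read_csv("new_source_extract.csv"), standard_steps)
```

Because the shaping and standardization live below the BI layer, every downstream application consumes the same cleansed data rather than repeating the work.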

Critical to this process is that insurance enterprises need to ensure the data remains secure, private and is managed in accordance with rules and regulations. They must also govern the archival, retention and other portions of the data lifecycle.

At any point in the life of your information, you are likely sending or receiving data from an agent, broker, MGA or service provider, which needs to be processed using the robust ecosystem described above. Once an effective data exchange infrastructure is implemented, the steps to process the data can nicely complement your setup as information flows to and from your trading partners.

Finally, as your enterprise determines “how” to implement these solutions, you may look to a cloud based system for speed to market and cost effectiveness compared to on-premises solutions.

And don’t forget to register for Informatica World 2014 in Las Vegas, where you can take part in sessions and networking tailored specifically for insurers.


Data is the Key to Value-based Healthcare

The transition to value-based care is well underway. From healthcare delivery organizations to clinicians, payers, and patients, everyone feels the impact.  Each has a role to play. Moving to a value-driven model demands agility from people, processes, and technology. Organizations that succeed in this transformation will be those in which:

  • Collaboration is commonplace
  • Clinicians and business leaders wear new hats
  • Data is recognized as an enterprise asset

The ability to leverage data will differentiate the leaders from the followers. Successful healthcare organizations will:

  1. Establish analytics as a core competency
  2. Rely on data to deliver best practice care
  3. Engage patients and collaborate across the ecosystem to foster strong, actionable relationships

Trustworthy data is required to power the analytics that reveal the right answers, to define best practice guidelines and to identify and understand relationships across the ecosystem. In order to advance, data integration must also be agile. The right answers do not live in a single application. Instead, the right answers are revealed by integrating data from across the entire ecosystem. For example, in order to deliver personalized medicine, you must analyze an integrated view of data from numerous sources. These sources could include multiple EMRs, genomic data, data marts, reference data and billing data.
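As a hedged illustration of that kind of cross-source integration (the source names, file names and columns below are hypothetical, and this is not a description of any specific product), an integrated patient view might be assembled by joining extracts on a shared patient identifier:

```python
# Build an integrated patient view by combining extracts from multiple sources
# on a common patient identifier; this is the view analytics would run against.
import pandas as pd

emr_a    = pd.read_csv("emr_system_a.csv")      # e.g., encounters and diagnoses
emr_b    = pd.read_csv("emr_system_b.csv")      # a second EMR from an acquired clinic
genomics = pd.read_csv("genomic_results.csv")   # variant results keyed by patient
billing  = pd.read_csv("billing_extract.csv")   # claims and charges

# Stack the two EMRs, then enrich with genomic and billing data per patient.
clinical = pd.concat([emr_a, emr_b], ignore_index=True)
integrated = (
    clinical
    .merge(genomics, on="patient_id", how="left")
    .merge(billing,  on="patient_id", how="left")
)

print(integrated.head())
```

In practice the hard work sits upstream of the joins: matching patient identities across systems and keeping the data trustworthy, which is exactly where the business users who understand the data need to be involved.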

A recent PWC survey showed that 62% of executives believe data integration will become a competitive advantage.  However, a July 2013 Information Week survey reported that 40% of healthcare executives gave their organization only a grade D or F on preparedness to manage the data deluge.


What grade would you give your organization?

You can improve your organization’s grade, but it will require collaboration between business and IT.  If you are in IT, you’ll need to collaborate with business users who understand the data. You must empower them with self-service tools for improving data quality and connecting data.  If you are a business leader, you need to understand and take an active role with the data.

To take the next step, download our new eBook, “Potential Unlocked: Transforming healthcare by putting information to work.”  In it, you’ll learn:

  1. How to put your information to work
  2. New ways to govern your data
  3. What other healthcare organizations are doing
  4. How to overcome common barriers

So go ahead, download it now and let me know what you think. I look forward to hearing your questions and comments… oh, and your grade!


Nine Forms of Analytics Data That Matter the Most

Big Data takes a lot of forms and shapes, and flows in from all over the place – from the Internet, from devices, from machines, and even from cars. In all the data being generated are valuable nuggets of information.

The challenge is being able to find the right data needed, and being able to employ that data to solve a business challenge. What types of data are worthwhile for organizations to capture?

In his new book, Taming the Big Data Tidal Wave: Finding Opportunities in Huge Data Streams With Advanced Analytics, Bill Franks provides a wide array of examples of the types of data that can best meet the needs of business today. Franks, chief analytics officer with Teradata, points out that his list is not exhaustive, as there is an almost unlimited number of sources that will only keep growing as users discover new ways to apply the data. (more…)


Why Data Integration Technology Beats Manual Coding Every Time

With the ready availability of data integration technology, it’s amazing to me that the use of manual coding for data integration flows is even a consideration. However, based upon this article in SearchDataManagement, the concept is still out there.

Of course the gist of the article is that hand coding is no longer considered the most productive way to go, which is correct. However, just the fact that this is still an issue and a consideration for anyone moving to data integration solutions perplexes me. Perhaps it’s the new generation of architects and data management professionals who need a quick lesson on the pitfalls of doing data integration by hand.  (more…)


Leverage Big Data or Go Out of Business

I’m sitting in the Taiwan airport on my way to Guangzhou. We just completed the Informatica World Tour in Hong Kong, Beijing and Taiwan, and I’ve had the opportunity to deliver the keynote presentation, Maximize Your Return on Big Data.

All of our audiences exceeded our expectations. We had 50% more attendees than planned. Why? Big data. It is a hot topic and everyone is trying to determine how to leverage big data in their enterprise to get a competitive advantage. At the event, I made the point – if you’re not trying to understand how to leverage big data in your enterprise, your successor will. Kitty Fok, the IDC China Country Manager, spoke after me. Her consistent comment was – “if your company isn’t looking to leverage big data, you will be out of business.” (more…)


Data Governance and Technical Issues

In contrast to addressing the management and process issues, we might say that the technical issues are actually quite straightforward to address. In my original enumeration from a few posts back, I ordered the data issue categories in the reverse order of the complexity of their solution. Model and information architecture problems are the most challenging, because of the depth to which business applications are inherently dependent on their underlying models. Even simple changes require significant review to make sure that no expected capability is inadvertently broken. (more…)


Stop Hoarding Data – Retire Your Old, Redundant Applications!

Just like your house needs yearly spring cleaning and you need to regularly throw out old junk, your application portfolio needs periodic review and rationalization to identify legacy, redundant applications that can be decommissioned to reduce bloat and save costs. If you have a hard time letting go of old stuff, it’s probably even harder for your application users to let go of access to their data. However, retiring applications doesn’t have to mean that you also lose the data within them. If the data within those applications are still needed for periodic reporting or for regulatory compliance, then there are still ways to retain the data without maintaining the application.  (more…)
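As a generic sketch of that idea (not a description of the archiving product the post alludes to; the database and directory names are hypothetical), the data from a legacy application can be exported to flat files that keep serving reporting and compliance needs after the application itself is switched off:

```python
# Export every table from a hypothetical legacy application database to flat files,
# so the data remains queryable for reporting and compliance after decommissioning.
import os
import sqlite3
import pandas as pd

os.makedirs("archive", exist_ok=True)
legacy_db = sqlite3.connect("legacy_app.db")   # hypothetical legacy application database

tables = pd.read_sql_query(
    "SELECT name FROM sqlite_master WHERE type='table'", legacy_db
)["name"]

for table in tables:
    df = pd.read_sql_query(f"SELECT * FROM {table}", legacy_db)
    df.to_csv(f"archive/{table}.csv", index=False)   # retained data, no running app needed
    print(f"archived {table}: {len(df)} rows")

legacy_db.close()
```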


Classifying Types of Data Management Issues

Coincidentally, my company is involved with a number of different customers who are reviewing the quality criteria associated with addresses. Each scenario has different motivations for assessing address data quality. One use case focuses on administrative management – ensuring that things that need to happen at a particular location have an accurate and valid address. A different use case considers one aspect of regulatory compliance regarding protection of private information (since mail delivered to the wrong address is a potential exposure of the private information contained within the envelope). Another compliance use case looks at timely delivery of hard copy notifications as part of a legal process, requiring the correct address. (more…)
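A simple sketch of the kind of rule-based address quality checks such assessments start from appears below; real validation would use postal reference data, and the field names and file are assumptions for illustration:

```python
# Flag address records that violate basic quality rules (missing fields, malformed ZIP).
import re
import pandas as pd

US_ZIP = re.compile(r"^\d{5}(-\d{4})?$")

def address_issues(row: pd.Series) -> list[str]:
    """Return the list of quality rules this address record violates."""
    issues = []
    if not str(row.get("street", "")).strip():
        issues.append("missing street")
    if not str(row.get("city", "")).strip():
        issues.append("missing city")
    if not US_ZIP.match(str(row.get("zip", ""))):
        issues.append("malformed ZIP code")
    return issues

addresses = pd.read_csv("customer_addresses.csv")   # hypothetical extract
addresses["issues"] = addresses.apply(address_issues, axis=1)
print(addresses[addresses["issues"].map(len) > 0])
```

The same checks serve all three use cases; what differs is the severity each business context assigns to a failure.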


Scoping Failure Analysis

In adapting the six-sigma technique of failure mode and effects analysis for data quality management, we are hoping to proactively identify the potential errors that lead to the most severe business impacts and then strengthen the processes and applications to prevent errors from being introduced in the first place. In my last post, though, I noted that the approach to this analysis starts with the errors and then figures out the impacts. I think we should go the other way so as to optimize the effort and reduce the analysis time to focus on the most important potentialities. (more…)
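As a small, hypothetical illustration of that reordering, the classic FMEA score (severity × occurrence × detection) can still be computed, but candidate failure modes are ranked by business impact (severity) first, so analysis effort goes to the most damaging potential errors. The failure modes and scores below are made up for illustration:

```python
# Impact-first FMEA scoring for data quality: rank by severity, break ties with RPN.
failure_modes = [
    # (description, severity 1-10, occurrence 1-10, detectability 1-10; higher detect = harder to catch)
    ("duplicate customer records created at load", 7, 6, 4),
    ("invalid address accepted by order entry",    5, 7, 3),
    ("billing amount truncated during conversion", 9, 2, 8),
]

scored = [
    {"mode": m, "severity": s, "rpn": s * o * d}   # RPN = severity x occurrence x detection
    for (m, s, o, d) in failure_modes
]

for item in sorted(scored, key=lambda x: (x["severity"], x["rpn"]), reverse=True):
    print(f'{item["mode"]}: severity={item["severity"]}, RPN={item["rpn"]}')
```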
