
How to Ace Application Migration & Consolidation (Hint: Data Management)

Myth Vs Reality: Application Migration & Consolidation (No, it’s not about dating)

Will your application consolidation or migration go live on time and on budget? According to Gartner, “through 2019, more than 50% of data migration projects will exceed budget and/or result in some form of business disruption due to flawed execution.”1 That is a scary number by any measure. A colleague of mine put it well: “I wouldn’t get on a plane that had a 50% chance of failure.” So should you be losing sleep over your migration or consolidation project? Well, that depends. Are you the former CIO of Levi Strauss who, according to Harvard Business Review, was forced to resign after a botched SAP migration project and a $192.5 million earnings write-off?2 If so, perhaps you would feel a bit apprehensive. Otherwise, I say you can be cautiously optimistic, provided you go into it with a healthy dose of reality: a good understanding of the potential pitfalls, how to address them, and an appreciation for the myths and realities of application consolidation and migration.

First off, let me get one thing off my chest. If you don’t pay close attention to your data throughout the application consolidation or migration process, you are almost guaranteed delays and budget overruns. Data consolidation and migration is at least 30%-40% of the application go-live effort; we have learned this by helping customers deliver over 1,500 projects of this type. What’s worse, if you are not meticulous about your data, you are sure to encounter unhappy business stakeholders at the end of this treacherous journey. The users of your new application expect all their business-critical data to be there at the end of the road. All the bells and whistles in your new application will count for naught if the data falls apart. Imagine, if you will, students’ transcripts gone missing, or your frequent-flyer balance 100,000 miles short! Need I say more? You may already be guessing where I am going with this. That’s right: the myths and realities related to your data. Let’s explore a few of these.

Myth #1: All my data is there.

Reality #1: It may be there… but can you get to it? To find, access and move all the data out of your legacy systems, you need a good set of connectivity tools that can automatically locate and extract the data from each source system. You don’t want to hand-code this for every source. Ouch!

Myth #2: I can just move my data from point A to point B.

Reality #2: You can try that approach if you want, but you might not be happy with the results. The reality is that there can be significant gaps and format mismatches between the data in your legacy system and the data required by your new application. Additionally, you will likely need to assemble data from disparate systems. You need sophisticated tools to profile, assemble and transform your legacy data so that it is purpose-fit for your new application.

Myth #3: All my data is clean.

Reality #3: It’s not. Here is a tip: profile, scrub and cleanse your data before you migrate it. You don’t want to put a shiny new application on top of questionable data. In other words, give the data in your new application a fresh start!

Myth #4: All my data will move over as expected.

Reality #4: It will not.  Any time you move and transform large sets of data, there is room for logical or operational errors and surprises.  The best way to avoid this is to automatically validate that your data has moved over as intended.
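
For those who want to see what “automatically validate” can look like in practice, here is a minimal sketch of a post-migration reconciliation check using pandas. It is illustrative only; the table, column and key names are hypothetical, not a prescribed method.

```python
# Minimal post-migration reconciliation sketch (illustrative only).
# Assumes source and target extracts are already loaded as pandas DataFrames;
# the key and column names are hypothetical.
import pandas as pd

def reconcile(source: pd.DataFrame, target: pd.DataFrame, key: str) -> dict:
    """Compare row counts, key coverage and per-column null rates."""
    report = {
        "source_rows": len(source),
        "target_rows": len(target),
        "missing_keys": sorted(set(source[key]) - set(target[key])),
    }
    for col in source.columns.intersection(target.columns):
        # A positive delta means the target has more missing values than the source.
        report[f"null_delta_{col}"] = (
            target[col].isna().mean() - source[col].isna().mean()
        )
    return report

# Example: flag frequent-flyer balances that failed to arrive intact.
# issues = reconcile(legacy_members, new_members, key="member_id")
```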

Myth #5: It’s a one-time effort.

Reality #5: ‘Load and explode’ is a formula for disaster. Our proven methodology recommends that you first prototype your migration path and identify a small subset of the data to move over. Then test it, tweak your model, try it again and gradually expand. More importantly, your application architecture should not be a one-time effort. It is a work in progress and really an ongoing journey. Regardless of where you are on this journey, we recommend paying close attention to managing your application’s data foundation.

As you can see, a multitude of data issues can plague an application consolidation or migration project and lead to its doom. These potential challenges are not always recognized and understood early on, and this perception gap is a root cause of project failure. That is why we are excited to host Philip Russom of TDWI in our upcoming webinar to discuss data management best practices and methodologies for application consolidation and migration. If you are undertaking any IT modernization or rationalization project, such as consolidating applications or migrating legacy applications to the cloud or to an on-premises application such as SAP, this webinar is a must-see.

So what’s your reality going to be like?  Will your project run like a dream or will it escalate into a scary nightmare? Here’s hoping for the former.  And also hoping you can join us for this upcoming webinar to learn more:

Webinar with TDWI:
Successful Application Consolidation & Migration: Data Management Best Practices.

Date: Tuesday March 10, 10 am PT / 1 pm ET

Don’t miss out, Register Today!

1) Gartner report titled “Best Practices Mitigate Data Migration Risks and Challenges” published on December 9, 2014

2) Harvard Business Review: ‘Why your IT project may be riskier than you think’.


Healthcare’s Love Hate Relationship with Data


Healthcare and data have the makings of an epic love affair, but like most relationships, it’s not all roses. Data is playing a powerful role in finding a cure for cancer, informing cost reduction, targeting preventative treatments and engaging healthcare consumers in their own care. The downside? Data is needy. It requires investments in connectedness, cleanliness and safety to maximize its potential.

  • Data is ubiquitous…connect it.

4,400 times the amount of information held at the Library of Congress: that’s how much data Kaiser Permanente alone has generated from its electronic medical record. Kaiser successfully makes every piece of information about each patient available to clinicians, including patient health history, diagnoses by other providers, lab results and prescriptions. As a result, Kaiser has seen marked improvements in outcomes: a 26% reduction in office visits per member and a 57% reduction in medication errors.

Ongoing value, however, requires continuous investment in data. Investments in data integration and data quality ensure that information from the EMR is integrated with other sources (think claims, social, billing, supply chain) so that clinicians and decision makers have access in the format they need. Without this, self-service intelligence can be inhibited by duplicate data, poor quality data or application silos.

  • Data is popular…ensure it is clean.

Healthcare leaders can finally rely on electronic data to make strategic decisions. Here is a CHRISTUS Health anecdote you might relate to: in a weekly meeting, each executive reviews a strategic dashboard; these dashboards drive strategic decision making about CPOE (computerized physician order entry) adoption, emergency room wait times and price per procedure. Powered by enterprise information management, these dashboards paint a reliable and consistent view across the system’s 60 hospitals. Before the implementation of an enterprise data platform, each executive relied on his or her own set of data.

In the pre-data investment era, seemingly common data elements from different sources did not mean the same thing. For example, “Admit Date” in one report reflected the emergency department admission date whereas “Admit Date” in another report referred to the inpatient admission date.
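
To make that concrete, here is a tiny, purely illustrative sketch of how the two meanings of “Admit Date” could be kept distinct when report extracts are merged; the field and source names are invented for the example.

```python
# Illustrative only: rename semantically different "admit_date" fields before merging
# so the two meanings stay explicit. Field and source names are hypothetical.
import pandas as pd

ed_report = pd.DataFrame({"patient_id": [101], "admit_date": ["2015-01-02"]})
inpatient_report = pd.DataFrame({"patient_id": [101], "admit_date": ["2015-01-04"]})

merged = ed_report.rename(columns={"admit_date": "ed_admit_date"}).merge(
    inpatient_report.rename(columns={"admit_date": "inpatient_admit_date"}),
    on="patient_id",
    how="outer",
)
print(merged)  # one row per patient, with both admit dates clearly labeled
```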

  •  Sharing data is necessary…make it safe.

To cure cancer, reduce costs and engage patients, care providers need access to data, and not just the data they generate; it has to be shared to coordinate care through transitions and across settings such as home care, long-term care and behavioral health. Fortunately, consumers and clinicians agree on this: PwC reports that 56% of consumers and 30% of physicians are comfortable with data sharing for care coordination. Further progress is demonstrated by healthcare organizations willingly adopting cloud-based applications; as of 2013, 40% of healthcare organizations were already storing protected health information (PHI) in the cloud.

Increased data access carries risk, however, leaving health data exposed. The threat of breach or hacking is multiplied by the presence (in many cases necessary) of PHI on employee laptops and by the broader access providers now have to PHI. The Ponemon Institute, a security research firm, estimates that data breaches cost the industry $5.6 billion each year. Investments in data-centric security are necessary to assuage fear, protect personal health data and make secure data sharing a reality.

Early improvements in patient outcomes indicate that the relationship between data and healthcare is a valuable investment. The International Institute of Analytics supports this, reporting that although analytics and data maturity across healthcare lags other industries, the opportunity to positively impact clinical and operational outcomes is significant.


The Super Bowl of Data-Driven Marketing Potential

I absolutely love football, so when the Super Bowl came to our hometown Phoenix, it was my paradise!  Football on every.single.channel.  Current and former NFL players were everywhere – I ate breakfast next to Howie Long and pumped gas next to Tony Romo.  ESPN & NFL Network analysts were commentating from blocks away.  Even our downtown was transformed into a giant celebration of football.

People often talk about the “Super Bowl of Marketing”, referring to the advertising extravaganza and the millions of dollars spent on hilarious (and sometimes not) commercials.  But spending so much time immersed in the Super Bowl festivities got me thinking about one of my other fascinations… data!  It was the Super Bowl of data too!

On Sunday morning, before the big game (or the Superb Owl, as Stephen Colbert would say), I got to witness first-hand the data-driven marketing potential at the NFL Experience in Downtown Phoenix. The NFL did an amazing job putting on this event; it was truly exceptional, with something for everyone.

Once we purchased our tickets, we decided to take the kids to do some Play 60 activities. Before they could participate, we were shuttled to a bank of computers to “get a wristband” and sign a waiver. I’m sure the lawyers made certain that everyone participating in anything physical couldn’t sue the NFL or the sponsors over a hangnail or a twisted ankle. But the data-ready marketer in me realized that these wristbands were much more than a liability waiver. They were also a data treasure map!

To get the wristband, you had to provide the NFL (and their sponsors) with your demographic & contact information, your favorite teams, your children’s names and ages, and give them permission to contact you.  You also received an emailed QR code that you could use to unlock certain activities throughout the Experience.

As we moved around the Experience, they scanned our wristband or QR code at each activity. So now the NFL knows that we have three children, along with their names and ages. They know our two youngest love to play football (because they participated in a flag football Play 60 clinic). They know that we are huge Denver Broncos fans and purchased a few new jerseys of our favorite players at their shop (where they again scanned our QR code for a small discount). They know we use AT&T wireless and have our phone numbers. They know that our boys really want to improve the speed of their throws, because they went through the Peyton Manning Nationwide arm speed and throw accuracy activity five different times… and that nobody ever got over 35 MPH. And they also know that none of us will ever become great kickers, because we all seriously shanked our field goal tries! We happily gave them all this data because they provided us with a meaningful service: a really fun family experience.

Even better for them, for the first time I actually logged into the NFL Mobile app and turned on location permissions so that I could get real-time alerts about what was going on in the area. Since I use the app all the time, that’s a lot of future data I’ve now given them.

GMC sponsored the Experience and had a huge space in the main area to show off their new car lineup, and they definitely took full advantage of the data provided. They held a car giveaway that required you to scan your NFL QR code to start the process and then answer several questions about your vehicle likes and future purchase plans. You then had to go around to your three favorite vehicles and answer questions about their amazing features (“D: all of the above” was the answer, of course!). After you visited your favorite vehicles, you took your QR code back to see if you won. My 13-year-old was hopeful that we were going to win him a new Denali, but sadly, we did not! And sadly for him, had we been fortunate enough to win, he wouldn’t be driving it anyway!

I waited a few days to write this blog because I was hopeful that I would receive some sort of personalized experience from the NFL that would blow my socks off. I’m not sure what technology the NFL and GMC marketing teams use, or whether they are data ready. If they are, though, I would have hoped they would already have engaged me with a personalized experience based on the data I have given them.

GMC has sent me a few emails, one with a green-screen-style photo of my kids. And yes, I’ve downloaded it and have a photo of them with the GMC logo loud and proud on my desktop.

Other than that, nothing very exciting as of yet, and definitely nothing innovative or engaging. But I truly hope that the NFL and GMC use this data to provide me with a better, personalized experience. Isn’t that why our consumers freely offer their information? To receive something of value back.

Here are a few ideas for you NFL:

  • Special discounts on Denver Broncos apparel
  • Alert from the NFL ticket exchange the next time the Broncos play the Cardinals in Arizona, and 5 tickets become available
  • Information about how to sign up for NFL kids clinics
  • Sorry GMC, I’m not quite sure what to suggest because we just bought a new Toyota a few months ago  (but you know that I’m not in the market for a new car right now because I gave you that information too).

Thank you for a really wonderful experience, NFL and GMC! In this age of data-driven personalization, I am anxiously awaiting your next move! Now, are you ready for some football (sorry, couldn’t resist)? But in all seriousness, are you ready to reach your data-driven marketing potential?

Will this be the beginning of the Super Bowl of data-ready marketing? As an NFL fan and consumer, I know I’m ready!

Are you ready? Please tell us in our survey about data-ready marketing. The results are coming out soon, so don’t miss your chance to be a part of them. You can find the link here.

Also, follow me on Twitter, The Data Ready Marketer (@StephanieABest), for some of the latest and greatest news and insights on the world of data-ready marketing.

And stay tuned because we have several new Data Ready Marketing pieces coming out soon – InfoGraphics, eBooks, SlideShares, and more!


Is it the CDO or CAO or Someone Else?

A month ago, I shared that Frank Friedman believes CFOs are “the logical choice to own analytics and put them to work to serve the organization’s needs.” Even though many CFOs are increasingly taking on what could be considered an internal CEO or COO role, many readers protested my post, which focused on reviewing Frank Friedman’s argument. At the same time, CIOs have been very clear with me that they do not want to personally become their company’s data steward. So the question becomes: should companies be creating a CDO or CAO role to lead this important function? And if so, how common are these two roles anyway?

Regardless of eventual ownership, extracting value out of data is becoming a critical business capability. It is clear that data scientists should not be shoehorned into the traditional business analyst role. Data scientists have the unique ability to derive mathematical models “for the extraction of knowledge from data” (Data Science for Business, Foster Provost, 2013, pg. 2). For this reason, Thomas Davenport claims that data scientists need to be able to network across an entire business and work at the intersection of business goals, constraints, processes, available data and analytical possibilities. Given this, many organizations today are starting to experiment with the notion of having either a chief data officer (CDO) or a chief analytics officer (CAO). The open question is: should an enterprise have a CDO, a CAO, or both? And just as important, where should each of these roles report in the organization?

Data policy versus business questions

In my opinion, it is critical to first look into the substance of each role before making a decision on the above question. The CDO should be about ensuring that information is properly secured, stored, transmitted or destroyed. This includes, according to COBIT 5, ensuring that there are effective security measures and controls over information systems. To do this, procedures need to be defined and implemented to ensure the integrity and consistency of information stored in databases, data warehouses and data archives. According to COBIT 5, data governance requires the following four elements:

  • Clear information ownership
  • Timely, correct information
  • Clear enterprise architecture and efficiency
  • Compliance and security

To me, these four elements should be the essence of the CDO role. Having said this, the CAO role is related but very different in terms of its nature and the business skills required. The CRISP model points out just how different the two roles are. According to CRISP, the CAO role should be focused upon business understanding, data understanding, data preparation, data modeling and data evaluation. As such, the CAO is focused upon using data to solve business problems, while the CDO is about protecting data as a business-critical asset. I was living in Silicon Valley during the “Internet Bust.” I remember seeing very few job postings, and the few that existed asked for a developer who could also act as a product manager and do some marketing as a part-time activity. This of course made no sense. I feel the same way about the idea of combining the CDO and CAO. One is about compliance and protecting data; the other is about solving business problems with data. Peanut butter and chocolate may work in a Reese’s cup, but it will not work here; the orientations are too different.

So which business leader should own the CDO and CAO?

Clearly, having two more C’s in the C-suite creates a more crowded list of corporate officers. Some have even said that this will extend what is called senior executive bloat. And, of course, how do these new roles work with and impact the CIO? The answer depends on the organization’s culture. However, where there isn’t an executive staff office, I suggest that these roles go to different places. Many companies already have their CIO function reporting to finance. Where this is the case, it is important to determine whether a COO function is in place. The COO could clearly own the CDO and CAO functions, because the COO has a significant role in improving business processes and capabilities. Where there isn’t a COO function and the CIO reports to the CEO, I think you could have the CDO report to the CIO, even though CIOs say they do not want to be data stewards. This could be a third function in parallel with the VP of Ops and VP of Apps. In this case, I would have the CAO report to one of the following: the CFO, Strategy, or IT. Again, this all depends on the current organizational structure and corporate culture. Regardless of where it reports, the important thing is to focus the CAO on an enterprise analytics capability.

Related Blogs

Should we still be calling it Big Data?

Is Big Data Destined To Become Small And Vertical?

Big Data Why?

What is big data and why should your business care?

Author Twitter: @MylesSuer


Garbage In, Garbage Out? Don’t Take Data for Granted in Analytics Initiatives!

The verdict is in. Data is now broadly perceived as a source of competitive advantage. We all feel the heat to deliver good data. It is no wonder organizations view analytics initiatives as highly strategic. But the big question is: can you really trust your data? Or are you just creating pretty visualizations on top of bad data?

We also know there is a shift toward self-service analytics. But did you know that, according to Gartner, “through 2016, less than 10% of self-service BI initiatives will be governed sufficiently to prevent inconsistencies that adversely affect the business”?1 This means that you may actually show up at your next big meeting with data that contradicts your colleague’s data. Perhaps you are not working off the same version of the truth. Maybe you have siloed data on different systems that are not working in concert. Or is your definition of ‘revenue’ or ‘leads’ different from your colleague’s?

So are we taking our data for granted? Are we just assuming that it’s all available, clean, complete, integrated and consistent?  As we work with organizations to support their Analytics journey, we often find that the harsh realities of data are quite different from perceptions. Let’s further investigate this perception gap.

For one, people may assume they can easily access all data. In reality, if data connectivity is not managed effectively, we often need to beg, borrow and steal to get the right data from the right person, if we are lucky. In less fortunate scenarios, we may need to settle for partial data or a cheap substitute for the data we really wanted. And you know what they say: the only thing worse than no data is bad data. Right?

Another common misperception is: “Our data is clean. We have no data quality issues.” Wrong again. When we work with organizations to profile their data, they are often quite surprised to learn that their data is full of errors and gaps. One company recently discovered, within one minute of starting its data profiling exercise, that millions of its customer records contained the company’s own address instead of the customers’ addresses… Oops.
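
As a rough illustration, even a few lines of profiling can surface this kind of problem; the file and column names below are made up for the example.

```python
# Quick value-frequency and completeness profile (illustrative; names are hypothetical).
import pandas as pd

customers = pd.read_csv("customer_extract.csv")

# The most frequent values in an address column should be widely distributed;
# a single address covering millions of rows is an immediate red flag.
print(customers["mailing_address"].value_counts().head(5))

# Completeness profile: share of missing values per column.
print(customers.isna().mean().sort_values(ascending=False))
```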

Another myth is that all data is integrated. In reality, your data may reside in multiple locations: in the cloud, on premises, in Hadoop, on the mainframe and anywhere in between. Integrating data from all these disparate and heterogeneous data sources is not a trivial task unless you have the right tools.

And here is one more consideration to mull over. Do you find yourself manually hunting down and combining data to reproduce the same ad hoc report over and over again? Perhaps you often find yourself doing this in the wee hours of the night? Why reinvent the wheel? It would be more productive to automate the process of data ingestion and integration for reusable and shareable reports and Analytics.

Simply put, you need great data for great Analytics. We are excited to host Philip Russom of TDWI in a webinar to discuss how data management best practices can enable successful Analytics initiatives. 

And how about you?  Can you trust your data?  Please join us for this webinar to learn more about building a trust-relationship with your data!

  1. Gartner Report, ‘Predicts 2015: Power Shift in Business Intelligence and Analytics Will Fuel Disruption’; Authors: Josh Parenteau, Neil Chandler, Rita L. Sallam, Douglas Laney, Alan D. Duncan; Nov 21 2014

Marketers, Are You Ready? The Impending Data Explosion from the New Gizmos and Gadgets Unveiled at CES

What Marketers Can Learn from CES

This is the first year in a very long time that I wasn’t in Las Vegas during CES. Although it’s not quite as exciting as actually being there, I love that the Twitter-verse and industry news sites kept us all up to date on the latest and greatest announcements. Now that CES 2015 is all wrapped up, I find myself thinking about the potential of some very interesting announcements, from the wild to the wonderful to the leave-you-wondering! What strikes me isn’t how useful these new gizmos and gadgets will likely be to me and my consumer counterparts, but instead what incredible new data sources they will offer my fellow marketers.

One thing is for sure… the connected “Internet of Things” is indeed here. It’s no longer just a vision. Sure, we’re just seeing the early stages, but it’s becoming more and more mainstream by the day. And as marketers, we have so much opportunity ahead of us!

I ran across an interesting video interview on Adweek.com with Jack Smith of GroupM, filmed on the CES show floor. Jack says that “data from sensors will have a bigger impact, longer term, than the Internet itself.” That is a lofty statement, and I’m not sure I’d go quite that far yet, but I absolutely agree with his premise: this new world of connectivity is already shifting marketing, and it will almost certainly radically change the way we market in the near future.

Riding the Data Explosion (Literally)

The Connected Cycle is one of the announcements I find intriguing as a marketer. In short, it’s a bike pedal equipped with GPS and GPRS sensors that “monitor your movements and act as a basic fitness tracker.” It’s being positioned as a way to track stolen bicycles, a massive problem in Europe particularly, with the side benefit of being a powerful fitness tracker. It may not be as sexy as some other announcements, but I think there is buried treasure in devices like these.

Imagine how powerful that data would be to a sporting goods retailer. What if the rider of that bicycle had opted into a program that allowed the retailer to track their activity in exchange for highly targeted offers?

Let’s say that the rider is nearing one of your stores and it’s a colder-than-usual day. Perhaps you could push an offer for some neoprene booties to their smartphone. Or let’s say that, based on their activity patterns, the rider appears to be stepping up their activity and riding more frequently, suggesting they may be ready for a race you are sponsoring in the area in a few months. Perhaps you could push them an inspirational message noting how well they’re progressing and asking whether they’ve thought about signing up for the big race, with a special incentive of course.

The segmentation possibilities are endless, and the analytics that could be done on the data leave the data-driven marketer salivating!

Home Automation Meets Business Automation

There were numerous announcements about the connected “house of the future,” and it’s clear that we are just at the beginning of the home automation wave. Several of the big dogs like Samsung, Google and Apple are building or buying automation hub platforms, so it’s going to be easier and easier to connect appliances and other home devices to one another, and also to mobile technology and wearables. For marketers, there is incredible potential to really tap into this. Imagine the possibility of interconnecting your customers’ home automation systems with your own marketing automation systems. Marketers will soon literally be able to serve up offers based upon things that are occurring in the home in real time.

Oh no, your teenage son finished off all but the last drop of milk (and put the almost-empty jug back in the fridge without a second thought)! Not to worry: you’ve linked your refrigerator’s sensor data with your favorite grocery store. An alert is sent asking if you want more milk, and, oh by the way, your shopping patterns indicate you may be running out of your son’s favorite cereal too, so it offers you a special discount if you add a box to your order. Oh yeah, of course, he was complaining about being out just yesterday! And voilà, a gallon of milk and some Cinnamon Toast Crunch magically arrive at your door by the end of the day. Heck, it will probably arrive within an hour via drone if Amazon has anything to say about it! No manual business processes whatsoever. It’s your appliance’s sensors talking to your customer data warehouse, which is talking to your marketing automation system, which is talking to a mobile app, which is talking to an ordering system, which is talking to a payment system, which is talking to a logistics/delivery system. That is, of course, if your internal processes are ready!

Some of the More Weird and Wacky, But There May Just Be Something…

Panasonic’s Smart Mirror allows you to analyze your skin and visualize yourself with different makeup or even a different haircut. Cosmetics and hair care companies should be all over this. Imagine the possibility of visualizing yourself looking absolutely stunning, if only virtually, with perfect makeup and hair. Who wouldn’t want to rush right out and capture the look for real? What if a storefront could virtually put the passer-by in its products and, once the customer is inside the store, point them to the products that were featured? Take it a step further and send them a special offer the next week to come back and buy the hat that goes perfectly with the rest of the outfit. It all sounds a little bit “Minority Report-esque,” but it’s closer to becoming true every day. The power of the interconnected world is endless for the marketer.

And then there’s Belty… it has definitely garnered a lot of news (and snarky comments too!). Belty is a smart belt that slims or expands based upon your waist size at that very moment, whether you’re sitting, standing, or have just had a too-large meal. I don’t see Belty taking off, but you never know! If it does, however, can’t you just see Belty sending a message to your Weight Watchers app about needing to get back on your diet? Or better yet, pointing you to the Half Yearly Sale at Nordstrom because you’re getting too skinny for your pants?

The “Internet of Things” is Becoming Reality… Is Your Marketing Team Ready?

The Internet of Things is already changing the way consumers live, and it’s beginning to change the way marketers market. It is critical that marketers think about how they can leverage these new devices and the data they provide. Connecting the dots between devices can become a marketer’s best friend (if they’re ready) or worst enemy (if they’re not).

Are you ready?  Ask yourself these 6 questions:

  1. Are your existing business applications connected to one another?   Do your marketing systems “talk” to your finance systems and your sales systems and your customer support systems?
  2. Do you have first-class data quality and validation technology and practices in place?  Real-time, automated processes will only amplify data quality problems.
  3. Can you connect easily to any new data source as it becomes available, no matter where it lives and no matter what format it is in?  The only constant in this new world is the speed of change, so if you’re not building processes and leveraging technologies that can keep up, you’re already missing the boat!
  4. Are you building real-time capabilities into your processes and technologies?  Your systems are going to have to handle real-time sensor data and make real-time decisions based on the data it provides.
  5. Are your marketing analytics capabilities leading the pack or just getting out of the gate?  Are they harnessing all of the rich data available within your organization today?  Are you ready to analyze all of the new data sources to determine trends and segment for maximum effect?
  6. Are you talking to your counterparts in IT, logistics, finance, etc. about the business processes and technologies you will need to harness the data that the interconnected world of today, and of the near future, will generate?  If not, don’t wait!  Begin that conversation ASAP!

Informatica is ready to help you embark on this new and exciting data journey.  For some additional perspectives from Informatica on the technologies announced at CES2015, I encourage you to read some of my colleagues’ recent blog posts:

Jumping on the Internet of Things (IoT) Band Wagon?

CES, Digital Strategy and Architecture: Are You Ready?


Analytics Stories: A Case Study from Quintiles

As I have shared in other posts in this series, businesses are using analytics to improve their internal and external-facing business processes and to strengthen their “right to win” in the markets in which they operate. For pharmaceutical businesses, strengthening the right to win begins and ends with the drug product development lifecycle. I remember, for example, talking several years ago to the CFO of a major pharmaceutical company and having him tell me that his most important financial metrics had to do with reducing the time to market for a new drug and maximizing the period of patent protection. Clearly, the faster a pharmaceutical company gets a product to market, the faster it can begin earning a return on its investment.

Fragmented data challenged analytical efforts

At Quintiles, what the business needed was a system with the ability to optimize the design, execution, quality and management of clinical trials. Management’s goal was to dramatically shorten the time to complete each trial, including quickly identifying when a trial should be terminated. At the same time, management wanted to continuously comply with regulatory scrutiny from the Food and Drug Administration and use it to proactively monitor and manage notable trial events.

The problem was that Quintiles’ data was fragmented across multiple systems, and this delayed the ability to make business decisions. Like many organizations, Quintiles had data located in multiple incompatible legacy systems. This meant extensive manual data manipulation before data could become useful. As well, incompatible legacy systems impeded data integration and normalization, and prohibited a holistic view across all sources. Making matters worse, management felt that it lacked the ability to take corrective actions in a timely manner.

Infosario launched to manage Quintiles analytical challenges

To address these challenges, Quintiles leadership launched the Infosario Clinical Data Management Platform to power its pharmaceutical product development process. Infosario breaks down the silos of information that have limited the combining of massive quantities of scientific and operational data collected during clinical development with tens of millions of real-world patient records and population data. This step empowered researchers and drug developers to unlock a holistic view of data, improving decision-making and ultimately increasing the probability of success at every step in a product’s lifecycle. Quintiles Chief Information Officer Richard Thomas says, “The drug development process is predicated upon the availability of high quality data with which to collaborate and make informed decisions during the evolution of a product or treatment.”

What Quintiles has succeeded in doing with Infosario is integrating the data and processes associated with a drug’s lifecycle. This includes creating a data engine to collect, clean and prepare data for analysis. The data is then combined with clinical research data and information from other sources to provide a set of predictive analytics. This, of course, is aimed at impacting business outcomes.

The Infosario solution consists of several core elements

At its core, Infosario provides the data integration and data quality capabilities for extracting and organizing clinical and operational data. The approach combines and harmonizes data from multiple heterogeneous sources into what is called the Infosario Data Factory repository. The end goal is to accelerate reporting. Infosario leverages data federation/virtualization technologies to acquire information from disparate sources in a timely manner without affecting the underlying foundational enterprise data warehouse. As well, it implements rule-based, real-time intelligent monitoring and alerting to enable the business to tweak and enhance business processes as needed. A “monitoring and alerting layer” sits on top of the data, with the facility to rapidly provide intelligent alerts to appropriate stakeholders regarding trial-related issues and milestone events. Here are some more specifics on the components of the Infosario solution:

• Data Mastering provides the capability to link multi-domains of data. This enables enterprise information assets to be actively managed, with an integrated view of the hierarchies and relationships.

• Data Management provides the high performance, scalable data integration needed to support enterprise data warehouses and critical operational data stores.

• Data Services provides the ability to combine data from multiple heterogeneous data sources into a single virtualized view. This allows Infosario to utilize data services to accelerate delivery of needed information.

• Complex Event Processing manages the critical task of monitoring enterprise data quality events and delivering alerts to key stakeholders to take necessary action.
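
To give a feel for what a monitoring and alerting layer does conceptually, here is a toy, rule-based sketch. It is not the Infosario implementation; the rule names and record fields are invented for illustration.

```python
# Toy rule-based data quality monitoring sketch (illustrative only, not Infosario).
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    violated: Callable[[dict], bool]  # returns True when the record breaks the rule

RULES = [
    Rule("missing_trial_site", lambda r: not r.get("site_id")),
    Rule("enrollment_over_cap",
         lambda r: r.get("enrolled", 0) > r.get("enrollment_cap", float("inf"))),
]

def evaluate(record: dict) -> list[str]:
    """Return the names of all violated rules, for routing alerts to stakeholders."""
    return [rule.name for rule in RULES if rule.violated(record)]

# evaluate({"site_id": "", "enrolled": 120, "enrollment_cap": 100})
# -> ["missing_trial_site", "enrollment_over_cap"]
```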

Parting Thoughts

According to Richard Thomas, “the drug development process rests on high quality data being used to make informed decisions during the evolution of a product or treatment. Quintiles’ Infosario clinical data management platform provides researchers and drug developers with the knowledge needed to improve decision-making and ultimately increase the probability of success at every step in a product’s lifecycle.” The platform enables enhanced data accuracy, timeliness and completeness. On the business side, it has enabled Quintiles to establish industry-leading information and insight. This, in turn, has enabled faster, more informed decisions and the ability to take action based on insights. Importantly, this has led to a faster time to market and a lengthening of the period of patent protection.

Related links

Related Blogs

Analytics Stories: A Banking Case Study
Analytics Stories: A Financial Services Case Study
Analytics Stories: A Healthcare Case Study
Who Owns Enterprise Analytics and Data?
Competing on Analytics: A Follow Up to Thomas H. Davenport’s Post in HBR
Thomas Davenport Book “Competing On Analytics”

Solution Brief: The Intelligent Data Platform

Author Twitter: @MylesSuer


Business Asking IT to Install MDM: Will IDMP make it happen?

Will IDMP Increase MDM Adoption?

MDM has for years been a technology struggling for acceptance.  Not for any technical reason, or in fact any sound business reason.  Quite simply, in many cases the business people cannot attribute value delivery directly to MDM, so MDM projects can be rated as ‘low priority’.  Although the tide is changing, many business people still need help in drawing a direct correlation between Master Data Management, as a concept and a tool, and measurable business value.  In my experience, having business people actively ask for an MDM project is a rare occurrence.  This should change as the value of MDM becomes clearer; it is certainly gaining acceptance in principle that MDM will deliver value.  Perhaps this change is not too far off: the introduction of the Identification of Medicinal Products (IDMP) regulation in Europe may be a turning point.

At the DIA conference in Berlin this month, Frits Stulp of Mesa Arch Consulting suggested that IDMP could get the business asking for MDM.  After looking at the requirements for IDMP compliance for approximately a year, his conclusion from a business point of view is that MDM has a key role to play in IDMP compliance.  A recent press release by Andrew Marr, an IDMP and XEVMPD expert and  specialist consultant, also shows support for MDM being ‘an advantageous thing to do’  for IDMP compliance.  A previous blog outlined my thoughts on why MDM can turn regulatory compliance into an opportunity, instead of a cost.  It seems that others are now seeing this opportunity too.

So why will IDMP enable the business (primarily regulatory affairs) to come to the conclusion that they need MDM?  At its heart, IDMP is a pharmacovigilance initiative whose goal is to uniquely identify all medicines globally and provide rapid access to the details of each medicine’s attributes.  If implemented in its ideal state, IDMP will deliver a single, accurate and trusted version of a medicinal product which can be used for multiple analytical and procedural purposes.  This is exactly what MDM is designed to do.

Here is a summary of the key reasons why an MDM-based approach to IDMP is such a good fit.

1.  IDMP is a data consolidation effort; MDM enables data discovery and consolidation

  • IDMP will probably need to populate between 150 and 300 attributes per medicine
  • These attributes will be held in 10 to 13 systems, per product.
  • MDM (especially with close coupling to Data Integration) can easily discover and collect this data.

2.  IDMP requires cross-referencing; MDM has cross-referencing and cleansing as key process steps.          

  • Consolidating data from multiple systems normally means dealing with multiple identifiers per product.
  • Different entities must be linked to each other to build relationships within the IDMP model.
  • MDM allows for complex models catering for multiple identifiers and relationships between entities.

3.  IDMP submissions must ensure the correct value of an attribute is submitted; MDM has strong capabilities to resolve different attribute values.

  • Many attributes will exist in more than one of the 10 to 13 source systems
  • Without strong data governance, these values can be (and probably will be) different.
  • MDM can set rules for determining the ‘golden source’ for each attribute, and then track the history of the values used for submission (see the sketch after this list).

4.  IDMP is a translation effort; MDM is designed to translate

  • Submission will need to be within a defined vocabulary or set of reference data
  • Different regulators may opt for different vocabularies, in addition to the internal set of reference data.
  • MDM can hold multiple values/vocabularies for entities, depending on context.

5.  IDMP is a large co-ordination effort; MDM enables governance and is generally associated with higher data consistency and quality throughout an organisation.

  • The IDMP scope is broad, so attributes required by IDMP may also be required for compliance with other regulations.
  • Accurate compliance needs tracking and distribution of attribute values.  Attribute values submitted for IDMP, other regulations, and supporting internal business should be the same.
  • Not only is MDM designed to collect and cleanse data, it is equally comfortable for data dispersion and co-ordination of values across systems.
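
To illustrate point 3 above, here is a minimal ‘golden source’ survivorship sketch. The source systems and precedence order are hypothetical and would be defined by your own data governance rules; this is not an IDMP prescription.

```python
# Minimal survivorship ("golden source") sketch for one attribute; illustrative only.
SOURCE_PRECEDENCE = ["regulatory_db", "erp", "lims"]  # most trusted first (hypothetical)

def golden_value(attribute: str, candidates: dict) -> tuple:
    """Pick an attribute value from candidate source systems by precedence.

    candidates maps source-system name -> value found in that system.
    Returns (value, winning_source) so lineage can be tracked for submission.
    """
    for source in SOURCE_PRECEDENCE:
        value = candidates.get(source)
        if value:  # skip empty or missing values
            return value, source
    raise ValueError(f"No trusted value found for attribute {attribute!r}")

# golden_value("pharmaceutical_form", {"erp": "tablet", "lims": "film-coated tablet"})
# -> ("tablet", "erp")
```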

Once business users assess the data management requirements and consider the breadth of the IDMP scope, it is no surprise that some of them could be asking for an MDM solution.  Even if they do not use the acronym ‘MDM’, they could actually be asking for MDM by capability rather than by name.

Given the good technical fit of an MDM approach to IDMP compliance, I would like to put forward three arguments as to why the approach makes sense.  There may be others, but these are the ones I feel are most compelling:

1.  Better chance to meet tight submission time

There are slightly over 18 months left before the EMA requires IDMP compliance.  Waiting for final guidance will not provide enough time to comply.  Using MDM, you have a tool to begin the most time-consuming tasks: data discovery, collection and consolidation.  Required XEVMPD data and the draft guidance can serve as a guide to where to focus your efforts.

2.  Reduce Risk of non-compliance

With European fines of up to 5% of revenue at stake, risking non-compliance could be expensive.  Not only will MDM increase your chance of compliance on July 1, 2016, it will also give you a tool to manage your data for ongoing compliance, in terms of meeting deadlines for delivering new data and data changes.

3.  Your company will have a ready source of clean, multi-purpose product data

Unlike some Regulatory Information Management tools, MDM is not a single-purpose tool.  It is specifically designed to provide consolidated, high-quality master data to multiple systems and business processes.  This data source could be used to deliver high-quality data to multiple other initiatives, in particular compliance with other regulations and projects addressing topics such as traceability, health economics and outcomes, continuous process verification, and inventory reduction.

So back to the original question: will the introduction of the IDMP regulation in Europe result in the business asking IT to implement MDM?  Perhaps it will, but not by name.  It is still possible that it won’t.  However, if you have been struggling to get buy-in for MDM within your organisation and you need to comply with IDMP, you may be able to find some more allies (potentially with an approved budget) to support you in your MDM efforts.


Swim Goggles, Great Data, and Total Customer Value

Total Customer Value on CMO.com

The other day I ran across an article on CMO.com from a few months ago entitled “Total Customer Value Trumps Simple Loyalty in Digital World”.  It’s a great article, so I encourage you to go take a look, but the basic premise is that loyalty does not necessarily equal value in today’s complicated consumer environment.

Customers can be loyal for a variety of reasons, as the author Samuel Greengard points out.  One of these may be that they are stuck with a certain product or service because they believe there is no better alternative available. I know I can relate to this after a recent series of less-than-pleasant experiences with my bank. I’d like to change banks, but frankly they’re all about the same, and it just isn’t worth the hassle.  Therefore, I’m loyal to my unnamed bank, but definitely not an advocate.

The proverbial big fish in today’s digital world, according to the author, are customers who truly identify with the brand and who will buy the company’s products eagerly, even when viable alternatives exist.  These are the customers who sing the brand’s praises to their friends and family online and in person.  These are the customers who write reviews on Amazon and give your product 5 stars.  These are the customers who will pay markedly more just because it sports your logo.  And these are the customers whose voices hold weight with their peers because they are knowledgeable and passionate about the product.  I’m sure we all have a brand or two that we’re truly passionate about.

Total Customer Value in the Pool


My 13-year-old son is a competitive swimmer and will only use Speedo goggles: ever, hands down, no matter what.  He wears Speedo t-shirts to show his support.  He talks about how great his goggles are and encourages his teammates to try on his personal pair to show them how much better they are.  He is a leader on his team, so when newbies come in and see him wearing these goggles, singing their praises, and finishing first, his advocacy holds weight.  I’m sure we have owned well over 30 pairs of Speedo goggles over the past four years at $20 a pop; add in the t-shirts and of course the swimsuits, and we probably have a historical value of over $1,000 and a potential lifetime value of tens of thousands (ridiculous, I know!).  But if you add in the influence he’s had over others, his value is tremendously more: at least 5X.

This is why data is king!

I couldn’t agree more that total customer value, or even total partner or total supplier value, is absolutely the right approach, and is a much better indicator of value.  But in this digital world of incredible data volumes and disparate data sources & systems, how can you really know what a customer’s value is?

The marketing applications you probably already use are great – there are so many great automation, web analytics, and CRM systems around.  But what fuels these applications?  Your data.

Most marketers think of data as the stuff that applications generate or consume, as if all data is pretty much the same.  In truth, data is a raw ingredient.  Data-driven marketers don’t just manage their marketing applications; they actively manage their data as a strategic asset.


How are you using data to analyze and identify your influential customers?  Can you tell that a customer bought their fourth product from your website and then promptly tweeted about the great deal they got on it?  Even more interesting, can you tell that five of their friends followed the link, one bought the same item, one looked at it but ended up buying a similar item, and one put it in their cart but didn’t buy it because it was cheaper on another website?  More importantly, how can you keep this person engaged so they continue their brand preference, so somebody else with a similar brand and product doesn’t swoop in and do it first?  And the ultimate question… how can you scale this so that you’re doing it automatically within your marketing processes, with confidence, every time?

All marketers need to understand their data: what exists in your information ecosystem, whether internal or external.  Can you even get to the systems that hold the richest data?  Do you leverage your internal customer support/call center records?  Is your billing/financial system utilized as a key location for customer data?  And the elephant in the room… can you incorporate the invaluable social media data that is ripe for marketers to leverage as an automated component of their marketing campaigns?
This is why marketers need to care about data integration

Even if you do have access to all of the rich customer data that exists within and outside of your firewalls, how can you make sense of it?  How can you pull it together to truly understand your customers: what they really buy, who they associate with, and who they influence?  If you don’t, then you’re leaving dollars, and more importantly, potential advocacy and true customer value, on the table.
This is why marketers need to care about achieving a total view of their customers and prospects… 

And none of this matters if the data you are leveraging is plainly incorrect or incomplete.  How often have you seen some analysis on an important topic, had that gut feeling that something must be wrong, and questioned the data that was used to pull the report?  The obvious data quality errors are really only the tip of the iceberg.  Most of the data quality issues that marketers face are either not glaringly obvious enough to catch and correct on the spot, or are baked into an automated process where nobody has the opportunity to catch them.  Making decisions based upon flawed data inevitably leads to poor decisions.
This is why marketers need to care about data quality.

So, as the article points out, don’t just look at loyalty; look at total customer value.  But realize that this is easier said than done without focusing on your data and ensuring you have all of the right data, in the right place, in the right format, right away.

Now…  Brand advocates, step up!  Share with us your favorite story.  What brands do you love?  Why?  What makes you so loyal?


Just In Time For the Holidays: How The FTC Defines Reasonable Security


Recently, the International Association of Privacy Professionals (IAPP, www.privacyassociation.org) published a white paper that analyzed the Federal Trade Commission’s (FTC) data security/breach enforcement actions. These enforcement actions involve organizations from the finance, retail, technology and healthcare industries within the United States.

From this analysis in “What’s Reasonable Security? A Moving Target,” IAPP extrapolated the best practices from the FTC’s enforcement actions.

While the white paper and article indicate that “reasonable security” is a moving target, they do provide recommendations that will help organizations assess and baseline their current data security efforts.  Of particular interest is the focus on data-centric security, from overall enterprise assessment to the careful control of access by employees and third parties.  Here are some of the recommendations derived from the FTC’s enforcement actions that call for data-centric security:

  • Perform assessments to identify reasonably foreseeable risks to the security, integrity, and confidentiality of personal information collected and stored on the network, online or in paper files.
  • Limited access policies curb unnecessary security risks and minimize the number and type of network access points that an information security team must monitor for potential violations.
  • Limit employee access to (and copying of) personal information, based on employee’s role.
  • Implement and monitor compliance with policies and procedures for rendering information unreadable or otherwise secure in the course of disposal. Securely disposed information must not practicably be read or reconstructed.
  • Restrict third party access to personal information based on business need, for example, by restricting access based on IP address, granting temporary access privileges, or similar procedures.

How does data-centric security help organizations achieve this inferred baseline?

  1. Data Security Intelligence (Secure@Source, coming Q2 2015) provides the ability to “…identify reasonably foreseeable risks.”
  2. Data Masking (Dynamic and Persistent Data Masking) provides the controls to limit employees’ and third parties’ access to information.
  3. Data Archiving provides the means for the secure disposal of information.

Other data-centric security controls would include encryption for data at rest and in motion, and tokenization for securing payment card data.  All of these controls help organizations secure their data, whether a threat originates internally or externally.  And based on the never-ending news of data breaches and attacks this year, it is a matter of when, not if, your organization will be significantly breached.
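
To make the masking idea concrete, here is a toy sketch of role-based masking and pseudonymization. It is not how the commercial dynamic or persistent masking products are implemented; it only illustrates the underlying concept, and the field names and role are invented.

```python
# Toy illustration of role-based masking of personal data (not a product implementation).
import hashlib

def mask_ssn(ssn: str) -> str:
    """Show only the last four digits; replace the rest."""
    return "***-**-" + ssn[-4:]

def pseudonymize(value: str, salt: str = "demo-salt") -> str:
    """Deterministic pseudonym so records can still be joined without exposing the raw value."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:12]

def render_record(record: dict, role: str) -> dict:
    """Return the record with PHI masked unless the caller has a privileged role."""
    if role == "care_provider":  # hypothetical privileged role
        return record
    return {**record,
            "ssn": mask_ssn(record["ssn"]),
            "name": pseudonymize(record["name"])}

# render_record({"name": "Jane Doe", "ssn": "123-45-6789"}, role="analyst")
# -> {"name": "<pseudonym>", "ssn": "***-**-6789"}
```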

For 2015, “reasonable security” will require ongoing analysis of sensitive data and the deployment of corresponding data-centric security controls to ensure that organizations keep pace with this “moving target.”
