Tag Archives: data

The Impact of the Industrial Internet on Data

Impact of the Industrial Internet

At the recent Bosch Connected World conference in Berlin, Stefan Bungart, Software Leader Europe at GE, presented a very interesting keynote, “How Data Eats the World”, a nod to Marc Andreessen’s statement that “software is eating the world”. One of his key points was that generating actionable insight from Big Data, securely, in real time, at every level from local to global, and at industrial scale, will be the key to survival. Companies that do not invest in data now will eventually end up like the consumer companies that missed the Internet: it will be too late.

As software and the value of data become a larger part of the business value chain, the lines between different industries blur, or as GE’s Chairman and CEO Jeff Immelt once stated: “If you went to bed last night as an industrial company, you’re going to wake up today as a software and analytics company.” This is true not only for industrial companies, but for many companies that produce “things”: cars, jet engines, boats, trains, lawn mowers, toothbrushes, nut runners, computers, network equipment, etc. GE, Bosch, Technicolor and Cisco are just a few of the industrial companies that offer an Internet of Things (IoT) platform, and by doing so they enter the domain of companies such as Amazon (AWS) and Google. As Google and Apple move into new areas such as manufacturing cars and watches and offering insurance, industry lines are becoming blurred and service becomes the key differentiator. The best service offerings will be contingent upon the best analytics, and the best analytics require a complete and reliable data platform. Only companies that can leverage data will be able to compete and thrive in the future.

The idea of this “servitization” is that instead of selling assets, companies offer services that utilize those assets. For example, Siemens offers hospitals a body-scanning service instead of selling them the MRI scanner; Philips sells lighting services to cities and large companies, not light bulbs. These business models give suppliers an incentive to minimize disruption and repairs, since downtime now costs them money. It also becomes attractive to put as much device functionality as possible in software, so that upgrades or adjustments can be made without replacing physical components. All of this is made possible by the fact that the devices are connected, generate data, and can be monitored and managed remotely. The data is used to analyze functionality, power consumption and usage, but can also be used to predict malfunctions, plan proactive maintenance, and more.

So what impact does this have on data and on IT? First of all, the volumes are immense. Whereas the total global volume of, for example, Twitter messages is around 150GB, ONE gas turbine with around 200 sensors generates close to 600GB per day! Yet according to IDC, only 3% of potentially useful data is tagged and less than 1% is currently analyzed. Secondly, the structure of the data is not always straightforward: even similar devices can produce different content (messages) because they may run different software versions. This has an impact on the backend processing and on the reliability of any analysis of the data.

The data also often needs to be put into context with other master data, such as locations or customers, for real-time decision making. This is a non-trivial task. Next, governance is an aspect that needs top-level support. Questions like “Who owns the data?”, “Who may see or use the data?” and “What data needs to be kept or archived, and for how long?” need to be answered and governed in IoT projects with the same priority as for data in more traditional applications.

To summarize, managing data and mastering data governance are becoming among the most important pillars of companies that lead the digital age. Companies that fail to do so risk becoming the next Blockbuster or Kodak: companies that didn’t adapt quickly enough. To avoid this, companies need to evaluate a data platform that can support a comprehensive data strategy encompassing scalability, quality, governance, security, ease of use and flexibility, and that enables them to choose the most appropriate data processing infrastructure, whether on premise, in the cloud, or, most likely, a hybrid combination of the two.


Data Wizard Beta: Paving the Way for Next-Generation Data Loaders

The Data Wizard Changes the Landscape of What Traditional Data Loaders Can Do

The emergence of the business cloud is making the need for data ever more prevalent. Whatever your business, if you work in sales, marketing or service, chances are your productivity depends a great deal on the ability to move data quickly in and out of Salesforce and its ecosystem of applications.

With built-in data transformation intelligence, the Data Wizard (click here to try the Beta version) changes the landscape of what traditional data loaders can do. The Data Wizard takes care of the following aspects, so that you don’t have to:

  1. Data Transformations: We built in over 300 standard data transformations so you don’t have to format the data before bringing it in (e.g., combining first and last names into full names, adding numeric columns for totals, splitting address fields into their separate components; see the sketch after this list).
  2. Built-in intelligence: We automate the mapping of data into Salesforce for a range of common use cases (e.g., automatically mapping matching fields, intelligently auto-generating date format conversions, concatenating multiple fields).
  3. App-to-app integration: We incorporated pre-built integration templates to encapsulate the logic required for integrating Salesforce with other applications (e.g., single-click update of customer addresses in a Cloud ERP application based on Account addresses in Salesforce).
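
To make the first item concrete, here is a minimal sketch of the kinds of field transformations described above. The class and method names are hypothetical illustrations, not the Data Wizard’s internals or the Informatica Cloud API.

```java
// Illustrative only: the kinds of field transformations a data loader
// applies automatically. Names here are hypothetical, not Informatica's API.
import java.util.LinkedHashMap;
import java.util.Map;

public class FieldTransformDemo {
    // Combine first and last name into a full name, as a loader might
    // do before inserting a Contact record.
    static String fullName(String first, String last) {
        return (first + " " + last).trim();
    }

    // Split a one-line, comma-separated address into its components.
    static Map<String, String> splitAddress(String address) {
        Map<String, String> parts = new LinkedHashMap<>();
        String[] tokens = address.split(",\\s*");
        parts.put("street", tokens.length > 0 ? tokens[0] : "");
        parts.put("city",   tokens.length > 1 ? tokens[1] : "");
        parts.put("state",  tokens.length > 2 ? tokens[2] : "");
        return parts;
    }

    public static void main(String[] args) {
        System.out.println(fullName("Ada", "Lovelace"));
        System.out.println(splitAddress("123 Main St, San Francisco, CA"));
    }
}
```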

Unlike the other data loading apps out there, the Data Wizard doesn’t presuppose any technical ability on the part of the user. It was purpose-built to solve the needs of every type of user, from the Salesforce administrator to the business analyst.

Despite the simplicity the Data Wizard offers, it is built on the robust Informatica Cloud integration platform, providing the same reliability and performance that is key to the success of Informatica Cloud’s enterprise customers, who integrate over 5 billion rows of data per day. We invite you to try the Data Wizard for free, and contribute to the Beta process by providing us with your feedback.


Why “Gut Instincts” Needs to be Brought Back into Data Analytics


Last fall, at a large industry conference, I had the opportunity to conduct a series of discussions with industry leaders in a portable video studio set up in the middle of the conference floor. As part of our exercise, we had a visual artist do freeform storyboarding of the discussion on large swaths of five-foot by five-foot paper, which we then reviewed at the end of the session. For example, in a discussion of cloud computing, the artist drew a rendering of clouds, raining data on a landscape below, illustrated by sketches of office buildings. At a glance, one could get a good read of where the discussion went, and the points that were being made.

Data visualization is one of those up-and-coming areas that has just begun to break out of the technology zone. There are some powerful front-end tools that help users see, at a glance, trends and outliers through graphical representations – be they scattergrams, histograms, 3D diagrams or something else eye-catching. The “infographic” that has become so popular in recent years is an amalgamation of data visualization and storytelling. The bottom line is that technology is making it possible to generate these representations almost instantly, enabling relatively quick understanding of what the data may be saying.

The power that data visualization is bringing organizations was recently explored by Benedict Carey in The New York Times, who discussed how data visualization is emerging as the natural solution to “big data overload.”

This is much more than a front-end technology fix, however. Rather, Carey cites a growing body of knowledge emphasizing the development of “perceptual learning,” in which people working with large data sets learn to “see” patterns and interesting variations in the information they are exploring. It’s almost a return of the “gut” feel for answers, but developed for the big data era.

As Carey explains it:

“Scientists working in a little-known branch of psychology called perceptual learning have shown that it is possible to fast-forward a person’s gut instincts both in physical fields, like flying an airplane, and more academic ones, like deciphering advanced chemical notation. The idea is to train specific visual skills, usually with computer-game-like modules that require split-second decisions. Over time, a person develops a ‘good eye’ for the material, and with it an ability to extract meaningful patterns instantaneously.”

Video games may be leading the way in this – Carey cites the work of Dr. Philip Kellman, who developed a video-game-like approach to training pilots to instantly “read” instrument panels as a whole, versus pondering every gauge and dial. His approach reportedly enabled pilots to absorb within one hour what normally took 1,000 hours of training. Such perceptual-learning-based training is now employed in medical schools to help prospective doctors become familiar with complicated procedures.

There are interesting applications for business, bringing together a range of talent to help decision-makers better understand the information they are looking at. In Carey’s article, an artist was brought into a medical research center to help scientists look at data in many different ways – to get out of their comfort zones. For businesses, it means getting away from staring at bars and graphs on their screens and perhaps turning data upside down or inside-out to get a different picture.


You Can’t Improve What You Don’t Measure


Register for the Webinar on 19th March, 2015

80% of companies surveyed said that they offer superior customer service, but only 8% of their customers agreed with them. (Source: Bain & Company)

With numbers like that there is plenty of room to improve.  But improve what?

Traditionally, retailers have measured themselves against year-over-year increases in like-for-like store sales, increased margins and lower operating costs. But retailing has changed: customers can interact and transact with you across multiple touch points along their path to purchase and beyond. Poor performance at any one of these interaction points could lose you a customer and damage your brand.

A better measure is to calculate the customer experience across the omni-channel landscape. This will provide better insight into how you are attracting and retaining customers, and how well you are serving them. However, many retailers lack the technology and processes to deliver on a plan to improve the omni-channel customer experience.

Once you have decided to do something, what are you going to measure? Is it time spent on the website versus sales? Speed to resolve problems in the contact center versus the number of repeat transactions per customer? The number of touch points before purchase? And what about softer measures, like how well your staff interact with customers in-store or on social channels? How many “Pins” do you have, and how do you assign value to them?

Organizations need to account for churn, attrition, loyalty and lifetime value to be able to evaluate their performance from a holistic view of their customer, not just within the confines of their own operational silo.
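
As a toy illustration of the last of those measures (a sketch of my own, not the webinar’s methodology), one common back-of-the-envelope lifetime value formula discounts each period’s margin by retention probability and a discount rate:

```java
// A minimal customer lifetime value (CLV) sketch, assuming the common
// formula CLV = margin * retention / (1 + discountRate - retention).
// The formula choice and the numbers are illustrative assumptions.
public class ClvSketch {
    static double clv(double marginPerPeriod, double retention, double discountRate) {
        return marginPerPeriod * retention / (1 + discountRate - retention);
    }

    public static void main(String[] args) {
        // e.g., $100 margin per year, 80% retention, 10% discount rate
        System.out.printf("CLV = $%.2f%n", clv(100.0, 0.80, 0.10)); // ~ $266.67
    }
}
```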

In an upcoming webinar, Arkady Kleyner of Intricity will break apart the key components of the Omni-Channel Customer Experience calculation. Additionally, Arkady will identify the upstream components that keep this measure accurate and current.

Attend this webinar to learn:

  • The foundational calculations of Omni-Channel Customer Experience
  • Common customizations to fit different scenarios
  • Upstream components to keep the calculation current and accurate

Register here to receive a calendar invitation with the webinar details, and join us for a 1-hour webinar and Q&A session. The event will occur March 19th at 2:00 PM EST.

Startup Winners of the Informatica Data Mania Connect-a-Thon

Last week was Informatica’s first ever Data Mania event, held at the Contemporary Jewish Museum in San Francisco. We had an A-list lineup of speakers from leading cloud and data companies, such as Salesforce, Amazon Web Services (AWS), Tableau, Dun & Bradstreet, Marketo, AppDynamics, Birst, Adobe, and Qlik. The event and speakers covered a range of topics all related to data, including Big Data processing in the cloud, data-driven customer success, and cloud analytics.

While these companies are giants today in the world of cloud and have created their own unique ecosystems, we also wanted to take a peek at and hear from the leaders of tomorrow. Before startups can become market leaders in their own realm, they face the challenge of ramping up a stellar roster of customers so that they can get to subsequent rounds of venture funding. But what gets in their way are the numerous data integration challenges of onboarding customer data onto their software platform. When these challenges remain unaddressed, R&D resources are spent on professional services instead of building value-differentiating IP.  Bugs also continue to mount, and technical debt increases.

Enter the Informatica Cloud Connector SDK. Built entirely in Java and able to browse through any cloud application’s API, the Cloud Connector SDK parses the metadata behind each data object and presents it in the context of what a business user should see. We had four startups build a native connector to their application in less than two weeks: BigML, Databricks, FollowAnalytics, and ThoughtSpot. Let’s take a look at each one of them.
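
Before looking at the startups, here is a rough sketch of what such a metadata-aware connector conceptually does: browse an application’s API, read each data object’s metadata, and present it in business-user terms. The interfaces below are hypothetical stand-ins for illustration, not the actual Connector SDK classes.

```java
// Conceptual sketch of a metadata-aware connector. The CloudApiClient
// interface is a hypothetical wrapper over a cloud application's REST API,
// not an Informatica Cloud Connector SDK type.
import java.util.List;
import java.util.Map;

interface CloudApiClient {
    List<String> listObjects();                        // e.g. "Account", "Opportunity"
    Map<String, String> describeFields(String object); // field name -> data type
}

public class ConnectorSketch {
    private final CloudApiClient client;

    public ConnectorSketch(CloudApiClient client) {
        this.client = client;
    }

    // Walk every object the API exposes and print its fields the way a
    // business user would expect to see them on a mapping screen.
    public void printCatalog() {
        for (String object : client.listObjects()) {
            System.out.println("Object: " + object);
            client.describeFields(object)
                  .forEach((field, type) ->
                      System.out.println("  " + field + " : " + type));
        }
    }
}
```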

BigML

With predictive analytics becoming a growing imperative, machine-learning algorithms that deliver a higher probability of correct prediction are also becoming increasingly important. BigML provides an intuitive yet powerful machine-learning platform for actionable and consumable predictive analytics. Watch their demo on how they used Informatica Cloud’s Connector SDK to help them better predict customer churn.

Can’t play the video? Click here: http://youtu.be/lop7m9IH2aw

Databricks

Databricks was founded out of the UC Berkeley AMPLab by the creators of Apache Spark. Databricks Cloud is a hosted end-to-end data platform powered by Spark. It enables organizations to unlock the value of their data, seamlessly transitioning from data ingest through exploration and production. Watch their demo, which showcases how the Informatica Cloud connector for Databricks Cloud was used to analyze lead contact rates in Salesforce, and how machine learning was performed on a dataset built using either Scala or Python.

Can’t play the video? Click here: http://youtu.be/607ugvhzVnY

FollowAnalytics

With mobile usage growing by leaps and bounds, the area of customer engagement on a mobile app has become a fertile area for marketers. Marketers are charged with acquiring new customers, increasing customer loyalty and driving new revenue streams. But without the technological infrastructure to back them up, their efforts are in vain. FollowAnalytics is a mobile analytics and marketing automation platform for the enterprise that helps companies better understand audience engagement on their mobile apps. Watch this demo where FollowAnalytics first builds a completely native connector to its mobile analytics platform using the Informatica Cloud Connector SDK and then connects it to Microsoft Dynamics CRM Online using Informatica Cloud’s prebuilt connector for it. Then, see FollowAnalytics go one step further by performing even deeper analytics on their engagement data using Informatica Cloud’s prebuilt connector for Salesforce Wave Analytics Cloud.

Can’t play the video? Click here: http://youtu.be/E568vxZ2LAg

ThoughtSpot

Analytics has taken center stage this year due to the rise in cloud applications, but most of the existing BI tools out there still stick to the old way of doing BI. ThoughtSpot brings a consumer-like simplicity to the world of BI by allowing users to search for the information they’re looking for just as if they were using a search engine like Google. Watch this demo where ThoughtSpot uses Informatica Cloud’s vast library of over 100 native connectors to move data into the ThoughtSpot appliance.

Can’t play the video? Click here: http://youtu.be/6gJD6hRD9h4


Gamers Need Great Data and Connected Platforms


Who remembers their first game of Pong? Celebrating more than 40 years of innovation, gaming is no longer limited to monochromatic screens and dedicated, proprietary platforms. The PC gaming industry is expected to exceed $35bn by 2018, and phone and handheld games are estimated to reach $34bn within 5 years, quickly closing the gap. According to EEDAR, 2014 recorded more than 141 million mobile gamers in North America alone, generating $4.6B in revenue for mobile game vendors.

This growth has spawned a long list of conferences specifically targeting gamers, game developers, the gaming industry and, more recently, gaming analytics! This past weekend in Boston, for example, was PAX East, where people of all ages and walks of life played games on consoles, PCs, handhelds, and good old-fashioned boards. With my own children in attendance, the debate of commercial games versus indie favorites, such as Minecraft, dominates the dinner table.

Online games are where people congregate, collaborate, and generate petabytes of data daily. With the added bonus of geospatial data from smartphones comes the opportunity for even more advanced analytics. Some of the basic metrics that determine whether a game is successful, according to Ninja Metrics, include the following (a small sketch of computing two of them appears after the list):

  • New Users, Daily Active Users, Retention
  • Revenue per user
  • Session length and number of sessions per user
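
As a minimal illustration of the first bullet (my own sketch, not Ninja Metrics’ implementation), daily active users and day-1 retention can be computed from a simple log of (user, day) play events:

```java
// Sketch: daily active users (DAU) and day-1 retention from play events.
// The event shape and the example data are illustrative assumptions.
import java.util.*;

public class GameMetrics {
    record PlayEvent(String userId, int day) {} // day as an epoch-day index

    static long dau(List<PlayEvent> events, int day) {
        return events.stream()
                     .filter(e -> e.day() == day)
                     .map(PlayEvent::userId)
                     .distinct()
                     .count();
    }

    // Fraction of users active on `day` who come back on `day + 1`.
    static double day1Retention(List<PlayEvent> events, int day) {
        Set<String> today = new HashSet<>(), next = new HashSet<>();
        for (PlayEvent e : events) {
            if (e.day() == day)     today.add(e.userId());
            if (e.day() == day + 1) next.add(e.userId());
        }
        if (today.isEmpty()) return 0.0;
        next.retainAll(today);
        return (double) next.size() / today.size();
    }

    public static void main(String[] args) {
        List<PlayEvent> events = List.of(
            new PlayEvent("a", 0), new PlayEvent("b", 0), new PlayEvent("a", 1));
        System.out.println("DAU day 0: " + dau(events, 0));                 // 2
        System.out.println("Day-1 retention: " + day1Retention(events, 0)); // 0.5
    }
}
```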

Additionally, they provide predictive analytics, customer lifetime value, and cohort analysis. If this is your gig, there’s a conference for that as well – the Gaming Analytics Summit!

The focus of the Game Developers Conference, recently held in San Francisco, has shifted over the years from computer games to new gaming platforms that must incorporate mobile, smartphone, and online components. Producing a successful game requires the following:

  • Needs to be able to connect to a variety of devices and platforms
  • Needs to use data to drive decisions and improve user experience
  • Needs to ensure privacy laws are adhered to.

Developers are able to quickly access online gaming data and tweak or change their sprites’ attributes dynamically to maximize player experience.

When you look at what is happening in the gaming industry, you can start to see why colleges and universities like my own alma mater, WPI, now offer a computer science degree in Interactive Media and Game Design (IMGD). The IMGD curriculum includes heavy coursework in data science, game theory, artificial intelligence and storyboarding. When I asked a WPI IMGD student what they were working on, the answer was mapping out decision trees that dictate which adversary to pop up based on the player’s history (sounds a lot like what we do in digital marketing…).
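
A toy sketch of what such a decision tree might look like; the adversary names and thresholds are invented for illustration, since the student’s actual project wasn’t shared in detail:

```java
// Toy decision tree: pick the next adversary from the player's history.
// All branch conditions and adversary names are made up for illustration.
public class AdversaryPicker {
    static String pickAdversary(int recentDeaths, boolean playerUsesMagic) {
        if (recentDeaths > 5) {
            return "goblin";                  // struggling player: easier foe
        } else if (playerUsesMagic) {
            return "spell-resistant knight";  // counter the player's style
        } else {
            return "archer";
        }
    }

    public static void main(String[] args) {
        System.out.println(pickAdversary(7, false)); // goblin
        System.out.println(pickAdversary(2, true));  // spell-resistant knight
    }
}
```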

As we start to look at the Millennial Generation entering into the workforce, maybe we should look at our own recruiting efforts and consider game designers. They are masters in analytics and creativity with an appreciation for the importance of great data. Combining the magic and the math makes a great gaming experience. Who wouldn’t want that for their customers?


Asia-Pacific Ecommerce surpassed Europe and North America

With a total B2C e-commerce turnover of $567.3bn in 2013, Asia-Pacific was the strongest e-commerce region in the world, surpassing Europe ($482.3bn) and North America ($452.4bn). Online sales in Asia-Pacific are expected to have reached $799.2 billion in 2014, according to the latest report from the Ecommerce Foundation.

Revenue: China, followed by Japan and Australia
As a matter of fact, China ($419.0 billion) was the second-largest e-commerce market in the world, behind only the US, and for 2014 it is estimated that China even surpassed the US ($537.0 billion vs. $456.0 billion). In terms of B2C e-commerce turnover, Japan ($136.7 billion) ranked second in the region, followed by Australia ($35.7 billion), South Korea ($20.2 billion) and India ($10.7 billion).

On average, Asia-Pacific e-shoppers spent $1,268 online in 2013
Ecommerce Europe’s research reveals that 235.7 million consumers in Asia-Pacific purchased goods and services online in 2013, and that on average each spent $1,268 online. This is slightly lower than the global average of $1,304. At $2,167, Australian e-shoppers were the biggest spenders online, followed by the Japanese ($1,808) and the Chinese ($1,087).

Mobile: Japan and Australia lead the pack
In frequency of mobile purchasing, Japan shows the highest adoption, followed by Australia. An interesting fact is that 50% of transactions are done at home, 20% at work and 10% on the go.

Frequency of mobile shopping in APAC

You can download the full report here. What does this mean for your business opportunity? Read more on the 2015 omnichannel trends that are driving customer experience. Happy to discuss: @benrund.


A True Love Quiz: Is Your Marketing Data Right For You?

Valentine’s Day is such a strange holiday.  It always seems to bring up more questions than answers.  And the internet always seems to have a quiz to find out the answer!  There’s the “Does he have a crush on you too – 10 simple ways to find out” quiz.  There’s the “What special gift should I get her this Valentine’s Day?” quiz.  And the ever-popular “Why am I still single on Valentine’s Day?” quiz.

Well, marketers, it’s your lucky Valentine’s Day!  We have a quiz for you too!  It’s about your relationship with data.  Where do you stand?  Are you ready to take the next step?


Question 1:  Do you connect – I mean, really connect – with your data?
□ (A) Not really.  We just can’t seem to get it together and really connect.
□ (B) Sometimes.  We connect on some levels, but there are big gaps.
□ (C) Most of the time.  We usually connect, but we miss out on some things.
□ (D) We are a perfect match!  We connect about everything, no matter where, no matter when.

Translation:  Data ready marketers have access to the best possible data, no matter what form it is in, no matter what system it is in.  They are able to make decisions based on everything the entire organization “knows” about their customer/partner/product – with a complete 360 degree view. And they are also able to connect to and integrate with data outside the bounds of their organization to achieve the sought-after 720 degree view.  They can integrate and react to social media comments, trends, and feedback – in real time – and match them with existing records whenever possible. And they can quickly and easily bring together any third party data sources they may need.


Question 2:  How good looking & clean is your data?
□ (A) Yikes, not very. But it’s what’s on the inside that counts, right?
□ (B) It’s ok.  We’ve both let ourselves go a bit.
□ (C) It’s pretty cute.  Not supermodel hot, but definitely girl or boy next door cute.
□ (D) My data is HOT!  It’s perfect in every way!

Translation: Marketers need data that is reliable and clean. According to a recent Experian study, American companies believe that 25% of their data is inaccurate, and the rest of the world isn’t much more confident: 90% of respondents said they suffer from common data errors, and 78% have problems with the quality of the data they gather from disparate channels.  Making marketing decisions based upon inaccurate data leads to poor decisions.  And what’s worse, many marketers have no idea how good or bad their data is, so they have no idea what impact it is having on their marketing programs and analysis.  The data ready marketer understands this and has a top-tier data quality solution in place to make sure their data is in the best shape possible.


Question 3:  Do you feel safe when you’re with your data?
□ (A) No, my data is pretty scary.  911 is on speed dial.
□ (B) I’m not sure actually. I think so?
□ (C) My data is mostly safe, but it’s got a little “bad boy” or “bad girl” streak.
□ (D) I protect my data, and it protects me back.  We keep each other safe and secure.

Translation: Marketers need to be able to trust the quality of their data, but they also need to trust the security of their data.  Is it protected, or is it susceptible to theft and nefarious attacks like the ones that have been all over the news lately?  Nothing keeps a CMO and their PR team up at night like worrying they are going to be the next brand on the cover of a magazine for losing millions of personal customer records. But beyond a high-profile data breach, marketers need to be concerned about data privacy.  Are you treating customer data in the way that is expected and demanded?  Are you using protected data in your marketing practices that you really shouldn’t be?  Are you marketing to people on excluded lists?


Question 4:  Is your data adventurous and well-traveled, or is it more of a “home-body”?
□ (A) My data is all over the place and it’s impossible to find.
□ (B) My data is all in one place.  I know we’re missing out on fun and exciting options, but it’s just easier this way.
□ (C) My data is in a few places and I keep fairly good tabs on it. We can find each other when we need to, but it takes some effort.
□ (D) My data is everywhere, but I have complete faith that I can get ahold of any source I might need, when and where I need it.

Translation: Marketing data is everywhere: your marketing data warehouse, your CRM system, your marketing automation system.  It’s throughout your organization in finance, customer support, and sales systems. It’s in third party systems like social media and data aggregators. That means it’s in the cloud, on premise, and everywhere in between.  Marketers need to be able to get to and integrate data no matter where it “lives”.


Question 5:  Does your data take forever to get ready when it’s time to go do something together?
□ (A) It takes forever to prepare my data for each new outing.  It’s definitely not “ready to go”.
□ (B) My data takes its time to get ready, but it’s worth the wait… usually!
□ (C) My data is fairly quick to get ready, but it does take a little time and effort.
□ (D) My data is always ready to go, whenever we need to go somewhere or do something.

Translation:  One of the reasons many marketers end up in marketing is because it is fast paced and every day is different. Nothing is the same from day-to-day, so you need to be ready to act at a moment’s notice, and change course on a dime.  Data ready marketers have a foundation of great data that they can point at any given problem, at any given time, without a lot of work to prepare it.  If it is taking you weeks or even days to pull data together to analyze something new or test out a new hunch, it’s too late – your competitors have already done it!


Question 6:  Can you believe the stories your data is telling you?
□ (A) My data is wrong a lot.  It stretches the truth, and I cannot rely on it.
□ (B) I really don’t know.  I question these stories – dare I say excuses – but haven’t been able to prove it one way or the other.
□ (C) I believe what my data says most of the time. It rarely lets me down.
□ (D) My data is very trustworthy.  I believe it implicitly because we’ve earned each other’s trust.

Translation:  If your data is dirty, inaccurate, and/or incomplete, it is essentially “lying” to you. And if you cannot get to all of the data sources you need, your data is telling you “white lies”!  All of the work you’re putting into analysis and optimization is based on questionable data, and is giving you questionable results.  Data ready marketers understand this and ensure their data is clean, safe, and connected at all times.


Question 7:  Does your data help you around the house with your daily chores?
□ (A) My data just sits around on the couch watching TV.
□ (B) When I nag my data will help out occasionally.
□ (C) My data is pretty good about helping out. It doesn’t take initiative, but it helps out whenever I ask.
□ (D) My data is amazing.  It helps out whenever it can, however it can, even without being asked.

Translation:  Your marketing data can do so much. It should enable you be “customer ready” – helping you to understand everything there is to know about your customers so you can design amazing personalized campaigns that speak directly to them.  It should enable you to be “decision ready” – powering your analytics capabilities with great data so you can make great decisions and optimize your processes.  But it should also enable you to be “showcase ready” – giving you the proof points to demonstrate marketing’s actual impact on the bottom line.


Now for the fun part… It’s time to rate your data relationship status
If you answered mostly (A):  You have a rocky relationship with your data.  You may need some data counseling!

If you answered mostly (B):  It’s time to decide if you want this data relationship to work.  There’s hope, but you’ve got some work to do.

If you answered mostly (C):  You and your data are at the beginning of a beautiful love affair.  Keep working at it because you’re getting close!

If you answered mostly (D): Congratulations, you have a strong data marriage that is based on clean, safe, and connected data.  You are making great business decisions because you are a data ready marketer!


Do You Love Your Data?
No matter what your data relationship status, we’d love to hear from you.  Please take our survey about your use of data and technology.  The results are coming out soon, so don’t miss your chance to be a part: https://www.surveymonkey.com/s/DataMktg

Also, follow me on Twitter – The Data Ready Marketer – for some of the latest & greatest news and insights on the world of data ready marketing.  And stay tuned because we have several new Data Ready Marketing pieces coming out soon – InfoGraphics, eBooks, SlideShares, and more!


How to Ace Application Migration & Consolidation (Hint: Data Management)

Myth vs. Reality: Application Migration & Consolidation (No, it’s not about dating)

Will your application consolidation or migration go live on time and on budget?  According to Gartner, “through 2019, more than 50% of data migration projects will exceed budget and/or result in some form of business disruption due to flawed execution.”1  That is a scary number by any measure. A colleague of mine put it well: “I wouldn’t get on a plane that had a 50% chance of failure.” So should you be losing sleep over your migration or consolidation project? Well, that depends.  Are you the former CIO of Levi Strauss, who, according to Harvard Business Review, was forced to resign due to a botched SAP migration project and a $192.5 million earnings write-off?2  If so, perhaps you would feel a bit apprehensive. Otherwise, I say you can be cautiously optimistic, provided you go into it with a healthy dose of reality, a good understanding of the potential pitfalls and how to address them, and an appreciation for the myths and realities of application consolidation and migration.

First off, let me get one thing off my chest.  If you don’t pay close attention to your data throughout the application consolidation or migration process, you are almost guaranteed delays and budget overruns. Data consolidation and migration is at least 30%-40% of the application go-live effort. We have learned this by helping customers deliver over 1500 projects of this type.  What’s worse, if you are not super meticulous about your data, you can be assured of encountering unhappy business stakeholders at the end of this treacherous journey. The users of your new application expect all their business-critical data to be there at the end of the road. All the bells and whistles in your new application will matter naught if the data falls apart.  Imagine, if you will, students’ transcripts gone missing, or your frequent-flyer balance 100,000 miles short!  Need I say more?  Now, you may already be guessing where I am going with this.  That’s right, we are talking about the myths and realities related to your data!  Let’s explore a few of these.

Myth #1: All my data is there.

Reality #1: It may be there… but can you get it? If you want to find, access and move all the data out of your legacy systems, you must have a good set of connectivity tools to easily and automatically find, access and extract the data from your source systems. You don’t want to hand-code this for each source.  Ouch!

Myth #2: I can just move my data from point A to point B.

Reality #2: You can try that approach if you want; however, you might not be happy with the results.  The reality is that there can be significant gaps and format mismatches between the data in your legacy system and the data required by your new application. Additionally, you will likely need to assemble data from disparate systems. You need sophisticated tools to profile, assemble and transform your legacy data so that it is purpose-fit for your new application.

Myth #3: All my data is clean.

Reality #3:  It’s not. And here is a tip: better to profile, scrub and cleanse your data before you migrate it. You don’t want to put a shiny new application on top of questionable data. In other words, let’s get a fresh start on the data in your new application!

Myth #4: All my data will move over as expected.

Reality #4: It will not.  Any time you move and transform large sets of data, there is room for logical or operational errors and surprises.  The best way to avoid this is to automatically validate that your data has moved over as intended.
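
One minimal way to automate that validation (a sketch under my own assumptions, not Informatica’s product logic) is to compare row counts and a simple aggregate checksum between the source and target databases:

```java
// A minimal migration-validation sketch (illustrative only): compare row
// counts and an aggregate checksum of a numeric column between the source
// and target tables over JDBC.
import java.math.BigDecimal;
import java.sql.*;

public class MigrationValidator {
    // True when source and target agree on row count and column checksum.
    // Table and column names are assumed to be trusted, validated input.
    static boolean validate(Connection src, Connection tgt,
                            String table, String numericColumn) throws SQLException {
        String sql = "SELECT COUNT(*), COALESCE(SUM(" + numericColumn + "), 0) FROM " + table;
        try (Statement s1 = src.createStatement();
             Statement s2 = tgt.createStatement();
             ResultSet r1 = s1.executeQuery(sql);
             ResultSet r2 = s2.executeQuery(sql)) {
            r1.next();
            r2.next();
            boolean countsMatch = r1.getLong(1) == r2.getLong(1);
            BigDecimal sum1 = r1.getBigDecimal(2), sum2 = r2.getBigDecimal(2);
            return countsMatch && sum1.compareTo(sum2) == 0;
        }
    }
}
```

In practice you would run checks like this per table and per critical column, and reconcile every mismatch before declaring the migration done.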

Myth #5: It’s a one-time effort.

Reality #5: ‘Load and explode’ is a formula for disaster.  Our proven methodology recommends you first prototype your migration path and identify a small subset of the data to move over. Then test it, tweak your model, try it again and gradually expand.  More importantly, your application architecture should not be a one-time effort.  It is a work in progress and really an ongoing journey.  Regardless of where you are on this journey, we recommend paying close attention to managing your application’s data foundation.

As you can see, there is a multitude of data issues that can plague an application consolidation or migration project and lead to its doom.  These potential challenges are not always recognized and understood early on, and this perception gap is a root cause of project failure. This is why we are excited to host Philip Russom of TDWI in our upcoming webinar to discuss data management best practices and methodologies for application consolidation and migration. If you are undertaking any IT modernization or rationalization project, such as consolidating applications or migrating legacy applications to the cloud or to an ‘on-prem’ application such as SAP, this webinar is a must-see.

So what’s your reality going to be like?  Will your project run like a dream or will it escalate into a scary nightmare? Here’s hoping for the former.  And also hoping you can join us for this upcoming webinar to learn more:

Webinar with TDWI:
Successful Application Consolidation & Migration: Data Management Best Practices.

Date: Tuesday March 10, 10 am PT / 1 pm ET

Don’t miss out, Register Today!

1) Gartner report titled “Best Practices Mitigate Data Migration Risks and Challenges” published on December 9, 2014

2) Harvard Business Review: ‘Why your IT project may be riskier than you think’.


Healthcare’s Love Hate Relationship with Data


Healthcare and data have the makings of an epic love affair, but like most relationships, it’s not all roses. Data is playing a powerful role in finding a cure for cancer, informing cost reduction, targeting preventative treatments and engaging healthcare consumers in their own care. The downside? Data is needy. It requires investments in connectedness, cleanliness and safety to maximize its potential.

  • Data is ubiquitous…connect it.

4400 times the amount of information held at the Library of Congress – that’s how much data Kaiser Permanente alone has generated from its electronic medical record (EMR). Kaiser successfully makes every piece of information about each patient available to clinicians, including patient health history, diagnoses by other providers, lab results and prescriptions. As a result, Kaiser has seen marked improvements in outcomes: a 26% reduction in office visits per member and a 57% reduction in medication errors.

Ongoing value, however, requires continuous investment in data. Investments in data integration and data quality ensure that information from the EMR is integrated with other sources (think claims, social, billing, supply chain) so that clinicians and decision makers have access in the format they need. Without this, self-service intelligence can be inhibited by duplicate data, poor quality data or application silos.

  • Data is popular…ensure it is clean.

Healthcare leaders can finally rely on electronic data to make strategic decisions. Here’s a CHRISTUS Health anecdote you might relate to: in a weekly meeting, each executive reviews a strategic dashboard; these dashboards drive strategic decision making about CPOE (computerized physician order entry) adoption, emergency room wait times and price per procedure. Powered by enterprise information management, these dashboards paint a reliable and consistent view across the system’s 60 hospitals. Prior to the implementation of an enterprise data platform, each executive relied on his or her own set of data.

In the pre-data investment era, seemingly common data elements from different sources did not mean the same thing. For example, “Admit Date” in one report reflected the emergency department admission date whereas “Admit Date” in another report referred to the inpatient admission date.

  • Sharing data is necessary…make it safe.

To cure cancer, reduce costs and engage patients, care providers need access to data, and not just the data they generate; it has to be shared for coordination of care through transitions of care and across settings, i.e., home care, long-term care and behavioral health. Fortunately, consumers and clinicians agree on this: PWC reports that 56% of consumers and 30% of physicians are comfortable with data sharing for care coordination. Further progress is demonstrated by healthcare organizations willingly adopting cloud-based applications: as of 2013, 40% of healthcare organizations were already storing protected health information (PHI) in the cloud.

Increased data access, however, carries risk, leaving health data exposed. The threat of data breach or hacking is multiplied by the presence (in many cases necessary) of PHI on employee laptops and the fact that more providers are being given access to PHI. The Ponemon Institute, a security firm, estimates that data breaches cost the industry $5.6 billion each year. Investments in data-centric security are necessary to assuage fear, protect personal health data and make secure data sharing a reality.

Early improvements in patient outcomes indicate that the relationship between data and healthcare is a valuable investment. The International Institute of Analytics supports this, reporting that although analytics and data maturity across healthcare lags other industries, the opportunity to positively impact clinical and operational outcomes is significant.
