Category Archives: Data Quality

Empowering Your Organization with 3 Views of Customer Data

According to Accenture's 2013 Global Consumer Pulse Survey, "85 percent of customers are frustrated by dealing with a company that does not make it easy to do business with them, 84 percent by companies promising one thing, but delivering another; and 58 percent are frustrated with inconsistent experiences from channel to channel."

Consumers expect more from the companies they do business with. In response, many companies are shifting from managing their business based on an application-, account- or product-centric approach to a customer-centric approach. And this is one of the main drivers for master data management (MDM) adoption. According to a VP of Data Strategy & Services at one of the largest insurance companies in the world, “Customer data is the lifeblood of a company that is serious about customer-centricity.” So, better managing customer data, which is what MDM enables you to do, is a key to the success of any customer-centricity initiative. MDM provides a significant competitive differentiation opportunity for any organization that’s serious about improving customer experience. It enables customer-facing teams to assess the value of any customer, at the individual, household or organization level.

Amongst the myriad business drivers of a customer-centricity initiative, key benefits include delivering an enhanced customer experience – leading to higher customer loyalty and greater share of wallet, more effective cross-sell and upsell targeting to increase revenue, and improved regulatory compliance.

To truly achieve all the benefits expected from a customer-first, customer-centric strategy, we need to look beyond the traditional approaches of data quality and MDM implementations, which often consider only one foundational (yet important) aspect of the technology solution. The primary focus has always been to consolidate and reconcile internal sources of customer data with the hope that this information brought under a single umbrella of a database and a service layer will provide the desired single view of customer. But in reality, this data integration mindset misses the goal of creating quality customer data that is free from duplication and enriched to deliver significant value to the business.

Today's MDM implementations need to take their focus beyond mere data integration to be successful. In the following sections, I will explain three views of customer data that can be built incrementally to make the most of your MDM solution. When implemented fully, these customer views act as key ingredients for improving the execution of your customer-centric business functions.

Trusted Customer View

The first phase of the solution should cover the creation of a trusted customer view. This view empowers your organization with the ability to see complete, accurate and consistent customer information.

In this stage, you take the best information from all the applications and compile it into a single golden profile. You not only use data integration technology for this, but also employ data quality tools to ensure the correctness and completeness of the customer data. Advanced matching, merging and trust framework are used to derive the most up-to-date information about your customer. You also guarantee that the golden record you create is accessible to business applications and systems of choice so everyone who has the authority can leverage the single version of the truth.

At the end of this stage, you will be able to say clearly that John D., who lives at 123 Main St, and Johnny Doe at 123 Main Street, who are both doing business with you, are not really two different individuals.
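As an illustration, here is a minimal sketch of the kind of match logic that resolves such duplicates. Real MDM matching engines use probabilistic scoring and far richer rules; the normalization tables and function names below are invented for this example.

```python
import re

# Toy normalization tables -- production systems use extensive reference data.
STREET_ABBREV = {"street": "st", "avenue": "ave", "road": "rd"}
NICKNAMES = {"johnny": "john", "jon": "john", "bob": "robert"}

def normalize_address(addr: str) -> str:
    tokens = re.sub(r"[^\w\s]", "", addr.lower()).split()
    return " ".join(STREET_ABBREV.get(t, t) for t in tokens)

def normalize_name(name: str) -> str:
    tokens = re.sub(r"[^\w\s]", "", name.lower()).split()
    return " ".join(NICKNAMES.get(t, t) for t in tokens)

def is_probable_match(rec_a: dict, rec_b: dict) -> bool:
    """Match when normalized addresses agree and name tokens are compatible."""
    if normalize_address(rec_a["address"]) != normalize_address(rec_b["address"]):
        return False
    a = normalize_name(rec_a["name"]).split()
    b = normalize_name(rec_b["name"]).split()
    # Allow an initial to stand in for a full token: "d" matches "doe".
    return all(x == y or x == y[:1] or y == x[:1] for x, y in zip(a, b))

crm_record = {"name": "John D.", "address": "123 Main St"}
billing_record = {"name": "Johnny Doe", "address": "123 Main Street"}
print(is_probable_match(crm_record, billing_record))  # True: one golden profile
```

Once two records are matched, a trust framework would then decide, field by field, which source supplies the surviving value in the golden record.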


Customer Relationships View

The next level of visibility is about providing a view into the customer’s relationships. It takes advantage of the single customer view and layers in all valuable family and business relationships as well as account and product information. Revealing these relationships is where the real value of multidomain MDM technology comes into action.

At the end of this phase, you see not only John Doe's golden profile but also the products he holds. He might have a personal checking account from the Retail Bank, a mortgage from the Mortgage line of business, and a brokerage and trust account with the Wealth Management division. You can see that John has his own consulting firm. You can see he has a corporate credit card and a checking account with the Commercial division under the name John Doe Consulting Company.

At the end of this phase, you will have a consolidated view of all important relationship information that will help you evaluate the true value of each customer to your organization.
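One way to picture this layer is as a small relationship graph over the golden profiles. The sketch below (identifiers, account codes and relationship names are all invented) shows how John Doe's personal products and his company's commercial accounts roll up into one consolidated view:

```python
from dataclasses import dataclass, field

@dataclass
class Relationship:
    kind: str    # e.g. "owns", "officer_of"
    target: str  # an account code or another party's id

@dataclass
class Party:
    party_id: str
    name: str
    relationships: list = field(default_factory=list)

# Golden profiles from the trusted-view phase, now layered with relationships.
john = Party("P-1001", "John Doe", [
    Relationship("owns", "RETAIL:checking-4411"),
    Relationship("owns", "MORTGAGE:loan-7788"),
    Relationship("owns", "WEALTH:brokerage-2205"),
    Relationship("officer_of", "P-2001"),  # his consulting firm
])
company = Party("P-2001", "John Doe Consulting Company", [
    Relationship("owns", "COMMERCIAL:credit-card-9931"),
    Relationship("owns", "COMMERCIAL:checking-5520"),
])
registry = {p.party_id: p for p in (john, company)}

def products_for(party: Party, registry: dict, seen=None) -> list:
    """Walk person-to-business links to list every product a customer touches."""
    seen = seen if seen is not None else set()
    products = []
    for rel in party.relationships:
        if rel.kind == "owns":
            products.append(rel.target)
        elif rel.target in registry and rel.target not in seen:
            seen.add(rel.target)
            products += products_for(registry[rel.target], registry, seen)
    return products

print(products_for(john, registry))  # five products across four lines of business
```

Walking the graph from the individual through the "officer_of" link is what turns five scattered accounts into one picture of customer value.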

Customer Interactions and Transactions View

The third level of visibility is in the form of your customer’s interactions and transactions with your organization.

During this phase, you tie the transactional information, historical data and social interactions your customer has with your organization into the system to enhance it further. Building this view opens up a whole new world of opportunities because you can see everything related to your customer in one central place. Once you have this comprehensive view, when John Doe calls your call center, you know how valuable he is to your business, which product he just bought from you (transactional data), and what problem he is facing (social interactions).
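A hypothetical call-center lookup makes the payoff concrete. Assuming the three views are already populated (all names and records below are invented), a single function can join them for one customer:

```python
# Hypothetical call-center lookup joining the three views for one customer.
golden = {"P-1001": {"name": "John Doe", "tier": "Platinum"}}
transactions = {"P-1001": [{"date": "2014-11-10", "product": "brokerage account"}]}
interactions = {"P-1001": [{"channel": "twitter", "note": "asked about login error"}]}

def customer_360(party_id: str) -> dict:
    """Join golden profile, latest transaction and open interactions."""
    view = dict(golden[party_id])
    view["last_purchase"] = transactions.get(party_id, [{}])[-1]
    view["open_issues"] = [i["note"] for i in interactions.get(party_id, [])]
    return view

print(customer_360("P-1001"))
```

The agent sees customer value, the latest purchase and the open complaint in one response, rather than querying three systems.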

A widely accepted rule of thumb holds that 80 percent of your company's future revenue will come from 20 percent of your existing customers. Many organizations are trying to ensure they are doing everything they can to retain existing customers and grow wallet share. Starting with the Trusted Customer View is the first step toward making your existing customers stay. Once you have established all three views discussed here, you can arm your customer-facing teams with a comprehensive view of customers so they can:

  • Deliver the best customer experiences possible at every touch point,
  • Improve customer segmentation for tailored offers, boosting marketing and sales productivity,
  • Increase cross-sell and up-sell success, and
  • Streamline regulatory reporting.

Achieving the three views discussed here requires a solid data management platform. You not only need industry-leading multidomain MDM technology, but also tools that help you integrate data, control its quality and connect all the dots. These technologies should work together seamlessly to make your implementation easier and help you gain rapid benefits. Therefore, choose your data management platform carefully. To learn more about MDM vendors, read the recently released Gartner Magic Quadrant for MDM of Customer Data Solutions.

-Prash (@MDMGeek)

www.mdmgeek.com


Raised Expectations and New Discoveries with Great Customer Data


We have a landing! On November 12, the Rosetta probe arrived at its destination, a comet 300 million miles away from Earth.

Fulfilling its duty after a 10-year journey, Rosetta dropped its lander, Philae, to gather data from the comet below.

Everything about the comet so far is both challenging and fascinating, from its advanced age – 4.6 billion years – to its hard-to-pronounce name, Churyumov-Gerasimenko.

The size of Gerasimenko? Roughly that of lower Manhattan. The shape wasn't the potato-like image some anticipated of a typical comet. Instead, it had a form compared to that of a "rubber duck," making landing trickier than expected.

To add one more challenging feature, the comet was flying at 38,000 mph. The feat of landing the probe onto the comet has been compared to hitting a speeding bullet with another speeding bullet.

All of this would have been impossible if the ESA didn’t have serious data on the lofty goal they set forth.

As this was happening, on the same day there was a quieter landing: Salesforce and LinkedIn paired up to publish research on marketing strategies, surveying more than 900 senior-level B2B and B2C marketers through their social networks about their roles, marketing trends, and the challenges they face.

This one finding stood out to me: “Only 17% of respondents said their company had fully integrated their customer data across all areas of the organization. However, 97% of those ‘fully integrated’ marketing leaders said they were at least somewhat effective at creating a cohesive customer journey across all touchpoints and channels.”

While not as many companies were implementing customer data like they should, those who did felt the strong impact of the benefits. It’s like knowing the difference between interacting with a potato-shaped company, or a B2C, vs. interacting with a rubber-duck-shaped company, or a B2B, for example.

Efficient customer data could help you learn how to land each one properly. While the methods for dealing with both might be similar, they’re not identical, and taking the wrong approach could mean a failed landing. One of the conclusions from the survey showed there is a “direct link between how well a marketer integrated customer data and the self-reported successes of that brand’s strategy.”

When interviewed by MSNBC on the comet landing, Bill Nye, also known as “the Science Guy,” had many positive things to say on the historic event. One question he answered was why do programs like the ESA exist – or basically, why do we go to space?

Nye had two replies: “It raises the expectations of your society,” and “You’re going to make discoveries.”

Marketers armed with insights from powerful customer data can have their own "landing on a comet" moment. Properly integrated customer data means you'll be making new discoveries about your own clientele while simultaneously raising the expectations of your business.

The world couldn't progress forward without quality data, whether in the realm of retail or planetary science. We put a strong emphasis on validating and cleansing your customer data at the point of entry or the point of collection.
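As a rough illustration of what point-of-entry checks look like (format validation only; commercial verification services additionally confirm that an address is deliverable or a phone line is live, and the function below is a made-up sketch, not any vendor's product):

```python
import re

def validate_at_entry(record: dict) -> dict:
    """Flag obviously malformed email/phone values before they enter the CRM.
    These are format checks only; real verification services also confirm
    deliverability of an address and whether a phone line is live."""
    issues = []
    email = record.get("email", "").strip().lower()
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[a-z]{2,}", email):
        issues.append("email")
    digits = re.sub(r"\D", "", record.get("phone", ""))
    if len(digits) == 11 and digits.startswith("1"):  # drop US country code
        digits = digits[1:]
    if len(digits) != 10:
        issues.append("phone")
    return {"email": email, "phone": digits, "issues": issues}

clean = validate_at_entry({"email": "John.Doe@Example.com", "phone": "(555) 123-4567"})
print(clean)  # {'email': 'john.doe@example.com', 'phone': '5551234567', 'issues': []}
```

Catching a bad value at collection time is far cheaper than matching and merging around it later.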

Check out a quick video demo here of three data quality solutions: Email Verification, Address Verification, and Phone Validation.



CFO Rising: CFOs Show They Are Increasingly Business Oriented


At the CFO Rising West Conference on October 30th and 31st, there were sessions on managing capital expenditures, completing an IPO, and even managing margin and cash flow. However, the keynote presenters did not spend much time on these topics. Instead, they focused on how CFOs need to help their firms execute better. Here is a quick summary of the suggestions made by CFOs in broadcasting, consumer goods, retail, healthcare, and medical devices.

The Modern CFO is Strategic

The Broadcasting CFO started his talk by saying he was not at the conference to share why CFOs need to move from being "bean counters to strategic advisors". He said, "Let's face it, the modern CFO is a strategic CFO". Agreeing with this viewpoint, the Consumer Goods CFO said that finance organizations have a major role to play in business transformation. Finance, after all, is the place to drive corporate improvement as well as business productivity and efficiency.

CFOs Talked About Their Business’ Issues

The Retailer CFO talked like a marketing person. He said retail today is all about driving a multichannel customer experience, and to do this, finance increasingly needs to provide real business value. He said, therefore, that data is critical to the retailer's ability to serve customers better. He claimed that customers are changing how they buy, what they want to buy, and when they want to buy; retailers are being disrupted, and it is better to understand and respond to these trends. His firm is trying, therefore, to build a better model of ecommerce.

Meanwhile, the Medical Devices CFO said that as a supplier to medical device vendors, "what we do is compete with our customers' engineering staffs". And the Consumer Goods CFO added the importance of finance driving sustained business transformation.

CFOs Want To Improve Their Business’ Ability To Execute

The Medical Devices CFO said CFOs need to look for "earlier execution points". They need to look for the drivers of behavior change. As a key element of this, he suggested that CFOs need to develop "early warning indicators". He said CFOs need to actively look at the ability to achieve objectives. With sales, we need to ask: What deals do we have in the pipeline? What size are these deals? And at what rate will these deals close? Only with this information can the CFO derive an expected company growth rate. He then asked the CFOs in the room to identify themselves. With their hands in the air, he asked them whether they are helping to create a company that executes. He laid down the gauntlet by asserting that if you are not creating a company that executes, then you are going to be looking at cutting costs sooner rather than later.

The retailer CFO agreed with this CFO. He said today we need to focus on how to win a market. We need to be asking business questions including:

  • How should we deploy resources to deliver against our firm’s value proposition?
  • How do we know when we win?

CFOs Claim Ownership For Enterprise Performance Measurement

The Retail CFO said that finance needs to own "the facts for the organization"—the metrics and KPIs. This, he claims, is how CFOs will earn their seat at the CEO's table. He said that in the past CFOs have tended to be stoic, but this now needs to change.

The Medical Devices CFO agreed and said enterprises shouldn't be tracking 150 things—they need to pare it down to 12-15. They need to be clear about what they measure—who, what, and when. He said that in an execution culture people need to know the targets. They need measurable goals. And he asserted that business metrics are needed over financial metrics. The Consumer Goods CFO agreed, saying financial measures alone would find that "a house is on fire after half the house had already burned down". The Healthcare CFO picked up on this idea and talked about the importance of finance driving value scorecards and monthly benchmarks of performance improvement. The Broadcaster CFO went further and suggested the CFO's role is one of a value optimizer.

CFOs Own The Data and Drive a Fact-based, Strategic Company Culture

The Retail CFO discussed the need to drive a culture of insight. This means that data absolutely matters to the CFO. He honestly admits that finance organizations have not used data well enough, but he claims finance needs to make the time to truly become data centric. He said he does not consider himself a data expert, but finance needs to own "enterprise data and the integrity of this data". He said as well that finance needs to ensure there are no data silos. He summarized by saying finance needs to use data to make sure that resources are focused on the right things, decisions are based on facts, and metrics are simple and understandable. "In finance, we need to use data to increasingly drive business outcomes".

CFOs Need to Drive a Culture That Executes for Today and the Future

Honestly, I never thought I would hear this from a group of CFOs. The Retail CFO said we need to ensure that the big ideas do not get lost. We need to speed up the prosecution of business activities. We need to drive the exponential things (meaning we position our assets and resources accordingly) and, at the same time, the linear things that can drive a 1% improvement in execution or a 1% reduction in cost. Meanwhile, the Medical Devices CFO discussed the present value of liabilities for rework, lawsuits, and warranty costs. He said that finance leaders need to ensure things are done right today so the business doesn't have problems a year from now. "If you give doing it right the first time a priority, you can reduce warranty reserve, and this can directly impact corporate operating income".

CFOs Need to Lead on Ethics and Compliance

The Medical Devices CFO said that CFOs also need to have high ethics and drive compliance. The Retail CFO discussed how finance needs to make the business transparent: transparent about what is working and what is not. At the same time, the role of the CFO is to ensure the integrity of the organization. The Broadcaster CFO asserted the same thing by saying that CFOs need to take a stakeholder approach to how they do business.

Final remarks

On the whole, the CFOs at CFO Rising are showing the way forward for the modern CFO. This CFO is all about using data to drive present and future performance, ethics and compliance, and business transparency. This is a big change from the historical controller approach and mentality. I once asked a boss what I needed to be promoted to Vice President; my boss said that I needed to move from being a technical specialist to being a business person. Today's CFOs clearly show that they are business people first.

Related links

Solution Brief: The Intelligent Data Platform

Related Blogs
CFOs Move to Chief Profitability Officer
CFOs Discuss Their Technology Priorities
The CFO Viewpoint upon Data
How CFOs can change the conversation with their CIO?
New type of CFO represents a potent CIO ally
Competing on Analytics
The Business Case for Better Data Connectivity
Twitter: @MylesSuer


How Email Marketers Can Keep Up With Changes to Their Industry


Email has come a long way since its beginning. In 1971, two years after the U.S. launched a rocket to the moon, programmer Raymond Tomlinson sent the first email, a message that read "QWERTYUIOP".

In 1991, when the World Wide Web was created, email then had the opportunity to evolve into the mainstream form of communication it is today.

The statistics for modern day email are staggering. Worldwide, there are 2.2 billion email users as of 2012, according to MarketingProfs.

With all these messages flying about, you know there's going to be some email overload. Google's Gmail alone has 425 million users worldwide. Email service providers (ESPs) know people have too much email to deal with, and there's a lot of noise out there. More than 75% of the world's email is spam.

Gmail is one of the applications that recently responded to this problem, and all email marketers need to be aware of it.

On October 22, Google announced Inbox.

Google’s Inbox takes several steps to bring structure to the abundant world of email with these features:

  • Categorizes and bundles emails.
  • Highlights important content within the body of the email.
  • Allows users to customize messages by adding their own reminders.

This latest update to Gmail is just one small way that the landscape of email marketing and audience preferences is changing all the time.

As we integrate more technology into our daily lives, it only makes sense that we use digital messages as a means of communication more often. What will this mean for email in the future? How will marketers adjust to the new challenges email marketing will present at larger volumes, with audiences wanting more segmentation and personalization?

All About eMail Virtual Conference

One easy way to stay on top of these and other changes to the email landscape is talking to your peers and experts in the industry. Luckily, an opportunity is coming up — and it’s free.

Make sure you check out the All About eMail Virtual Conference & Expo on November 13. It’s a virtual event, which means you can attend without leaving your desk!

It’s a one-day event with the busy email marketer in mind. Register for free and be sure to join us for our presentation, “Maximizing Email Campaign Performance Through Data Quality.”

Other strategic sessions include email marketing innovations presented by Forrester Research, mobile email, ROI, content tips, email sending frequency, and much more. (See the agenda here.)

This conference is just one indication that email marketing is still relevant, but only if email marketers adjust to changing audience preferences. From its humble beginnings in 1971, email has come a long way. Email marketing has the best ROI in the business and will continue to deliver value long into the future.


If Data Projects Weather, Why Not Corporate Revenue?

Every fall, Informatica sales leadership puts together its strategy for the following year. The revenue target is typically a function of the number of sellers, the addressable market size and key accounts in a given territory, average spend and conversion rate given prior years' experience, and so on. This straightforward math has not changed in probably decades, but it assumes that the underlying data are 100% correct. This data includes:

  • Number of accounts with a decision-making location in a territory
  • Related IT spend and prioritization
  • Organizational characteristics like legal ownership, industry code, credit score, annual report figures, etc.
  • Key contacts, roles and sentiment
  • Prior interaction (campaign response, etc.) and transaction (quotes, orders, payments, products, etc.) history with the firm
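The arithmetic behind such a target is trivial, which is exactly the point: every input that feeds it is a data point that can be wrong. A toy version (all figures invented for illustration):

```python
# Illustrative only: every figure below is invented to show the arithmetic,
# which silently assumes each underlying data point is correct.
sellers = 40
accounts_per_seller = 250       # depends on accurate account/territory data
avg_spend = 120_000             # depends on accurate spend history
conversion_rate = 0.04          # from prior years' experience

revenue_target = sellers * accounts_per_seller * avg_spend * conversion_rate
print(f"target: ${revenue_target:,.0f}")   # target: $48,000,000

# If 10% of accounts are duplicates or sit in the wrong territory,
# that much of the plan is built on bad data:
at_risk = revenue_target * 0.10
print(f"at risk: ${at_risk:,.0f}")         # at risk: $4,800,000
```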

Every organization, whether it is a life insurer, a pharmaceutical manufacturer, a fashion retailer or a construction company, knows this math and plans on achieving somewhere above 85% of the resulting target. Office locations, support infrastructure spend, compensation and hiring plans are based on this and communicated.


We Are Not Modeling the Global Climate Here

So why is it that, when it is an open secret that the underlying data is far from perfect (accurate, current and useful) and corrupts outcomes, so few believe that fixing it has any revenue impact? After all, we are not projecting the climate for the next hundred years with a thousand-plus variables.

If corporate hierarchies are incorrect, your spend projections based on incorrect territory targets, credit terms and discount strategy will be off. If every client touch point does not have a complete picture of cross-departmental purchases and campaign responses, your customer acquisition cost will be too high, as you will contact the wrong prospects with irrelevant offers. If billing, tax or product codes are incorrect, your billing will be off; this is a classic telecommunications example worth millions every month. If your equipment location and configuration data is wrong, maintenance schedules will be incorrect, and every hour of production interruption will cost an industrial manufacturer of wood pellets or oil millions.

Also, if industry leaders enjoy an upsell ratio of 17% and you experience 3%, data (assuming you have no formal upsell policy because it would violate your independent middleman relationships) will have a lot to do with it.

The challenge is not the fact that data can create revenue improvements but how much given the other factors: people and process.

Every industry laggard can identify a few FTEs who spend 25% of their time putting one-off data repositories together for some compliance, M&A, customer or marketing analytics effort. Organic revenue growth from net-new or previously unrealized revenue should be the focus of any data management initiative. Don't get me wrong; purposeful recruitment (people), comp plans and training (processes) are important as well. Few people doubt that people and process drive revenue growth. However, few believe the data being fed into these processes has an impact.

This is a head-scratcher for me. An IT manager at a US upstream oil firm once told me that it would be ludicrous to think data has a revenue impact. They just fixed data because it is important, so his consumers would know where all the wells are and which ones made a good profit. Isn't that assuming data drives production revenue? (Rhetorical question.)

A CFO at a smaller retail bank said during a call that his account managers know their clients' needs and history, and that there is nothing more good data can add in terms of value. And this happened after twenty other folks at his bank, including his own team, delivered more than ten use cases, of which three were based on revenue.

Hard cost (materials and FTE) reduction is easy to accept, and cost avoidance is a leap of faith to a degree, but revenue is no less concrete. Otherwise, why not just throw the dice and see how revenue looks next year without a central customer database? Let every department have each account executive gather their own data, structure it the way they want, put it on paper and make hard copies for distribution to HQ. This is not about paper versus electronic, but about the inability to reconcile data from many sources, a problem that paper only makes worse.

Have you ever heard of any organization moving back to the Fifties and competing today? That would be a fun exercise. Thoughts or suggestions? I would be glad to hear them.


Analytics Stories: A Healthcare Case Study

As I indicated in Competing on Analytics, if you ask CIOs today about the importance of data to their enterprises, they will likely tell you about their business’ need to “compete on analytics”, to deliver better business insights, and to drive faster business decision making. These have a high place on the business and CIO agendas, according to Thomas H. Davenport, because “at a time when firms in many industries offer similar products and use comparable technologies, business processes are among the last remaining points of differentiation.” For this reason, Davenport claims timely analytics enables companies to “wring every last drop of value from their processes”.

So is anyone showing the way on how to compete on analytics?

UMass Memorial Health Care is a great example of an enterprise that is using analytics to "wring every last drop of value from their processes". However, before UMass could compete on data, it needed to create data that could be trusted by its leadership team.

Competing on analytics requires trustworthy data

At UMass, they found that they could not accurately measure the size of their patient care population. This is a critical metric for growing market share. Think about how hard it would be to operate any business without an accurate count of how many customers are being served. Lacking this information hindered UMass' ability to make strategic market decisions and drive key business and clinical imperatives.

A key need at UMass was to determine a number of critical success factors for its business. This obviously included the size of the patient population, but it also included the composition of the patient population and the number of unique patients served by primary care physician providers across each of its business locations. Without this knowledge, UMass found itself struggling to make effective decisions regarding its strategic direction, its clinical policies, and even its financial management. And all of these factors really matter in an era of healthcare reform.

Things proved particularly complex at UMass since it operates what is called a "complex integrated delivery network". This means that portions of its business effectively operate under different business models, which creates a data challenge in healthcare. Unlike other diversified enterprises, UMass needs an operating model – "the necessary level of business process integration and standardization for delivering its services to customers"[1] – that can support different elements of its business while remaining unified for integrative analysis. This matters because in UMass' case there is a single common denominator: the patient. And to be clear, while each of UMass' organizations could depend on its own data to meet its needs, UMass lacked an integrative view into patients.

Departmental Data may be good for a department but not for the Enterprise

UMass had adequate data for each organization, such as delivering patient care or billing for a specific department or hospital, but it was inadequate for system-wide measures. Aggregation and analytics, which needed to combine data across systems and organizations, were stymied by data inconsistencies, incompletely populated fields, and other data quality problems in each system. These issues made it impossible to provide the analytics UMass' senior managers needed. For example, UMass' aggregated data contained duplicate patients—people who had been treated at different sites and had different medical record numbers, but who were in fact the same patients.
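The duplicate-patient problem can be sketched in a few lines. Even a crude cross-site identity key (real patient matching weighs many more attributes and tolerates typos; the records below are invented) shows why counting medical record numbers overstates the patient population:

```python
from collections import defaultdict

# Encounters from different hospitals: two records are the same person
# under different medical record numbers (MRNs).
encounters = [
    {"mrn": "A-1001", "site": "Memorial", "name": "JOHN DOE", "dob": "1970-03-15"},
    {"mrn": "B-2002", "site": "University", "name": "Doe, John", "dob": "1970-03-15"},
    {"mrn": "C-3003", "site": "Clinton", "name": "JANE ROE", "dob": "1982-07-01"},
]

def patient_key(rec: dict) -> tuple:
    """Crude cross-site identity key: normalized name tokens + date of birth.
    Real patient matching also weighs address and phone, and tolerates typos."""
    tokens = sorted(rec["name"].lower().replace(",", " ").split())
    return (" ".join(tokens), rec["dob"])

patients = defaultdict(list)
for rec in encounters:
    patients[patient_key(rec)].append(rec["mrn"])

print(len(patients))  # 2 unique patients behind 3 medical record numbers
```

Until records are grouped by patient identity rather than by medical record number, any system-wide patient count is inflated.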

A key need for UMass in creating the ability to compete on analytics was to measure and report on the number of primary care patients being treated across its entire healthcare system. UMass leadership saw this as a key planning and strategy metric because primary care patients today are the focus of investments in wellness and prevention programs, as well as a key source of specialty visits and inpatients. According to George Brenckle, Senior Vice President and CIO, "We had an urgent need for improved clinical and business intelligence across all our operations. We needed an integrated view of patient information, encounters, providers, and UMass Memorial locations to support improved decision making, advance the quality of patient care, and increase patient loyalty. To put the problem into perspective, we have more than 100 applications—some critical, some not so critical—and our ultimate ambition was to integrate all of these areas of business, leverage analytics, and drive clinical and operational excellence."

The UMass Solution

UMass solved the above issues by creating an integrated view of patient information, encounters, providers, and UMass Memorial locations. This allowed UMass to compute the number of patients cared for by primary care physicians. To make this work, the solution merged data from the core hospital information applications and resolved the data quality issues that had prevented UMass from deriving the primary care patient count. Armed with this, data integration helped UMass Memorial improve its clinical outcomes, grow its patient population, increase process efficiency, and ultimately maximize its return on data. As well as gaining a reliable measure of its primary care patient population, UMass was now able to determine accurate counts of unique patients served by its hospitals (3.2 million), active patients (i.e., those treated within the last three years—approximately 1.7 million), and unique providers (approximately 24,000).

According to Brenckle, data integration transformed their analytical capabilities and decision making. “We know who our primary care patients are and how many there are of them, whether the volume of patients is rising or decreasing, how many we are treating in an ambulatory or acute care setting, and what happens to those patients as they move through the healthcare system. We are able to examine which providers they saw and at which location. This data is vital to improving clinical outcomes, growing the patient population, and increasing efficiency.”

Related links

Solution Brief: The Intelligent Data Platform
Details on the UMASS Solution

Related Blogs

Thomas Davenport Book “Competing On Analytics”
Competing on Analytics
The Business Case for Better Data Connectivity
CIO explains the importance of Big Data to Healthcare
The CFO Viewpoint upon Data
What an enlightened healthcare CEO should tell their CIO?
Twitter: @MylesSuer

 


 

Posted in CIO, Data Governance, Data Quality

More Evidence That Data Integration Is Clearly Strategic


A recent study from Epicor Software Corporation surveyed more than 300 IT and business decision-makers.  The study results highlighted the biggest challenges and opportunities facing Australian businesses. The independent research report “From Business Processes to Product Distribution” was based upon a survey of Australian organizations with more than 20 employees.

Key findings from the report include:

  • 65% of organizations cite data processing and integration as hampering distribution capability, with nearly half claiming their existing software and ERP is not suitable for distribution.
  • Nearly two-thirds of enterprises have some form of distribution process, involving products or services.
  • More than 80% of organizations have at least some problem with product or service distribution.
  • More than 50% of CIOs in organizations with distribution processes believe better distribution would increase revenue and optimize business processes, with a further 38% citing reduced operating costs.

The core finding: “With better data integration comes better automation and decision making.”

This report is one of many I’ve seen over the years that come to the same conclusion.  Most of those involved with the operations of the business don’t have access to the key data points they need; as a result, they can’t automate tactical decisions, and they also can’t “mine” the data to understand the true state of the business.

The more a business deals with building and moving products, the more data integration becomes an imperative.  As stated in this survey, as well as others, a large majority cite “data processing and integration as hampering distribution capabilities.”

Of course, these issues go well beyond Australia.  Most enterprises I’ve dealt with have some gap between the need to share key business data to support business processes and decision support, and what currently exists in terms of data integration capabilities.

The focus here is on the multiple values that data integration can bring.  This includes:

  • The ability to track everything as it moves from manufacturing, to inventory, to distribution, and beyond.  You can bind these movements to core business processes, such as automatically reordering parts to make more products and refill inventory.
  • The ability to see into the past, and to see into the future.  Emerging approaches to predictive analytics allow businesses to finally see into the future, as well as to see what went truly right and truly wrong in the past.
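The first bullet above can be made concrete with a minimal sketch. Once integrated data exposes live inventory levels, a simple reorder-point rule can trigger replenishment automatically; the function name and thresholds here are hypothetical, not from any particular product.

```python
# A minimal sketch of automatic reordering driven by integrated inventory
# data. All names and thresholds are hypothetical illustrations.
def reorder_quantity(on_hand, reorder_point, target_level):
    """Return how many units to order, or 0 if stock is sufficient."""
    if on_hand >= reorder_point:
        return 0
    # Order enough to bring stock back up to the target level.
    return target_level - on_hand

print(reorder_quantity(on_hand=40, reorder_point=100, target_level=250))   # 210
print(reorder_quantity(on_hand=120, reorder_point=100, target_level=250))  # 0
```

The rule itself is trivial; the hard part, as the survey respondents note, is getting a trustworthy `on_hand` figure out of fragmented systems in the first place.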

While data integration technology has been around for decades, most businesses that both manufacture and distribute products have not taken full advantage of this technology.  The reasons range from perceptions around affordability, to the skills required to maintain the data integration flow.  However, the truth is that you really can’t afford to ignore data integration technology any longer.  It’s time to create and deploy a data integration strategy, using the right technology.

This survey is just an instance of a pattern.  Data integration was considered optional in the past.  With today’s emerging notions around the strategic use of data, clearly, it’s no longer an option.

Posted in Data First, Data Integration, Data Integration Platform, Data Quality

At Valspar, Data Management Is Key to Controlling Purchasing Costs


Steve Jenkins is working to improve information management maturity at Valspar

“Raw materials costs are the company’s single largest expense category,” said Steve Jenkins, Global IT Director at Valspar, at MDM Day in London. “Data management technology can help us improve business process efficiency, manage sourcing risk and reduce RFQ cycle times.”

Valspar is a $4 billion global manufacturing company that produces a portfolio of leading paint and coating brands. At the end of 2013, the 200-year-old company celebrated record sales and earnings. It also completed two acquisitions. Valspar now has 10,000 employees operating in 25 countries.

As is the case for many global companies, growth creates complexity. “Valspar has multiple business units with varying purchasing practices. We source raw materials from 1,000s of vendors around the globe,” shared Steve.

“We want to achieve economies of scale in purchasing to control spending,” Steve said as he shared Valspar’s improvement objectives. “We want to build stronger relationships with our preferred vendors. Also, we want to develop internal process efficiencies to realize additional savings.”

Poorly managed vendor and raw materials data was impacting Valspar’s buying power


The Valspar team, who sharply focuses on productivity, had an “Aha” moment. “We realized our buying power was limited by the age and quality of available vendor data and raw materials data,” revealed Steve. 

The core vendor data and raw materials data that should have been the same across multiple systems wasn’t. Data was often missing or wrong. This made it difficult to calculate the total spend on raw materials. It was also hard to calculate the total cost of expedited freight of raw materials. So, employees used a manual, time-consuming and error-prone process to consolidate vendor data and raw materials data for reporting.
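The consolidation problem described above can be illustrated with a small sketch. The duplicate vendor spellings are hypothetical examples of the matching problem, and the alias table is a stand-in for the far richer matching rules a real MDM hub would apply.

```python
# Sketch of consolidating vendor spend records from multiple systems to
# compute total raw-materials spend per vendor. Names and figures are
# hypothetical; the alias map stands in for real MDM match/merge rules.
purchases = [
    {"vendor": "ACME Corp",        "amount": 120_000.0},
    {"vendor": "Acme Corporation", "amount": 80_000.0},
    {"vendor": "Globex",           "amount": 50_000.0},
]

# Maps each source-system spelling to one canonical vendor name.
canonical = {"ACME Corp": "Acme", "Acme Corporation": "Acme", "Globex": "Globex"}

def total_spend_by_vendor(purchases, canonical):
    """Sum spend per canonical vendor, resolving duplicate spellings."""
    totals = {}
    for p in purchases:
        name = canonical.get(p["vendor"], p["vendor"])
        totals[name] = totals.get(name, 0.0) + p["amount"]
    return totals

print(total_spend_by_vendor(purchases, canonical))
```

Without the canonical mapping, Acme's $200,000 of spend would appear split across two vendor records, which is exactly why the total-spend calculation was so hard for Valspar's teams to produce by hand.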

These data issues were getting in the way of achieving their improvement objectives. Valspar needed a data management solution.

Valspar needed a single trusted source of vendor and raw materials data


The team chose Informatica MDM, master data management (MDM) technology, as their enterprise hub for vendors and raw materials. It will manage this data centrally on an ongoing basis. With Informatica MDM, Valspar will have a single trusted source of vendor and raw materials data.

Informatica PowerCenter will access data from multiple source systems. Informatica Data Quality will profile the data before it goes into the hub. Then, after Informatica MDM does its magic, PowerCenter will deliver clean, consistent, connected and enriched data to target systems.
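The profiling step in that pipeline can be sketched in miniature: before records enter the hub, check completeness and flag duplicates. This is a conceptual illustration only; the field names are assumptions, and Informatica Data Quality applies far richer, configurable rules than this.

```python
# Minimal sketch of profiling vendor records before they enter an MDM hub:
# rows with missing required fields or duplicate keys get flagged.
REQUIRED = ("vendor_id", "name", "country")

def profile(records):
    """Split records into clean rows and rows with quality issues."""
    clean, issues = [], []
    seen_ids = set()
    for r in records:
        missing = [f for f in REQUIRED if not r.get(f)]
        if missing:
            issues.append((r, f"missing: {missing}"))
        elif r["vendor_id"] in seen_ids:
            issues.append((r, "duplicate vendor_id"))
        else:
            seen_ids.add(r["vendor_id"])
            clean.append(r)
    return clean, issues

rows = [
    {"vendor_id": "V1", "name": "Acme",     "country": "US"},
    {"vendor_id": "V1", "name": "Acme Inc", "country": "US"},  # duplicate key
    {"vendor_id": "V2", "name": "",         "country": "DE"},  # missing name
]
clean, issues = profile(rows)
print(len(clean), len(issues))  # 1 2
```

Flagging these rows before the hub load, rather than after, is what keeps downstream target systems receiving only clean, consistent data.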

Better vendor and raw materials data management results in cost savings


Valspar will gain benefits by fueling applications with clean, consistent, connected and enriched data

Valspar expects to gain the following business benefits:

  • Streamline the RFQ process to accelerate raw materials cost savings
  • Reduce the total number of raw materials SKUs and vendors
  • Increase productivity of staff focused on pulling and maintaining data
  • Leverage consistent global data visibility to:
    • increase leverage during contract negotiations
    • improve acquisition due diligence reviews
    • facilitate process standardization and reporting

 

Valspar’s vision is to transform data and information into trusted organizational assets

“Mastering vendor and raw materials data is Phase 1 of our vision to transform data and information into trusted organizational assets,” shared Steve. In Phase 2 the Valspar team will master customer data so they have immediate access to the total purchases of key global customers. In Phase 3, Valspar’s team will turn their attention to product or finished goods data.

Steve ended his presentation with some advice. “First, include your business counterparts in the process as early as possible. They need to own and drive the business case as well as the approval process. Also, master only the vendor and raw materials attributes required to realize the business benefit.”

Total Supplier Information Management eBook

Click here to download the Total Supplier Information Management eBook

Want more? Download the Total Supplier Information Management eBook. It covers:

  • Why your fragmented supplier data is holding you back
  • The cost of supplier data chaos
  • The warning signs you need to be looking for
  • How you can achieve Total Supplier Information Management

 

Posted in Business/IT Collaboration, Data Integration, Data Quality, Manufacturing, Master Data Management, Operational Efficiency, PowerCenter, Vertical

Scalable Enterprise Analytics: Informatica PowerCenter Data Quality and Oracle Exadata

In 2012, Forbes published an article predicting an upcoming problem.

The Need for Scalable Enterprise Analytics

Specifically, increased exploration of Big Data opportunities would place pressure on the typical corporate infrastructure. The generic hardware used to run most enterprise applications was not designed to handle real-time data processing. As a result, the explosion of mobile usage and the proliferation of social networks were increasing the strain on these systems. Most companies now faced real-time processing requirements beyond what the traditional model was designed to handle.

In the past two years, the volume of data and the speed of data growth have increased significantly, and the problem has become more severe. It is now clear that these challenges can’t be overcome simply by doubling or tripling IT spending on infrastructure sprawl. Today, enterprises seek consolidated solutions that offer scalability, performance and ease of administration. The present need is for scalable enterprise analytics.

A Clear Solution Is Available

Informatica PowerCenter and Data Quality form the market-leading data integration and data quality platform. This platform has now been certified by Oracle as an optimal solution for both the Oracle Exadata Database Machine and the Oracle SuperCluster.

As the high-speed on-ramp for data into Oracle Exadata, PowerCenter and Data Quality deliver up to five times faster performance on data load, query, profiling and cleansing tasks. Informatica’s data integration customers can now easily reuse data integration code, skills and resources to access and transform any data from any data source and load it into Exadata, with the highest throughput and scalability.

Customers adopting Oracle Exadata for high-volume, high-speed analytics can now be confident with Informatica PowerCenter and Data Quality. With these products, they can ingest, cleanse and transform all types of data into Exadata with the highest performance and scale required to maximize the value of their Exadata investment.

Proving the Value of Scalable Enterprise Analytics

To demonstrate the efficacy of their partnership, the two companies worked together on a Proof of Value (POV) project. The goal was to prove that using PowerCenter with Exadata would improve both performance and scalability. The project used PowerCenter and Data Quality 9.6.1 and an Exadata X4-2 machine. Oracle 11g was used for both the standard Oracle and Exadata configurations.

The first test was a 1TB load to both Exadata and standard Oracle in a typical PowerCenter use case. The second test consisted of querying a 1TB profiling warehouse database in a Data Quality use case scenario. Performance data was collected for both tests, and the scalability factor was also captured. A variant of the TPC-H dataset was used to generate the test data. The results were significantly better than prior 1TB test results. In particular:

  • The data query tests achieved 5x performance.
  • The data load tests achieved a 3x-5x speed increase.
  • Linear scalability was achieved with read/write tests on Exadata.
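The multipliers above can be read as simple elapsed-time ratios. The timings in this sketch are hypothetical; only the 5x factor comes from the test results reported here.

```python
# Speedup is just baseline elapsed time divided by new elapsed time.
# The example timings are hypothetical; only the 5x ratio is from the report.
def speedup(baseline_seconds, new_seconds):
    """Return the performance multiplier of the new configuration."""
    return baseline_seconds / new_seconds

# e.g. a query taking 50 minutes on standard Oracle that finishes in
# 10 minutes on Exadata corresponds to the reported 5x figure.
print(speedup(50 * 60, 10 * 60))  # 5.0
```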

What Business Benefits Could You Expect?

Informatica PowerCenter and Data Quality, along with Oracle Exadata, now provide a best-of-breed combination of software and hardware, optimized to deliver the highest possible total system performance. These comprehensive tools drive agile reporting and analytics, while empowering IT organizations to meet SLAs and quality goals like never before.

  1. Extend Oracle Exadata’s access to even more business-critical data sources. Utilize optimized out-of-the-box Informatica connectivity to easily access hundreds of data sources, including all the major databases, on-premise and cloud applications, mainframe, social data and Hadoop.
  2. Get more data, more quickly, into Oracle Exadata. Move higher volumes of trusted data quickly into Exadata to support timely reporting with up-to-date information (i.e., up to 5x performance improvement compared to a standard Oracle database).
  3. Centralize management and improve insight into large-scale data warehouses. Deliver the necessary insights to stakeholders with intuitive data lineage and a collaborative business glossary. Contribute to high-quality business analytics, in a timely manner, across the enterprise.
  4. Instantly redirect workloads and resources to Oracle Exadata without compromising performance. Leverage existing code and programming skills to execute high-performance data integration directly on Exadata via pushdown optimization.
  5. Roll out data integration projects faster and more cost-effectively. Customers can now leverage thousands of Informatica-certified developers to execute existing data integration and quality transformations directly on Oracle Exadata, without any additional coding.
  6. Efficiently scale up and scale out. Customers can now maximize performance and lower the costs of data integration and quality operations of any scale by performing Informatica workload and pushdown optimization on Oracle Exadata.
  7. Save significant costs in administration and expansion. Customers can now easily and economically manage large-scale analytics data warehousing environments with a single point of administration and control, and consolidate a multitude of servers onto one rack.
  8. Reduce risk. Customers can now leverage Informatica’s data integration and quality platform to overcome the typical performance and scalability limitations seen in databases and data storage systems. This will help reduce quality-of-service risks as data volumes rise.

Conclusion

Oracle Exadata is a well-engineered system that offers customers out-of-the-box scalability and performance on demand.  Informatica PowerCenter and Data Quality are optimized to run on Exadata, offering customers business benefits that speed up data integration and data quality tasks like never before.  Informatica’s certified, optimized, purpose-built solutions for Oracle can help you enable more timely and trustworthy reporting.  You can now benefit from Informatica’s optimized solutions for Oracle Exadata to make better business decisions by unlocking the full potential of the most current and complete enterprise data available.  As our test results show, you can attain up to 5x performance on Exadata, and Informatica Data Quality customers can profile 1TB datasets, something previously unheard of.  We urge you to deploy the combined solution today to solve your data integration and quality problems while achieving high-speed business analytics in this era of Big Data exploration and the Internet of Things.

Note:

Listen to what Ash Kulkarni, SVP, had to say at OOW14 about how @InformaticaCORP PowerCenter and Data Quality, certified by Oracle as optimized for Exadata, can deliver up to five times faster performance on data load, query, profiling, cleansing and mastering tasks for Exadata.

Posted in Data Integration, Data Integration Platform, Data Quality, Data Services, Data Warehousing, Enterprise Data Management, PowerCenter, Vibe