Category Archives: Business/IT Collaboration

Analytics Stories: An Educational Case Study

As I have shared in other posts in this series, businesses are using analytics to improve their internal and external-facing business processes and to strengthen their “right to win” in the markets where they operate. At first glance, you might not think of universities as needing to worry much about their right to win, but universities today face increasing competition for students as well as pressure to increase efficiency, decrease dependence upon state funding, create new and less expensive delivery models, and drive better accountability.

George Washington University Perceives The Analytic Opportunity

George Washington University (GWU) is no different. For this reason, its leadership determined that the university needed the business insight to compete for the best students, meet student diversity needs, and provide accountability to internal and external stakeholders. All of these issues turned out to have a direct impact on GWU’s business processes, from student recruitment to financial management. At the same time, university leadership determined that the complexity of these challenges requires continual improvement in the University’s operational strategies and, most importantly, accurate, timely, and consistent data.

Making It A Reality

GWU determined that tackling these issues required a flexible system that could provide analytics and key academic performance indicators and metrics on demand, whenever they were needed. The university also determined that the analytics, and the underlying data needed to enable accurate, balanced decisions, had to be produced more quickly and more effectively than in the past.

Unfortunately, GWU’s data was buried in disparate data sources that were largely focused on supporting transactional, day-to-day business processes. This data was difficult to extract and even more difficult to integrate into a single format, owing to inherent system inconsistencies and the ownership issues surrounding them — a classic problem for collegial environments. Moreover, the university’s transactional applications did not store data in models that supported the on-demand and ad hoc aggregations that GWU’s business users required.

To solve these issues, GWU created a data integration and business intelligence implementation dubbed the Student Data Mart (SDM). The SDM integrates raw structured and unstructured data into a unified data model to support key academic metrics.

“The SDM represents a life record of the students,” says Wolf, GWU’s Director of Business Intelligence. “It contains 10 years of recruitment, admissions, enrollment, registration, and grade-point average information for all students across all campuses.” It supports a wide range of academic metrics around campus enrollment counts, admissions selectivity, course enrollment, student achievement, and program metrics.
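To make the idea of a unified, student-centric model concrete, here is a minimal sketch in Python with pandas. The column names and figures are invented for illustration and are not GWU’s actual schema; the point is simply how records consolidated from separate systems can feed an on-demand metric such as admissions selectivity by campus.

import pandas as pd

# Toy stand-ins for records pulled from separate source systems.
admissions = pd.DataFrame({
    "student_id": [1, 2, 3, 4],
    "campus": ["Campus A", "Campus A", "Campus B", "Campus B"],
    "term": ["2014FA"] * 4,
    "admitted": [True, False, True, True],
})
enrollment = pd.DataFrame({
    "student_id": [1, 3],
    "term": ["2014FA", "2014FA"],
    "enrolled": [True, True],
})

# A unified, student-centric view: one row per applicant with outcome flags.
sdm = admissions.merge(enrollment, on=["student_id", "term"], how="left")
sdm["enrolled"] = sdm["enrolled"].fillna(False).astype(bool)

# On-demand metrics: selectivity (admit rate) and yield (enrolled / admitted) per campus.
metrics = sdm.groupby("campus").agg(
    applicants=("student_id", "count"),
    admit_rate=("admitted", "mean"),
)
metrics["yield"] = sdm[sdm["admitted"]].groupby("campus")["enrolled"].mean()
print(metrics)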

These metrics are directly and systematically aligned with the academic goals of each department and with GWU’s overarching business goals. Wolf says, “The SDM system provides direct access to key measures of academic performance. By integrating data into a clean repository and disseminating information over the intranet, the SDM has given university executives direct access to key academic metrics. Based on these metrics, users are able to make decisions in a timely manner and with more precision than before.”

The university’s integration technology also supports a student account system, which supplies more than 400 staff with a shared, unified view of the financial performance of students. It connects data from a series of diverse, fragmented internal sources and third-party data from employers, sponsors, and collection agencies. The goal is to answer business questions such as whether students have paid their fees and how much they paid for each university course.

Continual Quality Improvement

During its implementation, GWU’s data integration process exposed a number of data quality issues that were the natural outcome of distributed data ownership. Without an enterprise approach to data and analytics, it would have been difficult to investigate the nature and extent of the data quality issues inherited from the university’s historically fragmented business intelligence systems. Taking an enterprise approach has also enabled GWU to improve its data quality standards and procedures.

Wolf explains, “Data quality is an inevitable problem in any higher education establishment, because you have so many different people—lecturers, students, and administration staff—all entering data. With our system, we can find hidden data problems, wherever they are, and analyze the anomalies across all data sources. This helps build our trust and confidence in the data. It also speeds up the design phase because it overcomes the need to hand query the data to see what the quality is like.”
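Wolf’s point about hand-querying is worth a small illustration. The snippet below is a hypothetical profiling pass in Python with pandas, not Informatica’s tooling or GWU’s data, that surfaces the kinds of anomalies he describes: missing values, duplicate keys, and records that exist in one source system but not another.

import pandas as pd

def profile(df: pd.DataFrame, key: str) -> dict:
    """Basic checks a design team would otherwise run by hand-written queries."""
    return {
        "rows": len(df),
        "null_counts": df.isna().sum().to_dict(),
        "duplicate_keys": int(df[key].duplicated().sum()),
    }

# Hypothetical extracts from two source systems.
registrar = pd.DataFrame({"student_id": [1, 2, 2, 3], "gpa": [3.2, None, 3.9, 2.8]})
finance = pd.DataFrame({"student_id": [1, 2, 4], "balance": [0.0, 125.0, 50.0]})

print(profile(registrar, "student_id"))
print(profile(finance, "student_id"))

# Cross-source anomaly: students present in one system but missing from the other.
print("in registrar but not finance:", set(registrar["student_id"]) - set(finance["student_id"]))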

Connecting The Dots

Wolf and his team have not stopped there. As data emanating from social media has grown, they have designed their system so social data can be integrated just as easily as traditional data sources such as Oracle Financials, SunGard, SAP, and flat files. Wolf says the SDM platform doesn’t turn its back on any type of data. By allowing the university to integrate any type of data, including social media, Wolf has been able to support key measures of academic performance, improve standards, and reduce costs. Ultimately, this is helping GWU maintain its business position, especially as a magnet for the best students around the world.

In sum, the GWU analytics solution has helped it achieve the following business goals:

  • Attract the best students
  • Provide trusted reliable data for decision makers
  • Enable more timely business decisions
  • Increase achievement of academic and administrative goals
  • Deliver new business insight by combining social media with existing data sources

Related links

Related Blogs

Analytics Stories: A Banking Case Study
Analytics Stories: A Financial Services Case Study
Analytics Stories: A Healthcare Case Study
Who Owns Enterprise Analytics and Data?
Competing on Analytics: A Follow Up to Thomas H. Davenport’s Post in HBR
Thomas Davenport Book “Competing On Analytics”

Solution Brief: The Intelligent Data Platform

Author Twitter: @MylesSuer

Posted in Business Impact / Benefits, Business/IT Collaboration, CIO, Data Governance

Building Engagement through Service and Support

This is a guest post by Tom Petrocelli, Research Director of Enterprise Social, Mobile and Cloud Applications at Neuralytix

Engagement through Service and Support

A product is not just an object or the bits that comprise software or digital media; it is an entire experience. The complete customer experience is vital to the overall value a customer derives from a product and to the ongoing relationship between the customer and the vendor. The customer experience is enhanced through a series of engagements over a variety of digital, social, and personal channels. Each point of contact between a vendor and a customer is an opportunity for engagement, and these engagements over time affect the level of satisfaction the customer has with the vendor relationship.

Service and support is a critical part of this engagement strategy. Retail and consumer goods companies recognize the importance of support to the overall customer relationship. Consequently, these companies have integrated their before- and after-purchase support into their multi-channel and omni-channel marketing strategies. While retail and consumer products companies have led the way in making support an integral part of ongoing customer engagement, B2B companies have begun to do the same. Enterprise IT companies, which are primarily B2B companies, have been expanding their service and support capabilities to create more engagement between their customers and themselves. Service offerings have expanded to include mobile tools, analytics-driven self-help, and support over social media and other digital channels. The goal of these investments is to make interactions more productive for the customer, strengthen relationships through positive engagement, and gather data that drives improvements in both the product and the service.

A great example of an enterprise software company that understands the value of customer engagement through support is Informatica. Known primarily for its data integration products, Informatica has been quickly expanding its portfolio of data management and data access products over the past few years. This growth in the product portfolio has introduced many new types of customers to Informatica and created more complex customer relationships. For example, the new Springbok product is aimed at making data accessible to the business user, a new type of interaction for Informatica. Informatica has responded with a collection of new service enhancements that augment and extend existing service channels and capabilities.

What these moves say to me is that Informatica has made a commitment to deeper engagement with customers. For example, Informatica has expanded the avenues through which customers can get support. By adding social media and mobile capabilities, the company is creating additional points of presence that address customer issues when and where customers are. Informatica provides support on the customers’ terms instead of requiring customers to do what is convenient for Informatica. Ultimately, Informatica is creating more value by making it easier for customers to interact with them. The best support is that which solves the problem quickest with the least amount of effort. Intuitive knowledge base systems, online support, sourcing answers from peers, and other tools that help find solutions immediately are more valued than traditional phone support. This is the philosophy that drives the new self-help portal, predictive escalation, and product adoption services.

Informatica is also shifting the support focus from products to business outcomes. They are managing problems holistically rather than simply creating product band-aids. This shows a recognition that technical problems with data are actually business problems that have broad effects on a customer’s business. Contrast this with the traditional approach to support, which focuses on fixing a technical issue but doesn’t necessarily address the wider organizational effects of those problems.

More than anything, these changes are preparation for a very different support landscape. With the launch of the Springbok data analytics tool, Informatica’s support organization is clearly positioning itself to help business analysts and similar semi-technical end-users. The expectations of these end-users have been set by consumer applications. They expect more automation and more online resources that help them use and derive value from their software, and they are less enamored with fixing technical problems.

In the past, technical support was mostly charged with solving immediate technical issues. That’s still important, since the products have to work before they can be useful. Now, however, support organizations have an expanded mission: to be part of the overall customer experience and to enhance overall engagement. The latest enhancements to the Informatica support portfolio reflect this mission and prepare the company for the next generation of non-IT Informatica customers.

Posted in B2B Data Exchange, Big Data, Business Impact / Benefits, Business/IT Collaboration

CFO Checklist to Owning Enterprise Analytics

Last month, the CEO of Deloitte said that CFOs are “the logical choice to own analytics and put them to work to serve the organization’s needs”. In my discussions with CFOs, they have expressed similar opinions. Given this, the question becomes: what does a CFO need to do to be an effective leader of their company’s analytics agenda? To answer this, I took a look at what Tom Davenport suggests in his book “Analytics at Work”. In the book, Tom suggests that an analytical leader needs to do the following twelve things to be effective:

12 Ways to Be an Effective Analytics Leader

1) Develop their people skills. This is not just about managing analytical people, which has its own challenges. It is also about CFOs establishing “the credibility and trust needed when analytics produce insights that effectively debunk currently accepted wisdom”.
2) Push for fact-based decision making. You need to, as a former boss of mine liked to say, become the lightning rod and, in this case, set the expectation that people will make decisions based upon data and analysis.
3) Hire and retain smart people. You need to provide a stimulating and supportive work environment for analysts and give them credit when they do something great.
4) Be the analytical example. You need to lead by example. This means you need to use data and analysis in making your own decisions.
5) Sign up for improved results. You need to commit to driving improvements in a select group of business processes by using analytics. Pick something meaningful, like reducing the cost of customer acquisition or optimizing your company’s supply chain management.
6) Teach the organization how to use analytic methods. Guide employees and other stakeholders into using more rigorous thinking and decision making.
7) Set strategies and performance expectations. Analytics and fact-based decisions cannot happen in a vacuum. They need strategies and goals that analytics help achieve.
8) Look for leverage points. Look for the business problems where analytics can make a real difference. Look for places where a small improvement in a process driven by analytics can make a big difference.
9) Demonstrate persistence. Work doggedly and persistently to apply analytics to decision making, business processes, culture, and business strategy.
10) Build an analytics ecosystem with your CIO. Build an ecosystem consisting of other business leaders, employees, external analytics suppliers, and business partners. Use them to help you institutionalize analytics at your company.
11) Apply analytics on more than one front. No single initiative will make the company more successful; no single analytics initiative will do so either.
12) Know the limits of analytics. Know when it is appropriate to use intuition instead of analytics. As a professor of mine once said, not all elements of business strategy can be solved by using statistics or analytics. You should know where and when analytics are appropriate.

Following these twelve items will help strategically oriented CFOs lead the analytics agenda at their companies. As I indicated in “Who Owns the Analytics Agenda?”, CFOs already typically act as data validators at their firms, but taking this next step matters to their enterprise because “if we want to make better decisions and take the right actions, we have to use analytics” (Analytics at Work, Tom Davenport, Harvard Business Review Press, page 1). Given this, CFOs really need to get analytics right. The CFOs that I have talked to say they already “rely on data and analytics and need them to be timely and accurate”.

One CFO, in fact, said that data is potentially the only competitive advantage left for his firm. And while implementing the data side of this depends on the CIO, it is clear from the CFOs that I have talked to that they believe a strong business relationship with their CIO is critical to the success of their business.

So the question remains: are you ready as a financial leader to lead on the analytics agenda? If you are, and you want to learn more about setting the analytics agenda, please consider yourself invited to a webinar that I am doing with the CFO of RoseRyan in January.

The webinar is entitled “Analytics and Data for the Strategic CFO”, and by clicking this link you can register to attend. See you there.

Related Blogs

CFOs Move to Chief Profitability Officer
CFOs Discuss Their Technology Priorities
The CFO Viewpoint upon Data
How CFOs can change the conversation with their CIO?
New type of CFO represents a potent CIO ally
Competing on Analytics
The Business Case for Better Data Connectivity

Twitter: @MylesSuer

Posted in Business/IT Collaboration, CIO, Data Governance

Connecting Architecture To Business Strategy

On November 13, 2014, Informatica acquired the assets of Proact, whose Enterprise Architecture tools and delivery capability link architecture to business strategy. The BOST framework is now the Informatica Business Transformation Toolkit, which received high marks in a recent research paper:

“(BOST) is a framework that provides four architectural views of the enterprise (Business, Operational, Systems, and Technology). This EA methodology plans and organizes capabilities and requirements at each view, based on evolving business and opportunities. It is one of the most finalized of the methodologies, in use by several large enterprises.” [1]

Posted in Architects, Business Impact / Benefits, Business/IT Collaboration, Data Governance, Data Integration, Integration Competency Centers

Decrease Salesforce Data Prep Time With Project Springbok

Account Executives update opportunities in Salesforce all the time. As opportunities close, payment information is received in the financial system. Normally, they spend hours trying to combine the two sets of data to prepare it for differential analysis. Often, there is a prolonged, back-and-forth dialogue with IT. This takes time and effort, and can delay the sales process.

What if you could spend less time preparing your Salesforce data and more time analyzing it?

Decrease Data Prep Time With Project Springbok

Informatica has a vision to solve this challenge by providing self-service data to non-technical users. Earlier this year, we announced our Intelligent Data Platform. One of the key projects in the IDP, code-named “Springbok”, uses an Excel-like search interface to let business users find and shape the data they need.

Informatica’s Project Springbok is a faster, better and, most importantly, easier way to intelligently work with data for any purpose. Springbok guides non-technical users through a data preparation process in a self-service manner. It makes intelligent recommendations and suggestions, based on the specific data they’re using.

To see this in action, we welcome you to join us as we partner with Halak Consulting, LLC for an informative webinar. The webinar will take place on November 18th at 10am PST. You will learn from the Springbok VP of Strategy and from an experienced Springbok user about how Springbok can benefit you.

So REGISTER for the webinar today!

For another perspective, see the post “Imagine Not Needing to do a VLookup ever again!” from Deepa Patel, Salesforce.com MVP.

Posted in B2B, B2B Data Exchange, Business Impact / Benefits, Business/IT Collaboration

If Data Projects Weather, Why Not Corporate Revenue?

Every fall, Informatica sales leadership puts together its strategy for the following year. The revenue target is typically a function of the number of sellers, the addressable market size and key accounts in a given territory, average spend and conversion rate given prior years’ experience, and so on. This straightforward math has not changed in probably decades, but it assumes that the underlying data are 100% correct (a back-of-the-envelope version of the math is sketched after the list below). This data includes:

  • Number of accounts with a decision-making location in a territory
  • Related IT spend and prioritization
  • Organizational characteristics like legal ownership, industry code, credit score, annual report figures, etc.
  • Key contacts, roles and sentiment
  • Prior interaction (campaign response, etc.) and transaction (quotes, orders, payments, products, etc.) history with the firm
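As a rough illustration of that math (the figures below are made up, not Informatica’s actual planning model), the target is essentially a few multiplications over the data points listed above, which is exactly why errors in those inputs flow straight through to the target:

# Back-of-the-envelope territory target, assuming the data points above are accurate.
sellers_in_territory = 8
addressable_accounts = 400       # accounts with a decision-making location in the territory
conversion_rate = 0.12           # share of accounts expected to buy, from prior years' experience
average_spend = 250_000          # average deal size in dollars

revenue_target = addressable_accounts * conversion_rate * average_spend
quota_per_seller = revenue_target / sellers_in_territory
print(f"territory target: ${revenue_target:,.0f}")
print(f"quota per seller: ${quota_per_seller:,.0f}")

# If bad data overstates the addressable accounts by 15%, the target is off by the same 15%.
inflated_target = (addressable_accounts * 1.15) * conversion_rate * average_spend
print(f"error from the inflated account count: ${inflated_target - revenue_target:,.0f}")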

Every organization, no matter whether it is a life insurer, a pharmaceutical manufacturer, a fashion retailer or a construction company, knows this math and plans on achieving somewhere above 85% of the resulting target. Office locations, support infrastructure spend, compensation and hiring plans are based on this and communicated.


We Are Not Modeling the Global Climate Here

So why is it that, when it is an open secret that the underlying data is far from perfect (accurate, current and useful) and corrupts outcomes, too few believe that fixing it has any revenue impact? After all, we are not projecting the climate for the next hundred years with a thousand-plus variables.

If corporate hierarchies are incorrect, your spend projections, which are based on territory targets, credit terms and discount strategy, will be off. If every client touch point does not have a complete picture of cross-departmental purchases and campaign responses, your customer acquisition cost will be too high, because you will contact the wrong prospects with irrelevant offers. If billing, tax or product codes are incorrect, your billing will be off; this is a classic telecommunications example worth millions every month. If your equipment location and configuration data is wrong, maintenance schedules will be incorrect, and every hour of production interruption will cost an industrial manufacturer of wood pellets or oil millions.

Also, if industry leaders enjoy an upsell ratio of 17% and you experience 3%, data will have a lot to do with it (assuming you have no formal upsell policy because it would violate your independent middleman relationships).

The challenge is not whether data can create revenue improvements, but how much, given the other factors: people and process.

Every industry laggard can identify a few FTEs who spend 25% of their time putting one-off data repositories together for some compliance, M&A, customer or marketing analytics effort. Organic revenue growth from net-new or previously unrealized revenue is what the focus of any data management initiative should be. Don’t get me wrong; purposeful recruitment (people), comp plans and training (processes) are important as well. Few people doubt that people and process drive revenue growth. However, few believe that the data being fed into these processes has an impact.

This is a head-scratcher for me. An IT manager at a US upstream oil firm once told me that it would be ludicrous to think data has a revenue impact. They just fixed data because it is important, so that his consumers would know where all the wells are and which ones made a good profit. Isn’t that assuming data drives production revenue? (Rhetorical question)

A CFO at a smaller retail bank said during a call that his account managers know their clients’ needs and history, and that there is nothing more good data can add in terms of value. And this happened after twenty other folks at his bank, including his own team, delivered more than ten use cases, of which three were based on revenue.

Hard cost (materials and FTE) reduction is easy to quantify, and cost avoidance is a leap of faith to a degree, but revenue is not any less concrete. Otherwise, why not just throw the dice and see what the revenue will look like next year without a central customer database? Let every department have each account executive get their own data, structure it the way they want, put it on paper and make hard copies for distribution to HQ. This is not about paper versus electronic but about the inability to reconcile data from many sources on paper, which is a step above electronic.

Have you ever heard of an organization moving back to the Fifties and competing today? That would be a fun exercise. Thoughts, suggestions? I would be glad to hear them.

Posted in Banking & Capital Markets, Big Data, Business Impact / Benefits, Business/IT Collaboration, Data Governance, Data Integration, Data Quality, Data Warehousing, Enterprise Data Management, Governance, Risk and Compliance, Master Data Management, Product Information Management

Ebola: Why Big Data Matters


The Ebola virus outbreak in West Africa has now claimed more than 4,000 lives and has entered the borders of the United States. While emergency response teams, hospitals, charities, and non-governmental organizations struggle to contain the virus, could big data analytics help?

A growing number of data scientists believe so.

Recall the cholera outbreak in Haiti in 2010, after the tragic earthquake: a joint research team from the Karolinska Institute in Sweden and Columbia University in the US analyzed calling data from two million mobile phones on the Digicel Haiti network. This enabled the United Nations and other humanitarian agencies to understand population movements during the relief operations and during the subsequent cholera outbreak. They could allocate resources more efficiently and identify areas at increased risk of new cholera outbreaks.

Mobile phones are widely owned, even in the poorest countries in Africa, and they are a rich source of data in regions where other reliable sources are sorely lacking. Senegal’s Orange Telecom provided Flowminder, a Swedish non-profit organization, with anonymized voice and text data from 150,000 mobile phones. Using this data, Flowminder drew up detailed maps of typical population movements in the region.

Today, authorities use this information to evaluate the best places to set up treatment centers, check-posts, and issue travel advisories in an attempt to contain the spread of the disease.

The first drawback is that this data is historic. Authorities really need to be able to map movements in real time, especially since people’s movements tend to change during an epidemic.

The second drawback is that the scope of the data provided by Orange Telecom is limited to a small region of West Africa.

Here is my recommendation to the Centers for Disease Control and Prevention (CDC):

  1. Increase the area for data collection to the entire region of Western Africa, which covers over 2.1 million cell-phone subscribers.
  2. Collect mobile phone mast activity data to pinpoint where calls to helplines are mostly coming from, draw population heat maps, and track population movement. A sharp increase in calls to a helpline is usually an early indicator of an outbreak.
  3. Overlay this data on census data to build up a richer picture (a toy version of this overlay is sketched below).
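A toy version of recommendation 3 might look like the following. The district names, call counts, and census figures are invented, and this is plain pandas rather than anything the CDC or Orange Telecom actually runs; it simply shows the overlay and the week-over-week spike signal described above.

import pandas as pd

# Hypothetical mast-activity data: helpline calls aggregated per district.
helpline_calls = pd.DataFrame({
    "district": ["District A", "District B", "District C"],
    "calls_this_week": [420, 180, 60],
    "calls_last_week": [150, 160, 55],
})

# Hypothetical census figures for the same districts.
census = pd.DataFrame({
    "district": ["District A", "District B", "District C"],
    "population": [525_000, 609_000, 575_000],
})

overlay = helpline_calls.merge(census, on="district")
overlay["calls_per_100k"] = overlay["calls_this_week"] / overlay["population"] * 100_000
overlay["week_over_week"] = overlay["calls_this_week"] / overlay["calls_last_week"]

# A sharp week-over-week spike is the early-warning signal described in recommendation 2.
print(overlay.sort_values("week_over_week", ascending=False))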

The most positive impact we can have is to help emergency relief organizations and governments anticipate how a disease is likely to spread. Until now, they had to rely on anecdotal information, on-the-ground surveys, police, and hospital reports.

Posted in B2B Data Exchange, Big Data, Business Impact / Benefits, Business/IT Collaboration, Data Governance, Data Integration

Go On, Flip Your Division of Labor: More Time Analyzing and Less Time Prepping Data

Are you in Sales Operations or Marketing Operations, or are you a Sales Representative, Sales Manager, or Marketing Professional? It’s no secret that if you are, you benefit greatly from the power of performing your own analysis, at your own rapid pace. When you have a hunch, you can easily test it out by visually analyzing data in Tableau without involving IT. When you are faced with tight timeframes in which to gain business insight from data, being able to do it yourself in the time you have available, and without technical roadblocks, makes all the difference.

Self-service Business Intelligence is powerful! However, we all know it can be even more powerful. When putting together an analysis, you spend about 80% of your time assembling data and just 20% of your time analyzing it to test out your hunch or gain your business insight. You don’t need to accept this anymore. We want you to know that there is a better way!

We want you to Flip Your Division of Labor: spend more than 80% of your time analyzing data to test out your hunch or gain your business insight, and less than 20% of your time putting together data for your Tableau analysis! That’s right. You like it. No, you love it. No, you are ready to run laps around your chair in sheer joy!! And you should feel this way. You can now spend more time on the higher-value activity of gaining business insight from the data, and even find copious time to spend with your family. How’s that?

Project Springbok is a visionary new product designed by Informatica with the goal of making data access and data quality obstacles a thing of the past. Springbok is meant for the Tableau user: a data person who would rather spend their time visually exploring information and finding insight than struggling with complex calculations or waiting for IT. Project Springbok allows you to put together your data, rapidly, for subsequent analysis in Tableau. Project Springbok tells you things about your data that even you may not have known, through Intelligent Suggestions that it presents to the user.

Let’s take a quick tour:

  • Project Springbok tells you that you have a date column and that you likely want to obtain the Year and Quarter for your analysis (Fig 1). And if you so wish, with a single click, voila, you have your corresponding years and even the quarters. It all happens in mere seconds, a far cry from the 45 minutes it would have taken a fluent user of Excel using VLOOKUPs.

Fig. 1

VALUE TO A MARKETING CAMPAIGN PROFESSIONAL: Rapidly validate and accurately complete your segmentation list, before you analyze your segments in Tableau. Base your segments on trusted data that did not take you days to validate and enrich.
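For comparison, here is roughly what that derivation looks like when done by hand in Python with pandas. The column names are hypothetical, and this is an illustration of the manual work being automated, not Springbok’s internals.

import pandas as pd

# A small extract with a hypothetical date column.
opportunities = pd.DataFrame({
    "email": ["ann@example.com", "raj@example.com"],
    "close_date": ["2014-03-17", "2014-10-02"],
})

# The steps Springbok offers as a one-click suggestion.
opportunities["close_date"] = pd.to_datetime(opportunities["close_date"])
opportunities["year"] = opportunities["close_date"].dt.year
opportunities["quarter"] = opportunities["close_date"].dt.quarter
print(opportunities)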

  • Then Project Springbok will tell you that you have two datasets that could be joined on a common key, email for example, and ask whether you would like to move forward and join the datasets (Fig 2). If you agree with Project Springbok’s suggestion, voila, the datasets are joined in a mere few seconds. Again, a far cry from the 45 minutes it would have taken a fluent user of Excel using VLOOKUPs.

Fig. 2

VALUE TO A SALES REPRESENTATIVE OR SALES MANAGER: You can now access your Salesforce.com data (Fig 3) and effortlessly combine it with ERP data to understand your true quota attainment. Never miss quota again due to a revenue split, be it territory or otherwise. Best of all, keep your attainment dataset refreshed and even know exactly which data point changed when your true attainment changes.

Fig. 3

  • Then, if you want, Project Springbok will tell you that you have emails in the dataset, which you may or may not have known, but more importantly it will ask you if you wish to determine which emails can actually be mailed to. If you proceed, not only will Springbok check each email for correct structure (Fig 4), but it will very soon also determine whether the email is indeed active and one you can expect a response from. How long would that have taken you to do?

VALUE TO A TELESALES REPRESENTATIVE OR MARKETING EMAIL CAMPAIGN SPECIALIST: Ever thought you had a great email list and then found out most emails bounced? Now, confidently determine which emails you will truly be able to mail to, before you send the message. Email prospects who you know are actually at the company and be confident you have their correct email addresses. You can then easily push the dataset into Tableau to analyze the trends in email list health.

Fig. 4

And, in case you were wondering, there is no training or installation required for Project Springbok. The 80% of your time you used to spend on data preparation shrinks considerably, and this is after using only a few of Springbok’s capabilities. One more thing: you can even export directly from Project Springbok into Tableau via the “Export to Tableau TDE” menu item (Fig 5). Project Springbok creates a Tableau TDE file, and you just double-click it to open Tableau and test out your hunch or gain your business insight.

Fig. 5

Here are some other things you should know, to convince you that you, too, can spend no more than 20% of your time putting together data for your subsequent Tableau analysis:

  • Springbok Sign-Up is Free
  • Springbok automatically finds problems with your data, and lets you fix them with a single click
  • Springbok suggests useful ways for you to combine different datasets, and lets you combine them effortlessly
  • Springbok suggests useful summarizations of your data, and lets you follow through on the summarizations with a single click
  • Springbok allows you to access data from your cloud or on-premise systems with a few clicks, and then automatically keeps it refreshed. It will even tell you what data changed since the last time you saw it
  • Springbok allows you to collaborate by sharing your prepared data with others
  • Springbok easily exports your prepared data directly into Tableau for immediate analysis. You do not have to tell Tableau how to interpret the prepared data
  • Springbok requires no training or installation

Go on. Shift your division of labor in the right direction, fast. Sign up for Springbok and stop wasting precious time on data preparation. http://bit.ly/TabBlogs

———-

Are you going to be at Dreamforce this week in San Francisco? Interested in seeing Project Springbok working with Tableau in a live demonstration? Visit the Informatica or Tableau booths and see the power of these two solutions working hand-in-hand. Informatica is at Booth #N1216 and Booth #9 in the Analytics Zone. Tableau is located in Booth N2112.

Posted in B2B, Big Data, Business Impact / Benefits, Business/IT Collaboration, General

The Case for Smart Data: When Big Data Isn’t Enough

Every two years, the typical company doubles the amount of data it stores. However, this data is inherently “dumb.” Acquiring more of it only seems to compound its lack of intellect.

When revitalizing your business, I won’t ask to look at your data, not even a little bit. Instead, I look at the process of how you use the data. What I want to know is this:

How much of your day-to-day operations are driven by your data?


I recently learned that 7-Eleven Japan has pushed decision-making down to the store level – in fact, to the level of clerks. Store clerks decide what goes on the shelves in their individual 7-Eleven stores. These clerks push incredible inventory turns. Some 70% of the products on the shelves are new to stores each year. As a result, this chain has been the most profitable Japanese retailer for 30 years running.


Instead of just reading the data and making wild guesses about why something works and why something doesn’t, these clerks acquired the skill of looking at the quantitative and the qualitative and connecting the dots. Data told them what people were talking about, how it related to their products and how much weight it carried. You can achieve this as well. To do so, you must introduce a culture that emphasizes discipline around processes. A disciplined process culture uses:

  1. A template approach to data with common processes, reuse of components, and a single face presented to customers
  2. Employees who consistently follow standard procedures

If you cannot develop such company-wide consistency, you will not gain the benefits of ERP or CRM systems.

Make data available to the masses. Like at 7-Eleven Japan, don’t centralize the data decision-making process. Instead, push it out to the ranks. By putting these cultures and practices into play, businesses can use data to run smarter.

Posted in Big Data, Business Impact / Benefits, Business/IT Collaboration, Data Synchronization, Data Transformation, Data Warehousing, Master Data Management, Retail

At Valspar, Data Management is Key to Controlling Purchasing Costs

Steve Jenkins, Global IT Director at Valspar

Steve Jenkins is working to improve information management maturity at Valspar

“Raw materials costs are the company’s single largest expense category,” said Steve Jenkins, Global IT Director at Valspar, at MDM Day in London. “Data management technology can help us improve business process efficiency, manage sourcing risk and reduce RFQ cycle times.”

Valspar is a $4 billion global manufacturing company, which produces a portfolio of leading paint and coating brands. At the end of 2013, the 200-year-old company celebrated record sales and earnings. It also completed two acquisitions. Valspar now has 10,000 employees operating in 25 countries.

As is the case for many global companies, growth creates complexity. “Valspar has multiple business units with varying purchasing practices. We source raw materials from 1,000s of vendors around the globe,” shared Steve.

“We want to achieve economies of scale in purchasing to control spending,” Steve said as he shared Valspar’s improvement objectives. “We want to build stronger relationships with our preferred vendors. Also, we want to develop internal process efficiencies to realize additional savings.”

Poorly managed vendor and raw materials data was impacting Valspar’s buying power

Data management at Valspar

“We realized our buying power was limited by the age and quality of available vendor and raw materials data.”

The Valspar team, which focuses sharply on productivity, had an “Aha” moment. “We realized our buying power was limited by the age and quality of available vendor data and raw materials data,” revealed Steve.

The core vendor data and raw materials data that should have been the same across multiple systems wasn’t. Data was often missing or wrong. This made it difficult to calculate the total spend on raw materials. It was also hard to calculate the total cost of expedited freight of raw materials. So, employees used a manual, time-consuming and error-prone process to consolidate vendor data and raw materials data for reporting.

These data issues were getting in the way of achieving their improvement objectives. Valspar needed a data management solution.

Valspar needed a single trusted source of vendor and raw materials data

Informatica MDM supports vendor and raw materials data management at Valspar

The team chose Informatica MDM as their enterprise hub for vendors and raw materials

The team chose Informatica MDM, master data management (MDM) technology. It will be their enterprise hub for vendors and raw materials. It will manage this data centrally on an ongoing basis. With Informatica MDM, Valspar will have a single trusted source of vendor and raw materials data.

Informatica PowerCenter will access data from multiple source systems. Informatica Data Quality will profile the data before it goes into the hub. Then, after Informatica MDM does its magic, PowerCenter will deliver clean, consistent, connected and enriched data to target systems.
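As a conceptual illustration only, the sketch below shows what “mastering” vendor data means in practice: matching duplicate records from different source systems on a shared identifier and surviving the best values into one golden record. The records and rules are invented, and this plain-pandas toy is not Informatica MDM’s actual matching or survivorship logic.

import pandas as pd

# Vendor records as they might arrive from two ERP instances (invented data).
vendors = pd.DataFrame({
    "source": ["ERP_US", "ERP_EU", "ERP_US"],
    "name": ["Acme Chemical Co.", "ACME CHEMICAL", "Basel Coatings Corp"],
    "duns": ["123456789", "123456789", "987654321"],
    "payment_terms": ["NET30", "NET45", "NET30"],
    "last_updated": pd.to_datetime(["2014-01-10", "2014-06-01", "2014-03-15"]),
})

# Match on a shared identifier (DUNS here), then survive the most recently updated values.
golden = (
    vendors.sort_values("last_updated")
           .groupby("duns", as_index=False)
           .last()
)
print(golden[["duns", "name", "payment_terms", "last_updated"]])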

Better vendor and raw materials data management results in cost savings


Valspar will gain benefits by fueling applications with clean, consistent, connected and enriched data

Valspar expects to gain the following business benefits:

  • Streamline the RFQ process to accelerate raw materials cost savings
  • Reduce the total number of raw materials SKUs and vendors
  • Increase productivity of staff focused on pulling and maintaining data
  • Leverage consistent global data visibility to:
    • increase leverage during contract negotiations
    • improve acquisition due diligence reviews
    • facilitate process standardization and reporting

 

Valspar’s vision is to transform data and information into trusted organizational assets

“Mastering vendor and raw materials data is Phase 1 of our vision to transform data and information into trusted organizational assets,” shared Steve. In Phase 2 the Valspar team will master customer data so they have immediate access to the total purchases of key global customers. In Phase 3, Valspar’s team will turn their attention to product or finished goods data.

Steve ended his presentation with some advice. “First, include your business counterparts in the process as early as possible. They need to own and drive the business case as well as the approval process. Also, master only the vendor and raw materials attributes required to realize the business benefit.”

Total Supplier Information Management eBook

Click here to download the Total Supplier Information Management eBook

Want more? Download the Total Supplier Information Management eBook. It covers:

  • Why your fragmented supplier data is holding you back
  • The cost of supplier data chaos
  • The warning signs you need to be looking for
  • How you can achieve Total Supplier Information Management

 

Posted in Business/IT Collaboration, Data Integration, Data Quality, Manufacturing, Master Data Management, Operational Efficiency, PowerCenter, Vertical