Category Archives: Business/IT Collaboration
Within every corporation there are lines of business, such as Finance, Sales, Logistics and Marketing. And within those lines of business are business users who are either non-technical or choose to be non-technical.
These business users are increasingly using Next-Generation Business Intelligence Tools like Tableau, QlikTech, MicroStrategy Visual Insight, Spotfire or even Excel. A unique capability of these tools is that they allow a non-technical Business User to prepare data themselves, prior to ingesting the prepared data into these tools for subsequent analysis.
Initially, the types of activities involved in preparing this data are quite simple. It involves, perhaps, putting together two Excel files via a join on a common field. However, over time, the operations a non-technical user wishes to perform on the data become more complex. They wish to do things like join two files of differing grain, validate and complete addresses, or even enrich company or customer profile data. And when a non-technical user reaches this point, they require either coding or advanced tooling, neither of which they have access to. So, at this point, they will pick up the phone, call their brethren in IT and ask nicely for help with combining the data, enhancing its quality and enriching it. Oftentimes they require the resulting dataset back in a tight timeframe, perhaps a couple of hours. IT will initially be very happy to oblige. They will get the dataset back to the business user in the timeframe requested and at the quality levels expected. No issues.
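For readers curious what these two cases look like in practice, both the simple common-field join and the trickier differing-grain join can be sketched in a few lines of pandas. The column names and values here are hypothetical, purely for illustration:

```python
import pandas as pd

# Two small frames standing in for two spreadsheets (hypothetical columns).
orders = pd.DataFrame({
    "customer_id": [1, 1, 2],
    "amount": [100.0, 250.0, 75.0],
})
customers = pd.DataFrame({
    "customer_id": [1, 2],
    "region": ["East", "West"],
})

# The simple case: a join on a common field.
joined = orders.merge(customers, on="customer_id", how="left")

# The differing-grain case: roll order lines up to one row per
# customer before joining, so the grains match.
per_customer = orders.groupby("customer_id", as_index=False)["amount"].sum()
summary = per_customer.merge(customers, on="customer_id")
```

The second step is exactly where non-technical users tend to get stuck: the join itself is easy, but recognizing that one side must first be aggregated to the other side's grain is what usually triggers the call to IT.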
However, as the number of non-technical Business Users using Next-Generation Business Intelligence tools increases, the number of requests to IT for datasets also increases. And so, while initially IT was able to meet the “quick hit dataset” requests from the Business, over time, and despite their best efforts, IT increasingly becomes unable to do so.
The reality is that over time, the business will see a gradual decrease in the quality of the datasets returned, as well as an increase in the timeframe required for IT to provide the data. And at some point the business will reach a decision point: they determine that to meet their business commitments, they will have to find other means by which to put together their “quick hit datasets.” It is precisely at this point that the business may do things like hire an IT contractor to sit next to them and do nothing but put together these “quick hit” datasets. It is also when IT begins to feel marginalized and will likely begin to see a drop in funding.
This dynamic is one that has been around for decades and has continued to worsen due to the increase in the pace of data-driven business decision making. I feel that we at Informatica have a truly unique opportunity to innovate a technology solution that focuses on two related constituents, specifically, the Non-Technical Business User and the IT Data Provisioner.
The specific point of value that this technology will provide to the Non-Technical Business User is the ability to rapidly put together datasets for subsequent analysis in their Next-Generation BI tool of choice. Without this tool they might spend a week or two putting together a dataset, or wait for someone else to put it together. I feel we can invert this division of labor: the business user spends 15 minutes putting the dataset together themselves and then devotes the following week or two to meaningful analysis. Doing so, we allow non-technical business users to dramatically decrease their decision-making time.
The specific point of value that this technology will provide the IT data provisioner is that they will now be able to effectively scale data provisioning as the number of requests for “quick hit datasets” rapidly increases. Most importantly, they will be able to scale proactively.
With such a solution, the Business and IT relationship can once again become a match made in heaven.
“Start your master data management (MDM) journey knowing how it will deliver a tangible business outcome. Will it help your business generate revenue or cut costs? Focus on the business value you plan to deliver with MDM and revisit it often,” advises Michael Delgado, Information Management Director at Citrix during his presentation at MDM Day, the InformaticaWorld 2014 pre-conference program. MDM Day focused on driving value from business-critical information and attracted 500 people.
In Ravi Shankar’s recent MDM Day preview blog, Part 2: All MDM, All Day at Pre-Conference Day at InformaticaWorld, he highlights the amazing lineup of master data management (MDM) and product information management (PIM) customer speakers, Informatica experts, as well as our talented partner sponsors.
Here are my MDM Day fun facts and key takeaways:
- Did you know that every 2 seconds an aircraft with GE engine technology is taking off somewhere in the world?
GE Aviation’s Chief Enterprise Architect, Ginny Walker, presented “Operationalizing Critical Business Processes: GE Aviation’s MDM Story.” GE Aviation is a $22 billion company and a leading provider of jet engines, systems and services. Ginny shared the company’s multi-year journey to improve installed-base asset data management. She explained how the combination of data, analytics, and connectivity results in productivity improvements such as reducing up to 2% of the annual fuel bill and reducing delays. The keys to GE Aviation’s analytical MDM success were: 1) tying MDM to business metrics, 2) starting with a narrow scope, and 3) data stewards. Ginny believes that MDM is an enabler for the Industrial Internet and Big Data because it empowers companies to get insights from multiple sources of data.
- Did you know that EMC has made a $17 billion investment in acquisitions and is integrating more than 70 technology companies?
EMC’s Barbara Latulippe, aka “The Data Diva,” is the Senior Director of Enterprise Information Management (EIM). EMC is a $21.7 billion company that has grown through acquisition and has 60,000 employees worldwide. In her presentation, “Formula for Success: EMC MDM Best Practices,” Barbara warns that if you don’t have a data governance program in place, you’re going to have a hard time getting an MDM initiative off the ground. She stressed the importance of building a data governance council and involving the business as early as possible to agree on key definitions such as “customer.” Barbara and her team focused on the financial impact of higher quality data to build a business case for operational MDM. She asked her business counterparts, “Imagine if you could onboard a customer in 3 minutes instead of 15 minutes?”
- Did you know that Citrix is enabling the mobile workforce by uniting apps, data and services on any device over any network and cloud?
Citrix’s Information Management Director, Michael Delgado, presented “Citrix MDM Case Study: From Partner 360 to Customer 360.” Citrix is a $2.9 billion Cloud software company that embarked on a multi-domain MDM and data governance journey for channel partner, hierarchy and customer data. Because 90% of the company’s product bookings are fulfilled by channel partners, Citrix started their MDM journey to better understand their total channel partner relationship to make it easier to do business with Citrix and boost revenue. Once they were successful with partner data, they turned to customer data. They wanted to boost customer experience by understanding the total customer relationship across product lines and regions. Armed with this information, Citrix employees can engage customers in one product renewal process for all products. MDM also helps Citrix’s sales team with white space analysis to identify opportunities to sell more user licenses in existing customer accounts.
- Did you know Quintiles helped develop or commercialize all of the top 5 best-selling drugs on the market?
Quintiles’ Director of the Infosario Data Factory, John Poonnen, presented “Using Multi-domain MDM to Gain Information Insights: How Quintiles Efficiently Manages Complex Clinical Trials.” Quintiles is the world’s largest provider of biopharmaceutical development and commercial outsourcing services with more than 27,000 employees. John explained how the company leverages a tailored, multi-domain MDM platform to gain a holistic view of business-critical entities such as investigators, research facilities, clinical studies, study sites and subjects to cut costs, improve quality, improve productivity and meet regulatory and patient needs. “Although information needs to flow throughout the process – it tends to get stuck in different silos and must be manually manipulated to get meaningful insights,” said John. He believes master data is foundational — combining it with other data, capabilities and expertise makes it transformational.
While I couldn’t attend the PIM customer presentations below, I heard they were excellent. I look forward to watching the videos:
- Crestline/ Geiger: Dale Denham, CIO presented, “How Product Information in eCommerce improved Geiger’s Ability to Promote and Sell Promotional Products.”
- Murdoch’s Ranch and Home Supply: Director of Marketing, Kitch Walker presented, “Driving Omnichannel Customer Engagement – PIM Best Practices.”
I also had the opportunity to speak with some of our knowledgeable and experienced MDM Day partner sponsors. Go to Twitter and search for #MDM and #DataQuality to see their advice on what it takes to successfully kick-off and implement an MDM program.
There are more thought-provoking MDM and PIM customer presentations taking place this week at InformaticaWorld 2014. To join or follow the conversation, use #INFA14 #MDM or #INFA14 #PIM.
I recently came across an article from 2006, which is clearly out-of-date, but still a good read about the state of data integration eight years ago. “Data integration was hot in 2005, and the intense interest in this topic continues in 2006 as companies struggle to integrate their ever-growing mountain of data.
A TDWI study on data integration last November found that 69% of companies considered data integration issues to be a high or very high barrier to new application development. To solve this problem, companies are increasing their spending on data integration products.”
Business intelligence (BI) and data warehousing were the way to go at the time, and companies were spending millions to stand up these systems. Data integration was all massive data movements and manipulations, typically driven by tactical tools rather than true data integration solutions.
The issue I had at the time was the inability to deal with real-time operational data, and the cost of the technology and deployments. While these issues were never resolved with traditional BI and data warehousing technology, we now have access to databases that can manage over a petabyte of data, and the ability to cull through the data in seconds.
The ability to support massive amounts of data has reignited interest in data integration. Up-to-the-minute operational data in these massive data stores is actually possible. We can now understand the state of the business as it happens, and thus make incremental adjustments based upon almost perfect information.
What this situation leads to is true value. We have delivery of the right information to the right people, at the right time, and the ability to place automated processes and policies around this data. Business becomes self-correcting and self-optimizing. The outcome is a business that is data-driven, and thus more responsive to the markets as well as to the business world itself.
However, big data is an impossible dream without a focus on how the data moves from place to place, using data integration best practices and technology. I guess we can call this big data integration, but it’s really the path to provide these massive data stores with the operational data required to determine the proper metrics for the business.
Data integration is not a new term. However, the application of new ways to leverage and derive value from data brings unprecedented value to enterprises. Millions of dollars an hour of value are being delivered to Global 2000 organizations that leverage these emerging data integration approaches and technologies. What’s more, data integration is moving from the tactical to the strategic budgets of IT.
So, what’s changed in eight years? We finally figured out how to get the value from our data, using big data and data integration. It took us long enough, but I’m glad it’s finally become a priority.
Now you can experience the next best thing by attending InformaticaWorld 2014 and hearing the American Airlines and US Airways Data Architects talk about the data challenges they faced. They will discuss the role of architecture in M&A, integrating legacy data, lessons learned, and best practices in Data Integration.
While you are at the show, you will have the opportunity to hear many industry experts discuss current trends in Agile end-to-end Data Integration.
Agile Data Integration Development
To deliver the agility that your business requires, IT and Business must pursue a collaborative Data Integration process, with the appropriate Analyst self-service Data Integration tools. At InformaticaWorld, you can learn about Agile Data Integration development from the experts at GE Aviation, who will discuss Agile Data Integration for Big Data Analytics. Experts from Roche will discuss how Agile Data Integration has led to a 5x reduction in development time, improved business self-service capabilities and increased data credibility.
Another aspect of agility is your ability to scale your Data Warehouse to rapidly support more data, data sources, users and projects. Come hear the experts from Liberty Mutual share challenges, pitfalls, best practices and recommendations for those considering large-scale Data Integration projects, including successful implementation of complex data migrations, data quality and data distribution processes.
The management of an enterprise-scale Data Warehouse involves the operation of a mature and complex mission-critical environment, which is commonly driven through an Integration Competency Center (ICC) initiative. You now have the need to inspect and adapt your production system and expedite data validation and monitoring processes through automation, so that data issues can be quickly caught and corrected and resources can be freed up to focus on development.
The experts from University of Pittsburgh Medical Center, along with Informatica Professional Services experts, will discuss best practices, lessons learned and the process of transitioning from ‘analytics as project’ to an enterprise initiative through the use of an Integration Competency Center.
Hear from the Informatica Product Experts
You will have many opportunities to hear directly from the Informatica product experts about end-to-end Data Integration Agility delivered in the recent 9.6 release of PowerCenter.
See PowerCenter 9.6 in Action
Don’t miss the opportunity to see live demos of the cool new features of PowerCenter 9.6 release at the multitude of hands-on labs being offered at InformaticaWorld this year.
For example you can learn how to empower business users through self-service Data Integration with PowerCenter Analyst tool; how to reduce testing time of Data Integration projects through automated validation tests; and how to scale your Data Integration with High Availability and Grid.
The sessions we described here are a sampling of the rich variety of sessions that will be offered on Data Integration at the show. We hope that you will join us at InformaticaWorld this year in Las Vegas on May 13-15 and as you plan your visit, please check out the complete listing of sessions and labs that are focused on Data Integration.
Please feel free to leave a comment and let us know which InformaticaWorld session/s you are most looking forward to! See you there!
Data security breaches continue to escalate. Privacy legislation and enforcement are tightening, and analysts have begun making dire predictions about cyber security’s effectiveness. But there is more: trusted insiders continue to be the major threat. In addition, most executives cannot identify the information they are trying to protect.
Data security is a senior management concern, not exclusive to IT. With this in mind, what is the next step CxOs must take to counter these breaches?
A new approach to Data Security
It is clear that a new approach is needed. This should focus on answering fundamental, but difficult and precise questions in regards to your data:
- What data should I be concerned about?
- Can I create re-usable rules for identifying and locating sensitive data in my organization?
- Can I do so both logically and physically?
- What is the source of the sensitive data and where is it consumed?
- What are the sensitive data relationships and proliferation?
- How is it protected? How should it be protected?
- How can I integrate data protection with my existing cyber security infrastructure?
The answers to these questions will help guide precise data security measures in order to protect the most valuable data. The answers need to be presented in an intuitive fashion, leveraging simple, yet revealing graphics and visualizations of your sensitive data risks and vulnerabilities.
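To make the “what data should I be concerned about” and “can I create re-usable rules” questions above concrete, here is a minimal, purely illustrative sketch of rule-based sensitive-data classification. The two regex rules and the field values are assumptions for the example; real discovery tools use far richer dictionaries, checksums and contextual analysis:

```python
import re

# Hypothetical, re-usable rules for locating sensitive values in text fields.
# Each rule is a named pattern that can be applied across many data stores.
RULES = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "ssn":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def classify(value: str) -> list[str]:
    """Return the names of every rule that matches a given field value."""
    return [name for name, rx in RULES.items() if rx.search(value)]
```

The point of the sketch is the shape of the answer: once rules are named and centralized, the same definitions of “sensitive” can be applied logically (to data models) and physically (to actual values), which is what makes the inventory questions above answerable at all.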
At Informatica World 2014, Informatica will unveil its vision to help organizations address these concerns. This vision will assist in the development of precise security measures designed to counter the growing sophistication and frequency of cyber-attacks, and the ever present danger of rogue insiders.
Stay tuned, more to come from Informatica World 2014.
Recently, we posted an initial discussion between Informatica’s CMO Marge Breya and CIO Eric Johnson, explaining how CIOs and CMOs can align and thrive. In the dialog below, Breya and Johnson provide additional detail on how their departments partner effectively.
Q: Pretty much everyone agrees that marketing has changed from an art to a science. How does that shift translate into how you work together day to day?
Eric: The number of ways that marketers now have to reach prospects and customers and grow their market share has exploded. It used to be a single marketing solution that was an afterthought, bolted on to the CRM solution. Now, there are just so many ways that marketers have to consider how they market to people. It’s driven by things going on in the market, like how people interact with companies and the lifestyle changes people have made around mobile devices.
Marge: Just look at the sheer number of systems and sources of data we care about. If you want to understand upsell and cross-sell for customers you have to look at what’s happening in the ERP system, what’s happened from a bookings standpoint, whether the customer is a parent or child of another customer, how you think about data by region, by industry, by job title. And there’s how you think about successful conversion of leads. Is it the way you’d predicted? What’s your most valuable content? Who’s your most valuable outlet or event? What’s your ROI? You can’t get that from any one single system. More and more, it’s all about conversion rates, about forecasting and theories about how the business is working from a model standpoint. And I haven’t even talked about social.
Q: With so many emerging technologies to look at, how do CMOs reconcile the need to quickly add new products, while CIOs reconcile the need for everything to work securely and well together?
Eric: There’s this yin and yang that’s starting to build between the CIO and the CMO as we both understand each other and the world we each live in, and therefore collaborate and partner more. But at the same time, there’s a tension between a CMO’s need to bring in solutions very quickly, and the CIO’s need to do some basic vetting of that technology. It’s a tension between speed vs. scale and liability to the company. It’s on a case-by-case basis, but as a CIO you don’t say “no.” You give options. You show CMOs the tradeoffs they’re going to make.
There are also risks that are easy to take and worth taking. They won’t cause any problems with the enterprise on a security or integration perspective, so let’s just try it. It may not work — and that’s OK.
Marge: There’s temptation across departments for the shiny new object. You’ll hear about a new technology, and you think this might solve our problems, or move the business faster. The tension even within the marketing department is: do we understand how and if it will impact the business process? And do we understand how that business process will have to change if the shiny new object comes on board?
Q: CMOs are getting data from potentially hundreds of sources, including partners, third parties, LinkedIn and Google. How do the two of you work together to determine a trustworthy data source? Do you talk about it?
Eric: The issue of trusting your data and making sure you’re doing your due diligence on it is incredibly important. Without doing that, you are running the risk of finding yourself in a very tricky situation from a legal perspective, and potentially a liability perspective. To do that, we have a lot of technology that helps us manage a lot of data sources coming into a single source of truth.
On top of that, we are working with marketers who are much more savvy about technology and data. And that makes IT’s job easier — and our partnership better — because we are now talking the same language. Sometimes it’s even hard to tell where the line between the two groups actually sits. Some of the marketing people are as technical as the IT people, and some of the IT people are becoming pretty well-versed in marketing.
Q: How do you decide what technologies to buy?
Marge: A couple of weeks ago we went on a shopping trip, and spent the day at a venture capital firm looking at new companies. It was fun. He and I were brainstorming and questioning each other to see if each technology would be useful, and could we imagine how everything would go together. We first explored possibilities, and then we considered whether it was practical.
Eric: Ultimately, Marge owns the budget. But before the budgeting cycle we sit down to discuss what things she wants to work on, and whether she wants to swap technology out. I make sure Marge is getting what she needs from the technologies. There’s a reliance on the IT team to do some due diligence on the technical aspects of this technology: Does it work? Do we want to do business with these people? Is it going to scale? So each party has a role to play in evaluating whether it’s a good solution for the company. As a CIO you don’t say “no” unless there’s something really bad, and you hope you have a relationship with the CMO where you can say here are the tradeoffs you’re making. You say no one has an agenda here, but here are the risks you have to be OK taking. It’s not a “no.” It’s options.
Salesforce.com is one of the most widely used cloud applications across every industry. Initially, Salesforce gained dominance from mid-market customers due to the agility and ease of deployment that the SaaS approach delivered. A cloud-based CRM system enabled SMB companies to easily automate sales processes that recorded customer interactions during the sales cycle and scale without costly infrastructure to maintain. This resulted in faster growth, thereby showing rapid ROI of a Salesforce deployment in most cases.
The Eye of the Enterprise
When larger enterprises saw the rapid growth that mid-market players had achieved, they realized that Salesforce was a unique technology enabler capable of helping their businesses to also speed time to market and scale more effectively. In most enterprises, the Salesforce deployments were driven by line-of-business units such as Sales and Customer Service, with varying degrees of coordination with central IT groups – in fact, most initial deployments of Salesforce orgs were done fairly autonomously from central IT.
With Great Growth Comes Greater Integration Challenges
When these business units needed to engage with each other to run cross-functional tasks, the lack of a single customer view across the siloed Salesforce instances became a problem. Each individual Salesforce org had its own version of the truth, and it was impossible to locate where each customer was in the sales cycle with respect to each business unit. As a consequence, cross-selling and upselling became very difficult. In short, the very application that was a key technology enabler for growth was now posing challenges to meeting business objectives.
Scaling for Growth with Custom Apps
While many companies use the pre-packaged functionality in Salesforce, ISVs have also begun building custom apps on the Force.com platform due to its extensibility and rapid customization features. By using Salesforce to build native applications from the ground up, they could design innovative user interfaces that expose powerful functionality to end users. However, to truly add value, it was not just the user interface that was important, but also the back-end of the technology stack. This was especially evident when it came to aggregating data from several sources and surfacing it in the custom Force.com apps.
On April 23rd at 10am PDT, you’ll hear how two CIOs from two different companies tackled the above integration challenges with Salesforce: Eric Johnson of Informatica, a Rising Star finalist in the 2013 Silicon Valley Business Journal CIO Awards, and Derald Sue of InsideTrack, one of Computerworld’s 2014 Premier 100 IT Leaders.
Bad data is bad for business. Ovum Research reported that poor quality data is costing businesses at least 30% of revenues. Never before have business leaders across a broad range of roles recognized the importance of using high quality information to drive business success. Leaders in functions ranging from marketing and sales to risk management and compliance have invested in world-class applications, six sigma processes, and the most advanced predictive analytics. So why are you not seeing more return on that investment? Simply put, if your business-critical data is a mess, the rest doesn’t matter.
Not all business leaders know there’s a better way to manage their business-critical data. So, I asked Dennis Moore, the senior vice president and general manager of Informatica’s MDM business, who clocked hundreds of thousands of airline miles last year visiting business leaders around the world, to talk about the impact of using accurate, consistent and connected data and the value business leaders can gain through master data management (MDM).
Q. Why are business leaders focusing on business-critical data now?
A. Leaders have always cared about their business-critical data, the master data on which their enterprises depend most — their customers, suppliers, the products they sell, the locations where they do business, the assets they manage, the employees who make the business perform. Leaders see the value of having a clear picture, or “best version of the truth,” describing these “master data” entities. But, this is hard to come by with competing priorities, mergers and acquisitions and siloed systems.
As companies grow, business leaders start realizing there is a huge gap between what they do know and what they should know about their customers, suppliers, products, assets and employees. Even worse, most businesses have lost their ability to understand the relationships between business-critical data so they can improve business outcomes. Line of business leaders have been asking questions such as:
- How can we optimize sales across channels when we don’t know which customers bought which products from which stores, sites or suppliers?
- How can we quickly execute a recall when we don’t know which supplier delivered a defective part to which factory and where those products are now?
- How can we accelerate time-to-market for a new drug, when we don’t know which researcher at which site used which combination of compounds on which patients?
- How can we meet regulatory reporting deadlines, when we don’t know which model of a product we manufactured in which lot on which date?
Q. What is the crux of the problem?
A. The crux of the problem is that as businesses grow, their business-critical data becomes fragmented. There is no big picture because it’s scattered across applications, including on premise applications (such as SAP, Oracle and PeopleSoft) and cloud applications (such as Salesforce, Marketo, and Workday). But it gets worse. Business-critical data changes all the time. For example,
- a customer moves, changes jobs, gets married, or changes their purchasing habits;
- a supplier moves, goes bankrupt or acquires a competitor;
- you discontinue a product or launch a new one; or
- you onboard a new asset or retire an old one.
As all this change occurs, business-critical data becomes inconsistent, and no one knows which application has the most up-to-date information. This costs companies money. It saps productivity and forces people to do a lot of manual work outside their best-in-class processes and world-class applications. One question I always ask business leaders is, “Do you know how much bad data is costing your business?”
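As an illustration of what resolving “which application has the most up-to-date information” involves, here is a naive survivorship sketch. The source names, dates and the newest-non-null-wins rule are all hypothetical; production MDM engines apply far more nuanced, per-source trust and precedence rules:

```python
from datetime import date

# Hypothetical source records for one customer, each tagged with the
# date it was last updated in its system of entry.
records = [
    {"source": "CRM",     "updated": date(2014, 1, 5),
     "email": "j.geiger@old.com", "phone": None},
    {"source": "ERP",     "updated": date(2014, 3, 2),
     "email": None, "phone": "555-0100"},
    {"source": "Support", "updated": date(2014, 4, 20),
     "email": "j.geiger@new.com", "phone": None},
]

def best_version(records, fields):
    """Naive survivorship: for each field, keep the newest non-null value."""
    golden = {}
    for field in fields:
        candidates = [r for r in records if r.get(field) is not None]
        if candidates:
            golden[field] = max(candidates, key=lambda r: r["updated"])[field]
    return golden
```

Note that the resulting “golden record” combines values from different sources: no single application held the complete, current picture, which is precisely the fragmentation problem described above.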
Q. What can business leaders do to deal with this issue?
A. First, find out where bad data is having the most significant impact on the business. It’s not hard – just about any employee can share stories of how bad data led to a lost sale, an extra “truck roll,” lost leverage with suppliers, or a customer service problem. From the call center to the annual board planning meeting, bad data results in sub-optimal decisions and lost opportunities. Work with your line of business partners to reach a common understanding of where an improvement can really make a difference. Bad master data is everywhere, but bad master data that has material costs to the business is a much more pressing and constrained problem. Don’t try to boil the ocean or bring a full-blown data governance maturity level 5 approach to your organization if it’s not already seeing success from better data!
Second, focus on the applications and processes used to create, share, and use master data. Many times, some training, a tweak to a process, or a new interface can be created between systems, resulting in very significant improvements for the users without major IT work or process changes.
Lastly, look for a technology that is purpose-built to deal with this problem. Master data management (MDM) helps companies better manage business-critical data in a central location on an ongoing basis and then share that “best version of the truth” with all on premise and cloud applications that need it.
Let’s use customer data as an example. If valuable customer data is located in applications such as Salesforce, Marketo, Siebel CRM, and SAP, MDM brings together all the business-critical data, the core that’s the same across all those applications, and creates the “best version of the truth.” It also creates the total customer relationship view across functions, product lines and regions, which CRM promised but never delivered.
MDM then shares that “mastered” customer data and the total customer relationship view with the applications that want it. MDM can be used to master the relationships between customers, such as legal entity hierarchies. This helps sales and customer service staff be more productive, while also improving legal compliance and management decision making. Advanced MDM products can also manage relationships across different types of master data. For example, advanced MDM enables you to relate an employee to a project to a contract to an asset to a commission plan. This ensures accurate and timely billing, effective expense management, managed supplier spend, and even improved workforce deployment.
When your sales team has the best possible customer information in Salesforce and the finance team has the best possible customer information in SAP, no one wastes time pulling together spreadsheets of information outside of their world-class applications. Your global workforce doesn’t waste time trying to investigate whether Jacqueline Geiger in one system and Jakki Geiger in another system is one or two customers, sending multiple bills and marketing offers at high cost in postage and customer satisfaction. All employees who have access to mastered customer information can be confident they have the best possible customer information available across the organization to do their jobs. And with the most advanced and intelligent data platform, all this information can be secured so only the authorized employees, partners, and systems have access.
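To make the “Jacqueline vs. Jakki Geiger” problem concrete, here is a minimal sketch, not any vendor’s actual matching engine, of the two core MDM steps involved: deciding whether two records from different systems refer to the same customer, and merging them into a single golden record (the record values and threshold are illustrative assumptions):

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Normalized string similarity between 0.0 and 1.0."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Hypothetical records for the same person in two systems.
crm_record = {"name": "Jacqueline Geiger", "email": "jgeiger@example.com", "phone": None}
erp_record = {"name": "Jakki Geiger", "email": "jgeiger@example.com", "phone": "555-0100"}

def is_same_customer(r1, r2, threshold=0.5) -> bool:
    # An exact email match is strong evidence of identity;
    # otherwise fall back to fuzzy matching on the name.
    if r1["email"] and r1["email"] == r2["email"]:
        return True
    return similarity(r1["name"], r2["name"]) >= threshold

def merge(r1, r2):
    """Survivorship: prefer non-empty values to build the golden record."""
    return {k: r1[k] if r1[k] else r2[k] for k in r1}

golden = merge(crm_record, erp_record) if is_same_customer(crm_record, erp_record) else None
```

Real matching engines use far richer techniques (phonetic matching, nickname dictionaries, address standardization, probabilistic scoring), but the shape of the problem, match then survive the best values, is exactly this.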
Q. Which industries stand to gain the most from mastering their data?
A. In every industry there is some transformation going on that’s driving the need to know people, places and things better. Take insurance for example. Similar to the transformation in the travel industry that reduced the need for travel agents, the insurance industry is experiencing a shift from the agent/broker model to a more direct model. Traditional insurance companies now have an urgent need to know their customers so they can better serve them across all channels and across multiple lines of business.
In other industries, there is an urgent need to get a lot better at supply-chain management or to accelerate new product introductions to compete better with an emerging rival. Business leaders are starting to make the connection between transformation failures and a more critical need for the best possible data, particularly in industries undergoing rapid transformation, or with rapidly changing regulatory requirements.
Q. Which business functions seem most interested in mastering their business-critical data?
A. It varies by industry, but there are three common threads that seem to span most industries:
- MDM can help the marketing team optimize the cross-sell and up-sell process with high quality data about customers, their households or company hierarchies, the products and services they have purchased through various channels, and the interactions their organizations have had with these customers.
- MDM can help the procurement team optimize strategic sourcing including supplier spend management and supplier risk management with high quality data about suppliers, company hierarchies, contracts and the products they supply.
- MDM can help the compliance teams manage all the business-critical data they need to create regulatory reports on time without burning the midnight oil.
Q. How is the use of MDM evolving?
A. When MDM technology was first introduced a decade ago, it was used as a filter. It cleaned up business-critical data on its way to the data warehouse so you’d have clean, consistent, and connected information (“conformed dimensions”) for reporting. Now business leaders are investing in MDM technology to ensure that all of their global employees have access to high quality business-critical data across all applications. They believe high quality data is mission-critical to their operations. High quality data is viewed as the lifeblood of the company and will enable the next frontier of innovation.
Second, many companies mastered data in only one or two domains (customer and product), and used separate MDM systems for each. One system was dedicated to mastering customer data. You may recall the term Customer Data Integration (CDI). Another system was dedicated to mastering product data. Because the two systems were in silos and business-critical data about customers and products wasn’t connected, they delivered limited business value. Since that time, business leaders have questioned this approach because business problems aren’t confined to one type of data, such as customer or product, and many of the benefits of mastering data come from mastering other domains including supplier, chart of accounts, employee and other master or reference data shared across systems.
The relationships between data matter to the business. Knowing which customer bought what from which store or site is more valuable than just knowing your customer. The business insights you can gain from these relationships are limitless. Over 90% of our customers last year bought MDM because they wanted to master multiple types of data. Our customers value having all types of business-critical data in one system to deliver clean, consistent and connected data to their applications to fuel business success.
One last evolution we’re seeing a lot involves the types and numbers of systems connecting to the master data management system. In the past, there were a small number of operational systems pushing data through the MDM system into a data warehouse used for analytical purposes. Today, we have customers with hundreds of operational systems communicating with each other via an MDM system that has just a few milliseconds to respond, and which must maintain the highest levels of availability and reliability of any system in the enterprise. For example, one major retailer manages all customer information in the MDM system, using the master data to drive real-time recommendations as well as a level of customer service in every interaction that remains the envy of their industry.
Q. Dennis, why should business leaders consider attending MDM Day?
A. Business leaders should consider attending MDM Day at InformaticaWorld 2014 on Monday, May 12, 2014. You can hear first-hand the business value companies are gaining by using clean, consistent and connected information in their operations. We’re excited to have fantastic customers who are willing to share their stories and lessons learned. We have presenters from St. Jude Medical, Citrix, Quintiles and Crestline Geiger and panelists from Thomson Reuters, Accenture, EMC, Jones Lang Lasalle, Wipro, Deloitte, AutoTrader Group, McAfee-Intel, Abbvie, Infoverity, Capgemini, and Informatica among others.
Last year’s Las Vegas event, and the events we held in London, New York and São Paulo, were extremely well received. This year’s event is packed with even more customer sessions and opportunities to learn and to influence our product road map. MDM Day is one day before InformaticaWorld and is included in the cost of your InformaticaWorld registration. We’d love to see you there!
See the MDM Day Agenda.
A few years back, there was a movement in some businesses to establish “data stewards” – individuals who would sit at the heart of the enterprise and make it their job to assure that data being consumed by the organization is of the highest possible quality, is secure, is contextually relevant, and is capable of interoperating across any applications that need to consume it. While the data steward concept came along when everything was relational and structured, these individuals are now earning their pay when it comes to managing the big data boom.
The rise of big data is creating more than simple headaches for data stewards; it is creating turf wars across enterprises. As pointed out in a recent article in The Wall Street Journal, there isn’t yet a lot of clarity as to who owns and cares for such data. Is it IT? Is it lines of business? Is it legal? There are arguments that can be made for all jurisdictions.
In organizations these days, for example, marketing executives are generating, storing and analyzing large volumes of their own data within content management systems and social media analysis solutions. Many marketing departments even have their own IT budgets. Along with marketing, of course, everyone else within enterprises is seeking to pursue data analytics to better run their operations as well as foresee trends.
Typically, data has been under the domain of the CIO, the person who oversaw the collection, management and storage of information. In the Wall Street Journal article, however, it’s suggested that legal departments may be the best caretakers of big data, since big data poses a “liability exposure,” and legal departments are “better positioned to understand how to use big data without violating vendor contracts and joint-venture agreements, as well as keeping trade secrets.”
However, legal being legal, it’s likely that insightful data may end up getting locked away, never to see the light of day. Others may argue the IT department needs to retain control, but then again, IT isn’t trained to recognize information that may set the business on a new course.
Focusing on big data ownership isn’t just an academic exercise. The future of the business may depend on the ability to get on top of big data. Gartner, for one, predicts that within the next three years, at least a third of Fortune 100 organizations will experience an information crisis, “due to their inability to effectively value, govern and trust their enterprise information.”
This ability to “value, govern and trust” goes way beyond the traditional maintenance of data assets that IT has specialized in over the past few decades. As Gartner’s Andrew White put it: “Business leaders need to manage information, rather than just maintain it. When we say ‘manage,’ we mean ‘manage information for business advantage,’ as opposed to just maintaining data and its physical or virtual storage needs. In a digital economy, information is becoming the competitive asset to drive business advantage, and it is the critical connection that links the value chain of organizations.”
For starters, then, it is important that the business have full say over what data needs to be brought in, what data is important for further analysis, and what should be done with data once it gains in maturity. IT, however, needs to take a leadership role in assuring the data meets the organization’s quality standards, and that it is well-vetted so that business decision-makers can be confident in the data they are using.
The bottom line is that big data is a team effort, involving the whole enterprise. IT has a role to play, as does legal, as do the lines of business.