Tag Archives: ERP
In my last blog post I discussed why an understanding of corporate financial concepts is so important to data quality success. In this blog, I will examine knowledge of commercial enterprise applications as a key enabler of effective data quality delivery.
Packaged applications for ERP, CRM, MRP, HCM, etc. were first introduced decades ago to provide tightly integrated business management functions, standardized processes, and streamlined transaction processing. While one can argue whether these applications have lived up to all of the hype, the reality is that they have been successful and are here to stay. As these backbone systems evolved and matured, lessons learned from thousands of implementations were incorporated into the model solutions as best practices. These best practices spawned industry-standard processes, and specialized variants were born (e.g. vertical systems solutions). With the widespread adoption of these solutions, the days of custom-building an application to meet the business's needs have largely disappeared (although exceptions persist to support specialized needs).
According to a Forbes article, the average organization will grow its data by 50 percent in the coming year, and overall corporate data is expected to grow by 94 percent. Informatica predicts that data volumes will increase by as much as 75 times current levels by the year 2020. So what is Big Data all about? Big Data is the management and analytics of an immensely growing volume, variety, and velocity of data in a digital world. A precise definition of big data from analysts like Gartner and Forrester is a hot topic right now, covered in many blogs.
From my point of view, big data is about connecting the dots: connecting more data than ever before. But what is the role of product data in a big data world?
I recently spoke with our customer Halfords, the UK retailer for bicycle and auto parts, and one theme came through clearly: everyone is out to challenge Amazon. Halfords is known as the expert and friend of cyclists, so they position their brand as the leading expert with the best information, using product information as a market differentiator to gain customers' trust.
This points to a challenge that many distributors and retailers are facing. In order to better serve their B2B and B2C customers, they grow and position their product range to become the one trusted supplier. The long tail ("endless aisle") strategy also offers higher margins on niche products.
These distributors and retailers face the challenge of handling hundreds or thousands of suppliers providing content for millions of products. What happens when different suppliers provide information for the same product?
A business case for big product data: innovative distributors attempt to merge differing product content to create the best and richest product information. This requires intelligent analysis of each supplier's product data, and intelligent automation to merge that data into superior product content. The role of the data steward in defining these rules and policies becomes more important than ever.
How can this be solved?
Data doesn’t only come from suppliers but from other data sources as well. Basic product information might come from a data hub like GS1 or could be synchronized from the distributor’s ERP system, which in turn might be leading the creation of new products in the distributor’s master assortment.
This basic data is then enriched with data coming directly from the manufacturers or the distributor's suppliers. These different data sources provide content for the same products at different levels of quality, richness, and completeness.
Which parts of the product information are taken from which data source is determined by objective data quality rules combined with a trust level defined for each source. One supplier is known for accurate English descriptions, while another provides better German information, and a third usually delivers the best images.
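The attribute-by-attribute selection described above can be sketched in a few lines of code. This is a minimal illustration, not any vendor's implementation; the field names, source names, and trust scores are all hypothetical:

```python
# Per-source trust ranking, defined per attribute (higher = more trusted).
# For each attribute, the value from the most trusted source that actually
# supplies a non-empty value wins.
TRUST = {
    "description_en": {"supplier_a": 3, "supplier_b": 1, "gs1": 2},
    "description_de": {"supplier_b": 3, "supplier_a": 1, "gs1": 2},
    "image_url":      {"image_feed": 3, "supplier_a": 2, "supplier_b": 1},
}

def merge_product(records):
    """records: dict mapping source name -> attribute dict for one product."""
    merged = {}
    for attr, trust_by_source in TRUST.items():
        candidates = [
            (trust_by_source.get(src, 0), src, rec[attr])
            for src, rec in records.items()
            if rec.get(attr)  # skip sources with no value for this attribute
        ]
        if candidates:
            _, _, value = max(candidates)  # highest trust score wins
            merged[attr] = value
    return merged

records = {
    "supplier_a": {"description_en": "700c road tyre, 25mm", "image_url": ""},
    "supplier_b": {"description_en": "Road tyre",
                   "description_de": "Rennradreifen 25 mm"},
    "image_feed": {"image_url": "https://example.com/tyre.jpg"},
}
golden = merge_product(records)
```

In practice the trust rules would themselves be governed artifacts, maintained by data stewards rather than hard-coded, and the comparison would typically also weigh completeness and freshness, not just source identity.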
Governance of Product Information Creates Competitive Advantages
This is where Product Information Management comes into play: to control big product data. According to Heiler's PIM Product Manager, Markus Schuster, these business processes can only succeed when supported by intelligent, highly automated data quality checks and workflows that adhere to the data governance policy.
According to a 2011 Ovum survey, 85% of respondents cited ballooning data sets as the cause of application performance problems. Many IT organizations fell short in 2012 by letting unmanaged data growth impact the business. This year, Informatica is seeing a surge of interest in Enterprise Data Archive solutions. The interest stems from executives wanting to invest in innovative technologies for real-time and operational analytics; with little to no increase in IT budgets, IT leaders are getting creative.
Businesses are moving from on-premises applications to Software as a Service (SaaS), freeing up time and resources, yet the legacy application being replaced all too often stays in the data center consuming costly resources. IT leaders are recognizing the quick win of retiring legacy applications. An application retirement strategy supports data center consolidation and application modernization initiatives while ensuring data is retained to meet regulatory compliance and business needs. Significant cost savings are realized because mainframe systems can be turned off and maintenance costs go away. With these savings, executives can fund their analytics projects and drive competitive operations.
As one of the founders of Informatica's Smart Partitioning capability, I am constantly asked, "Why can't we just use (insert DB vendor here) tools to accomplish the same thing?" What a great, simple, straightforward question, and what a nuanced answer! Instead of talking about how great our technology is or walking through all the features and functionality, I thought it would be best to answer the actual question: "Why can't we do this on our own?" In this two-part series, we will explore the manual process of implementing Oracle database partitioning and compression in complex OLTP applications.
A CIO told me "After five years with an integration Center of Excellence, I expect them to be excellent. They aren't." But so what? The IT organization has lots of things to focus on. Is integration excellence really essential?
I’m sitting in the Taiwan airport on my way to Guangzhou. We just completed the Informatica World Tour in Hong Kong, Beijing and Taiwan, and I’ve had the opportunity to deliver the keynote presentation, Maximize Your Return on Big Data.
All of our audiences exceeded our expectations. We had 50% more attendees than planned. Why? Big data. It is a hot topic and everyone is trying to determine how to leverage big data in their enterprise to get a competitive advantage. At the event, I made the point – if you're not trying to understand how to leverage big data in your enterprise, your successor will. Kitty Fok, the IDC China Country Manager, spoke after me. Her consistent comment was – "if your company isn't looking to leverage big data, you will be out of business."
Over the last few years, most enterprises have implemented several large ERP and CRM suites. Although these applications were meant to have self-contained data models, it turns out that many enterprises still need to manage "master data" across the various applications. So the traditional IT role of hardware administration and custom programming has evolved into packaged application implementation and large-scale data management. According to Wikipedia: "MDM has the objective of providing processes for collecting, aggregating, matching, consolidating, quality-assuring, persisting and distributing such data throughout an organization to ensure consistency and control in the ongoing maintenance and application use of this information." Instead of designing large data warehouses to maintain the master data, many organizations turn to packaged Master Data Management (MDM) solutions (such as Informatica MDM). With these tools at hand, IT shops can build true Customer Master, Product Master (Product Information Management, or PIM), Employee Master, or Supplier Master solutions.
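The "matching and consolidating" steps in that definition are worth making concrete. Below is a deliberately simplified sketch, not how any MDM product actually works internally: customer records arriving from different applications are grouped by a crude normalized match key, then consolidated into one master record by preferring the first non-empty value per field. The record layout and source names are hypothetical:

```python
from collections import defaultdict

def match_key(rec):
    # Crude normalization: lowercased, trimmed name plus postal code.
    # Real MDM matching uses fuzzy, probabilistic, or rule-based comparisons.
    return (rec["name"].strip().lower(), rec["postal_code"].strip())

def consolidate(records):
    # Group records that appear to describe the same real-world entity.
    groups = defaultdict(list)
    for rec in records:
        groups[match_key(rec)].append(rec)
    # Collapse each group into a single master record.
    masters = []
    for recs in groups.values():
        master = {}
        for rec in recs:
            for field, value in rec.items():
                if value and not master.get(field):
                    master[field] = value  # first non-empty value wins
        masters.append(master)
    return masters

records = [
    {"source": "CRM", "name": "Acme Corp",  "postal_code": "94063", "phone": ""},
    {"source": "ERP", "name": "ACME CORP ", "postal_code": "94063", "phone": "555-0100"},
]
masters = consolidate(records)
```

Even this toy version shows why packaged MDM tools earn their keep: the hard parts are exactly the pieces stubbed out here, i.e. robust matching and governed survivorship rules per field.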
Let’s say you’re a Fortune 500 manufacturer and a supplier informs you that a part it sold you last year is faulty and needs to be replaced. What’s the first thing you do—and how do you do it?
You need answers fast to critical questions: In which products did we use the faulty part? Which customers bought those products and where are they located? Do we have substitute parts in stock? Do we have an alternate supplier?
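The first two questions are essentially a "where-used" lookup across a bill of materials and order history, which only works if that data is clean and connected. A minimal sketch, with entirely hypothetical product, part, and customer identifiers:

```python
# Bill of materials: product -> part numbers it contains.
BOM = {
    "bike-100": ["wheel-7", "brake-2"],
    "bike-200": ["wheel-7", "brake-3"],
}
# Order history: (customer, product) pairs.
ORDERS = [("C-1", "bike-100"), ("C-2", "bike-200"), ("C-3", "bike-200")]

def where_used(part):
    """Return the products containing the part and the customers who bought them."""
    products = {p for p, parts in BOM.items() if part in parts}
    customers = {c for c, p in ORDERS if p in products}
    return products, customers

products, customers = where_used("wheel-7")
```

The point of the sketch is that the query is trivial once product and customer master data are trustworthy; the hard work is getting BOM and order data consistent enough across systems for the answer to be complete.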
As companies increasingly explore master data management (MDM), we often hear inquiries about the usability of master data by business users.
Common questions include: Do business users need to learn and use a separate MDM application? Do they need support from IT to access master data? Can master data fit into the everyday business applications they use for CRM, SFA, ERP, supply chain management, and so forth?
If your organization has ever asked these questions, you should take a look at our new white paper, "Drive Business User Adoption of Master Data."