Tag Archives: ERP
Recently I interviewed a consulting manager who had just delivered a very large ERP data migration. His biggest challenge? Getting the business users engaged early in the project. This is a very familiar story; I have been encountering the same challenge for years in delivering application data migrations. And it cuts both ways: either the business users are unwilling to spend the time, or the IT team does not recognize the need. In this case the team was delayed significantly (a project that should have taken four months lasted nearly seven), as data issues reared their ugly heads at the end of the development cycle, during the first mock load of the data. A lack of collaboration has real costs. In a recent white paper published by Informatica, The Five Pitfalls of Data Migration, one of the pitfalls outlined is a lack of collaboration between the business users (the data experts) and the data migration technical team.
What do I mean by business/IT collaboration? Really I am talking about baking data stewardship into the process of application data migration, along with the tools to support that process. (more…)
As CIOs embark on their enterprise 2.0 strategy, even more confusion exists around the topic of community, content and collaboration.
Let’s first clarify some nomenclature by defining enterprise 1.0.
These enterprises were happy implementing document management systems, search, and portals, and establishing security strategies to protect the perimeter. In that world, you were either inside or outside the company network; if you needed information, you first had to connect to the network somehow.
Business modernization programs typically focus on process standardization to gain the benefits of efficient repeatable, measurable processes. Enterprise resource planning (ERP) technologies fulfill the process standardization requirements and have now become a central point for management of business processes. However, ERP systems do not prevent low quality data from entering the systems nor do they measure its impact on the efficiency of a business process. Most organizations today are using the same ERP systems (SAP or Oracle) that were configured by the same consultancies. Therefore, the uniqueness and the scope for competitive advantage of any organization are defined by the people and the data. (more…)
Through his many speaking engagements and regular contributions to publications such as Information Management, Dan Power has built a solid reputation as one of the most knowledgeable commentators on all things master data management. He has a new white paper titled “When Data Governance Turns Bureaucratic: How Data Governance Police Can Constrain the Value of Your Master Data Management Initiative” that we’re featuring on the Siperian website.
Dan observes that while many early adopters of MDM expected to quickly establish a “single source of truth” across their information systems, many have encountered problems. The culprit? Reactive data governance.
As many as 80%–90% of companies implementing MDM start with a “coexistence” architecture, whereby front-office applications (CRM) and back-office applications (ERP) are still used to author master data (customer and product data, suppliers, employees, etc.). Because these applications remain the “Systems of Entry,” while the MDM hub’s role is limited to being the “System of Record,” some of the biggest promises of MDM remain unrealized.
Dan shows that firms embracing a proactive data governance approach can overcome these limitations. By authoring data directly in the master data management hub itself, firms can decouple data entry from the traditional CRM and ERP systems. When the System of Entry and the System of Record are one and the same, the application architecture is simplified quite a bit. The CRM and ERP systems become consumers of master data only – they no longer originate it.
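The proactive pattern described above can be sketched in a few lines. This is a hypothetical illustration, not any vendor's API: the class names (`MasterDataHub`, `CrmConsumer`) and the single validation rule are invented for the example. The point is structural — the hub is both System of Entry and System of Record, and the CRM only reads from it.

```python
class MasterDataHub:
    """Hypothetical hub: System of Entry AND System of Record."""

    def __init__(self):
        self._records = {}
        self._next_id = 1

    def author(self, entity_type, attributes):
        # Master data is validated once, at the point of creation,
        # inside the hub -- not retroactively by a data steward.
        if not attributes.get("name"):
            raise ValueError("master record requires a name")
        record_id = self._next_id
        self._next_id += 1
        self._records[record_id] = {"type": entity_type, **attributes}
        return record_id

    def get(self, record_id):
        return self._records[record_id]


class CrmConsumer:
    """Downstream application: consumes master data, never originates it."""

    def __init__(self, hub):
        self._hub = hub

    def customer_name(self, record_id):
        # The CRM reads the golden record; it has no write path.
        return self._hub.get(record_id)["name"]


hub = MasterDataHub()
cid = hub.author("customer", {"name": "Acme Corp", "country": "US"})
crm = CrmConsumer(hub)
print(crm.customer_name(cid))
```

In the coexistence architecture, by contrast, `CrmConsumer` would have its own `create_customer` method, and the hub would have to reconcile its output after the fact.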
Read Dan’s new white paper to learn about the shortfalls of reactive data governance and how proactive data governance can benefit you. You can download the white paper here.
Earlier this month our Managing Director for Informatica in ASEAN, Suganthi Shivkumar, wrote an article entitled “Data Integration Helps Achieve Business Goals”.
I have reposted it here with permission from CXOToday.
Data Integration Helps Achieve Business Goals
Imagine a world where IT is perfectly aligned with the business. Accurate, relevant information flows freely throughout the enterprise, driving timely decisions and actions. The IT infrastructure is flexible and designed for reuse, ensuring that companies don’t just respond to changing business requirements and competitive pressures, but stay ahead of them. And in an ideal world, IT delivers real, measurable value to the business, supporting key goals such as: (more…)
Over the last four years we have seen Customer Data Integration (CDI) evolve into Master Data Management (MDM), heard many debates about MDM architectural styles, features and functions, and operational vs. analytical use, and seen the ever-present need for an MDM platform to offer web services so that applications can manage and retrieve the valuable master data contained within the MDM system. In fact, one vendor used to tout the number of services shipped with its hub (hundreds) as evidence that it was the best MDM platform.
Fast forward to the present day. Many MDM systems are now up and running, processing master data sourced from internal CRM, ERP, and legacy applications to create a Reliable Trusted Golden Record (RTGR), but they have not yet “mastered” the ability to let business users directly create and consume master data from the MDM system. As a result, business users continue to create incomplete, inconsistent and duplicate data within their business applications (CRM, ERP, etc.), and those applications are unable to enforce data quality at the point of creation. Data stewards end up retroactively resolving those problems downstream. This quickly creates a bottleneck, because the sheer volume of errors means data stewards cannot fix the thousands of bad records fast enough. Adding features and functions to data stewardship consoles and improving back-end automated processing helps somewhat, but many companies are now realizing that a possible solution is to take a page from Business Intelligence (BI) in the ’90s by giving business users more direct visibility into, and management of, master data.
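The contrast between enforcing quality at the point of creation and fixing it downstream can be made concrete with a minimal sketch. Everything here is illustrative: the required fields, the `EntryGate` name, and the matching rule (normalized-name equality) are deliberately simplistic stand-ins for real survivorship and matching logic.

```python
def normalize(name):
    # Naive normalization: lowercase and collapse whitespace.
    return " ".join(name.lower().split())


class EntryGate:
    """Hypothetical point-of-creation check: reject incomplete records
    and flag likely duplicates before they reach the hub, instead of
    leaving thousands of bad rows for a data steward to fix later."""

    REQUIRED = ("name", "country")

    def __init__(self):
        self._seen = {}  # normalized name -> previously accepted record

    def submit(self, record):
        missing = [f for f in self.REQUIRED if not record.get(f)]
        if missing:
            return ("rejected", missing)
        key = normalize(record["name"])
        if key in self._seen:
            # Surface the suspected duplicate to the submitter now,
            # rather than to a steward downstream.
            return ("duplicate", self._seen[key])
        self._seen[key] = record
        return ("accepted", record)


gate = EntryGate()
print(gate.submit({"name": "Acme Corp", "country": "US"}))   # accepted
print(gate.submit({"name": "acme  corp", "country": "US"}))  # flagged as duplicate
print(gate.submit({"name": "", "country": "US"}))            # rejected: missing name
```

In a coexistence architecture, none of this runs at entry time; the equivalent matching happens in batch inside the hub, after the bad records already exist in CRM and ERP.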
To do this, IT must decide how to put a business-facing user interface (UI) on top of the MDM hub, knowing that new types of master data will likely be added to the hub as business requirements change, and that popular UI technologies themselves continue to evolve and mature. Many IT teams are not equipped for custom application development. Even if they were, there is the small problem of finding enough bandwidth to learn how to incorporate those hundreds of services and embark on major ground-up development of new applications. Finally, the maintenance and change costs would be high.
What is needed is for MDM platforms to move to the next level. The architecture of an MDM platform should be integrated, model/metadata-driven, and flexible enough to adopt the latest technologies in UI, workflow, and so on, along with tight security. All of this should be achievable through configuration rather than coding.
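The "configuration rather than coding" idea above can be sketched as follows. This is an assumption-laden illustration, not any platform's actual mechanism: the entity model lives as metadata, and generic machinery derives validation (and, in a real system, the UI form) from it. Adding a new master data type then means adding metadata, not writing a new application. The `MODEL` structure and field names are invented for the example.

```python
# Hypothetical metadata model: one entry per master data entity type.
MODEL = {
    "supplier": {
        "fields": {
            "name":    {"type": str, "required": True},
            "country": {"type": str, "required": True},
            "rating":  {"type": int, "required": False},
        }
    }
}


def validate(entity_type, record, model=MODEL):
    """Generic validator driven entirely by the metadata model.
    A UI generator could walk the same model to render input forms."""
    spec = model[entity_type]["fields"]
    errors = []
    for field, rules in spec.items():
        value = record.get(field)
        if value is None:
            if rules["required"]:
                errors.append(f"{field} is required")
            continue
        if not isinstance(value, rules["type"]):
            errors.append(f"{field} must be {rules['type'].__name__}")
    return errors


print(validate("supplier", {"name": "Globex", "country": "DE"}))  # no errors
print(validate("supplier", {"name": "Globex", "rating": "A"}))    # two errors
```

The design point is that `validate` never mentions suppliers: supporting a new entity type, or a changed business requirement, is a change to `MODEL` alone.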
MDM for the business masses, it’s time.
It is estimated that organizations spend more than $37 billion annually on enterprise application software licenses alone – not including maintenance, services or other associated costs. So enterprise applications have been, and continue to be, huge investments for many companies. In fact, many companies have multiple ERP and CRM systems running across various divisions and business units. So you would think enterprises have all the bases covered, right? Not necessarily – a huge piece of these investments gets neglected or overlooked: the data that is tied to these systems.
This is the assertion of a new study from Datamonitor, which concludes that while billions upon billions are spent on enterprise application systems, organizations are not investing nearly enough resources or attention in the data that will be moving in and out of these systems. (more…)