Tag Archives: ERP
In my last blog post I discussed why an understanding of corporate financial concepts is so important to data quality success. In this blog, I will examine knowledge of commercial enterprise applications as a key enabler of effective data quality delivery.
Packaged applications for ERP, CRM, MRP, HCM, etc. were first introduced decades ago to provide tightly integrated business management functions, standardized processes and streamlined transaction processing. While one can argue whether these applications have lived up to all of the hyperbole, the reality is that they have been successful and are here to stay. As these backbone systems continued to evolve and mature, lessons learned from thousands of implementations were incorporated into the model solutions as best practices. These best practices spawned industry-standard processes and gave rise to specialized variants (e.g. vertical systems solutions). With the widespread adoption of these solutions, the days of custom-building an application to meet the business’s needs have largely disappeared (although exceptions persist to support specialized needs). (more…)
According to a 2011 Ovum survey, 85% of respondents cited ballooning data sets as the cause of application performance problems. Many IT organizations fell short in 2012, letting unmanaged data growth impact the business. This year, Informatica is witnessing a surge of interest in Enterprise Data Archive solutions. The interest stems from executives who want to invest in innovative technologies for real-time and operational analytics. Yet, with little to no increase in IT budgets, IT leaders are getting creative.
Businesses are moving from on-premises applications to Software as a Service (SaaS), freeing up time and resources – yet the legacy application being replaced all too often stays in the data center consuming costly resources. IT leaders are recognizing the quick win of retiring legacy applications. An application retirement strategy supports data center consolidation and application modernization initiatives – while ensuring data is retained to meet regulatory compliance and business needs. Significant cost savings are realized because mainframe systems can be turned off and their maintenance costs go away. With these freed-up funds, executives can finance their analytics projects and drive competitive operations. (more…)
As one of the founders of Informatica’s Smart Partitioning capability, I am constantly asked, “Why can’t we just use (insert DB vendor here) tools to accomplish the same thing?” What a great, simple, straightforward question…and what a nuanced answer! Instead of talking about how great our technology is or walking through all the features and functionality, I thought it would be best to answer the actual question: “Why can’t we do this on our own?” In this two-part series, we will explore the manual process of implementing Oracle database partitioning and compression in complex OLTP applications. (more…)
A CIO told me “After five years with an integration Center of Excellence, I expect them to be excellent. They aren’t.” But so what? The IT organization has lots of things to focus on. Is integration excellence really essential? (more…)
I’m sitting in the Taiwan airport on my way to Guangzhou. We just completed the Informatica World Tour in Hong Kong, Beijing and Taiwan, and I’ve had the opportunity to deliver the keynote presentation, Maximize Your Return on Big Data.
Every audience exceeded our expectations: we had 50% more attendees than planned. Why? Big data. It is a hot topic, and everyone is trying to determine how to leverage big data in their enterprise to gain a competitive advantage. At the event, I made the point that if you’re not trying to understand how to leverage big data in your enterprise, your successor will. Kitty Fok, the IDC China Country Manager, spoke after me. Her consistent message was – “if your company isn’t looking to leverage big data, you will be out of business.” (more…)
Over the last few years most enterprises have implemented several (if not many) large ERP and CRM suites. Although these applications were meant to have self-contained data models, it turns out that many enterprises still need to manage “master data” across the various applications. So the traditional IT role of hardware administration and custom programming has evolved into packaged application implementation and large-scale data management. According to Wikipedia: “MDM has the objective of providing processes for collecting, aggregating, matching, consolidating, quality-assuring, persisting and distributing such data throughout an organization to ensure consistency and control in the ongoing maintenance and application use of this information.” Instead of designing large data warehouses to maintain the master data, many organizations turn to packaged Master Data Management (MDM) solutions (such as Informatica MDM). With these tools at hand, IT shops can then build true Customer Master, Product Master (Product Information Management – PIM), Employee Master, or Supplier Master solutions. (more…)
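The match-and-consolidate steps in that definition can be sketched in a few lines of Python. This is an illustration only – the field names, the crude normalization rule, and the “most recent non-empty value wins” survivorship policy are assumptions made for this sketch, not the behavior of Informatica MDM or any other product:

```python
# Minimal sketch of MDM-style match-and-merge for customer records.
# All fields and rules below are illustrative assumptions.

def normalize(name: str) -> str:
    """Crude match key: lowercase, drop punctuation and corporate suffixes."""
    key = "".join(ch for ch in name.lower() if ch.isalnum() or ch == " ")
    for suffix in (" inc", " corp", " llc", " ltd"):
        if key.endswith(suffix):
            key = key[: -len(suffix)]
    return key.strip()

def consolidate(records):
    """Group records by match key and build one 'golden' record per group.

    Survivorship rule (an assumption for this sketch): for each field,
    keep the value from the most recently updated record that has one.
    """
    groups = {}
    for rec in records:
        groups.setdefault(normalize(rec["name"]), []).append(rec)

    golden = {}
    for key, group in groups.items():
        merged = {}
        # Oldest first, so later (newer) non-empty values win.
        for rec in sorted(group, key=lambda r: r["updated"]):
            for field, value in rec.items():
                if value:
                    merged[field] = value
        merged["source_count"] = len(group)
        golden[key] = merged
    return golden

# Example: the same customer mastered differently in CRM and ERP.
crm = {"name": "Acme Corp", "phone": "", "city": "Austin", "updated": "2012-06-01"}
erp = {"name": "ACME, Inc.", "phone": "512-555-0100", "city": "", "updated": "2013-01-15"}
masters = consolidate([crm, erp])
```

Both source names normalize to the same match key, so the two records collapse into one golden record that keeps the CRM city and the newer ERP phone number. Real MDM tools replace each piece here with far more sophisticated machinery (probabilistic matching, configurable survivorship, audit trails), but the pipeline shape is the same.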
Let’s say you’re a Fortune 500 manufacturer and a supplier informs you that a part it sold you last year is faulty and needs to be replaced. What’s the first thing you do—and how do you do it?
You need answers fast to critical questions: In which products did we use the faulty part? Which customers bought those products and where are they located? Do we have substitute parts in stock? Do we have an alternate supplier? (more…)
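Structurally, those first two questions are a pair of lookups: faulty part to affected products via the bill of materials, then affected products to customers via order history. A minimal sketch, with entirely hypothetical part numbers, products, and customers (in practice these would be joins across ERP and supply chain systems, not in-memory dictionaries):

```python
# Hedged sketch: tracing a faulty part to products and customers.
# All identifiers and data below are made up for illustration.

# part -> products that use it (a flattened bill of materials)
bom = {
    "P-1001": ["widget-A", "widget-B"],
    "P-2002": ["widget-C"],
}

# product -> (customer, location) pairs from order history
orders = {
    "widget-A": [("Globex", "Chicago")],
    "widget-B": [("Initech", "Dallas"), ("Globex", "Chicago")],
    "widget-C": [("Umbrella", "Raleigh")],
}

def affected_products(part_id):
    """Products whose bill of materials includes the given part."""
    return bom.get(part_id, [])

def affected_customers(part_id):
    """Unique (customer, location) pairs that bought any affected product."""
    seen = set()
    for product in affected_products(part_id):
        seen.update(orders.get(product, []))
    return sorted(seen)

products = affected_products("P-1001")
customers = affected_customers("P-1001")
```

The hard part in a real enterprise is not the traversal but the data behind it: if part numbers and customer names are mastered inconsistently across systems, these lookups silently miss records – which is exactly the data quality problem this series is about.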
As companies increasingly explore master data management (MDM), we often hear inquiries about the usability of master data by business users.
Common questions include: Do business users need to learn and use a separate MDM application? Do they need support from IT to access master data? Can master data fit into the everyday business applications they use for CRM, SFA, ERP, supply chain management, and so forth?
Recently I interviewed a consulting manager who had just delivered a very large ERP data migration. What was his biggest challenge? Getting the business users engaged early in the project. And this is a very familiar story; I have been encountering the same challenge for years in delivering application data migrations. It cuts both ways: either the business users are unwilling to spend the time, or the IT team does not recognize the need. In this case the team was delayed significantly (a project that should have taken four months lasted nearly seven), as data issues reared their ugly heads at the end of the development cycle, during the first mock load of the data. Lack of collaboration? It has impacts. In a recent white paper published by Informatica, The Five Pitfalls of Data Migration, one of the pitfalls outlined was a lack of collaboration between the business users (or data experts) and the data migration technical team.
What do I mean by business/IT collaboration? Really, I am talking about baking data stewardship – and the tools to support it – into the process of application data migration. (more…)