Tag Archives: Best Practices
I’ve been in the data management industry for over 20 years, and you’ve always been very supportive of my career – even though you admit you have no clue what it is I do for a living. So here’s my best shot at explaining what I do so you can more accurately brag about me to your friends!
If you were asked, “What is the biggest application in your organization?” what would you say? If you’re in banking, you might say it’s the Hogan deposit system. If you’re in telecom, maybe it’s the Amdocs customer care and billing system. If you’re in retail, you might say the Retek merchandising system. If you’re a manufacturer, it might be your SAP ERP system. The list goes on, but you get the point. The prevailing perception is that the core business application of whatever industry you’re in is the biggest application. But this is a case where perception is not reality. (more…)
I’ll get to the secret in just a minute, but first an observation about the cost of IT. Forrester has been conducting a cost-of-IT study for many years, with the most recent results published in the 2013 IT Budget Planning Guide for CIOs. The report includes a chart of total IT spending as a percentage of revenue by industry and company size. Cost as a percentage of revenue is a key performance indicator for IT efficiency as organizations increase in size. I first noticed a peculiarity in the data in the 2007 study and wondered whether it had changed over the years – it hasn’t. The observation is this: for many industries, the cost of IT as a percentage of revenue increases as organizations get larger. What is going on here? Whatever happened to “economies of scale”? Instead we seem to have “diseconomies of scale”! (more…)
Following up on the discussion I started on GovernYourData.com (thanks to all who provided great feedback), here’s my full proposal on this topic:
We all know about the “Garbage In/Garbage Out” reality that data quality and data governance practitioners have been fighting against for decades. If you don’t trust data when it’s initially captured, how can you trust it when it’s time to consume or analyze it? But I’m also looking at the tougher problem of data degradation. The data comes into your environment just fine, but any number of actions, events – or inactions – turns that “good” data “bad”.
So far I’ve been able to hypothesize eight root causes of data degradation. I’d really love your feedback on both the validity and completeness of these categories. I’ve used similar examples across a number of these to simplify. (more…)
This is a Lean Integration story – trust me, it will become clear as the story progresses.
I’ve now passed through London’s Heathrow airport security at least five times in the past year, so that makes me an expert. A common pattern I have observed is when the x-ray scanner notices something “suspicious” (like fluids or creams that should be in a separate clear plastic bag). Then the nightmare starts. (more…)
In my previous blog I explored the importance of a firm understanding of commercial packaged applications on data quality success. In this final post, I will examine the benefits of having operational experience as a key enabler of effective data quality delivery. (more…)
Integration technologies have been around for 20 years (as long as Informatica has been in business) and have proliferated in corporate IT. We are now at an inflection point in the business needs and maturity of integration best practices which we can call Next Generation Data Integration (DI). If we’re going to talk about the next generation, then first we need to put a stake in the ground to describe the current, or prior generation. Furthermore, for it to be a “generational” change, it needs to be a significant step-function improvement in how the work is done and in the business value generated by data assets. Or as Jim Collins said in Built to Last: Successful Habits of Visionary Companies, we need a Big Hairy Audacious Goal. (more…)
If your goal is to implement a world-class Integration Competency Center (ICC) or COE, the best people you could find to make up the team already work for you. If you don’t currently have technical superstars on your team, you can still have a leading-edge, world-class ICC that will “wow” your internal customers every time. You don’t need a world-class team to have a world-class competency center; you need a world-class management system. (more…)
In my last blog post I discussed why an understanding of corporate financial concepts is so important to data quality success. In this blog, I will examine knowledge of commercial enterprise applications as a key enabler of effective data quality delivery.
Packaged applications for ERP, CRM, MRP, HCM, etc. were first introduced decades ago to provide tightly integrated business management functions, standardized processes and streamlined transaction processing. While one can argue whether or not these applications have lived up to all of the hyperbole, the reality is that they have been successful and are here to stay. As these backbone systems continued to evolve and mature, lessons learned from thousands of implementations were incorporated into the model solutions as best practices. These best practices spawned industry standard processes and specialized variants were born (e.g. vertical systems solutions). With the widespread adoption of these solutions, the days of custom building an application to meet the business’s needs have largely disappeared (although exceptions do persist to support specialized needs). (more…)
In my recent white paper, “Holistic Data Governance: A Framework for Competitive Advantage”, I aspirationally state that data governance should be managed as a self-sustaining business function, no different than Finance. With this in mind, last year I chased down Earl Fry, Informatica’s Chief Financial Officer, and asked him how his team helps our company prioritize investments and resources. Earl suggested I speak with the head of our enterprise risk management group … and I left inspired! I was shown a portfolio management-style approach to prioritizing risk management investment. It used an easy-to-understand, business executive-friendly “heat map” dashboard that aggregates and summarizes the multiple dimensions we use to model risk. I asked myself: if an extremely mature and universally relevant business function like Finance manages its business this way, can’t the emerging discipline of data governance learn from it? Here’s what I’ve developed… (more…)
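To make the heat-map idea concrete, here is a minimal sketch of how such a portfolio-style view might be scored. The risk names, the 1–5 likelihood/impact scales, and the bucket thresholds are all hypothetical illustrations, not the actual model described in the white paper:

```python
# Hypothetical data-governance risks, each scored on a 1-5 scale
# for likelihood and business impact (illustrative values only).
risks = [
    ("Duplicate customer records", 4, 3),
    ("Stale reference data", 2, 3),
    ("Regulatory reporting errors", 4, 5),
]

def heat_cell(likelihood, impact, high=16, medium=8):
    """Bucket a risk into a heat-map zone by likelihood * impact.

    The thresholds (high=16, medium=8) are assumptions chosen for
    illustration; a real model would calibrate them per dimension.
    """
    score = likelihood * impact
    if score >= high:
        return "High"
    if score >= medium:
        return "Medium"
    return "Low"

# Aggregate into the executive-friendly summary view.
heat_map = {name: heat_cell(l, i) for name, l, i in risks}
for name, zone in heat_map.items():
    print(f"{zone:6} {name}")
```

The point of the exercise is the aggregation step: many fine-grained risk scores collapse into a handful of color-coded cells that an executive can prioritize at a glance, which is exactly the portfolio-management trick Finance has used for years.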