Tag Archives: Management
In my last post I started to talk about ideas for classifying data management issues, reasoning that classification helps determine how likely it is that acquiring a particular solution will actually address the core issues. I have used this categorization with some of our customers, and the process of classification does lend some clarity when considering solutions. There are five categories: (more…)
Are BI managers and professionals sometimes too eager to please the business? Are centralized BI efforts slowing down progress? Should BI teams address requirements before the business even asks for them? These questions may seem counter-intuitive, but Wayne Eckerson, director of research for TDWI, says that the best intentions for BI efforts in many organizations may actually result in sluggish projects, duplication of effort, and misaligned priorities between BI teams and the business. (more…)
At a conference last fall, I heard Martin Brodbeck, executive director for strategic architecture at Pfizer, describe how his company, a $48-billion pharmaceutical giant, was able to employ master data management (MDM) to bring together data assets from across its global enterprise into a single, centralized data definition.
The key ingredient to Pfizer’s success in this area, Brodbeck said, was not technology by itself, but enterprise governance. Pfizer’s MDM effort was led by an internal business sponsor, who helped promote the concept to the rest of the global enterprise. “Master data management is much more about governance than it is about technology,” he pointed out. (more…)
There’s no question that integrating analytical and transaction data to deliver “Pervasive Business Intelligence” can be a significant project for many enterprises. However, the good news is that it’s a capability that’s within the reach of many enterprises today. That’s the gist of a Q&A with three industry thought leaders, published in the latest edition of Intelligent Enterprise. (more…)
Since launching the EDM blog in early 2007, we have focused on a wide variety of data management, Informatica usage and technology topics. In 2008, I will also be discussing my experiences and research in Enterprise Data Warehousing, an area in which our customers have used our software and solutions to great success.
Enterprise Data Warehousing is a term that has been around for a long time. In the mid-90s, Bill Inmon preached an enterprise approach to data warehousing based on a central repository of corporate data. With the technology of the time, success was attainable only by a few elite organizations with extreme levels of funding. Informatica pioneered an incremental data mart approach that led to years of prosperity in the data warehousing market for Informatica and for customers using our technology in their data warehousing projects.
I attended my first parent-teacher meeting the other day for my five-year old daughter. Another one of those “life stage” events done and dusted – I remember dreading the annual meeting when I was a kid. The notion of my parents and my teacher comparing notes on my behaviour was too much to bear – somebody was eventually going to put two and two together and find out I was up to no good.
It all got me thinking about a recent blog post by my esteemed colleague Garry Moroney. His post Mobilizing the Data Quality Army outlined the level of effort, thought and planning that the US Department of Education is putting into data quality.
As Garry points out, dealing with data quality in a large, disconnected organization such as the US school system is not a trivial exercise. But if you were to read only that one post, you might be overwhelmed by the potential size of the data quality task in front of you.
A recent InformationWeek article* described the growth in IT employment across the US as a result of a shift in skills. Rather than focusing on pure IT proficiency, organizations are looking for talent with “a more hybrid mix of technology skills, along with an understanding of the business and its customers.”
IT departments are highly motivated to increase the level of collaboration with their counterparts in the business. Nowhere is this more critical than in the area of data quality, and the trend is causing a shift in the way companies are looking to solve their data quality issues. First-generation data quality tools had a natural focus on technology rather than business. Here are some of the differences between technology-focused and business-focused data quality solutions.
Tools vs. Process
Technology-focused data quality solutions provide tools that automate data processing. Evidence of this focus can be seen in the way vendors tout the sophistication of their algorithms over their ability to support ongoing data quality management processes. While technology is extremely important, its relevance cannot eclipse the overall data quality management process. Even if your data quality tool can automate the correction of 95 percent of the data, if the remaining 5 percent cannot be managed properly, you will continue to suffer from poor data quality.
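The point about the residual 5 percent can be made concrete with a small sketch. This is not any vendor's tool; the cleansing rule, field names and sample data are all hypothetical. It shows an automated correction step that routes whatever it cannot fix into an exception queue, which is exactly the part that needs an ongoing management process rather than more algorithms:

```python
# Hypothetical sketch: automated cleansing plus an exception queue.
# The rule, field names and sample records are illustrative only.

def standardize_phone(value):
    """Keep digits only; format 10-digit US numbers, else raise."""
    digits = "".join(ch for ch in value if ch.isdigit())
    if len(digits) == 10:
        return "({}) {}-{}".format(digits[:3], digits[3:6], digits[6:])
    raise ValueError("cannot auto-correct: %r" % value)

def cleanse(records):
    corrected, exceptions = [], []
    for rec in records:
        try:
            rec = dict(rec, phone=standardize_phone(rec["phone"]))
            corrected.append(rec)
        except ValueError:
            # The residual the tool cannot fix: a human process owns this.
            exceptions.append(rec)
    return corrected, exceptions

records = [
    {"id": 1, "phone": "650 385 5000"},
    {"id": 2, "phone": "(650) 385-5000"},
    {"id": 3, "phone": "ext. 42"},  # no automated rule can correct this
]
good, review_queue = cleanse(records)
```

The interesting question for a business-focused solution is not the `standardize_phone` algorithm but who works `review_queue`, how quickly, and how the root cause gets fed back upstream.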
I recently received an email from one of my early clients. After having worked in four different companies in four different industries, she came to a sad conclusion, writing:
“The thing that they all have in common is a desire to cut corners and deal with quality later. It takes a lot of energy to be the information quality cheerleader, and I find it discouraging and overwhelming at times. Keep writing your articles and books to encourage all the people like me who are dealing with these issues every day.” P. G.
What P. G. has experienced is, unfortunately, the norm, not the exception. There are two critical elements in this experience.
Alice: Would you tell me, please, which way I ought to go from here?
The Cheshire Cat: That depends a good deal on where you want to get to.
Alice: I don’t much care where.
The Cheshire Cat: Then it doesn’t much matter which way you go.
– Lewis Carroll, Alice’s Adventures in Wonderland
When confronted with the problem of how to address their data quality issues, many organisations face a dilemma similar to the one that confronted Alice during her travels in Wonderland: “I know that I need to do something, but I don’t know where to start.” Knowing where to start and, equally importantly, understanding the size of the problem and where the organisation needs to go are critical factors in ensuring that the data quality journey takes them where they need to be, at a price they are prepared to pay.
When planning their “journey”, organisations need to address data quality holistically by considering each of the three DQ pillars in turn: first “People”, then “Ideas” and finally “Technology”. Many DQ initiatives have failed because the primary focus was on delivering a technical solution. Without the right framework in place, operated by the right people, this approach will never deliver the results that organisations need. Time and time again the IT industry has proved that the pure application of technology will never solve business issues; technology in itself will never win the “war”. It is always the right people with the right ideas who use the technology in the right way.
I’ve just been reading a US Department of Education briefing document on improving data quality in education performance data. The report stresses the impact that low-quality data can have on measuring the success of education programs. It discusses, for example, the numerous data quality problems identified in the “No Child Left Behind” program established in 2001. The problems are typical: non-standardized data definitions, inconsistent data from different sources, data entry errors and lack of timeliness.
The briefing document outlines a broad set of data quality guidelines to be implemented right across the education system in the US – at State level, in Local Education Agencies (LEAs) and in schools themselves. The three foundation stones of the data quality framework outlined are:
• suitable technical infrastructure,
• a comprehensive dictionary of data definitions, and
• staff ownership, organization and training.
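The second foundation stone, a shared dictionary of data definitions, is what makes the typical problems above detectable in the first place. As a minimal sketch, assuming hypothetical field names and rules (nothing here comes from the actual briefing document), records from an LEA could be validated against such a dictionary, catching entry errors, out-of-range values and stale submissions:

```python
# Hypothetical sketch: validating records against a shared data dictionary.
# Field names, types, ranges and the 90-day freshness window are assumptions.
from datetime import date

DATA_DICTIONARY = {
    "student_id": {"type": str, "required": True},
    "grade_level": {"type": int, "allowed": range(0, 13)},  # K through 12
    "report_date": {"type": date, "required": True},
}

def validate(record, as_of=date(2008, 1, 1), max_age_days=90):
    errors = []
    for field, rule in DATA_DICTIONARY.items():
        value = record.get(field)
        if value is None:
            if rule.get("required"):
                errors.append("%s: missing" % field)
            continue
        if not isinstance(value, rule["type"]):
            errors.append("%s: wrong type" % field)     # entry/definition error
        elif "allowed" in rule and value not in rule["allowed"]:
            errors.append("%s: out of range" % field)   # entry error
    rd = record.get("report_date")
    if isinstance(rd, date) and (as_of - rd).days > max_age_days:
        errors.append("report_date: stale")             # timeliness check
    return errors

rec = {"student_id": "S-1001", "grade_level": 14, "report_date": date(2007, 12, 1)}
problems = validate(rec)  # flags the impossible grade level
```

The checks themselves are trivial; the hard part, as the briefing document argues, is the third pillar: getting staff at state, LEA and school level to own the dictionary and act on what the checks report.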