Tag Archives: Governance
Continuing the tour of our Data Governance Framework, it’s time to discuss the corporate policies that must be documented to form the foundation of your data governance efforts. When defined, approved, evangelized and enforced appropriately, these policies have the power to accomplish a feat that grassroots data governance efforts fail at repeatedly: Evolving your corporate culture to one that actually does manage data as an asset.
The next stop on the tour of our data governance framework focuses on the people investments that your organization must make to build out data governance capabilities. The right people are required to support, sponsor, steward, operationalize and ultimately deliver a positive return on your data assets. As you can imagine, the challenge of defining the right roles and responsibilities, job descriptions, career paths, and incentive plans for the people needed to make data governance a success will not be solved in a short blog post. So my goal here is to share my thoughts and open the discussion to identify the most relevant areas for consideration. (more…)
Last year, while still an analyst with Forrester Research, OCDQ “Blogger in Chief” Jim Harris and I coordinated dueling blogs taking polar-opposite stances on the debate over whether Data Governance initiatives should embrace an approach to optimize their agility or an approach to formalize the necessary bureaucracy. (You can read Jim’s blog here and my blog here.)
That was a fun exercise, but our clear conclusion was that aspects of both agility and bureaucracy are necessary to some extent for data governance to deliver real business value. (more…)
In the world of big data, getting access to data and making sense of it is often a more important consideration than managing sheer volume. Companies that succeed in unlocking real value from big data open themselves up to a world of insight: a better understanding of things like customer preferences, satisfaction and regional purchasing differences. Doing this is often harder than it seems, owing to the variety of the information itself, which leads to standardization and duplication issues. Ownership is often an issue as well, with departmental lines being the most common constraint on sharing important data across the enterprise. (more…)
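As a toy illustration of how formatting variety creates duplicates, consider the same customer arriving from two departments under slightly different names. This sketch uses hypothetical records and a deliberately naive normalization rule; real-world matching and standardization is far more involved:

```python
# Toy example: one customer recorded two ways by two departments.
# Without standardization, these look like two different customers.
records = [
    {"dept": "sales",   "customer": "Acme Corp."},
    {"dept": "support", "customer": "ACME Corporation"},
]

def standardize(name: str) -> str:
    """Deliberately naive normalization: lowercase, strip punctuation,
    and collapse common corporate suffixes to a single token."""
    cleaned = name.lower().replace(".", "").replace(",", "")
    suffixes = {"corp": "corp", "corporation": "corp", "inc": "corp"}
    return " ".join(suffixes.get(t, t) for t in cleaned.split())

# Both records collapse to the same standardized key.
keys = {standardize(r["customer"]) for r in records}
print(keys)  # → {'acme corp'}
```

Even this tiny rule set shows why a shared standardization step has to exist somewhere above the departmental silos: each department's local format is internally consistent, and the duplication only appears when the data is combined.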
Looking back at some of my Informatica Perspectives posts over the past year or so, I reflected on some common themes about data management and data governance, especially in the context of master data management and, particularly, master data models. As both the tools and the practices around MDM mature, we have seen some disillusionment in attempts to deploy an MDM solution, with our customers noting that they continue to hit bumps in the road during technical implementation, first with master data consolidation and then with publication of shared master data.
Almost every issue we see can be characterized into one of three buckets: (more…)
In my last post I started to talk about ideas for classifying data management issues, reasoning that classification helps determine whether acquiring a particular solution can realistically be expected to address the core issues. I have used this categorization with some of our customers, and the process of classification does lend some clarity when considering solutions. There are five categories: (more…)
The cost for 1GB of magnetic disk storage 20 years ago was $1,000 – now it’s eight cents. 1GB is enough to store about 20,000 letter-size scanned documents. Storing the same number of paper documents would require two four-drawer filing cabinets, which would cost about $400. Electronic data storage thus costs five thousand times less than paper storage.
Costs have dropped a consistent 40% per year, which accounts for the more than 12,000-fold reduction in cost since 1992. The cost of RAID or mainframe disk storage is somewhat greater, but the historical trend for other storage devices has been similar, and the forecast for the foreseeable future is that costs will continue to decrease at the same rate. Twenty years from now we will be able to buy one terabyte of storage for a penny. (more…)
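The arithmetic behind these figures is easy to check. A minimal Python sketch, using only the prices quoted above ($1,000 per GB then, eight cents now, $400 for two filing cabinets), shows how the reduction factor, the annual decline rate, and the penny-per-terabyte projection fall out:

```python
# Back-of-the-envelope check of the storage-cost figures quoted above.
price_1992 = 1000.00   # $ per GB of magnetic disk, 20 years ago (as quoted)
price_now = 0.08       # $ per GB today (as quoted)
years = 20

# Overall reduction since 1992: $1,000 -> $0.08 is a 12,500x drop.
reduction = price_1992 / price_now

# The annual decline implied by that drop (the post rounds it to ~40%).
annual_decline = 1 - (price_now / price_1992) ** (1 / years)

# Paper vs. disk: two filing cabinets (~$400) hold about what 1 GB holds.
paper_vs_disk = 400.00 / price_now

# Forward projection: the same reduction applied to 1 TB (1,024 GB)
# lands under a cent, i.e., "one terabyte for a penny."
price_tb_future = price_now * 1024 / reduction

print(f"{reduction:,.0f}x cheaper since 1992; ~{annual_decline:.0%}/yr decline; "
      f"paper is {paper_vs_disk:,.0f}x pricier; "
      f"1 TB in 20 years: ${price_tb_future:.4f}")
```

The implied annual decline works out to roughly 38%, which the post rounds to 40%; the projection simply assumes the historical rate holds for another two decades, as the forecast above does.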
The “Dodd-Frank Wall Street Reform and Consumer Protection Act” was recently passed by the US federal government to regulate financial institutions. Under this legislation, more “watchdog” agencies will audit banks and lending and investment institutions to ensure compliance. As an example, an Office of Financial Research within the Federal Treasury will be responsible for collecting and analyzing data. This legislation brings with it a higher risk of fines for non-compliance. (more…)
The CIO of GT Inc. (the fictitious name of a real company) met with his middleware vendor rep to deliver some depressing news.
“We established an outsourced factory delivery model two years ago using the productivity tools that you sold us, and we made it our enterprise standard. The factory results, however, are discouraging use of your integration platform. Projects are not getting approved by the business because of high costs, or else project teams are working around the standard and building hand-coded solutions. Did I make a mistake in buying your software?” (more…)
Gartner recently released their 2011 Magic Quadrant for Data Quality Tools and I’m happy to announce that Informatica is positioned in the Leaders’ quadrant. We believe our position is a testament to the fact that customers like Station Casinos and U.S. Xpress continue to turn to Informatica to solve their most critical data quality challenges.
The publication of the Magic Quadrant is often a great opportunity to reflect on the state of the data quality market. It should come as no surprise that data quality as a business imperative isn’t going away any time soon. We continue to see customers looking for help and expertise in solving a wide range of data quality problems, largely associated with data governance initiatives, master data management (MDM), business intelligence and application modernization. And the association of data quality with these areas is only getting stronger. (more…)