Tag Archives: Data Management
“If I had my way, I’d fire the statisticians – all of them – they don’t add value”.
Surely not? Why would you fire the very people employed to make sense of vast volumes of manufacturing data and guide future production? But he was right. The problem was that, at the time, data management was so poor that the data simply was not available for the statisticians to analyze.
So, perhaps this title should be re-written to be:
Fire your Data Scientists – They Aren’t Able to Add Value.
Although this statement is a bit extreme, the same situation may still exist. Data scientists frequently share frustrations such as:
- “I’m told our data is 60% accurate, which means I can’t trust any of it.”
- “We achieved our goal of an answer within a week by working 24 hours a day.”
- “Each quarter we manually prepare 300 slides to anticipate all questions the CFO may ask.”
- “Fred manually audits 10% of the invoices. When he is on holiday, we just don’t do the audit.”
This is why I think the original quote is so insightful. Value from data is not automatically delivered by hiring a statistician, analyst or data scientist. Even with the latest data mining technology, one person cannot positively influence a business without the proper data to support them.
Most organizations are unfamiliar with the structure required to deliver value from their data. New storage technologies will be introduced, and a variety of analytics tools will be tried and tested. This change is crucial to success. For statisticians to add value to a company, they must have access to high-quality data that is easily sourced and integrated. That data must be available through the latest analytics technology. This new ecosystem should provide insights that can shape future production. Staff will need to be trained, as this new data will be incorporated into daily decision making.
With a rich 20-year history, Informatica understands data ecosystems. Employees become wasted investments when they do not have access to the trusted data they need in order to deliver their true value.
Who wants to spend their time recreating data sets to find a nugget of value only to discover it can’t be implemented?
Build an analytical ecosystem with a balanced focus on all aspects of data management. Value delivery will then be limited only by the imagination of your employees. Rather than questioning the value of an analytics team, you will attract some of the best and brightest. Then you will finally be able to deliver on the promised value of your data.
One thing we all have in common in this modern world is that we have all, at some point in our lives, been on the receiving end of poor customer service.
Don’t get me wrong, a career in customer service is not an easy one, and I’m sure there are many service providers out there who have wrongly been on the receiving end of an angry customer for reasons out of the business’s hands; that’s another topic in itself. It is hard, however, to ignore that one thing companies often fail at heavily is providing a timely, easily accessible and appropriate level of service for their customers. (more…)
Some interesting news hit UK headlines last year that companies could be made to give the public greater access to their personal transaction data in an electronic, portable and machine-readable format. That’s if the midata project has anything to do with it.
Launched in April 2011, midata is part of the UK Government’s consumer empowerment strategy, Better Choices: Better Deals. Essentially, it is a partnership between government, consumer groups and major businesses. Its aim is to give consumers access to the data they produce, from household utilities and banking to internet transactions and high street loyalty cards. (more…)
Thomas Davenport, visiting professor at Harvard University and author of the watershed book Competing on Analytics, is once again making waves across the datasphere with his proclamation of data scientist as the “sexiest job of the 21st century.”
To many readers here at the Perspectives site, of course, this is not news: data professionals have increasingly been recognizing, and being recognized for, the power of information in driving new insights and business opportunities. (more…)
There are those who look at the emerging world of cloud computing as a trend toward efficiency. It gives them the ability to leverage resources using much more cost-effective models, where those resources are provisioned and shared among many consumers.
However, in the quest for efficiency, we often overlook the functionality of “the cloud”: the usefulness of placing core features in a centralized location, which provides better control and governance as well as efficiency. This leads to placing core enterprise data management services in the cloud, such as Master Data Management, or MDM. (more…)
In my previous post I discussed effective stakeholder management and communications as a key enabler of successful data quality delivery. In this blog, I will discuss the importance of demonstrated project management fundamentals.
Large-scale, complex enterprise Data Quality and Data Management efforts are characterized by numerous activities and tasks being performed iteratively by multiple resources, across multiple work streams, with high volume units of work (i.e. dozens of source systems and data objects, hundreds of tables, thousands of data elements, hundreds of thousands of data defects and millions of records). Without the means to effectively define, plan and manage these efforts, success is nearly impossible. (more…)
We have been looking at how data management issues can be classified, and in my last post I provided five categories but broke them down into two groups: Systemic and System. The systemic issues are ones in which process or management gaps allow data flaws to be introduced. A good example occurs when consumers of reports from the data warehouse insist that the data sets are incomplete, and the root cause is that the processes in which the data is initially collected or created do not comply with the downstream requirement for capturing the missing values. (more…)
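The kind of downstream completeness requirement described above can be checked mechanically before reports reach consumers. A minimal sketch in Python (the field names, sample records, and the 98% threshold are hypothetical, not taken from the post):

```python
# Measure how completely each required field is populated, so that
# gaps can be traced back to the upstream collection process.
# REQUIRED_FIELDS and the 0.98 threshold are illustrative assumptions.
REQUIRED_FIELDS = ["customer_id", "region", "order_date"]

def completeness_report(records):
    """Return, per required field, the fraction of records with a value."""
    total = len(records)
    report = {}
    for field in REQUIRED_FIELDS:
        populated = sum(1 for r in records if r.get(field) not in (None, ""))
        report[field] = populated / total if total else 0.0
    return report

# Hypothetical sample: one record is missing its region.
records = [
    {"customer_id": "C1", "region": "EMEA", "order_date": "2012-01-03"},
    {"customer_id": "C2", "region": "", "order_date": "2012-01-04"},
]

report = completeness_report(records)
failing = [f for f, frac in report.items() if frac < 0.98]
# "region" falls below the threshold, pointing at the upstream process
```

A check like this does not fix the systemic gap, but it turns a vague complaint ("the data sets are incomplete") into a specific, measurable requirement that can be fed back to the teams who collect the data.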
The banking sector has been through the mill over the past couple of years. Yet as the sector works through the aftermath of the economic turmoil and seeks to innovate with customer service initiatives, many banks are taking big risks with the potential loss of customer data.
Why? Well because adequate safeguards may not be in place to protect confidential data during the testing and development of new web-based services and applications. Worse though, keeping bank accounts secure is not the only risk they’re running – many may not be meeting the data privacy standards required by the regulators. (more…)
Reposted with permission from Shahid Shah’s healthcare IT, EMR, EHR, PHR, medical content, and document management advisory service. Enjoy.
Join me for a free webinar on “Understanding the Escalating Data Challenges of Meaningful Use” on Thursday, April 7th
I’ve been doing a good deal of coaching and consulting on what Meaningful Use really means to technology professionals lately so I was pleased to accept an invitation by Informatica to lead a webinar on that subject for a data management audience.
Data management professionals and the executives that they report to have now had enough time to learn how difficult meeting the escalating requirements for MU actually is; most are reporting that it’s been more work than they thought. Gone are the days when health systems thought they could just install a certified EHR and they would be able to meet the MU goals. Everyone now understands that even if they’re able to collect the measures required in the first phase of MU, the escalating data challenges of later phases will be more difficult. (more…)