Tag Archives: audit
Data volumes are exploding. We see it all around us. The problem is that too much data can have a very negative impact on user productivity. Think about how long it takes to sift through emails after returning from vacation, or to complete a purchase on an e-commerce site on Black Friday. The more data there is, the longer these processes take, and the more time is spent combing through it. Through our partnership with Symantec, Informatica has been successfully working with our customers to help them find ways to control the impact of ‘too much data’. We are helping them define projects that improve their ability to meet SLAs and application performance, reduce costs and mitigate compliance risks – all while IT budgets remain relatively flat. (more…)
In a recent InformationWeek blog, “Big Data A Big Backup Challenge”, George Crump aptly pointed out the problems of backing up big data and outlined some best practices that should be applied to address them, including:
- Identifying which data can be re-derived and therefore doesn’t need to be backed up
- Eliminating redundancy through file de-duplication and applying data compression
- Using storage tiering and a combination of online disk and tape to reduce storage cost and optimize performance (more…)
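The de-duplication practice in the list above can be sketched with a simple content-hash scan. This is a toy illustration assuming a local directory tree; real backup products typically de-duplicate at the block level with far more robust handling:

```python
import hashlib
import os

def find_duplicates(root):
    """Group files under `root` by content hash; any group of 2+ files
    is redundant data that need not be backed up more than once."""
    by_hash = {}
    for dirpath, _, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            by_hash.setdefault(digest, []).append(path)
    # Keep only hashes that appear more than once.
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}
```

Hashing whole files is fine for a sketch; at scale you would compare file sizes first and hash lazily to avoid reading every byte.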
“The first step in fixing a problem is to measure the size of the problem.” Agreed! Easy. The next step is harder – how to sustain high quality data and generate ongoing business value.
Measuring data quality became possible with the advent of profiling and scorecarding features as standard functionality within enterprise data quality products. By using a combination of data quality rules, reference data and technology, you can create a report that lists the percentage of records with completeness, conformity, consistency, duplicate and accuracy issues. This process is called a data quality audit. The ongoing process of scorecarding is called monitoring.
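As an illustration, a bare-bones version of such an audit can be sketched in a few lines of Python. The records, field names and rules here are hypothetical, not the functionality of any particular data quality product:

```python
import re

# Hypothetical customer records with deliberate quality problems:
# a malformed email, a missing phone, and an exact duplicate.
records = [
    {"name": "Ann Lee", "email": "ann@example.com", "phone": "555-0101"},
    {"name": "Bob Ray", "email": "bob@example",     "phone": ""},
    {"name": "Ann Lee", "email": "ann@example.com", "phone": "555-0101"},
]

def completeness(recs, field):
    """Percentage of records where the field is populated."""
    filled = sum(1 for r in recs if r.get(field, "").strip())
    return 100.0 * filled / len(recs)

def conformity(recs, field, pattern):
    """Percentage of populated values matching an expected format."""
    values = [r[field] for r in recs if r.get(field, "").strip()]
    ok = sum(1 for v in values if re.fullmatch(pattern, v))
    return 100.0 * ok / len(values) if values else 100.0

def duplicate_rate(recs):
    """Percentage of records that exactly duplicate an earlier record."""
    seen, dupes = set(), 0
    for r in recs:
        key = tuple(sorted(r.items()))
        dupes += key in seen
        seen.add(key)
    return 100.0 * dupes / len(recs)

scorecard = {
    "phone completeness": completeness(records, "phone"),
    "email conformity": conformity(records, "email", r"[^@\s]+@[^@\s]+\.[^@\s]+"),
    "duplicate rate": duplicate_rate(records),
}
for metric, pct in scorecard.items():
    print(f"{metric}: {pct:.0f}%")
```

Running the same scorecard on a schedule, and tracking the numbers over time, is essentially what the monitoring process described above does.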
The next question is – what do we do now? We have the DQ metrics. How do we get the business and IT motivated to sustain high quality data? (more…)
I read an article the other day called “The 9 Dirty Little Secrets of CRM”. As a user of CRM systems for many years, including salesforce.com (which I rate highly), I agree with many of the statements around the secrets of successful CRM. First, it’s all about the data! If high quality data is entered and data quality processes are in place to maintain the data, then user adoption will grow and the CRM implementation will be perceived to be a success.
The first step is a data quality audit/assessment, which identifies data quality issues associated with the key data fields that drive your processes. Let’s assume a data quality audit results in a score of 65% – this means that your CRM processes will be only 65% effective due to the quality of the data. So if the CRM application is to support “attracting and retaining” customers and “increasing revenues via cross-selling”, management needs to be very aware of the impact low quality data has on the effectiveness of the CRM processes. The data quality metrics are based on defining data quality dimensions – including completeness, conformity, consistency, duplicates and accuracy – and applying these dimensions to the data fields. So, for example, a customer or prospect record includes the name, address, email and telephone number fields. Each field can be audited using several dimensions. The results are aggregated into the overall score – in this case 65% – which for many is unacceptable. (more…)
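The aggregation described above can be sketched as follows. The fields, dimension scores and equal weighting are all hypothetical, chosen only so the result lands at the 65% used in the example; a real audit would derive the scores from the data and might weight business-critical fields more heavily:

```python
# Hypothetical per-field scores (in %) for three quality dimensions.
field_scores = {
    "name":      {"completeness": 90, "conformity": 85, "accuracy": 70},
    "address":   {"completeness": 60, "conformity": 55, "accuracy": 50},
    "email":     {"completeness": 70, "conformity": 60, "accuracy": 55},
    "telephone": {"completeness": 65, "conformity": 60, "accuracy": 55},
}

# Average each field across its dimensions, then average across fields.
per_field = {f: sum(d.values()) / len(d) for f, d in field_scores.items()}
overall = sum(per_field.values()) / len(per_field)
print(f"Overall data quality score: {overall:.0f}%")  # → 65%
```

The overall number is easy to communicate, but the per-field breakdown is what tells you where to start fixing things – here, the address field drags the score down the most.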