In the old days, data quality was a one-way process: IT would get access to the raw data, build lineage and mappings for it, then pass it to a business analyst for analysis. Often the analyst would find that the data she received was not up to the required standard, so she ended up spending a great deal of time checking and correcting errors in the data before she could play any analytic magic with it.
The downside of this process? Poor operational efficiency, delayed time-to-results, and unhappy employees (analysts in particular), just to name a few. In the end, the business is still left with poor-quality data that provides little value to the company.
As an analyst in my previous career, I spent days, weeks, sometimes months cleaning and standardizing the data I gathered before I could create reports and build charts to explain what the data meant — which, by the way, was the fun part of my gig. Those late-night crunches left long memories. Cleaning data by hand was probably the least enjoyable task I had as an analyst.
But I didn’t know better then, not until I came to Informatica. I learned that there are software tools that can automate the data quality process and significantly reduce the amount of manual work. Most importantly, with the help of those tools, you get to work with clean, relevant data, and you can feel confident relying on that data to make important business decisions. But the best part of my learning was the concept of a role-based data quality process, identified by Informatica and implemented in Data Quality 9.6, the latest release of its Data Quality product family.
In a nutshell, the role-based process works like this:
First, the analyst and IT examine the raw data together to understand what’s in it. They spend time on tasks such as identifying relationships in the data and discovering data domains, to make sure the condition of the raw data meets their requirements. After this first pass, the analyst comes up with a set of business rules based on the objectives she wants to achieve — the rules that will help identify trends and patterns in the data so proper business decisions can be made. To apply those rules in the actual workflow, the analyst needs help from her peers in IT, who implement the rules in software so the process can run automatically. Once IT completes the implementation, the data quality process can be executed without manual intervention. However, no data is perfect and no set of rules can capture every scenario: when an exception occurs, the analyst needs to be alerted and decide what to do with the anomaly. Finally, the data quality process can’t run in the dark — the analyst needs to be able to monitor and measure the effectiveness of the rules she created, both proactively and after the fact for compliance purposes.
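To make the middle steps concrete, here is a minimal sketch in plain Python of what an implemented rule set with exception handling might look like. The rule names, field names, and record layout are all hypothetical illustrations, not Informatica Data Quality APIs: analyst-defined rules are applied to every record automatically, and any record that fails a rule is routed to an exception list for human review.

```python
import re

# Hypothetical analyst-defined business rules: name -> predicate over a record.
RULES = {
    "email_format": lambda r: re.fullmatch(
        r"[^@\s]+@[^@\s]+\.[^@\s]+", r.get("email", "")) is not None,
    "revenue_non_negative": lambda r: isinstance(
        r.get("revenue"), (int, float)) and r["revenue"] >= 0,
}

def run_quality_checks(records):
    """Apply every rule to every record; split into clean rows and exceptions."""
    clean, exceptions = [], []
    for record in records:
        failed = [name for name, rule in RULES.items() if not rule(record)]
        if failed:
            # Anomalies are not silently dropped -- they are flagged for
            # the analyst to review, as the process above describes.
            exceptions.append({"record": record, "failed_rules": failed})
        else:
            clean.append(record)
    return clean, exceptions

records = [
    {"email": "ana@example.com", "revenue": 1200},
    {"email": "not-an-email", "revenue": -5},
]
clean, exceptions = run_quality_checks(records)
print(len(clean), len(exceptions))    # 1 1
print(exceptions[0]["failed_rules"])  # ['email_format', 'revenue_non_negative']
```

Counting passes and failures per rule over time is also the raw material for the monitoring step: the same tallies become the effectiveness metrics the analyst reviews.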
Sounds picture-perfect but rather complicated to implement, right? The good news is that Informatica can now deliver many of the capabilities described above to organizations facing data quality challenges. The latest release, Data Quality 9.6, is built around this role-based concept and offers many new capabilities for business users and analysts so they can easily collaborate with their IT peers to implement a holistic data quality process in their organization.
For the first time, role-play (no pun intended) makes data quality sexy and fun. With Informatica Data Quality 9.6, turning your raw data into clean, trusted assets is no longer a resource-intensive, draining process. I invite you to join us at a webinar on June 10, where we will present an in-depth demonstration of the new capabilities in Informatica Data Quality 9.6, built to enable holistic data stewardship for your company.