Blue Hill Research Links Dollars to Analysts’ Data Prep Time: Highlights from “Quantifying the True Value of Collaborative Data Prep” [Webinar Recap]
In our recent webinar “Quantifying the True Value of Collaborative Data Prep,” Blue Hill Research principal analyst James Haight and Informatica director of product management Andrew Comstock discussed the key findings and implications of the Blue Hill Research benchmark study.
First, we’d like to thank everyone who joined us for the live webinar. We appreciate all the comments, feedback, and questions. Special thanks to James and Andrew for engaging in this lively discussion and for sharing their perspectives and insights!
The most interesting takeaway? While they shared many intriguing tidbits, my favorite moment was when James attached actual dollar amounts to analysts’ data prep time.
Missed the webinar? No worries—you can catch the replay any time!
About the Blue Hill Research Benchmark Study
To better understand how analysts distribute their time and resources across various data preparation activities and tasks, James Haight surveyed 186 data analysts. To quantify the financial cost of data prep, his benchmark report explores:
- Macro trends influencing today’s world of enterprise data
- Methods analysts use for data preparation
- Time spent preparing data
- Time saved using dedicated data prep functionality
- Recommendations for more efficient data prep workflows
Discussion Highlights: Understanding the Role of Data Prep
As promised, James and Andrew discussed why the need for data prep is expanding and how that affects the enterprise.
- Who needs to prep data and why?
“Self-service access to frontend visualization/analytics tools is quickly increasing,” explains James. He spelled it out simply: “Yesterday’s ‘Big Data’ is today’s normal data.”
It’s not just data analysts and business analysts who need to work with data. More and more employees, including information workers and line-of-business managers, need data in order to perform their jobs.
James noted that accessible data prep offers tons of untapped opportunities for sales and marketing, in particular.
- How much time do analysts actually spend preparing data?
James’ short answer? “A lot!” (No doubt about that!)
His long answer?
The median response showed that data analysts spend anywhere from four to six hours a day working with data, while 15 percent spend at least eight hours a day.
James outlined how analysts access data, which data sources are the most popular, and which data sources are on the rise. He also described the additional factors decision-makers should consider when evaluating the opportunity costs that standalone or purpose-built data prep solutions offer.
- Bottom line: Do dedicated data prep solutions save time?
How much time? (Scoop up the exact details in his benchmark report.)
- What happens when you link dollars to data prep?
Based on his research, James linked dollars to the minimum time analysts spend preparing data. Even with a conservative estimate, the dollar amounts tied to time spent on data prep activities were absolutely mind-boggling.
Curious about how James quantified the annual costs of data preparation tasks? Catch the on-demand webinar to hear his surprising conclusions.
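James’s exact figures are in the report, but the back-of-the-envelope arithmetic is easy to reproduce yourself. Here is a minimal sketch; every number in it (hourly rate, team size, working days) is a hypothetical assumption for illustration, not a figure from the Blue Hill Research study:

```python
# Illustrative back-of-the-envelope cost of analysts' data prep time.
# All inputs below are hypothetical assumptions, not figures from the study.

HOURLY_RATE = 40.0        # assumed fully loaded cost per analyst-hour, USD
PREP_HOURS_PER_DAY = 4    # conservative end of the survey's 4-6 hour range
WORK_DAYS_PER_YEAR = 240  # assumed working days per year
NUM_ANALYSTS = 10         # size of a hypothetical analyst team

annual_cost = HOURLY_RATE * PREP_HOURS_PER_DAY * WORK_DAYS_PER_YEAR * NUM_ANALYSTS
print(f"Estimated annual data prep cost: ${annual_cost:,.0f}")
# With these assumptions: $384,000 per year for a ten-person team
```

Even at the conservative end of the reported range, the annual figure for a modest team runs well into six figures, which is why attaching dollars to prep time is so eye-opening.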
3 Tenets of Informatica’s Collaborative Data Preparation Solution
“All analysts are not equal,” says Andrew Comstock. “Some are data-savvy, while some are more business-savvy.”
Andrew introduced the three tenets of Informatica Data Prep that represent the spirit of our approach to data preparation.
Our data prep solution addresses a variety of skill levels. With its familiar spreadsheet-like environment, Informatica Data Prep serves non-technical analysts as well as technical analysts who can write advanced SQL scripts, procedures, and code to extract data.
Even analysts with minimal knowledge of SQL can use our intuitive drag-and-drop based visual query builder to create complex queries, without extensive training.
How can automated data prep tools help streamline everyday manual data management tasks?
In order to keep up with the ever-evolving ways data is produced, stored, and accessed, James recommends source-agnostic data prep systems and intuitive workflows that “augment capabilities across a spectrum of skills.”
Responding to this recommendation, Andrew described how Informatica’s Data Preparation application is source agnostic. Enterprise architecture teams can roll out next-generation technology with minimal change management and maximum adoption to realize the full ROI of their tech investment.
To provide more context, Andrew outlined three specific customer use cases showing how automated data prep can significantly reduce manual, time-consuming tasks such as standardizing, validating, enriching, reshaping, and reformatting datasets to make them usable and ready for analysis.
Informatica’s approach to data prep gives analysts a consistent experience, regardless of the underlying enterprise database technology, with:
- Direct connectivity to file sources
- Native connectivity to relational data sources
- Certified connectors for ODBC, NoSQL, cloud, and BI sources
In other words, analysts don’t have to jump from app to app to gather the data they need.
Data ingestion, exploration, and manipulation can happen in the same app: they can pull in data from different systems and applications; cleanse and reshape it; and then export it—all from one place, one interface.
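To make the “cleanse, reshape, export” steps concrete, here is what that routine looks like when done by hand. This is a generic pandas sketch, not Informatica’s product API; the dataset, column names, and rules are all hypothetical:

```python
import pandas as pd

# Hypothetical raw export: inconsistent casing, stray whitespace, missing values.
raw = pd.DataFrame({
    "region": [" east ", "West", "EAST", None],
    "sales":  [100, 250, None, 75],
})

# Standardize: trim whitespace and normalize casing.
raw["region"] = raw["region"].str.strip().str.title()

# Validate: drop rows missing required fields.
clean = raw.dropna(subset=["region", "sales"])

# Reshape: aggregate to one row per region.
summary = clean.groupby("region", as_index=False)["sales"].sum()

# Export: write the prepared dataset for the analytics tool.
summary.to_csv("sales_by_region.csv", index=False)
```

Even this toy example needs several distinct passes (standardize, validate, reshape, export); the point of a dedicated tool is to collapse that scripting into one guided interface.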
Finally, Andrew summarized the advanced data manipulation capabilities that enhance analysts’ productivity and effectiveness with their routine data prep tasks:
- Visual data profiling
- Formula-free, automatic data transformations
- Data federation
- Automated suggestions for how to improve or fix your dataset
Redefining Your Data Prep Routine
James summed up the conventional perception of data prep vividly:
Data prep time is like your daily commute.
People tend to view data prep as a part of doing business—it’s time built into your daily routine like your commute. It’s just something you have to do to get to where you need to be.
Challenging the status quo, James recommends moving from a commute-like mindset, where data prep time is simply accepted as part of the day, to a proactive one where that time is reallocated to value-added activities to get “more bang for your hour, more bang for your buck.”
So, what happens when you switch to a dedicated data prep solution and what does your company stand to gain?
What are the limitations and risks of retrofitting traditional methods of data prep for today’s modern data requirements?
How are companies balancing self-service demands with enterprise needs in terms of security and data governance?