In a recent webinar, Mark Smith, CEO of Ventana Research, and David Lyle, Vice President of Product Strategy at Informatica, discussed “Building the Business Case and Establishing the Fundamentals for Big Data Projects.” Mark pointed out that the second biggest barrier impeding big data initiatives is that the “business case is not strong enough.” The first and third barriers, respectively, were “lack of resources” and “no budget,” both of which also tie back to having a strong business case. In this context, Dave offered a simple formula from which to build the business case:
Return on Big Data = Value of Big Data / Cost of Big Data
By maximizing your return on big data, you essentially build the business case for big data projects. Dave then prescribed some solid fundamentals to remove the top barriers identified by Ventana Research and support the business case:
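The formula above can be sketched as a small calculation. The dollar figures below are hypothetical, used only to illustrate how the ratio behaves:

```python
# A minimal sketch of the "Return on Big Data" formula from the webinar.
# The value and cost figures used in the example are invented placeholders,
# not numbers from the webinar.

def return_on_big_data(value: float, cost: float) -> float:
    """Return on Big Data = Value of Big Data / Cost of Big Data."""
    if cost <= 0:
        raise ValueError("cost must be positive")
    return value / cost

# Example: a project estimated to deliver $1.5M in value at a $500K cost.
ratio = return_on_big_data(1_500_000, 500_000)
print(ratio)  # 3.0 -- every dollar spent returns three dollars of value
```

The point of the formula is that the ratio improves from both directions: raising the value delivered or driving down the cost side, which is exactly what the fundamentals below address.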
- To get started with big data, you need to get your BI/analytics and data warehouse resource and infrastructure costs under control, which can self-fund your first big data project. Begin by offloading unused or raw data to low-cost commodity hardware to extend the capacity of your data warehouse. Optimize ETL/ELT processing using Hadoop or traditional scale-out grid computing. Offload expensive mainframe processing using high-speed data replication to take the load off the mainframe and other expensive platforms.
- Select a first big data project that you have complete control over, where you can measure and demonstrate the value and minimize the risk of adopting new technologies like Hadoop. Choose a visual no-code development environment like PowerCenter, so you can quickly staff big data projects from over one hundred thousand Informatica-trained professionals worldwide. With the PowerCenter Big Data Edition you can design data integration flows once and deploy them wherever it is most cost-effective (e.g., Hadoop, traditional scale-out grid computing, or pushdown to databases). This “design once, deploy anywhere” capability minimizes the risk of adopting new technologies by hiding the underlying complexities of data integration, so that as new technologies emerge you don’t have to refactor your work.
- After you get your costs under control and demonstrate value with your first big data project, you will have an architectural foundation to support more big data projects. In other words, now that you can walk, it’s time to run, so you can maximize your return on big data and stay ahead of your competition. This requires a data integration platform that enables you to quickly onboard new data types (e.g., unstructured data, web logs, social data, sensor device data, industry standards), discover big data insights faster, and then operationalize those insights to scale as data volumes grow, while remaining flexible to change and easy to maintain.
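The self-funding argument in the first fundamental can be illustrated with a back-of-the-envelope cost model. All of the per-terabyte costs and project budget below are invented assumptions for illustration, not figures from the webinar:

```python
# Hypothetical cost model for the "self-funding" idea: savings from
# offloading cold data to low-cost commodity hardware can cover the cost
# of a first big data project. Every dollar figure here is an assumption.

warehouse_cost_per_tb = 25_000   # assumed annual cost per TB in the warehouse
commodity_cost_per_tb = 1_000    # assumed annual cost per TB on commodity hardware
cold_data_tb = 40                # assumed TB of unused/raw data eligible for offload

# Annual savings from moving cold data off the expensive platform.
annual_savings = cold_data_tb * (warehouse_cost_per_tb - commodity_cost_per_tb)

first_project_budget = 750_000   # assumed budget for a pilot big data project

print(f"Annual offload savings: ${annual_savings:,}")
print(f"Savings cover the first project: {annual_savings >= first_project_budget}")
```

Under these assumed numbers the offload savings alone exceed the pilot budget, which is the shape of the argument: cost control on the existing warehouse funds the first project without new budget.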
Dave also pointed out that now is a time for unhindered creativity and thinking outside the box. “If you could wave a magic wand,” imagine what you could do if you had access to any data and cost were no longer a major barrier. Ventana Research noted several related benefits: increased revenues and margins, greater speed, and cost savings. In summary, the big data fundamentals enabling a strong business case boil down to three steps: 1) lower and control data management costs by offloading data and processing to low-cost commodity hardware; 2) minimize the risk of adopting new technologies by using a no-code development environment with the ability to design once and deploy anywhere, and staff projects from more than 100,000 readily available trained Informatica developers; and 3) select a data integration platform that enables you to quickly onboard new data types, discover insights fast, and operationalize those insights to deliver real business value. This is a tall order, but Informatica provides a safe on-ramp to achieving breakthrough results with big data.