The hype around big data is certainly top of mind for executives at most companies today, but what I am really seeing is companies finally making the connection between innovation and data. Data as a corporate asset is now getting the respect it deserves as part of a business strategy to introduce innovative new products and services and improve business operations. The most advanced companies have C-level executives responsible for delivering top- and bottom-line results by managing their data assets to their maximum potential. The Chief Data Officer and Chief Analytics Officer own this responsibility and report directly to the CEO.
Unfortunately, too many companies are taking a wait-and-see approach to implementing a big data strategy while their competitors gain the upper hand. Their perception is that big data is complex and that they cannot afford the risk. One reason for this perception is that hundreds of vendors claim to offer big data solutions, and numerous open-source projects are evolving so quickly that new versions and incubator projects launch every other week. Furthermore, once an organization proposes a big data strategy, it struggles to find the skilled resources to staff and implement it. Adding to the fear, analysts are now announcing that the big data hype is entering the trough of disillusionment as reports of failed big data projects pile up. So, where should you begin the big data journey, and how do you minimize the risk of failure?
What if you could remove the risk associated with emerging technologies and staff big data projects with the skills you have today? Informatica recently published a white paper titled “The Safe On-Ramp to Big Data” that describes how to lower costs, minimize risk, and innovate faster with a proven approach to big data. Big data projects can quickly consume the capacity of existing database and data warehouse appliance infrastructure. Moving the raw data and ETL processing to Hadoop can extend the capacity of those existing investments while lowering the cost of managing more data and more types of data. With Informatica PowerCenter Big Data Edition, you can staff big data projects with the same ETL development teams you have in place today; they do not need to learn specialized Hadoop skills such as Java MapReduce programming or Pig and Hive scripting. In other words, you minimize staffing risk because you do not have to seek out hard-to-find, more expensive developers with years of Hadoop experience. In fact, recent customer engagements have shown Informatica developers to be up to five times faster than hand-coding on Hadoop, which means big data projects can be completed sooner, thereby lowering costs.
My recommendation to organizations starting their big data journey is to use the tools you are familiar with today. Informatica is the safe on-ramp to big data because it lets you apply the PowerCenter skills you already have, with up to 5x the productivity of hand-coding, no matter what underlying technology you choose to deploy on. With Informatica you can map your data processing once and deploy it on Hadoop or on traditional grid computing platforms, either on-premises or in the cloud. This removes the risk of placing the wrong bet on an emerging technology. You don’t need to watch the big data phenomenon from the sidelines; you can get started today by leveraging your investment in Informatica. Join us at Informatica World 2013 and attend the Informatica for Hadoop session to learn how Informatica is the safe on-ramp to big data.