Next Generation Analytics Strategies – Intelligently Disrupt or Become Disrupted

When I was asked to present on Next Generation Analytics strategies at Informatica World 2018, my first thought was about our customers and why they had selected Informatica for their analytics strategy. One of the major reasons companies invest in the Informatica Intelligent Data Platform is to increase the productivity and efficiency of their scarce data professionals: data engineers, data analysts, data scientists and data stewards. This shortage of data management and analytics skills is a big problem at many companies. In a Forbes Insights and Dun & Bradstreet study surveying 300 senior executives, "27% cited skills gaps as a major blocker to their current data and analytics efforts," and "24% cited data quality and accuracy as a major obstacle to the success of their analytics efforts."

Which leads me to the first strategy for Next Generation Analytics:

Invest in a data management platform having an enterprise unified metadata foundation powered by artificial intelligence (AI) and machine learning (ML).

That sounds like a mouthful, but it is essentially the secret sauce behind the intelligence in Informatica’s Intelligent Data Platform – what we refer to as the CLAIRE™ engine. AI/ML is used in every industry today to automate business processes, to recommend and guide user behavior, and to deliver new products and services that engage and delight customers. The same is true in the world of analytics and data management. The only way to scale data management operations to meet the needs of the 21st century is through intelligent automation. In the past, process automation was mostly based on business rules encoded in deterministic business logic and middleware. The challenges faced today, however, are far more complex and dynamic, and therefore require AI and ML algorithms applied to massive amounts of metadata to automate the data pipeline effectively and to guide knowledge workers with intelligent recommendations. A data management platform built from the ground up on enterprise unified metadata powered by AI/ML is the only way to handle this volume of work without hiring an army of scarce and expensive data professionals.
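To make the idea concrete, here is a toy sketch of one metadata-driven recommendation: matching a new column name against a vocabulary of known semantic tags so a tool can suggest a classification instead of asking a human to do it from scratch. This is purely illustrative – the tag list, column names and threshold are invented, and it is not how the CLAIRE engine works internally:

```python
from difflib import SequenceMatcher

# Hypothetical vocabulary of semantic tags curated from existing metadata.
KNOWN_TAGS = {
    "customer_email": "Email Address",
    "cust_phone_num": "Phone Number",
    "patient_dob":    "Date of Birth",
    "acct_balance":   "Account Balance",
}

def suggest_tag(column_name: str, threshold: float = 0.7):
    """Recommend a semantic tag for a new column based on name similarity."""
    best_tag, best_score = None, 0.0
    for known_name, tag in KNOWN_TAGS.items():
        score = SequenceMatcher(None, column_name.lower(), known_name).ratio()
        if score > best_score:
            best_tag, best_score = tag, score
    # Only auto-suggest when the match is strong enough; otherwise defer.
    return (best_tag, best_score) if best_score >= threshold else (None, best_score)

print(suggest_tag("customer_e_mail"))  # ('Email Address', ~0.97) -> auto-tag
print(suggest_tag("dob_patient"))      # (None, ~0.64) -> route to a data steward
```

Multiply this simple pattern by millions of columns and files, add learning from how stewards confirm or reject suggestions, and the case for an AI/ML metadata foundation becomes clear.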

At Informatica we’re focused on enabling customers to overcome their most complex data challenges. So here are five more strategies that our customers have adopted to empower their Next Generation Analytics journey.

Data cataloging is the first step

Before tackling any project, it’s always prudent to first take inventory of what’s available; this helps you plan and execute toward a timely and efficient project completion. It’s now common knowledge that, unfortunately, data scientists and analysts can spend as much as 80% of their time just looking for the data they need for an analytics project. Imagine a data analyst at a life sciences or healthcare company working to build an analytic model to improve patient outcomes. There are thousands of possible data sets across the enterprise, ranging from clinical data and electronic medical records (EMR) to genomics, claims, billing, patient forums, call detail records, HL7 data and much more. Where does one begin? An intelligent data catalog powered by AI/ML can help data scientists and analysts find the data they need and can recommend related data sets. The data catalog also facilitates collaboration among analytics teams, helping curate the data so it improves in quality and value over time.
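As a simple illustration of the "find" half of that problem, here is a minimal keyword-search sketch over a toy catalog. The data set names and descriptions are invented, and a real intelligent catalog ranks far more cleverly (lineage, usage statistics, ML-inferred similarity) – this just shows the basic shape of catalog search:

```python
# Toy catalog: data set name -> free-text description (all entries hypothetical).
CATALOG = {
    "emr_encounters":   "patient electronic medical record encounters diagnoses",
    "claims_2017":      "insurance claims billing codes patient outcomes",
    "genomics_panel_a": "genomic variant calls sequencing panel",
    "call_detail_recs": "call detail records support center patient forums",
}

def search_catalog(query: str, top_n: int = 3):
    """Rank data sets by how many query terms appear in their description."""
    terms = set(query.lower().split())
    scored = []
    for name, description in CATALOG.items():
        score = len(terms & set(description.split()))
        if score:
            scored.append((score, name))
    return [name for score, name in sorted(scored, reverse=True)[:top_n]]

print(search_catalog("patient outcomes claims"))
# ['claims_2017', 'emr_encounters', 'call_detail_recs']
```

Even this crude ranking puts the claims data set first for an outcomes analysis – the analyst starts from a shortlist instead of a blank slate.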

Optimize the data pipeline for Big Data

Consider the complexities of modern industrial supply chains – for example, all the people, processes and steps involved in extracting crude oil, then transporting, refining and delivering gasoline to the pump. Or consider an automobile and all the raw materials, machined and electrical component parts, sub-assemblies, documentation, quality and safety checks, logistics and more involved in the supply chain that delivers a new car to the dealership. Today’s data pipelines are even more complex: they deal with hundreds of source systems (on-premises and in the Cloud) and thousands of sensors and machines while collecting, integrating, cleansing, preparing, relating, protecting and delivering trusted data to consuming applications and into the hands of business and analytics users. Just as competitive battles in manufacturing were won by the most efficient supply chains, innovative businesses can only achieve their analytics goals by optimizing the entire data pipeline for Big Data workloads running in hybrid (multi-cloud and on-premises) environments.
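To ground the supply-chain analogy, here is a minimal sketch of a data pipeline as composable stages. The stage logic and sample records are invented for illustration; a production pipeline adds orchestration, monitoring and pushdown of this work to Big Data engines:

```python
from functools import reduce

def collect(_):
    """Stage 1: pull raw records from a source (stubbed with sample data)."""
    return [{"id": 1, "temp": " 98.6"}, {"id": 2, "temp": None}, {"id": 1, "temp": "98.6"}]

def cleanse(records):
    """Stage 2: drop records with missing values, normalize formats."""
    return [{**r, "temp": float(r["temp"])} for r in records if r["temp"] is not None]

def deduplicate(records):
    """Stage 3: keep one record per id."""
    return list({r["id"]: r for r in records}.values())

def deliver(records):
    """Stage 4: hand trusted data to consumers (stubbed as a return)."""
    return records

PIPELINE = [collect, cleanse, deduplicate, deliver]

def run(pipeline, payload=None):
    """Thread the payload through each stage in order."""
    return reduce(lambda data, stage: stage(data), pipeline, payload)

print(run(PIPELINE))  # [{'id': 1, 'temp': 98.6}]
```

Optimizing the pipeline means optimizing every one of these stages – and the handoffs between them – not just the final delivery step.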

End-to-end collaborative data governance

Enterprise data is now one of a company’s most critical assets. Therefore, data must be managed and governed like any other valuable asset, especially in regulated industries and global markets. This means data management policies must be well defined and adhered to, with validation of data usage, quality, privacy and compliance (e.g. GDPR, HIPAA, SOX). Data governance must also be frictionless, with no barriers that slow down the business. This is best achieved through executive-sponsored data governance initiatives and an intelligent, integrated data platform that supports the necessary collaboration and the end-to-end governance processes that connect policies with operations.
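As a tiny illustration of connecting policy to operations, here is a sketch of a machine-checkable data quality policy. The fields, rules and sample rows are invented; real governance tools tie such checks to business glossaries and stewardship workflows rather than hard-coded dictionaries:

```python
# Hypothetical policy: completeness and allowed-value rules per field.
POLICY = {
    "country": {"required": True, "allowed": {"US", "DE", "JP"}},
    "email":   {"required": True, "allowed": None},
}

def audit(records, policy):
    """Return a list of human-readable policy violations."""
    violations = []
    for i, record in enumerate(records):
        for field, rules in policy.items():
            value = record.get(field)
            if rules["required"] and value in (None, ""):
                violations.append(f"row {i}: missing required field '{field}'")
            elif rules["allowed"] and value not in rules["allowed"]:
                violations.append(f"row {i}: '{value}' not allowed for '{field}'")
    return violations

rows = [{"country": "US", "email": "a@x.com"}, {"country": "XX", "email": ""}]
for v in audit(rows, POLICY):
    print(v)
# row 1: 'XX' not allowed for 'country'
# row 1: missing required field 'email'
```

The point is that a written policy only creates friction when humans must enforce it by hand; encoded as executable rules, it runs silently inside the pipeline.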

Build in data privacy and protection

The only way to democratize your company’s data (i.e. make it readily available) for analytics projects while complying with industry regulations is to ensure the data is secure and protected. Whatever methods you choose must pass the scrutiny of your Chief Information Security Officer (CISO). How do you avoid over-spending to comply with regulations, or under-spending and putting your company at risk of a data breach, bad publicity and stiff fines? Companies are investing in risk-based data security intelligence tools to map where all their sensitive data resides, how it propagates across systems and regions, and who is accessing it and from where. When this information is combined with data privacy policies, risk can be assessed and remediated through more cost-effective data protection strategies (e.g. data masking and encryption).
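For a flavor of what masking can look like, here is a minimal sketch of two common techniques: partial masking for display and irreversible tokenization for analytics. The field choices and salt are invented for illustration; production masking is policy-driven, format-preserving and centrally key-managed:

```python
import hashlib

def mask_email(email: str) -> str:
    """Partially mask an email, keeping the first character and the domain."""
    local, _, domain = email.partition("@")
    return f"{local[:1]}***@{domain}"

def tokenize_ssn(ssn: str, salt: str = "demo-salt") -> str:
    """Replace an SSN with an irreversible salted hash token."""
    return hashlib.sha256((salt + ssn).encode()).hexdigest()[:12]

record = {"name": "Jane Doe", "email": "jane.doe@example.com", "ssn": "123-45-6789"}
protected = {
    **record,
    "email": mask_email(record["email"]),
    "ssn": tokenize_ssn(record["ssn"]),
}
print(protected["email"])  # j***@example.com
```

Tokenized values still join and aggregate consistently, so analysts can work with the data while the raw identifiers never leave the protected zone.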

Architect for Cloud and hybrid

More and more analytics projects are moving to the Cloud. The reasons are many and include operational efficiency, scalability, simplicity, security, flexibility and many other ’-ilities. Data management and analytics platforms must inherently support Cloud and hybrid infrastructures, or they’ll be obsolete before the ink dries on the license agreement. The State of Cloud Analytics report from EMA, Deloitte and Informatica found that “60.1% of enterprises are relying on hybrid and public clouds as the platforms to enable Big Data Analytics, leading all other current analytics initiatives planned by respondents.” Our customers require data management operations that can run in multi-cloud environments, and Informatica offers the most comprehensive coverage in trusted enterprise cloud data management, spanning security, compliance and privacy certifications (e.g. SOC 2, SOC 3, ISO, HIPAA, CSA).

Hopefully I’ve given you a few strategies to think about as you embark on your Next Generation Analytics journey. If you’d like to learn more about how Informatica’s solutions relate to these strategies, I encourage you to attend my session, Next Generation Analytics Strategies (NGA102), at Informatica World 2018 on Tuesday, May 22 at 1:40pm. And if you haven’t registered yet for Informatica World 2018 at the Venetian in Las Vegas – what are you waiting for?
