Exploit Perishable Insights with Informatica’s Intelligent Streaming Solution
With the number of connected devices expected to reach 50 billion by 2020, the volume of data being generated is enormous. These devices are changing lives, and the nature of data itself. They give businesses near real-time visibility into how their products are being used. Organizations need to identify and act on this data in real time, or the latency will make the insights perishable and they will lose out to competitors.
What are perishable insights?
Perishable insights can provide exponentially more value than traditional analytics, but that value evaporates once the moment is gone.
Sometimes one minute later is too late. How do you quickly process, analyze, and act on data? What opportunities are you missing? It is not cost-effective to store all data, especially data that is low-value or not yet deemed valuable (noise). But it is highly valuable to inspect and analyze all of it, to separate the signal from the noise and determine what needs to be persisted. You can still find the signal after the fact through offline analysis, but by then you have lost the chance to act in the moment.
How Informatica Data Engineering Streaming helps
In the latest release, Informatica Data Engineering Streaming 10.4, we have added the following features and capabilities to help data engineers sense, reason, and act on trusted real-time data for actionable insights while the data is still being generated.
Data quality transformations in streaming: Data Engineering Streaming (DES) now supports data quality transformations on real-time streaming data, ensuring that only good-quality data is ingested into the data lake for batch and stream processing. This helps drive industry use cases such as targeted marketing campaigns in retail, predictive maintenance in manufacturing, fraud detection in banking, and clinical research optimization in healthcare.
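To make the idea concrete, here is a minimal, generic sketch of stream-time data quality checks. The field names, rules, and functions are hypothetical illustrations, not Informatica's actual Data Quality transformations, which are configured in DES rather than hand-coded.

```python
# Hypothetical stand-in for stream-time data quality rules: standardize each
# record, then validate it, so only clean data reaches the data lake.
import re

def standardize(record):
    """Normalize fields before ingestion (trim/lowercase email, round amounts)."""
    record = dict(record)
    record["email"] = record.get("email", "").strip().lower()
    record["amount"] = round(float(record.get("amount", 0)), 2)
    return record

def is_valid(record):
    """Reject records that would pollute downstream analytics."""
    email_ok = bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", record["email"]))
    return email_ok and record["amount"] >= 0

def clean_stream(records):
    """Yield only standardized, valid records; drop (or dead-letter) the rest."""
    for rec in records:
        rec = standardize(rec)
        if is_valid(rec):
            yield rec

events = [
    {"email": " Alice@Example.COM ", "amount": "19.999"},
    {"email": "not-an-email", "amount": "5"},
]
print(list(clean_stream(events)))  # only the standardized Alice record survives
```

In a real deployment the same rules run continuously on the stream, so invalid records are caught before they reach batch or stream consumers rather than cleaned up afterward.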
Azure Databricks and Databricks Delta target support in streaming: Customers can now run streaming jobs on Azure Databricks with Databricks Delta as the target. This lets you build and configure streaming data pipelines with Spark Structured Streaming and store the data in Databricks Delta.
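For readers unfamiliar with the underlying pattern, this is roughly what a Spark Structured Streaming pipeline writing to a Delta table looks like. The broker address, topic name, and paths below are hypothetical placeholders; in DES you design the mapping visually rather than writing this code, and the sketch needs a Spark cluster with the Kafka and Delta connectors to run.

```python
# Illustrative Spark Structured Streaming sketch: read from Kafka, write to Delta.
# Broker, topic, and paths are made-up examples, not Informatica defaults.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("stream-to-delta").getOrCreate()

events = (spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "orders")                     # hypothetical topic
    .load())

(events.selectExpr("CAST(value AS STRING) AS payload")
    .writeStream
    .format("delta")
    .option("checkpointLocation", "/delta/_checkpoints/orders")  # exactly-once bookkeeping
    .start("/delta/orders"))                                     # Delta table path
```

The checkpoint location is what gives the pipeline restartability: on failure, Spark resumes from the last committed Kafka offsets instead of reprocessing or dropping data.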
Confluent Schema Registry support: This helps customers process complex hierarchical data from Kafka. Customers can also use DES with SSL-enabled Kafka. DES also supports advanced parsing of industry-standard message formats such as HL7 through integration with Informatica Intelligent Structure Discovery (ISD).
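Under the hood, Confluent Schema Registry support rests on a simple wire format: each Kafka message is prefixed with a magic byte (0) and a 4-byte big-endian schema ID that the consumer uses to look up the schema. The sketch below illustrates just that framing; a real client fetches the schema from the registry over HTTP and deserializes the (typically Avro) payload, which is omitted here.

```python
# Sketch of the Confluent Schema Registry wire format:
# 1 magic byte (0) + 4-byte big-endian schema ID + serialized payload.
import struct

MAGIC_BYTE = 0

def encode(schema_id: int, payload: bytes) -> bytes:
    """Frame a payload the way Confluent serializers do."""
    return struct.pack(">bI", MAGIC_BYTE, schema_id) + payload

def decode(message: bytes):
    """Split a framed message back into (schema_id, payload)."""
    magic, schema_id = struct.unpack(">bI", message[:5])
    if magic != MAGIC_BYTE:
        raise ValueError("not a Confluent-framed message")
    return schema_id, message[5:]

msg = encode(42, b'{"event": "order_created"}')
print(decode(msg))  # (42, b'{"event": "order_created"}')
```

Because the schema travels by ID rather than inline, producers can evolve schemas centrally while consumers resolve the right version per message.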
Enhanced connectivity across AWS and Microsoft Azure: DES includes enhanced connectivity to cloud ecosystems, including ADLS Gen2, Databricks Delta, and Snowflake. Customers can now run streaming jobs on ephemeral clusters in the cloud.
Governance for streaming data flow pipelines: Govern your mission-critical streaming pipelines with end-to-end lineage for DES mapping in Informatica Enterprise Data Catalog.
Enhanced support for change data capture (CDC): Customers can now capture change data from transactional systems using CDC streaming techniques and run machine learning algorithms on it to detect potential fraud or drive personalized real-time offers.
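The essence of consuming CDC data is applying a stream of change events (inserts, updates, deletes) to a materialized current-state view that downstream models can score. The event shape and field names below are hypothetical, simplified stand-ins for what a CDC feed actually emits.

```python
# Hypothetical sketch: replay CDC events into an in-memory current-state table.
def apply_change(state, event):
    """Apply one change record (op + key + row) to the materialized state."""
    op, key = event["op"], event["key"]
    if op in ("insert", "update"):
        state[key] = event["row"]
    elif op == "delete":
        state.pop(key, None)
    return state

changes = [
    {"op": "insert", "key": 1, "row": {"balance": 100}},
    {"op": "update", "key": 1, "row": {"balance": 40}},
    {"op": "delete", "key": 1},
]
state = {}
for ev in changes:
    apply_change(state, ev)
print(state)  # {} -- the row was inserted, updated, then deleted
```

A fraud model or offer engine would score each intermediate state as events arrive, which is exactly why latency matters: the account that was just drained is only interesting while the transaction is still in flight.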
To learn more about the new DES features and see a demo, view the webinar, download the data sheet, and visit our website.