Tag Archives: Real-Time
Ah yes. Las Vegas in May… not a bad place to be, unless you are planning to recreate the latest Hangover Movie.
Informatica World happens to be there May 11-15, 2015, and once again I’ll offer my predictions (aka the Morning Line) for your BEST viewing and learning opportunities. If you see all of these, who knows, you just may win the EXACTA for real-time prowess. These sessions will appeal to you especially if you fancy the latest in real-time big data integration.
So without further ado, here are your best bets:
May 12, 1:30 – 2:30 Room: Gracia 2
How to Support Real-Time Data Integration Projects with PowerCenter
James Chidichimo, Vice President of Information Technology, PRA Group
Brandon Wyckoff, Data Warehouse Administrator, PRA Group
Raj Khot, IT Manager – Application/Data Management, Integration and BI, Grant Thornton LLP
Lokesh Krishnamoorthy, Project Lead, IT Data Integration, Grant Thornton LLP
Terry Simonds, Senior Director, Business Development, Informatica
Informatica recently released PowerCenter v9.6, which is designed to deliver end-to-end data integration agility in support of business agility. A big part of business agility is ensuring that your IT organization can make the most current and trustworthy data available, in whatever form necessary, regardless of where it resides, at the right time.
- Maybe you want to do more real-time data warehousing by shortening your batch windows
- Maybe you want to synchronize all operational and transactional systems with consistent and accurate information
- Maybe you want to execute intraday batches to augment overnight batch processing in your data warehousing projects
- Maybe you want to better integrate the data layer with an application integration layer (e.g., EAI, SOA)
In this session, we will discuss and demo how you can leverage your existing Informatica skills and PowerCenter’s real-time capabilities to enable change data capture (CDC), integration with messaging systems, and Web services on a single platform. The goal: increase the speed of accessing and delivering data, mix real-time operational data with historical data, and capture changes in near real time, non-invasively and without impacting source systems, to support IT projects including real-time data warehousing, data synchronization, data replication, operational data hubs, and more.
May 14, 10:10 – 11:10 Room: Gracia 1
Big Data Real Time Streaming Analytics Session
Scott Hagan, Principal Product Manager, Informatica
Terry Simonds, Senior Director, Business Development, Informatica
Amrish Thakkar, Principal Product Manager, Informatica
One of the more interesting and advanced use cases of Informatica has been in the area of streaming and predictive analytics for Big Data.
In this session, you will hear from experts who have implemented a streaming analytics application that predicts when jet-engine spare parts will need service, before they actually fail.
Don’t miss this interactive session, which will feature discussions of real-time best practices and tips and tricks, from design to implementation.
May 14, 2:30 – 3:30 Room: Gracia 6
Accelerating Business with Near Real-Time Architectures
Deepak Gattala, Solution Architect, Dell Inc.
Hari Kalla, Principal Consultant, SSG Limited
This session will describe near real-time architectures for accelerating the delivery of data to critical analytics and customer service applications.
Architects from Dell and SSG will talk about how they are using Vibe Data Stream and other technologies to build the architectures that bring near real-time data to key applications and analytics use cases.
The business benefits of these approaches include more timely business decisions and more effective customer support.
May 13, 2:00 – 2:45 Room: Gracia 3
Building a Real-time Data Platform for Improved Customer Engagement
Chris Hammond, Project Director, British Telecom
Terry Britt, Senior DBA, Camping World
Luc Clement, Director Product Management, Informatica Cloud
Every company says that customers are their #1 priority, but how do you use the data throughout your organization to go the extra mile? Join British Telecom and Camping World as they discuss their different approaches to improving the customer experience.
British Telecom decided to consolidate customer data from seven different Salesforce systems in real time to improve customer response times.
Camping World forged a strategy around customer satisfaction to support a dynamic B2C retail business.
You’ll also hear how both companies implemented elegant and successful customer service solutions with Informatica Cloud. This is sure to be a fascinating session with lots of practical advice on how to improve your customer experience with real-time app integration and customer analytics to deliver a 360-degree view of every customer interaction
May 14, 11:20 – 12:20 Room: Gracia 8
Enabling Digital Transformation with Real-Time Master Data Management
Sanjeev Gupta, CIO, Cover-More Group
Cover-More has embarked on the development of a customer-centric strategy. A core component of the strategy is the use of digital channels as the key mechanism to understand customers and prospects and build deeper relationships with them.
Success in the digital channel depends on having good-quality customer information and a 360-degree view of the customer. Currently, Cover-More faces several challenges, including information scattered across disparate sources and customer information held in silos. With a mandate to rapidly enable this strategy, Cover-More has partnered with Informatica and Capgemini to implement an integrated real-time master data management solution that will eliminate duplicate records at the time of onboarding, integrate in real time with the online channel, and create a more seamless experience for customers.
The Big Data Journey: Traditional BI to Next Gen Analytics – Customer Panel
- Deepika Sinha, IT Manager Enterprise Business Solutions, Johnson & Johnson
- Dave Beaudoin, VP Data Architecture, Transamerica
- Christine Miesner, Manager E&P Data Management, Devon Energy
- Thomas Reichel, Lead Architect, KPN
May 12, 3:30 – 4:30 Room: Gracia 6
Modernize Your Application Architecture and Boost Your Business Agility
Tom Kato, CEO, Mototak LLC
Rick Mutsaers, Senior Consultant, Informatica
Big application projects present a chance to take a fresh look at your overall application and data architecture. Done right, application consolidation, application migration, and M&A integration offer the chance to rationalize your architecture, reducing your IT costs and speeding up future project delivery through a simplified environment.
We will talk about approaches to these complex projects as well as best practices to lower your risk and increase productivity.
So enjoy the conference and hope to see (or meet) you there.
When the average person hears of cloning, my bet is that they think of the controversy and ethical issues surrounding cloning, such as the cloning of Dolly the sheep, or the possible cloning of humans by a mad geneticist in a rogue nation state. I would also put money down that when an Informatica blog reader thinks of cloning they think of “The Matrix” or “Star Wars” (that dreadful episode II Attack of the Clones). I did. Unfortunately.
But my pragmatic expectation is that when Informatica customers think of cloning, they also think of Data Cloning software. Data Cloning software clones terabytes of database data into a host of other databases, data warehouses, analytical appliances, and Big Data stores such as Hadoop. And just for hoots and hollers, you should know that almost half of all Data Integration efforts involve replication, be it snapshot or real-time, according to TDWI survey data. Survey also says… replication is the second most popular — or second most used — data integration tool, behind ETL.
Do your company’s cloning tools work with non-standard types? Know that Informatica cloning tools can reproduce Oracle data to just about anything on 2 tuples (or more). We do non-discriminatory duplication, so it’s no wonder we especially fancy cloning the Oracle! (a thousand apologies for the bad “Matrix” pun)
Just remember that data clones are an important and natural component of business continuity, and the use cases span both operational and analytic applications. So if you’re not cloning your Oracle data safely and securely with the quality results that you need and deserve, it’s high time that you get some better tools.
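The snapshot flavor of replication mentioned above can be sketched in a few lines: recreate the schema on the target and bulk-copy the rows. This toy uses SQLite with made-up table names purely to show the shape of the job; products like Fast Clone do the same thing against Oracle at bulk-unload speed, with compression and parallelism on top.

```python
import sqlite3

# Sketch of snapshot replication: copy a source table wholesale into a
# separate target database. A real cloning tool adds bulk unload,
# type mapping, compression, and parallel streams.

source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")

source.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
source.executemany("INSERT INTO customers VALUES (?, ?)",
                   [(1, "Ada"), (2, "Grace")])

# Recreate the schema on the target, then bulk-copy the rows (a full snapshot).
target.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
rows = source.execute("SELECT id, name FROM customers").fetchall()
target.executemany("INSERT INTO customers VALUES (?, ?)", rows)

cloned = target.execute("SELECT COUNT(*) FROM customers").fetchone()[0]
```

Real-time replication replaces the full re-copy with a continuous feed of changes, which is where CDC comes back into the picture.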
Send in the Clones
With that in mind, if you haven’t tried to clone before, for a limited time Informatica is making a trial version of its Fast Clone database cloning software available for free download. Click here to get it now.
You can think of big data as the set of approaches and mechanisms that can manage and process petabytes of structured and unstructured data, whether centralized or distributed. Or as a single approach and technology for getting at the most relevant data, no matter its size or structure.
Indeed, big data provides us with the power to leverage information that would normally not be accessible, or that would introduce more latency than is practical when searching through the data. (more…)
One up-and-coming use case in the capital markets that we are excited about is front-office real-time risk analytics on streaming market data: informing traders in real time about potential changes to trading strategies, based on the most up-to-date data possible, in order to decrease risk.
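As a rough illustration of what such streaming risk analytics compute, here is a minimal sketch that maintains a rolling window of tick-to-tick returns and raises an alert when short-term volatility crosses a threshold. The prices, window size, and 1% threshold are all made up for the example; production systems run logic like this over millions of events per second.

```python
from collections import deque
from math import sqrt

# Toy streaming risk monitor: rolling sample volatility of recent
# tick-to-tick returns, checked against an alert threshold.

class RollingVolatility:
    def __init__(self, window=5):
        self.returns = deque(maxlen=window)  # sliding window of returns
        self.last_price = None

    def on_tick(self, price):
        """Ingest one price tick; return current windowed volatility."""
        if self.last_price is not None:
            self.returns.append(price / self.last_price - 1.0)
        self.last_price = price
        if len(self.returns) < 2:
            return 0.0
        mean = sum(self.returns) / len(self.returns)
        var = sum((r - mean) ** 2 for r in self.returns) / (len(self.returns) - 1)
        return sqrt(var)

monitor = RollingVolatility(window=5)
vols = [monitor.on_tick(p) for p in [100.0, 100.5, 99.8, 101.2, 100.9, 103.0]]
alert = vols[-1] > 0.01  # flag the trader when volatility exceeds 1%
```

A CEP engine would express the same window-and-threshold rule declaratively instead of in hand-rolled code.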
One person’s “real-time” is another person’s “fast enough”. (Or is it vice versa?) I’ve been in the Complex Event Processing (CEP) space for almost five years, and nothing gets this industry more spun up than heated discussions about “feeds and speeds” – the fastest products, the lowest latencies, the greatest event volumes, the most events per second and so on. (more…)
Actually, not much, but I couldn’t pass up a chance to mention Tebow in a blog after yet another miraculous victory. Seriously, there’s actually some interesting commonality with CEP and not just Tim Tebow, but most NFL quarterbacks.
In his new book ‘Event Processing for Business’, the father of CEP, David Luckham, says, “In today’s fast-paced corporate environment, real-time events require immediate action. Event processing (EP), the ability to collect, analyze, and react to real-time events, is a key component of twenty-first century business information systems.” (more…)
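Luckham’s collect/analyze/react loop can be shown with a toy complex-event rule: react when one account produces three failed logins within 60 seconds. The event names, window, and threshold are invented for the example; a CEP engine would state this rule declaratively rather than in imperative code.

```python
from collections import defaultdict, deque

# Toy CEP rule: collect login events, analyze them over a sliding
# time window, and react by raising a complex "suspicious activity" event.

WINDOW, THRESHOLD = 60, 3          # seconds, failures
recent = defaultdict(deque)        # per-user timestamps of failures
alerts = []

def on_event(user, kind, ts):
    if kind != "login_failed":
        return
    q = recent[user]
    q.append(ts)
    while q and ts - q[0] > WINDOW:  # slide the window forward
        q.popleft()
    if len(q) >= THRESHOLD:          # react: emit the complex event
        alerts.append((user, ts))
        q.clear()

stream = [("bob", "login_failed", 1), ("bob", "login_ok", 5),
          ("bob", "login_failed", 20), ("bob", "login_failed", 45),
          ("eve", "login_failed", 50)]
for user, kind, ts in stream:
    on_event(user, kind, ts)
```

The interesting part, as with a quarterback reading a defense, is correlating many low-level events into one higher-level event worth acting on.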
Guest Blog by Michelle de Haaff, CMO, Attensity
It’s great to be guest blogging on the Informatica site this week. The topic: BIG DATA in the Enterprise and specifically the growth of customer data fueled by social media that creates a very large treasure trove of insights for businesses.
Informatica made a very exciting announcement yesterday about their Informatica 9.1 BIG DATA offering, and we were proud to be a part of it. Attensity made its own announcement on BIG DATA earlier this year as well. It’s great to partner with the world leader in data integration technology. Why? A big question that many Attensity and Informatica customers ask is how they can bring unstructured data (prose or text in emails, survey verbatims, documents, social media conversations, service and repair notes, and more) into a data analytics platform, combined with structured data, for analytics. (more…)
On November 10, Informatica made history with the launch of Informatica 9. In my mind, being a SOA enthusiast, another equally significant event transpired – the birth of SOA-based Data Services – transformational SOA data integration that can revive your enterprise architecture.
So, what exactly are SOA-based Data Services and why am I so excited?
I am back after a somewhat self-imposed hiatus during which I have been doing some soul-searching, or rather talking to a number of practitioners, experts, thought-leaders and analysts in the integration space. My singular quest was to uncover some real-world myths about SOA.
I spoke to a variety of integration experts – enterprise and application architects, application developers, data architects and data integration developers. During these interesting conversations, we discussed real-world SOA…or let me qualify that term further as real-world “service-orientation.”
Of course we discussed paradigms such as “loose-coupling,” “modularity,” “services,” etc., but more importantly in many cases, we spoke at length about how they were falling short of realizing the promised benefits of SOA. On probing each usage scenario further, I chanced upon a couple of interesting myths about SOA, which I would like to share with you. (more…)
Before we can have lengthy discussions around whether SOA is dead, or SOA is alive and kicking, I thought that it would serve us all well, including myself, to get to a generally agreed upon definition of what exactly we are talking about – what is Service-Oriented Architecture or SOA?
According to Wikipedia, “service-oriented architecture (SOA) provides methods for systems development and integration where systems group functionality around business processes and package these as interoperable services.”
This sounds like a definition right out of a technical book, while SOA’s biggest claim to fame rests on a more business-oriented perspective: its promise of agility achieved through the alignment of business and IT. Let’s see if we can dig up some real-world observations about the current state of SOA.