Tag Archives: Ultra Messaging

Big Data Ingestion got you down? I N F O R M A T I C A spells relief


I have a little fable to tell you…

This fable has nothing to do with Big Data, but instead deals with an Overabundance of Food and how to better digest it to make it useful.

And it all started when this SEO copywriter from IT Corporation walked into a bar, pub, grill, restaurant, liquor establishment, and noticed two large, crowded tables. After what seemed like an endless loop, an SQL programmer sauntered in and contemplated the table problem. “Mind if I join you?” he asked. Since the tables were partially occupied and there were no virtual tables available, the host spotted two open tables on the restaurant’s patio. “Shall I do an outside join instead?” asked the programmer. The host considered their schema and assigned two seats to the space.

The writer told the programmer to look at the menu, bill of fare, blackboard – there were so many choices but not enough real nutrition. “Hmmm, I’m hungry for the right combination of food, grub, chow, to help me train for a triathlon,” he said. With that contextual information, they thought about forgoing the menu items and instead getting in the all-you-can-eat buffer line. But there was too much food available, and despite its appealing looks in its neat rows and columns, it seemed to be mostly empty calories. They both realized they had no idea what important elements were in the food, but concluded that this restaurant had a “Big Food” problem.

They scoped it out for a moment. Then the writer did an about-face, reversal, change in direction, and the SQL programmer did a commit and a quick pivot toward the buffer line, where they did a batch insert of all of the food, even the BLOBs of spaghetti, mashed potatoes and jello. There was far too much and it was far too rich for their tastes and needs, but they binged and consumed it all. You should have seen all the empty dishes at the end – they even caused a stack overflow. Because it was a batch binge, their digestive tracts didn’t know how to process all of the food, so they got a stomach ache from “big food” ingestion – and it nearly caused a core dump, in which case the restaurant host would have assigned his most dedicated servers to perform a thorough cleansing and scrubbing. There was no way to do a rollback at this point.

It was clear they needed relief. The programmer ran an ad hoc query to JSON, their server (who they had assumed was Active), asking why they were having such “big food” indigestion and whether any packets of relief were available. No response. They asked again. Still no response. So the programmer said to the writer, “Gee, the Quality of Service here is terrible!”

Just then, the programmer remembered a remedy he had heard about previously, and so he spoke up. “Oh, it’s very easy: just SELECT Vibe.Data.Stream FROM INFORMATICA WHERE REAL-TIME IS NOT NULL.”

Informatica’s Vibe Data Stream enables streaming food collection for real-time “Big Food” analytics, operational intelligence, and traditional enterprise food warehousing from a variety of distributed food sources at high scale and low latency. It delivers the right food, ingested at the right time, when nutrition is needed, with no need for binge or batch ingestion.

And so they all lived happily ever after and all was good in the IT Corporation once again.

***

If you think you know what this fable is about and want a more thorough and technical explanation, check out this tech talk here.

Or

Download Now and take your first steps to rapidly developing applications that sense and respond to streaming food (or data) in real-time.
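For readers who want to see the shape of that idea in code, here is a minimal, purely illustrative sketch in plain Java. It is not Vibe Data Stream's actual API; the class name and the in-memory queue are stand-ins. The point is simply the contrast the fable draws: each record is digested the moment it arrives, so no giant batch ever piles up.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Illustrative only: a stand-in "source" feeding events into a queue,
// and a consumer that processes each event as it arrives instead of
// loading everything in one batch.
public class StreamingIngestSketch {
    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> events = new LinkedBlockingQueue<>();

        // Producer thread: stands in for a distributed source (a sensor, a log, a feed).
        Thread source = new Thread(() -> {
            for (int i = 1; i <= 5; i++) {
                events.offer("event-" + i);
            }
            events.offer("EOF"); // sentinel so the demo terminates
        });
        source.start();

        // Streaming consumer: handles each record the moment it arrives ("no binge ingestion").
        while (true) {
            String event = events.take();
            if ("EOF".equals(event)) {
                break;
            }
            System.out.println("processed " + event + " at " + System.nanoTime() + " ns");
        }
        source.join();
    }
}
```

Because the consumer only ever holds one event at a time, the "stomach ache" of batch ingestion never happens.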

 

Posted in Architects, Big Data, Complex Event Processing, Data Integration, Data Synchronization, Hadoop, Marketplace, Real-Time, Ultra Messaging

Keeping it “Real” at Informatica World 2014


Most have heard by now that Informatica World 2014 will be a unique opportunity for attendees to get new ideas, expert advice, and hands-on demonstrations on real-time data integration products and capabilities at the premier data conference of the year. 

However, it is Las Vegas after all, so for those of you wagering on the best sessions, here’s my quick guide (or morning line) for your viewing, research or attending pleasure.

I hope to see you there.

Tuesday May 13, 2014

BREAKOUT SESSIONS

1:00 PM (GRACIA 8):
ARCH101 – Enterprise Data Architecture for the Data-Centric Enterprise 
How do you build an architecture that increases development speed, protects quality, and reduces cost and complexity, all while accelerating current project delivery?
Srinivas Kolluru – Chief Architect, Southwest Power Pool, Inc.
Tom Kato – Architect, US Airways/American Airlines
John Schmidt – Vice President, Informatica

 4:45 PM (GRACIA 8):
ARCH113 – Western Union: Implementing a Hadoop-based Enterprise Data Hub with Cloudera and Informatica 
To expand its business and delight customers with proactive, personalized web and mobile marketing, Western Union needs to process massive amounts of data from multiple sources.
Pravin Darbare – Senior Manager Integration and Transformation, Western Union
Clarke Patterson – Sr. Director of Product Marketing, Cloudera, Inc.

 4:45 PM (GRACIA 3):
IPaW132 – NaviNet, Inc and Informatica: Delivering Network Intelligence… The Value to the Payer, Provider and Patient 
Healthcare payers and providers today must share information in unprecedented ways to achieve their goals of reducing redundancy, cutting costs, coordinating care and driving positive outcomes.
Frank Ingari – CEO, NaviNet, Inc.

HANDS-ON LABS

11:30 AM (BRERA BALLROOM/TABLE 17B):
Proactive Monitoring of PowerCenter Environments 
Expert-led, 45-minute hands-on lab by Informatica Product Management
Indu Thomas – Director QA Engineering, Informatica
Elizabeth Duke – Principal Sales Consultant, Informatica

 11:30 AM (BRERA BALLROOM/TABLE 32):
Informatica Data Replication 
Expert-led, 45-minute hands-on lab by Informatica Product Management
Glenn Goodrich – Sr. Manager Technical Enablement, Informatica
Alex Belov – Senior Development Manager, Informatica
Phil Line – Principal Product Manager, Informatica
Andy Bristow – Product Specialist, Informatica

 BOOTHS 5:30-8:00 PM

2032- PowerExchange Change Data Capture 9.6
2065- Informatica Data Replication
2136- Informatica Real-time Data Integration
2137- Vibe Data Stream for Machine Data

Wednesday, May 14, 2014

BREAKOUT SESSIONS

11:30 AM (CASTELLANA 1)
ARCH109 – Proactive Analytics: The Next-Generation of Business Intelligence 
Business users demand self-service tools that give them faster access to better insights. Exploding data volumes and variety make finding relevant, trusted information a challenge.
John Poonen – Director, Infosario Data Services Group, Quintiles
Senthil Kanakarajan – VP, Technology Manager, Wells Fargo
Nelson Petracek – Senior Director Emerging Technology Architecture, Informatica

3:45 PM (GRACIA 8)
ARCH102 – Best Practices and Architecture for Agile Data Integration
Business can’t wait for IT to deliver data and reports that may not meet business needs by the time they’re delivered. In the age of self-service and lean integration, business analysts need more control even as IT continues to govern development.
Jared Hillam – EIM Practice Director, Intricity, LLC
John Poonen – Director, Infosario Data Services Group, Quintiles
Robert Myers – Tech Delivery Manager, Informatica

3:45 PM (GRACIA 6)
ARCH119 – HIPAA Validation for Eligibility and Claims Status in Real Time
Healthcare reform requires healthcare payers to exchange and process HIPAA messages in less time with greater accuracy. Learn how Health Net met regulatory requirements and limited both costs and expensive rework by architecting a real-time data integration solution that lets it respond to eligibility and claims status requests within six seconds, near error-free.
Jerry Allen – IT Architect, Health Net, Inc.

3:45 PM (GRACIA 1)
ARCH114 – Bi-Directional, Real-Time Hadoop Streaming 
As organizations seek faster access to data insights, Hadoop is becoming the architectural foundation for real-time data processing environments. With the increase of Hadoop deployments in operational workloads, the importance of real-time and bi-directional data integration grows. MapR senior product management director Anoop Dawar will describe a streaming architecture that combines Informatica technologies with Hadoop. You’ll learn how this architecture can augment capabilities around 360-degree customer views, data warehouse optimization, and other big data business initiatives.
Anoop Dawar – Senior Director, Product Management, MapR Technologies

 HANDS-ON LABS

11:30 AM (BRERA BALLROOM/TABLE 17B)
Proactive Monitoring of PowerCenter Environments
Expert-led, 45-minute hands-on lab by Informatica Product Management
Indu Thomas – Director QA Engineering, Informatica
Elizabeth Duke – Principal Sales Consultant, Informatica

 11:30 AM (BRERA BALLROOM/TABLE 32)
Informatica Data Replication
Expert-led, 45-minute hands-on lab by Informatica Product Management
Glenn Goodrich – Sr. Manager Technical Enablement, Informatica
Alex Belov – Senior Development Manager, Informatica
Phil Line – Principal Product Manager, Informatica
Andy Bristow – Product Specialist, Informatica
 

Thursday, May 15, 2014

BREAKOUT SESSIONS

 DEV108 – Informatica and the Information Potential of the “Internet of Things” 
The “Internet of Things” and its torrents of data from multiple sources — clickstreams from web servers, application and infrastructure log data, real-time systems, social media, sensor data, and more — offers an unprecedented opportunity for insight and business transformation. Learn how Informatica can help you access and integrate these massive amounts of real-time data with your enterprise data and achieve your information potential.
Amrish Thakkar – Senior Product Manager, Informatica
Boris Bulanov – Senior Director Solutions Product Management, Informatica

 

Posted in Complex Event Processing, Data Integration Platform, Informatica Events, Informatica World 2014, Real-Time, Ultra Messaging

Sub-100 Nanosecond Pub/Sub Messaging: What Does It Matter?

Our announcement last week was an exciting milestone for those of us who started at 29West supporting the early high-frequency traders from 2004 to 2006. It marks the next step in a 10-year effort that has seen us lower the bar for low-latency messaging by six orders of magnitude, culminating in Version 6.1 of Informatica Ultra Messaging with Shared Memory Acceleration (SMX). The really cool thing is that we have helped early customers like Intercontinental Exchange and Credit Suisse ride the reduction from 2.5 million nanoseconds (ns) of latency down to as low as 37 ns on commodity hardware and networks, without having to switch products or do major rewrites of their code.

But as I asked in the title, what does it matter? Does being able to send messages to multiple receivers within a single-box trading system or order-matching engine in 90 ns, as opposed to one microsecond, really make a difference?

Well, according to a recent article by Scott Appleby on TabbFORUM, “The Death of Alpha on Wall Street,”* the only way for investment banks to find alpha, or excess returns, is “to find valuation correlations among markets to extract microstructure alpha.” He states, “Getco, Tradebot and Renaissance use technology to find valuation correlations among markets to extract microstructure alpha; this still works, but requires significant capital.” The extra hundreds of nanoseconds that SMX frees up allow a firm to make its matching algorithms or order routers that much smarter by doing dozens of additional complex calculations before the computer makes a decision. Furthermore, by letting the messaging layer take over the integration of software components that are less critical to producing alpha (but very important for operational risk control, such as guaranteeing that messages can be captured off the single-box trading system for compliance and disaster recovery), busy software developers can focus on changes in the microstructure of the markets.
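To put rough, illustrative numbers on that (my own back-of-the-envelope, not figures from the article): trimming messaging latency from about 1,000 ns to about 90 ns hands back roughly 900 ns per message, which on a ~3 GHz core is on the order of 2,700 clock cycles. That is comfortably enough headroom for dozens of extra pricing or correlation updates before an order decision is made, although the exact count depends on the algorithms and the hardware.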

The key SMX innovation is another “less is more” style engineering feat from our team: SMX eliminates any copying of messages from the message delivery path. And if the processes in your trading system happen to be running on the same CPU, on the same or different cores, messages are effectively delivered within the memory cache of that core or CPU. The other reason this matters is that the product uniquely (as far as I know) allows zero-copy shared-memory communication between Java, C, and Microsoft .NET applications, so developers can fully leverage the best features and the knowledge of their teams to deploy complex high-performance applications. For example, this allows third-party feed handlers built in C to communicate at extremely low latencies with algo engines written in Java.
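To make the “zero copy” idea concrete, here is a conceptual sketch and nothing more: it is not the Ultra Messaging/SMX API, and SMX’s real machinery (topic resolution, fan-out to multiple receivers, the Java/C/.NET interop) is far richer than this. The file name, byte layout, and spin-wait below are hypothetical choices; the sketch only shows how two processes that map the same memory region can hand a message over without pushing the bytes through a socket or any intermediate buffer.

```java
import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.charset.StandardCharsets;

// Conceptual sketch only -- NOT the Ultra Messaging/SMX API or implementation.
// Two processes map the same file; the sender writes the payload in place and
// then publishes its length, and the receiver reads it straight out of the
// mapped region. (Cross-process ordering/visibility subtleties are glossed over.)
public class SharedMemorySketch {
    private static final int LEN_OFFSET = 0;   // 4-byte length field; 0 means "empty"
    private static final int DATA_OFFSET = 4;  // payload starts here

    public static void main(String[] args) throws Exception {
        boolean sender = args.length > 0 && "send".equals(args[0]);
        try (RandomAccessFile file = new RandomAccessFile("/tmp/smx-demo", "rw");
             FileChannel channel = file.getChannel()) {
            MappedByteBuffer shm = channel.map(FileChannel.MapMode.READ_WRITE, 0, 4096);

            if (sender) {
                byte[] msg = "order 42: buy 100 @ 101.25".getBytes(StandardCharsets.UTF_8);
                shm.position(DATA_OFFSET);
                shm.put(msg);                        // write the payload directly into shared memory
                shm.putInt(LEN_OFFSET, msg.length);  // publish the length last
            } else {
                int len;
                while ((len = shm.getInt(LEN_OFFSET)) == 0) {
                    Thread.onSpinWait();             // spin until the sender publishes a message
                }
                byte[] msg = new byte[len];
                shm.position(DATA_OFFSET);
                shm.get(msg);                        // read the payload out of the mapped region
                System.out.println("received: " + new String(msg, StandardCharsets.UTF_8));
            }
        }
    }
}
```

Run one instance with the argument send and another with no argument; the receiver picks the bytes up directly from the mapped region rather than from a copy delivered over a socket.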

So congrats to the UM development team for achieving this important milestone, and “thanks” to our customers for continuing to push us to provide you with that “lagniappe” of extra time that can make all the difference in the success of your trading strategies and your businesses.

 

*- http://tabbforum.com/opinions/the-death-of-alpha-on-wall-street?utm_source=TabbFORUM+Alerts&utm_campaign=1c01537e42-UA-12160392-1&utm_medium=email&utm_term=0_29f4b8f8f1-1c01537e42-270859141

Posted in Banking & Capital Markets, Ultra Messaging

Electronic Trading Moves Into Fixed Income

Evaluating a price in the Equities or Foreign Exchange (“FX”) markets does not require much calculation, so one of the prime limiting factors on winning those trades has been the speed of data movement from one application to another, whether within the same host or across the network. But the world of Fixed Income, commonly known as “bonds,” is different. (more…)

Posted in Business Impact / Benefits, Business/IT Collaboration, Financial Services, Operational Efficiency, Ultra Messaging

Real-Time Risk Analytics with Ultra Messaging and Data Transformation

To follow up on our recent blog about B2B Data Transformation and Ultra Messaging Cache Option, we would now like to discuss the same topic within the Capital Markets.

One up-and-coming use case in the Capital Markets that we are excited about is front-office, real-time risk analytics on streaming market data: informing traders in real time about potential changes to trading strategies, based on the most up-to-date data possible, in order to decrease risk.

(more…)

Posted in B2B, Data Transformation, Real-Time, Ultra Messaging

Remote Data Collection and Transformation – with Ultra Messaging Cache Option and B2B Data Transformation

Sometimes when I drive past an electronic tollway collection sensor, I wonder about the amount of data it must generate. I’m no expert on such technology, but at a minimum the RFID sensor has to read the chip in your car and log the date and time plus your RFID info, and then a camera takes a picture to catch any potential violators. Now multiply that data by the hundreds of thousands of cars that drive such roads every day, times the number of sensors they pass, and I’m quite sure the total exceeds several million messages per day. (more…)
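To put hypothetical numbers behind that hunch (mine, not measured figures): 500,000 vehicles a day, each passing an average of 5 sensors, with each pass producing at least 2 records (the RFID read plus the camera event), already works out to 500,000 × 5 × 2 = 5 million messages per day, before counting retries, diagnostics, or the image payloads themselves.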

Posted in B2B, Big Data, Data Aggregation, Data Transformation, Financial Services, Healthcare, Public Sector, Telecommunications, Ultra Messaging

Remove the Restrictor Plate with High Performance Load Balancing

Similar to the way that a carburetor restrictor plate prevents NASCAR race cars from going as fast as possible by restricting maximum airflow, inefficient messaging middleware prevents IT organizations from processing vital business data as fast as possible.

(more…)

Posted in Big Data, Business Impact / Benefits, Financial Services, Governance, Risk and Compliance, Operational Efficiency, Telecommunications, Ultra Messaging

Informatica Ultra Messaging Enables Early SEF Movers

In a recent post: Informatica Ultra Messaging Software Supports Capital Markets Reforms, I discussed the technology implications of the OTC derivatives (swaps) market moving to electronic trading as mandated by the Dodd-Frank Act (DFA) in the US and the European Market Infrastructure Regulation (EMIR) in Europe. One area where new technology infrastructure will be especially critical is in the creation and operation of “exchanges” for electronic swaps trading, similar to what is used for equities and other asset classes. In the language of the DFA, such exchange venues are called Swap Execution Facilities (SEFs) and are defined as “a facility, trading system or platform in which multiple participants have the ability to execute or trade swaps by accepting bids and offers made by other participants that are open to multiple participants in the facility or system, through any means of interstate commerce.” This of course includes capturing orders electronically, matching bids and offers, executing the trades, and providing connections to central clearing houses. And perhaps nowhere else in the new ecosystem is the expected growth in message volumes and associated need for new messaging middleware technology more evident than here. (more…)

Posted in Big Data, Financial Services, Governance, Risk and Compliance, Real-Time, Ultra Messaging

Addressing Data Volume Growth With Better Efficiency

Friday August 5, 2011 set new records for trading volume around the world. According to this FT.com story: “The amount of data generated by the day’s trading in US futures and equities alone saw over 130m trades on Friday, generating 950 gigabytes of data, according to Nanex, a market data provider.” In London, “some exchanges with older technology could not cope”. And so Big Data strikes again.

But market data volume has been exploding for months, even years. This is just one more chapter in a long story, illustrating the types of problems a business can encounter if it neglects its technical infrastructure in the face of data volume growth. (more…)

Posted in Big Data, Business Impact / Benefits, Financial Services, Governance, Risk and Compliance, Healthcare, Operational Efficiency, Public Sector, Telecommunications, Ultra Messaging, Vertical

WEBINAR: Now You Too Can Have App-to-App Latency Under 10 Microseconds! On 10GigE or InfiniBand.

Informatica Ultra Messaging (previously 29West) now supports both native InfiniBand RDMA and RDMA over 10GbE, which means better performance and lower latency than ever before.

Now you can stay ahead of the competition and reduce latency at every level of the trading infrastructure with the combination of Ultra Messaging, Voltaire 10GbE or InfiniBand switches, and Voltaire Messaging Accelerator™ (VMA) software.

Click the link below to register for this special webinar on Tuesday, December 7 (11:00 am ET/4:00 pm GMT), and hear experts from Informatica-29West and Voltaire give an overview of the Informatica-Voltaire solution, as well as new benchmark results showing the lowest latency and highest volumes ever achieved with Informatica Ultra Messaging.

PRESENTERS:

Todd Montgomery, CTO, Informatica Ultra Messaging (previously 29West)
Tzahi Oved, Director, Solutions Product Management, Voltaire

Register Here

Posted in Ultra Messaging