Tag Archives: low latency messaging
I have a little fable to tell you…
This fable has nothing to do with Big Data, but instead deals with an Overabundance of Food and how to better digest it to make it useful.
And it all started when this SEO copywriter from IT Corporation walked into a bar, pub, grill, restaurant, liquor establishment, and noticed 2 large crowded tables. After what seemed like an endless loop, an SQL programmer sauntered in and contemplated the table problem. “Mind if I join you?” he asked. Since the tables were partially occupied and there were no virtual tables available, the host looked out at 2 open tables on the restaurant’s patio. “Shall I do an outside join instead?” asked the programmer. The host considered the schema and assigned them 2 seats in the space.
The writer told the programmer to look at the menu, bill of fare, blackboard – there were so many choices but not enough real nutrition. “Hmmm, I’m hungry for the right combination of food, grub, chow, to help me train for a triathlon,” he said. With that contextual information, they thought about forgoing the menu items and instead getting in the all-you-can-eat buffer line. But there was too much food available, and despite its appealing looks in its neat rows and columns, it seemed to be mostly empty calories. They both realized they had no idea what important elements were in the food, and came to the conclusion that this restaurant had a “Big Food” problem.
They scoped it out for a moment, and then the writer did an about-face, reversal, change in direction, and the SQL programmer did a commit and a quick pivot toward the buffer line, where they did a batch insert of all of the food, even the BLOBs of spaghetti, mashed potatoes and Jell-O. There was far too much, and it was far too rich for their tastes and needs, but they binged and consumed it all. You should have seen all the empty dishes at the end – they even caused a stack overflow. Because it was a batch binge, their digestive tracts didn’t know how to process all of the food, so they got a stomach ache from “Big Food” ingestion – and it nearly caused a core dump, in which case the restaurant host would have assigned his most dedicated servers to perform a thorough cleansing and scrubbing. There was no way to do a rollback at this point.
It was clear they needed relief. The programmer did an ad hoc query to JSON, their Server (who they thought was Active), asking why they were having such “Big Food” indigestion and whether any packets of relief were available. No response. Then they asked again. There was still no response. So the programmer said to the writer, “Gee, the Quality of Service here is terrible!”
Just then, the programmer remembered a remedy he had heard about previously, and so he spoke up. “Oh, it’s very easy: just SELECT Vibe.Data.Stream FROM INFORMATICA WHERE REAL-TIME IS NOT NULL.”
Informatica’s Vibe Data Stream enables streaming food collection for real-time “Big Food” analytics, operational intelligence, and traditional enterprise food warehousing from a variety of distributed food sources at high scale and low latency. It delivers the right food, ingested at the right time, when nutrition is needed – without any need for binge or batch ingestion.
And so they all lived happily ever after and all was good in the IT Corporation once again.
Download Now and take your first steps to rapidly developing applications that sense and respond to streaming food (or data) in real-time.
Printed words are good, but pictures and sound are better. Watch the video below for a quick summation of how Informatica Ultra Messaging can help your business:
- Increase application performance and throughput
- Reduce fixed and operational costs
- Increase capacity
- Reduce single points of failure
- Increase scalability, reliability, and availability
For more information, have a look at: Ultra Messaging, Better Value with Better Technology.
Pete Benesh discusses some of the major business challenges driving IT investment in capital markets today. He also highlights how Informatica Ultra Messaging can help organizations tackle these challenges.
There are lots of ways to run a trading firm.
Some firms use a strategy centered around high-frequency or algorithmic trading; the two are similar in that having the best technology and writing the fastest trading applications is essential to each.
At the other end of the spectrum, some firms employ only human traders, using a traditional buy-and-hold strategy, expecting to hold the security for months or even years before moving it.
But in between these two ends of the spectrum, there exists a hybrid that uses electronic trading with a bit of buy-and-hold added in. Some call this blend “trade smarter, not harder”.
Instead of competing with other traders to get the absolute lowest price, this strategy prioritizes making better decisions by running “pre-trade analytics” on historical and financial data.
More and more business applications are moving from the desktop to the cloud, and electronic trading applications are no different.
Over the last five or ten years, application vendors have demonstrated several advantages of running major applications, even mission-critical applications like salesforce.com, in the cloud.
These advantages include:
- Easier and smoother upgrades, which provide much better adaptability and agility in the face of changing market and business conditions, plus a better user experience,
- Better scalability, with newer technology advances, and
- Better portability across a wide array of device types, including smartphones and tablets (especially in the last 2-3 years).
Recent improvements in Web technology, such as HTML5 WebSockets, are helping to speed this transition along by providing several throughput and latency advantages over earlier iterations of Web technology, and even over native Windows applications. Now, application architects can freely choose the technology that provides a better path for growth, agility, and scalability, which is often a Cloud-based solution.
As I write this, a few of our customers who provide electronic trading solutions to their clients are making the strategic move to develop a next-generation application based in the Cloud. The main driver for one customer was to be able to take on more clients more quickly and therefore grow the business faster by increasing marginal revenue and profitability. They found that the list of challenges with a thick desktop client was just too long for growing the business as quickly as they wanted to — or needed to.
Messaging middleware, especially peer-to-peer solutions such as Informatica Ultra Messaging, can be a very important piece of a Cloud-based application. The peer-to-peer “nothing in the middle” model gives applications not just ultra-high performance (whether for high throughput or low latency), but also near-linear scalability, true 24×7 reliability and availability, and business and IT agility. These qualities tie directly to the advantages listed above.
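To see why removing the intermediary matters, here is a toy sketch (not the Ultra Messaging API — the functions and subscriber names are purely illustrative) counting delivery hops in a brokered model versus a peer-to-peer model:

```python
# Toy illustration: compare delivery hops in a brokered model versus a
# peer-to-peer "nothing in the middle" model. Function and subscriber
# names are hypothetical, for illustration only.

def brokered_delivery(msg, subscribers):
    """Brokered: publisher -> broker (one hop), then broker -> each subscriber."""
    hops = 0
    broker_queue = [msg]          # hop 1: publisher hands the message to the broker
    hops += 1
    delivered = []
    for m in broker_queue:
        for sub in subscribers:   # one more hop per subscriber for the fan-out
            delivered.append((sub, m))
            hops += 1
    return delivered, hops

def peer_to_peer_delivery(msg, subscribers):
    """Peer-to-peer: publisher sends directly, one hop per subscriber."""
    delivered = [(sub, msg) for sub in subscribers]
    return delivered, len(delivered)

subs = ["trader_a", "trader_b", "risk_engine"]
_, brokered_hops = brokered_delivery("quote", subs)
_, p2p_hops = peer_to_peer_delivery("quote", subs)
print(brokered_hops, p2p_hops)  # the brokered model always pays the extra broker hop
```

The broker adds a fixed extra hop (and a single point of failure) on every message path, which is the latency and availability cost that the peer-to-peer model avoids.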
Cloud-based applications, of course, must also contend with the Internet and all that comes with that: support for various browsers and platforms (and versions of each), scalability and bandwidth issues, and mobile devices like smartphones and tablets. New web technologies like HTML5 WebSockets from Kaazing are best positioned to take care of the path from server to the smartphone or tablet, and with JMS connectivity to Ultra Messaging on the back end, can provide a Cloud-based application with a lean, scalable and agile infrastructure, usually with less hardware.
For more, please see our 2011 Efficiency series (#1, #2, #3) on our Perspectives blog, or whitepapers such as Modern Messaging Middleware for Big Data in Motion or Enterprise Messaging Data for the Web.
Evaluating a price in the Equities or Foreign Exchange (“FX”) markets does not require much calculation, and so one of the prime limiting factors on winning those trades has been the speed of data movement, from one application to another, either within the same host or across the network. But the world of Fixed Income, commonly known as “bonds”, is different. (more…)
One up-and-coming use case in the Capital Markets that we are excited about is front office real-time risk analytics on streaming market data, to decrease risk by informing traders in real time about potential changes to trading strategies, based on the most up-to-date data possible.
Remote Data Collection and Transformation – with Ultra Messaging Cache Option and B2B Data Transformation
Sometimes when I drive past an electronic tollway collection sensor, I wonder about the amount of data it must generate. I’m no expert on such technology, but at a minimum, the RFID sensor has to read the chip in your car, and log the date and time plus your RFID info, and then a camera takes a picture to catch any potential violators. Now multiply that data times the hundreds of thousands of cars that drive such roads every day, times the number of sensors they pass, and I’m quite sure this number exceeds several million messages per day. (more…)
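A quick back-of-the-envelope check of the estimate above — the specific figures are my own illustrative assumptions, not real traffic data:

```python
# Back-of-the-envelope check of the tollway message volume described above.
# All three inputs are illustrative assumptions, not measured traffic data.
cars_per_day = 300_000     # "hundreds of thousands of cars"
sensors_passed = 4         # toll gantries a typical trip might cross
records_per_read = 2       # one RFID log entry plus one camera record

messages_per_day = cars_per_day * sensors_passed * records_per_read
print(f"{messages_per_day:,}")  # 2,400,000 -- comfortably "several million"
```

Even with conservative inputs, the product lands in the millions of messages per day, which is exactly the scale where batch collection starts to strain and streaming ingestion pays off.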
In a recent post: Remove the Restrictor Plate with High Performance Load Balancing, my colleague Jeff Brokaw compared the high performance architecture of Informatica Ultra Messaging to the removal of the carburetor restrictor plate on a NASCAR racing engine to increase airflow and speed. Ultra Messaging has removed the “restrictor plate” in that it provides direct peer-to-peer communication between applications with no intermediary brokers – thereby delivering extremely high and sustained throughput rates at low latencies. (more…)
Similar to the way that a carburetor restrictor plate prevents NASCAR race cars from going as fast as possible by restricting maximum airflow, inefficient messaging middleware prevents IT organizations from processing vital business data as fast as possible.