Category Archives: Ultra Messaging

Big Data Ingestion got you down? I N F O R M A T I C A spells relief


I have a little fable to tell you…

This fable has nothing to do with Big Data, but instead deals with an Overabundance of Food and how to better digest it to make it useful.

And it all started when this SEO copywriter from IT Corporation walked into a bar, pub, grill, restaurant, liquor establishment, and noticed 2 large crowded tables. After what seemed like an endless loop, an SQL programmer sauntered in and contemplated the table problem. “Mind if I join you?” he asked. Since the tables were partially occupied and there were no virtual tables available, the host looked out to the restaurant’s patio, where there were 2 open tables. “Shall I do an outside join instead?” asked the programmer. The host considered their schema and assigned 2 seats to the space.

The writer told the programmer to look at the menu, bill of fare, blackboard – there were so many choices but not enough real nutrition. “Hmmm, I’m hungry for the right combination of food, grub, chow, to help me train for a triathlon,” he said. With that contextual information, they thought about foregoing the menu items and instead getting in the all-you-can-eat buffer line. But there was too much food available, and despite its appealing looks in its neat rows and columns, it seemed to be mostly empty calories. They both realized they had no idea what important elements were in the food, and came to the conclusion that this restaurant had a “Big Food” problem.

They scoped it out for a moment, and then the writer did an about face, reversal, change in direction, and the SQL programmer did a commit and quick pivot toward the buffer line, where they did a batch insert of all of the food, even the BLOBs of spaghetti, mashed potatoes and jello. There was far too much and it was far too rich for their tastes and needs, but they binged and consumed it all. You should have seen all the empty dishes at the end – they even caused a stack overflow. Because it was a batch binge, their digestive tracts didn’t know how to process all of the food, so they got a stomach ache from “big food” ingestion – it nearly caused a core dump, in which case the restaurant host would have assigned his most dedicated servers to perform a thorough cleansing and scrubbing. There was no way to do a rollback at this point.

It was clear they needed relief. The programmer did an ad hoc query to JSON, their Server who they thought was Active, asking why they were having such “big food” indigestion and whether any packets of relief were available. No response. Then they asked again. Still no response. So the programmer said to the writer, “Gee, the Quality of Service here is terrible!”

Just then, the programmer remembered a remedy he had heard about previously, and so he spoke up. “Oh, it’s very easy: just SELECT Vibe.Data.Stream FROM INFORMATICA WHERE REAL-TIME IS NOT NULL.”

Informatica’s Vibe Data Stream enables streaming food collection for real-time Big Food analytics, operational intelligence, and traditional enterprise food warehousing from a variety of distributed food sources at high scale and low latency. It delivers the right food, ingested at the right time when nutrition is needed, without any need for binge or batch ingestion.

And so they all lived happily ever after and all was good in the IT Corporation once again.

***

If you think you know what this fable is about and want a more thorough and technical explanation, check out this tech talk here.

Or

Download Now and take your first steps toward rapidly developing applications that sense and respond to streaming food (or data) in real time.

 


Keeping it “Real” at Informatica World 2014

This dog is Keeping it Real

Most have heard by now that Informatica World 2014, the premier data conference of the year, will be a unique opportunity for attendees to get new ideas, expert advice, and hands-on demonstrations of real-time data integration products and capabilities.

However, it is Las Vegas after all, so for those of you wagering on the best sessions, here’s my quick guide (or morning line) for your viewing, research or attending pleasure.

I hope to see you there.

Tuesday May 13, 2014

BREAKOUT SESSIONS

1:00 PM (GRACIA 8):
ARCH101 – Enterprise Data Architecture for the Data-Centric Enterprise 
How do you build an architecture that increases development speed, protects quality, and reduces cost and complexity, all while accelerating current project delivery?
Srinivas Kolluru – Chief Architect, Southwest Power Pool, Inc.
Tom Kato – Architect, US Airways/American Airlines
John Schmidt – Vice President, Informatica

 4:45 PM (GRACIA 8):
ARCH113 – Western Union: Implementing a Hadoop-based Enterprise Data Hub with Cloudera and Informatica 
To expand its business and delight customers with proactive, personalized web and mobile marketing, Western Union needs to process massive amounts of data from multiple sources.
Pravin Darbare – Senior Manager Integration and Transformation, Western Union
Clarke Patterson – Sr. Director of Product Marketing, Cloudera, Inc.

 4:45 PM (GRACIA 3):
IPaW132 – NaviNet, Inc and Informatica: Delivering Network Intelligence… The Value to the Payer, Provider and Patient 
Healthcare payers and providers today must share information in unprecedented ways to achieve their goals of reducing redundancy, cutting costs, coordinating care and driving positive outcomes.
Frank Ingari – CEO, NaviNet, Inc.

HANDS-ON LABS

11:30 AM (BRERA BALLROOM/TABLE 17B):
Proactive Monitoring of PowerCenter Environments 
Expert led sessions by Informatica Product Management 45 min Hands-On Labs
Indu Thomas – Director QA Engineering, Informatica
Elizabeth Duke – Principal Sales Consultant, Informatica

 11:30 AM (BRERA BALLROOM/TABLE 32):
Informatica Data Replication 
Expert led sessions by Informatica Product Management 45 min Hands-On Labs
Glenn Goodrich – Sr. Manager Technical Enablement, Informatica
Alex Belov – Senior Development Manager, Informatica
Phil Line – Principal Product Manager, Informatica
Andy Bristow – Product Specialist, Informatica

 BOOTHS 5:30-8:00 PM

2032- PowerExchange Change Data Capture 9.6
2065- Informatica Data Replication
2136- Informatica Real-time Data Integration
2137- Vibe Data Stream for Machine Data

Wednesday, May 14, 2014

BREAKOUT SESSIONS

11:30 AM (CASTELLANA 1)
ARCH109 – Proactive Analytics: The Next-Generation of Business Intelligence 
Business users demand self-service tools that give them faster access to better insights. Exploding data volumes and variety make finding relevant, trusted information a challenge.
John Poonen – Director, Infosario Data Services Group, Quintiles
Senthil Kanakarajan – VP, Technology Manager, Wells Fargo
Nelson Petracek – Senior Director Emerging Technology Architecture, Informatica

3:45 PM (GRACIA 8)
ARCH102 – Best Practices and Architecture for Agile Data Integration
Business can’t wait for IT to deliver data and reports that may not meet business needs by the time they’re delivered. In the age of self-service and lean integration, business analysts need more control even as IT continues to govern development.
Jared Hillam – EIM Practice Director, Intricity, LLC
John Poonen – Director, Infosario Data Services Group, Quintiles
Robert Myers – Tech Delivery Manager, Informatica

3:45 PM (GRACIA 6)
ARCH119 – HIPAA Validation for Eligibility and Claims Status in Real Time
Healthcare reform requires healthcare payers to exchange and process HIPAA messages in less time with greater accuracy. Learn how Health Net met regulatory requirements and limited both costs and expensive rework by building a real-time data integration architecture that lets it respond to eligibility and claims status requests within six seconds, nearly error-free.
Jerry Allen – IT Architect, Health Net, Inc.

3:45 PM (GRACIA 1)
ARCH114 – Bi-Directional, Real-Time Hadoop Streaming 
As organizations seek faster access to data insights, Hadoop is becoming the architectural foundation for real-time data processing environments. With the increase of Hadoop deployments in operational workloads, the importance of real-time and bi-directional data integration grows. MapR senior product management director Anoop Dawar will describe a streaming architecture that combines Informatica technologies with Hadoop. You’ll learn how this architecture can augment capabilities around 360-degree customer views, data warehouse optimization, and other big data business initiatives.
Anoop Dawar – Senior Director, Product Management, MapR Technologies

 HANDS-ON LABS

11:30 AM (BRERA BALLROOM/TABLE 17B)
Proactive Monitoring of PowerCenter Environments 
Expert led sessions by Informatica Product Management 45 min Hands-On Labs
Indu Thomas – Director QA Engineering, Informatica
Elizabeth Duke – Principal Sales Consultant, Informatica

 11:30 AM (BRERA BALLROOM/TABLE 32)
Informatica Data Replication
Expert led sessions by Informatica Product Management 45 min Hands-On Labs
Glenn Goodrich – Sr. Manager Technical Enablement, Informatica
Alex Belov – Senior Development Manager, Informatica
Phil Line – Principal Product Manager, Informatica
Andy Bristow – Product Specialist, Informatica
 

Thursday, May 15, 2014

BREAKOUT SESSIONS

 DEV108 – Informatica and the Information Potential of the “Internet of Things” 
The “Internet of Things” and its torrents of data from multiple sources — clickstreams from web servers, application and infrastructure log data, real-time systems, social media, sensor data, and more — offer an unprecedented opportunity for insight and business transformation. Learn how Informatica can help you access and integrate these massive amounts of real-time data with your enterprise data and achieve your information potential.
Amrish Thakkar – Senior Product Manager, Informatica
Boris Bulanov – Senior Director Solutions Product Management, Informatica

 


History Repeats Itself Through Business Intelligence (Part 2)


In a previous blog post, I wrote about how, when business “history” is reported via Business Intelligence (BI) systems, it’s usually too late to make a real difference. In this post, I’m going to talk about how business history becomes much more useful when combined operationally and in real time.

E. P. Thompson, a historian, pointed out that all history is the history of unintended consequences. His theory was that history is not always recorded in documents, but instead is ultimately derived from examining cultural meanings and the structures of society through hermeneutics (the interpretation of texts), semiotics, and the many forms and signs of the times. He concluded that history is created by people’s subjectivity and therefore ultimately represents how people REALLY live.

The same can be extrapolated for businesses. However, the BI systems of today capture only a minuscule piece of the larger pie of knowledge representation that may be gained from things like meetings, videos, sales calls, anecdotal win/loss reports, shadow IT projects, 10-Ks and 10-Qs, even company blog posts ;-) The point is: how can you better capture the essence of meaning, and perhaps importance, of the everyday non-database events taking place in your company and its activities – in other words, how it REALLY operates?

One of the keys to figuring out how businesses really operate is identifying and utilizing the undocumented RULES that underlie nearly every business. Select company employees, often veterans, know these rules intuitively. Every company has them; if you watch them, they just have a knack for getting projects pushed through the system, or making customers happy, or diagnosing a problem in a short time and with little fanfare. They just know how things work and what needs to be done.

These rules have been, and still are, difficult to quantify and apply – or “datafy,” if you will. Certain companies (and hopefully Informatica) will end up being major players in the race to datafy these non-traditional rules and events, in addition to helping companies make sense of big data in a whole new way. Daydreaming about it, it’s not hard to imagine business systems that will eventually be able to understand the optimization rules of a business, account for possible unintended scenarios or consequences, and then apply them at the time when they are most needed. That, anyhow, is the goal of a new generation of Operational Intelligence systems.

In my final post on the subject, I’ll explain, in a nutshell, how it works and the business problems it solves. And if I’ve managed to pique your curiosity and you want to hear about Operational Intelligence sooner, tune in to a webinar we’re holding TODAY at 10 AM PST. Here’s the link.

http://www.informatica.com/us/company/informatica-talks/?commid=97187


Sub-100 Nanosecond Pub/Sub Messaging: What Does It Matter?

Our announcement last week was an exciting milestone for those of us who started at 29West supporting the early high-frequency traders from 2004 to 2006. With Version 6.1 of Informatica Ultra Messaging with Shared Memory Acceleration (SMX), we announced the next step in a 10-year effort that has now seen us lower the bar for low-latency messaging by six orders of magnitude. The really cool thing is that we have helped early customers like Intercontinental Exchange and Credit Suisse take advantage of the reduction from 2.5 million nanoseconds (ns) of latency to as low as 37 ns on commodity hardware and networks, without having to switch products or do major rewrites of their code.

But, as I asked in the title, what does it matter? Does being able to send messages to multiple receivers within a single-box trading system or order-matching engine in 90 ns, as opposed to one microsecond, really make a difference?

Well, according to a recent article by Scott Appleby on TabbFORUM, “The Death of Alpha on Wall Street,”* the only way for investment banks to find alpha, or excess returns, is “to find valuation correlations among markets to extract microstructure alpha.” He states, “Getco, Tradebot and Renaissance use technology to find valuation correlations among markets to extract microstructure alpha; this still works, but requires significant capital.” The extra hundreds of nanoseconds that SMX frees up let a firm make its matching algorithms or order routers that much smarter by doing dozens of additional complex calculations before the computer makes a decision. Furthermore, by letting the messaging layer take over the integration of software components that are less critical to producing alpha (but very important for operational risk control, like guaranteeing that messages can be captured off the single-box trading system for compliance and disaster recovery), busy software developers can focus on changes in the microstructure of the markets.

The key SMX innovation is another “less is more” style engineering feat from our team. Basically, SMX eliminates any copying of messages from the message delivery path. And if the processes in your trading system happen to be running within the same CPU, on the same or different cores, this means messages are being sent within the memory cache of the core or CPU. The other reason this matters is that, because this product uniquely (as far as I know) allows zero-copy shared memory communication between Java, C, and Microsoft .NET applications, developers can fully leverage the best features and the knowledge of their teams to deploy complex high-performance applications. For example, this allows third-party feed handlers built in C to communicate at extremely low latencies with algo engines written in Java.
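To make the zero-copy idea a little more concrete, here is a minimal, hypothetical sketch in Java. It is emphatically not the Ultra Messaging SMX API; it only illustrates the underlying technique of writing a payload once into a shared region (here, a memory-mapped file, assuming a Linux /dev/shm path) so a reader can consume it in place. A real product would add atomic publication, memory fences, and ring-buffer management on top of this.

```java
// Conceptual sketch of zero-copy delivery through shared memory, in the
// spirit of SMX. NOT the Ultra Messaging API - just the core idea.
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.charset.StandardCharsets;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class SharedMemorySketch {
    static final int HEADER = 4;   // int length field; 0 means "no message yet"

    public static void main(String[] args) throws IOException {
        Path region = Path.of("/dev/shm/demo-channel");  // assumption: Linux tmpfs
        try (FileChannel ch = FileChannel.open(region,
                StandardOpenOption.CREATE, StandardOpenOption.READ,
                StandardOpenOption.WRITE)) {
            MappedByteBuffer buf = ch.map(FileChannel.MapMode.READ_WRITE, 0, 4096);

            // Writer side: place the payload directly into the shared region,
            // then publish its length. The payload bytes are written only once.
            byte[] msg = "order: BUY 100 @ 101.25".getBytes(StandardCharsets.UTF_8);
            buf.position(HEADER);
            buf.put(msg);
            buf.putInt(0, msg.length);   // "publish" by writing the header last

            // Reader side (normally another process mapping the same file):
            // read the length, then consume the bytes in place. No copy occurs
            // on the delivery path; the copy below exists only to print.
            int len = buf.getInt(0);
            byte[] out = new byte[len];
            buf.position(HEADER);
            buf.get(out);
            System.out.println(new String(out, StandardCharsets.UTF_8));
        }
    }
}
```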

So congrats to the UM development team for achieving this important milestone, and “thanks” to our customers for continuing to push us to provide you with that “lagniappe” of extra time that can make all the difference in the success of your trading strategies and your businesses.

 

* http://tabbforum.com/opinions/the-death-of-alpha-on-wall-street


Messaging Challenges Over The WAN

Originally posted on low-latency.com

 

 

Many global companies, especially in the front office of capital markets, have a strategic need to send real-time data from one location to another, often thousands of miles away, over a Wide Area Network (or WAN) connection.

With foreign exchange, for example, global investment banks send dealable streaming prices, orders, trades, and reference data across the WAN. (more…)


Evolution: The General Store to the Individualized Shopper

The retail industry has undergone a major transformation over the course of its evolution. Tracing back to the 18th century, when the concept of retail was limited to a “general store,” the industry has now grown into the concept of the “individualized shopper.” Initially, the power lay with the merchants; then it shifted to “brands,” and then to “big-box chain stores.” Today the power has shifted squarely to the consumer. This new consumer is empowered – primarily through online and social channels – with nearly limitless options on the path to purchase. The industry is beginning to recognize that the experience it offers, whether in store or online, must be centered on this new consumer reality. (more…)


Informatica Ultra Messaging: Great Performance . . . At Lower Cost?

Printed words are good, but pictures and sound are better. Watch the video below for a quick summation of how Informatica Ultra Messaging can help your business:

  1. Increase application performance and throughput
  2. Reduce fixed and operational costs
  3. Increase capacity
  4. Reduce single points of failure
  5. Increase scalability, reliability, and availability

For more information, have a look at: Ultra Messaging, Better Value with Better Technology.


Capital Markets and Ultra Messaging

Pete Benesh discusses some of the major business challenges driving IT investment in capital markets today. He also highlights how Informatica Ultra Messaging can help organizations tackle these challenges.
 


More Trading Firms Using a “Trade Smarter, Not Harder” Strategy

There are lots of ways to run a trading firm.

Some firms use a strategy centered on high-frequency or algorithmic trading; these are similar in that having the best technology and writing the fastest trading applications is essential.

At the other end of the spectrum, some firms employ only human traders, using a traditional buy-and-hold strategy, expecting to hold the security for months or even years before moving it.

But in between these two ends of the spectrum, there exists a hybrid that uses electronic trading with a bit of buy-and-hold added in. Some call this blend “trade smarter, not harder”.

Instead of competing with other traders to get the absolute lowest price, this strategy prioritizes making better decisions by doing “pre-trade analytics” on historical and financial data.

(more…)


Electronic Trading Systems Moving to the Cloud

More and more business applications are moving from the desktop to the cloud, and electronic trading applications are no different.

Over the last five or ten years, application vendors have demonstrated several advantages of running major applications, even mission-critical ones like salesforce.com, in the cloud.

These advantages include:

  • Easier and smoother upgrades, which provide much better adaptability and agility in the face of changing market and business conditions, plus a better user experience,
  • Better scalability, with newer technology advances, and
  • Better portability across a wide array of device types, including smartphones and tablets (especially in the last 2-3 years).

Recent improvements in Web technology, such as HTML5 WebSockets, are helping to speed this transition along by providing several throughput and latency advantages over earlier iterations of Web technology, and even over native Windows applications. Now, application architects can freely choose the technology that provides a better path for growth, agility, and scalability, which is often a Cloud-based solution.

As I write this, a few of our customers who provide electronic trading solutions to their clients are making the strategic move to develop a next-generation application based in the Cloud. The main driver for one customer was to be able to take on more clients more quickly, and therefore grow the business faster by increasing marginal revenue and profitability. They found the list of challenges with a thick desktop client to be just too big for growing the business as quickly as they wanted to — or needed to.

Messaging middleware, especially peer-to-peer solutions such as Informatica Ultra Messaging, can be a very important piece of a Cloud-based application. The peer-to-peer “nothing in the middle” model gives applications not just ultra-high performance (whether for high throughput or low latency), but also near-linear scalability, true 24×7 reliability and availability, and business and IT agility. These qualities tie directly to the advantages listed above.

Cloud-based applications, of course, must also contend with the Internet and all that comes with that: support for various browsers and platforms (and versions of each), scalability and bandwidth issues, and mobile devices like smartphones and tablets. New web technologies like HTML5 WebSockets from Kaazing are best positioned to take care of the path from server to the smartphone or tablet, and with JMS connectivity to Ultra Messaging on the back end, can provide a Cloud-based application with a lean, scalable and agile infrastructure, usually with less hardware.
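For a feel of what the front half of that path can look like, here is a rough sketch of a client subscribing to a streaming topic over a WebSocket, using the JDK’s built-in java.net.http client (available in Java 11 and later). The gateway URL and the subscribe-message format are invented for the example; a real deployment would go through the Kaazing gateway with JMS connectivity to Ultra Messaging on the back end.

```java
// Hedged sketch: a WebSocket subscriber using the standard JDK client.
// The endpoint and message wire format below are hypothetical.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.WebSocket;
import java.util.concurrent.CompletionStage;
import java.util.concurrent.CountDownLatch;

public class QuoteSubscriber {
    public static void main(String[] args) throws InterruptedException {
        HttpClient.newHttpClient()
            .newWebSocketBuilder()
            .buildAsync(URI.create("wss://gateway.example.com/quotes"),
                        new WebSocket.Listener() {
                @Override
                public void onOpen(WebSocket ws) {
                    // Hypothetical subscribe message the gateway would relay
                    // to a back-end JMS/Ultra Messaging topic.
                    ws.sendText("SUBSCRIBE quotes/EURUSD", true);
                    ws.request(1);
                }

                @Override
                public CompletionStage<?> onText(WebSocket ws, CharSequence data,
                                                 boolean last) {
                    System.out.println("quote: " + data);
                    ws.request(1);  // ask the socket for the next frame
                    return null;
                }
            })
            .join();

        new CountDownLatch(1).await();  // keep the demo alive while quotes stream
    }
}
```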

For more, please see our 2011 Efficiency series (#1, #2, #3) on our Perspectives blog, or whitepapers such as Modern Messaging Middleware for Big Data in Motion or Enterprise Messaging Data for the Web.
