Tag Archives: Informatica RulePoint

Big Data Ingestion got you down? I N F O R M A T I C A spells relief


I have a little fable to tell you…

This fable has nothing to do with Big Data, but instead deals with an Overabundance of Food and how to better digest it to make it useful.

And it all started when this SEO copywriter from IT Corporation walked into a bar, pub, grill, restaurant, liquor establishment, and noticed 2 large crowded tables.  After what seemed like an endless loop, an SQL programmer sauntered in and contemplated the table problem. “Mind if I join you?” he said.  Since the tables were partially occupied and there were no virtual tables available, the host looked on the patio of the restaurant at 2 open tables.  “Shall I do an outside join instead?” asked the programmer.  The host considered their schema and assigned 2 seats to the space.

The writer told the programmer to look at the menu, bill of fare, blackboard – there were so many choices but not enough real nutrition. “Hmmm, I’m hungry for the right combination of food, grub, chow, to help me train for a triathlon,” he said.  With that contextual information, they thought about foregoing the menu items and instead getting in the all-you-can-eat buffer line. But there was too much food available, and despite its appealing looks in its neat rows and columns, it seemed to be mostly empty calories.  They both realized they had no idea what important elements were in the food, but came to the conclusion that this restaurant had a “Big Food” problem.

They scoped it out for a moment and then the writer did an about face, reversal, change in direction and the SQL programmer did a commit and quick pivot toward the buffer line, where they did a batch insert of all of the food, even the BLOBs of spaghetti, mashed potatoes and jello.  There was far too much and it was far too rich for their tastes and needs, but they binged and consumed it all.  You should have seen all the empty dishes at the end – they even caused a stack overflow. Because it was a batch binge, their digestive tracts didn’t know how to process all of the food, so they got a stomach ache from “big food” ingestion – and it nearly caused a core dump – in which case the restaurant host would have assigned his most dedicated servers to perform a thorough cleansing and scrubbing. There was no way to do a rollback at this point.

It was clear they needed relief.  The programmer did an ad hoc query to JSON, their Server who they thought was Active, for a response about why they were having such “big food” indigestion, and did they have packets of relief available.  No response. Then they asked again. There was still no response.  So the programmer said to the writer, “Gee, the Quality Of Service here is terrible!”

Just then, the programmer remembered a remedy he had heard about previously and so he spoke up.  “Oh, it’s very easy: just SELECT Vibe.Data.Stream FROM INFORMATICA WHERE REAL-TIME IS NOT NULL.”

Informatica’s Vibe Data Stream enables streaming food collection for real-time “Big Food” analytics, operational intelligence, and traditional enterprise food warehousing from a variety of distributed food sources at high scale and low latency. It delivers the right food at the right time, when nutrition is needed, without any need for binge or batch ingestion.
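The fable’s contrast between batch “binge” ingestion and streaming ingestion can be sketched in plain Python. This is an illustrative sketch only, not Vibe Data Stream’s actual API; the source and function names here are hypothetical:

```python
import time

def food_source():
    """Hypothetical distributed source emitting records over time."""
    for i in range(5):
        yield {"item": f"record-{i}", "ts": time.time()}

def batch_ingest(source):
    """Batch 'binge' ingestion: consume the whole buffet first, then process."""
    plate = list(source)              # everything held in memory at once
    return [record["item"] for record in plate]

def stream_ingest(source, handle):
    """Streaming ingestion: handle each record as it arrives, low latency."""
    for record in source:             # one record at a time, nothing piles up
        handle(record)

batch = batch_ingest(food_source())
consumed = []
stream_ingest(food_source(), lambda r: consumed.append(r["item"]))

print(batch)     # the same records arrive either way...
print(consumed)  # ...but the streamed ones were handled on arrival
```

The difference is not what arrives but when work happens: the batch path can only begin processing after the entire source is drained, while the streaming path does useful work per record.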

And so they all lived happily ever after and all was good in the IT Corporation once again.

***

If you think you know what this fable is about and want a more thorough and technical explanation, check out this tech talk here.

Or

Download Now and take your first steps to rapidly developing applications that sense and respond to streaming food (or data) in real-time.


Posted in Architects, Big Data, Complex Event Processing, Data Integration, Data Synchronization, Hadoop, Marketplace, Real-Time, Ultra Messaging | 1 Comment

Major Financial Services Institution uses technology to improve your teller experience


Like many American men, I judge my banking experience by how quickly I can complete my transaction. However, my wife often still likes to go into the bank and see her favorite teller.

For her, banking is a bit more of a social experience. And every once in a while, my wife even drags me into her bank as well.  But like many of my male counterparts, I still judge the quality of the experience by the operational efficiency of her teller. And the thing that I hate the most is when our visit to the bank is lengthened because the teller can’t do something and has to get the bank manager’s approval.

Now, a major financial institution has decided to make my life and even my wife’s life better. Using Informatica RulePoint, they have come up with a way to improve teller operational efficiency and customer experience while actually decreasing operational business risks. Amazing!

How has this bank done this magic? They make use of the data they already have to create a better banking experience. They already capture historical transaction data and team member performance against each transaction in multiple databases. What they are doing now is using this information to make better decisions. With it, the bank can create and update a risk assessment score for each team member at a branch location. Then, using Informatica RulePoint, they have created approximately 100 rules that can change a teller’s authority based upon the new transaction, the teller’s transaction history, and the teller’s risk assessment score. This means that if my wife carefully picks the right teller, she speeds through the line without waiting for management approval.
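The flavor of the bank’s approach can be sketched as a couple of rules in plain Python. This is a hedged illustration, not RulePoint’s actual rule language, and the thresholds and field names are invented for the example:

```python
from dataclasses import dataclass

@dataclass
class Teller:
    name: str
    risk_score: float  # hypothetical: lower is better, updated from history

def requires_manager_approval(teller, transaction_amount):
    """Two illustrative rules in the spirit of the ~100 described above."""
    if teller.risk_score < 0.2 and transaction_amount <= 10_000:
        return False   # trusted teller: expanded authority, no waiting
    if transaction_amount <= 1_000:
        return False   # small transactions pass for any teller
    return True        # otherwise escalate to the bank manager

fast_teller = Teller("Alice", risk_score=0.1)
new_teller = Teller("Bob", risk_score=0.6)

print(requires_manager_approval(fast_teller, 5_000))  # False: straight through
print(requires_manager_approval(new_teller, 5_000))   # True: manager needed
```

The point of a rules engine is that these conditions live outside application code, so the bank can tune teller authority as risk scores change without redeploying its teller systems.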

So the message at this bank is that the fastest teller is the best teller. To me, this is really using data to improve customer experience and allow for less time in line. Maybe I should get this bank to talk to my auto mechanic next!

Posted in Financial Services, Real-Time | Leave a comment

Make Time an Asset not an Enemy

Do you know how long ago the earthquake in Haiti was?  It was 9 months ago.  For some reason, when I heard a recent report, that fact caught me off guard. Wow, 9 months ago! Next year will be 10 years after 9-11.


They say time is a constant, but it certainly feels like it’s constantly speeding up and passing us by.  It’s easy to see how time is our enemy. After all, it’s the only sure roadmap to all our earthly destinies (sorry for getting philosophical for a moment).

How do we turn time into an organizational asset?  Time seems to always be the enemy when we are in a “reactive” state.  This is when something bad has already happened and we are working to fix it.

But instrumenting both technical and business operations to sense more granular activities and identify meaningful opportunities is the key to turning time into an asset.  So, do you fight time? Or do you take time by the horns and make it a strategic advantage?

We’ll be talking about being proactive when it comes to emergency services and public safety at Informatica World.  I’m excited to have retired NYPD detective Gary Maio (now partner at Data Vision Group) and Bellingham Fire Chief Bill Boyd (Twitter handle ‘chiefb2‘) presenting on Wednesday, November 3rd at Informatica World in Washington, DC. They’ll talk about real cases, the challenges they face and the technology currently used along with their vision for the future – a future that will make us all safer.

Posted in Complex Event Processing, Operational Efficiency, Public Sector, Real-Time | 4 Comments

Rich On Data, Poor On Actionable Insight

InformationWeek says companies lack Real-Time Insight

Many companies have built out their data infrastructure, composed of large data warehouses, varying data marts, integration processes, quality controls and much more.  And it has driven tremendous value in the form of stronger business intelligence, better views of the customer, faster business processes and more.  But with all the help on the analysis side, where’s the evolution in putting those findings into operational practice? Like quickly identifying that a premium/platinum-level client has been put on hold by a customer care agent for more than 5 minutes (and no, the “we are experiencing unusually high call volume” message does not let anyone off the hook).
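The hold-time example is exactly the kind of condition an event-processing rule would watch for continuously. A minimal sketch in Python, assuming a hypothetical event shape (client id, tier, hold start time) rather than any real system’s feed:

```python
HOLD_LIMIT_SECONDS = 5 * 60  # the 5-minute threshold from the example above

def flag_long_holds(events, now):
    """Return platinum clients whose hold time has exceeded the limit.

    `events` is a hypothetical feed of {client_id, tier, hold_started} dicts;
    a real deployment would evaluate this continuously over a live stream
    and trigger an action (e.g. alert a supervisor) the moment a rule fires.
    """
    alerts = []
    for e in events:
        if e["tier"] == "platinum" and now - e["hold_started"] > HOLD_LIMIT_SECONDS:
            alerts.append(e["client_id"])
    return alerts

events = [
    {"client_id": "C-100", "tier": "platinum", "hold_started": 1000},
    {"client_id": "C-200", "tier": "standard", "hold_started": 1000},
    {"client_id": "C-300", "tier": "platinum", "hold_started": 1800},
]
print(flag_long_holds(events, now=2000))  # ['C-100']: on hold 1000s > 300s
```

The gap the article describes is between producing this kind of finding in a report and wiring it into operations so someone acts on it while the client is still on hold.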

InformationWeek’s May 24, 2010 cover story was titled “Are Your Apps Smart Enough?” and it prominently featured a BI report card.  Here were the grades: (more…)

Posted in Business/IT Collaboration, Complex Event Processing, Operational Efficiency, Real-Time | Leave a comment