Category Archives: Ultra Messaging

Digital Signage Helps Reinvent the Store

Reinventing the store was one of the key topics at NRF. Over the last three to four years we have seen a great deal of investment in ecommerce innovation and in replatforming ecommerce strategies. Now retailers, CPG companies and brand manufacturers are working on a renaissance of the store and showroom, driven by digital. And there is still a long way to go.

An integral part of the omnichannel strategy of our PIM customer Murdoch’s Ranch and Home Supply is digital signage for in-store product promotions. This selfie was shot with my dear colleague Thomas Kasemir (VP R&D PIM & Procurement) at the NRF booth of Four Winds Interactive.


Four Winds serves about 5,000 companies worldwide, and I would consider them one of the market leaders. Alison Rank and her team showcased how static product promotions work and what dynamic, personalized product promotions can look like when John Doe enters the store.

John Doe’s Personalized Purchase Journey

John Doe and his wife are out and about in the city; on the advice of his son, John has created a profile on Facebook and Foursquare with his new-generation smartphone, enabling him to receive any special offers in his vicinity. Mr. Doe has voluntarily agreed to share his data for the specific purpose of allowing retailers to call to his attention any special offers in the area. Since both of them are interested in visiting the store, they respond to the offer.

At the entrance to the store he is advised to start up the special store app and is promised a “personalized shopping” experience. As John Doe enters the store, a friendly greeting appears on his digital signage screen: “Welcome Mr. Doe, the men’s suits are on the 3rd floor and we have the following offers for you.” Upon reaching the 3rd floor, the salesperson is already standing there with the right suit. The suit is one size smaller than usual, but it fits John Doe. After the fitting, the salesperson even points out the new women’s hat collection in the women’s department. Satisfied with their purchases, Mr. and Mrs. Doe leave the store.

To me it is clear that the future of shopping will look something like this, since all of these technologies are already available. But what has taken place? The reason John Doe receives location-based offers has already been explained above; the point that needs to be made is that there is now the ability to link personal and statistical data to customers. By means of the app, the store already knows whom it is dealing with as soon as the customer enters. Messaging services can even be used to alert a shop assistant that an A-customer with high-value shopping carts has just entered the store.

At this point, stores can leverage both personal information and location-based information to generate a personal greeting for the customer.

  • What did he buy? Which departments did he visit, and for how long?
  • When did he purchase his last suit(s)?
  • What sizes were these?
  • Does he have an online profile?
  • What does he order online, and does he complete the transaction?

All of this analytical data can be stored and retrieved behind the scenes. 
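The logic sketched above can be made concrete. The following is a hypothetical illustration only; all field names, the store layout mapping, and the `build_greeting` helper are assumptions for this post, not part of any digital signage product:

```python
# Hypothetical sketch: combining a stored customer profile with a store
# layout to generate the kind of personalized signage greeting described
# above. Every field name here is illustrative, not a real product schema.

def build_greeting(profile: dict, store_layout: dict) -> str:
    """Assemble a signage message from a customer's purchase history."""
    name = profile["name"]
    # Use the category of the most recent purchase to pick the department.
    last_category = profile["purchases"][-1]["category"]
    floor = store_layout[last_category]
    offers = ", ".join(profile.get("offers", []))
    return (
        f"Welcome Mr. {name}, the {last_category} are on the {floor} "
        f"and we have the following offers for you: {offers}."
    )

profile = {
    "name": "Doe",
    "purchases": [{"category": "men's suits", "size": "52"}],
    "offers": ["10% off suits", "free tailoring"],
}
store_layout = {"men's suits": "3rd floor"}
print(build_greeting(profile, store_layout))
```

In practice the profile lookup, the location trigger, and the rendering on the signage screen would each be separate services; the sketch only shows how the stored history answers the questions listed above.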

Catch Me if I Want

The targeted sales approach at the point of interest (POI) and point of sale (POS) is considered increasingly important. This type of communication is becoming dynamic and is taking precedence over traditional forms of advertising.

When entering the store today, customers are for the most part undecided. Based on this assumption, they can be influenced by ads and targeted product placement.  Customers are now willing to disclose their location data and personal information provided there is added value for them to do so.

Example from Vapiano Restaurant

A good example is the Vapiano restaurant chain. Vapiano restaurants go a step further than the traditional loyalty card by utilizing a special smartphone app through which the customer can not only choose the nearest restaurant, along with its special offers and menu, but also receive a kind of credit after payment via barcode. After collecting 10 credits, the restaurant guest receives a free main course on the 11th visit. Sound good? It sure does, and from the company’s perspective this is a win-win situation. These obvious benefits move the customer to disclose his or her eating habits and personal data. The restaurant chain now also has access to guests’ birth dates, which are rewarded as well. This approach to data collection is commendable, since it requires the guest’s explicit consent and assumes a certain degree of active participation from the guest to be eligible for the rewards offered by the restaurant.
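The loyalty mechanic described above is simple enough to sketch. The class and method names below are hypothetical, purely to illustrate the "10 credits, 11th visit free" rule:

```python
# Illustrative sketch of the loyalty mechanic described above: each paid
# visit earns one credit, and once 10 credits are collected the next
# (11th) visit earns a free main course. Names are hypothetical.

class LoyaltyAccount:
    FREE_COURSE_THRESHOLD = 10

    def __init__(self):
        self.credits = 0

    def record_visit(self) -> bool:
        """Return True if this visit earns a free main course."""
        if self.credits >= self.FREE_COURSE_THRESHOLD:
            self.credits = 0  # redeem the reward and reset the counter
            return True
        self.credits += 1
        return False

account = LoyaltyAccount()
results = [account.record_visit() for _ in range(11)]
print(results)  # visits 1-10 earn credits; visit 11 is free
```

The interesting part for the retailer is not the counter itself, of course, but the consented profile data (visits, eating habits, birth date) that accumulates alongside it.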

Summary

If John Doe allows me, as a brand manufacturer in my showroom or as a retailer in my store, to catch him, companies will need to ensure that they can really identify John Doe with an all-channel customer profile in order to come up with a personalized offer on digital signage. But this needs to be covered in additional blogs…

Posted in Data Governance, Manufacturing, Master Data Management, PiM, Product Information Management, Real-Time, Retail, Ultra Messaging

Riding The Wave – Forrester Style


Forrester Research, a leading independent analyst firm, just released a new Wave™ report on Big Data Streaming Analytics Platforms, and in it Informatica was designated a Leader.

This is exciting news for a number of reasons.

– Personally, as a product leader focused on some of our newer technologies at Informatica, this is a positive sign of wider acceptance in the market. One might argue it’s the start of a mainstreaming process. Analyst firms don’t usually release these reports unless there are clear signs of a critical mass of customer interest (among other criteria). And indeed, in their report, the authors cited Forrester survey data revealing that firms’ use of streaming analytics increased 66% in the past two years.

– To validate product and vendor qualifications, Forrester conducted reference calls with current customers, so thank you to our customers for feedback you’ve provided to the analysts about our Big Data Streaming Analytics platform. It means a lot.

– The authors make an important point in the report when they write, “Streaming Analytics is anything but a sleepy, rear-view-mirror analysis of data. No, it is about knowing and acting on what’s happening in your business – now… The high velocity, white water flow of data from innumerable real-time data sources such as market data, Internet of Things, mobile, sensors, click stream and even transactions remain largely unnavigated by most firms. The opportunity to leverage streaming analytics has never been greater.” We would agree.

– Finally it’s been an area of importance, investment, and diligent work (aka Blood Sweat and Tears) for Informatica for a while now. This really validates for us that we’ve been carving our surfboard in the right direction and now we are totally stoked that we’ve caught a righteous gnarly wave.

So while we’ll celebrate this accomplishment for today, the work really begins now…

To read the full report, The Forrester Wave™: Big Data Streaming Analytics Platforms, Q3 2014, hang loose and surf on over here.


 

Posted in Big Data, Complex Event Processing, Real-Time, Ultra Messaging

Big Data Ingestion got you down? I N F O R M A T I C A spells relief


I have a little fable to tell you…

This fable has nothing to do with Big Data, but instead deals with an Overabundance of Food and how to better digest it to make it useful.

And it all started when this SEO copywriter from IT Corporation walked into a bar, pub, grill, restaurant, liquor establishment, and noticed 2 large crowded tables.  After what seemed like an endless loop, an SQL programmer sauntered in and contemplated the table problem. “Mind if I join you?” he said.  Since the tables were partially occupied and there were no virtual tables available, the host looked on the patio of the restaurant at 2 open tables.  “Shall I do an outside join instead?” asked the programmer.  The host considered their schema and assigned 2 seats to the space.

The writer told the programmer to look at the menu, bill of fare, blackboard – there were so many choices but not enough real nutrition. “Hmmm, I’m hungry for the right combination of food, grub, chow, to help me train for a triathlon” he said.  With that contextual information, they thought about foregoing the menu items and instead getting in the all-you-can-eat buffer line. But there was too much food available and despite its appealing looks in its neat rows and columns, it seemed to be mostly empty calories.  They both realized they had no idea what important elements were in the food, but came to the conclusion that this restaurant had a “Big Food” problem.

They scoped it out for a moment, and then the writer did an about face, reversal, change in direction and the SQL programmer did a commit and quick pivot toward the buffer line, where they did a batch insert of all of the food, even the BLOBS of spaghetti, mashed potatoes and jello.  There was far too much and it was far too rich for their tastes and needs, but they binged and consumed it all.  You should have seen all the empty dishes at the end – they even caused a stack overflow. Because it was a batch binge, their digestive tracts didn’t know how to process all of the food, so they got a stomach ache from “big food” ingestion – and it nearly caused a core dump – in which case the restaurant host would have assigned his most dedicated servers to perform a thorough cleansing and scrubbing. There was no way to do a rollback at this point.

It was clear they needed relief.  The programmer did an ad hoc query to JSON, their Server who they thought was Active, for a response about why they were having such “big food” indigestion, and whether they had packets of relief available.  No response. Then they asked again. There was still no response.  So the programmer said to the writer, “Gee, the Quality Of Service here is terrible!”

Just then, the programmer remembered a remedy he had heard about previously and so he spoke up.  “Oh, it’s very easy just <SELECT>Vibe.Data.Stream from INFORMATICA where REAL-TIME is NOT NULL.”

Informatica’s Vibe Data Stream enables streaming food collection for real-time Big food analytics, operational intelligence, and traditional enterprise food warehousing from a variety of distributed food sources at high scale and low latency. It enables the right food ingested at the right time when nutrition is needed without any need for binge or batch ingestion.
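The difference the fable is dramatizing, batch ingestion versus streaming ingestion, can be sketched in a few lines. To be clear, this is a conceptual illustration and not the Vibe Data Stream API; all function names here are made up:

```python
# Conceptual sketch (not the Vibe Data Stream API): batch ingestion
# buffers every record before any processing happens, while streaming
# ingestion processes each record the moment it arrives.

import time
from typing import Iterable, Iterator

def source(n: int) -> Iterator[dict]:
    """Simulate a distributed data source emitting records over time."""
    for i in range(n):
        yield {"id": i, "payload": f"record-{i}", "ts": time.time()}

def batch_ingest(records: Iterable[dict]) -> list:
    # Everything lands at once: memory grows with the batch, and nothing
    # is processed until the whole batch arrives ("big food" indigestion).
    buffered = list(records)
    return [r["id"] for r in buffered]

def stream_ingest(records: Iterable[dict]) -> Iterator[int]:
    # Each record is handled as it arrives: constant memory, low latency.
    for r in records:
        yield r["id"]

print(list(stream_ingest(source(3))))  # → [0, 1, 2]
```

Both paths produce the same results; the difference is when the work happens and how much has to be held in memory along the way, which is exactly the stomach ache in the story.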

And so they all lived happily ever after and all was good in the IT Corporation once again.

***

If you think you know what this fable is about and want a more thorough and technical explanation, check out this tech talk here.

Or

Download Now and take your first steps to rapidly developing applications that sense and respond to streaming food (or data) in real-time.

 

Posted in Architects, Big Data, Complex Event Processing, Data Integration, Data Synchronization, Hadoop, Marketplace, Real-Time, Ultra Messaging

Keeping it “Real” at Informatica World 2014


Most have heard by now that Informatica World 2014 will be a unique opportunity for attendees to get new ideas, expert advice, and hands-on demonstrations on real-time data integration products and capabilities at the premier data conference of the year. 

However, it is Las Vegas after all, so for those of you wagering on the best sessions, here’s my quick guide (or morning line) for your viewing, research or attending pleasure.

I hope to see you there.

Tuesday May 13, 2014

BREAKOUT SESSIONS

1:00 PM (GRACIA 8):
ARCH101 – Enterprise Data Architecture for the Data-Centric Enterprise 
How do you build an architecture that increases development speed, protects quality, and reduces cost and complexity, all while accelerating current project delivery?
Srinivas Kolluru – Chief Architect, Southwest Power Pool, Inc.
Tom Kato – Architect, US Airways/American Airlines
John Schmidt – Vice President, Informatica

 4:45 PM (GRACIA 8):
ARCH113 – Western Union: Implementing a Hadoop-based Enterprise Data Hub with Cloudera and Informatica 
To expand its business and delight customers with proactive, personalized web and mobile marketing, Western Union needs to process massive amounts of data from multiple sources.
Pravin Darbare – Senior Manager Integration and Transformation, Western Union
Clarke Patterson – Sr. Director of Product Marketing, Cloudera, Inc.

 4:45 PM (GRACIA 3):
IPaW132 – NaviNet, Inc and Informatica: Delivering Network Intelligence… The Value to the Payer, Provider and Patient 
Healthcare payers and providers today must share information in unprecedented ways to achieve their goals of reducing redundancy, cutting costs, coordinating care and driving positive outcomes.
Frank Ingari – CEO, NaviNet, Inc.

HANDS-ON LABS

11:30 AM (BRERA BALLROOM/TABLE 17B):
Proactive Monitoring of PowerCenter Environments 
Expert led sessions by Informatica Product Management 45 min Hands-On Labs
Indu Thomas – Director QA Engineering, Informatica
Elizabeth Duke – Principal Sales Consultant, Informatica

 11:30 AM (BRERA BALLROOM/TABLE 32):
Informatica Data Replication 
Expert led sessions by Informatica Product Management 45 min Hands-On Labs
Glenn Goodrich – Sr. Manager Technical Enablement, Informatica
Alex Belov – Senior Development Manager, Informatica
Phil Line – Principal Product Manager, Informatica
Andy Bristow – Product Specialist, Informatica

 BOOTHS 5:30-8:00 PM

2032- PowerExchange Change Data Capture 9.6
2065- Informatica Data Replication
2136- Informatica Real-time Data Integration
2137- Vibe Data Stream for Machine Data

Wednesday, May 14, 2014

BREAKOUT SESSIONS

11:30 AM (CASTELLANA 1)
ARCH109 – Proactive Analytics: The Next-Generation of Business Intelligence 
Business users demand self-service tools that give them faster access to better insights. Exploding data volumes and variety make finding relevant, trusted information a challenge.
John Poonen – Director, Infosario Data Services Group, Quintiles
Senthil Kanakarajan – VP, Technology Manager, Wells Fargo
Nelson Petracek – Senior Director Emerging Technology Architecture, Informatica

3:45 PM (GRACIA 8)
ARCH102 – Best Practices and Architecture for Agile Data Integration
Business can’t wait for IT to deliver data and reports that may not meet business needs by the time they’re delivered. In the age of self-service and lean integration, business analysts need more control even as IT continues to govern development.
Jared Hillam – EIM Practice Director, Intricity, LLC
John Poonen – Director, Infosario Data Services Group, Quintiles
Robert Myers – Tech Delivery Manager, Informatica

3:45 PM (GRACIA 6)
ARCH119 – HIPAA Validation for Eligibility and Claims Status in Real Time
Healthcare reform requires healthcare payers to exchange and process HIPAA messages in less time with greater accuracy. Learn how Health Net met regulatory requirements and limited both costs and expensive rework by architecting a real-time data integration architecture that lets it respond to eligibility and claims status requests within six seconds, near error-free.
Jerry Allen – IT Architect, Health Net, Inc.

3:45 PM (GRACIA 1)
ARCH114 – Bi-Directional, Real-Time Hadoop Streaming 
As organizations seek faster access to data insights, Hadoop is becoming the architectural foundation for real-time data processing environments. With the increase of Hadoop deployments in operational workloads, the importance of real-time and bi-directional data integration grows. MapR senior product management director Anoop Dawar will describe a streaming architecture that combines Informatica technologies with Hadoop. You’ll learn how this architecture can augment capabilities around 360-degree customer views, data warehouse optimization, and other big data business initiatives.
Anoop Dawar – Senior Director, Product Management, MapR Technologies

 HANDS-ON LABS

11:30 AM (BRERA BALLROOM/TABLE 17B)
Proactive Monitoring of PowerCenter Environments 
Expert led sessions by Informatica Product Management 45 min Hands-On Labs
Indu Thomas – Director QA Engineering, Informatica
Elizabeth Duke – Principal Sales Consultant, Informatica

 11:30 AM (BRERA BALLROOM/TABLE 32)
Informatica Data Replication
Expert led sessions by Informatica Product Management 45 min Hands-On Labs
Glenn Goodrich – Sr. Manager Technical Enablement, Informatica
Alex Belov – Senior Development Manager, Informatica
Phil Line – Principal Product Manager, Informatica
Andy Bristow – Product Specialist, Informatica
 

Thursday, May 15, 2014

BREAKOUT SESSIONS

 DEV108 – Informatica and the Information Potential of the “Internet of Things” 
The “Internet of Things” and its torrents of data from multiple sources — clickstreams from web servers, application and infrastructure log data, real-time systems, social media, sensor data, and more — offers an unprecedented opportunity for insight and business transformation. Learn how Informatica can help you access and integrate these massive amounts of real-time data with your enterprise data and achieve your information potential.
Amrish Thakkar – Senior Product Manager, Informatica
Boris Bulanov – Senior Director Solutions Product Management, Informatica

 

Posted in Complex Event Processing, Data Integration Platform, Informatica Events, Informatica World 2014, Real-Time, Ultra Messaging

History Repeats Itself Through Business Intelligence (Part 2)


In a previous blog post, I wrote about how, when business “history” is reported via Business Intelligence (BI) systems, it’s usually too late to make a real difference.  In this post, I’m going to talk about how business history becomes much more useful when combined operationally and in real time.

The historian E. P. Thompson pointed out that all history is the history of unintended consequences.  His theory was that history is not always recorded in documents, but is ultimately derived from examining cultural meanings and the structures of society through hermeneutics (the interpretation of texts), semiotics, and the many forms and signs of the times. He concluded that history is created by people’s subjectivity and therefore ultimately represents how they REALLY live.

The same can be extrapolated to businesses.  However, the BI systems of today capture only a minuscule piece of the larger pie of knowledge that may be gained from things like meetings, videos, sales calls, anecdotal win/loss reports, shadow IT projects, 10-Ks and 10-Qs, even company blog posts ;-)  The point is: how can you better capture the essence, meaning and perhaps importance of the everyday non-database events taking place in your company and its activities? In other words, how it REALLY operates.

One of the keys to figuring out how businesses really operate is identifying and utilizing the undocumented RULES that underlie almost every business.  Select company employees, often veterans, know these rules intuitively. Every company has such people; if you watch them, they just have a knack for getting projects pushed through the system, making customers happy, or diagnosing a problem quickly and with little fanfare.  They just know how things work and what needs to be done.

These rules have been, and still are, difficult to quantify and apply, or “data-ify” if you will. Certain companies (and hopefully Informatica) will end up being major players in the race to data-ify these non-traditional rules and events, in addition to helping companies make sense of big data in a whole new way. But daydreaming about it, it’s not hard to imagine business systems that will eventually be able to understand the optimization rules of a business, account for possible unintended scenarios or consequences, and then apply them when they are most needed.  That, anyhow, is the goal of a new generation of Operational Intelligence systems.

In my final post on the subject, I’ll explain how it works and the business problems it solves (in a nutshell). And if I’ve managed to pique your curiosity and you want to hear about Operational Intelligence sooner, tune in to a webinar we’re having TODAY at 10 AM PST. Here’s the link.

http://www.informatica.com/us/company/informatica-talks/?commid=97187

Posted in Big Data, Business Impact / Benefits, Business/IT Collaboration, CIO, Complex Event Processing, Data Integration Platform, Operational Efficiency, Real-Time, SOA, Ultra Messaging

Sub-100 Nanosecond Pub/Sub Messaging: What Does It Matter?

Our announcement last week was an exciting milestone for those of us who started at 29West supporting the early high-frequency traders from 2004 to 2006. Last week, we announced the next step in a 10-year effort that has now seen us set the bar for low-latency messaging lower by nearly five orders of magnitude, in Version 6.1 of Informatica Ultra Messaging with Shared Memory Acceleration (SMX). The really cool thing is that we have helped early customers like Intercontinental Exchange and Credit Suisse take advantage of the reductions from 2.5 million nanoseconds (ns) of latency to now as low as 37 ns on commodity hardware and networks, without having to switch products or do major rewrites of their code.

But as I said in the title, what does it matter? Does being able to send messages to multiple receivers within a single box trading system or order matching engine in 90 ns as opposed to one microsecond really make a difference?

Well, according to a recent article by Scott Appleby on TabbFORUM, “The Death of Alpha on Wall Street,”* the only way for investment banks to find alpha, or excess returns, is “to find valuation correlations among markets to extract microstructure alpha.”  He states, “Getco, Tradebot and Renaissance use technology to find valuation correlations among markets to extract microstructure alpha; this still works, but requires significant capital.”  What the extra hundreds of nanoseconds that SMX frees up allow a company to do is make its matching algorithms or order routers that much smarter, by doing dozens of additional complex calculations before the computer makes a decision. Furthermore, busy software developers can let the messaging layer take over integrating software components that may be less critical to producing alpha (but very important for operational risk control, like guaranteeing that messages can be captured off the single-box trading system for compliance and disaster recovery), freeing them to focus on changes in the microstructure of the markets.

The key SMX innovation is another “less is more” style engineering feat from our team. Basically, SMX eliminates any copying of messages from the message delivery path. And of course, if the processes in your trading system happen to be running within the same CPU, on the same or different cores, messages are being sent within the memory cache of the core or CPU.   The other reason this matters is that because this product uniquely (as far as I know) allows zero-copy shared-memory communication between Java, C, and Microsoft .NET applications, developers can fully leverage the best features and the knowledge of their teams to deploy complex high-performance applications. For example, this allows third-party feed handlers built in C to communicate at extremely low latencies with algo engines written in Java.
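SMX itself is proprietary, but the underlying idea of shared-memory transport, where the consumer reads the producer's bytes in place rather than receiving a copy over a socket, can be sketched with Python's standard `multiprocessing.shared_memory` module. This is an analogy only, not SMX code, and the message format is made up:

```python
# Illustration of shared-memory message transport (an analogy for SMX,
# not SMX itself): the producer writes a message into a named segment,
# and the consumer attaches to the same segment and reads it in place.

from multiprocessing import shared_memory

message = b"order:BUY 100 XYZ @ 42.50"

# Producer: write the message into a named shared-memory segment.
shm = shared_memory.SharedMemory(create=True, size=len(message))
shm.buf[: len(message)] = message

# Consumer: attach to the same segment by name. There is no socket and
# no kernel copy of the payload on the delivery path; the bytes() call
# below copies only so we can display the result, and a real consumer
# would parse the buffer in place.
reader = shared_memory.SharedMemory(name=shm.name)
received = bytes(reader.buf[: len(message)])
print(received.decode())  # → order:BUY 100 XYZ @ 42.50

reader.close()
shm.close()
shm.unlink()
```

A production transport adds what this toy omits: lock-free signaling between producer and consumer, ring-buffer management so messages can flow continuously, and cross-language bindings, which is where the engineering effort described above actually goes.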

So congrats to the UM development team for achieving this  important milestone and “thanks” to our customers for continuing to push us to provide you with that “lagniappe” of extra time that can make all the difference in the success of your trading strategies and  your businesses.

 

*- http://tabbforum.com/opinions/the-death-of-alpha-on-wall-street?utm_source=TabbFORUM+Alerts&utm_campaign=1c01537e42-UA-12160392-1&utm_medium=email&utm_term=0_29f4b8f8f1-1c01537e42-270859141

Posted in Banking & Capital Markets, Ultra Messaging

Messaging Challenges Over The WAN

Originally posted on low-latency.com

 

 

Many global companies, especially in the front office of capital markets, have a strategic need to send real-time data from one location to another, often thousands of miles away, over a Wide Area Network (or WAN) connection.

With foreign exchange, for example, global investment banks send dealable streaming prices, orders, trades, and reference data across the WAN. (more…)

Posted in Ultra Messaging

Evolution: The General Store to the Individualized Shopper

The retail industry has seen a major transformation over the course of its evolution. Tracing back to the 18th century, when the concept of retail was limited to the “general store,” the industry has now grown into the concept of the “individualized shopper.” Initially, the power lay with merchants; then it shifted to “brands,” and then to “big-box chain stores.” Today the power has shifted squarely to the consumer. This new consumer is empowered, primarily through online and social channels, with nearly limitless options on their path to purchase. The industry is beginning to recognize that the experience it offers, whether in store or online, must be centered on this new consumer reality. (more…)

Posted in Data Integration Platform, Informatica University, Master Data Management, Ultra Messaging

Informatica Ultra Messaging: Great Performance . . . At Lower Cost?

Printed words are good, but pictures and sound are better. Watch the video below for a quick summation of how Informatica Ultra Messaging can help your business:

  1. Increase application performance and throughput
  2. Reduce fixed and operational costs
  3. Increase capacity
  4. Reduce single points of failure
  5. Increase scalability, reliability, and availability

For more information, have a look at: Ultra Messaging, Better Value with Better Technology.

Posted in Business Impact / Benefits, Operational Efficiency, Real-Time, Ultra Messaging

Capital Markets and Ultra Messaging

Pete Benesh discusses some of the major business challenges driving IT investment in capital markets today. He also highlights how Informatica Ultra Messaging can help organizations tackle these challenges.
 

Posted in Ultra Messaging