Alan Lundberg

Alan Lundberg is Principal Marketing Manager for Emerging Products at Informatica, responsible for driving product strategy and marketing around Event Processing and Real-Time Operational Intelligence. Alan is a well-known author and speaker on Event Processing and has a long record of successfully marketing advanced enterprise software products. Prior to Informatica, he held product management and marketing management roles at TIBCO, FICO Software, and Inference. He is a steering committee member and co-founder of the Event Processing Technical Society and has authored technical articles in Business Intelligence Journal and AI Expert, among others. Mr. Lundberg received his M.B.A. from Loyola Marymount University.

Let’s Get Real… At Informatica World 2015

Ah yes. Las Vegas in May… not a bad place to be, unless you are planning to recreate the latest Hangover movie.

Informatica World happens to be there May 11–15, 2015, and once again I’ll offer my predictions (aka Morning Line) for your BEST viewing and learning opportunities. If you see all of these, who knows, you just may win the EXACTA for real-time prowess. These sessions will especially appeal to you if you fancy the latest in real-time Big Data integration.

So without further ado, here are your best bets:

May 12, 1:30 – 2:30  Room: Gracia 2

How to Support Real-Time Data Integration Projects with PowerCenter

James Chidichimo, Vice President of Information Technology, PRA Group
Brandon Wyckoff, Data Warehouse Administrator, PRA Group
Raj Khot, IT Manager – Application/Data Management, Integration and BI, Grant Thornton LLP
Lokesh Krishnamoorthy, Project Lead, IT Data Integration, Grant Thornton LLP
Terry Simonds, Senior Director, Business Development, Informatica

Informatica recently released PowerCenter v9.6, designed to deliver end-to-end data integration agility in support of business agility. A big part of business agility is ensuring that your IT organization can make the most current and trustworthy data available, in whatever form necessary, regardless of where it resides, at the right time.

  • Maybe you want to do more real-time data warehousing by shortening your batch windows
  • Maybe you want to synchronize all operational and transactional systems with consistent and accurate information
  • Maybe you want to execute intraday batches to augment overnight batch processing in your data warehousing projects
  • Maybe you want to better integrate the data layer with an application integration layer (e.g., EAI, SOA)

In this session, we will discuss and demo how you can leverage your existing Informatica skills and PowerCenter’s real-time capabilities, including change data capture (CDC), integration with messaging systems, and Web services, on a single platform. You’ll see how to increase the speed of accessing and delivering data, mix real-time operational data with historical data, and capture changes in near real time, non-invasively and without impacting the source system, to support IT projects including real-time data warehousing, data synchronization, data replication, operational data hubs, and more.
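PowerCenter configures CDC through the product itself, so no coding is required there; but if the pattern is new to you, here is a minimal sketch, assuming a hypothetical `orders` table with an `updated_at` column, of the simplest (watermark-based) form of change capture. Note that Informatica’s CDC is log-based and non-invasive, which this polling toy is not:

```python
import sqlite3
import time

# Toy watermark-based change capture (illustrative only; the table,
# columns, and deliver() are invented). Poll for rows changed since
# the last high-water mark and hand each one downstream, in order.

def deliver(row):
    print("change:", row)  # e.g., publish to a queue or apply to a target

def capture_changes(conn, last_mark):
    cur = conn.execute(
        "SELECT id, amount, updated_at FROM orders "
        "WHERE updated_at > ? ORDER BY updated_at",
        (last_mark,),
    )
    rows = cur.fetchall()
    for row in rows:
        deliver(row)
    return rows[-1][2] if rows else last_mark  # advance the watermark

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER, amount REAL, updated_at REAL)")
    conn.execute("INSERT INTO orders VALUES (1, 9.99, ?)", (time.time(),))
    mark = capture_changes(conn, 0.0)  # prints the new row once
```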

May 14, 10:10 – 11:10 Room: Gracia 1

Big Data Real Time Streaming Analytics Session

Scott Hagan, Principal Product Manager, Informatica
Terry Simonds, Senior Director, Business Development, Informatica
Amrish Thakkar, Principal Product Manager, Informatica

One of the more interesting and advanced use cases for Informatica has been streaming and predictive analytics for Big Data.

In this session, you will hear from experts who have implemented a streaming analytics application that predicts when spare parts on jet engines will need service, before they actually fail.

Don’t miss this interactive session, which will feature discussions of real-time best practices and tips and tricks, from design to implementation.

May 14, 2:30 – 3:30    Room: Gracia 6

Accelerating Business with Near Real-Time Architectures

Deepak Gattala, Solution Architect, Dell Inc.
Hari Kalla, Principal Consultant, SSG Limited

This session will describe near real-time architectures for accelerating the delivery of data to critical analytics and customer service applications.

Architects from Dell and SSG will talk about how they are using Vibe Data Stream and other technologies to build architectures that bring near real-time data to key applications and analytics use cases.

The business benefits of these approaches include timely business decisions and more effective and timely customer support.

May 13, 2:00 – 2:45   Room: Gracia 3

Building a Real-time Data Platform for Improved Customer Engagement

Chris Hammond, Project Director, British Telecom
Terry Britt, Senior DBA, Camping World
Luc Clement, Director Product Management, Informatica Cloud

Every company says that customers are their #1 priority, but how do you use the data throughout your organization to go the extra mile? Join British Telecom and Camping World as they discuss their different approaches to improving the customer experience.

British Telecom decided to consolidate customer data from seven different Salesforce systems in real time to improve customer response times.

Camping World forged a strategy around customer satisfaction to support a dynamic B2C retail business.

You’ll also hear how both companies implemented elegant and successful customer service solutions with Informatica Cloud. This is sure to be a fascinating session, with lots of practical advice on how to improve your customer experience with real-time app integration and customer analytics that deliver a 360-degree view of every customer interaction.

May 14, 11:20 – 12:20   Room: Gracia 8

Enabling Digital Transformation with Real-Time Master Data Management
Sanjeev Gupta, CIO, Cover-More Group

Cover-More has embarked on the development of a customer-centric strategy. A core component of the strategy is the use of digital channels as the key mechanism to understand and build deeper relationships with customers and prospects.

Success in the digital channel depends on good-quality customer information and a 360-degree view of the customer. Currently, Cover-More faces several challenges, including information located across disparate sources and customer information held in silos. With a mandate to rapidly enable this strategy, Cover-More has partnered with Informatica and Capgemini to implement a real-time Master Data Management solution that will eliminate duplicate records at the time of onboarding, integrate in real time with the online channel, and create a more seamless experience for customers.
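As a simplistic illustration of what eliminating duplicates at onboarding time means, here is a hypothetical sketch; real MDM matching (Informatica MDM included) uses much richer probabilistic and fuzzy rules, and every name and field below is invented:

```python
# Simplistic duplicate elimination at onboarding (illustrative only;
# production MDM matching is probabilistic/fuzzy, not an exact key).

def normalize(record):
    # Cheap identity key: trimmed, lowercased name + email
    return (record["name"].strip().lower(), record["email"].strip().lower())

master = {}  # golden records keyed by normalized identity

def onboard(record):
    key = normalize(record)
    if key in master:
        print("duplicate caught at onboarding; master id", master[key]["id"])
        return master[key]
    master[key] = record
    return record

onboard({"id": 1, "name": "Ana Gomez", "email": "ana@example.com"})
onboard({"id": 2, "name": " ana gomez ", "email": "ANA@example.com "})  # matches id 1
```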

The Big Data Journey: Traditional BI to Next Gen Analytics – Customer Panel

  • Deepika Sinha, IT Manager Enterprise Business Solutions, Johnson & Johnson
  • Dave Beaudoin, VP Data Architecture, Transamerica
  • Christine Miesner, Manager E&P Data Management, Devon Energy
  • Thomas Reichel, Lead Architect, KPN

May 12, 3:30 – 4:30    Room: Gracia 6

Modernize Your Application Architecture and Boost Your Business Agility

Tom Kato, CEO, Mototak LLC
Rick Mutsaers, Senior Consultant, Informatica

Big application projects present a chance to take a fresh look at your overall application and data architecture. Done right, application consolidation, application migration, and M&A integration offer the chance to rationalize your architecture, reducing IT costs and speeding future project delivery through a simplified environment.

We will talk about approaches to these complex projects as well as best practices to lower your risk and increase productivity.

So enjoy the conference, and I hope to see (or meet) you there.


Big Data Is Neither, Part II

You Say Big Dayta, I Say Big Dahta

Some say Big Data is a great challenge, while others say Big Data creates new opportunities. Where do you stand? For most companies concerned with their Big Data challenges, it shouldn’t be so difficult – at least on paper. Computing costs (both hardware and software) have shrunk dramatically. Databases and storage techniques have become more sophisticated and scale massively, and companies such as Informatica have made connecting and integrating all the “big” and disparate data sources much easier, helping companies achieve a sort of “big data synchronicity”.

In the process of creating solutions to Big Data problems, humans (and the supra-species known as IT Sapiens) have a tendency to use theories based on linear thinking and the scientific method. There is data as our systems know it, and data as our systems don’t. The reality, in my opinion, is that “Really Big Data” problems, now and in the future, will have complex correlations and unintuitive relationships that require mathematical disciplines, data models, and algorithms that haven’t even been discovered or invented yet – and that, once discovered, will make current database science look positively primordial.

At some point in the future, machines will be able to predict, based on big and perhaps unknown data types, when someone is having a bad day or a good day, or, more importantly, whether a person may behave in a good or bad way. Many people do this now when they glance at someone across a room and infer how that person is feeling or what they will do next. They see eyes that are shiny or dull and crinkles around the eyes or the sides of the mouth, they hear the “tone” in a voice, and their neurons put it all together: this is a person who is having a bad day and needs a hug. Quickly. No one knows exactly how the human brain does this, but it does what it does, we go with it, and we are usually right.


And someday, Big Data will be able to derive this; it will be an evolution point, and it will also be a big business opportunity. Through bigger and better data ingestion and integration techniques and more sophisticated math and data models, a machine will do this fast and, relatively speaking, cheaply. The vast majority won’t understand why or how it’s done, but it will work, and it will be fairly accurate.

And my question to you all is this:

Do you see any alternate scenarios for the future of big data? Is contextual computing an important evolution, and will big data integration be more or less of a problem in the future?

P.S. Oh yeah, one last thing to chew on concerning Big Data… If Big Data becomes big enough, does that spell the end of modeling as we know it?


Big Data Is Neither, Part I

I’ve been having some interesting conversations with work colleagues recently about the Big Data hubbub, and I’ve come to the conclusion that “Big Data” as hyped is neither, really. In fact, both terms are relative. “Big” 20 years ago to many may have been 1 terabyte. “Data” 20 years ago may have been flat files or Sybase, Oracle, Informix, SQL Server, or DB2 tables. Fast forward to today: “Big” is now exabytes (millions of terabytes), and “Data” has expanded to include events, sensors, messages, RFID, telemetry, GPS, accelerometers, magnetometers, IoT/M2M, and other new and evolving data classifications.

And then there’s social and search data.

Surely you would classify Google’s data as really, really big data. When I run a search and get 487,464,685 results within a fraction of a second, it’s clear they have gotten a handle on their big data speeds and feeds. However, it’s also telling that nearly all of those bazillion results are not actually relevant to what I am searching for.

My conclusion is that if you have the right algorithms, invest in and use the right hardware and software technology, and make sure to measure the pertinent data sources, harnessing big data can yield speedy and “big” results.

So what’s the rub then?

It usually boils down to having larger and more sophisticated data stores and still not understanding their structure; or the data can’t be integrated into cohesive formats; or there is important hidden meaning in the data that we don’t have the wherewithal to derive, see, or understand, a la Google. So how DO you find the timely and important information in your company’s big data (AKA the needle in the haystack)?


More to the point, how do you better ingest, integrate, parse, analyze, prepare, and cleanse your data to get the speed, but also the relevancy, in a Big Data world?

Hadoop-related tools are among the current technologies of choice for solving Big Data problems, and as an Informatica customer, you can leverage them regardless of whether yours is Big Data or Not So Big Data, fast data or slow data. In fact, it actually astounds me that many IT professionals would go back to hand coding on Hadoop simply because they don’t know that the tools to avoid it are right under their nose, installed and running in their familiar Informatica user interface (AND working with Hadoop right out of the box).

So what does your company get out of using Informatica in conjunction with Hadoop tools? Better customer service and responsiveness, better operational efficiencies, more effective supply chains, better governance, service assurance, and the ability to discover previously unknown opportunities, as well as to stop problems as they emerge – not after the fact. In other words, Big Data done right can be a great advantage to many of today’s organizations.

Much more to say on this subject as I delve into the future of Big Data. For more, see Part 2.


Fast and Fasterer: Screaming Streaming Data on Hadoop


This is a guest blog post, written by Dale Kim, Director of Product Marketing at MapR Technologies.

Recently published research shows that “faster” is better than “slower.” The point, ladies and gentlemen, is that speed, for lack of a better word, is good. But granted, you won’t always have the need for speed. My Lamborghini is handy when I need to elude the Bakersfield fuzz on I-5, but it does nothing for my Costco trips. There, I go with capacity and haul home my 30-gallon tubs of ketchup with my Ford F150. (Note: this is a fictitious example; I don’t actually own an F150.)

But if speed is critical, like in your data streaming application, then Informatica Vibe Data Stream and the MapR Distribution including Apache™ Hadoop® are the technologies to use together. But since Vibe Data Stream works with any Hadoop distribution, my discussion here is more broadly applicable. I first discussed this topic earlier this year during my presentation at Informatica World 2014. In that talk, I also briefly described architectures that include streaming components, like the Lambda Architecture and enterprise data hubs. I recommend that any enterprise architect should become familiar with these high-level architectures.

Data streaming deals with a continuous flow of data, often at a fast rate. As you might’ve suspected by now, Vibe Data Stream, based on the Informatica Ultra Messaging technology, is great for that. With its roots in high speed trading in capital markets, Ultra Messaging quickly and reliably gets high value data from point A to point B. Vibe Data Stream adds management features to make it consumable by the rest of us, beyond stock trading. Not surprisingly, Vibe Data Stream can be used anywhere you need to quickly and reliably deliver data (just don’t use it for sharing your cat photos, please), and that’s what I discussed at Informatica World. Let me discuss two examples I gave.

Large Query Support. Let’s first look at “large queries.” I don’t mean the stuff you type on search engines, which are typically no more than 20 characters. I’m referring to an environment where the query is a huge block of data. For example, what if I have an image of an unidentified face, and I want to send it to a remote facial recognition service and immediately get the identity? The image would be the query, the facial recognition system could be run on Hadoop for fast divide-and-conquer processing, and the result would be the person’s name. There are many similar use cases that could leverage a high speed, reliable data delivery system along with a fast processing platform, to get immediate answers to a data-heavy question.
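To make the large-query idea concrete, here is what the client side might look like as a sketch; the facial-recognition endpoint, its response shape, and the `requests` dependency are all assumptions standing in for a Hadoop-backed service:

```python
import requests  # third-party HTTP client, assumed installed

# The "query" is a whole image; the answer is an identity. The service
# URL and response field are invented; imagine the endpoint fronting a
# Hadoop cluster doing divide-and-conquer face matching.

def identify_face(image_path):
    with open(image_path, "rb") as f:
        resp = requests.post(
            "https://facerec.example.com/identify",  # hypothetical endpoint
            data=f.read(),
            headers={"Content-Type": "application/octet-stream"},
            timeout=5,
        )
    resp.raise_for_status()
    return resp.json()["name"]  # assumed response field

print(identify_face("unknown_face.jpg"))  # e.g. -> "Dale Kim"
```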

Data Warehouse Onload. For another example, we turn to our old friend the data warehouse. If you’ve been following the industry talk about data warehouse optimization, you know that pumping high-speed data directly into your data warehouse is not an efficient use of that high-value system. So instead, pipe your fast data streams into Hadoop, run some complex aggregations, then load the processed data into your warehouse. You might also consider offloading large processing jobs from your data warehouse onto Hadoop. As you process and aggregate that data, you create a data flow cycle, returning enriched data back to the warehouse. This gives your end users efficient analysis on comprehensive data sets.
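Here is a toy sketch of that onload pattern, with a plain Python iterable standing in for the fast stream and a print standing in for the warehouse bulk load; in practice the stream would land in Hadoop (via something like Vibe Data Stream) and the rollup would run there:

```python
from collections import defaultdict

# Toy version of the onload pattern: absorb the raw, high-speed stream
# outside the warehouse, aggregate it down, and bulk-load only the
# compact result. Event shape and field names are invented.

def aggregate_stream(events):
    totals = defaultdict(lambda: [0, 0.0])  # sensor_id -> [count, sum]
    for sensor_id, reading in events:
        totals[sensor_id][0] += 1
        totals[sensor_id][1] += reading
    # One small row per sensor, ready for a bulk INSERT into the warehouse
    return [(sid, n, s / n) for sid, (n, s) in totals.items()]

raw_stream = [("s1", 0.9), ("s1", 1.1), ("s2", 5.0)]  # stand-in for the stream
for row in aggregate_stream(raw_stream):
    print("warehouse load:", row)  # (sensor_id, count, average)
```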

Hopefully this stirs up ideas on how you might deploy high speed streaming in your enterprise architecture. Expect to see many new stories of interesting streaming applications in the coming months and years, especially with the anticipated proliferation of internet-of-things and sensor data.

To learn more about Vibe Data Stream, you can find it on the Informatica Marketplace.


 


Riding The Wave – Forrester Style


Forrester Research, a leading independent analyst firm, just released a new Wave™ report on Big Data Streaming Analytics Platforms, and in it, Informatica was designated a Leader.

This is exciting news for a number of reasons.

– Personally, as a product leader focused on some of our newer technologies at Informatica, I see this as a positive sign of wider acceptance in the market. One might argue it’s the start of a mainstreaming process. Analyst firms don’t usually release these reports unless there are clear signs of a critical mass of customer interest (among other criteria). And indeed, the authors cited Forrester survey data revealing that firms’ use of streaming analytics increased 66% in the past two years.

– To validate product and vendor qualifications, Forrester conducted reference calls with current customers, so thank you to our customers for the feedback you’ve provided to the analysts about our Big Data Streaming Analytics platform. It means a lot.

– The authors make an important point in the report when they write, “Streaming Analytics is anything but a sleepy, rear-view-mirror analysis of data. No, it is about knowing and acting on what’s happening in your business – now… The high velocity, white water flow of data from innumerable real-time data sources such as market data, Internet of Things, mobile, sensors, click stream and even transactions remain largely unnavigated by most firms. The opportunity to leverage streaming analytics has never been greater.” We would agree.

– Finally, it’s been an area of importance, investment, and diligent work (aka Blood, Sweat, and Tears) for Informatica for a while now. This really validates for us that we’ve been carving our surfboard in the right direction, and now we are totally stoked that we’ve caught a righteous, gnarly wave.

So while we’ll celebrate this accomplishment for today, the work really begins now…

To read the full report, The Forrester Wave™: Big Data Streaming Analytics Platforms, Q3 2014, hang loose and surf on over here.


 


Big Data Ingestion got you down? I N F O R M A T I C A spells relief


I have a little fable to tell you…

This fable has nothing to do with Big Data, but instead deals with an Overabundance of Food and how to better digest it to make it useful.

And it all started when this SEO copywriter from IT Corporation walked into a bar, pub, grill, restaurant, liquor establishment, and noticed 2 large crowded tables. After what seemed like an endless loop, an SQL programmer sauntered in and contemplated the table problem. “Mind if I join you?” he said. Since the tables were partially occupied and there were no virtual tables available, the host looked out on the patio of the restaurant at 2 open tables. “Shall I do an outside join instead?” asked the programmer. The host considered their schema and assigned 2 seats to the space.

The writer told the programmer to look at the menu, bill of fare, blackboard – there were so many choices but not enough real nutrition. “Hmmm, I’m hungry for the right combination of food, grub, chow, to help me train for a triathlon,” he said. With that contextual information, they thought about foregoing the menu items and instead getting in the all-you-can-eat buffer line. But there was too much food available, and despite its appealing looks in its neat rows and columns, it seemed to be mostly empty calories. They both realized they had no idea what important elements were in the food, but came to the conclusion that this restaurant had a “Big Food” problem.

They scoped it out for a moment, and then the writer did an about face, reversal, change in direction, and the SQL programmer did a commit and a quick pivot toward the buffer line, where they did a batch insert of all of the food, even the BLOBs of spaghetti, mashed potatoes, and Jell-O. There was far too much, and it was far too rich for their tastes and needs, but they binged and consumed it all. You should have seen all the empty dishes at the end – they even caused a stack overflow. Because it was a batch binge, their digestive tracts didn’t know how to process all of the food, so they got a stomach ache from “big food” ingestion – and it nearly caused a core dump – in which case the restaurant host would have assigned his most dedicated servers to perform a thorough cleansing and scrubbing. There was no way to do a rollback at this point.

It was clear they needed relief. The programmer did an ad hoc query to JSON, their server who they thought was Active, for a response about why they were having such “big food” indigestion, and whether he had packets of relief available. No response. Then they asked again. Still no response. So the programmer said to the writer, “Gee, the Quality of Service here is terrible!”

Just then, the programmer remembered a remedy he had heard about previously, and so he spoke up. “Oh, it’s very easy: just <SELECT>Vibe.Data.Stream from INFORMATICA where REAL-TIME is NOT NULL.”

Informatica’s Vibe Data Stream enables streaming food collection for real-time Big food analytics, operational intelligence, and traditional enterprise food warehousing from a variety of distributed food sources at high scale and low latency. It enables the right food ingested at the right time when nutrition is needed without any need for binge or batch ingestion.

And so they all lived happily ever after and all was good in the IT Corporation once again.

***
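Fable aside, the moral translates back to data easily: streaming ingestion digests each record as it arrives instead of bingeing on an end-of-day batch. Here is a minimal sketch, with an in-process queue standing in for Vibe Data Stream’s distributed, low-latency delivery; everything in it is illustrative:

```python
import queue
import threading
import time

# Streaming vs. batch, back in data terms: process each record as it
# arrives -- no end-of-day binge, no indigestion.

events = queue.Queue()

def producer(n=5):
    for i in range(n):
        events.put({"seq": i, "ts": time.time()})  # one record at a time
        time.sleep(0.1)

def streaming_consumer(n=5):
    for _ in range(n):
        record = events.get()  # handled immediately on arrival
        print("digested record", record["seq"])

t = threading.Thread(target=producer)
t.start()
streaming_consumer()
t.join()
```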

If you think you know what this fable is about and want a more thorough and technical explanation, check out this tech talk here.

Or

Download Now and take your first steps toward rapidly developing applications that sense and respond to streaming food (or data) in real time.

 


Keeping it “Real” at Informatica World 2014

This dog is Keeping it Real

Most have heard by now that Informatica World 2014 will be a unique opportunity for attendees to get new ideas, expert advice, and hands-on demonstrations on real-time data integration products and capabilities at the premier data conference of the year. 

However, it is Las Vegas after all, so for those of you wagering on the best sessions, here’s my quick guide (or morning line) for your viewing, research or attending pleasure.

I hope to see you there.

Tuesday May 13, 2014

BREAKOUT SESSIONS

1:00 PM (GRACIA 8):
ARCH101 – Enterprise Data Architecture for the Data-Centric Enterprise 
How do you build an architecture that increases development speed, protects quality, and reduces cost and complexity, all while accelerating current project delivery?
Srinivas Kolluru – Chief Architect, Southwest Power Pool, Inc.
Tom Kato – Architect, US Airways/American Airlines
John Schmidt – Vice President, Informatica

 4:45 PM (GRACIA 8):
ARCH113 – Western Union: Implementing a Hadoop-based Enterprise Data Hub with Cloudera and Informatica 
To expand its business and delight customers with proactive, personalized web and mobile marketing, Western Union needs to process massive amounts of data from multiple sources.
Pravin Darbare – Senior Manager Integration and Transformation, Western Union
Clarke Patterson – Sr. Director of Product Marketing, Cloudera, Inc.

 4:45 PM (GRACIA 3):
IPaW132 – NaviNet, Inc. and Informatica: Delivering Network Intelligence… The Value to the Payer, Provider and Patient 
Healthcare payers and providers today must share information in unprecedented ways to achieve their goals of reducing redundancy, cutting costs, coordinating care and driving positive outcomes.
Frank Ingari – CEO, NaviNet, Inc.

HANDS-ON LABS

11:30 AM (BRERA BALLROOM/TABLE 17B):
Proactive Monitoring of PowerCenter Environments 
Expert led sessions by Informatica Product Management 45 min Hands-On Labs
Indu Thomas – Director QA Engineering, Informatica
Elizabeth Duke – Principal Sales Consultant, Informatica

 11:30 AM (BRERA BALLROOM/TABLE 32):
Informatica Data Replication 
Expert led sessions by Informatica Product Management 45 min Hands-On Labs
Glenn Goodrich – Sr. Manager Technical Enablement, Informatica
Alex Belov – Senior Development Manager, Informatica
Phil Line – Principal Product Manager, Informatica
Andy Bristow – Product Specialist, Informatica

 BOOTHS 5:30-8:00 PM

2032- PowerExchange Change Data Capture 9.6
2065- Informatica Data Replication
2136- Informatica Real-time Data Integration
2137- Vibe Data Stream for Machine Data

Wednesday, May 14, 2014

BREAKOUT SESSIONS

11:30 AM (CASTELLANA 1)
ARCH109 – Proactive Analytics: The Next-Generation of Business Intelligence 
Business users demand self-service tools that give them faster access to better insights. Exploding data volumes and variety make finding relevant, trusted information a challenge.
John Poonen – Director, Infosario Data Services Group, Quintiles
Senthil Kanakarajan – VP, Technology Manager, Wells Fargo
Nelson Petracek – Senior Director Emerging Technology Architecture, Informatica

3:45 PM (GRACIA 8)
ARCH102 – Best Practices and Architecture for Agile Data Integration
Business can’t wait for IT to deliver data and reports that may not meet business needs by the time they’re delivered. In the age of self-service and lean integration, business analysts need more control even as IT continues to govern development.
Jared Hillam – EIM Practice Director, Intricity, LLC
John Poonen – Director, Infosario Data Services Group, Quintiles
Robert Myers – Tech Delivery Manager, Informatica

3:45 PM (GRACIA 6)
ARCH119 – HIPAA Validation for Eligibility and Claims Status in Real Time
Healthcare reform requires healthcare payers to exchange and process HIPAA messages in less time with greater accuracy. Learn how Health Net met regulatory requirements and limited both costs and expensive rework by architecting a real-time data integration architecture that lets it respond to eligibility and claims status requests within six seconds, near error-free.
Jerry Allen – IT Architect, Health Net, Inc.

3:45 PM (GRACIA 1)
ARCH114 – Bi-Directional, Real-Time Hadoop Streaming 
As organizations seek faster access to data insights, Hadoop is becoming the architectural foundation for real-time data processing environments. With the increase of Hadoop deployments in operational workloads, the importance of real-time and bi-directional data integration grows. MapR senior product management director Anoop Dawar will describe a streaming architecture that combines Informatica technologies with Hadoop. You’ll learn how this architecture can augment capabilities around 360-degree customer views, data warehouse optimization, and other big data business initiatives.
Anoop Dawar – Senior Director, Product Management, MapR Technologies

 HANDS-ON LABS

11:30 AM (BRERA BALLROOM/TABLE 17B)
Table 17b – Proactive Monitoring of PowerCenter Environments 
Expert led sessions by Informatica Product Management 45 min Hands-On Labs
Indu Thomas – Director QA Engineering, Informatica
Elizabeth Duke – Principal Sales Consultant, Informatica

 11:30 AM (BRERA BALLROOM/TABLE 32)
Table 32 – Informatica Data Replication
Expert led sessions by Informatica Product Management 45 min Hands-On Labs
Glenn Goodrich – Sr. Manager Technical Enablement, Informatica
Alex Belov – Senior Development Manager, Informatica
Phil Line – Principal Product Manager, Informatica
Andy Bristow – Product Specialist, Informatica
 

Thursday, May 15, 2014

BREAKOUT SESSIONS

 DEV108 – Informatica and the Information Potential of the “Internet of Things” 
The “Internet of Things” and its torrents of data from multiple sources — clickstreams from web servers, application and infrastructure log data, real-time systems, social media, sensor data, and more — offers an unprecedented opportunity for insight and business transformation. Learn how Informatica can help you access and integrate these massive amounts of real-time data with your enterprise data and achieve your information potential.
Amrish Thakkar – Senior Product Manager, Informatica
Boris Bulanov – Senior Director Solutions Product Management, Informatica

 


Non-Clonetroversial Oracle Data Cloning


When the average person hears of cloning, my bet is that they think of the controversy and ethical issues surrounding it, such as the cloning of Dolly the sheep, or the possible cloning of humans by a mad geneticist in a rogue nation state. I would also put money down that when an Informatica blog reader thinks of cloning, they think of “The Matrix” or “Star Wars” (that dreadful Episode II: Attack of the Clones). I did. Unfortunately.

But my pragmatic expectation is that when Informatica customers think of cloning, they also think of Data Cloning software.  Data Cloning software clones terabytes of database data into a host of other databases, data warehouses, analytical appliances, and Big Data stores such as Hadoop.  And just for hoots and hollers, you should know that almost half of all Data Integration efforts involve replication, be it snapshot or real-time, according to TDWI survey data. Survey also says… replication is the second most popular — or second most used — data integration tool, behind ETL.

Cloning should be easy and very natural. It’s an important part of life (at your job). However, we can all admit that it is also a process that causes many a headache and ruins many a relationship.

Do your company’s cloning tools work with non-standard types? Know that Informatica cloning tools can reproduce Oracle data to just about anything on 2 tuples (or more).  We do non-discriminatory duplication, so it’s no wonder we especially fancy cloning the Oracle!  (a thousand apologies for the bad “Matrix” pun)

Just remember that data clones are an important and natural component of business continuity, and the use cases span both operational and analytic applications.  So if you’re not cloning your Oracle data safely and securely with the quality results that you need and deserve, it’s high time that you get some better tools.

Send in the Clones

With that in mind, if you haven’t tried cloning before, for a limited time Informatica is making its Fast Clone database cloning software available as a free trial download. Click here to get it now.


History Repeats Itself Through Business Intelligence (Part 2)


In a previous blog post, I wrote about how, when business “history” is reported via Business Intelligence (BI) systems, it’s usually too late to make a real difference. In this post, I’m going to talk about how business history becomes much more useful when combined operationally and in real time.

E. P. Thompson, a historian, pointed out that all history is the history of unintended consequences. His theory was that history is not always recorded in documents; instead, it is ultimately derived from examining cultural meanings and the structures of society through hermeneutics (the interpretation of texts), semiotics, and the many forms and signs of the times. He concluded that history is created by people’s subjectivity and therefore ultimately represents how people REALLY live.

The same can be extrapolated for businesses. However, the BI systems of today capture only a minuscule piece of the larger pie of knowledge representation that may be gained from things like meetings, videos, sales calls, anecdotal win/loss reports, shadow IT projects, 10-Ks and 10-Qs, even company blog posts 😉 . The point is: how can you better capture the essence, meaning, and perhaps importance of the everyday non-database events taking place in your company and its activities – in other words, how it REALLY operates?

One of the keys to figuring out how businesses really operate is identifying and utilizing the undocumented RULES that underlie nearly every business. Select company employees, often veterans, know these rules intuitively. Every company has them; watch them and you’ll see they just have a knack for getting projects pushed through the system, or making customers happy, or diagnosing a problem in a short time and with little fanfare. They just know how things work and what needs to be done.

These rules have been, and still are, difficult to quantify and apply, or “Data-ify” if you will. Certain companies (and hopefully Informatica) will end up being major players in the race to datify these non-traditional rules and events, in addition to helping companies make sense of big data in a whole new way. Daydreaming about it, it’s not hard to imagine business systems that will eventually understand the optimization rules of a business, account for possible unintended scenarios or consequences, and then apply those rules at the moment they are most needed. That, anyhow, is the goal of a new generation of Operational Intelligence systems.
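To make the idea of “datafied” rules concrete, here is a minimal, hypothetical sketch of the condition-action pattern such a system evaluates continuously against live events; the rules, fields, and thresholds are invented, and this is not RulePoint’s actual API:

```python
# Hypothetical condition-action rules evaluated against a live event
# stream; field names and thresholds are invented for illustration.
rules = [
    {
        "name": "expedite at-risk order",
        "when": lambda e: e["type"] == "order" and e["days_open"] > 10,
        "then": lambda e: print(f"escalate order {e['id']} to a veteran rep"),
    },
    {
        "name": "unhappy-customer alert",
        "when": lambda e: e["type"] == "call" and e["sentiment"] < 0.2,
        "then": lambda e: print(f"offer follow-up to customer {e['id']}"),
    },
]

def on_event(event):
    # In an operational intelligence system this runs as events arrive,
    # not in a nightly batch, so the action lands while it still matters.
    for rule in rules:
        if rule["when"](event):
            rule["then"](event)

on_event({"type": "order", "id": 42, "days_open": 14})
on_event({"type": "call", "id": 7, "sentiment": 0.1})
```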

In my final post on the subject, I’ll explain how it works and the business problems it solves (in a nutshell). And if I’ve managed to pique your curiosity and you want to hear about Operational Intelligence sooner, tune in to a webinar we’re having TODAY at 10 AM PST. Here’s the link.

http://www.informatica.com/us/company/informatica-talks/?commid=97187


True Facts About Informatica RulePoint Real-Time Integration

Shhhh… RulePoint Programmer Hard at Work

End of year. Out with the old, in with the new. A time when everyone gets their ducks in order, clears the pipe, and gets ready for the New Year. For R&D, one of the gating events driving the New Year is the annual sales kickoff, where we present the new features to Sales so they can better communicate a product’s road map and value to potential buyers. All well and good. But part of the process is to fill out a Q and A that explains the product “Value Prop” – and they only gave us 4 lines. I think the answer also helps determine speaking slots and priority.

So here’s the question I had to fill out –

FOR SALES TO UNDERSTAND THE PRODUCT BETTER, WE ASK THAT YOU ANSWER THE FOLLOWING QUESTION:

WHAT IS THE PRODUCT VALUE PROPOSITION AND ARE THERE ANY SIGNIFICANT DEPLOYMENTS OR OTHER CUSTOMER EXPERIENCES YOU HAVE HAD THAT HAVE HELPED TO DEFINE THE PRODUCT OFFERING?

Here’s what I wrote:

Informatica RULEPOINT is a real-time integration and event processing software product that is deployed very innovatively by many businesses and vertical industries.  Its value proposition is that it helps large enterprises discover important situations from their droves of data and events and then enables users to take timely action on discovered business opportunities as well as stop problems while or before they happen.

Here’s what I wanted to write:

RulePoint is scalable, low latency, flexible, and extensible, and was born in the pure and exotic wilds of the Amazon from the minds of natives who have never once spoken out loud – only programmed. RulePoint captures the essence of the true wisdom of the greatest sages of yesteryear. It is the programming equivalent of what Esperanto tried, and failed, to accomplish linguistically.

As to high availability (HA), there has never been anything in the history of software as available as RulePoint. Madonna’s availability only pales in comparison to RulePoint’s availability. We are talking 8 Nines cubed and then squared ( 😉 ). Oracle = Unavailable. IBM = Unavailable. Informatica RulePoint = Available.

RulePoint works hard, but plays hard too.  When not solving those mission critical business problems, RulePoint creates Arias worthy of Grammy nominations. In the wee hours of the AM, RulePoint single-handedly prevented the outbreak and heartbreak of psoriasis in East Angola.

One of the little known benefits of RulePoint is its ability to train the trainer, coach the coach and play the player. Via chalk talks? No, RulePoint uses mind melds instead.  Much more effective. RulePoint knows Chuck Norris.  How do you think Chuck Norris became so famous in the first place? Yes, RulePoint. Greenpeace used RulePoint to save dozens of whales, 2 narwhal, a polar bear and a few collateral penguins (the bear was about to eat the penguins).  RulePoint has been banned in 16 countries because it was TOO effective.  “Veni, Vidi, RulePoint Vici” was Julius Caesar’s actual quote.

The inspiration for Gandalf in the Lord of the Rings? RulePoint. IT heads worldwide shudder with pride when they hear the name RulePoint mentioned and know that they acquired it. RulePoint is stirred but never shaken. RulePoint is used to train the Sherpas that help climbers reach the highest of heights. RulePoint cooks Minute rice in 20 seconds.

The running of the bulls in Pamplona every year – what do you think they are running from? Yes, RulePoint. RulePoint put the Vinyasa back into Yoga. In fact, RulePoint will eventually create a new derivative called Full Contact Vinyasa Yoga, and it will eventually supplant gymnastics in the 2028 Summer Olympic Games.

The laws of physics were disproved last year by RulePoint.  RulePoint was drafted in the 9th round by the LA Lakers in the 90s, but opted instead to teach math to inner city youngsters. 5 years ago, RulePoint came up with an antivenin to the Black Mamba and has yet to ask for any form of recompense. RulePoint’s rules bend but never break. The stand-in for the “Mind” in the movie “A Beautiful Mind” was RulePoint.

RulePoint will define a new category for the Turing award and will name it the 2Turing Award.  As a bonus, the 2Turing Award will then be modestly won by RulePoint and the whole category will be retired shortly thereafter.  RulePoint is… tada… the most interesting software in the world.

But I didn’t get to write any of these true facts and product differentiators on the form. No room.

Hopefully I can still get a primo slot to talk about RulePoint.

 

And so, from all of the RulePoint and Emerging Technologies team, including sales and marketing, here’s hoping you have a great holiday season and a Happy New Year!

 
