Tag Archives: Business Intelligence

Go On, Flip Your Division of Labor: More Time Analyzing and Less Time Prepping Data

Do you work in Sales Operations or Marketing Operations, or as a Sales Representative, Sales Manager, or Marketing Professional? If so, it’s no secret that you benefit greatly from the power of performing your own analysis, at your own rapid pace. When you have a hunch, you can easily test it by visually analyzing data in Tableau without involving IT. And when you face tight timeframes for gaining business insight from data, being able to do it yourself, in the time you have and without technical roadblocks, makes all the difference.

Self-service Business Intelligence is powerful! However, we all know it could be even more powerful. When you put together an analysis, you spend about 80% of your time preparing data and only 20% actually analyzing it to test your hunch or gain your business insight. You don’t need to accept this anymore. We want you to know that there is a better way!

We want you to Flip Your Division of Labor: spend more than 80% of your time analyzing data to test your hunch or gain your business insight, and less than 20% of your time putting together data for your Tableau analysis! That’s right. You like it. No, you love it. No, you are ready to run laps around your chair in sheer joy!! And you should feel this way. You can now spend more time on the higher-value activity of gaining business insight from the data, and even find copious time to spend with your family. How’s that?

Project Springbok is a visionary new product designed by Informatica with the goal of making data access and data quality obstacles a thing of the past. Springbok is meant for the Tableau user: a data person who would rather spend their time visually exploring information and finding insight than struggling with complex calculations or waiting for IT. Project Springbok lets you put together your data, rapidly, for subsequent analysis in Tableau. It can even tell you things about your data that you may not have known, through Intelligent Suggestions it presents to you as you work.

Let’s take a quick tour:

  • Project Springbok tells you that you have a date column and that you likely want the Year and Quarter for your analysis (Fig 1). If you so wish, with a single click, voila: you have your corresponding years and even quarters. And it all happens in mere seconds, a far cry from the 45 minutes it would take a fluent Excel user to do with VLOOKUPs (a hand-rolled version is sketched below).

Fig. 1

VALUE TO A MARKETING CAMPAIGN PROFESSIONAL: Rapidly validate and accurately complete your segmentation list before you analyze your segments in Tableau. Base your segments on trusted data that did not take you days to validate and enrich.
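
To see what that one click replaces, here is a minimal sketch of the same date-part derivation done by hand in pandas; the DataFrame and the "order_date" column are invented for illustration and are not part of Springbok.

```python
# A hand-rolled version of the Year/Quarter suggestion.
# Column names here are hypothetical, not Springbok's own.
import pandas as pd

df = pd.DataFrame({"order_date": ["2014-01-15", "2014-05-02", "2014-10-30"]})
df["order_date"] = pd.to_datetime(df["order_date"])

df["year"] = df["order_date"].dt.year        # e.g. 2014
df["quarter"] = df["order_date"].dt.quarter  # e.g. 1, 2, 4
print(df)
```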

  • Next, Project Springbok tells you that you have two datasets that could be joined on a common key, for example an email column present in each dataset, and asks whether you would like to join them (Fig 2). If you agree with Project Springbok’s suggestion, voila: the datasets are joined in a few seconds (see the sketch below). Again, a far cry from the 45 minutes it would take a fluent Excel user to do with VLOOKUPs.

Fig. 2
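
For the curious, here is what that suggested join looks like when spelled out by hand in pandas; the two toy datasets and the "email" key are invented for illustration.

```python
# A hand-rolled version of the join suggestion.
import pandas as pd

leads = pd.DataFrame({
    "email": ["ana@example.com", "bo@example.com"],
    "segment": ["SMB", "Enterprise"],
})
orders = pd.DataFrame({
    "email": ["ana@example.com", "bo@example.com"],
    "revenue": [1200, 56000],
})

# Springbok detects the common key for you; here we name it explicitly.
combined = leads.merge(orders, on="email", how="inner")
print(combined)
```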

VALUE TO A SALES REPRESENTATIVE OR SALES MANAGER: You can now access your Salesforce.com data (Fig 3) and effortlessly combine it with ERP data to understand your true quota attainment. Never miss quota again due to a revenue split, be it territory-based or otherwise. Best of all, keep your attainment dataset refreshed, and know exactly which data point changed whenever your true attainment changes.

Fig. 3

  • Then, if you want, Project Springbok tells you that you have emails in the dataset, which you may or may not have known, but more importantly it asks whether you wish to determine which of those emails can actually be mailed to. If you proceed, Springbok not only checks each email for correct structure (Fig 4; a rough sketch of such a check follows below), but will very soon also determine whether the email is active and one you can expect a response from. How long would that have taken you to do?

VALUE TO A TELESALES REPRESENTATIVE OR MARKETING EMAIL CAMPAIGN SPECIALIST: Ever thought you had a great email list, only to find that most of the emails bounced? Now you can confidently determine which addresses you will truly be able to reach, before you send the message. Email prospects who you know are actually at the company, and be confident you have their correct email addresses. You can then easily push the dataset into Tableau to analyze trends in email list health.

Fig. 4
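
As a rough idea of what a structural email check involves, here is a minimal Python sketch; the regular expression is a common heuristic, not Springbok’s actual rule, and it says nothing about whether the mailbox is active.

```python
# A heuristic syntax check only; deliverability requires a verification
# service, which is the part Springbok promises to handle for you.
import re

EMAIL_RE = re.compile(r"^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}$")

def is_well_formed(email: str) -> bool:
    """True if the address has a plausible user@domain.tld shape."""
    return bool(EMAIL_RE.match(email))

for e in ["jane.doe@acme.com", "not-an-email", "sales@informatica"]:
    print(e, "->", is_well_formed(e))
```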

And, in case you were wondering, Project Springbok requires no training or installation. The 80% of your time you used to spend on data preparation shrinks considerably, and that is after using only a few of Springbok’s capabilities. One more thing: you can export directly from Project Springbok into Tableau via the “Export to Tableau TDE” menu item (Fig 5). Project Springbok creates a Tableau TDE file; just double-click it to open Tableau and test your hunch or gain your business insight.

Fig. 5

Here are some other things you should know, to convince you that you, too, can spend no more than 20% of your time putting together data for your subsequent Tableau analysis:

  • Springbok Sign-Up is Free
  • Springbok automatically finds problems with your data, and lets you fix them with a single click
  • Springbok suggests useful ways for you to combine different datasets, and lets you combine them effortlessly
  • Springbok suggests useful summarizations of your data, and lets you follow through on the summarizations with a single click
  • Springbok allows you to access data from your cloud or on-premise systems with a few clicks, and automatically keeps it refreshed. It will even tell you what data changed since the last time you saw it
  • Springbok allows you to collaborate by sharing your prepared data with others
  • Springbok easily exports your prepared data directly into Tableau for immediate analysis. You do not have to tell Tableau how to interpret the prepared data
  • Springbok requires no training or installation

Go on. Shift your division of labor in the right direction, fast. Sign up for Springbok and stop wasting precious time on data preparation. http://bit.ly/TabBlogs

———-

Are you going to be at Dreamforce this week in San Francisco? Interested in seeing Project Springbok working with Tableau in a live demonstration? Visit the Informatica or Tableau booths and see the power of these two solutions working hand-in-hand. Informatica is at Booth #N1216 and Booth #9 in the Analytics Zone. Tableau is at Booth N2112.


The Five C’s of Data Management

A few days ago, I came across a post, 5 C’s of MDM (Case, Content, Connecting, Cleansing, and Controlling), by Peter Krensky, Sr. Research Associate at Aberdeen Group, and this response by Alan Duncan with his own 5 C’s (Communicate, Co-operate, Collaborate, Cajole and Coerce). I like Alan’s list much better. Even though I work for a product company specializing in information management technology, the secret to successful enterprise information management (EIM) is in tackling the business and organizational issues, not the technology challenges. Fundamentally, data management at the enterprise level is an agreement problem, not a technology problem.

So, here I go with my 5 C’s: (more…)


Is Big Data Destined To Become Small And Vertical?

Several years ago, I got to participate in one of the first neural net conferences. At the time, I thought it was amazing just to be there. There were chip and software vendors galore. Many even claimed to be the next Intel or the next Microsoft. Years later I joined a complex event processing vendor. Again, I felt the same sense of excitement. In both cases, the market participants moved from large horizontal market plays to smaller and more vertical solutions.

A sense of déjà vu

Now, to be clear, it is not my goal today to pop anyone’s big data balloon. But as I have gotten more excited about big data, I have also gotten an increasingly eerie sense of déjà vu. The fact is, the more I dig into big data and hear customers’ stories about what they are trying to do with it, the more concerned I become about the similarities between big data and both neural nets and complex event processing.

Big data offers new features

Clearly, big data does offer some interesting new features. And big data does take advantage of other market trends, including virtualization and cloud. By doing so, big data achieves orders of magnitude more scalability than traditional business intelligence processing and storage. At the same time, big data offers the potential for lowering cost, though I should take a moment to stress the word potential. The reason is that while a myriad of processing approaches have been developed, no standard has yet emerged. Early adopters complain about the difficulty of hiring big data MapReduce programmers. And just like neural nets, the programming that needs to be done is turning out to be application specific.

With this said, it should be clear that big data does offer the potential to test datasets and to discover new, and sometimes unexpected, data relationships. This is a real positive. However, like its predecessors, this work is application specific, and the data being related is truly of differing quality and detail. This means that the best big data can do as a technology movement is discover potential data relationships. Once this is done, meaning can only be created by establishing detailed data relationships and dealing with the varying quality of the datasets within the big data cluster.

Big Data will become small for management analysis

This means that big data must become small in order to really solve customer problems. Judith Hurwitz puts it this way: “Big data analysis is really about small data. Small data, therefore, is the product of big data analysis. Big data will become small so that it is easier to comprehend.” What is “more necessary than ever is the capability to analyze the right data in a timely enough fashion to make decisions and take actions”. Judith says that in the end what is needed is quality data that is consistent, accurate, reliable, complete, timely, reasonable, and valid. The critical point is that whether you use MapReduce processing or traditional BI means, you shouldn’t throw out your data integration and quality tools. As big data becomes smaller, they will in reality become increasingly important.

So how does Judith see big data evolving? She sees big data propelling a lot of new small data. Judith believes that “getting the right perspective on data quality can be very challenging in the world of big data. With a majority of big data sources, you need to assume that you are working with data that is not clean.” We need to accept the fact that a lot of noise will exist in data: “It is by searching and pattern matching that you will be able to find some sparks of truth in the midst of some very dirty data.” Judith suggests, therefore, a two-phase approach: 1) look for patterns in big data without concern for data quality; and 2) after you locate your patterns, apply the same data quality standards that have been applied to traditional data sources.
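
As a toy illustration of that two-phase approach, the Python sketch below mines noisy records for a pattern first and applies quality rules only to the matches; the records and the quality rule are invented for illustration.

```python
# Phase 1: cheap pattern search over raw, dirty records,
# with no concern yet for data quality.
raw_records = [
    {"email": "a@x.com", "spend": "1200"},
    {"email": "bad-row", "spend": "n/a"},
    {"email": "b@y.com", "spend": "56000"},
]
candidates = [r for r in raw_records if "@" in r["email"]]

# Phase 2: apply traditional quality standards to the matches only.
def passes_quality(record: dict) -> bool:
    return record["spend"].isdigit()  # spend must be a clean number

small_data = [r for r in candidates if passes_quality(r)]
print(small_data)  # the trusted "small data" that feeds decisions
```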

History will repeat itself

For this reason, I believe that history will, to a degree, repeat itself. Clearly, the big data emperor does have his clothes on, but big data will become smaller and more vertical. Big data will become about relationship discovery, and small data will become about quality analysis of data sources. In sum, this means that small data analysis is focused and provides the data for business decision making, while big data analysis is broad and is about discovering what data potentially relates to what data. I know this is a bit different from the hype, but it is realistic and makes sense. Remember, in the end, you will still need what business intelligence has refined.


Data Integration Eight Years Later

I recently came across an article from 2006, which is clearly out-of-date but still a good read about the state of data integration eight years ago: “Data integration was hot in 2005, and the intense interest in this topic continues in 2006 as companies struggle to integrate their ever-growing mountain of data.

A TDWI study on data integration last November found that 69% of companies considered data integration issues to be a high or very high barrier to new application development. To solve this problem, companies are increasing their spending on data integration products.”

Business intelligence (BI) and data warehousing were the way to go at the time, and companies were spending millions to stand up these systems. Data integration was all massive data movements and manipulations, typically driven by tactical tools rather than true data integration solutions.

The issue I had at the time was the inability to deal with real-time operational data, and the cost of the technology and deployments. While these issues were never resolved with traditional BI and data warehousing technology, we now have access to databases that can manage over a petabyte of data, and the ability to cull through the data in seconds.

The ability to support massive amounts of data has reignited interest in data integration. Up-to-the-minute operational data in these massive data stores is now actually possible. We can understand the state of the business as it happens, and thus make incremental adjustments based upon almost perfect information.

What this situation leads to is true value. We have delivery of the right information to the right people, at the right time, and the ability to place automated processes and policies around this data. Business becomes self-correcting and self-optimizing. The outcome is a business that is data-driven, and thus more responsive to the markets as well as to the business world itself.

However, big data is an impossible dream without a focus on how the data moves from place to place, using data integration best practices and technology. I guess we can call this big data integration, but it’s really the path to providing these massive data stores with the operational data required to determine the proper metrics for the business.

Data integration is not a new term, but the application of new ways to leverage and value data brings unprecedented value to enterprises. Millions of dollars per hour of value are being delivered to Global 2000 organizations that leverage these emerging data integration approaches and technologies. What’s more, data integration is moving from the tactical to the strategic budgets of IT.

So, what’s changed in eight years? We finally figured out how to get the value from our data, using big data and data integration. It took us long enough, but I’m glad it’s finally become a priority.


Integration Gives You Muscle. Data Integration with Event Processing Gives You 6-Pack Abs.


Everyone knows that Informatica is the Data Integration company that helps organizations connect their disparate software into a cohesive and synchronous enterprise information system.  The value to business is enormous and well documented in the form of use cases, ROI studies and loyalty / renewal rates that are industry-leading.

Event processing, on the other hand, is a technology that has been around for only a few years and has yet to reach Main Street in Systems City, IT. But if you look at how event processing is being used, it’s amazing that more people haven’t heard about it. The idea at its core (pun intended) is very simple: monitor your data and events, those things that happen on a daily, hourly, even minute-by-minute basis; look for important patterns that are positive or negative indicators; and then set up your systems to automatically take action when those patterns come up, like notifying a sales rep when a pattern indicates a customer is ready to buy, or stopping a transaction because your company is about to be defrauded.
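
To make that core idea concrete, here is a toy Python sketch of an event-pattern watcher; the event shape and the three-failed-logins fraud rule are invented for illustration and are far simpler than a real CEP engine.

```python
from collections import deque

def watch(stream, window=3, threshold=3):
    """Alert when `threshold` failed logins appear in the last `window` events."""
    recent = deque(maxlen=window)
    for event in stream:
        recent.append(event)
        fails = [e for e in recent if e["type"] == "login_failed"]
        if len(fails) >= threshold:
            print(f"ALERT: possible fraud on account {event['account']}")
            recent.clear()  # act once per detected pattern

watch([
    {"account": "A42", "type": "login_failed"},
    {"account": "A42", "type": "login_failed"},
    {"account": "A42", "type": "login_failed"},
    {"account": "A42", "type": "purchase"},
])
```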

Since this is an Informatica blog, you probably have a decent set of “muscles” in place already, so why, you ask, would you need 6-pack abs? Because 6-pack abs are a good indication of a strong core musculature and the basis of a stable, highly athletic body. The same parallel holds for companies: in today’s competitive business environment, you need strength, stability, and agility to compete. And since IT systems increasingly ARE the business, if your company isn’t performing as strong, lean, and mean as possible, you can be sure your competitors will be looking to implement every advantage they can.

You may also be thinking: why would you need something like event processing when you already have good Business Intelligence systems in place? The reality is that it’s not easy to monitor and measure useful but sometimes hidden data, event, sensor, and social media sources, or to discern which patterns have meaning and which turn out to be false positives. But the real difference is that BI usually reports to you after the fact, when the value of acting on the situation has diminished significantly.

So while muscles are important to be able to stand up and run, and good quality, strong muscles are necessary for heavy lifting, it’s those 6-pack abs on top of it all that give you the lean, mean fighting machine to identify significant threats and opportunities in your data, and, in essence, to better compete and win.


What Does “Right Time” Data Integration Mean to You? Join the Discussions at Informatica World 2013

The way I see it – it’s all about having access to the right data at the right time. A company is like a large ship that must be effectively steered at both a strategic as well as an operational level to reach its destination – despite treacherous waters. Where am I going with this? Well, companies continuously face challenging situations such as competitive threats, customer expectations, market trends, and other external constraints. Overcoming the challenges of global uncertainty requires business agility to navigate an efficient and effective course towards achieving the company’s objectives.

Ok – so what does all this have to do with right time data integration? Actually – EVERYTHING! Companies require timely information to add value to their products, customer relationships, and business partnerships – agreed? And business conditions today are influenced by powerful market forces such as globalization, mergers and acquisitions, regulatory compliance, fierce competition, tight operating budgets, increased demands for improved customer service, and ever faster product delivery – right?  So, it should follow that only the most responsive and agile companies will perform the best.

If you are with me so far, you will immediately see the connection between business agility and the speed of data processing and delivery. This typically spans a wide range of latencies depending on the business process. In analytical data integration projects (e.g. data warehousing) latencies can range from weeks to days. Operational data integration projects on the other hand require information within hours, minutes, and seconds. Examples of such projects include real-time data warehousing, operational data hubs, data synchronization, data replication, and agile business intelligence (BI).

The success of such operational data integration projects depends on the ability to meet service level agreements (SLAs) related to data latency. You may require data delivered frequently (e.g., real-time or near-real-time) or infrequently (e.g., weekly batch windows). Or, you may need to move large data volumes or small datasets between applications. Or, have ready access to the most current data available in operational systems. Or, you may need to deliver new data and reports in days vs. months. Or, streamline the accessibility of business partner data. Or, be aware of real-time events as they occur.

So, whatever right time data integration means to you, please do join us at Informatica World 2013 to discuss – here are my recommendations on just some of the relevant activities on this topic:

BREAKOUT SESSIONS:

Tuesday, June 4

Wednesday, June 5

Thursday, June 6

BIRDS OF A FEATHER ROUNDTABLES (Check Agenda for Daily Timings):

  • How to Untangle the Data Integration Hairball

HANDS-ON LABS (Check Agenda for Daily Timings):

  • Table 13 – Data Virtualization for Agile BI / Data Services
  • Table 19 – Empowering Analyst Self-Service with Informatica Analyst
  • Table 41 – Informatica Data Replication
  • Table 44 – Data Integration Hub
  • Table 45 – B2B Data Transformation
  • Table 46 – B2B Data Exchange
  • Table 47 – RulePoint / Operational Intelligence Platform
  • Table 48 – Proactive Healthcare Decision Management

BOOTHS (Check Agenda for Daily Timings):

  • Operational Intelligence and Reporting
  • Real Time Data Integration
  • External Data for Analytics
  • Customer/Supplier Collaboration
  • Operational Synchronization


I look forward to seeing you at Informatica World 2013.


The Freshest Data Makes the Best Business Decisions

There’s no denying that business continues to accelerate its pace, and that the luxury of using historical data for business intelligence (BI) and planning is quickly becoming a thing of the past. Today, businesses need immediate insight into rapidly changing data in order to survive and thrive. Data that’s even a few hours old, let alone a few days old, is largely useless. Yet most information architectures today still only provide data that is a day, a week, or sometimes as much as a month old.

This leaves most BI, reporting, and analytics systems operating without up-to-date data from operational systems, data that’s fundamental to making informed decisions about the business. To operate at the speed of business, it’s imperative that executives and decision makers have ready access to fresh information at all times, delivered continuously and automatically, without impact on operational systems.

More and more organizations have found the answer in data replication. Data replication lets you work with, and make the best business decisions from, the freshest data drawn from all your operational systems. It delivers this current, up-to-date data in a seamless and non-intrusive manner, empowering you to operate at the speed of your business without constraints, and it automatically delivers the data wherever it’s needed, for operational intelligence as well as operational use, without direct impact on the source systems.
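
As a rough illustration of incremental, non-intrusive delivery, here is a toy sketch of timestamp-based change capture in Python; the schema and high-water mark are invented, and commercial replication products typically capture changes from database logs rather than by polling queries like this.

```python
import sqlite3

src = sqlite3.connect(":memory:")  # stand-in for an operational system
dst = sqlite3.connect(":memory:")  # stand-in for the reporting copy
src.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL, updated_at TEXT)")
dst.execute("CREATE TABLE orders_replica (id INTEGER PRIMARY KEY, amount REAL, updated_at TEXT)")
src.executemany("INSERT INTO orders VALUES (?, ?, ?)", [
    (1, 99.0, "2013-08-01T09:00"),
    (2, 45.5, "2013-08-01T10:30"),
])

# Pull only rows changed since the last sync (the "high-water mark").
last_seen = "2013-08-01T09:30"
rows = src.execute(
    "SELECT id, amount, updated_at FROM orders WHERE updated_at > ?",
    (last_seen,),
).fetchall()
dst.executemany("INSERT OR REPLACE INTO orders_replica VALUES (?, ?, ?)", rows)
dst.commit()
print(dst.execute("SELECT * FROM orders_replica").fetchall())  # row 2 only
```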

This unique “data-on-demand” approach removes the constraints of stale, old information and enables powerful outcomes for business initiatives. Fresh, current data drives new thinking across the enterprise, and can help organizations to:

  • Increase revenue, delight customers, and outshine the competition
  • Improve the quality and efficiency of business decisions
  • Standardize on a single reliable and scalable solution that lowers costs and removes complexity

At Informatica, we’ve seen numerous customers implement Informatica Data Replication to deliver this fresh, up-to-date data for operational intelligence, reporting, and analytics, and report tremendous positive changes to their business. Some examples of customers using Informatica Data Replication with great success are:

  • Westlake Financial Systems saved hundreds of thousands of dollars and improved its profitability and customer satisfaction through a more effective payment collection system
  • Optus Australia increased both revenues and customer satisfaction by providing calling plan access, alerting, and self-service upgrades directly to its customer base
  • A major national pharmacy chain accelerated and improved health care decision making across the business and increased agility and responsiveness to its customers, resulting in higher customer satisfaction while driving down the cost of technology

Is your business ready to make the leap to true operational intelligence using the freshest data to make your business decisions?  Do you want to understand more about the impact that this kind of insight can make to your business? 

If yes, please join us on August 28 at 10 am Pacific for a discussion with two business executives who have seen the impact of data replication in their own and their customers’ businesses. You can register using the link below.

Bad Business Decision? Blame Stale Data – Register Today

The freshest data does make the best business decisions. We look forward to having you join and participate in the discussion.


How Integration Platform-as-a-Service Impacts Cloud Adoption

Did you know that Forrester estimates, in their 10 Cloud Predictions For 2012 blog post, that the average organization will be running more than 10 different cloud applications, and that the public Software-as-a-Service (SaaS) market will hit $33 billion by the end of 2012?

However, in the same post, Forrester also acknowledged that SaaS adoption is led mainly by Customer Relationship Management (CRM), procurement, collaboration, and Human Capital Management (HCM) software, and that all other software segments will “still have significantly lower SaaS adoption rates”. It’s not hard to see this in the market today, with cloud juggernaut salesforce.com leading the way in CRM, and Workday and SuccessFactors doing battle in HCM, for example. Forrester claims that among the lesser-known software segments, Product Lifecycle Management (PLM), Business Intelligence (BI), and Supply Chain Management (SCM) will be the categories to break through in SaaS adoption, with approximately 25% of companies using these solutions by 2012. (more…)


Collaborative Learning in a Lean Transformation

Collaborative learning is essential for transforming work activities that involve a high degree of uncertainty and creativity into a lean value stream. These characteristics are common in enterprise integration initiatives due to unclear and inconsistent data definitions across multiple silos, rapidly changing requirements, and a lack of perfect knowledge of end-to-end processes. Traditional approaches generally end up propagating the integration hairball, which is inefficient and wasteful, and certainly not Lean. You could say that these value streams are simply immature processes that lack standards and metrics, which is true, but the practitioners involved in the process don’t see it that way. They see themselves as highly skilled professionals solving complex, unique problems and delivering customized solutions that fit like a glove. Yet the outside observer who looks at the end-to-end process at the macro level sees patterns repeated over and over again, and what appears to be a great deal of “reinventing the wheel.” (more…)
