Tag Archives: Business Intelligence
A few days ago, I came across a post, 5 C’s of MDM (Case, Content, Connecting, Cleansing, and Controlling), by Peter Krensky, Sr. Research Associate, Aberdeen Group and this response by Alan Duncan with his 5 C’s (Communicate, Co-operate, Collaborate, Cajole and Coerce). I like Alan’s list much better. Even though I work for a product company specializing in information management technology, the secret to successful enterprise information management (EIM) is in tackling the business and organizational issues, not the technology challenges. Fundamentally, data management at the enterprise level is an agreement problem, not a technology problem.
So, here I go with my 5 C’s: (more…)
Several years ago, I got to participate in one of the first neural net conferences. At the time, I thought it was amazing just to be there. There were chip and software vendors galore. Many even claimed to be the next Intel or the next Microsoft. Years later I joined a complex event processing vendor. Again, I felt the same sense of excitement. In both cases, the market participants moved from large horizontal market plays to smaller and more vertical solutions.
Now, to be clear, it is not my goal today to pop anyone’s big data balloon. But as I have gotten more excited about big data, I have also gotten an increasingly eerie sense of déjà vu. The fact is, the more I dig into big data and hear customers’ stories about what they are trying to do with it, the more concerned I become about the similarities between big data, neural nets, and complex event processing.
Clearly, big data does offer some interesting new capabilities. And big data does take advantage of other market trends, including virtualization and cloud. By doing so, it achieves scalability well beyond traditional business intelligence processing and storage. At the same time, big data offers the potential for lowering cost, but I should take a moment to stress the word potential. The reason I do this is that while a myriad of processing approaches have been developed, no standard has yet emerged. Early adopters also complain about the difficulty of hiring big data MapReduce programmers. And just like neural nets, the programming that needs to be done is turning out to be application specific.
With this said, it should be clear that big data does offer the potential to test datasets and to discover new and sometimes unexpected data relationships. This is a real positive. However, like its predecessors, this work is application specific, and the data being related varies widely in quality and detail. This means that the best big data can do as a technology movement is to discover potential data relationships. Once this is done, meaning can only be created by establishing detailed data relationships and dealing with the varying quality of the data sets within the big data cluster.
Big Data will become small for management analysis
This means that big data must become small in order to really solve customer problems. Judith Hurwitz puts it this way: “big data analysis is really about small data. Small data, therefore, is the product of big data analysis. Big data will become small so that it is easier to comprehend.” What is “more necessary than ever is the capability to analyze the right data in a timely enough fashion to make decisions and take actions.” Judith says that in the end what is needed is quality data that is consistent, accurate, reliable, complete, timely, reasonable, and valid. The critical point is that whether you use MapReduce processing or traditional BI means, you shouldn’t throw out your data integration and quality tools. As big data becomes smaller, these will in fact become increasingly important.
So how does Judith see big data evolving? Judith sees big data propelling a lot of new small data. She believes that “getting the right perspective on data quality can be very challenging in the world of big data. With a majority of big data sources, you need to assume that you are working with data that is not clean.” We need to accept, she says, that a lot of noise will exist in data, and that “it is by searching and pattern matching that you will be able to find some sparks of truth in the midst of some very dirty data.” Judith therefore suggests a two-phase approach: 1) look for patterns in big data without concern for data quality; and 2) after you locate your patterns, apply the same data quality standards that have been applied to traditional data sources.
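To make that two-phase idea a bit more concrete, here is a minimal sketch in Python. Everything in it is hypothetical: the file name, column names, and thresholds are invented for illustration and are not from Judith’s post. The point is simply that discovery runs on the raw, noisy data first, and the familiar quality rules are applied only to the slice that discovery flags.

```python
import pandas as pd

# Phase 1: pattern discovery on the raw, uncleaned "big" data.
# We tolerate noise and simply look for campaigns whose conversion
# rate stands out (hypothetical columns and threshold).
raw = pd.read_csv("clickstream_sample.csv")  # hypothetical extract
by_campaign = (
    raw.dropna(subset=["campaign_id"])
       .groupby("campaign_id")[["clicks", "orders"]]
       .sum()
)
by_campaign["conversion"] = by_campaign["orders"] / by_campaign["clicks"]
interesting = by_campaign[by_campaign["conversion"] > 0.05]

# Phase 2: apply traditional data quality rules, but only to the
# records behind the patterns that phase 1 surfaced.
subset = raw[raw["campaign_id"].isin(interesting.index)]
clean = subset[
    subset["clicks"].between(0, 1_000_000)   # reasonableness check
    & subset["orders"].le(subset["clicks"])  # consistency: orders <= clicks
    & subset["order_date"].notna()           # completeness check
]
print(clean.describe())
```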
For this reason, I believe that history will, to a degree, repeat itself. Clearly, the big data emperor does have his clothes on, but big data will become smaller and more vertical. Big data will become about relationship discovery, and small data will become about quality analysis of data sources. In sum, this means that small data analysis is focused and provides the data for business decision making, while big data analysis is broad and is about discovering what data potentially relates to what data. I know this is a bit different from the hype, but it is realistic and makes sense. Remember, in the end, you will still need what business intelligence has refined.
I recently came across an article from 2006, which is clearly out-of-date, but still a good read about the state of data integration eight years ago. “Data integration was hot in 2005, and the intense interest in this topic continues in 2006 as companies struggle to integrate their ever-growing mountain of data.
A TDWI study on data integration last November found that 69% of companies considered data integration issues to be a high or very high barrier to new application development. To solve this problem, companies are increasing their spending on data integration products.”
Business intelligence (BI) and data warehousing were the way to go at the time, and companies were spending millions to stand up these systems. Data integration was all massive data movements and manipulations, typically driven by tactical tools rather than true data integration solutions.
The issue I had at the time was the inability to deal with real-time operational data, and the cost of the technology and deployments. While these issues were never resolved with traditional BI and data warehousing technology, we now have access to databases that can manage over a petabyte of data, and the ability to cull through the data in seconds.
The ability to support massive amounts of data has reignited the interest in data integration. Up-to-the-minute operational data in these massive data stores is now actually possible. We can understand the state of the business as it happens, and thus make incremental adjustments based upon almost perfect information.
What this situation leads to is true value. We have delivery of the right information to the right people at the right time, and the ability to place automated processes and policies around this data. Business becomes self-correcting and self-optimizing. The outcome is a business that is data-driven, and thus more responsive to the markets as well as to the business world itself.
However, big data is an impossible dream without a focus on how the data moves from place to place, using data integration best practices and technology. I guess we can call this big data integration, but it’s really the path to provide these massive data stores with the operational data required to determine the proper metrics for the business.
Data integration is not a new term, but new ways to leverage and value data are bringing unprecedented value to enterprises. Millions of dollars an hour of value are being delivered to Global 2000 organizations that leverage these emerging data integration approaches and technologies. What’s more, data integration is moving from the tactical to the strategic budgets of IT.
So, what’s changed in eight years? We finally figured out how to get the value from our data, using big data and data integration. It took us long enough, but I’m glad it’s finally become a priority.
Everyone knows that Informatica is the Data Integration company that helps organizations connect their disparate software into a cohesive and synchronous enterprise information system. The value to business is enormous and well documented in the form of use cases, ROI studies and loyalty / renewal rates that are industry-leading.
Event processing, on the other hand, is a technology that has been around for only a few years and has yet to reach Main Street in Systems City, IT. But if you look at how event processing is being used, it’s amazing that more people haven’t heard about it. The idea at its core (pun intended) is very simple: monitor your data and events, those things that happen on a daily, hourly, even minute-by-minute basis; look for important patterns that are positive or negative indicators; and then set up your systems to automatically take action when those patterns come up, such as notifying a sales rep when a pattern indicates a customer is ready to buy, or stopping a transaction when a pattern indicates your company is about to be defrauded.
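To make that core idea a little more tangible, here is a minimal, hand-rolled sketch in Python. The event fields, patterns, and actions are all invented for illustration; a real deployment would of course use an event processing engine rather than code like this.

```python
from collections import defaultdict, deque
from datetime import datetime, timedelta

# Sliding window of recent events per customer (hypothetical structure).
WINDOW = timedelta(minutes=10)
recent = defaultdict(deque)

def on_event(event):
    """Evaluate each incoming event against two illustrative patterns."""
    history = recent[event["customer_id"]]
    history.append(event)

    # Drop events that have fallen out of the time window.
    while history and event["ts"] - history[0]["ts"] > WINDOW:
        history.popleft()

    # Positive pattern: several product views plus a pricing-page visit
    # within ten minutes suggests the customer may be ready to buy.
    views = sum(1 for e in history if e["type"] == "product_view")
    priced = any(e["type"] == "pricing_view" for e in history)
    if views >= 3 and priced:
        notify_sales(event["customer_id"])

    # Negative pattern: repeated card declines in quick succession
    # is treated as a possible fraud indicator.
    declines = sum(1 for e in history if e["type"] == "card_declined")
    if declines >= 3:
        block_transaction(event)

def notify_sales(customer_id):
    print(f"ALERT sales rep: customer {customer_id} looks ready to buy")

def block_transaction(event):
    print(f"BLOCK: possible fraud involving customer {event['customer_id']}")

# Example usage with a single hypothetical event.
on_event({"customer_id": "C42", "type": "product_view", "ts": datetime.utcnow()})
```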
Since this is an Informatica blog, you probably have a decent set of “muscles” in place already, so why, you ask, would you need six-pack abs? Because six-pack abs are a good indication of a strong muscular core and are the basis of a stable and highly athletic body. The same parallel holds for companies: in today’s competitive business environment, you need strength, stability, and agility to compete. And since IT systems increasingly ARE the business, if your company isn’t performing as strong, lean, and mean as possible, then you can be sure your competitors will be looking to implement every advantage they can.
You may also be thinking: why would you need something like event processing when you already have good business intelligence systems in place? The reality is that it’s not easy to monitor and measure useful but sometimes hidden data, event, sensor, and social media sources, or to discern which patterns have meaning and which turn out to be false alarms. But the real difference is that BI usually reports to you after the fact, when the value of acting on the situation has diminished significantly.
So while muscles are important to be able to stand up and run, and good, strong muscles are necessary to do heavy lifting, it’s those six-pack abs on top of it all that give you the lean, mean fighting machine to identify significant threats and opportunities in your data, and, in essence, to better compete and win.
The way I see it – it’s all about having access to the right data at the right time. A company is like a large ship that must be effectively steered at both a strategic as well as an operational level to reach its destination – despite treacherous waters. Where am I going with this? Well, companies continuously face challenging situations such as competitive threats, customer expectations, market trends, and other external constraints. Overcoming the challenges of global uncertainty requires business agility to navigate an efficient and effective course towards achieving the company’s objectives.
Ok – so what does all this have to do with right time data integration? Actually – EVERYTHING! Companies require timely information to add value to their products, customer relationships, and business partnerships – agreed? And business conditions today are influenced by powerful market forces such as globalization, mergers and acquisitions, regulatory compliance, fierce competition, tight operating budgets, increased demands for improved customer service, and ever faster product delivery – right? So, it should follow that only the most responsive and agile companies will perform the best.
If you are with me so far, you will immediately see the connection between business agility and the speed of data processing and delivery. This typically spans a wide range of latencies depending on the business process. In analytical data integration projects (e.g. data warehousing) latencies can range from weeks to days. Operational data integration projects on the other hand require information within hours, minutes, and seconds. Examples of such projects include real-time data warehousing, operational data hubs, data synchronization, data replication, and agile business intelligence (BI).
The success of such operational data integration projects depends on the ability to meet service level agreements (SLAs) related to data latency. You may require data delivered frequently (e.g., real-time or near-real-time) or infrequently (e.g., weekly batch windows). Or you may need to move large data volumes or small datasets between applications. Or have ready access to the most current data available in operational systems. Or deliver new data and reports in days vs. months. Or streamline the accessibility of business partner data. Or be aware of real-time events as they occur.
So, whatever right time data integration means to you, please do join us at Informatica World 2013 to discuss – here are my recommendations on just some of the relevant activities on this topic:
Tuesday, June 4
- 9:00AM – 10:00AM – Platform & Products Track – Best Practices for Doing Data Replication the Right Way
- 10:15AM – 11:15AM – Architecture Track – Informatica for Agile Data Integration and Rapid Prototyping
- 11:30AM – 12:30PM – Big Data Track – Using Big Data and Events to Drive Operational Intelligence
- 2:00PM – 3:00PM – Architecture Track – PowerCenter Architecture
- 2:00PM – 3:00PM – Platform & Products Track – Real Time Data Integration with Next-Generation Data Replication
Wednesday, June 5
- 9:00AM – 10:00AM – Tech Talk Track – Look Deeper Into Operational Data with Data Replication and PowerCenter
- 9:00AM – 10:00AM – Platform & Products Track – Informatica for Real Time Data Integration
- 10:15AM – 11:15AM – Architecture Track – Transform Analytics and Customer Experience with Operational Intelligence
- 10:15AM – 11:15AM – Big Data Track – Unlock the Potential in External and Hierarchical Data Like XML and JSON
- 11:30AM – 12:30PM – Architecture Track – Data Integration Hub Implementation Best Practices and Architecture
- 11:30AM – 12:30PM – Best Practices Track – Using B2B Data Transformation to Tame Diverse Source File Structures
- 2:30PM – 3:30PM – Platform & Products Track – New Approaches to Reducing PowerCenter Testing and Monitoring Time
- 2:30PM – 3:30PM – Architecture Track – Automating the Payment Lifecycle with Informatica B2B
- 2:30PM – 3:30PM – Hybrid IT Track – Ultimate Software: Embedded Cloud Integration
Thursday, June 6
- 9:00AM – 10:00AM – Tech Talk Track – Proactive Monitoring: Greater IT Productivity with Streamlined Data Integration
- 10:15AM – 11:15AM – Platform & Products Track – What’s New from Informatica to Improve Data Warehouse Performance and Lower Costs
- 10:15AM – 11:15AM – Architecture Track – Fast Tracking ROI from Analytics Platforms Using Fast Clone
- 10:15AM – 11:15AM – Architecture Track – Enabling Data Synchronization Projects with Informatica
- 11:30AM – 12:30PM – Best Practices Track – Best Practices for Delivering Proactive Operational Intelligence
- 11:30AM – 12:30PM – Big Data Track – Real-Time, High Performance Big Data Ingestion
- 11:30AM – 12:30PM – Platform & Products Track – Informatica Solutions for Oracle
- 11:30AM – 12:30PM – Tech Talk Track – Data Transformation Tips and Tricks
- 2:30PM – 3:30PM – Architecture Track – Real-Time Data Integration Best Practices and Architecture
- 2:30PM – 3:30PM – Platform & Products Track – Expediting Integration of External Data Sources and Formats
BIRDS OF A FEATHER ROUNDTABLES (Check Agenda for Daily Timings):
- How to Untangle the Data Integration Hairball
HANDS-ON LABS (Check Agenda for Daily Timings):
- Table 13 – Data Virtualization for Agile BI / Data Services
- Table 19 – Empowering Analyst Self-Service with Informatica Analyst
- Table 41 – Informatica Data Replication
- Table 44 – Data Integration Hub
- Table 45 – B2B Data Transformation
- Table 46 – B2B Data Exchange
- Table 47 – RulePoint / Operational Intelligence Platform
- Table 48 – Proactive Healthcare Decision Management
BOOTHS (Check Agenda for Daily Timings):
- Operational Intelligence and Reporting
- Real Time Data Integration
- External Data for Analytics
- Customer/Supplier Collaboration
- Operational Synchronization
I look forward to seeing you at Informatica World 2013.
There’s no denying that business continues to accelerate its pace, and that the luxury of using historical data for business intelligence (BI) and planning is quickly becoming a thing of the past. Today, businesses need immediate insight into rapidly changing data in order to survive and thrive. Data that’s even a few hours old, let alone a few days old, is largely useless. Yet most information architectures today still only provide data that is a day, a week, or sometimes as much as a month old.
This leaves most BI, reporting, and analytics systems to operate without up-to-date data from operational systems, data that’s fundamental to making informed decisions about the business. To operate at the speed of business, it’s imperative that executives and decision makers have ready access to fresh information at all times, delivered continuously and automatically without impact on operational systems.
More and more organizations have found the answer in data replication. Data replication allows you to work with and make the best business decisions based on the freshest data drawn from all your operational systems. It delivers this current, up-to-date data in a seamless and non-intrusive manner, empowering you to operate at the speed of your business without constraints. It also automatically delivers this data wherever it’s needed – for operational intelligence, as well as operational use – without direct impact.
This unique “data-on-demand” approach removes the constraints of stale, old information and enables powerful outcomes for business initiatives. Fresh, current data drives new thinking across the enterprise, and can help organizations to:
- Increase revenue, delight customers, and outshine the competition
- Improve the quality and efficiency of business decisions
- Standardize on a single reliable and scalable solution that lowers costs and removes complexity
At Informatica, we’ve seen numerous customers implement Informatica Data Replication to deliver this fresh, up-to-date data for operational intelligence, reporting, and analytics, and report tremendous positive changes to their business. Some examples of customers using Informatica Data Replication with great success are:
- Westlake Financial Systems saved hundreds of thousands of dollars and improved its profitability and customer satisfaction through a more effective payment collection system
- Optus Australia increased both revenues and customer satisfaction by providing calling plan access, alerting, and self-service upgrades directly to its customer base
- A major national pharmacy chain accelerated and improved health care decision making across the business and increased agility and responsiveness to its customers, resulting in higher customer satisfaction while driving down the cost of technology
Is your business ready to make the leap to true operational intelligence using the freshest data to make your business decisions? Do you want to understand more about the impact that this kind of insight can make to your business?
If yes, please join us for a discussion with two business executives who have seen the impact in their own and their customers’ businesses using data replication on August 28 at 10 am Pacific. You can register using the link below.
The freshest data really does make for the best business decisions. We look forward to your joining us and participating in the discussion.
Did you know that Forrester estimates in their 10 Cloud Predictions For 2012 blog post that on average organizations will be running more than 10 different cloud applications and that the public Software-as-a-Service (SaaS) market will hit $33 billion by the end of 2012?
However, in the same post, Forrester also acknowledged that SaaS adoption is led mainly by Customer Relationship Management (CRM), procurement, collaboration, and Human Capital Management (HCM) software and that all other software segments will “still have significantly lower SaaS adoption rates”. It’s not hard to see this in the market today, with cloud juggernaut salesforce.com leading the way in CRM, and Workday and SuccessFactors doing battle in HCM, for example. Forrester claims that amongst the lesser known software segments, Product Lifecycle Management (PLM), Business Intelligence (BI), and Supply Chain Management (SCM) will be the categories to break through as far as SaaS adoption is concerned, with approximately 25% of companies using these solutions by 2012. (more…)
Collaborative learning is essential for transforming work activities that involve a high degree of uncertainty and creativity into a lean value stream. These characteristics are common in enterprise integration initiatives due to unclear and inconsistent data definitions across multiple silos, rapidly changing requirements, and a lack of perfect knowledge of end-to-end processes. Traditional approaches generally end up propagating the integration hairball, which is inefficient and wasteful, and certainly not Lean. You could say that these value streams are simply immature processes that lack standards and metrics, which is true, but the practitioners involved in the process don’t see it that way. They see themselves as highly skilled professionals solving complex, unique problems and delivering customized solutions that fit like a glove. Yet the outside observer who looks at the end-to-end process at the macro level sees patterns repeated over and over again and what appears to be a great deal of “reinventing the wheel.” (more…)
I just came back from MicroStrategy World. There were many conversations about social, mobile, cloud, and big data. There was strong interest in cloud, clear adoption of mobile, and some big data adoption. eHarmony gave a great presentation about how they handle big data with Informatica, and how they’re starting to use Informatica HParser running on Hadoop to process JSON.
But that wasn’t the number one conversation. The one topic that everyone was interested in – and I talked to nearly 100 customers and partners over four days – was creating new reports faster, or Agile BI. (more…)