Tag Archives: Business Intelligence
Everyone knows that Informatica is the Data Integration company that helps organizations connect their disparate software into a cohesive, synchronized enterprise information system. The value to business is enormous and well documented in the form of use cases, ROI studies, and loyalty/renewal rates that are industry-leading.
Event Processing, on the other hand, is a technology that has been around for only a few years and has yet to reach Main Street in Systems City, IT. But if you look at how event processing is being used, it’s amazing that more people haven’t heard about it. The idea at its core (pun intended) is very simple: monitor your data and events – the things that happen on a daily, hourly, even minute-by-minute basis – look for important patterns that are positive or negative indicators, and then set up your systems to take action automatically when those patterns occur – for example, notify a sales rep when a pattern indicates a customer is ready to buy, or stop a transaction when a pattern indicates your company is about to be defrauded.
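The core loop is simple enough to sketch in a few lines of Python. Everything here – the event shape, the "viewed the pricing page three times" rule, and the alert wording – is a hypothetical illustration of the pattern-matching idea, not how any particular event processing product works:

```python
from dataclasses import dataclass

@dataclass
class Event:
    customer: str
    action: str  # e.g. "viewed_pricing", "checkout", "failed_login"

def detect_buying_intent(events, threshold=3):
    """Emit an alert the moment a customer's pricing-page views hit the threshold."""
    counts = {}
    alerts = []
    for e in events:
        if e.action == "viewed_pricing":
            counts[e.customer] = counts.get(e.customer, 0) + 1
            if counts[e.customer] == threshold:
                # In a real system this would trigger a workflow, not return a string.
                alerts.append(f"notify sales rep: {e.customer} looks ready to buy")
    return alerts
```

The key property is that the check runs as each event arrives, so the action fires while the opportunity (or threat) is still live – which is exactly the difference from after-the-fact reporting discussed below.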
Since this is an Informatica blog, you probably have a decent set of “muscles” in place already, so why, you ask, would you need 6-pack abs? Because 6-pack abs are a good indication of a strong muscular core and are the basis of a stable and highly athletic body. The parallel holds for companies: in today’s competitive business environment, you need strength, stability, and agility to compete. And since IT systems increasingly ARE the business, if your company isn’t operating as strong, lean, and mean as possible, you can be sure your competitors will be looking to implement every advantage they can.
You may also be wondering why you would need something like Event Processing when you already have good Business Intelligence systems in place. The reality is that it’s not easy to monitor and measure useful but sometimes hidden data, event, sensor, and social media sources, or to discern which patterns have meaning and which turn out to be false positives. But the real difference is that BI usually reports to you after the fact, when the value of acting on the situation has diminished significantly.
So while muscles are important to be able to stand up and run, and good-quality, strong muscles are necessary to do heavy lifting, it’s those 6-pack abs on top of it all that make you the lean, mean fighting machine able to identify significant threats and opportunities in your data – and, in essence, to better compete and win.
The way I see it – it’s all about having access to the right data at the right time. A company is like a large ship that must be effectively steered at both a strategic as well as an operational level to reach its destination – despite treacherous waters. Where am I going with this? Well, companies continuously face challenging situations such as competitive threats, customer expectations, market trends, and other external constraints. Overcoming the challenges of global uncertainty requires business agility to navigate an efficient and effective course towards achieving the company’s objectives.
Ok – so what does all this have to do with right time data integration? Actually – EVERYTHING! Companies require timely information to add value to their products, customer relationships, and business partnerships – agreed? And business conditions today are influenced by powerful market forces such as globalization, mergers and acquisitions, regulatory compliance, fierce competition, tight operating budgets, increased demands for improved customer service, and ever faster product delivery – right? So, it should follow that only the most responsive and agile companies will perform the best.
If you are with me so far, you will immediately see the connection between business agility and the speed of data processing and delivery. This typically spans a wide range of latencies depending on the business process. In analytical data integration projects (e.g. data warehousing) latencies can range from weeks to days. Operational data integration projects on the other hand require information within hours, minutes, and seconds. Examples of such projects include real-time data warehousing, operational data hubs, data synchronization, data replication, and agile business intelligence (BI).
The success of such operational data integration projects depends on the ability to meet service level agreements (SLAs) related to data latency. You may require data delivered frequently (e.g., real-time or near-real-time) or infrequently (e.g., weekly batch windows). You may need to move large data volumes or small datasets between applications, have ready access to the most current data available in operational systems, deliver new data and reports in days rather than months, streamline access to business partner data, or be aware of real-time events as they occur.
So, whatever right time data integration means to you, please do join us at Informatica World 2013 to discuss – here are my recommendations on just some of the relevant activities on this topic:
Tuesday, June 4
- 9:00AM – 10:00AM – Platform & Products Track – Best Practices for Doing Data Replication the Right Way
- 10:15AM – 11:15AM – Architecture Track – Informatica for Agile Data Integration and Rapid Prototyping
- 11:30AM – 12:30PM – Big Data Track – Using Big Data and Events to Drive Operational Intelligence
- 2:00PM – 3:00PM – Architecture Track – PowerCenter Architecture
- 2:00PM – 3:00PM – Platform & Products Track – Real Time Data Integration with Next-Generation Data Replication
Wednesday, June 5
- 9:00AM – 10:00AM – Tech Talk Track – Look Deeper Into Operational Data with Data Replication and PowerCenter
- 9:00AM – 10:00AM – Platform & Products Track – Informatica for Real Time Data Integration
- 10:15AM – 11:15AM – Architecture Track – Transform Analytics and Customer Experience with Operational Intelligence
- 10:15AM – 11:15AM – Big Data Track – Unlock the Potential in External and Hierarchical Data Like XML and JSON
- 11:30AM – 12:30PM – Architecture Track – Data Integration Hub Implementation Best Practices and Architecture
- 11:30AM – 12:30PM – Best Practices Track – Using B2B Data Transformation to Tame Diverse Source File Structures
- 2:30PM – 3:30PM – Platform & Products Track – New Approaches to Reducing PowerCenter Testing and Monitoring Time
- 2:30PM – 3:30PM – Architecture Track – Automating the Payment Lifecycle with Informatica B2B
- 2:30PM – 3:30PM – Hybrid IT Track – Ultimate Software: Embedded Cloud Integration
Thursday, June 6
- 9:00AM – 10:00AM – Tech Talk Track – Proactive Monitoring: Greater IT Productivity with Streamlined Data Integration
- 10:15AM – 11:15AM – Platform & Products Track – What’s New from Informatica to Improve Data Warehouse Performance and Lower Costs
- 10:15AM – 11:15AM – Architecture Track – Fast Tracking ROI from Analytics Platforms Using Fast Clone
- 10:15AM – 11:15AM – Architecture Track – Enabling Data Synchronization Projects with Informatica
- 11:30AM – 12:30PM – Best Practices Track – Best Practices for Delivering Proactive Operational Intelligence
- 11:30AM – 12:30PM – Big Data Track – Real-Time, High Performance Big Data Ingestion
- 11:30AM – 12:30PM – Platform & Products Track – Informatica Solutions for Oracle
- 11:30AM – 12:30PM – Tech Talk Track – Data Transformation Tips and Tricks
- 2:30PM – 3:30PM – Architecture Track – Real-Time Data Integration Best Practices and Architecture
- 2:30PM – 3:30PM – Platform & Products Track – Expediting Integration of External Data Sources and Formats
BIRDS OF A FEATHER ROUNDTABLES (Check Agenda for Daily Timings):
- How to Untangle the Data Integration Hairball
HANDS-ON LABS (Check Agenda for Daily Timings):
- Table 13 – Data Virtualization for Agile BI / Data Services
- Table 19 – Empowering Analyst Self-Service with Informatica Analyst
- Table 41 – Informatica Data Replication
- Table 44 – Data Integration Hub
- Table 45 – B2B Data Transformation
- Table 46 – B2B Data Exchange
- Table 47 – RulePoint / Operational Intelligence Platform
- Table 48 – Proactive Healthcare Decision Management
BOOTHS (Check Agenda for Daily Timings):
- Operational Intelligence and Reporting
- Real Time Data Integration
- External Data for Analytics
- Customer/Supplier Collaboration
- Operational Synchronization
I look forward to seeing you at Informatica World 2013.
There’s no denying that business continues to accelerate its pace, and that the luxury of using historical data for business intelligence (BI) and planning is quickly becoming a thing of the past. Today, businesses need immediate insight into rapidly changing data in order to survive and thrive. Data that’s even a few hours old—let alone a few days old—is largely useless. But most information architectures today still provide only data that is a day, a week, or sometimes as much as a month old.
This leaves most BI, reporting, and analytics systems to operate without up-to-date data from operational systems, data that’s fundamental to making informed decisions about the business. To operate at the speed of business, it’s imperative that executives and decision makers have ready access to fresh information at all times, delivered continuously and automatically without impact on operational systems.
More and more organizations have found the answer in data replication. Data replication lets you work with, and make the best business decisions based on, the freshest data drawn from all your operational systems. It delivers this up-to-date data in a seamless, non-intrusive manner, empowering you to operate at the speed of your business without constraints, and it automatically delivers the data wherever it’s needed – for operational intelligence as well as operational use – without direct impact on source systems.
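The "non-intrusive" part of replication typically comes from reading the database's change log rather than repeatedly querying the source tables. As a rough sketch of that idea – the change-record format below is invented for illustration and says nothing about how Informatica Data Replication is actually implemented:

```python
# Log-based change data capture, sketched: a replicator tails the source
# database's change log and applies each change to a target copy, so the
# source tables never see extra query load.
def apply_changes(change_log, replica):
    """Apply an ordered stream of change records to an in-memory replica."""
    for change in change_log:
        op, table, key = change["op"], change["table"], change["key"]
        target = replica.setdefault(table, {})
        if op in ("insert", "update"):
            target[key] = change["row"]   # latest row image wins
        elif op == "delete":
            target.pop(key, None)
    return replica
```

Because changes are applied in log order, the replica converges on the current state of the source – which is what makes the target fresh enough for operational intelligence rather than just overnight reporting.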
This unique “data-on-demand” approach removes the constraints of stale, old information and enables powerful outcomes for business initiatives. Fresh, current data drives new thinking across the enterprise, and can help organizations to:
- Increase revenue, delight customers, and outshine the competition
- Improve the quality and efficiency of business decisions
- Standardize on a single reliable and scalable solution that lowers costs and removes complexity
At Informatica, we’ve seen numerous customers implement Informatica Data Replication to deliver this fresh, up-to-date data for operational intelligence, reporting, and analytics, and report tremendous positive changes to their business. Some examples of customers using Informatica Data Replication with great success are:
- Westlake Financial Systems saved hundreds of thousands of dollars and improved its profitability and customer satisfaction through a more effective payment collection system
- Optus Australia increased both revenues and customer satisfaction by providing calling plan access, alerting, and self-service upgrades directly to its customer base
- A major national pharmacy chain accelerated and improved health care decision making across the business and increased agility and responsiveness to its customers, resulting in higher customer satisfaction while driving down the cost of technology
Is your business ready to make the leap to true operational intelligence using the freshest data to make your business decisions? Do you want to understand more about the impact that this kind of insight can make to your business?
If yes, please join us on August 28 at 10 am Pacific for a discussion with two business executives who have seen the impact of data replication in their own and their customers’ businesses. You can register using the link below.
The freshest data really does drive the best business decisions. We look forward to you joining and participating in the discussion.
Did you know that Forrester estimates in their 10 Cloud Predictions For 2012 blog post that on average organizations will be running more than 10 different cloud applications and that the public Software-as-a-Service (SaaS) market will hit $33 billion by the end of 2012?
However, in the same post, Forrester also acknowledged that SaaS adoption is led mainly by Customer Relationship Management (CRM), procurement, collaboration, and Human Capital Management (HCM) software and that all other software segments will “still have significantly lower SaaS adoption rates”. It’s not hard to see this in the market today, with cloud juggernaut salesforce.com leading the way in CRM, and Workday and SuccessFactors doing battle in HCM, for example. Forrester claims that amongst the lesser-known software segments, Product Lifecycle Management (PLM), Business Intelligence (BI), and Supply Chain Management (SCM) will be the categories to break through as far as SaaS adoption is concerned, with approximately 25% of companies using these solutions by 2012. (more…)
Collaborative learning is essential for transforming work activities that involve a high degree of uncertainty and creativity into a lean value stream. These characteristics are common in enterprise integration initiatives due to unclear and inconsistent data definitions across multiple silos, rapidly changing requirements, and a lack of perfect knowledge around end-to-end processes. Traditional approaches generally end up propagating the integration hairball, which is inefficient and wasteful – and certainly not Lean. You could say that these value streams are simply immature processes that lack standards and metrics, which is true, but the practitioners involved in the process don’t see it that way. They see themselves as highly skilled professionals solving complex, unique problems and delivering customized solutions that fit like a glove. Yet the outside observer who looks at the end-to-end process at the macro level sees patterns that are repeated over and over again, and what appears to be a great deal of “reinventing the wheel.” (more…)
I just came back from MicroStrategy World. There were many conversations about social, mobile, cloud and big data. There was strong interest in cloud, clear adoption of mobile, and some big data adoption. eHarmony had a great presentation about how they handle big data with Informatica, and how they’re starting to use Hadoop with Informatica HParser running on Hadoop for processing JSON.
But that wasn’t the number one conversation. The one topic that everyone was interested in – and I talked to nearly 100 customers and partners over four days – was creating new reports faster, or Agile BI. (more…)
Today, agility and timely visibility are critical to the business. No wonder CIO.com states that business intelligence (BI) will be the top technology priority for CIOs in 2012. However, is your data architecture agile enough to handle these exacting demands?
In his blog Top 10 Business Intelligence Predictions For 2012, Boris Evelson of Forrester Research, Inc., states that traditional BI approaches often fall short for the two following reasons (among many others):
- BI hasn’t fully empowered information workers, who still largely depend on IT
- BI platforms, tools and applications aren’t agile enough (more…)
I spent last weekend reading Geoffrey Moore’s new book, Escape Velocity: Free Your Company’s Future from the Pull of the Past. Then on Sunday, the New York Times published this article about salesforce.com: A Leader in the Cloud Gains Rivals. Clearly “The Big Switch” is on. With this as a backdrop, the need for a comprehensive cloud data management strategy has surfaced as a top IT imperative heading into the New Year – How and when do you plan to move data to the cloud? How will you prevent SaaS silos? How will you ensure your cloud data is trustworthy, relevant and complete? What is your plan for longer-term cloud governance and control?
These are just a few of the questions you need to think through as you develop your short, medium and long-term cloud strategy. Here are my predictions for what else should be on your 2012 cloud integration radar. (more…)
Data is the Answer, Now What’s the Question? Hint: It’s The Key to Optimizing Enterprise Applications
Data quality improvement isn’t really anything new; it’s been around for some time now. Fundamentally the goal of cleansing, standardizing and enriching enterprise data through data quality processes remains the same. What’s different now, however, is that in an increasingly competitive marketplace and in difficult economic times, a complete enterprise data quality management approach can separate the leaders from the laggards. With a sound approach to enterprise data quality management, organizations reap the benefits of turning enterprise data into a key strategic asset. This helps to increase revenue, eliminate costs and reduce risks. Using the right solution, organizations can leverage data in a way never possible before, holistically and proactively, by addressing data quality issues when and where they arise. Doing so ensures key IT initiatives, like business intelligence, master data management, and enterprise applications, deliver on their promises of better business results. (more…)
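To make "cleansing, standardizing and enriching" concrete, here is a minimal sketch of one standardization pass. The field names, the state-code lookup, and the validity rule are all hypothetical examples, far simpler than what a full data quality platform does:

```python
# Hypothetical cleanse-and-standardize pass over customer records:
# trim whitespace, normalize name casing, map spelled-out state names
# to codes, and flag records that still fail validation rather than
# silently loading bad data downstream.
STATE_CODES = {"california": "CA", "new york": "NY", "texas": "TX"}

def standardize(record):
    # Cleanse: strip stray whitespace from every string field.
    clean = {k: v.strip() if isinstance(v, str) else v for k, v in record.items()}
    # Standardize: consistent name casing and two-letter state codes.
    clean["name"] = clean.get("name", "").title()
    state = clean.get("state", "").lower()
    clean["state"] = STATE_CODES.get(state, state.upper())
    # Validate: mark the record so downstream systems can quarantine failures.
    clean["valid"] = bool(clean["name"]) and len(clean["state"]) == 2
    return clean
```

The point of the `valid` flag is the "when and where they arise" idea above: quality problems are caught and surfaced at the point of entry instead of being discovered later in a BI report.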