Tag Archives: BI
According to Doug Henschen, Executive Editor at InformationWeek, “Despite the weak economy and zero growth in many IT salary categories, business intelligence (BI), analytics, information-integration and data warehousing professionals are seeing a slow-but-steady rise in income.” (more…)
In a recent webinar, Mark Smith, CEO at Ventana Research, and David Lyle, vice president of Product Strategy at Informatica, discussed “Building the Business Case and Establishing the Fundamentals for Big Data Projects.” Mark pointed out that the second-biggest barrier impeding big data initiatives is that the “business case is not strong enough.” The first and third barriers, respectively, were “lack of resources” and “no budget,” both of which also come back to having a strong business case. In this context, Dave provided a simple formula from which to build the business case:
Return on Big Data = Value of Big Data / Cost of Big Data (more…)
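To make the formula concrete, here is a minimal sketch in Python. The dollar figures are purely hypothetical stand-ins for whatever value and cost estimates your own business case produces.

```python
def return_on_big_data(value_of_big_data, cost_of_big_data):
    """Return on Big Data = Value of Big Data / Cost of Big Data."""
    if cost_of_big_data <= 0:
        raise ValueError("cost must be positive")
    return value_of_big_data / cost_of_big_data

# Hypothetical example: $1.2M in projected value (new revenue plus
# savings) against $400K in platform, storage, and staffing costs.
print(return_on_big_data(1_200_000, 400_000))  # -> 3.0
```

A ratio above 1.0 suggests the initiative pays for itself; the hard part, of course, is estimating the numerator credibly.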
Did you know that Forrester estimates in their 10 Cloud Predictions For 2012 blog post that, on average, organizations will be running more than 10 different cloud applications, and that the public Software-as-a-Service (SaaS) market will hit $33 billion by the end of 2012?
However, in the same post, Forrester also acknowledged that SaaS adoption is led mainly by Customer Relationship Management (CRM), procurement, collaboration, and Human Capital Management (HCM) software, and that all other software segments will “still have significantly lower SaaS adoption rates”. It’s not hard to see this in the market today, with cloud juggernaut salesforce.com leading the way in CRM, and Workday and SuccessFactors doing battle in HCM, for example. Forrester claims that among the lesser-known software segments, Product Lifecycle Management (PLM), Business Intelligence (BI), and Supply Chain Management (SCM) will be the categories to break through as far as SaaS adoption is concerned, with approximately 25% of companies using these solutions by 2012. (more…)
I spent last weekend reading Geoffrey Moore’s new book, Escape Velocity: Free Your Company’s Future from the Pull of the Past. Then on Sunday, the New York Times published this article about salesforce.com: A Leader in the Cloud Gains Rivals. Clearly “The Big Switch” is on. With this as a backdrop, the need for a comprehensive cloud data management strategy has surfaced as a top IT imperative heading into the New Year: How and when do you plan to move data to the cloud? How will you prevent SaaS silos? How will you ensure your cloud data is trustworthy, relevant and complete? What is your plan for longer-term cloud governance and control?
These are just a few of the questions you need to think through as you develop your short-, medium- and long-term cloud strategy. Here are my predictions for what else should be on your 2012 cloud integration radar. (more…)
Adopting Agile may require a cultural shift and can be disruptive to an organization in the beginning. However, as I mentioned in Part 1 of this blog series, Agile Data Integration holds the promise of increasing the chances of success, delivering projects faster, and reducing defects. Applying Lean principles within your organization can help ease the transition to Agile Data Integration. Lean is a set of principles first explored in the context of data integration by John Schmidt and David Lyle in their book on Lean Integration. First and foremost, Lean recommends that an organization focus on eliminating waste and optimizing the data integration process from the customers’ perspective. Agile Data Integration maximizes the business value of projects (e.g. Agile BI, Data Warehousing, Big Data Analytics, Data Migration, etc.) because you can get it right the first time by delivering exactly what the business needs, when it needs it. Break big projects into smaller, more manageable deliverables so that you can incrementally deliver value to the business. Agile Data Integration also recommends the following: (more…)
Many companies have built out their data infrastructure, composed of large data warehouses, varying data marts, integration processes, quality controls and much more. And it has driven tremendous value in the form of stronger business intelligence, better views of the customer, faster business processes and more. But with all the help on the analysis side, where’s the evolution in putting those findings into operational practice? Think of quickly identifying that a premium/platinum-level client has been put on hold by a customer care agent for more than 5 minutes (and no, the “we are experiencing unusually high call volume” message does not let anyone off the hook).
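To illustrate what operationalizing such a finding might look like, here is a minimal sketch in Python. The event shape (customer_id, tier, hold_started) and the way events arrive are hypothetical; a real deployment would pull them from the call-center system.

```python
from datetime import datetime, timedelta

HOLD_LIMIT = timedelta(minutes=5)

def flag_long_holds(calls, now=None):
    """Yield an alert for each premium/platinum caller held past the limit.

    `calls` is an iterable of dicts with hypothetical fields:
    customer_id, tier, and hold_started (a datetime).
    """
    now = now or datetime.utcnow()
    for call in calls:
        held_for = now - call["hold_started"]
        if call["tier"] in ("premium", "platinum") and held_for > HOLD_LIMIT:
            yield {"customer_id": call["customer_id"], "held_for": held_for}

# Example: one platinum caller has already been waiting six minutes.
calls = [{"customer_id": "C-1001", "tier": "platinum",
          "hold_started": datetime.utcnow() - timedelta(minutes=6)}]
for alert in flag_long_holds(calls):
    print(alert)  # in practice, hand off to an escalation queue
```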
I was driving to work this week when I heard about Walt Disney Company’s announcement that it would buy comic book giant Marvel Entertainment (MVL) for US$4 billion. I started thinking about what it would be like to see the X-MEN hanging out with Snow White’s Seven Dwarfs the next time I take my kids to Disneyland. At the same time, I was wondering whether an increase in mergers and acquisitions was a sign of an economic recovery. (more…)
On www.ebizq.net a few days ago, the question was posed: "Why is a Single Version of the Truth Still Difficult to Achieve With BI?" I provided a quick response there but felt it was a topic worthy of expanded discussion.
As I wrote in my comment, a single version of the truth by definition means that there's a single representation of critical master data (customers, products, assets, and more) that's unique, complete, and consistent, and that serves as the most reliable and authoritative information for the entire enterprise.
Business Intelligence as a technology or market specializes in reporting on existing data from multiple systems without prejudice. It may do some aggregation or rollup for dimensional analysis, but it is not designed or equipped to create a single version of the truth. Inconsistency in customer or product dimensions across siloed applications can make any BI analytics running on data warehouses or operational data unreliable. It should come as no surprise to regular readers of this blog, and to those familiar with MDM, that MDM is needed to recognize disparate data, resolve it into a single version of the truth, and relate the resulting records to derive meaning. In fact, many MDM projects have been kicked off with the initial goal of making BI reporting more accurately reflect the true nature of the business.
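To ground the recognize/resolve/relate idea, here is a minimal sketch in Python. The exact-key matching and last-write-wins survivorship rule are deliberate simplifications of what a real MDM hub does, and all record fields are hypothetical.

```python
from collections import defaultdict

def match_key(record):
    """Recognize: reduce a record to a simple match key (a stand-in for
    real probabilistic or rules-based matching)."""
    return (record["name"].strip().lower(), record["email"].strip().lower())

def build_golden_records(records):
    """Resolve: merge each group of matching records into one golden record,
    letting the most recently updated source win per field."""
    groups = defaultdict(list)
    for rec in records:
        groups[match_key(rec)].append(rec)
    golden = []
    for recs in groups.values():
        merged = {}
        for rec in sorted(recs, key=lambda r: r["updated"]):
            merged.update({k: v for k, v in rec.items() if v})
        # Relate: keep lineage back to the contributing source records.
        merged["source_ids"] = [r["id"] for r in recs]
        golden.append(merged)
    return golden

# Two rows (CRM and billing) that describe the same customer.
records = [
    {"id": "crm-1", "name": "Ann Lee", "email": "ann@x.com",
     "phone": "", "updated": 1},
    {"id": "bill-7", "name": "ann lee ", "email": "ANN@X.COM",
     "phone": "555-0100", "updated": 2},
]
print(build_golden_records(records))
```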
Brian Gentile, CEO of Jaspersoft, made another interesting comment, saying, “Even within classical BI systems, more than one version of the truth can persist.” He reasons that OLAP systems requiring OLAP cubes exacerbate the problem by holding yet more copies. With an MDM system this can be avoided as a matter of process by always using the latest dimensions fed from the MDM Hub. In effect, the OLAP system or data warehouse should be treated as a downstream system that is kept up to date by the integration capabilities of the MDM platform.
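As a sketch of what that downstream feed could look like, here is a minimal Python example that upserts a customer dimension from an MDM hub extract, using SQLite as a stand-in for the warehouse. The table and column names are hypothetical.

```python
import sqlite3

def refresh_customer_dimension(conn, hub_rows):
    """Upsert the warehouse's customer dimension from the latest MDM hub
    extract, so every cube rebuild sees one consistent set of dimensions."""
    conn.executemany(
        "INSERT OR REPLACE INTO dim_customer (master_id, name, segment) "
        "VALUES (:master_id, :name, :segment)",
        hub_rows,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer "
             "(master_id TEXT PRIMARY KEY, name TEXT, segment TEXT)")
refresh_customer_dimension(conn, [
    {"master_id": "M-42", "name": "Ann Lee", "segment": "platinum"},
])
print(conn.execute("SELECT * FROM dim_customer").fetchall())
```

Scheduling a refresh like this ahead of each cube build keeps the extra copies Gentile warns about in lockstep with the hub.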
Regardless, I think the question posed was an excellent one because it reminds us that many companies out there are still just starting to learn about MDM. Those of us who have been living and breathing MDM for the last 5 years or more take it for granted that MDM should be a precursor to many downstream benefits such as more accurate BI. It goes to show that all of us need to keep providing education and support to help organizations learn about and realize the benefits of MDM. Hence this blog.