Category Archives: CIO
As we renew or reinvent ourselves for 2015, I wanted to share a case of “imagine if” with you and combine it with the narrative of an American frontier town out West, trying to find a new Sheriff – a Wyatt Earp. In this case the town is a legacy European communications firm and Wyatt and his brothers are the new managers – the change agents.
Here is a positive word upfront. This operator has had some success rolling out broadband internet and IPTV products to residential and business clients to replace its dwindling copper install base. But they are behind the curve on wireless penetration due to competition from a number of smaller, agile MVNOs and two other multi-national operators with a high density of brick-and-mortar stores, excellent brand recognition and support infrastructure. Having more than a handful of brands certainly did not make this any easier for our CSP. To make matters even more challenging, price pressure is increasingly squeezing all operators in this market. The ones able to offset the high-cost Capex of spectrum acquisitions and upgrades with lower-cost Opex for running the network and maximizing subscriber profitability will set themselves up for success (see one of my earlier posts on the same phenomenon in banking).
Not only did they run every single brand on a separate CRM and billing application (including all the various operational and analytical packages), they also ran nearly every customer-facing service (CFS) within a brand the same dysfunctional way. In the end, they had over 60 CRM applications and the same number of billing applications across all copper, fiber, IPTV, SIM-only, mobile residential and business brands. Granted, this may be quite an extreme example; but it is nevertheless relevant for many other legacy operators.
As a consequence, their projections indicate they incur over €600,000 annually in maintaining duplicate customer records (ignoring duplicate base product/offer records for now) due to excessive hardware, software and IT operations. Moreover, they have to stomach about the same amount for ongoing data quality efforts in IT and the business areas across their broadband and multi-play service segments.
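The cost of maintaining duplicate customer records across so many CRM systems is exactly the kind of problem a matching pass can surface. As a minimal sketch (the record fields, similarity threshold, and matching rule here are illustrative assumptions, not this operator’s actual logic), duplicate detection often starts by normalizing names and addresses before comparing them:

```python
from difflib import SequenceMatcher


def normalize(record):
    """Lowercase and strip punctuation so trivially different spellings compare equal."""
    return {k: "".join(ch for ch in str(v).lower() if ch.isalnum() or ch == " ").strip()
            for k, v in record.items()}


def likely_duplicates(records, threshold=0.85):
    """Return index pairs of records whose name+address similarity exceeds threshold."""
    normed = [normalize(r) for r in records]
    pairs = []
    for i in range(len(normed)):
        for j in range(i + 1, len(normed)):
            a = normed[i]["name"] + " " + normed[i]["address"]
            b = normed[j]["name"] + " " + normed[j]["address"]
            if SequenceMatcher(None, a, b).ratio() >= threshold:
                pairs.append((i, j))
    return pairs


# Hypothetical records as they might appear in two different brand CRMs
customers = [
    {"name": "Jan de Vries", "address": "Hoofdstraat 1, Utrecht"},
    {"name": "J. de Vries",  "address": "Hoofdstraat 1 Utrecht"},
    {"name": "Maria Jansen", "address": "Kerkweg 12, Leiden"},
]
print(likely_duplicates(customers))  # the first two records match
```

A production matching engine would of course use far more sophisticated, tuned rules, but even this toy version shows why the comparison is quadratic in the number of records, and why consolidating 60 systems first makes the problem tractable.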
Here are some more of the benefits they projected:
- €18.3 million in call center productivity improvement
- €790,000 improvement in profit due to reduced churn
- €2.3 million reduction in customer acquisition cost
- And if you include fixing duplicate and conflicting product information, add another €7.3 million in profit via billing error and discount reduction (which is in line with our findings from a prior telco engagement)
Even though major business areas did not contribute to the investigation, and the improvement estimates were often on the conservative side, they projected a 14:1 ratio between overall benefit amount and total project cost.
Coming back to the “imagine if” aspect now, one would ask how this behemoth of an organization can be fixed. Well, it will take years, and without new management (in this case new managers busting through the door), this organization risks becoming the next Rocky Mountain mining ghost town.
The good news is that this operator is seeing some management changes now. The new folks have a clear understanding that business-as-usual won’t do going forward and that centralization of customer insight (which includes some data elements) has its distinct advantages. They will tackle new customer analytics, order management, operational data integration (network) and next-best-action use cases incrementally. They know they are in the data, not just the communication business. They realize they have to show a rapid succession of quick wins rather than make the organization wait a year or more for first results. They have fairly humble initial requirements to get going as a result.
You can equate this to the new Sheriff not going after the whole organization of the three corrupt cattle barons, but just the foreman of one of them for starters. With little cost involved, the Sheriff acquires some first-hand knowledge, plus he sends a message that will likely persuade others to be more cooperative going forward.
What do you think? Is new management the only way to implement drastic changes around customer experience, profitability or at least understanding?
When you talk to CIOs today about their business priorities, at the top of their list is better connecting what IT is doing to business strategy. Or put another way, it is about establishing business/IT alignment. One area where CIOs need to ensure better alignment is enterprise analytics. CIOs that I have talked to share openly that business users are demanding the ability to reach their apps and data anywhere and on any device. Yet even though CIOs say they have an interest in the mechanisms of data delivery–data integration, data cleanliness, data governance, data mastering, and even metadata management–they would not take a meeting on those topics alone, because they would need to involve their business partners in such meetings. CIOs these days want you to have a business value proposition. Given this, CIOs say they want to hear about what the business wants to hear about:
- Enabling new, valuable business insights out of data to happen faster
- Enabling their businesses to compete with analytics
CIOs as an analytics proponent versus the analytics customer
So if the question is about competing with analytics, what role does the CIO have in setting the agenda here? Tom Davenport says that CIOs–as I heard in my own conversations with CIOs–have good intentions when it comes to developing an enterprise information strategy. They can see the value of taking an enterprise versus a departmental view. Tom suggests, however, that CIOs should start by focusing upon the analytics that will matter most to the business. He says that IT organizations should also build an IT infrastructure capable of delivering the information and analytics that people across the enterprise need, not just now but also in the future.
Tom says that IT organizations must resist the temptation to provide analytics on an add-on or bolt-on basis for whatever transaction systems have just been developed. As a product manager, I had a development team that preferred to add analytics by source rather than do the hard work of creating integrative measures that crossed sources. So I know this problem firsthand. Tom believes that IT needs to build a platform that is standardized and can integrate data from more than one source. This includes the ability to adapt as business needs and business strategies change.
Making this an Enterprise Analytics Capability
In the early stages of analytics, IT organizations need to focus more upon a self-service approach. But as the business matures at analytics, Tom says that IT needs to shift gears and become a proactive advocate and architect of change. Tom says that IT should be a part owner of the company’s analytical capabilities. IT managers, therefore, must understand and be able to articulate the potential for analytics created at an enterprise level. At the same time, the IT staff–which often lacks the heavy mathematical backgrounds of analysts–needs to be able to interact with the analytics pros who use and consume the information that IT creates to build models. I faced this dilemma firsthand, where my analytics modelers were disconnected from BI product developers. They were two different communities working on our project. And although some modelers can build apps or even a BI system, what excites them most in life is building new analytical models.
Talk the language of the business
Tom Davenport says that IT managers can make their own lives easier with the business and with analysts by discussing decision making, insights, and business performance instead of cloud computing, service-oriented architecture, or even OLAP. Meanwhile, Tom feels that the enterprise analytics journey starts with good, integrated data on transactions and business processes managed through enterprise applications like ERP and CRM systems (Analytics at Work, Thomas Davenport, Harvard Business Review Press, page 51).
Focusing on the big questions and the right problems
Clearly, driving the business to focus on the big questions and the right problems is critical. IT cannot do this alone, but it can facilitate it. Why does it matter? An Accenture study found that “companies that derived any real value from them (their analytics) had anticipated how to leverage the information to generate new insights to improve business performance” (“Using Enterprise Systems to Gain Uncommon Competitive Advantage”, Accenture, page 3). This is critical, and too few organizations succeed in doing it.
With this accomplished, and to achieve the second goal, IT needs to eliminate legacy BI systems, old spaghetti code, and siloed data marts. The goal should be to replace them with an enterprise analytics capability that answers the big questions. This requires standardization around an enterprise-wide approach that ensures consistent data management and provides an integrated environment complete with data repositories/data lakes, analytical tools, presentation applications, and transformation tools. This investment should be focused on improving business processes or providing data needed for system-of-systems products. Tom says that IT’s job is to watch out for current and future users of information systems.
So the question is: where is your IT organization today? Clearly, it is important that IT measure enterprise analytic initiatives too. IT should measure adoption. IT should find out what is used and what is not. I had a CIO once admit to me that he did not know whether currently supported data marts were being used or even still had value. It is important that we have these answers. Clearly, being close to the business customer from the start can prevent the situation this CIO described.
Related Blogs and Links
Solution Brief: The Intelligent Data Platform
Author Twitter: @MylesSuer
Have you noticed something different this winter season that most people are cheery about? I’ll give you a hint. It’s not the great sales going on at your local shopping mall, but something that makes getting to the mall a lot more affordable than last year. It’s the extremely low gas prices across the globe, fueled by an over-supply of oil relative to demand, driven by geopolitics and the boom in shale oil production in North America and abroad. Like any other commodity, it’s impossible to predict where oil prices are headed. However, one thing is sure: Oil and Gas companies will need timely, quality data, as firms are investing in new technologies to become more agile, innovative, efficient, and competitive, as reported by a recent IDC Energy Insights Predictions report for 2015.
The report predicts:
- 80% of the top O&G companies will reengineer processes and systems to optimize logistics, hedge risk and efficiently and safely deliver crude, LNG, and refined products by the end of 2017.
- Over the next 3 years, 40% of O&G majors and all software divisions of oilfield services (OFS) will co-innovate on domain specific technical projects with IT professional service firms.
- The CEO will expect immediate and accurate information about top Shale Plays to be available by the end of 2015 to improve asset value by 30%.
- By 2016, 70% of O&G companies will have invested in programs to evolve the IT environment to a third platform driven architecture to support agility and readily adapt to change.
- With continued labor shortages and over 1/3 of the O&G workforce under 45 in three years, O&G companies will turn to IT to meet productivity goals.
- By the end of 2017, 100% of the top 25 O&G companies will apply modeling and simulation tools and services to optimize oil field development programs and 25% will require these tools.
- Spending on connectivity related technologies will increase by 30% between 2014 and 2016, as O&G companies demand vendors provide the right balance of connectivity for a more complex set of data sources.
- In 2015, mergers, acquisitions and divestitures, plus new integrated capabilities, will drive 40% of O&G companies to re-evaluate their current deployments of ERP and hydrocarbon accounting.
- With a business case built on predictive analytics and optimization in drilling, production and asset integrity, 50% of O&G companies will have advanced analytics capabilities in place by 2016.
- With pressures on capital efficiency, by 2015, 25% of the Top 25 O&G companies will apply integrated planning and information to large capital projects, speeding up delivery and reducing over-budget risks by 30%.
Realizing value from these investments will also require Oil and Gas firms to modernize and improve their data management infrastructure and technologies to deliver great data, whether fueling actionable insights from Big Data technology or facilitating post-merger application consolidation and integration activities. Great data is only achievable through great design, supported by capable solutions built to access and deliver timely, trusted, and secure data to those who need it most.
A lack of proper data management investments and competencies has long plagued the oil and gas sector with “less-than-acceptable” data and higher operating costs. According to the “Upstream Data and Information Management Survey” conducted by Wipro Technologies, 56% of those surveyed felt that business users spent a quarter or more of their time on low-value activities caused by existing data issues (e.g. accessing, cleansing, preparing data) instead of on high-value activities (e.g. analysis, planning, decision making). The same survey showed the biggest data management issues were timely access to required data and data quality issues from source systems.
So what can Oil and Gas CIO’s and Enterprise Architects do to prepare for the future? Here are some tips for consideration:
- Look to migrate and automate legacy hand-coded data transformation processes by adopting tools that streamline the development, testing, deployment, and maintenance of these complex tasks, so developers can build, maintain, and monitor data transformation rules once and deploy them across the enterprise.
- Simplify how data is distributed across systems with more modern architectures and solutions, and avoid the cost and complexity of point-to-point integrations.
- Deal with and manage data quality upstream at the source and throughout the data life cycle, rather than having end users fix unforeseen data quality errors manually.
- Create a centralized source of shared business reference and master data that can manage a consistent record across heterogeneous systems such as well asset/material information (wellhead, field, pump, valve, etc.), employee data (drill/reservoir engineer, technician), location data (often geo-spatial), and accounting data (for financial roll-ups of cost, production data).
- Establish standards and repeatable best practices by adopting an Integration Competency Center framework to support the integration and sharing of data between operational and analytical systems.
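To make the third tip concrete, here is a minimal sketch of managing quality at the source rather than downstream. The field names, value ranges, and status vocabulary are illustrative assumptions, not any specific vendor’s rules: the point is simply that a record is validated as it enters the system, so end users never see the bad data.

```python
def validate_well_record(record):
    """Apply source-side quality rules; return a list of rule violations (empty = clean)."""
    errors = []
    required = ("well_id", "field", "latitude", "longitude")
    for key in required:
        if not record.get(key):
            errors.append(f"missing required field: {key}")
    lat, lon = record.get("latitude"), record.get("longitude")
    if isinstance(lat, (int, float)) and not -90 <= lat <= 90:
        errors.append("latitude out of range")
    if isinstance(lon, (int, float)) and not -180 <= lon <= 180:
        errors.append("longitude out of range")
    # Hypothetical controlled vocabulary for well status
    if record.get("status") not in (None, "producing", "shut-in", "abandoned"):
        errors.append(f"unknown status: {record['status']}")
    return errors


good = {"well_id": "W-1001", "field": "Eagle Ford", "latitude": 28.7,
        "longitude": -98.1, "status": "producing"}
bad = {"well_id": "", "field": "Eagle Ford", "latitude": 128.7,
       "longitude": -98.1, "status": "flowing"}

print(validate_well_record(good))  # clean record passes with no violations
print(validate_well_record(bad))   # missing id, bad latitude, unknown status
```

In practice these rules would live in a shared data quality service so every source system enforces the same standards, which is exactly what the Integration Competency Center approach in the last tip is meant to institutionalize.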
In summary, low oil prices have a direct and positive impact on consumers, especially during the winter season and holidays, and I personally hope they continue for the foreseeable future, given that prices were double just a year ago. Unfortunately, no one can predict future energy prices. However, one thing is for sure: the demand for great data from Oil and Gas companies will continue to grow. As such, CIO’s and Enterprise Architects will need to recognize the importance of improving their data management capabilities and technologies to ensure success in 2015. How ready are you?
Click to learn more about Informatica in today’s Energy Sector:
Happy Holidays, Happy HoliData
In case you have missed our #HappyHoliData series on Twitter and LinkedIn, I decided to provide a short summary of best practices which are unleashing information potential. Simply scroll and click on the case study which is relevant for you and your business. The series touches on different industries and use cases. But all have one thing in common: All consider information quality as key value to their business to deliver the right services or products to the right customer.
Thanks a lot to all my great teammates, who made this series happen.
Happy Holidays, Happy HoliData.
Six ideas for CIOs in 2015 to put the innovation back in CIO
For most, the “I” in CIO stands for Information. But what about that other “I”, Innovation? For many IT organizations, 60-80% of IT spending continues to be tied up in keeping the IT lights on. But innovation matters more than ever to the business bottom line. According to Geoffrey Moore, “without innovation, offerings become more and more like each other. They commoditize.” (“Dealing with Darwin”, Geoffrey Moore, page 1). Geoffrey goes on to say later in “Dealing with Darwin” that commoditization will over time drop business returns to “the cost of capital”. So clearly, this is a place no CIO would want their enterprise to consciously go.
Given this, what is the role of the CIO in driving enterprise innovation? I believe that it is a significant one. Without question, technology investment has been a major driver of enterprise productivity gains. At the same time, IT investment has had a major role in improving business capabilities and the business value chains. And more recently, IT is even carving out a role in products themselves as part of the IoT. So how can CIOs help drive business innovation?
1) Get closer to your business customers. CIOs have said to me that their number one priority is connecting what IT is doing to what the business is doing. Given this, CIOs should make it a real priority for their teams to get closer to the business this year. According to Kamalini Ramdas’ article in Harvard Business Review, “to succeed at innovation, you need to have a culture in which everyone in the company is constantly scanning for ideas”.
2) Develop internal design partners. When I have started new businesses, I have always created a set of design partners to ensure that I built the right products. I tell my design partners to beat me up now rather than after I build the product. You need, as Kamalini Ramdas suggests, to harvest the best ideas of your corporate team just like I did with startups. You can start by focusing your attention upon the areas of distinctive capability—the places that give your firm its right to win.
3) Enable your IT leaders and individual contributors to innovate. For many businesses, speed to market or speed of business processes can represent a competitive advantage. Foundational to this are IT capabilities including uptime, system performance, speed of project delivery, and so on. Encouraging everyone on your team to drive superior operational capabilities can enable business competitive advantage. And one more thing: make sure to work with your business leaders to pass a portion of the business impact of improvements into a bonus for the entire enabling IT team. At Lincoln Electric, they used team bonuses to continuously improve their products. This arc welding company shares the money saved from each process improvement with the entire team. They end up getting the best people and the highest team longevity, as the team’s work improves product quality and increases cost takeout. According to Kamalini, “in truly innovative culture, leaders need to imbue every employee with a clear vision and a sense of empowerment that helps them identify synergistic ideas and run with them” (“Build a Company Where Everyone’s Looking for New Ideas”, Harvard Business Review, page 1).
4) Architect for Innovation. As the velocity of change increases, businesses need IT organizations to be able to move more quickly. This requires an enterprise architecture built for agility. According to Jeanne Ross, the more agile companies have a high percentage of their core business processes digitized and they have as well standardized their technology architecture (Enterprise Architecture as Strategy, Jeanne Ross, page 12).
5) Look for disruptive innovations. I remember a professor of mine suggesting, when discussing futures research, that we cannot predict the future. But I believe you can instead get closer to your customers than anyone else. CIOs should dedicate a non-trivial portion of IT spend to germinating potentially disruptive ideas. They should use their design partners to select what gets early-stage funding. Everyone here should act like a seed-stage venture capitalist. You need to let people experiment. At the same time, design partners should set reasonable goals and actively measure performance toward those goals.
6) Use analytics. Look at business analytics for areas that could use IT’s help. Open up discussions with design partners about areas needing capability improvement. This is a great place to start. Look as well for gaps in business delivery where further or improved digitization/automation could drive better performance. And once an innovation is initiated, use analytics to actively manage the innovation’s delivery.
There is always more that you can do to innovate. The key thing is to get innovation front and center on the IT agenda. Actively sponsor it and most importantly empower the team to do remarkable things. And when this happens, reward the teams that made it happen.
A couple of months ago, I reached out to a set of CIOs on the importance of good governance and security. All of them agreed that both were incredibly important. However, one CIO made a very pointed remark, saying that “the IT leadership at these breached companies wasn’t stupid.” He continued by saying that when selling the rest of the C-Suite, the discussion needs to be about business outcomes and business benefits. For this reason, he said, CIOs have struggled to sell the value of investments in governance and security. Now, I have suggested previously that security pays because of its impact on “brand promise”. And I still believe this.
However, this week the ante was raised even higher. A district judge ruled that a group of banks can proceed to sue a retailer for negligence in their data governance and security. The decision could clearly lead to significant changes in the way the cost of fraud is distributed among parties within the credit card ecosystem. Where once banks and merchant acquirers would have shouldered the burden of fraud, this decision paves the way for more card-issuing banks to sue merchants for not adequately protecting their POS systems.
The judge’s ruling said that “although the third-party hackers’ activities caused harm, [the] merchant played a key role in allowing the harm to occur.” The judge also determined that the banks’ suit against merchants was valid because the plaintiffs adequately showed that the retailer failed “to disclose that its data security systems were deficient.” This is interesting because it says that security systems should be sufficient and, if not, retailers need to inform potentially affected stakeholders of their deficient systems. And while taking this step could avoid a lawsuit, it would likely increase the cost of interchange for riskier merchants. This would effectively create a risk premium for retailers that do not adequately govern and protect their IT environments.
There are broad implications for all companies that end up harming customers, partners, or other stakeholders by not keeping their security systems up to snuff. The question is: will this give good governance enough of a business outcome and benefit that businesses will actually want to pay it forward — i.e. invest in good governance and security? What do you think? I would love to hear from you.
In the last 50-60 years, we have witnessed another revolution, through the invention of computing machines and the Internet – a digital revolution. It has transformed every industry and allowed us to operate at far greater scale – processing more transactions and in more locations – than ever before. New cities emerged on the map, migrations of knowledge workers throughout the world followed, and the standard of living increased again. And digitally available information has transformed how we run businesses, cities, and countries.
Forces Shaping Digital Revolution
Over the last 5-6 years, we’ve witnessed a massive increase in the volume and variety of this information. The leading forces contributing to this increase are:
- Next generation of software technology connecting data faster from any source
- Little to no hardware cost to process and store huge amounts of data (Moore’s Law)
- A sharp increase in number of machines and devices generating data that are connected online
- Massive worldwide growth of people connecting online and sharing information
- Speed of Internet connectivity that’s now free in many public places
As a result, our engagement with the digital world is rising – both for personal and business purposes. Increasingly, we play games, shop, sign digital contracts, make product recommendations, respond to customer complaints, share patient data, and make real-time pricing changes to in-store products – all from a mobile device or laptop. We do so increasingly in a collaborative way, in real time, and in a very personalized fashion. Big Data, Social, Cloud, and the Internet of Things are the key topics dominating our conversations and thoughts around data these days. They are altering how we engage with, and what we expect from, each other.
This may be the emergence of a new revolution, or the next phase of our digital revolution: the democratization and ubiquity of information to create new ways of interacting with customers and to dramatically speed up market launches. Businesses will build new products and services and create new business models by exploiting this vast new resource of information.
The Quest for Great Data
But there is work to do before one can unleash the true potential captured in data. Data is no longer a mere by-product or transaction record. Nor does it have an expiration date anymore. Data now flows like a river, fueling applications, business processes, and human or machine activities. New data gets created along the way and augments our understanding of the meaning behind this data. It is no longer good enough to have good data in isolated projects; rather, great data needs to become accessible to everyone and everything at a moment’s notice. This rich set of data needs to connect efficiently to information that is already present and learn from it. Such data needs to automatically rid itself of inaccurate and incomplete information. Clean, safe, and connected – this data is now ready to find us even before we discover it. It understands the context in which we are going to use this information and the key decisions that will follow. In the process, this data is learning about our usage, preferences, and results – what works versus what doesn’t. New data is created that captures such inherent understanding or intelligence. It needs to flow back to the appropriate business applications or machines for future usage after fine-tuning. Such data can then tell a story about human or machine actions and results. Such data can become a coach, a mentor, a friend of sorts to guide us through critical decision points. Such data is what we would like to call great data. In order to truly capitalize on the next step of the digital revolution, we will pervasively need this great data to power our decisions and thinking.
Impacting Every Industry
By 2020, there will be 50 billion connected devices, 7x more than human beings on the planet. This explosion of devices will generate really big data that will increasingly be processed and stored in the cloud. More than size, this complexity will require a new way of addressing business process efficiency, one that delivers agility, simplicity, and capacity. The impact of such transformation will spread across many industries. A McKinsey article, “The Future of Global Payments”, focuses on the digital transformation of payment systems in the banking industry and the resulting ubiquity of data. One of the key challenges for banks will be to shift from their traditional heavy reliance on siloed and proprietary data to a more open approach that encompasses a broader view of customers.
Industry executives, front line managers, and back office workers are all struggling to make the most sense of the data that’s available.
Closing Thoughts on Great Data
The “2014 PwC Global CEO Survey” showed that 81% of CEOs ranked technology advances as the #1 factor that will transform their businesses over the next 5 years. More data, by itself, isn’t enough for this transformation. A robust data management approach integrating machine and human data from all sources, updated in real time, across on-premise and cloud-based systems must be put in place to accomplish this mission. Such an approach will nurture great data. This end-to-end data management platform will provide data guidance and curate one of an organization’s most valuable assets: its information. Only by making sense of what we have at our disposal will we unleash the true potential of the information we possess. The next step in the digital revolution will be about organizations of all sizes being fueled by great data to unleash their untapped potential.
As I have shared within other posts within this series, businesses are using analytics to improve their internal and external facing business processes and to strengthen their “right to win” within the markets that they operate. At first glance, you might not think of universities needing to worry much about their right to win, but universities today are facing increasing competition for students as well as the need to increase efficiency, decrease dependence upon state funding, create new and less expensive delivery models, and drive better accountability.
George Washington University Perceives The Analytic Opportunity
George Washington University (GWU) is no different. And for this reason, their leadership determined that they needed to gain the business insight to compete for the best students, meet student diversity needs, and provide accountability to internal and external stakeholders. All of these issues turned out to have a direct impact upon GWU’s business processes—from student recruitment to financial management. At the same time, university leadership determined that the complexity of these challenges requires continual improvement in the University’s operational strategies and, most importantly, accurate, timely, and consistent data.
Making It A Reality
GWU determined that addressing these issues required a flexible system that could provide analytics and key academic performance indicators and metrics on demand, whenever they were needed. They also determined that the analytics, and the underlying data needed to enable accurate, balanced decisions, had to be delivered more quickly and more effectively than in the past.
Unfortunately, GWU’s data was buried in disparate data sources that were largely focused on supporting transactional, day-to-day business processes. This data was difficult to extract and even more difficult to integrate into a single format, owing to inherent system inconsistencies and the ownership issues surrounding them — a classic problem for collegial environments. Moreover, the university’s transaction applications did not store data in models that supported on-demand and ad hoc aggregations that GWU business users required.
To solve these issues, GWU created a data integration and business intelligence implementation dubbed the Student Data Mart (SDM). The SDM integrates raw structured and unstructured data into a unified data model to support key academic metrics.
“The SDM represents a life record of the students,” says Wolf, GWU’s Director of Business Intelligence. “It contains 10 years of recruitment, admissions, enrollment, registration, and grade-point average information for all students across all campuses.” It supports a wide range of academic metrics around campus enrollment counts, admissions selectivity, course enrollment, student achievement, and program metrics.
These metrics are directly and systematically aligned with the academic goals of each department and with GWU’s overarching business goals. “The SDM system provides direct access to key measures of academic performance,” Wolf says. “By integrating data into a clean repository and disseminating information over the intranet, the SDM has given university executives direct access to key academic metrics. Based on these metrics, users are able to make decisions in a timely manner and with more precision than before.”
Their integration technology also supports a student account system, which supplies more than 400 staff with a shared, unified view of the financial performance of students. It connects data from a series of diverse, fragmented internal sources with third-party data from employers, sponsors, and collection agencies. The goal is to answer business questions such as whether students have paid their fees, or how much they paid for each university course.
Continual Quality Improvement
During its implementation, GWU’s data integration process exposed a number of data quality issues that were the natural outcome of distributed data ownership. Without an enterprise approach to data and analytics, it would have been difficult to investigate the nature and extent of these data quality issues within the historically fragmented business intelligence landscape. Taking an enterprise approach has also enabled GWU to improve its data quality standards and procedures.
Wolf explains, “Data quality is an inevitable problem in any higher education establishment, because you have so many different people—lecturers, students, and administration staff—all entering data. With our system, we can find hidden data problems, wherever they are, and analyze the anomalies across all data sources. This helps build our trust and confidence in the data. It also speeds up the design phase because it overcomes the need to hand-query the data to see what the quality is like.”
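The kind of cross-source quality check Wolf describes can be sketched in a few lines. The source names, records, field, and validation rule below are purely hypothetical, offered only to illustrate the idea of profiling one field across several systems and flagging anomalies automatically rather than hand-querying each source:

```python
# Minimal, hypothetical sketch: profile one field across several
# source systems and flag values that break an expected format.
import re

# Hypothetical phone-number values as entered in two different systems
sources = {
    "admissions": ["202-555-0101", "202-555-0102", "n/a"],
    "registrar":  ["202-555-0101", "2025550199", ""],
}

# Expected format: NNN-NNN-NNNN
PHONE = re.compile(r"^\d{3}-\d{3}-\d{4}$")

# Collect, per source, every value that fails the rule
anomalies = {
    name: [v for v in values if not PHONE.match(v)]
    for name, values in sources.items()
}

print(anomalies)
# → {'admissions': ['n/a'], 'registrar': ['2025550199', '']}
```

A real platform would run hundreds of such profiling rules across many sources, but the principle is the same: the anomalies surface in one pass instead of one hand-written query per system.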
Connecting The Dots
Wolf and his team have not stopped there. As data emanating from social media has grown, they have designed the system so that social data can be integrated just as easily as their traditional data sources, including Oracle Financials, SunGard, SAP, and flat-file data. Wolf says the SDM platform doesn’t turn its back on any type of data. By allowing the university to integrate any type of data, including social media, Wolf has been able to support key measures of academic performance, improve standards, and reduce costs. Ultimately, this is helping GWU maintain its business position, especially as a magnet for the best students around the world.
In sum, the GWU analytics solution has helped it achieve the following business goals:
- Attract the best students
- Provide trusted, reliable data for decision makers
- Enable more timely business decisions
- Increase achievement of academic and administrative goals
- Deliver new business insight by combining social media with existing data sources
Analytics Stories: A Banking Case Study
Analytics Stories: A Financial Services Case Study
Analytics Stories: A Healthcare Case Study
Who Owns Enterprise Analytics and Data?
Competing on Analytics: A Follow Up to Thomas H. Davenport’s Post in HBR
Thomas Davenport Book “Competing On Analytics”
Solution Brief: The Intelligent Data Platform
Author Twitter: @MylesSuer
A couple of comments on the importance of integration platforms like Informatica in an EDW/Hadoop environment.
- Hadoop does mean you can do some quick and inexpensive exploratory analysis with little or no ETL. The issue is that it will not perform at the level you need to take it to production. As the webinar points out, applying some structure to the data with columnar files (not an RDBMS) will dramatically speed up query performance.
- The other thing that makes an integration platform more important than ever is the explosion of data complexity. As Dr. Kimball put it:
“Integration is even more important these days because you are looking at all sorts of data sources coming in from all sorts of directions.”
To perform interesting analyses, you are going to have to be able to join data with different formats and different semantic meaning. And that is going to require integration tools.
- Thirdly, if you are going to put this data into production, you will want to incorporate data cleansing, metadata management, and possibly formal data governance to ensure that your data is trustworthy, auditable, and has business context. There is no point in serving up bad data quickly and inexpensively. The result will be poor business decisions and flawed analyses.
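To make the join-and-cleanse points above concrete, here is a minimal sketch in plain Python. The source systems, field names, and records are all hypothetical; the point is only that data arriving in different formats with different semantics must be cleansed and standardized before it can be joined for analysis:

```python
# Minimal, hypothetical sketch: joining two sources whose fields differ
# in name and format, with a basic cleansing pass on the join key first.

def normalize_id(raw):
    """Standardize IDs so records from both systems can be joined."""
    return str(raw).strip().upper().replace("-", "")

# Source A: a registration system with one naming convention
registrations = [
    {"StudentID": "gw-1001", "Name": " Ada Lovelace "},
    {"StudentID": "GW-1002", "Name": "Alan Turing"},
]

# Source B: a billing system with different field names and ID format
payments = [
    {"student_id": "GW1001", "fees_paid": 1200.0},
    {"student_id": "GW1002", "fees_paid": 0.0},
]

# Cleansing: trim whitespace and standardize the join key
by_id = {}
for rec in registrations:
    key = normalize_id(rec["StudentID"])
    by_id[key] = {"name": rec["Name"].strip()}

# Integration: join billing data onto the cleansed registration records
for rec in payments:
    key = normalize_id(rec["student_id"])
    if key in by_id:
        by_id[key]["fees_paid"] = rec["fees_paid"]

print(by_id["GW1001"])
# → {'name': 'Ada Lovelace', 'fees_paid': 1200.0}
```

A production integration platform adds metadata management, lineage, and governance on top of this, but without the standardization step the join simply fails, which is why serving up uncleansed data quickly is no bargain.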
For Data Warehouse Architects
The challenge is to deliver actionable content from the exploding amount of data available. You will need to be constantly scanning for new sources of data and looking for ways to quickly and efficiently deliver that data to the point of analysis.
For Enterprise Architects
The challenge with adding big data to your EDW architecture is to define and drive a coherent enterprise data architecture across your organization that standardizes people, processes, and tools to deliver clean and secure data in the most efficient way possible. It will also be important to automate as much as possible to offload routine tasks from the IT staff. The key to that automation is the effective use of metadata across the entire environment to understand not only the data itself, but also how it is used, by whom, and for what business purpose. Once you have done that, it becomes possible to build intelligence into the environment.
For more on Informatica’s vision for an Intelligent Data Platform and how this fits into your enterprise data architecture, see Think “Data First” to Drive Business Value.
Last month, the CEO of Deloitte said that CFOs are “the logical choice to own analytics and put them to work to serve the organization’s needs”. In my discussions with CFOs, they have expressed similar opinions. Given this, the question becomes: what does a CFO need to do to be an effective leader of their company’s analytics agenda? To answer this, I took a look at what Tom Davenport suggests in his book “Analytics at Work”, where he argues that an analytical leader needs to do the following twelve things to be effective:
12 Ways to Be an Effective Analytics Leader
1) Develop their people skills. This is not just about managing analytical people, which has its own challenges. It is also about CFOs establishing “the credibility and trust needed when analytics produce insights that effectively debunk currently accepted wisdom”.
2) Push for fact-based decision making. You need to, as a former boss of mine liked to say, become the lightning rod and, in this case, set the expectation that people will make decisions based upon data and analysis.
3) Hire and retain smart people. You need to provide a stimulating and supportive work environment for analysts and give them credit when they do something great.
4) Be the analytical example. You need to lead by example. This means you need to use data and analysis in making your own decisions.
5) Sign up for improved results. You need to commit to driving improvements in a select group of business processes by using analytics. Pick something meaningful like reducing the cost of customer acquisition or optimizing your company’s supply chain management.
6) Teach the organization how to use analytic methods. Guide employees and other stakeholders into using more rigorous thinking and decision making.
7) Set strategies and performance expectations. Analytics and fact-based decisions cannot happen in a vacuum. They need strategies and goals that analytics help achieve.
8) Look for leverage points. Look for the business problems where analytics can make a real difference. Look for places where a small improvement in a process driven by analytics can make a big difference.
9) Demonstrate persistence. Work doggedly and persistently to apply analytics to decision making, business processes, culture, and business strategy.
10) Build an analytics ecosystem with your CIO. Build an ecosystem consisting of other business leaders, employees, external analytics suppliers, and business partners. Use them to help you institutionalize analytics at your company.
11) Apply analytics on more than one front. No single initiative will make the company more successful, and no single analytics initiative will do so either.
12) Know the limits to analytics. Know when it is appropriate to use intuition instead of analytics. As a professor of mine once said, not all elements of business strategy can be solved by using statistics or analytics. You should know where and when analytics are appropriate.
Following these twelve items will help strategically oriented CFOs lead the analytics agenda at their companies. As I indicated in “Who Owns the Analytics Agenda?”, CFOs already typically act as data validators at their firms, but taking this next step matters to their enterprises because “if we want to make better decisions and take the right actions, we have to use analytics” (Analytics at Work, Tom Davenport, Harvard Business Review Press, page 1). Given this, CFOs really need to get analytics right. The CFOs that I have talked to say they already “rely on data and analytics and they need them to be timely and accurate”.
One CFO, in fact, said that data is potentially the only competitive advantage left for his firm. And while implementing the data side of this depends on the CIO, it is clear from the CFOs I have talked to that they believe a strong business relationship with their CIO is critical to the success of their business.
So the question remains: are you ready as a financial leader to lead on the analytics agenda? If you are, and you want to learn more about setting the analytics agenda, please consider yourself invited to a webinar that I am doing with the CFO of RoseRyan in January.
CFOs Move to Chief Profitability Officer
CFOs Discuss Their Technology Priorities
The CFO Viewpoint upon Data
How CFOs can change the conversation with their CIO?
New type of CFO represents a potent CIO ally
Competing on Analytics
The Business Case for Better Data Connectivity