What does it take to be an analytics-driven business? That’s a question that requires a long answer. Recently, Gartner research director Lisa Kart took on this question, drawing a distinction between businesses that merely use analytics and those that are truly analytics-driven.
So, the secret of becoming an analytics-driven business is to bust down the silos — easier said than done, of course. The good news, as Kart tells it, is that one doesn’t need to be casting a wide net across the world in search of the right data for the right occasion. The biggest opportunities are with connecting the data you already have, she says.
Taking Kart’s differentiation of just-using-analytics versus analytics-driven culture a step further, here is a brief rundown of how businesses just using analytics approach the challenge, versus their more enlightened counterparts:
Business just using analytics: Lots of data, but no one really understands how much is around, or what to do with it.
Analytics-driven business: The enterprise has a vision and strategy, supported from the top down, closely tied to the business strategy. Management also recognizes that existing data has great value to the business.
Business just using analytics: Every department does its own thing, with varying degrees of success.
Analytics-driven business: Makes connections between all the data – of all types — floating around the organization. For example, gets a cross-channel view of a customer by digging deeper and connecting the silos together to transform the data into something consumable.
Business just using analytics: Some people in marketing have been collecting customer data and making recommendations to their managers.
Analytics-driven business: Marketing departments, through analytics, engage and interact with customers, Kart says. An example would be creating high-end, in-store customer experiences that give customers greater intimacy and interaction.
Business just using analytics: The CFO’s staff crunches numbers within their BI tools and arrives at what-if scenarios.
Analytics-driven business: Operations and finance departments share online data to improve performance using analytics. For example, a company may tap into a variety of data, including satellite images, weather patterns, and other factors that may shape business conditions, Kart says.
Business just using analytics: Some quants in the organization pore over the data and crank out reports.
Analytics-driven business: Encourages maximum opportunities for innovation by putting analytics in the hands of all employees. Analytics-driven businesses recognize that more innovation comes from front-line decision-makers than the executive suite.
Business just using analytics: Decision makers put in report requests to IT for analysis.
Analytics-driven business: Decision makers can go to an online interface that enables them to build and display reports with a click (or two).
Business just using analytics: Analytics spits out standard bar charts, perhaps a scattergram.
Analytics-driven business: Decision makers can quickly visualize insights through 3D graphics, also reflecting real-time shifts.
Despite more than $30 billion in annual spending on Big Data, successful big data implementations elude most organizations. That’s the sobering assessment of a recent Capgemini study of 226 senior executives, which found that only 13 percent feel they have truly made any headway with their big data efforts.
The reasons for Big Data’s lackluster performance include the following:
- Data is in silos or legacy systems, scattered across the enterprise
- No convincing business case
- Ineffective alignment of Big Data and analytics teams across the organization
- Most data locked up in petrified, difficult to access legacy systems
- Lack of Big Data and analytics skills
Actually, there is nothing new about any of these issues – in fact, the perceived issues with Big Data initiatives so far map closely with the failed expectations of many other technology-driven initiatives. First, there’s the hype that tends to get way ahead of any actual well-functioning case studies. Second, there’s the notion that managers can simply take a solution of impressive magnitude and drop it on top of their organizations, expecting overnight delivery of profits and enhanced competitiveness.
Technology, and Big Data itself, is but a tool that supports the vision, well-designed plans and hard work of forward-looking organizations. Those managers seeking transformative effects need to look deep inside their organizations, at how deeply innovation is allowed to flourish, and in turn, how their employees are allowed to flourish. Think about it: if line employees suddenly have access to alternative ways of doing things, would they be allowed to run with it? If someone discovers through Big Data that customers are using a product differently than intended, do they have the latitude to promote that new use? Or do they have to go through chains of approval?
Big Data may be what everybody is after, but Big Culture is the ultimate key to success.
For its part, Capgemini provides some high-level recommendations for better baking in transformative values as part of Big Data initiatives, based on their observations of best-in-class enterprises:
The vision thing: “It all starts with vision,” says Capgemini’s Ron Tolido. “If the company executive leadership does not actively, demonstrably embrace the power of technology and data as the driver of change and future performance, nothing digitally convincing will happen. We have not even found one single exception to this rule. The CIO may live and breathe Big Data and there may even be a separate Chief Data Officer appointed – expect more of these soon – if they fail to commit their board of executives to data as the engine of success, there will be a dark void beyond the proof of concept.”
Establish a well-defined organizational structure: “Big Data initiatives are rarely, if ever, division-centric,” the Capgemini report states. “They often cut across various departments in an organization. Organizations that have clear organizational structures for managing rollout can minimize the problems of having to engage multiple stakeholders.”
Adopt a systematic implementation approach: Surprisingly, even the largest and most sophisticated organizations that do everything on process don’t necessarily approach Big Data this way, the report states. “Intuitively, it would seem that a systematic and structured approach should be the way to go in large-scale implementations. However, our survey shows that this philosophy and approach are rare. Seventy-four percent of organizations did not have well-defined criteria to identify, qualify and select Big Data use-cases. Sixty-seven percent of companies did not have clearly defined KPIs to assess initiatives. The lack of a systematic approach affects success rates.”
Adopt a “venture capitalist” approach to securing buy-in and funding: “The returns from investments in emerging digital technologies such as Big Data are often highly speculative, given the lack of historical benchmarks,” the Capgemini report points out. “Consequently, in many organizations, Big Data initiatives get stuck due to the lack of a clear and attributable business case.” To address this challenge, the report urges that Big Data leaders manage investments “by using a similar approach to venture capitalists. This involves making multiple small investments in a variety of proofs of concept, allowing rapid iteration, and then identifying PoCs that have potential and discarding those that do not.”
Leverage multiple channels to secure skills and capabilities: “The Big Data talent gap is something that organizations are increasingly coming face-to-face with. Closing this gap is a larger societal challenge. However, smart organizations realize that they need to adopt a multi-pronged strategy. They not only invest more on hiring and training, but also explore unconventional channels to source talent.” Capgemini advises reaching out to partner organizations for the skills needed to develop Big Data initiatives. These can be employee exchanges, or “setting up innovation labs in high-tech hubs such as Silicon Valley.” Startups may also be another source of Big Data talent.
By now, the business benefits of effectively leveraging big data have become well known. Enhanced analytical capabilities, greater understanding of customers, and the ability to predict trends before they happen are just some of the advantages. But big data doesn’t just appear and present itself. It needs to be made tangible to the business. All too often, executives are intimidated by the concept of big data, thinking the only way to work with it is to have an advanced degree in statistics.
There are ways to make big data more than an abstract concept that can only be loved by data scientists. Four of these ways were recently covered in a report by David Stodder, director of business intelligence research for TDWI, as part of TDWI’s special report on What Works in Big Data.
The time is ripe for experimentation with real-time, interactive analytics technologies, Stodder says. The next major step in the movement toward big data is enabling real-time or near-real-time delivery of information. Real-time delivery has been a challenge for BI for years, with only limited success, Stodder says. The good news is that the Hadoop framework, originally built for batch processing, now includes interactive querying and streaming applications, he reports. This opens the way for real-time processing of big data.
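The shift Stodder describes, from batch toward near-real-time processing, often takes the form of micro-batching: handling a stream in small increments rather than one overnight job. A minimal sketch in plain Python (the event data and batch size are hypothetical; a production system would run on streaming tools in the Hadoop ecosystem he mentions):

```python
def micro_batches(events, batch_size=3):
    """Group a stream of events into small batches for near-real-time processing."""
    batch = []
    for event in events:
        batch.append(event)
        if len(batch) == batch_size:
            yield batch  # hand off a small batch as soon as it fills
            batch = []
    if batch:  # flush any trailing events at end of stream
        yield batch

# Hypothetical stream of (sensor_id, reading) tuples
stream = [("s1", 10), ("s1", 12), ("s2", 9), ("s2", 11), ("s1", 14)]
for batch in micro_batches(stream):
    avg = sum(reading for _, reading in batch) / len(batch)
    print(f"batch average: {avg:.1f}")
```

The generator yields each batch as soon as it fills, so downstream analysis can begin before the full stream has arrived — the essence of near-real-time delivery.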
Design for self-service
Interest in self-service access to analytical data continues to grow. “Increasing users’ self-reliance and reducing their dependence on IT are broadly shared goals,” Stodder says. “Nontechnical users—those not well versed in writing queries or navigating data schemas—are requesting to do more on their own.” There is an impressive array of self-service tools and platforms now appearing on the market. “Many tools automate steps for underlying data access and integration, enabling users to do more source selection and transformation on their own, including for data from Hadoop files,” he says. “In addition, new tools are hitting the market that put greater emphasis on exploratory analytics over traditional BI reporting; these are aimed at the needs of users who want to access raw big data files, perform ad-hoc requests routinely, and invoke transformations after data extraction and loading (that is, ELT) rather than before.”
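The ELT pattern Stodder describes — loading raw data first and transforming it on demand, rather than transforming before loading — can be sketched with Python’s built-in sqlite3 module (the table and column names are illustrative, not from any specific system):

```python
import sqlite3

# ELT sketch: load raw records untouched, then transform inside the database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_sales (region TEXT, amount TEXT)")

# Extract + Load: raw values go in as-is (note the amounts arrive as text)
rows = [("east", "100.5"), ("west", "200.0"), ("east", "49.5")]
conn.executemany("INSERT INTO raw_sales VALUES (?, ?)", rows)

# Transform: cast and aggregate after loading, at query time
totals = dict(conn.execute(
    "SELECT region, SUM(CAST(amount AS REAL)) FROM raw_sales GROUP BY region"
))
print(totals)
```

Because the raw data is preserved, different users can apply different transformations to the same loaded records — the self-service flexibility Stodder highlights.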
Nothing gets a point across faster than having data points visually displayed – decision-makers can draw inferences within seconds. “Data visualization has been an important component of BI and analytics for a long time, but it takes on added significance in the era of big data,” Stodder says. “As expressions of meaning, visualizations are becoming a critical way for users to collaborate on data; users can share visualizations linked to text annotations as well as other types of content, such as pictures, audio files, and maps to put together comprehensive, shared views.”
Unify views of data
Users are working with many different data types these days, and are looking to bring this information into a single view – “rather than having to move from one interface to another to view data in disparate silos,” says Stodder. Unstructured data – graphics and video files – can also provide a fuller context to reports, he adds.
The interesting thing is that many of the upstarts do not even intend to take on the market leader in the segment. Christensen cites the classic example of Digital Equipment Corporation in the 1980s, which was unable to make the transition from large, expensive enterprise systems to smaller, PC-based equipment. The PC upstarts in this case did not take on Digital directly – rather they addressed unmet needs in another part of the market.
Christensen wrote and published The Innovator’s Dilemma more than 17 years ago, but his message keeps reverberating across the business world. Recently, Jill Lepore questioned some of the thinking that has evolved around disruptive innovation in a New Yorker article. “Disruptive innovation is a theory about why businesses fail. It’s not more than that. It doesn’t explain change. It’s not a law of nature,” she writes. Christensen responded with a rebuttal to Lepore’s thesis, noting that “disruption doesn’t happen overnight,” and that “[Disruptive innovation] is not a theory about survivability.”
There is something Lepore points out that both she and Christensen can agree on: “disruption” is being oversold and misinterpreted on a wide scale these days. Every new product that rolls out is now branded as “disruptive.” As stated above, the true essence of disruption is creating new markets where the leaders would not tread.
Data itself can potentially be a source of disruption, as data analytics and information emerge as strategic business assets. While the ability to provide data analysis at real-time speeds, or make new insights possible isn’t disruption in the Christensen sense, we are seeing the rise of new business models built around data and information that could bring new leaders to the forefront. Data analytics can either play a role in supporting this movement, or data itself may be the new product or service disrupting existing markets.
We’ve already been seeing this disruption taking place within the publishing industry, for example – companies or sites providing real-time or near real-time services such as financial updates, weather forecasts and classified advertising have displaced traditional newspapers and other media as information sources.
Employing data analytics as a tool for insights never before available within an industry sector also may be part of disruptive innovation. Tesla Motors, for example, is disruptive to the automotive industry because it manufactures entirely electric cars. But the formula for its success is its employment of massive amounts of data from its array of in-vehicle devices to assure quality and efficiency.
Likewise, data-driven disruption may be occurring in places that may have been difficult to innovate. For example, it’s long been speculated that some of the digital giants, particularly Google, are poised to enter the long-staid insurance industry. If this were to happen, Google would not enter as a typical insurance company with a new web-based spin. Rather, the company would be employing new techniques of data gathering, insight and analysis to offer an entirely new model to consumers – one based on data. As Christopher Hernaes recently related in TechCrunch, Google’s ability to collect and mine data on homes, businesses and autos gives it a unique value proposition in the industry’s value chain.
We’re in an era in which Christensen’s mode of disruptive innovation has become a way of life. Increasingly, it appears that enterprises that are adept at recognizing and acting upon the strategic potential of data may be joining the ranks of the disruptors.
“What really matters about big data is what it does. Aside from how we define big data as a technological phenomenon, the wide variety of potential uses for big data analytics raises crucial questions about whether our legal, ethical, and social norms are sufficient to protect privacy and other values in a big data world.”
These crucial questions, raised in a recent White House report on the implications of big data, frame a growing debate taking place across both society and the business world on how far organizations can push the limits with data collection and analysis. The report, issued by a presidential commission tasked with assessing big data’s privacy implications, explains how big data is a double-edged sword. While big data analytics pave the way to unexpected discoveries, innovations, and advancements in our quality of life, it also has the potential for abuse as well. As the report puts it, big data’s capabilities, “most of which are not visible or available to the average consumer, also create an asymmetry of power between those who hold the data and those who intentionally or inadvertently supply it.”
The report’s authors acknowledge that big data analytics is an engine of economic growth and a competitive tool for companies across all industries, as well as a tool for quality of life. “Used well, big data analysis can boost economic productivity, drive improved consumer and government services, thwart terrorists, and save lives,” the report states. In addition, there will likely be a profound impact as data analytics gets applied to the Internet of Things, which “have made it possible to merge the industrial and information economies.” In another example, healthcare providers and payers can employ predictive analytics to detect fraud and abuse in real time.
The report’s main thrust is personal privacy implications, and many of these issues will inevitably shape the practices and policies of enterprises as they expand their businesses into the big data realm. The managers and professionals charged with identifying, collecting and analyzing information assets will increasingly be under pressure – as their organizations feel pressure – to understand the boundaries between insight, targeted engagement, and overreach.
For example, a still relatively unexplored area of big data is its ownership. Does data belong to those who collect it, or those who contribute to it? “Big data may be viewed as property, as a public resource, or as an expression of individual identity,” the report states.
Another challenge is the fact that many organizations will opt to assemble massive databases as they move forward with big data analysis. “Big data technologies can derive value from large data sets in ways that were previously impossible — indeed, big data can generate insights that researchers didn’t even think to seek.” For example, new tools and technologies provide for analysis across entire data sets, versus extracting a small representative subset of the data and extrapolating any results against a larger universe. However, with so much data, analysis may potentially be erroneous as well. “Correlation still doesn’t equal causation,” the report’s authors state. “Finding a correlation with big data techniques may not be an appropriate basis for predicting outcomes or behavior, or rendering judgments on individuals. In big data, as with all data, interpretation is always important.”
Another issue is the permanence of data – which also is a privacy issue. At the same time, this may also create headaches for corporate data managers as well. “In the past, retaining physical control over one’s personal information was often sufficient to ensure privacy,” the report states. “Documents could be destroyed, conversations forgotten, and records expunged. But in the digital world, information can be captured, copied, shared, and transferred at high fidelity and retained indefinitely. Volumes of data that were once unthinkably expensive to preserve are now easy and affordable to store on a chip the size of a grain of rice. As a consequence, data, once created, is in many cases effectively permanent. Furthermore, digital data often concerns multiple people, making personal control impractical.”
The report’s authors state that organizations need to take steps to address privacy issues, and suggest de-identification and encryption as technical solutions that are available at this time. However, in the long run, de-identification is still a weak approach to the problem. “Many technologists are of the view that de-identification of data as a means of protecting individual privacy is, at best, a limited proposition. In practice, data collected and de-identified is protected in this form by companies’ commitments to not re-identify the data and by security measures put in place to ensure those protections.”
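One common form of the de-identification the report mentions is pseudonymization: replacing a direct identifier with a keyed hash so records can still be joined without exposing the raw value. A minimal sketch (the salt and field names are hypothetical, and, as the report notes, this is at best a limited protection):

```python
import hashlib
import hmac

SECRET_SALT = b"replace-with-a-securely-stored-secret"  # illustrative only

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256).
    The same input always maps to the same pseudonym, so records
    remain linkable without revealing the original value."""
    return hmac.new(SECRET_SALT, identifier.encode(), hashlib.sha256).hexdigest()

record = {"name": "Jane Doe", "zip": "02139", "purchase": 42.50}
safe_record = {**record, "name": pseudonymize(record["name"])}
# Caveat: quasi-identifiers such as ZIP code can still enable re-identification;
# hashing one field is not, by itself, strong de-identification.
```

This illustrates why the report calls de-identification “a limited proposition”: the pseudonym protects the name, but the remaining fields may still single a person out.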
Ultimately, the best methods to ensure the ethical use of data need to come through inspired and forward-thinking management. It takes judicious management, a commitment to training and education, and a focus on what nuggets of information matter the most to the business. Big data opens up many new vistas for enterprises, and those that take the high road will reap its rewards.
There’s a reason why big data analytics are so successful at some companies, yet fall flat at others. As MIT’s Michael Schrage put it in a recent Harvard Business Review article, it all depends on how deeply the data and tools are employed in the business. “Companies with mediocre to moderate outcomes use big data and analytics for decision support,” he says. “Successful ROA—Return on Analytics—firms use them to effect and support behavior change.”
In other words, analytics really need to drill down deep into the psyche of organizations to make a difference. The more big data analytics get baked into business processes and outcomes, the more likely they are to deliver transformative results to the organization. As he puts it, “better data-driven analyses aren’t simply ‘plugged-in’ to existing processes and reviews, they’re used to invent and encourage different kinds of conversations and interactions.”
You may have heard some of these success stories in recent years – the casino and resort company that tracks customer engagements in real-time and extends targeted offers that will enrich their stay; the logistics company that knows where its trucks are, and can reroute them to speed up delivery and save fuel; the utility that can regulate customers’ energy consumption at critical moments to avoid brownouts.
Schrage’s observations come from interviews and discussions with hundreds of organizations in recent years. His conclusions point to the need to develop an “analytical culture” – in which the behaviors, practices, rituals and shared vision of the organization are based on data versus guesswork. This is not to say gut feel and passion don’t have a place in successful ventures – because they do. But having the data to back up passionate leadership is a powerful combination in today’s business climate.
Most executives instinctively understand the advantages big data can bring to their operations, especially with predictive analytics and customer analytics. The ability to employ analytics means better understanding customers and markets, as well as spotting trends as they are starting to happen, or have yet to happen. Performance analytics, predictive analytics, and prescriptive analytics all are available to decision makers.
Here are some considerations for “baking” data analytics deeper into the business:
Identify the business behaviors or processes to be changed by analytics. In his article, Schrage quotes a financial services CIO, who points out that standard BI and analytical tools often don’t go deeply enough into an organization’s psyche: “Improving compliance and financial reporting is the low-hanging fruit. But that just means we’re using analytics to do what we are already doing better.” The key is to get the business to open up and talk about what they would like to see changed as a result of analytics.
Focus on increasing analytic skills – for everyone. While many organizations go out searching for individuals who can fill data scientist roles (or something similar), there’s likely an abundance of talent and insightfulness that can be brought out from current staff, both inside and outside of IT. Business users, for example, can be trained to work with the latest front-end tools that bring data forward into compelling visualizations. IT and data professionals can sharpen their skills with emerging tools and platforms such as Hadoop and MapReduce, as well as working with analytical languages such as R.
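For staff getting started with tools like MapReduce, the programming model itself is simple enough to sketch in plain Python — a map step that emits key-value pairs and a reduce step that aggregates them (the input lines are illustrative; a real job would run distributed across a Hadoop cluster):

```python
from collections import defaultdict
from itertools import chain

def mapper(line):
    # Map step: emit a (word, 1) pair for each word in a line of text
    return [(word.lower(), 1) for word in line.split()]

def reducer(pairs):
    # Reduce step: sum the counts emitted for each key
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["big data big culture", "data drives decisions"]
word_counts = reducer(chain.from_iterable(mapper(line) for line in lines))
print(word_counts["big"], word_counts["data"])  # 2 2
```

The framework’s job is distributing the map and reduce steps across machines; the analyst’s job is only to write the two functions, which is why the model is approachable for newly trained staff.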
Schrage cites one company that recognized that a great deal of education and training was required before it could re-orient its analytics capabilities around “most profitable customers” and “most profitable products.” Even clients and partners required some level of training. The bottom line: “The company realized that these analytics shouldn’t simply be used to support existing sales and services practices but treated as an opportunity to facilitate a new kind of facilitative and consultative sales and support organization.”
Automate, and what you can’t automate, make as friendly and accessible as possible. Automated decision management can improve the quality of analytics and the analytics experience for decision makers. That’s because automating low-level decisions – such as whether to grant a credit line increase or extend a special offer to a customer – removes these more mundane tasks from decision makers’ plates. As a result, they are freed up to concentrate on higher-level, more strategic decisions. For those decisions that can’t be automated, information should be as easily accessible as possible to all levels of decision makers – through mobile apps, dashboards, and self-service portals.
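What automating such a low-level decision might look like, in a hedged sketch: explicit rules handle the routine cases, and anything ambiguous is escalated to a human decision maker (the thresholds are invented for illustration, not drawn from any real credit policy):

```python
def decide_credit_increase(score: int, utilization: float, requested: float) -> str:
    """Automate a routine credit-line decision with explicit rules.
    Cases the rules don't clearly cover are escalated to a human.
    All thresholds are hypothetical, for illustration only."""
    if score >= 720 and utilization < 0.5 and requested <= 2000:
        return "approve"   # clearly low-risk: handle automatically
    if score < 600:
        return "decline"   # clearly high-risk: handle automatically
    return "escalate"      # ambiguous: route to a decision maker

print(decide_credit_increase(750, 0.3, 1500))  # approve
print(decide_credit_increase(580, 0.9, 500))   # decline
print(decide_credit_increase(650, 0.6, 3000))  # escalate
```

The point of the pattern is the third branch: automation clears the mundane cases off decision makers’ plates while keeping humans in the loop for everything else.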
A few years back, there was a movement in some businesses to establish “data stewards” – individuals who would sit at the heart of the enterprise and make it their job to assure that data being consumed by the organization is of the highest possible quality, is secure, is contextually relevant, and is capable of interoperating across any applications that need to consume it. While the data steward concept came along when everything was relational and structured, these individuals are now earning their pay when it comes to managing the big data boom.
The rise of big data is creating more than simple headaches for data stewards; it is creating turf wars across enterprises. As pointed out in a recent article in The Wall Street Journal, there isn’t yet a lot of clarity as to who owns and cares for such data. Is it IT? Is it the lines of business? Is it legal? There are arguments that can be made for all jurisdictions.
In organizations these days, for example, marketing executives are generating, storing and analyzing large volumes of their own data within content management systems and social media analysis solutions. Many marketing departments even have their own IT budgets. Along with marketing, of course, everyone else within enterprises is seeking to pursue data analytics to better run their operations as well as foresee trends.
Typically, data has been under the domain of the CIO, the person who oversaw the collection, management and storage of information. In the Wall Street Journal article, however, it’s suggested that legal departments may be the best caretakers of big data, since big data poses a “liability exposure,” and legal departments are “better positioned to understand how to use big data without violating vendor contracts and joint-venture agreements, as well as keeping trade secrets.”
However, legal being legal, it’s likely that insightful data may end up getting locked away, never to see the light of day. Others may argue the IT department needs to retain control, but there again, IT isn’t trained to recognize information that may set the business on a new course.
Focusing on big data ownership isn’t just an academic exercise. The future of the business may depend on the ability to get on top of big data. Gartner, for one, predicts that within the next three years, at least a third of Fortune 100 organizations will experience an information crisis, “due to their inability to effectively value, govern and trust their enterprise information.”
This ability to “value, govern and trust” goes way beyond the traditional maintenance of data assets that IT has specialized in over the past few decades. As Gartner’s Andrew White put it: “Business leaders need to manage information, rather than just maintain it. When we say ‘manage,’ we mean ‘manage information for business advantage,’ as opposed to just maintaining data and its physical or virtual storage needs. In a digital economy, information is becoming the competitive asset to drive business advantage, and it is the critical connection that links the value chain of organizations.”
For starters, then, it is important that the business have full say over what data needs to be brought in, what data is important for further analysis, and what should be done with data once it gains in maturity. IT, however, needs to take a leadership role in assuring the data meets the organization’s quality standards, and that it is well-vetted so that business decision-makers can be confident in the data they are using.
The bottom line is that big data is a team effort, involving the whole enterprise. IT has a role to play, as does legal, as do the line of business.
Which comes first: innovation or analytics?
Bain & Company released some survey findings a few months back that actually put a value on big data. Companies with advanced analytic capabilities, the consultancy finds, are twice as likely to be in the top quartile of financial performance within their industries; five times as likely to make decisions much faster than market peers; three times as likely to execute decisions as intended; and twice as likely to use data very frequently when making decisions.
This is all good stuff, and the survey, which covered the input of 400 executives, makes a direct correlation between big data analytics efforts and the business’s bottom line. However, it raises a question: How does an organization become one of these analytic leaders? And there’s a more brain-twisting question to this as well: would the type of organization supporting an advanced analytics culture be more likely to be ahead of its competitors because its management tends to be more forward-thinking on a lot of fronts, and not just big data?
You just can’t throw a big data or analytics program or solution set on top of the organization (or drop in a data scientist) and expect to be dazzled with sudden clarity and insight. If an organization is dysfunctional, with a lot of silos, fiefdoms, or calcified and uninspired management, all the big data in the world isn’t going to lift its intelligence quotient.
The authors of the Bain & Company study, Travis Pearson and Rasmus Wegener, point out that “big data isn’t just one more technology initiative” – “in fact, it isn’t a technology initiative at all; it’s a business program that requires technical savvy.”
Succeeding with big data analytics requires a change in the organization’s culture, and the way it approaches problems and opportunities. The enterprise needs to be open to innovation and change. And, as Pearson and Wegener point out, “you need to embed big data deeply into your organization. It’s the only way to ensure that information and insights are shared across business units and functions. This also guarantees the entire company recognizes the synergies and scale benefits that a well-conceived analytics capability can provide.”
Pearson and Wegener also point to the following common characteristics of big data leaders they have studied:
Pick the “right angle of entry”: There are many areas of the business that can benefit from big data analytics, but just a few key areas that will really impact the business. It’s important to focus big data efforts on the right things. Pearson and Wegener say there are four areas where analytics can be relevant: “improving existing products and services, improving internal processes, building new product or service offerings, and transforming business models.”
Communicate big data ambition: Make it clear that big data analytics is a strategy that has the full commitment of management, and it’s a key part of the organization’s strategy. Messages that need to be communicated: “We will embrace big data as a new way of doing business. We will incorporate advanced analytics and insights as key elements of all critical decisions.” And, the co-authors add, “the senior team must also answer the question: To what end? How is big data going to improve our performance as a business? What will the company focus on?”
Sell and evangelize: Selling big data is a long-term process, not just one or two announcements at staff meetings. “Organizations don’t change easily and the value of analytics may not be apparent to everyone, so senior leaders may have to make the case for big data in one venue after another,” the authors caution. Big data leaders, they observe, have learned to take advantage of the tools at their disposal: they “define clear owners and sponsors for analytics initiatives. They provide incentives for analytics-driven behavior, thereby ensuring that data is incorporated into processes for making key decisions. They create targets for operational or financial improvements. They work hard to trace the causal impact of big data on the achievement of these targets.”
Find an organizational “home” for big data analysis: A common trend seen among big data leaders is that they have created an organizational home for their advanced analytics capability, “often a Center of Excellence overseen by a chief analytics officer,” according to Pearson and Wegener. This is where matters such as strategy, collection and ownership of data across business functions come into play. Organizations also need to plan how to generate insights, and prioritize opportunities and the allocation of data analysts’ and scientists’ time.
There is a hope and perception that adopting data analytics will open up new paths to innovation. But it often takes an innovative spirit to open up analytics.
Let’s face it, big data – or data in any size, format or shape – is nothing more than just a bunch of digital bits that occupy space on a disk somewhere. To be useful to the business, end-users need to be able to access it, and pull out and assemble the nuggets of information they need. Data needs to be brought to life.
That’s the theme of a webcast I recently had the opportunity to co-present with Tableau Software, titled “Making Big Data User-Centric.” In fact, there’s a lot more to it than making data user-centric – big data should be a catalyst that fires people’s imaginations, enabling them to explore new avenues that were never opened up before.
Many organizations are beginning their journey into the new big data analytics space, and are starting to discover all the possibilities it offers. But, in an era where data is now scaling into the petabyte range, it’s more than technology. It’s a disruptive force, and with disruption comes new opportunities for growth.
Here are nine ways to make this innovative disruption possible:
1. Remember that “data” is not “information.” Too many people think that data itself is a valuable commodity. However, that is like taking oil right out of the ground and trying to sell it at gas stations – it’s not usable. It needs to be processed, refined, and packaged for delivery. It needs to be unified for eventual delivery and presentation. And, finally, to give information its value, it needs to tell a story.
2. Make data sharable across the enterprise. Big data – like all types of data – tends to naturally drift into silos within departments across enterprises. For years, people have struggled to break down these silos and provide a single view of all relevant data. Now there’s a way to do it – through a unified service layer. Think of all the enterprisey things coming to the forefront in recent years – service-oriented architecture, data virtualization, search technologies. No matter how you do it, the key is to provide a way for data to be made available across enterprise walls.
3. Use analytics to push the innovation envelope. Big data analytics enables end-users to ask questions and consider options that weren’t possible within standard, relational data environments.
4. Encourage critical thinking among data users. Business users have powerful tools at their disposal, and access to data they’ve never had before. It’s more important than ever to consider where the information came from, its context, and other potential sources that are not in the enterprise’s data stream.
5. Develop analytical skills across the board. Surveys I have conducted in partnership with Unisphere Research find that barely 10% of organizations offer self-service BI on a widespread basis. This needs to change. Since everybody works with information and data, everyone needs to understand the implications of the information and data with which they are working.
6. Promote self-service. Analytic capabilities should be delivered on a self-service basis. End-users are accustomed to information being delivered to them at Google speeds, making the processes they deal with at work – requesting reports from their IT departments, setting up queries – seem downright antiquated, as well as frustrating.
7. Make it visual. Yes, graphical displays of data have been around for more than a couple of decades now. But now, there is an emerging class of front-end visualization tools that convert data points into visual displays – often stunning – that enable users to spot anomalies or trends within seconds.
8. Make it mobile. Just about everyone now carries a mobile device from which they can access data from any place. It’s now possible to offer mobile analytics capabilities ranging from key performance indicator monitoring and drill-down navigation to data selection, data filtering, and alerts.
9. Make it social. There are two ways to look at big data analytics and social media. First, there’s the social media data itself. BI and analytics efforts would be missing a big piece of the picture if they did not address the wealth of social media data flowing through organizations. This includes sentiment analysis and other applications to monitor interactions on external social media sites, to determine reactions to new products or predict customer needs. But there’s also the collaboration aspect, the ability to share insights and discoveries with peers and partners. Either way, it takes many minds working together to effectively pull information from all that data.
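The anomaly-spotting idea in point 7 doesn’t have to stay visual. As a rough illustration of what those tools do under the hood, here is a minimal sketch in plain Python that flags data points lying far from the mean; the `daily_sales` figures and the two-standard-deviation threshold are hypothetical choices for the example, not anything from a particular product:

```python
import statistics

def find_anomalies(values, threshold=2.0):
    """Flag points more than `threshold` standard deviations from the mean."""
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    return [v for v in values if abs(v - mean) > threshold * stdev]

# Hypothetical daily sales figures with one obvious outlier
daily_sales = [102, 98, 101, 97, 103, 99, 100, 250, 98, 101]
print(find_anomalies(daily_sales))  # the 250 spike is flagged
```

A visualization tool performs essentially this comparison for the eye: the outlier that a formula isolates in a list is the same one a user spots in seconds on a chart.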
In recent times, the big Internet companies – the Googles, Yahoos and eBays – have proven that it is possible to build a sustainable business on data analytics, in which corporate decisions and actions are being seamlessly guided via an analytics culture, based on data, measurement and quantifiable results. Now, two of the top data analytics thinkers say we are reaching a point that non-tech, non-Internet companies are on their way to becoming analytics-driven organizations in a similar vein, as part of an emerging data economy.
In a report written for the International Institute for Analytics, Thomas Davenport and Jill Dyché divulge the results of their interviews with 20 large organizations, in which they find big data analytics to be well integrated into the decision-making cycle. “Large organizations across industries are joining the data economy,” they observe. “They are not keeping traditional analytics and big data separate, but are combining them to form a new synthesis.”
Davenport and Dyché call this new state of management “Analytics 3.0,” in which the concept and practices of competing on analytics are no longer confined to data management and IT departments or quants – analytics is embedded into all key organizational processes. That means major, transformative effects for organizations. “There is little doubt that analytics can transform organizations, and the firms that lead the 3.0 charge will seize the most value,” they write.
Analytics 3.0 is the most recent of three distinct phases in the way data analytics has been applied to business decision making, Davenport and Dyché say. The first two “eras” looked like this:
- Analytics 1.0, prevalent between 1954 and 2009, was based on relatively small and structured data sources from internal corporate sources.
- Analytics 2.0, which arose between 2005 and 2012, saw the rise of the big Web companies – the Googles and Yahoos and eBays – which were leveraging big data stores and employing prescriptive analytics to target customers and shape offerings. This time span was also shaped by a growing interest in competing on analytics, in which data was applied to strategic business decision-making. “However, large companies often confined their analytical efforts to basic information domains like customer or product, that were highly-structured and rarely integrated with other data,” the authors write.
- In the Analytics 3.0 era, analytical efforts are being integrated with other data types, across enterprises.
This emerging environment “combines the best of 1.0 and 2.0—a blend of big data and traditional analytics that yields insights and offerings with speed and impact,” Davenport and Dyché say. The key trait of Analytics 3.0 “is that not only online firms, but virtually any type of firm in any industry, can participate in the data-driven economy. Banks, industrial manufacturers, health care providers, retailers—any company in any industry that is willing to exploit the possibilities—can all develop data-based offerings for customers, as well as supporting internal decisions with big data.”
Davenport and Dyché describe how one major trucking and transportation company has been able to implement low-cost sensors for its trucks, trailers and intermodal containers, which “monitor location, driving behaviors, fuel levels and whether a trailer/container is loaded or empty. The quality of the optimized decisions [the company] makes with the sensor data – dispatching of trucks and containers, for example – is improving substantially, and the company’s use of prescriptive analytics is changing job roles and relationships.”
New technologies and methods are helping enterprises enter the Analytics 3.0 realm, including “a variety of hardware/software architectures, including clustered parallel servers using Hadoop/MapReduce, in-memory analytics, and in-database processing,” the authors add. “All of these technologies are considerably faster than previous generations of technology for data management and analysis. Analyses that might have taken hours or days in the past can be done in seconds.”
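For readers unfamiliar with the MapReduce model the authors mention, the idea is simple even though the clustered machinery is not: a map step turns each input into key-value pairs, and a reduce step combines the pairs by key. Here is a single-process sketch of that idea in plain Python – a word count, the canonical example – not Hadoop itself, which distributes the same two phases across many servers:

```python
from collections import defaultdict
from itertools import chain

def map_phase(document):
    # Map: emit a (word, 1) pair for each word in a document
    return [(word, 1) for word in document.lower().split()]

def reduce_phase(pairs):
    # Reduce: sum the counts for each word across all mapped output
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

documents = ["big data analytics", "big data at scale"]
mapped = chain.from_iterable(map_phase(d) for d in documents)
print(reduce_phase(mapped))
# {'big': 2, 'data': 2, 'analytics': 1, 'at': 1, 'scale': 1}
```

Because each map call is independent and the reduce step only needs grouped pairs, the work parallelizes naturally – which is what makes analyses that “might have taken hours or days” finish in seconds on a cluster.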
In addition, another key characteristic of big data analytics-driven enterprises is the ability to fail fast – to deliver, with great frequency, partial outputs to project stakeholders. With the rise of new ‘agile’ analytical methods and machine learning techniques, organizations are capable of delivering “insights at a much faster rate,” and provide for “an ongoing sense of urgency.”
Perhaps most importantly, big data and analytics are integrated and embedded into corporate processes across the board. “Models in Analytics 3.0 are often being embedded into operational and decision processes, dramatically increasing their speed and impact,” Davenport and Dyché state. “Some are embedded into fully automated systems based on scoring algorithms or analytics-based rules. Some are built into consumer-oriented products and features. In any case, embedding the analytics into systems and processes not only means greater speed, but also makes it more difficult for decision-makers to avoid using analytics—usually a good thing.”
The report is available here.