Tag Archives: Analytics
As I indicated in my last case study regarding competing on analytics, Thomas H. Davenport believes “business processes are among the last remaining points of differentiation.” For this reason, Davenport contends that businesses that create a sustainable right to win use analytics to “wring every last drop of value from their processes.” For financial services, the mission-critical areas needing process improvement center around improving the consistency of decision making and making the management of regulation and compliance more efficient and effective.
Why does Fannie Mae need to compete on analytics?
Fannie Mae is in the business of enabling people to buy, refinance, or rent homes. As a part of this, Fannie Mae says it is all about keeping people in their homes and getting people into new homes. Foundational to this mission is the accurate collection and reporting of data for decision making and risk management. According to Tracy Stephan at Fannie Mae, their “business needs to have the data to make decisions in a more real time basis. Today, this is all about getting the right data to the right people at the right time”.
Fannie Mae claims that when the mortgage crisis hit, many of the big banks stopped lending, which meant that Fannie Mae, among others, needed to pick up the slack. This, however, caused the Federal Government to require it to report monthly and quarterly against goals that the Federal Government set for it. “This meant that there was not room for error in how data gets reported.” In the end, Fannie Mae says three business imperatives drove its need to improve its reporting and its business processes:
- To ensure that go-forward business decisions were made consistently using the most accurate business data available
- To avoid penalties by adhering to Dodd-Frank and other regulatory requirements established for it after the 2008 Global Financial Crisis
- To comply with reporting to the Federal Reserve and Wall Street regarding overall business risk as a function of data quality and accuracy, the credit-worthiness of loans, and the risk levels of investment positions.
Delivering on these imperatives required Fannie Mae to change how it managed data
Given these business imperatives, IT leadership quickly realized it needed to enable the business to use data to drive better business processes from one end of the organization to the other. This meant enabling Fannie Mae’s business operations teams to manage data more effectively and efficiently. Fannie Mae concluded that it needed a single source of truth, whether for mortgage applications or for passing information securely to investors. This, in turn, required the ability to share the same data across every Fannie Mae repository.
But there was a problem. Fannie Mae needed clean and correct data collected and integrated from more than 100 data sources, and it determined that its current data processes could not scale to do so. It also determined that those processes would not allow it to meet its compliance reporting requirements. At the same time, Fannie Mae needed to manage compliance more proactively. This required knowing how critical business data enters and flows through each of its systems, including how data is changed by multiple internal processing and reporting applications. Fannie Mae leadership also felt this was critical to ensuring traceability to the individual user.
In discussions with its business customers, Fannie Mae’s IT leadership determined that it needed real-time, trustworthy data to improve its business operations, business processes, and decision making. As noted, these requirements could not be met with its historical approaches to integrating and managing data.
Fannie Mae determined that it needed a platform that was highly available and scalable and that largely automated data quality management. At the same time, the platform needed to support a set of business glossaries with clear data lineage. In effect, Fannie Mae needed a single source of truth across all of its business systems. According to Tracy Stephan, IT Director, Fannie Mae, “Data quality is the key to the success of Fannie Mae’s mission of getting the right people into the right homes. Now all our systems look at the same data – that one source of truth – which gives us great comfort.” To learn more about how Fannie Mae improved its business processes and demonstrated that it is truly “data driven,” please click on this video of its IT leadership.
Solution Brief: The Intelligent Data Platform
Thomas Davenport Book “Competing On Analytics”
Competing on Analytics
The Business Case for Better Data Connectivity
The CFO Viewpoint upon Data
What an enlightened healthcare CEO should tell their CIO?
The future of lighting may first be peeking through at Newark Liberty Airport in New Jersey. The airport has installed 171 new LED-based light fixtures that include a variety of sensors to detect and record what’s going on in the airport, as reported by Diane Cardwell in The New York Times. Together, the fixtures make a network of devices that communicates wirelessly, allowing authorities to scan the license plates of passing cars, watch for lines and delays, and check travelers for suspicious activity.
I get the feeling that Newark’s new gear will not be the last of lighting-based digital networks. Over the last few years, LED street lights have gone from something cities would love to have to the sector standard. That the market has shifted so swiftly is thanks to the efforts of early movers such as the City of Los Angeles, which last year completed the world’s largest LED street light replacement project, with LED fixtures installed on 150,000 streetlights.
Los Angeles is certainly not alone in making the switch to LED street lighting. In March 2013, Las Vegas outfitted 50,000 streetlights with LED fixtures. One month later, Austin, TX announced plans to install 35,000 LED street lights. Not to be outdone, New York City is planning to go all-LED by 2017, which would save $14 million and many tons of carbon emissions each year.
The impending switch to LEDs is an excellent opportunity for LED light fixture makers and Big Data software vendors like Informatica. These fixtures are made with a wide variety of sensors that can be tailored to whatever the user wants to detect, including temperature, humidity, seismic activity, radiation, audio, and video, among other things. The sensors could even detect and triangulate the source of a gunshot.
The steady stream of real-time data collected from these fixtures can be transformed into torrents of small messages and events with unprecedented agility using Informatica Vibe Data Stream. Analyzed data can then be distributed to various governmental and non-governmental agencies, such as law enforcement, environmental monitors, and retailers.
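The pattern here – many small sensor readings fanned out as discrete event messages to whichever consumers have registered an interest – can be illustrated with a minimal sketch. To be clear, this is not the Vibe Data Stream API; the fixture IDs, sensor types, and subscription table below are my own hypothetical examples:

```python
import json
import time

def make_event(fixture_id, sensor_type, value, ts=None):
    """Package one sensor reading as a small, self-describing event message."""
    return json.dumps({
        "fixture": fixture_id,
        "sensor": sensor_type,
        "value": value,
        "ts": ts if ts is not None else time.time(),
    })

# Hypothetical subscription table: each agency registers the sensor types
# it cares about, and events fan out accordingly.
SUBSCRIPTIONS = {
    "law_enforcement": {"audio", "video", "license_plate"},
    "environmental_monitors": {"temperature", "humidity", "radiation"},
}

def route(event_json):
    """Return the list of agencies that should receive this event."""
    event = json.loads(event_json)
    return [agency for agency, kinds in SUBSCRIPTIONS.items()
            if event["sensor"] in kinds]

print(route(make_event("fixture-042", "temperature", 21.5)))
# ['environmental_monitors']
```

A real deployment would put a message broker between `make_event` and `route`, but the shape of the problem – compact events, interest-based routing – stays the same.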
If I were to guess the number of streetlights in the world, I would say 4 billion. Upgrading them is a “once-in-a-generation opportunity” to harness enormous amounts of sensory big data.
Come and get it. For developers hungry to get their hands on Informatica on Hadoop, a downloadable free trial of Informatica Big Data Edition was launched today on the Informatica Marketplace. See for yourself the power of the killer app on Hadoop from the leader in data integration and quality.
Thanks to the generous help of our partners, the Informatica Big Data team has preinstalled the Big Data Edition inside the sandbox VMs of the two leading Hadoop distributions. This empowers Hadoop and Informatica developers to easily try the codeless, GUI driven Big Data Edition to build and execute ETL and data integration pipelines natively on Hadoop for Big Data analytics.
Informatica Big Data Edition is the most complete and powerful suite for Hadoop data pipelines and can increase productivity up to 5 times. Developers can leverage hundreds of out-of-the-box Informatica pre-built transforms and connectors for structured and unstructured data processing on Hadoop. With the Informatica Vibe Virtual Data Machine running directly on each node of the Hadoop cluster, the Big Data Edition can profile, parse, transform and cleanse data at any scale to prepare data for data science, business intelligence and operational analytics.
The Informatica Big Data Edition Trial Sandbox VMs come with a 60-day trial version of the Big Data Edition preinstalled inside a 1-node Hadoop cluster. The trials include sample data and mappings as well as getting-started documentation and videos. You can try your own data with the trials, but processing is limited to the 1-node Hadoop cluster and the machine it runs on. Any mappings you develop in the trial can easily be moved to a production Hadoop cluster running the Big Data Edition. The Informatica Big Data Edition also supports the MapR and Pivotal Hadoop distributions; however, the trial is currently only available for Cloudera and Hortonworks.
Accelerate your ability to bring Hadoop from the sandbox into production by leveraging Informatica’s Big Data Edition. Informatica’s visual development approach means that more than one hundred thousand existing Informatica developers are now Hadoop developers without having to learn Hadoop or new hand coding techniques and languages. Informatica can help organizations easily integrate Hadoop into their enterprise data infrastructure and bring the PowerCenter data pipeline mappings running on traditional servers onto Hadoop clusters with minimal modification. Informatica Big Data Edition reduces the risk of Hadoop projects and increases agility by enabling more of your organization to interact with the data in your Hadoop cluster.
To get the Informatica Big Data Edition Trial Sandbox VMs and more information, please visit the Informatica Marketplace.
In my last blog, I promised to report back on my experience using Informatica Data Quality, a software tool that helps automate the hectic, tedious data plumbing tasks that routinely consume more than 80% of an analyst’s time. Today, I am happy to share what I’ve learned in the past couple of months.
But first, let me confess something. The reason it took me so long to get here was that I dreaded trying the software. Never a savvy computer programmer, I was convinced that I would not be technical enough to master the tool and that it would turn into a lengthy learning experience. That mental barrier dragged me down for a couple of months before I finally bit the bullet and got my hands on the software. I am happy to report that my fear was truly unnecessary. It took me half a day to get a good handle on most features of the Analyst Tool, a component of Data Quality designed for analysts and business users. I then spent three days figuring out how to maneuver the Developer Tool, another key piece of the Data Quality offering used mostly by – you guessed it – developers and technical users. I have to admit that I am no master of the Developer Tool after three days of wrestling with it, but I got the basics. More importantly, my hands-on interaction with the software helped me understand the logic behind the overall design, and see for myself how analysts and business users can easily collaborate with their IT counterparts within our Data Quality environment.
To break it all down, first comes profiling. As analysts, we understand all too well the importance of profiling: it provides an anatomy of the raw data we collect. In many cases, it is a must-have first step in data preparation (especially when the raw data comes from different places and in different formats). A heavy user of Excel, I used to rely on all the tricks available in the spreadsheet to gain visibility into my data. I would filter, sort, build pivot tables, and make charts to learn what was in my raw data. Depending on how many columns were in my data set, it could take hours, sometimes days, just to figure out whether the data I received was any good at all, and how good it was.
Switching to the Analyst Tool in Data Quality, understanding my raw data becomes a task of a few clicks – six at most if I am picky about how I want it done. Basically, I load my data, click on a couple of options, and let the software do the rest. A few seconds later I can visualize the statistics of the data fields I choose to examine, and I can measure the quality of the raw data using the Scorecard feature. No more fiddling with spreadsheets and staring at busy rows and columns. Take a look at the above screenshots and let me know your preference.
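For readers curious what that kind of column-level profiling and scorecarding amounts to conceptually, here is a minimal, tool-agnostic sketch in plain Python. The sample records and the five-digit-ZIP validity rule are my own invented examples, not Informatica output:

```python
import re

# Invented sample records -- think of each as a row from one of many sources.
rows = [
    {"zip": "94105", "loan": 250000},
    {"zip": "94105", "loan": 310000},
    {"zip": None,    "loan": 275000},
    {"zip": "1060",  "loan": -50},
    {"zip": "94107", "loan": 420000},
]

def profile(rows, column):
    """Column anatomy: null count, distinct values, min and max."""
    values = [r[column] for r in rows]
    present = [v for v in values if v is not None]
    return {
        "nulls": len(values) - len(present),
        "distinct": len(set(present)),
        "min": min(present),
        "max": max(present),
    }

def scorecard(rows, column, rule):
    """Scorecard-style metric: the share of rows passing a validity rule."""
    return sum(1 for r in rows if rule(r[column])) / len(rows)

print(profile(rows, "zip"))
# {'nulls': 1, 'distinct': 3, 'min': '1060', 'max': '94107'}

zip_ok = lambda v: v is not None and re.fullmatch(r"\d{5}", v) is not None
print(f"zip validity: {scorecard(rows, 'zip', zip_ok):.0%}")
# zip validity: 60%
```

The point of a profiling tool is that it computes this anatomy for every column automatically, instead of you filtering and pivoting your way to it by hand.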
Once I decide after profiling that my raw data is adequate, I still need to clean up the nonsense in it before performing any analysis; otherwise, bad things can happen – we call it garbage in, garbage out. Again, to clean and standardize my data, Excel came to the rescue in the past. I would play with different functions and learn new ones, write macros, or simply do it by hand. It was tedious, but it worked as long as I was working on a static data set. The problem, however, was that when I needed to incorporate new data sources in a different format, many of the previously built formulas would break and become inapplicable, and I would have to start all over again. Spreadsheet tricks simply don’t scale in those situations.
With the Data Quality Analyst Tool, I can use the Rule Builder to create a set of logical rules in a hierarchical manner based on my objectives, and test those rules to see immediate results. The nice thing is that those rules are not tied to data format, location, or size, so I can reuse them when new data comes in. Profiling can be done at any time, so I can re-examine my data after applying the rules, as many times as I like. Once I am satisfied with the rules, they are passed on to my peers in IT, who create executable rules based on the logic I built and run them automatically in production. No more worrying about differences in format, volume, or other discrepancies in the data sets; all of that complexity is taken care of by the software. All I need to do is build meaningful rules that transform the data into the appropriate condition, so I have good-quality data to work with for my analysis. The best part? I can do all of the above without hassling IT – feeling empowered is awesome!
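To make the idea of reusable, format-independent rules concrete, here is a small sketch in Python. The rule names and the state-standardization example are hypothetical, not Rule Builder output; the point is only that each rule is a small function that can be re-applied unchanged whenever a new batch of records arrives:

```python
# A hypothetical rule set illustrating reusable, format-independent cleansing:
# each rule is a small function, chained per field, so the same logic can be
# re-run on any new batch of records regardless of where the data came from.

def trim(value):
    """Strip stray whitespace from string values; pass others through."""
    return value.strip() if isinstance(value, str) else value

def standardize_state(value):
    """Map common spellings of a US state to its two-letter code."""
    aliases = {"calif.": "CA", "california": "CA", "ca": "CA"}
    if isinstance(value, str):
        key = value.strip().lower()
        return aliases.get(key, value.strip().upper())
    return value

RULES = {"state": [trim, standardize_state]}

def apply_rules(record, rules=RULES):
    """Run each field's rule chain in order; fields without rules pass untouched."""
    cleaned = dict(record)
    for field, steps in rules.items():
        if field in cleaned:
            for step in steps:
                cleaned[field] = step(cleaned[field])
    return cleaned

print(apply_rules({"name": "Ann", "state": " california "}))
# {'name': 'Ann', 'state': 'CA'}
```

Because the rules key off field names rather than spreadsheet cell positions, a new file with extra columns or a different layout doesn’t break them – which is exactly the failure mode of the Excel-formula approach described above.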
Using the right tool for the right job improves our results, saves us time, and makes our jobs much more enjoyable. For me, there will be no more Excel for data cleansing after trying our Data Quality software, because now I can get more done in less time, and I am no longer stressed out by the lengthy process.
I encourage my analyst friends to try Informatica Data Quality, or at least the Analyst Tool in it. If you are like me, wary of a steep learning curve, fear no more. Besides, if Data Quality can cut your data cleansing time in half (mind you, our customers have reported higher numbers), think how many more predictive models you could build, how much more you would learn, and how much faster you could build your reports in Tableau, with more confidence.
I recently had the opportunity to have a very interesting discussion with Glenn Gow, the CEO of Crimson Marketing. I was impressed at what an interesting and smart guy he was, and with the tremendous insight he has into the marketing discipline. He consults with over 150 CMOs every year, and has a pretty solid understanding about the pains they are facing, the opportunities in front of them, and the approaches that the best-of-the-best are taking that are leading them towards new levels of success.
I asked Glenn if he would be willing to do a Q&A in order to share some of his insight. I hope you find his perspective as interesting as I did!
Q: What do you believe is the single biggest advantage that marketers have today?
A: Being able to use data in marketing is absolutely your single biggest competitive advantage as a marketer. And therefore your biggest challenge is capturing, leveraging and rationalizing that data. The marketers we speak with tend to fall into two buckets.
- Those who understand that the way they manage data is critical to their marketing success. These marketers use data to inform their decisions, and then rely on it to measure their effectiveness.
- Those who haven’t yet discovered that data is the key to their success. Often these people start with systems in mind – marketing automation, CRM, etc. But after implementing and beginning to use these systems, they almost always come to the realization that they have a data problem.
Q: How has this world of unprecedented data sources and volumes changed the marketing discipline?
A: In short… dramatically. The shift has really happened in the last two years. The big impetus for this change has really been the availability of data. You’ve probably heard this figure, but Google’s Eric Schmidt likes to say that every two days now, we create as much information as we did from the dawn of civilization until 2003.
We believe this is a massive opportunity for marketers. The question is, how do we leverage this data? How do we pull out the golden nuggets that will help us do our jobs better? Marketers now have access to information they’ve never had access to, or even contemplated, before. This gives them the ability to become more effective marketers. And by the way… they have to! Customers expect them to!
Take ad re-targeting, for example. Customers expect to be shown ads that are relevant to them, and if marketers don’t do this successfully, they can actually damage their brand.
In addition, competitors are taking full advantage of data, and are getting better every day at winning the hearts and minds of their customers – so marketers need to act before their competitors do.
Marketers have a tremendous opportunity – rich data is available, and the technology to harness it is now available too, so they can win a war they never could before.
Q: Where are the barriers they are up against in harnessing this data?
A: I’d say that barriers can really be broken down into 4 main buckets: existing architecture, skill sets, relationships, and governance.
- Existing Architecture: The way that data has historically been collected and stored wasn’t designed with the CMO’s needs in mind. The CMO has an abundance of data theoretically at their fingertips, but they cannot do what they want with it. The CMO needs to insist on, and work together with the CIO to build, an overarching data strategy that meets their needs – both today and tomorrow, because the marketing profession and its tool sets are rapidly changing. That means the CMO and their team need to step into a conversation they’ve never had before with the CIO and his/her team. And it’s not about systems integration; it’s about data integration.
- Existing Skill Sets: The average marketer today is a right-brained individual. They entered the profession because they are naturally gifted at branding, communications, and outbound perspectives. And that requirement doesn’t go away – it’s still important. But today’s marketer now needs to grow their left-brained skills, so they can take advantage of inbound information, marketing technologies, data, etc. It’s hard to ask a right-brained person to suddenly be effective at managing this data. The CMO needs to fill this skillset gap primarily by bringing in people that understand it, but they cannot ignore it themselves. The CMO needs to understand how to manage a team of data scientists and operations people to dig through and analyze this data. Some CMOs have actually learned to love data analysis themselves (in fact your CMO at Informatica Marge Breya is one of them).
- Existing Relationships: In a data-driven marketing world, the relationship with the CIO becomes paramount. The CIO has historically determined what data is collected, where it is stored, what it is connected to, and how it is managed. Today’s CMO isn’t just going to the CIO with a simple task, like asking them to build a new dashboard. They have to work together to build a data strategy that will work for the organization as a whole. And marketing is the “new kid on the block” in this discussion – the CIO has been working with finance, manufacturing, etc. for years, so it takes some time (and great data points!) to build that kind of cohesive relationship. But most CIOs understand that it’s important, if for no other reason than that they see budgets increasingly shifting to marketing and the rest of the lines of business.
- Governance: Who is ultimately responsible for the data that lives within an organization? It’s not an easy question to answer. And since marketing is a relatively new entrant into the data discussion, there are often a lot of questions left to answer. If marketing wants access to the customer data, what are we going to let them do with it? Read it? Append to it? How quickly does this happen? Who needs to author or approve changes to a data flow? Who manages opt ins/outs and regulatory black lists? And how does that impact our responsibility as an organization? This is a new set of conversations for the CMO – but they’re absolutely critical.
Q: Are the CMOs you speak with concerned with measuring marketing success?
A: Absolutely. CMOs are feeling tremendous pressure from the CEO to quantify their results. There was a recent Duke University study of CMOs that asked if they were feeling pressure from the CEO or board to justify what they’re doing. 64% of the respondents said that they do feel this pressure, and 63% say this pressure is increasing.
CMOs cannot ignore this. They need to have access to the right data that they can trust to track the effectiveness of their organizations. They need to quantitatively demonstrate the impact that their activities have had on corporate revenue – not just ROI or Marketing Qualified Leads. They need to track data points all the way through the sales cycle to close and revenue, and to show their actual impact on what the CEO really cares about.
Q: Do you think marketers who undertake marketing automation products without a solid handle on their data first are getting solid results?
A: That is a tricky one. Ideally, yes, they’d have their data in great shape before undertaking a marketing automation process. The vast majority of companies who have implemented the various marketing technology tools have encountered dramatic data quality issues, often coming to light during the process of implementing their systems. So data quality and data integration is the ideal first step.
But the truth is, solving a company’s data problem isn’t a simple, straight-forward challenge. It takes time and it’s not always obvious how to solve the problem. Marketers need to be part of this conversation. They need to drive how they’re going to be managing data moving forward. And they need to involve people who understand data well, whether they be internal (typically in IT), or external (consulting companies like Crimson, and technology providers like Informatica).
So the reality for a CMO, is that it has to be a parallel path. CMOs need to get involved in ensuring that data is managed in a way they can use effectively as a marketer, but in the meantime, they cannot stop doing their day-to-day job. So, sure, they may not be getting the most out of their investment in marketing automation, but it’s the beginning of a process that will see tremendous returns over the long term.
Q: Is anybody really getting it “right” yet?
A: This is the best part… yes! We are starting to see more and more forward-thinking organizations really harnessing their data for competitive advantage, and using technology in very smart ways to tie it all together and make sense of it. In fact, we are in the process of writing a book entitled “Moneyball for Marketing” that features eleven different companies who have marketing strategies and execution plans that we feel are leading their industries.
So readers, what do you think? Who do you think is getting it “right” by leveraging their data with smart technology and getting truly meaningful and impactful results?
Last week I had the opportunity to attend the Gartner Security and Risk Management Summit. At this event, Gartner analysts and security industry experts meet to discuss the latest trends, advances, best practices and research in the space. At the event, I had the privilege of connecting with customers, peers and partners. I was also excited to learn about changes that are shaping the data security landscape.
Here are some of the things I learned at the event:
- Security continues to be a top CIO priority in 2014. Security is well-aligned with other trends such as big data, IoT, mobile, cloud, and collaboration. According to Gartner, the top CIO priority area is BI/analytics. Given our growing appetite for all things data and our increasing ability to mine data to increase top-line growth, this top billing makes perfect sense. The challenge is to protect the data assets that drive value for the company and ensure appropriate privacy controls.
- Mobile and data security are the top focus for 2014 spending in North America according to Gartner’s pre-conference survey. Cloud rounds out the list when considering worldwide spending results.
- Rise of the DRO (Digital Risk Officer). Fortunately, those same market trends are leading to an evolution of the CISO role to a Digital Security Officer and, longer term, a Digital Risk Officer. The DRO role will include determination of the risks and security of digital connectivity. Digital/Information Security risk is increasingly being reported as a business impact to the board.
- Information management and information security are blending. Gartner assumes that 40% of global enterprises will have aligned governance of the two programs by 2017. This is not surprising given the overlap of common objectives such as inventories, classification, usage policies, and accountability/protection.
- Security methodology is moving from a reactive, compliance-driven approach to proactive (risk-based) methodologies. There is simply too much data and too many events for analysts to monitor. Organizations need to understand their assets and their criticality. Big data analytics and context-aware security are then needed to reduce the noise and false-positive rates to a manageable level. According to Gartner analyst Avivah Litan, “By 2018, of all breaches that are detected within an enterprise, 70% will be found because they used context-aware security, up from 10% today.”
I want to close by sharing the identified Top Digital Security Trends for 2014:
- Software-defined security
- Big data security analytics
- Intelligent/Context-aware security controls
- Application isolation
- Endpoint threat detection and response
- Website protection
- Adaptive access
- Securing the Internet of Things
In my last blog, I talked about my dreadful experience, a few years back as an analyst, of cleaning raw data by hand. Well, the truth is, I was not alone. At a recent data mining Meetup event in the San Francisco Bay Area, I asked a few analysts, “How much time do you spend on cleaning your data at work?” “More than 80% of my time” and “most of my days,” said the analysts, and “it is not fun.”
But check this out: there are over a dozen Meetup groups focused on data science and data mining here in the Bay Area, where I live. Those groups put on events multiple times a month, with topics often centered on hot, emerging technologies such as machine learning, graph analysis, real-time analytics, new algorithms for analyzing social media data, and of course, anything Big Data. Cool BI tools and new programming models and algorithms for better analysis are a big draw for data practitioners these days.
That got me thinking… if what the analysts said to me is true – that they spend 80% of their time prepping data and only the remaining 20% analyzing it and visualizing the results (which, BTW, “is actually fun,” quoting one data analyst) – then why are they drawn to events focused on tools that can only help them 20% of the time? Why wouldn’t they want to explore technologies that can help address the dreadful 80% – the data scrubbing they complain about?
Having been there myself, I thought perhaps a little self-reflection would help answer the question.
As a student of math, I love data and am fascinated by the stories I can discover in it. My two-year math program in graduate school was primarily focused on learning how to build fabulous math models to simulate real events, and to use those formulas to predict the future or look for meaningful patterns.
I used BI and statistical analysis tools while at school, and continued to use them at work after I graduated. That software was great in that it helped me get to results and see what was in my data, so I could develop conclusions and make recommendations based on those insights for my clients. Without BI and visualization tools, I would not have delivered any results.
That was the fun and glamorous part of my job as an analyst. But when I was not creating nice charts and presentations to tell the stories in my data, I was spending time – a great amount of time, sometimes up to the wee hours – cleaning and verifying my data. I was convinced that was part of my job and I just had to suck it up.
It was only a few months ago that I stumbled upon data quality software – it happened when I joined Informatica. At first I thought they were talking to the wrong person when they started pitching me data quality solutions.
It turns out that the concept of data quality automation is highly relevant and extremely intuitive to me, and to anyone who deals with data on a regular basis. Data quality software offers an automated process for data cleansing that is much faster and delivers more accurate results than a manual process. To put that in math context: if a data quality tool can reduce the data cleansing effort from 80% to 40% of an analyst’s time (BTW, this is hardly a random number – some of our customers have reported much better results), analysts can free up 40% of their time and use it to do the things they like: playing with data in BI tools, building new models, running more scenarios, producing different views of the data, and discovering things they could not before – all with clean, trusted data. No more bored-to-death experience. What they are left with is improved productivity, more accurate and consistent results, compelling stories about their data, and, most important, the ability to focus on the things they like! Not too shabby, right?
I am excited about trying out the data quality tools we have here at Informatica. My fellow analysts, you should start looking into them too. I will check back in soon with more stories to share.
There’s a reason why big data analytics are so successful at some companies, yet fall flat at others. As MIT’s Michael Schrage put it in a recent Harvard Business Review article, it all depends on how deeply the data and tools are employed in the business. “Companies with mediocre to moderate outcomes use big data and analytics for decision support,” he says. “Successful ROA—Return on Analytics—firms use them to effect and support behavior change.”
In other words, analytics really need to drill down deep into the psyche of organizations to make a difference. The more big data analytics get baked into business processes and outcomes, the more likely they are to deliver transformative results to the organization. As he puts it, “better data-driven analyses aren’t simply ‘plugged-in’ to existing processes and reviews, they’re used to invent and encourage different kinds of conversations and interactions.”
You may have heard some of these success stories in recent years – the casino and resort company that tracks customer engagements in real-time and extends targeted offers that will enrich their stay; the logistics company that knows where its trucks are, and can reroute them to speed up delivery and save fuel; the utility that can regulate customers’ energy consumption at critical moments to avoid brownouts.
Schrage’s observations come from interviews and discussions with hundreds of organizations in recent years. His conclusions point to the need to develop an “analytical culture” – one in which the behaviors, practices, rituals and shared vision of the organization are based on data rather than guesswork. This is not to say gut feel and passion don’t have a place in successful ventures – they do. But having the data to back up passionate leadership is a powerful combination in today’s business climate.
Most executives instinctively understand the advantages big data can bring to their operations, especially with predictive analytics and customer analytics. The ability to employ analytics means better understanding customers and markets, as well as spotting trends as they are starting to happen, or have yet to happen. Performance analytics, predictive analytics, and prescriptive analytics all are available to decision makers.
Here are some considerations for “baking” data analytics deeper into the business:
Identify the business behaviors or processes to be changed by analytics. In his article, Schrage quotes a financial services CIO, who points out that standard BI and analytical tools often don’t go deeply enough into an organization’s psyche: “Improving compliance and financial reporting is the low-hanging fruit. But that just means we’re using analytics to do what we are already doing better.” The key is to get the business to open up and talk about what they would like to see changed as a result of analytics.
Focus on increasing analytic skills – for everyone. While many organizations go out searching for individuals who can fill data scientist roles (or something similar), there’s likely an abundance of talent and insight that can be brought out from current staff, both inside and outside of IT. Business users, for example, can be trained to work with the latest front-end tools that bring data forward into compelling visualizations. IT and data professionals can sharpen their skills with emerging tools and platforms such as Hadoop and MapReduce, as well as analytical languages such as R.
Schrage cites one company that recognized that a great deal of education and training was required before it could re-orient its analytics capabilities around “most profitable customers” and “most profitable products.” Even clients and partners required some level of training. The bottom line: “The company realized that these analytics shouldn’t simply be used to support existing sales and services practices but treated as an opportunity to facilitate a new kind of facilitative and consultative sales and support organization.”
Automate, and what you can’t automate, make as friendly and accessible as possible. Automated decision management can improve the quality of analytics and the analytics experience for decision makers. That’s because automating low-level decisions – such as whether to grant a credit line increase or extend a special offer to a customer – removes these more mundane tasks from decision makers’ plates. As a result, they are freed up to concentrate on higher-level, more strategic decisions. For those decisions that can’t be automated, information should be as easily accessible as possible to all levels of decision makers – through mobile apps, dashboards, and self-service portals.
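To make the idea concrete, here is a minimal rule-based sketch of automating a low-level decision like the credit line increase mentioned above. The thresholds, field names, and the escalation path are all invented for illustration; real decision management systems encode far richer policies:

```python
def credit_increase_decision(score, utilization, months_on_book):
    """Toy rule-based decision: approve or decline clear-cut
    credit line increase requests automatically, and refer only
    the ambiguous cases to a human decision maker.
    All thresholds are illustrative, not real lending policy."""
    if score >= 720 and utilization < 0.5 and months_on_book >= 12:
        return "approve"           # clear-cut: no human needed
    if score < 600:
        return "decline"           # clear-cut: no human needed
    return "refer_to_analyst"      # ambiguous: escalate

print(credit_increase_decision(750, 0.30, 24))  # → approve
print(credit_increase_decision(650, 0.60, 6))   # → refer_to_analyst
```

The point of the pattern is the third branch: automation handles the mundane extremes, while decision makers see only the cases that genuinely need judgment.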
Which comes first: innovation or analytics?
Bain & Company released some survey findings a few months back that actually put a value on big data. Companies with advanced analytic capabilities, the consultancy finds, are twice as likely to be in the top quartile of financial performance within their industries; five times as likely to make decisions much faster than market peers; three times as likely to execute decisions as intended; and twice as likely to use data very frequently when making decisions.
This is all good stuff, and the survey, which covered the input of 400 executives, draws a direct correlation between big data analytics efforts and the business’s bottom line. However, it raises a question: How does an organization become one of these analytic leaders? And there’s a more brain-twisting question as well: would the type of organization supporting an advanced analytics culture be more likely to be ahead of its competitors because its management tends to be more forward-thinking on a lot of fronts, and not just big data?
You just can’t throw a big data or analytics program or solution set on top of the organization (or drop in a data scientist) and expect to be dazzled with sudden clarity and insight. If an organization is dysfunctional, with a lot of silos, fiefdoms, or calcified and uninspired management, all the big data in the world isn’t going to lift its intelligence quotient.
The authors of the Bain & Company study, Travis Pearson and Rasmus Wegener, point out that “big data isn’t just one more technology initiative” – “in fact, it isn’t a technology initiative at all; it’s a business program that requires technical savvy.”
Succeeding with big data analytics requires a change in the organization’s culture, and the way it approaches problems and opportunities. The enterprise needs to be open to innovation and change. And, as Pearson and Wegener point out, “you need to embed big data deeply into your organization. It’s the only way to ensure that information and insights are shared across business units and functions. This also guarantees the entire company recognizes the synergies and scale benefits that a well-conceived analytics capability can provide.”
Pearson and Wegener also point to the following common characteristics of big data leaders they have studied:
Pick the “right angle of entry”: There are many areas of the business that can benefit from big data analytics, but just a few key areas that will really impact the business. It’s important to focus big data efforts on the right things. Pearson and Wegener say there are four areas where analytics can be relevant: “improving existing products and services, improving internal processes, building new product or service offerings, and transforming business models.”
Communicate big data ambition: Make it clear that big data analytics is a strategy that has the full commitment of management, and it’s a key part of the organization’s strategy. Messages that need to be communicated: “We will embrace big data as a new way of doing business. We will incorporate advanced analytics and insights as key elements of all critical decisions.” And, the co-authors add, “the senior team must also answer the question: To what end? How is big data going to improve our performance as a business? What will the company focus on?”
Sell and evangelize: Selling big data is a long-term process, not just one or two announcements at staff meetings. “Organizations don’t change easily and the value of analytics may not be apparent to everyone, so senior leaders may have to make the case for big data in one venue after another,” the authors caution. Big data leaders, they observe, have learned to take advantage of the tools at their disposal: they “define clear owners and sponsors for analytics initiatives. They provide incentives for analytics-driven behavior, thereby ensuring that data is incorporated into processes for making key decisions. They create targets for operational or financial improvements. They work hard to trace the causal impact of big data on the achievement of these targets.”
Find an organizational “home” for big data analysis: A common trend seen among big data leaders is that they have created an organizational home for their advanced analytics capability, “often a Center of Excellence overseen by a chief analytics officer,” according to Pearson and Wegener. This is where matters such as strategy, collection and ownership of data across business functions come into play. Organizations also need to plan how to generate insights, and how to prioritize opportunities and the allocation of data analysts’ and data scientists’ time.
There is a hope and perception that adopting data analytics will open up new paths to innovation. But it often takes an innovative spirit to open up analytics.