Tag Archives: Analytics

The Streetlight Is Watching You

We are hugely dependent on technology and sometimes take it for granted. It is always worth reminding ourselves where it all began so we can fully appreciate how lucky we are. Take Light Emitting Diodes (LEDs), for example. They have come a long way in a relatively short time: when they were first used as low-intensity indicator lights in electronic devices, it is hard to believe anyone foresaw them one day lighting our homes.

The future of lighting may first be peeking through at Newark Liberty Airport in New Jersey. As Diane Cardwell reported in The New York Times, the airport has installed 171 new LED-based light fixtures that include a variety of sensors to detect and record what's going on in the airport. Together they form a network of devices that communicates wirelessly, allowing authorities to scan the license plates of passing cars, watch for lines and delays, and monitor travelers for suspicious activity.

I get the feeling that Newark's new gear will not be the last lighting-based digital network. Over the last few years, LED street lights have gone from something cities would love to have to the industry standard. That the market has shifted so swiftly is thanks to the efforts of early movers such as the City of Los Angeles, which last year completed the world's largest LED street light replacement project, installing LED fixtures on 150,000 streetlights.

Los Angeles is certainly not alone in making the switch to LED street lighting. In March 2013, Las Vegas outfitted 50,000 streetlights with LED fixtures. One month later, Austin, TX announced plans to install 35,000 LED street lights. Not to be outdone, New York City is planning to go all-LED by 2017, which would save $14 million and many tons of carbon emissions each year.

The impending switch to LEDs is an excellent opportunity for LED light fixture makers and Big Data software vendors like Informatica. These fixtures can carry a wide variety of sensors tailored to whatever the user wants to detect: temperature, humidity, seismic activity, radiation, audio, and video, among other things. The sensors could even detect and triangulate the source of a gunshot.
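To make that last point concrete: with synchronized clocks, a handful of fixtures can locate a sound from the differences in when each one hears it (time difference of arrival). Here is a toy sketch of the idea, entirely mine and not anything deployed at Newark; the fixture positions, search grid, and speed of sound are illustrative assumptions.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 C (assumed constant here)

# Hypothetical fixture positions (x, y) in meters along one block.
SENSORS = [(0.0, 0.0), (120.0, 0.0), (0.0, 90.0), (120.0, 90.0)]

def arrival_times(source):
    """Simulate when each fixture would hear a shot fired at `source`."""
    return [math.dist(source, s) / SPEED_OF_SOUND for s in SENSORS]

def locate(times):
    """Grid-search the block for the point whose predicted
    time-differences-of-arrival best match the observed ones."""
    best, best_err = None, float("inf")
    for x in range(0, 121):
        for y in range(0, 91):
            p = (float(x), float(y))
            d = [math.dist(p, s) for s in SENSORS]
            err = sum(
                ((d[i] - d[0]) / SPEED_OF_SOUND - (times[i] - times[0])) ** 2
                for i in range(1, len(SENSORS))
            )
            if err < best_err:
                best, best_err = p, err
    return best

print(locate(arrival_times((37.0, 52.0))))  # -> close to (37.0, 52.0)
```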

This steady stream of real-time data collected from these fixtures can be turned into torrents of small messages and events and moved with unprecedented agility using Informatica Vibe Data Stream. The analyzed data can then be distributed to various governmental and non-governmental agencies, such as law enforcement, environmental monitors, and retailers.
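I have not seen Vibe Data Stream's API, so treat the following as nothing more than the general shape of that pattern, many small events fanned out to several consumers, sketched with Python's standard library standing in for a real durable message bus.

```python
import json
import queue
import threading

# One queue per subscribing agency; a production system would use a
# durable, distributed message bus, but the fan-out shape is the same.
SUBSCRIBERS = {
    "law_enforcement": queue.Queue(),
    "environmental": queue.Queue(),
    "retail": queue.Queue(),
}

def publish(event: dict):
    """Serialize one small sensor event and fan it out to every consumer."""
    message = json.dumps(event)
    for q in SUBSCRIBERS.values():
        q.put(message)

def consume(agency: str, q: queue.Queue):
    while True:
        event = json.loads(q.get())
        print(f"[{agency}] fixture={event['fixture']} {event['sensor']}={event['value']}")
        q.task_done()

for agency, q in SUBSCRIBERS.items():
    threading.Thread(target=consume, args=(agency, q), daemon=True).start()

# A single streetlight fixture emitting a burst of readings.
for sensor, value in [("temperature_c", 21.4), ("audio_db", 62.0), ("occupancy", 3)]:
    publish({"fixture": "EWR-0042", "sensor": sensor, "value": value})

for q in SUBSCRIBERS.values():
    q.join()  # wait until every agency has drained its queue
```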

If I were to guess the number of streetlights in the world, I would say 4 billion. Upgrading them is a "once-in-a-generation opportunity" to harness sensory big data on an enormous scale.


Download the Informatica Big Data Edition Trial and Unleash the Power of Hadoop

Big Data Edition Trial Sandbox for Cloudera

Come and get it.  For developers hungry to get their hands on Informatica on Hadoop, a downloadable free trial of Informatica Big Data Edition was launched today on the Informatica Marketplace.  See for yourself the power of the killer app on Hadoop from the leader in data integration and quality.

Thanks to the generous help of our partners, the Informatica Big Data team has preinstalled the Big Data Edition inside the sandbox VMs of the two leading Hadoop distributions.  This empowers Hadoop and Informatica developers to easily try the codeless, GUI-driven Big Data Edition to build and execute ETL and data integration pipelines natively on Hadoop for Big Data analytics.

Informatica Big Data Edition is the most complete and powerful suite for Hadoop data pipelines and can increase productivity up to 5 times. Developers can leverage hundreds of out-of-the-box Informatica pre-built transforms and connectors for structured and unstructured data processing on Hadoop.  With the Informatica Vibe Virtual Data Machine running directly on each node of the Hadoop cluster, the Big Data Edition can profile, parse, transform and cleanse data at any scale to prepare data for data science, business intelligence and operational analytics.
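For readers who want to picture what "transform and cleanse natively on Hadoop" means in principle, here is a hand-rolled sketch using Hadoop Streaming. To be clear, this is not how the Big Data Edition is used (it is codeless and GUI-driven); the pipe-delimited record layout is also invented.

```python
#!/usr/bin/env python3
# cleanse_mapper.py -- toy map-only cleansing job for Hadoop Streaming:
#   hadoop jar hadoop-streaming.jar -D mapreduce.job.reduces=0 \
#       -files cleanse_mapper.py -mapper cleanse_mapper.py \
#       -input /raw/customers -output /clean/customers
# Assumed input layout, one record per line: id|name|email
import re
import sys

EMAIL = re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+")

for line in sys.stdin:
    fields = [f.strip() for f in line.rstrip("\n").split("|")]
    if len(fields) != 3:
        continue                      # drop malformed rows
    rec_id, name, email = fields
    email = email.lower()
    if not EMAIL.fullmatch(email):
        continue                      # drop rows with an invalid email
    print(f"{rec_id}|{name.title()}|{email}")
```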

The Informatica Big Data Edition Trial Sandbox VMs have a 60-day trial version of the Big Data Edition preinstalled inside a 1-node Hadoop cluster.  The trials include sample data and mappings as well as getting-started documentation and videos.  You can try your own data with the trials, but processing is limited to the 1-node Hadoop cluster and the machine you run it on.  Any mappings you develop in the trial can easily be moved to a production Hadoop cluster running the Big Data Edition. The Informatica Big Data Edition also supports the MapR and Pivotal Hadoop distributions; however, the trial is currently available only for Cloudera and Hortonworks.

Big Data Edition Trial Sandbox for Hortonworks

Accelerate your ability to bring Hadoop from the sandbox into production by leveraging Informatica’s Big Data Edition. Informatica’s visual development approach means that more than one hundred thousand existing Informatica developers are now Hadoop developers without having to learn Hadoop or new hand-coding techniques and languages. Informatica can help organizations easily integrate Hadoop into their enterprise data infrastructure and bring the PowerCenter data pipeline mappings running on traditional servers onto Hadoop clusters with minimal modification. Informatica Big Data Edition reduces the risk of Hadoop projects and increases agility by enabling more of your organization to interact with the data in your Hadoop cluster.

To get the Informatica Big Data Edition Trial Sandbox VMs and more information, please visit the Informatica Marketplace.


Reflections Of A Former Data Analyst (Part 2) – Changing The Game For Data Plumbing

 

Cleaning. Sometimes it is challenging!

In my last blog I promised to report back on my experience using Informatica Data Quality, a software tool that helps automate the hectic, tedious data plumbing task, a task that routinely consumes more than 80% of an analyst's time. Today, I am happy to share what I've learned in the past couple of months.

But first, let me confess something. The reason it took me so long to get here was that I dreaded trying the software. Never a savvy computer programmer, I was convinced that I would not be technical enough to master the tool and that it would turn into a lengthy learning experience. That mental barrier dragged me down for a couple of months before I finally bit the bullet and got my hands on the software. I am happy to report that my fear was truly unnecessary. It took me half a day to get a good handle on most features in the Analyst Tool, the component of Data Quality designed for analysts and business users. I then spent three days figuring out how to maneuver the Developer Tool, the other key piece of the Data Quality offering, used mostly by (you guessed it) developers and technical users. I admit I am no master of the Developer Tool after three days of wrestling with it, but I got the basics. More importantly, my hands-on time with the whole product helped me understand the logic behind its design and see for myself how analysts and business users can easily collaborate with their IT counterparts within the Data Quality environment.

To break it all down, first comes profiling. As analysts, we understand all too well the importance of profiling: it provides an anatomy of the raw data we collect. In many cases it is a must-have first step in data preparation, especially when the raw data comes from different places and in different formats. As a heavy user of Excel, I used to rely on every trick in the spreadsheet to gain visibility into my data. I would filter, sort, build pivot tables, and make charts to learn what was in my raw data. Depending on how many columns were in my data set, it could take hours, sometimes days, just to figure out whether the data I received was any good at all, and how good it was.

Which one do you like better?

Switching to the Analyst Tool in Data Quality, learning my raw data becomes a task of a few clicks: six at most, if I am picky about how I want it done. Basically, I load my data, click a couple of options, and let the software do the rest. A few seconds later I can visualize statistics for the data fields I choose to examine, and I can measure the quality of the raw data using the Scorecard feature. No more fiddling with spreadsheets and staring at busy rows and columns. Take a look at the screenshots above and let me know your preference.
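For the curious, the statistics behind a profile are conceptually simple; a rough pandas equivalent of what the profiling screen surfaces, my own sketch rather than anything from the product, might look like this:

```python
import pandas as pd

def profile(df: pd.DataFrame) -> pd.DataFrame:
    """Per-column statistics of the kind a profiling screen shows."""
    rows = []
    for col in df.columns:
        s = df[col]
        mode = s.mode()
        rows.append({
            "column": col,
            "non_null_pct": round(100 * s.notna().mean(), 1),
            "distinct": s.nunique(),
            "most_common": mode.iat[0] if not mode.empty else None,
            "inferred_type": str(s.dtype),
        })
    return pd.DataFrame(rows)

raw = pd.DataFrame({
    "customer_id": [101, 102, 102, None],
    "state": ["CA", "ca", "NY", None],
})
print(profile(raw))
```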

Once profiling tells me my raw data is adequate, I still need to clean up the nonsense in it before doing any analysis; otherwise bad things can happen (we call it garbage in, garbage out). Again, to clean and standardize my data, Excel used to come to the rescue. I would play with familiar functions and learn new ones, write macros, or simply do it by hand. It was tedious, but it worked as long as I was working on a static data set. The problem came when I needed to incorporate new data sources in a different format: many of the previously built formulas would break and become inapplicable, and I would have to start all over again. Spreadsheet tricks simply don't scale in those situations.

Rule Builder in Analyst Tool

With the Data Quality Analyst Tool, I can use the Rule Builder to create a set of logical rules in a hierarchical manner based on my objectives, and test those rules to see immediate results. The nice thing is, those rules are not tied to data format, location, or size, so I can reuse them when new data comes in. Profiling can be done at any time, so I can re-examine my data after applying the rules, as many times as I like. Once I am satisfied with the rules, they are passed on to my peers in IT, who create executable rules based on the logic I built and run them automatically in production. No more worrying about differences in format, volume, or other discrepancies across data sets: the software takes care of the complexity, and all I need to do is build meaningful rules that transform the data into the appropriate condition, so I have good quality data to work with for my analysis. Best part? I can do all of the above without hassling IT. Feeling empowered is awesome!
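To give a feel for why such rules scale where my spreadsheet formulas did not, here is a toy sketch of cleansing rules declared once as data, independent of any particular file's format or size. The fields and checks are invented, and this is not the Rule Builder's actual syntax:

```python
import re

# Each rule: (field, is_clean, standardize). Declared once, reusable on
# any incoming data set that carries these fields.
RULES = [
    ("state", lambda v: bool(re.fullmatch(r"[A-Z]{2}", v)),
              lambda v: v.strip().upper()[:2]),
    ("phone", lambda v: bool(re.fullmatch(r"\d{10}", v)),
              lambda v: re.sub(r"\D", "", v)),
    ("email", lambda v: v == v.strip().lower(),
              lambda v: v.strip().lower()),
]

def apply_rules(record: dict) -> dict:
    """Standardize one record; profiling can be re-run on the output."""
    out = dict(record)
    for field, is_clean, standardize in RULES:
        value = str(out.get(field, ""))
        out[field] = value if is_clean(value) else standardize(value)
    return out

print(apply_rules({"state": " ca", "phone": "(415) 555-0100",
                   "email": " Ana@Example.COM "}))
# -> {'state': 'CA', 'phone': '4155550100', 'email': 'ana@example.com'}
```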

Changing The Game For Data Plumbing

Use the Right Tool for the Job

Using the right tool for the right job will improve our results, save us time, and make our jobs much more enjoyable. For me, there is no more Excel for data cleansing after trying our Data Quality software, because now I can get more done in less time, and I am no longer stressed out by the lengthy process.

I encourage my analyst friends to try Informatica Data Quality, or at least the Analyst Tool in it. If you are like me, wary of a steep learning curve, fear no more. Besides, if Data Quality can cut your data cleansing time in half (mind you, our customers have reported higher numbers), think how many more predictive models you could build, how much more you would learn, and how much faster you could build your reports in Tableau, with more confidence.


The Data-Driven CMO: A Q&A with Glenn Gow (CEO of Crimson Marketing)

Q&A with Crimson Marketing

I recently had the opportunity to have a very interesting discussion with Glenn Gow, the CEO of Crimson Marketing.  I was impressed at what an interesting and smart guy he was, and with the tremendous insight he has into the marketing discipline.  He consults with over 150 CMOs every year, and has a pretty solid understanding of the pains they are facing, the opportunities in front of them, and the approaches the best of the best are taking to reach new levels of success.

I asked Glenn if he would be willing to do a Q&A in order to share some of his insight.  I hope you find his perspective as interesting as I did!


______________________________________________

Q: What do you believe is the single biggest advantage that marketers have today?

A: Being able to use data in marketing is absolutely your single biggest competitive advantage as a marketer.  And therefore your biggest challenge is capturing, leveraging and rationalizing that data.  The marketers we speak with tend to fall into two buckets.

  1. Those who understand that the way they manage data is critical to their marketing success.  These marketers use data to inform their decisions, and then rely on it to measure their effectiveness.
  2. Those who haven’t yet discovered that data is the key to their success. Often these people start with systems in mind – marketing automation, CRM, etc.  But after implementing and beginning to use these systems, they almost always come to the realization that they have a data problem.

______________________________________________

Q:  How has this world of unprecedented data sources and volumes changed the marketing discipline?

A:  In short… dramatically.  The shift has really happened in the last two years. The big impetus for this change has really been the availability of data.  You’ve probably heard this figure, but Google’s Eric Schmidt likes to say that every two days now, we create as much information as we did from the dawn of civilization until 2003.

We believe this is a massive opportunity for marketers.  The question is, how do we leverage this data?  How do we pull out the golden nuggets that will help us do our jobs better?  Marketers now have access to information they’ve never had access to, or even contemplated, before.  This gives them the ability to become more effective marketers. And by the way… they have to!  Customers expect them to!

Take ad re-targeting, for example.  Customers expect to be shown ads that are relevant to them, and if marketers don’t do this successfully, they can actually damage their brand.

In addition, competitors are taking full advantage of data, and are getting better every day at winning the hearts and minds of their customers – so marketers need to act before their competitors do.

Marketers have a tremendous opportunity: rich data is available, and the technology to harness it is now here, so they can win a war they never could before.

______________________________________________

Q:  What barriers are they up against in harnessing this data?

A:
  I’d say that barriers can really be broken down into 4 main buckets: existing architecture, skill sets, relationships, and governance.

  • Existing Architecture: The way that data has historically been collected and stored doesn’t have the CMO’s needs in mind.  The CMO has an abundance of data theoretically at their fingertips, but they cannot do what they want with it.  The CMO needs to insist on, and work together with, the CIO to build an overarching data strategy that meets their needs – both today and tomorrow – because the marketing profession and tool sets are rapidly changing.  That means the CMO and their team need to step into a conversation they’ve never had before with the CIO and his/her team.  And it’s not about systems integration; it’s about data integration.
  • Existing Skill Sets:  The average marketer today is a right-brained individual.  They entered the profession because they are naturally gifted at branding, communications, and outbound perspectives.  And that requirement doesn’t go away – it’s still important.  But today’s marketer now needs to grow their left-brained skills, so they can take advantage of inbound information, marketing technologies, data, etc.  It’s hard to ask a right-brained person to suddenly be effective at managing this data.  The CMO needs to fill this skillset gap primarily by bringing in people who understand it, but they cannot ignore it themselves.  The CMO needs to understand how to manage a team of data scientists and operations people to dig through and analyze this data.  Some CMOs have actually learned to love data analysis themselves (in fact, your CMO at Informatica, Marge Breya, is one of them).
  • Existing Relationships:  In a data-driven marketing world, relationships with the CIO become paramount.  They have historically determined what data is collected, where it is stored, what it is connected to, and how it is managed.  Today’s CMO isn’t just going to the CIO with a simple task, as in asking them to build a new dashboard.  They have to collectively work together to build a data strategy that will work for the organization as a whole.  And marketing is the “new kid on the block” in this discussion – the CIO has been working with finance, manufacturing, etc. for years, so it takes some time (and great data points!) to build that kind of cohesive relationship.  But most CIOs understand that it’s important, if for no other reason than that they see budgets increasingly shifting to marketing and the rest of the Lines of Business.
  • Governance:  Who is ultimately responsible for the data that lives within an organization?  It’s not an easy question to answer.  And since marketing is a relatively new entrant into the data discussion, there are often a lot of questions left to answer. If marketing wants access to the customer data, what are we going to let them do with it? Read it?  Append to it?  How quickly does this happen? Who needs to author or approve changes to a data flow?  Who manages opt-ins/opt-outs and regulatory blacklists?  And how does that impact our responsibility as an organization?  This is a new set of conversations for the CMO – but they’re absolutely critical.

______________________________________________

Q:  Are the CMOs you speak with concerned with measuring marketing success?

A:  Absolutely.  CMOs are feeling tremendous pressure from the CEO to quantify their results.  There was a recent Duke University study of CMOs that asked if they were feeling pressure from the CEO or board to justify what they’re doing.  64% of the respondents said that they do feel this pressure, and 63% say this pressure is increasing.

CMOs cannot ignore this.  They need to have access to the right data that they can trust to track the effectiveness of their organizations.  They need to quantitatively demonstrate the impact that their activities have had on corporate revenue – not just ROI or Marketing Qualified Leads.  They need to track data points all the way through the sales cycle to close and revenue, and to show their actual impact on what the CEO really cares about.

______________________________________________

Q:  Do you think marketers who undertake marketing automation products without a solid handle on their data first are getting solid results?

A:
  That is a tricky one.  Ideally, yes, they’d have their data in great shape before undertaking a marketing automation process.  The vast majority of companies who have implemented the various marketing technology tools have encountered dramatic data quality issues, often coming to light during the process of implementing their systems. So data quality and data integration are the ideal first step.

But the truth is, solving a company’s data problem isn’t a simple, straightforward challenge.  It takes time and it’s not always obvious how to solve the problem.  Marketers need to be part of this conversation.  They need to drive how they’re going to be managing data moving forward.  And they need to involve people who understand data well, whether they be internal (typically in IT) or external (consulting companies like Crimson, and technology providers like Informatica).

So the reality for a CMO, is that it has to be a parallel path.  CMOs need to get involved in ensuring that data is managed in a way they can use effectively as a marketer, but in the meantime, they cannot stop doing their day-to-day job.  So, sure, they may not be getting the most out of their investment in marketing automation, but it’s the beginning of a process that will see tremendous returns over the long term.

______________________________________________

Q:  Is anybody really getting it “right” yet?

A:  This is the best part… yes!  We are starting to see more and more forward-thinking organizations really harnessing their data for competitive advantage, and using technology in very smart ways to tie it all together and make sense of it.  In fact, we are in the process of writing a book entitled “Moneyball for Marketing” that features eleven different companies who have marketing strategies and execution plans that we feel are leading their industries.

______________________________________________

So readers, what do you think?  Who do you think is getting it “right” by leveraging their data with smart technology and truly getting meaningful and impactful results?


Takeaways from the Gartner Security and Risk Management Summit (2014)

Last week I had the opportunity to attend the Gartner Security and Risk Management Summit, where Gartner analysts and security industry experts meet to discuss the latest trends, advances, best practices and research in the space. At the event, I had the privilege of connecting with customers, peers and partners. I was also excited to learn about changes that are shaping the data security landscape.

Here are some of the things I learned at the event:

  • Security continues to be a top CIO priority in 2014. Security is well-aligned with other trends such as big data, IoT, mobile, cloud, and collaboration. According to Gartner, the top CIO priority area is BI/analytics. Given our growing appetite for all things data and our increasing ability to mine data to increase top-line growth, this top billing makes perfect sense. The challenge is to protect the data assets that drive value for the company and ensure appropriate privacy controls.
  • Mobile and data security are the top focus for 2014 spending in North America according to Gartner’s pre-conference survey. Cloud rounds out the list when considering worldwide spending results.
  • Rise of the DRO (Digital Risk Officer). Fortunately, those same market trends are leading to an evolution of the CISO role to a Digital Security Officer and, longer term, a Digital Risk Officer. The DRO role will include determination of the risks and security of digital connectivity. Digital/Information Security risk is increasingly being reported as a business impact to the board.
  • Information management and information security are blending. Gartner assumes that 40% of global enterprises will have aligned governance of the two programs by 2017. This is not surprising given the overlap of common objectives such as inventories, classification, usage policies, and accountability/protection.
  • Security methodology is moving from a reactive approach to compliance-driven and proactive (risk-based) methodologies. There is simply too much data and too many events for analysts to monitor. Organizations need to understand their assets and their criticality. Big data analytics and context-aware security are then needed to reduce the noise and false positive rates to a manageable level. According to Gartner analyst Avivah Litan, “By 2018, of all breaches that are detected within an enterprise, 70% will be found because they used context-aware security, up from 10% today.” A toy sketch of what such context-aware scoring can look like follows this list.
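Here is that sketch, mine rather than Gartner's or any vendor's logic: the same raw login event scores differently depending on what is normal for the user involved, which is how noise and false positives get suppressed. The baselines and weights are invented.

```python
from dataclasses import dataclass

@dataclass
class Login:
    user: str
    country: str
    hour: int              # 0-23, local server time
    failed_attempts: int

# Context a naive, global rule set lacks: per-user baselines.
BASELINES = {"alice": {"countries": {"US"}, "active_hours": range(7, 20)}}

def risk_score(e: Login) -> int:
    """Score one event against that user's own baseline."""
    base = BASELINES.get(e.user, {"countries": set(), "active_hours": range(24)})
    score = 0
    if e.country not in base["countries"]:
        score += 40          # unusual geography for this user
    if e.hour not in base["active_hours"]:
        score += 20          # outside this user's normal hours
    score += min(e.failed_attempts, 5) * 8
    return score

print(risk_score(Login("alice", "US", 10, 1)))  # 8: routine, suppressed
print(risk_score(Login("alice", "RO", 3, 4)))   # 92: surfaced to an analyst
```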

I want to close by sharing the Top Digital Security Trends for 2014 identified at the summit:

  • Software-defined security
  • Big data security analytics
  • Intelligent/Context-aware security controls
  • Application isolation
  • Endpoint threat detection and response
  • Website protection
  • Adaptive access
  • Securing the Internet of Things

Reflections of a Former Analyst

In my last blog, I talked about the dreadful experience of cleaning raw data by hand as a former analyst a few years back. Well, the truth is, I was not alone. At a recent data mining Meetup event in the San Francisco Bay Area, I asked a few analysts: “How much time do you spend on cleaning your data at work?” “More than 80% of my time,” “Most of my days,” the analysts said, “and it is not fun.”

But check this out: there are over a dozen Meetup groups focused on data science and data mining here in the Bay Area where I live. Those groups put on events multiple times a month, with topics often around hot, emerging technologies such as machine learning, graph analysis, real-time analytics, new algorithms for analyzing social media data, and of course, anything Big Data. Cool BI tools, new programming models and algorithms for better analysis are a big draw for data practitioners these days.

That got me thinking: if what the analysts said to me is true, i.e., they spend 80% of their time on data prep and only a quarter of that, about 20%, analyzing the data and visualizing the results (which, BTW, “is actually fun,” to quote one data analyst), then why are they drawn to events focused on tools that can only help them 20% of the time? Why wouldn’t they want to explore technologies that can help with the dreadful 80%, the data scrubbing they complain about?

Having been there myself, I thought perhaps a little self-reflection would help answer the question.

As a student of math, I love data and am fascinated by the good stories I can discover in it. My two-year math program in graduate school was primarily focused on learning how to build fabulous math models to simulate real events, and to use those formulas to predict the future or look for meaningful patterns.

I used BI and statistical analysis tools while at school, and continued to use them at work after I graduated. That software was great in that it helped me get to results, see what was in my data, and develop conclusions and recommendations based on those insights for my clients. Without BI and visualization tools, I would not have delivered any results.

That was the fun and glamorous part of my job as an analyst. But when I was not creating nice charts and presentations to tell the stories in my data, I was spending time, a great amount of time, sometimes up to the wee hours, cleaning and verifying my data. I was convinced that was part of my job and I just had to suck it up.

It was only a few months ago that I stumbled upon data quality software – it happened when I joined Informatica. At first I thought they were talking to the wrong person when they started pitching me data quality solutions.

It turns out the concept of data quality automation is highly relevant and extremely intuitive to me, and to anyone who deals with data on a regular basis. Data quality software offers an automated process for data cleansing that is much faster and delivers more accurate results than a manual process. To put that in a math context: if a data quality tool can reduce the data cleansing effort from 80% to 40% of an analyst's time (BTW, this is hardly a random number; some of our customers have reported much better results), that means analysts can free up 40% of their time from scrubbing data and use it to do the things they like: playing with data in BI tools, building new models, running more scenarios, producing different views of the data, and discovering things they could not see before, all with clean, trusted data. No more bored-to-death experience. What they are left with is improved productivity, more accurate and consistent results, compelling stories about data, and, most important, the freedom to focus on what they like! Not too shabby, right?

I am excited about trying out the data quality tools we have here at Informatica. My fellow analysts, you should start looking into them too. I will check back in soon with more stories to share.



Why Some Companies are So Good With Analytics

There’s a reason why big data analytics are so successful at some companies, yet fall flat at others. As MIT’s Michael Schrage put it in a recent Harvard Business Review article, it all depends on how deeply the data and tools are employed in the business. “Companies with mediocre to moderate outcomes use big data and analytics for decision support,” he says. “Successful ROA—Return on Analytics—firms use them to effect and support behavior change.”


In other words, analytics really need to drill down deep into the psyche of organizations to make a difference. The more big data analytics get baked into business processes and outcomes, the more likely they are to deliver transformative results to the organization. As he puts it, “better data-driven analyses aren’t simply ‘plugged-in’ to existing processes and reviews, they’re used to invent and encourage different kinds of conversations and interactions.”

You may have heard some of these success stories in recent years – the casino and resort company that tracks customer engagements in real-time and extends targeted offers that will enrich their stay; the logistics company that knows where its trucks are, and can reroute them to speed up delivery and save fuel; the utility that can regulate customers’ energy consumption at critical moments to avoid brownouts.

Schrage’s observations come from interviews and discussions with hundreds of organizations in recent years. His conclusions point to the need to develop an “analytical culture” – in which the behaviors, practices, rituals and shared vision of the organization are based on data versus guesswork. This is not to say gut feel and passion don’t have a place in successful ventures – because they do. But having the data to back up passionate leadership is a powerful combination in today’s business climate.

Most executives instinctively understand the advantages big data can bring to their operations, especially with predictive analytics and customer analytics. The ability to employ analytics means better understanding customers and markets, as well as spotting trends as they are starting to happen, or have yet to happen. Performance analytics, predictive analytics, and prescriptive analytics all are available to decision makers.

Here are some considerations for “baking” data analytics deeper into the business:

Identify the business behaviors or processes to be changed by analytics. In his article, Schrage quotes a financial services CIO, who points out that standard BI and analytical tools often don’t go deeply enough into an organization’s psyche: “Improving compliance and financial reporting is the low-hanging fruit. But that just means we’re using analytics to do what we are already doing better.” The key is to get the business to open up and talk about what they would like to see changed as a result of analytics.

Focus on increasing analytic skills – for everyone. While many organizations go out searching for individuals who can fill data scientist roles (or something similar), there’s likely an abundance of talent and insightfulness that can be brought out from current staff, both inside and outside of IT. Business users, for example, can be trained to work with the latest front-end tools that bring data forward into compelling visualizations. IT and data professionals can sharpen their skills with emerging tools and platforms such as Hadoop and MapReduce, as well as working with analytical languages such as R.

Schrage cites one company that recognized that a great deal of education and training was required before it could re-orient its analytics capabilities around “most profitable customers” and “most profitable products.”  Even clients and partners required some level of training. The bottom line: “The company realized that these analytics shouldn’t simply be used to support existing sales and services practices but treated as an opportunity to facilitate a new kind of facilitative and consultative sales and support organization.”

Automate, and what you can’t automate, make as friendly and accessible as possible. Automated decision management can improve the quality of analytics and the analytics experience for decision makers. That’s because automating low-level decisions – such as whether to grant a credit line increase or extend a special offer to a customer – removes these more mundane tasks from decision makers’ plates. As a result, they are freed up to concentrate on higher-level, more strategic decisions. For those decisions that can’t be automated, information should be as easily accessible as possible to all levels of decision makers – through mobile apps, dashboards, and self-service portals.
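As a minimal sketch of that division of labor (the thresholds below are purely illustrative, not anyone's real credit policy): routine requests are resolved automatically, and only the genuinely hard cases reach a person.

```python
def decide_credit_increase(requested: float, current_limit: float,
                           utilization: float, missed_payments: int) -> str:
    """Resolve routine credit-line requests by policy; escalate the rest."""
    small_ask = requested <= 0.20 * current_limit
    good_standing = missed_payments == 0 and utilization < 0.50
    if small_ask and good_standing:
        return "auto-approve"
    if missed_payments >= 3:
        return "auto-decline"
    return "route to analyst"   # human judgment for the ambiguous middle

print(decide_credit_increase(500, 5000, 0.30, 0))    # auto-approve
print(decide_credit_increase(4000, 5000, 0.80, 1))   # route to analyst
```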

 


Data Ambition: What Does It Take To Become an Analytics Leader?


Which comes first: innovation or analytics?

Bain & Company released some survey findings a few months back that actually put a value on big data.  Companies with advanced analytic capabilities, the consultancy finds, are twice as likely to be in the top quartile of financial performance within their industries; five times as likely to make decisions much faster than market peers; three times as likely to execute decisions as intended; and twice as likely to use data very frequently when making decisions.

This is all good stuff, and the survey, which covered the input of 400 executives, makes a direct correlation between big data analytics efforts and the business’s bottom line. However, it raises a question: how does an organization become one of these analytic leaders? And there’s a more brain-twisting question as well: would the type of organization that supports an advanced analytics culture be more likely to be ahead of its competitors because its management tends to be more forward-thinking on a lot of fronts, and not just big data?

You just can’t throw a big data or analytics program or solution set on top of the organization (or drop in a data scientist) and expect to be dazzled with sudden clarity and insight. If an organization is dysfunctional, with a lot of silos, fiefdoms, or calcified and uninspired management, all the big data in the world isn’t going to lift its intelligence quotient.

The authors of the Bain & Company study, Travis Pearson and Rasmus Wegener, point out that “big data isn’t just one more technology initiative” – “in fact, it isn’t a technology initiative at all; it’s a business program that requires technical savvy.”

Succeeding with big data analytics requires a change in the organization’s culture, and the way it approaches problems and opportunities. The enterprise needs to be open to innovation and change. And, as Pearson and Wegener point out, “you need to embed big data deeply into your organization. It’s the only way to ensure that information and insights are shared across business units and functions. This also guarantees the entire company recognizes the synergies and scale benefits that a well-conceived analytics capability can provide.”

Pearson and Wegener also point to the following common characteristics of big data leaders they have studied:

Pick the “right angle of entry”: There are many areas of the business that can benefit from big data analytics, but just a few key areas that will really impact the business. It’s important to focus big data efforts on the right things. Pearson and Wegener say there are four areas where analytics can be relevant: “improving existing products and services, improving internal processes, building new product or service offerings, and transforming business models.”

Communicate big data ambition: Make it clear that big data analytics is a strategy that has the full commitment of management, and it’s a key part of the organization’s strategy. Messages that need to be communicated: “We will embrace big data as a new way of doing business. We will incorporate advanced analytics and insights as key elements of all critical decisions.” And, the co-authors add, “the senior team must also answer the question: To what end? How is big data going to improve our performance as a business? What will the company focus on?”

Sell and evangelize: Selling big data is a long-term process, not just one or two announcements at staff meetings. “Organizations don’t change easily and the value of analytics may not be apparent to everyone, so senior leaders may have to make the case for big data in one venue after another,” the authors caution. Big data leaders, they observe, have learned to take advantage of the tools at their disposal: they “define clear owners and sponsors for analytics initiatives. They provide incentives for analytics-driven behavior, thereby ensuring that data is incorporated into processes for making key decisions. They create targets for operational or financial improvements. They work hard to trace the causal impact of big data on the achievement of these targets.”

Find an organizational “home” for big data analysis: A common trend seen among big data leaders is that they have created an organizational home for their advanced analytics capability, “often a Center of Excellence overseen by a chief analytics officer,” according to Pearson and Wegener. This is where matters such as strategy, collection and ownership of data across business functions come into play. Organizations also need to plan how to generate insights, and prioritize opportunities and the allocation of data analysts’ and scientists’ time.

There is a hope and perception that adopting data analytics will open up new paths to innovation. But it often takes an innovative spirit to open up analytics.


The Need for Specialized SaaS Analytics


SaaS companies are growing rapidly and becoming the top priority for most CIOs. With such high growth expectations, many SaaS vendors are investing in sales and marketing to acquire new customers even if it means having a negative net profit margin as a result. Moreover, with the pressure to grow rapidly, there is an increased urgency to ensure that the Average Sales Price (ASP) of every transaction increases in order to meet revenue targets.

The nature of the cloud allows these SaaS companies to release new features every few months, which sales reps can then promote to new customers. When new functionality is neither used nor understood, customers often feel that they have overpaid for a SaaS product. In such cases, customers usually downgrade to a lower-priced edition or, worse, leave the vendor entirely. To make up for this loss, the sales representatives must work harder to acquire new leads, which results in less attention for existing customers. Preventing customer churn is very important: per dollar of revenue, the Cost to Acquire a Customer (CAC) for an upsell is only 19% of the CAC for a new customer, and the CAC to renew an existing customer is only 15%.
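In concrete terms, assuming purely for illustration that it costs $1.00 of sales and marketing spend to book $1.00 of new-customer revenue, those percentages work out as follows:

```python
# Cost to book $100,000 of revenue by channel, using the CAC ratios
# above against an assumed $1.00 per new-customer revenue dollar.
NEW_CUSTOMER_CAC = 1.00   # $ spent per $1 of new-customer revenue (assumed)
CHANNEL_RATIOS = {"new customers": 1.00, "upsells": 0.19, "renewals": 0.15}

target_revenue = 100_000
for channel, ratio in CHANNEL_RATIOS.items():
    cost = NEW_CUSTOMER_CAC * ratio * target_revenue
    print(f"{channel:>13}: ${cost:,.0f}")
# new customers: $100,000 / upsells: $19,000 / renewals: $15,000
```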

Accurate customer usage data helps determine which features customers use and which are underutilized. Gathering this data can help pinpoint high-value features that are not used, especially by customers that have recently upgraded to a higher edition. The process of collecting this data involves several touch points – from recording clicks within the app to analyzing the open rate of entire modules. This is where embedded cloud integration comes into play.

Embedding integration within a SaaS application allows vendors to gain operational insights into each aspect of how their app is being used. With this data, vendors are able to give product management feedback on further improvements. Additionally, embedded integration can alert the customer success management team to potential churn, allowing them to take preventative measures.
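A minimal sketch of the kind of instrumentation being described, with the feature names, premium set, and churn signal all invented for illustration:

```python
from collections import Counter

PREMIUM_FEATURES = {"forecasting", "api_access", "custom_reports"}

class UsageTracker:
    """Record in-app feature events and surface churn-risk signals."""

    def __init__(self):
        self.events = Counter()

    def record(self, account: str, feature: str):
        self.events[(account, feature)] += 1

    def unused_premium_features(self, account: str):
        """Premium features an account pays for but never opens: a cue
        for the customer success team to act before renewal time."""
        used = {f for (acct, f), n in self.events.items()
                if acct == account and n > 0}
        return sorted(PREMIUM_FEATURES - used)

tracker = UsageTracker()
tracker.record("acme", "forecasting")
tracker.record("acme", "dashboards")
print(tracker.unused_premium_features("acme"))  # ['api_access', 'custom_reports']
```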

To learn more about how a specialized analytics environment can be set up for SaaS apps, join Informatica and Gainsight on April 9th at 10am PDT for an informational webinar, Powering Customer Analytics with Embedded Cloud Integration.


Fire your Data Scientists – They Don’t Add Value

Years ago, I was on a project to improve production and product quality through data analysis. During the project, I heard one man say:

“If I had my way, I’d fire the statisticians – all of them – they don’t add value”. 

Surely not? Why would you fire the very people employed to make sense of the vast volumes of manufacturing data and guide future production? But he was right. The problem was that, at the time, data management was so poor that data was simply not available for the statisticians to analyze.

So, perhaps this title should be rewritten as:

Fire your Data Scientists – They Aren’t Able to Add Value.

Although this statement is a bit extreme, the same situation may still exist. Data scientists frequently share frustrations such as:

  • “I’m told our data is 60% accurate, which means I can’t trust any of it.”
  • “We achieved our goal of an answer within a week by working 24 hours a day.”
  • “Each quarter we manually prepare 300 slides to anticipate all questions the CFO may ask.”
  • “Fred manually audits 10% of the invoices.  When he is on holiday, we just don’t do the audit.”

This is why I think the original quote is so insightful.  Value from data is not automatically delivered by hiring a statistician, analyst or data scientist. Even with the latest data mining technology, one person cannot positively influence a business without the proper data to support them.

Most organizations are unfamiliar with the structure required to deliver value from their data. New storage technologies will be introduced, and a variety of analytics tools will be tried and tested. This change is crucial to success. For statisticians to add value to a company, they must have access to high-quality data that is easily sourced and integrated. That data must be available through the latest analytics technology, and this new ecosystem should provide insights that can play a role in future production. Staff will need to be trained as this new data is incorporated into daily decision making.

With a rich 20-year history, Informatica understands data ecosystems. Employees become wasted investments when they do not have access to the trusted data they need in order to deliver their true value.

Who wants to spend their time recreating data sets to find a nugget of value only to discover it can’t be implemented?

Build an analytical ecosystem with a balanced focus on all aspects of data management. Then value delivery will be limited only by the imagination of your employees. Rather than questioning the value of an analytics team, you will attract some of the best and brightest, and you will finally be able to deliver on the promised value of your data.
