Category Archives: Big Data

Is it the CDO or CAO or Someone Else?

A month ago, I shared that Frank Friedman believes CFOs are “the logical choice to own analytics and put them to work to serve the organization’s needs”. Even though many CFOs are increasingly taking on what could be considered an internal CEO or COO role, many readers protested my post, which focused on reviewing Frank Friedman’s argument. At the same time, CIOs have been very clear with me that they do not want to personally become their company’s data steward. So the question becomes: should companies create a CDO or CAO role to lead this important function? And if so, how common are these two roles anyway?

Regardless of eventual ownership, extracting value out of data is becoming a critical business capability. It is clear that data scientists should not be shoehorned into the traditional business analyst role. Data scientists have the unique ability to derive mathematical models “for the extraction of knowledge from data” (Data Science for Business, Foster Provost, 2013, pg. 2). For this reason, Thomas Davenport claims that data scientists need to be able to network across an entire business and work at the intersection of business goals, constraints, processes, available data, and analytical possibilities. Given this, many organizations today are starting to experiment with the notion of having either a chief data officer (CDO) or a chief analytics officer (CAO). The open question is: should an enterprise have a CDO, a CAO, or both? And just as important, where should each of these roles report in the organization?

Data policy versus business questions

In my opinion, it is critical to first look into the substance of each role before making a decision on the above question. The CDO should be about ensuring that information is properly secured, stored, transmitted, or destroyed. This includes, according to COBIT 5, ensuring that there are effective security and controls over information systems. To do this, procedures need to be defined and implemented to ensure the integrity and consistency of information stored in databases, data warehouses, and data archives. According to COBIT 5, data governance requires the following four elements:

  • Clear information ownership
  • Timely, correct information
  • Clear enterprise architecture and efficiency
  • Compliance and security

To me, these four elements should be the essence of the CDO role. Having said this, the CAO is related but very different in terms of the nature of the role and the business skills required. The CRISP-DM model points out just how different the two roles are. According to CRISP-DM, the CAO role should be focused upon business understanding, data understanding, data preparation, data modeling, and data evaluation. As such, the CAO is focused upon using data to solve business problems while the CDO is about protecting data as a business-critical asset. I was living in Silicon Valley during the “Internet Bust”. I remember seeing very few job descriptions, and the few that existed said they wanted a developer who could also act as a product manager and do some marketing as a part-time activity. This of course made no sense. I feel the same way about the idea of combining the CDO and CAO. One is about compliance and protecting data; the other is about solving business problems with data. Peanut butter and chocolate may work in a Reese’s cup but it will not work here—the orientations are too different.

So which business leader should own the CDO and CAO?

Clearly, having two more C’s in the C-suite creates a more crowded list of corporate officers. Some have even said that this will extend what is called senior executive bloat. And of course, how do these new roles work with and impact the CIO? The answer depends on the organization’s culture, of course. However, where there isn’t an executive staff office, I suggest that these roles go to different places. Many companies already have their CIO function reporting to finance. Where this is the case, it is important to determine whether a COO function is in place. The COO could clearly own the CDO and CAO functions because they have a significant role in improving processes and capabilities. Where there isn’t a COO function and the CIO reports to the CEO, I think you could have the CDO report to the CIO even though CIOs say they do not want to be the data steward. This could be a third function in parallel with the VP of Ops and the VP of Apps. And in this case, I would have the CAO report to one of the following: the CFO, Strategy, or IT. Again, this all depends on current organizational structure and corporate culture. Regardless of where it reports, the important thing is to focus the CAO on an enterprise analytics capability.

Related Blogs

Should we still be calling it Big Data?

Is Big Data Destined To Become Small And Vertical?

Big Data Why?

What is big data and why should your business care?

Author Twitter: @MylesSuer


Major Oil Company Uses Analytics to Gain Business Advantage

According to Michelle Fox of CNBC and Stephen Schork, the oil industry is in ‘dire straits’. U.S. crude posted its ninth-straight weekly loss this week, landing under $50 a barrel. The news is bad enough that it is now expected to lead to major job losses. The Dallas Federal Reserve anticipates that Texas could lose about 125,000 jobs by the end of June. Patrick Jankowski, an economist and vice president of research at the Greater Houston Partnership, expects exploration budgets will be cut 30-35 percent, which will result in approximately 9,000 fewer wells being drilled. The problem is that “if oil prices keep falling, at some point it’s not profitable to pull it out of the ground” (“When, and where, oil is too cheap to be profitable”, CNBC, John W. Schoen).

This means that a portion of the world’s oil supply will become unprofitable to produce. According to Wood Mackenzie, “once the oil price reaches these levels, producers have a sometimes complex decision to continue producing, losing money on every barrel produced, or to halt production, which will reduce supply”. The question is: are these the only answers?

Major Oil Company Uses Analytics to Gain Business Advantage

A major oil company that we are working with has determined that data is a success enabler for their business. They are demonstrating what we at Informatica like to call a “data ready business”—a business that is ready for any change in market conditions. This company is using next-generation analytics to ensure their business’s survival and to make sure they do not become what Jim Cramer likes to call a “marginal producer”. This company has told us that their success is based upon being able to extract oil more efficiently than its competitors.

Historically data analysis was pretty simple

Traditionally, oil producers would get oil by drilling a new hole in the ground, and in six months they would have oil flowing commercially and be in business. This meant it would typically take them six months or longer before they could get any meaningful results, including data that could be used to make broader production decisions.

Drilling from data

Today, oil is also produced from shale using fracking techniques. This process can take only 30-60 days before oil producers start seeing results. It is based not just on innovation in the refining of oil, but also on innovation in the refining of data from which operational business decisions can be made. The benefits of this approach include the following:

Improved fracking process efficiency

Fracking is a very technical process. Producers can have two wells on the same field that are performing at very different levels of efficiency. To address this issue, the oil company that we have been discussing throughout this piece is using real-time data to optimize its oil extraction across an entire oil field or region. Insights derived from this data allow them to compare wells in the same region for efficiency or productivity and even switch off certain wells if the oil price drops below profitability thresholds. This ability is especially important as the price of oil continues to drop. At $70/barrel, many operators go into the red, while more efficient, data-driven operators can remain profitable at $40/barrel. So efficiency is critical across a system of wells.
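
To make the idea of switching off wells below a profitability threshold concrete, here is a minimal sketch in Python. The well names, breakeven figures, and decision rule are illustrative assumptions, not the operator’s actual model.

```python
# Hypothetical illustration: decide which wells to keep producing at a given oil price.
# Well names and breakeven costs are made up for the example.

wells = [
    {"name": "Well-A", "breakeven_per_barrel": 38.0, "barrels_per_day": 450},
    {"name": "Well-B", "breakeven_per_barrel": 62.5, "barrels_per_day": 300},
    {"name": "Well-C", "breakeven_per_barrel": 44.0, "barrels_per_day": 520},
]

def production_plan(wells, market_price):
    """Split wells into 'produce' and 'shut in' based on per-barrel economics."""
    produce = [w for w in wells if w["breakeven_per_barrel"] <= market_price]
    shut_in = [w for w in wells if w["breakeven_per_barrel"] > market_price]
    return produce, shut_in

if __name__ == "__main__":
    price = 50.0  # current market price per barrel (example value)
    produce, shut_in = production_plan(wells, price)
    daily_margin = sum((price - w["breakeven_per_barrel"]) * w["barrels_per_day"] for w in produce)
    print("Keep producing:", [w["name"] for w in produce])
    print("Shut in:", [w["name"] for w in shut_in])
    print(f"Estimated daily margin at ${price}/bbl: ${daily_margin:,.0f}")
```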

Using data to decide where to build wells in the first place

When constructing a fracking or oil sands well, you need more information on trends and formulas to extract oil from the ground. On a site with 100+ wells, for example, each one is slightly different because of water tables, ground structure, and the details of the geography. You need the right data, the right formula, and the right method to extract the oil at the best price without impacting the environment at the same time.

The right technology delivers the needed business advantage

Of course, technology has never been simple to implement. The company we are discussing has 1.2 petabytes of data to process, and this volume is only increasing. They are running fiber optic cables down into wells to gather data in real time. As a result, they are receiving vast amounts of real-time data but cannot store and analyze that volume of data efficiently in conventional systems. Meanwhile, the time to aggregate and run reports can miss the window of opportunity while increasing cost. Making matters worse, this company had many different varieties of data. It also turns out that quite a bit of the useful information in their data sets was in the comments section of their source application, so traditional data warehousing would not help them extract the information they really need. They decided to move to new technology, Hadoop. But even seemingly simple problems, like getting access to data, were an issue within Hadoop. If you didn’t know the right data analyst, you might not get the data you needed in a timely fashion. Compounding things, a lack of Hadoop skills in Oklahoma proved to be a real problem.

The right technology delivers the right capability

The company had been using a traditional data warehousing environment for years, but they needed help to deal with their Hadoop environment. This meant dealing with the volume, variety, and quality of their source well data. They needed a safe, efficient way to integrate all types of data on Hadoop at any scale without having to learn the internals of Hadoop. Early adopters of Hadoop and other big data technologies have had no choice but to hand-code using Java or scripting languages such as Pig or Hive. Hiring and retaining big data experts proved time-consuming and costly. This is because data scientists and analysts can spend only 20 percent of their time on data analysis and the rest on the tedious mechanics of data integration such as accessing, parsing, and managing data. Fortunately for this oil producer, it didn’t have to be this way. They were able to avoid the specialized coding normally required to scale performance on distributed computing platforms like Hadoop. Additionally, they were able to “Map Once, Deploy Anywhere,” knowing that even as technologies change they can run data integration jobs without having to rebuild data processing flows.
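
For a sense of the hand-coded work the post says these tools let the producer avoid, here is a minimal PySpark sketch that pulls a structured reading out of a free-text comments field. The file path, column names, and regular expression are assumptions for illustration only, not the company’s actual pipeline, and PySpark stands in here for the Java, Pig, or Hive coding the post mentions.

```python
# Hypothetical illustration of hand-coded integration on Hadoop with PySpark.
# Assumes well readings land as CSV with "well_id" and a free-text "comments" column.
from pyspark.sql import SparkSession
from pyspark.sql.functions import regexp_extract, col

spark = SparkSession.builder.appName("well-comments-parse").getOrCreate()

readings = spark.read.csv("hdfs:///data/wells/readings.csv", header=True)  # assumed path

# Example: technicians noted pressure inside comments like "... pressure 3150 psi ..."
parsed = readings.withColumn(
    "pressure_psi",
    regexp_extract(col("comments"), r"pressure\s+(\d+)\s*psi", 1).cast("double"),
)

# Keep only rows where a pressure value could be recovered, then aggregate per well.
parsed.filter(col("pressure_psi").isNotNull()) \
      .groupBy("well_id") \
      .avg("pressure_psi") \
      .show()
```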

Final remarks

It seems clear that we live in an era where data is at the center of just about every business. Data-ready enterprises are able to adapt and win regardless of changing market conditions. These businesses invested in building their enterprise analytics capability before market conditions changed. In this case, these oil producers will be able to produce oil at lower cost than others within their industry. Analytics provides three benefits to oil producers:

  • Better margins and lower costs from operations
  • Lower risk of environmental impact
  • Less time to build a successful well

In essence, those that build analytics as a core enterprise capability will continue to have a right to win within a dynamic oil pricing environment.

Related links

Related Blogs

Analytics Stories: A Banking Case Study
Analytics Stories: A Financial Services Case Study
Analytics Stories: A Healthcare Case Study
Who Owns Enterprise Analytics and Data?
Competing on Analytics: A Follow Up to Thomas H. Davenport’s Post in HBR
Thomas Davenport Book “Competing On Analytics”

Solution Brief: The Intelligent Data Platform
Author Twitter: @MylesSuer


There are Three Kinds of Lies: Lies, Damned lies, and Data


The phrase Benjamin Disraeli used in the 19th century was: “There are three kinds of lies: lies, damned lies, and statistics.”

Not so long ago, Google created a website to figure out just how many people had influenza. It did this by tracking “flu-related search queries” and the “location of the query,” and applying them to an estimation algorithm. According to the website, at the flu season’s peak in January, nearly 11 percent of the United States population may have had influenza. This means that nearly 44 million of us would have had the flu or flu-like symptoms. In its weekly report, the Centers for Disease Control and Prevention put this at 5.6%, which means that fewer than 23 million of us actually went to the doctor’s office to be tested for flu or to get a flu shot.

Now, imagine if I were a drug manufacturer planning production around the higher estimate. There is a theory about what went wrong: the problem may have been widespread media coverage of that year’s flu season, amplified by social media, which helped news of the flu spread quicker than the virus itself. In other words, the algorithm was looking only at the numbers, not at the context of the search results.

In today’s digitally connected world, data is everywhere: in our phones, search queries, friendships, dating profiles, cars, food, and reading habits. Almost everything we touch is part of a larger data set. The people and companies that interpret the data may fail to apply background and outside conditions to the numbers they capture.

Now, while we build our big data repositories, we have to spend some time explaining how we collected the data and in what context.

Twitter @bigdatabeat


Half of Healthcare Execs Have Analytics Initiatives


In a recent report sponsored by GE Healthcare and posted on ModernHealthcare.com, only 49% of healthcare executives say they have an established strategy for Big Data or a specific data and analytics initiative. With the move to electronic medical and health records, I would have thought this number would be higher given the access to a gold mine of data. I was also disappointed that, of those who do have an initiative, 42% said they were unsure whether the organization had benefited from applying analytics so far!

I understand that fighting for budget and time to implement analytics is a challenge with all the changes happening in healthcare (ICD-10, M&A, etc.).  But hospitals using analytics to drive Value-based care are leading healthcare reform and setting a higher bar for quality of service.  Value-based care promises quicker recoveries, fewer readmissions, lower infection rates, and fewer medical errors – something we all want as consumers.

In order to truly achieve value-based care, analytics is a must have. If you are looking for the business case or inspiration for the business driver, here are a few ideas:

  • In surgery, do you have the data to show how many patients had lower complication rates and higher long-term survival rates? Do you have that data across the different surgical procedures you offer?
  • Do you have data to benchmark your practice quality? How do you compare to other practices in terms of infection rates? Can you use that data to promote your services from a marketing perspective?
  • Do you know how much a readmission is costing your hospital?
  • From a finance perspective, have you adopted best practices from other industries with respect to supply-chain management or cost optimization strategies?

If you don’t have the expertise, there are plenty of consulting organizations that specialize in implementing analytics and can provide the insight needed to make the transition to value-based care and pricing.

We are always going to be facing limited budgets, the day will always have 24 hours in it, and organizations are constantly changing as new leaders take over with different agendas. But one thing is certain: a decision without data is just someone’s opinion. In healthcare, with only half of executives making decisions based on analytics, maybe we should all be asking for a second opinion – one based on data.


Jumping on the Internet of Things (IoT) Band Wagon?


There is a new “Band Wagon” out there and it’s not Big Data. If you were at this year’s CES show this past week, it would have been impossible, even with a “Las Vegas-size” hangover, not to have heard the hype around the Internet of Things (IoT). The Internet of Things includes anything and everything that is connected to the Internet and able to communicate and share information with other “smart” devices. This year, as well as last, it was about home appliances, fitness and health monitors, home security systems, Bluetooth-enabled toothbrushes, sensors in shoes that monitor weight and mileage, thermostats that monitor humidity and sound, and kitchen utensils that can track and monitor the type of food you cook and eat.

If you ask me, all these devices and the IoT movement are both cool and creepy. Cool in the sense that networking technology has both matured and become affordable enough for devices to transmit data that companies can turn into actionable intelligence. Creepy in the sense of: do I really want someone monitoring what I cook or how many times I wake up at night? Like other hype cycles or band wagons, there are different opinions as to the size of the IoT market. Gartner expects it to include nearly 26 billion devices, with a “global economic value-add” of $1.9 trillion by 2020. The question is whether the Internet of Things is truly transformational to our daily lives. The answer to that really depends on being able to harness all that data into information. Just because my new IoT toothbrush can monitor and send data on how many times I brush my teeth, it doesn’t provide any color on whether that makes me healthier or gives me a prettier smile :).

To help answer these questions, here are examples and potential use cases of leveraging all that Big Data from Small devices of the IoT world:

  • Mimo’s Smart Baby Monitor is aimed at helping to prevent SIDS. It is a new kind of infant monitor that provides parents with real-time information about their baby’s breathing, skin temperature, body position, and activity level on their smartphones.
  • GlowCaps fit prescription bottles and, via a wireless chip, provide services that help people stick with their prescription regimen, from reminder messages all the way to refill and doctor coordination.
  • BeClose offers a wearable alarm button and other discreet wireless sensors placed around the home; the BeClose system can track your loved one’s daily routine and give you peace of mind about their safety by alerting you to any serious disruptions detected in their normal schedule.
  • Postscapes showcases smart gardening technology in which a suite of sensors and web connectivity helps save you time and resources by keeping plants fed based on their actual growing needs and conditions while automating much of the labor process.
  • The OnFarm solution combines real-time sensor data on soil moisture levels, weather forecasts, and pesticide usage from farming sites into a consolidated web dashboard. Farmers can use this data with advanced imaging and mapping information to spot crop issues and remotely monitor all of the farm’s assets and resource usage levels.
  • Banks and auto lenders are using cellular GPS units that report the location and usage of financed cars and can even lock the ignition to prevent further movement in the case of default.
  • Sensors on farm equipment now provide real-time intelligence on how many hours tractors are used and on weather conditions, helping predict mechanical problems and measure the productivity of the farmer to predict trends in the commodity market.

I can see a number of other potential use cases for IoT including:

  • Health devices not only sending data but also receiving data from other IoT devices, providing real-time recommendations on workout routines based on data from real-time weather sensors, food intake from kitchen devices, and nutritional information based on vitamins and medications consumed by the wearer.
  • Credit card banks leveraging GPS tracking data from auto loan customers and combining it with credit card data to deliver real-time offers on merchant promotions while customers are on the road.
  • GPS tracking devices on hotel card keys that track where you go, eat, and are entertained, in order to deliver more customized services and offers while one is on a business trip or vacation.
  • Boxing gloves transmitting the impact and force of a punch to monitor for athlete concussions.

What does this all mean?

The Internet of Things has changed the way we live and do business and will continue to shape the future, hopefully in a positive way. Harnessing all of that Big Data from Small devices does not come easily. Every device that generates data sends it to some central system through a WiFi or cellular network. Once in that central system, the data needs to be accessed, translated, transformed, cleansed, and standardized for business use alongside data from other systems that run the business. For example:

  • Access, transform, and validate data from IoT devices alongside data generated by other business applications. Formats and values will often be different and will change over time, and they need to be rationalized and standardized for downstream business use. Otherwise, you end up with a bunch of alphas and numerics that make no sense.
  • Data quality and validation: Just because a sensor can send data, it does not mean it will send the right data, or data that is right for a business user trying to make sense of it. GPS data requires accurate coordinate data. If any value is transmitted incorrectly, it is important to identify those errors and, more importantly, correct them so the business can take action. This is especially important when combining like values (e.g., weather status = Cold, Wet, Hot while the device is sending A, B, C) – see the sketch after this list.
  • Shared with other systems: Once your data is ready to be consumed by new and existing analytic applications, marketing systems, CRM, or your fraud surveillance systems, it needs to be available in real time if required, in the right format and structure required by those applications, and in a way that is seamless, automated, and does not require heavy IT lifting.
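
As a minimal sketch of the kind of translation and validation described above, here is a small Python example. The device code mapping, field names, and coordinate checks are illustrative assumptions, not any specific product’s behavior.

```python
# Hypothetical illustration: cleanse and standardize raw IoT readings before sharing downstream.

# Assumed mapping from device codes to business-friendly weather status values.
WEATHER_CODE_MAP = {"A": "Cold", "B": "Wet", "C": "Hot"}

def standardize_reading(raw):
    """Translate codes, validate GPS coordinates, and flag records that need correction."""
    issues = []

    # Translate the device's raw weather code into the value business users expect.
    status = WEATHER_CODE_MAP.get(raw.get("weather_code"))
    if status is None:
        issues.append(f"unknown weather code: {raw.get('weather_code')!r}")

    # Basic GPS validation: coordinates must exist and fall within valid ranges.
    lat, lon = raw.get("lat"), raw.get("lon")
    if lat is None or lon is None or not (-90 <= lat <= 90) or not (-180 <= lon <= 180):
        issues.append(f"invalid coordinates: ({lat}, {lon})")

    return {
        "device_id": raw.get("device_id"),
        "weather_status": status,
        "lat": lat,
        "lon": lon,
        "issues": issues,  # downstream systems can route these for correction
    }

if __name__ == "__main__":
    sample = [
        {"device_id": "sensor-01", "weather_code": "A", "lat": 36.1, "lon": -97.0},
        {"device_id": "sensor-02", "weather_code": "X", "lat": 999.0, "lon": -97.0},
    ]
    for record in sample:
        print(standardize_reading(record))
```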

In closing, IoT’s future is bright, along with the additional insights gained from all that data. Consider it cool or creepy, one thing is for sure: the IoT band wagon is in full swing!


8 Information Quality Predictions for 2015 And Beyond


Andy Hayler of Information Difference wrote in October last year that it’s been 10 years since the master data management (MDM) industry emerged. Andy sees MDM technology maturing and project success rates rising. He concluded that MDM has moved past its infancy and has a promising future as it is approaching its teenage years.

The last few months have allowed me to see MDM, data quality, and data governance from a completely different perspective. I sat with other leaders here at Informatica and analysts who focus on information quality, and spent time talking to our partners who work closely with customers on data management initiatives. As we collectively attempted to peer into the crystal ball and forecast what will be hot – and what will not – this year and beyond for MDM and data quality, here are a few top predictions that stood out.

1. MDM will become a single platform for all master entities
“The classical notion of boundaries that existed where we would say, this is MDM versus this is not MDM is going to get blurred,” says Dennis Moore – SVP, Information Quality Solutions (IQS), Informatica. “Today, we master a fairly small number of attributes in MDM. Rather than only mastering core attributes, we need to master business level entities, like customer, product, location, assets, things, etc., and combine all relevant attributes into a single platform which can be used to develop new “data fueled” applications. This platform will allow mastering of data, aggregate data from other sources, and also syndicate that data out into other systems.”

Traditionally MDM was an invisible hub that was connected to all the spokes. Instead, Dennis says – “MDM will become more visible and will act as an application development platform.”

2. PIM is becoming a more integrated environment that covers all information about products and related data in a single place
More and more customers want a single interface that allows them to manage all product information. Along with managing a product’s length, width, height, color, cost, etc., they probably want to see data about the history, credit rating, previous quality rating, sustainability scorecard, returns, credits, and so on. Dennis says – “All the product information in one place helps make better decisions with embedded analytics, giving answers to questions such as:

  • What were my sales last week?
  • Which promotions are performing well and poorly?
  • Which suppliers are not delivering on their SLAs?
  • Which stores aren’t selling according to plan?
  • How are the products performing in specific markets?”

Essentially, PIM will become a sovereign supplier of the product data that goes into your catalog and ecommerce system and that will be used by merchandisers, buyers, and product and category managers. It will become the buyer’s guide and a desktop for the person whose job is to figure out how to effectively promote products to meet sales targets.

3. MDM will become an integral part of big data analytics projects
“Big data analytics suffers from the same challenges as traditional data warehouses – bad data quality produces sub-optimal intelligence. MDM has traditionally enabled better analysis and reporting with high quality master data. Big data analytics will also immensely benefit from MDM’s most trustworthy information.” – Said Ravi Shankar – VP of Product Marketing, MDM, Informatica

Naveen Sharma, who heads the Enterprise Data Management practice at Cognizant, reemphasized what I heard from Dennis. He says – “With big data and information quality coming together, some of the boundaries between a pure MDM system and a pure analytical system will start to soften”. Naveen explains – “MDM is now seen as an integral part of big data analytics projects and it’s a huge change from a couple of years ago. Two of the large retailers we work with are going down the path of trying to bring not only the customer dimension but the associated transactional data to derive meaning into an extended MDM platform. I see this trend continuing in 2015 and beyond with other verticals as well.”

4. Business requirements are leading to the creation of solutions
There are several business problems being solved by MDM, such as improving supplier spend management and collaboration with better supplier data. Supply chain, sourcing and procurement teams gain significant cost savings and a boost in productivity by mastering supplier, raw materials and product information and fueling their business and analytical applications with that clean, consistent and connected information. Jakki Geiger, Senior Director of IQS Solutions Marketing at Informatica says, “Business users want more than just the underlying infrastructure to manage business-critical data about suppliers, raw materials, and products. They want to access this information directly through a business-friendly user interface. They want a business process-driven workflow to manage the full supplier lifecycle, including: supplier registration, qualification, verification, onboarding and off-boarding. Instead of IT building these business-user focused solutions on top of an MDM foundation, vendors are starting to build ready-to-use MDM solutions like the Total Supplier Relationship solution.” Read more about Valspar’s raw materials spend management use case.

5. Increased adoption of matching and linking capabilities on Hadoop 
“Many of our customers have significantly increased the amount of data they want to master,” says Dennis Moore. The days when tens of millions of master records were a lot are long gone; having hundreds of millions of master records and billions of source records is becoming almost common. An increasing number of master data sources – internal and external to the organization – are contributing significantly to the rise in data volumes. To accommodate these increasing volumes, Dennis predicts that large enterprises will look at running complex matching and linking capabilities on Hadoop – a cost-effective and flexible way to analyze large amounts of data.
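
To give a flavor of what matching and linking involves at scale, here is a tiny, deliberately simplified Python sketch of blocking and fuzzy matching on customer records. Real MDM match engines, and distributed implementations on Hadoop or Spark, are far more sophisticated; the field names and threshold are assumptions for illustration.

```python
# Hypothetical illustration of matching/linking: block records by a cheap key,
# then fuzzy-compare names only within each block to avoid an all-pairs comparison.
from difflib import SequenceMatcher
from collections import defaultdict

records = [
    {"id": 1, "name": "Jon Smith",  "zip": "73101"},
    {"id": 2, "name": "John Smith", "zip": "73101"},
    {"id": 3, "name": "Jane Doe",   "zip": "10001"},
]

def block_key(record):
    """Cheap blocking key: postal code plus first letter of the name."""
    return (record["zip"], record["name"][:1].upper())

def similarity(a, b):
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

blocks = defaultdict(list)
for r in records:
    blocks[block_key(r)].append(r)

matches = []
for block in blocks.values():
    for i in range(len(block)):
        for j in range(i + 1, len(block)):
            if similarity(block[i]["name"], block[j]["name"]) >= 0.85:  # assumed threshold
                matches.append((block[i]["id"], block[j]["id"]))

print("Candidate matches:", matches)  # e.g. [(1, 2)]
```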

6. Master insight management is going to be next big step
“MDM will evolve into master insight management as organizations try to relate trusted data they created in MDM with transactional and social interaction data,” said Rob Karel – VP of Product Strategy and Product Marketing, IQS, Informatica. “The innovations in machine and deep learning techniques will help organizations such as healthcare prescribe next best treatment based on history of patients, retailers suggest best offers based on customer interest and behavior, public sector companies will see big steps in social services, etc.”

Rob sees MDM at the heart of this innovation bringing together relevant information about multiple master entities and acting as a core system for insight management innovations.

7. MDM and Data Governance
Aaron Zornes, chief research officer at the MDM Institute, predicts that in 2014-15, vendor MDM solutions will move from “passive-aggressive” mode to “proactive” data governance mode. Data governance for MDM will move beyond simple stewardship to a convergence of task management, workflow, policy management, and enforcement, according to Aaron.

8. The market will solidify for cloud based MDM adoption
Aaron says – “Cloud-innate services for DQ and DG will be more prevalent; however, enterprise MDM will remain on premise with increasing integration to cloud applications in 2015.”

Naveen sees a lot of synergy around cloud-based MDM offerings and says – “The market is solidifying for MDM on cloud but the flood gates are yet to open”. Naveen does not see any reason why the MDM market will not go to the cloud and gives the example of CRM, which was at a similar junction before Salesforce came into play. Naveen sees a similar shift for MDM and says – “The fears companies have about their data security on cloud are eventually going to fade. If you look closely at any of the recent breaches, these all involved hacks into company networks and not into cloud provider networks. The fact that cloud service providers spend more dollars on data security than any one company can spend on their on-premise security layer will be a major factor affecting the transition”. Naveen expects that the big players in MDM will include cloud offerings as part of their toolkit in the coming years.

Ravi also predicts an increase in cloud adoption for MDM in the future, as the concern about placing master data in the cloud lessens given the strong security provided by cloud vendors.

So, what do you predict? I would love to hear your opinions and comments.

~Prash
@MDMGeek
www.mdmgeek.com


Data Security – A Major Concern in 2015


2014 ended with a ton of hype, expectations, and some drama if you are a data security professional or a business executive responsible for shareholder value. The recent attacks on Sony Pictures by North Korea during December caught everyone’s attention, not so much about whether Sony would release “The Interview” but about how vulnerable we as a society are to these criminal acts.

I have to admit, I was one of those who saw the movie. I found the film humorous, to say the least, and can see why a desperate regime like North Korea would not want their leader portrayed admitting he loves margaritas and Katy Perry. What concerned me about the whole event was whether these unwanted security breaches are now just a fact of life. As a disclaimer, I have no stake in the downfall of the North Korean government; however, what transpired was fascinating, and it is amazing that companies like Sony continue to struggle to protect sensitive data despite being one of the largest companies in the world.

According to the Identity Theft Resource Center, there were 761 reported data security breaches in 2014, impacting over 83 million breached records across industries and geographies, with B2B and B2C retailers leading the pack with 79.2% of all breaches. Most of these breaches originated over the internet via malicious worms and viruses purposely designed to identify and relay back sensitive information, including credit card numbers, bank account numbers, and social security information, used by criminals to wreak havoc and cause significant financial losses to merchants and financial institutions. According to the 2014 Ponemon Institute research study:

  • The average cost of cyber-crime per company in the US was $12.7 million this year, according to the Ponemon report, and US companies on average are hit with 122 successful attacks per year.
  • Globally, the average annualized cost for the surveyed organizations was $7.6 million per year, ranging from $0.5 million to $61 million per company. Interestingly, small organizations have a higher per-capita cost than large ones ($1,601 versus $437), the report found.
  • Some industries incur higher costs in a breach than others, too. Energy and utility organizations incur the priciest attacks ($13.18 million), followed closely by financial services ($12.97 million). Healthcare incurs the fewest expenses ($1.38 million), the report says.

Despite all the media attention around these awful events last year, 2015 does not seem like it’s going to get any better. According to CNBC just this morning, Morgan Stanley reported a data security breach in which it fired an employee who it claims stole account data for hundreds of thousands of its wealth management clients. Stolen information for approximately 900 of those clients was posted online for a brief period of time. With so much to gain from this rich data, businesses across industries have a tough battle ahead of them, as criminals are getting more creative and desperate in stealing sensitive information for financial gain. According to Forrester Research, the top 3 breach activities included:

  • Inadvertent misuse by insider (36%)
  • Loss/theft of corporate asset (32%)
  • Phishing (30%)

Given the growth in data volumes fueled by mobile, social, cloud, and electronic payments, the war against data breaches will continue to grow bigger and uglier for firms large and small. As such, Gartner predicts investments in information security solutions will grow a further 8.2 percent in 2015 vs. 2014, reaching $76.9+ billion globally. Furthermore, by 2018, more than half of organizations will use security services firms that specialize in data protection, security risk management, and security infrastructure management to enhance their security postures.

Like any war, you have to know your enemy and what you are defending. In the war against data breaches, this starts with knowing where your sensitive data is before you can effectively defend against any attack. According to the Ponemon Institute, only 18% of firms surveyed said they knew where their structured sensitive data was located, whereas the rest were not sure. 66% revealed that they would not be able to effectively know if they were attacked. Even worse, 47% were not confident they had visibility into users accessing sensitive or confidential information, and 48% of those surveyed admitted to a data breach of some kind in the last 12 months.

In closing, the responsibilities of today’s information security professionals, from chief information security officers to security analysts, are challenging and growing each day as criminals become more sophisticated and desperate to get their hands on one of your most important assets: your data. As your organization looks to invest in new information security solutions, make sure you start with solutions that allow you to identify where your sensitive data is, so you can plan an effective data security strategy that defends both your perimeter and your sensitive data at the source. How prepared are you?
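
As a minimal sketch of what “knowing where your sensitive data is” can mean in practice, here is a small Python example that scans sample records for patterns resembling credit card and U.S. social security numbers. The patterns and data are illustrative assumptions; real discovery and masking tools do far more (validation, context analysis, profiling, and masking policies).

```python
# Hypothetical illustration: flag fields that look like they contain sensitive data.
import re

PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){15}\d\b"),   # 16 digits, optional separators
    "ssn":         re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),   # US SSN format 123-45-6789
}

def scan_record(record):
    """Return a list of (field, pattern_name) pairs that look sensitive."""
    findings = []
    for field, value in record.items():
        for name, pattern in PATTERNS.items():
            if isinstance(value, str) and pattern.search(value):
                findings.append((field, name))
    return findings

if __name__ == "__main__":
    sample = {
        "customer": "Jane Doe",
        "notes": "Paid with card 4111 1111 1111 1111",
        "tax_id": "123-45-6789",
    }
    for field, kind in scan_record(sample):
        print(f"Field '{field}' may contain {kind} data")
```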

For more information about Informatica Data Security Solutions:

  • Download the Gartner Data Masking Magic Quadrant Report
  • Click here to learn more about Informatica’s Data Masking Solutions
  • Click here to access Informatica Dynamic Data Masking: Preventing Data Breaches with Benchmark-Proven Performance whitepaper

How to Get the Biggest Returns from Your Hadoop and Big Data Investments in 2015

2014 was the year that Big Data went mainstream, from conversations asking “What is Big Data?” to “How do we harness the power of Big Data to solve real business problems?”. It seemed like everyone jumped on the Big Data band wagon, from new software start-ups offering the “next generation” of predictive analytic applications to traditional database, data quality, business intelligence, and data integration vendors, all calling themselves Big Data providers. The truth is, they all play a role in this Big Data movement.

Earlier in 2014, Wikibon estimated that the Big Data market is on pace to top $50 billion in 2017, which translates to a 38% compound annual growth rate over the six-year period from 2011 (the first year Wikibon sized the Big Data market) to 2017. Most of the excitement around Big Data has been around Hadoop, as early adopters who experimented with open source versions quickly grew to adopt enterprise-class solutions from companies like Cloudera™, Hortonworks™, MapR™, and Amazon’s Redshift™ to address real-world business problems including: (more…)


Imagine A New Sheriff In Town

As we renew or reinvent ourselves for 2015, I wanted to share a case of “imagine if” with you and combine it with the narrative of an American frontier town out West trying to find a new Sheriff – a Wyatt Earp. In this case, the town is a legacy European communications firm, and Wyatt and his brothers are the new managers – the change agents.

Is your new management posse driving change?

Here is a positive word upfront. This operator has had some success in rolling out broadband internet and IPTV products to residential and business clients to replace its dwindling copper install base. But they are behind the curve on the wireless penetration side due to the number of smaller, agile MVNOs and two other multi-national operators with a high density of brick-and-mortar stores, excellent brand recognition, and support infrastructure. Having more than a handful of brands certainly did not make this any easier for our CSP. To make matters even more challenging, price pressure is increasingly squeezing all operators in this market. The ones able to offset the high-cost Capex for spectrum acquisitions and upgrades with lower-cost Opex for running the network and maximizing subscriber profitability will set themselves up for success (see one of my earlier posts about the same phenomenon in banking).

Not only did they run every single brand on a separate CRM and billing application (including all the various operational and analytical packages), they also ran nearly every customer-facing service (CFS) within a brand the same dysfunctional way. In the end, they had over 60 CRM applications and the same number of billing applications across all copper, fiber, IPTV, SIM-only, mobile residential, and business brands. Granted, this may be a quite excessive example; but nevertheless, it is relevant for many other legacy operators.

As a consequence, their projections indicate they incur over €600,000 annually in maintaining duplicate customer records (ignoring duplicate base product/offer records for now) due to excessive hardware, software and IT operations.  Moreover, they have to stomach about the same amount for ongoing data quality efforts in IT and the business areas across their broadband and multi-play service segments.

Here are some more benefits they projected:

  • €18.3 million in call center productivity improvement
  • €790,000 improvement in profit due to reduced churn
  • €2.3 million reduction in customer acquisition cost
  • And if you include fixing duplicate and conflicting product information, add another €7.3 million in profit via billing error and discount reduction (which is in line with our findings from a prior telco engagement)

Even though major business areas had not yet contributed to the investigation, and the improvement estimates were often on the conservative side, they projected a 14:1 ratio between overall benefit amount and total project cost.
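
As a rough back-of-the-envelope check, summing the projected figures listed above gives a sense of what a 14:1 benefit-to-cost ratio would imply for project cost. The time horizon over which the benefits are counted and the exact items included in the ratio are not stated in the post, so this is purely illustrative.

```python
# Purely illustrative arithmetic based on the figures quoted above (in EUR millions).
benefits = {
    "call center productivity improvement": 18.3,
    "profit from reduced churn": 0.79,
    "reduced customer acquisition cost": 2.3,
    "billing error and discount reduction": 7.3,
    "duplicate customer record maintenance avoided": 0.6,   # ~EUR 600,000 annually
    "ongoing data quality effort avoided": 0.6,             # "about the same amount"
}

total_benefit = sum(benefits.values())
benefit_to_cost_ratio = 14  # the 14:1 ratio the operator projected
implied_project_cost = total_benefit / benefit_to_cost_ratio

print(f"Total projected benefit: ~EUR {total_benefit:.1f}M")
print(f"Implied total project cost at 14:1: ~EUR {implied_project_cost:.1f}M")
```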

Coming back to the “imagine if” aspect now, one would ask how this behemoth of an organization can be fixed. Well, it will take years, and without new management (in this case new managers busting through the door), this organization runs the risk of becoming the next Rocky Mountain mining ghost town.

Busting into the cafeteria with new ideas & looking good while doing it?

The good news is that this operator is seeing some management changes now. The new folks have a clear understanding that business-as-usual won’t do going forward and that centralization of customer insight (which includes some data elements) has distinct advantages. They will tackle new customer analytics, order management, operational data integration (network), and next-best-action use cases incrementally. They know they are in the data business, not just the communication business. They realize they have to show a rapid succession of quick wins rather than make the organization wait a year or more for first results. As a result, they have fairly humble initial requirements to get going.

You can equate this to the new Sheriff not going after the whole organization of the three, corrupt cattle barons, but just the foreman of one of them for starters.  With little cost involved, the Sheriff acquires some first-hand knowledge plus he sends a message, which will likely persuade others to be more cooperative going forward.

What do you think? Is new management the only way to implement drastic changes around customer experience, profitability or at least understanding?


What is the Role of the CIO in Driving Enterprise Analytics?

When you talk to CIOs today about their business priorities, at the top of their list is better connecting what IT is doing to business strategy. Or, put another way, it is about establishing business/IT alignment. One area where CIOs need to make sure there is better alignment is enterprise analytics. CIOs that I have talked to share openly that business users are demanding the ability to reach their apps and data anywhere and on any device. For this reason, even though CIOs say they have an interest in the mechanisms of data delivery–data integration, data cleanliness, data governance, data mastering, and even metadata management–they would not take a meeting on these topics. The reason is that CIOs say they would need to involve their business partners in these meetings. CIOs these days want you to have a business value proposition. Given this, CIOs say they want to hear about what the business wants to hear about:

  • Enabling new, valuable business insights from data to happen faster
  • Enabling their businesses to compete with analytics

CIOs as an analytics proponent versus the analytics customer

So if the question is about competing with analytics, what role does the CIO have in setting the agenda here? Tom Davenport says that CIOs–as I heard in my own conversations with CIOs–have good intentions when it comes to developing an enterprise information strategy. They can see the value of taking an enterprise versus a departmental view. Tom suggests, however, that CIOs should start by focusing upon the analytics that will matter most to the business. He says that IT organizations should also build an IT infrastructure capable of delivering the information and analytics that people across the enterprise need, not just now but also in the future.

Tom says that IT organizations must resist the temptation to provide analytics on an add-on or bolt-on basis for whatever transaction system has just been developed. As a product manager, I had a development team that preferred to add analytics by source rather than do the hard work of creating integrative measures that crossed sources, so I know this problem firsthand. Tom believes that IT needs to build a platform that can be standardized and integrate data from more than one source. This includes the ability to adapt as business needs and business strategies change.

Making this an Enterprise Analytics Capability

In the early stages of analytics, IT organizations need to focus more upon a self-service approach. But as the business matures at analytics, Tom says that IT needs to shift gears and become a proactive advocate and architect of change. Tom says that IT should be a part owner of the company’s analytical capabilities. IT managers, therefore, must understand and be able to articulate the potential for analytics created at an enterprise level. At the same time, the IT staff–which often lacks the heavy mathematical backgrounds of analysts–needs to be able to interact with the analytics pros who use and consume the information that IT creates to build models. I faced this dilemma firsthand when my analytics modelers were disconnected from BI product developers. They were two different communities working on our project. And although some modelers can build apps or even a BI system, what excites them most in life is building new analytical models.

Talk the language of the business

Tom Davenport says that IT managers can make their own lives easier with the business and with analysts by discussing decision making, insights, and business performance instead of cloud computing, service-oriented architecture, or even OLAP. Meanwhile, Tom feels that the enterprise analytics journey starts with good, integrated data on transactions and business processes, managed through enterprise applications like ERP and CRM systems (Analytics at Work, Thomas Davenport, Harvard Business Review Press, page 51).

Focusing on the big questions and the right problems

Clearly, driving the business to focus on the big questions and the right problems is critical. IT cannot do this alone, but it can facilitate it. Why does it matter? An Accenture study found that “companies that derived any real value from them (their analytics) had anticipated how to leverage the information to generate new insights to improve business performance” (“Using Enterprise Systems to Gain Uncommon Competitive Advantage”, Accenture, page 3). This is critical, and too few organizations succeed in doing it.

With this accomplished, and to achieve the second goal, IT needs to eliminate legacy BI systems and old spaghetti code as well as siloed data marts. The goal should be to replace them with an enterprise analytics capability that answers the big questions. This requires standardization around an enterprise-wide approach that ensures a consistent approach to data management and provides an integrated environment complete with data repositories/data lakes, analytical tools, presentation applications, and transformation tools. This investment should be focused on improving business processes or providing the data needed for system-of-systems products. Tom says that IT’s job is to watch out for current and future users of information systems.

Parting Thoughts

So the question is: where is your IT organization today? Clearly, it is important that IT measure enterprise analytic initiatives as well. IT should measure adoption. IT should find out what is used and what is not. I had a CIO once admit to me that he did not know whether the data marts he currently supported were being used or even still had value. It is important that we have these answers. Clearly, being close to the business customer from the start can limit the kind of uncertainty this CIO described.

Related Blogs and Links

Analytics Stories: A Banking Case Study

Analytics Stories: A Financial Services Case Study

Analytics Stories: A Healthcare Case Study

Who Owns Enterprise Analytics and Data?

Competing on Analytics: A Follow Up to Thomas H. Davenport’s Post in HBR

Thomas Davenport Book “Competing On Analytics”

Solution Brief: The Intelligent Data Platform

Author Twitter: @MylesSuer

 
