Category Archives: Big Data

Is the Internet of Things relevant for the government?

Get connected. Be connected. Make connections. Find connections. The Internet of Things (IoT) is all about connecting people, processes, data and, as the name suggests, things. The recent social media frenzy surrounding the ALS Ice Bucket Challenge has certainly reminded everyone of the power of social media, the Internet and a willingness to answer a challenge. Fueled by personal and professional connections, the craze has transformed fundraising for at least one charity. Similarly, IoT could be transformational to the business of the public sector, should government step up to the challenge.



Government is struggling with the concept and reality of how IoT really relates to the business of government, and perhaps rightfully so. For commercial enterprises, IoT is far more tangible and simply more fun. Gaming consoles, televisions, watches, Google Glass, smartphones and tablets are all about delivering over-the-top, new and exciting consumer experiences. Industry is delivering transformational innovations that connect people to places, data and other people at a record pace.

It’s time to accept the challenge. Government agencies need to keep pace with their commercial counterparts and harness the power of the Internet of Things. The end game is not to deliver new, faster, smaller, cooler electronics; the end game is to create solutions that let devices connected to the Internet interact and share data, regardless of their location, manufacturer or format, and to make or find connections that may previously have been undetectable. For some, this concept is as foreign or scary as pouring ice water over their heads. For others, the new opportunity to transform policy, service delivery, leadership, legislation and regulation is fueling a transformation in government. And it starts with one connection.

One way to start could be linking previously siloed systems together or creating a golden record of all citizen interactions through a Master Data Management (MDM) initiative. It could start with a big data and analytics project to determine and mitigate risk factors in education or linking sensor data across multiple networks to increase intelligence about potential hacking or breaches. Agencies could stop waste, fraud and abuse before it happens by linking critical payment, procurement and geospatial data together in real time.
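To make that last idea concrete, here is a deliberately tiny, hypothetical sketch of the kind of cross-checking such a linkage enables: payment records are joined against procurement and vendor location data, and payments with no matching purchase order, a mismatched amount, or an unexpected origin are flagged. The data sources, field names and thresholds are invented for illustration and do not reflect any specific agency system.

```python
# Hypothetical sketch: flag suspicious payments by linking payment,
# procurement and geospatial records. All field names are illustrative.

payments = [
    {"payment_id": "P-1001", "po_number": "PO-552", "vendor_id": "V-17", "amount": 48000.0, "origin_state": "VA"},
    {"payment_id": "P-1002", "po_number": None,     "vendor_id": "V-17", "amount": 51000.0, "origin_state": "VA"},
    {"payment_id": "P-1003", "po_number": "PO-553", "vendor_id": "V-09", "amount": 12500.0, "origin_state": "TX"},
]
purchase_orders = {"PO-552": {"vendor_id": "V-17", "amount": 48000.0}}
vendor_locations = {"V-17": "VA", "V-09": "CA"}   # registered state per vendor

def flag_payment(p):
    """Return a list of reasons this payment looks suspicious, if any."""
    reasons = []
    po = purchase_orders.get(p["po_number"])
    if po is None:
        reasons.append("no matching purchase order")
    elif po["vendor_id"] != p["vendor_id"]:
        reasons.append("purchase order belongs to a different vendor")
    elif abs(po["amount"] - p["amount"]) > 0.10 * po["amount"]:
        reasons.append("payment differs from PO amount by more than 10%")
    if vendor_locations.get(p["vendor_id"]) != p["origin_state"]:
        reasons.append("payment origin does not match vendor's registered state")
    return reasons

for p in payments:
    issues = flag_payment(p)
    if issues:
        print(p["payment_id"], "->", "; ".join(issues))
```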

This is the Internet of Things for government. This is the challenge. This is transformation.

This article was originally published on www.federaltimes.com. Please view the original listing here

 


In a Data First World, IT must Empower Business Change!

You probably know this already, but I’m going to say it anyway: It’s time you changed your infrastructure. I say this because most companies are still running infrastructure optimized for ERP, CRM and other transactional systems. That’s all well and good for running IT-intensive, back-office tasks. Unfortunately, this sort of infrastructure isn’t great for today’s business imperatives of mobility, cloud computing and Big Data analytics.

Virtually all of these imperatives are fueled by information gleaned from potentially dozens of sources to reveal our users’ and customers’ activities, relationships and likes. Forward-thinking companies are using such data to find new customers, retain existing ones and increase their market share. The trick lies in translating all this disparate data into useful meaning. And to do that, IT needs to move beyond focusing solely on transactions, and instead shine a light on the interactions that matter to their customers, their products and their business processes.

They need what we at Informatica call a “Data First” perspective. You can check out my first blog post about being Data First here.

A Data First POV changes everything from product development, to business processes, to how IT organizes itself and — most especially — the impact IT has on your company’s business. That’s because cloud computing, Big Data and mobile app development shift IT’s responsibilities away from running and administering equipment and toward aggregating, organizing and improving myriad data types pulled in from internal and external databases, online posts and public sources. And that shift makes IT a more empowering force for business change. Think about it: The ability to connect and relate the dots across data from multiple sources finally gives you real power to improve entire business processes, departments and organizations.

I like to say that the role of IT is now “big I, little t,” with that lowercase “t” representing both technology and transactions. But that role requires a new set of priorities. They are:

  1. Think about information infrastructure first and application infrastructure second.
  2. Create great data by design. Architect for connectivity, cleanliness and security. Check out the eBook Data Integration for Dummies.
  3. Optimize for speed and ease of use – SaaS and mobile applications change often. Click here to try Informatica Cloud for free for 30 days.
  4. Make data a team sport. Get tools into your users’ hands so they can prepare and interact with it.

I never said this would be easy, and there’s no blueprint for how to go about doing it. Still, I recognize that a little guidance will be helpful. In a few weeks, Informatica’s CIO Eric Johnson and I will talk about how we at Informatica practice what we preach.


Competing on Analytics: A Follow Up to Thomas H. Davenport’s Post in HBR

If you ask a CIO today about the importance of data to their enterprise, they will likely tell you about the need to “compete on analytics” and to enable faster business decisions. At the same time, CIOs believe they “need to provide the intelligence to make better business decisions”. One CIO said it was in fact their personal goal “to get the business to a new place faster, to enable them to derive new business insights, and to get to the gold at the end of the rainbow”.

Similarly, another CIO said that Big Data and Analytics were her highest priorities. “We have so much knowledge locked up in the data, it is just huge. We need the data cleaning and analytics to pull this knowledge out of the data”. At the same time, the CIOs that we talked to see their organizations as “entering an era of ubiquitous computing where users want all data on any device when they need it.”

Why does faster, better data really matter to the enterprise?

So why does it matter? Thomas H. Davenport says, “at a time when firms in many industries offer similar products and use comparable technologies, business processes are among the last remaining points of differentiation.” A CIO that we talked to concurred, saying, “today, we need to move from ‘management by exception’ to ‘management by observation’”. Derek Abell expanded upon this idea when he said in his book Managing with Dual Strategies, “for control to be effective, data must be timely and provided at intervals that allow effective intervention”.

Davenport explains why timely data matters in this way: “analytics competitors wring every last drop of value from those processes”. Given this, “they know what products their customers want, but they also know what prices those customers will pay, how many items each will buy in a lifetime, and what triggers will make people buy more. Like other companies, they know compensation costs and turnover rates, but they can also calculate how much personnel contribute to or detract from the bottom line and how salary levels relate to individuals’ performance. Like other companies, they know when inventories are running low, but they can also predict problems with demand and supply chains, to achieve low rates of inventory and high rates of perfect orders”.

What then prevents businesses from competing on analytics?

Moving to what Davenport imagines requires more than a visualization tool. It involves fixing what is ailing IT’s systems. One CIO suggested this process can be thought of like an athlete building the muscles they need to compete. He said that businesses really need the same thing. In his eyes, data cleaning, data security, data governance, and master data management represent the muscles needed to compete effectively on analytics. Unless you do these things, you cannot truly compete on analytics. At UMass Memorial Health, for example, they “had four independent patient registration systems supporting the operations of their health system, with each of these having its own means of identifying patients, assigning medical record numbers, and recording patient care and encounter information”. As a result, “UMass lacked an accurate, reliable, and trustworthy picture of how many unique patients were being treated by its health system”. To fix things, UMass needed to “resolve patient, provider and encounter data quality problems across 11 source systems to allow aggregation and analysis of data”. Prior to fixing its data management, this meant that “UMass lacked a top-down, comprehensive view of clinical and financial performance across its extended healthcare enterprise”.
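To illustrate the kind of matching such an MDM effort involves, here is a toy sketch in which records from separate registration systems are grouped under one golden identifier when birth date and ZIP agree exactly and the names match fuzzily. The data and the match rule are invented for illustration; a production MDM platform would apply far richer, configurable match and survivorship logic across all 11 sources.

```python
# Illustrative sketch only: consolidate patient records from multiple
# registration systems into "golden" records via a simple match rule.
from difflib import SequenceMatcher

records = [
    {"source": "registration_a", "mrn": "A-001", "name": "Jonathan Q. Smith", "dob": "1962-03-14", "zip": "01605"},
    {"source": "registration_b", "mrn": "B-447", "name": "Jon Smith",         "dob": "1962-03-14", "zip": "01605"},
    {"source": "registration_c", "mrn": "C-310", "name": "Maria Alvarez",     "dob": "1987-11-02", "zip": "01609"},
]

def similar(a, b):
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def is_match(r1, r2):
    # Require exact birth date and ZIP, plus a fuzzy name match.
    return (r1["dob"] == r2["dob"]
            and r1["zip"] == r2["zip"]
            and similar(r1["name"], r2["name"]) > 0.6)

golden = []   # each entry: {"golden_id": ..., "members": [records]}
for rec in records:
    for g in golden:
        if is_match(g["members"][0], rec):
            g["members"].append(rec)
            break
    else:
        golden.append({"golden_id": f"G-{len(golden) + 1:04d}", "members": [rec]})

for g in golden:
    sources = [m["source"] + ":" + m["mrn"] for m in g["members"]]
    print(g["golden_id"], "<-", ", ".join(sources))
```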

UMass demonstrates how IT needs to fix its data management in order to improve the organization’s information intelligence and drive real and substantial business advantage. Fixing data management clearly involves delivering the good data that business users can safely use to make business decisions. It also involves ensuring that the data created is protected. CFOs that we have talked to say the Target breach was a watershed event for them—something that they expect will receive more and more auditing attention.

Once our data is good and safe, we need to connect current data sources to new data sources. And this needs to take less time than it did in the past. The delivery of data needs to happen fast enough that business problems can be recognized as they occur and solved before they become systemic. For this reason, users need access to data when and where it is needed.

With data management fixed, data intelligence is needed so that business users can make sense of things faster. Business users need to be able to search for and find data. They need self-service so they can combine existing and new unstructured data sources to test data interrelationship hypotheses. This means the ability to assemble data from different sources at different times. Simply put, this is all about data orchestration without any preconceived process. And lastly, they need the intelligence to automatically sense and respond to changes as new data is collected.

Some parting thoughts

The next question may be whether competing on data actually pays business dividends. Alvin Toffler says “Tiny insights can yield huge outputs”. In other words, the payoff can be huge. And those that do so will increasingly have the “right to win” against their competitors as they use information to wring every last drop of value from their business processes.

Related links

Solution Brief: The Intelligent Data Platform

Related Blogs

Is Big Data Destined To Become Small And Vertical?
Big Data Why?
The Business Case for Better Data Connectivity
What is big data and why should your business care?
Twitter: @MylesSuer


Malcolm Gladwell, Big Data and What’s to be Done About Too Much Information

Malcolm Gladwell wrote an article in The New Yorker magazine in January, 2007 entitled “Open Secrets.” In the article, he pointed out that a national-security expert had famously made a distinction between puzzles and mysteries.

New Yorker writer Malcolm Gladwell

Osama bin Laden’s whereabouts were, for many years, a puzzle. We couldn’t find him because we didn’t have enough information. The key to the puzzle, it was assumed, would eventually come from someone close to bin Laden, and until we could find that source, bin Laden would remain at large. In fact, that’s precisely what happened. Al-Qaida’s No. 3 leader, Khalid Sheikh Mohammed, gave authorities the nicknames of one of bin Laden’s couriers, who then became the linchpin of the CIA’s efforts to locate bin Laden.

By contrast, the problem of what would happen in Iraq after the toppling of Saddam Hussein was a mystery. It wasn’t a question that had a simple, factual answer. Mysteries require judgments and the assessment of uncertainty, and the hard part is not that we have too little information but that we have too much.

This was written before “Big Data” was a household word, and it raises the very interesting question of whether organizations and corporations that are, by anyone’s standards, totally deluged with data are facing puzzles or mysteries. Consider the amount of data that a company like Western Union deals with.

Western Union is a 160-year old company. Having built scale in the money transfer business, the company is in the process of evolving its business model by enabling the expansion of digital products, growth of web and mobile channels, and a more personalized online customer experience. Sounds good – but get this: the company processes more than 29 transactions per second on average. That’s 242 million consumer-to-consumer transactions and 459 million business payments in a year. Nearly a billion transactions – a billion! As my six-year-old might say, that number is big enough “to go to the moon and back.” Layer on top of that the fact that the company operates in 200+ countries and territories, and conducts business in 120+ currencies. Senior Director and Head of Engineering Abhishek Banerjee has said, “The data is speaking to us. We just need to react to it.” That implies a puzzle, not a mystery – but only if data scientists are able to conduct statistical modeling and predictive analysis, systematically noting trends in sending and receiving behaviors. Check out what Banerjee and Western Union CTO Sanjay Saraf have to say about it here.

Or consider General Electric’s aggressive and pioneering move into what has been dubbed the Industrial Internet. In a white paper entitled “The Case for an Industrial Big Data Platform: Laying the Groundwork for the New Industrial Age,” GE reveals some of the staggering statistics related to the industrial equipment that it manufactures and supports (services comprise 75% of GE’s bottom line):

  • A modern wind turbine contains approximately 50 sensors and control loops which collect data every 40 milliseconds.
  • A farm controller then receives more than 30 signals from each turbine at 160-millisecond intervals.
  • Every second, the farm monitoring software processes 200 raw sensor data points, with various associated properties, for each turbine.

Phew! I’m no electricity operations expert, and you probably aren’t either. And most of us will get no further than simply wrapping our heads around the simple fact that GE turbines are collecting a LOT of data. But what the paper goes on to say should grab your attention in a big way: “The key to success for this wind farm lies in the ability to collect and deliver the right data, at the right velocity, and in the right quantities to a wide set of well-orchestrated analytics.” The paper also recommends that anyone involved in the Industrial Internet revolution strongly consider its talent requirements, with the suggestion that Chief Data Officers and/or Data Scientists may be the next critical hires.
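Just to put those figures in perspective, here is a rough back-of-the-envelope calculation based on the numbers quoted above; the 100-turbine farm size is an assumption added purely for illustration.

```python
# Back-of-the-envelope data volume from the figures quoted above.
# The farm size (100 turbines) is an assumed, illustrative number.
sensors_per_turbine = 50
sample_interval_s = 0.040          # data collected every 40 milliseconds

readings_per_turbine_per_s = sensors_per_turbine / sample_interval_s   # 1,250
turbines_per_farm = 100                                                # assumption
farm_readings_per_day = readings_per_turbine_per_s * turbines_per_farm * 86_400

print(f"{readings_per_turbine_per_s:,.0f} raw readings per turbine per second")
print(f"{farm_readings_per_day:,.0f} raw readings per farm per day")
# -> 1,250 per turbine per second; 10,800,000,000 per farm per day
```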

Which brings us back to Malcolm Gladwell. In the aforementioned article, Gladwell goes on to pull apart the Enron debacle, and argues that it was a prime example of the perils of too much information. “If you sat through the trial of (former CEO) Jeffrey Skilling, you’d think that the Enron scandal was a puzzle. The company, the prosecution said, conducted shady side deals that no one quite understood. Senior executives withheld critical information from investors…We were not told enough—the classic puzzle premise—was the central assumption of the Enron prosecution.” But in fact, that was not true. Enron employed complicated – but perfectly legal – accounting techniques used by companies that engage in complicated financial trading. Many journalists and professors have gone back and looked at the firm’s regulatory filings, and have come to the conclusion that, while complex and difficult to identify, all of the company’s shenanigans were right there in plain view. Enron cannot be blamed for covering up the existence of its side deals. It didn’t; it disclosed them. As Gladwell summarizes:

“Puzzles are ‘transmitter-dependent’; they turn on what we are told. Mysteries are ‘receiver-dependent’; they turn on the skills of the listener.”


Wind turbines, jet engines and other machinery sensors generate unprecedented amounts of data

I would argue that this extremely complex, fast-moving and seismic shift that we call Big Data will favor those who have developed the ability to attune, to listen and to make sense of the data. Winners in this new world will recognize what looks like an overwhelming and intractable mystery, break it down into small and manageable chunks, demystify the landscape, and uncover the important nuggets of truth and significance.


Informatica Cloud Summer ’14 Release Breaks Down Barriers with Unified Data Integration and Application Integration for Real Time and Bulk Patterns

This past week, Informatica Cloud marked an important milestone with the Summer 2014 release of the Informatica Cloud platform. This was the 20th Cloud release, and I am extremely proud of what our team has accomplished.

“SDL’s vision is to help our customers use data insights to create meaningful experiences, regardless of where or how the engagement occurs. It’s multilingual, multichannel and on a global scale. Being able to deliver the right information at the right time to the right customer with Informatica Cloud Summer 2014 is critical to our business and will continue to set us apart from our competition.”

– Paul Harris, Global Business Applications Director, SDL plc

When I joined Informatica Cloud, I knew that it had the broadest cloud integration portfolio in the marketplace: leading data integration and analytic capabilities for bulk integration, comprehensive cloud master data management and test data management, and over a hundred connectors for cloud apps, enterprise systems and legacy data sources, all delivered in a self-service design with point-and-click wizards for citizen integrators, without the need for complex and costly manual custom coding.

But, I also learned that our broad portfolio belies another structural advantage: because of Informatica Cloud’s unique, unified platform architecture, it has the ability to surface application (or real time) integration capabilities alongside its data integration capabilities with shared metadata across real time and batch workflows.

With the Summer 2014 release, we’ve brought our application integration capabilities to the forefront. We now provide the most complete cloud app integration capability in the marketplace. With a design environment that’s meant not just for developers but also for line-of-business IT, app admins can now build real time process workflows that cut across on-premise and cloud and include built-in human workflows. And with the capability to translate these process workflows instantly into mobile apps for iPhone and Android mobile devices, we’re not just setting ourselves apart but also giving customers the unique capabilities they need for their increasingly mobile employees.


Informatica Cloud Summer Release Webinar Replay

“Schneider’s strategic initiative to improve front-office performance relied on recording and measuring salesperson engagement in real time on any mobile device or desktop. The enhanced real time cloud application integration features of Informatica Cloud Summer 2014 make it all possible and were key to the success of a highly visible and transformative initiative.”

– Mark Nardella, Global Sales Process Director, Schneider Electric SE

With this release, we’re also giving customers the ability to create workflows around data sharing that mix and match batch and real time integration patterns. This is really important, because unlike in the past, when you had to choose between batch and real time, in today’s world of on-premise, cloud-based, transactional and social data you have to deal with both real time interactions and the processing of large volumes of data. For example, let’s consider a typical scenario these days at a high-end retail store. Using a clienteling iPad app, the sales rep looks up bulk purchase history and inventory availability data in SAP, confirms availability and delivery date, and then processes the customer’s order via real time integration with NetSuite. And if you ask any customer, having a single workflow to unify all of that for instant and actionable insights is a huge advantage.
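To make the scenario concrete, here is a hypothetical sketch of how such a mixed workflow might look from the application side. The function names, systems and fields are stand-ins invented for illustration; they are not actual Informatica Cloud, SAP or NetSuite APIs.

```python
# Hypothetical sketch of a clienteling workflow that mixes a nightly bulk
# sync with real-time calls. Function names and systems are illustrative.

def nightly_bulk_sync(sap_extract):
    """Batch pattern: load purchase history rows extracted from SAP into
    a local store the iPad app can query quickly during the day."""
    store = {}
    for row in sap_extract:
        store.setdefault(row["customer_id"], []).append(row)
    return store

def check_availability(inventory, sku, qty):
    """Real-time pattern: confirm stock and a delivery estimate before quoting."""
    on_hand = inventory.get(sku, 0)
    return {"available": on_hand >= qty, "delivery_days": 2 if on_hand >= qty else None}

def place_order(customer_id, sku, qty):
    """Real-time pattern: push the confirmed order to the order system.
    In a real deployment this would be a synchronous API call (e.g. to NetSuite)."""
    return {"order_id": "SO-90311", "customer_id": customer_id, "sku": sku, "qty": qty}

# Example flow for one in-store interaction
history = nightly_bulk_sync([{"customer_id": "C-88", "sku": "BAG-XL", "qty": 1}])
print("Past purchases:", history.get("C-88", []))
availability = check_availability({"BAG-XL": 4}, "BAG-XL", 1)
if availability["available"]:
    print("Order placed:", place_order("C-88", "BAG-XL", 1))
```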

“Our industry demands absolute efficiency, speed and trust when dealing with financial information, and the new cloud application integration feature in the latest release of Informatica Cloud will help us service our customers more effectively by delivering the data they require in a timely fashion, keeping call-times to a minimum and improving customer satisfaction in real time.”

– Kimberly Jansen, Director CRM, Misys PLC

We’ve also included some exciting new Vibe Integration packages or VIPs. VIPs deliver pre-built business process mappings between front-office and back-office applications. The Summer 2014 release includes new bidirectional VIPs for Siebel to Salesforce and SAP to Salesforce that make it easier for customers to connect their Salesforce with these mission-critical business applications.

And last, but certainly not least, the release includes a critical upgrade to our API Framework that gives the Informatica Cloud iPaaS end-to-end support for connectivity to any company’s internal or external APIs. With the newly available API creation, definition and consumption patterns, developers or citizen integrators can now easily expose integrations as APIs, and users can consume them via integration workflows or apps, without the need for any additional custom code.

The features and capabilities released this summer are available to all existing Informatica Cloud customers, and everyone else through our free 30-day trial offer.


Driving Third Wave Businesses: Ensuring Your Business Has The Right To Win

As adjunct university faculty, I get to talk to students about how business strategy increasingly depends upon understanding how to leverage information. To make the discussion more concrete, I share with students the work of Alvin Toffler. In The Third Wave, Toffler asserts that we live in a world where competition will increasingly turn on the currency and usability of information.

In a recent interview, Toffler said that “given the acceleration of change; companies, individuals, and governments base many of their daily decisions on obsoledge—knowledge whose shelf life has expired.” He continues by stating that “companies everywhere are trying to put a price on certain forms of intellectual property. But if…knowledge is at the core of the money economy, then we need to understand knowledge much better than we do now. And tiny insights can yield huge outputs”.

Driving better information management in the information age


To me, this leads to three salient conclusions for information age businesses:

  1. Information needs to be driven further down into organizations, because top decision makers do not have the background to respond at the pace of change.
  2. Information needs to be available faster, which means that we need to reduce the processing time for structured and unstructured information sources.
  3. Information needs to be available when the organization is ready for it. For multinational enterprises this means “Always On” 24/7 across multiple time zones on any device.

Effective managers today are effective managers of people and information


Effective managers today are effective managers of information. Because processing may take too much time, Toffler’s remarks suggest to me we need to consider human information—the ideas and communications we share every day—within the mix of getting access to the right information when it is needed and where it is needed. Now more than ever is the time for enterprises to ensure their decision makers have the timely information to make better business decisions when they are relevant. This means that unstructured data, a non-trivial majority of business information, needs to be made available to business users and related to existing structured sources of data.

Derek Abell says that “for (management) control to be effective, data must be timely and provided at intervals that allow effective intervention”. Today this is a problem for most information businesses. As I see it, information optimization is the basis of powering the enterprise through “Third Wave” business competition. Organizations that have the “right to win” will have as a core capability best-in-class access to current information for decision makers.

Putting in place a winning information management strategy

If you talk to CIOs today, they will tell you that they are currently facing four major information age challenges.

  • Mobility—Enabling their users to view data anytime, anyplace, and on any device.
  • Information Trust—Making data dependable enough for business decisions, as well as governing data across all business systems.
  • Competing on Analytics—Getting information to business users fast enough to avoid Toffler’s obsoledge.
  • New and Big Data Sources—Connecting existing data to new, value-added sources of data.


Lots of things, however, get in the way of delivering on the promises of the Information Age. Our current data architecture is siloed, fragile, and built upon layer after layer of spaghetti code integrations. Think about what is involved just to cobble together data on a company’s supply chain. A morass of structured data systems has vendor and transaction records locked up in application databases and data warehouses all over the extended enterprise. So it is not surprising that enterprises struggle to put together the current, relevant data needed to run their businesses. Functions like finance depend largely upon manual extracts being massaged and integrated in spreadsheets because of concern over the quality of data being provided by financial systems. Some information age!

How do we connect to new sources of data?

At the same time, many are trying today to extend the information architecture to add social media data, mobile location data, and even machine data. Much of this data is not put together in the same way as data in an application database or data warehouse. However, being able to relate this data to existing data sources can yield significant benefits. Think about the potential benefit of being able to relate social interactions and mobile location data to sales data or to relate machine data to compliance data.

A big problem is that many of these new data types potentially have even more data quality gaps than historical structured data systems. For this reason, the signal-to-noise ratio of this data can be very low. But this data can be invaluable to business decision making. It therefore needs to be cleaned up and related to older data sources. Finally, it needs to be provided to business users in whatever manner they want to consume it.

How then do we fix the Information Age?


Enabling the kind of Information Age that Toffler imagined requires two things: enterprises must fix their data management, and they must enable the information intelligence needed to drive real competitive business advantage. Fixing data management involves delivering good data that business users can safely make decisions from. It also involves ensuring that data, once created, is protected. CFOs that we have talked to say the Target breach was a watershed event for them—something that they expect will receive more and more auditing attention.

We need at the same time to build the connection between old data sources and new data sources. And connecting data needs to take less time than it did in the past. Delivery needs to happen faster so business problems can be recognized and solved more quickly. Users need access to data when and where they need it.

With data management fixed, data intelligence needs to provide business users the ability to make sense of what they find in the data. Business users also need to be able to search for and find data. They need self-service so they can combine existing and new unstructured data sources to test data interrelationship hypotheses. This means the ability to assemble data from different sources at different times. Simply put, this is about data orchestration without any preconceived process. And lastly, business users need the intelligence to automatically sense and respond to changes as new data is collected.

Tiny insights can yield huge outputs


Obviously, there is a cost to solving our information age issues, but it is important to remember what Toffler says. “Tiny insights can yield huge outputs”. In other words, the payoff is huge for shaking off the shackles of our early information age business architecture. And those that do this will increasingly have the “right to win” against their competitors as they use information to wring every last drop of value from their business processes.

Related links
Solution Brief: The Intelligent Data Platform
Related Blogs


The Data-Driven CMO: A Q&A with Glenn Gow (CEO of Crimson Marketing)

Q&A with Crimson Marketing

I recently had the opportunity to have a very interesting discussion with Glenn Gow, the CEO of Crimson Marketing.  I was impressed by what an interesting and smart guy he is, and by the tremendous insight he has into the marketing discipline.  He consults with over 150 CMOs every year, and has a pretty solid understanding of the pains they are facing, the opportunities in front of them, and the approaches that the best-of-the-best are taking to reach new levels of success.

I asked Glenn if he would be willing to do a Q&A in order to share some of his insight.  I hope you find his perspective as interesting as I did!


______________________________________________

Q: What do you believe is the single biggest advantage that marketers have today?

A: Being able to use data in marketing is absolutely your single biggest competitive advantage as a marketer.  And therefore your biggest challenge is capturing, leveraging and rationalizing that data.  The marketers we speak with tend to fall into two buckets.

  1. Those who understand that the way they manage data is critical to their marketing success.  These marketers use data to inform their decisions, and then rely on it to measure their effectiveness.
  2. Those who haven’t yet discovered that data is the key to their success. Often these people start with systems in mind – marketing automation, CRM, etc.  But after implementing and beginning to use these systems, they almost always come to the realization that they have a data problem.

______________________________________________

Q:  How has this world of unprecedented data sources and volumes changed the marketing discipline?

A:  In short… dramatically.  The shift has really happened in the last two years. The big impetus for this change has really been the availability of data.  You’ve probably heard this figure, but Google’s Eric Schmidt likes to say that every two days now, we create as much information as we did from the dawn of civilization until 2003.

We believe this is a massive opportunity for marketers.  The question is, how do we leverage this data?  How do we pull out the golden nuggets that will help us do our jobs better?  Marketers now have access to information they’ve never had access to, or even contemplated, before.  This gives them the ability to become more effective marketers. And by the way… they have to!  Customers expect them to!

For example, ad re-targeting.  Customers expect to be shown ads that are relevant to them, and if marketers don’t successfully do this, they can actually damage their brand.

In addition, competitors are taking full advantage of data, and are getting better every day at winning the hearts and minds of their customers – so marketers need to act before their competitors do.

Marketers have a tremendous opportunity – rich data is available, and the technology to harness it is now available, so they can win a war that they never could before.

______________________________________________

Q:  What barriers are they up against in harnessing this data?

A:
  I’d say that barriers can really be broken down into 4 main buckets: existing architecture, skill sets, relationships, and governance.

  • Existing Architecture: The way that data has historically been collected and stored doesn’t have the CMO’s needs in mind.  The CMO has an abundance of data theoretically at their fingertips, but they cannot do what they want with it.  The CMO needs to insist on, and work together with the CIO to build, an overarching data strategy that meets their needs – both today and tomorrow – because the marketing profession and tool sets are rapidly changing.  That means the CMO and their team need to step into a conversation they’ve never had before with the CIO and his/her team.  And it’s not about systems integration; it’s about data integration.
  • Existing Skill Sets:  The average marketer today is a right-brained individual.  They entered the profession because they are naturally gifted at branding, communications, and outbound perspectives.  And that requirement doesn’t go away – it’s still important.  But today’s marketer now needs to grow their left-brained skills, so they can take advantage of inbound information, marketing technologies, data, etc.  It’s hard to ask a right-brained person to suddenly be effective at managing this data.  The CMO needs to fill this skillset gap primarily by bringing in people that understand it, but they cannot ignore it themselves.  The CMO needs to understand how to manage a team of data scientists and operations people to dig through and analyze this data.  Some CMOs have actually learned to love data analysis themselves (in fact your CMO at Informatica Marge Breya is one of them).
  • Existing Relationships:  In a data-driven marketing world, relationships with the CIO become paramount.  They have historically determined what data is collected, where it is stored, what it is connected to, and how it is managed.  Today’s CMO isn’t just going to the CIO with a simple task, as in asking them to build a new dashboard.  They have to collectively work together to build a data strategy that will work for the organization as a whole.  And marketing is the “new kid on the block” in this discussion – the CIO has been working with finance, manufacturing, etc. for years, so it takes some time (and great data points!) to build that kind of cohesive relationship.  But most CIOs understand that it’s important, if for no other reason than that they see budgets increasingly shifting to marketing and the rest of the Lines of Business.
  • Governance:  Who is ultimately responsible for the data that lives within an organization?  It’s not an easy question to answer.  And since marketing is a relatively new entrant into the data discussion, there are often a lot of questions left to answer. If marketing wants access to the customer data, what are we going to let them do with it? Read it?  Append to it?  How quickly does this happen? Who needs to author or approve changes to a data flow?  Who manages opt ins/outs and regulatory black lists?  And how does that impact our responsibility as an organization?  This is a new set of conversations for the CMO – but they’re absolutely critical.

______________________________________________

Q:  Are the CMOs you speak with concerned with measuring marketing success?

A:  Absolutely.  CMOs are feeling tremendous pressure from the CEO to quantify their results.  There was a recent Duke University study of CMOs that asked if they were feeling pressure from the CEO or board to justify what they’re doing.  64% of the respondents said that they do feel this pressure, and 63% say this pressure is increasing.

CMOs cannot ignore this.  They need to have access to the right data that they can trust to track the effectiveness of their organizations.  They need to quantitatively demonstrate the impact that their activities have had on corporate revenue – not just ROI or Marketing Qualified Leads.  They need to track data points all the way through the sales cycle to close and revenue, and to show their actual impact on what the CEO really cares about.

______________________________________________

Q:  Do you think marketers who adopt marketing automation products without first having a solid handle on their data are getting solid results?

A:
  That is a tricky one.  Ideally, yes, they’d have their data in great shape before undertaking a marketing automation process.  The vast majority of companies who have implemented the various marketing technology tools have encountered dramatic data quality issues, often coming to light during the process of implementing their systems. So data quality and data integration is the ideal first step.

But the truth is, solving a company’s data problem isn’t a simple, straight-forward challenge.  It takes time and it’s not always obvious how to solve the problem.  Marketers need to be part of this conversation.  They need to drive how they’re going to be managing data moving forward.  And they need to involve people who understand data well, whether they be internal (typically in IT), or external (consulting companies like Crimson, and technology providers like Informatica).

So the reality for a CMO, is that it has to be a parallel path.  CMOs need to get involved in ensuring that data is managed in a way they can use effectively as a marketer, but in the meantime, they cannot stop doing their day-to-day job.  So, sure, they may not be getting the most out of their investment in marketing automation, but it’s the beginning of a process that will see tremendous returns over the long term.

______________________________________________

Q:  Is anybody really getting it “right” yet?

A:  This is the best part… yes!  We are starting to see more and more forward-thinking organizations really harnessing their data for competitive advantage, and using technology in very smart ways to tie it all together and make sense of it.  In fact, we are in the process of writing a book entitled “Moneyball for Marketing” that features eleven different companies who have marketing strategies and execution plans that we feel are leading their industries.

______________________________________________

So readers, what do you think?  Who do you think is getting it “right” by leveraging their data with smart technology and truly getting meaningful and impactful results?


The CIO Challenged


If you ask a CIO today about their challenges, several things would clearly make the list. CIOs that I know personally are feeling a bit of Future Shock. They say that things are changing a lot faster these days than they did in the past. One CIO said to me in exasperation, “things are changing every 18 months”. Given this, I recently sat down with CIOs from several different industries to get their perspectives on how the CIO role is changing and the challenges they feel in their role as CIO. This post will focus on the latter.

 
 
 

Healthcare CIO

The healthcare CIO participating said that CIOs need to manage four large mega trends simultaneously—mobile, cloud, social, and big data. At the same time in healthcare, they have the added complexity of Meaningful Use, ICD-10, and HL7. For these reasons, this CIO worries about keeping the IT lights on while at the same time helping the business to expand. This CIO sees healthcare clearly entering an era of ubiquitous computing, with the iPad becoming the rounding and vitals instrument of choice. This links mobility, integration, and compliance around a standard like HL7. HL7 provides this CIO with a framework for exchanging, integrating, sharing, and retrieving electronic health information. Like other CIOs that we talked to, our healthcare CIO says he needs to understand his enterprise’s business better in order to be a better partner.

 

Insurance CIO

Our next CIO is from the insurance industry. He sees CIOs in general being challenged to move from being a builder of stuff to an orchestrator of business services. This CIO sees cloud and loosely coupled partnerships bringing vendor management to the forefront. At the same time, he feels challenged to provide application integration in a service-oriented manner. He says that IT organizations today need to orchestrate across IT regardless of device. As well, he believes that IT organizations need to stitch together an IT that is fungible and supports service-oriented architecture. At the same time, he says that his business users “believe that data is strategic but they need it provided to them in a way that creates predictive capabilities and drives top line revenue”. We and our business customers know that we need to fix our mutual data problems in order to use data better. This CIO believes that he needs to fix his enterprise’s data hygiene first in order to improve business outcomes.
 

Manufacturing CIO

The manufacturing CIO participating said that CIOs have an opportunity to create informative analytics and help the business find value. However, this CIO worries that CEOs and CFOs are about to start complaining to their IT organizations that the information garnered from Big Data and Business Intelligence does not really make them more money. He claims that, to make more money, IT organizations need to connect the dots between their transactional systems, BI systems, and planning systems. More specifically, they need to convert insight into action. To do this, the business needs to be enabled to be more proactive and to cut the time it takes to execute. This means that IT needs to enable the enterprise to generate value differently than its competitors. This CIO worries, therefore, about IT’s ability to drive flexibility and agility. We need to respond to the rate of change and be able to prototype faster, at the same time as we cut the cost of failure. This CIO claims that CIOs need to more actively manage the information lifecycle even though the business may own the data. Lastly, this CIO says that IT organizations need to be more forward looking. We need to be looking at things cross-discipline. We need to be looking for new business insights. We have piles and piles of data from which to draw interesting insights. How do you connect the dots and create new business insights?
 

Pharma CIO

Getting CFOs to understand that technology is not a cost center was really important to our fourth CIO. We need to get everyone to understand that IT isn’t separate from the business. At the same time, we need to get business leaders to understand technology better. There is a real West Coast vs. East Coast split regarding business technology literacy. We need business leaders to start asking for digital services that support their product and service offerings. And this is all about data. “Think about it. What we do in IT is all about the intake of data, storing data, processing data, and analyzing data. And we need to provide the intelligence to make better decisions.  Competing with analytics is what we need to enable. Like an athlete that needs muscles—data needs cleaning, security, mastering, and governance to enable the business to compete with analytics”.
 

Broadcast CIO

Our broadcast CIO is focused on the explosion of Big Data. “I need to get my management team exposed to Big Data analysis. I need as well to get the resources to do this well”. We need, for example, to get the business answers to its questions around customer behavior. From an integration perspective, this CIO said that she needs to get service-based technology deployed. At the same time, she said she needs business apps that her business and consumer users can subscribe to. This CIO said that speed to clients from integrated systems is a big issue. We need today to connect everything together.

CIOs as a whole are feeling challenged

CIOs regardless of industry feel challenged. They feel challenged by changes coming at them in general and in industry specific mandates and standards. They clearly need to move faster and to move from organizations that are about getting the internals of IT running well to organizations that can absorb new technology models, scale up and down in “Internet time”, and flex seamlessly to support business model innovation. For more information, see the related links below:


The Five C’s of Data Management


A few days ago, I came across a post, 5 C’s of MDM (Case, Content, Connecting, Cleansing, and Controlling), by Peter Krensky, Sr. Research Associate, Aberdeen Group and this response by Alan Duncan with his 5 C’s (Communicate, Co-operate, Collaborate, Cajole and Coerce). I like Alan’s list much better. Even though I work for a product company specializing in information management technology, the secret to successful enterprise information management (EIM) is in tackling the business and organizational issues, not the technology challenges. Fundamentally, data management at the enterprise level is an agreement problem, not a technology problem.

So, here I go with my 5 C’s: (more…)


Is Big Data Destined To Become Small And Vertical?

Several years ago, I got to participate in one of the first neural net conferences. At the time, I thought it was amazing just to be there. There were chip and software vendors galore. Many even claimed to be the next Intel or the next Microsoft. Years later I joined a complex event processing vendor. Again, I felt the same sense of excitement. In both cases, the market participants moved from large horizontal market plays to smaller and more vertical solutions.

A sense of deja vu

Now, to be clear, it is not my goal today to pop anyone’s big data balloon. But as I have gotten more excited about big data, I have also gotten an increasingly eerie sense of deja vu. The fact is, the more I dig into big data and hear customers’ stories about what they are trying to do with it, the more concerned I become about the similarities between big data and neural nets and complex event processing.

Big Data offers new features

Clearly, big data does offer some interesting new features. And big data does take advantage of other market trends, including virtualization and cloud. By doing so, big data achieves greater scalability than traditional business intelligence processing and storage. At the same time, big data offers the potential for lowering cost, but I should take a moment to stress the word potential. The reason I do this is that while a myriad of processing approaches have been developed, no standard has yet emerged. And early adopters complain about the difficulty of hiring big data MapReduce programmers. And just like neural nets, the programming that needs to be done is turning out to be application specific.

With this said, it should be clear that big data does offer the potential to test datasets and to discover new and sometimes unexpected data relationships. This is a real positive. However, like its predecessors, this work is application specific, and the data that is being related is truly of differing quality and detail. This means that the best that big data can do as a technology movement is discover potential data relationships. Once this is done, meaning can only be created by establishing detailed data relationships and dealing with the varying quality of data sets within the big data cluster.

Big Data will become small for management analysis

This means that big data must become small in order to really solve customer problems. Judith Hurwitz puts it this way: “big data analysis is really about small data. Small data, therefore, is the product of big data analysis. Big data will become small so that it is easier to comprehend”. What is “more necessary than ever is the capability to analyze the right data in a timely enough fashion to make decisions and take actions”. Judith says that in the end what is needed is quality data that is consistent, accurate, reliable, complete, timely, reasonable, and valid. The critical point is that whether you use MapReduce processing or traditional BI means, you shouldn’t throw out your data integration and quality tools. As big data becomes smaller, these will in reality become increasingly important.

So how does Judith see big data evolving? Judith sees big data propelling a lot of new small data. Judith believes that “getting the right perspective on data quality can be very challenging in the world of big data. With a majority of big data sources, you need to assume that you are working with data that is not clean”. Judith says that we need to accept the fact that a lot of noise will exist in data, and that “it is by searching and pattern matching that you will be able to find some sparks of truth in the midst of some very dirty data”. Judith suggests, therefore, a two-phase approach: 1) look for patterns in big data without concern for data quality; and 2) after you locate your patterns, apply the same data quality standards that have been applied to traditional data sources.
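A minimal sketch of that two-phase idea, using invented records and deliberately naive rules, might look like the following: phase one scans the raw, noisy data for a pattern worth investigating, and phase two applies ordinary quality checks only to the records that matched.

```python
# Illustrative two-phase sketch: pattern discovery first, data quality second.
raw_events = [
    {"id": 1, "text": "pump #12 vibration spike", "timestamp": "2014-07-01T02:10"},
    {"id": 2, "text": "VIBRATION spike pump 12",  "timestamp": None},
    {"id": 3, "text": "routine maintenance note", "timestamp": "2014-07-01T06:00"},
    {"id": 4, "text": "pump 12 vibration spike",  "timestamp": "2014-7-1"},
]

# Phase 1: look for a pattern without worrying about quality.
matches = [e for e in raw_events if "vibration" in e["text"].lower()]
print(f"Phase 1: {len(matches)} of {len(raw_events)} records mention vibration")

# Phase 2: apply ordinary quality rules only to the matched subset.
def passes_quality(event):
    ts = event["timestamp"]
    return ts is not None and len(ts) >= 16   # crude check for a full timestamp

clean = [e for e in matches if passes_quality(e)]
print(f"Phase 2: {len(clean)} matched record(s) pass the quality check")
```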

History will repeat itself

For this reason, I believe that history will, to a degree, repeat itself. Clearly, the big data emperor does have his clothes on, but big data will become smaller and more vertical. Big data will become about relationship discovery, and small data will become about quality analysis of data sources. In sum, this means that small data analysis is focused and provides the data for business decision making, while big data analysis is broad and is about discovering what data potentially relates to what data. I know this is a bit different from the hype, but it is realistic and makes sense. Remember, in the end, you will still need what business intelligence has refined.
