Category Archives: Business/IT Collaboration
This week we kick off our 16th Informatica World – the conference about All Things Data. The 2015 show is bigger and better than ever, with record numbers of attendees, breakout sessions, and sponsors. I’m proud to report that Informatica World 2015 is sold out! This clearly shows me that our data message is striking a chord with customers, prospects, partners and beyond.
One of the aspects that I’m most excited about at the show is unveiling the Data Ready Enterprise campaign.
Read more about Data Ready Enterprise campaign here – www.informatica.com/ready.
In my keynote, I’ll be sharing some thoughts regarding how “great data” can help you advance your career by enabling you to make substantial improvements to your company’s business processes and operations. We will be discussing why it’s critical for you – as a data integration or data management professional – to ensure that, with great data, your company is ready for everything.
Today’s business imperatives of mobility, cloud computing and big data analytics are driven by data from multiple sources – both inside and outside the four walls of your enterprise.
Data is the fuel powering your business processes, machines, sensors and customized user experiences.
As a result, your enterprise today is confronted with an unparalleled opportunity to put massive amounts of data, structured and unstructured, to work in optimizing your business. You can streamline business processes through consolidated applications, build efficiencies through real-time collaboration and decision-making, and improve customer relationships and your total customer experience through data-enabled interactions and 24/7 support.
To succeed as a Data Ready Enterprise, you are faced with the imperative of putting massive amounts of both structured and unstructured data to work quickly and effectively. But not just any old data. You need great data.
But what do I mean by this? What is “great data” all about? What is so important about the term “great”?
Simply put, great data is clean, connected and safe. Great data can make an executive a success, and bad data can ruin one.
You’re either smarter about what’s happening with your competition, smarter about what’s happening with your business, smarter about what’s happening in real-time with your portfolio, and in your go-to-market strategy vis-à-vis your margins, et cetera — or you’re guessing.
Worse yet, if you have bad data you’re bound to make horrible decisions – no matter how big the data is. The bigger the crummy data, the crummier the big decisions.
If the data’s not connected, if it’s not up to date, if it’s not connected with the systems that actually make for a complete answer, the data can’t be great.
This is why you absolutely must have great data that’s ready to use – and ready for everything – to be a Data Ready Enterprise.
Data Ready Enterprises today will be Decision Ready because they are able to take advantage of data analytics. They will be Customer Ready and prepared to comprehensively manage their total customer relationships. They will be Application Ready through application consolidation and optimization – making sure the right applications access the right data at the right time. They will be Cloud Ready and able to accelerate the transition to real-time, data-driven collaboration. And, not least, they will be Regulation Ready to lift the burden of compliance with industry-specific and other regulations.
Every organization has its own unique data signature with the potential to build smarter systems, more intuitive services and better products. Our aim at Informatica is to empower organizations by delivering them great data that is ready for everything — enabling them to be ready in the ways that matter most.
At Informatica, we are dedicated to making data ready to use. We enable great data. Data that’s connected, clean, safe and intelligent. These are the four business-critical Data Ready Enterprise benefits delivered by our Intelligent Data Platform. We look forward to discussing at Informatica World 2015, and beyond, how our Data Ready approach can help you and your company to succeed.
Check out how you and your organization can become “Data Ready” here – www.informatica.com/ready.
According to Gartner, 64% of organizations surveyed have purchased or are planning to invest in Big Data systems. More and more companies are diving into their data, trying to put it to use to minimize customer churn, analyze financial risk, and improve the customer experience.
Of that 64%, 30% have already invested in Big Data technology, 19% plan to invest within the next year, and another 15% plan to invest within two years. Less than 8% of Gartner’s 720 respondents, however, have actually deployed Big Data technology. This is bad, because most companies simply don’t know what they’re doing when it comes to Big Data.
Over the years, we have heard that Big Data is Volume, Velocity, and Variety. I believe this limited view is one of the reasons why, despite the Big Data hype, most companies are still stuck in neutral.
- Volume: Terabytes to petabytes, exabytes to zettabytes; the sheer quantity of data
- Velocity: Streaming data, milliseconds to seconds, how fast data is produced, and how fast the data must be processed to meet the need or demand
- Variety: Structured, unstructured, text, multimedia, video, audio, sensor data, meter data, html, text, e-mails, etc.
For us, the focus is on the collection of data. After all, we are prone to be hoarders, wired by our survival instinct to collect and hoard for the leaner winter months that may come. So we stockpile as much data as we can for the elusive “What if?” scenario: “Maybe this will be useful someday.” It’s this stockpiling of Big Data without application that makes it useless.
While Volume, Velocity, and Variety are focused on collection of data, Gartner, in 2014, introduced 3 additional Vs: Veracity, Variability, and Value which focus on usefulness of the data.
- Veracity: Uncertainty due to data inconsistency and incompleteness, ambiguities, latency, deception, model approximations, accuracy, quality, truthfulness or trustworthiness
- Variability: The differing ways in which the data may be interpreted, different questions require different interpretations
- Value: Data for co-creation and deep learning
I believe that perfecting as few as 5% of the relevant variables will get a business 95% of the same benefit. The trick is identifying that viable 5%, and extracting meaningful information from it. In other words, “Value” is the long pole in the tent.
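One lightweight way to hunt for that viable 5% is to rank candidate variables by how strongly each one tracks the outcome you care about, then keep only the top sliver. The sketch below is illustrative only – the feature names and data are made up, and it uses plain Pearson correlation as the ranking signal:

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def vital_few(features, target, keep_fraction=0.05):
    """Rank candidate variables by |correlation| with the outcome and
    keep the top fraction -- a first cut at the 'viable 5%'."""
    ranked = sorted(features.items(),
                    key=lambda kv: abs(pearson(kv[1], target)),
                    reverse=True)
    k = max(1, round(len(ranked) * keep_fraction))
    return [name for name, _ in ranked[:k]]

# Toy example: three candidate variables, one clearly predictive of churn
features = {
    "churn_calls":  [1, 4, 2, 8, 5, 9],
    "zip_digit":    [3, 3, 1, 2, 9, 4],
    "tenure_years": [9, 7, 8, 2, 5, 1],
}
target = [0, 1, 0, 1, 1, 1]
print(vital_few(features, target, keep_fraction=0.34))
```

Real feature selection is richer than a single correlation pass, but the shape of the exercise – score every variable, keep the vital few – is the same.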
On Saturday, I got a call from my broadband company on my mobile phone. The sales rep pitched a great limited-time offer for new customers. I asked him whether I could take advantage of this great offer as well, even though I am an existing customer. He was surprised. “Oh, you’re an existing customer,” he said, dismissively. “No, this offer doesn’t apply to you. It’s for new customers only. Sorry.” You can imagine my annoyance.
If this company had built a solid foundation of customer data, the sales rep would have had a customer profile rich with clean, consistent, and connected information as reference. If he had visibility into my total customer relationship with his company, he’d know that I’m a loyal customer with two current service subscriptions. He’d know that my husband and I have been customers for 10 years at our current address. On top of that, he’d know we both subscribed to their services while living at separate addresses before we were married.
Unfortunately, his company didn’t arm him with the great customer data he needs to be successful. If they had, he could have taken the opportunity to offer me one of the four services I currently don’t subscribe to—or even a bundle of services. And I could have shared a very different customer experience.
Every customer interaction counts
Executives at companies of all sizes talk about being customer-centric, but it’s difficult to execute on that vision if you don’t manage your customer data like a strategic asset. If delivering seamless, integrated, and consistent customer experiences across channels and touch points is one of your top priorities, every customer interaction counts. But without knowing exactly who your customers are, you cannot begin to deliver the types of experiences that retain existing customers, grow customer relationships and spend, and attract new customers.
How would you rate your current ability to identify your customers across lines of business, channels and touch points?
Many businesses, however, have anything but an integrated and connected customer-centric view—they have a siloed and fragmented channel-centric view. In fact, sales, marketing, and call center teams often identify siloed and fragmented customer data as key obstacles preventing them from delivering great customer experiences.
According to Retail Systems Research, creating a consistent customer experience remains the most valued capability for retailers, but 55% of those surveyed indicated their biggest inhibitor was not having a single view of the customer across channels.
Retailers are not alone. An SVP of marketing at a mortgage company admitted in an Argyle CMO Journal article that, now that his team needs to deliver consistent customer experiences across channels and touch points, they realize they are not as customer-centric as they thought they were.
Customer complexity knows no bounds
The fact is, businesses are complicated, with customer information fragmented across divisions, business units, channels, and functions.
Citrix, for instance, is bringing together valuable customer information from 4 systems. At Hyatt Hotels & Resorts, it’s about 25 systems. At MetLife, it’s 70 systems.
How many applications and systems would you estimate contain valuable customer information at your company?
Based on our experience working with customers across many industries, we know that a total customer relationship view allows:
- Marketing to boost response rates by better segmenting their database of contacts for personalized marketing offers.
- Sales to more efficiently and effectively cross-sell and up-sell the most relevant offers.
- Customer service teams to resolve customers’ issues immediately, instead of placing them on hold to hunt for information in a separate system.
If your marketing, sales, and customer service teams are struggling with inaccurate, inconsistent, and disconnected customer information, it is costing your company revenue, growth, and success.
Transforming customer data into total customer relationships
Informatica’s Total Customer Relationship Solution fuels business and analytical applications with clean, consistent and connected customer information, giving your marketing, sales, e-commerce and call center teams access to that elusive total customer relationship. It not only brings all the pieces of fragmented customer information together in one place where it’s centrally managed on an ongoing basis, but also:
- Reconciles customer data: Your customer information should be the same across systems, but often isn’t. Assess its accuracy, fixing and completing it as needed—for instance, in my case merging duplicate profiles under “Jakki” and “Jacqueline.”
- Reveals valuable relationships between customers: Map critical connections—Are individuals members of the same household or influencer network? Are two companies part of the same corporate hierarchy? Even link customers to personal shoppers or insurance brokers or to sales people or channel partners.
- Tracks thorough customer histories: Identify customers’ preferred locations; channels, such as stores, e-commerce, and catalogs; or channel partners.
- Validates contact information: Ensure email addresses, phone numbers, and physical addresses are complete and accurate so invoices, offers, or messages actually reach customers.
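To make the first two bullets concrete, here is a minimal sketch of the matching-and-merging idea – not Informatica’s actual product logic, and with hypothetical field names and rules. Profiles are matched on a normalized e-mail address, the richer value wins for each field (so “Jacqueline” survives over the nickname “Jakki”), and records failing a basic e-mail check are set aside:

```python
import re

# Deliberately simple e-mail sanity check for the sketch
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def normalize(profile):
    """Normalize the fields used for matching (case, whitespace, digits)."""
    return {
        "name": profile["name"].strip().title(),
        "email": profile["email"].strip().lower(),
        "phone": re.sub(r"\D", "", profile.get("phone", "")),
    }

def merge_profiles(profiles):
    """Match on e-mail, then merge: keep the longest value for each
    field, so the fuller name or phone number survives the merge."""
    merged = {}
    for p in map(normalize, profiles):
        if not EMAIL_RE.match(p["email"]):
            continue  # quarantine invalid contact records instead of merging
        best = merged.setdefault(p["email"], p)
        for field, value in p.items():
            if len(value) > len(best[field]):
                best[field] = value
    return list(merged.values())

records = [
    {"name": "jakki",      "email": "J.Smith@Example.com", "phone": "555-0100"},
    {"name": "Jacqueline", "email": "j.smith@example.com", "phone": ""},
    {"name": "bad row",    "email": "not-an-email",        "phone": ""},
]
print(merge_profiles(records))
```

Production-grade matching adds fuzzy name comparison, survivorship rules per attribute, and household/hierarchy linking, but the core reconcile-and-merge loop looks like this.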
This is just the beginning. From here, imagine enriching your customer profiles with third-party data. What types of information help you better understand, sell to, and serve your customers? What are your plans for incorporating social media insights into your customer profiles? What could you do with this additional customer information that you can’t do today?
We’ve helped hundreds of companies across numerous industries build a total customer relationship view. Merrill Lynch boosted marketing campaign effectiveness by 30%. Citrix boosted conversion rates by 20%. A $60 billion global manufacturer improved cross-sell and up-sell success by 5%. A hospitality company boosted cross-sell and up-sell success by 60%. And Logitech increased sales across channels, including their online site, retail stores, and distributors.
Informatica’s Total Customer Relationship Solution empowers your people with confidence, knowing that they have access to the kind of great customer data that allows them to surpass customer acquisition and retention goals by providing consistent, integrated, and seamless customer experiences across channels. The end result? Great experiences that customers are inspired to share with their family and friends at dinner parties and on social media.
Do you have a terrible or a great customer experience to share? If so, please share it with us and other readers using the Comment option below.
An increasing number of companies around the world are moving to cloud-first or hybrid architectures for new systems to process their data for new analytics applications. In addition to adding new data sources from SaaS (Software as a Service) applications to their data pipelines, they are hosting some or all of their data storage, processing and analytics in IaaS (Infrastructure as a Service) public hosted environments to augment on-premise systems. To enable our customers to take advantage of the benefits of IaaS options, Informatica is embracing this computing model.
As announced today, Informatica now fully supports running the traditionally on-premise Informatica PowerCenter, Big Data Edition (BDE), Data Quality and Data Exchange on Amazon Web Services (AWS) Elastic Compute Cloud (EC2). This provides customers with added flexibility, agility and faster time-to-production by enabling a new deployment option for running Informatica software.
Existing and new Informatica customers can now choose to develop and/or deploy data integration, quality and data exchange in AWS EC2 just as they would on on-premise servers. There is no need for any special licensing as Informatica’s standard product licensing now covers deployment on AWS EC2 on the same operating systems as on-premise. BDE on AWS EC2 supports the same versions of Cloudera and Hortonworks Hadoop that are supported on-premise.
Customers can install these Informatica products on AWS EC2 instances just as they would on servers running on an on-premise infrastructure. The same award-winning Informatica Global Customer Service that thousands of Informatica customers use is now on call and standing by to help with success on AWS EC2. Informatica Professional Services is also available to assist customers running these products on AWS EC2, just as it is for on-premise system configurations.
Informatica customers can accelerate their time to production or experimentation with the added flexibility of installing Informatica products on AWS EC2 without having to wait for new servers to arrive. There is the flexibility to develop in the cloud and deploy production systems on-premise or develop on-premise and deploy production systems in AWS. Cloud-first companies can keep it all in the cloud by both developing and going into production on AWS EC2.
Customers can also benefit from the lower up-front costs, maintenance costs and pay-as-you-go infrastructure pricing of AWS. Instead of having to pay upfront for servers and managing them in an on-premise data center, customers can use virtual servers in AWS to run Informatica products on. Customers can use existing Informatica licenses or purchase them in the standard way from Informatica for use on top of AWS EC2.
Combined with the ease of use of Informatica Cloud, Informatica now offers customers looking for hybrid and cloud solutions even more options.
Read the press release including supporting quotes from AWS and Informatica customer ProQuest, here.
One of THE biggest challenges in companies today is complexity. To be more specific, unnecessary complexity resulting from silo behaviors and piecemeal point solutions. Businesses today are already extremely complex with the challenges of multiple products, multiple channels, global scale, higher customer expectations, and rapid and constant change, so we certainly don’t want to make the IT solutions more complex than they need to be. That said, I’m on the side of NO, we don’t need a CSO, as this blog recently surveyed its readers. We just need a business architecture practice that does what it’s supposed to. (more…)
Enterprise IT is in a state of constant evolution. As a result, business processes and technologies become increasingly difficult to change and more costly to keep up-to-date. The solution to this predicament is an Enterprise Architecture (EA) process that can provide a framework for an optimized IT portfolio. IT Optimization strategy should be based on a comprehensive set of architectural principles which ensure consistency and make IT more responsive, efficient, and economical.
The rationalization, standardization, and consolidation process helps organizations understand their current EA maturity level and move forward on the appropriate roadmap. As they undertake the IT Optimization journey, the IT architecture matures through several stages, leveraging IT Optimization Architecture Principles to attain each level of maturity.
Level 1: The first step involves helping a company develop its architecture vision and operating model, with attention to cost, globalization, investiture, or whatever is driving the company strategically. Once that vision is in place, enterprise architects can guide the organization through an iterative process of rationalization, consolidation, and eventually shared-services and cloud computing.
Level 2: The rationalization exercise helps an organization identify what standards to move towards as they eliminate the complexities and silos they have built up over the years, along with the specific technologies that will help them get there.
Depending on the company, rationalization could start with a technical discussion and be IT-driven, or it could start at a business level. For example, a company might have distributed operations across the globe and desire to consolidate and standardize its business processes. That could drive change in the IT portfolio. Or a company that has gone through mergers and acquisitions might have redundant business processes to rationalize.
Rationalizing involves understanding the current state of an organization’s IT portfolio and business processes, and then mapping business capabilities to IT capabilities. This is done by developing scoring criteria to analyze the current portfolio, and ultimately by deciding on the standards that will propel the organization forward. Standards are the outcome of a rationalization exercise.
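A rationalization scorecard of the kind described above can be as simple as a weighted sum over agreed criteria. The sketch below is purely illustrative – the criteria, weights, and thresholds are hypothetical and would come out of each organization’s own exercise:

```python
# Hypothetical scoring criteria and weights -- every organization
# defines its own during the rationalization exercise.
WEIGHTS = {"business_value": 0.5, "technical_fit": 0.3, "annual_cost": 0.2}

def score(app):
    """Weighted score on a 1-5 scale; cost is inverted (cheaper is better)."""
    return (WEIGHTS["business_value"] * app["business_value"]
            + WEIGHTS["technical_fit"] * app["technical_fit"]
            + WEIGHTS["annual_cost"] * (6 - app["annual_cost"]))

def disposition(app, invest_at=3.5, retire_at=2.5):
    """Map a score to a rationalization decision."""
    s = score(app)
    if s >= invest_at:
        return "invest"    # candidate standard
    if s <= retire_at:
        return "retire"    # consolidate onto the standard
    return "tolerate"      # revisit on the next roadmap cycle

portfolio = [
    {"name": "CRM-A", "business_value": 5, "technical_fit": 4, "annual_cost": 2},
    {"name": "CRM-B", "business_value": 2, "technical_fit": 1, "annual_cost": 5},
]
for app in portfolio:
    print(app["name"], disposition(app))
```

The value of the exercise is less in the arithmetic than in forcing stakeholders to agree on the criteria and weights before the scores are computed.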
Standardized technology represents the second level of EA maturity. Organizations at this level have evolved beyond isolated independent silos. They have well-defined corporate governance and procurement policies, which yields measurable cost savings through reduced software licenses and the elimination of redundant systems and skill sets.
Level 3: Consolidation entails reducing the footprint of your IT portfolio. That could involve consolidating the number of database servers, application servers and storage devices, consolidating redundant security platforms, or adopting virtualization, grid computing, and related consolidation initiatives.
Consolidation may be a by-product of another technology transformation, or it may be the driver of these transformations. But whatever motivates the change, the key is to be in alignment with the overall business strategy. Enterprise architects understand where the business is going so they can pick the appropriate consolidation strategy.
Level 4: One of the key outcomes of a rationalization and consolidation exercise is the creation of a strategic roadmap that continually keeps IT in line with where the business is going.
Having a roadmap is especially important when you move down the path to shared services and cloud computing. For a company that has a very complex IT infrastructure and application portfolio, having a strategic roadmap helps the organization to move forward incrementally, minimizing risk, and giving the IT department every opportunity to deliver value to the business.
In case you haven’t noticed, data integration is all the rage right now. Why? There are three major reasons for this trend that we’ll explore below, but a recent USA Today story focused on corporate data as a much more valuable asset than it was just a few years ago. Moreover, the sheer volume of data is exploding.
For instance, in a report published by research company IDC, they estimated that the total amount of data created or replicated worldwide in 2012 would add up to 2.8 zettabytes (ZB). By 2020, IDC expects the annual data-creation total to reach 40 ZB, which would amount to a 50-fold increase from where things stood at the start of 2010.
But the growth of data is only a part of the story. Indeed, I see three things happening that drive interest in data integration.
First, the growth of cloud computing. The growth of data integration around the growth of cloud computing is logical, considering that we’re relocating data to public clouds, and that data must be synced with systems that remain on-premise.
Data integration providers, such as Informatica, have stepped up. They provide data integration technology that spans enterprises, managed service providers, and clouds, dealing with the special needs of cloud-based systems. At the same time, data integration improves the way we do data governance and data quality.
Second, the growth of big data. A recent IDC forecast shows that the big data technology and services market will grow at a 26.4% compound annual growth rate to $41.5 billion through 2018, or, about six times the growth rate of the overall information technology market. Additionally, by 2020, IDC believes that line of business buyers will help drive analytics beyond its historical sweet spot of relational to the double-digit growth rates of real-time intelligence and exploration/discovery of the unstructured worlds.
The world of big data revolves around data integration. The more that enterprises rely on big data, and the more that data needs to move from place to place, the more a core data integration strategy and technology is needed. That means you can’t talk about big data without talking about big data integration.
Data integration technology providers have responded with technology that keeps up with the volume of data that moves from place to place. As linked to the growth of cloud computing above, providers also create technology with the understanding that data now moves within enterprises, between enterprises and clouds, and even from cloud to cloud. Finally, data integration providers know how to deal with both structured and unstructured data these days.
Third, better understanding around the value of information. Enterprise managers always knew their data was valuable, but perhaps they did not understand the true value that it can bring.
With the growth of big data, we now have access to information that helps us drive our business in the right directions. Predictive analytics, for instance, allows us to take years of historical data and determine patterns that allow us to predict the future. Mashing up our business data with external data sources makes our data even more valuable.
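At its simplest, the pattern-finding described here is a trend fitted to historical data and extrapolated forward. The sketch below uses toy data and ordinary least squares only – real predictive models are far richer – but it shows the idea:

```python
def linear_forecast(history, periods_ahead=1):
    """Fit an ordinary least-squares line through (period, value) points
    and extrapolate it forward -- the simplest form of predicting the
    future from historical patterns."""
    n = len(history)
    xs = range(n)
    mx = sum(xs) / n
    my = sum(history) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, history))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return intercept + slope * (n - 1 + periods_ahead)

# Four quarters of made-up revenue showing steady growth
revenue = [100, 110, 120, 130]
print(round(linear_forecast(revenue), 1))  # next quarter's projection
```

Mashing in external data sources amounts to adding more predictor series to a model like this, which is exactly where data integration earns its keep.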
Of course, data integration drives much of this growth. Thus the refocus on data integration approaches and tech. There are years and years of evolution still ahead of us, and much to be learned from the data we maintain.
Last fall, at a large industry conference, I had the opportunity to conduct a series of discussions with industry leaders in a portable video studio set up in the middle of the conference floor. As part of our exercise, we had a visual artist do freeform storyboarding of the discussion on large swaths of five-foot by five-foot paper, which we then reviewed at the end of the session. For example, in a discussion of cloud computing, the artist drew a rendering of clouds, raining data on a landscape below, illustrated by sketches of office buildings. At a glance, one could get a good read of where the discussion went, and the points that were being made.
Data visualization is one of those up-and-coming areas that has just begun to break into mainstream enterprise technology. There are some powerful front-end tools that help users to see, at a glance, trends and outliers through graphical representations – be they scattergrams, histograms or even 3D diagrams or something else eye-catching. The “Infographic” that has become so popular in recent years is an amalgamation of data visualization and storytelling. The bottom line is that technology can now generate these representations almost instantly, enabling relatively quick understanding of what the data may be saying.
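As a toy illustration of why even crude visual encodings beat scanning raw numbers, a few lines of code can turn a list of values into a bar-per-bucket view. This is purely illustrative – real visualization tools render far richer graphics – and the data is made up:

```python
from collections import Counter

def ascii_histogram(values, bin_width=10):
    """Bucket values and render one text bar per bucket -- a crude
    stand-in for the instant visual summaries described above."""
    bins = Counter((v // bin_width) * bin_width for v in values)
    lines = []
    for start in sorted(bins):
        bar = "#" * bins[start]
        lines.append(f"{start:>4}-{start + bin_width - 1:<4} {bar}")
    return "\n".join(lines)

# Made-up response times (ms); the outlier bucket jumps out at a glance
print(ascii_histogram([12, 15, 17, 21, 24, 26, 28, 95]))
```

Even in this plain-text form, the lone bar far from the others is visible instantly, while the same outlier hides in a comma-separated list of numbers.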
The power that data visualization is bringing organizations was recently explored by Benedict Carey in The New York Times, who discussed how data visualization is emerging as the natural solution to “big data overload.”
This is much more than a front-end technology fix, however. Rather, Carey cites a growing body of knowledge emphasizing the development of “perceptual learning,” in which people working with large data sets learn to “see” patterns and interesting variations in the information they are exploring. It’s almost a return of the “gut” feel for answers, but developed for the big data era.
As Carey explains it:
“Scientists working in a little-known branch of psychology called perceptual learning have shown that it is possible to fast-forward a person’s gut instincts both in physical fields, like flying an airplane, and more academic ones, like deciphering advanced chemical notation. The idea is to train specific visual skills, usually with computer-game-like modules that require split-second decisions. Over time, a person develops a ‘good eye’ for the material, and with it an ability to extract meaningful patterns instantaneously.”
Video games may be leading the way in this – Carey cites the work of Dr. Philip Kellman, who developed a video-game-like approach to training pilots to instantly “read” instrument panels as a whole, versus pondering every gauge and dial. He reportedly was able to enable pilots to absorb within one hour what normally took 1,000 hours of training. Such perceptual-learning based training is now employed in medical schools to help prospective doctors become familiar with complicated procedures.
There are interesting applications for business, bringing together a range of talent to help decision-makers better understand the information they are looking at. In Carey’s article, an artist was brought into a medical research center to help scientists look at data in many different ways – to get out of their comfort zones. For businesses, it means getting away from staring at bars and graphs on their screens and perhaps turning data upside down or inside-out to get a different picture.
For those hoping to push through a hard-hitting analytics effort that will serve as a beacon of light within an otherwise calcified organization, there’s probably a lot of work ahead. Evolving into an organization that fully grasps the power and opportunities of data analytics requires cultural change, and this is a challenge organizations have only begun to grasp.
“Sitting down with pizza and coffee can get you around most of the technical challenges,” explained Sam Ransbotham, Ph.D., associate professor at Boston College, at a recent panel webcast hosted by MIT Sloan Management Review, “but the cultural problems are much larger.”
That’s one of the key takeaways from the panel, in which Ransbotham was joined by Tuck Rickards, head of the digital transformation practice at Russell Reynolds Associates, a digital recruiting firm, and Denis Arnaud, senior data scientist at Amadeus Travel Intelligence. The panel, which examined the impact of corporate culture on data analytics, was led by Michael Fitzgerald, contributing editor at MIT Sloan Management Review.
The path to becoming an analytics-driven company is a journey that requires transformation across most or all departments, the panelists agreed. “It’s fundamentally different to be a data-driven decision company than kind of a gut-feel decision-making company,” said Rickards. “Acquiring this capability to do things differently usually requires a massive culture shift.”
That’s because the cultural aspects of the organization – “the values, the behaviors, the decision making norms and the outcomes go hand in hand with data analytics,” said Ransbotham. “It doesn’t do any good to have a whole bunch of data processes if your company doesn’t have the culture to act on them and do something with them.” Rickards adds that bringing this all together requires an agile, open source mindset, with frequent, open communication across the organization.
So how does one go about building and promoting a culture that is conducive to getting the maximum benefit from data analytics? The most important piece is bringing aboard people who are aware of and skilled in analytics – both from within the enterprise and from outside, the panelists urged. Ransbotham points out that it may seem daunting, but it’s not. “This is not some gee-whizz thing,” he said. “We have to get rid of this mindset that these things are impossible. Everybody who has figured it out has figured it out somehow. We’re a lot more able to pick up on these things than we think — the technology is getting easier, it doesn’t require quite as much as it used to.”
The key to evolving corporate culture to becoming more analytics-driven is to identify or recruit enlightened and skilled individuals who can provide the vision and build a collaborative environment. “The most challenging part is looking for someone who can see the business more broadly, and can interface with the various business functions –ideally, someone who can manage change and transformation throughout the organization,” Rickards said.
Arnaud described how his organization – an online travel service — went about building an esprit de corps between data analytics staff and business staff to ensure the success of their company’s analytics efforts. “Every month all the teams would do a hands-on workshop, together in some place in Europe [Amadeus is headquartered in Madrid, Spain].” For example, a workshop may focus on a market analysis for a specific customer, and the participants would explore the entire end-to-end process for working with the customer, “from the data collection all the way through data crunching and so on. The one knowing the data analysis techniques would explain them, and the one knowing the business would explain that, and so on.” As a result of these monthly workshops, business and analytics team members have found it “much easier to collaborate,” he added.
Web-oriented companies such as Amadeus – or Amazon and eBay for that matter — may be paving the way with analytics-driven operations, but companies in most other industries are not at this stage yet, both Rickards and Ransbotham point out. The more advanced web companies have built “an end-to-end supply chain, wrapped around customer interaction,” said Rickards. “If you think of most traditional businesses, financial services or automotive or healthcare are a million miles away from that. It starts with having analytic capabilities, but it’s a real journey to take that capability across the company.”
The analytics-driven business of the near future – regardless of industry – will likely be staffed with roles not yet seen today. “If you are looking to re-architect the business, you may be imagining roles that you don’t have in the company today,” said Rickards. Along with chief analytics officers, data scientists, and data analysts, there will be many new roles created. “If you are on the analytics side of this, you can be in an analytics group or a marketing group, with more of a CRM or customer insights title. You can be in planning or business functions. In a similar way on the technology side, there are people very focused on architecture and security.”
Ultimately, the demand will be for leaders and professionals who understand both the business and technology sides of the opportunity, Rickards continued. “You can have good people building a platform, and you can have good data scientists,” he added. “But you better have someone on top of that organization knowing the business purpose.”