Category Archives: CIO

There is Just One V in Big Data

According to Gartner, 64% of organizations surveyed have invested or are planning to invest in Big Data systems. More and more companies are diving into their data, trying to put it to use to minimize customer churn, analyze financial risk, and improve the customer experience.

Of that 64%, 30% have already invested in Big Data technology, 19% plan to invest within the next year, and another 15% plan to invest within two years. Yet less than 8% of Gartner’s 720 respondents have actually deployed Big Data technology. That gap is troubling, because it suggests most companies simply don’t know what they’re doing when it comes to Big Data.

Over the years, we have heard that Big Data is about Volume, Velocity, and Variety. I believe this limited view is one of the reasons why, despite the Big Data hype, most companies are still stuck in neutral.

  1. Volume: The sheer quantity of data, from terabytes and petabytes to exabytes and zettabytes
  2. Velocity: Streaming data, milliseconds to seconds; how fast data is produced and how fast it must be processed to meet the need or demand
  3. Variety: Structured and unstructured data, including text, multimedia, video, audio, sensor data, meter data, HTML, and e-mails

For us, the focus is on the collection of data. After all, we are prone to be hoarders, wired by our survival instinct to collect and stockpile for the leaner winter months that may come. So we hoard data, as much as we can, for the elusive “What if?” scenario: “Maybe this will be useful someday.” It’s this stockpiling of Big Data without application that makes it useless.

While Volume, Velocity, and Variety focus on the collection of data, Gartner in 2014 introduced three additional Vs: Veracity, Variability, and Value, which focus on the usefulness of the data.

  1. Veracity: Uncertainty due to data inconsistency and incompleteness, ambiguities, latency, deception, model approximations, accuracy, quality, truthfulness, or trustworthiness
  2. Variability: The differing ways in which the data may be interpreted; different questions require different interpretations
  3. Value: Data for co-creation and deep learning

I believe that perfecting as few as 5% of the relevant variables will get a business 95% of the same benefit. The trick is identifying that viable 5%, and extracting meaningful information from it. In other words, “Value” is the long pole in the tent.

Twitter @bigdatabeat


Succeeding with Analytics in a Big Data World

Big data is receiving a lot of press these days, including from this author. While there continues to be constructive dialog regarding whether volume, velocity, or variety is the most important attribute of the big data movement, one thing is clear: constructed correctly, big data has the potential to transform businesses by increasing sales and operational efficiencies. More importantly, when big data is combined with predictive analytics, it can improve customer experience, enable better targeting of potential customers, and improve the core business capabilities that are foundational to a business’s right to win.

The problem many in the vanguard have discovered is that their big data projects are fraught with risk if they are not built upon a solid data management foundation. During the Big Data Summit, you will learn directly from the vanguard of big data: how they have successfully transitioned from the traditional world of data management to a new world of big data analytics. Hear from market-leading enterprises like Johnson and Johnson, Transamerica, Devon Energy, KPN, and Western Union. As well, hear from Tom Davenport, Distinguished Professor in Management and Information Technology at Babson College and the bestselling author of “Competing on Analytics” and “Big Data at Work”. Tom initially thought big data was just another example of technology hype, but his research changed his mind; he will share, in particular, his perspective from interviewing hundreds of companies about the successes and failures of their big data initiatives. And finally, hear from big data thought leaders including Cloudera, Hortonworks, Cognizant, and Capgemini. They are all here to share their stories on how to avoid common pitfalls and accelerate your analytical returns in a big data world.

To attend in person, please join us on Tuesday the 12th at 1:30 in Las Vegas at the Big Data Summit. If you cannot join us in person, I will be sharing live tweets and videos on Twitter starting at 1:30 PST. Look for me at @MylesSuer on Twitter to follow along.

Related Blogs

What is Big Data and why should your business care?
Big Data: Does the emperor have their clothes on?
Should We Still be calling it Big Data?
CIO explains the importance of Big Data to healthcare
Big Data implementations need a systems view and to put in place trustworthy data.
The state of predictive analytics
Analytics should be built upon Business Strategy
Analytics requires continuous improvement too?
When should you acquire a data scientist or two?
Who Owns Enterprise Analytics and Data?
Competing on Analytics: A Follow Up to Thomas H. Davenport’s Post in HBR
Thomas Davenport Book “Competing On Analytics”
Analytics Stories: A Banking Case Study
Analytics Stories: A Financial Services Case Study
Analytics Stories: A Healthcare Case Study

Author Twitter: @MylesSuer

 


Great Customer Experiences Start with Great Customer Data


Are your customer-facing teams empowered with the great customer data they need to deliver great customer experiences?

On Saturday, I got a call from my broadband company on my mobile phone. The sales rep pitched a great limited-time offer for new customers. I asked him whether I could take advantage of this great offer as well, even though I am an existing customer. He was surprised.  “Oh, you’re an existing customer,” he said, dismissively. “No, this offer doesn’t apply to you. It’s for new customers only. Sorry.” You can imagine my annoyance.

If this company had built a solid foundation of customer data, the sales rep would have had a customer profile rich with clean, consistent, and connected information as a reference. If he had visibility into my total customer relationship with his company, he’d know that I’m a loyal customer with two current service subscriptions. He’d know that my husband and I have been customers for 10 years at our current address. On top of that, he’d know we both subscribed to their services while living at separate addresses before we were married.

Unfortunately, his company didn’t arm him with the great customer data he needs to be successful. If they had, he could have taken the opportunity to offer me one of the four services I currently don’t subscribe to—or even a bundle of services. And I could have shared a very different customer experience.

Every customer interaction counts

Executives at companies of all sizes talk about being customer-centric, but it’s difficult to execute on that vision if you don’t manage your customer data like a strategic asset. If delivering seamless, integrated, and consistent customer experiences across channels and touch points is one of your top priorities, every customer interaction counts. But without knowing exactly who your customers are, you cannot begin to deliver the types of experiences that retain existing customers, grow customer relationships and spend, and attract new customers.

How would you rate your current ability to identify your customers across lines of business, channels and touch points?

Many businesses, however, have anything but an integrated and connected customer-centric view—they have a siloed and fragmented channel-centric view. In fact, sales, marketing, and call center teams often identify siloed and fragmented customer data as key obstacles preventing them from delivering great customer experiences.


Many companies are struggling to deliver great customer experiences across channels because their siloed systems give them a channel-centric view of customers.

According to Retail Systems Research, creating a consistent customer experience remains the most valued capability for retailers, but 55% of those surveyed indicated their biggest inhibitor was not having a single view of the customer across channels.

Retailers are not alone. An SVP of marketing at a mortgage company admitted in an Argyle CMO Journal article that, now that his team needs to deliver consistent customer experiences across channels and touch points, they realize they are not as customer-centric as they thought they were.

Customer complexity knows no bounds

The fact is, businesses are complicated, with customer information fragmented across divisions, business units, channels, and functions.

Citrix, for instance, is bringing together valuable customer information from 4 systems. At Hyatt Hotels & Resorts, it’s about 25 systems. At MetLife, it’s 70 systems.

How many applications and systems would you estimate contain valuable customer information at your company?

Based on our experience working with customers across many industries, we know the total customer relationship allows:

  • Marketing to boost response rates by better segmenting their database of contacts for personalized marketing offers.
  • Sales to more efficiently and effectively cross-sell and up-sell the most relevant offers.
  • Customer service teams to resolve customers’ issues immediately, instead of placing them on hold to hunt for information in a separate system.

If your marketing, sales, and customer service teams are struggling with inaccurate, inconsistent, and disconnected customer information, it is costing your company revenue, growth, and success.

Transforming customer data into total customer relationships

Informatica’s Total Customer Relationship Solution fuels business and analytical applications with clean, consistent and connected customer information, giving your marketing, sales, e-commerce and call center teams access to that elusive total customer relationship. It not only brings all the pieces of fragmented customer information together in one place where it’s centrally managed on an ongoing basis, but also:

  • Reconciles customer data: Your customer information should be the same across systems, but often isn’t. Assess its accuracy, fixing and completing it as needed, for instance by merging duplicate profiles under “Jakki” and “Jacqueline” (see the sketch below).
  • Reveals valuable relationships between customers: Map critical connections. Are individuals members of the same household or influencer network? Are two companies part of the same corporate hierarchy? Even link customers to personal shoppers, insurance brokers, sales people, or channel partners.
  • Tracks thorough customer histories: Identify customers’ preferred locations; channels, such as stores, e-commerce, and catalogs; or channel partners.
  • Validates contact information: Ensure email addresses, phone numbers, and physical addresses are complete and accurate so invoices, offers, or messages actually reach customers.

With a view of the Total Customer Relationship, teams are empowered to deliver great customer experiences
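To make the “reconciles customer data” step concrete, here is a minimal matching-and-merge sketch in Python. It is only an illustration under assumed field names, a tiny nickname table, and a naive match key; it is not Informatica’s matching logic, which is far more sophisticated.

```python
# Minimal illustration of reconciling duplicate customer profiles.
# Field names, the nickname map, and the match rule are assumptions
# for illustration only, not any vendor's actual matching logic.

NICKNAMES = {"jakki": "jacqueline", "jacqui": "jacqueline"}

def normalize(profile):
    """Return a simple match key: canonical first name + last name + postcode."""
    first = profile["first_name"].strip().lower()
    first = NICKNAMES.get(first, first)
    last = profile["last_name"].strip().lower()
    postcode = profile.get("postcode", "").replace(" ", "")
    return (first, last, postcode)

def merge(profiles):
    """Group profiles by match key and fold each group into one golden record."""
    golden = {}
    for p in profiles:
        key = normalize(p)
        record = golden.setdefault(key, {"source_ids": []})
        record["source_ids"].append(p["id"])
        for field, value in p.items():
            if field != "id" and value and not record.get(field):
                record[field] = value  # keep the first non-empty value per field
    return list(golden.values())

crm = {"id": "CRM-1", "first_name": "Jakki", "last_name": "Moore",
       "postcode": "94025", "email": "jakki@example.com"}
billing = {"id": "BIL-9", "first_name": "Jacqueline", "last_name": "Moore",
           "postcode": "94025", "phone": "+1-555-0100", "email": ""}

print(merge([crm, billing]))  # one merged profile carrying both source ids
```

In practice, matching relies on probabilistic or fuzzy techniques and survivorship rules rather than a simple first-non-empty-value merge, but the idea is the same: many fragmented records resolve to one trusted profile.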

This is just the beginning. From here, imagine enriching your customer profiles with third-party data. What types of information help you better understand, sell to, and serve your customers? What are your plans for incorporating social media insights into your customer profiles? What could you do with this additional customer information that you can’t do today?

We’ve helped hundreds of companies across numerous industries build a total customer relationship view. Merrill Lynch boosted marketing campaign effectiveness by 30%. Citrix boosted conversion rates by 20%. A $60 billion global manufacturer improved cross-sell and up-sell success by 5%. A hospitality company boosted cross-sell and up-sell success by 60%. And Logitech increased sales across channels, including their online site, retail stores, and distributors.

Informatica’s Total Customer Relationship Solution empowers your people with confidence, knowing that they have access to the kind of great customer data that allows them to surpass customer acquisition and retention goals by providing consistent, integrated, and seamless customer experiences across channels. The end result? Great experiences that customers are inspired to share with their family and friends at dinner parties and on social media.

Do you have a terrible customer experience or a great customer experience to share? If so, please share it with us and other readers using the Comment option below.


When Should You Acquire a Data Scientist or Two?

Data Science should change how your business is run

The importance of data science is becoming more and more clear. Marc Benioff says, “I think for every company, the revolution in data science will fundamentally change how we run our business. There’s just a huge amount more data than ever before; our greatest challenge is making sense of that data.” He goes on to say that “we need a new generation of executives who understand how to manage and lead through data. And we also need a new generation of employees who are able to help us organize and structure our business around data.” Marc then says, “when I look at the next set of technologies that we have to build at Salesforce, it is all data science based technology.” Ram Charan, in his Fortune Magazine article “The Algorithmic CEO” (March 2015, page 45), says that to thrive, companies and the execs who run them must transform into math machines.

With such powerful endorsements for data science, you may be asking when you should hire a data scientist or two. There is more than one answer. I liken data science to any business research: you need to do your upfront homework for the data scientists you hire to be effective.

Create a situation analysis before you start

You need to start by defining your problem: are you losing sales, is it taking too long to manufacture something, are you less profitable than you would like to be, and so on. Next, you should create a situation analysis. You want to arm your data scientists with as much information as possible to define what you want them to solve or change. Be as concrete as possible here; data scientists struggle when the business people they work with are vague. It is also important that you indicate what kinds of business changes will be considered if the model and data deliver one result or another.

Next you need to catalog the data that you already have which is relevant to the business problem. Without relevant data there is little that the data scientist can do to help you. With relevant data sources in hand, you need to define the range of actions that you can possibly take once a model has been created.

Be realistic about what is required

With these things in hand, it may be time to hire some data scientists. As you start your process, you need to be realistic about the difficulty of getting a top-flight data scientist. Many of my customers have complained about the difficulty of competing with Google and other tech startups. As important, “there is a huge variance in the quality and ability of data scientists” (Data Science for Business, Foster Provost, O’Reilly, page 321). Once you have hired someone, keep in mind that effective data science requires business and data science collaboration. Know, too, that data scientists struggle when business people don’t appreciate the effort needed to get an appropriate training data set or model evaluation procedures.

Make sure internal or external data scientists give you an effective proposal

Once your data scientists are in place, you should realize that a data scientist worth their salt will create a proposal back to you. As we have said, it is important that you know what kinds of things will happen if the model and data deliver one result or another. Data scientists, in turn, will be able to narrow things down to a dollar impact.

Their proposal should start by sharing their understanding of the business and the data that is available. What business problems are they trying to solve? Next, the data scientist may define things like whether supervised or unsupervised learning will be used. They should openly discuss what effort will be involved in data preparation, and they should identify the target variable (whose values will be predicted). They should then describe their modeling approach: whether more than one model will be evaluated, how the models will be compared, and how the final model will be selected. And finally, they should discuss how the model will be evaluated and deployed. Are there evaluation and setup metrics? Data scientists can also dedicate time and resources in their proposal to determining which things are real versus expected drivers.
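To make the modeling and evaluation portion of such a proposal concrete, here is a minimal sketch of comparing more than one candidate model with cross-validation before selecting one. It assumes Python with scikit-learn and uses a synthetic stand-in dataset; the models, metric, and fold count are illustrative choices, not a prescribed methodology.

```python
# Illustrative sketch: compare two candidate models on a common metric
# before selecting one. Assumes scikit-learn; the dataset is synthetic.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Stand-in for a prepared training set; the target is the variable to predict
# (e.g., "did the customer churn?").
X, y = make_classification(n_samples=2000, n_features=20, weights=[0.9, 0.1],
                           random_state=0)

candidates = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
}

# Compare models on ROC AUC with 5-fold cross-validation, then pick the best.
scores = {name: cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
          for name, model in candidates.items()}
best = max(scores, key=scores.get)
print(scores, "-> selected:", best)
```

A good proposal would also state how the selected model will be monitored and evaluated after deployment, not just how it is chosen during development.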

To make all this work, it can be a good idea for data scientists to talk in their proposal about likelihoods, because business people who have not been through a quantitative MBA may not understand or remember statistics. It is also important that data scientists ask business people the “so what” questions before they begin if the situation analysis is inadequate.

Leading an internal analytics team

In some cases, analytical teams will be built internally. Where this occurs, it is really important that the analytics leader have good people skills. They also need to be able to set the expectation that people will make decisions from data and analysis. This includes having the ability to push back when someone comes to them with a recommendation based on gut feel.

The leader needs to hire smart analysts. To keep them, they need a stimulating and supportive work environment. Tom Davenport says analysts are motivated by interesting and challenging work that allows them to utilize their highly specialized skills. Like millennials, analysts appreciate money, but they are motivated more by exciting work and the opportunity to grow and stretch their skills. Know that data scientists want to spend time refining analytical models rather than doing simple analyses and report generation. Most importantly, they want to do important work that makes a meaningful contribution. To do this, they want to feel supported and valued, but have autonomy at work, including the freedom to organize their work. At the same time, analysts like to work together, and they like to be surrounded by other smart and capable colleagues. Make sure to treat your data scientists as a strategic resource. This means you need development plans, career plans, and performance management processes.

Parting remarks

As we have discussed, make sure to do your homework before contracting or hiring data scientists. Once you have done your homework, if you are an analytics leader, make sure that you create a stimulating environment. Additionally, prove the value of analytics by signing up for results that demonstrate data modeling efficacy. To do this, look for business problems where analytics will make a big difference. And finally, if you need an analytics leader to emulate, look no further than Brian Cornell, the new CEO of Target.

Solution Briefs

Next Generation Analytics

Related Blogs

Big Data: The Emperor may have their Clothes on but…

Should We Still be calling it Big Data?

Analytics Stories: A Banking Case Study

Analytics Stories: A Pharmaceutical Case Study

Analytics Stories: An Educational Case Study

Analytics Stories: A Financial Services Case Study

Analytics Stories: A Healthcare Case Study

Major Oil Company Uses Analytics to Gain Business Advantage

Is it the CDO or CAO or Someone Else?

What Should Come First: Business Processes or Analytics?

Should Analytics Be Focused on Small Questions Versus Big Questions?

Who Owns Enterprise Analytics and Data?

Competing on Analytics

Is Big Data Destined To Become Small And Vertical?

Big Data Why?

What is big data and why should your business care?

Myles on Twitter: @MylesSuer


What is an Enterprise Architecture Maturity Model?

Enterprise IT is in a state of constant evolution. As a result, business processes and technologies become increasingly difficult to change and more costly to keep up-to-date. The solution to this predicament is an Enterprise Architecture (EA) process that provides a framework for an optimized IT portfolio. An IT Optimization strategy should be based on a comprehensive set of architectural principles that ensure consistency and make IT more responsive, efficient, and economical.

The rationalization, standardization, and consolidation process helps organizations understand their current EA maturity level and move forward on the appropriate roadmap. As they undertake the IT Optimization journey, the IT architecture matures through several stages, leveraging IT Optimization Architecture Principles to attain each level of maturity.


Multiple Levels of Enterprise Architecture Maturity Model

Level 1: The first step involves helping a company develop its architecture vision and operating model, with attention to cost, globalization, investiture, or whatever is driving the company strategically. Once that vision is in place, enterprise architects can guide the organization through an iterative process of rationalization, consolidation, and eventually shared-services and cloud computing.

Level 2: The rationalization exercise helps an organization identify what standards to move towards as they eliminate the complexities and silos they have built up over the years, along with the specific technologies that will help them get there.

Depending on the company, rationalization could start with a technical discussion and be IT-driven, or it could start at a business level. For example, a company might have distributed operations across the globe and a desire to consolidate and standardize its business processes. That could drive change in the IT portfolio. Or a company that has gone through mergers and acquisitions might have redundant business processes to rationalize.

Rationalizing involves understanding the current state of an organization’s IT portfolio and business processes, and then mapping business capabilities to IT capabilities. This is done by developing scoring criteria to analyze the current portfolio, and ultimately by deciding on the standards that will propel the organization forward. Standards are the outcome of a rationalization exercise.
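As a simple illustration of what scoring criteria might look like in practice, here is a short Python sketch. The criteria, weights, applications, and 1-to-5 scores are hypothetical; a real rationalization exercise would define them jointly with the business.

```python
# Hypothetical scoring of an application portfolio for rationalization.
# Criteria, weights, and 1-5 scores are illustrative assumptions; for every
# criterion a higher score is better (e.g., higher cost_to_run score = cheaper).
WEIGHTS = {"business_value": 0.4, "technical_fit": 0.3,
           "cost_to_run": 0.2, "redundancy": 0.1}

portfolio = [
    {"app": "Legacy Billing", "business_value": 4, "technical_fit": 1,
     "cost_to_run": 1, "redundancy": 2},
    {"app": "CRM (SaaS)", "business_value": 5, "technical_fit": 4,
     "cost_to_run": 4, "redundancy": 5},
    {"app": "Homegrown Reporting", "business_value": 2, "technical_fit": 2,
     "cost_to_run": 2, "redundancy": 1},
]

def score(app):
    """Weighted score; low totals flag candidates to retire, replace, or consolidate."""
    return sum(WEIGHTS[criterion] * app[criterion] for criterion in WEIGHTS)

for app in sorted(portfolio, key=score):
    print(f"{app['app']:20s} {score(app):.2f}")
```

Low-scoring applications become the candidates for retirement or replacement, and the surviving platforms feed directly into the standards decision described above.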

Standardized technology represents the second level of EA maturity. Organizations at this level have evolved beyond isolated, independent silos. They have well-defined corporate governance and procurement policies, which yield measurable cost savings through reduced software licenses and the elimination of redundant systems and skill sets.

Level 3: Consolidation entails reducing the footprint of your IT portfolio. That could involve consolidating the number of database servers, application servers and storage devices, consolidating redundant security platforms, or adopting virtualization, grid computing, and related consolidation initiatives.

Consolidation may be a by-product of another technology transformation, or it may be the driver of these transformations. But whatever motivates the change, the key is to be in alignment with the overall business strategy. Enterprise architects understand where the business is going so they can pick the appropriate consolidation strategy.

Level 4: One of the key outcomes of a rationalization and consolidation exercise is the creation of a strategic roadmap that continually keeps IT in line with where the business is going.

Having a roadmap is especially important when you move down the path to shared services and cloud computing. For a company that has a very complex IT infrastructure and application portfolio, having a strategic roadmap helps the organization to move forward incrementally, minimizing risk, and giving the IT department every opportunity to deliver value to the business.

Twitter @bigdatabeat

Share
Posted in 5 Sales Plays, Application Retirement, Architects, Business Impact / Benefits, Business/IT Collaboration, CIO, Cloud, Mergers and Acquisitions | Tagged , , , , | Leave a comment

Speed: The #1 IT Challenge

Speed is the top challenge facing IT today, and it’s reaching crisis proportions at many organizations.  Specifically, IT needs to deliver business value at the speed that the business requires.

The challenge does not end there; this has to be accomplished without compromising cost or quality. Many people have argued that you only get two out of three on the Speed/Cost/Quality triangle, but I believe that achieving all three is the central challenge facing Enterprise Architects today. Many people I talk to are looking at agile technologies, and in particular Agile Data Integration.

There have been a lot of articles written about the challenges, but it’s not all doom and gloom. Here is something you can do right now to dramatically increase the speed of your project delivery while improving cost and quality at the same time: take a fresh look at your Agile Data Integration environment, and specifically at Data Virtualization. Data Virtualization offers the opportunity to simplify and speed up the data part of enterprise projects, which is where more and more projects are spending 40% or more of their time. For more information and an industry perspective, you can download the latest Forrester Wave report for Data Virtualization, Q1 2015.

Here is a quick example of how you can use Data Virtualization technology for rapid prototyping to speed up business value delivery (a minimal code sketch follows the list):

  • Use data virtualization technology to present a common view of your data to your business-IT project teams.
  • IT and business can collaborate in real time to access and manage data from a wide variety of very large data sources, eliminating the long, slow cycles of passing specifications back and forth between business and IT.
  • Your teams can discover, profile, and manage data using a single virtual interface that hides the complexity of the underlying data.
  • By working with a virtualization layer, you are assured that your teams are using the right data, data that can be verified by linking it to a Business Glossary with clear terms, definitions, owners, and business context to reduce the chance of misunderstandings and errors.
  • Leading offerings in this space include data quality and data masking tools in the interface, ensuring that you improve data quality in the process.
  • Data virtualization means that your teams can be delivering in days rather than months, and faster delivery means lower cost.
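Here is a toy sketch of the first bullet, presenting one common view over two source systems, using Python and an in-memory SQLite database. A real data virtualization layer federates live sources without copying the data and adds profiling, masking, and glossary links; the table and column names below are assumptions for illustration only.

```python
# Illustrative only: a single SQL view presenting a common picture over two
# "sources". Real data virtualization federates live systems without copying;
# the table and column names here are assumptions.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE crm_customers (customer_id TEXT, name TEXT, segment TEXT);
CREATE TABLE billing_accounts (customer_id TEXT, product TEXT, monthly_spend REAL);

INSERT INTO crm_customers VALUES ('C1', 'Jacqueline Moore', 'Consumer'),
                                 ('C2', 'Acme Corp', 'Enterprise');
INSERT INTO billing_accounts VALUES ('C1', 'Broadband', 49.0),
                                    ('C1', 'TV', 30.0),
                                    ('C2', 'Leased Line', 900.0);

-- The "virtual" layer: one view the business and IT team can both explore.
CREATE VIEW customer_360 AS
SELECT c.customer_id, c.name, c.segment,
       COUNT(b.product) AS products, SUM(b.monthly_spend) AS monthly_spend
FROM crm_customers c LEFT JOIN billing_accounts b USING (customer_id)
GROUP BY c.customer_id, c.name, c.segment;
""")

for row in db.execute("SELECT * FROM customer_360"):
    print(row)
```

The point of the prototype is that business and IT iterate against the shared view, not against copies of specifications or copies of data.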

There has been a lot of interest in agile development, especially as it relates to data projects.  Data Virtualization is a key tool to accelerate your team in this direction.

Informatica has a leading position in the Forrester report not only because of the productivity of its Agile Data Integration environment, but also because of the integration with the rest of the Informatica platform. From an architect’s point of view, it is critical to start standardizing on an enterprise data management platform. Continuing data and data-tool fragmentation will only slow down future project delivery. The best way to deal with the growing complexity of both data and tools is to drive standardization within your organization.


IT Leadership Group Discusses Governance and Analytics

I recently got to talk to several senior IT leaders about their views on information governance and analytics. Participating were a telecom company, a government transportation entity, a consulting company, and a major retailer. Each shared openly in what was a free flow of ideas.

The CEO and corporate culture are critical to driving a fact-based culture

I started this discussion by sharing the COBIT Information Life Cycle. Everyone agreed that the starting point for information governance needs to be business strategy and business processes. However, this prompted an extremely interesting discussion about enterprise analytics readiness. Most said that they are in the midst of leading the proverbial horse to water; in this case, the horse is the business. The CIO in the group said that he personally is all about the data and making factual decisions, but his business is not really there yet. I asked everyone at this point about the importance of culture and the CEO. Everyone agreed that the CEO is incredibly important in driving a fact-based culture. Apparently, people like the new CEO of Target are in the vanguard and not the mainstream yet.

KPIs need to be business drivers

The above CIO said that too many of his managers are operationally, day-to-day focused and don’t understand the value of analytics or of predictive analytics. He said that he needs to teach the business to think analytically, to understand how analytics can help drive the business, and to know how to use Key Performance Indicators (KPIs). The enterprise architect in the group shared at this point that he had previously worked for a major healthcare organization. When the organization was asked to determine a list of KPIs, it came back with 168 of them. Obviously, this could not work, so he explained to the business that an effective KPI must be a “driver of performance”. He stressed to the healthcare organization’s leadership the importance of having fewer KPIs and of tying those that remain to business capabilities and performance drivers.

IT increasingly needs to understand its customers’ business models

I shared at this point that I visited a major Italian bank a few years ago. The key leadership had high-definition displays that rolled past a new analytic every five minutes. Everyone laughed at the absurdity of having so many KPIs. With that said, everyone felt that they needed to get business buy-in, because only the business can derive value from acting upon the data. According to this group of IT leaders, this is pushing them more and more to understand their customers’ business models.

Others said that they were trying to create an omni-channel view of customers. The retailer wanted to get more predictive. Theodore Levitt said the job of marketing is to create and keep a customer; this retailer is focused on keeping customers and bringing them back more often. They want to give customers offers, informed by customer data, that increase sales, much like what I recently described happening at 58.com, eBay, and Facebook.

Most say they have limited governance maturity

We talked about where people are in their governance maturity. Even though I wanted to gloss over this topic, the group wanted to spend time here and compare notes with each other. Most said that they were at stage 2 or 3 in a five-stage governance maturity process. One CIO asked whether anyone ever gets to level 5. Like analytics, governance was being pushed forward by IT rather than the business. Nevertheless, everyone said that they are working to get data stewards defined for each business function. At this point, I asked about the elements that COBIT 5 suggests go into good governance. I shared that it should include the following four elements: 1) clear information ownership; 2) timely, correct information; 3) clear enterprise architecture and efficiency; and 4) compliance and security. Everyone felt the definition was fine but wanted specifics for each element. I referred them, and you, to my recent article in COBIT Focus.

CIO says they are the custodians of data only

At this point, one of the CIOs said something incredibly insightful: “We are not data stewards. This has to be done by the business; IT is the custodian of the data.” More specifically, IT should not manage data, but should make sure that what the business needs done gets done with data. Everyone agreed with this point and even reused the term “data custodians” several times during the next few minutes. Debbie Lew of COBIT said the same thing just last week. According to her, “IT does not own the data. They facilitate the data”. From here, the discussion moved to security and data privacy. The retailer in the group was extremely concerned about privacy and felt that they needed masking and other data-level technologies to ensure a breach minimally impacts their customers. At this point, another IT leader in the group said that it is the job of IT leadership to make sure the business does the right things in security and compliance. I shared here that one of my CIO friends had said that “the CIOs at the retailers with breaches weren’t stupid—it is just hard to sell the business impact”. The CIO in the group said we need to do risk assessments (also a big thing for COBIT 5) that get the business to say we have to invest to protect. “It is IT’s job to adequately explain the business risk”.

Is mobility a driver of better governance and analytics?

Several shared, towards the end of the evening, that mobility is an increasing impetus for better information governance and analytics. Mobility is driving business users and business customers to demand better information and, thereby, better governance of information. Many said that a starting point for providing better information is data mastering. These attendees felt as well that data governance involves helping the business determine its relevant business capabilities and business processes. It seems that these should come naturally, but once again, IT at these organizations seems to be pushing the business across the finish line.

Additional Materials

Solution Page:

Corporate Governance

Data Security

Governance Maturity Assessment Tool

Blogs and Articles:

Good Corporate Governance Is Built Upon Good Information and Data Governance

Using COBIT 5 to Deliver Information and Data Governance

Twitter:

@MylesSuer

 

 


Next Generation Planning for Agile Business Transformation

This is an age of technology disruption and digitization. Winners will be those organizations that can adapt quickly and drive business transformation on an ongoing basis.

When I first met John Schmidt, Vice President of Global Integration Services at Informatica, he asked me to visualize Business Transformation as “a modern tool like the internet and Google Maps, with which planning a road trip from New York to San Francisco with a number of stops along the way to visit friends or see some sights takes just minutes. So you’re halfway through the trip and a friend calls to say he has suddenly been called out of town; you get on your mobile phone and within a few minutes, you have a new roadmap and a new plan.”

So, why is it that creating a roadmap for an enterprise initiative takes months or even years, and that, upon development of such a plan, it is nearly impossible to change even when new information or external events invalidate it? A single transformation is useful, but what you really want is the ability to transform your business on an ongoing basis. You need to be agile in planning the transformation initiative itself. Is it even feasible to achieve a planning capability for complex enterprise initiatives that approaches the speed and agility of cross-country road-trip planning?

The short answer is YES; you can get much faster if you do three things:

First, throw out old notions of how planning in complex corporate environments is done, while keeping in mind that planning an enterprise transformation is fundamentally different than planning a focused departmental initiative.

Second, invest in tools equivalent to Google Maps for building the enterprise roadmap. Google Maps works because it leverages a database of information about roads, rules of the roads, related local services, and points of interest. In short, Google Map the enterprise, which is not as onerous as it sounds.

Third, develop a team of Enterprise Architects and planners with the skills and discipline to use the BOST™ Framework to maintain the underlying reference data about the business, its operations, the systems that support it, and the technologies that they are based on. This will provide the execution framework for your organization to deliver the data to fuel your business initiatives and digital strategy.

The result is closer alignment of your business and IT organizations, fewer errors due to communication issues, and, because your business plans are linked directly to the underlying technical implementation, faster delivery of business value.

This is not some “pie in the sky” theory or a futuristic dream. What you need is a tool like Google Maps for Business Transformation. That tool is the BOST™ Toolkit, which leverages the BOST™ Framework: through models, elements, and associated relationships built around an underlying metamodel, it interprets enterprise processes using a four-dimensional view of business, operations, systems, and technology. Informatica built the BOST™ Framework in collaboration with certified partners. It provides an Architecture-led Planning approach to business transformation.

Benefits of Architecture-led Planning

The Architecture-led Planning approach is effective when applied with governance and oversight. The following four features describe the benefits:

Enablement of Business and IT Collaboration – Uses a common reference model to facilitate cross-functional business alignment, as well as alignment between business and IT. The model gets everyone on the same page, regardless of line of business, location, or IT function. This model explicitly and dynamically starts with business strategy and links from there to the technical implementation.

Data-driven Planning – Being able to capture planning data in a structured repository enables rapid planning. A data-driven plan is dynamic and adaptable to changing circumstances. When the plan changes, rather than updating dozens of documents, you simply apply the change to the relevant components in the enterprise model repository, and all business and technical model views that reference those components update automatically (a toy sketch follows this list of benefits).

Cross-Functional Decision Making – Cross-functional decision-making is facilitated in several ways. First, by showing interdependencies between functions, business operations, and systems, the holistic view helps each department or team to understand the big-picture and its role in the overall process. Second, the future state architectural models are based on a view of how business operations will change. This provides the foundation to determine the business value of the initiative, measure your progress, and ultimately report the achievement of the goals. Quantifiable metrics help decision makers look beyond the subjective perspectives and agree on fact-based success metrics.

Reduced Execution Risk – Reduced execution risk results from having a robust and holistic plan based on a rigorous analysis of all the dependent enterprise components in the business, operations, systems and technology view. Risk is reduced with an effective governance discipline both from a program management as well as from an architectural change perspective.
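Here is the toy sketch of the data-driven planning idea referenced above, in Python: plan views hold only references to components in a single repository, so changing a component once is reflected in every view that uses it. The component types, identifiers, and fields are hypothetical and are not the BOST™ metamodel.

```python
# Toy illustration of data-driven planning: plan views reference shared
# components by id, so updating the repository updates every view.
# Component types, ids, and fields are hypothetical, not the BOST(TM) metamodel.

repository = {
    "SYS-001": {"type": "system", "name": "Order Management", "status": "current"},
    "PRC-010": {"type": "process", "name": "Order-to-Cash", "owner": "Sales Ops"},
}

views = {
    "business_view": ["PRC-010"],
    "systems_view": ["SYS-001", "PRC-010"],
}

def render(view_name):
    """Resolve a view's component references against the single repository."""
    return [repository[component_id] for component_id in views[view_name]]

# One change in the repository...
repository["SYS-001"]["status"] = "to be retired (Q4)"

# ...is reflected in every view that references the component.
for name in views:
    print(name, render(name))
```

Contrast this with document-based planning, where the same change would have to be re-keyed into every roadmap deck and specification that mentions the system.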

Business Transformation with Informatica

Integrated Program Planning is for organizations that need large or complex Change Management assistance. Examples of candidates for Integrated Program Planning include:

Enterprise Initiatives: Large-scale mergers or acquisitions, switching from a product-centric operating model to more customer-centric operations, restructuring channel or supplier relationships, rationalizing the company’s product or service portfolio, or streamlining end-to-end processes such as order-to-cash, procure-to-pay, hire-to-retire or customer on-boarding.

Top-level Directives: Examples include board-mandated data governance, regulatory compliance initiatives that have broad organizational impacts such as data privacy or security, or risk management initiatives.

Expanding Departmental Solutions into Enterprise Solutions: Successful solutions in specific business areas can often be scaled-up to become cross-functional enterprise-wide initiatives. For example, expanding a successful customer master data initiative in marketing to an enterprise-wide Customer Information Management solution used by sales, product development, and customer service for an Omni-channel customer experience.

Twitter @bigdatabeat

The BOST™ Framework identifies and defines enterprise capabilities. These capabilities are modularized as reconfigurable and scalable business services. They are independent of organizational silos and politics, which gives strategists, architects, and planners the means to drive for high performance across the enterprise, regardless of the shifting set of strategic business drivers. The BOST™ Toolkit facilitates building and implementing new or improved capabilities, adjusting business volumes, and integrating with new partners or acquisitions through common views of these building blocks and through reusing solution components. In other words: better, faster, cheaper projects.

The BOST View creates a visual understanding of the relationship between business functions, data, and systems. It helps with the identification of relevant operational capabilities and underlying support systems that need to change in order to achieve the organization’s strategic objectives. The result will be a more flexible business process with greater visibility and the ability to adjust to change without error.


Top 5 Big Data Mistakes

I won’t say I’ve seen it all; I’ve only scratched the surface in the past 15 years. Below are some of the mistakes I’ve made or fixed during this time.

MongoDB as your Big Data platform

Ask yourself: why am I picking on MongoDB? Because the NoSQL database most abused at this point is MongoDB. While Mongo has an aggregation framework that tastes like MapReduce, and even a (very poorly documented) Hadoop connector, its sweet spot is as an operational database, not an analytical system.

RDBMS schema as files

You dumped each table from your RDBMS into a file, stored it on HDFS, and now plan to use Hive on it. You know that Hive is slower than an RDBMS; it’ll use MapReduce even for a simple select. Next, let’s look at file sizes: you have flat files measured in single-digit kilobytes.

Hadoop does best on large sets of relatively flat data. I’m sure you can create an extract that’s more de-normalized.
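As a hedged sketch of what a flatter extract might look like, here is a PySpark job that joins an orders extract to its customer dimension and writes one wide dataset in a modest number of large files. The paths, column names, and file counts are assumptions for illustration, not a prescribed pipeline.

```python
# Hypothetical sketch: denormalize per-table RDBMS extracts into one wide
# dataset before analyzing it on Hadoop. Paths and column names are assumptions.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("denormalized-extract").getOrCreate()

orders = spark.read.parquet("/landing/orders")        # one dataset per table...
customers = spark.read.parquet("/landing/customers")  # ...as dumped from the RDBMS

# One wide, relatively flat dataset instead of many small per-table files.
wide = (orders
        .join(customers, on="customer_id", how="left")
        .select("order_id", "order_date", "order_total",
                "customer_id", "customer_name", "customer_segment"))

# Write a modest number of large files rather than thousands of tiny ones.
wide.coalesce(8).write.mode("overwrite").parquet("/analytics/orders_wide")
```

The same idea applies whether the join runs in Spark, Hive, or back in the source database before the export: give Hadoop fewer, larger, flatter files.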

Data Ponds

Instead of creating a single Data Lake, you created a series of data ponds or a data swamp. Conway’s law has struck again; your business groups have created their own mini-repositories and data analysis processes. That doesn’t sound bad at first, but with different extracts and ways of slicing and dicing the data, you end up with different views of the data, i.e., different answers for some of the same questions.

Schema-on-read doesn’t mean, “Don’t plan at all,” but it means “Don’t plan for every question you might ask.”

Missing use cases

Vendors, to escape the constraints of departmental funding, are selling the idea of the data lake. The byproduct of this is that the business loses sight of real use cases. The data-lake approach can be valid, but you won’t get much out of it if you don’t have actual use cases in mind.

It isn’t hard to come up with use cases, but too often they are an afterthought. The business should start thinking about use cases when its databases can no longer handle the load.

SQL

You like SQL. Query languages and techniques have changed with time. Today, think of Pig as PL/SQL on steroids with maybe a touch of acid.

To do a larger bit of analytics, you may need a bigger tool set, one that includes Hive, Pig, MapReduce, R, and more.

Twitter @bigdatabeat
