Category Archives: CIO

Happy Holidays, Happy HoliData

In case you have missed our #HappyHoliData series on Twitter and LinkedIn, I decided to provide a short summary of best practices that unleash information potential. Simply scroll and click on the case study that is relevant to you and your business. The series touches on different industries and use cases, but all have one thing in common: each treats information quality as a key value to the business, essential to delivering the right services or products to the right customer.


Thanks a lot to all my great teammates, who made this series happen.

Happy Holidays, Happy HoliData.

Posted in B2B, B2B Data Exchange, Banking & Capital Markets, Big Data, CIO, CMO, Customers, Data Governance, Data Quality, Enterprise Data Management, Financial Services, Governance, Risk and Compliance, Manufacturing, Master Data Management, PaaS, PiM, Product Information Management, Retail, SaaS

It is All About the “I”

Six ideas for CIOs in 2015 to put the innovation back in CIO

For most, the "I" in CIO stands for Information. But what about that other "I", Innovation? For many IT organizations, 60-80% of IT spending continues to be tied up in keeping the IT lights on. Yet innovation matters more than ever to the business bottom line. According to Geoffrey Moore, "without innovation, offerings become more and more like each other. They commoditize." ("Dealing with Darwin", Geoffrey Moore, page 1). Moore goes on to say later in "Dealing with Darwin" that commoditization will, over time, drop business returns to "the cost of capital". Clearly, this is not a place any CIO would want their enterprise to consciously go.

Given this, what is the role of the CIO in driving enterprise innovation? I believe that it is a significant one. Without question, technology investment has been a major driver of enterprise productivity gains. At the same time, IT investment has had a major role in improving business capabilities and the business value chains. And more recently, IT is even carving out a role in products themselves as part of the IoT. So how can CIOs help drive business innovation?

1) Get closer to your business customers. CIOs have told me that their number one priority is connecting what IT is doing to what the business is doing. Given this, CIOs should make it a real priority for their teams to get closer to the business this year. According to Kamalini Ramdas' article in Harvard Business Review, "to succeed at innovation, you need to have a culture in which everyone in the company is constantly scanning for ideas".

2) Develop internal design partners. When I have started new businesses, I have always created a set of design partners to ensure that I built the right products. I tell my design partners to beat me up now rather than after I build the product. You need, as Kamalini Ramdas suggests, to harvest the best ideas of your corporate team just like I did with startups. You can start by focusing your attention upon the areas of distinctive capability—the places that give your firm its right to win.

3) Enable your IT leaders and individual contributors to innovate. For many businesses, speed to market or speed of business processes can represent a competitive advantage. Foundational to this are IT capabilities including uptime, system performance, speed of project delivery, and so on. Encouraging everyone on your team to drive superior operational capabilities can enable business competitive advantage. And one more thing: work with your business leaders to pass a portion of the business impact of improvements into a bonus for the entire enabling IT team. Lincoln Electric used team bonuses to continuously improve its products. This arc welding company shares the money saved from each process improvement with the entire team. It ends up with the best teams and the highest team longevity, as the teams' work improves product quality and increases cost takeout. According to Kamalini Ramdas, "in truly innovative culture, leaders need to imbue every employee with a clear vision and a sense of empowerment that helps them identify synergistic ideas and run with them" ("Build a Company Where Everyone's Looking for New Ideas", Harvard Business Review, page 1).

4) Architect for Innovation. As the velocity of change increases, businesses need IT organizations to be able to move more quickly. This requires an enterprise architecture built for agility. According to Jeanne Ross, the more agile companies have a high percentage of their core business processes digitized and they have as well standardized their technology architecture (Enterprise Architecture as Strategy, Jeanne Ross, page 12).

5) Look for disruptive innovations. I remember a professor of mine, discussing futures research, suggesting that we cannot predict the future. But I believe you can instead get closer to your customers than anyone else. CIOs should dedicate a non-trivial portion of IT spend to germinating potentially disruptive ideas. They should use their design partners to select what gets early-stage funding. Everyone here should act like a seed-stage venture capitalist. You need to let people experiment. At the same time, design partners should set reasonable goals and actively measure performance toward them.

6) Use analytics. Look at business analytics for areas that could use IT's help. Open up discussions with design partners about areas needing capability improvement. This is a great place to start. Look as well for gaps in business delivery where further or improved digitization/automation could drive better performance. And once an innovation is initiated, use analytics to actively manage the innovation's delivery.

Final remarks

There is always more that you can do to innovate. The key thing is to get innovation front and center on the IT agenda. Actively sponsor it and most importantly empower the team to do remarkable things. And when this happens, reward the teams that made it happen.

IT Is All About Data!
The Secret To Being A Successful CIO
Driving IT Business Alignment: One CIO's Journey
How Is The CIO Role Starting To Change?
The CIO Challenged

Twitter: @MylesSuer

Posted in CIO, Data Governance

Are The Banks Going to Make Retailers Pay for Their Poor Governance?

 

Retail and Data Governance

A couple of months ago, I reached out to a set of CIOs on the importance of good governance and security. All of them agreed that both were incredibly important. However, one CIO made a very pointed remark: "the IT leadership at these breached companies wasn't stupid." He continued by saying that when selling the rest of the C-Suite, the discussion needs to be about business outcomes and business benefits. For this reason, he said, CIOs have struggled to sell the value of investments in governance and security. Now, I have suggested previously that security pays because of its impact on "brand promise", and I still believe this.

However, this week the ante was raised even higher. A district judge ruled that a group of banks can proceed to sue a retailer for negligence in their data governance and security. The decision could clearly lead to significant changes in the way the cost of fraud is distributed among parties within the credit card ecosystem. Where once banks and merchant acquirers would have shouldered the burden of fraud, this decision paves the way for more card-issuing banks to sue merchants for not adequately protecting their POS systems.

Accidents waste priceless time

The judge's ruling said that "although the third-party hackers' activities caused harm, merchant played a key role in allowing the harm to occur." The judge also determined that the banks' suit against the merchant was valid because the plaintiffs adequately showed that the retailer failed "to disclose that its data security systems were deficient." This is interesting because it says that security systems should be sufficient and, if not, retailers need to inform potentially affected stakeholders of the deficiency. And while taking this step could avoid a lawsuit, it would likely increase the cost of interchange for riskier merchants. This would effectively create a risk premium for retailers that do not adequately govern and protect their IT environments.

There are broad implications for all companies that end up harming customers, partners, or other stakeholders by not keeping their security systems up to snuff. The question is: will this give good governance enough of a business outcome and benefit that businesses will actually want to pay it forward, i.e. invest in good governance and security? What do you think? I would love to hear from you.

Related links

Solutions:

Enterprise Level Data Security

Hacking: How Ready Is Your Enterprise?

Gambling With Your Customer’s Financial Data

The State of Data Centric Security

Twitter: @MylesSuer

 

Posted in Banking & Capital Markets, CIO, Data Governance, Data Security, Financial Services

Great Data Puts You In Driver Seat: The Next Step in The Digital Revolution

Great Data Is the Next Step

The industrial revolution began in the mid-to-late eighteenth century, introducing machines to cut costs and speed up manufacturing processes. Steam engines forever changed efficiency in iron making, textiles, and chemicals production, among many others. Transportation improved significantly, and the standard of living for the masses saw significant, sustained growth.

In the last 50-60 years, we have witnessed another revolution, through the invention of computing machines and the Internet: a digital revolution. It has transformed every industry and allowed us to operate at far greater scale, processing more transactions in more locations than ever before. New cities emerged on the map, migrations of knowledge workers throughout the world followed, and the standard of living increased again. And digitally available information transformed how we run businesses, cities, and countries.

Forces Shaping Digital Revolution

Over the last 5-6 years, we've witnessed a massive increase in the volume and variety of this information. The leading forces contributing to this increase are:

  • Next-generation software technology that connects data faster from any source
  • Little to no hardware cost to process and store huge amounts of data (Moore's Law)
  • A sharp increase in the number of machines and devices generating data that are connected online
  • Massive worldwide growth in the number of people connecting online and sharing information
  • Fast Internet connectivity that is now free in many public places

As a result, our engagement with the digital world is rising, both for personal and business purposes. Increasingly, we play games, shop, sign digital contracts, make product recommendations, respond to customer complaints, share patient data, and make real-time pricing changes to in-store products, all from a mobile device or laptop. We do so increasingly in a collaborative way, in real time, and in a very personalized fashion. Big Data, Social, Cloud, and the Internet of Things are the key topics dominating our conversations and thoughts around data these days. They are altering how we engage with, and what we expect from, each other.

Whether this is the emergence of a new revolution or the next phase of our digital revolution, it amounts to the democratization and ubiquity of information, creating new ways of interacting with customers and dramatically speeding up market launch. Businesses will build new products and services and create new business models by exploiting this vast new resource of information.

The Quest for Great Data

But there is work to do before one can unleash the true potential captured in data. Data is no longer a by-product or a transaction record. Nor does it have an expiration date. Data now flows like a river, fueling applications, business processes, and human or machine activities. New data gets created along the way and augments our understanding of the meaning behind this data. It is no longer good enough to have good data in isolated projects; rather, great data needs to become accessible to everyone and everything at a moment's notice. This rich set of data needs to connect efficiently to information that is already present and learn from it. Such data needs to automatically rid itself of inaccurate and incomplete information. Clean, safe, and connected, this data is ready to find us even before we discover it. It understands the context in which we are going to use it and the key decisions that will follow.

In the process, this data learns about our usage, preferences, and results: what works versus what doesn't. New data is created that captures this inherent understanding or intelligence. After fine-tuning, it needs to flow back to the appropriate business applications or machines for future use. Such data can then tell a story about human or machine actions and results. Such data can become a coach, a mentor, a friend of sorts to guide us through critical decision points. This is what we would like to call great data. In order to truly capitalize on the next step of the digital revolution, we will pervasively need this great data to power our decisions and thinking.

Impacting Every Industry

By 2020, there will be 50 billion connected devices, 7x more than human beings on the planet. This explosion of devices will generate really big data that will be processed and stored, increasingly in the cloud. More than size, this complexity will require a new way of addressing business process efficiency, one that delivers agility, simplicity, and capacity. The impact of such a transformation will spread across many industries. A McKinsey article, "The Future of Global Payments", focuses on the digital transformation of payment systems in the banking industry and the resulting ubiquity of data. One of the key challenges for banks will be to shift from their traditional heavy reliance on siloed and proprietary data to a more open approach that encompasses a broader view of customers.

Industry executives, front-line managers, and back-office workers are all struggling to make sense of the data that's available.

Closing Thoughts on Great Data

The 2014 PwC Global CEO Survey showed that 81% of CEOs ranked technology advances as the #1 factor that will transform their businesses over the next 5 years. More data, by itself, isn't enough for this transformation. A robust data management approach, integrating machine and human data from all sources, updated in real time, across on-premise and cloud-based systems, must be put in place to accomplish this mission. Such an approach will nurture great data. This end-to-end data management platform will provide data guidance and curate one of an organization's most valuable assets: its information. Only by making sense of what we have at our disposal will we unleash the true potential of the information we possess. The next step in the digital revolution will be about organizations of all sizes being fueled by great data to unleash their untapped potential.

Posted in Big Data, Business Impact / Benefits, CIO, Data Governance, Data Integration Platform, Enterprise Data Management

Analytics Stories: An Educational Case Study

As I have shared in other posts in this series, businesses are using analytics to improve their internal and external facing business processes and to strengthen their "right to win" within the markets in which they operate. At first glance, you might not think of universities needing to worry much about their right to win, but universities today face increasing competition for students as well as the need to increase efficiency, decrease dependence upon state funding, create new and less expensive delivery models, and drive better accountability.

George Washington University Perceives The Analytic Opportunity

George Washington University (GWU) is no different. For this reason, its leadership determined that the university needed the business insight to compete for the best students, meet student diversity needs, and provide accountability to internal and external stakeholders. All of these issues turned out to have a direct impact upon GWU's business processes, from student recruitment to financial management. At the same time, university leadership determined that the complexity of these challenges requires continual improvement in the University's operational strategies and, most importantly, accurate, timely, and consistent data.

Making It A Reality

GWU determined that getting after these issues required a flexible system that could provide analytics and key academic performance indicators and metrics on demand, whenever they were needed. GWU also determined that the analytics and underlying data needed to enable accurate, balanced decisions had to be delivered more quickly and more effectively than in the past.

Unfortunately, GWU’s data was buried in disparate data sources that were largely focused on supporting transactional, day-to-day business processes. This data was difficult to extract and even more difficult to integrate into a single format, owing to inherent system inconsistencies and the ownership issues surrounding them — a classic problem for collegial environments. Moreover, the university’s transaction applications did not store data in models that supported on-demand and ad hoc aggregations that GWU business users required.

To solve these issues, GWU created a data integration and business intelligence implementation dubbed the Student Data Mart (SDM). The SDM integrates raw structured and unstructured data into a unified data model to support key academic metrics.
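As a hedged illustration of what a unified data model involves (the column names, values, and the use of pandas here are my own assumptions for the sketch, not details of GWU's actual SDM), merging extracts from two source systems with inconsistent conventions might look like:

```python
import pandas as pd

# Hypothetical extracts from two disparate source systems
admissions = pd.DataFrame({
    "student_id": [101, 102, 103],
    "admit_term": ["2013F", "2013F", "2014S"],
})
registration = pd.DataFrame({
    "STUDENT_ID": [101, 102, 104],  # key naming differs across systems
    "gpa": [3.4, 3.9, 2.8],
})

# Normalize the inconsistent key name, then merge into one student record
registration = registration.rename(columns={"STUDENT_ID": "student_id"})
unified = admissions.merge(registration, on="student_id", how="outer")
print(unified)
```

An outer join keeps students known to either system, which is exactly where the ownership and consistency issues described above become visible: missing attributes surface as NaN and can be chased back to the owning system.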

"The SDM represents a life record of the students," says Wolf, GWU's Director of Business Intelligence. "It contains 10 years of recruitment, admissions, enrollment, registration, and grade-point average information for all students across all campuses." It supports a wide range of academic metrics around campus enrollment counts, admissions selectivity, course enrollment, student achievement, and program metrics.

These metrics are directly and systematically aligned with the academic goals for each department and with GWU's overarching business goals. Wolf says, "The SDM system provides direct access to key measures of academic performance. By integrating data into a clean repository and disseminating information over their intranet, the SDM has given university executives direct access to key academic metrics. Based on these metrics, users are able to make decisions in a timely manner and with more precision than before."

Their integration technology supports a student account system, which supplies more than 400 staff with a shared, unified view of the financial performance of students. It connects data from a series of diverse, fragmented internal sources and third-party data from employers, sponsors, and collection agencies. The goal is to answer business questions about whether students paid their fees or how much they paid for each university course.

Continual Quality Improvement

During its implementation, GWU's data integration process exposed a number of data quality issues that were the natural outcome of distributed data ownership. Without an enterprise approach to data and analytics, it would have been difficult to investigate the nature and extent of the data quality issues in its historically fragmented business intelligence environment. Taking an enterprise approach has also enabled GWU to improve its data quality standards and procedures.

Wolf explains, “Data quality is an inevitable problem in any higher education establishment, because you have so many different people—lecturers, students, and administration staff—all entering data. With our system, we can find hidden data problems, wherever they are, and analyze the anomalies across all data sources. This helps build our trust and confidence in the data. It also speeds up the design phase because it overcomes the need to hand query the data to see what the quality is like.”
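A minimal sketch of the kind of automated anomaly scan Wolf describes, replacing hand-queries with a profile pass. The records, fields, and rules below are illustrative assumptions, not GWU's actual checks:

```python
import pandas as pd

# Made-up student records containing typical entry errors
records = pd.DataFrame({
    "student_id": [101, 101, 102, None],
    "email": ["a@gwu.edu", "a@gwu.edu", "bad-address", "c@gwu.edu"],
})

# Profile the data instead of hand-querying it
missing_ids = int(records["student_id"].isna().sum())     # rows lacking a key
duplicate_rows = int(records.duplicated().sum())          # exact duplicate rows
bad_emails = int((~records["email"].str.contains("@", na=False)).sum())

print(missing_ids, duplicate_rows, bad_emails)  # 1 1 1
```

Running checks like these across every source, rather than per system, is what makes hidden problems and cross-source anomalies visible before the design phase.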

Connecting The Dots

Wolf and his team have not stopped there. As data emanating from social media has grown, they have designed their system so social data can be integrated just as easily as their traditional data sources, including Oracle Financials, SunGard, SAP, and flat-file data. Wolf says the SDM platform doesn't turn its back on any type of data. By allowing the university to integrate any type of data, including social media, Wolf has been able to support key measures of academic performance, improve standards, and reduce costs. Ultimately, this is helping GWU maintain its business position, especially as a magnet for the best students around the world.

In sum, the GWU analytics solution has helped it achieve the following business goals:

  • Attract the best students
  • Provide trusted reliable data for decision makers
  • Enable more timely business decisions
  • Increase achievement of academic and administrative goals
  • Deliver new business insight by combining social media with existing data sources

Related links

Related Blogs

Analytics Stories: A Banking Case Study
Analytics Stories: A Financial Services Case Study
Analytics Stories: A Healthcare Case Study
Who Owns Enterprise Analytics and Data?
Competing on Analytics: A Follow Up to Thomas H. Davenport’s Post in HBR
Thomas Davenport Book “Competing On Analytics”

Solution Brief: The Intelligent Data Platform

Author Twitter: @MylesSuer

Posted in Business Impact / Benefits, Business/IT Collaboration, CIO, Data Governance

Adding Big Data to Your EDW Architecture


As you think forward about how you will use Big Data to complement your current enterprise data warehouse (EDW) environment, check out the excellent webinar by Ralph Kimball and Matt Brandwein of Cloudera:

Webinar: Building a Hadoop Data Warehouse: Hadoop 101 for EDW Professionals

A couple of comments on the importance of integration platforms like Informatica in an EDW/Hadoop environment:

  • Hadoop does mean you can do some quick and inexpensive exploratory analysis with little or no ETL. The issue is that it will not perform at the level you need to take it to production. As the webinar points out, applying some structure to the data with columnar files (not an RDBMS) will dramatically speed up query performance.
  • The other thing that makes an integration platform more important than ever is the explosion of data complexity.    As Dr. Kimball put it: 

“Integration is even more important these days because you are looking at all sorts of data sources coming in from all sorts of directions.” 

To perform interesting analyses, you are going to have to be able to join data with different formats and different semantic meaning.  And that is going to require integration tools.

  • Thirdly, if you are going to put this data into production, you will want to incorporate data cleansing, metadata management, and possibly formal data governance to ensure that your data is trustworthy, auditable, and has business context.  There is no point in serving up bad data quickly and inexpensively.  The result will be poor business decisions and flawed analyses.
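The first bullet's point about columnar files can be sketched in miniature. The data below is tiny and pure Python rather than Parquet (a deliberate simplification), but it shows why a column-oriented layout lets a query touch only the field it needs:

```python
# Row-oriented layout: each record is stored together, so an aggregate
# over one field still walks every whole record
rows = [
    {"ts": 1, "status": "ok",  "latency_ms": 12},
    {"ts": 2, "status": "err", "latency_ms": 340},
    {"ts": 3, "status": "ok",  "latency_ms": 15},
]

# Column-oriented layout (the idea behind formats like Parquet): each field
# is its own contiguous array, so the same aggregate scans one array only
columns = {
    "ts": [1, 2, 3],
    "status": ["ok", "err", "ok"],
    "latency_ms": [12, 340, 15],
}

avg_from_rows = sum(r["latency_ms"] for r in rows) / len(rows)
avg_from_columns = sum(columns["latency_ms"]) / len(columns["latency_ms"])
assert avg_from_rows == avg_from_columns  # same answer, far less data touched
```

At warehouse scale the columnar layout also compresses better, which is a large part of the production-performance gap the webinar describes.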

For Data Warehouse Architects

The challenge is to deliver actionable content from the exploding amount of data available.  You will need to be constantly scanning for new sources of data and looking for ways to quickly and efficiently deliver that to the point of analysis.

For Enterprise Architects

The challenge with adding Big Data to Your EDW Architecture is to define and drive a coherent enterprise data architecture across your organization that standardizes people, processes, and tools to deliver clean and secure data in the most efficient way possible.  It will also be important to automate as much as possible to offload routine tasks from the IT staff.  The key to that automation will be the effective use of metadata across the entire environment to not only understand the data itself, but how it is used, by whom, and for what business purpose.  Once you have done that, then it will become possible to build intelligence into the environment.
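One hedged sketch of the metadata point above: a minimal catalog record tracking what a dataset is, who owns it, where it comes from, and who consumes it. The fields and names here are illustrative assumptions, not a real product API:

```python
from dataclasses import dataclass, field

@dataclass
class DatasetMetadata:
    name: str
    owner: str
    source_system: str
    business_purpose: str
    consumers: list = field(default_factory=list)  # who uses it, and for what

catalog: dict = {}

def register(meta: DatasetMetadata) -> None:
    """Add a dataset's metadata to the shared catalog."""
    catalog[meta.name] = meta

register(DatasetMetadata(
    name="daily_sales",
    owner="finance-data-team",
    source_system="pos_feed",
    business_purpose="revenue reporting",
    consumers=["edw.load_sales", "dashboard.revenue"],
))

# Automation can now answer: who is affected if pos_feed changes?
affected = [m.consumers for m in catalog.values() if m.source_system == "pos_feed"]
print(affected)
```

Once usage and lineage are queryable like this, routine impact analysis can be offloaded from IT staff, which is the automation the paragraph above argues for.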

For more on Informatica's vision for an Intelligent Data Platform and how this fits into your enterprise data architecture, see Think "Data First" to Drive Business Value.

Posted in Architects, Big Data, CIO, Data Warehousing

CFO Checklist to Owning Enterprise Analytics

Last month, the CEO of Deloitte said that CFOs are "the logical choice to own analytics and put them to work to serve the organization's needs". In my discussions with CFOs, they have expressed similar opinions. Given this, the question becomes: what does a CFO need to do to be an effective leader of their company's analytics agenda? To answer this, I took a look at what Tom Davenport suggests in his book "Analytics at Work". In it, Tom suggests that an analytical leader needs to do the following twelve things to be effective:

12 Ways to Be an Effective Analytics Leader

1) Develop your people skills. This is not just about managing analytical people, which has its own challenges. It is also about CFOs establishing "the credibility and trust needed when analytics produce insights that effectively debunk currently accepted wisdom".
2) Push for fact-based decision making. You need to, as a former boss of mine liked to say, become the lightning rod and, in this case, set the expectation that people will make decisions based upon data and analysis.
3) Hire and retain smart people. You need to provide a stimulating and supportive work environment for analysts and give them credit when they do something great.
4) Be the analytical example. You need to lead by example. This means using data and analysis in making your own decisions.
5) Sign up for improved results. You need to commit to driving improvements in a select group of business processes by using analytics. Pick something meaningful, like reducing the cost of customer acquisition or optimizing your company's supply chain management.
6) Teach the organization how to use analytic methods. Guide employees and other stakeholders into using more rigorous thinking and decision making.
7) Set strategies and performance expectations. Analytics and fact-based decisions cannot happen in a vacuum. They need strategies and goals that analytics help achieve.
8) Look for leverage points. Look for the business problems where analytics can make a real difference, and for places where a small, analytics-driven improvement in a process can make a big difference.
9) Demonstrate persistence. Work doggedly and persistently to apply analytics to decision making, business processes, culture, and business strategy.
10) Build an analytics ecosystem with your CIO. Build an ecosystem consisting of other business leaders, employees, external analytics suppliers, and business partners. Use them to help you institutionalize analytics at your company.
11) Apply analytics on more than one front. No single initiative will make the company more successful, and no single analytics initiative will do so either.
12) Know the limits of analytics. Know when it is appropriate to use intuition instead of analytics. As a professor of mine once said, not all elements of business strategy can be solved using statistics or analytics. You should know where and when analytics are appropriate.

Following these twelve items will help strategically oriented CFOs lead the analytics agenda at their companies. As I indicated in "Who Owns the Analytics Agenda?", CFOs already typically act as data validators at their firms, but taking this next step matters to their enterprise because "if we want to make better decisions and take the right actions, we have to use analytics" (Analytics at Work, Tom Davenport, Harvard Business Review Press, page 1). Given this, CFOs really need to get analytics right. The CFOs that I have talked to say they already "rely on data and analytics and they need them to be timely and accurate".

One CFO, in fact, said that data is "potentially the only competitive advantage left" for his firm. And while implementing the data side of this depends on the CIO, it is clear from the CFOs that I have talked to that they believe a strong business relationship with their CIO is critical to the success of their business.

So the question remains: are you ready as a financial leader to lead on the analytics agenda? If you are, and you want to learn more about setting the analytics agenda, please consider yourself invited to a webinar that I am doing with the CFO of RoseRyan in January.

The webinar is entitled "Analytics and Data for the Strategic CFO", and by clicking this link you can register to attend. See you there.

Related Blogs

CFOs Move to Chief Profitability Officer
CFOs Discuss Their Technology Priorities
The CFO Viewpoint upon Data
How CFOs can change the conversation with their CIO?
New type of CFO represents a potent CIO ally
Competing on Analytics
The Business Case for Better Data Connectivity

Twitter: @MylesSuer

Posted in Business/IT Collaboration, CIO, Data Governance

Getting to Your Future State Enterprise Data Architecture

Future State Enterprise Data Architecture

Just exactly how do you move from a "Just a Bunch of Data" (JBOD) architecture to a coherent enterprise data architecture?

The white paper "The Great Rethink: Building a Highly Responsive and Evolving Data Integration Architecture" by Claudia Imhoff and Joe McKendrick provides an interesting view of what such an architecture might look like. It describes how to move from ad hoc data integration to an enterprise data architecture, along with an approach to building architectural maturity and a next-generation enterprise data architecture that helps organizations be more competitive.

Organizations that look to compete based on their data are searching for ways to design an architecture that:

  • On-boards new data quickly
  • Delivers clean and trustworthy data
  • Delivers data at the speed required by the business
  • Ensures that data is handled in a secure way
  • Is flexible enough to incorporate new data types and new technology
  • Enables end-user self-service
  • Accelerates the delivery of business value for the organization

In my previous blog, Digital Strategy and Architecture, we discussed the demands that digital strategies are putting on enterprise data architecture in particular.  Add to that the additional stress from business initiatives such as:

  • Supporting new mobile applications
  • Moving IT applications to the cloud – which significantly increases data management complexity
  • Dealing with external data.  One recent study estimates that a full 25% of the data being managed by the average organization is external data.
  • Next-generation analytics and predictive analytics with Hadoop and NoSQL
  • Integrating analytics with applications
  • Event-driven architectures and projects
  • The list goes on…

The point here is that most people are unlikely to be funded to build an enterprise data architecture from scratch that can meet all these needs.  A pragmatic approach would be to build out your future state architecture in each new strategic business initiative that is implemented.  The real challenge of being an enterprise architect is ensuring that all of the new work does indeed add up to a coherent architecture as it gets implemented.

The “Great Rethink” white paper describes a practical approach to achieving an agile and responsive future state enterprise data architecture that will support your strategic business initiatives.  It also describes a high level data integration architecture and the building blocks to achieving that architecture.  This is highly recommended reading.

Also, you might recall that Informatica sponsored the Informatica Architect’s Challenge this year to design an enterprise-wide data architecture of the future. The contest has closed and we have a winner. See the Informatica Architect Challenge site for details.

Posted in Architects, CIO, Data First, Data Integration Platform

Should Analytics Be Focused on Small Questions Versus Big Questions?

Should the analytic resources of your company be focused upon small questions or big questions? For many, answering this question is not easy. Some find key managers preferring to make decisions from personal intuition or experience. When I worked for a large computer-peripheral company, I remember executives making major decisions about product direction from their gut even when there was clear evidence that a major technology shift was about to happen. That company went from being a multi-billion dollar company to a $50 million company in a matter of a few years.

In other cases, the entire company may not see the relationship between data and good decision making. When this happens, silos of the business collect data of value to them, but there is no coordinated, focused effort placed toward enterprise-level strategic targets. This naturally leads to silos of analytical activity. Clearly, answering small questions provides the value of having analytics quickly. However, answering the bigger questions has the most value to the business as a whole. And while the big questions are often harder to answer, they can be pivotal to the business going forward. Here are just a few examples of the big questions that are worthy of being answered by most enterprises.

  • Which performance factors have the greatest impact on our future growth and profitability?
  • How can we anticipate and influence changing market conditions?
  • If customer satisfaction improves, what is the impact on profitability?
  • How should we optimize investments across our products, geographies, and market channels?

However, most businesses cannot easily answer these questions. Why then do they lack the analytical solutions to answer these questions?

Departmental BI does not yield strategically relevant data

Let’s face it: business intelligence to date has largely been a departmental exercise. In most enterprises, as we have been saying, analytics starts as pockets of activity rather than as an enterprise-wide capability. The departmental approach leads business analysts to buy the same data or software that others in the organization have already bought. Enterprises end up with hundreds of data marts, reporting packages, forecasting tools, data management solutions, integration tools, and methodologies. According to Thomas Davenport, one firm he knows well has “275 data marts and [a] thousand different information resources, but it couldn’t pull together a single view of the business in terms of key performance metrics and customer data” (Analytics at Work, Thomas Davenport, Harvard Business Press, page 47).

Clearly, answering the big questions requires leadership and a coordinated approach. Amazingly, taking this road often even reduces enterprise analytical expenditure, as silos of information, including data marts and spaghetti-code integrations, are eliminated and replaced with a single enterprise capability. But if you want to take this approach, how do you make sure that you get the right business questions answered?

Strategic Approach

The strategic approach starts with enterprise strategy. In enterprise strategy, leadership defines opportunities for business growth, innovation, differentiation, and marketplace impact. According to Derek Abell, this process should occur in a three-cycle strategic planning approach: the enterprise does business planning, followed by functional planning, and lastly budgeting. Each cycle provides fodder for the stages that follow. From each stage, a set of overarching, cascading objectives can be derived. From these, the business can define a set of critical success factors that will let it know whether or not business objectives are being met. Supporting each critical success factor are quantitative key performance indicators that, in aggregate, say whether the success factors are going to be met. Finally, these key performance indicators determine the data needed to support the KPIs, in terms of metrics and the supporting dimensional data for analysis. So the art and science here is defining critical success factors and KPIs that answer the big questions.
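To make the cascade concrete, here is a minimal sketch of how critical success factors might be rolled up from their supporting KPIs. All of the names, targets, and values below are invented for illustration; they are not taken from any particular enterprise plan.

```python
# Hypothetical cascade: critical success factor -> supporting KPIs.
# Each KPI is (name, actual, target); every name, number, and threshold
# here is an assumption for illustration only.
kpis = {
    "grow_revenue": [
        ("new_customer_revenue_pct", 28.0, 25.0),
        ("upsell_rate_pct", 12.0, 15.0),
    ],
    "improve_retention": [
        ("churn_rate_pct", 4.5, 5.0),
    ],
}

def csf_status(measures):
    """A CSF is 'on track' only if every supporting KPI meets its target.
    For churn-style metrics lower is better; we keep the rule simple and
    treat the target as a ceiling for metrics whose name contains 'churn'
    and as a floor otherwise."""
    ok = all(
        (actual <= target) if "churn" in name else (actual >= target)
        for name, actual, target in measures
    )
    return "on track" if ok else "at risk"

for csf, measures in kpis.items():
    print(csf, "->", csf_status(measures))
```

The point of the sketch is the direction of flow: KPIs are the only place where raw data enters, and everything above them (CSFs, objectives) is derived, which is exactly why the KPIs dictate what data the architecture must supply.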

Core Capabilities

As we saw above, the strategic approach is about tying questions to business strategy. In the capabilities approach, we tie questions to the capabilities that drive business competitive advantage. To determine these business capabilities, we need to start by looking at “the underlying mechanism of value creation in the company (what they do best) and what the opportunities are for meeting the market effectively” (“The Essential Advantage”, Paul Leinwand, Harvard Business Review Press, page 19). Typically, this determines 3-6 distinctive capabilities that impact the success of the enterprise’s service or product portfolio. These are the things that “enable your company to consistently outperform rivals” (“The Essential Advantage”, Paul Leinwand, Harvard Business Review Press, page 14). The goal is to optimize key business capabilities over time and to innovate and operate in ways that differentiate the business in the eyes and experience of customers (Analytics at Work, Thomas Davenport, Harvard Business Press, page 73). Here we want to target analytics investments at these distinctive capabilities. Here are some examples of potential target capabilities by industry:

  • Financial services: Credit scoring
  • Retail: Replenishment
  • Manufacturing: Supply Chain Optimization
  • Healthcare: Disease Management

Parting remarks

So as we have discussed, many firms are spending too much on analytic solutions that do not solve real business problems. Getting after this is not a technical issue—it is a business issue. It starts by asking the right business questions, which can come from business strategy, your core business capabilities, or some mix of both.

Related links

Related Blogs

Analytics Stories: A Banking Case Study
Analytics Stories: A Financial Services Case Study
Analytics Stories: A Healthcare Case Study
Who Owns Enterprise Analytics and Data?
Competing on Analytics: A Follow Up to Thomas H. Davenport’s Post in HBR
Thomas Davenport Book “Competing On Analytics”

Solution Brief: The Intelligent Data Platform

Author Twitter: @MylesSuer

Posted in Big Data, CIO, Data Governance

Analytics Stories: A Banking Case Study

As I have shared in other posts within this series, businesses are using analytics to improve their internal- and external-facing business processes and to strengthen their “right to win” within the markets in which they operate. In banking, the right to win increasingly comes from improving two core sets of business capabilities—risk management and customer service.

Significant change has occurred in risk management over the last few years following the subprime crisis and the subsequent credit crunch. These environmental changes have put increased regulatory pressure upon banks around the world. Among other things, banks need to comply with measures aimed at limiting the overvaluation of real estate assets and at preventing money laundering. A key element of handling these is ensuring that go-forward business decisions are made consistently using the most accurate business data available. It seems clear that data consistency can determine the quality of business operations, especially business risk.

At the same time as banks need to strengthen their business capabilities around operations, and in particular risk management, they also need to use better data to improve the loyalty of their existing customer base.

Banco Popular launches itself into the banking vanguard

Banco Popular is an early responder to the need for better banking data consistency. Its leadership created a Quality of Information Office (uniquely, the Office is based not within IT but within the Office of the President) with the mandate of delivering on two business objectives:

  1. Ensuring compliance with governmental regulations
  2. Improving customer satisfaction based on accurate and up-to-date information

Part of the second objective is aimed at ensuring that each of Banco Popular’s customers is offered the ideal products for their specific circumstances. This is interesting because, by its nature, it assists in attaining the first objective. To validate that it achieves both mandates, the Office started by creating an “Information Quality Index”. The Index is created using many different types of data relating to each of the bank’s six million customers–including addresses, contact details, socioeconomic data, occupation data, and banking activity data. The index is expressed in percentage terms, reflecting the quality of the information collected for each individual customer. The overarching target set for the organization is a score of 90 percent—presently, the figure sits at 75 percent. There is room to grow and improve!
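The article does not describe how the index is calculated, but the simplest plausible reading is a completeness score: the share of a customer’s profile attributes that are actually populated, expressed as a percentage and averaged across the customer base. The field names and sample records below are invented for illustration.

```python
# Illustrative sketch of an "Information Quality Index" as attribute
# completeness. Field names and customer records are hypothetical; the
# real index may weight or validate fields quite differently.
FIELDS = ["address", "contact", "socioeconomic", "occupation", "activity"]

def quality_index(customer):
    """Percentage of profile fields that are populated for one customer."""
    filled = sum(1 for f in FIELDS if customer.get(f) not in (None, ""))
    return 100.0 * filled / len(FIELDS)

customers = [
    {"address": "Calle Mayor 1", "contact": "a@b.es", "socioeconomic": "B",
     "occupation": "engineer", "activity": "mortgage"},
    {"address": "Gran Via 2", "contact": "", "socioeconomic": None,
     "occupation": "teacher", "activity": "deposit"},
]

# Organization-level index: average over all customers, compared to target.
org_index = sum(quality_index(c) for c in customers) / len(customers)
print(f"organization index: {org_index:.0f}% (target: 90%)")
```

Even a crude measure like this gives the Office what it needs most: a single number per customer that can be tracked against the 90 percent target over time.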

Current data management systems limit attainment of its business goals

Unfortunately, the millions of records needed by the Quality of Information Office are spread across different tables in the organization’s central computing system and must be combined into one information file for each customer to be useful to business users. The problem was that the bank had depended on third parties to manually pull and clean up this data. Given the above mandates, this approach proved too slow to execute in a timely fashion. This, in turn, impacted the quality of the bank’s business capabilities for risk and customer service. The approach did not create the index and other analyses “with the frequency that we wanted and examining the variables of interest to us,” explains Federico Solana, an analyst at the Banco Popular Quality of Information Office.

Creating the Quality Index was just too time consuming and costly. But not improving data delivery performance had a direct impact on decision making.

Automation proves key to better business processes

To speed up delivery of its Quality Index, Banco Popular determined it needed to automate the creation of great data—data which is trustworthy and timely. According to Tom Davenport, “you can’t be analytical without data and you can’t be really good at analytics without really good data” (Analytics at Work, 2010, Harvard Business Press, page 23). Banco Popular felt that automating the tasks of analyzing and comparing variables would increase the value of data at lower cost and ensure a faster return on data.

In addition to fixing the Quality Index, Banco Popular needed to improve its business capabilities around risk and customer service automation. This effort aimed at improving the analysis of mortgages while reducing the cost of data, accelerating the return on data, and boosting business and IT productivity.

Everything, however, needed to start with the Quality Index. After the Quality Index was created for individuals, Banco Popular created a Quality of Information Index for Legal Entities and plans to extend the return on data by creating indexes for Products and Activities. For the Quality Index related to legal entities, the bank included variables aimed at preventing the consumption of capital as well as variables used to calculate the probability of underpayments and Basel models. Variables are classified as essential, required, and desirable. This evaluation of data quality allows for the subsequent definition of new policies and initiatives for transactions, the network of branches, and internal processes, among other aspects. In addition, the bank is also working on an in-depth analysis of quality variables to improve its critical business processes, including mortgages.
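The essential/required/desirable classification suggests a weighted variant of the index, where missing an essential variable hurts the score more than missing a desirable one. The weights, variable names, and entity record below are assumptions for illustration, not Banco Popular’s actual scheme.

```python
# Sketch of a classification-weighted quality index. The 3/2/1 weights
# and every variable name below are hypothetical.
WEIGHTS = {"essential": 3, "required": 2, "desirable": 1}

# variable name -> (classification, is_populated)
entity = {
    "legal_id":             ("essential", True),
    "basel_segment":        ("essential", True),
    "underpayment_history": ("required", False),
    "sector_code":          ("required", True),
    "website":              ("desirable", False),
}

def weighted_index(variables):
    """Weighted completeness: share of total weight that is populated."""
    total = sum(WEIGHTS[cls] for cls, _ in variables.values())
    have = sum(WEIGHTS[cls] for cls, ok in variables.values() if ok)
    return 100.0 * have / total

print(f"legal-entity quality index: {weighted_index(entity):.1f}%")
```

A scheme like this lets the same index machinery serve both mandates: the essential tier can carry the regulatory variables (Basel, capital consumption), while the lower tiers carry the customer-service data.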

Some Parting Remarks

In the end, Banco Popular has shown the way forward for analytics. In banking, the measures of performance are often known; what is problematic is ensuring the consistency of decision making across branches and locations. By working first on data quality, Banco Popular ensured that its quality-of-data measures are consistent, and it can now focus its attention on improving underlying business effectiveness and efficiency.

Related links

Related Blogs

Analytics Stories: A Financial Services Case Study
Analytics Stories: A Healthcare Case Study
Who Owns Enterprise Analytics and Data?
Competing on Analytics: A Follow Up to Thomas H. Davenport’s Post in HBR
Thomas Davenport Book “Competing On Analytics”

Solution Brief: The Intelligent Data Platform

Author Twitter: @MylesSuer

 

Posted in CIO, Data Governance