Tag Archives: CIO
Data and Information becoming a key corporate asset
According to Barbara Wixom at MIT CISR, “In a digital economy, data and the information it produces is one of a company’s most important assets”. (“Recognizing data as an enterprise asset”, Barbara Wixom, MIT CISR, 3 March 2015). Barbara goes on to suggest that businesses increasingly “need to take an enterprise view of data. They should understand and govern data as a corporate asset, even when data management remains distributed”.
CIOs are not the enterprise data steward
Given that data is a corporate asset, you might expect this would be an area for the CIO’s leadership. However, I heard differently when I recently met with two different groups of CIOs. Regardless of whether the CIOs were public sector or private sector, they told me that they did not want to be the owner of enterprise data. One CIO succinctly put it this way, “we are not data stewards. Governance has to be done by the business—IT is merely the custodians of their data”. These CIOs argue that the business must own business data, and must determine how that data is managed, because only the business understands the business context around the data.
Given this, the CIOs that I talked to said that IT should not manage data but “should make sure that what the business needs done gets done with data”. CIOs, therefore, own the processes and technology for ensuring data is secured and available when and where the business needs it. Debbie Lew from ISACA put it this way, “IT does not own the data. IT facilitates data”.
So if the management of data is distributed, what is the CIO’s role in being a good data custodian?
COBIT 5 provides some concrete suggestions that are worth taking a look at. According to COBIT, IT should make sure information and data owners are established and that they are able to make decisions about data definition, data classification, data security and control, and data integrity. Additionally, IT needs to ensure that the information system provides the “knowledge required to support all staff in their work activities.”
IT must create facilities so knowledge can be used
This means IT organizations need to create facilities so that knowledge can be used, shared and updated. Part of doing this task well involves ensuring the reliable availability of useful information, keeping erroneous or unavailable information to a minimum. Measuring performance here requires tracking the percentage of reports that are not delivered on time and the percentage of reports containing inaccuracies; both obviously need to be kept low. Clearly, this function is enabled by backups of systems, applications, data and documentation, which should run on a defined schedule that meets business requirements.
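The two reliability KPIs above can be sketched as a small computation. The record fields and sample report names here are invented for illustration, not taken from COBIT:

```python
from dataclasses import dataclass

@dataclass
class ReportRecord:
    name: str
    delivered_on_time: bool
    contains_inaccuracies: bool

def report_kpis(reports):
    """Compute the two information-reliability KPIs as percentages:
    reports not delivered on time, and reports containing inaccuracies."""
    total = len(reports)
    pct_late = 100 * sum(not r.delivered_on_time for r in reports) / total
    pct_inaccurate = 100 * sum(r.contains_inaccuracies for r in reports) / total
    return pct_late, pct_inaccurate

reports = [
    ReportRecord("daily sales", True, False),
    ReportRecord("inventory", False, False),
    ReportRecord("finance close", True, True),
    ReportRecord("churn summary", True, False),
]
late, inaccurate = report_kpis(reports)
print(f"late: {late:.0f}%  inaccurate: {inaccurate:.0f}%")  # late: 25%  inaccurate: 25%
```

Tracked over time, these two numbers give a simple, business-readable signal of whether information reliability is improving.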
Establishing a level of data accuracy that is acceptable to business users starts with building and maintaining an enterprise data dictionary that includes details about data definition, data ownership, appropriate data security, and data retention and destruction requirements. This involves identifying data outputs from the source and mapping data storage, location, retrieval and recoverability. It also means ensuring, from a design perspective, that appropriate redundancy, recovery and backup are built into the enterprise data architecture.
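A data dictionary entry of the kind described above can be sketched as a simple record. The field names and sample values are illustrative assumptions, not a standard schema:

```python
from dataclasses import dataclass

@dataclass
class DataDictionaryEntry:
    name: str                 # the data element
    definition: str           # agreed business definition
    owner: str                # accountable business owner
    classification: str       # e.g. "public", "internal", "confidential"
    retention_years: int      # retention requirement
    destruction_method: str   # how the data is destroyed at end of life
    source_system: str        # where the data originates
    storage_location: str     # where it is stored and recoverable from

entry = DataDictionaryEntry(
    name="customer_email",
    definition="Primary email address used to contact the customer",
    owner="VP Customer Care",
    classification="confidential",
    retention_years=7,
    destruction_method="secure erase",
    source_system="CRM",
    storage_location="customer_master_db",
)
```

The point is that each element carries its definition, ownership, security classification, and retention rules together, so the business decisions described above have one authoritative home.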
IT must enable compliance and security
COBIT 5 stresses the importance of data and information compliance and security. Information needs to be “properly secured, stored, transmitted or destroyed.” This starts with effective security and controls over information systems. To do this, procedures need to be defined and implemented to ensure the integrity and consistency of information stored in databases, data warehouses and data archives. All users need to be uniquely identifiable and have access rights in accordance with their business role. And for business compliance, all business transactions need to be retained for governance and compliance reasons. According to COBIT 5, IT organizations are chartered to ensure that the following four elements are established:
- Clear information ownership
- Timely, correct information
- Clear enterprise architecture and efficiency
- Compliance and security
There needs to be a common set of information requirements
But how are these objectives achieved? Effective information governance requires that the business and IT have a strong working relationship. It also requires that information requirements are established. Getting timely and correct information often starts by improving how data is managed. Instead of manually moving data or creating layer over layer of spaghetti-code integration, enterprises need to standardize a data architecture that creates a single integration layer among all data sources.
This integration layer increasingly needs to support new sources of data as well, and to do so at the speed of business. Business users want trustworthy data. One data integration expert “maintains that at least 20 percent of all raw data is incorrect. Inaccurate data leads data users to question the information their systems provide.” The data system needs to automatically and proactively fix data issues such as bad addresses, missing data and data format problems. Once this has been accomplished, it needs to go after redundancies in customers and transactions. With multiple IT-managed transaction systems, it is easy to misstate both customers and customer transactions, and to miss potential business opportunities. All of this is required to get accurate data.
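The cleanse-then-deduplicate flow described above can be sketched in a few lines. The field names, normalization rules, and the choice of email as the matching key are simplifying assumptions; real matching engines use much richer logic:

```python
import re

def cleanse(record):
    """Fix common data issues: trim whitespace, normalize formats, fill missing values."""
    rec = dict(record)
    rec["name"] = rec.get("name", "").strip().title() or "UNKNOWN"
    rec["phone"] = re.sub(r"\D", "", rec.get("phone", ""))  # digits only
    rec["email"] = rec.get("email", "").strip().lower()
    return rec

def dedupe(records, key=("email",)):
    """Collapse records that share the same natural key after cleansing."""
    seen = {}
    for rec in map(cleanse, records):
        k = tuple(rec[f] for f in key)
        seen.setdefault(k, rec)  # keep the first cleansed copy
    return list(seen.values())

raw = [
    {"name": "  ada lovelace ", "phone": "(555) 123-4567", "email": "ADA@EXAMPLE.COM"},
    {"name": "Ada Lovelace", "phone": "555.123.4567", "email": "ada@example.com "},
]
print(dedupe(raw))  # one customer record, not two
```

Note the ordering: format problems are fixed first, precisely because the two raw rows above only match as duplicates once their emails and phones are normalized.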
Data needs to be systematically protected
Additionally, data needs to be systematically protected. This means that user access to data needs to be managed systematically across all IT-managed systems. Typical data integrations move data between applications without preserving the source data systems’ rules, so a data security issue at any point in the IT system can expose all data. At the same time, enterprises need to control exactly what data is moved into test and production environments. Enterprises must also ensure that a common set of security governance rules is established and maintained across the entire enterprise, including data exchanged with partners, employees and contractors using data outside the enterprise firewall.
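One common way to control what moves into test environments is deterministic masking. A minimal sketch, assuming a salted hash as the pseudonymization scheme (the field names and salt are invented for illustration):

```python
import hashlib

def mask_value(value, salt="test-env-salt"):
    """Deterministic pseudonymization: the same input always yields the
    same token, so joins across systems still line up in test data."""
    digest = hashlib.sha256((salt + value).encode()).hexdigest()[:10]
    return f"MASKED-{digest}"

def mask_record(record, sensitive_fields):
    """Mask only the sensitive fields; leave keys and metrics usable."""
    return {k: mask_value(v) if k in sensitive_fields else v
            for k, v in record.items()}

prod_row = {"customer_id": "C-1001", "name": "Ada Lovelace", "ssn": "123-45-6789"}
test_row = mask_record(prod_row, sensitive_fields={"name", "ssn"})
```

Because the masking is deterministic, referential integrity survives: the same customer masks to the same token in every system, so a breach of the test environment exposes tokens rather than real identities.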
Clearly, COBIT 5 suggests that CIOs cannot completely divorce themselves from data governance. Yes, CIOs are data custodians but there are clear and specific tasks that the CIO and their staff must uniquely take on. Otherwise, a good foundation for data governance cannot be established.
Speed is the top challenge facing IT today, and it’s reaching crisis proportions at many organizations. Specifically, IT needs to deliver business value at the speed that the business requires.
The challenge does not end there; this has to be accomplished without compromising cost or quality. Many people have argued that you only get two out of three on the speed/cost/quality triangle, but I believe that achieving all three is the central challenge facing Enterprise Architects today. Many people I talk to are looking at agile technologies, and in particular Agile Data Integration.
There have been a lot of articles written about the challenges, but it’s not all doom and gloom. Here is something you can do right now to dramatically increase the speed of your project delivery while improving cost and quality at the same time: Take a fresh look at your Agile Data Integration environment and specifically at Data Virtualization. Data Virtualization offers the opportunity to simplify and speed up the data part of enterprise projects. And this is the place where more and more projects are spending 40 percent or more of their time. For more information and an industry perspective you can download the latest Forrester Wave report for Data Virtualization Q1 2015.
Here is a quick example of how you can use Data Virtualization technology for rapid prototyping to speed up business value delivery:
- Use data virtualization technology to present a common view of your data to your business-IT project teams.
- IT and business can collaborate in real time to access and manage data from a wide variety of very large data sources – eliminating the long, slow cycles of passing specifications back and forth between business and IT.
- Your teams can discover, profile, and manage data using a single virtual interface that hides the complexity of the underlying data.
- By working with a virtualization layer, you are assured that your teams are using the right data, data that can be verified by linking it to a Business Glossary with clear terms, definitions, owners, and business context, reducing the chance of misunderstandings and errors.
- Leading offerings in this space include data quality and data masking tools in the interface, ensuring that you improve data quality in the process.
- Data virtualization means that your teams can be delivering in days rather than months and faster delivery means lower cost.
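The idea behind the steps above can be sketched as a toy virtualization layer. The class, the source names, and the join-on-read behavior are invented for illustration and are not any vendor's API:

```python
class VirtualDataLayer:
    """A toy virtualization layer: presents one logical 'customer' view
    over two physical sources without copying or moving the data."""

    def __init__(self, crm, billing):
        self.crm = crm          # e.g. rows keyed by customer id
        self.billing = billing

    def customer_view(self, customer_id):
        # Join-on-read: the view is resolved at query time,
        # while the underlying sources stay where they are.
        base = self.crm.get(customer_id, {})
        spend = self.billing.get(customer_id, {})
        return {**base, **spend}

crm = {"C-1": {"name": "Ada", "segment": "enterprise"}}
billing = {"C-1": {"lifetime_spend": 42_000}}
layer = VirtualDataLayer(crm, billing)
print(layer.customer_view("C-1"))
# {'name': 'Ada', 'segment': 'enterprise', 'lifetime_spend': 42000}
```

The prototyping payoff is visible even in the toy: business and IT iterate on `customer_view` rather than on data copies, so a change to the logical view never requires reloading the physical sources.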
There has been a lot of interest in agile development, especially as it relates to data projects. Data Virtualization is a key tool to accelerate your team in this direction.
Informatica has a leading position in the Forrester report due to the productivity of the Agile Data Integration environment but also because of the integration with the rest of the Informatica platform. From an architect’s point of view it is critical to start standardizing on an enterprise data management platform. Continuing data and data tool fragmentation will only slow down future project delivery. The best way to deal with the growing complexity of both data and tools is to drive standardization within your organizations.
I recently got to talk to several senior IT leaders about their views on information governance and analytics. Participating were a telecom company, a government transportation entity, a consulting company, and a major retailer. Each shared openly in what was a free flow of ideas.
The CEO and corporate culture are critical to driving a fact-based culture
I started this discussion by sharing the COBIT Information Life Cycle. Everyone agreed that the starting point for information governance needs to be business strategy and business processes. However, this prompted an extremely interesting discussion about enterprise analytics readiness. Most said that they are in the midst of leading the proverbial horse to water—in this case the horse is the business. The CIO in the group said that he personally is all about the data and making factual decisions, but his business is not really there yet. I asked everyone at this point about the importance of culture and the CEO. Everyone agreed that the CEO is incredibly important in driving a fact-based culture. Apparently, people like the new CEO of Target are in the vanguard and not the mainstream yet.
KPIs need to be business drivers
The above CIO said that too many of his managers are operationally, day-to-day focused and don’t understand the value of analytics or of predictive analytics. This CIO said that he needs to teach the business to think analytically and to understand how analytics can help drive the business, as well as how to use Key Performance Indicators (KPIs). The enterprise architect in the group shared at this point that he had previously worked for a major healthcare organization. When the organization was asked to determine a list of KPIs, it came back with 168 KPIs. Obviously, this could not work, so he explained to the business that an effective KPI must be a “driver of performance”. He stressed to the healthcare organization’s leadership the importance of having fewer KPIs, and of building those that are produced around business capabilities and performance drivers.
IT increasingly needs to understand its customers’ business models
I shared at this point that I visited a major Italian bank a few years ago. The key leadership had high-definition displays that would cycle through a different analytic every five minutes. Everyone laughed at the absurdity of having so many KPIs. But with this said, everyone felt that they needed to get business buy-in because only the business can derive value from acting upon the data. According to this group of IT leaders, this is pushing them more and more to understand their customers’ business models.
Others said that they were trying to create an omni-channel view of customers. The retailer wanted to get more predictive. Theodore Levitt said the job of marketing is to create and keep a customer; this retailer is focused on keeping customers and bringing them back more often. They want to give customers offers that use customer data to increase sales, much like what I recently described happening at 58.com, eBay, and Facebook.
Most say they have limited governance maturity
We talked about where people are in their governance maturity. Even though I wanted to gloss over this topic, the group wanted to spend time here and compare notes with each other. Most said that they were at stage 2 or 3 in a five-stage governance maturity process. One CIO asked whether anyone ever gets to level 5. Like analytics, governance was being pushed forward by IT rather than the business. Nevertheless, everyone said that they are working to get data stewards defined for each business function. At this point, I asked about the elements that COBIT 5 suggests go into good governance. I shared that it should include the following four elements: 1) clear information ownership; 2) timely, correct information; 3) clear enterprise architecture and efficiency; and 4) compliance and security. Everyone felt the definition was fine but wanted specifics for each element. I referred them, and you, to my recent article in COBIT Focus.
CIOs say they are the custodians of data only
At this point, one of the CIOs said something incredibly insightful: we are not data stewards. Governance has to be done by the business—IT is the custodian of the data. More specifically, IT should not manage data but should make sure that what the business needs done gets done with data. Everyone agreed with this point and even reused the term “data custodians” several times during the next few minutes. Debbie Lew of ISACA said the same thing just last week. According to her, “IT does not own the data. They facilitate the data”. From here, the discussion moved to security and data privacy. The retailer in the group was extremely concerned about privacy and felt that they needed masking and other data-level technologies to ensure a breach minimally impacts their customers. At this point, another IT leader in the group said that it is the job of IT leadership to make sure the business does the right things in security and compliance. I shared here that one of my CIO friends had said that “the CIOs at the retailers with breaches weren’t stupid—it is just hard to sell the business impact”. The CIO in the group said we need to do risk assessments (also a big thing for COBIT 5) that get the business to say we have to invest to protect. “It is IT’s job to adequately explain the business risk”.
Is mobility a driver of better governance and analytics?
Several shared towards the end of the evening that mobility is an increasing impetus for better information governance and analytics. Mobility is driving business users and business customers to demand better information and thereby, better governance of information. Many said that a starting point for providing better information is data mastering. These attendees felt as well that data governance involves helping the business determine its relevant business capabilities and business processes. It seems that these should come naturally, but once again, IT for these organizations seems to be pushing the business across the finish line.
Talking to architects about analytics at a recent event, I kept hearing the familiar theme: data scientists are spending 80% of their time on “data wrangling”, leaving only 20% for delivering the business insights that will drive the company’s innovation. It was clear to everybody I spoke to that the situation will only worsen. The growth everybody sees coming in data volume and complexity will only lengthen the time to value.
Gartner recently predicted that:
“by 2015, 50% of organizations will give up on managing growth and will redirect funds to improve classification and analytics.”
Some of the details of this study are interesting. In the end, many organizations are coming to two conclusions:
- It’s risky to delete data, so they keep it around as insurance.
- All data has potential business value, so more organizations are keeping it around for potential analytical purposes.
The other mega-trend here is that more and more organizations are looking to compete on analytics – and they need data to do it, both internal data and external data.
From an architect’s perspective, here are several observations:
- The floodgates are open and analytics is a top priority. Given that, the emphasis should be on architecting to manage the dramatic increases in both data quantity and data complexity rather than on trying to stop it.
- The immediate architectural priority has to be on simplifying and streamlining your current enterprise data architecture. Break down those data silos and standardize your enterprise data management tools and processes as much as possible. As discussed in other blogs, data integration is becoming the biggest bottleneck to business value delivery in your environment. Gartner has projected that “by 2018, more than half the cost of implementing new large systems will be spent on integration.” The more standardized your enterprise data management architecture is, the more efficient it will be.
- With each new data type, new data tool (Hive, Pig, etc.), and new data storage technology (Hadoop, NoSQL, etc.) ask first if your existing enterprise data management tools can handle the task before people go out and create a new “data silo” based on the cool, new technologies. Sometimes it will be necessary, but not always.
- The focus needs to be on speeding value delivery for the business. And the key bottleneck is highly likely to be your enterprise data architecture.
Rather than focusing on managing data growth, the priority should be on managing it in the most standardized and efficient way possible. It is time to think about enterprise data management as a function with standard processes, skills and tools (just like Finance, Marketing or Procurement.)
Several of our leading customers have built or are building a central “Data as a Service” platform within their organizations. This is a single, central place where all developers and analysts can go to get trustworthy data that is managed by IT through a standard architecture and served up for use by all.
For more information, see “The Big Big Data Workbook”
*Gartner Predicts 2015: Managing ‘Data Lakes’ of Unprecedented Enormity, December 2014 http://www.gartner.com/document/2934417#
Customers often inquire about the best way to get their team up to speed on the Informatica solutions. The question Informatica University hears frequently is whether a team should attend our public scheduled courses or hold a Private training event. The number of resources to be skilled on the products will help to determine which option to choose. If your team, or multiple teams within your company, has 7 or more resources that require getting up to speed on the Informatica products, then a Private training event is the recommended choice.
Seven (7) attendees for a remote instructor and nine (9) for an onsite instructor is the break-even point when deciding whether to hold a private training, making it the most cost-efficient delivery for a team of that size. In addition to the cost benefit, customers who have taken this option value the daily access to their team members, which keeps business operations humming along, and the opportunity to collaborate with key team members not attending by allowing them to provide input from a project perspective.
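The break-even arithmetic can be sketched with purely hypothetical prices; actual Informatica pricing differs, but the shape of the comparison is the same:

```python
# Hypothetical prices for illustration only -- not actual Informatica pricing.
PUBLIC_SEAT = 2_000       # public scheduled class, per attendee
PRIVATE_REMOTE = 14_000   # flat fee, remote instructor
PRIVATE_ONSITE = 18_000   # flat fee, onsite instructor (adds travel)

def cheaper_option(attendees, private_fee):
    """Compare a flat private-event fee against per-seat public pricing."""
    public_total = attendees * PUBLIC_SEAT
    return "private" if private_fee <= public_total else "public"

for n in (6, 7, 8, 9):
    print(n, cheaper_option(n, PRIVATE_REMOTE), cheaper_option(n, PRIVATE_ONSITE))
# with these assumed prices, remote breaks even at 7 attendees, onsite at 9
```

Because the private event is a flat fee while public seats scale per attendee, the crossover point depends only on the ratio of the two prices.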
These reserved events can also be adapted to a customer’s needs by tailoring course materials to highlight topics that will be key to a project’s implementation, which provides creative options for getting a team up to speed on the Informatica projects at hand.
With Informatica University’s new flexible pricing, hosting a Private Training event is easy. All it takes is:
- A conference room
- Training PC’s or laptops for participants
- Access to the Internet
- An LCD projector, screen, white board, and appropriate markers
Private training events provide the opportunity to get your resources comfortable and efficient with the Informatica Solutions and have a positive impact on the success of your projects.
To understand more about Informatica’s New Flexible Pricing, contact firstname.lastname@example.org
In my discussions with CIOs, their opinions differ widely about the go-forward nature of the CIO role. While most feel the CIO role will remain an important function, they also feel a sea change is in process. According to Tim Crawford, a former CIO and strategic advisor to CIOs, “CIOs are getting out of the data center business”. In my discussions, not all yet see the complete demise of their data centers. However, it is becoming more common for CIOs to see themselves “becoming an orchestrator of business services versus a builder of new operational services”. One CIO put it this way, “the building stuff is now really table stakes. Cloud and loosely oriented partnerships are bringing vendor management to the forefront”.
As more and more of the service portfolio is provided by third parties in either infrastructure as a service (IaaS) or software as a service (SaaS) modes, the CIO needs to take on what will become an increasingly important role: the service broker. An element of the service broker role that will have increasing importance is the ability to glue together business systems whether they are on premise, cloud managed (IaaS), or software as a service (SaaS). Regardless of who creates or manages the applications of the enterprise, it is important to remember that integration is to a large degree the nervous system that connects applications into business capabilities. As such, the CIO’s team has a critical and continuing role in managing this linkage. For example, spaghetti-code integrations can easily touch 20 or more systems for ERP or expense management systems.
Brokering integration services
As CIOs start to consider the move to cloud, they need to determine how this nervous system is connected, maintained, and improved. In particular, they need to determine, maybe for the first time, how to integrate their cloud systems with the rest of their enterprise systems. They can clearly continue to do so by building and maintaining hand-coded integrations or by using their existing ETL tools. This can work where one takes on an infrastructure as a service model. But it falls apart when looking at the total cost of ownership of managing change in a SaaS model. This fact begs an interesting question: shouldn’t the advantages of SaaS apply to integration as well? Shouldn’t there be Cloud Data Management (integration as a service) options? The answer is yes. Instead of investing in maintaining integrations to SaaS systems, which because of agile methodologies can change more frequently than traditional software, why not let someone else manage this mess for you?
The advantage of the SaaS model is lower total cost of ownership and faster time to value. Instead of your managing the integration between SaaS and historical environments, the integration between SaaS applications and historical applications can be maintained by the Cloud Data Management vendor. This saves both cost and time. As well, it frees you to focus your team’s energy on cleaning up the integrations among historical systems. This is a big advantage for organizations trying to get on the SaaS bandwagon without incurring significantly increased costs as a result.
Infrastructure as a Service (IaaS)—Provides processor, databases, etc. remotely but you control and maintain what goes on them
Software as a Service (SaaS)—Provides software applications and underlying infrastructure as a service
Cloud Data Management—Provides Integration of applications in particular SaaS applications as a service
CIOs are embarking upon big changes. Building stuff is becoming less and less relevant. However, even as more and more services are managed remotely (even by other parties), it remains critical that CIOs and their teams manage the glue between applications. With SaaS application in particular, this is where Cloud Data Management can really help you control integrations with less time and cost.
Author Twitter: @MylesSuer
The start of the year is a great time to refresh and take a new look at your capabilities, goals, and plans for your future-state architecture. That being said, you have to take into consideration that the most scarce resource in your architecture is probably your own personal time.
Looking forward, here are three things that I would recommend that every architect do. I realize that all three of these relate to data, but as I have said in the eBook, Think “Data First” to Drive Business Value, we believe that data is the key bottleneck in your enterprise architecture in terms of slowing the delivery of business initiatives in support of your organization’s business strategy.
So, here are the recommendations. None of these will cost you anything if you are a current Informatica PowerCenter customer. And #2 and #3 are free regardless. It is only a matter of your time:
1. Take a look at the current Informatica Cloud offering and in particular the templating capabilities.
Informatica Cloud is probably much more capable than you think. The standard templating functionality supports very complex use cases and does it all from a very easy to use, no-coding user interface. It comes with a strong library of integration stubs that can be dragged and dropped into Microsoft Visio to create complex integrations. Once the flow is designed in Visio, it can be easily imported into Informatica Cloud, and from there users have a Wizard-driven UI to do the final customization for sources, targets, mappings, transformations, filters, etc. It is all very powerful and easy to use.
- YouTube: Building Custom templates https://www.youtube.com/watch?v=yHmFkxov6bs
- 30 day free Informatica Cloud trial. http://more.informatica.com/en/cloud_trial/org?offer=30day-ICwebPage
Why This Matters to Architects
- You will see how easy it is for new groups to get going with fairly complex integrations.
- This is a great tool for departmental or new user use, and it will be completely compatible with the rest of your Informatica architecture – not another technology silo for you to manage.
- Any mapping created for Informatica on-premise can also run on the cloud version.
2. Download Informatica Rev and understand what it can do for your analysts and “data wranglers.”
Your data analysts are spending 80% of their time managing their data and only 20% on the actual analysis they are trying to provide. Informatica Rev is a great way to prepare your data before use in analytics tools such as Qlik, Tableau, and others.
With Informatica Rev, people who are not data experts can access, mashup, prototype and cleanse their data all in a User Interface that looks like a spreadsheet and requires no previous experience in data tools.
- For a free Informatica Rev download https://rev.informatica.com/
- Informatica Rev (Project Springbok) demo https://www.youtube.com/watch?v=0F_58bHKDDs
Why This Matters for Architects
- Your data analysts are going to use analytics tools with or without the help of IT. This enables you to help them while ensuring that they are managing their data well and optimizing their productivity.
- This tool will also enable them to share their “data recipes” and for IT to be involved in how they access and use the organization’s data.
3. Look at the new features in PowerCenter 9.6. First, upgrade to 9.6 if you haven’t already, and particularly take a good look at these new capabilities that are bundled in every version. Many people we talk to have 9.6 but don’t realize the power of what they already own.
- Profiling: Discover and analyze your data quickly. Find relationships and data issues.
- Data Services: This presents any JDBC or ODBC repository as a logical data object. From there you can rapidly prototype new applications using these logical objects without worrying about the complexities of the underlying repositories. It can also do data cleansing on the fly.
- Webinar: Great Data by Design. https://www.brighttalk.com/webcast/10477/104939
- PowerCenter 9.6 deep dive demo https://www.brighttalk.com/webcast/10477/110535
Why This Matters for Architects
- The key challenge for IT and for Architects is to be able to deliver at the “speed of business.” These tools can dramatically improve the productivity of your team and speed the delivery of projects for your business “customers.”
Taking the time to understand what these tools can do in terms of increasing the productivity of your IT team and enabling your end users to self-service will make you a better business partner overall and increase your influence across the organization. Have a great year!
The current trend is that new types of data and new types of physical storage are changing traditional data warehouse architectures.
When I got back from my trip, I found a TDWI white paper by Philip Russom that describes the situation very well, detailing his research on this subject: Evolving Data Warehouse Architectures in the Age of Big Data.
From an enterprise data architecture and management point of view, this is a very interesting paper.
- First, data warehouse architectures are getting complex because of all the new physical storage options available:
- Hadoop – very large scale and inexpensive
- NoSQL DBMS – beyond tabular data
- Columnar DBMS – very fast seek time
- DW Appliances – very fast / very expensive
- What is driving these changes is the rapidly-increasing complexity of data. Data volume has captured the imagination of the press, but it is really the rising complexity of the data types that is going to challenge architects.
- But here is what really jumped out at me. When they asked the people in their survey what the important components of their data warehouse architecture are, the answer came back: standards and rules. Specifically, they meant how data is modeled, how data quality metrics are created, metadata requirements, interfaces for data integration, etc.
The conclusion for me, from this part of the survey, was that business strategy is requiring more complex data for better analyses (example: real-time response or proactive recommendations) and business processes (example: advanced customer service). This, in turn, is driving IT to look into more advanced technology to deal with different data types and different use cases for the data. And finally, the way they are dealing with the exploding complexity is through standards, particularly data standards. If you are dealing with increasing complexity and have to do it better, faster and cheaper, the only way you are going to survive is by standardizing as much as reasonably makes sense. But not a bit more.
If you think about it, it is good advice. Get your data standards in place first. It is the best way to manage the data and technology complexity. …And a chance to be the driver rather than the driven.
I highly recommend reading this white paper. There is far more in it than I can cover here. There is also a Philip Russom webinar on DW Architecture that I recommend.
A month ago, I shared that Frank Friedman believes CFOs are “the logical choice to own analytics and put them to work to serve the organization’s needs”. Even though many CFOs are increasingly taking on what could be considered an internal CEO or COO role, many readers protested my post, which focused on reviewing Frank Friedman’s argument. At the same time, CIOs have been very clear with me that they do not want to personally become their company’s data steward. So the question becomes: should companies create a CDO or CAO role to lead this important function? And if so, how common are these two roles anyway?
Regardless of eventual ownership, extracting value out of data is becoming a critical business capability. It is clear that data scientists should not be shoehorned into the traditional business analyst role. Data scientists have the unique ability to derive mathematical models “for the extraction of knowledge from data” (Data Science for Business, Foster Provost, 2013, pg 2). For this reason, Thomas Davenport claims that data scientists need to be able to network across an entire business and work at the intersection of business goals, constraints, processes, available data, and analytical possibilities. Given this, many organizations today are starting to experiment with having either a chief data officer (CDO) or a chief analytics officer (CAO). The open question is: should an enterprise have a CDO, a CAO, or both? And, just as important, where should each of these roles report in the organization?
Data policy versus business questions
In my opinion, it is critical to first look into the substance of each role before answering these questions. The CDO role should be about ensuring that information is properly secured, stored, transmitted, or destroyed. This includes, according to COBIT 5, establishing effective security and controls over information systems. To do this, procedures need to be defined and implemented to ensure the integrity and consistency of information stored in databases, data warehouses, and data archives. According to COBIT 5, data governance requires the following four elements:
- Clear information ownership
- Timely, correct information
- Clear enterprise architecture and efficiency
- Compliance and security
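The four elements above could be sketched as a minimal governance metadata record attached to each data asset. This is an illustrative sketch only; the field names, classification labels, and freshness rule are hypothetical assumptions, not taken from COBIT 5 itself.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class DataAsset:
    """Hypothetical governance record mapping to the four COBIT elements."""
    name: str
    owner: str                        # 1. clear information ownership
    last_refreshed: date              # 2. timely, correct information
    system_of_record: str             # 3. clear enterprise architecture
    classification: str = "internal"  # 4. compliance and security

    def is_stale(self, as_of: date, max_age_days: int = 30) -> bool:
        """Flag assets whose data is older than the freshness standard."""
        return (as_of - self.last_refreshed).days > max_age_days

asset = DataAsset(
    name="customer_master",
    owner="VP Sales Operations",
    last_refreshed=date(2014, 6, 1),
    system_of_record="CRM",
    classification="confidential",
)
print(asset.is_stale(as_of=date(2014, 7, 15)))  # 44 days old -> True
```

The design point is that ownership, timeliness, architecture, and classification become explicit, checkable attributes rather than tribal knowledge.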
To me, these four elements should be the essence of the CDO role. The CAO role is related but very different in nature and in the business skills required. The CRISP-DM model points out just how different the two roles are. According to CRISP-DM, the CAO role should be focused upon business understanding, data understanding, data preparation, data modeling, and data evaluation. As such, the CAO is focused on using data to solve business problems, while the CDO is about protecting data as a business-critical asset. I was living in Silicon Valley during the “Internet Bust”. I remember seeing very few job descriptions, and the few that existed said they wanted a developer who could also act as a product manager and do some marketing as a part-time activity. This, of course, made no sense. I feel the same way about the idea of combining the CDO and CAO. One is about compliance and protecting data; the other is about solving business problems with data. Peanut butter and chocolate may work in a Reese’s cup, but it will not work here: the orientations are too different.
So which business leader should own the CDO and CAO?
Clearly, having two more C’s in the C-suite creates a more crowded list of corporate officers. Some have even said that this will extend what is called senior executive bloat. And, of course, how do these new roles work with and impact the CIO? The answer depends on the organization’s culture. However, where there isn’t an executive staff office, I suggest that these roles go to different places. Many companies already have their CIO function reporting to finance. Where this is the case, it is important to determine whether a COO function is in place. The COO clearly could own the CDO and CAO functions, given the COO’s significant role in improving business processes and capabilities. Where there isn’t a COO function and the CIO reports to the CEO, I think you could have the CDO report to the CIO, even though CIOs say they do not want to be data stewards. This could be a third function in parallel with the VP of Ops and the VP of Apps. In this case, I would have the CAO report to one of the following: the CFO, Strategy, or IT. Again, this all depends on current organizational structure and corporate culture. Regardless of where it reports, the important thing is to focus the CAO on an enterprise analytics capability.
Author Twitter: @MylesSuer