Category Archives: CIO

How a Business-led Approach Displaces an IT-led Project

In my previous blog, I talked about how a business-led approach can displace technology-led projects. Historically, IT-led projects have consumed significant capital while returning minimal business value. That post also discussed how transformation roadmap execution becomes sustainable when the business drives the effort and initiative investments are directly traceable to priority business goals.

For example, an insurance company wants to improve the overall customer experience. A mature business architecture practice will perform an assessment to highlight all customer touch points. This requires a detailed capability map; fully formed, customer-triggered value streams; value stream/capability cross-mappings; and stakeholder/value stream cross-mappings. These business blueprints allow architects and analysts to pinpoint customer trigger points, customer interaction points and the participating stakeholders engaged in value delivery.
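
To make these cross-mappings concrete, here is a minimal sketch in Python, expressed as plain data structures. The "Settle Claim" value stream, its stages, capabilities and stakeholders are invented for the insurance example above; a real business architecture would hold these blueprints in a dedicated repository rather than in code.

    value_stream = {
        "name": "Settle Claim",                     # customer-triggered value stream
        "trigger": "Policyholder submits a claim",
        "stages": ["Receive Claim", "Assess Claim", "Pay Claim"],
    }

    capability_map = {                              # stage -> enabling capabilities
        "Receive Claim": ["Claim Intake Management", "Customer Information Management"],
        "Assess Claim": ["Risk Assessment", "Policy Management"],
        "Pay Claim": ["Payment Management", "Customer Communication Management"],
    }

    stakeholder_map = {                             # stage -> participating stakeholders
        "Receive Claim": ["Policyholder", "Call Center Agent"],
        "Assess Claim": ["Claims Adjuster"],
        "Pay Claim": ["Policyholder", "Finance Analyst"],
    }

    # Customer interaction points are simply the stages in which the customer participates.
    touch_points = [stage for stage, actors in stakeholder_map.items() if "Policyholder" in actors]
    print(touch_points)   # ['Receive Claim', 'Pay Claim']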

One must understand that value streams and capabilities are not tied to business units or other structural boundaries. This means that while the analysis performed in our customer experience example may have been initiated by a given business unit, it may be applied universally across business units, product lines and customer segments. Using the business architecture to provide a representative cross-business perspective requires incorporating organization mapping into the mix.

Incorporating the application architecture into the analysis and proposed solution simply extends business architecture mapping into the IT architecture. Robust business architecture is readily mapped to the application architecture, highlighting enterprise software solutions that automate various capabilities, which in turn enable value delivery. Bear in mind, however, that many of the issues highlighted through a business architecture assessment may not have corresponding software deployments, since significant interactions across the business tend to be manual or desktop-enabled. This opens the door to new automation opportunities and new ways to think about business design solutions.

Building and prioritizing the transformation strategy and roadmap is dramatically simplified once all business perspectives needed to enhance customer experience are fully exposed. For example, if customer service is a top priority, then that value stream becomes the number one target, with each stage prioritized based on business value and return on investment. Stakeholder mapping further refines design approaches for optimizing stakeholder engagement, particularly where work is sub-optimized and lacks automation.

Capability mapping to underlying application systems and services provides the basis for establishing a corresponding IT deployment program, where the creation and reuse of standardized services becomes a focal point. In certain cases, a comprehensive application and data architecture transformation becomes a consideration, but in all cases, any action taken will be business and not technology driven.

Once this occurs, everyone will focus on achieving the same goals, tied to the same business perspectives, regardless of the technology involved.

Twitter @bigdatabeat


Good Corporate Governance Is Built Upon Good Information and Data Governance


As you may know, COSO provides the overarching enterprise framework for corporate governance. This includes operations, reporting, and compliance. A key objective for COSO is holding individuals accountable for their internal control responsibilities. The COSO process typically starts by assessing risks and developing sets of control activities to mitigate the discovered risks.

On an ongoing basis, organizations also need to generate relevant, quality information to evaluate the functioning of established internal controls. Finally, they need to select, develop, and perform ongoing evaluations to ascertain whether the internal controls are present and functioning appropriately. Having said all of this, the COSO framework will not be effective without first establishing effective Information and Data Governance.

As a corporate officer, you might be asking yourself why you should care about this topic at all. Isn’t this the job of the CIO or that new person, the CDO? The answer is no. Today’s enterprises are built upon data and analytics. The conundrum here is that “you can’t be analytical without data and you can’t be really good at analytics without really good data” (Analytics at Work, Thomas Davenport, Harvard Business Review Press, page 23). What enterprises tell us they need is great data: data which is clean, safe, and increasingly connected. And yes, the CIO is going to make this happen for you, but they are not going to do it properly without the help of data stewards whom you select from your business units. These stewards need to help the CIO or CDO determine what data matters to the enterprise. What data should be secured? And finally, they will determine what data, information, and knowledge will drive the business’s right to win on an ongoing basis.

So now that you know why your involvement matters, I should share that this control activity is managed by a standard that supports COSO: COBIT 5. To learn specifically what COBIT 5 recommends for Information and Data Governance, please read the article from the latest COBIT Focus entitled “Using COBIT 5 to Deliver Information and Data Governance”.

Twitter: @MylesSuer


Analytics Stories: A Case Study from Quintiles

As I have shared in other posts in this series, businesses are using analytics to improve their internal and external facing business processes and to strengthen their “right to win” within the markets in which they operate. For pharmaceutical businesses, strengthening the right to win begins and ends with the drug product development lifecycle. I remember, for example, talking several years ago to the CFO of a major pharmaceutical company and having him tell me that his most important financial metrics had to do with reducing the time to market for a new drug and maximizing the period of patent protection. Clearly, the faster a pharmaceutical company gets a product to market, the faster it can begin earning a return on its investment.

Fragmented data challenged analytical efforts

At Quintiles, what the business needed was a system with the ability to optimize the design, execution, quality, and management of clinical trials. Management’s goal was to dramatically shorten the time to complete each trial, including quickly identifying when a trial should be terminated. At the same time, management wanted to continuously comply with regulatory scrutiny from the U.S. Food and Drug Administration and to proactively monitor and manage notable trial events.

The problem was that Quintiles’ data was fragmented across multiple systems, and this delayed the ability to make business decisions. Like many organizations, Quintiles had data located in multiple incompatible legacy systems. This meant extensive manual data manipulation was required before the data became useful. As well, the incompatible legacy systems impeded data integration and normalization and prevented a holistic view across all sources. Making matters worse, management felt that it lacked the ability to take corrective actions in a timely manner.

Infosario launched to manage Quintiles analytical challenges

To address these challenges, Quintiles leadership launched the Infosario Clinical Data Management Platform to power its pharmaceutical product development process. Infosario breaks down the information silos that had limited the ability to combine the massive quantities of scientific and operational data collected during clinical development with tens of millions of real-world patient records and population data. This step empowered researchers and drug developers to unlock a holistic view of the data, improving decision-making and ultimately increasing the probability of success at every step in a product’s lifecycle. Quintiles Chief Information Officer Richard Thomas says, “The drug development process is predicated upon the availability of high quality data with which to collaborate and make informed decisions during the evolution of a product or treatment”.

What Quintiles has succeeded in doing with Infosario is the integration of data and processes associated with a drug’s lifecycle. This includes creating a data engine to collect, clean, and prepare data for analysis. The data is then combined with clinical research data and information from other sources to provide a set of predictive analytics. This of course is aimed at impacting business outcomes.

The Infosario solution consists of several core elements

At its core, Infosario provides the data integration and data quality capabilities for extracting and organizing clinical and operational data. The approach combines and harmonizes data from multiple heterogeneous sources into what is called the Infosario Data Factory repository. The goal is to accelerate reporting. Infosario leverages data federation/virtualization technologies to acquire information from disparate sources in a timely manner without affecting the underlying foundational enterprise data warehouse. As well, it implements rule-based, real-time intelligent monitoring and alerting to enable the business to tweak and enhance business processes as needed (a simple sketch of this rule-based alerting idea follows the component list below). A “monitoring and alerting layer” sits on top of the data, with the facility to rapidly provide intelligent alerts to appropriate stakeholders regarding trial-related issues and milestone events. Here are some more specifics on the components of the Infosario solution:

• Data Mastering provides the capability to link multiple domains of data. This enables enterprise information assets to be actively managed, with an integrated view of the hierarchies and relationships.

• Data Management provides the high performance, scalable data integration needed to support enterprise data warehouses and critical operational data stores.

• Data Services provides the ability to combine data from multiple heterogeneous data sources into a single virtualized view. This allows Infosario to utilize data services to accelerate delivery of needed information.

• Complex Event Processing manages the critical task of monitoring enterprise data quality events and delivering alerts to key stakeholders to take necessary action.
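
As promised above, here is a minimal, hypothetical Python sketch of the rule-based monitoring and alerting idea: each rule inspects an incoming trial event and, if it fires, an alert is routed to the appropriate stakeholder. The rule names, event fields, thresholds and roles are invented for illustration; Infosario’s actual rules engine is not public and is certainly far richer.

    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class Rule:
        name: str
        applies: Callable[[dict], bool]   # predicate over one incoming event
        notify: str                       # stakeholder role to alert

    rules = [
        Rule("adverse_event_spike", lambda e: e.get("adverse_events_7d", 0) > 5, "safety_officer"),
        Rule("enrollment_stalled", lambda e: e.get("new_enrollments_30d", 0) == 0, "trial_manager"),
    ]

    def evaluate(event: dict) -> list:
        """Return alert messages for every rule the event triggers."""
        return [f"ALERT -> {r.notify}: {r.name} for site {event.get('site_id')}"
                for r in rules if r.applies(event)]

    print(evaluate({"site_id": "US-014", "adverse_events_7d": 7, "new_enrollments_30d": 0}))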

Parting Thoughts

According to Richard Thomas, “the drug development process rests on the high quality data being used to make informed decisions during the evolution of a product or treatment. Quintiles’ Infosario clinical data management platform gives researchers and drug developers the knowledge needed to improve decision-making and ultimately increase the probability of success at every step in a product’s lifecycle.” This enables enhanced data accuracy, timeliness, and completeness. On the business side, it has enabled Quintiles to establish industry-leading information and insight. And this in turn has enabled faster, more informed decisions and the ability to take action based on insights. Importantly, it has led to a faster time to market and a lengthening of the period of patent protection.

Related links

Related Blogs

Analytics Stories: A Banking Case Study
Analytics Stories: A Financial Services Case Study
Analytics Stories: A Healthcare Case Study
Who Owns Enterprise Analytics and Data?
Competing on Analytics: A Follow Up to Thomas H. Davenport’s Post in HBR
Thomas Davenport Book “Competing On Analytics”

Solution Brief: The Intelligent Data Platform

Author Twitter: @MylesSuer


CES, Digital Strategy and Architecture: Are You Ready?


CES, the International Consumer Electronics Show, is wrapping up this week, and the array of new connected products and technologies was truly impressive. “The Internet of Things” is moving from buzzword to reality.  Some of the major trends seen this week included:

  • Home Hubs from Google, Samsung, and Apple (who did not attend the show but still had a significant impact).
  • Home Hub Ecosystems providing interoperability with cars, door locks, and household appliances.
  • Autonomous cars and intelligent cars.
  • Wearable devices such as smart watches and jewelry.
  • Drones that take pictures and intelligently avoid obstacles.  …Including people trying to block them.  There is a bit of a creepy factor here!
  • The next generation of 3D printers.
  • And the intelligent baby pacifier.  The idea is that it takes the baby’s temperature, but I think the sleeper hit feature on this product is the ability to locate it using GPS and a smart phone. How much money would you pay to get your kid to go to sleep when it is time to do so?

Digital Strategies Are Gaining Momentum

There is no escaping the fact that the vast majority of companies out there have active digital strategies, and not just in the consumer space. The question is: Are you going to be the disruptor or the disruptee?  Gartner offered an interesting prediction here:

“By 2017, 60% of global enterprise organizations will execute on at least one revolutionary and currently unimaginable business transformation effort.”

It is clear from looking at CES that a lot of these products are “experiments” that will ultimately fail.  But to focus too much on that fact is to risk overlooking the profound changes taking place that will shake out industries and allow competitors to leap previously impassable barriers to entry.

IDC predicted that the Internet of Things market would be over $7 Trillion by the year 2020.  We can all argue about the exact number, but something major is clearly happening here.  …And it’s big.

Is Your Organization Ready?

A study by Gartner found that 52% of CEOs and executives say they have a digital strategy.  The problem is that 80% of them say that they will “need adaptation and learning to be effective in the new world.”  Supporting a new “Internet of Things” or connected-device product may require new business models, new business processes, new business partners, new software applications, and the collection and management of entirely new types of data.  Simply standing up a new ERP system or moving to a cloud application will not help your organization deal with the new business models and data complexity.

Architect’s Call to Action

Now is the time (good New Year’s resolution!) to get proactive on your digital strategy.  Your CIO is most likely deeply engaged with her business counterparts to define a digital strategy for the organization. Be proactive in recommending the IT architecture that will enable them to deliver on that strategy – and a roadmap to get to the future-state architecture.

Key Requirements for a Digital-ready Architecture

Digital strategy and products are all about data, so I am going to be very data-focused here.  Here are some of the key requirements:

  • First, it must be designed for speed.  How fast? Your architecture has to enable IT to move at the speed of business, whatever that requires.  Consider the speed at which companies like Google, Amazon and Facebook are making IT changes.
  • It has to explicitly link the business strategy to the underlying business models, processes, systems and technology.
  • Data from any new source, inside or outside your organization, has to be on-boarded quickly and in a way that makes it immediately discoverable and available to all IT and business users (see the sketch after this list).
  • Ongoing data quality management and Data Governance must be built into the architecture.  Point product solutions cannot solve these problems; these capabilities have to be pervasive.
  • Data security also has to be pervasive for the same reasons.
  • It must include business self-service.  That is the only way that IT is going to be able to meet the needs of business users and scale to the demands of the changes required by digital strategy.
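
As an illustration of the on-boarding requirement above, here is a minimal, hypothetical Python sketch: every new source is registered in a simple metadata catalog at ingest time so that it is immediately discoverable. The file, owner and field names are invented; a real data platform would add profiling, quality scoring and security tagging on top of this.

    import csv, datetime

    catalog = []  # stand-in for an enterprise metadata catalog

    def onboard_csv(path, owner, description):
        """Register a newly ingested CSV source so IT and business users can find it."""
        with open(path, newline="") as f:
            columns = next(csv.reader(f))            # header row = discoverable schema
        catalog.append({
            "source": path,
            "columns": columns,
            "owner": owner,
            "description": description,
            "onboarded_at": datetime.datetime.utcnow().isoformat(),
        })

    # onboard_csv("connected_device_events.csv", "iot-team", "connected-device telemetry")
    # print(catalog)   # the new source is now discoverable alongside its metadata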

Resources:

For a webinar on connecting business strategy to the architecture of business transformation, see: Next-Gen Architecture: A “Business First” Approach for Agile Architecture, with John Schmidt of Informatica and Art Caston, founder of Proact.

For next-generation thinking on enterprise data architectures, see: Think “Data First” to Drive Business Value

For more on business self-service for data preparation and a free software download.


Data Security – A Major Concern in 2015


2014 ended with a ton of hype and expectations, and some drama if you are a data security professional or a business executive responsible for shareholder value.  The recent attacks on Sony Pictures by North Korea during December caught everyone’s attention, not over whether Sony would release “The Interview” but over how vulnerable we as a society are to these criminal acts.

I have to admit, I was one of those who saw the movie. I found the film humorous, to say the least, and can see why a desperate regime like North Korea would not want its leader shown admitting he loves margaritas and Katy Perry. What concerned me about the whole event was whether these unwanted security breaches are now just a fact of life.  As a disclaimer, I have no affinity for the North Korean government; still, what transpired was fascinating, and it is amazing that a company like Sony, one of the largest in the world, continues to struggle to protect sensitive data.

According to the Identity Theft Resource Center, there were 761 reported data security breaches in 2014, impacting over 83 million breached records across industries and geographies, with B2B and B2C retailers leading the pack at 79.2% of all breaches. Most of these breaches originated over the internet via malicious worms and viruses purposely designed to identify and relay back sensitive information, including credit card numbers, bank account numbers, and social security information, used by criminals to wreak havoc and inflict significant financial losses on merchants and financial institutions. According to the 2014 Ponemon Institute Research study:

  • The average cost of cyber-crime per company in the US was $12.7 million this year, according to the Ponemon report, and US companies on average are hit with 122 successful attacks per year.
  • Globally, the average annualized cost for the surveyed organizations was $7.6 million per year, ranging from $0.5 million to $61 million per company. Interestingly, small organizations have a higher per-capita cost than large ones ($1,601 versus $437), the report found.
  • Some industries incur higher costs in a breach than others, too. Energy and utility organizations incur the priciest attacks ($13.18 million), followed closely by financial services ($12.97 million). Healthcare incurs the fewest expenses ($1.38 million), the report says.

Despite all the media attention around these awful events last year, 2015 does not seem like it is going to get any better. According to CNBC just this morning, Morgan Stanley reported a data security breach and said it had fired an employee who it claims stole account data for hundreds of thousands of its wealth management clients. Stolen information for approximately 900 of those clients was posted online for a brief period of time.  With so much to gain from this rich data, businesses across industries have a tough battle ahead of them as criminals get more creative and desperate in stealing sensitive information for financial gain. According to Forrester Research, the top 3 breach activities included:

  • Inadvertent misuse by insider (36%)
  • Loss/theft of corporate asset (32%)
  • Phishing (30%)

Given the growth in data volumes fueled by mobile, social, cloud, and electronic payments, the war against data breaches will continue to grow bigger and uglier for firms large and small.  As such, Gartner predicts investments in Information Security Solutions will grow a further 8.2 percent in 2015 over 2014, reaching more than $76.9 billion globally.  Furthermore, by 2018, more than half of organizations will use security services firms that specialize in data protection, security risk management and security infrastructure management to enhance their security postures.

As in any war, you have to know your enemy and what you are defending. In the war against data breaches, this starts with knowing where your sensitive data is before you can effectively defend against any attack. According to the Ponemon Institute, only 18% of firms surveyed said they knew where their structured sensitive data was located, whereas the rest were not sure. 66% revealed that they would not be able to tell whether they had been attacked.   Even worse, 47% were not confident they had visibility into which users were accessing sensitive or confidential information, and 48% of those surveyed admitted to a data breach of some kind in the last 12 months.

In closing, the responsibilities of today’s information security professionals, from Chief Information Security Officers to security analysts, are challenging and growing each day as criminals become more sophisticated and desperate to get their hands on one of your most important assets: your data.  As your organization looks to invest in new Information Security solutions, make sure you start with solutions that allow you to identify where your sensitive data is, so you can plan an effective data security strategy that defends both your perimeter and your sensitive data at the source.   How prepared are you?
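
To make “knowing where your sensitive data is” a little more concrete, here is a minimal, hypothetical Python sketch that flags columns whose sample values look like social security or payment card numbers. The column names and patterns are invented for illustration; commercial data discovery and masking tools go far beyond simple pattern matching.

    import re

    SSN_PATTERN = re.compile(r"^\d{3}-\d{2}-\d{4}$")
    CARD_PATTERN = re.compile(r"^\d{13,16}$")

    def flag_sensitive_columns(table):
        """Return the column names whose sample values match a sensitive pattern."""
        flagged = []
        for column, values in table.items():
            if any(SSN_PATTERN.match(v) or CARD_PATTERN.match(v) for v in values):
                flagged.append(column)
        return flagged

    sample = {
        "customer_name": ["A. Jones", "B. Smith"],
        "ssn": ["123-45-6789", "987-65-4321"],
        "notes": ["called support", "requested refund"],
    }
    print(flag_sensitive_columns(sample))   # ['ssn']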

For more information about Informatica Data Security Solutions:

  • Download the Gartner Data Masking Magic Quadrant Report
  • Click here to learn more about Informatica’s Data Masking Solutions
  • Click here to access Informatica Dynamic Data Masking: Preventing Data Breaches with Benchmark-Proven Performance whitepaper

Imagine A New Sheriff In Town

As we renew or reinvent ourselves for 2015, I wanted to share a case of “imagine if” with you and combine it with the narrative of an American frontier town out West, trying to find a new Sheriff – a Wyatt Earp.  In this case the town is a legacy European communications firm and Wyatt and his brothers are the new managers – the change agents.

Is your new management posse driving change?

Here is a positive word upfront.  This operator has had some success in rolling out broadband internet and IPTV products to residential and business clients to replace its dwindling copper install base.  But it is behind the curve on the wireless penetration side due to the number of smaller, agile MVNOs and two other multi-national operators with a high density of brick-and-mortar stores, excellent brand recognition and support infrastructure.  Having more than a handful of brands certainly did not make this any easier for our CSP.   To make matters even more challenging, price pressure is increasingly squeezing all operators in this market.  The ones able to offset the high-cost Capex for spectrum acquisitions and upgrades with lower-cost Opex for running the network and maximizing subscriber profitability will set themselves up for success (see one of my earlier posts about the same phenomenon in banking).

Not only did they run every single brand on a separate CRM and billing application (including all the various operational and analytical packages), they also ran nearly every customer-facing service (CFS) within a brand the same dysfunctional way.  In the end, they had over 60 CRM applications and the same number of billing applications across all copper, fiber, IPTV, SIM-only, mobile residential and business brands.  Granted, this may be a quite excessive example; but nevertheless, it is relevant for many other legacy operators.

As a consequence, their projections indicate they incur over €600,000 annually in maintaining duplicate customer records (ignoring duplicate base product/offer records for now) due to excessive hardware, software and IT operations.  Moreover, they have to stomach about the same amount for ongoing data quality efforts in IT and the business areas across their broadband and multi-play service segments.

Here are some of the additional benefits they projected:

  • €18.3 million in call center productivity improvement
  • €790,000 improvement in profit due to reduced churn
  • €2.3 million reduction in customer acquisition cost
  • And if you include the fixing of duplicate and conflicting product information, add another €7.3 million in profit via billing error and discount reduction (which is in line with our findings from a prior telco engagement)

Even though major business areas had not contributed to the investigation and the improvement estimates were often on the conservative side, they projected a 14:1 ratio of overall benefit to total project cost.
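
As a rough plausibility check, the sketch below adds up the figures mentioned above and derives the project cost that a 14:1 ratio would imply. This is back-of-envelope arithmetic under the assumption that the listed items are the complete benefit set; the actual project cost is not stated in this post.

    # Figures quoted above, in euros; the project cost is derived purely for illustration.
    benefits_eur = {
        "duplicate customer record upkeep": 600_000,
        "ongoing data quality effort": 600_000,      # "about the same amount"
        "call center productivity": 18_300_000,
        "reduced churn": 790_000,
        "customer acquisition cost": 2_300_000,
        "billing error / discount reduction": 7_300_000,
    }
    total_benefit = sum(benefits_eur.values())       # roughly EUR 29.9 million
    implied_project_cost = total_benefit / 14        # roughly EUR 2.1 million at 14:1
    print(round(total_benefit), round(implied_project_cost))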

Coming back to the “imagine if” aspect now, one would ask how this behemoth of an organization can be fixed.  Well, it will take years, but without new management (in this case, new managers busting through the door), this organization stands a good chance of becoming the next Rocky Mountain mining ghost town.

Busting into the cafeteria with new ideas & looking good while doing it?

The good news is that this operator is seeing some management changes now.  The new folks have a clear understanding that business-as-usual won’t do going forward and that centralization of customer insight (which includes some data elements) has its distinct advantages.  They will tackle new customer analytics, order management, operational data integration (network) and next-best-action use cases incrementally. They know they are in the data business, not just the communications business.  They realize they have to show a rapid succession of quick wins rather than make the organization wait a year or more for first results.  As a result, they have fairly humble initial requirements to get going.

You can equate this to the new Sheriff not going after the whole organization of the three corrupt cattle barons, but just the foreman of one of them for starters.  With little cost involved, the Sheriff acquires some first-hand knowledge, plus he sends a message which will likely persuade others to be more cooperative going forward.

What do you think? Is new management the only way to implement drastic changes around customer experience, profitability or at least understanding?


What is the Role of the CIO in Driving Enterprise Analytics?

When you talk to CIOs today about their business priorities, the top of their list is better connecting what IT is doing to business strategy. Or put another way, it is about establishing business/IT alignment. One area where CIOs need to ensure better alignment is enterprise analytics. CIOs that I have talked to share openly that business users are demanding the ability to reach their apps and data anywhere and on any device. For this reason, even though CIOs say they have interest in the mechanisms of data delivery (data integration, data cleanliness, data governance, data mastering, and even metadata management), they would not take a meeting on these topics. The reason is that CIOs say they would need to involve their business partners in these meetings. CIOs these days want you to have a business value proposition. Given this, CIOs say they want to hear about what the business wants to hear about:

  • Enabling new, valuable business insights from data to happen faster
  • Enabling their businesses to compete with analytics

CIOs as an analytics proponent versus the analytics customer

So if the question is about competing with analytics, what role does the CIO have in setting the agenda here? Tom Davenport says that CIOs, as I heard in my own conversations with CIOs, have good intentions when it comes to developing an enterprise information strategy. They can see the value of taking an enterprise versus a departmental view. Tom suggests, however, that CIOs should start by focusing upon the analytics that will matter most to the business. He says that IT organizations should also build an IT infrastructure capable of delivering the information and analytics that people across the enterprise need, not just now but also in the future.

Tom says that IT organizations must resist the temptation to provide analytics on an add-on or bolt-on basis for whatever transaction systems have just been developed. As a product manager, I had a development team that preferred to add analytics by source rather than do the hard work of creating integrative measures that crossed sources, so I know this problem firsthand. Tom believes that IT needs to build a platform that can be standardized and can integrate data from more than one source. This includes the ability to adapt as business needs and business strategies change.
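
To illustrate what an integrative measure that crosses sources might look like (as opposed to reporting on each source in isolation), here is a tiny hypothetical Python sketch. The source names, customer IDs and figures are invented: revenue comes from a CRM extract, cost-to-serve from an ERP extract, and the measure only exists once the two are joined.

    # Hypothetical CRM and ERP extracts keyed by customer_id.
    crm_revenue = {"C001": 1500.0, "C002": 950.0}        # customer_id -> revenue
    erp_cost_to_serve = {"C001": 400.0, "C002": 700.0}   # customer_id -> cost to serve

    # The integrative measure: margin per customer, which neither source holds alone.
    margin_by_customer = {
        cust: crm_revenue[cust] - erp_cost_to_serve.get(cust, 0.0)
        for cust in crm_revenue
    }
    print(margin_by_customer)   # {'C001': 1100.0, 'C002': 250.0}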

Making this an Enterprise Analytics Capability

In the early stages of analytics, IT organizations need to focus more upon a self-service approach. But as the business matures at analytics, Tom says that IT needs to shift gears and become a proactive advocate and architect of change. Tom says that IT should be a part owner of the company’s analytical capabilities. IT managers, therefore, must understand and be able to articulate the potential for analytics created at an enterprise level. At the same time, the IT staff, which often lacks the heavy mathematical backgrounds of analysts, needs to be able to interact with the analytics pros who use and consume the information that IT creates to build models. I faced this dilemma firsthand when my analytics modelers were disconnected from the BI product developers. They were two different communities working on our project. And although some modelers can build apps or even a BI system, what excites them most in life is building new analytical models.

Talk the language of the business

Tom Davenport says that IT managers can make their own lives easier with the business and with analysts by discussing decision making, insights, and business performance rather than cloud computing, service-oriented architecture, or even OLAP. Meanwhile, Tom feels that the enterprise analytics journey starts with good, integrated data on transactions and business processes managed through enterprise applications like ERP and CRM systems (Analytics at Work, Thomas Davenport, Harvard Business Review Press, page 51).

Focusing on the big questions and the right problems

Clearly, driving the business to focus on the big questions and the right problems is critical. IT cannot do this itself, but it can facilitate it. Why does it matter? An Accenture study found that “companies that derived any real value from them (their analytics) had anticipated how to leverage the information to generate new insights to improve business performance” (“Using Enterprise Systems to Gain Uncommon Competitive Advantage”, Accenture, page 3). This is critical, and too few organizations succeed in doing it.

With this accomplished, and to achieve the second goal, IT needs to eliminate legacy BI systems and old spaghetti code as well as siloed data marts. The goal should be to replace them with an enterprise analytics capability that answers the big questions. This requires standardization around an enterprise-wide approach that ensures a consistent approach to data management and provides an integrated environment complete with data repositories/data lakes, analytical tools, presentation applications, and transformational tools. This investment should be focused on improving business processes or providing the data needed for system-of-systems products. Tom says that IT’s job is to watch out for current and future users of information systems.

Parting Thoughts

So the question is: where is your IT organization today? Clearly, it is important that IT measure enterprise analytic initiatives too. IT should measure adoption. IT should find out what is used and what is not. I once had a CIO admit to me that he did not know whether currently supported data marts were being used or even still had value. It is important that we have these answers. Clearly, being close to the business customer from the start can prevent the situation this CIO described.

Related Blogs and Links

Analytics Stories: A Banking Case Study

Analytics Stories: A Financial Services Case Study

Analytics Stories: A Healthcare Case Study

Who Owns Enterprise Analytics and Data?

Competing on Analytics: A Follow Up to Thomas H. Davenport’s Post in HBR

Thomas Davenport Book “Competing On Analytics”

Solution Brief: The Intelligent Data Platform

Author Twitter: @MylesSuer

 


Achieving Great Data in the Oil and Gas Industry

Have you noticed something different this winter season that most people are cheery about?  I’ll give you a hint. It’s not the great sales going on at your local shopping mall, but something that makes getting to the mall a lot more affordable than last year. It’s the extremely low gas prices across the globe, fueled by an over-supply of oil relative to demand, driven by geopolitics and the boom in shale oil production in North America and abroad. As with any other commodity, it is impossible to predict where oil prices are headed. One thing is sure, however: Oil and Gas companies will need timely, quality data as they invest in new technologies to become more agile, innovative, efficient, and competitive, as reported in a recent IDC Energy Insights Predictions report for 2015.

The report predicts:

  1. 80% of the top O&G companies will reengineer processes and systems to optimize logistics, hedge risk and efficiently and safely deliver crude, LNG, and refined products by the end of 2017.
  2. Over the next 3 years, 40% of O&G majors and all software divisions of oilfield services (OFS) will co-innovate on domain specific technical projects with IT professional service firms.
  3. The CEO will expect immediate and accurate information about top Shale Plays to be available by the end of 2015 to improve asset value by 30%.
  4. By 2016, 70% of O&G companies will have invested in programs to evolve the IT environment to a third platform driven architecture to support agility and readily adapt to change.
  5. With continued labor shortages and over 1/3 of the O&G workforce under 45 in three years, O&G companies will turn to IT to meet productivity goals.
  6. By the end of 2017, 100% of the top 25 O&G companies will apply modeling and simulation tools and services to optimize oil field development programs and 25% will require these tools.
  7. Spending on connectivity related technologies will increase by 30% between 2014 and 2016, as O&G companies demand vendors provide the right balance of connectivity for a more complex set of data sources.
  8. In 2015, mergers, acquisitions and divestitures, plus new integrated capabilities, will drive 40% of O&G companies to re-evaluate their current deployments of ERP and hydrocarbon accounting.
  9. With a business case built on predictive analytics and optimization in drilling, production and asset integrity, 50% of O&G companies will have advanced analytics capabilities in place by 2016.
  10. With pressures on capital efficiency, by 2015, 25% of the Top 25 O&G companies will apply integrated planning and information to large capital projects, speeding up delivery and reducing over-budget risks by 30%.

Realizing value from these investments will also require Oil and Gas firms to modernize and improve their data management infrastructure and technologies to deliver great data, whether to fuel actionable insights from Big Data technology or to facilitate post-merger application consolidation and integration activities.  Great data is only achievable through great design, supported by capable solutions that help access and deliver timely, trusted, and secure data to those who need it most.

A lack of proper data management investments and competencies has long plagued the oil and gas sector with “less-than-acceptable” data and higher operating costs. According to the “Upstream Data and Information Management Survey” conducted by Wipro Technologies, 56% of those surveyed felt that business users spent a quarter or more of their time on low-value activities caused by existing data issues (e.g. accessing, cleansing, preparing data) instead of on high-value activities (e.g. analysis, planning, decision making).  The same survey showed the biggest data management issues were timely access to required data and data quality issues from source systems.

So what can Oil and Gas CIOs and Enterprise Architects do to prepare for the future?  Here are some tips for consideration:

  • Look to migrate and automate legacy hand-coded data transformation processes by adopting tools that streamline the development, testing, deployment, and maintenance of these complex tasks, helping developers build, maintain, and monitor data transformation rules once and deploy them across the enterprise.
  • Simplify how data is distributed across systems with more modern architectures and solutions and avoid the cost and complexities of point to point integrations
  • Deal with and manage data quality upstream at the source and throughout the data life cycle, rather than having end users fix unforeseen data quality errors manually (see the sketch after this list).
  • Create a centralized source of shared business reference and master data that can manage a consistent record across heterogeneous systems such as well asset/material information (wellhead, field, pump, valve, etc.), employee data (drill/reservoir engineer, technician), location data (often geo-spatial), and accounting data (for financial roll-ups of cost, production data).
  • Establish standards and repeatable best practices by adopting an Integration Competency Center framework to support the integration and sharing of data between operational and analytical systems.
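
As a small illustration of handling data quality upstream at the source, here is a minimal, hypothetical validation sketch in Python for an incoming well master record. The field names, allowed statuses and ranges are invented; a real implementation would drive such rules from a data quality tool rather than hand-written code.

    def validate_well_record(record):
        """Return a list of data quality violations for one incoming well record."""
        errors = []
        if not record.get("well_id"):
            errors.append("missing well_id")
        if record.get("status") not in {"producing", "shut-in", "abandoned"}:
            errors.append(f"invalid status: {record.get('status')!r}")
        lat = record.get("latitude")
        if lat is None or not -90 <= lat <= 90:
            errors.append("latitude out of range")
        return errors

    incoming = {"well_id": "W-1042", "status": "producing", "latitude": 48.7}
    problems = validate_well_record(incoming)
    if problems:
        print("Reject at source:", problems)    # fix before it propagates downstream
    else:
        print("Accepted into the central master data hub")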

In summary, low oil prices have a direct and positive impact on consumers, especially during the winter season and holidays, and I personally hope they continue for the foreseeable future, given that prices were double just a year ago. Unfortunately, no one can predict future energy prices. One thing is for sure, however: the demand for great data by Oil and Gas companies will continue to grow. As such, CIOs and Enterprise Architects will need to recognize the importance of improving their data management capabilities and technologies to ensure success in 2015. How ready are you?

Click to learn more about Informatica in today’s Energy Sector:


Happy Holidays, Happy HoliData.


In case you have missed our #HappyHoliData series on Twitter and LinkedIn, I decided to provide a short summary of best practices which are unleashing information potential. Simply scroll and click on the case study which is relevant for you and your business. The series touches on different industries and use cases, but all have one thing in common: all consider information quality a key value to their business, enabling them to deliver the right services or products to the right customer.

[Gallery: #HappyHoliData case studies 1–24]

Thanks a lot to all my great teammates, who made this series happen.

Happy Holidays, Happy HoliData.


It is All About the “I”

Six ideas for CIOs in 2015 to put the innovation back in CIO

For most, the “I” in CIO stands for Information. But what about that other “I”, Innovation? For many IT organizations, 60-80% of IT spending continues to be tied up in keeping the IT lights on. But innovation matters more than ever to the business bottom line. According to Geoffrey Moore, “without innovation, offerings become more and more like each other. They commoditize.” (“Dealing with Darwin”, Geoffrey Moore, page 1). Geoffrey goes on to say later in “Dealing with Darwin” that commoditization will, over time, drop business returns to “the cost of capital”. So clearly, this is a place that no CIO would want their enterprise to consciously go.

Given this, what is the role of the CIO in driving enterprise innovation? I believe that it is a significant one. Without question, technology investment has been a major driver of enterprise productivity gains. At the same time, IT investment has had a major role in improving business capabilities and the business value chains. And more recently, IT is even carving out a role in products themselves as part of the IoT. So how can CIOs help drive business innovation?

1) Get closer to your business customers. CIOs have said to me that their number one priority is connecting what IT is doing to what the business is doing. Given this, CIOs should make it a real priority for their teams to get closer to the business this year. According to Kamalini Ramdas’ article in Harvard Business Review, “to succeed at innovation, you need to have a culture in which everyone in the company is constantly scanning for ideas”.

2) Develop internal design partners. When I have started new businesses, I have always created a set of design partners to ensure that I built the right products. I tell my design partners to beat me up now rather than after I build the product. You need, as Kamalini Ramdas suggests, to harvest the best ideas of your corporate team just like I did with startups. You can start by focusing your attention upon the areas of distinctive capability—the places that give your firm its right to win.

3) Enable your IT leaders and individual contributors to innovate. For many businesses, speed to market or speed of business processes can represent a competitive advantage. Foundational to this are IT capabilities including uptime, system performance, speed of project delivery, and the list goes on. Encouraging everyone on your team to drive superior operational capabilities can enable business competitive advantage. And one more thing: make sure to work with your business leaders to pass a portion of the business impact from improvements into a bonus for the entire enabling IT team. Lincoln Electric, the arc welding company, used team bonuses to continuously improve its products, sharing the money saved from each process improvement with the entire team. It ends up getting the best teams and the highest team longevity, as the teams’ work improves product quality and increases cost take-out. According to Kamalini, “in truly innovative culture, leaders need to imbue every employee with a clear vision and a sense of empowerment that helps them identify synergistic ideas and run with them” (“Build a Company Where Everyone’s Looking for New Ideas”, Harvard Business Review, page 1).

4) Architect for Innovation. As the velocity of change increases, businesses need IT organizations to be able to move more quickly. This requires an enterprise architecture built for agility. According to Jeanne Ross, the more agile companies have a high percentage of their core business processes digitized and have also standardized their technology architecture (Enterprise Architecture as Strategy, Jeanne Ross, page 12).

5) Look for disruptive innovations. I remember a professor of mine suggesting, in a discussion of futures research, that we cannot predict the future. But I believe that you can instead get closer to your customers than anyone else. CIOs should dedicate a non-trivial portion of IT spend to germinating potentially disruptive ideas. They should use their design partners to select what gets early-stage funding. Everyone here should act like a seed-stage venture capitalist. You need to let people experiment. At the same time, design partners should set reasonable goals and actively measure performance toward those goals.

6) Use analytics. Look at business analytics for areas that could use IT’s help. Open up discussions with design partners about areas needing capability improvement. This is a great place to start. Look as well for gaps in business delivery where further or improved digitization/automation could drive better performance. And once an innovation is initiated, use analytics to actively manage the innovation’s delivery.

Final remarks

There is always more that you can do to innovate. The key thing is to get innovation front and center on the IT agenda. Actively sponsor it and most importantly empower the team to do remarkable things. And when this happens, reward the teams that made it happen.

IT Is All About Data!
The Secret To Being A Successful CIO
Driving IT Business Alignment: One CIOs Journey
How Is The CIO Role Starting To Change?
The CIO Challenged

Twitter: @MylesSuer
