Tag Archives: Best Practices

The 3 Little Architects and the Big Bad Mr. Wolf – A Data Parody for today’s Financial Industry

Once upon a time, there were three Information Architects working in the financial services industry, each at a different firm and from a different background, but all responsible for recommending the right technology to help their firms comply with industry regulations, including the ongoing bank stress tests conducted across the globe. Since 2008, bank regulators have focused on measuring systemic risk and on requiring banks to provide transparency into how risk is measured and reported to support their capital adequacy requirements.

The first architect rose through the ranks, starting as a Database Administrator and a black belt in SQL and COBOL programming. Hand coding was in his DNA for many years and was thought of as the best approach, given how customized his firm’s business and systems were compared with other organizations. As such, Architect #1 and his team went down the path of building their data management capabilities through custom hand-coded scripts, manual data extractions and transformations, and handling data quality issues in the business organizations after the data was delivered. Though this approach delivered on their short-term needs, the firm soon realized the overhead required to make changes and respond to new requests driven by new industry regulations and changing market conditions.

The second architect is a “gadget guy” at heart who grew up using off-the-shelf tools rather than hand coding to manage data. He and his team decided not to hand code their data management processes and instead built their solution from best-of-breed tools, some open source, others from solutions the company already had from previous data integration, data quality, and metadata management projects. Though these tools automated much of the “heavy lifting,” he and his IT team were still responsible for integrating the point solutions to work together, which required ongoing support and change management.

The last architect was as technically competent as his peers but understood the value of building something once and using it across the business. His approach was a little different from the first two. Understanding the risks and costs of hand coding or of using one-off tools to do the work, he decided to adopt an integrated platform designed to handle the complexity, sources, and volumes of data required by the business. The platform also incorporated shared metadata, reusable data transformation rules and mappings, a single source of required master and reference data, and agile development capabilities to reduce the cost of implementation and ongoing change management. Though this approach was more expensive to implement, the long-term cost and performance benefits made the decision a “no brainer.”

Lurking in the woods is Mr. Wolf. Mr. Wolf is not your typical antagonist; he is a regulatory auditor whose responsibility is to ensure these banks can explain how the risk they report to the regulatory authorities is calculated. His job isn’t to shut these banks down, but to make sure the financial industry is able to measure risk across the enterprise, explain how that risk is measured, and ensure these firms are adequately capitalized as mandated by new and existing industry regulations.

Mr. Wolf visits the first bank for an annual stress test audit. Looking at the results of their stress test, he asks the compliance teams to explain how their data was produced, transformed, and calculated to support the risk measurements they reported as part of the audit. Unfortunately, because of the first architect’s recommendation to hand code their data management processes, IT failed to provide explanations and documentation of what they had done, and they found that the developers who built their systems were no longer with the firm. As a result, the bank failed miserably, incurring stiff penalties and higher audit costs.

Architect #2’s bank was next. Having heard in the news what happened to their peer, the architect and IT teams were confident they were in good shape to pass their stress test audit. After digging into the risk reports, however, Mr. Wolf questioned the validity of the data used to calculate Value at Risk (VaR). Unfortunately, the tools that had been adopted were never designed nor guaranteed by the vendors to work with one another, resulting in invalid data mappings and data quality rules and in gaps within their technical metadata documentation. As a result, bank #2 also failed their audit and found themselves with a pile of one-off tools that automated their data management processes but lacked the integration and sharing of rules and metadata needed to satisfy the regulator’s demand for risk transparency.

Finally, Mr. Wolf investigated Architect #3’s firm. Having seen the results at the first two banks, Mr. Wolf was leery of their ability to pass their stress test audit. He made similar demands, but this time Bank #3 provided detailed and comprehensive metadata documentation of their risk data measurements, descriptions of the data used in each report, a comprehensive report of each data quality rule used to cleanse their data, and detailed information on each counterparty and legal entity used to calculate VaR. Unable to find gaps in their audit, Mr. Wolf, who had expected to “blow” the house down, delivered a passing grade to Bank #3 and its management team thanks to the right investments they had made to support their enterprise risk data management needs.

The moral of this story, like that of the familiar tale of the three little pigs, is the importance of having a solid foundation to weather market and regulatory storms, or the violent bellow of a big bad wolf: a foundation that covers the required data integration, data quality, master data management, and metadata management needs, and that also supports collaboration and visibility into how data is produced, used, and performing across the business. Ensuring current and future compliance in today’s financial services industry requires firms to have a solid data management platform, one that is intelligent, comprehensive, and allows Information Architects to mitigate the risks and costs of hand coding or of using point tools that only get by in the short term.

Are you prepared to meet Mr. Wolf?


Achieving Great Data in the Oil and Gas Industry

Have you noticed something different this winter season that most people are cheery about? I’ll give you a hint. It’s not the great sales going on at your local shopping mall, but something that makes getting to the mall a lot more affordable than last year: the extremely low gas prices across the globe, fueled by an oversupply of oil relative to demand, driven by geopolitics and the boom in shale oil production in North America and abroad. Like any other commodity, it’s impossible to predict where oil prices are headed; however, one thing is certain: oil and gas companies will need timely, quality data as they invest in new technologies to become more agile, innovative, efficient, and competitive, as reported in a recent IDC Energy Insights Predictions report for 2015.

The report predicts:

  1. 80% of the top O&G companies will reengineer processes and systems to optimize logistics, hedge risk and efficiently and safely deliver crude, LNG, and refined products by the end of 2017.
  2. Over the next 3 years, 40% of O&G majors and all software divisions of oilfield services (OFS) will co-innovate on domain specific technical projects with IT professional service firms.
  3. The CEO will expect immediate and accurate information about top Shale Plays to be available by the end of 2015 to improve asset value by 30%.
  4. By 2016, 70% of O&G companies will have invested in programs to evolve the IT environment to a third platform driven architecture to support agility and readily adapt to change.
  5. With continued labor shortages and over 1/3 of the O&G workforce under 45 in three years, O&G companies will turn to IT to meet productivity goals.
  6. By the end of 2017, 100% of the top 25 O&G companies will apply modeling and simulation tools and services to optimize oil field development programs and 25% will require these tools.
  7. Spending on connectivity related technologies will increase by 30% between 2014 and 2016, as O&G companies demand vendors provide the right balance of connectivity for a more complex set of data sources.
  8. In 2015, mergers, acquisitions and divestitures, plus new integrated capabilities, will drive 40% of O&G companies to re-evaluate their current deployments of ERP and hydrocarbon accounting.
  9. With a business case built on predictive analytics and optimization in drilling, production and asset integrity, 50% of O&G companies will have advanced analytics capabilities in place by 2016.
  10. With pressures on capital efficiency, by 2015, 25% of the Top 25 O&G companies will apply integrated planning and information to large capital projects, speeding up delivery and reducing over-budget risks by 30%.

Realizing value from these investments will also require oil and gas firms to modernize and improve their data management infrastructure and technologies to deliver great data, whether that means fueling actionable insights from Big Data technology or facilitating post-merger application consolidation and integration activities. Great data is only achievable through great design, supported by capable solutions built to access and deliver timely, trusted, and secure data to those who need it most.

A lack of proper data management investments and competencies has long plagued the oil and gas sector with “less-than-acceptable” data and higher operating costs. According to the “Upstream Data and Information Management Survey” conducted by Wipro Technologies, 56% of those surveyed felt that business users spent a quarter or more of their time on low-value activities caused by existing data issues (e.g., accessing, cleansing, and preparing data) rather than on high-value activities (e.g., analysis, planning, decision making). The same survey showed the biggest data management issues were timely access to required data and data quality issues in source systems.

So what can oil and gas CIOs and Enterprise Architects do to prepare for the future? Here are some tips for consideration:

  • Migrate and automate legacy hand-coded data transformation processes by adopting tools that streamline the development, testing, deployment, and maintenance of these complex tasks, so developers can build, maintain, and monitor data transformation rules once and deploy them across the enterprise (a minimal sketch of this “define once, reuse everywhere” idea follows this list).
  • Simplify how data is distributed across systems with more modern architectures and solutions, and avoid the cost and complexity of point-to-point integrations.
  • Deal with and manage data quality upstream at the source and throughout the data lifecycle, rather than having end users fix unforeseen data quality errors manually.
  • Create a centralized source of shared business reference and master data that can manage a consistent record across heterogeneous systems, such as well asset/material information (wellhead, field, pump, valve, etc.), employee data (drill/reservoir engineer, technician), location data (often geospatial), and accounting data (for financial roll-ups of cost and production data).
  • Establish standards and repeatable best practices by adopting an Integration Competency Center framework to support the integration and sharing of data between operational and analytical systems.
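To make the first tip a little more concrete, here is a minimal sketch in plain Python of what it means to define transformation and quality rules once and apply them to records from any source system. The field names, rules, and pipeline are purely hypothetical assumptions for illustration; a real data integration platform would manage this as shared metadata rather than application code.

```python
# A minimal sketch (not any vendor's API): shared transformation and quality
# rules defined once and applied to records from any source system.
from dataclasses import dataclass
from typing import Callable, Dict, List, Optional, Tuple

@dataclass
class Rule:
    name: str
    apply: Callable[[Dict], Dict]                   # transformation step
    check: Optional[Callable[[Dict], bool]] = None  # optional quality check

# Hypothetical shared rules, maintained in one place and reused by every pipeline.
RULES: List[Rule] = [
    Rule("standardize_well_id",
         apply=lambda r: {**r, "well_id": str(r.get("well_id", "")).strip().upper()}),
    Rule("require_operator",
         apply=lambda r: r,
         check=lambda r: bool(r.get("operator"))),
]

def run_pipeline(records: List[Dict]) -> Tuple[List[Dict], List[str]]:
    """Apply every shared rule to every record and collect quality violations."""
    cleaned, issues = [], []
    for rec in records:
        for rule in RULES:
            rec = rule.apply(rec)
            if rule.check and not rule.check(rec):
                issues.append(f"{rule.name} failed for {rec.get('well_id')}")
        cleaned.append(rec)
    return cleaned, issues

# The same RULES list can serve the SCADA feed, the ERP extract, or any new source.
records, problems = run_pipeline([{"well_id": " w-001 ", "operator": "Acme Energy"}])
```

The point of the sketch is the design choice, not the code itself: rules live in one shared place, so a new regulation or a new source means updating one list, not hunting down hand-coded scripts.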

In summary, low oil prices have a direct and positive impact on consumers, especially during the winter season and the holidays, and I personally hope they continue for the foreseeable future, given that prices were double just a year ago. Unfortunately, no one can predict future energy prices; one thing is for sure, though: the demand for great data by oil and gas companies will continue to grow. As such, CIOs and Enterprise Architects will need to recognize the importance of improving their data management capabilities and technologies to ensure success in 2015. How ready are you?

Click to learn more about Informatica in today’s Energy Sector:


To Determine The Business Value of Data, Don’t Talk About Data

The title of this article may seem counterintuitive, but the reality is that the business doesn’t care about data.  They care about their business processes and outcomes that generate real value for the organization. All IT professionals know there is huge value in quality data and in having it integrated and consistent across the enterprise.  The challenge is how to prove the business value of data if the business doesn’t care about it. (more…)


Are You Looking For A New Information Governance Framework?

A few months ago, while addressing a room full of IT and business professionals at an Information Governance conference, a CFO said, “… if we designed our systems today from scratch, they would look nothing like the environment we own.” He went on to elaborate that they arrived there by layering thousands of good and valid decisions on top of one another.

Similarly, Information Governance has evolved out of the good work done by those who preceded us, into something only a few could have envisioned. Along the way, technology evolved and changed the way we interact with data to manage our daily tasks. What started as good engineering practices for mainframes gave way to data management.

Then, with technological advances, we encountered new problems, introduced new tasks and disciplines, and created Information Governance in the process. We were standing on the shoulders of data management, armed with new solutions to new problems. Now we face the four Vs of big data, and each of those new data characteristics has introduced a new set of challenges, driving the need for Big Data Information Governance as a response to changing velocity, volume, veracity, and variety.

Do you think we need a different framework?

Before I answer this question, I must ask you: “How comprehensive is the framework you are using today, and how well does it scale to address the new challenges?”

There are several frameworks in the marketplace to choose from. In this blog, I will tell you what questions you need to ask yourself before replacing your old framework with a new one:

Q. Is it nimble?

The focus of data governance practices must allow for nimble responses to changes in technology, customer needs, and internal processes. The organization must be able to respond to emergent technology.

Q. Will it enable you to apply policies and regulations to data brought into the organization by a person or process?

  • Public company: Meet the obligation to protect the investment of the shareholders and manage risk while creating value.
  • Private company: Meet privacy laws even if financial regulations are not applicable.
  • Fulfill the obligations of external regulations from international, national, regional, and local governments.

Q. How does it manage quality?

For big data, the data must be fit for purpose; context might need to be hypothesized for evaluation. Quality does not imply cleansing activities, which might mask the results.
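To make “quality without cleansing” concrete, here is a minimal, hypothetical profiling sketch in Python: it measures whether incoming records are fit for a stated purpose and reports the findings, rather than silently correcting values. The field names are assumptions made up for the example.

```python
# Illustrative only: measure fitness for purpose instead of silently cleansing.
from collections import Counter
from typing import Dict, List

def profile(records: List[Dict], required_fields: List[str]) -> Dict:
    """Report completeness and distinct-value counts; never modify the data."""
    total = len(records)
    completeness = {
        f: sum(1 for r in records if r.get(f) not in (None, "")) / total
        for f in required_fields
    }
    distinct = {f: len(Counter(r.get(f) for r in records)) for f in required_fields}
    return {"rows": total, "completeness": completeness, "distinct_values": distinct}

metrics = profile(
    [{"cust_id": "C1", "country": "US"}, {"cust_id": "C2", "country": ""}],
    required_fields=["cust_id", "country"],
)
# Downstream consumers decide whether 50% country completeness is fit for their purpose.
```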

Q. Does it understand your complete business and information flows?

Attribution and lineage are very important in big data. Knowing the source and the destination of each data set is crucial in validating analytics results as fit for purpose.
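One simple way to make attribution and lineage tangible is to carry a provenance record alongside every derived data set. The structure below is a generic, hypothetical sketch and does not reflect any particular metadata repository or product.

```python
# Generic, hypothetical sketch of a lineage record attached to a derived data set.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

@dataclass
class LineageRecord:
    dataset: str              # what was produced
    sources: List[str]        # where the inputs came from
    transformation: str       # how it was produced
    produced_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

var_report = LineageRecord(
    dataset="daily_var_report",
    sources=["trades_feed", "market_prices", "counterparty_master"],
    transformation="historical-simulation VaR calculation, version 2.3",
)
# An auditor can now trace the report back to its inputs and the logic that produced it.
```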

Q. Does it understand the language that you use, and can the framework manage it actively to reduce ambiguity, redundancy, and inconsistency?

Big data might not have a logical data model, so any structured data should be mapped to the enterprise model. Big data still has context and thus modeling becomes increasingly important to creating knowledge and understanding. The definitions evolve over time and the enterprise must plan to manage the shifting meaning.

Q. Does it manage classification?

It is critical for the business steward to classify each source, and the contents within it, as soon as it is brought in by its owner, in support of information lifecycle management, access control, and regulatory compliance.
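The sketch below illustrates, with made-up classifications and policies, how a steward might register and classify a source at ingestion time so that lifecycle, access, and compliance processes can key off the classification.

```python
# Hypothetical classification applied at ingestion time; tags and policies are made up.
CLASSIFICATION_POLICIES = {
    "public":       {"retention_days": 365,  "access": "all_employees"},
    "confidential": {"retention_days": 2555, "access": "need_to_know"},
    "pii":          {"retention_days": 2555, "access": "restricted", "mask": True},
}

def classify_source(source_name: str, owner: str, classification: str) -> dict:
    """Register a new source with its steward-assigned classification."""
    if classification not in CLASSIFICATION_POLICIES:
        raise ValueError(f"Unknown classification: {classification}")
    return {"source": source_name, "owner": owner,
            "classification": classification,
            **CLASSIFICATION_POLICIES[classification]}

registration = classify_source("clickstream_feed", owner="marketing_steward",
                               classification="pii")
# Lifecycle, access-control, and compliance processes can now key off this registration.
```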

Q. How does it protect data quality and access?

Your information protection must not be compromised for the sake of expediency, convenience, or deadlines. Protect not just what you bring in, but what you join/link it to, and what you derive. Your customers will fault you for failing to protect them from malicious links. The enterprise must formulate the strategy to deal with more data, longer retention periods, more data subject to experimentation, and less process around it, all while trying to derive more value over longer periods.

Q. Does it foster stewardship?

Ensuring the appropriate use and reuse of data requires the action of an employee. That is, this role cannot be automated; it requires the active involvement of a member of the business organization to serve as the steward of the data element or source.

Q. Does it manage long-term requirements?

Policies and standards are the mechanism by which management communicates their long-range business requirements. They are essential to an effective governance program.

Q. How does it manage feedback?

As a companion to policies and standards, an escalation and exception process enables communication throughout the organization when policies and standards conflict with new business requirements. It forms the core process to drive improvements to the policy and standard documents.

Q. Does it foster innovation?

Governance must not squelch innovation. Governance can and should make accommodations for new ideas and growth. This is managed through management of the infrastructure environments as part of the architecture.

Q. How does it control third-party content?

Third-party data plays an expanding role in big data. It comes in three types, and governance controls must be adequate for the circumstances of each. They must also consider applicable regulations for the geographic regions in which you operate; therefore, you must understand and manage those obligations.


Don’t Fire the CIO, Transform the Business

The headline on the Venture Beat website this weekend was “Why you should fire your CIO.” The point of the article was that the rest of the executive suite in most organizations is ignorant about IT issues and has abdicated responsibility to the CIO, or builds its own information solutions without sufficient competence in information management. The article suggests that firing the CIO is one way to pass accountability for information management to the business leaders, since there will be no place for them to hide. They simply won’t be able to deflect the decisions to the CIO. (more…)


Reflections of a Former Analyst

In my last blog, I talked about the dreadful experience of cleaning raw data by hand as a former analyst a few years back. Well, the truth is, I was not alone. At a recent data mining Meetup event in the San Francisco Bay Area, I asked a few analysts: “How much time do you spend on cleaning your data at work?” “More than 80% of my time” and “most of my days,” said the analysts, adding that “they are not fun.”

But check this out: there are over a dozen Meetup groups focused on data science and data mining here in the Bay Area where I live. Those groups put on events multiple times a month, with topics often around hot, emerging technologies such as machine learning, graph analysis, real-time analytics, new algorithms for analyzing social media data, and of course, anything Big Data. Cool BI tools and new programming models and algorithms for better analysis are a big draw for data practitioners these days.

That got me thinking… if what the analysts told me is true, i.e., they spend 80% of their time on data prep and only the remaining 20% analyzing the data and visualizing the results (which, BTW, “is actually fun,” quoting a data analyst), then why are they drawn to events focused on discussing the tools that can only help them 20% of the time? Why wouldn’t they want to explore technologies that can help address the dreadful 80% of data scrubbing they complain about?

Having been there myself, I thought perhaps a little self-reflection would help answer the question.

As a student of math, I love data and am fascinated by the good stories I can discover in it. My two-year math program in graduate school was primarily focused on learning how to build fabulous math models to simulate real events, and on using those formulas to predict the future or look for meaningful patterns.

I used BI and statistical analysis tools while at school, and continued to use them at work after I graduated. Those tools were great in that they helped me get to the results and see what was in my data, so I could develop conclusions and make recommendations based on those insights for my clients. Without BI and visualization tools, I would not have delivered any results.

That was the fun and glamorous part of my job as an analyst. But when I was not creating nice charts and presentations to tell the stories in my data, I was spending time, a great amount of time, sometimes up to the wee hours, cleaning and verifying my data. I was convinced that was part of my job and I just had to suck it up.

It was only a few months ago that I stumbled upon data quality software – it happened when I joined Informatica. At first I thought they were talking to the wrong person when they started pitching me data quality solutions.

It turns out the concept of data quality automation is a highly relevant and extremely intuitive subject to me, and to anyone who deals with data on a regular basis. Data quality software offers an automated process for data cleansing that is much faster and delivers more accurate results than a manual process. To put that in a math context: if a data quality tool can reduce the data cleansing effort from 80% to 40% (btw, this is hardly a random number; some of our customers have reported much better results), that means analysts can free up 40% of their time from scrubbing data and use that time to do the things they like: playing with data in BI tools, building new models or running more scenarios, producing different views of the data, and discovering things they may not have been able to before, all with clean, trusted data. No more bored-to-death experience; what they are left with is improved productivity, more accurate and consistent results, compelling stories about data, and most important, the ability to focus on doing the things they like! Not too shabby, right?
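For readers who have never seen what “automation” means here, below is a minimal, hypothetical pandas sketch of the kind of repetitive standardization, validity flagging, and de-duplication an analyst would otherwise do by hand. The column names and rules are assumptions for illustration; real data quality tools do far more (matching, address validation, monitoring), but the principle is the same.

```python
# Minimal, illustrative cleansing steps using pandas; real tools do much more.
import pandas as pd

def clean(df: pd.DataFrame) -> pd.DataFrame:
    df = df.copy()
    # Standardize text fields in bulk instead of fixing them cell by cell.
    df["email"] = df["email"].str.strip().str.lower()
    df["state"] = df["state"].str.strip().str.upper()
    # Flag obviously invalid values rather than silently deleting rows.
    df["email_valid"] = df["email"].str.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
    # Remove exact duplicates that would otherwise be hunted down by hand.
    return df.drop_duplicates(subset=["email"])

raw = pd.DataFrame({
    "email": [" Ana@Example.com ", "ana@example.com", "not-an-email"],
    "state": ["ca", "CA ", "ny"],
})
print(clean(raw))
```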

I am excited about trying out the data quality tools we have here at Informatica. My fellow analysts, you should start looking into them too. And I will check back in soon with more stories to share.


A Data Integration Love-Fest in Vegas

Question: What do American Airlines, Liberty Mutual, Discount Tire and MD Anderson all have in common?

Is it?

a) They are all top in their field.

b) They all view data as critical to their business success.

c) They are all using Agile Data Integration to drive business agility.

d) They have spoken about their Data Integration strategy at Informatica World in Vegas.

Did you reply “all of the above”? If so, then give yourself a Ding Ding Ding. Or shall we say Ka-Ching, in honor of our host city?

Indeed, data experts from these companies and many more flocked to Las Vegas for Informatica World. They shared their enthusiasm for the important role of data in their businesses. These industry leaders discussed best practices that facilitate an Agile Data Integration process.

American Airlines recently completed a merger with US Airways, making them the largest airline in the world. In order to service critical reporting requirements for the merged airlines, the enterprise data team undertook a huge Data Integration task.  This effort involved large-scale data migration and included many legacy data sources.  The project required transferring over 4TB of current history data for Day 1 reporting. There is still a major task of integrating multiple combined subject areas in order to give a full picture of combined reporting.

American Airlines architects recommend the use of Data Integration design patterns in order to improve agility. The architects shared success factors for merger Data Integration. They discussed the importance of ownership by leadership from IT and the business. They emphasized the benefit of open and honest communication between teams. The architects also highlighted the need to identify integration teams and priorities. Finally, they discussed the significance of understanding cultural differences and celebrating success. The team summarized the merger Data Integration lessons learned: metadata is key, IT and business collaboration is critical, and profiling and access to the data are helpful.

Liberty Mutual, the third largest property and casualty insurer in the US, has grown through acquisitions.  The Data Integration team needs to support this business process.  They have been busy integrating five claim systems into one. They are faced with a large-scale Data Integration challenge. To add to the complexity, their business requires that each phase is completed in one weekend, no data is lost in the process and that all finances balance out at the end of each merge.  Integrating all claims in a single location was critical for smooth processing of insurance claims.  A single system also leads to reduced costs and complexity for support and maintenance.

Liberty Mutual experts recommend a methodology of work preparation, profiling, delivery and validation.  Rinse and repeat. Additionally, the company chose to utilize a visual Data Integration tool. This tool was quick and easy for the team to learn and greatly enhanced development agility.

Discount Tire, the largest independent tire dealer in the USA, shared tips and tricks from migrating legacy data into a new SAP system.  This complex project included data conversion from 50 legacy systems.  The company needs to combine and aggregate data from many systems, including customer, sales, financial and supply chain.  This integrated system helps Discount Tire make key business decisions and remain competitive in a highly competitive space.

Discount Tire has automated its data validation process in development and in production. This reduces testing time, minimizes data defects and increases the agility of development and operations. The company has also implemented proactive monitoring for early detection and correction of data problems in production.
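To illustrate the kind of automated validation described here, the sketch below compares row counts, financial totals, and keys between a source extract and its target after a load. The table, key, and column names are made up for the example and are not Discount Tire’s actual checks.

```python
# Illustrative post-load validation: row counts, totals, and keys must reconcile.
def validate_load(source_rows, target_rows, key="order_id", amount="amount"):
    """Return a list of discrepancies between a source extract and its target."""
    problems = []
    if len(source_rows) != len(target_rows):
        problems.append(f"row count mismatch: {len(source_rows)} vs {len(target_rows)}")
    src_total = sum(r[amount] for r in source_rows)
    tgt_total = sum(r[amount] for r in target_rows)
    if abs(src_total - tgt_total) > 0.01:
        problems.append(f"amount mismatch: {src_total} vs {tgt_total}")
    missing = {r[key] for r in source_rows} - {r[key] for r in target_rows}
    if missing:
        problems.append(f"keys missing in target: {sorted(missing)}")
    return problems

issues = validate_load(
    source_rows=[{"order_id": 1, "amount": 100.0}, {"order_id": 2, "amount": 50.0}],
    target_rows=[{"order_id": 1, "amount": 100.0}],
)
# In a production pipeline this would run after every load and alert on any finding.
print(issues)
```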

MD Anderson Cancer Center is the No. 1 hospital for cancer care in the US according to U.S. News and World Report.  They are pursuing the lofty goal of erasing cancer from existence. Data Integration is playing an important role in this fight against cancer. In order to accomplish their goal, MD Anderson researchers rely on integration of vast amounts of genomic, clinical and pharmaceutical data to facilitate leading-edge cancer research.

MD Anderson experts pursue Agile Data Integration through close collaboration between IT and business stakeholders.  This enables them to meet the data requirements of the business faster and better. They shared that data insights, through metadata management, offer a significant value to the organization. Finally the experts at MD Anderson believe in ‘Map Once, Deploy Anywhere’ in order to accomplish Agile Data Integration.

So let’s recap, Data Integration is helping:

– An airline to continue serving its customers and running its business smoothly post-merger,

– A tire retail company to procure and provide tires to its customers and maintain market leadership,

– An insurance company to process claims accurately and in a timely manner, while minimizing costs, and

– A cancer research center to cure cancer.

Not too shabby, right? Data Integration is clearly essential to business success!

So OK, I know, I know… what happens in Vegas, stays in Vegas. Still, this was one love-fest I was compelled to share! Wish you were there. Hopefully you will next year!

To learn more about Agile Data Integration, check out this webinar: Great Data by Design II: How to Get Started with Next-Gen Data Integration



Data Integration in Action at Informatica World

Wouldn’t you like to have been a fly on the wall when American Airlines and US Airways experts got together to integrate their data systems into one cohesive post-merger system?

Now you can experience the next best thing by attending InformaticaWorld 2014 and hearing the American Airlines and US Airways data architects talk about the data challenges they faced. They will discuss the role of architecture in M&A, integrating legacy data, lessons learned, and best practices in Data Integration.

While you are at the show, you will have the opportunity to hear many industry experts discuss current trends in Agile end-to-end Data Integration.  

Agile Data Integration Development
To deliver the agility that your business requires, IT and the business must pursue a collaborative Data Integration process, with the appropriate analyst self-service Data Integration tools. At InformaticaWorld, you can learn about Agile Data Integration development from the experts at GE Aviation, who will discuss Agile Data Integration for Big Data Analytics. Experts from Roche will discuss how Agile Data Integration has led to a 5x reduction in development time, improved business self-service capabilities and increased data credibility.

Scalability
Another aspect of agility is your ability to scale your Data Warehouse to rapidly support more data, data sources, users and projects.  Come hear the experts from Liberty Mutual share challenges, pitfalls, best practices and recommendations for those considering large-scale Data Integration projects, including successful implementation of complex data migrations, data quality and data distribution processes.

Operational Confidence
The management of an enterprise-scale Data Warehouse involves the operation of a mature and complex mission-critical environment, which is commonly driven through an Integration Competency Center (ICC) initiative.  You now have the need to inspect and adapt your production system and expedite data validation and monitoring processes through automation, so that data issues can be quickly caught and corrected and resources can be freed up to focus on development. 

The experts from University of Pittsburgh Medical Center, along with Informatica Professional Services experts, will discuss best practices, lessons learned and the process of transitioning from ‘analytics as project’ to an enterprise initiative through the use of an Integration Competency Center. 

Hear from the Informatica Product Experts
You will have many opportunities to hear directly from the Informatica product experts about end-to-end Data Integration Agility delivered in the recent 9.6 release of PowerCenter.

See PowerCenter 9.6 in Action
Don’t miss the opportunity to see live demos of the cool new features of PowerCenter 9.6 release at the multitude of hands-on labs being offered at InformaticaWorld this year. 

For example you can learn how to empower business users through self-service Data Integration with PowerCenter Analyst tool; how to reduce testing time of Data Integration projects through automated validation tests; and how to scale your Data Integration with High Availability and Grid.

The sessions we described here are a sampling of the rich variety of sessions that will be offered on Data Integration at the show.  We hope that you will join us at InformaticaWorld this year in Las Vegas on May 13-15 and as you plan your visit, please check out the complete listing of sessions and labs that are focused on Data Integration.

Please feel free to leave a comment and let us know which InformaticaWorld session/s you are most looking forward to!  See you there!


Mars vs. Venus? How CMOs and CIOs can align and thrive.

Recently, we posted an initial discussion between Informatica’s CMO Marge Breya and CIO Eric Johnson, explaining how CIOs and CMOs can align and thrive. In the dialog below, Breya and Johnson provide additional detail on how their departments partner effectively.

Q: Pretty much everyone agrees that marketing has changed from an art to a science. How does that shift translate into how you work together day to day? 

Eric: The different ways that marketers now have to reach prospects and customers to grow their market share have exploded. It used to be a single marketing solution that was an afterthought, bolted on to the CRM solution. Now, there are just so many ways that marketers have to consider how they market to people. It’s driven by things going on in the market, like how people interact with companies and the lifestyle changes people have made around mobile devices.

Marge: Just look at the sheer number of systems and sources of data we care about. If you want to understand upsell and cross-sell for customers you have to look at what’s happening in the ERP system, what’s happened from a bookings standpoint, whether the customer is a parent or child of another customer, how you think about data by region, by industry, by job title. And there’s how you think about successful conversion of leads. Is it the way you’d predicted? What’s your most valuable content? Who’s your most valuable outlet or event? What’s your ROI? You can’t get that from any one single system. More and more, it’s all about conversion rates, about forecasting and theories about how the business is working from a model standpoint. And I haven’t even talked about social.

Q: With so many emerging technologies to look at, how do CMOs reconcile the need to quickly add new products, while CIOs reconcile the need for everything to work securely and well together?

Eric: There’s this yin and yang that’s starting to build between the CIO and the CMO as we both understand each other and the world we each live in, and therefore collaborate and partner more. But at the same time, there’s a tension between a CMO’s need to bring in solutions very quickly, and the CIO’s need to do some basic vetting of that technology. It’s a tension between speed vs. scale and liability to the company. It’s on a case-by-case basis, but as a CIO you don’t say “no.” You give options. You show CMOs the tradeoffs they’re going to make.

There are also risks that are easy to take and worth taking. They won’t cause any problems with the enterprise on a security or integration perspective, so let’s just try it. It may not work — and that’s OK.

Marge: There’s temptation across departments for the shiny new object. You’ll hear about a new technology, and you think this might solve our problems, or move the business faster. The tension even within the marketing department is: do we understand how and if it will impact the business process? And do we understand how that business process will have to change if the shiny new object comes on board?

Q: CMOs are getting data from potentially hundreds of sources, including partners, third parties, LinkedIn and Google. How do the two of you work together to determine a trustworthy data source? Do you talk about it?

Eric: The issue of trusting your data and making sure you’re doing your due diligence on it is incredibly important. Without doing that, you are running the risk of finding yourself in a very tricky situation from a legal perspective, and potentially a liability perspective. To do that, we have a lot of technology that helps us manage a lot of data sources coming into a single source of truth.

On top of that, we are working with marketers who are much more savvy about technology and data. And that makes IT’s job easier — and our partnership better — because we are now talking the same language. Sometimes it’s even hard to tell where the line between the two groups actually sits. Some of the marketing people are as technical as the IT people, and some of the IT people are becoming pretty well-versed in marketing.

Q: How do you decide what technologies to buy?

Marge: A couple of weeks ago we went on a shopping trip, and spent the day at a venture capital firm looking at new companies. It was fun. He and I were brainstorming and questioning each other to see if each technology would be useful, and could we imagine how everything would go together. We first explored possibilities, and then we considered whether it was practical.

Eric: Ultimately, Marge owns the budget. But before the budgeting cycle we sit down to discuss what things she wants to work on, and whether she wants to swap technology out. I make sure Marge is getting what she needs from the technologies. There’s a reliance on the IT team to do some due diligence on the technical aspects of the technology: Does it work? Do we want to do business with these people? Is it going to scale? So each party has a role to play in evaluating whether it’s a good solution for the company. As a CIO you don’t say “no” unless there’s something really bad, and you hope you have a relationship with the CMO where you can say, “Here are the tradeoffs you’re making. No one has an agenda here, but here are the risks you have to be OK taking.” It’s not a “no.” It’s options.


Should Legal (Or Anyone Else Outside of IT) Be in Charge of Big Data?

A few years back, there was a movement in some businesses to establish “data stewards” – individuals who would sit at the heart of the enterprise and make it their job to assure that data being consumed by the organization is of the highest possible quality, is secure, is contextually relevant, and is capable of interoperating across any applications that need to consume it. While the data steward concept came along when everything was relational and structured, these individuals are now earning their pay when it comes to managing the big data boom.

The rise of big data is creating more than simple headaches for data stewards; it is creating turf wars across enterprises. As pointed out in a recent article in The Wall Street Journal, there isn’t yet a lot of clarity as to who owns and cares for such data. Is it IT? Is it the lines of business? Is it legal? There are arguments to be made for all jurisdictions.

In organizations these days, for example, marketing executives are generating, storing and analyzing large volumes of their own data within content management systems and social media analysis solutions. Many marketing departments even have their own IT budgets. Along with marketing, of course, everyone else within enterprises is seeking to pursue data analytics to better run their operations as well as foresee trends.

Typically, data has been under the domain of the CIO,  the person who oversaw the collection, management and storage of information. In the Wall Street Journal article, however, it’s suggested that legal departments may be the best caretakers of big data, since big data poses a “liability exposure,” and legal departments are “better positioned to understand how to use big data without violating vendor contracts and joint-venture agreements, as well as keeping trade secrets.”

However, legal being legal, it’s likely that insightful data may end up getting locked away, never to see the light of day. Others may argue that the IT department needs to retain control, but there again, IT isn’t trained to recognize information that may set the business on a new course.

Focusing on big data ownership isn’t just an academic exercise. The future of the business may depend on the ability to get on top of big data. Gartner, for one, predicts that within the next three years, at least a third of Fortune 100 organizations will experience an information crisis, “due to their inability to effectively value, govern and trust their enterprise information.”

This ability to “value, govern and trust” goes way beyond the traditional maintenance of data assets that IT has specialized in over the past few decades. As Gartner’s Andrew White put it: “Business leaders need to manage information, rather than just maintain it. When we say ‘manage,’ we mean ‘manage information for business advantage,’ as opposed to just maintaining data and its physical or virtual storage needs. In a digital economy, information is becoming the competitive asset to drive business advantage, and it is the critical connection that links the value chain of organizations.”

For starters, then, it is important that the business have full say over what data needs to be brought in, what data is important for further analysis, and what should be done with data once it gains in maturity. IT, however, needs to take a leadership role in assuring the data meets the organization’s quality standards, and that it is well-vetted so that business decision-makers can be confident in the data they are using.

The bottom line is that big data is a team effort, involving the whole enterprise. IT has a role to play, as does legal, as do the lines of business.
