Tag Archives: Data Governance

Data Governance: Another Hidden Compliance Opportunity?

I previously wrote about how regulatory compliance could be used as an opportunity to build the foundations for a flexible data environment.  This opened up a few conversations with some colleagues, which makes me think that I actually missed a key point in my previous blog:  Why is it that compliance is so hard?  After all, regulators are simply asking for data about everyday business activities.  Surely you just reach into your body of knowledge, draw out the salient facts and provide them in a specified format?

But it’s not as easy as a couple of queries.  The reality is that the body of knowledge in question is seldom in a shape recognizable as a ‘body’.  In most corporations, the data regulators are asking for is distributed throughout the organization.  Perhaps a ‘Scattering of Knowledge’ is a more appropriate metaphor.
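To make the scattering concrete, here is a toy sketch of how a single customer might look across three systems; the system names, fields and normalization rule are invented for illustration, not drawn from any real deployment:

```python
# Hypothetical example: the same customer recorded differently in three
# systems -- exactly the "scattering" a regulatory report must reconcile.
crm     = {"cust_id": "C-104", "name": "Acme Corp.",       "country": "UK"}
billing = {"account": "104",   "name": "ACME CORP",        "country": "GB"}
support = {"client":  "0104",  "name": "Acme Corporation", "region":  "EMEA"}

def normalize(name):
    """Crude name normalization so the three records can be matched."""
    return name.upper().rstrip(".").replace(" CORPORATION", " CORP")

# Even matching a single entity across systems needs explicit rules:
assert normalize(crm["name"]) == normalize(billing["name"]) == normalize(support["name"])
```

Multiply this by thousands of entities, inconsistent identifiers and dozens of systems, and the "couple of queries" assumption falls apart.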

It is time to accept that data distribution is here to stay.  The idea of a single ERP has long gone.  Hype around Big Data is dying down and is being replaced by a focus on all data as a valuable asset.  IT architectures are becoming more complex as additional data stores and data-fueled applications are introduced.  In fact, the rise of Data Governance’s profile within large organizations is testament to the acceptance of data distribution, and the need to manage it.  Forrester has just released its first Forrester Wave™ on data governance, stating that it is time to address governance because “Data-driven opportunities for competitive advantage abound. As a consequence, the importance of data governance — and the need for tooling to facilitate data governance — is rising.”  (Informatica is recognized as a Leader.)

However, Data Governance programs are not yet as widespread as they should be.  Unfortunately, it is hard to directly link strong Data Governance to business value, which makes it difficult to get a senior executive to sponsor the investment and cultural change required for strong governance.  This brings me back to the opportunity within Regulatory Compliance.  My thinking goes like this:

  1. Regulatory compliance is often about gathering and submitting high quality data
  2. This is hard as the data is distributed, and the quality may be questionable
  3. Tools are required to gather, cleanse, manage and submit data for compliance
  4. There is a high overlap of tools & processes for Data Governance and Regulatory Compliance

So – why not use Regulatory Compliance as an opportunity to pilot Data Governance tools, process and practice?

Far too often, compliance is a one-off effort with a specific tool.  This tool collects data from disparate sources, with unknown data quality.  The underlying data processes are not addressed.  Strong governance will have a positive effect on compliance, continually increasing data access and quality, and hence reducing the cost and effort of compliance.  Since the cost of non-compliance is often measured in millions, getting executive sponsorship for a compliance-based pilot may be easier than for a broader Data Governance project.  Once implemented, lessons learned and benefits realized can be leveraged to expand Data Governance into other areas.

Previously, I likened Regulatory Compliance to a Buy One, Get One Free opportunity: compliance plus a free performance boost.  If you use your compliance budget to pilot Data Governance, the boost will be larger than simply implementing Data Quality and MDM tools.  The business case shouldn’t be too hard to build.  Consider that EY’s research shows that companies that successfully use data are already outperforming their peers by as much as 20%.[i]

Data Governance Benefit = (Cost of non-compliance + 20% performance boost) – compliance budget

Yes, the equation can be considered simplistic.  But it is compelling.
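For readers who like to plug in numbers, the equation can be sketched as a quick calculation; all figures below are illustrative assumptions, not benchmarks from the EY study:

```python
def governance_benefit(cost_of_noncompliance, baseline_revenue,
                       performance_boost, compliance_budget):
    """Benefit = (avoided non-compliance cost + performance uplift) - budget."""
    return cost_of_noncompliance + baseline_revenue * performance_boost - compliance_budget

# Hypothetical figures: $5M in avoided fines, $100M revenue with a 20%
# performance boost, against an $8M compliance budget.
benefit = governance_benefit(5_000_000, 100_000_000, 0.20, 8_000_000)
print(f"Estimated benefit: ${benefit:,.0f}")  # Estimated benefit: $17,000,000
```

Simplistic, as noted above, but it makes the sponsorship conversation concrete.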


[i] 17 Big data and enterprise mobility, EY, 2013.

Takeaways from the Gartner Security and Risk Management Summit (2014)

Last week I had the opportunity to attend the Gartner Security and Risk Management Summit. At this event, Gartner analysts and security industry experts meet to discuss the latest trends, advances, best practices and research in the space. At the event, I had the privilege of connecting with customers, peers and partners. I was also excited to learn about changes that are shaping the data security landscape.

Here are some of the things I learned at the event:

  • Security continues to be a top CIO priority in 2014. Security is well-aligned with other trends such as big data, IoT, mobile, cloud, and collaboration. According to Gartner, the top CIO priority area is BI/analytics. Given our growing appetite for all things data and our increasing ability to mine data to increase top-line growth, this top billing makes perfect sense. The challenge is to protect the data assets that drive value for the company and ensure appropriate privacy controls.
  • Mobile and data security are the top focus for 2014 spending in North America according to Gartner’s pre-conference survey. Cloud rounds out the list when considering worldwide spending results.
  • Rise of the DRO (Digital Risk Officer). Fortunately, those same market trends are leading to an evolution of the CISO role to a Digital Security Officer and, longer term, a Digital Risk Officer. The DRO role will include determination of the risks and security of digital connectivity. Digital/Information Security risk is increasingly being reported as a business impact to the board.
  • Information management and information security are blending. Gartner assumes that 40% of global enterprises will have aligned governance of the two programs by 2017. This is not surprising given the overlap of common objectives such as inventories, classification, usage policies, and accountability/protection.
  • Security methodology is moving from a reactive approach to compliance-driven and proactive (risk-based) methodologies. There is simply too much data and too many events for analysts to monitor. Organizations need to understand their assets and their criticality. Big data analytics and context-aware security are then needed to reduce the noise and false-positive rates to a manageable level. According to Gartner analyst Avivah Litan, “By 2018, of all breaches that are detected within an enterprise, 70% will be found because they used context-aware security, up from 10% today.”

I want to close by sharing the identified Top Digital Security Trends for 2014:

  • Software-defined security
  • Big data security analytics
  • Intelligent/Context-aware security controls
  • Application isolation
  • Endpoint threat detection and response
  • Website protection
  • Adaptive access
  • Securing the Internet of Things

The Power and Security of Exponential Data

I recently heard a couple different analogies for data. The first is that data is the “new oil.” Data is a valuable resource that powers global business. Consequently, it is targeted for theft by hackers. The thinking is this: People are not after your servers, they’re after your data.

The other comparison is that data is like solar power. Like solar power, data is abundant. In addition, it’s getting cheaper and more efficient to harness. The juxtaposition of these images captures the current sentiment around data’s potential to improve our lives in many ways. For this to happen, however, corporations and data custodians must effectively balance the power of data with security and privacy concerns.

Many people have a preconception of security as an obstacle to productivity. In fact, good security practitioners understand that the purpose of security is to support the goals of the company by allowing the business to innovate and operate more quickly and effectively. Think back to the early days of online transactions: many people were not comfortable banking online or making web purchases for fear of fraud and theft. Similar fears slowed early adoption of mobile phone banking and purchasing applications. But security ecosystems evolved, concerns were addressed, and Gartner estimates that worldwide mobile payment transaction values surpassed $235 billion in 2013. An astute security executive once pointed out why cars have brakes: not to slow us down, but to allow us to drive faster, safely.

The pace of digital change and the current proliferation of data are not simple linear functions – data is growing exponentially – and it’s not going to slow down. I believe this is generally a good thing. Our ability to harness data is how we will better understand our world. It’s how we will address challenges with critical resources such as energy and water. And it’s how we will innovate in research areas such as medicine and healthcare. And so, as a relatively new Informatica employee coming from a security background, I’m now at a crossroads of sorts. While Informatica’s goal of “Putting potential to work” resonates with my views and helps customers deliver on the promise of this data growth, I know we need to have proper controls in place. I’m proud to be part of a team building a new intelligent, context-aware approach to data security (Secure@Source™).

We recently announced Secure@Source™ during InformaticaWorld 2014. One thing that impressed me was how quickly attendees (many of whom have little security background) understood how they could leverage data context to improve security controls, privacy, and data governance for their organizations. You can find a great introductory summary of Secure@Source™ here.

I will be sharing more on Secure@Source™ and data security in general, and would love to get your feedback. If you are an Informatica customer and would like to help shape the product direction, we are recruiting a select group of charter customers to drive and provide feedback for the first release. Customers who are interested in being a charter customer should register and send email to SecureCustomers@informatica.com.


To Engage Business, Focus on Information Management rather than Data Management

IT professionals have been pushing an Enterprise Data Management agenda for decades rather than Information Management and are frustrated with the lack of business engagement. So what exactly is the difference between Data Management and Information Management and why does it matter? (more…)


The Three Ingredients of Enterprise Information Management

There is no shortage of buzzwords that speak to the upside and downside of data: Big Data, Data as an Asset, the Internet of Things, Cloud Computing, One Version of the Truth, Data Breach, Black Hat Hacking, and so on. Clearly we are in the Information Age described by Alvin Toffler in The Third Wave. Yet most organizations are neither effectively dealing with the risks of a data-driven economy nor getting the full benefits of all that data. They are stuck in a fire-fighting mode where each information management opportunity or problem is a one-time event, manhandled with heroic efforts. There is no repeatability. The organization doesn’t learn from prior lessons, and each business unit re-invents similar solutions. IT projects are typically late, over budget, and under-delivered. There is a way to break out of this rut. (more…)


A Data-Driven Healthcare Culture is Foundational to Delivering Personalized Medicine in Healthcare

According to a recent article in the LA Times, healthcare costs in the United States far exceed costs in other countries. For example, heart bypass surgery costs an average of $75,345 in the U.S., compared to $15,742 in the Netherlands and $16,492 in Argentina. In the U.S., healthcare accounts for 18% of GDP, and that share is increasing.

Michelle Blackmer is a healthcare industry expert at Informatica. In this interview, she explains why business as usual isn’t good enough anymore. Healthcare organizations are rethinking how they do business in an effort to improve outcomes, reduce costs, and comply with regulatory pressures such as the Affordable Care Act (ACA). Michelle believes a data-driven healthcare culture is foundational to personalized medicine and discusses the importance of clean, safe and connected data in executing a successful transformation.

Q. How is the healthcare industry responding to the rising costs of healthcare?
In response to the rising costs of healthcare, regulatory pressures (i.e., the Affordable Care Act (ACA)) and the need to deliver better patient outcomes at lower cost, the U.S. healthcare industry is transforming from a volume-based to a value-based model. In this new model, healthcare organizations need to invest in delivering personalized medicine.

To appreciate the potential of personalized medicine, think about your own healthcare experience. It’s typically reactive. You get sick, you go to the doctor, the doctor issues a prescription and you wait a couple of days to see if that drug works. If it doesn’t, you call the doctor and she tries another drug. This process is tedious, painful and costly.

Now imagine if you had a chronic disease like depression or cancer. On average, any given prescription drug only works for half of those who take it. Among cancer patients, the rate of ineffectiveness jumps to 75 percent. Anti-depressants are effective in only 62 percent of those who take them.

Organizations like MD Anderson and UPMC aim to put an end to cancer. They are combining scientific research with access to clean, safe and connected data (data of all types, including genomic data). The insights revealed will empower personalized chemotherapies. Personalized medicine offers customized treatments based on patient history and best practices, and it will transform healthcare delivery. Click on the links to watch videos about their transformational work.

Q. What role does data play in enabling personalized medicine?
Data is foundational to value-based care and personalized medicine. Not just any data will do. It needs to be clean, safe and connected data. It needs to be delivered rapidly across hallways and across networks.

As an industry, healthcare is at a stage where meaningful electronic data is being generated. Now you need to ensure that the data is accessible and trustworthy so that it can be rapidly analyzed. As data is aggregated across the ecosystem and married with financial and genomic data, data quality issues become more obvious. It’s vital that you can identify data issues so that people can spend their time analyzing the data to gain insights, instead of wading through and manually resolving data quality problems.
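As a small illustration of what identifying data issues might look like in code, here is a hedged sketch of automated quality checks on aggregated records; the field names and rules are hypothetical and not tied to any particular product:

```python
# Hypothetical data-quality checks for aggregated patient records.
# Field names and rules are illustrative, not any specific product's API.
import re

def quality_issues(record):
    """Return a list of data-quality problems found in one record."""
    issues = []
    if not record.get("patient_id"):
        issues.append("missing patient_id")
    npi = record.get("provider_npi", "")
    if not re.fullmatch(r"\d{10}", npi):        # NPI numbers are 10 digits
        issues.append("invalid provider NPI")
    if record.get("source") not in {"EHR", "claims", "lab"}:
        issues.append("unknown source system")  # lineage can't be traced
    return issues

records = [
    {"patient_id": "P001", "provider_npi": "1234567890", "source": "EHR"},
    {"patient_id": "", "provider_npi": "12345", "source": "fax"},
]
for r in records:
    print(r["patient_id"] or "<blank>", quality_issues(r))
```

Running checks like these up front is what frees analysts to analyze, rather than manually chase bad rows.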

The ability to trust data will differentiate leaders from the followers. Leaders will advance personalized medicine because they rely on clean, safe and connected data to:

  1. Practice analytics as a core competency
  2. Define evidence, deliver best-practice care and personalize medicine
  3. Engage patients and collaborate to foster strong, actionable relationships

Take a look at this healthcare eBook for more on this topic: Potential Unlocked: Transforming Healthcare by Putting Information to Work.

Q. What is holding healthcare organizations back from managing their healthcare data like other mission-critical assets?
When you say other mission-critical assets, I think of facilities, equipment, etc. Each of these assets has people and money assigned to manage and maintain it. The healthcare organizations I talk to who are highly invested in personalized medicine recognize that data is mission-critical. They are investing in the people, processes and technology needed to ensure data is clean, safe and connected. The technology includes data integration, data quality and master data management (MDM).

What’s holding other healthcare organizations back is that while they realize they need data governance, they wrongly believe they need to hire big teams of “data stewards” to be successful. In reality, you don’t need to hire a big team. Use the people you already have doing data governance. You may not have made this a formal part of their job description and they might not have data governance technologies yet, but they do have the skillset and they are already doing the work of a data steward.

So while a technology investment is required, and you need people who can use the technology, start by formalizing the data stewardship work people are doing already as part of their current jobs. This way, the people who understand the data take an active role in managing it, and they even get excited about it because their work is being recognized. IT takes on the role of enabling these people instead of having responsibility for all things data.

Q. Can you share examples of how immature information governance is a serious impediment to healthcare payers and providers?
Sure. Without information governance, data is not harmonized across sources, so it is hard to make sense of it. This isn’t a problem when you are one business unit or one department, but when you want a comprehensive view, or a view that incorporates external sources of information, this approach falls apart.

For example, let’s say the cardiology department in a healthcare organization implements a dashboard. The dashboard looks impressive. Then a group of physicians sees the dashboard, points out errors and asks where the information (i.e., diagnosis or attending physician) came from. If you can’t answer these questions or trace the data back to its sources, or if you have data inconsistencies, the dashboard loses credibility. This is how analytics fail to gain adoption and fail to foster innovation.

Q. Can you share examples of what data-driven healthcare organizations are doing differently?
Certainly. While many are just getting started on their journey to becoming data-driven, I’m seeing some inspiring examples, including:

  • Implementing data governance for healthcare analytics. The program and data is owned by the business and enabled by IT and supported by technology such as data integration, data quality and MDM.
  • Connecting information from across the entire healthcare ecosystem including 3rd party sources like payers, state agencies, and reference data like credit information from Equifax, firmographics from Dun & Bradstreet or NPI numbers from the national provider registry.
  • Establishing consistent data definitions and parameters
  • Thinking about the internet of things (IoT) and how to incorporate device data into analysis
  • Engaging patients through non-traditional channels including loyalty programs and social media; tracking this information in a customer relationship management (CRM) system
  • Fostering collaboration by understanding the relationships between patients, providers and the rest of the ecosystem
  • Analyzing data to understand what is working and what is not, so that they can drive out unwanted variations in care

Q. What advice can you give healthcare provider and payer employees who want access to high quality healthcare data?
As with other organizational assets that deliver value—like buildings and equipment—data requires a foundational investment in people and systems to maximize return. In other words, institutions and individuals must start managing their mission-critical data with the same rigor they manage other mission-critical enterprise assets.

Q. Anything else you want to add?
Yes, I wanted to thank our 14 visionary customer executives at data-driven healthcare organizations such as MD Anderson, UPMC, Quest Diagnostics, Sutter Health, St. Joseph Health, Dallas Children’s Medical Center and Navinet for taking time out of their busy schedules to share their journeys toward becoming data-driven at Informatica World 2014.  In our next post, I’ll share some highlights about how they are using data, how they are ensuring it is clean, safe and connected and a few data management best practices. InformaticaWorld attendees will be able to download presentations starting today! If you missed InformaticaWorld 2014, stay tuned for our upcoming webinars featuring many of these examples.


How Can CEOs Protect Customer Data And Their Own Jobs?

Recently, a number of high-profile data breaches have drawn attention to the impact that compromised data can have on a business. When customer data is breached, the consequences can include:

  • A loss of customer trust
  • Revenue shortfalls
  • A plummeting stock price
  • C-level executives losing their jobs

As a result, data security and privacy have become key topics of discussion, not just in IT meetings, but in the media and the boardroom.

Preventing access to sensitive data has become more complex than ever before. There are new potential entry points that IT never previously considered. These go beyond typical BYOD user devices like smartphones and tablets. Today’s entry points can be much smaller: things like HVAC controllers, office Polycom units and temperature-control systems.

So what can organizations do to combat this increasing complexity? Traditional data security practices focus on securing both the perimeter and the endpoints. However, these practices are clearly no longer working and no longer manageable. Not only are the number and types of devices expanding, but the perimeter itself is disappearing. As companies increasingly outsource, offshore and move operations to the cloud, it is no longer possible to fence the perimeter and keep intruders out. Because third parties often require some form of access, even trusted user credentials may fall into the hands of malicious intruders.

Data security requires a new approach. It must use policies to follow the data and to protect it, regardless of where it is located and where it moves. Informatica is responding to this need. We are leveraging our market leadership and domain expertise in data management and security. We are defining a new data security offering and category.  This week, we unveiled our entry into the Data Security market at our Informatica World conference. Our new security offering, Secure@Source™ will allow enterprises to discover, detect and protect sensitive data.

The first step toward protecting sensitive data is to locate and identify it. Secure@Source™ therefore first allows you to discover where all the sensitive data in the enterprise is located and to classify it.  As part of the discovery, Secure@Source™ also analyzes where sensitive data is being proliferated, who has access to it, who is actually accessing it, and whether the data is protected or unprotected when accessed.  Secure@Source™ leverages Informatica’s PowerCenter repository and lineage technology to perform a quick first-pass discovery, with more in-depth analysis and profiling over time.  The solution allows you to determine your enterprise’s privacy risk index and to slice and dice the analysis by region, department, organizational hierarchy and data classification.
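To illustrate the general idea of a privacy risk index that can be sliced by department, here is a deliberately simplified scoring sketch; this is an invented model for explanation only, not Secure@Source™’s actual algorithm, and all weights and data are hypothetical:

```python
# Toy privacy-risk scoring: sensitivity of the data, whether it is
# protected, and how widely it has proliferated all drive the score.
from collections import defaultdict

SENSITIVITY = {"SSN": 10, "credit_card": 8, "email": 3}  # invented weights

def risk_by_department(data_stores):
    """Aggregate a simple risk score per department."""
    scores = defaultdict(float)
    for store in data_stores:
        score = SENSITIVITY.get(store["classification"], 1)
        if not store["protected"]:
            score *= 2              # unprotected data doubles the risk
        score *= store["copies"]    # proliferation multiplies exposure
        scores[store["department"]] += score
    return dict(scores)

stores = [
    {"department": "Finance", "classification": "SSN", "protected": False, "copies": 3},
    {"department": "Marketing", "classification": "email", "protected": True, "copies": 5},
]
print(risk_by_department(stores))  # {'Finance': 60.0, 'Marketing': 15.0}
```

The same aggregation could be keyed by region or data classification instead of department, which is all "slice and dice" means here.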

The longer-term vision for Secure@Source™ is to detect suspicious usage patterns and orchestrate the appropriate data protection method: alerting, blocking, archiving and purging, dynamic masking, persistent masking, encryption and/or tokenization. The right method will depend on whether the data store is a production or non-production system, and on whether you want to de-identify sensitive data for all users or only for some.  All of this can be deployed based on policies. Secure@Source™ is intended to be an open framework for aggregating data security analytics, and it will integrate with key partners to provide comprehensive visibility into, and assessment of, an enterprise’s data privacy risk.
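The policy-driven choice of protection method described above can be sketched as a simple decision function; the rules here are invented to mirror the prose, not taken from the product:

```python
# Hypothetical policy: pick a protection method from the environment
# (production vs. non-production) and the de-identification scope.
def protection_method(is_production, deidentify_all_users):
    """Choose a data-protection action based on policy inputs."""
    if not is_production:
        return "persistent masking"   # non-prod copies: mask irreversibly
    if deidentify_all_users:
        return "encryption"           # hide the data from everyone at rest
    return "dynamic masking"          # mask per user, at query time

print(protection_method(is_production=False, deidentify_all_users=True))
print(protection_method(is_production=True, deidentify_all_users=False))
```

A real policy engine would take many more inputs (data classification, user role, location), but the dispatch pattern is the same.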

Secure@Source™ is targeted for beta at the end of 2014 and general availability in early 2015.  Informatica is recruiting a select group of charter customers to drive and provide feedback for the first release. Customers who are interested in being a charter customer should register and send email to SecureCustomers@informatica.com.


MDM Day Advice: Connect MDM to a Tangible Business Outcome or You Will Fail

“Start your master data management (MDM) journey knowing how it will deliver a tangible business outcome. Will it help your business generate revenue or cut costs? Focus on the business value you plan to deliver with MDM and revisit it often,” advised Michael Delgado, Information Management Director at Citrix, during his presentation at MDM Day, the InformaticaWorld 2014 pre-conference program. MDM Day focused on driving value from business-critical information and attracted 500 people.

A record 500 people attended MDM Day in Las Vegas

In Ravi Shankar’s recent MDM Day preview blog, Part 2: All MDM, All Day at Pre-Conference Day at InformaticaWorld, he highlights the amazing line up of master data management (MDM) and product information management (PIM) customers speakers, Informatica experts as well as our talented partner sponsors.

Here are my MDM Day fun facts and key takeaways:

  • Did you know that every 2 seconds an aircraft with GE engine technology is taking off somewhere in the world?

    Ginny Walker, Chief Enterprise Architect at GE Aviation

    GE Aviation’s Chief Enterprise Architect, Ginny Walker, presented “Operationalizing Critical Business Processes: GE Aviation’s MDM Story.” GE Aviation is a $22 billion company and a leading provider of jet engines, systems and services.  Ginny shared the company’s multi-year journey to improve installed-base asset data management. She explained how the combination of data, analytics and connectivity results in productivity improvements, such as cutting up to 2% of the annual fuel bill and reducing delays. The keys to GE Aviation’s analytical MDM success were: 1) tying MDM to business metrics, 2) starting with a narrow scope, and 3) investing in data stewards. Ginny believes that MDM is an enabler for the Industrial Internet and Big Data because it empowers companies to get insights from multiple sources of data.

  •  Did you know that EMC has made a $17 billion investment in acquisitions and is integrating more than 70 technology companies?
    Barbara Latulippe, Senior Director, Enterprise Information Management at EMC

    EMC’s Barbara Latulippe, aka “The Data Diva,” is the Senior Director of Enterprise Information Management (EIM). EMC is a $21.7 billion company that has grown through acquisition and has 60,000 employees worldwide. In her presentation, “Formula for Success: EMC MDM Best Practices,” Barbara warns that if you don’t have a data governance program in place, you’re going to have a hard time getting an MDM initiative off the ground. She stressed the importance of building a data governance council and involving the business as early as possible to agree on key definitions such as “customer.” Barbara and her team focused on the financial impact of higher quality data to build a business case for operational MDM. She asked her business counterparts, “Imagine if you could onboard a customer in 3 minutes instead of 15 minutes?”

  • Did you know that Citrix is enabling the mobile workforce by uniting apps, data and services on any device over any network and cloud?

    Michael Delgado, Information Management Director at Citrix

    Citrix’s Information Management Director, Michael Delgado, presented “Citrix MDM Case Study: From Partner 360 to Customer 360.” Citrix is a $2.9 billion Cloud software company that embarked on a multi-domain MDM and data governance journey for channel partner, hierarchy and customer data. Because 90% of the company’s product bookings are fulfilled by channel partners, Citrix started their MDM journey to better understand their total channel partner relationship, to make it easier to do business with Citrix and boost revenue. Once they were successful with partner data, they turned to customer data. They wanted to boost customer experience by understanding the total customer relationship across product lines and regions. Armed with this information, Citrix employees can engage customers in one product renewal process for all products. MDM also helps Citrix’s sales team with white space analysis to identify opportunities to sell more user licenses in existing customer accounts.

  •  Did you know Quintiles helped develop or commercialize all of the top 5 best-selling drugs on the market?

    John Poonnen, Director Infosario Data Factory at Quintiles

    Quintiles’ Director of the Infosario Data Factory, John Poonnen, presented “Using Multi-domain MDM to Gain Information Insights: How Quintiles Efficiently Manages Complex Clinical Trials.” Quintiles is the world’s largest provider of biopharmaceutical development and commercial outsourcing services, with more than 27,000 employees. John explained how the company leverages a tailored, multi-domain MDM platform to gain a holistic view of business-critical entities such as investigators, research facilities, clinical studies, study sites and subjects in order to cut costs, improve quality and productivity, and meet regulatory and patient needs. “Although information needs to flow throughout the process – it tends to get stuck in different silos and must be manually manipulated to get meaningful insights,” said John. He believes master data is foundational: combining it with other data, capabilities and expertise makes it transformational.

While I couldn’t attend the PIM customer presentations below, I heard they were excellent. I look forward to watching the videos:

  • Crestline/ Geiger: Dale Denham, CIO presented, “How Product Information in eCommerce improved Geiger’s Ability to Promote and Sell Promotional Products.”
  • Murdoch’s Ranch and Home Supply: Director of Marketing, Kitch Walker presented, “Driving Omnichannel Customer Engagement – PIM Best Practices.”

I also had the opportunity to speak with some of our knowledgeable and experienced MDM Day partner sponsors. Go to Twitter and search for #MDM and #DataQuality to see their advice on what it takes to successfully kick off and implement an MDM program.

There are more thought-provoking MDM and PIM customer presentations taking place this week at InformaticaWorld 2014. To join or follow the conversation, use #INFA14 #MDM or #INFA14 #PIM.

Posted in Business Impact / Benefits, Business/IT Collaboration, CIO, CMO, Customer Acquisition & Retention, Customers, Data Governance, Data Integration, Data Quality, Enterprise Data Management, Informatica World 2014, Master Data Management, Partners, PIM, Product Information Management | 1 Comment

5 Data Challenges That Frustrate Chief Risk Officers

It has never been a more challenging time to be a Chief Risk Officer at a financial services firm. New regulations (CCAR, Basel III, BCBS 239, Solvency II, EMIR) have increased the complexity of the role. Today, risk management organizations must use precise data to measure risk, allocate capital and cover exposure. In addition, they must equip compliance groups to explain these decisions to industry regulators.

The challenges facing a Chief Risk Officer are even more daunting when the data at the heart of each decision is incomplete, inaccessible or inaccurate.  Unless the data is complete, trustworthy, timely, authoritative and auditable, success will be hard to come by.

When data issues arise, most CROs lay blame at the feet of their risk applications. Next, they blame IT and the people and processes responsible for providing data to their risk modeling and analysis applications. In most situations, however, the fault lies with neither the applications nor the IT groups. The reality is that most business users are unfamiliar with the root causes of data issues, and with the data and information management solutions available to resolve and prevent them. The root causes of existing data issues stem from the processes and tools IT development teams use to deliver data to risk management groups. Regrettably, ongoing budget constraints, a lack of capable tools, and fear of becoming obsolete have resulted in CIO leaders “throwing bodies at the problem.” This approach consumes IT worker cycles as staff manually access, transform, cleanse, validate, and deliver data into risk and compliance applications.

So what are the data issues impacting risk management organizations today? What should your organization consider and invest in if these situations exist? Here is a list of issues I have heard in my own conversations with risk and technology executives in the global financial markets, along with the solutions that can help Chief Risk Officers, VPs of Risk Management, and risk analysts address them.

Challenge #1: I don’t have the right data in my risk applications, models, and analytic applications. It is either incomplete, out of date, or plain incorrect.

Root Causes:

  • The data required to manage and monitor risk across the business originates from hundreds of systems and applications, and data volumes continue to grow every day with new systems and new types of data in today’s digital landscape
  • Lacking proper tools to integrate the required data, IT developers manually extract data from internal and external systems, which can number in the hundreds and deliver data in various formats
  • Raw data from source systems is transformed and validated using custom-coded methods written in COBOL, PL/SQL, Java, Perl, etc.

Solutions that can help:

Consider investing in industry-proven data integration and data quality software designed to reduce manual extraction, transformation, and validation, and to streamline the process of identifying and fixing upstream data quality errors. Data integration tools not only reduce the risk of errors; they help IT professionals streamline these complex steps and reuse transformation and data quality rules across the risk data management process. The result is repeatability, consistency, and efficiency, requiring fewer resources to support the current and future data needs of risk and other parts of the business.
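To make the reuse idea concrete, here is a minimal, hypothetical sketch in Python. The field names and rules are invented for illustration; in practice such rules would be defined once in a data quality tool and applied across every feed rather than hand-coded per system:

```python
# Illustrative sketch of reusable data quality rules (all field names
# and rules here are hypothetical, not taken from any specific product).

def rule_not_null(field):
    """Rule factory: the field must be present and non-null."""
    return lambda rec: rec.get(field) is not None

def rule_positive(field):
    """Rule factory: the field must be a positive number."""
    return lambda rec: isinstance(rec.get(field), (int, float)) and rec[field] > 0

# The same named rules are reused across every feed, instead of
# re-coding validation logic for each source system.
RULES = {
    "trade_id present": rule_not_null("trade_id"),
    "notional positive": rule_positive("notional"),
}

def validate(records, rules=RULES):
    """Split records into clean rows and rows with failed rules, so bad
    data is caught before it reaches downstream risk applications."""
    clean, errors = [], []
    for rec in records:
        failed = [name for name, check in rules.items() if not check(rec)]
        if failed:
            errors.append({"record": rec, "failed_rules": failed})
        else:
            clean.append(rec)
    return clean, errors
```

Because each rule is a named, standalone object, the same rule set can validate a trading feed today and a new settlement feed tomorrow with no new code.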

Challenge #2:  We do not have a comprehensive view of risk to satisfy systemic risk requirements

Root Causes:

  • Too many silos or standalone data marts or data warehouses containing segmented views of risk information
  • Creating a single enterprise risk data warehouse takes too long to build, is too complex and expensive, and involves too much data to process in one system

Solutions that can help:

  • Data virtualization solutions can tie existing data together to deliver a consolidated view of risk for business users without having to bring that data into an existing data warehouse.
  • Long term, look at consolidating and simplifying existing data warehouses into an enterprise data warehouse, leveraging high-performing data processing technologies such as Hadoop.
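As a toy illustration of the virtualization idea, the sketch below exposes several in-memory “silos” as one consolidated view without copying rows into a warehouse. The silo and field names are hypothetical; a real data virtualization product federates live queries across databases rather than iterating over Python dictionaries:

```python
# Toy stand-in for data virtualization: present rows scattered across
# silos as one consolidated view, without copying them into a warehouse.
# Silo names, field names, and figures are all hypothetical.

def consolidated_risk_view(silos):
    """Yield every row from every silo, tagged with its source, as if
    it were a single queryable table."""
    for silo_name, rows in silos.items():
        for row in rows:
            yield {**row, "source": silo_name}

# Example "silos" that in reality would be separate data marts.
SILOS = {
    "credit_risk_mart": [{"counterparty": "Acme", "exposure": 1_200_000}],
    "market_risk_mart": [{"counterparty": "Acme", "exposure": 350_000}],
}
```

The business gets one view of total exposure across silos today, while the longer-term warehouse consolidation proceeds in parallel.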

Challenge #3:  I don’t trust the data being delivered into my risk applications and modeling solutions

Root Causes:

  • Data quality checks and validations are performed after the fact or often not at all.
  • The business believes IT is performing the required data quality checks and corrections; however, it lacks visibility into how IT is fixing data errors, or whether these errors are being addressed at all.

Solutions that can help:

  • Data quality solutions that allow business and IT to enforce data policies and standards to ensure business applications have accurate data for modeling and reporting purposes.
  • Data quality scorecards accessible by business users to showcase the performance of ongoing data quality rules used to validate and fix data quality errors before they go into downstream risk systems. 
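A scorecard of this kind boils down to pass rates per rule. The following hypothetical Python sketch shows the shape of that computation (rule names and record fields are invented; a data quality product would let the business define and monitor these without code):

```python
from collections import Counter

# Hypothetical rules for illustration only.
RULES = {
    "rating present": lambda rec: rec.get("rating") is not None,
    "exposure non-negative": lambda rec: rec.get("exposure", -1) >= 0,
}

def scorecard(records, rules=RULES):
    """Per-rule pass rates that a business user could review on a
    dashboard before data flows into downstream risk systems."""
    total = len(records)
    if total == 0:
        return {}
    passes = Counter()
    for rec in records:
        for name, check in rules.items():
            if check(rec):
                passes[name] += 1
    return {name: passes[name] / total for name in rules}
```

Publishing these pass rates to business users closes the visibility gap described above: the business can see exactly which checks ran and how the data fared.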

Challenge #4:  Unable to explain how risk is measured and reported to external regulators

Root Causes:

  • Because IT manages data integration processes manually, organizations lack up-to-date, detailed, and accurate documentation of these processes from beginning to end. As a result, IT cannot produce data lineage reports, leading to audit failures, regulatory penalties, and higher capital allocations than required.
  • Lack of agreed-upon documentation of business terms and definitions explaining what data is available, how it is used, and who has the domain knowledge to answer questions

Solutions that can help:

  • Metadata management solutions that capture upstream and downstream details of what data is collected, how it is processed, where it is used, and who uses it can address this requirement.
  • A business glossary that lets data stewards and owners manage definitions of your data, with seamless access for business users from their desktops
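Under the hood, a lineage question reduces to walking a graph of dataset dependencies. The sketch below uses an invented, hard-coded graph to show the idea; metadata management tools harvest and maintain such graphs automatically from ETL jobs and reports:

```python
# Hypothetical lineage graph: each dataset maps to the datasets it is
# built from. All dataset names are invented for illustration.
LINEAGE = {
    "regulatory_risk_report": ["risk_mart"],
    "risk_mart": ["trades_staging", "counterparty_master"],
    "trades_staging": ["trading_system_feed"],
}

def upstream(dataset, graph):
    """Answer the auditor's question 'where did this number come from?'
    by collecting every dataset upstream of the given one."""
    seen, stack = set(), [dataset]
    while stack:
        node = stack.pop()
        for source in graph.get(node, []):
            if source not in seen:
                seen.add(source)
                stack.append(source)
    return seen
```

A lineage report for the regulatory risk report is then just `upstream("regulatory_risk_report", LINEAGE)`, traced all the way back to the originating trading feed.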

Challenge #5:  Unable to identify and measure risk exposures between counterparties and securities instruments

Root Causes:

  • No single source of the truth – Existing counterparty/legal entity master data resides in systems across traditional business silos.
  • External identifiers, including the proposed Global Legal Entity Identifier, will never replace the identifiers used across existing systems
  • Lack of insight into how each legal entity is related to each other both from a legal hierarchy standpoint and their exposure to existing securities instruments.

Solutions that can help:

  • Master Data Management for counterparty and securities master data can provide a single, connected, and authoritative source of counterparty information, including legal hierarchy relationships and rules to identify the role and relationship between counterparties and existing securities instruments. It also eliminates the business confusion of having different identifiers for the same legal entity by creating a “master” record and cross-references to existing records and identifiers for that entity.
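The cross-reference idea can be sketched as a small lookup: several system-local identifiers all resolve to one golden master record. Every name and identifier below is hypothetical, and a real MDM hub would also handle matching, survivorship, and hierarchy management:

```python
# Hypothetical cross-reference: identifiers from several source systems
# all resolve to one "master" counterparty record.
XREF = {
    ("crm", "C-1001"): "MASTER-42",
    ("trading", "TRD-77"): "MASTER-42",
    ("settlement", "S-9"): "MASTER-42",
}

# The golden records themselves, keyed by master identifier.
MASTER = {
    "MASTER-42": {"legal_name": "Acme Holdings Ltd", "parent_entity": None},
}

def resolve(system, local_id):
    """Map a source-system identifier to its golden master record, so
    every application sees the same legal entity."""
    return MASTER.get(XREF.get((system, local_id)))
```

With this in place, exposure booked under `TRD-77` in trading and `S-9` in settlement rolls up to one legal entity instead of three apparently unrelated counterparties.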

In summary, Chief Risk Officers and their organizations are investing to improve existing business processes, people, and business applications to satisfy industry regulations and gain better visibility into their risk conditions. Though these are important investments, it is also critical to invest in technologies that ensure IT has what it needs to access and deliver comprehensive, timely, trusted, and authoritative data.

At the same time, CIOs can no longer afford to waste precious resources supporting manual works of art. As you take this opportunity to invest in your data strategies and requirements, it is important that both business and IT realize the importance of investing in a scalable and proven information and data architecture, not only to satisfy upcoming regulatory requirements but also to put in place a solution that meets the future needs of all lines of business. Click here to learn more about Informatica’s solutions for banking and capital markets.

 

Posted in Data Quality, Financial Services | Leave a comment