Tag Archives: Compliance

Data Governance: Another Hidden Compliance Opportunity?

I previously wrote about how regulatory compliance could be used as an opportunity to build the foundations for a flexible data environment.  This opened up a few conversations with some colleagues, which makes me think that I actually missed a key point in my previous blog:  Why is it that compliance is so hard?  After all, regulators are simply asking for data about everyday business activities.  Surely you just reach into your body of knowledge, draw out the salient facts and provide them in a specified format?

But it’s not as easy as a couple of queries.  The reality is that the body of knowledge in question is seldom in a shape recognizable as a ‘body’.  In most corporations, the data regulators are asking for is distributed throughout the organization.  Perhaps a ‘Scattering of Knowledge’ is a more appropriate metaphor.

It is time to accept that data distribution is here to stay. The idea of a single ERP is long gone. Hype around Big Data is dying down and is being replaced by a focus on all data as a valuable asset. IT architectures are becoming more complex as additional data stores and data-fueled applications are introduced. In fact, the rise of Data Governance's profile within large organizations is testament to the acceptance of data distribution, and to the need to manage it. Forrester has just released its first Forrester Wave™ on data governance, stating that it is time to address governance: "Data-driven opportunities for competitive advantage abound. As a consequence, the importance of data governance — and the need for tooling to facilitate data governance — is rising." (Informatica is recognized as a Leader.)

However, Data Governance programs are not yet as widespread as they should be. Unfortunately, it is hard to directly link strong Data Governance to business value, which makes it difficult to get a senior exec to sponsor the investment and cultural change required for strong governance. This brings me back to the opportunity within Regulatory Compliance. My thinking goes like this:

  1. Regulatory compliance is often about gathering and submitting high quality data
  2. This is hard as the data is distributed, and the quality may be questionable
  3. Tools are required to gather, cleanse, manage and submit data for compliance
  4. There is a high overlap of tools & processes for Data Governance and Regulatory Compliance

So – why not use Regulatory Compliance as an opportunity to pilot Data Governance tools, process and practice?

Far too often, compliance is a one-off effort with a specific tool. This tool collects data from disparate sources, with unknown data quality, and the underlying data processes are not addressed. Strong governance will have a positive effect on compliance – continually increasing data access and quality, and hence reducing the cost and effort of compliance. Since the cost of non-compliance is often measured in millions, getting exec sponsorship for a compliance-based pilot may be easier than for a broader Data Governance project. Once implemented, lessons learned and benefits realized can be leveraged to expand Data Governance into other areas.

Previously I likened Regulatory Compliance to a Buy One, Get One Free opportunity: compliance + a free performance boost. If you use your compliance budget to pilot Data Governance, the boost will be larger than simply implementing Data Quality and MDM tools. The business case shouldn't be too hard to build. Consider that EY's research shows that companies that successfully use data are already outperforming their peers by as much as 20%.[i]

Data Governance Benefit = (Cost of non-compliance + 20% performance boost) – compliance budget

Yes, the equation can be considered simplistic.  But it is compelling.
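
To make the arithmetic concrete, here is a minimal worked example in Python. Every figure is a hypothetical placeholder – the fine exposure, the baseline profit the 20% uplift is applied to, and the compliance budget are all assumptions for illustration, not data from the EY study or any specific company.

```python
# Hypothetical worked example of the benefit equation above.
# All figures are illustrative placeholders, not real data.

cost_of_noncompliance = 5_000_000        # fines and remediation avoided per year (assumed)
baseline_profit = 20_000_000             # profit of the business area the data supports (assumed)
performance_boost = 0.20 * baseline_profit   # the ~20% uplift EY attributes to data-driven peers
compliance_budget = 2_000_000            # what the compliance project would cost anyway (assumed)

benefit = (cost_of_noncompliance + performance_boost) - compliance_budget
print(f"Estimated annual Data Governance benefit: ${benefit:,.0f}")
# -> Estimated annual Data Governance benefit: $7,000,000
```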


[i] Big data and enterprise mobility, EY, 2013.

The Ones Not Screwing Up with Compliance Will Win

A few weeks ago, a regional US bank asked me to perform some compliance and use case analysis around fixing their data management situation. This bank prides itself on customer service and SMB focus while using large-bank product offerings. However, they were about a decade behind most of their peers in modernizing their IT infrastructure to stay operationally on top of things.

Figure: Bank Efficiency Ratio per AUM (Assets under Management), bankregdata.com

This included technologies like ESB, BPM, CRM, etc. They were also sub-optimal users of EDW and analytics capabilities. Having said all this, there was a commitment to change things up, which is always a needed first step in any recovery program.

THE STAKEHOLDERS

As I conducted my interviews across various departments (list below), it became very apparent that they were not suffering from data poverty (see prior post) but from a lack of accessibility and use of data.

  • Compliance
  • Vendor Management & Risk
  • Commercial and Consumer Depository products
  • Credit Risk
  • HR & Compensation
  • Retail
  • Private Banking
  • Finance
  • Customer Solutions

FRESH BREEZE

This lack of use occurred across the board. The natural reaction was to throw more bodies and more Band-Aid marts at the problem. Users also started to operate under the assumption that it would never get better; they simply resigned themselves to mediocrity. When some new players came into the organization from various systemically critical banks, they shook things up.

Here is a list of use cases they want to tackle:

  • The proposition of real-time offers based on customer events – as simple as offering investment banking products when there is an unusually high inflow of cash into a deposit account (see the sketch after this list).
  • The use of all mortgage application information to understand debt/equity ratio to make relevant offers.
  • The capture of true product and customer profitability across all lines of commercial and consumer products including trust, treasury management, deposits, private banking, loans, etc.
  • The agile evaluation, creation, testing and deployment of new terms on existing products and products under development, by shortening the product development life cycle.
  • The reduction of wealth management advisors’ time to research clients and prospects.
  • The reduction of unclaimed use tax, insurance premiums and leases being paid on consumables, real estate and requisitions due to incorrect status and location data for the equipment – for example, assets that are no longer owned, have been scrapped or have moved to a different department.
  • More efficient reconciliation between transactional systems and finance, which often uses multiple party IDs per contract change in accounts receivable, while the operating division uses one ID based on a contract and its addendums. An example would be vendor payment consolidation to create a true view of supplier spend and thus take advantage of volume discounts.
  • The proactive creation of a central compliance footprint (AML, 314, Suspicious Activity, CTR, etc.), allowing for quicker turnaround and fewer audit instances arising from MRAs (matters requiring attention).
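
As an illustration of the first use case – spotting an unusually high cash inflow and queuing a real-time offer – a minimal sketch of the underlying data logic might look like the following. The thresholds, field names and offer text are assumptions for illustration, not this bank's actual rules or any specific product.

```python
from dataclasses import dataclass
from statistics import mean, pstdev

@dataclass
class Deposit:
    customer_id: str
    amount: float

def flag_unusual_inflows(history, new_deposits, z_threshold=3.0, floor=100_000.0):
    """Flag customers whose latest deposit is unusually large versus their own history.

    `history` maps customer_id -> list of past deposit amounts; all thresholds are
    illustrative placeholders, not real underwriting or marketing rules.
    """
    offers = []
    for d in new_deposits:
        past = history.get(d.customer_id, [])
        if len(past) < 5 or d.amount < floor:
            continue  # too little history, or the amount is too small to matter
        mu, sigma = mean(past), pstdev(past)
        if sigma > 0 and (d.amount - mu) / sigma >= z_threshold:
            offers.append((d.customer_id, "investment banking / treasury product offer"))
    return offers

# Example: one customer with a sudden large inflow triggers an offer
history = {"C-1001": [8_000, 12_000, 9_500, 11_000, 10_500]}
print(flag_unusual_inflows(history, [Deposit("C-1001", 250_000)]))
```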

MONEY TO BE MADE – PEOPLE TO SEE

Adding these up came to about $31 to $49 million annually in cost savings, new revenue or increased productivity for this bank, which has $24 billion in total assets.

So now that we know there is money to be made by fixing the data of this organization, how can we realistically roll this out in an organization with many competing IT needs?

The best way to go about this is to attach any kind of data management project to a larger, business-oriented project, like CRM or EDW.  Rather than wait for these to go live without good seed data, why not feed them with better data as a key work stream within their respective project plans?

To summarize my findings, I want to quote three people I interviewed. A lady who recently had to struggle through an OCC audit told me she believes that the banks that can remain compliant at the lowest cost will ultimately win the end game – here she meant particularly tier 2 and tier 3 organizations. A gentleman from commercial banking left this statement with me: "Knowing what I know now, I would not bank with us." The same lady also said, "We engage in spreadsheet Kung Fu" to bring data together.

Given all this, what would you suggest?  Have you worked with an organization like this? Did you encounter any similar or different use cases in financial services institutions?


Ready for Your Data Security Audit?

In a recent survey of Informatica customers:
• Over 60% of companies had a security audit in the last year
• 35% had an internal security audit
• 16% had both an internal security audit and one performed by an external auditor

In addition, many of these organizations saw another company in their industry suffer a data breach.

These results are reinforced by the discussions I had with Audit and Compliance IT owners from various industries. Audits are on the rise as more customers require these audits before purchase. Compliance IT requires reports at a database or system level showing that the data has been protected. And they want to see these reports on a regular basis as data, including test data pulled from production environments, changes frequently.

Driving these audits and Informatica projects to protect data were the following top regulatory drivers (as reported by customers):
• SOX
• PCI
• PII
• PHI

These results are reinforced by the increasing use of Informatica’s regulatory and industry packs (containing pre-built rules and metadata), including PCI, PHI and PII. In addition to these areas, organizations I’ve spoken to are implementing projects to also protect non-public information, or confidential company information. For example, last week I spoke to a company about how they share detailed financial information about their business as part of the data they send to an outsourced partner. This financial information could easily be used to estimate the company’s revenues and profits for any given quarter—before that information is released to the street, if at all.
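
To give a feel for what protecting data in non-production or partner-bound copies involves, here is a minimal, generic masking sketch – deterministic pseudonymization of sensitive columns before a record leaves the production environment. This illustrates the general technique only; it is not Informatica's packs or APIs, and the field names, salt handling and token length are assumptions for illustration.

```python
import hashlib
import hmac

SECRET_SALT = b"store-this-in-a-vault-not-in-code"  # placeholder; never hard-code a real key

def pseudonymize(value: str, length: int = 12) -> str:
    """Deterministically mask a sensitive value so joins across tables still line up."""
    digest = hmac.new(SECRET_SALT, value.encode("utf-8"), hashlib.sha256).hexdigest()
    return digest[:length]

def mask_record(record: dict, sensitive_fields=("ssn", "account_number", "quarterly_revenue")) -> dict:
    """Return a copy of the record with sensitive fields replaced by masked tokens."""
    return {k: (pseudonymize(str(v)) if k in sensitive_fields else v) for k, v in record.items()}

# Example: masking a row before it is copied to a test system or sent to a partner
row = {"customer_id": "C-1001", "ssn": "123-45-6789", "quarterly_revenue": 48_200_000, "state": "OH"}
print(mask_record(row))
```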

In this same survey, the top benefits customers attributed to Informatica’s solution included:
• Increasing productivity by leveraging pre-built masking techniques, accelerators and purpose-built tools
• Reducing the time it took to identify and capture optimal test cases, therefore reducing overall testing time
• Reducing the risk of data breach

Are you ready for your data security audit?

For more information on Informatica’s data security solutions for non-production environments, please join us for an upcoming webinar:

http://bit.ly/W5IciG

For more information on Informatica’s data security solutions in general, please see:

http://bit.ly/PGcJkq

 


The Enterprise Data Archive For Hybrid IT

Data volumes are exploding. We see it all around us. The problem is that too much data can have a very negative impact on user productivity. Think about how long it takes to sift through emails after returning from vacation, or how long it takes to complete a purchase on an e-commerce site on Black Friday. The more data there is, the longer these processes take and the more time is spent combing through it. Informatica has been working successfully with Symantec and our joint customers through our partnership to help them find ways to control the impact of ‘too much data’. We are helping them define projects that improve their ability to meet SLAs and application performance, reduce costs and mitigate compliance risks – all while IT budgets remain relatively flat. (more…)


The Individual in the European Data Tug of War

LinkedIn’s security breach this summer exposed a massive 6.5 million user passwords and was yet another reminder of the blanket lack of protection over consumer data.  The constant deluge of reports over personal data leakages has left 70% of EU citizens worried about the misuse of their personal data, according to the European Commission. That’s why the EU stepped in to look at strengthening the right to access, change or delete personal data. (more…)


Lacking Data Integration, Cloud Computing Suffers

The findings of the Cloud Market Maturity study, a survey conducted jointly by Cloud Security Alliance (CSA) and ISACA, show that government regulations, international data privacy, and integration with internal systems dominate the top 10 areas where trust in the cloud is at its lowest.

The Cloud Market Maturity study examines the maturity of cloud computing and helps identify market changes. In addition, the report provides detailed information on the adoption of cloud services at all levels within global companies, including senior executives. (more…)


Data Governance Sustains Your Data Lifecycle

The next facet of our data governance framework focuses on the three intentionally simplified dependent processes that constitute the data lifecycle.  When educating your business sponsors and evangelists on the data lifecycle, I like to categorize it into these three broad areas: upstream processes, stewardship processes, and downstream processes.   If you’re an enterprise or data architect, you’ll likely have a much more granular set of steps in a data lifecycle, which is perfectly fine.  But when engaging with your business partners, keep it simple and they may actually listen!   (more…)


Data Retention Requirements in Financial Services – What Are They? Why Are They So Hard?

The need for more robust data retention management and enforcement is more than just good data management practice. It is a legal requirement: financial services organizations across the globe must comply with a myriad of local, federal, and international laws that mandate the retention of certain types of data, for example:

  • Dodd-Frank Act: Under Dodd-Frank, firms are required to maintain records for no less than five years.
  • Basel Accord: The Basel guidelines call for the retention of risk and transaction data over a period of three to seven years. Noncompliance can result in significant fines and penalties.
  • MiFID II: Transactional data must be stored in such a way that it meets the new records retention requirements for such data (which must now be retained for up to five years) and can be easily retrieved, in context, to prove best execution.
  • Bank Secrecy Act: All BSA records must be retained for a period of five years and must be filed or stored in such a way as to be accessible within a reasonable period of time.
  • Payment Card Industry Data Security Standard (PCI): PCI requires card issuers and acquirers to retain an audit trail history for a period that is consistent with its effective use, as well as legal regulations. An audit history usually covers a period of at least one year, with a minimum of three months available on-line.
  • Sarbanes-Oxley: Section 103 requires firms to prepare and maintain, for a period of not less than seven years, audit work papers and other information related to any audit report, in sufficient detail to support the conclusions reached and reported to external regulators.

Each of these laws has distinct data collection, analysis, and retention requirements that must be factored into existing information management practices. Unfortunately, existing data archiving methods, including traditional database and tape backups, lack the capabilities required to effectively enforce and automate data retention policies in line with these regulations. In addition, a number of internal and external challenges make it even more difficult for financial institutions to archive and retain the required data, due to the following trends: (more…)
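
As a rough illustration of what enforcing and automating retention policies means at the data level, here is a minimal sketch: a policy table derived from the retention periods listed above and a check that decides whether a record is eligible for purge. The mapping of each regulation to a single number of years (using the longest quoted period as a conservative default) and the record structure are simplifying assumptions for illustration, not legal guidance or any product's implementation.

```python
from datetime import date
from typing import Optional

# Retention periods (in years) roughly following the regulations listed above;
# where a regulation quotes a range, the longest period is used as a conservative default.
RETENTION_YEARS = {
    "DODD_FRANK": 5,
    "BASEL": 7,
    "MIFID_II": 5,
    "BSA": 5,
    "PCI": 1,
    "SOX_SECTION_103": 7,
}

def earliest_purge_date(record_date: date, regulations: list) -> date:
    """Return the first date on which a record may be purged under all applicable rules."""
    years = max(RETENTION_YEARS[r] for r in regulations)
    return date(record_date.year + years, record_date.month, record_date.day)

def is_purgeable(record_date: date, regulations: list, today: Optional[date] = None) -> bool:
    today = today or date.today()
    return today >= earliest_purge_date(record_date, regulations)

# Example: a transaction subject to both MiFID II and Basel retention rules
print(is_purgeable(date(2016, 3, 31), ["MIFID_II", "BASEL"], today=date(2022, 1, 1)))  # False: held 7 years
```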
