Tag Archives: Compliance
In a recent survey of Informatica customers:
• Over 60% of companies had a security audit in the last year
• 35% of the companies had an internal security audit
• 16% of the companies had both an internal security audit and one performed by an external auditor
• In addition, many of these organizations had seen another company in their industry suffer a data breach.
These results are reinforced by the discussions I have had with audit and compliance IT owners from various industries. Audits are on the rise as more customers require them before purchase. Compliance IT requires reports at a database or system level showing that the data has been protected, and they want to see these reports on a regular basis, since data, including test data pulled from production environments, changes frequently.
According to customers, the following top regulatory drivers were behind these audits and the Informatica projects to protect data:
These results are reinforced by the increasing use of Informatica’s regulatory and industry packs (containing pre-built rules and metadata), including PCI, PHI and PII. In addition to these areas, organizations I’ve spoken to are implementing projects to also protect non-public information, or confidential company information. For example, last week I spoke to a company about how they share detailed financial information about their company as part of the data they send to an outsourced partner. This financial information could easily be used to estimate the company’s revenues and profits for any given quarter—before that information is released to the street, if at all.
In this same survey, the top benefits customers said that Informatica’s solution addressed included:
• Increasing productivity by leveraging pre-built masking techniques, accelerators and purpose-built tools
• Reducing the time it took to identify and capture optimal test cases, therefore reducing overall testing time
• Reducing the risk of data breach
Are you ready for your data security audit?
For more information on Informatica’s data security solutions for non-production environments, please join us for an upcoming webinar:
For more information on Informatica’s data security solutions in general, please see:
Data volumes are exploding. We see it all around us. The problem is that too much data can have a very negative impact on user productivity. Think about how long it takes to sift through emails after returning from vacation, or to complete a purchase on an e-commerce site on Black Friday. The more data, the longer these processes take and the more time is spent combing through it. Informatica has been working successfully with Symantec and our customers through our partnership to help them find ways to control the impact of ‘too much data’. We are helping them define projects that improve their ability to meet SLAs and application performance, reduce costs, and mitigate compliance risks, all while IT budgets remain relatively flat.
In this video, Peter Ku, director of solution marketing, Global Financial Services, Informatica, discusses the data challenges associated with FATCA compliance.
The Foreign Account Tax Compliance Act (FATCA) was signed into U.S. law in March 2010 and comes into effect on January 1, 2014. The new law will require Foreign Financial Institutions to report U.S. persons and the owners of companies who hold bank accounts with them, for tax reporting and withholding purposes.
Peter answers the following questions:
1) What is FATCA and what are the requirements financial services companies must comply with?
2) What must financial institutions do to successfully meet these requirements?
3) What do financial institutions need in order to address the data-related challenges and comply with FATCA?
The findings of the Cloud Market Maturity study, a survey conducted jointly by Cloud Security Alliance (CSA) and ISACA, show that government regulations, international data privacy, and integration with internal systems dominate the top 10 areas where trust in the cloud is at its lowest.
The Cloud Market Maturity study examines the maturity of cloud computing and helps identify market changes. In addition, the report provides detailed information on the adoption of cloud services at all levels within global companies, including senior executives.
The next facet of our data governance framework focuses on the three intentionally simplified dependent processes that constitute the data lifecycle. When educating your business sponsors and evangelists on the data lifecycle, I like to categorize it into these three broad areas: upstream processes, stewardship processes, and downstream processes. If you’re an enterprise or data architect, you’ll likely have a much more granular set of steps in a data lifecycle, which is perfectly fine. But when engaging with your business partners, keep it simple and they may actually listen!
The need for more robust data retention management and enforcement is more than just good data management practice. It is a legal requirement for financial services organizations across the globe to comply with the myriad of local, federal, and international laws that mandate the retention of certain types of data. For example:
- Dodd-Frank Act: Under Dodd-Frank, firms are required to maintain records for no less than five years.
- Basel Accord: The Basel guidelines call for the retention of risk and transaction data over a period of three to seven years. Noncompliance can result in significant fines and penalties.
- MiFID II: Transactional data must be stored in a way that meets the new records retention requirements (such data must now be retained for up to five years) and can be easily retrieved, in context, to prove best execution.
- Bank Secrecy Act: All BSA records must be retained for a period of five years and must be filed or stored in such a way as to be accessible within a reasonable period of time.
- Payment Card Industry Data Security Standard (PCI DSS): PCI DSS requires card issuers and acquirers to retain an audit trail history for a period that is consistent with its effective use, as well as legal regulations. An audit history usually covers a period of at least one year, with a minimum of three months available online.
- Sarbanes-Oxley: Section 103 requires firms to prepare and maintain, for a period of not less than seven years, audit work papers and other information related to any audit report, in sufficient detail to support the conclusions reached and reported to external regulators.
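As a purely illustrative sketch (not a description of Informatica’s products), the retention rules above can be modeled as a simple policy table: a record becomes eligible for purging only once the longest applicable retention period has elapsed. The regulation names and periods below are simplified from the list above.

```python
from datetime import date

# Hypothetical minimum retention periods, in years, simplified from the
# regulations listed above. Where a range applies (e.g. Basel's three to
# seven years), the maximum is used to stay conservative.
RETENTION_YEARS = {
    "Dodd-Frank": 5,
    "Basel": 7,
    "MiFID II": 5,
    "BSA": 5,
    "PCI DSS": 1,
    "SOX": 7,
}

def earliest_purge_date(record_date: date, regulations: list[str]) -> date:
    """A record may only be purged after the longest applicable retention
    period has elapsed (leap-day edge cases ignored in this sketch)."""
    years = max(RETENTION_YEARS[r] for r in regulations)
    return record_date.replace(year=record_date.year + years)

def may_purge(record_date: date, regulations: list[str], today: date) -> bool:
    """True if every applicable retention obligation has expired."""
    return today >= earliest_purge_date(record_date, regulations)
```

For a record subject to both Dodd-Frank (five years) and SOX (seven years), the seven-year period governs; a retention engine would evaluate every record against all regulations that apply to it.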
Each of these laws has distinct data collection, analysis, and retention requirements that must be factored into existing information management practices. Unfortunately, existing data archiving methods, including traditional database and tape backup methods, lack the capabilities required to effectively enforce and automate data retention policies that comply with industry regulations. In addition, a number of internal and external challenges, driven by the following trends, make it even more difficult for financial institutions to archive and retain required data:
In the second of two videos, Scott Fingerhut, senior director of product marketing for CEP at Informatica, talks about how Complex Event Processing (CEP) can be applied: proactive monitoring, proactive compliance and customer engagement.
Learn more about CEP in Scott’s first video: http://www.youtube.com/watch?v=AUmveP07Ea8.
Data warehouses are applications, so why not manage them like one? In fact, data grows at a much faster rate in data warehouses, since they integrate data from multiple applications and cater to many different groups of users who need different types of analysis. Data warehouses also keep historical data for a long time, so data grows exponentially in these systems. Infrastructure costs also escalate quickly, since analytical processing on large amounts of data requires big, beefy boxes, not to mention the software license and maintenance costs for such a large amount of data. Imagine how much backup media is required to back up tens to hundreds of terabytes of data warehouse data on a regular basis. But do you really need to keep all that historical data in production?
One of the challenges of managing data growth in data warehouses is that it’s hard to determine which data is actually used, which data is no longer being used, or even if the data was ever used at all. Unlike transactional systems, where the application logic determines when records are no longer being transacted upon, the usage of analytical data in data warehouses has no definite business rules. Age or seasonality may determine data usage in data warehouses, but business users are usually loath to let go of the availability of all that data at their fingertips. The only clear-cut way to prove that some data is no longer being used in a data warehouse is to monitor its usage.