Tag Archives: Compliance
The need for more robust data retention management and enforcement is more than just good data management practice. It is a legal requirement: financial services organizations across the globe must comply with a myriad of local, federal, and international laws that mandate the retention of certain types of data. For example:
- Dodd-Frank Act: Under Dodd-Frank, firms are required to maintain records for no less than five years.
- Basel Accord: The Basel guidelines call for the retention of risk and transaction data over a period of three to seven years. Noncompliance can result in significant fines and penalties.
- MiFID II: Transactional data must now be retained for up to five years and stored so that it can be easily retrieved, in context, to prove best execution.
- Bank Secrecy Act: All BSA records must be retained for a period of five years and must be filed or stored in such a way as to be accessible within a reasonable period of time.
- Payment Card Industry Data Security Standard (PCI DSS): PCI DSS requires card issuers and acquirers to retain an audit trail history for a period that is consistent with its effective use, as well as legal regulations. An audit history usually covers a period of at least one year, with a minimum of three months available online.
- Sarbanes-Oxley: Section 103 requires firms to prepare and maintain, for a period of not less than seven years, audit work papers and other information related to any audit report, in sufficient detail to support the conclusions reached and reported to external regulators.
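The retention periods above can be captured in a machine-readable policy table that an archiving system consults before purging anything. A minimal sketch, with the regulation names and periods taken from the list above; the mapping and `may_purge` helper are illustrative assumptions, not a compliance tool:

```python
from datetime import date

# Minimum retention periods (in years) drawn from the regulations above.
# Basel uses the upper end of its three-to-seven-year guideline.
RETENTION_YEARS = {
    "Dodd-Frank": 5,
    "Basel": 7,
    "MiFID II": 5,
    "BSA": 5,
    "PCI DSS": 1,
    "SOX Section 103": 7,
}

def may_purge(record_date: date, regulations: list[str], today: date) -> bool:
    """A record may be purged only once every applicable retention period has elapsed."""
    years_held = (today - record_date).days / 365.25
    return all(years_held >= RETENTION_YEARS[r] for r in regulations)

# A trade record subject to both Dodd-Frank and Basel must be kept seven years.
print(may_purge(date(2005, 1, 1), ["Dodd-Frank", "Basel"], date(2012, 6, 1)))  # True
print(may_purge(date(2008, 1, 1), ["Dodd-Frank", "Basel"], date(2012, 6, 1)))  # False
```

The key design point is that a record's retention period is the longest one among all regulations that apply to it, which is exactly why policy enforcement needs to be automated rather than left to per-application defaults.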
Each of these laws has distinct data collection, analysis, and retention requirements that must be factored into existing information management practices. Unfortunately, existing data archiving methods, including traditional database and tape backup, lack the capabilities required to effectively enforce and automate data retention policies that comply with industry regulations. In addition, a number of internal and external challenges make it even more difficult for financial institutions to archive and retain required data due to the following trends: (more…)
In the second of two videos, Scott Fingerhut, senior director of product marketing for CEP at Informatica, talks about how Complex Event Processing (CEP) can be applied: proactive monitoring, proactive compliance and customer engagement.
Learn more about CEP in Scott’s first video: http://www.youtube.com/watch?v=AUmveP07Ea8.
Data warehouses are applications, so why not manage them like one? In fact, data grows at a much faster rate in data warehouses, since they integrate data from multiple applications and cater to many different groups of users who need different types of analysis. Data warehouses also keep historical data for a long time, so data grows exponentially in these systems. Infrastructure costs also escalate quickly, since analytical processing on large amounts of data requires big, beefy boxes, not to mention the software license and maintenance costs for such a large amount of data. Imagine how much backup media is required to back up tens to hundreds of terabytes of data warehouse on a regular basis. But do you really need to keep all that historical data in production?
One of the challenges of managing data growth in data warehouses is that it's hard to determine which data is actually used, which data is no longer being used, or whether some data was ever used at all. Unlike transactional systems, where the application logic determines when records are no longer being transacted upon, the usage of analytical data in data warehouses has no definite business rules. Age or seasonality may determine data usage in data warehouses, but business users are usually loath to give up the availability of all that data at their fingertips. The only clear-cut way to prove that some data is no longer being used in a data warehouse is to monitor its usage.
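Usage monitoring of this kind can be approximated by scanning the warehouse's query log for each table's most recent access. A minimal sketch; the log format, table names, and one-year idle threshold are all hypothetical illustrations:

```python
from datetime import datetime, timedelta

# Hypothetical query-log entries: (table_name, access_timestamp).
query_log = [
    ("sales_2012", datetime(2012, 5, 30)),
    ("sales_2012", datetime(2012, 6, 1)),
    ("sales_2004", datetime(2010, 2, 14)),
    ("marketing_archive", datetime(2009, 8, 3)),
]

def unused_tables(log, as_of, idle_after=timedelta(days=365)):
    """Report tables whose most recent access is older than the idle threshold."""
    last_access = {}
    for table, ts in log:
        last_access[table] = max(ts, last_access.get(table, ts))
    return sorted(t for t, ts in last_access.items() if as_of - ts > idle_after)

print(unused_tables(query_log, as_of=datetime(2012, 6, 15)))
# ['marketing_archive', 'sales_2004']
```

Tables surfacing in such a report become candidates for archiving, giving IT evidence to bring to business users instead of arguing from age alone.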
In the first of two videos, Scott Fingerhut, senior director of product marketing for CEP at Informatica, simplifies CEP: Complex Event Processing. In this video, he highlights the importance of being proactive. He also talks about how CEP can be applied: proactive monitoring, proactive compliance and customer engagement.
The cost for 1GB of magnetic disk storage 20 years ago was $1,000; now it's eight cents. 1GB is enough to store about 20,000 letter-size scanned documents. Storing the same number of paper documents would require two four-drawer filing cabinets costing about $400. The cost of electronic data storage is five thousand times less than that of paper storage.
Costs have dropped consistently at 40% per year, which accounts for the more than 12,000-fold reduction in cost since 1992. The cost of RAID or mainframe disk storage is somewhat greater, but the historical trend for other storage devices has been similar, and the forecast for the foreseeable future is that costs will continue to decrease at the same rate. Twenty years from now we will be able to buy one terabyte of storage for a penny. (more…)
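The figures above can be checked with compound-decline arithmetic: a constant annual decline of rate r over n years multiplies the price by (1 - r)^n. A quick check, using only the numbers stated in the text:

```python
# Price declining at a constant annual rate r for n years: price * (1 - r) ** n
start_price, end_price, years = 1000.0, 0.08, 20

# Implied constant annual decline from $1,000/GB to $0.08/GB over 20 years.
implied_rate = 1 - (end_price / start_price) ** (1 / years)
print(f"implied annual decline: {implied_rate:.1%}")  # ~37.6%, i.e. roughly 40%

# Total reduction factor, matching the "more than 12,000 times" in the text.
print(round(start_price / end_price))  # 12500
```

So the stated 40% annual decline and the 12,000-fold total reduction are mutually consistent to within rounding.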
Similar to the way that a carburetor restrictor plate prevents NASCAR race cars from going as fast as possible by restricting maximum airflow, inefficient messaging middleware prevents IT organizations from processing vital business data as fast as possible.
The “Dodd-Frank Wall Street Reform and Consumer Protection Act” has recently been passed by the US federal government to regulate financial institutions. Per this legislation, there will be more “watchdog” agencies that will be auditing banks, lending and investment institutions to ensure compliance. As an example, there will be an Office of Financial Research within the Federal Treasury responsible for collecting and analyzing data. This legislation brings with it a higher risk of fines for non-compliance. (more…)
Informatica supports Agile Data Integration for Agile BI with best practices that encourage good data governance, facilitate business-IT collaboration, promote reuse and flexibility through data virtualization, and enable rapid prototyping and test-driven development. Organizations that want to successfully adopt Agile Data Integration should standardize on the following best practices and leverage Informatica 9.1 to streamline the data integration process, improve data governance, and provide a flexible data virtualization architecture.
1. The business and IT work efficiently and effectively to translate requirements and specifications into data services (more…)
Imagine you have a crystal ball. With it, you can look into the future and see where your business will be next year, and whether that key project will be a success or a failure. You could take corrective actions today that would help ensure that success. Life would be great.
Unfortunately, we don't have a crystal ball, so we have to embark on projects without knowing with any degree of certainty whether they will succeed or fail. Fortunately, we can increase our chances of success by conducting some due diligence, which helps uncover otherwise unforeseen twists in the road. All too often, however, the due diligence step is left out, and more often than not those projects fail. (more…)