Tag Archives: reference data management
Financial Stability Board Pushes Legal Entity Identifier to the G20 – Vote Expected this Month – What’s Next?
Hot off the press! The Financial Stability Board (FSB) published today (June 8th, 2012) a report entitled “A Global Legal Entity Identifier for Financial Markets,” prepared for G20 supervisors in response to the mandate issued at the Cannes Summit and ahead of a final vote at the end of the month in Mexico. The report sets out 35 recommendations for the development and implementation of the global LEI system, guided by a set of “High Level Principles” that outline the objectives a global LEI system should meet.
The proposed global Legal Entity Identifier (LEI) is expected to help regulators uniquely identify counterparties across the financial system and monitor the impact of risky counterparties holding positions with banks. Assuming the LEI is approved by the G20 this month, it will be the first of these infrastructure standards to be implemented globally, requiring firms to integrate, reconcile and cross-reference the new LEI with existing counterparty identifiers and information, as well as to maintain accurate, current legal hierarchies.
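Before an LEI can be cross-referenced against existing counterparty identifiers, it has to be well-formed. A minimal sketch of LEI validation, assuming the ISO 17442 structure (20 alphanumeric characters, the last two being ISO 7064 MOD 97-10 check digits, the same arithmetic used for IBANs); the function names and the sample prefix below are illustrative, not drawn from the FSB report:

```python
# Sketch of LEI check-digit validation per ISO 17442 (ISO 7064 MOD 97-10).
def _as_number(s: str) -> int:
    # Letters A-Z map to 10-35; digits map to themselves.
    return int("".join(str(int(c, 36)) for c in s))

def lei_check_digits(prefix18: str) -> str:
    """Compute the two check digits for an 18-character LEI prefix."""
    return f"{98 - _as_number(prefix18 + '00') % 97:02d}"

def is_valid_lei(lei: str) -> bool:
    """A 20-character LEI is valid when the whole string, read as a
    number under the A=10..Z=35 mapping, is congruent to 1 mod 97."""
    return len(lei) == 20 and lei.isalnum() and _as_number(lei) % 97 == 1

prefix = "5493001KJTIIGC8Y1R"            # illustrative 18-character prefix
lei = prefix + lei_check_digits(prefix)  # the constructed LEI validates
print(lei, is_valid_lei(lei))
```

A check like this only catches malformed codes; reconciling a valid LEI against a firm's internal counterparty identifiers still requires a cross-reference table maintained as part of the reference data platform.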
Most of the buzz around the water cooler for those responsible for enterprise reference data in financial services has been about the recent G20 meeting in Switzerland on the details of the proposed Legal Entity Identifier (LEI). The LEI is designed to help regulators manage and monitor systemic risk in the financial markets by creating a unique ID for legal entities/counterparties, shared by global financial companies and government regulators. Adoption is expected to be decided at the G20 leaders’ summit in June in Mexico, as regulators settle the details of administering, implementing and enforcing the standard. Will the new LEI solve the issues that led to the recent financial crisis?
Karen Hsu shares why reference data is a big issue today and how Informatica helps customers better manage risk and comply with new regulation.
For additional research on reference data trends, see the following report from Aite Group:
Many felt that, with the increased focus on credit issues and all the discussion around ‘too big to fail,’ enforcement of Anti-Money Laundering (AML) and Bank Secrecy Act (BSA) rules would take a back seat; this has not been the case. Recent large and very visible enforcement actions have called into question current oversight of AML/BSA, and congressional allegations of lax oversight have kept BSA compliance a top examination priority for regulators. According to moneylaundering.com, fines for non-compliance with BSA and OFAC requirements almost quadrupled in 2010, costing financial services companies over $660M in the U.S. alone. Bankersonline.com reports that the U.S. Justice Department is seeking to fine HSBC USA as much as $500 million for anti-money laundering compliance problems, an amount that would be the largest-ever penalty for such violations, according to individuals familiar with the investigation. Why, you may ask?
The complexity in automated payments processing, electronic bank account management, and reference data management lies in the end-to-end integration. The value of the automation is the enhanced visibility it provides into payment, counterparty, and security data.
Your competitive differentiation lies in the level of visibility you provide your customers.
The devil, as they say, is in the detail. Your organization might have invested years of effort and millions of dollars in an enterprise data warehouse, but unless the data in it is accurate and free of contradiction, it can lead to misinformed business decisions and wasted IT resources.
We’re seeing an increasing number of organizations confront the issue of data quality in their data warehousing environments in efforts to sharpen business insights in a challenging economic climate. Many are turning to master data management (MDM) to address the devilish data details that can undermine the value of a data warehousing investment.
Consider this: Just 24 percent of data warehouses deliver “high value” to their organizations, according to a survey by The Data Warehousing Institute (TDWI). Twelve percent deliver low value and 64 percent deliver moderate value “but could deliver more,” TDWI’s report states. For many organizations, questionable data quality is the reason why data warehouses fall short of their potential.
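The “free of contradiction” requirement above is something you can test for mechanically: group records for the same entity across source systems and flag attributes on which those systems disagree. A minimal sketch, with hypothetical records and field names (real MDM platforms apply matching and survivorship rules on top of this):

```python
# Flag contradictory attribute values for the same entity across sources.
from collections import defaultdict

def find_conflicts(records, fields=("country",)):
    by_entity = defaultdict(list)
    for rec in records:
        by_entity[rec["entity_id"]].append(rec)
    conflicts = {}
    for eid, recs in by_entity.items():
        for field in fields:
            values = {r[field] for r in recs}
            if len(values) > 1:  # source systems contradict each other
                conflicts.setdefault(eid, {})[field] = sorted(values)
    return conflicts

records = [
    {"entity_id": "C100", "source": "CRM",     "country": "US"},
    {"entity_id": "C100", "source": "Trading", "country": "GB"},
    {"entity_id": "C200", "source": "CRM",     "country": "DE"},
]
print(find_conflicts(records))  # → {'C100': {'country': ['GB', 'US']}}
```

Feeding a warehouse from records that fail a check like this is exactly how the contradictions described above end up in business reports.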
One of the most critical first steps for financial services firms looking to implement multidomain master data management (MDM) is to quantify the cost savings they could achieve.
Unfortunately, a thorough analysis of potential ROI is also one of the steps least followed (a key culprit being disconnects between business and IT).
This shortcoming is spotlighted in a new Informatica white paper, “Five Steps to Managing Reference Data More Effectively in Investment Banking,” which outlines key questions to ask in sizing up the cost implications of bad data and antiquated systems, such as:
- How long does it take to introduce a new security to trade?
- How many settlements need to be fixed manually?
- How many redundant data feeds does your firm have to manage?
- How accurate and complete are your end-of-day reports?
- Do you have the data you need to minimize risk and exposure?
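Questions like these can be turned into a back-of-envelope cost model to start the ROI conversation. A minimal sketch for the manual-settlement question; all figures are illustrative placeholders, not numbers from the white paper:

```python
# Rough annual cost of manually repairing failed settlements.
def annual_repair_cost(failures_per_day, minutes_per_fix,
                       loaded_hourly_rate, trading_days=250):
    hours_per_year = failures_per_day * minutes_per_fix / 60 * trading_days
    return hours_per_year * loaded_hourly_rate

# e.g. 40 failed settlements a day, 30 minutes each, at $60/hour loaded:
print(f"${annual_repair_cost(40, 30, 60):,.0f}")  # → $300,000
```

Even crude estimates like this give business and IT a shared number to argue about, which is often what closes the disconnect mentioned above.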