Our announcement last week was an exciting milestone for those of us who started at 29West supporting the early high-frequency traders from 2004 to 2006. We announced the next step in a 10-year effort that has now lowered the bar for low-latency messaging by nearly five orders of magnitude in Version 6.1 of Informatica Ultra Messaging with Shared Memory Acceleration (SMX). The really cool thing is that we have helped early customers like Intercontinental Exchange and Credit Suisse take advantage of the reduction from 2.5 million nanoseconds (ns) of latency to as low as 37 ns on commodity hardware and networks, without having to switch products or do major rewrites of their code.
But as I asked in the title, does it matter? Does being able to send messages to multiple receivers within a single-box trading system or order-matching engine in 90 ns, as opposed to one microsecond, really make a difference?
Well, according to a recent article by Scott Appleby on the TabbFORUM, “The Death of Alpha on Wall Street,”* the only way for investment banks to find alpha, or excess returns, is “to find valuation correlations among markets to extract microstructure alpha.” He states, “Getco, Tradebot and Renaissance use technology to find valuation correlations among markets to extract microstructure alpha; this still works, but requires significant capital.” The extra hundreds of nanoseconds that SMX frees up allow a company to make its matching algorithms or order routers that much smarter, performing dozens of additional complex calculations before the computer makes a decision. Furthermore, by letting the messaging layer take over the integration of software components that are less critical to producing alpha (but very important for operational risk control, such as guaranteeing that messages can be captured off the single-box trading system for compliance and disaster recovery), busy software developers can focus on changes in the microstructure of the markets.
The key SMX innovation is another “less is more” style engineering feat from our team. SMX simply eliminates all copying of messages from the message delivery path. And if the processes in your trading system happen to be running on the same CPU, on the same or different cores, messages are effectively passed within the memory cache of the core or CPU. The other reason this matters is that this product uniquely (as far as I know) allows zero-copy shared-memory communication between Java, C, and Microsoft .NET applications, so developers can fully leverage the best features and the knowledge of their teams to deploy complex high-performance applications. For example, it allows third-party feed handlers built in C to communicate at extremely low latencies with algo engines written in Java.
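To make the zero-copy idea concrete, here is a minimal sketch using Python's standard-library shared memory. This is illustrative only and is not the SMX API: the "sender" writes a length-prefixed message once into a shared segment, and the "receiver" attaches to the same segment and reads it in place through a memoryview, so no bytes are copied along the delivery path.

```python
# Illustrative sketch of zero-copy shared-memory messaging (NOT the SMX API).
import struct
from multiprocessing import shared_memory

# Sender: create a segment and write a length-prefixed message into it once.
tx = shared_memory.SharedMemory(create=True, size=1024)
payload = b"order:AAPL,qty=100"
struct.pack_into("I", tx.buf, 0, len(payload))   # 4-byte length header
tx.buf[4:4 + len(payload)] = payload             # message body, written once

# Receiver: attach to the same segment by name and read in place.
rx = shared_memory.SharedMemory(name=tx.name)
(msg_len,) = struct.unpack_from("I", rx.buf, 0)
view = rx.buf[4:4 + msg_len]        # memoryview over shared bytes: no copy yet
text = bytes(view).decode()         # materialize a copy only when the app needs it
view.release()

rx.close()
tx.close()
tx.unlink()
```

In a real messaging product the segment would be a lock-free ring buffer shared between long-lived processes; the point here is only that the receiver reads the sender's bytes directly rather than having the kernel copy them through a socket.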
So congrats to the UM development team for achieving this important milestone and “thanks” to our customers for continuing to push us to provide you with that “lagniappe” of extra time that can make all the difference in the success of your trading strategies and your businesses.
The need to be customer-centric in financial services is more important than ever as banks and insurance companies look for ways to reduce churn; those in the industry know that loyal customers spend more on higher-margin products and are likely to refer additional customers. Bankers and insurers who understand this, and get it right, are in a better position to maintain profitable, lasting customer loyalty and reap significant financial rewards. The current market challenges remain significant and will be difficult to overcome without the right information management architecture to help companies become truly customer-centric. Here’s why:
- Customer satisfaction with retail banks has decreased for four consecutive years, with particularly low scores in customer service. In one industry survey, thirty-seven percent of customers who switched primary banking relationships cited poor customer service as the main reason.
- The commoditization of traditional banking and insurance products has rapidly increased client attrition and decreased acquisition rates. Industry reports estimate that banks are losing customers at an average rate of 12.5% per year, while average acquisition rates are at 13.5%, making acquisition nearly a zero-sum game. Further, the cost of acquiring new customers is estimated at five times the cost of retaining existing ones.
- Switching is easier than ever before. Customer churn is at an all-time high in most European countries. According to an industry survey, 42 percent of German banking customers had been with their main bank for less than a year. With customer acquisition costs running between €200 and €400, bankers and insurers need to keep their clients at least 5 to 7 years simply to break even.
- Mergers and acquisitions further increase the complexity and risks of maintaining customer relationships. According to a recent study, 17 percent of respondents who had gone through a merger or acquisition had switched at least one of their accounts to another institution after their bank was acquired, while an additional 31 percent said they were at least somewhat likely to switch over the next year.
Financial services professionals have long recognized the need to manage customer relationships rather than account relationships by shifting away from a product-centric culture toward a customer-centric model to maintain client loyalty and grow their bottom lines organically. Here are some reasons why:
- A 5% increase in customer retention can increase profitability by 35% in banking, 50% in brokerage, and 125% in the consumer credit card market.
- Banks can add more than $1 million to the profitability of their commercial banking business line by simply extending 16 of these large corporate relationships by one year, or by saving two such clients from defecting. In the insurance sector, a one percent increase in customer retention results in $1 million in revenue.
- The average company has between a 60% and 70% probability of success selling more services to a current customer, a 20% to 40% probability of selling to a former customer, and a 5% to 20% probability of making a sale to a prospect.
- Up to 66% of current users of financial institutions’ social media sites use them to receive information about financial services, 32% to retrieve information about offers or promotions, and 30% to conduct customer-service-related activities.
So what does it take to become more customer-centric?
Companies with successful customer-centric business models share similar traits: cultures that place the customer first, people who are willing to go the extra mile, business processes designed with the customer’s needs in mind, product and marketing strategies designed to meet customers’ needs, and technology solutions that help access and deliver trusted, timely, and comprehensive information and intelligence across the business. These technologies include data integration, data quality, and master data management.
Why is data integration important? Customer centricity begins with the ability to access and integrate your data regardless of format, source system, structure, volume, or latency, from any location including the cloud and social media sites. The data the business needs originates from many different systems across and outside the organization, including new Software-as-a-Service solutions and cloud-based technologies. Traditional hand-coded methods, one-off tools, and open-source data integration tools cannot scale and perform well enough to access, manage, and deliver the right data to the systems and applications on the front lines. At the same time, we live in the Big Data era: increasing transaction volumes and the adoption of new channels, including mobile devices and social media, are generating petabytes of data. Supporting a capable and sustainable customer-centric business model requires technology that can handle this complexity and scale with the business while reducing costs and improving productivity.
Data quality issues must be dealt with proactively and managed by both business and technology stakeholders. Though technology itself cannot prevent all data quality errors, it is a critical part of your customer information management process, ensuring that any issues are identified and dealt with expeditiously. Specifically, you need a data quality solution that can detect data quality errors in any source, allow business users to define data quality rules, let developers seamlessly consume and execute those rules, provide dashboards and reports for business stakeholders, and monitor quality on an ongoing basis to deal with time- and business-sensitive exceptions. Data quality management can only scale and deliver value if an organization believes in and manages data as an asset. It also helps to have a data governance framework consisting of processes, policies, standards, and people from business and IT working together.
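As a rough sketch of the "business users define rules, developers execute them" pattern, the rules below are declared as simple named predicates and applied across records to produce an exception report that could feed a monitoring dashboard. The rule names, fields, and country list are all hypothetical, not from any real product.

```python
# Hypothetical sketch: declarative data-quality rules over customer records.
import re

# Rules a business analyst might declare; each maps a name to a pass/fail check.
rules = {
    "email_present": lambda r: bool(r.get("email")),
    "email_format":  lambda r: re.match(r"[^@]+@[^@]+\.[^@]+", r.get("email") or "") is not None,
    "country_known": lambda r: r.get("country") in {"US", "UK", "DE", "FR"},
}

def run_rules(records, rules):
    """Return one exception row per (record, failed rule), for reports and monitoring."""
    exceptions = []
    for i, rec in enumerate(records):
        for name, check in rules.items():
            if not check(rec):
                exceptions.append({"record": i, "rule": name})
    return exceptions

customers = [
    {"email": "a@bank.com", "country": "US"},   # clean record
    {"email": "",           "country": "XX"},   # fails all three rules
]
issues = run_rules(customers, rules)
```

The design point is the separation: the rule definitions can change without touching the execution loop, which is what lets business stakeholders own the rules while IT owns the pipeline.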
Lastly, growing your business, improving wallet share, retaining profitable relationships, and lowering the cost of managing customer relationships require a single, trusted, holistic, and authoritative source of customer information. Customer information has historically lived in applications across traditional business silos that lacked any common process to reconcile duplicate and conflicting information across business systems. Master Data Management solutions are purposely designed to break down those application and business silos and deliver a single view of the truth for all systems to benefit from. Master Data Management allows banks and insurance companies to access data, identify unique customer entities, relate accounts to each customer, and extend that relationship view across other customers and employees, from relationship bankers and financial advisors to agents and brokers.
The need to attract and retain customers is a continuous journey for the financial industry, but that need is greater than ever before. The foundation for successful customer centricity is technology that helps access and deliver trusted, timely, consistent, and comprehensive customer information and insight across all channels, avoiding the mistakes of the past, keeping you ahead of your competition, and maximizing value for your shareholders.
Sources: 2010 UK Retail Banking Satisfaction Study, J.D. Power and Associates, October 2010; “Customer Winback”; Mortgage Servicing News.
According to the IDC Financial Insights 2013 Predictions report, financial institutions across most regions are getting serious about updating their legacy systems to reduce operating costs, automate labor-intensive processes, improve customer experiences, and avoid costly disruptions. Transforming a bank’s core systems or an insurance provider’s main business systems is a strategic decision with far-reaching implications for the firm’s future business strategies and success. When done right, the capabilities offered in today’s modern banking and insurance platforms can propel a company ahead of its competition. They can also be the nail in the coffin if your data is not migrated correctly, if safeguards are not in place to protect against unwanted data breaches, or if you are unable to decommission the old systems as planned.
One of the most important and critical phases of any legacy modernization project is the process of migrating data from old to new. Migrating data involves:
- Accessing existing data in the legacy systems
- Understanding the data structures that need to be migrated
- Transforming and executing one-to-one mappings to the relevant fields in the new system
- Identifying data quality errors and other gaps in the data
- Validating what is entered into the new system by identifying transformation or mapping errors
- Seamlessly connecting to the target tables and fields in the new system
Sounds easy enough, right? Not so fast!
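To show why it is not so fast, here is a toy sketch of the migration steps above: map legacy fields to a new schema, transform values, and quarantine rows that fail validation rather than loading them. Every field name and value here is made up for illustration; no real core-banking schema looks this simple.

```python
# Hypothetical sketch of a legacy-to-new migration pass with quality quarantine.

legacy_rows = [
    {"CUST_NM": "J. SMITH", "ACCT_BAL": "1024.50", "OPEN_DT": "19990212"},
    {"CUST_NM": "",         "ACCT_BAL": "n/a",     "OPEN_DT": "20010330"},  # dirty row
]

# One-to-one mapping from legacy fields to the new system's schema.
field_map = {"CUST_NM": "customer_name", "ACCT_BAL": "balance", "OPEN_DT": "opened"}

def transform(row):
    """Rename fields per the mapping and coerce types expected by the target."""
    out = {field_map[k]: v for k, v in row.items()}
    out["balance"] = float(out["balance"])   # raises ValueError on bad data
    return out

migrated, errors = [], []
for row in legacy_rows:
    try:
        rec = transform(row)
        if not rec["customer_name"]:
            raise ValueError("missing customer_name")
        migrated.append(rec)                              # ready to load
    except ValueError as exc:
        errors.append({"row": row, "error": str(exc)})    # quality-gap report
```

The second row never reaches the target system; it lands in an error report for remediation, which is exactly the feedback loop a real migration needs before decommissioning the legacy source.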
While Dodd-Frank received most of the media attention after the great financial crisis, during that period the U.S. government also signed into law the Foreign Account Tax Compliance Act (FATCA), in March 2010. FATCA requires Foreign Financial Institutions (FFIs) to report the names of U.S. persons and owners of companies who hold foreign bank accounts, for tax reporting and withholding purposes.
The law was set to go into effect on January 1, 2013; however, on October 24, 2012, the U.S. Internal Revenue Service (IRS) announced a one-year extension to January 1, 2014 to give FFIs more time to implement procedures for meeting the FATCA reporting requirements. Banks that elect not to comply, or that fail to meet these deadlines, will be tagged as ‘non-participating FFIs’ and subject to a 30% withholding tax on all U.S.-sourced income paid to them by U.S. financial institutions. Ouch!
The reasons for FATCA are fairly straightforward: the IRS wants to collect its share of tax revenue from individuals who hold financial accounts and assets in overseas banks. According to industry studies, of the estimated seven million U.S. citizens and green card holders who live or work outside the U.S., less than seven percent file tax returns. Officially, the intention of FATCA is not to raise additional tax revenue but to trace missing, non-compliant taxpayers and return them to the U.S. tax system. Once FATCA goes into effect, the IRS expects to collect an additional $8.7 billion in tax revenue.
Satisfying FATCA reporting requirements will require banks to identify:
- Any customer who may have an existing U.S. tax status.
- Customers who hold a U.S. citizenship or green card.
- Country of birth and residency.
- U.S.-based addresses associated with accounts – incoming and outgoing payments.
- Customers who make recurring payments to the U.S., including electronic transfers to recipient banks located in the U.S.
- Customers who have payments coming from the U.S. to banks abroad.
- Customers with high balances across retail banking, wealth management, asset management, investment banking, and commercial banking business lines.
Although these requirements sound simple enough, there are many data challenges to overcome including:
- Access to account information from core banking systems, customer management and relationship systems, payment systems, databases and desktops across multiple lines of business which can range into the hundreds, if not thousands of individual data sources.
- Data in varying formats and structures, including unstructured documents such as scanned images, PDFs, etc.
- Data quality errors including:
- Incomplete records: Data that is missing or unusable from the source system or file yet required for FATCA identification.
- Non-conforming record types: Data that is available in a non-standard format that does not integrate with data from other systems.
- Inconsistent values: Data values that give conflicting information or have different definitions with similar values.
- Inaccuracy: Data that is incorrect or out of date.
- Duplicates: Data records or attributes are repeated.
- Lack of Integrity: Data that is missing or not referenced in any system.
Most modern core banking systems have built-in data validation checks to ensure that the right values are entered. Unfortunately, many banks continue to operate 20- to 30-year-old systems, many of which were custom built and lack upstream validation capabilities. In many cases, data errors arise when combining ‘like’ data and information from multiple systems. Given the number of data sources and the volume of data that banks deal with, it will be important for FFIs to have capable technology to quickly and accurately profile FATCA source data, identifying errors at the source as well as errors that occur as data is combined and transformed for reporting purposes.
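A profiling pass of the kind described here boils down to scoring each field against the error categories above. The sketch below computes two of the simplest metrics, completeness and format conformity, for a hypothetical `tax_id` field; real profiling tools add many more checks, and the field name and pattern are assumptions for illustration.

```python
# Illustrative data profiling: completeness and format-conformity for one field.
import re

def profile(records, field, pattern=None):
    """Return completeness and (optionally) format-conformity ratios for a field."""
    values = [r.get(field) for r in records]
    present = [v for v in values if v not in (None, "")]
    stats = {"completeness": len(present) / len(values)}
    if pattern:
        conforming = sum(bool(re.fullmatch(pattern, v)) for v in present)
        stats["conformity"] = conforming / max(len(present), 1)
    return stats

accounts = [
    {"tax_id": "123-45-6789"},
    {"tax_id": "987654321"},   # non-conforming format
    {"tax_id": ""},            # incomplete record
]
report = profile(accounts, "tax_id", pattern=r"\d{3}-\d{2}-\d{4}")
```

Run over hundreds of sources, a report like this tells the FATCA team where to spend remediation effort before data is combined for reporting.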
Another data quality challenge facing FFIs will be identifying unique account holders while dealing with the following data anomalies:
- Deciphering names across different languages (山田太郎 vs. Taro Yamada)
- Use of Nicknames (e.g. John, Jonathan, Johnny)
- Concatenation (e.g. Mary Anne vs. Maryanne)
- Prefix / Suffix (e.g. MacDonald vs. McDonald)
- Spelling error (e.g. Potter vs. Porter)
- Typographical error (e.g. Beth vs. Beht)
- Transcription error (e.g. Hannah vs. Hamah)
- Localization (e.g. Stanislav Milosovich vs. Stan Milo)
- Phonetic variations (e.g. Edinburgh – Edinborough)
- Transliteration (e.g. Kang vs. Kwang)
Performing these intricate data validations and matching processes requires technology purpose-built for the job: identity matching and resolution technology that leverages proven probabilistic, deterministic, and fuzzy matching algorithms against data in any language, can process large data sets in a timely manner, and is designed to be used by business analysts rather than IT developers. Most important is being able to deliver the end results into the bank’s FATCA reporting systems and applications, where the business needs them most.
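To give a flavor of how fuzzy matching handles anomalies like the ones listed above, here is a deliberately simplified sketch: normalize nicknames and concatenation, then score string similarity against a threshold. Production identity-resolution engines use far richer probabilistic models and multi-field evidence; the nickname table and the 0.85 threshold are purely illustrative assumptions.

```python
# Simplified fuzzy name matching (NOT a production identity-resolution engine).
from difflib import SequenceMatcher

# Tiny illustrative nickname table; real systems use curated multilingual lists.
NICKNAMES = {"johnny": "john", "jon": "john", "jonathan": "john", "stan": "stanislav"}

def normalize(name):
    """Lowercase, split, and map nicknames to a canonical form."""
    parts = name.lower().replace("-", " ").split()
    return " ".join(NICKNAMES.get(p, p) for p in parts)

def match_score(a, b):
    """Similarity in [0, 1] after nickname/format normalization."""
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio()

def same_person(a, b, threshold=0.85):
    return match_score(a, b) >= threshold

nickname_hit = same_person("Jonathan Smith", "Johnny Smith")   # nickname variant
concat_hit = same_person("Mary Anne Lee", "Maryanne Lee")      # concatenation variant
```

Even this toy version catches nickname and concatenation variants from the list above; handling transliteration, phonetics, and non-Latin scripts is what separates purpose-built matching engines from string comparison.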
As I stated earlier, FATCA impacts both U.S. and non-U.S. banks and is as important to U.S. tax collectors as it is to the health of the global financial and economic markets. Even with the extended deadlines, institutions that lack capable data quality management processes, policies, standards, and enabling technologies must act now or face the penalties defined by Uncle Sam.
In the second of two videos, Peter Ku, Director of Financial Services Solutions Marketing, Informatica, talks about the latest trends regarding counterparty data and how the legal entity identifier (LEI) system will impact banks across the globe.
Specifically, he answers the following questions:
- What are the latest trends regarding counterparty information and how will the legal entity identifier system impact banks across the globe?
- How does Informatica help solve the challenges regarding counterparty information and help banks prepare for the new legal entity identifier system?
Also watch Peter’s first video (http://youtu.be/KvyDPzOTnUY) to learn about counterparty data and its challenges.
In the first of two videos, Peter Ku, Director of Financial Services Solutions Marketing, Informatica, talks about counterparty data and its challenges.
Specifically, he answers the following questions:
- What is counterparty data and why is it important?
- What are some of the challenges that the global banking industry is facing with their counterparty information?
There has been much discussion, particularly in the UK, about banks ring-fencing their investment and retail arms. The thinking behind this is that investment banking is much riskier, so drawing a clear line between the two will better protect consumers if another financial crisis should hit.
By Nancy Atkinson, Senior Analyst, Aite Group
Karen Hsu of Informatica organized a TweetJam (#INFAtj) recently on business-to-business (B2B) payments, SEPA, and integration. In conversation with Chris Skinner of Balatro Ltd., I stayed (mostly) within the 140-character message limit of Twitter while the hour flew by.
I just returned from Informatica World 2010 and wanted to share the numerous stories and experiences from some of our banking and capital markets customers using Informatica beyond Extract/Transform/Load (ETL) and beyond data warehousing. More importantly, I wanted to share how Informatica is helping these companies combat fraud, manage risk and compliance, accelerate M&A integrations, attract and retain customers, and improve operational efficiencies. Take a look at what I learned!