Category Archives: Banking & Capital Markets

BCBS 239 – What Are Banks Talking About?

Earlier this month I participated in an EDM Council panel on BCBS 239 in London and New York. The panel consisted of Chief Risk Officers, Chief Data Officers, and information management experts from the financial industry. BCBS 239 sets out 14 key principles requiring banks to strengthen their risk data aggregation and risk reporting capabilities so that banking regulators can help avoid another 2008-style crisis, with a compliance deadline of January 1, 2016. Earlier this year, the Basel Committee on Banking Supervision released the findings of a self-assessment by the global systemically important banks (G-SIBs) of their readiness against 11 of the 14 BCBS 239 principles.

Given all of the investments the banking industry has made to improve data management and governance practices for ongoing risk measurement and management, I was expecting to hear signs of significant progress. Unfortunately, based on what I heard, there is still much work to be done to satisfy BCBS 239. Here is what we discussed in London and New York.

  • It was clear that the “data agenda” has shifted quite considerably from IT to the business, as evidenced by the number of risk, compliance, and data governance executives in the room. Though it is a good sign that the business is taking more ownership of data requirements, there was limited discussion of the importance of capable data management technology, infrastructure, and architecture to support a successful data governance practice: specifically, data integration, data quality and validation, master and reference data management, metadata to support data lineage and transparency, and business glossary and data ontology solutions to govern the terms and definitions of required data across the enterprise.
  • Accessing, aggregating, and streamlining the delivery of risk data from disparate systems across the enterprise remains difficult. Point-to-point integrations access the same data from the same systems over and over again, creating points of failure and increasing the cost of maintaining the current state. The idea of replacing those point-to-point integrations with a centralized, scalable, and flexible data hub was clearly recognized as a need; however, it is difficult to envision given the enormous work required to modernize the current state.
  • Data accuracy and integrity continue to be a concern when it comes to generating accurate and reliable risk data that meets both normal and stress/crisis reporting requirements. Many in the room acknowledged a heavy reliance on manual methods implemented over the years. Automating data integration and the onboarding of risk data from disparate systems across the enterprise is an important part of Principle 3; however, much of what is in place today was built as one-off projects against the same systems, accessing the same data and delivering it to hundreds if not thousands of downstream applications in an inconsistent and costly way.
  • Data transparency and auditability was a popular conversation point. The need to provide comprehensive data lineage reports that explain how data is captured, from where, how it is transformed, and how it is used remains a concern, despite advancements in technical metadata solutions, because those solutions are not integrated with existing risk management data infrastructure.
  • Lastly, there were big concerns regarding the ability to capture and aggregate all material risk data across the banking group and deliver it by business line, legal entity, asset type, industry, region, and other groupings to support identifying and reporting risk exposures, concentrations, and emerging risks. This master and reference data challenge unfortunately cannot be solved by external data utility providers alone, because banks have legal entity, client, counterparty, and securities instrument data residing in existing systems and need the ability to cross-reference any external identifier against those records for consistent reporting and risk measurement (a simplified illustration of this cross-referencing follows the list).
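
To make that cross-referencing point concrete, here is a deliberately simplified sketch. Everything in it is invented for illustration (the class and record names, identifiers, business lines, amounts, and the in-memory map standing in for a master data store); the pattern it shows is simply resolving each source system's local counterparty ID to a golden legal entity identifier before aggregating exposures, and routing anything that cannot be resolved to data stewardship rather than silently dropping it.

```java
// Hypothetical sketch: cross-referencing external/internal identifiers before risk aggregation.
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class RiskAggregationSketch {

    // One exposure record as it might arrive from a source system.
    record Exposure(String sourceSystem, String localCounterpartyId, String businessLine, double amount) {}

    public static void main(String[] args) {
        // Master cross-reference: (source system, local ID) -> golden legal entity identifier.
        Map<String, String> xref = new HashMap<>();
        xref.put("LOANS|CPTY-001", "LEI-5493001KJTIIGC8Y1R12");
        xref.put("TRADING|CP-9987", "LEI-5493001KJTIIGC8Y1R12"); // same legal entity, different local ID

        List<Exposure> exposures = List.of(
                new Exposure("LOANS", "CPTY-001", "Commercial Lending", 25_000_000),
                new Exposure("TRADING", "CP-9987", "Rates", 10_000_000));

        // Aggregate by golden identifier and business line; park unmatched records for stewardship.
        Map<String, Double> aggregated = new HashMap<>();
        List<Exposure> unmatched = new ArrayList<>();
        for (Exposure e : exposures) {
            String golden = xref.get(e.sourceSystem() + "|" + e.localCounterpartyId());
            if (golden == null) {
                unmatched.add(e); // data quality exception: no master record found
                continue;
            }
            aggregated.merge(golden + "|" + e.businessLine(), e.amount(), Double::sum);
        }
        aggregated.forEach((key, total) -> System.out.printf("%s -> %,.0f%n", key, total));
        System.out.println("Unmatched records needing stewardship: " + unmatched.size());
    }
}
```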

To sum it up, most banks admit they have a lot of work to do. Specifically, they must address gaps across their data governance and technology infrastructure. BCBS 239 is the latest and biggest data challenge facing the banking industry, and not just for the G-SIBs: mid-size firms will also be required to provide similar transparency to regional regulators who are adopting BCBS 239 as a framework for their local markets. BCBS 239 is not just a deadline; its principles are a key requirement for banks to ensure they have the right data to manage risk and to give industry regulators the transparency they need to monitor systemic risk across the global markets. How ready are you?


The Ones Not Screwing Up with Compliance Will Win

A few weeks ago, a regional US bank asked me to perform some compliance and use case analysis to help fix their data management situation.  This bank prides itself on customer service and an SMB focus while offering large-bank products.  However, it was about a decade behind most banks in modernizing its IT infrastructure to stay operationally on top of things.


Bank Efficiency Ratio per AUM (Assets under Management), bankregdata.com

This included technologies like ESB, BPM, and CRM.  The bank was also a sub-optimal user of EDW and analytics capabilities. Having said all this, there was a commitment to change things up, which is always a needed first step to any recovery program.

THE STAKEHOLDERS

As I conducted my interviews across various departments (listed below), it became very apparent that they were not suffering from data poverty (see prior post) but from a lack of accessibility and use of data.

  • Compliance
  • Vendor Management & Risk
  • Commercial and Consumer Depository products
  • Credit Risk
  • HR & Compensation
  • Retail
  • Private Banking
  • Finance
  • Customer Solutions

FRESH BREEZE

This lack of use occurred across the board.  The natural reaction was to throw more bodies and more Band-Aid data marts at the problem.  Users also started to operate under the assumption that it would never get better; they simply resigned themselves to mediocrity.  When some new players came into the organization from various systemically critical banks, they shook things up.

Here is a list of use cases they want to tackle:

  • The proposition of real-time offers based on customer events, as simple as offering investment banking products when an unusually high inflow of cash hits a deposit account (a minimal sketch of such a rule follows this list).
  • The use of all mortgage application information to understand debt/equity ratio to make relevant offers.
  • The capture of true product and customer profitability across all lines of commercial and consumer products including trust, treasury management, deposits, private banking, loans, etc.
  • The agile evaluation, creation, testing, and deployment of new terms on existing products and products under development, shortening the product development life cycle.
  • The reduction of wealth management advisors’ time to research clients and prospects.
  • The reduction of unclaimed use tax, insurance premiums, and leases being paid on consumables, real estate, and requisitions due to incorrect status and location of equipment, e.g., assets no longer owned, scrapped, or moved to a different department.
  • More efficient reconciliation between transactional systems and finance, where finance often uses multiple party IDs per contract change in accounts receivable while the operating division uses one ID based on a contract and its addendums.  An example would be vendor payment consolidation to create a true supplier spend and thus take advantage of volume discounts.
  • The proactive creation of a central compliance footprint (AML, Section 314, suspicious activity, CTR, etc.), allowing for quicker turnaround and fewer audit instances arising from MRAs (matters requiring attention).
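
As a rough sketch of the first use case in the list, the toy rule below raises a next-best-offer event when a deposit inflow is far above a customer's trailing average. The class, record, and field names, the offer code, and the thresholds are all hypothetical; a real implementation would be calibrated, governed, and driven by an event stream rather than a hard-coded list.

```java
// Hypothetical sketch: flag an unusually large deposit inflow and propose an offer for it.
import java.time.Instant;
import java.util.List;
import java.util.function.Consumer;

public class InflowOfferSketch {

    record DepositEvent(String customerId, double amount, double trailingAverage, Instant at) {}

    record OfferEvent(String customerId, String offerCode, Instant at) {}

    // "Unusually high" is a placeholder rule here: 10x the customer's trailing average
    // and above an absolute floor.
    static boolean isUnusualInflow(DepositEvent e) {
        return e.amount() > 100_000 && e.amount() > 10 * e.trailingAverage();
    }

    static void evaluate(List<DepositEvent> events, Consumer<OfferEvent> offerChannel) {
        for (DepositEvent e : events) {
            if (isUnusualInflow(e)) {
                offerChannel.accept(new OfferEvent(e.customerId(), "WEALTH-INTRO", Instant.now()));
            }
        }
    }

    public static void main(String[] args) {
        List<DepositEvent> events = List.of(
                new DepositEvent("CUST-42", 250_000, 8_000, Instant.now()),
                new DepositEvent("CUST-77", 1_200, 1_500, Instant.now()));
        evaluate(events, offer -> System.out.println(
                "Offer " + offer.offerCode() + " proposed for " + offer.customerId()));
    }
}
```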

MONEY TO BE MADE – PEOPLE TO SEE

Adding these up came to about $31 million to $49 million annually in cost savings, new revenue, or increased productivity for this bank with $24 billion in total assets.

So now that we know there is money to be made by fixing the data of this organization, how can we realistically roll this out in an organization with many competing IT needs?

The best way to go about this is to attach any kind of data management project to a larger, business-oriented project, like CRM or EDW.  Rather than wait for these to go live without good seed data, why not feed them with better data as a key work stream within their respective project plans?

To summarize my findings, I want to quote three people I interviewed.  A lady who recently had to struggle through an OCC audit told me she believes that the banks that can remain compliant at the lowest cost will ultimately win the end game, particularly among tier 2 and tier 3 organizations.  A gentleman from commercial banking left me with this statement: “Knowing what I know now, I would not bank with us.”  The same lady also said, “We engage in spreadsheet kung fu” to bring data together.

Given all this, what would you suggest?  Have you worked with an organization like this? Did you encounter any similar or different use cases in financial services institutions?


Customer Centric Financial Services

The business of financial services is transforming before our eyes. Traditional banking and insurance products have become commoditized. As each day passes, consumers demand increasingly personalized products and services. Social and mobile channels continue to overthrow traditional communication methods. To survive and grow in this complex environment, financial institutions must do three things:

  1. Attract and retain the best customers
  2. Grow wallet share
  3. Deliver top-notch customer experience across all channels and touch points

The finance industry has traditionally been either product centric or account centric. However, to succeed in the future, financial institutions must become customer centric. Becoming customer-centric requires changes to your people, processes, technology, and culture. You must offer the right product or service to the right customer, at the right time, via the right channel. To achieve this, you must ensure alignment between business and technology leaders, and make targeted investments to grow the business, particularly to modernize legacy systems.

To become customer-centric, business executives are investing in Big Data and in legacy modernization initiatives. These investments are helping Marketing, Sales and Support organizations to:

  • Improve conversion rates on new marketing campaigns and on cross-sell and up-sell activities
  • Measure customer sentiment on particular marketing and sales promotions or on the financial institution as a whole
  • Improve sales productivity ratios by targeting the right customers with the right product at the right time
  • Identify key indicators that determine and predict profitable and unprofitable customers
  • Deliver an omni-channel experience across all lines of business, devices, and locations

At Informatica, we want to help you succeed. We want you to maximize the value of these investments. For this reason, we’ve written a new eBook titled “Potential Unlocked – Improving revenue and customer experience in financial services”. In the eBook, you will learn:

  • The role customer information plays in taking customer experience to the next level
  • Best practices for shifting account-centric operations to customer-centric operations
  • Common barriers and pitfalls to avoid
  • Key considerations and best practices for success
  • Strategies and experiences from best-in-class companies

Take a giant step toward Customer-Centricity: Download the eBook now.


Murphy’s First Law of Bad Data – If You Make A Small Change Without Involving Your Client – You Will Waste Heaps Of Money

I had not used my personal encounter with bad data management in over a year, but a couple of weeks ago I was compelled to revive it.  Why, you ask? Well, a complete stranger started to receive text messages meant for one of my friends, including messages from me, and it took days for him to detect it; a week later, nobody at this North American wireless operator had been able to fix it.  This coincided with a meeting I had with a European telco’s enterprise architecture team.  There was no better way to illustrate to them how a customer reacts, and the risk to their operations, when communication breaks down because just one tiny thing changes, say, an address (or in the SMS case, some random SIM mapping, another type of address).

Imagine the cost of other bad data (thecodeproject.com)

In my case, I moved about 250 miles within the United States a couple of years ago, and this seemingly common experience triggered a plethora of communication screw-ups with every merchant a residential household engages with frequently: your bank, your insurer, your wireless carrier, your average retail clothing store, and so on.

For more than two full years after my move to a new state, the following things continued to pop up on a monthly basis due to my incorrect customer data:

  • My old satellite TV provider got to me (the correct person) but with a misspelled last name at my correct, new address.
  • My bank put me in a bit of a pickle: they sent “important tax documentation” that I did not want to open, as my new tenants’ names (for the house I had just vacated) were on the letter but with my new home’s address.
  • My mortgage lender sent a refinancing offer to my new address (right person, right address) but with both my wife’s name and mine completely butchered.
  • My wife’s airline, where she enjoys the highest level of frequent flyer status, continually mails her offers duplicating her last name as her first name.
  • A high-end furniture retailer sends two 100-page glossy catalogs probably costing $80 each to our address – one for me, one for her.
  • A national health insurer sends “sensitive health information” (disclosed on envelope) to my new residence’s address but for the prior owner.
  • My legacy operator turns on the wrong premium channels on half my set-top boxes.
  • The same operator sent me an SMS the next day thanking me for switching to electronic billing as part of my move, which I had not signed up for, followed by payment notices (as I did not get my invoice in the mail).  When I called this error out over the next three months through their contact center, pointing out how much revenue I generate for them across all services, they countered with “sorry, we don’t have access to the wireless account data”, “you will see it change on the next bill cycle” and “you show as paper billing in our system today”.

Ignoring the potential for data privacy lawsuits, you start wondering how long you have to be a customer, and how much money you need to spend with a merchant (and they need to waste), for them to take changes to your data more seriously.  And these are not even merchants to whom I am brand new: these companies have known me and taken my money for years!

One thing I nearly forgot: these mailings all happened at least once a month on average, sometimes twice, over two years.  If I do some pigeon math here, I would estimate the postage and production cost alone to run into the hundreds of dollars.

However, the most egregious trespass belonged to my homeowner’s insurance carrier (HOI), who was also my mortgage broker.  They had a double whammy in store for me.  First, I received a cancellation notice from the HOI for my old residence indicating they had cancelled my policy because the last payment was not received, and that any claims would be denied as a consequence.  Then, my new residence’s HOI advised that they had added my old home’s HOI to my account.

After wondering what I could have possibly done to trigger this, I called all four parties (not three as the mortgage firm did not share data with the insurance broker side – surprise, surprise) to find out what had happened.

It turns out that I had to explain and prove to all of them how one party’s data change during my move erroneously exposed me to liability.  It felt like the old days, when seedy telco salespeople needed only your name and phone number, associated with some promotion you never took part in (the back of a raffle card to win a new car), to switch your long-distance carrier and present you with a $400 bill the coming month.  Yes, that also happened to me, many years ago.  Here again, the consumer had to do all the legwork when someone (not an automatic process!) switched some entry without any oversight or review, triggering hours of wasted effort on their side and mine.

We can argue all day long about whether these screw-ups are due to bad processes or bad data, but in all reality even processes are triggered by some sort of underlying event, which can be something as mundane as a database field’s flag being updated when your last purchase puts you in a new marketing segment.

Now imagine you get married and your wife changes her name. With all these company-internal (CRM, billing, ERP), free public (property tax), commercial (credit bureaus, mailing lists) and social media data sources out there, you would think such everyday changes could get picked up more quickly and automatically. If not automatically, then should there not be some sort of trigger to kick off a “governance” process, something along the lines of “email or call the customer if attribute X has changed” or “please log into your account and update your information – we heard you moved”? If American Express was able to detect ten years ago that someone purchased $500 worth of product with your credit card at a gas station or some lingerie website known for fraudulent activity, why not your bank or insurer, who know even more about you? And yes, that happened to me as well.
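
As a toy illustration of that trigger idea, the sketch below compares the before and after images of a customer record and flags changes to a small set of governed attributes so a confirmation step can be kicked off. The attribute names, the governed set, and the notification step are all hypothetical.

```java
// Hypothetical sketch: kick off a governance step when a governed customer attribute changes.
import java.util.Map;
import java.util.Objects;
import java.util.Set;

public class AttributeChangeTriggerSketch {

    // Attributes whose changes should trigger a confirmation workflow rather than being
    // silently propagated downstream. The set is illustrative only.
    private static final Set<String> GOVERNED = Set.of("address", "lastName", "email");

    static void onCustomerUpdate(String customerId, Map<String, String> before, Map<String, String> after) {
        for (String attribute : GOVERNED) {
            if (!Objects.equals(before.get(attribute), after.get(attribute))) {
                // In a real process this might email or call the customer, open a data stewardship
                // case, and hold downstream syndication of the change until it is confirmed.
                System.out.printf("Governance check: %s changed for %s (%s -> %s)%n",
                        attribute, customerId, before.get(attribute), after.get(attribute));
            }
        }
    }

    public static void main(String[] args) {
        onCustomerUpdate("CUST-42",
                Map.of("address", "12 Old Street", "lastName", "Smith"),
                Map.of("address", "98 New Avenue", "lastName", "Smith"));
    }
}
```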

Tell me about one of your “data-driven” horror scenarios.


Open Source, Next Generation Data Encoding

Today is an exciting day for technology in high performance electronic trading. By the time you read this, the CME Group, Real Logic Ltd., and Informatica will have announced a new open source initiative. I’ve been collaborating on this work for a few months and I feel it is some great technology. I hope you will agree.

Simple Binary Encoding (SBE) is an encoding for FIX that is being developed by the FIX protocol community as part of its High Performance Working Group. The goal is to produce a binary encoding representation suitable for low-latency financial trading. The CME Group, Real Logic, and Informatica have sponsored the development of an open source implementation of an early version of the SBE specification, undertaken by Martin Thompson (of Real Logic, formerly of LMAX) and myself, Todd Montgomery (of Informatica). The implementation delivers a very high performance encoding/decoding mechanism for data layout that is tailored not just to the demands of high-performance, low-latency trading applications, but has implications for all manner of serialization and marshaling, in use cases from Big Data analytics to device data capture.

Financial institutions, and other businesses, need to serialize data structures for transmission over networks as well as for storage. SBE is a developing standard for how to encode and decode FIX data structures over a binary medium at high speed with low latency. The SBE project is most similar to Google Protocol Buffers. However, looks are quite deceiving: SBE is an order of magnitude faster and immensely more efficient for encoding and decoding. This focus on performance means application developers can turn their attention to the application logic instead of the details of serialization. There are a number of advantages to SBE beyond speed, although speed is of primary concern.

  • SBE provides a strong typing mechanism in the form of schemas for data objects
  • SBE only generates the overhead of versioning if the schema needs to handle versioning and if so, only on decode
  • SBE uses an Intermediate Representation (IR) for decoupling schema specification, optimization, and code generation
  • SBE’s use of IR will allow it to provide various data layout optimizations in the near future
  • SBE initially provides Java, C++98, and C# code generators with more on the way

What breakthrough has led to SBE being so fast?

It isn’t new or a breakthrough. SBE has been designed and implemented with the concepts and tenets of Mechanical Sympathy. Most software is developed with abstractions to mask away the details of CPU architecture, disk access, OS concepts, etc. Not so for SBE. It has been designed by Martin and me using everything we know about how CPUs, memory, compilers, managed runtimes, etc. work, making it very fast by working _with_ the hardware instead of against it.
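
To make this concrete without reproducing the actual generated SBE API, here is a stripped-down sketch of the flyweight, zero-copy style of codec that SBE generates: fields are written to and read from fixed offsets in a buffer, so encoding and decoding allocate no intermediate objects. The message layout, field names, and class are invented for illustration; real SBE codecs are generated from an XML message schema and add strong typing and versioning on top of this basic pattern.

```java
// Hypothetical sketch of a flyweight codec: no per-message object allocation, no copying.
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class FlyweightSketch {

    // Fixed layout: priceMantissa (8 bytes) | quantity (4 bytes) | side (1 byte)
    private static final int PRICE_OFFSET = 0;
    private static final int QTY_OFFSET = 8;
    private static final int SIDE_OFFSET = 12;

    private ByteBuffer buffer;
    private int offset;

    FlyweightSketch wrap(ByteBuffer buffer, int offset) {
        this.buffer = buffer;
        this.offset = offset;
        return this;
    }

    FlyweightSketch price(long mantissa) { buffer.putLong(offset + PRICE_OFFSET, mantissa); return this; }
    FlyweightSketch quantity(int qty)    { buffer.putInt(offset + QTY_OFFSET, qty);         return this; }
    FlyweightSketch side(byte side)      { buffer.put(offset + SIDE_OFFSET, side);          return this; }

    long price()   { return buffer.getLong(offset + PRICE_OFFSET); }
    int quantity() { return buffer.getInt(offset + QTY_OFFSET); }
    byte side()    { return buffer.get(offset + SIDE_OFFSET); }

    public static void main(String[] args) {
        ByteBuffer wire = ByteBuffer.allocateDirect(64).order(ByteOrder.LITTLE_ENDIAN);
        FlyweightSketch codec = new FlyweightSketch();

        // Encode directly into the buffer that would go onto the wire.
        codec.wrap(wire, 0).price(1_234_500).quantity(200).side((byte) 'B');

        // Decode by re-wrapping the same bytes; no copy, no allocation per message.
        codec.wrap(wire, 0);
        System.out.println(codec.price() + " x " + codec.quantity() + " side=" + (char) codec.side());
    }
}
```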

Martin’s blog will have a more detailed, technical discussion of SBE later on. But I encourage you to look at it and try it out. The work is open to the public under an Apache license.

Find out more on the FIX/SBE specification and SBE on github.

———————————————–

Todd Montgomery

Todd L. Montgomery is a Vice President of Architecture for Informatica and the chief designer and implementer of the 29West low latency messaging products. The Ultra Messaging product family (formerly known as LBM) has over 190 production deployments within electronic trading across many asset classes and pioneered the broker-less messaging paradigm. In the past, Todd has held architecture positions at TIBCO and Talarian as well as lecture positions at West Virginia University, contributed to the IETF, and performed research for NASA in various software fields. With a deep background in messaging systems, high performance systems, reliable multicast, network security, congestion control, and software assurance, Todd brings a unique perspective tempered by over 20 years of practical development experience.


There Is A Silver Bullet – Really!

Last month, in The Biggest Dirty Little Secret in IT, I highlighted a disturbing phenomenon: in highly data-driven organizations with large IT departments, as IT gets larger it becomes less efficient.  In short, diseconomies of scale begin to creep in, which slow down processes and drive up costs. The article went on to identify the root cause as a high degree of manual IT processes, which don’t scale well. The question I will address in this article is what we can do to tackle the problem, and what it is worth. (more…)


Managing Risk and Compliance in Financial Services with Informatica Vibe!

Last week at Informatica World 2013, Informatica introduced Vibe, the industry’s first and only embeddable virtual data machine (VDM), designed to embed data management into the next generation of applications for the integrated information age. This unique capability offers banks and insurance companies technology to scale and improve their data management, integration, and governance processes so they can manage risk and ensure ongoing compliance with a host of industry regulations, from Basel III and Dodd-Frank to Solvency II. Why is Vibe unique, and how does it help with risk management and regulatory compliance?

The data required for risk and compliance originates from tens if not hundreds of systems across all lines of business, including loan origination systems, loan servicing, credit card processors, deposit servicing, securities trading, brokerage, call center, online banking, and more. That is not to mention external data providers for market, pricing, positions, and corporate actions information.  The volumes are greater than ever, the systems range from legacy mainframe trading systems to mobile banking applications, the formats vary across the board from structured to semi-structured to unstructured, and a wide range of data standards must be dealt with, including MISMO®, FpML®, FIX®, ACORD®, and SWIFT, to name a few.  Take all of that into consideration and the data administration, management, governance, and integration work required is massive, multifaceted, and fraught with risk and hidden costs, often caused by custom-coded processes or the use of standalone tools.

The Informatica Platform and Vibe can help by allowing our customers to take advantage of ever-evolving data technologies and innovations without having to recode, and by establishing a lean data management process that turns unique works of art into reusable artifacts across the information supply chain. In other words, Vibe powers the unique “Map Once. Deploy Anywhere.” capabilities of the Informatica Platform, which accelerate project delivery by 5x, make the entire data lifecycle easier to manage, and eliminate the risks, costs, and short-lived value associated with hand coding or using standalone tools to do this work.  Here are some examples of Vibe for risk and compliance:

  • Build data quality rules to standardize address information, remove or consolidate duplicates, and translate or standardize reference data and other critical information used to calculate risk, within your ETL process or as a “data quality validation” service in upstream systems
  • Build rules to standardize wire transfer data to the latest SWIFT formats within your payment hubs, and leverage the same rules when facilitating payment transactions with your counterparties.
  • Build and execute complex parsing and transformation processes, leveraging the power of Hadoop to handle large volumes of structured and unstructured data for analytics, and utilize the same rules in downstream credit, operational, and market risk data warehouses.
  • Define standard data masking rules once and leverage them both when using data with sensitive information for testing and development and when enforcing data access rights for ongoing data privacy compliance (a conceptual sketch of this define-once, reuse-everywhere idea follows the list).
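
The sketch below is conceptual only and does not use any Informatica API or product; it simply illustrates the last bullet’s idea of defining a masking rule once and reusing it in more than one context: provisioning a masked test data set and masking a value for on-screen display. The class name, rule, and account numbers are invented.

```java
// Hypothetical sketch: one masking rule, defined once, reused in two different contexts.
import java.util.List;
import java.util.function.UnaryOperator;

public class MaskingRuleSketch {

    // A single, named rule: keep the last four characters, mask the rest.
    static final UnaryOperator<String> MASK_ACCOUNT_NUMBER = value -> {
        if (value == null || value.length() <= 4) {
            return value;
        }
        return "*".repeat(value.length() - 4) + value.substring(value.length() - 4);
    };

    public static void main(String[] args) {
        // Context 1: provisioning a masked test data set.
        List<String> productionAccounts = List.of("4929123456781234", "370012345678901");
        List<String> testAccounts = productionAccounts.stream().map(MASK_ACCOUNT_NUMBER).toList();
        System.out.println("Test data set: " + testAccounts);

        // Context 2: enforcing display-level privacy with the same rule.
        System.out.println("On-screen value: " + MASK_ACCOUNT_NUMBER.apply("4929123456781234"));
    }
}
```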

The “Map Once. Deploy Anywhere.” capabilities inherent to Vibe drive:

  • Faster adoption of new technologies and data – Banks and insurance companies can take rapid advantage of new data and technologies without having to know the details of the underlying platform or having to hire highly specialized and costly programming resources.
  • Reduced complexity through insulation from change – When data type, volume, source, platform or users change, financial institutions can simply redeploy their existing data integration instructions without re-specification, redesign or redevelopment on a new integration technology – like Hadoop.

Vibe is NOT a new product offering. It is a unique capability that Informatica supports through our existing platform, comprising our Data Integration, Data Quality, Master Data Management, and Informatica Information Lifecycle Management products.  Whether it is Dodd-Frank, Basel III, FATCA, or Solvency II, with Vibe, banks and insurance companies can ensure they have the right data and increase their potential to improve how they measure risk and ensure regulatory compliance. Visit the Banking & Capital Markets and Insurance industry solutions section of our website for more information on how we help today’s global financial services industry.


Sub-100 Nanosecond Pub/Sub Messaging: What Does It Matter?

Our announcement last week was an exciting milestone for those of us who started at 29West supporting the early high-frequency traders from 2004 to 2006. Last week, we announced the next step in a 10-year effort that has now seen us lower the bar for low-latency messaging by six orders of magnitude, in Version 6.1 of Informatica Ultra Messaging with Shared Memory Acceleration (SMX). The really cool thing is that we have helped early customers like Intercontinental Exchange and Credit Suisse take advantage of the reduction from 2.5 million nanoseconds (ns) of latency to as low as 37 ns today on commodity hardware and networks, without having to switch products or do major rewrites of their code.

But as I said in the title, what does it matter? Does being able to send messages to multiple receivers within a single-box trading system or order-matching engine in 90 ns, as opposed to one microsecond, really make a difference?

Well, according to a recent article by Scott Appleby on TabbFORUM, “The Death of Alpha on Wall Street,”* the only way for investment banks to find alpha, or excess returns, is “to find valuation correlations among markets to extract microstructure alpha”.  He states, “Getco, Tradebot and Renaissance use technology to find valuation correlations among markets to extract microstructure alpha; this still works, but requires significant capital.”  What those extra hundreds of nanoseconds that SMX frees up allow a company to do is make their matching algorithms or order routers that much smarter, by doing dozens of additional complex calculations before the computer makes a decision. Furthermore, by letting the messaging layer take over the integration of software components that may be less critical to producing alpha (but very important for operational risk control, such as guaranteeing that messages can be captured off the single-box trading system for compliance and disaster recovery), busy software developers can focus on changes in the microstructure of the markets.

The key SMX innovation is another “less is more” style engineering feat from our team. Basically, SMX eliminates any copying of messages from the message delivery path. And of course, if the processes in your trading system happen to be running within the same CPU, on the same or different cores, messages are being sent within the memory cache of the core or CPU.  The other reason this matters is that because this product uniquely (as far as I know) allows zero-copy shared memory communication between Java, C, and Microsoft .NET applications, developers can fully leverage the best features and the knowledge of their teams to deploy complex high-performance applications. For example, this allows third-party feed handlers built in C to communicate at extremely low latencies with algo engines written in Java.
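
To give a feel for the shared-memory idea without reproducing the Ultra Messaging API, here is a much-simplified, single-process sketch: a “producer” writes message fields directly into a memory-mapped region and a “consumer” reads them in place, so the message body is never copied between intermediate buffers. The file name, layout, and sequence-number publish step are invented, and a real transport would also handle memory ordering, framing, and multiple receivers.

```java
// Hypothetical sketch: passing a message through a shared, memory-mapped region in place.
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class SharedMemorySketch {

    // Layout inside the shared region: int sequence | long priceMantissa | int quantity
    private static final int SEQUENCE_OFFSET = 0;
    private static final int PRICE_OFFSET = 4;
    private static final int QTY_OFFSET = 12;

    public static void main(String[] args) throws IOException {
        Path path = Files.createTempFile("smx-sketch", ".dat");
        path.toFile().deleteOnExit();
        try (FileChannel channel = FileChannel.open(path,
                StandardOpenOption.READ, StandardOpenOption.WRITE)) {
            MappedByteBuffer shared = channel.map(FileChannel.MapMode.READ_WRITE, 0, 4096);

            // "Producer": write the fields, then publish by bumping the sequence number.
            shared.putLong(PRICE_OFFSET, 1_234_500L);
            shared.putInt(QTY_OFFSET, 200);
            shared.putInt(SEQUENCE_OFFSET, 1);

            // "Consumer": observe the sequence, then read the fields in place -- no copy of the body.
            if (shared.getInt(SEQUENCE_OFFSET) > 0) {
                System.out.println("price=" + shared.getLong(PRICE_OFFSET)
                        + " qty=" + shared.getInt(QTY_OFFSET));
            }
        }
    }
}
```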

So congrats to the UM development team for achieving this important milestone, and thanks to our customers for continuing to push us to provide you with that “lagniappe” of extra time that can make all the difference in the success of your trading strategies and your businesses.

 

*- http://tabbforum.com/opinions/the-death-of-alpha-on-wall-street?utm_source=TabbFORUM+Alerts&utm_campaign=1c01537e42-UA-12160392-1&utm_medium=email&utm_term=0_29f4b8f8f1-1c01537e42-270859141


Leading Indicator: 89% of Financial Service Firms Have Adopted ICCs

The latest survey by Informatica Professional Services shows that 59% of enterprises have, or are in the process of, implementing an ICC. The figures vary greatly by industry however.  For example, in Financial Service Firms the percentage is 89% while for public sector organizations it is just 25%. What can we take from this? (more…)


Financial Services Sessions at Informatica World 2013

Data is one of the most important and valuable assets banks and insurance companies across the globe have to help them comply with industry regulations, improve customer experience, find new revenue opportunities, and reduce the cost of doing business. These are universal needs and challenges, and Informatica’s industry-leading solutions have helped over 780 financial services institutions increase their potential to achieve business success.

At Informatica World 2013, June 4-7 at the Aria Resort and Casino in Las Vegas, Nevada, we will be showcasing a wealth of valuable information to help you maximize value from your data assets and technology investments. The event includes over 100 interactive and informative breakout sessions across 6 dedicated tracks (Platform & Products, Architecture, Best Practices, Big Data, Hybrid IT, and Tech Talk).

There will also be a financial services path with guest speakers from the banking and insurance industries as well as our Financial Services experts, including:

  • Morgan Stanley Wealth Management: Accelerating Business Growth While Protecting Sensitive Data: Find out how Morgan Stanley built one of the largest Informatica platform deployments to mask and process over 150,000 objects used by more than 1,000 applications globally and comply with today’s data privacy regulations.
  • Wells Fargo Bank’s Data Governance Journey with Informatica: Hear and learn about Wells Fargo’s data governance strategy, program, and how Informatica is used to deliver actionable, transparent, and trusted data to the business.
  • Liberty Mutual Insurance: Architecture and Best Practices with Informatica Data Integration: Learn how Informatica Data Integration’s metadata-driven architecture helps scale and support large data volumes and meet Liberty Mutual’s enterprise demands for performance and compliance.
  • Addressing Top Business Priorities in Banking and Insurance with MDM: Peter Ku, Senior Director of Financial Services Industry Solutions, shares how Master Data Management is being used in banking and insurance to help address top business imperatives, from regulatory compliance to finding new revenue opportunities.

Register today at www.informaticaworld.com. I look forward to seeing you there!
