Category Archives: Financial Services
As I indicated in my last case study on competing on analytics, Thomas H. Davenport believes “business processes are among the last remaining points of differentiation.” For this reason, Davenport contends that businesses that create a sustainable right to win use analytics to “wring every last drop of value from their processes.” For financial services, the mission-critical areas needing process improvement center around improving the consistency of decision making and making the management of regulatory and compliance obligations more efficient and effective.
Why does Fannie Mae need to compete on analytics?
Fannie Mae is in the business of enabling people to buy, refinance, or rent homes. As a part of this, Fannie Mae says it is all about keeping people in their homes and getting people into new homes. Foundational to this mission is the accurate collection and reporting of data for decision making and risk management. According to Tracy Stephan at Fannie Mae, their “business needs to have the data to make decisions in a more real time basis. Today, this is all about getting the right data to the right people at the right time”.
Fannie Mae claims that when the mortgage crisis hit, many of the big banks stopped lending, and Fannie Mae, among others, needed to pick up the slack. This action, however, caused the Federal Government to require it to report monthly and quarterly against goals that the Federal Government set for it. “This meant that there was not room for error in how data gets reported.” In the end, Fannie Mae says three business imperatives drove its need to improve its reporting and its business processes:
- To ensure that go forward business decisions were made consistently using the most accurate business data available
- To avoid penalties by adhering to Dodd-Frank and other regulatory requirements established for it after the 2008 Global Financial Crisis
- To comply with reporting to the Federal Reserve and Wall Street regarding overall business risk as a function of data quality and accuracy, credit-worthiness of loans, and risk levels of investment positions
Delivering required Fannie Mae to change how it managed data
Given these business imperatives, IT leadership quickly realized it needed to enable the business to use data to truly drive better business processes from one end of the organization to the other. This meant enabling Fannie Mae’s business operations teams to manage data more effectively and efficiently. Fannie Mae therefore determined that it needed a single source of truth, whether for mortgage applications or for passing information securely to investors. This need required Fannie Mae to establish the ability to share the same data across every Fannie Mae repository.
But there was a problem. Fannie Mae needed clean and correct data collected and integrated from more than 100 data sources. Fannie Mae determined that its current data processes could not scale to do so, and that those processes would not allow it to meet its compliance reporting requirements. At the same time, Fannie Mae needed to manage compliance more proactively. This required that it know how critical business data enters and flows through each of its systems, including how data is changed by multiple internal processing and reporting applications. Fannie Mae leadership also felt that this was critical to ensure traceability to the individual user.
Per its discussions with business customers, Fannie Mae’s IT leadership determined that it needed real-time, trustworthy data to improve its business operations, business processes, and decision making. As noted, these requirements could not be met with its historical approaches to integrating and managing data.
Fannie Mae determined that it needed to create a platform that was highly available and scalable and that largely automated its data quality management. At the same time, the platform needed to provide the ability to create a set of business glossaries with clear data lineage. In effect, Fannie Mae needed a single source of truth across all of its business systems. According to Tracy Stephan, IT Director, Fannie Mae, “Data quality is the key to the success of Fannie Mae’s mission of getting the right people into the right homes. Now all our systems look at the same data – that one source of truth – which gives us great comfort.” To learn more about how Fannie Mae improved its business processes and demonstrated that it is truly “data driven”, please click on this video of their IT leadership.
Solution Brief: The Intelligent Data Platform
Thomas Davenport Book “Competing On Analytics”
Competing on Analytics
The Business Case for Better Data Connectivity
The CFO Viewpoint upon Data
What an enlightened healthcare CEO should tell their CIO?
This was a great week of excitement and innovation here in San Francisco. It started with the San Francisco Giants winning the National League Pennant for the third time in five years on the same day that Salesforce’s Dreamforce 2014, its largest customer conference yet, wrapped up with more than 140,000 attendees from all over the world talking about the new Customer Success Platform.
Salesforce has come a long way from its humble beginnings as the new kid on the cloud front for CRM. The integrated sales, marketing, support, collaboration, application, and analytics capabilities of the Salesforce Customer Success Platform exemplify innovation and significant business value upside for many industries, and I see it as especially promising for today’s financial services industry. However, like any new business application, the value a business gains from it depends on having the right data available for the business.
The reality is, SaaS adoption by financial institutions has not been as quick as other industries due to privacy concerns, regulations that govern what data can reside in public infrastructures, ability to customize to fit their business needs, cultural barriers within larger institutions that critical business applications must reside on-premise for control and management purposes, and the challenges of integrating data to and from existing systems with SaaS applications. However, experts are optimistic that the industry may have turned the corner. Gartner (NYSE:IT) asserts more than 60 percent of banks worldwide will process the majority of their transactions in the cloud by 2016. Let’s take a closer look at some of the challenges and what’s required to overcome these obstacles when adopting cloud solutions to power your business.
Challenge #1: Integrating and sharing data between SaaS and on-premise must not be taken lightly
Most banks and insurance companies considering new SaaS-based CRM, marketing, and support applications, with solutions from Salesforce and others, must weigh the importance of migrating and sharing data between cloud and on-premise applications in their investment decisions. Migrating existing customer, account, and transaction history data is often done by IT staff through custom extracts, scripts, and manual data validations, which can carry invalid information over from legacy systems, making these new application investments useless in many cases.
For example, customer type descriptions from one or many existing systems may each be correct in their respective databases, and collapsing them into a common field in the target application seems easy to do. Unfortunately, these transformation rules can be complex, and that complexity increases when dealing with tens if not hundreds of applications during the migration and synchronization phase. Having capable solutions to support the testing, development, quality management, validation, and delivery of existing data from old to new is not only good practice but a proven way of avoiding costly workarounds and business pain in the future.
Challenge #2: Managing and sharing a trusted source of shared business information across the enterprise
As new SaaS applications are adopted, it is critical to understand how best to govern and synchronize common business information, such as customer contact details (e.g., address, phone, email), across the enterprise. Most banks and insurance companies have multiple systems that create and update critical customer contact information, many of which reside on-premise. For example, when an insurance customer updates a phone number or email address while filing a claim, the claims specialist will often enter the update only in the claims system, given the siloed nature of many traditional banking and insurance companies. This is where Master Data Management comes in: it is purpose-built to identify changes to master data, including customer records, in one or many systems, update the customer master record, and share that update across the other systems that house and require it. That capability is essential for business continuity and success.
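To make the pattern concrete, here is a minimal sketch of the flow described above: a change captured in one system updates the golden record, which is then fanned out to every subscribing system. All system names, fields, and functions here are hypothetical illustrations, not any specific MDM product’s API.

```python
# Sketch of MDM change propagation: a contact-detail change captured in one
# source system updates the golden record and is then pushed to every other
# subscribing system. All names are hypothetical.

master = {}          # customer_id -> golden record
subscribers = {}     # system name -> local copy of customer records

def register_system(name):
    subscribers[name] = {}

def update_contact(source_system, customer_id, field, value):
    """Apply a change from one system to the master and fan it out."""
    record = master.setdefault(customer_id, {})
    record[field] = value
    record["last_updated_by"] = source_system
    # Propagate the golden record to every subscribing system,
    # including the one that originated the change.
    for system, store in subscribers.items():
        store[customer_id] = dict(record)

register_system("claims")
register_system("policy_admin")
register_system("billing")

# A claims specialist updates a phone number while filing a claim...
update_contact("claims", "CUST-42", "phone", "+1-555-0100")

# ...and billing sees the same value without anyone re-keying it.
print(subscribers["billing"]["CUST-42"]["phone"])
```

A real implementation would add survivorship rules (which source wins) and asynchronous delivery, but the shape of the flow is the same.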
In conclusion, SaaS adoption will continue to grow in financial services and across other industries. The silver lining in the cloud is your data and the technology that supports its consumption and distribution across the enterprise. Banks and insurance companies investing in new SaaS solutions will operate in a hybrid environment made up of cloud applications and core transaction systems that reside on-premise. To ensure these investments yield value, it is important to invest in a capable and scalable data integration platform to integrate, govern, and share data in a hybrid ecosystem. To learn more about how to deal with these challenges, click here and download a complimentary copy of the new “Salesforce Integration for Dummies”.
Gartner’s official definition of Information Governance is “…the specification of decision rights and an accountability framework to encourage desirable behavior in the valuation, creation, storage, use, archival and deletion of information. It includes the processes, roles, standards, and metrics that ensure the effective and efficient use of information in enabling a business to achieve its goals.” It therefore looks to address important considerations that key stakeholders within an enterprise face.
A CIO of a large European bank once asked me – “How long do we need to keep information?”
Keeping Information Governance relevant
This bank had to govern, index, search, and provide content to auditors to show it is managing data appropriately to meet Dodd-Frank regulation. In the past, this information was retrieved from a database or email. Now, however, the bank was required to produce voice recordings from phone conversations with customers, show the Reuters feeds coming in that are relevant, and document all appropriate IMs and social media interactions between employees.
All of these were systems the business had never considered before. These environments continued to capture and create data, and with that data came complex challenges. They are islands of information that seemingly have nothing to do with each other, yet they impact how the bank governs itself and how it saves the records associated with trading or financial information.
Coping with the sheer growth is one issue; what to keep and what to delete is another. There is also the issue of what to do with all the data once you have it. The data is potentially a gold mine for the business, but most businesses just store it and forget about it.
Legislation, in tandem, is becoming more rigorous and there are potentially thousands of pieces of regulation relevant to multinational companies. Businesses operating in the EU, in particular, are affected by increasing regulation. There are a number of different regulations, including Solvency II, Dodd-Frank, HIPAA, Gramm-Leach-Bliley Act (GLBA), Basel III and new tax laws. In addition, companies face the expansion of state-regulated privacy initiatives and new rules relating to disaster recovery, transportation security, value chain transparency, consumer privacy, money laundering, and information security.
Regardless, an enterprise should consider three core elements before developing and implementing a policy framework.
Whatever your size or type of business, there are several key processes you must undertake in order to create an effective information governance program. As a Business Transformation Architect, I see three foundation stones of an effective Information Governance Program:
Assess Your Business Maturity
Understanding the full scope of requirements on your business is a heavy task. Assess whether your business is mature enough to embrace information governance. Many businesses in EMEA do not have an information governance team in place; instead, key stakeholders with responsibility for information assets are spread across their legal, security, and IT teams.
Undertake a Regulatory Compliance Review
Understanding the legal obligations on your business is critical in shaping an information governance program. Every business is subject to numerous compliance regimes managed by multiple regulatory agencies, which can differ across markets. Many compliance requirements depend on the number of employees and/or turnover reaching certain thresholds. For example, certain records may need to be stored for 6 years in Poland, yet the same records may need to be stored for only 3 years in France.
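Jurisdiction-specific retention periods like these are often easiest to manage as an explicit rule table that the governance team owns. The sketch below is a hypothetical illustration; the record type, country codes, and year counts simply echo the Poland/France example above and are not legal guidance.

```python
# Hypothetical jurisdiction-aware retention rule table. Real periods must
# come from a regulatory compliance review, not from code defaults.

RETENTION_YEARS = {
    ("payroll_record", "PL"): 6,   # the Poland example above
    ("payroll_record", "FR"): 3,   # the France example above
}
DEFAULT_RETENTION_YEARS = 7        # conservative fallback for unmapped cases

def retention_years(record_type, country_code):
    """Look up how long a record type must be kept in a given country."""
    return RETENTION_YEARS.get((record_type, country_code),
                               DEFAULT_RETENTION_YEARS)

print(retention_years("payroll_record", "PL"))  # 6
print(retention_years("payroll_record", "FR"))  # 3
```

Keeping the table as data rather than scattered if-statements makes it auditable, which is exactly what regulators ask to see.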
Establish an Information Governance Team
It is important that a core team be assigned responsibility for the implementation and success of the information governance program. This steering group and a nominated information governance lead can then drive forward operational and practical issues, including: agreeing and developing a work program, developing policy and strategy, and planning communication and awareness.
I recently wrapped up two overseas trips; one to Central America and another to South Africa. As such, I had the opportunity to meet with a national bank and a regional retailer. It prompted me to ask the question: Does location matter in emerging markets?
I wish I could tell you that there was a common theme in how firms in the same sector or country (even city) treat data on a philosophical or operational level, but I cannot. Every engagement is unique, as factors like ownership history, regulatory scrutiny, available and affordable skill sets, and past as well as current financial success create a grey pattern rather than a comfortable black-and-white separation. This is even more obvious when I mix in recent meetings I had with North American organizations in the same sectors.
Banking in Latin vs North America
While a national bank in Latin America may seem lethargic, unimaginative and unpolished at first, you can feel the excitement when they conceive, touch and play with the potential of new paradigms, like becoming data-driven. Decades of public ownership do not seem to have stifled their willingness to learn and improve. On the other side, there is a stock market-listed, regional US bank where half the organization appears to believe in muddling along without expert IT knowledge, which reduced adoption and financial success in past projects. Back office leadership also firmly believes in “relationship management” over data-driven “value management”.
To quote a leader in their finance department, “we don’t believe that knowing a few more characteristics about a client creates more profit….the account rep already knows everything about them and what they have and need”. Then he said, “Not sure why the other departments told you there are issues. We have all this information but it may not be rolled out to them yet or they have no license to view it to date.” This reminded me of the “All Quiet on the Western Front” mentality: if it is all good over here, why are most people saying it is not? Granted, one more attribute may not tip the scale to higher profits, but a few more, plus their historical interrelationship, typically do.
As an example; think about the correlation of average account balance fluctuations, property sale, bill pay account payee set ups, credit card late charges and call center interactions over the course of a year.
The Latin American bankers just said, “We have no idea what we know and don’t know…but we know that even long-standing relationships with corporate clients are lacking upsell execution”. In this case, upsell potential centered on transforming wire transfer SWIFT messages to and from the local standard they report in. Understanding the SWIFT message parameters in full creates an opportunity to approach originating entities and cut out the middleman bank.
Retailing in Africa vs Europe
The African retailer’s IT architects indicated that customer information is centralized and complete and that integration is not an issue, as they have done it forever. In their view, consumer householding information is not a viable concept due to different regional interpretations, vendor information is brand-specific and therefore not centrally managed, and event-based actions are easily handled in BizTalk. Home delivery and pickup are in their infancy.
The only apparent improvement area is product information enrichment for an omnichannel strategy. This would involve enhancing attribution for merchandise demand planning, inventory and logistics management and marketing. Attributes could include not only full and standardized capture of style, packaging, shipping instructions, logical groupings, WIP vs finished goods identifiers, units of measure, images and lead times but also regional cultural and climate implications.
However, data-driven retailers are increasingly becoming service and logistics companies to improve wallet share, even in emerging markets. Look at the successful Russian eTailer Ozon, which is handling 3rd party merchandise for shipping and cash management via a combination of agency-style mom & pop shops and online capabilities. Having good products at the lowest price alone is not cutting it anymore and it has not for a while. Only luxury chains may be able to avoid this realization for now. Store size and location come at a premium these days. Hypermarkets are ill-equipped to deal with high-profit specialty items. Commercial real estate vacancies on British high streets are at a high (Economist, July 13, 2014) and footfall is at a seven-year low. The Centre for Retail Research predicts that 20% of store locations will close over the next five years.
If specialized, high-end products are the most profitable, I can (test) sell most of them online or at least through fewer, smaller stores saving on carrying cost. If my customers can then pick them up and return them however they want (store, home) and I can reduce returns from normally 30% (per the Economist) to fewer than 10% by educating and servicing them as unbureaucratically as possible, I just won the semifinals. If I can then personalize recommendations based on my customers’ preferences, life style events, relationships, real-time location and reward them in a meaningful way, I just won the cup.
Emerging markets may seem a few years behind but companies like Amazon or Ozon have shown that first movers enjoy tremendous long-term advantages.
So what does this mean for IT? Putting your apps into the cloud (maybe even outside your country) may seem like an easy fix. However, it may create not only performance and legal issues but also unexpected cost to support decent SLA terms. Does your data support transactions for higher profits today to absorb this additional cost of going into the cloud? A focus on transactional applications and their management obscures the need for a strong backbone for data management, just like the one you built for your messaging and workflows ten years ago. Then you can tether all the fancy apps to it you want.
Have any emerging markets’ war stories or trends to share? I would love to hear them. Stay tuned for future editions of this series.
“If I use master data technology to create a 360-degree view of my client and I have a data breach, then someone could steal all the information about my client.”
Um, wait, what? Insurance companies take personally identifiable information very seriously. The statement above misunderstands the relationship between client master data and securing your client data. Let’s dissect the statement and see what master data and data security really mean for insurers. We’ll start by level setting a few concepts.
What is your Master Client Record?
Your master client record is your 360-degree view of your client. It represents everything about your client. It uses Master Data Management technology to virtually integrate and syndicate all of that data into a single view. It leverages identifiers to ensure integrity in the view of the client record. And finally it makes an effort through identifiers to correlate client records for a network effect.
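At its simplest, assembling that 360-degree view means correlating each system’s records on a shared client identifier and merging them into one consolidated view. The sketch below is a hypothetical illustration; the system names, identifiers, and fields are invented, and a real MDM hub adds matching, survivorship, and syndication on top.

```python
# Illustrative 360-degree view assembly: records from several systems are
# correlated on a shared client identifier and merged into one record.

policy_system = {"C-001": {"policies": ["AUTO-9", "HOME-4"]}}
claims_system = {"C-001": {"claims": ["CLM-77"]}}
agency_system = {"C-001": {"agent": "A. Broker"}}

def master_client_record(client_id, *systems):
    """Merge every system's view of one client into a single record."""
    view = {"client_id": client_id}
    for system in systems:
        # Each system contributes whatever it knows about this client.
        view.update(system.get(client_id, {}))
    return view

record = master_client_record("C-001", policy_system,
                              claims_system, agency_system)
print(sorted(record))
```

The value is in the correlation itself: no single source system holds policies, claims, and agency relationships together, but the merged view does.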
There are benefits to understanding everything about your client. The shape and view of each client is specific to your business. As an insurer looks at its policyholders, the view of “client” is based on the relationships and context that the client has with the insurer. These are policies, claims, family relationships, history of activities, and relationships with agency channels.
And what about security?
Naturally there is private data in a client record. But there is nothing about the consolidated client record that contains any more or less personally identifiable information. In fact, most of the data that a malicious party would be searching for can likely be found in just a handful of database locations: policy numbers, credit card info, social security numbers, and birth dates can be found in fewer than five database tables, and without a whole lot of intelligence or analysis. Additionally, breaches often happen “on the wire”.
That data should be secured, meaning it should be encrypted or masked so that any breach leaves the data protected. Informatica’s data masking technology allows this data to be secured wherever it resides. It provides access control so that only the right people and applications can see the data in an unsecured format. You could even go so far as to secure ALL of your client record data fields; that is a business and application choice. Do not confuse field- or database-level security with a decision NOT to assemble your golden policyholder record.
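The idea of field-level masking with access control can be illustrated with a few lines of code. This is a generic sketch, not Informatica’s actual masking API; the field names, entitlement flag, and masking scheme (keep the last four characters) are all assumptions for illustration.

```python
# Generic field-level masking sketch: sensitive fields are masked for
# callers without the required entitlement; entitled callers see clear text.
import re

SENSITIVE_FIELDS = {"ssn", "credit_card"}

def mask_value(value):
    """Keep the last four characters, mask every other letter or digit."""
    return re.sub(r"[0-9A-Za-z]", "*", value[:-4]) + value[-4:]

def read_record(record, caller_entitled):
    if caller_entitled:
        return dict(record)
    return {k: (mask_value(v) if k in SENSITIVE_FIELDS else v)
            for k, v in record.items()}

policyholder = {"name": "Pat Doe",
                "ssn": "123-45-6789",
                "credit_card": "4111111111111111"}

print(read_record(policyholder, caller_entitled=False)["ssn"])
```

Note that the golden record is still assembled in full; masking is applied at read time based on who is asking, which is the point of the argument above.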
What to worry about? And what not to worry about?
Do not succumb to fear of mastering your policyholder data. Master Data Management technology can provide a 360-degree view, but that view is only meaningful within your enterprise and applications. The view of “client” is very contextual and coupled with your business practices, products and workflows. Even if someone breaches your defenses and grabs data, they are looking for the simple PII and financial data; they grab it and get out. If attackers could see your 360-degree view of a client, they wouldn’t understand it. So don’t overcomplicate the security of your golden policyholder record. As long as you have secured the necessary data elements, you’re good to go. The business opportunity cost of NOT mastering your policyholder data far outweighs any imagined risk of PII breach.
So what does your Master Policyholder Data allow you to do?
Imagine knowing more about your policyholders. Let that soak in for a bit. It feels good to think that you can make it happen, and you can. For an insurer, Master Data Management provides powerful opportunities across everything from sales, marketing, and product development to claims and agency engagement. Each channel and activity has discrete ROI, with direct impact on revenue, policyholder satisfaction and market share. Let’s look at just a few very real examples that insurers are attempting to tackle today.
- For a policyholder of a certain demographic with an auto and home policy, what is the next product my agent should discuss?
- How many people live in a certain policyholder’s household? Are there any upcoming teenage drivers?
- Does this personal lines policyholder own a small business? Are they a candidate for a business packaged policy?
- What is your policyholder’s claims history? What about prior carriers and networks of suppliers?
- How many touch points have your agents had with your policyholders? Were they meaningful?
- How can you connect with your policyholders in social media settings and make an impact?
- What is your policyholders’ mobile usage, and what are they doing online that might interest your marketing team?
These are just some of the examples of very streamlined connections that you can make with your policyholders once you have your 360-degree view. Imagine the heavy lifting required to do these things without a Master Policyholder record.
Fear is the enemy of innovation. In mastering policyholder data it is important to have two distinct work streams. First, secure the necessary data elements using data masking technology. Once that is secure, gain understanding through the mastering of your policyholder record. Only then will you truly be able to take your clients’ experience to the next level. When that happens watch your revenue grow in leaps and bounds.
A few weeks ago, a regional US bank asked me to perform some compliance and use case analysis around fixing their data management situation. This bank prides itself on customer service and SMB focus, while using large-bank product offerings. However, it was about a decade behind most banks in modernizing its IT infrastructure to stay operationally on top of things.
This included technologies like ESB, BPM, CRM, etc. They also were a sub-optimal user of EDW and analytics capabilities. Having said all this; there was a commitment to change things up, which is always a needed first step to any recovery program.
As I conducted my interviews across various departments (list below) it became very apparent that they were not suffering from data poverty (see prior post) but from lack of accessibility and use of data.
- Vendor Management & Risk
- Commercial and Consumer Depository products
- Credit Risk
- HR & Compensation
- Private Banking
- Customer Solutions
This lack of use occurred across the board. The natural reaction was to throw more bodies and more Band-Aid marts at the problem. Users also started to operate under the assumption that it will never get better. They just resigned themselves to mediocrity. When some new players came into the organization from various systemically critical banks, they shook things up.
Here is a list of use cases they want to tackle:
- The proposition of real-time offers based on customer events, as simple as offering investment banking products after an unusually high inflow of cash into a deposit account.
- The use of all mortgage application information to understand debt/equity ratio to make relevant offers.
- The capture of true product and customer profitability across all lines of commercial and consumer products including trust, treasury management, deposits, private banking, loans, etc.
- The agile evaluation, creation, testing and deployment of new terms on existing products and products under development, shortening the product development life cycle.
- The reduction of wealth management advisors’ time to research clients and prospects.
- The reduction of unclaimed use tax, insurance premiums and leases being paid on consumables, real estate and requisitions due to incorrect status and location of equipment, originating from assets that were no longer owned, had been scrapped, or had moved to a different department, etc.
- More efficient reconciliation between transactional systems and finance, which often uses multiple party IDs per contract change in accounts receivable, while the operating division uses one ID based on a contract and its addendums. An example would be vendor payment consolidation to create a true supplier-spend view and thus take advantage of volume discounts.
- The proactive creation of a central compliance footprint (AML, 314, Suspicious Activity, CTR, etc.), allowing for quicker turnaround and fewer audit findings from MRAs (matters requiring attention).
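The supplier-spend consolidation use case in the list above can be sketched roughly as follows; the party IDs, supplier names and amounts are fabricated for illustration:

```python
# Illustrative supplier-spend consolidation: multiple party IDs that
# finance holds for the same vendor are mapped to one canonical supplier
# so total spend can support volume-discount negotiations.

party_to_supplier = {
    "V-100": "ACME Corp",
    "V-100A": "ACME Corp",   # duplicate ID created on a contract addendum
    "V-231": "ACME Corp",
    "V-555": "Globex Ltd",
}

payments = [("V-100", 12000.0), ("V-100A", 8000.0),
            ("V-231", 5000.0), ("V-555", 3000.0)]

def consolidated_spend(payments, mapping):
    """Total payments per canonical supplier; unmapped IDs stand alone."""
    totals = {}
    for party_id, amount in payments:
        supplier = mapping.get(party_id, party_id)
        totals[supplier] = totals.get(supplier, 0.0) + amount
    return totals

print(consolidated_spend(payments, party_to_supplier))
```

The hard part in practice is building and maintaining the party-to-supplier mapping, which is exactly what master data management work provides.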
MONEY TO BE MADE – PEOPLE TO SEE
Adding these up came to about $31 to $49 million annually in cost savings, new revenue or increased productivity for this bank with $24 billion total assets.
So now that we know there is money to be made by fixing the data of this organization, how can we realistically roll this out in an organization with many competing IT needs?
The best way to go about this is to attach any kind of data management project to a larger, business-oriented project, like CRM or EDW. Rather than wait for these to go live without good seed data, why not feed them with better data as a key work stream within their respective project plans?
To summarize my findings, I want to quote three people I interviewed. A lady who recently struggled through an OCC audit told me she believes that the banks that can remain compliant at the lowest cost will ultimately win the end game; she meant particularly tier 2 and tier 3 size organizations. A gentleman from commercial banking left this statement with me: “Knowing what I know now, I would not bank with us”. The lady from earlier also said, “We engage in spreadsheet Kung Fu” to bring data together.
Given all this, what would you suggest? Have you worked with an organization like this? Did you encounter any similar or different use cases in financial services institutions?
Did I really compare data quality to flushing toilet paper? Yeah, I think I did. Makes me laugh when I read that, but still true. And yes, I am still playing with more data. This time it’s a location schedule for earthquake risk. I see a 26-story structure with a building value of only $136,000 built in who knows what year. I’d pull my hair out if it weren’t already shaved off.
So let’s talk about the six steps for data quality competency in underwriting. These six steps are standard in the enterprise, but what we will discuss is how to tackle them in insurance underwriting and, more importantly, the business impact of effectively adopting the competency. It’s a repeating, self-reinforcing cycle, and when done correctly it can be intelligent and adaptive to changing business needs.
Profile – Effectively profile and discover data from multiple sources
We’ll start at the beginning, a very good place to start. First you need to understand your data: where is it from, and in what shape does it arrive? Whether for internal or external sources, the profile step will help identify the problem areas. In underwriting, this will involve a lot of external submission data from brokers and MGAs, which is then combined with internal and service bureau data to get a full picture of the risk. Identify your key data points for underwriting and a desired state for that data. Once the data is profiled, you’ll get a very good sense of where your troubles are. Continue to profile as you bring other sources online, using the same standards of measurement. As a side benefit, this will also help in remediating brokers that are not meeting the standard.
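A profiling pass can be surprisingly simple to sketch. The example below computes fill rate and distinct-value counts per field over a handful of fabricated submission records; the field names are assumptions, and a real profiling tool adds pattern analysis, value distributions and drift detection.

```python
# Minimal profiling sketch: for each field in incoming submission data,
# report fill rate and distinct values so problem fields stand out.

def profile(records, fields):
    report = {}
    for field in fields:
        values = [r.get(field) for r in records]
        filled = [v for v in values if v not in (None, "")]
        report[field] = {
            "fill_rate": len(filled) / len(records),
            "distinct": len(set(filled)),
        }
    return report

# Fabricated broker submissions for illustration.
submissions = [
    {"construction": "wood frame", "stories": 2, "year_built": 1987},
    {"construction": "", "stories": 30, "year_built": None},
    {"construction": "masonry", "stories": 5, "year_built": 2003},
]

report = profile(submissions, ["construction", "stories", "year_built"])
print(report["construction"])
```

Run the same report per broker and you have an objective basis for the broker-remediation conversation mentioned above.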
Measure – Establish data quality metrics and targets
As an underwriter, you will need to determine the quality bar for the data you use. Usually this means flagging your most critical data fields for meeting underwriting guidelines. See where you are and where you want to be, and determine how you will measure the quality of the data against that desired state. And by the way, actuarial and risk will likely do the same thing on the same or similar data. Over time it all comes together as a team.
Design – Quickly build comprehensive data quality rules
This is the meaty part of the cycle, and fun to boot. First look to your desired future state and your critical underwriting fields. For each one, determine the rules by which you normally fix errant data. What do you do when you see a 30-story wood-frame structure? How do you validate, cleanse and remediate that discrepancy? This may involve fuzzy logic or supporting data lookups, and it can easily be captured. Do this, write it down, and catalog it to be codified in your data quality tool. As you go along you will see a growing library of data quality rules being compiled for broad use.
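To make the 30-story wood-frame example concrete, here is what one such rule might look like once codified. The story threshold and the construction codes are illustrative assumptions, not real underwriting guidelines:

```python
# A hypothetical data quality rule of the kind described above: a wood-frame
# structure above a plausible height is flagged for remediation.
MAX_WOOD_FRAME_STORIES = 6  # assumed limit for this sketch

def check_construction(location):
    """Return a list of data quality issues found on one location record."""
    issues = []
    if (location.get("construction") == "WOOD_FRAME"
            and location.get("stories", 0) > MAX_WOOD_FRAME_STORIES):
        issues.append(
            "stories=%d exceeds %d for wood frame; verify construction code"
            % (location["stories"], MAX_WOOD_FRAME_STORIES)
        )
    return issues

suspect = {"construction": "WOOD_FRAME", "stories": 30}  # gets flagged
clean = {"construction": "WOOD_FRAME", "stories": 2}     # passes
```

In a data quality tool the rule would live in a shared catalog rather than in code like this, but the logic being captured is the same.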
Deploy – Native data quality services across the enterprise
Once these rules are compiled and tested, they can be deployed for reuse in the organization. This is the beautiful magical thing that happens. Your institutional knowledge of your underwriting criteria can be captured and reused. This doesn’t mean just once, but reused to cleanse existing data, new data and everything going forward. Your analysts will love you, your actuaries and risk modelers will love you; you will be a hero.
Review – Assess performance against goals
Remember those goals you set for your quality when you started? Check and see how you’re doing. After a few weeks and months, you should be able to profile the data, run the reports and see that the needle has moved. Remember that as part of the self-reinforcing cycle, you can now identify new issues to tackle and adjust the rules that aren’t working. Over time you’ll also want to measure the business results: higher quote flow, better productivity and more competitive premium pricing.
Monitor – Proactively address critical issues
Now monitor constantly. As you bring new MGAs online, receive new underwriting guidelines or launch into new lines of business, you will repeat this cycle. You will also use the same rule set as portfolios are acquired. It becomes a good way to sanity-check acquired business against your quality standards.
In case it wasn’t apparent, your data quality plan is now largely automated. With few manual exceptions, you should not have to remediate data the way you did in the past. In each of these steps there is obvious business value. In the end, it all adds up to better risk/cat modeling, more accurate risk pricing, cleaner data (for everyone in the organization) and more time doing the core business of underwriting. Imagine if you could increase your quote volume simply by not needing to muck around in data. Imagine if you could improve your quote-to-bind ratio through better quality data and pricing. The last time I checked, that’s just good insurance business.
And now for something completely different…cats on pianos. No, just kidding. But check here to learn more about Informatica’s insurance initiatives.
I was just looking at some data I found. Yes, real data, not fake demo stuff. Real hurricane location analysis with modeled loss numbers. At first glance, I thought it looked good. There are addresses, latitudes/longitudes, values, loss numbers and other goodies like year built and construction codes. Yes, just the sort of data that an underwriter would look at when writing a risk. But after skimming through the schedule of locations, a few things start jumping out at me. So I dig deeper. I see a multi-million dollar structure in Palm Beach, Florida with $0 in modeled loss. That’s strange. And wait, some of these geocode resolutions look a little coarse. Are they tier one or tier two counties? Who would know? At least all of the construction and occupancy codes have values, though they look like defaults. Perhaps it’s time to talk about data quality.
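The kinds of sanity checks described above are easy to express once you name them. A minimal sketch, where the field names, the $1M value threshold and the placeholder codes are all assumptions for illustration:

```python
# Flag suspicious location records: a high-value structure with zero modeled
# loss, or a construction code that looks like a default. Thresholds and
# placeholder values are made up for this sketch.
DEFAULT_CODES = {"UNKNOWN", "0", ""}  # assumed default placeholders

def flag_location(loc):
    """Return a list of review flags for one schedule-of-locations record."""
    flags = []
    if loc["building_value"] > 1_000_000 and loc["modeled_loss"] == 0:
        flags.append("high value but $0 modeled loss")
    if loc["construction_code"] in DEFAULT_CODES:
        flags.append("construction code looks like a default")
    return flags

# The Palm Beach example from the text, with invented numbers
palm_beach = {
    "building_value": 4_500_000,
    "modeled_loss": 0,
    "construction_code": "UNKNOWN",
}
```

Neither record is necessarily wrong; the point is that both conditions deserve a human look before the risk is priced.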
This whole concept of data quality is a tricky one. As the cost of acquiring good data is weighed against the speed of underwriting/quoting and model correctness, I’m sure some tradeoffs are made. But the impact can be huge. First, incomplete data will either force defaults in risk models and pricing or add mathematical uncertainty. Second, massively incomplete data chews up personnel resources to cleanse and enhance. And third, if not corrected, the risk profile will be wrong, with potential impact to pricing and portfolio shape. And that’s just to name a few.
I’ll admit it’s daunting to think about. Imagine tens of thousands of submissions a month. Schedules of thousands of locations received every day. Can there even be a way out of this cave? The answer is yes, and that answer is a robust enterprise data quality infrastructure. But wait, you say, enterprise data quality is an IT problem. Yeah, I guess, just like trying to flush an entire roll of toilet paper in one go is the plumber’s problem. Data quality in underwriting is a business problem, a business opportunity and has real business impacts.
Join me in Part 2 as I outline the six steps for data quality competency in underwriting with tangible business benefits and enterprise impact. And now that I have you on the edge of your seats, get smart about the basics of enterprise data quality.
Regardless of the industry, new regulatory compliance requirements are more often than not treated like the introduction of a new tax. A few may be supportive, some will see the benefits, but most will focus on the negatives – the cost, the effort, the intrusion into private matters. There will more than likely be a lot of grumbling.
Across many industries there is currently a lot of grumbling, as new regulation seems to be springing up all over the place. Pharmaceutical companies have to deal with IDMP in Europe and UDI in the USA. This is hot on the heels of the US Sunshine Act, which is being followed in Europe by Aggregate Spend requirements. Consumer Goods companies in Europe are looking at the consequences of beefed-up 1169 requirements. Financial institutions are mulling over compliance with BCBS 239. Behind the grumbling, most organisations across all verticals appear to have a similar approach to regulatory compliance. The pattern seems to go like this:
- Delay (The requirements may change)
- Scramble (They want it when? Why didn’t we get more time?)
- Code to Spec (Provide exactly what they want, and only what they want)
No wonder these requirements are seen as purely a cost and an annoyance. But it doesn’t have to be that way, and in fact, it should not. Just like I have seen a pattern in response to compliance, I see a pattern in the requirements themselves:
- The regulators want data
- Their requirements will change
- When they do change, regulators will want even more data!
Now read the last three bullet points again, but use ‘executives’ or ‘management’ or ‘the business people’ instead of ‘regulators’. The pattern still holds true. The irony is that execs will quickly sign off on budget to meet regulatory requirements, but find it hard to see the value in “infrastructure” projects, even though those projects would deliver this same data to their internal teams.
This is where the opportunity comes in. PwC’s 2013 State of Compliance Report[i] shows that over 42% of central compliance budgets are in excess of $1m, a significant figure. Efforts outside the compliance team imply an even higher actual cost. Large budgets are not surprising in multi-national companies, which often have to satisfy multiple regulators in a number of countries. As an alternative to multiple overlapping compliance projects, what if this significant budget were repurposed to create a flexible data management platform? This approach could deliver compliance, but provide even more value internally.
Almost all internal teams are currently clamouring for additional data to drive their newest applications. Pharma and CG sales & marketing teams would love ready access to detailed product information, as would consumer and patient support staff and downstream partners. Trading desks and client managers within financial institutions should have real-time access to the risk profiles guiding daily decision making. These data needs will not go away. Why should regulators be prioritised over the people who drive your bottom line and who are guardians of your brand?
A flexible data management platform will serve everyone equally. Foundational tools for such a platform exist today, including Data Quality, MDM, PIM and VIBE, Informatica’s Virtual Data Machine. Each of them plays a significant role in easing regulatory compliance, and as a bonus they deliver measurable business value in their own right. Implemented correctly, you will get enhanced data agility and visibility across the entire organisation as part of your compliance efforts. Sounds like ‘Buy One Get One Free’, or BOGOF in retail terms.
Unlike taxes, BOGOF opportunities are normally embraced with open arms. Regulatory compliance should receive a similar welcome – an opportunity to build the foundations for universal delivery of data which is safe, clean and connected. A 2011 study by The Economist found that effective regulatory compliance benefits businesses across a wide range of performance metrics[ii].
Is it time to get your free performance boost?
Like many American men, I judge my banking experience by the efficiency of my transaction time. My wife, however, often still likes to go into the bank and see her favorite teller.
For her, banking is a bit more of a social experience. And every once in a while, my wife even drags me into her bank as well. But like many of my male counterparts, I still judge the quality of the experience by the operational efficiency of her teller. And the thing I hate the most is when our visit is lengthened because the teller can’t do something and has to get the bank manager’s approval.
Now, a major financial institution has decided to make my life, and even my wife’s life, better. Using Informatica RulePoint, they have come up with a way to improve teller operational efficiency and customer experience while actually decreasing operational business risks. Amazing!
How has this bank done this magic? They make use of the data that they already have to create a better banking experience. They already capture historical transaction data and team member performance against each transaction in multiple databases. What they are doing now is using this information to make better decisions. With it, the bank is able to create and update a risk assessment score for each team member at a branch location. Then, using Informatica RulePoint, they have created approximately 100 rules that can change a teller’s authority based upon the new transaction, the teller’s transaction history, and the teller’s risk assessment score. This means that if my wife carefully picks the right teller, she is sped through the line without waiting for management approval.
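The general pattern, stripped of the roughly 100 real rules the bank runs in RulePoint, looks something like this. Everything here is a simplified, hypothetical sketch: the scoring formula, the thresholds, and the dollar limits are all invented for illustration.

```python
# Hypothetical sketch of score-driven teller authority. The real system uses
# ~100 RulePoint rules; this condenses the idea into two small functions.

def risk_score(history):
    """Naive score: the share of past transactions that had errors."""
    if not history:
        return 1.0  # no track record yet: treat as highest risk
    errors = sum(1 for t in history if t["error"])
    return errors / len(history)

def authority_limit(history, base_limit=5000):
    """Low-risk tellers may approve larger amounts without a manager.
    Thresholds and multipliers are made-up illustration values."""
    score = risk_score(history)
    if score < 0.01:
        return base_limit * 4
    if score < 0.05:
        return base_limit * 2
    return base_limit

good_teller = [{"error": False}] * 200  # long, clean history -> score 0.0
new_teller = []                         # no history -> score 1.0
```

The payoff is exactly the one described above: transactions that fall under a trusted teller’s raised limit clear without a trip to the manager.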
So the message at this bank is that the fastest teller is the best teller. To me this is really using data to improve the customer experience and allow for less time in line. Maybe I should get this bank to talk to my auto mechanic next!