Peter Ku

Anthem Data Breach – Who’s Next?

I hate to break the news, but data breaches have become an unfortunate fact of life. These events now happen so frequently that each new breach feels like the daily weather report. The scary thing about data breaches is that they will only continue to grow as criminals become more desperate to take advantage of the innocent, and as data about our personal records, financial account numbers, and identities continues to proliferate across computer systems in every industry, from your local retailer and your local DMV to one of the nation’s largest health insurance providers.

According to the 2014 Cost of Data Breach study from the Ponemon Institute, data breaches cost companies an average of $201 per stolen record. According to the NY Post, 80 million records were stolen from Anthem this week, which would cost employees, customers, and shareholders $16,080,000,000 from this single event. The 80 million records account only for the data Anthem knew about. What about all the data that has proliferated across systems? Data about both current and past customers, collected across decades, that was copied onto personal computers, loaded into shared network folders, and is sitting there while security experts pray that their network security solutions will prevent the bad guys from finding it and causing even more carnage in this ever-growing era of Big Data.
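
As a quick sanity check on those figures, here is a minimal sketch of the per-record cost math, using the Ponemon $201 average and the reported 80 million records (the numbers come from the sources cited above; the code itself is just illustrative arithmetic):

```python
# Rough breach cost estimate using the Ponemon per-record average.
COST_PER_RECORD = 201          # USD, 2014 Ponemon Institute average cost per stolen record
records_stolen = 80_000_000    # records reported stolen in the Anthem breach

estimated_cost = COST_PER_RECORD * records_stolen
print(f"Estimated breach cost: ${estimated_cost:,}")  # $16,080,000,000
```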

If you are as worried as I am about what these criminals will do with our personal information, make it a priority to protect your data assets, both personal and in business. Learn more about Informatica’s perspectives and video on this matter:

Follow me! @DataisGR8

Is Your Data Ready to Maximize Value from Your CRM Investments?

A friend of mine recently reached out to me for some advice on CRM solutions in the market. Though I have never worked for a CRM vendor, I have direct experience working for companies that implemented such solutions, and in my current role I interact with large and small organizations across industries about the data requirements that support their ongoing application investments. As we spoke, memories started to surface from when he and I had worked on implementing Salesforce.com (SFDC) many years ago. Memories we both wanted to forget, but worth calling out given his new situation.

We worked together for a large mortgage lending software vendor selling loan origination solutions to brokers and small lenders, mainly through email and snail mail marketing. He was responsible for Marketing Operations, and I ran Product Marketing. The company looked at Salesforce.com to help streamline our sales operations and improve how we marketed to and serviced our customers. The existing CRM system was from the early 90’s, and though it did what the company needed it to do, it was heavily customized, costly to operate, and had served its useful life. It was time to upgrade to help grow the business, improve productivity, and enhance customer relationships.

After 90 days of rolling out SFDC, we ran into some old familiar problems across the business. Sales reps continued to struggle to know who was a current customer using our software, marketing managers could not create quality mailing lists for prospecting, and call center reps were not able to tell if the person on the other end was a customer or a prospect. Everyone wondered why this was happening given we had adopted the best CRM solution in the market. You can imagine the heartburn and ulcers we all had after making such a huge investment in our new CRM solution. C-level executives were questioning our decisions and blaming the applications. The truth was, the issues were not caused by SFDC but by the data we had migrated into the system, the lack of proper governance, and the absence of a capable information architecture to support the required data management and integration between systems.

During the implementation phase, IT imported our entire customer database of 200K+ unique customer entities from the old system to SFDC. Unfortunately, the mortgage industry was very transient: on average there were roughly 55K licensed mortgage brokers and lenders in the market, and because no one ever validated who was really a customer vs. someone who had ever bought our product, we had serious data quality issues (a simple profiling sketch follows the list below), including:

  • Trial users whose evaluation copies of our products had expired were tagged as current customers
  • Duplicate records caused by manual data entry errors: companies with similar names entered slightly differently, but sharing the same business address, were tagged as unique customers
  • Subsidiaries of parent companies in different parts of the country were each tagged as unique customers
  • Lastly, we imported the marketing contact database of prospects, which were incorrectly accounted for as customers in the new system
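
A basic profiling pass would have surfaced most of these issues before migration. Here is a minimal sketch, assuming a hypothetical extract with name, address, and record-type fields (the column names and sample rows are illustrative, not from our actual system):

```python
import pandas as pd

# Hypothetical customer extract from the legacy CRM (column names are illustrative).
records = pd.DataFrame([
    {"company": "Acme Mortgage", "address": "100 Main St", "type": "customer"},
    {"company": "ACME Mortgage Inc", "address": "100 Main St", "type": "customer"},
    {"company": "Acme Mortgage", "address": "100 Main St", "type": "trial"},
    {"company": "Best Lending", "address": "5 Oak Ave", "type": "prospect"},
])

# Normalize names so "Acme Mortgage" and "ACME Mortgage Inc" collide on a simple key.
records["match_key"] = (
    records["company"].str.lower().str.replace(r"\b(inc|llc|corp)\b\.?", "", regex=True).str.strip()
    + "|" + records["address"].str.lower().str.strip()
)

# Flag likely duplicates (same normalized name + address) and non-customer record types.
records["is_duplicate"] = records.duplicated("match_key", keep="first")
records["is_customer"] = records["type"].eq("customer")

print(records[["company", "type", "is_duplicate", "is_customer"]])
```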

We also failed to integrate real-time purchasing data from our procurement systems so that sales and support could handle customer requests. Instead of integrating that data in real time with proper technology, IT manually loaded these records at the end of each week via FTP, resulting in incorrect billing information, broken statement processing, and a ton of complaints from customers through our call center. The price we paid for not paying attention to our data quality and integration requirements before we rolled out Salesforce.com was significant for a company of our size. For example:

  • Marketing got hit pretty hard. Each quarter we mailed evaluation copies of new products to our customer database of 200K, each costing the company $12 to produce and mail. Total cost = $2.4M annually. Because we had such bad data, 60% of our mailings were returned due to invalid addresses or wrong contact information. The cost of bad data to marketing = $1.44M annually.
  • Next, Sales struggled miserably when trying to upgrade customers by running cold call campaigns using the names in the database. As a result, sales productivity dropped by 40% and the team experienced over 35% turnover that year. Within a year of using SFDC, our head of sales was let go. Not good!
  • Customer support used SFDC to service customers, and our average call times were 40 minutes per service ticket. We believed that was “business as usual” until we surveyed where reps were spending their time each day; over 50% said it was dealing with billing issues caused by bad contact information in the CRM system.

At the end of our conversation, this was my advice to my friend:

  • Conduct a data quality audit of the systems that would interact with the CRM system. Audit how complete your critical master and reference data is, including names, addresses, customer IDs, etc. (a minimal completeness check is sketched after this list).
  • Do this before you invest in a new CRM system. You may find that many of the challenges you face with your existing applications are caused by data gaps rather than by the legacy application itself.
  • If you have a data governance program, involve that team in the CRM initiative to ensure they understand your requirements and see how they can help.
  • If you do decide to modernize, collaborate with and involve your IT teams, especially your Application Development teams and your Enterprise Architects, to ensure all of the best options are considered for your data sharing and migration needs.
  • Lastly, consult with your technology partners, including your new CRM vendor; they may be working with solution providers to help address these data issues, as you are probably not the only one in this situation.
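
A data quality audit does not have to start with heavyweight tooling. A minimal completeness check, assuming a hypothetical extract of customer master data (the field names and rows are illustrative), might look like this:

```python
import pandas as pd

# Hypothetical extract of critical master data fields from a source system
# (in practice this would come from the CRM or core system, e.g. via pd.read_csv).
customers = pd.DataFrame([
    {"customer_id": "C-001", "name": "Acme Mortgage", "address": "100 Main St", "email": "ops@acme.test", "phone": None},
    {"customer_id": "C-002", "name": "Best Lending",  "address": None,          "email": "",              "phone": "555-0101"},
    {"customer_id": None,    "name": "Delta Loans",   "address": "9 Hill Rd",   "email": None,            "phone": "555-0102"},
])

critical_fields = ["customer_id", "name", "address", "email", "phone"]

# Completeness: share of non-null, non-empty values per critical field.
completeness = {
    field: (customers[field].notna() & customers[field].astype(str).str.strip().ne("")).mean()
    for field in critical_fields
}

# Report the least complete fields first.
for field, pct in sorted(completeness.items(), key=lambda kv: kv[1]):
    print(f"{field:12s} {pct:6.1%} complete")
```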

Looking Ahead!

CRM systems have come a long way in today’s Big Data and Cloud era. Many firms are adopting more flexible solutions offered through the cloud, like Salesforce.com, Microsoft Dynamics, and others. Regardless of how old or new, on premise or in the cloud, companies invest in CRM not just to serve their sales teams or increase marketing conversion rates, but to improve their business relationships with their customers. Period! It’s about ensuring the data in these systems is trustworthy, complete, up to date, and actionable, to improve customer service and help drive sales of new products and services that increase wallet share. So how do you maximize your business potential from these critical business applications?

Whether you are adopting your first CRM solution or upgrading an existing one, keep in mind that Customer Relationship Management is a business strategy, not just a software purchase. It’s also about having a sound and capable data management and governance strategy supported by people, processes, and technology to ensure you can:

  • Access and migrate data from old to new, avoiding development cost overruns and project delays.
  • Identify, detect, and distribute transactional and reference data from existing systems into your front-line business applications in real time.
  • Manage data quality errors, including duplicate records and invalid names and contact information, through proper data governance and proactive data quality monitoring and measurement during and after deployment.
  • Govern and share authoritative master records of customer, contact, product, and other master data between systems in a trusted manner.

Will your data be ready for your new CRM investments?  To learn more:

Follow me on Twitter @DataisGR8

The Similarities between Mixed Martial Arts Fighters and Information Architects

I have to admit I am a huge and longtime Mixed Martial Arts (MMA) fan. Even before MMA became mainstream, I had followed and studied traditional martial arts, from Tae Kwon Do, Judo, boxing, and wrestling to Jujitsu, since I was a young lad. Given that I hate pain, I am glad to call myself a fan and only a spectator. Modern MMA fighters are extremely talented athletes; however, their success in the cage depends on having a strong base across different disciplines. They must be good on their feet with effective punches and kicks, have world-class wrestling, jujitsu, and judo on the ground, strong defensive techniques all around, and a strong will not to quit.

My other passion, a less painful one, is helping organizations understand and harness technology and best practices to leverage great data for business success. Believe it or not, there are close similarities between being an effective MMA fighter and a successful information architect. Information architects in today’s Big Data/Internet of Things world have to have a mix of knowledge and skills to recommend, design, and implement the right information management solutions for their businesses. This involves having a strong background in database development, data engineering, computer programming, web, security, networking, system administration, and other technology competencies to formulate and categorize information into a coherent structure, preferably one that the intended audience can understand quickly, if not inherently, and then easily retrieve the information for which they are searching.

Like successful MMA fighters, Information Architects require training and development of basic building blocks regardless of their “intellectual” and “technical” prowess. Having a strong base allows architects to recommend the right solutions, avoiding ineffective and inefficient methods such as hand coding critical data integration, data governance, and data quality processes, which often result in bad data, higher costs, and increased risk of not meeting what the business needs. An MMA fighter with a strong base would leverage those skills to avoid techniques or moves that place them at risk of getting knocked out, choked out, or having an arm broken by their opponent. Likewise, well-developed architects leverage that base and knowledge to adopt proven technologies for their information architecture and management needs.

The technologies to manage data in the enterprise vary in performance, functionality, and value. Like MMA fighters competing for a living, it’s important that they learn from skilled masters rather than from the local martial arts school at the neighborhood strip mall or from free YouTube videos recorded by jokers claiming to be masters, hoping that knowledge will help them survive in combat. Similarly, architects must make careful investments and decisions when designing systems to deliver great data. Shortcuts and “good enough” tools won’t cut it in today’s data-driven world. Great data only comes from great design powered by an intelligent and capable data platform. “Are you Ready to Get it On!”

How to Improve Cross Sell and Customer Experience in Banking

I recently refinanced an existing mortgage on an investment property with my bank. Like most folks these days, I went to their website from my iPad, filled out an online application form, and received a pre-approval decision. Like any mortgage application, we stated our liabilities and assets, including credit cards, auto loans, and investment accounts, some of which were with this bank. During the process I also entered a new contact email address, after my email service was hacked over the summer. The whole process took quite a bit of time, and being an impatient person, I ended up logging off and coming back to the application over the weekend.

I walked into my local branch the following week to make a withdrawal and asked the teller how my mortgage application was going. She had no clue what I was talking about, as though I were a complete stranger. When I asked her whether they had the updated email address I entered online, she was equally puzzled, stating that any updates to that information would require me to contact all the other groups that held my brokerage, credit card, and mortgage services to make the change. That experience was extremely frustrating, and I felt like my bank had no idea who I was as a customer, despite the fact that my ATM card has “Customer Since 1989” printed on it! Even worse, after entering my entire financial history on my mortgage application, I expected someone to reach out to me about moving my investment accounts to their bank; however, no one contacted me about any new offers or services. (Makes you wonder if they really wanted my business.)

2015 will continue to be a challenging year for banks large and small to grow revenue, driven by low interest rates, increasing competition from non-traditional segments, and lower customer loyalty to existing institutions. The biggest opportunity for banks to grow revenue is to expand wallet share with existing customers. Tough times are ahead, as many bank customers continue to do business with a multitude of different financial institutions.

The average U.S. consumer owns between 8 and 12 financial products, ranging from basic checking accounts, credit cards, and mortgages to IRAs and 401(k)s as they get closer to retirement. On the flip side, the average institution has only 2-3 products per customer relationship. So why do banks continue to struggle to gain more wallet share from existing customers? Based on my experience and research, it comes down to two key reasons:

  • Traditional product-centric business silos and systems
  • Lack of a single trusted source of customer, account, household, and other shared data syndicated and governed across the enterprise

The first reason is the way banks are set up to do business. Back in the day, you would walk into your local branch office. As you entered the doors, the bank tellers behind the counter were ready to handle your deposits, withdrawals, and payments. If you needed to open a new account, you would talk to the new accounts manager sitting at their desk, waiting to offer you a cookie. For mortgages and auto loans, that would be someone else sitting on the far side of the building, equally eager to sign new customers. As banks diversified their businesses with new products, including investments, credit cards, and insurance, each product had its own operating unit. The advent of the internet did not really change the traditional “brick and mortar” business model. Instead, one would go to the bank’s website to transact or sign up for a new product; however, on the back end, the systems, people, and incentives to sell one product did not change, creating the same disconnected customer experience. Fast forward to today: these product-centric silos continue to exist in big and small banks across the globe, despite CEOs saying they are focused on delivering a better customer experience.

Why is that the case? Well, the second reason is the systems within these product silos, including core banking, loan origination, loan servicing, and brokerage systems, that were never designed to share common information with each other. Traditional retail or consumer banks maintained customer, account, and household information within the Customer Information File (CIF), often part of the core banking system. Primary and secondary account holders would be grouped into a household based on the same last name and mailing address. Unfortunately, CIF systems were mainly used within retail banking. The problem grows exponentially as more systems are adopted to run the business across core business functions and traditional product silos. Each group and its systems managed their own version of the truth, and these environments were never set up to share common data between them.

This is where Master Data Management technology can help.  “Master Data” is defined as a single source of basic business data used across multiple systems, applications, and/or processes.  In banking, that traditionally includes information such as the following (a simple illustrative record structure is sketched after the list):

  • Customer name
  • Address
  • Email
  • Phone
  • Account numbers
  • Employer
  • Household members
  • Employees of the bank
  • Etc.
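
As a concrete illustration, a master customer record can be thought of as a small shared schema that every system maps into. Here is a minimal sketch; the field names are illustrative examples, not a product schema:

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Illustrative master customer record; field names are examples, not a product schema.
@dataclass
class MasterCustomerRecord:
    customer_id: str                    # golden identifier shared across systems
    name: str
    address: str
    email: Optional[str] = None
    phone: Optional[str] = None
    employer: Optional[str] = None
    account_numbers: List[str] = field(default_factory=list)
    household_id: Optional[str] = None  # links primary/secondary holders and family members
    source_systems: List[str] = field(default_factory=list)  # where contributing records live

# Example: a golden record assembled from core banking and brokerage sources.
golden = MasterCustomerRecord(
    customer_id="CUST-000123",
    name="Jane Doe",
    address="42 Elm St, Springfield",
    email="jane.doe@example.com",
    account_numbers=["CHK-9001", "BRK-5542"],
    household_id="HH-0456",
    source_systems=["core_banking", "brokerage"],
)
```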

Master Data Management technology has evolved over the years, starting as Customer Data Integration (CDI) solutions providing merge and match capabilities between systems, to more modern platforms that govern consistent records and leverage inference analytics to determine relationships between entities across systems within an enterprise. Depending on your business need, there are core capabilities one should consider when investing in an MDM platform. They include:

Key functions: what to look for in an MDM solution
  • Capturing existing master data from two or more systems, regardless of source, and creating a single source of the truth for all systems to share. To do this right, you need seamless access to data regardless of source, format, and system, and in real time.
  • Defining relationships between entities based on “business rules.” For example: “Household = same last name, address, and account number.” These relationship definitions can be complex and can change over time, so the ability for business users to create and modify those rules will help grow adoption and scalability across the enterprise (a simplified household-rule sketch follows this list).
  • Governing consistency across systems by identifying changes to this common business information, determining whether each change is a unique record, a duplicate, or an update to an existing record, and updating the other systems that use and rely on that information. As with the first capability, you need the ability to easily deliver updates to dependent systems across the enterprise in real time. A flexible, user-friendly way of managing those master record rules that avoids heavy IT development is also important to consider.
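
To make the “business rules” idea concrete, here is a minimal sketch of the household-grouping rule (same last name, address, and account number), assuming simple illustrative records rather than a real MDM platform:

```python
from collections import defaultdict

# Illustrative customer records coming from different source systems.
records = [
    {"last_name": "Doe", "address": "42 Elm St", "account": "CHK-9001", "system": "core_banking"},
    {"last_name": "Doe", "address": "42 Elm St", "account": "CHK-9001", "system": "brokerage"},
    {"last_name": "Smith", "address": "7 Pine Rd", "account": "CHK-7210", "system": "core_banking"},
]

def household_key(rec):
    """Business rule: Household = same last name + address + account number."""
    return (rec["last_name"].lower(), rec["address"].lower(), rec["account"])

# Group records into households and note which systems contributed to each.
households = defaultdict(list)
for rec in records:
    households[household_key(rec)].append(rec["system"])

for key, systems in households.items():
    print(f"Household {key}: seen in {sorted(set(systems))}")
```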

Now, what would my experience have been if my bank had a capable Master Data Management solution? Let’s take a look:

Process: Start a new mortgage application online
Without MDM: The customer is required to fill out the usual information (name, address, employer, email, phone, existing accounts, etc.).
With MDM: The online banking system references the MDM solution, which delivers the most recent master record for the customer based on existing data from the bank’s core banking and brokerage systems and pre-populates the form with those details, including information for their existing savings and credit card accounts with the bank.
Benefit with MDM:
  • Accelerates new customer on-boarding
  • Mitigates the risk of a competitor grabbing my attention to do business with them

Process: New email address from customer
Without MDM: The customer enters it on their mortgage application and it gets entered only into the bank’s loan origination system.
With MDM: MDM recognizes that the email address is different from what exists in other systems and asks the customer to confirm the change. The master record is updated and shared across the bank’s other systems in real time, including the downstream data warehouse used by Marketing to drive cross-sell campaigns.
Benefit with MDM:
  • Ensures every part of the bank shares the latest information about the customer
  • Avoids disruptions in future communication of new products and offers to grow wallet share

The banking industry continues to face headwinds from a revenue, risk, and regulatory standpoint. Traditional product-centric silos will not go away anytime soon, and new CRM and client onboarding solutions may help improve customer engagement and productivity within a firm; however, front-office business applications are not designed to manage and share critical master data across your enterprise. Anyhow, I decided to bank with another institution that I know has Master Data Management. Are you ready for a new bank too?

For more information on Informatica’s Master Data Management:

Jumping on the Internet of Things (IoT) Band Wagon?

There is a new band wagon out there, and it’s not Big Data. If you were at this year’s CES show this past week, it would have been impossible, even with a “Las Vegas-size” hangover, not to have heard the hype around the Internet of Things (IoT). The Internet of Things includes anything and everything that is connected to the Internet and able to communicate and share information with other “smart” devices. This year, as well as last, it was all about home appliances, fitness and health monitors, home security systems, Bluetooth-enabled toothbrushes, sensors in shoes that monitor weight and mileage, thermostats that monitor humidity and sound, and kitchen utensils that can track and monitor the type of food you cook and eat.

If you ask me, all these devices and the IoT movement are both cool and creepy. Cool in the sense that networking technology has matured and become affordable enough for devices to transmit data that companies can turn into actionable intelligence. Creepy in the sense that I am not sure I want someone monitoring what I cook or how many times I wake up at night. Like other hype cycles or band wagons, there are different opinions as to the size of the IoT market. Gartner expects it to include nearly 26 billion devices, with a “global economic value-add” of $1.9 trillion by 2020. The question is whether the Internet of Things is truly transformational to our daily lives. The answer really depends on being able to harness all that data into information. Just because my new IoT toothbrush can monitor and send data on how many times I brush my teeth, it doesn’t provide any color on whether that makes me healthier or gives me a prettier smile :).

To help answer these questions, here are examples and potential use cases of leveraging all that Big Data from Small devices of the IoT world:

  • Mimo’s Smart Baby Monitor is aimed at helping to prevent SIDS; it is a new kind of infant monitor that provides parents with real-time information about their baby’s breathing, skin temperature, body position, and activity level on their smartphones.
  • GlowCaps fit prescription bottles and, via a wireless chip, provide services that help people stick with their prescription regimen, from reminder messages all the way to refill and doctor coordination.
  • BeClose offers a wearable alarm button and other discreet wireless sensors placed around the home; the BeClose system can track your loved one’s daily routine and give you peace of mind for their safety by alerting you to any serious disruptions detected in their normal schedule.
  • Postscapes highlights technology in which a suite of sensors and web connectivity helps save time and resources by keeping plants fed based on their actual growing needs and conditions while automating much of the labor.
  • The OnFarm solution combines real-time sensor data on soil moisture levels, weather forecasts, and pesticide usage from farming sites into a consolidated web dashboard. Farmers can use this data, along with advanced imaging and mapping information, to spot crop issues and remotely monitor all of the farm’s assets and resource usage levels.
  • Banks and auto lenders are using cellular GPS units that report the location and usage of financed cars, in addition to locking the ignitions to prevent further movement in the case of default.
  • Sensors on farm equipment now provide real-time intelligence on how many hours tractors are used, weather conditions that can predict mechanical problems, and farmer productivity that can help predict trends in the commodity market.

I can see a number of other potential use cases for IoT including:

  • Health devices not only sending data but also receiving data from other IoT devices to provide real-time recommendations on workout routines, based on weather data from real-time weather sensors, food intake from kitchen devices, and nutritional information on vitamins and medications consumed by the wearer.
  • Credit card banks leveraging their GPS tracking device data from auto loan customers to combine it with credit card data to deliver real-time offers on merchant promotions while on the road.
  • GPS tracking devices on hotel card keys to track where you go, eat, entertain to deliver more customized services and offers while one is on a business trip or vacation.
  • Boxing gloves transmitting the impact and force of a punch to monitor for athlete concussions.

What does this all mean?

The Internet of Things has changed the way we live and do business and will continue to shape the future, hopefully in a positive way. Harnessing all of that Big Data from small devices does not come easily. Every device that generates data sends it to some central system through a WiFi or cellular network. Once in that central system, the data needs to be accessed, translated, transformed, cleansed, and standardized for business use alongside data from other systems that run the business. For example:

  • Access, transform, and validate data from IoT devices alongside data generated by other business applications. Formats and values will often differ and change over time, and they need to be rationalized and standardized for downstream business use. Otherwise, you end up with a bunch of alphas and numerics that make no sense.
  • Data quality and validation: just because a sensor can send data, it does not mean it will send the right data, or data that is right for a business user trying to make sense of it. GPS data requires accurate coordinates; if any value is transmitted incorrectly, it is important to identify those errors and, more importantly, correct them so the business can take action. This is especially important when combining like values (e.g., Weather status = Cold, Wet, Hot, while the device is sending A, B, C). A minimal standardization sketch follows this list.
  • Shared with other systems: once your data is ready to be consumed by new and existing analytic applications, marketing systems, CRM, or your fraud surveillance systems, it needs to be available in real time if required, in the right format and structure for those applications, and delivered in a way that is seamless, automated, and does not require heavy IT lifting.
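
Here is a minimal sketch of that kind of standardization and validation step, assuming a hypothetical device payload and code mapping (the codes and field names are illustrative):

```python
# Illustrative mapping from raw device codes to standardized business values.
WEATHER_CODES = {"A": "Cold", "B": "Wet", "C": "Hot"}

def standardize_reading(raw: dict) -> dict:
    """Translate device codes and validate GPS coordinates before downstream use."""
    issues = []

    weather = WEATHER_CODES.get(raw.get("weather_code"))
    if weather is None:
        issues.append(f"unknown weather code: {raw.get('weather_code')!r}")

    lat, lon = raw.get("lat"), raw.get("lon")
    if lat is None or lon is None or not (-90 <= lat <= 90 and -180 <= lon <= 180):
        issues.append(f"invalid GPS coordinates: ({lat}, {lon})")

    return {
        "device_id": raw.get("device_id"),
        "weather_status": weather,
        "lat": lat,
        "lon": lon,
        "issues": issues,            # errors are flagged for correction, not silently dropped
    }

# Example payloads: one clean, one with a bad code and bad coordinates.
print(standardize_reading({"device_id": "D-1", "weather_code": "B", "lat": 37.77, "lon": -122.42}))
print(standardize_reading({"device_id": "D-2", "weather_code": "Z", "lat": 999.0, "lon": -122.42}))
```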

In closing, IoT’s future is bright, along with the additional insights gained from all that data. Consider it cool or creepy, one thing is for sure: the IoT band wagon is in full swing!

The 3 Little Architects and the Big Bad Mr. Wolf – A Data Parody for today’s Financial Industry

Once upon a time, there were 3 Information Architects working in the financial services industry, each with different firms and backgrounds but all responsible for recommending the right technology solutions to help their firms comply with industry regulations including ongoing bank stress testing across the globe.  Since 2008, bank regulators have been focused on measuring systemic risk and requiring banks to provide transparency into how risk is measured and reported to support their capital adequacy needs.

The first architect grew up through the ranks, starting as a Database Administrator and a black belt in SQL and COBOL programming. Hand coding was their DNA for many years and was thought of as the best approach given how customized their business and systems were compared with other organizations. As such, Architect #1 and their team went down the path of building their data management capabilities through custom hand-coded scripts, manual data extractions and transformations, and dealing with data quality issues through the business organizations after the data was delivered. Though this approach delivered on their short-term needs, the firm soon realized the overhead required to make changes and respond to new requests driven by new industry regulations and changing market conditions.

The second architect is a “gadget guy” at heart who grew up using off-the-shelf tools rather than hand coding to manage data. He and his team decided not to hand code their data management processes and instead adopted and built their solution leveraging best-of-breed tools, some of which were open source and others from existing solutions the company had acquired in previous projects for data integration, data quality, and metadata management. Though these tools helped automate much of the “heavy lifting,” he and his IT team were still responsible for integrating the point solutions to work together, which required ongoing support and change management.

The last architect is as technically competent as his peers; however, he understood the value of building something once to use across the business. His approach was a little different from the first two. Understanding the risks and costs of hand coding or using one-off tools to do the work, he decided to adopt an integrated platform designed to handle the complexity, sources, and volumes of data required by the business. The platform also incorporated shared metadata, reusable data transformation rules and mappings, a single source of required master and reference data, and agile development capabilities to reduce the cost of implementation and ongoing change management. Though this approach was more expensive to implement, the long-term cost and performance benefits made the decision a “no brainer.”

Lurking in the woods is Mr. Wolf. Mr. Wolf is not your typical antagonist; he is a regulatory auditor whose responsibility is to ensure these banks can explain how the risk they report to the regulatory authorities is calculated. His job isn’t to shut these banks down, but to make sure the financial industry is able to measure risk across the enterprise, explain how risk is measured, and ensure these firms are adequately capitalized as mandated by new and existing industry regulations.

Mr. Wolf visits the first bank for an annual stress test audit. Looking at the result of their stress test, he asks the compliance teams to explain how their data was produced, transformed, and calculated to support the risk measurements they reported as part of the audit. Unfortunately, because of the first architect’s recommendation to hand code their data management processes, IT failed to provide explanations and documentation of what they did, and the developers who created their systems were no longer with the firm. As a result, the bank failed miserably, resulting in stiff penalties and higher audit costs.

Architect #2’s bank was next. Having heard in the news what happened to their peer, the architect and IT teams were confident that they were in good shape to pass their stress test audit. After digging into the risk reports, Mr. Wolf questioned the validity of the data used to calculate Value at Risk (VaR). Unfortunately, the tools that were adopted were never designed nor guaranteed by the vendors to work with each other, resulting in invalid data mapping and data quality rules and gaps within their technical metadata documentation. As a result, bank #2 also failed their audit and found themselves with a ton of one-off tools that helped automate their data management processes but lacked the integration and sharing of rules and metadata needed to satisfy the regulator’s demand for risk transparency.

Finally, Mr. Wolf investigated Architect #3’s firm. Having seen the results at the first two banks, Mr. Wolf was leery of their ability to pass the stress test audit. Similar demands were presented by Mr. Wolf; however, this time Bank #3 provided detailed and comprehensive metadata documentation of their risk data measurements, descriptions of the data used in each report, a comprehensive report of each data quality rule used to cleanse their data, and detailed information on each counterparty and legal entity used to calculate VaR. Unable to find gaps in their audit, Mr. Wolf, expecting to “blow” the house down, delivered a passing grade to Bank #3 and its management team, thanks to the right investments they had made to support their enterprise risk data management needs.

The moral of this story, similar to the familiar one involving the three little pigs, is the importance of having a solid foundation to weather market and regulatory storms, or the violent bellow of a big bad wolf. A foundation that covers the required data integration, data quality, master data management, and metadata management needs, but also supports collaboration and visibility into how data is produced, used, and performing across the business. Ensuring current and future compliance in today’s financial services industry requires firms to have a solid data management platform, one that is intelligent and comprehensive and allows Information Architects to mitigate the risks and costs of hand coding or using point tools to get by only in the short term.

Are you prepared to meet Mr. Wolf?

Data Security – A Major Concern in 2015

2014 ended with a ton of hype and expectations, and some drama if you are a data security professional or a business executive responsible for shareholder value. The recent attacks on Sony Pictures by North Korea during December caught everyone’s attention, not about whether Sony would release “The Interview” but about how vulnerable we as a society are to these criminal acts.

I have to admit, I was one of those who saw the movie. I found the film humorous, to say the least, and can see why a desperate regime like North Korea would not want its leader admitting he loves margaritas and Katy Perry. What concerned me about the whole event was whether these unwanted security breaches are now just a fact of life. As a disclaimer, I have no stake in the downfall of the North Korean government; however, what transpired was fascinating, and it is amazing that companies like Sony continue to struggle to protect sensitive data despite being among the largest companies in the world.

According to the Identity Theft Resource Center, there were 761 reported data security breaches in 2014, impacting over 83 million records across industries and geographies, with B2B and B2C retailers leading the pack at 79.2% of all breaches. Most of these breaches originated over the internet via malicious worms and viruses purposely designed to identify and relay back sensitive information, including credit card numbers, bank account numbers, and Social Security information, which criminals use to wreak havoc and inflict significant financial losses on merchants and financial institutions. According to the 2014 Ponemon Institute research study:

  • The average cost of cyber-crime per company in the US was $12.7 million this year, according to the Ponemon report, and US companies on average are hit with 122 successful attacks per year.
  • Globally, the average annualized cost for the surveyed organizations was $7.6 million per year, ranging from $0.5 million to $61 million per company. Interestingly, small organizations have a higher per-capita cost than large ones ($1,601 versus $437), the report found.
  • Some industries incur higher costs in a breach than others, too. Energy and utility organizations incur the priciest attacks ($13.18 million), followed closely by financial services ($12.97 million). Healthcare incurs the fewest expenses ($1.38 million), the report says.

Despite all the media attention around these awful events last year, 2015 does not look like it’s going to get any better. According to CNBC just this morning, Morgan Stanley reported a data security breach in which it fired an employee who it claims stole account data for hundreds of thousands of its wealth management clients. Stolen information for approximately 900 of those clients was posted online for a brief period of time. With so much to gain from this rich data, businesses across industries have a tough battle ahead of them, as criminals get more creative and desperate to steal sensitive information for financial gain. According to Forrester Research, the top three breach activities were:

  • Inadvertent misuse by insider (36%)
  • Loss/theft of corporate asset (32%)
  • Phishing (30%)

Given the growth in data volumes fueled by mobile, social, cloud, and electronic payments, the war against data breaches will continue to grow bigger and uglier for firms large and small. As such, Gartner predicts investments in information security solutions will grow a further 8.2 percent in 2015 vs. 2014, reaching more than $76.9 billion globally. Furthermore, by 2018, more than half of organizations will use security services firms that specialize in data protection, security risk management, and security infrastructure management to enhance their security postures.

Like any war, you have to know your enemy and what you are defending. In the war against data breaches, this starts with knowing where your sensitive data is before you can effectively defend against any attack. According to the Ponemon Institute, only 18% of firms surveyed said they knew where their structured sensitive data was located, while the rest were not sure. 66% revealed that they would not be able to effectively know if they were attacked. Even worse, 47% were not confident they had visibility into users accessing sensitive or confidential information, and 48% of those surveyed admitted to a data breach of some kind in the last 12 months.

In closing, the responsibilities of today’s information security professionals, from Chief Information Security Officers to Security Analysts, are challenging and growing each day as criminals become more sophisticated and desperate to get their hands on one of your most important assets: your data. As your organization looks to invest in new information security solutions, make sure you start with solutions that allow you to identify where your sensitive data is, to help plan an effective data security strategy that defends both your perimeter and your sensitive data at the source. How prepared are you?

For more information about Informatica Data Security Solutions:

  • Download the Gartner Data Masking Magic Quadrant Report
  • Click here to learn more about Informatica’s Data Masking Solutions
  • Click here to access Informatica Dynamic Data Masking: Preventing Data Breaches with Benchmark-Proven Performance whitepaper

How to Get the Biggest Returns from Your Hadoop and Big Data Investments in 2015

2014 was the year that Big Data went mainstream, from conversations asking “What is Big Data?” to “How do we harness the power of Big Data to solve real business problems?” It seemed like everyone jumped on the Big Data band wagon, from new software start-ups offering the “next generation” of predictive analytic applications to traditional database, data quality, business intelligence, and data integration vendors, all calling themselves Big Data providers. The truth is, they all play a role in this Big Data movement.

Earlier in 2014, Wikibon estimated the Big Data market is currently on pace to top $50 billion in 2017, which translates to a 38% compound annual growth rate over the six year period from 2011 (the first year Wikibon sized the Big Data market) to 2017. Most of the excitement around Big Data has been around Hadoop as early adopters who experimented with open source versions quickly grew to adopt enterprise-class solutions from companies like Cloudera™, HortonWorks™, MapR™, and Amazon’s RedShift™ to address real-world business problems including: (more…)

Achieving Great Data in the Oil and Gas Industry

Have you noticed something different this winter season that most people are cheery about? I’ll give you a hint. It’s not the great sales going on at your local shopping mall, but something that makes getting to the mall a lot more affordable than last year. It’s the extremely low gas prices across the globe, fueled by an over-supply of oil relative to demand, driven by geo-politics and a boom in shale oil production in North America and abroad. Like any other commodity, it’s impossible to predict where oil prices are headed; however, one thing is sure: Oil and Gas companies will need timely, quality data as they invest in new technologies to become more agile, innovative, efficient, and competitive, as reported in a recent IDC Energy Insights Predictions report for 2015.

The report predicts:

  1. 80% of the top O&G companies will reengineer processes and systems to optimize logistics, hedge risk and efficiently and safely deliver crude, LNG, and refined products by the end of 2017.
  2. Over the next 3 years, 40% of O&G majors and all software divisions of oilfield services (OFS) will co-innovate on domain specific technical projects with IT professional service firms.
  3. The CEO will expect immediate and accurate information about top Shale Plays to be available by the end of 2015 to improve asset value by 30%.
  4. By 2016, 70% of O&G companies will have invested in programs to evolve the IT environment to a third platform driven architecture to support agility and readily adapt to change.
  5. With continued labor shortages and over 1/3 of the O&G workforce under 45 in three years, O&G companies will turn to IT to meet productivity goals.
  6. By the end of 2017, 100% of the top 25 O&G companies will apply modeling and simulation tools and services to optimize oil field development programs and 25% will require these tools.
  7. Spending on connectivity related technologies will increase by 30% between 2014 and 2016, as O&G companies demand vendors provide the right balance of connectivity for a more complex set of data sources.
  8. In 2015, mergers, acquisitions and divestitures, plus new integrated capabilities, will drive 40% of O&G companies to re-evaluate their current deployments of ERP and hydrocarbon accounting.
  9. With a business case built on predictive analytics and optimization in drilling, production and asset integrity, 50% of O&G companies will have advanced analytics capabilities in place by 2016.
  10. With pressures on capital efficiency, by 2015, 25% of the Top 25 O&G companies will apply integrated planning and information to large capital projects, speeding up delivery and reducing over-budget risks by 30%.

Realizing value from these investments will also require Oil and Gas firms to modernize and improve their data management infrastructure and technologies to deliver great data, whether to fuel actionable insights from Big Data technology or to facilitate post-merger application consolidation and integration activities. Great data is only achievable through great design, supported by capable solutions designed to help access and deliver timely, trusted, and secure data to those who need it most.

A lack of proper data management investment and competence has long plagued the oil and gas sector with “less-than-acceptable” data and higher operating costs. According to the “Upstream Data and Information Management Survey” conducted by Wipro Technologies, 56% of those surveyed felt that business users spent a quarter or more of their time on low-value activities caused by existing data issues (e.g., accessing, cleansing, and preparing data) rather than on high-value activities (e.g., analysis, planning, decision making). The same survey showed the biggest data management issues were timely access to required data and data quality issues in source systems.

So what can Oil and Gas CIO’s and Enterprise Architects do to prepare for the future?  Here are some tips for consideration:

  • Migrate and automate legacy hand-coded data transformation processes by adopting tools that streamline the development, testing, deployment, and maintenance of these complex tasks, helping developers build, maintain, and monitor data transformation rules once and deploy them across the enterprise.
  • Simplify how data is distributed across systems with more modern architectures and solutions, and avoid the cost and complexity of point-to-point integrations.
  • Deal with and manage data quality upstream at the source and throughout the data life cycle, rather than having end users fix unforeseen data quality errors manually (a minimal validation sketch follows this list).
  • Create a centralized source of shared business reference and master data that can manage a consistent record across heterogeneous systems, such as well asset/material information (wellhead, field, pump, valve, etc.), employee data (drill/reservoir engineer, technician), location data (often geo-spatial), and accounting data (for financial roll-ups of cost and production data).
  • Establish standards and repeatable best practices by adopting an Integration Competency Center framework to support the integration and sharing of data between operational and analytical systems.
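
To illustrate the “manage data quality at the source” point, here is a minimal sketch of a reusable validation rule applied at ingestion. The well identifier format, status values, and coordinate ranges are illustrative assumptions, not an industry standard encoded here:

```python
import re
from typing import List

# Illustrative validation rules applied when well master data is ingested,
# rather than leaving end users to fix errors downstream.
API_WELL_NUMBER = re.compile(r"^\d{2}-\d{3}-\d{5}$")   # hypothetical well identifier format

def validate_well_record(rec: dict) -> List[str]:
    """Return a list of data quality issues found in a single well record."""
    issues = []
    if not API_WELL_NUMBER.match(rec.get("well_id", "")):
        issues.append(f"malformed well_id: {rec.get('well_id')!r}")
    if rec.get("status") not in {"producing", "shut-in", "abandoned"}:
        issues.append(f"unknown status: {rec.get('status')!r}")
    lat, lon = rec.get("lat"), rec.get("lon")
    if lat is None or lon is None or not (-90 <= lat <= 90 and -180 <= lon <= 180):
        issues.append(f"invalid location: ({lat}, {lon})")
    return issues

# Example: one clean record, one that would be rejected or routed for correction.
print(validate_well_record({"well_id": "42-123-45678", "status": "producing", "lat": 31.9, "lon": -102.1}))
print(validate_well_record({"well_id": "X-99", "status": "unknown", "lat": None, "lon": -102.1}))
```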

In summary, low oil prices have a direct and positive impact on consumers, especially during the winter season and holidays, and I personally hope they continue for the foreseeable future, given that prices were double just a year ago. Unfortunately, no one can predict future energy prices; however, one thing is for sure: the demand for great data by Oil and Gas companies will continue to grow. As such, CIOs and Enterprise Architects will need to recognize the importance of improving their data management capabilities and technologies to ensure success in 2015. How ready are you?

Click to learn more about Informatica in today’s Energy Sector:

What is the Silver Lining in Cloud for Financial Services?

This was a great week of excitement and innovation here in San Francisco, starting with the San Francisco Giants winning the National League Pennant for the 3rd time in 5 years on the same day Salesforce’s Dreamforce 2014 wrapped up its largest customer conference yet, with over 140K attendees from all over the world talking about the new Customer Success Platform.

Salesforce has come a long way from its humble beginnings as the new kid on the cloud front for CRM. The integrated sales, marketing, support, collaboration, application, and analytics capabilities of the Salesforce Customer Success Platform exemplify innovation and significant business value upside for various industries, and I see it as particularly promising for today’s financial services industry. However, like any new business application, the value a business gains from it depends on having the right data available for the business.

The reality is, SaaS adoption by financial institutions has not been as quick as in other industries, due to privacy concerns, regulations that govern what data can reside in public infrastructure, the ability to customize to fit their business needs, cultural barriers within larger institutions insisting that critical business applications reside on-premise for control and management purposes, and the challenges of integrating data between existing systems and SaaS applications. However, experts are optimistic that the industry may have turned the corner. Gartner (NYSE:IT) asserts that more than 60 percent of banks worldwide will process the majority of their transactions in the cloud by 2016. Let’s take a closer look at some of the challenges and what’s required to overcome these obstacles when adopting cloud solutions to power your business.

Challenge #1:  Integrating and sharing data between SaaS and on-premise must not be taken lightly

Most banks and insurance companies considering new SaaS-based CRM, marketing, and support applications, with solutions from Salesforce and others, must consider the importance of migrating and sharing data between cloud and on-premise applications in their investment decisions. Migrating existing customer, account, and transaction history data is often done by IT staff through custom extracts, scripts, and manual data validations, which can carry over invalid information from legacy systems, making these new application investments useless in many cases.

For example, customer type descriptions from one or many existing systems may be correct in their respective databases, and collapsing them into a common field in the target application seems easy to do. Unfortunately, these transformation rules can be complex, and that complexity increases when dealing with tens if not hundreds of applications during the migration and synchronization phase (a simplified mapping sketch follows below). Having capable solutions to support the testing, development, quality management, validation, and delivery of existing data from old to new is not only good practice but a proven way of avoiding costly workarounds and business pain in the future.
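
As a simplified illustration of that kind of customer-type collapse, here is a minimal sketch mapping source-specific codes to a single target value. The system names, codes, and target values are hypothetical:

```python
# Hypothetical source-system codes for "customer type" collapsed into one target field.
CUSTOMER_TYPE_MAP = {
    ("core_banking", "IND"):  "Individual",
    ("core_banking", "BUS"):  "Business",
    ("claims_system", "P"):   "Individual",   # "P" = personal lines in this source
    ("claims_system", "C"):   "Business",     # "C" = commercial lines in this source
}

def map_customer_type(source_system: str, source_code: str) -> str:
    """Collapse a source-specific code into the common target value, flagging unknowns."""
    try:
        return CUSTOMER_TYPE_MAP[(source_system, source_code.strip().upper())]
    except KeyError:
        # Unknown combinations are routed for review instead of being loaded silently.
        return "UNMAPPED"

print(map_customer_type("core_banking", "IND"))    # Individual
print(map_customer_type("claims_system", "c"))     # Business
print(map_customer_type("legacy_crm", "XX"))       # UNMAPPED
```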

Challenge #2:  Managing and sharing a trusted source of shared business information across the enterprise

As new SaaS applications are adopted, it is critical to understand how to best govern and synchronize common business information, such as customer contact information (e.g., address, phone, email), across the enterprise. Most banks and insurance companies have multiple systems that create and update critical customer contact information, many of which reside on-premise. For example, when insurance customers update contact information such as a phone number or email address while filing a claim, the claims specialist will often enter or update that information only in the claims system, given the siloed nature of many traditional banking and insurance companies. This is the power of Master Data Management: it is purposely designed to identify changes to master data, including customer records, in one or many systems, update the customer master record, and share that update across the other systems that house and rely on that information, which is essential for business continuity and success. (A simplified change-propagation sketch follows.)
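
Here is a minimal sketch of that change-detection-and-propagation idea, assuming hypothetical in-memory “systems” rather than a real MDM product:

```python
# Hypothetical in-memory "systems", each holding its own copy of a customer's contact data.
systems = {
    "claims":       {"CUST-1": {"email": "old@example.com", "phone": "555-0100"}},
    "core_banking": {"CUST-1": {"email": "old@example.com", "phone": "555-0100"}},
    "crm":          {"CUST-1": {"email": "old@example.com", "phone": "555-0100"}},
}
master = {"CUST-1": {"email": "old@example.com", "phone": "555-0100"}}

def propagate_change(customer_id: str, source: str, updates: dict) -> None:
    """Detect a change in one source system, update the master record, and sync the rest."""
    changed = {k: v for k, v in updates.items() if master[customer_id].get(k) != v}
    if not changed:
        return                                  # nothing new; no propagation needed
    master[customer_id].update(changed)         # golden record gets the latest values
    for name, store in systems.items():
        store[customer_id].update(changed)      # every system now shares the same contact data
    print(f"{source} changed {list(changed)} for {customer_id}; synced to {sorted(systems)}")

# A customer updates their email while filing a claim; all systems stay consistent.
propagate_change("CUST-1", "claims", {"email": "new@example.com"})
print(systems["crm"]["CUST-1"]["email"])  # new@example.com
```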

In conclusion, SaaS adoption will continue to grow in financial services and across other industries. The silver lining in the cloud is your data and the technology that supports its consumption and distribution across the enterprise. Banks and insurance companies investing in new SaaS solutions will operate in a hybrid environment made up of cloud applications and core transaction systems that reside on-premise. To ensure these investments yield value for the business, it is important to invest in a capable and scalable data integration platform to integrate, govern, and share data in a hybrid ecosystem. To learn more about how to deal with these challenges, click here and download a complimentary copy of the new “Salesforce Integration for Dummies.”
