Category Archives: Data Security

British Cycling: A Big Data Champion?


I think I may have gone to too many conferences in 2014 in which the potential of big data was discussed.  After a while all the stories blurred into two main themes:

  1. Companies have gone bankrupt at a time when demand for their core products increased.
  2. Data from mobile phones, cars and other machines houses a gold mine of value – we should all be using it.

My main takeaway from the 2014 conferences was that no amount of data can make up for a poor strategy, or for a lack of the organisational agility needed to adapt business processes in times of disruption.  However, I still feel that, as an industry, our stories are stuck in the ‘Big Data Hype’ phase, while most organisations are beyond the hype and need practicalities, guidance and inspiration to turn their big data projects into a success.  This is possibly due to the limited number of big data projects in production, or perhaps it is too early to measure the long-term results of existing projects.  Another possibility is that the projects are delivering significant competitive advantage, so the stories will remain under wraps for the time being.

However, towards the end of 2014 I stumbled across a big data success story in an unexpected place.  It did (literally) provide competitive advantage, and since it has been running for a number of years, the results are plain to see.  It started with a book recommendation from a friend.  ‘Faster’ by Michael Hutchinson is written as a self-propelled investigation into the difference between world-champion and world-class athletes.  It promised to satisfy my slightly geeky tendency to enjoy facts, numerical details and statistics.  It did this – but it also struck me as a ‘how-to’ guide for big data projects.

Mr Hutchinson’s book is an excellent insight into professional cycling, written by a professional cyclist.  It is stacked with interesting facts and well-written anecdotes, and I highly recommend reading it.  Since the big data aspect is a sub-plot, I will pull out the highlights without distracting from the main story.

Here are the five steps I extracted for big data project success:

1. Have a clear vision and goal for your project

The Sydney Olympics in 2000 had produced only 4 medals across all cycling disciplines for British cyclists.  With a home Olympics set for 2012, British Cycling desperately wanted to improve on this performance.  Specific targets were clearly set across all disciplines, stated as the times an athlete would need to achieve in order to win a race.

2. Determine the data required to support these goals

Unlike many big data projects, which start with a data set and then wonder what to do with it, British Cycling worked the other way around.  They worked out what they needed to measure in order to establish the influences on their goal (track time) and set about gathering this information.  In their case this involved gathering wind tunnel data to compare and contrast equipment, as well as physiological data from athletes and all information from cycling activities.

3. Experiment in order to establish causality

Most big data projects involve experimentation: changing the environment whilst gathering a sub-set of data points.  The number of variables to adjust in cycling is large, but all were embraced.  Data (including video) was gathered on the effects of small changes to each component: bike, clothing and athlete (training and nutrition).
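
To make the experimentation step a little more concrete, here is a minimal, hypothetical sketch of evaluating one change at a time: lap times are recorded with and without a single equipment change, and a paired t-test indicates whether the observed difference is likely to be real rather than noise.  The numbers and the scipy dependency are illustrative assumptions, not anything taken from British Cycling.

```python
# Minimal sketch: does a single equipment change produce a real improvement?
# Illustrative data only; assumes scipy is installed.
from scipy import stats

# Lap times (seconds) for the same riders, baseline vs. modified skinsuit.
baseline = [62.1, 61.8, 62.4, 61.9, 62.2, 62.0]
modified = [61.7, 61.5, 62.0, 61.6, 61.9, 61.6]

# Paired t-test: each rider acts as their own control.
t_stat, p_value = stats.ttest_rel(baseline, modified)

mean_gain = sum(b - m for b, m in zip(baseline, modified)) / len(baseline)
print(f"Mean improvement: {mean_gain:.2f}s per lap (p = {p_value:.3f})")
if p_value < 0.05:
    print("Difference is unlikely to be noise - worth adopting.")
else:
    print("No reliable evidence of improvement - keep experimenting.")
```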

4. Guide your employees on how to use the results of the data

Like many employees, cyclists and coaches were convinced of the ‘best way’ to achieve results based on their own personal experience.  Analysis of the data showed that, in some cases, the perceived best way was in fact not the best way.  Coaching staff trusted the data and convinced the athletes to change aspects of both training and nutrition.  This was not necessarily easy to do, as it could mean fundamental changes to an athlete’s lifestyle.

5. Embrace innovation

Cycling is a very conservative sport by nature, with many of the key innovations coming from adjacent sports such as triathlon.  Data, however, is not steeped in tradition and has no pre-conceived ideas about what equipment should look like, or what constitutes an excellent recovery drink.  What made British Cycling’s big data initiatives successful is that they allowed themselves to be guided by the data and put its recommendations into practice.  Plastic-finished skinsuits are probably not the most obvious choice of clothing, but they proved to be the biggest advantage a cyclist could get – far more than tinkering with the bike.  (In fact, they provided so much advantage that they were banned shortly after the 2008 Olympics.)

The results: British Cycling won 4 Olympic medals in 2000, one of which was gold.  In 2012 they grabbed 8 gold, 2 silver and 2 bronze medals.  A quick glance at their website shows that it is not just Olympic medals they are winning – the number of medals won across all world championship events has also increased since 2000.

To me, this is one of the best big data stories, as it shows directly how to succeed with big data strategies in a completely analogue world.  I think it is more insightful than the mere fact that we are producing ever-increasing volumes of data.  The real value of big data lies in understanding what portion of all available data will contribute to achieving your goals, and then embracing the results of the analysis to make constructive changes to daily activities.

But then again, I may just like the story because it involves geeky facts, statistics and fast bicycles.

Posted in Big Data, Business Impact / Benefits, Data Integration, Data Quality, Data Security, Data Services

How Protected is your PHI?

I live in a very small town in Maine. I don’t spend a lot of time thinking about my privacy. Some would say that by living in a small town, you give up your right to privacy because everyone knows what everyone else is doing. Living here is a choice – for me to improve my family’s quality of life. Sharing all of the details of my life – not so much.

When I go to my doctor (who also happens to be a parent from my daughter’s school), I fully expect that any sort of information that I share with him, or that he obtains as a result of lab tests or interviews, or care that he provides is not available for anyone to view. On the flip side, I want researchers to be able to take my lab information combined with my health history in order to do research on the effectiveness of certain medications or treatment plans.

As a result of this dichotomy, Congress (in 1996) started to address governance regarding the transmission of this type of data. The Health Insurance Portability and Accountability Act of 1996 (HIPAA) is a Federal law that sets national standards for how health care plans, health care clearinghouses, and most health care providers protect the privacy of a patient’s health information. With certain exceptions, the Privacy Rule protects a subset of individually identifiable health information, known as protected health information or PHI, that is held or maintained by covered entities or their business associates acting for the covered entity. PHI is any information held by a covered entity which concerns health status, provision of health care, or payment for health care that can be linked to an individual.

Many payers have this type of data in their systems (perhaps in a Claims Administration system), and have the need to share data between organizational entities. Do you know if PHI data is being shared outside of the originating system? Do you know if PHI is available to resources that have no necessity to access this information? Do you know if PHI data is being shared outside your organization?

If you can answer yes to each of these questions – fantastic. You are well ahead of the curve. If not, you need to start considering solutions that can give you those answers and protect PHI wherever it travels.
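
As a simple illustration of what ‘knowing where PHI lives’ might look like in practice, here is a minimal, hypothetical sketch that scans exported records for a few common PHI-like patterns (US Social Security numbers, dates of birth, phone numbers). Real discovery tooling is far more sophisticated; the patterns and field names below are assumptions for illustration only.

```python
import re

# Illustrative patterns for a few common PHI elements; real tools use much
# richer detection (dictionaries, checksums, context, machine learning).
PHI_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "date_of_birth": re.compile(r"\b\d{2}/\d{2}/\d{4}\b"),
    "phone": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def scan_record(record: dict) -> dict:
    """Return the PHI-like patterns found in each field of a record."""
    findings = {}
    for field, value in record.items():
        hits = [name for name, pattern in PHI_PATTERNS.items()
                if pattern.search(str(value))]
        if hits:
            findings[field] = hits
    return findings

# Hypothetical extract shared between systems.
claim = {"member_id": "A-10045", "notes": "DOB 04/12/1971, SSN 123-45-6789"}
print(scan_record(claim))   # {'notes': ['ssn', 'date_of_birth']}
```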

I want researchers to have access to medically relevant data so they can find cures for some horrific diseases. I want to feel comfortable sharing health information with my doctor. I want to feel comfortable that my health insurance company is respecting my privacy. Now to get my kids to stop oversharing.

Posted in Customers, Data Governance, Data masking, Data Privacy, Data Security, Governance, Risk and Compliance, Healthcare

Data Security – A Major Concern in 2015


2014 ended with a ton of hype, expectations and some drama if you are a data security professional or a business executive responsible for shareholder value.  The December attacks on Sony Pictures, attributed to North Korea, caught everyone’s attention – not so much about whether Sony would release “The Interview” as about how vulnerable we as a society are to these criminal acts.

I have to admit, I was one of those who saw the movie. I found the film humorous, to say the least, and can see why a desperate regime like North Korea would not want its leader shown admitting he loves margaritas and Katy Perry. What concerned me about the whole event was whether these unwanted security breaches are now just a fact of life.  As a disclaimer, I have no stake in the downfall of the North Korean government; what transpired was simply fascinating, and it is amazing that a company like Sony, one of the largest in the world, continues to struggle to protect sensitive data.

According to the Identity Theft Resource Center, there were 761 reported data security breaches in 2014, impacting over 83 million records across industries and geographies, with B2B and B2C retailers leading the pack at 79.2% of all breaches. Most of these breaches originated over the internet via malicious worms and viruses purposely designed to identify and relay back sensitive information – including credit card numbers, bank account numbers, and social security numbers – used by criminals to wreak havoc and inflict significant financial losses on merchants and financial institutions. According to the 2014 Ponemon Institute research study:

  • The average cost of cyber-crime per company in the US was $12.7 million this year, according to the Ponemon report, and US companies on average are hit with 122 successful attacks per year.
  • Globally, the average annualized cost for the surveyed organizations was $7.6 million per year, ranging from $0.5 million to $61 million per company. Interestingly, small organizations have a higher per-capita cost than large ones ($1,601 versus $437), the report found.
  • Some industries incur higher costs in a breach than others, too. Energy and utility organizations incur the priciest attacks ($13.18 million), followed closely by financial services ($12.97 million). Healthcare incurs the fewest expenses ($1.38 million), the report says.

Despite all the media attention around these awful events last year, 2015 does not look like it is going to get any better. According to CNBC just this morning, Morgan Stanley reported a data security breach: it had fired an employee who it claims stole account data for hundreds of thousands of its wealth management clients. Stolen information for approximately 900 of those clients was posted online for a brief period of time.  With so much to gain from this rich data, businesses across industries have a tough battle ahead of them, as criminals become more creative and more desperate to steal sensitive information for financial gain. According to Forrester Research, the top 3 breach activities were:

  • Inadvertent misuse by insider (36%)
  • Loss/theft of corporate asset (32%)
  • Phishing (30%)

Given the growth in data volumes fueled by mobile, social, cloud, and electronic payments, the war against data breaches will continue to grow bigger and uglier for firms large and small.  As such, Gartner predicts investments in information security solutions will grow a further 8.2 percent in 2015 over 2014, reaching $76.9+ billion globally.  Furthermore, by 2018, more than half of organizations will use security services firms that specialize in data protection, security risk management and security infrastructure management to enhance their security postures.

Like any war, you have to know your enemy and what you are defending. In the war against data breaches, this starts with knowing where your sensitive data is before you can effectively defend against any attack. According to the Ponemon Institute, only 18% of firms surveyed said they knew where their structured sensitive data was located, while the rest were not sure. 66% revealed that they would not be able to tell whether they had been attacked.  Even worse, 47% were NOT confident they had visibility into the users accessing sensitive or confidential information, and 48% of those surveyed admitted to a data breach of some kind in the last 12 months.
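
To make ‘knowing where your sensitive data is’ slightly more concrete, here is a minimal, hypothetical sketch of metadata-level discovery: it walks the tables of a SQLite database and flags columns whose names suggest sensitive content. Production discovery tools also sample and classify the values themselves; the keyword list and database file name here are illustrative assumptions.

```python
import sqlite3

# Column-name keywords that often indicate sensitive data (illustrative list).
SENSITIVE_KEYWORDS = ("ssn", "social", "credit", "card", "account",
                      "dob", "birth", "salary", "email", "phone")

def find_sensitive_columns(db_path: str) -> list:
    """Return (table, column) pairs whose column names look sensitive."""
    conn = sqlite3.connect(db_path)
    flagged = []
    tables = [row[0] for row in conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'")]
    for table in tables:
        for _, col_name, *_ in conn.execute(f"PRAGMA table_info({table})"):
            if any(key in col_name.lower() for key in SENSITIVE_KEYWORDS):
                flagged.append((table, col_name))
    conn.close()
    return flagged

# Hypothetical database file; in practice you would scan every data store.
print(find_sensitive_columns("claims_warehouse.db"))
```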

In closing, the responsibilities of today’s information security professionals, from Chief Information Security Officers to security analysts, are challenging and growing each day as criminals become more sophisticated and more desperate to get their hands on one of your most important assets: your data.  As your organization looks to invest in new information security solutions, make sure you start with solutions that allow you to identify where your sensitive data is, so you can plan an effective data security strategy that defends both your perimeter and your sensitive data at the source.  How prepared are you?

For more information about Informatica Data Security Solutions:

  • Download the Gartner Data Masking Magic Quadrant Report
  • Click here to learn more about Informatica’s Data Masking Solutions
  • Click here to access Informatica Dynamic Data Masking: Preventing Data Breaches with Benchmark-Proven Performance whitepaper
Posted in Application ILM, Banking & Capital Markets, Big Data, CIO, Data masking, Data Privacy, Data Security

The CISO Challenge: Articulating Data Worth and Security Economics

A few years ago, the former eBay CISO, Dave Cullinane, led a sobering coaching discussion on how to articulate and communicate the value of a security solution and its economics to a CISO’s CxO peers.

Why would I blog about such old news? Because it was a great and timeless idea. And in this age of the ‘Great Data Breach’, where CISOs need all the help they can get, I thought I would share it with y’all.

Dave began by describing how to communicate the impact of an attack – Aurora-style malware, spearphishing, Stuxnet, hacktivism, and so on – versus the investment required to prevent it.  If you are an online retailer and your web server goes down because of a major denial-of-service attack, what does that cost the business?  How much revenue is lost every minute that site is offline? Enough to put you out of business? The figure below illustrates how to approach this conversation.

If the impact of a breach and the risk of losing business is high and the investment in implementing a solution is relatively low, the investment decision is an obvious one (represented by the yellow area in the upper left corner).

CISO Challenge

However, it isn’t always this easy, is it?  When determining what your company’s brand and reputation are worth, how do you develop a compelling case?

Another dimension Dave described is communicating the economics of a solution that could prevent an attack based on the probability that the attack would occur (see next figure below).

CISO Challenge

For example, consider an attack that could influence stock prices.  This is a complex scenario that is probably less likely to occur on a frequent basis, and it would require a sophisticated, multidimensional approach with integrated security analytics to correlate multiple events back to a single source.  This might place the discussion in the middle blue box, the ‘negotiation zone’. This is where the CISO needs to know the CxOs’ risk tolerances and articulate value in terms of the ‘coin of the realm’.
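
One common way to put numbers behind this conversation is an annualized loss expectancy (ALE) comparison: the estimated frequency of an attack multiplied by its expected impact, set against the yearly cost of the preventive solution. The sketch below is purely illustrative – the scenarios and figures are assumptions, not anything from Dave’s talk.

```python
def annualized_loss_expectancy(incidents_per_year: float,
                               loss_per_incident: float) -> float:
    """ALE = annual rate of occurrence x single loss expectancy."""
    return incidents_per_year * loss_per_incident

# Illustrative scenarios: (expected incidents/year, loss per incident, solution cost/year).
scenarios = {
    "DDoS on web storefront": (4.0, 250_000, 300_000),
    "Stock-moving breach":    (0.05, 20_000_000, 1_500_000),
}

for name, (rate, impact, solution_cost) in scenarios.items():
    ale = annualized_loss_expectancy(rate, impact)
    verdict = "obvious buy" if ale > solution_cost else "negotiation zone"
    print(f"{name}: ALE ${ale:,.0f} vs. solution ${solution_cost:,.0f} -> {verdict}")
```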

Finally, stay on top of what the business is cooking up in the way of new initiatives that could expose or introduce new risks.  For example, is marketing looking to spin up a data warehouse on Amazon Redshift? Is anyone on the analytics team tinkering with Hadoop in the cloud? Is development planning to outsource application test and development activities to offshore systems integrators? If you are participating in any of these activities, make sure your CISO isn’t the last to know when a ‘breach happens’!

To learn more about ways you can mitigate risk and maintain data privacy compliance, check out the latest Gartner Data Masking Magic Quadrant.

Posted in Data Governance, Data masking, Data Privacy, Data Security, Governance, Risk and Compliance

Are The Banks Going to Make Retailers Pay for Their Poor Governance?

 


A couple of months ago, I reached out to a set of CIOs on the importance of good governance and security. All of them agreed that both were incredibly important. However, one CIO offered a very pointed remark, saying that “the IT leadership at these breached companies wasn’t stupid.” He continued by saying that when selling to the rest of the C-suite, the discussion needs to be about business outcomes and business benefits.  For this reason, he said, CIOs have struggled to sell the value of investments in governance and security. Now, I have suggested previously that security pays because of its impact on “brand promise”.  And I still believe this.

However, this week the ante was raised even higher. A district judge ruled that a group of banks can proceed to sue a retailer for negligence in its data governance and security. The decision could clearly lead to significant changes in the way the cost of fraud is distributed among parties within the credit card ecosystem. Where once banks and merchant acquirers would have shouldered the burden of fraud, this decision paves the way for more card-issuing banks to sue merchants for not adequately protecting their POS systems.


The judge’s ruling said that “although the third-party hackers’ activities caused harm, [the] merchant played a key role in allowing the harm to occur.” The judge also determined that the banks’ suit against the merchant was valid because the plaintiffs adequately showed that the retailer failed “to disclose that its data security systems were deficient.” This is interesting because it says that security systems should be sufficient and, if they are not, retailers need to inform potentially affected stakeholders of the deficiency. And while taking this step could avoid a lawsuit, it would likely increase the cost of interchange for riskier merchants. This would effectively create a risk premium for retailers that do not adequately govern and protect their IT environments.

There are broad implications for all companies that end up harming customers, partners, or other stakeholders by not keeping their security systems up to snuff. The question is: will this give good governance enough of a business outcome and benefit that businesses will actually want to pay it forward – i.e. invest in good governance and security? What do you think? I would love to hear from you.

Related links

Solutions:

Enterprise Level Data Security

Hacking: How Ready Is Your Enterprise?

Gambling With Your Customer’s Financial Data

The State of Data Centric Security

Twitter: @MylesSuer

 

Posted in Banking & Capital Markets, CIO, Data Governance, Data Security, Financial Services

Homeland = Your Average Life Insurance Company?

Or, in other words: did the agency model kill data quality?  When you watch the TV series “Homeland”, you quickly realize the interdependence between field operatives and the command center.  This is a classic agency model.  One arm gathers, filters and synthesizes information and prepares a plan, but the guys on the ground use this intel to guide their sometimes ad hoc next moves.

Over the last few months I have worked a lot – and I mean A LOT – with a variety of mid-sized life insurers (<$1B annual revenue) on fixing their legacy-inherited data quality problems.  Their IT departments, functioning like operations command centers (intel acquisition, filtering, synthesis), were inundated with requests to fix up and serve a coherent, true, enriched central view of a participant (the target) and all his policies and related entities, from and to all relevant business lines (planning), so each could achieve its respective mission (service, retain, upsell, mitigate risk): employee benefits, broker/dealer, retirement services, etc.

The captive and often independent agents (execution), however, frequently run into an operation (the sales cycle) with little useful data, as the Ops Center is short on timely and complete information.  Imagine Carrie directing her strike team to just wing it based on their experience and dated intel from a raid a few months ago, without real-time drone video feeds.  Would she be saying, “Guys, it’s always been your neck, you’ll figure it out”?  I think not.


This became apparent when talking to the actuaries, claims operations, marketing, sales, agency operations, audit, finance, strategic planning, underwriting and customer service – common denominators appeared quickly:

  • Every insurer saw the need to become customer instead of policy centric. That’s the good news.
  • Every insurer knew their data was majorly sub-standard in terms of quality and richness.
  • Every insurer agreed that they are not using existing data capture tools (web portals for agents and policy holders, CRM applications, policy and claims mgmt systems) to their true potential.
  • New books of business were generally managed as separate entities, from a commercial as well as an IT systems perspective, even if they were not net-new products (like trust products).  Cross-selling, even if desired, thus becomes a major infrastructure hurdle.
  • As in every industry, the knee-jerk reaction was to throw the IT folks at data quality problems and make it a back-office function.  A Pyrrhic victory.
  • Upsell scenarios, if at all strategically targeted, are squarely in the hands of the independent agents.  The insurer will, at most, support customer insights around lapse avoidance or 401k roll-off indicators for retiring or terminated plan participants.  This may be derived from a plan sponsor (employer) census file, which may have incorrect address information.
  • Prospect and participant e-mail addresses are either not captured (process enforcement or system capability) or not validated (domain checks, e-mail verification – see the sketch after this list), so the vast majority of customer correspondence, like scenarios, statements, privacy notices and policy changes, travels via snail mail (and typically per policy). Overall, only 15-50% of contacts have an (unverified) e-mail address on file today, and of these fewer than a third subscribed to exclusive electronic statement delivery.
  • Postal address information is still not 99% correct, complete or current, resulting in high returned-mail costs ($120,000-$750,000 every quarter), priority mail upgrades, statement reprints, manual change capture and shredding costs, as well as the occasional USPS fine.
  • Data quality, as unbelievable as it sounds, took a back seat to implementing a new customer data warehouse, a new claims system, a new risk data mart, etc.  They all just get filled with the same old, bad data, as business users were – and I quote – “used to the quality problem already”.
  • Premium underpricing runs at 2-4% annually, foregoing millions in additional revenue, due to lack of a full risk profile.
  • Customer cost-of-acquisition (CAC) is unknown or incorrect, as there is no direct, realistic tracking of agency campaign/education dollars spent against new policies written.
  • Agency historic production and projections are unclear, as dynamic enforcement of hierarchies is not available, resulting in orphaned policies that generate excess tax burdens.  Often this is the case when agents move to states where they are not licensed, pass away or retire.

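As a small illustration of the e-mail validation gap mentioned above, here is a minimal, hypothetical sketch: a syntax check plus a DNS MX lookup to confirm the domain can actually receive mail. The dnspython dependency and the sample addresses are assumptions for illustration only; real verification services go much further (mailbox checks, suppression lists, typo correction).

```python
# Minimal e-mail validation sketch: syntax check + MX record lookup.
# Assumes the third-party package dnspython is installed (pip install dnspython).
import re
import dns.exception
import dns.resolver

EMAIL_RE = re.compile(r"^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}$")

def validate_email(address: str) -> tuple:
    """Return (is_plausible, reason) for a single address."""
    if not EMAIL_RE.match(address):
        return False, "malformed address"
    domain = address.rsplit("@", 1)[1]
    try:
        dns.resolver.resolve(domain, "MX")
    except dns.exception.DNSException:
        return False, "domain cannot receive mail"
    return True, "ok"

# Hypothetical participant addresses pulled from a policy admin extract.
for addr in ["jane.doe@example.com", "not-an-email", "someone@no-such-domain-xyz.tld"]:
    print(addr, "->", validate_email(addr))
```
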
What does a cutting-edge insurer look like instead?  Ask Carrie Mathison and Saul Berenson.  They already have a risk and customer EDW as well as a modern (cloud-based?) CRM and claims management system.  They have considered, as part of their original implementation or upgrade, the capabilities required to fix the initial seed data going into their analytics platforms.  Now they are looking at pushing those capabilities back into operational systems like CRM, and at avoiding bad source-system entries from the get-go.

They are also beyond throwing more bodies in every department at “flavor-of-the-month” clean-up projects – e.g. yet another state unclaimed-property matching exercise, or a one-off tally of total annual premium revenue written in state X for a tax review by the state tax authority.

So what causes this drastic segmentation between leading and laggard life insurers?  In my humble opinion, it is the lack of a strategic refocusing on what the insurer can do for an agent by touching prospects and customers directly.  Direct interaction (even limited) improves branding, shortens the sales cycle and improves rates, thanks to the better insights that come from better data quality.

Agents (and insurers) need to understand that the wealth of data (demographics, interactions, transactions) the corporation already possesses – native as well as inherited (via M&A) – can be a powerful competitive differentiator.  Imagine if they started tapping into external sources beyond the standard credit bureaus and consumer databases; dare I say social media?

Competing on immediate instead of long-term needs (in insurance: lifetime earnings-potential replacement), on price (fees) and on commission cannot be the sole answer.

Posted in Data Governance, Data Integration, Data Quality, Data Security

Securing Sensitive Data in Test and Development Environments


Do you use copies of production data in test and development environments? This is common practice in IT organizations, and for this reason test environments have become the number one target for outside intruders. That said, most data breaches occur when non-malicious insiders accidentally expose sensitive data in the course of their daily work. Insider data breaches can be more serious and harder to detect than intruder events.

If you use production data in test and development environments or are looking for alternative approaches, register for the first webinar in a three part series on data security gaps and remediation.  On December 9th, Adrian Lane, Security Analyst at Securosis, will join me to discuss security for test environments.

This is the first webinar in a three-part series on data security gaps and remediation. It will focus on how data-centric security can be used to shore up vulnerabilities in one of the key focus areas: test and development environments. It is common practice for non-production database environments to be created by making copies of production data. This potentially exposes sensitive and confidential production data to developers, testers, and contractors alike. Commonly, 6-10 copies of production databases are created for each application environment, and they are regularly provisioned to support development, testing and training efforts. Since the security controls deployed for the source database are not replicated in the test environments, this is a glaring hole in data security and a target for external or internal exploits.
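
As a rough sketch of the idea behind masking production copies before they reach test environments, here is a hypothetical example that replaces direct identifiers with consistent but fake values while leaving non-sensitive fields usable for testing. It is an illustration of the general technique, not how Informatica's products implement it; the field names are assumptions.

```python
import hashlib

def pseudonym(value: str, prefix: str) -> str:
    """Deterministically replace a value so joins still line up across tables."""
    digest = hashlib.sha256(value.encode()).hexdigest()[:8]
    return f"{prefix}_{digest}"

def mask_row(row: dict) -> dict:
    """Return a copy of a production row that is safe for test/dev environments."""
    masked = dict(row)
    masked["name"] = pseudonym(row["name"], "CUST")
    masked["ssn"] = "***-**-" + row["ssn"][-4:]              # keep format, hide identity
    masked["email"] = pseudonym(row["email"], "user") + "@example.test"
    return masked                                            # amounts, dates, etc. untouched

production_row = {"name": "Jane Doe", "ssn": "123-45-6789",
                  "email": "jane.doe@acme.com", "balance": 1024.50}
print(mask_row(production_row))
```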

In this webinar, we will cover:

  • Key trends in enterprise data security
  • Vulnerabilities in non-production application environments (test and development)
  • Alternatives to consider when protecting test and development environments
  • Priorities for enterprises in reducing attack surface for their organization
  • Compliance and internal audit cost reduction
  • Data masking and synthetic data use cases
  • Informatica Secure Testing capabilities

Register for the webinar today at http://infa.media/1pohKov. If you cannot attend the live event, be sure to watch the webinar on-demand.

Posted in Application ILM, Business Impact / Benefits, Data Security, Data Services

IDC Life Sciences and Ponemon Research Highlights Need for New Security Measures

The healthcare and life sciences industry has demonstrated its ability to take advantage of data to fuel research, explore new ways to cure life-threatening diseases, and save lives.  With the adoption of technology innovation, especially in the mobile technology segment, this industry will need to find a balance between investment and risk.

ModernMedicine.com published an article in May 2014 stating that analysts worry a wide-scale security breach could occur in the healthcare and pharmaceuticals industry this year.  The piece calls out that this industry category ranked the lowest in an S&P 500 cyber-health study because of its high volume of incidents and slow response rates.

In the Ponemon Institute’s research, The State of Data Centric Security, respondents from healthcare and life sciences said the data they considered most at risk was customer, consumer and patient record data.  Intellectual property, business intelligence and classified data ranked a close second.


In an Informatica webinar with Alan Louie, Research Analyst at IDC Health Insights (@IDCPharmaGuru), we discussed his research on ‘Changing Times in the Life Sciences – Enabled and Empowered by Tech Innovation’.  The megatrends of cloud, mobile, social networks and big data analytics are all moving in a positive direction, with varying phases of adoption.  Mobile technologies top the list of IT priorities – likely because of the productivity gains that can be achieved with mobile devices and applications. Security/risk management technologies are the second-highest priority.

When we asked security professionals in life sciences in the Ponemon survey, ‘What keeps you up at night?’, the top answer was ‘migrating to new mobile platforms’.  The reason I call this factoid out is that every other industry category ranked ‘not knowing where sensitive data resides’ as the biggest concern. Why is life sciences different from other industries?


One reason could be that the intense scrutiny over intellectual property protection and HIPAA compliance has already shone a light on where sensitive data resides. Mobile, meanwhile, makes it difficult to track and contain a potential breach, given that cell phones are the number 1 item left behind in taxi cabs.

With the threat of a major breach on the horizon, and the push to leverage technologies such as mobile and cloud, it is evident that investments in security and risk management need to focus on the data itself – rather than be tied to a specific technology or platform.

Enter data-centric security.  The call to action is to consider a new approach to the information security paradigm, one that emphasizes the security of the data itself rather than the security of networks or applications.  Informatica recently published an eBook, ‘Data-Centric Security: New Imperatives for a New Age of Data’.  Download it, read it. In an industry with so much at stake, we highlight the need for new security measures such as these. Do you agree?

I encourage your comments and open the dialogue!

Posted in Application ILM, Data Governance, Data masking, Data Privacy, Data Security

Just In Time For the Holidays: How The FTC Defines Reasonable Security


Recently, the International Association of Privacy Professionals (IAPP, www.privacyassociation.org) published a white paper analyzing the Federal Trade Commission’s (FTC) data security/breach enforcement actions. These enforcements cover organizations from the finance, retail, technology and healthcare industries within the United States.

From this analysis, published as “What’s Reasonable Security? A Moving Target,” the IAPP extrapolated best practices from the FTC’s enforcement actions.

While the white paper and article indicate that “reasonable security” is a moving target, they do provide recommendations that will help organizations assess and baseline their current data security efforts.  Interesting is the focus on data-centric security, from overall enterprise assessment to the careful control of access by employees and 3rd parties.  Here are some of the recommendations derived from the FTC’s enforcements that call for data-centric security:

  • Perform assessments to identify reasonably foreseeable risks to the security, integrity, and confidentiality of personal information collected and stored on the network, online or in paper files.
  • Limited access policies curb unnecessary security risks and minimize the number and type of network access points that an information security team must monitor for potential violations.
  • Limit employee access to (and copying of) personal information, based on employee’s role.
  • Implement and monitor compliance with policies and procedures for rendering information unreadable or otherwise secure in the course of disposal. Securely disposed information must not practicably be read or reconstructed.
  • Restrict third party access to personal information based on business need, for example, by restricting access based on IP address, granting temporary access privileges, or similar procedures.

How does Data Centric Security help organizations achieve this inferred baseline? 

  1. Data Security Intelligence (Secure@Source coming Q2 2015), provides the ability to “…identify reasonably foreseeable risks.”
  2. Data Masking (Dynamic and Persistent Data Masking)  provides the controls to limit access of information to employees and 3rd parties.
  3. Data Archiving provides the means for the secure disposal of information.

Other data-centric security controls include encryption for data at rest and in motion, and tokenization for securing payment card data.  All of these controls help organizations secure their data, whether a threat originates internally or externally.  And based on the never-ending news of data breaches and attacks this year, it is a matter of when, not if, your organization will be significantly breached.
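
To illustrate the “limit employee access based on role” recommendation above, here is a minimal, hypothetical sketch of dynamic masking: the same record comes back in full, partially masked, or heavily redacted depending on who asks. It is a toy illustration of the technique, not how any particular product (including Informatica Dynamic Data Masking) implements it; the roles and fields are assumptions.

```python
# Toy illustration of role-based dynamic masking applied on read access.
FULL_ACCESS_ROLES = {"claims_adjuster"}        # sees real values
PARTIAL_ACCESS_ROLES = {"customer_service"}    # sees partially masked values

def mask_for_role(record: dict, role: str) -> dict:
    """Return a view of the record appropriate to the caller's role."""
    if role in FULL_ACCESS_ROLES:
        return record
    view = dict(record)
    view["ssn"] = "***-**-" + record["ssn"][-4:]
    if role not in PARTIAL_ACCESS_ROLES:       # everyone else: redact fully
        view["ssn"] = "*********"
        view["dob"] = "****-**-**"
    return view

patient = {"name": "J. Smith", "ssn": "123-45-6789", "dob": "1971-04-12"}
for role in ("claims_adjuster", "customer_service", "marketing_analyst"):
    print(role, "->", mask_for_role(patient, role))
```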

For 2015, “reasonable security” will require ongoing analysis of sensitive data and the deployment of corresponding data-centric security controls to ensure that organizations keep pace with this “moving target.”

Posted in Data Integration, Data masking, Data Privacy, Data Security

10 Insights From The Road To Data Governance


I routinely have the pleasure of working with Terri Mikol, Director of Data Governance at UPMC. Terri has been spearheading data governance at UPMC for three years. As a result, she has a wealth of insights to offer on this hot topic. Enjoy her top 10 lessons learned from UPMC’s data governance journey:

1. You already have data stewards.

Commonly, health systems think they can’t staff data governance the way UPMC has because of a lack of funding. In reality, people are already doing data governance everywhere, across your organization! You don’t have to secure headcount; you locate these people within the business, formalize data governance as part of their job, and provide them with tools to improve and manage their efforts.

2. Multiple types of data stewards ensure all governance needs are being met.

Three types of data stewards were identified and tasked across the enterprise:

I. Data Steward. Create and maintain data/business definitions. Assist with defining data and mappings along with rule definition and data integrity improvement.

II. Application Steward. One steward is named per application sourcing enterprise analytics. Populate and maintain inventory, assist with data definition and prioritize data integrity issues.

III. Analytics Steward. Named for each team providing analytics. Populate and maintain inventory, reduce duplication and define rules and self-service guidelines.

3. Establish IT as an enabler.

IT, instead of taking action on data governance or being the data governor, has become an enabler of data governance by investing in and administering tools that support metadata definition and master data management.

4. Form a governance council.

UPMC formed a governance council of 29 executives—yes, that’s a big number but UPMC is a big organization. The council is clinically led. It is co-chaired by two CMIOs and includes Marketing, Strategic Planning, Finance, Human Resources, the Health Plan, and Research. The council signs off on and prioritizes policies. Decision-making must be provided from somewhere.

5. Avoid slowing progress with process.

In these still-early days, only 15 minutes of monthly council meetings are spent on policy and guidelines; discussion and direction take priority. For example, a recent agenda item was “Length of Stay.” The council agreed a single owner would coordinate across Finance, Quality and Care Management to define and document an enterprise definition for “Length of Stay.”

6. Use examples.

Struggling to get buy-in from the business about the importance of data governance? An example everyone can relate to is “Test Patient.” For years, in her business intelligence role, Terri worked with “Test Patient” records. Investigation revealed that these fake patients end up in places they should not. There was no standard for the creation or removal of test patients, which meant that test patients and their costs, outcomes, etc., were included in analysis and reporting that drove decisions both inside and outside UPMC. The governance program created a policy for testing in production, should the need arise.

7. Make governance personal through marketing.

Terri holds monthly round tables with business and clinical constituents. These have been a game changer: Once a month, for two hours, ten business invitees meet and talk about the program. Each attendee shares a data challenge, and Terri educates them on the program and illustrates how the program will address each challenge.

8. Deliver self-service.

Providing self-service empowers your users to gain access to and control of the data they need to improve their processes. The only way to deliver self-service business intelligence is to make metadata, master data, and data quality transparent and accessible across the enterprise.

9. IT can’t do it alone.

Initially, IT was resistant to giving up control, but now the team understands that it doesn’t have the knowledge or the time to effectively do data governance alone.

10. Don’t quit!

Governance can be complicated, and it may seem like little progress is being made. Terri keeps spirits high by reminding folks that the only failure is quitting.

Getting started? Assess the data governance maturity of your organization here: http://governyourdata.com/

Posted in Data First, Data Governance, Data Integration, Data Security