Tag Archives: Enterprise Data Management

Getting to Your Future State Enterprise Data Architecture

Just exactly how do you move from a “Just a Bunch of Data” (JBOD) architecture to a coherent enterprise data architecture?

The white paper, “The Great Rethink: Building a Highly Responsive and Evolving Data Integration Architecture” by Claudia Imhoff and Joe McKendrick, provides an interesting view of what such an architecture might look like.  It describes how to move from ad hoc data integration to an enterprise data architecture, and it lays out an approach to building architectural maturity and a next-generation enterprise data architecture that helps organizations be more competitive.

Organizations that look to compete based on their data are searching for ways to design an architecture that:

  • On-boards new data quickly
  • Delivers clean and trustworthy data
  • Delivers data at the speed required of the business
  • Ensures that data is handled in a secure way
  • Is flexible enough to incorporate new data types and new technology
  • Enables end user self-service
  • Accelerates the delivery of business value for the organization

In my previous blog, Digital Strategy and Architecture, we discussed the demands that digital strategies are putting on enterprise data architecture in particular.  Add to that the additional stress from business initiatives such as:

  • Supporting new mobile applications
  • Moving IT applications to the cloud – which significantly increases data management complexity
  • Dealing with external data.  One recent study estimates that a full 25% of the data being managed by the average organization is external data.
  • Next-generation analytics and predictive analytics with Hadoop and NoSQL
  • Integrating analytics with applications
  • Event-driven architectures and projects
  • The list goes on…

The point here is that most people are unlikely to be funded to build an enterprise data architecture from scratch that can meet all these needs.  A pragmatic approach would be to build out your future state architecture in each new strategic business initiative that is implemented.  The real challenge of being an enterprise architect is ensuring that all of the new work does indeed add up to a coherent architecture as it gets implemented.

The “Great Rethink” white paper describes a practical approach to achieving an agile and responsive future state enterprise data architecture that will support your strategic business initiatives.  It also describes a high level data integration architecture and the building blocks to achieving that architecture.  This is highly recommended reading.

Also, you might recall that Informatica sponsored the Informatica Architect’s Challenge this year to design an enterprise-wide data architecture of the future.  The contest has closed and we have a winner.  See the site for details: Informatica Architect Challenge.

Architects: How Will You Stay Relevant?

We are way past the point where the architecture needs to be aligned with business goals and value delivery.  That is necessary but no longer sufficient.  We are now at the point where architecture needs to be central to the creation of an organization’s strategy process.  Not to get hyperbolic, but anything less is risky for your career.

The Challenge: Digitization

I just came back from the MIT Center for Information Systems Research (CISR) research forum.  One of the leading topics was digitization and how every business is becoming digitized.  To those in the High Tech industry, this may be an “of course” topic, but to most other industries it is a wrenching change.  Even those who are comfortable with the idea of digitization risk taking this too lightly.

The fact is that most products and services will have a digital component in the near future, and an increasing number of products and services will be entirely digital.  Digitization and the technologies that enable it are going to bring about a period of increased disruption.  This will mean:

  • New competitors.  Examples: autonomous cars, sports equipment with embedded sensors that provide feedback, and personal assistants fully capable of making decisions and taking action.  Gartner is predicting that almost everything over $100 will have a sensor in it by the turn of the decade.
  • New competitors jumping across industry boundaries.  Examples: Apple iTunes and Google’s cars, to name a few.

Why Architects Are Important

Architects are in a unique position not only to understand the technology trends driving this disruption, but also to know how to leverage these trends to drive business value within their organizations.  The very best architects are going to be those who are deeply involved in defining the organization’s strategy, not just figuring out how to implement it.

Evidence of Change

Many architects and CIOs currently report very little interest from upper management in IT.  That is about to change, and quickly.  At the MIT CISR forum I attended last week, they presented research around this area that is very telling:

  • Half of Board of Directors members believe that their board’s ability to oversee the strategic use of IT is “less than effective.”
  • 26% of Boards hired consultants to evaluate major projects or the IT unit.
  • 60% of Boards want to spend more time on digital issues next year.
  • Board members estimate that 32% of their company’s revenues are under threat from digital disruption.

That last bullet is the really interesting piece of research.  32% is a huge impact.

The Role of Data in Digitization

Digitization by its very nature is all about data.  The winners in this space will be those that can manage and deliver relevant data the quickest.  The question for architects is this: Do you have the architecture and agility to take advantage of the coming disruptions and opportunities?  Are you actively advising your organization on how to leverage them?  As we have documented in many previous blogs, many organizations are poorly positioned to manage their data as a discoverable and easily sharable asset.  This will be essential for:

  • Delivering business initiatives and showing value faster (agility).
  • Enabling business self-service so that IT is not the bottleneck in new analyses and decisions.

All of this requires new thinking around enterprise data architecture.  For fresh thinking on this subject see Thinking “Data First” to Drive Business Value.

Which Method of Controls Should You Use to Protect Sensitive Data in Databases and Enterprise Applications? Part II

Protecting Sensitive Data

To determine the appropriate method for protecting sensitive data, you should first answer the following questions about your requirements:

  • Do you need to protect data at rest (in storage), during transmission, and/or when accessed?
  • Do some privileged users still need the ability to view the original sensitive data or does sensitive data need to be obfuscated at all levels?
  • What is the granularity of controls that you need?
    • Datafile level
    • Table level
    • Row level
    • Field / column level
    • Cell level
  • Do you need to be able to control viewing vs. modification of sensitive data?
  • Do you need to maintain the original characteristics / format of the data (e.g. for testing, demo, development purposes)?
  • Is response time latency / performance of high importance for the application?  This can be the case for mission critical production applications that need to maintain response times in the order of seconds or sub-seconds.

In order to help you determine which method of control is appropriate for your requirements, the following table provides a comparison of the different methods and their characteristics.

[Table: comparison of the protection methods and their characteristics]

A combination of protection methods may be appropriate based on your requirements.  For example, to protect data in non-production environments, you may want to use persistent data masking to ensure that no one has access to the original production data, since no one needs it there.  This is especially true if your development and testing is outsourced to third parties.  In addition, persistent data masking allows you to maintain the original characteristics of the data to ensure test data quality.

In production environments, you may want to use a combination of encryption and dynamic data masking.  This is the case if you would like to ensure that all data at rest is protected against unauthorized users, yet you need to protect sensitive fields only for certain sets of authorized or privileged users, but the rest of your users should be able to view the data in the clear.

The best method or combination of methods will depend on each scenario and set of requirements for your environment and organization.  As with any technology and solution, there is no one size fits all.

Which Method of Controls Should You Use to Protect Sensitive Data in Databases and Enterprise Applications? Part I

Protecting Sensitive Data

I’m often asked to share my thoughts about protecting sensitive data. The questions that typically come up include:

  • Which types of data should be protected?
  • Which data should be classified as “sensitive?”
  • Where is this sensitive data located?
  • Which groups of users should have access to this data?

Because these questions come up frequently, it seems worthwhile to share a few guidelines on this topic.

When protecting the confidentiality and integrity of data, the first level of defense is authentication and access control.  However, data with higher levels of sensitivity or confidentiality may require additional levels of protection, beyond regular authentication and authorization methods.

There are a number of control methods for securing sensitive data available in the market today, including:

  • Encryption
  • Persistent (Static) Data Masking
  • Dynamic Data Masking
  • Tokenization
  • Retention management and purging

Encryption is a cryptographic method of encoding data.  There are generally two methods of encryption: symmetric (using a single secret key) and asymmetric (using public and private keys).  Although there are methods of deciphering encrypted information without possessing the key, a good encryption algorithm makes it very difficult to decode the encrypted data without knowledge of the key.  Key management is usually the main concern with this method of control.  Encryption is ideal for mass protection of data (e.g. an entire data file, table, partition, etc.) against unauthorized users.
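
As a minimal sketch of the symmetric case, the example below uses the Python cryptography package’s Fernet recipe; the choice of library and the sample record are assumptions for illustration, not a product recommendation.  Note that anyone holding the key can recover the data, which is why key management matters so much.

```python
# Minimal symmetric-encryption sketch (illustrative only) using the "cryptography"
# package's Fernet recipe. Whoever holds `key` can decrypt, which is why key
# management is usually the main concern with this control.
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # in practice the key lives in a key-management system
cipher = Fernet(key)

record = b"SSN=123-45-6789;DOB=1980-01-01"   # hypothetical sensitive record
encrypted = cipher.encrypt(record)           # safe to store or transmit
decrypted = cipher.decrypt(encrypted)        # only possible with the key

assert decrypted == record
```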

Persistent or static data masking obfuscates data at rest in storage.  There is usually no way to retrieve the original data – the data is permanently masked.  There are multiple techniques for masking data, including: shuffling, substitution, aging, encryption, domain-specific masking (e.g. email address, IP address, credit card, etc.), dictionary lookup, randomization, etc.  Depending on the technique, there may be ways to perform reverse masking – this should be used sparingly.  Persistent masking is ideal for cases where no user should see the original sensitive data (e.g. in test / development environments) and field-level data protection is required.
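
Below is a small sketch of two of these techniques – domain-specific masking and a substitution that keeps the last four digits – in Python.  The field names and rules are illustrative assumptions; the point is simply that the masked copy keeps the shape of the data while the original values are gone for good.

```python
# Sketch of persistent (static) masking: once the masked copy is written, the
# original values cannot be recovered from it. Field names and rules are illustrative.
import random
import re

def mask_email(value: str) -> str:
    """Domain-specific masking: keep the shape of an email, drop the real identity."""
    return f"user{random.randint(1000, 9999)}@example.com"

def mask_card(value: str) -> str:
    """Substitution that keeps the last four digits so test data stays realistic."""
    digits = re.sub(r"\D", "", value)
    return "X" * (len(digits) - 4) + digits[-4:]

row = {"email": "jane.doe@acme.com", "card": "4111-1111-1111-1111"}
masked_row = {"email": mask_email(row["email"]), "card": mask_card(row["card"])}
print(masked_row)   # e.g. {'email': 'user4821@example.com', 'card': 'XXXXXXXXXXXX1111'}
```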

Dynamic data masking de-identifies data when it is accessed.  The original data is still stored in the database.  Dynamic data masking (DDM) acts as a proxy between the application and database and rewrites the user / application request against the database depending on whether the user has the privilege to view the data or not.  If the requested data is not sensitive or the user is a privileged user who has the permission to access the sensitive data, then the DDM proxy passes the request to the database without modification, and the result set is returned to the user in the clear.  If the data is sensitive and the user does not have the privilege to view the data, then the DDM proxy rewrites the request to include a masking function and passes the request to the database to execute.  The result is returned to the user with the sensitive data masked.  Dynamic data masking is ideal for protecting sensitive fields in production systems where application changes are difficult or disruptive to implement and performance / response time is of high importance.
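
The toy sketch below illustrates only the rewrite idea: a proxy wraps sensitive columns in a masking expression for non-privileged users before forwarding the SQL.  The column list, the mask() function, and the naive string-based rewrite are assumptions for illustration; real DDM products do this far more robustly and transparently.

```python
# Toy illustration of the dynamic-data-masking rewrite idea. The column list, the
# mask() SQL function, and the string-based rewrite are illustrative assumptions.
SENSITIVE_COLUMNS = {"ssn", "salary"}

def rewrite(query: str, user_is_privileged: bool) -> str:
    if user_is_privileged:
        return query                                   # passed through unchanged
    rewritten = query
    for col in SENSITIVE_COLUMNS:
        # e.g. "SELECT ssn FROM emp" -> "SELECT mask(ssn) AS ssn FROM emp"
        rewritten = rewritten.replace(col, f"mask({col}) AS {col}", 1)
    return rewritten

print(rewrite("SELECT name, ssn FROM employees", user_is_privileged=False))
# -> SELECT name, mask(ssn) AS ssn FROM employees
print(rewrite("SELECT name, ssn FROM employees", user_is_privileged=True))
# -> SELECT name, ssn FROM employees
```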

Tokenization substitutes a sensitive data element with a non-sensitive data element, or token.  A first-generation tokenization system requires a token server and a database to store the original sensitive data.  The mapping from the clear text to the token makes it very difficult to reverse the token back to the original data without the token system.  However, the existence of a token server and a mapping database storing the original sensitive data makes them a potential security vulnerability, a scalability bottleneck, and a single point of failure.  Next-generation tokenization systems have addressed these weaknesses.  Tokenization does still require changes to the application layer to tokenize and detokenize when the sensitive data is accessed.  Tokenization can be used in production systems to protect sensitive data at rest in the database store, when changes to the application layer can be made relatively easily to perform the tokenization / detokenization operations.
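
A minimal sketch of the first-generation idea follows: a token vault that maps random tokens back to the original values.  The class and field names are assumptions for illustration, and the vault itself is exactly the concentrated point of risk described above.

```python
# First-generation tokenization sketch (illustrative). The vault that stores the
# token-to-value mapping is itself the asset that must be protected.
import secrets

class TokenVault:
    def __init__(self) -> None:
        self._token_to_value: dict[str, str] = {}
        self._value_to_token: dict[str, str] = {}

    def tokenize(self, value: str) -> str:
        if value in self._value_to_token:          # same value always yields the same token
            return self._value_to_token[value]
        token = "tok_" + secrets.token_hex(8)      # random, so not reversible without the vault
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        return self._token_to_value[token]

vault = TokenVault()
token = vault.tokenize("4111-1111-1111-1111")
print(token)                      # stored in the application instead of the real number
print(vault.detokenize(token))    # only callers with vault access recover the original
```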

Retention management and purging is more of a data management method to ensure that data is retained only as long as necessary.  The best method of reducing data privacy risk is to eliminate the sensitive data altogether.  Therefore, appropriate retention, archiving, and purging policies should be applied to reduce the privacy and legal risks of holding on to sensitive data for too long.  Retention management and purging is a data management best practice that should always be put to use.
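
As a small illustration of the idea, the snippet below applies a hypothetical seven-year retention window to a list of records; the window length, record layout, and in-memory filtering are all assumptions, and a real implementation would purge or archive at the database or archive-store level.

```python
# Retention-management sketch: keep only records still inside the retention window.
# The seven-year window and the record layout are illustrative assumptions.
from datetime import datetime, timedelta

RETENTION = timedelta(days=7 * 365)

def within_retention(records, now=None):
    """Return the records that may still be kept; everything older should be purged."""
    now = now or datetime.utcnow()
    cutoff = now - RETENTION
    return [r for r in records if r["created_at"] >= cutoff]

records = [
    {"id": 1, "created_at": datetime(2005, 3, 1)},   # outside the window -> purge
    {"id": 2, "created_at": datetime(2013, 6, 1)},   # inside the window  -> keep
]
print(within_retention(records, now=datetime(2014, 1, 1)))   # only record 2 remains
```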

Building a Data Foundation for Execution

I have been re-reading Enterprise Architecture as Strategy from the MIT Center for Information Systems Research (CISR).*  One concept that they talk about that jumped out at me was the idea of a “Foundation for Execution.”  Everybody is working to drive new business initiatives, to digitize their businesses, and to thrive in an era of increased technology disruption and competition.  The ideas around a Foundation for Execution in the book are a highly practical and useful framework to deal with these problems.

This got me thinking: What is the biggest bottleneck in the delivery of business value today?  I know I look at things from a data perspective, but data is the biggest bottleneck.  Consider this prediction from Gartner:

“Gartner predicts organizations will spend one-third more on app integration in 2016 than they did in 2013. What’s more, by 2018, more than half the cost of implementing new large systems will be spent on integration.”

When we talk about application integration, we’re talking about moving data, synchronizing data, cleansing data, transforming data, and testing data.  The question for architects and senior management is this: Do you have the Data Foundation for Execution you need to drive the business results you require to compete?  The answer, unfortunately, for most companies is: No.

All too often data management is an add-on to larger application-based projects.  The result is unconnected and non-interoperable islands of data across the organization.  That simply is not going to work in the coming competitive environment.  Here are a couple of quick examples:

  • Many companies are looking to compete on their use of analytics.  That requires collecting, managing, and analyzing data from multiple internal and external sources.
  • Many companies are focusing on a better customer experience to drive their business. This again requires data from many internal sources, plus social, mobile and location-based data to be effective.

When I talk to architects about the business risks of not having a shared data architecture, and common tools and practices for enterprise data management, they “get” the problem.  So why aren’t they addressing it?  The issue is that they are only funded to do the project they are working on and are dealing with very demanding timeframe requirements.  They have no funding or mandate to solve the larger enterprise data management problem, which gets more complex and brittle with each new unconnected project or initiative that is added to the pile.

Studies such as “The Data Directive” by The Economist show that organizations that actively manage their data are more successful. But, if that is the desired future state, how do you get there?

Changing an organization to look at data as the fuel that drives strategy takes hard work and leadership. It also takes a strong enterprise data architecture vision and strategy.  For fresh thinking on the subject of building a data foundation for execution, see “Think Data-First to Drive Business Value” from Informatica.

* By the way, Informatica is proud to announce that we are now a sponsor of the MIT Center for Information Systems Research.

Enterprise Architects as Strategists

The conversation at the Gartner Enterprise Architecture Summit last week was very interesting. The central theme for years had been the idea of closely linking enterprise architecture with business goals and strategy.  This year, Gartner added another layer to that conversation.  They are now actively promoting the idea of enterprise architects as strategists.

The reason why is simple.  The next wave of change is coming and it will significantly disrupt everybody.  Even worse, your new competitors may be coming from other industries.

Enterprise architects are in a position to take a leading role within the strategy process. This is because they are the people who best understand both business strategy and technology trends.

Some of the key ideas discussed included:

  • The boundaries between physical and digital products will blur
  • Every organization will need a technology strategy to survive
  • Gartner predicts that by 2017, 60% of the Global 1,000 will execute on at least one revolutionary and currently unimaginable business transformation effort.
  • The change is being driven by trends such as mobile, social, the connectedness of everything, cloud/hybrid, software-defined everything, smart machines, and 3D printing.

Observations

I agree with all of this.  My view is that this means it is time for enterprise architects to think very differently about architecture.  Enterprise applications will come and go.  They are rapidly being commoditized in any case.  Architects need to think like strategists, in terms of market differentiation.  And nothing will differentiate an organization more than its data.  Example: Google’s autonomous cars.  Google is jumping across industry boundaries to compete in a new market with data as its primary differentiator. There will be many others.

Thinking data-first

Years of thinking of architecture from an application-first or business process-first perspective have left us with silos of data and the classic “spaghetti diagram” of data architecture. This is slowing down business initiative delivery precisely at the time organizations need to accelerate and make data their strategic weapon.  It is time to think data-first when it comes to enterprise architecture.

You will be seeing more from Informatica on this subject over the coming weeks and months.

Take a minute to comment on this article.  Your thoughts on how we should go about changing to a data-first perspective, both pro and con, are welcome.

Also, remember that Informatica is running a contest to design the data architecture of the year 2020.  Full details are here.

http://www.informatica.com/us/architects-challenge/

And now for the rest of the data…

In the first two issues I spent time looking at the need for states to pay attention to the digital health and safety of their citizens, followed by the oft-forgotten need to understand and protect non-production data. This is data that has often proliferated and has also been ignored or forgotten about.

In many ways, non-production data is simpler to protect. Development and test systems can usually work effectively with realistic but not real PII data and realistic but not real volumes of data. On the other hand, production systems need the real production data, complete with the wealth of information that enables individuals to be identified – and that therefore presents a huge risk. If and when that data is compromised, either deliberately or accidentally, the consequences can be enormous: the impact on the individual citizens and also the cost of remediation to the state. Many will remember the massive South Carolina data breach of late 2012, when over the course of two days a 74 GB database was downloaded and stolen; around 3.8 million taxpayers and 1.9 million dependents had their Social Security information stolen, and 3.3 million bank account details were “lost”. The citizens’ pain didn’t end there, as the company South Carolina picked to help its citizens seems to have tried to exploit the situation.

encryption protects against theft – unless the key is stolen too

The biggest problem with securing production data is that there are numerous legitimate users and uses of that data, and most often just a small number of potentially malicious or accidental attempts at inappropriate or dangerous access. So the question is… how does a state agency protect its citizens’ sensitive data while at the same time ensuring that legitimate uses and users continue – without performance impacts or any disruption of access? Obviously each state needs to make its own determination as to what approach works best for it.

This video does a good job of explaining the scope of the overall data privacy/security problem and also reviews a number of successful approaches to protecting sensitive data in both production and non-production environments. What you’ll find is that database encryption is just the start: it is fine if the database is “stolen” (unless, of course, the key is stolen along with the data!). Encryption locks the data away in the same way that a safe protects physical assets – but the same problem exists: if the key is stolen with the safe, then all bets are off. Legitimate users, on the other hand, are usually easily able to deliberately breach and steal the sensitive contents, and it is these occasions we need to understand and protect against. Given that the majority of data breaches are “inside jobs”, we need to ensure that authorized users (end users, DBAs, system administrators and so on) who have legitimate access only have access to the data they absolutely need – no more and no less.

So we have reached the end of this first series. In the first blog we looked at the need for states to place the same emphasis on the digital health and welfare of their citizens as they do on their physical and mental health. In the second we looked at the oft-forgotten area of non-production (development, testing, QA etc.) data. In this third and final piece we have looked at the need for, and some options for, complete protection of production data.

What types of data need protecting?

In my first article on the topic of citizens’ digital health and safety we looked at the states’ desire to keep their citizens healthy and safe and also at the various laws and regulations they have in place around data breaches and losses. The size and scale of the problem together with some ideas for effective risk mitigation are in this whitepaper.

The cost and frequency of data breaches continue to rise

Let’s now start delving a little deeper into the situation states are faced with. It’s pretty obvious that citizen data that enables an individual to be identified (PII) needs to be protected. We immediately think of the production data: data that is used in integrated eligibility systems; in health insurance exchanges; in data warehouses and so on. In some ways the production data is the least of our problems; our research shows that the average state has around 10 to 12 full copies of data for non-production (development, test, user acceptance and so on) purposes. This data tends to be much more vulnerable because it is widespread and used by a wide variety of people – often subcontractors or outsourcers, and often the content of the data is not well understood.

Obviously production systems need access to real production data (I’ll cover how best to protect that in the next issue); non-production systems of every sort, on the other hand, do not. Non-production systems most often need realistic, but not real, data and realistic, but not real, data volumes (except maybe for the performance/stress/throughput testing system). What needs to be done? A three-point risk remediation plan would be a good place to start:

1. Understand the non-production data: sophisticated data and schema profiling combined with NLP (Natural Language Processing) techniques help to identify previously unrecognized PII that needs protecting.
2. Permanently mask the PII so that it is no longer the real data but is realistic enough for non-production uses, and make sure that the same masking is applied to the attribute values wherever they appear across multiple tables/files (a minimal sketch of this consistency follows the list).
3. Subset the data to reduce data volumes; this limits the size of the risk and also has positive effects on performance, run times, backups, etc.
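
As a minimal sketch of point 2 – assuming an SSN-style attribute and a secret held only by the masking job – keyed, deterministic masking makes the same original value mask to the same replacement everywhere it appears, so joins across tables and files still line up:

```python
# Sketch of deterministic (consistent) masking: the same input always masks to the
# same output, so the masked value stays joinable across tables and files.
# The secret and the SSN format are illustrative assumptions.
import hashlib
import hmac

SECRET = b"masking-job-secret"   # held by the masking process only, never shipped with data

def mask_ssn(ssn: str) -> str:
    digest = hmac.new(SECRET, ssn.encode(), hashlib.sha256).hexdigest()
    digits = "".join(str(int(ch, 16) % 10) for ch in digest[:9])   # keep the 9-digit shape
    return f"{digits[:3]}-{digits[3:5]}-{digits[5:]}"

# The same SSN appearing in, say, an eligibility table and a claims file masks identically:
print(mask_ssn("123-45-6789"))
print(mask_ssn("123-45-6789"))   # identical output, so cross-table joins still work
```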

Gartner has just published their 2013 Magic Quadrant for data masking; it covers both what they call static (i.e. permanent or persistent) masking and dynamic masking (more on this in the next issue). As usual the MQ gives a good overview of the issues behind the technology as well as a review of the position, strengths and weaknesses of the leading vendors.

It is (or at least should be) an imperative that, from the top down, state governments realize the importance and vulnerability of their citizens’ data and put in place a non-partisan plan to prevent any future breaches. As the reader might imagine, for any such plan to succeed it needs a combination of cultural and organizational change (getting people to care) and putting the right technology in place – together these will greatly reduce the risk. In the next and final issue on this topic we will look at the vulnerabilities of production data, and at what can be done to dramatically increase its privacy and security.

Data Management Is No Longer A Back Office Function

A front office as defined by Wikipedia is “a business term that refers to a company’s departments that come in contact with clients, including the marketing, sales, and service departments” while a back office is “tasks dedicated to running the company….without being seen by customers.”  Wikipedia goes on to say that “Back office functions can be outsourced to consultants and contractors, including ones in other countries.” Data Management was once a back office activity but in recent years it has moved to the front office.  What changed?
