Category Archives: Data Security

Which Method of Controls Should You Use to Protect Sensitive Data in Databases and Enterprise Applications? Part II

Protecting Sensitive Data

To determine the appropriate method for protecting sensitive data, first answer the following questions about your requirements:

  • Do you need to protect data at rest (in storage), during transmission, and/or when accessed?
  • Do some privileged users still need the ability to view the original sensitive data or does sensitive data need to be obfuscated at all levels?
  • What granularity of control do you need?
    • Datafile level
    • Table level
    • Row level
    • Field / column level
    • Cell level
  • Do you need to control viewing vs. modification of sensitive data?
  • Do you need to maintain the original characteristics / format of the data (e.g. for testing, demo, or development purposes)?
  • Is response time latency / performance of high importance for the application? This can be the case for mission-critical production applications that need to maintain response times on the order of seconds or sub-seconds.

To help you determine which method of control is appropriate for your requirements, the following table compares the different methods and their characteristics.

[Table: comparison of sensitive data protection methods]

A combination of protection methods may be appropriate based on your requirements. For example, to protect data in non-production environments, you may want to use persistent data masking to ensure that no one has access to the original production data, since no one needs it. This is especially true if your development and testing are outsourced to third parties. In addition, persistent data masking maintains the original characteristics of the data, ensuring test data quality.

In production environments, you may want to use a combination of encryption and dynamic data masking. This is the case if you would like to ensure that all data at rest is protected against unauthorized users, yet you need to protect sensitive fields only from certain sets of authorized or privileged users, while the rest of your users should be able to view the data in the clear.

The best method, or combination of methods, will depend on the scenario and requirements of your environment and organization. As with any technology and solution, there is no one-size-fits-all.

Posted in Data Integration, Data masking, Data Security, Data Services, Enterprise Data Management

Which Method of Controls Should You Use to Protect Sensitive Data in Databases and Enterprise Applications? Part I

Protecting Sensitive Data

I’m often asked to share my thoughts about protecting sensitive data. The questions that typically come up include:

  • Which types of data should be protected?
  • Which data should be classified as “sensitive?”
  • Where is this sensitive data located?
  • Which groups of users should have access to this data?

Because these questions come up frequently, it seems worthwhile to share a few guidelines on the topic.

When protecting the confidentiality and integrity of data, the first line of defense is authentication and access control. However, data with higher levels of sensitivity or confidentiality may require additional protection beyond regular authentication and authorization methods.

There are a number of control methods for securing sensitive data available in the market today, including:

  • Encryption
  • Persistent (Static) Data Masking
  • Dynamic Data Masking
  • Tokenization
  • Retention management and purging

Encryption is a cryptographic method of encoding data. There are generally two methods of encryption: symmetric (using a single secret key) and asymmetric (using public and private key pairs). Although there are ways to decipher encrypted information without possessing the key, a good encryption algorithm makes it very difficult to decode the data without knowledge of the key. Key management is usually a central concern with this method of control. Encryption is ideal for mass protection of data (e.g. an entire data file, table, or partition) against unauthorized users.
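To illustrate the symmetric case, here is a minimal toy sketch in Python: the same secret key both encrypts and decrypts. The keystream construction below is purely illustrative; real systems should use a vetted algorithm such as AES through an established library.

```python
import hashlib
import secrets

def keystream(key, length):
    # Derive a pseudo-random byte stream from the key (toy construction,
    # not a real cipher).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key, plaintext):
    # XOR the plaintext with the key-derived stream.
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

# Symmetric: applying the same operation with the same key reverses it.
decrypt = encrypt

key = secrets.token_bytes(32)
ciphertext = encrypt(key, b"4111-1111-1111-1111")
assert decrypt(key, ciphertext) == b"4111-1111-1111-1111"
```

The asymmetric case differs only in that the encryption and decryption keys form a public/private pair, so the encrypting party never holds the secret.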

Persistent or static data masking obfuscates data at rest in storage. There is usually no way to retrieve the original data; it is permanently masked. There are multiple techniques for masking data, including shuffling, substitution, aging, encryption, domain-specific masking (e.g. email addresses, IP addresses, credit card numbers), dictionary lookup, and randomization. Depending on the technique, there may be ways to perform reverse masking; this should be used sparingly. Persistent masking is ideal for cases where no user should see the original sensitive data (e.g. test and development environments) and field-level data protection is required.
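Two of these techniques, randomization and domain-specific masking, can be sketched as follows. The function names and the choice to preserve the last four card digits are illustrative assumptions, not any particular product's behavior:

```python
import random
import string

def mask_credit_card(cc):
    # Randomize all but the last four digits, preserving the original
    # format (hyphens, length) so the masked value still looks valid.
    digits = [c for c in cc if c.isdigit()]
    masked = [str(random.randint(0, 9)) for _ in digits[:-4]] + digits[-4:]
    it = iter(masked)
    return "".join(next(it) if c.isdigit() else c for c in cc)

def mask_email(email):
    # Domain-specific masking: keep the shape of an email address but
    # replace both the local part and the domain.
    local, _, _domain = email.partition("@")
    fake = "".join(random.choice(string.ascii_lowercase) for _ in local)
    return f"{fake}@example.com"
```

Because the random values are never stored, the masking is irreversible, which is exactly the property wanted for test and development copies.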

Dynamic data masking de-identifies data when it is accessed.  The original data is still stored in the database.  Dynamic data masking (DDM) acts as a proxy between the application and database and rewrites the user / application request against the database depending on whether the user has the privilege to view the data or not.  If the requested data is not sensitive or the user is a privileged user who has the permission to access the sensitive data, then the DDM proxy passes the request to the database without modification, and the result set is returned to the user in the clear.  If the data is sensitive and the user does not have the privilege to view the data, then the DDM proxy rewrites the request to include a masking function and passes the request to the database to execute.  The result is returned to the user with the sensitive data masked.  Dynamic data masking is ideal for protecting sensitive fields in production systems where application changes are difficult or disruptive to implement and performance / response time is of high importance.
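The proxy's rewrite step can be sketched as below. This is a simplified illustration only; the column names, user roles, and `mask()` SQL function are hypothetical, and a real DDM proxy parses full SQL rather than building it from parts:

```python
SENSITIVE_COLUMNS = {"ssn", "salary"}   # hypothetical sensitive-field catalog
PRIVILEGED_USERS = {"auditor"}          # hypothetical privileged role

def rewrite_query(user, columns, table):
    # Privileged users get the request unmodified; for everyone else,
    # sensitive columns are wrapped in a masking function before the
    # query reaches the database.
    select = []
    for col in columns:
        if col in SENSITIVE_COLUMNS and user not in PRIVILEGED_USERS:
            select.append(f"mask({col}) AS {col}")
        else:
            select.append(col)
    return f"SELECT {', '.join(select)} FROM {table}"
```

Since the database itself executes the masking function, the application needs no changes, which is why this approach suits production systems.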

Tokenization substitutes a sensitive data element with a non-sensitive data element, or token. First-generation tokenization systems require a token server and a database to store the original sensitive data. The mapping from clear text to token makes it very difficult to reverse the token back to the original data without the token system. However, a token server and a mapping database holding the original sensitive data are a potential point of security vulnerability, a bottleneck for scalability, and a single point of failure. Next-generation tokenization systems have addressed these weaknesses. Tokenization does, however, require changes to the application layer to tokenize and detokenize data when it is accessed. Tokenization can be used in production systems to protect sensitive data at rest in the database store when changes to the application layer can be made relatively easily to perform the tokenization / detokenization operations.
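The first-generation design can be sketched as a token vault, shown here as a toy in-memory version; a real vault is a hardened, persistent, access-controlled service, which is exactly why it becomes the single point of failure described above:

```python
import secrets

class TokenVault:
    """Toy first-generation vault: stores the clear-text mapping,
    so the vault itself must be strongly protected."""

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, value):
        # Reuse the existing token so repeated values stay consistent.
        if value in self._value_to_token:
            return self._value_to_token[value]
        token = secrets.token_hex(8)  # random: irreversible without the vault
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token):
        return self._token_to_value[token]
```

Because the token is random rather than derived from the value, nothing short of access to the vault can recover the original data.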

Retention management and purging is more of a data management method for ensuring that data is retained only as long as necessary. The best way to reduce data privacy risk is to eliminate the sensitive data. Therefore, appropriate retention, archiving, and purging policies should be applied to reduce the privacy and legal risks of holding on to sensitive data for too long. Retention management and purging is a data management best practice that should always be put to use.
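A purge policy can be sketched as a simple filter over record age. The seven-year window here is an illustrative assumption; actual retention periods depend on the regulations and policies that apply to your data:

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=7 * 365)  # hypothetical seven-year policy

def purge_expired(records, now):
    # Keep only records still inside the retention window; everything
    # older is dropped (in practice: archived or securely deleted).
    return [r for r in records if now - r["created"] <= RETENTION]
```

In a real system the purge would run as a scheduled job against the archive store rather than an in-memory list.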

Posted in Data Integration, Data masking, Data Security, Data Services, Enterprise Data Management

Informatica Responsible Disclosure Policy

Notifying Informatica of Security Issues

Our team of security experts strives to quickly address security issues involving our products and services. For guidance, please see below:

Problem or issue Informatica contact or resource
How do I report a security problem in an Informatica application, online service, or website? Email security@informatica.com
How do I provide feedback on an Informatica product or service? Go to Informatica Support or email support@informatica.com
How do I report an email, website, or pop-up window that falsely claims to represent Informatica? Contact security@informatica.com
Does Informatica have a Bug Bounty Program? No, but we do participate in a Responsible Disclosure Program.

Posted in Data Security

Informatica and the Shellshock Security Vulnerability

The security of information systems is a complex, shared responsibility between infrastructure, system and application providers. Informatica doesn’t take lightly the responsibility our customers have entrusted to us in this complex risk equation.

As Informatica’s Chief Information Security Officer, I’d like to share three important security updates with our customers:

  1. What you need to know about Informatica products and services relative to the latest industry-wide security concern,
  2. What you need to do to secure Informatica products against the ShellShock vulnerability, and
  3. How to contact Informatica if you have questions about Informatica product security.

1 – What you need to know

On September 24, 2014, a serious new cluster of vulnerabilities in Linux/Unix distributions was announced, tracked as CVE-2014-6271, CVE-2014-7169, CVE-2014-7186, CVE-2014-7187, CVE-2014-6277, and CVE-2014-6278 and known as "Shellshock" or "Bashdoor". What makes Shellshock so impactful is that it requires relatively little effort or expertise to exploit and gain privileged access to vulnerable systems.

Informatica’s cloud-hosted products, including Informatica Cloud Services (ICS) and our recently-launched Springbok beta, have already been patched to address this issue. We continue to monitor for relevant updates to both vulnerabilities and available patches.

Because this vulnerability is a function of the underlying Operating System, we encourage administrators of potentially vulnerable systems to assess their risk levels and apply patches and/or other appropriate countermeasures.
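One widely circulated way to assess a system is to probe bash directly with the CVE-2014-6271 test string; the small Python wrapper below is just a convenience around that probe, and a patched bash simply ignores the smuggled command:

```python
import subprocess

def bash_is_vulnerable():
    # Export a crafted function definition via the environment; on an
    # unpatched bash, the trailing "echo vulnerable" executes when the
    # shell starts up.
    probe_env = {
        "x": "() { :;}; echo vulnerable",
        "PATH": "/usr/bin:/bin",  # minimal PATH so bash can be found
    }
    result = subprocess.run(
        ["bash", "-c", "echo probe complete"],
        env=probe_env, capture_output=True, text=True,
    )
    return "vulnerable" in result.stdout
```

On a patched system this returns False; the remediation itself is the OS vendor's bash update, applied through your normal package manager.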

Informatica’s Information Security team coordinated an internal response with product developers to assess the vulnerability and make recommendations necessary for our on-premise products. Specific products and actions are listed below.

2 – What you need to do

Informatica products themselves require no patches to address the Shellshock vulnerability; they are not directly impacted. However, Informatica strongly recommends that you apply your OS vendors' patches as they become available, since some applications allow customers to use shell scripts in their pre- and post-processing steps. Specific Informatica products and remediations are listed below:

Cloud Service Version Patch / Remediation
Springbok Beta No action necessary. The Springbok infrastructure has been patched by Informatica Cloud Operations.
ActiveVOS/Cloud All No action necessary. The ActiveVOS/Cloud infrastructure has been patched by Informatica Cloud Operations.
Cloud/ICS All Customers should apply OS patches to all of their machines running a Cloud agent. Relevant Cloud/ICS hosted infrastructure has already been patched by Informatica Cloud Operations.


Product Version Patch / Remediation
PowerCenter All No direct impact. Customers who use shell scripts within their pre- / post-processing steps should apply OS patches to mitigate this vulnerability.
IDQ All No direct impact. Customers who use shell scripts within their pre- / post-processing steps should apply OS patches to mitigate this vulnerability.
MM, BG, IDE All No direct impact. Customers who use shell scripts within their pre- / post-processing steps should apply OS patches to mitigate this vulnerability.
PC Express All No direct impact. Customers who use shell scripts within their pre- / post-processing steps should apply OS patches to mitigate this vulnerability.
Data Services / Mercury stack All No direct impact. Customers who use shell scripts within their pre- / post-processing steps should apply OS patches to mitigate this vulnerability.
PWX mainframe & CDC All No direct impact.  Recommend customers apply OS patch to all machines with INFA product installed.
UM, VDS All No direct impact.  Recommend customers apply OS patch to all machines with INFA product installed.
IDR, IFC All No direct impact.  Recommend customers apply OS patch to all machines with INFA product installed.
B2B DT, UDT, hparser, Atlantic All No direct impact.  Recommend customers apply OS patch to all machines with INFA product installed.
Data Archive All No direct impact.  Recommend customers apply OS patch to all machines with INFA product installed.
Dynamic data masking All No direct impact.  Recommend customers apply OS patch to all machines with INFA product installed.
IDV All No direct impact.  Recommend customers apply OS patch to all machines with INFA product installed.
SAP Nearline No direct impact.  Recommend customers apply OS patch to all machines with INFA product installed.
TDM No direct impact.  Recommend customers apply OS patch to all machines with INFA product installed.
MDM All No direct impact.  Recommend customers apply OS patch to all machines with INFA product installed.
IR / name3 No direct impact.  Recommend customers apply OS patch to all machines with INFA product installed.
B2B DX / DIH All DX and DIH customers on Red Hat should apply OS patches.  Customers on other operating systems are also advised to apply OS patches.
PIM All PIM core and Procurement are not directly impacted. Recommend Media Manager customers apply OS patch to all machines with INFA product installed.
ActiveVOS All No direct impact for on-premise ActiveVOS product.  Cloud-realtime has already been patched.
Address Doctor All No direct impact for AD services run on Windows.  Procurement service has already been patched by Informatica Cloud Operations.
StrikeIron All No direct impact.

3 – How to contact Informatica about security

Informatica takes the security of our customers' data very seriously. Please consult the Informatica Knowledge Base (article ID 301574) or contact our Global Customer Support team if you have any questions or concerns. The Informatica support portal is always available at http://mysupport.informatica.com.

If you are a security researcher and have identified a potential vulnerability in an Informatica product or service, please follow our Responsible Disclosure Program.

Thank you,

Bill Burns, VP & Chief Information Security Officer

Posted in Cloud, Data Security, IaaS

Hacking: How Ready Is Your Enterprise?

Recent corporate data security challenges require companies to ask hard questions about enterprise readiness:

1)      How do you know if your firm is next in line?
2)      How well will your Information Technology team respond to an attempted breach?

Is your firm ready?

Over the last year, a number of high-profile data security breaches have taken place at major US corporations. But as a business person, how do you know the answers to the above questions? Do you know what is at risk? And with big data gathering so much attention these days, isn't consolidating it all a bit like putting all your eggs in one basket? According to the management scholar Theodore Levitt, part of being a manager is the ability to ask questions. My goal today is to arm business managers with the questions to ask so they can determine the answers to both of the questions above.

Is your Big Data secure?

Big Data is all the buzz today. How safe are your Big Data spaces? Do you know what is going into each of them? Judith Hurwitz, the President and CEO of Hurwitz & Associates, says that she worries about big data security. Judith even suggests that big data "introduces security risks into the company, unintended consequences can endanger the company". According to Judith, these risks come in two forms:

1)      Big data sources can contain viruses as well as other forms of business risk
2)      Big data lakes, if unprotected, represent a major business risk from hacking

Clearly, protecting your big data comprehensively requires diligence, including data encryption. But just remember: big data may seem like a science project in the back room, yet it puts in one place a significant volume of data that could damage your enterprise if exposed to the outside world.

Do you need better tools or better business processes?

While many of the discussions about recent hacks have focused on the importance of having the right, up-to-date tools in place, it is just as important to have the right business processes in place if you want to minimize the possibility of a breach and minimize losses when a breach occurs.

From an accessibility and security perspective, security processes look at the extent to which access to information is appropriately restricted to authorized parties. Next, from an information management perspective, they should consider the entire information life cycle. Information should be protected during all phases of its life cycle. Security should start at the information planning phase, and for many, this implies different protection mechanisms for storing, sharing, and disposing of information.

To determine what questions a business person should be asking their security professionals, I went to COBIT 5. For those who do not know, COBIT is the standard your auditors use to evaluate your company's technology under Sarbanes-Oxley. Understanding what it recommends matters because CFOs we have talked to say that, after the recent hacks, they expect increased scrutiny from their auditors. If you want to understand what auditors will look for, you should study COBIT 5. COBIT 5 has even linked its security policy guidance to the standard your IT security management team should be running against: ISO/IEC 27000. Want to impress your security management professionals? Ask them whether they are in compliance with ISO/IEC 27000.

Good information security requires policies and procedures

Now, let's explore what COBIT 5 recommends for information governance and security. The first thing it recommends is that good information security requires policies and procedures to be created and put in place. This sounds pretty reasonable. However, COBIT next insists on something that we all know is true as managers: enterprise culture and ethics are critical to making "security policies and procedures effective".

What metrics, then, should business people use to judge whether their firm is managing information security appropriately? COBIT 5 suggests that you look for two things right off the top.

1)      How recently did your IT organization conduct a risk assessment for the services that it provides?
2)      Does your IT organization have a current security plan which is accepted and communicated throughout the enterprise?

For the first, it is important that you then ask what percentage of IT services and programs are covered by a risk assessment, and what percentage of security incidents that took place were not identified in the risk assessment. The first question tells you how actively your IT organization is managing security; the second tells you whether there are gaps and risks. Your goal here should be to ensure that "IT-related enterprise risk does not exceed your risk appetite and your risk tolerance".

With regard to the security plan, you should be asking your IT leadership (your CIO or CISO) about the number of key security roles that have been clearly defined and about the number of security-related incidents over time. As important, find out how many security solutions currently deviate from the plan. A timely review of these could clearly reduce the probability of your systems getting hacked.

As a manager, you know that teams need policies and procedures to limit errors and to manage them when they occur. So ask: what are the procedures for managing through a security event? As important, ask what percentage of services are confirmed to be aligned with the security plan. At the same time, you want to know the number of security incidents caused by non-adherence to the security plan. For the future, you also want to make sure that all new solutions being developed confirm their alignment with the security plan from launch.

Other critical things to consider include the number of security incidents that have caused financial loss, business disruption, or public embarrassment. This, of course, is a big one that should be small in number. Then ask about the number of IT services with outstanding security requirements. Next, ask how long it takes to grant, change, and remove access privileges, and how frequently security is assessed against the latest standards and guidelines.

Concluding Remarks

Security is one area where you really need IT-business alignment. It is important, as a business professional, that you do your best to ensure that IT builds policies and procedures that conform to your corporate risk appetite. You also need to ensure that the governance, policies, and procedures your IT organization runs against are kept current and up to date. This includes ensuring that data is governed from end to end in the IT environment.

Related links

Solutions: Enterprise Level Data Security
The State of Data Centric Security
Gambling With Your Customer’s Financial Data
Twitter: @MylesSuer

Posted in CIO, Data masking, Data Security

Gambling With Your Customer’s Financial Data

CIOs and CFOs both dig data security

In my discussions with CIOs over the last couple of months, I asked them about the importance of a series of topics. All of them placed data security at the top of their IT priority list. Even their CFO counterparts, with whom they do not always see eye to eye, said they were very concerned about the business risk to corporate data. These CFOs said that, as part of owning business risk, they touch security, especially hacking. One CFO said that he also worried about the impact of data security on compliance issues, including HIPAA and SOX. Another said this: "The security of data is becoming more and more important. The auditors are going after this. CFOs, for this reason, are really worried about getting hacked. This is a whole new direction, but some of the highly publicized recent hacks have scared a lot of folks and they combined represent to many of us a watershed event."

Editor of CFO Magazine

According to David W. Owens, the editor of CFO Magazine, even if you are using "secure" storage, such as internal drives and private clouds, "the access to these areas can be anything but secure. Practically any employee can be carrying around sensitive financial and performance data in his or her pocket, at any time." Obviously, new forms of data access have created new forms of data risk.

Are some retailers really leaving the keys in the ignition?

Given the like mindset of CIOs and CFOs, I was shocked to learn that some of the recently hacked retailers had been using outdated security software, which may have given hackers easier access to company payment data systems. Most amazingly, some retailers had not even encrypted their customer payment data. Because of this, hackers were able to hide on the network for months and steal payment data as customers continued to use their credit cards at the company's point-of-sale locations.

Why weren’t these transactions encrypted or masked? In my 1998 financial information start-up, we encrypted our databases to protect against hacks of our customers’ personal financial data. One answer came from a discussion with a Fortune 100 insurance CIO, who said: "CIOs/CTOs/CISOs struggle with selling the value of these investments because the C-suite is only interested in hearing about investments with a direct impact on business outcomes and benefits".

Enterprise security drives enterprise brand today

So how should leaders better argue the business case for security investments? I want to suggest that the value of IT is its "brand promise". For retailers in particular, if a past purchase decision creates a perceived personal data security risk, IT becomes a liability to the corporation's brand equity and potentially creates a negative impact on future sales. Increasingly, how these factors are managed either supports or undermines the value of a company's brand.

My message is this: spend whatever it takes to protect your brand equity; otherwise a security issue will become a revenue issue.

In sum, this means organizations that want to differentiate themselves and avoid becoming a brand liability need to further invest in their data-centric security strategy and, of course, encryption. The game is no longer just about securing particular applications. IT organizations need to take a data-centric approach to securing customer data and other types of enterprise data. Enterprise-level data governance rules need to be a requirement. A data-centric approach can mitigate business risk by helping organizations understand where sensitive data is and protect it in motion and at rest.

Related links

Solutions: Enterprise Level Data Security
The State of Data Centric Security
How Is The CIO Role Starting To Change?
The CFO viewpoint on data
CFOs discuss their technology priorities
Twitter: @MylesSuer


Posted in CIO, Data Governance, Data masking, Data Security, Retail

CFOs Discuss Their Technology Priorities

Recently, I had the opportunity to talk to a number of CFOs about their technology priorities. These discussions represent an opportunity for CIOs to hear what their most critical stakeholder considers important. The CFOs did not hesitate or need to think much about this question. They said three things make their priority list: better financial system reliability, better application integration, and better data security and governance. The top two match well with a recent KPMG study, which found that the biggest improvement finance executives want to see, cited by 91% of survey respondents, is in the quality of financial and performance insight obtained from the data they produce, followed closely by the finance and accounting organization's ability to proactively analyze that information before it is stale or out of date.

Better financial system reliability

CFOs want to know that their systems work and are reliable. They want the data collected from their systems to be analyzed in a timely fashion. Importantly, CFOs say they are worried about more than the timeliness of accounting and financial data, because they increasingly need to manage upward with information. For this reason, they want timely, accurate information produced for financial and business decision makers. Their goal is to drive better enterprise decision making.

In manufacturing, for example, CFOs say they want data to span from the manufacturing systems to the distribution system. They want to be able to push a button and get a report. These CFOs complain today about the need to manually massage and integrate data from system after system before they get what they and their business decision makers want and need.

Better Application Integration

CFOs really feel the pain of systems not talking to each other. They know firsthand that they have "disparate systems" and that too much manual integration is going on. They see firsthand the difficulties in connecting data from front-end to back-end systems. They personally feel the large number of manual steps required to pull data. They want the consolidation of account information to be less manual and more timely. One CFO said he "wants the integration of the right systems to provide the right information to be done so they have the right information to manage and make decisions at the right time".

Data Security and Governance

CFOs, at the same time, say they have become more worried about data security and governance. Even though CFOs believe that security is the job of the CIO and the CISO, they have an important role to play in data governance. CFOs say they are really worried about getting hacked. One CFO told me that he needs to know that systems are always working properly. Security of data matters to CFOs today for two reasons. First, data has a clear material impact; just take a look at the out-of-pocket and revenue losses coming from the breach at Target. Second, CFOs, who were already being audited for technology and system compliance, feel that their audit firms will be obligated to extend what they were doing in security and governance as part of regular compliance audits. One CFO put it this way: "This is a whole new direction for us. Target scared a lot of folks and will be to many respects a watershed event for CFOs".

Takeaways

So the message here is that CFOs prioritize three technology objectives for their CIOs: better financial system reliability, better application integration, and improved data security and governance. Each of these represents an opportunity to make the CFO's life easier and, more importantly, to enable them to take on a more strategic role. The CFOs that we talked to want to become one of the top three decision makers in the enterprise. Fixing these things for CFOs will enable CIOs to build closer CFO and business relationships.

Related links

Solution Brief: The Intelligent Data Platform

Solution Brief: Secure at Source

Related Blogs

The CFO Viewpoint upon Data

How CFOs can change the conversation with their CIO?

New type of CFO represents a potent CIO ally

Competing on Analytics

The Business Case for Better Data Connectivity

Twitter: @MylesSuer

Posted in CIO, Data Security, Governance, Risk and Compliance

Is the Internet of Things relevant for the government?

Get connected. Be connected. Make connections. Find connections. The Internet of Things (IoT) is all about connecting people, processes, data and, as the name suggests, things. The recent social media frenzy surrounding the ALS Ice Bucket Challenge has certainly reminded everyone of the power of social media, the Internet, and a willingness to answer a challenge. Fueled by personal and professional connections, the craze has transformed fundraising for at least one charity. Similarly, IoT may prove transformational to the business of the public sector, should government step up to the challenge.


Is the Internet of Things relevant for the government?

Government is struggling with the concept and reality of how IoT really relates to the business of government, and perhaps rightfully so. For commercial enterprises, IoT is far more tangible and simply more fun. Gaming consoles, televisions, watches, Google Glass, smartphones and tablets are all about delivering over-the-top, new and exciting consumer experiences. Industry is delivering transformational innovations, which are connecting people to places, data and other people at a record pace.

It’s time to accept the challenge. Government agencies need to keep pace with their commercial counterparts and harness the power of the Internet of Things. The end game is not to deliver new, faster, smaller, cooler electronics; the end game is to create solutions that let devices connected to the Internet interact and share data, regardless of their location, manufacturer or format, and make or find connections that may have been previously undetectable. For some, this concept is as foreign or scary as pouring ice water over their heads. For others, the new opportunity to transform policy, service delivery, leadership, legislation and regulation is fueling a transformation in government. And it starts with one connection.

One way to start could be linking previously siloed systems together or creating a golden record of all citizen interactions through a Master Data Management (MDM) initiative. It could start with a big data and analytics project to determine and mitigate risk factors in education, or with linking sensor data across multiple networks to increase intelligence about potential hacking or breaches. Agencies could stop waste, fraud and abuse before they happen by linking critical payment, procurement and geospatial data together in real time.

This is the Internet of Things for government. This is the challenge. This is transformation.

This article was originally published on www.federaltimes.com. Please view the original listing here

 


In a Data First World, IT must Empower Business Change!

IT must Empower Business ChangeYou probably know this already, but I’m going to say it anyway: It’s time you changed your infrastructure. I say this because most companies are still running infrastructure optimized for ERP, CRM and other transactional systems. That’s all well and good for running IT-intensive, back-office tasks. Unfortunately, this sort of infrastructure isn’t great for today’s business imperatives of mobility, cloud computing and Big Data analytics.

Virtually all of these imperatives are fueled by information gleaned from potentially dozens of sources to reveal our users’ and customers’ activities, relationships and likes. Forward-thinking companies are using such data to find new customers, retain existing ones and increase their market share. The trick lies in translating all this disparate data into useful meaning. And to do that, IT needs to move beyond focusing solely on transactions, and instead shine a light on the interactions that matter to their customers, their products and their business processes.

They need what we at Informatica call a “Data First” perspective. You can check out my first blog post about being Data First here.

A Data First POV changes everything from product development, to business processes, to how IT organizes itself and, most especially, the impact IT has on your company’s business. That’s because cloud computing, Big Data and mobile app development shift IT’s responsibilities away from running and administering equipment and toward aggregating, organizing and improving myriad data types pulled in from internal and external databases, online posts and public sources. And that shift makes IT a more empowering force for business change. Think about it: the ability to connect and relate the dots across data from multiple sources finally gives you real power to improve entire business processes, departments and organizations.

I like to say that the role of IT is now “big I, little t,” with that lowercase “t” representing both technology and transactions. But that role requires a new set of priorities. They are:

  1. Think about information infrastructure first and application infrastructure second.
  2. Create great data by design. Architect for connectivity, cleanliness and security. Check out the eBook Data Integration for Dummies.
  3. Optimize for speed and ease of use – SaaS and mobile applications change often. Click here to try Informatica Cloud for free for 30 days.
  4. Make data a team sport. Get tools into your users’ hands so they can prepare and interact with it.
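Priority 2, creating great data by design, can be made concrete with a small sketch: validate records at the point of ingestion rather than cleaning them downstream. This is only an illustration of the idea, not an Informatica API; the field names and rules are hypothetical.

```python
import re

# Hypothetical rules: which fields a record must carry, and what a
# plausible email looks like. Replace with your own data contract.
REQUIRED = ("customer_id", "email")
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def validate(record):
    """Return a list of problems; an empty list means the record is clean."""
    problems = ["missing " + f for f in REQUIRED if not record.get(f)]
    email = record.get("email", "")
    if email and not EMAIL_RE.match(email):
        problems.append("malformed email")
    return problems

def ingest(records):
    """Split incoming records into clean records and (record, problems) rejects."""
    clean, rejected = [], []
    for r in records:
        problems = validate(r)
        if problems:
            rejected.append((r, problems))
        else:
            clean.append(r)
    return clean, rejected
```

The point of the design is that bad data is caught, and its reasons recorded, before it ever lands in a downstream system.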

I never said this would be easy, and there’s no blueprint for how to go about doing it. Still, I recognize that a little guidance will be helpful. In a few weeks, Informatica’s CIO Eric Johnson and I will talk about how we at Informatica practice what we preach.


3 Ways to Sell Data Integration Internally


So, you need to grab some budget for a data integration project, but no one understands what data integration is or what business problems it solves, and it’s difficult to explain without a whiteboard and a lot of time.  I’ve been there.

I’ve “sold” data integration as a concept for the last 20 years.  Let me tell you, it’s challenging to define the benefits to those who don’t work with this technology every day.  That said, most of the complaints I hear about enterprise IT are around the lack of data integration, and thus the inefficiencies that go along with that lack, such as re-keying data, data quality issues, lack of automation across systems, and so forth.

Considering that most of you will sell data integration to your peers and leadership, I’ve come up with 3 proven ways to sell data integration internally.

First, focus on the business problems.  Use real-world examples from your own business.  It’s not tough to find cases where the data was simply not there to make core operational decisions, leading to huge mistakes that proved costly to the company.  Or, more likely, there are things like ineffective inventory management with no way to know when orders need to be placed.  Or there’s the go-to standard: no single definition of what a “customer” or a “sale” is among the systems that support the business.  That one is like back pain: everyone has it at some point.
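To make the “customer” problem concrete, here is a hypothetical sketch: two systems hold the same person under different keys and formats, and even deciding whether two records describe the same customer requires a normalization step that no single system owns. The system names and fields are invented for illustration.

```python
def normalize(record):
    """Reduce a customer record to a comparable, canonical form."""
    return {
        "name": record["name"].strip().lower(),
        "email": record["email"].strip().lower(),
    }

# The CRM and the billing system each have their own "definition" of a customer:
# different keys, different casing, stray whitespace.
crm = {"cust_no": "C-1001", "name": "Jane Doe ", "email": "JDoe@Example.com"}
billing = {"acct": 77, "name": "jane doe", "email": "jdoe@example.com"}

# Without normalization the records look unrelated; with it, they match.
same_customer = normalize(crm) == normalize(billing)
```

Real matching is harder than this (nicknames, typos, moved addresses), which is exactly why a whiteboard-free explanation is so difficult; but even this toy case shows the cost of having no shared definition.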

Second, define the business case in practical terms with examples.  Once you have defined the business problems that exist due to the lack of a sound data integration strategy and technology, it’s time to put numbers behind those problems.  Those in IT have a tendency to either greatly overstate, or greatly understate, the amount of money that’s being wasted and thus could be saved by using data integration approaches and technology.  So, provide practical numbers that you can back up with existing data.
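As a back-of-the-envelope illustration, the business case can be reduced to a few lines of arithmetic. Every figure below is an assumption for the sake of the sketch; the whole point of the advice above is to replace them with numbers measured in your own organization.

```python
# Assumed inputs -- replace with your own measured data.
hours_rekeying_per_week = 40        # staff time spent manually re-keying data
loaded_hourly_rate = 60.0           # fully loaded cost per staff hour, USD
error_rework_cost_per_year = 25_000.0  # estimated yearly cost of fixing bad data

# Annualize the re-keying labor and add the rework cost.
annual_rekeying_cost = hours_rekeying_per_week * 52 * loaded_hourly_rate
annual_waste = annual_rekeying_cost + error_rework_cost_per_year

print(f"Estimated annual waste: ${annual_waste:,.0f}")
```

A model this simple is easy to defend in a budget meeting precisely because each input can be backed up with existing data, which is what keeps it from being overstated or understated.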

Finally, focus on a phased approach to implementing your data integration solution.  The “Big Bang Theory” is a great way to explain the beginning of the universe, but it’s not the way you want to roll out your data integration technology.  Define a workable plan that moves from one small grouping of systems and databases to the next, over time, with a reasonable amount of resources and technology.  You do this to remove risk from the effort, manage costs, and ensure that you can feed lessons learned back into the work.  I would rather roll out data integration within an enterprise using small teams, one problem domain at a time, than attempt to do everything within a few years.

The reality is that data integration is no longer optional for enterprises these days.  It’s required for so many reasons, from data sharing, information visibility, compliance, security, automation…the list goes on and on.  IT needs to take point on this effort.  Selling data integration internally is the first and most important step.  Go get ‘em.
