Category Archives: Enterprise Data Management

(Re)Thinking Data Security Strategy

Data Security Strategy

Rethinking Data Security Strategy

Data security is usually something people think about only after they get hacked, a business they use gets hacked, or they lose a credit card or wallet. It is human nature not to worry about things you cannot see and that seem well in hand. Instead, I would suggest every company (and person) take just 15 minutes once a month to think through the items below, each of which needs to be part of a data security strategy.

Data security is a complex issue with many facets. I will skip past how you create and use passwords, as that is the one area that already gets plenty of focus. With SaaS and cloud-based technologies now widely accepted by companies, and by people in their personal lives, it is time to take a few moments to consider just how your data is secured or, in some cases, at risk.

Data-centric security. Traditionally, enterprise security has focused on access: what can be accessed, from where, and by whom. The problem with this walled-garden approach is that the technologies and procedures behind it do not account for how data is actually used. Most data security programs are also badly outdated in a world where the majority of companies rely on systems they do not own or directly manage (e.g., SaaS, cloud, mobile), and where new types of data are constantly being created by people, systems, and applications. Many enterprise security strategies need to move beyond access control to include data usage and the ontology of the data being used.

Question: Does your company have a modern enterprise security strategy or a walled garden approach?

Data about data. Long ago, to make it easier to store, search, and retrieve data, people figured out that adding descriptive information about what is in a data file would be useful. The actual term is metadata, and it is no different from the labels people put on paper file folders before everything moved to software-based storage. The problem is that metadata has grown enormously, and it can let people learn a great deal of personal, business, and proprietary information without ever accessing the underlying file. The richer the metadata, the more business or personal risk is created, because information can be exposed without the underlying data ever being touched.

Question: Are you accidentally exposing sensitive information in your metadata?
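
To make the risk concrete, here is a minimal sketch of auditing and scrubbing document properties before a file is shared. The property names mirror those found in common office documents (e.g., docProps/core.xml inside a .docx), but the field names and the `SENSITIVE_KEYS` policy here are illustrative assumptions, not any particular product's behavior.

```python
# Illustrative sketch: flag potentially sensitive metadata before sharing a file.
SENSITIVE_KEYS = {"author", "last_modified_by", "company", "revision", "comments"}

def audit_metadata(properties: dict) -> dict:
    """Return the subset of metadata that may leak personal or business info."""
    return {k: v for k, v in properties.items() if k.lower() in SENSITIVE_KEYS}

def scrub_metadata(properties: dict) -> dict:
    """Return a copy of the metadata with sensitive properties removed."""
    return {k: v for k, v in properties.items() if k.lower() not in SENSITIVE_KEYS}

doc_props = {
    "title": "Q3 Forecast",
    "author": "j.smith",
    "company": "Acme Corp",
    "revision": "47",          # 47 revisions hints at a heavily edited draft
    "created": "2014-09-01",
}

print(sorted(audit_metadata(doc_props)))   # ['author', 'company', 'revision']
print(sorted(scrub_metadata(doc_props)))   # ['created', 'title']
```

Even this toy example shows how author, company, and revision history leak information about a document that was never "opened."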

At-rest data. The reason people used to say to keep your tax records for three years and then destroy them is that everything was stored in file cabinets, drawers, or under a mattress. Some people still prefer physical records, but for most people and companies data has been stored electronically for a long time. SaaS and cloud-based solutions add a new wrinkle, because the data is stored somewhere you do not necessarily have direct access to, and in many cases it is stored multiple times through archives and backups. Even deleted data is often not really gone: with the right technology, data can be recovered if it was not fully wiped from the storage system.

Question: Do you know where your data is stored? Archived? Backed up?

Question: Do you know how you would dispose of sensitive data that is no longer needed?
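
As one small illustration of why "delete" is not disposal, here is a sketch that overwrites a file with random bytes before unlinking it. This is a teaching example, not a disposal tool: on SSDs, journaling, or copy-on-write filesystems the original blocks may survive anyway, which is exactly why deleted data is often recoverable.

```python
import os
import secrets

def shred(path: str, passes: int = 3) -> None:
    """Overwrite a file with random bytes before unlinking it.

    Illustrative only: wear leveling, journaling, and snapshots can all
    preserve copies of the original blocks out of this function's reach.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(secrets.token_bytes(size))  # replace contents in place
            f.flush()
            os.fsync(f.fileno())                # push the overwrite to disk
    os.remove(path)

# usage
with open("secret.txt", "wb") as f:
    f.write(b"card number 4111-1111-1111-1111")
shred("secret.txt")
print(os.path.exists("secret.txt"))  # False
```

The broader point: sensitive data disposal has to be planned per storage system (and per archive and backup copy), not assumed.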

In-flight data. No, this is not the Wi-Fi on the airplane. This is literally the data and metadata in motion as applications use them in the regular course of business. The issue is that data in transit can be at risk. This is one reason people are warned to be careful on public Wi-Fi: any decent hacker can see all the data on the network (yes, it really is that easy). A related enterprise issue is data cleansing to reduce duplicates and errors in data. The problem is how to do this with sensitive data (e.g., HR or financial records) that you do not want the developers or IT staff actually seeing.

Question: How does your company safeguard transactional and in-flight data?

Question: Does your company use data masking and cleansing technology to safeguard in-flight data?
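
A toy sketch of static data masking makes the idea tangible: developers get records with realistic shapes, but the identifying values are gone. The field names (`name`, `ssn`, `dept`) are hypothetical; real masking tools are policy-driven and far more thorough.

```python
import hashlib
import re

def mask_record(record: dict) -> dict:
    """Return a de-identified copy of an HR-style record (illustrative fields)."""
    masked = dict(record)
    # Deterministic pseudonym: the same employee always maps to the same token,
    # so joins and de-duplication still work on the masked data.
    masked["name"] = "emp_" + hashlib.sha256(record["name"].encode()).hexdigest()[:8]
    # Partial redaction keeps the format developers need for testing.
    masked["ssn"] = re.sub(r"\d", "*", record["ssn"][:-4]) + record["ssn"][-4:]
    return masked

rec = {"name": "Jane Doe", "ssn": "123-45-6789", "dept": "Finance"}
print(mask_record(rec)["ssn"])   # ***-**-6789
```

Because the pseudonym is deterministic, data cleansing jobs can still find duplicates across systems without anyone seeing the real names.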

Data. Yes, the actual data or information that you care about, or just store because it is so easy. I recommend that companies look at their data holistically and think about it across its lifecycle. In this approach, data risks should be identified for how the data is stored, used, transmitted, exposed internally or externally, and integrated or accessed. Some new and interesting solutions coming to market go beyond traditional data security, masking, and cleansing to help identify and assess data security risks, in an area called Security Intelligence. Security Intelligence solutions are meant to measure security risk and identify issues so that a) they can be addressed before becoming a big problem, and b) automated procedures can be put in place to improve security or bring systems up to the desired level.

One example is a new solution from Informatica called Secure@Source, which is just coming to market. It is meant to provide automated analysis so enterprises can determine their data risks, make improvements, and then put new policies and automated procedures in place so the desired level of data security is maintained. Similar solutions have been used for network security for years; these newer offerings take comparable approaches but deal with the more specific issues of data security.

Question: What is your company doing to proactively assess and manage data risk? Are you a good candidate for a security intelligence approach?

Data security is an important issue that every company should have a strategy for. While this is not meant to be an all-encompassing list, it is a good starting place for a discussion. Stay secure. Don't be the next company in the news with a data security issue.

Posted in Architects, Big Data, Business Impact / Benefits, Cloud Computing, Data Governance, Data Integration, Enterprise Data Management, Master Data Management

Five Levers of Change

Organizational Change Management and Business Process Re-engineering were all the rage in the 1990s. Much of that thinking persists, but it is no longer sufficient for the kinds of transformations organizations must accomplish on an ongoing basis. A modern business transformation is data driven, global in nature, crosses functional boundaries, and changes behavior at multiple levels of the organization. To address these needs, organizations need to adopt a business-led, enterprise-wide planning capability.

Posted in Enterprise Data Management, Operational Efficiency, Professional Services

Parsing: Does WSJ Really Think Data Management Eliminates Jobs?

I enjoyed reading a story in the WSJ this week about data and how it is being used by startups, with an example of how the company Chubbies is using modern data management tools. Unlike the story, my belief is that the new era of data management tools is a net job creator, not the job eliminator its title ("Data Is the New Middle Manager") suggests.

There were a number of interesting statements and suggestions based on data (and I think they meant data management tools) that included:

  1. Startups today are flatter than other companies
  2. Access to data management tools is more pervasive today, especially in startups
  3. Employees at startups are more empowered to make decisions compared to mainstream companies

First, from my own experience working at startups and large Fortune 100 companies, these three points are not new when comparing the two.  However, setting the startup focus aside, the observations about access to and use of enterprise data management tools are still very important. The main aspects the article highlights include the following:

  1. Small companies have access to ever more powerful data management tools every bit as good (or better) than large companies
  2. The latest data management tools are easier to learn and easier to use so that just about any employee can get up to speed
  3. It is easier to provide access to the latest data management tools due to newer technologies

These last items are where the article matches what many companies and people are seeing. It really is possible to get very good data preparation tools, data quality tools, and full-blown data management capabilities for a fraction of what they would have cost even 5 or 10 years ago.  In addition, these tools are easier to learn and easier to make available. This matters in a world where, as much as many people hate to admit it, Excel is still king because it is the good-enough tool that almost anyone can learn.

My point of view is that the story is less about startups vs. large companies and far more about the much greater access everyone now has to powerful data management tools. Some that I would point out, spanning full platforms to specific industries and use cases:

  • DataBricks : data visualization and big data processing
  • Informatica Rev : do it yourself data preparation
  • Salesforce Wave : for anyone using Salesforce.com an integrated analytics platform
  • Radian6 : social media monitoring platform for marketing professionals
  • Yodlee : financial management analytics
  • BigML : data decision support platform with predictive analytics

Posted in Enterprise Data Management

Great Customer Experiences Start with Great Customer Data

Are your customer-facing teams empowered with the great customer data they need to deliver great customer experiences?

On Saturday, I got a call from my broadband company on my mobile phone. The sales rep pitched a great limited-time offer for new customers. I asked him whether I could take advantage of this great offer as well, even though I am an existing customer. He was surprised.  “Oh, you’re an existing customer,” he said, dismissively. “No, this offer doesn’t apply to you. It’s for new customers only. Sorry.” You can imagine my annoyance.

If this company had built a solid foundation of customer data, the sales rep would have had a customer profile rich with clean, consistent, and connected information as a reference. If he had visibility into my total customer relationship with his company, he'd know that I'm a loyal customer with two current service subscriptions. He'd know that my husband and I have been customers for 10 years at our current address. On top of that, he'd know we both subscribed to their services while living at separate addresses before we were married.

Unfortunately, his company didn’t arm him with the great customer data he needs to be successful. If they had, he could have taken the opportunity to offer me one of the four services I currently don’t subscribe to—or even a bundle of services. And I could have shared a very different customer experience.

Every customer interaction counts

Executives at companies of all sizes talk about being customer-centric, but it’s difficult to execute on that vision if you don’t manage your customer data like a strategic asset. If delivering seamless, integrated, and consistent customer experiences across channels and touch points is one of your top priorities, every customer interaction counts. But without knowing exactly who your customers are, you cannot begin to deliver the types of experiences that retain existing customers, grow customer relationships and spend, and attract new customers.

How would you rate your current ability to identify your customers across lines of business, channels and touch points?

Many businesses, however, have anything but an integrated and connected customer-centric view—they have a siloed and fragmented channel-centric view. In fact, sales, marketing, and call center teams often identify siloed and fragmented customer data as key obstacles preventing them from delivering great customer experiences.

Many companies struggle to deliver great customer experiences across channels because their siloed systems give them a channel-centric view of customers.

According to Retail Systems Research, creating a consistent customer experience remains the most valued capability for retailers, but 55% of those surveyed indicated their biggest inhibitor was not having a single view of the customer across channels.

Retailers are not alone. An SVP of marketing at a mortgage company admitted in an Argyle CMO Journal article that, now that his team needs to deliver consistent customer experiences across channels and touch points, they realize they are not as customer-centric as they thought they were.

Customer complexity knows no bounds

The fact is, businesses are complicated, with customer information fragmented across divisions, business units, channels, and functions.

Citrix, for instance, is bringing together valuable customer information from 4 systems. At Hyatt Hotels & Resorts, it’s about 25 systems. At MetLife, it’s 70 systems.

How many applications and systems would you estimate contain valuable customer information at your company?

Based on our experience working with customers across many industries, we know the total customer relationship allows:

  • Marketing to boost response rates by better segmenting their database of contacts for personalized marketing offers.
  • Sales to more efficiently and effectively cross-sell and up-sell the most relevant offers.
  • Customer service teams to resolve customers’ issues immediately, instead of placing them on hold to hunt for information in a separate system.

If your marketing, sales, and customer service teams are struggling with inaccurate, inconsistent, and disconnected customer information, it is costing your company revenue, growth, and success.

Transforming customer data into total customer relationships

Informatica’s Total Customer Relationship Solution fuels business and analytical applications with clean, consistent and connected customer information, giving your marketing, sales, e-commerce and call center teams access to that elusive total customer relationship. It not only brings all the pieces of fragmented customer information together in one place where it’s centrally managed on an ongoing basis, but also:

  • Reconciles customer data: Your customer information should be the same across systems, but often isn’t. Assess its accuracy, fixing and completing it as needed—for instance, in my case merging duplicate profiles under “Jakki” and “Jacqueline.”
  • Reveals valuable relationships between customers: Map critical connections. Are individuals members of the same household or influencer network? Are two companies part of the same corporate hierarchy? Even link customers to personal shoppers, insurance brokers, salespeople, or channel partners.
  • Tracks thorough customer histories: Identify customers’ preferred locations; channels, such as stores, e-commerce, and catalogs; or channel partners.
  • Validates contact information: Ensure email addresses, phone numbers, and physical addresses are complete and accurate so invoices, offers, or messages actually reach customers.
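
The reconciliation idea above can be sketched in a few lines. This is a toy illustration, not how any particular MDM product works: the nickname table, field names, and "last non-empty value wins" survivorship rule are all simplifying assumptions (real matching engines use probabilistic matching and much richer reference data).

```python
from collections import defaultdict

# Toy nickname table (the "Jakki"/"Jacqueline" case from the story above).
NICKNAMES = {"jakki": "jacqueline", "bob": "robert", "liz": "elizabeth"}

def match_key(profile: dict) -> tuple:
    """Build a simple match key: normalized email + canonical first name."""
    first = profile["first_name"].strip().lower()
    return (profile["email"].strip().lower(), NICKNAMES.get(first, first))

def merge_profiles(profiles: list) -> list:
    """Group duplicate profiles and merge each group into one golden record."""
    groups = defaultdict(list)
    for p in profiles:
        groups[match_key(p)].append(p)
    merged = []
    for dupes in groups.values():
        golden = {}
        for p in dupes:  # naive survivorship: last non-empty value wins
            golden.update({k: v for k, v in p.items() if v})
        merged.append(golden)
    return merged

crm = [
    {"first_name": "Jakki", "email": "J.Smith@example.com", "phone": ""},
    {"first_name": "Jacqueline", "email": "j.smith@example.com", "phone": "555-0100"},
]
print(len(merge_profiles(crm)))   # 1 — both records collapse into one profile
```

Even this crude version shows why matching and survivorship rules, not just storage, are the heart of building a single customer view.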

With a view of the Total Customer Relationship, teams are empowered to deliver great customer experiences

This is just the beginning. From here, imagine enriching your customer profiles with third-party data. What types of information help you better understand, sell to, and serve your customers? What are your plans for incorporating social media insights into your customer profiles? What could you do with this additional customer information that you can’t do today?

We’ve helped hundreds of companies across numerous industries build a total customer relationship view. Merrill Lynch boosted marketing campaign effectiveness by 30%. Citrix boosted conversion rates by 20%. A $60 billion global manufacturer improved cross-sell and up-sell success by 5%. A hospitality company boosted cross-sell and up-sell success by 60%. And Logitech increased sales across channels, including their online site, retail stores, and distributors.

Informatica’s Total Customer Relationship Solution empowers your people with confidence, knowing that they have access to the kind of great customer data that allows them to surpass customer acquisition and retention goals by providing consistent, integrated, and seamless customer experiences across channels. The end result? Great experiences that customers are inspired to share with their family and friends at dinner parties and on social media.

Do you have a terrible customer experience or a great customer experience to share? If so, please share it with us and other readers using the comment option below.

Posted in B2B, Business Impact / Benefits, Business/IT Collaboration, CIO, CMO, Customer Acquisition & Retention, Data Integration, Data Quality, Data Services, Enterprise Data Management, Master Data Management, Total Customer Relationship

Building an Impactful Data Governance Program – One Step at a Time

Let’s face it, building a data governance program is no overnight task. As one CDO puts it: “data governance is a marathon, not a sprint.” Why? Because data governance is a complex business function encompassing technology, people, and process, all of which have to work together effectively to ensure the initiative’s success. Because of its scope, data governance often calls for participants from different business units within an organization, and it can be disruptive at first.

Why bother, then, given that data governance is complex, disruptive, and could introduce additional cost? The drivers vary across organizations. Let’s take a close look at some of the motivations behind a data governance program.

For companies in heavily regulated industries, establishing a formal data governance program is a mandate. When a company is not compliant, the consequences can be severe: hefty fines, brand damage, lost revenue, and even potential jail time for the person held accountable for the noncompliance. To meet ongoing regulatory requirements and adhere to data security policies and standards, companies need clean, connected, and trusted data that enables transparency and auditability in their reporting, so they can meet mandatory requirements and answer critical questions from auditors. Without a dedicated data governance program in place, the compliance initiative can become an ongoing nightmare.

A data governance program can also be established to support a customer centricity initiative. To cross-sell and up-sell effectively and grow your business, you need clear visibility into customer purchasing behaviors across multiple shopping channels and touch points. Because customers’ shopping behaviors and attributes are captured in the data, a holistic data governance program is essential to gaining a thorough understanding of your customers and boosting your sales.

Other reasons companies start a data governance program include improving efficiency, reducing operational cost, supporting better analytics, and driving more innovation. As long as data is at the core of a business-critical process and the business case is clear and sound, there is a compelling reason to launch a data governance program.

Now that we have identified the drivers for data governance, how do we start? This rather loaded question gets into the details of implementation. A few critical elements come into consideration: identifying and establishing task forces such as a steering committee, the data governance team, and business sponsors; defining roles and responsibilities for the stakeholders involved; and defining metrics for tracking results. And soon you will find that, on top of everything, communication, communication, and more communication is probably the most important tactic for driving the program’s initial success.

A rule of thumb?  Start small, take one-step at a time and focus on producing something tangible.

Sounds easy, right? Well, let’s hear what real-world practitioners have to say. Join us at this Informatica webinar to hear Michael Wodzinski, Director of Information Architecture; Lisa Bemis, Director of Master Data; and Fabian Torres, Director of Project Management at Houghton Mifflin Harcourt, a global leader in publishing, along with David Lyle, VP of Product Strategy at Informatica, discuss how to implement a successful data governance practice that brings business impact to an enterprise organization.

If you are currently kicking the tires on setting up a data governance practice in your organization, I’d like to invite you to visit a member-only website dedicated to data governance: http://governyourdata.com/. The site currently has over 1,000 members and is designed to foster open communication on everything data governance: best practices, methodologies, frameworks, tools, and metrics. I would also encourage you to take a data governance maturity assessment to see where you currently stand on the maturity curve, and compare the result against industry benchmarks. More than 200 members have taken the assessment to better understand their current data governance programs, so why not give it a shot?

Data governance is a journey, and likely a never-ending one. We wish you the best of luck and a joyful ride! We’d love to hear your stories.

Posted in Big Data, Data Governance, Data Integration, Data Quality, Enterprise Data Management, Master Data Management

How to Ace Application Migration & Consolidation (Hint: Data Management)

Myth Vs Reality: Application Migration & Consolidation (No, it’s not about dating)

Will your application consolidation or migration go live on time and on budget?  According to Gartner, “through 2019, more than 50% of data migration projects will exceed budget and/or result in some form of business disruption due to flawed execution.”1  That is a scary number by any measure. As a colleague of mine put it: “I wouldn’t get on a plane that had a 50% chance of failure.” So should you be losing sleep over your migration or consolidation project? That depends.  Are you the former CIO of Levi Strauss, who, according to Harvard Business Review, was forced to resign after a botched SAP migration project and a $192.5 million earnings write-off?2  If so, perhaps you would feel a bit apprehensive. Otherwise, you can be cautiously optimistic, provided you go in with a healthy dose of reality: a good understanding of the potential pitfalls and how to address them, and an appreciation for the myths and realities of application consolidation and migration.

First off, let me get one thing off my chest.  If you don’t pay close attention to your data throughout the application consolidation or migration process, you are almost guaranteed delays and budget overruns. Data consolidation and migration is at least 30%-40% of the application go-live effort; we have learned this by helping customers deliver over 1,500 projects of this type.  What’s worse, if you are not meticulous about your data, you can expect unhappy business stakeholders at the end of this treacherous journey. The users of your new application expect all their business-critical data to be there at the end of the road. All the bells and whistles in your new application will matter naught if the data falls apart.  Imagine, if you will, students’ transcripts gone missing, or your frequent-flyer balance 100,000 miles short!  Need I say more?  You may already be guessing where I am going with this: the myths and realities related to your data.  Let’s explore a few of them.

Myth #1: All my data is there.

Reality #1: It may be there… but can you get it? To find, access, and move all the data out of your legacy systems, you need a good set of connectivity tools that can easily and automatically find, access, and extract the data from your source systems. You don’t want to hand-code this for each source.  Ouch!

Myth #2: I can just move my data from point A to point B.

Reality #2: You can try that approach if you want; however, you might not be happy with the results.  The reality is that there can be significant gaps and format mismatches between the data in your legacy system and the data required by your new application. Additionally, you will likely need to assemble data from disparate systems. You need sophisticated tools to profile, assemble, and transform your legacy data so that it is purpose-fit for your new application.
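
A tiny sketch of what bridging such gaps looks like in practice: renaming fields and normalizing formats so a legacy row fits the target schema. The legacy column names, date format, and cents-based amounts here are hypothetical examples of the mismatches described above.

```python
from datetime import datetime

# Hypothetical mapping from a legacy schema to the new application's schema.
FIELD_MAP = {"CUST_NM": "customer_name", "ORD_DT": "order_date", "AMT": "amount"}

def transform(legacy_row: dict) -> dict:
    """Rename fields and normalize formats so the row fits the target schema."""
    row = {FIELD_MAP[k]: v for k, v in legacy_row.items() if k in FIELD_MAP}
    # Assume the legacy system stores dates as DDMMYYYY strings; the new
    # application expects ISO 8601.
    row["order_date"] = datetime.strptime(row["order_date"], "%d%m%Y").date().isoformat()
    # Assume the legacy system stores amounts in cents as strings.
    row["amount"] = int(row["amount"]) / 100
    return row

print(transform({"CUST_NM": "Acme", "ORD_DT": "09032015", "AMT": "129900"}))
# {'customer_name': 'Acme', 'order_date': '2015-03-09', 'amount': 1299.0}
```

Multiply this by hundreds of fields and dozens of source systems and the case for proper profiling and transformation tooling makes itself.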

Myth #3: All my data is clean.

Reality #3:  It’s not. And here is a tip: profile, scrub, and cleanse your data before you migrate it. You don’t want to put a shiny new application on top of questionable data. In other words, get a fresh start on the data in your new application!
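
Profiling doesn't have to be mysterious; at its crudest it is just counting. Here is a minimal sketch (illustrative field names, and a simplistic notion of "missing") of the kind of per-column statistics a real profiling tool produces at scale.

```python
def profile(rows: list) -> dict:
    """Crude column profile: missing and distinct counts per field."""
    columns = {k for row in rows for k in row}
    stats = {}
    for col in columns:
        values = [row.get(col) for row in rows]
        # Treat None, empty strings, and "N/A" sentinels as missing.
        non_missing = [v for v in values if v not in (None, "", "N/A")]
        stats[col] = {
            "missing": len(values) - len(non_missing),
            "distinct": len(set(non_missing)),
        }
    return stats

legacy = [
    {"email": "a@example.com", "country": "US"},
    {"email": "", "country": "US"},
    {"email": "a@example.com", "country": "N/A"},
]
print(profile(legacy))  # each column: 1 missing value, 1 distinct value
```

Numbers like these, gathered before the migration, tell you exactly where the scrubbing effort needs to go.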

Myth #4: All my data will move over as expected.

Reality #4: It will not.  Any time you move and transform large sets of data, there is room for logical or operational errors and surprises.  The best way to avoid them is to automatically validate that your data has moved over as intended.
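
One simple validation technique is to compare a row count plus a content checksum between source and target. The sketch below is an assumption-laden illustration, not a product feature: it XORs per-row hashes so the comparison is order-independent, though note that rows duplicated an even number of times would cancel out, which is why real validation tools do considerably more.

```python
import hashlib

def dataset_fingerprint(rows: list) -> tuple:
    """(row count, order-independent checksum) for comparing source vs. target."""
    digest = 0
    for row in rows:
        canonical = repr(sorted(row.items())).encode()  # key order doesn't matter
        # XOR of per-row hashes is order-independent, so a reordered load matches.
        digest ^= int.from_bytes(hashlib.sha256(canonical).digest()[:8], "big")
    return (len(rows), digest)

source = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 20.0}]
target = [{"amount": 20.0, "id": 2}, {"id": 1, "amount": 10.0}]  # reordered, same data
print(dataset_fingerprint(source) == dataset_fingerprint(target))  # True
```

If the fingerprints disagree after a load, you know something was dropped, duplicated, or mangled before any business user does.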

Myth #5: It’s a one-time effort.

Reality #5: ‘Load and explode’ is a formula for disaster.  Our proven methodology recommends that you first prototype your migration path with a small subset of the data. Then test it, tweak your model, try again, and gradually expand.  More importantly, your application architecture should not be a one-time effort: it is a work in progress, an ongoing journey.  Regardless of where you are on that journey, we recommend paying close attention to managing your application’s data foundation.

As you can see, a multitude of data issues can plague an application consolidation or migration project and lead to its doom.  These potential challenges are not always recognized and understood early on, and this perception gap is a root cause of project failure. That is why we are excited to host Philip Russom of TDWI in our upcoming webinar to discuss data management best practices and methodologies for application consolidation and migration. If you are undertaking any IT modernization or rationalization project, such as consolidating applications or migrating legacy applications to the cloud or to an on-premises application such as SAP, this webinar is a must-see.

So what’s your reality going to be like?  Will your project run like a dream or will it escalate into a scary nightmare? Here’s hoping for the former.  And also hoping you can join us for this upcoming webinar to learn more:

Webinar with TDWI:
Successful Application Consolidation & Migration: Data Management Best Practices.

Date: Tuesday March 10, 10 am PT / 1 pm ET

Don’t miss out, Register Today!

1) Gartner report titled “Best Practices Mitigate Data Migration Risks and Challenges” published on December 9, 2014

2) Harvard Business Review: ‘Why your IT project may be riskier than you think’.

Posted in Data Integration, Data Migration, Data Quality, Enterprise Data Management

Great Data for Great Analytics – Evolving Best Practices for Data Management

By Philip Russom, TDWI, Research Director for Data Management.

I recently broadcast a really interesting Webinar with David Lyle, a vice president of product strategy at Informatica Corporation. David and I had a “fireside chat” where we discussed one of the most pressing questions in data management today: How can we prepare great data for great analytics while still leveraging older best practices in data management? Allow me to summarize our discussion.

Both old and new requirements are driving organizations toward analytics. David and I started the Webinar by talking about prominent trends:

  • Wringing value from big data – The consensus today says that advanced analytics is the primary path to business value from big data and other types of new data, such as data from sensors, devices, machinery, logs, and social media.
  • Getting more value from traditional enterprise data – Analytics continues to reveal customer segments, sales opportunities, and threats for risk, fraud, and security.
  • Competing on analytics – The modern business is run by the numbers – not just gut feel – to study markets, refine differentiation, and identify competitive advantages.

The rise of analytics is a bit confusing for some data people. As experienced data professionals do more work with advanced forms of analytics (enabled by data mining, clustering, text mining, statistical analysis, etc.), they can’t help but notice that the requirements for preparing analytic data are similar to, but different from, those of their other projects, such as ETL for a data warehouse that feeds standard reports.

Analytics and reporting are two different practices. In the Webinar, David and I talked about how the two involve pretty much the same data management practices, but in different orders and with different priorities:

  • Reporting is mostly about entities and facts you know well, represented by highly polished data that you know well. Squeaky clean report data demands elaborate data processing (for ETL, quality, metadata, master data, and so on). This is especially true of reports that demand numeric precision (about financials or inventory) or will be published outside the organization (regulatory or partner reports).
  • Advanced analytics, in general, enables the discovery of new facts you didn’t know, based on the exploration and analysis of data that’s probably new to you. Preparing raw source data for analytics is relatively simple, though often at high levels of scale. With big data and other new data, preparation may be as simple as collocating large datasets on Hadoop or another platform suited to data exploration. When using modern tools, users can further prepare the data as they explore it, by profiling, modeling, aggregating, and standardizing data on the fly.

Operationalizing analytics brings reporting and analysis together in a unified process. For example, once an epiphany is discovered through analytics (e.g., the root cause of a new form of customer churn), that discovery should become a repeatable BI deliverable (e.g., metrics and KPIs that enable managers to track the new form of churn in dashboards). In these situations, the best practices of data management apply to a lesser degree (perhaps on the fly) during the early analytic steps of the process, but then are applied fully during the operationalization steps.

Architectural ramifications ensue from the growing diversity of data and workloads for analytics, reporting, multi-structured data, real time, and so on. For example, modern data warehouse environments (DWEs) include multiple tools and data platforms, from traditional relational databases to appliances and columnar databases to Hadoop and other NoSQL platforms. Some are on premises and others are in clouds. On the downside, this results in high complexity, with data strewn across multiple platforms. On the upside, users get great data for great analytics by moving data to the platform within the DWE that is optimized for a particular data type, analytic workload, price point, or data management best practice.

For example, a number of data architecture use cases have emerged successfully in recent years, largely to assure great data for great analytics:

  • Leveraging new data warehouse platform types gives analytics the high performance it needs. Toward this end, TDWI has seen many users successfully adopt new platforms based on appliances, columnar data stores, and a variety of in-memory functions.
  • Offloading data and its processing to Hadoop frees up capacity on EDWs. And it also gives unstructured and multi-structured data types a platform that is better suited to their management and processing, all at a favorable cost point.
  • Virtualizing data assets yields greater agility and simpler data management. Multi-platform data architectures too often entail a lot of data movement among the platforms. But this can be mitigated by federated and virtual data management practices, as well as by emerging practices for data lakes and enterprise data hubs.
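The virtualization idea in the last bullet can be sketched as a thin layer that joins data across platforms at query time instead of copying it into one store. In this toy sketch, an in-memory SQLite table stands in for the warehouse and a Python list stands in for a Hadoop-side event store; all names and schemas are invented:

```python
import sqlite3

# "Warehouse" platform: a relational customer table.
warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
warehouse.executemany("INSERT INTO customers VALUES (?, ?)",
                      [(1, "Acme"), (2, "Globex")])

# "Hadoop-side" platform: clickstream events left on their native store.
events = [{"customer_id": 1, "clicks": 12},
          {"customer_id": 2, "clicks": 7}]

def federated_view():
    """Join the two platforms at query time; no data is physically
    consolidated, which is the essence of a virtual data layer."""
    names = dict(warehouse.execute("SELECT id, name FROM customers"))
    return [{"name": names[e["customer_id"]], "clicks": e["clicks"]}
            for e in events]

print(federated_view())
# [{'name': 'Acme', 'clicks': 12}, {'name': 'Globex', 'clicks': 7}]
```

Real federation products push predicates down to each platform and optimize across them, but the design choice is the same: move the query to the data, not the data to the query.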

If you’d like to hear more of my discussion with Informatica’s David Lyle, please replay the Webinar from the Informatica archive.

Posted in B2B, Enterprise Data Management | Tagged , | Leave a comment

Announcing the New Formation of the Informatica Data Security Group

The Informatica Data Security Group

The technology world has changed and continues to change rapidly in front of our eyes. One of the areas where this change has become most obvious is Security, and in particular the approach to Security. Network and perimeter-based security controls alone are insufficient as data proliferates outside the firewall to social networks, outsourced and offshore resources, and mobile devices. Organizations are more focused on understanding and protecting their data, which is the most prized asset they have, rather than all the infrastructure around it. Informatica is poised to lead this transformation of the security market toward a data-centric security approach.

The Ponemon Institute stated that the biggest concern for security professionals is that they do not know where sensitive data resides.  Informatica’s Intelligent Data Platform provides data security professionals with the technology required to discover, profile, classify and assess the risk of confidential and sensitive data.
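The discover-and-classify step can be illustrated with a minimal pattern-based scan. This is a toy sketch of the general technique, not how Informatica's products are implemented; real discovery tools combine far richer pattern libraries, dictionaries, and statistical scoring, and the column names below are invented:

```python
import re

# Illustrative classifiers only: a simple email shape and a US SSN shape.
PATTERNS = {
    "email": re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+"),
    "ssn":   re.compile(r"\d{3}-\d{2}-\d{4}"),
}

def classify_columns(table):
    """Flag columns whose values match a sensitive-data pattern."""
    findings = {}
    for column, values in table.items():
        for label, pattern in PATTERNS.items():
            if any(pattern.fullmatch(str(v)) for v in values):
                findings[column] = label
    return findings

table = {
    "contact": ["ann@example.com", "bob@example.com"],
    "tax_id":  ["123-45-6789"],
    "city":    ["Redwood City"],
}
print(classify_columns(table))  # {'contact': 'email', 'tax_id': 'ssn'}
```

Even this crude scan shows why discovery matters: you cannot assess the risk of sensitive data, let alone protect it, until you know which columns hold it.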

Last year, we began significant investments in data security R&D to support the initiative.  This year, we continue the commitment by organizing around the vision.  I am thrilled to be leading the Informatica Data Security Group, a newly formed business unit comprised of a team dedicated to data security innovation.  The business unit includes the former Application ILM business unit, which consists of data masking, test data management, and data archive technologies from previous acquisitions, including Applimation, ActiveBase, and TierData.

By having a dedicated business unit and engineering resources applying Informatica’s Intelligent Data Platform technology to a security problem, we believe we can make a significant difference addressing a serious challenge for enterprises across the globe.  The newly formed Data Security Group will focus on new innovations in the data security intelligence market, while continuing to invest and enhance our existing data-centric security solutions such as data masking, data archiving and information lifecycle management solutions.
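For readers unfamiliar with data masking, the simplest form of the technique is easy to show. This is a generic redaction sketch with an invented record, not Informatica's masking product, which supports far more sophisticated format-preserving and deterministic transformations:

```python
def mask(value, keep=4):
    """Redact all but the last `keep` characters, credit-card style."""
    return "*" * max(len(value) - keep, 0) + value[-keep:]

# Masked copies let test and dev teams work with realistic-looking
# records while the real sensitive values never leave production.
record = {"name": "Jane Doe", "card": "4111111111111111"}
masked = dict(record, card=mask(record["card"]))
print(masked["card"])  # ************1111
```

Keeping the last four digits visible preserves enough structure for testing and troubleshooting while removing the value's sensitivity.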

The world of data is transforming around us and we are committed to transforming the data security industry to keep our customer’s data clean, safe and connected.

For more details regarding how these changes will be reflected in our products, message and support, please refer to the FAQs listed below:

Q: What is the Data Security Group (DSG)?

A: Informatica has created a newly formed business unit, the Informatica Data Security Group, as a dedicated team focusing on data security innovation to meet the needs of our customers while leveraging the Informatica Intelligent Data Platform.

Q: Why did Informatica create a dedicated Data Security Group business unit?

A:  Reducing risk is among the top three business initiatives for our customers in 2015.  Data security is a top IT and business initiative for just about every industry and organization that stores sensitive, private, regulated, or confidential data.  Data security is a boardroom topic.  By building upon our success with the Application ILM product portfolio and the Intelligent Data Platform, we can address more pressing issues while solving mission-critical challenges that matter to our customers.

Q: Is this the same as the Application ILM Business Unit?

A: The Informatica Data Security Group is a business unit that includes the former Application ILM products (data masking, data archive, and test data management, from previous acquisitions including Applimation, ActiveBase, and TierData), as well as additional resources developing and supporting Informatica’s data security products and go-to-market (GTM), such as Secure@Source.

Q: How big is the Data Security market opportunity?

A: The data security software market is estimated to be a $3B market in 2015, according to Gartner. Total information security spending will grow a further 8.2 percent in 2015 to reach $76.9 billion.[1]

Q: Who would be most interested in this announcement and why?

A: All leaders are impacted when a data breach occurs. Understanding the risk of sensitive data is a board room topic.  Informatica is investing and committing to securing and safeguarding sensitive, private and confidential data. If you are an existing customer, you will be able to leverage your existing skills on the Informatica platform to address a challenge facing every team who manages or handles sensitive or confidential data.

Q: How does this announcement impact the Application ILM products – Data Masking, Data Archive and Test Data Management?

A: The existing Application ILM products are foundational to the Data Security Group product portfolio.  These products will continue to be invested in, supported and updated.  We are building upon our success with the Data Masking, Data Archive and Test Data Management products.

Q: How will this change impact my customer experience?

A: The Informatica product website will reflect this new organization by listing the Data Masking, Data Archive, and Test Data Management products under the Data Security product category.  The customer support portal will reference Data Security as the top level product category.  Older versions of the product and corresponding documentation will not be updated and will continue to reflect Application ILM nomenclature and messaging.

[1] http://www.gartner.com/newsroom/id/2828722

Posted in B2B, Data Security, Data Services, Enterprise Data Management | Tagged , , , | Leave a comment

The New Marketing Technology Landscape Is Here… And It’s Music to Our Ears!

How Do You Like It? How Do You Like It? More, More More!
Chiefmartec came out with their 2015 Marketing Technology Landscape, and if there’s one word that comes to mind, it’s MORE. 1,876 corporate logos dot the page, up from 947 in 2014. That’s definitely more, more, more – just about double, to be exact. I’m honestly not sure it’s possible to squeeze any more into a single image.

But it’s strangely fitting, because this is the reality that we marketers live in.  There are an infinite number of new technologies, approaches, social media platforms, operations tools, and vendors that we have to figure out. New, critical categories of technology roll out constantly. New vendors enter and exit the landscape. As Chiefmartec says “at least on the immediate horizon, I don’t think we’re going to see a dramatic shrinking of this landscape. The landscape will change, for sure. What qualifies as “marketing” and “technology” under the umbrella of marketing technology will undoubtedly morph. But if mere quantity is the metric we’re measuring, I think it’s going to be a world of 1,000+ marketing technology companies — perhaps even a world of 2,000+ of them — for some time to come.”


Middleware: I’m Coming Up So You’d Better Get This Party Started!
One thing you’ll notice if you look carefully between last year’s and this year’s versions is the arrival of the middleware layer. Chiefmartec spends quite a bit of time talking about middleware, pointing out that great tools in the category are making the marketing technology landscape easier to manage – particularly those that handle a hybrid of on-premise and cloud.

Marketers have long cared about the things on the top – the red “Marketing Experiences” and the orange “Marketing Operations”. They’ve also put a lot of focus on the dark gray/black/blue “Backbone Platforms” layer, like marketing automation & e-commerce. But only recently has that yellow middleware layer become front and center for marketers. Data integration, data management platforms, connectivity, data quality, and APIs are definitely not new to the technology landscape, and have been a critical domain of IT for decades. But as marketers become more skilled in and reliant on analytics and focused customer experience management, data is entering the forefront.

Marketers cannot focus exclusively on their Salesforce CRM, their Marketo automation, or their Adobe Experience Manager web management. Data ready marketers realize that each of these applications can no longer be run in a silo; they need to be looked at collectively as a powerful set of tools designed to engage the customer and push them through the buying cycle, as critical pieces of the same puzzle. And to do that, they need to be connecting their data sources, powering them with great data, analyzing and measuring their results, and then deciding what to do.

If you squint, you can see Informatica in the yellow middleware layer. (I could argue that it belongs in several of these yellow boxes, not just Cloud integration, but I’ll save that for another blog!) Some might say that’s not very exciting, but I would argue that Informatica is in a tremendous place to help marketers succeed with great data. And it all comes down to two words… complexity and change.

Why You Have to Go and Make Things So Complicated?
Ok, admittedly terrible grammar, but you get the picture. Marketers live in a tremendously complex world. Sure, you don’t have all 1,876 of the logos on the Technology Landscape in house. You probably don’t even have one from each of the 43 categories. But you definitely have a lot of different technology solutions that you rely upon on a day-to-day basis. According to a September article by ChiefMarTech, most marketers already regularly rely on more than 100 software programs.

Data ready marketers realize that their environments are complicated, and that they need a foundation. They need a platform of great data that all of their various applications and tools can leverage, and that can actually connect all of their various applications and tools together. They need to be able to connect to just about anything from just about anything. They need a complete view of all of their interactions with their customers. In short, they need to make their extremely complicated world more simple, streamlined, and complete.

Ch-Ch-Ch-Ch-Changes. Turn and Face the Strange!
I have a tendency to misunderstand lyrics, so I have to confess that until I looked up this song today, I thought the lyric was “time to face the pain” (Bowie fans, I hang my head in shame!).  But quite honestly, “turn and face the strange” illustrates my point just as well!

There is no question that marketing has changed dramatically in the past few years.  Your most critical marketing tools and processes two years ago are almost certainly different than those this year, and will almost certainly be different from what you see two years from now.  Marketers realize this.  The Marketing Technology Landscape illustrates this every year!

The data ready marketer understands that their toolbox will change, but that their data will be the foundation for whatever new piece of the technology puzzle they embrace or get rid of.  Building a foundation of great data will power any technology solution or new approach.

Data ready marketers also work with their IT counterparts to engineer for change.  They make sure that no matter what technology or data source they want to add – no matter how strange or unthinkable it is today – they never have to start from scratch.  They can connect to what they want, when they want, leveraging great data, and ultimately making great decisions.

Get Ready ‘Cause Here I Come. The Era of the Data Ready Marketer is Here
Now that you have a few catchy tunes stuck in your head, it’s time to ask yourself: are you data ready? Are you ready to embrace the complexity of the marketing technology landscape? Are you ready to think about change as a competitive weapon?

I encourage you to take our survey about data ready marketing. The results are coming out soon, so don’t miss your chance to be a part. You can find the link here.

Also, follow me on Twitter – The Data Ready Marketer (@StephanieABest) – for some of the latest & greatest news and insights on the world of data ready marketing.

And stay tuned because we have several new Data Ready Marketing pieces coming out soon – InfoGraphics, eBooks, SlideShares, and more!

Posted in 5 Sales Plays, Big Data, CMO, Customers, Data Integration Platform, Enterprise Data Management, Intelligent Data Platform | Tagged , , , , , , , , | Leave a comment

Ready for Internet of Things?

Data has always played a key role in informing decisions – machine generated and intuitive.  In the past, much of this data came from transactional databases as well as unstructured sources, such as emails and flat files.  Mobile devices appeared next on the map.  We have found applications of such devices not just to make calls but also to send messages, take pictures, and update status on social media sites.  As a result, new sets of data got created from user engagements and interactions.  Such data started to tell a story by connecting dots at different location points and stages of user connection.  The “Internet of Things,” or IoT, is the latest technology to enter the scene, and it could transform how we view and use data on a massive scale.

Another buzzword? 

Does IoT present a significant opportunity for companies to transform their business processes?  The Internet of Things probably adds an important awareness veneer when it comes to data.  It could bring data into focus early by connecting every stage of data creation in any business process.  It could de-couple the lagging factor in consuming data and making decisions based on it.  Data generated at every stage in a business process could show an interesting trend or pattern and, better yet, tell a connected story.  The result could be predictive maintenance of the equipment involved in any process, which would further reduce cost.  New product innovations would happen by leveraging the connectedness in data as generated by each step in a business process.  We would soon begin to understand not only where the data is being used and how, but also the intent and context behind this usage.  Organizations could then connect with their customers in a one-on-one fashion like never before, whether to promote a product or offer a promotion that could be both time and place sensitive.  New opportunities to tailor product and service offerings for customers on an individual basis would create new growth areas for businesses.  The Internet of Things could make this possible by bringing together previously isolated sets of data.

Proof-points

A recent Economist report, “The Virtuous Circle of Data: Engaging Employees in Data and Transforming Your Business,” suggests that 68% of data-driven businesses outperform their competitors when it comes to profitability, and that 78% of those businesses foster a better culture of creativity and innovation.  The report goes on to suggest that three areas are critical for an organization to build a data-driven business, including data supported by devices: 1) technology & tools, 2) talent & expertise, and 3) culture & leadership.  By 2020, it’s projected that there will be 50B connected devices, 7x more than human beings on the planet.  It is imperative for an organization to have a support structure in place for device-generated data and a strategy to connect with broader enterprise-wide data initiatives.

A comprehensive Internet of Things strategy would leverage the speed and context of data to the advantage of business process owners.  Timely access to device-generated data can open up channels of communication to end customers in a personalized fashion at the moment of their readiness.  It’s not enough anymore to know what customers may want or what they asked for in the past; rather, the goal is anticipating what they might want by connecting dots across different stages.  IoT-generated data can help bridge this gap.

How to Manage IoT Generated Data

More data places more pressure on both quality and security factors – key building blocks for trust in one’s data.  Trust is, ideally, truth over time.  Consistency in data quality and availability is going to be a key requirement for all organizations to introduce new products or differentiated services in a speedy fashion.  Informatica’s Intelligent Data Platform, or IDP, brings together the industry’s most comprehensive data management capabilities to help organizations manage all data, including device-generated data, both in the cloud and on premise.  Informatica’s IDP enables automated sensitive data discovery, such that data discovers users in the context where it’s needed.

Cool IoT Applications

There are a number of companies around the world working on interesting applications of Internet of Things technology.  Smappee from Belgium has launched an energy monitor that can itemize electricity usage and control a household full of devices by clamping a sensor around the main power cable. This single device can recognize the individual signatures produced by each of the household devices and can let consumers switch off any device, such as an oven, remotely via smartphone.  JIBO is an IoT device that’s touted as the world’s first family robot.  It automatically uploads data on all interactions to the cloud.  Start-ups such as Roost and Range OI can retrofit older devices with Internet of Things capabilities.  One of the really useful IoT applications can be found in Jins Meme glasses and sunglasses from Japan.  They embed wearable sensors, shaped much like Bluetooth headsets, that detect drowsiness in the wearer.  They observe eye movement and blinking frequency to identify tiredness or bad posture and communicate via an iOS or Android smartphone app.  Finally, Mellow is a new kind of kitchen robot that makes cooking easier by preparing ingredients to perfection while someone is away from home. Mellow is a sous-vide machine that takes orders through your smartphone and keeps food cold until it’s the exact time to start cooking.

Closing Comments

Each of the applications mentioned above deals with volumes of data, both in real time and in stored fashion.  Such data needs to be properly validated, cleansed, and made available at the moment of user engagement.  In addition to Informatica’s Intelligent Data Platform, the newly introduced Informatica Rev product can truly connect data coming from all sources, including IoT devices, and make it available for everyone.  What opportunity does IoT present to your organization?  Where are the biggest opportunities to disrupt the status quo?

Posted in 5 Sales Plays, Big Data, Cloud, Cloud Data Management, Customer Services, Customers, Data Integration Platform, Enterprise Data Management, Intelligent Data Platform, Wearable Devices | Tagged , , , , , | Leave a comment