Category Archives: Data Integration

Regulation Ready: People, Processes and Technology, oh my!

Data Governance, the art of being Regulation Ready, is about a lot of things, but one thing is clear: it’s NOT just about the technology. Have you ever been in one of those meetings, probably more than a few, where committees and virtual teams discuss the latest corporate initiatives? You know, the ones where you want to dip your face in lava and run into the ocean? Because at the end of the meeting, everyone goes back to their day jobs and nothing changes.

Now comes a new law or regulation from the governing body du jour. There are common threads to each and every regulation related to data. Laws like HIPAA even had entire sections dedicated to the types of filing cabinets required in the office to protect healthcare data. And the same is true of regulations like BCBS 239, CCAR reporting and Solvency II. The laws ask: what are you reporting, how did you get that data, where has it been, what does this data mean, and who has touched it? Virtually all of the regulations dealing with data have those elements.

So it behooves an organization to be Regulation Ready. This means those committees and virtual teams need to be driving cultural and process change. It’s not just about the technology; it’s as much about people and processes. Every role in the organization, from the developer to the business executive, should embed the concepts of data governance in their daily work. From the time a developer or architect builds a new system, they need to document and define everything, including every piece of data. It reminds me of my days writing code and remembering to comment each code block. The business executive, likewise, shares business rules and definitions from the top so they can be integrated into the systems that eventually have to report on them.

Finally, the processes that support a data governance program are augmented by the technology. It may seem to suffice that systems are documented in spreadsheets and documents, but those are increasingly error-prone and, in the end, not reliable in an audit.

Informatica is the market leader in data management infrastructure for being Regulation Ready. This means everything from data movement and quality to definitions and security. Because at the end of the day, once you have the people culturally integrated and the processes supporting the data workload, a centralized, high-performance and feature-rich technology needs to be in place to complete the trifecta. Informatica is pleased to offer the industry this leading technology as part of a comprehensive data governance foundation.

Informatica will be sharing this vision at the upcoming Annual FIMA 2015 Conference in Boston from March 30 to April 1. Come and visit Informatica at FIMA 2015 in Booth #3.


Speed: The #1 IT Challenge

Speed is the top challenge facing IT today, and it’s reaching crisis proportions at many organizations.  Specifically, IT needs to deliver business value at the speed that the business requires.

The challenge does not end there; this has to be accomplished without compromising cost or quality. Many people have argued that you only get two out of three on the Speed/Cost/Quality triangle, but I believe that achieving all three is the central challenge facing Enterprise Architects today. Many people I talk to are looking at agile technologies, and in particular Agile Data Integration.

There have been a lot of articles written about the challenges, but it’s not all doom and gloom. Here is something you can do right now to dramatically increase the speed of your project delivery while improving cost and quality at the same time: take a fresh look at your Agile Data Integration environment, and specifically at Data Virtualization. Data Virtualization offers the opportunity to simplify and speed up the data part of enterprise projects, which is where more and more projects are spending 40% or more of their time. For more information and an industry perspective, you can download the latest Forrester Wave report for Data Virtualization, Q1 2015.

Here is a quick example of how you can use Data Virtualization technology for rapid prototyping to speed up business value delivery:

  • Use data virtualization technology to present a common view of your data to your business-IT project teams.
  • IT and business can collaborate in real time to access and manage data from a wide variety of very large data sources, eliminating the long, slow cycles of passing specifications back and forth between business and IT.
  • Your teams can discover, profile, and manage data using a single virtual interface that hides the complexity of the underlying data.
  • By working with a virtualization layer, you are assured that your teams are using the right data, data that can be verified by linking it to a Business Glossary with clear terms, definitions, owners, and business context to reduce the chance of misunderstandings and errors.
  • Leading offerings in this space include data quality and data masking tools in the interface, ensuring that you improve data quality in the process.
  • Data virtualization means that your teams can deliver in days rather than months, and faster delivery means lower cost (a minimal sketch of the idea follows this list).
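
To make the idea concrete, here is a minimal sketch, in Python, of what a virtualization layer does conceptually: it presents a single logical view over heterogeneous sources without copying the data first. The file name, endpoint and schemas are invented for illustration; a commercial data virtualization product adds discovery, profiling and glossary linkage on top of this basic idea.

```python
# Conceptual sketch of a data virtualization layer: one logical view
# over two physically different sources, with no data copied up front.
# The file name, endpoint and field names are hypothetical.
import csv
import json
import urllib.request

def customers_from_csv(path):
    """Yield customer rows from an on-premise CSV extract."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            yield {"id": row["cust_id"], "name": row["name"], "source": "erp_csv"}

def customers_from_api(base_url):
    """Yield customer rows from a (hypothetical) SaaS REST endpoint."""
    with urllib.request.urlopen(f"{base_url}/customers") as resp:
        for rec in json.load(resp):
            yield {"id": rec["id"], "name": rec["displayName"], "source": "saas_api"}

class VirtualCustomerView:
    """The single virtual interface the project team queries; it hides
    where each record physically lives."""
    def __init__(self, sources):
        self.sources = sources

    def find(self, predicate):
        for source in self.sources:
            for row in source:
                if predicate(row):
                    yield row

# Business and IT collaborate against one view, not N systems:
view = VirtualCustomerView([
    customers_from_csv("erp_customers.csv"),
    customers_from_api("https://api.example-saas.com/v1"),
])
for match in view.find(lambda r: r["name"].startswith("Ac")):
    print(match)
```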

There has been a lot of interest in agile development, especially as it relates to data projects.  Data Virtualization is a key tool to accelerate your team in this direction.

Informatica has a leading position in the Forrester report, due both to the productivity of the Agile Data Integration environment and to its integration with the rest of the Informatica platform. From an architect’s point of view, it is critical to start standardizing on an enterprise data management platform. Continuing data and data-tool fragmentation will only slow down future project delivery. The best way to deal with the growing complexity of both data and tools is to drive standardization within your organization.


Data Mania Focuses on SaaS Ecosystems

Last week I had the opportunity to attend the Data Mania industry event hosted by Informatica. The afternoon event was a nice mix of industry panels with technical and business speakers from companies that included Amazon, Birst, AppDynamics, Salesforce, Marketo, Tableau, Adobe, Informatica, Dun & Bradstreet and several others.

A main theme of the event, echoed by so many small, medium and large SaaS vendors, was that everyone is increasingly dependent on being able to integrate data from other solutions and platforms. The second part of this was that customers increasingly expect the data integration to work under the covers so they can focus on higher-level business solutions.

For me, the highlight was the four companies Informatica presented as the winners of its Connect-a-Thon contest. Each of these solutions highlighted some great aspects of data integration.

Databricks provides a cloud platform for big data processing. The solution leverages Apache Spark, an open source engine for big data processing that has seen a lot of adoption. Spark is the engine for the Databricks Cloud, which adds several enterprise features for visualization, large-scale Spark cluster management, workflow and integration with third-party applications. Having a big data solution means bringing data in from a lot of SaaS and on-premise sources, so Databricks built a connector to Informatica Cloud to make it easier to load data into the Databricks Cloud. It’s a great example of the ecosystem at work, where higher-level solutions leverage third-party infrastructure.

ThoughtSpot provides a search-based BI solution. The general idea is that a search-based interface gives a much broader group of users, with little training, access to the power of enterprise business intelligence tools. It reminds me of other solutions in the enterprise 2.0 area that do everything from expert location to finding structured and unstructured data more easily. They wrote a nice blog post explaining why they built the ThoughtSpot Connector for Informatica Cloud. The main reason: using Informatica to handle the data integration lets them focus on their own solution, the end-user-facing BI tools. It’s an example of SaaS providers choosing either to roll their own data integration or to leverage other providers as part of their solution.

BigML provides some very interesting machine learning solutions. The simple summary would be that they are building beautiful visualization and predictive modeling tools. The solution greatly simplifies the process of iterating on models and visualizing them. Their gallery of models has several very good examples. Again, in this case BigML built a connector to Informatica Cloud for SaaS and on-premise integration, working in conjunction with the existing BigML REST API. BigML wrote a great blog post on their connector that goes into more detail.

FollowAnalytics had one of the more interesting demonstrations because their solution was very different from the other three. They have a mobile marketing platform that is used to drive end-user engagement and measure that engagement. They also uploaded their Data Mania integration demo here. They are mostly leveraging data integration to provide access to important data sources that can help drive customer engagement in their platform. Given that their end users are marketing or business analysts, those users simply expect to be able to easily get the data they need to drive marketing analysis and engagement.

My takeaway from talking to many of the SaaS vendors was that there is a lot of interest in leveraging higher-level infrastructure, platform and middleware services as they mature to meet the real needs of SaaS vendors, so that vendors can focus on their own solutions. In many cases, the ecosystem may be more ready than vendors realize.


Startup Winners of the Informatica Data Mania Connect-a-Thon

Last week was Informatica’s first ever Data Mania event, held at the Contemporary Jewish Museum in San Francisco. We had an A-list lineup of speakers from leading cloud and data companies, such as Salesforce, Amazon Web Services (AWS), Tableau, Dun & Bradstreet, Marketo, AppDynamics, Birst, Adobe, and Qlik. The event and speakers covered a range of topics all related to data, including Big Data processing in the cloud, data-driven customer success, and cloud analytics.

While these companies are giants today in the world of cloud and have created their own unique ecosystems, we also wanted to take a peek at and hear from the leaders of tomorrow. Before startups can become market leaders in their own realm, they face the challenge of ramping up a stellar roster of customers so that they can get to subsequent rounds of venture funding. But what gets in their way are the numerous data integration challenges of onboarding customer data onto their software platform. When these challenges remain unaddressed, R&D resources are spent on professional services instead of building value-differentiating IP.  Bugs also continue to mount, and technical debt increases.

Enter the Informatica Cloud Connector SDK. Built entirely in Java and able to browse through any cloud application’s API, the Cloud Connector SDK parses the metadata behind each data object and presents it in the context of what a business user should see. We had four startups build a native connector to their application in less than two weeks: BigML, Databricks, FollowAnalytics, and ThoughtSpot. Let’s take a look at each one of them.
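
The real SDK is Java-based, but the core pattern, introspecting the metadata an API publishes about its objects and presenting each object in business terms, can be sketched in a few lines. This is a schematic illustration in Python, not the SDK’s actual interface; the endpoint and response shape are invented.

```python
# Schematic illustration of what a cloud connector does: browse the
# metadata behind an API's data objects and present each one in
# business-user terms. Endpoint and field names are hypothetical;
# the actual Informatica Cloud Connector SDK is Java-based.
import json
import urllib.request

def describe_objects(base_url):
    """Fetch object metadata from a (hypothetical) metadata endpoint."""
    with urllib.request.urlopen(f"{base_url}/metadata/objects") as resp:
        return json.load(resp)

def to_business_view(obj_meta):
    """Map raw API metadata to what a business user should see."""
    return {
        "object": obj_meta["label"],                      # e.g. "Customer"
        "fields": [f["label"] for f in obj_meta["fields"]],
        "key": obj_meta["primaryKey"],
    }

for obj in describe_objects("https://api.example-saas.com/v1"):
    print(to_business_view(obj))
```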

BigML

With predictive analytics becoming a growing imperative, machine-learning algorithms that can have a higher probability of prediction are also becoming increasingly important.  BigML provides an intuitive yet powerful machine-learning platform for actionable and consumable predictive analytics. Watch their demo on how they used Informatica Cloud’s Connector SDK to help them better predict customer churn.

Can’t play the video? Click here: http://youtu.be/lop7m9IH2aw

Databricks

Databricks was founded out of the UC Berkeley AMPLab by the creators of Apache Spark. Databricks Cloud is a hosted end-to-end data platform powered by Spark. It enables organizations to unlock the value of their data, seamlessly transitioning from data ingest through exploration and production. Watch their demo, which showcases how the Informatica Cloud connector for Databricks Cloud was used to analyze lead contact rates in Salesforce and to perform machine learning on a dataset built using either Scala or Python.

Can’t play the video? Click here: http://youtu.be/607ugvhzVnY

FollowAnalytics

With mobile usage growing by leaps and bounds, the area of customer engagement on a mobile app has become a fertile area for marketers. Marketers are charged with acquiring new customers, increasing customer loyalty and driving new revenue streams. But without the technological infrastructure to back them up, their efforts are in vain. FollowAnalytics is a mobile analytics and marketing automation platform for the enterprise that helps companies better understand audience engagement on their mobile apps. Watch this demo where FollowAnalytics first builds a completely native connector to its mobile analytics platform using the Informatica Cloud Connector SDK and then connects it to Microsoft Dynamics CRM Online using Informatica Cloud’s prebuilt connector for it. Then, see FollowAnalytics go one step further by performing even deeper analytics on their engagement data using Informatica Cloud’s prebuilt connector for Salesforce Wave Analytics Cloud.

Can’t play the video? Click here: http://youtu.be/E568vxZ2LAg

ThoughtSpot

Analytics has taken center stage this year due to the rise in cloud applications, but most of the existing BI tools out there still stick to the old way of doing BI. ThoughtSpot brings a consumer-like simplicity to the world of BI by allowing users to search for the information they’re looking for just as if they were using a search engine like Google. Watch this demo where ThoughtSpot uses Informatica Cloud’s vast library of over 100 native connectors to move data into the ThoughtSpot appliance.

Can’t play the video? Click here: http://youtu.be/6gJD6hRD9h4


IoT Case Study: Fresh Powder = Fresh Data

EpicMix is a website, data integration solution and web application that provides a great example of how companies can deliver more value to their customers when they think about data-ready architecture. In this case the company is Vail Resorts, and it makes a great IoT case study because the solution has been in use since 2010.

The basics of EpicMix

* RFID technology embedded into lift tickets provides the ability to collect data for anyone using one at any Vail-managed resort. Vail realized they had all these lift tickets being worn, and there was an opportunity to use them to collect data that could enhance the experience of their guests. It is also a very clever way to collect data on skiers to help drive segmentation and marketing decisions.

* EpicMix just works. If guests want to take advantage of it, all they have to do is register on the website or download the mobile app for their Android or iOS smartphone. Having a low bar to entry is important to getting people to try the app, and even if people do not use the EpicMix website or app, Vail is still able to leverage the data they generate to better understand what people do on the mountain. (Vail has a detailed information policy and opt-out policy.)

Figure 1: EpicMix Public Dashboard

* Value-added features beyond data visibility. What makes the solution more interesting are the features that go beyond just tracking skiing performance: private messaging between guests while on the mountain, sharing photos with friends, integration with personal social media accounts, and the ability to earn badges and participate in challenges. These go beyond the generation-one solutions that would just track performance and nothing else.

Figure 2: Phone Application EpicMix view

This is the type of solution that qualifies as both an IoT Personal Productivity solution and a Business Productivity solution.

  • For the skier, it provides information on their activity, plus communication and the ability to share on social media.
  • For Vail, it allows them to better understand their guests, to communicate with and offer their guests additional services and benefits, and to make better decisions about how to use their resources and deploy their employees.

The EpicMix solution was made possible by taking advantage of data that was not previously being collected and then making it useful to users (skiers and guests). Having used EpicMix and similar performance-tracking solutions, I find that the added communication and collaboration features are what set it apart, and the ease of getting started makes it a great example of how fresh data can come from anywhere.
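
As a rough illustration of the kind of fresh data involved, each lift-gate RFID scan boils down to a tiny event record that only becomes valuable once aggregated into stats, badges and segments. The schema and numbers below are invented for illustration; Vail has not published the actual format.

```python
# Hypothetical shape of lift-gate RFID scan events and a simple
# aggregation into per-guest daily stats. Schema and numbers invented.
from collections import defaultdict
from datetime import datetime

scans = [
    {"pass_id": "RF-1001", "lift": "Gondola One", "vertical_ft": 1650,
     "ts": datetime(2015, 3, 7, 9, 12)},
    {"pass_id": "RF-1001", "lift": "Chair 4", "vertical_ft": 2100,
     "ts": datetime(2015, 3, 7, 10, 3)},
]

stats = defaultdict(lambda: {"runs": 0, "vertical_ft": 0})
for scan in scans:
    key = (scan["pass_id"], scan["ts"].date())   # one bucket per guest-day
    stats[key]["runs"] += 1
    stats[key]["vertical_ft"] += scan["vertical_ft"]

for (pass_id, day), s in stats.items():
    print(f"{pass_id} on {day}: {s['runs']} lifts, {s['vertical_ft']} vertical ft")
```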

In the future it is easy to imagine features that streamline ordering services for users (a table reservation at the restaurant for après-ski), or Vail leveraging the data to make business decisions and deliver more real-time offers to guests on the mountain, or to frequent visitors on their next visit. And maybe we will see some of the new ski-oriented wearables, like XON bindings, integrated with solutions like EpicMix, making it possible to get even more data without needing a second smartphone application.

Information for this post comes from Mapleton Hill Media and Vail Resorts.


Informatica joins new ServiceMax Marketplace – offers rapid, cost-effective integration with ERP and Cloud apps for Field Service Automation

To deliver flawless field service, companies often require integration across multiple applications for various work processes.  A good example is automatically ordering and shipping parts through an ERP system so they arrive ahead of a timely field service visit.  Informatica has partnered with ServiceMax, the leading field service automation solution, and subsequently joined the new ServiceMax Marketplace to offer customers integration solutions for many of the ERP and Cloud applications frequently involved in ServiceMax deployments.  Comprised of Cloud Integration Templates built on Informatica Cloud for frequent customer integration “patterns”, these solutions will speed the ServiceMax implementation cycle, contain its cost, and help customers realize the full potential of their field service initiatives.

Existing members of the ServiceMax Community can see a demo or take advantage of a free 30-day trial that provides the full capabilities of Informatica Cloud Integration for ServiceMax, with prebuilt connectors to hundreds of third-party systems including SAP, Oracle, Salesforce, NetSuite and Workday, powered by the Informatica Vibe virtual data machine for near-universal access to cloud and on-premise data.  The Informatica Cloud Integration for ServiceMax solution:

  • Accelerates ERP integration by as much as 85% through prebuilt Cloud templates focused on key work processes and the objects common between systems (see the mapping sketch after this list)
  • Synchronizes key master data such as Customer Master, Material Master, Sales Orders, Plant information, Stock history and others
  • Enables simplified implementation and customization through easy-to-use interfaces
  • Eliminates the need for IT intervention during configuration and deployment of ServiceMax integrations.
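
As a hedged illustration of the kind of “pattern” such a template encodes, here is a sketch of one common case: turning a field service work order’s part lines into an ERP sales order. All object and field names below are invented; the real templates are configured inside Informatica Cloud rather than hand-coded.

```python
# Hypothetical mapping behind an integration template: a field service
# work order's part lines become an ERP sales order. All names invented.
def work_order_to_sales_order(work_order):
    return {
        "customer_id": work_order["accountExternalId"],   # Customer Master key
        "plant": work_order["servicingDepot"],            # Plant information
        "lines": [
            {"material": line["partNumber"],              # Material Master key
             "qty": line["quantityRequired"]}
            for line in work_order["partLines"]
        ],
    }

wo = {"accountExternalId": "CUST-0042", "servicingDepot": "PLANT-07",
      "partLines": [{"partNumber": "MAT-118", "quantityRequired": 2}]}
print(work_order_to_sales_order(wo))
```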

We look forward to working with ServiceMax through the ServiceMax Marketplace to help joint customers deliver Flawless Service!


Data Mania: Using REST APIs and Native Connectors to Separate Yourself from the SaaS Pack

With Informatica’s Data Mania on Wednesday, I’ve been thinking a lot lately about REST APIs. In particular, I’ve been considering how and why they’ve become so ubiquitous, especially for SaaS companies. Today they are the prerequisite for any company looking to connect with other ecosystems, accelerate adoption and, ultimately, separate themselves from the pack.

Let’s unpack why.

To trace the rise of the REST API, we’ll first need to take a look at the SOAP web services protocol that preceded it.  SOAP is still very much in play and remains important to many application integration scenarios. But it doesn’t receive much use or love from the thousands of SaaS applications that just want to get or place data with one another or in one of the large SaaS ecosystems like Salesforce.

Why this is the case has more to do with the needs and demands of a SaaS business than it does with the capabilities of SOAP web services. SOAP, as it turns out, is perfectly fine for making and receiving web service calls, but it does require work on behalf of both the calling application and the producing application. And therein lies the rub.

SOAP web service calls are by their very nature incredibly structured arrangements, with specifications that must be clearly defined by both parties. Only after both the calling and producing application have their frameworks in place can the call be validated. While the contract within SOAP WSDLs makes SOAP more robust, it also makes it too rigid, and less adaptable to change. But today’s apps need a more agile and more loosely defined API framework that requires less work to consume and can adapt to the inevitable and frequent changes demanded by cloud applications.

Enter REST APIs

REST APIs are the perfect vehicle for today’s SaaS businesses and mash-up applications. Sure, they’re more loosely defined than SOAP, but when all you want to do is get and receive some data, now, in the context you need, nothing is easier or better for the job than a REST API.

With a REST API, the calls are mostly done as HTTP with some loose structure and don’t require a lot of mechanics from the calling application, or effort on behalf of the producing application.
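
That looseness is easy to see in code. Below is a minimal sketch of a REST call using only Python’s standard library; the endpoint, query parameter and response fields are hypothetical. One HTTP GET, no WSDL, no envelope: just JSON back in the shape you asked for.

```python
# A REST call is just HTTP plus loosely structured JSON. Contrast with
# SOAP, where both sides must first agree on a WSDL contract and wrap
# every call in an envelope. Endpoint and fields here are hypothetical.
import json
import urllib.request

req = urllib.request.Request(
    "https://api.example-saas.com/v1/accounts?updatedSince=2015-03-01",
    headers={"Authorization": "Bearer <token>", "Accept": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    for account in json.load(resp)["records"]:
        print(account["name"], account["updatedAt"])
```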

SaaS businesses prefer REST APIs because they are easy to consume. They also make it easy to onboard new customers and extend the use of the platform to other applications. The latter is important because it is primarily through integration that SaaS applications get to become part of an enterprise business process and gain the stickiness needed to accelerate adoption and growth.

Without APIs of any sort, integration can only be done through manual data movement, which opens the application and enterprise up to the potential errors caused by fat-finger data movement. That typically will give you the opposite result of stickiness, and is to be avoided at all costs.

While publishing an API as a way to get and receive data from other applications is a great start, it is just a means to an end. If you’re a SaaS business with greater ambitions, you may want to consider taking the next step of building native connectors to other apps using an integration system such as Informatica Cloud. A connector can provide a nice layer of abstraction on the APIs so that the data can be accessed as application data objects within business processes. Clearly, stickiness with any SaaS application improves in direct proportion to the number of business processes or other applications that it is integrated with.

The Informatica Cloud Connector SDK is Java-based and enables you to easily cut and paste the code necessary to create the connectors. Informatica Cloud’s SDKs are also richer than raw API access, making it possible for you to adapt the REST API into something any business user will want to use, which is a huge advantage.

In addition to making your app stickier, native connectors have the added benefit of increasing your portability. Without this layer of abstraction, direct interaction with a REST API that’s been structurally changed would be impossible without also changing the data flows that depend on it. Building a native connector makes you more agile and inoculates your custom-built integration against breaking.
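
A minimal sketch of that abstraction benefit, with an invented API and field names: the downstream data flow codes against a stable logical object, and only the connector’s mapping changes when the API vendor renames fields.

```python
# Why the connector's abstraction layer matters: downstream data flows
# depend on a stable logical object, and only the connector absorbs
# structural API changes. Both API versions below are invented.
def map_contact_v1(raw):            # original API field names
    return {"name": raw["contact_name"], "email": raw["email_addr"]}

def map_contact_v2(raw):            # vendor renames the fields later
    return {"name": raw["displayName"], "email": raw["emailAddress"]}

class ContactConnector:
    """Data flows call this interface, never the raw API."""
    def __init__(self, api_version):
        self._map = map_contact_v2 if api_version >= 2 else map_contact_v1

    def contacts(self, raw_records):
        return [self._map(r) for r in raw_records]

# This downstream flow never changes when the API does:
records = ContactConnector(api_version=2).contacts(
    [{"displayName": "Ada Lovelace", "emailAddress": "ada@example.com"}])
print(records)
```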

Building your connectors with Informatica Cloud also provides you with some other advantages. One of the most important is entrance to a community that includes all of the major cloud ecosystems and the thousands of business apps that orbit them. As a participant, you’ll become part of an interconnected web of applications that make up the business processes for the enterprises that use them.

Another ancillary benefit is access to integration templates that you can easily customize to connect with any number of known applications. The templates abstract the complexity from complicated integrations, can be quickly customized with just a few composition screens, and are easily invoked using Informatica Cloud’s APIs.

The best part of all this is that you can use Informatica Cloud’s integration technology to become a part of any business process without stepping outside of your application.

For those interested in continuing the conversation and learning more about how leading SaaS businesses are using REST APIs and native connectors to separate themselves, I invite you to join me at Data Mania, March 4th in San Francisco. Hope to see you there.


Internet of Things (IoT) Changes the Data Integration Game in 2015

As reported by the Economic Times, “In the coming years, enormous volumes of machine-generated data from the Internet of Things (IoT) will emerge. If exploited properly, this data – often dubbed machine or sensor data, and often seen as the next evolution in Big Data – can fuel a wide range of data-driven business process improvements across numerous industries.”

We can all see this happening in our personal lives.  Our thermostats are connected now, our cars have been for years, and even my toothbrush has a Bluetooth connection to my phone.  On the industrial side, devices have also been connected for years, tossing off megabytes of data per day that have typically been used for monitoring, with the data tossed away as quickly as it appears.

So, what changed?  With the advent of big data and of cheap cloud and on-premise storage, we now have the ability to store the machine or sensor data spinning out of industrial machines, airliners, health diagnostic devices, etc., and leverage that data for new and valuable uses.

For example, consider the ability to determine the likelihood that a jet engine will fail, based upon the sensor data gathered and how that data compares with known patterns of failure.  Instead of getting an engine-failure light on the flight deck, the pilots can see that the engine has a 20 percent likelihood of failure, and get the engine serviced before it fails completely.
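
A heavily simplified sketch of that idea: score current sensor readings against weights learned from historical failure patterns, producing a likelihood instead of a binary warning light. Every feature, weight and reading below is invented for illustration; a real system would be trained on fleet-scale sensor history.

```python
# Toy pattern-based failure scoring: a logistic model turns sensor
# readings into a failure likelihood instead of a binary warning light.
# All features, weights and readings are invented.
import math

# Weights (hypothetically) learned from historical engine failures.
WEIGHTS = {"egt_delta_c": 0.04, "vibration_mm_s": 0.9, "oil_pressure_drop_psi": 0.3}
BIAS = -6.9

def failure_likelihood(reading):
    score = BIAS + sum(WEIGHTS[k] * reading[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-score))    # squash to a 0..1 probability

reading = {"egt_delta_c": 35.0, "vibration_mm_s": 3.2, "oil_pressure_drop_psi": 4.0}
print(f"Failure likelihood: {failure_likelihood(reading):.0%}")   # ~20%
```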

The problem with all of this very cool stuff is that we need to once again rethink data integration.  Indeed, if the data can’t get from the machine sensors to a persistent data store for analysis, then none of this has a chance of working.

That’s why those who are moving to IoT-based systems need to do two things.  First, they must create a strategy for extracting data from devices, such as industrial robots or an Audi A8.  Second, they need a strategy to take all of this disparate data that’s firing out of devices at megabytes per second, and put it where it needs to go, and in the right native structure (or in an unstructured data lake), so it can be leveraged in useful ways, and in real time.

The challenge is that machines and devices are not traditional IT systems.  I’ve built connectors for industrial applications in my career.  The fact is, you need to adapt to the way that the machines and devices produce data, and not the other way around.  Data integration technology needs to adapt as well, making sure that it can deal with streaming and unstructured data, including the many instances where the data needs to be processed in flight as it moves from the device to the database.
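
Here is a minimal sketch of what "processed in flight" means in practice: a small pipeline that normalizes, filters and enriches raw device messages between the device feed and the target store, instead of landing everything raw first. The message format is invented for illustration.

```python
# In-flight processing sketch: normalize, filter and enrich sensor
# messages on the way from device to data store. Format invented.
def device_feed():
    """Stand-in for a stream of raw, loosely structured device messages."""
    yield {"sensor": "temp-01", "value": "21.7", "unit": "C"}
    yield {"sensor": "temp-01", "value": "err", "unit": "C"}      # noise
    yield {"sensor": "temp-01", "value": "88.4", "unit": "C"}

def in_flight(messages):
    for msg in messages:
        try:
            value = float(msg["value"])          # normalize the type
        except ValueError:
            continue                             # drop garbage in flight
        yield {"sensor": msg["sensor"], "celsius": value,
               "alert": value > 80.0}            # enrich before landing

for record in in_flight(device_feed()):
    print(record)   # in practice: write to the target store in real time
```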

This becomes a huge opportunity for data integration providers who understand the special needs of IoT, as well as the technology that those who build IoT-based systems can leverage.  However, the larger value is for those businesses that learn how to leverage IoT to provide better services to their customers by offering insights that have previously been impossible.  Be it jet engine reliability, the fuel efficiency of my car, or feedback to my physician from sensors on my body, this is game changing stuff.  At the heart of its ability to succeed is the ability to move data from place-to-place.


How organizations can prepare for 2015 data privacy legislation

Original article can be found here: scmagazine.com

On Jan. 13, the White House announced President Barack Obama’s proposal for new data privacy legislation, the Personal Data Notification and Protection Act.  Many states have laws today that require corporations and government agencies to notify consumers in the event of a breach, but that is not enough.  This new proposal aims to improve cybersecurity standards nationwide with the following tactics:

Enable cyber-security information sharing between private and public sectors. 

Government agencies and corporations with a vested interest in protecting our information assets need a streamlined way to communicate and share threat information. This component of the proposed legislation incents organizations that participate in knowledge-sharing with targeted liability protection, as long as they are responsible for how they share, manage and retain privacy data.

Modernize the tools law enforcement has to combat cybercrime.
Existing laws, such as the Computer Fraud and Abuse Act, need to be updated to incorporate the latest cyber-crime classifications while giving prosecutors the ability to target insiders with privileged access to sensitive and privacy data.  The proposal also specifically calls out pursuing prosecution for the sale of privacy data nationally and internationally.

Standardize breach notification policies nationwide.
Many states have some sort of policy that requires notifying customers that their data has been compromised.  Three leading examples are California’s SB 1386, Florida’s Information Protection Act (FIPA) and Massachusetts’ Standards for the Protection of Personal Information of Residents of the Commonwealth.  New Mexico, Alabama and South Dakota have no data breach protection legislation.  Standardizing and simplifying the requirement for companies to notify customers and employees when a breach occurs will ensure consistent protection no matter where you live or transact.

Invest in increasing cyber-security skill sets.
For a number of years, security professionals have reported an ever-increasing skills gap in the cybersecurity profession.  In fact, in a recent Ponemon Institute report, 57 percent of respondents said a data breach incident could have been avoided if the organization had more skilled personnel with data security responsibilities. Increasingly, colleges and universities are adding cybersecurity curricula and degrees to meet the demand. In support of this need, the proposed legislation mentions that the Department of Energy will provide $25 million in educational grants to Historically Black Colleges and Universities (HBCUs) and two national labs to support a cybersecurity education consortium.

This proposal is clearly comprehensive, but it also raises the critical question: How can organizations prepare themselves for this privacy legislation?

The International Association of Privacy Professionals conducted a study of Federal Trade Commission (FTC) enforcement actions.  From the report, organizations can infer the best practices implied by FTC enforcement and ensure these are covered by their security architecture, policies and practices:

  • Perform assessments to identify reasonably foreseeable risks to the security, integrity, and confidentiality of personal information collected and stored on the network, online or in paper files.
  • Adopt limited-access policies to curb unnecessary security risks and minimize the number and type of network access points that an information security team must monitor for potential violations.
  • Limit employee access to (and copying of) personal information, based on the employee’s role.
  • Implement and monitor compliance with policies and procedures for rendering information unreadable or otherwise secure in the course of disposal. Securely disposed information must not practicably be read or reconstructed.
  • Restrict third-party access to personal information based on business need, for example by restricting access based on IP address, granting temporary access privileges, or similar procedures (a minimal sketch of these access limits follows this list).
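
For the role- and network-based restrictions above, here is a minimal sketch of what the checks can look like in code. The roles, fields, address range and privilege window are all invented for illustration.

```python
# Minimal sketch of role-based field access and IP-restricted,
# time-boxed third-party access. Roles, fields and ranges invented.
import ipaddress
from datetime import datetime, timedelta

ROLE_FIELDS = {
    "support": {"name", "email"},              # no SSN for support staff
    "compliance": {"name", "email", "ssn"},
}
PARTNER_NET = ipaddress.ip_network("203.0.113.0/24")   # allowed partner range

def visible_fields(record, role):
    """Limit employee access to personal information based on role."""
    allowed = ROLE_FIELDS.get(role, set())
    return {k: v for k, v in record.items() if k in allowed}

def third_party_allowed(source_ip, granted_at, ttl_hours=24):
    """Restrict third parties by source IP and temporary privileges."""
    in_range = ipaddress.ip_address(source_ip) in PARTNER_NET
    still_fresh = datetime.utcnow() - granted_at < timedelta(hours=ttl_hours)
    return in_range and still_fresh

record = {"name": "J. Doe", "email": "jd@example.com", "ssn": "xxx-xx-1234"}
print(visible_fields(record, "support"))                      # SSN withheld
print(third_party_allowed("203.0.113.7", datetime.utcnow()))  # True
```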

The Personal Data Notification and Protection Act fills a void at the national level; most states have privacy laws, with California pioneering the movement with SB 1386.  However, enforcement at the state AG level has been uneven at best and absent at worst.

In preparing for this national legislation, organizations need to heed the policies derived from the FTC’s enforcement practices. They can also track the progress of the legislation and look to agencies such as the National Institute of Standards and Technology to issue guidance. Furthermore, organizations can encourage employees to take advantage of cybersecurity internship programs at nearby colleges and universities to avoid critical skills shortages.

With online security a clear priority for President Obama’s administration, it’s essential for organizations and consumers to understand upcoming legislation and learn the benefits and risks of sharing data. We’re looking forward to celebrating safeguarding data and enabling trust on Data Privacy Day, held annually on January 28, and hope that these tips will make 2015 your safest year yet.
