Category Archives: Data Services
I found a truly cringe-worthy article today that shows what popular websites looked like more than a decade ago and what they look like today. Looking back to what was cutting-edge in 1996, or even 2006, is as bad as fingernails on a chalkboard compared to the modern homepages of popular sites today.
These websites are still well-used today, staying with the times and leading the way we design digital experiences. The key was change over many years of research and understanding of user experience. These sites stay modern, adapting to different web-enabled devices and experiences that the end user will encounter. Common among them are beautiful imagery, clear calls to action, and a sophisticated understanding of what people want on a homepage.
Can you imagine if any of those sites had stayed the same and never changed? We would not be using them today if that were the case. Their popularity would wane. Change is never easy, but it is usually necessary to stay relevant.
Web designers in 1996 could not imagine what the internet would be like in 2015, although they would probably agree there was a lot of potential. A modern equivalent is the implications of big data throughout the enterprise.
Data-driven marketers today are wondering how they can gain insight from big data. The answer? The ability to change is the connection between big data and insight. Data-driven marketers today know that their roles are changing: 68% of marketers think that marketing has seen more changes in the last two years than it has in the past 50 years, according to a recent survey. The changes are due to a renewed focus on customer experience within their jobs, and the need to use big data to improve that experience.
Big data should drive insights that change businesses, but is the real reason marketers aren’t sure about how to use big data tied to the change that it requires? Leading change in an organization is never easy, but it is definitely necessary.
What insights do you want from big data, and what value can you derive from them? If your reason for using big data is customer behavior insights, how will knowing how a customer behaves influence any changes in your approach?
In a recent survey reported by the National Retail Federation, retailers cited these as their top three reasons for using big data:
- Analyzing customer behavior (56 percent)
- Bringing together different data sources (49 percent)
- Improving personalization (48 percent)
What are your reasons for using big data?
Data-driven marketers can drown in too much information if they look at massive datasets without a question in mind that they want to answer. The question being asked often implies that a business must change to stay modern and relevant to its customers. Could concern over a need for great change be the roadblock to data-driven marketers who could be using data for valuable insights?
Big data has gotten a lot of buzz in the last few years. Data-driven marketers can move the big data concept from fuzzy, unrealized potential to a major part of how their business operates successfully.
Learn more in this white paper for marketers, The Secret to a Successful Customer Journey.
This article was originally posted on Argyle CMO Journal and is re-posted here with permission.
According to a new global study from SDL, 90% of consumers expect a consistent customer experience across channels and devices when they interact with brands. However, according to these survey results, Gartner Survey Finds Importance of Customer Experience on the Rise — Marketing is on the Hook, fewer than half of the companies surveyed rank their customer experience as exceptional today. The good news is that two-thirds expect it to be exceptional in two years. In fact, 89% plan to compete primarily on the basis of the customer experience by 2016.
So, what role do CMOs play in delivering omnichannel customer experiences?
According to a recent report, Gartner’s Executive Summary for Leadership Accountability and Credibility within the C-Suite, a high percentage of CEOs expect CMOs to lead the integrated cross-functional customer experience. Also, customer experience is one of the top three areas of investment for CMOs in the next two years.
I had the pleasure of participating on a panel discussion at the Argyle CMO Forum in Dallas a few months ago. It focused on the emergence of omnichannel and the need to deliver seamless, integrated and consistent customer experiences across channels.
Lisa Zoellner, Chief Marketing Officer of Golfsmith International, was a dynamic moderator who kept the conversation lively and the audience engaged. I was a panelist alongside:
- Chris Brogan, Senior Vice President, Strategy & Analysis, Hyatt Hotels & Resorts
- Chris Berg, Vice President, Store Operations, The Home Depot
- Chip Burgard, Senior Vice President, Marketing, CitiMortgage
Below are some highlights from the panel.
Lisa Zoellner, CMO, Golfsmith International opened the panel with a statistic. “Fifty-five percent of marketers surveyed feel they are playing catch up to customer expectations. But in that gap is a big opportunity.”
What is your definition of omnichannel?
There was consensus among the group that omnichannel is about seeing your business through the eyes of your customer and delivering seamless, integrated and consistent customer experiences across channels.
Customers don’t think in terms of channels and touch points; they just expect seamless, integrated and consistent customer experiences. It’s one brand to the customer. But there is a gap between customer expectations and what most businesses can deliver today.
In fact, executives at most organizations I’ve spoken with, including the panelists, believe they are in the very beginning stages of their journey towards delivering omnichannel customer experiences. The majority are still struggling to get a single view of customers, products and inventory across channels.
What are some of the core challenges standing in your way?
A key takeaway was that omnichannel requires organizations to fundamentally change how they do business. In particular, it requires changing existing business practices and processes. It cannot be done without cross-functional collaboration.
I think Chris Berg, VP, Store Operations at The Home Depot said it well, “One of the core challenges is the annual capital allocation cycle, which makes it difficult for organizations to be nimble. Most companies set strategies and commitments 12-24 months out and approach these strategies in silos. Marketing, operations, and merchandising teams typically ask for capital separately. Rarely does this process start with asking the question, ‘What is the core strategy we want to align ourselves around over the next 24 months?’ If you begin there and make a single capital allocation request to pursue that strategy, you remove one of the largest obstacles standing in the way.”
Chip Burgard, Senior Vice President of Marketing at CitiMortgage, focused on two big barriers. “The first one is a systems barrier. I know a lot of companies struggle with this problem. We’re operating with a channel-centric rather than a customer-centric view. Now that we need to deliver omnichannel customer experiences, we realize we’re not as customer-centric as we thought we were. We need to understand what products our customers have across lines of business such as credit cards, banking, investments and mortgage. But our systems weren’t providing a total customer relationship view across products and channels. Now, we’re making progress on that. The second barrier is compensation. We have a commission-based sales force. How do you compensate the loan officers if a customer starts the transaction with the call center but completes it in the branch? That’s another issue we’re working on.”
Lisa Zoellner, CMO at Golfsmith International added, “I agree that compensation is a big barrier. Companies need to rethink their compensation plans. The sticky question is ‘Who gets credit for the sale?’ It’s easy to say that you’re channel-agnostic, but when someone’s paycheck is tied to the performance of a particular channel, it makes it difficult to drive that type of culture change.”
“We have a complicated business. More than 500 Hyatt hotels and resorts span multiple brands and regions,” said Chris Brogan, SVP of Strategy and Analytics at Hyatt Hotels & Resorts. “But, customers want a seamless experience no matter where they travel. They expect that the preference they shared during their Hyatt stay at a hotel in Singapore is understood by the person working at the next hotel in Dallas. So, we’re bridging those traditional silos all the way down to the hotel. A guest doesn’t care if the person they’re interacting with is from the building engineering department, from the food and beverage department, or the rooms department. It’s all part of the same customer experience. So we’re looking at how we share the information that’s important to guests to keep the customer the focus of our operations.”
How are companies powering great customer experiences with great customer data?
Chris Brogan, SVP of Strategy and Analytics at Hyatt Hotels & Resorts, said, “We’re going through a transformation to unleash our colleagues to deliver great customer experiences at every stage of the guest journey. Our competitive differentiation comes from knowing our customers better than our competitors. We manage our customer data like a strategic asset so we can use that information to serve customers better and build loyalty for our brand.”
Hyatt connects the fragmented customer data from numerous applications including sales, marketing, ecommerce, customer service and finance. They bring the core customer profiles together into a single, trusted location, where they are continually managed. Now their customer profiles are clean, de-duplicated, enriched, and validated. They can see the members of a household as well as the connections between corporate hierarchies. Business and analytics applications are fueled with this clean, consistent and connected information so customer-facing teams can do their jobs more effectively and hotel teams can extend simple, meaningful gestures that drive guest loyalty.
When he first joined Hyatt, Chris did a search for his name in the central customer database and found 13 different versions of himself. This included the single Chris Brogan who lived across the street from Wrigley Field with his buddies in his 20s and the Chris Brogan who lives in the suburbs with his wife and two children. “I can guarantee those two guys want something very different from a hotel stay. Mostly just sleep now,” he joked. Those guest profiles have now been successfully consolidated.
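Consolidating duplicate guest profiles like these usually comes down to fuzzy matching on identifying fields. The sketch below is a minimal, hypothetical illustration in Python using the standard library's difflib; the record fields, names, and similarity threshold are invented for the example, and a production matching engine would weigh many more signals (addresses, emails, loyalty numbers).

```python
from difflib import SequenceMatcher

# Hypothetical guest records; the fields and values are invented for illustration.
profiles = [
    {"name": "Chris Brogan", "city": "Chicago"},
    {"name": "C. Brogan", "city": "Chicago"},
    {"name": "Chris Brogan", "city": "Naperville"},
]

def similar(a, b, threshold=0.7):
    """Treat two profiles as the same guest when their names are close enough."""
    ratio = SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio()
    return ratio >= threshold

def consolidate(records):
    """Greedy de-duplication: merge each record into the first matching cluster."""
    clusters = []
    for rec in records:
        for cluster in clusters:
            if similar(cluster[0], rec):
                cluster.append(rec)
                break
        else:
            clusters.append([rec])
    return clusters

clusters = consolidate(profiles)  # all three records collapse into one guest
```

A real master data management platform adds survivorship rules to decide which values win when records merge, but the clustering idea is the same.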
This solid customer data foundation means Hyatt colleagues can more easily personalize a guest’s experience. For example, colleagues at the front desk are now able to use the limited check-in time to congratulate a new Diamond member on just achieving the highest loyalty program tier or offer a better room to those guests most likely to take them up on the offer and appreciate it.
According to Chris, “Successful marketing, sales and customer experience initiatives need to be built on a solid customer data foundation. It’s much harder to execute effectively and continually improve if your customer data is not in order.”
How are you shifting from channel-centric to customer-centric?
Chip Burgard, SVP of Marketing at CitiMortgage answered, “In the beginning of our omnichannel journey, we were trying to allow customer choice through multi-channel. Our whole organization was designed around people managing different channels. But, we quickly realized that allowing separate experiences that a customer can choose from is not being customer-centric.
Now we have new sales leadership that understands the importance of delivering seamless, integrated and consistent customer experiences across channels. And they are changing incentives to drive that customer-centric behavior. We’re no longer holding people accountable specifically for activity in their channels. We’re working together collectively to meet our customers’ needs across the channels they are using to engage with us.”
Chris Berg, VP of Store Operations at The Home Depot, explained, “For us, it’s about transitioning from a store-centric to a customer-centric approach. It’s a cultural change. The managers of our 2,000 stores have traditionally been compensated based on their own store’s performance. But we are one brand. For example, a store may fulfill an order and yet, because of where the order originated, not receive credit for the sale. We’re in the process of working through how to better reward that collaboration. Also, we’re making investments in our systems so they support an omnichannel, or what we call interconnected, business. We have 40,000 products in store and over 1,000,000 products online. Now that we’re on the interconnected journey, we’re rethinking how we manage our product information so we can manage inventory across channels more effectively and efficiently.”
Omnichannel is all about shifting from channel-centric to customer-centric – much more customer-centric than you are today. Knowing who your customers are and having a view of products and inventory across channels are the basic requirements to delivering exceptional customer experiences across channels and touch points.
This is not a project. A business transformation is required to empower people to deliver omnichannel customer experiences. The executive team needs to drive it and align compensation and incentives around it. A collaborative cross-functional approach is needed to achieve it.
Omnichannel depends on customer-facing teams such as marketing, sales and call centers to have access to a total customer relationship view based on clean, consistent and connected customer, product and inventory information. This is the basic foundation needed to deliver seamless, integrated and consistent customer experiences across channels and touch points and improve their effectiveness.
Last week was Informatica’s first ever Data Mania event, held at the Contemporary Jewish Museum in San Francisco. We had an A-list lineup of speakers from leading cloud and data companies, such as Salesforce, Amazon Web Services (AWS), Tableau, Dun & Bradstreet, Marketo, AppDynamics, Birst, Adobe, and Qlik. The event and speakers covered a range of topics all related to data, including Big Data processing in the cloud, data-driven customer success, and cloud analytics.
While these companies are giants today in the world of cloud and have created their own unique ecosystems, we also wanted to take a peek at and hear from the leaders of tomorrow. Before startups can become market leaders in their own realm, they face the challenge of ramping up a stellar roster of customers so that they can get to subsequent rounds of venture funding. But what gets in their way are the numerous data integration challenges of onboarding customer data onto their software platform. When these challenges remain unaddressed, R&D resources are spent on professional services instead of building value-differentiating IP. Bugs also continue to mount, and technical debt increases.
Enter the Informatica Cloud Connector SDK. Built entirely in Java and able to browse through any cloud application’s API, the Cloud Connector SDK parses the metadata behind each data object and presents it in the context of what a business user should see. We had four startups build a native connector to their application in less than two weeks: BigML, Databricks, FollowAnalytics, and ThoughtSpot. Let’s take a look at each one of them.
With predictive analytics becoming a growing imperative, machine-learning algorithms that can have a higher probability of prediction are also becoming increasingly important. BigML provides an intuitive yet powerful machine-learning platform for actionable and consumable predictive analytics. Watch their demo on how they used Informatica Cloud’s Connector SDK to help them better predict customer churn.
Can’t play the video? Click here: http://youtu.be/lop7m9IH2aw
Databricks was founded out of the UC Berkeley AMPLab by the creators of Apache Spark. Databricks Cloud is a hosted end-to-end data platform powered by Spark. It enables organizations to unlock the value of their data, seamlessly transitioning from data ingest through exploration and production. Watch their demo, which showcases how the Informatica Cloud connector for Databricks Cloud was used to analyze lead contact rates in Salesforce and to perform machine learning on a dataset built using either Scala or Python.
Can’t play the video? Click here: http://youtu.be/607ugvhzVnY
With mobile usage growing by leaps and bounds, the area of customer engagement on a mobile app has become a fertile area for marketers. Marketers are charged with acquiring new customers, increasing customer loyalty and driving new revenue streams. But without the technological infrastructure to back them up, their efforts are in vain. FollowAnalytics is a mobile analytics and marketing automation platform for the enterprise that helps companies better understand audience engagement on their mobile apps. Watch this demo where FollowAnalytics first builds a completely native connector to its mobile analytics platform using the Informatica Cloud Connector SDK and then connects it to Microsoft Dynamics CRM Online using Informatica Cloud’s prebuilt connector for it. Then, see FollowAnalytics go one step further by performing even deeper analytics on their engagement data using Informatica Cloud’s prebuilt connector for Salesforce Wave Analytics Cloud.
Can’t play the video? Click here: http://youtu.be/E568vxZ2LAg
Analytics has taken center stage this year due to the rise in cloud applications, but most of the existing BI tools out there still stick to the old way of doing BI. ThoughtSpot brings a consumer-like simplicity to the world of BI by allowing users to search for the information they’re looking for just as if they were using a search engine like Google. Watch this demo where ThoughtSpot uses Informatica Cloud’s vast library of over 100 native connectors to move data into the ThoughtSpot appliance.
Can’t play the video? Click here: http://youtu.be/6gJD6hRD9h4
The problem many banks encounter today is that they have vast sums of investment tied up in old ways of doing things. Historically, customers chose a bank and remained ‘loyal’ throughout their lifetime; now competition is rife and loyalty is becoming a thing of the past. To stay ahead of the competition and to gain and keep customers, banks need to understand the ever-evolving market, disrupt norms and continue to delight customers. The tradition of staying with one bank out of family convention or ease has been replaced by a more informed customer who understands the variety of choice at their fingertips.
Challenger banks don’t build on ideas of tradition and legacy or look for ways to adjust them; they embrace change. Longer-established banks can’t afford to do nothing and assume their size and stature will attract customers.
Here’s some useful information
Accenture’s recent report, The Bank of Things, succinctly explains what ‘Customer 3.0’ is all about. The connected customer isn’t necessarily younger. It’s everybody. Banks can get to know their customers better by making better use of information. It all depends on using intelligent data rather than all data. Interrogating the wrong data is time-consuming and costly, and it yields little actionable information.
When an organisation sets out with the intention of knowing its customers, it can calibrate its data according to where the gold nuggets – the real business insights – come from. What do people do most? Where do they go most? Now that customers are using branches and phone banking less and less, what do they look for in a mobile app?
Customer 3.0 wants to know what the bank can offer them all the time, on the move, on their own device. They want offers designed for their lifestyle. Correctly deciphered data can drive the level of customer segmentation that empowers such marketing initiatives. This means an organisation has to have the ability and the agility to move with its customers. It’s a journey that never ends: technology will never have a cut-off point, just as customer expectations will never stop evolving.
It’s time for banks to re-shape banking
Informatica has been working with major retail banks globally to redefine banking excellence and realign operations to deliver it. We always start by asking our customers a revealing question: “Have you looked at the art of the possible to future-proof your business over the next five to ten years and beyond?” This is where the discussion begins to explore really interesting notions about unlocking potential. No bank can afford to ignore them.
Who remembers their first game of Pong? Celebrating more than 40 years of innovation, gaming is no longer limited to monochromatic screens and dedicated, proprietary platforms. The PC gaming industry is expected to exceed $35bn by 2018. The phone and handheld gaming market is estimated to reach $34bn within five years and is quickly closing the gap. According to EEDAR, 2014 saw more than 141 million mobile gamers in North America alone, generating $4.6B in revenue for mobile game vendors.
This growth has spawned a growing list of conferences specifically targeting gamers, game developers, the gaming industry and, more recently, gaming analytics! This past weekend in Boston, for example, was PAX East, where people of all ages and walks of life played games on consoles, PCs, handhelds, and good old-fashioned boards. With my own children in attendance, the debate over commercial games versus indie favorites, such as Minecraft, dominates the dinner table.
Online games are where people congregate, collaborate, and generate petabytes of data daily. With the added bonus of geospatial data from smartphones comes the opportunity for even more advanced analytics. Some of the basic metrics that determine whether a game is successful, according to Ninja Metrics, include:
- New Users, Daily Active Users, Retention
- Revenue per user
- Session length and number of sessions per user
Additionally, they provide predictive analytics, customer lifetime value, and cohort analysis. If this is your gig, there’s a conference for that as well: the Gaming Analytics Summit!
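Computed from a raw session log, the basic metrics above take only a few lines. The sketch below is illustrative Python over an invented log format (player id, day, session minutes); real analytics pipelines compute the same figures at far larger scale.

```python
from collections import defaultdict
from datetime import date

# Invented session log: (player_id, day, session_minutes).
sessions = [
    ("p1", date(2015, 3, 1), 25),
    ("p2", date(2015, 3, 1), 40),
    ("p1", date(2015, 3, 2), 15),
    ("p3", date(2015, 3, 2), 30),
]

def daily_active_users(log):
    """Distinct players seen on each day."""
    users = defaultdict(set)
    for player, day, _ in log:
        users[day].add(player)
    return {day: len(players) for day, players in users.items()}

def retention(log, day_a, day_b):
    """Share of day_a players who also played on day_b."""
    a = {p for p, d, _ in log if d == day_a}
    b = {p for p, d, _ in log if d == day_b}
    return len(a & b) / len(a) if a else 0.0

def avg_session_minutes(log):
    """Mean session length across all sessions."""
    return sum(minutes for _, _, minutes in log) / len(log)
```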
The Game Developers Conference, recently held in San Francisco, has shifted its focus over the years from computer games to new gaming platforms that incorporate mobile, smartphone, and online components. Producing a successful game requires the following:
- The ability to connect to a variety of devices and platforms
- The use of data to drive decisions and improve the user experience
- Adherence to privacy laws
Developers are able to quickly access online gaming data and tweak or change their sprites’ attributes dynamically to maximize player experience.
When you look at what is happening in the gaming industry, you can start to see why colleges and universities like my own alma mater, WPI, now offer a computer science degree in Interactive Media and Game Design. The IMGD curriculum includes heavy coursework in data science, game theory, artificial intelligence and storyboarding. When I asked a WPI IMGD student what they were working on, they described mapping out decision trees that dictate which adversary to pop up based on the player’s history (sounds a lot like what we do in digital marketing…).
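A decision tree like the one that student described can be prototyped in a few lines. This is a toy Python sketch, not the student's actual project: the attribute names, thresholds, and adversary types are all invented to show the shape of the idea.

```python
def pick_adversary(history):
    """Walk a tiny hand-built decision tree over aggregate player stats."""
    if history["deaths_last_level"] > 3:
        return "scout"        # struggling player: ease off
    if history["avg_clear_seconds"] < 60:
        return "elite_guard"  # fast player: raise the challenge
    return "soldier"          # everyone else gets the default adversary

# A player who died five times on the last level gets an easier opponent.
adversary = pick_adversary({"deaths_last_level": 5, "avg_clear_seconds": 120})
```

Digital marketers do much the same thing when they branch a campaign on a customer's purchase history, which is presumably why it sounded familiar.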
As the Millennial Generation enters the workforce, maybe we should look at our own recruiting efforts and consider game designers. They are masters of analytics and creativity, with an appreciation for the importance of great data. Combining the magic and the math makes a great gaming experience. Who wouldn’t want that for their customers?
With Informatica’s Data Mania on Wednesday, I’ve been thinking a lot lately about REST APIs. In particular, I’ve been considering how and why they’ve become so ubiquitous, especially for SaaS companies. Today they are the prerequisite for any company looking to connect with other ecosystems, accelerate adoption and, ultimately, separate themselves from the pack.
Let’s unpack why.
To trace the rise of the REST API, we’ll first need to take a look at the SOAP web services protocol that preceded it. SOAP is still very much in play and remains important to many application integration scenarios. But it doesn’t receive much use or love from the thousands of SaaS applications that just want to get or place data with one another or in one of the large SaaS ecosystems like Salesforce.
Why this is the case has more to do with needs and demands of a SaaS business than it does with the capabilities of SOAP web services. SOAP, as it turns out, is perfectly fine for making and receiving web service calls, but it does require work on behalf of both the calling application and the producing application. And therein lies the rub.
SOAP web service calls are by their very nature incredibly structured arrangements, with specifications that must be clearly defined by both parties. Only after both the calling and producing application have their frameworks in place can the call be validated. While the contract within SOAP WSDLs makes SOAP more robust, it also makes it too rigid, and less adaptable to change. But today’s apps need a more agile and more loosely defined API framework that requires less work to consume and can adapt to the inevitable and frequent changes demanded by cloud applications.
Enter REST APIs
REST APIs are the perfect vehicle for today’s SaaS businesses and mash-up applications. Sure, they’re more loosely defined than SOAP, but when all you want to do is send and receive some data, now, in the context you need, nothing is easier or better for the job than a REST API.
With a REST API, the calls are mostly done as HTTP with some loose structure and don’t require a lot of mechanics from the calling application, or effort on behalf of the producing application.
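That looseness is easy to see in miniature. The Python sketch below is purely illustrative: the endpoint, query parameters, and response fields are invented, since every SaaS API defines its own, but building a URL and parsing a JSON payload is most of what a REST consumer does.

```python
import json
from urllib.parse import urlencode

# Hypothetical endpoint and parameters; a real SaaS API documents its own.
base = "https://api.example.com/v1/contacts"
query = urlencode({"updated_since": "2015-01-01", "limit": 2})
url = f"{base}?{query}"  # an HTTP GET against this URL would fetch the data

# A typical response is a loosely structured JSON document the caller
# picks fields out of; no WSDL contract has to be agreed on in advance.
sample_response = '{"records": [{"id": 1, "email": "a@example.com"}, {"id": 2, "email": "b@example.com"}]}'
records = json.loads(sample_response)["records"]
```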
SaaS businesses prefer REST APIs because they are easy to consume. They also make it easy to onboard new customers and extend the use of the platform to other applications. The latter is important because it is primarily through integration that SaaS applications get to become part of an enterprise business process and gain the stickiness needed to accelerate adoption and growth.
Without APIs of any sort, integration can only be done through manual data movement, which opens the application and enterprise up to the potential errors of fat-fingered data entry. That typically gives you the opposite of stickiness, and is to be avoided at all costs.
While publishing an API as a way to get and receive data from other applications is a great start, it is just a means to an end. If you’re a SaaS business with greater ambitions, you may want to consider taking the next step of building native connectors to other apps using an integration system such as Informatica Cloud. A connector can provide a nice layer of abstraction on the APIs so that the data can be accessed as application data objects within business processes. Clearly, stickiness with any SaaS application improves in direct proportion to the number of business processes or other applications that it is integrated with.
The Informatica Cloud Connector SDK is Java-based and enables you to easily cut and paste the code necessary to create the connectors. Informatica Cloud’s SDKs are also rich enough to let you adapt a REST API into something any business user will want to use, which is a huge advantage.
In addition to making your app stickier, native connectors have the added benefit of increasing your portability. Without this layer of abstraction, direct interaction with a REST API that’s been structurally changed would be impossible without also changing the data flows that depend on it. Building a native connector makes you more agile and insulates your custom-built integration from breaking.
Building your connectors with Informatica Cloud also provides you with some other advantages. One of the most important is entrance to a community that includes all of the major cloud ecosystems and the thousands of business apps that orbit them. As a participant, you’ll become part of an interconnected web of applications that make up the business processes for the enterprises that use them.
Another ancillary benefit is access to integration templates that you can easily customize to connect with any number of known applications. The templates abstract the complexity from complicated integrations, can be quickly customized with just a few composition screens, and are easily invoked using Informatica Cloud’s APIs.
The best part of all this is that you can use Informatica Cloud’s integration technology to become a part of any business process without stepping outside of your application.
For those interested in continuing the conversation and learning more about how leading SaaS businesses are using REST APIs and native connectors to separate themselves, I invite you to join me at Data Mania, March 4th in San Francisco. Hope to see you there.
As reported by the Economic Times, “In the coming years, enormous volumes of machine-generated data from the Internet of Things (IoT) will emerge. If exploited properly, this data – often dubbed machine or sensor data, and often seen as the next evolution in Big Data – can fuel a wide range of data-driven business process improvements across numerous industries.”
We can all see this happening in our personal lives. Our thermostats are connected now, our cars have been for years, and even my toothbrush has a Bluetooth connection to my phone. On the industrial side, devices have also been connected for years, tossing off megabytes of data per day that have typically been used for monitoring, with the data thrown away as quickly as it appears.
So, what changed? With the advent of big data, cheap cloud, and on-premise storage, we now have the ability to store machine or sensor data spinning out of industrial machines, airliners, health diagnostic devices, etc., and leverage that data for new and valuable uses.
For example, consider the ability to determine the likelihood that a jet engine will fail, based upon the sensor data gathered and how that data compares with existing known patterns of failure. Instead of getting an engine-failure light on the flight deck, the pilots can see that the engine has a 20 percent likelihood of failure, and get it serviced before it fails completely.
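One simple way to frame that comparison is as a distance between current sensor readings and a known failure signature. The Python sketch below is purely illustrative: the sensor names, values, and scaling factors are invented, and a real system would learn failure patterns statistically from fleet-wide history rather than use a hand-built score.

```python
import math

# Invented failure signature and sensor scales for a hypothetical engine.
failure_signature = {"egt_c": 920.0, "vibration": 3.2, "oil_psi": 28.0}
scales = {"egt_c": 50.0, "vibration": 1.0, "oil_psi": 10.0}

def failure_risk(reading):
    """Crude likelihood proxy: the closer a reading sits to the known
    failure pattern (in scaled units), the higher the score."""
    dist = math.sqrt(sum(((reading[k] - failure_signature[k]) / scales[k]) ** 2
                         for k in failure_signature))
    return 1.0 / (1.0 + dist)

risk_now = failure_risk({"egt_c": 905.0, "vibration": 2.9, "oil_psi": 31.0})
risk_healthy = failure_risk({"egt_c": 840.0, "vibration": 1.1, "oil_psi": 45.0})
# A reading near the failure signature scores much higher than a healthy one.
```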
The problem with all of this very cool stuff is that we need to once again rethink data integration. Indeed, if the data can’t get from the machine sensors to a persistent data store for analysis, then none of this has a chance of working.
That’s why those who are moving to IoT-based systems need to do two things. First, they must create a strategy for extracting data from devices, such as industrial robots or an Audi A8. Second, they need a strategy to take all of this disparate data firing out of devices at megabytes per second and put it where it needs to go, in the right native structure (or in an unstructured data lake), so it can be leveraged in useful ways, and in real time.
The challenge is that machines and devices are not traditional IT systems. I’ve built connectors for industrial applications in my career. The fact is, you need to adapt to the way that the machines and devices produce data, and not the other way around. Data integration technology needs to adapt as well, making sure that it can deal with streaming and unstructured data, including many instances where the data needs to be processed in flight as it moves from the device to the database.
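In Python terms, that in-flight processing looks like a small pipeline sitting between the device and the store. The sketch below is illustrative only: the reading format and validation rules are invented, and production systems use streaming engines rather than in-process generators, but the principle of adapting to whatever the device emits is the same.

```python
def sensor_stream():
    """Stand-in for a device emitting raw readings; a real source pushes
    continuously and in whatever format the machine happens to produce."""
    yield from [
        ("2015-03-01T00:00:00Z", "temp", "21.5"),
        ("2015-03-01T00:00:01Z", "temp", "bad"),   # devices emit garbage too
        ("2015-03-01T00:00:02Z", "temp", "22.1"),
    ]

def in_flight(stream):
    """Parse and validate each reading before it lands in the store."""
    for ts, kind, value in stream:
        try:
            v = float(value)
        except ValueError:
            continue  # drop malformed readings instead of persisting them
        yield {"ts": ts, "kind": kind, "value": v}

cleaned = list(in_flight(sensor_stream()))  # ready for the persistent store
```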
This becomes a huge opportunity for data integration providers who understand the special needs of IoT, as well as for the technology that those who build IoT-based systems can leverage. However, the larger value is for those businesses that learn how to leverage IoT to provide better services to their customers by offering insights that were previously impossible. Be it jet engine reliability, the fuel efficiency of my car, or feedback to my physician from sensors on my body, this is game-changing stuff. At the heart of its ability to succeed is the ability to move data from place to place.
The original article can be found at scmagazine.com.
On Jan. 13 the White House announced President Barack Obama’s proposal for new data privacy legislation, the Personal Data Notification and Protection Act. Many states have laws today that require corporations and government agencies to notify consumers in the event of a breach – but it is not enough. This new proposal aims to improve cybersecurity standards nationwide with the following tactics:
Enable cyber-security information sharing between private and public sectors.
Government agencies and corporations with a vested interest in protecting our information assets need a streamlined way to communicate and share threat information. This component of the proposed legislation incentivizes organizations that participate in knowledge-sharing with targeted liability protection, as long as they are responsible in how they share, manage and retain privacy data.
Modernize the tools law enforcement has to combat cybercrime.
Existing laws, such as the Computer Fraud and Abuse Act, need to be updated to incorporate the latest cyber-crime classifications while giving prosecutors the ability to target insiders with privileged access to sensitive and privacy data. The proposal also specifically calls out pursuing prosecution of those who sell privacy data nationally and internationally.
Standardize breach notification policies nationwide.
Many states have some sort of policy that requires notification of customers when their data has been compromised. Three leading examples are California’s breach notification law, Florida’s Information Protection Act (FIPA), and Massachusetts’ Standards for the Protection of Personal Information of Residents of the Commonwealth. New Mexico, Alabama and South Dakota have no data breach protection legislation. Enforcing standardization and simplifying the requirement for companies to notify customers and employees when a breach occurs will ensure consistent protection no matter where you live or transact.
Invest in increasing cyber-security skill sets.
For a number of years, security professionals have reported an ever-increasing skills gap in the cybersecurity profession. In fact, in a recent Ponemon Institute report, 57 percent of respondents said a data breach incident could have been avoided if the organization had more skilled personnel with data security responsibilities. Increasingly, colleges and universities are adding cybersecurity curriculum and degrees to meet the demand. In support of this need, the proposed legislation mentions that the Department of Energy will provide $25 million in educational grants to Historically Black Colleges and Universities (HBCU) and two national labs to support a cybersecurity education consortium.
This proposal is clearly comprehensive, but it also raises the critical question: How can organizations prepare themselves for this privacy legislation?
The International Association of Privacy Professionals conducted a study of Federal Trade Commission (FTC) enforcement actions. From the report, organizations can infer best practices implied by FTC enforcement and ensure these are covered by their organization’s security architecture, policies and practices:
- Perform assessments to identify reasonably foreseeable risks to the security, integrity, and confidentiality of personal information collected and stored on the network, online or in paper files.
- Adopt limited-access policies that curb unnecessary security risks and minimize the number and type of network access points an information security team must monitor for potential violations.
- Limit employee access to (and copying of) personal information, based on employee’s role.
- Implement and monitor compliance with policies and procedures for rendering information unreadable or otherwise secure in the course of disposal. Securely disposed information must not be practicably readable or reconstructable.
- Restrict third party access to personal information based on business need, for example, by restricting access based on IP address, granting temporary access privileges, or similar procedures.
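The access-restriction practices above (role-based limits, IP-based restriction, and temporary access privileges) can be combined in a single policy check. The sketch below is a hedged illustration: the roles, network range, and grant mechanism are all hypothetical, and a real deployment would enforce this in an identity provider or gateway rather than application code.

```python
# Hedged sketch of the access-restriction practices above: gate access
# to personal information on employee role, allowed source network, and
# an optional temporary-privilege expiry. All names here are hypothetical.
import ipaddress
import time

ALLOWED_ROLES = {"privacy_officer", "support_lead"}
ALLOWED_NET = ipaddress.ip_network("10.0.0.0/8")  # assumed internal network

def may_access(role, source_ip, temp_grant_expires=None, now=None):
    """Permit access only from the allowed network, and only to allowed
    roles or holders of an unexpired temporary grant."""
    now = time.time() if now is None else now
    in_network = ipaddress.ip_address(source_ip) in ALLOWED_NET
    has_role = role in ALLOWED_ROLES
    has_grant = temp_grant_expires is not None and now < temp_grant_expires
    return in_network and (has_role or has_grant)

print(may_access("support_lead", "10.1.2.3"))  # True
print(may_access("contractor", "10.1.2.3"))    # False: no role, no grant
```

The design point is that the checks compose with AND: being on the right network is necessary but never sufficient, matching the FTC-derived guidance that access be limited by both role and business need.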
The Personal Data Notification and Protection Act fills a void at the national level; most states have privacy laws, with California pioneering the movement with SB 1386. However, enforcement at the state AG level has been uneven at best and absent at worst.
In preparing for this national legislation, organizations need to heed the policies derived from the FTC’s enforcement practices. They can also track the progress of this legislation and look for agencies such as the National Institute of Standards and Technology to issue guidance. Furthermore, organizations can encourage employees to take advantage of cybersecurity internship programs at nearby colleges and universities to avoid critical skills shortages.
With online security a clear priority for President Obama’s administration, it’s essential for organizations and consumers to understand upcoming legislation and learn the benefits/risks of sharing data. We’re looking forward to celebrating safeguarding data and enabling trust on Data Privacy Day, held annually on January 28, and hope that these tips will make 2015 your safest year yet.
2014 was a pivotal year for Informatica as our investments in Hadoop and efforts to innovate in big data gathered momentum and became a core part of Informatica’s business. Our Hadoop-related big data revenue growth was in the ballpark of leading Hadoop startups, more than doubling over 2013.
In 2014, Informatica reached about 100 enterprise customers of our big data products, with an increasing number going into production with Informatica together with Hadoop and other big data technologies. Informatica’s big data Hadoop customers include companies in financial services, insurance, telecommunications, technology, energy, life sciences, healthcare and business services. These innovative companies are leveraging Informatica to accelerate their time to production and drive greater value from their big data investments.
These customers are in-production or implementing a wide range of use cases leveraging Informatica’s great data pipeline capabilities to better put the scale, efficiency and flexibility of Hadoop to work. Many Hadoop customers start by optimizing their data warehouse environments by moving data storage, profiling, integration and cleansing to Hadoop in order to free up capacity in their traditional analytics data warehousing systems. Customers that are further along in their big data journeys have expanded to use Informatica on Hadoop for exploratory analytics of new data types, 360 degree customer analytics, fraud detection, predictive maintenance, and analysis of massive amounts of Internet of Things machine data for optimization of energy exploration, manufacturing processes, network data, security and other large scale systems initiatives.
2014 was not just a year of market momentum for Informatica, but also one of new product development innovations. We shipped enhanced functionality for entity matching and relationship building at Hadoop scale (a key part of Master Data Management), end-to-end data lineage through Hadoop, as well as high performance real-time streaming of data into Hadoop. We also launched connectors to NoSQL and analytics databases including Datastax Cassandra, MongoDB and Amazon Redshift. Informatica advanced our capabilities to curate great data for self-serve analytics with a connector to output Tableau’s data format and launched our self-service data preparation solution, Informatica Rev.
Customers can now quickly try out Informatica on Hadoop by downloading the free trials for the Big Data Edition and Vibe Data Stream that we launched in 2014. Now that Informatica supports all five of the leading Hadoop distributions, customers can build their data pipelines on Informatica with confidence that no matter how the underlying Hadoop technologies evolve, their Informatica mappings will run. Informatica provides highly scalable data processing engines that run natively in Hadoop and leverage the best of open source innovations such as YARN, MapReduce, and more. Abstracting data pipeline mappings from the underlying Hadoop technologies combined with visual tools enabling team collaboration empowers large organizations to put Hadoop into production with confidence.
As we look ahead into 2015, we have ambitious plans to continue to expand and evolve our product capabilities with enhanced productivity to help customers rapidly get more value from their data in Hadoop. Stay tuned for announcements throughout the year.
Try some of Informatica’s products for Hadoop on the Informatica Marketplace here.
Strata 2015 – Making Data Work for Everyone with Cloud Integration, Cloud Data Management and Cloud Machine Learning
Are you ready to answer “Yes” to the questions:
a) “Are you Cloud Ready?”
b) “Are you Machine Learning Ready?”
I meet with hundreds of Informatica Cloud customers and prospects every year. While they are investing in Cloud, and seeing the benefits, they also know that there is more innovation out there. They’re asking me, what’s next for Cloud? And specifically, what’s next for Informatica in regards to Cloud Data Integration and Cloud Data Management? I’ll share more about my response throughout this blog post.
The spotlight will be on Big Data and Cloud at the Strata + Hadoop World conference taking place in Silicon Valley from February 17-20 with the theme “Make Data Work”. I want to focus this blog post on two topics related to making data work and business insights:
- How existing cloud technologies, innovations and partnerships can help you get ready for the new era in cloud analytics.
- How you can make data work in new and advanced ways for every user in your company.
Today, Informatica is announcing the availability of its Cloud Integration Secure Agent on Microsoft Azure and Linux Virtual Machines as well as an Informatica Cloud Connector for Microsoft Azure Storage. Users of Azure data services such as Azure HDInsight, Azure Machine Learning and Azure Data Factory can make their data work with access to the broadest set of data sources including on-premises applications, databases, cloud applications and social data. Read more from Microsoft about their news at Strata, including their relationship with Informatica, here.
“Informatica, a leader in data integration, provides a key solution with its Cloud Integration Secure Agent on Azure,” said Joseph Sirosh, Corporate Vice President, Machine Learning, Microsoft. “Today’s companies are looking to gain a competitive advantage by deriving key business insights from their largest and most complex data sets. With this collaboration, Microsoft Azure and Informatica Cloud provide a comprehensive portfolio of data services that deliver a broad set of advanced cloud analytics use cases for businesses in every industry.”
Even more exciting is how quickly any user can deploy a broad spectrum of data services for cloud analytics projects. The fully managed cloud service for building predictive analytics solutions from Azure and the wizard-based, self-service cloud integration and data management user experience of Informatica Cloud help overcome the challenges most users have in making their data work effectively and efficiently for analytics use cases.
The new solution enables companies to bring in data from multiple sources for use in Azure data services including Azure HDInsight, Azure Machine Learning, Azure Data Factory and others – for advanced analytics.
The broad availability of Azure data services, and Azure Machine Learning in particular, is a game changer for startups and large enterprises. Startups can now access cloud-based advanced analytics with minimal cost and complexity and large businesses can use scalable cloud analytics and machine learning models to generate faster and more accurate insights from their Big Data sources.
Success in using machine learning requires not only great analytics models, but also an end-to-end cloud integration and data management capability that brings in a wide breadth of data sources, ensures that data quality and data views match the requirements for machine learning modeling, and an ease of use that facilitates speed of iteration while providing high-performance and scalable data processing.
For example, the Informatica Cloud solution on Azure is designed to deliver on these critical requirements in a complementary approach and support advanced analytics and machine learning use cases that provide customers with key business insights from their largest and most complex data sets.
Using the Informatica Cloud solution on Azure connector with Informatica Cloud Data Integration enables optimized read-write capabilities for data to blobs in Azure Storage. Customers can use Azure Storage objects as sources, lookups, and targets in data synchronization tasks and advanced mapping configuration tasks for efficient data management using Informatica’s industry-leading cloud integration solution.
As Informatica fulfills the promise of “making great data ready to use” to our 5,500 customers globally, we continue to form strategic partnerships and develop next-generation solutions to stay one step ahead of the market with our Cloud offerings.
My goal in 2015 is to help each of our customers say that they are Cloud Ready! And collaborating with solutions such as Azure ensures that our joint customers are also Machine Learning Ready!
To learn more, try our free Informatica Cloud trial for Microsoft Azure data services.