Category Archives: Data Services

Informatica Rev: Data Democracy At Last – Part I

Data Democracy At Last

Informatica Cloud Data Preparation has launched! Those are the words we aspired to hear, the words that served as our rallying cry, when all we had was an idea coupled with a ton of talent, passion and drive. Well, today we launch it as Informatica Rev, the name business users know it by. (Check out the press release here.)

As we launch today, we now have over 3,500 individual users across over 800 logos. These users are everyday business users who just want to improve the speed and quality of their business decisions. By doing so, they help their corporations find success in the marketplace, and in turn they find success in their own careers. You can hear customers talk about their experience using Informatica Rev during our December 16 Future of Work webinar.

These users are people who, previously, were locked out of the exclusive Data Club because they did not have the time to be Excel jocks or know how to code. But now, these are people who have found success by turning their backs on this Club and aggressively participating in the Data Democracy.

And they are able to finally participate in the “Data Democracy” because of Informatica Rev. You can try Informatica Rev for free by clicking here.

These people play every conceivable role in the information economy. They are marketing managers, marketing operations leads, tradeshow managers, salespeople, sales operations leads, accounting analysts, recruiting leads, and benefits managers, to name a few. They work at companies ranging from large enterprises to small and mid-size businesses, and even sole proprietorships. They are even IT leads who might have more technical knowledge than their business counterparts, but who are increasingly barraged by requests from the business side and are simply looking to handle those requests more productively. Let’s take a peek into how Informatica Rev allows them to participate in the Data Democracy, and changes their lives for the better.

Before Informatica Rev, a marketing analyst was simply unable to respond to rapid changes in competitor prices, because by the time the competitor pricing data was assembled by the people or tools they relied on, the prices had changed again. This led to lost revenue opportunities for the company. It almost goes without saying that this is no small repercussion of the inability to respond at the rapid pace of business.

Let’s explore what a marketing analyst does today. When a file with competitor prices arrives, the analyst’s initial questions are “Which of my SKUs is each competitive price for?” and “Do the prices vary by geography?” To answer these questions, they use Excel VLOOKUPs and some complex macros, assuming they know what a VLOOKUP is. By the time the Excel work is done, the competitor data is stale. At some point, there is no reason to continue the analysis, and the company simply accepts its inability to capture this revenue.
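Stripped of the spreadsheet, the VLOOKUP work described here is just a key-based join between the competitor file and the analyst’s own SKU list. A minimal sketch in Python; the file layout and column names are hypothetical, purely for illustration:

```python
# Hypothetical competitor price feed and internal SKU mapping;
# column names are illustrative, not from any real file.
competitor_rows = [
    {"comp_sku": "A-100", "region": "US-East", "price": 19.99},
    {"comp_sku": "B-200", "region": "US-West", "price": 5.49},
]
sku_map = {"A-100": "SKU-001", "B-200": "SKU-002"}  # competitor SKU -> our SKU

# The VLOOKUP equivalent: match each competitor row to one of our SKUs.
matched = [
    {**row, "our_sku": sku_map.get(row["comp_sku"], "UNMATCHED")}
    for row in competitor_rows
]
for row in matched:
    print(row["our_sku"], row["region"], row["price"])
```

The second question, whether prices vary by geography, is then just a group-by on the `region` column of the matched rows.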

With Informatica Rev, a marketing analyst can use Intelligent Guidance to understand the competitor data file and determine its completeness, and then, with Smart Combine, easily combine the competitor data with their own. All with no code and no formal training, in a few minutes, all by themselves. And with Tableau as their BI tool, they can use the Export to TDE capability to seamlessly export to Tableau and analyze trends in price changes to decide on their strategy. Voila!

Before Informatica Rev, a tradeshow manager used to spend an inordinate amount of time trying to validate leads so that they could then load them into a marketing automation system. After a tradeshow, time is of the essence: leads must be processed rapidly or they decay, and fewer opportunities result for the company. Again, it almost goes without saying that this is no small repercussion of the inability to respond at the rapid pace of business. But the tradeshow manager finds themselves using Excel VLOOKUPs and other creative but time-consuming ways to validate the lead information. They simply want to know, “Which leads have missing titles or phone numbers?”, “What is the correct phone number?”, “How many are new leads?” and “How many are in accounts closing this quarter?”

All of these questions can be answered, but they take a lot of time in Excel, and even after all that work the final lead list is still error-prone, causing missed sales opportunities. With Informatica Rev, a tradeshow manager can answer these questions rapidly, with no code and no formal training, in a few minutes, all by themselves. With the Intelligent Guidance capability they can easily surface where the missing data lies. With Fast Combine they can access their opportunity information in Salesforce and be guided through the process of combining tradeshow and Salesforce data to correctly replace the missing data. Again, voila!

Before Informatica Rev, an accounting analyst spent inordinate amounts of time processing trade partner data every month, reconciling it with the trade partner’s receivables to determine whether they had been paid the correct amount. Not only was this process time-consuming and error-prone, but after all of the effort they still left millions in earned revenue unreceived. And again, it almost goes without saying that this is no small repercussion of the inability to respond at the rapid pace of business, or to effectively manage operational costs within the analyst’s company. So, let’s take a look at what the accounting analyst does today. Every trade partner sends large files of purchase data, each with a different structure. The accounting analyst initially asks, “What data is in them?”, “For what time period?”, “How many transactions?”, “From which products?” and “Which of our actual products does their name for our product tie to?”

Then, after they get these answers, they need to combine the data with the payments data received from the trade partner in order to answer the question, “Have we been paid the right amount, and if not, what is the difference?” All of these questions can be answered, but they used to take a lot of time with Excel VLOOKUPs and complex macros. And often the reconciliation was performed incorrectly, leaving receivables, well, un-received. With Informatica Rev, an accounting analyst benefits from Intelligent Guidance, which leads them through the process of rapidly answering their questions about the trade partner files with a few simple clicks. Furthermore, Informatica Rev’s Smart Combine capability suggests how to combine receivables data with trade partner data. So there you have it: now they know whether the correct amount has been paid. And the best part is that they answered these questions rapidly, with no code and no formal training, in a few minutes, all by themselves. Now, this process has to be done every month. Using Recipes, every step the accounting analyst took last month is recorded, so they do not have to repeat it this month. Just re-import the new trade partner data and you are reconciled. And again, voila!
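The Recipe idea, replaying last month’s recorded steps on this month’s file, can be pictured as an ordered list of transformations. A minimal sketch in Python; the step logic is hypothetical, not Informatica’s actual Recipe engine:

```python
# A "recipe" as an ordered list of recorded transformation steps that
# can be replayed on next month's file; purely illustrative of the concept.
recipe = [
    # Step 1: drop rows with no amount (incomplete transactions).
    lambda rows: [r for r in rows if r.get("amount") is not None],
    # Step 2: normalize amounts to two decimal places.
    lambda rows: [{**r, "amount": round(float(r["amount"]), 2)} for r in rows],
]

def apply_recipe(rows, steps):
    """Replay each recorded step, in order, on a fresh batch of data."""
    for step in steps:
        rows = step(rows)
    return rows

january = [{"amount": "10.501"}, {"amount": None}]
print(apply_recipe(january, recipe))  # [{'amount': 10.5}]
```

Next month, the same `recipe` list is applied to February’s file unchanged, which is what saves the analyst from repeating the work.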

One more thing for you, the everyday business user. In the future, you will be able to send this Recipe to IT. This capability will allow you to communicate your exact data requirement to IT, just as you created it, with no misinterpretation on anyone’s behalf. IT can then rapidly institutionalize your logic, exactly as you defined it, into the enterprise data warehouse, data mart, or any other repository of your or your IT department’s liking. Perhaps this means the end of those requirements-gathering sessions?

More importantly, this means you just got your exact requirement added to a central repository in a matter of minutes, and you did not need to make a case to be part of an enterprise project, either. This capability is necessary for you to participate in the Data Democracy and maintain your rapid pace of business, and it is one Informatica is uniquely positioned to deliver, as your IT department likely already has Informatica.

Just as these professionals have found success by participating in the Data Democracy, with Informatica Rev you finally can, too.

Please look for Part 2 of this blog tomorrow, where I will discuss how Informatica Rev elegantly bridges the IT and business divide, empowering IT to lead the charge into the Data Democracy. In the meantime, check out Informatica Rev for yourself and let me know what you think.

Posted in B2B, Business Impact / Benefits, Data Services

Take These Steps to Avoid Wasting Your Marketing Technology Budget

Avoid Wasting Your Marketing Technology Budget

Don’t Waste Your Marketing Tech Budget

This year, the irresistible pull of digital marketing met an unstoppable force: Girl Scout cookies. It’s an $800 million-a-year fundraiser that is only expected to increase with a newly announced addition of digital sales.

The New York Times reports that beginning this month and into January, for the first time, the Girl Scouts of America will be able to sell Thin Mints and other favorites online through invite-only websites. The websites will be accompanied by a mobile app, giving customers new digital options.

As the Girl Scouts update from a door-to-door approach to include a newly introduced digital program, it’s just one more sign of where marketing trends are heading.

From digital cookies to digital marketing technology:

If 2014 is the year of the digital cookie, then 2015 will be the year of marketing technology. Here are just a few of the strongest indicators:

  • A study found that 67% of marketing departments plan to increase spending on technology over the next two years, according to the Harvard Business Review.
  • Gartner predicts that by 2017, CMOs will outspend CIOs on IT-related expenses.
  • Also by 2017, one-third of the total marketing budget will be dedicated to digital marketing, according to survey results from Teradata.
  • A new LinkedIn/Salesforce survey found that 56% of marketers see their relationships with the CIO as very important or critical.
  • Social media is a mainstream channel for marketers, making technology for measuring and managing this channel of paramount importance. This is not just true of B2C companies: 75% of high-level executive B2B buyers used social media to make purchasing decisions, according to a 2014 survey by market research firm IDC.

From social to analytics to email marketing, much of what marketers see in technology offerings is often labeled as “cloud-based.” While cloud technology has many features and benefits, what are we really saying when we talk about the cloud?

What the cloud means… to marketers.

Beginning around 2012, multitudes of businesses across many industries began promoting “the cloud” as a feature or benefit of their products or services. Whether or not a business truly was cloud-based was not always clear, which led to the term “cloudwashing.” We hear so much about the cloud that it is easy to overlook what it really means and what the benefits really are.

The cloud is more than a buzzword – and in particular, marketers need to know what it truly means to them.

For marketers, “the cloud” has many benefits. A service that is cloud-based gives you amazing flexibility and choices over the way you use a product or service:

  • A cloud-enabled product or service can be integrated into your existing systems. For marketers, this can range from integration into websites, marketing automation systems, CRMs, point-of-sale platforms, and any other business application.
  • You don’t have to learn a new system, the way you might when adopting a new application, software package, or other enterprise system. You won’t have to set aside a lot of time and effort for new training for you or your staff.
  • Due to the flexibility that lets you integrate anywhere, you can deploy a cloud-based product or service across all of your organization’s applications or processes, increasing efficiencies and ensuring that all of your employees have access to the same technology tools at the same time.
  • There’s no need to worry about ongoing system updates, as those happen automatically behind the scenes.

In 2015, marketers should embrace the convenience of cloud-based services, as they help put the focus on benefits instead of spending time managing the technology.

Are you using data quality in the cloud?

If you are planning to move data out of an on-premise application or software to a cloud-based service, you can take advantage of this ideal time to ensure these data quality best practices are in place.

Verify and cleanse your data first, before it is moved to the cloud. Since it’s likely that your move to the cloud will make this data available across your organization — within marketing, sales, customer service, and other departments — applying data quality best practices first will increase operational efficiency and bring down costs from invalid or unusable data.

There may be more to add to this list, depending on the nature of your own business. Make sure that:

  • Postal addresses are valid, accurate, current and complete
  • Email addresses are valid
  • Telephone numbers are valid, accurate, and current
  • All data fields are consistent and every individual data element is clearly defined, to increase the effectiveness of future data analysis
  • Missing data is filled in
  • Duplicate contact and customer records are removed

Once you have cleansed and verified your existing data and moved it to the cloud, use a real-time verification and cleansing solution at the point of entry or point of collection to ensure good data quality across your organization on an ongoing basis.
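As a concrete illustration of that checklist, here is a minimal point-of-entry validator in Python. The field names and rules are hypothetical, a sketch of the idea rather than Informatica’s actual cleansing logic:

```python
import re

def validate_record(record, seen_emails):
    """Apply checklist-style rules to one contact record.

    Returns a list of data quality issues found; an empty list means
    the record passed. Illustrative rules only.
    """
    issues = []
    email = record.get("email", "").strip().lower()
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", email):
        issues.append("invalid email")
    if not record.get("postal_address"):
        issues.append("missing postal address")
    phone = re.sub(r"\D", "", record.get("phone", ""))  # keep digits only
    if len(phone) < 10:
        issues.append("invalid phone")
    if email in seen_emails:  # simple duplicate detection by email key
        issues.append("duplicate record")
    else:
        seen_emails.add(email)
    return issues

seen = set()
print(validate_record(
    {"email": "ann@example.com", "postal_address": "1 Main St", "phone": "555-123-4567"},
    seen))  # []
print(validate_record(
    {"email": "ann@example.com", "postal_address": "", "phone": "n/a"},
    seen))  # flags missing address, invalid phone, duplicate
```

A real solution would also verify that addresses are deliverable and phone numbers reachable, which requires reference data a sketch like this cannot supply.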

The biggest roadblock to effective marketing technology: bad data.

Budgeting for marketing technology is going to become a bigger and bigger piece of the pie (or cookie, if you prefer) for B2C and B2B organizations alike. The first step all marketers need to take to make sure those investments fully pay off and don’t go wasted is great customer data.

Marketing technology is fueled by data. A recent Harvard Business Review article listed some of the most important marketing technologies. They included tools for analytics, conversion, email, search engine marketing, remarketing, mobile, and marketing automation.

What do they all have in common? These tools all drive customer communication, engagement, and relationships, all of which require valid and actionable customer data to work at all.

You can’t plan your marketing strategy on data that tells you the wrong things about who your customers are, how they prefer to be contacted, and what messages work best. Make data quality a major part of your 2015 marketing technology planning to get the most from your investment.

Marketing technology is going to be big in 2015 — where do you start?

With all of this in mind, how can marketers prepare for their technology needs in 2015? Get started with this free virtual conference from MarketingProfs that is totally focused on marketing technology.

This great event includes a keynote from Teradata’s CMO, Lisa Arthur, on “Using Data to Build Strong Marketing Strategies.” Register here for the December 12 Marketing Technology Virtual Conference from MarketingProfs.

Even if you can’t make it live that day at the virtual conference, it’s still smart to sign up so you receive on-demand recordings from the sessions when the event ends. Register now!

Posted in Cloud, Data Migration, Data Quality, Data Services

When Data Integration Saves Lives

In an article published in Health Informatics, its author, Gabriel Perna, claims that data integration could save lives, as we learn more about illnesses and causal relationships.

According to the article, in Hamilton County Ohio, it’s not unusual to see kids from the same neighborhoods coming to the hospital for asthma attacks.  Thus, researchers wanted to know if it was fact or mistaken perception that an unusually high number of children in the same neighborhood were experiencing asthma attacks.  The next step was to review existing data to determine the extent of the issues, and perhaps how to solve the problem altogether.

“The researchers studied 4,355 children between the ages of 1 and 16 who visited the emergency department or were hospitalized for asthma at Cincinnati Children’s between January 2009 and December 2012. They tracked those kids for 12 months to see if they returned to the ED or were readmitted for asthma.”

Not only were the researchers able to determine a sound correlation between the two data sets, but they were able to advance the research to predict which kids were at high-risk based upon where they live.  Thus, some of the cause and the effects have been determined.

This came about when researchers began thinking out of the box, when it comes to dealing with traditional and non-traditional medical data.  They integrated housing and census data, in this case, with that of the data from the diagnosis and treatment of the patients.  These are data sets unlikely to find their way to each other, but together they have a meaning that is much more valuable than if they just stayed in their respective silos.

“Non-traditional medical data integration has begun to take place in some medical collaborative environments already. The New York-Presbyterian Regional Health Collaborative created a medical village, which ‘goes beyond the established patient-centered medical home mode.’ It not only connects an academic medical center with a large ambulatory network, medical homes, and other providers with each other, but community resources such as school-based clinics and specialty-care centers (the ones that are a part of NYP’s network).”

The fact of the matter is that data is the key to understanding what is going on when clusters of sick people begin to emerge. While researchers and doctors can treat individual patients, there is often no good understanding of the larger issues that may be at play; in this case, poor air quality in poor neighborhoods. Now, the researchers understand what problem needs to be corrected.

The universal sharing of data is really the larger solution here, but one that won’t be approached without a common understanding of the value, and funding.  As we pass laws around the administration of health care, as well as how data is to be handled, perhaps it’s time we look at what the data actually means.  This requires a massive deployment of data integration technology, and the fundamental push to share data with a central data repository, as well as with health care providers.

Posted in Data Integration, Data Integration Platform, Data Quality, Data Services

What Would Retail Peak Season Be Like Without Address Data Quality?

Address Data Quality

Why You NEED Address Data Quality

Reliable shipping is something customers take for granted, and never more so than during the holidays.

Peak season is here, and delivery companies have plans in place to successfully deliver every last package:

  • The USPS added delivery on Sunday and Christmas Day this year, after last year’s double-digit rise in package volumes during peak season.
  • In December alone, the United Parcel Service (UPS) forecasts it will deliver 585 million packages, an increase of 11% over last year.
  • UPS is also investing $175 million in its peak season preparedness plan, and will add 95,000 seasonal workers (nearly 10 times the number of Federal Emergency Management Agency, or FEMA, employees in 2013).
  • According to the National Retail Federation, 44% of consumers will do their shopping on the web, which translates to a lot of deliveries.

For retailers, that means a lot of addresses in your company’s database, and a copious number of deliveries that depend on them.

The big rise in deliveries this year got me thinking: What would the holidays be like if there were no such thing as a postal address?

It’s safe to say, the holidays would be a lot less cheery. With our current reliance on mapping applications, it would be tough to get from home to the new toy store and back. Sadly, a lot of holiday greeting cards would get stamped “return to sender.” And without mapping applications or GPS, it would take a little more effort to get to grandmother’s house this year.

I think the only person who would be successfully delivering any gifts this year would be Santa (since he has his own rooftop-to-rooftop accuracy built into his magical sleigh).

Of course, one of the biggest places impacted would be the retail industry. The peak season at the end of the year is the time for retail businesses to make or break their reputations.

With the increased delivery volume, what mistakes might occur if all address data suddenly disappeared?

In a season without address data quality, your reputation and company could suffer in a number of ways:

  • Faulty addresses mean a weak customer database
  • Erroneous shipping means you’ll pay for delivery, returns, and re-delivery
  • Loss of customers and hurt reputations during peak sales time

A truly data-centric company treats address data as the foundation for customer information, and this would be more challenging to do without quality address verification.

In a peak season without address verification, I imagine companies would have to turn to alternative means to estimate locations and distances such as the geocoding process from Google Maps, which would leave them at a few disadvantages as delivery trucks navigate the icy roads during wintertime.
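For a sense of what such estimates look like, the distance between two geocoded points can be approximated with the haversine formula. A small Python sketch, illustrative only, of why a great-circle estimate is no substitute for a validated street address:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two geocoded points, in miles.

    The kind of rough estimate a geocoding fallback gives you: fine for
    routing between cities, useless for finding a specific front door.
    """
    r = 3958.8  # Earth's mean radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# New York City to Philadelphia: roughly 80 miles as the crow flies
print(round(haversine_miles(40.7128, -74.0060, 39.9526, -75.1652)))
```

An estimate like this tells a truck which city to head for, but only a validated, complete postal address gets the package to the right doorstep.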

Informatica’s Address Validation offers benefits that Google Maps’ geocoding does not, including:

  • Address validation and corrections
  • Availability as an on-premise solution for customer security and privacy
  • User-friendly experience as a leader in lookup and cleansing for 20+ years
  • Exact geocoding for properties (not estimates or approximations)
  • Partnership with the Universal Postal Union and all five existing global postal certifications

Approximate locations and uncertified data won’t cut it when customers expect on-time delivery, every time. Along with these benefits that make it invaluable for customer shipping and other postal mail uses, Informatica’s Address Validation sets the standard in 240 countries and territories.

Luckily, we do not live in a world without address quality. It is possible to ensure every last package and parcel makes it to its destination on time, while making it to grandmother’s house on time, sending greeting cards to our whole list, and bringing home lovingly selected gifts from the store to wrap and tuck under the tree.

How do you measure how your company is rating with its customer address quality? You can get started with this ebook from Informatica, “Three Ways to Measure Address Quality.”

Posted in Data Quality, Data Services, Retail

Securing Sensitive Data in Test and Development Environments

Securing Test and Development Environments

Do you use copies of production data in test and development environments? This is common practice in IT organizations. For this reason, test environments have become the number one target for outside intruders. That being said, most data breaches occur when non-malicious insiders accidentally expose sensitive data in the course of their daily work. Insider data breaches can be more serious and harder to detect than intruder events.

If you use production data in test and development environments, or are looking for alternative approaches, register for the first webinar in a three-part series on data security gaps and remediation. On December 9th, Adrian Lane, Security Analyst at Securosis, will join me to discuss security for test environments.

This webinar will focus on how data-centric security can shore up vulnerabilities in one of the key focus areas: test and development environments. It is common practice for non-production database environments to be created by copying production data, which potentially exposes sensitive and confidential production data to developers, testers, and contractors alike. Commonly, 6-10 copies of a production database are created for each application environment, and they are regularly provisioned to support development, testing and training efforts. Since the security controls deployed for the source database are not replicated in the test environments, this is a glaring hole in data security and a target for external or internal exploits.
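One common remediation in this space is data masking: replacing sensitive values with realistic but fictitious ones before a copy reaches a test environment. A minimal sketch of the idea in Python; this is illustrative, not Informatica’s actual masking engine:

```python
import hashlib

def mask_value(value, salt="test-env"):
    """Deterministically pseudonymize a sensitive string.

    The same input always maps to the same token, so joins and
    referential integrity across masked tables still work.
    Illustrative only; real masking tools generate format-preserving,
    realistic-looking substitutes.
    """
    digest = hashlib.sha256((salt + value).encode()).hexdigest()
    return "MASKED-" + digest[:8]

production_row = {"customer": "Jane Doe", "ssn": "123-45-6789", "balance": 1042.50}
test_row = {
    "customer": mask_value(production_row["customer"]),
    "ssn": mask_value(production_row["ssn"]),
    "balance": production_row["balance"],  # non-sensitive fields pass through
}
print(test_row)
```

Because the mapping is deterministic, a customer masked in one table matches the same customer masked in another, which is what keeps a masked test database usable for development and QA.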

In this webinar, we will cover:

  • Key trends in enterprise data security
  • Vulnerabilities in non-production application environments (test and development)
  • Alternatives to consider when protecting test and development environments
  • Priorities for enterprises in reducing attack surface for their organization
  • Compliance and internal audit cost reduction
  • Data masking and synthetic data use cases
  • Informatica Secure Testing capabilities

Register for the webinar today at http://infa.media/1pohKov. If you cannot attend the live event, be sure to watch the webinar on-demand.

Posted in Application ILM, Business Impact / Benefits, Data Security, Data Services

Is Thanksgiving the Most Data-Driven Holiday?

Is Thanksgiving the Most Data-Driven Holiday?

Thanksgiving: The Most Data-Driven Holiday?

Turkey, cranberry sauce, pumpkin pie… and lots and lots of data on the side. Thanksgiving has to be the most data-driven holiday we have. After Black Friday, we will see a slew of reports and headlines describing how the retail industry’s sales performed.

What are retailers all abuzz about right now as we get closer to the Black Friday / Cyber Monday dates?

This year, retailers are spending between 31% and 50% of their online marketing budgets on holiday-related efforts alone, according to the National Retail Federation.

The NRF also projects that almost 20% of the retail industry’s annual sales will come from the holiday period.

This sounds like so much pressure, any retail marketer would crack like a chestnut under it. What’s a marketer to do? Here are some tactics and results around peak season holiday marketing.

Segmentation is a great way to see a big lift in sales. If you have the data to understand your customers and identify emerging segments in your market, you can find new business and directly target it with specific messaging. One recent case reported by Target Marketing Magazine showed a 40% year-over-year sales increase from a new segment that was identified and targeted with tailored messaging. Learn more about getting the most from your customer data in this white paper, “Data Quality Management: Beyond the Basics.”

Know your email sender reputation. You spend so much effort getting the right message out via email marketing, with the right timing over the days leading up to Black Friday, Cyber Monday, and Green Monday into December. But a poor email sender reputation can lead to those efforts being blocked or sent to the junk folder instead of to your customers. Global retailer BCBG proactively avoided this, using data validation and cleansing techniques to send their sender reputation sky-high (which you can read more about here). Don’t let a bad sender reputation limit you during the holidays, especially when it can take up to 15 days to fix a major issue like this. Find out more in this short informative video, “Email Sender Reputation: What Does it Mean to Marketers?”

Everyone, positively everyone, is talking about mobile. Practically every article you come across on the subject of retail industry sales around the peak season holiday mentions mobile in some respect – for very good reason. Shoppers take their phones and tablets shopping with them to compare prices, look for deals, and research their purchases beforehand. The NRF reports that 7 out of 10 retailers they surveyed are putting major investments into their mobile-friendly websites. A well-timed SMS message to your customers with a special deal just for them could work wonders. You can take advantage of the huge popularity of SMS mobile messaging in just a few steps. See how in this SMS Quick-Start Guide created for marketers like you.

Posted in Customers, Data Services, Retail

Raised Expectations and New Discoveries with Great Customer Data

New Discoveries with Great Customer Data

We have a landing! On November 12, the Rosetta probe arrived at its destination, a comet 300 million miles away from Earth.

Fulfilling its duty after a 10-year journey, Rosetta dropped its lander, Philae, to gather data from the comet below.

Everything about the comet so far is both challenging and fascinating, from its advanced age (4.6 billion years old) to its hard-to-pronounce name, Churyumov-Gerasimenko.

The size of Gerasimenko? Roughly that of lower Manhattan. The shape wasn’t the potato-like image some anticipated of a typical comet; instead, its form was compared to that of a rubber duck, making the landing trickier than expected.

To add one more challenge, the comet was flying at 38,000 mph. Landing the probe on the comet has been compared to hitting a speeding bullet with another speeding bullet.

None of this would have been possible if the ESA didn’t have serious data behind the lofty goal it set forth.

As this was happening, on the same day there was a quieter landing: Salesforce and LinkedIn paired up to publish research on marketing strategies, surveying 900+ senior-level B2B and B2C marketers through their social networks about their roles, marketing trends, and the challenges they face.

This one finding stood out to me: “Only 17% of respondents said their company had fully integrated their customer data across all areas of the organization. However, 97% of those ‘fully integrated’ marketing leaders said they were at least somewhat effective at creating a cohesive customer journey across all touchpoints and channels.”

While not as many companies have fully integrated their customer data as should, those who have feel the strong impact of the benefits. It’s like knowing whether you are approaching a potato-shaped company (say, a B2C) or a rubber-duck-shaped one (a B2B).

Efficient customer data could help you learn how to land each one properly. While the methods for dealing with both might be similar, they’re not identical, and taking the wrong approach could mean a failed landing. One of the conclusions from the survey showed there is a “direct link between how well a marketer integrated customer data and the self-reported successes of that brand’s strategy.”

When interviewed by MSNBC about the comet landing, Bill Nye, also known as “the Science Guy,” had many positive things to say about the historic event. One question he answered was why programs like the ESA exist, or, basically, why we go to space.

Nye had two replies: “It raises the expectations of your society,” and “You’re going to make discoveries.”

Marketers armed with insights from powerful customer data can have their own “landing on a comet” moment. Properly integrated customer data means you’ll be making new discoveries about your own clientele while simultaneously raising the expectations of your business.

The world couldn’t progress without quality data, whether in the realm of retail or planetary science. We put a strong emphasis on validating and cleansing your customer data at the point of entry or the point of collection.
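
Point-of-entry validation can be as simple as normalizing and checking values the moment a record is captured. The Python sketch below is a toy illustration of the idea only; the field names and rules are invented for this example, and stand in for what a full verification service (which checks against live mailbox, postal, and carrier data) actually does.

```python
import re

# Toy point-of-entry checks -- illustrative only, not a real
# verification service. Rules and field names are invented.

EMAIL_RE = re.compile(r"^[\w.+-]+@[\w-]+(\.[\w-]+)+$")

def clean_record(record):
    """Validate and normalize a contact record as it is captured."""
    email = record.get("email", "").strip().lower()
    phone = re.sub(r"\D", "", record.get("phone", ""))  # keep digits only
    errors = []
    if not EMAIL_RE.match(email):
        errors.append("invalid email")
    if len(phone) not in (10, 11):  # NANP-style lengths, for illustration
        errors.append("invalid phone")
    return {"email": email, "phone": phone, "errors": errors}

record = clean_record({"email": " Jane.Doe@Example.COM ", "phone": "(555) 123-4567"})
print(record)
```

Catching a bad email or phone number here, before it lands in the CRM, is far cheaper than cleansing it downstream.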

Check out a quick video demo here of three data quality solutions: Email Verification, Address Verification, and Phone Validation.

Posted in Data Quality, Data Services

Data First: Five Tips To Reduce the Risk of A Breach


This article was originally published on www.federaltimes.com

November – that time of the year. This year, November 1 was the start of Election Day weekend and the associated endless barrage of political ads. It also marked the end of Daylight Saving Time. But, perhaps more prominently, it marked the beginning of the holiday shopping season. Winter holiday decorations erupted in stores even before Halloween decorations were taken down. There were commercials and ads, free shipping on this, sales on that, singing, and even the first appearance of Santa Claus.

However, it’s not all joy and jingle bells. The kickoff to this holiday shopping season may also remind many of the countless credit card breaches at retailers that plagued last year’s shopping season and beyond. The breaches at Target, where almost 100 million credit cards were compromised, Neiman Marcus, Home Depot and Michael’s exemplify the urgent need for retailers to aggressively protect customer information.

In addition to the holiday shopping season, November also marks the next round of open enrollment for the ACA healthcare exchanges. Therefore, to avoid falling victim to the next data breach, government organizations, as much as retailers, need to have data security top of mind.

According to the New York Times (Sept. 4, 2014), “for months, cyber security professionals have been warning that the healthcare site was a ripe target for hackers eager to gain access to personal data that could be sold on the black market. A week before federal officials discovered the breach at HealthCare.gov, a hospital operator in Tennessee said that Chinese hackers had stolen personal data for 4.5 million patients.”

Acknowledging the inevitability of further attacks, companies and organizations are taking action. For example, the National Retail Federation created the NRF IT Council, which is made up of 130 technology-security experts focused on safeguarding personal and company data.

Is government doing enough to protect personal, financial and health data in light of these increasing and persistent threats? The quick answer: no. The federal government as a whole is not meeting the data privacy and security challenge. Reports of cyber attacks and breaches are becoming commonplace, and warnings of new privacy concerns in many federal agencies and programs are being discussed in Congress, Inspector General reports and the media. According to a recent Government Accountability Office report, 18 out of 24 major federal agencies in the United States reported inadequate information security controls. Further, FISMA and HIPAA are falling short, and antiquated security protocols, such as encryption, are not keeping up with the sophistication of attacks. Government must follow the lead of industry and look to new and advanced data protection technologies, such as dynamic data masking and continuous data monitoring, to prevent and thwart potential attacks.

These five principles can be implemented by any agency to curb the likelihood of a breach:

1. Expand the appointment and authority of CSOs and CISOs at the agency level.

2. Centralize the agency’s data privacy policy definition and implement on an enterprise level.

3. Protect all environments from development to production, including backups and archives.

4. Prioritize data and application security at the same level as network and perimeter security.

5. Ensure data security follows the data through downstream systems and reporting.
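
On the dynamic data masking mentioned above: the idea is that sensitive fields are rewritten per request, based on the caller’s role, while the stored record is never modified. The Python sketch below is a toy illustration of that idea only; the roles, fields, and masking rules are invented, and a real product applies this by rewriting queries in flight rather than in application code.

```python
# Toy dynamic data masking: sensitive fields are rewritten per
# request based on the caller's role; the stored record is untouched.
# Roles, fields, and rules here are invented for illustration.

MASK_RULES = {
    "ssn":   lambda v: "***-**-" + v[-4:],
    "email": lambda v: v[0] + "***@" + v.split("@", 1)[1],
}

PRIVILEGED_ROLES = {"auditor", "dba"}

def read_record(record, role):
    """Return a masked copy for non-privileged roles, full copy otherwise."""
    if role in PRIVILEGED_ROLES:
        return dict(record)
    masked = {}
    for field, value in record.items():
        rule = MASK_RULES.get(field)
        masked[field] = rule(value) if rule else value
    return masked

row = {"name": "Ada", "ssn": "123-45-6789", "email": "ada@agency.gov"}
print(read_record(row, "analyst"))   # masked view
print(read_record(row, "auditor"))   # full view
```

Because masking happens at read time, the same record can serve a call-center screen, an analyst’s report, and an auditor’s review with different levels of exposure.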

So, as the season of voting, rollbacks, online shopping events, free shipping, Black Friday, Cyber Monday and healthcare enrollment begins, so does the time for protecting personally identifiable information, financial information, credit cards and health information. Individuals, retailers, industry and government need to think about data first and stay vigilant and focused.

This article was originally published on www.federaltimes.com. Please view the original listing here

Posted in B2B, B2B Data Exchange, Data First, Data Security, Data Services

Fast and Fasterer: Screaming Streaming Data on Hadoop

Guest Post by Dale Kim

This is a guest blog post, written by Dale Kim, Director of Product Marketing at MapR Technologies.

Recently published research shows that “faster” is better than “slower.” The point, ladies and gentlemen, is that speed, for lack of a better word, is good. But granted, you won’t always have the need for speed. My Lamborghini is handy when I need to elude the Bakersfield fuzz on I-5, but it does nothing for my Costco trips. There, I go with capacity and haul home my 30-gallon tubs of ketchup with my Ford F150. (Note: this is a fictitious example; I don’t actually own an F150.)

But if speed is critical, like in your data streaming application, then Informatica Vibe Data Stream and the MapR Distribution including Apache™ Hadoop® are the technologies to use together. Since Vibe Data Stream works with any Hadoop distribution, though, my discussion here is more broadly applicable. I first discussed this topic earlier this year during my presentation at Informatica World 2014. In that talk, I also briefly described architectures that include streaming components, like the Lambda Architecture and enterprise data hubs. I recommend that any enterprise architect become familiar with these high-level architectures.

Data streaming deals with a continuous flow of data, often at a fast rate. As you might’ve suspected by now, Vibe Data Stream, based on the Informatica Ultra Messaging technology, is great for that. With its roots in high speed trading in capital markets, Ultra Messaging quickly and reliably gets high value data from point A to point B. Vibe Data Stream adds management features to make it consumable by the rest of us, beyond stock trading. Not surprisingly, Vibe Data Stream can be used anywhere you need to quickly and reliably deliver data (just don’t use it for sharing your cat photos, please), and that’s what I discussed at Informatica World. Let me discuss two examples I gave.

Large Query Support. Let’s first look at “large queries.” I don’t mean the stuff you type on search engines, which are typically no more than 20 characters. I’m referring to an environment where the query is a huge block of data. For example, what if I have an image of an unidentified face, and I want to send it to a remote facial recognition service and immediately get the identity? The image would be the query, the facial recognition system could be run on Hadoop for fast divide-and-conquer processing, and the result would be the person’s name. There are many similar use cases that could leverage a high speed, reliable data delivery system along with a fast processing platform, to get immediate answers to a data-heavy question.
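
That “large query” flow is essentially scatter-gather: fan the data-heavy query out to partitions, let each partition answer locally, then reduce to a single result. The Python sketch below is a small, self-contained illustration of the pattern only; the names and the toy vectors (standing in for face embeddings extracted from the image) are invented, and a real deployment would run the partition step on Hadoop workers rather than local threads.

```python
from concurrent.futures import ThreadPoolExecutor

# Scatter-gather sketch: a toy "embedding" vector stands in for the
# query image. Each gallery partition returns its best local match;
# a final reduce picks the overall winner. All data is invented.

def distance(a, b):
    """Squared Euclidean distance between two embedding vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def best_in_partition(partition, query):
    """Local step: nearest (name, embedding) pair in one partition."""
    return min(partition, key=lambda item: distance(item[1], query))

gallery_partitions = [
    [("alice", (0.1, 0.9)), ("bob", (0.8, 0.2))],
    [("carol", (0.5, 0.5)), ("dave", (0.9, 0.9))],
]

def identify(query):
    """Fan out the query, then reduce the local winners to one name."""
    with ThreadPoolExecutor() as pool:
        local_best = pool.map(best_in_partition, gallery_partitions,
                              [query] * len(gallery_partitions))
        name, _ = min(local_best, key=lambda item: distance(item[1], query))
    return name

print(identify((0.85, 0.15)))
```

The shape of the answer is tiny (a name) even though the query itself is huge, which is exactly why a fast, reliable delivery layer for the query payload matters.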

Data Warehouse Onload. For another example, we turn to our old friend the data warehouse. If you’ve been following all the industry talk about data warehouse optimization, you know pumping high speed data directly into your data warehouse is not an efficient use of your high value system. So instead, pipe your fast data streams into Hadoop, run some complex aggregations, then load that processed data into your warehouse. And you might consider freeing up large processing jobs from your data warehouse onto Hadoop. As you process and aggregate that data, you create a data flow cycle where you return enriched data back to the warehouse. This gives your end users efficient analysis on comprehensive data sets.
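
That onload cycle can be sketched in a few lines: raw, high-rate events land in the cheap processing tier (Hadoop in the post; a plain list below), get aggregated there, and only the compact summary is loaded into the warehouse. The event fields and the stand-in “warehouse” here are invented for illustration; in practice the aggregation would be a distributed job and the load a bulk warehouse insert.

```python
from collections import Counter

# Toy onload pattern: aggregate the raw stream in the processing
# tier, load only the summary into the warehouse. Fields invented.

raw_events = [
    {"sku": "A1", "qty": 2}, {"sku": "B7", "qty": 1},
    {"sku": "A1", "qty": 3}, {"sku": "B7", "qty": 4},
]

def aggregate(events):
    """The 'Hadoop job': collapse the event stream to per-SKU totals."""
    totals = Counter()
    for event in events:
        totals[event["sku"]] += event["qty"]
    return dict(totals)

warehouse = {}  # stand-in for a data warehouse table

def onload(events):
    """Load only the compact aggregates, never the raw stream."""
    warehouse.update(aggregate(events))

onload(raw_events)
print(warehouse)
```

The warehouse ends up holding two summary rows instead of every raw event, which is the whole point of keeping the heavy lifting off the high-value system.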

Hopefully this stirs up ideas on how you might deploy high speed streaming in your enterprise architecture. Expect to see many new stories of interesting streaming applications in the coming months and years, especially with the anticipated proliferation of internet-of-things and sensor data.

To learn more about Vibe Data Stream, you can find it on the Informatica Marketplace.



Posted in Big Data, Business Impact / Benefits, Data Services, Hadoop

Informatica’s Hadoop Connectivity Reaches for the Clouds

The Informatica Cloud team has been busy updating connectivity to Hadoop using the Cloud Connector SDK. Updated connectors are available now for Cloudera and Hortonworks, and new connectivity has been added for MapR, Pivotal HD and Amazon EMR (Elastic MapReduce).

Informatica Cloud’s Hadoop connectivity brings a new level of ease of use to Hadoop data loading and integration. Informatica Cloud provides a quick way to load data from popular on-premises data sources and apps, such as SAP and Oracle E-Business Suite, as well as SaaS apps, such as Salesforce.com, NetSuite, and Workday, into Hadoop clusters for pilots and POCs. Less technical users are empowered to contribute to enterprise data lakes through the easy-to-use Informatica Cloud web user interface.

Informatica Cloud’s rich connectivity to a multitude of SaaS apps can now be leveraged with Hadoop.  Data from SaaS apps for CRM, ERP and other lines of business are becoming increasingly important to enterprises. Bringing this data into Hadoop for analytics is now easier than ever.

Users of Amazon Web Services (AWS) can leverage Informatica Cloud to load data from SaaS apps and on-premises sources directly into EMR. Combined with connectivity to Amazon Redshift, Informatica Cloud can be used to move data into EMR for processing and then on to Redshift for analytics.

Self service data loading and basic integration can be done by less technical users through Informatica Cloud’s drag and drop web-based user interface.  This enables more of the team to contribute to and collaborate on data lakes without having to learn Hadoop.

Bringing the cloud and Big Data together to put the potential of data to work – that’s the power of Informatica in action.

Free trials of the Informatica Cloud Connector for Hadoop are available here: http://www.informaticacloud.com/connectivity/hadoop-connector.html

Posted in Big Data, Data Services, Hadoop, SaaS