Tag Archives: B2B

British Cycling: A Big Data Champion?

I think I may have gone to too many conferences in 2014 in which the potential of big data was discussed.  After a while all the stories blurred into two main themes:

  1. Companies have gone bankrupt at a time when demand for their core products increased.
  2. Data from mobile phones, cars and other machines house a gold mine of value – we should all be using it.

My main takeaway from the 2014 conferences was that no amount of data can compensate for a poor strategy, or for a lack of organisational agility to adapt business processes in times of disruption. However, I still feel that, as an industry, our stories are stuck in the phase of ‘Big Data Hype’, while most organisations are beyond the hype and need practicalities, guidance and inspiration to turn their big data projects into a success. This is possibly due to the limited number of big data projects in production, or perhaps it is too early to measure the long-term results of existing projects. Another possibility is that the projects are delivering significant competitive advantage, so the stories will remain under wraps for the time being.

However, towards the end of 2014 I stumbled across a big data success story in an unexpected place. It did (literally) provide competitive advantage, and since the programme has been running for a number of years, the results are plain to see. It started with a book recommendation from a friend. ‘Faster’ by Michael Hutchinson is written as a self-propelled investigation into the difference between world-champion and world-class athletes. It promised to satisfy my slightly geeky tendency to enjoy facts, numerical details and statistics. It did this – but it also struck me as a ‘how-to’ guide for big data projects.

Mr Hutchinson’s book is an excellent read as an insight into professional cycling by a professional cyclist. It is stacked with interesting facts and well-written anecdotes, and I highly recommend reading it. Since the big data aspect was a sub-plot, I will pull out the highlights without distracting from the main story.

Here are the five steps I extracted for big data project success:

1. Have a clear vision and goal for your project

The Sydney Olympics in 2000 had produced only 4 medals across all cycling disciplines for British cyclists. With a home Olympics set for 2012, British Cycling desperately wanted to improve on this performance. Specific targets were set across all disciplines, stated as the times an athlete would need to achieve in order to win a race.

2. Determine the data required to support these goals

Unlike many big data projects, which start with a data set and then wonder what to do with it, British Cycling worked the other way around. They worked out what they needed to measure in order to establish the influencers on their goal (track time) and set about gathering that information. In their case this involved gathering wind tunnel data to compare and contrast equipment, as well as physiological data from athletes and data from every cycling activity.

3. Experiment in order to establish causality

Most big data projects involve experimentation: changing the environment whilst gathering the relevant data points. The number of variables to adjust in cycling is large, but all of them were embraced. Data (including video) was gathered on the effects of small changes to each component: bike, clothing and athlete (training and nutrition).
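To make the experimental discipline concrete, here is a purely illustrative sketch in Python – the numbers are invented, not British Cycling’s data – comparing wind-tunnel drag for two skinsuit fabrics while every other variable is held constant:

```python
# Illustrative only: a toy comparison of wind-tunnel runs for two skinsuit
# fabrics, holding every other variable (bike, rider position) constant.
# All figures are invented for the sketch.
from statistics import mean, stdev

baseline_drag = [7.42, 7.39, 7.45, 7.41, 7.44]   # drag (N), standard suit
candidate_drag = [7.18, 7.22, 7.15, 7.20, 7.19]  # drag (N), new fabric

saving = mean(baseline_drag) - mean(candidate_drag)
print(f"baseline:  {mean(baseline_drag):.2f} N ± {stdev(baseline_drag):.2f}")
print(f"candidate: {mean(candidate_drag):.2f} N ± {stdev(candidate_drag):.2f}")
print(f"mean drag saving: {saving:.2f} N")
```

The same change-one-variable, measure, compare loop applies whether the component under test is the bike, the clothing or the athlete’s nutrition.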

4. Guide your employees on how to use the results of the data

Like many employees, cyclists and coaches were convinced of the ‘best way’ to achieve results based on their own personal experience. In some cases, analysis of the data showed that the perceived best way was in fact not the best way. Coaching staff trusted the data and convinced the athletes to change aspects of both training and nutrition. This was not necessarily easy to do, as it could mean fundamental changes in the athlete’s lifestyle.

5. Embrace innovation

Cycling is a very conservative sport by nature, with many of its key innovations coming from adjacent sports such as triathlon. Data, however, is not steeped in tradition and has no preconceived ideas about what equipment should look like or what constitutes an excellent recovery drink. What made British Cycling’s big data initiatives successful is that they allowed themselves to be guided by the data and put the recommendations into practice. Plastic-finished skinsuits are probably not the most obvious choice of clothing, but they proved to be the biggest advantage a cyclist could get – far more than tinkering with the bike. (In fact they produced so much advantage that they were banned shortly after the 2008 Olympics.)

The results: British Cycling won 4 Olympic medals in 2000, one of which was gold. In 2012 they grabbed 8 gold, 2 silver and 2 bronze medals. A quick glance at their website shows that it is not just Olympic medals they are winning – the number of medals won across all world championship events has increased since 2000.

To me, this is one of the best big data stories, as it directly shows how to succeed with big data strategies in a completely analogue world. I think it is more insightful than the mere fact that we are producing ever-increasing volumes of data. The real value of big data lies in understanding what portion of all available data will contribute to achieving your goals, and then embracing the results of analysis to make constructive changes in daily activities.

But then again, I may just like the story because it involves geeky facts, statistics and fast bicycles.

Real-time Data Puts CEO in Charge of Customer Relationships

In 2014, Informatica Cloud focused a great deal of attention on the needs and challenges of the citizen integrator. These are the critical business users at the core of every company: the customer-facing sales rep at the front, as well as the tireless admin at the back. We all know and rely on these men and women. And up until very recently, they’ve been almost entirely reliant on IT for the integration tasks and processes needed to be successful at their jobs.

A lot of that has changed over the last year or so. In a succession of releases, we provided these business users with the tools to take matters into their own hands. And with the assistance of key ecosystem partners, such as Salesforce, SAP, Amazon, Workday, NetSuite and the hundreds of application developers that orbit them, we’ve made great progress toward giving business users the self-sufficiency they need and demand. But beyond giving these users the tools to integrate and connect with their apps and information at will, what we’ve really done is give them the ability to focus their attention and efforts on their most valuable customers. By doing so, we have gotten to the core of the real purpose and importance of the whole cloud project or enterprise: the customer relationship.

In a recent Fortune interview, Salesforce CEO and cloud evangelist Marc Benioff echoed that idea when he stated that “The CEO is now in charge of the customer relationship.” What he meant by that is companies now have the ability to tie all aspects of their marketing – website, customer service, email marketing, social, sales, etc. – into “one canonical file” with all the respective customer information. By organizing the enterprise around the customer this way, the company can then pivot all of their efforts toward the customer relationship, which is what is required if a business is going to have and sustain success as we move through the 2010s and beyond.

We are in complete agreement with Marc and think it wouldn’t be too much of a stretch to declare 2015 the year of the customer relationship. In fact, helping companies and business users turn their attention toward the customer has been a core focus of ours for some time. For an example, you don’t have to look much further than the latest iteration of our real-time application integration capability.

In a short video demo that I recommend to everyone, my colleague Eric does a fantastic job of walking users through the real-time features available on the Informatica Cloud platform.

As the video shows, the real-time features let you build a workflow process application that interacts with data from cloud and on-premise sources right from the Salesforce user interface (UI). It’s quick and easy, allowing you to devote more time to your customers and less time to “plumbing.”

The workflows themselves are created with the help of a drag-and-drop process designer that enables the user to quickly create a new process and configure the parameters, inputs and outputs, and decision steps with the click of a few buttons.
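For readers who think in code rather than screenshots, a hypothetical sketch of what such a guided process might boil down to is below. To be clear, the step names and structure are invented for illustration; this is not the Informatica Cloud process designer’s actual format or API.

```python
# Hypothetical sketch of a guided process definition. The step names,
# fields and structure are invented and are NOT Informatica's format.
from dataclasses import dataclass, field

@dataclass
class Step:
    name: str
    action: str                                           # e.g. "query", "create", "sync"
    system: str                                           # e.g. "Salesforce", "SAP"
    next_steps: list[str] = field(default_factory=list)   # decision branches

# An opportunity-to-order guide, roughly mirroring the demo's flow.
opportunity_to_order = [
    Step("new_opportunity", "create", "Salesforce", ["price_lookup"]),
    Step("price_lookup", "query", "Oracle", ["build_items"]),
    Step("build_items", "create", "Salesforce", ["create_order"]),
    Step("create_order", "create", "SAP", ["sync_back"]),
    Step("sync_back", "sync", "Salesforce", []),
]

for step in opportunity_to_order:
    print(f"{step.name}: {step.action} in {step.system} -> {step.next_steps}")
```

The drag-and-drop designer effectively builds a structure like this for you, so the business user configures inputs, outputs and decision steps without writing any of it by hand.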

Once the process guide is created, it displays as a window embedded right in the Salesforce UI. So if, for example, you’ve created an opportunity-to-order guide, you can follow a wizard-driven process that walks your users from new opportunity creation through to the order confirmation, and everything in between.

As users move through the process, they can interact in real time with data from any on-premise or cloud-based source they choose. In the example from the video, the user, Eric, chooses a likely prospect from a list of company contacts, and with a few keystrokes creates a new opportunity in Salesforce.  In a further demonstration of the real-time capability, Eric performs a NetSuite query, logs a client call, escalates a case to customer service, pulls the latest price book information from an Oracle database, builds out the opportunity items, creates the order in SAP, and syncs it all back to Salesforce, all without leaving the wizard interface.

The capabilities available via Informatica Cloud’s application integration are a gigantic leap forward for business users and an evolutionary step toward pivoting the enterprise toward the customer. As 2015 takes hold we will see this become increasingly important as companies continue to invest in the cloud. This is especially true for those cloud applications, like the Salesforce Analytics, Marketing and Sales Clouds, that need immediate access to the latest and most reliable customer data to make them all work — and truly establish you as the CEO in charge of customer relationships.

Informatica Rev: Data Democracy At Last – Part 2

This is a continuation of Part 1 of this blog, which you can read here.

Now, if you are in IT, reading about how Informatica Rev enables the everyday business users in your company to participate in the Data Democracy might feel like treachery. You are likely thinking that Informatica is letting the bull loose in your fine china shop. You likely feel, first, that Informatica is supporting the systematic bypass of all the data governance that IT has worked hard to put in place, and second, that Informatica is alienating the very IT people who have approved of and invested in Informatica for decades.

While I can understand this thought process, I am here to proudly inform you that these thoughts could not be further from the truth! In fact, in the not too distant future, Informatica will be in a very strong position to offer a unique technology solution that lets you better govern all the data in your enterprise, and do it in a way that allows you to proactively deliver the right data to the business – yes, before the masses of everyday business users have started to knock your door down to even ask for it. Informatica’s solution will help the IT and business divide that has existed in your company for decades become a match made in heaven. And you in IT get the credit for leading this transformation of your company to a Data Democracy. Listen to this webinar to hear Justin Glatz, Executive Director of Information Technology at Conde Nast, speak about how he will be leading Conde Nast’s transformation to Data Democracy.

“How?” you might ask. Well, first let’s face it: today you do not have any visibility into how the business is procuring and using most data, and therefore you are not governing most of it. Without a change in your tooling, your ability to gain this visibility is diminishing greatly, especially since the business does not have to come to you to procure and use its cloud-based applications. By having all of your everyday business users use Informatica Rev, you will, for the first time, have the potential to gain a truly complete picture of how data is being used in your company – even the data they do not come to you to procure.

In the not too distant future, you will gain this visibility through an IT companion application to Informatica Rev. You will then be able to easily operationalize your business users’ exact transformation logic – or Recipe, as we call it in Informatica Rev – into your existing repositories, be they your enterprise data warehouse, data mart or master data management repository. And, by the way, you are likely already using Informatica PowerCenter, Informatica Cloud or Informatica MDM to manage these repositories, so you already have the infrastructure that Informatica Rev will be integrated with. And if you are not using Informatica to manage these repositories, the draw of becoming proactive with your business and leading the transformation of your company to a Data Democracy will be enough to make you want to go get Informatica.

Just as these professionals have found success by participating in the Data Democracy, with Informatica Rev you finally can, too. You can try Informatica Rev for free by clicking here.

Thoughts on the CMS Medicare Quality “Five-Star” Rating System

This past October, the Centers for Medicare & Medicaid Services (CMS) announced two Medicare Quality Improvement Initiatives. I spent part of Thanksgiving weekend analyzing the most recent Star Quality bonus scores and trying to figure out where this program is going and what value we will get from it as an industry. Much of the work that I am doing these days is focused on data, which I view through the prism of health plan operations, where I spent a number of years.

CMS points out the overall improvement in quality, which they position as the result of focusing on, and incenting, quality. I agree that putting funding behind a quality program was a valuable strategy to motivate the industry. This has not always been the case; in fact, a former colleague related a common dialog from before this program:

  1. He would present a quality initiative to executive management
  2. They would nod politely and say, “Yes, of course we are interested in quality!”
  3. The conversation would continue until the cost of the program was disclosed.

The faces would change, and the response was, “Well, yes, quality is important, but funding is tight right now.  We need to focus on programs with a clear ROI”.

Thankfully the Star program has given quality initiatives a clear ROI – for which we are all grateful!

The other positive dynamic is that Medicare Advantage has provided a testing ground for new programs, largely as a result of the ACA. Programs very similar to the Star program are part of the ACO program and the marketplace membership. Risk adjustment is being fitted to these programs as well. Private insurance will likely borrow similar structures to ensure quality and fair compensation in its various risk-sharing arrangements. Medicare Advantage covers a significant subset of the population and is providing an excellent sandbox for these initiatives while improving the quality of care that our senior population receives.

My concerns are around the cultures and mission of those plans that are struggling to get to the magic four-star level at which they receive the bonus dollars.

Having worked in a health plan for almost nine years, and continuing to interact with my current customers, has shown me the dedication of the staff who work in these plans. One of my most rewarding experiences was leading the call center for the Medicare population. I was humbled each day by the caring and patience the reps on the phones showed to the senior population. I have also seen the dedication of clinical staff to ensuring that care for members is carefully coordinated and that their dignity and wishes are always respected. I sincerely hope that plans with a clear mission find the right tools and support to improve their ratings to the point where they receive the additional funding to maintain their viability and continue to serve their members and the medical community. I am sure that there are poor-quality plans out there, and I agree that they should be eliminated. But I am also rooting for the plans with a mission that are striving to be a bit better.

Securing Sensitive Data in Test and Development Environments

Do you use copies of production data in test and development environments? This is common practice in IT organizations, and for this reason test environments have become the number one target for outside intruders. That said, most data breaches occur when non-malicious insiders accidentally expose sensitive data in the course of their daily work. Insider data breaches can be more serious and harder to detect than intruder events.

If you use production data in test and development environments, or are looking for alternative approaches, register for the first webinar in a three-part series on data security gaps and remediation. On December 9th, Adrian Lane, Security Analyst at Securosis, will join me to discuss security for test environments.

This webinar will focus on how data-centric security can be used to shore up vulnerabilities in one of the key focus areas: test and development environments. It is common practice for non-production database environments to be created by making copies of production data, which potentially exposes sensitive and confidential production data to developers, testers and contractors alike. Commonly, 6-10 copies of production databases are created for each application environment, and they are regularly provisioned to support development, testing and training efforts. Since the security controls deployed for the source database are not replicated in the test environments, this is a glaring hole in data security and a target for external or internal exploits.
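One common remediation is to mask sensitive values before a copy ever reaches a test environment. The sketch below is a generic illustration of deterministic, format-preserving masking; it is not Informatica Secure Testing’s implementation, and the key and sample values are invented.

```python
# Generic data-masking sketch (not Informatica's implementation): replace
# sensitive values with deterministic, format-preserving fakes so tests
# and joins still behave consistently while no real values leave production.
import hashlib
import hmac

SECRET = b"masking-key-kept-out-of-test-envs"  # hypothetical key

def mask(value: str) -> str:
    """Return a fake with the same shape (digits stay digits, letters stay
    letters); the same input always yields the same output."""
    digest = hmac.new(SECRET, value.encode(), hashlib.sha256).hexdigest()
    out, i = [], 0
    for ch in value:
        if ch.isdigit():
            out.append(str(int(digest[i % len(digest)], 16) % 10))
            i += 1
        elif ch.isalpha():
            out.append(chr(ord("a") + int(digest[i % len(digest)], 16) % 26))
            i += 1
        else:
            out.append(ch)  # keep separators such as '-' or '@'
    return "".join(out)

print(mask("412-88-9001"))           # SSN-shaped fake, stable across runs
print(mask("jane.doe@example.com"))  # email-shaped fake
```

Because the masking is deterministic, the same production value always maps to the same fake, so referential integrity across the 6-10 copies is preserved.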

In this webinar, we will cover:

  • Key trends in enterprise data security
  • Vulnerabilities in non-production application environments (test and development)
  • Alternatives to consider when protecting test and development environments
  • Priorities for enterprises in reducing attack surface for their organization
  • Compliance and internal audit cost reduction
  • Data masking and synthetic data use cases
  • Informatica Secure Testing capabilities

Register for the webinar today at http://infa.media/1pohKov. If you cannot attend the live event, be sure to watch the webinar on-demand.

Data Integration Webinar Follow-Up: By Our First Strange and Fatal Interview

This is a guest author post by Philip Howard, Research Director, Bloor Research.

I recently posted a blog about an interview-style webcast I was doing with Informatica on the uses and costs associated with data integration tools.

I’m not sure that the poet John Donne was right when he said that it was strange, let alone fatal. Somewhat surprisingly, I have had a significant amount of feedback following this webinar. I say “surprisingly” because the truth is that I very rarely get direct feedback – most of it, I assume, goes to the vendor. So, when a number of people commented to me that the research we conducted was both unique and valuable, it was a bit of a thrill. (Yes, I know, I’m easily pleased.)

A number of questions arose as a result of our discussions. Probably the most interesting was whether moving data into Hadoop (or some other NoSQL database) should be treated as a separate use case. We certainly didn’t include it as such in our original research. In hindsight, I’m not sure that the answer I gave at the time was fully correct. I acknowledged that you certainly need some different functionality to integrate with a Hadoop environment, and that some vendors have more comprehensive capabilities than others when it comes to Hadoop (the same also applies, though with different suppliers, when it comes to integrating with, say, MongoDB, Cassandra or graph databases). However, as I pointed out in my previous blog, functionality is ephemeral: just because a particular capability isn’t supported today doesn’t mean it won’t be supported tomorrow. So that doesn’t really affect use cases.

However, where my reply was inadequate was that I only referenced Hadoop as a platform for data warehousing, stating that moving data into Hadoop was not essentially different from moving it into Oracle Exadata, Teradata or HP Vertica. And that’s true. What I forgot was the use of Hadoop as an archiving platform. As it happens, we didn’t have an archiving use case in our survey either. Why not? Because archiving is essentially a form of data migration: there are information lifecycle management, access and security issues that are relevant to archiving once it is in place, but that is after the fact – the process of discovering and moving the data is exactly the same as with data migration. So: my bad.

Aside from that little caveat, I quite enjoyed the whole event. Somebody or other (there’s always one!) didn’t quite get how quantifying the number of end points in a data integration scenario is a surrogate measure for complexity (something we took into account), so I had to explain that: each end point is another interface to map and maintain, so a scenario with ten end points is inherently more complex than one with two. Of course, it’s not perfect as a metric, but the only alternative is to ask eye-of-the-beholder questions, which aren’t very satisfactory.

Anyway, if you want to listen to the whole thing, you can find it here.

Take Action – Reduce Your Risk of Identity Theft This Holiday Season

What is our personal information worth? 

With the 2014 holiday season rolling into full swing, Americans will spend more than $600 billion, a 4.1% increase over last year. According to a Credit Union National Association poll, 45% of credit and debit card users will think twice about how they shop and pay, given the tens of millions of shoppers impacted by breaches. Stealing identities is a lucrative pastime for those with ulterior motives: the black market pays between $10 and $12 per stolen record, yet when enriched with health data the value is as high as $50 per record, because it can be used for insurance fraud.

Are the thieves getting smarter or are we getting sloppy?  

With ubiquitous access to technology globally, general acceptance of online shopping and the digitization of health records, there is more data online – and more opportunity to steal it – than ever before. Unfortunately for shoppers, 2013 was known as ‘the year of the retailer breach’, according to Verizon’s 2014 data breach report. Unfortunately for patients, healthcare providers were noted for losing the highest percentage of protected healthcare data.

So what can we do to be a smarter and safer consumer?

No one wants to bankroll the thieves’ illegal habits. One option would be to regress 20 years: drive to the mall and make our purchases cash in hand, or go back to completely paper-based healthcare. Alternatively, here are a few suggestions to avoid being on the next list of victims:

1. Avoid irresponsible vendors and providers by being an educated consumer

Sites like the Identity Theft Resource Center and the US Department of Health and Human Services expose the latest breaches in retail and healthcare respectively. Look up who you are buying from and receiving care from, and make sure they are doing everything they can to protect your data. If they didn’t respond to a breach in a timely fashion, tried to hide it, or didn’t implement new controls afterwards, avoid them. Or take your chances.

2. Expect to be hacked, plan for it

Most organizations you trust with your personal information have already experienced a breach. In fact, according to a recent survey conducted by the Ponemon Institute and sponsored by Informatica, 72% of organizations polled experienced a breach within the past 12 months; more than 20% had 2 or more breaches in the same timeframe. When setting passwords, avoid using words or phrases that you publicly share on Facebook. When answering security questions, most security professionals suggest that you lie!

3. If it really bothers you, be vocal and engage

Many states are enacting legislation to make organizations accountable for notifying individuals when a breach occurs. For example, Florida enacted FIPA – the Florida Information Protection Act – on July 1, 2014, stipulating that all breaches, large or small, are subject to notification. For every day that a breach goes undocumented, FIPA stipulates a $1,000-per-day penalty, up to an annual limit of $500,000.
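Using only the figures cited above, the exposure is simple to work out; a back-of-the-envelope sketch:

```python
# Back-of-the-envelope FIPA exposure, using the $1,000-per-day penalty
# and $500,000 annual cap cited above.
def fipa_penalty(days_undocumented: int) -> int:
    return min(days_undocumented * 1_000, 500_000)

print(fipa_penalty(90))   # 90 days undocumented -> $90,000
print(fipa_penalty(600))  # capped at the $500,000 annual limit
```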

In conclusion, with the holiday shopping season upon us, now is the perfect time for you to ensure that you’re making the best – and most informed – purchasing decisions. You have the ability to take matters into your own hands; keep your data secure this year and every year.

To learn more about Informatica Data Security products, visit our Data Privacy solutions website.

When was Your Last B2B Integration Health Check?

If you haven’t updated your B2B integration capabilities in the past five years, are you at risk of being left behind? This is the age of superior customer experience and rapid time-to-value, so speedy customer on-boarding and support for specialized integration services mean the difference between winning and losing business. A health check starts with asking some simple questions: (more…)

Manual Processes Got You Down? 4 Steps to Small Partner Enablement

In recent Aberdeen research, 95% of the 122 respondents relied on some level of manual processing in order to integrate external data sources. Manual processing to integrate external data is time-consuming, expensive and error-prone, so why do so many do it? Well, they often have little choice. If you look deeper, most of these data exchanges are with small partners, and small partner enablement is a significant challenge for most organizations. For the most part, (more…)

Avnet Story: How to Integrate 100% of B2B Data Sources

The other week we had Ayman Taha, Director of IT for Enterprise Solutions Integration at Avnet, in to talk about how he automated the processing of unstructured, non-traditional B2B exchanges with external partners. Avnet is a large distributor of electronic components with a diverse ecosystem of customers and suppliers. Their B2B infrastructure reflects the complexity of their environment, but despite a sophisticated and mature EDI infrastructure, they were still manually re-keying invoices and product updates from hundreds of spreadsheets and PDFs received from partners. This was because many of their smaller customers and partners could not send or receive EDI messages. (more…)
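The general fix for this pattern is to translate whatever a small partner can produce – typically a spreadsheet – into the same canonical records the EDI flow already emits. As a minimal sketch (with invented column names, not Avnet’s actual layout):

```python
# Minimal sketch: convert a partner's spreadsheet (exported to CSV) into
# canonical invoice records. Column names are invented for illustration.
import csv
from decimal import Decimal

def load_partner_invoices(path: str) -> list[dict]:
    invoices = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            invoices.append({
                "invoice_id": row["Invoice #"].strip(),
                "part_number": row["Part"].strip().upper(),
                "quantity": int(row["Qty"]),
                "unit_price": Decimal(row["Unit Price"].lstrip("$")),
            })
    return invoices

# Each canonical record can then flow into the same downstream process
# that already handles EDI invoices, instead of being re-keyed by hand.
```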
