Category Archives: Business/IT Collaboration

The CMO's Role in Delivering Omnichannel Customer Experiences


This article was originally posted on Argyle CMO Journal and is re-posted here with permission.

According to a new global study from SDL, 90% of consumers expect a consistent customer experience across channels and devices when they interact with brands. However, according to the Gartner report "Gartner Survey Finds Importance of Customer Experience on the Rise — Marketing is on the Hook," fewer than half of the companies surveyed rank their customer experience as exceptional today. The good news is that two-thirds expect it to be exceptional in two years. In fact, 89% plan to compete primarily on the basis of the customer experience by 2016.

So, what role do CMOs play in delivering omnichannel customer experiences?

According to a recent report, Gartner’s Executive Summary for Leadership Accountability and Credibility within the C-Suite, a high percentage of CEOs expect CMOs to lead the integrated cross-functional customer experience. Also, customer experience is one of the top three areas of investment for CMOs in the next two years.

I had the pleasure of participating on a panel discussion at the Argyle CMO Forum in Dallas a few months ago. It focused on the emergence of omnichannel and the need to deliver seamless, integrated and consistent customer experiences across channels.

Lisa Zoellner, Chief Marketing Officer of Golfsmith International, was a dynamic moderator who kept the conversation lively and the audience engaged. I was a panelist alongside Chris Berg, VP of Store Operations at The Home Depot; Chip Burgard, Senior Vice President of Marketing at CitiMortgage; and Chris Brogan, SVP of Strategy and Analytics at Hyatt Hotels & Resorts.

Below are some highlights from the panel.

Lisa Zoellner, CMO, Golfsmith International opened the panel with a statistic. “Fifty-five percent of marketers surveyed feel they are playing catch up to customer expectations. But in that gap is a big opportunity.”

What is your definition of omnichannel?

There was consensus among the group that omnichannel is about seeing your business through the eyes of your customer and delivering seamless, integrated and consistent customer experiences across channels.

Customers don’t think in terms of channels and touch points; they just expect seamless, integrated and consistent customer experiences. It’s one brand to the customer. But there is a gap between customer expectations and what most businesses can deliver today.

In fact, executives at most organizations I’ve spoken with, including the panelists, believe they are in the very beginning stages of their journey towards delivering omnichannel customer experiences. The majority are still struggling to get a single view of customers, products and inventory across channels.

“Customers don’t think in terms of channels and touch points; they just expect seamless, integrated and consistent customer experiences.”

What are some of the core challenges standing in your way?

A key takeaway was that omnichannel requires organizations to fundamentally change how they do business. In particular, it requires changing existing business practices and processes. It cannot be done without cross-functional collaboration.

I think Chris Berg, VP, Store Operations at The Home Depot said it well, “One of the core challenges is the annual capital allocation cycle, which makes it difficult for organizations to be nimble. Most companies set strategies and commitments 12-24 months out and approach these strategies in silos. Marketing, operations, and merchandising teams typically ask for capital separately. Rarely does this process start with asking the question, ‘What is the core strategy we want to align ourselves around over the next 24 months?’ If you begin there and make a single capital allocation request to pursue that strategy, you remove one of the largest obstacles standing in the way.”

Chip Burgard, Senior Vice President of Marketing at CitiMortgage focused on two big barriers. “The first one is a systems barrier. I know a lot of companies struggle with this problem. We’re operating with a channel-centric rather than a customer-centric view. Now that we need to deliver omnichannel customer experiences, we realize we’re not as customer-centric as we thought we were. We need to understand what products our customers have across lines of business such as credit cards, banking, investments and mortgage. But our systems weren’t providing a total customer relationship view across products and channels. Now we’re making progress on that. The second barrier is compensation. We have a commission-based sales force. How do you compensate the loan officers if a customer starts the transaction with the call center but completes it in the branch? That’s another issue we’re working on.”

Lisa Zoellner, CMO at Golfsmith International added, “I agree that compensation is a big barrier. Companies need to rethink their compensation plans. The sticky question is ‘Who gets credit for the sale?’ It’s easy to say that you’re channel-agnostic, but when someone’s paycheck is tied to the performance of a particular channel, it makes it difficult to drive that type of culture change.”

“We have a complicated business. More than 500 Hyatt hotels and resorts span multiple brands and regions,” said Chris Brogan, SVP of Strategy and Analytics at Hyatt Hotels & Resorts. “But, customers want a seamless experience no matter where they travel. They expect that the preference they shared during their Hyatt stay at a hotel in Singapore is understood by the person working at the next hotel in Dallas. So, we’re bridging those traditional silos all the way down to the hotel. A guest doesn’t care if the person they’re interacting with is from the building engineering department, from the food and beverage department, or the rooms department. It’s all part of the same customer experience. So we’re looking at how we share the information that’s important to guests to keep the customer the focus of our operations.”

“We’re working together collectively to meet our customers’ needs across the channels they are using to engage with us.”

How are companies powering great customer experiences with great customer data?

Chris Brogan, SVP of Strategy and Analytics at Hyatt Hotels & Resorts, said, “We’re going through a transformation to unleash our colleagues to deliver great customer experiences at every stage of the guest journey. Our competitive differentiation comes from knowing our customers better than our competitors. We manage our customer data like a strategic asset so we can use that information to serve customers better and build loyalty for our brand.”

Hyatt connects the fragmented customer data from numerous applications including sales, marketing, ecommerce, customer service and finance. They bring the core customer profiles together into a single, trusted location, where they are continually managed. Now their customer profiles are clean, de-duplicated, enriched, and validated. They can see the members of a household as well as the connections between corporate hierarchies. Business and analytics applications are fueled with this clean, consistent and connected information so customer-facing teams can do their jobs more effectively and hotel teams can extend simple, meaningful gestures that drive guest loyalty.
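
To make the idea concrete, here is a minimal, hypothetical sketch of profile consolidation: normalize the identifying fields, treat records with the same email or a highly similar name as the same guest, and merge them into a golden profile. The field names, sample records, and matching threshold are invented for illustration and are not Hyatt's or Informatica's actual matching logic.

```python
# Illustrative sketch only: consolidating duplicate guest profiles by
# normalizing identifiers and fuzzy-matching names. Field names and the
# matching threshold are hypothetical, not a real MDM implementation.
from difflib import SequenceMatcher

profiles = [
    {"name": "Chris Brogan", "email": "CBrogan@example.com", "city": "Chicago"},
    {"name": "Christopher Brogan", "email": "cbrogan@example.com", "city": "Chicago"},
    {"name": "C. Brogan", "email": "chris.b@example.net", "city": "Evanston"},
]

def normalize(p):
    """Lower-case and trim the fields used for matching."""
    return {k: v.strip().lower() for k, v in p.items()}

def same_guest(a, b, threshold=0.85):
    """Treat two records as the same guest if emails match exactly
    or the names are highly similar."""
    if a["email"] == b["email"]:
        return True
    return SequenceMatcher(None, a["name"], b["name"]).ratio() >= threshold

def consolidate(records):
    """Greedily merge records into golden profiles."""
    golden = []
    for rec in map(normalize, records):
        for g in golden:
            if same_guest(rec, g):
                g.update({k: v for k, v in rec.items() if v})  # enrich with latest non-empty values
                break
        else:
            golden.append(dict(rec))
    return golden

print(consolidate(profiles))  # the first two records collapse into a single profile
```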

When he first joined Hyatt, Chris did a search for his name in the central customer database and found 13 different versions of himself. This included the single Chris Brogan who lived across the street from Wrigley Field with his buddies in his 20s and the Chris Brogan who lives in the suburbs with his wife and two children. “I can guarantee those two guys want something very different from a hotel stay. Mostly just sleep now,” he joked. Those guest profiles have now been successfully consolidated.

This solid customer data foundation means Hyatt colleagues can more easily personalize a guest’s experience. For example, colleagues at the front desk are now able to use the limited check-in time to congratulate a new Diamond member on just achieving the highest loyalty program tier or offer a better room to those guests most likely to take them up on the offer and appreciate it.

According to Chris, “Successful marketing, sales and customer experience initiatives need to be built on a solid customer data foundation. It’s much harder to execute effectively and continually improve if your customer data is not in order.”

How are you shifting from channel-centric to customer-centric?

Chip Burgard, SVP of Marketing at CitiMortgage answered, “In the beginning of our omnichannel journey, we were trying to allow customer choice through multi-channel. Our whole organization was designed around people managing different channels. But, we quickly realized that allowing separate experiences that a customer can choose from is not being customer-centric.

“Now we have new sales leadership that understands the importance of delivering seamless, integrated and consistent customer experiences across channels. And they are changing incentives to drive that customer-centric behavior. We’re no longer holding people accountable specifically for activity in their channels. We’re working together collectively to meet our customers’ needs across the channels they are using to engage with us.”

Chris Berg, VP of Store Operations at The Home Depot, explained, “For us, it’s about transitioning from a store-centric to a customer-centric approach. It’s a cultural change. The managers of our 2,000 stores have traditionally been compensated based on their own store’s performance. But we are one brand. For example, in the future a store may fulfill an order but, because of where the order originated, not receive credit for the sale. We’re in the process of working through how to better reward that collaboration. Also, we’re making investments in our systems so they support an omnichannel, or what we call interconnected, business. We have 40,000 products in store and over 1,000,000 products online. Now that we’re on the interconnected journey, we’re rethinking how we manage our product information so we can manage inventory across channels more effectively and efficiently.”

Summary

Omnichannel is all about shifting from channel-centric to customer-centric – much more customer-centric than you are today. Knowing who your customers are and having a view of products and inventory across channels are the basic requirements to delivering exceptional customer experiences across channels and touch points.

This is not a project. A business transformation is required to empower people to deliver omnichannel customer experiences. The executive team needs to drive it and align compensation and incentives around it. A collaborative cross-functional approach is needed to achieve it.

Omnichannel depends on customer-facing teams such as marketing, sales and call centers having access to a total customer relationship view based on clean, consistent and connected customer, product and inventory information. This is the basic foundation needed to deliver seamless, integrated and consistent customer experiences across channels and touch points and to improve their effectiveness.


Next Generation Planning for Agile Business Transformation

This is an age of technology disruption and digitization. Winners will be those organizations that can adapt quickly and drive business transformation on an ongoing basis.

When I first met John Schmidt, Vice President of Global Integration Services at Informatica, he asked me to visualize business transformation this way: “With a modern tool like the internet and Google Maps, planning a road trip from New York to San Francisco with a number of stops along the way to visit friends or see some sights takes just minutes. If you’re halfway through the trip and a friend calls to say he has suddenly been called out of town, you get on your mobile phone, and within a few minutes you have a new roadmap and a new plan.”

So, why is it that creating a roadmap for an enterprise initiative takes months or even years, and once such a plan is developed, it is nearly impossible to change even when new information or external events invalidate it? A single transformation is useful, but what you really want is the ability to transform your business on an ongoing basis. You need to be agile in the planning of the transformation initiative itself. Is it even feasible to achieve a planning capability for complex enterprise initiatives that approaches the speed and agility of cross-country road-trip planning?

The short answer is YES; you can get much faster if you do three things:

First, throw out old notions of how planning in complex corporate environments is done, while keeping in mind that planning an enterprise transformation is fundamentally different than planning a focused departmental initiative.

Second, invest in tools equivalent to Google Maps for building the enterprise roadmap. Google Maps works because it leverages a database of information about roads, rules of the roads, related local services, and points of interest. In short, Google Map the enterprise, which is not as onerous as it sounds.

Third, develop a team of Enterprise Architects and planners with the skills and discipline to use the BOST™ Framework to maintain the underlying reference data about the business, its operations, the systems that support it, and the technologies that they are based on. This will provide the execution framework for your organization to deliver the data to fuel your business initiatives and digital strategy.

The result is closer alignment of your business and IT organizations, fewer errors due to communication issues, and, because your business plans are linked directly to the underlying technical implementation, faster delivery of business value.

This is not some “pie in the sky” theory or a futuristic dream. What you need is a tool like Google Maps for business transformation. That tool is the BOST™ Toolkit, which leverages the BOST™ Framework: through models, elements, and associated relationships built around an underlying metamodel, it interprets enterprise processes using a four-dimensional view spanning business, operations, systems, and technology. Informatica built the BOST™ Framework in collaboration with certified partners, and it provides an Architecture-led Planning approach to business transformation.

Benefits of Architecture-led Planning

The Architecture-led Planning approach is effective when applied with governance and oversight. The following four features describe the benefits:

Enablement of Business and IT Collaboration – Uses a common reference model to facilitate cross-functional business alignment, as well as alignment between business and IT. The model gets everyone on the same page, regardless of line of business, location, or IT function. This model explicitly and dynamically starts with business strategy and links from there to the technical implementation.

Data-driven Planning – Capturing planning data in a structured repository makes the plan dynamic and adaptable to changing circumstances. When the plan changes, rather than updating dozens of documents, simply apply the change to the relevant components in the enterprise model repository, and all business and technical model views that reference that component update automatically (a short sketch of this idea follows this list).

Cross-Functional Decision Making – Cross-functional decision-making is facilitated in several ways. First, by showing interdependencies between functions, business operations, and systems, the holistic view helps each department or team to understand the big-picture and its role in the overall process. Second, the future state architectural models are based on a view of how business operations will change. This provides the foundation to determine the business value of the initiative, measure your progress, and ultimately report the achievement of the goals. Quantifiable metrics help decision makers look beyond the subjective perspectives and agree on fact-based success metrics.

Reduced Execution Risk – Reduced execution risk results from having a robust and holistic plan based on a rigorous analysis of all the dependent enterprise components in the business, operations, systems and technology view. Risk is reduced with an effective governance discipline both from a program management as well as from an architectural change perspective.
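
To make the data-driven planning benefit concrete, here is a toy sketch of a repository where views hold references to components rather than copies of them: change a component once and every view that cites it reflects the change. The component names, view names, and structure are invented for illustration and are not the actual BOST™ metamodel.

```python
# Toy illustration of data-driven planning (not the actual BOST(TM) metamodel):
# views hold references to components, so changing a component once updates
# every business and technical view that cites it.
components = {
    "CRM-01": {"name": "Customer Master Hub", "status": "planned"},
    "INT-07": {"name": "Cloud Integration Service", "status": "in production"},
}

views = {
    "business_roadmap": ["CRM-01"],
    "systems_blueprint": ["CRM-01", "INT-07"],
}

def render(view_name):
    """Resolve component references at read time, so views are never stale."""
    return [{**components[cid], "id": cid} for cid in views[view_name]]

# One change to the repository...
components["CRM-01"]["status"] = "in build"

# ...is reflected in every view that references the component.
for v in views:
    print(v, render(v))
```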

Business Transformation with Informatica

Integrated Program Planning is for organizations that need large or complex Change Management assistance. Examples of candidates for Integrated Program Planning include:

Enterprise Initiatives: Large-scale mergers or acquisitions, switching from a product-centric operating model to more customer-centric operations, restructuring channel or supplier relationships, rationalizing the company’s product or service portfolio, or streamlining end-to-end processes such as order-to-cash, procure-to-pay, hire-to-retire or customer on-boarding.

Top-level Directives: Examples include board-mandated data governance, regulatory compliance initiatives that have broad organizational impacts such as data privacy or security, or risk management initiatives.

Expanding Departmental Solutions into Enterprise Solutions: Successful solutions in specific business areas can often be scaled-up to become cross-functional enterprise-wide initiatives. For example, expanding a successful customer master data initiative in marketing to an enterprise-wide Customer Information Management solution used by sales, product development, and customer service for an Omni-channel customer experience.


The BOST™ Framework identifies and defines enterprise capabilities. These capabilities are modularized as reconfigurable and scalable business services. They are independent of organizational silos and politics, which gives strategists, architects, and planners the means to drive for high performance across the enterprise, regardless of the shifting set of strategic business drivers.

The BOST™ Toolkit facilitates building and implementing new or improved capabilities, adjusting business volumes, and integrating with new partners or acquisitions through common views of these building blocks and through reusing solution components. In other words: better, faster, cheaper projects.

The BOST View creates a visual understanding of the relationship between business functions, data, and systems. It helps with the identification of relevant operational capabilities and underlying support systems that need to change in order to achieve the organization’s strategic objectives. The result will be a more flexible business process with greater visibility and the ability to adjust to change without error.


Banks and the Art of the Possible: Disruptors are Re-shaping Banking


The problem many banks encounter today is that they have vast sums of investment tied up in old ways of doing things. Historically, customers chose a bank and remained ‘loyal’ throughout their lifetime; now competition is rife and loyalty is becoming a thing of the past. To stay ahead of the competition and to gain and keep customers, banks need to understand the ever-evolving market, disrupt norms and continue to delight customers. The tradition of staying with one bank out of family convention or ease has been replaced by a more informed customer who understands the variety of choice at their fingertips.

Challenger banks don’t build on tradition and legacy, looking for small adjustments to make; they embrace change. Longer-established banks can’t afford to do nothing and assume their size and stature will attract customers.

Here’s some useful information

Accenture’s recent report, The Bank of Things, succinctly explains what ‘Customer 3.0’ is all about. The connected customer isn’t necessarily younger; it’s everybody. Banks can get to know their customers better by making better use of information. It all depends on using intelligent data rather than all data. Interrogating the wrong data is time-consuming and costly, and it yields little actionable information.

When an organisation sets out with the intention of knowing its customers, it can calibrate its data according to where the gold nuggets – the real business insights – come from. What do people do most? Where do they go most? Now that they’re using branches and phone banking less and less – what do they look for in a mobile app?

Customer 3.0 wants to know what the bank can offer them all the time, on the move, on their own device. They want offers designed for their lifestyle. Correctly deciphered data can drive the level of customer segmentation that empowers such marketing initiatives. This means an organisation has to have the ability and the agility to move with its customers. It’s a journey that never ends: technology will never have a cut-off point, just as customer expectations will never stop evolving.

It’s time for banks to re-shape banking

Informatica has been working with major retail banks globally to redefine banking excellence and realign operations to deliver it. We always start by asking our customers a revealing question: “Have you looked at the art of the possible to future-proof your business over the next five to ten years and beyond?” This is where the discussion begins to explore really interesting notions about unlocking potential. No bank can afford to ignore them.


Internet of Things (IoT) Changes the Data Integration Game in 2015


As reported by the Economic Times, “In the coming years, enormous volumes of machine-generated data from the Internet of Things (IoT) will emerge. If exploited properly, this data – often dubbed machine or sensor data, and often seen as the next evolution in Big Data – can fuel a wide range of data-driven business process improvements across numerous industries.”

We can all see this happening in our personal lives. Our thermostats are connected now, our cars have been for years, and even my toothbrush has a Bluetooth connection to my phone. On the industrial side, devices have also been connected for years, throwing off megabytes of data per day that have typically been used for monitoring, with the data tossed away as quickly as it appears.

So, what changed? With the advent of big data and cheap cloud and on-premises storage, we now have the ability to store the machine or sensor data spinning out of industrial machines, airliners, health diagnostic devices, and the like, and to leverage that data for new and valuable uses.

Consider, for example, the ability to determine the likelihood that a jet engine will fail, based on the sensor data gathered and how that data compares with known patterns of failure. Instead of getting an engine-failure light on the flight deck, the pilots can see that the engine has a 20 percent likelihood of failure and get it serviced before it fails completely.
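
As a rough sketch of the idea, the snippet below scores failure likelihood by how close the latest sensor readings sit to known failure signatures versus a healthy baseline. The features, signatures, and distance-to-likelihood mapping are invented for illustration; a real engine-health model would be far more sophisticated.

```python
# Minimal sketch: score failure likelihood by how close current sensor
# readings sit to known failure signatures. Features, signatures, and the
# distance-to-likelihood mapping are illustrative, not a real avionics model.
import math

failure_signatures = [
    {"egt_c": 720.0, "vibration_mm_s": 9.5, "oil_pressure_psi": 28.0},
    {"egt_c": 750.0, "vibration_mm_s": 11.0, "oil_pressure_psi": 25.0},
]
healthy_baseline = {"egt_c": 620.0, "vibration_mm_s": 3.0, "oil_pressure_psi": 45.0}

def distance(a, b):
    """Euclidean distance over the shared sensor features."""
    return math.sqrt(sum((a[k] - b[k]) ** 2 for k in a))

def failure_likelihood(reading):
    """Closer to the nearest failure signature than to the healthy baseline
    means a higher likelihood (rough 0-1 score)."""
    d_fail = min(distance(reading, sig) for sig in failure_signatures)
    d_ok = distance(reading, healthy_baseline)
    return d_ok / (d_ok + d_fail)

current = {"egt_c": 655.0, "vibration_mm_s": 5.0, "oil_pressure_psi": 40.0}
print(f"Estimated failure likelihood: {failure_likelihood(current):.0%}")
```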

The problem with all of this very cool stuff is that we need to once again rethink data integration.  Indeed, if the data can’t get from the machine sensors to a persistent data store for analysis, then none of this has a chance of working.

That’s why those who are moving to IoT-based systems need to do two things. First, they must create a strategy for extracting data from devices, such as industrial robots or an Audi A8. Second, they need a strategy to take all of this disparate data firing out of devices at megabytes per second and put it where it needs to go, in the right native structure (or in an unstructured data lake), so it can be leveraged in useful ways, and in real time.

The challenge is that machines and devices are not traditional IT systems. I’ve built connectors for industrial applications in my career. The fact is, you need to adapt to the way that machines and devices produce data, not the other way around. Data integration technology needs to adapt as well, making sure that it can deal with streaming and unstructured data, including the many instances where the data needs to be processed in flight as it moves from the device to the database.
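
A minimal sketch of what “processing in flight” can look like, assuming a simple JSON message format: readings stream past as messages, bad readings are filtered out, and useful context is added before anything is persisted. The message layout and the in-memory SQLite “store” are stand-ins for a real device feed and target database.

```python
# Hypothetical sketch of in-flight processing: raw device messages are parsed,
# filtered, and enriched as they stream past, before landing in a store.
# The message format and the sqlite "store" stand in for a real pipeline.
import json
import sqlite3
import time

def device_stream():
    """Stand-in for a feed of machine/sensor messages."""
    raw_messages = [
        '{"device_id": "press-42", "temp_c": 71.2, "ts": 1.0}',
        '{"device_id": "press-42", "temp_c": null, "ts": 2.0}',   # bad reading
        '{"device_id": "robot-07", "temp_c": 95.8, "ts": 3.0}',
    ]
    for msg in raw_messages:
        yield msg

def process_in_flight(messages):
    """Parse, drop bad readings, and tag anomalies before persisting."""
    for msg in messages:
        event = json.loads(msg)
        if event.get("temp_c") is None:
            continue                                   # filter in flight
        event["over_limit"] = event["temp_c"] > 90.0   # enrich in flight
        event["ingested_at"] = time.time()
        yield event

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE readings (device_id TEXT, temp_c REAL, over_limit INTEGER)")
for e in process_in_flight(device_stream()):
    db.execute("INSERT INTO readings VALUES (?, ?, ?)",
               (e["device_id"], e["temp_c"], int(e["over_limit"])))
print(db.execute("SELECT * FROM readings").fetchall())
```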

This becomes a huge opportunity for data integration providers who understand the special needs of IoT, as well as the technology that those who build IoT-based systems can leverage. However, the larger value is for those businesses that learn how to leverage IoT to provide better services to their customers by offering insights that were previously impossible. Be it jet engine reliability, the fuel efficiency of my car, or feedback to my physician from sensors on my body, this is game-changing stuff. At the heart of its ability to succeed is the ability to move data from place to place.


Informatica Supports New Custom ODBC/JDBC Drivers for Amazon Redshift

Informatica’s Redshift connector is a state-of-the-art Bulk-Load type connector which allows users to perform all CRUD operations on Amazon Redshift. It makes use of AWS best practices to load data at high throughput in a safe and secure manner and is available on Informatica Cloud and PowerCenter.

Today we are excited to announce the support of Amazon’s newly launched custom JDBC and ODBC drivers for Redshift. Both the drivers are certified for Linux and Windows environments.

Informatica’s Redshift connector will package the JDBC 4.1 driver, which further enhances our metadata fetch capabilities for tables and views in Redshift and improves our overall design-time responsiveness by over 25%. It also allows us to query multiple tables and views and retrieve the result set using primary and foreign key relationships.

Amazon’s ODBC driver enhances our full Push Down Optimization capabilities on Redshift. Some of the key differentiating factors are support for the SYSDATE variable and for functions such as ADD_TO_DATE(), ASCII(), CONCAT(), LENGTH(), TO_DATE() and VARIANCE(), which weren’t possible before.

Amazon’s ODBC driver is not pre-packaged but can be directly downloaded from Amazon’s S3 store.

Once installed, the user can change the default ODBC System DSN in ODBC Data Source Administrator.
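
For illustration, once a System DSN is configured, any ODBC client can connect through it. The sketch below uses pyodbc to connect via a DSN and list tables and views from the Redshift information_schema; the DSN name and credentials are placeholders, and this is not Informatica's internal connector code.

```python
# Illustrative only: connecting through an ODBC System DSN with pyodbc and
# listing Redshift tables/views from information_schema. The DSN name and
# credentials are placeholders.
import pyodbc  # pip install pyodbc

conn = pyodbc.connect("DSN=AmazonRedshift;UID=analytics_user;PWD=********")
cursor = conn.cursor()

cursor.execute("""
    SELECT table_schema, table_name, table_type
    FROM information_schema.tables
    WHERE table_schema NOT IN ('pg_catalog', 'information_schema')
    ORDER BY table_schema, table_name
""")
for schema, name, kind in cursor.fetchall():
    print(f"{schema}.{name} ({kind})")

conn.close()
```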


To learn more, sign up for the free trial of Informatica’s Redshift connector for Informatica Cloud or PowerCenter.


How Organizations can Prepare for 2015 Data Privacy Legislation

The original article can be found on scmagazine.com.

On Jan. 13, the White House announced President Barack Obama’s proposal for new data privacy legislation, the Personal Data Notification and Protection Act. Many states have laws today that require corporations and government agencies to notify consumers in the event of a breach – but that is not enough. This new proposal aims to improve cybersecurity standards nationwide with the following tactics:

Enable cyber-security information sharing between private and public sectors. 

Government agencies and corporations with a vested interest in protecting our information assets need a streamlined way to communicate and share threat information. This component of the proposed legislation incents organizations that participate in knowledge-sharing with targeted liability protection, as long as they are responsible for how they share, manage and retain privacy data.

Modernize the tools law enforcement has to combat cybercrime.
Existing laws, such as the Computer Fraud and Abuse Act, need to be updated to incorporate the latest cyber-crime classifications while giving prosecutors the ability to target insiders with privileged access to sensitive and privacy data. The proposal also specifically calls out pursuing prosecution of those who sell privacy data nationally and internationally.

Standardize breach notification policies nationwide.
Many states have some sort of policy that requires notifying customers when their data has been compromised. Three leading examples include California’s breach notification law, Florida’s Information Protection Act (FIPA) and the Massachusetts Standards for the Protection of Personal Information of Residents of the Commonwealth. New Mexico, Alabama and South Dakota have no data breach protection legislation. Standardizing and simplifying the requirement for companies to notify customers and employees when a breach occurs will ensure consistent protection no matter where you live or transact.

Invest in increasing cyber-security skill sets.
For a number of years, security professionals have reported an ever-increasing skills gap in the cybersecurity profession.  In fact, in a recent Ponemon Institute report, 57 percent of respondents said a data breach incident could have been avoided if the organization had more skilled personnel with data security responsibilities. Increasingly, colleges and universities are adding cybersecurity curriculum and degrees to meet the demand. In support of this need, the proposed legislation mentions that the Department of Energy will provide $25 million in educational grants to Historically Black Colleges and Universities (HBCU) and two national labs to support a cybersecurity education consortium.

This proposal is clearly comprehensive, but it also raises the critical question: How can organizations prepare themselves for this privacy legislation?

The International Association of Privacy Professionals conducted a study of Federal Trade Commission (FTC) enforcement actions. From the report, organizations can infer the best practices implied by FTC enforcement and ensure these are covered by their security architecture, policies and practices:

  • Perform assessments to identify reasonably foreseeable risks to the security, integrity, and confidentiality of personal information collected and stored on the network, online or in paper files.
  • Adopt limited-access policies that curb unnecessary security risks and minimize the number and type of network access points an information security team must monitor for potential violations.
  • Limit employee access to (and copying of) personal information, based on the employee’s role.
  • Implement and monitor compliance with policies and procedures for rendering information unreadable or otherwise secure in the course of disposal. Securely disposed information must not practicably be read or reconstructed.
  • Restrict third-party access to personal information based on business need, for example, by restricting access based on IP address, granting temporary access privileges, or similar procedures (a minimal sketch of these access-control practices follows this list).
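
As a concrete illustration of the last two practices, the sketch below gates access to personal information by employee role and, for third parties, by an approved network range. The roles, network, and policy values are invented; a real implementation would sit inside the organization's identity and access management stack.

```python
# Minimal sketch of role-based and network-based access checks for personal
# data. Roles, networks, and the policy table are invented for illustration.
import ipaddress

ROLE_CAN_VIEW_PII = {"customer_support", "compliance"}
PARTNER_ALLOWED_NETWORKS = [ipaddress.ip_network("203.0.113.0/24")]  # documentation range

def employee_may_access(role: str) -> bool:
    """Only roles with a business need may view personal information."""
    return role in ROLE_CAN_VIEW_PII

def partner_may_access(source_ip: str) -> bool:
    """Third parties must connect from an approved network range."""
    addr = ipaddress.ip_address(source_ip)
    return any(addr in net for net in PARTNER_ALLOWED_NETWORKS)

print(employee_may_access("marketing_intern"))   # False: role not permitted
print(partner_may_access("203.0.113.25"))        # True: inside the allowed range
print(partner_may_access("198.51.100.7"))        # False: outside the allowed range
```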

The Personal Data Notification and Protection Act fills a void at the national level; most states have privacy laws, with California having pioneered the movement with SB 1386. However, enforcement at the state attorney general level has been uneven at best and absent at worst.

In preparing for this national legislation, organizations need to heed the policies derived from the FTC’s enforcement practices. They can also track the progress of the legislation and look to agencies such as the National Institute of Standards and Technology to issue guidance. Furthermore, organizations can encourage employees to take advantage of cybersecurity internship programs at nearby colleges and universities to avoid critical skills shortages.

With online security a clear priority for President Obama’s administration, it’s essential for organizations and consumers to understand upcoming legislation and learn the benefits/risks of sharing data. We’re looking forward to celebrating safeguarding data and enabling trust on Data Privacy Day, held annually on January 28, and hope that these tips will make 2015 your safest year yet.


Strata 2015 – Making Data Work for Everyone with Cloud Integration, Cloud Data Management and Cloud Machine Learning


Are you ready to answer “Yes” to the questions:

a) “Are you Cloud Ready?”
b) “Are you Machine Learning Ready?”

I meet with hundreds of Informatica Cloud customers and prospects every year. While they are investing in Cloud, and seeing the benefits, they also know that there is more innovation out there. They’re asking me, what’s next for Cloud? And specifically, what’s next for Informatica in regards to Cloud Data Integration and Cloud Data Management? I’ll share more about my response throughout this blog post.

The spotlight will be on Big Data and Cloud at the Strata + Hadoop World conference taking place in Silicon Valley from February 17-20 with the theme  “Make Data Work”. I want to focus this blog post on two topics related to making data work and business insights:

  • How existing cloud technologies, innovations and partnerships can help you get ready for the new era in cloud analytics.
  • How you can make data work in new and advanced ways for every user in your company.

Today, Informatica is announcing the availability of its Cloud Integration Secure Agent on Microsoft Azure and Linux Virtual Machines as well as an Informatica Cloud Connector for Microsoft Azure Storage. Users of Azure data services such as Azure HDInsight, Azure Machine Learning and Azure Data Factory can make their data work with access to the broadest set of data sources including on-premises applications, databases, cloud applications and social data. Read more from Microsoft about their news at Strata, including their relationship with Informatica, here.

“Informatica, a leader in data integration, provides a key solution with its Cloud Integration Secure Agent on Azure,” said Joseph Sirosh, Corporate Vice President, Machine Learning, Microsoft. “Today’s companies are looking to gain a competitive advantage by deriving key business insights from their largest and most complex data sets. With this collaboration, Microsoft Azure and Informatica Cloud provide a comprehensive portfolio of data services that deliver a broad set of advanced cloud analytics use cases for businesses in every industry.”

Even more exciting is how quickly any user can deploy a broad spectrum of data services for cloud analytics projects. The fully managed cloud service for building predictive analytics solutions from Azure and the wizard-based, self-service cloud integration and data management user experience of Informatica Cloud help overcome the challenges most users have in making their data work effectively and efficiently for analytics use cases.

The new solution enables companies to bring in data from multiple sources for use in Azure data services including Azure HDInsight, Azure Machine Learning, Azure Data Factory and others – for advanced analytics.

The broad availability of Azure data services, and Azure Machine Learning in particular, is a game changer for startups and large enterprises. Startups can now access cloud-based advanced analytics with minimal cost and complexity and large businesses can use scalable cloud analytics and machine learning models to generate faster and more accurate insights from their Big Data sources.

Success in using machine learning requires not only great analytics models, but also an end-to-end cloud integration and data management capability that brings in a wide breadth of data sources, ensures that data quality and data views match the requirements for machine learning modeling, and offers the ease of use that enables rapid iteration while providing high-performance, scalable data processing.
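
A toy sketch of that end-to-end shape, assuming invented data: two source fragments are integrated, deduplicated and cleaned, and only then fed to a simple model. This is illustrative plumbing, not Azure Machine Learning or Informatica Cloud themselves.

```python
# Toy end-to-end sketch (not Azure ML or Informatica Cloud): pull two sources
# together, clean and deduplicate them, then fit a simple model.
# The data and features are invented purely to show the shape of the pipeline.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# "Integration": two fragments of customer data from different systems.
crm = pd.DataFrame({"customer_id": [1, 2, 2, 3],
                    "monthly_spend": [120.0, 80.0, 80.0, None]})
web = pd.DataFrame({"customer_id": [1, 2, 3],
                    "visits_last_30d": [14, 3, 9],
                    "churned": [0, 1, 0]})

# "Data management": deduplicate, join, and fill gaps before modeling.
features = (crm.drop_duplicates("customer_id")
               .merge(web, on="customer_id")
               .fillna({"monthly_spend": crm["monthly_spend"].median()}))

model = LogisticRegression().fit(features[["monthly_spend", "visits_last_30d"]],
                                 features["churned"])

new_customer = pd.DataFrame([[100.0, 5]], columns=["monthly_spend", "visits_last_30d"])
print(model.predict_proba(new_customer)[0][1])  # churn probability for a new customer
```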

For example, the Informatica Cloud solution on Azure is designed to deliver on these critical requirements in a complementary approach and support advanced analytics and machine learning use cases that provide customers with key business insights from their largest and most complex data sets.

Using the Informatica Cloud Connector for Microsoft Azure Storage with Informatica Cloud Data Integration enables optimized reads and writes of data to blobs in Azure Storage. Customers can use Azure Storage objects as sources, lookups, and targets in data synchronization tasks and advanced mapping configuration tasks for efficient data management using Informatica’s industry-leading cloud integration solution.
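
For a sense of the underlying storage operations, here is a minimal example using the Azure Storage Blob SDK for Python to write an extract to a blob and read it back. The connection string, container, and file names are placeholders, and this is the raw SDK, not the Informatica connector.

```python
# Minimal example of the underlying Azure Storage operations (not the
# Informatica connector itself): write a file to a blob and read it back.
# Connection string, container, and blob names are placeholders.
from azure.storage.blob import BlobServiceClient  # pip install azure-storage-blob

conn_str = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;EndpointSuffix=core.windows.net"
service = BlobServiceClient.from_connection_string(conn_str)
container = service.get_container_client("landing-zone")

# Upload a source extract as a blob (the "target" side of a sync task).
with open("customers.csv", "rb") as data:
    container.upload_blob(name="extracts/customers.csv", data=data, overwrite=True)

# Download it again (the "source" or lookup side).
blob = container.download_blob("extracts/customers.csv")
print(blob.readall()[:200])
```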

As Informatica fulfills the promise of “making great data ready to use” to our 5,500 customers globally, we continue to form strategic partnerships and develop next-generation solutions to stay one step ahead of the market with our Cloud offerings.

My goal in 2015 is to help each of our customers say that they are Cloud Ready! And collaborating with solutions such as Azure ensures that our joint customers are also Machine Learning Ready!

To learn more, try our free Informatica Cloud trial for Microsoft Azure data services.


The Sexiest Job of the 21st Century


I’ve spent most of my career working with new technology, most recently helping companies make sense of mountains of incoming data. This means, as I like to tell people, that I have the sexiest job in the 21st century.

Harvard Business Review put the data scientist into the national spotlight with its article Data Scientist: The Sexiest Job of the 21st Century. Job trends data from Indeed.com confirms the rise in popularity of the position, showing that the number of job postings for data scientist positions increased by 15,000%.

In the meantime, the role of data scientist has changed dramatically. Data used to reside on the fringes of the operation. It was usually important but seldom vital – a dreary task reserved for the geekiest of the geeks. It supported every function but never seemed to lead them. Even the executives who respected it never quite absorbed it.

For every Big Data problem, the solution often rests on the shoulders of a data scientist. The role of the data scientist is similar in responsibility to the Wall Street “quants” of the ’80s and ’90s – now, these data experts are tasked with managing databases previously thought too hard to handle and too unstructured to derive any value from.

So, is it the sexiest job of the 21st Century?

Think of a data scientist as a business analyst-plus: part mathematician, part business strategist, these statistical savants apply their background in mathematics to help companies tame their data dragons. But these individuals aren’t just math geeks, per se.

A data scientist is somebody who is inquisitive, who can stare at data and spot trends. It’s almost like a renaissance individual who really wants to learn and bring change to an organization.

If this sounds like you, the good news is demand for data scientists is far outstripping supply. Nonetheless, with the rising popularity of the data scientist – not to mention the companies that are hiring for these positions – you have to be at the top of your field to get the jobs.

Companies look to build teams around data scientists who ask the most questions about:

  • How the business works
  • How it collects its data
  • How it intends to use this data
  • What it hopes to achieve from these analyses

These questions are important because data scientists will often unearth information that can “reshape an entire company.” Obtaining a better understanding of the business’s underpinnings not only directs the data scientist’s research, but also helps them present the findings and communicate with the less-analytical executives within the organization.

While it’s important to understand your own business, learning about the successes of other corporations will help a data scientist in their current job–and the next.

Twitter @bigdatabeat


Lovenomics: The Price of Love This Valentine’s Day, Cash In on the Hopeless Romantics


This blog post was originally featured on Business.com here: Lovenomics: The Price of Love This Valentine’s Day.

After the Blue Cross sales that dominate January, Valentine’s Day offers welcome relief to the high street. It marks the end of the Christmas sales and the first of the year’s seasonal hooks, providing retailers with an opportunity to upsell. According to the National Retail Federation’s Valentine’s Day Consumer Spending Survey, American consumers plan to spend a total of $4.8 billion on jewelry and a survey-high of nearly $2 billion on clothing this year. However, to successfully capture customers, retailers need to develop an omni-channel strategy designed to sell the right product.

Target the indecisive

The majority of Valentine’s Day shoppers will be undecided when they begin their purchasing journey. Based on this assumption, a targeted sales approach at the point of interest (POI) and point of sale (POS) will be increasingly important. Not only do retailers need to track and understand the purchasing decisions of every customer as they move between channels, but they also need a real-time view of the product lines, pricing and content that the competition is using. Once armed with this information, retailers can concentrate on delivering personalized ads or timely product placements that drive consumers to the checkout as they move across different channels.


Start with search

Consumers will start their shopping journey with a search engine and will rarely scroll past the first page. So brands need to be prepared by turning Valentine’s Day product lines into searchable content. To capture a greater share of online traffic, retailers should concentrate on making relevant products easy to find by managing meta-information, optimizing media assets with the keywords that consumers are using, deploying rich text and automatically sending products to search engines.
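
One common way to expose that meta-information to search engines is schema.org Product markup emitted as JSON-LD. The sketch below generates such a snippet from an invented product record; it illustrates the technique rather than any particular retailer's platform.

```python
# Generate a schema.org Product JSON-LD snippet from a product record so the
# page's meta-information is easy for search engines to index. The product
# fields are invented for illustration.
import json

product = {
    "name": "Heart-Shaped Gift Box",
    "description": "Assorted chocolates in a heart-shaped box for Valentine's Day.",
    "sku": "VAL-2015-001",
    "price": "19.99",
    "currency": "USD",
    "keywords": ["valentine's day", "chocolate gift", "last minute gift"],
}

def to_json_ld(p):
    """Build a schema.org Product object that can be embedded in the page."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "Product",
        "name": p["name"],
        "description": p["description"],
        "sku": p["sku"],
        "keywords": ", ".join(p["keywords"]),
        "offers": {"@type": "Offer", "price": p["price"], "priceCurrency": p["currency"]},
    }, indent=2)

print('<script type="application/ld+json">\n' + to_json_ld(product) + "\n</script>")
```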

Next generation loyalty

Retailers and restaurants can now integrate loyalty schemes into specialized smartphone apps, or integrate customer communications to automatically deliver personalized ads (e.g., offers of last-minute gifts for those who forget). However, to ensure success, brands need to know as much about their customers as consumers know about their products. By monitoring customers’ behavior, the information they are looking at and the channels they are using to interact with brands, loyalty programs can be used to deliver timely special offers or information at the right moment.

Digital signage

Valentine’s Day represents an opportunity to reinvent the in-store experience. By introducing digital signage for special product promotions, retailers can showcase a wide range of eclectic merchandise to showrooming consumers. This could be done by targeting smartphone-carrying consumers (who have allowed geo-located ads on their phones) with a personalized text message when they enter the store. Use this message to direct them to the most relevant areas for Valentine’s Day gifts or present them with a customized offer based on previous buying history.
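
A back-of-the-envelope sketch of the trigger logic: if an opted-in customer's phone reports a location within a small radius of the store, queue a personalized message. The coordinates, radius, and offer text are invented, and actual delivery would go through a push or SMS provider, which is omitted here.

```python
# Sketch of geofenced messaging: trigger a personalized offer only for
# opted-in customers whose reported location is within the store's radius.
# Coordinates, radius, and offer text are invented; message delivery is omitted.
import math

STORE = (32.7815, -96.8011)   # hypothetical store location (lat, lon)
GEOFENCE_METERS = 75

def haversine_m(a, b):
    """Great-circle distance between two (lat, lon) points in meters."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 2 * 6371000 * math.asin(math.sqrt(h))

def maybe_offer(customer, location):
    """Return an offer message only if the customer opted in and is in range."""
    if not customer["opted_in_geo_ads"]:
        return None
    if haversine_m(STORE, location) > GEOFENCE_METERS:
        return None
    return f"Hi {customer['first_name']}! Valentine's gifts are in aisle 3, 10% off today."

customer = {"first_name": "Sam", "opted_in_geo_ads": True}
print(maybe_offer(customer, (32.7816, -96.8012)))
```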


Quick fulfillment

Supermarkets have become established as the one-stop shop for lovers in a rush. Last year, Tesco, a British multinational grocery and general merchandise retailer, revealed that 85 percent of all Valentine’s Day bouquets were bought on the day itself, with three-quarters of all Valentine’s Day chocolates sold on February 14.

To tap into the last-minute attitude of panicked couples searching for a gift, retailers should have a dedicated Valentine’s Day section online and provide timely offers that come with the promise of delivery in time for Valentine’s Day. For example, BCBGMAXAZRIA is using data quality services to ensure its email list is clean and updated, keeping its sender reputation high so that when they need to reach customers during critical times like Valentine’s Day, they have confidence in their data.
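
As a simplified illustration of what "cleaning" an email list involves, the sketch below normalizes addresses, validates their syntax, and removes duplicates. Real data quality services do far more (MX checks, bounce history, suppression lists), and the regex here is intentionally loose.

```python
# Simplified illustration of cleaning an email list: normalize, validate
# syntax, and deduplicate. Real data quality services do far more; the regex
# is intentionally loose.
import re

EMAIL_RE = re.compile(r"^[\w.+-]+@[\w-]+(\.[\w-]+)+$")

raw_list = [
    " Shopper.One@Example.com ",
    "shopper.one@example.com",      # duplicate after normalization
    "not-an-email",                 # invalid, should be dropped
    "vip@retailer.co.uk",
]

def clean(addresses):
    """Return (kept, rejected): normalized unique addresses and invalid ones."""
    seen, kept, rejected = set(), [], []
    for addr in addresses:
        norm = addr.strip().lower()
        if not EMAIL_RE.match(norm):
            rejected.append(addr)
        elif norm not in seen:
            seen.add(norm)
            kept.append(norm)
    return kept, rejected

kept, rejected = clean(raw_list)
print(f"{len(kept)} deliverable, {len(rejected)} rejected")  # 2 deliverable, 1 rejected
```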

Alternatively, retailers can help customers by closely managing local inventory levels to offer same-day click-and-collect initiatives or showing consumers the number of items that are currently in-stock and in-store across all channels.

Valentine’s Day may seem like a minor holiday after Christmas, but for retailers it generates billions of dollars in annual spending and presents a tremendous opportunity to boost their customer base. With these tips, retailers will hopefully be able to sweeten their sales by effectively targeting customers looking for the perfect gift for their special someone.


What’s All the Mania Around SaaS Data?

It’s no secret that the explosion of software-as-a-service (SaaS) apps has revolutionized the way businesses operate. From humble beginnings, the titans of SaaS today include companies such as Salesforce.com, NetSuite, Marketo, and Workday that have gone public and attained multi-billion dollar valuations. The success of these SaaS leaders has had a domino effect in adjacent areas of the cloud – infrastructure, databases, and analytics.

Amazon Web Services (AWS), which originally had only six services in 2006 with the launch of Amazon EC2, now has over 30 ranging from storage, relational databases, data warehousing, Big Data, and more. Salesforce.com’s Wave platform, Tableau Software, and Qlik have made great advances in the cloud analytics arena, to give better visibility to line-of-business users. And as SaaS applications embrace new software design paradigms that extend their functionality, application performance monitoring (APM) analytics has emerged as a specialized field from vendors such as New Relic and AppDynamics.

So, how exactly did the growth of SaaS contribute to these adjacent sectors taking off?

The growth of SaaS coincided with the growth of powerful smartphones and tablets. Seeing this form factor as important to the end user, SaaS companies rushed to produce mobile apps that offered core functionality on mobile devices. Measuring adoption of these mobile apps was necessary to ensure that future releases met all the needs of the end user. Mobile apps generate a ton of information, such as app responsiveness, features utilized, and data consumed. As always, there were several types of users, with some preferring a laptop form factor over a smartphone or tablet. With the ever-increasing number of data points to measure within a SaaS app, the area of application performance monitoring analytics really took off.

Simultaneously, the growth of the SaaS titans cemented their reputation not just as applications for a certain line of business, but as full-fledged platforms. This growth emboldened a number of SaaS startups to develop apps that solved specialized or even vertical business problems in healthcare, warranty-and-repair, quote-to-cash, and banking. To get started quickly and scale rapidly, these startups leveraged AWS and its plethora of services.

The final sector that has taken off thanks to the growth of SaaS is the area of cloud analytics. SaaS grew by leaps and bounds because of its ease of use, and rapid deployment that could be achieved by business users. Cloud analytics aims to provide the same ease of use for business users when providing deep insights into data in an interactive manner.

In all these different sectors, what’s common is the fact that SaaS growth has created an uptick in the volume of data and the technologies that serve to make it easier to understand. During Informatica’s Data Mania event (March 4th, San Francisco) you’ll find several esteemed executives from Salesforce, Amazon, Adobe, Microsoft, Dun & Bradstreet, Qlik, Marketo, and AppDynamics talk about the importance of data in the world of SaaS.
