Category Archives: Cloud
Data has always played a key role in informing decisions, whether machine generated or intuitive. In the past, much of this data came from transactional databases as well as unstructured sources such as emails and flat files. Mobile devices appeared next on the map. We found uses for these devices not just to make calls but also to send messages, take pictures, and update status on social media sites. As a result, new sets of data were created from user engagements and interactions. That data began to tell a story, connecting the dots across different locations and stages of user activity. The “Internet of Things,” or IoT, is the latest technology to enter the scene, and it could transform how we view and use data on a massive scale.
Does IoT present a significant opportunity for companies to transform their business processes? The Internet of Things probably adds an important awareness veneer when it comes to data. It could bring data into focus early by connecting every stage of data creation in a business process, and remove the lag between consuming data and making decisions based on it. Data generated at every stage of a business process could reveal an interesting trend or pattern and, better yet, tell a connected story. The result could be predictive maintenance of the equipment involved in a process, further reducing cost. New product innovations would come from leveraging the connectedness of the data generated at each step of a business process. We would soon begin to understand not only where data is being used and how, but also the intent and context behind that usage. Organizations could then connect with their customers one-on-one like never before, whether to promote a product or offer a promotion that is both time and place sensitive. New opportunities to tailor products and services to customers on an individual basis would create new growth areas for businesses. The Internet of Things could make this possible by bringing together previously isolated sets of data.
A recent Economist report, “The Virtuous Circle of Data: Engaging Employees in Data and Transforming Your Business,” suggests that 68% of data-driven businesses outperform their competitors when it comes to profitability, and that 78% of those businesses foster a better culture of creativity and innovation. The report goes on to suggest that three areas are critical for an organization to build a data-driven business, including data supported by devices: 1) Technology & Tools, 2) Talent & Expertise, and 3) Culture & Leadership. By 2020, it’s projected that there will be 50B connected devices, 7x more than human beings on the planet. It is imperative for an organization to have a support structure in place for device-generated data and a strategy to connect it with broader enterprise-wide data initiatives.
A comprehensive Internet of Things strategy would leverage the speed and context of data to the advantage of business process owners. Timely access to device-generated data can open up personalized channels of communication to end customers at the moment of their readiness. It’s no longer enough to know what customers may want or what they asked for in the past; the challenge is anticipating what they might want by connecting the dots across different stages. IoT-generated data can help bridge this gap.
How to Manage IoT Generated Data
More data places more pressure on both quality and security, the key building blocks of trust in one’s data. Trust, ideally, is truth over time. Consistency in data quality and availability is going to be a key requirement for any organization that wants to introduce new products or differentiated services quickly. Informatica’s Intelligent Data Platform, or IDP, brings together the industry’s most comprehensive data management capabilities to help organizations manage all data, including device-generated data, both in the cloud and on premise. IDP also enables automated sensitive data discovery, so that data discovers users in the context where it’s needed.
Cool IoT Applications
A number of companies around the world are working on interesting applications of Internet of Things technology. Smappee, from Belgium, has launched an energy monitor that can itemize electricity usage and control a household full of devices by clamping a sensor around the main power cable. This single device can recognize the individual signatures produced by each household device and lets consumers switch off any device, such as an oven, remotely via smartphone. JIBO is an IoT device touted as the world’s first family robot; it automatically uploads data about all of its interactions to the cloud. Start-ups such as Roost and Range OI can retrofit older devices with Internet of Things capabilities. One of the most useful IoT applications can be found in Jins Meme glasses and sunglasses from Japan, which embed wearable sensors, shaped much like Bluetooth headsets, to detect drowsiness in the wearer. The glasses observe eye movement and blinking frequency to identify tiredness or bad posture and communicate via iOS and Android smartphone apps. Finally, Mellow is a new kind of kitchen robot that makes life easier by cooking ingredients to perfection while someone is away from home. Mellow is a sous-vide machine that takes orders through your smartphone and keeps food cold until it’s the exact time to start cooking.
Each of the applications mentioned above deals with volumes of data, both in real time and in stored form. Such data needs to be properly validated, cleansed, and made available at the moment of user engagement. In addition to Informatica’s Intelligent Data Platform, the newly introduced Informatica Rev product can connect data coming from all sources, including IoT devices, and make it available to everyone. What opportunity does IoT present to your organization? Where are the biggest opportunities to disrupt the status quo?
In 2014, Informatica Cloud focused a great deal of attention on the needs and challenges of the citizen integrator. These are the critical business users at the core of every company: The customer-facing sales rep at the front, as well as the tireless admin at the back. We all know and rely on these men and women. And up until very recently, they’ve been almost entirely reliant on IT for the integration tasks and processes needed to be successful at their jobs.
A lot of that has changed over the last year or so. In a succession of releases, we provided these business users with the tools to take matters into their own hands. And with the assistance of key ecosystem partners, such as Salesforce, SAP, Amazon, Workday, NetSuite and the hundreds of application developers that orbit them, we’ve made great progress toward giving business users the self-sufficiency they need, and demand. But beyond giving these users the tools to integrate and connect with their apps and information at will, what we’ve really done is give them the ability to focus their attention and efforts on their most valuable customers. By doing so, we have gotten to the core of the real purpose and importance of the whole cloud project or enterprise: the customer relationship.
In a recent Fortune interview, Salesforce CEO and cloud evangelist Marc Benioff echoed that idea when he stated that “The CEO is now in charge of the customer relationship.” What he meant by that is companies now have the ability to tie all aspects of their marketing – website, customer service, email marketing, social, sales, etc. – into “one canonical file” with all the respective customer information. By organizing the enterprise around the customer this way, the company can then pivot all of their efforts toward the customer relationship, which is what is required if a business is going to have and sustain success as we move through the 2010s and beyond.
We are in complete agreement with Marc and think it wouldn’t be too much of a stretch to declare 2015 as the year of the customer relationship. In fact, helping companies and business users focus their attention toward the customer has been a core focus of ours for some time. For an example, you don’t have to look much further than the latest iteration of our real-time application integration capability.
In a short video demo that I recommend to everyone, my colleague Eric does a fantastic job of walking users through the real-time features available through the Informatica Cloud platform.
As the demo shows, the real-time features let you build a workflow process application that interacts with data from cloud and on-premise sources right from the Salesforce user interface (UI). It’s quick and easy, allowing you to devote more time to your customers and less time to “plumbing.”
The workflows themselves are created with the help of a drag-and-drop process designer that enables the user to quickly create a new process and configure the parameters, inputs and outputs, and decision steps with the click of a few buttons.
Once the process guide is created, it displays as a window embedded right in the Salesforce UI. So if, for example, you’ve created an opportunity-to-order guide, you can follow a wizard-driven process that walks your users from new opportunity creation through to the order confirmation, and everything in between.
As users move through the process, they can interact in real time with data from any on-premise or cloud-based source they choose. In the example from the video, the user, Eric, chooses a likely prospect from a list of company contacts, and with a few keystrokes creates a new opportunity in Salesforce. In a further demonstration of the real-time capability, Eric performs a NetSuite query, logs a client call, escalates a case to customer service, pulls the latest price book information from an Oracle database, builds out the opportunity items, creates the order in SAP, and syncs it all back to Salesforce, all without leaving the wizard interface.
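The orchestration pattern behind that demo can be sketched in a few lines of Python. To be clear, the functions below are illustrative stand-ins with made-up names, not real Informatica Cloud, Salesforce, NetSuite, Oracle, or SAP APIs; a real guide would invoke the platform's connectors at each wizard step.

```python
# Hypothetical sketch of an opportunity-to-order flow: each step reads or
# writes one system, and the result is synced back at the end. All names
# and data here are invented for illustration.

PRICE_BOOK = {"WIDGET-100": 25.0, "WIDGET-200": 40.0}  # stands in for the Oracle price book

def create_opportunity(contact):
    """Create a new opportunity record from a contact (the Salesforce step)."""
    return {"contact": contact["name"], "items": [], "status": "open"}

def add_item(opportunity, sku, qty):
    """Add a line item priced from the price book (the Oracle lookup step)."""
    opportunity["items"].append({"sku": sku, "qty": qty, "price": PRICE_BOOK[sku]})
    return opportunity

def create_order(opportunity):
    """Turn the opportunity into an order (the SAP step), ready to sync back."""
    total = sum(item["price"] * item["qty"] for item in opportunity["items"])
    opportunity["status"] = "ordered"
    return {"contact": opportunity["contact"], "total": total}

opp = create_opportunity({"name": "Acme Corp"})
opp = add_item(opp, "WIDGET-100", 4)
order = create_order(opp)
print(order)  # {'contact': 'Acme Corp', 'total': 100.0}
```

The value of the wizard-driven guide is exactly this sequencing: the user never leaves one interface, while each step touches a different back-end system.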
The capabilities available via Informatica Cloud’s application integration are a gigantic leap forward for business users and an evolutionary step toward pivoting the enterprise toward the customer. As 2015 takes hold we will see this become increasingly important as companies continue to invest in the cloud. This is especially true for those cloud applications, like the Salesforce Analytics, Marketing and Sales Clouds, that need immediate access to the latest and most reliable customer data to make them all work — and truly establish you as the CEO in charge of customer relationships.
The technology you use in your business can either help or hinder your business objectives.
In the past, slow and manual processes had an inhibiting effect on customer services and sales interactions, thus dragging down the bottom line.
Now, with cloud technology and customers interacting at record speeds, companies expect greater returns from each business outcome. What do I mean when I say business outcome?
Well according to Bluewolf’s State of Salesforce Report, you can split these into four categories: acquisition, expansion, retention and cost reduction.
With the right technology and planning, a business can speedily acquire more customers, expand to new markets, increase customer retention and ensure it is doing all of this efficiently and cost-effectively. But what happens when the data, or the way you interact with these technologies, grows unchecked or becomes corrupted and unreliable?
With data being the new fuel for decision-making, you need to make sure it’s clean, safe and reliable.
With clean data, Salesforce customers, in the above-referenced Bluewolf survey, reported efficiency and productivity gains (66%), improved customer experience (34%), revenue growth (32%) and cost reduction (21%) in 2014.
It’s been said that it costs a business 10X more to acquire new customers than it does to retain existing ones. But, despite the additional cost, real continued growth requires the acquisition of new customers.
Gaining new customers, however, requires a great sales team who knows what and to whom they’re selling. With Salesforce, you have that information at your fingertips, and the chance to let your sales team be as good as they can possibly be.
And this is where having good data fits in and becomes critically important. Because, well, you can have great technology, but it’s only going to be as good as the data you’re feeding it.
The same “garbage in, garbage out” maxim holds true for practically any data-driven or –reliant business process or outcome, whether it’s attracting new customers or building a brand. And with the Salesforce Sales Cloud and Marketing Cloud you have the technology to both attract new customers and build great brands, but if you’re feeding your Clouds with inconsistent and fragmented data, you can’t trust that you’ve made the right investments or decisions in the right places.
The combination of good data and technology can help to answer so many of your critical business questions. How do I target my audience without knowledge of previous successes? What does my ideal customer look like? What did they buy? Why did they buy it?
For better or worse, but mainly better, answering those questions with just your intuition and/or experience is pretty much out of the question. Without the tool to look at, for example, past campaigns and sales, and combining this view to see who your real market is, you’ll never be fully effective.
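As a rough illustration of that combined view (using made-up data, not any particular CRM's schema), joining past campaign responses to closed sales shows which segment actually converts:

```python
# Toy example: which industry segment from past campaigns actually closed?
campaign_responses = [
    {"lead": "A", "industry": "Finance"},
    {"lead": "B", "industry": "Retail"},
    {"lead": "C", "industry": "Finance"},
    {"lead": "D", "industry": "Health"},
]
closed_sales = {"A", "C"}  # leads that became customers

# Count closed deals per industry segment.
wins_by_industry = {}
for response in campaign_responses:
    if response["lead"] in closed_sales:
        segment = response["industry"]
        wins_by_industry[segment] = wins_by_industry.get(segment, 0) + 1

best_segment = max(wins_by_industry, key=wins_by_industry.get)
print(best_segment)  # Finance
```

Even this trivial join answers "who is my real market?" in a way intuition alone cannot, and it only works if the underlying lead and sales records are clean.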
The same is true for sales. Without the right Leads, and without the ability to interact with those Leads effectively (having the right contact details and company information, and knowing there is only one version of that record), the discovery process can be long and painful.
But customer acquisition isn’t the only place where data plays a vital role.
When expanding to new markets or upselling and cross selling to existing customers, it’s the data you collect and report on that will help inform where you should focus your efforts.
Knowing what existing relationships you can leverage can make the difference between proactively offering solutions to your customers and losing them to a competitor. With Salesforce’s Analytics Cloud, this visibility that used to take weeks and months to view can now be put together in a matter of minutes. But how do you make strategic decisions on what market to tap into or what relationships to leverage, if you can only see one or two regions? What if you could truly visualize how you interact with your customers? Or see beyond the hairball of interconnected business hierarchies and interactions to know definitively what subsidiary, household or distributor has what? Seeing the connections you have with your customers can help uncover the white space that you could tap into.
Naturally, this entire process means nothing if you’re not actually retaining these customers. Again, this is another area that is fueled by data. Knowing who your customers are, what issues they’re having and what they could want next can help ensure you are always providing your customers with the ultimate experience.
Last, but by no means least, there is cost reduction. Only by ensuring that all of this data is clean — and continuously cleansed — and your Cloud technologies are being fully utilized, can you then help ensure the maximum return on your Cloud investment.
Learn more about how Informatica Cloud can help you maximize your business outcomes through ensuring your data is trusted in the Cloud.
Like me, you probably just returned from an inspiring Sales Kick Off 2015 event. You’ve invested in talented people. You’ve trained them with the skills and knowledge they need to identify, qualify, validate, negotiate and close deals. You’ve invested in world-class applications, like Salesforce Sales Cloud, to empower your sales team to sell more effectively. But does your sales team have what they need to succeed in 2015?
Gartner predicts that as early as next year, companies will compete primarily on the customer experiences they deliver. So, every customer interaction counts. Knowing your customers is key to delivering great sales experiences.
But, inaccurate, inconsistent and disconnected customer information may be holding your sales team back from delivering great sales experiences. If you’re not fueling Salesforce Sales Cloud (or another Sales Force Automation (SFA) application) with clean, consistent and connected customer information, your sales team may be at a disadvantage against the competition.
To successfully compete and deliver great sales experiences more efficiently, your sales team needs a complete picture of their customers. They don’t want to pull information from multiple applications and then reconcile it in spreadsheets. They want direct access to the Total Customer Relationship across channels, touch points and products within their Salesforce Sales Cloud.
Watch this short video comparing a day-in-the-life of two sales reps competing for the same business. One has access to the Total Customer Relationship in Salesforce Sales Cloud, the other does not. Watch now: Salesforce.com with Clean, Consistent and Connected Customer Information.
Is your sales team spending time creating spreadsheets by pulling together customer information from multiple applications and then reconciling it to understand the Total Customer Relationship across channels, touch points and products? If so, how much is it costing your business? Or is your sales team engaging with customers without understanding the Total Customer Relationship? How much is that costing your business?
Many innovative sales leaders are gaining a competitive edge by better leveraging their customer data to empower their sales teams to deliver great sales experiences. They are fueling business and analytical applications, like Salesforce Sales Cloud, with clean, consistent and connected customer information. They are arming their sales teams with direct access to richer customer profiles, which includes the Total Customer Relationship across channels, touch points and products.
What measurable results have these sales leaders achieved? Merrill Lynch boosted sales productivity by 15%, resulting in $50M in annual impact. A $60B manufacturing company improved cross-sell and up-sell success by 5%. Logitech increased sales across channels: online, in their retail partners’ stores and through distribution partners.
This year, I believe more sales leaders will focus on leveraging their customer information for competitive advantage. This will help them shift from sales automation to sales optimization. What do you think?
A friend of mine recently reached out to me for advice on CRM solutions in the market. Though I have never worked for a CRM vendor, my experience ranges from working directly for companies that implemented such solutions to my current role interacting with large and small organizations about the data requirements supporting their ongoing application investments across industries. As we spoke, memories surfaced of the time he and I worked on implementing Salesforce.com (SFDC) many years ago. Memories we wanted to forget, but worth calling out given his new situation.
We worked together at a large mortgage lending software vendor selling loan origination solutions to brokers and small lenders, mainly through email and snail mail marketing. He was responsible for Marketing Operations, and I ran Product Marketing. The company looked at Salesforce.com to help streamline our sales operations and improve how we marketed to and serviced our customers. The existing CRM system was from the early 90s, and though it did what the company needed it to do, it was heavily customized, costly to operate, and had served its useful life. It was time to upgrade to help grow the business, improve business productivity, and enhance customer relationships.
Ninety days after rolling out SFDC, we ran into some old familiar problems across the business. Sales reps still struggled to know who was a current customer using our software, marketing managers could not create quality mailing lists for prospecting, and call center reps could not tell whether the person on the other end was a customer or a prospect. Everyone wondered why this was happening, given that we had adopted the best CRM solution in the market. You can imagine the heartburn and ulcers we all had after making such a huge investment in our new CRM solution. C-level executives questioned our decisions and blamed the applications. The truth was, the issues were not related to SFDC. It was the data we had migrated into the system, along with the lack of proper governance and of a capable information architecture to support the required data management integration between systems, that caused these significant headaches.
During the implementation phase, IT imported our entire customer database of 200K+ unique customer entities from the old system into SFDC. Unfortunately, the mortgage industry was very transient: on average there were only roughly 55K licensed mortgage brokers and lenders in the market, and because no one had ever validated who was really a current customer versus someone who had ever bought our product, we had serious data quality issues, including:
- Trial users who had purchased evaluation copies of our products, now expired, were tagged as current customers
- Duplicate records, caused by manual data entry errors, in which companies with similar names entered slightly differently, but sharing the same business address, were tagged as unique customers
- Subsidiaries of parent companies in different parts of the country were each tagged as a unique customer
- Lastly, we imported the marketing contact database of prospects, who were incorrectly counted as customers in the new system
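The near-duplicate problem above (similar company names at the same address) is the kind of issue even a crude fuzzy-matching pass can surface. Here is a minimal sketch using Python's standard-library SequenceMatcher, with invented records; production MDM matching is far more sophisticated than this:

```python
# Flag likely duplicate customer records: similar names, identical address.
from difflib import SequenceMatcher

records = [
    {"id": 1, "name": "Acme Mortgage Inc.", "address": "12 Main St"},
    {"id": 2, "name": "ACME Mortgage, Inc", "address": "12 Main St"},
    {"id": 3, "name": "Bayside Lending",    "address": "9 Pier Rd"},
]

def likely_duplicates(recs, threshold=0.85):
    """Return ID pairs whose names are similar and addresses identical."""
    pairs = []
    for i, a in enumerate(recs):
        for b in recs[i + 1:]:
            similarity = SequenceMatcher(
                None, a["name"].lower(), b["name"].lower()).ratio()
            if similarity >= threshold and a["address"] == b["address"]:
                pairs.append((a["id"], b["id"]))
    return pairs

print(likely_duplicates(records))  # [(1, 2)]
```

A pass like this before migration would have caught the duplicate-company records that ended up tagged as unique customers.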
We also failed to integrate real-time purchasing data from our procurement systems so that sales and support could handle customer requests. Instead of integrating that data in real time with proper technology, IT manually loaded these records at the end of each week via FTP, resulting in incorrect billing information, statement processing errors, and a ton of complaints from customers through our call center. The price we paid for not paying attention to our data quality and integration requirements before rolling out Salesforce.com was significant for a company of our size. For example:
- Marketing got hit pretty hard. Each quarter we mailed evaluation copies of new products to our customer database of 200K, each costing the company $12 to produce and mail. Total cost = $2.4M annually. Because the data was so bad, 60% of our mailings were returned due to invalid addresses or wrong contact information. The cost of bad data to marketing = $1.44M annually.
- Next, Sales struggled miserably when trying to upgrade customers through cold-call campaigns using the names in the database. As a result, sales productivity dropped by 40%, and we experienced over 35% sales turnover that year. Within a year of using SFDC, our head of sales got let go. Not good!
- Customer support used SFDC to service customers, and our average call times were 40 minutes per service ticket. We believed that was “business as usual” until we surveyed how reps were spending their time each day; over 50% said it went to dealing with billing issues caused by bad contact information in the CRM system.
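The marketing figures quoted above follow from simple arithmetic, which can be sanity-checked in a few lines:

```python
# Back-of-the-envelope check of the mailing-cost figures in the text.
customers_mailed = 200_000
cost_per_mailing = 12     # dollars to produce and mail one evaluation copy
return_rate = 0.60        # share returned due to bad addresses or contacts

total_cost = customers_mailed * cost_per_mailing
wasted_cost = total_cost * return_rate
print(total_cost)   # 2400000 -> the $2.4M total
print(wasted_cost)  # 1440000.0 -> the $1.44M lost to bad data
```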
At the end of our conversation, this was my advice to my friend:
- Conduct a data quality audit of the systems that will interact with the CRM system. Audit how complete your critical master and reference data is, including names, addresses, customer IDs, etc.
- Do this before you invest in a new CRM system. You may find that many of the challenges with your existing applications are caused by data gaps rather than by the legacy application itself.
- If you have a data governance program, involve that team in the CRM initiative to ensure they understand your requirements and can see how they can help.
- However, if you do decide to modernize, collaborate with and involve your IT teams, especially your Application Development teams and your Enterprise Architects, to ensure all of the best options are considered for your data sharing and migration needs.
- Lastly, consult with your technology partners, including your new CRM vendor; they may be working with solution providers to help address these data issues, as you are probably not the only one in this situation.
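The first recommendation, a completeness audit of critical master and reference data, can start as something as simple as counting populated fields per record. The field names and records below are illustrative, not a prescribed schema:

```python
# Minimal completeness audit: what fraction of records populate each
# critical field? Empty strings and None both count as missing.
records = [
    {"customer_id": "C1", "name": "Acme",  "address": "12 Main St"},
    {"customer_id": "C2", "name": "",      "address": "9 Pier Rd"},
    {"customer_id": None, "name": "Bay",   "address": ""},
]
critical_fields = ["customer_id", "name", "address"]

def completeness_report(recs, fields):
    """Return the fraction of records with each field populated."""
    report = {}
    for field in fields:
        populated = sum(1 for r in recs if r.get(field))
        report[field] = populated / len(recs)
    return report

print(completeness_report(records, critical_fields))
```

Running a report like this against every system that will feed the CRM gives a concrete baseline before any migration begins.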
CRM systems have come a long way in today’s Big Data and Cloud era. Many firms are adopting more flexible solutions offered through the cloud, like Salesforce.com, Microsoft Dynamics, and others. Regardless of how old or new, on premise or in the cloud, companies invest in CRM not just to serve their sales teams or increase marketing conversion rates, but to improve their business relationships with their customers. Period! It’s about ensuring the data in these systems is trustworthy, complete, up to date, and actionable, so you can improve customer service and drive sales of new products and services to increase wallet share. So how do you maximize the business potential of these critical business applications?
Whether you are adopting your first CRM solution or upgrading an existing one, keep in mind that Customer Relationship Management is a business strategy, not just a software purchase. It’s also about having a sound and capable data management and governance strategy supported by people, processes, and technology to ensure you can:
- Access and migrate data from old systems to new, avoiding development cost overruns and project delays.
- Identify, detect, and distribute transactional and reference data from existing systems into your front-line business applications in real time!
- Manage data quality errors, including duplicate records and invalid names and contact information, through proper data governance and proactive data quality monitoring and measurement during and after deployment.
- Govern and share authoritative master records of customer, contact, product, and other master data between systems in a trusted manner.
Will your data be ready for your new CRM investments? To learn more:
- Download Salesforce Integration for Dummies
- Download a new Whitepaper on how to Maximize Integration ROI with a Hybrid Approach
- Consolidating Multiple Salesforce Orgs: A Best Practice Guide
- Sign up for a 30 Day Trial of Informatica Cloud Integration
Follow me on Twitter @DataisGR8
The last few months have allowed me to see MDM, data quality and data governance from a completely different perspective. I sat with other leaders here at Informatica, analysts who focus on information quality and spent time talking to our partners who work closely with customers on data management initiatives. As we collectively attempted to peer into the crystal ball and forecast what will be hot – and what will not – in this year and beyond for MDM and data quality, here are few top predictions that stood out.
1. MDM will become a single platform for all master entities
“The classical notion of boundaries that existed where we would say, this is MDM versus this is not MDM is going to get blurred,” says Dennis Moore – SVP, Information Quality Solutions (IQS), Informatica. “Today, we master a fairly small number of attributes in MDM. Rather than only mastering core attributes, we need to master business level entities, like customer, product, location, assets, things, etc., and combine all relevant attributes into a single platform which can be used to develop new “data fueled” applications. This platform will allow mastering of data, aggregate data from other sources, and also syndicate that data out into other systems.”
Traditionally MDM was an invisible hub that was connected to all the spokes. Instead, Dennis says – “MDM will become more visible and will act as an application development platform.”
2. PIM is becoming a more integrated environment that covers all product information and related data in a single place
More and more customers want a single interface that allows them to manage all product information. Along with managing a product’s length, width, height, color, cost, etc., they probably want to see data about its history, credit rating, previous quality rating, sustainability scorecard, returns, credits and so on. Dennis says: “All the product information in one place helps make better decisions with embedded analytics, giving answers to questions such as:
- What were my sales last week?
- Which promotions are performing well and poorly?
- Which suppliers are not delivering on their SLAs?
- Which stores aren’t selling according to plan?
- How are the products performing in specific markets?”
Essentially, PIM will become the authoritative source of the product data that goes into your catalog and ecommerce systems, used by merchandisers, buyers, and product and category managers. It will become the buyer’s guide and a desktop for the person whose job is to figure out how to effectively promote products to meet sales targets.
3. MDM will become an integral part of big data analytics projects
“Big data analytics suffers from the same challenges as traditional data warehouses: bad data quality produces sub-optimal intelligence. MDM has traditionally enabled better analysis and reporting with high-quality master data. Big data analytics will also benefit immensely from MDM’s trustworthy information,” said Ravi Shankar, VP of Product Marketing, MDM, Informatica.
Naveen Sharma, who heads the Enterprise Data Management practice at Cognizant, reemphasized what I heard from Dennis. He says, “With big data and information quality coming together, some of the boundaries between a pure MDM system and a pure analytical system will start to soften.” Naveen explains, “MDM is now seen as an integral part of big data analytics projects, and that’s a huge change from a couple of years ago. Two of the large retailers we work with are going down the path of bringing not only the customer dimension but also the associated transactional data into an extended MDM platform to derive meaning. I see this trend continuing in 2015 and beyond in other verticals as well.”
4. Business requirements are leading to the creation of solutions
There are several business problems being solved by MDM, such as improving supplier spend management and collaboration with better supplier data. Supply chain, sourcing and procurement teams gain significant cost savings and a boost in productivity by mastering supplier, raw materials and product information and fueling their business and analytical applications with that clean, consistent and connected information. Jakki Geiger, Senior Director of IQS Solutions Marketing at Informatica says, “Business users want more than just the underlying infrastructure to manage business-critical data about suppliers, raw materials, and products. They want to access this information directly through a business-friendly user interface. They want a business process-driven workflow to manage the full supplier lifecycle, including: supplier registration, qualification, verification, onboarding and off-boarding. Instead of IT building these business-user focused solutions on top of an MDM foundation, vendors are starting to build ready-to-use MDM solutions like the Total Supplier Relationship solution.” Read more about Valspar’s raw materials spend management use case.
5. Increased adoption of matching and linking capabilities on Hadoop
“Many of our customers have significantly increased the amount of data they want to master,” says Dennis Moore. The days when tens of millions of master records were a lot are long gone; hundreds of millions of master records and billions of source records are becoming almost common. An increasing number of master data sources, internal and external to the organization, are contributing significantly to the rise in data volumes. To accommodate these volumes, Dennis predicts that large enterprises will look at running complex matching and linking capabilities on Hadoop, a cost-effective and flexible way to analyze large amounts of data.
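A standard way to make matching and linking tractable at that scale, on Hadoop or anywhere else, is blocking: compare records only within small groups that share a cheap key, so the expensive pairwise comparison never runs across the whole data set. This toy sketch (invented records, deliberately simplistic key) shows the divide-and-conquer pattern a distributed job would apply per node:

```python
# Blocking for record linkage: group by a cheap key, then compare pairs
# only within each block instead of across all records.
from collections import defaultdict

records = [
    {"id": 1, "name": "Jonathan Smith"},
    {"id": 2, "name": "Jon Smith"},
    {"id": 3, "name": "Maria Garcia"},
]

def block_key(record):
    """Cheap blocking key: first three letters of the normalized name."""
    return record["name"].lower().replace(" ", "")[:3]

blocks = defaultdict(list)
for record in records:
    blocks[block_key(record)].append(record)

# Candidate pairs for detailed matching exist only inside each block.
candidate_pairs = [
    (a["id"], b["id"])
    for block in blocks.values()
    for i, a in enumerate(block)
    for b in block[i + 1:]
]
print(candidate_pairs)  # [(1, 2)]
```

With billions of source records, reducing the comparison space this way is what makes match-and-link jobs feasible at all; a real implementation would use multiple, more robust keys.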
6. Master insight management is going to be the next big step
“MDM will evolve into master insight management as organizations try to relate the trusted data they created in MDM with transactional and social interaction data,” said Rob Karel – VP of Product Strategy and Product Marketing, IQS, Informatica. “Innovations in machine and deep learning techniques will help healthcare organizations prescribe the next best treatment based on patient history, help retailers suggest the best offers based on customer interest and behavior, and let public sector organizations take big steps forward in social services, etc.”
Rob sees MDM at the heart of this innovation bringing together relevant information about multiple master entities and acting as a core system for insight management innovations.
7. MDM and Data Governance
Aaron Zornes – chief research officer at the MDM Institute – predicts that in 2014-15, vendor MDM solutions will move from “passive-aggressive” mode to “proactive” data governance mode. According to Aaron, data governance for MDM will move beyond simple stewardship to a convergence of task management, workflow, policy management and enforcement.
8. The market will solidify for cloud based MDM adoption
Aaron says – “Cloud-native services for DQ and DG will be more prevalent; however, enterprise MDM will remain on premise with increasing integration to cloud applications in 2015.”
Naveen sees a lot of synergy around cloud-based MDM offerings and says – “The market is solidifying for MDM on cloud but the flood gates are yet to open”. Naveen does not see any reason why the MDM market will not go to the cloud, and gives the example of CRM, which was at a similar junction before Salesforce came into play. Naveen sees a similar shift for MDM and says – “The fears companies have about their data security in the cloud are eventually going to fade. If you look closely at any of the recent breaches, these all involved hacks into company networks and not into cloud provider networks. The fact that cloud service providers spend more dollars on data security than any one company can spend on their on-premise security layer will be a major factor affecting the transition”. Naveen expects the big players in MDM to include cloud offerings as part of their toolkit in the coming years.
Ravi also predicts an increase in cloud adoption for MDM, as concerns about placing master data in the cloud diminish in light of the strong security measures cloud vendors provide.
So, what do you predict? I would love to hear your opinions and comments.
It’s true. Data integration is a whole new game, compared to five years ago, or, in some organizations, five minutes ago. The right approaches to data integration continue to evolve around a few principal forces: first, the growth of cloud computing, as pointed out by Stafford; second, the growing use of big data systems; and third, the emerging use of data as a strategic asset for the business.
These forces combine to drive us to the understanding that old approaches to data integration won’t provide the value that they once did. As someone who was a CTO of three different data integration companies, I’ve seen these patterns change over the time that I was building technology, and that change has accelerated in the last 7 years.
The core opportunities lie with enterprise architects, and their ability to drive an understanding of the value of data integration, as well as to drive change within their organizations. After all, they, or the enterprise’s CTOs and CIOs (whoever makes decisions about technological approaches), are supposed to steer the organization in the technical directions that will best support the business. While most enterprise architects follow the latest hype, such as cloud computing and big data, many have missed the underlying data integration strategies and technologies that will support these changes.
“The integration challenges of cloud adoption alone give architects and developers a once in a lifetime opportunity to retool their skillsets for a long-term, successful career, according to both analysts. With the right skills, they’ll be valued leaders as businesses transition from traditional application architectures, deployment methodologies and sourcing arrangements.”
The problem is that, while most agree that data integration is important, they typically don’t understand what it is or the value it can bring. These days, many developers live in a world of instant updates. With emerging DevOps approaches and infrastructure, they really don’t see the need for, or the mechanisms required by, sharing data between application or database silos. In many instances, they resort to hand-coding interfaces between source and target systems. This leads to brittle and unreliable integration solutions, which hurts rather than helps new cloud application and big data deployments.
The message is clear: Those charged with defining technology strategies within enterprises need to also focus on data integration approaches, methods, patterns, and technologies. Failing to do so means that the investments made in new and emerging technology, such as cloud computing and big data, will fail to provide the anticipated value. At the same time, enterprise architects need to be empowered to make such changes. Most enterprises are behind on this effort. Now it’s time to get to work.
I think this new capability, Salesforce Lightning Connect, is an innovative development and gives OData, an OASIS standard, a leg-up on its W3C-defined competitor Linked Data. OData is a REST-based protocol that provides access to data over the web. The fundamental data model is relational and the query language closely resembles what is possible with stripped-down SQL. This is much more familiar to most people than the RDF-based model using by Linked Data or its SPARQL query language.
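To give a flavor of how close OData queries sit to stripped-down SQL, an OData read is just an HTTP GET whose system query options (`$filter`, `$select`, `$top`) play the roles of WHERE, SELECT, and LIMIT. The sketch below builds such a URL; the service root and entity set are hypothetical, and only the query-option syntax follows the OData convention:

```python
from urllib.parse import quote

def odata_query(service_root, entity_set, filter_expr=None, select=None, top=None):
    """Build an OData read URL; roughly SELECT <select> FROM <entity_set>
    WHERE <filter_expr> LIMIT <top> in SQL terms."""
    options = []
    if filter_expr:
        options.append("$filter=" + quote(filter_expr))   # percent-encode the expression
    if select:
        options.append("$select=" + ",".join(select))
    if top is not None:
        options.append("$top=" + str(top))
    url = f"{service_root}/{entity_set}"
    return url + ("?" + "&".join(options) if options else "")

# Hypothetical endpoint: fetch the name and city of the first 10 US customers
url = odata_query("https://example.com/odata", "Customers",
                  filter_expr="Country eq 'US'", select=["Name", "City"], top=10)
print(url)
```

An OData consumer such as Lightning Connect issues GETs of this shape against a provider and maps the JSON (or Atom) response onto its own relational view.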
Standardization of OData has been going on for years (they are working on version 4), but it has suffered from a bit of a chicken-egg problem. Applications haven’t put a large priority on supporting the consumption of OData because there haven’t been enough OData providers, and data providers haven’t prioritized making their data available through OData because there haven’t been enough consumers. With Salesforce, a cloud leader declaring that they will consume OData, the equation changes significantly.
But these things take time – what does a user of Salesforce (or any other OData consumer) do if most of the data sources they have cannot be accessed as an OData provider? It is the old last-mile problem faced by any communications or integration technology. It is fine to standardize, but how do you get all the existing endpoints to conform to the standard? You need someone to do the labor-intensive work of converting lots of endpoints to the standard representation.
Informatica has been in the last-mile business for years. As it happens, the canonical model that we always used has been a relational model that lines up very well with the model used by OData. For us to host an OData provider for any of the data sources that we already support, we only needed to do one conversion from the internal format that we’ve always used to the OData standard. This OData provider capability will be available soon.
But there is also the firewall issue. The consumer of the OData has to be able to access the OData provider. So, if you want Salesforce to be able to show data from your Oracle database, you would have to open up a hole in your firewall that provides access to your database. Not many people are interested in doing that – for good reason.
Informatica Cloud’s Vibe secure agent architecture is a solution to the firewall issue that will also work with the new OData provider. The OData provider will be hosted on Informatica’s Cloud servers, but will have access to any installed secure agents. Agents require a one-time install on-premise, but are thereafter managed from the cloud and are automatically kept up-to-date with the latest version by Informatica. An agent doesn’t require a port to be opened, but instead opens an outbound connection to the Informatica Cloud servers through which all communication occurs. The agent then has access to any on-premise applications or data sources.
OData is especially well suited to reading external data. However, there are better ways of creating or updating external data. One limitation is that Salesforce Lightning Connect only handles reads; but even when a consumer does handle writes, it isn’t usually appropriate to add data to most applications by simply inserting records into tables. Usually a collection of related information must be provided for the update to make sense. To facilitate this, applications provide APIs that offer a higher level of abstraction for updates. Informatica Cloud Application Integration can be used now to read or write data to external applications from within Salesforce through the use of guides that can be displayed from any Salesforce screen. Guides make it easy to generate a friendly user interface that shows exactly the data you want your users to see, and to guide them through the collection of new or updated data that needs to be written back to your app.
The New York Times reports that beginning in this month and into January, for the first time, the Girl Scouts of America will be able to sell Thin Mints and other favorites online through invite-only websites. The websites will be accompanied by a mobile app, giving customers new digital options.
As the Girl Scouts update from a door-to-door approach to include a newly introduced digital program, it’s just one more sign of where marketing trends are heading.
From digital cookies to digital marketing technology:
If 2014 is the year of the digital cookie, then 2015 will be the year of marketing technology. Here are just a few of the strongest indicators:
- A study found that 67% of marketing departments plan to increase spending on technology over the next two years, according to the Harvard Business Review.
- Gartner predicts that by 2017, CMOs will outspend CIOs on IT-related expenses.
- Also by 2017, one-third of the total marketing budget will be dedicated to digital marketing, according to survey results from Teradata.
- A new LinkedIn/Salesforce survey found that 56% of marketers see their relationships with the CIO as very important or critical.
- Social media is a mainstream channel for marketers, making technology for measuring and managing this channel of paramount importance. This is not just true of B2C companies. Of high-level executive B2B buyers, 75% used social media to make purchasing decisions, according to a 2014 survey by market research firm IDC.
From social to analytics to email marketing, much of what marketers see in technology offerings is often labeled as “cloud-based.” While cloud technology has many features and benefits, what are we really saying when we talk about the cloud?
What the cloud means… to marketers.
Beginning around 2012, multitudes of businesses in many industries began attaching “the cloud” as a feature or a benefit to their products or services. Whether or not a business truly was cloud-based was not always clear, which led to the term “cloudwashing.” We hear so much about the cloud that it is easy to overlook what it really means and what the benefits really are.
The cloud is more than a buzzword – and in particular, marketers need to know what it truly means to them.
For marketers, “the cloud” has many benefits. A service that is cloud-based gives you amazing flexibility and choices over the way you use a product or service:
- A cloud-enabled product or service can be integrated into your existing systems. For marketers, this can range from integration into websites, marketing automation systems, CRMs, point-of-sale platforms, and any other business application.
- You don’t have to learn a new system, the way you might when adopting a new application, software package, or other enterprise system. You won’t have to set aside a lot of time and effort for new training for you or your staff.
- Due to the flexibility that lets you integrate anywhere, you can deploy a cloud-based product or service across all of your organization’s applications or processes, increasing efficiencies and ensuring that all of your employees have access to the same technology tools at the same time.
- There’s no need to worry about ongoing system updates, as those happen automatically behind the scenes.
In 2015, marketers should embrace the convenience of cloud-based services, as they help put the focus on benefits instead of spending time managing the technology.
Are you using data quality in the cloud?
If you are planning to move data out of an on-premise application or software to a cloud-based service, you can take advantage of this ideal time to ensure these data quality best practices are in place.
Verify and cleanse your data first, before it is moved to the cloud. Since it’s likely that your move to the cloud will make this data available across your organization — within marketing, sales, customer service, and other departments — applying data quality best practices first will increase operational efficiency and bring down costs from invalid or unusable data.
There may be more to add to this list, depending on the nature of your own business. Make sure that:
- Postal addresses are valid, accurate, current and complete
- Email addresses are valid
- Telephone numbers are valid, accurate, and current
- All data fields are consistent and every individual data element is clearly defined, increasing the effectiveness of future data analysis
- Missing data is filled in
- Duplicate contact and customer records are removed
Once you have cleansed and verified your existing data and moved it to the cloud, use a real-time verification and cleansing solution at the point of entry or collection to ensure good data quality across your organization on an ongoing basis.
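The point-of-entry checks above can be sketched in a few lines. The validation rule below is a deliberately simplistic stand-in for a real verification service, and the contact records are hypothetical:

```python
import re

# Simplistic email shape check -- not RFC 5322; a real service would also
# verify deliverability, postal addresses, and phone numbers.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def cleanse(records):
    """Drop records with invalid emails and remove duplicates by email."""
    seen = set()
    clean = []
    for rec in records:
        email = rec.get("email", "").strip().lower()  # normalize before comparing
        if not EMAIL_RE.match(email):
            continue   # invalid email: reject (or route to a review queue)
        if email in seen:
            continue   # duplicate record: drop
        seen.add(email)
        clean.append({**rec, "email": email})
    return clean

contacts = [
    {"name": "Ann", "email": "ann@example.com"},
    {"name": "Ann B.", "email": "ANN@example.com "},  # duplicate after normalizing
    {"name": "Bob", "email": "not-an-email"},         # invalid
]
print(cleanse(contacts))
```

Running the same check at every point of entry keeps bad records from ever reaching the shared cloud copy, rather than cleaning them up after the fact.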
The biggest roadblock to effective marketing technology is: Bad data.
Budgeting for marketing technology is going to become a bigger and bigger piece of the pie (or cookie, if you prefer) for B2C and B2B organizations alike. The first step all marketers need to take to make sure those investments fully pay off, rather than go to waste, is securing great customer data.
Marketing technology is fueled by data. A recent Harvard Business Review article listed some of the most important marketing technologies. They included tools for analytics, conversion, email, search engine marketing, remarketing, mobile, and marketing automation.
What do they all have in common? These tools all drive customer communication, engagement, and relationships, all of which require valid and actionable customer data to work at all.
You can’t plan your marketing strategy off of data that tells you the wrong things about who your customers are, how they prefer to be contacted, and what messages work the best. Make data quality a major part of your 2015 marketing technology planning to get the most from your investment.
Marketing technology is going to be big in 2015 — where do you start?
With all of this in mind, how can marketers prepare for their technology needs in 2015? Get started with this free virtual conference from MarketingProfs that is totally focused on marketing technology.
This great event includes a keynote from Teradata’s CMO, Lisa Arthur, on “Using Data to Build Strong Marketing Strategies.” Register here for the December 12 Marketing Technology Virtual Conference from MarketingProfs.
Even if you can’t make it live that day at the virtual conference, it’s still smart to sign up so you receive on-demand recordings from the sessions when the event ends. Register now!
With the Winter 2015 Release, Informatica Cloud Advances Real Time and Batch Integration for Citizen Integrators Everywhere
The first of these is in the area of connectivity and brings a whole new set of features and capabilities to those who use our platform to connect with Salesforce, Amazon Redshift, NetSuite and SAP.
Starting with Amazon, the Winter 2015 release leverages the new Redshift Unload Command, giving any user the ability to securely perform bulk queries, and quickly scan and place multiple columns of data in the intended target, without the need for ODBC or JDBC connectors. We are also ensuring the data is encrypted at rest on the S3 bucket while loading data into Redshift tables; this provides an additional layer of security around your data.
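For context, Redshift’s UNLOAD command writes a query result to S3 in parallel. A minimal sketch of composing such a statement follows; the bucket, prefix, and IAM role are hypothetical placeholders, and this is just the SQL-string side of what a connector would submit:

```python
def build_unload(query, s3_path, iam_role, encrypted=True):
    """Compose a Redshift UNLOAD statement; single quotes inside the inner
    query must be doubled per Redshift's quoting rules."""
    escaped = query.replace("'", "''")
    stmt = (f"UNLOAD ('{escaped}') "
            f"TO '{s3_path}' "
            f"IAM_ROLE '{iam_role}'")
    if encrypted:
        stmt += " ENCRYPTED"   # server-side encryption for the files landed on S3
    return stmt

# Hypothetical export: customer rows unloaded to an S3 prefix
print(build_unload("select id, name from customers where country = 'US'",
                   "s3://my-bucket/exports/customers_",
                   "arn:aws:iam::123456789012:role/RedshiftUnload"))
```

Because UNLOAD runs inside Redshift and lands files directly on S3, no row data flows through the client, which is what lets bulk extracts skip ODBC/JDBC bottlenecks.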
For SAP, we’ve added the ability to balance the load across all application servers. With the new enhancement, we use a Type B connection to route our integration workflows through an SAP message server, which then connects with any available SAP application server. Now if an application server goes down, your integration workflows won’t go down with it. Instead, you’ll automatically be connected to the next available application server.
Additionally, we’ve expanded the capability of our SAP connector by adding support for ECC5. While our connector came out of the box with ECC6, ECC5 is still used by a number of our enterprise customers. The expanded support now provides them with the full coverage they and many other larger companies need.
Finally, for Salesforce, we’re updating to the newest versions of their APIs (Version 31) to ensure you have access to the latest features and capabilities. The upgrades are part of an aggressive roadmap strategy, which places updates of connectors to the latest APIs on our development schedule the instant they are announced.
The second major platform enhancement for the Winter 2015 release has to do with our Cloud Mapping Designer and is sure to please those familiar with PowerCenter. With the new release, PowerCenter users can perform secure hybrid data transformations – and sharpen their cloud data warehousing and data analytic skills – through a familiar mapping and design environment and interface.
Specifically, the new enhancement enables you to take a mapplet you’ve built in PowerCenter and bring it directly into the Cloud Mapping Designer, without any additional steps or manipulations. With the PowerCenter mapplets, you can perform multi-group transformations on objects, such as BAPIs. When you access the Mapplet via the Cloud Mapping Designer, the groupings are retained, enabling you to quickly visualize what you need, and navigate and map the fields.
Additional productivity enhancements to the Cloud Mapping Designer extend the lookup and sorting capabilities and give you the ability to upload or delete data automatically based on specific conditions you establish for each target. And with the new feature supporting fully parameterized, unconnected lookups, you’ll have increased flexibility in runtime to do your configurations.
The third and final major Winter release enhancement is to our Real Time capability. Most notable is the addition of three new features that improve the usability and functionality of the Process Designer.
The first of these is a new “Wait” step type. This new feature applies to both processes and guides and enables the user to add a time-based condition to an action within a service or process call step, and indicate how long to wait for a response before performing an action.
When used in combination with the Boundary timer event variation, the Wait step can be added to a service call step or sub-process step to interrupt the process or enable it to continue.
The second is a new select feature in the Process Designer which lets users create their own service connectors. Now when a user is presented with multiple process objects created when the XML or JSON is returned from a service, he or she can select the exact ones to include in the connector.
An additional Generate Process Objects feature automates the creation of objects, thus eliminating the tedious task of replicating whole service responses containing hierarchical XML and JSON data for large structures. These can now be conveniently auto-generated when testing a Service Connector, saving integration developers a lot of time.
The final enhancement for the Process Designer makes it simpler to work with XML-based services. The new “Simplified XML” feature for the “Get From” field treats attributes as children, removing the namespaces and making sibling elements into an object list. Now if a user only needs part of the returned XML, they just have to indicate the starting point for the simplified XML.
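A rough approximation of that kind of simplification can be written with the standard library alone. This is an illustrative sketch, not Informatica’s implementation, and the exact rules of the product feature may differ: attributes become children, namespaces are stripped, and repeated sibling elements collapse into a list.

```python
import xml.etree.ElementTree as ET

def strip_ns(tag):
    """Remove a '{namespace}' prefix from a tag name."""
    return tag.split("}", 1)[-1]

def simplify(elem):
    """Convert an XML element to a dict: attributes are treated as children,
    and repeated sibling elements become a list."""
    node = {strip_ns(k): v for k, v in elem.attrib.items()}
    for child in elem:
        key, value = strip_ns(child.tag), simplify(child)
        if key in node:                       # repeated sibling -> object list
            if not isinstance(node[key], list):
                node[key] = [node[key]]
            node[key].append(value)
        else:
            node[key] = value
    if not node:                              # leaf element: just its text
        return (elem.text or "").strip()
    return node

# Hypothetical service response with repeated <item> siblings
doc = ET.fromstring(
    '<order id="7"><item sku="a1"/><item sku="b2"/><note>rush</note></order>')
print(simplify(doc))
```

Picking a starting element before calling `simplify` mirrors the feature’s option of indicating the starting point when only part of the returned XML is needed.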
While those conclude the major enhancements, additional improvements include:
- A JMS Enqueue step is now available to submit an XML or JSON message to a JMS Queue or Topic accessible via a secure agent.
- Dequeuing (queue and topics) of XML or JSON request payloads is now fully supported.