Category Archives: Cloud Computing

What do CIOs think about when integrating Salesforce?

Salesforce.com is one of the most widely used cloud applications across every industry. Salesforce initially gained dominance among mid-market customers because of the agility and ease of deployment that the SaaS approach delivered. A cloud-based CRM system enabled SMB companies to easily automate the sales processes that recorded customer interactions during the sales cycle, and to scale without costly infrastructure to maintain. The result was faster growth, and with it a rapid ROI on most Salesforce deployments.

The Eye of the Enterprise

When larger enterprises saw the rapid growth that mid-market players had achieved, they realized that Salesforce was a unique technology enabler that could help their businesses speed time to market and scale more effectively as well. In most enterprises, Salesforce deployments were driven by line-of-business units such as Sales and Customer Service, with varying degrees of coordination with central IT groups – in fact, most initial Salesforce orgs were deployed fairly autonomously from central IT.

With Great Growth Comes Greater Integration Challenges

When these business units needed to engage with each other on cross-functional tasks, the lack of a single customer view across the siloed Salesforce instances became a problem. Each individual Salesforce org had its own version of the truth, and it was impossible to locate where each customer stood in the sales cycle with respect to each business unit. As a consequence, cross-selling and upselling became very difficult. In short, the very application that had been a key technology enabler for growth was now making it harder to meet business objectives.

Scaling for Growth with Custom Apps

While many companies use the pre-packaged functionality in Salesforce, ISVs have also begun building custom apps on the Force.com platform because of its extensibility and rapid customization features. By using Salesforce to build native applications from the ground up, they can design innovative user interfaces that expose powerful functionality to end users. To truly add value, however, it is not just the user interface that matters, but also the back end of the technology stack. This is especially evident when it comes to aggregating data from several sources and surfacing it in the custom Force.com apps.

On April 23rd at 10am PDT, you’ll hear how the CIOs of two different companies tackled these integration challenges with Salesforce: Eric Johnson of Informatica, a Rising Star finalist in the 2013 Silicon Valley Business Journal CIO Awards, and Derald Sue of InsideTrack, one of Computerworld’s 2014 Premier 100 IT Leaders.


The Need for Specialized SaaS Analytics


SaaS companies are growing rapidly and becoming a top priority for most CIOs. With such high growth expectations, many SaaS vendors invest heavily in sales and marketing to acquire new customers, even if it means running a negative net profit margin as a result. Moreover, the pressure to grow rapidly creates urgency to raise the Average Sales Price (ASP) of every transaction in order to meet revenue targets.

The nature of the cloud allows these SaaS companies to release new features every few months, which sales reps can then promote to new customers. When new functionality is neither used nor understood, customers often feel that they have overpaid for a SaaS product. In such cases, customers usually downgrade to a lower-priced edition or, worse, leave the vendor entirely. To make up for the loss, sales representatives must work harder to acquire new leads, which leaves less attention for existing customers. Preventing customer churn is therefore critical: the Cost to Acquire a Customer (CAC) for upsells is 19% of the CAC to acquire new customer dollars, while the CAC to renew existing customers is only 15%.

Accurate customer usage data helps determine which features customers use and which are underutilized. Gathering this data can help pinpoint high-value features that go unused, especially among customers that have recently upgraded to a higher edition. Collecting this data involves several touchpoints – from recording clicks within the app to analyzing the open rate of entire modules. This is where embedded cloud integration comes into play.

Embedding integration within a SaaS application allows vendors to gain operational insights into each aspect of how their app is being used. With this data, vendors are able to provide feedback to product management in regards to further improvements. Additionally, embedding integration can alert the customer success management team of potential churn, thereby allowing them to implement preventative measures.
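As a rough illustration of the kind of analysis this enables, here is a minimal Python sketch that aggregates usage events to flag paid-for features that go unused; the event schema and feature names are entirely hypothetical, not Informatica’s or Gainsight’s actual data model:

```python
from collections import Counter

# Hypothetical click-stream events captured in the app: (customer_id, feature).
events = [
    ("acme", "reports"), ("acme", "reports"), ("acme", "dashboards"),
    ("globex", "reports"),
]

# Features bundled with a higher-priced edition (illustrative names only).
premium_features = {"dashboards", "forecasting", "workflow_automation"}

usage = Counter(feature for _, feature in events)

# Premium features with no recorded usage are churn warning signs --
# candidates for customer-success outreach before the customer downgrades.
unused = sorted(f for f in premium_features if usage[f] == 0)
print("Never used:", unused)  # ['forecasting', 'workflow_automation']
```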

To learn more about how a specialized analytics environment can be set up for SaaS apps, join Informatica and Gainsight on April 9th at 10am PDT for the webinar, ‘Powering Customer Analytics with Embedded Cloud Integration’.


The Typical Journey of SaaS Adoption

Within most organizations today, it is not a question of if SaaS applications should be deployed, but how quickly. The era of having to justify SaaS adoption is long over, and the focus has shifted toward deciding which SaaS applications to deploy, in which departments, and in what timeframes. With this in mind, let us explore the typical journey most companies take when deciding which SaaS applications to implement first.

Related: Learn more about customer-facing processes vs. customer fulfillment processes in the March 25th webinar, ‘Accelerate Business Velocity with NetSuite and Salesforce Integration’.

Customer Facing Processes

The main impetus behind switching to a SaaS application is the agility the cloud brings. Customizations that normally take weeks to implement take minutes or days, and can be performed by employees who do not possess in-depth knowledge of the SaaS system’s technical infrastructure. It is customer-facing processes that demand these customizations almost immediately, because optimizing them brings revenue into the company quickly, enabling CIOs to show rapid ROI on a SaaS application.

It is no wonder that Salesforce, with its front-office applications, has become one of the largest SaaS vendors today. The process of converting a lead into a closed opportunity has several steps in between, and may require multiple workflows running in parallel. But the journey doesn’t stop there. To keep customers satisfied and retain them, the product they purchased needs to be fulfilled, and this is where customer fulfillment processes come into play.

Customer Fulfillment Processes

Once an opportunity has been closed, the process of getting the product to the customer begins. Traditionally, this role has been filled by large-scale on-premise ERP vendors, but leading cloud ERP companies such as NetSuite are showing how the complex tasks of fulfilling orders and realizing revenue can be done faster. Processes such as applying category-specific price and quantity discounts, handling special tax regulations that span several regions and nations, and fulfilling through multiple delivery options all have several moving parts. Moreover, invoicing the customer, collecting payment, and recording the numerous financial transactions is an entire sub-process in and of itself, and the only practical way to streamline it is through cloud ERP applications.
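Even a toy version of the pricing step shows how many moving parts are involved. The sketch below is illustrative Python; the discount rates, quantity break, and tax rates are invented and bear no relation to NetSuite’s actual logic:

```python
# Toy lead-to-cash pricing step: category discount, quantity break, regional tax.
# All categories, rates, and thresholds here are made up for illustration.
CATEGORY_DISCOUNT = {"hardware": 0.05, "software": 0.10}
TAX_RATE = {"US-CA": 0.0875, "AU": 0.10}

def line_total(category: str, unit_price: float, qty: int, region: str) -> float:
    subtotal = unit_price * qty
    subtotal *= 1 - CATEGORY_DISCOUNT.get(category, 0.0)   # category discount
    if qty >= 100:                                         # quantity break
        subtotal *= 0.97
    return round(subtotal * (1 + TAX_RATE[region]), 2)     # regional tax

print(line_total("software", 49.0, 120, "AU"))  # 5646.56
```

Invoicing, payment collection, and the associated ledger entries each add similar branching logic on top of this.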

Optimizing the Entire Lead-to-Cash Process with Cloud Integration

When looking at customer-facing and customer-fulfillment processes together, it is clear that SaaS apps in both categories need to work hand in hand to keep an organization’s customers satisfied and engaged in repeat business. This is why organizations starting a rollout of front-office SaaS applications also need to be thinking about rolling out back-office ERP SaaS applications, along with a cloud integration solution to tie it all together. In the March 25th webinar, ‘Accelerate Business Velocity with NetSuite and Salesforce Integration’, we’ll present a blueprint for integrating both types of apps, and show how the Australian Institute of Management deployed them as part of a multi-million dollar IT transformation project.


Cloud Designer and Dynamic Rules Linking in Informatica Cloud Spring 2014

Informatica Cloud Spring 2014: Advanced Integration, Simplified for All!

Once upon a time, database schema changes were rare and handled with scrutiny. The stability of source data led to the traditional data integration model: a developer pulled a fixed number of source fields into an integration, transformed those fields, and then mapped the data into the appropriate target fields.

The world of data has profoundly changed. Today’s cloud applications allow an administrator to add custom fields to an object at a moment’s notice. Because source data is increasingly malleable, the traditional data integration model is no longer optimal. The model must evolve.

Today’s integrations must dynamically adapt to ever-changing environments.

To meet these demands, Informatica has built the Informatica Cloud Mapping Designer. The Mapping Designer gives integrations power and adaptability through its “link rules” and “incoming field rules” features. Integration developers no longer need to deal with fields one by one: Cloud Designer lets the developer specify a set of dynamic rules that tell the mapping how fields should be handled.

For example, the default rule is “Include all fields”, which is both simple and powerful. The “all fields” rule dynamically resolves to bring in as many fields as exist at the source at run time. Regardless of how many new fields the application developer or database administrator may have thrown into the source after the integration was developed, this simple rule brings all of them into the integration dynamically. This dramatically increases developer productivity, as the integration developer is no longer making modifications just to keep up with changes to the integration endpoints. Instead, the integration is “future-proofed”.

Link rules can be defined in combination using both “includes” and “excludes” criteria. The rules can be of four types:

  • Include or exclude all fields
  • Include or exclude fields of a particular datatype (e.g., string, numeric, decimal, datetime, blob)
  • Include or exclude fields that fit a name pattern (e.g., any field that ends with “__c” or starts with “Shipping_”)
  • Include or exclude fields by a particular name (e.g., “Id”, “Name”)

Any combination of the link rules can be put together to create sophisticated dynamic rules for fields to flow.

Each transformation in the integration can specify the set of rules that determines which fields flow into that particular transformation. For example, if I need all custom fields from a Salesforce source to flow into a target, I would simply include fields by name pattern “suffixed with ‘__c’” – the naming convention for custom field names in Salesforce. If I need to standardize date formats for all datetime fields in an expression, I can define a rule to include fields by datatype: datetime.
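As a rough sketch of how such rules could resolve against a source schema at run time, consider the following Python; the rule representation and schema are my own invention for illustration, not the Mapping Designer’s internals:

```python
import fnmatch

# Source schema discovered at run time: field name -> datatype. Fields added
# to the source later simply appear here and flow through the same rules.
source_fields = {
    "Id": "string", "Name": "string", "CreatedDate": "datetime",
    "Region__c": "string", "Score__c": "decimal",
}

def resolve(rules, fields):
    """Apply ordered include/exclude rules; each rule is (action, kind, value)."""
    selected = set()
    for action, kind, value in rules:
        if kind == "all":
            matched = set(fields)
        elif kind == "datatype":
            matched = {f for f, t in fields.items() if t == value}
        elif kind == "pattern":
            matched = set(fnmatch.filter(fields, value))
        else:  # kind == "name"
            matched = {value} & set(fields)
        selected = selected | matched if action == "include" else selected - matched
    return sorted(selected)

# All Salesforce custom fields (suffixed "__c"), excluding any datetime fields.
print(resolve([("include", "pattern", "*__c"),
               ("exclude", "datatype", "datetime")], source_fields))
# -> ['Region__c', 'Score__c']
```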

The dynamic nature of the link rules is what empowers a mapping created in Informatica Cloud Designer to be easily converted into a highly reusable integration template through parameterization.

For example, the entire source object can be parameterized, so the integration developer can focus on the core integration logic without worrying about individual fields. I can build an integration that loads a slowly changing dimension table in a data warehouse, and that integration can apply to any source object. When it is executed with different source objects substituted for the source parameter, the integration works as expected, since the rules dynamically bring in the fields regardless of the source object’s structure. Suddenly, an integration developer needs to build only one reusable integration template for replicating multiple objects to the data warehouse, NOT dozens or even hundreds of repeated integration mappings. Needless to say, maintenance is hugely simplified.
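A minimal sketch of that parameterization idea (hypothetical object names and schemas, with the default “all fields” rule standing in for the full rule engine):

```python
# One reusable integration template applied to many source objects: the source
# is a parameter, and the "include all fields" rule adapts to each object's
# schema at run time. Objects and fields below are illustrative only.
schemas = {
    "Account": {"Id": "string", "Name": "string"},
    "Opportunity": {"Id": "string", "Amount": "decimal", "CloseDate": "datetime"},
}

def replicate(source_object: str) -> None:
    fields = sorted(schemas[source_object])   # "all fields", resolved at run time
    print(f"REPLICATE {source_object} ({', '.join(fields)}) -> warehouse staging")

for obj in schemas:     # one template serves N objects --
    replicate(obj)      # not N hand-built mappings
```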

With the power to define field propagation logically throughout an integration, combined with the ability to parameterize just about any part of the integration logic, the Cloud Mapping Designer provides a unique and powerful platform for developing reusable end-to-end integration solutions (such as Opportunity to Order, Accounts load to Salesforce, SAP product catalog to Salesforce, or File load to Amazon Redshift). Such prebuilt end-to-end solutions, or VIPs (Vibe Integration Packages), can be easily customized by any consuming customer to adapt to their unique environment and business needs, tweaking only certain configurations while largely reusing the core integration logic.

What could be better than building integrations? Building far fewer integrations that are reusable and self-adapting.

To learn more, join the upcoming Cloud Spring release webinar on Thursday, March 13.


What Keeps Business Intelligence from Reaching its Potential?

With the growing prominence of big data as both a strategic and tactical resource for enterprises, there has been a shift in the scope of business intelligence. Not too long ago, BI lived in tools that ran on individual workstations or PCs, providing filtered reports on limited sets of data or stacking the data into analytical cubes.


Now, BI encompasses a range of data and analytics from across the enterprise, and is as likely to be online, supported in the cloud, as on a local PC. Yet, as it has been for years, BI adoption still tends to be limited, not reaching its full potential. What is holding companies back from achieving a big impact with BI? In a recent Q&A with TDWI’s Linda Briggs, BI analyst Cindi Howson discussed that question and the issues raised in her new book, Successful Business Intelligence: Unlock the Value of BI and Big Data.

The success of BI depends, more than anything, on one factor, she says: corporate culture. Some organizations have achieved an analytic culture that reaches across their various business lines, but for many, it’s a challenge. “Leadership means not just the CIO but also the CEO, the lines of business, the COO, and the VP of marketing,” says Howson. “Culture and leadership are closely related, and it’s hard to separate one from the other.”

While corporate culture has always been important to success, it takes on an even more critical role in efforts to compete on analytics. For example, she illustrates, “companies have a lot of data, and certainly they value the data, but there is sometimes a fear of sharing it. Once you start exposing the data, somebody’s job might be on the line, or it can show that someone made some bad decisions. Maybe the data will reveal that you’ve spent millions of dollars and you’re not really getting the returns that you thought you would in pursuing a particular market segment or product.”

It’s important to frame an analytics culture as using data as a tool to spot problems, make course corrections, or act on opportunities – not to punish or expose individuals or departments.

Another point of corporate resistance is employing BI in the cloud, a challenge recently explored by Brad Peters, CEO of Birst. Here again, corporate culture may hold back efforts to move to the cloud, which offers greater scalability and availability for BI and analytics initiatives. In a recent interview in Diginomica, he says that IT departments may throw up roadblocks for fear of being disintermediated. There is also a recognition that once BI data is in the cloud, it often gets “harder to work with.” Multi-tenant sites, for example, have security systems and protocols that may limit users’ ability to manipulate or parse the data.

The increasing adoption of cloud-based services – such as those from Amazon or Salesforce – is gradually melting resistance to the idea of cloud-based BI, Peters adds. He particularly sees advantages for geographically dispersed workforces.

For his part, he admits that he “has never been under any illusion that the shift of enterprise analytics to the cloud was going to happen overnight.”


Are you Ready for the Massive Wave of Data?

Leo Eweani makes the case that the data tsunami is coming.  “Businesses are scrambling to respond and spending accordingly. Demand for data analysts is up by 92%; 25% of IT budgets are spent on the data integration projects required to access the value locked up in this data “ore” – it certainly seems that enterprise is doing The Right Thing – but is it?”

Data is exploding within most enterprises.  However, most enterprises have no clue how to manage this data effectively.  While you would think that an investment in data integration would be an area of focus, many enterprises don’t have a great track record in making data integration work.  “Scratch the surface, and it emerges that 83% of IT staff expect there to be no ROI at all on data integration projects and that they are notorious for being late, over-budget and incredibly risky.”


My core message: enterprises need to up their game when it comes to data integration. This recommendation is based on the amount of data growth we have already experienced, and will experience in the near future. Indeed, a “data tsunami” is on the horizon, and most enterprises are ill-prepared for it.

So, how do you get prepared? While many would say it’s all about buying anything and everything when it comes to big data technology, the best approach is to splurge on planning. This means defining exactly what data assets are in place now and will be in place in the future, and how they should or will be leveraged.

To face the forthcoming wave of data, certain planning aspects and questions about data integration rise to the top:

  • Performance, including data latency. How quickly does the data need to flow from point (or points) A to point (or points) B? As the volume of data rises, the data integration engines have to keep up.
  • Data security and governance. How will the data be protected both at rest and in flight, and how will the data be managed in terms of controls on use and change?
  • Abstraction, removing data complexity. How will the enterprise remap and repurpose key enterprise data that may not currently exist in a well-defined, functional structure?
  • Integration with cloud-based data. How will the enterprise link existing enterprise data assets with those that exist on remote cloud platforms?

While this may seem like a complex and risky process, think through the problems, leverage the right technology, and you can remove the risk and complexity.  The enterprises that seem to fail at data integration do not follow that advice.

I suspect the explosion of data will be the biggest challenge enterprise IT has faced in many years. While a few will take advantage of their data, most will struggle, at least initially. Which route will you take?


5 Wishes for the New Year

  1. Business and IT: Stop dissing each other.  We all do it.  Despite any platitudes about business-IT alignment, there is always griping behind closed doors.  Let’s all promise to go the entire month of January without saying anything negative about the other team, and on a weekly basis express gratitude or provide positive feedback.
  2. Don’t let the hype fool you. Big Data. Internet of Things. Cloud/Social/Mobile (which has seemingly morphed into a single word). Hype? Definitely yes. Vaporware? Sometimes. Ignore it until “it’s real”? Definitely not. There are kernels of reality hidden in most of the hype. You have to find those kernels, and then let your mind open up to what the potential is in your own realm.
  3. Marry right brain with left brain.  Most of us are heavy left-brain people when we’re on the job.  And while being data-driven, analytical and methodical is important, what separates the innovators from the followers is the spark of intuition, wisdom or creativity that is grounded in facts and knowledge, but not bound by them.
  4. Use social to discuss issues and gain knowledge rather than while away time.  Social media has been extremely powerful for connecting people. But a shockingly high percentage of the social content is trivial—following celebrities; sharing selfies; updating friends on the latest meal eaten; lodging complaints about various first world problems.  What if we diverted 25% of the social media time we spend on frivolous trivia to intellectual engagement and intelligent discussions about real issues?  What could we change in our society if that power was unleashed?
  5. Use data for good. There are many uses for all the data flowing around us. Many of them are transformative – changing business models, revolutionizing industries, in some cases changing society. However, most of the ones being discussed today focus on corporate profit as opposed to societal good – think of all the investment in better targeting marketing offers to consumers.  There is absolutely nothing wrong with utilizing data to grow business, and healthy businesses provide jobs, foster innovation and drive economic growth.  However, business profit should not be our only goal. A few people and organizations (such as DataKind) are thinking about how to use data for good – meaning societal good. If a few more of us carve out a portion of our time and brain power to focus on potential ways data can be harnessed to benefit our broader community, imagine the impact we could have on education, healthcare, the environment, economic hardship and the myriad other challenges we face around the world. Perhaps this wish is the most Pollyannaish of them all, but I’ll keep doing what I can to forward the cause.

The Usefulness of Things

As a Tesla owner, I recently had the experience of calling Tesla service after a yellow warning message appeared on the center console of my car: “Check tire pressure system. Call Tesla Service.” While still on the freeway, I voice-dialed Tesla with my iPhone and was in touch with a service representative within minutes.

Me: A yellow warning message just appeared on my dash and also the center console.

Tesla rep: Yes, I see – is it the tire pressure warning?

Me: Yes – do I need to pull into a gas station?  I haven’t had to visit a gas station since I purchased the car.

Tesla rep:  Well, I also see that you are traveling on a freeway that has some steep elevation – it’s possible the higher altitude is affecting your car’s tires temporarily until the pressure equalizes.  Let me check your tire pressure monitoring sensor in a half hour.  If the sensor still detects a problem, I will call you and give further instructions.

As it turned out, the warning message disappeared after ten minutes and everything was fine for the rest of the trip. However, the episode served as a reminder that the world will be much different with the advent of the Internet of Things. Just as humans become more productive when connected with mobile phones, machines and devices become more useful when connected to the network. In this case, a connected automobile allowed the service rep to remotely access vehicle data, read the tire pressure sensor along with the vehicle’s location and elevation, and suggest a course of action. And this example is fairly basic compared to the opportunities afforded by networked devices and machines.

In addition to remote servicing, there are several other use case categories that offer great potential, including:

  • Preventative Maintenance – monitor usage data to increase overall uptime for machines/devices while decreasing the cost of upkeep. E.g., Tesla runs remote diagnostics on vehicles and can identify vehicle problems before they occur.
  • Real-time Product Enhancements – analyze product usage data and deliver improvements quickly in response. E.g., Tesla delivers software updates that improve the usability of the vehicle based on analysis of owner usage.
  • Higher Efficiency in Business Operations – analyze consolidated enterprise transaction data alongside machine data to identify opportunities for greater operational efficiency. E.g., Tesla deployed waves of new fast-charging stations (known as Superchargers) based on analyzing the travel patterns of its vehicle owners.
  • Differentiated Product/Service Offerings – deliver a new class of applications that operate on correlated data across a broad spectrum of sources. (HINT for Tesla: a trip-planning application that estimates energy consumption and recommends charging stops would be really cool…)

In each case, machine data is integrated with other data (traditional enterprise data, vehicle owner registration data, etc.) to create business value. Just as important as the connectivity of the devices and machines is the ability to integrate the data they produce. Several Informatica customers have begun investing in M2M (aka Internet of Things) infrastructure, and Informatica technology has been critical to their efforts. US Xpress, for example, uses mobile sensors across its vast fleet of trucks, and Informatica delivers the ability to consolidate, cleanse and integrate the data they collect.
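As a toy illustration of that integration step, the Python below joins hypothetical vehicle telemetry with owner records from a CRM; the VINs, names, and thresholds are all invented:

```python
# Machine data (telemetry) enriched with enterprise data (owner records).
telemetry = [
    {"vin": "5YJSA111", "tire_psi": 38.5, "elevation_ft": 6200},
    {"vin": "5YJSA222", "tire_psi": 29.0, "elevation_ft": 150},
]
owners = {"5YJSA111": "Ana Diaz", "5YJSA222": "Raj Patel"}  # from CRM/registration

LOW_PSI = 32.0
for reading in telemetry:
    # High elevation can explain a transient low reading, as in the anecdote
    # above; only low pressure at low elevation triggers a service alert.
    if reading["tire_psi"] < LOW_PSI and reading["elevation_ft"] < 3000:
        print(f"Service alert: contact {owners[reading['vin']]} "
              f"(VIN {reading['vin']}, {reading['tire_psi']} psi)")
```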

My recent episode with Tesla service was a simple yet eye-opening experience. With more and more machines and devices connecting wirelessly, and with the ability to integrate the tremendous volumes of data they generate, this example is only a small hint of more interesting things to come.


Creating Secure Test Data Subsets for Salesforce.com Sandboxes

Many Salesforce developers that use sandbox environments for test and development face a chain of challenges:

  • A lack of relevant data for proper testing and development (empty sandboxes)
  • Manually copying data from production to fix that problem
  • Exposing sensitive data to unauthorized users as a result
  • Potentially consuming more storage than is allocated for sandbox environments (resulting in unexpected costs)

To address these challenges, Informatica just released Cloud Test Data Management for Salesforce. This solution gives Salesforce admins and developers the ability to provision secure test data subsets to developers through an easy-to-use, wizard-driven approach. The application is delivered as a service through a subscription-based pricing model.
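To illustrate the underlying idea – not Informatica’s actual masking algorithms – here is a minimal Python sketch that copies a capped subset of records and deterministically masks sensitive fields on the way:

```python
import hashlib

SENSITIVE = {"Email", "Phone"}  # illustrative choice of sensitive fields

def mask(value: str) -> str:
    """Deterministic, irreversible stand-in; equal inputs mask identically,
    which preserves joins/links between masked records."""
    return "x" + hashlib.sha256(value.encode()).hexdigest()[:10]

def subset_for_sandbox(records: list[dict], limit: int = 100) -> list[dict]:
    """Copy only a capped subset (controls sandbox storage), masking on the way."""
    return [{k: mask(v) if k in SENSITIVE else v for k, v in r.items()}
            for r in records[:limit]]

prod = [{"Id": "001A", "Name": "Acme", "Email": "cfo@acme.com", "Phone": "555-0100"}]
print(subset_for_sandbox(prod))
# Id and Name pass through; Email and Phone come out masked, e.g. 'x8f3b...'
```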

The Informatica IT team, which uses Salesforce internally, validated an ROI based on three factors: reducing the developer time spent manually scripting data copies from production to a sandbox, reducing the time spent fixing defects caused by not having the right test data, and eliminating the risk of a data breach by masking sensitive data.

To learn more about this new offering, watch a demonstration that shows how to create secure test data subsets for Salesforce.  Also, available now, try the free Cloud Data Masking app or take a 30-day Cloud Test Data Management trial.
