Tag Archives: business impact
As reported by the Economic Times, “In the coming years, enormous volumes of machine-generated data from the Internet of Things (IoT) will emerge. If exploited properly, this data – often dubbed machine or sensor data, and often seen as the next evolution in Big Data – can fuel a wide range of data-driven business process improvements across numerous industries.”
We can all see this happening in our personal lives. Our thermostats are connected now, our cars have been for years, and even my toothbrush has a Bluetooth connection to my phone. On the industrial side, devices have also been connected for years, generating megabytes of data per day that have typically been used only for monitoring, with the data discarded as quickly as it appears.
So, what changed? With the advent of big data and cheap cloud and on-premises storage, we now have the ability to store the machine and sensor data streaming out of industrial machines, airliners, health diagnostic devices, and the like, and leverage that data for new and valuable uses.
For example, consider the ability to determine the likelihood that a jet engine will fail, based upon the sensor data gathered and how that data compares with known patterns of failure. Instead of getting an engine failure light on the flight deck, the pilots can see that the engine has a 20 percent likelihood of failure and get the engine serviced before it fails completely.
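The idea can be sketched in a few lines. This is an illustrative toy, not any airline's actual system: the sensor values, the failure signature, and the scoring rule are all invented for the example, which simply measures how closely a window of readings tracks a known failure pattern.

```python
# Illustrative sketch: score how closely a window of engine sensor
# readings matches a known failure signature. All values are invented.

def failure_likelihood(readings, failure_pattern):
    """Return a 0-1 score: 1.0 means the readings match the pattern exactly."""
    assert len(readings) == len(failure_pattern)
    # Mean absolute deviation between the observed window and the pattern,
    # scaled by the pattern's own range to normalize.
    span = max(failure_pattern) - min(failure_pattern) or 1.0
    deviation = sum(abs(r - p) for r, p in zip(readings, failure_pattern)) / len(readings)
    return max(0.0, 1.0 - deviation / span)

# Hypothetical pattern: exhaust-gas temperature climbing before a failure.
known_failure = [610, 640, 680, 730, 790]
current_window = [605, 632, 668, 715, 770]

score = failure_likelihood(current_window, known_failure)
if score > 0.8:
    print(f"Engine service advised: {score:.0%} match to a known failure pattern")
```

A production system would compare against many patterns and many sensors at once, but the principle is the same: turn raw readings into a likelihood the crew can act on before the failure light ever comes on.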
The problem with all of this very cool stuff is that we need to once again rethink data integration. Indeed, if the data can’t get from the machine sensors to a persistent data store for analysis, then none of this has a chance of working.
That’s why those who are moving to IoT-based systems need to do two things. First, they must create a strategy for extracting data from devices, such as industrial robots or an Audi A8. Second, they need a strategy to take all of the disparate data firing out of devices at megabytes per second and put it where it needs to go, in the right native structure (or in an unstructured data lake), so it can be leveraged in useful ways, and in real time.
The challenge is that machines and devices are not traditional IT systems. I’ve built connectors for industrial applications in my career. The fact is, you need to adapt to the way that the machines and devices produce data, and not the other way around. Data integration technology needs to adapt as well, making sure that it can deal with streaming and unstructured data, including many instances where the data needs to be processed in flight as it moves from the device to the database.
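To make "adapt to the device, not the other way around" concrete, here is a minimal sketch of in-flight processing: records from different hypothetical devices arrive in their native shapes, and the pipeline normalizes them into one common structure before they reach the data store. The device types, field names, and record shapes are all assumptions for illustration.

```python
# A minimal sketch of in-flight normalization: the pipeline, not the
# device, does the translating. Devices and field names are hypothetical.

def normalize(record):
    """Map each device's native format onto one common shape."""
    if record.get("src") == "robot":      # industrial robot: flat dict
        return {"device": record["id"], "metric": "torque", "value": record["t"]}
    if record.get("src") == "car":        # vehicle: nested payload
        return {"device": record["vin"], "metric": "fuel_lph",
                "value": record["payload"]["fuel"]}
    return None                           # unknown shape: route to a data lake as-is

stream = [
    {"src": "robot", "id": "R-7", "t": 41.2},
    {"src": "car", "vin": "VIN123", "payload": {"fuel": 9.8}},
]

cleaned = [r for r in (normalize(x) for x in stream) if r]
```

In a real deployment the normalization would run continuously against a message stream rather than a list, but the design point holds: the translation layer sits in the pipeline, so each new device type means a new adapter, not a new database schema.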
This becomes a huge opportunity for data integration providers who understand the special needs of IoT, as well as the technology that those who build IoT-based systems can leverage. However, the larger value is for those businesses that learn how to leverage IoT to provide better services to their customers by offering insights that have previously been impossible. Be it jet engine reliability, the fuel efficiency of my car, or feedback to my physician from sensors on my body, this is game-changing stuff. At the heart of its ability to succeed is the ability to move data from place to place.
I have always loved making connections: between people, between a product and its message, between partner companies and their messages. Coming from a creative agency background where I worked with our clients, created messaging, found images, and wrote copy all day, what I did not love was cold hard data. In fact, I’m embarrassed to admit I rarely thought about it. I handed off my creative work and let the client worry about the boring details. As long as they kept coming back, all was well.
Enter 2008. I decided to go in-house at an IaaS provider with a laser focus on SaaS companies. In many ways it was an easy transition, with one glaring difference – METRICS. I used every trick in my book to escape tracking and reporting. I was “too busy” with “more important” things. Needless to say, this did not go over well. But my background, along with our incompatible (or so I saw it at the time) systems, had me up nights worrying about the reports I should be doing. In truth, I was too busy to spend an extra several hours pulling information from three systems, sifting through it, and manually mashing it together in a spreadsheet to get the report I needed. And by the end I always had a huge headache and wasn’t even sure my information was correct. But none of that got me out of doing the work I hated.
Then came the first time I was able to prove a program’s worth; there was a spark of excitement – an awakening to the power of data. For the next several years, through both start-up and enterprise environments, I had a love/hate relationship with data. No company I worked for had integrated SaaS/software systems, and reporting took hours of manual work for me and my teams. The desire was there, even the occasional win – but it was laden with bitter feelings, from the pain of wasted time and uncertain results.
Everything changed last year when I joined Informatica. For the first time, my marketing automation was integrated with my CRM, which was integrated with my… you know the rest. And reporting? Even that was now easy. For a company that lives and breathes data integration, obviously this makes sense. As a person who’s never experienced this before, I had no idea what a relief it would be until I lived it.
Now imagine unlocking this ease of use not only for your employees (very important), but also for your customers (maybe even more important). I’d like to invite you to Informatica’s first SaaS ecosystem event where data-driven executives from Salesforce.com, AWS, Tableau, Marketo, AppDynamics, D&B, Adobe, NewRelic, and more will share their stories around data and the difference it’s made in their competitive differentiation.
Data Mania is a private event for SaaS leaders, March 4, in San Francisco. Right now, it’s the stealth version of Dreamforce or Oracle OpenWorld. And like any A-list after party, it’s drawing a who’s-who of SaaS and data industry insiders. It is the event to attend if you are a product management, engineering, professional services, or customer success executive at a SaaS company and want to know the data story behind some of the most successful companies in your space.
Planned sessions and panels include something for everyone.
For customer success management, we offer the chance to learn firsthand how native connectors quickly onboard new customers, improve business processes and establish connectivity with other best-of-breed applications.
Engineers and developers – and anyone involved with R&D – will hear how their peers have figured out a way to refocus their attention on developing new products and enhanced features while still providing the data integration required for mass adoption.
And, finally, for product management, we offer freedom — to consider all the potential opportunities and applications that open up when you quit worrying about how “to make the data work and scale” and instead focus on “all the ways data can make your product better” and provide your customers with greater insights and value.
Leading up to Data Mania, we’re also holding Connect-a-thon, a hackathon-like event to get you connected to hundreds of your customers’ cloud and on-premises apps. Connect-a-thon will give your dev team direct access to Informatica Cloud R&D resources – at no cost – to help them develop connectors and custom mappings to make these connections. And if your company is under $5M in annual bookings, and you choose to embed Informatica Cloud, we have a very special offer* for you (think free software and services). Then come to the show for advice on the next steps from your peers and data-driven leaders.
In the end, if you think Salesforce, Adobe, Amazon Web Services, Tableau, Qlik, Dun & Bradstreet and Informatica have something to say about connection and data — and the role they play helping to create the customer-driven enterprise – then you want to be at Data Mania to hear it.
I’m proud of the event we’ve put together and I know you won’t be disappointed. Conceiving and producing Data Mania with a small team here has been my chance to come full circle back to my love of making connections in the SaaS community, using my creative background AND working with the data and metrics I’ve learned to love. I’m counting down the days to the event on March 4th, and I hope you’ll join me. I’ll be the Data Maniac with the biggest smile.
*Offer applies to the first 25 participants
Over the past few years, we have witnessed an increasing shift in customer behavior. Pervasive internet connectivity – along with the exponential adoption of mobile devices – has enabled shoppers to research and purchase products of all kinds, anytime and anywhere, using whatever combination of touch points they find most convenient. This is not a passing fad.
Consumers expect rich data and images to make purchase choices; business users require access to analytical data in order to make mission-critical decisions. These demands for information are driving a need for improved product data availability and accuracy. And this is changing the way businesses go to market.
A staggering number of stores and manufacturers are reforming their models to respond to this challenge. The direct-to-consumer (DTC) model, while not new, is rapidly taking center stage as a way to address it. The optimal DTC model will vary depending on specific and contextual business objectives. Whatever the form, the strategic benefits of going direct are many; the main objectives include growing sales, gaining control over pricing, strengthening the brand, getting closer to consumers, and testing out new products and markets.
It is my contention that while the DTC model is gaining well-deserved attention, much remains to be done. In fact, among the many challenges that DTC poses, the processes and activities associated with sourcing product information, enriching product data to drive sales and lower returns, and managing product assortments across all channels loom large. More precisely, the challenges that need to be overcome are exemplified by these points:
- Products have several variations to support different segments, markets, and campaigns.
- Product components, ingredients, care information, environmental impact data, and other facets of importance to the customer must all be captured and maintained.
- People are visual. As a result, easy website navigation is essential. Eye-catching images that highlight your products or services (perhaps as they’re being performed or displayed as intended) are an effective way to visually communicate information to your customers and make it easier for them to evaluate options. If information and pictures are readily accessible, customers are more likely to engage.
- Ratings, reviews, and social data need to be stored within the product’s record rather than in separate systems.
- Purchasing and sales measurements (for example, in-store sales, return rates, sales velocity, online product views, and viewing and purchasing correlations) are often held across several systems. However, this information is increasingly needed for search and recommendation.
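The last two points above can be sketched in code. This is an illustrative toy, not any particular PIM product's data model: the system names, SKUs, and field names are all invented to show ratings and sales metrics being pulled from separate systems into the same record as the core product attributes.

```python
# Illustrative only: a PIM-style consolidated product record. System
# names, SKUs, and fields are hypothetical.

core = {"sku": "SHOE-42", "name": "Trail Runner", "variants": ["EU42", "EU43"]}
reviews_system = {"SHOE-42": {"avg_rating": 4.6, "review_count": 212}}
sales_system = {"SHOE-42": {"in_store_units": 380, "return_rate": 0.04}}

def consolidate(sku, core, reviews, sales):
    """Merge per-system facets into one searchable product record."""
    record = dict(core)                  # core attributes and variants
    record.update(reviews.get(sku, {}))  # ratings and social signals
    record.update(sales.get(sku, {}))    # purchasing and sales measurements
    return record

product = consolidate("SHOE-42", core, reviews_system, sales_system)
# Search and recommendation can now query one record instead of three systems.
```

The point is not the merge itself but where the result lives: once ratings and sales signals sit in the product record, search and recommendation stop needing to join across systems at query time.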
The importance of product data and its use, combined with the increased demands placed on the business by inefficient, non-scalable approaches to data management, make it imperative to consider a product information management (PIM) system to ‘power’ cross-channel retail. Once a PIM is established, its users repeatedly report higher ROI. It is likely that we’ll see PIM systems rank alongside CRM, ERP, CMS, order management, and merchandising systems as the pillars of cross-channel retailing at scale.
For all these reasons, choosing the right PIM strategy (and partner) is now a key decision. Get this decision wrong and it could become an expensive mistake.