Category Archives: Data Integration Platform

Startup Winners of the Informatica Data Mania Connect-a-Thon

Last week was Informatica’s first ever Data Mania event, held at the Contemporary Jewish Museum in San Francisco. We had an A-list lineup of speakers from leading cloud and data companies, such as Salesforce, Amazon Web Services (AWS), Tableau, Dun & Bradstreet, Marketo, AppDynamics, Birst, Adobe, and Qlik. The event and speakers covered a range of topics all related to data, including Big Data processing in the cloud, data-driven customer success, and cloud analytics.

While these companies are giants in the cloud world today and have created their own unique ecosystems, we also wanted to hear from the leaders of tomorrow. Before startups can become market leaders in their own realm, they face the challenge of ramping up a stellar roster of customers so that they can get to subsequent rounds of venture funding. What gets in their way are the numerous data integration challenges of onboarding customer data onto their software platform. When these challenges remain unaddressed, R&D resources are spent on professional services instead of building value-differentiating IP, bugs continue to mount, and technical debt increases.

Enter the Informatica Cloud Connector SDK. Built entirely in Java and able to browse through any cloud application’s API, the Cloud Connector SDK parses the metadata behind each data object and presents it in the context of what a business user should see. We had four startups build a native connector to their application in less than two weeks: BigML, Databricks, FollowAnalytics, and ThoughtSpot. Let’s take a look at each one of them.

BigML

With predictive analytics becoming a growing imperative, machine-learning algorithms that deliver more accurate predictions are becoming increasingly important. BigML provides an intuitive yet powerful machine-learning platform for actionable and consumable predictive analytics. Watch their demo on how they used Informatica Cloud’s Connector SDK to help them better predict customer churn.

Can’t play the video? Click here, http://youtu.be/lop7m9IH2aw

Databricks

Databricks was founded out of the UC Berkeley AMPLab by the creators of Apache Spark. Databricks Cloud is a hosted end-to-end data platform powered by Spark. It enables organizations to unlock the value of their data, seamlessly transitioning from data ingest through exploration and production. Watch their demo, which showcases how the Informatica Cloud connector for Databricks Cloud was used to analyze lead contact rates in Salesforce and to perform machine learning on the resulting dataset using either Scala or Python.

Can’t play the video? Click here, http://youtu.be/607ugvhzVnY
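As a rough illustration of that kind of workflow – not the demo’s actual code – here is what analyzing contact rates and fitting a toy model might look like in PySpark once the Salesforce lead data has landed in Databricks Cloud. The table path and column names (LeadSource, Contacted, EmployeeCount) are assumptions made for the sketch.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.ml.feature import StringIndexer, VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("lead-contact-rates").getOrCreate()
leads = spark.read.parquet("/mnt/ingest/salesforce_leads")  # hypothetical landing path

# Contact rate by lead source (assumes Contacted is a 0/1 flag)
(leads.groupBy("LeadSource")
      .agg(F.avg(F.col("Contacted").cast("double")).alias("contact_rate"),
           F.count("*").alias("lead_count"))
      .orderBy(F.desc("contact_rate"))
      .show())

# A toy model predicting whether a lead gets contacted
indexed = StringIndexer(inputCol="LeadSource", outputCol="source_idx").fit(leads).transform(leads)
assembled = VectorAssembler(inputCols=["source_idx", "EmployeeCount"],
                            outputCol="features").transform(indexed)
model = LogisticRegression(featuresCol="features", labelCol="Contacted").fit(assembled)
```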

FollowAnalytics

With mobile usage growing by leaps and bounds, the area of customer engagement on a mobile app has become a fertile area for marketers. Marketers are charged with acquiring new customers, increasing customer loyalty and driving new revenue streams. But without the technological infrastructure to back them up, their efforts are in vain. FollowAnalytics is a mobile analytics and marketing automation platform for the enterprise that helps companies better understand audience engagement on their mobile apps. Watch this demo where FollowAnalytics first builds a completely native connector to its mobile analytics platform using the Informatica Cloud Connector SDK and then connects it to Microsoft Dynamics CRM Online using Informatica Cloud’s prebuilt connector for it. Then, see FollowAnalytics go one step further by performing even deeper analytics on their engagement data using Informatica Cloud’s prebuilt connector for Salesforce Wave Analytics Cloud.

Can’t play the video? Click here, http://youtu.be/E568vxZ2LAg

ThoughtSpot

Analytics has taken center stage this year due to the rise in cloud applications, but most of the existing BI tools out there still stick to the old way of doing BI. ThoughtSpot brings a consumer-like simplicity to the world of BI by allowing users to search for the information they’re looking for just as if they were using a search engine like Google. Watch this demo where ThoughtSpot uses Informatica Cloud’s vast library of over 100 native connectors to move data into the ThoughtSpot appliance.

Can’t play the video? Click here, http://youtu.be/6gJD6hRD9h4


Gamers Need Great Data and Connected Platforms

Who remembers their first game of Pong? Celebrating more than 40 years of innovation, gaming is no longer limited to monochromatic screens and dedicated, proprietary platforms. The PC gaming industry is expected to exceed $35bn by 2018. The phone and handheld gaming market is estimated to reach $34bn within five years and is quickly closing the gap. According to EEDAR, 2014 saw more than 141 million mobile gamers in North America alone, generating $4.6B in revenue for mobile game vendors.

This growth has spawned a long list of conferences specifically targeting gamers, game developers, the gaming industry and, more recently, gaming analytics! This past weekend in Boston, for example, was PAX East, where people of all ages and walks of life played games on consoles, PCs, handhelds, and good old-fashioned boards. With my own children in attendance, the debate of commercial games versus indie favorites, such as Minecraft, dominates the dinner table.

Online games are where people congregate, collaborate, and generate petabytes of data daily. Add the bonus of geospatial data from smartphones, and the opportunity for even more advanced analytics grows. Some of the basic metrics that determine whether a game is successful, according to Ninja Metrics, include the following (a rough sketch of computing a few of them appears after the list):

  • New Users, Daily Active Users, Retention
  • Revenue per user
  • Session length and number of sessions per user
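For a sense of how lightweight these calculations are once the raw events exist, here is a minimal pandas sketch that computes a few of them. The file name and column names (player_id, session_start, session_end, revenue) are assumptions for illustration, not any particular vendor’s schema.

```python
import pandas as pd

# Hypothetical export of session events: one row per play session
events = pd.read_csv("sessions.csv", parse_dates=["session_start", "session_end"])
events["day"] = events["session_start"].dt.date
events["session_minutes"] = (events["session_end"] - events["session_start"]).dt.total_seconds() / 60

daily_active_users = events.groupby("day")["player_id"].nunique()           # DAU
revenue_per_user = events["revenue"].sum() / events["player_id"].nunique()  # revenue per user
sessions_per_user = events.groupby("player_id").size().mean()
avg_session_minutes = events["session_minutes"].mean()

print(daily_active_users.tail())
print(f"Revenue per user: {revenue_per_user:.2f}, "
      f"sessions per user: {sessions_per_user:.1f}, "
      f"average session: {avg_session_minutes:.1f} min")
```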

Additionally, they provide predictive analytics, customer lifetime value, and cohort analysis. If this is your gig, there’s a conference for that as well – the Gaming Analytics Summit!

At the Game Developers Conference recently held in San Francisco, it was clear that the event’s focus has shifted over the years from computer games to new gaming platforms that incorporate mobile, smartphone, and online components. Producing a successful game requires the following:

  • The ability to connect to a variety of devices and platforms
  • The use of data to drive decisions and improve the user experience
  • Assurance that privacy laws are adhered to

Developers are able to quickly access online gaming data and tweak or change their sprites’ attributes dynamically to maximize player experience.

When you look at what is happening in the gaming industry, you can start to see why colleges and universities like my own alma mater, WPI, now offer a computer science degree in Interactive Media and Game Design. The IMGD curriculum includes heavy coursework in data science, game theory, artificial intelligence and storyboarding. When I asked a WPI IMGD student what they were working on, they described mapping out decision trees that dictate which adversary to pop up based on the player’s history (sounds a lot like what we do in digital marketing…).
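Just to make the idea concrete, here is a toy, hand-rolled version of that kind of decision tree – my own illustration, not the student’s project. The player attributes and adversary names are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class PlayerHistory:
    deaths_last_level: int
    avg_accuracy: float      # 0.0 - 1.0
    prefers_melee: bool

def pick_adversary(history: PlayerHistory) -> str:
    # Root split: is the player struggling?
    if history.deaths_last_level >= 3:
        return "scout"                      # an easy enemy to rebuild confidence
    # Otherwise branch on play style
    if history.prefers_melee:
        return "archer" if history.avg_accuracy < 0.5 else "duelist"
    return "shield_bearer" if history.avg_accuracy >= 0.7 else "grunt"

print(pick_adversary(PlayerHistory(deaths_last_level=1, avg_accuracy=0.8, prefers_melee=False)))
```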

As we start to look at the Millennial Generation entering into the workforce, maybe we should look at our own recruiting efforts and consider game designers. They are masters in analytics and creativity with an appreciation for the importance of great data. Combining the magic and the math makes a great gaming experience. Who wouldn’t want that for their customers?


Informatica joins new ServiceMax Marketplace – offers rapid, cost-effective integration with ERP and Cloud apps for Field Service Automation

To deliver flawless field service, companies often require integration across multiple applications for various work processes. A good example is automatically ordering and shipping parts through an ERP system so they arrive ahead of a timely field service visit. Informatica has partnered with ServiceMax, the leading field service automation solution, and has joined the new ServiceMax Marketplace to offer customers integration solutions for many ERP and Cloud applications frequently involved in ServiceMax deployments. Consisting of Cloud Integration Templates built on Informatica Cloud for frequent customer integration “patterns,” these solutions will speed up the ServiceMax implementation cycle, contain its cost, and help customers realize the full potential of their field service initiatives.

Existing members of the ServiceMax Community can see a demo or take advantage of a free 30-day trial that provides the full capabilities of Informatica Cloud Integration for ServiceMax, with prebuilt connectors to hundreds of third-party systems including SAP, Oracle, Salesforce, NetSuite and Workday, powered by the Informatica Vibe virtual data machine for near-universal access to cloud and on-premise data. The Informatica Cloud Integration for ServiceMax solution:

  • Accelerates ERP integration by as much as 85% through prebuilt Cloud templates focused on key work processes and the objects common between systems
  • Synchronizes key master data such as Customer Master, Material Master, Sales Orders, Plant information, Stock history and others
  • Enables simplified implementation and customization through easy-to-use user interfaces
  • Eliminates the need for IT intervention during configuration and deployment of ServiceMax integrations

We look forward to working with ServiceMax through the ServiceMax Marketplace to help joint customers deliver Flawless Service!


Stop Trying to Manage Data Growth!(?)

Talking to architects about analytics at a recent event, I kept hearing the familiar theme: data scientists are spending 80% of their time on “data wrangling,” leaving only 20% for delivering the business insights that will drive the company’s innovation. It was clear to everybody I spoke to that the situation will only worsen. The growth everybody sees coming in data volume and complexity will only lengthen the time to value.

Gartner recently predicted that:

“by 2015, 50% of organizations will give up on managing growth and will redirect funds to improve classification and analytics.”

Some of the details of this study are interesting.  In the end, many organizations are coming to two conclusions:

  • It’s risky to delete data, so they keep it around as insurance.
  • All data has potential business value, so more organizations are keeping it around for potential analytical purposes.

The other mega-trend here is that more and more organizations are looking to compete on analytics – and they need data to do it, both internal data and external data.

From an architect’s perspective, here are several observations:

  • The floodgates are open and analytics is a top priority. Given that, the emphasis should be on architecting to manage the dramatic increases in both data quantity and data complexity rather than on trying to stop it.
  • The immediate architectural priority has to be on simplifying and streamlining your current enterprise data architecture. Break down those data silos and standardize your enterprise data management tools and processes as much as possible.  As discussed in other blogs, data integration is becoming the biggest bottleneck to business value delivery in your environment. Gartner has projected that “by 2018, more than half the cost of implementing new large systems will be spent on integration.”  The more standardized your enterprise data management architecture is, the more efficient it will be.
  • With each new data type, new data tool (Hive, Pig, etc.), and new data storage technology (Hadoop, NoSQL, etc.) ask first if your existing enterprise data management tools can handle the task before people go out and create a new “data silo” based on the cool, new technologies. Sometimes it will be necessary, but not always.
  • The focus needs to be on speeding value delivery for the business. And the key bottleneck is highly likely to be your enterprise data architecture.

Rather than focusing on managing data growth, the priority should be on managing it in the most standardized and efficient way possible. It is time to think about enterprise data management as a function with standard processes, skills and tools (just like Finance, Marketing or Procurement).

Several of our leading customers have built or are building a central “Data as a Service” platform within their organizations.  This is a single, central place where all developers and analysts can go to get trustworthy data that is managed by IT through a standard architecture and served up for use by all.

For more information, see “The Big Big Data Workbook.”

*Gartner Predicts 2015: Managing ‘Data Lakes’ of Unprecedented Enormity, December 2014  http://www.gartner.com/document/2934417#


Big Data Is Neither – Part II

You Say Big Dayta, I Say Big Dahta

Some say Big Data is a great challenge while others say Big Data creates new opportunities. Where do you stand? For most companies concerned with their Big Data challenges, it shouldn’t be so difficult – at least on paper. Computing costs (both hardware and software) have vastly shrunk. Databases and storage techniques have become more sophisticated and scale massively, and companies such as Informatica have made connecting and integrating all the “big” and disparate data sources much easier, helping companies achieve a sort of “big data synchronicity.”

In the process of creating solutions to Big Data problems, humans (and the supra-species known as IT Sapiens) have a tendency to use theories based on linear thinking and the scientific method. There is data as our systems know it and data as our systems don’t. The reality, in my opinion, is that “Really Big Data” problems, now and in the future, will have complex correlations and unintuitive relationships that will require mathematical disciplines, data models and algorithms that haven’t even been discovered or invented yet – and that, when eventually discovered, will make current database science look positively primordial.

At some point in the future, machines will be able to predict, based on big and perhaps unknown data types, when someone is having a bad day or a good day, or more importantly whether a person may behave in a good or bad way. Many people do this now when they take a glance at someone across a room and infer how that person is feeling or what they will do next. They see eyes that are shiny or dull, crinkles around eyes or the sides of mouths, hear the “tone” in a voice, and then their neurons put it all together: this is a person who is having a bad day and needs a hug. Quickly. No one knows exactly how the human brain does this, but it does what it does, we go with it, and we are usually right.

And someday, Big Data will be able to derive this; it will be an evolution point, and it will also be a big business opportunity. Through bigger and better data ingestion and integration techniques and more sophisticated math and data models, a machine will do this fast and, relatively speaking, cheaply. The vast majority won’t understand why or how it’s done, but it will work and it will be fairly accurate.

And my question to you all is this.

Do you see any alternate scenarios regarding the future of big data? Is contextual computing an important evolution, and will big data integration be more or less of a problem in the future?

PS. Oh yeah, one last thing to chew on concerning Big Data… If Big Data becomes big enough, does that spell the end of modelling as we know it?


Federal Migration to Cloud Computing, Hindered by Data Issues

As reviewed by Loraine Lawson, a MeriTalk survey about cloud adoption found that “in the latest survey of 150 federal executives, nearly one in five say one-quarter of their IT services are fully or partially delivered via the cloud.”

For the most part, the shifts are more tactical in nature.  These federal managers are shifting email (50 percent), web hosting (45 percent) and servers/storage (43 percent).  Most interesting is that they’re not moving traditional business applications, custom business apps, or middleware. Why? Data, and data integration issues.

“Federal agencies are worried about what happens to data in the cloud, assuming they can get it there in the first place:

  • 58 percent of executives fret about cloud-to-legacy system integration as a barrier.
  • 57 percent are worried about migration challenges, suggesting they’re not sure the data can be moved at all.
  • 54 percent are concerned about data portability once the data is in the cloud.
  • 53 percent are worried about ‘contract lock-in.’ ”

The reality is that the government does not get much out of the movement to cloud without committing core business applications and thus core data. While e-mail, Web hosting, and some storage are a good start, the real cloud computing money is made when moving away from expensive hardware and software. Failing to do that, you fail to find the value and, in this case, spend more taxpayer dollars than you should.

Data issues are not just a concern in the government. Most larger enterprises have the same issues as well. However, a few are able to get around them with good planning approaches and the right data management and data integration technology. It’s just a matter of making the initial leap, which most Federal IT executives are unwilling to do.

In working with CIOs of Federal agencies in the last few years, the larger issue is that of funding. While everyone understands that moving to cloud-based systems will save money, getting there means hiring government integrators and living with redundant systems for a time. That involves some major money. If most of the existing budget goes to existing IT operations, then the move may not be practical. Thus, funds should be made available to work on the cloud projects with the greatest potential to reduce spending and increase efficiencies.

The shame of this situation is that the government was pretty much on the leading edge with cloud computing back in 2008 and 2009. The CIO of the US Government, Vivek Kundra, promoted the use of cloud computing, and NIST drove the initial definitions of “The Cloud,” including IaaS, SaaS, and PaaS. But when it came down to making the leap, most agencies balked at the opportunity, citing issues with data.

Now that the technology has evolved even more, there is really no excuse for the government to delay migration to cloud-based platforms. The clouds are ready, and the data integration tools have cloud integration capabilities baked in. It’s time to see some more progress.


The New Marketing Technology Landscape Is Here… And It’s Music to Our Ears!

How Do You Like It? How Do You Like It? More, More, More!
Chiefmartec came out with their 2015 Marketing Technology Landscape, and if there’s one word that comes to mind, it’s MORE. 1,876 corporate logos dot the page, up from 947 in 2014. That’s definitely more, more, more – just about double, to be exact. I’m honestly not sure it’s possible to squeeze any more into a single image.

But it’s strangely fitting, because this is the reality that we marketers live in. There is a seemingly infinite number of new technologies, approaches, social media platforms, operations tools, and vendors that we have to figure out. New, critical categories of technology roll out constantly. New vendors enter and exit the landscape. As Chiefmartec says, “at least on the immediate horizon, I don’t think we’re going to see a dramatic shrinking of this landscape. The landscape will change, for sure. What qualifies as “marketing” and “technology” under the umbrella of marketing technology will undoubtedly morph. But if mere quantity is the metric we’re measuring, I think it’s going to be a world of 1,000+ marketing technology companies — perhaps even a world of 2,000+ of them — for some time to come.”

Middleware: I’m Coming Up So You’d Better Get This Party Started!
One thing you’ll notice if you look carefully between last year’s and this year’s version is the arrival of the middleware layer. Chiefmartec spends quite a bit of time talking about middleware, pointing out that great tools in the category are making the marketing technology landscape easier to manage – particularly those that handle a hybrid of on-premise and cloud.

Marketers have long cared about the things on the top – the red “Marketing Experiences” and the orange “Marketing Operations.” They’ve also put a lot of focus on the dark gray/black/blue “Backbone Platforms” layer, such as marketing automation and e-commerce. But only recently has that yellow middleware layer become front and center for marketers. Data integration, data management platforms, connectivity, data quality, and APIs are definitely not new to the technology landscape, and have been a critical domain of IT for decades. But as marketers become more skilled in, and reliant on, analytics and focused customer experience management, data is entering the forefront.

Marketers cannot focus exclusively on their Salesforce CRM, their Marketo automation, or their Adobe Experience Manager web management. Data Ready marketers realize that each of these applications can no longer be run in a silo; they need to be looked at collectively as a powerful set of tools designed to engage the customer and move them through the buying cycle – as critical pieces of the same puzzle. And to do that, marketers need to connect their data sources, power them with great data, analyze and measure their results, and then decide what to do.

If you squint, you can see Informatica in the yellow Middleware layer. (I could argue that it belongs in several of these yellow boxes, not just Cloud integration, but I’ll save that for another blog!) Some might say that’s not very exciting, but I would argue that Informatica is in a tremendous place to help marketers succeed with great data. And it all comes down to two words… complexity and change.

Why You Have to Go and Make Things So Complicated?
Ok, admittedly terrible grammar, but you get the picture. Marketers live in a tremendously complex world. Sure, you don’t have all 1,876 of the logos on the Technology Landscape in house. You probably don’t even have one from each of the 43 categories. But you definitely have a lot of different technology solutions that you rely upon on a day-to-day basis. According to a September article by ChiefMarTech, most marketers already regularly rely on more than 100 software programs.

Data ready marketers realize that their environments are complicated, and that they need a foundation. They need a platform of great data that all of their various applications and tools can leverage, and that can actually connect all of those applications and tools together. They need to be able to connect to just about anything from just about anything. They need a complete view of all of their interactions with their customers. In short, they need to make their extremely complicated world more simple, streamlined, and complete.

Ch-Ch-Ch-Ch-Changes. Turn and Face the Strange!
I have a tendency to misunderstand lyrics, so I have to confess that until I looked up this song today, I thought the lyric was “time to face the pain” (Bowie fans, I hang my head in shame!). But quite honestly, “turn and face the strange” illustrates my point just as well!

There is no question that marketing has changed dramatically in the past few years.  Your most critical marketing tools and processes two years ago are almost certainly different than those this year, and will almost certainly be different from what you see two years from now.  Marketers realize this.  The Marketing Technology Landscape illustrates this every year!

The data ready marketer understands that their toolbox will change, but that their data will be the foundation for whatever new piece of the technology puzzle they embrace or get rid of.  Building a foundation of great data will power any technology solution or new approach.

Data ready marketers also work with their IT counterparts to engineer for change.  They make sure that no matter what technology or data source they want to add – no matter how strange or unthinkable it is today – they never have to start from scratch.  They can connect to what they want, when they want, leveraging great data, and ultimately making great decisions.

Get Ready ‘Cause Here I Come. The Era of the Data Ready Marketer is Here
Now that you have a few catchy tunes stuck in your head, it’s time to ask yourself: are you data ready? Are you ready to embrace the complexity of the marketing technology landscape? Are you ready to think about change as a competitive weapon?

I encourage you to take our survey about data ready marketing. The results are coming out soon, so don’t miss your chance to be a part of it. You can find the link here.

Also, follow me on twitter – The Data Ready Marketer (@StephanieABest) for some of the latest & greatest news and insights on the world of data ready marketing.

And stay tuned because we have several new Data Ready Marketing pieces coming out soon – InfoGraphics, eBooks, SlideShares, and more!


Ready for the Internet of Things?

Data has always played a key role in informing decisions – machine generated and intuitive. In the past, much of this data came from transactional databases as well as unstructured sources, such as emails and flat files. Mobile devices appeared next on the map. We have found uses for such devices not just to make calls but also to send messages, take pictures, and update statuses on social media sites. As a result, new sets of data got created from user engagements and interactions. Such data started to tell a story by connecting dots at different location points and stages of user connection. “Internet of Things,” or IoT, is the latest technology to enter the scene that could transform how we view and use data on a massive scale.

Another buzzword? 

Does IoT present a significant opportunity for companies to transform their business processes? The Internet of Things probably adds an important awareness veneer when it comes to data. It could bring data into focus early by connecting every stage of data creation in any business process. It could remove the lag between consuming data and making decisions based on it. Data generated at every stage in a business process could show an interesting trend or pattern and, better yet, tell a connected story. The result could be predictive maintenance of the equipment involved in any process, further reducing cost. New product innovations would happen by leveraging the connectedness in data as generated by each step in a business process. We would soon begin to understand not only where the data is being used and how, but also the intent and context behind that usage. Organizations could then connect with their customers in a one-on-one fashion like never before, whether to promote a product or offer a promotion that could be both time and place sensitive. New opportunities to tailor product and service offerings for customers on an individual basis would create new growth areas for businesses. The Internet of Things could make this possible by bringing together previously isolated sets of data.

Proof-points

A recent Economist report, “The Virtuous Circle of Data: Engaging Employees in Data and Transforming Your Business,” suggests that 68% of data-driven businesses outperform their competitors when it comes to profitability, and that 78% of those businesses foster a better culture of creativity and innovation. The report goes on to suggest that three areas are critical for an organization to build a data-driven business, including data supported by devices: 1) Technology & Tools, 2) Talent & Expertise, and 3) Culture & Leadership. By 2020, it’s projected that there will be 50B connected devices, 7x more than human beings on the planet. It is imperative for an organization to have a support structure in place for device-generated data and a strategy to connect it with broader enterprise-wide data initiatives.

A comprehensive Internet of Things strategy would leverage the speed and context of data to the advantage of business process owners. Timely access to device-generated data can open up channels of communication to end-customers in a personalized way, at the moment of their readiness. It’s not enough anymore to know what customers may want or what they asked for in the past; rather, businesses must anticipate what customers might want by connecting dots across different stages. IoT-generated data can help bridge this gap.

How to Manage IoT Generated Data

More data places more pressure on both quality and security factors – key building blocks for trust in one’s data. Trust is, ideally, truth over time. Consistency in data quality and availability is going to be a key requirement for all organizations that want to introduce new products or differentiated services quickly. Informatica’s Intelligent Data Platform, or IDP, brings together the industry’s most comprehensive data management capabilities to help organizations manage all data, including device-generated data, both in the cloud and on premise. Informatica’s IDP enables automated sensitive data discovery, such that data discovers users in the context where it’s needed.
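As a minimal sketch of the kind of quality gate device-generated data needs before it feeds analytics – an illustration of the general idea, not IDP functionality – the snippet below applies required-field checks, range checks, and deduplication to a batch of readings. The field names and thresholds are assumptions.

```python
from datetime import datetime

REQUIRED_FIELDS = {"device_id", "timestamp", "temperature_c"}

def validate(reading: dict) -> bool:
    # Reject readings missing core fields, outside the sensor's plausible range,
    # or carrying an unparseable timestamp
    if not REQUIRED_FIELDS.issubset(reading):
        return False
    if not -40.0 <= reading["temperature_c"] <= 125.0:
        return False
    try:
        datetime.fromisoformat(reading["timestamp"])
    except ValueError:
        return False
    return True

def cleanse(readings):
    seen, clean = set(), []
    for reading in readings:
        key = (reading.get("device_id"), reading.get("timestamp"))
        if validate(reading) and key not in seen:   # drop invalid and duplicate readings
            seen.add(key)
            clean.append(reading)
    return clean

sample = [
    {"device_id": "d1", "timestamp": "2015-03-01T10:00:00", "temperature_c": 21.5},
    {"device_id": "d1", "timestamp": "2015-03-01T10:00:00", "temperature_c": 21.5},  # duplicate
    {"device_id": "d2", "timestamp": "2015-03-01T10:00:01", "temperature_c": 999.0}, # out of range
]
print(cleanse(sample))
```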

Cool IoT Applications

There are a number of companies around the world working on interesting applications of Internet of Things technology. Smappee from Belgium has launched an energy monitor that can itemize electricity usage and control a household full of devices by clamping a sensor around the main power cable. This single device can recognize the individual signatures produced by each of the household devices and can let consumers switch off any device, such as an oven, remotely via smartphone. JIBO is an IoT device touted as the world’s first family robot; it automatically uploads data on all of its interactions to the cloud. Start-ups such as Roost and Range OI can retrofit older devices with Internet of Things capabilities. One of the most useful IoT applications can be found in Jins Meme glasses and sunglasses from Japan. They embed wearable sensors, shaped much like Bluetooth headsets, that detect drowsiness in the wearer, observing eye movement and blinking frequency to identify tiredness or bad posture and communicating via iOS and Android smartphone apps. Finally, Mellow is a new kind of kitchen robot that makes cooking easier by preparing ingredients to perfection while someone is away from home. Mellow is a sous-vide machine that takes orders through your smartphone and keeps food cold until it’s the exact time to start cooking.

Closing Comments

Each of the applications mentioned above deals with volumes of data, both in real time and in stored fashion. Such data needs to be properly validated, cleansed, and made available at the moment of user engagement. In addition to Informatica’s Intelligent Data Platform, Informatica’s newly introduced Rev product can truly connect data coming from all sources, including IoT devices, and make it available to everyone. What opportunity does IoT present to your organization? Where are the biggest opportunities to disrupt the status quo?


Magical Data from the Internet of Things? Think again…

I recently read an opinion piece written in an insurance publication online. The author postulated, among other things, that the Internet of Things would magically deliver great data to an insurer. Yes, it was a statement just that glib. Almost as if there is some fantastic device that you just plug into the wall and out streams a flow of unicorns and rainbows. And furthermore that those unicorns and rainbows will subsequently give a magical boost to your business. But hey, you plugged in that fantastic device, so bring on the magic.

Now, let’s come back from the land of fairytales and ground ourselves in reality. Data is important, no doubt about that. Today, financial services firms are able to access data from so many new data sources. One of those new and fancy data sources is the myriad of devices in this thing we call the Internet of Things.

You ever have one of those frustrating days with your smart phone? Dropped calls, slow Internet, Facebook won’t locate you? Well, other devices experience the same wonkiness. Even the most robust of devices found on commercial aircraft or military equipment are not lossless in data transmission. And that’s where we are with the Internet of Things. All great devices, they serve a number of purposes, but are still fallible in communicating with the “mother ship”.

A telematics device in a consumer vehicle can transmit VIN, speed, latitude/longitude, time, and other vehicle statuses for use in auto insurance. As with other devices on a network, some of these data elements will not come through reliably. That means that in order to reconstruct or smooth the dataset, interpolations need to be made and/or entire entries deleted as useless. That is the first issue. Second, simply receiving this isolated dataset does not make sense of it. The data needs to be moved, cleansed and then correlated to other pieces of the puzzle, which eventually turn into a policyholder, an account holder, a client or a risk. And finally, that enhanced data can be used for further analytics. It can be archived, aggregated, warehoused and secured for additional analysis. None of these activities happen magically. And the sheer volume of integration points and data requires a robust and standardized data management infrastructure.
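A hedged illustration of that smoothing and correlation step, using pandas: interpolate short dropouts, delete entries that remain unusable, and join to the policyholder via VIN. The file names, column names, and thresholds are assumptions for the sketch, not any particular product’s behavior.

```python
import pandas as pd

trips = pd.read_csv("telematics_feed.csv", parse_dates=["timestamp"])
trips = trips.sort_values("timestamp").set_index("timestamp")

# Interpolate short dropouts in speed and position (time-weighted, at most 3 gaps in a row)
for col in ["speed_mph", "latitude", "longitude"]:
    trips[col] = trips[col].interpolate(method="time", limit=3)

# Entries still missing core fields after interpolation are useless; delete them
trips = trips.dropna(subset=["vin", "speed_mph", "latitude", "longitude"])

# Correlate readings to a policyholder via a hypothetical VIN-to-policy lookup
policies = pd.read_csv("policy_vehicles.csv")            # columns: vin, policy_id
enriched = trips.reset_index().merge(policies, on="vin", how="left")
print(enriched.head())
```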

So no, just having an open channel to the stream of noise from your local Internet of Things will not magically deliver you great data. Great data comes from market leading data management solutions from Informatica. So whether you are an insurance company, financial services firm or data provider, being “Insurance Ready” means having great data; ready to use; everywhere…from Informatica.


The Synergies of SaaS and Data Integration

Loraine Lawson provides some great insights into the world of data integration, and this article is no exception. The topic struck me because we’ve wrestled with SaaS integration issues for about 10 years now.

Back in 2004, we saw the rapid growth of SaaS providers such as Salesforce.com.  However, there was typically no consistent data integration strategy to go along with the use of SaaS.  In many instances, SaaS-delivered applications became the new data silos in the enterprise, silos that lacked a sound integration plan and integration technology.

Ten years later, we’ve gotten to the point where we can solve business problems using SaaS and can also solve the data integration problems that come with it. However, we typically lack the knowledge and understanding of how to effectively use data integration technology within an enterprise to integrate SaaS problem domains.

Lawson looks at both sides of the SaaS integration argument.  “Surveys certainly show that integration is less of a concern for SaaS than in the early days, when nearly 88 percent of SaaS companies said integration concerns would slow down adoption and more than 88 percent said it’s an important or extremely important factor in winning new customers.”

Again, while we’ve certainly gotten better at integration, we’re nowhere near being out of the woods.  “A Dimensional Research survey of 350 IT executives showed that 67 percent cited data integration problems as a challenge with SaaS business applications. And as with traditional systems, integration can add hidden costs to your project if you ignore it.”

As I’ve stated many times in this blog, integration requires a bit of planning and the use of solid technology.  While this does require some extra effort and money, the return on the value of this work is huge.

SaaS integration requires that you take a bit of a different approach than traditional enterprise integration.  SaaS systems typically place your data behind well-defined APIs that can be accessed directly or through a data integration technology.  While the information can be consumed by anything that can invoke an API, enterprises still have to deal with structure and content differences, and that’s typically best handled using the right data integration technology.
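As a simple, hypothetical sketch of that point – the endpoint, field names, and token below are placeholders, not a real SaaS API – pulling records through a well-defined API and normalizing structural differences might look like this before the data is handed to downstream systems.

```python
import requests

def fetch_contacts(base_url: str, token: str) -> list:
    response = requests.get(
        f"{base_url}/api/v1/contacts",                     # placeholder endpoint
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["records"]

def normalize(record: dict) -> dict:
    # Map the SaaS app's field names onto the canonical schema used elsewhere
    return {
        "customer_id": record.get("Id"),
        "email": (record.get("EmailAddress") or "").strip().lower(),
        "country": record.get("MailingCountry") or "UNKNOWN",
    }

contacts = [normalize(r) for r in fetch_contacts("https://example-saas.invalid", "TOKEN")]
```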

Other things to consider, again often overlooked, are the need for both data governance and data security around your SaaS integration solution. There should be a centralized control mechanism to support the proper management and security of the data, as well as a mechanism to deal with the data quality issues that often emerge when consuming data from any cloud computing service.

The reality is that SaaS is here to stay. Even enterprise software players that put off the move to SaaS-delivered systems are now standing up SaaS offerings. The economics around the use of SaaS are just way too compelling. However, as SaaS-delivered systems become more commonplace, so will the emergence of new silos. This will not be an issue if you leverage the right SaaS integration approach and technology. What will your approach be?
