Tag Archives: Analytics

Data Ambition: What Does It Take To Become an Analytics Leader?


Which comes first: innovation or analytics?

Bain & Company released some survey findings a few months back that actually put a value on big data.  Companies with advanced analytic capabilities, the consultancy finds, are twice as likely to be in the top quartile of financial performance within their industries; five times as likely to make decisions much faster than market peers; three times as likely to execute decisions as intended; and twice as likely to use data very frequently when making decisions.

This is all good stuff, and the survey, which drew on input from 400 executives, draws a direct correlation between big data analytics efforts and the business's bottom line. However, it raises a question: How does an organization become one of these analytic leaders? And there's a more brain-twisting question as well: would the type of organization supporting an advanced analytics culture be more likely to be ahead of its competitors because its management tends to be more forward-thinking on a lot of fronts, and not just big data?

You just can't throw a big data or analytics program or solution set on top of the organization (or drop in a data scientist) and expect to be dazzled with sudden clarity and insight. If an organization is dysfunctional, with a lot of silos, fiefdoms, or calcified and uninspired management, all the big data in the world isn't going to lift its intelligence quotient.

The authors of the Bain & Company study, Travis Pearson and Rasmus Wegener, point out that "big data isn't just one more technology initiative" – "in fact, it isn't a technology initiative at all; it's a business program that requires technical savvy."

Succeeding with big data analytics requires a change in the organization’s culture, and the way it approaches problems and opportunities. The enterprise needs to be open to innovation and change. And, as Pearson and Wegener point out, “you need to embed big data deeply into your organization. It’s the only way to ensure that information and insights are shared across business units and functions. This also guarantees the entire company recognizes the synergies and scale benefits that a well-conceived analytics capability can provide.”

Pearson and Wegener also point to the following common characteristics of big data leaders they have studied:

Pick the “right angle of entry”: There are many areas of the business that can benefit from big data analytics, but just a few key areas that will really impact the business. It’s important to focus big data efforts on the right things. Pearson and Wegener say there are four areas where analytics can be relevant: “improving existing products and services, improving internal processes, building new product or service offerings, and transforming business models.”

Communicate big data ambition: Make it clear that big data analytics is a strategy that has the full commitment of management, and it’s a key part of the organization’s strategy. Messages that need to be communicated: “We will embrace big data as a new way of doing business. We will incorporate advanced analytics and insights as key elements of all critical decisions.” And, the co-authors add, “the senior team must also answer the question: To what end? How is big data going to improve our performance as a business? What will the company focus on?”

Sell and evangelize: Selling big data is a long-term process, not just one or two announcements at staff meetings. “Organizations don’t change easily and the value of analytics may not be apparent to everyone, so senior leaders may have to make the case for big data in one venue after another,” the authors caution. Big data leaders, they observe, have learned to take advantage of the tools at their disposal: they “define clear owners and sponsors for analytics initiatives. They provide incentives for analytics-driven behavior, thereby ensuring that data is incorporated into processes for making key decisions. They create targets for operational or financial improvements. They work hard to trace the causal impact of big data on the achievement of these targets.”

Find an organizational "home" for big data analysis: A common trend seen among big data leaders is that they have created an organizational home for their advanced analytics capability, "often a Center of Excellence overseen by a chief analytics officer," according to Pearson and Wegener. This is where matters such as strategy, collection and ownership of data across business functions come into play. Organizations also need to plan how to generate insights, and prioritize opportunities and the allocation of data analysts' and scientists' time.

There is a hope and perception that adopting data analytics will open up new paths to innovation. But it often takes an innovative spirit to open up analytics.


The Need for Specialized SaaS Analytics


SaaS companies are growing rapidly and becoming the top priority for most CIOs. With such high growth expectations, many SaaS vendors are investing in sales and marketing to acquire new customers, even if it means a negative net profit margin as a result. Moreover, with the pressure to grow rapidly, there is increased urgency to raise the Average Sales Price (ASP) of every transaction in order to meet revenue targets.

The nature of the cloud allows these SaaS companies to release new features every few months, which sales reps can then promote to new customers. When new functionality is neither used nor understood, customers often feel that they have overpaid for a SaaS product. In such cases, customers usually downgrade to a lower-priced edition or, worse, leave the vendor entirely. To make up for this loss, the sales representatives must work harder to acquire new leads, which results in less attention for existing customers. Preventing customer churn is very important: the Cost to Acquire a Customer (CAC) for upsells is 19% of the CAC to acquire new customer dollars, while the CAC to renew existing customers is only 15%.

Accurate customer usage data helps determine which features customers use and which are underutilized. Gathering this data can help pinpoint high-value features that are not used, especially for customers that have recently upgraded to a higher edition. The process of collecting this data involves several touch points – from recording clicks within the app to analyzing the open rate of entire modules. This is where embedded cloud integration comes into play.

Embedding integration within a SaaS application allows vendors to gain operational insight into each aspect of how their app is being used. With this data, vendors are able to give product management feedback on further improvements. Additionally, embedded integration can alert the customer success management team to potential churn, allowing them to implement preventative measures.
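To make this concrete, here is a minimal sketch of the kind of usage analysis described above – counting feature-level clicks and flagging customers who pay for a higher edition but never touch its premium features, a classic early churn signal. It is written in plain Python, and the event fields, feature names and thresholds are all hypothetical:

    from collections import Counter, defaultdict

    # Hypothetical in-app click events: (customer_id, feature, edition)
    events = [
        ("acme", "basic_reports", "enterprise"),
        ("acme", "basic_reports", "enterprise"),
        ("globex", "workflow_builder", "enterprise"),
        ("globex", "predictive_scoring", "enterprise"),
    ]

    # Features that only the higher-priced edition unlocks (assumption)
    PREMIUM_FEATURES = {"workflow_builder", "predictive_scoring"}

    usage = defaultdict(Counter)
    for customer, feature, edition in events:
        usage[customer][feature] += 1

    def churn_risk(customer):
        """Flag customers whose premium-feature usage is zero."""
        return sum(usage[customer][f] for f in PREMIUM_FEATURES) == 0

    for customer in usage:
        if churn_risk(customer):
            print(f"Alert customer success: {customer} never uses premium features")

Here "acme" would be flagged: it pays for the enterprise edition but only ever clicks a basic feature.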

To learn more about how a specialized analytics environment can be set up for SaaS apps, join Informatica and Gainsight on April 9th at 10am PDT for an informational webinar, Powering Customer Analytics with Embedded Cloud Integration.


Fire your Data Scientists – They Don’t Add Value

Years ago, I was on a project to improve production and product quality through data analysis. During the project, I heard one man say:

“If I had my way, I’d fire the statisticians – all of them – they don’t add value”. 

Surely not? Why would you fire the very people who were employed to make sense of the vast volumes of manufacturing data and guide future production? But he was right. The problem was that, at the time, data management was so poor that data was simply not available for the statisticians to analyze.

So, perhaps this title should be re-written to be: 

Fire your Data Scientists – They Aren’t Able to Add Value.

Although this statement is a bit extreme, the same situation may still exist. Data scientists frequently share frustrations such as:

  • “I’m told our data is 60% accurate, which means I can’t trust any of it.”
  • “We achieved our goal of an answer within a week by working 24 hours a day.”
  • “Each quarter we manually prepare 300 slides to anticipate all questions the CFO may ask.”
  • “Fred manually audits 10% of the invoices.  When he is on holiday, we just don’t do the audit.”

This is why I think the original quote is so insightful.  Value from data is not automatically delivered by hiring a statistician, analyst or data scientist. Even with the latest data mining technology, one person cannot positively influence a business without the proper data to support them.

Most organizations are unfamiliar with the structure required to deliver value from their data. New storage technologies will be introduced and a variety of analytics tools will be tried and tested. This change is crucial to success. In order for statisticians to add value to a company, they must have access to high-quality data that is easily sourced and integrated. That data must be available through the latest analytics technology. This new ecosystem should provide insights that can shape future production. Staff will need to be trained as this new data is incorporated into daily decision making.
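As an illustration of what "easily sourced, high-quality data" implies in practice, here is a minimal sketch of a completeness gate that could run before any analysis begins. The field names and the 95% threshold are invented for the example:

    # Minimal data-quality gate: refuse to analyze data that is too incomplete.
    REQUIRED_FIELDS = ["batch_id", "line", "defect_rate", "timestamp"]  # assumed schema
    COMPLETENESS_THRESHOLD = 0.95  # assumption: 95% of records fully populated

    records = [
        {"batch_id": "B001", "line": 3, "defect_rate": 0.02, "timestamp": "2014-01-07"},
        {"batch_id": "B002", "line": None, "defect_rate": 0.04, "timestamp": "2014-01-07"},
    ]

    def completeness(rows):
        """Fraction of rows in which every required field is populated."""
        if not rows:
            return 0.0
        complete = sum(all(r.get(f) is not None for f in REQUIRED_FIELDS) for r in rows)
        return complete / len(rows)

    score = completeness(records)
    if score < COMPLETENESS_THRESHOLD:
        raise ValueError(f"Only {score:.0%} of records are complete; fix the sourcing first")

A statistician given data that passes such a gate can spend time analyzing instead of recreating data sets.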

With a rich 20-year history, Informatica understands data ecosystems. Employees become wasted investments when they do not have access to the trusted data they need in order to deliver their true value.

Who wants to spend their time recreating data sets to find a nugget of value only to discover it can’t be implemented?

Build an analytical ecosystem with a balanced focus on all aspects of data management. This will mean that value delivery is limited only by the imagination of your employees. Rather than questioning the value of an analytics team, you will attract some of the best and the brightest. Then, you will finally be able to deliver on the promised value of your data.


Analytics 3.0 – The Emerging Data Economy

In recent times, the big Internet companies – the Googles, Yahoos and eBays – have proven that it is possible to build a sustainable business on data analytics, in which corporate decisions and actions are seamlessly guided by an analytics culture based on data, measurement and quantifiable results. Now, two of the top data analytics thinkers say we are reaching a point where non-tech, non-Internet companies are on their way to becoming analytics-driven organizations in a similar vein, as part of an emerging data economy.


In a report written for the International Institute for Analytics, Thomas Davenport and Jill Dyché divulge the results of their interviews with 20 large organizations, in which they find big data analytics to be well integrated into the decision-making cycle. “Large organizations across industries are joining the data economy,” they observe. “They are not keeping traditional analytics and big data separate, but are combining them to form a new synthesis.”

Davenport and Dyché call this new state of management "Analytics 3.0," in which the concept and practices of competing on analytics are no longer confined to data management and IT departments or quants – analytics is embedded into all key organizational processes. That means major, transformative effects for organizations. "There is little doubt that analytics can transform organizations, and the firms that lead the 3.0 charge will seize the most value," they write.

Analytics 3.0 is the latest of three distinct phases in the way data analytics has been applied to business decision making, Davenport and Dyché say. The first two "eras" looked like this:

  1. Analytics 1.0, prevalent between 1954 and 2009, was based on relatively small and structured data sources from internal corporate sources.
  2. Analytics 2.0, which arose between 2005 and 2012, saw the rise of the big Web companies – the Googles and Yahoos and eBays – which were leveraging big data stores and employing prescriptive analytics to target customers and shape offerings. This time span was also shaped by a growing interest in competing on analytics, in which data was applied to strategic business decision-making. “However, large companies often confined their analytical efforts to basic information domains like customer or product, that were highly-structured and rarely integrated with other data,” the authors write.
  3. In the Analytics 3.0 era, analytical efforts are being integrated with other data types, across enterprises.

This emerging environment “combines the best of 1.0 and 2.0—a blend of big data and traditional analytics that yields insights and offerings with speed and impact,” Davenport and Dyché say. The key trait of Analytics 3.0  “is that not only online firms, but virtually any type of firm in any industry, can participate in the data-driven economy. Banks, industrial manufacturers, health care providers, retailers—any company in any industry that is willing to exploit the possibilities—can all develop data-based offerings for customers, as well as supporting internal decisions with big data.”

Davenport and Dyché describe how one major trucking and transportation company has been able to implement low-cost sensors for its trucks, trailers and intermodal containers, which “monitor location, driving behaviors, fuel levels and whether a trailer/container is loaded or empty. The quality of the optimized decisions [the company] makes with the sensor data – dispatching of trucks and containers, for example – is improving substantially, and the company’s use of prescriptive analytics is changing job roles and relationships.”
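As a toy illustration of the pattern described here – sensor readings feeding a prescriptive dispatch decision – consider the following sketch. The fields and the "nearest empty trailer" rule are assumptions for illustration, not the company's actual logic:

    from dataclasses import dataclass

    @dataclass
    class TrailerReading:
        trailer_id: str
        lat: float
        lon: float
        fuel_level: float  # 0.0 - 1.0
        loaded: bool

    def nearest_empty_trailer(readings, pickup_lat, pickup_lon):
        """Prescriptive rule: dispatch the closest empty trailer to a pickup."""
        empties = [r for r in readings if not r.loaded]
        return min(
            empties,
            key=lambda r: (r.lat - pickup_lat) ** 2 + (r.lon - pickup_lon) ** 2,
            default=None,
        )

    fleet = [
        TrailerReading("T-101", 33.75, -84.39, 0.8, loaded=False),
        TrailerReading("T-102", 33.64, -84.43, 0.5, loaded=True),
    ]
    print(nearest_empty_trailer(fleet, 33.76, -84.40))  # -> T-101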

New technologies and methods are helping enterprises enter the Analytics 3.0 realm, including "a variety of hardware/software architectures, including clustered parallel servers using Hadoop/MapReduce, in-memory analytics, and in-database processing," the authors add. "All of these technologies are considerably faster than previous generations of technology for data management and analysis. Analyses that might have taken hours or days in the past can be done in seconds."
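For readers unfamiliar with the MapReduce style the authors mention, here is a self-contained sketch of the idea in plain Python (no Hadoop required): a map step emits key-value pairs, a shuffle groups them by key, and a reduce step aggregates each group. The log lines are made up:

    from itertools import groupby
    from operator import itemgetter

    log_lines = [
        "2014-02-03 checkout error",
        "2014-02-03 checkout ok",
        "2014-02-04 search ok",
    ]

    # Map: emit a (date, 1) pair for every log line
    mapped = [(line.split()[0], 1) for line in log_lines]

    # Shuffle: group the pairs by key
    mapped.sort(key=itemgetter(0))

    # Reduce: sum the counts within each group
    counts = {
        date: sum(v for _, v in group)
        for date, group in groupby(mapped, key=itemgetter(0))
    }
    print(counts)  # {'2014-02-03': 2, '2014-02-04': 1}

On a cluster, the map and reduce steps run in parallel across many machines, which is where the speed the authors describe comes from.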

In addition, another key characteristic of big data analytics-driven enterprises is the ability to fail fast – to deliver, with great frequency, partial outputs to project stakeholders. With the rise of new 'agile' analytical methods and machine learning techniques, organizations are capable of delivering "insights at a much faster rate" and providing "an ongoing sense of urgency."

Perhaps most importantly, big data and analytics are integrated and embedded into corporate processes across the board. “Models in Analytics 3.0 are often being embedded into operational and decision processes, dramatically increasing their speed and impact,” Davenport and Dyché state. “Some are embedded into fully automated systems based on scoring algorithms or analytics-based rules. Some are built into consumer-oriented products and features. In any case, embedding the analytics into systems and processes not only means greater speed, but also makes it more difficult for decision-makers to avoid using analytics—usually a good thing.”
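A minimal sketch of what "embedding a model into a decision process" can look like – here an invented fraud score gating an order flow automatically, so that approval cannot bypass the analytics. The scoring weights and thresholds are made up for illustration:

    def fraud_score(order):
        """Stand-in for a trained scoring model (weights here are invented)."""
        score = 0.0
        if order["amount"] > 5000:
            score += 0.4
        if order["ship_country"] != order["billing_country"]:
            score += 0.3
        if order["account_age_days"] < 30:
            score += 0.2
        return score

    def process_order(order):
        """The score is consulted inline -- the decision cannot skip the model."""
        return "hold_for_review" if fraud_score(order) >= 0.7 else "approve"

    order = {"amount": 7500, "ship_country": "BR",
             "billing_country": "US", "account_age_days": 12}
    print(process_order(order))  # -> hold_for_review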

The report is available here.


Death of the Data Scientist: Silver Screen Fiction?

Maybe the word "death" is a bit strong, so let's say "demise" instead. Recently I read an article in the Harvard Business Review about how Big Data and Data Scientists will rule the world of the 21st-century corporation and how they have to operate for maximum value. The thing I found rather disturbing was that it takes a PhD – probably a few of them – in a variety of math areas to give executives the necessary insight to make better decisions, ranging from what product to develop next to who to sell it to and where.

Who will walk the next long walk…. (source: Wikipedia)

Don't get me wrong – this is mixed news for any enterprise software firm helping businesses locate, acquire, contextually link, understand and distribute high-quality data. The existence of such a high-value role validates product development but it also limits adoption. It is also great news that data has finally gathered the attention it deserves. But I am starting to ask myself why it always takes individuals with a "one-in-a-million" skill set to add value. What happened to the democratization of software? Why is the design starting point for enterprise software not always similar to B2C applications, like an iPhone app, i.e., simpler is better? Why is it always such a gradual "Cold War" evolution instead of a near-instant French Revolution?

Why do development environments for Big Data not accommodate limited or existing skills, but always the most complex scenarios? Well, the answer could be that the first customers are very large, very complex organizations with super complex problems they were unable to solve so far. If analytical apps have become a self-service proposition for business users, data integration should be as well. So why does access to a lot of fast-moving and diverse data require scarce Pig or Cassandra developers to get the data into an analyzable shape, and a PhD to query and interpret patterns?

I realize new technologies start with a foundation, and as they spread, supply will attempt to catch up to create an equilibrium. However, this is about a problem that has existed for decades in many industries, such as the oil & gas, telecommunications, public and retail sectors. Whenever I talk to architects and business leaders in these industries, they chuckle at "Big Data" and tell me "yes, we got that – and by the way, we have been dealing with this reality for a long time". By now I would have expected that the skill (cost) side of turning data into meaningful insight would have been driven down more significantly.

Informatica has made a tremendous push in this regard with its "Map Once, Deploy Anywhere" paradigm. I cannot wait to see what's next – and I just saw something recently that got me very excited. Why, you ask? Because at some point I would like to have at least a business super-user pummel terabytes of transaction and interaction data into an environment (Hadoop cluster, in-memory DB…) and massage it so that his or her self-created dashboard gets them where they need to go. This should include concepts like: "Where is the data I need for this insight?", "What is missing and how do I get to that piece in the best way?", "How do I want it to look so I can share it?" All that should be required is semi-experienced knowledge of Excel and PowerPoint to get your hands on advanced Big Data analytics. Don't you think? Do you believe that this role will disappear as quickly as it has surfaced?


History Repeats Itself Through Business Intelligence (Part 2)


In a previous blog post, I wrote about how, when business "history" is reported via Business Intelligence (BI) systems, it's usually too late to make a real difference. In this post, I'm going to talk about how business history becomes much more useful when combined operationally and in real time.

The historian E. P. Thompson pointed out that all history is the history of unintended consequences. His theory was that history is not always recorded in documents; instead, it is ultimately derived from examining cultural meanings and the structures of society through hermeneutics (the interpretation of texts), semiotics, and the many forms and signs of the times. He concluded that history is created by people's subjectivity and therefore is ultimately represented as they REALLY live.

The same can be extrapolated for businesses. However, the BI systems of today capture only a minuscule piece of the larger pie of knowledge representation that may be gained from things like meetings, videos, sales calls, anecdotal win/loss reports, shadow IT projects, 10-Ks and 10-Qs, even company blog posts ;-)  The point is: how can you better capture the essence of meaning, and perhaps importance, from the everyday non-database events taking place in your company and its activities – in other words, how it REALLY operates?

One of the keys to figuring out how businesses really operate is identifying and utilizing the undocumented RULES that usually underlie every business. Select company employees, often veterans, know these rules intuitively – and every company has them. If you watch these people, they just have a knack for getting projects pushed through the system, or making customers happy, or diagnosing a problem in a short time and with little fanfare. They just know how things work and what needs to be done.

These rules have been, and still are, difficult to quantify and apply – or "data-ify," if you will. Certain companies (and hopefully Informatica) will end up being major players in the race to data-ify these non-traditional rules and events, in addition to helping companies make sense of big data in a whole new way. And daydreaming about it, it's not hard to imagine business systems that will eventually be able to understand the optimization rules of a business, account for possible unintended scenarios or consequences, and then apply them at the time they are most needed. That, anyhow, is the goal of a new generation of Operational Intelligence systems.
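To make "data-ifying" such a rule slightly less of a daydream, here is a minimal sketch of the event-condition-action shape an Operational Intelligence system gives these undocumented rules. The rule itself – a veteran's instinct to escalate a premium customer's stuck order before they call to complain – and all its details are invented:

    from datetime import datetime, timedelta

    def escalate(event):
        print(f"Escalating order {event['order_id']} to an account manager")

    def on_order_event(event):
        """Event-condition-action: fire the action when the condition holds."""
        stuck_too_long = (
            event["status"] == "pending_approval"
            and datetime.now() - event["entered_status_at"] > timedelta(days=1)
        )
        if event["customer_tier"] == "premium" and stuck_too_long:
            escalate(event)

    on_order_event({
        "order_id": "SO-1042",
        "status": "pending_approval",
        "customer_tier": "premium",
        "entered_status_at": datetime.now() - timedelta(days=2),
    })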

In my final post on the subject, I'll explain how it works and the business problems it solves (in a nutshell). And if I've managed to pique your curiosity and you want to hear about Operational Intelligence sooner, tune in to a webinar we're having TODAY at 10 AM PST. Here's the link.

http://www.informatica.com/us/company/informatica-talks/?commid=97187


True Facts About Informatica RulePoint Real-Time Integration


Shhhh… RulePoint Programmer Hard at Work

End of year. Out with the old, in with the new. A time when everyone gets their ducks in a row, clears the pipe and gets ready for the New Year. For R&D, one of the gating events driving the New Year is the annual sales kickoff, where we present the new features to Sales so they can better communicate a product's road map and value to potential buyers. All well and good. But part of the process is to fill out a Q&A that explains the product "value prop" – and they only gave us 4 lines. I think the answer also helps determine speaking slots and priority.

So here's the question I had to answer:

FOR SALES TO UNDERSTAND THE PRODUCT BETTER, WE ASK THAT YOU ANSWER THE FOLLOWING QUESTION:

WHAT IS THE PRODUCT VALUE PROPOSITION AND ARE THERE ANY SIGNIFICANT DEPLOYMENTS OR OTHER CUSTOMER EXPERIENCES YOU HAVE HAD THAT HAVE HELPED TO DEFINE THE PRODUCT OFFERING?

Here’s what I wrote:

Informatica RULEPOINT is a real-time integration and event processing software product that is deployed very innovatively by many businesses and vertical industries.  Its value proposition is that it helps large enterprises discover important situations from their droves of data and events and then enables users to take timely action on discovered business opportunities as well as stop problems while or before they happen.

Here’s what I wanted to write:

RulePoint is scalable, low latency, flexible and extensible and was born in the pure and exotic wilds of the Amazon from the minds of natives that have never once spoken out loud – only programmed.  RulePoint captures the essence of true wisdom of the greatest sages of yesteryear. It is the programming equivalent and captures what Esperanto linguistically tried to do but failed to accomplish.

As to high availability (HA), there has never been anything in the history of software as available as RulePoint. Madonna's availability only pales in comparison to RulePoint's availability. We are talking 8 nines cubed and then squared ( ;-) ). Oracle = Unavailable. IBM = Unavailable. Informatica RulePoint = Available.

RulePoint works hard, but plays hard too.  When not solving those mission critical business problems, RulePoint creates Arias worthy of Grammy nominations. In the wee hours of the AM, RulePoint single-handedly prevented the outbreak and heartbreak of psoriasis in East Angola.

One of the little known benefits of RulePoint is its ability to train the trainer, coach the coach and play the player. Via chalk talks? No, RulePoint uses mind melds instead.  Much more effective. RulePoint knows Chuck Norris.  How do you think Chuck Norris became so famous in the first place? Yes, RulePoint. Greenpeace used RulePoint to save dozens of whales, 2 narwhal, a polar bear and a few collateral penguins (the bear was about to eat the penguins).  RulePoint has been banned in 16 countries because it was TOO effective.  “Veni, Vidi, RulePoint Vici” was Julius Caesar’s actual quote.

The inspiration for Gandalf in the Lord of the Rings? RulePoint. IT heads worldwide shudder with pride when they hear the name RulePoint mentioned and know that they acquired it. RulePoint is stirred but never shaken. RulePoint is used to train the Sherpas that help climbers reach the highest of heights. RulePoint cooks Minute rice in 20 seconds.

The running of the bulls in Pamplona every year – what do you think they are running from? Yes, RulePoint. RulePoint put the Vinyasa back into Yoga. In fact, RulePoint will create a new derivative called Full Contact Vinyasa Yoga, and it will eventually supplant gymnastics in the 2028 Summer Olympic games.

The laws of physics were disproved last year by RulePoint.  RulePoint was drafted in the 9th round by the LA Lakers in the 90s, but opted instead to teach math to inner city youngsters. 5 years ago, RulePoint came up with an antivenin to the Black Mamba and has yet to ask for any form of recompense. RulePoint’s rules bend but never break. The stand-in for the “Mind” in the movie “A Beautiful Mind” was RulePoint.

RulePoint will define a new category for the Turing award and will name it the 2Turing Award.  As a bonus, the 2Turing Award will then be modestly won by RulePoint and the whole category will be retired shortly thereafter.  RulePoint is… tada… the most interesting software in the world.

But I didn’t get to write any of these true facts and product differentiators on the form. No room.

Hopefully I can still get a primo slot to talk about RulePoint.

 

And so from all the RulePoint and Emerging Technologies team, including sales and marketing, here’s hoping you have great holiday season and a Happy New Year!

 


Big Data, So Mom Can Understand

Dear Mom,

I’m glad to hear you feel comfortable explaining data to your friends, and I completely understand why you’ll avoid discussing metadata with them.  You’re in great company – most business leaders also avoid discussing metadata at all costs! You mentioned during our last call that you keep reading articles in the New York Times about this thing called “Big Data” so as promised I’ll try to explain it as best I can.   (more…)


3 Big Data Secrets that can ignite a Product Launch

Customers don't always like change, and a new product launch brings a variety of changes, so it's important to showcase the value of those changes for customers while launching a product. One key ingredient that can fuel a successful product launch is leveraging the rich, varied, multi-sourced information that is available to us more readily than ever before. This information is like a gold mine, and industry experts call it Big Data. Today, Big Data can pull the gold out of that mine and positively impact a product launch. What follows are 3 secrets for how product marketers can tap the power of Big Data for a successful product launch.

Secret #1: Use Big Data to optimize content strategy and targeted messaging

The main challenge is not just to create a great product but also to communicate the product's clear, compelling value to your customers. You need to speak the language that resonates with customers' needs and preferences. Through social media platforms and weblogs, a wealth of information is available highlighting buyers' views and preferences. Big Data brings all these data points together from various sources and unlocks them to provide customer intelligence. Product marketers can leverage this intelligence to create customer segmentation and targeted messaging.

Secret #2: Use Big Data to identify influential customers and incent them to influence others

A study by Forrester Research indicates that today your most valuable customer may be the one who buys little but influences 100 others to buy via blogs, tweets, Facebook and online product reviews. Using MDM with Big Data, businesses can create a 360-degree customer profile by integrating transaction, social interaction and weblog data, which helps identify influential customers. Companies can engage these influential customers early by initiating a soft launch or beta testing of their product.
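As a toy sketch of what such a 360-degree influence calculation might look like once transaction, social and weblog data are integrated, consider the following; every name, field and weight below is invented for illustration:

    # Hypothetical integrated 360-degree profiles, one record per customer
    profiles = [
        {"customer": "alice", "annual_spend": 120.0,
         "followers": 48000, "reviews_written": 35, "blog_mentions": 12},
        {"customer": "bob", "annual_spend": 9500.0,
         "followers": 90, "reviews_written": 1, "blog_mentions": 0},
    ]

    def influence_score(p):
        """Invented weighting: reach and advocacy matter more than spend."""
        return (
            (p["followers"] / 1000) * 0.5
            + p["reviews_written"] * 1.0
            + p["blog_mentions"] * 2.0
        )

    for p in sorted(profiles, key=influence_score, reverse=True):
        print(p["customer"], round(influence_score(p), 1))
    # alice scores far higher than bob despite spending far less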

Secret #3: Use Big Data to provide direction for ongoing product improvement

Big Data is also a useful tool for monitoring ongoing product performance and keeping customers engaged post-launch. Insights into how customers are using the product, and what they enjoy most, can open the door to improvements in future launches, resulting in happier, more loyal customers.

Zynga, creator of the popular Facebook game FarmVille, collects terabytes of data a day and analyzes it to improve game features and customer service. As indicated in a WSJ article, after the Version 1 launch of the game the company analyzed customer behavior and found that customers were interacting with the animals much more than the designers expected. So in the second release, the game designers expanded the offerings with more focus on animals, keeping customers more engaged.

Key Takeaway:

Big data is proving to be a game changer for product managers and marketers who want to deeply engage with their customers and launch products with a memorable and valued customer experience.


Two Sci-Fi Big Data Predictions From Over 50 Years Ago

Science fiction represents some of the most impactful stories I’ve read throughout my life.  By impactful, I mean the ideas have stuck with me 30 years since I last read them.  I recently recalled two of these stories and realized they represent two very different paths for Big Data.  One path, quite literally, was towards enlightenment.  Let’s just say the other path went in a different direction.  The amazing thing is that both of these stories were written between 50-60 years ago. (more…)
