Category Archives: Business Impact / Benefits

Bringing the “Local Experience” Online: Today’s Farm Store is Data Driven

Have you ever found yourself walking into a store to buy one thing, only to leave two hours later with enough items to fill two minivans? I certainly have. Now imagine the same scenario, except this time you walk into a store to buy ranch supplies, like fencing materials or work boots, but end up leaving with an outdoor fire pit, fancy spurs, a pair of Toms shoes, a ski rack and a jar of pickled eggs. If you had no idea these products could be purchased at the same place, you clearly haven’t been to North 40 Outfitters.

Established in the Northwestern United States, North 40 Outfitters, a family-owned and -operated business, has long been outfitting the hardworking and hard-playing populace of the region. Understanding the diverse needs of its customers, North 40 Outfitters carries everything from fencing for cattle and livestock to tools and trailers. They have gear for camping and hunting—even fly fishing.

Named after the Homestead Act of 1862, an event with strong significance in the region, North 40 Outfitters’ heritage is built on community involvement and support of local small businesses. The company’s 700 employees could be regarded as family: at this year’s Thanksgiving, every employee was given a locally raised free-range turkey to bring home. And while North 40 Outfitters did open its doors for the Black Friday shopping experience, it eschewed the usual practice of opening as early as 3 AM in favor of a reasonable 7 o’clock start, offering patrons donuts as well as coffee from a local roaster.

North 40 Outfitters aims to be different, and they achieve that differentiation by being data driven. While the products they sell cannot all be sourced locally, the experience they offer aims to feel exactly that: local.

The Problem

Prior to operating under the name North 40 Outfitters, the company ran under the banner of “Big R”, a name shared with several other members of the same buying group. The decision to change the name to North 40 Outfitters came with the move into the digital realm: they needed a name to distinguish themselves. Now, as North 40 Outfitters, they can focus on what matters rather than dealing with the confusion of a shared name. They would continue to provide the “local store” experience while investing in a digital strategy as a means to do business online and bring the unique North 40 Outfitters experience and value nationwide.

Beneath those organizational changes lay an even greater challenge. With over 150,000 SKUs and no digital database for their product information, North 40 Outfitters had to find a solution and build everything from the ground up. Moreover, with customers demanding a way to buy products online, especially customers living in rural areas, it became clear that North 40 Outfitters would have to address its data concerns.

Along with the fresh rebrand and restructure, North 40 Outfitters needed to tame its product information, a critical step toward building a digital product database and launching an ecommerce store.

The Solution

North 40 Outfitters was clear about the goals of the recent rebranding, and they knew that investments needed to be made if they were to add value to their existing business. Building the capability to take the business to new channels, ecommerce in this case, meant finding the best solution to start on the right foot. Consequently, wishing to become masters of their own data for both online and in-store uses, North 40 Outfitters determined that they needed a PIM application to act as a single repository for product information.

It’s important to note that North 40 Outfitters’ environment is not typical of traditional retailers. The difference lies in the wide variety of product types they sell: some of their suppliers operate at a local, boutique scale, while others are large multinational distributors. Furthermore, a large portion of North 40 Outfitters’ customers live in rural regions, in some cases a day’s drive from the nearest store. With the ability to leverage both a PIM and an ecommerce solution, North 40 Outfitters is now a step closer to outfitting everyone in the Northwestern region.

Results

It is still very early to talk about results, since North 40 Outfitters has only recently entered the implementation phase. What can be said is that they are very excited. Having reclaimed their territory, and equipped with both a PIM and an ecommerce solution, they have all the right tools to till and plow the playing field.

The meaning of North 40 Outfitters

To the uninitiated, the name North 40 Outfitters might not mean much. However, there is a lot of local heritage and history standing behind this newly rebranded name. “North 40” derives from the Homestead Act of 1862: the “north forty” refers to the northernmost block of a homesteader’s property, and to this day the term holds significance for the local community. The second half of the brand, “Outfitters,” reflects the company’s focus on outfitting its customers both for work and for play. On the one hand, you can visit North 40 Outfitters to purchase goods for running your ranch, such as fencing material, horse-related goods or quality tools. At the same time, you can buy camping and backpacking goods—they even sell ice fishing huts.

North 40 Outfitters ensures their customers have what they need to work the land, get back from it and ultimately go out and play just as hard if not harder.


Swim Goggles, Great Data, and Total Customer Value


The other day I ran across an article on CMO.com from a few months ago entitled “Total Customer Value Trumps Simple Loyalty in Digital World”.  It’s a great article, so I encourage you to go take a look, but the basic premise is that loyalty does not necessarily equal value in today’s complicated consumer environment.

Customers can be loyal for a variety of reasons, as author Samuel Greengard points out. One of those reasons may be that they are stuck with a certain product or service because they believe there is no better alternative available. I know I can relate to this after a recent series of less-than-pleasant experiences with my bank. I’d like to change banks, but frankly they’re all about the same and it just isn’t worth the hassle. Therefore, I’m loyal to my unnamed bank, but definitely not an advocate.

The proverbial big fish in today’s digital world, according to the author, are customers who truly identify with the brand and who will buy the company’s products eagerly, even when viable alternatives exist.  These are the customers who sing the brand’s praises to their friends and family online and in person.  These are the customers who write reviews on Amazon and give your product 5 stars.  These are the customers who will pay markedly more just because it sports your logo.  And these are the customers whose voices hold weight with their peers because they are knowledgeable and passionate about the product.  I’m sure we all have a brand or two that we’re truly passionate about.

Total Customer Value in the Pool


My 13-year-old son is a competitive swimmer and will only use Speedo goggles – ever – hands down – no matter what. He wears Speedo t-shirts to show his support. He talks about how great his goggles are and encourages his teammates to try on his personal pair to show them how much better they are. He is a leader on his team, so when newbies come in and see him wearing these goggles, singing their praises, and finishing first, his advocacy holds weight. I’m sure we have owned well over 30 pairs of Speedo goggles over the past 4 years at $20 a pop – and add in the t-shirts and of course swimsuits – we probably have a historical value of over $1,000 and a potential lifetime value of tens of thousands (ridiculous I know!). But if you add in the influence he’s had over others, his value is tremendously more – at least 5X.

This is why data is king!

I couldn’t agree more that total customer value, or even total partner or total supplier value, is absolutely the right approach, and is a much better indicator of value.  But in this digital world of incredible data volumes and disparate data sources & systems, how can you really know what a customer’s value is?

The marketing applications you probably already use are great – there are so many strong automation, web analytics, and CRM systems around. But what fuels these applications? Your data.

Most marketers think that data is the stuff that applications generate or consume. As if all data is pretty much the same.  In truth, data is a raw ingredient.  Data-driven marketers don’t just manage their marketing applications, they actively manage their data as a strategic asset.


How are you using data to analyze and identify your influential customers? Can you tell that a customer bought their fourth product from your website, and then promptly tweeted about the great deal they got on it? Even more interesting, can you tell that five of their friends followed the link, one bought the same item, one looked at it but ended up buying a similar item, and one put it in their cart but didn’t buy it because it was cheaper on another website? And more importantly, how can you keep this person engaged so they continue their brand preference – so somebody else with a similar brand and product doesn’t swoop in and do it first? And the ultimate question… how can you scale this so that you’re doing it automatically within your marketing processes, with confidence, every time?

All marketers need to understand their data – what exists in your information ecosystem, whether internal or external. Can you even get to the systems that hold the richest data? Do you leverage your internal customer support/call center records? Is your billing/financial system utilized as a key location for customer data? And the elephant in the room… can you incorporate the invaluable social media data that is ripe for marketers to leverage as an automated component of their marketing campaigns?
This is why marketers need to care about data integration

Even if you do have access to all of the rich customer data that exists within and outside of your firewalls, how can you make sense of it?  How can you pull it together to truly understand your customers… what they really buy, who they associate with, and who they influence.  If you don’t, then you’re leaving dollars, and more importantly, potential advocacy and true customer value, on the table.
This is why marketers need to care about achieving a total view of their customers and prospects… 

And none of this matters if the data you are leveraging is plain incorrect or incomplete. How often have you seen some analysis on an important topic, had that gut feeling that something must be wrong, and questioned the data that was used to pull the report? The obvious data quality errors are really only the tip of the iceberg. Most of the data quality issues that marketers face are either not glaringly obvious enough to catch and correct on the spot, or are baked into an automated process that nobody has the opportunity to catch. Making decisions based upon flawed data inevitably leads to poor outcomes.
This is why marketers need to care about data quality.

So, as the article points out, don’t just look at loyalty, look at total customer value. But realize that this is easier said than done without focusing on your data and ensuring you have all of the right data, in the right place, in the right format, right away.

Now…  Brand advocates, step up!  Share with us your favorite story.  What brands do you love?  Why?  What makes you so loyal?


Salesforce Lightning Connect and OData: What You Need to Know


Last month, Salesforce announced that they are democratizing integration through the introduction of Salesforce1 Lightning Connect. This new capability makes it possible to work with data that is stored outside of Salesforce using the same force.com constructs (SOQL, Apex, Visualforce, etc.) that are used with Salesforce objects. The important caveat is that the external data has to be available through the OData protocol, and the provider of that protocol has to be accessible from the internet.

I think this new capability, Salesforce Lightning Connect, is an innovative development and gives OData, an OASIS standard, a leg up on its W3C-defined competitor, Linked Data. OData is a REST-based protocol that provides access to data over the web. The fundamental data model is relational, and the query language closely resembles stripped-down SQL. This is much more familiar to most people than the RDF-based model used by Linked Data or its SPARQL query language.
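To make the SQL comparison concrete, here is a small Python sketch that builds an OData query URL. The service root and the Products entity set are hypothetical, but `$select`, `$filter`, `$orderby`, and `$top` are standard OData system query options.

```python
from urllib.parse import urlencode

base = "https://services.example.com/odata/v4"  # hypothetical OData service root

# SQL-like intent: SELECT Name, Price FROM Products
#                  WHERE Price < 20 ORDER BY Price LIMIT 10
query = urlencode({
    "$select": "Name,Price",
    "$filter": "Price lt 20",
    "$orderby": "Price asc",
    "$top": 10,
})
url = f"{base}/Products?{query}"
print(url)
```

A GET on a URL like this returns a JSON document whose result rows sit in a top-level `value` array, which is what an OData consumer maps onto its own relational constructs.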

Standardization of OData has been going on for years (they are working on version 4), but it has suffered from a bit of a chicken-and-egg problem. Applications haven’t put a high priority on supporting the consumption of OData because there haven’t been enough OData providers, and data providers haven’t prioritized making their data available through OData because there haven’t been enough consumers. With Salesforce, a cloud leader, declaring that they will consume OData, the equation changes significantly.

But these things take time – what does a user of Salesforce (or any other OData consumer) do if most of the data sources they have cannot be accessed through an OData provider? It is the old last-mile problem faced by any communications or integration technology. It is fine to standardize, but how do you get all the existing endpoints to conform to the standard? You need someone to do the labor-intensive work of converting lots of endpoints to the standard representation.

Informatica has been in the last-mile business for years. As it happens, the canonical model that we always used has been a relational model that lines up very well with the model used by OData. For us to host an OData provider for any of the data sources that we already support, we only needed to do one conversion from the internal format that we’ve always used to the OData standard. This OData provider capability will be available soon.

But there is also the firewall issue. The consumer of the OData has to be able to access the OData provider. So, if you want Salesforce to be able to show data from your Oracle database, you would have to open up a hole in your firewall that provides access to your database. Not many people are interested in doing that – for good reason.

Informatica Cloud’s Vibe secure agent architecture is a solution to the firewall issue that will also work with the new OData provider. The OData provider will be hosted on Informatica’s Cloud servers, but will have access to any installed secure agents. Agents require a one-time install on-premise, but are thereafter managed from the cloud and are automatically kept up to date with the latest version by Informatica. An agent doesn’t require a port to be opened; instead, it opens an outbound connection to the Informatica Cloud servers through which all communication occurs. The agent then has access to any on-premise applications or data sources.
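The outbound-only pattern described above can be sketched in a few lines of Python. This is an illustration of the general pattern only, not Informatica’s actual agent protocol; the function names, task format, and in-memory queue are all invented for the example.

```python
# Minimal sketch of an outbound-only agent loop: the agent initiates every
# connection, so no inbound firewall port is ever required.
def run_agent(fetch_work, execute, post_result, max_polls=3):
    """Poll the cloud service for work over an outbound channel."""
    for _ in range(max_polls):
        task = fetch_work()          # outbound request: "any work for me?"
        if task is None:
            continue                 # nothing to do this cycle
        result = execute(task)       # run against an on-premise data source
        post_result(task, result)    # outbound request: ship the answer back

# Demo with in-memory stand-ins for the cloud service and local database.
queue = [{"query": "SELECT 1"}, None, {"query": "SELECT 2"}]
results = []
run_agent(
    fetch_work=lambda: queue.pop(0) if queue else None,
    execute=lambda task: f"ran {task['query']}",
    post_result=lambda task, res: results.append(res),
)
print(results)
```

The key design point is direction: because the agent dials out and keeps the channel open, the cloud side can reach on-premise data without any inbound firewall rule.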

OData is especially well suited to reading external data. However, there are better ways to create or update external data. One issue is that Lightning Connect currently only handles reads; but even where writes are possible, it isn’t usually appropriate to add data to most applications by just inserting records in tables. Usually a collection of related information must be provided in order for the update to make sense. To facilitate this, applications provide APIs that offer a higher level of abstraction for updates. Informatica Cloud Application Integration can be used now to read or write data to external applications from within Salesforce through the use of guides that can be displayed from any Salesforce screen. Guides make it easy to generate a friendly user interface that shows exactly the data you want your users to see and to guide them through the collection of new or updated data that needs to be written back to your app.


Informatica Rev: Data Democracy At Last – Part I


Informatica Cloud Data Preparation has launched! Those are the words we aspired to hear, the words that served as our rallying cry, when all we had was an idea coupled with a ton of talent, passion and drive. Well, today, we launch Informatica Rev, as business users refer to it. (Check out the press release here.)

As we launch today, we already have over 3,500 individual users across over 800 logos. These users are everyday business users who just want to improve the speed and quality of their business decisions. By doing so, they help their corporations find success in the marketplace, and in turn they find success in their own careers. You can hear more from customers talking about their experience using Informatica Rev during our December 16 Future of Work webinar.

These users are people who, previously, were locked out of the exclusive Data Club because they did not have the time to become Excel jocks or did not know how to code. But now, these are people who have found success by turning their backs on this Club and enthusiastically participating in the Data Democracy.

And they are able to finally participate in the “Data Democracy” because of Informatica Rev. You can try Informatica Rev for free by clicking here.

These people play every conceivable role in the information economy. They are marketing managers, marketing operations leads, tradeshow managers, sales people, sales operations leads, accounting analysts, recruiting leads, and benefits managers, to mention a few. They work for everything from large companies to small and mid-size companies and even sole proprietorships. They are even IT leads who might have more technical knowledge than their business counterparts, but who are increasingly barraged by requests from the business side and are just looking to be more productive with those requests. Let’s take a peek into how Informatica Rev allows them to participate in the Data Democracy, and changes their lives for the better.

Before Informatica Rev, a marketing analyst was simply unable to respond to rapid changes in competitor prices, because by the time the competitor pricing data was assembled by the people or tools they relied on, the competitor prices had changed. This led to lost revenue opportunities for the company. I almost don’t need to state that this end result is not an insignificant repercussion of the inability to respond at the rapid pace of business.

Let’s explore what a marketing analyst does today. When a file with competitor prices is received, the initial questions the analyst asks are “Which of my SKUs is each competitive price for?” and “Do the prices vary by geography?” To answer these questions, they use Excel VLOOKUPs and some complex macros. By the time the Excel work is done (if they even know what a VLOOKUP is), the competitor data is old. Therefore, at some point, there was no reason to continue the analysis, and the company simply accepted its inability to capture this revenue.
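The VLOOKUP work described here is essentially a relational join. As an illustration only (this is plain pandas, not Informatica Rev, and the SKUs and prices are invented), the same lookup is a single merge:

```python
import pandas as pd

# Hypothetical data standing in for the analyst's two files.
my_skus = pd.DataFrame({
    "sku": ["A100", "A200", "B300"],
    "our_price": [19.99, 5.49, 42.00],
    "competitor_sku": ["X-1", "X-2", "X-3"],
})
competitor = pd.DataFrame({
    "competitor_sku": ["X-1", "X-2", "X-3"],
    "region": ["NW", "NW", "SW"],
    "comp_price": [18.99, 5.99, 39.50],
})

# The VLOOKUP question "which of my SKUs is each competitive price for?"
# becomes a left join on the shared key; "do prices vary by geography?"
# is answered by the region column that comes along with it.
merged = my_skus.merge(competitor, on="competitor_sku", how="left")
print(merged[["sku", "region", "our_price", "comp_price"]])
```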

With Informatica Rev, a marketing analyst can use Intelligent Guidance to understand the competitor data file and determine its completeness, and then use Smart Combine to easily combine the competitor data with their own: no code, no formal training, and only a few minutes of work, all by themselves. And with Tableau as their BI tool, they can use the Export to TDE capability to seamlessly export to Tableau and analyze trends in price changes to decide on their strategy. Voila!

Before Informatica Rev, a tradeshow manager used to spend an inordinate amount of time trying to validate leads so that they could be loaded into a marketing automation system. After a tradeshow, time is of the essence: leads need to be processed rapidly or they will decay, and fewer opportunities will result for the company. Again, I almost don’t need to state that this end result is not an insignificant repercussion of the inability to respond at the rapid pace of business. But the tradeshow manager finds themselves using Excel VLOOKUPs and other creative but time-consuming ways to validate the lead information. They simply want to know: “Which leads have missing titles or phone numbers?” “What is the correct phone number?” “How many are new leads?” “How many are in accounts closing this quarter?”

All of these are questions that can be answered, but they take a lot of time in Excel, and even after all that Excel work, the final lead list was still error prone, causing missed sales opportunities. With Informatica Rev, a tradeshow manager can answer these questions rapidly, with no code or formal training, in a few minutes, all by themselves. With the Intelligent Guidance capability they can easily surface where the missing data lies. With Fast Combine they can access their opportunity information in Salesforce and be guided through the process of combining tradeshow and Salesforce data to correctly replace the missing data. Again, Voila!
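The "which leads have missing fields, and what are the correct values?" workflow boils down to a null check plus a fill from a second source. A hedged pandas sketch with invented leads and CRM records (an illustration of the logic, not of Informatica Rev itself):

```python
import pandas as pd

# Hypothetical tradeshow leads with gaps, plus CRM records to fill them from.
leads = pd.DataFrame({
    "email": ["a@x.com", "b@y.com", "c@z.com"],
    "title": ["CTO", None, "Analyst"],
    "phone": [None, "555-0102", "555-0103"],
})
crm = pd.DataFrame({
    "email": ["a@x.com", "b@y.com"],
    "title": ["CTO", "VP Sales"],
    "phone": ["555-0101", "555-0102"],
})

# "Which leads have missing titles or phone numbers?"
incomplete = leads[leads[["title", "phone"]].isna().any(axis=1)]
print(incomplete["email"].tolist())

# Fill the gaps from CRM data, keeping lead values where already present.
filled = (leads.set_index("email")
               .combine_first(crm.set_index("email"))
               .reset_index())
```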

Before Informatica Rev, an accounting analyst spent inordinate amounts of time every month processing trade partner data, reconciling it with the trade partner’s receivables to determine whether they had been paid the correct amount. Not only was this process time consuming, it was error prone, and after all of the effort, millions in earned revenue were still left unreceived. And again, I almost don’t need to state that this end result is not an insignificant repercussion of the inability to respond at the rapid pace of business, and to effectively manage operational costs within the analyst’s company. So, let’s take a look at what the accounting analyst does today. Every trade partner sends large files of purchase data with different structures. The accounting analyst initially asks: “What data is in them?” “For what time period?” “How many transactions?” “From which products?” “Which of our actual products does their name for our product tie to?”

Then, after they get these answers, they need to combine the data with the payments data received from the trade partner in order to answer the question, “Have we been paid the right amount, and if not, what is the difference?” All of these questions can be answered, but they used to take a lot of time with Excel VLOOKUPs and complex macros. And often, the reconciliation was performed incorrectly, leaving receivables, well, un-received. With Informatica Rev, an accounting analyst can benefit from Intelligent Guidance, which leads them through the process of rapidly answering their questions about the trade partner files with a few simple clicks. Furthermore, Informatica Rev’s Smart Combine capability suggests how to combine receivables data with trade partner data. So there you have it: now they know whether the correct amount has been paid. And the best part is that they were able to answer these questions rapidly with no code or formal training, in a few minutes, all by themselves. Now, this process has to be done every month. Using Recipes, every step the accounting analyst took last month is recorded, so they do not have to repeat it this month: just re-import the new trade partner data and the reconciliation is done. And again, Voila!
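Under the hood, the reconciliation itself is an aggregate plus a join. A pandas sketch with invented partner data (again, an illustration of the logic, not of Informatica Rev):

```python
import pandas as pd

# Hypothetical stand-ins for a partner's purchase file and payments received.
purchases = pd.DataFrame({
    "partner_product": ["Widget-Lg", "Widget-Lg", "Gadget"],
    "amount_owed": [100.0, 250.0, 75.0],
})
payments = pd.DataFrame({
    "partner_product": ["Widget-Lg", "Gadget"],
    "amount_paid": [300.0, 75.0],
})

# Total what each product line is owed, join in what was actually paid,
# and compute the shortfall.
owed = purchases.groupby("partner_product", as_index=False)["amount_owed"].sum()
recon = (owed.merge(payments, on="partner_product", how="left")
             .fillna({"amount_paid": 0.0}))
recon["difference"] = recon["amount_owed"] - recon["amount_paid"]

# Rows with a nonzero difference are the un-received receivables.
print(recon[recon["difference"] != 0])
```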

One more thing for you, the everyday business user. In the future, you will be able to send this Recipe to IT. This capability will allow you to communicate your exact data requirement to IT, just as you created it, with no misinterpretation on anyone’s part. IT can then rapidly institutionalize your logic exactly as you defined it into the enterprise data warehouse, data mart or some other repository of your or your IT department’s liking. Perhaps this means the end of those requirements-gathering sessions?

More importantly, I feel this means that you just got your exact requirement added into a central repository in a matter of minutes. And you did not need to make a case to be part of an enterprise project either. This capability is a necessary part of participating in the Data Democracy while maintaining the rapid pace of business. It is a piece that Informatica is uniquely positioned to solve for you, as your IT department likely already has Informatica.

Just as these Professionals have found success by participating in the Data Democracy, with Informatica Rev you finally can do so, too.

Please look for Part 2 of this blog tomorrow, where I will discuss how Informatica Rev elegantly bridges the IT and business divide, empowering IT to lead the charge into the Data Democracy. But in the meantime, check out Informatica Rev for yourself and let me know what you think.


Great Data Puts You in the Driver’s Seat: The Next Step in the Digital Revolution


The industrial revolution began in the mid-to-late eighteenth century, introducing machines to cut costs and speed up manufacturing processes. Steam engines forever changed efficiency in iron making, textiles, and chemicals production, among many other industries. Transportation improved significantly, and the standard of living for the masses saw significant, sustained growth.

In the last 50-60 years, we have witnessed another revolution, through the invention of computing machines and the Internet – a digital revolution. It has transformed every industry and allowed us to operate at far greater scale – processing more transactions and in more locations – than ever before. New cities emerged on the map, migrations of knowledge workers throughout the world followed, and the standard of living increased again. And digitally available information transformed how we run businesses, cities, and countries.

Forces Shaping Digital Revolution

Over the last 5-6 years, we’ve witnessed a massive increase in the volume and variety of this information. The leading forces that contributed to this increase are:

  • A next generation of software technology connecting data faster from any source
  • Little to no hardware cost to process and store huge amounts of data (Moore’s Law)
  • A sharp increase in the number of machines and devices generating data that are connected online
  • Massive worldwide growth in the number of people connecting online and sharing information
  • Fast Internet connectivity that is now free in many public places

As a result, our engagement with the digital world is rising – both for personal and business purposes. Increasingly, we play games, shop, sign digital contracts, make product recommendations, respond to customer complaints, share patient data, and make real-time pricing changes to in-store products – all from a mobile device or laptop. We do so increasingly in a collaborative way, in real time, and in a very personalized fashion. Big Data, Social, Cloud, and the Internet of Things are the key topics dominating our conversations and thoughts around data these days. They are altering the ways we engage with, and what we expect from, each other.

This is either the emergence of a new revolution or the next phase of our digital revolution – the democratization and ubiquity of information to create new ways of interacting with customers and dramatically speed up market launch. Businesses will build new products and services and create new business models by exploiting this vast new resource of information.

The Quest for Great Data

But there is work to do before one can unleash the true potential captured in data. Data is no longer a by-product or transaction record, nor does it have an expiration date. Data now flows like a river, fueling applications, business processes, and human or machine activities. New data gets created along the way and augments our understanding of the meaning behind this data. It is no longer good enough to have good data in isolated projects; rather, great data needs to become accessible to everyone and everything at a moment’s notice. This rich set of data needs to connect efficiently to information that is already present and learn from it. Such data needs to automatically rid itself of inaccurate and incomplete information. Clean, safe, and connected – this data is now ready to find us even before we discover it. It understands the context in which we are going to make use of this information and the key decisions that will follow. In the process, this data is learning about our usage, preferences, and results – what works versus what doesn’t. New data is then created that captures such inherent understanding or intelligence. It needs to flow back to the appropriate business applications or machines for future use after fine-tuning. Such data can then tell a story about human or machine actions and results. Such data can become a coach, a mentor, a friend of sorts to guide us through critical decision points. Such data is what we would like to call great data. In order to truly capitalize on the next step of the digital revolution, we will pervasively need this great data to power our decisions and thinking.

Impacting Every Industry

By 2020, there will be 50 billion connected devices, 7x more than human beings on the planet. This explosion of devices brings with it really big data that will be processed and stored, increasingly in the cloud. More than size, this complexity will require a new way of addressing business process efficiency, one that delivers agility, simplicity, and capacity. The impact of such a transformation will spread across many industries. A McKinsey article, “The Future of Global Payments”, focuses on the digital transformation of payment systems in the banking industry and the resulting ubiquity of data. One of the key challenges for banks will be to shift from their traditional heavy reliance on siloed and proprietary data to a more open approach that encompasses a broader view of customers.

Industry executives, front line managers, and back office workers are all struggling to make the most sense of the data that’s available.

Closing Thoughts on Great Data

The 2014 PwC Global CEO Survey showed that 81% of CEOs ranked technology advances as the #1 factor that will transform their businesses over the next 5 years. More data, by itself, isn’t enough for this transformation. A robust data management approach, integrating machine and human data from all sources, updated in real time, across on-premise and cloud-based systems, must be put in place to accomplish this mission. Such an approach will nurture great data. This end-to-end data management platform will provide data guidance and curate one of an organization’s most valuable assets: its information. Only by making sense of what we have at our disposal will we unleash the true potential of the information that we possess. The next step in the digital revolution will be about organizations of all sizes being fueled by great data to unleash their untapped potential.

Posted in Big Data, Business Impact / Benefits, CIO, Data Governance, Data Integration Platform, Enterprise Data Management

Analytics Stories: An Educational Case Study

As I have shared in other posts in this series, businesses are using analytics to improve their internal and external facing business processes and to strengthen their “right to win” in the markets where they operate. At first glance, you might not think of universities needing to worry much about their right to win, but universities today face increasing competition for students as well as the need to increase efficiency, decrease dependence on state funding, create new and less expensive delivery models, and drive better accountability.

George Washington University Perceives The Analytic Opportunity

George Washington University (GWU) is no different. For this reason, its leadership determined that the university needed the business insight to compete for the best students, meet student diversity needs, and provide accountability to internal and external stakeholders. All of these issues turned out to have a direct impact on GWU's business processes, from student recruitment to financial management. At the same time, university leadership determined that the complexity of these challenges requires continual improvement in the university's operational strategies and, most importantly, accurate, timely, and consistent data.

Making It A Reality

GWU determined that addressing these issues required a flexible system that could provide analytics and key academic performance indicators and metrics on demand, whenever they were needed. It also determined that the analytics, and the underlying data needed to enable accurate, balanced decisions, had to be delivered more quickly and more effectively than in the past.

Unfortunately, GWU’s data was buried in disparate data sources that were largely focused on supporting transactional, day-to-day business processes. This data was difficult to extract and even more difficult to integrate into a single format, owing to inherent system inconsistencies and the ownership issues surrounding them — a classic problem for collegial environments. Moreover, the university’s transaction applications did not store data in models that supported on-demand and ad hoc aggregations that GWU business users required.

To solve these issues, GWU created a data integration and business intelligence implementation dubbed the Student Data Mart (SDM). The SDM integrates raw structured and unstructured data into a unified data model to support key academic metrics.
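
The article doesn't show the SDM's actual schema, but the integration pattern it describes, joining per-system records on a shared student ID into one unified model, can be sketched roughly as follows. All source names and fields here are hypothetical, not GWU's real design:

```python
# Hypothetical sketch of an SDM-style integration step: records from
# separate transactional systems, keyed by a shared student ID, are merged
# into one unified student row. Field and source names are illustrative.

def unify_student(admissions_rec, registration_rec, grades_rec):
    """Combine per-system records for one student into a single row."""
    return {
        "student_id": admissions_rec["id"],
        "admit_term": admissions_rec.get("term"),
        "campus": registration_rec.get("campus"),
        "credits": registration_rec.get("credits", 0),
        "gpa": grades_rec.get("gpa"),
    }

admissions = {"S001": {"id": "S001", "term": "2004FA"}}
registration = {"S001": {"campus": "Foggy Bottom", "credits": 15}}
grades = {"S001": {"gpa": 3.7}}

# Build the "mart": one unified row per student ID.
mart = [
    unify_student(admissions[sid], registration[sid], grades[sid])
    for sid in admissions
]
print(mart[0]["gpa"])  # 3.7
```

In a real deployment the join would run in a data integration platform against database sources rather than in-memory dictionaries, but the shape of the result, one conformed row per student, is the same.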

“The SDM represents a life record of the students,” says Wolf, GWU’s Director of Business Intelligence. “It contains 10 years of recruitment, admissions, enrollment, registration, and grade-point average information for all students across all campuses.” It supports a wide range of academic metrics around campus enrollment counts, admissions selectivity, course enrollment, student achievement, and program metrics.

These metrics are directly and systematically aligned with the academic goals of each department and with GWU’s overarching business goals. Wolf says, “The SDM system provides direct access to key measures of academic performance. By integrating data into a clean repository and disseminating information over the intranet, the SDM has given university executives direct access to key academic metrics. Based on these metrics, users are able to make decisions in a timely manner and with more precision than before.”

Their integration technology supports a student account system, which supplies more than 400 staff with a shared, unified view of the financial performance of students. It connects data from a series of diverse, fragmented internal sources as well as third-party data from employers, sponsors, and collection agencies. The goal is to answer business questions such as whether students have paid their fees and how much they paid for each university course.

Continual Quality Improvement

During its implementation, GWU’s data integration process exposed a number of data quality issues that were the natural outcome of distributed data ownership. Without an enterprise approach to data and analytics, it would have been difficult to investigate the nature and extent of the data quality issues in its historically fragmented business intelligence environment. Taking an enterprise approach has also enabled GWU to improve its data quality standards and procedures.

Wolf explains, “Data quality is an inevitable problem in any higher education establishment, because you have so many different people—lecturers, students, and administration staff—all entering data. With our system, we can find hidden data problems, wherever they are, and analyze the anomalies across all data sources. This helps build our trust and confidence in the data. It also speeds up the design phase because it overcomes the need to hand query the data to see what the quality is like.”
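
The kind of cross-source profiling Wolf describes, surfacing hidden data problems automatically rather than by hand-querying, might look something like this minimal sketch. The rules and field names are illustrative, not GWU's actual checks:

```python
# Minimal sketch of automated cross-source profiling: flag records with
# missing keys or out-of-range values instead of hand-querying each source.
# The rules and fields here are illustrative assumptions.

def find_anomalies(records):
    """Return (record, reason) pairs for rows that violate simple rules."""
    anomalies = []
    for rec in records:
        if not rec.get("student_id"):
            anomalies.append((rec, "missing student_id"))
        gpa = rec.get("gpa")
        if gpa is not None and not (0.0 <= gpa <= 4.0):
            anomalies.append((rec, "gpa out of range"))
    return anomalies

rows = [
    {"student_id": "S001", "gpa": 3.2},
    {"student_id": "", "gpa": 2.8},       # entry error: blank ID
    {"student_id": "S003", "gpa": 41.0},  # entry error: misplaced decimal
]
for rec, reason in find_anomalies(rows):
    print(reason)  # "missing student_id", then "gpa out of range"
```

Running rules like these across every source, rather than querying each system by hand, is what lets anomalies be analyzed in one place and trust in the data accumulate.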

Connecting The Dots

Wolf and his team have not stopped there. As data emanating from social media has grown, they have designed their system so social data can be integrated just as easily as traditional data sources, including Oracle Financials, SunGard, SAP, and flat-file data. Wolf says the SDM platform doesn’t turn its back on any type of data. By allowing the university to integrate any type of data, including social media, Wolf has been able to support key measures of academic performance, improve standards, and reduce costs. Ultimately, this is helping GWU maintain its competitive position, especially as a magnet for the best students around the world.

In sum, the GWU analytics solution has helped it achieve the following business goals:

  • Attract the best students
  • Provide trusted, reliable data for decision makers
  • Enable more timely business decisions
  • Increase achievement of academic and administrative goals
  • Deliver new business insight by combining social media with existing data sources

Related links

Related Blogs

Analytics Stories: A Banking Case Study
Analytics Stories: A Financial Services Case Study
Analytics Stories: A Healthcare Case Study
Who Owns Enterprise Analytics and Data?
Competing on Analytics: A Follow Up to Thomas H. Davenport’s Post in HBR
Thomas Davenport Book “Competing On Analytics”

Solution Brief: The Intelligent Data Platform

Author Twitter: @MylesSuer

Posted in Business Impact / Benefits, Business/IT Collaboration, CIO, Data Governance

Building Engagement through Service and Support

This is a guest post by Tom Petrocelli, Research Director of Enterprise Social, Mobile and Cloud Applications at Neuralytix

Engagement through Service and Support


A product is not just an object, or the bits that comprise software or digital media; it is an entire experience. The complete customer experience is vital to the overall value a customer derives from a product and to the ongoing relationship between customer and vendor. The customer experience is enhanced through a series of engagements over a variety of digital, social, and personal channels. Each point of contact between a vendor and a customer is an opportunity for engagement, and these engagements over time affect the level of satisfaction customers have with the vendor relationship.

Service and support is a critical part of this engagement strategy. Retail and consumer goods companies recognize the importance of support to the overall customer relationship. Consequently, these companies have integrated their before- and after-purchase support into their multi-channel and omni-channel marketing strategies. While retail and consumer products companies have led the way in making support an integral part of ongoing customer engagement, B2B companies have begun to do the same. Enterprise IT companies, which are primarily B2B companies, have been expanding their service and support capabilities to create more engagement between their customers and themselves. Service offerings have expanded to include mobile tools, analytics-driven self-help, and support over social media and other digital channels. The goal of these investments is to make interactions more productive for the customer, strengthen relationships through positive engagement, and gather data that drives improvements in both the product and the service.

A great example of an enterprise software company that understands the value of customer engagement through support is Informatica. Known primarily for their data integration products, Informatica has been quickly expanding their portfolio of data management and data access products over the past few years. This growth in their product portfolio has introduced many new types of customers to Informatica and created more complex customer relationships. For example, the new Springbok product is aimed at making data accessible to the business user, a new type of interaction for Informatica. Informatica has responded with a collection of new service enhancements that augment and extend existing service channels and capabilities.

What these moves say to me is that Informatica has made a commitment to deeper engagement with customers. For example, Informatica has expanded the avenues through which customers can get support. By adding social media and mobile capabilities, they are creating additional points of presence that address customer issues when and where customers are. Informatica provides support on the customers’ terms instead of requiring customers to do what is convenient for Informatica. Ultimately, Informatica is creating more value by making it easier for customers to interact with them. The best support is that which solves the problem quickest with the least amount of effort. Intuitive knowledge base systems, online support, sourcing answers from peers, and other tools that help find solutions immediately are valued more than traditional phone support. This is the philosophy that drives the new self-help portal, predictive escalation, and product adoption services.

Informatica is also shifting the support focus from products to business outcomes. They manage problems holistically rather than simply creating product band-aids. This shows a recognition that technical problems with data are actually business problems with broad effects on a customer’s business. Contrast this with the traditional approach to support, which focuses on fixing a technical issue but doesn’t necessarily address the wider organizational effects of those problems.

More than anything, these changes are preparation for a very different support landscape. With the launch of the Springbok data analytics tool, Informatica’s support organization is clearly positioning itself to help business analysts and similar semi-technical end-users. The expectations of these end-users have been set by consumer applications. They expect more automation and more online resources that help them to use and derive value from their software and are less enamored with fixing technical problems.

In the past, technical support was mostly charged with solving immediate technical issues. That’s still important, since products have to work before they can be useful. Now, however, support organizations have an expanded mission: to be part of the overall customer experience and to enhance overall engagement. The latest enhancements to the Informatica support portfolio reflect this mission and prepare the organization for the next generation of non-IT Informatica customers.

Posted in B2B Data Exchange, Big Data, Business Impact / Benefits, Business/IT Collaboration

Connecting Architecture To Business Strategy

On November 13, 2014, Informatica acquired the assets of Proact, whose Enterprise Architecture tools and delivery capability link architecture to business strategy. The BOST framework is now the Informatica Business Transformation Toolkit, which received high marks in a recent research paper:

“(BOST) is a framework that provides four architectural views of the enterprise (Business, Operational, Systems, and Technology). This EA methodology plans and organizes capabilities and requirements at each view, based on evolving business and opportunities. It is one of the most finalized of the methodologies, in use by several large enterprises.” [1]

FacebookTwitterLinkedInEmailPrintShare
Posted in Architects, Business Impact / Benefits, Business/IT Collaboration, Data Governance, Data Integration, Integration Competency Centers | Tagged , , , | Leave a comment

A New Dimension on a Data-Fueled World


“A Data-Fueled World” is Informatica’s new view of data in the enterprise. I think we can all agree that technology innovation has changed how we live and view everyday life. But I want to speak about a new aspect of the data-fueled world, one that is evident now and will be strikingly present in the few years to come. I want to address the topic of “information workers”.

Information workers deal with information, in other words, data. They use that data to do their jobs. They make business decisions with that data. They impact the lives of their clients.

Many years ago, I was part of a formative working group researching information worker productivity. The idea was to create an index, like the labor productivity indexes, aimed at information worker productivity. By this I mean the analysts, accountants, actuaries, underwriters, and statisticians; these are business information workers. How productive are they? How do you measure their output? How do you calculate the economic cost of more or less productive employees? How do you quantify the “soft” costs of passing work on to information workers? The effort stalled in academia, but I learned a few key things. These points underline the nature of an information worker and the factors that impact their productivity.

  1. Information workers need data…and lots of it
  2. Information workers use applications to view and manipulate data to get the job done
  3. Degradation, latency, or poor ease of use in either item 1 or item 2 has a direct impact on productivity
  4. Items 1 and 2 have a direct correlation to training cost, output, and (wait for it) employee health and retention

It’s time to make a super bold statement.  It’s time to maximize your investment in DATA. And past time to de-emphasize investments in applications!  Stated another way, applications come and go, but data lives forever.

My five-year-old son is addicted to his iPad. He’s had one since he was one year old. At about the age of three he had pretty much left off playing Angry Birds. He started reading Wikipedia. He started downloading apps from the App Store. He wanted to learn about string theory, astrophysics, and plate tectonics. Now, he scares me a little with his knowledge. I call him my little Sheldon Cooper. The apps he uses for research are so cool. The way they present the data, the speed and the depth, are amazing. As soon as he’s mastered one, he’s on to the next one. It won’t be long before he wants to program his own apps. When that day comes, I’ll do whatever it takes to make him successful.

And he’s not alone.  The world of the “selfie-generation” is one of rapid speed.  It is one of application proliferation and flat out application “coolness”.  High school students are learning iOS programming.  They are using cloud infrastructure to play games and run experiments.  Anyone under the age of 27 has been raised in a mélange of amazing data-fueled computing and mobility.

This is your new workforce. And on the first day of their new career at an insurance company or large bank, they are handed an aging, recycled workstation, an old operating system, and mainframe terminal sessions. Then come rich-client and web apps circa 2002. And lastly (heaven forbid), a BlackBerry. Do you wonder whether that employee will feel empowered and productive? I’ll tell you now: they won’t. All the passion they have for viewing and interacting with information will disappear, because it will not be enabled in their new workday. An outright information worker revolution would not surprise me.

And that is exactly why I say that it’s time to focus on data and not on applications.  Because data lives on as applications come and go.  I am going to coin a new phrase.  I call this the Empowered Selfie Formula.  The Empowered Selfie Formula is a way in which the focus on data liberates information workers.  They become free to be more productive in today’s technology ecosystem.

Enable a BYO* Culture

Many organizations have been experimenting with Bring Your Own Device (BYOD) programs: corporate stipends that allow employees to buy the computing hardware of their choice. But let’s take that one step further. How about a Bring Your Own Application program? How about a Bring Your Own Codebase program? The idea is not so far-fetched. There are so many great applications for working with information. Today’s generation is learning to code applications at a rapid pace. They are keen to implement their own processes and tools to “get the job done”. It’s time to embrace that change. Allow your information workers to be productive with their chosen devices and applications.

Empower Social Sharing

Your information workers are now empowered with their own flavors of device and application productivity. Let them share it. The ability to share successes, great insights, and great apps is ingrained in the mindset of today’s technology users. Companies like Tableau have become successful by democratizing business intelligence. Through social sharing, users can celebrate their successes and cool apps with colleagues. This raises overall productivity as a grassroots movement, and communities of best practice begin to emerge, creating innovation where it was not previously seen.

Measure Productivity

As an organization it is important to measure success.  Find ways to capture key metrics in productivity of this new world of data-fueled information work.  Each information worker will typically be able to track trends in their output.  When they show improvement, celebrate that success.

Invest in “Cool”

With a new BYO* culture, make investments in cool new things. Allow users to spend a few dollars here and there on training, online or in person, where they can learn new things that will make them more productive. It will also help with employee retention. With a small investment, a larger ROI can be realized in employee health and productivity.

Foster Healthy Competition

Throughout history, civilizations that fostered healthy competition have innovated faster.  The enterprise can foster healthy competition on metrics.  Other competition can be focused on new ways to look at information, valuable insights, and homegrown applications.  It isn’t about a “best one wins” competition.  It is a continuing round of innovation winners with lessons learned and continued growth.  These can also be centered on the social sharing and community aspects.  In the end it leads to a more productive team of information workers.

Revitalize Your Veterans

Naturally those information workers who are a little “longer in the tooth” may feel threatened.  But this doesn’t need to be the case.  Find ways to integrate them into the new culture.  Do this through peer training, knowledge transfer, and the data items listed below.  In the best of cases, they too will crave this new era of innovation.  They will bring a lot of value to the ecosystem.

There is a catch.  In order to realize success in the formula above, you need to overinvest in data and data infrastructure.  Perhaps that means doing things with data that only received lip service in the past.  It is imperative to create a competency or center of excellence for all things data.  Trusting your data centers of excellence activates your Empowered Selfie Formula.

Data Governance

You are going to have users building new apps and processing data and information in new and evolving ways. This means you need to trust your data, so data governance becomes more important. Everything from metadata, data definitions, standards, and policies to glossaries needs to be developed so that the data being looked at can be trusted. Chief Data Officers should put a data governance competency center in place, and all data feeding into and coming from new applications should be inspected regularly for adherence to corporate standards. Remember, it’s not about the application. It’s about what feeds any application and what data is generated.

Data Quality

Data quality is very much a part of data governance, and it too must adhere to corporate standards. These standards should dictate cleanliness, completeness, fuzzy-logic matching, and standardization. Nothing frustrates an information worker more than building the coolest app only to have it do nothing because of poor quality data.
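
As a rough illustration of such standards, a completeness score plus fuzzy-logic standardization against a canonical list, here is a minimal sketch. The canonical values and the similarity cutoff are illustrative assumptions, not a prescribed implementation:

```python
# A minimal sketch of data quality standards: a completeness score plus
# fuzzy-logic standardization against a canonical list. The canonical
# values and the 0.8 similarity cutoff are illustrative assumptions.
import difflib

CANONICAL_DEPTS = ["Finance", "Accounting", "Underwriting"]

def standardize_dept(raw, cutoff=0.8):
    """Map free-text input to a canonical value, or None if no close match."""
    match = difflib.get_close_matches(
        raw.strip().title(), CANONICAL_DEPTS, n=1, cutoff=cutoff
    )
    return match[0] if match else None

def completeness(record, required=("name", "dept")):
    """Fraction of required fields that are present and non-empty."""
    return sum(1 for f in required if record.get(f)) / len(required)

rec = {"name": "J. Smith", "dept": "finanse"}  # typo in a source system
print(standardize_dept(rec["dept"]))  # Finance
print(completeness(rec))              # 1.0
```

Records that fall below a completeness threshold or fail to standardize would be routed back for remediation rather than passed on to information workers.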

Data Availability

Data needs to be in the right place at the right time.  Any enterprise data takes a journey from many places and to many places.  Movement of data that is governed and has met quality standards needs to happen quickly.  We are in a world of fast computing and massive storage.  There is no excuse for not having data readily available for a multitude of uses.

Data Security

And finally, make sure to secure your data. Regardless of the application consuming your information, there may be people who shouldn’t see the data. Access control, data masking, and network security need to be in place so that each application, from Microsoft Excel to Informatica Springbok to Tableau to a home-grown iOS application, interacts only with the information it should see.

The changing role of the IT group will follow close behind. IT will essentially become the data-fueled enabler, using the principles above, and will provide the infrastructure necessary to enable the Empowered Selfie Formula. IT will no longer be in the application business, aside from a few core corporate applications as a necessary evil.

Once you achieve competency in the items above, you no longer need to worry about the success of the Empowered Selfie Formula. What you will have is a truly data-fueled enterprise, with a new class of information workers enabled by a data-fueled competency. Informatica is thrilled to be an integral part of the role that data can play in your journey. We are energized to see the pervasive use of data by increasing numbers of information workers, who are creating new and better ways to do business. Come and join a data-fueled world with Informatica.

Posted in Business Impact / Benefits, Data First, Data Governance, Data Quality, Enterprise Data Management

Securing Sensitive Data in Test and Development Environments


Do you use copies of production data in test and development environments? This is common practice in IT organizations. For this reason, test environments have become the number one target for outside intruders. That being said, most data breaches occur when non-malicious insiders accidentally expose sensitive data in the course of their daily work. Insider data breaches can be more serious and harder to detect than intruder events.

If you use production data in test and development environments, or are looking for alternative approaches, register for the first webinar in a three-part series on data security gaps and remediation. On December 9th, Adrian Lane, Security Analyst at Securosis, will join me to discuss security for test environments.

This webinar will focus on how data-centric security can be used to shore up vulnerabilities in one of the key focus areas: test and development environments. It is common practice for non-production database environments to be created by making copies of production data, which potentially exposes sensitive and confidential production data to developers, testers, and contractors alike. Commonly, 6-10 copies of production databases are created for each application environment, and they are regularly provisioned to support development, testing, and training efforts. Since the security controls deployed for the source database are not replicated in the test environments, this is a glaring hole in data security and a target for external or internal exploits.
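
As a rough illustration of the data-centric approach the webinar covers, here is a minimal, format-preserving masking sketch applied before a production copy is provisioned to test. This is illustrative only, not Informatica's actual masking engine:

```python
# Illustrative, format-preserving masking of a sensitive field before a
# production copy is handed to test/dev. The transform is deterministic,
# so the same input masks to the same surrogate across refreshes.
# This is a sketch, not Informatica's actual masking engine.
import hashlib

def mask_ssn(ssn: str) -> str:
    """Replace an SSN with a same-format surrogate derived from a hash."""
    digest = hashlib.sha256(ssn.encode()).hexdigest()
    digits = "".join(str(int(c, 16) % 10) for c in digest[:9])
    return f"{digits[:3]}-{digits[3:5]}-{digits[5:9]}"

prod_row = {"name": "Pat Doe", "ssn": "123-45-6789"}
test_row = {**prod_row, "ssn": mask_ssn(prod_row["ssn"])}
print(test_row["ssn"])  # same xxx-xx-xxxx shape, different digits
```

A production-grade implementation would use a salted or keyed transform so surrogates cannot be linked back to real values, but the principle is the same: developers and testers work with realistic-looking data that carries no sensitive content.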

In this webinar, we will cover:

  • Key trends in enterprise data security
  • Vulnerabilities in non-production application environments (test and development)
  • Alternatives to consider when protecting test and development environments
  • Priorities for enterprises in reducing attack surface for their organization
  • Compliance and internal audit cost reduction
  • Data masking and synthetic data use cases
  • Informatica Secure Testing capabilities

Register for the webinar today at http://infa.media/1pohKov. If you cannot attend the live event, be sure to watch the webinar on-demand.

Posted in Application ILM, Business Impact / Benefits, Data Security, Data Services