Tag Archives: customer data integration

How Much Does Bad Data Cost Your Business?

Bad data is bad for business. Ovum Research reported that poor quality data costs businesses at least 30% of revenue. Never before have business leaders across such a broad range of roles so clearly recognized the importance of using high quality information to drive business success. Leaders in functions ranging from marketing and sales to risk management and compliance have invested in world-class applications, six sigma processes, and the most advanced predictive analytics. So why are you not seeing more return on that investment? Simply put, if your business-critical data is a mess, the rest doesn’t matter.

Dennis Moore explains the implications of using inaccurate, inconsistent and disconnected data and the value business leaders can gain by mastering it.

Not all business leaders know there’s a better way to manage their business-critical data. So, I asked Dennis Moore, the senior vice president and general manager of Informatica’s MDM business, who clocked hundreds of thousands of airline miles last year visiting business leaders around the world, to talk about the impact of using accurate, consistent and connected data and the value business leaders can gain through master data management (MDM).

Q. Why are business leaders focusing on business-critical data now?
A. Leaders have always cared about their business-critical data, the master data on which their enterprises depend most — their customers, suppliers, the products they sell, the locations where they do business, the assets they manage, the employees who make the business perform. Leaders see the value of having a clear picture, or “best version of the truth,” describing these “master data” entities. But this clear picture is hard to come by amid competing priorities, mergers and acquisitions, and siloed systems.

As companies grow, business leaders start realizing there is a huge gap between what they do know and what they should know about their customers, suppliers, products, assets and employees. Even worse, most businesses have lost the ability to understand the relationships between business-critical data that would let them improve business outcomes. Line-of-business leaders have been asking questions such as:

  • How can we optimize sales across channels when we don’t know which customers bought which products from which stores, sites or suppliers?
  • How can we quickly execute a recall when we don’t know which supplier delivered a defective part to which factory and where those products are now?
  • How can we accelerate time-to-market for a new drug, when we don’t know which researcher at which site used which combination of compounds on which patients?
  • How can we meet regulatory reporting deadlines, when we don’t know which model of a product we manufactured in which lot on which date?

Q. What is the crux of the problem?
A. The crux of the problem is that as businesses grow, their business-critical data becomes fragmented. There is no big picture because it’s scattered across applications, including on-premises applications (such as SAP, Oracle and PeopleSoft) and cloud applications (such as Salesforce, Marketo, and Workday). But it gets worse. Business-critical data changes all the time. For example,

  • a customer moves, changes jobs, gets married, or changes their purchasing habits;
  • a supplier moves, goes bankrupt or acquires a competitor;
  • you discontinue a product or launch a new one; or
  • you onboard a new asset or retire an old one.

As all this change occurs, business-critical data becomes inconsistent, and no one knows which application has the most up-to-date information. This costs companies money. It saps productivity and forces people to do a lot of manual work outside their best-in-class processes and world-class applications. One question I always ask business leaders is, “Do you know how much bad data is costing your business?”

Q. What can business leaders do to deal with this issue?
A. First, find out where bad data is having the most significant impact on the business. It’s not hard – just about any employee can share stories of how bad data led to a lost sale, an extra “truck roll,” lost leverage with suppliers, or a customer service problem. From the call center to the annual board planning meeting, bad data results in sub-optimal decisions and lost opportunities. Work with your line of business partners to reach a common understanding of where an improvement can really make a difference. Bad master data is everywhere, but bad master data that has material costs to the business is a much more pressing and constrained problem. Don’t try to boil the ocean or bring a full-blown data governance maturity level 5 approach to your organization if it’s not already seeing success from better data!

Second, focus on the applications and processes used to create, share, and use master data. Often, some training, a tweak to a process, or a new interface between systems can deliver very significant improvements for users without major IT work or process changes.

Lastly, look for a technology that is purpose-built to deal with this problem. Master data management (MDM) helps companies better manage business-critical data in a central location on an ongoing basis and then share that “best version of the truth” with all on-premises and cloud applications that need it.

Let’s use customer data as an example. If valuable customer data is located in applications such as Salesforce, Marketo, Siebel CRM, and SAP, MDM brings together all the business-critical data, the core that’s the same across all those applications, and creates the “best version of the truth.” It also creates the total customer relationship view across functions, product lines and regions, which CRM promised but never delivered.

MDM then shares that “mastered” customer data and the total customer relationship view with the applications that want it. MDM can be used to master the relationships between customers, such as legal entity hierarchies. This helps sales and customer service staff be more productive, while also improving legal compliance and management decision making. Advanced MDM products can also manage relationships across different types of master data. For example, advanced MDM enables you to relate an employee to a project to a contract to an asset to a commission plan. This ensures accurate and timely billing, effective expense management, managed supplier spend, and even improved workforce deployment.

When your sales team has the best possible customer information in Salesforce and the finance team has the best possible customer information in SAP, no one wastes time pulling together spreadsheets of information outside of their world-class applications. Your global workforce doesn’t waste time investigating whether Jacqueline Geiger in one system and Jakki Geiger in another system are one customer or two, or sending multiple bills and marketing offers at a high cost in postage and customer satisfaction. All employees who have access to mastered customer information can be confident they have the best possible customer information available across the organization to do their jobs. And with the most advanced and intelligent data platform, all this information can be secured so only authorized employees, partners, and systems have access.
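Deciding whether “Jacqueline Geiger” and “Jakki Geiger” are the same customer is, at its core, a record-matching problem. A minimal sketch of the idea in Python — this is not Informatica’s matching engine; the field weights and the 0.75 threshold are invented for illustration:

```python
from difflib import SequenceMatcher

# Hypothetical records for the same customer held in two different systems.
crm_record = {"name": "Jacqueline Geiger", "zip": "94063", "email": "jgeiger@example.com"}
erp_record = {"name": "Jakki Geiger", "zip": "94063", "email": "jgeiger@example.com"}

def match_score(a, b):
    """Weighted similarity across fields; exact email and zip matches dominate."""
    name_sim = SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio()
    email_sim = 1.0 if a["email"] == b["email"] else 0.0
    zip_sim = 1.0 if a["zip"] == b["zip"] else 0.0
    return 0.4 * name_sim + 0.4 * email_sim + 0.2 * zip_sim

# Above an (invented) threshold, the two records can be merged into one
# "best version of the truth" master record.
print(match_score(crm_record, erp_record) >= 0.75)  # → True
```

A real MDM hub adds survivorship rules on top of matching — deciding which system’s value wins for each field of the merged master record.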

Q. Which industries stand to gain the most from mastering their data?
A. In every industry there is some transformation going on that’s driving the need to know people, places and things better. Take insurance for example. Similar to the transformation in the travel industry that reduced the need for travel agents, the insurance industry is experiencing a shift from the agent/broker model to a more direct model. Traditional insurance companies now have an urgent need to know their customers so they can better serve them across all channels and across multiple lines of business.

In other industries, there is an urgent need to get a lot better at supply-chain management or to accelerate new product introductions to compete better with an emerging rival. Business leaders are starting to make the connection between transformation failures and a more critical need for the best possible data, particularly in industries undergoing rapid transformation, or with rapidly changing regulatory requirements.

Q. Which business functions seem most interested in mastering their business-critical data?
A. It varies by industry, but there are three common threads that seem to span most industries:

  • MDM can help the marketing team optimize the cross-sell and up-sell process with high quality data about customers, their households or company hierarchies, the products and services they have purchased through various channels, and the interactions their organizations have had with these customers.
  • MDM can help the procurement team optimize strategic sourcing, including supplier spend management and supplier risk management, with high quality data about suppliers, company hierarchies, contracts and the products they supply.
  • MDM can help the compliance teams manage all the business-critical data they need to create regulatory reports on time without burning the midnight oil.

Q. How is the use of MDM evolving?
A. When MDM technology was first introduced a decade ago, it was used as a filter. It cleaned up business-critical data on its way to the data warehouse so you’d have clean, consistent, and connected information (“conformed dimensions”) for reporting. Now business leaders are investing in MDM technology to ensure that all of their global employees have access to high quality business-critical data across all applications. They believe high quality data is mission-critical to their operations. High quality data is viewed as the lifeblood of the company and will enable the next frontier of innovation.

Second, many companies mastered data in only one or two domains (customer and product), and used separate MDM systems for each. One system was dedicated to mastering customer data — you may recall the term Customer Data Integration (CDI). Another system was dedicated to mastering product data. Because the two systems were in silos and business-critical data about customers and products wasn’t connected, they delivered limited business value. Since that time, business leaders have questioned this approach, because business problems don’t confine themselves to one type of data, such as customer or product, and many of the benefits of mastering data come from mastering other domains, including supplier, chart of accounts, employee and other master or reference data shared across systems.

The relationships between data matter to the business. Knowing which customer bought what from which store or site is more valuable than just knowing your customer. The business insights you can gain from these relationships are limitless. Over 90% of our customers last year bought MDM because they wanted to master multiple types of data. Our customers value having all types of business-critical data in one system to deliver clean, consistent and connected data to their applications to fuel business success.

One last evolution we’re seeing a lot involves the types and numbers of systems connecting to the master data management system. In the past, there were a small number of operational systems pushing data through the MDM system into a data warehouse used for analytical purposes. Today, we have customers with hundreds of operational systems communicating with each other via an MDM system that has just a few milliseconds to respond, and which must maintain the highest levels of availability and reliability of any system in the enterprise. For example, one major retailer manages all customer information in the MDM system, using the master data to drive real-time recommendations as well as a level of customer service in every interaction that remains the envy of their industry.

Q. Dennis, why should business leaders consider attending MDM Day?
A. Business leaders should consider attending MDM Day at InformaticaWorld 2014 on Monday, May 12, 2014. You can hear first-hand the business value companies are gaining by using clean, consistent and connected information in their operations. We’re excited to have fantastic customers who are willing to share their stories and lessons learned. We have presenters from St. Jude Medical, Citrix, Quintiles and Crestline Geiger and panelists from Thomson Reuters, Accenture, EMC, Jones Lang LaSalle, Wipro, Deloitte, AutoTrader Group, McAfee-Intel, AbbVie, Infoverity, Capgemini, and Informatica among others.

Last year’s Las Vegas event, and the events we held in London, New York and São Paulo, were extremely well received. This year’s event is packed with even more customer sessions and opportunities to learn and to influence our product road map. MDM Day is one day before InformaticaWorld and is included in the cost of your InformaticaWorld registration. We’d love to see you there!

See the MDM Day Agenda.

Posted in Business Impact / Benefits, Business/IT Collaboration, Customers, Data Quality, Informatica World 2014, Master Data Management

The Need for Specialized SaaS Analytics

SaaS companies are growing rapidly and becoming a top priority for most CIOs. With such high growth expectations, many SaaS vendors invest heavily in sales and marketing to acquire new customers, even if it means a negative net profit margin as a result. Moreover, with the pressure to grow rapidly, there is increased urgency to raise the Average Sales Price (ASP) of every transaction in order to meet revenue targets.

The nature of the cloud allows these SaaS companies to release new features every few months, which sales reps can then promote to new customers. When new functionality is neither used nor understood, customers often feel that they have overpaid for a SaaS product. In such cases, customers usually downgrade to a lower-priced edition or, worse, leave the vendor entirely. To make up for this loss, sales representatives must work harder to acquire new leads, which results in less attention for existing customers. Preventing customer churn is therefore critical: the Cost to Acquire a Customer (CAC) for upsell revenue is only 19% of the cost of acquiring the same revenue from new customers, and the CAC to renew existing customers is only 15%.

Accurate customer usage data helps determine which features customers use and which are underutilized. Gathering this data can help pinpoint high-value features that are not used, especially for customers who have recently upgraded to a higher edition. The process of collecting this data involves several touch points — from recording clicks within the app to analyzing the open rate of entire modules. This is where embedded cloud integration comes into play.
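As a toy illustration of the idea, feature-level usage can be reduced to per-account click counts and compared against the features that justify a premium edition. The event shape, feature names, and the notion of a “premium feature” set are all invented here; a real SaaS app would stream these events through its integration layer:

```python
from collections import Counter

# Hypothetical click events captured inside a SaaS app.
events = [
    {"account": "acme", "feature": "reports"},
    {"account": "acme", "feature": "reports"},
    {"account": "acme", "feature": "dashboards"},
]

# Features that gate the higher-priced edition (invented for this sketch).
premium_features = {"dashboards", "forecasting"}

usage = Counter(e["feature"] for e in events if e["account"] == "acme")
unused_premium = {f for f in premium_features if usage[f] == 0}

# A recently upgraded account that never opens a premium feature is a churn-risk signal.
print(unused_premium)  # → {'forecasting'}
```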

Embedding integration within a SaaS application allows vendors to gain operational insight into each aspect of how their app is being used. With this data, vendors can feed improvement ideas back to product management. Additionally, embedded integration can alert the customer success management team to potential churn, allowing them to take preventative measures.

To learn more about how a specialized analytics environment can be set up for SaaS apps, join Informatica and Gainsight on April 9th at 10am PDT for an informational webinar Powering Customer Analytics with Embedded Cloud Integration.

Posted in Cloud Computing, Data Integration, SaaS

Agile Development tools deliver up to 5x faster data integration development

When I was seven years old, Danny Weiss had a birthday party where we played the telephone game. The idea is this: eight people sit around a table, and the first person tells the next person a little story. That person tells the next person the story, and so on, all the way around the room. At the end of the game, you compare the original story the first person told with the story the eighth person tells. Of course, the stories are very different and everyone giggles hysterically… we were seven years old, after all.

The reason I was thinking about this story is that data integration development is about as inefficient as a seven-year-old’s birthday party. The typical process is that a business analyst, using the knowledge in their head about the business applications they are responsible for, creates a spreadsheet in Microsoft Excel listing database tables and columns along with a set of business rules for how the data is to be transformed as it moves to a target system (a data warehouse or another application). The spreadsheet, which is never checked against real data, is then passed to a developer, who creates code in a separate system to move the data, which is then checked by a QA person and then checked again by the business analyst at the end of the process. This is the first time the business analyst verifies the specification against real data.

99 times out of 100, the data in the target system doesn’t match what the business analyst was expecting. Why? Either the original specification was wrong because the business analyst made a typo, or the data is inaccurate. Or the data in the original system wasn’t organized the way the analyst thought it was. Or the developer misinterpreted the spreadsheet. Or the business analyst simply doesn’t need this data anymore — he needs some other data. The result is lots of errors, just like the telephone game. And the only way to fix it is with rework, and then more rework.

But there is a better way. What if the data analyst could validate the specification against real data and self-correct on the fly before passing it to the developer? What if the specification were not just a specification, but a prototype that could be passed directly to the developer, who wouldn’t recode it but would simply modify it to add scalability and reliability? The result is much less rework and much faster time to development. In fact, up to 5 times faster.

That is what Agile Data Integration is all about: rapid prototyping and self-validation against real data up front by the business analyst, and sharing results back and forth with the developer in a common toolset to improve the accuracy of communication.

Because we believe the agile process is so important to your success, Informatica is giving all of our PowerCenter Standard Edition (and higher editions) customers agile data integration for FREE!!! That’s right, if you are a current customer of Informatica PowerCenter, we are giving you the tools you need to go from the old-fashioned, error-prone, waterfall, telephone-game style of development to a modern 21st-century Agile process:

  • FREE rapid prototyping and data profiling for the data analyst.
  • Go from prototype to production with no recoding.
  • Better communication and collaboration between analyst and developer.

PowerCenter 9.6. Agile Data Integration built in. No more telephone game. It doesn’t get any better than that.

Posted in Data Integration

History Repeats Itself Through Business Intelligence (Part 2)

In a previous blog post, I wrote about how, when business “history” is reported via Business Intelligence (BI) systems, it’s usually too late to make a real difference. In this post, I’m going to talk about how business history becomes much more useful when combined operationally and in real time.

E. P. Thompson, a historian, pointed out that all history is the history of unintended consequences. His theory was that history is not always recorded in documents, but is ultimately derived from examining cultural meanings and the structures of society — through hermeneutics (the interpretation of texts), semiotics, and the many signs of the times. He concluded that history is created by people’s subjectivity and therefore ultimately represents how they REALLY live.

The same can be extrapolated for businesses. However, the BI systems of today capture only a minuscule piece of the larger pie of knowledge that may be gained from things like meetings, videos, sales calls, anecdotal win/loss reports, shadow IT projects, 10-Ks and 10-Qs, even company blog posts ;-)   The point is: how can you better capture the essence, the meaning, and perhaps the importance of the everyday non-database events taking place in your company and its activities — in other words, how it REALLY operates?

One of the keys to figuring out how businesses really operate is identifying and utilizing the undocumented RULES that underlie almost every business. Select employees, often veterans, know these rules intuitively. Watch them — every company has them — and you’ll see they just have a knack for getting projects pushed through the system, or making customers happy, or diagnosing a problem in a short time with little fanfare. They just know how things work and what needs to be done.

These rules have been, and still are, difficult to quantify and apply — or “data-ify,” if you will. Certain companies (and hopefully Informatica) will end up being major players in the race to data-ify these non-traditional rules and events, in addition to helping companies make sense of big data in a whole new way. Daydreaming about it, it’s not hard to imagine business systems that will eventually understand the optimization rules of a business, account for possible unintended scenarios or consequences, and then apply them at the moment they are most needed. That, anyhow, is the goal of a new generation of Operational Intelligence systems.

In my final post on the subject, I’ll explain in a nutshell how it works and the business problems it solves. And if I’ve managed to pique your curiosity and you want to hear about Operational Intelligence sooner, tune in to a webinar we’re having TODAY at 10 AM PST. Here’s the link.

http://www.informatica.com/us/company/informatica-talks/?commid=97187

Posted in Big Data, Business Impact / Benefits, Business/IT Collaboration, CIO, Complex Event Processing, Data Integration Platform, Operational Efficiency, Real-Time, SOA, Ultra Messaging

History Repeats Itself Through Business Intelligence (Part 1)

Unlike some of my friends, I truly enjoyed History as a subject in high school and college. I particularly appreciated biographies of favorite historical figures because they painted a human face on the past and gave it meaning and color. I also vowed at that time to navigate my life and future by the principle attributed to Harvard professor Jorge Agustín Nicolás Ruiz de Santayana y Borrás — George Santayana — that goes, “Those who cannot remember the past are condemned to repeat it.”

So that’s a little ditty about my history with history.

Fast-forwarding to the present, in which I have carved out my career in technology, and in particular enterprise software, I’m afforded a great platform where I talk to lots of IT and business leaders. When I do, I usually ask them, “How are you implementing advanced projects that help the business become more agile or effective or opportunistically proactive?” They usually answer something along the lines of “this is the age and renaissance of data science and analytics” and then end up talking exclusively about their meat-and-potatoes business intelligence software projects and how 300 reports now run their business.

Then when I probe their answers in more depth, I am once again reminded of THE history quote and think to myself that there’s an amusing irony at play here. When I think about the Business Intelligence systems of today, most are designed to “remember” and report on the historical past through large data warehouses of a gazillion transactions, along with basic but numerous shipping and billing histories and maybe assorted support records.

But when it comes right down to it, business intelligence “history” is still just that.  Nothing is really learned and applied right when and where it counted – AND when it would have made all the difference had the company been able to react in time.

So, in essence, by using standalone BI systems as they are designed today, companies are indeed condemned to repeat what they have already learned because they are too late – so the same mistakes will be repeated again and again.

This means the challenge for BI is to reduce latency, measure the pertinent data / sensors / events, and get scalable – extremely scalable and flexible enough to handle the volume and variety of the forthcoming data onslaught.

There’s a part 2 to this story, so keep an eye out for my next blog post, History Repeats Itself (Part 2).

Posted in Big Data, Complex Event Processing, Data Integration, Data Integration Platform, Data Warehousing, Real-Time, Uncategorized

Integration Gives You Muscle. Data Integration with Event Processing Gives You 6-Pack Abs.

Everyone knows that Informatica is the Data Integration company that helps organizations connect their disparate software into a cohesive and synchronized enterprise information system. The value to business is enormous and well documented in the form of use cases, ROI studies and industry-leading loyalty and renewal rates.

Event Processing, on the other hand, is a technology that has been around for only a few years and has yet to reach Main Street in Systems City, IT. But if you look at how event processing is being used, it’s amazing that more people haven’t heard about it. The idea at its core (pun intended) is very simple: monitor your data and events — the things that happen on a daily, hourly, even minute-by-minute basis — look for important patterns that are positive or negative indicators, and then set up your systems to automatically take action when those patterns come up. For example, notify a sales rep when a pattern indicates a customer is ready to buy, or stop a transaction because your company is about to be defrauded.
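The monitor-for-a-pattern-then-act loop described above can be sketched in a few lines of Python. This is a toy illustration, not a CEP engine — the rule (three failed logins within 60 seconds triggers an action) and the event shape are invented:

```python
from collections import deque

WINDOW_SECONDS = 60   # invented sliding-window size
THRESHOLD = 3         # invented number of events that constitutes a pattern

class FailedLoginMonitor:
    """Watches a stream of failed-login events and flags suspicious bursts."""

    def __init__(self):
        self.recent = {}  # account -> deque of event timestamps

    def observe(self, account, timestamp):
        q = self.recent.setdefault(account, deque())
        q.append(timestamp)
        # Drop events that have slid out of the window.
        while q and timestamp - q[0] > WINDOW_SECONDS:
            q.popleft()
        # True means "take action now": alert someone, block the transaction, etc.
        return len(q) >= THRESHOLD

monitor = FailedLoginMonitor()
print(monitor.observe("acct-1", 0))   # → False
print(monitor.observe("acct-1", 10))  # → False
print(monitor.observe("acct-1", 20))  # → True: three failures within 60 seconds
```

The same skeleton handles positive patterns too — swap the failed-login rule for, say, three product-page visits in one session to trigger a sales notification.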

Since this is an Informatica blog, you probably have a decent set of “muscles” in place already — so why, you ask, would you need 6-pack abs? Because 6-pack abs are a good indication of a strong core musculature and are the basis of a stable and highly athletic body. The same parallel holds for companies, because in today’s competitive business environment you need strength, stability, and agility to compete. And since IT systems increasingly ARE the business, if your company isn’t performing as strong, lean, and mean as possible, you can be sure your competitors will be looking to implement every advantage they can.

You may also be thinking: why would you need something like Event Processing when you already have good Business Intelligence systems in place? The reality is that it’s not easy to monitor and measure useful but sometimes hidden data, event, sensor and social media sources, or to discern which patterns have meaning and which turn out to be false positives. But the real difference is that BI usually reports to you after the fact, when the value of acting on the situation has diminished significantly.

So while muscles are important to stand up and run, and strong, good-quality muscles are necessary for heavy lifting, it’s the 6-pack abs on top of it all that give you the lean, mean fighting machine — the ability to identify significant threats and opportunities in your data and, in essence, to better compete and win.

Posted in CIO, Complex Event Processing, Data Integration, Data Integration Platform, Operational Efficiency, Real-Time

3 Big Data Secrets that can ignite a Product Launch

Customers don’t always like change, and a new product launch brings a variety of changes, so it’s important to showcase the value of those changes for customers while launching a product. One key ingredient that can fuel a successful product launch is leveraging the rich, varied, multi-sourced information that is available more easily and readily than ever before. Industry experts call it Big Data, and it is a gold mine: today, Big Data can pull gold out of this mine of information and positively impact a product launch. What follows are 3 secrets of how Product Marketers can tap the power of Big Data for a successful product launch.

Secret #1: Use Big Data to optimize content strategy and targeted messaging

The main challenge is not just to create a great product but also to communicate its clear, compelling value to your customers. You need to speak the language that resonates with the needs and preferences of customers. Through social media platforms and weblogs, lots of information is available highlighting the views and preferences of buyers. Big Data brings all these data points together from various sources and unlocks them to provide customer intelligence. Product Marketers can leverage this intelligence to create customer segmentation and targeted messaging.

Secret #2: Use Big Data to identify influential customers and incent them to influence others

A study by Forrester Research indicates that today your most valuable customer may be one who buys little but influences 100 others to buy via blogs, tweets, Facebook and online product reviews. Using MDM with Big Data, businesses can create a 360-degree customer profile by integrating transaction, social interaction and weblog data, which helps identify influential customers. Companies can engage these influential customers early by initiating a soft launch or beta testing of their product.
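Once transactions and social data are keyed to the same mastered customer ID, the “buys little but influences many” profile falls out of a simple join. A toy sketch with invented data and thresholds:

```python
# Hypothetical per-customer figures, already linked to one master customer ID.
purchases = {"cust-1": 1200.00, "cust-2": 45.00}   # lifetime spend in dollars
followers = {"cust-1": 40, "cust-2": 9500}         # social reach

# Invented rule: low spend, high reach → candidate for early beta access.
influencers = [
    cust for cust, spend in purchases.items()
    if spend < 100 and followers.get(cust, 0) > 1000
]
print(influencers)  # → ['cust-2']
```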

Secret #3: Use Big data to provide direction to ongoing Product improvement

Big Data is also a useful tool for monitoring ongoing product performance and keeping customers engaged post-launch. Insights into how customers are using the product and what they enjoy most can open the doors for improvements in future launches, resulting in happier, more loyal customers.

Zynga, creator of the popular Facebook game FarmVille, collects terabytes of data a day and analyzes it to improve game features and customer service. As indicated in a WSJ article, after the Version 1 launch of the game the company analyzed customer behavior and found that customers were interacting with animals much more than the designers expected. So in the second release, game designers expanded the game’s offerings with more focus on animals, keeping customers more engaged.

Key Takeaway:

Big data is proving to be a game changer for product managers and marketers who want to deeply engage with their customers and launch products with a memorable and valued customer experience.

Posted in Business Impact / Benefits, Data Integration, Master Data Management

Five Examples Of How Master Data Management (MDM) Helps Integrate Social Media Data Into Your Business

Are you trying to figure out how to integrate social media data into your business?

A recent poll, taken during a webinar on The Power of Social & Mobile Data, revealed almost 50% of respondents are trying to integrate social media data into their business. Half indicated that the Marketing executive was most interested. For almost 20 percent it’s the Product Development or Merchandising executive. (more…)

Posted in Big Data, Master Data Management

Start Running Because The Data Tsunami Is Approaching

The phrase ‘Data Tsunami’ has been used by numerous authors in the last few months, and it’s difficult to find a more suitable analogy, because what’s approaching is of such an increased order of magnitude that the IT industry’s established expectations for data growth will be swamped in the next few years.
However impressive a spectacle a tsunami is, it still wreaks havoc on those who are unprepared or believe they can tread water and simply float to the surface when the trouble has passed.

(more…)

Posted in B2B, B2B Data Exchange, Big Data, Data Transformation, Telecommunications, Uncategorized

Customer Data Management – Time for a Reboot?

I have been developing two ideas for customer data management: entities vs. roles and differentiation. In the last post I suggested that customer is not a data type, but rather a role that can be played by some core entity in some context, with some set of characteristics assigned to that role within that context. (more…)

Posted in Uncategorized