(Re)Thinking Data Security Strategy

Data security is usually something people only think about when they get hacked, when a company they do business with gets hacked, or when they lose their credit card or wallet. It is human nature not to worry about things you cannot see and that seem to be well in hand. Instead, I would suggest every company (and person) take just 15 minutes once a month to think about the items below, each of which needs to be part of a data security strategy.

Data security is a complex issue with many facets. I will skip past how you create and use passwords, as that is the one area that already gets plenty of attention. With SaaS and cloud-based technologies now widely used by companies and by people in their personal lives, it is time to take a few moments to consider just how that data is secured, or in some cases put at risk.

Data-centric security. Traditionally, enterprise security has focused on access: what can be accessed, from where, and by whom. The problem with this walled-garden approach is that the technologies and procedures behind it do not account for how data is actually used. Most data security programs are also badly outdated in a world where the majority of companies rely on systems they do not own or directly manage (e.g., SaaS, cloud, mobile) and where so many different types of data are being created by people, systems, and applications. Many enterprise security strategies need to move beyond a focus on access to include data usage and the ontology of the data being used.

Question: Does your company have a modern enterprise security strategy or a walled garden approach?

Data about data. Long ago, to make it easier to store, search, and retrieve data, people figured out that adding descriptive information about what is in a data file would be useful. The term for this is metadata, and it is no different from the label someone would put on a paper file folder before we moved everything to software-based storage. The problem is that metadata has grown rich enough to let people learn a great deal of personal, business, and proprietary information without ever getting access to the underlying file. The richer the metadata, the greater the business or personal risk of exposing information without actually exposing the underlying data.

Question: Are you accidentally exposing sensitive information in your metadata?
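
To make the risk concrete, here is a minimal Python sketch (using the Pillow library; the file name is a hypothetical placeholder) that dumps the EXIF tags a typical photo carries. Camera make and model, timestamps, software versions, and often GPS coordinates are sitting right there, with no need to see the "real" content of the image.

    # Minimal sketch: list the EXIF metadata embedded in an image file.
    # Assumes the Pillow library is installed; the path is illustrative.
    from PIL import Image
    from PIL.ExifTags import TAGS

    def dump_exif(path):
        """Print every EXIF tag stored inside the image."""
        exif = Image.open(path).getexif()
        for tag_id, value in exif.items():
            name = TAGS.get(tag_id, tag_id)   # e.g. Make, Model, DateTime, Software
            print(f"{name}: {value}")

    dump_exif("vacation_photo.jpg")  # hypothetical file name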

Data at rest. The old advice to keep your tax records for three years and then destroy them dates from a time when people stored everything in file cabinets, drawers, or under a mattress. Some people still like physical records, but for most people and companies data has been stored electronically for a long time. SaaS and cloud-based solutions add a new wrinkle, because the data is stored somewhere you do not necessarily have direct access to, and in many cases it is stored multiple times once it is archived or backed up. Even deleted data is often not really gone: with the right technology, data can be recovered if it was not fully wiped from the storage system that held it.

Question: Do you know where your data is stored? Archived? Backed up?

Question: Do you know how you would dispose of sensitive data that is no longer needed?
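
One common safeguard for both questions, sketched below on the assumption that the Python cryptography package is available, is to encrypt sensitive data before it lands on storage you do not fully control. Disposal then becomes "crypto-shredding": destroy the key, and every archived or backed-up copy of the ciphertext becomes unreadable at once.

    # Minimal sketch of encrypting data at rest with a symmetric key (Fernet).
    # The record contents are illustrative.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()       # keep this in a key management system, not next to the data
    cipher = Fernet(key)

    record = b"SSN=123-45-6789; salary=..."    # example sensitive payload
    stored_blob = cipher.encrypt(record)       # this is what lands on disk, in archives, in backups

    print(cipher.decrypt(stored_blob))         # readable only while the key exists;
                                               # deleting the key disposes of every copy at once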

In-flight data. No, this is not the Wi-Fi on the airplane. This is the data and metadata moving between applications in the regular course of business. The issue is that while data is being transmitted it can be at risk; this is one reason people are warned to be careful on public Wi-Fi, because any decent hacker can see the traffic on the network (yes, it really is that easy). A related enterprise issue is data cleansing to reduce duplicates and errors: how do you do this with sensitive data, such as HR or financial records, that you do not want developers or IT staff actually seeing?

Question: How does your company safeguard transactional and in-flight data?

Question: Does your company use data masking and cleansing technology to safeguard in-flight data?
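
As a rough illustration of what masking looks like before records are handed to a cleansing or deduplication job, here is a small Python sketch. The field names, formats, and masking rules are assumptions made for the example, not a description of any particular product.

    # Minimal sketch: obscure sensitive fields so developers and IT staff
    # never see the raw values while they clean or de-duplicate the data.
    import re

    SSN_PATTERN = re.compile(r"\b(\d{3})-(\d{2})-(\d{4})\b")

    def mask_record(record: dict) -> dict:
        """Return a copy of the record with the SSN and email obscured."""
        masked = dict(record)
        if "ssn" in masked:
            masked["ssn"] = SSN_PATTERN.sub(r"***-**-\3", masked["ssn"])
        if "email" in masked:
            local, _, domain = masked["email"].partition("@")
            masked["email"] = local[:1] + "***@" + domain
        return masked

    print(mask_record({"name": "Pat Lee", "ssn": "123-45-6789", "email": "pat.lee@example.com"}))
    # {'name': 'Pat Lee', 'ssn': '***-**-6789', 'email': 'p***@example.com'}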

Data. Yes, the actual data or information that you care about, or that you just store because it is so easy. I would recommend that companies look at their data holistically and think about it across its lifecycle. In this approach, risks should be identified for how the data is stored, used, transmitted, exposed internally or externally, and integrated or accessed. There are some new and interesting solutions coming to market that go beyond traditional data security, masking, and cleansing to help identify and assess data security risks, in an area called Security Intelligence. Security Intelligence solutions are meant to measure security risk and identify issues so that a) they can be addressed before they become a big problem, and b) automated procedures can be put in place to improve the level of security or bring a solution up to the desired level.

One example is a new solution from Informatica called Secure@Source, which is just coming to market. It is meant to provide automated analysis so that enterprises can determine their data risks, make improvements, and then put new policies and automated procedures in place to keep data security at the desired level. Similar solutions have been used for network security for years; these newer offerings take a comparable approach but address the more specific issues of data security.

Question: What is your company doing to proactively assess and manage data risk? Are you a good candidate for a security intelligence approach?

Data security is an important issue that every company should have a strategy for. While this is not meant to be an all-encompassing list, it is a good starting place for a discussion. Stay secure. Don't be the next company in the news with a data security issue.


There is Just One V in Big Data

According to Gartner, 64% of organizations surveyed have purchased or were planning to invest in Big Data systems. More and more companies are diving into their data, trying to put it to use to minimize customer churn, analyze financial risk, and improve the customer experience.

Of that 64%, 30% have already invested in Big Data technology, 19% plan to invest within the next year, and another 15% plan to invest within two years. Less than 8% of Gartner’s 720 respondents, however, have actually deployed Big Data technology. This is bad, because most companies simply don’t know what they’re doing when it comes to Big Data.

Over the years, we have heard that Big Data is Volume, Velocity, and Variety. I believe this limited view is one of the reasons why, despite the Big Data hype, most companies are still stuck in neutral.

  1. Volume: Terabytes to exabytes, petabytes to zettabytes of data
  2. Velocity: Streaming data, milliseconds to seconds; how fast data is produced, and how fast it must be processed to meet the need or demand
  3. Variety: Structured, unstructured, text, multimedia, video, audio, sensor data, meter data, HTML, e-mails, etc.

For us, the focus is on the collection of data. After all, we are prone to be hoarders, wired by our survival instinct to collect and stockpile for the leaner winter months that may come. So we hoard data, as much as we can, for the elusive "What if?" scenario: "Maybe this will be useful someday." It's this stockpiling of Big Data without application that makes it useless.

While Volume, Velocity, and Variety are focused on the collection of data, Gartner in 2014 introduced three additional Vs: Veracity, Variability, and Value, which focus on the usefulness of the data.

  1. Veracity: Uncertainty due to data inconsistency and incompleteness, ambiguities, latency, deception, model approximations, accuracy, quality, truthfulness, or trustworthiness
  2. Variability: The differing ways in which the data may be interpreted; different questions require different interpretations
  3. Value: Data for co-creation and deep learning

I believe that perfecting as few as 5% of the relevant variables will get a business 95% of the same benefit. The trick is identifying that viable 5%, and extracting meaningful information from it. In other words, “Value” is the long pole in the tent.

Twitter @bigdatabeat


Great Customer Experiences Start with Great Customer Data

Are your customer-facing teams empowered with the great customer data they need to deliver great customer experiences?

On Saturday, I got a call from my broadband company on my mobile phone. The sales rep pitched a great limited-time offer for new customers. I asked him whether I could take advantage of this great offer as well, even though I am an existing customer. He was surprised.  “Oh, you’re an existing customer,” he said, dismissively. “No, this offer doesn’t apply to you. It’s for new customers only. Sorry.” You can imagine my annoyance.

If this company had built a solid foundation of customer data, the sales rep would have had a customer profile rich with clean, consistent, and connected information as reference. If he had visibility into my total customer relationship with his company, he'd know that I'm a loyal customer with two current service subscriptions. He'd know that my husband and I have been customers for 10 years at our current address. On top of that, he'd know we both subscribed to their services while living at separate addresses before we were married.

Unfortunately, his company didn’t arm him with the great customer data he needs to be successful. If they had, he could have taken the opportunity to offer me one of the four services I currently don’t subscribe to—or even a bundle of services. And I could have shared a very different customer experience.

Every customer interaction counts

Executives at companies of all sizes talk about being customer-centric, but it’s difficult to execute on that vision if you don’t manage your customer data like a strategic asset. If delivering seamless, integrated, and consistent customer experiences across channels and touch points is one of your top priorities, every customer interaction counts. But without knowing exactly who your customers are, you cannot begin to deliver the types of experiences that retain existing customers, grow customer relationships and spend, and attract new customers.

How would you rate your current ability to identify your customers across lines of business, channels and touch points?

Many businesses, however, have anything but an integrated and connected customer-centric view—they have a siloed and fragmented channel-centric view. In fact, sales, marketing, and call center teams often identify siloed and fragmented customer data as key obstacles preventing them from delivering great customer experiences.

Many companies are struggling to deliver great customer experiences across channels because their siloed systems give them a channel-centric view of customers.

According to Retail Systems Research, creating a consistent customer experience remains the most valued capability for retailers, but 55% of those surveyed indicated their biggest inhibitor was not having a single view of the customer across channels.

Retailers are not alone. An SVP of marketing at a mortgage company admitted in an Argyle CMO Journal article that, now that his team needs to deliver consistent customer experiences across channels and touch points, they realize they are not as customer-centric as they thought they were.

Customer complexity knows no bounds

The fact is, businesses are complicated, with customer information fragmented across divisions, business units, channels, and functions.

Citrix, for instance, is bringing together valuable customer information from 4 systems. At Hyatt Hotels & Resorts, it’s about 25 systems. At MetLife, it’s 70 systems.

How many applications and systems would you estimate contain valuable customer information at your company?

Based on our experience working with customers across many industries, we know the total customer relationship allows:

  • Marketing to boost response rates by better segmenting their database of contacts for personalized marketing offers.
  • Sales to more efficiently and effectively cross-sell and up-sell the most relevant offers.
  • Customer service teams to resolve customers’ issues immediately, instead of placing them on hold to hunt for information in a separate system.

If your marketing, sales, and customer service teams are struggling with inaccurate, inconsistent, and disconnected customer information, it is costing your company revenue, growth, and success.

Transforming customer data into total customer relationships

Informatica’s Total Customer Relationship Solution fuels business and analytical applications with clean, consistent and connected customer information, giving your marketing, sales, e-commerce and call center teams access to that elusive total customer relationship. It not only brings all the pieces of fragmented customer information together in one place where it’s centrally managed on an ongoing basis, but also:

  • Reconciles customer data: Your customer information should be the same across systems, but often isn't. The solution assesses its accuracy, fixing and completing it as needed—for instance, in my case merging duplicate profiles under "Jakki" and "Jacqueline" (see the short matching sketch below).
  • Reveals valuable relationships between customers: Map critical connections. Are individuals members of the same household or influencer network? Are two companies part of the same corporate hierarchy? Even link customers to personal shoppers, insurance brokers, sales people, or channel partners.
  • Tracks thorough customer histories: Identify customers’ preferred locations; channels, such as stores, e-commerce, and catalogs; or channel partners.
  • Validates contact information: Ensure email addresses, phone numbers, and physical addresses are complete and accurate so invoices, offers, or messages actually reach customers.

With a view of the Total Customer Relationship, teams are empowered to deliver great customer experiences.
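
To give a feel for what the "reconciles customer data" step above involves, here is a minimal Python sketch of duplicate detection using only the standard library. The profiles, surnames, postcode rule, and similarity threshold are invented for the example; real MDM matching uses far richer rules and many more attributes.

    # Minimal sketch: flag likely duplicate customer profiles by fuzzy name matching.
    from difflib import SequenceMatcher

    profiles = [
        {"id": 1, "name": "Jakki Smith",      "postcode": "94063"},
        {"id": 2, "name": "Jacqueline Smith", "postcode": "94063"},
        {"id": 3, "name": "John Lee",         "postcode": "10001"},
    ]

    def similarity(a: str, b: str) -> float:
        return SequenceMatcher(None, a.lower(), b.lower()).ratio()

    # Candidate duplicates: similar names registered at the same postcode.
    for i, p in enumerate(profiles):
        for q in profiles[i + 1:]:
            if p["postcode"] == q["postcode"] and similarity(p["name"], q["name"]) > 0.6:
                print(f"Possible duplicate: {p['name']} <-> {q['name']}")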

This is just the beginning. From here, imagine enriching your customer profiles with third-party data. What types of information help you better understand, sell to, and serve your customers? What are your plans for incorporating social media insights into your customer profiles? What could you do with this additional customer information that you can’t do today?

We’ve helped hundreds of companies across numerous industries build a total customer relationship view. Merrill Lynch boosted marketing campaign effectiveness by 30 percent. Citrix boosted conversion rates by 20%. A $60 billion global manufacturer improved cross-sell and up-sell success by 5%. A hospitality company boosted cross-sell and up-sell success by 60%. And Logitech increased sales across channels, including their online site, retail stores, and distributors.

Informatica’s Total Customer Relationship Solution empowers your people with confidence, knowing that they have access to the kind of great customer data that allows them to surpass customer acquisition and retention goals by providing consistent, integrated, and seamless customer experiences across channels. The end result? Great experiences that customers are inspired to share with their family and friends at dinner parties and on social media.

Do you have a terrible customer experience or a great customer experience to share? If so, please share it with us and other readers using the Comment option below.


Does a Better Order to Cash Process Start with Better Customer Data?

Improving Order to Cash matters regardless of market cycle

Order to Cash (OTC) matters to today's CFOs and their finance teams, even as CFOs shift from an expense and cost-reduction footing to a growth-management footing. As the business process concerned with receiving and processing customer sales, a well-functioning OTC process is not only about preserving cash; it is also about improving the working capital delivered to the bottom line, something successful growth strategies require. Specifically, OTC helps provide the cash flow needed to speed up collections and improve working capital turnover.

This drives concrete improvements to finance metrics including Days Sales Outstanding (a measure of the average number of days that a company takes to collect revenue after a sale has been made) and the overall cost of collections. It should be clear that a poorly running order to cash process can create tangible business issues. These include but are not limited to the following:

  • Accuracy of purchase orders
  • Accuracy of invoices
  • Volume of customer disputes
  • Incorrect application of payment terms
  • Approval of inappropriate orders
  • Errors in order fulfillment
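
As a quick worked example of the Days Sales Outstanding metric defined above (the figures are made up for illustration):

    # DSO = accounts receivable / credit sales x days in the period
    def days_sales_outstanding(accounts_receivable, credit_sales, days_in_period=365):
        """Average number of days it takes to collect revenue after a sale."""
        return accounts_receivable / credit_sales * days_in_period

    # e.g. $8M outstanding against $60M in annual credit sales
    print(round(days_sales_outstanding(8_000_000, 60_000_000)))  # ~49 days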

CFOs tell us that it is critical that they make sure that "good cash management is occurring in compliance with regulation". It is also important to recognize that OTC cuts across many of the primary activities of the business value chain—especially those related to sales, marketing, and services.

How do you improve your order to cash process?

So how do financial leaders go about improving OTC? They can start by looking at the entire OTC process, from quoting and processing orders through fulfilling orders, invoicing customers, receiving and applying cash, and managing credit and collections. The diagram below shows the specific touch points where the process can be improved; each of these should be examined for process or technology improvement.

The customer data touch points across the OTC process

However, the starting point is where the most concrete action can be taken. Fixing customer data fixes the data used by each succeeding process improvement area. This is where a single, connected view of the customer can be established. It improves the OTC process by doing the following:

  1. Fixes your customer data
  2. Establishes the right relationships between customers
  3. Establishes tax and statutory registrations and credit limits
  4. Prevents penalties for delivering tax documents to the wrong place

Customer Data Mastering (CDM) does this by providing a single view of customers, a 360-degree view of relationships, and a complete view of customer interactions and transactions.

CDM matters to the CFO and the Business as a whole

It turns out that CDM does not just matter to OTC and the finance organization. It also matters to the corporate value chain by impacting primary activities including outbound logistics, marketing and sales, and service. Specifically, CDM accomplishes the following:

  • Reduces costs by cutting invoicing and billing inaccuracies, customer disputes, unconsolidated statements, duplicate mail, and returned mail
  • Increases revenue by boosting marketing effectiveness at segmenting customers for more personalized offers
  • Increases revenue by boosting sales effectiveness through more relevant cross-sell and up-sell offers
  • Reduces costs by boosting call center effectiveness in resolving customer issues more quickly
  • Improves customer satisfaction, loyalty, and retention because employees are empowered with complete customer information to deliver great customer experiences across channels and touch points

Parting remarks

So, as we have discussed, today's CFOs are finding that they need to become more and more growth oriented. This transition is made more effective by a well-functioning OTC process. While OTC touches other elements of the business too, the starting point for fixing it is customer data mastering, because that fixes the data on which the rest of the process depends.

Solution Brief:

Master Data Management

Related Blogs

CFOs Move to Chief Profitability Officer

CFOs Discuss Their Technology Priorities

The CFO Viewpoint upon Data

How CFOs can change the conversation with their CIO?

New type of CFO represents a potent CIO ally

Competing on Analytics

The Business Case for Better Data Connectivity

Twitter @MylesSuer


Connecting The Dots Between Tax Day, Identity Theft And Digital Transformation

Original article is posted at techcrunch.com

It’s probably no surprise to the security professional community that once again, identity theft is among the IRS’s Dirty Dozen tax scams. Criminals use stolen Social Security numbers and other personally identifiable information to file tax claims illegally, deposit the tax refunds to rechargeable debit cards, and vanish before the average citizen gets around to filing.

The IRS began publishing its "Dirty Dozen" list to alert filers to the worst tax scams, and identity theft has topped it continually since 2011. In 2012, the IRS implemented a preventive measure to catch fraud prior to actually issuing refunds, and issued more than 2,400 enforcement actions against identity thieves. With an aggressive campaign to fight identity theft, the IRS saved over $1.4 billion in 2011 and over $63 billion since October 2014.

That's great progress. But consider that of the 117 million taxpayers who filed electronically in 2014, 80 million received an average of $2,851 deposited directly into their bank accounts, which adds up to more than $229 billion changing hands electronically. The pessimist in me has to believe that cyber criminals are already plotting how to nab more Social Security numbers and e-filing logins to tap into that big pot of gold.

So where are criminals getting the data to begin with? Any organization that has employees and a human resources department collects and possibly stores Social Security numbers, birthdays, addresses and income either on-premises or in a cloud HR application. This information is everything a criminal would need to fraudulently file taxes. Any time a common business process is digitally transformed, or moved to the cloud, the potential risk of exposure increases.

As the healthcare industry moves to electronic health and patient records, another abundant source of Social Security numbers and personally identifiable information increases the surface area of opportunity. When you look at the abundance of Social Security numbers stolen in major data breaches, such as the one at Anthem, you start to connect the dots.

One of my favorite dynamic infographics comes from the website Information is Beautiful and is entitled 'World's Biggest Data Breaches.' When you filter the data by number of records versus sensitivity, the size of each bubble indicates the severity. Even though the sensitivity score appears to be somewhat arbitrary, it does provide one way to assess severity based on the type of information that was breached:

Data Breached                              Sensitivity Score
Just email address/online information      1
SSN/personal details                       20
Credit card information                    300
Email password/health records              4,000
Full bank account details                  50,000

What would be an interesting addition is how many records were sold on the black market that resulted in tax or insurance fraud.

Cyber-security expert Brian Krebs, who was personally impacted by a criminal tax return filing last year, says we will likely see “more phony tax refund claims than last year.” With credentials for TurboTax and H&R Block marketed on black market websites for about 4 cents per identity, it is hard to disagree.

The Ponemon Institute published a survey last year entitled The State of Data Centric Security. One research finding that sticks out: when security professionals were asked what keeps them up at night, more than 50 percent said "not knowing where sensitive and confidential data reside." As we enter full swing into tax season, what should security professionals be thinking about?

Data Security Intelligence promises to be the next big thing, providing a more automated and data-centric view into sensitive data discovery, classification, and risk assessment. If you don't know where the data is or how much risk it carries, how can you protect it? Maybe with a little more insight, we can at least reduce the surface area of exposed sensitive data.


Popular Informatica Products are Now Fully Supported on AWS EC2 for Greater Agility

An increasing number of companies around the world are moving to cloud-first or hybrid architectures for new systems that process their data for new analytics applications. In addition to adding new data sources from SaaS (Software as a Service) applications to their data pipelines, they are hosting some or all of their data storage, processing, and analytics in IaaS (Infrastructure as a Service) public hosted environments to augment on-premise systems. To enable our customers to take advantage of the benefits of IaaS options, Informatica is embracing this computing model.

As announced today, Informatica now fully supports running the traditionally on-premise Informatica PowerCenter, Big Data Edition (BDE), Data Quality, and Data Exchange products on Amazon Web Services (AWS) Elastic Compute Cloud (EC2). This gives customers added flexibility, agility, and faster time to production by enabling a new deployment option for running Informatica software.

Existing and new Informatica customers can now choose to develop and/or deploy data integration, data quality, and data exchange on AWS EC2 just as they would on on-premise servers. There is no need for any special licensing: Informatica's standard product licensing now covers deployment on AWS EC2 on the same operating systems as on-premise. BDE on AWS EC2 supports the same versions of Cloudera and Hortonworks Hadoop that are supported on-premise.

Customers can install these Informatica products on AWS EC2 instances just as they would on servers running in an on-premise infrastructure. The same award-winning Informatica Global Customer Service that thousands of Informatica customers rely on is on call and standing by to help with success on AWS EC2. Informatica Professional Services is also available to assist customers running these products on AWS EC2, just as it is for on-premise configurations.

Informatica customers can accelerate their time to production or experimentation with the added flexibility of installing Informatica products on AWS EC2 without having to wait for new servers to arrive.  There is the flexibility to develop in the cloud and deploy production systems on-premise or develop on-premise and deploy production systems in AWS.  Cloud-first companies can keep it all in the cloud by both developing and going into production on AWS EC2.
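
As a sketch of what "not waiting for new servers" can look like in practice, the snippet below provisions an EC2 instance with boto3, the AWS SDK for Python. The AMI, instance type, and key pair names are placeholders rather than Informatica-recommended values, and the Informatica installation itself then proceeds just as it would on any other server.

    # Minimal sketch: launch a virtual server on EC2 to host Informatica software.
    # Assumes AWS credentials are configured; all identifiers are placeholders.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    response = ec2.run_instances(
        ImageId="ami-xxxxxxxx",      # a supported Linux AMI of your choice
        InstanceType="m3.2xlarge",   # sized to the PowerCenter/BDE workload
        KeyName="my-keypair",        # an existing EC2 key pair for SSH access
        MinCount=1,
        MaxCount=1,
    )
    print(response["Instances"][0]["InstanceId"])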

Customers can also benefit from the lower up-front costs, lower maintenance costs, and pay-as-you-go infrastructure pricing of AWS. Instead of paying upfront for servers and managing them in an on-premise data center, customers can run Informatica products on virtual servers in AWS. Customers can use existing Informatica licenses or purchase them in the standard way from Informatica for use on AWS EC2.

Combined with the ease of use of Informatica Cloud, Informatica now offers customers looking for hybrid and cloud solutions even more options.

Read the press release including supporting quotes from AWS and Informatica customer ProQuest, here.


The Impact of the Industrial Internet on Data

At the recent Bosch Connected World conference in Berlin, Stefan Bungart, Software Leader Europe at GE, presented a very interesting keynote, "How Data Eats the World"—which I assume is a nod to Marc Andreessen's statement that software is eating the world. One of the key points he addressed was that generating actionable insight from Big Data, securely and in real time, at every level from local to global and at an industrial scale, will be the key to survival. Companies that do not invest in DATA now will eventually end up like the consumer companies that missed the Internet: it will be too late.

As software and the value of data become a larger part of the business value chain, the lines between industries blur, or as GE's Chairman and CEO Jeff Immelt once stated: "If you went to bed last night as an industrial company, you're going to wake up today as a software and analytics company." This is not only true for industrial companies, but for many companies that produce "things": cars, jet engines, boats, trains, lawn mowers, toothbrushes, nut-runners, computers, network equipment, etc. GE, Bosch, Technicolor, and Cisco are just a few of the industrial companies that offer an Internet of Things (IoT) platform, and by doing so they enter the domains of companies such as Amazon (AWS) and Google. As Google and Apple move into new areas such as manufacturing cars and watches and offering insurance, the industry lines become blurred and service becomes the key differentiator. The best service offerings will be contingent upon the best analytics, and the best analytics require a complete and reliable data platform. Only companies that can leverage data will be able to compete and thrive in the future.

The idea of this "servitization" is that instead of selling assets, companies offer services that utilize those assets. For example, Siemens offers a body-scanning service to hospitals instead of selling the MRI scanner, and Philips sells lighting services to cities and large companies, not light bulbs. These business models give suppliers a strong incentive to minimize disruption and repairs, as these now cost them money. It also becomes more attractive to put as much device functionality as possible in software, so that upgrades or adjustments can be made without replacing physical components. All of this is made possible by the fact that the devices are connected, generate data, and can be monitored and managed from another location. The data is used to analyse functionality, power consumption, and usage, but can also be utilised to predict malfunctions and plan proactive maintenance.

So what impact does this have on data and on IT? First of all, the volumes are immense. Whereas the total global volume of, for example, Twitter messages is around 150GB, ONE gas turbine with around 200 sensors generates close to 600GB per day! Yet according to IDC, only 3% of potentially useful data is tagged and less than 1% is currently analysed. Secondly, the structure of the data is not always straightforward, and even similar devices can produce different content (messages) because they may be running at different software levels. This has an impact on the back-end processing and on the reliability of any analysis of the data.
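
A quick back-of-envelope calculation shows how such volumes arise. The sampling rate and sample size below are assumptions chosen for illustration, not GE specifications, but they land in the same ballpark as the figure quoted above.

    # Rough sizing of one turbine's daily sensor output (assumed parameters).
    sensors = 200
    samples_per_second = 5_000      # assumed high-frequency vibration/pressure channels
    bytes_per_sample = 8            # assumed double-precision reading
    seconds_per_day = 24 * 60 * 60

    bytes_per_day = sensors * samples_per_second * bytes_per_sample * seconds_per_day
    print(f"{bytes_per_day / 1e9:.0f} GB per day")   # ~691 GB per day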

The data also often needs to be put into context with other master data, such as locations or customers, for real-time decision making. This is a non-trivial task. Next, governance is an aspect that needs top-level support. Questions like: Who owns the data? Who may see or use the data? What data needs to be kept or archived, and for how long? These need to be answered and governed in IoT projects with the same priority as the data in more traditional applications.

To summarize, managing data and mastering data governance is becoming one of the most important pillars of the companies that will lead the digital age. Companies that fail to do so risk becoming the next Blockbuster or Kodak: companies that didn't adapt quickly enough. To avoid this, companies need to evaluate whether their data platform can support a comprehensive data strategy, one that encompasses scalability, quality, governance, security, ease of use, and flexibility, and that lets them choose the most appropriate data processing infrastructure, whether that is on premise, in the cloud, or most likely a hybrid combination of the two.


What is an Enterprise Architecture Maturity Model?

Enterprise IT is in a state of constant evolution. As a result, business processes and technologies become increasingly difficult to change and more costly to keep up to date. The solution to this predicament is an Enterprise Architecture (EA) process that provides a framework for an optimized IT portfolio. An IT optimization strategy should be based on a comprehensive set of architectural principles that ensure consistency and make IT more responsive, efficient, and economical.

The rationalization, standardization, and consolidation process helps organizations understand their current EA maturity level and move forward on the appropriate roadmap. As they undertake the IT Optimization journey, the IT architecture matures through several stages, leveraging IT Optimization Architecture Principles to attain each level of maturity.

The Multiple Levels of the Enterprise Architecture Maturity Model

Level 1: The first step involves helping a company develop its architecture vision and operating model, with attention to cost, globalization, investiture, or whatever is driving the company strategically. Once that vision is in place, enterprise architects can guide the organization through an iterative process of rationalization, consolidation, and eventually shared-services and cloud computing.

Level 2: The rationalization exercise helps an organization identify what standards to move towards as they eliminate the complexities and silos they have built up over the years, along with the specific technologies that will help them get there.

Depending on the company, rationalization could start with a technical discussion and be IT-driven, or it could start at a business level. For example, a company might have distributed operations across the globe and desire to consolidate and standardize its business processes. That could drive change in the IT portfolio. Or a company that has gone through mergers and acquisitions might have redundant business processes to rationalize.

Rationalizing involves understanding the current state of an organization’s IT portfolio and business processes, and then mapping business capabilities to IT capabilities. This is done by developing scoring criteria to analyze the current portfolio, and ultimately by deciding on the standards that will propel the organization forward. Standards are the outcome of a rationalization exercise.
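
As a toy illustration of such scoring criteria, the sketch below ranks a hypothetical application portfolio. The applications, criteria, and weights are invented for the example; a real rationalization exercise would use the organization's own capability mappings.

    # Minimal sketch: score portfolio applications to find rationalization candidates.
    applications = [
        {"name": "Legacy CRM",     "business_fit": 2, "technical_health": 1, "annual_cost_k": 900},
        {"name": "Cloud CRM",      "business_fit": 4, "technical_health": 4, "annual_cost_k": 350},
        {"name": "Custom Billing", "business_fit": 3, "technical_health": 2, "annual_cost_k": 600},
    ]

    WEIGHTS = {"business_fit": 0.5, "technical_health": 0.3, "cost": 0.2}

    def score(app):
        # Higher business fit and technical health raise the score; higher cost lowers it.
        cost_score = 5 - min(app["annual_cost_k"] / 250, 5)
        return (WEIGHTS["business_fit"] * app["business_fit"]
                + WEIGHTS["technical_health"] * app["technical_health"]
                + WEIGHTS["cost"] * cost_score)

    for app in sorted(applications, key=score):
        print(f"{app['name']:15} score={score(app):.2f}")   # lowest scores are retirement candidates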

Standardized technology represents the second level of EA maturity. Organizations at this level have evolved beyond isolated, independent silos. They have well-defined corporate governance and procurement policies, which yield measurable cost savings through reduced software licenses and the elimination of redundant systems and skill sets.

Level 3: Consolidation entails reducing the footprint of your IT portfolio. That could involve consolidating the number of database servers, application servers and storage devices, consolidating redundant security platforms, or adopting virtualization, grid computing, and related consolidation initiatives.

Consolidation may be a by-product of another technology transformation, or it may be the driver of these transformations. But whatever motivates the change, the key is to be in alignment with the overall business strategy. Enterprise architects understand where the business is going so they can pick the appropriate consolidation strategy.

Level 4: One of the key outcomes of a rationalization and consolidation exercise is the creation of a strategic roadmap that continually keeps IT in line with where the business is going.

Having a roadmap is especially important when you move down the path to shared services and cloud computing. For a company that has a very complex IT infrastructure and application portfolio, having a strategic roadmap helps the organization to move forward incrementally, minimizing risk, and giving the IT department every opportunity to deliver value to the business.

Twitter @bigdatabeat


What’s Next for the Cloud, Coming to a City Near You (If you’re in NYC or San Francisco)

Free Training Seminars for Cloud Application Owners and Salesforce Practitioners

March 20th 2015 was the official start of spring and, to be honest, it couldn't have come soon enough for those of us in the Northeast. After a long, cold, and snowy winter we're looking forward to the spring thaw and the first green shoots of burgeoning life. Spring is also the time when we like to tackle new projects and start afresh after our winter hibernation.

For those of us in technology, new spring projects often mirror the things we do in everyday life. Naturally our minds turn at this time to spring cleaning and spring training. To be honest, we'd have to admit that we haven't scrubbed our data in three months, so data cleansing is a must, but so too is training. We probably haven't picked up a book or attended a seminar since last November. But what training should we do? And what should we do next?

Luckily, Informatica is providing the answer. We've put together two free, half-day training seminars for cloud application owners and Salesforce practitioners. That's two dates, two fantastic locations, and dozens of brilliant speakers lined up to give you some new pointers on what's coming next in the world of cloud and SaaS.

The goals of the event are to give you the tools and knowledge to strengthen your Salesforce implementation and help you delight your customers.  The sessions will include presentations by experts from Salesforce and our partner Bluewolf.  There will also be some best practices presentations and demonstrations from Informatica’s team of very talented engineers.

Just glance at the seminar summary and you’ll see what we mean:

Session 1: Understand the Opportunity of Every Customer Interaction

In this session, Eric Berridge, Co-founder and CEO of Bluewolf Inc., will discuss how you can develop a customer-obsessed culture and get the most value from every customer interaction.

Session 2: Delight Your Customers by Taking Your Salesforce Implementation to the Next Level

Ajay Gandhi, Informatica's VP of Product Marketing, is up next, and he's going to provide a fabulous session on what to look out for and where you should invest as your Salesforce footprint grows.

Session 3: Anticipate Your Business Needs With a Fresh Approach to Customer Analytics

The seminar wraps up with Benjamin Pruden, Sr. Manager of Product Marketing at Salesforce. Ben's exciting session touches on one of the hottest topics in the industry today. He's going to explain how you can gain a comprehensive understanding of your most valuable customers with cloud analytics and data-driven dashboards.

I’m sure you’ll agree that it’s a pretty impressive seminar and well worth a couple of hours of your time.

The New York event is happening at Convene (810 Seventh Ave, at 52nd and 53rd) on April 7th. Click here for more details and to reserve your seat.

The San Francisco event is a week later on April 14th at Hotel Nikko (222 Mason Street). Make sure you click here and register today.

Come join us on the 7th or the 14th to learn how to take your cloud business to the next level. Oh, and don’t forget that you’ll also be treating yourself to some well-deserved spring training!


Data Wizard Beta: Paving the Way for Next-Generation Data Loaders

The Data Wizard changes the landscape of what traditional data loaders can do

The emergence of the business cloud is making the need for data ever more prevalent. Whatever your business, if you work in sales, marketing, or service, chances are your productivity depends a great deal on the ability to move data quickly in and out of Salesforce and its ecosystem of applications.

With built-in data transformation intelligence, the Data Wizard (click here to try the Beta version) changes the landscape of what traditional data loaders can do. The Data Wizard takes care of the following aspects so that you don't have to:

  1. Data transformations: We built in over 300 standard data transformations so you don't have to format the data before bringing it in (e.g., combining first and last names into full names, adding numeric columns for totals, splitting address fields into their separate components); see the short sketch after this list.
  2. Built-in intelligence: We automate the mapping of data into Salesforce for a range of common use cases (e.g., automatically mapping matching fields, intelligently auto-generating date format conversions, concatenating multiple fields).
  3. App-to-app integration: We incorporated pre-built integration templates to encapsulate the logic required for integrating Salesforce with other applications (e.g., a single-click update of customer addresses in a cloud ERP application based on Account addresses in Salesforce).
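
Here is a minimal Python sketch of the kinds of transformations described in item 1. The field names and formats are illustrative, and the Data Wizard applies equivalents of these automatically rather than requiring hand-written code.

    # Minimal sketch: a few typical pre-load transformations done by hand.
    record = {"FirstName": "Ada", "LastName": "Lovelace",
              "Address": "10 Downing St, London, SW1A 2AA",
              "CloseDate": "03/31/2015"}

    # Combine first and last names into a full name
    record["FullName"] = f"{record['FirstName']} {record['LastName']}"

    # Split a single address field into its components
    street, city, postal = [part.strip() for part in record["Address"].split(",")]

    # Convert a US-style date into ISO format
    month, day, year = record["CloseDate"].split("/")
    record["CloseDate"] = f"{year}-{month}-{day}"

    print(record["FullName"], "|", street, "|", city, "|", postal, "|", record["CloseDate"])
    # Ada Lovelace | 10 Downing St | London | SW1A 2AA | 2015-03-31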

Unlike the other data loading apps out there, the Data Wizard doesn’t presuppose any technical ability on the part of the user. It was purpose-built to solve the needs of every type of user, from the Salesforce administrator to the business analyst.

Despite the simplicity the Data Wizard offers, it is built on the robust Informatica Cloud integration platform, providing the same reliability and performance that are key to the success of Informatica Cloud's enterprise customers, who integrate over 5 billion rows of data per day. We invite you to try the Data Wizard for free and contribute to the Beta process by providing us with your feedback.
