Category Archives: Data Services

Why Enterprise Architects Need to Think About Data First

Enterprise Architects (EAs) are increasingly being asked to think three to five years out. This means they need to take an even more active part in the strategy process and help drive business transformation. A CIO we spoke with recently said:

“Enterprise Architecture needs to be the forward, business facing component of IT. Architects need to create a regular structure for IT based on the service and product line functions/capabilities. They need to be connected to their business counterparts. They need to be so tied to the product and service road map that they can tie changes directly to the IT roadmap. Often times, I like to pair a Chief Business Strategist with a Chief Enterprise Architect.”

To get there, Enterprise Architects are going to have to think differently about enterprise architecture. Specifically, they need to think “data first” to break through the productivity barrier and deliver business value in the time frame the business requires.

IT is Not Meeting the Needs of the Business

A study by McKinsey and Company found that IT is not delivering in the time frame the business requires. Even worse, IT’s performance ratings have been dropping over the past three years. Worse still, 20% of the survey respondents are calling for a change in IT leadership.

Our talks with CIOs and Enterprise Architects tell us that the ability to access, manage and deliver data on a timely basis is the biggest bottleneck in the process of delivering business initiatives.  Gartner predicts that by 2018, more than half the cost of implementing new large systems will be spent on integration.

The Causes: It’s Only Going to Get Worse

Data needs to be easily discoverable and sharable across multiple uses. Today’s application-centric architectures do not provide that flexibility, so any new business initiative will be slowed by issues with finding, accessing, and managing data. The causes include:

  • Data Silos: Decades of applications-focused architecture have left us with unconnected “silos of data.”
  • Lack of Data Management Standards: The fact is that most organizations do not manage data as a single system. This means that they are dealing with a classic “spaghetti diagram” of data integration and data management technologies that are difficult to manage and change.
  • Growth of Data Complexity: There is a coming explosion of data complexity: partner data, social data, mobile data, big data, Internet of Things data.
  • Growth of Data Users: There is also a coming explosion of new data users, all of whom will expect to self-serve.
  • Increasing Technology Disruption:  Gartner predicts that we are entering a period of increased technology disruption.

Looking forward, organizations are increasingly running on the same few enterprise applications and those applications are rapidly commoditizing.  The point is that there is little competitive differentiation to be had from applications.  The only meaningful and sustainable competitive differentiation will come from your data and how you use it.

Recommendations for Enterprise Architects

  1. Think “data first” to accelerate business value delivery and to drive data as your competitive advantage. Designing data as a sharable resource will dramatically accelerate your organization’s ability to produce useful insights and deliver business initiatives.
  2. Think about enterprise data management as a single system. It should not be a series of one-off, custom “works of art.” You will reduce complexity, save money, and, most importantly, speed the delivery of business initiatives.
  3. Design your data architecture for speed first.  Do not buy into the belief that you must accept trade-offs between speed, cost, or quality. It can be done, but you have to design your enterprise data architecture to accomplish that goal from the start.
  4. Design to know everything about your data. Specifically, gather and carefully manage all relevant metadata. It will speed up data discovery, reduce errors, and provide critical business context. A full complement of business and technical metadata is what enables recommendation #5 (a minimal sketch of this kind of metadata capture follows this list).
  5. Design for machine learning and automation. Your data platform should be able to automate routine tasks and accelerate more complex ones with intelligent recommendations. This is the only way you will be able to meet the demands of the business while coping with growing data complexity and technology disruption.
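
As a rough illustration of recommendations #4 and #5, here is a minimal sketch in Python (using pandas; the dataset and column names are hypothetical) of the kind of technical metadata that could be harvested automatically for every dataset. A real catalog would add business metadata such as owners, definitions, and lineage on top of this.

    # A minimal sketch of automated technical-metadata capture (recommendation #4).
    # Dataset and column names are hypothetical; a real catalog would also hold
    # business metadata such as owners, definitions, and lineage.
    import pandas as pd

    def harvest_metadata(df: pd.DataFrame, dataset_name: str) -> dict:
        """Collect column-level statistics that a catalog or machine-learning
        recommendation engine (recommendation #5) could build on."""
        columns = []
        for col in df.columns:
            series = df[col]
            columns.append({
                "name": col,
                "type": str(series.dtype),
                "null_rate": round(float(series.isna().mean()), 3),
                "distinct_values": int(series.nunique()),
            })
        return {"dataset": dataset_name, "row_count": len(df), "columns": columns}

    if __name__ == "__main__":
        customers = pd.DataFrame({
            "customer_id": [1, 2, 3],
            "region": ["EMEA", "AMER", None],
        })
        print(harvest_metadata(customers, "customers"))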

Technology disruption will bring challenges and opportunities.  For more on this subject, see the Informatica eBook, Think ‘Data First’ to Drive Business Value.


The Future of Data for Everyone

Within every corporation there are lines of business, such as Finance, Sales, Logistics and Marketing. And within those lines of business are business users who are either non-technical or choose to be non-technical.

These business users are increasingly using Next-Generation Business Intelligence tools like Tableau, Qliktech, MicroStrategy Visual Insight, Spotfire or even Excel. A unique capability of these tools is that they allow a non-technical business user to prepare data themselves before ingesting it into the tool for subsequent analysis.

Initially, the types of activities involved in preparing this data are quite simple. It might mean joining two Excel files on a common field. Over time, however, the operations a non-technical user wants to perform become more complex. They want to join two files of differing grain, validate or complete addresses, or enrich company or customer profile data. At that point they need either coding skills or advanced tooling, neither of which they have. So they pick up the phone, call their brethren in IT, and ask nicely for help combining the data, improving its quality, and enriching it. Often they need the resulting dataset back in a tight timeframe, perhaps a couple of hours. IT will initially be very happy to oblige, returning the dataset in the timeframe requested and at the quality levels expected. No issues.
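
For a rough sense of the simple case above, here is a minimal Python/pandas sketch (file and column names are hypothetical) of joining two spreadsheet extracts on a common field, with a hint at the harder differing-grain case that usually triggers the call to IT.

    # A minimal sketch of the simple case: joining two spreadsheet extracts on a
    # common field. File and column names are hypothetical.
    import pandas as pd

    orders = pd.read_excel("orders.xlsx")        # e.g. order_id, customer_id, amount
    customers = pd.read_excel("customers.xlsx")  # e.g. customer_id, name, region

    # Join on the shared key, keeping only rows present in both files.
    combined = orders.merge(customers, on="customer_id", how="inner")

    # The harder "differing grain" case usually means aggregating one side first,
    # e.g. rolling orders up to one row per customer before joining.
    per_customer = orders.groupby("customer_id", as_index=False)["amount"].sum()
    enriched = per_customer.merge(customers, on="customer_id", how="left")

    # Write the prepared dataset out for loading into a BI tool.
    combined.to_csv("orders_with_customers.csv", index=False)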

However, as the number of non-technical business users adopting Next-Generation Business Intelligence tools increases, the number of requests to IT for datasets increases as well. So while IT is initially able to meet the business’s “quick hit dataset” requests, over time, despite its best efforts, IT increasingly becomes unable to do so.

The reality is that over time the business will see a gradual decrease in the quality of the datasets returned, as well as an increase in the time IT needs to provide them. At some point the business will reach a decision point: to meet its commitments, it will have to find other ways to put together its “quick hit datasets.” It is precisely at this point that the business may do things like hire an IT contractor to sit next to them and do nothing but assemble these datasets. It is also when IT begins to feel marginalized and will likely begin to see a drop in funding.

This dynamic has been around for decades, and it continues to worsen as the pace of data-driven business decision making increases. I feel that we at Informatica have a truly unique opportunity to build a technology solution that serves two related constituents: the Non-Technical Business User and the IT Data Provisioner.

The specific point of value this technology provides to the Non-Technical Business User is the ability to rapidly put together datasets for subsequent analysis in their Next-Generation BI tool of choice. Without it, they might spend a week or two assembling a dataset, or wait for someone else to assemble it for them. I feel we can flip this division of labor: the business user spends 15 minutes putting the dataset together themselves and devotes the week or two to meaningful analysis instead. In doing so, we allow non-technical business users to dramatically decrease their decision-making time.

The specific point of value this technology provides the IT Data Provisioner is the ability to effectively scale data provisioning as requests for “quick hit datasets” rapidly increase. Most importantly, they will be able to scale proactively.

Because of this, the business and IT relationship becomes a match made in heaven.


ILM Day: Test Data Management, Data Archive and Data Security Discussions and More…

At the Informatica World 2014 pre-conference, the “ILM Day” sessions were packed, with over 100 people in attendance. This attendance reflects the strong interest in data archive, test data management and data security. Customers were the focus of the panel sessions today, taking center stage to share their experiences, best practices and lessons learned from successful deployments.

Both the test data management and data archive panels drew strong audience interest and interaction. For test data management, the panel topic was “Agile Development by Streamlining Test Data Management”; for data archive, the session tackled “Managing Data Growth in the Era of Application Consolidation and Modernization”. The panels offered practical tactics and strategies for managing data growth and for efficiently and safely provisioning test data. Thank you to the customers, partners and analysts who served on the panels; the participants included EMC, Visteon, Comcast, Lowes, Tata Consultancy Services and Neuralytix.

The day concluded with a most excellent presentation from the ILM General Manager, Amit Walia, and the CTO of the International Association of Privacy Professionals, Jeff Northrop. Amit provided an executive preview of Tuesday’s Secure@Source(TM) announcement, while Jeff offered a thought-provoking market backdrop on the issues and challenges in data privacy and security, and on why the focus of information security needs to shift to a ‘data-centric’ approach.

A very successful event for all involved!


Calling All Architects To Join A New Career-focused Community

In the Information Age we live and work in, where it’s hard to go even one day without a Google search, where do you turn for insights that can help you solve work challenges and progress your career?  This is a tough question.  How can we deal with the challenges of information overload – which some have called information pollution? (more…)


Get Your Data Butt Off The Couch and Move It

Data is everywhere.  It’s in databases and applications spread across your enterprise.  It’s in the hands of your customers and partners.  It’s in cloud applications and cloud servers.  It’s on spreadsheets and documents on your employees’ laptops and tablets.  It’s in smartphones, sensors and GPS devices.  It’s in the blogosphere, the twittersphere and your friends’ Facebook timelines. (more…)


Is It Time For a New ICC Book?

In a recent visit to a client, three people asked me to autograph their copies of Integration Competency Center: An Implementation Guidebook. David Lyle and I published the book in 2005, but it was clear from the dog-eared corners and bookmark tabs that it is still relevant and actively used today.  Much has changed in the last seven years, including the emergence of Big Data, Data Virtualization, Cloud Integration, Self-Service Business Intelligence, Lean and Agile practices, Data Privacy, Data Archiving (the “death” part of the information life cycle), and Data Governance.  These areas were not mainstream concerns in 2005 the way they are today.  The original ICC (Integration Competency Center) book’s concepts and advice are still valid in this new context, but the question I’d like readers to comment on is this: should we write a new book that explicitly provides guidance for these new capabilities in a shared-services environment? (more…)


I Can’t Describe It, “But I Know It When I See It”

So wrote Potter Stewart, Associate Justice of the Supreme Court, in his Jacobellis v. Ohio opinion (1964). He was talking about pornography. The same holds true for data. For example, most business users have a hard time describing exactly what data they need for a new BI report, including which source system to get the data from, in terms precise enough for designers, modelers and developers to build the report right the first time. But if you sit a user down in front of an analyst tool and profile the potential source data, they will tell you in an instant whether it’s the right data or not. (more…)
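
A few lines of profiling are often all it takes for that “I know it when I see it” moment. The sketch below is a minimal Python/pandas example (the file name is hypothetical, and a dedicated profiling or analyst tool would present the same information far more accessibly).

    # A minimal profiling sketch: just enough of a look at a candidate source for a
    # business user to recognize whether it is the right data. File name is hypothetical.
    import pandas as pd

    source = pd.read_csv("candidate_source.csv")

    print(source.head(10))                    # sample rows to eyeball
    print(source.dtypes)                      # column names and types
    print(source.isna().mean().round(3))      # share of missing values per column
    print(source.describe(include="all"))     # basic value distributions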


Data Virtualization Quick Tips

The ability to create abstract schemas that are mapped to back-end physical databases provides a huge advantage for those enterprises looking to get their data under control.  However, given the power of data virtualization, there are a few things that those in charge of data integration should know. Here are a few quick tips.

Tip 1:  Start with a new schema that is decoupled from the data sources. (more…)
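
One way to picture Tip 1, sketched in Python below: the consumer-facing (virtual) schema is defined on its own terms, and the mapping to physical sources is just configuration that can be re-pointed without breaking consumers. All table, column, and field names here are hypothetical and not tied to any particular data virtualization product.

    # Illustration of Tip 1: a virtual schema defined independently of the sources,
    # with the physical mapping kept as swappable configuration. All names are
    # hypothetical and not specific to any data virtualization product.
    from dataclasses import dataclass

    @dataclass
    class VirtualField:
        name: str           # field name exposed to consumers
        source_table: str   # physical table it currently maps to
        source_column: str  # physical column it currently maps to

    CUSTOMER_VIEW = [
        VirtualField("customer_id", "crm.cust_master", "cust_no"),
        VirtualField("customer_name", "crm.cust_master", "cust_nm"),
        VirtualField("lifetime_value", "dw.cust_metrics", "ltv_usd"),
    ]

    def to_select(fields, table):
        """Generate the physical query for one source table behind the virtual view."""
        cols = [f"{f.source_column} AS {f.name}" for f in fields if f.source_table == table]
        return f"SELECT {', '.join(cols)} FROM {table}"

    # Re-pointing a field at a new source only changes this configuration,
    # not the schema that consumers see.
    print(to_select(CUSTOMER_VIEW, "crm.cust_master"))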


Data Quality Goes Green in Colorado, and on the Informatica Marketplace

A supermarket in Telluride, Colorado, struck me as a funny place to find an analogy for data quality on a recent trip, but there it was. You see, supermarkets here require you to bring your own bags to cart your groceries home. Those brown disposable plastic bags are banned here; the town has made a firm commitment to the philosophy of Reduce, Reuse and Recycle. By adhering to the same philosophy, data integration teams can develop and deploy successful data quality strategies across the enterprise despite the constraints of today’s “do more with less” IT budgets.

In the decade I’ve been in the Information Management space, I’ve noticed that success in data integration usually comes in small increments, typically on a project-by-project basis. However, by leveraging those small incremental successes and deploying them in a repeatable, consistent fashion, either as standardized rule sets or as data services in a SOA, development teams can maximize their impact at the enterprise level.
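
To make the “reuse” idea concrete, here is a small Python sketch of a standardization rule written once and applied everywhere the field appears, whether packaged in a shared rule set or exposed as a data service. The rule and the sample values are hypothetical.

    # A small illustration of reuse in data quality: one standardization rule,
    # written once and applied consistently wherever the field appears.
    # The rule and sample values are hypothetical.
    import re

    def standardize_phone(raw):
        """Normalize a US phone number to (NNN) NNN-NNNN, or return None if invalid."""
        digits = re.sub(r"\D", "", raw or "")
        if len(digits) == 11 and digits.startswith("1"):
            digits = digits[1:]                     # drop the country code
        if len(digits) != 10:
            return None                             # flag for remediation
        return f"({digits[:3]}) {digits[3:6]}-{digits[6:]}"

    # The same rule can back a batch mapping, a reusable rule set, or a web service.
    for value in ["303.555.0147", "1-303-555-0147", "55501"]:
        print(value, "->", standardize_phone(value))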

(more…)
