Tag Archives: Architecture
On November 13, 2014, Informatica acquired the assets of Proact, whose Enterprise Architecture tools and delivery capability link architecture to business strategy. The BOST framework is now the Informatica Business Transformation Toolkit which received high marks in a recent research paper:
“(BOST) is a framework that provides four architectural views of the enterprise (Business, Operational, Systems, and Technology). This EA methodology plans and organizes capabilities and requirements at each view, based on evolving business and opportunities. It is one of the most finalized of the methodologies, in use by several large enterprises.”
This got me thinking: What is the biggest bottleneck in the delivery of business value today? I know I look at things from a data perspective, but data is the biggest bottleneck. Consider this prediction from Gartner:
“Gartner predicts organizations will spend one-third more on app integration in 2016 than they did in 2013. What’s more, by 2018, more than half the cost of implementing new large systems will be spent on integration.”
When we talk about application integration, we’re talking about moving data, synchronizing data, cleansing data, transforming data, and testing data. The question for architects and senior management is this: Do you have the Data Foundation for Execution you need to drive the business results you require to compete? The answer, unfortunately, for most companies is no.
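The integration steps just mentioned, moving, cleansing, transforming, and testing data, can be sketched as a minimal pipeline. This is an illustrative outline only, not any particular product’s API; every field name and rule here is a hypothetical example.

```python
# Minimal sketch of the integration steps above: extract (move),
# cleanse, transform, and test data. All field names and rules
# are hypothetical examples.

def cleanse(record):
    """Normalize whitespace and drop records missing a key field."""
    if not record.get("customer_id"):
        return None
    return {k: v.strip() if isinstance(v, str) else v for k, v in record.items()}

def transform(record):
    """Map source fields onto the target schema."""
    return {"id": record["customer_id"], "name": record["name"].title()}

def run_pipeline(source_records):
    cleansed = [c for r in source_records if (c := cleanse(r)) is not None]
    target = [transform(r) for r in cleansed]
    # "Testing data": assert basic quality rules before loading.
    assert all(row["id"] for row in target), "every row needs an id"
    return target

records = [
    {"customer_id": "C1", "name": "  ada lovelace "},
    {"customer_id": "", "name": "no id"},  # dropped by cleansing
]
print(run_pipeline(records))  # [{'id': 'C1', 'name': 'Ada Lovelace'}]
```

Trivial as it looks, each of these steps tends to be rebuilt, project by project, wherever no shared data foundation exists.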
All too often data management is an add-on to larger application-based projects. The result is unconnected and non-interoperable islands of data across the organization. That simply is not going to work in the coming competitive environment. Here are a couple of quick examples:
- Many companies are looking to compete on their use of analytics. That requires collecting, managing, and analyzing data from multiple internal and external sources.
- Many companies are focusing on a better customer experience to drive their business. This again requires data from many internal sources, plus social, mobile and location-based data to be effective.
When I talk to architects about the business risks of not having a shared data architecture, and common tools and practices for enterprise data management, they “get” the problem. So why aren’t they addressing it? Because they are funded only for the project they are working on, under very demanding timeframes. They have no funding or mandate to solve the larger enterprise data management problem, which grows more complex and brittle with each unconnected project or initiative added to the pile.
Studies such as “The Data Directive” by The Economist show that organizations that actively manage their data are more successful. But, if that is the desired future state, how do you get there?
Changing an organization to look at data as the fuel that drives strategy takes hard work and leadership. It also takes a strong enterprise data architecture vision and strategy. For fresh thinking on the subject of building a data foundation for execution, see “Think Data-First to Drive Business Value” from Informatica.
* By the way, Informatica is proud to announce that we are now a sponsor of the MIT Center for Information Systems Research.
If you have been following publications in the Potential at Work Community or any number of LinkedIn discussions such as this one on the DrJJ group (a think-tank for information management best practices), you will have noticed the Agile methodology topic come up time and time again. For instance, check out the article Architect Your Way From Sluggish to Speed or the video Focus on Agility Adaptability. It hasn’t always been this way. For many years the architectural focus was on RASP.
In this video, Rob Karel, vice president of product strategy, Informatica, outlines the Informatica Data Governance Framework, highlighting the 10 facets that organizations need to focus on for an effective data governance initiative:
- Vision and Business Case to deliver business value
- Tools and Architecture to support the architectural scope of data governance
- Policies that make up the data governance function (security, archiving, etc.)
- Measurement: measuring the level of influence of a data governance initiative and measuring its effectiveness (business value metrics, ROI metrics, such as increasing revenue, improving operational efficiency, reducing risk, reducing cost or improving customer satisfaction)
- Change Management: incentives for the workforce, partners and customers to get better-quality data in, and potential repercussions if data is not of good quality
- Organizational Alignment: how the organization will work together across silos
- Dependent Processes: identifying data lifecycles (capturing, reporting, purchasing and updating data into your environment), all processes consuming the data and processes to store and manage the data
- Program Management: effective program management skills to build out communication strategy, measurement strategy and a focal point to escalate issues to senior management when necessary
- Define Processes that make up the data governance function (discovery, definition, application and measuring and monitoring).
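The Measurement facet above lends itself to a small illustration: computing a basic data-quality metric, such as field completeness, that a governance program could track over time. This is a hypothetical sketch, not part of the Informatica framework itself; the records and field names are invented.

```python
# Hypothetical sketch of the "Measurement" facet: a simple
# completeness metric over a set of records. Field names are examples.

def completeness(records, field):
    """Fraction of records with a non-empty value for `field`."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

customers = [
    {"name": "Acme", "email": "ops@acme.example"},
    {"name": "Globex", "email": ""},
    {"name": "Initech", "email": "it@initech.example"},
]
print(f"email completeness: {completeness(customers, 'email'):.0%}")  # 67%
```

Metrics like this only become business value metrics when they are tied to an outcome, e.g., email completeness as a driver of campaign reach.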
For more information from Rob Karel on the Informatica Data Governance Framework, visit his Perspectives blogs.
Those moving to cloud computing have their work cut out for them. They need to pick a parcel of data, applications, or both to migrate to a cloud-based service. Or, perhaps, build a system from the ground up on a cloud platform.
In any event, you need a few things to ensure success, including a good architecture, a deployment plan, and a sound data integration strategy.
Those who understand data integration and the supporting technology are in high demand. Why? The need to create data synergy within enterprises, among both traditional and cloud computing-based data and applications, is at an inflection point. This is due to a more clearly understood business benefit around the value of data integration.
The ability to have information arrive on time and when needed has been a fundamental need of IT since I wrote the EAI book over a decade ago. However, in the last few years, systems have become more complex, including the complexity of the data that exists within them. In response, data integration grew more complex, the technology more sophisticated, and the demand for data integration talent greater.
Because there is real need! Period! Even after investing heavily in agile architecture approaches such as SOA, IT organizations are finding it extremely challenging to solve complex data integration issues.
If you are involved in defining or re-defining a data architecture to enable composite applications and portals to effectively leverage data in an SOA, here is where the discussions are happening.
If you are looking to enhance your existing data architecture to ensure that business intelligence reports can quickly leverage data that is not in your data warehouse, you will find your answers here.
A couple of weeks back I posted on the shortcomings of the application approach to multidomain MDM, so this week let’s take a look at the many reasons why the platform approach is the superior alternative for effective multidomain MDM. The primary technological difference between the two approaches is that MDM “applications” typically employ a predefined data model, business logic, and a dedicated graphical user interface (GUI) tied to solving a single business problem, whereas platform-based MDM lets users create and use flexible data models, configure the platform to suit any business logic, and gain visibility across any number of business processes via a single user interface.
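The "flexible data model" point can be made concrete with a toy sketch: in the platform approach, the data model is configuration rather than code, so a new master-data domain is a configuration change, not a new application. The entity and field names below are hypothetical, and this is not any vendor’s actual API.

```python
# Illustrative sketch of the platform idea: the data model is
# configuration, not code. Entity and field names are hypothetical.

DATA_MODEL = {
    "customer": {"required": ["id", "name"], "optional": ["email"]},
    "product":  {"required": ["sku", "description"], "optional": ["brand"]},
}

def validate(entity_type, record):
    """Validate a record against the configured model for its entity type."""
    model = DATA_MODEL[entity_type]
    missing = [f for f in model["required"] if f not in record]
    unknown = [f for f in record
               if f not in model["required"] + model["optional"]]
    return not missing and not unknown

# Adding a new domain is a configuration change, not a new application:
DATA_MODEL["supplier"] = {"required": ["id"], "optional": ["region"]}

print(validate("customer", {"id": 1, "name": "Acme"}))  # True
print(validate("supplier", {"region": "EMEA"}))         # False: missing id
```

An MDM "application", by contrast, would bake the customer model into its code and screens, leaving no room for the supplier domain at all.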
I’m looking forward to doing a Webinar on data virtualization this Thursday, April 22nd. Why? Because this is the single most beneficial concept in architecture, including SOA, and it’s often overlooked by the rank-and-file developers and architects out there. I’m constantly evangelizing the benefits of data virtualization, including integrating data from many different data sources in real time, and enabling query-based applications to get data from multiple systems.
The idea is pretty simple, really. Considering that there are many physical database schemas within most enterprises, and typically no common view of the data, data virtualization allows you to map many physical schemas to virtual schemas that are a better representation of the business. For example, a single view of customer data, sales data, and other data that has the same logical meaning, but may be scattered amongst many different physical database systems, using any number of implementation models.
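The mapping just described can be sketched in a few lines: two hypothetical physical schemas (a CRM table and an ERP table) exposed through one virtual "single view of customer". SQLite stands in for the real sources here; a data virtualization product would federate live systems rather than one local database, so treat this purely as an illustration of the schema mapping.

```python
import sqlite3

# Two hypothetical physical schemas, mapped to one virtual schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE crm_customers (cust_id INTEGER, full_name TEXT);
    CREATE TABLE erp_accounts  (acct_no  INTEGER, acct_name TEXT);
    INSERT INTO crm_customers VALUES (1, 'Ada Lovelace');
    INSERT INTO erp_accounts  VALUES (2, 'Grace Hopper');

    -- The virtual schema: applications query this view, not the
    -- underlying physical tables with their differing column names.
    CREATE VIEW v_customer AS
        SELECT cust_id AS customer_id, full_name AS customer_name
          FROM crm_customers
        UNION ALL
        SELECT acct_no, acct_name FROM erp_accounts;
""")
rows = conn.execute("SELECT * FROM v_customer ORDER BY customer_id").fetchall()
print(rows)  # [(1, 'Ada Lovelace'), (2, 'Grace Hopper')]
```

The point is that `v_customer` is a better representation of the business than either physical table, and consumers never need to know how many systems sit behind it.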
Loraine Lawson did a great job covering the topic of the integration challenges around the cloud and virtualization. She reports that “…a recent Internet Evolution column [by David Vellante] looks more broadly at the cloud integration question and concludes that insufficient integration is holding up both cloud computing and virtualization.”
In fact, what currently limits the number of cloud deployments is the lack of a clear understanding of data integration in the context of cloud computing. This is a rather easy problem to solve, but it’s often an afterthought.
The core issue is that cloud computing providers, other than Salesforce.com, don’t consider integration. Perhaps they are thinking, “If you use our cloud, then there is no reason to sync your data back to your enterprise. After all, we’re the final destination for your enterprise data, right?” Wrong.