Category Archives: Architects

Speed: The #1 IT Challenge

Speed is the top challenge facing IT today, and it’s reaching crisis proportions at many organizations.  Specifically, IT needs to deliver business value at the speed that the business requires.

The challenge does not end there: this has to be accomplished without compromising cost or quality. Many people argue that you only get two out of three on the speed/cost/quality triangle, but I believe that achieving all three is the central challenge facing Enterprise Architects today. Many people I talk to are looking at agile technologies, and in particular at Agile Data Integration.

There have been a lot of articles written about the challenges, but it’s not all doom and gloom. Here is something you can do right now to dramatically increase the speed of your project delivery while improving cost and quality at the same time: take a fresh look at your Agile Data Integration environment, and specifically at Data Virtualization. Data Virtualization offers the opportunity to simplify and speed up the data part of enterprise projects, and data work is where more and more projects now spend 40% or more of their time. For more information and an industry perspective, you can download the Forrester Wave report for Data Virtualization, Q1 2015.

Here is a quick example of how you can use Data Virtualization technology for rapid prototyping to speed up business value delivery:

  • Use data virtualization technology to present a common view of your data to your business-IT project teams.
  • IT and the business can collaborate in real time to access and manage data from a wide variety of very large data sources, eliminating the long, slow cycles of passing specifications back and forth between business and IT.
  • Your teams can discover, profile, and manage data using a single virtual interface that hides the complexity of the underlying data.
  • By working with a virtualization layer, you are assured that your teams are using the right data, data that can be verified by linking it to a Business Glossary with clear terms, definitions, owners, and business context to reduce the chance of misunderstandings and errors.
  • Leading offerings in this space include data quality and data masking tools in the interface, ensuring that you improve data quality in the process.
  • Data virtualization means that your teams can deliver in days rather than months, and faster delivery means lower cost (a toy illustration of the idea follows this list).
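
To make the pattern concrete, here is a minimal, hypothetical sketch in Python of what a virtualization layer does: expose one logical view over two physical sources without first copying the data into a warehouse. The sources, tables, and fields are invented for illustration; a commercial data virtualization product does this declaratively, at enterprise scale, with governance, quality, and masking built in.

```python
import csv
import io
import sqlite3

# Physical source 1: an operational database (an in-memory SQLite stands in).
crm = sqlite3.connect(":memory:")
crm.execute("CREATE TABLE customers (id INTEGER, name TEXT, region TEXT)")
crm.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                [(1, "Acme", "EMEA"), (2, "Globex", "AMER")])

# Physical source 2: a flat-file extract from a billing system.
billing_csv = io.StringIO("customer_id,balance\n1,1200.50\n2,89.95\n")

def customer_balances():
    """The 'virtual view': one logical interface that joins both sources
    on the fly, hiding where and how the data is physically stored."""
    balances = {int(r["customer_id"]): float(r["balance"])
                for r in csv.DictReader(billing_csv)}
    for cid, name, region in crm.execute("SELECT id, name, region FROM customers"):
        yield {"name": name, "region": region, "balance": balances.get(cid)}

for row in customer_balances():
    print(row)
```

The consumer of customer_balances() never sees two systems, only one logical view; that is the property that lets business and IT prototype against shared data in days.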

There has been a lot of interest in agile development, especially as it relates to data projects.  Data Virtualization is a key tool to accelerate your team in this direction.

Informatica has a leading position in the Forrester report due to the productivity of its Agile Data Integration environment, but also because of its integration with the rest of the Informatica platform. From an architect’s point of view, it is critical to start standardizing on an enterprise data management platform. Continuing data and data-tool fragmentation will only slow down future project delivery. The best way to deal with the growing complexity of both data and tools is to drive standardization within your organization.


Next Generation Planning for Agile Business Transformation

This is an age of technology disruption and digitization. Winners will be those organizations that can adapt quickly and drive business transformation on an ongoing basis.

When I first met John Schmidt, Vice President of Global Integration Services at Informatica, he asked me to visualize business transformation through a modern tool like the internet and Google Maps: “Planning a road trip from New York to San Francisco with a number of stops along the way to visit friends or see some sights takes just minutes. Say you’re halfway through the trip and a friend calls to say he has suddenly been called out of town; you get on your mobile phone, and within a few minutes you have a new roadmap and a new plan.”

So, why is it that creating a roadmap for an enterprise initiative takes months or even years, and that once such a plan is developed, it is nearly impossible to change even when new information or external events invalidate it? A single transformation is useful, but what you really want is the ability to transform your business on an ongoing basis. You need to be agile in planning the transformation initiative itself. Is it even feasible for the planning of complex enterprise initiatives to approach the speed and agility of cross-country road-trip planning?

The short answer is YES; you can get much faster if you do three things:

First, throw out old notions of how planning in complex corporate environments is done, while keeping in mind that planning an enterprise transformation is fundamentally different from planning a focused departmental initiative.

Second, invest in tools equivalent to Google Maps for building the enterprise roadmap. Google Maps works because it leverages a database of information about roads, rules of the road, related local services, and points of interest. In short, build the Google Map of the enterprise, which is not as onerous as it sounds.

Third, develop a team of Enterprise Architects and planners with the skills and discipline to use the BOST™ Framework to maintain the underlying reference data about the business, its operations, the systems that support it, and the technologies that they are based on. This will provide the execution framework for your organization to deliver the data to fuel your business initiatives and digital strategy.

The result is closer alignment of your business and IT organizations, fewer errors due to communication issues, and, because your business plans are linked directly to the underlying technical implementation, quicker delivery of business value.

This is not some “pie in the sky” theory or a futuristic dream. What you need is a tool like Google Maps for business transformation. That tool is the BOST™ Toolkit, which leverages the BOST™ Framework: through models, elements, and associated relationships built around an underlying metamodel, it interprets enterprise processes using a four-dimensional view driven by business, operations, systems, and technology. Informatica, in collaboration with certified partners, built the BOST™ Framework. It provides an architecture-led planning approach to business transformation.

Benefits of Architecture-led Planning

The Architecture-led Planning approach is effective when applied with governance and oversight. The following four features describe the benefits:

Enablement of Business and IT Collaboration – Uses a common reference model to facilitate cross-functional business alignment, as well as alignment between business and IT. The model gets everyone on the same page, regardless of line of business, location, or IT function. This model explicitly and dynamically starts with business strategy and links from there to the technical implementation.

Data-driven Planning – Capturing planning data in a structured repository enables rapid planning and makes the plan dynamic and adaptable to changing circumstances. When the plan changes, rather than updating dozens of documents, you simply apply the change to the relevant components in the enterprise model repository, and all business and technical model views that reference those components update automatically (the toy sketch below illustrates the mechanism).
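
Here is a minimal, hypothetical sketch of that mechanism in Python. The component and view names are invented; a real repository such as the BOST™ model store is far richer, but the propagation principle is the same: views reference shared components instead of copying them.

```python
# Toy model repository: each enterprise component is stored exactly once.
# Component and field names are invented for illustration.
components = {
    "order-capture": {"owner": "Sales Ops", "system": "CRM v2"},
    "billing":       {"owner": "Finance",   "system": "ERP"},
}

# Two planning views that reference components by key rather than copying them.
roadmap_2016  = ["order-capture", "billing"]
risk_register = ["billing"]

def render(view):
    """Resolve a view against the repository at read time."""
    return [(name, components[name]["system"]) for name in view]

# Change the underlying component once...
components["billing"]["system"] = "ERP (cloud)"

# ...and every view that references it reflects the change automatically.
print(render(roadmap_2016))   # [('order-capture', 'CRM v2'), ('billing', 'ERP (cloud)')]
print(render(risk_register))  # [('billing', 'ERP (cloud)')]
```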

Cross-Functional Decision Making – Cross-functional decision-making is facilitated in several ways. First, by showing the interdependencies between functions, business operations, and systems, the holistic view helps each department or team understand the big picture and its role in the overall process. Second, the future-state architectural models are based on a view of how business operations will change. This provides the foundation to determine the business value of the initiative, measure your progress, and ultimately report the achievement of the goals. Quantifiable metrics help decision makers look beyond subjective perspectives and agree on fact-based success metrics.

Reduced Execution Risk – Reduced execution risk results from having a robust and holistic plan based on a rigorous analysis of all the dependent enterprise components in the business, operations, systems, and technology views. Risk is further reduced by an effective governance discipline, from both a program-management and an architectural-change perspective.

Business Transformation with Informatica

Integrated Program Planning is for organizations that need assistance with large or complex change management. Examples of candidates for Integrated Program Planning include:

Enterprise Initiatives: Large-scale mergers or acquisitions, switching from a product-centric operating model to more customer-centric operations, restructuring channel or supplier relationships, rationalizing the company’s product or service portfolio, or streamlining end-to-end processes such as order-to-cash, procure-to-pay, hire-to-retire or customer on-boarding.

Top-level Directives: Examples include board-mandated data governance, regulatory compliance initiatives that have broad organizational impacts such as data privacy or security, or risk management initiatives.

Expanding Departmental Solutions into Enterprise Solutions: Successful solutions in specific business areas can often be scaled-up to become cross-functional enterprise-wide initiatives. For example, expanding a successful customer master data initiative in marketing to an enterprise-wide Customer Information Management solution used by sales, product development, and customer service for an Omni-channel customer experience.


The BOST™ Framework identifies and defines enterprise capabilities. These capabilities are modularized as reconfigurable and scalable business services, independent of organizational silos and politics, which gives strategists, architects, and planners the means to drive high performance across the enterprise regardless of the shifting set of strategic business drivers.

The BOST™ Toolkit facilitates building and implementing new or improved capabilities, adjusting business volumes, and integrating with new partners or acquisitions through common views of these building blocks and through reusing solution components. In other words: better, faster, cheaper projects.

The BOST View creates a visual understanding of the relationship between business functions, data, and systems. It helps with the identification of relevant operational capabilities and underlying support systems that need to change in order to achieve the organization’s strategic objectives. The result will be a more flexible business process with greater visibility and the ability to adjust to change without error.


What’s Driving Core Banking Modernization?

When’s the last time you visited your local bank branch and spoke to a human being? How about talking to your banker over the phone? Can’t remember? Well, you’re not alone, and don’t worry, it’s not a bad thing. The days of operating physical branches with expensive staff to greet and serve customers are giving way to more modern, customer-friendly mobile banking applications that let consumers deposit checks from their phones, apply for a mortgage and sign closing documents electronically, and skip the trip to the ATM entirely by paying with mobile payment solutions like Apple Pay. In fact, a new report titled ‘Bricks + Clicks: Building the Digital Branch,’ from Jeanne Capachin and Jim Marous, takes an in-depth look at how banks and credit unions are changing their branch and customer channel strategies to meet the demands of today’s digital banking customer.

Why am I talking about this? These market trends are dominating the CEO and CIO agenda in today’s banking industry. I just returned from the 2015 IDC Asian Financial Congress in Singapore, where the digital journey for the next-generation bank was a major agenda item. According to IDC Financial Insights, global banks will invest $31.5B USD in core banking modernization to enable these services, improve operational efficiency, and position themselves to better compete on technology and convenience across markets. Core banking modernization initiatives are complex, costly, and fraught with risks. Let’s take a closer look.


Top 5 Big Data Mistakes

I won’t say I’ve seen it all; I’ve only scratched the surface in the past 15 years. Below are some of the mistakes I’ve made or fixed during this time.

MongoDB as your Big Data platform

Ask yourself: why am I picking on MongoDB? Because it is the most-abused NoSQL database at this point. While Mongo has an aggregation framework that tastes like MapReduce, and even a (very poorly documented) Hadoop connector, its sweet spot is as an operational database, not an analytical system.
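
For context, here is what that aggregation framework looks like, in a hedged PyMongo sketch (the connection, collection, and field names are invented, and a local MongoDB is assumed). It is excellent for operational roll-ups; it is not a general-purpose analytical platform.

```python
from pymongo import MongoClient

# Hypothetical connection and collection; names are for illustration only.
client = MongoClient("mongodb://localhost:27017")
orders = client.shop.orders

# $match and $group stages play the roles of map and reduce: fine for
# operational roll-ups, not a substitute for a full analytics stack.
pipeline = [
    {"$match": {"status": "shipped"}},
    {"$group": {"_id": "$region", "revenue": {"$sum": "$total"}}},
    {"$sort": {"revenue": -1}},
]
for row in orders.aggregate(pipeline):
    print(row["_id"], row["revenue"])
```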

RDBMS schema as files

You dumped each table from your RDBMS into a file and stored it on HDFS, and you now plan to use Hive on it. You know that Hive is slower than an RDBMS; it will use MapReduce even for a simple select. Next, let’s look at file sizes: you have flat files measured in single-digit kilobytes.

Hadoop does best on large sets of relatively flat data. I’m sure you can create an extract that’s more de-normalized.
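
For example, here is a hedged PySpark sketch of that kind of extract (the table names and paths are invented): join the normalized dumps into one wide fact set, and write a small number of large files rather than thousands of kilobyte-sized ones.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("flatten-extract").getOrCreate()

# Hypothetical staged table dumps, e.g. landed by Sqoop or a file transfer.
orders     = spark.read.parquet("/staging/orders")
customers  = spark.read.parquet("/staging/customers")
line_items = spark.read.parquet("/staging/line_items")

# De-normalize into one wide, flat record per line item -- the shape
# that Hadoop scans efficiently.
flat = (line_items
        .join(orders, "order_id")
        .join(customers, "customer_id"))

# Coalesce to a few large files instead of thousands of tiny ones;
# HDFS and MapReduce both strongly prefer big blocks.
flat.coalesce(16).write.mode("overwrite").parquet("/warehouse/order_facts")
```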

Data Ponds

Instead of creating a single Data Lake, you created a series of data ponds or a data swamp. Conway’s law has struck again; your business groups have created their own mini-repositories and data analysis processes. That doesn’t sound bad at first, but with different extracts and ways of slicing and dicing the data, you end up with different views of the data, i.e., different answers for some of the same questions.

Schema-on-read doesn’t mean, “Don’t plan at all,” but it means “Don’t plan for every question you might ask.”

Missing use cases

Vendors, seeking to escape the constraints of departmental funding, are selling the idea of the data lake. The byproduct is that the business loses sight of real use cases. The data-lake approach can be valid, but you won’t get much out of it if you don’t have actual use cases in mind.

It isn’t hard to come up with use cases, yet they are almost always an afterthought. The business should start thinking about use cases at the point its databases can no longer handle the load.

SQL

You like SQL. But query languages and techniques have changed with time. Today, think of Pig as PL/SQL on steroids, with maybe a touch of acid.

To do a larger piece of analytics, you may need a bigger tool set, one that may include Hive, Pig, MapReduce, R, and more.

Twitter @bigdatabeat


The Sexiest Job of the 21st Century

I’ve spent most of my career working with new technology, most recently helping companies make sense of mountains of incoming data. This means, as I like to tell people, that I have the sexiest job of the 21st century.

Harvard Business Review put the data scientist into the national spotlight with its article “Data Scientist: The Sexiest Job of the 21st Century.” Job-trends data from Indeed.com confirms the position’s rise in popularity, showing that the number of postings for data scientist roles has increased by 15,000%.

In the meantime, the role of data scientist has changed dramatically. Data used to reside on the fringes of the operation. It was usually important but seldom vital – a dreary task reserved for the geekiest of the geeks. It supported every function but never seemed to lead them. Even the executives who respected it never quite absorbed it.

For every Big Data problem, the solution often rests on the shoulders of a data scientist. The role is similar in responsibility to the Wall Street “quants” of the ’80s and ’90s; now, these data experts are tasked with managing data previously thought too hard to handle and too unstructured to yield any value.

So, is it the sexiest job of the 21st Century?

Think of a data scientist as a business analyst-plus: part mathematician, part business strategist, these statistical savants are able to apply their background in mathematics to help companies tame their data dragons. But these individuals aren’t just math geeks.

A data scientist is somebody who is inquisitive, who can stare at data and spot trends: almost a renaissance individual who really wants to learn and to bring change to an organization.

If this sounds like you, the good news is that demand for data scientists far outstrips supply. Still, given the rising popularity of the role, and of the companies hiring for it, you have to be at the top of your field to get these jobs.

Companies look to build teams around data scientists who ask the right questions about:

  • How the business works
  • How it collects its data
  • How it intends to use this data
  • What it hopes to achieve from these analyses

These questions are important because data scientists often unearth information that can “reshape an entire company.” Obtaining a better understanding of the business’s underpinnings not only directs the data scientist’s research but also helps them present their findings to, and communicate with, the less analytical executives in the organization.

While it’s important to understand your own business, learning from the successes of other corporations will help a data scientist in their current job, and in the next.

Twitter @bigdatabeat


Stop Trying to Manage Data Growth!(?)

Talking to architects about analytics at a recent event, I kept hearing the familiar theme: data scientists are spending 80% of their time on “data wrangling,” leaving only 20% for delivering the business insights that drive the company’s innovation. It was clear to everybody I spoke to that the situation will only worsen. The growth everybody sees coming in data volume and complexity will only lengthen the time to value.

Gartner recently predicted that:

“by 2015, 50% of organizations will give up on managing growth and will redirect funds to improve classification and analytics.”*

Some of the details of this study are interesting.  In the end, many organizations are coming to two conclusions:

  • It’s risky to delete data, so they keep it around as insurance.
  • All data has potential business value, so more organizations are keeping it around for potential analytical purposes.

The other mega-trend here is that more and more organizations are looking to compete on analytics – and they need data to do it, both internal data and external data.

From an architect’s perspective, here are several observations:

  • The floodgates are open and analytics is a top priority. Given that, the emphasis should be on architecting to manage the dramatic increases in both data quantity and data complexity rather than on trying to stop it.
  • The immediate architectural priority has to be on simplifying and streamlining your current enterprise data architecture. Break down those data silos and standardize your enterprise data management tools and processes as much as possible.  As discussed in other blogs, data integration is becoming the biggest bottleneck to business value delivery in your environment. Gartner has projected that “by 2018, more than half the cost of implementing new large systems will be spent on integration.”  The more standardized your enterprise data management architecture is, the more efficient it will be.
  • With each new data type, new data tool (Hive, Pig, etc.), and new data storage technology (Hadoop, NoSQL, etc.) ask first if your existing enterprise data management tools can handle the task before people go out and create a new “data silo” based on the cool, new technologies. Sometimes it will be necessary, but not always.
  • The focus needs to be on speeding value delivery for the business. And the key bottleneck is highly likely to be your enterprise data architecture.

Rather than focusing on stopping data growth, the priority should be on managing it in the most standardized and efficient way possible. It is time to think about enterprise data management as a function with standard processes, skills, and tools (just like Finance, Marketing, or Procurement).

Several of our leading customers have built or are building a central “Data as a Service” platform within their organizations.  This is a single, central place where all developers and analysts can go to get trustworthy data that is managed by IT through a standard architecture and served up for use by all.
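
At toy scale, the idea looks something like the following hedged sketch: one governed, versioned endpoint that every developer and analyst calls, instead of each team cutting its own extract. The endpoint, port, and hard-coded sample data are invented for illustration; a real platform would serve curated data from the enterprise data architecture.

```python
from flask import Flask, jsonify

app = Flask(__name__)

# In a real platform this would be curated, governed data served from the
# enterprise data architecture; a hard-coded sample stands in here.
CUSTOMERS = [
    {"id": 1, "name": "Acme", "region": "EMEA"},
    {"id": 2, "name": "Globex", "region": "AMER"},
]

@app.route("/v1/customers")
def customers():
    """One central, versioned endpoint every developer and analyst uses,
    instead of each team pulling its own extract."""
    return jsonify({"customers": CUSTOMERS})

if __name__ == "__main__":
    app.run(port=8080)
```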

For more information, see “The Big Big Data Workbook.”

*Gartner Predicts 2015: Managing ‘Data Lakes’ of Unprecedented Enormity, December 2014  http://www.gartner.com/document/2934417#


Data Streams, Data Lakes, Data Reservoirs, and Other Large Data Bodies

A data lake is a simple concept: a catchment area for data entering the organization. In the past, most businesses didn’t need to organize such a data store because almost all of their data was internal. It traveled via traditional ETL mechanisms from transactional systems to a data warehouse and was then sprayed around the business as required.

When a good deal of data comes from external sources, or even from internal sources like log files that never previously made it into the data warehouse, there is a need for an “operational data store.” This has definitely become the premier application for Hadoop, and it makes perfect sense to me that such technology be used for a data catchment area. The neat things about Hadoop for this application are that:

  1. It scales out “as far as the eye can see,” so there is little risk of it being unable to manage the data volumes, even when they grow beyond the petabyte level.
  2. It is schema-on-read, which means that you don’t need to expend much effort in modeling data when you decide to accommodate a new data source. You just land the raw records and define the metadata at leisure (see the sketch after this list).
  3. The cost of the software and the storage is very low.
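
Here is a hedged sketch of that schema-on-read pattern in PySpark (the paths and field names are invented): land the raw records now, and impose structure only when a question arrives.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("catchment").getOrCreate()

# Ingest: land raw clickstream events exactly as they arrive, organized
# only by source and date -- no up-front modeling. (Paths are invented.)
raw = spark.read.text("/landing/clickstream/2015-03-01/*.json")
raw.write.mode("append").text("/reservoir/clickstream/dt=2015-03-01")

# Later, when a real question arrives, impose a schema at read time
# (schema-on-read), and only for the fields that question needs.
events = spark.read.json("/reservoir/clickstream/dt=2015-03-01")
events.select("user_id", "page", "ts").where("page IS NOT NULL").show()
```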

So let’s imagine that we need a data catchment area because we have decided to collect data from log files, mobile devices, social networks, public data sources, or whatever. Let us also imagine that we have implemented Hadoop and some of its useful components, and we have begun to collect data.

Is it reasonable to describe this as a data lake?

A Hadoop implementation should not be a set of servers randomly placed at the confluence of various data flows. The placement needs to be carefully considered, and if the implementation is to resemble a “data lake” in any way, it must be a well-engineered, man-made lake. Since the data doesn’t just sit there until it evaporates, but eventually flows on to various applications, we should think of this as a “data reservoir” rather than a “data lake.”

There is no point in arranging all that data neatly along the aisles, because we may not know what we want to do with it at the time we receive it. We should organize the data once we do know.

Another reason we should think of this as more like a reservoir than a lake is that we might like to purify the data a little before sending it down the pipes to applications or users that want to use it.

Twitter @bigdatabeat


Three Things Every Architect Should Do for 2015

The start of the year is a great time to refresh and take a new look at your capabilities, goals, and plans for your future-state architecture. That said, keep in mind that the scarcest resource in your architecture effort is probably your own personal time.

Looking forward, here are three things that I recommend every architect do. I realize that all three relate to data, but as I said in the eBook Think “Data First” to Drive Business Value, we believe that data is the key bottleneck in your enterprise architecture, slowing the delivery of the business initiatives that support your organization’s business strategy.

So, here are the recommendations. None of them will cost you anything if you are a current Informatica PowerCenter customer, and #2 and #3 are free regardless. It is only a matter of your time:

1. Take a look at the current Informatica Cloud offering and in particular the templating capabilities.

Informatica Cloud is probably much more capable than you think. The standard templating functionality supports very complex use cases, and does it all from an easy-to-use, no-coding user interface. It comes with a strong library of integration stubs that can be dragged and dropped into Microsoft Visio to create complex integrations. Once the flow is designed in Visio, it can be easily imported into Informatica Cloud, and from there users have a wizard-driven UI to do the final customization of sources, targets, mappings, transformations, filters, and so on. It is all very powerful and easy to use.


Why This Matters to Architects

  • You will see how easy it is for new groups to get going with fairly complex integrations.
  • This is a great tool for departmental or new user use, and it will be completely compatible with the rest of your Informatica architecture – not another technology silo for you to manage.
  • Any mapping created for Informatica on-premise can also run on the cloud version.

2. Download Informatica Rev and understand what it can do for your analysts and “data wranglers.”

Your data analysts are spending 80% of their time managing their data and only 20% on the actual analysis they are trying to provide.  Informatica Rev is a great way to prepare your data before use in analytics tools such as Qlik, Tableau, and others.

With Informatica Rev, people who are not data experts can access, mash up, prototype, and cleanse their data, all in a user interface that looks like a spreadsheet and requires no previous experience with data tools.


Why This Matters for Architects

  • Your data analysts are going to use analytics tools with or without the help of IT. This enables you to help them while ensuring that they are managing their data well and optimizing their productivity.
  • This tool will also enable them to share their “data recipes” and for IT to be involved in how they access and use the organization’s data.

 

3. Look at the new features in PowerCenter 9.6. First, upgrade to 9.6 if you haven’t already, then take a good look at these capabilities, which are bundled in every version. Many people we talk to have 9.6 but don’t realize the power of what they already own.

  1. Profiling: Discover and analyze your data quickly.  Find relationships and data issues.
  2. Data Services: This presents any JDBC or ODBC repository as a logical data object. From there, you can rapidly prototype new applications against these logical objects without worrying about the complexities of the underlying repositories. It can also cleanse data on the fly (see the sketch below).
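
Because the logical data objects are exposed over standard JDBC/ODBC, any client can prototype against them. Here is a hedged Python sketch using pyodbc; the DSN and object name are invented for illustration.

```python
import pyodbc

# Hypothetical DSN pointing at a data-services endpoint that exposes
# logical data objects as ordinary relational tables.
conn = pyodbc.connect("DSN=EnterpriseDataServices")

# Query the logical object as if it were a single table; the virtual
# layer resolves the underlying repositories (and any on-the-fly cleansing).
cursor = conn.cursor()
cursor.execute("SELECT customer_id, name, region FROM lv_customer")
for row in cursor.fetchmany(10):
    print(row)
```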


Why This Matters for Architects

  • The key challenge for IT and for Architects is to be able to deliver at the “speed of business.” These tools can dramatically improve the productivity of your team and speed the delivery of projects for your business “customers.”

Taking the time to understand what these tools can do, both to increase the productivity of your IT team and to enable your end users to self-serve, will make you a better business partner overall and increase your influence across the organization. Have a great year!


Stop Asking Your IT for a Roadmap

Knowing that business trends and needs change frequently, why do we still plan multi-year, IT-driven roadmaps?

Understandably, IT managers have honed their skills at working with the lines of business to predict their needs. They have learned to spend money and time wisely and to have the right infrastructure in place to meet the business’s needs, whether that is launching in a new market, implementing a new technology, or any of the other areas where IT can help the firm find a competitive advantage.

Not so long ago, IT was so complex and unwieldy that it needed specially trained professionals to source, build, and run almost every aspect of it. When line managers had scant understanding of which technology would suit their activities best, making a plan based on long-term business goals was a sensible approach.

Today we talk of IT as a utility: just like electricity, you press a button and IT turns “on.” That is not really the case. But the extent to which IT has saturated day-to-day business life means that line managers are now better placed to determine how technology should be used to achieve the company’s objectives.

In the next five years, the economic climate will change, customer preferences will shift, and new competitors will threaten the business. Innovations in technology will provide new opportunities to explore, and new leadership could send the firm in a new direction. While most organizations have long-term growth targets, their strategies constantly evolve.

This new scenario has caused those in the enterprise architecture (EA) function to ask whether long-term road mapping is still a valuable investment.

EAs admit that long-term, IT-led road mapping is no longer feasible. If the business does not have a detailed and stable five-year plan, these architects argue, how can IT develop a technology roadmap to help achieve it? At best, creating long-term roadmaps is a wasted effort, a never-ending cycle of updates and revisions.

Without a long-range vision of business technology demand, IT has started to focus purely on the supply side. These architects focus on existing systems, identifying ways to reduce redundancies or improve flexibility. However, without a clear connection to business plans, they struggle to secure funding to make their plans a reality.

These architects have turned their focus to the near term, trying to influence the small decisions made every day in their organizations. They believe they can have greater impact by serving as advisors to IT and business stakeholders, guiding them toward cost-efficient, enterprise-aligned technology decisions.

Rather than taking a top-down perspective and shaping architecture through one master plan, they work from the bottom up, encouraging more efficient ways of working by influencing the myriad technology decisions made each day.

Twitter @bigdatabeat
