Category Archives: Business/IT Collaboration

Internet of Things (IoT) Changes the Data Integration Game in 2015

As reported by the Economic Times, “In the coming years, enormous volumes of machine-generated data from the Internet of Things (IoT) will emerge. If exploited properly, this data – often dubbed machine or sensor data, and often seen as the next evolution in Big Data – can fuel a wide range of data-driven business process improvements across numerous industries.”

We can all see this happening in our personal lives.  Our thermostats are connected now, our cars have been for years, and even my toothbrush has a Bluetooth connection with my phone.  On the industrial side, devices have also been connected for years, generating megabytes of data per day that have typically been used for monitoring, with the data discarded as quickly as it appears.

So, what changed?  With the advent of big data and of cheap cloud and on-premises storage, we now have the ability to store the machine and sensor data spinning out of industrial machines, airliners, health diagnostic devices, and the like, and to leverage that data for new and valuable uses.

For example, consider the ability to determine the likelihood that a jet engine will fail, based upon the sensor data gathered and how that data compares with known patterns of failure.  Instead of getting an engine-failure light on the flight deck, the pilots can see that the engine has a 20 percent likelihood of failure, and get the engine serviced before it fails completely.
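
To make this concrete, here is a minimal sketch of how such a likelihood might be computed, comparing live sensor readings against known failure signatures.  This is purely illustrative, not any vendor’s actual algorithm; the sensor names, values, and scoring function are all hypothetical.

    import math

    # Known failure signatures: average sensor readings observed before past
    # failures.  All values (EGT in deg C, vibration in mm/s, oil pressure
    # in psi) are hypothetical.
    FAILURE_SIGNATURES = {
        "bearing_wear":     {"egt": 670.0, "vibration": 7.5, "oil_pressure": 28.0},
        "compressor_stall": {"egt": 710.0, "vibration": 9.0, "oil_pressure": 35.0},
    }

    def similarity(reading, signature):
        """Map the normalized distance between a live reading and a failure
        signature onto a 0..1 likelihood-style score."""
        dist = math.sqrt(sum(
            ((reading[k] - v) / v) ** 2 for k, v in signature.items()
        ))
        return 1.0 / (1.0 + dist)

    def failure_likelihood(reading):
        """Score the reading against every known pattern; report the closest match."""
        return max(
            ((name, similarity(reading, sig)) for name, sig in FAILURE_SIGNATURES.items()),
            key=lambda pair: pair[1],
        )

    live = {"egt": 655.0, "vibration": 6.9, "oil_pressure": 30.0}
    mode, score = failure_likelihood(live)
    print(f"closest failure mode: {mode}, likelihood {score:.0%}")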

The problem with all of this very cool stuff is that we need to once again rethink data integration.  Indeed, if the data can’t get from the machine sensors to a persistent data store for analysis, then none of this has a chance of working.

That’s why those who are moving to IoT-based systems need to do two things.  First, they must create a strategy for extracting data from devices, such as industrial robots or an Audi A8.  Second, they need a strategy to take all of this disparate data that’s firing out of devices at megabytes per second, and put it where it needs to go, in the right native structure (or in an unstructured data lake), so it can be leveraged in useful ways, and in real time.

The challenge is that machines and devices are not traditional IT systems.  I’ve built connectors for industrial applications in my career.  The fact is, you need to adapt to the way that machines and devices produce data, not the other way around.  Data integration technology needs to adapt as well, making sure that it can deal with streaming and unstructured data, including the many instances where the data needs to be processed in flight as it moves from the device to the database.
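
To illustrate what “processed in flight” can mean in practice, here is a minimal sketch that parses, filters, and enriches a stream of device readings on the way to the database.  The stream format and the db.insert() persistence call are hypothetical stand-ins; a real deployment would use a streaming engine or a data integration platform rather than hand-rolled code.

    import json
    import statistics
    from collections import deque

    def in_flight_pipeline(raw_stream, db, window_size=10, z_threshold=3.0):
        """Parse, filter, and enrich readings as they move from device to database."""
        window = deque(maxlen=window_size)       # rolling window for outlier checks
        for raw in raw_stream:                   # raw JSON lines from the device
            reading = json.loads(raw)
            value = reading["value"]
            # Drop obvious sensor spikes in flight instead of landing them in the store.
            if len(window) == window.maxlen:
                mean = statistics.mean(window)
                stdev = statistics.pstdev(window) or 1.0
                if abs(value - mean) / stdev > z_threshold:
                    continue
            window.append(value)
            # Enrich the record before it reaches its target structure.
            reading["rolling_mean"] = round(statistics.mean(window), 3)
            db.insert("sensor_readings", reading)  # hypothetical persistence call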

This becomes a huge opportunity for data integration providers who understand the special needs of IoT, as well as for the technology that those who build IoT-based systems can leverage.  However, the larger value is for those businesses that learn how to leverage IoT to provide better services to their customers by offering insights that were previously impossible.  Be it jet engine reliability, the fuel efficiency of my car, or feedback to my physician from sensors on my body, this is game-changing stuff.  And at the heart of it all is the ability to move data from place to place.


Informatica Supports New Custom ODBC/JDBC Drivers for Amazon Redshift

Informatica’s Redshift connector is a state-of-the-art bulk-load connector that allows users to perform all CRUD operations on Amazon Redshift.  It follows AWS best practices to load data at high throughput in a safe and secure manner, and it is available on Informatica Cloud and PowerCenter.

Today we are excited to announce support for Amazon’s newly launched custom JDBC and ODBC drivers for Redshift.  Both drivers are certified for Linux and Windows environments.

Informatica’s Redshift connector will package the JDBC 4.1 driver, which further enhances our metadata fetch capabilities for tables and views in Redshift and improves our overall design-time responsiveness by over 25%.  It also allows us to query multiple tables and views and retrieve the result set using primary and foreign key relationships.
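
Outside the Informatica stack, the same metadata is reachable with a few lines of code.  The sketch below uses the open-source jaydebeapi package against Amazon’s Redshift JDBC 4.1 driver; the cluster endpoint, credentials, and jar location are placeholders.

    import jaydebeapi  # pip install jaydebeapi

    # Endpoint, credentials, and jar path are placeholders; the driver class name
    # and URL scheme follow Amazon's Redshift JDBC 4.1 driver documentation.
    conn = jaydebeapi.connect(
        "com.amazon.redshift.jdbc41.Driver",
        "jdbc:redshift://examplecluster.example.us-west-2.redshift.amazonaws.com:5439/dev",
        {"user": "awsuser", "password": "secret"},
        "RedshiftJDBC41.jar",
    )
    cur = conn.cursor()
    # List tables and views the way a design-time metadata fetch would.
    cur.execute("""
        SELECT table_schema, table_name, table_type
        FROM information_schema.tables
        WHERE table_schema NOT IN ('pg_catalog', 'information_schema')
        ORDER BY 1, 2
    """)
    for schema, name, kind in cur.fetchall():
        print(f"{schema}.{name} ({kind})")
    cur.close()
    conn.close()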

Amazon’s ODBC driver enhances our full pushdown optimization capabilities on Redshift.  Key additions include support for the SYSDATE variable and for functions such as ADD_TO_DATE(), ASCII(), CONCAT(), LENGTH(), TO_DATE(), and VARIANCE(), which could not be pushed down before.

Amazon’s ODBC driver is not pre-packaged but can be directly downloaded from Amazon’s S3 store.

Once installed, the user can change the default ODBC System DSN in the ODBC Data Source Administrator.
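
For example, once the DSN is in place, any ODBC client can reach Redshift through it.  Here is a minimal sketch using Python’s pyodbc, with a placeholder DSN name and credentials:

    import pyodbc  # pip install pyodbc

    # DSN name and credentials are placeholders for whatever was configured
    # in the ODBC Data Source Administrator after installing Amazon's driver.
    conn = pyodbc.connect("DSN=AmazonRedshiftDSN;UID=awsuser;PWD=secret")
    cursor = conn.cursor()
    cursor.execute("SELECT SYSDATE")  # SYSDATE is one of the newly supported variables
    print(cursor.fetchone()[0])
    conn.close()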

To learn more, sign up for the free trial of Informatica’s Redshift connector for Informatica Cloud or PowerCenter.


How Organizations can Prepare for 2015 Data Privacy Legislation

The original article can be found at scmagazine.com.

On Jan. 13, the White House announced President Barack Obama’s proposal for new data privacy legislation, the Personal Data Notification and Protection Act.  Many states have laws today that require corporations and government agencies to notify consumers in the event of a breach – but that is not enough.  This new proposal aims to improve cybersecurity standards nationwide with the following tactics:

Enable cybersecurity information sharing between the private and public sectors.

Government agencies and corporations with a vested interest in protecting our information assets need a streamlined way to communicate and share threat information. This component of the proposed legislation incents organizations that participate in knowledge-sharing with targeted liability protection, as long as they are responsible for how they share, manage and retain privacy data.

Modernize the tools law enforcement has to combat cybercrime.
Existing laws, such as the Computer Fraud and Abuse Act, need to be updated to incorporate the latest cybercrime classifications while giving prosecutors the ability to target insiders with privileged access to sensitive and privacy data.  The proposal also specifically calls for prosecuting the sale of privacy data both nationally and internationally.

Standardize breach notification policies nationwide.
Many states have some sort of policy that requires notifying customers that their data has been compromised.  Three leading examples include California’s SB 1386, Florida’s Information Protection Act (FIPA) and Massachusetts’ Standards for the Protection of Personal Information of Residents of the Commonwealth.  New Mexico, Alabama and South Dakota have no data breach protection legislation.  Enforcing standardization and simplifying the requirement for companies to notify customers and employees when a breach occurs will ensure consistent protection no matter where you live or transact.

Invest in increasing cybersecurity skill sets.
For a number of years, security professionals have reported an ever-increasing skills gap in the cybersecurity profession.  In fact, in a recent Ponemon Institute report, 57 percent of respondents said a data breach incident could have been avoided if the organization had more skilled personnel with data security responsibilities. Increasingly, colleges and universities are adding cybersecurity curriculum and degrees to meet the demand. In support of this need, the proposed legislation mentions that the Department of Energy will provide $25 million in educational grants to Historically Black Colleges and Universities (HBCU) and two national labs to support a cybersecurity education consortium.

This proposal is clearly comprehensive, but it also raises the critical question: How can organizations prepare themselves for this privacy legislation?

The International Association of Privacy Professionals conducted a study of Federal Trade Commission (FTC) enforcement actions.  From the report, organizations can infer best practices implied by FTC enforcement and ensure these are covered by their organization’s security architecture, policies and practices:

  • Perform assessments to identify reasonably foreseeable risks to the security, integrity, and confidentiality of personal information collected and stored on the network, online or in paper files.
  • Implement limited-access policies to curb unnecessary security risks and to minimize the number and type of network access points that an information security team must monitor for potential violations.
  • Limit employee access to (and copying of) personal information, based on an employee’s role.
  • Implement and monitor compliance with policies and procedures for rendering information unreadable or otherwise secure in the course of disposal. Securely disposed information must not practicably be read or reconstructed.
  • Restrict third party access to personal information based on business need, for example, by restricting access based on IP address, granting temporary access privileges, or similar procedures.

The Personal Data Notification and Protection Act fills a void at the national level; most states have privacy laws, with California pioneering the movement with SB 1386.  However, enforcement at the state AG level has been uneven at best and absent at worst.

In preparing for this national legislation, organizations need to heed the policies derived from the FTC’s enforcement practices.  They can also track the progress of this legislation and look for agencies such as the National Institute of Standards and Technology to issue guidance.  Furthermore, organizations can encourage employees to take advantage of cybersecurity internship programs at nearby colleges and universities to avoid critical skills shortages.

With online security a clear priority for President Obama’s administration, it’s essential for organizations and consumers to understand upcoming legislation and learn the benefits/risks of sharing data. We’re looking forward to celebrating safeguarding data and enabling trust on Data Privacy Day, held annually on January 28, and hope that these tips will make 2015 your safest year yet.


Strata 2015 – Making Data Work for Everyone with Cloud Integration, Cloud Data Management and Cloud Machine Learning

Are you ready to answer “Yes” to the questions:

a) “Are you Cloud Ready?”
b) “Are you Machine Learning Ready?”

I meet with hundreds of Informatica Cloud customers and prospects every year. While they are investing in Cloud, and seeing the benefits, they also know that there is more innovation out there. They’re asking me, what’s next for Cloud? And specifically, what’s next for Informatica in regards to Cloud Data Integration and Cloud Data Management? I’ll share more about my response throughout this blog post.

The spotlight will be on Big Data and Cloud at the Strata + Hadoop World conference taking place in Silicon Valley from February 17-20, with the theme “Make Data Work”.  I want to focus this blog post on two topics related to making data work for business insights:

  • How existing cloud technologies, innovations and partnerships can help you get ready for the new era in cloud analytics.
  • How you can make data work in new and advanced ways for every user in your company.

Today, Informatica is announcing the availability of its Cloud Integration Secure Agent on Microsoft Azure and Linux Virtual Machines as well as an Informatica Cloud Connector for Microsoft Azure Storage. Users of Azure data services such as Azure HDInsight, Azure Machine Learning and Azure Data Factory can make their data work with access to the broadest set of data sources including on-premises applications, databases, cloud applications and social data. Read more from Microsoft about their news at Strata, including their relationship with Informatica, here.

“Informatica, a leader in data integration, provides a key solution with its Cloud Integration Secure Agent on Azure,” said Joseph Sirosh, Corporate Vice President, Machine Learning, Microsoft. “Today’s companies are looking to gain a competitive advantage by deriving key business insights from their largest and most complex data sets. With this collaboration, Microsoft Azure and Informatica Cloud provide a comprehensive portfolio of data services that deliver a broad set of advanced cloud analytics use cases for businesses in every industry.”

Even more exciting is how quickly any user can deploy a broad spectrum of data services for cloud analytics projects.  The fully managed cloud service for building predictive analytics solutions from Azure and the wizard-based, self-service cloud integration and data management user experience of Informatica Cloud help overcome the challenges most users have in making their data work effectively and efficiently for analytics use cases.

The new solution enables companies to bring in data from multiple sources for use in Azure data services including Azure HDInsight, Azure Machine Learning, Azure Data Factory and others – for advanced analytics.

The broad availability of Azure data services, and Azure Machine Learning in particular, is a game changer for startups and large enterprises. Startups can now access cloud-based advanced analytics with minimal cost and complexity and large businesses can use scalable cloud analytics and machine learning models to generate faster and more accurate insights from their Big Data sources.

Success in using machine learning requires not only great analytics models, but also an end-to-end cloud integration and data management capability that brings in a wide breadth of data sources, ensures that data quality and data views match the requirements for machine learning modeling, and offers an ease of use that facilitates speed of iteration while providing high-performance, scalable data processing.
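
As a generic illustration of that last point (scikit-learn here, not the Azure ML API), keeping the data quality steps and the model in a single pipeline is what makes fast, repeatable iteration possible.  The feature values below are invented.

    import numpy as np
    from sklearn.pipeline import Pipeline
    from sklearn.impute import SimpleImputer
    from sklearn.preprocessing import StandardScaler
    from sklearn.linear_model import LogisticRegression

    # Toy feature matrix with the gaps and mixed scales typical of multi-source data.
    X = np.array([[5.0, 200.0], [np.nan, 180.0], [6.5, np.nan], [7.0, 260.0]])
    y = np.array([0, 0, 1, 1])

    pipeline = Pipeline([
        ("impute", SimpleImputer(strategy="mean")),  # fill gaps the sources left behind
        ("scale", StandardScaler()),                 # put disparate sources on one scale
        ("model", LogisticRegression()),
    ])
    pipeline.fit(X, y)
    print(pipeline.predict([[6.0, 240.0]]))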

For example, the Informatica Cloud solution on Azure is designed to deliver on these critical requirements in a complementary approach, supporting advanced analytics and machine learning use cases that provide customers with key business insights from their largest and most complex data sets.

Using the Informatica Cloud Connector for Microsoft Azure Storage with Informatica Cloud Data Integration enables optimized read and write capabilities for data blobs in Azure Storage.  Customers can use Azure Storage objects as sources, lookups, and targets in data synchronization tasks and advanced mapping configuration tasks for efficient data management using Informatica’s industry-leading cloud integration solution.
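
The blobs themselves are ordinary Azure Storage objects.  As a sketch of the kind of data the connector reads and writes (shown here via Microsoft’s azure-storage-blob SDK rather than the connector itself, with a placeholder connection string):

    from azure.storage.blob import BlobServiceClient  # pip install azure-storage-blob

    # The connection string is a placeholder for a real storage account's credentials.
    service = BlobServiceClient.from_connection_string(
        "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=<key>"
    )
    container = service.get_container_client("staging")

    # Write a CSV extract to a blob, as a data synchronization task's target.
    container.upload_blob("orders/extract.csv", b"order_id,amount\n1,99.50\n", overwrite=True)

    # Read it back, as a source or lookup.
    data = container.download_blob("orders/extract.csv").readall()
    print(data.decode())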

As Informatica fulfills the promise of “making great data ready to use” to our 5,500 customers globally, we continue to form strategic partnerships and develop next-generation solutions to stay one step ahead of the market with our Cloud offerings.

My goal in 2015 is to help each of our customers say that they are Cloud Ready! And collaborating with solutions such as Azure ensures that our joint customers are also Machine Learning Ready!

To learn more, try our free Informatica Cloud trial for Microsoft Azure data services.


The Sexiest Job of the 21st Century

I’ve spent most of my career working with new technology, most recently helping companies make sense of mountains of incoming data.  This means, as I like to tell people, that I have the sexiest job of the 21st century.

Harvard Business Review put the data scientist into the national spotlight with its article Data Scientist: The Sexiest Job of the 21st Century.  Job trends data from Indeed.com confirms the rise in popularity of the position, showing that the number of job postings for data scientist positions increased by 15,000%.

In the meantime, the role of data scientist has changed dramatically. Data used to reside on the fringes of the operation. It was usually important but seldom vital – a dreary task reserved for the geekiest of the geeks. It supported every function but never seemed to lead them. Even the executives who respected it never quite absorbed it.

For every Big Data problem, the solution often rests on the shoulders of a data scientist.  The role of the data scientist is similar in responsibility to the Wall Street “quants” of the ’80s and ’90s – now, these data experts are tasked with managing databases previously thought too hard to handle and too unstructured to derive any value from.

So, is it the sexiest job of the 21st Century?

Think of a data scientist as a business analyst-plus: part mathematician, part business strategist, these statistical savants are able to apply their background in mathematics to help companies tame their data dragons.  But these individuals aren’t just math geeks, per se.

A data scientist is somebody who is inquisitive, who can stare at data and spot trends. It’s almost like a renaissance individual who really wants to learn and bring change to an organization.

If this sounds like you, the good news is that demand for data scientists is far outstripping supply.  Nonetheless, with the rising popularity of the data scientist – not to mention the companies that are hiring for these positions – you have to be at the top of your field to get the job.

Companies look to build teams around data scientists who ask the most questions about:

  • How the business works
  • How it collects its data
  • How it intends to use this data
  • What it hopes to achieve from these analyses

These questions are important because data scientists will often unearth information that can “reshape an entire company.”  Obtaining a better understanding of the business’s underpinnings not only directs the data scientist’s research, but also helps them present the findings and communicate with the less-analytical executives within the organization.

While it’s important to understand your own business, learning about the successes of other corporations will help a data scientist in their current job, and in the next.

Twitter @bigdatabeat


Lovenomics: The Price of Love This Valentine’s Day, Cash In on the Hopeless Romantics

This blog post was originally featured on Business.com here: Lovenomics: The Price of Love This Valentine’s Day.

After the Blue Cross sales that dominate January, Valentine’s Day offers welcome relief to the high street.  It marks the end of the Christmas sales and the first of the year’s seasonal hooks, providing retailers with an opportunity to upsell.  According to the National Retail Federation’s Valentine’s Day Consumer Spending Survey, American consumers plan to spend a total of $4.8 billion on jewelry and a survey-high of nearly $2 billion on clothing this year.  However, to successfully capture customers, retailers need to develop an omni-channel strategy designed to sell the right product.

Target the indecisive

For the most part, the majority of Valentine’s Day shoppers will be undecided when they begin their purchasing journey. Based on this assumption, a targeted sales approach at the point of interest (POI) and point of sale (POS) will be increasingly important. Not only do retailers need to track and understand the purchasing decision of every customer as they move between channels, but they also need to have a real-time view of the product lines, pricing and content that the competition is using. Once armed with this information, retailers can concentrate on delivering personalized ads or timely product placements that drive consumers to the checkout as they move across different channels.

Start with search

Consumers will start their shopping journey with a search engine and will rarely scroll past the first page. So brands need to be prepared by turning Valentine’s Day product lines into searchable content. To capture a greater share of online traffic, retailers should concentrate on making relevant products easy to find by managing meta-information, optimizing media assets with the keywords that consumers are using, deploying rich text and automatically sending products to search engines.

Next generation loyalty

Retailers and restaurants can now integrate loyalty schemes into specialized smartphone apps, or integrate customer communications to automatically deliver personalized ads (e.g., offers for last-minute gifts for those who forget).  However, to ensure success, brands need to know as much about their customers as consumers know about their products.  By being able to monitor customers’ behavior, the information that they are looking at and the channels that they are using to interact with brands, loyalty programs can be used to deliver timely special offers or information at the right moment.

Digital signage

Valentine’s Day represents an opportunity to reinvent the in-store experience. By introducing digital signage for special product promotions, retailers can showcase a wide range of eclectic merchandise to showroom consumers. This could be done by targeting any smartphone consumers (who have allowed geo-located ads on their phones) with a personalized text message when they enter the store. Use this message to direct them to the most relevant areas for Valentine’s Day gifts or present them with a customized offer based on previous buying history.

Quick fulfillment

Supermarkets have become established as the one-stop shop for lovers in a rush.  Last year, Tesco, a British multinational grocery and general merchandise retailer, revealed that 85 percent of all Valentine’s Day bouquets were bought on the day itself, with three-quarters of all Valentine’s Day chocolates sold on February 14.

To tap into the last-minute attitude of panicked couples searching for a gift, retailers should have a dedicated Valentine’s Day section online and provide timely offers that come with the promise of delivery in time for Valentine’s Day.  For example, BCBGMAXAZRIA is using data quality services to ensure its email list is clean and updated, keeping its sender reputation high so that when it needs to reach customers during critical times like Valentine’s Day, it has confidence in its data.

Alternatively, retailers can help customers by closely managing local inventory levels to offer same-day click-and-collect initiatives or showing consumers the number of items that are currently in-stock and in-store across all channels.

Valentine’s Day may seem like a minor holiday after Christmas, but for retailers it generates billions of dollars in annual spending and presents a tremendous opportunity to boost their customer base. With these tips, retailers will hopefully be able to sweeten their sales by effectively targeting customers looking for the perfect gift for their special someone.


What’s All the Mania Around SaaS Data?

It’s no secret that the explosion of software-as-a-service (SaaS) apps has revolutionized the way businesses operate. From humble beginnings, the titans of SaaS today include companies such as Salesforce.com, NetSuite, Marketo, and Workday that have gone public and attained multi-billion dollar valuations.  The success of these SaaS leaders has had a domino effect in adjacent areas of the cloud – infrastructure, databases, and analytics.

Amazon Web Services (AWS), which originally had only six services in 2006 with the launch of Amazon EC2, now has over 30, ranging from storage and relational databases to data warehousing and Big Data.  Salesforce.com’s Wave platform, Tableau Software, and Qlik have made great advances in the cloud analytics arena, giving better visibility to line-of-business users.  And as SaaS applications embrace new software design paradigms that extend their functionality, application performance monitoring (APM) analytics has emerged as a specialized field, with vendors such as New Relic and AppDynamics.

So, how exactly did the growth of SaaS contribute to these adjacent sectors taking off?

The growth of SaaS coincided with the growth of powerful smartphones and tablets.  Seeing this form factor as important to the end user, SaaS companies rushed to produce mobile apps that offered core functionality on mobile devices.  Measuring adoption of these mobile apps was necessary to ensure that future releases met all the needs of the end user.  Mobile apps generate a ton of information, such as app responsiveness, features utilized, and data consumed.  As always, there were several types of users, with some preferring a laptop form factor over a smartphone or tablet.  With the ever-increasing number of data points to measure within a SaaS app, the area of application performance monitoring analytics really took off.

Simultaneously, the growth of the SaaS titans cemented their reputation as not just applications for a certain line-of-business, but into full-fledged platforms. This growth emboldened a number of SaaS startups to develop apps that solved specialized or even vertical business problems in healthcare, warranty-and-repair, quote-to-cash, and banking. To get started quickly and scale rapidly, these startups leveraged AWS and its plethora of services.

The final sector that has taken off thanks to the growth of SaaS is the area of cloud analytics. SaaS grew by leaps and bounds because of its ease of use, and rapid deployment that could be achieved by business users. Cloud analytics aims to provide the same ease of use for business users when providing deep insights into data in an interactive manner.

In all these different sectors, what’s common is the fact that SaaS growth has created an uptick in the volume of data and the technologies that serve to make it easier to understand. During Informatica’s Data Mania event (March 4th, San Francisco) you’ll find several esteemed executives from Salesforce, Amazon, Adobe, Microsoft, Dun & Bradstreet, Qlik, Marketo, and AppDynamics talk about the importance of data in the world of SaaS.


Informatica University offers New Flexible Pricing for Private Training

Customers often inquire about the best way to get their teams up to speed on Informatica solutions.  The question Informatica University hears most frequently is whether a team should attend our publicly scheduled courses or hold a private training event.  The number of resources to be skilled on the products will help determine which option to choose.  If your team, or multiple teams within your company, has seven or more resources that need to get up to speed on Informatica products, then a private training event is the recommended choice.

The break-even point is seven (7) attendees for a remote instructor and nine (9) for an onsite instructor; at or above those numbers, a private training event is the most cost-efficient delivery option for a team.  In addition to the cost benefit, customers who have taken this option value the daily access to their team members, which keeps business operations humming along, and the opportunity for key team members who are not attending to provide input on the project perspective.
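
The break-even arithmetic itself is simple: divide the flat fee for a private event by the per-seat price of a public course and round up.  The sketch below uses invented prices purely to illustrate the calculation; actual pricing comes from Informatica University.

    import math

    # All prices are hypothetical, chosen only to reproduce the 7- and 9-seat
    # break-even points described above.
    PUBLIC_SEAT_PRICE = 2500      # per attendee, public scheduled course
    PRIVATE_REMOTE_FLAT = 17500   # flat fee, remote instructor
    PRIVATE_ONSITE_FLAT = 22500   # flat fee, onsite instructor

    def break_even(flat_fee, seat_price):
        """Smallest team size at which a private event beats public seats."""
        return math.ceil(flat_fee / seat_price)

    print(break_even(PRIVATE_REMOTE_FLAT, PUBLIC_SEAT_PRICE))  # 7
    print(break_even(PRIVATE_ONSITE_FLAT, PUBLIC_SEAT_PRICE))  # 9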

These reserved events can also be adapted to a customer’s needs by tailoring course materials to highlight the topics that will be key to a project’s implementation, providing creative options for getting a team up to speed on the Informatica projects at hand.

With Informatica University’s new flexible pricing, hosting a private training event is easy.  All it takes is:

  • A conference room
  • Training PCs or laptops for participants
  • Access to the Internet
  • An LCD projector, screen, white board, and appropriate markers

Private training events provide the opportunity to get your resources comfortable and efficient with Informatica solutions, and they have a positive impact on the success of your projects.

To understand more about Informatica University’s new flexible pricing, contact trainingsales@informatica.com.


Big Data Is Neither – Part II

You Say Big Dayta, I Say Big Dahta

Some say Big Data is a great challenge, while others say Big Data creates new opportunities.  Where do you stand?  For most companies concerned with their Big Data challenges, it shouldn’t be so difficult – at least on paper.  Computing costs (both hardware and software) have vastly shrunk.  Databases and storage techniques have become more sophisticated and scale massively, and companies such as Informatica have made connecting and integrating all the “big” and disparate data sources much easier, helping companies achieve a sort of “big data synchronicity”.

In the process of creating solutions to Big Data problems, humans (and the supra-species known as IT Sapiens) have a tendency to use theories based on linear thinking and the scientific method.  There is data as our systems know it, and data as our systems don’t.  The reality, in my opinion, is that “Really Big Data” problems now and in the future will have complex correlations and unintuitive relationships that require mathematical disciplines, data models and algorithms that haven’t even been discovered or invented yet; when they eventually are, they will make current database science look positively primordial.

At some point in the future, machines will be able to predict, based on big and perhaps unknown data types, when someone is having a bad day or a good day or, more importantly, whether a person may behave in a good or bad way.  Many people do this now when they take a glance at someone across a room and infer how that person is feeling or what they will do next.  They see eyes that are shiny or dull, crinkles around eyes or the sides of mouths, hear the “tone” in a voice, and then their neurons put it all together: this is a person who is having a bad day and needs a hug.  Quickly.  No one knows exactly how the human brain does this, but it does what it does, we go with it, and we are usually right.

And someday, Big Data will be able to derive this, and it will be an evolution point; it will also be a big business opportunity.  Through bigger and better data ingestion and integration techniques and more sophisticated math and data models, a machine will do this fast and, relatively speaking, cheaply.  The vast majority won’t understand why or how it’s done, but it will work, and it will be fairly accurate.

And my question to you all is this.

Do you see any alternate scenarios regarding the future of Big Data?  Is contextual computing an important evolution, and will Big Data integration be more or less of a problem in the future?

PS. Oh yeah, one last thing to chew on concerning Big Data… If Big Data becomes big enough, does that spell the end of modelling as we know it?


Stop Asking Your IT for a Roadmap

Knowing that business trends and needs change frequently, why is it that we plan multi-year, IT-driven roadmaps?

Understandably, IT managers have honed their skills in working with the line of business to predict business needs.  They have learned to spend money and time wisely and to have the right infrastructure in place to meet the business’s needs, whether that means launching in a new market, implementing a new technology, or one of the many other areas where IT can help its firm find a competitive advantage.

Not so long ago, IT was so complex and unwieldy that it needed specially-trained professionals to source, build, and run almost every aspect of it, and when line managers had scant understanding which technology would suit their activities best, making a plan based on long-term business goals was a good one.

Today, we talk of IT as a utility: just like electricity, you press a button and IT turns “on.”  That is not quite the case, but the extent to which IT has saturated day-to-day business life means line managers are now better placed to determine how technology should be used to achieve the company’s objectives.

In the next five years, the economic climate will change, customer preferences will shift, and new competitors will threaten the business. Innovations in technology will provide new opportunities to explore, and new leadership could send the firm in a new direction. While most organizations have long-term growth targets, their strategies constantly evolve.

This new scenario has caused those in the enterprise architecture (EA) function to ask whether long-term road mapping is still a valuable investment.

EAs admit that long-term IT-led road mapping is no longer feasible. If the business does not have a detailed and stable five-year plan, these architects argue, how can IT develop a technology roadmap to help them achieve it? At best, creating long-term roadmaps is a waste of effort, a never-ending cycle of updates and revisions.

Without a long-range vision of business technology demand, IT has started to focus purely on the supply side. These architects focus on existing systems, identifying ways to reduce redundancies or improve flexibility. However, without a clear connection to business plans, they struggle to secure funding to make their plans a reality.

Enterprise architects have turned their focus to the near term, trying to influence the small decisions made every day in their organizations.  They can have greater impact, they believe, if they serve as advisors to IT and business stakeholders, guiding them to make cost-efficient, enterprise-aligned technology decisions.

Rather than taking a top-down perspective, shaping architecture through one master plan, they work from the bottom-up, encouraging more efficient working by influencing the myriad technology decisions being made each day.

Twitter @bigdatabeat
