Tag Archives: Data Integration

Learn to Put Your Data to Work


Put Your Data to Work

Informatica recently released the findings of a survey (entitled “Data is Holding You Back from Analytics Success”) in which 85% of respondents said they are effective at putting financial data to use to inform decision making.  However, the survey also found that many are less confident about putting data to use to inform patient engagement initiatives that require access to external data and big data, which respondents note are more challenging.

The idea is that data unto itself does not carry that much value.  For example, I’ve been gathering data with my Fitbit for over 90 days.  One use of that data could be looking for patterns that might indicate I’m more likely to have a heart attack.  However, this can only be determined by comparing my data with external historical patient data that exists in a large analytical database (big data).

The external data provides the known patterns that lead to known outcomes.  Thus, when compared with my data, predictive analytics can occur.  In other words, we can use data integration as a way to mash up and analyze the data so it has more meaning and value.  In this case, perhaps having me avoid a future heart attack.
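A toy sketch makes the idea concrete. The code below flags a wearable-data series whose resting heart rate trends upward faster than a threshold; the threshold stands in for a pattern mined from historical patient data, and every number here is invented for illustration, not drawn from any real clinical dataset.

```python
# Illustrative sketch: compare one person's wearable data against a
# threshold that stands in for patterns mined from a (hypothetical)
# historical patient cohort. All numbers are invented.

def resting_hr_trend(daily_resting_hr):
    """Average change in resting heart rate per day across the series."""
    n = len(daily_resting_hr)
    if n < 2:
        return 0.0
    return (daily_resting_hr[-1] - daily_resting_hr[0]) / (n - 1)

def flag_for_review(daily_resting_hr, cohort_trend_threshold=0.5):
    """Flag a series whose upward trend exceeds the cohort-derived threshold."""
    return resting_hr_trend(daily_resting_hr) > cohort_trend_threshold

# 90 days of data: one series creeps upward, one holds steady.
rising = [60 + 0.8 * day for day in range(90)]
steady = [62] * 90

print(flag_for_review(rising))  # True
print(flag_for_review(steady))  # False
```

In a real system the threshold would come from analytics over the external big data store, not a hard-coded constant.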

Inter-organizational transformational business processes require information sharing between data sources, and yet, according to the Informatica survey, over 65% of respondents say data integration and data quality are significantly challenging.  Thus, healthcare providers collect data, but many have yet to integrate these data silos to realize their full potential.

Indeed, the International Institute of Analytics offered a view of healthcare analytics maturity by looking at more than 20 healthcare provider organizations.  The study confirmed that, while healthcare providers are indeed gathering EMR data, they are not acting upon that data in meaningful ways.

The core problem is a lack of understanding of the value that this data can bring.  Or, perhaps the lack of a budget for the right technology.  Much as my Fitbit could help me prevent a future heart attack by tracking my activity data, healthcare providers can use their data to become more proactive around health issues.

Better utilization of this data will reduce costs by leveraging predictive analytics to take more preventative measures.  For instance, automatically culling through the family tree of a patient to determine risks for cancer, heart disease, etc., and automatically scheduling specific kinds of tests that are not normally given unless the patient is symptomatic.
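As a sketch of the family-history example above: the rule table and condition names below are hypothetical placeholders for illustration, not clinical guidance.

```python
# Illustrative rule table mapping family-history conditions to
# preventive screenings. Conditions and tests are invented placeholders.
SCREENING_RULES = {
    "heart disease": "cardiac stress test",
    "colon cancer": "early colonoscopy",
}

def recommend_screenings(family_history):
    """Return the screenings suggested by a patient's family history."""
    return sorted({SCREENING_RULES[condition]
                   for condition in family_history
                   if condition in SCREENING_RULES})

print(recommend_screenings(["heart disease", "asthma"]))  # ['cardiac stress test']
```

A production system would go one step further and schedule the recommended tests automatically rather than just listing them.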

Of course, putting your data to work is not free.  It’s going to take some level of effort to create strategies, and acquire and deploy data integration technology.  However, the benefits are easy to define, thus the business case is easy to create as well.

For myself, I’ll keep gathering data.  Hopefully it will have some use, someday.

Posted in Data Integration

Everybody’s Doing It! Learn How at Informatica World 2015!


Learn more at Informatica World 2015

Everybody’s doing it.  And if not, they say they are doing it anyway!  Are you doing it?  We all hear the mantra: ‘Data is an asset’.  Everybody wants to get in on the action.  Hear at Informatica World 2015 how Informatica customers are using data integration agility to drive business agility.  These organizations are relying on the Informatica platform as their data foundation.  Does that sound a bit vague?  Let’s get more specific here…

Who hasn’t used PayPal to send a secure payment?  Would you like to know how PayPal is managing the data integration architecture to support and analyze 11.6 million payments per day? Hear PayPal’s Architect chat with Informatica PowerCenter Product Managers.  They will discuss Advanced Scaling, Metadata Management and Business Glossary.  Would you like to learn how these PowerCenter capabilities can benefit your business? Add this session to your IW15 registration.

Verizon’s Architect will talk about consolidating 50+ legacy applications into a new application architecture. Wow, that’s a massive data management effort!  Are you curious to learn more about how to successfully manage an application modernization effort of this scope using Informatica tools?  Add this session to your IW15 registration.

Did you know that HP boasts one of the most complex and largest Informatica installations on the face of the earth?  HP’s Informatica Shared Services architecture allows hundreds of projects throughout HP worldwide to use PowerCenter data integration capabilities.  And they do so easily and cost effectively.  Join the session to gain insight on the design, architecture, creation, support, and overall governance of this solution.  Would you like to learn more from HP on how to maximize the benefits of your Informatica investment?  Add this session to your IW15 registration.

You have probably had your tires replaced at Discount Tire.  Would you like to learn more about how Discount Tire leverages the Informatica Platform to gain business advantage?  Discount Tire’s architect will share how they test and monitor their Data Integration environment using PowerCenter capabilities, such as Data Validation Testing and Proactive Monitoring.  They will discuss the benefit of doing so across multiple use cases.  They will highlight Discount Tire’s migration to a new SAP application and how the Informatica platform supports this migration. Would you like to improve how you test and monitor your critical business processes?  Add this session to your IW15 registration.

Finally, if your organization is planning any type of application modernization or rationalization, you will not want to miss this Informatica customer panel.  This session will feature experts from Cisco, Verizon and Discount Tire.  Speakers will share their experience, insights and best practices for Application Consolidation and Migration.  As we discussed in a previous blog post, failure rates for these projects are staggeringly high.  Arm yourself with proven best practices for data management, which have been shown to increase the success of application modernization projects! To learn more, add this session to your IW15 registration.


Register now for Informatica World 2015

We hope you can join us at Informatica World 2015!  Come and enjoy the wealth of experience and insights shared by these industry experts and many others.  See you there!

Posted in Informatica World 2015

Come to Informatica World 2015 and Advance Your Career


5 Reasons Data Integration Professionals Absolutely Must Not Miss This Informatica World.

If you are a Data Integration or Data Management professional, you really cannot afford to miss this event.  This year’s theme in the Data Integration track at Informatica World is all about customers.  Over 50 customers will be sharing their experiences and best practices for succeeding with data integration projects such as analytics, big data, application consolidation and migration, and much more.

If you still need convincing, here are the five reasons:

  1. Big Data: A special Big Data Summit is part of the track.
  2. Immediate Value: Over 50 customers will be sharing their experience and best practices, including things you can start doing now to improve your organization.
  3. Architecture for Business Transformation: An architecture track focused on practical approaches for using architecture to enable business transformation, with specific examples and real customer experiences.
  4. Hands-on Labs: Everybody loves them. This year we have even more. Sign up early to make sure that you get your choice. They go fast!
  5. New “Meet the Experts” Sessions: These are small group meetings for business-level discussions around subjects like big data, analytics, application consolidation, and more.

This truly will be a one-stop shop for all things data integration at Informatica World.  The pace of both competition and technology change is accelerating.  Attend this event to stay on top of what is happening in the world of data integration and how leading companies and experts are using data for competitive advantage within their organizations.

To help start your planning, here is a listing of the Data Integration, Architecture, and Big Data Sessions this year.  I hope to see you there.

QUICK GUIDE

DATA INTEGRATION AND BIG DATA at INFORMATICA WORLD 2015

Breakout Sessions, Tuesday, May 13

Session Time Location
Accelerating Business Value Delivery with Informatica Platform  (Architect Track Keynote) 10:45am – 11:15am Gracia 6
How to Support Real Time Data Integration Projects with PowerCenter (Grant Thornton) 1:30pm – 2:30pm Gracia 2
Knowledgent 11:30am – 12:15pm Gracia 8
Putting Big Data to Work to Make Cancer History at MD Anderson Cancer Center (MD Anderson) 11:30am – 12:15pm Gracia 4
Modernize your Data Architecture for Speed, Efficiency, and Scalability 11:30am – 12:15pm Castellana 1
An Architectural Approach to Data as an Asset (Cisco) 11:30am – 12:15pm Gracia 6
Accenture 11:30am – 12:15pm Gracia 2
Architectures for Next-Generation Analytics 1:30pm – 2:30pm Gracia 6
Informatica Marketplace (Tamara Strifler) 1:30pm – 2:30pm Gracia 4
Informatica Big Data Ready Summit: Keynote Address (Anil Chakravarthy, EVP and Chief Product Officer) 1:40 – 2:25 Castellana 1
Big Data Keynote: Tom Davenport, Distinguished Professor in Management and Information Technology, Babson College 2:30 – 3:15 Castellana 1
How to Test and Monitor Your Critical Business Processes with PowerCenter (Discount Tire, AT&T) 2:40pm – 3:25pm Gracia 2
Enhancing Consumer Experiences with Informatica Data Integration Hub (Humana) 2:40pm – 3:25pm Gracia 4
Business Transformation:  The Case for Information Architecture (Cisco) 2:40pm – 3:25pm Gracia 6
Succeeding with Big Data and Avoiding Pitfalls (CapGemini, Cloudera, Cognizant, Hortonworks) 3:15 – 3:30
What’s New in B2B Data Exchange: Self-Service Integration of 3rd Party Partner Data (BMC Software) 3:35pm – 4:20pm Gracia 2
PowerCenter Developer:  Mapping Development Tips & Tricks 3:35pm – 4:20pm Gracia 4
Modernize Your Application Architecture and Boost Your Business Agility (Mototak Consulting) 3:35pm – 4:20pm Gracia 6
The Big Data Journey: Traditional BI to Next Gen Analytics (Johnson&Johnson, Transamerica, Devon Energy, KPN) 4:15 – 4:30 Castellana 1
L&T Infotech 4:30 – 5:30 Gracia 2
What’s New in PowerCenter, PowerCenter Express and PowerExchange? 4:30 – 5:30 Gracia 4
Next-Generation Analytics Architecture for the Year 2020 4:30 – 5:30 Gracia 6
Accelerate Big Data Projects with Informatica (Jeff Rydz) 4:35 – 5:20 Castellana 1
Big Data: Michael J. Franklin, Professor of Computer Science, UC Berkeley 5:20 – 5:30 Castellana 1
  • Informatica World Pavilion 5:15 PM – 8:00 PM

Breakout Sessions, Wednesday, May 14

Session Time Location
How Mastercard is using a Data Hub to Broker Analytics Data Distribution (Mastercard) 2:00pm – 2:45pm Gracia 2
Cause: Business and IT Collaboration Effect: Cleveland Clinic Executive Dashboard (Cleveland Clinic) 2:00pm – 2:45pm Castellana 1
Application Consolidation & Migration Best Practices: Customer Panel (Discount Tire, Cisco, Verizon) 2:55pm – 3:55pm Gracia 2
Big Data Integration Pipelines at Cox Automotive (Cox Automotive) 2:55pm – 3:55pm Gracia 4
Performance Tuning for PowerCenter and Informatica Data Services 2:55pm – 3:55pm Gracia 6
US Bank and Cognizant 2:55pm – 3:55pm Castellana 1
Analytics architecture (Teradata, Hortonworks) 4:05pm – 4:50pm Gracia 4
A Case Study in Application Consolidation and Modernization—Migrating from Ab Initio to Informatica (Kaiser Permanente) 4:05pm – 4:50pm Castellana 1
Monetize Your Data With Hadoop and Agile Data Integration (AT&T) 4:05pm – 4:50pm Gracia 2
How to Enable Advanced Scaling and Metadata Management with PowerCenter (PayPal) 5:00pm – 5:45pm Castellana 1
How Verizon is consolidating 50+ legacy systems into a modern application architecture, optimizing Verizon’s enterprise sales and delivery process (Verizon) 5:00pm – 5:45pm Gracia 6
A guided tour to one of the most complex Informatica Installations worldwide (HP) 5:00pm – 5:45pm Gracia 2
Integration with Hadoop:  Best Practices for mapping development using Big Data Edition 5:00pm – 5:45pm Gracia 4

Meet The Experts Sessions, Wednesday, May 14

Session Time Location
Meet the Expert: App Consolidation – Driving Greater Business Agility and Reducing Costs Through Application Consolidation and Migration (Roger Nolan) 12:00pm – 12:50pm, 1:00pm – 1:50pm and 2:55pm – 3:55pm Castellana 2
Meet the Expert: Big Data – Delivering on the Promise of Big Data Analytics (John Haddad) 12:00pm – 12:50pm, 1:00pm – 1:50pm and 2:55pm – 3:55pm Castellana 2
Meet the Expert: Architect – Laying the Architectural Foundation for the Data-Driven Enterprise (David Lyle) 12:00pm – 12:50pm, 1:00pm – 1:50pm and 2:55pm – 3:55pm Castellana 2
  • Informatica World Pavilion 11:45 AM – 2:00 PM

Breakout Sessions, Thursday, May 15

Session Time Location
Enterprise Architecture and Business Transformation Panel  (Cisco) 9:00am – 10:00am Gracia 6
The Data Lifecycle: From infancy through retirement, how Informatica can help (Mototak Consulting) 9:00am – 10:00am Gracia 4
How Allied Solutions Streamlined Customer Data Integration using B2B Data Exchange (Allied Solutions) 9:00am – 10:00am Gracia 2
How the State of Washington and Michigan State University are Delivering Integration as a Service (Michigan State University, Washington State Department of Enterprise Services) 9:00am – 10:00am Gracia 1
Real Time Big Data Streaming Analytics (PRA Group) 10:10am – 11:10am Gracia 1
Extending and Modernizing Enterprise Data Architectures (Philip Russom, TDWI) 10:10am – 11:10am Gracia 4
Best Practices for Saving Millions by Offloading ETL/ELT to Hadoop with Big Data Edition and Vibe Data Stream (Cisco) 10:10am – 11:10am Gracia 2
Retire Legacy Applications – Improve Your Bottom-Line While Managing Compliance (Cisco) 11:20am – 12:20pm Gracia 4
How a Data Hub Reduces Complexity, Cost and Risk for Data Integration Projects 11:20am – 12:20pm Gracia 1
Title? (Cap Gemini) 11:20am – 12:20pm Gracia 2
What’s New in PowerCenter, PowerCenter Express and PowerExchange? 2:30pm – 3:30pm Gracia 4
Title?  Keyur Desai 2:30pm – 3:30pm Gracia 2
How to run PowerCenter & Big Data Edition on AWS & connect Data as a Service (Customer) 2:30pm – 3:30pm Gracia 1
Accelerating Business with Near-Realtime Architectures 2:30pm – 3:30pm Gracia 6
  • Informatica World Pavilion 12:30 PM – 3:30 PM

Hands-On Labs

Session Time Location
General Interest
PowerCenter 9.6.1 Upgrade 1 Table 01
PowerCenter 9.6.1 Upgrade 2 (repeat) Table 02
PowerCenter Advanced Edition – High Availability & Grid Mon 1:00, 3:00
Tue 7:30, 11:45, 2:40, 4:25
Wed 10:45, 12:45, 2:55, 5:00, 7:00
Thu 9:00, 11:20, 1:15
Fri 7:30, 9:30, 11:30
Table 03a
PowerCenter Advanced Edition – Metadata Manager & Business Glossary Mon 2:00, 4:00
Tue 10:45, 1:45, 3:35
Wed 7:30, 11:45, 2:00, 4:05, 6:00
Thu 7:30, 10:10, 12:15, 2:15
Fri 8:30, 10:30
Table 03b
Data Archive Mon 1:00, 3:00
Tue 7:30, 11:45, 2:40, 4:25
Wed 10:45, 12:45, 2:55, 5:00, 7:00
Thu 9:00, 11:20, 1:15
Fri 7:30, 9:30, 11:30
Table 06a
Test Data Management Mon 2:00, 4:00
Tue 10:45, 1:45, 3:35
Wed 7:30, 11:45, 2:00, 4:05, 6:00
Thu 7:30, 10:10, 12:15, 2:15
Fri 8:30, 10:30
Table 06b
Analytics-Related
PowerCenter Big Data Edition – Delivering on the Promise of Big Data Analytics All other times not taken by 11b. Table 11a
Elastic Analytics:  Big Data Edition in the Cloud Mon 4:00
Tue 11:45, 3:35
Wed 12:45, 5:00, 7:00
Thu 9:00, 1:15, 2:15
Fri 10:30
Table 11b
Greater Agility and Business-IT Collaboration using Data Virtualization Mon 1:00, 3:00
Tue 7:30, 11:45, 2:40, 4:25
Wed 10:45, 12:45, 2:55, 5:00, 7:00
Thu 9:00, 11:20, 1:15
Fri 7:30, 9:30, 11:30
Table 12a
Boosting your performance and productivity with Informatica Developer Mon 2:00, 4:00
Tue 10:45, 1:45, 3:35
Wed 7:30, 11:45, 2:00, 4:05, 6:00
Thu 7:30, 10:10, 12:15, 2:15
Fri 8:30, 10:30
Table 12b
Democratizing your Data through the Informatica Data Lake Table 13
Enabling Self-Service Analytics with Informatica Rev Table 14
Real-time Data Integration: PowerCenter Architecture & Implementation Considerations Monday 1pm
Tuesday 7:30am, 1:45pm
Wed 7:30, 2:00, 4:05
Thu 9am, 11:20am
Fri 8:30am
Table 15a
Real-time Data Integration: PowerExchange CDC on z/OS Monday 2pm
Tue 10:45, 2:40
Wed 10:45, 5pm
Thu 12:15pm
Fri 9:30am
Table 15b
Real-time Data Integration: PowerExchange CDC on i5/OS Monday 3pm
Tuesday 3:35pm
Wed 11:45am, 6pm
Thu 1:15pm
Fri 10:30am
Table 15c
Real-time Data Integration: PowerExchange CDC for Relational (Oracle, DB2, MS-SQL) Mon 4pm
Tue 11:45am, 4:25pm
Wed 12:45pm, 2:55pm, 7pm
Thu 7:30am, 10:10am, 2:15pm
Fri 7:30am, 11:30am
Table 15d
Healthcare Data Management and Modernization for Healthcare Providers Table 16
Data Management of Machine Data & Internet of Things Mon 1:00, 3:00
Tue 7:30, 11:45, 2:40, 4:25
Wed 10:45, 12:45, 2:55, 5:00, 7:00
Thu 9:00, 11:20, 1:15
Fri 7:30, 9:30, 11:30
Table 17a
Handling Complex Data Types with B2B Data Transformation Mon 2:00, 4:00
Tue 10:45, 1:45, 3:35
Wed 7:30, 11:45, 2:00, 4:05, 6:00
Thu 7:30, 10:10, 12:15, 2:15
Fri 8:30, 10:30
Table 17b
Application Consolidation & Migration Related
Simplifying Complex Data Integrations with Data Integration Hub Table 18
Implementing Trading Partner Integration with B2B Data Exchange Table 19
Operationalizing and Scaling your PowerCenter Environment Mon 1pm, 2pm
Tue 7:30, 10:45, 2:40, 3:35
Wed 10:45, 12:45, 5pm, 6pm, 7pm
Thu 7:30, 9am, 11:20, 1:15
Fri 7:30, 9:30, 11:30
Table 20a
Effective Operations management and Administration – What’s New M: 3:00 – 3:45pm
4:00 – 4:45pm
Tu: 11:45 – 12:30pm
1:45 – 2:30pm
4:25 – 5:15pm
W: 7:30 – 8:15am
11:45 – 12:30pm
2:55 – 3:40pm
4:05 – 4:50pm
Th: 10:10 – 10:55am
12:15 – 1:00pm
2:15 – 3:00pm
F:  8:30 – 9:15am
10:30 – 11:15am
Table 20b
Getting the Most out of your Data Integration & Data Quality Platform – Performance and Scalability Tips & Tricks Mon 1:00, 3:00
Tue 7:30, 11:45, 2:40, 4:25
Wed 10:45, 12:45, 2:55, 5:00, 7:00
Thu 9:00, 11:20, 1:15
Fri 7:30, 9:30, 11:30
Table 21a
Getting the Most out of your BigData Edition – Performance Best Practices Mon 2:00, 4:00
Tue 10:45, 1:45, 3:35
Wed 7:30, 11:45, 2:00, 4:05, 6:00
Thu 7:30, 10:10, 12:15, 2:15
Fri 8:30, 10:30
Table 21b
Modernizing and Consolidating Legacy and Application Data: Leveraging Data Services, Data Quality and Data Explorer Mon 1:00
Tue 10:45, 2:40, 4:25
Wed 10:45, 2:00, 2:55, 4:05, 7:00
Thu 11:20, 2:15
Fri 9:30, 10:30
Table 22a
Connect to *: Connectivity to Long Tail of Next Generation Data Sources Mon 2:00, 3:00
Tue 7:30, 11:45, 1:45
Wed 7:30, 12:45, 5:00
Thu 9:00, 10:10, 1:15
Fri 7:30, 8:30
Table 22b
Modernizing and Consolidating Legacy and Application Data with PowerExchange Mainframe and CDC Mon 4:00
Tue 3:35
Wed 11:45, 6:00
Thu 7:30, 12:15, 2:15
Fri 11:30
Table 22c
Retire Legacy Applications and Optimize Application Performance with Informatica Data Archive Table 23
Protect Salesforce Sandboxes with Cloud Data Masking Tue 3:35, 4:25
Wed 6:00, 7:00
Thu 1:15, 2:15
Fri 7:30
Table 24a
Optimally Provision Test Data Sets with Test Data Management Mon: all times
Tues: 7:30,10:45, 11:45, 1:45, 2:40
Wed: 7:30, 10:45, 11:45, 12:45, 2:00, 2:55, 4:05, 5:00
Thurs: 7:30, 9:00, 10:10, 11:20, 12:15
Fri: 8:30, 9:30, 10:30, 11:30
Table 24b
Posted in Data Integration

What It Takes Today to Be an Effective CISO


What does it take to be an effective Chief Information Security Officer (CISO) in today’s era of massive data breaches? Besides skin as thick as armor and proven experience in security, an effective CISO needs the following qualities:

  • A strong grasp of their security program’s capabilities and of their adversaries
  • The business acumen to frame security challenges into business opportunities
  • An ability to effectively partner and communicate with stakeholders outside of the IT department
  • An insatiable appetite to make data-driven decisions and to take smart risks

In order to be successful, a CISO needs data-driven insights.  The business needs this too.  Informatica recently launched the industry’s first Data Security Intelligence solution, Secure@Source. At the launch event, we shared how CISOs can leverage new insights, gathered and presented by Secure@Source. These insights better equip their security and compliance teams to defend against misconfigurations, cyber-attacks and malicious insider threats.

Data-driven organizations are more profitable, more efficient, and more competitive [1].  An effective CISO ensures the business has the data it needs without introducing undue risk. In my RSA Conference Security Leadership Development session I will share several other characteristics of effective CISOs.

Despite best efforts at threat modeling and security automation, security controls will never be perfect.  Modern businesses require data agility, as attack surface areas and risks change quickly. As business users spread data beyond the firewall, ensuring that sensitive and confidential data is safe from exposure or breach becomes an enormous task.

Data at rest isn’t valuable if the business can’t use it in a timely manner. Encrypted data may be safe from theft, but needs to be decrypted at some point to be useful for those using the data for predictive analytics. Data’s relative risk of breach goes up as the number of connections, applications, and accounts that have access to the data also increases.

If you have two databases, each with the same millions of sensitive records in them, the system with more applications linked to it and privileged administrative accounts managing it is the one you should be focusing your security investments on. But you need a way to measure and manage your risk with accurate, timely intel.
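That comparison can be expressed as a toy risk score. The weighting below is an assumption made purely for illustration; it is not Informatica’s actual Secure@Source scoring model.

```python
# Toy data-risk score: the same sensitive records become riskier as the
# number of connected applications and privileged accounts grows.
# The weights are illustrative assumptions, not a real scoring model.

def risk_score(records, connected_apps, privileged_accounts):
    return records * (1 + connected_apps + 2 * privileged_accounts)

db_a = risk_score(records=5_000_000, connected_apps=2, privileged_accounts=1)
db_b = risk_score(records=5_000_000, connected_apps=12, privileged_accounts=6)

# Same data, but db_b's larger attack surface makes it the priority.
print(db_b > db_a)  # True
```

Whatever the exact formula, the point is the same: risk must be measured from timely inventory data, not guessed.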

As Informatica’s CISO, my responsibility is to ensure that our brand is protected, that our customers, stakeholders, and employees trust Informatica — that we are trustworthy custodians of our customers’ most important data assets.

In order to do that, I need to have conviction about where our sensitive assets are, what threats and risks are relevant to them, and have a plan to keep them compliant and safe no matter where the data travels.

Modern security guidance like the SANS Critical Security Controls or the NIST Cybersecurity Framework both start with “know your assets”: building an inventory and identifying what’s most critical to your business. Next, they advise you to form a strategy to monitor, protect, and re-assess relevant risks as the business evolves.  In the age of Agile development and security automation, continuous monitoring is replacing batch-mode assessments. Businesses move too fast to measure risk annually or once a quarter.

As Informatica has shifted to a cloud-first enterprise, and as our marketing organization makes data-driven decisions for their customer experience initiatives, my teams ensure we are making data available to those who need it while adhering to international data privacy laws. This task has become more challenging as the volume of data increases, is shared between targets, and as requirements become more stringent.  Informatica’s Data Security Intelligence solution, Secure@Source, was designed to help manage these activities while making it easier to collaborate with other stakeholders.

The role of the CISO has transformed over time into that of a trusted advisor whose guidance the business relies on to take smart risks.  The CISO provides a lens in business discussions that focuses on technical threats, regulatory constraints, and business risks while ensuring that the business earns and maintains trust with customers. In order to be an effective CISO, it all comes down to the data.

[1] http://www.economistinsights.com/analysis/data-directive

Posted in Data Security

Data Wizard Beta: Paving the Way for Next-Generation Data Loaders


The Data Wizard Changes the Landscape of What Traditional Data Loaders Can Do

The emergence of the business cloud is making the need for data ever more prevalent. Whatever your business, if your role is in the sales, marketing or service departments, chances are your productivity depends a great deal on the ability to move data quickly in and out of Salesforce and its ecosphere of applications.

With built-in data transformation intelligence, the Data Wizard (click here to try the Beta version) changes the landscape of what traditional data loaders can do. The Data Wizard takes care of the following aspects, so that you don’t have to:

  1. Data transformations: We built in over 300 standard data transformations so you don’t have to format the data before bringing it in (e.g., combining first and last names into full names, adding numeric columns for totals, splitting address fields into their separate components).
  2. Built-in intelligence: We automate the mapping of data into Salesforce for a range of common use cases (e.g., automatically mapping matching fields, intelligently auto-generating date format conversions, concatenating multiple fields).
  3. App-to-app integration: We incorporated pre-built integration templates to encapsulate the logic required for integrating Salesforce with other applications (e.g., single-click update of customer addresses in a Cloud ERP application based on Account addresses in Salesforce).
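As a rough picture of what such transformations do, here are plain-Python stand-ins for the first category. These are hypothetical illustrations only, not the Data Wizard’s actual implementation or API.

```python
# Hypothetical stand-ins for common data-loader transformations.

def full_name(first, last):
    """Combine first and last names into a full name."""
    return f"{first} {last}".strip()

def total(*amounts):
    """Add numeric columns into a single total."""
    return sum(amounts)

def split_address(address):
    """Split a comma-separated address field into its components."""
    return [part.strip() for part in address.split(",")]

print(full_name("Ada", "Lovelace"))            # Ada Lovelace
print(total(100, 250))                         # 350
print(split_address("1 Main St, Austin, TX"))  # ['1 Main St', 'Austin', 'TX']
```

The point of the Data Wizard is that users never write code like this; the transformations are applied automatically during the load.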

Unlike the other data loading apps out there, the Data Wizard doesn’t presuppose any technical ability on the part of the user. It was purpose-built to solve the needs of every type of user, from the Salesforce administrator to the business analyst.

Despite the simplicity the Data Wizard offers, it is built on the robust Informatica Cloud integration platform, providing the same reliability and performance that is key to the success of Informatica Cloud’s enterprise customers, who integrate over 5 billion rows of data per day. We invite you to try the Data Wizard for free, and contribute to the Beta process by providing us with your feedback.

Posted in B2B, B2B Data Exchange, Big Data, Business Impact / Benefits, Cloud, Cloud Application Integration, Cloud Data Integration, Data Integration, Data Integration Platform, Data Services

Why Data Integration is Exploding Right Now


Mashing Up Our Business Data with External Data Sources Makes Our Data Even More Valuable.

In case you haven’t noticed, data integration is all the rage right now.  Why?  There are three major reasons for this trend that we’ll explore below, but a recent USA Today story focused on corporate data as a much more valuable asset than it was just a few years ago.  Moreover, the sheer volume of data is exploding.

For instance, research company IDC estimated in a published report that the total amount of data created or replicated worldwide in 2012 would add up to 2.8 zettabytes (ZB).  By 2020, IDC expects the annual data-creation total to reach 40 ZB, which would amount to a 50-fold increase from where things stood at the start of 2010.

But the growth of data is only a part of the story.  Indeed, I see three things happening that drive interest in data integration.

First, the growth of cloud computing.  The growth of data integration around the growth of cloud computing is logical, considering that we’re relocating data to public clouds, and that data must be synced with systems that remain on-premise.

The data integration providers, such as Informatica, have stepped up.  They provide data integration technology that can span enterprises, managed service providers, and clouds while dealing with the special needs of cloud-based systems.  At the same time, data integration improves the way we do data governance and data quality.

Second, the growth of big data.  A recent IDC forecast shows that the big data technology and services market will grow at a 26.4% compound annual growth rate to $41.5 billion through 2018, or, about six times the growth rate of the overall information technology market. Additionally, by 2020, IDC believes that line of business buyers will help drive analytics beyond its historical sweet spot of relational to the double-digit growth rates of real-time intelligence and exploration/discovery of the unstructured worlds.

The world of big data revolves around data integration.  The more that enterprises rely on big data, and the more that data needs to move from place to place, the more a core data integration strategy and technology is needed.  That means you can’t talk about big data without talking about big data integration.

Data integration technology providers have responded with technology that keeps up with the volume of data that moves from place to place.  As linked to the growth of cloud computing above, providers also create technology with the understanding that data now moves within enterprises, between enterprises and clouds, and even from cloud to cloud.  Finally, data integration providers know how to deal with both structured and unstructured data these days.

Third, better understanding around the value of information.  Enterprise managers always knew their data was valuable, but perhaps they did not understand the true value that it can bring.

With the growth of big data, we now have access to information that helps us drive our business in the right directions.  Predictive analytics, for instance, allows us to take years of historical data and determine patterns that allow us to predict the future.  Mashing up our business data with external data sources makes our data even more valuable.

Of course, data integration drives much of this growth.  Thus the refocus on data integration approaches and tech.  There are years and years of evolution still ahead of us, and much to be learned from the data we maintain.

Posted in B2B, B2B Data Exchange, Big Data, Business/IT Collaboration, Data First, Data Integration, Data Integration Platform, Data Security, Data Services

Why “Gut Instincts” Needs to be Brought Back into Data Analytics


Last fall, at a large industry conference, I had the opportunity to conduct a series of discussions with industry leaders in a portable video studio set up in the middle of the conference floor. As part of our exercise, we had a visual artist do freeform storyboarding of the discussion on large swaths of five-foot by five-foot paper, which we then reviewed at the end of the session. For example, in a discussion of cloud computing, the artist drew a rendering of clouds, raining data on a landscape below, illustrated by sketches of office buildings. At a glance, one could get a good read of where the discussion went, and the points that were being made.

Data visualization is one of those up-and-coming areas that has just begun to break into the technology mainstream. There are some powerful front-end tools that help users see, at a glance, trends and outliers through graphical representations – be they scattergrams, histograms, 3D diagrams or something else eye-catching.  The “infographic” that has become so popular in recent years is an amalgamation of data visualization and storytelling. The bottom line is that technology is making it possible to generate these representations almost instantly, enabling relatively quick understanding of what the data may be saying.
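A minimal sketch of the at-a-glance idea: even a crude text histogram, built with nothing but standard-library Python and made-up numbers, lets an outlier jump out without reading a single value.

```python
# Render made-up daily transaction counts as bars; day 4's spike is
# the outlier we want to spot at a glance.
observations = [12, 14, 13, 15, 48, 13, 12]

def text_histogram(values):
    """Render each value as a bar of '#' characters."""
    return "\n".join(f"day {i}: {'#' * v}" for i, v in enumerate(values))

print(text_histogram(observations))
```

Real tools render far richer graphics, but the perceptual shortcut is the same: the shape carries the message.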

The power that data visualization is bringing organizations was recently explored by Benedict Carey in The New York Times, who discussed how data visualization is emerging as the natural solution to “big data overload.”

This is much more than a front-end technology fix, however. Rather, Carey cites a growing body of knowledge emphasizing the development of “perceptual learning,” in which people working with large data sets learn to “see” patterns and interesting variations in the information they are exploring. It’s almost a return of the “gut” feel for answers, but developed for the big data era.

As Carey explains it:

“Scientists working in a little-known branch of psychology called perceptual learning have shown that it is possible to fast-forward a person’s gut instincts both in physical fields, like flying an airplane, and more academic ones, like deciphering advanced chemical notation. The idea is to train specific visual skills, usually with computer-game-like modules that require split-second decisions. Over time, a person develops a ‘good eye’ for the material, and with it an ability to extract meaningful patterns instantaneously.”

Video games may be leading the way in this – Carey cites the work of Dr. Philip Kellman, who developed a video-game-like approach to training pilots to instantly “read” instrument panels as a whole, versus pondering every gauge and dial. He reportedly was able to enable pilots to absorb within one hour what normally took 1,000 hours of training. Such perceptual-learning based training is now employed in medical schools to help prospective doctors become familiar with complicated procedures.

There are interesting applications for business, bringing together a range of talent to help decision-makers better understand the information they are looking at. In Carey’s article, an artist was brought into a medical research center to help scientists look at data in many different ways – to get out of their comfort zones. For businesses, it means getting away from staring at bars and graphs on their screens and perhaps turning data upside down or inside-out to get a different picture.

Posted in B2B, B2B Data Exchange, Business/IT Collaboration, Data First, Data Integration, Data Services | Leave a comment

Becoming Analytics-Driven Requires a Cultural Shift, But It’s Doable


For those hoping to push through a hard-hitting analytics effort that will serve as a beacon of light within an otherwise calcified organization, you probably have your work cut out for you. Evolving into an organization that fully grasps the power and opportunities of data analytics requires cultural change, and this is a challenge organizations have only begun to grasp.

“Sitting down with pizza and coffee could get you around most of the technical challenges,” explained Sam Ransbotham, Ph.D., associate professor at Boston College, at a recent panel webcast hosted by MIT Sloan Management Review, “but the cultural problems are much larger.”

That’s one of the key takeaways from the panel, in which Ransbotham was joined by Tuck Rickards, head of the digital transformation practice at Russell Reynolds Associates, a digital recruiting firm, and Denis Arnaud, senior data scientist at Amadeus Travel Intelligence. The panel, which examined the impact of corporate culture on data analytics, was led by Michael Fitzgerald, contributing editor at MIT Sloan Management Review.

The path to becoming an analytics-driven company is a journey that requires transformation across most or all departments, the panelists agreed. “It’s fundamentally different to be a data-driven decision company than kind of a gut-feel decision-making company,” said Rickards. “Acquiring this capability to do things differently usually requires a massive culture shift.”

That’s because the cultural aspects of the organization – “the values, the behaviors, the decision making norms and the outcomes go hand in hand with data analytics,” said Ransbotham. “It doesn’t do any good to have a whole bunch of data processes if your company doesn’t have the culture to act on them and do something with them.” Rickards adds that bringing this all together requires an agile, open source mindset, with frequent, open communication across the organization.

So how does one go about building and promoting a culture that is conducive to getting the maximum benefit from data analytics? The most important piece is bringing aboard people who are aware of and skilled in analytics – both from within the enterprise and from outside, the panelists urged. Ransbotham points out that it may seem daunting, but it’s not. “This is not some gee-whiz thing,” he said. “We have to get rid of this mindset that these things are impossible. Everybody who has figured it out has figured it out somehow. We’re a lot more able to pick up on these things than we think — the technology is getting easier, it doesn’t require quite as much as it used to.”

The key to evolving corporate culture to becoming more analytics-driven is to identify or recruit enlightened and skilled individuals who can provide the vision and build a collaborative environment. “The most challenging part is looking for someone who can see the business more broadly, and can interface with the various business functions –ideally, someone who can manage change and transformation throughout the organization,” Rickards said.

Arnaud described how his organization – an online travel service – went about building an esprit de corps between data analytics staff and business staff to ensure the success of the company’s analytics efforts. “Every month all the teams would do a hands-on workshop, together in some place in Europe [Amadeus is headquartered in Madrid, Spain].” For example, a workshop may focus on a market analysis for a specific customer, and the participants would explore the entire end-to-end process for working with the customer, “from the data collection all the way through data acquisition, data crunching and so on. The one knowing the data analysis techniques would explain them, and the one knowing the business would explain that, and so on.” As a result of these monthly workshops, business and analytics team members have found it “much easier to collaborate,” he added.

Web-oriented companies such as Amadeus – or Amazon and eBay for that matter — may be paving the way with analytics-driven operations, but companies in most other industries are not at this stage yet, both Rickards and Ransbotham point out. The more advanced web companies have built “an end-to-end supply chain, wrapped around customer interaction,” said Rickards. “If you think of most traditional businesses, financial services or automotive or healthcare are a million miles away from that. It starts with having analytic capabilities, but it’s a real journey to take that capability across the company.”

The analytics-driven business of the near future – regardless of industry – will likely be staffed with roles not yet seen today. “If you are looking to re-architect the business, you may be imagining roles that you don’t have in the company today,” said Rickards. Along with the need for chief analytics officers, data scientists, and data analysts, there will be many new roles created. “If you are on the analytics side of this, you can be in an analytics group or a marketing group, with more of a CRM or customer insights title. You can be in planning or business functions. In a similar way on the technology side, there are people very focused on architecture and security.”

Ultimately, the demand will be for leaders and professionals who understand both the business and technology sides of the opportunity, Rickards continued. “You can have good people building a platform, and you can have good data scientists,” he added. “But you better have someone on the top of that organization knowing the business purpose.”

Posted in B2B, B2B Data Exchange, Big Data, Business Impact / Benefits, Business/IT Collaboration, Data Services, Data Synchronization | Leave a comment

Startup Winners of the Informatica Data Mania Connect-a-Thon

Last week was Informatica’s first ever Data Mania event, held at the Contemporary Jewish Museum in San Francisco. We had an A-list lineup of speakers from leading cloud and data companies, such as Salesforce, Amazon Web Services (AWS), Tableau, Dun & Bradstreet, Marketo, AppDynamics, Birst, Adobe, and Qlik. The event and speakers covered a range of topics all related to data, including Big Data processing in the cloud, data-driven customer success, and cloud analytics.

While these companies are giants today in the world of cloud and have created their own unique ecosystems, we also wanted to take a peek at and hear from the leaders of tomorrow. Before startups can become market leaders in their own realm, they face the challenge of ramping up a stellar roster of customers so that they can get to subsequent rounds of venture funding. But what gets in their way are the numerous data integration challenges of onboarding customer data onto their software platform. When these challenges remain unaddressed, R&D resources are spent on professional services instead of building value-differentiating IP.  Bugs also continue to mount, and technical debt increases.

Enter the Informatica Cloud Connector SDK. Built entirely in Java and able to browse through any cloud application’s API, the Cloud Connector SDK parses the metadata behind each data object and presents it in the context of what a business user should see. We had four startups build a native connector to their application in less than two weeks: BigML, Databricks, FollowAnalytics, and ThoughtSpot. Let’s take a look at each one of them.

BigML

With predictive analytics becoming a growing imperative, machine-learning algorithms that predict with a higher degree of accuracy are also becoming increasingly important. BigML provides an intuitive yet powerful machine-learning platform for actionable and consumable predictive analytics. Watch their demo on how they used Informatica Cloud’s Connector SDK to help them better predict customer churn.

Can’t play the video? Click here, http://youtu.be/lop7m9IH2aw

Databricks

Databricks was founded out of the UC Berkeley AMPLab by the creators of Apache Spark. Databricks Cloud is a hosted end-to-end data platform powered by Spark. It enables organizations to unlock the value of their data, seamlessly transitioning from data ingest through exploration and production. Watch their demo, which showcases how the Informatica Cloud connector for Databricks Cloud was used to analyze lead contact rates in Salesforce, and also to perform machine learning on a dataset built using either Scala or Python.

Can’t play the video? Click here, http://youtu.be/607ugvhzVnY

FollowAnalytics

With mobile usage growing by leaps and bounds, the area of customer engagement on a mobile app has become a fertile area for marketers. Marketers are charged with acquiring new customers, increasing customer loyalty and driving new revenue streams. But without the technological infrastructure to back them up, their efforts are in vain. FollowAnalytics is a mobile analytics and marketing automation platform for the enterprise that helps companies better understand audience engagement on their mobile apps. Watch this demo where FollowAnalytics first builds a completely native connector to its mobile analytics platform using the Informatica Cloud Connector SDK and then connects it to Microsoft Dynamics CRM Online using Informatica Cloud’s prebuilt connector for it. Then, see FollowAnalytics go one step further by performing even deeper analytics on their engagement data using Informatica Cloud’s prebuilt connector for Salesforce Wave Analytics Cloud.

Can’t play the video? Click here, http://youtu.be/E568vxZ2LAg

ThoughtSpot

Analytics has taken center stage this year due to the rise in cloud applications, but most of the existing BI tools out there still stick to the old way of doing BI. ThoughtSpot brings a consumer-like simplicity to the world of BI by allowing users to search for the information they’re looking for just as if they were using a search engine like Google. Watch this demo where ThoughtSpot uses Informatica Cloud’s vast library of over 100 native connectors to move data into the ThoughtSpot appliance.

Can’t play the video? Click here, http://youtu.be/6gJD6hRD9h4
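The search-driven idea can be sketched in a few lines of Python: tokenize a question and match it against a table’s values. This is a toy illustration of the concept only, not ThoughtSpot’s actual engine, and the data is invented:

```python
# Toy keyword search over tabular data -- an illustration of search-driven BI,
# not ThoughtSpot's actual engine. All rows are invented.

rows = [
    {"region": "west", "product": "widgets", "revenue": 120_000},
    {"region": "east", "product": "gadgets", "revenue": 95_000},
]

def search(query, table):
    """Return rows whose values match any keyword in the query."""
    terms = set(query.lower().split())
    return [
        row for row in table
        if terms & {str(v).lower() for v in row.values()}
    ]

print(search("revenue for widgets", rows))
```

A real search-driven BI product layers synonyms, column-name matching, ranking and aggregation on top, but the user experience it targets is this simple: type words, get matching data back.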

Posted in B2B, Business Impact / Benefits, Cloud, Data Integration, Data Integration Platform, Data Privacy, Data Quality, Data Services, Data Transformation | Leave a comment

Internet of Things (IoT) Changes the Data Integration Game in 2015


As reported by the Economic Times, “In the coming years, enormous volumes of machine-generated data from the Internet of Things (IoT) will emerge. If exploited properly, this data – often dubbed machine or sensor data, and often seen as the next evolution in Big Data – can fuel a wide range of data-driven business process improvements across numerous industries.”

We can all see this happening in our personal lives.  Our thermostats are connected now, our cars have been for years, and even my toothbrush has a Bluetooth connection with my phone.  On the industrial side, devices have also been connected for years, tossing off megabytes of data per day that are typically used for monitoring, with the data discarded as quickly as it appears.

So, what changed?  With the advent of big data and cheap cloud and on-premise storage, we now have the ability to store the machine or sensor data spinning out of industrial machines, airliners, health diagnostic devices, etc., and leverage that data for new and valuable uses.

For example, we can determine the likelihood that a jet engine will fail based upon the sensor data gathered and how that data compares with known patterns of failure.  Instead of getting an engine-failure light on the flight deck, the pilots can see that the engine has a 20 percent likelihood of failure, and get the engine serviced before it fails completely.
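A toy sketch of that comparison (with invented sensor values; real systems use far richer models): score a live reading against known signatures by distance, and report the failure rate among the nearest matches:

```python
import math

# Known sensor signatures (vibration, temperature) and whether each one
# preceded a failure -- invented values for illustration only.
known_patterns = [
    ((0.2, 310.0), False),
    ((0.3, 315.0), False),
    ((0.9, 360.0), True),
    ((1.1, 370.0), True),
]

def failure_likelihood(reading, patterns, k=3):
    """Estimate failure likelihood as the failure rate among the k
    nearest known patterns (a simple k-nearest-neighbors vote)."""
    ranked = sorted(patterns, key=lambda p: math.dist(reading, p[0]))
    nearest = ranked[:k]
    return sum(failed for _, failed in nearest) / k

# A live reading trending toward the known failure signatures.
likelihood = failure_likelihood((0.8, 355.0), known_patterns)
print(f"{likelihood:.0%} likelihood of failure")
```

The sketch glosses over feature scaling, time series, and model choice, but it captures the shift described above: instead of a binary failure light, maintenance gets a graded likelihood it can act on early.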

The problem with all of this very cool stuff is that we need to once again rethink data integration.  Indeed, if the data can’t get from the machine sensors to a persistent data store for analysis, then none of this has a chance of working.

That’s why those who are moving to IoT-based systems need to do two things.  First, they must create a strategy for extracting data from devices, such as industrial robots or an Audi A8.  Second, they need a strategy to take all of this disparate data that’s firing out of devices at megabytes per second, and put it where it needs to go, in the right native structure (or in an unstructured data lake), so it can be leveraged in useful ways, and in real time.

The challenge is that machines and devices are not traditional IT systems.  I’ve built connectors for industrial applications in my career.  The fact is, you need to adapt to the way that the machines and devices produce data, and not the other way around.  Data integration technology needs to adapt as well, making sure that it can deal with streaming and unstructured data, including many instances where the data needs to be processed in flight as it moves from the device to the database.
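That in-flight processing can be sketched as a small pipeline: parse raw device messages, drop malformed ones, and convert units as records stream toward the database. The message format and field names here are invented for illustration:

```python
# Process device messages "in flight": parse, filter, and transform records
# as they stream by, rather than landing raw data first. The message format
# below is invented for illustration.

raw_stream = [
    "sensor-7,2015-03-02T10:00:00,72.5",
    "garbled!!",                          # devices emit malformed data too
    "sensor-7,2015-03-02T10:00:01,73.1",
]

def parse(stream):
    """Yield structured records, silently skipping lines that don't parse."""
    for line in stream:
        parts = line.split(",")
        if len(parts) != 3:
            continue
        device, ts, temp_f = parts
        yield {"device": device, "ts": ts, "temp_f": float(temp_f)}

def to_celsius(records):
    """Convert units in flight, before the record reaches the database."""
    for rec in records:
        rec["temp_c"] = round((rec["temp_f"] - 32) * 5 / 9, 2)
        yield rec

for record in to_celsius(parse(raw_stream)):
    print(record)  # in a real system, this would be a write to the data store
```

Because both stages are generators, nothing is buffered: each record is parsed, converted, and handed to storage one at a time, which is the essence of processing data in flight rather than after the fact.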

This becomes a huge opportunity for data integration providers who understand the special needs of IoT, as well as the technology that those who build IoT-based systems can leverage.  However, the larger value is for those businesses that learn how to leverage IoT to provide better services to their customers by offering insights that have previously been impossible.  Be it jet engine reliability, the fuel efficiency of my car, or feedback to my physician from sensors on my body, this is game changing stuff.  At the heart of its ability to succeed is the ability to move data from place-to-place.

Posted in Business Impact / Benefits, Business/IT Collaboration, Data Integration, Data Services | 5 Comments