Tag Archives: Big Data

Best Kept Secrets for Successful Data Governance

If you’ve spent some time studying and practicing data governance, you would agree that it is a challenging yet rewarding endeavor. Across industries, a growing number of organizations have put data governance programs in place so they can manage their data more effectively and drive business value. But the reality is that data governance is a complex process, and most companies practicing it today are still in the early phase of a very long journey. In fact, according to the results of more than 240 completed data governance assessments on http://governyourdata.com/, a community website dedicated to everything data governance, the average data governance maturity score is only 1.6 out of 5. It’s no surprise that data governance was a hot topic at last week’s Informatica World 2015. Over a dozen presentations and panel discussions on data governance were delivered; practitioners across various industries shared their real-world stories on topics ranging from how to kick-start a data governance program and how to build business cases for it, to frameworks, stewardship management, and the choice of technologies. For me, the key takeaways are:

  1. Old but still true – To do data governance the right way, you must start small and focus on achieving tangible results. Leverage the small victories to advance to the next phase.
  2. Be prepared to fail more than once while building a data governance program. But don’t quit, because your data will not.
  3. One size doesn’t fit all when it comes to building a data governance framework, which is a challenge for organizations, as there is no magic formula that companies can immediately adopt. Should you build a centralized or federated data governance operation? That really depends on what works within your existing environment.
    In fact, when asked “what’s the most challenging area for your data governance effort” in our recent survey conducted at Informatica World 2015, “identify roles and responsibilities” got the most mentions. The basic principle? Choose a framework that blends well with your company‘s culture.
  4. Let’s face it, data governance is not an IT project, nor is it about fixing data problems. It is a business function that calls for people, process and technology working together to obtain the most value from your data. Our seasoned practitioners recommend a systematic approach. Your first priority should be gathering people – identifying the right people with the right skills and, most importantly, those who have a passion for data. Next is figuring out the process. Things to consider include: What are the requirements for data quality? What metrics and measurements should be used to examine the data? How will you handle exceptions and remediate data issues? How can you quickly identify and apply security measures to the various data sets? (A small illustrative sketch of this kind of rule-and-exception process appears after this list.) The third priority is selecting the right technologies to implement and facilitate those processes, so the data can be transformed to help meet business goals.
  5. “Engage your business early on” is another important tip from our customers who have achieved early success with their data governance programs. A data governance program will not be sustainable without participation from the business. The reason is simple – the business owns the data; they are the consumers of the data and have specific requirements for the data they want to use. IT needs to work collaboratively with the business to meet those requirements, so the data is fit for use and provides good value for the business.
  6. Scalability, flexibility and interoperability should be the key considerations when it comes to selecting data governance technologies. Your technology platform should be able to easily adapt to new requirements arising from changes in your data environment. A Big Data project, for example, introduces new data types and increased data speed and volume. Your data management solution should be agile enough to address those new challenges with minimal disruption to your workflow.
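
To make the “process” piece in item 4 a little more concrete, here is a minimal, hypothetical sketch of how a steward might codify a couple of data quality rules and route failures into an exception queue for remediation. It is not an Informatica product example; the record fields, rules and scoring logic are illustrative assumptions.

```python
# Illustrative only: a tiny data quality check with exception routing.
# Record fields, rules, and the pass/fail logic are hypothetical assumptions.
import re

CUSTOMER_RECORDS = [
    {"id": 1, "email": "jane@example.com", "country": "US"},
    {"id": 2, "email": "not-an-email", "country": ""},
]

RULES = {
    "email_is_valid": lambda r: bool(re.match(r"[^@\s]+@[^@\s]+\.[^@\s]+$", r["email"])),
    "country_present": lambda r: bool(r["country"].strip()),
}

def run_quality_checks(records):
    """Return a pass-rate score per rule plus an exception queue for stewards."""
    scores, exceptions = {}, []
    for name, rule in RULES.items():
        results = [rule(r) for r in records]
        scores[name] = sum(results) / len(records)
        exceptions += [{"rule": name, "record_id": r["id"]}
                       for r, ok in zip(records, results) if not ok]
    return scores, exceptions

if __name__ == "__main__":
    scores, exceptions = run_quality_checks(CUSTOMER_RECORDS)
    print(scores)      # e.g. {'email_is_valid': 0.5, 'country_present': 0.5}
    print(exceptions)  # the records a data steward would remediate
```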

Data governance is HOT! The well-attended sessions at Informatica World, as well as some of our previously hosted webinars, are testimony to the enthusiasm among our customers, partners, and our own employees on this topic. It’s an exciting time for us at Informatica because we are in a great position to help companies build an effective data governance program. In fact, many of our customers have been relying on our industry-leading data management tools to support their data governance programs, and have achieved results in business areas such as meeting compliance requirements, improving customer centricity and enabling advanced analytics projects. To continue the dialogue and facilitate further learning, I’d like to invite you to an upcoming webinar on May 28 to hear insightful, pragmatic tips and tricks for building a holistic data governance program from industry expert David Loshin, Principal at Knowledge Integrity, Inc., and Informatica’s own data governance guru Rob Karel.

Get Ready!

“Better data is everyone’s job” – well said by Terri Mikol, director of Data Governance at the University of Pittsburgh Medical Center. For companies striving to leverage data to deliver business value, everyone within the company should treat data as a strategic asset and take responsibility for delivering clean, connected and safe data. Only then can your organization be considered truly “Data Ready”.


The Internet of Things, So Mom Can Understand

Dear Mom,

It’s great to hear that you’re feeling pretty hip about your ability to explain data, metadata and Big Data to your friends. I did get your panicked voicemail asking about the Internet of Things (IoT). Yes, you’re right that the Internet itself is a thing, so it’s confusing to know whether the Internet is one of the things that the IoT includes. So let’s make it a bit simpler so you can feel more comfortable about the topic at your bridge game next week. (Still shocked you’re talking about data with your friends – Dad complains you only talk with him about “Dancing with the Stars”.)

First let’s describe the Internet itself. You use it every day when you do a Google search: it’s a publicly accessible network of computer systems from around the world. Someday the Internet will hopefully allow all of us to share all human knowledge with one another.

But what about knowledge that’s not “human”?   It’s not only people that create information.   The “things” in IoT are the devices and machines that people, companies and governments rely upon every day.  Believe it or not, there are billions of devices today that are also “connected” – meaning they have the ability to send information to other computers.  In fact, the technology research firm Gartner says there will be 4.9 billion connected devices, or “things” in 2015.  And by 2020, that number will reach 25 billion!

Some of these connected devices are more obvious than others – and some have been around a very long time. For example, you use ATMs all the time at the bank when you’re withdrawing and depositing money. Clearly those machines can only do that if they’re connected to the bank’s computer systems that hold your account information.

Your iPhone, your “smart” thermostat, and your washing machine with the call-home feature are all connected “things” too. Mom, imagine waking up in the morning to coffee that’s already brewed – because the coffee machine knew from your alarm clock what time you were getting up.

The IoT will also help make sure your oven is off, help you find your lost keys, and let your fridge check how many eggs are left while you’re standing in the grocery store. The possibilities are limitless.

And it doesn’t stop there.   Medical devices are communicating with doctors, jet engines are communicating with their manufacturers, and parking meters are communicating with city services.

And guess what?  There are people watching the computers that collect all of that information, and they’re working hard to figure out what value they can deliver by using it effectively.   It’s actually this machine data that’s going to make Big Data REALLY Big.

So does that mean your espresso maker, your cell phone and your car will be conspiring to take over the house? Probably not something we need to worry about this year (maybe keep an eye on the refrigerator just in case). But in the short term, it does mean that people like me who have dedicated our careers to data management will have our work cut out for us: figuring out how to make sense of all of this new machine interaction data from devices, how to marry it with the people interaction data from social networks like Facebook, LinkedIn and Twitter, and then how to combine all of that with the transactional data that companies capture during the normal course of business. As I’ve said before, data’s a good business to be in!

Happy Mother’s Day!

Love, Rob


Big Data’s Next Big Frontier: Earthquake Prediction

There are lots of really fascinating applications coming out of the big data space as of late, and I recently came across one that may be the coolest of them all: a UK-based firm that is employing big data to help predict earthquakes.

Unfortunately, predicting earthquakes has thus far been almost impossible. Imagine if people living in an earthquake zone could get at least several hours’ notice, maybe even several days, just as those in the paths of hurricanes get advance warning and can flee or prepare. Hurricane and storm modeling is one of the earliest examples of big data in action, going back decades. The big data revolution may now be on the verge of making earthquake prediction modeling possible as well.

Bernard Marr, in a recent Forbes post, explains how Terra Seismic employs satellite data to sense impending shakers:

“The systems use data from US, European and Asian satellite services, as well as ground based instruments, to measure abnormalities in the atmosphere caused by the release of energy and the release of gases, which are often detectable well before the physical quake happens. Large volumes of satellite data are taken each day from regions where seismic activity is ongoing or seems imminent. Custom algorithms analyze the satellite images and sensor data to extrapolate risk, based on historical facts of which combinations of circumstances have previously led to dangerous quakes.”

http://www.forbes.com/sites/bernardmarr/2015/04/21/big-data-saving-13000-lives-a-year-by-predicting-earthquakes/
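
Terra Seismic has not published its algorithms, so the following is only a deliberately simplified, hypothetical sketch of the general pattern described above: compare current satellite and ground-sensor readings against a historical baseline for a region and flag unusual combinations. The feature names, baseline values and threshold are all illustrative assumptions, not Terra Seismic’s actual model.

```python
# Deliberately simplified illustration of baseline-vs-current anomaly scoring.
# Feature names, historical values, and the alert threshold are hypothetical.
from statistics import mean, stdev

# Historical readings for one region (e.g. a thermal anomaly index and a
# gas-release measurement derived from satellite and ground sensors).
HISTORY = {
    "thermal_anomaly": [0.9, 1.1, 1.0, 0.8, 1.2, 1.0, 0.95],
    "gas_release":     [3.0, 2.8, 3.2, 3.1, 2.9, 3.0, 3.05],
}

def risk_score(current):
    """Average absolute z-score of today's readings against the baseline."""
    z_scores = []
    for feature, value in current.items():
        mu, sigma = mean(HISTORY[feature]), stdev(HISTORY[feature])
        z_scores.append(abs(value - mu) / sigma)
    return sum(z_scores) / len(z_scores)

if __name__ == "__main__":
    today = {"thermal_anomaly": 2.4, "gas_release": 4.6}
    score = risk_score(today)
    print(f"risk score: {score:.1f}")
    if score > 3.0:  # arbitrary alert threshold for the example
        print("elevated seismic risk: flag this region for review")
```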

So far, Marr reports, Terra Seismic has been able to predict major earthquakes anywhere in the world with 90% accuracy. Among them is a prediction, issued on February 22nd, that a 6.5-magnitude quake would hit the Indonesian island of Sumatra. The island was hit by a 6.4-magnitude quake on March 3rd.

There’s no question that the ability to forecast earthquakes accurately – at least as closely as hurricanes and major blizzards can be predicted – would not only save many human lives, but also be invaluable to government agencies and businesses.

At the same time, such creative – and potentially game-changing – applications of big data provide vivid examples of how data can be converted into insights that were never possible before. Many business leaders are looking for ways to shine a light on potential events within their organizations and markets, and examples such as Terra Seismic accentuate the benefits big data can deliver.

Terra Seismic’s forecasts are available through a public website: http://quakehunters.com/


Informatica Wins Cisco’s 2014 ISV Partner of the Year – Americas

Partners play an instrumental role in Informatica’s business, and have for many years. But in some years a unique partnership truly blossoms, and both companies come together to do really special things that neither could have conceived of alone. That is the case in 2015 with our Cisco partnership.

On April 28, Informatica was awarded ISV Partner of the Year for the Americas by Cisco in Montreal, Canada.

This year the Cisco award recognized two jointly created solutions. The first is an end-to-end Data Warehouse Optimization (DWO) solution. By combining Cisco UCS (Unified Computing System) with Hadoop, Informatica Big Data Edition (BDE) and Cisco Data Virtualization, a customer gains access to a powerful, next-generation big data analytics platform for both structured and unstructured data.

This solution was created to help customers reduce both the CAPEX and OPEX of their ever-more-expensive Enterprise Data Warehouse. By offloading infrequently used or “dark” data, along with ETL (extract, transform and load) jobs and mappings, into a Hadoop-based data lake, a customer can realize a 5-10X cost reduction.
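
As a rough, hypothetical illustration of the offload pattern described above, the PySpark sketch below separates rarely accessed (“dark”) records from a warehouse extract and parks them as Parquet files on Hadoop. The paths, the last_accessed column and the 18-month cutoff are assumptions for the example, not details of the joint Cisco/Informatica solution.

```python
# Illustrative PySpark offload of rarely used records to Hadoop as Parquet.
# Paths, the table layout, and the access-date cutoff are hypothetical.
from datetime import datetime, timedelta

from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("edw-offload-example").getOrCreate()

cutoff = datetime.now() - timedelta(days=18 * 30)  # roughly 18 months

# A warehouse extract previously landed on HDFS (hypothetical path and schema).
orders = spark.read.parquet("hdfs:///staging/edw/orders")

cold = orders.filter(col("last_accessed") < cutoff)   # rarely touched rows
hot = orders.filter(col("last_accessed") >= cutoff)   # keep these in the EDW

# Cold data goes to inexpensive Hadoop storage; hot data is written back
# to a separate path for reload into the warehouse.
cold.write.mode("overwrite").parquet("hdfs:///datalake/archive/orders")
hot.write.mode("overwrite").parquet("hdfs:///staging/edw/orders_active")

spark.stop()
```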

The second solution recognized by Cisco was a jointly created Internet of Things (IoT) / Internet of Everything (IoE) offering. With the explosion of sensor, social and internet-based data, the two companies recognized the need for a solution that would incorporate data from “the edge” into a customer’s mainstream data repositories (EDW, BD, Hadoop, etc.).

This solution couples Cisco routers and hardened devices that collect sensor data (i.e., telemetry data) with Informatica’s real-time data ingestion and analytics capabilities. By combining these technologies, a customer can aggregate data from every source where it is captured, gaining a 360-degree view of the business for competitive advantage.
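
To show, at the simplest possible level, what “aggregating data from the edge” can look like before it is merged with other enterprise data, here is a small, hypothetical sketch that rolls raw telemetry messages up into per-device summaries. The message shape and field names are assumptions for illustration, not part of the Cisco/Informatica offering.

```python
# Illustrative roll-up of raw edge telemetry into per-device summaries.
# The message shape (device_id, metric, value) is a hypothetical assumption.
from collections import defaultdict

def summarize(messages):
    """Aggregate raw readings into min/max/avg per (device, metric) pair."""
    buckets = defaultdict(list)
    for m in messages:
        buckets[(m["device_id"], m["metric"])].append(m["value"])
    return {
        key: {"min": min(vals), "max": max(vals), "avg": sum(vals) / len(vals)}
        for key, vals in buckets.items()
    }

if __name__ == "__main__":
    telemetry = [
        {"device_id": "router-17", "metric": "temp_c", "value": 41.0},
        {"device_id": "router-17", "metric": "temp_c", "value": 44.5},
        {"device_id": "meter-03", "metric": "kwh", "value": 12.2},
    ]
    print(summarize(telemetry))
```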

From our announcement in February: “The Data Warehouse Optimization solution is about enabling organizations to more easily leverage all their data assets – current and historical, transaction and interaction – for more effective analytics while reducing their data management costs,” said Mike Flannagan, vice president and general manager of Data and Analytics at Cisco. “More than the sum of its parts, the solution’s Cisco and Informatica elements work synergistically to meet the demands of big data, to respond quickly to changing information needs, and to deliver insights that drive increased competitiveness and business innovation.”

Every organization on the planet is working hard to gather and analyze its data and to make decisions based on it. The joint Informatica and Cisco solutions are critical to helping customers become Data Ready enterprises today and for years to come.

Moving forward, Cisco and Informatica will continue to collaborate on how best to build and deliver on-premises, cloud-based and hybrid solutions, so that customers have best-of-breed options for solving their exploding data volumes and complex IT problems.


Put Yourself Ahead of the Digital Transformation Curve – Informatica World 2015

The rapid advances we are seeing in social, mobile and other digital technologies have significantly transformed the way we live. My commute to the airport for business travel takes three taps on a smartphone app, and a leading retailer I shop with frequently knows what I like, which products I have put in my cart using my iPad, and which products I “liked” on their social channels, so they can make real-time recommendations while I am in their physical store. Their employees are now armed with information that is integrated in ways that empower them to be more customer-centric than ever before.

All of these rapid changes, in just over a decade, have been brought to us by companies that pioneered digital transformation, and data is at the center of this revolution. These organizations gained competitive advantage from social, mobile, analytics, cloud and Internet of Things technologies. In a world filled with data of ever-greater variety, volume and velocity, it’s more important than ever for organizations to become data-ready. And while the potential for insight in big data is massive, we need a new generation of Master Data Management to realize its full potential.

Against the backdrop of this rapid, data-fueled change, a growing number of companies are realizing that they have a massive opportunity in front of them. At the center of this digital transformation is master data, which gives organizations the opportunity to:

  • Better understand customers, their households and their relationships so they can cross-sell and up-sell effectively
  • Identify customers interacting with the company via different channels so they can push relevant offers to those customers in real time (a small illustrative sketch of this kind of cross-channel matching appears after this list)
  • Offer better product recommendations to customers based on their purchasing and browsing behavior
  • Optimize the way they manage their inventory, leading to significant cost savings
  • Manage supplier relationships more effectively so they can negotiate better rates
  • Provide superior patient care and treat harmful diseases at an early stage by creating patient-centric solutions that connect health information from more and more sources
  • Master wellhead and other upstream exploration and production assets so they can do better crew allocation and production planning
  • Comply with complex and ever-changing government regulations, avoiding significant fines and penalties
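
The cross-channel identification bullet above ultimately comes down to matching records that describe the same customer. Production MDM matching is far more sophisticated, but a minimal, hypothetical sketch of the idea might look like this (field names and the similarity threshold are assumptions):

```python
# Illustrative matching of customer records that arrive from different channels.
# Field names and the similarity threshold are hypothetical assumptions;
# production MDM matching is far more sophisticated than this.
from difflib import SequenceMatcher

def similarity(a, b):
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def is_same_customer(rec_a, rec_b, threshold=0.85):
    """Crude match on average name and email similarity."""
    name_score = similarity(rec_a["name"], rec_b["name"])
    email_score = similarity(rec_a["email"], rec_b["email"])
    return (name_score + email_score) / 2 >= threshold

web_record = {"name": "Jon Smith", "email": "jon.smith@example.com"}
store_record = {"name": "Jonathan Smith", "email": "jon.smith@example.com"}

# Likely True here, so the two records would be linked into one golden record.
print(is_same_customer(web_record, store_record))
```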

We will talk about all this and more at Informatica World 2015 which is happening next week in Las Vegas. Join us for the MDM Day on May 12 followed by Information Quality and Governance track sessions on May 13 and 14. Register now.

We have 37 sessions that cover Master Data Management, Omnichannel Commerce, Data Quality, Data as a Service and Big Data Relationship Management. You get a chance to learn about Informatica customers’ experiences, best practices from our partners, and our vision and roadmap straight from our product management team. We will also talk about master-data-fueled Total Customer Relationship and Total Supplier Relationship applications that leverage our industry-leading multidomain MDM platform.

Here is your guide to the sessions that will be covered. I will see you there. If you want to say hello in person, reach out to me at @MDMGeek, and follow the @InfaMDM Twitter handle for all the latest news. The hashtag for this event is #INFA15.

~Prash
@MDMGeek
www.mdmgeek.com


There is Just One V in Big Data

According to Gartner, 64% of organizations surveyed had purchased or were planning to invest in Big Data systems. More and more companies are diving into their data, trying to put it to use to minimize customer churn, analyze financial risk, and improve the customer experience.

Of that 64%, 30% have already invested in Big Data technology, 19% plan to invest within the next year, and another 15% plan to invest within two years. Less than 8% of Gartner’s 720 respondents, however, have actually deployed Big Data technology. That gap is troubling: most companies simply don’t know what they’re doing when it comes to Big Data.

Over the years, we have heard that Big Data is about Volume, Velocity, and Variety. I feel this limited view is one of the reasons why, despite the Big Data hype, most companies are still stuck in neutral.

  1. Volume: Terabytes to exabytes, petabytes to zettabytes – lots and lots of data
  2. Velocity: Streaming data, milliseconds to seconds – how fast data is produced, and how fast it must be processed to meet the need or demand
  3. Variety: Structured and unstructured – text, multimedia, video, audio, sensor data, meter data, HTML, e-mails, etc.

For us, the focus is on the collection of data. After all, we are prone to be hoarders, wired by our survival instinct to collect and stockpile for the leaner winter months that may come. So we hoard as much data as we can for the elusive “What if?” scenario: “Maybe this will be useful someday.” It’s this stockpiling of Big Data without application that makes it useless.

While Volume, Velocity, and Variety are focused on the collection of data, Gartner in 2014 introduced three additional Vs – Veracity, Variability, and Value – which focus on the usefulness of the data.

  1. Veracity: Uncertainty due to data inconsistency and incompleteness, ambiguities, latency, deception, model approximations, accuracy, quality, truthfulness or trustworthiness
  2. Variability: The differing ways in which the data may be interpreted; different questions require different interpretations
  3. Value: Data for co-creation and deep learning

I believe that perfecting as few as 5% of the relevant variables will get a business 95% of the same benefit. The trick is identifying that viable 5%, and extracting meaningful information from it. In other words, “Value” is the long pole in the tent.
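
As a hypothetical illustration of hunting for that vital few percent of variables, the sketch below generates a wide, synthetic dataset in which only two of forty columns actually drive the outcome, then ranks the columns by their correlation with it. The data, column names and “keep the top two” cut are assumptions made purely for the example.

```python
# Illustrative way to shortlist a handful of high-signal variables.
# The synthetic data, column names, and "keep the top two" cut are assumptions.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
df = pd.DataFrame(rng.normal(size=(1000, 40)),
                  columns=[f"var_{i}" for i in range(40)])

# Pretend the outcome depends mostly on two of the forty variables.
df["churned"] = (0.8 * df["var_3"] - 0.6 * df["var_17"]
                 + rng.normal(scale=0.5, size=1000) > 0).astype(int)

# Rank variables by absolute correlation with the outcome; keep the vital few.
corr = df.drop(columns="churned").corrwith(df["churned"]).abs()
shortlist = corr.sort_values(ascending=False).head(2)
print(shortlist)  # var_3 and var_17 should surface at the top
```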

Twitter @bigdatabeat


Succeeding with Analytics in a Big Data World

Big data is receiving a lot of press these days, including from this author. While there continues to be constructive dialog regarding whether volume, velocity, or variety is the most important attribute of the big data movement, one thing is clear: constructed correctly, big data has the potential to transform businesses by increasing sales and operational efficiency. More importantly, when big data is combined with predictive analytics, it can improve customer experience, enable better targeting of potential customers, and improve the core business capabilities that are foundational to a business’s right to win.

The problem many in the vanguard have discovered is that big data projects are fraught with risk if they are not built upon a solid data management foundation. During the Big Data Summit, you will learn directly from the vanguard of big data: how they have successfully transitioned from the traditional world of data management to the new world of big data analytics. Hear from market-leading enterprises like Johnson & Johnson, Transamerica, Devon Energy, KPN, and Western Union. You will also hear from Tom Davenport, Distinguished Professor in Management and Information Technology at Babson College and bestselling author of “Competing on Analytics” and “Big Data at Work.” Tom will share his perspective from interviewing hundreds of companies about the successes and failures of their big data initiatives; he initially thought big data was just another example of technology hype, but his research changed his mind. And finally, hear from big data thought leaders including Cloudera, Hortonworks, Cognizant, and Capgemini, who are all here to share their stories on how to avoid common pitfalls and accelerate your analytical returns in a big data world.

To attend in person, please join us on Tuesday the 12th at 1:30 in Las Vegas at the Big Data Summit. If you cannot join us in person, I will be sharing live tweets and videos on Twitter starting at 1:30 PST. Look for me at @MylesSuer on Twitter to follow along.

Related Blogs

What is Big Data and why should your business care?
Big Data: Does the emperor have their clothes on?
Should We Still be calling it Big Data?
CIO explains the importance of Big Data to healthcare
Big Data implementations need a systems view and to put in place trustworthy data.
The state of predictive analytics
Analytics should be built upon Business Strategy
Analytics requires continuous improvement too?
When should you acquire a data scientist or two?
Who Owns Enterprise Analytics and Data?
Competing on Analytics: A Follow Up to Thomas H. Davenport’s Post in HBR
Thomas Davenport Book “Competing On Analytics”
Analytics Stories: A Banking Case Study
Analytics Stories: A Financial Services Case Study
Analytics Stories: A Healthcare Case Study

Author Twitter: @MylesSuer

 


Come to Informatica World 2015 and Advance Your Career

5 Reasons Data Integration Professionals Absolutely Must Not Miss This Informatica World.

If you are a Data Integration or Data Management professional, you really cannot afford to miss this event. This year’s theme in the Data Integration track at Informatica World is all about customers. Over 50 customers will be sharing their experiences and best practices for succeeding with data integration projects such as analytics, big data, application consolidation and migration, and much more.

If you still need convincing, here are the five reasons:

  1. Big Data: A special Big Data Summit is part of the track.
  2. Immediate Value: Over 50 customers will be sharing their experiences and best practices – things you can start doing now to improve your organization.
  3. Architecture for Business Transformation: An architecture track focused on practical approaches for using architecture to enable business transformation, with specific examples and real customer experiences.
  4. Hands-On Labs: Everybody loves them. This year we have even more. Sign up early to make sure that you get your choice – they go fast!
  5. New “Meet the Experts” Sessions: These are small group meetings for business-level discussions around subjects like big data, analytics, application consolidation, and more.

This truly will be a one-stop shop for all things data integration at Informatica World. The pace of both competition and technology change is accelerating. Attend this event to stay on top of what is happening in the world of data integration and how leading companies and experts are using data for competitive advantage within their organizations.

To help start your planning, here is a listing of the Data Integration, Architecture, and Big Data Sessions this year.  I hope to see you there.

QUICK GUIDE

DATA INTEGRATION AND BIG DATA at INFORMATICA WORLD 2015

Breakout Sessions, Tuesday, May 12

| Session | Time | Location |
| --- | --- | --- |
| Accelerating Business Value Delivery with Informatica Platform (Architect Track Keynote) | 10:45am – 11:15am | Gracia 6 |
| How to Support Real Time Data Integration Projects with PowerCenter (Grant Thornton) | 1:30pm – 2:30pm | Gracia 2 |
| Knowledgent | 11:30am – 12:15pm | Gracia 8 |
| Putting Big Data to Work to Make Cancer History at MD Anderson Cancer Center (MD Anderson) | 11:30am – 12:15pm | Gracia 4 |
| Modernize your Data Architecture for Speed, Efficiency, and Scalability | 11:30am – 12:15pm | Castellana 1 |
| An Architectural Approach to Data as an Asset (Cisco) | 11:30am – 12:15pm | Gracia 6 |
| Accenture | 11:30am – 12:15pm | Gracia 2 |
| Architectures for Next-Generation Analytics | 1:30pm – 2:30pm | Gracia 6 |
| Informatica Marketplace (Tamara Strifler) | 1:30pm – 2:30pm | Gracia 4 |
| Informatica Big Data Ready Summit: Keynote Address (Anil Chakravarthy, EVP and Chief Product Officer) | 1:40pm – 2:25pm | Castellana 1 |
| Big Data Keynote: Tom Davenport, Distinguished Professor in Management and Information Technology, Babson College | 2:30pm – 3:15pm | Castellana 1 |
| How to Test and Monitor Your Critical Business Processes with PowerCenter (Discount Tire, AT&T) | 2:40pm – 3:25pm | Gracia 2 |
| Enhancing Consumer Experiences with Informatica Data Integration Hub (Humana) | 2:40pm – 3:25pm | Gracia 4 |
| Business Transformation: The Case for Information Architecture (Cisco) | 2:40pm – 3:25pm | Gracia 6 |
| Succeeding with Big Data and Avoiding Pitfalls (CapGemini, Cloudera, Cognizant, Hortonworks) | 3:15pm – 3:30pm | |
| What’s New in B2B Data Exchange: Self-Service Integration of 3rd Party Partner Data (BMC Software) | 3:35pm – 4:20pm | Gracia 2 |
| PowerCenter Developer: Mapping Development Tips & Tricks | 3:35pm – 4:20pm | Gracia 4 |
| Modernize Your Application Architecture and Boost Your Business Agility (Mototak Consulting) | 3:35pm – 4:20pm | Gracia 6 |
| The Big Data Journey: Traditional BI to Next Gen Analytics (Johnson & Johnson, Transamerica, Devon Energy, KPN) | 4:15pm – 4:30pm | Castellana 1 |
| L&T Infotech | 4:30pm – 5:30pm | Gracia 2 |
| What’s New in PowerCenter, PowerCenter Express and PowerExchange? | 4:30pm – 5:30pm | Gracia 4 |
| Next-Generation Analytics Architecture for the Year 2020 | 4:30pm – 5:30pm | Gracia 6 |
| Accelerate Big Data Projects with Informatica (Jeff Rydz) | 4:35pm – 5:20pm | Castellana 1 |
| Big Data: Michael J. Franklin, Professor of Computer Science, UC Berkeley | 5:20pm – 5:30pm | Castellana 1 |

  • Informatica World Pavilion: 5:15 PM – 8:00 PM

Breakout Sessions, Wednesday, May 13

| Session | Time | Location |
| --- | --- | --- |
| How Mastercard is using a Data Hub to Broker Analytics Data Distribution (Mastercard) | 2:00pm – 2:45pm | Gracia 2 |
| Cause: Business and IT Collaboration Effect: Cleveland Clinic Executive Dashboard (Cleveland Clinic) | 2:00pm – 2:45pm | Castellana 1 |
| Application Consolidation & Migration Best Practices: Customer Panel (Discount Tire, Cisco, Verizon) | 2:55pm – 3:55pm | Gracia 2 |
| Big Data Integration Pipelines at Cox Automotive (Cox Automotive) | 2:55pm – 3:55pm | Gracia 4 |
| Performance Tuning for PowerCenter and Informatica Data Services | 2:55pm – 3:55pm | Gracia 6 |
| US Bank and Cognizant | 2:55pm – 3:55pm | Castellana 1 |
| Analytics architecture (Teradata, Hortonworks) | 4:05pm – 4:50pm | Gracia 4 |
| A Case Study in Application Consolidation and Modernization—Migrating from Ab Initio to Informatica (Kaiser Permanente) | 4:05pm – 4:50pm | Castellana 1 |
| Monetize Your Data With Hadoop and Agile Data Integration (AT&T) | 4:05pm – 4:50pm | Gracia 2 |
| How to Enable Advanced Scaling and Metadata Management with PowerCenter (PayPal) | 5:00pm – 5:45pm | Castellana 1 |
| How Verizon is consolidating 50+ legacy systems into a modern application architecture, optimizing Verizon’s enterprise sales and delivery process (Verizon) | 5:00pm – 5:45pm | Gracia 6 |
| A guided tour to one of the most complex Informatica Installations worldwide (HP) | 5:00pm – 5:45pm | Gracia 2 |
| Integration with Hadoop: Best Practices for mapping development using Big Data Edition | 5:00pm – 5:45pm | Gracia 4 |

Meet the Experts Sessions, Wednesday, May 13

| Session | Time | Location |
| --- | --- | --- |
| Meet the Expert: App Consolidation – Driving Greater Business Agility and Reducing Costs Through Application Consolidation and Migration (Roger Nolan) | 12:00pm – 12:50pm, 1:00pm – 1:50pm and 2:55pm – 3:55pm | Castellana 2 |
| Meet the Expert: Big Data – Delivering on the Promise of Big Data Analytics (John Haddad) | 12:00pm – 12:50pm, 1:00pm – 1:50pm and 2:55pm – 3:55pm | Castellana 2 |
| Meet the Expert: Architect – Laying the Architectural Foundation for the Data-Driven Enterprise (David Lyle) | 12:00pm – 12:50pm, 1:00pm – 1:50pm and 2:55pm – 3:55pm | Castellana 2 |

  • Informatica World Pavilion: 11:45 AM – 2:00 PM

Breakout Sessions, Thursday, May 14

| Session | Time | Location |
| --- | --- | --- |
| Enterprise Architecture and Business Transformation Panel (Cisco) | 9:00am – 10:00am | Gracia 6 |
| The Data Lifecycle: From infancy through retirement, how Informatica can help (Mototak Consulting) | 9:00am – 10:00am | Gracia 4 |
| How Allied Solutions Streamlined Customer Data Integration using B2B Data Exchange (Allied Solutions) | 9:00am – 10:00am | Gracia 2 |
| How the State of Washington and Michigan State University are Delivering Integration as a Service (Michigan State University, Washington State Department of Enterprise Services) | 9:00am – 10:00am | Gracia 1 |
| Real Time Big Data Streaming Analytics (PRA Group) | 10:10am – 11:10am | Gracia 1 |
| Extending and Modernizing Enterprise Data Architectures (Philip Russom, TDWI) | 10:10am – 11:10am | Gracia 4 |
| Best Practices for Saving Millions by Offloading ETL/ELT to Hadoop with Big Data Edition and Vibe Data Stream (Cisco) | 10:10am – 11:10am | Gracia 2 |
| Retire Legacy Applications – Improve Your Bottom-Line While Managing Compliance (Cisco) | 11:20am – 12:20pm | Gracia 4 |
| How a Data Hub Reduces Complexity, Cost and Risk for Data Integration Projects | 11:20am – 12:20pm | Gracia 1 |
| Title? (Cap Gemini) | 11:20am – 12:20pm | Gracia 2 |
| What’s New in PowerCenter, PowerCenter Express and PowerExchange? | 2:30pm – 3:30pm | Gracia 4 |
| Title? (Keyur Desai) | 2:30pm – 3:30pm | Gracia 2 |
| How to run PowerCenter & Big Data Edition on AWS & connect Data as a Service (Customer) | 2:30pm – 3:30pm | Gracia 1 |
| Accelerating Business with Near-Realtime Architectures | 2:30pm – 3:30pm | Gracia 6 |

  • Informatica World Pavilion: 12:30 PM – 3:30 PM

Hands-On Labs

General Interest

| Session | Times | Location |
| --- | --- | --- |
| PowerCenter 9.6.1 Upgrade 1 | | Table 01 |
| PowerCenter 9.6.1 Upgrade 2 (repeat) | | Table 02 |
| PowerCenter Advanced Edition – High Availability & Grid | Mon 1:00, 3:00; Tue 7:30, 11:45, 2:40, 4:25; Wed 10:45, 12:45, 2:55, 5:00, 7:00; Thu 9:00, 11:20, 1:15; Fri 7:30, 9:30, 11:30 | Table 03a |
| PowerCenter Advanced Edition – Metadata Manager & Business Glossary | Mon 2:00, 4:00; Tue 10:45, 1:45, 3:35; Wed 7:30, 11:45, 2:00, 4:05, 6:00; Thu 7:30, 10:10, 12:15, 2:15; Fri 8:30, 10:30 | Table 03b |
| Data Archive | Mon 1:00, 3:00; Tue 7:30, 11:45, 2:40, 4:25; Wed 10:45, 12:45, 2:55, 5:00, 7:00; Thu 9:00, 11:20, 1:15; Fri 7:30, 9:30, 11:30 | Table 06a |
| Test Data Management | Mon 2:00, 4:00; Tue 10:45, 1:45, 3:35; Wed 7:30, 11:45, 2:00, 4:05, 6:00; Thu 7:30, 10:10, 12:15, 2:15; Fri 8:30, 10:30 | Table 06b |

Analytics-Related

| Session | Times | Location |
| --- | --- | --- |
| PowerCenter Big Data Edition – Delivering on the Promise of Big Data Analytics | All other times not taken by Table 11b | Table 11a |
| Elastic Analytics: Big Data Edition in the Cloud | Mon 4:00; Tue 11:45, 3:35; Wed 12:45, 5:00, 7:00; Thu 9:00, 1:15, 2:15; Fri 10:30 | Table 11b |
| Greater Agility and Business-IT Collaboration using Data Virtualization | Mon 1:00, 3:00; Tue 7:30, 11:45, 2:40, 4:25; Wed 10:45, 12:45, 2:55, 5:00, 7:00; Thu 9:00, 11:20, 1:15; Fri 7:30, 9:30, 11:30 | Table 12a |
| Boosting your performance and productivity with Informatica Developer | Mon 2:00, 4:00; Tue 10:45, 1:45, 3:35; Wed 7:30, 11:45, 2:00, 4:05, 6:00; Thu 7:30, 10:10, 12:15, 2:15; Fri 8:30, 10:30 | Table 12b |
| Democratizing your Data through the Informatica Data Lake | | Table 13 |
| Enabling Self-Service Analytics with Informatica Rev | | Table 14 |
| Real-time Data Integration: PowerCenter Architecture & Implementation Considerations | Mon 1:00; Tue 7:30, 1:45; Wed 7:30, 2:00, 4:05; Thu 9:00, 11:20; Fri 8:30 | Table 15a |
| Real-time Data Integration: PowerExchange CDC on z/OS | Mon 2:00; Tue 10:45, 2:40; Wed 10:45, 5:00; Thu 12:15; Fri 9:30 | Table 15b |
| Real-time Data Integration: PowerExchange CDC on i5/OS | Mon 3:00; Tue 3:35; Wed 11:45, 6:00; Thu 1:15; Fri 10:30 | Table 15c |
| Real-time Data Integration: PowerExchange CDC for Relational (Oracle, DB2, MS-SQL) | Mon 4:00; Tue 11:45, 4:25; Wed 12:45, 2:55, 7:00; Thu 7:30, 10:10, 2:15; Fri 7:30, 11:30 | Table 15d |
| Healthcare Data Management and Modernization for Healthcare Providers | | Table 16 |
| Data Management of Machine Data & Internet of Things | Mon 1:00, 3:00; Tue 7:30, 11:45, 2:40, 4:25; Wed 10:45, 12:45, 2:55, 5:00, 7:00; Thu 9:00, 11:20, 1:15; Fri 7:30, 9:30, 11:30 | Table 17a |
| Handling Complex Data Types with B2B Data Transformation | Mon 2:00, 4:00; Tue 10:45, 1:45, 3:35; Wed 7:30, 11:45, 2:00, 4:05, 6:00; Thu 7:30, 10:10, 12:15, 2:15; Fri 8:30, 10:30 | Table 17b |

Application Consolidation & Migration Related

| Session | Times | Location |
| --- | --- | --- |
| Simplifying Complex Data Integrations with Data Integration Hub | | Table 18 |
| Implementing Trading Partner Integration with B2B Data Exchange | | Table 19 |
| Operationalizing and Scaling your PowerCenter Environment | Mon 1:00, 2:00; Tue 7:30, 10:45, 2:40, 3:35; Wed 10:45, 12:45, 5:00, 6:00, 7:00; Thu 7:30, 9:00, 11:20, 1:15; Fri 7:30, 9:30, 11:30 | Table 20a |
| Effective Operations Management and Administration – What’s New | Mon 3:00 – 3:45, 4:00 – 4:45; Tue 11:45 – 12:30, 1:45 – 2:30, 4:25 – 5:15; Wed 7:30 – 8:15, 11:45 – 12:30, 2:55 – 3:40, 4:05 – 4:50; Thu 10:10 – 10:55, 12:15 – 1:00, 2:15 – 3:00; Fri 8:30 – 9:15, 10:30 – 11:15 | Table 20b |
| Getting the Most out of your Data Integration & Data Quality Platform – Performance and Scalability Tips & Tricks | Mon 1:00, 3:00; Tue 7:30, 11:45, 2:40, 4:25; Wed 10:45, 12:45, 2:55, 5:00, 7:00; Thu 9:00, 11:20, 1:15; Fri 7:30, 9:30, 11:30 | Table 21a |
| Getting the Most out of your Big Data Edition – Performance Best Practices | Mon 2:00, 4:00; Tue 10:45, 1:45, 3:35; Wed 7:30, 11:45, 2:00, 4:05, 6:00; Thu 7:30, 10:10, 12:15, 2:15; Fri 8:30, 10:30 | Table 21b |
| Modernizing and Consolidating Legacy and Application Data: Leveraging Data Services, Data Quality and Data Explorer | Mon 1:00; Tue 10:45, 2:40, 4:25; Wed 10:45, 2:00, 2:55, 4:05, 7:00; Thu 11:20, 2:15; Fri 9:30, 10:30 | Table 22a |
| Connect to *: Connectivity to Long Tail of Next Generation Data Sources | Mon 2:00, 3:00; Tue 7:30, 11:45, 1:45; Wed 7:30, 12:45, 5:00; Thu 9:00, 10:10, 1:15; Fri 7:30, 8:30 | Table 22b |
| Modernizing and Consolidating Legacy and Application Data with PowerExchange Mainframe and CDC | Mon 4:00; Tue 3:35; Wed 11:45, 6:00; Thu 7:30, 12:15, 2:15; Fri 11:30 | Table 22c |
| Retire Legacy Applications and Optimize Application Performance with Informatica Data Archive | | Table 23 |
| Protect Salesforce Sandboxes with Cloud Data Masking | Tue 3:35, 4:25; Wed 6:00, 7:00; Thu 1:15, 2:15; Fri 7:30 | Table 24a |
| Optimally Provision Test Data Sets with Test Data Management | Mon all times; Tue 7:30, 10:45, 11:45, 1:45, 2:40; Wed 7:30, 10:45, 11:45, 12:45, 2:00, 2:55, 4:05, 5:00; Thu 7:30, 9:00, 10:10, 11:20, 12:15; Fri 8:30, 9:30, 10:30, 11:30 | Table 24b |

How Do You Know if Your Business is Not Wasting Money on Big Data?

Smart Big Data Strategy

While CIOs are urged to rethink their backup strategies following warnings from leading analysts that companies are wasting billions on unnecessary storage, consultants and IT solution vendors are selling “Big Data” narratives to these CIOs as a storage optimization strategy.

 

What a CIO must do is ask:

Do you think a backup strategy is the same as a Big Data strategy?

Is your MO – “I must invest in Big Data because my competitor is”?

Do you think Big Data and “data analysis” are synonyms?

Most companies invest very little in their storage technologies, while spending on server and network technologies primarily for backup. Further, the most common mistake businesses make is to fail to update their backup policies. It is not unusual for companies to be using backup policies that are years or even decades old, which do not discriminate between business-critical files and the personal music files of employees.

Even Web giants like Facebook and Yahoo aren’t dealing with Big Data much of the time. They run their own giant, in-house “clusters” – collections of powerful servers – for crunching data. But it appears that those clusters are unnecessary for many of the tasks they’re handed. In the case of Facebook, most of the jobs engineers ask their clusters to perform are in the “megabyte to gigabyte” range, which means they could easily be handled on a single computer – even a laptop.

The necessity of breaking problems into many small parts, and processing each on a large array of computers, characterizes classic Big Data problems like Google’s need to compute the rank of every single web page on the planet.

In “Nobody ever got fired for buying a cluster,” Microsoft Research points out that a lot of the problems solved by engineers at even the most data-hungry firms don’t need to be run on clusters. Why is that a problem? Because there are vast classes of problems for which those clusters are a relatively inefficient, or even wholly inappropriate, solution.

Here is an example of a post exhorting readers to “Incorporate Big Data Into Your Small Business” that is about a quantity of data that probably wouldn’t strain Google Docs, much less Excel on a single laptop. In other words, most businesses are dealing with small data. It’s very important stuff, but it has little connection to the big kind.
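
As a back-of-the-envelope illustration of just how far a single machine goes, the sketch below streams a multi-gigabyte CSV through pandas in chunks and aggregates it with a flat memory footprint. The file name and column names are hypothetical assumptions.

```python
# Illustrative single-machine aggregation of a gigabyte-range CSV.
# The file name and column names are hypothetical assumptions.
import pandas as pd

totals = {}
# Reading in chunks keeps memory flat even if the file is several gigabytes.
for chunk in pd.read_csv("sales_log.csv", chunksize=1_000_000):
    grouped = chunk.groupby("region")["revenue"].sum()
    for region, revenue in grouped.items():
        totals[region] = totals.get(region, 0.0) + revenue

print(pd.Series(totals).sort_values(ascending=False))
```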

Let us lose the habit of putting “big” in front of data to make it sound important. After all, supersizing your data, just because you can, is going to cost you a lot more and may yield a lot less.

So what is it? Big Data, small Data, or Smart Data?

Gregor Mendel uncovered the secrets of genetic inheritance with just enough data to fill a notebook. The important thing is gathering the right data, not gathering some arbitrary quantity of it.

Twitter @bigdatabeat
