Category Archives: B2B
I recently had the opportunity to participate in the “CDO Summit,” hosted by the CDO Club and Capgemini Consulting. While the CDO in this case meant Chief Digital Officer, I noticed some of the speakers had “Data” in their titles, suggesting a close alignment with CDO as Chief Data Officer as well. In fact, the conference program was packed full of discussion and presentations on how data analytics was shifting the game for many enterprises.
Not too long ago, I asked a group of executives what the difference was between a chief data officer and a chief digital officer. Generally, chief data officers were seen as reporting to chief digital officers, since data is the key component of broader efforts to become a digital enterprise. The chief digital officer is assumed to have a role encompassing various aspects of content development, sales and marketing, operations, production, finance, and product development.
Then again, the chief data officer will be immersed in these areas of the business as well.
The overlap and convergence between the two CDOs mirrors what’s happening in many organizations. Many recognize the opportunities now available through digital channels, and the efficiencies that can be gained by adding intelligence to products and services. At the same time, this can only be accomplished by capturing, analyzing, and monetizing the data that is generated by or supports these digital efforts.
This means converged responsibilities, skill demands and opportunities for a range of positions across enterprises – not just CDOs.
Nevertheless, these two types of executives are likely to be looking at things from different perspectives. For example, in terms of background, the chief data officer is likely to have a background in statistical analysis, and may come up through the ranks as a data scientist. Many chief digital officers are coming out of marketing or IT.
Thus, you are likely to find that chief data officers worry more about the data and how it is being created, handled, and secured, while chief digital officers focus on the bigger picture.
For some perspective on the roles of chief data officers, Dr. Anne Marie Smith, principal consultant at Alabama Yankee Systems, LLC, describes the scope of responsibilities in a report from the Cutter Consortium:
- Articulate the enterprise’s data vision
- Serve as “champion for global data management, governance, quality, and vendor relationships across the enterprise.”
- Work with “executives, data owners, and data stewards to achieve data accuracy and process requirement goals for all internal and external customers.”
- Oversee “the monitoring of data quality efforts within the organization.”
- Lead the education of the organization “on data management concepts, the appropriate usage of data, enterprise master data management and data quality concepts, enterprise decision-support concepts, data vendor capabilities, definition and appropriateness of data management, rules on data access, and other data-related issues.”
The responsibilities of chief digital officers don’t fall too far from those of chief data officers, as they also call for data leadership. The roles of this CDO, as explained by Sam Ramji, Vice President of Strategy at Apigee, include the following:
- Articulate the enterprise’s digital strategy – how a digital transformation will help the organization “meet the challenges of a mobile-first world, digital partnerships, and new forms of competition,” as well as “build a consistent experience for customers across different lines of business in order to produce network effects for the enterprise.”
- Earn company-wide commitment for the digital strategy – serving “as a culture broker, establishing a single vision that spans businesses and technologies and being the active champion who gets everyone on board to execute that vision.”
- Embrace data-based experimentation — facilitate the ability to experiment repeatedly in the digital realm, with the expectation that failure is the most important part of innovation.
- Drive for tangible and measurable results.
- Connect with experts in the company and the broader industry.
- Speak multiple business languages (IT, marketing, strategy, finance).
On Saturday, I got a call from my broadband company on my mobile phone. The sales rep pitched a great limited-time offer for new customers. I asked him whether I could take advantage of this great offer as well, even though I am an existing customer. He was surprised. “Oh, you’re an existing customer,” he said, dismissively. “No, this offer doesn’t apply to you. It’s for new customers only. Sorry.” You can imagine my annoyance.
If this company had built a solid foundation of customer data, the sales rep would have had a customer profile rich with clean, consistent, and connected information as reference. If he had visibility into my total customer relationship with his company, he’d know that I’m a loyal customer with two current service subscriptions. He’d know that my husband and I have been customers for 10 years at our current address. On top of that, he’d know we both subscribed to their services while living at separate addresses before we were married.
Unfortunately, his company didn’t arm him with the great customer data he needs to be successful. If they had, he could have taken the opportunity to offer me one of the four services I currently don’t subscribe to—or even a bundle of services. And I could have shared a very different customer experience.
Every customer interaction counts
Executives at companies of all sizes talk about being customer-centric, but it’s difficult to execute on that vision if you don’t manage your customer data like a strategic asset. If delivering seamless, integrated, and consistent customer experiences across channels and touch points is one of your top priorities, every customer interaction counts. But without knowing exactly who your customers are, you cannot begin to deliver the types of experiences that retain existing customers, grow customer relationships and spend, and attract new customers.
How would you rate your current ability to identify your customers across lines of business, channels and touch points?
Many businesses, however, have anything but an integrated and connected customer-centric view—they have a siloed and fragmented channel-centric view. In fact, sales, marketing, and call center teams often identify siloed and fragmented customer data as key obstacles preventing them from delivering great customer experiences.
According to Retail Systems Research, creating a consistent customer experience remains the most valued capability for retailers, but 55% of those surveyed indicated their biggest inhibitor was not having a single view of the customer across channels.
Retailers are not alone. An SVP of marketing at a mortgage company admitted in an Argyle CMO Journal article that, now that his team needs to deliver consistent customer experiences across channels and touch points, they realize they are not as customer-centric as they thought they were.
Customer complexity knows no bounds
The fact is, businesses are complicated, with customer information fragmented across divisions, business units, channels, and functions.
Citrix, for instance, is bringing together valuable customer information from 4 systems. At Hyatt Hotels & Resorts, it’s about 25 systems. At MetLife, it’s 70 systems.
How many applications and systems would you estimate contain valuable customer information at your company?
Based on our experience working with customers across many industries, we know that a total customer relationship view allows:
- Marketing to boost response rates by better segmenting their database of contacts for personalized marketing offers.
- Sales to more efficiently and effectively cross-sell and up-sell the most relevant offers.
- Customer service teams to resolve customers’ issues immediately, instead of placing them on hold to hunt for information in a separate system.
If your marketing, sales, and customer service teams are struggling with inaccurate, inconsistent, and disconnected customer information, it is costing your company revenue, growth, and success.
Transforming customer data into total customer relationships
Informatica’s Total Customer Relationship Solution fuels business and analytical applications with clean, consistent and connected customer information, giving your marketing, sales, e-commerce and call center teams access to that elusive total customer relationship. It not only brings all the pieces of fragmented customer information together in one place where it’s centrally managed on an ongoing basis, but also:
- Reconciles customer data: Your customer information should be the same across systems, but often isn’t. Assess its accuracy, fixing and completing it as needed—for instance, in my case merging duplicate profiles under “Jakki” and “Jacqueline.”
- Reveals valuable relationships between customers: Map critical connections—Are individuals members of the same household or influencer network? Are two companies part of the same corporate hierarchy? Even link customers to personal shoppers or insurance brokers or to sales people or channel partners.
- Tracks complete customer histories: Identify customers’ preferred locations, channels (such as stores, e-commerce, and catalogs), and channel partners.
- Validates contact information: Ensure email addresses, phone numbers, and physical addresses are complete and accurate so invoices, offers, or messages actually reach customers.
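To make the reconciliation step above concrete, here is a minimal, purely illustrative sketch of merging duplicate customer profiles like the “Jakki”/“Jacqueline” example. Real MDM products use probabilistic matching and survivorship rules; the nickname table, field names, and merge policy below are all assumptions for the sake of the example.

```python
# Toy profile reconciliation: treat two records as the same customer when
# their emails match, or when their names are known aliases of each other.
# All names, fields, and records here are hypothetical.

NICKNAMES = {"jakki": "jacqueline", "bill": "william"}

def canonical_name(name: str) -> str:
    n = name.strip().lower()
    return NICKNAMES.get(n, n)

def same_customer(a: dict, b: dict) -> bool:
    if a.get("email") and a.get("email") == b.get("email"):
        return True
    return (canonical_name(a["first"]) == canonical_name(b["first"])
            and a["last"].lower() == b["last"].lower())

def merge(a: dict, b: dict) -> dict:
    # Survivorship policy: prefer the first non-empty value per field,
    # and take the union of subscription lists.
    keys = (set(a) | set(b)) - {"subscriptions"}
    merged = {k: a.get(k) or b.get(k) for k in keys}
    merged["subscriptions"] = sorted(set(a.get("subscriptions", [])) |
                                     set(b.get("subscriptions", [])))
    return merged
```

A matcher this naive would obviously mis-merge real data; the point is only to show the shape of the problem a managed solution handles for you.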
This is just the beginning. From here, imagine enriching your customer profiles with third-party data. What types of information help you better understand, sell to, and serve your customers? What are your plans for incorporating social media insights into your customer profiles? What could you do with this additional customer information that you can’t do today?
We’ve helped hundreds of companies across numerous industries build a total customer relationship view. Merrill Lynch boosted marketing campaign effectiveness by 30%. Citrix boosted conversion rates by 20%. A $60 billion global manufacturer improved cross-sell and up-sell success by 5%. A hospitality company boosted cross-sell and up-sell success by 60%. And Logitech increased sales across channels, including their online site, retail stores, and distributors.
Informatica’s Total Customer Relationship Solution empowers your people with confidence, knowing that they have access to the kind of great customer data that allows them to surpass customer acquisition and retention goals by providing consistent, integrated, and seamless customer experiences across channels. The end result? Great experiences that customers are inspired to share with their family and friends at dinner parties and on social media.
Do you have a terrible or a great customer experience to share? If so, please share it with us and other readers using the Comment option below.
Informatica’s in Brussels this week for Hadoop Summit. We’re looking forward to spending time with our European customers who are leading the way on repeatably delivering trusted and timely data for big data analytics.
If you’re attending Hadoop Summit Brussels, definitely stop by our session with Belgacom International Carrier Services and our very own Bert Oosterhof to learn how Belgacom is easily driving more predictive analytics and a better customer experience using Informatica and Hadoop.
Europe is clearly becoming a hotbed for increasing use of Hadoop, especially in Telecom, Financial Services, and Public Sector. As organizations look to extend their information architectures with Hadoop, Informatica can help you repeatably deliver trusted and timely data for big data analytics.
Please stop by our booth at Hadoop Summit to learn more!
An increasing number of companies around the world are moving to cloud-first or hybrid architectures for new systems that process their data for new analytics applications. In addition to adding new data sources from SaaS (Software as a Service) applications to their data pipelines, they are hosting some or all of their data storage, processing, and analytics in IaaS (Infrastructure as a Service) public hosted environments to augment on-premise systems. To enable our customers to take advantage of the benefits of IaaS options, Informatica is embracing this computing model.
As announced today, Informatica now fully supports running the traditionally on-premise Informatica PowerCenter, Big Data Edition (BDE), Data Quality, and Data Exchange products on Amazon Web Services (AWS) Elastic Compute Cloud (EC2). This gives customers added flexibility, agility, and faster time to production by enabling a new deployment option for running Informatica software.
Existing and new Informatica customers can now choose to develop and/or deploy data integration, quality and data exchange in AWS EC2 just as they would on on-premise servers. There is no need for any special licensing as Informatica’s standard product licensing now covers deployment on AWS EC2 on the same operating systems as on-premise. BDE on AWS EC2 supports the same versions of Cloudera and Hortonworks Hadoop that are supported on-premise.
Customers can install these Informatica products on AWS EC2 instances just as they would on servers running on on-premise infrastructure. The same award-winning Informatica Global Customer Service that thousands of Informatica customers rely on is on call and standing by to help with success on AWS EC2. Informatica Professional Services is also available to assist customers running these products on AWS EC2, just as it is for on-premise configurations.
Informatica customers can accelerate their time to production or experimentation with the added flexibility of installing Informatica products on AWS EC2 without having to wait for new servers to arrive. There is the flexibility to develop in the cloud and deploy production systems on-premise or develop on-premise and deploy production systems in AWS. Cloud-first companies can keep it all in the cloud by both developing and going into production on AWS EC2.
Customers can also benefit from the lower up-front costs, maintenance costs and pay-as-you-go infrastructure pricing of AWS. Instead of having to pay upfront for servers and managing them in an on-premise data center, customers can use virtual servers in AWS to run Informatica products on. Customers can use existing Informatica licenses or purchase them in the standard way from Informatica for use on top of AWS EC2.
Combined with the ease of use of Informatica Cloud, Informatica now offers customers looking for hybrid and cloud solutions even more options.
Read the press release including supporting quotes from AWS and Informatica customer ProQuest, here.
At the recent Bosch Connected World conference in Berlin, Stefan Bungart, Software Leader Europe at GE, presented a very interesting keynote, “How Data Eats the World,” a title I assume plays on Marc Andreessen’s claim that “software is eating the world.” One of his key points was that the ability to generate actionable insight from big data securely, in real time, at every level from local to global, and at industrial scale will be the key to survival. Companies that do not invest in data now will eventually end up like the consumer companies that missed the Internet: by then, it will be too late.
As software and the value of data become a larger part of the business value chain, the lines between different industries become more vague, or as GE’s Chairman and CEO Jeff Immelt once stated: “If you went to bed last night as an industrial company, you’re going to wake up today as a software and analytics company.” This is true not only for industrial companies, but for many companies that produce “things”: cars, jet engines, boats, trains, lawn mowers, toothbrushes, nutrunners, computers, network equipment, and so on. GE, Bosch, Technicolor, and Cisco are just a few of the industrial companies that offer an Internet of Things (IoT) platform, and by doing so they enter the domains of companies such as Amazon (AWS) and Google. As Google and Apple move into new areas such as manufacturing cars and watches and offering insurance, the industry lines are becoming blurred and service becomes the key differentiator. The best service offerings will be contingent upon the best analytics, and the best analytics require a complete and reliable data platform. Only companies that can leverage data will be able to compete and thrive in the future.
The idea of this “servitization” is that instead of selling assets, companies offer a service that utilizes those assets. For example, Siemens offers hospitals a body-scanning service instead of selling them the MRI scanner, and Philips sells lighting services to cities and large companies, not light bulbs. These business models give suppliers an incentive to minimize disruption and repairs, since these now cost them money. It also becomes more attractive to put as much device functionality as possible in software, so that upgrades or adjustments can be made without replacing physical components. All of this is made possible by the fact that the devices are connected, generate data, and can be monitored and managed remotely. The data is used to analyse functionality, power consumption, and usage, but it can also be utilised to predict malfunctions, plan proactive maintenance, and more.
So what impact does this have on data and on IT? First of all, the volumes are immense. Whereas the total global volume of, for example, Twitter messages is around 150GB, ONE gas turbine with around 200 sensors generates close to 600GB per day! Yet according to IDC, only 3% of potentially useful data is tagged, and less than 1% is currently analysed. Secondly, the structure of the data is not always straightforward, and even similar devices can produce different content (messages) because they may be running different software versions. This has an impact on the backend processing and on the reliability of the analysis of the data.
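A quick back-of-envelope calculation shows what the turbine figures above imply per sensor. The exact byte counts are only rough, since the post quotes round numbers:

```python
# Implied per-sensor data rate from the figures in the post:
# one turbine, ~200 sensors, ~600 GB/day.
GB = 1024**3
sensors = 200
daily_bytes = 600 * GB

per_sensor_per_day = daily_bytes / sensors        # bytes per sensor per day
per_sensor_per_sec = per_sensor_per_day / 86_400  # sustained rate per sensor

print(f"{per_sensor_per_day / GB:.1f} GB/sensor/day")
print(f"{per_sensor_per_sec / 1024:.0f} KiB/s per sensor")
```

That works out to roughly 3 GB per sensor per day, a sustained stream of a few tens of kilobytes per second from every single sensor, around the clock.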
Also, the data often needs to be put into context with other master data, such as locations or customers, for real-time decision making. This is a non-trivial task. Next, governance is an aspect that needs top-level support. Questions like “Who owns the data?”, “Who may see or use the data?”, and “What data needs to be kept or archived, and for how long?” need to be answered and governed in IoT projects with the same priority as the data in more traditional applications.
To summarize, managing data and mastering data governance is becoming one of the most important pillars of companies that lead the digital age. Companies that fail to do so risk becoming the next Blockbuster or Kodak: companies that didn’t adapt quickly enough. To avoid this, companies need to evaluate a data platform that can support a comprehensive data strategy encompassing scalability, quality, governance, security, ease of use, and flexibility, and that enables them to choose the most appropriate data processing infrastructure, whether that is on premise, in the cloud, or, most likely, a hybrid combination of the two.
March 20th 2015 was the official start of spring and to be honest, it couldn’t have come soon enough for us folks in the North East. After a long, cold and snowy winter we’re looking forward to the spring thaw and the first green shoots of burgeoning life. Spring is also the time that we like to tackle new projects and start afresh after our winter hibernation.
For those of us in technology, new spring projects often reflect the things we do in everyday life. Naturally, our minds turn at this time to spring cleaning and spring training. To be honest, we’d have to admit that we haven’t scrubbed our data in three months, so data cleansing is a must; but so too is training. We probably haven’t picked up a book or attended a seminar since last November. But what training should we do? And what should we do next?
Luckily, Informatica is providing the answer. We’ve put together two free, half day training seminars for cloud application owners and Salesforce practitioners. That’s two dates, two fantastic locations and dozens of brilliant speakers lined up to give you some new pointers for what’s coming next in the world of cloud and SaaS.
The goals of the event are to give you the tools and knowledge to strengthen your Salesforce implementation and help you delight your customers. The sessions will include presentations by experts from Salesforce and our partner Bluewolf. There will also be some best practices presentations and demonstrations from Informatica’s team of very talented engineers.
Just glance at the seminar summary and you’ll see what we mean:
Session 1: Understand the Opportunity of Every Customer Interaction
In this session Eric Berridge, Co-founder and CEO of Bluewolf Inc. will discuss how you can develop a customer obsessed culture and get the most value from every customer interaction.
Session 2: Delight Your Customers by Taking Your Salesforce Implementation to the Next Level
Ajay Gandhi, Informatica’s VP of Product Marketing, is next up, and he’s going to provide a fabulous session on what to look out for, and where to invest, as your Salesforce footprint grows.
Session 3: Anticipate Your Business Needs With a Fresh Approach to Customer Analytics
The seminar wraps up with Benjamin Pruden, Sr. Manager Product Marketing, at Salesforce. Ben’s exciting session touches on one of the hottest topics in the industry today. He’s going to explain how you can obtain a comprehensive understanding of your most valuable customers with cloud-analytics and data-driven dashboards.
I’m sure you’ll agree that it’s a pretty impressive seminar and well worth a couple of hours of your time.
The New York event is happening at Convene (810 Seventh Ave, between 52nd and 53rd) on April 7th. Click here for more details and to reserve your seat.
The San Francisco event is a week later on April 14th at Hotel Nikko (222 Mason Street). Make sure you click here and register today.
Come join us on the 7th or the 14th to learn how to take your cloud business to the next level. Oh, and don’t forget that you’ll also be treating yourself to some well-deserved spring training!
The emergence of the business cloud is making the need for data ever more pressing. Whatever your business, if your role is in the sales, marketing, or service departments, chances are your productivity depends a great deal on the ability to move data quickly in and out of Salesforce and its ecosystem of applications.
With its built-in data transformation intelligence, the Data Wizard (click here to try the Beta version) changes the landscape of what traditional data loaders can do. The Data Wizard takes care of the following aspects, so that you don’t have to:
- Data Transformations: We built in over 300 standard data transformations so you don’t have to format the data before bringing it in (e.g., combining first and last names into full names, adding numeric columns for totals, splitting address fields into their separate components).
- Built-in intelligence: We automate the mapping of data into Salesforce for a range of common use cases (e.g., automatically mapping matching fields, intelligently auto-generating date format conversions, concatenating multiple fields).
- App-to-app integration: We incorporated pre-built integration templates that encapsulate the logic required for integrating Salesforce with other applications (e.g., single-click update of customer addresses in a cloud ERP application based on Account addresses in Salesforce).
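For a feel of what transformations like those in the first bullet do, here is a small sketch in plain Python. The function names and the assumed “street, city, state zip” address layout are illustrative only; they are not the Data Wizard’s actual API.

```python
# Hypothetical stand-ins for common loader transformations:
# name concatenation, address splitting, and numeric totals.

def full_name(first: str, last: str) -> str:
    """Combine first and last names into a single full-name field."""
    return f"{first.strip()} {last.strip()}"

def split_address(address: str) -> dict:
    """Split a simple 'street, city, state zip' address into components.
    Assumes exactly that layout; real tools handle many more formats."""
    street, city, state_zip = [p.strip() for p in address.split(",")]
    state, zip_code = state_zip.split()
    return {"street": street, "city": city, "state": state, "zip": zip_code}

def total(columns: list[float]) -> float:
    """Add a derived total column from several numeric columns."""
    return sum(columns)
```

A product that ships 300 of these, pre-tested and mapped automatically, is what saves users from hand-formatting spreadsheets before every load.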
Unlike the other data loading apps out there, the Data Wizard doesn’t presuppose any technical ability on the part of the user. It was purpose-built to solve the needs of every type of user, from the Salesforce administrator to the business analyst.
Despite the simplicity the Data Wizard offers, it is built on the robust Informatica Cloud integration platform, providing the same reliability and performance that is key to the success of Informatica Cloud’s enterprise customers, who integrate over 5 billion rows of data per day. We invite you to try the Data Wizard for free, and contribute to the Beta process by providing us with your feedback.
Security professionals are in dire need of a solution that provides visibility into where sensitive and confidential data resides, as well as visibility into the data’s risk. This knowledge would allow those responsible to take an effective, proactive approach to combating cybercrime. By focusing on the data, Informatica and our customers, partners and market ecosystem are collaborating to make data-centric security with Data Security Intelligence the next line of defense.
Security technologies that focus on securing the network and perimeter require additional safeguards when sensitive and confidential data traverse beyond these protective controls. Data proliferates to cloud-based applications and mobile devices. Application security and identity access management tools may lack visibility and granular control when data is replicated to Big Data and advanced analytics platforms.
Informatica is filling this need with its data-centric security portfolio, which now includes Secure@Source. Informatica Secure@Source is the industry’s first data security intelligence solution that delivers insight into where sensitive and confidential data reside, as well as the data’s risk profile.
Join us at our online launch event on April 8th where we will showcase Secure@Source and share reactions from an amazing panel including:
- Security Industry leader Anil Chakravarthy, CPO and EVP Informatica and myself, Amit Walia, GM and SVP Informatica
- Luminaries Larry Ponemon, Founder Ponemon Institute and Jeff Northrop, CTO IAPP
- CISOs Bill Burns of Informatica and Arnold Federbaum, former CISO and cybersecurity professor, NYU
- Enterprise Security Architect, Linda Hewlett, Santander Holdings USA.
The opportunity for Data Security Intelligence is extensive. In a recently published report, Neuralytix defined Data-Centric Security as “an approach to security that focuses on the data itself; to cover the gaps of traditional network, host and application security solutions.” A critical element for successful data security is collecting intelligence required to prioritize where to focus security controls and efforts that mitigate risk. This is precisely what Informatica Secure@Source was designed to achieve.
Having emerged from a predominantly manual practice, the data security intelligence software market is expected to reach $800M by 2018, growing at a CAGR of 27.8%. We are excited about this opportunity! As a leader in data management software, we are uniquely qualified to take an active role in shaping this emerging market category.
Informatica Secure@Source addresses the need to get smarter about where our sensitive and private data reside, who is accessing them, which controls to prioritize, and how to work harmoniously with existing security architectures, policies, and procedures. Our customers are asking us for data security intelligence, and the industry deserves it. With more than 60% of security professionals stating that their biggest challenge is not knowing where their sensitive and confidential data reside, the need for Data Security Intelligence has never been greater.
Neuralytix says “data security is about protecting individual data objects that traverse across networks, in and out of a public or private cloud, from source applications to targets such as partner systems, to back office SaaS applications to data warehouses and analytics platforms”. We couldn’t agree more. We believe that the best way to incorporate a data-centric security approach is to begin with data security intelligence.
JOIN US at the online launch event on April 8th for the security industry’s most exciting new Data Security Intelligence solution, Informatica Secure@Source.
 “The State of Data Centric Security,” Ponemon Institute, sponsored by Informatica, June 2014
In case you haven’t noticed, data integration is all the rage right now. Why? There are three major reasons for this trend that we’ll explore below, but a recent USA Today story focused on corporate data as a much more valuable asset than it was just a few years ago. Moreover, the sheer volume of data is exploding.
For instance, in a report published by research company IDC, analysts estimated that the total amount of data created or replicated worldwide in 2012 would add up to 2.8 zettabytes (ZB). By 2020, IDC expects the annual data-creation total to reach 40 ZB, which would amount to a 50-fold increase from where things stood at the start of 2010.
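It is worth pausing on what those IDC figures imply as an annual growth rate between the two quoted data points:

```python
# Implied compound annual growth rate (CAGR) from IDC's figures:
# 2.8 ZB created/replicated in 2012, 40 ZB projected for 2020.
start_zb, end_zb = 2.8, 40.0
years = 2020 - 2012

cagr = (end_zb / start_zb) ** (1 / years) - 1
print(f"{cagr:.1%} per year")
```

That comes to roughly 39% compound growth per year, which is why data integration capacity planning cannot simply extrapolate linearly.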
But the growth of data is only a part of the story. Indeed, I see three things happening that drive interest in data integration.
First, the growth of cloud computing. The growth of data integration around the growth of cloud computing is logical, considering that we’re relocating data to public clouds, and that data must be synced with systems that remain on-premise.
The data integration providers, such as Informatica, have stepped up. They provide data integration technology that can span enterprises, managed service providers, and clouds while dealing with the special needs of cloud-based systems. At the same time, data integration improves the way we handle data governance and data quality.
Second, the growth of big data. A recent IDC forecast shows that the big data technology and services market will grow at a 26.4% compound annual growth rate to $41.5 billion through 2018, or, about six times the growth rate of the overall information technology market. Additionally, by 2020, IDC believes that line of business buyers will help drive analytics beyond its historical sweet spot of relational to the double-digit growth rates of real-time intelligence and exploration/discovery of the unstructured worlds.
The world of big data revolves around data integration. The more that enterprises rely on big data, and the more that data needs to move from place to place, the more a core data integration strategy and technology is needed. That means you can’t talk about big data without talking about big data integration.
Data integration technology providers have responded with technology that keeps up with the volume of data that moves from place to place. As linked to the growth of cloud computing above, providers also create technology with the understanding that data now moves within enterprises, between enterprises and clouds, and even from cloud to cloud. Finally, data integration providers know how to deal with both structured and unstructured data these days.
Third, better understanding around the value of information. Enterprise managers always knew their data was valuable, but perhaps they did not understand the true value that it can bring.
With the growth of big data, we now have access to information that helps us drive our business in the right directions. Predictive analytics, for instance, allows us to take years of historical data and determine patterns that allow us to predict the future. Mashing up our business data with external data sources makes our data even more valuable.
Of course, data integration drives much of this growth. Thus the renewed focus on data integration approaches and technology. There are years and years of evolution still ahead of us, and much to be learned from the data we maintain.
Last fall, at a large industry conference, I had the opportunity to conduct a series of discussions with industry leaders in a portable video studio set up in the middle of the conference floor. As part of our exercise, we had a visual artist do freeform storyboarding of the discussion on large swaths of five-foot by five-foot paper, which we then reviewed at the end of the session. For example, in a discussion of cloud computing, the artist drew a rendering of clouds, raining data on a landscape below, illustrated by sketches of office buildings. At a glance, one could get a good read of where the discussion went, and the points that were being made.
Data visualization is one of those up-and-coming areas that has just begun to break out of the technology zone. There are some powerful front-end tools that help users see, at a glance, trends and outliers through graphical representations – be they scattergrams, histograms, 3D diagrams, or something else eye-catching. The “infographic” that has become so popular in recent years is an amalgamation of data visualization and storytelling. The bottom line is that technology is making it possible to generate these representations almost instantly, enabling relatively quick understanding of what the data may be saying.
The power that data visualization is bringing organizations was recently explored by Benedict Carey in The New York Times, who discussed how data visualization is emerging as the natural solution to “big data overload.”
This is much more than a front-end technology fix, however. Rather, Carey cites a growing body of knowledge emphasizing the development of “perceptual learning,” in which people working with large data sets learn to “see” patterns and interesting variations in the information they are exploring. It’s almost a return of the “gut” feel for answers, but developed for the big data era.
As Carey explains it:
“Scientists working in a little-known branch of psychology called perceptual learning have shown that it is possible to fast-forward a person’s gut instincts both in physical fields, like flying an airplane, and more academic ones, like deciphering advanced chemical notation. The idea is to train specific visual skills, usually with computer-game-like modules that require split-second decisions. Over time, a person develops a ‘good eye’ for the material, and with it an ability to extract meaningful patterns instantaneously.”
Video games may be leading the way in this – Carey cites the work of Dr. Philip Kellman, who developed a video-game-like approach to training pilots to instantly “read” instrument panels as a whole, versus pondering every gauge and dial. He reportedly was able to enable pilots to absorb within one hour what normally took 1,000 hours of training. Such perceptual-learning based training is now employed in medical schools to help prospective doctors become familiar with complicated procedures.
There are interesting applications for business, bringing together a range of talent to help decision-makers better understand the information they are looking at. In Carey’s article, an artist was brought into a medical research center to help scientists look at data in many different ways – to get out of their comfort zones. For businesses, it means getting away from staring at bars and graphs on their screens and perhaps turning data upside down or inside-out to get a different picture.