Julie Lockner

Julie Lockner is a recognized industry expert in the structured data management market. Julie joins Informatica from the industry analyst firm ESG, where she served in a dual role as senior analyst and vice president, covering data management solutions and managing the end-user consulting practice. Prior to joining ESG, Julie was President and Founder of CentricInfo, a consulting firm acquired by ESG that specialized in implementing data governance and data center optimization programs. Julie was a VP at Solix Technologies and a Senior Portfolio Architect at EMC, where she defined product and sales strategies in the data management software market. She also held various engineering, product management and marketing positions with companies such as Oracle, Cognex and Raytheon. She graduated with a BSEE from WPI.

Sports Analytics For Players, Owners and Fans


In the 2011 film Moneyball, Billy Beane showed the sports industry how to use data analytics to acquire statistically optimal players for the Oakland A’s. In the four years since, advances in data collection, preparation, aggregation and advanced analytics technology have made it possible to broaden the scope of analytics beyond the game and the player, drastically changing the shape of an industry with a long history built on tradition.

Last week, MIT Sloan held its 9th annual Sports Analytics Conference in Boston, MA. Amid the six-foot snow banks, sports fanatics and data scientists came together at this sold-out event to discuss the increasing role of analytics in the sports industry. This year’s agenda covered topics ranging from game statistics and modeling, player contract and salary negotiations, dynamic ticket pricing, and referee calls to improving fan experiences.

This last topic, improving fan experiences, is one that has seen a boost from technology innovation that makes data more readily available for analytics. For example, newer NFL stadiums have Wi-Fi throughout, so fans can watch replays on their devices, tweet, and share selfies during the game. With mobile devices connected to the stadium’s Wi-Fi, franchises can run revenue-generating marketing campaigns to their home fan base throughout the game.

More important, however, is the need to keep the Millennial generation interested in watching games live. According to an article posted by TechRepublic, college students are more likely to leave a game at halftime if they cannot connect to the internet or use social media. Teams need to keep fans in the stadiums, so the goal must be to ensure the fan experience at a live venue matches what they can get at home.

Innovation in advanced analytics and Big Data platforms such as Hadoop gives sports analysts access to significant volumes of detailed data, resulting in greater modeling accuracy. Streamlined data preparation tools speed the process from raw data to insight. Advanced analytics delivered in the cloud as a service gives team owners and managers access to predictive analytics tools without having to manage and staff large data centers. Better visualization applications provide an effective way to communicate what the data means to those without a math degree.

Applying these innovations to new data sources, combined with the advancements in sports analytics, will produce results that are game changing, far beyond what Billy Beane accomplished with the Oakland A’s.

Our congratulations to the winners of the top research papers submitted at the MIT Sloan Sports Analytics Conference: Who Is Responsible For A Called Strike? and Counterpoints: Advanced Defensive Metrics for NBA Basketball. It will be interesting to see how these models make an impact, with Spring Training and March Madness just around the corner. Maybe next year we will see a submission on the dependence of football pressure on atmospheric conditions and its impact on the NFL playoffs (PV = nRT), and get a data-driven explanation of Deflategate.
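For the curious, the back-of-the-envelope physics is straightforward. Below is a minimal sketch, using assumed (not measured) temperatures and inflation pressure, of how much a football’s gauge pressure drops when it moves from a warm locker room onto a cold field, applying the ideal gas law at constant volume:

```python
# Gay-Lussac's law (PV = nRT with V and n held fixed): P1/T1 = P2/T2 in
# absolute pressure and temperature. Illustrative numbers only, not the
# actual NFL measurements.

ATMOSPHERE_PSI = 14.7  # approximate atmospheric pressure at sea level


def fahrenheit_to_kelvin(temp_f: float) -> float:
    return (temp_f - 32.0) * 5.0 / 9.0 + 273.15


def gauge_pressure_after_cooling(gauge_psi: float, warm_f: float, cold_f: float) -> float:
    """Cool a sealed ball from warm_f to cold_f and return the new gauge pressure."""
    absolute_psi = gauge_psi + ATMOSPHERE_PSI
    ratio = fahrenheit_to_kelvin(cold_f) / fahrenheit_to_kelvin(warm_f)
    return absolute_psi * ratio - ATMOSPHERE_PSI


# A ball inflated to 12.5 psi indoors at 70F and measured outdoors at 48F
# ends up around 11.4 psi, with no tampering required.
print(round(gauge_pressure_after_cooling(12.5, 70.0, 48.0), 2))
```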


Can You Find True Love Using A Wearable Device?


How do you know if you have found ‘true love’?

Biologists and psychologists tell us that when we are struck by cupid’s arrow, our body is reacting to a set of chemicals that are released in the brain that evoke emotions and feelings of lust, attraction and attachment.[1]  When those chemicals are released, our bodies respond.  Our hearts race, blood pumps through our veins, faces flush, body temperatures rise. Some say it feels like electricity is conducting all over the skin. It releases a flood of emotions that may cloud our judgment and may even cause us to make a choice considered unreasonable to others.  Sound familiar?

But what causes our brains to react to one person and not another?  Are we predisposed to how certain people look or smell?  Do our genes play a role in determining an affinity toward a body type or shape?

Pheromone research has shown how sensors in our nose can detect whether someone’s immune system complements our own based on the scent of urine and sweat. Meaning, if someone has a similar immune profile, that individual won’t smell good to us. We are more likely to prefer the smell of someone whose immune system is different from ours. Is our genetic code programming our instincts to preselect who we should mate with so our offspring have a higher chance of surviving?

It is probably not surprising that most men are attracted to women with symmetrical faces and hourglass figures. Genetic research hints that men’s predispositions are also based on a genetic code. There is a correlation between asymmetric facial characteristics and genetic disorders, as well as between waist-to-hip ratios and fertility. Depending on your stage in life, these characteristics could carry a weighting factor in how your brain responds to the smell of the perfect pheromone and to how someone appears. And some argue it is all influenced by body language, voice tone and the actual words used in dialogue.[2]

Psychologists report it takes only two to four minutes to decide if you are falling in love with someone. Even if you dismiss some or accept all of the possibilities I am presenting, experiencing love is shaped by a variety and intensity of senses, interpretations and emotions combined in a short period of time. If you are a data nerd like me, the variety, volume and velocity of ‘signals’ begin to sound like a Big Data marketing pitch. This really is an application of predictive analytics using different data types, large volumes of data and real-time decision-making algorithms. But I’m actually more interested in how affective computing, wearable devices and analytics could help determine whether what you feel is actually ‘true love’ or just a bad case of indigestion.

Affective computing, according to researcher Rosalind Picard,[3] gives a computer the ability to recognize and express emotions, develop that ability and enable it to regulate and utilize emotions. When applied to wearable devices that can listen to how you talk, measure blood pressure, detect changes in heart and respiration rate and even measure electro-dermal responses, is it possible that technology could sense when your body is responding to the chemicals of love?

What about mood rings, you may ask? Mood rings, the original form of affective wearable device that grew popular in the 1970s, changed color based on your mood. Unfortunately, mood rings respond only to body temperature, and research[4] has shown that physiological patterns cannot be determined from body temperature alone. To truly differentiate an emotion such as ‘true love,’ you need to collect multiple physiological signals and detect a pattern using multivariate pattern recognition algorithms. And if you only have two to four minutes, it pretty much needs to calculate the chances of ‘true love’ in real time to prevent a life-altering mistake.
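To make the multivariate pattern recognition point concrete, here is a minimal sketch in Python with scikit-learn. The physiological features, readings and labels are entirely made up for illustration; a real affective computing model would be trained on carefully labeled sensor recordings:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Each row is one short window of readings:
# [heart_rate_bpm, skin_conductance_uS, respiration_rate, skin_temp_C]
X_train = np.array([
    [72.0, 2.1, 14.0, 33.1],   # calm
    [68.0, 1.8, 12.0, 32.9],   # calm
    [75.0, 2.4, 15.0, 33.0],   # calm
    [105.0, 6.4, 22.0, 34.0],  # strong multi-signal response
    [112.0, 7.1, 24.0, 34.3],  # strong multi-signal response
    [98.0, 5.9, 20.0, 33.8],   # strong multi-signal response
])
y_train = np.array([0, 0, 0, 1, 1, 1])

# Scale the signals, then fit a simple classifier over all four signals at once.
model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X_train, y_train)

# Score a fresh two-to-four-minute window from the wearable in "real time".
new_window = np.array([[101.0, 6.0, 21.0, 34.1]])
print(model.predict_proba(new_window)[0, 1])  # probability of a strong response
```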

The evolution of wearables technology has reached medical grade, allowing parents to detect when their children are about to have an epileptic seizure or are experiencing acute levels of stress. When tuned to love-seekers’ cues, is it possible that this same technology could send an audio or visual signal to your smartphone alerting you as to whether this person is a ‘true love’ candidate? Or glow red if you are in the proximity of someone who is experiencing similar physiological changes? Maybe this is the next application for match-making companies such as eHarmony or Match.com?

The reality is this: assuming the data is clean and accurate, free of data privacy violations and truly connected to your physiological signals, wearable technology that could detect the close proximity of ‘true love’ is probably five years out. It is more likely to show up in a popular science fiction film than at an Apple store in the near term. But when it does, imagine the signal on your smartphone telling you the proximity of a potential candidate and where the nearest flower shop is, integrated with facial recognition, Facebook photos and relationship status (assuming it is true), plus an iTunes recommendation of ‘Love Is In The Air’ by John Paul Young. ‘True love’ is only two to four minutes away.

[1] http://www.bbc.co.uk/science/hottopics/love/

[2] http://www.youramazingbrain.org/lovesex/sciencelove.htm

[3] R. Picard. Affective Computing. Pages 227-239, MIT Press, 2000

[4] Cacioppo and Tassinary (1990)


Data Proliferation Exposes Higher Risk of a Data Breach

Data proliferation has traditionally been measured by the number of copies of data residing on different media. For example, if data residing on an enterprise storage device was backed up to tape, proliferation was measured by the number of tapes on which the same piece of data resided. Now that backups are no longer restricted to the data center and data is no longer constrained by the originating application, this definition is due for an update.

Data proliferation should instead be measured by the number of users who have access to or can view the data, and that proliferation is a primary factor in measuring the risk of a data breach. My argument here is that as sensitive, confidential or private data proliferates beyond the original copy, its surface area increases and its risk of a data breach increases proportionally.

Using the original definition of data proliferation and the example of data storage shown below, data proliferation would include production, production copies used for disaster recovery purposes and all physical backup copies. But as you can see, data is also copied to test environments for development purposes. When you factor in the number of privileged users with access to those copies, you get a different view of proliferation and potential risk.

Data proliferation of a production sensitive or private data element.

In the example, there are potentially thousands of copies of sensitive data but only a small number of users who are authorized to access the data.

In the case of test and development, this image highlights an area of potentially high risk because the number of users who could see the sensitive data is high.

Similarly, in online advertising, the measure of how many people see an online ad is called an impression. If an ad was seen by 100 online users, it would have 100 impressions.

Data proliferation measured as a function of the number of users who have access to or can view sensitive or private data.

When you apply that same principle to data security, you could say that data proliferation is the number of copies of a data element multiplied by the potential number of users who could physically view each copy – in other words, ‘impressions’. In this second image, rather than considering the total number of copies, what if we measured risk based on the total number of impressions?

In this case, the measure of risk is independent of the physical media the data reside on. You could take this a few steps further and add a factor based on the security controls in place to prevent unauthorized access.
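As a rough illustration, here is a small sketch that scores proliferation as copies multiplied by potential viewers (impressions), with an assumed weighting for the security controls protecting each copy. The environments, counts and weights are hypothetical:

```python
# (name, copies, users_who_can_view, control_factor: 1.0 = no controls, 0.1 = strong)
environments = [
    ("production",        1,  25, 0.2),
    ("disaster recovery", 2,  10, 0.2),
    ("backup tapes",     30,   5, 0.5),
    ("test & dev",        8, 400, 1.0),  # unmasked copies, many privileged users
]


def proliferation_risk(envs):
    """Sum of copies x impressions, discounted by the controls on each copy."""
    return sum(copies * users * control for _, copies, users, control in envs)


for name, copies, users, control in environments:
    print(f"{name:18s} weighted impressions ~ {copies * users * control:,.0f}")
print("total risk score:", f"{proliferation_risk(environments):,.0f}")
```

Even with far fewer physical copies, the test and development environment dominates the score because of the number of users who can view the data, which is exactly the shift in perspective described above.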

This is similar to how the Secure@Source team in Informatica’s newly formed Data Security Group calculates risk, which I believe could truly be a game changer in the data security industry.


Responsible Data Breach Reporting

This week, another reputable organization, Anthem Inc., reported it was ‘the target of a very sophisticated external cyber attack’. But rather than be upset at Anthem, I respect their responsible data breach reporting.

In this post, Joseph R. Swedish, President and CEO of Anthem, Inc., does something that I believe all CEOs should do in this situation. He is straight up about what happened, what information was breached, the actions they took to plug the security hole, and the services available to those impacted.

When it comes to a data breach, the worst thing you can do is ignore it or hope it will go away. This was not the case with Anthem. Mr. Swedish did the right thing and I appreciate it.

You only have one corporate reputation – and it is typically aligned with the CEO’s reputation.  When the CEO talks about the details of a data breach and empathizes with those impacted, he establishes a dialogue based on transparency and accountability.

Research tells us that 44% of healthcare and pharmaceutical organizations experienced a breach in 2014. And we know that personal information combined with health information is worth more on the black market because the data can be used for insurance fraud. I expect more healthcare providers will be on the defensive this year and only hope that they follow Mr. Swedish’s example when facing the music.


Half of Healthcare Execs Have Analytics Initiatives


In a recent report sponsored by GE Healthcare and posted on ModernHealthcare.com, only 49% of healthcare executives say they have an established strategy for Big Data or a specific data and analytics initiative. With the move to electronic medical and health records, I would have thought this number would be higher, given the access to a gold mine of data. I was also disappointed that, of those who do have an initiative, 42% said they were unsure whether the organization has benefited from applying analytics so far!

I understand that fighting for budget and time to implement analytics is a challenge with all the changes happening in healthcare (ICD-10, M&A, etc.). But hospitals using analytics to drive value-based care are leading healthcare reform and setting a higher bar for quality of service. Value-based care promises quicker recoveries, fewer readmissions, lower infection rates, and fewer medical errors – something we all want as consumers.

In order to truly achieve value-based care, analytics is a must have. If you are looking for the business case or inspiration for the business driver, here are a few ideas:

  • In surgery, do you have the data to show how many patients had lower complication rates and higher long-term survival rates? Do you have that data across the different surgical procedures you offer?
  • Do you have data to benchmark your practice quality? How do you compare to other practices in terms of infection rates? Can you use that data to promote your services from a marketing perspective?
  • Do you know how much a readmission is costing your hospital?
  • From a finance perspective, have you adopted best practices from other industries with respect to supply-chain management or cost optimization strategies?

If you don’t have the expertise, there are plenty of consulting organizations that specialize in implementing analytics to provide the insight needed to make the transition to value-based care and pricing.

We are always going to face limited budgets, the day will always have 24 hours in it, and organizations are constantly changing as new leaders take over with different agendas. But one thing is certain: a decision without data is just someone’s opinion. In healthcare, with only half of executives making decisions based on analytics, maybe we should all be asking for a second opinion – one based on data.


Saving Lives Using Wearable Device Data


The consumer electronics industry held CES 2015 in Las Vegas last week, showcasing new innovations in wearable devices – one area of particular interest to me. Wired’s top picks called out the Withings Activité Pop as ‘handsome and colorful’ and only $150. It makes the geek in me fashionable – something my children could use some help with. Sold!

However, there is another wearable that has my attention – a wearable designed to save children’s lives: Embrace. Embrace is the first medical-quality wearable to help measure stress, epileptic seizures, activity and sleep. The idea is that it can be used to detect early signs of an event and alert you when an unusual event is about to happen. If you have a toddler or infant, the wearable could alert parents in the middle of the night. As a mother of four children, I know that peace of mind in the night is king.

Imagine the possibilities.

Biometric data collected from devices like these, when used in the classroom, could serve as a predictor for children with autism, Asperger syndrome, or mood disorders, helping clinicians, educators and parents better understand when a child is starting to become dysregulated. Integrating that data with therapeutic and educational strategies could potentially provide insight into a practice that is largely trial and error.

I pledged my support for Embrace in hopes that innovation in this field will continue to prosper, saving lives, and ultimately making a difference in the world.


The CISO Challenge: Articulating Data Worth and Security Economics

A few years ago, eBay’s former CISO, Dave Cullinane, led a sobering coaching discussion on how to articulate and communicate the value of a security solution and its economics to a CISO’s CxO peers.

Why would I blog about such old news? Because it was a great and timeless idea. And in this age of the ‘Great Data Breach’, where CISOs need all the help they can get, I thought I would share it with y’all.

Dave began by describing how to communicate the impact of an attack from threats such as Aurora, spear phishing, Stuxnet, hacktivism, and so on, versus the investment required to prevent the attack. If you are an online retailer and your web server goes down because of a major denial-of-service attack, what does that cost the business? How much revenue is lost every minute the site is offline? Enough to put you out of business? The figure below illustrates how to approach this conversation.

If the impact of a breach and the risk of losing business are high and the investment in implementing a solution is relatively low, the investment decision is an obvious one (represented by the yellow area in the upper left corner).

CISO Challenge

However, it isn’t always this easy, is it? When determining what your company’s brand and reputation are worth, how do you develop a compelling case?

Another dimension Dave described is communicating the economics of a solution that could prevent an attack based on the probability that the attack would occur (see next figure below).

CISO Challenge

For example, consider an attack that could influence stock prices. This is a complex scenario that is probably less likely to occur on a frequent basis and would require a sophisticated, multidimensional solution with integrated security analytics to correlate multiple events back to a single source. This might place the discussion in the middle blue box, or the ‘negotiation zone’. This is where the CISO needs to know the CxOs’ risk tolerances and articulate value in terms of the ‘coin of the realm’.
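One hedged way to frame that negotiation is the classic expected-loss arithmetic: likelihood times impact, compared against the cost of the control. The figures below are invented purely to show the shape of the conversation:

```python
def annual_expected_loss(incidents_per_year: float, loss_per_incident: float) -> float:
    """Annualized loss expectancy: how often it happens times what it costs."""
    return incidents_per_year * loss_per_incident


# Frequent, well-understood attack: revenue lost while the site is down.
ddos_loss = annual_expected_loss(incidents_per_year=2.0, loss_per_incident=300_000.0)

# Rare, sophisticated attack aimed at moving the stock price.
stock_manipulation_loss = annual_expected_loss(incidents_per_year=0.05,
                                               loss_per_incident=20_000_000.0)

for label, loss, control_cost in [
    ("DDoS mitigation", ddos_loss, 100_000.0),
    ("integrated security analytics", stock_manipulation_loss, 750_000.0),
]:
    verdict = "obvious investment" if loss > 5 * control_cost else "negotiation zone"
    print(f"{label}: expected loss {loss:,.0f} vs cost {control_cost:,.0f} -> {verdict}")
```

The second case lands in the negotiation zone, which is where knowing the CxOs’ actual risk tolerance matters far more than the arithmetic.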

Finally, stay on top of what the business is cooking up in new initiatives that could expose or introduce new risks. For example, is marketing looking to spin up a data warehouse on Amazon Redshift? Is anyone on the analytics team tinkering with Hadoop in the cloud? Is development planning to outsource application test and development activities to offshore systems integrators? If you are participating in any of these activities, make sure your CISO isn’t the last to know when a ‘breach happens’!

To learn more about ways you can mitigate risk and maintain data privacy compliance, check out the latest Gartner Data Masking Magic Quadrant.


IDC Life Sciences and Ponemon Research Highlights Need for New Security Measures

The Healthcare and Life Sciences industry has demonstrated its ability to take advantage of data to fuel research, explore new ways to cure life-threatening diseases, and save lives. With the adoption of technology innovation, especially in the mobile segment, this industry will need to find a balance between investment and risk.

ModernMedicine.com published an article in May 2014 describing how analysts worry that a wide-scale security breach could occur in the healthcare and pharmaceuticals industry this year. The piece calls out that this industry category ranked lowest in an S&P 500 cyber health study because of its high volume of incidents and slow response rates.

In the Ponemon Institute’s research, The State of Data Centric Security, respondents from Healthcare and Life Sciences said the data they considered most at risk was customer, consumer and patient record data. Intellectual property, business intelligence and classified data ranked a close second.


In an Informatica webinar with Alan Louie, research analyst at IDC Health Insights (@IDCPharmaGuru), we discussed his research on ‘Changing Times in the Life Sciences – Enabled and Empowered by Tech Innovation’. The megatrends of cloud, mobile, social networks and Big Data analytics are all moving in a positive direction, each in various phases of adoption. Mobile technologies top the list of IT priorities – likely because of the productivity gains that mobile devices and applications can deliver – with security and risk management technologies the second-highest priority.

When the Ponemon survey asked security professionals in Life Sciences ‘What keeps you up at night?’, the top answer was ‘migrating to new mobile platforms’. I call this factoid out because all other industry categories ranked ‘not knowing where sensitive data resides’ as their biggest concern. Why is Life Sciences different from other industries?


One reason could be that the intense scrutiny over intellectual property protection and HIPAA compliance has already shone a light on where sensitive data reside. Mobile makes it difficult to track and contain a potential breach, given that cell phones are the number one item left behind in taxi cabs.

With the threat of a major breach on the horizon, and the push to leverage technologies such as mobile and cloud, it is evident that investments in security and risk management need to focus on the data itself – rather than being tied to a specific technology or platform.

Enter data-centric security. The call to action is to consider a new approach to the information security paradigm, one that emphasizes the security of the data itself rather than the security of networks or applications. Informatica recently published an eBook, ‘Data-Centric Security: New Imperatives for a New Age of Data’. Download it, read it. In an industry with so much at stake, we highlight the need for new security measures such as these. Do you agree?

I encourage your comments and open the dialogue!


Take Action – Reduce Your Risk of Identity Theft This Holiday Season


What is our personal information worth? 

With the 2014 holiday season rolling into full swing, Americans will spend more than $600 billion, a 4.1% increase from last year. According to a Credit Union National Association poll, 45% of credit and debit card users will think twice about how they shop and pay, given the tens of millions of shoppers impacted by breaches. Stealing identities is a lucrative pastime for those with ulterior motives. The black market pays between $10 and $12 per stolen record; when the record is enriched with health data, the value is as high as $50 because it can be used for insurance fraud.

Are the thieves getting smarter or are we getting sloppy?  

With ubiquitous access to technology globally, the general acceptance of online shopping, and the digitization of health records, there is more data online – and more opportunity to steal it – than ever before. Unfortunately for shoppers, 2013 was known as ‘the year of the retailer breach’, according to Verizon’s 2014 data breach report. Unfortunately for patients, healthcare providers were noted for losing the highest percentage of protected healthcare data.

So what can we do to be a smarter and safer consumer?

No one wants to bankroll the thieves’ illegal habits. One option would be to regress 20 years, drive to the mall and make our purchases cash in hand, or go back to completely paper-based healthcare. Alternatively, here are a few suggestions to avoid being on the next list of victims:

1. Avoid irresponsible vendors and providers by being an educated consumer

Sites like the Identity Theft Resource Center and the US Department of Health and Human Services expose the latest breaches in retail and healthcare, respectively. Look up who you are buying from and receiving care from, and make sure they are doing everything they can to protect your data. If they didn’t respond to a breach in a timely fashion, tried to hide it, or didn’t implement new controls to protect your data, avoid them. Or take your chances.

2. Expect to be hacked, plan for it

Most organizations you trust with your personal information have already experienced a breach. In fact, according to a recent survey conducted by the Ponemon Institute and sponsored by Informatica, 72% of organizations polled experienced a breach within the past 12 months; more than 20% had two or more breaches in the same timeframe. When setting passwords, avoid using words or phrases that you share publicly on Facebook. When answering security questions, most security professionals suggest that you lie!

3. If it really bothers you, be vocal and engage

Many states are enacting legislation to make organizations accountable for notifying individuals when a breach occurs. For example, Florida enacted FIPA – the Florida Information Protection Act – on July 1, 2014, stipulating that all breaches, large or small, are subject to notification. For every day that a breach goes undocumented, FIPA imposes a $1,000-per-day penalty, up to an annual limit of $500,000.
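As a quick sanity check on that math, here is the penalty schedule as described above (illustrative only; consult the statute itself for the actual terms):

```python
def fipa_penalty(days_undocumented: int) -> int:
    """$1,000 per undocumented day, capped at the $500,000 annual limit."""
    return min(days_undocumented * 1_000, 500_000)


print(fipa_penalty(90))   # 90,000
print(fipa_penalty(700))  # hits the 500,000 cap
```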

In conclusion, as the holiday shopping season approaches, now is the perfect time for you to ensure that you’re making the best – and most informed – purchasing decisions. You have the ability to take matters into your own hands; keep your data secure this year and every year.

To learn more about Informatica Data Security products, visit our Data Privacy solutions website.


Informatica Named a Leader in Gartner MQ for Data Archiving


For the first time, Gartner has released a Magic Quadrant for Structured Data Archiving and Application Retirement. This MQ describes an approach for how customers can proactively manage data growth. We are happy to report that Gartner has positioned Informatica as a Leader in the space, based on our ‘completeness of vision’ and our ‘ability to execute.’

This Magic Quadrant focuses on what Gartner calls structured data archiving. Data archiving is used to index, migrate, preserve and protect application data in secondary databases or flat files, typically located on lower-cost storage, for policy-based retention. Data archiving keeps data available in the context of the originating business process or application, which is especially useful in the event of litigation or an audit.

The Magic Quadrant calls out two use cases: “live archiving of production applications” and “application retirement of legacy systems.” Informatica refers to both use cases, together, as “enterprise data archiving,” which we consider a foundational component of a comprehensive Information Lifecycle Management strategy.

The application landscape is constantly evolving. For this reason, data archiving is a strategic component of a data growth management strategy. Application owners need a plan to manage data as applications are upgraded, replaced, consolidated, moved to the cloud and/or retired. 

When you don’t have a plan for production data, data accumulates in the business application; performance bothers the business and data bloat bothers IT operations. When you don’t have a plan for legacy systems, applications accumulate in the data center, and the increasing budgets bother the CFO.

A data growth management plan must include the following:

  • How to cycle through applications and retire them
  • How to smartly store the application data
  • How to ultimately dispose of data while staying compliant

Structured data archiving and application retirement technologies help automate and streamline these tasks.
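As a generic illustration (not Informatica’s product API), here is a small sketch of those tasks expressed as policy rules: deciding where a record should live based on its age, and whether it may be disposed of without violating retention requirements or a legal hold. The two-year active window and seven-year retention period are assumptions, since real retention schedules vary by regulation:

```python
RETENTION_YEARS = 7   # assumed retention policy
ACTIVE_YEARS = 2      # assumed window for keeping data in production


def archive_tier(record_age_years: float) -> str:
    """Decide where a record should live based on its age."""
    if record_age_years < ACTIVE_YEARS:
        return "production database"
    if record_age_years < RETENTION_YEARS:
        return "secondary archive (lower-cost storage, still accessible in context)"
    return "eligible for disposal"


def can_dispose(record_age_years: float, legal_hold: bool) -> bool:
    """Dispose only past retention, and never while a legal hold applies."""
    return record_age_years >= RETENTION_YEARS and not legal_hold


print(archive_tier(1))                    # production database
print(archive_tier(4))                    # secondary archive ...
print(can_dispose(9, legal_hold=True))    # False: litigation hold overrides the schedule
print(can_dispose(9, legal_hold=False))   # True
```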

Informatica Data Archive delivers unparalleled connectivity, scalability, a broad range of innovative options (such as Smart Partitioning, Live Archiving, and retiring aging and legacy data to the Informatica Data Vault), and comprehensive retention management, data reporting and visualization. We believe our strengths in this space are the key ingredients for deploying a successful enterprise data archive.

For more information, read the Gartner Magic Quadrant for Structured Data Archiving and Application Retirement.
