Tag Archives: electronic medical records
A growing number of Data Scientists believe so.
If you recall the cholera outbreak in Haiti after the tragic 2010 earthquake: a joint research team from the Karolinska Institute in Sweden and Columbia University in the US analyzed calling data from two million mobile phones on the Digicel Haiti network. This enabled the United Nations and other humanitarian agencies to understand population movements during the relief operations and the subsequent cholera outbreak, so they could allocate resources more efficiently and identify areas at increased risk of new outbreaks.
Mobile phones are widely owned, even in the poorest countries in Africa, and they are a rich source of data in regions where other reliable sources are sorely lacking. Senegal’s Orange Telecom provided Flowminder, a Swedish non-profit organization, with anonymized voice and text data from 150,000 mobile phones. Using this data, Flowminder drew up detailed maps of typical population movements in the region.
Today, authorities use this information to evaluate the best places to set up treatment centers and checkpoints, and to issue travel advisories in an attempt to contain the spread of the disease.
The first drawback is that this data is historic. Authorities need to be able to map movements in real time, especially since people’s movements tend to change during an epidemic.
The second drawback is that the data provided by Orange Telecom covers only a small region of West Africa.
Here is my recommendation to the Centers for Disease Control and Prevention (CDC):
- Increase the area for data collection to the entire region of West Africa, which covers over 2.1 million cell-phone subscribers.
- Collect mobile phone mast activity data to pinpoint where calls to helplines are mostly coming from, draw population heat maps, and track population movement. A sharp increase in calls to a helpline is usually an early indicator of an outbreak.
- Overlay this data on census data to build up a richer picture.
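The mast-activity idea above can be sketched in a few lines: group helpline call records by cell tower and flag towers whose recent call volume spikes against a baseline. The record format, dates, and the 2x spike threshold are all illustrative assumptions, not actual CDC or Flowminder practice:

```python
from collections import Counter
from datetime import date

# Hypothetical call-detail records: (tower_id, call_date).
records = [
    ("tower_a", date(2014, 9, 1)), ("tower_a", date(2014, 9, 1)),
    ("tower_a", date(2014, 9, 8)), ("tower_a", date(2014, 9, 8)),
    ("tower_a", date(2014, 9, 8)), ("tower_a", date(2014, 9, 8)),
    ("tower_b", date(2014, 9, 1)), ("tower_b", date(2014, 9, 8)),
]

def helpline_spikes(records, baseline_day, recent_day, ratio=2.0):
    """Flag towers whose helpline call volume on recent_day is at
    least `ratio` times their volume on baseline_day."""
    baseline = Counter(t for t, d in records if d == baseline_day)
    recent = Counter(t for t, d in records if d == recent_day)
    return {t for t, n in recent.items()
            if n >= ratio * max(baseline.get(t, 0), 1)}

print(helpline_spikes(records, date(2014, 9, 1), date(2014, 9, 8)))
# tower_a doubled (2 -> 4 calls) and is flagged; tower_b stayed flat
```

The same per-tower counts could feed a population heat map, and joining tower locations against census tracts would give the overlay described above.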
The most positive impact we can have is to help emergency relief organizations and governments anticipate how a disease is likely to spread. Until now, they have had to rely on anecdotal information, on-the-ground surveys, and police and hospital reports.
I’ve been advocating for years that the value of the EHR lies not in replacing the paper chart with an electronic system, but in collecting data that can be used to understand and improve care. So I was very pleased to see Dr. John Showalter’s blog address this very issue, making a compelling case with real-world examples where wisdom derived from data has made demonstrable improvements in healthcare quality and corresponding reductions in cost. (more…)
In this video, Richard Cramer, chief healthcare strategist, and Scott Fingerhut, senior director of product marketing for CEP at Informatica, discuss healthcare and complex event processing (CEP).
Richard and Scott cover the following topics:
– What CEP is;
– How CEP pertains to healthcare;
– How CEP differs from data warehouse analytics;
– What some of the applications of CEP are in the healthcare environment; and,
– Where the opportunities are for companies that have already invested heavily in meaningful use and EHRs.
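As a rough illustration of how CEP differs from data warehouse analytics: instead of querying historical data in batch, a CEP engine evaluates rules as each event streams in. The toy monitor below fires when a pattern appears within a sliding window of recent events; the event names and the rule itself are invented for the example and are not from the video:

```python
from collections import deque

def cep_monitor(events, window=3):
    """Toy complex-event-processing rule: evaluate the pattern as
    each event arrives, rather than querying history afterward.
    Fires an alert when two 'abnormal_vitals' readings appear
    within the last `window` events."""
    recent = deque(maxlen=window)
    alerts = []
    for i, ev in enumerate(events):
        recent.append(ev)
        if sum(e == "abnormal_vitals" for e in recent) >= 2:
            alerts.append(i)  # record position of the triggering event
    return alerts

stream = ["normal", "abnormal_vitals", "normal", "abnormal_vitals", "normal"]
print(cep_monitor(stream))
# alert fires at the fourth event, when two abnormal readings
# fall inside the three-event window
```

A data warehouse query could find the same pattern, but only after the data is loaded; the CEP approach reacts at the moment the second abnormal reading arrives.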
InfoWorld recently published an article by Paul Venezia entitled “10 IT agenda items for the first US CIO.” In reading through the article, I thought that some of the recommendations that Venezia makes could be more effective if a data quality strategy were also included as a part of the agenda.
For instance, the first suggested agenda item calls for “Mandatory restitution for customer data leaks.” Although this is primarily focused on data breaches and their impact on consumers, my recommendation is that this agenda item be expanded to also require a data quality firewall.
By doing so, any data entering or leaving an organization would pass through standard data quality processes. Since data processing continues to move toward real time, sufficient checks on the data, including standardizing key fields and looking for duplicates, are a way to ensure data integrity is maintained both at the source and throughout the data movement process. (more…)
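A minimal sketch of the kind of checks such a firewall might run, standardizing key fields and dropping duplicates before data moves on; the field names and normalization rules here are assumptions for illustration, not a prescribed implementation:

```python
import re

def standardize(record):
    """Normalize key fields so equivalent records compare equal:
    collapse whitespace and title-case names, keep only digits
    in phone numbers."""
    return {
        "name": " ".join(record["name"].split()).title(),
        "phone": re.sub(r"\D", "", record["phone"]),
    }

def dedupe(records):
    """Standardize each incoming record and drop duplicates,
    simulating a data quality firewall check on a feed."""
    seen, clean = set(), []
    for r in records:
        s = standardize(r)
        key = (s["name"], s["phone"])
        if key not in seen:
            seen.add(key)
            clean.append(s)
    return clean

incoming = [
    {"name": "  jane   doe ", "phone": "(555) 123-4567"},
    {"name": "Jane Doe", "phone": "555.123.4567"},  # same person, different formatting
]
print(dedupe(incoming))
# one standardized record survives: {'name': 'Jane Doe', 'phone': '5551234567'}
```

Running the same checks on both inbound and outbound feeds is what keeps integrity consistent at the source and throughout the data movement process.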