by Chiara Garattini & Han Pham

 

Read Part I

Read Part II

 

In the third and final installment of our Future Cities, Future Health blog series, we look at the last theme, Mapping Cities (Creatively). It showcases creative ideas for allocating healthcare resources and for using sound to produce insights into complex health data, developed as part of the physical computing module on the Innovation Design and Engineering Masters programme run jointly by Imperial College and the Royal College of Art (RCA).


Mapping cities (creatively)

In considering how to allocate resources, we also need to understand where resources are most needed, and how this changes dynamically within a city.

 


Figure 1. Ambulance Density Tracker


Antoni Pakowski asked how ambulances might be distributed within a city to shorten response times for critical cases, and suggested this could be supported by anonymous tracking of people via their mobile phones. The expected service window for ambulance arrival in critical care cases is 8 minutes, yet in London only around 40 percent of calls meet that target. This may be partly because ambulances are tied to static base stations. How can the location of an ambulance change as the density of people changes across a city?

 

The ambulance density tracker (Figure 1) combined a mobile router with a hacked PirateBox to anonymously retrieve the IP addresses of phones actively seeking Wi-Fi, creating a portable system for tracking the density of transient crowds. The prototype was designed to rely on only one data point within a given region, requiring less processing than an embedded phone app. He also created a scaled-down model of the prototype to suggest a future small device that could be affixed to static and moving infrastructure, such as taxis, within the city.
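As a rough illustration of how a portable crowd-density proxy can work, the sketch below counts unique Wi-Fi probe-request senders over a one-minute window. It is not the student's PirateBox implementation; it assumes the scapy library, root privileges, and a wireless interface already in monitor mode (here called "wlan0mon"), and it hashes device addresses rather than storing them.

```python
# Minimal sketch: estimate transient crowd density by counting unique, hashed
# Wi-Fi probe-request senders over a short window. Illustrative only.
import hashlib
from scapy.all import sniff
from scapy.layers.dot11 import Dot11ProbeReq

seen = set()  # salted hashes of sender addresses, never the raw values

def handle(pkt):
    if pkt.haslayer(Dot11ProbeReq) and pkt.addr2:
        # Hash the address with a per-session salt so devices cannot be re-identified later.
        seen.add(hashlib.sha256(("session-salt" + pkt.addr2).encode()).hexdigest())

sniff(iface="wlan0mon", prn=handle, timeout=60, store=False)
print(f"Approximate devices nearby in the last minute: {len(seen)}")
```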

 

Although the original use case needs further design work to be clearer, the prototype itself, a lightweight, anonymous device that provides a portable proxy for transient crowd density, may be useful as a complementary technology for other design projects aimed at providing impromptu and ad hoc health resources within a city as crowds shift.

 


Figure 2. 'Citybeat'


The second project in this category, 'Citybeat', is by student Philippe Hohlfeld (Figure 2). Philippe wanted to explore the sound of a city: not only creating 'sound' maps, but also capturing the 'heartbeat' of a city through 'sonified' feedback from it. His thinking originated from three distinct scientific endeavours: a) turning preliminary Higgs boson data from the ATLAS experiment at CERN into a symphony to celebrate the connectedness of different scientific fields; b) turning solar flares into music at the University of Michigan to produce new scientific insights; and c) a blind scientist at NASA turning the gravitational fields of distant stars into sound to determine how they interact.

 

The project looked specifically at the Quality of Life Index (safety, security, general health, culture, transportation, etc.) and tried to attribute sounds to its different elements so as to create a 'tune' for each city. Sonification is good for finding trends and for comparing two entities. What we most liked about the project, though, was the idea of using sound rather than visual tools to produce insights into complex data.


Personal data from wearables, for example, is generally presented in visual dashboards. Even though these are meant to make the data easier to digest, they do not always succeed. Sound could be quicker than visual displays at expressing, for example, rapid or slow progress (e.g. an upbeat tone) or regress (e.g. a downbeat one). In the current landscape of information overload, exploring sound as an alternative way of summarizing data is something we found very interesting.
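To make the idea concrete, here is a minimal sonification sketch using made-up Quality of Life sub-scores: each score is mapped to a pitch and written out as a short sequence of tones using only the Python standard library. It illustrates the principle rather than the Citybeat prototype itself.

```python
# Minimal sonification sketch: map illustrative Quality of Life sub-scores (0-100)
# to sine tones (higher score -> higher pitch) and write them to a WAV file.
import math, struct, wave

scores = {"safety": 72, "health": 65, "culture": 88, "transport": 54}  # hypothetical values
RATE = 44100
frames = bytearray()

for name, score in scores.items():
    freq = 220 + score * 4                      # score 0-100 maps to 220-620 Hz
    for i in range(int(RATE * 0.4)):            # 0.4 s tone per metric
        sample = int(32767 * 0.5 * math.sin(2 * math.pi * freq * i / RATE))
        frames += struct.pack("<h", sample)     # 16-bit little-endian sample

with wave.open("citybeat.wav", "wb") as wav:
    wav.setnchannels(1)
    wav.setsampwidth(2)
    wav.setframerate(RATE)
    wav.writeframes(bytes(frames))
```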


Figure 3. 'Bee gate'

Finally, the last selected project in this list is also one of the most unusual. Student James Batstone wanted to explore how bees interact with polluted environments and how they could be used as part of reclamation or decontamination programmes. He imagined a city (or territory) abandoned due to pollution, with bees collecting pollen that could be analysed to establish whether the territory was ready to be reclaimed for human habitation.

He built a prototype with 'bee gates' that would allow pollen to be harmlessly captured from individual insects as they return to the hive (Figure 3). He also proposed complementing this with automated software that used cameras to track the bees and analyse their dances to establish the provenance of the pollen. What we liked about this project is the imaginative idea of using bees to monitor air and land quality, by analysing vegetation through their pollen as well as radiation and pollutants in honey, to create maps of land quality. Using natural resources and naturally occurring processes to complement what technology can do (and vice versa) is the way to achieve sustainable solutions in the long term.

 

Final thoughts

As part of our work at Intel, we collaborate with the world’s top universities to look at the future of cities with an eye toward the intersection of technology, environment, and social sustainability. In our groups one can find entrepreneurs, designers, hacktivists, engineers, data artists, architects and more.

 

We seek to support the same diversity of inspiration in today's students, the future technology innovators, by exploring how to connect creativity to technology for more vibrant, connected cities and communities. In many ways, working with first-year master's students offers a refreshing perspective on how to open these questions with a beginner's mind-set, and on how to embrace simplicity in the face of rising information: our digital traces and data footprints will keep growing, but our time to make sense of them won't.

 

Physical computing is coming into play in new ways, and more often. It will no longer be enough to get lost in a screen: the interface of tomorrow will be everywhere, and interactions will leap off screens into the real world. 'Future Cities, Future Health' suggested how to consider the role of physical computing in helping create more sustainable services: for example, by making transparent what services are needed and where, by exploring how to communicate new urban information streams simply and well, and, last but not least, by reflecting on how to deliver resources where they will be most needed in a constantly changing city.

 

 

*Concepts described are for investigational research only.

**Other names and brands may be claimed as the property of others.

When National Coordinator Karen DeSalvo said, at HIMSS15 in Chicago, that the nation needed "true interoperability, not just exchange,” I imagined the 45,000 or so attendees crying out in unison, “Just exchange?”

 

Measured only since George Bush made EMRs a priority in his 2004 State of the Union address, it has taken our country 11 difficult years to get to a point where 75-80 percent of U.S. hospitals are deploying electronic health records and exchanging a limited set of data points and documents. Especially for smaller hospitals, “just exchange” represents a herculean effort in terms of acquisition, deployment, training and implementation.

 

But adoption of EHRs, meaningful use of the technology, and peer-to-peer exchange of data were never defined as the endpoints of this revolution. They are the foundation on which the next chapter, interoperability, will grow.

 

I asked a former colleague of mine, Joyce Sensmeier, HIMSS Vice President, Informatics, how she distinguished exchange from interoperability.

 

“I think of exchange as getting data from one place to another, as sharing data. Ask a question, get an answer,” she said. “Interoperability implies a many-to-many relationship. It also includes the semantics – what do the data mean? It provides the perspective of context, and to me, it’s going towards integration.”

 

There are other definitions as well. One CIO I correspond with told me he relies on the IEEE definition, which is interesting because ONC uses it too: “the ability of two or more systems or components to exchange information and to use the information that has been exchanged.” And as I was writing this blog, John Halamka beat me to the punch on his blog with a post titled So what is interoperability anyway?

 

His answer: it’s got to be more than “the kind of summaries we’re exchanging today which are often lengthy, missing clinical narrative and hard to incorporate/reconcile with existing records.”

 

Sensmeier notes that interoperability, like exchange, is an evolutionary step on the path to a learning health system. I like that metaphor. As with biological evolution, healthcare IT adaptation is shaped by an ever-changing environment. We shouldn’t expect that every step creates a straight line of progress — some solutions will be better than others. The system as a whole learns, adopts the better practices, and moves forward.

 

(This, incidentally, is why Meaningful Use has proven unpopular among some providers. Although it emerged from excellent thinking in both the public and private sectors, its implementation has taken a command-and-control approach that rewarded, or punished, providers for following processes rather than for creating positive change.)

 

Moving from exchange to interoperability, DeSalvo said at HIMSS15, will require “standardized standards,” greater clarity on data security and privacy, and incentives for “interoperability and the appropriate use of health information.”

 

I’ve seen two recent reports that suggest the effort will be worthwhile, even if the payoff is not immediate. A study from Niam Yaraghi of the Brookings Institution found that after more than a decade of work, “we are on the verge of realizing returns on investments on health IT.” And analysts with Accenture report that healthcare systems saved $6 billion in avoided expense in 2014 thanks to the “increasing ubiquity of health IT.” Accenture expects that number to increase to $10 billion this year and $18 billion in 2016.

 

Sensmeier says she expects that reaching “true interoperability” will occur faster than “just exchange” did.

 

“It won’t happen overnight,” she warns, “but it won’t take as long, either.” With evolving agreement on standards, the infrastructure of HIEs and new payment models, the path to interoperability should be smoother than the one that got us to where we are today.

 

What questions do you have?

Would you use a medical diagnostic that is missing 63 percent of its input data, and simultaneously has a 30 percent false positive rate? I wouldn’t.

 

That is, I wouldn’t if I could avoid it. Sadly, I can’t. These are the stats for the structured problem list in the typical enterprise electronic health record (EHR). Two studies have found that at least 63 percent* of the key clinical information about patients can only be found in text and scanned documents, which means that it is not readily accessible to clinical decision support.

 

At the same time, consider heart failure. One study found that a shocking 30 percent** of patients with heart failure in their problem lists do not actually have heart failure!

 

On Wednesday, May 27, I had the pleasure of co-presenting on this topic at the Health Technology Forum with Vishnu Vyas. Vishnu is the Director of Research and Development at Apixio, a Big Data analytics company for healthcare. Our goal was to introduce the concept of The True State of the Patient as a computable model of the patient for optimizing care delivery, outcomes and revenue.

 

Our message was that Big Data analytics can dramatically improve, and eventually eliminate, these problems.

 

First, the bad news. I’ve already mentioned a source of error: improperly attributed diagnoses in the structured problem list. I’ve also mentioned a data gap: missing diagnoses in the structured problem list. But there are other critical gaps. For example, typical patients spend only 0.08 percent*** of their waking hours in a clinical setting, so we are missing observations of the remaining 99.92 percent*** of their lives. For a patient with diabetes or cardiovascular disease, the inability to measure activity over the course of the entire day makes it very difficult to proactively manage these chronic conditions. For Parkinson’s patients, whose symptoms such as tremor and sleep disturbance are intermittent and highly variable, it is nearly impossible to assess the impact of therapy with such meager sampling. In both cases, to fill in these gaps is to move closer to knowing the True State of the Patient.

 

Another gap is genomic data. The literature abounds with amazing examples of genomic markers that can inform care decisions for patients with a variety of conditions, including but not limited to cancer; however, these markers are not routinely measured and applied in clinical practice. We would argue that genomic data is often an important, but overlooked, component of the True State of the Patient.

 

Ok, so we have errors and we’re missing data. How does Big Data analytics help?

 

Let’s look at false positives for heart failure in the structured problem list. This is an important example, because heart failure patients are very sick, and they are subject to Medicare 30-day readmission rules, so the correct assessment of heart failure has real consequences for patients and health systems.

 

It is possible to build a classifier (a kind of machine learning model) that can look at all of the information in a patient’s chart and make a determination about how likely it is that the patient actually has heart failure. This is very interesting because the computer is not diagnosing heart failure; it is reliably identifying what the chart of a true heart failure patient looks like. Machine learning, a typical tool in the Big Data arsenal, allows us to dramatically reduce the impact of errors in the problem list.
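To give a feel for what such a classifier involves, here is a deliberately tiny sketch (not Apixio's model) that scores how strongly a chart's text resembles the charts of confirmed heart-failure patients. It assumes scikit-learn and a labeled set of chart excerpts; the two training examples shown are invented, and a real model would be trained on many thousands of charts with far richer features.

```python
# Illustrative only: score how much a chart "looks like" the chart of a true
# heart-failure patient, using TF-IDF text features and logistic regression.
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

charts = [
    "ejection fraction 30 percent, started furosemide, bilateral lower extremity edema",  # hypothetical
    "annual physical, no complaints, echocardiogram normal",                              # hypothetical
]
labels = [1, 0]  # 1 = chart of a confirmed heart-failure patient

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(charts, labels)

# Probability that a new chart resembles a true heart-failure chart.
new_chart = ["shortness of breath, elevated BNP, reduced ejection fraction"]
print(model.predict_proba(new_chart)[0][1])
```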

 

What about gaps? The same trick can be used for filling in gaps in the structured data, and is especially effective when text mining and natural language processing are used to find key clinical information in the unstructured data. Progress notes, nursing notes, consult letters and hospital discharge summaries are fertile sources of information for this kind of analysis. Big Data analytics leads to fewer gaps in the structured data.

 

What about gaps caused by the 99.92 percent of the time that the patient spends outside the clinic? This is where Big Data’s Internet of Things (IoT) comes in. In healthcare, IoT means remote patient monitoring, which can include anything from activity monitoring with a smart watch to weight, blood pressure and glucose monitoring with FDA-approved medical devices. There are a number of very interesting studies happening right now in which patient monitoring is filling in the gaps in the clinical record.

 

Finally, what about the genome? Everything about the human genome is Big Data analytics, from the raw data for more than two billion base pairs, to the huge datasets and patient cohorts that are needed to correlate observed genomic variation with clinical measurements and outcomes. Inclusion of the human genome in the True State of the Patient will lead to some of the largest and most interesting Big Data advances in healthcare over the next 10 years. 

 

Once we have the True State of the Patient in a computable form, it is possible to improve every area of healthcare. Quality reporting, precision medicine, “practice-based evidence,” proactive wellness and disease management, revenue optimization and care delivery optimization all flow from accurate models of healthcare data. There are few businesses that can survive without accurate, reliable business data to drive modeling and optimization. The changes we are seeing in healthcare today are demanding that healthcare have the same level of transparency into the True State of the Patient.

 

Big Data analytics will be a critical part of this transformation in healthcare. What questions do you have?

 

* Studies performed by Apixio (private communication) and Harvard: http://www.nejm.org/doi/full/10.1056/NEJMp1106313#t=article
** Apixio study (private communication)
*** http://www.cdc.gov/nchs/fastats/physician-visits.htm (4.08 visits/American/year); http://mail.fmdrl.org/Fullpdf/July01/ss5.pdf (all visits < 80 minutes of time with the physician; this is a strong upper limit). NB: 0.08% assumes 16 waking hours in a day, 80 minutes per visit, and 4.08 visits/year/patient.

Read Part 1 of this two-part blog series

 

What do the next 17 years hold in store for us? I believe we are at the dawn of the era of petascale genomics. In 2015, we can manipulate gigascale genomic datasets without much difficulty. Deriving insight from data at this scale is pervasive today, and the IT infrastructure needed isn’t much more advanced than a small Linux cluster, or an easily affordable amount of time on a large cloud provider’s infrastructure. Manipulation of terascale datasets, however, is still not easy. It is possible, to be sure, and researchers are busy attempting to derive insight from genomic data at these scales. But it is definitely not easy, and again the reason is the IT infrastructure.

 

Terascale datasets do not fit neatly into easily affordable computational architectures in 2015: one needs advanced techniques to split up the data for analysis (e.g., Hadoop-style workflows) or advanced systems well beyond the average Linux HPC cluster (e.g., the SGI UV server). Indeed, the skilled IT observer would say that these techniques and systems were invented for data analysis at terascales. But true petascale genomics research? No, we’re not there yet. We can certainly create data at petascales, and storage infrastructure for holding petabytes of data is fairly common (a petabyte stored on hard drives can easily fit into half a rack in 2015), but that is not petascale analysis. Being adept at analyzing and deriving scientific insight from petascale genomic datasets requires IT architectures that have not yet been produced (although theoretical designs abound, including future generations of systems from SGI!).
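To illustrate what "splitting up the data" means in practice, here is a toy sketch of the map-and-reduce pattern that Hadoop-style genomics workflows rely on: partial k-mer counts are computed on shards of sequence in parallel and then merged. The data is synthetic and the scale is tiny; a real pipeline would distribute FASTQ or BAM shards across many nodes rather than across processes on one machine.

```python
# Toy map/reduce sketch: count k-mers in shards of synthetic sequence in
# parallel, then merge the partial counts. Illustrative only.
from collections import Counter
from multiprocessing import Pool
import random

K = 8
random.seed(0)
chunks = ["".join(random.choice("ACGT") for _ in range(100_000)) for _ in range(8)]

def count_kmers(seq):
    # "Map" step: k-mer counts for one shard of the data.
    return Counter(seq[i:i + K] for i in range(len(seq) - K + 1))

if __name__ == "__main__":
    with Pool() as pool:
        partials = pool.map(count_kmers, chunks)
    total = Counter()                 # "Reduce" step: merge the partial counts.
    for partial in partials:
        total.update(partial)
    print(total.most_common(3))
```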

 

We are headed in this direction. NGS technologies are only getting more affordable. If there’s anything the past 17 years have taught us, it is that once data can be generated at some massive scale, it will be.

 

Perhaps “consumer” genomics will be the driver. The costs of DNA sequencing will be low enough that individuals with no scientific or HPC background will want to sequence their own DNA for healthcare reasons. Perhaps the desire for control over one’s genomic data will become pervasive (giving a whole new meaning to “personalized medicine”) versus having that information be controlled by healthcare providers or (gasp!) insurance companies. Once you have millions of individuals capturing their own genomes on digital media we will have petascale genomics analysis.

 

Imagine the insights we can gain from manipulation of data at these scales. Genomic analysis of not one human genome, but millions of genomes, and perhaps also tracking genomic information through time. Why not? If the cost of DNA sequencing is not a barrier why not sequence individuals or even whole populations through time? That’ll give new meaning to “genome-wide association studies”, that’s for sure. Whatever the reason and whatever the timeline, the destination is not in doubt – we will one day manipulate petascale genomics datasets and we will derive new scientific insight simply because of the scale and pace of the research. And it will be advanced IT architectures from companies like SGI and Intel that will make this possible.

 

Here’s to the next 17 years. I’ll see you in 2032 and we’ll talk about how primitive your 50,000-core cluster and your 100PB filesystems are then.

 

What questions do you have?

 


James Reaney is Senior Director, Research Markets for Silicon Graphics International (SGI).

 

Patient engagement and analytics were trending topics at HIMSS15. I now wonder how the essence of those conversations will change going forward.


In this video, I share insight into how the discussions around analytics and patient engagement need to shift toward improving the quality of care and reducing costs. I also look at how the growing volume of personal health data coming from wearables and genomic research will help drive truly customized care into becoming a reality.

 

Watch the short video and let us know what questions you have about the future of analytics and customized care, and where you think they’re headed.

Seventeen years. That’s how long it has taken us to move from the dawn of automated DNA sequencing to the data tsunami that defines next-generation sequencing (NGS) and genomic analysis in general today. I’m remembering, with some fondness, the year 1998, which I’ll consider the year the life sciences got serious about automated DNA sequencing, and about sequencing the human genome in particular; the year the train left the station and genomics research went from the benchtop to prime mover of high-performance computing (HPC) architectures, never to look back.

 

1998 was the year Perkin Elmer formed PE Biosystems, an amalgam of Applied Biosystems, PerSeptive Biosystems, Tropix, and PE Informatics, among other acquisitions. That was the year PE decided they could sequence the human genome before the academics could (that is, by competing against their own customers), and that they would do it by brute-force application of automated sequencing technologies. That was the year Celera Genomics was born and Craig Venter became a household name. At least if you lived in a household where molecular biology was a common dinnertime subject.

 

Remember Zip Drives?

In 1998, PE partnered with Hitachi to produce the ABI “PRISM” 3700, and hundreds of these machines were sold worldwide, kick-starting the age of genomics. PE Biosystems revenues that year were nearly a billion dollars. The 3700 was such a revolutionary product that it purportedly could produce as much DNA data in a single day as the typical academic lab could produce in a whole year. And yet, from an IT perspective, the 3700 was quite primitive. The computational engine driving the instrument was a Mac Centris, later upgraded to a Quadra, then finally to a Dell running Windows NT. There was no provision for data collection other than local storage, which, if you wanted any portability, meant the then-ubiquitous Iomega Zip Drive. You remember those? Those little purplish-blue boxes that sat on top of your computer and gave you a whopping 100 megabytes of portable storage. The pictures on my phone would easily fill several Zip disks today.

 

Networking the 3700 was no mean feat either. We had networking in 1998, of course; gigabit Ethernet and most wireless networking technologies were still just an idea, but 100 megabit (100Base-TX) connections were common enough, and just about anyone in an academic research setting had at least a 10 megabit (10Base-T) connection available. The problem was the 3700, and specifically the little Dell PC that was paired with the instrument and responsible for all the data collection and subsequent transfer of data to some computational facility (Beowulf-style Linux HPC clusters were just becoming commonplace in 1998 as well). As shipped from PE at that time, there was zero provision for networking, and zero provision for data management beyond the local hard drive and/or the Zip Drive.

 

It seems laughable today, but PE did not consider storage and networking, i.e., the collection and transmission of sequencing data, a strategic platform element. I guess it didn’t matter, since they were making a BILLION DOLLARS selling 3700s and all those reagents, even if a local hard drive and sneakernet were your only realistic data management options. Maybe they just didn’t have the proper expertise at that time. After all, PE was in the business of selling laboratory instruments, not computers, storage, or networking infrastructure.

 

Changing Times

How times have changed. NGS workflows today practically demand HPC-style computational and data management architectures. The capillary electrophoresis sequencing technology in the 3700 was long ago superseded by newer and more advanced sequencing technologies, dramatically increasing the data output of these instruments while simultaneously lowering costs. It is not uncommon today for DNA sequencing centers to output many terabytes of sequencing data every day from each machine, and there can be dozens of machines all running concurrently. To be a major NGS center means also being adept at collecting, storing, transmitting, managing, and ultimately archiving petascale amounts of data. That’s seven orders of magnitude removed from the Zip Drive. If you are also in the business of genomics analysis, you need to be expert in computational systems capable of handling data and data rates at these scales as well.

 

Today, this means either massively scalable cloud-based genomics platforms or the more traditional, even higher-scale HPC architectures that dominate all large research computing centers worldwide. We are far, far beyond the days of any single Mac Quadra or Dell server. Maybe if PE had been paying closer attention to the IT side of the NGS equation, they would still be making billions of dollars today.

 

In Part II of this blog, I’ll look at what’s in store for the next 17 years in genomics. Watch for the post next week.

 


James Reaney is Senior Director, Research Markets for Silicon Graphics International (SGI).

Telehealth is often touted as a potential cure for much of what ails healthcare today. At Indiana’s Franciscan Visiting Nurse Service (FVNS), a division of Franciscan Alliance, the technology is proving that it really is all that. Since implementing a telehealth program in 2013, FVNS has seen noteworthy improvements in both readmission rates and efficiency.


I recently sat down with Fred Cantor, Manager of Telehealth and Patient Health Coaching at Franciscan, to talk about challenges and opportunities. A former paramedic, emergency room nurse and nursing supervisor, Fred transitioned to his current role in 2015. His interest in technology made involvement in the telehealth program a natural fit.


At any one time, Fred’s staff of three critical care-trained monitoring nurses, three installation technicians and one scheduler is providing care for approximately 1,000 patients. Many live in rural areas with no cell coverage – often up to 90 minutes away from FVNS headquarters in Indianapolis.


Patients who choose to participate in the telehealth program receive tablet computers that run Honeywell LifeStream Manager* remote patient monitoring software. In 30-40 minute training sessions, FVNS equipment installers teach patients to measure their own blood pressure, oxygen, weight and pulse rate. The data is automatically transmitted to LifeStream and, from there, flows seamlessly into Franciscan’s Allscripts™* electronic health record (EHR). Using individual diagnoses and data trends recorded during the first three days of program participation, staff set specific limits for each patient’s data. If transmitted data exceeds these pre-set limits, a monitoring nurse contacts the patient and performs a thorough assessment by phone. When further assistance is needed, the nurse may request a home visit by a field clinician or further orders from the patient’s doctor. These interventions can reduce the need for in-person visits requiring long-distance travel.
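The alerting logic at the heart of this workflow is straightforward to sketch. The snippet below is a simplified illustration, not FVNS's actual LifeStream/Allscripts integration: it derives per-patient limits from early readings and flags any later reading that falls outside them so a monitoring nurse can follow up. The readings and the two-standard-deviation band are hypothetical.

```python
# Simplified sketch of per-patient threshold alerting for remote monitoring.
from statistics import mean, stdev

def baseline_limits(readings, k=2.0):
    """Set limits as mean +/- k standard deviations of the early readings."""
    m, s = mean(readings), stdev(readings)
    return m - k * s, m + k * s

# Hypothetical systolic blood pressure readings from the first three days.
first_three_days = [128, 131, 126, 133, 129, 130]
low, high = baseline_limits(first_three_days)

def check(reading):
    if not (low <= reading <= high):
        print(f"Reading {reading} outside limits ({low:.0f}-{high:.0f}): notify monitoring nurse")

check(129)   # within limits, no action
check(162)   # triggers a nurse follow-up call
```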


FVNS’ telehealth program also provides patient education via LifeStream. For example, a chronic heart failure (CHF) patient experiencing swelling in the lower extremities might receive content on diet changes that could be helpful.


Since the program was implemented, overall readmission rates have been well below national averages. In 2014, the CHF readmission rate was 4.4%, compared to a national average of 23%. The COPD rate was 5.47%, compared to a national average of 17.6%, and the CAD/CABG/AMI rate was 2.96%, compared to a national average of 18.3%.


Despite positive feedback, medical staff resistance remains the biggest hurdle to telehealth adoption. Convincing providers and even some field staff that, with proper training, patients can collect reliable data has proven to be a challenge. The telehealth team is making a concerted effort to engage with patients and staff to encourage increased participation.


After evaluating what type of device would best meet the program’s needs, Franciscan decided on powerful, lightweight tablets. The touch screen devices with video capabilities are easily customizable and can facilitate continued program growth and improvement.


In the evolving FVNS telehealth program, Fred Cantor sees a significant growth opportunity. With knowledge gained from providing the service free to their own patients, FVNS could offer a private-pay package version of the program to hospital systems and accountable care organizations (ACOs).


Is telehealth a panacea? No. Should it be a central component of any plan to reduce readmission rates and improve workflow? Just ask the patients and healthcare professionals at Franciscan VNS.

 

Healthcare systems are coping with an unprecedented level of change. They’re managing a new regulatory environment, a more complex healthcare ecosystem, and an ever-increasing demand for services—all while facing intense cost pressures.

 

These trends are having a dramatic impact on EMR systems and healthcare databases, which have to maintain responsiveness even as they handle more concurrent users, more data, more diverse workflows, and a wider range of application functionality.

 

As Intel prepared to introduce the Intel® Xeon® processor E7 v3 family, we worked with engineers from Epic and InterSystems to ensure system configurations that would provide robust, reliable performance. InterSystems and VMware were also launching their next-generation solutions, so the test team ran a series of performance tests pairing the Intel Xeon processor E7-8890 v3 with InterSystems Caché 2015.1 and a beta version of VMware vSphere ESXi 6.0.

                                

The results were impressive. “We saw the scalability of a single operational database server increase by 60 percent,” said Epic senior performance engineer Seth Hain. “With these gains, we expect our customers to scale further with a smaller data center footprint and lower total cost of ownership.” Those results were also more than triple the end-user database accesses per second (global references or GREFs) achieved using the Intel® Xeon® processor E7-4860 with Caché® 2011.1.

 


 

These results show that your healthcare organization can use the Intel Xeon processor E7 v3 family to implement larger-scale deployments with confidence on a single, scale-up platform.

 

In addition, if you exceed the vertical scalability of a single server, you can use InterSystems Caché’s Enterprise Cache Protocol (ECP) to scale horizontally. Here again, recent benchmarks show great scalability. A paper published earlier this year reported more than a threefold increase in GREFs for horizontal scalability compared to previous-generation technologies.

 

This combination of outstanding horizontal and vertical scalability, in the cost-effective environment of the Intel® platform, is exactly what's needed to meet rising demands and create a more agile, adaptable, and affordable healthcare enterprise.

                                                                              

What will these scalability advances mean for your healthcare IT decision makers and data center planners? How will they empower your organization to deliver outstanding patient care and enhance efficiency? I hope you’ll read the whitepapers and share your thoughts. And please keep in mind: Epic uses many factors, along with benchmarking results, to provide practical sizing guidelines, so talk to your Epic system representative as you develop your scalability roadmap.

 

Read the whitepaper about vertical scalability with the Intel Xeon processor E7 v3.

 

Read the whitepaper about horizontal scalability with Intel Xeon processors.

 

Join and participate in the Intel Health and Life Sciences Community

 

Follow us on Twitter: @IntelHealth, @IntelITCenter, @InterSystems, @vmwareHIT

 

Steve Leibforth is a Strategic Relationship Manager at Intel Corporation

The health and well-being of any workforce has a direct impact on worker productivity, efficiency and happiness, all critical components of any successful organization. With this in mind, Intel has developed a next-generation healthcare program, called Connected Care, which includes an integrated delivery system based on a patient-centered medical home (PCMH) approach to care.

The shift to value-based compensation and team-based care is driving the need for improved collaboration and patient data sharing among a growing number of providers and medical systems. While we’ve successfully introduced the Connected Care program in smaller locations, bringing it to Oregon and the larger Portland metropolitan area presented us with a common healthcare IT challenge: interoperability.

 

Advanced Interoperability Delivers Better Experiences for Clinicians, Patients

 

Intel is using industry standards geared toward advancing interoperability in healthcare to address these challenges. The ability to quickly share clinical information between on-site Health for Life Center clinics and delivery system partners (DSPs) enables:

 

  • Efficient and seamless experiences for members
  • Informed decision-making by clinicians
  • Improved patient safety
  • Increased provider efficiency
  • Reduced waste in the delivery of healthcare, by avoiding redundant testing

 

These improvements will help us make the Institute for Healthcare Improvement’s (IHI’s) Triple Aim a reality: improving the patient experience (quality and satisfaction), improving the health of populations, and reducing the per-capita cost of health care.

 

Kaiser and Providence Part of Intel’s Connected Care Program

 

Intel’s Connected Care program is offering Intel employees and their dependents two new options in Oregon. Kaiser Permanente Connected Care and Providence Health & Services Connected Care have both been designed to meet the following requirements of Intel and their employees:

 

  • “Optimize my time” – member and provider have more quality interactions
  • “Don’t make me do your work” – no longer rely on members to provide medical history
  • “Respect my financial health” - lower incidence of dropped hand-offs/errors
  • “Seamless member and provider experience” - based on bi-directional flow of clinical data

 

Now that we have eliminated the interoperability barrier, we can enable strong coordination between providers at Health for Life Centers (Intel’s on-campus clinics) and the Kaiser and Providence network providers, quickly sharing vital electronic health record (EHR) data between the different systems used by each organization.

 

In our efforts to deliver optimal care to every Intel employee, we sought solutions that would ensure all providers serving Intel Connected Care members are able to see an up-to-date patient health record, with accurate medications, allergies, problem lists and other key health data, every time a Connected Care member needs care.

 

Learn More: Advancing Interoperability in Healthcare

 

What questions do you have?

 

Prashant Shah is a Healthcare Architect with Intel Health & Life Sciences

When I used to work for the UK National Health Service, I encouraged doctors and nurses to use mobile devices. But that was 15 years ago, and the devices available only had about two hours of battery life and weighed a ton. In other words, my IT colleagues and I were a bit overly optimistic about the mobile devices of the time being up to the task of supporting clinicians’ needs.

 

So it’s great to be able to stand up in front of health professionals today and genuinely say that we now have a number of clinical-grade devices available. They come in all shapes and sizes. Many can be sanitized and can be dropped without damaging them. And they often have a long battery life that lasts the length of a clinician’s shift. The work Intel has done over the last few years on improving device power usage and efficiency has helped drive the advancements in clinical-grade devices.

 

It is very clear that each role in health has different needs. And as you can see from the following real-world examples, today’s clinical-grade devices are up to the task whatever the role.

 

Wit-Gele Kruis nurses are using Windows 8 Dell Venue 11 Pro tablets to help them provide better care to elderly patients at home. The Belgian home-nursing organization selected the tablets based on feedback from the nurses who would be using them. “We opted for Dell mainly because of better battery life compared to the old devices,” says Marie-Jeanne Vandormael, Quality Manager, Inspection Service, at Wit-Gele Kruis, Limburg. “The new Dell tablets last at least two days without needing a charge. Our old devices lasted just four hours. Also, the Dell tablets are lightweight and sit nicely in the hand, and they have a built-in electronic ID smartcard reader, which we use daily to confirm our visits.”

 

In northern California, Dr. Brian Keeffe, a cardiologist at Marin General Hospital, loves that he can use the Microsoft Surface Pro 3 as either a tablet or a desktop computer, depending on where he is and the task at hand (watch video below).

 

 

When he’s with patients, Dr. Keeffe uses it in its tablet form. “With my Surface, I am able to investigate all of the clinical data available to me while sitting face-to-face with my patients and maintaining eye contact,” says Dr. Keeffe.

 

And when he wants to use his Surface Pro 3 as a desktop computer, Dr. Keeffe pops it into the Surface docking station, so he can be connected to multiple monitors, keyboards, mice, and other peripherals. ”In this setup, I can do all of my charting, voice recognition, and administrative work during the day on the Surface,” explains Dr. Keeffe.

 

These are just two examples of the wide range of devices on the market today that meet the needs of different roles in health. So if you’re an IT professional recommending mobile devices to your clinicians, unlike me 15 years ago, you can look them in the eye and tell them you have a number of great clinical-grade options to show them.

 

Gareth Hall is Director, Mobility and Devices, WW Health at Microsoft

The National Health IT (NHIT) Collaborative for the Underserved kicked off their Spring Summit with a briefing at the White House in April to commemorate the 30-year anniversary of the Heckler Report.

 

This landmark task force report, published in 1985 by then-DHHS Secretary Margaret Heckler, first introduced the country to the documented health disparities that our racial and ethnic minority populations were facing.

 

While we have made progress since, recent advances in technology have provided us with a unique opportunity to introduce real change, right now. To help carry this momentum, I participated in a lively panel discussion with industry leaders at the Summit, “Moving the Needle” for innovation success, where we discussed key action items that will help us deliver an effective and efficient healthcare ecosystem:

 

• Engage consumers to participate and manage their own health and wellness through education.

• Work with providers serving multicultural communities to increase Health IT adoption and their participation in programs that support delivery of high quality, cost effective care.

• Deliver effective educational, training and placement programs that can prepare members of multicultural communities for Health IT related careers.

• Establish and implement policies that support individual and community health empowerment and promote system transformation.

• Identify priority areas where gaps exist regarding the ability to use innovative health technologies to address disparities and plan actionable next steps.

 

Reactive approach to healthcare costly for payers and providers

Managing the complex health needs of the underserved has long been labor intensive and costly for both patients and clinicians. The lack of health coverage and other complications have traditionally presented significant challenges for a large portion of this population.

 

While the Affordable Care Act (ACA) now makes healthcare financially feasible for millions of newly insured individuals, a troubling trend may persist among some members of underserved communities who continue to only seek care after experiencing an acute health emergency, making their visits extremely costly to payers and providers. These visits usually require several medications, frequent monitoring of vitals, and lifestyle changes in diet and exercise.

 

They also typically require people who may live with instability in multiple aspects of life, to schedule and adhere to ongoing medical appointments and diagnostic tests. This isn’t an effective, realistic, or affordable approach to health and wellness, for payers, providers or consumers. But it can be addressed through raised awareness regarding the impact of health decisions and improved access to healthy options.

 

Organized data critical for effective education and outreach

Access to accurate and organized data is key when we talk about making personalized healthcare a reality. Actionable data is driving today’s cutting-edge research, leading to improvements in preventative health and wellness, as well as life-saving treatments.

 

Edge devices, like wearables, biosensors, and other consumer devices, can gather large amounts of data from various segments of the population, correlating behaviors related to diet and exercise. With end-to-end edge management systems, researchers and clinicians can have real-time access to locally filtered actionable data, helping them make accurate and educated discoveries on population behavior with amazing levels of insight.
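As a simple illustration of what "locally filtered" can mean at the device level, the sketch below reduces a window of raw sensor samples to a compact summary plus any out-of-range readings before anything leaves the device. The sensor values and thresholds are hypothetical; this shows the filtering idea only, not any particular edge management product.

```python
# Minimal sketch of edge-side filtering: forward a per-window summary and
# out-of-range events instead of every raw sample. Values are illustrative.
from statistics import mean

def summarize_window(samples, low=0.5, high=2.5):
    """Reduce one window of raw samples to a record worth transmitting."""
    return {
        "mean": round(mean(samples), 3),
        "max": max(samples),
        "out_of_range": [s for s in samples if not (low <= s <= high)],
    }

window = [1.0, 1.1, 0.9, 2.8, 1.0, 1.2]    # one window of readings (hypothetical)
print(summarize_window(window))             # only this summary is transmitted
```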

 

Understanding where individual and population health trends are headed in advance will enable providers to customize their education and outreach services, saving time and resources from being wasted on programs with little to no impact. With electronic health records (EHR), clinicians can access a patient’s history on secure mobile devices, tracking analyzed data that impacts wellness plans and treatments.

 

Quality measures for prevention, risk factor screening, and chronic disease management are then identified and evaluated to provide support for practice interventions and outreach initiatives. Along with edge and embedded devices, they can play a key role in promoting self-management and self-empowerment through better communication with clinical staff.

 

Gathering data from the underserved population

Providers who treat underserved populations and vulnerable citizens often have less access to EHRs and other technologies that help them collect, sort and analyze patient data. Another factor is that these clinics, hospitals and community centers are often reacting to crises rather than engaging in preventive outreach and education. This places greater strain on staff, patients and resources, while stretching budgets that are partly limited by payer reimbursement.

 

So the big question is, “how do we leverage the power of data within complex populations that are often consumed by competing real-world priorities?”

 

It starts with education, outreach, and improved access to healthier lifestyle options. It continues by equipping clinics, hospitals and resource centers in underserved communities with the latest Health IT devices, wearables, software and services. As innovators it is our job to craft and articulate a value proposition that is so compelling, payers will realize that an initial investment in innovation, while potentially costly, will reduce expenditures significantly in the long run.

 

By educating and empowering all consumers to more actively participate in the management of their own wellness, the need for costly procedures, medications and repeated visits will go down, saving time and resources for payer and provider – while delivering a better “quality of life” for everyone.

I have just spent the better part of two weeks involved in the training of a new 50-strong sales team. Most of the team were experienced sales people but very inexperienced in pharmaceutical sales. They had a proven record in B2B sales, but only 30 percent of the team had previously sold pharmaceutical or medical device products to health care professionals (HCPs). Clearly, after the logistical and bureaucratic aspects of the training had been completed, most of the time was spent training the team on the medical background, disease state, product specifics and treatment landscape/competitor products.

 

Preparing the team for all eventualities and every possible question/objection they may get from HCPs was key to making sure that on the day of product launch they would be competent to go out into their new territories and speak with any potential customer. With particular reference to this product it was equally important for the team to be in a position to speak with doctor, nurse and pharmacist.

 

The last part of the training was to certify each of the sales professionals and make sure that they not only delivered the key messages but could also answer most of the questions HCPs would fire at them. In order to do this, the sales professionals were allowed 10 minutes to deliver their presentation to trainers, managers and medical personnel. The assessors were randomly assigned questions/objections to be addressed during the presentation.

 

The question remains, “does this really prepare the sales person for that first interaction with a doctor or other HCP?” Experience tells us that most HCPs are busy people and they allow little or no time for pharmaceutical sales professionals in their working day. The 90 seconds that a sales professional gets with most of their potential customers is not a pre-fixed amount. Remember, doctors are used to getting the information they need to make clinical decisions by asking the questions they need answers to in order to make a decision that will beneficially affect their patient(s). So, starting the interaction with an open question is quite simply the worst thing to do, as most doctors will take this opportunity to back out and say they do not have time.

 

The trick is to get the doctor to ask the first question (that is what they spend their lives doing and they are good at it) and within the first 10-15 seconds. Making a statement that shows you understand their needs and have something beneficial to tell them is the way you will get “mental access.” Once the doctor is engaged in a discussion, the 90-second call will quickly extend to 3+ minutes. Gaining “mental access” is showing the doctor that you have a solution to a problem they have in their clinical practice and that you have the necessary evidence to support your key message/solution. This has to be done in a way that the doctor will see a potential benefit for, most importantly, their patients. In order to do this the sales professional needs to really understand the clinical practice of the person that they are seeing (i.e. done their pre-call planning) and have the materials available to instantly support their message/solution.

 

The digital visual aid is singularly the best means of providing this supporting information/data, as whatever direction the sales professional needs to go in should be accessible in 1-2 touches of the screen. Knowing how to navigate through the digital sales aid is essential as this is where the HCP is engaged or finding a reason to move on.

 

What questions do you have? Agree or disagree?

As physicians, we're taught to practice evidence-based medicine, where the evidence comes primarily from trade journals that document double-blind, randomized controlled trials. Or perhaps we turn to society meetings, problem-based learning discussions (PBLDs), or peer group discussion forums. We are dedicated to finding ways to improve patient outcomes and experience, yet we miss huge opportunities every day.

 

We are lost in a sea of data, left to debate continuous process improvement with ‘gut feelings’ and opinions. We do the ‘best we can’ because we lack the ability to glean meaningful perspective from our daily actions. As an anesthesiologist, I know there's a wonderful opportunity for analytics to make a difference in our surgical patients’ experience, and I can only imagine there are similar opportunities in other specialties.

 

Here are three undeniable reasons analytics should matter to every physician:

 

Secure Compensation

Quality compliance is here to stay, and it’s only becoming more onerous. In 2015, the CMS-mandated Physician Quality Reporting System (PQRS) finally transitioned from bonus payments to 2 percent penalties. It also raised the reporting requirements from 3 metrics to 9 metrics across 3 domains, including 2 outcome measures.

 

Unfortunately, in the absence of the right technology, compliance is too often considered just another costly burden. We’re left either relying on unresponsive third-party vendors to update our software or hiring additional staff to ‘count beans’. More frustratingly, we rarely see these efforts translate into meaningful change for the patients we serve. We arrive at the erroneous conclusion that these efforts only increase costs while offering no tangible benefits.

 

What if our technology were flexible enough to keep up with changing regulations while also making us faster and more intelligent at our jobs? How would this change our perception of regulatory requirements? Thankfully, such solutions exist, and with our input they can and should be adopted.

 

Gain Control

It’s too easy for providers to limit themselves to the “practice of medicine” – diagnosing and treating patients – and disengage from the management of our individual practices. We do ourselves a disservice because, as physicians, we have a significant advantage when it comes to interpreting the ever-increasing government regulations and applying them to our patients’ needs. There is often latitude in this interpretation, which ultimately gives rise to incorrect assumptions and unnecessary work. When we assume the responsibility for setting the definitions, we gain control over the metrics and consequently influence their interpretations.

 

By engaging in our analytics, we’re equipped to speak more convincingly with administration, we gain independence from poor implementations, and we gain freedom from added inefficiencies. We lose the all-too-common “victim perspective”, and we return to a position of influence in how and why we practice the way we do. Through analytics, we are better positioned to improve our patients’ experiences, and that can be incredibly gratifying.

 

Transform Your Industry

This ability to leverage real-time analytics has already transformed other industries. In retail, the best companies deliver exceptional service because their sales representatives know exactly who we are, what we’ve purchased, how we’ve paid, when we’ve paid, etc. Because they know our individual preferences at the point of sale, they deliver first-class customer service. Consider the example of Target, who used predictive analytics to identify which customers were pregnant simply from analyzing their transactional data, thus allowing them to intelligently advertise to a compelling market segment.

 

Imagine leveraging this same capability within the realm of surgical services. What if we could deliver individualized patient education at the time it’s needed? For example, a text message the evening before surgery reading, “It’s now time to stop eating.” Or an automated message when the patient arrives at the surgical facility, stating, “Here’s a map to the registration desk.” There are plenty of opportunities to leverage mobility and connectivity to deliver personalized care throughout the surgical experience. Further, by analyzing the data generated during the course of that surgical experience, what if we could predict who was likely to be dissatisfied before they even complained? Could we automatically alert guest relations for a service recovery before the patient is discharged? There’s no doubt: of course we can! We just need appropriate management of our surrounding data.

 

Conclusion

Through analytics we have the ability to secure our compensation, gain more control of our practices, and transform our industry by improving outcomes, improving the patient experience, and reducing costs.

 

When we’re equipped with analytical capabilities that are real-time, interactive, individualized, and mobile, we've implemented a framework with truly transformative power. We've enabled a dramatic reduction in the turnaround time for continuous process improvement. As regulatory requirements continue to increase in complexity, we have the opportunity to either work smarter using more intelligent tools or else surrender to an unfriendly future. Fellow practitioners, I much prefer the former.

 

What questions do you have? What’s your view of analytics?

As we celebrate Nurses Week across the world, I wanted to highlight the impact of telehealth on the changing nature of nursing and healthcare more generally.

 

But before I do that, we must recognise that for all of the technological advancements, the priority for nurses is to provide the best patient care. And telehealth is helping nurses to do just that. With ageing populations across most developed nations putting additional stress on healthcare systems, there is an increasing need to free up costly hospital beds and nurses' time by monitoring and managing patients remotely.

 

From blood pressure monitors to fall sensors, telehealth is enabling nurses to provide better care by helping them work more efficiently and stay better informed about a patient’s condition. Recent research (see infographic below) suggests that telehealth will increase tenfold from 2013 to 2018, and with advances around the ‘Internet of Things’ bringing enhanced levels of connected care, I see nurses being able to do a great job even better in the future.

 

 

[Infographic: projected tenfold growth in telehealth, 2013-2018]

 

If you haven’t noticed lately, we’re seeing an increase in demand for analytics driven by health reform. However, for many organizations, the culture needs to change in order to fully embrace analytics as part of the standard practice of care. Many will agree that there is now too much information for clinicians to rely only on training and experience as they treat patients; they also need to leverage insights from analytics for clinical decision support. Providers who embrace analytics will be best positioned to improve patient care from the perspective of decreased cost, improved efficiency and enhanced patient experience.


In my role at Intel, I’m often asked where “big data” capabilities can apply to healthcare. One of the areas that always tops my list is the clinical record. Roughly 70 percent of the electronic health record (EHR) consists of clinically relevant information that is unstructured or in free-form notes, meaning potentially critical pieces of information are not easily accessible to providers. There are many tools that can use sophisticated natural language processing techniques to pull out the clinically relevant information; however, the culture has to be ready to accept those kinds of solutions and use them effectively.
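As a deliberately simple illustration of the idea (production clinical NLP is far more sophisticated, handling negation, abbreviations, and context), the sketch below pulls candidate problems and medications out of a free-text note by matching against small dictionaries. The note and the term lists are invented.

```python
# Toy sketch of extracting structured facts from an unstructured clinical note.
# Real clinical NLP handles negation, context, and vastly larger vocabularies.
import re

PROBLEMS = {"heart failure", "diabetes", "hypertension", "copd"}
MEDS = {"lisinopril", "metformin", "furosemide"}

note = ("Pt with long-standing hypertension and type 2 diabetes, "
        "started on lisinopril 10 mg daily; denies chest pain.")

# Lowercase, strip punctuation, and pad with spaces for whole-word matching.
tokens = " " + re.sub(r"[^a-z0-9 ]", " ", note.lower()) + " "
found_problems = {p for p in PROBLEMS if f" {p} " in tokens}
found_meds = {m for m in MEDS if f" {m} " in tokens}

print("problems:", found_problems)   # e.g. {'hypertension', 'diabetes'}
print("meds:", found_meds)           # e.g. {'lisinopril'}
```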

 

Overcoming challenges

Personalized medicine analytics combines bits of data from multiple sources, and each source comes with its own unique set of challenges. There is the payer side, the clinical side, the biology, life sciences and genomics side, and finally the patient side, and the work that we’ve been doing spans all of these areas. We look at big data in health and life sciences as the aggregation of all of these different data sources, and we address the challenge of how this content will be generated, moved, stored, curated and analyzed.

The goal is to take advantage of sophisticated analytics and technology capabilities, merge them with changes to workflow on both the healthcare and the life sciences sides, and pull those two areas together to deliver care specific to an individual. This is very different from treating a large cohort of all diabetes patients or all breast cancer patients in exactly the same way.

Personalized medicine really involves two different perspectives. The first is on the genomics side, where the patient’s genome becomes an attribute of the care pathway: it is compared against a reference genome to determine what is different about the patient as an individual, or how their tumor genome differs from their normal DNA. The second is the population health aspect of personalization: really understanding all of the data that is available in patient records, whether structured or unstructured, and then developing care plans specific to that individual, for example by micro-segmenting a population to take into account comorbidities and socio-economic factors with the help of advanced analytic tools.
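To give a flavor of what micro-segmentation can look like in practice, here is an illustrative sketch (not a production method): it clusters a handful of made-up patient records on age, comorbidity count, and a deprivation index using k-means, and each resulting segment could then be given its own tailored care plan.

```python
# Illustrative micro-segmentation sketch: cluster patients on a few features.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# columns: age, number of comorbidities, deprivation index (0 = least deprived)
patients = np.array([
    [34, 0, 1], [71, 4, 7], [65, 3, 6], [29, 1, 2], [80, 5, 8], [55, 2, 4],
])

X = StandardScaler().fit_transform(patients)
segments = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

for seg, row in zip(segments, patients):
    print(f"segment {seg}: age={row[0]}, comorbidities={row[1]}, deprivation={row[2]}")
```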

 

Safety opportunities

There was a recent article in the Journal of Patient Safety[1] stating that there may be more than 400,000 preventable premature deaths per year in hospital settings. Furthermore, serious harm that does not result in death may occur 10 to 20 times more often. For example, big data and analytics are being used to help identify and diagnose sepsis earlier, so that it can be treated more effectively and at lower cost to the payer and provider.


A great example of using wearables to better understand disease progression is the work that Intel is conducting in partnership with the Michael J. Fox Foundation for Parkinson’s research (see video above). Individuals wearing specialized devices will be tracked around the clock; observations will be recorded 300 times a second and all information will be stored in the cloud. What this means for researchers is that they will go from evaluating a few data points per month to observing 1 gigabyte of data every day.

 

By analyzing the data that is already available, adding wearables, and improving the velocity of analysis, there are many opportunities to improve patient safety using some of these tools.

What questions about clinical analytics do you have? How are you using data in your practice or organization?
 


[1] James, John T. PhD. “A New, Evidence-based Estimate of Patient Harms Associated with Hospital Care.” Journal of Patient Safety (2013): http://journals.lww.com/journalpatientsafety/Fulltext/2013/09000/A_New,_Evidence_based_Estimate_of_Patient_Harms.2.aspx
