By Tom Foley, Director, Global Health Solution Strategy, Lenovo Health

 

Patient engagement is a top-of-mind subject for healthcare providers. After all, successful patient engagement usually results in better outcomes for everyone involved in care. For providers, it can mean a more comprehensive view of patients’ health and condition stability, as well as improved patient satisfaction; for patients, it can mean more confidence in their improvement, in their sense of care, and ultimately in their health.

 

Even with the industry’s sights set on the promise of patient engagement, many healthcare providers have found it difficult to connect with patients in a meaningful way. While widespread consumerization of healthcare has not yet taken hold, there are still some compelling examples of organizations that are moving toward meaningful patient engagement, which can easily serve as a form of encouragement for those still seeking direction.

 

Recently, athenahealth collected data from millions of portal visits and more than 2,000 clients, hoping to “nudge medical staff in the interest of better efficiency and performance.” Their findings could be equally helpful to other providers, with key takeaways such as:

 

Age is not necessarily a pain point. Many providers are tempted to point to their diverse portfolio of patients when examining troubles with patient engagement. While older individuals are sometimes slower to adapt to new norms and less comfortable with technology, athenahealth has found that, in terms of patient portals, age is not a limiting factor. Specifically, the study found that “patients in their 60s register for portal accounts at the same rate as those in their 30s, 40s, and 50s” and “patients between 70 and 79 use portals at roughly the same rate as twenty-somethings.” Underscoring age as a non-issue, athenahealth points out that “older patients sign into their portal accounts significantly more often than younger patients.”

 

Specialty providers can benefit from portals. Athenahealth found “significant” adoption of patient portals from specialty practices. They name “tremendous reductions in staff work from engaging patients electronically” and “better adherence to perioperative protocols” as specific successes.

 

Defaults encourage engagement. Providers should not be hesitant to sign patients up for portals automatically – leaving them with the option to opt out. Athenahealth found that practices using this approach “see a much higher adoption rate than practices that ask patients to register for the portal on their own.”

 

As an industry, we know that patient engagement is more than complying with meaningful use requirements – we are moving forward with better strategies to engage patients and with the intention of creating a better-coordinated care experience overall. Portals are not the only tool available, but they remain a great place to start. As patient engagement continues to challenge providers, it’s important to follow the latest studies and their results, such as the one from athenahealth. These insights are a great way to glean best practices, compare approaches, and see what an improved patient engagement strategy can look like for your organization. By sharing findings, opinions, successes, and failures, we can, together, reinforce what really matters: improving overall care and keeping patients the top priority.

 

 

References:

  1. “Using Data to Increase Patient Engagement in Health Care,” Harvard Business Review, June 30, 2015.

The seventh annual mHealth Summit kicks off Nov. 8 with the theme, Anytime, Anywhere: Engaging Patients and Providers.

 

This event, which is now part of the HIMSS Connected Health Conference, has come a long way since the first show in 2008, when the focus was primarily on devices. Since then, the meaning of “mobility” has evolved, with mHealth now driving a swelling tide of innovation that will ultimately fill the data hole that has vexed physicians for millennia: How is my patient doing at home? We’re seeing bits and pieces of the innovations and use cases that will make this possible—from footstep-monitoring smartphones to wearable biosensors for Parkinson’s patients. But these just hint at the disruptive changes that lie ahead.

 

Where are we heading, and how do we get there? That’s what we are looking forward to sharing next week as several experts on our team will be presenting ideas on the next phase of mHealth technology.

For example, here are just a few of the activities on the agenda:

 

  • The Intel booth (#703) will be a hub of activity showcasing many leading technologies, such as: a Patient Room of the Future, remote patient monitoring systems, Intel® RealSense™ cameras for wound care, portable teleclinics, IoT platforms that record, store and analyze data, plus the newest mobile devices featuring Windows* 10.
  • I will be delivering a keynote address on Tuesday at 9:30 a.m. in the Potomac Ballroom on the main stage and will be looking at mHealth’s growing role in enabling connected and proactive care wherever you are, tailored to you, your data, and your care community. Plus, I’ll be sharing some innovations that are moving us from wearable to implantable health technologies.
  • Our Cybersecurity Kiosk (#4) will feature our security expert, David Houlding, covering security maturity models and online tools. Come with your questions and have a chat with him.
  • The Connected Health Pavilion is where you can learn about big data analytics and see the Basis Peak™ watch in action in CH-3.

 

Mobility is a given as today’s healthcare continues expanding beyond institutions into more home-based and community care settings. Mobile technology offers the promise to help busy clinicians improve quality of care and efficiency of care delivery. Intel powers mobile devices with modern user interfaces for on-the-go clinicians, while helping IT departments meet the challenges of managing and securing devices in a “bring your own device” (BYOD) world.

 

We are looking forward to mHealth Summit and encourage you to follow @IntelHealth on Twitter for live tweets from the show floor and the Intel presentations.

 

What are you most looking forward to seeing and learning about at mHealth Summit?

 

John Sotos, MD, is the Worldwide Medical Director at Intel Corporation.

Now in its seventh year, the mHealth Summit is taking its “innovation” theme to a new level with a few surprises in 2015. First and foremost, the summit is taking place a month earlier than usual – this year’s event will be held November 8-11 at the Gaylord National Resort and Convention Center in Washington DC.  (Does this mean we won’t get to see the annual Christmas display in the Gaylord’s voluminous atrium?)

 

Second, it’s now part of the HIMSS Connected Health Conference, which also includes the Global mHealth Forum, the Population Health Summit and the CyberSecurity Summit – all for the price of a single registration.

 

And if that’s not enough, attendees can add on registration for seven additional co-located one-day conferences and events covering wearables, games for health, venture capital, and start-ups.

 

Let me confess: the mHealth Summit is one of my favorite annual events in the healthcare space. Attendees and exhibitors have a missionary zeal, and they embrace their roles as evangelists. And I love the tech on display — both the hardware and the apps. It reminds me a lot of CES, the big consumer electronics show in Vegas, except on a more intimate scale, and confined to a topic I’m passionate about.

 

Where are the providers?

But in my view, there’s always been one thing missing from the mHealth Summit – the providers. Last year, only 14 percent of attendees came from provider and payer organizations. While this represents the second-largest demographic group at the summit, mixing payers and providers together masks the actual attendance of CIOs, IT directors, and other technology professionals from hospitals and large practices.

 

When the Healthcare Information and Management Systems Society acquired the event in 2012, many industry observers expected HIMSS members (today, some 61,000 strong) to fill that gap. Yet even three years in, the exhibit floor and education sessions still seemed the sole province of technology developers, consultants, consumer advocates, government officials (it is in DC, after all) and a handful of independent, forward-thinking clinicians.

 

Eric Wicklund, editor of mHealth News (owned by HIMSS Media), says the acquisition may simply have been ahead of its time. In 2012, CIOs were in the midst of certifying EHRs, trying to get a handle on meaningful use, and dreading the implications of a change-over from ICD-9 to ICD-10. There just wasn’t the bandwidth to think about mHealth.

 

Wicklund also suggests that a conference between Thanksgiving and the end of the year was a difficult pill for many CIOs and practitioners. “If your goal was to attract providers, it just seems like a hard time to hold a conference,” he said, pointing to the holidays and the wintry weather.

 

Three reasons to reconsider

Wicklund thinks that in addition to the broader focus of the conference and the change in dates, there are good reasons to believe this year’s event will appeal to technologists and clinicians.

 

#1 - Connected health implies provider participation. Says Wicklund, “HIMSS identified three hot-button issues for providers – mHealth, cyber security and patient engagement. It’s not about mobility anymore — it’s connected care, and providers have a real stake in that.”

 

#2 - Greater focus on provider use cases. Providers might want to look at the clinical care track, which features a number of case studies from some of the country’s leading hospitals and health systems, including the University of Pittsburgh Medical Center and Massachusetts General Hospital.

 

#3 - Breaking news on provider issues. Here’s another reason why CIOs might want to make plans to attend this year’s event – the Office of the National Coordinator has agreed to hold a “fireside chat” on the recently released Interoperability Roadmap. Also at the Monday, Nov. 9 gathering, ONC officials are likely to detail the latest meaningful use final rule.

 

In a recent interview, HIMSS CEO Steve Lieber provided a strong justification for why CIOs should invest more time looking at and understanding mHealth, wearables and the Internet of Things.

 

“Healthcare is entering an era of the consumer and with that comes new challenges and opportunities,” Lieber told mHealth News. “More than ever before consumers are taking greater control of their health and healthcare. To support that trend, providers need to provide a different level of connectivity between the clinician and the consumer. Mobile health, population health and information security are critical components in achieving this new level of electronic connection.”

 

I think Lieber is spot-on. Connected health is being driven by consumerism, and healthcare providers who aren’t able to respond to consumer demands for interactive and interoperable data, new payment options, self-serve scheduling and greater convenience will be left behind. Taking care of the infrastructure and managing internal applications used to occupy the full attention of provider-based technologists, but in today’s world, that is simply not enough. As care spills over the hospital wall (think about preventing unnecessary readmissions as just one example), the technology will follow.

 

Do you plan to attend the mHealth Summit this year? Why or why not? Let me know in the comments or send me a tweet at @jbeaudoin.

By Asha Nayak, MD, Ph.D.

 

This week, I was thrilled to join Intel CEO Brian Krzanich on stage during his keynote at Oracle* Open World to describe the great potential of big data analytics in healthcare. I talked about the critical role of tools like Intel’s open-source Trusted Analytics Platform (TAP) in accelerating medical discovery and impact for patients.

 

As a practicing clinician and Intel’s Global Medical Director, I am especially excited about the ability to monitor patients outside of the clinic using wearable, at-home, and mobile devices. In addition to their usefulness in monitoring, many of these devices are two-way — creating an avenue to reach individuals in real time with custom recommendations. This has very broad implications well beyond the examples around cardiovascular and Parkinson’s studies I cited in the keynote.

 

For outwardly healthy individuals, wearable and mobile can help clinicians detect a variety of conditions earlier. For example, today many people have intermittent atrial fibrillation that they learn of only after arriving at the hospital with their first stroke. Imagine if we could detect this silent and asymptomatic condition at home, in time to prompt a person to get early treatment and prevent that first stroke.
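To make that concrete, here is a toy sketch (with invented thresholds; this is not a clinical algorithm) of how a wearable’s inter-beat intervals might be screened for the kind of irregularity that warrants a clinician’s follow-up:

```python
# Toy illustration only: flag possible rhythm irregularity from
# wearable inter-beat intervals (milliseconds). The method and
# threshold are illustrative assumptions, not a validated screen.

def rmssd(intervals_ms):
    """Root mean square of successive differences, a common
    beat-to-beat variability measure."""
    diffs = [b - a for a, b in zip(intervals_ms, intervals_ms[1:])]
    return (sum(d * d for d in diffs) / len(diffs)) ** 0.5

def flag_irregular(intervals_ms, threshold_ms=100):
    """Very rough screen: highly variable successive intervals can
    warrant a closer look (e.g., a clinician-ordered ECG)."""
    return rmssd(intervals_ms) > threshold_ms

steady = [800, 810, 795, 805, 800, 798]    # regular rhythm
erratic = [800, 550, 1020, 640, 980, 700]  # erratic rhythm

print(flag_irregular(steady))   # False
print(flag_irregular(erratic))  # True
```

The point is not this particular heuristic but that the raw signal already exists at home; validated analytics on top of it are what turn it into an early warning.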

 

Another example is preeclampsia, which affects 2 percent to 18 percent of pregnancies, depending on where you are in the world. It is a progressive condition that threatens the lives of both mother and fetus. One of its first signs is rising blood pressure. Earlier detection and treatment are known to improve outcomes for both mother and child. These are two of many examples of how we can prevent suffering, improve outcomes, and potentially lower costs by detecting disease earlier.

 

It’s very important to note that to be effective in these types of applications, monitoring devices must be accurate (especially in the hands of untrained users) and must be validated together with their back-end analytics.

 

For individuals with chronic disease, there is also great value in helping people tailor how they manage their unique symptoms and disease progression. Technology can help to guide a patient within the bounds of a physician’s prescription/instructions. For example, patients with asthma treat themselves with prescribed oral and inhaled medications when symptoms occur.

 

Imagine if, using biometric and environmental sensors, real-time analytics, and push alerts, an asthma patient could be alerted to the risk or early onset of a flare before symptoms begin. Early intervention has the potential to help patients avoid or reduce the severity of a flare and reduce the amount of medication needed to control it. This is an example of how we can better treat conditions that are already diagnosed.
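As a rough illustration of that alerting idea, the sketch below combines a biometric reading with an environmental one and fires an alert when both look bad. The sensor inputs and thresholds are assumptions for illustration, not drawn from any real product:

```python
# Hypothetical flare early-warning rule. Inputs: peak flow as a
# percent of the patient's personal best, and the local air quality
# index (AQI). Thresholds below are illustrative assumptions.

def flare_risk(peak_flow_pct, aqi):
    """Count risk factors from two simple signals."""
    risk = 0
    if peak_flow_pct < 80:   # below personal-best "green zone"
        risk += 1
    if aqi > 100:            # air unhealthy for sensitive groups
        risk += 1
    return risk

def maybe_alert(peak_flow_pct, aqi):
    """Push an alert only when multiple risk factors coincide."""
    if flare_risk(peak_flow_pct, aqi) >= 2:
        return "ALERT: elevated flare risk - follow your asthma action plan"
    return None

print(maybe_alert(92, 40))    # None - no alert
print(maybe_alert(75, 130))   # alert fires
```

A real system would of course use richer signals and learned models, but even a rule this simple shows how sensing plus real-time analytics can get ahead of symptoms.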

 

My enthusiasm in this area is shared by many, across disciplines, and around the world. In addition to Intel, numerous academic, government and commercial organizations have recognized the value of integrating multiple health data streams, and are investing heavily in discovery from large population datasets. As you can see from this chart, these efforts, representing just a few that are underway today, are diverse both in focus and location:

 

Organization                               Target # of Participants     Status
UK Biobank                                 500,000                      Fully enrolled, dataset growing
Michael J Fox Foundation                   1,000 (US & Europe)          Actively enrolling
UCSF Health eHeart Study                   1,000,000 (worldwide)        Actively enrolling
Stanford & Duke Baseline Study             10,000                       Pilot phase (2015), then expanding
Qatar SIDRA Medical & Research Center      350,000                      Enrolling in 2016
Saudi Genome Project                       100,000                      Planning phase (2015)
US Precision Medicine Initiative           1,000,000                    Planning phase (2015)

 

So to all the researchers out there, this is an amazing time to ask questions we never thought we could answer before. Think about associations between clinical parameters and pretty much anything we can measure today — from behavior to diet to location to genetic composition and more. Bigger and bigger datasets are being integrated. Tools like TAP are making it easier to query these complex information streams, including data generated outside the clinic, and to find answers that I believe will help us live healthier.

 

The entire keynote can be found here with the healthcare discussion beginning at 33:00.

 

Visit intel.com/healthcare to find out more about Intel’s efforts in Health and Life Sciences.

 

Copyright © Intel Corporation, 2015. All rights reserved.


*Other names and brands may be claimed as the property of others. Legal Notices

While measuring is a requirement for process improvement, it's not the end goal. Identifying the necessary reports, in and of itself, does not automatically translate to an improved patient experience, as revealed in February 2015 by two JAMA articles. Both failed to find a significant difference between hospitals that had implemented the American College of Surgeons’ National Surgical Quality Improvement Program (NSQIP) and those that had not. As it turns out, the act of adopting frameworks to measure and report does not change the rate of improvement.

 

Donald M. Berwick, MD, MPP writes in an accompanying JAMA editorial:

 

“The authors of both reports in this issue of JAMA struggle to explain their findings, troubled as any sensible person must be by the suggestion that knowing results would not help caring, committed clinicians and organizations improve their results.”

 

In order to improve, individual behavior needs to change. Here are three essential report “qualities” to better connect measurement with behavior change, and thus improve the patient experience.

 

1. Real-Time

While hospitals and surgery centers may have access to "reports", not all reports are equal. Timing is everything, and receiving monthly reports is less than ideal when trying to improve process. When available any time, however, analytics provide a mechanism for immediate accountability and drive more intelligent behavior change. At the end of every day, we need to easily evaluate our performance and, where necessary, make changes for the next day.

 

With real-time analytics, managers and directors are informed with objective data, allowing them to intervene before emotions and incorrect perceptions create division or erode culture within the team. What was today's on-time start rate? Why were cases late? How many patients waited over an hour past their scheduled start time? What was the average turnover time in OR 6 today? With real-time analytics, managers are able to approach surgeons before there’s an opportunity to complain. This type of information, delivered in an easily consumable format, empowers directors to have confidence that their data-driven decisions are supported in real time. And that's a big deal for tomorrow’s patient. There's no waiting for improvement.
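The end-of-day questions above can be answered directly from case records. A minimal sketch, with field names assumed for illustration (times are minutes from midnight):

```python
# Hypothetical OR case records for one day. Field names are
# assumptions for illustration, not any EHR's actual schema.
cases = [
    {"room": "OR 6", "scheduled": 450, "started": 450, "ended": 540},
    {"room": "OR 6", "scheduled": 560, "started": 585, "ended": 660},
    {"room": "OR 2", "scheduled": 450, "started": 455, "ended": 520},
]

def on_time_start_rate(cases, grace_min=5):
    """Fraction of cases starting within a grace period of schedule."""
    on_time = sum(1 for c in cases if c["started"] - c["scheduled"] <= grace_min)
    return on_time / len(cases)

def turnover_times(cases, room):
    """Gaps between one case ending and the next starting in a room."""
    ordered = sorted((c for c in cases if c["room"] == room),
                     key=lambda c: c["started"])
    return [b["started"] - a["ended"] for a, b in zip(ordered, ordered[1:])]

print(f"On-time start rate: {on_time_start_rate(cases):.0%}")  # 67%
print("OR 6 turnover (min):", turnover_times(cases, "OR 6"))   # [45]
```

The value is less in the arithmetic than in the delivery: computed continuously and surfaced the same day, these numbers support the conversations described above.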

 

2. Granular, Interactive, and Individually Comparative

Hospitals and surgery centers may understand how they perform as a group, but improving that performance is traditionally limited to revising policy and communicating new goals to large groups. This leaves improvement opportunities hidden and the patient experience stagnant.

 

Two anecdotal examples highlight the limitations of traditional reports. First, consider a group whose on-time antibiotic performance is 99.64 percent – a typical result. It's worthy of accolades and difficult to improve – until one compares the individuals. By comparing individual performance within this “99.64 percent”, the narrative changes and exposes new opportunities to reduce complications and cost. Comparing providers reveals everyone was at 100 percent except for two providers, who were at 72 percent and 83 percent respectively. It is these providers who are putting their patients at increased risk for infection – raising costs and lowering patient satisfaction. It is these two providers who need to change their behavior in order to improve the overall performance from 99.64 percent to 100 percent. Traditional reports lack this level of granularity, and thus hide the opportunity for improvement and prolong the status quo.

 

In a second example, consider the overall incidence of complications for a given practice is 12 per 3,200 cases, or 0.38 percent. Again, this result may be cause for inaction as this is “acceptably low”. However, comparing individuals reveals that 7 of those 12 complications involve the same provider. Is there cause for concern? Is there a need to intervene so that tomorrow’s patients are safer? There are many more such examples. Individually comparative reports simply open new opportunities for process improvement and improved patient experience.
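Both examples come down to the same computation: an aggregate rate can look excellent while per-provider rates expose outliers. A small sketch with made-up data mirroring the first example:

```python
# Illustrative data: (provider, antibiotic_on_time) pairs for a run
# of cases. Numbers are invented to mirror the scenario in the text.
from collections import defaultdict

records = [("A", True)] * 500 + [("B", True)] * 400 + \
          [("C", True)] * 18 + [("C", False)] * 7   # C is the outlier

def rates_by_provider(records):
    """Per-provider on-time rate, the 'granular' view."""
    hits, totals = defaultdict(int), defaultdict(int)
    for provider, on_time in records:
        totals[provider] += 1
        hits[provider] += on_time
    return {p: hits[p] / totals[p] for p in totals}

overall = sum(on_time for _, on_time in records) / len(records)
print(f"group rate: {overall:.2%}")           # looks excellent
for p, r in sorted(rates_by_provider(records).items()):
    print(f"provider {p}: {r:.0%}")           # exposes provider C
```

One `group by provider` is all it takes, which is exactly why reports that stop at the aggregate hide the improvement opportunity.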

 

3. Distributed

Fundamental to successful use of informatics is getting the right information, at the right time, to the right person. Analytics need to go further than senior leadership or department director levels and arrive in the hands of the people who are providing the care. Too often individuals are identified by managers only when either “something is wrong” and corrective action is warranted or when “exemplary behavior” has been documented. Outside of these two extremes, the vast majority of providers remain idle in a world of “average mediocrity”, defensive towards a suggestion they need to change.

 

For transformational change, "mediocrity" needs to improve to "excellence" because there’s real value for patient outcomes and the hospital’s bottom line when improving a "78 percent" to "96 percent". For such change though, individuals need to be empowered with tools to measure themselves, compare themselves, and improve themselves - without fear of being negatively labeled by superiors. Truth is, we all have opportunity for improvement, and such opportunity must not be equated with a state of individual failure. Instead, measurements must be made available down to the individual level so that every provider can better reach their own full potential.

 

For transformational improvement, we need more than just reports. The collected information must be convenient, accessible, and meaningful - designed to help individuals change their own behavior. Real-time, individualized, and distributed analytics are key report qualities that will enable measurements to impact the patient experience through decreased complications, decreased costs, and an overall improved experience.

 

What questions do you have?

Big Data is a hot topic across the entire healthcare ecosystem, so I was happy to join a panel of experts in the first of a series of Frost and Sullivan webinars designed to help healthcare organizations better understand how they can make use of the data they have. The first webinar in the series is now available to view on demand via the Frost and Sullivan website. The second webinar continues the Big Data theme with a focus on predictive analytics in healthcare; please register now for this must-see session on November 5th.


I was joined by Tod Davis, Manager of BI and Data Warehousing at Children’s Healthcare of Atlanta and Amandeep Khurana, Principal Solutions Architect at Cloudera. I started the webinar by providing an overview of the current state of the healthcare industry in relation to data use, where many disparate data types are held in silos that limit the value considerably. At Intel we’re helping healthcare organizations move towards integrated computing and use of unified data to help deliver better patient outcomes and reduced costs.

 

An inspirational presentation from Tod Davis highlighted the outstanding work he and colleagues at Children’s Healthcare of Atlanta have been undertaking, including the benefits they are reaping from focusing on making use of data. I’d highly recommend listening to Tod for his take on:

 

  • Lessons learned in 3 years with Hadoop
  • Use cases including retinopathy of prematurity and using vital signs to monitor pediatric stress
  • An overview of ecosystem tools and high-level data pipeline solution

 

Amandeep Khurana from Cloudera shared his thoughts on how Cloudera is helping to drive the big data ecosystem. Amandeep’s presentation sparked a number of great questions at the end of the webinar from viewers and I think you’ll find the answers of particular interest. Watch webinar 1 now and share with your own professional network.

 

Webinar 2 focuses on Predictive Analytics in Healthcare and highlights use cases from the field, including what is sure to be an informative session with Vijay Venkateson of Sutter Health on November 5th 2015. Join us by registering today for webinar 2 to receive the latest updates and reminders for this series.

 

- Register for Webinar 2 now

- Watch Webinar 1 on demand

- Intel’s role in Big Data in Healthcare

Obesity, diabetes, heart disease…these are not just headlines – statistically, they’re coming to you or someone you love. Caring about ourselves includes caring for our bodies, but that can be difficult without feedback.  Stop eating or drinking, and your body will soon let you know. But eat something healthy or unhealthy, exercise or don’t exercise, sleep too little or too much, and you may not feel any significant effects for a while.

 

Measuring the Invisible

 

How do we measure our health? Our weight? How out of breath we are when we walk up the stairs? Numbers gathered once a year from a blood test? Health doesn’t happen at the doctor’s office. It happens with your every heartbeat and daily decisions of diet and exercise.

 

With our busy days, it’s easy to fall prey to convenient behaviors over healthy ones. What’s for dinner? Do I have the time to walk/run/bike? Stairs or elevator? Watch that show or go to sleep? We don’t want to become neurotic about this, but how can we instill healthy habits in ourselves with more timely rewards than our natural senses provide?

 

Analyzing Peak Data

 

I’m excited about our new partnership with Big Cloud Analytics. In 2014, Intel acquired Basis Science, which makes the Basis Peak, arguably the most accurate fitness tracker on the market today. It can measure your resting and exercising heart rate; light, REM, and deep sleep; steps; calories burned; and something called galvanic skin response, which, among other things, can indicate your stress level. The data from the Peak has been shown to be within about 4 percent of that provided by a Holter-monitored stress test, and within 2 percent of that provided by a sleep study. Great, but how does that change my habits to make me healthier?

 

That’s where Big Cloud Analytics comes in. Their COVALENCE Analytics Platform, running on Intel® Xeon® servers, can analyze the approximately 72,000 data points coming off your Basis Peak watch every day. Add to that data coming from the population, weather data, social media data, and a whole host of other metrics.

 

With the magic of statistics, the BCA Dashboard gives you a score of your heart health. We all know the heart is a muscle. Our heart rate at rest, during sleep, when we exercise, and when we rest after exercise gives a pretty clear picture of how that muscle is performing. Wear the watch regularly (I change wrists for sleeping), and you are being tested every moment of every day.
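As a toy illustration of how continuous heart-rate data might become a comparative score (the actual COVALENCE scoring method is not public, so the approach and numbers below are invented):

```python
# Illustrative only: estimate resting heart rate from a day of
# samples, then place it within a cohort. Not BCA's actual method.

def resting_hr(samples_bpm):
    """Estimate resting heart rate as the mean of the lowest decile
    of a day's readings."""
    low = sorted(samples_bpm)[: max(1, len(samples_bpm) // 10)]
    return sum(low) / len(low)

def percentile_vs_population(value, population):
    """Share of the cohort with a higher resting HR than ours
    (lower resting HR generally indicating better fitness)."""
    return sum(1 for v in population if v > value) / len(population)

day = [58, 60, 62, 65, 70, 72, 75, 80, 95, 110]    # one day's samples
cohort = [55, 60, 62, 64, 66, 68, 70, 72, 75, 80]  # same age/gender group

mine = resting_hr(day)
print(f"resting HR ~ {mine:.0f} bpm")
print(f"better than {percentile_vs_population(mine, cohort):.0%} of cohort")
```

Multiply this by thousands of daily samples and dozens of metrics, and the "tested every moment of every day" idea above becomes a running scoreboard rather than an annual checkup.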

 

Anonymizing Population Data

 

Whoa – sounds a bit nosy. What about my privacy? Data from our watches goes to secure Basis servers; then, when we opt in for the COVALENCE Dashboard, our IDs are mapped to a new random ID and password, and a device. This way, we and our data are no longer associated. This is called double-blinding. With double-blinding, our data is available anonymously as a member of the population. This lets me see where I fit within the population without sharing my personal numbers. It also might let my employer (who is interested in me being my healthiest, to maximize productivity and minimize corporate health insurance premiums) offer me targeted information without knowing my personal identity.
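A sketch of that ID-mapping idea, with the real-to-random mapping kept in a separate, secured store so the analytics tier never sees identities. This is illustrative only, not Basis’s or BCA’s actual scheme:

```python
# Illustrative pseudonymization: replace the account ID with a random
# token before records reach analytics. The mapping lives apart from
# the analytics data, so analysts see only anonymous population data.
import secrets

class Pseudonymizer:
    def __init__(self):
        self._mapping = {}   # real ID -> random ID (kept secured, separate)

    def pseudonym(self, real_id):
        """Stable random token per user, minted on first sight."""
        if real_id not in self._mapping:
            self._mapping[real_id] = secrets.token_hex(16)
        return self._mapping[real_id]

    def strip_identity(self, record):
        """Emit an analytics-safe copy keyed only by the pseudonym."""
        safe = dict(record)
        safe["user"] = self.pseudonym(safe.pop("user"))
        return safe

p = Pseudonymizer()
record = {"user": "jane@example.com", "resting_hr": 58, "steps": 9400}
safe = p.strip_identity(record)
print(safe["user"] != "jane@example.com")  # True: identity removed
print(safe["resting_hr"])                  # 58: the data is preserved
```

Because the token is stable per user, population analytics can still follow an individual’s trend over time without ever knowing who that individual is.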

 

Analytics Influencing Positive Change

 

So now combine our heart rate data with activity, sleep data, stress data, our age, height, and local weather; then add a connected digital scale and blood pressure cuff, and the BCA Dashboard can identify how we compare to others based on gender, age, location, and more.

 

If we do the same exercise on a regular basis, anything from a walk, run or bike, or even a walk up a couple flights of stairs, we can see performance improve (or get worse). If we can view sleep score every night, maybe we can tie changes to our diet, the thermostat setting, or our activity level. Maybe we don’t want to be athletes, but would we and our loved ones want us to move our heart fitness ages lower?

 

For this work tying together wearable and exogenous (beyond biometrics) data, BCA and Intel recently won the 2015 Leadership in Healthcare IoT Award from the Employer Healthcare & Benefits Congress – they realize our health impacts productivity at work and our health insurance rates, as well as our quality of life. Opportunities abound for gamification and incentives to help make fitness fun.

 

Connecting your Health

 

Analytics is at the center of the consumerization of Healthcare (see also US News article: http://health.usnews.com/health-news/patient-advice/articles/2015/08/14/how-big-data-is-driving-the-consumerization-of-health-care ). In the near future, you’ll be seeing additional ways to assess your personal health.

 

Companies are working on mobile devices that continuously evaluate the function of our bodies: blood pressure, non-invasive glucose, lung function, and more. When you or your loved ones are sick, the doctor will be able to ‘see’ you via your computer and devices, enabling them to remotely hear your heart and lungs and see inside your throat and ears, for example. And whether well or sick, analytics will assist in your assessment.

 

We could call it ‘Biofeedback 2.0’ – add individual and population analytics to accurate wearables, and we’ve got the makings of a personal health revolution.

 

 

(mathew.h.taylor@intel.com)

Next-Generation Sequencing (NGS) technologies are transforming the bioinformatics industry. By sequencing whole human genomes at rates of up to 18,000 per year—an average of one genome every 32 minutes—new sequencers have broken the $1,000 genome barrier[1] and opened the door to population studies and clinical usage models that have not been possible before.

 

Of course, high-volume sequencing generates an enormous amount of data that must be analyzed as fast as it is produced. According to Illumina, it would take an 85-node high performance computing (HPC) cluster to keep pace with its top-of-the-line HiSeq X™ Ten sequencer operating at full capacity.[2]

 

Drive Down the Cost of Analyzing Your Genomes

Working together, Qiagen Bioinformatics and Intel have developed a reference architecture for a 35-node cluster based on the Intel® Xeon® processor E5 v3 family that meets these same performance requirements, while reducing total cost of ownership (TCO) by as much as $1.4 million over four years. [3] Depending on sequencing volumes and data center efficiency, this solution could enable full analysis of whole human genomes for as little as $22 each.
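A quick back-of-the-envelope check of that per-genome figure. The TCO and annual volume below are illustrative assumptions; the actual analysis is in the cited Qiagen/Intel work:

```python
# Illustrative arithmetic only: divide an assumed 4-year cluster TCO
# by genomes analyzed over the same period. Inputs are assumptions.

cluster_tco = 1_600_000       # assumed 4-year TCO, dollars
genomes_per_year = 18_000     # full published HiSeq X Ten output
years = 4

cost_per_genome = cluster_tco / (genomes_per_year * years)
print(f"${cost_per_genome:.2f} per genome")  # about $22
```

This is why the "as little as $22" figure is sensitive to sequencing volume: run the sequencer at half capacity and the per-genome analysis cost roughly doubles.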

 

The Qiagen Bioinformatics and Intel reference architecture uses CLC Genomics Server with the Biomedical Genomics Server extension, which is highly optimized for Intel architecture. The Biomedical Genomics Server solution provides all the advanced tools and capabilities of CLC Genomics Workbench and the Biomedical Genomics Workbench, but is designed specifically for HPC clusters. Geneticists benefit from a powerful workbench, high quality results, and intuitive interfaces that insulate them from the complexities of cluster computing.

 

Manage Your Data on Massively Scalable, Centralized Storage

Fast, scalable storage is as important as cluster performance for NGS. The reference architecture includes a 165 TB storage solution based on Intel® Enterprise Edition for Lustre*. Intel packages this open source software with powerful management tools and offers 24/7 support to help organizations manage and protect their data and maintain high reliability and uptime.

 

This centralized storage system uses high-capacity commodity disk drives to keep costs low, plus a small number of Intel® Solid State Drives (Intel® SSDs) to accelerate the operations that are most critical for fast genome analysis. Like the compute cluster, the storage system is designed to scale on-demand, so you can accommodate rapid growth in a straightforward and cost-effective manner.

 

Lay the Foundation for Next-Generation Breakthroughs

Today’s powerful NGS technologies will help scientists, labs, and clinics deliver the next wave of scientific and medical innovation. A fast, scalable, and affordable analytics solution can simplify your journey and help keep your costs under control.

 

Read the story on the Qiagen Blog

 

Learn more at:

 

[1] Based on the published output capacity of the Illumina HiSeq X Ten next-generation sequencer. http://www.illumina.com/content/dam/illumina-marketing/documents/products/datasheets/datasheet-hiseq-x-ten.pdf

[2] Source: A workflow for variant calling based on BWA+GATK in the HiSeq X™ System Lab Setup and Site Prep Guide (Part # 15050093 Rev. H, July 2015). The current version for September 2015 can be found at: https://support.illumina.com/content/dam/illumina-support/documents/documentation/system_documentation/hiseqx/hiseq-x-lab-setup-and-site-prep-guide-15050093-01.pdf

[3] Based on internal performance tests and a total cost of ownership analysis performed by Qiagen Bioinformatics and Intel. Performance tests were conducted on a 16-node high performance computing (HPC) cluster. Each node was configured with 2 x Intel® Xeon® processor E5-2697 v3 (2.6 GHz, 14 cores), 128 GB memory, and a 500 GB storage drive. All nodes shared a 165 TB storage system based on Intel® Enterprise Edition for Lustre, with 256 TB of 7.2K RPM NL-SAS disk storage and 4 x 800 GB Intel® Solid State Drive Data Center S3700 Series, supported by an Intel® True Scale™ 12300 36-port QDR InfiniBand switch and 2 x Intel® True Scale™ single-port HCAs (QDR-80 configured by default). The TCO analysis was performed using an internal Intel tool and publicly available product pricing and availability as of October 9, 2015. The TCO for the test cluster was estimated over a 4-year period and compared with the estimated TCO of an 85-node cluster, as described in the Illumina HiSeq X System Lab Setup and Site Prep Guide, Document # 15050093 v01, September 2015. https://support.illumina.com/content/dam/illumina-support/documents/documentation/system_documentation/hiseqx/hiseq-x-lab-setup-and-site-prep-guide-15050093-01.pdf. To quantify the TCO comparison, specific products were chosen that would fulfill the general specifications defined within the Illumina guide. Support costs for both systems were estimated as 60 percent of TCO. The performance and TCO results should be used only as a general guide for evaluating the cost/benefit or feasibility of future purchases of systems. Actual performance results and economic benefits will vary, and there may be additional costs related to the use and deployment of the solution that are not accounted for here.

This blog is written by a colleague, Robert Sugar, who is a Software Architect at Intel Health and Life Sciences and features some very exciting advances for those who work in the genomics field, particularly as we gather at BioData World Congress, in Cambridge, UK, to hear how organisations across the world are advancing precision medicine. If you'd like to discuss an aspect of this blog you can find Robert's LinkedIn details at the end of his writing.

--------------

 

By Robert Sugar

With an aspiration of all-in-one-day genome sequencing bringing precision medicine to reality, I wanted to share some of the work Intel has been undertaking with partners to speed up an important part of the sequencing process ahead of BioData World Congress 2015. Like most good stories, this one starts at the beginning: the mapping phase of DNA sequencing analysis, which prepares sequence alignment files for variant calling in sequencing pipelines.

 

From Multiple to Single Pass

In a typical genomic pipeline, the mapping phase is followed by a number of preparation tasks, such as marking duplicates, sorting and filtering, which usually require multiple passes by different tools. Calling multiple command line tools numerous times means repeated I/O between the steps and multiple passes over the same data, which may have been only incrementally modified. Moreover, many of these tools (such as the Picard tool recommended by the gold-standard GATK workflow) utilize only a single CPU thread. As a result, more time is often spent in sorting and filtering than in variant calling itself. This not only slows down the entire process of sequencing a genome but also has financial implications.

 

Intel, alongside IMEC and in collaboration with Janssen Pharmaceutica (as part of the ExaScience Life Lab), developed elPrep, an open source, high-performance tool for DNA sequence (BAM file) processing that uses a single-pass, parallel filtering architecture. elPrep simplifies not only the computational processes but also the end user’s need to understand them, reducing both time and costs. Additional filters can also be easily added for customer-specific workflows.


Figure 1: Traditional multi-pass BAM file processing (blue arrows) vs. a single-pass elPrep workflow (orange arrows). Source: Charlotte Herzeel (IMEC) and Pascal Costanza (Intel)
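To make the single-pass idea concrete, here is a hedged sketch (this is not elPrep's actual code, and the read records are simplified stand-ins for BAM entries) of composing per-read filters so several preparation steps happen in one traversal, rather than one pass per tool with file I/O in between:

```python
# Illustrative sketch of single-pass filtering (not elPrep's implementation):
# compose per-read filters and apply all of them during one pass over the data,
# instead of re-reading and re-writing the file for each preparation step.

def filter_unmapped(read):
    """Drop reads flagged as unmapped (return None to remove a read)."""
    return None if read.get("unmapped") else read

def add_read_group(read, group="sample1"):
    """Annotate each surviving read with a read-group tag."""
    read["read_group"] = group
    return read

def single_pass(reads, filters):
    """Apply every filter to each read in a single traversal."""
    for read in reads:
        for f in filters:
            read = f(read)
            if read is None:      # read removed by a filter
                break
        if read is not None:
            yield read

reads = [{"name": "r1"}, {"name": "r2", "unmapped": True}]
out = list(single_pass(reads, [filter_unmapped, add_read_group]))
print([r["name"] for r in out])  # → ['r1']
```

Because each read flows through the whole filter chain before the next one is touched, the intermediate results never hit disk, which is where the multi-pass approach loses most of its time.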

 

Meeting Today’s Standards

Throughout the development of elPrep it was vitally important to ensure compatibility with existing tools and datasets, and this has been achieved. elPrep can be used as a drop-in replacement for existing tools today; for example, it is now the standard tool for exome loads at Janssen Pharmaceutica. What truly makes elPrep a fantastic tool for the genomics community, though, is the single-pass, extensible filtering architecture. By merging the computations of multiple steps it avoids repeated file I/O between the preparation steps. Unnecessary barriers are removed, allowing all preparation steps to be executed in parallel.

 

Reducing Time, Increasing Value

It is worth illustrating the impact of this with figures from a real-life exome sequencing project by Janssen Pharmaceutica, as reported in ‘elPrep: High-Performance Preparation of Sequence Alignment/Map Files for Variant Calling (2015)’.1 The runtime of an exome workload can be reduced from around 1 hour 40 minutes to around 10 to 15 minutes. The gains also tend to increase with the complexity of the pipeline. This is a considerable time (and cost) saving when looked at in the context of mapping and analysing whole-genome data.

 

elPrep is an important addition to the toolset of those working in the genomics field. I would draw your attention to the previously mentioned paper, which provides extensive detail on the benefits of elPrep and more information on how it compares to existing SAM/BAM manipulation tools. There is more work to be done as we look ahead to all-in-one-day genomic sequencing, but this is an exciting development. elPrep is available as open source for both academic and industrial customers at https://github.com/ExaScience/elprep and is being integrated into online genomic toolkits, such as DNANexus.

 

Contact Robert Sugar on LinkedIn

 

 

1 Herzeel C, Costanza P, Decap D, Fostier J, Reumers J (2015) elPrep: High-Performance Preparation of Sequence Alignment/Map Files for Variant Calling. PLoS ONE 10(7): e0132868. doi:10.1371/journal.pone.0132868

Today I presented at BioData World 2015 at the Wellcome Trust Sanger Institute, near Cambridge, UK, on the topic of ‘Advancing the Data Science of Precision Medicine’, so I wanted to be able to share some of my thoughts for those who were unable to attend this fantastic event.

 

The Unique Challenges of Clinical, Behavioral and Genomic Data

As Chief Data Scientist for Big Data Solutions at Intel, I’m really pleased to be able to share how I think big data can help to advance healthcare and life sciences. The first question I’m often asked is: ‘What’s the big data challenge in healthcare?’ In simple terms, we need to do a better job of acquiring, ordering, analyzing and then utilizing the myriad data types generated across the healthcare ecosystem. On the face of it this is technically very challenging. I like to think of the different types of data as making up what we call the ‘True State of the Patient’: a computable model of the patient for optimizing care delivery, outcomes and revenue. Think clinical data, behavioral data and genomic data, each of which comes with its own unique challenges when it comes to making meaningful sense of it.

 

Closing the gaps around Clinical Data

Most clinical data sits in Electronic Health Records (EHR) where it’s often assumed that the data is neatly structured and can be analyzed with ease. For the majority of providers that’s not the case just yet. Some of the most important data is sitting in free-form text written by clinicians or in annotated images of scans. There are some great advances being made around Natural Language Processing (NLP) which will help us to make sense of this unstructured data and begin to unlock the value of every EHR.
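As a toy illustration of why this matters, the snippet below shows how structured fields might be pulled from free-form clinical text. The note text is invented and the regular expressions are deliberately crude; production NLP systems use trained models, not patterns like these.

```python
# Minimal, hedged illustration of structuring clinical free text.
# Real clinical NLP uses trained models; these regexes only convey the idea.

import re

NOTE = "Pt presents with hypertension. BP 150/95. Started lisinopril 10 mg daily."

def extract_bp(note: str):
    """Find a blood-pressure reading like 'BP 150/95'."""
    m = re.search(r"\bBP\s+(\d{2,3})/(\d{2,3})\b", note)
    return (int(m.group(1)), int(m.group(2))) if m else None

def extract_doses(note: str):
    """Crude pattern: '<drug> <number> mg'."""
    return re.findall(r"([a-z]+)\s+(\d+)\s*mg", note, flags=re.IGNORECASE)

print(extract_bp(NOTE))     # → (150, 95)
print(extract_doses(NOTE))  # → [('lisinopril', '10')]
```

Even this crude sketch shows the payoff: once the narrative becomes fields, it can be queried and analyzed like the structured parts of the EHR.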

 

Plugging Wearables into the Matrix

Wearable technology is mainstream today: devices are affordable, easy to use, and capture increasingly accurate types of data. The challenges for wider adoption by the healthcare ecosystem remain around confidence, from both patient and provider. How can a clinician be sure that data has been accurately recorded? Do patients have the know-how to interpret the data recorded and the self-confidence to present it to a medical professional? Integrating data from wearables (and sensing technology already used by providers) is critically important to achieving that 360-degree view of the patient.

 

Providing the Platform to bring Lifesaving Discoveries

It’s exciting to see at BioData World this week how organizations are overcoming the technical challenges posed by the enormous volumes of data surfaced when analyzing the genome. Supporting the life sciences through our Big Data Solutions here at Intel means that providers will be able to combine clinical data, behavioral data and genomic data much sooner than we could have imagined even just five years ago. For example, take a look at the work we are doing with Oregon Health & Science University (OHSU) on the Collaborative Cancer Cloud, a precision medicine analytics platform that allows hospitals and research institutions to securely share patient genomic, imaging, and clinical data for potentially lifesaving discoveries.

 

The True State of the Patient is an achievable framework for healthcare across the world. It requires changes to workflows, technologies and perhaps new ways of thinking. Conversation at BioData World gives me confidence that we are on the right path, and that we’re heading down it at ever-increasing speed. If you consider the last five years to have been an exciting ride, hold tight: the next five are going to bring much more.

 

Contact Bob Rogers on LinkedIn

 

The early reports regarding the October 1 transition to ICD-10 have been remarkably positive. In fact, it’s been the absence of reports that has impressed the experts – in that respect, the changeover from ICD-9 has been a lot like the “Y2K bug” issue at the turn of the century. There was enough apocalyptic doom-and-gloom in the last couple of years to ensure that most providers and payers were prepared for even the worst-case scenarios.

 

Now two weeks in, I spoke with Pam Jodock, Senior Director for Health Business Solutions at the Healthcare Information and Management Systems Society (HIMSS), following a review meeting of the HIMSS ICD-10 Task Force. The task force includes providers from both hospitals and practices, as well as payers, consultants and vendors.

 

“We’ve been pleasantly surprised by how few problems have been reported,” Jodock said, summing up the meeting. “It’s too early to fly the victory flag, but anecdotally, we’ve not seen a spike in claims or pended claims from either CMS or other payers.”

 

Meanwhile, task force chair Bonnie Sunday, MD, has promised “to monitor for challenges that arise after the October 1 transition date and [to] make education available on mitigation strategies for problems that may be encountered.”

 

Jodock pointed me to a RelayHealth web site that uses internal data to monitor potential disruptions to claims data. The site is tracking four metrics to measure the impact of the transition: (1) Days to Final Bill, (2) Days to Payment, (3) Denial Rates, and (4) Reimbursement Rate. The data show that all metrics for the first two weeks of ICD-10 are equivalent to, or better than, the three months preceding the transition.
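As a rough sketch of how metrics like these can be computed from claims records (the field names and data below are invented for illustration; this is not RelayHealth's data model):

```python
# Hedged sketch of claims-metric tracking. Field names are made up;
# a real system would pull these from a claims clearinghouse feed.

from datetime import date

claims = [
    {"billed": date(2015, 10, 2), "paid": date(2015, 10, 20), "denied": False},
    {"billed": date(2015, 10, 5), "paid": None,               "denied": True},
    {"billed": date(2015, 10, 6), "paid": date(2015, 10, 30), "denied": False},
]

def denial_rate(claims):
    """Fraction of claims denied by the payer."""
    return sum(c["denied"] for c in claims) / len(claims)

def avg_days_to_payment(claims):
    """Mean days from billing to payment, over paid claims only."""
    paid = [(c["paid"] - c["billed"]).days for c in claims if c["paid"]]
    return sum(paid) / len(paid)

print(round(denial_rate(claims), 2))  # → 0.33
print(avg_days_to_payment(claims))    # → 21.0
```

Comparing these numbers for the weeks after October 1 against the pre-transition baseline is exactly the kind of before/after check the tracking site describes.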

 

Based on both anecdotes and available claims data, Jodock concludes that organizations which used the time between the final rule’s publication in 2009 and this month’s implementation to prepare have experienced “business as usual.”

 

Reports from the field support that assessment.

 

“Many in the press have asked me about the first few days of ICD-10,” John Halamka, CIO at Beth Israel Deaconess Medical Center in Boston, blogged. “The answer for my institution, like many, is that other than a few small refinements, the impact has been unnoticeable.”

 

Stacey McIntosh, an HIM Project Manager at Memorial Hermann Health System, echoed those sentiments in her comments to Healthcare Finance News. “We didn't really expect any major issues day one,” she said at the end of the go-live date. “The few items that we experienced today were relatively minor. We expect bigger bumps in the road as we start coding in ICD-10 and send claims off to payers."

 

But it hasn’t been problem-free. Jodock said some providers with homegrown systems have found the transition challenging. And physicians have reported some glitches which could prove expensive if not addressed.

 

In a blog that appeared in The Medical Practice Insider, Linda Girgis, MD, a family physician in South River, N.J., catalogued a number of “glaring and disruptive” issues she faced in the first week. She worried that commercial payers wouldn’t follow Medicare’s lead on a grace period for accepting unspecified codes as long as they were in the right family. She also said her clearinghouse kicked out all unspecified codes and wouldn’t submit them to payers.

 

“Some insurance on-line sites were updating and unavailable for the first 2 days,” Girgis wrote, noting that her office couldn’t check eligibility on some patients. “Any patient I saw on the first two days of October who we were unable to verify their insurance was treated for free and [there’s] nothing I can do about it. I know some people will say not to see them without this verification but they were sick. What good is a doctor who doesn't treat sick patients?”

 

So now that the actual transition date has passed, what’s next? The American Academy of Family Physicians recommends that providers take the following steps to ensure that transition goes as planned:

 

  • Monitor all claims acknowledgement and acceptance/rejection reports.
  • Promptly correct and resubmit all rejected/denied claims.
  • Evaluate post-implementation cash flow until claims filed with ICD-10 are consistently paid.
  • Evaluate need for contingency activities (e.g., overtime, consultant, credit line).
  • Monitor payer news regarding claims adjudication issues and resolutions.
  • Monitor reimbursement accuracy and timeliness of payer per contract.
  • Conduct coding review for accuracy and compliance.

 

Do you think that the relative smoothness of the transition and the consistency of claims data over the past two weeks portend an equally smooth reimbursement process?

 

What should healthcare CIOs be thinking about when it comes to leveraging big data? In the above clip, Nolan Joiner from MarkLogic explains what healthcare CIOs need to understand about relational database technology and data integration.

 

Watch the short video and let us know what questions you have. Are you using your data to the fullest potential?

According to the most recent HIMSS Leadership Survey, 72 percent of respondents report that consumer and patient considerations will have a major impact on their organization’s strategic efforts over the next two years. In other words, patient engagement, patient satisfaction and quality of care remain center stage in the healthcare industry.

 

This type of thinking also means that, as in other industries like banking and travel, technology will be a major driver in shifting focus and delivering better outcomes and more personal user experiences.

 

It’s no surprise that today’s episodic and reactive healthcare delivery costs too much and compromises patient safety, satisfaction and outcomes. In order to receive care, patients must go to a clinical setting, where decisions often lack collaboration, coordination and continuity and relevant individual and cohort data is often absent. To transcend these challenges, the healthcare IT and medical devices industries will need to seamlessly connect patients, their clinicians and their data to deliver holistic and proactive care wherever they are.

 

That’s why change is coming.

 

Growth of Distributed Care

To succeed and meet the demands of a growing, and changing, patient population, it’s important that the healthcare industry provide virtual and remote innovations that greatly expand coordinated and continuous care delivery options to patients beyond the clinical setting. This requires an interoperable and integrated infrastructure that facilitates distributed care and digitization across locations and at different stages of deployment and sophistication. Consider these recent findings:

 

 

By embracing a more distributed care model using telehealth technology, a revamped healthcare infrastructure would provide a compelling user experience that helps shift clinicians to a new way of interacting with patients and with each other in care decision making and delivery. Smaller, more portable and capable medical devices with better connectivity and interoperability are crucial to providing new options for care delivery that work best for patients and clinicians. Payers, providers and public policy should encourage adoption by rewarding new types of care delivery and collaboration that improve outcomes and drive accountability and flexibility into sector business models.

 

Make it Personal

At Intel, we’re delivering technology solutions that make it possible for patients to receive optimal, personalized care wherever they are. Our clinical analytics and big data tools enable key insights and discoveries for better, more preventive and personalized treatment. New client devices with innovative user experiences deliver faster and richer clinical data flows. Our gateway, wearables, PaaS and security technologies, among others, support a distributed care platform that connects patients and their care teams with faster, easier and secure access to needed information in a variety of care settings. Our work with medical device manufacturers miniaturizes and optimizes their solutions to deliver better care when, where and how the patient needs it. Plus, Intel provides comprehensive security solutions that allow new levels of sharing and collaboration while safeguarding individual and institutional data.

 

The bottom line is that the individual will be the driving force in healthcare in the coming years. I envision technology being the driver of effective care plans tailored to individual needs and circumstances. From data analytics for more precision, to intuitive, adaptable and secure clients and devices, to more effective and innovative care delivery options like telehealth, personalized healthcare is the key to addressing cost, quality and access while improving outcomes and patient satisfaction.

 

What questions do you have?

In my last couple of blogs, Healthcare Breaches from Loss or Theft of Mobile Devices or Media and Healthcare Breaches from Insider Risks: Accidents or Workarounds, I looked at breaches resulting from loss or theft of mobile devices containing sensitive patient data, and breaches resulting from healthcare worker accidents or use of workarounds respectively. In this blog I build on these with another common type of breach that results from cybercrime hacks of healthcare organizations.

 

In the Ponemon 2015 Cost of Data Breach Study, 47 percent of breaches resulted from malicious or criminal attacks. In these breaches the attacker is a remote hacker, often part of an organized cybercrime group or even a nation state. The target is the healthcare organization’s backend database containing all patient records. Since the cost of a breach depends on the number of patient records compromised, and the backend master database contains all of them, this type of breach is usually much more impactful than one resulting from loss or theft of a mobile device, which often contains just a small subset of the overall number of patient records. According to this research study, the total average cost of a single data breach event is $6.53 million, or $398 per patient record (the highest across all industries), inclusive of all types of breaches. Cybercrime breaches tend to be even more impactful and costly since they often involve all the patient records, and can run into the tens of millions of dollars and even north of $100 million per breach event.

 

An example of this type of breach is shown in the infographic below. It involves a series of failures, starting with ineffective security awareness training for healthcare workers. The next failure involves a spear phishing email being sent to a healthcare worker, who clicks a malicious link in the email, resulting in a drive-by download of malware. The malware, now installed behind the firewall of the healthcare organization, proliferates and keylogs, all the while looking for privileged credentials it can use to access all patient records in the master database. Once database administrator credentials are captured, the malware begins to exfiltrate patient records “low and slow” to avoid detection, resulting in a breach. Many organizations lack the ability to detect such intrusions, so this type of breach can go undetected for months or years before a watchful administrator happens to notice suspicious activity on the database. The long delay between intrusion and detection greatly increases the impact on the healthcare organization, since the longer the breach goes on, the more patient records are compromised.
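As a purely illustrative sketch, one simple way monitoring tooling might surface “low and slow” exfiltration is to flag sustained, rather than merely spiky, excess in hourly database read counts. The thresholds and data below are invented; real products use far more sophisticated analytics.

```python
# Toy detector: flag when hourly record-read counts stay above a multiple of
# the baseline for several consecutive hours. Thresholds here are invented.

def flag_sustained_excess(hourly_reads, baseline, factor=1.5, run_length=6):
    """True if reads exceed factor*baseline for run_length consecutive hours."""
    run = 0
    for count in hourly_reads:
        run = run + 1 if count > factor * baseline else 0
        if run >= run_length:
            return True
    return False

normal   = [100, 120, 90, 110, 130, 95, 105, 115]
low_slow = [100, 160, 165, 158, 170, 162, 161, 159]  # modest but sustained excess

print(flag_sustained_excess(normal, baseline=100))    # → False
print(flag_sustained_excess(low_slow, baseline=100))  # → True
```

The point of the sketch is the shape of the signal: a single burst would trip a volume alarm, whereas “low and slow” theft only shows up when you look for persistence over time.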

 

[Infographic: cybercrime breach sequence and breach security maturity model]

 

Security is complex, and many safeguards are required to effectively mitigate this type of breach. Maturity models have achieved wide adoption and success in healthcare; for example, the HIMSS EMRAM (EMR Adoption Model) has been used by more than 5,300 provider organizations worldwide. Maturity models are a great way to simplify complexity and enable a rapid assessment of where you are and what you need to do to improve.

 

In the infographic above, beneath the sequence of events leading to this type of breach, is a breach-focused maturity model that can be used to rapidly assess your security posture and determine next steps to further reduce residual risk. There are three levels in this maturity model: Baseline includes the orange capabilities, Enhanced adds the yellow capabilities, and Advanced adds the green capabilities. Only safeguards relevant to mitigating this type of breach are colored; the grayed-out blocks, while important in mitigating the risk of other types of breaches, do not play a significant role in mitigating breaches from cybercrime hacks.

There are many risks in healthcare privacy and security, and this model is focused on breaches. A holistic approach is required for effective security, including administrative, physical and technical safeguards, while this maturity model is focused mostly on technical safeguards. Risk assessments are required by regulations such as HIPAA and standards such as ISO 27001; the ability to rapidly assess breach security posture using a breach security maturity model is complementary to, not a replacement for, risk assessments. Below I briefly review each of the safeguards relevant to cybercrime breaches.

 

A baseline level of technical safeguards for basic mitigation of healthcare breaches from cybercrime hacks requires:

 

  • User Awareness Training: educates healthcare workers on how to be privacy and security savvy in delivering healthcare, including how to avoid clicking on spear phishing emails
  • Anti-Malware: detects and remediates malware infections of healthcare worker devices, including malware employees may accidentally encounter through drive-by downloads
  • Vulnerability Management and Patching: involves proactively identifying vulnerabilities and patching them to close security holes before they can lead to a breach. This is particularly important for healthcare worker devices used to access the Internet, which are at risk of drive-by downloads of malware
  • Penetration Testing / Vulnerability Scanning: involves proactively testing IT and scanning for vulnerabilities to identify security holes that can be remediated before they are used in exploits
  • Email Gateway: helps defend against malware attached to emails, and against phishing attacks
  • Web Gateway: can detect malware from healthcare workers’ web browsing, and defend against attempted drive-by downloads that may otherwise lead to data loss and breach
  • Firewall: malware used in cybercrime attacks attempts to contact command and control (C&C) servers to receive instructions and exfiltrate patient records; a good firewall can help defend against this

 

An enhanced level of technical safeguards for further improved mitigation of risk of this type of healthcare breach requires addition of:

 

  • Secure Remote Administration: enables healthcare IT to efficiently, securely and remotely administer endpoint devices so they are up to date with the latest patches and safeguards
  • Intrusion Prevention System: can detect and defend against anomalous activity on the healthcare organization’s network, such as malware communicating with C&C servers

 

An advanced level of security for further mitigation of risk of this type of breach adds:

 

  • Client and Server Application Whitelisting: blocks unauthorized executables on clients and servers, and can stop even the most sophisticated zero-day attack malware
  • Network Data Loss Prevention (DLP): ensures that sensitive healthcare data only leaves the healthcare network when appropriate, and can help defend against patient data being exfiltrated as part of a cybercrime breach
  • Threat Intelligence Exchange / Collaboration: connects your security IT with external threat intelligence for improved detection of and response to malware and cybercrime attacks
  • SIEM: integrates and analyzes event, threat and risk data for improved detection of malware, intrusions, and cybercrime breaches
  • DB Activity Monitoring: improves your ability to detect malware attacking your database, as in the case of a cybercrime breach, and enables you to define policies that can help defend against this type of breach
  • Digital Forensics: enables you to determine, in the event of a cybercrime intrusion, whether a breach actually occurred, and if so the nature of the breach and the exact scope of patient data compromised
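As a toy illustration of how the three cumulative levels above might drive a quick self-assessment (the data structure below is invented for this sketch, not an official HIMSS or Intel model):

```python
# Toy self-assessment against the cumulative Baseline/Enhanced/Advanced levels.
# Safeguard names follow the blog's grouping; the structure itself is invented.

LEVELS = {
    "Baseline": ["awareness training", "anti-malware", "vulnerability mgmt",
                 "pen testing", "email gateway", "web gateway", "firewall"],
    "Enhanced": ["secure remote admin", "intrusion prevention"],
    "Advanced": ["app whitelisting", "network DLP", "threat intel exchange",
                 "SIEM", "DB activity monitoring", "digital forensics"],
}

def maturity(deployed):
    """Highest level whose safeguards (and all lower levels') are all deployed."""
    achieved = "None"
    for level in ["Baseline", "Enhanced", "Advanced"]:
        if all(s in deployed for s in LEVELS[level]):
            achieved = level
        else:
            break
    return achieved

deployed = set(LEVELS["Baseline"]) | {"intrusion prevention"}
print(maturity(deployed))  # → Baseline
```

Because the levels are cumulative, a partially deployed higher tier (here, an IPS without secure remote administration) does not raise the assessed level, which mirrors how the model encourages closing gaps tier by tier.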

 

Healthcare security budgets are limited, and building security is an ongoing process. The maturity model approach discussed here can be applied incrementally over multiple years to improve breach security while staying within limited annual budgets and resource constraints.

 

What questions do you have?

Read Part I of this blog series on wearables in healthcare

Read Part II of this blog series on wearables in healthcare

 

As I mentioned in the first part of this blog series, wearables have become more than a passing trend and are truly changing the way people and organizations think about managing health. I hear from many companies and customers who want to understand how the wearables market is impacting patient care as well as some of the changes taking place with providers, insurers, and employers. In this blog series, I'll share some of their questions and my responses. This blog’s question is:


What are the primary challenges that companies face in collecting, analyzing, and sharing data generated by wearables?

 

Data integration and technology interoperability pose challenges. Data in healthcare is still very siloed. In most cases, the provider owns and maintains the electronic health record, the payer the claims data. Lab and prescription data are in their own systems. It’s difficult to access this data where it resides and pull it into a unified repository. Some of the leading electronic healthcare record vendors have built adaptors to pull in some fitness and wellness data. However, a lot of the wearable manufacturers compound the problem by being very insular and not offering an easy API for transferring the data. And there are no standards in place for wearables data. So it can be challenging to integrate patient generated data into traditional healthcare applications.

However, with healthcare today, one can argue there are bigger fish to fry than wearables when it comes to interoperability.
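To make the integration challenge concrete, here is a hedged sketch of the adapter pattern such integrations often fall back on when vendors lack a common standard; the vendor names and payload fields below are entirely invented:

```python
# Hedged sketch: with no common wearables standard, integration code typically
# maps each vendor's payload into one internal schema. Vendor field names
# below are invented for illustration.

def normalize(vendor: str, payload: dict) -> dict:
    """Map a vendor-specific step-count record to a unified internal record."""
    adapters = {
        "vendor_a": lambda p: {"date": p["day"], "steps": p["stepCount"]},
        "vendor_b": lambda p: {"date": p["date"], "steps": int(p["steps_total"])},
    }
    return adapters[vendor](payload)

print(normalize("vendor_a", {"day": "2015-10-01", "stepCount": 8400}))
# → {'date': '2015-10-01', 'steps': 8400}
print(normalize("vendor_b", {"date": "2015-10-01", "steps_total": "9100"}))
# → {'date': '2015-10-01', 'steps': 9100}
```

Every new device means another adapter to write and maintain, which is exactly the cost that a shared wearables data standard would eliminate.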


Another big issue is privacy: how will the data be used? When you start tying wearable data to corporate wellness programs and health plans, there is natural concern by employees and members wondering if the data can be used against them. The successful programs are often opt-in, and some include financial incentives or lower premiums if certain performance milestones are reached. Those are the “carrots” that will get people to participate. I have not heard of an example where employees are required to participate in wearing devices, but I imagine that would be less successful.

 

What questions about data do you have?
