
Intel Health & Life Sciences


The 2nd webinar in Frost & Sullivan’s series on Big Data in Healthcare took place recently and featured fantastic insights from Vijay Venkatesan of Sutter Health and Shawn Dolley of Cloudera on the subject of Predictive Analytics in a Big Data world. The webinar is now available on-demand via the Frost & Sullivan website.


This on-demand webinar features some great learnings from Sutter Health including:

  • Objective and role for Data and Enterprise Management
  • Dealing with Volume and Variety of Data
  • Best Approach to Transformation
  • Critical Success Factors
  • Closing Q&A


Listen to this webinar now and register for the 3rd and final webinar, which features Dr. David Delaney of SAP sharing insights into how SAP’s HANA in-memory database, along with Intel-tailored hardware, is being leveraged by the American Society of Clinical Oncology, and Dr. Kevin Fitzpatrick, CEO of CancerLinQ, who will discuss the fantastic work around aggregating and analyzing a huge web of real-world cancer care data.


  • Register Now: Big Data in Clinical Medicine: Bringing the Benefits of Genome-Aware Medicine to Cancer Patients
  • Watch Webinar 2: Predictive Healthcare Analytics in a Big Data World: Use Cases from the Field
  • Watch Webinar 1: Future of Healthcare is Today: Leveraging Big Data

As we arrive at the $1,000 genome, we find the fundamental challenges with next generation sequencing have shifted. The issue is no longer about shrinking the cost of sequencing but the explosive growth of big data: the downstream analytics with rapidly evolving parameters, data sources and formats; the storage, movement and management of massive datasets and workloads; and perhaps most paradoxical of all, the challenge of articulating the results and translating the latest findings directly into improving patient outcomes.

This topic, and more, will be front and center at the Personalized Medicine World Conference (PMWC) coming up in January 2016. As we transition from a “one-size-fits-all” approach and a focus on treatment rather than prevention, it’s a good time for the industry to gather to make personalized medicine a reality.


Combining patient, clinical, diagnostic and ‘omic data will give us more diversified data, allowing us to view health data differently with the potential for new personalized treatments. To analyze such diverse and large data sets will require new technical approaches. We will need to collect and store patient data in central and secure repositories when we can. We will also need solutions that can accommodate large amounts of genomic data which isn’t efficient to move from the clinics that generate and store it.


The challenge of analyzing data is also the reason that Intel, along with Oregon Health & Sciences University, launched the Collaborative Cancer Cloud, a precision medicine analytics platform that allows institutions to securely share patient genomic, imaging and clinical data for potentially lifesaving discoveries. It will enable large amounts of data from sites all around the world to be analyzed in a distributed way, while preserving the privacy and security of that patient data at each site.


The end goal is to empower researchers and doctors to help patients receive a diagnosis based on their genome and potentially arm clinicians with the data needed for a targeted treatment plan. By 2020, we envision this happening in 24 hours -- All in One Day. The focus is to help cancer centers worldwide—and eventually centers for other diseases—securely share their private clinical and research data with one another to generate larger datasets to benefit research and inform the specific treatment of their individual patients.


Genome Centers

I’m honored to be part of the PMWC 2016 agenda. On the third day of the event (January 26), I will be addressing how Genome Centers, including core facilities, are handling these challenges along with ethical and privacy issues. Several center directors will be on hand to discuss how they work with clinical data and how they share data with their customers.


Learn more about the Personalized Medicine World Conference, taking place January 24-27. I hope to see you there as treating patients as individuals becomes the norm in healthcare delivery.


What questions do you have?


Interoperability should mean less cost for healthcare organizations and better data analysis for patients. Here are a few additional thoughts.

How does a centre of research excellence keep pace with ever increasing data volumes and demand for insight? It’s a recurring question we hear the world over so it’s great to be able to showcase an example of how one organisation is meeting these challenges here in Spain. Spain’s National Center of Genomic Analysis (CNAG) opened in 2009, supporting 120 researchers and conducting c. 300 projects per year. It has a clear mission: to deliver research and results that help make citizens’ lives better.


Finding the 0.1%

As one of the largest capacity sequencing facilities in Europe, CNAG sequences around 800 Gigabases per day. We know that for reliable analysis we need to sequence at 30-fold coverage, so CNAG are sequencing the equivalent of eight full human genomes every 24 hours, but it’s the variations that really hold the key to unlocking precision medicine.

And given that genomes are 99.9% identical, the challenge becomes clear: find the 0.1%. Break each genome down into short strings, sequence them, and then rebuild them. Ivo Gut, director of CNAG, summarizes this nicely in the Sequencing and Supercomputers case study when he says: “It’s like doing a jigsaw puzzle with 1 billion pieces.”
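The throughput figures above can be checked with a back-of-the-envelope calculation; the numbers below all come from the text, and the ~3.2 gigabase genome size is the commonly cited figure for a human genome:

```python
# Sanity-check CNAG's sequencing throughput (figures from the text above).
genome_size_gb = 3.2        # human genome, roughly 3.2 billion bases
coverage = 30               # 30-fold coverage needed for reliable analysis
daily_output_gb = 800       # gigabases CNAG sequences per day

bases_per_genome = genome_size_gb * coverage       # ~96 Gb of raw sequence per genome
genomes_per_day = daily_output_gb / bases_per_genome
print(f"{genomes_per_day:.1f} genomes per day")    # roughly eight, as stated
```

At 30x coverage each genome consumes about 96 gigabases of raw sequence, so 800 gigabases a day works out to a little over eight genomes, matching the figure quoted above.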


Combining Data Sources to Gain New Insights

If you are a regular reader of our Health and Life Sciences blogs you’ll know that the word collaboration appears frequently. Across the healthcare ecosystem, collaboration is driving change; it has moved from something we all aspire to, to something we must embrace to deliver better care, reduced costs, and improved workflows. So, it’s great to see CNAG combining its own data with other sources to gain new insights. For example, CNAG collaborated with other institutions as part of the International Cancer Genome Consortium to better understand chronic lymphocytic leukaemia.


Big Data leads to Big Information

CNAG’s aim is to be able to put the findings of its research into use in a clinical environment; this requires a powerful computing platform that allows it to locate and accurately predict, among the roughly 3.2 billion bases in every genome, the variations that are potentially responsible for diseases. Without the technical capability to deliver sequence analysis on an industrial scale, it is difficult to do much more than one-off research projects. I recognise that these pockets of research are valuable, but to move us closer to delivering personalized medicine we must begin to work more collaboratively.


CNAG’s new sequencing and analytics environment is helping the organisation to meet the growing volume and variety of data generated by collaborative working with Ivo Gut saying: “We’re certainly handling big data now – and it’s growing all the time – but what we’re really after is big information.”


Intel and Atos provide scale and flexibility

Being able to design the computational infrastructure from the ground up gives organisations such as CNAG the opportunity to utilise best-in-class technology. The organisations I talk to regularly all have the same priorities around flexibility and scale. With that in mind Atos Big Data and Security service line developed a tailor-made compute cluster, powered by the Intel® Xeon® processor E5 family, to conduct in-depth high-performance data analytics (HPDA) on genome sequencing.


And looking to the future, CNAG will provide more granular insights to help hospitals treat different diseases, whether that be for identification of the correct medication or for rapid initial diagnosis. As CNAG scales its computational infrastructure it will also increase its scope of research, ensuring that Spain stays at the forefront of global genomics research.


Contact Carlos Piqueras on LinkedIn


Read Part I of this blog series on wearables in healthcare

Read Part II of this blog series on wearables in healthcare

Read Part III of this blog series on wearables in healthcare


This blog series is about how wearables have become more than a passing trend and are truly changing the way people and organizations think about managing health. I hear from many companies and customers who want to understand how the wearables market is impacting patient care as well as some of the changes taking place with providers, insurers, and employers. In this series, I’m sharing some of their questions and my responses. This blog’s question is:

Are there applications for tracking employee movement or flow apart from wellness applications?


Yes, this is a fairly widespread and growing practice in healthcare – monitoring the location and movement of employees, patients, and equipment. So you know where the patient is, where the care team is, where the closest infusion pump is. The “wearable” in this case would be the badge with integrated active or passive RFID. This is typically referred to as RTLS or real-time location system.

I haven’t come across many healthcare organizations that monitor their clinicians; patient and medical device monitoring is more widespread. There are some privacy concerns with employees being monitored, but there are benefits such as easily locating the care team or adjusting workflow throughout the hospital. I suspect a growing number of organizations will deem examples like these to have high enough ROI to allow for the minor privacy intrusion.


What do you think? Have you seen a trend in monitoring medical staff?

By Tom Foley, Director, Global Health Solution Strategy, Lenovo Health


Patient engagement is a top-of-mind subject for healthcare providers. After all, successful patient engagement usually results in better outcomes for all parties involved in care – for providers, it can result in a more comprehensive view of their patients’ health and/or condition stability as well as improved patient satisfaction, and for patients it can mean more confidence in their improvement, their sense of care and ultimately their health.


Even with the industry’s sights set on the promise of patient engagement, many healthcare providers have found it difficult to connect with patients in a meaningful way. While widespread consumerization of healthcare has not yet taken hold, there are still some compelling examples of organizations that are moving toward meaningful patient engagement, which can easily serve as a form of encouragement for those still seeking direction.


Recently, athenahealth collected data from millions of portal visits and more than 2,000 clients, hoping to be able to “nudge medical staff in the interest of better efficiency and performance.” Their findings could be equally helpful to other providers, with key takeaways such as:


Age is not necessarily a pain point. Many providers are tempted to point to their diverse portfolio of patients when examining troubles with patient engagement. While in some cases, older individuals are slower to adapt to new norms and less comfortable with technology, athenahealth has found that in terms of patient portals, age is not a limiting factor. Specifically, the study found that “patients in their 60s register for portal accounts at the same rate as those in their 30s, 40s, and 50s” and “patients between 70 and 79 use portals at roughly the same rate as twenty-somethings.” Speaking even further to age as a non-issue, athenahealth points out that “older patients sign into their portal accounts significantly more often than younger patients.”


Specialty providers can benefit from portals. Athenahealth found “significant” adoption of patient portals from specialty practices. They name “tremendous reductions in staff work from engaging patients electronically” and “better adherence to perioperative protocols” as specific successes.


Defaults encourage engagement. Providers should not be hesitant to sign patients up for portals automatically – leaving them with the option to opt out. Athenahealth found that practices using this approach “see a much higher adoption rate than practices that ask patients to register for the portal on their own.”


As an industry, we know that patient engagement is more than complying with meaningful use requirements – we are moving forward with better strategies to engage patients and with the intention of creating an overall better-coordinated care experience. Portals are not the only tool available, but they remain a great place to start. As patient engagement continues to challenge providers, it’s important to follow the latest studies and their results, such as the one from athenahealth. These insights into the healthcare industry are a great way to glean best practices, understand different approaches, and understand what an improved patient engagement strategy can look like for your organization. By sharing findings, opinions, successes, and failures, we can, together, reinforce what really matters: improving overall care and maintaining patients as the top priority.




  1. “Using Data to Increase Patient Engagement in Health Care,” Harvard Business Review, June 30, 2015.

The seventh annual mHealth Summit kicks off Nov. 8 with the theme, Anytime, Anywhere: Engaging Patients and Providers.


This event, which is now part of the HIMSS Connected Health Conference, has come a long way since the first show in 2008, when the focus was primarily on devices. Since then, the meaning of “mobility” has evolved, with mHealth now driving a swelling tide of innovation that will ultimately fill the data hole that has vexed physicians for millennia: How is my patient doing at home? We’re seeing bits and pieces of the innovations and use cases that will make this possible, from footstep-monitoring smart phones to wearable biosensors for Parkinson’s patients. But these just hint at the disruptive changes that lie ahead.


Where are we heading, and how do we get there? That’s what we are looking forward to sharing next week as several experts on our team will be presenting ideas on the next phase of mHealth technology.

For example, here are just a few of the activities on the agenda:


  • The Intel booth (#703) will be a hub of activity showcasing many leading technologies, such as: a Patient Room of the Future, remote patient monitoring systems, Intel® RealSense™ cameras for wound care, portable teleclinics, IoT platforms that record, store and analyze data, plus the newest mobile devices featuring Windows* 10.
  • I will be delivering a keynote address on Tuesday at 9:30 a.m. in the Potomac Ballroom on the main stage and will be looking at mHealth’s growing role in enabling connected and proactive care wherever you are, tailored to you, your data, and your care community. Plus, I’ll be sharing some innovations that are moving us from wearable to implantable health technologies.
  • Our Cybersecurity Kiosk (#4) will feature our security expert, David Houlding, covering security maturity models and online tools. Come with your questions and have a chat with him.
  • The Connected Health Pavilion is where you can learn about big data analytics and see the Basis Peak™ watch in action in CH-3.


Mobility is a given as today’s healthcare continues expanding beyond institutions into more home-based and community care settings. Mobile technology offers the promise to help busy clinicians improve quality of care and efficiency of care delivery. Intel powers mobile devices with modern user interfaces for on-the-go clinicians, while helping IT departments meet the challenges of managing and securing devices in a “bring your own device” (BYOD) world.


We are looking forward to mHealth Summit and encourage you to follow @IntelHealth on Twitter for live tweets from the show floor and the Intel presentations.


What are you most looking forward to seeing and learning about at mHealth Summit?


John Sotos, MD, is the Worldwide Medical Director at Intel Corporation.

Now in its seventh year, the mHealth Summit is taking its “innovation” theme to a new level with a few surprises in 2015. First and foremost, the summit is taking place a month earlier than usual – this year’s event will be held November 8-11 at the Gaylord National Resort and Convention Center in Washington DC.  (Does this mean we won’t get to see the annual Christmas display in the Gaylord’s voluminous atrium?)


Second, it’s now part of the HIMSS Connected Health Conference, which also includes the Global mHealth Forum, the Population Health Summit and the CyberSecurity Summit – all for the price of a single registration.


And if that’s not enough, attendees can add on registration for seven additional co-located one-day conferences and events covering wearables, games for health, venture capital, and start-ups.


Let me confess: the mHealth Summit is one of my favorite annual events in the healthcare space. Attendees and exhibitors have a missionary zeal, and they embrace their roles as evangelists. And I love the tech on display — both the hardware and the apps. It reminds me a lot of CES, the big consumer electronics show in Vegas, except on a more intimate scale, and confined to a topic I’m passionate about.


Where are the providers?

But in my view, there’s always been one thing missing from the mHealth Summit: the providers. Last year, only 14 percent of attendees came from provider and payer organizations. While this represents the second-largest demographic group at the summit, mixing payers and providers together masks the actual attendance of CIOs, IT directors, and other technology professionals from hospitals and large practices.


When the Healthcare Information and Management Systems Society acquired the event in 2012, many industry observers expected HIMSS members (today, some 61,000 strong) to fill that gap. Yet even three years in, the exhibit floor and education sessions still seemed the sole province of technology developers, consultants, consumer advocates, government officials (it is in DC after all) and a handful of independent, forward-thinking clinicians.


Eric Wicklund, editor of mHealth News (owned by HIMSS Media), says the acquisition may simply have been ahead of its time. In 2012, CIOs were in the midst of certifying EHRs, trying to get a handle on meaningful use, and dreading the implications of a change-over from ICD-9 to ICD-10. There just wasn’t the bandwidth to think about mHealth.


Wicklund also suggests that a conference between Thanksgiving and the end of the year was a difficult pill for many CIOs and practitioners. “If your goal was to attract providers, it just seems like a hard time to hold a conference,” he said, pointing to the holidays and the wintry weather.


Three reasons to reconsider

Wicklund thinks that in addition to the broader focus of the conference and the change in dates, there are good reasons to believe this year’s event will appeal to technologists and clinicians.


#1 - Connected health implies provider participation. Says Wicklund, “HIMSS identified three hot-button issues for providers – mHealth, cyber security and patient engagement. It’s not about mobility anymore — it’s connected care, and providers have a real stake in that.”


#2 - Greater focus on provider use cases. Providers might want to look at the clinical care track, which features a number of case studies from some of the country’s leading hospitals and health systems, including the University of Pittsburgh Medical Center and Massachusetts General Hospital.


#3 - Breaking news on provider issues. Here’s another reason why CIOs might want to make plans to attend this year’s event – the Office of the National Coordinator has agreed to hold a “fireside chat” on the recently released Interoperability Roadmap. Also at the Monday, Nov. 9 gathering, ONC officials are likely to detail the latest meaningful use final rule.


In a recent interview, HIMSS CEO Steve Lieber provided a strong justification for why CIOs should invest more time looking at and understanding mHealth, wearables and the Internet of Things.


“Healthcare is entering an era of the consumer and with that comes new challenges and opportunities,” Lieber told mHealth News. “More than ever before consumers are taking greater control of their health and healthcare. To support that trend, providers need to provide a different level of connectivity between the clinician and the consumer. Mobile health, population health and information security are critical components in achieving this new level of electronic connection.”


I think Lieber is spot-on. Connected health is being driven by consumerism, and healthcare providers who aren’t able to respond to consumer demands for interactive and interoperable data, new payment options, self-serve scheduling and greater convenience will be left behind. Taking care of the infrastructure and managing internal applications used to occupy the full attention of provider-based technologists, but in today’s world, that is simply not enough. As care spills over the hospital wall (think about preventing unnecessary readmissions as just one example), the technology will follow.


Do you plan to attend the mHealth Summit this year? Why or why not? Let me know in the comments or send me a tweet at @jbeaudoin.

By Asha Nayak, MD, Ph.D.


This week, I was thrilled to join Intel CEO Brian Krzanich on stage during his keynote at Oracle* Open World to describe the great potential of big data analytics in healthcare. I talked about the critical role of tools like Intel’s open-source Trusted Analytics Platform (TAP) in accelerating medical discovery and impact for patients.


As a practicing clinician and Intel’s Global Medical Director, I am especially excited about the ability to monitor patients outside of the clinic using wearable, at-home, and mobile devices. In addition to their usefulness in monitoring, many of these devices are two-way — creating an avenue to reach individuals in real time with custom recommendations. This has very broad implications well beyond the examples around cardiovascular and Parkinson’s studies I cited in the keynote.


For outwardly healthy individuals, wearable and mobile devices can help clinicians detect a variety of conditions earlier. For example, today many people have intermittent atrial fibrillation that they learn of only after arriving at the hospital with their first stroke. Imagine if we could detect this silent and asymptomatic condition at home, in time to prompt a person to get early treatment and prevent that first stroke.


Another example is pre-eclampsia1, which affects two percent to 18 percent of pregnancies, depending on where you are in the world. It is a progressive condition that threatens the lives of both mother and fetus. One of its first signs is rising blood pressure. Earlier detection and treatment are known to improve outcomes for both mother and child. These are two of many examples of how we can prevent suffering, improve outcomes, and potentially lower costs by detecting disease earlier.


It’s very important to note that monitoring devices must be accurate (especially in the hands of untrained users) and must be validated, in combination with their back-end analytics, in order to be effective in these types of applications.


For individuals with chronic disease, there is also great value in helping people tailor how they manage their unique symptoms and disease progression. Technology can help to guide a patient within the bounds of a physician’s prescription/instructions. For example, patients with asthma treat themselves with prescribed oral and inhaled medications when symptoms occur.


Imagine if - using biometric and environmental sensors, real-time analytics, and push-alerts — an asthma patient could be alerted to the risk or early-onset of a flare before symptoms begin. Early intervention has the potential to help patients avoid or reduce the severity of their flare and reduce the amount of medication needed to control it. This is an example of how we can better treat conditions that are already diagnosed.


My enthusiasm in this area is shared by many, across disciplines, and around the world. In addition to Intel, numerous academic, government and commercial organizations have recognized the value of integrating multiple health data streams, and are investing heavily in discovery from large population datasets. As you can see from this chart, these efforts, representing just a few that are underway today, are diverse both in focus and location:



  • UK Biobank: fully enrolled, dataset growing
  • Michael J Fox Foundation: target 1,000 participants (US & Europe); actively enrolling
  • UCSF Health eHeart Study: target 1,000,000 participants (worldwide); actively enrolling
  • Stanford & Duke Baseline Study: pilot phase (2015), then expanding
  • Qatar SIDRA Medical & Research Center: enrolling in 2016
  • Saudi Genome Project: planning phase (2015)
  • US Precision Medicine Initiative: planning phase (2015)


So to all the researchers out there, this is an amazing time to ask questions we never thought we could answer before. Think about associations between clinical parameters and pretty much anything we can measure today — from behavior to diet to location to genetic composition and more. Bigger and bigger datasets are being integrated. Tools like TAP are making it easier to query these complex information streams, including data generated outside the clinic, and to find answers that I believe will help us live healthier.


The entire keynote can be found here with the healthcare discussion beginning at 33:00.


Visit intel.com/healthcare to find out more about Intel’s efforts in Health and Life Sciences.


Copyright © Intel Corporation, 2015. All rights reserved.

*Other names and brands may be claimed as the property of others. Legal Notices

While measuring is a requirement for process improvement, it's not the end goal. Identifying the necessary reports, in and of itself, does not automatically translate to an improved patient experience, as revealed in February 2015 by two JAMA articles. Both failed to find a significant difference between hospitals that had implemented the American College of Surgeons’ National Surgical Quality Improvement Program (NSQIP) and those that had not. As it turns out, the act of adopting frameworks to measure and report does not change the rate of improvement.


Donald M. Berwick, MD, MPP writes in an accompanying JAMA editorial:


“The authors of both reports in this issue of JAMA struggle to explain their findings, troubled as any sensible person must be by the suggestion that knowing results would not help caring, committed clinicians and organizations improve their results.”


In order to improve, individual behavior needs to change. Here are three essential report “qualities” to better connect measurement with behavior change, and thus improve the patient experience.


1. Real-Time

While hospitals and surgery centers may have access to "reports", not all reports are equal. Timing is everything, and receiving monthly reports is less than ideal when trying to improve process. When available any time, however, analytics provide a mechanism for immediate accountability and drive more intelligent behavior change. At the end of every day, we need to easily evaluate our performance and, where necessary, make changes for the next day.


With real-time analytics, managers and directors are informed with objective data, allowing them to intervene before emotions and incorrect perceptions create division or erode culture within the team. What was today's on-time start rate? Why were cases late? How many patients waited over an hour past their scheduled start time? What was the average turnover time in OR 6 today? With real-time analytics, managers are able to approach surgeons before there’s an opportunity to complain. This type of information, delivered in an easily consumable format, empowers directors to have confidence that their data-driven decisions are supported in real time. And that's a big deal for tomorrow’s patient. There's no waiting for improvement.
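The kinds of end-of-day metrics described above are straightforward to compute once case records are available. Here is a minimal sketch; the case records, the five-minute grace period defining "on time," and the one-hour wait threshold are all invented for illustration:

```python
from datetime import datetime, timedelta

# Hypothetical OR case records: scheduled vs. actual start times.
cases = [
    {"scheduled": datetime(2015, 11, 9, 7, 30), "actual": datetime(2015, 11, 9, 7, 32)},
    {"scheduled": datetime(2015, 11, 9, 9, 0),  "actual": datetime(2015, 11, 9, 10, 15)},
    {"scheduled": datetime(2015, 11, 9, 13, 0), "actual": datetime(2015, 11, 9, 13, 0)},
]

GRACE = timedelta(minutes=5)  # assumed definition of an "on-time" start

# Count starts within the grace period, and waits of more than an hour.
on_time = sum(c["actual"] - c["scheduled"] <= GRACE for c in cases)
rate = 100 * on_time / len(cases)
waited_over_hour = sum(c["actual"] - c["scheduled"] > timedelta(hours=1) for c in cases)

print(f"on-time start rate: {rate:.0f}%")                # 67%
print(f"patients waiting > 1 hour: {waited_over_hour}")  # 1
```

In a real deployment these records would stream in from the scheduling system throughout the day, so the same calculation can answer the questions above at any moment rather than in a monthly report.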


2. Granular, Interactive, and Individually Comparative

Hospitals and surgery centers may understand how they perform as a group, but improving that performance is traditionally limited to revising policy and communicating new goals to large groups. This leaves improvement opportunities hidden and the patient experience stagnant.


Two anecdotal examples highlight the limitations of traditional reports. First, consider a group whose on-time antibiotic performance is 99.64 percent, a typical result. It's worthy of accolades and difficult to improve, until one compares the individuals. By comparing individual performance within this “99.64 percent”, the narrative changes and exposes new opportunities to reduce complications and cost. Comparing providers reveals everyone was at 100 percent, except for 2 providers who were at 72 percent and 83 percent respectively. It is these providers who are putting their patients at increased risk for infection, raising costs and lowering patient satisfaction. It is these 2 providers who need to change their behavior in order to improve the overall performance from 99.64 percent to 100 percent. Traditional reports lack this level of granularity, and thus hide the opportunity for improvement and prolong the status quo.
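The arithmetic behind this masking effect is easy to demonstrate. In the sketch below the per-provider case counts and provider names are invented; only the 72 percent and 83 percent individual rates come from the example above:

```python
# Hypothetical illustration: a strong group average hides individual outliers.
# Each entry is (on-time doses, total cases); counts are invented for illustration.
providers = {
    "provider_A": (72, 100),   # 72% on time
    "provider_B": (83, 100),   # 83% on time
    **{f"provider_{i}": (100, 100) for i in range(3, 101)},  # 98 providers at 100%
}

on_time = sum(ok for ok, _ in providers.values())
total = sum(n for _, n in providers.values())
group_rate = 100 * on_time / total
print(f"group on-time rate: {group_rate:.2f}%")  # high despite two outliers

# Individually comparative view: flag anyone below a threshold.
outliers = {p: 100 * ok / n for p, (ok, n) in providers.items() if ok / n < 0.95}
print(outliers)
```

The group-level number looks excellent, while the per-provider view immediately surfaces the two clinicians whose behavior actually needs to change, which is exactly why granular, individually comparative reports matter.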


In a second example, consider the overall incidence of complications for a given practice is 12 per 3,200 cases, or 0.38 percent. Again, this result may be cause for inaction as this is “acceptably low”. However, comparing individuals reveals that 7 of those 12 complications involve the same provider. Is there cause for concern? Is there a need to intervene so that tomorrow’s patients are safer? There are many more such examples. Individually comparative reports simply open new opportunities for process improvement and improved patient experience.


3. Distributed

Fundamental to successful use of informatics is getting the right information, at the right time, to the right person. Analytics need to go further than senior leadership or department director levels and arrive in the hands of the people who are providing the care. Too often individuals are identified by managers only when either “something is wrong” and corrective action is warranted or when “exemplary behavior” has been documented. Outside of these two extremes, the vast majority of providers remain idle in a world of “average mediocrity”, defensive towards a suggestion they need to change.


For transformational change, "mediocrity" needs to improve to "excellence" because there’s real value for patient outcomes and the hospital’s bottom line when improving a "78 percent" to "96 percent". For such change though, individuals need to be empowered with tools to measure themselves, compare themselves, and improve themselves - without fear of being negatively labeled by superiors. Truth is, we all have opportunity for improvement, and such opportunity must not be equated with a state of individual failure. Instead, measurements must be made available down to the individual level so that every provider can better reach their own full potential.


For transformational improvement, we need more than just reports. The collected information must be convenient, accessible, and meaningful - designed to help individuals change their own behavior. Real-time, individualized, and distributed analytics are key report qualities that will enable measurements to impact the patient experience through decreased complications, decreased costs, and an overall improved experience.


What questions do you have?

Big Data is a hot topic across the entire healthcare ecosystem, so I was happy to join a panel of experts in the first of a series of Frost & Sullivan webinars that will help healthcare organizations better understand how they can make use of the data they have. The first webinar in this series is now available to view on demand via the Frost & Sullivan website. The 2nd webinar continues the Big Data theme with a focus on Predictive Analytics in healthcare; please do register now for this must-see session on November 5th.

I was joined by Tod Davis, Manager of BI and Data Warehousing at Children’s Healthcare of Atlanta and Amandeep Khurana, Principal Solutions Architect at Cloudera. I started the webinar by providing an overview of the current state of the healthcare industry in relation to data use, where many disparate data types are held in silos that limit the value considerably. At Intel we’re helping healthcare organizations move towards integrated computing and use of unified data to help deliver better patient outcomes and reduced costs.


An inspirational presentation from Tod Davis highlighted the outstanding work he and colleagues at Children’s Healthcare of Atlanta have been undertaking, including the benefits they are reaping from focusing on making use of data. I’d highly recommend listening to Tod for his take on:


  • Lessons learned in 3 years with Hadoop
  • Use cases including retinopathy of prematurity and using vital signs to monitor pediatric stress
  • An overview of ecosystem tools and high-level data pipeline solution


Amandeep Khurana from Cloudera shared his thoughts on how Cloudera is helping to drive the big data ecosystem. Amandeep’s presentation sparked a number of great questions at the end of the webinar from viewers and I think you’ll find the answers of particular interest. Watch webinar 1 now and share with your own professional network.


Webinar 2 focuses on Predictive Analytics in Healthcare and highlights use cases from the field, including what is sure to be an informative session with Vijay Venkatesan of Sutter Health on November 5th 2015. Join us by registering today for webinar 2 to receive the latest updates and reminders for this series.


- Register for Webinar 2 now

- Watch Webinar 1 on demand

- Intel’s role in Big Data in Healthcare

Obesity, diabetes, heart disease…these are not just headlines – statistically, they’re coming to you or someone you love. Caring about ourselves includes caring for our bodies, but that can be difficult without feedback.  Stop eating or drinking, and your body will soon let you know. But eat something healthy or unhealthy, exercise or don’t exercise, sleep too little or too much, and you may not feel any significant effects for a while.


Measuring the Invisible


How do we measure our health? Our weight? How out of breath we are when we walk up the stairs? Numbers gathered once a year from a blood test? Health doesn’t happen at the doctor’s office. It happens with your every heartbeat and daily decisions of diet and exercise.


With our busy days, it’s easy to fall prey to the convenient behaviors over the healthy. What’s for dinner? Do I have the time to walk/run/bike?  Stairs or elevator? Watch that show or go to sleep? We don’t want to become neurotic about this, but how can we instill healthy habits in ourselves with more timely rewards than our natural senses provide?


Analyzing Peak Data


I’m excited about our new partnership with Big Cloud Analytics. In 2014, Intel acquired Basis Science, maker of the Basis Peak, arguably the most accurate fitness tracker on the market today. It can measure your resting and exercising heart rate; light, REM, and deep sleep; steps; calories burned; and galvanic skin response, which among other things can be used to deduce your stress level. The Peak’s data has been shown to be within about 4 percent of that provided by a Holter-monitored stress test, and within 2 percent of that provided by a sleep study. Great, but how does that change my habits to make me healthier?


That’s where Big Cloud Analytics comes in. Their COVALENCE Analytics Platform, running on Intel® Xeon® servers, can analyze the approximately 72,000 data points coming off your Basis Peak watch every day. Add to that data coming from the population, weather data, social media data and a whole host of other metrics.
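As a quick back-of-envelope check (assuming the 72,000 figure covers a full 24-hour day), that volume works out to roughly one reading every 1.2 seconds:

```python
# Back-of-envelope: sampling interval implied by 72,000 data points/day.
points_per_day = 72_000
seconds_per_day = 24 * 60 * 60
interval = seconds_per_day / points_per_day
print(f"~one data point every {interval:.1f} seconds")  # ~1.2 s
```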


With the magic of statistics, the BCA Dashboard gives you a score for your heart health. We all know the heart is a muscle. Our heart rate at rest, during sleep, when we exercise, and when we recover after exercise gives a pretty clear picture of how that muscle is performing. Wear the watch regularly (I change wrists for sleeping), and you are being tested every moment of every day.


Anonymizing Population Data


Whoa – sounds a bit nosy – what about my privacy? Data from our watches goes to secure Basis servers; then, when we opt in to the COVALENCE Dashboard, our IDs are mapped to a new random ID, password, and device. This way, we and our data are no longer directly associated. This is called double-blinding. With double-blinding, our data is available anonymously as part of the population. This lets me see where I fit within the population without sharing my personal numbers. It also might let my employer (who is interested in me being my healthiest, to maximize productivity and minimize corporate health insurance premiums) offer me targeted information without knowing my personal identity.
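The ID re-mapping idea can be illustrated with a short sketch. This is only a conceptual illustration of pseudonymization, not Basis’s or BCA’s actual scheme; the `pseudonymize` helper and the example record are invented for this sketch:

```python
# Sketch of the pseudonymization idea described above: a user's real ID is
# replaced with a random ID, and only the holder of the mapping can link
# the two. Illustration only - not the actual Basis/BCA implementation.
import secrets

id_map = {}  # held separately from the analytics data store

def pseudonymize(real_id: str) -> str:
    """Return a stable random ID for a user, creating one on first use."""
    if real_id not in id_map:
        id_map[real_id] = secrets.token_hex(16)
    return id_map[real_id]

record = {"user": pseudonymize("jane.doe@example.com"), "resting_hr": 58}
# The analytics side sees only the random ID, never the original identity.
```

Population-level queries run against records like this one, so individual comparisons are possible without exposing who anyone is.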


Analytics Influencing Positive Change


So now combine our heart rate data with activity, sleep data, stress data, our age, height, local weather then add a connected digital scale and blood pressure cuff, and the BCA Dashboard can identify how we compare to others based on gender, age, location, etc.


If we do the same exercise on a regular basis, anything from a walk, run or bike, or even a walk up a couple flights of stairs, we can see performance improve (or get worse). If we can view sleep score every night, maybe we can tie changes to our diet, the thermostat setting, or our activity level. Maybe we don’t want to be athletes, but would we and our loved ones want us to move our heart fitness ages lower?


For this work tying together wearable and exogenous (beyond-biometrics) data, BCA and Intel recently won the 2015 Leadership in Healthcare IoT Award from the Employer Healthcare & Benefits Congress – they recognize that our health impacts productivity at work and our health insurance rates, as well as our quality of life. Opportunities abound for gamification and incentives to help make fitness fun.


Connecting your Health


Analytics is at the center of the consumerization of Healthcare (see also US News article: http://health.usnews.com/health-news/patient-advice/articles/2015/08/14/how-big-data-is-driving-the-consumerization-of-health-care ). In the near future, you’ll be seeing additional ways to assess your personal health.


Companies are working on mobile devices that continuously evaluate the function of our bodies: blood pressure, non-invasive glucose, lung function and more. When you or your loved ones are sick, the doctor will be able to ‘see’ you via your computer and devices, enabling them to remotely hear your heart and lungs and see inside your throat and ears, for example. And whether well or sick, analytics will assist in your assessment.


We could call it ‘Biofeedback 2.0’: add individual and population analytics to accurate wearables and we’ve got the makings of a personal health revolution.




Next-Generation Sequencing (NGS) technologies are transforming the bioinformatics industry. By sequencing whole human genomes at rates of up to 18,000 per year—an average of one genome every 32 minutes— new sequencers have broken the $1,000 genome barrier[1] and opened the door to population studies and clinical usage models that have not been possible before.


Of course, high-volume sequencing generates an enormous amount of data that must be analyzed as fast as it is produced. According to Illumina, it would take an 85-node high performance computing (HPC) cluster to keep pace with its top-of-the-line HiSeq X™ Ten sequencer operating at full capacity.[2]


Drive Down the Cost of Analyzing Your Genomes

Working together, Qiagen Bioinformatics and Intel have developed a reference architecture for a 35-node cluster based on the Intel® Xeon® processor E5 v3 family that meets these same performance requirements, while reducing total cost of ownership (TCO) by as much as $1.4 million over four years.[3] Depending on sequencing volumes and data center efficiency, this solution could enable full analysis of whole human genomes for as little as $22 each.
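As a rough, hypothetical sanity check of the “$22 per genome” figure: if the cluster kept pace with a HiSeq X Ten’s published throughput of about 18,000 genomes per year for the four-year TCO period, the implied analysis spend would be on the order of $1.6 million (actual volumes and data center costs will vary):

```python
# Rough sanity check of the "$22 per genome" figure, assuming the cluster
# matches a HiSeq X Ten's ~18,000 genomes/year throughput (an assumption;
# actual sequencing volumes and data-center efficiency vary).
genomes_per_year = 18_000
years = 4
cost_per_genome = 22  # USD, per the TCO analysis cited above
total_genomes = genomes_per_year * years
implied_spend = total_genomes * cost_per_genome
print(f"{total_genomes:,} genomes -> implied 4-year analysis spend ~${implied_spend:,}")
```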


The Qiagen Bioinformatics and Intel reference architecture uses CLC Genomics Server with the Biomedical Genomics Server extension, which is highly optimized for Intel architecture. The Biomedical Genomics Server solution provides all the advanced tools and capabilities of CLC Genomics Workbench and the Biomedical Genomics Workbench, but is designed specifically for HPC clusters. Geneticists benefit from a powerful workbench, high quality results, and intuitive interfaces that insulate them from the complexities of cluster computing.


Manage Your Data on Massively-Scalable, Centralized Storage

Fast, scalable storage is as important as cluster performance for NGS. The reference architecture includes a 165 TB storage solution based on Intel® Enterprise Edition for Lustre*. Intel packages this open source software with powerful management tools and offers 24/7 support to help organizations manage and protect their data and maintain high reliability and uptime.


This centralized storage system uses high-capacity commodity disk drives to keep costs low, plus a small number of Intel® Solid State Drives (Intel® SSDs) to accelerate the operations that are most critical for fast genome analysis. Like the compute cluster, the storage system is designed to scale on-demand, so you can accommodate rapid growth in a straightforward and cost-effective manner.


Lay the Foundation for Next-Generation Breakthroughs

Today’s powerful NGS technologies will help scientists, labs, and clinics deliver the next wave of scientific and medical innovation. A fast, scalable, and affordable analytics solution can simplify your journey and help keep your costs under control.


Read the story on the Qiagen Blog


Learn more at:


[1] Based on the published output capacity of the Illumina HiSeq X Ten next-generation sequencer. http://www.illumina.com/content/dam/illumina-marketing/documents/products/datasheets/datasheet-hiseq-x-ten.pdf

[2] Source: A workflow for variant calling based on BWA+GATK in the HiSeq XTM System Lab Setup and Site Prep Guide (Part # 15050093 Rev. H July2015). Current version for September 2015 can be found at:https://support.illumina.com/content/dam/illumina-support/documents/documentation/system_documentation/hiseqx/hiseq-x-lab-setup-and-site-prep-guide-15050093-01.pdf

[3] Based on internal performance tests and a total cost of ownership analysis performed by Qiagen Bioinformatics and Intel. Performance tests were conducted on a 16-node high performance computing (HPC) cluster. Each node was configured with 2 x Intel® Xeon® processor E5-2697 v3 (2.6 GHz, 14 core), 128 GB memory, and a 500 GB storage drive. All nodes shared a 165 TB storage system based on Intel® Enterprise Edition for Lustre, 256 TB of 7.2K RPM NL-SAS disk storage and 4 x 800 GB Intel Solid State Drive Data Center S3700 Series supported by an Intel® True Scale™ 12300 - 36 Port QDR Infiniband Switch and a 2x Intel® True Scale™ Single Port HCA’s – (QDR-80 configured by default). The TCO analysis was performed using an internal Intel tool and publicly available product pricing and availability as of October 9, 2015. The TCO for the test cluster was estimated over a 4-year period and compared with the estimated TCO of an 85-node cluster, as described in the Illumina HiSeq X System Lab Setup and Site Prep Guide, Document # 15050093 v01, September 2015. https://support.illumina.com/content/dam/illumina-support/documents/documentation/system_documentation/hiseqx/hiseq-x-lab-setup-and-site-prep-guide-15050093-01.pdf.  To quantify the TCO comparison, specific products were chosen that would fulfill the general specifications defined within the Illumina guide.  Support costs for both systems were estimated as 60 percent of TCO. The performance and TCO results should only be used as a general guide for evaluating the cost/benefit or feasibility of a future purchases of systems. Actual performance results and economic benefits will vary, and there may be additional unaccounted costs related to the use and deployment of the solution that are not or cannot be accounted for.

This blog is written by a colleague, Robert Sugar, who is a Software Architect at Intel Health and Life Sciences and features some very exciting advances for those who work in the genomics field, particularly as we gather at BioData World Congress, in Cambridge, UK, to hear how organisations across the world are advancing precision medicine. If you'd like to discuss an aspect of this blog you can find Robert's LinkedIn details at the end of his writing.



By Robert Sugar

With an aspiration of all-in-one-day genome sequencing bringing precision medicine to reality, I wanted to share some of the work Intel has been undertaking with partners to speed up an important part of the sequencing process ahead of BioData World Congress 2015. Like most good stories, this one starts at the beginning, by which I mean the mapping phase of DNA sequencing analysis which prepares sequence alignment files for variant calling in sequencing pipelines.


From Multiple to Single Pass

In a typical genomic pipeline, after the mapping phase, a number of tasks such as marking duplicates, sorting and filtering must take place, which usually require multiple passes by different preparation tools. The consequences of calling multiple command line tools numerous times include repeated I/O between the steps and multiple passes over the same data, which may have only been incrementally modified. Moreover, many of these tools (such as the Picard tool recommended by the gold-standard GATK workflow) utilize only a single CPU thread. As a result, more time is often spent in sorting and filtering than in variant calling itself. This not only slows down the entire process of sequencing a genome but also has financial implications.


Intel, alongside IMEC and in collaboration with Janssen Pharmaceutica (as part of the ExaScience Life Lab), developed elPrep, an open-source, high-performance tool for DNA sequence (SAM/BAM file) processing that uses a single-pass, parallel filtering architecture. elPrep simplifies not only the computational processes but also the end user’s need to understand those processes, reducing both time and costs. Additional filters can also be easily added for customer-specific workflows.
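The single-pass idea can be illustrated conceptually: instead of one full traversal of the data per tool, all filters are composed and applied as each record streams by. This toy Python sketch shows only the architectural idea; elPrep’s real implementation (parallel, in-memory SAM/BAM handling) is far more sophisticated:

```python
# Conceptual sketch of multi-pass vs. single-pass record processing.
# Toy illustration of the architecture only - not elPrep's actual code.

def multi_pass(records, filters):
    """One full traversal of the data per tool, with I/O in between."""
    for f in filters:
        records = [f(r) for r in records]  # re-reads everything each time
    return records

def single_pass(records, filters):
    """Compose all filters and touch each record exactly once."""
    out = []
    for r in records:
        for f in filters:
            r = f(r)
        out.append(r)
    return out

filters = [str.strip, str.upper]
reads = ["  acgt ", " ttag"]
assert multi_pass(reads, filters) == single_pass(reads, filters) == ["ACGT", "TTAG"]
```

Both produce identical output, but the single-pass version avoids the repeated traversals (and, in a real pipeline, the repeated file I/O) between steps.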


Figure 1: traditional multi-pass BAM file processing (blue arrows) vs. a single-pass elPrep workflow (orange arrows). Source: Charlotte Herzeel (Imec) and Pascal Costanza (Intel)


Meeting Today’s Standards

Throughout the development of elPrep it was vitally important to ensure compatibility with existing tools and datasets, and this has been achieved. elPrep can be used as a drop-in replacement for existing tools today; for example, it is now the standard tool for exome workloads at Janssen Pharmaceutica. What truly makes elPrep a fantastic tool for the genomics community, though, is its single-pass, extensible filtering architecture. By merging the computations of multiple steps, it avoids repeated file I/O between the preparation steps. Unnecessary barriers are removed, allowing all preparation steps to be executed in parallel.


Reducing Time, Increasing Value

It is worth illustrating the impact of this with figures from a real-life exome sequencing project by Janssen Pharmaceutica, as reported in ‘elPrep: High-Performance Preparation of Sequence Alignment/Map Files for Variant Calling (2015)’.1 The runtime of an exome workload can be reduced from around 1 hour 40 minutes to around 10 to 15 minutes, and the gains tend to increase with the complexity of the pipeline. This is a considerable time (and cost) saving when viewed in the context of mapping and analysing whole-genome data.


elPrep is an important addition to the toolset of those working in the genomics field. I would draw your attention to the previously mentioned paper, which provides extensive detail on the benefits of elPrep and more information on how it compares to existing SAM/BAM manipulation tools. There is more work to be done as we look ahead to all-in-one-day genomic sequencing, but this is an exciting development. elPrep is available as open source for both academic and industrial users at https://github.com/ExaScience/elprep and is being integrated into online genomic toolkits such as DNAnexus.


Contact Robert Sugar on LinkedIn



1 Herzeel C, Costanza P, Decap D, Fostier J, Reumers J (2015) elPrep: High-Performance Preparation of Sequence Alignment/Map Files for Variant Calling. PLoS ONE 10(7): e0132868. doi:10.1371/journal.pone.0132868

Today I presented at BioData World 2015 at the Wellcome Trust Sanger Institute, near Cambridge, UK, on the topic of ‘Advancing the Data Science of Precision Medicine’, so I wanted to be able to share some of my thoughts for those who were unable to attend this fantastic event.


The Unique Challenges of Clinical, Behavioral and Genomic Data

As Chief Data Scientist for Big Data Solutions at Intel I’m really pleased to be able to share how I think big data can help to advance healthcare and life sciences. The first question I’m often asked is: ‘What’s the big data challenge in healthcare?’ In simple terms, we need to do a better job of acquiring, ordering, analyzing and then utilizing the myriad data types generated across the healthcare ecosystem. On the face of it, this is technically very challenging. I like to think of the different types of data as making up what we call the ‘True State of the Patient’: a computable model of the patient for optimizing care delivery, outcomes and revenue. Think clinical data, behavioral data and genomic data, each of which comes with its own unique challenges when it comes to making meaningful sense of it.


Closing the gaps around Clinical Data

Most clinical data sits in Electronic Health Records (EHR) where it’s often assumed that the data is neatly structured and can be analyzed with ease. For the majority of providers that’s not the case just yet. Some of the most important data is sitting in free-form text written by clinicians or in annotated images of scans. There are some great advances being made around Natural Language Processing (NLP) which will help us to make sense of this unstructured data and begin to unlock the value of every EHR.
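To give a flavour of the problem (and nothing more), here is a toy sketch of pulling structured facts out of free-text notes with regular expressions; the note text is invented, and real clinical NLP relies on trained models and curated medical vocabularies rather than patterns like these:

```python
# Toy illustration of extracting structure from free-text clinical notes.
# Real clinical NLP uses trained models and curated vocabularies; this
# regex sketch only hints at the problem space.
import re

note = "Pt presents with BP 142/91, started on lisinopril 10 mg daily."

bp = re.search(r"\bBP\s+(\d{2,3})/(\d{2,3})\b", note)
dose = re.search(r"\b(\w+)\s+(\d+)\s*mg\b", note)

if bp:
    systolic, diastolic = int(bp.group(1)), int(bp.group(2))
    print(f"Blood pressure: {systolic}/{diastolic}")
if dose:
    print(f"Medication: {dose.group(1)}, {dose.group(2)} mg")
```

Even this trivial example shows why unstructured notes hold so much untapped value: the facts are there, but nothing in the EHR schema exposes them.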


Plugging Wearables into the Matrix

Wearable technology is mainstream today: devices are affordable, easy to use, and capture increasingly accurate data. The challenges for wider adoption by the healthcare ecosystem remain around confidence, from both patient and provider. How can a clinician be sure that data has been accurately recorded? Do patients have the know-how to interpret the data recorded, and the self-confidence to present it to a medical professional? Integrating data from wearables (and sensing technology already used by providers) is critically important to achieving that 360-degree view of the patient.


Providing the Platform to bring Lifesaving Discoveries

It’s exciting to see at BioData World this week how organizations are overcoming the technical challenges posed by the enormous volumes of data surfaced when analyzing the genome. Supporting the life sciences through our Big Data Solutions here at Intel means that providers will be able to combine clinical data, behavioral data and genomic data much sooner than we could have imagined even just 5 years ago. For example, take a look at the work we are doing with Oregon Health & Science University (OHSU) on the Collaborative Cancer Cloud, a precision medicine analytics platform that allows hospitals and research institutions to securely share patient genomic, imaging, and clinical data for potentially lifesaving discoveries.


The True State of the Patient is an achievable framework for healthcare across the world. It requires change to workflows, technologies and perhaps new ways of thinking. Conversation at BioData World gives me the confidence that we are on the right path, and we’re heading down that right path at an ever-increasing speed. If you consider the last 5 years to have been an exciting ride then hold tight as the next 5 years are going to bring much more.


Contact Bob Rogers on LinkedIn

