Mobile devices and technology have allowed clinicians to gather patient data at the point of care, access vital information on the go, and untether from traditional wired health IT infrastructures. One hidden benefit of mobile capability is that doctors can gain access to analytics on their own performance.


In the video above, Jeff Zavaleta, MD, chief medical officer at Graphium Health and a practicing anesthesiologist in Dallas, shares his insight on how mobile devices offer a new opportunity for practitioners to self-evaluate, answer the question “How did you do this week?” and see key performance indicators such as their average patient recovery times and on-time appointment starts.
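For readers who want to see the mechanics, below is a minimal sketch of how KPIs like these might be computed from visit records. The field names and numbers are illustrative assumptions, not Graphium Health’s actual schema.

```python
# Hedged sketch: computing two clinician KPIs from assumed visit records.
# "scheduled"/"started" are decimal 24-hour times; all values are made up.
from statistics import mean

visits = [
    {"scheduled": 9.00,  "started": 9.00,  "recovery_minutes": 42},
    {"scheduled": 10.00, "started": 10.25, "recovery_minutes": 55},
    {"scheduled": 11.00, "started": 11.00, "recovery_minutes": 38},
]

on_time = sum(v["started"] <= v["scheduled"] for v in visits) / len(visits)
avg_recovery = mean(v["recovery_minutes"] for v in visits)
print(f"On-time starts: {on_time:.0%}")             # 67%
print(f"Average recovery: {avg_recovery:.0f} min")  # 45 min
```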

 

Watch the short video and let us know what questions you have about the future of mobile health IT and where you think it’s headed. How are you using mobile technology to improve your practice?

 

Also, be on the lookout for new blogs from Dr. Zavaleta, who will be a guest contributor to the Intel Health & Life Sciences Community.

 

The buzz following the mHealth Summit has been encouraging, to say the least. The December event drew 4,000 attendees, who were brought up to speed on the latest developments spanning policies and research, global health, hospital mobility, consumer engagement, privacy and security and, of course, emerging technologies.

 

The two areas of focus that I found most encouraging centered on consumer engagement and care coordination.

 

Far too often, when the industry talks about mobile health, the technology itself – or even just the promise of an emerging technology – has a way of quickly overpowering the dialogue. But as the Center for Connected Health's Joseph Kvedar touched on (and several panels advanced the notion), one of the biggest issues facing healthcare right now is getting and keeping consumers interested in their own care. The success of mobile devices and apps, as well as early consumer interest in wearables, is encouraging because it shows that all the pieces are in place. But until consumers show as much interest in sharing their health information with their doctors as they do in, say, sharing Facebook posts, the healthcare system overall will continue to struggle.

 

Given this present state of consumer engagement, news that care coordination works was all the more welcome.

 

As mHealthNews reported: "In health systems large and small, clinicians are using smartphones to instantly connect with others caring for the same patient. They're sharing notes and tests, discussing treatment plans and, in many cases, bringing the patient and his/her family into the loop to map out a care plan that goes beyond the hospital or clinic. It's a tried-and-true process that's gone beyond the pilot stage, as was noted in Healthcare IT News' Monday morning breakfast panel and several educational sessions. Expect this to become the norm for patient care."

 

Taken together, the growing emphasis on consumer engagement – coupled with the now-proven advantages of care coordination in overcoming the disconnect between physicians and other caregivers – is, in my opinion, highly likely to yield meaningful outcomes.

 

Equally important, as medical groups and health systems begin to make headway with consumer engagement while addressing care coordination holistically, providers should be able to work together to keep patients healthier – while remaining competitive in the marketplace.

 

What questions about mHealth do you have?

 

As a B2B journalist, John Farrell has covered healthcare IT since 1997 and is a sponsored correspondent for Intel Health & Life Sciences.

Read John’s other blog posts

As recently as 10 years ago, healthcare IT was mostly corporate-provisioned, less diverse, and refreshed far more slowly. At that point, usability was treated as a “nice to have” and given significantly lower priority than solution features or functionality.

 

In this more homogeneous and slower-changing environment there was, for the most part, one way to get the job done. Fast forward to today, where most healthcare IT environments are far more heterogeneous, with a myriad of devices, both corporate and personal BYOD (Bring Your Own Device), operating systems, apps, versions, and social media, plus wearables and the Internet of Things growing rapidly. Furthermore, refresh rates are much faster, especially for personal/BYOD devices and apps. In today’s environment, usability is very much a “must have”: research shows that when it is absent, healthcare workers find workarounds, such as using personal devices, and these workarounds drive non-compliance, add security and privacy risk, and are often the source of breaches.

 

Traditionally we have approached usability and security as a tug of war, a tradeoff where more security meant less usability and vice versa.

 


 

Unfortunately, breaches have reached alarming levels in both business impact and likelihood. The total average cost of a data breach in 2014 was US $3.5 million. This average is global and spans several industries, including healthcare. Looking more specifically at healthcare, the global average cost of a data breach per patient record is US $359, the highest across all industries. With costs of this magnitude, avoiding breaches is of paramount importance for healthcare organizations. But how can we add security without compromising usability and inadvertently driving workarounds that actually cause non-compliance and risk?

 

What is desperately needed is security that preserves or even improves usability, where risks are significantly mitigated without driving healthcare workers to workarounds. On the surface this may seem impossible, yet several security safeguards available today do just that. Many breaches occur through the loss or theft of mobile devices. A very good safeguard for mitigating this risk is the self-encrypting SSD (solid-state drive): replace a conventional, unencrypted hard drive, which risks causing a breach if lost or stolen, with an encrypted SSD, and data access is often faster than it was on the original unencrypted drive. Another example of a safeguard that improves both usability and security is MFA (multi-factor authentication) combined with SSO (single sign-on), which makes each login both easier and stronger while reducing the overall number of logins.
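To make the MFA + SSO point concrete, here is a minimal sketch, illustrative only and not production code or any Intel product API: the user authenticates once with a password plus a time-based one-time code (TOTP), then receives a short-lived signed token that participating clinical apps accept, so stronger authentication actually reduces the total number of logins.

```python
# Hedged sketch of MFA (TOTP second factor) + SSO (signed session token).
# Key handling, user lookup, and token verification are omitted on purpose.
import base64, hashlib, hmac, json, struct, time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """RFC 6238 time-based one-time password for the second factor."""
    key = base64.b32decode(secret_b32)
    counter = struct.pack(">Q", int(time.time()) // interval)
    mac = hmac.new(key, counter, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % 10**digits
    return str(code).zfill(digits)

def issue_sso_token(user: str, signing_key: bytes, ttl: int = 8 * 3600) -> str:
    """After password + TOTP succeed, sign one token all apps can trust."""
    payload = json.dumps({"sub": user, "exp": int(time.time()) + ttl}).encode()
    sig = hmac.new(signing_key, payload, hashlib.sha256).hexdigest()
    return base64.urlsafe_b64encode(payload).decode() + "." + sig
```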

 

Intel Security Group is focused on creating innovative security safeguards that vertically integrate security software with security hardware, improving usability while hardening the overall solution to make it more resilient to increasingly sophisticated attacks, such as those from cybercrime. With cloud, mobile, and health information exchange, security becomes like a chain: effective security requires securing every point and avoiding weak links. Intel Security Group solutions span from mobile devices, through networks, to backend servers. This paves the way for healthcare to adopt, embrace, and realize the benefits of new technologies while managing risk and improving usability.

 

What questions about healthcare IT security do you have?

 

For more information on Healthcare Friendly Security see this new whitepaper.

 

David Houlding, MSc, CISSP, CIPP is a Healthcare Privacy and Security lead at Intel Corporation and a frequent blog contributor.


Find him on LinkedIn

Keep up with him on Twitter (@davidhoulding)

Check out his previous posts

There is talk in the medical industry of helping providers practice at the maximum of their licensure. One reason for this is that we don't have enough primary care physicians; we can, in part, address this gap with physician assistants, nurse practitioners, registered nurses, and a myriad of non-traditional team members like pharmacists and health coaches. It so happens that all of these individuals can be more cost-effective than physicians.

 

Medical assistants can do more than escort patients to an exam room and take vital signs. Nurse practitioners have the training and ability to move beyond acute illness diagnosis & treatment to engage in chronic disease management. Collaborative practice agreements allow pharmacists to manage complex patients on complicated medication regimens, assisting the healthcare team with their unique expertise in drug effects and interactions. As for doctors, the highest paid part of that pyramid, how do we make sure they are doing the things that only doctors can do while engaging their team to help with the rest?

 

Elevating the Patient Role

 

There's one team member who is often left out of this conversation – the patient. How do we engage patients at the maximum of their ability? Patients are capable of doing a lot more to manage their health if we would just give them the proper training and tools. By the way, patients are free. We don’t have to pay them to take care of themselves.

 

mHealth is the platform on which healthcare will move forward. What role can and should the users of mHealth technologies play? How do we maximize the impact that each user group can have on the health outcomes we are all working towards? How does everyone practice at the maximum of his or her licensure in an mHealth world?

 

It's important to remember the simple goal we are all working towards. We are trying to help people live healthier lives and trying to do it cost effectively. Patients are indispensable in working towards this goal. Patients have access to themselves all day, every day. They are on the front lines of healthcare, and they don’t cost anything.

 

Merging Patients and mHealth

 

In fact, according to an ONC-funded pilot project at Geisinger Health System, patients help to spot errors such as outdated information and omissions such as medications prescribed by another provider. Personal health records can drive these efforts.

 

  • Patients are eager to provide feedback on their medication list – 30 percent of patient feedback forms were completed and in 89 percent of cases, patients requested changes to their medication record.

 

  • Patient feedback is accurate and useful – on average, patients had 10.7 medications listed, with 2.4 requested changes. In 68 percent of cases, the pharmacist made changes to the medication list in the electronic health record based on the patient’s feedback.

 

ONC officials also write that the Open Notes Project, launched in 2010 by Geisinger, the University of Washington's Harborview Medical Center, Beth Israel Deaconess Medical Center and the Robert Wood Johnson Foundation, “found that patients who were given access to their doctors' notes reported they do better in taking their meds.”

 

If patients are going to become effective team members, we need to maximize their potential. mHealth solutions can help remove barriers by providing effective education, the necessary tools for tracking health and the right connectivity with other members of their healthcare team. This would allow the rest of the team to focus on the aspects of care they are uniquely qualified to address.

 

What questions do you have?

 

Lucienne Ide, co-author of this blog post, is CEO of Rimidi.com, and Justin Barnes is a Managing Director at Justin Barnes Advisors.

The healthcare industry’s digital transformation calls for shifting the burden of care from the system to the patient. Technology is helping to lead this charge, as evidenced by the growing number of patients who are now able to track their own health information as well as generate data that previously was unavailable to physicians and other care providers. With the 2nd Annual Healthcare Cyber Security Summit this month – and the attack vectors targeting the industry having changed over the past couple years – it’s a good time to revisit the topic.

 

Mobile devices, EMRs, HIEs, cloud computing, telemedicine and other technologies are now common to healthcare settings, incrementally delivering on their promise to stretch resources and lower costs. But along with these new capabilities come new threats to patient data and the organizations responsible for managing it. Such threats are reflected in the rise of HIPAA data breaches from 2012-2013, as well as in the increase of state- and corporate-sponsored cyber attacks targeting medical device makers in 2014. As a recent webinar presented by NaviSite pointed out, the emerging Internet of Things (IoT) also raises the stakes for healthcare organizations, as reflected by Europol’s recent warning about the IoT and the FDA’s determination that some 300 medical devices are vulnerable to attack.

 

In April, the FBI issued a sobering notification to healthcare organizations stating that the industry is “…not technically prepared to combat against cyber criminals, basic cyber intrusion tactics, techniques and procedures…” Nor is it ready for some of the more advanced persistent threats facing the industry.

 

It doesn’t help that medical records are considered up to 50 times more valuable on the black market than credit card records.

 

Whether through HIPAA data breaches, malware, phishing emails, sponsored cyber-attacks, or threats surrounding the evolving Internet of Things, the emerging threats in healthcare cannot go unaddressed. Security experts say cyber criminals increasingly are targeting the industry because many healthcare organizations still rely on outdated computer systems lacking the latest security features.

 

With so many mobile and internet-connected devices in healthcare settings, determining how to secure them should be a top priority. That means developing and implementing strategies that treat anti-virus, encryption, file integrity, and data management as first-order requirements.

 

Security experts report that, ultimately, data correlation is the key. What matters for healthcare organizations is having a system in place that enables threat identification, classification, and system analysis, plus a manual review process that offsets human error, so that potential incidents can be assessed with as much certainty as possible.
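As a rough illustration of what data correlation means in practice, here is a small sketch under assumed field names: failed logins that look unremarkable on any single system become a flagged incident when correlated by user across systems within a short window.

```python
# Hedged sketch of cross-system event correlation; the event schema
# ("source", "user", "kind", "ts") and the feed below are invented.
from collections import defaultdict

WINDOW = 300  # correlation window, seconds

def correlate(events):
    """Flag users with failed logins on 2+ systems inside one window."""
    by_user = defaultdict(list)
    for ev in sorted(events, key=lambda e: e["ts"]):
        if ev["kind"] == "login_failure":
            by_user[ev["user"]].append(ev)
    incidents = []
    for user, evs in by_user.items():
        for i, first in enumerate(evs):
            cluster = [e for e in evs[i:] if e["ts"] - first["ts"] <= WINDOW]
            if len({e["source"] for e in cluster}) >= 2:
                incidents.append({"user": user, "events": cluster})
                break
    return incidents

feed = [
    {"source": "vpn", "user": "jdoe", "kind": "login_failure", "ts": 100},
    {"source": "ehr", "user": "jdoe", "kind": "login_failure", "ts": 220},
]
print(correlate(feed))  # one correlated incident for user "jdoe"
```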

 

With this in mind, how is your organization safeguarding against cyber threats? Do you rely on an in-house cybersecurity team, or has your organization partnered with a managed security service provider for this type of service?

Patient data and analytics are vital to the healthcare experience today. To learn more, we recently caught up with Dr. David J. Cook, professor of anesthesiology at Mayo Clinic, who also has an appointment in the engineering section for the Center of the Science of Healthcare Delivery.


Dr. Cook built MC Health Connection, a cloud-based architecture designed to alter care models and improve the patient experience. Using a tablet, patients, family members and physicians can track their progress with recovery following surgery. In the video below, Dr. Cook shares his thoughts on the three elements for changing care models.

 

Intel: How can wearables and big data work together to improve healthcare?

 

Cook: The first element in the evolution of care is in acquiring data from patients in non-intrusive ways that integrate with their daily lifestyles. We need to give patients the opportunity to share insights into their daily health cycles, which would lead to early detection of disease and ultimately improve the quality of their lives.

 

The second element is connecting patient-generated data to a gateway so that it can inform decisions. Data alone is not enough and the clinical care model is not sufficient unless it has useful and actionable patient health data.

 

The third element is connecting that gateway to a healthcare infrastructure that is accessible to both patients and their healthcare providers. These elements are just beginning to work together to create an intelligent healthcare model.
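To make the second element concrete, here is a minimal sketch of a gateway, with assumed names and a placeholder endpoint rather than MC Health Connection’s actual design: a raw device reading is sanity-checked, normalized, and forwarded to a platform that both patients and providers can reach.

```python
# Hedged sketch of a patient-data gateway; PLATFORM_URL and all field
# names are illustrative assumptions, and auth/retries are omitted.
import json
from datetime import datetime, timezone
from urllib import request

PLATFORM_URL = "https://example.com/api/observations"  # placeholder

def normalize(reading: dict) -> dict:
    """Reject implausible values, then attach a UTC timestamp."""
    if not 30 <= reading["heart_rate"] <= 220:
        raise ValueError("implausible heart-rate reading")
    return {
        "patient_id": reading["patient_id"],
        "heart_rate": reading["heart_rate"],
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }

def forward(reading: dict) -> None:
    """POST the normalized observation to the care platform."""
    body = json.dumps(normalize(reading)).encode()
    req = request.Request(PLATFORM_URL, data=body,
                          headers={"Content-Type": "application/json"})
    request.urlopen(req)  # real code would add auth, retries, and TLS checks
```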

 

Intel: What can you imagine for the future of healthcare?

 

Cook: We need to shift our thinking and be ready to participate in healthcare models that empower patients to contribute and engage in their own healthcare. The future is shifting away from a passive delivery model to one that focuses on real-time patient engagement. This is probably the fundamental philosophical and social transition that’s going to occur in healthcare.

 

The way we engage with the world is shifting how we live our daily lives—whether that’s in how we bank, plan our travel or decide where to eat or what to buy. It’s reasonable for patients to expect that we deliver healthcare models that connect to modern technologies that can greatly improve their health and longevity.

 

 

Intel: How have patient needs changed in the past 100 years?

 

Cook: In the past, there was a belief that illnesses were just something that happened to patients. Therefore, the responsibility for patient wellness fell entirely on someone who typically didn’t give much thought to preventative care. Now, that model is certainly suitable for acute appendicitis, or typhoid fever, or getting run over by a wagon, but that psychosocial model doesn’t work for diabetes. It doesn’t work for hypertension. It doesn’t work for obesity, which is among the ailments affecting the majority of the patients that we see today. That transition is incredibly important.

 

Intel: How is big data changing your approach to patient care?

 

Cook: Technology has radically changed the work experience of physicians, and its impact on my own work is extraordinary. I’m an anesthesiologist and I work in cardiac surgery—we get data on multiple physiologic parameters every second. When you have that much data, it begins to add amazing amounts of value.

 

The amount of data that we have now provides a remarkable patient safety net. We can now pull data and identify certain patterns that require immediate physician attention. We didn’t have that in the past. This is a completely transformative way of delivering healthcare.

 

Intel: What keeps you up at night?

 

Cook: What keeps me up at night, more than anything else, is frustration at the slow pace forward. What is needed is so absolutely and clearly evident. Yet there seems to be an effort to reach a large comprehensive platform solution, as opposed to creating a variety of smaller solutions that you can test on a relatively small scale. It feels like every week and every month that goes by there’s this pressing need in the United States and elsewhere for cost-effective healthcare that’s of high quality. The way to that is relatively straightforward, I think.

Home healthcare practitioners need efficient, reliable access to patient information no matter where they go, so they need hardware that meets their unique needs. Accessing critical patient information, managing patient files, multitasking seamlessly, and locating a patient’s residence are daily tasks for mobile healthcare professionals. Mobile practitioners don’t have access to the same resources they would have in a hospital, so the tools they use are that much more critical to accomplishing their workload. Fortunately, advances in mobile computing have created opportunities to bridge that gap.

 

An Evolved Tablet For Healthcare Providers

 

As tablets have evolved, they’ve become viable replacements for clunky laptops. Innovation in the mobile device industry has transformed these devices from media consumption platforms and calendar assistants into robust workhorses that run full-fledged operating systems. However, when it comes to meeting the needs of home healthcare providers, not all tablets are created equal.

                 

A recent Prowess Consulting comparison looked at two popular devices with regard to tasks commonly performed by home healthcare workers. The study compared an Apple® iPad Air™ and a Microsoft® Surface™ Pro 3 to determine which device offers a better experience for home healthcare providers, and ultimately, their patients.

 

Multitasking, Done Right

 

One of the biggest advantages of the Surface™ Pro 3 is its ability to let users multitask. For example, a healthcare worker can simultaneously load and display test results, charts, and prescription history via the device’s split-screen capabilities. A user trying to perform the same tasks on the iPad would run into the device’s limitations; there are no split-screen multitasking options on the iPad Air™.

 

The Surface™ Pro 3’s powerful multitasking, combined with the ability to run Microsoft Office natively, lets home healthcare providers spend more time on patient care and less on administrative tasks. Better user experience, workflow efficiency, file access speed, and split-screen multitasking all point to the Microsoft® Surface™ Pro 3 as the better platform for home healthcare providers.

 

For a full rundown of the Surface™ Pro 3’s benefits to home healthcare workers, click here.

 

What questions about mobile tablets in healthcare do you have?

The growth of mobile healthcare is sometimes staggering to think about. In just a few short years we’ve seen advancements in everything from devices to EHRs to connectivity. While topics such as security, bring-your-own-device, and cloud are ever-present, the technologies that enable these activities are changing all the time.

 

Mobility is a given as today’s healthcare expands beyond institutions into more home-based and community care settings. Mobile technology can also help busy clinicians improve quality of care and the efficiency of care delivery.

 

Next week’s mHealth Summit 2014 in Washington, D.C., promises to be an engaging event that will address the next wave of mobile healthcare. I’ve seen the growth of this event and am excited to hear the sessions and see the latest devices at the exhibition.

 

Intel will be on hand in booth #303 showing off a number of mHealth tools, including Dell mobile devices and Microsoft mobile apps. In addition, our experts will be participating in a variety of informative conference sessions, including:

 

Sunday, 12/7


mHealth Summit Privacy & Security Symposium, 1:45 – 2:30 pm

Risky Business: Mitigating mHealth Workarounds with “Usable” Security

Healthcare security incidents and breaches have reached alarming frequencies and impacts. Better quality and lower cost healthcare depends on minimizing privacy and security risks and incidents. We need patient care *with* security. Intel Privacy & Security Lead David Houlding will participate as a presenter.

 

Monday, 12/8


Luncheon Panel, 12:30 – 2:00 pm

Going mobile is no longer an option. Clinicians realize that to drive improvements in clinical efficiency and patient outcomes, mobility is required to enable care to be delivered anywhere at any time. It takes the right mobile devices, software, security and improved workflows to successfully deploy a mobile health strategy. Windows 8 features the state-of-the-art user experience for touch tablets that clinicians demand, and the manageability and security that IT departments require. Ben Wilson, Director of Mobile Health at Intel Corporation, will lead a panel of the industry's leading healthcare providers in discussing mobile health success stories and why these customers have chosen Windows* 8 as their mobile platform of choice. Speakers: Will Morris, MD, Cleveland Clinic; Bradley Dick, CIO, Resurgens Orthopaedics; and Shiv Rao, MD, Cardiologist, University of Pittsburgh Medical Center.

 

Partnerships for the Future of Population Health, 2:30 – 3:30 pm, National Harbor 10-11

This session will address innovative partnerships or projects that are attempting to develop new standards of care or provide insight into diseases/conditions in specific patient populations through novel collaborations for data sharing or analytics. Matt Quinn from Intel and Lona Vincent, Senior Associate Director of Research Partnerships at the Michael J. Fox Foundation will participate.

 

Public mHealth – Insights on Program Development and Implementation, 3:45 – 4:45 pm, Room Maryland A

Intel’s Matthew Taylor participates in a session that will examine case studies tackling major public health problems, from childhood obesity, sexually transmitted infections, and child feeding habits to determining the training and technology costs for preparing frontline health workers in mHealth programs.

 

Tuesday, 12/9


Future of Global mHealth, Potomac Ballroom, 9:50 am – 10:15 am

What are the challenges and opportunities for leveraging mobile to deliver healthcare in low-resource environments around the world? Can mobile level the playing field for a more equitable healthcare access and distribution of healthcare resources in the future? In this fireside chat, Lester Russell, senior director for health and life sciences for Intel in EMEA, will discuss key issues shaping the future of global mHealth, such as scalability, market opportunities, policy, key technologies, infrastructure, and the role of public-private partnerships.

 

Wednesday, 12/10


Pharma Roundtable, 11:45 am – 4:00 pm, Potomac 1-2

Intel’s Matt Quinn participates in the Second Annual mHealth Summit Pharmaceutical, Pharmacy and Life Sciences Roundtable, which is dedicated to an open exchange of cross-sector insights for advancing outcomes-driven mobile and connected health strategies and reducing barriers to adoption. The Roundtable seeks to identify opportunities for collaboration and commitment to the development of high-impact mobile and connected health initiatives, which foster patient and caregiver involvement, facilitate informed and shared decision making, and demonstrate improvements in treatment, care and outcomes.

 

We look forward to seeing you at mHealth 2014. What questions about mobile healthcare technology do you have?


Bio IT Data: What to Keep?

Posted by jamalloy Dec 2, 2014

 

Based on what we heard at Supercomputing last month, it’s clear that bio IT research is on the fast track and in search of more robust compute power.

 

In the above video, Michael J. Riener, Jr., president of RCH Solutions, talks about dynamic changes coming to the bio IT world in the next 24 months. He says that shrinking research and development budgets mean that more cloud applications and service models will be implemented. When it comes to big data, next-generation sequencing will heighten the need to analyze data, determine what to keep and what to discard, and decide how to process it.

 

Watch the clip and let us know what questions you have. What changes do you want to see in bio IT research?

Frustration with electronic health record (EHR) systems notwithstanding, the data aggregation processes that have grown out of healthcare’s adoption of the electronic health record are now spawning analytical capabilities that were unthinkable just 15 years ago. By leveraging big data to track everything from patient recovery rates to hospital finances, healthcare organizations are capturing and storing data sets that are changing the way doctors, caregivers and payers tackle larger scale health issues.

 

It’s not just happening on the clinical side, either, where EHRs are extending real-time patient information to doctors and predictive analytics are helping physicians to better track and understand their patients' medical conditions.

 

In Kentucky, for example, tech investments by the state’s largest provider systems are estimated at over $600 million, a number that doesn’t even reflect investments from two of the biggest local organizations, Baptist Health and University of Kentucky HealthCare. The data collected by these hospitals includes—and far exceeds—the EMR basics mandated under ARRA, according to an article in The Lane Report.

 

While the goal of improving quality of care is, of course, a key driver of such investments, so is the government mandate tying Medicare and Medicaid reimbursement to outcomes. According to a recent report from McKinsey & Company, more than 50 percent of doctors’ offices and almost 75 percent of hospitals nationwide are managing patient information electronically. So, it’s not surprising that big data is catching the attention of healthcare’s management teams.

 

By quantifying and analyzing an endless variety of metrics—including things like R&D, claims, costs, and insights gleaned from patients—the industry is refining its approach to both preventative care and treatment, and saving money in the process. A good example can be found in the analysis of data surrounding readmission rates, which some hospitals are now using to stave off premature discharges and, by extension, exorbitant penalties.

 

Others, such as Brigham and Women’s Hospital, already are applying algorithms to generate savings beyond readmissions, in areas that include high-cost patients, triage, decompensation, adverse events, and treatment optimization.
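As a toy illustration of the kind of algorithm involved, and emphatically not any hospital’s actual model, a readmission-risk score can be as simple as a logistic regression over a few features; the features and data here are invented.

```python
# Hedged sketch: scoring 30-day readmission risk with logistic regression.
# Toy data; columns are age, prior admissions, and medication count.
import numpy as np
from sklearn.linear_model import LogisticRegression

X = np.array([[54, 0, 3], [71, 2, 9], [63, 1, 6], [48, 0, 2],
              [80, 3, 12], [59, 1, 5], [75, 2, 10], [66, 0, 4]])
y = np.array([0, 1, 0, 0, 1, 0, 1, 0])  # 1 = readmitted within 30 days

model = LogisticRegression().fit(X, y)
new_patient = np.array([[68, 2, 8]])
risk = model.predict_proba(new_patient)[0, 1]
print(f"Estimated 30-day readmission risk: {risk:.2f}")
```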

 

While there’s room to debate the extent to which big data is improving patient outcomes—or the scope of savings attributable to big data initiatives given the associated system costs—the trend toward leveraging data for better outcomes and savings will only continue to grow as CIOs advance meaningful implementations of solutions, and major technology companies continue to expand the industry’s basket of options.

 

How is your healthcare organization applying big data to overcome challenges? Have the results proven worthwhile?

 

As a B2B journalist, John Farrell has covered healthcare IT since 1997 and is a sponsored correspondent for Intel Health & Life Sciences.

Read John’s other blog posts

Clinicians are on the front lines when it comes to using healthcare technology. To get a doctor’s perspective on health IT, we caught up with Dr. Sandhya Pruthi, medical director for patient experience, breast diagnostic clinic, at Mayo Clinic Rochester, for her thoughts on telemedicine and the work she has been undertaking with remote patients in Alaska.

 


 

Intel: How are you involved in virtual care?

 

Pruthi: I have a very personal interest in virtual care. I have been providing telemedicine care to women in Anchorage, Alaska, right here from my telemedicine clinic in Rochester, Minnesota. I have referrals from providers in Anchorage who ask me to meet their patients using virtual telemedicine. We call it our virtual breast clinic, and we’ve been offering the service twice a month for the past three years.

 

Intel: What services do you provide through telemedicine?

 

Pruthi: We know that in some remote parts of the country, it’s hard to get access to experts. What I’ve been able to provide remotely is medical counseling for women who are considered high risk for breast cancer. I remotely counsel them on breast cancer prevention and answer questions about genetic testing for breast cancer when there is a very strong family history. The beauty is that I get to see them and they get to see me, rather than just writing out a note to their provider and saying, “Here’s what I would recommend that the patient do.”

 

Intel: How have patients and providers in Alaska responded to telemedicine?

 

Pruthi: We did a survey and asked patients about their experience and whether they felt they received the care they were expecting when they came to a virtual clinic. The result was 100 percent patient satisfaction. We also surveyed the providers and asked if their needs were met through the referral process. Providers said they were very pleased and would recommend the service to their patients again.

 

Intel: Where would you like to see telemedicine go next?

 

Pruthi: The next level that I would love to see is the ability to go to the remote villages in the state of Alaska, where people have an even harder time coming to a medical center. I’d also like to be able to have a pre-visit with patients who may need to come in for treatment so we can better coordinate their care before they arrive.

 

Intel: When it comes to telemedicine, what keeps you up at night?

 

Pruthi: Thinking about how we can improve the patient experience. I really feel that for a patient who is dealing with an illness, the medical experience should wow them. It should be worthwhile to the patient and it should follow them on their entire journey—when they make their appointment, when they meet with their physician, when they have tests done in the lab, when they undergo procedures. Every step plays a role in how they feel when they go home. That’s what we call patient-centered care.

This guest blog is by Sanchit Misra, Research Scientist, Intel Labs, Parallel Computing Lab, who will be presenting a paper by Intel and Georgia Tech this week at SC14.

 

Did you know that the process of winemaking relies on yeast optimizing itself for survival? When we put yeast in a sugar solution, it turns on genes that produce the enzymes that convert sugar molecules to alcohol. The yeast cell makes a living from this process (by gaining energy to multiply) and humans get wine.

 

This process of turning on a gene is called expression. The genes that an organism can express are all encoded in its DNA. In multi-cellular organisms like humans, the DNA of each cell is the same, but cells in different parts of the body express different genes to perform the corresponding functions. A gene also interacts with several other genes during the execution of a biological process. These interactions, modeled mathematically using “gene networks,” are not only essential to developing a holistic understanding of an organism’s biological processes; they are also invaluable in formulating hypotheses to further the understanding of numerous interesting biological pathways, thus playing a fundamental role in accelerating the pace and diminishing the costs of new biological discoveries. This is the subject of a paper presented at SC14 by Intel Labs and Georgia Tech.

 

Owing to the importance of the problem, numerous mathematical modeling techniques have been developed to learn the structure of gene networks. There appears, not surprisingly, to be a correlation between the quality of learned gene networks and the computational burden imposed by the underlying mathematical models. A gene network based on Bayesian networks is of very high quality but requires a lot of computation to construct. To understand Bayesian networks, consider the following example.

 

A patient visits a doctor for diagnosis, presenting symptoms A, B and C. The doctor says there is a high probability that the patient is suffering from ailment X or Y and recommends further tests to zero in on one of them. What the doctor does is an example of probabilistic inference, in which the probability that a variable has a certain value is estimated based on the values of other related variables. Inference based on Bayes’ laws of probability is called Bayesian inference. The relationships between variables can be stored in the form of a Bayesian network. Bayesian networks are used in a wide range of fields, including science, engineering, philosophy, medicine, law and finance. In the case of gene networks, the variables are genes, and the Bayesian network models, for each gene, which other genes are related to it and the probability of its expression given the expression values of those related genes.
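To make the doctor example concrete, here is a toy Bayesian inference computation; the priors and likelihoods are invented numbers, not clinical data.

```python
# Hedged sketch of Bayes' rule: P(ailment | symptoms) is proportional to
# P(symptoms | ailment) * P(ailment); the numbers below are made up.
priors = {"X": 0.02, "Y": 0.01, "neither": 0.97}
likelihood = {"X": 0.60, "Y": 0.45, "neither": 0.001}  # P(symptoms | ailment)

unnormalized = {a: priors[a] * likelihood[a] for a in priors}
total = sum(unnormalized.values())
posterior = {a: p / total for a, p in unnormalized.items()}

for ailment, p in sorted(posterior.items(), key=lambda kv: -kv[1]):
    print(f"P({ailment} | symptoms A, B, C) = {p:.3f}")
# The rare-but-symptom-matching ailments X and Y dominate the posterior,
# which is why the doctor orders further tests to separate them.
```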

 

Through a collaboration between Intel Labs’ Parallel Computing Lab and researchers at Georgia Tech and IIT Bombay, we now have the first ever genome-scale approach for construction of gene networks using Bayesian network structure learning. We have demonstrated this capability by constructing the whole-genome network of the plant Arabidopsis thaliana from over 168.5 million gene expression values by computing a mathematical function 7.3 trillion times with different inputs. For this, we collected a total of 11,760 Arabidopsis gene expression datasets (from NASC, AtGenExpress and GEO public repositories). A problem of this scale would have consumed about six months using the state-of-the-art solution. We can now solve the same problem in less than 3 minutes!

 

To achieve this, we not only scaled the problem to a much bigger machine – 1.5 million cores of the Tianhe-2 supercomputer with 28 PFLOP/s peak performance – but also applied algorithm-level innovations, including avoiding redundant computation, a novel parallel work decomposition technique, and dynamic task distribution. We also made implementation optimizations to extract maximum performance from the underlying machine.
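As a single-node illustration of the dynamic task distribution idea (the actual scheduler in the paper runs across 1.5 million cores and is far more involved), here is a sketch in which workers pull new tasks the moment they finish, instead of receiving a fixed static share.

```python
# Hedged sketch of dynamic task distribution; score_family is a stand-in
# for the function the paper evaluates trillions of times.
from multiprocessing import Pool

def score_family(task):
    gene, parents = task
    return gene, sum(hash((gene, p)) % 97 for p in parents)  # dummy work

if __name__ == "__main__":
    tasks = [(g, tuple(range(g % 5))) for g in range(1000)]
    with Pool() as pool:
        # imap_unordered hands out tasks dynamically, so fast workers
        # immediately pick up more work instead of idling.
        results = list(pool.imap_unordered(score_family, tasks, chunksize=16))
    print(len(results), "families scored")
```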

 

[Figures omitted: (Top) Root Development subnetwork; (Bottom) Cold Stress subnetwork]

 

Using our software, we generated gene regulatory networks for several datasets – subsets of the Arabidopsis dataset – and validated them against prior knowledge from the TAIR (The Arabidopsis Information Resource) database. As a demonstration of their validity and of how genome-scale networks can aid biological research, we conducted the following experiment. We picked the genes known to be involved in root development and cold stress and randomly selected a subset of those genes (red nodes in the figures above). We took the whole-genome network generated by our software for Arabidopsis and extracted the subnetworks containing our randomly picked subset of genes and all the other genes connected to them. The extracted subnetworks contain a rich presence of other genes known to be in the respective pathways (green nodes) and in closely associated pathways (blue nodes), serving as a validation test. The nodes shown in yellow are genes with no known function. Their presence in the root development subnetwork indicates they might function in the same pathway. The biologists at Georgia Tech are performing experiments to see whether the genes corresponding to yellow nodes are indeed involved in root development. Similar experiments are being conducted for several other biological processes.

 

Arabidopsis is a model plant for which the NSF launched a 10-year initiative in 2000 to find the functions of all of its genes, yet the functions of 40 percent of its genes are still not known. This method can help accelerate the discovery of the functions of the remaining genes. Moreover, it can easily be scaled to other species, including humans. Understanding how genes function and interact with each other across a broad variety of organisms can pave the way for new medicines and treatments. We can also compare gene networks across organisms to enhance our understanding of the similarities and differences between them, ultimately aiding a deeper understanding of evolution.

 

What questions do you have?

 

With SC14 kicking off today, it’s timely to look at how high performance computing (HPC) is impacting today’s valuable life sciences research. In the above podcast, Dr. Rudy Tanzi, the Joseph P. and Rose F. Kennedy Professor of Neurology at Harvard Medical School and the Director, Genetics and Aging Research Unit at the MassGeneral Institute for Neurodegenerative Disease, talks about his pioneering research in Alzheimer’s disease and how HPC is critical to the path forward.

 

Listen to the conversation and hear how Dr. Tanzi says HPC still has a ways to go to provide the compute power that life sciences researchers need. What do you think?

 

What questions about HPC do you have? 

 

If you’re at SC14, remember to come by the Intel booth (#1315) for life sciences presentations in the Intel Community Hub and Intel Theater. See the schedules here.

As SC14 approaches, we have invited industry experts to share their views on high performance computing and life sciences. Below is a guest post from Mikael Flensborg, Director, Global Partner Relations at CLC bio, a Qiagen Company. During SC14, Mikael will be sharing his thoughts on genomic and cancer research in the Intel booth (#1315). He is scheduled in the Intel Community Hub on Tuesday, Nov. 18, at 3 p.m. and Wednesday, Nov. 19, at 3 p.m., plus in the Intel Theater on Tuesday at 2:30 p.m.

 

Eight months have now passed since Illumina announced the long-expected arrival of the $1,000 genome with the launch of the HiSeq X Ten sequencing instrument, heralded as the start of a new era in high-throughput sequencing focused on a new wave of population-level genomic studies.

 

To keep costs down to the “magic” $1,000 level, a full HiSeq X Ten installation must plow through 18,000 full human genomes per year, which means completing a full run roughly every 32 minutes. With such high volume in focus, the next very important question arises:

 

What does it take to keep up with such a high throughput on the data analysis side?

 

According to Illumina’s “HiSeq X Ten Lab Setup and Site Prep Guide (15050093 E)”, the requirements for data analysis are specified to be a compute cluster with 134 compute nodes (16 CPU cores @ 2.0 GHz, 128 GB of memory, 6 x 1 terabyte (TB) hard drives) based on an analysis pipeline consisting of the tools BWA+GATK.

 

At QIAGEN Bioinformatics we decided to take on the challenge of benchmarking this, using a workflow of tools (Trim, QC for Sequencing Reads, Read Mapping to Reference, Indels and Structural Variants, Local Re-alignment, Low Frequency Variant Detection, QC for Read Mapping) on CLC Genomics Server (http://www.clcbio.com/products/clc-genomics-server/) running on a compute cluster with the Intel® Enterprise Edition for Lustre* filesystem, InfiniBand, Intel® Xeon® Processor E5-2697 v3 @ 2.60GHz with 14 CPU cores, 64 GB of memory, and Intel® SSD DC S3500 Series 800 GB drives.

 

We based our tests on a publicly available HiSeq X Ten dataset and reached the conclusion that, with these specifications, we can follow the pace of the instrument with a compute cluster of only 61 compute nodes.

 

Given our much lower compute node needs, these results can have a significant positive impact on the total cost of ownership of the compute infrastructure for a HiSeq X Ten customer, which includes hardware, cooling, space, power, and systems maintenance to name a few variable costs.
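As a back-of-the-envelope sanity check on these figures, our rough arithmetic rather than vendor specifications:

```python
# Implied per-genome pace at 18,000 genomes/year and 100% duty cycle;
# the ~32-minute figure quoted above presumably allows for overhead.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600
print(MINUTES_PER_YEAR / 18_000)  # ≈ 29.2 minutes per genome

# Node-count impact of the benchmark versus Illumina's specification.
illumina_nodes, benchmark_nodes = 134, 61
print(f"{1 - benchmark_nodes / illumina_nodes:.0%} fewer compute nodes")  # ≈ 54%
```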

 

What questions do you have?

As SC14 approaches, we have invited industry experts to share their views on high performance computing and life sciences. Below is a guest post from Eldon M. Walker, Ph.D., Director, Research Computing at Cleveland Clinic's Lerner Research Institute. During SC14, Eldon will be sharing his thoughts on implementing a high performance computing cluster at the Intel booth (#1315) on Tuesday, Nov. 18, at 10:15 a.m. in the Intel Theater.


When data analyses grind to a halt due to insufficient processing capacity, scientists cannot be competitive. When we hit that wall at the Cleveland Clinic Lerner Research Institute, my team began consideration of the components of a solution, the cornerstone of which was a high performance computing (HPC) deployment.

 

In the past 20 years, the Cleveland Clinic Lerner Research Institute has progressed from a model of wet lab biomedical research that produced modest amounts of data to a scientific data acquisition and analysis environment that puts profound demands on information technology resources. This manifests as the need for the availability of two infrastructure components designed specifically to serve biomedical researchers operating on large amounts of unstructured data:

 

  1. A storage architecture capable of holding the data in a robust way
  2. Sufficient processing horsepower to enable the data analyses required by investigators

 

Deployment of these resources assumes the availability of:

 

  1. A data center capable of housing power- and cooling-hungry hardware
  2. Network resources capable of moving large amounts of data quickly

 

These components were available at the Cleveland Clinic in the form of a modern, Tier 3 data center and ubiquitous 10 Gb/sec and 1 Gb/sec network service.

 

The storage problem was brought under control by way of a 1.2-petabyte grid storage system in the data center that replicated to a second 1.2-petabyte system in the Lerner Research Institute server room facility. The ability to store and protect the data was the required first step in maintaining the fundamental capital (data) of our research enterprise.

 

It was equally clear to us that the types of analyses required to turn the data into scientific results had overrun the capacity of even high-end desktop workstations and single-unit servers of up to four processors. Analyses simply could not be run, or would run too slowly to be practical. We had an immediate unmet need in several data processing scenarios:

 

  1. DNA sequence analysis
    • Whole-genome sequencing (DNA methylation)
    • ChIP-seq data (protein–DNA interactions)
    • RNA-seq data (alternative RNA processing studies)
  2. Finite element analysis
    • Biomedical engineering modeling of the knee, ankle and shoulder
  3. Natural language processing
    • Analysis of free-text electronic health record notes

 

There was absolutely no question that an HPC cluster was the proper way to provide the necessary horsepower that would allow our investigators to be competitive in producing publishable, actionable scientific results. While a few processing needs could be met using offsite systems where we had collaborative arrangements, an internal resource was appropriate for several reasons:

 

  1. Some data analyses operated on huge datasets that were impractical to transport between locations.
  2. Some data must stay inside the security perimeter.
  3. Development of techniques and pipelines would depend on the help of outside systems administrators and change control processes that we found cumbersome; the sheer flexibility of an internal resource built with responsive industry partners was very compelling based on considerable experience attempting to leverage outside resources.
  4. Given that we had the data center, network and system administration resources, and given the modest price-point, commodity nature of much of the HPC hardware (as revealed by our due diligence process), the economics of obtaining an HPC cluster were practical.

 

Given the realities we faced and after a period of consultation with vendors, we embarked on a system design in collaboration with Dell and Intel. The definitive proof of concept derived from the initial roll out of our HPC solution is that we can run analyses that were impractical or impossible previously.

 

What questions do you have? Are you at the point of considering an internal HPC cluster?
