As SC14 approaches, we have invited industry experts to share their views on high performance computing and life sciences. Below is a guest post from Karl D’Souza, senior user experience specialist at Dassault Systèmes Simulia Corp. Karl will be speaking about the Living Heart Project noted below during SC14 at the Intel booth (#1315) on Wednesday, Nov. 19, at 12:15 p.m. in the Intel Theater and 1 p.m. in the Intel Community Hub.


Computer Aided Engineering (CAE) has become pervasive in the design and manufacture of everything from jumbo jets to razor blades, transforming the product development process to produce more efficient, cost-effective, safe, and easy-to-use products. A central component of CAE is the ability to realistically simulate the physical behavior of a product in real-world scenarios, which greatly facilitates understanding and innovation.

 

Applying this advanced technology to healthcare has profound implications for society, promising to transform the practice of medicine from observation-driven to understanding-driven. However, the lack of definitive models, processes, and standards has limited its application, and development has remained fragmented across research organizations around the world.

 

In January 2014, Dassault Systèmes took the first step to change this and launched the “Living Heart Project,” a translational initiative that partners with cardiologists, researchers, and device manufacturers to develop a definitive, realistic simulation of the human heart. Through this accelerated approach, the first commercial model-centric, application-agnostic, multiphysics whole-heart simulation has been produced.

 

Since cardiovascular disease is the leading cause of morbidity and mortality across the globe, Dassault Systèmes saw the Living Heart Project as the best way to address the problem. Although there is a plethora of medical devices, drugs, and interventions, physicians face the problem of determining which device, drug, or intervention to use on which patient. Often, invasive procedures are needed to truly understand what is going on inside a patient.

 

CAE and the Living Heart Project will enable cardiologists to take an image (MRI, CT, etc.) of a patient’s heart and reconstruct it as a 3D model, creating a much more personalized form of healthcare. The doctor can see exactly what is happening in the patient’s heart and make a more informed decision about how to treat that patient most effectively.

 

If you will be at SC14 next week, I invite you to join me when I present an overview of the project, the model, results, and implications for personalized healthcare. Come by the Intel booth (1315) on Wednesday, Nov. 19, for a presentation at 12:15 p.m. in the Intel Theater immediately followed by a Community Hub discussion at 1 p.m.

 

What questions do you have about computer aided engineering?

As SC14 approaches next week, we have invited industry experts to share their views on high performance computing and life sciences. Below is a guest post from Charles Shiflett, senior software engineer at Aspera. Charles will be sharing his thoughts on ultra-high-speed WAN transfers during SC14 at the Intel booth (#1315) on Tuesday, Nov. 18, at 12:30 p.m. in the Intel Theater and on Wednesday at 2 p.m. in the Intel Community Hub.

 

Research is running into a problem: the amount of data being generated is growing faster than it can be analyzed and stored using traditional tools and architectures. This has led to an explosion of technologies and tools for processing data that take advantage of the parallelism inherent in compute clusters and cloud environments. What hasn't improved at a commensurate rate is storage and network throughput.

 

The reason storage and network throughput haven't improved is, in part, complacency. In our traditional computing model, people think of the internet (or network) as slow, disks as somewhat faster, and memory as screaming fast. This is completely wrong in an HPC environment, but it is the world we are stuck in when we go through traditional interfaces that were designed with this paradigm in mind. Achieving breakthrough performance requires a new approach.

 

Using commodity Intel® hardware, we were able to develop a novel solution (termed next-generation Aspera FASP) that bypasses traditional kernel layers in storage and I/O. While this solution is still under development, we have already been able to show over 90 percent utilization of a transfer solution using two 40 Gbit/s network cards, for a total of 80 Gbit/s (see photo above). This equates to disk-to-disk performance of about 8 GB/s, or the ability to transfer 1 TB of data in about two minutes. At SC14, Aspera will be showcasing next-generation FASP in operation over a WAN environment, where we will transfer 250 GB of data in about 20 seconds from the show floor to Chicago and back.
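
The arithmetic behind those figures is worth spelling out. The sketch below (a hypothetical Python back-of-the-envelope calculation, not part of FASP itself) converts link speed and utilization into an effective byte rate and a transfer time:

    # Back-of-the-envelope math for the figures quoted above. Assumes 8 bits
    # per byte and folds all protocol overhead into the utilization factor,
    # so these are estimates, not measurements.

    def effective_gbytes_per_sec(link_gbits: float, utilization: float) -> float:
        """Effective end-to-end throughput in GB/s for a given link speed."""
        return link_gbits * utilization / 8.0

    def transfer_seconds(payload_gbytes: float, rate_gbytes_per_sec: float) -> float:
        """Seconds needed to move a payload at the given effective rate."""
        return payload_gbytes / rate_gbytes_per_sec

    # Two 40 Gbit/s NICs at roughly 90 percent utilization.
    print(effective_gbytes_per_sec(80.0, 0.9))    # ~9 GB/s (the post cites ~8 GB/s disk to disk)
    print(transfer_seconds(1000.0, 8.0) / 60.0)   # ~2.1 minutes for 1 TB at 8 GB/s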

 

As the developer of this solution, what excites me most is the benefit our customers will get from a high-speed, block-based transfer solution that not only solves WAN transfer needs but does so in a way that is secure (every packet is encrypted with AES-128), runs on commodity Intel hardware, and is equally applicable in both LAN and WAN environments. Our future plans are to provide this solution in a way that integrates with high-performance compute packages (such as Spark or Hadoop) and high-performance storage (think Lustre or GPFS), while continuing to build upon the Intel and IBM Aspera technologies that have made this solution possible.

 

What questions do you have?

As SC14 approaches, we have invited industry experts to share their views on high performance computing and life sciences. Below is a guest post from Ari E. Berman, Ph.D., Director of Government Services and Principal Investigator at BioTeam, Inc. Ari will be sharing his thoughts on high performance infrastructure and high speed data transfer during SC14 at the Intel booth (#1315) on Wednesday, Nov. 19, at 2 p.m. in the Intel Community Hub and at 3 p.m. in the Intel Theater.


There is a ton of hype these days about Big Data, both about what the term actually means and about what the implications are for reaching the point of discovery in all that data.

 

The biggest issue right now is the computational infrastructure needed to get to that mythical Big Data discovery place everyone talks about. Personally, I hate the term Big Data. The term “big” is subjective and in the eye of the beholder: it might mean 3 PB (petabytes) of data to one person, or 10 GB (gigabytes) to someone else.

 

From my perspective, the thing that everyone is really talking about with Big Data is the ability to take the sum total of data that’s out there for any particular subject, pool it together, and perform a meta-analysis on it to more accurately create a model that can lead to some cool discovery that could change the way we understand some topic. Those meta-analyses are truly difficult and, when you’re talking about petascale data, require serious amounts of computational infrastructure that is tuned and optimized (also known as converged) for your data workflows. Without properly converged infrastructure, most people will spend all of their time just figuring out how to store and process the data, without ever reaching any conclusions.

 

Which brings us to life sciences. Until recently, life sciences and biomedical research could really be done using Excel and simple computational algorithms. Laboratory instrumentation didn’t create that much data at a time, and it could be managed with simple, desktop-class computers and everyday computational methods. Sure, the occasional group created enough data to require mathematical modeling, advanced statistical analysis, or even some HPC, and molecular simulations have always required a lot of computational power. But in the last decade or so, the pace of advancement of laboratory equipment has left a large swath of biomedical research scientists overwhelmed by the amount of data being produced.

 

The decreased cost and increased speed of laboratory equipment, such as next-generation sequencers (NGS) and high-throughput, high-resolution imaging systems, has forced researchers to become very computationally savvy very quickly. It now takes rather sophisticated HPC resources, parallel storage systems, and ultra-high-speed networks to process the analytics workflows in life sciences. And, to complicate matters, these newer laboratory techniques are paving the way toward the realization of personalized medicine, which carries the same computational burden combined with tight and highly subjective federal restrictions surrounding the privacy of personal health information (PHI). Overcoming these challenges has been difficult, but very innovative organizations have begun to do just that.

 

I thought it might be useful to very briefly discuss the three major trends we see having a positive effect on life sciences research:

 

1. Science DMZs: There is a rather new movement toward specialized research-only networks that prioritize fast and efficient data flow (while still maintaining security), also known as the Science DMZ model (http://fasterdata.es.net). These implementations make it easier for scientists to get around tight enterprise networking restrictions without violating their organizations’ security policies, so they can move their data effectively without antagonizing their compliance officers.


2. Hybrid Compute/Storage Models: There is a huge push toward cloud-based infrastructure, but organizations are realizing that too much persistent cloud infrastructure can be more costly in the long term than local compute. The answer is small local compute infrastructures that handle the really hard problems and the persistent services, hybridized with public cloud infrastructures that are orchestrated to be brought up automatically when needed and torn down when not, all managed by a software layer that sits in front of the back-end systems (a rough sketch of this burst model follows this list). This model looks promising as the most cost-effective and flexible approach, balancing local hardware lifecycle and support-personnel issues against the dynamic needs of scientists.


3. Commodity HPC/Storage: The biggest trend in life sciences research is the push toward low-cost, commodity, white-box infrastructures for research needs. Life sciences has not (for the most part) reached the sophistication level that requires true capability supercomputing; thus, well-engineered capacity systems built from white-box vendors provide very effective computational and storage platforms for scientists to use in their research. This approach carries a higher support burden for the organization, because many of these systems don’t come pre-built or supported and thus require in-house expertise that can be hard to find and expensive to retain. But the cost balance of support versus lifecycle management is worth it to most organizations.
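
To make the hybrid model in point 2 a little more concrete, here is a minimal, hypothetical sketch of the kind of decision such a management layer makes: run on local capacity while it is free, provision cloud nodes when the backlog grows past a threshold, and tear them down when it clears. The names and thresholds are invented for illustration.

    # Hypothetical burst-to-cloud policy for a hybrid local/cloud setup.
    # Names and thresholds are invented; a real orchestration layer would
    # talk to a scheduler and a cloud provider API at the points noted.

    from dataclasses import dataclass

    @dataclass
    class QueueState:
        local_free_cores: int     # idle cores on the local cluster
        queued_core_hours: float  # work waiting to run

    def cloud_nodes_needed(state: QueueState,
                           burst_threshold: float = 500.0,
                           core_hours_per_node: float = 320.0) -> int:
        """How many cloud nodes to keep running for the current backlog."""
        if state.local_free_cores > 0 or state.queued_core_hours < burst_threshold:
            return 0  # backlog fits locally: tear down any cloud capacity
        # Otherwise size cloud capacity to the backlog (provision/teardown
        # calls to the cloud API would go here).
        return max(1, round(state.queued_core_hours / core_hours_per_node))

    print(cloud_nodes_needed(QueueState(local_free_cores=0, queued_core_hours=12_000.0)))  # 38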

 

Biomedical scientific research is the latest in a string of scientific disciplines requiring very creative solutions to their data-generation problems. We are at the stage now where most researchers spend a lot of their time just trying to figure out what to do with their data in the first place, rather than getting answers. However, I feel the field is at an inflection point where discovery will start pouring out as very powerful commodity systems and reference architectures come to bear on the market. The key for life sciences HPC is the balance between effectiveness and affordability, due to a significant lack of funding in the space right now, which is likely to get worse before it gets better. But scientists are resourceful and persistent; they will usually find a way to discover, because they are driven to improve the quality of life for humankind and to make personalized medicine a reality in the 21st century.

 

What questions about HPC do you have?

What better place to talk life sciences big data than the Big Easy? As temperatures are cooling down this month, things are heating up in New Orleans where Intel is hosting talks on life sciences and HPC next week at SC14. It’s all happening in the Intel Community Hub, Booth #1315, so swing on by and hear about these topics from industry thought leaders:

 

Think big: delve deeper into the world’s biggest bioinformatics platform. Join us for a talk on the CLC bio enterprise platform, and learn how it integrates desktop interfaces with high performance cluster resources. We’ll also discuss hardware and explore the scalability requirements needed to keep pace with the Illumina HiSeq X Ten sequencing platform and with a production cluster environment based on the Intel® Xeon® processor E5-2600 v3. When: Nov. 18, 3-4 p.m.

 

Special Guests:

Lasse Lorenzen, Head of Platform & Infrastructure, Qiagen Bioinformatics;

Shawn Prince, Field Application Scientist, Qiagen Bioinformatics;

Mikael Flensborg, Director Global Partner Relations, Qiagen Bioinformatics

 

Find out how HPC is pumping new life into the Living Heart Project. Simulating diseased states, and personalizing medical treatments, requires significant computing power. Join us for the latest updates on the Living Heart Project, and learn how creating realistic multiphysics models of human hearts can lead to groundbreaking approaches to both preventing and treating cardiovascular disease. When: Nov. 19, 1-2 p.m.

 

Special Guest: Karl D’Souza, Business Development, SIMULIA Asia-Pacific

 

Get in sync with scientific research data sharing and interoperability. In 1989, the quest for global scientific collaboration helped lead to the birth of what we now call the World Wide Web. In this talk, Aspera and BioTeam will discuss where we are today with new advances in global scientific data collaboration. Join them for an open discussion exploring the newest offerings for high-speed data transfer across scientific research environments. When: Nov. 19, 2-3 p.m.

 

Special Guests:

Ari E. Berman, PhD, Director of Government Services and Principal Investigator, BioTeam;

Aaron Gardner, Senior Scientific Consultant, BioTeam;

Charles Shiflett, Software Engineer, Aspera

 

Put cancer research into warp speed with new informatics technology. Take a peek under the hood of the world’s first comprehensive, user-friendly, and customizable cancer-focused informatics solution. The team from Qiagen Bioinformatics will lead a discussion on CLC Cancer Research Workbench, a new offering for the CLC Bio Cancer Genomics Research Platform. When: Nov. 19, 3-4 p.m.

 

Special Guests:

Shawn Prince, Field Application Scientist, Qiagen Bioinformatics;

Mikael Flensborg, Director Global Partner Relations, Qiagen Bioinformatics

 

You can see more Intel activities planned for SC14 here.

 

What are you looking forward to seeing at SC14 next week?

For the past three years, we have been tracking the effectiveness of sales professionals using mobile technology as their main means of information delivery to healthcare professionals (HCPs), and in particular to doctors. As previously mentioned, the use of mobile devices has been variable, with many sales professionals using them in the same way they were using paper materials.

 

We have data showing that where mobile devices are used effectively, doctors rate the sales professionals’ performance higher across multiple key performance indicators (KPIs) than when paper alone, or no materials, are used in support of key messages. Not only does the mobile device enhance the delivery of information; there is also increasing evidence that using mobile technology increases the likelihood of altering HCP behaviors.

 

Most of the pharmaceutical companies we have tracked still use a combination of paper and mobile devices. We have seen the best and most efficient use of the mobile device when the sales professional is able to use it to open the call and then navigate to the most appropriate information for that particular HCP. We have data on a number of specialist sales teams indicating that in calls lasting less than five minutes no supporting materials, including mobile devices, were used in any interaction with an HCP.

 

Another advantage of mobile devices comes when closing a call: the sales professional can immediately email any supporting documents directly to the HCP. Our extensive research with HCPs shows they expect that ability when mobile devices are used, and when it is offered, a very positive impression is made.

 

Additionally, the opportunity for the HCP to order and sign for samples at the time of the interaction is, in the eyes of the busy HCP, critical. The positive comments we have received from many HCPs on sales professionals’ use of mobile devices indicate acceptance of this technology, an enhanced experience, and changes in behavior.

 

When asked specifically how the mobile device would be best used as a means of information delivery, we received the following advice and comments (2013 data from over 1,500 HCPs representing 15 specialties):

 

  • The mobile device is best used for short, one-minute presentations focusing on main points
  • The mobile device should be used to display medical information in a structured format to save time
  • The mobile device is the ideal tool for one-on-one education
  • The mobile device should be used as a visual aid to get a point across or to educate
  • The mobile device should be used to show videos pertinent to a detail, such as mechanism of action of a drug or how to administer a medicine
  • The mobile device should be used to drill down quickly on topics of special interest, such as dosing in renal failure or drug interactions
  • The mobile device should be used for multimedia or interactive presentations

 

Verbatim Comments

  • “Chance for more information through easier links (as opposed to rummaging through a bag of papers)”
  • “Easier for sales professional to present information, less waste of my time”
  • “It does not leave large volumes of materials behind at our office. Also requires the sales professional to be more to the point with a few slides as opposed to a lengthy paper document.”
  • “Easy to use and navigate the information, easy to sign for samples”
  • “Demonstrations and ease of visualization of material presented”

 

Unfortunately, there are also negative aspects to the use of mobile devices, from the design of apps to the lack of e-licenses for clinical papers and reprints. Another area where mobile technology seems to fall short is when it comes to reimbursement, patient assistance programs and managed care issues. We will discuss this in more depth in future blogs.

 

What questions do you have?

 

As consumers demonstrate a growing interest in generating their own vital signs and wellness data via apps and wearable devices, healthcare systems need to deal with the influx of information and set a strategy for how to analyze it for better outcomes.

 

In the above video, Frederick Holston, executive director and chief technology officer at the Intermountain Healthcare Transformation Lab, talks with Eric Dishman about how healthcare providers can learn to prepare for consumer-generated data, how to trust the data, and how to get data in the hands of physicians to utilize in care plans.

 

Watch the short video and let us know what questions you have about the future of healthcare technology and where you think it’s headed.

With the increasing variety, volume, and velocity of sensitive patient data, healthcare organizations are increasingly challenged to comply with regulations and data protection laws and to avoid breaches. The total average cost of a data breach reached US $5.9M in the United States (2014 Ponemon Cost of a Data Breach study), representing an average of $316 per patient record. The prospect of random audits to enforce compliance with regulations, such as the OCR HIPAA privacy, security, and breach notification audits, continues to loom large.

 

Understanding what sensitive data you have is an absolute prerequisite to securing your organization, and it has never been more important. Only with an accurate understanding of what sensitive data is at rest, in use, and in transit can a healthcare organization successfully secure itself. If a healthcare data inventory misses some sensitive data, that data can go unsecured and lead to a security incident such as a breach, or to a finding of non-compliance with regulations or data protection laws in the event of an audit.

 

Ten years ago, healthcare environments were more homogeneous, with fewer types of clients, mostly corporate-provisioned and more uniform, and with a slower refresh rate. Software used by healthcare workers was also mostly corporate-provisioned, leading to a more consistent, less diverse, and more slowly changing IT environment. In that more homogeneous, slower-changing environment, an annual manual data inventory may have been sufficient, with a security and privacy team working with documentation, IT management tools, and healthcare workers to conduct inventories.

 

Today, most healthcare organizations are much more heterogeneous, with a mix of clients or endpoints: smartphones, tablets, laptops, wearables, and Internet of Things devices. Furthermore, healthcare networks today are a mix of personal, BYOD, and corporate-provisioned devices, and have a faster refresh rate, especially for personal and BYOD devices such as smartphones, which are often upgraded within two years or less. Exacerbating this diversity is a myriad of operating systems, versions, apps, and online services, including social media, that are collecting, using, and storing new types of sensitive data and moving it over the network in new ways. The bottom line is that healthcare organizations face a major challenge in tracking all the sensitive data they have at rest, in use, and in transit. Given these challenges, a conventional annual data inventory is generally not sufficient.

 

Today, it is critical for healthcare organizations to understand what sensitive data they have on their networks in near real time. Once a healthcare organization identifies new unprotected sensitive data on its network, it can proactively initiate remediation, which can include:

 

  1. Deleting sensitive data from an unsecured location,
  2. Encrypting sensitive data in place,
  3. Moving sensitive data from an unsecured location to somewhere more secure, and
  4. Educating healthcare workers on preferred alternatives to avoid future non-compliance and privacy and security risks.

 

Data Loss Prevention (DLP) is a mature security safeguard that includes the ability to discover sensitive data at rest and in transit. With the rapidly increasing diversity of healthcare IT environments and the variety of sensitive data they collect, use, store, and move, the value proposition of DLP, and in particular its ability to discover sensitive healthcare information, has never been greater. It provides a key safeguard to supplement other data inventory initiatives within a modern healthcare organization. Intel Security Group provides network and endpoint DLP solutions that include this discovery capability. Furthermore, these can be vertically integrated with Intel hardware-assisted security, including AES-NI for hardware-accelerated encryption (Data Loss Prevention Best Practices for Healthcare). An effective near real-time inventory of sensitive data, combined with a proactive approach to securing any unsecured sensitive data, enables healthcare organizations to embrace and realize the benefits of new technologies while keeping privacy, security, and non-compliance risks manageable.
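
As a simplified illustration of what discovering sensitive data at rest involves (a generic sketch, not the API of any particular DLP product), consider a scan that walks a file share and flags files containing PHI-like patterns such as Social Security numbers or medical record numbers. Real DLP engines use far richer detection than these two regular expressions.

    # Minimal illustration of data-at-rest discovery: walk a directory tree
    # and flag text files containing PHI-like patterns. Real DLP products use
    # far richer detection (dictionaries, fingerprints, exact data matching).

    import os
    import re

    PHI_PATTERNS = {
        "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),        # e.g. 123-45-6789
        "mrn": re.compile(r"\bMRN[:\s]*\d{6,10}\b", re.I),  # hypothetical MRN format
    }

    def scan_for_phi(root: str):
        """Yield (path, pattern_name) for every file that matches a PHI pattern."""
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                path = os.path.join(dirpath, name)
                try:
                    with open(path, "r", errors="ignore") as fh:
                        text = fh.read()
                except OSError:
                    continue  # unreadable file: skip rather than fail the scan
                for label, pattern in PHI_PATTERNS.items():
                    if pattern.search(text):
                        yield path, label

    for path, label in scan_for_phi("/data/shared"):  # hypothetical file share
        print(f"Unsecured {label.upper()} candidate found in {path}")

Each hit would then feed one of the remediation steps listed above: delete, encrypt in place, move, or educate.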

 

Does your healthcare organization have DLP, and if so, do you have the processes in place to use it effectively and realize its full value for near real-time discovery and protection of sensitive data on your network?

 

The promise of personalized medicine relies heavily on high performance computing (HPC). Speed and power influence the genome sequence process and ultimately patient treatment plans.

 

With the SC14 Conference coming up next month, we caught up with Carlos Sosa, high performance computing architect at Cray, Inc., to hear his thoughts on the state of HPCs. In the above video clip, he says that personalized medicine is on the way but that HPC technology needs to be more robust to answer questions quickly for patients and doctors.

 

He cites a University of Chicago workflow that used parallel machines to sequence genomes and performed 47 years of research in just 51 hours as an example of moving toward personalized medicine capability.

 

Watch the clip and let us know what questions you have about HPCs and personalized medicine. What are you seeing?

Health IT is a hot topic in the Empire State. New York was the first state to host an open health data site and is now in the process of building the Statewide Health Information Network of New York. The SHIN-NY will enable providers to access patient records from anywhere in the state.

 

To learn more, we caught up with Howard A. Zucker, MD, JD, who was 22 when he got his MD from George Washington University School of Medicine and became one of America's youngest doctors. Today, Zucker is the Acting Commissioner of Health for New York State, a post he assumed in May 2014. Like his predecessor Nirav R. Shah, MD, MPH, Zucker is a technology enthusiast, who sees EHRs, mobile apps and telehealth as key components to improving our health care system. Here, he shares his thoughts.

 

What’s your vision for patient care in New York in the next five years?

 

Zucker: Patient care will be a more seamless experience for many reasons. Technology will allow for further connectivity. Patients will have access to their health information through patient portals. Providers will share information on the SHIN-NY. All of this will make patient care more fluid, so that no matter where you go – a hospital, your doctor’s office or the local pharmacy – providers will be able to know your health history and deliver better quality, more individualized care. And we will do this while safeguarding patient privacy.

 

I also see a larger proportion of patient care taking place in the home. Doctors will take advantage of technologies like Skype and telemedicine to deliver that care. This will happen as patients take more ownership of their health. Devices like Fitbit let people amass data about their health and take steps to improve it. It’s a technology still in its infancy, but it’s going to play a major role in long-term care.

 

How will technology shape health care in New York and beyond?

 

Zucker: Technology in health and medicine is rapidly expanding – it’s already started. Genomics and proteomics will one day lead to customized medicine and treatments tailored to the individual. Mobile technology will provide patient data to change behaviors. Patients and doctors alike will use this type of technology. As a result, patients will truly begin to “own” their health.

 

Personally, I’d like to see greater use of technology for long-term care. Many people I know are dealing with aging parents and scrambling to figure out what to do. I think technology will enable more people to age in place in ways that have yet to unfold.

 

What hurdles do you see in New York and how can you get around those?

 

Zucker: Interoperability remains an ongoing concern. If computers can’t talk to each other, then this seamless experience will be extremely challenging.

 

We also need doctors to embrace and adopt EHRs. Many of them are still using paper records. But it’s challenging to set up an EHR when you have patients waiting to be seen and so many other clinical care obligations. Somehow, we need to find a way to make the adoption and implementation process less burdensome. Financial incentives alone won’t work.

 

How will mobility play into providing better patient care in New York?

 

Zucker: The human body is constantly giving us information, but only recently have we begun to figure out ways to receive that data using mobile technology. Once we’ve mastered this, we’re going to significantly improve patient care.

 

We already have technology that collects data from phones, and we have sensors that monitor heart rate, activity levels and sleep patterns. More advanced tools will track blood glucose levels, blood oxygen and stress levels.

 

How will New York use all this patient-generated health data?

 

Zucker: We have numerous plans for all this data, but the most important will be using it to better prevent, diagnose and treat disease. Someday soon, the data will help us find early biomarkers of disease, so that we can predict illness well in advance of the onset of symptoms. We will be able to use the data to make more informed decisions on patient care.

The bring-your-own-device to work trend is deeply entrenched in the healthcare industry, with roughly 89 percent of the nation’s healthcare workers now relying on their personal devices in the workplace. While this statistic—supplied by a 2013 Cisco partner network study—underscores the flexibility of mHealth devices in both improving patient care and increasing workflow efficiency, it also shines a light on a nagging, unrelenting reality: mobile device security remains a problem for hospitals.

 

A more recent IDG Connect survey concluded the same, as did a Forrester Research survey that was released earlier this month.

 

It’s not that hospitals are unaware of the issue; indeed, most HIT professionals are scrambling to secure every endpoint through which hospital staff access medical information. The challenge is keeping pace with a seemingly endless barrage of mHealth tools.

 

As a result:

 

  • 41 percent of healthcare employees' personal devices are not password protected, and 53 percent of them are accessing unsecured WiFi networks with their smartphones, according to the Cisco partner survey.
  • Unsanctioned device and app use is partly responsible for healthcare being more affected by data leakage monitoring issues than other industries, according to the IDG Connect survey.
  • Lost or stolen devices have driven 39 percent of healthcare security incidents since 2005, according to Forrester analyst Chris Sherman, who recently told the Wall Street Journal that these incidents account for 78 percent of all reported breached records originating from healthcare.

 

Further complicating matters is the rise of wireless medical devices, which usher in their own security risks that take precedence over data breaches.

 

So, where should healthcare CIOs focus their attention? Beyond better educating staff on safe computing practices, they need to know where the hospital’s data lives at all times, and restrict access based on job function. If an employee doesn’t need access, he doesn’t get it. Period.
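
A minimal sketch of that "no need, no access" rule, assuming a simple role-to-resource mapping (the roles and record types here are invented for illustration, not a real hospital policy):

    # Illustrative role-based access check: an employee sees a record type
    # only if their job function explicitly requires it. Deny by default.

    ROLE_PERMISSIONS = {
        "attending_physician": {"clinical_record", "lab_results", "imaging"},
        "billing_clerk":       {"billing_record"},
        "facilities_staff":    set(),   # no patient data access at all
    }

    def can_access(role: str, resource: str) -> bool:
        """Access is granted only if the role explicitly lists the resource."""
        return resource in ROLE_PERMISSIONS.get(role, set())

    assert can_access("attending_physician", "lab_results")
    assert not can_access("billing_clerk", "clinical_record")  # doesn't need it, doesn't get it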

 

Adopting stronger encryption practices also is critical. And, of course, they should virtualize desktops and applications to block the local storage of data.

 

What steps is your healthcare organization taking to shore up mobile device security? Do you have an encryption plan in place?

 

As a B2B journalist, John Farrell has covered healthcare IT since 1997 and is a sponsored correspondent for Intel Health & Life Sciences.

Read John’s other blog posts

From time to time we will look at healthcare IT environments from around the world to see how different countries approach healthcare technology challenges. Below is the second in a series of guest posts on the English NHS from contributor Colin Jervis.

 

In the UK, an aging population threatens to increase demand for healthcare and social services. My last post looked at the features of the integrated care needed to stem this tide and some of the security and confidentiality issues raised by sharing between organizations. Really, the only answer in the short- and medium-term is better models of care supported by Information and Communications Technology (ICT).

 

In addition, Baby Boomers are now aging and are likely to be far more assertive than their parents about healthcare quality and delivery. And they often have better ICT at home than they encounter in a spell with the NHS.

 

For sure, the management of long-term conditions is likely to be a competitive arena for public and private sector healthcare providers. Even among traditional NHS providers we already see the formation of GP consortia and of secondary care providers hiring salaried GPs to create new organizations.

 

Supporting this are wireless technology and data integration – a move away from traditional institutions and clinics and closer to care in a patient’s home. But the great benefits this promises come with risks.

 

The NHS uses two-factor authentication to authorize access to systems that contain confidential patient data – password and smartcard. Something you know and something you have. This is practicable for most NHS staff; however, for some it is not.

 

In a busy emergency department with few end user devices, the time taken for an individual to log out and in to the electronic patient record each time is unbearable. So, what tends to happen is that someone logs in with their smartcard at the start of the day and remains logged in until the end of their shift, letting their colleagues use their access rights. Not what is intended, but difficult to censure when clinicians put addressing patient needs before information governance.

 

Further, clinicians who are mobile in the community often have issues with security. They can attend a patient at home and log in. Provided access is good and there is continuous interaction between patient, clinician and machine, this is fine.

 

However, some clinicians, such as physiotherapists, may have longer interventions away from the machine. To comply with security, the device times out after a few minutes. Logging in again is a pain, not to mention the possibility that – for example – an inquisitive family member could access the unattended machine while the connection is open. In the world of remote-access security, form does not always follow function.

 

Two-factor authentication is sound; however, many ICT helpdesks rate password resets as the biggest source of user calls. Passwords are not easy for most people to remember, particularly if the structure is prescriptive – for example, at least one capital letter, one digit and one symbol – and the password also has to be changed regularly.
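
To make "prescriptive structure" concrete, here is an illustrative check of the rules named above, with a minimum length added as an extra assumption; the exact policy naturally varies by organization:

    # The prescriptive structure described above: at least one capital
    # letter, one digit, and one symbol (the minimum length is an added,
    # assumed rule).
    import re

    def meets_policy(password: str, min_length: int = 8) -> bool:
        """True if the password satisfies the illustrative structure rules."""
        return (len(password) >= min_length
                and re.search(r"[A-Z]", password) is not None
                and re.search(r"\d", password) is not None
                and re.search(r"[^A-Za-z0-9]", password) is not None)

    print(meets_policy("Winter2024!"))  # True
    print(meets_policy("password"))     # False: no capital, digit, or symbol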

 

Nothing of nothing comes. With the greater use of ICT and the benefits of instant access and mobility, we must trade something. There is no activity that carries no risk. Even if I lie in bed all day to avoid being run over by a truck or attacked by a mugger, I still risk the disbenefits of inactivity such as depression, heart disease and an overdose of comfort eating.

 

But how important to us is the confidentiality of healthcare information, particularly with the growth of wearable health devices and the smartphone app? I’ll address that in my next post.

 

What questions do you have?

 

Colin Jervis is an independent healthcare consultant. His book ‘Stop Saving the NHS and Start Reinventing It’ is available now. His website is kineticconsulting.co.uk, and he also posts on Twitter @colin_jervis.

 

Efficiency is the goal for streamlined, affordable healthcare. But how do we get there?

 

In the above video, Gabi Daniely, vice president of Stanley Healthcare, talks about the company’s five hospital category solutions and how they can improve the operational efficiency of healthcare facilities.

 

How are you improving your facilities’ efficiency? Watch the clip and let us know what questions you have.

 

In the above video, Cycle Computing CEO Jason Stowe talks about the strong disconnect that exists between research and clinical analysis. He says the current challenge in bio IT is to analyze data, make sense of it, and do actionable science against it.

 

He shares an example of a 156,000-core workload run in eight regions of the globe that produced 2.3 million hours of computational chemistry research (264 years’ worth) in just 18 hours. He says this capability will transform both access patterns and the kinds of research that pharmaceutical, life sciences, and healthcare companies are able to tackle when it comes to analyzing genomes.

 

Watch the clip and let us know what you think. What questions about research and clinical analysis do you have?

Doctors and surgeons are some of the brightest individuals in the world. However, no one is immune to mistakes and simple oversights. Unintentional errors occur in any industry; what makes healthcare different is that a single misstep could cost a life. 

 

In The Checklist Manifesto, Dr. Atul Gawande cites a fellow surgeon’s story of a seemingly routine stab wound. The patient was at a costume party when he got into an altercation that led to the stabbing. As the team prepared to treat the wound, the patient’s vitals began dropping rapidly. The surgeon and his team were unaware that the weapon was a bayonet that had gone more than a foot through the man, piercing his aorta.

 

After the team regained control of the situation, the man recovered within a few days. This experience presented complications that no one could have predicted unless the doctors had full knowledge of the situation. Gawande states, “everyone involved got almost every step right […] except no one remembered to ask the patient or the medical technicians what the weapon was” (Gawande 3). There are many independent variables to account for; a standard checklist for incoming stab-wound patients could ensure that episodes like this are avoided and that other red flags are accounted for.

 

Miscommunication between clinicians and patients annually accounts for roughly 800,000 deaths in the US, more than heart disease and more than cancer. The healthcare industry spends roughly $8 billion on extended care as a result of clinical error every year. As accountable care continues to make progress, the healthcare industry is moving more toward evidence-based medicine and best practices. This is certainly the case for care providers, but for patients as well.

 

Implementing checklists in all aspects of healthcare can eliminate simple mistakes and common oversights by medical professionals and empower patients to become more educated and informed. Studies in the Journal of the American Medical Association (JAMA) and the New England Journal of Medicine (NEJM) have concluded that implementing checklists in various facets of care can reduce errors by up to half. Certain implementations of checklists in intensive care units for infection mitigation have reduced infections by 100 percent.

 

Compelling evidence of the need for checklists can be found in the preparation process for a colonoscopy. Colonoscopy preparation is a rigorous process that requires patients to watch their diet and the clock for two days before the procedure. It is not uncommon for a colonoscopy to fail due to inadequate patient preparation. Before the procedure, the patient must follow an arsenal of instructions regarding food, liquid, and medication. A detailed checklist that guides each patient through the process would practically eliminate errors and failures due to inadequate preparation.
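
At its core, a patient-facing electronic checklist of this kind is a simple data structure plus a completeness check. The sketch below uses invented preparation steps purely for illustration, not as medical guidance:

    # Illustrative patient-prep checklist: each step carries a deadline
    # relative to the procedure, and the checklist reports what is still
    # outstanding. The steps are invented examples.

    from dataclasses import dataclass, field

    @dataclass
    class Step:
        description: str
        hours_before_procedure: int  # latest point at which the step must be done
        done: bool = False

    @dataclass
    class PrepChecklist:
        steps: list = field(default_factory=list)

        def outstanding(self, hours_until_procedure: int):
            """Steps that are already due (or overdue) and not yet complete."""
            return [s for s in self.steps
                    if not s.done and s.hours_before_procedure >= hours_until_procedure]

    checklist = PrepChecklist(steps=[
        Step("Switch to clear liquids", hours_before_procedure=24),
        Step("Take bowel-prep solution", hours_before_procedure=12),
        Step("Stop all liquids", hours_before_procedure=2),
    ])

    checklist.steps[0].done = True
    for step in checklist.outstanding(hours_until_procedure=10):
        print("Still to do:", step.description)   # flags the bowel-prep step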

 

From the patient’s perspective, checklisting everything from pre-surgery preparation to a routine checkup should be a priority.   At the end of the day, the patient has the most at stake and should be entitled to a clear, user-friendly system to understand every last detail of any procedure or treatment.

 

A couple of companies are making waves in the area of patient safety checklists, most notable of which are BluMenlo and Parallax.

 

BluMenlo is a mobile patient safety firm founded in 2012. Its desktop, tablet, and mobile solution drives the use of checklists for patient handoffs, infection mitigation, and radiation oncology machine QA. Although its initial focus is in these areas, BluMenlo is expanding into standardizing best practices hospital- and ACO-wide.

 

Parallax specializes in operating room patient safety. Its CHaRM offering incorporates a heads-up display to bring checklists into the operating room. The software learns a surgeon’s habits and techniques to accurately predict how long an operation may take, as well as possible errors.

 

Electronic checklists will certainly take hold as health systems, ACOs and accountable care networks continue to focus on increased patient safety, improved provider communications and best practices for reducing costs across their organizations. We will even see these best practices expedited if we begin to inquire with our care providers as informed and engaged patients.

 

What questions about checklists do you have?

 

As a healthcare executive and strategist, Justin Barnes is an industry and technology advisor who also serves as an Entrepreneur-in-Residence at Georgia Tech’s Advanced Technology Development Center. In addition, Mr. Barnes is Chairman Emeritus of the HIMSS EHR Association as well as Co-Chairman of the Accountable Care Community of Practice.

 

The year 2020 seems far off, but is closer than you think. With the increasing use of technology in healthcare, and with patient empowerment growing each year with the advent of mobile devices, what will a clinician’s workday look like five years from now?


In the above video, we turn toward the future to show you how enabling technologies that exist today will transform the way clinicians treat their patients in 2020. Learn how wearable devices, sensors, rich digital collaboration, social media, and personalized medicine through genomics will be part of a clinician’s daily workflow as we enter the next decade.

 

Watch the short video and let us know what questions you have about the future of healthcare technology and where you think it’s headed.
