
Intel Health & Life Sciences

34 Posts authored by: David Houlding

 

I recently had the privilege of interviewing Daniel Dura, CTO of Graphium Health, on the subject of security on the frontlines of healthcare, and a few key themes emerged that I want to highlight and elaborate on below.

 

Regulatory compliance is necessary but not sufficient for effective security and breach risk mitigation. To effectively secure healthcare organizations against breaches and other security risks, one needs to start with understanding the sensitive healthcare data at risk. Where is it at rest, how is it moving over the network (inventory), and how sensitive is it (classification)? These seem like simple questions, but in practice they are difficult to answer, especially with BYOD, apps, social media, consumer health, wearables, the Internet of Things, etc. driving an increased variety, volume, and velocity (near real-time) of sensitive healthcare data into healthcare organizations.

 

There are different types of breaches. Cybercrime breaches have hit the news recently. Many other breaches are caused by loss or theft of mobile devices or media, insider risks such as accidents or workarounds, business associates or sub-contracted data processors, or malicious insiders either snooping records or committing fraud. Effective security requires avoiding distraction by the latest headlines, understanding the various types of breaches holistically, determining which ones pose the greatest risks for your organization, and directing the limited budget and resources available for security to do the most good in mitigating the most likely and impactful risks.

 

Usability is key. Healthcare workers have many more information technology tools now than 10 years ago, and if usability is lacking in healthcare solutions or security it can directly drive the use of workarounds, non-compliance with policy, and additional risks that can lead to breaches. The challenge is to provide security together with improved usability. Examples include software encryption with hardware acceleration, SSDs with built-in encryption, and multi-factor authentication that improves both the usability and security of solutions.

 

Security is everyone’s job. Healthcare workers are increasingly targeted in spear phishing attacks. Effective mitigation of this type of risk requires a cultural shift so that security is not only the job of the security team but everyone’s job. Security awareness training needs to be on the job, gamified, continuous, and meaningful.

 

I’m curious what types of security concerns and risks are top of mind in your organization, challenges you are seeing in addressing these, and thoughts on how best to mitigate?

Security was a major area of focus at HIMSS 2015 in Chicago. From my observations, here are a few of the key takeaways from the many meetings, sessions, exhibits, and discussions in which I participated:

 

Top-of-Mind: Breaches are top-of-mind, especially cybercrime breaches such as those recently reported by Anthem and Premera. No healthcare organization wants to be the next headline, and incur the staggering business impact. Regulatory compliance is still important, but in most cases not currently the top concern.

 

Go Beyond: Regulatory compliance is necessary but not enough to sufficiently mitigate the risk of breaches. To have a fighting chance at avoiding most breaches, and minimizing the impact of those that do occur, healthcare organizations must go well beyond the minimum required for compliance with regulations.

 

Multiple Breaches: Cybercrime breaches are just one kind of breach. There are several others, for example:


  • There are also breaches from loss or theft of mobile devices which, although often less impactful (because they often involve a subset rather than all patient records), do occur far more frequently than the cybercrime breaches that have hit the news headlines recently.

 

  • Insider breach risks are way underappreciated, and saying they are not sufficiently mitigated would be a major understatement. This kind of breach involves a healthcare worker accidentally exposing sensitive patient information to unauthorized access. This occurs in practice if patient data is emailed in the clear, put unencrypted on a USB stick, posted to an insecure cloud, or sent via an unsecured file transfer app.

 

  • Healthcare workers are increasingly empowered with mobile devices (personal, BYOD and corporate), apps, social media, wearables, the Internet of Things, etc. These enable amazing new benefits in improving patient care, and also bring major new risks. Well-intentioned healthcare workers, under time and cost pressure, have more and more rope to do wonderful things for improving care, but also more rope to trip over, with accidents that can lead to breaches. Annual "scroll to the bottom and click accept" security awareness training is often ineffective, and certainly insufficient.

 

  • To improve the effectiveness of security awareness training, healthcare organizations need to engage healthcare workers on an ongoing basis. Practical strategies I heard discussed at this year’s HIMSS include gamified spear phishing solutions that help organizations simulate spear phishing emails and help healthcare workers learn to recognize and avoid them. Weekly or biweekly emails can be used to help workers understand recent healthcare security events, such as breaches at peer organizations (a “keeping it real” strategy), how they occurred, why they matter to healthcare workers, patients, and the healthcare organization, and how everyone can help.

 

  • Ultimately, any organization seeking to achieve a reasonable security posture and sufficient breach risk mitigation must first successfully instill a culture of “security is everyone’s job”.

 

What questions do you have? What other security takeaways did you get from HIMSS?

I’ve looked at many aspects of Bring Your Own Device in healthcare throughout this series of blogs, from the costs of getting it wrong to the upsides and downsides, and the effects on network and server security when implementing BYOD.

 

I thought it would be useful to distil my thoughts around how healthcare organisations can maximize the benefits of BYOD into 5 best practice tips. This is by no means an exhaustive list but provides a starting point for the no doubt lengthy conversations that need to take place when assessing the suitability of BYOD for an organisation.

 

If you’ve already implemented BYOD in your own healthcare organisation then do register and leave a comment below with your own tips – I know this community will appreciate your expertise.

 

Develop a Bring Your Own Device policy


It sounds like an obvious first step doesn’t it? However, I’d like to stress the importance of getting the policy right from day one. Do your research with clinical staff, understand their technology and process needs, identify their workarounds and ask how you can make their job of patient care easier. Development of a detailed and robust BYOD policy may take much longer than anticipated, and don’t forget that acceptance and inclusion of frontline staff is key to its success. Alongside the nuts and bolts of security it’s useful to explain the benefits to healthcare workers to get their trust, confidence and buy-in from the start.


Mobile Device Management

 

It’s likely that you have the network/server security aspect covered off under existing corporate IT governance. A key safeguard in implementing BYOD is Mobile Device Management (MDM), which should help meet your organisation’s specific security requirements. Some of these requirements may include restrictions on storing/downloading data onto the device, password authentication protocols and anti-virus/encryption software. Healthcare workers must also be given advice on what happens in the event of loss or theft of the mobile device, or when they leave the organisation in respect of remote deletion of data and apps. I encourage you to read our Case Study on Madrid Community Health Department on Managing Mobile for a great insight into how one healthcare organisation is assessing BYOD.
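To make the policy-enforcement idea concrete, here is a minimal, hypothetical sketch (in Python) of the kind of compliance check an MDM service might run before allowing a personal device onto the organisation's network. The policy fields, threshold values, and the Device structure are illustrative assumptions, not a description of any specific MDM product.

```python
from dataclasses import dataclass

# Hypothetical BYOD policy thresholds; real values come from your organisation's policy.
MIN_OS_VERSION = (12, 0)
REQUIRE_ENCRYPTION = True
REQUIRE_PASSCODE = True
MAX_DAYS_SINCE_AV_UPDATE = 7

@dataclass
class Device:
    os_version: tuple[int, int]
    storage_encrypted: bool
    passcode_set: bool
    days_since_av_update: int

def compliance_issues(device: Device) -> list[str]:
    """Return a list of policy violations; an empty list means the device may connect."""
    issues = []
    if device.os_version < MIN_OS_VERSION:
        issues.append("operating system is older than the minimum allowed version")
    if REQUIRE_ENCRYPTION and not device.storage_encrypted:
        issues.append("device storage is not encrypted")
    if REQUIRE_PASSCODE and not device.passcode_set:
        issues.append("no passcode/screen lock configured")
    if device.days_since_av_update > MAX_DAYS_SINCE_AV_UPDATE:
        issues.append("anti-virus definitions are out of date")
    return issues

if __name__ == "__main__":
    device = Device(os_version=(11, 4), storage_encrypted=True,
                    passcode_set=False, days_since_av_update=2)
    for issue in compliance_issues(device):
        print("BLOCKED:", issue)
```

The point of the sketch is simply that the requirements listed above (OS currency, encryption, passcode, anti-virus) become machine-checkable rules rather than advice buried in a policy document.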


Make it Inclusive


For a healthcare organisation to fully enjoy the benefits of a more mobile and flexible workforce through BYOD they need to ensure that as many workers as possible (actually, I’d say all) can use their personal devices. It can be complex but some simple stipulations in the BYOD policy, such as requiring the user to ensure that they have the latest operating system and app updates installed at all times, can help to mitigate some of the risk. Also I would be conscious of the level of support an IT department can give from both a resource (people) and knowledge of mobile operating systems point of view. Ultimately, the most effective BYOD policies are device agnostic.


Plan for a Security Breach

 

The best BYOD policies plan for the worst, so that if the worst does happen it can be managed efficiently, effectively and have as little impact as possible on the organisation and patients. This requires creation of a Security Incident Response Plan. Planning for a security breach may prioritise fixing the weak link in the security chain, identifying the type and volume of data stolen and reporting the breach to a governmental department. For example, the Information Commissioner’s Office (ICO) in the UK advises that ‘although there is no legal obligation on data controllers to report breaches of security, we believe that serious breaches should be reported to the ICO.’


Continuing Assessment


From a personal perspective we all know how quickly technology is changing and improving our lives. Healthcare is no different and it’s likely that the tablet carried by a nurse today has more computing power than the desktop of just a couple of years ago. With this rapid change comes the need to continually assess a BYOD policy to ensure it meets the advances in hardware and software on a regular basis. The risk landscape is also constantly evolving as new apps are installed, new social media services become available, and healthcare workers innovate new ways of collaborating. Importantly though, I stress that the BYOD policy must also take into account the advances in the working needs and practices of healthcare workers. We’re seeing some fantastic results from improved mobility, security and ability to store and analyse large amounts of data across the healthcare spectrum. We cannot afford for this progress to be hindered by out-of-date policies. The policy is the foundation of the security and privacy practice. A good privacy and security practice enables faster adoption, use, and realisation of the benefits of new technologies.

 

I hope these best practice tips have given you food for thought. We want to keep this conversation about the benefits of a more mobile healthcare workforce going so do follow us on Twitter and share our blogs amongst your personal networks.

 

BYOD in EMEA series: Read Part Three

Join the conversation: Intel Health and Life Sciences Community

Get in touch: Follow us via @intelhealth

David Houlding, MSc, CISSP, CIPP is a Healthcare Privacy and Security lead at Intel and a frequent blog contributor.

Find him on LinkedIn

Keep up with him on Twitter (@davidhoulding)

Check out his previous posts

I like to think of security as a chain, and like any other chain it is only as strong as its weakest link. In the case of security in healthcare the chain consists of the network, the server and the device. Often the focus is overwhelmingly placed on the security of the device but I argue that data is as equally, if not more, at risk when it's in transit as it is when at rest. So, with that in mind I wanted to take a look at some of the wider security considerations around Bring Your Own Device (BYOD).


Whenever I speak at events about security and healthcare my starting point is often that we must remember that the priority for healthcare professionals is patient care. Security cannot, and must not, compromise usability as we know this drives workarounds. Often these workarounds mean using personal devices in conjunction with what is more commonly known as 'Bring Your Own Cloud'.


Bring Your Own Cloud

Bring Your Own Cloud (BYOC) primarily refers to the use of clouds that are not authorized by the healthcare organization to convey sensitive data. This often occurs through an individual using an app they downloaded onto a personal device. Many such apps have backend clouds as part of their overall solution, and when sensitive data is entered into the app it gets synced to that cloud. Furthermore, this transfer can occur over networks that are not managed by the healthcare organization, making it invisible to the organization. Of course, sensitive data in an unauthorized cloud can constitute a breach. In many cases these third-party clouds are in different countries, making the transfer a trans-border data flow that can raise further non-compliance issues with data protection laws.


For example, imagine a nurse taking patient notes that need to be sent to a specialist such as a cardiologist. This should be done using a secure device with a secure wireless network and a secure solution approved by the organization for such a task. However, lack of usability, or cumbersome security around such solutions, or a slow or overly restrictive IT department can drive the use of BYOC approach instead. In a BYOC approach the nurse uses a personal app on a personal mobile device together with either unencrypted email, a file transfer app, or social media to send these for analysis by a specialist.


This introduces risks both to the confidentiality of the sensitive healthcare data and to the integrity of the patient record, which is often not updated with information traveling in these “side clouds”, rendering it incomplete, inaccurate, or out of date. In the best case this can result in suboptimal healthcare; in the worst case it could be a patient safety issue. The consequences of such risks to both patient and organisation can be severe. Here at Intel we have security solutions available to healthcare organisations which help ensure that data is secure whether at rest or in transit, on the device or on the organisation’s network. Our security solutions also use hardware-enhanced security to maximize performance and usability, mitigating the risk of cumbersome security driving the healthcare worker to resort to workarounds and BYOC.


Apps for Healthcare

One area where I’m seeing a lot of rapid change is in the development of apps for healthcare. I recently spoke to the Apps Alliance on the security challenges for developers of healthcare apps, whether they are aimed at healthcare professionals or consumers. These apps often make the recording and analysing of health information very easy and in some cases they can enhance the relationship between patient and clinician.


Stealth IT

I’d also like to briefly take a look at what is often referred to as ‘Stealth IT’, also called ‘Shadow IT’. As with any form of workaround, the use of Stealth IT can be driven by an unresponsive or overly restrictive corporate IT department. One obvious example would be a small team of researchers requiring additional server space to store data but perceiving the organisational process for providing such resources as slow and expensive. The consequence is the purchase of comparatively cheap and accessible server space from any number of easy-to-find companies on the web. I remind you of my earlier comments about knowing exactly how secure the server is and in which country or continent the server sits.


I like to think that a healthcare organisation looking to put a Bring Your Own Device policy in place appreciates the benefits and risks but starts with understanding why a healthcare professional uses their own device, logs on to an unsecure network or purchases unauthorised server space. Only then will the organisation, healthcare worker and patient truly reap the benefits of BYOD.

 

 

David Houlding, MSc, CISSP, CIPP is a Healthcare Privacy and Security lead at Intel and a frequent blog contributor.

Find him on LinkedIn

Keep up with him on Twitter (@davidhoulding)

Check out his previous posts

In my second blog focusing on Bring Your Own Device (BYOD) in EMEA I’ll be taking a look at the positives and negatives of introducing a BYOD culture into a healthcare organisation. All too often we hear of blanket bans on clinicians and administrators using their personal devices at work, but with the right security protocols in place and enhanced training there is a huge opportunity for BYOD to help solve many of the challenges facing healthcare.

 

Much of the negativity surrounding BYOD occurs because of the resulting impact to both patients (privacy) and healthcare organisations (business/financial) of data breaches in EMEA. While I’d agree that the headline numbers outlined in my first blog are alarming, they do need to be considered in the context of the size of the wider national healthcare systems.

 

A great example I’ve seen of an organisation seeking to operate a more efficient health service through the implementation of BYOD is the Madrid Community Health Department in Spain. Intel and security expert Stack Overflow assessed several mobile operating systems with a view to supporting BYOD for physicians in hospitals within their organisation. I highly recommend you read more about how Madrid Community Health Department is managing mobile with Microsoft Windows-based tablets.

 

 

The Upside of BYOD

There’s no doubt that BYOD is a fantastic enabler in modern healthcare systems. But why? We’ll look at some best practice tips in a later blog but suffice to say here that much of the list below should be underpinned by a robust but flexible BYOD policy, an enhanced level of staff training, and a holistic and multi-layered approach to security.

 

1) Reduces Cost of IT

Perhaps the most obvious benefit to healthcare organisations is a reduction in the cost of purchasing IT equipment. Not only that, it’s likely that employees will take greater care of their own devices than they would of a corporate device, thus reducing wastage and replacement costs.

 

2) Upgrade and Update

Product refresh rates are likely to be more rapid for personal devices, enabling employees to take advantage of the latest technologies such as enhanced encryption and improved processing power. And with personal devices we also expect individuals to update software/apps more regularly, ensuring that the latest security updates are installed.

 

3) Knowledge & Understanding

Training employees on new devices or software can be costly and a significant drain on time, not to mention the difficulty of scheduling that time with busy clinicians and healthcare administrators. I believe that allowing employees to use the personal, everyday device with which they are already familiar reduces the need for device-level training. There may still be a requirement for app-level training, but that very much depends on the intuitiveness of the apps/services being used.

 

4) More Mobile Workforce

The holy grail of a modern healthcare organisation – a truly mobile workforce. My points above all lead to clinicians and administrators being equipped with the latest mobile technology to be able to work anytime and anywhere to deliver a fantastic patient experience.

 

 

The Downside of BYOD

As I’ve mentioned previously, much of the comment around BYOD is negative and very much driven by headline news of medical records lost or stolen, the ensuing privacy ramifications and significant fines for healthcare organisations following a data breach.

 

It would be remiss of me to ignore the flip-side of the BYOD story, but I would hasten to add that much of the risk associated with the list below can be mitigated with a multi-layered approach: one that combines multiple technical safeguards and applies them holistically alongside administrative safeguards such as policy, training, audit and compliance, and physical safeguards such as locks and the secure use, transport and storage of devices.


1)  Encourages a laissez-faire approach to security

We’ve all heard the phrase ‘familiarity breeds contempt’ and there’s a good argument to apply this to BYOD in healthcare. It’s all too easy for employees to use some of the same workarounds used in their personal life when it comes to handling sensitive health data on their personal device. The most obvious example is sharing via the multitude of wireless options available today.


2) Unauthorised sharing of information

Data held at rest on a personal device is at high risk of loss or theft, and is consequently also at high risk of unauthorized access or breach. Consumers are increasingly adopting cloud services to store personal information, including photos and documents.

 

When a clinician or healthcare administrator is in a pressured working situation with their focus primarily on the care of the patient there is a temptation to use a workaround – the most obvious being the use of a familiar and personal cloud-based file sharing service to transmit data. In most cases this is a breach of BYOD and wider data protection policies, and increases risk to the confidentiality of sensitive healthcare data.


3) Loss of Devices

The loss of a personal mobile device can be distressing for the owner but it’s likely that they’ll simply upgrade or purchase a new model. Loss of personal data is quickly forgotten but loss of healthcare data on a personal device can have far-reaching and costly consequences both for patients whose privacy is compromised and for the healthcare organisation employer of the healthcare worker. An effective BYOD policy should explicitly deal with loss of devices used by healthcare employees and their responsibilities in terms of securing such devices, responsible use, and timely reporting in the event of loss or theft of such devices.


4) Integration / Compatibility

I speak regularly with healthcare organisations and I know that IT managers see BYOD as a mixed blessing. On the one hand the cost-savings can be tremendous but on the other they are often left with having to integrate multiple devices and OS into the corporate IT environment. What I often see is a fragmented BYOD policy which excludes certain devices and OS, leaving some employees disgruntled and feeling left out. A side-effect of this is that it can lead to sharing of devices which can compromise audit and compliance controls and also brings us back to point 2 above.

 

These are just some of the positives and negatives around implementing BYOD in a healthcare setting. I firmly sit on the positive side of the fence when it comes to BYOD, and here at Intel Security we have solutions to help you overcome the challenges in your organisation, such as Multi-Factor Authentication (MFA) and Solid State Drives (SSDs) with built-in encryption, which complement the administrative and physical safeguards you use in your holistic approach to managing risk.

 

Don’t forget to check out the great example from the Madrid Community Health Department to see how our work is having a positive impact on healthcare in Spain. We’d love to hear your own views on BYOD so do leave us a comment below or if you have a question I’d be happy to answer it.

 

 

David Houlding, MSc, CISSP, CIPP is a Healthcare Privacy and Security lead at Intel and a frequent blog contributor.

Find him on LinkedIn

Keep up with him on Twitter (@davidhoulding)

Check out his previous posts

Today I begin a series of blogs which take an in-depth look at the issues surrounding what is commonly known as ‘Bring Your Own Device’, with a focus on the Healthcare and Life Sciences sector in EMEA. Bring Your Own Device or BYOD, is the catch-all phrase attributed to the use of personal technology devices in a corporate environment for business use.

I’ll look at the scale of BYOD in specific territories, get under the skin of exactly why this trend has taken off over the past few years and dig into the detail of the opportunities and costs of allowing a BYOD culture to develop within your healthcare setting. I’ll conclude the series with some practical advice on how advances in technology can help you safeguard your systems and data.

 

I’d also be interested to get your views too (leave a comment below or tweet us via @intelhealth), whether you’re a clinician working at the sharp end of care delivery and benefiting from using a personal device at work, or you’re an IT administrator tackling the often thorny and complex issue of implementing a BYOD policy in your organisation.

 

Cost of Data Breaches in EMEA

Providing context to the scale of BYOD across EMEA inevitably means looking at the cost of data breaches but I’d stress that not all of the consequences of allowing BYOD are negative, as I’ll explain in a later blog. A great point of reference though is the Ponemon Institute, which has produced detailed reports on the cost of data security breaches for many years.

 

Country           | Organisational Cost (m) | Avg per Capita Cost | Negligence Cause (%) | Criminal Attacks (%) | System/Biz Process Failures (%)
UK                | £2.21                   | £95                 | 40                   | 38                   | N/A
Australia         | £1.42                   | £74                 | 27                   | 46                   | 27
Brazil            | £0.83                   | £36                 | 38                   | 31                   | 31
UAE/Saudi Arabia  | £2.02                   | £71                 | 21                   | 50                   | 29
Japan             | £1.32                   | £71                 | 31                   | 46                   | 23
France            | £2.24                   | £98                 | 30                   | 48                   | 22
Germany           | £2.54                   | £104                | 20                   | 50                   | 30

*Source: 2014 Cost of Data Breach Study (country-specific), December 2014, Ponemon Institute

 

The table above shows the significant costs associated with data breaches across a number of sectors including pharmaceuticals, energy and public (which includes health). BYOD sits under the term ‘Negligence Cause’ in the table above and for some countries in EMEA it accounts for a significant portion of overall breaches. The organisational costs are significant and reflect not only the consequential increase in investment to safeguard security weaknesses but also fines levied by national and pan-regional government.

 

I’ll drill down into specific examples of Bring Your Own Device in healthcare in more detail in the future but as a brief indicator we know, for example, that in England the National Health Service (NHS) suffered 7,255 personal data breaches over a 3 year period. These breaches of healthcare information security include data being lost, stolen or inappropriately shared with third parties, and in the case of inappropriate sharing this often includes a workaround using a personal device.

 

Opportunities presented by Bring Your Own Device

The negative comments around BYOD and associated costs to healthcare organisations as a result of data breaches often mask what are some fantastic upsides. I’m keen to emphasise in this series that with the right security solutions, both at rest and in transit, and across the entire network-client-device continuum, there are significant advantages to healthcare organisations in allowing individuals to use personal devices at work.

 

I hope this first blog has piqued your interest in what is a hot topic within the health and life sciences sector across the EMEA region. If you’ve successfully implemented a BYOD policy in your healthcare organisation or you want to highlight why and how you are using your personal device to deliver better patient care we’d be grateful to hear from you.

 

It would be fantastic to share some great examples from EMEA to help our community learn together. If you want to be the first to know when the next blog in this series will be published then sign-up to our Health and Life Sciences Community.

 

David Houlding, MSc, CISSP, CIPP is a Healthcare Privacy and Security lead at Intel and a frequent blog contributor.

Find him on LinkedIn

Keep up with him on Twitter (@davidhoulding)

Check out his previous posts

I recently spoke to the Apps Alliance, a non-profit global membership organization that supports developers as creators, innovators, and entrepreneurs, on the latest trends in healthcare security.

 

It was a fascinating 40 minutes and a great opportunity to take a look at security issues not just from the healthcare professional or patient perspective, but also from a developers’ point of view. In this podcast, we take a look at what's important to all three groups when it comes to privacy, security and risk around healthcare data.


Listen to the podcast here

 

We discussed:

 

  • Best practices for developers looking to secure healthcare data
  • Security challenges that stem from the flow of data from mobile healthcare devices
  • The relationship between usability and security

 

I recently wrote a blog looking at the perceived trade-off between usability and security in healthcare IT and how you can mitigate risks in your own organisation. We have solutions to help you overcome these challenges, many of which are outlined in our Healthcare Friendly Security whitepaper.

 

We'd love to get your feedback on the issues discussed in the podcast so please leave a comment below - we're happy to answer questions you may have too.

 

Thanks for listening.


David Houlding, MSc, CISSP, CIPP is a Healthcare Privacy and Security lead at Intel and a frequent blog contributor.

Find him on LinkedIn

Keep up with him on Twitter (@davidhoulding)

Check out his previous posts

The unprecedented rate of advances in technology has brought the topic of usability of devices in the workplace to the fore. Usability used to be a 'nice to have', but with experiences and expectations heightened by the fantastic usability of personal mobile devices it has become a 'must-have'. The corporate healthcare IT environment is faced with a challenge.

 

Taming the BYOD culture

Organisations must either invest in great corporate IT user experiences for employees or be exposed to the dangers of the 'Bring Your Own Device' (BYOD) to work movement. And healthcare workers are amongst the first to look for workarounds such as BYOD when the usability of their IT is having a negative impact on their workflow.

 

If organisations allow a BYOD culture to become established they face heightened security and privacy risks which can often result in data breaches. Since 2010, the Information Commissioner's Office (ICO) in the UK has fined organisations more than £6.7m for data protection breaches. Of this, the healthcare sector suffered fines of some £1.3m alone, which accounts for nearly 30% of the British public sector penalties.

 

These costs highlight the importance of avoiding data breaches, particularly as the UK's public sector health organisations rapidly move towards cloud-based electronic health records under the Personalised Health and Care 2020 framework. If data security is undermined by workarounds, it may well negate the predicted cost-effective benefits of moving to electronic health records for both patient and provider.

 

The 2020 framework acknowledges that, "In part, some of the barriers to reaping those benefits are comparatively mundane: a lack of universal Wi-Fi access, a failure to provide computers or tablets to ward or community-based staff, and outmoded security procedures that, by frustrating health and care professionals, encourage inappropriate ‘workarounds.’”

 

Mitigating risk of loss or theft

Loss or theft of devices is another common cause of data breaches in healthcare. An audit of 19 UK health-related organisations by the ICO concluded that "a number of organisations visited did not have effective asset management in place for IT hardware and software; this raises the risk of the business not knowing what devices are in circulation and therefore not becoming aware if one is lost or stolen."

 

There are a number of options to mitigate risk in these circumstances. First, usability and security can be vastly enhanced using Multi-Factor Authentication (MFA), which when combined with Single Sign On (SSO) reduces the overall number of device logins required. Second, replacing unencrypted conventional hard drives with self-encrypting SSDs (Solid State Drives) lowers the risk in the event of theft or loss while also improving data access performance. And that's a win-win result for all healthcare professionals.
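To make the MFA point concrete, below is a minimal sketch of the time-based one-time password (TOTP) mechanism commonly used as a second authentication factor, written in Python with only the standard library. It follows RFC 6238; the example secret is made up for illustration, and this is not a description of any specific Intel or vendor product.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, step: int = 30, digits: int = 6) -> str:
    """Compute an RFC 6238 time-based one-time password for the current time step."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time() // step)                  # moving factor: current 30-second window
    msg = struct.pack(">Q", counter)                    # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()  # HMAC-SHA1 per RFC 4226
    offset = digest[-1] & 0x0F                          # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

if __name__ == "__main__":
    # Illustrative secret only; a real deployment provisions one secret per user or device.
    print("One-time code:", totp("JBSWY3DPEHPK3PXP"))
```

The usability gain comes from pairing a short-lived code like this (or a hardware/biometric factor) with SSO, so the user authenticates strongly once rather than typing passwords into every application.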

 

Effective security is like a chain: it requires securing all points and either removing or repairing the weak links. Intel Security Group's solutions have security covered from mobile devices, through networks, to back-end servers. We're already helping healthcare organisations across the globe embrace the rapidly changing face of technology in the healthcare sector while managing risk and improving that all-important usability.

 

We've produced a whitepaper on Healthcare Friendly Security which will help you strike the balance between fantastic usability and industry-leading security in your organisation. Grab your free download today.

 

 

David Houlding, MSc, CISSP, CIPP is a Healthcare Privacy and Security lead at Intel and a frequent blog contributor.

Find him on LinkedIn

Keep up with him on Twitter (@davidhoulding)

Check out his previous posts

As recently as 10 years ago, healthcare IT was mostly corporate provisioned, less diverse, and refreshed more slowly. Up to this point, usability was treated as a “nice to have” and a significantly lower priority than the features or functionality of solutions.

 

In this more homogeneous and slower changing environment there was, for the most part, one way to get the job done. Fast forward to today: most healthcare IT environments are much more heterogeneous, with a myriad of devices (both corporate and personal BYOD), operating systems, apps, versions, and social media, and now wearables and the Internet of Things are growing rapidly. Furthermore, refresh rates are much faster, especially for personal/BYOD devices and apps. In today’s environment, usability is very much a “must have”: research shows that when it is lacking, healthcare workers find workarounds, like using personal devices, and these workarounds drive non-compliance and additional security and privacy risk, and can often be the source of breaches.

 

Traditionally we have approached usability and security as a tug of war or tradeoff … where having more security meant less usability and vice versa.

 

For more information on Healthcare Friendly Security see this new whitepaper.

 

Unfortunately, breaches have reached alarming levels in both business impact and likelihood. The total average cost of a data breach in 2014 was US $3.5 million; this average is global and spans several industries, including healthcare. Looking more specifically at healthcare, the global average cost of a data breach per patient record is US $359, the highest across all industries. With this kind of cost, avoiding breaches is of paramount importance for healthcare organizations. But how can we add security without compromising usability, and inadvertently driving workarounds that actually cause non-compliance and risk?

 

What is desperately needed is security that preserves or even improves usability, where risks are significantly mitigated without driving healthcare workers to use workarounds. On the surface this may seem impossible, yet there are several security safeguards today that do just that. Many breaches occur due to loss or theft of mobile devices. A very good safeguard to help mitigate this risk is self-encrypting SSDs (Solid State Drives): replacing a conventional unencrypted hard drive, at risk of causing a breach if lost or stolen, with an SSD plus encryption can often deliver better data access performance than the original unencrypted drive. Another example of a safeguard that improves usability and security is MFA (Multi-Factor Authentication) combined with SSO (Single Sign On), which improves both the usability and security of each login, as well as reducing the overall number of logins.

 

Intel Security Group is focused on creating innovative security safeguards that combine security software vertically integrated with security hardware, improving usability and hardening the overall solution to make it more resilient to increasingly sophisticated attacks, such as cybercrime. With cloud, mobile, and health information exchange, security becomes like a chain, and effective security requires securing all points and avoiding weak links. Intel Security Group solutions span right from mobile devices, through networks, to backend servers. This paves the way for healthcare to adopt, embrace, and realize the benefits of new technologies while managing risk and improving usability.

 

What questions about healthcare IT security do you have?

 

For more information on Healthcare Friendly Security see this new whitepaper.

 

David Houlding, MSc, CISSP, CIPP is a Healthcare Privacy and Security lead at Intel Corporation and a frequent blog contributor.


Find him on LinkedIn

Keep up with him on Twitter (@davidhoulding)

Check out his previous posts

With increasing variety, volume and velocity of sensitive patient data, healthcare organizations are increasingly challenged with compliance with regulations and data protection laws, and avoiding breaches. The total average cost of a data breach reached US $5.9M in the United States (2014 Ponemon Cost of a Data Breach), representing an average of $316 per patient record. The prospect of random audits to enforce compliance with regulations, such as the OCR HIPAA privacy, security and breach notification audits, continues to loom large.

 

Understanding what sensitive data you have is an absolute prerequisite to securing your organization, and has never been more important. Only with an accurate understanding of what sensitive data is at rest, in use, and in transit can a healthcare organization successfully secure itself. If a healthcare data inventory misses some sensitive data, it can go unsecured and lead to a security incident such as a breach, or a finding of non-compliance with regulations or data protection laws in the event of an audit.

 

Ten years ago, healthcare environments were more homogeneous with fewer types of clients, mostly corporate provisioned and more uniform, and with a slower refresh rate. Software used by healthcare workers was also mostly corporate provisioned, leading to a more consistent, less diverse, and more slowly changing software IT environment. In this more homogeneous and slower changing IT environment an annual manual data inventory may have been sufficient where a security and privacy team worked with documentation, IT management tools, and healthcare workers to conduct inventories.

 

Today, most healthcare organizations are much more heterogeneous, with a mix of clients or endpoints: smartphones, tablets, laptops, wearables, and the Internet of Things. Furthermore, healthcare networks today are a mix of personal, BYOD, and corporate provisioned devices, and have a faster refresh rate, especially for personal and BYOD devices such as smartphones that are often upgraded within two years or less. Exacerbating this diversity is a myriad of operating systems, versions, apps, and online services, including social media, that are collecting, using, and storing new types of sensitive data and moving it over the network in new ways. The bottom line is that healthcare organizations face a major challenge tracking all the sensitive data they have at rest, in use, and in transit. Given these challenges, a conventional annual data inventory is generally not sufficient.

 

Today, it is critical for healthcare organizations to understand what sensitive data they have on their networks in near real-time. Once a healthcare organization identifies new unprotected sensitive data on its network, it can proactively initiate remediation, which can include:

 

  1. Deleting sensitive data found in an unsecured location,
  2. Encrypting sensitive data in place (see the illustrative sketch after this list),
  3. Moving sensitive data from an unsecured location to somewhere more secure, and
  4. Educating healthcare workers on preferred alternatives to avoid future non-compliance and privacy and security risks.
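As a concrete illustration of step 2, here is a minimal sketch of encrypting a file in place using the Fernet construction from the widely used Python cryptography library. The file name is a made-up example and key management is assumed to happen elsewhere; this is an illustrative sketch, not a description of any specific DLP or Intel product.

```python
from pathlib import Path
from cryptography.fernet import Fernet  # pip install cryptography

def encrypt_in_place(path: Path, key: bytes) -> None:
    """Replace a plaintext file with its encrypted form (illustrative remediation step 2)."""
    plaintext = path.read_bytes()
    path.write_bytes(Fernet(key).encrypt(plaintext))

if __name__ == "__main__":
    sample = Path("discharge_notes.txt")
    sample.write_text("Patient: Jane Doe, MRN 1234567 ...")  # stand-in sensitive file
    key = Fernet.generate_key()  # in practice the key comes from your key management system
    encrypt_in_place(sample, key)
    print("File encrypted; store the key securely or the data is unrecoverable.")
```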

 

Data Loss Prevention (DLP) is a mature security safeguard that includes the ability to discover sensitive data at rest and in transit. With the rapidly increasing diversity of healthcare IT environments and the variety of sensitive data they are collecting, using, storing, and moving, the value proposition of DLP, and in particular its ability to discover sensitive healthcare information, has never been greater. This provides a key safeguard to supplement other data inventory initiatives within a modern healthcare organization. Intel Security Group provides network and endpoint DLP solutions that include this discovery capability. Furthermore, these can be vertically integrated with Intel hardware-assisted security, including AES-NI for hardware-accelerated encryption (Data Loss Prevention Best Practices for Healthcare). An effective near real-time inventory of sensitive data, combined with a proactive approach to securing any unsecured sensitive data, enables healthcare organizations to embrace and realize the benefits of new technologies while keeping privacy, security, and non-compliance risks manageable.
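For readers who want to see what "discovering sensitive data at rest" looks like in its simplest form, here is an illustrative Python sketch that scans a directory for text files containing patterns that often indicate identifiers such as US Social Security numbers or medical record numbers. The patterns and file handling are deliberately simplified assumptions; a production DLP product does far more (validated detectors, content fingerprinting, network capture, policy workflow).

```python
import re
from pathlib import Path

# Simplified, illustrative patterns; real DLP uses validated detectors, not bare regexes.
PATTERNS = {
    "US SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "MRN-like number": re.compile(r"\bMRN[:\s]*\d{6,10}\b", re.IGNORECASE),
    "Email address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def scan_directory(root: str) -> list[tuple[str, str]]:
    """Return (file, pattern name) pairs for every match found in text files under root."""
    findings = []
    for path in Path(root).rglob("*.txt"):
        text = path.read_text(errors="ignore")
        for name, pattern in PATTERNS.items():
            if pattern.search(text):
                findings.append((str(path), name))
    return findings

if __name__ == "__main__":
    for file_path, pattern_name in scan_directory("."):
        print(f"Possible sensitive data ({pattern_name}) in {file_path}")
```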

 

Does your healthcare organization have DLP, and if so do you have the processes in place to use it effectively and realize its full value for near real-time discovery and protection of sensitive data on your network?

Several research studies, most recently Curbing Healthcare Workarounds for Coworker Collaboration, show that healthcare workers are increasingly using workarounds or procedures out of compliance with policy. Lack of usability, cumbersome security and slow or overly restrictive IT departments all drive the use of smartphones / tablets and apps, texting, file transfer, USB sticks, personal e-mail, social media, etc with sensitive information. The majority of healthcare workers are not malicious, but rather well intentioned, and motivated to improve the quality and reduce the cost of healthcare.

 

However, increasingly these healthcare workers are empowered with information power tools, including mobile devices, apps, social media, wearables and Internet of Things, and mostly without the privacy and security savvy to enable their safer use. While these workarounds offer more usable and exciting new alternatives, they also bring major privacy and security risks and non-compliance with policy. BYOD becoming mainstream, apps and devices are gaining power, and the rapid growth of wearables and the Internet of Things further exacerbate these risks.

 

Some “black and white” workarounds, clearly out of compliance with policy, are today effectively detected and mitigated by MDM and DLP. For example, if policy forbids the use of USB sticks with certain endpoint devices, and this usage is detected by safeguards on such a device, it is straightforward to prevent. However, compliance with policy is often much more difficult to establish in practice for many other user actions. For example, if a user is using a file transfer app to transfer a photo from their personal device at a healthcare facility, is this out of compliance? It typically depends on the content of that photo. Is it a photo of a patient, representing PHI and non-compliance, or is it a photo from last weekend’s hike, being shared with a co-worker friend and representing acceptable use?

 

Unfortunately, classification of PHI is challenged with many media types including images, audio, video and often even free form text. Further, many of these media types can be directly acquired on an endpoint such as a personal smartphone, and exchanged over networks such as 3G or 4G that bypass healthcare organization secure servers, challenging existing safeguards including thin client solutions that secure healthcare data on secure managed backend servers.

 

This “gray region” of the risk spectrum is rapidly growing with the increasing empowerment of healthcare workers. If you have a personal device and participate in your organization’s BYOD program, you most likely had to install some MDM software on your device to enable this. However, to illustrate the magnitude of residual risk even after installation of MDM, consider the range of risky actions the healthcare worker can still perform on the device, including taking photos, recording video/audio, texting, file transfer, personal e-mail, social media, and so forth. Many organizations use annual security awareness training to help mitigate this risk. However, this training is often ineffective, and the technology and risk landscape is fast evolving.

 

Join us at the IAPP Privacy Academy and CSA Congress 2014 in San Jose, Calif., September 17-19, for a lively interactive session, Healthcare Workarounds: Managing Risk in the Age of User Empowerment, sharing 2014 HIMSS research covering the extent of healthcare workarounds, motivations, types, and mitigations. We will then highlight practical strategies to enable new technology use while effectively mitigating risk and improving compliance with policy and regulations.

 

What you’ll take away:

  • Increasing empowerment of healthcare workers with technology
  • Growing source of risk from workarounds
  • Practical strategies to embrace and benefit from new technology while effectively mitigating risk

 

David Houlding, MSc, CISSP, CIPP is a senior privacy researcher with Intel Labs and a frequent blog contributor.

Find him on LinkedIn

Keep up with him on Twitter (@davidhoulding)

Check out his previous posts

In a previous blog, I highlighted the increasing empowerment of end users, whether healthcare workers or patients, with information power tools, delivering powerful new capabilities, but also new privacy and security risks. Health and fitness apps are rapidly proliferating across mainstream app ecosystems, driven by compelling benefits.

 

A sampling of the Google Play Store for Health and Fitness apps shows a wide spectrum, ranging from calorie counters to workout managers, pregnancy assistants, health managers, mental health apps, dieting apps, and more. They are collecting an increasing variety and volume of sensitive personal information, often in near real-time, presenting a growing privacy challenge. Many users are concerned about privacy and security on some level. However, many are not sure where the specific privacy risks are, or what alternatives they have to continue to engage while also reducing risks. In this blog I explore specific privacy risks in the app space.

 

There are many ways one can reduce such privacy risks. Below I explore six of the most practical actions one can take to continue to engage in using these kinds of apps, while also reducing risks.

 

1. Finding Safer Alternative Apps
In mature mainstream app ecosystems, there are many choices of apps in various categories, including health and fitness. When choosing, paying careful attention to number of users, app rating, reviews, privacy policy, and app permissions can help inform you of the associated privacy risk. Many app developers offer free baseline versions of apps which often use advertising to monetize. These free advertising supported apps often contain embedded ad network libraries that are privacy intrusive. Sometimes it is better to opt for the 99c version to avoid such privacy intrusions. After you have made your choice and installed an app, periodically running an app privacy & security scanner helps detect apps that use dangerous permissions, poor reputations, or are privacy intrusive.

 

2. Configuring Apps for Privacy
Many apps contain settings such as opt-outs that control your privacy cost in using them. They may also have passwords or encryption options. Once you have installed an app be sure to look through settings, be aware of what configuration controls you have, and set them in a way that enables you to use the benefits of the app you want, while avoiding any additional unnecessary privacy cost.

 

3. Disabling Apps You Still Want, But Use Only Occasionally
Beyond the types of personal info apps collect, privacy cost can also depend on the length of time your personal info is collected. In many cases we use an app only periodically, or rarely. However, many apps have background services that are continuously collecting your personal info. Mainstream operating systems including Android enable you to disable apps you use only occasionally, and have those apps automatically re-activated the next time you use them. For example, to do this in Android use the app manager to find the app you want to disable, then select it and “Force Stop” it. This disable endures through a reboot of the device. Only when you next run the app will it be automatically and seamlessly restarted.

 

4. Uninstalling Apps You No Longer Use
Most of us have had the experience of trying an app, not using it afterwards, while forgetting to uninstall it. Such apps can have background services that collect your personal info indefinitely, even though you are not actively using them. This not only has a privacy cost, but also consumes storage, CPU, battery, radio bandwidth and so forth. Periodically, perhaps monthly, reviewing the apps you have installed and uninstalling ones you no longer use is highly recommended to reduce you privacy cost and free up your device resources to ensure best performance.

 

5. Be Careful about the Type of Personal Info You Share, and Who You Share With
The privacy cost of using an app is dependent on what types of personal info you are sharing with it, and gets access to it as a result. Some of this sharing of your personal info automatic, and enabled through the permissions you grant to the app at install time as discussed above. However, much of this is the type of personal info you explicitly opts to share while using the app. Be especially careful of sharing anything that can identify you, be used to contact you, be used to locate you, or information that could be embarrassing or abused in the wrong hands. Any kind of financial or insurance information is also risky as it is highly sought after by hackers since it is easily monetized. Be aware of the audience you are sharing with through the app. For example when posting to social media there is often a choice of the particular forum or group you are sharing with on a post by post basis. Be sure the ones you are sharing with are the ones that have a need to know, and that you want to share with.

 

6. Pay Your Privacy Savvy Forward
Privacy is not an evenly distributed expertise. As you find value in using these practical approaches to reducing your privacy cost in using apps, pay it forward to your friends, family and colleagues to help them engage and have fun while avoiding privacy intrusions.

 

What other practical approaches are you using to protect your privacy and security while using apps?

 

David Houlding, MSc, CISSP, CIPP is a senior privacy researcher with Intel Labs and a frequent blog contributor.

Find him on LinkedIn

Keep up with him on Twitter (@davidhoulding)

Check out his previous posts

Healthcare workers are being empowered with more and more information power tools including apps, smartphones, tablets, social media, wearables and Internet of Things. These tools are fast evolving with new software capabilities appearing daily, and refresh rates on hardware much shorter than seen in the past with PCs.

 

With each of these information power tools comes new privacy and security risks. While existing security safeguards help, they only partly mitigate risk, and significant residual risk remains even after application of MDM or other safeguards, as evidenced by the range of risky actions users can still perform even with these safeguards in place. Effective training is direly needed to address this and help healthcare workers understand privacy and security risks of their actions and alternatives available to them to engage while reducing risk.

 

2014 HIMSS Analytics global research on healthcare privacy and security reveals that most organizations provide security awareness training annually (70 percent) or during new employee orientation (55 percent), with less than 40 percent providing such training on demand as needed.

 


 

Given new risks appearing daily with the fast evolving information power tools landscape, this highlights a gap. Even in the best case, where a healthcare organization fully comprehends all privacy and security risks and training is completely up to date at the time of delivery, six months down the line the information power tools healthcare workers are using are very different, exposing completely new privacy and security risks.

 

To enable healthcare to embrace and realize the benefits of new technology, including information power tools, we need a near real-time way of engaging healthcare workers with on-the-job privacy and security training that tracks the evolving risk landscape, fits into their workflow and context, highlights the privacy and security risks of their specific actions in teachable moments, and helps them understand safer alternatives that let them engage while reducing risk.

 

For example, a healthcare worker may take a picture of a patient with their smartphone and initiate sharing with coworkers using a file transfer app. While convenient, this can expose new risks to confidentiality, integrity, availability, and trans-border data flow. What if the healthcare organization’s privacy and security team could reach that healthcare worker in the teachable moment of this use case, and highlight both the risks and the viable alternatives available for them to achieve their goals while reducing risk?

 

This would effectively empower the healthcare workers with the privacy and security savvy they need to counterbalance the new risks they are exposed to with the new information power tools at hand, and enable healthcare organizations to embrace and realize benefits of new technologies for better patient care, while keeping risks of privacy and security incidents such as breaches manageable.

 

What approach are you using for effective privacy and security training of your healthcare workers?

 

David Houlding, MSc, CISSP, CIPP is a senior privacy researcher with Intel Labs and a frequent blog contributor.

Find him on LinkedIn

Keep up with him on Twitter (@davidhoulding)

Check out his previous posts

Healthcare workers are being empowered with more and more information power tools, from apps, to smartphones, tablets and other devices, to social media, and now wearables and Internet of Things.

 

These tools deliver great benefits that can improve the quality of patient care, and reduce the cost of healthcare. However, they also bring new risks of accidental breaches and other security and privacy incidents. 2014 HIMSS Analytics global research on healthcare security shows healthcare workers use workarounds (out of compliance with policy) either daily (32%) or sometimes (25%). For example a workaround could be texting patient information to a healthcare co-worker, using a file sharing app with patient information, and so forth.

 

Any one of these could result in a breach, with the staggering cost of a data breach averaging around US $5.85 million in the 2014 Cost of a Data Breach Study. The prevalence of workarounds and the impact of security incidents such as breaches highlight the alarming probability and impact of this type of privacy and security risk arising from healthcare worker actions. These types of risks and impacts are also set to increase going forward as healthcare workers are further empowered. In most cases, healthcare workers are well-intentioned and try to do the right thing. However, they inadvertently add risk using new information power tools, often under time or cost-reduction pressure. Exacerbating this, the security and privacy awareness training provided by healthcare organizations is often limited in effectiveness, and even in the best case where training is up to date and well delivered, the technology landscape is fast evolving, so the technology and risk landscape is significantly different even a few months later.

 

To date, much of the emphasis on responsibility for privacy and security has been placed on tool and service providers, enforced by regulators. This is analogous to safety regulators regulating the safety features of power tools used in workshops and for construction: even with the tool’s safety features, users know that they could inflict significant harm on themselves or others if they use the tools incorrectly.

 

In other words, they are responsible for using the power tools and incorporated safety features in a way that delivers the benefits while keeping risks of accidents minimal. What we are seeing in the information technology landscape is healthcare workers being empowered with information power tools such as apps, mobile devices, social media, wearables and Internet of Things, with little or no concurrent effective empowerment of privacy and security savvy on how to use these to get benefits while also minimizing risks of security incidents such as breaches.

 

To enable healthcare to rapidly realize the benefits of new technologies while keeping privacy and security risks manageable, we must find better ways of effectively empowering healthcare workers with the privacy and security savvy they need to use these information power tools safely.

 

What privacy and security risks are you seeing with healthcare workers using information power tools? I’m also curious about your thoughts, strategies, and best practices on how to manage these risks?

 

David Houlding, MSc, CISSP, CIPP is a senior privacy researcher with Intel Labs and a frequent blog contributor.

Find him on LinkedIn

Keep up with him on Twitter (@davidhoulding)

Check out his previous posts

A recent Reuters / Ipsos poll finds that 51 percent of Americans are spooked by Internet companies and their growing ambitions, “with a majority worried that Internet companies are encroaching too much upon their lives.”

 

Clearly, users are increasingly concerned about privacy, including healthcare workers and patients. This will likely be exacerbated going forward by new powerful apps and devices, health and wellness devices, medical devices, social media, wearables, and the Internet of Things.

 

However, many users don’t know what to do about it, and feel the situation is hopeless, as can be seen by widespread sentiment that “privacy is dead.” I assert that if privacy were dead, and all of our personal information was available to anyone who wanted it, including malicious individuals that would seek to abuse it, then we would be much worse off than we currently are.

 

In a prior blog, I discussed Viewing Health IT Privacy as a Purchase Decision. From this standpoint, privacy is far from dead, and we are not “privacy broke,” but rather many users are currently spending too much of their privacy in online engagements, given the benefits they are receiving, and there is a growing need to help users find more “privacy cost effective” solutions to meet their needs.

 

In many forms of online engagement, whether apps or social media, there is a natural privacy “give” required for the benefit or “get.” For example, if one wants to get live traffic information one must share one’s location history since this is a critical input used to calculate the live traffic information. Similarly, if a patient wants health and wellness advice he/she must be willing to share personal health and wellness information with apps and organizations that have the big data/analytics to collect, analyze and derive knowledge from raw health and wellness data and present it to the patient to help them make better choices.

 

However, in many online engagements there is an unnecessary privacy “give,” not required for the benefit the user is receiving. An example may include a flashlight app that has ad network libraries in it tracking the user’s location and other personal information – clearly not required for the user to get the function of the flashlight app providing light, and especially considering that there are many other functionally equivalent flashlight apps out there that do not require this unnecessary privacy “give.”

 

In many cases, there are simple actions users can take to achieve their goals while reducing or minimizing unnecessary privacy “give.” These could include changing the configuration settings of their apps and devices (for example, unchecking opt-outs), replacing privacy-intrusive apps with safer alternatives such as in the flashlight example above, changing the type of information they share in specific online engagements, or in the worst case uninstalling privacy-intrusive apps.

 

In many cases, users are unaware of the privacy “give” in online engagements. To really help users with privacy, beyond raising the alarm … helping them actually improve their privacy posture, we need to first increase users’ awareness of their unnecessary privacy “give” and then guide them to viable alternative actions that achieve their goals while significantly reducing or eliminating it. This is no easy task with the rapidly changing technology landscape, especially in the exploding ecosystem of health and wellness apps and online services, but it is critical if we are to maintain users’ trust and confidence in the privacy safety of new technology, and their willingness to adopt and use it.

 

Are you concerned about your privacy online? What solutions do you see to address these concerns?

 

David Houlding, MSc, CISSP, CIPP is a senior privacy researcher with Intel Labs and a frequent blog contributor.

 

Find him on LinkedIn

Keep up with him on Twitter (@davidhoulding)

Check out his previous posts
