A recent Reuters/Ipsos poll finds that 51 percent of Americans are spooked by Internet companies and their growing ambitions, “with a majority worried that Internet companies are encroaching too much upon their lives.”

Clearly, users are increasingly concerned about privacy, and that includes healthcare workers and patients. These concerns will likely be exacerbated going forward by powerful new apps, health and wellness devices, medical devices, social media, wearables, and the Internet of Things.

However, many users don’t know what to do about it and feel the situation is hopeless, as seen in the widespread sentiment that “privacy is dead.” I assert that if privacy were dead, and all of our personal information were available to anyone who wanted it, including malicious individuals who would seek to abuse it, we would be much worse off than we currently are.

In a prior blog, I discussed Viewing Health IT Privacy as a Purchase Decision. From this standpoint, privacy is far from dead, and we are not “privacy broke.” Rather, many users are currently spending too much of their privacy in online engagements relative to the benefits they are receiving, and there is a growing need to help users find more “privacy cost-effective” solutions to meet their needs.

In many forms of online engagement, whether apps or social media, there is a natural privacy “give” required for the benefit, or “get.” For example, if one wants live traffic information, one must share one’s location history, since that history is a critical input for calculating traffic conditions. Similarly, if patients want health and wellness advice, they must be willing to share personal health and wellness information with apps and organizations that have the big data and analytics capabilities to collect, analyze, and derive knowledge from that raw data and present it back to them in a way that helps them make better choices.

However, many online engagements also involve an unnecessary privacy “give” that is not required for the benefit the user is receiving. An example is a flashlight app bundling ad network libraries that track the user’s location and other personal information. That tracking is clearly not required for the flashlight app to provide light, especially considering that many other functionally equivalent flashlight apps do not require this unnecessary privacy “give.”

In many cases, there are simple actions users can take to achieve their goals while reducing or eliminating unnecessary privacy “give.” These include changing the configuration settings of their apps and devices, for example opting out of unnecessary data collection; replacing privacy-intrusive apps with safer alternatives, as in the flashlight example above; changing the type of information they share in specific online engagements; or, in the worst case, uninstalling privacy-intrusive apps.

In many cases, users are unaware of the privacy “give” in their online engagements. To really help users with privacy, beyond raising the alarm, and to actually improve their privacy posture, we need to first increase their awareness of unnecessary privacy “give” and then guide them to viable alternative actions that achieve their goals while significantly reducing or eliminating it. This is no easy task given the rapidly changing technology landscape, especially the exploding ecosystem of health and wellness apps and online services, but it is critical if we are to maintain users’ trust and confidence in the privacy safety of new technology, and their willingness to adopt and use it.

Are you concerned about your privacy online? What solutions do you see to address these concerns?

David Houlding, MSc, CISSP, CIPP, is a senior privacy researcher with Intel Labs and a frequent blog contributor.

Find him on LinkedIn

Keep up with him on Twitter (@davidhoulding)

Check out his previous posts