ISACA Journal
Volume 5, 2014


Beyond BYOD: Can I Connect My Body to Your Network 

Giuliano Pozza 

The discussion around bring your own device (BYOD) policies often focuses on topics such as the opportunities, risks and security implications of using personal devices and user apps in a business context. But sometimes, as in health care, the concern is not how to protect the company from being damaged by unsafe personal devices, but how to protect personal, life-saving medical equipment from being attacked through infected devices, with the organization’s network and infrastructure serving as a bridgehead.

Adverse Events and Near Misses

The traditional approach to health care technology divides the world into certified medical devices and nonmedical devices. In Europe, medical devices carry a CE mark, which signifies that the product conforms to all applicable European Community directives; in the US, the closest equivalent is premarket clearance or approval by the US Food and Drug Administration (FDA). Unfortunately, there is a dangerous blurring as boundaries are pushed, and evidence is beginning to show that implantable and nonimplantable medical devices pose new and unsettling security challenges. This has interesting implications for BYOD policies as well, especially in health care.

It is necessary to stress that, so far, there is no evidence of cyberattacks on implantable medical devices in the wild, but it has been demonstrated that many implantable devices can be attacked. Probably the most famous proof is the case of Jay Radcliffe, a 34-year-old computer network security expert with diabetes, who proved that his own insulin pump could be hacked.1 Another expert, Barnaby Jack, claimed that he could hack a pacemaker from 50 feet away, causing it to deliver a lethal 830-volt shock. He did not demonstrate the claim, but former US Vice President Dick Cheney did reveal that he had had his pacemaker’s wireless functionality disabled as a preemptive security measure.2

Recently, the US Food and Drug Administration (FDA) issued a safety communication stating that it is not aware of any patient injuries or deaths associated with cybersecurity vulnerabilities. In the same document, however, the FDA acknowledges that incidents could directly impact medical devices or hospital network operations through malware on hospital computers, smartphones and tablets, which could be used via wireless technology to access patient data, monitoring systems and even implanted patient devices.3

The notion is not new. In the 1980s, a bug in the software of one type of radiotherapy machine caused massive overdoses of radiation to be delivered to several patients, killing at least five of them.4 Indeed, an interesting perspective on implantable cardioverter defibrillators (ICDs) is offered by Kevin Fu and his coauthors, who explain how a device built to administer an electrical shock to restore a normal heart rhythm can be hacked to read data, modify settings or drain its battery via a denial-of-service (DoS) attack.5 It should be noted that the ICD studied by Fu and his colleagues communicated at 175 kHz, which limits the programming range to a few centimeters. Newer devices operate in the 402-405 MHz medical implant communication service (MICS) band, designed for longer-range communication.
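The weakness Fu and his colleagues exploited was, at its core, the absence of sender authentication: the device executed any well-formed command received over radio. The following minimal sketch contrasts that behavior with an authenticated command channel. The command names and the shared key are invented for illustration, and a real implant would also have to solve key distribution and replay protection, which this sketch ignores.

```python
import hmac
import hashlib
import os

# Hypothetical key shared between the programming station and the implant.
SECRET_KEY = os.urandom(32)

def insecure_handle(command: bytes) -> str:
    # Legacy behavior demonstrated by Fu et al.: any radio within range
    # can issue commands, because nothing authenticates the sender.
    return f"executed: {command.decode()}"

def sign(command: bytes) -> bytes:
    # The programming station attaches a MAC to each command.
    return hmac.new(SECRET_KEY, command, hashlib.sha256).digest()

def secure_handle(command: bytes, tag: bytes) -> str:
    # Authenticated variant: reject any command whose MAC does not verify.
    if not hmac.compare_digest(sign(command), tag):
        return "rejected: bad authentication tag"
    return f"executed: {command.decode()}"
```

With this scheme, an attacker who can transmit on the right frequency but does not hold the key can no longer drive the device; the trade-off, relevant to battery-powered implants, is that every received message now costs a cryptographic verification.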

Some producers are already developing apps to monitor medical devices.6 In another interesting case, a physician at Boca Raton Regional Hospital (Florida, USA) devised an iPad-driven method, called the remote-K-viewer, for adjusting and managing patients’ pacemakers. The system requires two-way communication between the physician on an iPad and another clinician on a mobile computer cart at the patient’s bedside. The physician logs into the system through the iPad, sees the patient’s pacemaker readings on the screen and, talking with the second clinician, walks him or her through the process of reprogramming the patient’s implanted device.7 Even without a direct connection between the physician’s tablet and the pacemaker, it is not difficult to see how hacking the tablet could produce incorrect readings and, therefore, errors when instructing the clinician who is adjusting the pacemaker.

Moreover, in the last year, the concept of “personal device” has been extended to a multitude of sensors and devices, as in the concept of a body area network (BAN). Hence, implementing a BYOD policy in a health care setting may result in a multidevice ecosystem comprising a crowd of sensors, smartphones, implantable devices, imaging and diagnostic equipment, workstations, and much more. While no one may be planning to directly connect a pacemaker to a hospital network or to the Internet, “indirect” connections are possible via the devices used to program the pacemaker (programming stations); these stations may be networked and are, therefore, attackable. Even if the programming station is not connected to the network, the reading device (whether an iPad or a home care monitoring station) is, in many cases, connected, and a manipulation of the readings could have a dangerous effect on the patient’s health.

Legacy equipment compounds the problem: pacemakers, for example, can last for more than 10 years. A device implanted in 2008 was probably not designed with the security focus of today’s devices, and a pacemaker implanted today will have to withstand the security threats of 2024. Meanwhile, the pace of development in the biomedical engineering field is almost exponential, as shown in many articles.8, 9, 10, 11, 12

From an organizational point of view, consider how responsibility for medical device security is divided between IT professionals and clinical engineers. IT professionals usually focus on the Radiology Information System-Picture Archiving and Communication System (RIS-PACS) and overlook devices such as imaging scanners, which can harbor hidden security risks. For example, some imaging devices cannot run antivirus software, while others contain hidden electronic protected health information and can be infected through a variety of vectors. Imaging devices connected to hospital networks can pick up malware in a number of ways: through the network itself, through a manufacturer-installed access point left open to exploitation, or even through a cell phone connected to the device for charging. Clinical engineers, on the other hand, are great at managing the mechanical aspects of medical device issues, but are less comfortable with software.13

A further consideration: personal and mobile devices will increasingly become the attack vector of choice.14


The vast majority of hospitals use layered security methods and procedures to protect themselves and their patients from risk. They deploy active and passive security tools; virtual local area networks (V-LANs) are a common way to protect subnetworks (subnets) dedicated to medical devices. Often, implantable devices’ programming stations are stand-alone or are confined to protected V-LANs, as are other medical devices. Still, the vulnerabilities are enormous.
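The protective value of such dedicated V-LANs comes from a default-deny stance: only explicitly allowed flows may cross into or out of the medical-device subnet. The zone names, port number and rule set below are hypothetical, a sketch of the filtering logic rather than any specific firewall’s configuration.

```python
# Hypothetical allowlist for a medical-device V-LAN:
# (source zone, destination zone, destination port).
# Anything not explicitly listed is dropped -- the "default deny" stance.
ALLOWED_FLOWS = {
    ("clinical-workstations", "device-vlan", 104),  # DICOM traffic to imaging devices
    ("device-vlan", "pacs-servers", 104),           # device results out to PACS
}

def permit(src_zone: str, dst_zone: str, port: int) -> bool:
    """Return True only for flows explicitly allowed to cross the device V-LAN boundary."""
    return (src_zone, dst_zone, port) in ALLOWED_FLOWS
```

Under this logic, a BYOD phone on a guest network attempting to reach an infusion pump is simply dropped, because no rule mentions it; the residual risk lies in the allowed flows themselves, which is why a compromised clinical workstation remains a dangerous bridgehead.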

Another way to mitigate the risk of unclear responsibilities between IT and clinical engineering is to review the organizational structure supervising health care technology (IT and medical devices) and/or to create an explicit bridge between the IT and clinical engineering functions.15

But the key point is governance (which, of course, includes risk and security management) more than any other aspect. A framework for governance of enterprise IT (GEIT) is absolutely essential to correctly evaluate the risks and opportunities and to assign responsibilities, as explained in the International Organization for Standardization’s ISO/IEC 38500 standard.16

Before proceeding with BYOD, a basic check of certain principles is needed. Figure 1 presents three questions—each one with a link to the ISO 38500 principles and an explanation of possible peculiarities to be considered in health care (and possibly in all high-risk contexts).

Figure 1


Prudence is the virtue here. The hard fact is that everything is moving rapidly, and no one knows what the world will look like even three to five years from now. Technologists and futurists are coming up with interesting scenarios, but none of them are reassuring. According to Gartner, the nexus of forces (social, cloud, information, mobile) will drive society into one of four scenarios: coalition rule, neighborhood watch, controlling parent or regulated risk.17 The input variables for the scenarios are important (from tribal to monolithic control authority, for example, and from enterprise to individual targets), but how they will play out is still entirely unclear. In the Project 2020 report, the International Cyber Security Protection Alliance (ICSPA) depicts a future that includes different levels of security on the web, with a secure web coexisting with a general-purpose web.18 Other authors envision a “technological singularity” triggered by a number of inventions and accelerated by the growing number of connections, users and devices on the Internet.19 The fact is that future scenarios cannot be predicted, and the variability among the different hypotheses is extreme. This must be taken into account when deciding present strategies and tactics.

The decision about BYOD is not a purely technical decision, but a governance decision (both IT and enterprise governance). An “absolutely no BYOD” strategy is tough to sustain, but it is probably the safest choice if the answers to the three questions in figure 1 are negative. A pragmatic BYOD strategy may be better than simply letting users devise their own workarounds.20 Encryption, patch management and antivirus software can mitigate the risk. On the other hand, a BYOD implementation without a careful evaluation of risk, resources and culture is probably the worst and most dangerous option.


1 Robertson, J.; “The Trials of a Diabetic Hacker,” Businessweek, 23 February 2012
2 Kubota, T.; “Hacking Your Heart,” HIMSS Future Care, 25 October 2013
3 US Food and Drug Administration (FDA), “FDA Safety Communication: Cybersecurity for Medical Devices and Hospital Networks,” USA, 13 June 2013
4 The Economist, “How vulnerable are medical devices to hackers?,” The Economist Explains, 18 June 2013
5 Fu, K.; D. Halperin; T. Heydt-Benjamin; B. Ransford; S. Clark; B. Defend; W. Maisel; “Pacemakers and Implantable Cardiac Defibrillators: Software Radio Attacks and Zero-Power Defenses,” IEEE Symposium on Security and Privacy, 2008
6 Dolan, B.; “Medtronic Launches First App for Implantable Devices,” 28 June 2011
7 Jackson, S.; “iPad Allows Docs to Remotely Manage Pacemakers,” FierceMobileHealthcare, 12 December 2011
8 Ibid.
9 Gollakota, Shyamnath; Haitham Hasanieh; Benjamin Ransford; Dina Katabi; Kevin Fu; “They Can Hear Your Heartbeats: Non-Invasive Security for Implantable Medical Devices,” SIGCOMM, 2011
10 US Food and Drug Administration (FDA), Cybersecurity for Networked Medical Devices Containing Off-the-shelf (OTS) Software, USA, 2005
11 Kime, P.; “DARPA Closer to Brain-controlled Prosthesis,” Navy Times
12 Fox News, “Robohand: DARPA’s Bionic Arm Can Be Controlled by Your Brain,” 2014
13 Ridley, E. L.; “Imaging Devices Present Hidden Security Risks,” AuntMinnie, 2012
14 TrendLabs, “Blurring Boundaries—Trend Micro Security Predictions for 2014 and Beyond,” Trend Micro, 2013
15 Pozza, G.; “Healthcare SCADA Systems and Medical Devices Data Systems (MDDS) Governance and Security: A No Man’s Land?,” Journal of Clinical Engineering, vol. 39, iss. 3, 2014
16 Holt, A. L.; Governance of IT: An Executive Guide to ISO/IEC 38500, BCS, 2013
17 Hunter, R.; “The Future of Global Information Security,” Gartner, 2013
18 ICSPA, Project 2020: Scenarios for the Future of Cybercrime—White Paper for Decision Makers, 2013
19 Kurzweil, R.; The Singularity Is Near: When Humans Transcend Biology, Viking, 2005
20 Pozza, G.; J. D. Halamka; The Fifth Domain, self-published, CreateSpace, 2014

Giuliano Pozza is a biomedical engineer by training and the chief information officer (CIO) of Fondazione Don Carlo Gnocchi Onlus, one of the largest social care and rehabilitation entities in Italy. Previously, Pozza worked as the CIO of Istituto Clinico Humanitas, a primary hospital in Italy. At the beginning of his career, Pozza worked in the health care practice of consulting firm Accenture. He can be reached at


Opinions expressed in the ISACA Journal represent the views of the authors and advertisers. They may differ from policies and official statements of ISACA and from opinions endorsed by authors’ employers or the editors of the Journal. The ISACA Journal does not attest to the originality of authors’ content.