IoT Security and Privacy: Exploring Technology Solutions Aligned to Regulatory Needs

Jon Shende

In my last post, I spoke about the Internet of Things (IoT) in terms of trust, security and privacy at a high level. Here, I will take a deeper dive into how IoT security and privacy can impact an ecosystem interconnect.

When we talk about IoT, we think about the processes we implement as we migrate to sensor-driven infrastructure for automation.

Looking at economies and technology ramp-up trends from a financial perspective, we can expect standardization around policies and processes, as well as around the interfaces expected to connect sensors to networks, platforms, application systems, or a combination of services.

It can all appear to be complex and large in scale, especially in the borderless world of IoT. However, if, as security and privacy professionals, we ask ourselves, “What are the major areas we should focus on?” my perspective is that we will have to look at:

  1. Device security and settings
  2. Securing device and system physical access (IAM)
  3. Securing our communication network systems
  4. Dealing with the large volume of data we will have to process, leveraging big data analytics, risk scoring and criticality metrics aligned to a system, user privilege and business functionality (a minimal scoring sketch follows this list).
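To make item 4 concrete, here is a minimal sketch of a weighted risk-scoring function of the kind described above. All weights, thresholds and names (`RiskInput`, `triage`) are invented for illustration; they are not part of any established IoT scoring standard:

```python
from dataclasses import dataclass

# Illustrative weights; a real program would calibrate these
# against its own criticality metrics and threat model.
WEIGHTS = {"system_criticality": 0.4, "user_privilege": 0.35, "business_impact": 0.25}

@dataclass
class RiskInput:
    system_criticality: float  # 0.0 (low) to 1.0 (critical)
    user_privilege: float      # 0.0 (read-only) to 1.0 (admin)
    business_impact: float     # 0.0 (negligible) to 1.0 (severe)

def risk_score(event: RiskInput) -> float:
    """Weighted risk score in [0, 1] for a sensor/system event."""
    return (WEIGHTS["system_criticality"] * event.system_criticality
            + WEIGHTS["user_privilege"] * event.user_privilege
            + WEIGHTS["business_impact"] * event.business_impact)

def triage(score: float) -> str:
    """Map a score to a review queue; thresholds are illustrative."""
    if score >= 0.75:
        return "investigate-now"
    if score >= 0.4:
        return "review-daily"
    return "log-only"

print(triage(risk_score(RiskInput(0.9, 0.8, 0.7))))  # investigate-now
```

The point of such a model is to let big data analytics rank the flood of sensor events so that human attention goes to the highest-scoring ones first.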

IoT PriSec Model
The team at The Cyber Policy and Security Governance Institute has been developing an IoT PriSec Model. This model:

  1. Combines best of breed practices based on network, system and application security, which integrates functionality to meet data security lifecycle expectations as well as data privacy requirements for in-border and cross-border migrations.
  2. Is built on the premise that an IoT infrastructure ecosystem consists of a self-healing, secure network infrastructure and systems that extract data for analysis from system-to-system connections and sub-system interactions. This system will have a big data capability to build an analysis of permitted, potentially dangerous and malicious activities, allowing for event-driven capabilities and driving a mindset of adaptive security.
  3. Will be further enhanced to adapt to blockchain technologies.
  4. Integrates privacy definitions that are tied into IAM and privileged access management, which is tightly tracked and auditable.
  5. Promotes an effective combination of cryptography and smart analytics integrated into sensor security mechanisms which can quickly assess, measure and score attack attempts and attack paths for smart attack detection.

One area that will have an impact on IoT environments, given that the growth of cloud and big data are enablers of IoT, is that of unikernel security.

In the paper “Unikernels: Library Operating Systems for the Cloud,” A. Madhavapeddy and team describe a unikernel as follows: “In the context of virtual machines and cloud computing, it makes sense to describe the whole virtual machine as a unikernel.”

Bratterud, Happe and Duncan presented a paper on “Enhancing Cloud Security and Privacy: The Unikernel Solution,” which lists six observations exhibited by Unikernel systems as follows:

  1. Choice of service isolation mechanism
  2. The concept of reduced software attack surface
  3. The use of a single address space, shared between service and kernel
  4. No shell by default, and the impact on debugging and forensics
  5. Microservices architecture and immutable infrastructure
  6. Single thread by default

In a following piece, I will present further details on this aspect, as well as other areas that we are seeing leading IoT vendors focus on from a security and privacy best practice perspective.

GDPR Can Bring Major Benefits to Governance, Security Professionals

Vilius Benetis

The European Union has long considered that a person owns all non-public data about him or her. Each individual then explicitly grants and revokes rights to process (for example: collect, analyze, aggregate and store) his or her personal data to everyone interested.

With some data, it is easy. One signs a contract, and later on, perhaps cancels the contract, along with permissions to process the data. But the question is not only about granting or revoking rights to process; it is also about getting to know which data is stored, how it was processed, with whom it was shared, and having the possibility to remove that data from systems (i.e., to be forgotten).
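As an illustration of the grant/revoke-and-audit model this paragraph describes, here is a minimal sketch of a consent registry. The class and method names are hypothetical; GDPR mandates the capability, not any particular API:

```python
from datetime import datetime, timezone

class ConsentRegistry:
    """Illustrative per-subject record of processing permissions.

    Tracks which purposes a data subject has consented to, and keeps
    an audit trail so "which data, processed how, shared with whom"
    can be answered later.
    """

    def __init__(self):
        self._consents = {}   # (subject_id, purpose) -> granted: bool
        self._audit_log = []  # append-only history of changes

    def grant(self, subject_id: str, purpose: str) -> None:
        self._record(subject_id, purpose, granted=True)

    def revoke(self, subject_id: str, purpose: str) -> None:
        self._record(subject_id, purpose, granted=False)

    def may_process(self, subject_id: str, purpose: str) -> bool:
        # Default-deny: no recorded consent means no processing.
        return self._consents.get((subject_id, purpose), False)

    def history(self, subject_id: str) -> list:
        """Support access requests: everything recorded about a subject."""
        return [e for e in self._audit_log if e["subject"] == subject_id]

    def _record(self, subject_id, purpose, granted):
        self._consents[(subject_id, purpose)] = granted
        self._audit_log.append({
            "subject": subject_id, "purpose": purpose,
            "granted": granted, "at": datetime.now(timezone.utc),
        })

registry = ConsentRegistry()
registry.grant("alice", "newsletter-analytics")
assert registry.may_process("alice", "newsletter-analytics")
registry.revoke("alice", "newsletter-analytics")
assert not registry.may_process("alice", "newsletter-analytics")
```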

Data in the physical world leaves some traces, and even more so in the digital world. Each of our digital activities touches many systems: computers, servers, information systems, transmission systems, security systems, usage analysis systems and so on. Moreover, not all of these traces are covered by contractual relationships, due to the complexity of interaction between systems as well as to usability concerns.

Information systems and the Internet were designed mostly around another model – that the owner of the system owns the data as well, unless it is specifically provisioned otherwise.

The new EU General Data Protection Regulation (GDPR) threatens every organization globally that processes the data of clients and EU residents with big fines if the EU approach to data ownership is not respected. ISACA recently issued a new publication, GDPR Data Protection Impact Assessments – What Does It Mean to Me, providing guidance to practitioners and their organizations on how to deal with these considerations.

Despite all the difficulties, I would argue that implementation of the new regulation brings a lot of benefits to all those involved in IT governance, such as:

  1. Organizations are forced to inventory all their digital assets and start managing them better. In this way, more resilience against cyberattacks or mismanagement can be created.
  2. IT staff are forced to talk to and understand legal teams, discuss the impact, and better understand threat landscapes and liabilities, which shrinks gaps in understanding.
  3. The true cost of automation will be calculated more accurately. Right now, it is often calculated as only the costs of designing, deploying and running information systems. Going forward, the costs of securing information systems, of life-cycling data and information systems, and of creating, processing, destroying, auditing, handing over and disposing of data will also be assessed (a toy calculation follows this list).
  4. A new profession with a clear mandate and responsibilities will be brought into most organizations. The Data Protection Officer (DPO) will provide extra help in IT governance.
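A toy calculation for item 3 above. Every figure is an invented placeholder, meant only to show how the often-omitted cost lines change the total; substitute your organization's own numbers:

```python
# Toy lifecycle cost model. All figures are invented placeholders.
build_and_run = {
    "design": 120_000,
    "deployment": 80_000,
    "operations_per_year": 60_000,
}
often_omitted = {
    "securing_systems_per_year": 25_000,
    "data_lifecycle_handling_per_year": 15_000,  # create/process/destroy/hand over
    "audits_per_year": 10_000,
    "decommissioning_and_disposal": 30_000,
}

YEARS = 5
naive_tco = (build_and_run["design"] + build_and_run["deployment"]
             + build_and_run["operations_per_year"] * YEARS)
full_tco = naive_tco + often_omitted["decommissioning_and_disposal"] + YEARS * (
    often_omitted["securing_systems_per_year"]
    + often_omitted["data_lifecycle_handling_per_year"]
    + often_omitted["audits_per_year"])

print(f"naive 5-year TCO: {naive_tco:,}")  # 500,000
print(f"full 5-year TCO:  {full_tco:,}")   # 780,000
```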

Overall, GDPR has the potential to be one of the pillar forces that brings us together to address cyber security properly. While it alone will not be sufficient, combined with other governance and regulatory efforts, real progress can be made.

GDPR: The Role of the DPO – And How to Find One in a Competitive Landscape

Michael Hughes

GDPR (General Data Protection Regulation) introduces the new role of Data Protection Officer (DPO). While many organizations have had a role with that title under the existing EU Directive, member states had different interpretations of what it meant. GDPR takes the responsibilities of the DPO to another level.

To be able to effectively discharge the duties of the DPO, as outlined in Articles 38 and 39 of GDPR, the DPO needs to hold a position of high authority in the organization, have a wide range of experience, and be multiskilled, both technically and socially.

The requirement to appoint a DPO will fall mainly upon large corporations, government bodies, organizations in the health and social care sectors, and financial institutions – mostly organizations based in the EU.

However, small and medium enterprises (SMEs) may also need a DPO, as they could be a key component in a large corporate or government organization’s supply chain. In these cases, the DPO probably will not be a dedicated role, and could even be brought in as a managed service.

Also, for the first time, an organization acting as an information processor under an outsourced managed service, such as a cloud service provider arrangement, may need to consider the role of DPO.

This all means there is going to be a large demand for DPOs. There are many job adverts out there requiring X number of years of GDPR experience, but such people simply do not exist. Yes, there are many data privacy professionals out there, but the requirements of the GDPR go beyond this.

So, what makes a good DPO?
The DPO needs a mix of skills and experience extending from data privacy into information risk management, relationship management and persuasion/negotiation, plus the ability to operate at the highest levels within an organization. DPOs will need to communicate effectively across the whole organization, with the ability to articulate potential risk in business terms. The DPO needs to understand the risk to information and how to protect that information appropriately and adequately, relative to its level of risk, through people, processes and technology; related governance processes; and management controls.

The DPO’s initial primary focus will be to get his or her organization GDPR-compliant by the May 2018 deadline, when GDPR becomes enforceable. This will require engagement with all areas of the organization to obtain a good understanding of the information gathered, processed, stored and shared, with particular attention to Personally Identifiable Information (PII).

However, once the DPO has the organization GDPR-ready, he or she can add real business value by taking a wider view of information governance. With this in mind, larger organizations should seriously consider developing the DPO role into that of the Chief Data Officer (CDO).

Many of the required skills, and the necessary standing within an organization, belong to the CDO. While the role of the CDO is wider than that of the DPO, there are many similarities.

To sum up, there is a massive requirement to recruit DPOs with GDPR experience. As GDPR is only in its implementation phase, these people do not exist in the numbers required. Therefore, organizations need to take a more pragmatic view. Look at existing data protection professionals: can they be developed into the DPO role with training and coaching? Look at information risk and information governance professionals: can they be trained in data privacy? Large corporates should look at the role of Chief Data Officer, and SMEs at buying a managed service.

GDPR: What a Data Protection Impact Assessment Is and Isn’t

Rebecca Herold

There has been a lot written over the past year or so about the EU General Data Protection Regulation (GDPR) – what is required, and what needs to be accomplished sooner rather than later in order to meet the May 25, 2018 compliance date. And with 99 articles containing hundreds of requirements, the GDPR certainly covers many topics that must be addressed.

While seven to eight months may seem like a long time to address them all, it is important for those responsible for GDPR compliance activities to realize that some of those activities will necessarily take many weeks of planning and preparation, and then most likely many additional weeks of actual implementation.

One case in point is performing a GDPR-compliant data protection impact assessment (DPIA). I’ve heard and read a variety of statements about DPIAs over the past several months, and I want to correct and clarify a few that have been especially concerning.

  1. “I’ve already done a privacy impact assessment (PIA), so I’ve got the GDPR DPIA requirement already taken care of!” Wait; not so fast. While I view a DPIA as a specific type of PIA (this is debated by various fans and foes of both DPIAs and PIAs, but in my experience doing more than 100 PIAs and more than a dozen DPIAs, I believe that, with the outliers effectively addressed, a DPIA fits well within the larger PIA domain), a DPIA does have important and significant differences from a traditional PIA. So, don’t think that just because you’ve done a PIA at some point within the past year or two, you’ve met all the GDPR requirements for a DPIA.
     
  2. “Our lawyers told us it was a legal activity, and that IT, privacy and information security folks don’t need to bother with doing a DPIA.” While you need to do a DPIA to meet GDPR legal requirements, the answers you will need to provide will typically not be known to the legal department, so the effort will need involvement from key stakeholders in IT, information security and privacy. And if you’re waiting for the legal department to contact you with simple yes-or-no questions, keep in mind that many of the answers needed for an accurate and acceptable DPIA will not be yes or no. To make risk level calculations as accurate as possible, you must provide more detailed descriptions of where you stand in accomplishing activities.
     
  3. “I got a free 10-question GDPR readiness checklist from the Internet, so I’ll use that for my DPIA.” That is a dangerous, and incorrect, belief. A checklist is not an assessment. I am a fan of using checklists to keep track of progress with projects, to make sure that I have addressed specific topics fully, to answer truly yes/no dichotomy questions, or to cross off the items I wanted to get at the grocery store after I put each in my cart. An assessment is not an either/or dichotomy. An assessment involves various types and levels of analysis, arriving at a result that could be any number of answers, representing a multitude of risk levels. GDPR checklists that boil down the high-level activities you need to do can be helpful; a checklist, though, will not accomplish the GDPR-required DPIA (the sketch after this list illustrates the difference).
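To illustrate the contrast in item 3, here is a minimal sketch of the same answers viewed as a checklist versus as a graded assessment. The questions and the 0-4 scale are invented for illustration; they are not GDPR text:

```python
# Illustrative contrast between checklist and assessment answers.
assessment = {
    "Data minimization applied to new collection points": 3,  # 0=absent .. 4=fully implemented
    "Data subject access requests supported end to end": 1,
    "Cross-border transfer safeguards documented": 2,
}

def checklist_view(scores: dict) -> dict:
    # A checklist collapses everything to done / not done ...
    return {q: s == 4 for q, s in scores.items()}

def risk_view(scores: dict) -> dict:
    # ... while an assessment preserves graded risk levels.
    levels = {0: "critical", 1: "high", 2: "moderate", 3: "low", 4: "minimal"}
    return {q: levels[s] for q, s in scores.items()}

print(checklist_view(assessment))  # mostly False: little to act on
print(risk_view(assessment))       # graded levels you can prioritize
```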

A significant purpose of requiring organizations to conduct DPIAs is to identify and reduce the data protection risks within projects, networks and systems; to reduce the likelihood of privacy harms to data subjects; and to determine the extent to which all of the applicable 99 GDPR articles have been implemented by the organization. Traditional PIAs have not fully addressed harms to data subjects (though that is important for all to address, whether or not it is for a DPIA), and they certainly did not look at the specific DPIA requirements that go beyond the topics traditional PIAs cover.

To the specific point of performing a DPIA, I recommend that organizations use a framework that not only addresses and meets the GDPR requirements, but can also meet other requirements for performing other types of privacy impact assessments. I’ve created a PIA framework, based upon the ISACA Privacy Principles, which consolidates similar privacy principle requirements and topics into the 14 ISACA Privacy Principles and maps all the DPIA requirements within them; those DPIA questions also map to other standards, frameworks and regulatory data protection requirements.

I will go over the associated methodology on 28 September at the “How to Perform GDPR Data Protection Impact Assessments” ISACA webinar (www.isaca.org/Education/Online-Learning/Pages/Webinar-How-to-Perform-GDPR-Data-Protection-Impact-Assessments.aspx), and will also point to a spreadsheet I created for ISACA members to use for performing DPIAs, as well as a new version of an automated DPIA tool I created for ISACA to make available to members.

I hope you can join me!

Privacy Has Had Its Chernobyl Moment

Privacy has had its Chernobyl moment.

Maybe it was when a foreign power stole everything every American had submitted for a clearance form from the Office of Personnel Management. Maybe it was when an insurer lost control of the health records of millions of Americans. Maybe it was when the United Kingdom spilled its child benefit data. Maybe it was when India created a biometric ID system and sort of forgot about controls.

However you want to define a privacy Chernobyl, it, or something like it, has happened.

We exist in a world where our expectation of privacy has been shattered, diminished and demeaned, and yet privacy invasions still outrage us. What we haven’t done is build a cap, and certainly not a sarcophagus designed to protect the radioactive slag for an appropriately long time.

Privacy failures still make the news. Failures on the part of firms who have promised to take it seriously still result in 20-year consent decrees. (Recall that 20 years ago, in 1997, Alta Vista was still the dominant search engine, the Motorola flip phone was dominant amongst those weirdos who bothered with a cellphone, and 56k was pretty good internet connectivity through your phone line. Will word choices that seem agreeable today be sensible after 20 more years of technological acceleration?)

I want to encourage you to use Implementing a Privacy Protection Program: Using COBIT 5 Enablers With the ISACA Privacy Principles as a way to realize that personal data is radioactive, and to start treating it as such. If you accumulate too much, you risk a meltdown; even when you have it in small doses, you want to be intentional about it. You want to know why it’s here, how you’re protecting it, and how to get rid of it when the risk exceeds the reward.
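As a minimal sketch of that “radioactive data” bookkeeping – why it’s here, how it’s protected, and when to get rid of it – consider something like the following; all field names and figures are illustrative:

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class PersonalDataHolding:
    """One 'radioactive' data set: purpose, protection and half-life."""
    name: str
    purpose: str            # why it's here
    safeguards: str         # how you're protecting it
    collected: date
    retention: timedelta    # how long before risk exceeds reward

    def due_for_disposal(self, today: date) -> bool:
        return today >= self.collected + self.retention

inventory = [
    PersonalDataHolding("support-tickets", "customer support",
                        "encrypted at rest, role-based access",
                        date(2015, 3, 1), timedelta(days=365 * 2)),
]

for holding in inventory:
    if holding.due_for_disposal(date.today()):
        print(f"dispose: {holding.name} (purpose '{holding.purpose}' expired)")
```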

You should be thinking of ISACA’s new privacy protection guidance as an important move forward in your privacy journey. It’s a necessary step, and going through the steps will help you understand if there’s more that you need to do.

Editor’s note: Additional privacy-related guidance can be found in ISACA’s new white paper, Adopting GDPR Using COBIT 5.

About Adam Shostack: Adam is a consultant, entrepreneur, technologist, author and game designer. He's a member of the BlackHat Review Board, and helped found the CVE and many other things. He's currently helping a variety of organizations improve their security, and advising and mentoring startups as a Mach37 Star Mentor. While at Microsoft, he drove the Autorun fix into Windows Update, was the lead designer of the SDL Threat Modeling Tool v3 and created the "Elevation of Privilege" game. Adam is the author of "Threat Modeling: Designing for Security," and the co-author of "The New School of Information Security."

GDPR Compliance: One Step at a Time

Steve Wright

Most of the people I speak to about GDPR are struggling with two main things.

The first one is how to interpret the GDPR text, specifically on issues like consent or new privacy rights like the “right to restrict processing,” the “right to oppose profiling,” or the scope of the “right to data portability.” The other is where to start, given the lack of detailed guidance on practical implementation.

I think these two are interlinked and have to be addressed together, simultaneously. In other words, I believe you should approach the GDPR program as a whole, and not try to separate it into different aspects or outsource the program in its entirety, as some of the people I’m speaking with are doing.

My business leaders, data owners, IT architects and the CIO have all been badgering me for clear guidance or definitive policy statements, which is really hard when the GDPR text is very oblique and vague on the ‘what’ and ‘how,’ and there is no regulatory guidance or case law yet. They want absolutes – like a rule book or PCI. They want hard facts with yes or no answers. Well, this simply is not possible.

In the past, I turned to lawyers, who kept on telling me “it depends,” which is no good when you need to provide definitive or strategic direction. So instead, we got down into the weeds of the text, and I worked night and day with my in-house lawyer, a solutions architect and really good privacy analysts. Between us, we developed the GDPR Framework and the Privacy Playbook.

The GDPR Framework is just what it sounds like: a concept model – a framework by which the architects and the business could start to consider, from a system or process perspective, the impacts of “the minimum rules.” The Privacy Playbook gave us the flexibility to develop, amend, collaborate on and interpret the text, and to conduct ‘what if’ scenarios that helped shape crunch decisions needed by the business so that it could get on with business planning (impact vs. risk). The decisions were captured as policy decisions, to ensure the full impact of changes could be considered and absorbed by the business.

So far, this collaborative approach has worked out well; we are now drafting a consolidated version of the Playbook – with the minimum outcomes necessary to comply. We have completed the discovery exercise to understand the current proliferation of key data sets, and we are considering the full implications (and options) of what ‘good’ GDPR compliance looks like.

The board is now on board, and the path to our May 2018 compliance milestone is clearer.

One thing is for sure, the only way to get there is by taking one step at a time.

Editor’s note: For more on GDPR, register for the 14 September webinar, “How to Jump Start GDPR with Identity & Access Management.”

To Micro-Chip or Not to Micro-Chip: That is the Question

Rebecca Herold

Talk of employees at a Wisconsin (USA) business getting microchip implants to use within its work facilities for a wide variety of purposes (such as access control to business networks and secured rooms, using business machines, making payments in company stores and vending machines, and many other activities) has been the topic of hundreds of recent news reports.

It seems like those giving opinions are almost exclusively in either the “Hell, no! This is Big Brother surveillance run amok, signaling the end of all privacy!” camp, or the “Hell, yes! The benefits this will bring to the business and employees in saving time, improving security, and facilitating better use of machinery is unparalleled!” point of view.

I don’t fall strictly into either point of view. Of course, this microchipping technology could provide a wide range of benefits and prove to be an increasingly pervasive and powerful business tool. However, with great power comes great responsibility. The marriage of technology with business activities and employee information is typically a very complicated situation. Without adequately addressing the privacy issues involved, the situation could quickly spin out of control and result in a messy business divorce, with associated lawsuits, bad PR for the business, and irreparable harm to the individuals involved.

Benefits
So, let’s consider some of the major potential benefits:

  • No more passwords to remember or manage. This could strengthen access controls for your computers, systems, networks and anywhere user IDs and passwords are used, and save a lot of time for your user support area. And it could reduce the risk of hackers getting into your systems (a sketch of one way chip-backed authentication could work follows this list).
  • No money for snacks or lunch? No problem! The chips could enable workers to more quickly and efficiently purchase snacks and meals without worrying about having cash or a credit card on hand.
  • No more PCI DSS to deal with! Wait. Really? Hmm. Well, that depends upon how you implement the systems. But you could enable chips to deduct purchases from the business directly from paychecks, so there’s that.
  • Safer facilities. These chips can help organizations – especially large ones with many workers, and businesses in huge, sprawling facilities where workers may be located virtually anywhere – accurately know where workers are, to help with emergency situations, to ensure all individuals are accounted for during disasters such as fires, hurricanes and tornadoes, and in any other situation where accounting for everyone’s location is critical.

Wow, this is great!

Whoa, there; hold on a minute. Don’t make your decision yet. Keep reading.

Risks and harms
Before you jump onto the pro-microchipping bandwagon, you must also consider the potential business, security and privacy risks and harms. To fully appreciate these risks, you first need to ask yourself some key questions:

  • What data is collected by the microchips? From what has been reported, this will include individuals’ names, locations, dates and times of facility entries and exits, and items purchased with the associated dates, times and prices. And, potentially much more.
  • How will all that data be used? To deduct payments from paychecks? To use for attendance? When considering healthcare claims? Salary increases? Promotions or demotions? Firing? And many more possibilities.
  • With whom will the data be shared? HR? Managers? Coworkers? Marketers, to see what employees are purchasing? Outside food and clothing vendors? Police? Government agencies?

Once you establish the answers to these questions, then consider just a few of the many possible risks to the business:

  • Bad press could hurt business reputation. If any of your employees do not like the idea of being chipped and complain to others outside of the business, there is a high probability of negative publicity hurting the business and lowering your brand value. Other bad press could occur if the chipping results in physical harm to the individuals, if the data is breached, or if the chipping systems have security incidents or failures.
  • Security incidents could result in breaches, down time, etc. What happens if the chipping system doesn’t play well with the other systems and causes networks to slow to unacceptable speeds, or brings them down completely? Or, what if the systems implemented are not mature and, as a result, data is not processed correctly? Any number of other incompatibility problems could also surface.
  • Lawsuits from those chipped. Even if the chips are made optional, it is possible that those who agreed to get them will come to regret their decisions, perhaps because the chip caused pain, rash or some other physical problem. Or, maybe they read a report about how the chip’s data is used, and then feel like they were tricked into getting them. Don’t forget, the USA and many other countries are litigious societies.
  • Noncompliance violations. It is quite possible that the use of these chips, the implementation of the use, or the associated data use could be violating applicable data protection laws and regulations. For example, consider the many actions you would need to take if you wanted to use microchips in a way that is in compliance with the EU General Data Protection Regulation (GDPR).

And, most importantly from a privacy rights standpoint (and in support of compliance with data protection laws outside of the USA, such as the EU GDPR), consider just a few of the privacy harms that could come to the associated individuals (data subjects):

  • Lost jobs. This could result if the chip data showed the employee was at an off-limits location, was in the cafeteria at a time he or she should have been in a meeting, was doing inappropriate activities on the network based on activities the chips logged, etc. But consider that data taken out of context could lead to bad employment decisions.
  • Denied loans. Purchase habits, as revealed by the chips, could result in employer credit unions or other lenders who obtain copies of the chip data to deny work loans for college, homes, cars or any other purpose. But consider that data taken out of context could lead to bad loan decisions.
  • Denied insurance claims. If the business self-insures its workers or provides the chip data to insurance companies, they could use the data to deny health insurance claims (“A diabetic knows better than to eat candy bars!”), life insurance claims (“They were in a clearly marked restricted area!”), or a wide range of other claims. But consider that data taken out of context could lead to bad insurance decisions.
  • Humiliation and embarrassment. What if the company makes the data available for others to view or decides it is a good idea to have a contest to see, based on chipped data, which employees are doing the best nutritionally? I’ve seen many businesses throughout my career do things that, in hindsight, were ridiculous to even consider. But consider that data taken out of context could lead to incorrect assumptions regarding individuals.

These are just a few examples. Your situation will have other types of risks and harms to consider, many that will be unique and specific to your own business environment.

Bottom line …
Before any business makes any decisions where personal data is involved, it needs to ask three basic questions:

  • Will this improve business? Will micro-chipping employees really improve business? Or, do the costs and potential risks and harms outweigh the potential benefits?
  • What are the risks? How could micro-chipping employees damage business? What are the technical, physical and legal issues involved?
  • What are the harms? Could the sharing or use of data involved with micro-chipping employees potentially cause harm to the associated data subjects?

If your answers to these questions indicate that there will be greater benefits than business risks and personal privacy harms, and that those risks and harms can be acceptably mitigated, then happy chipping! Otherwise, you need to do more research and investigation, or simply conclude, “No. This is not a good action for our business to take at this time.”

I recommend every business perform a privacy impact assessment (PIA) for any type of new system considered that involves personal data. Be like Peter Parker: Before implementing a microchipping system, do a PIA to reinforce the great responsibility of even thinking about using such a powerful system.

Digital Forensics Professionals Encountering New Challenges

Bill Dean

When I began performing digital forensics more than 10 years ago, things were relatively simple. At that time, the complexity of digital forensics revolved around ensuring each artifact of relevance was identified, and the proper tools to analyze them were available to leverage against computers used by the suspect.

The computer(s) of the suspect were typically the only focus. In some instances, we also had to deal with mailbox exports of corporate users. When mobile devices came onto the scene in the 2008 timeframe, our single-device approach to investigations was disrupted significantly. What are these things? Why don’t my hard drive forensics tools work on phones? We “forensicators” had no idea what challenges we would face in the next decade.

The significant challenges facing digital forensics experts today are the vast number of devices and locations that may house valuable information. It is no longer the case that all the data needed to reach a conclusion is on a single device or in a single location. While it is now common to analyze both the computer(s) and phones used by the suspect, consideration must now be given to other mobile devices (tablets), cloud-based email, cloud-based storage, social media activity, game consoles, IoT devices and even wearables.

Forensics tools for mobile devices were historically valued based on how many phones they supported. We are now arguably down to four phone types that you are likely to encounter. Even though forensics tools have advanced considerably, the collection of mobile devices requires different approaches than computers do. In many instances, a trusted full forensic image of the evidence is not available – only the data the phone manufacturer will allow you to have. With the drastic reduction in the types of phones you will encounter, the value is now in the parsers for the applications. With millions of mobile applications available, and frequent updates, it remains a challenge for mobile forensic platforms to keep up with the rapid pace.

Over the past decade, users have traded the locally stored email of their Internet service provider (ISP) for the convenience of webmail platforms such as Gmail, Yahoo Mail or Outlook.com. When users use webmail services, it is very unlikely that their email will be stored locally; compared to years past, only fragments of the email are available in Internet cache files. Depending on the nature of the investigation, forensicators may be given the access needed to collect this information from the provider for analysis. In internal investigations involving employees, it is very unlikely that forensicators will be given this access. In addition, even if you can obtain the webmail credentials from the device analyzed, you are not permitted to log into the suspect’s personal email account. Therefore, the Internet histories and limited file fragments are all that will be available.
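As a minimal sketch of mining those Internet histories, the following assumes a Chromium-style “History” SQLite database (copied from a forensic image, never the live file) whose urls table carries url, title and visit_count columns; other browsers use different schemas, and the path is hypothetical:

```python
import sqlite3

# Sketch: surface webmail activity from a browser history database.
WEBMAIL_DOMAINS = ("mail.google.com", "mail.yahoo.com", "outlook.live.com")

def webmail_hits(history_db_path: str):
    con = sqlite3.connect(history_db_path)
    try:
        cur = con.cursor()
        for domain in WEBMAIL_DOMAINS:
            cur.execute(
                "SELECT url, title, visit_count FROM urls WHERE url LIKE ?",
                (f"%{domain}%",))
            for url, title, visits in cur.fetchall():
                yield domain, url, title, visits
    finally:
        con.close()

# Hypothetical path to a History file carved from the evidence image.
for domain, url, title, visits in webmail_hits("evidence/History"):
    print(f"{domain}: {visits} visits - {title}")
```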

This same scenario now applies to personal files, as users have migrated this information to cloud-based storage such as Box, Google Docs and Dropbox. The same difficulties that apply to webmail exist here.

There are few investigations that do not have a social media component, either directly or indirectly. While Internet histories may demonstrate the usage of these sites, the available information related to all activity and communications can be difficult to extract from the device alone. While the social media providers likely have extensive activity data available for each user, obtaining it would require subpoena power that you may or may not have.

Lastly, the IoT phenomenon is also significantly impacting the digital forensics field, providing types of information we have not had in the past. From Internet cameras to fitness wearables, anything electronic may now be a potential target for collection and analysis. However, IoT devices pose challenges similar to those of mobile devices in 2010. There are thousands of different types of devices and little to no standardization. With that diversity and chaos come challenges for collecting, parsing and analyzing this information. As mobile device forensic platforms exploded and faced challenges a decade ago, I predict the same for IoT devices going forward.

The overall goal of forensic analysts is to have confidence that every artifact has been properly identified, parsed and analyzed for an accurate conclusion. We have digital artifacts that we never dreamed of years ago. With the diversity of information and numerous locations where pertinent data may now be stored, it is a challenge to be certain you have everything you need.

I suggest that forensicators be patient, yet diligent, with the data sources available. When an artifact points to a data source that is not currently available, regroup and seek that information for additional analysis.

Editor’s note: For more insights on digital forensics, visit www.isaca.org/digitalforensics, and watch a related video at https://youtu.be/ZUqzcQc_syE.

ISACA Chapter President Finds Creative Way to Spread GDPR Awareness

Editor’s note: ISACA Belgium Chapter President Marc Vael, CISA, CISM, CGEIT, CRISC, recently took a creative approach to spread awareness about General Data Protection Regulation (GDPR), spearheading a game about the coming regulations that will affect enterprises worldwide. Competitors can win the game by answering GDPR questions correctly and with a little luck with the dice. ISACA Now recently visited with Vael about the game, which will be available on a limited basis at the ISACA chapter leadership event, this weekend in Munich, Germany, prior to EuroCACS. The following is an edited transcript.

ISACA Now: How did this GDPR game come about, and who was primarily involved with its development?
Basically, at my IT company, Smals, we were looking to bring the content of the EU GDPR to our group of IT developers, IT analysts, IT project managers and even management in a different way, avoiding PowerPoint slides, brochures and self-assessment questionnaires.

Initially, my colleague Nathalie Dewancker and I started building “the journey to become EU GDPR compliant,” but that journey was too simple, so we started adding gaming effects, and before we knew it, we had a full-blown EU GDPR game. We loved the reactions so much that we didn’t want to keep it within our company or for ourselves, so we asked ISACA Belgium for support, which the board provided by funding the professional look and feel of the EU GDPR game.

ISACA Now: ‘Game’ is probably not the first word that comes to mind when people think about GDPR. Why did you think this format would be a good fit?
True. Most of the messaging happens via PowerPoints, brochures and information on websites. Here and there we discover apps with the searchable EU GDPR text in different languages, or EU GDPR self-assessment questionnaires. We found out that, to date, we are the only ones with a proper EU GDPR game box. Gamification is a well-known concept, but it is not used enough, in our humble opinion. Moreover, we notice huge discussions between the players, and that is just what we want to achieve: not just “acquiring” knowledge, but looking at this knowledge critically.

ISACA Now: Did it really only take a few weeks to put the game together? How were you able to execute the idea so swiftly?
Yes, we built the initial journey into a full game in three weeks, with some tryouts. Then, molding it into a professional-looking game box took another three weeks, thanks to the help of the external PR agency we use here in Belgium. So, six weeks in all. And we were just in time to bring our game boxes to the main Belgian INFOSECURITY exhibition in Brussels, which drew over 3,000 attendees at the end of March this year. It was plain teamwork.

ISACA Now: What has been the preliminary response to the game’s release?
Initially, skepticism that participants would learn about “such a complex matter as EU GDPR” via a game. But then, when playing, a lot of discussion happens between the participants, and between participants and observers (since there can be a maximum of four participants, more people can join as observers of the game). It is great fun to see how much some people really want to win.

We made only 300 EU GDPR game boxes, and almost all are sold now. We initially wanted to give them away for free as marketing, but since we had only 300 game boxes, we did not want people to take them and throw them away, so we ask only 5 Euro per game box as a token of appreciation and eagerness to have the box.

When we launched the game box at INFOSECURITY BELGIUM, our stand was very popular, and people bought all 100 game boxes we brought there within two days. We were surprised.

ISACA Now: What was the most remarkable reaction you got on the game?
Actually, some players asked why we did not include more information about the EU GDPR in the game box (like a manual on EU GDPR or some form of brochure or leaflet). We did not do that on purpose, and we responded by saying to them: “If you play Monopoly, do you first have to follow a real estate course? No. If you play Stratego or Risk, do you first have to follow a military course? No.” So, if you play the EU GDPR game, we believe you do not have to follow a privacy course before playing either, since the objective is to learn about EU GDPR during the game. People truly liked our reaction very much.

ISACA Now: What are some of the biggest implications GDPR could have on organizations that are affected by it?
The need to review and update the inventory of processes and suppliers, execute the privacy risk assessments on the core processes and suppliers, execute privacy awareness amongst employees and external personnel, and test the incident escalation process (to check if they can make it within 72 hours).

ISACA Now: What are a few misconceptions that technology professionals have about GDPR?
Very good question; here are some of the misconceptions I hear frequently by IT experts:

  1. Some organisations believe they are too small for EU GDPR so they pretend not to fall under the regulation
  2. Believing EU GDPR is merely an information security issue which can be solved by encrypting all data
  3. Stating that May 2018 is still too far away to worry about such a compliance topic
  4. Believing EU GDPR is a legal topic so legal counsel will handle it
  5. Believing IT is mainly a data processor, so the responsibility for EU GDPR lies with the data controller (which is not IT)

ISACA Now: What is the best way for someone to purchase a copy of the game?
When living in Belgium (since the game is in Dutch/French combined), people can come and collect game boxes at our office (if they warn us upfront). When living outside of Belgium, we try to arrange the cheapest way to get a game box shipped (I can be reached by email at president@isaca.be). We will also bring some game boxes to the ISACA European chapter leadership meeting this weekend, since some ISACA chapter leaders have asked us to bring boxes there.

Teaching Smart Gadgets Privacy Manners

The Internet of Things (IoT) is quickly becoming a highly populated digital space. Two popular IoT items are the Amazon Echo personal helper, which answers to “Alexa” (or “Echo” or “Amazon”), and the Google Home personal helper, which responds to “OK” (or “Google”). These much-hyped smart gadgets are always listening, as are generally all similar types of smart gadgets and toys.

Listening can quickly change to recording and storing the associated files in the vendors’ clouds because of how these devices are engineered. Let’s consider the privacy implications of how those recordings are made, where they are stored, how they are used, and who has access to them.

Amazon and Google both claim that their smart personal assistant devices do not keep any of the audio they listen to before the keywords that trigger recording. However, here are just a few important privacy-impacting facts:

  • Amazon keeps approximately 60 seconds of recorded audio from before the wake-up request on the local device, and a “fraction” of that is sent to the cloud.
  • All the sounds in the vicinity are also part of the recordings, along with a large amount of metadata, such as location, time and so on.
  • The recordings will be kept indefinitely unless consumers take it upon themselves to request that the recordings be deleted.
  • Data, possibly including recordings (this topic is not directly addressed by Amazon or Google), may be shared with a wide range of third parties, and both vendors state they have “no responsibility or liability” for how that data is used by those third parties.

There are other privacy issues, of course. But, for now, let’s focus on these, which are significant on their own.

Privacy protections currently require manual intervention
While the Amazon and Google privacy policies each boast of privacy protections, those policies fall short of fully explaining the protections specific to Alexa and Home. For the most part, consumers must take action to protect their own privacy, particularly for the issues listed previously. For example, users must, at a minimum, take the following six actions to establish a minimum level of privacy protection:

  1. Physically turn off the devices to keep them from recording everything in the vicinity. The devices do not turn off by themselves. These devices have been known to respond to words other than the keywords, and even order items as a result. By keeping the devices on all the time, you risk having private conversations recorded and accessed by whoever has access to the vendors’ clouds. Users should keep smart devices turned off when they have guests over and when they simply do not plan to use these devices.
  2. Set a password and change default passwords and wake words. Choose ones that are different from your other passwords, that are long and complex, and that are not composed of words found in any type of dictionary or are commonly spoken.
  3. Opt out of data-sharing. Generally, for most businesses in the US, if you don’t opt out of data-sharing, you implicitly allow the manufacturer to give, or even sell, your data to unlimited numbers of third parties, e.g., marketers, researchers and other businesses. You will then have no control over, or insight into, how the data about YOU is used and shared by THEM.
  4. Use encryption. Turn on encryption for data transmissions and data in storage. Most are off by default. Amazon and Google generally state they encrypt all data in transit and in the cloud for all their services and products. However, disappointingly, neither gives an option to encrypt the in-home device’s data storage.
  5. Read the privacy policy. If any IoT device vendor does not have a privacy policy, then don’t buy from them! This is an indication of either a bogus site, or of a site that does not build security or privacy into their products.
  6. Delete your data from the cloud. Don’t forget that all the audio recorded, and the associated metadata, will be kept within the Amazon and Google cloud systems forever – unless you take the initiative to delete it. And since that data is accessed by a wide range of unknown third parties, you don’t want it used to violate your privacy or result in privacy harms.

Effective privacy protections must be built in and automatic
These manual actions need to be taken to protect privacy with current versions of smart personal gadgets in the short term. However, the time is long overdue for privacy protections and security controls to be engineered into every type of smart device available to consumers. The amount of data collected, and the potential privacy harms that could occur with that data, are too great to allow IoT vendors to take a few incomplete actions that only start, and do not complete, the implementation of all the privacy protections necessary to protect the privacy and security of those using the devices.

For example, to address the issues discussed here, Google and Amazon could have engineered the devices so that (a configuration sketch follows this list):

  1. Device settings could be set by consumers to automatically turn the devices off without physically doing so.
  2. Authentication was required and had to be strong.
  3. Data would not be shared with third parties without explicit permission as a device setting from the associated consumers.
  4. Data in storage on the device was automatically and strongly encrypted.
  5. Privacy notices could be accessed (possibly via audio) through the device.
  6. Consumers could have settings for automatic deletion from the cloud.
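As a configuration sketch of what items 1-6 could look like if privacy-protective defaults were engineered in, consider the following; every field name is invented, and no real Amazon or Google API is implied:

```python
from dataclasses import dataclass
from datetime import time

# Illustrative privacy-by-default settings object for items 1-6 above.
@dataclass
class SmartSpeakerPrivacySettings:
    scheduled_off_hours: tuple = (time(22, 0), time(7, 0))  # 1: auto-off window
    require_strong_auth: bool = True                        # 2: authentication required
    share_with_third_parties: bool = False                  # 3: explicit opt-in only
    encrypt_local_storage: bool = True                      # 4: on-device encryption
    privacy_notice_via_audio: bool = True                   # 5: notice accessible on device
    cloud_retention_days: int = 30                          # 6: auto-delete after N days

settings = SmartSpeakerPrivacySettings()
assert not settings.share_with_third_parties  # protective by default
```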

Over the past couple of years, I’ve chatted with my friends at CW Iowa Live about the privacy issues involved with these IoT devices. For more information on this topic beyond this blog post, you can listen to them here and here.

Utilize ISACA Privacy Principles to build privacy into processes
So how should engineers approach building privacy controls into IoT devices? Use the new ISACA privacy resources! I am grateful and proud to have been part of the two ISACA International Privacy Task Force groups, both led by Yves Le Roux, since 2013, and to have been the lead developer of the newly released ISACA Privacy Principles and Program Management Guide (PP&PMG), which incorporates the recommendations and input of the International Task Force members, as well as of a complementary privacy guide targeted for publication in mid-2017.

The ISACA PP&PMG outlines the core privacy principles that organizations, as well as individuals, can use to help ensure privacy protections. These privacy principles can be used by engineers to build the important privacy and security controls into IoT devices right from the beginning of the initial design phase, and applied all the way through the entire product development and release lifecycle. Aligned and compatible with international privacy models and regulatory frameworks, the ISACA Privacy Principles can be used on their own or in tandem with the COBIT 5 framework.

The second ISACA privacy guide that will be released this year will include many examples throughout the entire data lifecycle and a detailed mapping of where to incorporate privacy controls within the COBIT 5 control framework component.

Editor’s note: Saturday is Data Privacy Day, and ISACA is an International Data Privacy Day champion.
