Five Considerations for Data Breach and Incident Reporting in the EU

Anna Vladimirova Kryukova
The increasing number of cybersecurity incidents is causing serious negative impacts on enterprises, prompting legislators around the world to explore new policies and regulations. Certainly, the GDPR was one of the most popular topics of the last year (a report of the European Commission shows that in May 2018 Google searches for the GDPR were more popular than those related to Beyoncé and Kim Kardashian). Having finalized the initial GDPR implementation stage, companies have been proceeding to deal with the practical challenges related to the new requirements. One of them is reporting personal data breaches to a supervisory authority and notifying data subjects.

However, the GDPR is not the only binding act setting forth the obligation to notify certain parties about breaches and incidents. Some countries followed the privacy protection “wave” and introduced their own data protection acts requiring similar breach notifications. There are also other acts which do not focus solely on personal data matters but also cover notification procedures for breaches and incidents (for example, the NIS Directive, PSD2, and the ePrivacy Directive, as well as country-level acts and guidelines implementing the directives). The wide array of applicable rules (which is especially important for international businesses) can cause organizational problems and misunderstandings about the actions to be taken in the event of a suspected incident. Further, the terminology used in different situations varies. Some acts refer to breaches, some to incidents, and in each particular case, the meaning of the term should be assessed within the context of the corresponding act.

To understand which steps should be taken to ensure proper incident or breach reporting in the EU, it is recommended to consider the following aspects and summarize them for further use:

1. Requirements applicable to the company. Companies may be subject to certain legal obligations depending on different factors. For example, applicability may vary depending on the territory where the company is incorporated or carries out its business activities, the character of the services or goods it provides, and the clients or partners it impacts. The GDPR, for instance, also applies to non-EU companies offering goods or services to data subjects located in the EU, while the NIS Directive applies to network and information systems within the EU. As EU directives are usually implemented at the country level, companies should check their obligations against their country’s legislation. Additionally, it is recommended not to forget about acts such as criminal or administrative laws. In some countries, such documents also cover certain types of incidents that might impose reporting obligations.

2. Classification of the event. When it is clear which acts are binding on the company, it is necessary to understand which cases “trigger” the obligation to report the incident – namely, which types of information, systems, and people are impacted, on what scale, which level of risk the event falls under, and whether this requires disclosure. For example, personal data or financial information systems operated by digital service providers or critical infrastructure might be impacted, but this does not necessarily require reporting in all cases.

3. Reaction time. The next step is to address the deadline for reporting different types of breaches or incidents. The statutory requirements for the deadlines might vary from several hours to several days or months, depending on the type of event.

4. Reporting. The scope of notification obligation might also be different. Some acts require reporting to authorities, such as personal data protection supervisory authorities, authorities similar to CERT (computer emergency response team), financial and telecommunication regulators, or police. Additionally, the company might be subject to the obligation to notify other impacted parties (clients, employees, cooperation partners).

5. Contents. The final step is to identify the information that will be reported based on the applicable requirements. It is also possible to use special reporting forms or an official template (if available). However, this does not mean that the company cannot collect additional information for internal incident response purposes.

A summary of the above-mentioned information should be communicated in a way that is understandable to the people responsible for incident reporting in the company. However, the aforementioned activities are only the beginning, and the next task is to ensure that the reporting process is organized correctly and carried out in an appropriate fashion.
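
Such a summary can also be kept in a lightweight, structured form so that the people handling an incident can quickly look up what to report, to whom, and by when. Below is a minimal, illustrative Python sketch of such a registry; the ReportingRequirement fields and the single GDPR entry are assumptions chosen to mirror the five considerations above, not an authoritative mapping of any jurisdiction's requirements.

from dataclasses import dataclass, field
from typing import List

@dataclass
class ReportingRequirement:
    """One row of the incident/breach reporting summary described above."""
    regulation: str          # binding act (consideration 1)
    trigger: str             # event classification that triggers reporting (consideration 2)
    deadline: str            # statutory reaction time (consideration 3)
    notify: List[str]        # authorities and other impacted parties (consideration 4)
    contents: List[str] = field(default_factory=list)  # information to report (consideration 5)

# Hypothetical example entry; actual values must come from the legislation
# applicable to the specific company and country.
summary = [
    ReportingRequirement(
        regulation="GDPR (Articles 33 and 34)",
        trigger="Personal data breach likely to result in a risk to data subjects",
        deadline="72 hours to the supervisory authority",
        notify=["Supervisory authority", "Data subjects (if the risk is high)"],
        contents=["Nature of the breach", "Likely consequences", "Measures taken or proposed"],
    ),
]

for req in summary:
    print(f"{req.regulation}: report within {req.deadline} to {', '.join(req.notify)}")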

 

GDPR Audits for SMEs Are All About the Language

Steven Connors
It is often said that a good auditor is a good communicator, and this is particularly true when dealing with smaller organizations.

Small and medium-sized enterprises (SMEs) tend not to have the capacity to employ specialists in every role, instead relying upon generalists who fulfil many roles in the organization.

Unless the SME’s business is data processing or falls into one of the other categories that require a data protection officer (DPO), then the chances are that as auditors we will be speaking to the finance head or IT manager or HR manager about data protection.

ISACA’s new GDPR Audit Program for Small and Medium Enterprises is written not with the professional IT auditor in mind, but the auditee. Consequently, its language is simplified from that of the enterprise version.

One of the biggest issues I have found when dealing with SMEs is ensuring my conversations and questions are designed to fit the audience and are jargon-free. Only by adjusting the narrative to fit the audience can we hope to deliver an audit product that adds value. This is particularly important with GDPR in the SME space. Indeed, many SMEs still have not fully embraced the central theme of the GDPR – it’s all about the data subject, not the organization.

When auditing SMEs, it’s as much about education as compliance. GDPR is about how following some basic rules of good data governance, such as ensuring data quality, can add value, not just cost, to an SME. As auditors, we can help owners and managers embrace the concept that we are adding value above and beyond what is derived from a compliance report.

It is also important to be aware that many SMEs will not have received the best advice leading up to GDPR. Many will have scoured the internet, talked with fellow business owners or at best attended a seminar or two – or, worse, been drawn into spending money on software solutions that are generic and not a good fit for their businesses.

In the hands of an experienced auditor, the audit program should be used as much to help devise a remediation plan as to arrive at an audit opinion. After all, the audit is designed to validate controls implemented to manage risk and to agree on a risk treatment plan.

A survey by Q2Q in November 2018 found that 41 percent of SMEs are still unsure about the rules and regulations surrounding GDPR. This, combined with 22 percent saying that emerging online risks are their biggest headache, presents an opportunity for the auditor to use the program to offer genuine guidance to their SME clients.

One of the major issues that organizations and their auditors had with the previous Data Protection Act was that it was primarily viewed as an IT problem to be solved with technology. Complying with GDPR is about managing information risk and needs to consider a trio of risks: people, processes and technology. These risks must be considered across all facets of an organization.

Paying for Apps with Your Privacy

Rebecca Herold
Don’t look at your device when I ask you this question: How many apps do you have on your smartphone? Or, if you use your tablet more often, how many apps do you have on your tablet? Remember this number or write it down.

OK, now look at your device. How many apps do you actually have installed? Is that number higher than what you wrote down previously?

For most people, it would be. In many of my keynotes, and in most of my client key stakeholder meetings, I ask this question. I’ve seen around 90-95 percent of people severely underestimate the number of apps they have on their devices. For example, I’ve had people tell me they had maybe 15 or 20 apps installed, and after they checked, they found they actually had well over 100. But they were only using around 15 of them.

Keep this in mind: just because you are not actively using apps does not mean that those apps are not actively harvesting data from you.

Most people download apps willy-nilly. The mentality is often, if it is free, then, hey…let’s get it and see what it does! Oftentimes those never-used-but-still-installed apps are silently and often continuously taking data from the device and sending it to the app vendor, which then shares the data with any number of third, fourth, and further parties. Who are those third parties and beyond? What are they doing with your app data? How could those actions negatively impact the people associated with the data?

Throughout my career, when doing my hundreds of assessments and risk analyses, I’ve often heard the following from those reading the reports: “Have these possibilities you’ve outlined actually happened? Has such misuse of data actually happened? Why is sharing data from devices a problem?” The overwhelming opinion was, "If nothing bad has happened yet, or we haven’t heard about bad things happening, then why worry? Probably nothing bad will happen." This often-stated denial of risks, and the lack of accountability that such opinions try to establish, are factors motivating app vendors and tech companies to share as much app data as possible, monetizing it along the way, and leading to a wide range of emerging invasions of privacy that don’t fall neatly under the definition of “privacy breaches,” even though those involved certainly feel creeped out and victimized, often in multiple ways.

Recent reports, including an intriguing one from the Wall Street Journal, are shining light on how so many app vendors are sharing data with Facebook, one of many social media and tech giants that are involved. For example, the report noted, “Instant Heart Rate: HR Monitor, the most popular heart-rate app on Apple’s iOS, made by California-based Azumio Inc., sent a user’s heart rate to Facebook immediately after it was recorded.” Do you think the app users knew this would happen? To what other businesses was their data sent? What about all the other apps being used? How many other organizations are they sending data to, unbeknownst to the app users?

The types of data from apps that are being shared, and the insights they can give into people’s lives, are alarming, and go far beyond heart rate data. Apple and Alphabet Inc. (Google’s parent company) reportedly don’t require apps to disclose to the app users all the third parties that receive their personal data. So, in the HR Monitor example, the app users were likely not told that Facebook was going to get their data immediately as the data was collected. How many other third parties, and which ones, also got their data?

There are some huge problems that app creators and tech companies are generally not addressing in any meaningful or long-term way. Here are a few of them:

  • They do not clearly describe all the data they are collecting, deriving, sharing, processing and storing that can be linked to specific individuals. In other words, they are not defining the personal data involved with the apps.
  • They do not specify the types of other data being associated with personal data, a combination that can result in very sensitive data.
  • They do not list the third parties with whom they are sharing that data, nor how the app users can determine how those third parties are using their data.

App creators and distributors need to do a better job at communicating the answers to these important questions to all those using their apps. But app users also need to be more proactive. They need to be more vigilant with how they download, use, and remove apps from their devices. I provided advice to app users about this in a couple of recent news stories – you can check them out at USA Today and Nerdwallet.

Moving Beyond Stubborn Reluctance to Comply with GDPR

Laszlo Dellei
Last May marked the beginning of the application of the General Data Protection Regulation (GDPR), which harmonized and unified the rules governing privacy in the European Union. Leading up to and following the adoption of the regulation, data protection has been in the focus of attention all around the world. Governments introduced new legislation, while supervisory authorities, civil society, data controllers and processors publicly discussed the rules, obligations and institutions set out in the GDPR, and campaigns have been launched to raise privacy awareness among data subjects and the public.

Despite all this, at the six-month mark after the compliance deadline took effect, only 30 percent of companies located in the EU could be considered GDPR compliant, a recent study showed.

Perhaps we should not be entirely surprised by that underwhelming statistic. GDPR compliance can be time-consuming and resource-intensive. It necessitates a strategic approach and a permanent focus on all activities related to data processing. Unfortunately, these characteristics might result in certain hazardous attitudes on the side of controllers and processors. Many of these actors are aware of the new rules introduced by the GDPR, yet they choose to ignore the relevant obligations, hoping to avoid inspections and further consequences. Others are reluctant to comply with the regulation and may consider other responsibilities as trumping privacy, for instance, assigning economic benefits more weight than protection of personal data. Finally, it is a common misconception that if the controller publishes its privacy notice or policy, its activities would be in line with all obligations deriving from GDPR.

Nonetheless, data subjects are becoming more conscious about their privacy and demand effective control over their personal data. Besides the heightened interest in the activities of controllers and processors, monitoring and enforcement mechanisms set out in the GDPR are operated by supervisory authorities around the European Union. Fines have been issued for non-compliance and, as a further consequence, the publicity of unlawful conduct further damages the reputation of controllers. Thus, the previously mentioned attitudes are harmful to the rights and freedoms of individuals; they violate provisions protecting the privacy of data subjects, and may also lead to significant loss of income on the side of the controllers and processors.

Instead of demonstrating the previously mentioned attitudes, controllers and processors should realize that certain easy steps can promote GDPR readiness. First, they need to be self-aware concerning activities connected to the use of personal data. An updated record of processing activities and the designation of a data protection officer may be of great help in this respect. Application of data governance tools can also assist in setting the relevant internal policies. Furthermore, it is necessary to document every aspect of these activities, thus demonstrating compliance with the principle of accountability. Finally, controllers and processors should make their operations transparent to supervisory authorities and data subjects as well as to the general public, via data protection notices and other methods of providing information.

These are certainly not the full set of conditions for GDPR compliance, but they help controllers and processors achieve it, and constitute valuable proof that an organization is willing to abide by the rules of the regulation and respect the privacy of data subjects.

Incorporating Privacy into Data Protection Strategy

Mais Barouqa
Nowadays, the term privacy echoes across boardrooms globally, where each country and enterprise races to update its laws and policies to keep up with the need for data privacy controls. This massive wave of interest is largely driven by the introduction of emerging technologies such as robotic process automation, the Internet of Things (IoT) and artificial intelligence (AI), which are increasing the number of sources of personal data available to enterprises. This, in turn, is increasing data protection risk to enterprises.

A recent ISACA whitepaper, Enforcing Data Privacy in the New Digital World, highlights the fact that although many enterprises are focused on data privacy compliance, data breaches can also cause irreparable monetary and reputational damage. This is supported by a 2018 IBM study that reports the average cost of a data breach to be $3.86 million.

In addition, if we examine the global risk landscape recently assessed by the World Economic Forum, massive data fraud/theft comes in fourth place, followed by large-scale cyberattacks. These reports confirm that data privacy is now a significant risk that should be tackled immediately by enterprises, since the benefits of implementing controls to address data privacy outweigh the costs.

After laying down the numbers and facts, in order to implement data privacy controls, enterprises should start from the top – by incorporating data privacy into the enterprise’s data protection strategy. This will set the direction in which the enterprise will move forward concerning the data privacy initiative. At this phase, careful consideration must be given to harmonizing the data privacy strategy with the corporate strategy. In the end, data flows throughout the organization and, contrary to common assumptions, is not limited to IT departments.

Once the data privacy strategy is defined, enterprises can move forward with translating it into their governance activities. Enterprises should begin with an examination of their current organizational structure. Data privacy acts and laws, such as GDPR, have introduced new roles to be implemented within enterprises to ensure compliance and proper implementation of data privacy. Some enterprises fall short of properly defining the responsibilities needed to implement the data privacy strategy, and such new roles may end up siloed, without proper reporting lines and involvement in the enterprise. Enterprises should also revisit or prepare policies and procedures with a particular focus on data privacy. These guidelines must be formally written and enforced in the enterprise. Examples of such policies include guidelines on data retention, information security, monitoring and reporting procedures, and data disposal.
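
To illustrate how such policy guidelines can be made operational, the sketch below expresses a retention schedule as a simple lookup that tooling could check records against. The data categories and retention periods are purely hypothetical assumptions for illustration; real values must come from the enterprise's own policies and applicable law.

from datetime import date, timedelta
from typing import Optional

# Hypothetical retention schedule for illustration only.
RETENTION_PERIODS = {
    "customer_account_data": timedelta(days=6 * 365),
    "marketing_consent_records": timedelta(days=2 * 365),
    "access_logs": timedelta(days=90),
}

def is_due_for_disposal(category: str, collected_on: date, today: Optional[date] = None) -> bool:
    """Return True when a record in the given category has exceeded its retention period."""
    today = today or date.today()
    return today - collected_on > RETENTION_PERIODS[category]

# Example: access logs collected on 1 January are past their 90-day period by 1 June.
print(is_due_for_disposal("access_logs", date(2019, 1, 1), today=date(2019, 6, 1)))  # True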

As enterprises move forward with the data privacy project, they will begin to understand the types of data currently processed in their environment. This allows enterprises to also determine the challenges they need to overcome to be fully capable of applying data privacy controls. Following this, enterprises can work on establishing controls such as implementing tools to ensure data privacy within the IT environment, developing a privacy culture within all departments and ensuring periodic training and awareness sessions on data privacy.

An important point here relates to third-party involvement in data privacy. Typically, enterprises outsource certain functions within the IT department to third-party vendors in order to provide the needed skills and support to customers. Nevertheless, outsourcing does not remove the responsibility of the enterprise to ensure their vendors comply with data privacy policies and laws. Enterprises should revisit their third-party vendor contracts and service level agreements to ensure that data privacy compliance provisions are included.

In light of the growing importance of data privacy, enterprises that incorporate privacy compliance within their corporate strategy, role definition, policies and procedures, controls, and third-party management practices will be best positioned to reduce regulatory non-compliance penalties and reputational risk.

Google’s GDPR Fine Reinforces Need for Intentional Data Governance

Andrew Neal
For those of us who work in information security, data privacy and governance, we seem to traverse daily from one headline to another. A new corporate victim announces they were breached to the tune of 100 million records. A regulatory body announces a financial and oversight settlement with a company for failure to adequately protect data. On and on we go.

Because of this constant onslaught, nobody was terribly surprised to hear about the €50 million fine leveled against Google by French data privacy regulators for violations of GDPR. We all knew a big enforcement was coming, and that the early, large fines would be against a social media or tech giant.  Check and check. But what does this mean to organizations on a broader scale?

As I draft this post on Data Privacy Day, trying to find the larger meaning in this first-of-many large fines, I am faced with many possibilities. Could the message be about regulatory muscle-flexing, or is it about corporate arrogance and gamesmanship? Is this a legitimate assertion of individual rights against a corporate giant, or is it an attack against a successful tech company and its profit model? In GDPR, are we looking at the shape of tomorrow’s global data environment, or are we seeing a regulatory trend that risks stifling innovation and “free” service delivery? Of course, the answer is all of the above.

The regulatory authorities across the EU who are charged with enforcing GDPR must, at some point, exercise their authority. No regulation can be effective until it is applied, tested and, ultimately, proven or defeated in practice. At the same time, some organizations may look at the details of the regulation and make a risk-based assessment that they have done enough to comply with their interpretation of the regulation, reasoning “We have taken some [less-than-perfect] actions, let’s see what happens.” The rights to one’s personal data are becoming more widely accepted as a given, but many consumers still are willing to casually or selectively trade some of those rights for convenience or services. With data privacy and security laws and regulators proliferating and evolving, data-centric business activities and profit models must be more carefully engineered and scrutinized. All of the above.

This recent and highly publicized enforcement activity is likely to spur additional compliance efforts from many organizations. Few can absorb a fine with that many zeros in it. On a strategic level, however, it may well contribute to the gradual paradigm shift away from the whack-a-mole approach to security and privacy regulations, and toward a philosophy of intentional data governance and strategy.

There are many financial and organizational benefits to proper data governance, including lower infrastructure costs, better litigation readiness, smaller cyberattack footprint, and better visibility for regulatory compliance. But sometimes it takes a negative result occurring to somebody else to make us ask the right questions and do the right things. Time will tell if a hefty fine is enough to move the behavioral needle for Google, or for the rest of us.

Editor’s note: For more on this topic, read “Maintaining Data Protection and Privacy Beyond GDPR Implementation.”

Offshoring: Getting it Right Through a Security and Privacy Lens

Vinay Narang
The offshoring industry is at a turning point. There is a growing demand to further saturate offshoring hubs with a view to increasing profits. The true value of offshoring can be realized when it is viewed as a relationship among parties rather than a mere delivery model.

Success of this relationship can be seen when:

  • The offshoring units meet contractual metrics and produce deliverables of industry quality;
  • Onshore units are successful in cutting costs and drawing profits, and are able to focus on critical tasks toward business expansion;
  • People involved in the offshore and onshore units are satisfied, competent, and have synergy;
  • Industry standards are maintained with due care to information security and privacy requirements.

However, in the real world, it seems companies struggle to manage this relationship, with security and privacy considerations becoming all the more challenging to manage.

So, the question is, offshoring: how to get it right? Or do we plan to offshore this task as well?

Below are key considerations that, when consciously applied by the onshore and the offshore teams, will help companies achieve talent utilization, value creation and profit realization.

Key considerations for the ONSHORE team

1. Change in mindset
Operational lens:
The current mindset among onshore professionals, in which offshore teams are simply flooded with work requests, needs to change. Onshore professionals need to update and mature their mindset in the pursuit of low costs and high quality. The offshore team must be viewed as an extension of the team, and team members should be encouraged to ask questions and build their expertise. The vision of the firm and the engagement should unite the teams with a shared purpose when geographic distance separates them.
Security/privacy lens:
Change is the only constant in technology. Based on changing laws and regulations, the onshore team must be aware of the information that is being handled at onshore locations. Under Chapter 5 of the General Data Protection Regulation (GDPR), which governs transfers of personal data to third countries or international organizations, specific conditions must be satisfied when processing or intending to process personal data abroad. As such, given the global impact, it is vital for onshore professionals to update their mindset from a security/privacy lens and carefully scan the information that can or cannot be offshored.

2. Collaborate and share knowledge with offshore teams
Operational lens:
Onshore professionals should be encouraged to share knowledge with offshore teams to help them understand the objectives of the deliverables. Having structured periodic calls/updates helps achieve efficiency on both sides of the table. Training the onshore team on how to collaborate efficiently with offshore professionals, understanding the culture of communication and work management at the offshore site, and holding periodic checkpoints on technical learnings will meet these goals.
Security/privacy lens:
A strong relationship requires both parties to complement each other. To that end, it is important to train offshore teams on the technical aspects of security and privacy considerations. Training can be based on a framework (like NIST or ISO) or focused on areas such as access control, information risk assessments, network security, and system development. Collaborating and sharing such knowledge will keep the offshore teams informed, enabling them to make sound decisions.

3. Invest in the right technology
Operational lens:
Large firms that embrace offshoring usually have a file-sharing/instruction-sharing mechanism connecting the onshore and offshore teams. Over time, however, the tool or mechanism in use often proves ineffective in terms of time, usage, and perhaps intent. Long emails and Excel trackers should be a thing of the past; firms must invest smartly in research and development of proprietary tools and automation techniques.
Security/privacy lens:
From a security/privacy lens, companies need to consider:
1. Technology being used to share the data
2. Actual content or data being shared
Automation brings its own risks, especially related to data security and access security. Wise implementation of automation, backed by constant monitoring of security measures, helps mitigate risks. When actual content or data is being shared, special care needs to be taken when dealing with personal data.
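
As one concrete illustration of that care, direct identifiers can be pseudonymized before content is shared with the offshore team. The sketch below is a minimal example under assumed field names; a real implementation would follow the enterprise's own data classification, key management and legal review, and pseudonymized data can still be personal data under the GDPR.

import hashlib
from typing import Dict

# Assumed set of direct identifiers; a real list comes from the firm's data classification.
DIRECT_IDENTIFIERS = {"name", "email", "phone"}

def pseudonymize(record: Dict[str, str], salt: str) -> Dict[str, str]:
    """Replace direct identifiers with salted hashes before sharing a record offshore."""
    masked = {}
    for key, value in record.items():
        if key in DIRECT_IDENTIFIERS:
            masked[key] = hashlib.sha256((salt + value).encode("utf-8")).hexdigest()[:12]
        else:
            masked[key] = value
    return masked

# Example usage with a hypothetical ticket record.
print(pseudonymize({"name": "Jane Doe", "email": "jane@example.com", "ticket": "INC-042"}, salt="s3cr3t"))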

Key considerations for the OFFSHORE team

1. Build the right team
Operational lens:
With lower costs at offshoring locations, the easy option would be to hire as many professionals as possible and then distribute work among them. However, building the right team, with the required skillsets, educational background, and professional interests aligned with the services provided by the firm, is critical. The hiring process at offshore locations should be based on standards that align with the quality represented by the firm.
Security/privacy lens:
The issue of data sent offshore and the risk to its privacy has shown that current laws (HIPAA, GLBA) do not adequately cover or protect US customers when information is sent abroad for processing. Offshore teams must have subject matter experts who engage in opportunities focused on regulations and are able to drive teams with their experience. Offshore teams execute best when they are led and trained by experienced leaders within the group. Industry certifications and periodic internal workshops on information security and risk management go a long way in building the right team.

2. Invest in quality and project management:
Operational lens:
With contractual metrics established between onshore and offshore teams, the need to rush and hand back deliverables to the onshore teams highlights a gap in the quality and project management practices. Offshore teams must check their deliverables for quality, voice opinions if they differ from those of the onshore teams, suggest innovative ways of accomplishing tasks and streamline quality processes. Offshore leadership must work with their teams to check if there are any gaps with respect to project management techniques, which affect resources or onshore stakeholders.
Security/privacy lens:
Low cost and high quality are traditional labels that sell offshoring. It is an investment of patience and continuous good practices to achieve high quality with offshoring teams. Techniques such as Six Sigma have been instrumental in streamlining quality requirements, and some companies have aligned Six Sigma to their security framework to derive security-driven return on investments. Offshoring teams should define, evaluate, and monitor their quality metrics, and present how they add value to onshore teams and customers.

GDPR Progress Paves Way for Deeper Look at Role of Data in 2019

Andrew Neal
The European Union’s General Data Protection Regulation (GDPR) commanded the attention of the business community throughout 2018. Thought leadership gatherings such as ISACA conferences and webinars attempted to answer questions like, “What does it take to comply?” and “What will enforcement look like?”

Answers were largely speculative, and the actual enforcement processes associated with the regulation are only now taking shape. We can, however, look back at 2018 and make some observations about what has been accomplished, the drivers of compliance activities, and the work left to be done.

At six months past the implementation deadline, many organizations have harvested the low-hanging GDPR fruit. Privacy policies have been updated, cookie notices added to websites, and mechanisms have been deployed to support opt-in, opt-out, and data subject requests. Those using third parties to process data, or those who are the third party, have defined commitments and expectations regarding personal information. Training programs have been rolled out to educate about GDPR-related issues. Accomplishing these items has allowed organizations to mark a significant part of their GDPR checklist as complete and have a reasonable story to tell in case of an incident.

The desire to comply with GDPR and avoid any potential fines motivated much of this activity. Since GDPR, the regulatory landscape has continued to change and evolve. A proliferation of privacy and data breach regulations (such as the California Consumer Privacy Act, Brazil’s new data privacy regulation, etc.) has refocused the discussion from a single regulation to an overall issue of data privacy and business process. As recently explained by a business executive, “There is no way we can fund a new project to comply with each privacy and security regulation that comes along, so we must address these issues at a higher, more efficient level.” These conversations about compliance costs and efficiencies are driving the next wave of privacy-related projects.

Having addressed the basics, many of our clients now seek to reduce costs and lower their overall compliance risks. This often involves a deeper look at the role of data within business processes. Good information governance requires such things as accurate data and process maps, defined data lifecycles, security protections for data, and incident response plans. The ever-increasing risks related to compliance in a complex regulatory environment, and the standard benefits of good data governance, are causing many organizations to revisit some of these governance program elements. While 2018 saw a heavy focus on GDPR, 2019 may be a year of transformational governance projects as companies seek to reduce costs and compliance complexity by more precisely directing their use, management and protection of data.

The impact of GDPR has been significant, with more official guidance and enforcement decisions on the horizon. But the bigger story may be the pressures exerted on business processes by the combination of multiple data privacy and breach regulations, changing consumer expectations, and related B-to-B obligations. The next year may demonstrate how organizations are choosing to comply with GDPR while addressing these additional pressures.

Key Considerations for Assessing GDPR Compliance

Mohammed J. Khan
The European Union General Data Protection Regulation (GDPR), which took full effect in May this year, solidifies the protection of data subjects’ “personal data,” harmonizes the data privacy laws across Europe and protects and empowers EU citizens’ data privacy, in addition to changing the way data is managed and handled by organizations.

The GDPR affects people across the globe. The scope of GDPR is quite wide-ranging and can apply to many global institutions with operations in Europe. Certainly, GDPR has created more power for data regulators, due to the severe potential financial penalties for non-compliance (a maximum of 4 percent of annual global turnover or €20 million, whichever is higher).

A few of the key things to know about GDPR are:

  • The regulation governs how institutions collect, record, use, disclose, store, alter, disseminate, and process the personal data of individuals in the EU.
  • If a breach involves personal data, the Data Protection Authorities must be notified within 72 hours.
  • It governs the rights of data subjects, including rights to access, rectification, erasure, restriction of processing, data portability, and rights in relation to automated decision-making and profiling.

How do I assess my GDPR compliance?
All these are essential reasons for institutions to ensure that the proper governance and tactical steps are taken to comply with the GDPR. The GDPR Audit Program Bundle developed by ISACA does just this by providing institutions with a guide for assessing, validating and reinforcing the GDPR requirements by which they must abide. The audit program was developed to provide enterprises with a baseline focusing on several key areas and their respective sub-processes, covering all key components of GDPR, including:

  • Data governance
  • Acquiring, identifying and classifying personal data
  • Managing personal data risk
  • Managing personal data security
  • Managing the personal data supply chain
  • Managing incidents and breaches, and creating and maintaining awareness
  • Properly organizing a data privacy function within your institution

Also included are key testing steps involving control category types and frequency, to help facilitate effective discussion and analysis as they fit your institution. The important thing to remember is that there is no single right way to go about becoming GDPR-compliant. However, a robust and thorough review of your GDPR environment as it pertains to data processing for your institution is required to ensure a proper baseline is used to assess compliance and successfully execute a GDPR compliance program.

Editor’s note: ISACA has addressed both general and particular audit perspectives for GDPR through its new GDPR Audit Program Bundle. Download the audit program bundle here. Access a complimentary white paper, “How To Audit GDPR,” here.

The Beginnings of a New Privacy Framework Through NIST

NIST conducted a workshop on 16 October in Austin, Texas, USA, to discuss plans for a voluntary privacy framework, and attendees had the opportunity to have a robust discussion about what such a framework should entail. The workshop was attended by individuals from industry, academia, and government.

According to NIST, a framework is needed because we live in an “increasingly connected and complex environment with cutting-edge technologies such as the Internet of Things and artificial intelligence raising further concerns about an individual’s privacy. A framework that could be used across industries would be valuable in helping organizations identify and manage their privacy risks.” It would also assist an organization in preparing and maintaining a comprehensive privacy plan.

“I think being able to have guidance at a federal level that takes into consideration key other privacy legislation and regulations as well as standards will be important,” said Paula deWitte, computer scientist, author, and privacy attorney. “The comment at the workshop about relentless interoperability of standards and the framework will be key to its usability.”

NIST discussed how the process for creating the privacy framework was largely aligned with how its Cybersecurity Framework was created, with collaboration from the public, and iteratively. NIST envisions the privacy framework as being “developed through an open, transparent process, using common and accessible language, being adaptable to many different organizations, technologies, lifecycle phases, sectors and uses and to serve as a living document.”

“The Cybersecurity Framework is more about critical infrastructure. Privacy is a different beast, and frankly, a bigger lift. We don’t even have a clear definition for privacy. On top of that, privacy is multi-dimensional. One must look at privacy from its impact on the individual, groups, and society,” said deWitte.

“The major elephant in the room identified at the hearing is that we don’t have a grip on what data needs to be protected and where the company’s data is. By that I mean, we don’t fully understand what data must be kept private and we must consider that organizations must be in complete control of data throughout its entire lifecycle including from procuring it, to storing it, to sharing it (as appropriate) to disposing of it,” said Harvey Nusz, Manager, GDPR, and ISACA Houston Chapter President.

With more work to do on the general strategic front, the group determined the overall approach for the framework would be enterprise risk management, a focus both Nusz and deWitte applaud, while offering words of caution.

“I agree that we need to fit the framework into an enterprise risk management approach, but how do we actually define and conduct risk management? Risk management encompasses all types of enterprise risk, so there is the issue of how one defines risk. Is anyone using a good methodology for risk management we can all get behind?” said deWitte.

“Every organization should at a minimum create a risk register,” said Nusz. “That needs to be part of privacy planning.”
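
As a rough illustration of what a minimal privacy risk register could look like, the sketch below scores risks by likelihood and impact so they can be prioritized. The fields, scales and example entries are assumptions for illustration only; they are not part of the NIST discussion itself.

from dataclasses import dataclass

@dataclass
class PrivacyRisk:
    """One entry in a simple privacy risk register."""
    description: str
    likelihood: int   # 1 (rare) to 5 (almost certain)
    impact: int       # 1 (negligible) to 5 (severe)
    owner: str
    treatment: str    # e.g., accept, mitigate, transfer, avoid

    @property
    def score(self) -> int:
        return self.likelihood * self.impact

# Hypothetical entries for illustration only.
register = [
    PrivacyRisk("Unencrypted personal data shared with a vendor", 3, 4, "CISO", "mitigate"),
    PrivacyRisk("No process for data subject access requests", 2, 3, "DPO", "mitigate"),
]

for risk in sorted(register, key=lambda r: r.score, reverse=True):
    print(f"[{risk.score}] {risk.description} -> {risk.treatment} (owner: {risk.owner})")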

The workshop attendees discussed that the risk-based approach represents the reality that privacy has moved beyond a compliance, checklist mentality. It is now a viable business model with data considered an asset. The key is identifying the acceptable level of risk and owning responsibility if something goes wrong.

“This creates legal questions because our laws are written for the physical world, but if my identity is stolen, it can encompass legal issues including jurisdiction, standing and damages. Who has jurisdiction in the cyber world? Law always lags technology, so all of this has yet to be determined,” said deWitte.

“We have an opportunity to build trust with consumers through the way we handle their privacy,” said Nusz. “I look forward to this challenge and working with NIST to see it recognized.”

Some of the ideas for how to put the framework into practice to improve trust with consumers included incorporating human-centered research into work done to protect privacy, attempting to de-identify information while being as transparent as possible about the process, and leveraging privacy-enhancing techniques.

NIST will take the feedback from the hearing and build an initial outline, which it will present at a workshop in early 2019. To stay current on the privacy initiative, please visit the NIST Privacy Framework website.
