The European Union General Data Protection Regulation (GDPR), which took full effect in May 2018, strengthens the protection of data subjects’ “personal data,” harmonizes data privacy laws across Europe, and empowers EU citizens to control their data, in addition to changing the way organizations manage and handle data.
The GDPR affects people across the globe. Its scope is wide-ranging and can apply to many global institutions with operations in Europe. It has also given data regulators considerably more power, backed by severe potential financial penalties for non-compliance (a maximum of 4 percent of annual global turnover or €20 million, whichever is higher).
A few of the key things to know about GDPR are:
- The regulation governs how institutions collect, record, use, disclose, store, alter, disseminate, and process the personal data of individuals in the EU.
- If a breach involves personal data, the Data Protection Authorities must be notified within 72 hours.
- It governs the rights of data subjects, including rights to access, rectification, erasure, restriction of processing, data portability, and rights in relation to automated decision-making and profiling.
How do I assess my GDPR compliance?
All of these are essential reasons for institutions to ensure that the proper governance and tactical steps are taken to comply with the GDPR. The GDPR Audit Program Bundle developed by ISACA does just this by providing institutions with a guide for assessing, validating, and reinforcing the GDPR requirements by which they must abide. The audit program provides enterprises with a baseline focusing on several key areas and their respective sub-processes, covering all key components of GDPR, including:
- Data governance
- Acquiring, identifying and classifying personal data
- Managing personal data risk
- Managing personal data security
- Managing the personal data supply chain
- Managing incidents and breaches, and creating and maintaining awareness
- Properly structuring a data privacy function within your institution
Also included are key testing steps involving control category types and frequency to help facilitate effective discussion and analysis as it fits your institution. The important thing to remember is that there is no single right way to become GDPR-compliant. However, a robust and thorough review of how your institution processes data is required to ensure a proper baseline is used to assess compliance and successfully execute a GDPR compliance program.
Editor’s note: ISACA has addressed both general and particular audit perspectives for GDPR through its new GDPR Audit Program Bundle. Download the audit program bundle here. Access a complimentary white paper, “How To Audit GDPR,” here.
NIST conducted a workshop on 16 October in Austin, Texas, USA, to discuss plans for a voluntary privacy framework, and attendees had the opportunity to have a robust discussion about what such a framework should entail. The workshop was attended by individuals from industry, academia, and government.
The need for a framework, according to NIST, is because we live in an “increasingly connected and complex environment with cutting-edge technologies such as the Internet of Things and artificial intelligence raising further concerns about an individual’s privacy. A framework that could be used across industries would be valuable in helping organizations identify and manage their privacy risks.” It would also assist an organization in preparing and maintaining a comprehensive privacy plan.
“I think being able to have guidance at a federal level that takes into consideration key other privacy legislation and regulations as well as standards will be important,” said Paula deWitte, computer scientist, author, and privacy attorney. “The comment at the workshop about relentless interoperability of standards and the framework will be key to its usability.”
NIST discussed how the process for creating the privacy framework was largely aligned with how its Cybersecurity Framework was created, with collaboration from the public, and iteratively. NIST envisions the privacy framework as being “developed through an open, transparent process, using common and accessible language, being adaptable to many different organizations, technologies, lifecycle phases, sectors and uses and to serve as a living document.”
“The Cybersecurity Framework is more about critical infrastructure. Privacy is a different beast, and frankly, a bigger lift. We don’t even have a clear definition for privacy. On top of that, privacy is multi-dimensional. One must look at privacy from its impact on the individual, groups, and society,” said deWitte.
“The major elephant in the room identified at the hearing is that we don’t have a grip on what data needs to be protected and where the company’s data is. By that I mean, we don’t fully understand what data must be kept private and we must consider that organizations must be in complete control of data throughout its entire lifecycle including from procuring it, to storing it, to sharing it (as appropriate) to disposing of it,” said Harvey Nusz, Manager, GDPR, and ISACA Houston Chapter President.
With more work to do on the general strategic front, the group determined the overall approach for the framework would be enterprise risk management, a focus both Nusz and deWitte applaud, while offering words of caution.
“I agree that we need to fit the framework into an enterprise risk management approach, but how do we actually define and conduct risk management? Risk management encompasses all types of enterprise risk, so there is the issue of how one defines risk. Is anyone using a good methodology for risk management we can all get behind?” said deWitte.
“Every organization should at a minimum create a risk register,” said Nusz. “That needs to be part of privacy planning.”
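The risk register Nusz recommends can start very small. Here is a minimal sketch in Python; the field names and the likelihood-times-impact scoring are illustrative assumptions, not a prescribed standard:

```python
from dataclasses import dataclass, field

# A minimal, illustrative privacy risk register entry.
@dataclass
class PrivacyRisk:
    risk_id: str
    description: str
    likelihood: int   # 1 (rare) .. 5 (almost certain)
    impact: int       # 1 (negligible) .. 5 (severe)
    owner: str
    mitigations: list = field(default_factory=list)

    @property
    def score(self) -> int:
        # Simple likelihood x impact scoring, common in risk registers
        return self.likelihood * self.impact

register = [
    PrivacyRisk("R-001", "Unencrypted customer emails at rest", 4, 4, "CISO",
                ["Deploy end-to-end email encryption"]),
    PrivacyRisk("R-002", "Vendor contract lacks data processing agreement", 3, 5, "DPO"),
]

# Review the highest-scoring risks first
for risk in sorted(register, key=lambda r: r.score, reverse=True):
    print(risk.risk_id, risk.score)
```

Even a register this simple forces the key questions: what could go wrong, how bad would it be, and who owns the response.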
The workshop attendees discussed that the risk-based approach represents the reality that privacy has moved beyond a compliance, checklist mentality. It is now a viable business model with data considered an asset. The key is identifying the acceptable level of risk and owning responsibility if something goes wrong.
“This creates legal questions because our laws are written for the physical world, but if my identity is stolen, it can encompass legal issues of including jurisdiction, standing and damages. Who has jurisdiction in the cyber world? Law always lags technology, so all of this has yet to be determined,” said deWitte.
“We have an opportunity to build trust with consumers through the way we handle their privacy,” said Nusz. “I look forward to this challenge and working with NIST to see it recognized.”
Some of the ideas for how to put the framework in practice to improve trust with consumers included: incorporating human-centered research in work done to protect privacy, attempts to de-identify information and be as transparent as possible with the process, as well as leveraging privacy enhancing techniques.
NIST will take the feedback from the hearing and build an initial outline, which it will present at a workshop in early 2019. To stay current on the privacy initiative, please visit the NIST Privacy Framework website.
The last two years have taught us that conventional wisdom and knowledge around privacy and security needs a makeover, in particular as it relates to the EU’s GDPR and the California Consumer Privacy Act. Data controllers and businesses, the entities responsible for what happens to personal data under GDPR and CCPA, respectively, are subject to new obligations that place significant organizational risk squarely on their shoulders. Though compliance issues can come from many places, one often-overlooked impact is managing processor/third-party risk.
Third parties (aka processors in the GDPR or information recipients in California law) are critical to organizational operations, from cloud hosting to payroll administration and processing. They hold customer, partner, employee, and confidential data that is the lifeblood of organizations, and we can’t run without them. While many third parties strive to be good stewards of their customers’ data, we find ourselves in a time where trust and good-faith efforts aren’t going to pass muster anymore.
Under the GDPR, CCPA, and other regulations, controllers need to hold their vendors contractually responsible with regard to specific obligations for how data is handled through data processing agreements and other measures, and, as always, “trust but verify” that the vendor is acting accordingly. By extension, this includes our vendors’ partners as well, when fourth parties are involved.
Along with contractual measures, controllers need to assess, test and review a vendor’s ability to adequately safeguard the data they are transferring through product, personnel, and organizational protection mechanisms. This also requires that they pass the same data protection expectations downstream.
All of this due diligence should, at all times, be centrally documented and maintained. In the event of an incident or breach, controllers must be able to demonstrate a reasonable and defensible process for vetting third parties, including providing results of their assessments of vendors' practices and commitments to data protection, to help mitigate risks of liability. This also includes identifying potential risks of doing business with a particular vendor, taking actions to mitigate those risks, and continually managing vendors based on the scope and sensitivity of the data they process.
Now, chances are your organization has already taken steps to ensure proper actions are taken. For organizations looking for continual process improvement (CPI) and formal action plans, here’s a sample Vendor Risk Management lifecycle to consider:
This lifecycle is a roadmap to operational Vendor Risk Management that includes:
- Establishing a baseline for new vendors to benchmark associated risks (done during the evaluation and procurement process);
- Mitigating risk down to the lowest possible level and using that analysis to set a cadence for vendor review frequency;
- Documenting all aspects of vendor due diligence, including services agreements, privacy and security risk analysis, data processing agreements, vendor contacts, and internal owners; and
- Reviewing all vendors periodically to ensure agreements and relationships are maintained with appropriate controls in place, including based on regulatory guidance, as renewals or new services may be rendered.
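The cadence-setting step in the lifecycle above can be sketched as a simple mapping from residual risk score to review interval. The thresholds and intervals here are illustrative assumptions, not regulatory guidance:

```python
from datetime import date, timedelta

def review_interval(risk_score: int) -> timedelta:
    """Higher residual vendor risk means more frequent reviews."""
    if risk_score >= 15:
        return timedelta(days=90)    # quarterly
    if risk_score >= 8:
        return timedelta(days=180)   # semi-annual
    return timedelta(days=365)       # annual

def next_review(last_review: date, risk_score: int) -> date:
    return last_review + review_interval(risk_score)

# A high-risk cloud-hosting vendor reviewed on 1 March 2018 is due again in 90 days
print(next_review(date(2018, 3, 1), 18))   # 2018-05-30
```

The point is not the specific numbers but that review frequency is derived from the documented risk analysis rather than set ad hoc.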
Organizations should also incorporate privacy/security by design into vendor onboarding practices by integrating with procurement processes to take advantage of work being done today. This could include an early screening to determine if further privacy and security due diligence will be required – based on what services are being rendered – and how they’re delivered.
Editor’s note: For more resources related to GDPR, visit www.isaca.org/gdpr.
On 25 May 2018, the world did not stop simply because the General Data Protection Regulation (GDPR) became enforceable. For many organizations, however, the enforcement date became a distraction, an unofficial deadline. In reality, there was no finish line.
We all recall the panic-driven deluge of marketing consent emails from companies this past summer – some we engaged with, many we forgot about and others we never even noticed. That deluge has now slowed down to a trickle.
Also noticeably quieter are the salespeople peddling “GDPR-compliant” and “one-size-fits-all” solutions. Foreboding news headlines no longer scream about fines of up to 20 million EUR or 4% of total worldwide annual turnover for the slightest misdemeanor.
Three-plus months on from the enforcement deadline, here are a few observations and reflections on how organizations are adjusting to life under the new European privacy and data protection regime.
#1: Business as usual for some?
It would be inaccurate to say that organizations have quickly thrown off the restraints placed on them by the GDPR regarding the processing of personal data. However, it would be equally inaccurate to claim that poor data protection practices have been fully discarded and that we are now living in an era where organizations treat our personal data appropriately.
For Europeans at least, there is evidence of some change in behavior from large technology and global marketing companies, some of whom are already under scrutiny by regulators. For some other organizations, however, GDPR fatigue has begun to set in and organizational priorities are shifting from expensive programs to other hot-button enterprise risk issues.
GDPR compliance initiated a rush of activity that led to the creation of (or updates to) policies, procedures, system inventories and contracts. Some organizations brandished these new shiny documents as their evidence of being “GDPR-ready.”
However, having controls without a plan to assure that their design and operating effectiveness achieve the desired control objectives is a half-hearted effort. Weak governance and the absence of privacy assurance programs increase the risk of a return to the past.
In reality, control effectiveness cannot be fully determined until after a designated cycle of operation. It may take at least one year before we start to see true changes in organizational attitudes toward data protection.
#2: Integrating privacy into enterprise risk management
Forward-thinking organizations saw GDPR compliance as an opportunity to return to the drawing board and, in some cases, revisit their approach toward enterprise risk management.
Far from simply fulfilling a checklist of requirements, some organizations used their GDPR compliance programs to test the alignment between their operational risk, information security, IT governance and privacy functions.
This also was an opportunity to embed privacy risk into enterprise risk management frameworks, check the health of three-lines-of-defense models, adjust risk tolerance levels and develop new key risk indicators (KRIs) to provide end-to-end assurance.
Where new privacy risk management processes (such as steering committees) have been implemented, they will need time to develop traction. In the long term, the right approach could see organizations improving the maturity of their data protection controls while also improving their overall enterprise risk posture.
#3: The “SAR-pocalypse” did not happen
It just didn’t.
Depending on who you spoke to, the increased public awareness of privacy rights enshrined in the GDPR would unleash an avalanche of data subject access requests (SARs) from incentivized or incensed data subjects.
Executives feared that customers, disgruntled employees and coordinated activists flexing their new regulation-enabled muscles would bombard their service desks with requests seeking to enforce rights of access, erasure and others.
The term 'SAR-pocalypse' (a hypothetical denial-of-service scenario caused by an organization’s inability to manage an excessive volume of SARs) was whispered in hushed tones with real concerns that failing to deal with requests within the required period could attract penalties.
In the weeks just before and after the enforcement deadline, many organizations did in fact see a sharp rise in the number of data subject requests they received. However, many of those requests originated from people annoyed with the panic mass mailing campaigns in the weeks prior to the enforcement date. Understandably, many of the requests were for erasure and account deletion.
A retail organization I spoke with noted a higher-than-usual volume of requests in the weeks leading up to 25 May. Requests to be erased reached an all-time peak in the weeks following. However, by mid-June, those numbers had begun to drop. By the end of August, request volumes had returned to pre-25 May levels.
I have yet to hear of any organization admitting that its service desk toppled over under a flood of SARs. However, organizations should not trivialize the need to keep their personal data flows up-to-date and to keep testing the effectiveness of their process for responding to SARs and other GDPR-related queries.
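Testing the SAR response process can start with something as simple as tracking due dates. A minimal sketch, assuming a 30-day approximation of GDPR’s one-month response window (Article 12(3)) and its optional two-month extension; this is an illustration, not legal advice:

```python
from datetime import date, timedelta

def sar_due_date(received: date, extended: bool = False) -> date:
    # One calendar month approximated as 30 days for simplicity.
    base = received + timedelta(days=30)
    # Article 12(3) allows two further months for complex or numerous
    # requests, provided the data subject is informed of the delay.
    return base + timedelta(days=60) if extended else base

def overdue(received: date, today: date, extended: bool = False) -> bool:
    return today > sar_due_date(received, extended)

print(sar_due_date(date(2018, 5, 25)))               # 2018-06-24
print(overdue(date(2018, 5, 25), date(2018, 7, 1)))  # True
```

A queue of requests with computed due dates makes it immediately visible which responses are at risk of attracting penalties.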
#4: Waiting to see what the regulators will do with penalties
‘Data Breach Scapegoats Wanted!’, wrote one satirical industry commentator on social media.
While Europe’s regulators adjust their oversight machinery to be able to effectively police the GDPR, there is a collective holding of breath by organizations waiting to see what precedents will be set with post-25 May financial penalties.
Perhaps the most high-profile data privacy related incident to hit the headlines since the GDPR enforcement deadline was the one involving the infamous Cambridge Analytica. For its part in the scandal (which preceded the 25 May enforcement date), the UK Information Commissioner’s Office (ICO) fined Facebook £500,000 (the maximum fine under the old UK Data Protection Act 1998).
Data privacy breaches continue to be reported, and post-25 May, the UK regulator has continued to take enforcement action against erring organizations. For example, British Telecommunications plc (BT) was fined £77,000 (hardly 4% of their global annual turnover) for sending nuisance emails to customers.
When scrutinized through the lens of Article 83 (“Each supervisory authority shall ensure that the imposition of administrative fines...in respect of infringements...shall in each individual case be effective, proportionate and dissuasive”), it might be a while before a “GDPR-scale” maximum penalty is imposed on any organization.
The absence of scapegoats may be because Europe’s regulators are either overwhelmed with data subject complaints or simply biding their time until they find the right opportunity to set a dissuasive precedent.
Rather than waiting for precedents and second-guessing regulators, organizations should continue to improve their incident prevention, detection and response procedures while maintaining a state of readiness for potential data breaches.
#5: After the hype, what comes next?
As the GDPR hype starts to wane, organizations should not lose sight of the wider benefits that can be derived from an improved attitude toward data protection.
For example, if organizations maintain their discipline around personal data collection and processing, there will continue to be opportunities to improve data governance and unlock business insights from the personal data they lawfully process.
As informed consumers continue to exercise their enhanced consent rights under the GDPR, available inventories of user data are likely to come under pressure. By focusing on data quality (including processing data that is “adequate, relevant and limited to what is necessary”) rather than scale, organizations can improve engagement at different points within the customer journey.
The Privacy & Electronic Communications Regulations (soon to be ePrivacy Regulation) remains a hot topic and the next keenly anticipated regulation from Europe. Correctly implementing GDPR requirements should have placed most organizations in a good position to adopt the requirements within the ePrivacy regulation.
While senior executive support for GDPR remains warm, Data Protection Officers need to test their newly minted powers and ensure that their independence (including avoiding conflicts of interest with other tasks and duties) goes beyond qualities and responsibilities listed in a job description.
There is no turning back
The reality for many organizations is that GDPR program funding and resources will move elsewhere. Data privacy champions will change roles. Vendors will come and go. Applications will be developed and retired. Meanwhile, more countries and jurisdictions (like California) are likely to strengthen their own data privacy laws. The journey never ends.
Somewhere in all of this, care must be taken to avoid the slow erosion of data protection controls arising from negligence and poor governance and a return to the old ways. Seeing the GDPR not as a checklist but as an opportunity to transform corporate attitudes and embed good data protection practices will help organizations thrive under the new privacy regime in the long-term.
Editor’s note: For more GDPR insights and resources, visit www.isaca.org/gdpr.
GDPR: An acronym and a buzzword that has set many of us into “alert mode.” Since it was set in motion more than two years ago, thousands of people have worked hard to ensure their organizations were prepared by the enforcement deadline of 25 May 2018, and they continue doing so. But among the good guys and gals, there were also some “louche” characters (“louche” is a French adjective meaning “shady,” used in CNIL’s video on GDPR): people who had no ethical qualms about providing misleading guidance and wrong answers to the many questions concerning GDPR.
Unfortunately, Poland was among those countries where this phenomenon grew to be a danger to the whole idea of personal data protection. Here are just a few examples of the havoc that was created:
- Hospitals refused to inform parents whether their children were admitted after a serious bus accident with many schoolchildren injured;
- Teachers started calling out pupils by their assigned numbers instead of their names;
- Closure of a cemetery, because some gravestones had names of living persons on them; and
- Offers of special GDPR-compliant filing cabinets.
These situations were widely described and discussed on the internet in Poland, raising concern. To counteract this, in June this year, the Minister of Digital Affairs empowered Mr. Maciej Kawecki, the Director of the Department of Data Management at the Ministry, to create a special task force to deal with the worst absurdities. Mr. Kawecki is a top data protection specialist who is coordinating the work done in Poland to adapt Polish law to GDPR. The mission is very challenging; there are about 800 regulations that need to be revised. In the next few weeks, the Polish Parliament will debate the first package of legislative changes.
Mr. Kawecki posted a call for volunteers to work in the group. This proved to be a sought-after, widely appreciated initiative, and the response was huge. From several hundred candidates, 93 people were picked to work in five groups on issues concerning specific topics: health, education, finance/telecoms, public administration and general issues.
I had the pleasure of being selected as a member of the education team. We come from a mix of professions, with different levels of involvement in day-to-day school activities. This creates additional value, as our different perspectives and experience enable us as a team to take a much broader look at GDPR issues.
In the first stage, we were asked to compile replies to seven especially pressing questions concerning schools. We came to the conclusion that each question should have two answers:
- A short one, of the "YES/NO" type with just a brief added comment, so that headmasters and headmistresses would know right away what they can or cannot do, and
- A long one, with legal reference to the applicable regulations concerning school and pre-school education and some practical advice for all concerned.
We already have noted our first success. Part of our work has been used in the GDPR guide for schools, just published by the Ministry of Education together with the Polish supervisory authority.
Creating a GDPR task force by the Ministry of Digital Affairs is a highly recommended approach. It gives the opportunity for data protection professionals to get involved in supporting GDPR compliance at the national level. It also creates opportunities for an exchange of knowledge and experience between practitioners and government officials in charge of developing regulations and recommendations. The Ministry intends to continue using our group to obtain practical and up-to-date information on issues and problems concerning GDPR implementation and to develop appropriate guidelines. This also gives us the opportunity to share our ideas and thoughts with our peers and to disseminate best GDPR practices to stakeholders both in the public and private sectors.
A good example of the usefulness of guidelines developed by official organizations are the “Guidelines on the protection of personal data in IT governance and IT management of EU institutions” published by the European Data Protection Supervisor (EDPS). These good practices are based on ISACA’s COBIT 5 and describe the data protection aspects related to the processing of personal data. With just a few minor changes that basically come down to replacing “EU institutions” with “data controllers,” this document can easily serve large and small organizations from the public and private sector in the European Union and outside in their efforts to achieve GDPR compliance.
Do we really need regulators to come and tell us that each person’s data is, well, private? A few years before the GDPR came into effect in Europe, Mexico’s Law for Protection of Personal Data Held by Private Parties (LFPDPPP) stated basically the same principles with which many companies are now struggling to comply:
- Individuals have the right to know what personal data about them is stored by any company
- Individuals have the right to request such information to be deleted or withheld from being shared with any other third party
The enactment of these regulations has made individuals and companies alike aware of a basic fact: too much information about ourselves has been voluntarily but unknowingly disclosed, some common-sense boundaries have been breached, and much of that information is simply not needed to provide the digital services we sign up for. We could therefore block access to prevent further dissemination and commercialization of our habits, browsing history, location, network of family, friends and colleagues, and so on.
So, if the rules could be rewritten from the start … if you were actually to read the license or service agreement of each online service that you really want to stick with, what terms would you consider reasonable regarding the information about yourself that you are willing to disclose in order to receive those digital services? Of course, we are discarding the possibility that you are happy with clauses like “by using this app, you understand that we can obtain every piece of your personal data, contacts, location, browsing history and sell it and share it with whomever we can get to pay more for it, with no obligation to you or your descendants.”
So, trying to solve this puzzle, allow me to propose the following Taxonomy of Private Identity and briefly explain the different components.
In today’s model, we have accepted that we authenticate ourselves in the online universe through one of two widely adopted credentials: an email address and/or Facebook credentials. Yes, your Facebook account was originally authenticated through an email account, but it now qualifies as equally valid. However, both can be faked. Yet we are comfortable with an authentication mechanism that is not certain and can be easily stolen.
In the proposed taxonomy, different data is protected behind purpose-specific gates. Those gates can be opened with their respective private key, plus one key linked to you as an individual.
A detailed description of the proposed encryption mechanism and the data structure of the proposed blockchains will be the subject of an upcoming article. At this level, let’s say that the key that allows access to the other gateways should be generated from biometric data. Fingerprints and facial recognition are now easily implemented, but a widespread model would require more complex data, potentially even DNA data that would link that personal key to the owner.
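To make the idea concrete, a personal key could be derived from a stable biometric template with a standard key-derivation function. The sketch below uses placeholder template and salt values; real biometric readings are fuzzy and would require a fuzzy extractor or secure hardware, which this illustration omits:

```python
import hashlib

def derive_personal_key(biometric_template: bytes, salt: bytes) -> bytes:
    # PBKDF2-HMAC-SHA256 stretches the template into a 32-byte key;
    # the iteration count slows brute-force attempts.
    return hashlib.pbkdf2_hmac("sha256", biometric_template, salt, 200_000)

template = b"stable-minutiae-representation"  # placeholder for fingerprint data
key = derive_personal_key(template, salt=b"per-user-salt")
print(key.hex()[:16])
```

The salt ties the derived key to one enrollment, so the same biometric can yield independent keys for different purpose-specific gates.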
Implementation of such taxonomy would allow participants to segregate the information that they open to different actors or services, for specific purposes. For example, your LinkedIn profile could add a tag in each of your education or professional milestones, indicating that each of these items has been “verified,” without a need to provide a copy of it in the open network. As long as LinkedIn is a participant of the authentication protocol, it can confirm that participant universities or employers have confirmed your information, without the need to provide any unnecessary data to persons requesting confirmation of the event. In a similar way, personal legal papers (say, shares deposited into a trust fund) could become public legal papers when linked to a document like a will. You, and only you as owner of your private data, would be tagging the existence of such personal legal documents to what could be consulted by the public, if that’s needed or required by law.
So, the key point is that we start thinking about whether we can identify all these important pieces of private data, know where they are stored, and know whether we have given a huge technology company unnecessary access to link them to marketing algorithms … or worse, whether rogue actors have easy access to the digital representation of our lives and assets.
Author’s note: Jose Angel Arias has started and led several technology and business consulting companies over his 30-year career. In addition to having been an angel investor himself, as head of Grupo Consult, he participated in TechBA’s business acceleration programs in Austin and Madrid. He transitioned his career to lead the Global Innovation Group in Softtek for four years. He is currently Technology Audit Director with a global financial services company. He has been a member of ISACA and a Certified Information Systems Auditor (CISA) since 2003.
Data security always has meant different things to different people. Most have agreed on the importance of using firewalls, but for decades, businesses have been able to choose the level of data encryption they employ. If they didn’t think a VPN was necessary, they simply didn’t use one. If they didn’t think they needed end-to-end data encryption, they would skip it and take their chances. That is, until recently.
Thanks to the newly enforceable General Data Protection Regulation (GDPR), data security is starting to have a legal definition, making certain types of data security a legal requirement. The GDPR exists to protect the data of EU citizens and applies to enterprises globally, because EU citizens’ data is stored by businesses all over the world.
Since a majority of personal data is collected and stored when people sign up for newsletters, businesses can no longer approach email marketing strategies casually and need to take extra precautions.
Don’t skip the double opt-in
A double opt-in process gives you tangible proof that each user joined your list of their own free will. Under GDPR, you are required to be able to prove every user chose to sign up.
Wanting to skip the double opt-in process for your new leads is understandable. Will the confirmation email go to spam? What if they forget to check for it, or the email is delayed? How many signups will you lose because people don’t want to go through the extra step?
These questions are valid concerns. However, they’re based on flawed logic. The incorrect perception is that getting as many leads as possible is a productive approach to email marketing. The truth is, if your leads don’t take the time to confirm their choice to join your email list, they’re not likely to be good customers.
Good customers are the heart of every successful business. For most businesses, 80% of sales come from about 20% of their customers. You really don’t want to keep every customer, and experts even recommend “firing” 10% of your customers each year.
Leads that don’t take the time to confirm opt-in probably don’t care much about the information in the first place. Or, they were just looking for a freebie. Your best leads will be people who are passionate about what you’re sharing and can’t wait to receive your confirmation email.
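Mechanically, a double opt-in works by emailing the new subscriber a confirmation link that only your server could have generated, so a click proves both that the address is real and that its owner consented. The sketch below shows one common way to do this with a keyed signature; the secret value and token layout are hypothetical, and a production system would load the key from configuration, not source code.

```python
import hashlib
import hmac
import time

# Hypothetical secret; in practice, load this from configuration or a vault.
SECRET_KEY = b"replace-with-a-real-secret"

def make_confirmation_token(email, issued_at=None):
    """Build a signed, timestamped token to embed in the confirmation link."""
    issued_at = issued_at if issued_at is not None else int(time.time())
    payload = f"{email}|{issued_at}"
    sig = hmac.new(SECRET_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}|{sig}"

def verify_confirmation_token(token, max_age=86400):
    """Return the confirmed address, or None if the token is forged,
    malformed, or older than max_age seconds (default: 24 hours)."""
    try:
        email, issued_at, sig = token.rsplit("|", 2)
    except ValueError:
        return None  # malformed token
    expected = hmac.new(SECRET_KEY, f"{email}|{issued_at}".encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None  # signature does not match: token was tampered with
    if int(time.time()) - int(issued_at) > max_age:
        return None  # confirmation window has expired
    return email
```

A signup handler would call `make_confirmation_token` and email the result inside a link; the link handler calls `verify_confirmation_token` and only then adds the address to the list, recording the timestamp as the GDPR proof of consent.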
Encrypt internal email messages, too
No matter how private you think your emails are, every email you send and receive is stored on a remote hard drive you have no control over. If your email provider doesn’t encrypt your emails end to end (and most don’t), all company emails are at risk.
Encrypting employee email communications plays a huge role in maintaining GDPR compliance. The average employee won’t think twice about emailing co-workers about sensitive issues that may include data from the business database. For example, someone might send a customer’s credit card information to the sales department for processing a return.
To protect your internal emails and maintain GDPR compliance, buying general encryption services isn’t enough. You need to know exactly how and when the data is and isn’t being encrypted. Not all encryption services are complete.
For instance, if you’re using Microsoft 365, you’ve probably heard of a data protection product called Azure RMS. This product uses TLS to encrypt email messages in transit, from the moment they leave a user’s device. Unfortunately, once the messages reach Microsoft’s servers, they are stored unprotected. “This means that Microsoft and other intermediary third-party providers can access the securely-sent data,” say security experts at Virtru, “making certain data residency, privacy, and compliance requirements more difficult to meet.”
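The distinction matters: with transport-only encryption the provider holds readable plaintext at rest, while with end-to-end encryption it only ever holds ciphertext. The toy sketch below illustrates that property using a one-time pad from Python’s standard library; this is a conceptual illustration of key placement, not production email crypto (real deployments would use S/MIME, PGP, or a gateway product).

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR each byte with the key. This is a true one-time pad only if
    the key is random, at least as long as the message, and never reused."""
    assert len(key) >= len(data), "key must cover the whole message"
    return bytes(b ^ k for b, k in zip(data, key))

# The sender encrypts on their own device; the key is shared with the
# recipient out of band and never given to the mail provider.
message = b"Refund approved for card ending 4242"
key = secrets.token_bytes(len(message))

ciphertext = xor_cipher(message, key)   # all the provider stores

# Only the key holder can recover the plaintext.
assert xor_cipher(ciphertext, key) == message
```

Because decryption requires a key the provider never sees, a breach of the provider’s storage exposes only ciphertext, which is exactly the guarantee transport-only schemes like the one described above cannot make.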
How you secure your data is no longer your choice
GDPR regulations require businesses to take specific measures to protect data, including:
- The pseudonymization and encryption of data;
- The ability to restore the availability of, and access to, personal data in a timely manner after a physical or technical incident;
- The regular testing of a business’s security measures;
- The right to have personal data deleted (a right already established in EU case law by Google Spain v. Costeja).
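Pseudonymization, the first item above, can be sketched with a keyed hash: direct identifiers are replaced by tokens that stay stable (so records can still be linked for analytics) but cannot be reversed without a secret held separately from the data, as the GDPR requires. In this minimal illustration, the pepper value and field names are hypothetical:

```python
import hashlib
import hmac

# Hypothetical secret "pepper"; GDPR requires that it be kept separately
# from the pseudonymized data (e.g., in a key-management service).
PEPPER = b"stored-separately-from-the-database"

def pseudonymize(value: str) -> str:
    """Keyed hash of an identifier: deterministic, so the same person
    maps to the same token, but not reversible without the pepper."""
    return hmac.new(PEPPER, value.encode(), hashlib.sha256).hexdigest()

# Replace the direct identifier before the record leaves the secure zone.
record = {"email": "jane@example.com", "order_total": 42.50}
safe_record = {**record, "email": pseudonymize(record["email"])}
```

A plain unkeyed hash would not qualify, since common email addresses can be recovered by brute force; keying the hash with a separately stored secret is what makes re-identification depend on “additional information kept separately,” as the regulation phrases it.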
Fines for ignoring these requirements can be hefty: up to 10 million euros or 2% of the business’s annual worldwide turnover, whichever is higher. For more serious violations, that cap doubles to 20 million euros or 4%.
Employing data security according to your own preferences is simply no longer worth the risk.
This week, in my home state of California, the state legislature passed, and the governor signed, AB 375, officially known as the California Consumer Privacy Act of 2018. The legislation will take effect January 1, 2020. The good news for privacy professionals is that this bill resembles in many ways the European Union’s General Data Protection Regulation (GDPR). Much of the same data classification, business logic, and tracking of consent and preferences developed to comply with the GDPR should translate to this California law.
However, there are some key differences, which I will highlight below.
A little background and a race against time
While work on AB 375 began in February 2017, its passage yesterday is a direct response to current events. The legislation lists as one of its raisons d’être the recently disclosed actions of Cambridge Analytica, and a ballot measure, the “California Consumer Privacy Act,” that was designed to push the bill along. The measure had overwhelming popular support, and June 28 was the last day that the measure could be pulled from the ballot.
With the passage of AB 375, Alastair Mactaggart, chairman of Californians for Consumer Privacy and the major force behind the ballot measure, announced that the measure would be pulled, as was previously promised if the bill passed. The bill and the ballot measure were very similar, but by passing the bill, the California Legislature preserved its right to amend the law going forward and limited consumers’ rights of redress to breaches as opposed to all violations.
Taking GDPR a few steps further
There are several key differences between AB 375 and GDPR. The major ones are the right of consumers to sell their personal information (and, by explicit reference in section 1798.125(b), the right of a business to offer consumers incentives to allow their information to be collected and sold), and, under section 1798.115, the right of consumers to direct a business that sells their information to disclose: a) what it is collecting; b) what it is selling; and c) what it is transferring for other business purposes.
The right to offer incentives is a huge leap forward in that it allows firms to offer something (not necessarily money) in exchange for the resale of a consumer’s personal data, but it also establishes ownership rights in a whole new way. It’s one thing to control the use of one’s data; it’s still another to allow it only with compensation. It will be very interesting to see the market (consumers and data collectors) set the price. How much is your information worth?
California rightly exempts, under section 1798.145, conduct that takes place wholly outside California and does not involve individuals who are in California at the time of data collection.
As an information security professional, I have always used California (SB 1386), Massachusetts (201 CMR 17.00), Nevada (N.R.S. § 603A.010) and Texas (Texas Medical Records Privacy Act) as my state regulatory privacy proxies. I will immediately add AB 375 to that list and predict that the consumer backlash to the events and disclosures of 2016-2018 will cause other states to pick up where California has left off.
Author’s note: Bill Bonney is a security evangelist, author and consultant, and formerly Vice President and Chief Strategist at encryption software maker FHOOSH. Before FHOOSH, Bonney held numerous senior information security roles in industries including financial services, software and manufacturing. Bonney holds patents in data protection and classification, is an advisor to technology incubator CyberTECH, and is on the San Diego CISO Roundtable board of directors. He holds a Bachelor of Science degree in Computer Science and Applied Mathematics from Albany University.
Where calls to “get ready for GDPR” permeated last year’s InfoSecurity Europe conference in London, keynote speakers at this year’s event—conducted just 10 days after the European Union’s regulatory enforcement deadline—put a stronger spotlight on GDPR compliance and sunk more serious messaging teeth into their talks.
Nowhere was this more evident than during the event’s “EU’s GDPR Is Here – Now What?” panel, where two enterprise privacy and security officers, a senior Microsoft cybersecurity executive and a UK GDPR policy lead weighed the realities and rigor of the new regulatory environment.
Vivienne Artz, chief privacy officer for Thomson Reuters, said the organization has “put its house in order. Privacy, privacy and security by design are the new normal.”
Critical to Thomson Reuters’ progress, according to Artz, was senior management buy-in. GDPR support and change “must be a top-down exercise. Privacy cannot be delegated to a department. It is each individual who is now personally responsible,” she noted.
GDPR’s requirement that organizations report security breaches within 72 hours reinforces the need for individual employee awareness and activation, especially around documented, regularly practiced breach notification policies, according to Artz.
“If you don’t have a breach notification policy, you’re fried,” Artz declared.
Artz and Trainline security director Mieke Kooij emphasized understanding the regulation’s fine details and working collaboratively, and very actively, across IT, audit, assurance and legal. For instance, “there are new things defined as ‘breach,’” and organization-wide awareness is essential to avoid complaints and penalties, said Kooij.
The enterprise leaders emphasized their need for more automated services and tools to support regulatory requirements, such as data sourcing, mapping, data types and data access—a theme echoed by Johnnie Konstantas, Microsoft Enterprise Cybersecurity Group senior director. She said Microsoft, and most other technology and cloud service vendors, are deploying such capabilities given that GDPR lays additional burdens on the always accelerating pace of change in “applications, services and data … and of the supply chain. All of it as a very dynamic environment.”
And while not asserting the Information Commissioner’s Office (ICO) will “fry” non-compliant enterprises, technology policy head Nigel Houlden said “It’s fair to say there are some panicking” given GDPR’s requirements and impact across EU-based organizations and all entities that do business or have customers in the region.
“If an organization is willful, disregardant and neglectful of GDPR, you will be investigated. You will feel the force of … the authority of enforcement,” Houlden said. “We will not ignore anything, even the smallest complaint, if there is harm done.”
So, while an ISACA survey conducted in the lead-up to the enforcement deadline asked participants about their GDPR readiness, maybe now the question should be whether you are GDPR-ready, GDPR-panicked, GDPR-fried – or all of the above?
Editor’s note: For more GDPR resources from ISACA, visit www.isaca.org/gdpr.
In the infancy of any technology, there are going to be teachable moments. Prehistoric man’s mastery of fire didn’t come without a few scorched fingers and the occasional multi-acre conflagration. As a species, our taming of fire and combustion enabled innovations in everything from cooking to metallurgy to transportation, to an array of other endeavors. Those innovations, however, required a continuous process for humans to learn and establish capabilities to control fire, to use it appropriately, and to make it work for humanity’s benefit.
What the discovery of fire meant to ancient humankind, the Internet is to our modern world: a reshaping force that has reconfigured the ways in which we interact and innovate. And—like our forebears—we are still singeing our hands a bit as we learn to operate appropriately in our evolving digital society. Whether we are enterprises or individuals, we must continue to develop and mature our capabilities to embrace and cope with new technologies and the resulting data that offer so much positive potential.
Data is not the new “oil” anymore. Data is the new “air.” It has become more than economic fuel; it is a catalyst of innovation, of disruption, and of possibilities. However, it’s never a guarantee that all innovations, disruptions and possibilities will be positive ones. Creating fire was one of early humanity’s greatest accomplishments. It also made arson possible. We still need to learn how to harness data and the Internet for positive benefit—as well as to manage and mitigate its risks. In the data we generate, just as there is great value, there also is great risk. We need to understand both and plot our digital pathways accordingly.
Facebook CEO Mark Zuckerberg’s recent moments on Capitol Hill made our need to digitally evolve even more stark. His testimony made the spotlight already focused on data and privacy even brighter. If nothing else was accomplished by his interactions with Congress, he has surfaced important and thought-provoking issues worthy of continued discussion—discussion that needs the active participation of policymakers, regulators, industry executives, academic leaders and individual citizens concerned about the use of their personal data.
Zuckerberg’s appearance in Washington, DC came in the aftermath of a data scandal involving a UK-based political data firm that improperly accessed data of millions of Facebook users. Pointing a finger at Facebook and asking, “How did this happen?” may feel cathartic, but it misses the larger point. This happened because the digital world in which we are now living continues to evolve faster than we have developed internationally accepted standards. This happened because, absent such standards, evolution within the global regulatory and public policy realm has been unable to keep pace with the rapid advancement of technology.
During his testimony, Zuckerberg admitted mistakes, accepted responsibility, and promised to do better—and then was grilled about many of those mistakes and the path forward. While Facebook has pledged expanded efforts to protect its users’ data, including giving users a better understanding of which apps can access their data and providing developers less access to data without users’ expressed consent, the revised approach going forward should not be Facebook’s responsibility alone. We, as individuals, have to accept some responsibility, too. In an odd sort of way, people have become data-driven companies in their own right. We must be proactive in the protection of our personal information, profiles, data and privacy rights.
The urgent need for sound data protection has reached new heights globally thanks to the arrival of the long-anticipated General Data Protection Regulation (GDPR), which is now in effect. ISACA research conducted in the weeks leading up to the deadline shows that balancing GDPR compliance against other business priorities is among the leading challenges organizations face. While juggling enterprise priorities amid a disruptive and fast-evolving technology landscape is no easy task, protecting customers’ personal information – whether mandated by GDPR or otherwise – must be treated as a priority, not relegated to a secondary consideration.
Data is the new air, and leveraging its positive potential is essential to catalyzing innovation and progress and to creating new value. To inspire assurance and confidence that the appropriate data protection efforts are in place, implementation of more rigorous and robust information/data governance is not an option; it has become a must. We may also need consensus-based standards to shape the right governance environment, ultimately making it easier to comply with any new policies and regulations that come forward in the future. Without these conditions in place, and lacking a collective commitment to collaboration, breathing this new air will become far more difficult.
Editor’s note: This article originally appeared in CSO.