Due to scheduling and timing constraints, articles and columns that appear in the ISACA Journal are often written weeks or even months before they appear online or become available on the Journal app. This was most certainly the case with my volume 3 IS Audit Basics column, “Auditing Data Privacy,” which was completed well before the Facebook and Cambridge Analytica story hit the news headlines.
Shortly after the story broke, I shared a Guardian article1 on the ISACA Knowledge Center,2 which I said then and still believe now is a must-read for all IT auditors. The article referenced a research paper3 showing that easily accessible digital records of behavior, e.g., Facebook likes, can be used to automatically and accurately predict a range of highly sensitive personal attributes including sexual orientation, ethnicity, religious and political views, personality traits, intelligence, happiness, use of addictive substances, parental separation, age, and gender. Further, liking curly fries and Sephora cosmetics was said to give clues to intelligence; Hello Kitty likes indicated political views; and “being confused after waking up from naps” was linked to sexuality.4
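The paper’s approach was, in broad strokes, to reduce the sparse user-by-likes matrix with singular value decomposition and fit regression models on the latent components. The toy sketch below illustrates only the shape of that technique; the data, sizes and numbers are entirely synthetic and are not the study’s data or code:

```python
# Toy illustration only: synthetic data, not the study's dataset or code.
# Technique: SVD-reduce a sparse user-by-likes matrix, then fit a simple
# logistic regression on the latent components to predict a binary trait.
import numpy as np

rng = np.random.default_rng(0)

# 200 users x 50 pages; 1.0 = user "liked" the page.
likes = (rng.random((200, 50)) < 0.2).astype(float)
# Hypothetical binary attribute, correlated with a handful of pages.
trait = (likes[:, :5].sum(axis=1) + rng.normal(0, 0.5, 200) > 1).astype(int)

# Dimensionality reduction: keep k latent components per user.
U, S, Vt = np.linalg.svd(likes, full_matrices=False)
k = 10
features = U[:, :k] * S[:k]

# Logistic regression fitted by plain gradient descent on log-loss.
w, b = np.zeros(k), 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(features @ w + b)))   # predicted probabilities
    w -= 0.5 * features.T @ (p - trait) / len(trait)
    b -= 0.5 * (p - trait).mean()

pred = (1 / (1 + np.exp(-(features @ w + b))) > 0.5).astype(int)
accuracy = (pred == trait).mean()
print(f"in-sample accuracy: {accuracy:.2f}")
```

Even a crude model like this can extract signal from like patterns, which is the unsettling point of the research: the raw material is cheap and publicly observable.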
The scandal also resulted in calls to #deletefacebook and Facebook having to defend its practices on both sides of the Atlantic, most notably before the US Congress.5 The scandal has resulted in 3 million European users deleting their Facebook accounts and has directly hit the company’s bottom line.6 I have no doubt that Mark Zuckerberg and his executives would much rather the events had never happened.
I strongly believe these events might not have happened had Facebook applied the ISACA Privacy Principles.7 The principles were created using input from existing privacy standards, frameworks and privacy principles from around the world: the development process reviewed those documents to identify the elements generally common to all of them and most applicable to the diverse ISACA membership. The result was a set of 14 privacy principles that harmonizes widely accepted privacy standards, principles, frameworks and good practices and fills the gaps that exist among them.8 I discuss using the privacy principles as an overarching framework for an IT audit in my volume 3, 2018, IS Audit Basics column, “Auditing Data Privacy.”
Read Ian Cooke's recent Journal volume 3 column:
"Auditing Data Privacy," ISACA Journal, volume 3, 2018.
1 Cadwalladr, C.; E. Graham-Harrison; “How Cambridge Analytica Turned Facebook ‘Likes’ Into a Lucrative Political Tool,” The Guardian, 17 March 2018, https://www.theguardian.com/technology/2018/mar/17/facebook-cambridge-analytica-kogan-data-algorithm
2 Audit Tools and Techniques, www.isaca.org/Groups/Professional-English/it-audit-tools-and-techniques/Pages/Overview.aspx
3 Kosinski, M.; D. Stillwell; T. Graepel; “Private Traits and Attributes Are Predictable From Digital Records of Human Behavior,” Proceedings of the National Academy of Sciences of the United States of America, 9 April 2013, http://www.pnas.org/content/110/15/5802
4 Op cit, Cadwalladr
5 Hempel, J.; “Congress Is Unearthing Facebook’s Terrible Power,” Wired, 11 April 2018, https://www.wired.com/story/congress-is-unearthing-facebooks-terrible-power/
6 Neate, R.; “Over $118bn Wiped Off Facebook’s Market Cap After Growth Shock,” The Guardian, 26 July 2018, https://www.theguardian.com/technology/2018/jul/26/facebook-market-cap-falls-109bn-dollars-after-growth-shock
7 ISACA, ISACA Privacy Principles and Program Management Guide, www.isaca.org/knowledge-center/research/researchdeliverables/pages/isaca-privacy-principles-and-program-management-guide.aspx
8 Ibid, p. 41
My work as a systems integrator has allowed me to meet a large number of customers in various industries. It has given me the privilege of seeing various aspects of their businesses. At the very least, I get to meet interesting people. It has also allowed me to meet organizations at very different levels of information technology adoption maturity, from advanced conglomerates, to the MS Office-wielding “mom and pop” store, to the “talk and text” only phone-holding Sari Sari1 store owners. Generally, discussions involving technology are very different from one type of organization to another.
On 27 March 2016, the Philippine nation was made aware of the largest published data breach in its history, dubbed COMELEAK. The leak involved 340 gigabytes of data covering 55 million voters. The timing was also ominous, as this was just 2 months before the hotly contested national elections in May 2016. This was also the first incident in which the newly formed National Privacy Commission (NPC) stepped in. In light of this event, a majority of my conversations with the C-suite now involve data privacy. More recently, Uber Philippines has confirmed that Filipino users were also affected by the 57 million-person Uber breach in October 2016. Sometimes it takes large-scale events like this to wake people up to the reality of personal data breaches. Now everybody wants to know how to protect their organizations against leakages like these.
It does not take much to lose sensitive personal information, and once it is leaked, it is lost. Data privacy is now top of mind. I wrote an article in this year’s first issue of the ISACA® Journal titled “When What Is Lost Is Lost Forever: Data Privacy,” describing the Philippines’ journey toward greater awareness of data privacy and providing some advice on how to get started as an organization. The solutions are surprisingly simple, and a majority of customers are surprised to learn they already have the solution in their midst.
Read William Emmanuel Yu’s recent Journal article:
“When What Is Lost Is Lost Forever: Data Privacy,” ISACA Journal, vol. 1, 2018.
1 Philippine equivalent of a smaller-scale convenience store that sells various goods, generally in sachet-sized portions. They are found on every street corner and are generally sole proprietorships.
Data are emerging as a form of capital in every industry, and they are also the most coveted asset. The forces affecting business operations drive organizations to hunt and gather data and, in due course, to shape them into vast reservoirs and refineries of data. In a typical organization, data of all types, including personal information, flow freely across physical and virtual clusters, reflecting lowered barriers to data movement. This free flow of personal information erodes what consumers regard as privacy-friendly business. It is not uncommon for data-rich organizations to struggle with responding to covert attacks and information theft on one side and cleaning up the mess of accumulated data on the other. In some ways, the organizational digital doctrine emulates the natural history metaphor: “the struggle for privacy and survival of the protected.”
Taking a closer look at some of these organizations reveals that the core products and services that control and process the wealth of data have traditionally satisfied the business need but have fallen short of addressing consumers’ right to privacy. Privacy by design, in a nutshell, aims to embed privacy and data protection principles into products and services from the very start of the design process, when they are modeled and architected. However, in a real-world scenario, integrating privacy requirements into products and services is not a straightforward affair. Setting up a leading-edge privacy-by-design program is often challenged by the following representative influences:
- Tangible design and engineering strategies still remain unclear for many organizations.
- Many legacy solutions are poorly suited to address the emerging class of privacy risk.
- Products and solutions are sometimes rushed to market for competitive reasons without considerable thought to privacy.
- Institutional knowledge of personal data elements and organizational data flow is sometimes limited.
While there are known challenges to overcome, and the definition of privacy is not consistent across geographies, privacy-friendly design is now the expectation of the new generations of consumers who drive business dynamics. Privacy by design is a stride toward consumer-centric design that empowers users to exercise their rights to, and over, their own information.
My recent Journal article expands on this topic and outlines the key principles for modeling privacy by design, using blockchain technology as an example.
Read Sudhakar Sathiyamurthy’s recent Journal article:
“Design With the End in Mind,” ISACA Journal, volume 5, 2017.
My most recent Journal article was based on an analysis of data privacy I performed for an ISACA presentation. The privacy areas of concern detailed by the International Association of Privacy Professionals (IAPP) and the 7 categories of privacy according to ISACA were integrated with the privacy and security controls included in National Institute of Standards and Technology (NIST) Special Publication (SP) 800-53 Revision 4 to reveal the key ingredients to inform privacy planning.
In the article, I reveal the root causes of data breach incidents and related statistics that highlight the severity of data breaches. There are several privacy categories (with associated concerns), questions, responsibilities and areas of risk that a privacy officer (PO) needs to address to protect data. The PO also needs to adopt a governance strategy that respects personal privacy and educates the organization to ensure a unified effort. Four main privacy controls (management, computer operations, business operations and technical controls) should be implemented to ensure a successful privacy program. An organization’s privacy plan should include a list of authorities, definitions, scope and purpose, roles and responsibilities, privacy controls, and other considerations to be set up for success. My article has a more thorough outline to help the PO and your organization implement a successful privacy plan.
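The plan outline above could be captured, for instance, as a simple structured skeleton. The section names follow the article; the placeholder contents are hypothetical:

```python
# Sketch of a privacy plan structure; placeholder values are hypothetical
# and would be filled in by the privacy officer for a real organization.
privacy_plan = {
    "authorities": ["<applicable laws and regulations>"],
    "definitions": {"personal data": "<organization's working definition>"},
    "scope_and_purpose": "<systems and data the plan covers, and why>",
    "roles_and_responsibilities": {"privacy_officer": "<duties>"},
    "privacy_controls": {          # the 4 main control groups
        "management": [],
        "computer_operations": [],
        "business_operations": [],
        "technical": [],
    },
    "other_considerations": [],
}

print(sorted(privacy_plan))
```

Keeping the plan in a machine-readable form like this also makes it easier to audit for completeness: an empty section is immediately visible.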
The article aims to help everyone understand the complexities of data privacy and provides some guidance on managing data and associated activities. I invite you to share your thoughts on personal privacy management and contribute suggestions. If you have a data breach plan or manage data privacy in your organization, I encourage you to share the similarities and differences between your approach and the one I have presented.
Read Larry G. Wlosinski’s recent Journal article:
“Key Ingredients to Information Privacy Planning,” ISACA Journal, volume 4, 2017.
The Hexa-dimension metric is an initiative prompted by the observation that the outcomes of privacy-breach decisions are seldom satisfactory, no matter how meticulous the decision-making process, because consequences are argued in rational, logical and financial terms only. This deficiency led me to reflect on the status quo: the solution derived from the Herbert Simon decision-making model, the guiding light for decision making and deep-rooted in our thought and practice of management, is congenitally defective, and we need to improve how decisions are formulated. The Simon doctrine does not deliver a satisfactory decision because decision makers are not always rational; they are sometimes judgmental, emotional or reliant on escalation of commitment. In addition, the decision variables are considered in financial terms only, but risk and cost can also be ethical, social, legal, technical and ecological in nature.
A New Risk Paradigm
We tend to take risk for granted as a technical concern and to measure it in economic and legal terms. It is, in fact, also a managerial concern and should be evaluated in sociotechnical, legal and financial terms. A shift in the way risk is viewed is necessary.
The 6-d Operationalization Framework for Striving for Balance/Trade-off
A justifiable return and an optimized rate of utilization for the hefty investment in expensive technologies are expected. Measuring success at this point, in financial and technical terms, is essential. Because the huge amount of generated information can be abused, legal and ethical issues arise. The use of the information may serve some well, but not others; therefore, a social issue immediately emerges. As more technologies are used, more natural resources are consumed, and the impact on the environment must be considered.
To arrive at a satisfactory solution, decision makers must consider financial viability, technical effectiveness, legal validity, ethical acceptance, social desirability and ecological sustainability. To balance, or strike a trade-off among, and measure these 6 attributes, I recommend the adapted Ethical Matrix embedded in the 6-d Operationalization Framework. The Ethical Matrix is also useful for clarifying what is pragmatically ethical and effective, qualities of the ethical and effective leadership for which all professionals strive, and is not just for formulating data-privacy-protection policy.
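The framework names the 6 attributes but does not prescribe a numeric formula. Purely as an illustration of what operationalizing such a trade-off could look like, one could compare candidate decisions with a weighted score across the six dimensions; the weights and ratings below are invented:

```python
# Illustrative only: a weighted score across the 6 decision attributes.
# The dimension names come from the framework; weights and ratings are
# hypothetical examples, not values the framework prescribes.
DIMENSIONS = [
    "financial viability", "technical effectiveness", "legal validity",
    "ethical acceptance", "social desirability", "ecological sustainability",
]

def weighted_score(ratings: dict, weights: dict) -> float:
    """Combine per-dimension ratings (0-10) into one comparable number."""
    assert set(ratings) == set(weights) == set(DIMENSIONS)
    total = sum(weights.values())
    return sum(ratings[d] * weights[d] for d in DIMENSIONS) / total

equal = {d: 1.0 for d in DIMENSIONS}            # equal weighting
policy_a = dict(zip(DIMENSIONS, [9, 8, 7, 4, 5, 6]))  # strong financially, weak ethically
policy_b = dict(zip(DIMENSIONS, [6, 7, 8, 8, 7, 7]))  # more balanced overall

print(weighted_score(policy_a, equal))  # 6.5
print(weighted_score(policy_b, equal))  # ~7.17
```

A scalar score is only a starting point; the Ethical Matrix deliberately keeps the dimensions visible so that a high financial rating cannot silently mask a poor ethical one.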
Read Wanbil W. Lee’s recent Journal article:
“An Ethical Approach to Data Privacy Protection,” ISACA Journal, volume 6, 2016.
Data protection used to be a simple compliance task. Most of the data protection laws are based on the Organisation for Economic Co-operation and Development (OECD) Privacy Framework Basic Principles. The core of this framework can be summarized as transparency—the purposes of personal data collection are made known and justified to individuals and their implicit or explicit consent is obtained before collection and processing. Furthermore, if an enterprise wants to change the use of personal data to a new purpose, the enterprise must obtain individuals’ consent before proceeding.
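The consent workflow described above can be sketched as a purpose-based gate: processing is allowed only for purposes the individual has consented to, and a new purpose requires fresh consent first. All class, function and purpose names below are hypothetical:

```python
# Illustrative sketch of purpose-based consent gating (OECD transparency
# principle). All names here are hypothetical, not a real library's API.
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    user_id: str
    consented_purposes: set = field(default_factory=set)

class ConsentError(Exception):
    """Raised when processing is attempted without matching consent."""

def process(record: ConsentRecord, purpose: str) -> str:
    """Allow processing only for purposes the individual consented to."""
    if purpose not in record.consented_purposes:
        raise ConsentError(
            f"No consent from {record.user_id} for purpose '{purpose}'; "
            "obtain consent before proceeding."
        )
    return f"processing {record.user_id} data for {purpose}"

alice = ConsentRecord("alice", {"order_fulfilment"})
print(process(alice, "order_fulfilment"))   # permitted: consented purpose
try:
    process(alice, "marketing")             # new purpose: blocked
except ConsentError as e:
    print(e)
```

The point of the gate is that reuse for a new purpose fails loudly instead of proceeding silently, which is exactly the property big data analytics strains against.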
It all sounds just about doable, but the enterprise must also consider somewhat disruptive big data analytics, which indiscriminately collects massive amounts of data with the hope that a previously unforeseen insight will suddenly be discovered. This being the case, one would wonder how the now-contradictory concepts of transparency and big data analytics can be reconciled when an enterprise begins with no idea of the use it may have for the personal data that are collected for big data analytics.
While regulators continue to call for transparency or anonymization of personal data to reduce harm, data controllers and privacy think tanks are arguing for a paradigm shift from strict compliance to a risk-based approach that puts considerable stress on the importance of ethics in this increasingly data-driven world. For example, the Information Accountability Foundation (IAF) has a 4-part Big Data Ethical Framework Initiative that has caught the interest of the Office of the Privacy Commissioner of Canada, which has given the IAF a grant for the project. The initiative aims to finally develop an assessment and enforcement framework for big data analytics that is based on ethical considerations.
Looking further ahead, the next big thing after big data analytics may be artificial intelligence (AI). A recent report from the Executive Office of the President of the United States, prepared by the US National Science and Technology Council, acknowledges the difficulty of transparency in artificial intelligence and suggests that, “Ethical training for AI practitioners and students is a necessary part of the solution.”
While hard rules appear to be failing in the world of privacy because of technological advancement, ethics is rising as a viable alternative core value, something that will require a great deal of pondering by policymakers and lawyers.
Read Henry Chang’s recent Journal article:
“An Ethical Approach to Data Privacy Protection,” ISACA Journal, volume 6, 2016.
Did you know that 69 percent of reported breaches involve someone inside the organization? Whether by mistake or malice, users are the biggest threat to a company’s data. Therefore, having forensics and analytics on your users’ actions is the best way to audit and respond to a data breach. But how will users feel about you collecting these forensics?
On the one hand, organizations need to monitor user activity for potential threats. On the other hand, employees do not want to feel like their privacy is being violated. So, how do you protect your company from data breaches without employees seeing you as being intrusive? Here are a few suggestions:
- Clearly communicate monitoring policies—When giving employees or third-party users access to the system, notify them that their actions will be monitored. Create a “policies and procedures” document that clearly outlines why user behavior is monitored, what will be monitored, and what behaviors are considered illegal or unacceptable. Give this document to all users when they first receive their login credentials. Discovering this monitoring policy later may leave employees or vendors feeling like their privacy has been violated.
- Explain the goal of user activity monitoring—To help employees feel like they are trusted members of the company, it is important to explain the goal of user monitoring. Monitoring simply records actions to flag potential illegal activity or threats to the company. The average employee should have nothing to be concerned about. In fact, this software will help protect them from blame if a breach does occur.
- Explain what activities are monitored—Unfortunately, all action taken on a company system must be monitored, recorded and stored. While it does not seem necessary to record someone browsing Facebook or checking personal email, stopping the recording during these times would open up opportunities for disguising illegal behaviors. To ease employees’ minds, explain that while every action—including individual keystrokes—is being recorded, they are not necessarily being monitored. Only suspicious or illegal activity will trigger alerts.
- Remind users they are being monitored—Even after explaining the monitoring policies fully, it is a good idea to regularly remind employees of these policies. Notifications and policy messages can be built into your monitoring software to remind users every time they log in so they never feel caught off guard. It can also act as a constant deterrent for anyone attempting any illegal acts.
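The tips above amount to a simple pattern: record everything, alert only on rule matches, and surface the policy notice at every login. A minimal sketch of that pattern follows; the policy text, rule names and event fields are all invented for illustration:

```python
# Minimal sketch of the monitoring pattern described above. The notice text,
# suspicious-action rules and event structure are hypothetical examples.
import datetime

MONITORING_NOTICE = (
    "Reminder: activity on this system is recorded per the "
    "Acceptable Use Policy. Only flagged events are reviewed."
)

SUSPICIOUS = {"bulk_export", "disable_logging", "repeated_access_denied"}

audit_log = []   # everything is recorded...
alerts = []      # ...but only suspicious actions trigger review

def record(user: str, action: str) -> None:
    event = {"ts": datetime.datetime.now(datetime.timezone.utc),
             "user": user, "action": action}
    audit_log.append(event)          # full recording, no gaps
    if action in SUSPICIOUS:
        alerts.append(event)         # only these reach a reviewer

def login(user: str) -> str:
    record(user, "login")
    return MONITORING_NOTICE         # remind the user at every sign-in

print(login("jsmith"))
record("jsmith", "read_email")       # recorded, not alerted
record("jsmith", "bulk_export")      # recorded AND alerted
print(len(audit_log), len(alerts))
```

Separating the complete audit trail from the alert queue is what lets you honestly tell employees that everything is recorded but only suspicious activity is reviewed.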
User activity monitoring is the best defense for the inside threat companies face. But companies should be smart about it. Follow these tips to keep users feeling happy and safe while keeping the company protected.
Read Dimitri Vlachos’ recent Journal article:
“User Threats Vs. User Privacy,” ISACA Journal, volume 1, 2015.
By Ashwin Chaudhary, CISA, CISM, CGEIT, CRISC, CISSP, CPA, PMP
My recent Journal article addresses increasing concerns over user privacy due to the wide usage of personal mobile devices in the workplace. Recent privacy violations at large organizations have brought privacy issues into the limelight. Several privacy regulations have been strengthened, such as the US Health Insurance Portability and Accountability Act (HIPAA) and the US Health Information Technology for Economic and Clinical Health (HITECH) Act, which focus on health-related privacy issues, and the US Children’s Online Privacy Protection Act (COPPA), which covers the online privacy of children. Such efforts aim to bring about stringent privacy regulation; however, only strict enforcement of these regulations can ensure the law’s effectiveness.
With respect to bring your own device (BYOD), an enterprise’s focus is mainly on corporate network and data security rather than user privacy. As a matter of social responsibility, organizations also need to adopt user privacy audit and assurance programs to manage user privacy, as this protection is just as important as protecting corporate security.
Regulations and compliance requirements that mandate annual certification are generally point-in-time assessments, and some are based on self-assessment and self-certification, which may lead to cutting corners. Continuous independent assurance programs, such as Service Organization Control (SOC) 2 or SOC 3 Type 2 reports, should be considered in corporate security planning.
Read Ashwin Chaudhary’s recent Journal article:
“Privacy Assurance for BYOD,” ISACA Journal, volume 5, 2014.
By Horace McPherson, CISA, CISM, CGEIT, CRISC, CISSP, PMP
Data privacy is more than just a compliance or business issue. People become vulnerable whenever they turn over their personal information to companies. Companies, regardless of industry, owe it to their customers or subscribers to protect their personal information as if they were protecting people’s most precious possessions.
I see what happens to people when they are notified that a company holding their personal information has been breached: anxiety sets in, people have sleepless nights and they sometimes even become pessimistic about the future. Victims of identity theft sometimes feel alone since, in most cases, the burden of proof is on them to prove that they are not responsible for the results of any nefarious actions performed by an identity thief.
In my opinion, personal information is worth more than the numbers on a balance sheet or income statement. In the area of corporate social responsibility (CSR), organizations must be concerned with what is called the triple bottom line. Elements of the triple bottom line include social, environmental and economic factors. Protecting customers’ information is aligned with the social and economic aspects of the triple bottom line, 2 of the essential elements of CSR. If companies do not properly protect personal information, they are not being good corporate citizens. Once sensitive information is collected, there is an expectation of due diligence and due care in the application of data protection policies and mechanisms.
At the end of the day, a company’s approach to data privacy and protection depends on the moral outlook of the company’s leaders. The ethical perspective of the top management team determines whether a company will be proactive and a leader in setting and supporting privacy protection policies and whether privacy protection is put ahead of profits. The tone at the top is very important. Let us hope that the tone is a good and fair one.
By William Emmanuel Yu, Ph.D., CISM, CRISC, CISSP, CSSLP
We live in a world where technology is present in everything we do. We have essentially become dependent on this level of pervasive communication technology. However, these same technological capabilities also make it possible to perform unprecedented levels of surveillance. People in the technology sector have always been aware of this power and have capitalized on it. However, in June 2013 things changed. The post-Snowden world has brought increasing awareness to the issue of mass surveillance. More people are now aware of it and more people want action from their governments. This increase in awareness has compelled regulators and governments worldwide to review intelligence agencies, laws and regulations with respect to data privacy.
For liberal countries with no data privacy laws, there will likely be a move to enact data privacy regulation. Countries that already have regulation will start reviewing and strengthening it in most cases. For a while, customers will be more discriminating about where their personal data resides. Decisions will be made on the perceived safety of these service providers. This puts an additional burden on companies that rely on IT to ensure that they continue to provide their services within a more data privacy-aware regulatory and cultural framework.
At the same time, this is also the era of big data, which enables the large-scale collection of customers’ personal and transactional information. Companies increasingly view their data streams as assets and have invested in technology to keep more of their data longer and to identify ways to monetize it.
Companies are in no position to predict all possible changes in regulatory action or cultural expectations in the market. However, they need to build their applications to ensure they comply with these regulatory and cultural norms. In my recent Journal article, I recommend that application developers seriously review their applications in the context of existing global privacy regulatory frameworks, which can serve as a template. These general privacy principles can ensure a degree of future proofing for these applications.
We are seeing the collision of capability and responsibility. We now have the capability to keep, process and monetize more private data. This is what technology allows, but at the same time, service providers have a responsibility to customers to protect this information and use it in a fair and proper fashion.