IT and Privacy Audits 

 

Over the last decade or so, concern over the privacy of personal private information (PPI) has grown from being limited to a few specialized industries to being pervasive across almost all industries. Business systems process and store growing amounts of PPI, so PPI across the business community is exposed to a variety of vulnerabilities, including loss, misuse, theft and unauthorized distribution. Federal and state laws in the US mandate certain standards for the protection and privacy of that data, and identity theft has become a perpetual risk associated with PPI. Contracts may also call for protection of PPI involved in business processes or services provided by a service provider. Entities affected by these laws, or facing an unacceptable level of risk associated with the PPI they collect, process and store, therefore need privacy audits to assure compliance and a reasonable degree of mitigation of these and other risks.

Because PPI is digital, privacy audits tend to be performed by IT auditors. PPI clearly involves IT in its collection, storage and dissemination at a minimum and, thus, should be subject to IT controls.

This article describes the basics of privacy audits, including applicable laws and technical literature, best practices, resources, and suggested controls to prevent adverse events related to PPI.

Laws

The auditor must always be aware of the key aspects of the audit objectives. In many privacy audits, part of the objective is compliance with state and federal laws and regulations. Thus, the IT auditor needs to know the requirements of the applicable laws and regulations when performing a privacy audit. In the US, both federal and state laws apply to privacy audits.

A key piece of US federal legislation was the Health Insurance Portability and Accountability Act (HIPAA) of 1996, which mandates the protection and privacy of patients' personal information in the health care industry. In the financial services industry, there is the Gramm-Leach-Bliley Act (GLBA) of 1999, which mandates certain standards for the privacy of customers' PPI. GLBA has many intricate details; for example, monitoring and enforcement involve industry regulators, the US Federal Trade Commission (FTC) and the US Securities and Exchange Commission (SEC). Other potentially relevant laws include the Children's Online Privacy Protection Act (COPPA) and the Privacy Act of 1974.

State laws also affect privacy audits. The genesis of state law in this area is California's SB 1386 of 2002, which requires entities that have experienced a security breach of PPI to notify each affected customer/client who is a California resident. The California law became the template and motivation for other states. State laws, however, vary in their requirements. Thus, the auditor needs not only to be aware of the California law, but also to research the laws of each state affected by a breach, should one occur. At last count, 44 US states have similar laws.1

Canada, Australia and the European Union (EU) have similar laws as well. For international entities, the IT auditor would need to know or research the applicable international laws.

Risks

There are risks associated with PPI that go beyond compliance with applicable laws and regulations. A breach that compromises the PPI of thousands of individuals could cost the entity a significant number of customers, clients or patients; the resulting loss of revenue and business is a significant business risk. A breach may also damage the entity's public image in a way that hinders securing new customers. Then there is the criminal element, which has learned how lucrative identity theft can be and how to avoid detection and prosecution; that, too, is a business risk.

One of the major risks is the cost of a breach. Depending on the nature of the nexus between the business and its customers, and on applicable state laws, the entity may be required to notify each affected person of the breach and even provide a year's service from a credit reporting entity. The average cost of notification alone is more than US $100 per person, so it is cost-beneficial for the entity to follow best practices for privacy. One area of high risk is digital storage on portable devices, such as laptops, USB drives and backup drives in transit.2 If the personal data are on an encrypted device, the entity may not have to notify those customers, clients or patients, thus avoiding the extensive expense of notification.
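To make that encryption control concrete, the following is a minimal sketch, in Python, of encrypting a PPI extract before it is copied to a portable device. It assumes the third-party cryptography package is available; the file names are hypothetical, and in practice the key would be held in a key-management system rather than generated beside the data.

```python
# Minimal sketch: encrypt a PPI extract before it goes onto a USB drive.
# Assumes the third-party "cryptography" package; file names are hypothetical.
from cryptography.fernet import Fernet

# In a real deployment, the key comes from a key-management system and is
# never stored with the data; it is generated here only to keep the sketch
# self-contained.
key = Fernet.generate_key()
cipher = Fernet(key)

with open("customer_extract.csv", "rb") as f:   # hypothetical PPI extract
    plaintext = f.read()

# Only the ciphertext file is copied to the portable device.
with open("customer_extract.csv.enc", "wb") as f:
    f.write(cipher.encrypt(plaintext))
```

With only the encrypted file on the device, a lost laptop or USB drive exposes ciphertext rather than readable PPI, which is what may exempt the entity from notification under many state laws.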

In addition, there is a public image risk if the breach becomes public, and it will if the entity is subject to one of the state laws requiring notification. Most people have read stories about TJ Maxx, ChoicePoint, CardSystems and others that have had this unpleasant experience. A few years ago, one of the popular credit card providers reportedly lost a significant part of its business after the theft of thousands of its customers' credit card data became public. CD Universe, which once had a considerable share of online music CD sales, suffered a significant loss of business after reporting the theft of thousands of customers' credit card data. Once that kind of event becomes public, prospective customers may stay away in droves.

Technical Literature

ISACA has published materials to assist in privacy audits, including IT Audit and Assurance Guideline G31 (Privacy)3 and “Risks of Customer Relationship Management,”4 an audit work program.

The IT Audit and Assurance Guideline on privacy naturally ties the audit objectives and procedures back to the applicable COBIT elements. Sections 1.6.5 and 1.6.6 discuss issues related to privacy audits, and section 5.1.1 has a lengthy list of subtopics or elements of privacy that would need to be reviewed for possible inclusion in a privacy audit. G31 uses the general principles established by the Organization for Economic Cooperation and Development (OECD) in 1980 and 2002 (see figure 1); these principles would be considered best practices as well. Section 6.1.3 has a table of possible questions to use in the audit, and section 7 outlines the performance of audit work.

Figure 1: OECD Privacy Principles

The “Risks of Customer Relationship Management” audit work program document has a section (No. 12, pages 175-182) on privacy risks and contains detailed information on conducting privacy-related audits of a customer relationship management (CRM) system, much of which is applicable to privacy audits in general.

The American Institute of Certified Public Accountants (AICPA), in conjunction with the Canadian Institute of Chartered Accountants (CICA), has established guidance on privacy audits, Generally Accepted Privacy Principles (GAPP).

GAPP was originally adopted in 2003 and revised in 2006 (see figure 2); an exposure draft was out for comment through early 2009, so auditors may want to check whether a newer version has been adopted.

Figure 2: Generally Accepted Privacy Principles (GAPP)

According to the AICPA, using GAPP, “organizations can proactively address the significant challenges that they face in establishing and managing their privacy programs and risks from a business perspective.” There are versions of GAPP customized for management and for auditors. The principles identified in GAPP are, in effect, best practices and a source of potential procedures for privacy audits.

Other relevant technical literature, including ISO 27002, the Payment Card Industry Data Security Standard (PCI DSS) and the Federal Financial Institutions Examination Council (FFIEC) IT Examination Handbook, might also apply to privacy audits.5

Sample Controls

A list of sample controls for privacy is provided in figure 3. Some of the controls needed to protect and maintain the privacy of PPI are associated with various storage mechanisms. For instance, it is common for employees to have a laptop that contains information about the entity's customers, or PPI in files acquired from clients. Another common risky situation is the use of USB drives to download files that contain PPI. Both situations expose PPI to significant risk.

Figure 3: Sample Privacy Controls

The loss of a laptop, for example, costs far more than the replacement of the equipment. In one incident, an auditor's laptop containing the PPI of 243,000 Hotels.com customers was stolen from a vehicle. The Computer Security Institute estimates that the average laptop theft costs a company US $89,000.
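Given these portable-media risks, one preventive control an entity might adopt is scanning files for PPI patterns before they are staged for copying to a laptop or USB drive. The following Python sketch is illustrative only; the regular expressions and the staging folder name are assumptions for the example, and commercial data loss prevention (DLP) tools use far richer detection.

```python
# Illustrative (not production-grade) scan for common PPI patterns in files
# staged for copying to portable media. Patterns and paths are assumptions.
import re
from pathlib import Path

PPI_PATTERNS = {
    "US SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    # Crude match for 13- to 16-digit card numbers with optional separators.
    "Payment card": re.compile(r"\b(?:\d[ -]?){12,15}\d\b"),
}

def scan_file(path: Path) -> list[str]:
    """Return the names of any PPI patterns found in one text file."""
    text = path.read_text(errors="ignore")
    return [name for name, rx in PPI_PATTERNS.items() if rx.search(text)]

# "outbound_staging" is a hypothetical folder where files wait to be copied.
for path in Path("outbound_staging").rglob("*.csv"):
    hits = scan_file(path)
    if hits:
        print(f"BLOCK {path}: possible PPI ({', '.join(hits)})")
```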

It is worth noting that one advantage of cloud computing is the possible elimination of the storage risks associated with laptops, USB drives and drives in transit. Employees could work remotely without copying files to a laptop, copying them instead to cloud data storage. The same is true of USB drives: they would be unnecessary if files were copied to the appropriate cloud storage. Even transporting backup drives might become unnecessary if the cloud service provider backs up its customers' data within the cloud.

The same logic applies if e-mail such as Microsoft Outlook is provided as Software as a Service (SaaS) in the cloud. That arrangement reduces the problems associated with unencrypted e-mail attachments that contain PPI: because the application runs within the cloud environment, PPI in attachments stays behind the firewalls and other protections built into those systems.
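As a sketch of the "copy to the cloud instead of to a USB drive" pattern described above, the following assumes AWS S3 via the boto3 library; the bucket and object names are hypothetical. Server-side encryption is requested so the PPI is also encrypted at rest in the provider's storage.

```python
# Sketch: move an (already encrypted) PPI file to cloud storage instead of a
# USB drive. Assumes AWS S3 via boto3; bucket/object names are hypothetical.
import boto3

s3 = boto3.client("s3")
s3.upload_file(
    "customer_extract.csv.enc",                    # file leaving the workstation
    "example-entity-ppi-bucket",                   # hypothetical bucket
    "remote-work/customer_extract.csv.enc",        # hypothetical object key
    ExtraArgs={"ServerSideEncryption": "AES256"},  # encrypt at rest in S3
)
```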

Another issue is that of home computers that could contain PPI. They may fall within the scope of privacy laws and regulations, and they are already being addressed in some entities' privacy policies.

Conclusion

PPI presents multifaceted risks: compliance exposure, business risks, potential crimes (identity theft) and others. Privacy audits are not only becoming more common in response to these risks, but are likely to keep growing in number. To perform them well, IT auditors need to know the risks, resources, best practices and effective controls associated with protecting and securing PPI.

Endnotes

1 As of the end of 2008, the following US states had no or pending legislation on PPI: Alabama, Kentucky, Missouri, Mississippi, New Mexico and South Dakota. See an interactive map at www.csoonline.com/article/221322/CSO_Disclosure_Series_Data_Breach_Notification_Laws_State_By_State.
2 Many organizations have lost backup drives being transported to disaster recovery sites. For instance, in 2006, the American Institute of Certified Public Accountants lost a drive that contained all 330,000 members’ data/PPI.
3 ISACA, IT Audit and Assurance Guideline G31, Privacy, 2005, www.isaca.org/standards
4 ISACA, “Risks of Customer Relationship Management: A Security, Control and Audit Approach,” February 2004, www.isaca.org/downloads
5 See also “7 Master Data Management Best Practices” from CIO Magazine, http://searchcio.techtarget.com, and http://searchsecurity.techtarget.com/search/1,293876,sid14,00.html?query=privacy&x=6&y=5 for a list of possible articles from Searchsecurity.com.
6 SANS (www.sans.org) has a sample privacy policy available online.
7 IronKey is an example.

Tommie W. Singleton, Ph.D., CISA, CITP, CMA, CPA
is an associate professor of information systems (IS) at the University of Alabama at Birmingham (USA), a Marshall IS Scholar and a director of the Forensic Accounting Program. Prior to obtaining his doctorate in accountancy from the University of Mississippi (USA) in 1995, Singleton was president of a small, value-added dealer of accounting IS using microcomputers. Singleton is also a scholar-in-residence for IT audit and forensic accounting at Carr Riggs Ingram, a large regional public accounting firm in the southeastern US. In 1999, the Alabama Society of CPAs awarded Singleton the 1998-1999 Innovative User of Technology Award. Singleton is the ISACA academic advocate at the University of Alabama at Birmingham. His articles on fraud, IT/IS, IT auditing and IT governance have appeared in numerous publications, including the ISACA Journal.

