How to Implement MFT for Data Protection

By Dave Brunswick

The EU General Data Protection Regulation (GDPR) outlines measures required to protect personal data and how an enterprise moves, uses and stores that data. My recent ISACA Journal article, “Protection From GDPR Penalties With an MFT Strategy,” discusses why a robust managed file transfer (MFT) and integration platform is useful for organizations looking to comply with GDPR and other data protection measures.

Here are some key steps for implementing an MFT solution to meet increasingly stringent data demands:

  1. Assess your ecosystem—It is difficult for any organization to understand all the systems and applications deployed across departments and geo-distributed locations. But it is important to understand them and the life cycle of business data to ensure GDPR compliance. A comprehensive enterprisewide assessment of every on-premises and cloud system, database, application, and storage repository is critical in determining how MFT can streamline data processing.
  2. Evaluate your best deployment architecture—Understanding the data flows and systems that will be exchanging data, along with your other architectural components, will help define the best deployment architecture for your MFT solution. Depending on the nature of your data exchanges, an on-premises, cloud or hybrid solution may enable the best control. Also, the ability to integrate with central authentication and auditing systems may be important, as might the ability to deploy as part of a broader development operations (DevOps)-driven environment.
  3. Select the software—There are numerous MFT vendors on the market. Smart organizations know exactly what questions to ask an MFT vendor so they do not end up selecting a solution that fails to fit the business need.
  4. Design the service—The project manager often will liaise with the vendor for the bulk of this work, but that person also must seek input from other personnel (i.e., enterprise architect, security and compliance managers, application analyst) to ensure that all the required business functionality, IT administration and security boxes are checked.
  5. Test, test, test—After you have captured the service flows, configure the system, partner profiles and firewall connections, and throw everything you can at it. This quality assurance period is critical to outlining each required (and potential) use case and data pattern.
  6. Deploy and support—The time between testing and production is critical because it is when operational challenges surface. Solicit your vendor’s professional services team to help with migration and implementation requirements and lean on their support teams to resolve issues upon deployment.
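A simple way to picture the integrity checks exercised in steps 5 and 6 is a transfer routine that verifies a checksum and writes an audit record. The sketch below is illustrative only; the function names and log format are hypothetical and not drawn from any particular MFT product:

```python
import hashlib
import shutil
from datetime import datetime, timezone
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Return the SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def transfer_with_audit(src: Path, dst: Path, audit_log: list) -> bool:
    """Copy a file, verify its integrity and append an audit record,
    mimicking the checksum and audit features of an MFT platform."""
    expected = sha256_of(src)
    shutil.copyfile(src, dst)
    verified = sha256_of(dst) == expected
    audit_log.append({
        "file": src.name,
        "sha256": expected,
        "verified": verified,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })
    return verified
```

A real MFT platform layers encryption in transit, retries, partner profiles and central audit storage on top of this basic verify-and-log loop.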

Read Dave Brunswick’s recent Journal article:
“Protection From GDPR Penalties With an MFT Strategy,” ISACA Journal, volume 4, 2018.

The Absence of IT Governance Codes

In recent years, board-level supervision in information technology matters has become a key IT governance topic. It is often assumed that national corporate governance codes can guide board members to design and potentially improve their IT governance practices. At the Antwerp Management School (AMS), we conducted a study to understand what IT governance-related guidelines are included in national corporate governance codes.

We selected 15 national corporate governance codes to study. These codes were selected based on income level and geographic dispersion across different continents. Surprisingly, we found that most national corporate governance codes do not include key IT governance topics. There is hardly any IT governance information incorporated in the codes at all. The only exception we found was the South African corporate governance code, King III, which contains an entire chapter on IT governance-related guidelines. We also note that the committee responsible for drafting the South African corporate governance code recently finalized King IV, in which IT-related matters assume an even more prominent role. Based on our findings, we conclude that:

  • Corporate governance committees worldwide that are willing to recognize the value of IT governance can certainly benefit from looking at the South African corporate governance codes when drafting their own.
  • Additionally, we suggest that board members who are already complying with their existing national corporate governance codes refer to King III to explore more concrete guidance on IT governance.

This study was performed by researchers at AMS as part of an industry-sponsored research project on board-level IT governance. The research project focused on the need for boards to extend their governance accountability beyond a mono-focus on finance and legal matters as a proxy for corporate governance. This extended accountability should include technology and provide digital leadership and organizational capabilities to ensure that the enterprise’s IT department sustains and extends the enterprise’s strategies and objectives. We discovered that board members are increasingly seeking guidance on how they can expand their IT governance accountability within the board and also in an appropriate modus vivendi with executive management. More information, including intermediary results, can be found on the AMS website.

Read Steven De Haes, Anant Joshi, Tim Huygh and Salvi Jansen’s recent Journal article:
“Exploring How Corporate Governance Codes Address IT Governance,” ISACA Journal, volume 4, 2017.

Leverage Enterprise Data Management Investments to Facilitate Data Breach Reporting Requirements

In Canada, it is the Digital Privacy Act and its amendments to the Personal Information Protection and Electronic Documents Act (PIPEDA); in the United States, the regulations include the Gramm-Leach-Bliley Act, the Health Insurance Portability and Accountability Act (HIPAA) and the US Personal Data Notification and Protection Act; in Australia, it is the Privacy Amendment Act; and in the EU, it is the ePrivacy Directive. There are more regulations than those listed here, but common to all of them is the growing requirement for privacy breach reporting, with breach assessment being a major part of that process. This includes identifying the location of the breach, the type of data that have been compromised and exactly whose sensitive data could be exposed, since those individuals would need to be notified in the event of a breach.

Large corporations, especially banks but also insurers and financial securities organizations, have invested heavily in various forms of enterprise data management (EDM) tools and processes since the publication of the Basel Committee on Banking Supervision’s (BCBS) Principles for Effective Risk Data Aggregation and Risk Reporting (RDARR) in 2013. More recently, the data management implications of BCBS 265, the Fundamental Review of the Trading Book, are coming to light; it has much in common with RDARR from a data aggregation perspective.

Cyber security practitioners are already familiar with data classification frameworks for the sensitivity of various enterprise data artifacts. However, the very EDM tools mentioned previously readily facilitate more classification detail, such as whether data constitute personally identifiable information (PII), payment card industry (PCI) data or personal health information (PHI), or, at a lower level, even whether the data are passport data, insurance numbers or credit card numbers.

Classifying more enterprise data makes those data easier to locate, e.g., identifying all data sources across the enterprise that contain passport numbers. Critically, it also becomes easier and faster to identify all the data classifications exposed at the point of breach. This drives the level of detail required in the breach report and the nature of the post-breach actions, all of which simplifies post-breach planning. Note that data classification is already happening, often implicitly, as part of many enterprise data profiling activities within the EDM team.
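As a toy illustration of this kind of classification, the sketch below scans text for credit-card-like strings (filtered with the Luhn checksum) and passport-number-like strings. The patterns and labels are invented for illustration; production EDM tools apply far richer rules and context:

```python
import re

def luhn_valid(number: str) -> bool:
    """Luhn checksum, used to filter credit-card-like digit strings."""
    digits = [int(d) for d in number][::-1]
    total = 0
    for i, d in enumerate(digits):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

# Invented, deliberately simplistic patterns for illustration only.
PATTERNS = {
    "credit_card": re.compile(r"\b\d{16}\b"),
    "passport_like": re.compile(r"\b\d{9}\b"),
}

def classify(text: str) -> set:
    """Return the set of classification labels found in a text field."""
    labels = set()
    for label, pattern in PATTERNS.items():
        for match in pattern.findall(text):
            if label == "credit_card" and not luhn_valid(match):
                continue  # skip 16-digit strings that fail the checksum
            labels.add(label)
    return labels
```

Running `classify` over every column of every source is, in miniature, what enterprisewide data classification does at scale.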

From a cyberrisk mitigation perspective, data classification is an enterprisewide initiative, since a breach could happen anywhere. For those organizations with the foresight to implement true enterprisewide data management as a follow-on, e.g., from RDARR, it makes sense for the cyber security team to leverage these EDM investments and learnings for cyberrisk management purposes. This not only makes good sense to the chief financial officer, it also means a quicker time to deployment and quicker time to breach reporting, all of which is good news for compliance, and really good news for us, the public.

Read Guy Pearce’s recent Journal article:
“Boosting Cyber Security With Data Governance and Enterprise Data Management,” ISACA Journal, volume 3, 2017.

Going for the ATO

By Jo Anna Bennerson

The Authority to Operate (ATO) is required for an information system to operate within US federal government agencies. My recent Journal article provides details on how to obtain the ATO. The following steps can help US enterprises gain approval to operate with the federal government:

   ●  Ensure confidentiality, integrity and availability—The first necessary step toward achieving ATO is confidentiality, integrity and availability (CIA). This means that only approved people can get in, any changes to the system or data are genuine, and the system is up and ready for use.
   ●  Embrace the NIST 800-53 control families—Every family is a tightly knit assembly of controls with a dash-one, or parent, control followed by offspring controls that dive deep into the security measure. For instance, the Access Control family starts with the dash-one control of access control policy. It is followed by more detailed controls to be implemented and assessed, such as Account Management and Access Enforcement. Using the lists of controls within each of the 18 NIST control families allows users to demonstrate the security that is in place or being planned.
   ●  Keep the evidence—Just like in any operational process, you create or gather documentation to delineate the process and what has taken place. Just like any trail or audit, you keep evidence of the path you have taken. The ATO process allows you to gather and store all the security documentation. This serves well in building a case for the security posture of your system and how it fits into your federal agency’s risk profile.
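The NIST 800-53 control-family structure described above (a dash-one parent control followed by offspring controls) can be modelled as a small data structure with a gap query. A minimal sketch with invented class names, seeded with a few Access Control family entries whose status values are made up for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Control:
    control_id: str          # e.g., "AC-1" (the dash-one parent)
    title: str
    implemented: bool = False

@dataclass
class ControlFamily:
    name: str
    controls: list = field(default_factory=list)

    def gaps(self) -> list:
        """Controls not yet implemented: what remains to plan or evidence."""
        return [c.control_id for c in self.controls if not c.implemented]

# A few entries from the Access Control family; status values invented.
access_control = ControlFamily("Access Control", [
    Control("AC-1", "Access Control Policy and Procedures", implemented=True),
    Control("AC-2", "Account Management", implemented=True),
    Control("AC-3", "Access Enforcement"),
])
```

Listing the gaps per family is exactly the kind of evidence an assessor expects to see during the ATO process.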

In addition to these steps, following the US National Institute of Standards and Technology (NIST) Risk Management Framework can help your system be granted the ATO.

Read Jo Anna Bennerson’s recent Journal article:
“Navigating the US Federal Government Agency ATO Process for IT Security Professionals,” ISACA Journal, volume 2, 2017.

EU GDPR: Embracing Privacy Requirements

We are living in a digital world where a staggering number of data breaches have resulted in the theft of personal data of end users across a broad spectrum of sectors, such as financial, health care and media. The growing adoption of the cloud, mobile devices and social media has resulted in an increase in incidents related to the theft of personal data.

As organizations begin the scramble to comply with the European Union (EU) General Data Protection Regulation (GDPR), there is a dire need to understand its scope and privacy requirements. The regulation applies to all organizations that store, process or transmit any personal data related to an EU resident. The GDPR will replace Directive 95/46/EC, which has been the basis of European data protection law since it was introduced in 1995. The regulation will apply even to organizations that have no presence in the EU but are processing or accessing the personal data of EU data subjects.

There is an overwhelming number of privacy requirements that an organization has to consider to enhance its privacy management program, mitigate privacy risk and demonstrate adherence to the GDPR. The following should be considered when developing policy to comply with the GDPR’s privacy requirements:

  • Organizations must have commitment and support from leadership and a consensus to successfully implement GDPR compliance.
  • Conduct an awareness campaign so that everyone understands the seriousness and importance of the new privacy law, which will become enforceable in May 2018.
  • Resources and budget will be required to develop the complete roadmap to achieve compliance with the GDPR.
  • Noncompliance with the GDPR results in enormous fines for both the data controller and the data processor.
  • There are strict conditions for privacy notices and obtaining consent.
  • Pseudonymisation of data, which involves processing personal data without identification of the subject, is necessary.
  • Understand and implement the new privacy requirements, such as privacy by design, right to erasure, right to portability, mandatory privacy impact assessments, data breach notification and appointment of a data protection officer (DPO).
  • There are enhanced obligations for data processors.
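Pseudonymisation, mentioned above, can in principle be as simple as replacing a direct identifier with a keyed hash, with the key held separately from the data. A minimal sketch (the function name is hypothetical; real deployments must also manage key storage, rotation and access control):

```python
import hashlib
import hmac

def pseudonymise(identifier: str, secret_key: bytes) -> str:
    """Replace a direct identifier with a keyed pseudonym (HMAC-SHA256).
    The key must be stored separately from the pseudonymised data set;
    without it, the mapping back to the person cannot be recomputed."""
    return hmac.new(secret_key, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()
```

The same identifier with the same key always yields the same pseudonym, so records can still be linked for analysis without exposing the person behind them.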

It is imperative for organizations to proactively determine their current state of data protection and benchmark it against GDPR requirements to understand whether they are GDPR compliant and identify which gaps must be filled. To bring themselves in line with the GDPR, companies both inside and outside the EU will be required to consider the changes required in the way they interact with customers and transfer data. It also means organizations have to invest more in the tools and technologies required to ensure adherence to the stringent privacy requirements of the GDPR.

Tarun Verma is a senior consultant with Infosys-Information and Cyber Risk Management (iCRM) practice. He has experience in the domains of security governance, IT risk management, regulatory compliances, privacy, cyber security and cloud security. He is responsible for delivering governance, risk and compliance consulting and advisory services to Fortune 500 clients.

Fire Protection Is a Shared Responsibility
By Haris Hamidovic, Ph.D., CIA, ISMS IA
Fire protection best practices encompass all social actors (government bodies, other institutions, and all legal entities and citizens). Such inclusion is logical and necessary, considering that a fire can occur in any area. As a result, all of these actors are responsible for fire protection, and fire protection must be an integral part of their regular activities. Each entity must also have an interest in protecting its personnel and property from fire. Each entity must be aware of the causes of fire and, equally, aware that it may itself be the cause of a fire.

Proper and consistent application of the technical norms and standards for design, installation, implementation, use and maintenance of electrical and other installations and devices is intended to prevent the outbreak of fire caused by these installations and devices. In many countries there is a legal obligation for correct and consistent application of the appropriate fire protection measures provided for electrical and other installations, equipment and facilities.

The probability of fire originating in digital equipment (servers, storage units) is very low because there is little energy available to any fault and little combustible material within the equipment. But the associated risk may be significant considering IT equipment has become a vital and commonplace tool for business, industry, government and research groups. Numerous steps can be taken to avoid the risk of fire in the computer room environment. Compliance with the US National Fire Protection Association (NFPA) Standard for Fire Prevention NFPA 75 or British Standard 6266 will greatly increase the fire safety in computer rooms. These standards recommend minimum requirements for the protection of computer rooms from damage by fire and its associated effects.

Read Haris Hamidovic’s recent Journal article:
“Fire Protection of Computer Rooms—Legal Obligations and Best Practices,” ISACA Journal, volume 4, 2014.

The Challenges of Protecting Electronic Document Integrity
By Haris Hamidovic, Ph.D., CIA, ISMS IA
The increased use of technologies that allow electronic document storage and electronic communication has led lawmakers and courts in many jurisdictions around the world to consider the legal status of such information and the legal effect of that communication. The laws pertaining to electronic documents in most countries are not sector-specific. The enactment of these laws means that all organizations will have to take appropriate measures to protect document integrity while using electronic documents in the ordinary course of business. Failure to take these measures is no longer just a lack of due professional care; it constitutes a violation of legal obligations and can result in fines.
In some countries, the laws governing electronic documents make the use of electronic documents by organizations legally valid in their business transactions, both internally and with their clients, but these documents must be signed with a qualified electronic signature. Qualified electronic signatures are advanced electronic signatures that are based on a qualified certificate and created by a secure signature-creation device. Currently, public key infrastructure (PKI) technology is the sole technology able to meet the requirements of qualified electronic signatures. Although it is a mature technology that is being implemented more and more, it remains rather complex, especially when it becomes intertwined with legal requirements. Consequently, the combination of technical and legal requirements can make it difficult for both technical and legal experts to implement a legally compliant electronic document system.
But meeting the challenging specifications governments require is not enough. Unfortunately, internal auditing does not play a large enough role in ensuring electronic document integrity. As discussed in my recent Journal article, without regular internal auditing, enterprises cannot know that their defenses are sound.
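An internal audit of document integrity ultimately rests on something as simple as comparing a stored fingerprint with a freshly computed one. The sketch below shows only that hash comparison; a qualified electronic signature additionally requires a PKI-backed signing key and qualified certificate, which are beyond this toy example (function names are invented):

```python
import hashlib

def fingerprint(document: bytes) -> str:
    """SHA-256 fingerprint stored alongside (or signed over) a document."""
    return hashlib.sha256(document).hexdigest()

def integrity_intact(document: bytes, stored_fingerprint: str) -> bool:
    """True if the document still matches the fingerprint on record."""
    return fingerprint(document) == stored_fingerprint
```

A periodic audit job that re-verifies fingerprints across the archive is one concrete way internal auditing can test that the defenses are sound.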
Read Haris Hamidovic’s recent Journal article:
“Electronic Documents Information Security Compliance,” ISACA Journal, volume 3, 2014.
Value Creation Interactions' Automation Under Basel III With IT-directed IRM
Frank Bezzina, Ph.D., Pascal Lele, Ph.D., Ronald Zhao, Ph.D., Simon Grima, Ph.D., Robert W. Klein, Ph.D., and Martin Hellmich, Ph.D.

The specific objective of Basel III is to take into account the impact of operational risk management on value creation capacity, thereby allowing enterprises to anticipate and cover counterparty risk (i.e., the risk that the counterparty of a transaction fails to meet its obligations or might be incapable of meeting them before the fulfilment of the transaction).

The challenge for counterparty credit risk (CCR) entities is to schedule performance on the basis of the deposit of potentially recoverable operational risk losses (the source of cost savings) and to process productivity indicators in real time.

Through gap analysis, investor relationship management (IRM) modules provide weekly calculations of cost savings realized on each of the indicators, factors or causes at the origin of operational risk losses (absenteeism, quality defects, occupational accidents, direct productivity and skills gaps).

The efficiency of the proposed IRM system is based on the permanent link between internal control functions (finance, human resources and operations management), constituting the global structure of enterprise risk management (ERM), and the consideration of five indicators (and not only the most worrying) in the calculations of the created value and the variable remuneration.

The aim of IT-directed IRM is to feed the information system on which the internal controls of a firm rely in order to analyze financial risk with richer financial management data.

The pricing of assets is known to be a major difficulty for investors (banks, insurance firms and financial markets). In the absence of operational risk data, the prevailing prudent financial analysis model is one with weak effectiveness: it draws only on information emerging from the observation of past income statements or past stock market prices. An examination of past asset profits is useful in planning future profitability.

The utilization of expected loss data and of cost savings, bound with the CCR’s appetite for operational risk, allows financial analysts to treat the assets in line with International Financial Reporting Standards (IFRS) and US Generally Accepted Accounting Principles (GAAP)—elements on which firms depend for future economic and competitive advantages.

IT-directed IRM provides reports that enable investors to reach this objective. In particular, it supplies mathematical modelling tools (for financial and economic modelling) with data on the endogenous interaction of operational risk associated with the CCR, for the calculation of generalization ratios or for the macroeconomic projections of long-term provisions. The data provided are particularly useful for updating the risk, especially when the financial and social quality of the CCR is deteriorating.

Read Frank Bezzina, Pascal Lele, Ronald Zhao, Simon Grima, Robert W. Klein and Martin Hellmich’s recent Journal Online article:
“The Value in Using IT-directed Investor Relationship Management,” ISACA Journal, volume 6, 2013.

Planning for FISMA Governance in the Private Enterprise
By Timothy McCain, CISM
The US Federal Information Security Management Act (FISMA) differs from other regulations in that it was developed to address US federal agencies rather than private enterprise. Therefore, the certification and accreditation (C&A) bodies put in place to validate compliance and assign the authority to operate (ATO) are not typically accessible to private companies. The C&A process includes the performance of a risk assessment, gap analysis, control implementation, and security testing and evaluation of the FISMA boundary. The C&A process must be performed under the supervision of a federal agency, as it will provide the ATO, which is the document that attests compliance with the requirements of FISMA.
In recent years, government contractors and subcontractors have been held to FISMA compliance more than ever before, as federal agencies have expanded oversight to contractors outside of defense and are applying FISMA to researchers. Therefore, your organization may come under governance requirements if it is awarded a contract or receives federal funding and grants. Unfortunately, until a company obtains a contract or receives grant funding, it historically has not been given an understanding of the requirements and obligations it will be held to with regard to obtaining and maintaining an ATO.
Upon winning the contract or obtaining the grant, I recommend that the organization immediately identify and contact the contracting officer’s technical representative (COTR) for the agency funding the organization, as the COTR will be the primary contact in supporting your FISMA compliance efforts. The COTR will normally be listed within the contractual paperwork; however, the contract point of contact can direct you if it is not listed. The COTR will provide you with the classification of systems within your FISMA boundary based upon FIPS 199; this is because the contractor’s infrastructure is held to the same level of classification as the agency itself. Prior to engaging the COTR, it is advisable to define your FISMA boundary (the systems that will be used in support of the contract or grant and any interconnections to those systems); this will help the COTR better understand scoping and the impact of requirements on your company. FISMA is an organizationally based regulation, rather than data-based, which can be quite impactful to your current IT governance. However, if the COTR is brought in early, the rigor of the control application can usually be negotiated.
Thorough planning, early engagement of the COTR and identification/segmentation of your boundary, including interconnections, will help in minimizing the impact to your organization’s governance and decrease the overall C&A process time lines and project costs.
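Defining the boundary recommended above amounts to listing the systems supporting the contract and walking their interconnections, since each connected system is pulled into scope. A hypothetical sketch (class and function names invented):

```python
from dataclasses import dataclass, field

@dataclass
class System:
    name: str
    interconnections: list = field(default_factory=list)

def fisma_boundary(contract_systems: list) -> set:
    """Walk interconnections transitively to scope the FISMA boundary."""
    scope, stack = set(), list(contract_systems)
    while stack:
        current = stack.pop()
        if current.name in scope:
            continue
        scope.add(current.name)
        stack.extend(current.interconnections)
    return scope
```

Seeing the transitive scope early makes the case for segmentation: every interconnection you remove shrinks the boundary the C&A process must cover.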
Read Timothy McCain, Jacqueline Medina, Ryan Morrell, Dennis Pickett, John Lumpkin, Dina Drankus Pekelnicky, Alex Bengoa and David Songco’s recent Journal article:
“Considerations for Ensuring Security of Research Data in a Federally Regulated Environment,” ISACA Journal, volume 3, 2013.
The Value Proposition of the Smart Credit Card
By Ali Alaswad, ITIL, PMPG, PMP
The fundamental goal behind shifting efforts from securing merchants’ environments to securing the credit card itself is to allow merchants to reallocate the time and cost associated with those efforts to the improvement and expansion of the business.
Attaining Payment Card Industry (PCI) compliance is not a straightforward task: Projects implemented to achieve compliance require resources with a diversity of skill sets, and merchants’ organizations have to be assessed from different aspects, considering both IT and non-IT perspectives. Such an investment can be costly, yet the cost of a breach can easily be many times the cost of attaining PCI compliance.
The Smart Credit Card proposed in my recent Journal article eliminates fraud at the merchant level by shifting the liability to the card-issuing banks and credit card holders and by introducing a new credit card with technical features that enable card holders to generate temporary credit card numbers valid for one-time use and to confirm or decline transactions. (The total amount of credit card fraud worldwide is US $5.55 billion annually [source: www.statisticbrain.com].)
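To make the idea of temporary numbers concrete: a one-time card number still has to pass the Luhn checksum that payment systems apply to every card number. The sketch below generates a Luhn-valid 16-digit number; the issuer prefix and the generation scheme are invented for illustration and are not the actual mechanism proposed in the article:

```python
import secrets

def luhn_check_digit(partial: str) -> str:
    """Check digit that makes partial + digit pass the Luhn test."""
    digits = [int(d) for d in partial][::-1]
    total = 0
    for i, d in enumerate(digits):
        if i % 2 == 0:  # positions doubled once the check digit is appended
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return str((10 - total % 10) % 10)

def temporary_card_number(issuer_prefix: str = "411111") -> str:
    """Generate a random, Luhn-valid 16-digit number for one-time use."""
    body = issuer_prefix + "".join(secrets.choice("0123456789") for _ in range(9))
    return body + luhn_check_digit(body)
```

The issuing bank would record each generated number, honor it exactly once and then invalidate it, so a number stolen from a merchant is worthless.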
Merchants, card-issuing banks and credit card holders all benefit from this solution.
The value proposition to merchants:
  • Reallocate the PCI compliance implementation and sustainability funds to other areas in the organization in accordance with the organization’s defined strategic business objectives.
  • Enhance business reputation and increase client trust, leading to the increased use of credit cards for payment.
  • Avoid penalties and the risk of not being PCI-compliant.

The value proposition to the card-issuing bank:

  • Increase the reliance on credit cards as a method of payment.
  • Save the time and cost associated with the administrative work of fraud management related to merchants’ transactions.
The value proposition to card holders:
  • Enjoy the protection provided by the Smart Credit Card solution and the associated freedom to use credit cards, as this solution addresses different fraud types (e.g., online and phone banking fraud [card not present], counterfeit card fraud, card loss/theft).
  • Save the monetary cost and inconvenience caused by fraud.
  • Enjoy peace of mind.
Read Ali Alaswad’s recent Journal article:
“Securing Merchant Environments Is Good, Securing the Credit Card Itself Is Better,” ISACA Journal, volume 2, 2013.