The Uses for a Due Diligence Framework
By Bostjan Delak, Ph.D., CISA, CIS, and Marko Bajec, Ph.D.

Many managers, owners and shareholders ask themselves the same questions daily:

  • “Acquire and merge or do not acquire and merge?”
  • “To outsource or not to outsource?”
  • “To implement new technology or not to implement it?”

Performing thorough and effective due diligence helps reduce the associated risk and makes decision making easier, and there are several ways to do this.

From 1998 to 2008, we conducted more than 40 general IS due diligence engagements and more than 25 initial IS due diligence engagements in Central and Eastern Europe. At that time, there was a lack of IS due diligence frameworks. We studied different methodologies, approaches and standards (e.g., COBIT, ITIL, ISO/IEC 9000, ISO/IEC 27000, ISO/IEC 20000, BCM, ITADD, KnowledgeLeader) and, over the years, assembled a new framework for rapid IS due diligence (FISDD). With this framework, IS due diligence can be delivered in a reasonably short period of time. FISDD was successfully tested on several real merger and acquisition case studies in the financial industry. It can be used for different types of IS due diligence, including:

  • Initial—should be conducted prior to the merger or acquisition of any organization
  • General—used at the request of shareholders or an organization’s top management to determine the status of an important part of IS or the complete status of IS within the organization
  • Vendor—should be done before any outsourcing contract and should be repeated annually
  • Technology—performed on prospective technology investments

IS due diligence is very similar to the general IS audit process. However, due to its inherent complexity, it requires a framework for delivery. Our recent Journal article introduces the FISDD framework and provides a timeline for using it.

Read Bostjan Delak and Marko Bajec’s recent ISACA Journal article:
“Conducting IS Due Diligence in a Structured Model Within a Short Period of Time,” ISACA Journal, volume 4, 2014.

Stopping the Segregation of Duties Creep and Confusion
By Kevin Kobelsky, Ph.D., CISA, CA, CPA (Canada)
Five years ago I sat at a conference with leading practitioners and academics, watching a vendor describe a software-based internal control tool for use in enterprise resource planning (ERP) systems that generated a very large matrix of incompatible duties. It struck me as being overly complex and a reflection of experience rather than the product of profound design principles. At the break, I asked the 5 professionals and academics at my table (each of whom had 10-30 years of experience) what they thought segregation of duties (SoD) meant—I got 5 different answers.
Subsequently I polled many more academic and professional colleagues and continued to get different answers. Many cited a model segregating asset custody, recording and authorization, while others added initiation or reconciliation. I reviewed textbooks and professional publications and found a variety of models but no detailed descriptions of the justification for the segregations proposed. In fact, many of these resources provided examples that were incorrect. (I confirmed these inaccuracies with multiple colleagues.) Examples from professional sources yielded large, unwieldy matrices with little or no rationale provided. It seemed that when new IT tasks arose, the matrices would merely add another duty to be segregated from those already existing, leading to segregation creep. Some professional colleagues commented that firms were beginning to push back, presenting strong counterarguments. So much for the notion of a generally accepted model of SoD in the profession!
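To make the matrix idea concrete, here is a minimal sketch of how an incompatible-duties matrix can be checked in code. The duty names and forbidden pairs below are purely illustrative (based on the commonly cited custody/recording/authorization triad), not taken from any particular model—as the discussion above notes, practitioners disagree on exactly which duties must be segregated:

```python
from itertools import combinations

# Illustrative incompatible-duties "matrix", stored as forbidden duty pairs.
# The duties and pairs are hypothetical examples, not a normative SoD model.
INCOMPATIBLE = {
    frozenset({"asset_custody", "recording"}),
    frozenset({"recording", "authorization"}),
    frozenset({"asset_custody", "authorization"}),
}

def sod_violations(user_duties):
    """Return every forbidden duty pair held by a single user."""
    return [set(pair)
            for pair in map(frozenset, combinations(sorted(user_duties), 2))
            if pair in INCOMPATIBLE]

# Invented example assignments: Alice holds two incompatible duties.
assignments = {
    "alice": {"recording", "authorization"},
    "bob": {"asset_custody"},
}

for user, duties in assignments.items():
    for pair in sod_violations(duties):
        print(user, "holds incompatible duties:", sorted(pair))
```

Note how quickly the pair set grows as duties are added—with n duties there are n(n-1)/2 candidate pairs, which is exactly the "segregation creep" problem described above.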
My recent Journal article presents the IT-side of a general model for SoD to help reduce confusion. But because there are a variety of SoD definitions, my model may not be right for everyone. What would you change about this model to make it more applicable to you?

Read Kevin Kobelsky’s recent Journal article:
“Enhancing IT Governance With a Simplified Approach to Segregation of Duties,” ISACA Journal, volume 4, 2014.
Ethical Hacking and Its Value to Security
By Viktor Polic, Ph.D., CISA, CRISC, CISSP
A false sense of security puts information in danger. It results from a lack of risk awareness and of historical records of information security incidents. The fact that there is no indication of an incident does not mean that systems have not been attacked; businesses may simply not be aware of it yet. Many reports on information security breaches show that they are discovered months and sometimes even years after the fact. One such example was Operation Shady RAT in 2011. In its annual Global Risk Index report, Lloyd’s has upgraded cyberrisk to position 3 from position 12. Despite the growing threat of cyberrisk, many businesses still believe they are able to deal with the risk.

So how can organizations measure information security risk effectively? COBIT 5 recommends aligning IT risk management with the organization’s enterprise risk management framework. However, there is no accurate method for quantifying information security risk. The recently discovered vulnerability in OpenSSL cryptographic library (CVE-2014-0160, or more popularly called Heartbleed) illustrates a serious deficiency in the secure software development life cycle of that popular open source library. What would have been the accurate probability estimate of such a risk prior to the vulnerability disclosure?

There are many standards for measuring the resistance and robustness of security devices, e.g., burglar-resistant doors, car alarm systems, electromagnetic shielding, fire protection. Those measurements could be used to help quantify safety and security risk. However, there are no defined standards for quantifying information security risk. Nevertheless, there are companies that focus their skills on verifying the resistance of information systems to known hacking techniques and tools. These companies are known as ethical hackers or penetration testers.

Unfortunately, such highly skilled manual work comes at a cost that many businesses cannot afford. A novel approach, discussed further in my recent Journal article, is to combine automated vulnerability assessment scanners with the manual work of ethical hackers through a front-end web-based application offered as a Software as a Service (SaaS) solution. The objective is to bring down the cost of such a security audit while increasing the accuracy of risk estimates.

Read Viktor Polic’s recent Journal article:
“Ethical Hacking: The Next Level or the Game Is Not Over?,” ISACA Journal, volume 4, 2014.

How IT Governance Can Spur Innovation With the Right Metrics
By Yo Delmar, CISM, CGEIT
Last year, the Harvard Business Review published a blog post that claimed, “IT governance is killing innovation.” Since then, I have talked to many business executives—both from IT and the business side—on this very subject. Most of them attest to the mounting importance of IT governance, especially in today’s world where technology has become increasingly pervasive across all business activities.

Some of the questions that popped up in our conversations included: Can IT governance programs do more than just manage IT operations and performance? Can IT collaborate better with the business to drive innovation? Can IT governance play a more transformative role in the organization?

I believe the answer to all 3 questions is yes. Robust IT governance programs that provide real insights can actually facilitate business innovation and growth. The keywords are “meaningful metrics and analytics.” When metrics are thoughtfully developed to closely align with both business objectives and the analytics framework, they enable an organization to fine-tune its strategies and to optimize resources toward maximizing its competitive advantage.

IT metrics act as building blocks of a larger business analytics program, which can help organizations make more informed decisions when it comes to opportunities to drive business performance and innovation. Most importantly, IT analytics empower the organization with the strategic, practical and operational insights it needs to invest in IT projects that have the most transformative power.

Here are a few things to consider while developing effective IT metrics and analytics for your IT governance program:

  • Ensure that your IT metrics and analytics are defined and directed by enterprise goals, not the other way around.
  • Partner with the business to create metrics around emerging technologies, like social media, that can boost brand performance.
  • Define metrics that can quickly adapt to a dynamic business and technological environment.
  • Choose metrics that are relevant across multiple initiatives, such as IT security, business continuity, disaster recovery, crisis management and asset protection.
  • Ensure that people in the organization know what is being measured, how and why.
  • Do not get locked into a static set of metrics or analytics that no longer measure what matters—constantly reevaluate them and their relevance to changing business goals.

With the right set of IT metrics and the resulting analytics framework, closely aligned with business strategy and performance objectives, IT departments can become centers of innovation and competitive advantage.

Read Yo Delmar’s recent Journal article:
“Leveraging Metrics for Business Innovation,” ISACA Journal, volume 4, 2014.

Considering Cloud Services? Walk Before You Run
By Tim Myers
Many companies rely on threadbare IT resources or external advisors to guide them in making technology decisions and are understandably wary when considering new options, especially the multitude of software as a service (SaaS) features now available to them in the cloud.

Companies that are considering cloud-based services for the first time or that have made only marginal forays into the use of public or private data centers should walk before they run. As any battle-scarred veteran of the business world knows, new programs or projects stand a much better chance of success—and widespread acceptance—if they are approached in a methodical manner.

Rather than flying into the cloud headfirst, the prudent choice may be to take a more modular approach. With advice from IT leaders and any outside experts, companies can test out the cloud by piloting it first with 1 cloud-ready enterprise function, like accounting, email services or data backup. This way, organizations will quickly learn what works, what needs tweaking and whether or not the cloud is proving to be beneficial from a return on investment perspective.

Importantly, while assessing this initial foray into the cloud, the rest of the business will run as usual. Thus, any problems or delays can be ironed out without disrupting the rest of the enterprise.

If the cloud is living up to its billing, organizations should be ready to add on additional cloud-ready functions and applications and enjoy further cost, productivity and security benefits. And given that 87 percent of cloud users surveyed recently would recommend the cloud to a peer or colleague, the likelihood of satisfaction is high.

Read Tim Myers’ recent Journal article:
“Trial by Fire in Cloud Development Pays Dividends,” ISACA Journal, volume 4, 2014.
Fire Protection Is a Shared Responsibility
By Haris Hamidovic, Ph.D., CIA, ISMS IA
Fire protection best practices encompass all social actors (government bodies, other institutions, and all legal entities and citizens). Such inclusion is logical and necessary, considering that a fire can occur in any area. As a result, all of these social subjects are made responsible for fire protection, and fire protection must be an integral part of their regular activities. Each entity also has an interest in protecting its personnel and property from fire. Each entity must be aware, first, of the causes of fire and, second, that it may itself be the cause of a fire.

Proper and consistent application of the technical norms and standards for the design, installation, implementation, use and maintenance of electrical and other installations and devices is intended to prevent the outbreak of fire caused by these installations and devices. In many countries there is a legal obligation for correct and consistent application of the fire protection measures prescribed for electrical and other installations, equipment and facilities.

The probability of fire originating in digital equipment (servers, storage units) is very low because there is little energy available to any fault and little combustible material within the equipment. But the associated risk may still be significant, considering that IT equipment has become a vital and commonplace tool for business, industry, government and research groups. Numerous steps can be taken to reduce the risk of fire in the computer room environment. Compliance with the US National Fire Protection Association (NFPA) Standard for Fire Prevention NFPA 75 or British Standard 6266 will greatly increase fire safety in computer rooms. These standards recommend minimum requirements for the protection of computer rooms from damage by fire and its associated effects.

Read Haris Hamidovic’s recent Journal article:
“Fire Protection of Computer Rooms—Legal Obligations and Best Practices,” ISACA Journal, volume 4, 2014.

The Intersection of Framework and Legislation
By Fatih Altinel, CISA, and Yeliz Kilinc

Industrial and professional organizations have a great need to standardize their practices. ISACA® and COBIT® have a leading role in spreading and widening acceptance of IT governance, IT risk and IT control concepts. But at the same time, sectoral authorities are the most important agents for country-wide acceptance of these types of frameworks. In this context, the Banking Regulation and Supervision Agency of Turkey (BRSA) has the leading role in the adoption of IT governance, IT risk and IT control concepts by banks.

Following the Imar Bank case of 2000, an IT-oriented banking fraud case, and the Turkish retailer Gima case, in which a massive amount of credit card information was stolen, IT audits have been performed in Turkish banks since 2006.

BRSA, the sole Turkish regulatory authority for the banking industry, credit card systems and other payment systems, is guided partially by COBIT and predominantly by local legislation, which has been prepared in parallel with other international frameworks. BRSA recommends the use of COBIT in internal audit and external audit activities performed in banks. Our recent Journal article discusses the similarities between COBIT 5, local legislation and Turkish banking regulations.

Read Fatih Altinel and Yeliz Kilinc’s recent JournalOnline article:
“Similarities Between Banking Regulations of Turkey Made by BRSA and COBIT 5 Governance Area,” ISACA Journal, volume 3, 2014.
Data Privacy—Essential for Corporate Social Responsibility
By Horace McPherson, CISA, CISM, CGEIT, CRISC, CISSP, PMP
Data privacy is more than just a compliance or business issue. People become vulnerable whenever they turn over their personal information to companies. Companies, regardless of industry, owe it to their customers or subscribers to protect that personal information as if they were protecting people’s most precious possessions.

I see what happens to people when they are notified that a company holding their personal information has been breached: anxiety sets in, people have sleepless nights and they sometimes even become pessimistic about the future. Victims of identity theft sometimes feel alone since, in most cases, the burden is on them to prove that they are not responsible for the results of any nefarious actions performed by an identity thief.

In my opinion, personal information is worth more than the numbers on a balance sheet or income statement. In the area of corporate social responsibility (CSR), organizations must be concerned with what is called the triple bottom line. Elements of the triple bottom line include social, environmental and economic factors. Protecting customers’ information is aligned with the social and economic aspects of the triple bottom line, 2 of the essential elements of CSR. If companies do not properly protect personal information, they are not being good corporate citizens. Once sensitive information is collected, there is an expectation of due diligence and due care in the application of data protection policies and mechanisms.

At the end of the day, a company’s approach to data privacy and protection depends on the moral outlook of the company’s leaders. The ethical perspective of the top management team determines whether a company will be proactive and a leader in setting and supporting privacy protection policies and whether privacy protection is put ahead of profits. The tone at the top is very important. Let us hope that the tone is a good and fair one.

Read Horace McPherson’s recent JournalOnline article:
“Data Privacy—Protecting This Asset Is a Priority,” ISACA Journal, volume 3, 2014.
Lessons Learned From Pilot Projects Using Audit-focused Mining
By Martin Schultz, CISA, CIA, Alexander Ruehle, CISA, CIA, and Nick Gehrke, Ph.D., CISA
Our recent Journal article discusses the ways auditors can use IT resources to make process audits more efficient. Although not a substitute for human auditors, these automated processes can revolutionize the way audits are executed. We have applied audit-focused mining in several pilot projects for diverse companies. Three major benefits have been experienced by using an audit-focused mining approach for these audit assignments:
  1. The scoping of critical processes and transactions improved significantly. Based on a list of significant financial statement accounts, the complete as-is processes are automatically derived. As a result of audit-focused mining, no time-consuming walkthroughs are needed, and no discussions are held on which process variants exist and which need to be considered for the audit. The automatically generated process models are accepted as the single point of truth, i.e., every data element is stored in only one place. Although manually documented process models are available, the whole project team relies solely on the automatically generated ones. Furthermore, the internal audit department provides process transparency, which was perceived as value added by the process owners and management.
  2. The fieldwork starts earlier and is more focused. With the process-mining-based approach, a comprehensive overview of the auditee’s processes is easily gained. Accordingly, in the early stages of the pilot projects, the focus (in terms of time and budget) shifted from process-understanding tasks to process-auditing tasks. Process flows that deviate from the defined standard processes are immediately identified. With a drill-down to the related financial documents, suspicious business transactions can be investigated in detail, along with the responsible process owner or involved employees. During the pilot projects, the sample-based selection of documents was omitted, although it was part of the initial project plan. After having the complete process models available, all project members agreed that blind sampling is of less value for the effectiveness of the audit project.
  3. New audit analyses are enabled. With this approach, single financial documents are connected to complete end-to-end business transactions. This integrated view enables new analyses that are of high value for a process audit and were not possible by looking solely at single documents. For instance, during the pilot projects, the segregation-of-duties (SoD) analyses of enacted business transactions in particular revealed several severe audit findings (e.g., invoice and payment posting by the same user, along with changes to the bank account of the corresponding vendor). With the help of a standardized SoD matrix, the business transactions with the most critical combinations could be easily identified and investigated. The major difference between these findings and a common SoD analysis based on granted system access rights is that they do not just constitute a potential risk of misuse; in these cases, far too extensive access rights were actually exploited. Accordingly, within the pilot projects, these findings were used to set up projects for reworking the access rights management of the enterprise resource planning (ERP) systems.
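As a much-simplified sketch of the SoD check on enacted transactions described in point 3 (the field names, user IDs and posting records below are invented for illustration, not an actual ERP extract), one can group postings by end-to-end business transaction and flag cases where a single user performed two incompatible steps:

```python
from collections import defaultdict

# Invented example posting records: each row is one step of an
# end-to-end business transaction, with the user who performed it.
postings = [
    {"txn": "BT-1", "step": "invoice_posting", "user": "u17"},
    {"txn": "BT-1", "step": "payment_posting", "user": "u17"},  # same user: conflict
    {"txn": "BT-2", "step": "invoice_posting", "user": "u05"},
    {"txn": "BT-2", "step": "payment_posting", "user": "u31"},  # different users: clean
]

def sod_conflicts(postings, incompatible=("invoice_posting", "payment_posting")):
    """Return transaction IDs in which one user performed both incompatible steps."""
    by_txn = defaultdict(lambda: defaultdict(set))
    for p in postings:
        by_txn[p["txn"]][p["user"]].add(p["step"])
    return sorted(txn for txn, users in by_txn.items()
                  for steps in users.values()
                  if set(incompatible) <= steps)

print(sod_conflicts(postings))  # BT-1 is flagged, BT-2 is not
```

Unlike an access-rights analysis, which flags what a user *could* do, this check operates on what was actually *done*—matching the distinction the authors draw above.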
Against this background, the next step in the development road map is a type of intelligent dragnet investigation, which applies several independent analytics and measures to the end-to-end business transactions and calculates an aggregated risk score for every business transaction. By doing so, high-risk transactions can be identified right at the beginning of an audit, false positives can be reduced, and business transactions that put financial statements at risk can be singled out.
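The aggregated risk score could, for example, be sketched as a weighted combination of independent per-transaction indicator scores. The indicator names, weights and scores below are purely illustrative assumptions, not the authors' actual analytics:

```python
# Hypothetical sketch: several independent analytics each score a business
# transaction in [0, 1]; a weighted sum ranks transactions so that
# high-risk ones surface first. Weights are illustrative and sum to 1.
WEIGHTS = {"sod_conflict": 0.5, "round_amount": 0.2, "weekend_posting": 0.3}

def risk_score(indicators):
    """Aggregate per-indicator scores into a single weighted risk score."""
    return sum(WEIGHTS[name] * score for name, score in indicators.items())

# Invented indicator scores for two business transactions.
transactions = {
    "BT-1": {"sod_conflict": 1.0, "round_amount": 1.0, "weekend_posting": 0.0},
    "BT-2": {"sod_conflict": 0.0, "round_amount": 1.0, "weekend_posting": 1.0},
}

ranked = sorted(transactions, key=lambda t: risk_score(transactions[t]), reverse=True)
print(ranked)  # highest-risk transaction first
```

Because each analytic contributes only part of the score, a transaction flagged by several weak signals can outrank one flagged by a single strong signal, which is what helps suppress false positives.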

Read Martin Schultz, Alexander Ruehle and Nick Gehrke’s recent Journal article:
“Audit-focused Mining—New Views on Integrating Process Mining and Internal Control,” ISACA Journal, volume 3, 2014.
The Pitfalls, Perils and Pleasures of Big Data
By Muzamil Riffat, CISA, CRISC, CISSP, PMP, GWAPT
The whole world suddenly seems to be obsessed with and immersed in big data talk. The term “big data” refers to the deluge of information that is generated through the myriad digital devices around us. Basically, every step we take and every move we make generates some sort of data and leaves a digital footprint behind that can be aggregated or analyzed for various purposes. The amount of data humanity is generating is so gargantuan that exact quantification is practically impossible. It is estimated that if the data generated in the year 2011 were converted into a high-definition movie, it would take 1 person more than 47 million years to watch it.

Advocates of big data proclaim that the advanced computational power available today and the superabundance of data are a match made in heaven that could be used to address all of humanity’s ills. These claims are substantiated by success stories in different areas (e.g., retail, logistics, healthcare, customer management).

However, the combination of the availability of data and the initial success of information gleaned from it has also resulted in some dubious projects. The conclusions drawn from these projects are equally questionable. Most importantly, the conclusions are often victims of a post hoc, ergo propter hoc fallacy (a logical fallacy in which conclusions are derived entirely from the chronology of events rather than by taking all related factors into consideration): correlation is confused with causation. Let us illustrate this with an example: By analyzing comprehensive data about students’ activities at a particular university, the conclusion could be drawn that students who wash their hands more than a specific number of times tend to score better on examinations. It is obvious that washing hands cannot cause better exam results (although the 2 events might be correlated). This distinction between correlation and causation is often overlooked in more subtle business cases. The reliability and quality of the source data are also a major concern: big data analytics have the potential to amplify errors present in the source data, so that overall conclusions rest on a faulty data set.
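The hand-washing example can be reproduced numerically. In the toy simulation below (all numbers invented), a hidden confounder—call it "conscientiousness"—drives both hand-washing frequency and exam scores, so the two correlate strongly even though neither causes the other:

```python
import random
import statistics

random.seed(42)

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (statistics.pstdev(xs) * statistics.pstdev(ys) * len(xs))

# Hidden confounder: each student's "conscientiousness" in [0, 1].
conscientiousness = [random.random() for _ in range(500)]

# Both observed variables depend on the confounder, plus independent noise;
# neither variable influences the other directly.
hand_washes = [10 * c + random.gauss(0, 0.5) for c in conscientiousness]
exam_scores = [60 + 30 * c + random.gauss(0, 2) for c in conscientiousness]

# Strong correlation, zero causation.
print(round(pearson(hand_washes, exam_scores), 2))
```

The correlation comes out close to 1, yet forcing students to wash their hands more would change nothing about their exam scores—only intervening on the confounder would.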

Organizations embarking upon the big data journey should not only consider the success stories, which get publicized and hyped, but also be mindful of the fact that a few big data projects have either failed spectacularly or provided spurious conclusions.

There is no denying the fact that the volume of data generated and captured today is soaring at an unprecedented pace, which has the potential to transform our lives. There is also no denying that without proper planning and objective setting, organizations embarking upon big data projects will be shooting in the dark and unrealistically hoping to hit the bull’s eye.

Read Muzamil Riffat’s recent Journal article:
“Big Data—Not a Panacea,” ISACA Journal, volume 3, 2014.