Ethical Hacking and Its Value to Security
By Viktor Polic, Ph.D., CISA, CRISC, CISSP
 
A false sense of security puts information in danger. It results from a lack of risk perception and of historical records of information security incidents. The absence of any indication of an incident does not mean that systems have not been attacked; businesses may simply not be aware of it yet. Many reports on information security breaches show that breaches are discovered months, and sometimes even years, after the fact. One such example was Operation Shady RAT in 2011. In its annual Global Risk Index report, Lloyd’s upgraded cyberrisk to position 3 from position 12. Despite the growing threat of cyberrisk, many businesses still believe they are able to deal with the risk.


So how can organizations measure information security risk effectively? COBIT 5 recommends aligning IT risk management with the organization’s enterprise risk management framework. However, there is no accurate method for quantifying information security risk. The recently discovered vulnerability in the OpenSSL cryptographic library (CVE-2014-0160, popularly known as Heartbleed) illustrates a serious deficiency in the secure software development life cycle of that popular open source library. What would have been an accurate probability estimate for such a risk prior to the vulnerability disclosure?
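For readers unfamiliar with the bug, the following is a deliberately simplified Python sketch of the class of flaw behind Heartbleed; the real vulnerability was in OpenSSL’s C implementation of the TLS heartbeat extension, and the names and data here are illustrative only.

```python
# Simplified illustration of the Heartbleed (CVE-2014-0160) flaw class.
# Pretend this is process memory: a 7-byte payload followed by secrets.
memory = b"payload" + b"SECRET_KEY_MATERIAL_AND_SESSION_DATA"

def heartbeat_vulnerable(claimed_length: int) -> bytes:
    """Echoes back as many bytes as the client *claims* it sent,
    trusting the attacker-controlled length field (a buffer over-read)."""
    return memory[:claimed_length]

def heartbeat_fixed(payload: bytes, claimed_length: int) -> bytes:
    """The fix: discard requests whose claimed length exceeds the payload."""
    if claimed_length > len(payload):
        raise ValueError("heartbeat length exceeds payload")
    return payload[:claimed_length]

print(heartbeat_vulnerable(43))  # leaks bytes far beyond the 7-byte payload
```

No pre-disclosure risk model would likely have assigned a meaningful probability to a missing bounds check in a library deployed on a large share of the world’s web servers, which is precisely the estimation problem raised above.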

There are many standards for measuring the resistance and robustness of security devices, e.g., burglar-resistant doors, car alarm systems, electromagnetic shielding and fire protection. Such measurements could be used to help quantify safety and security risk. However, there are no comparable standards for quantifying information security risk. Nevertheless, there are companies that focus their skills on verifying the resistance of information systems against known hacking techniques and tools. These companies are known as ethical hackers or penetration testers.

Unfortunately, such highly skilled manual work comes at a cost that many businesses cannot afford. A novel approach, discussed further in my recent Journal article, is to combine automated vulnerability assessment scanners with the manual work of ethical hackers through a front-end web-based application offered as a Software as a Service (SaaS) solution. The objective is to bring down the cost of such a security audit while increasing the accuracy of risk estimates.

Read Viktor Polic’s recent Journal article:
“Ethical Hacking: The Next Level or the Game Is Not Over?,” ISACA Journal, volume 4, 2014.

How IT Governance Can Spur Innovation With the Right Metrics
By Yo Delmar, CISM, CGEIT
 
Last year, the Harvard Business Review published a blog post that claimed, “IT governance is killing innovation.” Since then, I have talked to many business executives—both from IT and the business side—on this very subject. Most of them attest to the mounting importance of IT governance, especially in today’s world where technology has become increasingly pervasive across all business activities.


Some of the questions that popped up in our conversations included: Can IT governance programs do more than just manage IT operations and performance? Can IT collaborate better with the business to drive innovation? Can IT governance play a more transformative role in the organization?

I believe the answer to all 3 questions is yes. Robust IT governance programs that provide real insights can actually facilitate business innovation and growth. The keywords are “meaningful metrics and analytics.” When metrics are thoughtfully developed to closely align with both business objectives and the analytics framework, they enable an organization to fine-tune its strategies and to optimize resources toward maximizing its competitive advantage.

IT metrics act as building blocks of a larger business analytics program, which can help organizations make more informed decisions when it comes to opportunities to drive business performance and innovation. Most importantly, IT analytics empower the organization with the strategic, practical and operational insights it needs to invest in IT projects that have the most transformative power.

Here are a few things to consider while developing effective IT metrics and analytics for your IT governance program (a brief illustrative sketch follows the list):

  • Ensure that your IT metrics and analytics are defined and directed by enterprise goals, not the other way around.
  • Partner with the business to create metrics around emerging technologies, like social media, that can boost brand performance.
  • Define metrics that can quickly adapt to a dynamic business and technological environment.
  • Choose metrics that are relevant across multiple initiatives, such as IT security, business continuity, disaster recovery, crisis management and asset protection.
  • Ensure that people in the organization know what is being measured, how and why.
  • Do not get locked into a static set of metrics or analytics that no longer measure what matters—constantly reevaluate them and their relevance to changing business goals.
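As a loose illustration of these points, here is one hypothetical way to encode a metric so that it is directed by an enterprise goal, relevant across initiatives and forced to be periodically reevaluated. The structure and field names are my own, not a prescribed ISACA or COBIT format.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ITMetric:
    name: str
    enterprise_goal: str   # the goal that defines and directs the metric
    initiatives: list      # relevance across multiple initiatives
    next_review: date      # a date that forces periodic reevaluation

metric = ITMetric(
    name="Mean time to recover critical services",
    enterprise_goal="Continuity of customer-facing operations",
    initiatives=["business continuity", "disaster recovery", "crisis management"],
    next_review=date(2015, 1, 1),
)
```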

With the right set of IT metrics and the resulting analytics framework, closely aligned with business strategy and performance objectives, IT departments can become centers of innovation and competitive advantage.

Read Yo Delmar’s recent Journal article:
“Leveraging Metrics for Business Innovation,” ISACA Journal, volume 4, 2014.

Considering Cloud Services? Walk Before You Run
By Tim Myers
 
Many companies rely on threadbare IT resources or external advisors to guide them in making technology decisions and are understandably wary when considering new options, especially the multitude of software as a service (SaaS) features now available to them in the cloud.

Companies that are considering cloud-based services for the first time or that have made only marginal forays into the use of public or private data centers should walk before they run. As any battle-scarred veteran of the business world knows, new programs or projects stand a much better chance of success—and widespread acceptance—if they are approached in a methodical manner.

Rather than flying into the cloud headfirst, the prudent choice may be to take a more modular approach. With advice from IT leaders and any outside experts, companies can test out the cloud by piloting it first with 1 cloud-ready enterprise function, like accounting, email services or data backup. This way, organizations will quickly learn what works, what needs tweaking and whether or not the cloud is proving to be beneficial from a return on investment perspective.

Importantly, while assessing this initial foray into the cloud, the rest of the business will run as usual. Thus, any problems or delays can be ironed out without disrupting the rest of the enterprise.

If the cloud is living up to its billing, organizations should be ready to add on additional cloud-ready functions and applications and enjoy further cost, productivity and security benefits. And given that 87 percent of cloud users surveyed recently would recommend the cloud to a peer or colleague, the likelihood of satisfaction is high.

Read Tim Myers’ recent Journal article:
“Trial by Fire in Cloud Development Pays Dividends,” ISACA Journal, volume 4, 2014.

Fire Protection Is a Shared Responsibility
By Haris Hamidovic, Ph.D., CIA, ISMS IA
 
Fire protection best practices encompass all social actors: government bodies, other institutions, and all legal entities and citizens. Such inclusion is logical and necessary, considering that a fire can occur in any area. As a result, all of these actors are responsible for fire protection, and fire protection must be an integral part of their regular activities. Each entity has an interest in protecting its personnel and property from fire. Each entity must also be aware of the causes of fire and, just as importantly, that it may itself be the cause of a fire.


Proper and consistent application of the technical norms and standards for the design, installation, implementation, use and maintenance of electrical and other installations and devices is intended to prevent the outbreak of fires caused by those installations and devices. In many countries, there is a legal obligation to correctly and consistently apply the appropriate fire protection measures provided for electrical and other installations, equipment and facilities.

The probability of fire originating in digital equipment (servers, storage units) is very low, because little energy is available to any fault and there is little combustible material within the equipment. But the associated risk may be significant, considering that IT equipment has become a vital and commonplace tool for business, industry, government and research groups. Numerous steps can be taken to reduce the risk of fire in the computer room environment. Compliance with the US National Fire Protection Association standard NFPA 75 or British Standard 6266 will greatly increase fire safety in computer rooms. These standards recommend minimum requirements for the protection of computer rooms from damage by fire and its associated effects.

Read Haris Hamidovic’s recent Journal article:
“Fire Protection of Computer Rooms—Legal Obligations and Best Practices,” ISACA Journal, volume 4, 2014.

The Intersection of Framework and Legislation
By Fatih Altinel, CISA, and Yeliz Kilinc

Industrial and professional organizations have a great need to standardize practices. ISACA® and its COBIT® framework have a leading role in spreading and widening acceptance of IT governance, IT risk and IT control concepts. At the same time, sectoral authorities are the most important agents for country-wide acceptance of such frameworks. In this context, the Banking Regulation and Supervision Agency of Turkey (BRSA) has the leading role in the adoption of IT governance, IT risk and IT control concepts by banks.

Prompted by the Imar Bank case of 2000, an IT-oriented banking fraud case, and the case of the Turkish retailer Gima, in which a massive amount of credit card information was stolen, IT audits have been performed in Turkish banks since 2006.

BRSA, the sole Turkish regulatory authority for the banking industry, credit card systems and other payment systems, is guided partially by COBIT and predominantly by local legislation, which has been prepared in parallel with other international frameworks. BRSA recommends the use of COBIT in internal and external audit activities performed in banks. Our recent Journal article discusses the similarities between COBIT 5 and the local legislation governing Turkish banks.

Read Fatih Altinel and Yeliz Kilinc’s recent JournalOnline article:
“Similarities Between Banking Regulations of Turkey Made by BRSA and COBIT 5 Governance Area,” ISACA Journal, volume 3, 2014.

Data Privacy—Essential for Corporate Social Responsibility
By Horace McPherson, CISA, CISM, CGEIT, CRISC, CISSP, PMP
 
Data privacy is more than just a compliance or business issue. People become vulnerable whenever they turn over their personal information to companies. Companies, regardless of industry, owe it to their customers or subscribers to protect that personal information as if it were people’s most precious possession.

I see what happens to people when they are notified that a company holding their personal information has been breached: anxiety sets in, people have sleepless nights and they sometimes even become pessimistic about the future. Victims of identity theft often feel alone because, in most cases, the burden is on them to prove that they are not responsible for the results of any nefarious actions performed by an identity thief.

In my opinion, personal information is worth more than the numbers on a balance sheet or income statement. In the area of corporate social responsibility (CSR), organizations must be concerned with what is called the triple bottom line. Elements of the triple bottom line include social, environmental and economic factors. Protecting customers’ information is aligned with the social and economic aspects of the triple bottom line, 2 of the essential elements of CSR. If companies do not properly protect personal information, they are not being good corporate citizens. Once sensitive information is collected, there is an expectation of due diligence and due care in the application of data protection policies and mechanisms.

At the end of the day, a company’s approach to data privacy and protection depends on the moral outlook of the company’s leaders. The ethical perspective of the top management team determines whether a company will be proactive and a leader in setting and supporting privacy protection policies and whether privacy protection is put ahead of profits. The tone at the top is very important. Let us hope that the tone is a good and fair one.

Read Horace McPherson’s recent JournalOnline article:
“Data Privacy—Protecting This Asset Is a Priority,” ISACA Journal, volume 3, 2014.

Lessons Learned From Pilot Projects Using Audit-focused Mining
By Martin Schultz, CISA, CIA, Alexander Ruehle, CISA, CIA, and Nick Gehrke, Ph.D., CISA
 
Our recent Journal article discusses the ways auditors can use IT resources to make process audits more efficient. Although not a substitute for human auditors, these automated processes can revolutionize the way audits are executed. We have applied audit-focused mining in several pilot projects for diverse companies. Three major benefits were experienced using an audit-focused mining approach for these audit assignments:
  1. The scoping of critical processes and transactions improved significantly. Based on a list of significant financial statement accounts, the complete as-is processes are automatically derived. As a result of audit-focused mining, no time-consuming walkthroughs are needed, and no discussions are required about which process variants exist and which ones need to be considered for the audit. The automatically generated process models are accepted as the single point of truth—the practice of structuring information models so that every data element is stored only once. Although manually documented process models were available, the whole project team relied solely on the automatically generated ones. Furthermore, the internal audit department provides process transparency, which was perceived as added value by the process owners and management.
  2. The fieldwork starts earlier and is more focused. With the process-mining-based approach, a comprehensive overview of the auditee’s processes is easily gained. Accordingly, in the early stages of the pilot projects, the focus (in terms of time and budget) shifted from process-understanding tasks to process-auditing tasks. Process flows that deviate from defined standard processes are immediately identified. With a drill-down to the related financial documents, suspicious business transactions can be investigated in detail, along with the responsible process owner or involved employees. During the pilot projects, sample-based selection of documents was omitted, although it had been part of the initial project plan. After having the complete process models available, all project members agreed that blind sampling is of less value to the effectiveness of the audit project.
  3. New audit analyses are enabled. With this approach, single financial documents are connected into complete end-to-end business transactions. This integrated view enables new analyses that are of high value for a process audit and were not possible by looking solely at single documents. For instance, during the pilot projects, the segregation-of-duties (SoD) analyses of enacted business transactions in particular revealed several severe audit findings (e.g., invoice and payment posting by the same user along with changes to the bank account of the corresponding vendor). With the help of a standardized SoD matrix, the business transactions with the most critical combinations could be easily identified and investigated. The major difference between these findings and those of common SoD analyses based on granted system access rights is that they do not merely indicate a potential risk of misuse; in these cases, overly extensive access rights had actually been exploited. Accordingly, within the pilot projects, these findings were used to set up projects for reworking the access rights management of the enterprise resource planning (ERP) systems.
Against this background, the next step on the development road map is a kind of intelligent dragnet investigation that applies several independent analytics and measures to the end-to-end business transactions and calculates an aggregated risk score for every business transaction. By doing so, high-risk transactions can be identified right at the beginning of an audit, false positives can be reduced, and business transactions that put financial statements at risk can be singled out.
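As a rough sketch of what such a dragnet might look like, the Python below aggregates independent analytics into a per-transaction risk score. The SoD check mirrors the finding described in point 3, but the weights, field names and analytics are hypothetical, not the authors’ implementation.

```python
from dataclasses import dataclass
from typing import Callable, Dict, Optional

@dataclass
class Transaction:
    tx_id: str
    users: Dict[str, str]                         # process step -> executing user
    vendor_bank_changed_by: Optional[str] = None

def sod_violation(tx: Transaction) -> float:
    """The SoD pattern from point 3: invoice and payment posted by the
    same user, combined with a change to the vendor's bank account."""
    same_user = (tx.users.get("post_invoice") is not None
                 and tx.users.get("post_invoice") == tx.users.get("post_payment"))
    bank_change = tx.vendor_bank_changed_by == tx.users.get("post_invoice")
    return 1.0 if same_user and bank_change else 0.0

# Further hypothetical analytics (deviation from the mined standard process,
# unusual amounts, postings outside business hours, ...) would be added here.
ANALYTICS: Dict[Callable[[Transaction], float], float] = {sod_violation: 0.6}

def risk_score(tx: Transaction) -> float:
    """Weighted aggregation of independent analytics into one score."""
    return sum(weight * check(tx) for check, weight in ANALYTICS.items())

tx = Transaction(
    tx_id="4711",
    users={"post_invoice": "jdoe", "post_payment": "jdoe"},
    vendor_bank_changed_by="jdoe",
)
print(risk_score(tx))  # 0.6 -> investigate this transaction first
```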

Read Martin Schultz, Alexander Ruehle and Nick Gehrke’s recent Journal article:
“Audit-focused Mining—New Views on Integrating Process Mining and Internal Control,” ISACA Journal, volume 3, 2014.

The Pitfalls, Perils and Pleasures of Big Data
By Muzamil Riffat, CISA, CRISC, CISSP, PMP, GWAPT
 
The whole world suddenly seems to be obsessed with and immersed in big data talk. The term “big data” refers to the deluge of information generated through the myriad digital devices around us. Basically, every step we take and every move we make generates some sort of data and leaves behind a digital footprint that can be aggregated and analyzed for various purposes. The amount of data humanity is generating is so gargantuan that exact quantification is all but impossible. It is estimated that if the data generated in the year 2011 were converted into a high-definition movie, it would take 1 person more than 47 million years to watch it.
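That figure is hard to picture, but a back-of-the-envelope check makes it plausible. Both inputs below are assumptions of mine, not the article’s: IDC’s widely cited estimate of roughly 1.8 zettabytes of data created in 2011, and HD video at about 10 Mbit/s.

```python
data_2011_bytes = 1.8e21              # ~1.8 ZB created in 2011 (assumed)
hd_bytes_per_hour = 10e6 / 8 * 3600   # ~4.5 GB per viewing hour (assumed)

hours_to_watch = data_2011_bytes / hd_bytes_per_hour
years_to_watch = hours_to_watch / (24 * 365)
print(f"about {years_to_watch / 1e6:.0f} million years")  # ~46 million years
```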

Advocates of big data proclaim that the advanced computational power available today and superabundance of data is a match made in heaven, which could be used to address all of humanity’s ills. These claims are substantiated by success stories that have been achieved in different areas (e.g., retail, logistics, healthcare, customer management).

However, the combination of abundant data and the initial successes gleaned from it has also resulted in some dubious projects. The conclusions drawn from these projects are equally questionable. Most important, the conclusions often fall victim to the post hoc, ergo propter hoc fallacy (a logical fallacy in which conclusions are derived entirely from the chronology of events rather than by taking all of the related factors into consideration): Correlation is confused with causation. Let us illustrate this with an example: By analyzing comprehensive data about students’ activities at a particular university, the conclusion could be drawn that students who wash their hands more than a specific number of times tend to score better on examinations. It is obvious that washing hands cannot cause better exam results (although these 2 events might be correlated). This distinction between correlation and causation is often overlooked in more subtle business cases. The reliability and quality of the source data are also a major concern: Big data analytics have the potential to amplify errors present in the source data, so that the overall conclusions rest on a faulty data set.
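The handwashing example is easy to reproduce. The toy simulation below, with entirely made-up numbers, plants a hidden “conscientiousness” trait that drives both behaviors; the resulting correlation is strong even though neither variable causes the other.

```python
import random

random.seed(1)
pairs = []
for _ in range(1000):
    trait = random.gauss(0, 1)                        # hidden confounder
    handwashing = 5 + 2 * trait + random.gauss(0, 1)  # washes per day
    exam_score = 70 + 8 * trait + random.gauss(0, 5)  # points
    pairs.append((handwashing, exam_score))

def pearson(data):
    """Pearson correlation coefficient for a list of (x, y) pairs."""
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    cov = sum((x - mx) * (y - my) for x, y in data)
    var_x = sum((x - mx) ** 2 for x, _ in data)
    var_y = sum((y - my) ** 2 for _, y in data)
    return cov / (var_x * var_y) ** 0.5

print(pearson(pairs))  # roughly 0.7-0.8: strong correlation, zero causation
```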

Organizations embarking upon the big data journey should not only consider the success stories, which get publicized and hyped, but also be mindful of the fact that a few big data projects have either failed spectacularly or provided spurious conclusions.

There is no denying the fact that the volume of data generated and captured today is soaring at an unprecedented pace, which has the potential to transform our lives. There is also no denying that without proper planning and objective setting, organizations embarking upon big data projects will be shooting in the dark and unrealistically hoping to hit the bull’s eye.

Read Muzamil Riffat’s recent Journal article:
“Big Data—Not a Panacea,” ISACA Journal, volume 3, 2014.

Privacy in a Post-Snowden World
By William Emmanuel Yu, Ph.D., CISM, CRISC, CISSP, CSSLP
 
We live in a world where technology is present in everything we do. We have essentially become dependent on this level of pervasive communication technology. However, these same technological capabilities also make it possible to perform unprecedented levels of surveillance. People in the technology sector have always been aware of this power and have capitalized on it. However, in June 2013 things changed. The post-Snowden world has brought increasing awareness to the issue of mass surveillance. More people are now aware of it and more people want action from their governments. This increase in awareness has compelled regulators and governments worldwide to review intelligence agencies, laws and regulations with respect to data privacy.

For liberal countries with no data privacy laws, there will likely be a move to enact data privacy regulation. Countries that already have regulation will start reviewing and strengthening it in most cases. For a while, customers will be more discriminating about where their personal data resides. Decisions will be made on the perceived safety of these service providers. This puts an additional burden on companies that rely on IT to ensure that they continue to provide their services within a more data privacy-aware regulatory and cultural framework.

At the same time, this is also the era of big data, which enables the large-scale collection of customers’ personal and transactional information. Companies increasingly view their data streams as assets and have invested in technology to keep more of their data for longer and to identify ways to monetize it.
 
Companies are in no position to predict all possible changes in regulatory action or cultural expectations in the market. However, they need to build their applications to ensure they comply with these regulatory and cultural norms. In my recent Journal article, I recommend that application developers seriously review their applications in the context of existing global privacy regulatory frameworks, which can serve as a template. These general privacy principles can ensure a degree of future-proofing for these applications.
 
We are seeing the collision of capability and responsibility. We now have the capability to keep, process and monetize more private data. This is what technology allows, but at the same time, service providers have a responsibility to customers to protect this information and use it in a fair and proper fashion.
 
Read William Emmanuel Yu’s recent Journal article:
“Data Privacy and Big Data—Compliance Issues and Considerations,” ISACA Journal, volume 3, 2014.

The Challenges of Protecting Electronic Document Integrity
By Haris Hamidovic, Ph.D., CIA, ISMS IA
 
The increased use of technologies that allow electronic document storage and electronic communication has led lawmakers and courts in many jurisdictions around the world to consider the legal status of such information and the legal effect of that communication. The laws pertaining to electronic documents in most countries are not sector-specific. The enactment of these laws means that all organizations will have to take appropriate measures to protect document integrity while using electronic documents in the ordinary course of business. Failure to take these measures is no longer just a lack of due professional care; it constitutes a violation of legal obligations and can result in fines.
 
In some countries, the laws governing electronic documents make organizations’ use of electronic documents legally valid in their business transactions, both internally and with their clients, provided that these documents are signed with a qualified electronic signature. Qualified electronic signatures are advanced electronic signatures that are based on a qualified certificate and created by a secure signature-creation device. Currently, public key infrastructure (PKI) is the sole technology able to meet the requirements of qualified electronic signatures. Although it is a mature technology that is being implemented more and more widely, it remains rather complex, especially when it becomes intertwined with legal requirements. Consequently, the combination of technical and legal requirements can make it difficult for both technical and legal experts to implement a legally compliant electronic document system.
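As a minimal illustration of the cryptographic core (not the full legal construct), the sketch below signs and verifies a document using the Python cryptography package. A qualified electronic signature additionally requires a qualified certificate and a secure signature-creation device, which this example does not model.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Generate a key pair; in a qualified-signature setting the private key
# would live in a secure signature-creation device, not in memory.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
document = b"electronic document contents"

signature = private_key.sign(
    document,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)

# Any change to the document invalidates the signature, which is the
# integrity property the law is after.
try:
    private_key.public_key().verify(
        signature,
        document,
        padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                    salt_length=padding.PSS.MAX_LENGTH),
        hashes.SHA256(),
    )
    print("signature valid")
except InvalidSignature:
    print("document was altered")
```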
 
But meeting the challenging specifications governments require is not enough. Unfortunately, internal auditing does not play a large enough role in ensuring electronic document integrity. As discussed in my recent Journal article, without regular internal auditing, enterprises cannot know that their defenses are sound.
 
Read Haris Hamidovic’s recent Journal article:
“Electronic Documents Information Security Compliance,” ISACA Journal, volume 3, 2014.