The assurance objective of an application security audit is to ensure that application security controls operate effectively to protect the confidentiality, integrity and availability of information. Application security is concerned with maintaining the confidentiality, integrity and availability of information at the application layer. Of the key application controls reviewed as part of any application audit, application security is of utmost importance. In financial audit assertions, this translates to ‘restricted access’. A compromise of application security can lead to security-related frauds that, in turn, give rise to financial and reputational losses.
This article adopts a layered approach to auditing application security. Security over databases, operating systems, middleware and network layers is excluded from the scope of this article. The article provides general guidelines for auditing application security and some of the key controls that need to be tested to obtain a high level of assurance.
Prerequisites for Auditing Application Security
It is of paramount importance for an auditor to obtain a clear understanding of the underlying business process for which the application has been designed. It is also important to understand the different sources of data input to and output from the application. Interfaces to and from the applications should be understood to identify data flows. Most applications are accessed through individual user IDs and passwords to the application. However, other forms of login, such as single sign-on mechanisms, have become increasingly popular, given the magnitude of applications used in a corporate environment.
The design of the application for user provisioning should be understood upfront. Some applications are designed with an application security module, which is used for user provisioning within the application. On the other extreme, in some applications, user accounts need to be hard-coded into the application logic. The controls that operate in these two environments could be significantly different due to the difference in the way they are provisioned.
The various roles, descriptions, user profiles and user groups that can be created within an application should be obtained. This includes system administration accounts, security administration accounts, privileged accounts and service accounts. Organisationwide policy for obtaining user access and supporting standards and procedures needs to be reviewed upfront. This is required in order to understand the level of guidance available and to gauge the extent to which the information security policies and standards are embedded in the application layer. Having obtained a thorough understanding of the systems, policies, processes and procedures around application security, the auditor can then step through the various control points in an application security audit as detailed here.
Application Security Layers
A layered approach is used for analysing application security (figure 1). In information security terms, this is a typical defense-in-depth approach. The application security profile can be classified into three layers:
- Operational layer—This is the core of application security and is generally controlled through the security module of the application.
- Tactical layer—This is the next management layer above the operational layer. This includes supporting functions such as security administration, IT risk management and patch management.
- Strategic layer—This layer includes the overall information security governance, security awareness, supporting information security policies and standards, and the overarching IT risk management framework.
The operational layer includes:
- User accounts and access rights—Creating unique user accounts and providing them access rights appropriate to their roles and responsibilities is a well-regarded best practice in application security. An auditor should always ensure the use of unique user IDs that can be traced back to individuals. Use of guest, test or other generic accounts should be reviewed. Auditors should also check for duplicate/multiple accounts belonging to the same/shared users. Likewise, vendor accounts and third-party accounts should be reviewed. In essence, users and applications should be uniquely identifiable.
Privileged and service accounts are sometimes treated as mystery accounts. Accounts that have elevated access rights (including system administration) are referred to as privileged accounts. Service accounts are generally used for interfaces, batches or other maintenance type activities. Service accounts may or may not be interactive. An auditor should test the controls that monitor the usage of these privileged accounts. It is often noted that such accounts tend to remain in the applications for an extended period of time, even if they are rarely or never used. Regular monitoring of the use of these accounts serves as an adequate control from an application security perspective. An auditor should also review the authorisation records for granting privileged user rights.
User profiles are generally created upfront. A user profile is a skeleton of user access rights that represents a particular role (e.g., accounts manager). User accounts are then created by copying the existing profiles in the application. When appropriateness of user access is tested as part of the audit, the auditor should consider if the principle of least privilege was applied while granting access. Where existing user profiles are copied to new users for granting access, there is an inherent risk that new user accounts may inherit excessive access rights that may not be appropriate for their role.
Regular management review of the appropriateness of user access, together with the authorisation records supporting that access, is the key control to be tested under this domain.
- Password controls—Organisationwide security policy generally dictates password controls applicable to the different layers of technology. Application password controls need to be strengthened to prevent hacking of user accounts. In general, password strength (mix of alphanumeric characters), password minimum length, password age, password non-repetition and automated lockout after three to five attempts should be set as a minimum. Some off-the-shelf applications may not have strong password controls. In these instances, auditors need to look for compensating controls in the form of layered security (via network login, then application login) or other mechanisms where application passwords are well protected. Encryption of passwords is another key control, especially for critical applications that transmit payments or sensitive information. Password controls over privileged accounts and service accounts should be reviewed to ensure compliance with the organisationwide security policy.
Various login mechanisms, including login from mobile devices (wireless), offsite locations, home and other remote mechanisms, should be analysed from a control standpoint. In all instances, security measures such as secure connection, encryption and monitoring of remote login activity should be considered for auditing. Remote logins should be strictly controlled.
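The password parameters listed above can be compared against a baseline programmatically. The following sketch assumes application settings exported as a dictionary; the baseline values are illustrative assumptions drawn from the controls mentioned in this section, and the real thresholds should come from the organisationwide security policy.

```python
# Illustrative minimum baseline; actual thresholds should be taken
# from the organisation-wide security policy (values are assumptions).
BASELINE = {
    "min_length": 8,
    "max_age_days": 90,
    "history_count": 5,      # prevents repetition of recent passwords
    "lockout_threshold": 5,  # lock out after three to five failed attempts
    "complexity_required": True,
}

def policy_gaps(app_settings):
    """Return the baseline parameters the application's settings fail to meet."""
    gaps = []
    if app_settings.get("min_length", 0) < BASELINE["min_length"]:
        gaps.append("min_length")
    if app_settings.get("max_age_days", 10**9) > BASELINE["max_age_days"]:
        gaps.append("max_age_days")
    if app_settings.get("history_count", 0) < BASELINE["history_count"]:
        gaps.append("history_count")
    lockout = app_settings.get("lockout_threshold")
    if lockout is None or lockout > BASELINE["lockout_threshold"]:
        gaps.append("lockout_threshold")
    if BASELINE["complexity_required"] and not app_settings.get("complexity_required"):
        gaps.append("complexity_required")
    return gaps
```

Each reported gap then becomes either a finding or a prompt to look for the compensating controls discussed above.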
- Segregation of duties (SoD)—With recent occurrences of security-related frauds around the world (e.g., the Société Générale incident in January 2008), the importance of this control has heightened. Before an auditor ventures into testing this control, a clear understanding of the concept is important. Segregation of duties is defined as:
A basic internal control that prevents or detects errors and irregularities by assigning responsibility for initiating and recording transactions and for custody of assets to separate individuals.1
In an application security review, SoD needs to be tested at the following levels to provide a high level of assurance:
– Segregation of duties in the user access rights—The user profile within the application provides various access rights associated with the user profile. Understanding these access rights available to each user profile is important to ensure that those rights do not violate segregation of duties principles. In figure 2, the access rights within user profile A1 for application A are indicated as A1.1, A1.2, A1.3, etc. Quite often in poorly designed applications, auditors tend to find a lack of synchronisation between access rights and associated user profiles. For example, an account manager user profile can have review, authorisation and submit user access rights. If the profile also includes accounting entry input access rights, this is considered to violate the SoD principles.
– Segregation of duties within user profiles/accounts—Some applications are designed to allow more than one user account for the same user, reflecting the multitude of roles performed by that user. This is typical in small organisations with limited staff numbers. Testing of access rights over these multiple user accounts within an application should be performed to ensure that the allocation of user profiles to the same user does not violate SoD principles. The user profiles are depicted as A1, A2, A3, etc., in figure 2. For example, a user may have both payment and authorisation user profiles allocated within the same application, which violates SoD principles.
– Segregation of duties across multiple applications—In large organisations, users have access to a number of applications that may be appropriate to their role. A holistic approach needs to be adopted while performing this testing, to ensure that the access rights of the user in those applications do not violate SoD principles. For example, a user may have purchase order access rights in the enterprise resource planning (ERP) system and payment approval rights in a payment application. This is an explicit violation of SoD principles. To perform this testing, an auditor needs to have a very good understanding of the organisationwide applications being used and the nature of the user profiles in those applications (represented as applications A, B and C in figure 2).
– Segregation of duties between security administration and other functions—In large/mature organisations, a centralised/decentralised security administration function is in place to perform user provisioning. However, smaller technology shops cannot afford to have a separate security administration function. Therefore, the security administration function is often combined with application support function/business function, thereby explicitly violating SoD principles. In addition, application design may not cater to separate functions for administering the system and security. In those instances, compensating controls, if any, should be reviewed for adequacy and effectiveness.
Depending on the level of assurance required, auditors may choose to perform SoD testing at various levels. Figure 2 depicts the various levels of segregation of duties testing to be performed.
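SoD testing at all four levels reduces to the same mechanical check once access rights are aggregated per user. The sketch below assumes a hypothetical conflict matrix of right pairs that should not coexist; both the right names and the pairs are illustrative assumptions, not a standard matrix.

```python
# Hypothetical conflict matrix: pairs of access rights that should not
# be held by the same user (the pairs themselves are assumptions).
CONFLICTING_RIGHTS = {
    frozenset({"payment.input", "payment.authorise"}),
    frozenset({"purchase_order.create", "payment.approve"}),
}

def sod_violations(user_rights):
    """Return, per user, the conflicting right pairs that user holds.
    `user_rights` maps a user ID to the union of rights granted across
    all of that user's profiles and applications, so one check covers
    rights within a profile, across profiles and across applications."""
    violations = {}
    for user, rights in user_rights.items():
        hits = [pair for pair in CONFLICTING_RIGHTS if pair <= rights]
        if hits:
            violations[user] = hits
    return violations
```

Aggregating rights across applications before running the check is what turns this from a single-application test into the holistic, cross-application test described above.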
The tactical layer includes:
- Security administration—Timeliness of user provisioning activities, such as creating, changing and deleting user accounts, is important. An auditor should ensure that user accounts are created only after formal approval from the employee’s manager and that all approvals are stored in a centralised repository for future review and reference. User access change requests need to be actioned promptly when an employee transfers to another division within the organisation; otherwise, users retain their previous access rights for an extended period, which can lead to violations of SoD principles. Deletion of user access also needs to be timely, especially when disgruntled employees leave the organisation. While disabling access rights at the network layer is a compensating control, revoking access rights at the other layers in a timely manner is also important to protect against unauthorised logins. The supporting security administration procedures and the security configuration of the application need to be documented for application support and future reference.
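One common test of revocation timeliness is reconciling an HR leaver extract against the application’s active accounts. The sketch below assumes hypothetical data shapes (employee IDs as keys, account-to-employee mappings); in practice both extracts would be pulled as of the same date.

```python
def revocation_exceptions(hr_leavers, active_accounts):
    """Reconcile HR leavers against accounts still active in the
    application. `hr_leavers` maps employee IDs to termination dates;
    `active_accounts` maps user IDs to employee IDs. The field shapes
    are assumptions for this sketch."""
    return sorted(uid for uid, emp in active_accounts.items()
                  if emp in hr_leavers)
```

Any account returned represents a leaver whose application access was not revoked, i.e., a direct exception against the timeliness control described above.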
- IT risk management—Some of the key functions performed by IT risk management in relation to application security are:
– Assessing risk over key application controls
– Conducting a regular security awareness programme for application users
– Enabling application users to perform a self-assessment or complete a compliance checklist questionnaire to gauge their understanding of application security
– Reviewing application patches before deployment and regularly monitoring critical application logs
– Monitoring peripheral security in terms of updating antivirus software, etc.
An auditor should understand the risk associated with each application. This may be obtained through review of the reports on periodic risk assessment on the application or self-assessment/compliance reports on the application.
- Application patch management—Organisations tend to focus on patching at the database or operating system level, but application patching also needs to be performed regularly. Key applications must remain in ‘vendor support’ mode at all times, as security patches are released by the vendor only while the applications are supported. Patches should be analysed in consultation with IT risk, and only those necessary for the applications should be deployed.
- Interface security—An auditor needs to understand the data flows to and from the application. Security of the interfaced data is also important, especially when unencrypted methods of transmission are used. The user listing of the interfaces supporting the application needs to be reviewed to ensure that there is no unauthorised access to interfaced data.
- Audit logging and monitoring—Monitoring the audit logs of every single user and transaction becomes impractical in large organisations. Therefore, an auditor needs to understand the business-critical data pertaining to any application. Audit logging may typically be performed at the application or database level. It is often noted that while critical activities are logged, they are seldom monitored. This renders the control futile.
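Focusing monitoring on business-critical activity can be sketched as a simple log filter. The action names below are illustrative assumptions; the real critical set depends on the application and its key data, as noted above.

```python
# Illustrative set of business-critical activities worth monitoring;
# the actual list is application-specific (these names are assumptions).
CRITICAL_ACTIONS = {"payment.release", "user.grant_privilege", "config.change"}

def critical_events(log_records):
    """Yield only the log records whose action is business-critical,
    so reviewers monitor a focused subset rather than every entry."""
    for rec in log_records:
        if rec.get("action") in CRITICAL_ACTIONS:
            yield rec
```

The point of the filter is the control gap the article notes: logging without monitoring is futile, and a narrow, reviewed subset is more effective than a complete but unread log.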
At the strategic layer, an information security programme can be defined as:
The overall combination of technical, operational and procedural measures, and management structures implemented to provide for the confidentiality, integrity and availability of information based on business requirements and risk analysis.2
This includes clear allocation of roles and responsibilities, governance structure and management reporting.
A comprehensive information security programme fully supported by top management and communicated well to the organisation is of paramount importance to succeed in information security. The security policy should be supported by detailed standards and guidelines, which can then drive the appropriate level of security at the application, database and operating system layers. Likewise, a comprehensive IT risk management framework including a security risk management framework is essential to support the overall information security programme of an organisation. One of the key responsibilities of the IT risk management function is to promote ongoing security awareness to the organisation’s users.
Security metrics are now becoming popular to gauge the performance of the security management function. These are often good indicators of the security health of an organisation. Application security metrics serve as useful key performance indicators (KPIs) to assess the maturity of the function. Some of the key metrics that can be used for auditing application security are shown in figure 3.
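As a sketch of how such KPIs might be computed from account data, the metrics below are illustrative assumptions and are not the list shown in figure 3.

```python
def application_security_kpis(total_accounts, reviewed_accounts,
                              privileged_accounts, dormant_accounts):
    """Compute a few illustrative application security metrics.
    The metric definitions are assumptions for this sketch."""
    def pct(part, whole):
        return round(100.0 * part / whole, 1) if whole else 0.0
    return {
        "access_review_coverage_pct": pct(reviewed_accounts, total_accounts),
        "privileged_account_ratio_pct": pct(privileged_accounts, total_accounts),
        "dormant_account_ratio_pct": pct(dormant_accounts, total_accounts),
    }
```

Tracked over successive audit periods, ratios such as these give the trend view of security health that the metrics programme is meant to provide.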
Third-party Security Controls
As the world rapidly moves towards outsourcing, the application security principles described here should also be applied when auditing application security at the service provider. The alignment of the service provider’s security policy with the organisation’s security policy needs to be reviewed to identify any discrepancies. Assurance is generally obtained through Statement on Auditing Standards (SAS) No. 70 Type I and Type II reports produced by the service provider’s auditors. An auditor needs to review the third-party report to identify and analyse gaps between standard assurance processes and the third-party report. Likewise, the third-party report can be used to gauge the service provider’s adherence to the security controls and procedures prescribed by the organisation in the outsourcing contract. An auditor may need to perform additional audit procedures to validate the gap analysis.
Risks Associated With Failure/Weak Application Security Controls
Based on the key controls described previously, the risk of failure/weakness in the operating effectiveness of the key application security controls at the operational layer is described in figure 4 along with the risk associated with such failure.
Standards and Guidance
As identity thefts increase and information security becomes more difficult, there has been a growth in the number of standards, guidelines and compliance requirements. Some of the standards and guidance that are available on application security are:
- Control objectives for application security are more specifically defined in COBIT® 4.1, including DS5.3 Identity management, DS5.4 User account management and DS5.5 Security testing, surveillance and monitoring.3
- ITAF™: A Professional Practices Framework for IT Assurance4 provides more guidance (including value drivers and risk drivers) on how to use COBIT to support the IT assurance/audit activities relevant to managing security.
- ISACA® has published IT Audit and Assurance Guideline G38, Access Controls,5 which is a valuable reference for auditing application security.
- The Payment Card Industry (PCI) Data Security Standard (DSS)6 prescribes two requirements that are specifically relevant to application security: Requirement 6, ‘Develop and maintain secure systems and applications’, and Requirement 8, ‘Assign a unique ID to each person with computer access’.
- The ISO/IEC NP 27034 ‘Guidelines for application security’ was under development at the time of this writing.
Information security is a journey. An auditor can contribute to its success by adopting a proactive approach to analysing and auditing information security. All the key controls identified need to be tested to provide the right level of assurance to the executive board and audit committee. While technological developments occur endlessly, the control objectives and principles behind auditing application security remain the same, as described in this article.
1 ISACA, Glossary, www.isaca.org/glossary
3 ISACA, COBIT 4.1, USA, 2007, www.isaca.org/cobit
4 ISACA, ITAF: A Professional Practices Framework for IT Assurance, 2008, www.isaca.org/itaf
5 ISACA, IT Audit and Assurance Guidelines, G38, Access Controls, www.isaca.org/standards
6 PCI Security Standards Council, www.pcisecuritystandards.org
Alagammai Adaikkappan, CISA, CISM
is the principal (technology audit) at the National Australia Bank, Melbourne, Australia. In her role, she is involved in IT audits, IT risk reviews and security audits. Her experience in this field spans more than 10 years. She specialises in IT risk management, security management and IT governance. In her current role, she specialises in information technology audits in corporate and institutional banking. Her other areas of expertise include data management, BCP/DR, application development and project management. Adaikkappan can be reached at email@example.com.
ISACA Journal, formerly Information Systems Control Journal, is published by ISACA, a nonprofit organization created for the public in 1969. Membership in the association, a voluntary organization serving IT governance professionals, entitles one to receive an annual subscription to the ISACA Journal.