Red Teams: An Audit Tool, Technique and Methodology for Information Assurance 

 

From the late 1990s to the present, individuals and businesses have been acutely aware of the havoc caused by poorly implemented or nonexistent information security practices. Terms such as "phishing," "cracking" and "black-hat hacking" have entered the mainstream lexicon to describe new crimes perpetrated against people. As businesses become increasingly bound to their information systems, the value of the equipment, software and services is not the only line item affected when those systems are violated or attacked. Less tangible aspects of the organization, such as goodwill, public trust and market valuation, are also negatively affected. Furthermore, the development of advanced digital storage devices has allowed public and private entities to retain vast quantities of personal client/customer information, which can be analyzed, bought and sold. These massive concentrations of valuable data have become lucrative targets for thieves. During the last decade, governmental and private organizations have joined together to develop legislation and standards that help ensure that organizations take concrete and decisive steps toward preserving the confidentiality, integrity and availability of their information systems. This effort has yielded broad, comprehensive legislation that is often difficult to interpret and even tougher to obey.

Growing Security Risks

To help cope with these new challenges, the field of information security has had to develop rapidly in a short span of years. Prior to the early 1990s, there were only a few high-profile information security cases, best characterized as belonging either to the realm of international espionage (exemplified by Clifford Stoll's The Cuckoo's Egg, which recounts the unmasking of a Cold War cyberspy ring) or to the exploits of juveniles and technical enthusiasts, such as the 414s and the now-reformed hacker turned security expert Kevin Mitnick. Unfortunately, the playing field has since been leveled, and anyone can pose a threat to an organization's information security. This has placed many corporations and governments on the constant defensive, without a clear offensive strategy. The relative youth of the field, in comparison to other criminal sciences, and the lack of highly skilled professionals have left these organizations without the tools they need to cope adequately with security threats.

To make matters worse, there are few formal instructional texts available to teach information security officers and information systems auditors how to assess information systems for security risks and establish procedures to help minimize the impact of potential incidents. Many information security training events and courses, such as those provided by ISACA, the SANS Institute and universities participating in the National Security Agency's (NSA's) Centers for Academic Excellence in Information Assurance program, attempt to overcome this hurdle through a variety of instruction methods in classroom and laboratory settings. Many information security courses expose students to a variety of real-life tactics used by hackers and crackers to exploit vulnerabilities through penetration testing, thus creating a new generation of "white-hat hackers," or individuals who hack or probe systems for weak points to better defend them against attack.1

Red Team Tactics

Many organizations' first engagement with white-hat hackers is during a security assessment or audit that incorporates a "red team." The red team is composed of individuals skilled in ethical hacking: they employ the same tactics malicious hackers may use against information systems, but instead of damaging systems or stealing information, they report their findings back to the organization.2 IS auditors can use the red team method to their advantage to gain a better understanding of new and emerging security threats and to produce actionable proof to make the case for fundamental changes in an organization's security practices. Red teams made up of internal staff are problematic: internal members already have knowledge of the network and its security, which violates the principle against testing one's own systems. Therefore, auditors hired from outside the company to conduct a red team audit are given no prior information about the network or its security, simulating a true exercise in external intrusion. Thus, they can provide an external, unbiased test of the control infrastructure. The recent Carnegie Mellon report on insider threats points out that external threats and internal threats require separate approaches.

The concept of red teaming has its beginnings in military science. It has been used by the US military from the Cold War to the present to help strategic planners do the following:

  • Provide a surrogate adversary to "sharpen skills, expose vulnerabilities that adversaries might exploit and increase the understanding of the options and responses available to adversaries and competitors." The red team may accomplish this by emulating the adversary.
  • Play "devil's advocate." The red team can offer different alternatives to current plans, operations, processes and assumptions.
  • Offer sources of judgment that are external to the organization and act as a "sounding board" for new ideas that may arise from red team engagements.3

These basic concepts were general enough to apply to other governmental and commercial organizations experiencing their own set of security issues.

Red teaming is a component of both the assessment and maintenance phases of the information security life cycle. It is an assessment function in that it can yield valuable information that forms a clear, objective picture of an organization's information security practices. Red teaming can also be used later in the life cycle by information security and auditing professionals to retest systems and procedures to determine whether the suggested changes were implemented and effective. Figures 1 and 2 illustrate the information security life cycle. Red team audits can inform policy making and improve all functions in the security life cycle and the information security (infosec) process.

Within the IS auditing sphere, as the real-life experience of many auditors has shown, the assessment phase should ideally fall in the beginning stages of any major IT project adopted by the organization. In addition to keeping down the costs of radically redesigning a project to defend against newly discovered security threats, early assessment also helps a red team focus on fundamental threats that may plague the project and reduces the scope of the engagement.

The decision to pursue or forego red team testing should be predicated on the results of an extensive prior risk assessment. In any risk assessment, the audit team should use best practices to categorize risks by severity. If severe or undefined risks are present for the organization's critical, high-value systems or projects, red team testing may be easier to justify than if the systems are noncritical and of low value. The manpower and expense of a red team exercise may also determine whether it is a suitable tool to use in the audit. Depending on the size of the information system or the level of technical expertise needed by team members, the audit staff may not be able to be directly involved with the red team engagement, and auditors may have to rely on outside sources to augment the team's efforts. This raises an interesting dilemma: Is it necessary for auditors to acquire the expertise needed to fully evaluate systems, or should they step back and merely oversee and direct the red team's efforts? The answer is unclear, and auditors must carefully consider the implications of bringing in outside assistance for such a sensitive operation.
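
To illustrate how such an assessment might categorize risks by severity, the following Python sketch scores systems on a simple likelihood-times-impact scale. The systems, ratings and thresholds are hypothetical examples invented for illustration; a real assessment would use the organization's own risk framework.

```python
# A minimal sketch of likelihood x impact risk categorization, assuming a
# 1 (low) to 5 (high) ordinal scale; systems, ratings and thresholds are
# hypothetical, not drawn from the article.
RATINGS = {
    "public web site": (4, 5),     # (likelihood, impact)
    "payroll database": (2, 5),
    "print server": (3, 1),
}

def categorize(likelihood: int, impact: int) -> str:
    score = likelihood * impact    # ranges from 1 to 25
    if score >= 15:
        return "severe"
    if score >= 8:
        return "moderate"
    return "low"

for system, (likelihood, impact) in RATINGS.items():
    print(f"{system}: {categorize(likelihood, impact)}")
```

Under this sketch, severe or undefined ratings on high-value systems would argue for a red team engagement, while a sheet full of low scores would not.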

To help navigate these and other obstacles that can derail a successful red team engagement, auditors should ensure that the corporate culture is able to accept the judgment of the red team and tolerate criticism of established practices.4

A Culture of Security

One could argue that the most fundamental operations of the organization should contribute toward a culture that cares about securing confidential data: the human resources department's hiring practices should emphasize choosing the most suitable and stable candidates for positions with access to sensitive information; the facilities department should be continually diligent in maintaining a safe and clean environment; the board of directors should be willing to listen and react to senior management's suggestions to improve security; and staff members should take pride in their jobs and obey the rules that protect not only the company's wellbeing, but also their own personal information, such as payroll documents, employee reviews and medical records. In the same vein, the auditee must understand that the red team is not an affront to the competence of the organization's programmers or administrators, its internal workflow, or its management style. Auditors should explain to management that red teaming is merely a method of applying a high level of stress to a system or process to gauge its effectiveness and efficiency. Audits may also be used in court to defend against negligence suits; performing them indicates a culture that values due diligence and due care.

The team must have top cover from management to deflect concerns from other levels of the organization and preserve objectivity in its methods and findings.5

Agree to Parameters

The red team must have not only top management support, but also a comprehensive agreement with the auditee that outlines the evaluation plan and identifies the exact systems or processes to be examined. Without an explicit, written assurance guaranteeing that the team is free from liability, there is a significant risk that the team may be subject to criminal and civil penalties; during testing, for example, the team could unintentionally intrude on systems governed by the laws of other countries, accidentally violate the integrity of systems connected to the organization via an electronic trading system, or view highly sensitive, patented or secret material.6 Once the team and the organization have concluded the agreement, the team should have the full support of top management to conduct what C.C. Palmer characterizes as "no-holds-barred" activities against the specified systems or processes. In both a control audit engagement and a red team penetration test, the best results may come from testing systems in normal, routine operation. The red team must strike a careful balance with management: on one hand, to gather the most accurate results, management may have to accept that critical systems may malfunction or data may be lost during the testing; on the other, the red team must not push the systems beyond the agreed-upon parameters.7 Strong management support is also needed to deflect the potential animosity and resistance the team might encounter from an organization's IT staff once confronted with the knowledge that their systems might not be secure. It should be stressed to management and IT that the red team engagement is not out to assign blame to any individual or team; it is only a method of gathering information so the organization can have an objective viewpoint regarding the security of its systems. The data gathered from an engagement is not merely a laundry list of problems, but a potential opportunity for the organization to recognize the need for greater investment in, and attention to, IT's needs.

Red team exercises must have a project plan with limited scope (everything cannot be tested at once) and a methodology to be followed. Red teams should never simply go in and tamper with a system; they should follow a careful plan based on prior vulnerability assessments and the risk to systems.

Red Team Testing and Training

The red team should be composed of competent subject matter experts (SMEs) with experience appropriate to the engagement.8 Both the participating auditors and red team technical specialists must have, at a minimum, a firm grasp of the fundamentals of computer forensics and a good general knowledge of the systems and processes against which they will test. The red team will typically test four areas of an information system:

  • Information residence, or operating system platform and storage security
  • Information transmission, or networks and communications
  • Information use, or the applications and decision processes based on data generated or collected by the information system
  • Information access, or the policies, passwords and permissions used to gain access to data9

Highly complex systems may require that the audit team pursue advanced training or partner with third parties to adequately assess the information system. The following are some of the key areas of specialization:

  • Developing the hacker's mind
  • Network surveying
  • Port scanning (a minimal connect-scan sketch follows this list)
  • System identification/OS fingerprinting
  • Firewall/ACL testing
  • Social engineering
  • Password cracking
  • Performing legal assessments on remote/foreign networks
  • Examining an organization for weaknesses as if through the eyes of an industrial spy or competitor
  • Determining appropriate countermeasures to thwart malicious hacking10
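
To make one of these specializations concrete, the sketch below shows a minimal TCP connect scan, one of the simplest port scanning techniques, using only the Python standard library. The target address and port range are placeholders; any real scan must stay strictly within the systems the engagement agreement covers.

```python
# A minimal TCP connect-scan sketch using only the standard library.
# The host below is a TEST-NET placeholder; replace it only with a target
# that the signed engagement agreement explicitly authorizes.
import socket

def scan(host: str, ports: range, timeout: float = 0.5) -> list[int]:
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex returns 0 when the TCP handshake succeeds,
            # i.e., when something is listening on the port
            if s.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports

if __name__ == "__main__":
    print(scan("192.0.2.10", range(1, 1025)))
```

A connect scan completes the full handshake and is therefore easy for the target to log; real tools such as those a red team would use offer stealthier variants, but the principle of probing each port and recording responses is the same.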

Many instructional courses are available to help auditors and red team specialists keep current on new attacks and threats. At the professional level, many organizations, such as ISACA, the Information Systems Security Association (ISSA), the Institute of Internal Auditors (IIA) and the SANS Institute, offer training and industry-recognized certification to members. At the audit specialist level, numerous platform- and application-specific courses are available from major software/hardware companies, such as Microsoft, PeopleSoft, Cisco and Juniper Networks.11 The demand for qualified and experienced information security professionals normally exceeds the supply; therefore, it may be necessary to bring in new talent to fulfill the staffing needs of the red team. The US federal government has recognized this need and has formed the Center for Cyber Defenders at Sandia National Laboratories in Albuquerque, New Mexico, USA. The program employs and trains college students in computer forensics, network programming, computer science and other key areas.12

In addition to developing a training path for the red team, auditors and specialists can benefit from each team member's experience and training by keeping detailed records, in a knowledge database, of common practices and general observations made during investigations. This will promote information sharing among team members and the rest of the security community and eliminate the need to re-research problems encountered on previous engagements. This argues for a specialization in corporate forensics, as advocated by Patricia Logan of Marshall University and John Patzakis, vice chairman of Guidance Software; several organizations (Bank of America, Wells Fargo, etc.) have created such support entities within their operations.

Determine the Scope

The scope of the engagement must be focused and manageable. The audit staff should make full use of the initial planning stages of the assessment to gain enough basic information about the systems and processes they will engage. During this phase, auditors can gauge, from interviews with staff, questionnaires and details of prior audits, the manpower required for the red team. This will also help determine whether the scope of the engagement will be at the application level or the enterprise level. Typically, application-level red team engagements examine the security of systems whose failure would not be terminal to the organization, have a very specific set of goals and a short timeline, and are structured in approach. An enterprise engagement, on the other hand, takes a broad perspective of information security and may determine the future course of the organization.13 It is important that the team focus on systems that are of the greatest importance to the organization's survival.14 Examples of high-risk systems might be the organization's web site, a critical back-end database or any concentration of data used to make daily business decisions.

With thousands of known vulnerabilities and tens of thousands yet to be discovered, the red team must be careful to avoid the scope creep that comes from attempting to test for the irrelevant or trivial during the attack planning stage. The attack tree is one innovative method that can help define and manage the scope in both types of engagements. Figure 3 provides an example of an attack tree.

In an attack tree, the overarching goal of the attack, for example, cracking into an organization's payroll database, is represented at the top of the tree as the root node. To crack into the payroll database, a subgoal, represented by a subnode of the root node, might be to guess the password to the database or to copy the database in its entirety.15 Each of these goals may have atomic, or highly specific, tasks that can accomplish it, such as "run a database password-cracking tool." These tasks take the form of leaf nodes in the attack tree. After the attack tree has fully taken shape, the red team can evaluate each leaf node using capability analysis. In capability analysis, the team estimates the cost of the attack, the ability it requires and the attacker's tolerance of apprehension. For example, in an attack tree for cracking into the payroll database, it may be determined that the cost of using a freely available password-cracking tool is US $0, the ability level required is very low, and the chance of being caught is minimal. On the other hand, the risk of copying the database, which may be on a standalone computer, may be quite high if the attacker has to resort to stealth and physical intrusion to reach the computer. As the red team evaluates each possibility and weighs the likelihood of each attack, the least likely options are pruned from the attack tree.16 This can be useful in determining the scope and timeline of the red team engagement, because it helps eliminate tests for low-probability and/or low-value risks, and the tree can be reused in future engagements. An attack tree should consider three points of vulnerability: people, policies and technology. Testing just one will not do; a point of vulnerability can exist in each.
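
The pruning logic described above lends itself to a short sketch. The Python fragment below is a hedged illustration of capability analysis, not a tool from the cited sources: the node fields, thresholds and example figures are all assumptions made for this example.

```python
# A sketch of attack-tree capability analysis: leaf nodes carry estimated
# cost, required skill and noticeability; leaves beyond the attacker-profile
# thresholds are pruned, and empty subtrees are pruned with them.
from dataclasses import dataclass, field

@dataclass
class Node:
    goal: str
    cost_usd: float = 0.0        # attacker's out-of-pocket cost
    skill: int = 0               # 1 (trivial) .. 5 (expert)
    noticeability: int = 0       # 1 (covert) .. 5 (likely caught)
    children: list["Node"] = field(default_factory=list)

# Illustrative thresholds for the assumed attacker profile
MAX_COST_USD, MAX_SKILL, MAX_NOTICEABILITY = 1_000, 4, 3

def prune(node: Node) -> bool:
    """Return True if this subtree still represents a plausible attack."""
    if not node.children:        # leaf: apply the capability thresholds
        return (node.cost_usd <= MAX_COST_USD
                and node.skill <= MAX_SKILL
                and node.noticeability <= MAX_NOTICEABILITY)
    node.children = [c for c in node.children if prune(c)]
    return bool(node.children)

root = Node("crack payroll database", children=[
    Node("guess password", children=[
        Node("run password-cracking tool", cost_usd=0, skill=1, noticeability=1),
    ]),
    Node("copy database from standalone host", children=[
        Node("physically intrude after hours", cost_usd=500, skill=3, noticeability=5),
    ]),
])
prune(root)
print([child.goal for child in root.children])  # -> ['guess password']
```

With these made-up numbers, the physical intrusion leaf fails the noticeability threshold and its subgoal is pruned, leaving password guessing as the branch worth testing, which is exactly the scoping decision the prose describes.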

Act on Results

Results should be given immediate consideration and acted upon within a relatively short time period. At the end of the red team engagement, the final report to the organization should document all the steps taken and tools used in the course of testing. This must be detailed enough to satisfy concerns that customers might have about the introduction of rogue programs or the loss of data caused by the testing; the team must have a documented plan. The report should clearly state specific steps that management can take to address the risks identified during the engagement. Possible recommendations found in a red team report might suggest that the customer:

  • Apply a software update
  • Change a firewall's access list to prevent intrusion
  • Implement account and process auditing software
  • Create and promote security awareness within the organization

If the red team engagement was driven by a requirement to comply with legislation, auditors should attempt to address that requirement by offering specific and concrete steps to fulfill it. For example, if the auditee were a healthcare facility that had to comply with the US Health Insurance Portability and Accountability Act (HIPAA), the report should suggest steps the organization should take to comply with the HIPAA security mandate. This mandate requires that patient data and transactions containing patient data be safeguarded to protect confidentiality, integrity and availability, but it does not give any suggestions or recommendations for how a provider can accomplish these goals. Auditors can reconcile the risks to patient data and the red team's vulnerability assessment by using outside resources, such as the American Medical Association's (AMA's) best practices publication, Field Guide to HIPAA Implementation; networking with other auditors and teams specializing in the same area; or drawing from past experience. A possible suggestion for this scenario might be to implement an encrypted virtual private network between two doctors' offices to deter hackers from intercepting transmissions of patient data. Another resource is the US National Institute of Standards and Technology's (NIST's) guidance for implementing the security rule, SP 800-66. The potential cost of not complying with just a single piece of legislation such as HIPAA's mandate can be US $25,000 per year for each requirement not fulfilled. Since a piece of legislation such as HIPAA may have thousands of requirements, the costs may exceed tens of millions of US dollars. Many large organizations may find themselves surrounded on all sides by multiple rules and regulations, including the US Sarbanes-Oxley Act for most publicly traded companies; US SEC Rule 17a-4, the US Gramm-Leach-Bliley Act, the state of California's SB 1386 and the US Patriot Act for many financial institutions; and numerous regulations at both the state and international levels. Each requires full compliance either now or at a date too close for many organizations to ignore. Not testing for vulnerabilities can be expensive as well.
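
A quick back-of-the-envelope calculation shows how the per-requirement penalty figure cited above compounds; the requirement counts below are hypothetical placeholders, not HIPAA figures.

```python
# Rough compliance-exposure estimate from the US $25,000/year figure above;
# the unmet-requirement counts are hypothetical.
PENALTY_PER_REQUIREMENT_PER_YEAR = 25_000  # US $

for unmet_requirements in (10, 100, 1_000):
    exposure = unmet_requirements * PENALTY_PER_REQUIREMENT_PER_YEAR
    print(f"{unmet_requirements:>5} unmet requirements -> US ${exposure:,}/year")
```

Even at 1,000 unmet requirements the annual exposure reaches US $25 million, consistent with the "tens of millions" estimate in the text.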

The results of the engagement must be kept confidential. Prior to the engagement, the organization should conduct a thorough background check of red team members. Some complex systems may require third-party companies or freelance experts to assist auditors in conducting the testing. Because of the sensitive nature of ethical hacking, it is important to make sure that these individuals are trustworthy beyond doubt. Finally, confidentiality agreements affirming that no information will be shared should be drafted by the organization and signed by the members of the red team prior to the engagement.

Three Red Team Tactics for Success

A successful red team engagement should not only be a hunt for software vulnerabilities; the members should also be expected to employ novel and inventive tactics against the organization's defenses to successfully emulate a competitor or an attacker. As organizations get wise to standard distributed denial-of-service (DDoS) and password-cracking attacks, hackers may rely on other means to get into the network, such as social engineering, document grinding and war driving. The social engineering hack can be perpetrated in many forms:

  • An e-mail from an internal account using an authoritative-sounding name, such as "supervisor" or "administrator"
  • A friendly phone call from someone claiming to be a member of the help desk
  • An individual piggybacking on an open door to a secure facility
  • Using charm or flirtation or feigning helplessness to gain access

Social engineering tactics may be the hardest for most individuals to pull off, but the rewards of discovering that they work against the organization's supposedly secure information systems are worth the effort; hence, these tactics must be anticipated in the red team's investigation.

Document grinding is closely related to social engineering, in that it examines what employees consider trivial documents, such as appointments, spreadsheets, phone numbers, manuals and letterheads, and then assembles these pieces of information to form a clear picture of how the organization works. For example, by conducting a night walk-through of open cubicles in an organization or "dumpster diving" for unshredded documents, the red team can gather passwords or names of important individuals to use during a social engineering hack, or discover the names of critical systems to use during a penetration test. Examining calendars and address books might give the red team clues about the best times to conduct an attack, coinciding with an employee's vacation or a shift change. The team can also search the Internet for other sources of information in the form of ads in job databases, requests from key security and support personnel for technical help in forums and newsgroups, resumes posted by individuals currently or previously employed by the organization, SEC filings, or even documents posted on the company's own web site. Since these types of attacks require the element of surprise, management must not inform or alert staff of the red team's operations.

The newest tactic in hacking focuses specifically on the wireless network, the newest addition to many organizations' information systems. No matter what new specifications are developed to secure wireless transmissions, the fact remains that because packets pass unprotected through the open air, they are fully traceable and sniffable, and thus a prime target for hackers. The tactic of war driving allows hackers, and therefore red team members, to map the extent of a wireless network, gather details about the relative positions of access points from their signal strengths and triangulation, and determine what encryption, if any, protects the packets being sent. These findings can give the red team a new avenue of attack to explore during the engagement. Wireless should be one of many things tested, rather than the single target of an exercise.
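
For a sense of how signal strength feeds the access-point positioning mentioned above, the sketch below applies the standard log-distance path loss model to convert a received signal strength (RSSI) reading into an approximate distance; the reference power and path-loss exponent are assumptions that must be calibrated per environment, and combining three or more such distances from different vantage points enables triangulation.

```python
# A rough sketch of distance estimation from received signal strength,
# using the log-distance path loss model: d = 10^((P_ref - RSSI) / (10 * n)).
# P_ref (expected RSSI at 1 m) and n (path-loss exponent) are assumptions.
import math  # kept for clarity; the model needs only exponentiation

def estimate_distance_m(rssi_dbm: float, ref_power_dbm: float = -40.0,
                        path_loss_exponent: float = 3.0) -> float:
    """Approximate distance in metres to a transmitter.

    ref_power_dbm is the expected RSSI at 1 m; path_loss_exponent is
    roughly 2 in free space and 3-4 indoors. Calibrate both per site.
    """
    return 10 ** ((ref_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

print(f"{estimate_distance_m(-70.0):.1f} m")  # ~10 m under these assumptions
```

These estimates are coarse, which is why war drivers take many readings while moving; the red team can use the same approach to demonstrate how far outside the building an access point is usable.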

Conclusion

The red team exercise can be a valuable tool for auditors to use in assessing risks to information systems above and beyond those techniques traditionally used by auditors. This method has been successfully used in the past by the US military and other large organizations to simulate and defend against previously inconceivable threats.

Unfortunately, for red teaming to gain wider acceptance and usage, it must overcome two obstacles. First, auditors must increase their awareness of tactics used by hackers by training and collaborating with information security professionals. Fortunately, the body of knowledge on information security is rapidly growing and there is an ample amount of literature and courses offered to help auditors retrain. Second, the concept of "ethical hacking" is still a hard sell for organizations that might be wary of allowing someone to subvert their security without employees being given advance notice. There is no easy approach to help these organizations understand and acknowledge that the benefits to the red team exercise outweigh the negatives. The secretive nature of the red team exercise prevents auditors from producing detailed case studies to show organizations quantifiable proof of its value.

Endnotes

1 Kabay, M.E.; Philip S. Holt; "Career Advice: Breaking into Infosec," May 2001, http://infosecuritymag.techtarget.com/articles/may01/features_career_advice.shtml
2 Palmer, C. C.; "Ethical Hacking," IBM Systems Journal, 2001, www.research.ibm.com/journal/sj/403/palmer.html
3 United States Department of Defense, Defense Science Board Task Force, "The Role and Status of DoD Red Teaming Activities," September 2003, www.acq.osd.mil/dsb/redteam.pdf
4 Ibid.
5 Ibid.
6 Peake, Chris; "Red Teaming: The Art of Ethical Hacking," 12 July 2004, www.sans.org/rr/papers/5/1272.pdf
7 Op. cit., Palmer
8 Op. cit., US Department of Defense
9 Duggan, David P.; Robert L. Hutchinson; "Red Teaming 101," Sandia National Laboratories, 17 July 2004, www.cs.nmt.edu/%7Ecs491_02/RedTeaming-4hr.pdf
10 Op. cit., Peake
11 Institute of Internal Auditors, "Tools and Resources for Security Management," 10 August 2004, www.theiia.org/iia/index.cfm?doc_id=3061
12 Sandia Corporation, "Center for Cyber Defenders (CCD)," 10 August 2004, www.sandia.gov/ccd/
13 Op. cit., US Department of Defense
14 Op. cit., Peake
15 Amenaza Technologies Ltd., "Creating Secure Systems Through Attack Modeling," 10 June 2003, www.amenaza.com/downloads/docs/5StepAttackTree_WP.pdf
16 Ibid.

Frederick Gallegos, CISA, CGFM, CDE
is an adjunct professor and MSBA–Information Systems Audit Advisor for the Computer Information Systems Department, College of Business Administration, California State Polytechnic University (USA). He has more than 33 years of experience in the information systems audit, control and security field. He has taught undergraduate and graduate courses in the IS audit, security and control field, and has written four textbooks and more than 180 articles. He has been active with ISACA and is a member of the Journal Editorial Committee.

Matthew L. Smith, CISA, CISSP
is an information security professional at Foundstone, a division of McAfee. He has more than 10 years of experience in IS in both the private and public sectors and is currently completing his master's degree at California State Polytechnic University, specializing in IS auditing. He welcomes comments at mlsmith@nyx.net.

