In the information security world, speed kills. Every day, approximately seven new software vulnerabilities are publicly disclosed, according to the latest “Internet Security Threat Report” released by information security company Symantec Corp. Each vulnerability represents a potential target for hackers hoping to exploit it before it is patched. In 2003, the number of documented software vulnerabilities totaled 2,636.
However, patches take time, not just to create but to test and deploy. Some experts estimate the average period for testing patches can reach 30 days.
Meanwhile, threats are spreading faster than ever. In 2001, variants of the Code Red worm infected several hundred thousand hosts worldwide within a matter of hours. The Slammer worm of January 2003 doubled its rate of infection every 8.5 seconds in its beginning stages, infecting 70,000 hosts in just 30 minutes.
Increased propagation speed, aided in part by constantly increasing bandwidth, means that any of these threats has the potential to cause widespread damage more quickly than ever before. The number of attacks is also increasing. One of the most significant information security events of 2003 occurred in August when the Internet was hit by three new high-impact worms in only eight days. Blaster, Welchia and Sobig infected millions of computers worldwide, with Blaster alone infecting an average of 2,500 computers per hour.
The alarming speed of propagation of such threats is due in large part to their make-up. Called “blended threats” because of their use of multiple methods and techniques to spread, these threats combine the characteristics of malicious code such as viruses, worms and Trojan horses with the ability to exploit vulnerabilities to break into targeted computers. The result? A digital whirlwind that can spread to large numbers of systems in a short time, causing widespread damage quickly.
In response to such threats, information security companies are engineering technologies that offer proactive, rather than reactive, protection. One of the most promising is generic exploit blocking.
Beyond Traditional Signatures
Generic exploit blocking technology does just what its name implies: it generically detects and blocks all potential exploits or attacks against a vulnerability. Upon learning of a new vulnerability, such as a buffer overflow exposure in a web server, a security engineer characterizes the vulnerability and builds a signature that recognizes and stops any attempts to exploit that vulnerability.
This marks a major deviation from traditional signature-based protection technologies. With traditional technologies, a signature is written only after a virus has appeared and has been analyzed by security experts. This signature is created to block the specific threat that is circulating via vulnerable systems. While traditional signature-based technology is effective at stopping malicious code that is spreading, a signature cannot be written until an actual threat appears. If the threat spreads faster than the signature can be produced and distributed, the infection will run unchecked before protection ever arrives.
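To make the contrast concrete, the traditional approach can be sketched roughly as follows. This is an illustrative simplification, not any vendor's actual engine; the threat names and byte patterns are hypothetical stand-ins for signatures extracted from already-captured malicious code:

```python
# Illustrative sketch of traditional signature-based detection (simplified).
# Each signature is a byte pattern extracted from a captured, analyzed threat,
# so no signature can exist until the threat itself has appeared in the wild.

# Hypothetical patterns standing in for real extracted signatures.
KNOWN_SIGNATURES = {
    "Example.Worm.A": b"\xeb\x10\x5a\x4a\x33\xc9",
    "Example.Worm.B": b"MAIL FROM:<spam@",
}

def scan(payload: bytes):
    """Return the name of the first known threat found in payload, else None."""
    for name, pattern in KNOWN_SIGNATURES.items():
        if pattern in payload:
            return name
    return None
```

The key limitation is visible in the data structure itself: the dictionary can only ever contain entries for threats someone has already seen and analyzed.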
Generic exploit blocking, in contrast, protects against any and all exploits that are capable of taking advantage of a disclosed vulnerability. How does this work? Consider a padlock. Without ever having seen a key, a locksmith can examine the tumblers in the padlock and exactly characterize the shape of all keys capable of opening the padlock.
Similarly, without ever seeing a computer worm that attacks a given vulnerability, a security engineer can analyze the vulnerability itself and determine the shape of any exploit capable of unlocking it, before such a worm is ever written.
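A minimal sketch of the idea, using a buffer-overflow vulnerability of the kind Slammer exploited in the SQL Server Resolution Service: the service could be overflowed by a request beginning with a particular type byte that carried more data than its fixed-size buffer. Any packet matching that shape is an exploit attempt regardless of what code it carries, so the rule can be written before any worm exists. The buffer size and type byte below are illustrative assumptions, not the actual values from the advisory:

```python
# Illustrative, simplified generic exploit-blocking check for a
# buffer-overflow vulnerability: instead of matching bytes from a known
# worm, it matches the *shape* of any request capable of overflowing
# the vulnerable buffer.

MAX_SAFE_LENGTH = 128   # hypothetical size of the vulnerable buffer
REQUEST_TYPE = 0x04     # hypothetical first byte of the vulnerable request type

def is_exploit_attempt(packet: bytes) -> bool:
    """Flag any packet whose shape could overflow the vulnerable buffer."""
    return (
        len(packet) > 0
        and packet[0] == REQUEST_TYPE
        and len(packet) > MAX_SAFE_LENGTH
    )
```

Note that the check never consults a database of known threats: a worm written tomorrow against this vulnerability is blocked by the same rule as one written today.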
This, in turn, changes the usual sequence of events for protecting against Internet threats. In the typical time line following the disclosure of a software vulnerability, a patch is released, and IT administrators begin their patch testing and deployment process. Virus writers then capitalize on this post-disclosure period to create a threat that exploits the vulnerability.
In this scenario, it is only after such a threat is created and released that security professionals have code to analyze to produce a signature. That changes with generic exploit blocking. Because security experts analyze the software vulnerability rather than the virus writer’s malicious code, security professionals can develop a signature even before a threat emerges.
This innovative technology continues to be integrated into the solutions of a growing number of information security providers. Appropriate for use in consumer and enterprise solutions, on desktops and on firewalls and routers, a generic exploit blocking signature can be delivered in the form of a convenient, regular security update (i.e., a virus definition data file).
For businesses and consumers, the bottom line is this: generic exploit blocking offers a way to proactively safeguard their information and systems. Because vulnerability analysis can be challenging and require hours of analysis on the part of security engineers, generic exploit blocking technology is especially appropriate for addressing critical vulnerabilities found in popular applications with a large user base.
Traditional signature-based virus detection will continue to play an important role in protecting businesses and consumers against Internet threats. With the ability to uniquely identify attacks by name, virus signatures can accurately zero in on known threats and stop their spread efficiently and reliably.
But in today’s fast-paced digital world, where threats multiply and accelerate at greater and greater speeds, prevention is often better than a cure. To that end, generic exploit blocking technology promises to keep the digital community a step ahead of Internet threats by anticipating and blocking them before they appear.
Carey Nachenberg is chief architect of Symantec Research Labs and has been an innovator at Symantec Corporation for the past 14 years. As chief architect, Nachenberg helps to set the technological agenda for the company’s research division and investigates new technologies across the computer security space. Nachenberg has contributed to books, including Internet Security Professional Reference and Windows NT Server 4: Security, Troubleshooting and Optimization, and has published articles in numerous publications, including Virus Bulletin, Secure Computing and Communications of the ACM.
Information Systems Control Journal, formerly the IS Audit & Control Journal, is published by ISACA®, Inc. Membership in the association, a voluntary organization of persons interested in information systems (IS) auditing, control and security, entitles one to receive an annual subscription to the Information Systems Control Journal.
Opinions expressed in the Information Systems Control Journal represent the views of the authors and advertisers. They may differ from policies and official statements of ISACA® and/or the IT Governance Institute® and their committees, and from opinions endorsed by authors' employers, or the editors of this Journal. Information Systems Control Journal does not attest to the originality of authors' content.
© Copyright 2005 by ISACA® Inc., formerly the EDP Auditors Association. All rights reserved. ISCA™ Information Systems Control Association™
Instructors are permitted to photocopy isolated articles for noncommercial classroom use without fee. For other copying, reprint or republication, permission must be obtained in writing from the association. Where necessary, permission is granted by the copyright owners for those registered with the Copyright Clearance Center (CCC), 27 Congress St., Salem, Mass. 01970, to photocopy articles owned by ISACA® Inc., for a flat fee of US $2.50 per article plus 25¢ per page. Send payment to the CCC stating the ISSN (1526-7407), date, volume, and first and last page number of each article. Copying for other than personal use or internal reference, or of articles or columns not owned by the association without express permission of the association or the copyright owner is expressly prohibited.
INFORMATION SYSTEMS CONTROL JOURNAL, VOLUME 2, 2005