JOnline: The SAP Landscape That Warrants Audit 

 

Founded in 1972 by five former IBM employees in Mannheim, Germany, SAP® describes itself as the world's largest interenterprise software company and the world's fourth-largest independent software supplier. The original SAP idea was to give customers the ability to interact with a common corporate database across a comprehensive range of applications. The applications have gradually been assembled into an integrated suite, and today many corporations, including IBM and Microsoft, use SAP products to run their own businesses. A recent R/3® version was released for IBM's AS/400 platform. SAP is approaching the market with new offerings that may challenge the competition: xRPM, a cross-application ("cross-app") for resource and program management positioned at internal research and development (R&D) and IS organizations, and SAP Professional Services, a customer relationship management (CRM) solution.

SAP applications, built around the latest R/3 system, provide the capability to manage financial, asset and cost accounting; production operations and materials; personnel; plants and archived documents. The R/3 system runs on a number of platforms, including Windows 2000, and uses the client-server model. Client-server describes the relationship between two computer programs in which one program, the client, makes a service request from another program, the server, which fulfills the request. Although programs within a single computer can use the client-server idea, it is a more important model within a network. In a network, the client-server model provides a convenient way to interconnect programs that are distributed efficiently across different locations. The latest version of R/3 includes a comprehensive Internet-enabled package.
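The client-server relationship described above can be sketched in a few lines. This is a minimal, generic illustration of the pattern (the loopback address, message contents and function names are illustrative and have nothing to do with any SAP product):

```python
import socket
import threading

def run_server(sock):
    # The server program: accept one connection, fulfill the client's
    # service request, then exit.
    conn, _ = sock.accept()
    with conn:
        request = conn.recv(1024)            # the client's service request
        conn.sendall(b"RESULT:" + request)   # the server fulfills it

def client_request(port, payload):
    # The client program: make a service request from the server program.
    with socket.create_connection(("127.0.0.1", port)) as c:
        c.sendall(payload)
        c.shutdown(socket.SHUT_WR)           # signal end of request
        return c.recv(1024)

server_sock = socket.socket()
server_sock.bind(("127.0.0.1", 0))           # let the OS pick a free port
server_sock.listen(1)
port = server_sock.getsockname()[1]

t = threading.Thread(target=run_server, args=(server_sock,))
t.start()
reply = client_request(port, b"balance-query")
t.join()
server_sock.close()
print(reply.decode())                        # RESULT:balance-query
```

Although both programs run here in a single process for brevity, the same request/response exchange works unchanged across a network, which is where the model matters most.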

SAP has recently recast its product offerings under a comprehensive web interface, called mySAP.com, and added new e-business applications, including CRM and supply chain management (SCM).

Storage Problems

The cost of disk storage is often underestimated and has been since IBM invented disk storage in 1956. At that time, a 5-megabyte (MB) drive cost US $50,000. That amounts to US $10 million per gigabyte (GB) and an astounding US $10 billion per terabyte (TB). Over time, this cost has dramatically shrunk, as shown in Figure 1.

However, this chart can be misleading because it does not take into consideration vital factors of the cost equation, such as the cost to manage, the cost of downtime and the cost of secure, fail-safe data.


Figure 1—Cost of Disk Space (In US Dollars)

Year   Cost per MB   Cost per GB    Cost per TB
1956   $10,000       $10 million    $10 billion
1980   $193          $193,000       $193 million
1990   $9            $9,000         $9 million
2000   $0.01         $15            $15,000
2004   $0.001        $1.15          $1,150

 

Data storage requirements for a typical business are doubling every nine to 12 months, and as storage requirements increase, so does the cost of maintaining the data. Factor in the cost of managing disk space and the supporting infrastructure it requires, such as Redundant Array of Inexpensive Disks (RAID) controllers, and the cost of the storage becomes five times higher. Add top-quality, fail-safe RAID hardware instead of the inexpensive disks meant for personal computers, and the hardware cost doubles, if not triples or quadruples. Growing data are thus more expensive and problematic to keep than in the past: disk cost decreases over time, but the cost to maintain the storage grows steeply, creating problems for companies faced with increasing storage requirements:
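The interaction of falling disk prices and growing data can be made concrete with a quick projection. The starting size, the 5x management multiplier and the 3x fail-safe hardware multiplier below are taken from the figures in the text; the four-year horizon and the flat 2004 disk price are illustrative assumptions:

```python
# Sketch: project raw vs. fully loaded storage cost as data doubles yearly.
raw_cost_per_tb = 1150      # US $ per TB of inexpensive disk (Figure 1, 2004)
failsafe_multiplier = 3     # fail-safe RAID hardware vs. inexpensive disk (assumed 3x)
mgmt_multiplier = 5         # cost to manage the storage (5x, per the text)

size_tb = 1                 # illustrative starting landscape size
for year in range(4):
    hardware = size_tb * raw_cost_per_tb * failsafe_multiplier
    managed = hardware * mgmt_multiplier
    print(f"year {year}: {size_tb:2d} TB, hardware ${hardware:,}, fully loaded ${managed:,}")
    size_tb *= 2            # storage requirements double every 12 months
```

Even with the per-TB disk price held constant, the fully loaded cost grows with the data; in reality the multipliers, not the raw disk, dominate the bill.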

  • It takes longer to restore systems when failure happens. As TB-plus databases become the norm, recovering from system failure grows increasingly difficult. Systems such as SAP that are paramount to a company's success need to be recovered quickly and efficiently, including the testing, training and quality assurance (QA) systems where new development takes place and production errors are resolved to ensure quality for the corporation.
  • The hard disk cost is just a fraction of the cost of adding new storage to a system. In fact, it costs almost as much to add hardware, such as RAID controllers and new network servers, as it does to buy the hard disk.

The size of an SAP landscape can grow at an alarming rate. Growth occurs first on the production system and is then propagated, via client and database copies, to several QA, training and development systems, so that a small increase in production data size becomes an increase of a TB or more across the landscape. At the current rate, storage requirements are predicted to double every 12 months. In light of this, every SAP customer has to choose between purchasing massive amounts of disk space and reducing the size of the nonproduction support systems.

In most cases, where minimal or no effort has been made to archive data, the production SAP system will have grown considerably since it went live. The production system is typically supported by a QA system, which in most cases is created by copying the production client. In certain cases, multiple clients are used for different testing purposes. A refresh copy is made from the production client periodically, because newer data are needed to adequately resolve production errors and test new applications. As the production database grows, these refresh copies become increasingly difficult to execute, so they happen less frequently. The lack of up-to-date data in the QA system causes shortfalls in the testing cycle that show up as production errors, with their attendant impact on operations and the bottom line, including cost of ownership.

Development systems, housing new development efforts, typically have many more clients to support different teams. In a normal SAP landscape, development systems are old copies of the production systems; they are refreshed very infrequently and, consequently, have the poorest data in the landscape. The poor data surface during unit testing by the development team. Programs and reports that are poorly tested against old and redundant data get transported to QA for testing, only for problems to be discovered there. The work then needs to be fixed in development and transported back to QA, and the problem compounds if QA itself lacks current data. The spiral effect is that problems reach production, resulting in emergency situations and business losses due to the breakdown of production activities. Add to this other systems used to support production, such as a dedicated training system, which tend to be production-oriented. These systems may contain, on average, seven years' worth of data from production. Are all those years of SAP data needed to adequately test, develop, train or support the environment?

Typically, companies need the last three years of SAP data. These are the data that will be most relevant to current activities in nonproduction scenarios. Data older than three years are rarely looked at and may be useless for testing purposes, as changes in business rules may mean the older data configuration does not fit the new scenarios.

More often than not, the solution adopted to resolve the issue of growing databases is to add more disk space. As discussed previously, disk space, even though it reduces in price over time, is not the issue. The issue is the total storage costs. Decreasing landscape size is one way to address rising costs.

Improving SAP Efficiency

To improve the SAP landscape efficiency and save disk space, there are only a few alternatives. The goal is an SAP environment that is smaller than the original production system but maintains all the necessary elements to make it complete and relevant.

SAP archiving is used by many companies to "slim down" production, reducing the footprint and increasing performance. It is meant to remove older data no longer needed in a live system, and it may also be used to slim down nonproduction systems. SAP identifies roughly 215 objects that may be removed from a system during this process, which includes saving these objects to flat files in other systems or on content servers. When archiving is applied to nonproduction systems, these flat files are not needed, because the data they contain are duplicates of the data in production.

Each object identified for archiving has to be selected individually and run through a three-step process for complete deletion:

  1. Identify the objects to be deleted based on the space they occupy in the system.
  2. Schedule an archiving job, which creates the flat file.
  3. Delete the data from the database.
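The identify/archive/delete sequence above can be sketched as follows. This is a deliberately simplified illustration: the dictionary stands in for the database, the object names and file format are hypothetical, and real SAP archiving works through transaction-specific archiving objects rather than plain JSON files:

```python
import json
import os
import tempfile

# Hypothetical "database": archiving object name -> the rows it owns.
database = {
    "FI_DOCUMNT": [{"doc": 1}, {"doc": 2}],  # financial documents (illustrative)
    "MM_EKKO":    [{"po": 9}],               # purchasing documents (illustrative)
}

def identify_objects(db, min_rows=2):
    # Step 1: select objects by the space (here: row count) they occupy.
    return [name for name, rows in db.items() if len(rows) >= min_rows]

def archive_object(db, name, directory):
    # Step 2: the archiving job writes the object's data to a flat file.
    path = os.path.join(directory, f"{name}.archive.json")
    with open(path, "w") as f:
        json.dump(db[name], f)
    return path

def delete_object(db, name):
    # Step 3: only after a successful archive is the data deleted.
    db.pop(name)

workdir = tempfile.mkdtemp()
for name in identify_objects(database):
    archive_object(database, name, workdir)
    delete_object(database, name)

print(sorted(database))   # objects below the size threshold remain
```

Note that each object goes through the full three-step cycle individually, which is exactly why the process requires the per-object analysis discussed below.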

This three-step process is not necessarily easy. Using SAP's archiving tools to create smaller nonproduction systems poses some challenges:

  1. Archiving needs to be set up and configured correctly.
  2. Jobs may fail frequently because of system checks that are vital when removing data from a production environment but less relevant when applied to data in nonproduction systems.
  3. Objects have to be identified and run one at a time, which requires an in-depth analysis of the system.

Balancing Needs and Cost

Several factors are driving new investments in data center technologies, including an increased demand for storage capacity, new data compliance regulations and a growing need among companies to make information accessible in new ways.

The key is striking the right balance among storage capacity needs, personnel requirements and system reliability to maximize the return on investment (ROI) in data center technologies. Companies that are most effective at mastering the ROI balancing act are able to reduce storage and personnel costs and limit downtime.

Companies can also reduce the costs associated with downtime by investing in new storage technology and increasing the resiliency of their infrastructure. To successfully measure ROI, there has to be a solid alliance between the business and IT sides of an organization. Bridging the cultural gap between these two entities can be a challenge because IT wants to provide the best solution, which is often time-consuming to deploy. On the other hand, the business side often wants faster, less costly results, which can lead to disastrous consequences in the long run.

The decreasing cost of communication bandwidth is a factor worth considering in the equation. Many companies wisely choose to locate the backup machine offsite—from a few miles to several hundred miles away—to improve the chances of continuous availability in the unlikely event of a site disaster at the location of the production machine. The good news is the cost of the communications bandwidth needed to remotely connect two machines has plummeted in recent years. Currently, each MB of communication bandwidth has an average cost of about US $50 per month, while four years ago each MB had an average cost of about US $800 to US $1,000 per month.

In conclusion, the cost associated with maintaining large SAP landscapes rapidly grows as production systems increase in size. IT directors and managers are well advised to look at solutions to rein in the total cost of ownership of their environments and improve the efficiency of their departments.

Yusuf Musaji, CISA, CISM, CPA, CISSP, CGA is the Director of Security at Epsilon Interactive Inc., a leading provider of multichannel marketing services, technologies and database solutions. Epsilon Interactive is an Alliance Data Company.

Yusuf Musaji’s areas of expertise include enterprise risk management, IT security and privacy, financial system development and implementation. He is widely published in IT, financial and security journals regarding IT/user relationships, and he has authored two books: Auditing and Security, AS/400, NT, UNIX, Networks and Disaster Recovery Plans (2001) and Auditing the Implementation and Operation of ERP Systems (2003).


Information Systems Control Journal, formerly the IS Audit & Control Journal, is published by the Information Systems Audit and Control Association, Inc. Membership in the association, a voluntary organization of persons interested in information systems (IS) auditing, control and security, entitles one to receive an annual subscription to the Information Systems Control Journal.

Opinions expressed in the Information Systems Control Journal represent the views of the authors and advertisers. They may differ from policies and official statements of the Information Systems Audit and Control Association and/or the IT Governance Institute® and their committees, and from opinions endorsed by authors' employers, or the editors of this Journal. Information Systems Control Journal does not attest to the originality of authors' content.

© Copyright 2006 by Information Systems Audit and Control Association Inc., formerly the EDP Auditors Association. All rights reserved. ISACA™ Information Systems Control Association™

Instructors are permitted to photocopy isolated articles for noncommercial classroom use without fee. For other copying, reprint or republication, permission must be obtained in writing from the association. Where necessary, permission is granted by the copyright owners for those registered with the Copyright Clearance Center (CCC), 27 Congress St., Salem, Mass. 01970, to photocopy articles owned by the Information Systems Audit and Control Association Inc., for a flat fee of US $2.50 per article plus 25¢ per page. Send payment to the CCC stating the ISSN (1526-7407), date, volume, and first and last page number of each article. Copying for other than personal use or internal reference, or of articles or columns not owned by the association without express permission of the association or the copyright owner is expressly prohibited.

www.isaca.org

INFORMATION SYSTEMS CONTROL JOURNAL, VOLUME 1, 2006