2017 is here, and cyberbreaches keep increasing, their impacts rippling ever further into business and personal life.
Are these threats too big to manage? Is cyberthreat management the ‘elephant in the room’?
Cyberresilience needs to be on the board agenda, yet too many boardrooms still prefer to manage the risk with the Ostrich Control, hoping it will go away. The problem is exacerbated by the fact that security budgets continue to grow while answers to how much to spend and what to target remain elusive.
Thinking about risk through the COBIT framework can be very helpful. Many technology risks that seem uncontrollable are not. Rather, the ostrich effect prevents partially effective technology controls from being applied at all. When layer after layer of an organization skips partial controls for lack of support, the compounding effect makes the sense of having no control a self-fulfilling prophecy.
It turns out that both frequency and impact can be reduced even for risks that are only partially controllable. Auto accidents cannot be fully prevented, yet bumpers, seat belts and air bags make a positive difference. In Cyber Security, the difference between the Ostrich and Excellent ends of risk management is roughly a factor of ten: if an Ostrich firm were attacked and experienced average loss rates near 12% of annual revenue, a firm of equal size on the Excellent side could experience average loss rates near 1.2% of annual revenue. This effect comes from partial control alone, without any ability to control the attackers themselves.
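The factor-of-ten claim can be checked with back-of-the-envelope arithmetic. In this sketch the revenue figure is a hypothetical assumption; the two loss rates come from the text:

```python
# Compare average annual cyber loss at the two ends of risk management,
# using the loss rates from the text. The revenue figure is illustrative.
ANNUAL_REVENUE = 50_000_000   # hypothetical firm revenue in USD

OSTRICH_LOSS_RATE = 0.12      # ~12% of annual revenue (Ostrich end)
EXCELLENT_LOSS_RATE = 0.012   # ~1.2% of annual revenue (Excellent end)

ostrich_loss = ANNUAL_REVENUE * OSTRICH_LOSS_RATE
excellent_loss = ANNUAL_REVENUE * EXCELLENT_LOSS_RATE

print(f"Ostrich firm average loss:   ${ostrich_loss:,.0f}")
print(f"Excellent firm average loss: ${excellent_loss:,.0f}")
print(f"Ratio: {ostrich_loss / excellent_loss:.0f}x")
```

The same revenue scaled by the two rates makes the multiplier explicit: the firm's size does not change, only the fraction of it lost.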
Quantifying Cyber Security risk can also seem a daunting task, and that perplexity again favors a more loss-prone “Ostrich” response. It can be helpful to think of Cyber Security as a special form of quality assurance. The measure of quality is the wanted flow of information and the value-added processing it enables; knowing what a business wants to do with information, and how fast its processing makes money, leads to quality assurance decisions. Unwanted leaks of data and unwanted delays in processing are quality defects. The defect rate corresponds to frequency; the defect damage corresponds to impact.
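Framed this way, expected annual loss reduces to the familiar frequency-times-impact product. A minimal sketch, with illustrative incident numbers:

```python
# Expected annual loss in quality-assurance terms:
# defect rate (frequency of unwanted leaks or delays per year)
# multiplied by defect damage (impact per incident).
def expected_annual_loss(defect_rate_per_year: float,
                         damage_per_defect: float) -> float:
    """Frequency x impact, the standard risk-quantification product."""
    return defect_rate_per_year * damage_per_defect

# Illustrative figures: 4 incidents per year, $150k average damage each.
print(expected_annual_loss(4, 150_000))  # 600000
```

Either factor can be attacked separately: controls that cut the defect rate and controls that cap the damage per defect both shrink the same product.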
Scaling Cyber Security risk to a firm also turns out to be perplexing: some firms get attacked more often even though they have excellent security. From quality assurance we learn that while all models are wrong, some are useful. The following model is only partially right: it assumes that all fraud in the USA comes from a cyber attack on a computer, and that all sensitive data in a firm is spread evenly across every computer, neither of which is reliably true. Even so, it is a very useful way to scale Cyber Security frequency and impact questions to the size of a firm. I like to call it the 0.01% model.
The average firm has about 3.5 computers per staff member. In front of you is a laptop, tablet, desktop or smartphone; next to it a corporate IP phone, which is also a computer; behind these sit a server, network switch, Internet connection, firewall and more. (In a household, by contrast, there are about 1.2 computers per person.) The model then works as follows:
1. Count the total number of computers in your firm and multiply by 0.01% to estimate the average number of computers that are or will be cyber attacked in a year.
2. Count the average number of customer data records handled per staff member in one year; this is likely the number of records involved per attacked computer.
3. Estimate the average worth of each record to the firm.
4. Multiply computers attacked per year by a year's worth of records per staff member, then by the average worth of each record.
The result is a model of your firm's average Cyber Security losses per year. Larger firms get attacked more often because they have more computers to attack. Some firms deal in more damaging records than others, so their losses can be bigger. Other firms play the Ostrich and suffer a higher loss per computer than excellent firms do.
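The steps above can be sketched as a short function. The 3.5 computers-per-staff and 0.01% attack-rate ratios come from the text; the staff count, records per staff, and record value in the usage example are hypothetical:

```python
# Sketch of the "0.01% model" described above.
COMPUTERS_PER_STAFF = 3.5  # average computers per staff member (from the text)
ATTACK_RATE = 0.0001       # 0.01% of computers attacked per year (from the text)

def average_annual_cyber_loss(staff_count: int,
                              records_per_staff_per_year: float,
                              value_per_record: float) -> float:
    computers = staff_count * COMPUTERS_PER_STAFF
    computers_attacked = computers * ATTACK_RATE
    # Model assumption: each attacked computer exposes about one year's
    # worth of records handled per staff member.
    records_exposed = computers_attacked * records_per_staff_per_year
    return records_exposed * value_per_record

# Hypothetical firm: 10,000 staff, 500 records/staff/year, $150 per record.
loss = average_annual_cyber_loss(10_000, 500, 150)
print(f"Estimated average annual loss: ${loss:,.0f}")
```

For this hypothetical firm the model gives 35,000 computers, 3.5 of them attacked per year, for an average annual loss of about $262,500. Doubling the staff doubles the computers and so doubles the estimate, which is exactly the scaling behavior the text describes.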
With a simple model, wrong though it may be, much of the perplexity of Cyber Security can be reduced, and proportionate action can be taken based on one very useful insight: Cyber Security frequency and impact scale with the number of computers a firm has. A model like this is the starting justification for measuring attacks and tracking the flow of wanted information, which in turn justifies the tools needed to measure and build a better model.