The fact is, new vulnerabilities come to light every day. Unfortunately, staying ahead of these new vulnerabilities, or otherwise addressing them promptly, has proven to be incredibly difficult (not to mention costly). The good news is, not all vulnerabilities impact every organization. But, for vulnerabilities that do apply, it is often difficult to make risk-based decisions to address them – do we mitigate, avoid, transfer, or accept them?
These decisions become a great deal easier when organizations include the likelihood of an exploit along with a vulnerability's impact as risk analysis inputs. In these cases, impact is often relatively straightforward. For example, we might consider legal, strategic, financial, operational, or reputational impacts or, as the Common Vulnerability Scoring System (CVSS) does, we might consider impact to classic objectives like confidentiality, integrity and availability.
Likelihood seems softer than impact and, as a result, we might think it is harder to determine. To get there, we have to think about the threats that could take advantage of a vulnerability. To exploit a vulnerability, there first must be a related threat. As it turns out, CVSS has sorted out quantifying likelihood by prompting for easier-to-answer questions like the origin of a threat, the difficulty of an exploit and the need for a victim's involvement. One of the common shortcomings with vulnerability management processes is in their often-limited understanding of applicable threats.
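Those easier-to-answer questions map directly onto numbers. As a rough illustration (not ISACA guidance), the sketch below combines the published CVSS v3.1 base metric weights for attack vector, attack complexity, privileges required and user interaction (scope unchanged) into the exploitability sub-score; the function and variable names are ours.

```python
# Illustrative sketch: the CVSS v3.1 exploitability sub-score, built from
# the specification's published metric weights (base score, scope unchanged).
ATTACK_VECTOR = {"network": 0.85, "adjacent": 0.62, "local": 0.55, "physical": 0.2}
ATTACK_COMPLEXITY = {"low": 0.77, "high": 0.44}
PRIVILEGES_REQUIRED = {"none": 0.85, "low": 0.62, "high": 0.27}
USER_INTERACTION = {"none": 0.85, "required": 0.62}

def exploitability(av, ac, pr, ui):
    """Combine the four 'how hard is the exploit?' answers into one number."""
    return (8.22 * ATTACK_VECTOR[av] * ATTACK_COMPLEXITY[ac]
                 * PRIVILEGES_REQUIRED[pr] * USER_INTERACTION[ui])

# A remotely reachable, low-complexity flaw needing no privileges and no
# victim involvement sits near the sub-score's maximum of roughly 3.9 ...
worst = exploitability("network", "low", "none", "none")

# ... while a local, high-complexity flaw needing privileges and a willing
# victim scores far lower, before impact is even considered.
best = exploitability("local", "high", "low", "required")
```

The point is not the arithmetic but the decomposition: each "soft" likelihood judgment becomes a small, answerable question with a defined weight.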
So, what is a threat?
We think of a threat scenario as a threat agent acting against a target to accomplish an objective. For example, a hacker targeting an e-commerce website to steal credit card data. A vulnerability creates a point of entry through which the attacker can reach the target. In a more complex attack, a hacker might work through a series of layers, exploiting various vulnerabilities along the way.
We worry about threats from thieves, hackers, malware and ransomware, social engineers and phishers, and natural disasters. However, the definition of a threat can encompass more than these common sources. For example, an organization might view regulatory compliance as a threat. After all, an audit can have a significant impact – fines and penalties.
Why does understanding threats matter?
Regardless of your organization, addressing vulnerabilities is a business decision. As with any other business decision, risk and cost are a factor. Understanding a vulnerability in the context of the threats that might exploit it makes it easier to plan a course of action and prioritize your response.
Editor’s note: For more on this topic, download ISACA’s new white paper on vulnerability assessment.
When I finished my proof-of-concept presentation to the CIO of a prospective client at a recent meeting, he was more than surprised – he was upset. He almost yelled at me: “How did you do it?”
For my demo, my client had to complete a paper application form used by his company’s sales force. He needed to do this by hand, as would any customer, but using a digital pen equipped not only with an ordinary ink cartridge, but also with a micro-camera that captured each trace of the pen on the paper. When he had finished the application, he checked one box at the end of it that read “Transmit.” While explaining the features of the digital pen, I opened my laptop and remotely connected to our demo server. From there, just a few seconds after he had completed the application, I could show him not only a high-quality scan of the completed application, but also all the data already translated into usable fields: numbers, dates, addresses, ready for ERP integration. He stood up in astonishment and asked: “How did you do it? How??”
This sounds like a nice example of a presentation that went so well that I took my audience completely by surprise with an emerging, unexpectedly beautiful technology. But the truth is, less than two years after launching our work with digital writing, we had to completely write off two years of work and investment put into an offering that appeared to be “The Next Big Thing.”
Talking about our digital transformation successes is always nice, but I would like to share five innovation facts that, from my experience, should be understood to avoid failing in this era where all of us are on the brink of launching The Next Big Thing, whether on top of blockchain, IoT, AI or machine learning technologies.
1. “Innovation Chasm” does exist. I am sure that many of you have seen the Technology Adoption Lifecycle graph that describes the Innovators, Early Adopters, etc. Well, in that graph, there is a chasm between being loved by technology fans and winning the growing majority of users that will make your product the next iPhone. In the case I described, we could not convince the intellectual property owners to simplify the pricing model quickly enough to build a minimum user base. Check your business model for scenarios where the chasm is bigger than anticipated.
2. Platforms and ecosystems matter. The possibilities of emerging technologies are immense but decisions need to be made in relation to the platform or ecosystem you want to belong to or create for others. No one cares for a solution that cannot integrate and evolve for future needs. Our digital writing offering did use industry standards like XML or GMS but relied heavily on proprietary technology within the core product.
3. The “Innovator’s Dilemma” is real. Professor Clayton Christensen has said that companies are designed for the status quo, and innovation efforts are killed by design. That is, although companies may not say it, they do not really want to disrupt themselves. So, your presentation to whoever approves your innovation effort needs to avoid a collision trajectory and instead explain the complementary nature of the business and customer bases that you are bringing to the table.
4. Being a maverick is cool, but … In the end, a successful launch of an emerging technology needs to be on good terms with the leading powers that will put your product in front of users. It needs to integrate seamlessly with dominant social platforms as well as with online and app stores, and be designed to quickly open its features to the newcomers that will play a dominant role in your marketplace. That is why you see such collaboration among companies that otherwise would be rivals to create the future ecosystems for blockchain, machine learning, etc.
5. ITBMS! I have a blog post called It’s the Business Model, Stupid. We have seen for several years that, in the end, all successful technology companies have managed to build a credible business model that turns around years of losses (sorry, capital investments) by creating value for an ever-growing number of users. So, be bold in pursuing your dreams for a better world, but keep close the friends who can make sense of it all in terms of a sustainable, long-term business model.
Author’s note: Jose Angel Arias has started and led several technology and business consulting companies over his 30-year career. In addition to having been an angel investor himself, as head of Grupo Consult, he participated in TechBA’s business acceleration programs in Austin and Madrid. He transitioned his career to lead the Global Innovation Group in Softtek for four years. He is currently technology audit director with a global financial services company. He has been a member of ISACA and a Certified Information Systems Auditor (CISA) since 2003.
Employees are at their best when they are encouraged to take calculated risks, rather than becoming complacent with what they know and what has become comfortable. The same holds true for enterprises.
Some of the best risks enterprises can take in our technology-driven business landscape involve deploying transformative technologies that allow them to connect with customers in new and innovative ways. Yet, in many cases, organizations are failing to capitalize on the widening array of opportunities.
ISACA’s new Digital Transformation Barometer research shows that only 31% of organizations frequently evaluate opportunities arising from emerging technology. Given the swift pace with which technology is introduced and refined, this shows that most enterprises are undercutting their ability to seize marketplace opportunities and better serve their customers.
Boards of directors and the C-suite should be challenging their operational teams to research, pilot and ultimately become experts in emerging technologies capable of transforming their enterprises. Big data, artificial intelligence, Internet of Things devices and blockchain are just a few examples of technologies capable of delivering transformational change. To lead effectively, senior leaders have to be able to articulate the future vision for their companies in the context of the technologies that will get them there.
There isn’t a board chair or CEO on the planet who would not be thrilled to open new revenue streams or reach new customers – some of the top motivators for pursuing digital transformation. So, what is holding so many organizations back? A shortage of digitally fluent leaders is one impediment. Only a little more than half of survey respondents expressed confidence that their organizations’ leaders have a solid understanding of technology and its related benefits and risks. ISACA’s research shows that those organizations lacking digitally fluent leadership are less likely to evaluate technology opportunities.
Even those organizations that perform their due diligence in vetting new technologies often develop reservations once more is learned about the associated risks. A whopping 96% of survey respondents believe there is high or medium risk in deploying IoT devices, and more than 9 in 10 respondents also categorized public cloud and AI/machine learning/cognitive technology as posing medium to high risk.
The reality is that every new technology introduced expands the attack surface and presents new risks. Organizations must move beyond that inherent discomfort and devote the necessary resources to mitigate risk to acceptable levels. Enterprises with effective information and technology governance programs can deliver better customer experiences, innovate more, and improve their business performance and profitability. Investing in well-trained, highly skilled professionals in areas such as audit, risk, governance and cyber security can give enterprises the confidence they need to effectively and securely leverage their technology. Organizations should also resist the urge to take shortcuts in pilot testing or research and development when evaluating new technologies.
It’s important to have realistic expectations about digital transformation. Not every turn of the wheel on an enterprise’s journey can be a smashing success, and organizational leaders must give their team members the freedom to take a well-reasoned risk that may – or may not – yield the anticipated results. Those failures can provide unparalleled learning opportunities.
Organizations that remain committed to digital transformation can reap great rewards. From telecommunications giant Sprint tapping into big data, to a town in North Carolina, USA, shedding the yoke of legacy applications, there is no shortage of examples of enterprises large and small successfully harnessing digital transformation.
As the Latin proverb goes, fortune favors the bold. Enterprise leaders should embrace that mindset and make digital transformation a centerpiece of their organizations’ roadmaps toward a prosperous future.
Emerging technologies – such as machine learning, artificial intelligence (AI), blockchain, Internet of Things (IoT), augmented reality, and 3-D printing – are swiftly disrupting several industries. To paraphrase Klaus Schwab, co-founder of the World Economic Forum, these mind-boggling innovations are redefining humanity, pushing the thresholds of lifespan, health, cognition, and capabilities in ways previously considered the preserve of science fiction.
The possibilities presented by digital transformation are indeed captivating, and the applications are as varied as the organizations putting them to use. Sensors attached to jet engines transmit signals mid-flight, enabling airlines to promptly detect sub-optimal performance and conduct pre-emptive maintenance, boosting safety and minimizing downtime. Physicians are replicating flesh and bones using 3-D technology to simulate high-risk surgical operations, lifting patients’ confidence and shortening their anaesthesia durations. Meanwhile, blockchain – an open source, distributed ledger of everything – is being used to develop self-executing contracts, eliminating record labels and enabling artists to interact directly with consumers, maximizing the rewards of their ingenuity.
The benefits of digital transformation are unquestionable, but enterprises must manage these programs carefully. Here are three key recommendations:
Drive cultural change
Digital transformation transcends IT – it’s an enterprise-wide matter that requires unwavering commitment from the C-suite to front-line staff. To succeed, enterprises must place cultural change, not technology, at the core of their strategies. This requires eliminating unnecessary barriers to innovation, agility and change that exist within organizations, including breaking down functional silos and revising bureaucratic governance structures. As Jeffrey R. Immelt, CEO of General Electric, said, “You can’t have a transformation without revamping the culture and the established ways of doing things.”
Leadership from the top is essential to establish vision, institute appropriate governance structures and drive cultural change during any major change, and digital transformation is no exception. Executive messages must be clear and consistent, persuading employees that creating a nimbler enterprise that can swiftly respond to market needs is an existential matter; status quo is untenable. This fosters an environment of trust and spurs employee engagement, prerequisites for success.
On the contrary, inconsistent messages fuel doubts, leading employees to work in silos and resent change. This risk looms large when transformation is perceived as a threat to people’s jobs. Consistent with this view, the majority of respondents to ISACA’s Digital Transformation Barometer rated AI and public cloud as the top candidates to face organizational resistance. While initial reservations about public cloud are waning, migration efforts and radical process changes can still pose significant organizational challenges.
Don’t treat security as an afterthought

In the race to keep up with competitors, enterprises often place a disproportionate emphasis on the pace of transformation. Security and infrastructure considerations become afterthoughts, and such missteps can have lasting business repercussions.
Emerging technologies are exerting enormous pressure on traditional security models. For instance, billions of IoT devices with glaring vulnerabilities are integrating with critical infrastructure, creating numerous backdoors for malefactors to exploit. Cloud is enabling employees to bypass IT governance processes and export volumes of sensitive data to unsanctioned environments, aggravating the enduring shadow IT problem. At the same time, location-based applications collect troves of personal data, raising safety and privacy concerns. Each emerging technology presents new security issues, many of which have not been sufficiently evaluated nor understood.
To thrive, businesses need to make security an inescapable facet of digital transformation programs, considering implications early during business case evaluations. Enterprises also must have a nuanced understanding of each technology, carefully balancing pace of adoption, security and convenience. Traditional one-size-fits-all models don’t cut it anymore. Securing an implanted cardiac pacemaker that can resuscitate a faltering heart, for example, requires more rigor when compared to securing a wearable device that tracks steps.
As this revolution unfolds, several jurisdictions are also tightening privacy laws. For instance, the EU’s General Data Protection Regulation (GDPR) will impose fines of up to €20 million or up to 4% of annual worldwide turnover, whichever is greater. Businesses must have a strong grasp of applicable privacy laws to ensure compliance and retain customers’ trust.
Consider the impact of legacy applications
As digitization gains pace, several enterprises find themselves saddled with jumbles of complex, aged and proprietary applications, referred to as “legacy spaghetti.” Several of these decades-old digital workhorses have developed a reputation for reliability and still underpin vital operations. But they can also be daunting obstacles to digital transformation. Specifically, they are not designed for the flexibility, speed and performance demanded by today’s digital enterprise. Furthermore, they often lack well-defined interfaces, sufficient documentation and available subject matter experts.
To manage this risk, business leaders should ask the following questions:
- Which legacy applications can be cost-effectively modernized as part of the transformation program?
- Which applications must remain untouched to mitigate risks to the stability of core operations?
- Which skillsets are required to seamlessly integrate novel applications with existing infrastructure and support mission-critical applications that cannot be feasibly decommissioned?
An effective digital transformation strategy, therefore, carefully balances the need to rejuvenate customer experiences with the steadiness of core processes. Neither can be dealt with in isolation.
This wave of digital transformation calls for enterprises to deeply rethink their strategies. Those that stick their heads in the sand may soon be irrelevant to their customers.
About the authors
Phil Zongo is head of cyber security for an Australian investment management firm. He is the 2016-17 winner of ISACA’s Michael Cangemi Best Book/Article Award, a global award that recognizes individuals for major contributions to publications in the field of IS audit, control and/or security. Phil has more than 13 years of technology risk consulting experience, advising executives on how to manage critical risks in complex technology transformation programs across multiple industries.
Natasha Barnes, CISA, is a manager with a global consulting firm, based in the Washington D.C. metro area. She has provided IT risk and compliance consulting services within both public and private sectors for more than seven years. Natasha helps her clients to optimize their control environments and address evolving cyber security challenges. Natasha is also a member of ISACA and a career coach with Careerly, where she mentors aspiring cyber security professionals by providing students with practical guidance to make informed career decisions.
This is a story about researching a simple question: Why are there so many vulnerabilities in information systems? One answer that might strike a chord with ISACA members is: “failure to listen to experts.”
Many of us have spent years advising companies to adhere to the principles of security by design and privacy by design, yet some still ship products with holes in them, vulnerabilities that leak sensitive data or act as a conduit to unauthorized system access. We’ve been teaching cyber-hygiene to end users since before it was called that, and we’ve all encountered organizations that don’t listen to our warnings about the risks inherent in their deployment of digital technologies.
But why do some people not listen to experts? I decided to study this question with help from my research colleague at ESET, Lysa Myers. We found an established body of research that examines the way people perceive risk and explores the ways in which risk communication can become more effective. Many of these studies centered on the rejection of warnings about risks inherent in successive waves of technology. For example, some were funded back when people argued about the risks from nuclear power and radioactive waste disposal. More recent research has explored why so many people don’t heed the warnings of climatologists.
Many studies used survey questions phrased like this: “How much risk do you believe [this hazard] poses to human health, safety, or prosperity?,” where this hazard might be global warming, genetically modified foods, and so on. Responses to these questions revealed interesting patterns when subjected to demographic analysis, particularly when that analysis included profiles derived from the cultural theory of risk perception (CT for short). According to this theory, we tend to perceive risk in a way that affirms our understanding of social structures and our place within them.
People who see society as a hierarchy of individuals rather than as a community of equals typically rate global warming less risky than folks who are more egalitarian and communitarian. Studies also found that, as a group, white males rated risks from a variety of technologies lower than white females, non-white males, and non-white females. Dubbed “the white male effect” by researchers who first observed it in 1994, this phenomenon appears to be caused by a subset of white males drastically under-rating risk relative to the mean (these men are predominantly hierarchical individualists with above average education and income).
What we didn’t find in our literature review was comparable surveying around risks arising from digital technology, so we conducted our own. We mixed six digital hazards in with nine risks unrelated to information systems, like air pollution. Using Survey Monkey, we polled more than 700 adults in the US. Our first surprise when analyzing responses was that “criminals hacking into computer systems” rated higher than any other risk, ahead of air pollution and hazardous waste disposal. A second digital hazard, theft or exposure of private data, rounded out the top four.
These results suggest that a significant portion of the American public now “gets” that digital technology brings serious risks, but what did our survey tell us about communicating with those who don’t “get” it? We did find a white male effect in our sample, but it was less pronounced for digital risks. The cultural alignment of respondents followed earlier studies for global warming, but looked quite different for digital risks. That tells me there is more work to do in this field, but we can improve our risk communication skills by learning from the work of those studying how cultural theory informs the science of science communication.
I encourage you to read Dan Kahan’s articles on this at CulturalCognition.net, and hope to see more people studying why the advice of information security experts is not universally embraced.
For more of our results, see our slides on SlideShare: https://www.slideshare.net/secret/j6a7vyrtlEgzOf.
Microsoft: More than 80 percent of employees admit to using unapproved SaaS apps for corporate purposes.
Cisco: The number of cloud services purchased by employees without IT involvement is 15 to 25 times the number known to IT.
These are just two examples of the quiet, but pervasive, existence of shadow IT in enterprises today. Although the name “shadow IT” sounds like something that might appear in an espionage novel, it is very real and very alarming, as we discovered in gathering material to write ISACA’s new white paper, Shadow IT Primer. We interviewed business and technology professionals whose responsibilities include IT operations, audit and security, and who deal with shadow IT on a regular basis. Their insights and real-world examples give the ISACA publication a perspective that is not reflected in other articles on the topic.
Shadow IT can be defined as applications and services that are used within an enterprise without having been reviewed, tested, approved, implemented or secured by the enterprise’s IT and/or information security function. Or, as one of the professionals interviewed put it: If you want to know what specific and timely functionality employees need but your enterprise is not currently providing, take a look at the shadow IT discovered in your business.
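That advice – take a look at the shadow IT discovered in your business – can be made concrete. The sketch below, a toy example with invented domain names rather than anything from ISACA's paper, compares SaaS domains observed in outbound proxy logs against the list of applications IT has approved.

```python
# Hypothetical data: SaaS domains seen in outbound proxy logs, and the
# list of applications the IT function has actually reviewed and approved.
observed_domains = {
    "approved-crm.example.com",
    "filesharing.example.net",
    "notes-app.example.org",
}
approved = {"approved-crm.example.com"}

def find_shadow_it(observed, approved):
    """Return the services in use that the IT function never vetted."""
    return sorted(observed - approved)

for domain in find_shadow_it(observed_domains, approved):
    print("unapproved service in use:", domain)
```

Each unapproved domain is, in effect, an employee telling you about a functionality gap – which is exactly why the interviewee framed shadow IT as a requirements signal rather than only a threat.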
Employees are at the heart of shadow IT – well-meaning, innovative employees. They want to do a good job but are hindered by a lack (or lack of awareness) of the tools they need to do so. They are drawn to shadow IT’s usefulness and the fact that they can generally acquire it and start using it in minutes, skipping the IT department’s vetting process.
This seems fairly innocuous, so why do enterprises care about shadow IT? Because those applications can enable significant data breaches, which may result in substantial financial loss. In addition to the obvious security risk, the threats associated with shadow IT include regulatory noncompliance, inadequate or unenforced policies, and reputational damage.
Many organizations have found that a range of approaches to address the risk is more effective than a single solution. A few of the controls used by the professionals interviewed for ISACA’s publication include:
- A shadow IT policy that outlines expected behaviors
- Transitioning the IT department from detection and punishment to acceptance and protection
- Using IT budgeting and procurement controls to shut down unapproved purchases
- Restricting users’ ability to freely install applications
- Educating users about the potential risk of shadow IT and the existence of an approval process
In ISACA’s white paper, these controls, and others, are fleshed out with implementation criteria and assessment methods.
Control does not necessarily equate to elimination of risk. In fact, many organizations are taking an “embrace” rather than “eliminate” approach to shadow IT. Of course, sometimes it is necessary to pull the plug. No matter how beneficial an application may appear, if it shows potential to harm the enterprise, it must be shut down immediately. The risk is too great to do otherwise.
But, even in an “eliminate” situation, there is room to “embrace” as well. A progressive approach entails realizing that, although a particular application needs to be dismantled, there is benefit in considering the problem the application is attempting to solve and empowering the IT function to find or build a safe and secure replacement – right away.
It is reasonable to assume that every enterprise contains shadow IT, given the ease and relative affordability of acquiring it, coupled with employees’ desire to fill needs or leverage opportunities with minimal delay. Savvy enterprises recognize this and mine the potential benefits, while managing the associated risk.
Loss of massive amounts of critical data in one sweep. A network hacked through a mouse. Malware easily introduced into the environment. A mechanism for a bad actor to remotely control your environment.
Are these items that could have an adverse effect on your organization?
Cyber security has become an important focus for companies in today’s environment. Large sums of money are spent each day to ensure that a company’s most vulnerable assets are secure. Companies are buying pieces of software/hardware, hiring new employees or procuring the assistance of consultants to accomplish this. The main theme of securing company environments is to protect valuable information from getting into the wrong hands.
Throughout my career, I have performed multiple audits, risk assessments and reviews of IT landscapes. An easy first step to keeping your company safe, and one that I often suggest to my clients, is to turn off the ability to connect mass storage devices via USB ports. This will prevent employees and others from removing large amounts of data from the company. In addition to sensitive information being intentionally transferred outside the company, USB drives are small, often misplaced or lost, and can easily end up in the wrong hands.
Conversations around this topic usually resemble the following:
Turning it off
There are many items I bring to attention during customer engagements that require large-scale process changes, budget increases, time commitments or the addition of FTEs to accomplish. Turning off the ability to connect a mass storage device via a USB port is not one of them. Most companies have some sort of shared drive to store files. Instead of saving files to a USB stick or USB mass storage drive, how about using the solution already in place and encouraging employees to share the file path internally? By using the shared drive, data can be secured via roles, and log files can track activity.
Turning off the ability to connect a mass storage device does not hinder the ability to use the USB port for charging or to use a wireless mouse. Configurations can be set to allow those activities while still disallowing data from being written to a mass storage device.
When USB is a must
I have heard the rebuttal of “We have applications that require a USB drive.” This is sometimes a true statement, but not common. A solution to this is to implement a process to address exceptions. This process should be similar to obtaining access to an application, requiring approvals from the manager and application owner. Once access is approved, the user would receive an encrypted USB stick that is passphrase-protected, providing the ability to continuously monitor the usage. Instead of 100% of employees having the ability to use mass storage devices on the company’s network, the threat landscape is reduced significantly.
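The exception process described above can be sketched as a simple check at connect time. This is an illustration only; the register format and field names are hypothetical, and a real deployment would enforce this through endpoint management tooling rather than a script.

```python
# Hypothetical exception register: users whose encrypted, passphrase-protected
# sticks were approved by both their manager and the application owner.
exception_register = {
    "jdoe":   {"manager_approved": True, "app_owner_approved": True,  "device_encrypted": True},
    "asmith": {"manager_approved": True, "app_owner_approved": False, "device_encrypted": True},
}

def may_use_mass_storage(user, register):
    """Allow a mass storage device only for fully approved, encrypted exceptions."""
    entry = register.get(user)
    return bool(entry) and all(entry.values())

print(may_use_mass_storage("jdoe", exception_register))    # fully approved exception
print(may_use_mass_storage("asmith", exception_register))  # missing an approval
print(may_use_mass_storage("guest", exception_register))   # no exception on file
```

The design choice is deny-by-default: anyone not in the register, or missing any single approval, is refused, which is what shrinks the threat landscape from 100% of employees to a monitored handful.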
When you allow the use of a USB mass storage device, you are allowing the potential for a virus to be introduced into your environment. Employees often use these devices on their home computers, which may not have the same protection as the company’s computers. To reduce these risks, configure your anti-virus software to require a scan of any device plugged into the USB port before it is usable. However, we recommend not allowing mass storage devices at all.
Managing the change with employees
When implementing this change, employees may be upset. Having been part of an organization that has successfully implemented this change, I can say from experience that the shock will subside quickly. Most people don’t like change, but if the reasons are explained and they know they can still charge their phones and use their favorite USB mouse, they may move on quickly.
I have also heard people say, “When we tell people this is coming, employees will connect a USB drive and take the data with them preemptively.” To that statement, consider the following:
- Create a policy stating why employees should not perform this function and the ramifications if they are caught.
- Companies need to implement this policy eventually, so why not now? At the very least, it will stop this behavior in the future.
Push-back on this issue also can come from executives. If management is not able to get this quick win for all employees, try different areas of the company where highly sensitive information is stored, such as:
- HR data
- Merger and acquisition data
- Intellectual property
- Customer data
The list goes on, and what is important to one company will differ from the next.
It is also argued that one could email sensitive files to a personal email account. However, that is limited to a smaller amount of data at a time versus the scale a mass storage device allows. Programs to monitor for this type of activity should be in place as well.
Mass storage devices have become inexpensive and store more data than ever before. As I write this post, a quick search on Amazon shows an external hard drive with 5 TB of space for $119! That could store quite a bit of data and cause great damage in the wrong hands.
So, what are you waiting for? Turn off the ability to connect USB mass storage devices; everybody wins.
Just a decade ago, as security professionals, we could reasonably talk about physical security and logical security requiring different approaches. Five years ago, we might have found ourselves discussing the blurring lines between the two disciplines, and could easily have pointed to aspects of physical and logical security that crossed over each other.
Today? In organizations that have embraced even the least cutting-edge aspects of operational and information technology advances (consumer IoT, industrial IoT, cloud-hosted services, etc.), we can no longer rationally discuss a strictly “physical” or “logical” approach to managing security risks to the enterprise.
Quite simply, in a world where:
- Every camera and door lock in a facility has an individual IP address
- All security investigations must happen in the real and virtual worlds at the same time
- Even the most visibly "physical" of protective measures – security officers – are networked via trackers and devices to provide instant information and communication
… there are few, if any, areas left that do not require attention to a holistic and comprehensive view of all security disciplines at once.
What does this mean for the personnel and management teams that are tasked with providing security in this borderless environment? How do we, as practitioners who may have long histories in a single discipline, protect the organization in a security environment where the risks and mitigation tactics have converged, regardless of whether our organizational structures have evolved to match them?
The answer: Enterprise Security Risk Management (ESRM).
ESRM is a risk management model that allows all functional areas tasked with mitigating security risk to operate under a converged philosophy and approach to more efficiently and effectively mitigate security risk across the enterprise, regardless of the physical or logical nature of the asset, or the vector of the potential threat.
Recognizing the Role
ESRM allows security personnel to work together to effectively protect the enterprise from a broad spectrum of security risks by first recognizing that it is the role of the security organization, at root, to manage security risk in conjunction with the business, and to protect assets from harm in line with business tolerance.
The tasks we perform to mitigate risks might differ, but the process is the same: identify the assets to be protected, recognize and prioritize the risks to those assets, and then mitigate those risks to within acceptable levels of business tolerance. Take a look at the table below, excerpted from the forthcoming book, Enterprise Security Risk Management: Concepts and Applications (Allen & Loyear, 2017). It shows a quick side-by-side of the kinds of tasks that security groups do, and how they are essentially mitigation responses to the same security risks.
Physical and Logical Security Risk Responses
The table pairs each driving security risk (“Because of This”) with the corresponding physical security response and logical security response. The responses listed include gates and fences, business continuity teams, cyber response teams, security gap remediation, security risk management, and business impact analysis and risk assessments (the last appearing in both columns).
The overarching risks cannot be effectively mitigated by only a single tactical function. Working together, under a common risk management framework, all security personnel can more effectively protect the enterprise environment against security risk.
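The shared process behind this convergence (identify the assets, prioritize the risks to them, mitigate to within business tolerance) can be sketched as a small, domain-agnostic risk register. The scoring model (likelihood times impact on 1-5 scales) and the tolerance threshold are illustrative assumptions for this sketch, not part of ESRM itself:

```python
# Illustrative sketch of a unified, ESRM-style risk register covering
# both physical and logical assets under one prioritization process.

def prioritize(risks, tolerance):
    """Score each risk and flag those above business tolerance."""
    scored = []
    for r in risks:
        score = r["likelihood"] * r["impact"]  # assumed 1-5 scales
        scored.append({**r, "score": score,
                       "needs_mitigation": score > tolerance})
    # Highest-scoring risks first, regardless of physical/logical domain
    return sorted(scored, key=lambda r: r["score"], reverse=True)

risks = [
    {"asset": "data center", "threat": "intrusion",  "domain": "physical",
     "likelihood": 2, "impact": 5},
    {"asset": "customer DB", "threat": "ransomware", "domain": "logical",
     "likelihood": 4, "impact": 5},
    {"asset": "badge system", "threat": "spoofing",  "domain": "physical",
     "likelihood": 3, "impact": 3},
]

register = prioritize(risks, tolerance=9)
for r in register:
    print(r["asset"], r["score"], r["needs_mitigation"])
# → customer DB 20 True / data center 10 True / badge system 9 False
```

The point of the sketch is that one ranking covers a gate, a database and a badge system alike; which team executes the mitigation is an implementation detail, not a separate risk process.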
The Benefits of ESRM and Cross-Functional Risk Management Collaboration
Managing all security risks in partnership and under a common ESRM approach can bring the enterprise significant gains in efficiency and effectiveness, even with multiple groups participating in the security partnership. A few to note include:
- Unified security awareness messaging
- A partnership approach under an ESRM philosophy allows for the creation of a single, unified security message that includes all facets of security awareness.
- Single security point-of-contact
- When all security teams operate under the risk-management approach with the same defined processes, any security incident can be reported to a single point in the company and escalated and directed as needed to the appropriate response team.
- Operational efficiency
- Employees with different skill sets can more easily collaborate on incident response processes.
- Information sharing enables cross-department cooperation during security investigations that require both physical and logical forensics.
- Streamlined processes save hours and money, allowing diverse security risks to be managed by a single process.
- Consolidated metrics reporting to business management saves time and effort.
- Optimized risk profile
- All security risks are identified and managed in an overarching program, making the risk identification and mitigation process more robust and decreasing the potential of overlooked risk.
How Do We Get There?
So, how do we get to the point of converging under a common philosophy, regardless of reporting lines and department structures?
All leaders in the organization with any security responsibilities can align with a risk-management approach by asking themselves:
- Does my team have clear risk management goals aligned with business risk tolerance?
- Does my team work with other department stakeholders in the risk decision-making process?
- Do the members of my team work together with other security teams in situations that cross boundaries of scope?
- Am I communicating to all areas of the business that my role, and the role of all other security teams, is to manage security risks holistically?
When all the security functions in the enterprise choose to embrace a risk management – ESRM – approach, the outcome is that:
- All security teams follow a formal and consistent process for security risk decision-making.
- All security teams follow the same incident response approach, including postmortem investigations and root cause analysis to continually improve the security risk situation of the enterprise.
- All security teams work in partnership with one another, ensuring open communications and collaboration across department lines.
- All security teams have the transparency, independence, authority and scope needed to do their work in the right way.
- All security risks, no matter which team mitigates the risks, are considered part of the holistic security risk management program.
- All security teams, no matter who they report to, understand that security risk management is everyone’s role.
The blockchain’s distributed ledger paradigm is serving as the foundation for some forms of digital transformation, including the use of cryptographic virtual currencies (VCs) such as Bitcoin. These virtual currencies are actively used around the globe, both within and outside the formal economies of countries, with important financial implications, including increased economic disintermediation, financial inclusion, and extended digital pseudo-ecosystems that combine people, business entities and a new generation of smart, connected components.
Not only is the whole fintech industry being substantially disrupted by this paradigm, thanks to the ability to move money in a decentralized, secure, peer-to-peer model, but virtually every other industry stands to replace often-bureaucratic procedures with more automated, smarter business practices.
During recent years, global organizations including the United Nations system, Multilateral Development Banks (MDBs), International Financial Institutions (IFIs) and the World Economic Forum have been actively engaged, in their respective roles, in trying to measure the impact of this paradigm on the societies and economies of the world.
The World Economic Forum, through its intellectual debate about the Fourth Industrial/Digital Revolution, as well as one of its Global Future Councils focused on the “Future of Blockchain,” has been vocal and active on the topic, stating that “blockchain is more than just moving money. It has the potential to transform our lives, and to make the world a more efficient, frictionless place. The number of people around the world living in either broken systems or entirely corrupt systems is staggering. If done right, blockchain could positively reform entire systems.”
In January 2016, the International Monetary Fund released a first-of-its-kind professional paper called “Virtual Currencies and Beyond: Initial Considerations.” This so-called staff discussion note gave serious consideration to how new technologies are driving transformational changes in the global economy, including the emerging utilization of virtual currencies created as private-sector systems that, in many cases, facilitate peer-to-peer exchange, bypassing traditional central clearinghouses. The paper also notes that “VCs offer many potential benefits, including greater speed and efficiency in making payments and transfers—particularly across borders––and ultimately promoting financial inclusion. At the same time, VCs pose considerable risks as potential vehicles for money laundering, terrorist financing, tax evasion and fraud.”
In a separate article, the IMF explores the topic of how “The Internet of Trust” is transforming the financial sector. Per its proponents, Bitcoin’s blockchain technology can be used to transform the financial sector fundamentally, for example by reducing the settlement time for securities transactions. With faster settlement, less money needs to be set aside to cover credit and settlement risks—just as collateral is not needed for a cash transaction.
The Inter-American Development Bank (IADB), the main regional development institution for Latin American and Caribbean countries, in March 2017 released the discussion paper “Digital Finance: New Times, New Challenges, New Opportunities,” explaining the financial implications of distributed ledger technologies applied in the region and around the world. The paper explains that “there is growing consensus in the financial services industry that distributed ledger technology (DLT), also known as blockchain, might just be the answer to the need of more efficient management of collateral [risks], resulting in more firms accessing credit, as well as … freeing up intermediaries’ capital for lending, and potential effects on SMEs’ direct and indirect access to multiple ways of credit.”
Now, coming back to the question of what implications and motivations this new paradigm may hold for our professional lives, I believe that a new generation of IT governance, oversight and assurance professionals is called to play an elevated role in future ecosystems, economies and societies.
As with other emerging topics such as the advanced application of artificial intelligence (AI), big data, cloud computing and the Internet of Things (IoT), this can occur only by providing the unprecedented level of verification and trust that stakeholders require to sustain a paradigm intended to be intrinsically resilient and secure: one that keeps distributed copies of the ledger worldwide, uses cryptographic proofs of data integrity and provides tamper-evident ledger entries.
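The tamper-evidence property described above can be illustrated with a minimal hash-chained ledger. This is a toy sketch (plain SHA-256 chaining with no consensus, signatures or distribution), not any production blockchain:

```python
# Toy hash-chained ledger: each entry commits to the previous entry's
# hash, so altering any past entry invalidates every link after it.
import hashlib
import json

def entry_hash(entry):
    """Deterministic SHA-256 over the entry's canonical JSON form."""
    payload = json.dumps(entry, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append(ledger, data):
    prev = ledger[-1]["hash"] if ledger else "0" * 64
    entry = {"data": data, "prev": prev}
    entry["hash"] = entry_hash({"data": data, "prev": prev})
    ledger.append(entry)
    return ledger

def verify(ledger):
    """Recompute every hash and back-link; False if anything was altered."""
    prev = "0" * 64
    for e in ledger:
        if e["prev"] != prev:
            return False
        if e["hash"] != entry_hash({"data": e["data"], "prev": e["prev"]}):
            return False
        prev = e["hash"]
    return True

ledger = []
append(ledger, {"from": "alice", "to": "bob", "amount": 5})
append(ledger, {"from": "bob", "to": "carol", "amount": 2})
assert verify(ledger)

# Tampering with an earlier entry breaks verification downstream.
ledger[0]["data"]["amount"] = 500
assert not verify(ledger)
```

Any party holding a copy of the ledger can run the same verification independently, which is exactly the kind of check an assurance professional would automate at scale.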
Extraordinary challenges and opportunities lie ahead for the millennial generation of assurance professionals, who will be called to provide both holistic and transactional assurance on increasingly complex digital ecosystems involving people, processes, systems and connected physical entities.
But the level of disruption to the assurance profession may not stop there. As another World Economic Forum report, “Here’s Why Robots Could be the Future of Finance,” pointed out, the traditional tasks of human audit work are also highly subject to substitution by artificial intelligence. Meanwhile, some audit tasks may be better assisted by this advanced application of technology. We, the auditors, will face the challenge of providing our stakeholders assurance that these algorithms are well designed, implemented, deployed and operating as expected.
In our profession, traditional auditing will remain necessary in many parts of the globe and in many traditional business environments for a while. No less importantly, however, a new generation of millennial auditors will need to raise the bar by providing increasingly complex assurance services in more agile business environments and in support of upcoming digital transformations. A different professional audit mindset and additional expertise will be required to satisfy the expectations of stakeholders and business owners in this new world.
Having had the privilege to have visited a number of cities throughout the world, I have learned that Chengdu is not Mexico City, Brussels is not Houston, Abuja is not Melbourne, and Johannesburg is not Dubai. That’s because the heart of every city beats differently. Each has its own character, its own vibe, and its own goals for assuring the best standard of living possible for its citizens and for the visiting public.
Likewise, every city is evolving at its own particular pace, though all are aligned to a common principle of modernizing their infrastructure services – public transportation, utilities, health care – by leveraging technology and law enforcement in “smart” ways to improve quality of life while assuring operational efficiency, stability and security. As noted by Eduardo Paes, the former mayor of Rio de Janeiro, “Smart cities are those who manage their resources efficiently. Traffic, public services and disaster response should be operated intelligently in order to minimize costs, reduce carbon emissions and increase performance.”
The term “smart cities” has recently been used as a label for the seemingly few cities of the world that are consciously embedding technology into all aspects of city planning. However, with current forecasts estimating close to 50 “megacities” each housing more than 10 million people, and about two-thirds of the world’s population living in urban environments by 2050, the mindset must shift to treat the ‘smartness’ of any urban center as non-negotiable.
Many urban centers claim to be ahead of the “smart” curve though, in actuality, they find themselves handcuffed by custom systems that are neither interconnected, interoperable, portable, extensible nor efficient in their operations, maintenance and overall cost-effectiveness. Overcoming this challenge is burdensome, especially when paired with pressure to show progress, which can unintentionally breed chaos as concurrent initiatives are deployed, producing uncoordinated solutions misaligned with the intended outcomes. No wonder city planners are not sleeping at night.
The diagnosis is familiar, and not unlike the challenges many enterprises face with what are currently called digital transformation projects. What’s different for the urban center, however, is the scale of the complexity, which is not just a question of technology deployment but also of the economic, political and social issues that shape a city’s identity. It’s an extreme case of a “system of systems of systems and more systems” problem, for which the only “smart” solution is a universal, consensus-based governance framework.
Technology companies like Cisco and AT&T have developed their own frameworks, driven by their product strategies, especially for IoT. Standards-developing organizations such as ISO, IEC, ITU, IEEE and a number of others are facilitating the development of new standards related to specific pieces of the overall urban development challenges. Recognizing the fragmented (yet well-intended) and disparate approaches, NIST has launched a working group intended to converge these groups and their respective knowledge assets under the guise of a Smart City Framework.
The key to the success of any framework is its acceptance by universal consensus. This means the framework is created, maintained and endorsed by the professional community for the benefit of the community itself. The framework provides guidance on how to carry out the work aligned to desired outcomes in conjunction with tools that enable stakeholders to self-assess, benchmark, and measure capability maturity and progress toward the goals. This is indicative of the 20-plus year success experienced by ISACA's globally recognized COBIT framework for the governance and management of enterprise technology, which itself has the potential to be foundational for smart urban initiatives.
For now, city planners find themselves challenged across a wide spectrum of issues, ranging from technology to compliance. As members of the technology community, we need to help them, leveraging our knowledge of technology governance frameworks, our holistic systems thinking and problem-solving capabilities, and our innate ability to assess and mitigate risk, to inspire the confidence necessary to enable innovations that evolve the urban environment with the best technology has to offer.
Our work has never been more important. And because we recognize the pervasive nature of technology and understand how to leverage its positive potential, I am confident that we can contribute enough to the evolution of so-called “smart cities” that the term “smart” will eventually be dropped from the lexicon. That in itself would be a great accomplishment.
Editor’s note: This blog post by ISACA CEO Matt Loeb originally appeared in CSO.