In a perfect world, all humans would be rational beings capable of making decisions based on facts and logic.
But this isn't a perfect world.
Even though the human brain is powerful, it has its limits. To simplify information, it can get lazy and take mental shortcuts to arrive at decisions. These shortcuts are called cognitive biases.
Cognitive biases are defined as systematic patterns of deviation from rationality in judgment.
To make this concrete, consider an example. Imagine you won a game with a cash prize of $300. Suddenly, the organizers announce that you can either receive the $300 today or receive $30 every year for the next 20 years. Who has the patience to wait 20 years, even though the second option pays double ($600 in total)? For most people, taking the money right away is instinctive.
In psychology, this is called hyperbolic discounting: our inclination to choose immediate rewards over rewards that arrive later, even when the later rewards are equal or larger.
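The pull toward the immediate $300 can be sketched with the standard hyperbolic discounting formula, V = A / (1 + kD), where A is the reward, D the delay, and k a discount rate. The value of k below is a hypothetical choice for illustration, not a measured figure.

```python
def discounted_value(amount, delay_years, k=0.5):
    """Perceived present value of a reward under hyperbolic discounting.

    k is a hypothetical discount rate chosen for illustration; real
    values vary from person to person.
    """
    return amount / (1 + k * delay_years)

# $300 received today keeps its full perceived value.
now = discounted_value(300, 0)

# $30 per year for 20 years is $600 in total, but each future
# payment feels smaller the further away it is.
later = sum(discounted_value(30, year) for year in range(1, 21))

print(f"Perceived value of $300 now:          ${now:.2f}")
print(f"Perceived value of $600 over 20 yrs:  ${later:.2f}")
```

Even though the deferred stream is worth twice as much on paper, its discounted value comes out well below $300, which is why taking the cash today feels like the obvious choice.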
Every human is vulnerable to these errors in judgment, so understanding these biases can help us avoid security-related mistakes and design a more robust defense strategy.
Let's look at a few biases that can affect your cybersecurity decisions.
Availability bias impacts our decisions by making us overweight the information that comes to mind most easily, such as the most recent news. For example, when a new ransomware attack makes headlines, most security teams will focus on protecting their networks from it, even though it may not apply to their industry.
Such news can cause organizations to ignore other important issues that could do more damage to their networks. Though it is necessary to safeguard against trending attacks, it is equally important to consider other threat vectors too.
Confirmation bias is the tendency to favor information that confirms our existing beliefs. We can see this bias in action during threat hunting: it can trick analysts into searching only for information that aligns with their beliefs and prior experience. Many experienced security analysts anchor on a suspected cause before investigating and then look only for evidence supporting that cause.
For example, if an analyst believes a breach to be the result of an inside job, they might completely ignore the possibility that a related-party interaction (involving third-party vendors and resellers, government authorities, or internal auditors) triggered the series of events leading to the breach.
Security professionals should also be open to suggestions and willing to consider others' points of view. This can help them spot issues they might otherwise have missed.
Also known as the illusion of invulnerability, optimism bias makes us believe that the chance of experiencing something positive is higher (or something negative is lower) than it actually is.
Having a SIEM tool with all the correlation rules and alerts set up doesn't mean your network won't get compromised. A simple phishing attack can allow adversaries to gain access to your network in no time.
While this bias can be good for our personal lives, in cybersecurity, it is always better to have the opposite mindset when configuring servers, applications, firewalls, and more. We recommend you take precautionary measures by setting up advanced threat intelligence, UEBA, and other capabilities to improve your security posture.
Aggregate bias is at work when we draw conclusions about an individual from data about the larger population they belong to.
Imagine there has been a data breach in your organization. Whose logs would you first start checking? Definitely people with lots of access, right? This bias can cause analysts to focus on a particular individual based on the individual's group, such as administrators or privileged users. But in reality, any regular employee could've clicked on a phishing link, triggering a series of events that eventually led to the breach.
We recommend analyzing individual human behavior to detect anomalies by recognizing subtle changes in each user's regular activities. Implementing UEBA to detect malicious behavior can strengthen your defenses against insider threats.
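The core idea behind baselining individual behavior can be sketched with a simple statistical check: compare today's activity against that user's own history rather than against their group. The event counts, threshold, and scenario below are hypothetical, and a real UEBA product uses far richer models than this.

```python
import statistics

# Hypothetical daily file-access counts for one employee over two weeks.
baseline = [12, 15, 11, 14, 13, 16, 12, 14, 13, 15, 12, 14, 13, 15]

def is_anomalous(today_count, history, threshold=3.0):
    """Flag activity deviating more than `threshold` standard
    deviations from this user's own historical mean."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return today_count != mean
    return abs(today_count - mean) / stdev > threshold

print(is_anomalous(14, baseline))   # an ordinary day
print(is_anomalous(240, baseline))  # sudden mass file access
```

Because the comparison is against the individual's own baseline, a regular employee suddenly accessing hundreds of files stands out, even though nothing about their group membership (such as admin privileges) would have flagged them.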
Framing bias causes people to make decisions based on how choices are presented rather than on the underlying facts. Hackers exploit it by framing phishing emails as urgent messages from a senior official or as a product update.
You can also see this bias when purchasing security tools. Analysts may choose to buy expensive solutions that address low-probability risks like ransomware because of a recent incident (also due to availability bias).
We recommend that decision-makers think more analytically when it comes to buying a security tool. There are multiple tools out there that address specific security problems. Consider evaluating a SIEM tool that can perform all the various functions needed to improve the security of your organization.
Cognitive biases impact our everyday decisions. Whether it's as simple as buying an item at the grocery store or as complex as deciding which stock to invest in, biases are everywhere.
One way to overcome these biases is to always be aware of them. As discussed before, these biases affect not only the way you analyze threats in your network but also your decisions to invest in security tools. Training your employees about these biases and implementing tools and procedures to identify, analyze, and manage advanced threats are the remedies you need.
Remember, security is not all about the tools; it's a combination of processes, technology, and an understanding of human behavior.
© 2021 Zoho Corporation Pvt. Ltd. All rights reserved.