Tversky and Kahneman proposed the availability heuristic as an explanation for illusory correlations, in which people mistakenly judge that two events are linked: people judge correlation by how easily they can imagine or remember the two events together. [13] [38]

A heuristic is a mental shortcut, an educated guess or gut feeling used in place of an explicit set of rules or instructions. Because of this, it often yields suboptimal or even inconsistent results: the same input can lead to different outputs. As people move through the world, they must process large amounts of information and make many decisions under time pressure. When information is missing or an immediate decision is required, heuristics act as “rules of thumb” that steer behavior toward a workable course of action. Speed, real-time operation, and the ability to cope with large volumes of data are the most notable benefits of heuristics in IT, cybersecurity, and risk prevention.

The representativeness heuristic consists of making a decision by comparing the current situation to the most representative mental prototype. When trying to decide whether someone is trustworthy, for example, you compare aspects of that individual with mental examples of trustworthy and untrustworthy people you already hold. Heuristics thus enable timely decisions that may not be optimal, but are good enough.
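Viewed computationally, the representativeness heuristic amounts to a similarity match against stored prototypes. Below is a minimal Python sketch of that idea; the attribute names, prototypes, and matching score are invented for illustration and are not drawn from the cited literature.

```python
# A minimal sketch of representativeness as prototype matching.
# The features, prototypes, and scoring rule are illustrative
# assumptions, not an established model from the cited literature.

def similarity(features: dict, prototype: dict) -> float:
    """Fraction of the prototype's attributes that the observed case matches."""
    matches = sum(1 for k, v in prototype.items() if features.get(k) == v)
    return matches / len(prototype)

def representativeness_judgment(features: dict, prototypes: dict) -> str:
    """Pick the category whose mental prototype the case resembles most."""
    return max(prototypes, key=lambda name: similarity(features, prototypes[name]))

# Hypothetical prototypes for judging trustworthiness.
PROTOTYPES = {
    "trustworthy": {"keeps_promises": True, "consistent_story": True, "open_posture": True},
    "untrustworthy": {"keeps_promises": False, "consistent_story": False, "open_posture": False},
}

person = {"keeps_promises": True, "consistent_story": True, "open_posture": False}
print(representativeness_judgment(person, PROTOTYPES))  # -> "trustworthy"
```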

Individuals constantly use this kind of educated guesswork, trial and error, process of elimination, and past experience to solve problems or determine a course of action. In an increasingly complex, data-saturated world, heuristics make decision-making easier and faster through good-enough shortcuts and approximations. Without heuristics, our brains could not cope with the complexity of the world, the amount of information to be processed, and the computational effort an optimal decision would require. Instead, heuristics let us reach quick, good-enough decisions, though these decisions may be subject to the systematic inaccuracies and biases identified by behavioural economics. These rule-of-thumb strategies shorten decision-making time and allow people to function without constantly stopping to deliberate over their next course of action. Heuristic reasoning can be very effective (i.e., “unbiased”) in many practical or familiar situations (Gigerenzer, 2000; Gigerenzer and Gaissmaier, 2010; Goldstein and Gigerenzer, 2002). Under a heuristic, people do not have to weigh every conceivable piece of evidence; they consider only the information at hand and then decide based on simple heuristics learned earlier (Gigerenzer, 2007). This view is known by various names, such as “fast and frugal” (Gigerenzer & Goldstein, 1996), “satisficing” (Simon, 1987), “take the best” (Newell & Shanks, 2003), and “blink” (as opposed to “think”; Gladwell, 2005).
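The “take the best” heuristic is concrete enough to state as an algorithm: search cues one at a time in descending order of validity and decide on the first cue that discriminates between the options. Below is a minimal Python sketch under that reading; the cue names, their ordering, and the city-size example are assumptions made for illustration.

```python
# A minimal sketch of the "take the best" heuristic: check cues in order
# of validity and stop at the first one that discriminates. Cue names and
# their validity ordering are invented for illustration.

def take_the_best(option_a: dict, option_b: dict, cues_by_validity: list) -> str:
    """Return the option favored by the first discriminating cue, searched
    in descending order of cue validity; guess if no cue discriminates."""
    for cue in cues_by_validity:
        a, b = option_a[cue], option_b[cue]
        if a != b:                      # the cue discriminates: stop searching
            return "A" if a > b else "B"
    return "guess"                      # no cue discriminates

# Which of two cities is larger? Binary cues, ordered by assumed validity.
cues = ["is_capital", "has_airport", "has_university"]
city_a = {"is_capital": 0, "has_airport": 1, "has_university": 1}
city_b = {"is_capital": 0, "has_airport": 0, "has_university": 1}
print(take_the_best(city_a, city_b, cues))  # -> "A" (decided by the airport cue)
```

Note the design choice that makes the heuristic “frugal”: it ignores all cues after the first discriminating one, rather than weighing and summing every piece of evidence.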

According to Klein (1997, 1998), this quick and effective way of making decisions in routine, familiar situations is little more than recognition: the activation of domain knowledge built up and stored in memory through prior learning and experience. This is called “skilled intuition” (Klein, 1997, 1998; Simon, 1992), and Klein developed the view in his recognition-primed decision model (Klein, 1997). On this account, correct intuitive decisions are the product of experience and expertise. Most of the time, in normal, familiar situations, we can either trust our first impulse (which is often right) or correct our initial decision if it turns out to be wrong. Flawed heuristics and biases result from a lack of expertise: people fall back on experience-based heuristics under unfamiliar conditions that do not fit their mental models, as happens, for example, in experiments conducted in artificial or laboratory settings (Klein, 1993, 1998, 2008).
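Klein's recognition-primed view can be caricatured as pattern lookup: if the current situation matches a pattern stored from prior experience, the associated action is retrieved directly, and only unfamiliar situations force slower deliberate analysis. The following Python sketch is a deliberately simplified illustration; the firefighting patterns, actions, and subset-matching rule are hypothetical, not part of Klein's model.

```python
# A simplified sketch of recognition-primed decision making: familiar
# situations trigger a stored action; unfamiliar ones fall through to
# deliberate analysis. Patterns and actions are hypothetical.

EXPERIENCE = {
    ("smoke", "crackling_sound"): "ventilate the roof",
    ("smoke", "silent"): "search for smoldering wiring",
}

def recognition_primed_decision(observed: set) -> str:
    """Return the stored action for the first pattern the situation matches,
    or signal that deliberate analysis is needed."""
    for pattern, action in EXPERIENCE.items():
        if set(pattern) <= observed:     # every cue of the pattern is present
            return action                # skilled intuition: act on recognition
    return "no match: analyze the situation deliberately"

print(recognition_primed_decision({"smoke", "crackling_sound", "heat"}))
# -> "ventilate the roof"
```

The fallback branch mirrors Klein's point above: when no stored pattern fits (as in artificial or laboratory settings), recognition fails and the decision maker must resort to something other than skilled intuition.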

The overall problem is that we have no idea where our intuitions come from: there is no subjective marker that distinguishes correct intuitions from intuitions generated by highly imperfect heuristics (Kahneman and Klein, 2009).