Jan 9, 2020, 4:04 PM
Missed this, but omfg, yes it’s that simple.
I am a Stoic if:
1) I live in accordance with natural law and reason;
2) I avoid fallacy of sunk costs in everything;
3) I limit my attention to what is actionable;
4) I am never a victim of circumstances as I always have a choice to change them.
As far as I know, this is all that is required of me to be a Stoic.
– Martin Štěpán
In effect, mindfulness provides control over loss aversion and the consequent feeling of being ‘out of control’ of one’s environment. Saying “it’s god’s plan” does exactly the same thing. The difference is that under Stoicism you are in control, while under theology the big man in the sky is in control.
Why? Because we are twice as motivated by fear of loss as we are by incentive to gain.
Why? Because we know most of our theories (plans) fail.
Why? Because of neural economy: mental calculation (thinking) is expensive and error-prone, yet we must maintain the will to act in the kaleidic universe, so we preserve the illusion.
The sunk cost fallacy is when you treat something you’ve already invested in as more valuable even though that investment is gone. I.e. I might keep repairing an old car I’ve already poured lots of money into, despite the fact that I’d save money by replacing it with a better one, because I fallaciously include what I already paid in the old car’s value.
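The car example can be put into numbers. A minimal sketch, with hypothetical figures (the $6,000 of past repairs and the future-cost estimates are invented for illustration): only future costs should enter the decision, because money already spent shifts every option's total equally and can never change the ranking.

```python
# Sketch of the car example with hypothetical numbers: only *future*
# costs should enter the decision; money already spent is sunk.

def best_option(future_costs, sunk_cost=0.0):
    """Pick the option with the lowest future cost.

    `sunk_cost` is deliberately ignored -- adding it to every option
    shifts all totals equally, so it never changes the ranking.
    """
    return min(future_costs, key=lambda name: future_costs[name])

future_costs = {
    "keep repairing old car": 4000,  # expected repairs ahead
    "replace with newer car": 3000,  # purchase price minus resale value
}

# The $6,000 already spent on past repairs is irrelevant.
print(best_option(future_costs, sunk_cost=6000))
# -> replace with newer car
```

The fallacy, in these terms, is adding the sunk $6,000 to the "keep" option only, which makes the old car look like a larger investment worth protecting.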
In Stoicism, I can use the most extreme example provided by Epictetus. If my child dies, there’s no sense in getting emotional over it because that investment is gone and I can’t get it back.
LOSS AVERSION
Loss aversion is encapsulated in the expression “losses loom larger than gains”. The pain of losing is psychologically about twice as powerful as the pleasure of gaining. People are more willing to take risks (or behave dishonestly) to avoid a loss than to make a gain. Loss aversion has been used to explain the endowment effect and sunk cost fallacy, and it may also play a role in the status quo bias.
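The “about twice as powerful” claim is usually formalized with the prospect-theory value function. A minimal sketch, using the median parameter estimates reported by Tversky and Kahneman (1992) (alpha ≈ 0.88, lambda ≈ 2.25) as illustrative values, not universal constants:

```python
# Prospect-theory value function: formalizes "losses loom larger
# than gains". Parameters are the 1992 median estimates and should
# be treated as illustrative.

def value(x, alpha=0.88, lam=2.25):
    if x >= 0:
        return x ** alpha           # diminishing sensitivity to gains
    return -lam * ((-x) ** alpha)   # losses weighted ~2.25x as heavily

# Losing $100 hurts roughly twice as much as gaining $100 pleases:
print(value(100))   # ≈ 57.5
print(value(-100))  # ≈ -129.5
```

The asymmetry comes entirely from the `lam` multiplier on the loss branch; setting `lam=1` would make gains and losses mirror images.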
The basic principle of loss aversion can explain why penalty frames are sometimes more effective than reward frames in motivating people.
REGRET AVERSION
When people fear that their decision will turn out to be wrong in hindsight, they exhibit regret aversion. Regret-averse people may fear the consequences of both errors of omission (e.g. not buying the right investment property) and commission (e.g. buying the wrong investment property) (Seiler et al., 2008). The effect of anticipated regret is particularly well-studied in the domain of health, such as people’s decisions about medical treatments. A meta-analysis in this area suggests that anticipated regret is a better predictor of intentions and behavior than other kinds of anticipated negative emotions and evaluations of risk (Brewer et al., 2016).
MENTAL ACCOUNTING
Mental accounting is when people think of value in relative rather than absolute terms. They derive pleasure not just from an object’s value, but also from the quality of the deal – its transaction utility (Thaler, 1985). In addition, humans often fail to fully consider opportunity costs (tradeoffs) and are susceptible to the sunk cost fallacy.
Why are people willing to spend more when they pay with a credit card than cash? Why would more individuals spend $10 on a theater ticket if they had just lost a $10 bill than if they had to replace a lost ticket worth $10? Why are people more likely to spend a small inheritance and invest a large one?
According to the theory of mental accounting, people treat money differently, depending on factors such as the money’s origin and intended use, rather than thinking of it in terms of the “bottom line” as in formal accounting (Thaler, 1999). An important term underlying the theory is fungibility, the fact that all money is interchangeable and has no labels. In mental accounting, people treat assets as less fungible than they really are. Even seasoned investors are susceptible to this bias when they view recent gains as disposable “house money” that can be used in high-risk investments. In doing so, they make decisions on each mental account separately, losing sight of the big picture of the portfolio.
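The fungibility point can be made concrete. A toy sketch (the account labels, balances, and the “risk only house money” rule are invented for illustration, not an empirical model): formal accounting sees only the bottom line, while mental accounting attaches a different risk attitude to each label.

```python
# Toy contrast between formal accounting (labels irrelevant, only the
# bottom line matters) and mental accounting (risk attitude depends
# on the label attached to the money).

from dataclasses import dataclass

@dataclass
class Account:
    label: str
    balance: float

accounts = [Account("salary", 5000), Account("recent gains", 1000)]

# Formal accounting: money is fungible, so only the total matters.
bottom_line = sum(a.balance for a in accounts)

# Mental accounting: willing to gamble "house money" but not salary,
# even though every dollar is interchangeable.
def risk_budget(account):
    return account.balance if account.label == "recent gains" else 0.0

print(bottom_line)                            # 6000
print(sum(risk_budget(a) for a in accounts))  # 1000.0
```

A fully fungible decision-maker would size risky positions against the $6,000 total; the mental accountant sizes them against the $1,000 “house money” label alone.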
Consumers’ tendency to work with mental accounts is reflected in various domains of applied behavioral science, especially in the financial services industry. Examples include banks offering multiple accounts with savings goal labels, which make mental accounting more explicit, as well as third-party services that provide consumers with aggregate financial information across different financial institutions.
ENDOWMENT EFFECT
This bias occurs when we overvalue something that we own, regardless of its objective market value. It is evident when people become relatively reluctant to part with a good they own for its cash equivalent, or if the amount that people are willing to pay for the good is lower than what they are willing to accept when selling it. Put more simply, people place a greater value on things once they have established ownership. This is especially true for things that wouldn’t normally be bought or sold on the market, usually items with symbolic, experiential, or emotional significance. Endowment effect research has been conducted with goods ranging from coffee mugs to sports cards (List, 2011). While researchers have proposed different reasons for the effect, it may be best explained by psychological factors related to loss aversion.
STATUS QUO BIAS
Status quo bias is evident when people prefer things to stay the same by doing nothing (see also inertia) or by sticking with a decision made previously. This may happen even when only small transition costs are involved and the importance of the decision is great.
Field data from university health plan enrollments, for example, show a large disparity in health plan choices between new and existing enrollees. One particular plan with significantly more favorable premiums and deductibles had a growing market share among new employees, but a significantly lower share among existing enrollees. This suggests that the lack of switching reflected status quo bias rather than stable preferences.
Samuelson and Zeckhauser note that status quo bias is consistent with loss aversion, and that it could be psychologically explained by previously made commitments, sunk cost thinking, cognitive dissonance, a need to feel in control, and regret avoidance. The latter is based on Kahneman and Tversky’s observation that people feel greater regret for bad outcomes that result from new actions taken than for bad outcomes that result from inaction.
While status quo bias is frequently considered to be irrational, sticking to choices that worked in the past is often a safe and less difficult decision due to informational and cognitive limitations (see bounded rationality). For example, status quo bias is more likely when there is choice overload or high uncertainty and deliberation costs.
COMMITMENT
Commitments (see also precommitment) are often used as a tool to counteract people’s lack of willpower and to achieve behavior change, such as in the areas of dieting or saving. The greater the cost of breaking a commitment, the more effective it is (Dolan et al., 2010). From the perspective of social psychology, individuals are motivated to maintain a consistent and positive self-image (Cialdini, 2008), and they are likely to keep commitments to avoid reputational damage (if done publicly) and/or cognitive dissonance (Festinger, 1957). A field experiment in a hotel, for example, found 25% greater towel reuse among guests who made a commitment to reuse towels at check-in and wore a “Friend of the Earth” lapel pin to signal their commitment during their stay (Baca-Motes et al., 2012). The behavior change technique of ‘goal setting’ is related to making commitments (Strecher et al., 1995), while reciprocity involves an implicit commitment.
ACTION BIAS
Some core ideas in behavioral economics focus on people’s propensity to do nothing, as evident in default bias and status quo bias. Inaction may be due to a number of factors, including inertia or anticipated regret. However, sometimes people have an impulse to act in order to gain a sense of control over a situation and eliminate a problem. This has been termed the action bias (Patt & Zeckhauser, 2000). For example, a person may opt for a medical treatment rather than a no-treatment alternative, even though clinical trials have not supported the treatment’s effectiveness.
Action bias is particularly likely to occur if we do something for others or others expect us to act (see social norms), as illustrated by the tendency for soccer goalkeepers to jump to the left or right on penalty kicks, even though statistically they would be better off if they just stayed in the middle of the goal (Bar-Eli et al., 2007). Action bias may also be more likely among overconfident individuals or if a person has experienced prior negative outcomes (Zeelenberg et al., 2002), where subsequent inaction would be a failure to do something to improve the situation.
INFORMATION AVOIDANCE
Information avoidance in behavioral economics (Golman et al., 2017) refers to situations in which people choose not to obtain knowledge that is freely available. Active information avoidance includes physical avoidance, inattention, the biased interpretation of information (see also confirmation bias) and even some forms of forgetting. In behavioral finance, for example, research has shown that investors are less likely to check their portfolio online when the stock market is down than when it is up, which has been termed the ostrich effect (Karlsson et al., 2009). More serious cases of avoidance happen when people fail to return to clinics to get medical test results, for instance (Sullivan et al., 2004).
While information avoidance is sometimes strategic, it can have immediate hedonic benefits for people if it prevents the negative (usually psychological) consequences of knowing the information. It usually carries negative utility in the long term, because it deprives people of potentially useful information for decision making and feedback for future behavior. Furthermore, information avoidance can contribute to a polarization of political opinions and media bias.
CONFIRMATION BIAS
Confirmation bias (Wason, 1960) occurs when people seek out or evaluate information in a way that fits with their existing thinking and preconceptions. The domain of science, where theories should advance based on both falsifying and supporting evidence, has not been immune to bias, which is often associated with people processing hypotheses in ways that end up confirming them (Oswald & Grosjean, 2004). Similarly, a consumer who likes a particular brand and researches a new purchase may be motivated to seek out customer reviews on the internet that favor that brand. Confirmation bias has also been related to unmotivated processes, including primacy effects and anchoring, evident in a reliance on information that is encountered early in a process (Nickerson, 1998).