"Here’s a classic litmus test for the sunk cost fallacy:"If it is indeed classic, then these diagnoses apply to many individuals. While I'm going to focus on dreeves (Daniel Reeves) here, because I can be certain of his symptoms, the analysis will likely apply widely. I'd prefer not to pick on an individual. I certainly hold dreeves no malice - to the contrary. However, this method catches easily-overlooked important details; while at first such details look like idiosyncratic symptoms, subsequent detailed diagnoses will show the important details are common in function, even if they differ in detail. Attempting to focus on probable commonalities will only cause me to make false assumptions about the details; which I should probably add to the symptoms of abstraction intoxication.
"Do you go anyway, so as not to have wasted the money on the ticket? Probably you shouldn’t."Reeves should explain why, otherwise I have to assume a justification to be able to properly analyze the idea. Secondly, this case study will mainly involve me enumerating plausible assumptions that mean you should go anyway - the issue is nowhere near cut and dry enough to blithely dismiss these possibilities from discussion.
After hopefully demonstrating his factual errors, I will go on to show how the various afflictions caused those errors, and then detail the process I went through to refine the disease characterizations.
"Going to the show lets you avoid some painful cognitive dissonance — though this one doesn’t really accord with my notion of rationality."Goals are arational.
There is no rational justification for wanting to go to the theatre at all. Nor is there a rational justification for not wanting to go.
Rather, the desire is the justification.
The usual point of the theatre is to enjoy yourself - hence the 'feel like going' terminology. Desiring to avoid pain is likewise a justification for avoiding it.
Of course, Reeves is correct heuristically. If your goal is to avoid pain, facing cognitive dissonance is usually more effective than conceding to it, and it won't often matter whether you faced down the dissonance for good reasons or did it by habit or heuristic.
This is probably a minor case of ingroup suite disease. Rationalists are perceived as setting emotion in opposition to reason. Daniel thinks of himself as a rationalist, therefore believes this, and therefore makes statements that amount to 'avoiding pain is irrational.'
I found this pattern because I made a habit of checking articles like these against my recent experiences, and I found that their advice made for bad strategy. When I analyzed the logic for errors, to try to distinguish between the authors being wrong and me making a mistake or misunderstanding, I found that all the rationalists made the same error. (Mutatis mutandis for other groups.) As a bonus, reading the error felt the same each time, to the point where, if I felt the feeling, I could accurately pigeonhole the author on that evidence alone.
"3. You might not remember what all persuaded you to buy the ticket but the more you spent the better reasons you must have had. [...]Predictably missed the biggest one. You don't feel like going, but if you went, you'd get over it and enjoy yourself anyway. Feelings are weird sometimes. Predictable because of rationalist ingroup suite disease - they discount emotions, and so the nuances of emotional states escape them.
But mostly those are rationalizations and, rationally, you shouldn’t go."
Appreciating your own consciousness is important so that you can tell the difference between what it feels like to not want to go and what it feels like to not be able to enjoy the show. 'Going' and 'being at' are two different things, especially in terms of subjective associations. Here, Reeves is heuristically wrong - if you start coming up with these rationalizations, it is a sign you should go, and that you'll enjoy yourself. If you're right to go, again it won't often matter whether you make the decision for good reasons or from rationalization.
Reeves could also be suffering from some abstraction intoxication, his dazzling ideas blinding him to the nuances of his own experience. He thinks in terms of science-paper jargon: rationality, fallacies, prudence, and budgets. Sadly, human working memory is only so large, and these considerations can easily overflow it by themselves - making them all that is perceivable, giving the impression that they're all that's there to be perceived.
Let me concretely realize the alternative conception. He should have mentioned something about being able to enjoy the show, but knowing you'd enjoy something else better. If you think about it in terms of what you'll actually experience, can the sunk cost fallacy even get a word in edgewise? Reeves finds himself tangled in a Gordian knot; for me, this concrete analysis cuts the knot.
The first step toward conceptualizing the abstraction intoxication pattern was realizing that if I couldn't think of a concrete instance of a thing, I didn't understand it. (I thought of doing this in the first place because I prefer to reason about instances and then generalize.) However, when I asked for or otherwise brought up instances, authors seemed indifferent to, incapable of, and occasionally even hostile to the practice. Over time I developed a counter-hostility to abstractions, which triggered the observation of how common empty abstractions are.
"The lesson here is: Don’t throw good money (or effort/energy) after bad."The lesson here is, apparently, that thinking about sunk cost fallacies is strategically unsound. This is the essence of anti-bias bias disease. Are most individuals better off trying to out-think their sunk cost biases, or just going with the flow? The anti-bias bias sufferer never checks and doesn't care anyway.
As I note below, I don't think Reeves actually has this one; he doesn't have the full symptomology.
I found anti-bias bias by successfully getting into the habit of checking my assumptions. The exact purpose of countering biases isn't important, but it is important that you define that purpose. So: does countering biases actually serve that purpose for the layhuman, and, secondly, does having that purpose itself even serve the layhuman?
I try to counter my biases because I value truth above just about everything - if it were at all painful for me, it would be a waste of time. Countering biases is often inefficient.
Once I realized this, I set it as a test for anti-bias proponents, and the majority failed. They spend far, far more effort attempting to counter biases than they reap in rewards. No matter how virtuous honesty and correctness are, spending more on correcting biases than you get back can't possibly be virtuous.
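To make that test concrete, here's a minimal sketch of the arithmetic I have in mind. The function name and every figure in it are hypothetical, chosen purely for illustration; nothing here comes from Reeves.

```python
# A minimal sketch of the cost-benefit test applied to anti-bias proponents.
# All figures here are hypothetical, purely for illustration.

def debiasing_pays_off(effort_cost: float,
                       error_cost: float,
                       p_error_avoided: float) -> bool:
    """Countering a bias is only worth it if the expected loss it prevents
    exceeds the effort spent countering it."""
    return p_error_avoided * error_cost > effort_cost

# Agonizing to the tune of, say, $20 worth of time and attention to dodge a
# bias that would only sometimes cost you a $10 ticket is a losing trade.
print(debiasing_pays_off(effort_cost=20.0, error_cost=10.0, p_error_avoided=0.5))  # False
```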
"Everyone agrees on Scenario 2. Of course you do. No one’s on such a tight budget that an unexpected change in wealth of $10 changes their utility for theater.That's because they're not equivalent. People may not be able to articulate why, but I was curious so I also went and checked. (Luckily, I'm a people too.)
But many people refuse (I’ve checked) to see that Scenario 1 is fully equivalent."
Again, abstraction intoxication dazzles Reeves. The abstract economic equivalence is sufficient for him, blinding him to factors of human psychology, among others.
First, let me assume that if you lose ten bucks, the displeasure you feel motivates you to correct your error so you don't do it again, regardless of whether you lose it in the form of a ticket or bank notes.
However, after losing ticket-form wealth, buying another ticket sends yourself a message: "It's fine to lose cheap tickets; I'll just buy another." It reinforces losing tickets as a habit. Do it enough times and it will become standard; you'll habitually buy an early ticket, forget it, and then buy another on the spot.
Buying another ticket when you lose a bank note doesn't form this habit. Your brain keeps the issues separate; it doesn't associate them. Is this rational? It doesn't matter, because it happens regardless; what's irrational is not taking it into account.
Evolution is smarter than you. It is dangerous to assume your urges are dumb and animalistic unless you understand their mechanics. Out-thinking your own subconscious is very tricky.
Moreover, the pain of overcoming that reluctance to repurchase can ruin the show. Similarly, do you find that once you know the relevant psychology, you can weigh the risk of forming a habit against the joy of seeing the show that day? Taking the event out of the abstract 'sunk cost fallacy' bucket and putting it into concrete alternatives helps immensely, for me at least.
There are some other non-equivalences as well. Most everyone has pseudo-lost something. I've lost my glasses on my head, and as a kid I once lost something I was actually holding in my hand. (I had gotten used to the feeling and my fingers were blocking sight.) Or losing keys in the coat you're wearing. Buying another ticket destroys the chance you'll re-discover your old one. Buying a ticket does not reduce the chance you'll re-discover your tenner - in fact it increases it, as you may discover it while hunting for your wallet.
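Here's a back-of-the-envelope sketch of that rediscovery asymmetry. The price, the rediscovery probability, and the assumption that a rediscovered spare ticket is worthless once you've bought a replacement are all mine, chosen just to make the non-equivalence visible.

```python
TICKET_PRICE = 10.0
P_FIND = 0.3  # assumed chance the lost item eventually turns up

# Lost the ticket, bought a replacement: a rediscovered spare ticket is
# assumed worthless (you already have a seat), so the full price stays lost.
loss_if_ticket_lost = TICKET_PRICE

# Lost a $10 note, bought a ticket anyway: a rediscovered note keeps its
# full value, so the expected loss shrinks with the chance of finding it.
loss_if_tenner_lost = 10.0 * (1.0 - P_FIND)

print(loss_if_ticket_lost, loss_if_tenner_lost)  # 10.0 7.0 -- not equivalent
```

Under those assumptions the two scenarios already differ by a few dollars in expectation, before the habit-forming effects even enter the picture.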
Forming a habit of buying tickets on the spot is risky, as tickets aren't always immediately available or cheap. (I do it anyway; I find the costs of buying early, such as the risk of not feeling like going or losing the ticket, aren't worth it.)
Contra Reeves, situations that feel different usually are, even if you can't immediately verbalize how.
To truly support the idea that these are diseases, I have to reject the hypothesis that they're accidents. In this case, I find the patterns and their predictive power contradict the idea that they're random. Brains glitch now and then - if you do fifty simple addition problems, some of them will be wrong, even though you are perfectly capable of addition; you have to check your work. But nobody can predict which ones you'll get wrong. However, someone who has ignored emotional considerations, or privileged abstractions over their details, or opposed a bias for its own sake, is very likely to get something else wrong for the exact same reason.
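The addition analogy can be made concrete with a toy simulation. The error rate and the "dropped carry" flaw below are invented for illustration; the point is only that random slips are unpredictable per problem, while a systematic flaw predicts exactly which problems go wrong.

```python
import random

# Toy model of the glitch-vs-disease distinction.
problems = [(random.randint(0, 99), random.randint(0, 99)) for _ in range(50)]

def random_glitcher(a, b):
    """Perfectly capable of addition, but slips at random ~4% of the time."""
    return a + b + (1 if random.random() < 0.04 else 0)

def carry_dropper(a, b):
    """Adds correctly except for one systematic flaw: it drops the carry
    out of the ones column."""
    ones = (a % 10 + b % 10) % 10            # carry discarded here
    rest = (a // 10 + b // 10) * 10
    return rest + ones

glitch_misses = [p for p in problems if random_glitcher(*p) != sum(p)]
carry_misses = [p for p in problems if carry_dropper(*p) != sum(p)]
needs_carry = [p for p in problems if (p[0] % 10 + p[1] % 10) >= 10]

print(glitch_misses)                 # a different random handful every run
print(carry_misses == needs_carry)   # True: these misses are fully predictable
```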
I similarly don't like the missing information explanation, because it would mean I have privileged access to some kind of information. That would be fun, but I'm mostly just some guy with an internet connection, so it doesn't seem likely.
"3. “I’m willing to blow off the flight if it was cheap enough.”"First note that if you've heard of sunk cost before, the intended answers to these puzzles are probably obvious to you. (On a real test I might have got #4 wrong because I wasn't feeling the wording, and Reeves admits it wasn't good.)
Getting into the habit of blowing off expensive flights is a bad idea. If you know ahead of time that, when a flight is expensive, you'll take it even if you don't want to, that knowledge will motivate you to do your due diligence - and possibly realize ahead of time that you shouldn't buy the ticket at all. If instead you know, "I might ditch this ticket out of appreciation for the sunk cost fallacy," you may buy it 'just in case.' One can't be rational at every moment - and human biases are already balanced against each other. Mess with the system and the outcome is hardly guaranteed to be better overall.
It seems Reeves doesn't have an actual case of anti-bias bias disease, because I don't think he'd become irate if confronted on it. Rather, I would guess he is simply surrounded by the infected, who make up the bulk of his information sources. His case of abstraction intoxication made him vulnerable to the skew, minor though it is. Similarly, his rationalist in-grouping is pretty minor - I found a good example and a nearly-good example of emotional appreciation. These minimal infections are consistent with my finding that his reasoning is generally pretty good. Severe cases promote secondary infections.