Sunday, June 29, 2008

Professionalism, Ideology, and Logical Techniques

Because I apparently insist on dealing with amateurs, it behooves me to illustrate the difference. Also, because I am an autodidact, I have no certification; basically no one will vouch for my expertise. And I've always enjoyed articles with meat I can actually use. This post is intended to be a user's manual for your brain.

I actually like the above situation. Certification is fatally flawed. It becomes a shortcut to actually evaluating whether a source is trustworthy or not. If you can't do that, especially with philosophy, you have no business anywhere near a philosopher, because it's fucking dangerous.

This kind of mistake is what gives ideology a bad name. Ideologies can be extremely powerful, and like all tools, are neither good nor bad. However, when an amateur starts dicking around with one, the probability of abuse approaches certainty, and the power amplification of the ideology magnifies the abuse to epic proportions.

Ideology. Warning! Misuse can cause megadeaths. Accidents can condemn endless generations to hellish suffering. Use at own risk of becoming Hitler. Side effects may include puritanism, institutionalized torture, or nihilism.

The personal tragedies that occur because of broken ideology are no less horrific, and are yet worse because the reason they occur is that they're supposed to be justified. This makes it all the more important to get the ideology right. Even if a sound ideology did nothing else, it would occupy the spot that broken ideologies would otherwise try to invade.

Don't fuck around with ideas. They seem so innocent and formless, but they're literally the most powerful tool on Earth.

Instead, imagine how much good that much power could do if used for that purpose. Then, leave it to the professionals.

But anyway, I'm supposed to be talking about logical techniques.

It's a problem. How do you self-detect conflation? Conversely, how do you unify apparently disparate things?

One way is to look at purposes. If you have an apparent single object or idea that's serving two purposes, then there's a good chance they can be looked at separately. And anyway, it is dead obvious when they can't.

For instance, my stapler is being used as a paperweight. But it is not being used as a paperweight and a stapler at the same time. I should not conflate the logical consequences of the paperweight with those of the stapler.

Similarly, if you have two apparently different things that are doing, or are for, the exact same thing, then logically speaking they are the same thing. Quacks like a duck, walks like a duck, et cetera.

A meatier example. I have stated that there is some optimal way to live your life. This is a fact. Do not conflate it with the statement that you should live your life this way. Finding such an optimum is impossible, and even an optimum heuristic is out of reach. The purpose of the first statement is to convey a fact, and whatever consequences that fact may entail. The purpose of the conflated statement is purely normative, and as we can see, the bare fact is irrelevant to the norm.

Purpose is just a handy tool in general, actually.

Using Purpose
Knowing the purpose of everything being discussed almost automatically organizes the discussion along logical lines.

To properly conduct a debate, it is necessary to define terms. As a corollary to this, it is necessary to define goals. Which inevitably means that the debaters have to agree on the purpose of all concepts being discussed.

Discovering conflicting goals can go as far toward a productive debate as discovering conflicting data. Once uncovered, the conflict can be resolved, or at least dealt with consciously and rationally. For an example, look at the second post on the page by Alan, where Alan uses purpose extensively to analyze meanings and motives.

Incidentally, and as an example, the purpose of the idea "respect others' beliefs" shows that it really means "respect others' goals." Respecting beliefs is stupid to the extent that they are known to be false: the more probably false, the more stupid. Goals, however, cannot be false. Thus, respecting them is never stupid.

Confirmation Bias
To evade it, act thusly.

First, find the position you're sympathetic with. You'll almost always have such a position before you reason it out.

Then, attempt to prove the opposite. Pretend you really want to prove it. This causes you to actively seek disconfirming evidence, and it can become a challenge in its own right. It can also give you greater confidence in your final conclusion: first, you may stumble upon a solid proof of the opposite, with no chance of misunderstanding; second, if you are truly incapable of proving the opposite, having tried your best, then you know almost for certain that your final position is correct.

Unless of course you conflated something, which means that your positions weren't actually opposite. They were not actually differing answers to a yes/no question.

As a further bonus, this method gives you insight into the meaning of the opposing side. Having assumed yourself that it is true, then you know the consequences from the inside. Often this knowledge is all you need to complete the process.

This technique may even turn the confirmation bias into an asset. I'm not actually sure on that point.

This technique also neutralizes the congruence bias and the subjective validity bias. In fact I think it neutralizes almost every bias, because my availability bias doesn't throw up many instances of me finding a bias it doesn't neutralize. Instead I remember often thinking, "This bias is just a facet of the confirmation bias." This wasn't quite true, but it has the same consequences as what is true.

Availability Bias
When proving things, we are biased towards easily remembered data. Stated like this, it's obvious.

However, this can be used to your advantage. For example, if something is supposed to be really common, yet your availability bias can't find any examples, then it is probably not actually that common. (Alternatively it could be like fish and water, in which case you won't be able to see it anyway.)

Don't allow newspaper articles unless by 'common' you mean 'happens often somewhere on the entire globe.' For instance, I'm arguing (ha ha, no I'm objecting and he's projecting) with an anarcho-socialist who proposes that property rights are immoral because people need to eat. So, if you live in a so-called 'capitalist' society, can you find any instances of starvation in your society with your availability bias? I certainly can't, which suggests that this objection, even if true (it's not), isn't of any practical consequence.

You can also use it to find important things. Ask yourself what's important on some subject. Whatever pops out is almost certainly important, because it is available. If it is not available and nothing is reminding you of it, it is almost always not important.

A similar so called 'bias' is the hyperbolic discount bias. In reality, things in the future really are worth hyperbolically less to you.
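For concreteness, hyperbolic discounting is usually modeled as V = A / (1 + kD): a reward A received after delay D is worth V now, with k an individual discount rate. A minimal sketch (the value of k here is purely illustrative, not anything the post specifies):

```python
def hyperbolic_value(amount, delay, k=0.1):
    """Present value of `amount` received after `delay` time units,
    using the standard hyperbolic discount curve V = A / (1 + k*D).
    The discount rate k is illustrative."""
    return amount / (1 + k * delay)

# $100 now vs. $100 after 365 days: the delayed reward is worth far less.
print(hyperbolic_value(100, 0))    # full value, no delay
print(hyperbolic_value(100, 365))  # steeply discounted
```

Note the contrast with exponential discounting (A * e^(-kD)): the hyperbolic curve falls off steeply at short delays and flattens at long ones, which is the shape the 'bias' label refers to.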

A Technique for Not Sounding Idiotic
So I hear all sorts of crap philosophy vomiting out of people's mouths. Obviously they are unaware of this, and actively resist it if I try to point it out.

How do I know I don't suffer the same ailment? Three habits.

1. I run everything I say out loud back through my auditory interpretation system. Did I say what I wanted to mean?
2. Do I believe what I meant to say? Are the immediate consequences of what I said actually things I believe?
3. What is the purpose of me saying this? Is what I actually said serving this purpose?

In both 1 and 3 I attempt to reset my brain so that it doesn't simply run through the same circuits again. I do this by adopting a separate paradigm: using a slightly different approach to analyzing my statements than I did while generating them.

Because these are habits, they do in fact happen in real time. And yes, they catch stuff all over the place. I have a low-frequency habit of starting three sentences before I manage to finish one, because the first two self-contradict and by the third I've realized I want to say something entirely different. Sometimes if someone says 'What?' I'll give them a completely different sentence because I've already realized the first one sucked.

Second, habitually examining my statements in light of my beliefs is pretty effective at making my beliefs consistent. It is getting tough for me to even think inconsistently; I'm beginning to recognize what it feels like, and then simply avoiding that feeling.

Does any of that sound desirable to you? Worth the effort of generating those habits?

Debates and Stubbornness
Whenever you put a statement into a public space there's a temptation to defend it to the death. If someone objects, it can become a contest to see who 'wins' with the 'loser' having to admit 'defeat.'

I think this is a species of reactance, but I'm not sure.

This is so wrong on so many levels that I'm actually cutting the discussion of it. (Avoiding a tangent? What now?)

To avoid this, make sure that before you state anything, you have an objective standard that would cause you to change your belief. If you think you might never change your belief, then when someone argues with you, you should let them know that you won't accept arguments. You don't have to feel ashamed of this flaw, as most people have it. Just accept it and act accordingly.

Don't forget to check the standard for reasonableness. Do this for yourself, because you want to be right, so that you'll take the responsibility seriously. The God and racism discussions have standards of disproof, but they're subject to hyperinflation. Don't be like them.

Because I like lists:

  1. Conflation
     - Analyze the purpose behind all your concepts to see if they can be split or combined.
  2. Purpose
     - Know the purpose of all concepts and the arguments that use them.
     - Use this to arrange the concepts and arguments into a sensible system.
     - Don't forget to analyze the purpose of the system itself.
  3. Confirmation Bias
     - Attempt to prove the opposite of what you want to prove.
  4. Availability Bias
     - Is a tool, not a problem.
  5. Talking Coherently
     - Did I say what I meant to say?
     - Do I believe what I meant to say? Are the consequences coherent?
     - What is the purpose of my statement, and is it serving that purpose?
  6. Stubborn Resistance to Data
     - Always come up with a disproof an objector could achieve.
     - Do this for yourself, because you want to be right, which will cause you to take it seriously and reasonably.
So, did you find that useful?

If it was, I could go through the entire list in more depth.

Incidentally, one of the purposes of putting this out is to make it easy to prove that I'm a hypocrite, if indeed I am.
