Monday, February 28, 2011

It would be handy if I could think as thoroughly about a subject I haven't written about as one I have.
In the last eighteen days, I've come to realize that when I say, "New arguments are many orders of magnitude rarer than arguments trying to appear new," I mean it utterly emphatically.
The point of evaluating every argument I come across is not, primarily, to have the evaluation of those arguments. I have at least two goals above that; the lesser, to practice logic so as to become more facile; the greater, to learn to recognize an argument worth evaluating.
There's a broadly held myth that peer review is necessary to overcome one's biases. While I cannot verify the causal path, I have every other reason to believe that this belief stems from the fact that peer review is highly effective at countering one's biases.
I hardly feel the need to debunk the myth directly; instead, solo-review: pre-evaluate an argument and make a judgment as to whether it is worth reviewing. Write this down or otherwise ensure you won't misremember it later. (Optional: detail a few reasons for the judgment. I find it helpful to intentionally ask myself for features the argument should not contain.) Then, review it anyway and compare the review to your judgment.
This is not a difficult process. It is not arcane. It is not even difficult to invent if you haven't seen it before, provided you're familiar with the idea of objective tests.
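For concreteness, here is a minimal sketch of the bookkeeping in Python - the log filename, field names, and yes/no judgment scale are my own placeholders, not a prescribed format:

    import datetime
    import json

    LOG = "judgments.jsonl"  # placeholder filename

    def record_judgment(argument_id, worth_reviewing, expected_flaws):
        """Write the pre-review judgment down so it can't be misremembered."""
        entry = {
            "id": argument_id,
            "when": datetime.date.today().isoformat(),
            "worth_reviewing": worth_reviewing,  # the yes/no call
            "expected_flaws": expected_flaws,    # features it should NOT contain
        }
        with open(LOG, "a") as f:
            f.write(json.dumps(entry) + "\n")

    def compare(argument_id, actually_worthwhile):
        """After the full review, check the recorded judgment against the outcome."""
        with open(LOG) as f:
            for line in f:
                entry = json.loads(line)
                if entry["id"] == argument_id:
                    verdict = "hit" if entry["worth_reviewing"] == actually_worthwhile else "miss"
                    print(argument_id, "predicted", entry["worth_reviewing"],
                          "actual", actually_worthwhile, "->", verdict)

Something like record_judgment("essay-42", False, ["equivocates between two senses of 'natural'"]) before reading in full, then compare("essay-42", False) afterward; the hits and misses accumulate into exactly the track record the comparison step needs.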
Coming back to the original post's thrust, whenever I see a feature of the belief landscape like this myth, I again realize that it is hardly worth paying attention to what the common beliefs are on any particular subject. I find once again that ad populum is, in fact, a fallacy.
Common beliefs are something to be explained, not something to explain with. As a bonus, attempting to explain true common beliefs generally turns up the explanation, "It is true," with details on how it is true.
In case you think the objective review process may not be worth your time: it also teaches recognition of subcategories of 'worthwhile.' I was just trying to learn to see arguments that would increase my personal supply of true beliefs, but I ended up being able to quickly evaluate arguments against any of my goals, at will. Similarly, my habit of predicting particular failures before review has taught me to quickly recognize why, exactly, an argument fails the test.
Having completed that project, I naturally applied it, and was I ever surprised at the incredible saturation level of pointless arguments! I think I need more effective countermeasures against my optimism bias. I don't have a problem with individuals being self-serving, but I would have thought that more would at least try to offer something, to have the argument serve both themselves and others, instead of relying on pure trickery. I wonder if this is partially due to ingrained jealous habits.
Though fair warning: this overall evaluation of the argument pool may be premature. I used a shortcut method, one I hope to detail in a future post, which involves looking at a concrete instance of something, asking my intuition for a list of similar instances, and then scanning the list to come up with my percentage estimate. I will be using the (judge)-(review anyway) technique on it recursively when the opportunity arises; even though the method has been very reliable in the past, without a theoretical framework I won't know what the failure modes are until I trip over them.
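As a minimal sketch of the arithmetic involved (the names are mine; the whole method stands or falls on whether intuition's recalled list is representative):

    # Sketch of the shortcut: recall a list of similar instances, scan it,
    # and report the matching fraction as the percentage estimate.
    # The representativeness of what gets recalled is the untested assumption.
    def percentage_estimate(recalled_instances, matches_feature):
        if not recalled_instances:
            raise ValueError("nothing recalled; no estimate possible")
        hits = sum(1 for inst in recalled_instances if matches_feature(inst))
        return 100 * hits / len(recalled_instances)

The availability bias lives entirely in recalled_instances, which is why the recursive (judge)-(review anyway) check above is the right countermeasure.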
Ironically, despite the fact that I'll be summarily rejecting more pieces of writing, methods like this one have allowed me to see more useful arguments. I end up not rejecting some I previously would have. Given a large corpus or a complex argument, the thought of detailed analysis can be lethally off-putting. Judgment occurs, after all, whether I've refined the capacity or not. Now it is much easier to scan large volumes of sources I'm unimpressed with, looking specifically for details counter to my expectations. In the past, I have repeatedly been surprised to find that bad quality is not nearly as monolithic as it seems - nor as it is commonly portrayed - despite consistently confirming that most is, indeed, bad.
Thursday, February 10, 2011
Epistemic Expertise
I'ma get the tangents out of my brain so I can concentrate.
"Logical inference is simply insufficient."Untrue. The impression comes from the extra-large volume of crud available in the field. Due diligence here is especially strict. Add in the population of bullshitters who aren't even trying to get it right...and logical reasoning gets a PR problem.
I find this particular statement ironic, though.
"which is ~90% of the time unjustifiable."Why is it unjustifiable? Under what framework?
"There's only those 2 choices when any reasonable fraction of folks disagree : they're ALL stupid, underinformed, or evil, or you might well be wrong. "And here's a straightforward logical inference being used, including the principle of excluded middle.
Tangent complete.
"Then what makes you think you sit in a privileged epistemological position?"
I'm epistemically privileged because I've spent well over a decade intensively studying epistemology.
An interesting property of advanced epistemology is that if I'm doing it right, I must necessarily get answers frequently different from everyone else's. If I get the same answer, it should frequently be for different reasons. Prima facie, this is indistinguishable from being a crank.
"Then why don't you take your opponents positions (not their arguments) more seriously?"An interesting thing about ratiocination is that you must be able to answer every argument. If your thinking is actually solid, then you should be able to find the contradictions in all opposing arguments. (This is not as huge a task as it seems. New arguments are many order of magnitude rarer than arguments trying to appear new.)
But clarity of communication is on average...not good. Do you want to miss the killer counter-argument because it was difficult to parse? Therefore, epistemology effectively teaches a form of hermeneutics.
Running around practising this hermeneutics regularly, I learned something that should have been obvious - positions must generally be consistent with the holder's daily experience. As much as biases can deflect us from truth, it is harder for the bias to activate every day than it is to simply accept the truth. I suspect a lot of religious hypocrisy comes from this - daily truths are acknowledged, so that the adherent can effectively go about their day, and only have to spin up the biases in the rarer, specifically theological situations.
So already there are two categories - closely following Hanson's near/far mode categorization. It is easy, indeed common, for far positions to have no basis in reality whatsoever. You can safely ignore all of them. (Things tagged 'speculation' are usually in this category.) The purpose of this kind of position is not to be accurate. Its usefulness is in building alliances.
If you can determine that a position interacts with someone's daily life and contradicts your position, it cannot be so easily dismissed, regardless of the quality of the arguments the holder can render in its favour. The truth must be consistent with their experiences. So: is your position's consistency with their experience non-obvious to your opponent? Are their far-mode positions contaminating the discourse? Are you failing to communicate well? Did you piss them off, so that they're opposing you out of spite?
If it's none of the above, the most likely explanation is that they have data you don't: they are familiar with situations novel to you, and those situations are inconsistent with your hypothesis.
Together, this means that, subject to the caveats detailed above, you must have an answer to every counter-argument - and that a position by itself is a form of argument.
Tuesday, February 8, 2011
Hypervillainism
After the suggestion, 'hyperstition.'
The tendency to invent villainous agents to ameliorate feelings of helplessness.
Quick question: can you think of an example of a totally helpless situation? Globe warms? You can plan coastal escape routes. Worried about supernovae? You can at least ensure the globe isn't totally sterilized.
Also, an aha moment. Why does 'storm' have such a strong negative connotation? Because storms used to actually be worth worrying about. It used to puzzle me because the storms in my life are light shows, not threats. The worst I've seen would have made the day somewhat inconvenient for a tent-dweller, never mind anyone behind brick. There are actual blizzards, but I live many, many miles south of where they happen.