The truetemp case is a possible objection to reliabilism.  In this bit of philosophical science fiction, a man unknowingly has a chip implanted in his brain.  The chip is connected to a sensor on top of his skull and causes him to form beliefs about the temperature.  So during the ordinary course of the day, he finds himself thinking “it’s 80 degrees…it’s now 78 degrees…” and so on.  He’s perfectly accurate, but he doesn’t know why he’s thinking about the temperature, and he has no reason to believe he’s accurate.  The case is supposed to be a counterexample to some forms of reliabilism.

In the truetemp experiments, Swain, Alexander, and Weinberg found that experimental subjects who were presented with a clear case of knowledge before being given the truetemp case were less likely to ascribe knowledge in it than subjects who were presented with the truetemp case first.  Ernest Sosa responds:

But surely the effects of priming, framing, and other such contextual factors will affect the epistemic status of intuition in general, only in the sort of way that they affect the epistemic status of perceptual observation in general. One would think that the ways of preserving the epistemic importance of perception in the face of such effects on perceptual judgments would be analogously available for the preservation of the epistemic importance of intuition in the face of such effects on intuitive judgments. The upshot is that we have to be careful in how we use intuition, not that intuition is useless.

That’s fine, as far as it goes, but it’s one place where the analogy between intuition and perception drives me batty.  As a matter of fact, even within the boundaries of common sense, we have a grasp of the circumstances that produce perceptual error, and of methods for removing that error.

Müller-Lyer illusion? Bust out the damn ruler.  Funny lighting? Just take the damn thing outside.  In complex cases, such simple solutions may break down (some visual illusions persist under conditions of full illumination, etc.), but we still have scientific techniques for getting at the truth, even if it requires prolonged investigation.  In contrast, we have no good account of how to remove biasing factors from philosophical intuitions.

As far as I’ve muddled my way around the topic, part of the problem is the obscure character of the facts that correspond to philosophical intuitions.  Since color, for example, purports to be an attribute of physical objects, albeit one intimately connected to the tendencies of certain animals to make color judgments, there are characteristic methods for investigating the presence of such attributes.  No similar account has been given for the subject matter of characteristic philosophical intuitions.  More to follow when I’m feeling more brash…

