Tag Archives: consciousness

SIA on other minds

Another interesting implication of the self-indication assumption (SIA), if it is right, is that solipsism is much less likely to be correct than you previously thought, and relatedly that the problem of other minds is less problematic.

Solipsists think they are unjustified in believing in a world external to their minds, as one only ever knows one’s own mind and there is no obvious reason the patterns in it should be driven by something else (curiously, holding such a position does not entirely dissuade people from trying to convince others of it). This can then be debated on grounds of whether a single mind imagining the world is more or less complex than a world causing such a mind to imagine a world.

The problem of other minds is that even if you believe in the outside world that you can see, you can’t see other minds. Most of the evidence for them is by analogy to yourself, which is only one ambiguous data point (should I infer that all humans are probably conscious? All things? All girls? All rooms at night time?).

SIA says many minds are more likely than one, given that you exist. Imagine you are wondering whether this is World 1, with a single mind among billions of zombies, or World 2, with billions of conscious minds. If you start off roughly uncertain, updating on your own conscious existence with SIA shifts the probability of World 2 to billions of times the probability of World 1.
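The update above can be sketched numerically. This is only an illustration, assuming equal priors over the two worlds and a population of one billion (both numbers are arbitrary stand-ins, not anything from the argument itself):

```python
# Minimal sketch of the SIA update described above (illustrative numbers only).
# World 1: one conscious mind among billions of zombies.
# World 2: billions of conscious minds.
N = 10**9  # assumed number of conscious minds in World 2

prior_w1 = 0.5  # start roughly uncertain between the two worlds
prior_w2 = 0.5

# SIA weights each hypothesis by the number of observers it contains, so
# conditional on your own conscious existence the likelihoods are
# proportional to the count of conscious minds in each world.
posterior_w1 = prior_w1 * 1
posterior_w2 = prior_w2 * N

# Normalize so the posteriors sum to one.
total = posterior_w1 + posterior_w2
posterior_w1 /= total
posterior_w2 /= total

print(posterior_w2 / posterior_w1)  # World 2 comes out N times as likely
```

With equal priors the ratio of the posteriors is just the ratio of observer counts, which is where the "billions of times" comes from.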

Similarly for solipsism. Other minds probably exist. From this you may conclude the world around them does too, or just that your vat isn’t the only one.

Philosophy of mind review

I recently read A Brief Introduction to the Philosophy of Mind, a short undergraduate text. I didn’t understand some bits, but I’m not sure if that’s because the book wasn’t that good or philosophy isn’t or I’m not. Here I list them, for you to enlighten me on:

1. It’s apparently standard to use what you do or don’t want to believe as evidence for what is true. E.g. A legitimate criticism of parallelism and epiphenomenalism is that they are ‘fatalistic’. If a theory means that aliens wouldn’t feel the same as us, then it is too anthropomorphic. The problem of other minds implies that we don’t know how others feel, but we tend to assume we do, therefore we do and anything that implies otherwise is wrong. “Externalism, then, opens the door to an unpalatable form of skepticism, and this is reason enough to adopt internalism instead.” Is there some legit reason for this?

2. It’s apparently standard to use the fact that you can imagine a situation where the theory wouldn’t hold as evidence that it isn’t true. E.g. That you can imagine someone with a different brain state and the same mind state is evidence against their coincidence. You can imagine zombies, so functions or brain states can’t determine mental states. It would be correct to say that your previous concept of x can’t determine y if you can imagine it varying with the same y, but it’s not evidence that the concept can’t be extended to coincide.

3. An argument against the interaction between mind and brain necessary for dualism: “…The mind is non-physical and so does not occupy space. If the mind cannot occupy space, there can be no place in the brain or space where interaction happens”. Why does causality have to take up space?

4. Parallelism (the version of dualism where there is no interaction between mind and body, but it so happens that they coincide, thanks to God or something else conveniently external) is not criticized for the parallel existence of a physical world being completely unnecessary to explain what we see if it doesn’t interact with our minds.

5. An argument given against brain states coinciding with mental states is that a variety of brain states produce roughly the same mental states – for instance hearing the sound of bells ringing coincides with quite different brain states in someone whose brain has been partly damaged and the relevant parts replaced by other neuroplastic brain regions, but we assume the experience is basically the same. Similarly, for reasons mentioned in 1 we would like to think aliens with different brains have the same feelings. Apparently, ‘these kinds of considerations have motivated philosophers (e.g., Jerry Fodor) to adopt an idea called the principle of multiple realization. According to this principle…the same type…of mental state, such as the sensation of pain, can exist in a variety of different complex physical systems. Thus it is possible for…forms of life to share the same kinds of mental states though they might have nothing in common at the physical level. This principle…has led many philosophers to abandon the identity theory as a viable theory of mind.’

But the evidence that other people or creatures have similar mental states to you is by analogy to you, and analogy becomes weaker as you know their brains are significantly different – there is no reason to suppose that a different creature feels exactly the same as you. Also you can say brain states coincide with mental states while maintaining that a broad class of brain states corresponds to similar mental states. Obviously a variety of brain states coincide with variations on ‘hearing bells ring’ if you can hear bells ring while hearing other things, or after you have learned something, or when you are sleepy. You can say the brain states have something in common without requiring that they be identical. There is no evidence that they have ‘nothing in common at the physical level’. I don’t see why there being more than one exact brain state that coincides with apparent pain refutes an identity between brains and minds.

6. Functionalism is put forward as an explanation of consciousness. It doesn’t seem to explain qualia, because someone with an inverted colour spectrum of qualia would presumably behave the same. To which functionalists apparently argue that this doesn’t matter that much and such differences between experiences are probably common by virtue of functions being implemented differently in different brains. But if brain states other than functions characterize conscious experience, it seems you have gone back to some theory where any old non-functional brain states determine mental states anyway. Or does the presence of just any ‘function’ cause awareness, then other things determine what the awareness is of? What classes as a ‘function’ anyway? Something that evolution was actually trying to achieve?

7. To decide whether folk psychology can be eliminated by eliminative materialism, one question given is whether it is a theory (because there is a precedent of other theories being eliminated). The fact that it gives false predictions sometimes and we don’t discard it is said to show it isn’t a theory. “If a scientific theory yields even one false prediction, this is usually reason enough to think it is a bad theory and ought to be abandoned or amended”. True for some theories maybe, but not for theories about likely behavior of messy systems, such as those in social science and psychology. And why can’t it be eliminated if it’s not a theory? If it’s something like a theory except wrong more often, does that protect it somehow?

8. Supervenience is the idea that mental properties depend on physical ones, but can’t be reduced to them entirely. Arguments given against this: a) Supervenience wouldn’t imply that physical properties cause mental ones – it could still be vice versa. We want to think physical properties are primary for some unexplained reason. Therefore supervenience is unsatisfactory. But if physical properties causing mental is necessary in a theory for some reason, doesn’t that just narrow it down to ‘supervenience + physical causes mental’ theory being true? b) Supervenience doesn’t actually explain anything – it just describes the relationship. But what is an explanation other than a simpler description which includes the phenomena you wanted explained? What would an explanation look like?

9. What determines the content of a mental state? Internalism says the contents of your mind, externalism says your relationships to external things. Seems like a pointless definition question – supplying a label and asking what it defines. You can categorize thoughts according to either. I must be missing something here.