Gertie wrote: ↑May 19th, 2020, 2:27 pm
I'd be interested if you can give a clear, coherent explanation of Dennett's position on this.
Okay, keeping in mind that "intentionality" as a concept is concerned with the content of a mental state...
Dennett's view is that we ascribe the content of a mental state to an agent as a way to predict and understand that agent's current and future behavior. To have a belief that X means that in various contexts you are likely to exhibit certain kinds of behavior, because we have social norms about what a belief that X requires and means.
There will of course be variations based on other beliefs you may have. Mary may believe it is raining hard outside, and one contextual behavior we would expect is that when she goes out, she'll use an umbrella or otherwise cover herself. If she doesn't, that would be a violation of the expected norms for the belief that it is raining hard. But if it turned out that she also believes it is fun to walk in the rain, or also believes her car is just 10 feet from the door and that the car is her destination, those beliefs would explain why her behavior failed to meet expected norms. Beliefs form networks.
What doesn't matter in ascribing intentionality to a subject is what configuration of stuff is inside the subject, making it behave the way it does. We don't care about that any more than we care about the details of a computer program in determining whether it is a spreadsheet or not. All we care about is whether an agent behaves rationally according to social norms, or in the case of a computer program, whether it is good at doing what we expect of THAT sort of program.
That's one reason why Dennett's views on intentionality make him an anti-reductionist. Even between members of the same species (say, human beings), there is such variation in the ways different people's nervous systems physically instantiate what we'd essentially call the "same" belief that a common physical element across all human nervous systems would be impossible to find. At any rate, it wouldn't be necessary to find such a thing.
For eliminativists, failure to reduce means that mental states don't exist, or are some kind of folk myth. But because Dennett holds the views he does on how mental states acquire content in the first place, he doesn't need reductionism in order for mental states to be real in a robust, pragmatic sense.
Gertie wrote: ↑May 19th, 2020, 2:27 pm
And how his anti-reductionism of mental states ties in to his claim that simply describing one's mental states captures all their qualities.
His theories of intentionality are separate from his theories about consciousness. His approach to intentionality concerns mental-state content. His approach to consciousness concerns how some intentional states come to dominate the brain's networks of neurons over other states.
And he has never said that "simply describing one's mental states captures all their qualities". The closest thing to this that he has said is that a successful theory of consciousness must explain how people come to say the things they say about what it is like to have conscious experiences of one sort or another--and when such a theory can do this, there is nothing left for it to do.