Gertie wrote: ↑September 8th, 2020, 3:22 pm
That would be misleading. Qualia are not properties of brain processes, but products of brain processes.
Could you clarify how the difference works here?
I'd think that difference was pretty obvious. The product of a process is not a property of the processor. E.g., "Guernica" is a product of Picasso, but not a property of him. Cotton (the fabric) is a product of a textile mill, but not a property of the mill. Honey is a product of bees, but not a property of them. Though, we could say the ability to make honey is a property of bees --- and the ability of some brains to produce consciousness is a property of those brains.
Just to agree some terms - would you go with qualia being akin to units of certain types of phenomenal experience, like sensory perceptions, emotions and sensations? Or all 'what it's like' experience?
Yes. Qualia are the brain's mode of representing all the various internal and external states it can detect to itself.
And what do you mean by 'consciousness' here, which the brain ''presents phenomenal experience'' to? Other types of experiential states, a self which is something different to experiential states, or something else?
That is a tough one, because the term "conscious" has two different senses in ordinary speech --- it is contrasted with "unconscious," e.g., asleep or in a coma, etc., and with "non-conscious," assumed of plants, rocks, etc. So (living) humans are conscious in the second sense even when asleep. We can then define "consciousness" as the state of being conscious in the first sense. But that still doesn't tell us what consciousness is. My own (currently) preferred analysis, gaining favor among some neurophysiologists and AI researchers, is,
a system is conscious when it has the means to gather a wide variety of information about its own internal states and external environment, has an ability to store information about past states of itself and the environment, can use that data to generate a dynamic, virtual model of itself and its surroundings, can run "what-if" scenarios in the model, drawing upon memories of past actions and the results thereof, and can direct its actions based on the output of that processing. I think we'd be willing to call any system that could do those things "conscious." It would pass the Turing test. Our subjective "conscious experience" is the ongoing operation of that virtual model.
Again, what is the ''us'' or Me here doing the distinguishing?
The "me" is the system as a whole, as represented in the virtual model --- the virtual "me." The brain generates that model, not unlike the way a computer and its program generate a virtual world for a video game, except that the raw data for the brain's model is drawn from the environment in real time.
If I'm reading you correctly, you're saying Dennett believes it's arbitrary that sticking my hand in a fire feels bad, and eating when I'm low on calories feels good?
Oh, no. Dennett wouldn't say anything like that. The tags --- qualia --- applied to mark various distinguishable inputs are arbitrary, in the sense of being unpredictable, but the evaluation of some of the information they convey is surely pre-programmed (via evolution, as you say).
Umm OK. I'd thought Dennett disputed their ineffability.
He doesn't dispute it; he dismisses it, as an unnecessary feature of an unnecessary concept (qualia).
They are also intrinsically subjective --- there is no way for me to know whether the sensation you experience when seeing red is the same as mine --- that question doesn't even make sense.
Right it is unknowable, but the claim the question doesn't make sense implies a whole lot more.
It makes no sense in the same way that "The universe and everything in it is doubling in size every minute" makes no sense. It is a question impossible in principle to answer, as the latter is a proposition impossible in principle to verify. It is an idle question.