
Re: Consciousness without a brain?

Posted: May 23rd, 2020, 10:31 am
by Consul
Faustus5 wrote: May 22nd, 2020, 2:48 pm
Gertie wrote: May 22nd, 2020, 2:32 pm Information Integration Theory has a go at quantifying consciousness, sort of, I think. I don't understand it myself.

https://en.wikipedia.org/wiki/Integrate ... ion_theory
It's pretty cool, but I don't think what they are doing is really quantifying consciousness so much as quantifying something important in the brain processes that constitute consciousness. Insofar as some mental states can only be categorized (and are created) by social norms, there is always going to be an aspect of consciousness that cannot be quantified because there's nothing there TO quantify.
QUOTE>
"IIT strives, among other things, not just to claim the existence of a scale of complexity of consciousness, but to provide a theoretical approach to the precise quantification of the richness of experience for any conscious system. This requires calculating the maximal amount of integrated information in a system. IIT refers to this as the system’s phi value, which can be expressed numerically, at least in principle."

Integrated Information Theory of Consciousness: https://www.iep.utm.edu/int-info/
<QUOTE

What is quantified and measured is integrated information; and according to IIT, consciousness is integrated information.

Re: Consciousness without a brain?

Posted: May 23rd, 2020, 10:48 am
by Consul
Terrapin Station wrote: May 23rd, 2020, 8:51 amIf we read that as "The subject having experiences is not an experience," then you're simply restating the claim that I'm saying is mistaken. You, as the subject, are not something different than your experiences.
If we read that as "What experiences are about, or the contents of an experience, are not experiences," that's fine, but it has nothing to do with the fact that you are not different than your experiences.
By "content of consciousness/experience" I mean its immanent subjective mental (experiential/phenomenal) content, and not what a perceptual experience is about, i.e. the experience-transcendent perceptual object. I know that in the philosophy of mind "content" is often used to refer to what I call perceptual, intentional, or representational objects. The object of a perceptual experience or perception is not part of it, but the content is.

Your assertion that I am not different from my experiences is not a fact, but a nonfact, because it's false. It's nonsensical to say that the subject of experience is itself an experience or a "bundle" of experiences. No experience or bundle of experiences can possibly experience something, have or undergo experiences. No item of mentality/experientiality can possibly be a subject of mentality/experientiality, and vice versa.

QUOTE>
"Hume's bundle theory of ourselves may be seen, I think, to be perfectly absurd. Indeed, to my mind, the simplest consideration against the view is perfectly decisive: We must acknowledge the possibility of just a single thinking, or a solitary perceiving, or a lonely experiencing. As such a lonely experience is so perfectly solitary, it can't be part of any bundle or collection of perceptions. On a bundle theory, then, it must be an experiencing of no experiencer at all, which is obviously absurd. Nor will we be helped in avoiding absurdity, it's evident, by a clever logical dodge, as with defining 'minimal bundle' comprising just one single experiencing, which may somehow serve to comprise, or to compose, a being whose experiencing it is. For, it's also obviously absurd to think that an experiencing may be its own subject, or to think that there may be no difference at all between an experiencer, however small and impoverished, and, on the other side, the experiencing that this subject enjoys.

Please don't misunderstand me here. I'll allow that it's perfectly possible for there to be a person who only ever engages in extremely little experiencing. Just so, it's perfectly possible, I'll at least allow, for there to be someone who exists for only a tiny fraction of a single second, experiencing only just then, and then only in just a certain absolutely specific way, perhaps as simple as just experiencing bell-tonely, for instance. As I'll happily allow, this 'terribly fleeting' being won't ever enjoy, or he won't ever suffer, any more experiencing than just that, during his single existential moment. What I'm not prepared to allow, by contrast, is that there's ever any experiencing, or any (so-called) experience, that somehow constitutes an experiencer; no more than there's any experience, or any experiencing, really, that's not the experience of an experiencer. Rather, in every possible case, it's the experiencer that's quite basic, with the experiencing, or the (so-called) experience, only ontologically parasitical (on the sentient being). It may be helpful, I think, to put the point this graphically: An Almighty God, if Such there be, could make an experiencer who never enjoys any experience; maybe, during his very brief existence, he's always in absolutely deep sleep, say, and, for that reason, he's never experiencing. But, not even an Almighty God could make there be some experiencing (or, what's colloquially called some experience) that wasn't the experiencing (or the so-called experience) of an experiencer, who's ontologically more basic than the experiencing.

On views very different from bundle theories, as with Descartes's idea of substantial individual souls, there's no place for such absurd thoughts as that of an experiencing, or an experience, without any experiencer. (Nor is there any place for such an absurd idea as that of an experiencer who's identical with his experiencing.) Rather than an experience that's floating free of all souls, we should have each experience be an experience of a properly powerful individual, maybe a manifestation of the individual's power to experience. Though we may disagree with Descartes on many other matters, on this critical point, he is, apparently, right on the money. By contrast, Hume's bundle theory is wrong."

(Unger, Peter. All the Power in the World. Oxford: Oxford University Press, 2006. pp. 57-8)
<QUOTE

Re: Consciousness without a brain?

Posted: May 23rd, 2020, 12:27 pm
by Gertie
Consul wrote: May 23rd, 2020, 10:31 am
Faustus5 wrote: May 22nd, 2020, 2:48 pmIt's pretty cool, but I don't think what they are doing is really quantifying consciousness so much as quantifying something important in the brain processes that constitute consciousness. Insofar as some mental states can only be categorized (and are created) by social norms, there is always going to be an aspect of consciousness that cannot be quantified because there's nothing there TO quantify.
QUOTE>
"IIT strives, among other things, not just to claim the existence of a scale of complexity of consciousness, but to provide a theoretical approach to the precise quantification of the richness of experience for any conscious system. This requires calculating the maximal amount of integrated information in a system. IIT refers to this as the system’s phi value, which can be expressed numerically, at least in principle."

Integrated Information Theory of Consciousness: https://www.iep.utm.edu/int-info/
<QUOTE

What is quantified and measured is integrated information; and according to IIT, consciousness is integrated information.
What are they referring to here by ''integrated information''? What is it they are actually counting in phi?

Re: Consciousness without a brain?

Posted: May 23rd, 2020, 1:21 pm
by Consul
Gertie wrote: May 23rd, 2020, 12:27 pm
Consul wrote: May 23rd, 2020, 10:31 am What is quantified and measured is integrated information; and according to IIT, consciousness is integrated information.
What are they referring to here by ''integrated information''? What is it they are actually counting in phi?
Unfortunately, Tononi's concept of integrated information is hard to understand—and I don't think I really understand it. What is clear is that it is different from Shannon's concept of information.

For technical details, see:

* http://www.scholarpedia.org/article/Int ... ion_theory

* https://journals.plos.org/ploscompbiol/ ... bi.1003588

QUOTE>
"Abstract: This paper presents Integrated Information Theory (IIT) of consciousness 3.0, which incorporates several advances over previous formulations. IIT starts from phenomenological axioms: information says that each experience is specific – it is what it is by how it differs from alternative experiences; integration says that it is unified – irreducible to non-interdependent components; exclusion says that it has unique borders and a particular spatio-temporal grain. These axioms are formalized into postulates that prescribe how physical mechanisms, such as neurons or logic gates, must be configured to generate experience (phenomenology). The postulates are used to define intrinsic information as “differences that make a difference” within a system, and integrated information as information specified by a whole that cannot be reduced to that specified by its parts. By applying the postulates both at the level of individual mechanisms and at the level of systems of mechanisms, IIT arrives at an identity: an experience is a maximally irreducible conceptual structure (MICS, a constellation of concepts in qualia space), and the set of elements that generates it constitutes a complex. According to IIT, a MICS specifies the quality of an experience and integrated information ΦMax its quantity. From the theory follow several results, including: a system of mechanisms may condense into a major complex and non-overlapping minor complexes; the concepts that specify the quality of an experience are always about the complex itself and relate only indirectly to the external environment; anatomical connectivity influences complexes and associated MICS; a complex can generate a MICS even if its elements are inactive; simple systems can be minimally conscious; complicated systems can be unconscious; there can be true “zombies” – unconscious feed-forward systems that are functionally equivalent to conscious complexes."
<QUOTE

Also: The Problem with the 'Information' in Integrated Information Theory
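[Editor's note: the "whole irreducible to parts" intuition behind Φ can at least be gestured at numerically. The toy sketch below is NOT Tononi's actual Φ 3.0 algorithm (which, as noted above, departs from Shannon information); it computes the simpler "multi-information" (total correlation) that figured in early Tononi–Edelman integration measures: the sum of each unit's marginal entropy minus the joint entropy, which is zero when the units are independent and positive when the whole carries correlations the parts alone do not.]

```python
import itertools
import math

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def multi_information(joint):
    """Total correlation: sum of marginal entropies minus joint entropy.

    `joint` maps tuples of binary unit states to probabilities.
    This is a toy integration measure, not IIT 3.0's Phi.
    """
    n = len(next(iter(joint)))
    h_marginals = 0.0
    for i in range(n):
        # marginal distribution of unit i
        marginal = {}
        for state, p in joint.items():
            marginal[state[i]] = marginal.get(state[i], 0.0) + p
        h_marginals += entropy(marginal.values())
    return h_marginals - entropy(joint.values())

# Two perfectly correlated binary units: the whole carries 1 bit
# of structure beyond its parts.
correlated = {(0, 0): 0.5, (1, 1): 0.5}

# Two independent fair coins: the whole reduces to its parts.
independent = {s: 0.25 for s in itertools.product((0, 1), repeat=2)}

print(multi_information(correlated))   # 1.0
print(multi_information(independent))  # 0.0
```

On this toy measure the correlated pair is "integrated" (1 bit) and the independent pair is not (0 bits); real Φ additionally searches over partitions and over cause–effect repertoires, which is what makes it so hard to compute and to interpret.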

Re: Consciousness without a brain?

Posted: May 23rd, 2020, 2:43 pm
by Consul
Faustus5 wrote: May 22nd, 2020, 10:54 amI'm totally cool with that version of qualia. Where I part company with the qualiafiles is the moment they try to spin the concept so that it is somehow immune to scientific scrutiny.
The big epistemological and methodological problem the science of (phenomenal) consciousness has is that its experiential contents are subjective, private, and directly accessible only from the first-person perspective. However, it doesn't follow that experiences aren't composed of or constituted by neural processes which are nonsubjective, nonprivate, and directly accessible from the third-person perspective.

Re: Consciousness without a brain?

Posted: May 23rd, 2020, 3:40 pm
by Terrapin Station
Consul wrote: May 23rd, 2020, 10:48 am Your assertion that I am not different from my experiences is not a fact, but a nonfact, because it's false.
Here's an equal and opposite argument:

Your assertion that you are different than your experiences is not a fact, but a nonfact, because it's false.

Is that a good argument?
It's nonsensical to say that the subject of experience is itself an experience or a "bundle" of experiences. No experience or bundle of experiences can possibly experience something, have or undergo experiences.
This assumes that you're something different than your experiences. If you're not different than your experiences, then you're not something (different) that has or that undergoes experiences. You ARE those experiences. You're not something different than them that just is subject to them.
No item of mentality/experientiality can possibly be a subject of mentality/experientiality, and vice versa.
I'm not sure what this is saying because of the odd way in which it's using "item" versus "subject."

Re: Consciousness without a brain?

Posted: May 23rd, 2020, 3:46 pm
by Gertie
Consul wrote: May 23rd, 2020, 1:21 pm
Gertie wrote: May 23rd, 2020, 12:27 pm What are they referring to here by ''integrated information''? What is it they are actually counting in phi?
Unfortunately, Tononi's concept of integrated information is hard to understand—and I don't think I really understand it. What is clear is that it is different from Shannon's concept of information.

For technical details, see:

* http://www.scholarpedia.org/article/Int ... ion_theory

* https://journals.plos.org/ploscompbiol/ ... bi.1003588

QUOTE>
"Abstract: This paper presents Integrated Information Theory (IIT) of consciousness 3.0, which incorporates several advances over previous formulations. IIT starts from phenomenological axioms: information says that each experience is specific – it is what it is by how it differs from alternative experiences; integration says that it is unified – irreducible to non-interdependent components; exclusion says that it has unique borders and a particular spatio-temporal grain. These axioms are formalized into postulates that prescribe how physical mechanisms, such as neurons or logic gates, must be configured to generate experience (phenomenology). The postulates are used to define intrinsic information as “differences that make a difference” within a system, and integrated information as information specified by a whole that cannot be reduced to that specified by its parts. By applying the postulates both at the level of individual mechanisms and at the level of systems of mechanisms, IIT arrives at an identity: an experience is a maximally irreducible conceptual structure (MICS, a constellation of concepts in qualia space), and the set of elements that generates it constitutes a complex. According to IIT, a MICS specifies the quality of an experience and integrated information ΦMax its quantity. From the theory follow several results, including: a system of mechanisms may condense into a major complex and non-overlapping minor complexes; the concepts that specify the quality of an experience are always about the complex itself and relate only indirectly to the external environment; anatomical connectivity influences complexes and associated MICS; a complex can generate a MICS even if its elements are inactive; simple systems can be minimally conscious; complicated systems can be unconscious; there can be true “zombies” – unconscious feed-forward systems that are functionally equivalent to conscious complexes."
<QUOTE

Also: The Problem with the 'Information' in Integrated Information Theory
I'm none the wiser, but thank you :)

Re: Consciousness without a brain?

Posted: May 23rd, 2020, 3:58 pm
by Terrapin Station
Ah, I realized that this is simply yet another way to reword the same claim you're making:

"No item of mentality/experientiality can possibly be a subject of mentality/experientiality, and vice versa. "

Rewording the same claim over and over isn't an argument for the claim.

Re: Consciousness without a brain?

Posted: May 23rd, 2020, 4:25 pm
by Consul
Terrapin Station wrote: May 23rd, 2020, 3:58 pmAh, I realized that this is simply yet another way to reword the same claim you're making:

"No item of mentality/experientiality can possibly be a subject of mentality/experientiality, and vice versa. "

Rewording the same claim over and over isn't an argument for the claim.
The simple argument is that it's nonsensical to say that experienceRs are (bundles of) experiences.
Did you read the Unger quote? Here's yet another one:

QUOTE>
"There are two kinds of entity that feature in the mental realm. On the one hand, there are items of mentality (mental items). These are such things as sense experiences, beliefs, emotions, and decisions, which form the concrete ingredients of the mind. On the other hand, there are subjects of mentality (mental subjects). These are the persisting entities that have mental lives and in whose mental lives mental items occur; they are the things that have experiences, hold beliefs, feel emotions, and make decisions. Mental items can occur only as elements in the lives of mental subjects. This is because our very concept of any type of mental item just is the concept of a subject’s being in a certain mental state, or performing a certain kind of mental act, or engaging in a certain kind of mental activity. It is fundamental to our understanding of the forms of mentality in question that for an experience to occur is for a subject to experience something, for a belief to occur is for a subject to believe something, for a decision to occur is for a subject to decide something, and so on for each type of mental item. To suppose that an item of mentality could occur without a subject of mentality would be as absurd as supposing that there could be an instance of motion without something that moves, or an instance of smiling without something that smiles.

Some philosophers of a radically empiricist persuasion have rejected an ontology of mental subjects on the grounds that the attachment of mental items to subjects is not introspectively detectable. They have insisted that what we ordinarily think of as the mental life of a persisting subject is really only an organized collection of ontologically autonomous mental items that stand to one another in certain psychological and causal relations, and are typically causally associated with the same biological organism. There is a double confusion here.

In the first place, even if these philosophers were right in supposing that the attachment of mental items to subjects is not introspectively detectable, there is no getting around the point that our very concept of any type of mental item is the concept of a certain form of ‘mentalizing’ by a subject. Whatever the introspective situation, it simply makes no sense to envisage the occurrence of an experience without someone who has it, or the occurrence of a belief without someone who holds it, or the occurrence of a decision without someone who makes it. If the recognition of an ontology of subjects fails to pass some empiricist test of respectability, this serves to show only that the test is misconceived.

Secondly, those philosophers who have denied that the attachment of mental items to subjects is introspectively detectable have approached the issue of such detection in the wrong way. They have wrongly supposed that the introspective awareness of a mental item is similar in character to the perceptual awareness of a physical item, except that it is directed onto objects that exist in the inner arena of the mind rather than in the outer arena of the physical world. And because they have employed this perceptual model of the introspective awareness of mental items, they have further assumed, again wrongly, that if the attachment of a mental item to a subject is to be introspectively detected, this detection will have to take the form of the presentation of an additional object alongside the mental item in the inner arena, the two objects being presented in a form which displays the one as the subject of the other. It is hardly surprising that, on this basis, they have concluded that the attachment of mental items to subjects is not introspectively detectable. But it is the model of introspection that is at fault. When someone is introspectively aware of a mental item, he is not aware of it as an object presented to him. He is aware of it, more intimately, from the inside, as an instance of his own mentalizing—as an instance of his being in a certain mental state, or performing a certain kind of mental act, or engaging in a certain kind of mental activity. The subject’s awareness of himself, and of his role as mental subject, is an essential element of his awareness of the item itself.

There should be no issue, then, over the need for an ontology of mental subjects. One has only to focus on the nature of any type of mental item as our concept of that type reveals it—be it pain, visual experience, belief, decision making, desire, anger, or whatever—to be able to see quite plainly that that sort of thing can be realized only as an instance of mentalizing by a subject. And one has only to think about introspective awareness in the right way to see quite plainly that someone’s introspective awareness of a mental item includes the awareness of himself as its subject."

(Foster, John. "Subjects of Mentality." In After Physicalism, edited by Benedikt Paul Göcke, 72-103. Notre Dame, IN: University of Notre Dame Press, 2012. pp. 72-4)
<QUOTE

Re: Consciousness without a brain?

Posted: May 23rd, 2020, 4:43 pm
by Terrapin Station
Consul wrote: May 23rd, 2020, 4:25 pm
Terrapin Station wrote: May 23rd, 2020, 3:58 pmAh, I realized that this is simply yet another way to reword the same claim you're making:

"No item of mentality/experientiality can possibly be a subject of mentality/experientiality, and vice versa. "

Rewording the same claim over and over isn't an argument for the claim.
The simple argument is that it's nonsensical to say that experienceRs are (bundles of) experiences.
Did you read the Unger quote? Here's yet another one:
Yes, but for the umpteenth time, I'm really only interested in quotations if you're going to defend them against objections (in a directed, specific manner rather than just searching for more quotes).

Re: Consciousness without a brain?

Posted: May 23rd, 2020, 5:15 pm
by Sy Borg
Consul wrote: May 22nd, 2020, 11:52 pm
Greta wrote: May 22nd, 2020, 6:16 pmAs stated, we will know that we understand qualia when we can recreate it.
No, we will know that we understand phenomenal consciousness/subjective experience when the neuroscientists can give detailed answers to the following questions: https://plato.stanford.edu/entries/cons ... neSpecCons
No, you have put the cart before the horse.

Proofs come from experiments and observation, not guesswork and speculation. If we cannot measure qualia, then we cannot understand it. Until that time, we are in the dark and may pretend to be closing in on the problem either for the sake of pride, grant money or politics (atheism/theism battle).

Re: Consciousness without a brain?

Posted: May 23rd, 2020, 5:33 pm
by Consul
Terrapin Station wrote: May 23rd, 2020, 4:43 pmYes, but for the umpteenth time, I'm really only interested in quotations if you're going to defend them against objections (in a directed, specific manner rather than just searching for more quotes).
Let's hear your objections to what Unger, Foster, and I say!

Re: Consciousness without a brain?

Posted: May 23rd, 2020, 5:37 pm
by Consul
Greta wrote: May 23rd, 2020, 5:15 pm
Consul wrote: May 22nd, 2020, 11:52 pm No, we will know that we understand phenomenal consciousness/subjective experience when the neuroscientists can give detailed answers to the following questions: https://plato.stanford.edu/entries/cons ... neSpecCons
No, you have put the cart before the horse.
Proofs come from experiments and observation, not guesswork and speculation. If we cannot measure qualia, then we cannot understand it. Until that time, we are in the dark and may pretend to be closing in on the problem either for the sake of pride, grant money or politics (atheism/theism battle).
Of course, the scientific answers to those questions must be based on empirical data (both third-personal, extrospective ones and first-personal, introspective ones).

Re: Consciousness without a brain?

Posted: May 23rd, 2020, 5:51 pm
by Sy Borg
Consul wrote: May 23rd, 2020, 5:37 pm
Greta wrote: May 23rd, 2020, 5:15 pm No, you have put the cart before the horse.
Proofs come from experiments and observation, not guesswork and speculation. If we cannot measure qualia, then we cannot understand it. Until that time, we are in the dark and may pretend to be closing in on the problem either for the sake of pride, grant money or politics (atheism/theism battle).
Of course, the scientific answers to those questions must be based on empirical data (both third-personal, extrospective ones and first-personal, introspective ones).
If you can't put it in practice then you don't understand it. There will be a conceptually simple proof. I don't think measuring brain states alone will get us there, though. I think other body systems play unacknowledged roles in all of this. It was not so long ago that we had little idea of the influence of the microbiome. Now it's seen as a key player in health. There will be more discoveries of this ilk, I suspect, in relation to metabolic systems. My guess is that there are unknown synergies in the interactions of the nervous, digestive, respiratory, circulatory and endocrine systems that create a sense of being but, until we have experimental evidence, my guess is just one other in a long line.

Re: Consciousness without a brain?

Posted: May 23rd, 2020, 5:55 pm
by Gertie
Consul
''These are the persisting entities that have mental lives and in whose mental lives mental items occur; they are the things that have experiences, hold beliefs, feel emotions, and make decisions. Mental items can occur only as elements in the lives of mental subjects. This is because our very concept of any type of mental item just is the concept of a subject’s being in a certain mental state, or performing a certain kind of mental act, or engaging in a certain kind of mental activity. It is fundamental to our understanding of the forms of mentality in question that for an experience to occur is for a subject to experience something, for a belief to occur is for a subject to believe something, for a decision to occur is for a subject to decide something, and so on for each type of mental item. To suppose that an item of mentality could occur without a subject of mentality would be as absurd as supposing that there could be an instance of motion without something that moves, or an instance of smiling without something that smiles.''
This might be the case, but it's not a logical certainty in the way it's absurd to have a smile without a smiler.

Can there be mental experiencing (verb) without an Experiencer (noun)? The structure of our grammar, the way we think, makes it look absurd, and the fact that our experience has a first person perspective located in a body adds to the impression of Experiencer having experiences.

But when we look at the physical correlates, we don't find a Self correlate located in the brain, no Command and Control Centre watching the Cartesian Theatre play out and issuing orders.

Instead we see lots of subsystems doing their thing and interacting with others. And presumably some process whereby this massively complex cacophony becomes a unified field of consciousness, with a first person perspective moving through space and time, and the ability to focus attention and create something useful and coherent.

The question then, I think, is whether there is anything more to being a Self than that, or whether this set of processes is what amounts to a sense of being a self, an Experiencer (noun).

That's an open question imo.

If you're a monist, you might answer one way; if you think mental experience isn't reducible, you might answer another.