
Re: How would you Design a Humanoid ?

Posted: May 12th, 2022, 4:15 pm
by UniversalAlien
Atla wrote:
Bla bla most members of those now extinct species were content most of the time when they lived.
FALSE :!: Life has nothing to do with being content; balanced sometimes, but never content :idea:

From the first cell, whether created by accident or design, contentedness did not exist;
if it did, there would never have been a second cell, etc., etc.

Re: How would you Design a Humanoid ?

Posted: May 12th, 2022, 4:44 pm
by Sy Borg
Atla wrote: May 12th, 2022, 10:55 am
SteveKlinko wrote: May 12th, 2022, 8:29 am If there is a useful purpose in the Universe for Excessive Suffering (beyond usefulness for Survival), I don't understand it. But I have a feeling that it might not be as bad as we seem to think it is. Great thing for me to say all comfortable here at my desk typing on the computer. But the important thing is that all Suffering is apparently eventually ended. Someone once made the observation that Excessive Suffering beyond any usefulness for Survival exists because there is no Survival advantage in Evolving a shut off switch for this type of Suffering.
Actually I think most organisms seem to be content most of the time. There is a lot of carnage, but even more contentedness. If life was just carnage, it couldn't have made it this far.
Fair point, but that level of carnage has long been unacceptable to humans. Imagine a group of thugs in the street suddenly knocking over a young child and tearing chunks of flesh off her as she screams, but no one is able to help. That situation is completely normal for many social species, even a daily occurrence.

Such animals might spend much of their time ostensibly content, but it is a contentment tempered by skittishness caused by the realistic potential for horrific violence to be perpetrated on their members every day.

Re: How would you Design a Humanoid ?

Posted: May 12th, 2022, 11:38 pm
by Atla
UniversalAlien wrote: May 12th, 2022, 4:15 pm Atla wrote:
Bla bla most members of those now extinct species were content most of the time when they lived.
FALSE :!: Life has nothing to do with being content; balanced sometimes, but never content :idea:

From the first cell, whether created by accident or design, contentedness did not exist;
if it did, there would never have been a second cell, etc., etc.
Non sequitur

Re: How would you Design a Humanoid ?

Posted: May 13th, 2022, 8:15 am
by Pattern-chaser
Pattern-chaser wrote: May 10th, 2022, 9:51 am Why do you express this as a binary choice? Either use emotions or use intellect? Why not both? At the same time, even? It's what humans have always done. Sometimes we use more emotions, and others, we make more use of intellect, but we do both, most of the time, in some (varying) ratio.
Sy Borg wrote: May 10th, 2022, 4:14 pm I am not thinking as you imagine. I just look at the world, millions and millions of absolute idiots making ridiculous emotional claims about topics they know nothing about, that a minute's reflection and study would correct. Have you ever checked out the "logic" of flat-Earthers or "pro-lifers"?

If you have a problem with shifting away from atavism towards intelligence, then you do not think the way I thought you did.
I don't know as much about this as I would like, given that I hope to answer your objections. 🙁 Nevertheless, I have read that AI folks seem to have an understanding that emotions give us purpose; they drive us to aim for things, to seek to achieve them. Here's a quote that seems to support such a view:
Traditional approaches to the study of cognition emphasize an information-processing view that has generally excluded emotion. In contrast, the recent emergence of cognitive neuroscience as an inspiration for understanding human cognition has highlighted its interaction with emotion.

...

Investigations into the neural systems underlying human behavior demonstrate that the mechanisms of emotion and cognition are intertwined from early perception to reasoning. These findings suggest that the classic division between the study of emotion and cognition may be unrealistic and that an understanding of human cognition requires the consideration of emotion.
Link to original article.

And here's another, taken from a review of the book "The Value of Emotions for Knowledge" (link):
If we could assign a foil for this volume, it would be the view that takes emotions to be opposed to rationality, a view on which emotions are generally distracting, fact-twisting, misleading, and unreliable, hindering rather than furthering our epistemic goals. Most of the papers in this volume challenge this picture in one way or another.
Sy Borg wrote: May 10th, 2022, 4:14 pm Ultimately, though, the only useful role emotions will have to AI is the human interface.
See above.

You describe any reliance on emotion as "atavism", which doesn't seem to square with our current understanding of such things. I have always been a little alarmed at the intelligencists (?) who champion their preferred characteristic with the dedication of religious fundamentalists, or sciencists. If the only mental attribute we needed was intelligence, then surely evolution would've allowed the other attributes, such as emotions, to fall by the wayside? Those unencumbered with time-wasting qualities like emotion, or even wisdom, would have succeeded beyond their peers, in the way of evolution. But it hasn't happened. Why is that, do we think?

I think it's because intelligence, untempered by our other mental attributes, is insufficient to our needs.


Sy Borg wrote: May 10th, 2022, 4:14 pm Life is replete with torture, which need not be deliberate.
Ah, OK. I had assumed that torture included intentionality, so this disagreement seems to be just a misunderstanding.

Re: How would you Design a Humanoid ?

Posted: May 13th, 2022, 12:42 pm
by AverageBozo
JackDaydream wrote: May 7th, 2022, 9:33 am
Scientists have been working on this for a long time in the movement of transhumanism, and some aspects may be science fiction while some aspects of artificial intelligence are developing already. It is likely that there are attempts to create superbeings, which merge aspects of the machine or the artificial with sentience. Nanotechnology is an aspect of the upgrading of the human being.

But the question may be where this is going with evolution. Is transformation an aspect of physicality or consciousness? There are also political agendas, and the upgrades may serve the powers of the elite. Perhaps humans as we know them, and other sentient beings, will become increasingly extinct in a world dominated by the humanoid superbeings as masters of planet Earth.

Personally, I am probably not very good at being a machine, which may be more of what is in demand, but I think that I would rather remain a human being with flaws than become a humanoid programmed beyond my own control. Of course, it would be great if the negative aspects of human nature could be eliminated, but it may not be that simple. Loss of sentience may mean loss of emotion, empathy and compassion.
Interesting post. BTW, Isaac Asimov raised the possibility of robots becoming pseudo-human in his short story The Bicentennial Man (written in the 1970s, I believe).

It seems that as human body parts are replaced by the growing number of artificial devices (e.g. artificial hips and knees, artificial heart valves and coronary arteries), humans may actually become more artificial than biological.

Re: How would you Design a Humanoid ?

Posted: May 13th, 2022, 5:01 pm
by UniversalAlien
AverageBozo wrote: May 13th, 2022, 12:42 pm
JackDaydream wrote: May 7th, 2022, 9:33 am
Scientists have been working on this for a long time in the movement of transhumanism, and some aspects may be science fiction while some aspects of artificial intelligence are developing already. It is likely that there are attempts to create superbeings, which merge aspects of the machine or the artificial with sentience. Nanotechnology is an aspect of the upgrading of the human being.

But the question may be where this is going with evolution. Is transformation an aspect of physicality or consciousness? There are also political agendas, and the upgrades may serve the powers of the elite. Perhaps humans as we know them, and other sentient beings, will become increasingly extinct in a world dominated by the humanoid superbeings as masters of planet Earth.

Personally, I am probably not very good at being a machine, which may be more of what is in demand, but I think that I would rather remain a human being with flaws than become a humanoid programmed beyond my own control. Of course, it would be great if the negative aspects of human nature could be eliminated, but it may not be that simple. Loss of sentience may mean loss of emotion, empathy and compassion.
Interesting post. BTW, Isaac Asimov raised the possibility of robots becoming pseudo-human in his short story The Bicentennial Man (written in the 1970s, I believe).

It seems that as human body parts are replaced by the growing number of artificial devices (e.g. artificial hips and knees, artificial heart valves and coronary arteries), humans may actually become more artificial than biological.
Already, today, and not just in sci-fi:

CYBORG {1960s: blend of cyber- and organism}: "a fictional or hypothetical person whose physical abilities are extended beyond normal human limitations by mechanical elements built into the body."

Still a mostly organic brain - the question then becomes whether a totally synthetic, non-organic entity possessing the qualities of sentience, including value judgments, feelings, emotions, and a human-like ego, can exist. That remains a matter of speculation :?: :idea: :?:

Re: How would you Design a Humanoid ?

Posted: May 13th, 2022, 10:07 pm
by Sy Borg
Pattern-chaser wrote: May 13th, 2022, 8:15 am
Pattern-chaser wrote: May 10th, 2022, 9:51 am Why do you express this as a binary choice? Either use emotions or use intellect? Why not both? At the same time, even? It's what humans have always done. Sometimes we use more emotions, and others, we make more use of intellect, but we do both, most of the time, in some (varying) ratio.
Sy Borg wrote: May 10th, 2022, 4:14 pm I am not thinking as you imagine. I just look at the world, millions and millions of absolute idiots making ridiculous emotional claims about topics they know nothing about, that a minute's reflection and study would correct. Have you ever checked out the "logic" of flat-Earthers or "pro-lifers"?

If you have a problem with shifting away from atavism towards intelligence, then you do not think the way I thought you did.
I don't know as much about this as I would like, given that I hope to answer your objections. 🙁 Nevertheless, I have read that AI folks seem to have an understanding that emotions give us purpose; they drive us to aim for things, to seek to achieve them. Here's a quote that seems to support such a view:
Traditional approaches to the study of cognition emphasize an information-processing view that has generally excluded emotion. In contrast, the recent emergence of cognitive neuroscience as an inspiration for understanding human cognition has highlighted its interaction with emotion.

...

Investigations into the neural systems underlying human behavior demonstrate that the mechanisms of emotion and cognition are intertwined from early perception to reasoning. These findings suggest that the classic division between the study of emotion and cognition may be unrealistic and that an understanding of human cognition requires the consideration of emotion.
Link to original article.

And here's another, taken from a review of the book "The Value of Emotions for Knowledge" (link):
If we could assign a foil for this volume, it would be the view that takes emotions to be opposed to rationality, a view on which emotions are generally distracting, fact-twisting, misleading, and unreliable, hindering rather than furthering our epistemic goals. Most of the papers in this volume challenge this picture in one way or another.
Sy Borg wrote: May 10th, 2022, 4:14 pm Ultimately, though, the only useful role emotions will have to AI is the human interface.
See above.

You describe any reliance on emotion as "atavism", which doesn't seem to square with our current understanding of such things. I have always been a little alarmed at the intelligencists (?) who champion their preferred characteristic with the dedication of religious fundamentalists, or sciencists. If the only mental attribute we needed was intelligence, then surely evolution would've allowed the other attributes, such as emotions, to fall by the wayside? Those unencumbered with time-wasting qualities like emotion, or even wisdom, would have succeeded beyond their peers, in the way of evolution. But it hasn't happened. Why is that, do we think?

I think it's because intelligence, untempered by our other mental attributes, is insufficient to our needs.


Sy Borg wrote: May 10th, 2022, 4:14 pm Life is replete with torture, which need not be deliberate.
Ah, OK. I had assumed that torture included intentionality, so this disagreement seems to be just a misunderstanding.
Agree, there have been some crossed wires. In context, when I talk about less emotion and more logic and rationality, I am not talking about suddenly ripping emotionality from essentially emotional animals. I am talking about the issue of people believing utterly implausible things, and then doubling down when corrected - due to emotions ruling the intellect. Humanity won't be mature until at least most adults eschew obvious superstitions and are not easily conned by obvious manipulations.

Re: How would you Design a Humanoid ?

Posted: May 14th, 2022, 12:34 pm
by Pattern-chaser
AverageBozo wrote: May 13th, 2022, 12:42 pm
JackDaydream wrote: May 7th, 2022, 9:33 am
Scientists have been working on this for a long time in the movement of transhumanism, and some aspects may be science fiction while some aspects of artificial intelligence are developing already. It is likely that there are attempts to create superbeings, which merge aspects of the machine or the artificial with sentience. Nanotechnology is an aspect of the upgrading of the human being.

But the question may be where this is going with evolution. Is transformation an aspect of physicality or consciousness? There are also political agendas, and the upgrades may serve the powers of the elite. Perhaps humans as we know them, and other sentient beings, will become increasingly extinct in a world dominated by the humanoid superbeings as masters of planet Earth.

Personally, I am probably not very good at being a machine, which may be more of what is in demand, but I think that I would rather remain a human being with flaws than become a humanoid programmed beyond my own control. Of course, it would be great if the negative aspects of human nature could be eliminated, but it may not be that simple. Loss of sentience may mean loss of emotion, empathy and compassion.
Interesting post. BTW, Isaac Asimov raised the possibility of robots becoming pseudo-human in his short story The Bicentennial Man (written in the 1970s, I believe).

It seems that as human body parts are replaced by the growing number of artificial devices (e.g. artificial hips and knees, artificial heart valves and coronary arteries), humans may actually become more artificial than biological.
That would be cyborgs, then. Part biological and part metal.

Re: How would you Design a Humanoid ?

Posted: May 14th, 2022, 1:53 pm
by Pattern-chaser
Sy Borg wrote: May 13th, 2022, 10:07 pm When I talk about less emotion and more logic and rationality, I am not talking about suddenly ripping emotionality from essentially emotional animals. I am talking about the issue of people believing utterly implausible things, and then doubling down when corrected - due to emotions ruling the intellect.
I'm not sure that believing nonsense is necessarily an emotional thing. I wouldn't be surprised to find emotions involved, but isn't there more to it than just blaming emotions? Are emotions alone sufficient to cause such behaviours, I wonder?

Re: How would you Design a Humanoid ?

Posted: May 14th, 2022, 9:42 pm
by Sy Borg
Pattern-chaser wrote: May 14th, 2022, 1:53 pm
Sy Borg wrote: May 13th, 2022, 10:07 pm When I talk about less emotion and more logic and rationality, I am not talking about suddenly ripping emotionality from essentially emotional animals. I am talking about the issue of people believing utterly implausible things, and then doubling down when corrected - due to emotions ruling the intellect.
I'm not sure that believing nonsense is necessarily an emotional thing. I wouldn't be surprised to find emotions involved, but isn't there more to it than just blaming emotions? Are emotions alone sufficient to cause such behaviours, I wonder?
Yup :)

Feigning belief in nonsense is strategic, but the strategy is based on fear. For example, a North Korean who "believes" that Kim can fly with his magic powers is doing so to avoid victimisation, possibly death.

Believing nonsense is definitely emotional. If not emotional, a person presented with a nonsense idea will either fact-check or be content with uncertainty.

Re: How would you Design a Humanoid ?

Posted: May 15th, 2022, 9:18 am
by Pattern-chaser
Sy Borg wrote: May 13th, 2022, 10:07 pm When I talk about less emotion and more logic and rationality, I am not talking about suddenly ripping emotionality from essentially emotional animals. I am talking about the issue of people believing utterly implausible things, and then doubling down when corrected - due to emotions ruling the intellect.
Pattern-chaser wrote: May 14th, 2022, 1:53 pm I'm not sure that believing nonsense is necessarily an emotional thing. I wouldn't be surprised to find emotions involved, but isn't there more to it than just blaming emotions? Are emotions alone sufficient to cause such behaviours, I wonder?
Sy Borg wrote: May 14th, 2022, 9:42 pm Yup :)

Feigning belief in nonsense is strategic, but the strategy is based on fear. For example, a North Korean who "believes" that Kim can fly with his magic powers is doing so to avoid victimisation, possibly death.

Believing nonsense is definitely emotional. If not emotional, a person presented with a nonsense idea will either fact-check or be content with uncertainty.
What worries me about your theme is that you present (only) two highly-general headings - "logic and rationality" and "emotion" - to cover a wide variety of things. Can these two really encapsulate and fully-contain this discussion? Aren't other, er, qualities involved too?

Re: How would you Design a Humanoid ?

Posted: May 15th, 2022, 9:06 pm
by Sy Borg
Pattern-chaser wrote: May 15th, 2022, 9:18 am
Sy Borg wrote: May 13th, 2022, 10:07 pm When I talk about less emotion and more logic and rationality, I am not talking about suddenly ripping emotionality from essentially emotional animals. I am talking about the issue of people believing utterly implausible things, and then doubling down when corrected - due to emotions ruling the intellect.
Pattern-chaser wrote: May 14th, 2022, 1:53 pm I'm not sure that believing nonsense is necessarily an emotional thing. I wouldn't be surprised to find emotions involved, but isn't there more to it than just blaming emotions? Are emotions alone sufficient to cause such behaviours, I wonder?
Sy Borg wrote: May 14th, 2022, 9:42 pm Yup :)

Feigning belief in nonsense is strategic, but the strategy is based on fear. For example, a North Korean who "believes" that Kim can fly with his magic powers is doing so to avoid victimisation, possibly death.

Believing nonsense is definitely emotional. If not emotional, a person presented with a nonsense idea will either fact-check or be content with uncertainty.
What worries me about your theme is that you present (only) two highly-general headings - "logic and rationality" and "emotion" - to cover a wide variety of things. Can these two really encapsulate and fully-contain this discussion? Aren't other, er, qualities involved too?
There is zero to worry about with my views. I think it's thoroughly reasonable to hope for a future where people make rational decisions based on actual facts rather than emotional decisions based on transparent conspiracy theories, e.g. Q and ancient myths.

Because I am one human on a forum with limited time, I'm just considering one - extremely important - angle.

So what other qualities do you suggest? I expect you would point to altruism. But what kind of altruism? To give people what they want or what the powers-that-be think they should have? Any other qualities?

Re: How would you Design a Humanoid ?

Posted: May 16th, 2022, 8:46 am
by Pattern-chaser
Pattern-chaser wrote: May 15th, 2022, 9:18 am What worries me about your theme is that you present (only) two highly-general headings - "logic and rationality" and "emotion" - to cover a wide variety of things. Can these two really encapsulate and fully-contain this discussion? Aren't other, er, qualities involved too?
Sy Borg wrote: May 15th, 2022, 9:06 pm I think it's thoroughly reasonable to hope for a future where people make rational decisions based on actual facts rather than emotional decisions based on transparent conspiracy theories, e.g. Q and ancient myths. ... So what other qualities do you suggest?
To be honest, I'm not sure. It just seems somewhat rash to describe the entire mental life of a person with only two criteria - logic/reasoning and emotions. Surely there's more to it than this simple binary decision?

Re: How would you Design a Humanoid ?

Posted: May 16th, 2022, 8:04 pm
by Sy Borg
Pattern-chaser wrote: May 16th, 2022, 8:46 am
Pattern-chaser wrote: May 15th, 2022, 9:18 am What worries me about your theme is that you present (only) two highly-general headings - "logic and rationality" and "emotion" - to cover a wide variety of things. Can these two really encapsulate and fully-contain this discussion? Aren't other, er, qualities involved too?
Sy Borg wrote: May 15th, 2022, 9:06 pm I think it's thoroughly reasonable to hope for a future where people make rational decisions based on actual facts rather than emotional decisions based on transparent conspiracy theories, e.g. Q and ancient myths. ... So what other qualities do you suggest?
To be honest, I'm not sure. It just seems somewhat rash to describe the entire mental life of a person with only two criteria - logic/reasoning and emotions. Surely there's more to it than this simple binary decision?
I don't care about other aspects; they are either not as problematic, or not an issue at all, just peccadilloes.

However, there are major problems caused by extreme emotionalism that trumps the intellect, breaking down people's capacity to think clearly, resulting in outlandish cognitive dissonance being normalised.

Once reason is abandoned, there can be only war - be it physical, political or social. When emotion conquers reason, there can be no discussion, no working through issues, only hostility and the destruction of one's enemies. I like to think that reflexive, mindless lunacy can be overcome.

Re: How would you Design a Humanoid ?

Posted: May 16th, 2022, 8:49 pm
by GrayArea
Sy Borg wrote: May 16th, 2022, 8:04 pm
Pattern-chaser wrote: May 16th, 2022, 8:46 am
Pattern-chaser wrote: May 15th, 2022, 9:18 am What worries me about your theme is that you present (only) two highly-general headings - "logic and rationality" and "emotion" - to cover a wide variety of things. Can these two really encapsulate and fully-contain this discussion? Aren't other, er, qualities involved too?
Sy Borg wrote: May 15th, 2022, 9:06 pm I think it's thoroughly reasonable to hope for a future where people make rational decisions based on actual facts rather than emotional decisions based on transparent conspiracy theories, e.g. Q and ancient myths. ... So what other qualities do you suggest?
To be honest, I'm not sure. It just seems somewhat rash to describe the entire mental life of a person with only two criteria - logic/reasoning and emotions. Surely there's more to it than this simple binary decision?
I don't care about other aspects; they are either not as problematic, or not an issue at all, just peccadilloes.

However, there are major problems caused by extreme emotionalism that trumps the intellect, breaking down people's capacity to think clearly, resulting in outlandish cognitive dissonance being normalised.

Once reason is abandoned, there can be only war - be it physical, political or social. When emotion conquers reason, there can be no discussion, no working through issues, only hostility and the destruction of one's enemies. I like to think that reflexive, mindless lunacy can be overcome.
This will probably be difficult to solve, though not impossible, because even the very idea that emotion brings negative consequences and that we should value reason more is itself based on reason and intellect, which does not resonate with people who deny them in the first place and cling to the polar opposite, emotion.

Perhaps it could be solved through external physical modification of the human brain, though that is also a debatable topic in itself.