If Robots could feel/have emotions

68 replies to this topic

#31 RetraRoyale

RetraRoyale

    Doyen(ne)

  • Members

Posted 02 July 2013 - 01:57 PM

Er... long post warning.
Further, you make the implication that I argue that everything that is authentic is worthy of (legal) protection. I do not. I do believe that authentic evil has a right to exist.
I made that implication because the word "authentic" doesn't make that kind of distinction in the language I know, and since that distinction is important to the discussion, I don't think "authenticity" is an appropriate word to use in this situation. I know what you mean, but that's exactly what being tacit is. That's what non-constructive language looks like, and it doesn't help us solve interesting problems. I'm only pressing for better, more articulate words.

 

Take the sentence "I do believe that authentic evil has a right to exist." I would interpret this as "It is a useful (problem-solving) approximation to the truth to assert that evil intent is useful." There are multiple ways in which this is true, but it's easy to dismiss the first statement on the grounds that you can't do anything about it. You can't make a law saying "authentic evil is illegal" because you can't say what authentic evil is, or how it differs from regular evil. The word "evil" is the problem here. It is too tacit to solve specific problems, and thus you can't use it to demonstrate value in making good decisions. Someone who speaks of "evil" will never know if they are doing the right thing, and they can't effectively convince anyone else that they are. That's a problem.

 

"Human dignity shall be inviolable"

 

The US Declaration of Independence has a similar statement, though it is not an actual legal document:

 

"We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness."

 

They both reference a concept of self-evident, inviolable innate human rights. The problem is that they aren't actually self-evident, they are just common to all human cultures. This is because human cultures are all trying to solve similar sets of problems, so we have developed a "standard" set of tools to do so. Kant's terminology is different and his ideas are more fleshed out, but they are still not the best we can do. We assert a need for people to have privacy and freedom of thought and action because it increases the variety of ideas in your culture and allows your society to solve more problems and be more flexible with its tools. There's nothing really innate about that, but it just doesn't carry the same rhetorical impact. It's also less tacit. We treat people well for a reason. It's also why we don't give the same rights to insects, bacteria, or objects -- because it wouldn't help us solve our problems. If it did, we'd want to give them legal protections as well.

 

 

As an aside, I don't like the idea of "truth" being thrown about as though it has some absolute meaning. There might be this idea of "truth" that underlies how the universe works -- you know, "the really real reality" that "exists" independent of any human mind -- but we'll never have access to it, and our language can't do much but be supremely general about such things. Instead, I would think of truth as a token of relative abstraction. For example:

 

Q: is A the same as B?

 

A: Yes, they are both letters.

A: No, they are different letters.

 

What do we mean by "same"? It's better, given that we don't "really really" know what "same" means here, to assert that both statements are true. That is, they are both approximately true under this loose understanding of sameness. With a more accurate definition, one or the other might not be true anymore. The ability to articulate a definition of sameness that allows one to differentiate between two different truths is what "knowledge" is all about. For example, if I write "1+2=4" down, and you can tell me what it is, specifically, that makes that statement false, then I know you understand the concepts of numbers, addition, and equality of expressions.
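The point that both answers can be "true" under different definitions of sameness can be made precise by pinning down which relation the word "same" is naming. A minimal sketch (the function names here are my own, purely for illustration):

```python
# Two different relations that the word "same" could name.

def same_loose(a: str, b: str) -> bool:
    """Same category: both symbols are alphabetic letters."""
    return a.isalpha() and b.isalpha()

def same_strict(a: str, b: str) -> bool:
    """Same identity: the very same character."""
    return a == b

# "Yes, they are both letters."
print(same_loose("A", "B"))   # True
# "No, they are different letters."
print(same_strict("A", "B"))  # False
```

Both answers are correct relative to the relation they use; the "more accurate definition" in the paragraph above corresponds to replacing the loose relation with the strict one, under which the first answer stops being true.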

 

So it is not the fantasy of an electron that makes it meaningful, per se, but the ability for real people to act that makes the electron meaningful. Or rather, it lets physicists say "electrons are physical things." That is true in our best approximation.


Edited by RetraRoyale, 02 July 2013 - 01:59 PM.


#32 Sheik

Sheik

    Deified

  • Members

Posted 02 July 2013 - 02:52 PM

I notice how we think within different attitudes of mind. For you, what is useful is functional for the society. For me, what is useful is useful for the subject. The interest of the subject is of greater importance to me because the subject is the smallest unit/element of society. Also because, when in doubt, the subject will act towards its own benefit (as in, towards its own wishes, within the strategies that this subject has built to cope with the fact of its having wishes) and not towards the benefit of the society. And it is in that sense that I see evil intent as justified for the subject, because it can be a functional way of coping. For the society this can well be a problem. Society exists to the greater benefit of protecting its "units" (subjects, individual persons). Now, if it is within the interest of society (if the subjects arrive at a common conclusion that it is of interest to all of them), I see it justified to put legal restrictions on evil intent.

So when I say that the idea of an electron is a meaningful fantasy, what I mean does cover what you said. But I also said it to imply that any fantasy is a meaningful fantasy for the subject that has created this fantasy.
 

Anyways, I am not interested in what you call an approximation to the truth. In fact, I do not even believe that there is a truth that holds true for more than one subject at a time. Each subject creates truth itself. These individual fantasies of truth can overlap, and they do so significantly, but they are not the same. What I am interested in is peace. And peace is always working towards solving conflicts in a way that offers the greatest benefit to every single subject. And here is where I find the Kantian concept of human dignity to be most applicable. This is within my fantasy of the reality, within my "truth" or "faith", and I am well aware that it is not what everybody believes. However, I happen to think it is justified to assume that the concept of human dignity is a worthwhile tool towards the goal of peace through postulation of inviolable human rights.

By the way, excuse my suboptimal word choice or occasionally choppy wording/grammar. I am not a native speaker of the English language, but I try my best to put my thinking into coherent sentences. Also, this is one of the best debates I have had on PZC in a while. You have a very interesting way of thinking, quite elaborate.
 


Edited by Sheik, 02 July 2013 - 02:55 PM.


#33 RetraRoyale

RetraRoyale

    Doyen(ne)

  • Members

Posted 02 July 2013 - 03:27 PM

Your English is excellent, and my problems with your word choice are not about whether you are a native speaker or not, but about the fact that we have different influences. We are saying similar things with different vocabularies, and it might be the case that you are better at expressing one set of concepts, while I'm better at another. When I use the terms "tools" and "problem solving", I'm using them because they are simple and technical. You can translate these terms into something more expressive without losing the content, but it is easy to get confused when doing this, so I choose not to. I use these words because, in my mind, they don't imply much else.

 

I have a problem with your attempt to differentiate what "useful" means to either of us. First, the individual is not the smallest unit of a society. It's the smallest independently thinking unit, but there are smaller parts. (Organs for example.) The health and state of being of parts of persons are important to society. Just look at how social systems respond to sexual images or deformities. In this sense, the individual mind is only an interface between different levels of social structure.

 

More importantly, a human being cannot exist independent of culture. There is a concept called the "social self" which asserts that the human mind can functionally reproduce a "nondescript other" for social purposes. I can have a conversation with myself, or surprise myself, or reassess my own behavior. You don't need a society to do this, because your own brain fulfills the functions of an internalized social setting. You have a social system with expectations, judgements, and decisions in your own head. No healthily socialized person is ever socially alone.

 

If you want to say society exists to serve the individual, I think you might be thinking too narrowly about what a decision-making system is. A society cares for individuals in the same way that you care for your organs: you need them to survive. However, the mechanism is different. You care for yourself because (A) you have instincts to do so, and (B) because your social self can't survive without it. Social systems are decision-making systems, and there is no known way to isolate a decision-making system from its components or environment. Your mind (the decision-making system), brain (the components), and experiences (the environment) are all inseparably linked together.

 

Now, a fantasy may be useful to an individual and not a society. This is where the idea of "relative truths" or, in my terminology, "approximations to truth" comes into play. The ideal thing for a person to do will be beneficial to both the society and the individual. There is no reason to assume only one can benefit from the actions of an individual, so there is no reason for a society to protect actions that don't benefit both. Your laws should say "You can only help yourself if it also benefits society." Or more accurately, "You can only help yourself if you aren't significantly hurting anyone else to do so." The significance is determined by the social system, not the individual.

 

Peace preserves the things that society needs to make good decisions: individual variety & autonomy. The relationship between an individual and a society is really symbiotic. I think it's a mistake to elevate either to a position of continual priority. We need better ways of talking about the system as a whole, including all of its intricacies. Regular old common language fails to do this, so we need good philosophers and wise law-makers.


Edited by RetraRoyale, 02 July 2013 - 03:30 PM.


#34 Mero

Mero

    Touch Fluffy Tail

  • Banned
  • Real Name:Tamamo No Mae
  • Location:Rainbow Factory

Posted 03 July 2013 - 05:36 AM

They CAN have a sense of touch...

stay up to date on technology, guys, by watching VSauce and subscribing to Science Daily. :D



#35 Sheik

Sheik

    Deified

  • Members

Posted 03 July 2013 - 02:54 PM

Retra, I have read your post and I am sorry I haven't found time to respond to it yet. Basically, this week I am in university about 10 hours on average each day (save for Friday, which I have already filled with house-duties), and I barely find time to check my mails. Anyways, I agree with much of what you say, but I disagree that the relationship of individual and society is a symbiotic one. On the contrary, I would argue that it is forcefully a conflicted one. Let me provide you with something I will quote from the very worthwhile article "Critical psychoanalytic social psychology in the German speaking countries" by Markus Brunner (Sigmund-Freud-University), Nicole Burgermeister (Leibniz University), Jan Lohl (Sigmund-Freud-Institut), Marc Schwietring (Georg-August-University of Göttingen) and Sebastian Winter (University of Bielefeld). It portrays very well one of my most important influences on the thinking I have presented in the debate thus far:

 

 

Culture depends on human drives but needs to refuse their immediate satisfaction at the same time: In order to control outer nature, culture must subject individuals to compulsory labor and to rationality, it must annihilate desires and ‘channel’ or sublimate drive impulses. Furthermore, pacifying the ‘cultural community’ internally requires that norms and ideals of communal life are internalized and that people identify themselves with the community. Internalizing the constraints and ideals that constitute culture in the form of the super-ego, which gains its strength from tabooed aggressive and culturally inimical tendencies, produces permanent feelings of guilt – “Civilization and its Discontents” (Freud, 1930) –, which grows with the progress of culture but promotes progress at the same time.

 

 


Edited by Sheik, 03 July 2013 - 02:55 PM.


#36 LTTP

LTTP

    LTTP

  • Members
  • Real Name:LTTP
  • Location:A Place Far Far Away

Posted 03 July 2013 - 03:25 PM

too much text QQ



#37 RetraRoyale

RetraRoyale

    Doyen(ne)

  • Members

Posted 03 July 2013 - 06:40 PM

Well, Sheik, now I'm confused about why you have a disagreement at all.

 

"In order to control outer nature, culture must subject individuals to compulsory labor
and to rationality, it must annihilate desires and ‘channel’ or sublimate drive impulses."

 

Society, as a decision-making system, controls and enforces human behavior. You might call this "forcefully conflicted", but I don't see why. Society does this for its own benefit, not the individual's. This might sound exploitative, but... I have another point to address here.

 

Anyway, consider the reverse situation. An individual will also seek out a social system that provides it a feeling of "welcome-ness" -- a home, or a place where they can be happy or content. Every person does this, whether they want acceptance into a broad culture, or just their own internalized one. Individuals will build societies because they need them, and the exploitation of individuals by society is not inherently forceful or conflicted. But then, this is hard to show, because there is such a huge variety of social systems operating at a variety of scopes and sizes.

 

As for the point I skipped: we are trying to address the issue of machine rights. This is an issue of ethics -- the study of decision-making. 

 

Now, when I claim that "a machine should have basic rights," one response you'll hear is "but it's just a machine." This is a tiresome and irksome idea. Imagine I claimed that "we should turn left at the intersection" and someone responded "but the sky is still blue." The response tacitly asserts that the sky being blue has something to do with which direction to turn. I'm not going to grant that assumption, because it's nonsense. Or rather, it's inarticulate and uninformed.

 

It doesn't matter if what you say is true if it doesn't matter. This is a really important principle of linguistics: relevance always supersedes truth. When you have to make a decision, the ideal solution must both be a solution and be ideal. It doesn't matter if we are talking about a machine. All that matters is "can it make good decisions?"

 

This is the same reason we have equal-opportunity and non-discrimination laws. Human achievement is almost entirely the result of the mastery of language. A person's skin color or gender or ancestry is irrelevant next to their ability to make good decisions. That's what society is trying to create. Should we discriminate against machines? No, they should be valued according to their ability to make decisions.

 

These ideas have their roots in intuitionistic logic and constructive mathematics. We construct, through incremental linguistic transformations, relevant and 'true' ideas that can be used to solve real problems. Everyone does this, on all scales, in every culture, all the time. That is what progress is, and if you can make a machine that can do it, then you've really struck gold. You'd have to be a complete buffoon to throw that kind of development in the trash.

 

(Pardon the lecture.)


Edited by RetraRoyale, 03 July 2013 - 06:42 PM.


#38 Sheik

Sheik

    Deified

  • Members

Posted 03 July 2013 - 11:22 PM

Well, Sheik, now I'm confused about why you have a disagreement at all.

 

"In order to control outer nature, culture must subject individuals to compulsory labor
and to rationality, it must annihilate desires and ‘channel’ or sublimate drive impulses."

 

Society, as a decision-making system, controls and enforces human behavior. You might call this "forcefully conflicted", but I don't see why. Society does this for its own benefit, not the individual's. This might sound exploitative, but... I have another point to address here.

 

Okay, so the idea I am trying to portray is that every individual forcefully has a conflict with society. Now, for this sentence to even be understood, you will have to follow me into the language and conceptualization of psychoanalysis, Freudian school (for now). When a psychoanalyst says that a drive of a subject emerges, what s/he really means is that the subject's Id is pressing the subject's Ego towards some sort of release of drive energy (libido). The Id operates on the pleasure principle - it does not take into account how adequate or realistic its drives are, it simply pushes for them. The Ego of the subject now has to channel this drive energy into an action, applying the reality principle. Ideally, the Ego will carry out the wish (the drive) of the Id. However, this is almost never possible. Society (through socialization) has instituted the Super Ego instance in the subject's psyche. Internalized morality and norms are what the Super Ego is interested in (and by extension, what society is interested in in its members). The Ego will have to take the moral push of the Super Ego into account when it tries to release the Id's libido. And this situation is forcefully conflicted. The Super Ego will not tolerate it if the Ego acts on almost any drive without any consideration of the normative context. What we have now is two different interests (the drive/wish of the Id and the morality of the Super Ego, by extension of society) that press on the Ego from two opposing directions. This is what psychoanalysts call a conflict. And in that sense, the relation between individual (subject, self) and society (more accurately, its norms and morals) is necessarily conflictual.

I agree with this other statement you have made:
 

An individual will also seek out a social system that provides it a feeling of "welcome-ness" -- a home, or a place where they can be happy or content. Every person does this, whether they want acceptance into a broad culture, or just their own internalized one.

This is part of what the post-Freudian psychoanalysts described as need for relatedness, a very basic motivation in human life.

Anyways, within the Freudian conceptualization, subject and society are forcefully in conflict due to the workings I have sketched out above. The conflict need not, of course, be overt or conscious, as the psychoanalyst's terminology would call it (in fact, it will be unconscious most of the time), but it is very much a psychological reality of the subject, and thus something that exists.



On the matter of including robots/machines in morality, what puts me off is that I am factually incapable of feeling any kind of empathy towards robots. Their non-organic nature seems to have this kind of effect. And I would probably make it a postulate of my morality to be (at least theoretically) able to feel empathy towards an object worthy of my moral consideration and, by extension of this idea (within the Kantian figure of thinking), worthy of legal protection. I don't happen to be all that interested in the functionality which you constantly point to.

Now I am not sure whether the "soul robot" would potentially evoke empathy within me. Maybe if it was given a face with human features on top of its duplication of human soul life. In that case I might consider them worthy of legal protection. (Because this is what our brain needs to trigger empathy in the first place.)



 


Edited by Sheik, 03 July 2013 - 11:31 PM.


#39 RetraRoyale

RetraRoyale

    Doyen(ne)

  • Members

Posted 04 July 2013 - 02:04 AM

On the matter of including robots/machines in morality, what pushes me off it that I am factually not possible to feel any kind of empathy towards robots. Their non-organic nature seems to have this kind of effect.

 

How can you say it is a fact that you can't feel empathy with a robot when you haven't met a robot with human-level intelligence? I'm sure you can find examples of children attributing agency to and exercising empathy for machines, especially if they mimic human behavior well. A robot that acts like a person will act organically. It will have an organic nature, because that's the nature it replicates. You should have no trouble empathizing with a suitably powerful machine.

 

Besides, we're not talking about mere robots here, but ones specifically capable of interacting successfully in social roles. You will be able to successfully interact in a social manner with it because that's what we are assuming in order to have this discussion.

 

Also, I don't have very much respect for Freudian psychoanalysis, so it's not a subject I'm going to discuss. I see no reason to believe the brain is so specifically compartmentalized. Talking about the wishes of the Id or the desires of the Super Ego is an unacceptable level of personification of what should be very abstract concepts.

 

Now, if you are trying to say that a person's base desires are in conflict with societal demands, then you are still thinking narrowly. Say, for example, that I want to punch someone in the face. Well, that's illegal. This is a good example of a conflict? Unfortunately for that notion, society is actually an important decision-making system, as I've been indicating. Society effectively demands a resolution to the conflict. Either I don't punch anyone in the face, or I figure out how to do it legally. Thus, I might create a virtual reality (if I'm clever enough to articulate how to construct one), and punch someone in the face irrespective of the law.

 

That doesn't seem like an individual vs society conflict to me, but rather an applied pressure to be good at making decisions. You either do what society demands or you do something better. In a highly developed society, you need to have a certain minimum amount of value to be accepted, so you need to be educated and integrated in order to be included. It's just not much of a conflict because of the individual need for inclusion. Society is using you to solve its problems, so it has standards for what kind of behavior is allowed -- namely, the type of behaviors that lead to good problem-solving.

 

It's not unreasonable to see it as a conflict, but I don't think it's ideal. "Conflict" tends to invoke an image of resolution by being victorious over your opponent, where in actuality, the conflict is resolved by a mutual desire for technological (be it social, physical, or linguistic) advancement. Another failure of the conflict idea is that it doesn't work well with the possibility of 'intermediate' systems, say, an organization that interfaces between an individual and their society. Like a robot, for instance.



#40 Sheik

Sheik

    Deified

  • Members

Posted 04 July 2013 - 12:34 PM

Also, I don't have very much respect for Freudian psychoanalysis, so it's not a subject I'm going to discuss. I see no reason to believe the brain is so specifically compartmentalized. Talking about the wishes of the Id or the desires of the Super Ego is an unacceptable level of personification of what should be very abstract concepts.

Have you read Sigmund Freud's writings? I really suggest that you read at least his introductory lectures on psychoanalysis. You can't really understand Freud well if you only read the summaries of his work because the summaries leave out all the thoughts Freud made about why and how he developed his concepts. And if you are willing to follow his rhetoric you will find his work a very enlightening read.
 

Now, if you are trying to say that a person's base desires are in conflict with societal demands, then you are still thinking narrowly. Say, for example, that I want to punch someone in the face. Well, that's illegal. This is a good example of a conflict?

It is not the best example, actually. To punch somebody in the face wouldn't really be the drive; it's the act which the drive is channeled into. The drive might in that case be more something like the expression of aggression towards a bad (potentially internalized) object which is projected onto the person you want to punch in the face. It might actually be an actualization of something like a negative affect towards a parent, something which society cannot accept. (This is hard to put into a good example. Psychoanalysis has a very idiosyncratic focus, and making any generalized examples does not do it justice. We would have to learn about your person and about your past, and you would have to generate material, to make out what affect the wish to punch somebody in the face (and who that somebody is) is acting on.)
I am sorry that this is so hard to access. I am afraid you need some quite elaborate insight into the psychoanalytic attitude of mind to make sense of it. :/

Edit: By the way, you might have noticed that I am kind of portraying society as the antagonist of the individual. This is right to some degree, but I by no means disagree with you that society and individuals are also allies. It is sort of a dialectical relationship. But I argue that the underlying theme is necessarily conflicted, due to the interference between the nature of the drives of the subject and the norms of society.
Also, you are portraying society a lot in terms of laws and legal issues. What I mean when I say "society" does also (and prominently) include these things, but I also refer to society as the subjects living in interpersonal relationships with other subjects.


Edited by Sheik, 04 July 2013 - 03:37 PM.


#41 Shane

Shane

    🩶

  • Moderators
  • Location:South Australia

Posted 04 July 2013 - 12:43 PM

If robots had feelings, they would take over the world.



#42 Nathaniel

Nathaniel

    Deified

  • Members

Posted 04 July 2013 - 01:08 PM

You think you would get something like the Cylons from Battlestar Galactica?



#43 )-( Marchland Malady )-(

)-( Marchland Malady )-(

    )-( Still No Tombs In The Sims 4 )-(

  • Members
  • Pronouns:He / Him
  • Location:United States

Posted 04 July 2013 - 02:49 PM

I found a few YouTube videos that are relevant to this subject. They present the dangers of robotics going too far.

http://www.youtube.c...e&v=VTxQsm2JeLg

http://www.youtube.c...e&v=2wmH7Snl_n4

http://www.youtube.c...e&v=DF39Ygp53mQ

http://www.youtube.c...WmqbJa99hi34j1A

The subject of robots being fused with (and even becoming like) animals and humans was covered by a certain game by Nintendo named Mother 3. If robots can become like (and even replace) animals and humans, then life has no value and no sanctity whatsoever.


Edited by Nolornbon, 04 July 2013 - 03:05 PM.


#44 RetraRoyale

RetraRoyale

    Doyen(ne)

  • Members

Posted 04 July 2013 - 05:12 PM

So when a living thing creates an object that makes it redundant, it destroys its own value? You can already make babies. Life has value because life is doing the evaluating. The only way to diminish the value of life is to stop the evaluating process -- to kill it.

 

Besides, you don't know whether emotional machines already exist somewhere in the universe or not. Either way, it has nothing to do with how valuable your life is.

 

(These discussions get real stupid when everybody gets cynical and starts assuming human beings would rather destroy the world than make something better than themselves. It's completely childish and unrealistic.)


Edited by RetraRoyale, 04 July 2013 - 05:25 PM.


#45 coolgamer012345

coolgamer012345

    🔸

  • Members
  • Location:Indiana, USA

Posted 26 July 2013 - 07:00 PM

All I have to say is

 

too much text

 

I think future posts need to not have so much text in them... at least before the feeling robots take over the world.


Edited by coolgamer012345, 26 July 2013 - 07:03 PM.


