If Robots could feel/have emotions


68 replies to this topic

#1 LTTP

LTTP

    LTTP

  • Members
  • Real Name:LTTP
  • Location:A Place Far Far Away

Posted 30 June 2013 - 09:07 PM

If robots were made to be just like humans, had feelings (happy, sad, etc.), could feel sensations (pain, hot, cold), and could think like humans...

Do you think they should have rights like us?
(Ex: can't use them as slaves, can't kill them, can't rape them, etc... pretty much the same laws as humans)

Why or why not?

You could say "Oh, it's only artificial feelings and emotions..." But is it really different from us?


My answer would be yes, anything with emotions, feelings, and the ability to do both good and bad should have rights.

 


This came to mind when one of my cousins (who works on robot-like things, though nothing near this level) was working on a project for university. I mentioned, "You should make a robot that sets itself on fire and make it feel pain," and he said, "Wouldn't that be cruel?"

 



#2 strike

strike

    life is fragile, temporary, and precious

  • Members
  • Real Name:Olórin

Posted 30 June 2013 - 09:14 PM

You say the ability to do good and bad things, but that is not actually the defining factor in whether or not something should have rights: it's the ability to distinguish between the two. Most carnivores can kill and eat humans, which most reasonable people consider wrong, and they can feel complex emotions, but they don't have rights because they can't tell the difference between right and wrong. I mean, animals are really just extremely complicated machines when you think about it. A spiritual soul is the only defining factor distinguishing a human from an animal.

-Strike

#3 JetBox

JetBox

    Wizard

  • Members

Posted 30 June 2013 - 09:26 PM

I think no, and yes. No, because they don't have souls, and are only programmed to know that something is bad. They need to know what's wrong or good not from a computer chip, but from a standpoint of passion and feeling.

 

Yes, because I don't want the robots to make Terminator a real-life thing. :nerd:



#4 LTTP

LTTP

    LTTP

  • Members
  • Real Name:LTTP
  • Location:A Place Far Far Away

Posted 30 June 2013 - 11:07 PM

Maybe this can differ from a religious standpoint as well (souls and such).



#5 Russ

Russ

    Caelan, the Encouraging

  • Administrators
  • Location:Washington

Posted 30 June 2013 - 11:39 PM

Religious guy here, I think humans have souls. That said, I think robots should have all the same rights as humans, assuming they meet all those conditions you listed. Souls or no souls, they're still intelligent enough that they deserve to be treated equally.
  • Ventus likes this

#6 Hergiswi

Hergiswi

    don't look for me, i'm just a story you've been told

  • Members
  • Real Name:chris
  • Location:house

Posted 01 July 2013 - 12:00 AM

I want to say yes because things that are almost on the same level as us should probably be treated the same as us, but my heart is telling me no. We're all humans, and look at how we all treat each other. It would really be insulting to robots.

 

Plus the idea of that terrifies me and I'm not a strong enough man to overcome that fear in order to give liberty to robots.


  • Rambly likes this

#7 JetBox

JetBox

    Wizard

  • Members

Posted 01 July 2013 - 01:03 AM

Sorry for getting religious there. Also, they might be able to be treated equally, but who would want a robot like that anyway?



#8 strike

strike

    life is fragile, temporary, and precious

  • Members
  • Real Name:Olórin

Posted 01 July 2013 - 07:30 AM

Completely agree with everything Russ just said.

-Strike

#9 Koh

Koh

    Tamer Koh

  • Members
  • Real Name:Dominic
  • Location:Monsbaiya, Virginia

Posted 01 July 2013 - 08:49 AM

I swear we had a thread about this already, but it's old, so.......here we go again~  TAKE 2 BABY!

 

First of all... "Emotions" is spelled wrong in the title :P.  How did that happen anyway?  O and A are on opposite sides of the keyboard, and T is still quite a few keys away from A.....o.o.

 

Secondly, I feel that they should.  I'm the kind of person who also backs at least basic animal rights, such that animals aren't treated like shit.  The same applies to artificial life forms.  We shouldn't tread on their existence, but rather see where it leads them.  If you put someone down, they're GOING to revolt.  It's inevitable.  It may not happen right away, but it WILL happen.  Think about slaves.  They didn't revolt right away, but eventually they rose up and claimed their freedom, through violent and nonviolent means.  Instead of repeating history's mistakes, it'd be wise to learn from them, give robots general freedom right off the bat, and see how they fit into society.  Otherwise, you're going to have people getting killed during yet another revolution.


Edited by Koh, 01 July 2013 - 08:52 AM.


#10 RetraRoyale

RetraRoyale

    Doyen(ne)

  • Members

Posted 01 July 2013 - 01:02 PM

Robots don't have souls? How would anyone ever know that? What do you think makes them work?

 

Anyway, since social decision making and artificial systems are my professional specialty, I'd like to weigh in on this one.

 

First, "rights" are a kind of social fiction based entirely in rhetoric -- they are compact decision-making tools, but they are not built for solving these kinds of problems. I don't actually have a "right" not to be a slave. What I have is a society with laws telling people not to make me a slave, because we, as a social system, have agreed that it is the best decision. In other words, we don't have rights, we have laws. We have laws because we want to make good decisions.

 

(I could go on for days about what makes a decision "good" or not, but that is a deeper analysis than what is warranted here.)

 

Secondly, emotions are not all that important to the discussion, per se. You could always make a Pascal's Wager-esque assertion that everything has emotions, but that would be inefficient and tedious. Thus, being efficient and not tedious are more of a priority than respecting the emotional state of a thing. The reason human emotions are important is that they greatly influence our behavior and our ability to make good decisions. There is no need to (legally) respect emotions that don't lead to good behavior. (I.e., no law protects someone for hurting someone else "because they were angry.") Your emotions aren't a matter of consideration unless they are socially constructive.

 

Thirdly, there are some very complex types of large-scale decision-making that interface with emotions -- compassion and social fears and whatnot. The benefit of these things is that they are a powerful social learning mechanism, which is important because (in the deeper analysis I mentioned above), good decisions are a result of recursive tool-building, which necessitates a form of social learning. That's just an elaborate way of saying that human beings can learn how to make good decisions by both having and reacting to emotions. It may even be possible that a fully recursive decision-making system (with some scale restraints) can't exist without emotions, but that is highly debatable.

 

Lastly, to answer the original question: creating machines with emotions would be an astounding and remarkable breakthrough in technology. We could put these machines to use in a huge number of novel ways. But what would the laws be protecting? Laws don't protect you because you have emotions; they protect you because you have individuality, and individuality means you have the potential to solve novel social problems. So long as a computer can do the same, it should be protected just the same. However, if the computer can be recreated on a whim, why protect it differently from property?


Edited by RetraRoyale, 01 July 2013 - 01:03 PM.


#11 LTTP

LTTP

    LTTP

  • Members
  • Real Name:LTTP
  • Location:A Place Far Far Away

Posted 01 July 2013 - 01:08 PM

@Koh

No idea, I was tired and it was late when I posted this :P


@Russ
Perfect!!

@Retra
damn o.o



#12 Rambly

Rambly

    Hero of Time

  • Members

Posted 01 July 2013 - 01:25 PM

I say yes because I like both Bender and Nano and they're both robots.


  • SpacemanDan likes this

#13 Sheik

Sheik

    Deified

  • Members

Posted 01 July 2013 - 01:51 PM

 

No because they don't have souls, and are only programmed to know this is bad.

 

Religious guy here, I think humans have souls.

So, what is a soul? What more is there to a soul than what LTTP has listed? Is it not the integration of emotion, motivation, learning, cognition and sensation?

Plus, they are not forcefully programmed. Scientists have built robots that are programmed to essentially program themselves (within the capacity of their ability to compute certain algorithms to anticipate and react to stimuli). These were robots that behaved "like" a swarm of ants. Where ants are directed by pheromones, these robots were directed in their "behavior" by their common input and the reactions they were able to calculate from this input. And in fact their overt behavior was so much like that of a swarm of insects that you can by definition only say that these robots formed a swarm.
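(Editor's aside: the swarm idea described above can be illustrated with a toy simulation. This is a hedged sketch, not the actual system Sheik refers to -- all names and numbers here are made up. Each agent follows one local rule, depositing and following a shared "pheromone" field, yet collective trails emerge without any central program telling an individual what to do.)

```python
import random

GRID = 20  # toy torus-shaped world, GRID x GRID cells

def step(agents, pheromone, evaporation=0.95, deposit=1.0):
    """Advance every agent one move using only local pheromone information."""
    for a in agents:
        x, y = a
        # Look at the four neighbouring cells and favour the strongest trail.
        moves = [((x + dx) % GRID, (y + dy) % GRID)
                 for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]
        # The constant 0.1 keeps a little random exploration alive.
        weights = [pheromone.get(m, 0.0) + 0.1 for m in moves]
        a[0], a[1] = random.choices(moves, weights=weights)[0]
        # Deposit pheromone on the cell just entered.
        cell = (a[0], a[1])
        pheromone[cell] = pheromone.get(cell, 0.0) + deposit
    # Evaporation prevents old trails from dominating forever.
    for cell in list(pheromone):
        pheromone[cell] *= evaporation

# Agents start scattered at random; trails form from the shared field alone.
agents = [[random.randrange(GRID), random.randrange(GRID)] for _ in range(30)]
pheromone = {}
for _ in range(100):
    step(agents, pheromone)
```

No agent is told where to go; the swarm-like behavior falls out of the interaction between deposit and evaporation, which is the sense in which such robots "program themselves."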

Anyways. On the matter of whether they should have rights: not the same ones as humans*. I am going to argue that humans do have rights, and I am not going to apply a logic as cold as Retra's. Arguing from Immanuel Kant's idea of human dignity, I will say that humans have inherent rights that are essentially "born with them". If you want to approach this idea empirically, concepts such as the human capacity for empathy become relevant. Either way, I am going to admit to the restriction that human rights are a psychosocial situative construction, but I will argue that reality itself is a psychosocial situative construction, so if you disagree with me here it is probably because we disagree on a more basic level about what the nature of reality (and thus, by implication, the nature of everything we are capable of perceiving at all) actually is.

*Robot rights - yes. Whatever those might be can probably only be fully understood once this future kind of robot has been created (or has created itself).


Edited by Sheik, 01 July 2013 - 02:01 PM.


#14 Russ

Russ

    Caelan, the Encouraging

  • Administrators
  • Location:Washington

Posted 01 July 2013 - 01:56 PM

Well, that's another thing. Really, you can't prove that robots don't have souls, because you can't really point to a soul. It's like asking, "Okay, point to the mind in the human body." Well, your brain does the thinking, but it's not exactly the same thing as your mind, which is more intangible. You could even argue that a mind and a soul are the same thing, or roughly so, and that anything with a certain amount of intelligence has a soul. So that's why I think that having a soul shouldn't be the deciding factor for whether something has basic rights. If these robots have all those traits mentioned, they practically have a soul anyway, regardless of the technical definition of soul or anything like that.



#15 Sheik

Sheik

    Deified

  • Members

Posted 01 July 2013 - 02:08 PM

Yes, yes, indeed. But what is it that you, personally, would say makes a soul? Let's say the mind and the soul were not the same (and the brain is only the substrate of the mental; it is not the mental itself, so we can put that aside from our debate for now). What more is a soul than a mind?

If the soul were 'only' what a mind is, and if a human were capable of creating this within a robot (or if a robot were capable of creating this within a robot), then the human (or the robot) has provided both: the body and the soul. A truly divine act, is it not, from a religious point of view? Is that not blasphemy within a religious attitude of mind? (Why am I bringing this up? Because you say that you are religious and yet you are apparently so cool with the whole concept. I am merely a little irritated.)


Edited by Sheik, 01 July 2013 - 02:16 PM.


