Do they deserve basic human rights?

Status: Not open for further replies.
Why would a robot want to be human? Do robots argue if humans can have basic robot rights?
 
I believe that any creature that can perform duties has rights. Therefore my answer is yes.

On a similar note, since animals can't perform any civic duties they have no rights. We have the freedom to be kind to them, but it's at our discretion. Animal rights = BS.

They keep our ecosystem intact and provide sustenance. Sounds like civic duties to me.
 
I'm not saying that it'd be impossible to design a machine with the same responses to pain as us, but that we shouldn't.
Torture is the most extreme use of pain, and it's universally condemned. The intense psychological suffering resulting from prolonged pain isn't something we should design into AIs.

It may be necessary to imbue a robot with such responses if we want it to be a contributing worker. It may be that the only way to make useful robots in human society is to set them up with a pain/pleasure response to stimuli, have one prototype live and grow for about 10 years, and then copy the program into several other robots.

Some futurists have suggested that robots assigned to especially important tasks like emergency response, science, and leadership will have to attend human schools and live and play with human children.

This will raise a whole host of ethical problems, but the transformation will be so gradual and occur in such tiny logical steps that our culture will gradually learn to accept robots into schools, homes, and finally into leadership positions. Giving them civic rights, however, will probably be a prolonged battle not unlike the plight of African American slaves. After all, robots will be our slaves, and should they be as intelligent as humans, and especially if they grow up and learn with humans, there will be a great moral battle ahead.

I don't know, I just find AI to be very interesting, and robotics to be a field that has absolutely huge potential. This is why I am going to start studying computer science and minoring in robotics at Carnegie Mellon next year. Hopefully I'll be around to facilitate the great revolution in consumer robotics which will surely happen within the next 50 years. I predict simple houseworker robots with the brain capacity of dogs by about 2030, and robots with sufficient human intelligence by 2050. The robotics revolution will have many parallels to the PC revolution, but other than that, we cannot be certain of where it will go.
 
If we get to that point ethics will seriously have to be rethought. It is hard to say. For the moment I shall vote no, even though objectively I realise there would really be no mental difference between us (and I find that more important than it being made of metal and me blood).
 
Well, animals don't get human rights, so robots shouldn't either. If humans are smart enough to make that many good working robots that feel pain and all, then they'd be smart enough to come up with "robot rights" or something like that. Animals get animal rights, don't they? So robots could have something similar.

If they had human rights, that would mean they could get a job and own a house. Where would that leave the humans?
 
If something that complex were to be created, I think it would be like creating another intelligent species (something that can reason like us, apparently). Regardless of whether or not it can truly feel due to its lack of organic components, I think we should still respect it like any other being.


Wouldn't not respecting it be like this: let's say in the future we can preserve ourselves longer by implanting our brain into some sort of machine. Would we not respect something like that simply because it can't truly feel?
 
Do robots deserve human rights? No.

Do robots deserve rights? Yes.

If robots ever get as intelligent as or more intelligent than humans, I don't think they really deserve the same rights as humans (for the reasons Houndini stated). If we create robots, their purpose will be to serve a particular task to make human life/research/whatever easier; I don't think we'll start producing robots just for the purpose of them existing, our population is high enough as it is. They won't really need property or money or any of the things humans deem a necessity. However, they will need some rights to prevent people from exploiting them, so they can keep serving their task.
 
I guess the important thing is that they behave exactly like humans. That means they will respond the same way to getting punched in the face or sworn at. If you punch it, it will punch back, and maybe limp a bit afterwards or something. But we don't know if it will actually feel pain. Because the thing is, we don't even know what feelings are, and I don't think we ever will. I think they might be beyond our mental capacity to comprehend.

This machine would look gloomy and sound sad when it had a shit day, and let's say the pattern of signals in its gloomy brain is also roughly the same as that in a gloomy human's brain. But its brain is made of silicon. It's the pattern that the signals follow that produces the output to the body, telling it to look and act gloomy, and to the rest of the brain, telling it to think gloomy thoughts. But how will that pattern make it feel gloomy? I mean, what if instead of a silicon computer, it had a brain made of nanoscopic cogs and gears, like Babbage's Difference Engine (but smaller)? The same patterns would still be there and it would act just the same, but how can the movement of cogs produce an actual feeling?

@theotherguy: Yeah, COG can learn things and recognise stimuli as good or bad. It knows when its joints are bent badly, but that's not the same as feeling pain, is it? It's the same with writing a program that favours different users depending on what input they give it. You could program it to recognise certain behaviours and accordingly change a variable that corresponds to that user. Then it could give different outputs to that user, depending on what value the variable is, but in the end, all you're doing is changing a variable.
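The "all you're doing is changing a variable" point can be made concrete with a few lines of code. This is only an illustrative sketch, not anything from COG's actual software; all the names here are made up:

```python
# A program that "favours" users: each user's "feeling" toward
# them is just a number adjusted by their behaviour.
friendliness = {}  # per-user score; the variable being changed

def observe(user, behaviour):
    """Adjust the user's score for polite or rude input."""
    delta = {"polite": 1, "rude": -1}.get(behaviour, 0)
    friendliness[user] = friendliness.get(user, 0) + delta

def respond(user):
    """The output depends only on the stored variable."""
    score = friendliness.get(user, 0)
    if score > 0:
        return "Hello, friend!"
    elif score < 0:
        return "What do you want?"
    return "Hello."

observe("alice", "polite")
observe("bob", "rude")
print(respond("alice"))  # prints "Hello, friend!"
print(respond("bob"))    # prints "What do you want?"
```

The program's "preference" is nothing but the integer in `friendliness`, which is exactly the point being argued: the behaviour varies per user, but nothing is felt.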

Actually, I suppose with enough research we could isolate certain sections of the brain, maybe certain specialised neurons, which are responsible for the perception of emotion and the other senses. I suppose we'd then know where emotions come from, but I still don't think we'd actually understand how or why that works.
Sorry for going on so much.

EDIT: Oh yeah, the question. I voted Yes, because I'd rather assume that they really can feel things than risk causing massive amounts of suffering to a sentient creature. And some other vague reasons that I'm not sure how to put into words.
 
Do robots deserve human rights? No.

Do robots deserve rights? Yes.

If robots ever get as intelligent as or more intelligent than humans, I don't think they really deserve the same rights as humans (for the reasons Houndini stated). If we create robots, their purpose will be to serve a particular task to make human life/research/whatever easier; I don't think we'll start producing robots just for the purpose of them existing, our population is high enough as it is. They won't really need property or money or any of the things humans deem a necessity. However, they will need some rights to prevent people from exploiting them, so they can keep serving their task.

well said
 
I voted no, but it's not important whether they have feelings or not. What is relevant is whether they have power, or the potential to have power, over us. The history of human rights is punctuated with popular uprisings and revolts. Outside of small naturally ordered communities, nobody would initially be willing to grant rights to others unless there was good reason (fear of the masses).

As evidence I present the French revolution of 1789, the American revolution of 1775, the widespread European revolutions of 1848, the Bolshevik revolution of 1917, the Emancipation Proclamation, the Civil Rights Movement, and the Suffrage Movement.
 
Didn't read the pages of this topic.

AL, I voted YES. If we go so far as to make something like that, we might as well go the rest of the way. If it's a sentient machine, it gets the rights.
 
Even if those robots were truly sentient, there would still be major differences between them and humans. They would deserve rights, but not the same rights as humans. Definitely not the same rights as humans if they don't abide by some kind of properly functioning '3 laws' system that would keep them from going postal.
 
Oh yeah, let's just give them human rights 'cause we feel sorry for them - sarcasm.

Next thing you will see is robots voting, and then a robot president. LOTS of people are struggling to find work now; it will be nearly impossible with robots.
 
Oh yeah, let's just give them human rights 'cause we feel sorry for them - sarcasm.

Next thing you will see is robots voting, and then a robot president. LOTS of people are struggling to find work now; it will be nearly impossible with robots.

Yeah, because robot rights are a big current issue right now. Remember, this is all hypothetical.

The current state of robots:
[Image: KUKA KR16 industrial robot arm]
 
I guess the important thing is that they behave exactly like humans. That means they will respond the same way to getting punched in the face or sworn at. If you punch it, it will punch back, and maybe limp a bit afterwards or something. But we don't know if it will actually feel pain. Because the thing is, we don't even know what feelings are, and I don't think we ever will. I think they might be beyond our mental capacity to comprehend.

This machine would look gloomy and sound sad when it had a shit day, and let's say the pattern of signals in its gloomy brain is also roughly the same as that in a gloomy human's brain. But its brain is made of silicon. It's the pattern that the signals follow that produces the output to the body, telling it to look and act gloomy, and to the rest of the brain, telling it to think gloomy thoughts. But how will that pattern make it feel gloomy? I mean, what if instead of a silicon computer, it had a brain made of nanoscopic cogs and gears, like Babbage's Difference Engine (but smaller)? The same patterns would still be there and it would act just the same, but how can the movement of cogs produce an actual feeling?

@theotherguy: Yeah, COG can learn things and recognise stimuli as good or bad. It knows when its joints are bent badly, but that's not the same as feeling pain, is it? It's the same with writing a program that favours different users depending on what input they give it. You could program it to recognise certain behaviours and accordingly change a variable that corresponds to that user. Then it could give different outputs to that user, depending on what value the variable is, but in the end, all you're doing is changing a variable.

Actually, I suppose with enough research we could isolate certain sections of the brain, maybe certain specialised neurons, which are responsible for the perception of emotion and the other senses. I suppose we'd then know where emotions come from, but I still don't think we'd actually understand how or why that works.
Sorry for going on so much.

EDIT: Oh yeah, the question. I voted Yes, because I'd rather assume that they really can feel things than risk causing massive amounts of suffering to a sentient creature. And some other vague reasons that I'm not sure how to put into words.

Let's say I invented a silicon device which was the same size and had the exact same functions of a neuron. Let's say then, that we killed one of your brain cells, and put this silicon device in its place. Let's connect this device to all the other neurons that the living neuron used to be connected to, and turn it on.

Are you any less of a person, or even a different person from us performing this one procedure?

Let's continue the process, piece by piece, until we have killed, and replaced, every single neuron in your brain.

You would act exactly the same. You would have exactly the same thoughts, the same emotions, the same feelings, insights, reason, personality. You would be indistinguishable from any human being, the only difference would be that your brain was made of silicon rather than protein.

It's not the medium which matters, it is the messages, the code, the pattern that occurs in that medium. We are all machines. We all have genetic "code" which defines the way we act and respond to stimuli. The only difference between a human brain and a rat brain and a robot brain are the number of neurons, and their specific pattern and location. There isn't some kind of magical substance, no "lifeblood" which gives us feelings and thoughts, it is simply neurons firing and glands releasing hormones.

If you say a robot cannot truly "feel" pain, you must also concede that a human cannot "feel" pain. After all, we are merely responding to stimuli designated by our evolutionary history as "bad". We are merely acting on our memories and this innate sense to produce actions which are themselves evolutionarily "hardwired". We are merely adding up variables, measuring chemical levels, interpreting bits of data from our senses. I see absolutely no distinction between the way COG thinks and feels, and the way a human, with only 128 neurons, would think and feel. I also see no reason why a robot with trillions of neurons could not be considered "really" human, in the same way that I would not consider someone with an artificial, prosthetic brain not "really" human.
 
ITT arguing about something that doesn't exist.

...

So that God guy is pretty cool.
 
The medium is the message

Is a recording on a record any qualitatively different from the same recording burned to a CD? Would you be able to identify the song in both cases? What if you were merely given a set of headphones and could listen only to the music? Does it make any difference then whether the music was recorded on vinyl or plastic? Or is it the melodies which count?

I say, it makes no difference whether neurons are firing in electrical patterns in an organic brain or an inorganic one, the effects, the code-- the melody is the same in all cases, and aside from a few pops and scratches is indistinguishable in any medium.
 
ITT arguing about something that doesn't exist.

...

So that God guy is pretty cool.
You've gotta wonder if God thinks we're entitled to basic human rights. I've got a book tucked away somewhere that would probably lean towards "no".
 
Freedom is the right of all sentient beings.

Optimus Prime taught me that, and he's completely inorganic. They deserve rights.
 
Freedom is the right of all sentient beings.

Optimus Prime taught me that, and he's completely inorganic. They deserve rights.

What they deserve is Hellfire missiles and a "GTFO of our planet".



It's our planet goddamit! Stop crowding it! We humans are fine by ourselves, thankyouverymuch. :p
 
Something which is just simulating life is not alive itself. No matter how good the simulation seems. It will simulate emotion, but it doesn't have emotions. Why would something not 'living' need the right to life?
 
Something which is just simulating life is not alive itself. No matter how good the simulation seems. It will simulate emotion, but it doesn't have emotions. Why would something not 'living' need the right to life?

Exactly so, but it could be said that we are all organic machines that simulate life. Of course, being organisms, that is natural and is our purpose.
 
I don't think "simulated sentience" is possible, something is either sentient or is not sentient. The machine looks like it passes any sentience test we care to throw at it, so yes it deserves the rights inherent to any sentient being - we call them human rights only because we know of nothing else that is capable of sentience.
Indeed any artificially created sentience would probably be a more 'pure' form of sentience separated from biological needs, influences or moods.
 
ITT arguing about something that doesn't exist.

It's an ethics discussion. Like, the one with people tied to a railway and a lever. It's not happening, but it's interesting to talk about the morality.

-Angry Lawyer
 
Equivalent rights to those humans have.

It amazes me how people take everything literally and don't consider the prospect of an equal alternative. Yes the topic says human rights but imo they'd only be used as a template for robot rights.
 
I sincerely hope we can avoid ever creating something like that. :/
 
Is a recording on a record any qualitatively different from the same recording burned to a CD? Would you be able to identify the song in both cases? What if you were merely given a set of headphones and could listen only to the music? Does it make any difference then whether the music was recorded on vinyl or plastic? Or is it the melodies which count?

I say, it makes no difference whether neurons are firing in electrical patterns in an organic brain or an inorganic one, the effects, the code-- the melody is the same in all cases, and aside from a few pops and scratches is indistinguishable in any medium.
Dan is right. What are you proposing? A soul?

That doesn't mean I believe there is a really fundamental difference of type between a human and a synthetic mind - or at least, a difference that counts enough to me to vote 'no'.

It doesn't actually matter whether it (not they; there is only one) needs all those rights, like marriage. It can use them if it wants. An asexual person might conceivably feel he doesn't need marriage, but he can do it if he likes.

Human rights are for humans. These things would have their own rights, but probably fewer of them. I still want the right to turn them off, in case they go tits-up Matrix-style.
That's included under 'basic human rights', since the machine could and would be legally charged with attempted murder or suchlike.
 
That's what I said: they shouldn't have human rights, they should have robot rights. Why would humans invent robots just so they can live, buy their own house, and one day run for president?

There's also a big difference: humans learn from what they see, while robots need to be programmed to learn what each object is.

They are in no way like humans. Even if they can feel pain - animals feel pain too, and they get animal rights.
 