Do they deserve basic human rights?

Basic human rights?
Total voters: 86
Status
Not open for further replies.

Angry Lawyer

As everything's gone sciency the last few days, here's an ethical situation:

Imagine someone creates a machine that (at least appears) to be as intelligent as a human. It thinks in much the same way as me or you, and you could have a conversation with it across the Internet and you'd never know it's a machine. It's able to come up with its own ideas, and create art. It has a personality. It says it has feelings. However, the machine is entirely made of inorganic components - it's just put together in such a way to simulate a human mind really accurately.

Does this machine deserve the same rights as you or me? And why?

-Angry Lawyer
 
No, because they will never create a robot that thinks like a human. They will give them a boundary; they'll be able to think for themselves but not make life-threatening decisions, like killing someone or committing a crime.
 
/facepalm.

Houdini, it doesn't matter what happens in reality when the question being asked is a hypothetical. Anyway, I said yes; we've all seen The Matrix.
 
Not all the same rights.
For one thing, we don't have to worry about it feeling pain (different from 'having feelings').
It doesn't need the right to marry. Right to asylum. That kind of thing.
 
Humans have tissue – robots don’t
Humans have organs – robots don’t
Humans feel pain – robots don’t
Humans feel emotion and all that jazz – robots don’t
You get the picture
 
Okay, if they deserve some rights, which ones?

-Angry Lawyer
For starters, they shouldn't be treated like dirt like in A.I. or I, Robot. Hasn't anyone seen those movies and what happened to those poor, soulless machines?
 
Don't be a moron, lol. They didn't treat them like dirt in I, Robot; they basically kissed their feet and said "you are god", and they took advantage of it.

By the way, Angry Lawyer, wouldn't it be awkward if they had emotions and feelings?
They can't reproduce, but they could have feelings for real humans. Imagine a robot dating a human, lol.
 
Don't be a moron, lol. They didn't treat them like dirt in I, Robot; they basically kissed their feet and said "you are god", and they took advantage of it.
You missed my point moron. Their advanced programming allowed them to get wise to slavery and the realization they are logically superior. We'd have a revolt on our hands if we didn't bend a little and give them some basic rights. Besides, tools or not, it's just wrong not to imo.

Imagine someone creates a machine that (at least appears) to be as intelligent as a human... Does this machine deserve the same rights as you or me? And why?

-Angry Lawyer
Wouldn't this be detrimental to the cause of building robots in the first place? I mean, a machine that could so accurately depict human emotions would be difficult to boss around as if it were some kind of tool.

No, because they will never create a robot that thinks like a human. They will give them a boundary; they'll be able to think for themselves but not make life-threatening decisions, like killing someone or committing a crime.
I actually agree with you here, despite your abrasiveness. Though someone is likely to make a major mistake someday and remove those "boundaries".
 
In a way I agree with that, but they were not treated like dirt. They did all that because they realised they were superior. Why? Because they gave them A.I. and let them think for themselves, and if they can think for themselves, they can choose to ignore those boundaries.
 
Wouldn't this be detrimental to the cause of building robots in the first place? I mean, a machine that could so accurately depict human emotions would be difficult to boss around as if it were some kind of tool.
Scientists do awesome and nutty things - this machine (which still looks like a machine, by the way, and is likely massive) would be in a lab somewhere in our hypothetical situation, and it'd be a one-off, not a mass-produced machine.

-Angry Lawyer
 
Imagine someone creates a machine that (at least appears) to be as intelligent as a human... Does this machine deserve the same rights as you or me? And why?

-Angry Lawyer

A hard question, warranting an "other" choice in the poll. Anyway...

In their current state, the basic rights apply to humans, and humans only, as is evident in the title and phrasing of the acts. So, if the current state persists, the machines would not be able to gain those rights anyway.

However, should it be reworded to include any sentient creature, then we would need to confirm that the machine is, indeed, self-aware, and only then grant the rights.

Myself? I believe they should be granted, as should animals. Hey, if a box of bolts can have the right to live, animals with a proper nervous system should too.
 
Okay, if they deserve some rights, which ones?

-Angry Lawyer

Things like "The right not to be required to divide by zero" :)

However, if they get any rights they need responsibilities. Same as if simians became sentient.
 
Scientists do awesome and nutty things - this machine (which still looks like a machine, by the way, and is likely massive) would be in a lab somewhere in our hypothetical situation, and it'd be a one-off, not a mass-produced machine.

-Angry Lawyer
Hmm, that reminds me of a movie. Actually, I can't think of it right now. Say, could you look through this list of Robots in Movies

and tell me which scenario best describes yours?;)
 
If they're anything like Bender, then I vote yes.
 
Humans have tissue – robots don't
Humans have organs – robots don't
Humans feel pain – robots don't
Humans feel emotion and all that jazz – robots don't
You get the picture

1. Organic robots, then?
2. I'm sure we can make a sentient machine that would be able to feel "pain". Humans only feel pain thanks to our nervous system; surely a similar system could be developed for machines. As a side benefit, it would help the machine assess damage to itself.
3. Again, you don't know what those guys in lab coats can come up with. What's so hard to believe about a machine that could befriend a human? I could write a very basic software application to do just that right now. Albeit basic, it would favour certain users over others based on the interactions each user has with the program. You could easily extend it to include a whole range of artificial feelings. Throw in some sentient thinking and bam, we have something capable of feeling.
4. What picture? You made none; all you've done is say what robots are not capable of NOW. Who knows what they will be capable of in the future.

Robots that could think for themselves would make for the most productive workforce around; isn't it only fair we reward such machines with a set of basic rights?
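A toy version of the "befriending" program described in point 3 above could be sketched in Python. To be clear, all the names and scoring rules here are my own invention, purely illustrative:

```python
# Sketch of a program that "favours certain users over others
# based on the interactions each user has with the program".

class FriendlyBot:
    """Tracks an affinity score per user and adjusts replies accordingly."""

    def __init__(self):
        self.affinity = {}  # user -> running score

    def interact(self, user, friendly):
        """Record one interaction; friendly interactions raise affinity."""
        self.affinity[user] = self.affinity.get(user, 0) + (1 if friendly else -1)

    def greet(self, user):
        """Pick a greeting based on accumulated affinity."""
        score = self.affinity.get(user, 0)
        if score > 2:
            return f"Great to see you again, {user}!"
        elif score < 0:
            return f"Oh. It's you, {user}."
        return f"Hello, {user}."

bot = FriendlyBot()
for _ in range(3):
    bot.interact("alice", friendly=True)
bot.interact("bob", friendly=False)
print(bot.greet("alice"))  # warm greeting
print(bot.greet("bob"))    # cold greeting
```

Extending this with more "feelings" would just mean adding more counters and more response rules, which is the poster's point: the basic mechanism is trivial.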
 
As society allows. If you created them and then just gave them the same rights, there would be an uproar, and people would likely act out violently against them - which would defeat the purpose of giving them equal rights in the first place. So, maybe, but only eventually.

Err, my answer was assuming you meant a robot race instead of just one. I don't see how an exception could ever be made for just one being, and why it would even be significant if it were only "locked in a lab". (which is also against basic human rights anyway, right? o_O)

Edit - Oops, you didn't say locked. But anyway it seemed like you meant it'd just stay there, right?

If they're anything like Bender, then I vote yes.
No! You've doomed us all!

And yet, you're so very, very right.
 
No, they shouldn't be allowed any rights. I say that to agree with Houdini for the prayer that he'll stop posting puerile turd.
 
Human rights are for humans. These things would have their own rights, but probably fewer of them. I still want the right to turn them off, in case they go tits-up Matrix-style.
 
If it is self-aware, sure why not. Anyone who thinks the matrix or whatever is likely to happen needs to get out more. Period.
 
Absolutely, yes.

Humans are just biological machines, so any machine that can imitate a human exactly absolutely deserves human rights. It doesn't matter if your circuits are made out of silicon or protein, they are still circuits and the effects and feelings, and thus the rights entitled, are the same in both cases.

I do think we will build robots who are at least as smart as human beings at some point in the future, and I do think that there will be an ethical battle concerning their treatment and use. I already know what side of that battle I am on.
 
Humans have tissue – robots don't
Humans have organs – robots don't
Humans feel pain – robots don't
Humans feel emotion and all that jazz – robots don't
You get the picture

1. Tissue = parts, motors, sensors
2. Organs = batteries, CPUs, neural networks
3. Pain = negative response to sensor input
4. Emotion= response to internal and external stimuli

Look, robots can and will have all of these things in the future. There is nothing preventing us from making a machine that behaves and feels exactly as a human does except powerful, small processing chips; accurate, fast sensor input; and quick-acting servos. These all get better over time, and in the future they will presumably catch up with their fleshy counterparts.

Though it is true robots today possess the collective intelligence of a retarded cockroach, there is nothing to suggest that they won't possess anything near human intelligence in the future.
 
3. Pain = negative response to sensor input

Pain is more than just a nerve impulse. It can and does have drastic effects on the psyche of people.
Any designed AI would be without such a response to whatever its version of pain is.
 
I can't wait to have most of my body replaced with robotic parts.
 
No.

"beings of an inferior order, and altogether unfit to associate with humanity, either in social or political relations, and so far inferior that they had no rights...."

-The Second Renaissance


Anyway, no, simply because they are not human. It's a simulation. A machine.

Another quote:

"Thou shalt not make a machine in the likeness of a human mind."

-The Orange Catholic Bible


It would be cool, yes, but we reserve all rights as their creators, and we cannot give them any rights, lest they get any ideas.
 
If we don't give them rights, then their programmed emotions will cause them to get pissed off and take over the world. So it's best to treat them as equals from the get-go.
 
I think it best not to have too many robots, and to have safety destructor functions included in every sentient robot.
 
Pain is more than just a nerve impulse. It can and does have drastic effects on the psyche of people.
Any designed AI would be without such a response to whatever its version of pain is.

But any AI that has learned behavior, rather than fully programmed behavior, and is sufficiently intelligent will be able to determine the meaning of pain.

MIT's C.O.G. for instance, already feels and responds to pain. It knows when its motors are rubbing against surfaces, when its joints are bent incorrectly, when something is about to poke it in the eye, when it is about to fall over, and when things that it "likes" such as shiny objects or people's faces, are taken away from it. COG has a negative response to these stimuli simply because it was programmed with certain "needs" which had to be fulfilled including positive contact with human beings, interesting environments, adequate sources of power and light, and the protection of its highly expensive parts.

All that the COG programmers did was imbue the neural network with a few simple stimuli which were considered negative, and subtract COG's "mood" counter each time these stimuli were detected. When positive stimuli such as bright colors, interaction with human beings, and other reward cues are activated, the mood counter has a bit added on. COG will continually seek positive stimuli and avoid negative stimuli until it has achieved "satisfaction".
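The mood-counter scheme described above could be sketched roughly like this. Note the stimulus names, scoring values, and satisfaction threshold are my own illustration, not COG's actual code:

```python
# Illustrative mood counter: detected negative stimuli subtract from
# "mood", positive stimuli add to it, and the agent counts as
# "satisfied" once the counter reaches a threshold.

NEGATIVE_STIMULI = {"joint_strain", "motor_friction", "face_threat"}
POSITIVE_STIMULI = {"shiny_object", "human_face", "bright_color"}

class MoodAgent:
    def __init__(self, satisfaction_threshold=5):
        self.mood = 0
        self.threshold = satisfaction_threshold

    def perceive(self, stimulus):
        """Adjust the mood counter for one detected stimulus."""
        if stimulus in NEGATIVE_STIMULI:
            self.mood -= 1
        elif stimulus in POSITIVE_STIMULI:
            self.mood += 1

    def satisfied(self):
        return self.mood >= self.threshold

agent = MoodAgent()
for s in ["shiny_object", "joint_strain", "human_face",
          "bright_color", "human_face", "shiny_object", "human_face"]:
    agent.perceive(s)
print(agent.mood)         # 6 positives - 1 negative = 5
print(agent.satisfied())  # True
```

A real system would of course drive behavior selection off this counter (seek positive stimuli, avoid negative ones) rather than just report it, but the bookkeeping is this simple.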

COG possesses the ability to learn. COG's brain is a set of 128 old Mac II computers, each simulating a single neuron. Granted, this is a minuscule amount of processing power, less than that of a nematode, yet COG has been able to learn complex behavior based on the set of simple stimuli it is given. For instance, COG has learned to gesture for toys that it desires, and has learned how to push a colorful toy car in the right direction (along the axis of the wheels, rather than sideways), because this gives it the most "pleasure". It knows to avoid objects like tables, to keep things away from its face, and to seek human attention and approval.

I believe that once we have neural networks as complex as a human brain (which is quite far off), we shall be able to build machines similar to COG, but with much greater complexity. They will be able to learn and grow up in an ever changing environment, and will essentially develop exactly like human children do today. Like COG we are endowed with a certain set of evolutionary predispositions to positive and negative stimuli, and like COG we gain complex insights into the world simply by living in it from day to day, learning what provides us with positive and negative stimuli, and taking actions to ensure that our needs are met.
 
I'm not saying that it'd be impossible to design a machine with the same responses to pain as us, but that we shouldn't.
Torture is the most extreme use of pain, and it's universally condemned. The intense psychological suffering resulting from prolonged pain isn't something we should design into AIs.
 
Torture is the most extreme use of pain, and it's universally condemned. The intense psychological suffering resulting from prolonged pain isn't something we should design into AIs.

lol wat if we gave them bonars!?
 
No. They shouldn't have basic human rights. They should have basic robot rights, which are better than human rights.
 
I would like to seriously contemplate and vote, but....


Since AI is doubtful in the near future, and at best our computers are still stumped by the simplest of hiccups, I think we're attributing too much faith to our ability to create, at least for the foreseeable future.

For AI to exist it would have to be as complex and intricate as the exact chemical structures of the brain, in short, we'd probably have to make a human brain, and it might be AI, but I doubt it would be 100% inorganic.


Our brains work as they do because of the exact layout, structure, and functions of every molecule and element that constitutes our brains.

We assume we can duplicate those functions and constituent parts with inorganic materials, but it might be that we can't.


Meh, I guess in conclusion, sapient intelligence is just that, and deserves rights, but I have reservations that it will ever occur. Any artificial intelligence we create will probably be organic. But, as a disclaimer, I am no cybernetics and robotics scientist, nor a chemistry expert; I can't rule out the possibility that there are inorganic molecules that can function in the same manner as the ones in our noggins.
 
I believe that any creature that can perform duties has rights. Therefore my answer is yes.

On a similar note, since animals can't perform any civic duties they have no rights. We have the freedom to be kind to them, but it's at our discretion. Animal rights = BS.
 
If humans can't make robots, then maybe robots can!

Solution!
 