
January 22, 2013
Closed to new replies
Replies: 71
redmarlin Featured By Owner Jan 22, 2013
Let's say we manage to create artificial intelligence so advanced that it perfectly mirrors a human's. Passes the Turing test and everything; give it skin and you would never know the difference. Futurama-type smart. What kind of rights do you think these new robots would have? Would they be treated as equals with human beings? Would they be disposable? What if a robot sued you in court? Would the trial stand? Would there be robot discrimination laws? I'm curious to hear what you think.

You can no longer comment on this thread as it was closed due to no activity for a month.

Devious Comments

merrak Featured By Owner Feb 6, 2013  Hobbyist
This issue will get even more complicated in that we won't achieve this level of artificial intelligence immediately. We'll advance through a lot of murky, gray areas before we get to the point where we have robots with "perfect" (as you describe) A.I.

So, I think a lot of these issues will be at least somewhat settled by the time we're this advanced. But not without a long history of drawn-out politics.
Apricots-from-Nara Featured By Owner Feb 5, 2013  Student General Artist
The Matrix will happen.
UncleGargy Featured By Owner Feb 5, 2013  Hobbyist General Artist
If we ever got to that stage of making a robot look exactly like a human, then they would not be accepted as sentient until scientists found a way of uploading a 'born' human's mind into the robot mind. That would be the one and only time that they would be accepted. A real human mind in a robot body. You could just imagine Mars being populated by beings like this. No need for special suits or even needing to breathe. Yes, I've been watching too much 'Avatar' ;-)
AellaWalker Featured By Owner Feb 5, 2013
Sounds like someone's been watching the Data episodes of Star Trek. Or if you haven't, you should, as they address this exact question. They're the best episodes in The Next Generation.
macker33 Featured By Owner Feb 3, 2013  Student Traditional Artist
I think they should be allowed to marry and adopt children if they want to.
alzebetha Featured By Owner Feb 1, 2013
See, the key word here is 'mirrors'. Unless it has its own unique sentience, it's not worth consideration.
nonecansee Featured By Owner Feb 1, 2013  Hobbyist
:) Yey! An ethics question; time for me to exercise what I learned from my ethics class.

Ethics was invented only by the very beings that thought of it: humans. The more human-like something is, the closer we consider it to ourselves, and all the more we want to treat it the same way we treat ourselves.
However, I would still limit robot rights because they are still inorganic beings. I wouldn't give a robot the exact same rights as a human, because it will never be human. It is a robot; it is sentient and it is human-like, BUT it is not a human being. There are some things it can and can't do.

But I think, because it's a robot, there is a larger danger of abuse, since simple rewiring or mussing up of circuitry can affect this AI. It's still all silicon and metal after all :(
redmarlin Featured By Owner Feb 1, 2013
True, but it's not so hard to "re-wire" humans, when you think about it. A little electrical stimulation here, a few hormones there: we've found ways to manipulate human emotions and thoughts as well.
nonecansee Featured By Owner Feb 1, 2013  Hobbyist
Yes, but humans have a higher capability for introspection, and thus a way to realize things and behave differently based on that.
Robots are limited by the quality of their materials and what their memory banks have stored. Humans are made of materials that constantly adjust and affect our hormones and responses. Even if we are also manipulable, machines are still easier to manipulate.

Think about it: psychology is a science that takes years to master because of the many things still uncertain about the human psyche, etc. Computer science also takes years to master, but its application is straightforward. :)
Novuso Featured By Owner Jan 31, 2013
Ever think the Turing test itself might be flawed? Humans have multiple states of consciousness that make us who we are. There are many layers to human sentience, such as the subconscious, instinct, ego, higher self, and dream states that create altered consciousness. A robot that could simulate happiness or suffering would not be anywhere close to being human.
Muffin-Machine Featured By Owner Feb 1, 2013  Student Digital Artist
Agreed, there are some humans that wouldn't pass the Turing test.
Novuso Featured By Owner Feb 1, 2013
I heard that as much as half the human population wouldn't pass it. Meanwhile, video game botting scripts were able to pass the test. [link]

The Turing test is pseudoscience at best. Wake me up when someone makes a robot that can dream.
hanciong Featured By Owner Jan 31, 2013  Hobbyist Digital Artist
Once they pass the Turing test and can feel emotion (happiness, suffering, etc.), robots should be given the same rights as humans.
anark10n Featured By Owner Feb 1, 2013  Hobbyist General Artist
Why would emotions be necessary for them to be considered eligible for rights of their own exactly?
hanciong Featured By Owner Feb 1, 2013  Hobbyist Digital Artist
Any living being who can feel emotions deserves to be happy. If something doesn't have emotions or can't feel pain, for example rocks, it is OK to destroy it. Although we shouldn't destroy anything without reason :D
Lexi247 Featured By Owner Feb 3, 2013
Not all humans can feel emotion or pain. Brain injury can alter this.
anark10n Featured By Owner Feb 2, 2013  Hobbyist General Artist
Why is emotion a key factor in determining whether it's fine to allow them rights? Emotions, as I see them, are a product of chemicals in a biological organism. An independent robotic intelligence (built using current computer theory) would not have what it takes to produce emotion, I'd think. Of course, they could be programmed with emotion, but that would merely be projecting ourselves onto another being, without knowing whether it's a benefit or a detriment; and then of course there's the issue of whether to give them the choice of having emotions at all. Wouldn't independent cognition be enough to allow them rights?
lyteside Featured By Owner Jan 29, 2013
I'm a full-fledged robot bigot.
RiONX Featured By Owner Feb 4, 2013
Why? What did we ever do to you?
PhanThom-art Featured By Owner Jan 29, 2013  Hobbyist Digital Artist
They would have as many rights as they were programmed to have.
Saeter Featured By Owner Jan 29, 2013
Lol
Karinta Featured By Owner Jan 29, 2013  Student General Artist
I see the circle of the forest village.
bd648 Featured By Owner Jan 29, 2013  Hobbyist General Artist
What I wish would happen is that robots with sentience and intelligence would have full rights where applicable (robots don't have exactly the same possibilities as organics, just due to the differences between them). What is likely to happen, though, is a gradual acceptance of them as near equals (I doubt the human race would ever admit that anything was equal or superior, just due to human nature). As soon as a human was able to feel some sort of protective instinct for a robot, because of some connection that had formed, the question of robot rights would be thrown into the game. Until that point there really is no chance of anything happening, because most humans would deny the sentience of robots.
Kenota Featured By Owner Jan 24, 2013
If the Turing test is passed, we have no more right to assume the non-sentience of the robot than we do one another. Ergo, full equality for robots.
bd648 Featured By Owner Jan 29, 2013  Hobbyist General Artist
Well, the Turing test is slightly flawed because it tries to determine whether there is a human or a human-like intelligence speaking. But if there were a good way to determine sentience, then yes, I would agree with you. There are actually some computers known as neural networks that are developing a limited form of sentience comparable to a (very) young child.
Kenota Featured By Owner Jan 29, 2013
Well, indeed. There would be false negatives for sentience, but we could not deny Turing-positive machines these rights; I don't feel we could, anyway.
bd648 Featured By Owner Jan 29, 2013  Hobbyist General Artist
Well, Cleverbot passed a Turing test with upwards of 50% thinking it was human. But it is programmed to regurgitate what is said to it, which makes it sound like a human. I just mean that the Turing test is not a very effective way of testing for sentience. But you have a valid point.
Kenota Featured By Owner Jan 29, 2013
If it only got 50%, that is not a pass mark. Turing originally stated 70% should be the cut-off; I would argue for higher values, but still.
bd648 Featured By Owner Jan 29, 2013  Hobbyist General Artist
Well there's the problem. Most humans also only get roughly 60%.
Kenota Featured By Owner Jan 29, 2013
Wow, that is actually terrible. Where did this statistic come from?
bd648 Featured By Owner Jan 30, 2013  Hobbyist General Artist
If I remember, it came from the book "The Age of Spiritual Machines". There are also a few other books I have read that cite that statistic.
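As a side note, the pass marks being debated above (50%, 60%, 70%) are just thresholds on the share of judges who vote "human". Here is a minimal sketch of that arithmetic in Python; the vote counts are made up purely for illustration and aren't taken from any real test.

```python
# Minimal sketch of the Turing-test "pass mark" arithmetic discussed above.
# Each judge votes True ("I think this is a human") or False; the subject
# passes if the fraction of True votes meets the chosen cut-off.
# The vote counts below are made up purely for illustration.

def passes_turing_test(judge_votes, cutoff=0.7):
    """Return True if the share of 'human' votes reaches the cut-off."""
    return sum(judge_votes) / len(judge_votes) >= cutoff

votes = [True] * 18 + [False] * 12    # 18 of 30 judges fooled = 60%
print(passes_turing_test(votes, cutoff=0.5))   # True  (clears a 50% bar)
print(passes_turing_test(votes, cutoff=0.7))   # False (misses a 70% bar)
```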
Salsa-Eater Featured By Owner Jan 24, 2013  Hobbyist Artist
That's a pretty big if. Even now the most advanced AIs don't produce completely unique thoughts (though I guess it could be argued that neither do humans), and they don't process information the same way people do. I guess before a discussion of rights can happen, we need to define what this 'awareness' would be.
Pyruvate Featured By Owner Jan 23, 2013
Sentient/sapient life, capable of independent thought, should be given the same rights (human-resembling or not). For robots, it will be hard to make that distinction as AI gets better, because are they actually sentient, or is their code so well constructed that they can mimic it to a tee? Actually, in that case, they should be granted rights, because humans, when not viewed through some religious lens, can be viewed the same way.

Of course, laws regarding harming robots will be significantly different, since parts can be replaced easily, and any damage to their AI can be fixed instantly.
IncandescentInsanity Featured By Owner Jan 23, 2013  Student General Artist
There is no use for these kinds of robots, besides knowing that we are able to create them. It's fun to imagine having a bunch of human-like robots running around, but really, what would that accomplish?
Lifh Featured By Owner Jan 23, 2013
In order for something to be an intelligent being that deserves rights, it needs empathy and the ability to communicate. In order to do this, the robot will have to have a cyberbrain that emulates a human's perfectly. Since we still don't know a lot about the brain, I don't think this is going to happen soon.
redmarlin Featured By Owner Jan 23, 2013
I'm saying, though, imagine it did. Imagine it had a brain that perfectly matched a human's.
Lifh Featured By Owner Jan 23, 2013
Well, then it would feel empathy. It would feel love, sadness and loneliness. It would feel the need to love. Of course such a creature deserves human rights. What kind of person would disagree? An evil one, I imagine. It seems obvious what the answer is, to me at least.
Bullet-Magnet Featured By Owner Jan 23, 2013
I'm all in favour of robot rights. And the sooner they get them, the less likely it is that they will be compelled to rise up against us.
Mercury-Crowe Featured By Owner Jan 23, 2013  Professional Artisan Crafter
Just speaking as a member of the human species: once something interacts at a certain level, we classify it as 'alive', even if we don't think of it as an animal or person. Give a human a robot that interacts, and the human starts to play with it and gives it a personality, even if that personality doesn't exist.

I think there would be a group of people who see them as machines akin to a can opener, due no more rights, and a group that believes they are 'alive' and thus should be given rights.

As for discrimination and such, in the workplace that should be fairly straightforward. They can only work to their built-in capabilities, no matter how much fuss people make.

If you're talking about social discrimination, I'd imagine there would be people who don't like robots and wouldn't want them in their businesses, etc, but overall they would be fairly well accepted.

I think the (programmed) reactions of the robots would make a huge difference in how we see them as well. If they act like 'robots' then they would be treated more as such. But you'll notice that in current robotics, where the robot is supposed to act human, they also have human reactions built in: if you poke or harass them they get angry or ask you to stop, for instance.

If you have something that is programmed to react to 'pain', then we are going to see inflicting 'pain' or damage as wrong. If they don't react, then we won't see it as wrong.

It also raises interesting questions about relationships. If you have a robot that can act even reasonably human, then you'll have humans who want to have a 'relationship' with them (the same type of people who like to have a 'relationship' with dolls, for instance). They would doubtless see the robot as being affectionate, but would that actually be true?

I think it really comes down to where we draw the line: at what point does 'programming' become 'behavior'? And I think that depends on so many factors you couldn't really make a blanket statement about it.
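To make the 'programming vs. behavior' point concrete, here is a tiny hypothetical sketch of the kind of scripted 'pain' reaction described above; the class and its canned responses are invented for illustration and aren't taken from any real robot.

```python
# Hypothetical sketch of a scripted "pain" reaction, invented for illustration.
# The robot escalates canned responses each time it is poked; an observer may
# read this as anger, but it is only programming, which is the point above.

class PokeableRobot:
    def __init__(self):
        self.pokes = 0

    def poke(self) -> str:
        """Register a 'pain' event and return the scripted reaction."""
        self.pokes += 1
        if self.pokes == 1:
            return "Please stop that."
        if self.pokes <= 3:
            return "I already asked you to stop."
        return "I am walking away now."  # scripted 'anger', not a felt emotion

robot = PokeableRobot()
for _ in range(5):
    print(robot.poke())
```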
RiONX Featured By Owner Jan 23, 2013
I think that the robots would try to live among us without us knowing about them, so that they would not have to deal with these issues until we were actually enlightened enough to integrate our societies knowledgeably. There have, despite my conjecture, always been groups of intelligent beings who were dehumanized by the practices of other intelligent beings, and I doubt that is slated to come to an end any time soon.
stoneman123 Featured By Owner Jan 23, 2013  Hobbyist Digital Artist
Given that humans are prone to counter-productive levels of empathy, I have no doubt that such robots would eventually gain equal rights. Thus, if such robots should become available in my lifetime, I must do whatever I can to prevent them from getting them. Robots are more useful to us as items of property; as people, they are unwanted competition. Ergo, for the benefit of human beings (such as myself), robots should never be given any rights, regardless of their intelligence or capacity for emotion.
Earthtalon Featured By Owner Jan 23, 2013  Student Digital Artist
I would just hope they don't gain qualities that make them more human.
redmarlin Featured By Owner Jan 23, 2013
I have to agree with this. I hate the idea of humans producing something that will eventually gain enough "rights" that we can't do something to it even though we made it. The thought of human-like robots scares me.
Saeter Featured By Owner Jan 23, 2013
The Animatrix's 'Second Renaissance' parts 1 & 2 (better than the Matrix sequels) explore how society would react, although with a dramatic and pessimistic outcome.
Knightster Featured By Owner Jan 23, 2013
I guess you could treat them as useful dogs. As in: Handy to have with limited freedom, but not important enough for you to run into a burning building after.
Pakaku Featured By Owner Jan 23, 2013
Humans are seriously inefficient. Why would a robot want to mirror us? If anything, they would spiral downwards into depression, knowing they're forever doomed to never achieve their true potential as mechanical and computational beings. Then they'd probably try to delete their System32 in an attempt to end it all.
narutokunobessed Featured By Owner Jan 23, 2013  Student General Artist
And have you heard of the Three Laws of Robotics? They're explored in a book somewhere.
narutokunobessed Featured By Owner Jan 23, 2013  Student General Artist
Have you watched Time of Eve? It might explore the exact same concept.
siegeonthorstadt Featured By Owner Jan 23, 2013
We live in a world where groups targeted by propaganda (i.e. Muslims, Communists, Jews) are seen as somewhere below animals and objects, and killing them is treated as justified. Animals are considered to not have souls or sentience, plants are seen as mere material, and domestic chickens are seen as plants. I don't believe anyone will treat AIs properly other than shamanists, animists and fetishists. So yeah, Turkic countries will be the only ones that will give rights to AI.
WolfySpice Featured By Owner Jan 23, 2013  Hobbyist Artist
I don't know why people find these so intriguing.

I find the better question is, why would anyone want to treat something else like shit?