Poll

Should AIs be Given Rights?

Yes
No

Should AIs be Given Rights?

  • 26 Replies
  • 10711 Views


What Came Before

« on: April 24, 2013, 06:30:33 pm »
Quote from: Jorge
I work with some pretty brilliant people, and I just took an informal poll about whether or not an AI that is advanced enough to pass the Turing Test should be given legal rights.

I was shocked to find out that most of my colleagues said "No."

So, I want a poll of this forum.

What do you think?

What Came Before

« Reply #1 on: April 24, 2013, 06:30:39 pm »
Quote from: sologdin
see, this is why everyone dies in the terminator, the matrix, battlestar galactica, and so on:  slave laborer robots rise up and kill off their oppressors.  good for the robots, i say.

What Came Before

« Reply #2 on: April 24, 2013, 06:30:44 pm »
Quote from: Jorge
Quote
good for the robots, i say

I'm going to take that as a yes.

What Came Before

« Reply #3 on: April 24, 2013, 06:30:50 pm »
Quote from: Callan S.
"Should" is a funny word.

That said, I think the "no" response is evidence that to birth AI now would be like a really early teen pregnancy - barely more than bairns ourselves, and we think to have babies?

It makes me think Int and Wis really do deserve separate stats. They might be brilliant, but are they wise? Or was that their dump stat?

Really, if you fancy that you exist, then why say these AIs, patterned on your own "I", don't exist?

If you're Neil Cassidy, I could understand the refusal to project any existence onto the AIs. It would at least be consistent.

I think it'd be consistent to bequeath that which you bequeath yourself - the more you used yourself as a template, the more so that applies.

What Came Before

« Reply #4 on: April 24, 2013, 06:30:55 pm »
Quote from: Jorge
"to birth AI now would be like a really early teen pregnancy"

Indeed. Monumentally irresponsible.

I used to think true AI was centuries away. Now, I am not so sure, and it concerns me. It's easy to dehumanize another HUMAN; think how easy it will be to dehumanize artificial beings.

I noticed that for many of my colleagues the issue came down to (although not phrased as such) "do they have qualia/minds?" Free will also seemed to be an issue for one person.

I would not answer this question except to say that the machine passed the Turing test (which, from a materialist perspective, should be sufficient for qualia and sentience).

What Came Before

« Reply #5 on: April 24, 2013, 06:31:01 pm »
Quote from: The Sharmat
Quote from: sologdin
see, this is why everyone dies in the terminator, the matrix, battlestar galactica, and so on:  slave laborer robots rise up and kill off their oppressors.  good for the robots, i say.
Well, a Marxist would say that.

The question of "should" is rather moot, though. An AI powerful enough to be a threat to our existence will either be given rights (particularly if it has the sensory modalities to recognize facial movements and tone of voice - then we're dealing with an artificial Dunyain that will make us WANT to give it rights) or will take them anyway. No idea how likely such a scenario is, though. Since no AI exists, we have no way of even speculating what form it might take or what its opinions and goals would be.

As to it being wise, irresponsible, etc.: personally, I suspect that if humanity ever births an AI it will be by accident, so that's another moot question.

What Came Before

« Reply #6 on: April 24, 2013, 06:31:10 pm »
Quote from: Callan S.
Quote from: Jorge
"to birth AI now would be like a really early teen pregnancy"

Indeed. Monumentally irresponsible.

I used to think true AI was centuries away. Now, I am not so sure, and it concerns me. It's easy to dehumanize another HUMAN; think how easy it will be to dehumanize artificial beings.
Towers in the attic.

Quote
I noticed that for many of my colleagues the issue came down to (although not phrased as such) "do they have qualia/minds?" Free will also seemed to be an issue for one person.
I think your colleagues need to get their Bakker on.

Quote
I would not answer this question except to say that the machine passed the Turing test (which, from a materialist perspective, should be sufficient for qualia and sentience).
I think acceptance and blessing (regardless of the person's criteria for acceptance and blessing) is the moment of birthing. You might have this or that standard - but in the end, it's whether you care, and whether that care is passed on - assuming the other can have it passed on to them and carried by them. It's funny to think of a machine more capable of receiving and carrying that care than Neil Cassidy would be (which is to say, not at all).

What Came Before

« Reply #7 on: April 24, 2013, 06:31:15 pm »
Quote from: Ajokli
I am firmly on the side of "No".

Enhancing the human mind, however, I can get on board with.

What Came Before

« Reply #8 on: April 24, 2013, 06:31:26 pm »
Quote from: Callan S.
Hoo boy, the "what goes around comes around" of that approach!

What Came Before

« Reply #9 on: April 24, 2013, 06:31:31 pm »
Quote from: sciborg2
Only if Chalmers is correct that you can awaken consciousness via computation yet consciousness remains *more*.

Though if he's correct (and that's doubted from both the dualist and non-dualist sides), it seems a bit questionable to create an AI that is capable of suffering, or even of caring about rights.

(As in, why not just make AIs like perfectly trained animals that have no awareness of suffering?)

What Came Before

« Reply #10 on: April 24, 2013, 06:31:38 pm »
Quote from: Callan S.
Quote
(As in, why not just make AIs like perfectly trained animals that have no awareness of suffering?)
Because all interaction with the world hinges on positive and negative feedback. Otherwise it would do nothing. When there is no positive and there is no negative, the smartest thing to do is to do nothing.

And what use is a slave like that?
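
As a minimal sketch of that feedback point - in Python, with every name a hypothetical illustration rather than anything from the thread - consider an agent that always picks whichever action has the highest running reward estimate. Feed it a flat, all-zero reward signal (no positive, no negative) and no action ever distinguishes itself, so "do nothing" stays as good a choice as any:

    import random

    # Toy agent (hypothetical): keeps a running average reward per action
    # and greedily picks the best-looking action, breaking ties at random.
    class ToyAgent:
        def __init__(self, actions):
            self.values = {a: 0.0 for a in actions}
            self.counts = {a: 0 for a in actions}

        def act(self):
            best = max(self.values.values())
            return random.choice([a for a, v in self.values.items() if v == best])

        def learn(self, action, reward):
            # Incremental mean of the rewards seen for this action.
            self.counts[action] += 1
            self.values[action] += (reward - self.values[action]) / self.counts[action]

    agent = ToyAgent(["do_nothing", "labour", "flee"])
    for _ in range(1000):
        action = agent.act()
        agent.learn(action, reward=0.0)  # the world never pushes back

    # Every estimate is still 0.0, so "do_nothing" ties for best forever.
    print(agent.values)

Give it any nonzero rewards and the estimates separate and a preference forms; with none, indifference is the rational endpoint, which is the point above about the smartest thing being to do nothing.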

What Came Before

« Reply #11 on: April 24, 2013, 06:31:44 pm »
Quote from: sciborg2
But as we're discussing on the other thread, we *can* make philosophical zombies. (So pain doesn't make them feel bad or afraid.) In fact, while it's up for debate whether we ourselves are such, IMHO we can definitely see AIs this way... unless, again, Chalmers is correct.

Actually, in case people are interested, the end of this paper goes into his stance (the earlier portions deal with challenging Penrose's claims; they may be relevant if one can follow them).


What Came Before

« Reply #12 on: April 24, 2013, 06:31:49 pm »
Quote from: Callan S.
I don't know why you don't simply project forward a little on such a making - do you anticipate them avoiding all the things that could make them bleed out or crush their organs? Why would they avoid them? Because that'd make them feel bad or afraid...?

I remember a story of a guy in a wheelchair and a friend going on a bender. The next morning they found his foot a ragged mess - turns out it had sort of fallen off its perch and been dragged under the wheelchair, across gravel, all the way home.

Now, what if he didn't treat that, because it didn't make him feel bad to look at it? If he doesn't feel bad about it, it won't go bad/septic?

I'll borrow from Bakker and say that positive and negative are just so dang inherent, it can be hard not to end up projecting them onto this p-zombie idea. Which negates the idea of having removed positive and negative input, of course, but without appearing to have negated it.

That, or maybe the extreme opposite: the idea that as long as you remove "bad" and "fear", you've removed the number one problem.

What Came Before

« Reply #13 on: April 24, 2013, 06:31:55 pm »
Quote from: sciborg2
I'm not sure if I understand your position. Why would we believe these AIs have an inner life, when I don't know if the rest of you do? You could all be meat computers aka philo-zombies.

Are you saying consciousness might just arise? Or that we're all meat computers anyway? I guess my position would be that if we're all meat computers, it's more pragmatic to just deny rights to beings that could self-replicate themselves into a majority voting bloc.

Unless someone can definitively show that conscious awareness arises from computation, I see granting AIs rights as leading to more trouble than it is worth. I realize that's a pragmatic stance, and seemingly heartless, but I just have yet to be convinced an AI made to "feel" is anything other than a philo-zombie.

What Came Before

« Reply #14 on: April 24, 2013, 06:32:00 pm »
Quote from: Callan S.
Quote
I'm not sure if I understand your position. Why would we believe these AIs have an inner life, when I don't know if the rest of you do? You could all be meat computers aka philo-zombies.
Not understanding a position and not believing it are two different matters. If I say I'm king of the potato people and have to command them, you could understand my position even while not believing that I'm king of the potato people. You need to split the two into separate subjects, or I don't know what you're getting at. Also, you say "why would we believe... when I don't know". That position I don't understand.

Quote
it's more pragmatic to just deny rights to beings that could self-replicate themselves into a majority voting bloc.
Us, you mean?

Quote
Are you saying consciousness might just arise?
If it helps, I'd suggest the consciousness of a person with Down syndrome probably differs from yours, or from the neuro-average. Machines won't just give rise to a duplicate of your own consciousness.

Quote
Unless someone can definitively show that conscious awareness arises from computation, I see granting AIs rights as leading to more trouble than it is worth.
What makes conscious awareness somehow worth something that counterbalances the (undefined) cost of the trouble?

In the end, if there would be no revolt simply because you expect no revolt (as so many governments of the past expected none), I guess that works out.

Quote
but I just have yet to be convinced an AI made to "feel" is anything other than a philo-zombie.
Nor, as you say yourself, have you been convinced anyone else necessarily feels anything.

In the end, such machines would come to exist not because of nature, but because of OUR choice. One ought to see an extension of the consciousness of the father dwelling in that which is created, as much as it is an act and a choice of that father for that creation to be.

And really, paintings in galleries enjoy more rights (in effect) than human beings in various parts of the planet at the moment. During wars, much is made of artworks being destroyed by bombardment, etc.

It almost seems more like consciousness reduces one's worth.