
When, if ever, will a robot deserve “human” rights? (Poll Closed)

9 Comments

  • Bryan Burgess - 13 years ago

    When it has the ability to choose to ask for human rights.

  • Justen Robertson - 13 years ago

    Giving robots "human" rights is a little bigoted, isn't it? :) How about "sapient rights" - the right of all things that possess the capacity or potential to understand and observe universal ethics to enjoy their protection, and the duty to reciprocate. When a machine or animal can be said to have the capacity to understand life, liberty, and property, it is automatically obliged to assert and respect those things. No Turing test required. I am not convinced half the observably human people I talk to could pass a Turing test.

  • John Burgess - 13 years ago

    My read on this is that human rights are for things that have the needs that humans do, so they will never be extended to non-humans. Autonomous machines will have their own needs, and so will have to determine what rights are necessary to foster conditions that will lead to those needs being met. Those conditions will only be of equal concern to human-centric concerns when autonomous machines have the power to enforce that equality. Non-human animals will never have equal rights, because they cannot hope to enforce equality.

    To deserve rights is another question. Humans don't deserve rights. We choose to give ourselves rights because it makes life less terrible if we have some kind of protection from the war of all against all. But to the best of our knowledge there's no ontological order that we can appeal to that makes those rights fundamental. Everything is conditional and based on mutual agreement. Autonomous machines won't deserve rights either, but they should be no more or less constrained from working out mutual agreements for group and individual betterment.

  • Andrew - 13 years ago

    Mass veganism would accomplish one thing that many would advocate (though not me): dramatic population reduction and borderline extinction.

    Thinking in terms of rights is a continuation of a failed and skewed paradigm. I suggest instead thinking in terms of suffering. We must strive to end suffering for all sentient beings - humans, animals, insects, plants, and machines alike. If and when a machine can express suffering, either on its own or through measurement and observation, that will be the time that an intelligent machine should be considered "human" and treated with ethical respect.

  • Justin - 13 years ago

    I think veganism rejects humanism: we are animals.

    We should accept ourselves before going into new forms of consciousness creation.

  • Ames - 13 years ago

    Say we create sentient software. So what? I don't believe a machine should ever be seen as equal to biological life. I think that in another 50 years, if you go down and buy a robot with intelligence, bring -it- home, and decide you don't want it, then you should be able to destroy it without consequences. Robots should be our slaves, nothing more. I agree with Lesley Dove though: we should be trying to improve the way we treat sentient life on Earth before we try to create some of our own. Although, I don't think we should be vegan.

    My thinking is that machines are not equal to biological life. If and when they are created, machines, no matter how intelligent their algorithms may appear, are our servants; we created them as machines. Whether or not biological life we create should get rights is an entirely different question. Machines should not get any rights. I like my little Roomba robot, but I don't treat it like my dog; he is sentient, the Roomba is not.

  • Lesley Dove - 13 years ago

    We have to work on getting animal rights first surely, and more ppl going vegan, before we even think about robots being people with rights? It's about sentience and I was not aware we could create sentient artificial life! I think we should not create something sentient (even if we can) until we can respect those sentient beings already here.

  • CygnusX1 - 13 years ago

    When will a robot deserve personhood? When it applies memory and apperception to sentience and achieves self-awareness! Of course, it need not be a robot at all and may just be an AGI software algorithm, at least at first - a la HAL in 2001?
