RickVoid
Oct 21, 2010

my dad posted:

Convenience-based ethics are gonna cause some fun issues. Accurate simulations of individual human behavior would be extremely useful, and have a lot of applications. So we might get something along the lines of what happened when humanism and slavery reacted with each other.

  • 1. All people have inherent human rights, one of which is to not be a slave
  • 2. I benefit from slavery
  • 3. Therefore, slaves aren't people. And I should probably come up with a scientific and moral framework around this and get everyone to accept it.

Also, our entire concept of free will in ethics and law is going to suffer some drastic changes as society starts dealing with intelligent entities that can be designed to give consent.

We're going to have to have some sort of standardized demarcation between what is an artificial machine intelligence (IE: Not People) and a simulated human mind (IE: People), for reasons other than basic rights. Do you really want a real, human intelligence, with all of the emotional baggage that goes with it, directly responsible for the regulation of, say, the power supply for every piece of equipment in a hospital? No, no you do not.


RickVoid
Oct 21, 2010

davidspackage posted:



Here's the bonus video for site Delta.


The talking robot at Delta only has one person's brain scan in it, by the way, but I understand the confusion - there were originally three men and one woman working at Delta. The scan in the robot belongs to Javid Goya, and he alternately addresses Terry Akers and Brandon Wan, the other two men working at the site.

OH GOD IT SCREAMS A DIGITAL "WHY" AT YOU UNTIL YOU FINISH IT! :gonk: x11

The DOOM gag was pretty sweet, and shows just how much the audio affects the tone.

my dad
Oct 17, 2012

this shall be humorous

RickVoid posted:

We're going to have to have some sort of standardized demarcation between what is an artificial machine intelligence (IE: Not People) and a simulated human mind (IE: People), for reasons other than basic rights. Do you really want a real, human intelligence, with all of the emotional baggage that goes with it, directly responsible for the regulation of, say, the power supply for every piece of equipment in a hospital? No, no you do not.

Eh. Nobody is going to use an intelligent mind for things like that, we're already using dumb as bricks automated control systems that are perfectly good for the job. Stuff capable of serious learning is effectively the "moving parts" of thinking machines. It adds complexity which increases both the cost of making it and increases the chance of failure, which, besides the obvious issues, also adds to the cost because of increased maintenance. You're only going to be using the absolute minimum amount of smarts you can get away with. Also, I highly doubt we're going to reach non-human intelligence first by simulating human minds. No, we're going to be making something intelligent and non-human, because we've already got regular humans for things we need humans for. That's the big problem. It's gonna basically be traversing a completely unmapped land, with no previous precedent to base a decision on. Figuring that stuff out is going to be a painful learning experience for humanity.


Don't separate intelligence and emotions, though, because they're inseparable. Emotional baggage of sorts is part of any learning process. If you don't believe me, just throw some contradictions at something capable of machine learning and watch how it copes with it in the end. :v:
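That "throw some contradictions at it" experiment can actually be staged with a toy learner. The sketch below is a made-up, minimal illustration (a single logistic unit in plain Python; none of it comes from any real system mentioned in the thread): the same input shows up labeled both 0 and 1, and the learner "copes" in the end by hedging at about 0.5, committing to neither answer.

```python
import math

# Contradictory dataset: the identical observation appears labeled
# both "false" and "true", interleaved 50 times each.
data = [([1.0], 0), ([1.0], 1)] * 50

w, b, lr = 0.0, 0.0, 0.1
for _ in range(1000):
    for x, y in data:
        p = 1.0 / (1.0 + math.exp(-(w * x[0] + b)))  # sigmoid prediction
        grad = p - y               # gradient of the cross-entropy loss
        w -= lr * grad * x[0]
        b -= lr * grad

# Final prediction for the contested input: the weights settle into a
# tight oscillation around the point where both labels pull equally,
# so p ends up near 0.5.
p = 1.0 / (1.0 + math.exp(-(w + b)))
```

The learner doesn't crash or pick a side; it converges on permanent indecision, which is about the closest thing to emotional baggage a gradient update can express.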

my dad fucked around with this message at 00:36 on Mar 9, 2016

chitoryu12
Apr 24, 2014

my dad posted:

Eh. Nobody is going to use an intelligent mind for things like that, we're already using dumb as bricks automated control systems that are perfectly good for the job. Stuff capable of serious learning is effectively the "moving parts" of thinking machines. It adds complexity which increases both the cost of making it and increases the chance of failure, which, besides the obvious issues, also adds to the cost because of increased maintenance. You're only going to be using the absolute minimum amount of smarts you can get away with. Also, I highly doubt we're going to reach non-human intelligence first by simulating human minds. No, we're going to be making something intelligent and non-human, because we've already got regular humans for things we need humans for. That's the big problem. It's gonna basically be traversing a completely unmapped land, with no previous precedent to base a decision on. Figuring that stuff out is going to be a painful learning experience for humanity.


Don't separate intelligence and emotions, though, because they're inseparable. Emotional baggage of sorts is part of any learning process. If you don't believe me, just throw some contradictions at something capable of machine learning and watch how it copes with it in the end. :v:

One of the most interesting things I saw in that study I linked earlier about the lying robots was the way some robots learned to identify and respond to lying. This article goes into more detail. Some of the robots did indeed learn "If I keep my light off instead of turning it on when I find the good resource, I can keep it all without being pushed out by competition". But other robots were supposed to be attracted to the lights, which is why lying was effective. The attraction was learned, with the robots that developed the strongest attraction to the light surviving.

What happened as well was that 33% of the robots developed an aversion to the lights. It seems like the lying robots (the ones who didn't shine when they found resources) actually caused a change in the others, as the robots increasingly realized that the robots that found resources were staying dark and thus they shouldn't necessarily trust the lights to lead them. It was at complete odds with their intended programming.

my dad
Oct 17, 2012

this shall be humorous

chitoryu12 posted:

One of the most interesting things I saw in that study I linked earlier about the lying robots was the way some robots learned to identify and respond to lying. This article goes into more detail. Some of the robots did indeed learn "If I keep my light off instead of turning it on when I find the good resource, I can keep it all without being pushed out by competition". But other robots were supposed to be attracted to the lights, which is why lying was effective. The attraction was learned, with the robots that developed the strongest attraction to the light surviving.

What happened as well was that 33% of the robots developed an aversion to the lights. It seems like the lying robots (the ones who didn't shine when they found resources) actually caused a change in the others, as the robots increasingly realized that the robots that found resources were staying dark and thus they shouldn't necessarily trust the lights to lead them. It was at complete odds with their intended programming.

The individual hungerbots didn't learn a drat thing. A part of the hungerbot "species" learned through the means of natural selection and mutation. Well, success based selection for reproduction, the natural part is not really applicable here. :v: It's similar to how a species changes to fit its habitat better, and even diverges into subspecies along the way.

You'll probably enjoy playing around with this website:

http://www.cambrianexplosion.com/

Set the creatures to quadruped, set simultaneous creatures to 10, raise the mutation chance to 0.2, and watch how they evolve their movement. If you run it a couple of times, you'll notice that they sometimes evolve completely unexpected ways of gaining distance which have nothing whatsoever to do with walking. (My favorite is high-speed somersaulting) It's a very similar process to what you linked, except with no non-reproductive interactions between individuals.
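For anyone curious what that loop is doing under the hood, here's a minimal selection-and-mutation sketch in Python. Only the population size of 10 and the 0.2 mutation chance come from the settings above; everything else is an assumption for illustration: the "genome" is just a list of gait numbers, and fitness() is a stand-in for distance traveled, where the real site runs a physics simulation of each creature.

```python
import random

def fitness(genome):
    # Stand-in for "distance gained"; a real walker is scored by
    # simulating its gait, not by summing its genes.
    return sum(genome)

def mutate(genome, rate=0.2):
    # Each gene independently has a `rate` chance of a small random
    # tweak, mirroring the 0.2 mutation chance suggested above.
    return [g + random.gauss(0, 0.5) if random.random() < rate else g
            for g in genome]

def evolve(pop_size=10, genes=8, generations=200):
    pop = [[random.uniform(-1, 1) for _ in range(genes)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # Success-based selection for reproduction: the better half
        # survives and breeds mutated copies. Nothing "natural" here.
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]
        children = [mutate(random.choice(survivors)) for _ in survivors]
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
```

Note that the loop never says how distance should be gained, which is exactly why runs converge on somersaulting as happily as on walking: any mutation that scores better survives, however alien the gait.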

No Gravitas
Jun 12, 2013

by FactsAreUseless

my dad posted:

The individual hungerbots didn't learn a drat thing. A part of the hungerbot "species" learned through the means of natural selection and mutation. Well, success based selection for reproduction, the natural part is not really applicable here. :v: It's similar to how a species changes to fit its habitat better, and even diverges into subspecies along the way.

You'll probably enjoy playing around with this website:

http://www.cambrianexplosion.com/

Set the creatures to quadruped, set simultaneous creatures to 10, raise the mutation chance to 0.2, and watch how they evolve their movement. If you run it a couple of times, you'll notice that they sometimes evolve completely unexpected ways of gaining distance which have nothing whatsoever to do with walking. (My favorite is high-speed somersaulting) It's a very similar process to what you linked, except with no non-reproductive interactions between individuals.

I work with this kind of stuff for a living. Amazing what gets created out of selection, reproduction and mutation. I get beautiful charts too. Love this stuff.

Deceitful Penguin
Feb 16, 2011

chitoryu12 posted:

It depends on what the scans are, right? Is a perfect replica of a human brain, capable of thinking and feeling just like the human it once was, different from a human simply because it wasn't born? That leads into some unfortunate ethical complications as well: do clones count because they weren't "really" born? Do robots with a perfect constructed brain, no scans needed, count? Or can they be destroyed with impunity because their intelligence isn't "real"?

Personally, in real life, I've taken the stance that any artificial intelligence that equals the intelligence of an Earth creature should effectively be considered equal to that Earth creature. A robot that has the same effective intelligence as a dog or crow should be treated as if they were a biological animal of that same intelligence, rather than a disposable tool to be deleted, scrapped, or wiped once no longer needed. Likewise, any robot that somehow manages to achieve human intelligence within my lifetime (even if it's only as intelligent as a badly mentally disabled human) is one that I'm fully prepared to treat as an equal. I don't believe that a live birth should be a qualifier for rights. Even if we were to suddenly discover conclusively that humans have souls but clones and robots given human intelligence do not, it wouldn't change my opinion at all. Life is life, even if it's silicon.

A human being is more than just a brain scan. This thing in particular has always annoyed me, because we are bodies, we are flesh, far, far more so than we are any idea of "brain" independent of a body. A scan simply catches a human brain in action; it's a photocopy, no matter how good. A human being is a gestalt of its ghost and its body, and through that it has a soul. Take either away and you have either a corpse or just meaningless data.

The physical make-up of our brains, the feelings and needs of our bodies, these are equally if not more important than illusionary ideas of an independent mind.

Clones, in the Star Trek sense, are different. They're pretty much exactly the same thing as the old copy, which is destroyed. I personally wouldn't care for that, because I, the last copy, would be dead, even though my exact copy continued. I don't really think that's a good thing.

If a robot had actual cognizance, eh. It's unlikely to ever come up. People underestimate in the extreme the problems of intelligence. Animal-level intelligences, sure. People that treat animals as things are almost always psychos, or it at least correlates very strongly.


RickVoid posted:

We're going to have to have some sort of standardized demarcation between what is an artificial machine intelligence (IE: Not People) and a simulated human mind (IE: People), for reasons other than basic rights. Do you really want a real, human intelligence, with all of the emotional baggage that goes with it, directly responsible for the regulation of, say, the power supply for every piece of equipment in a hospital? No, no you do not.

I think everyone here would really benefit from watching the Black Mirror Christmas special if they haven't already. The abuses we heap upon human beings today, as part of the capitalist structure of society, are already so inhuman that any human-like intelligence hasn't a loving hope. People would put it out of sight, out of mind, and when they finally went Skynet on us, we would deserve it.

chitoryu12
Apr 24, 2014

Deceitful Penguin posted:

A human being is more than just a brain scan. This thing in particular has always annoyed me, because we are bodies, we are flesh, far, far more so than we are any idea of "brain" independent of a body. A scan simply catches a human brain in action; it's a photocopy, no matter how good. A human being is a gestalt of its ghost and its body, and through that it has a soul. Take either away and you have either a corpse or just meaningless data.

The physical make-up of our brains, the feelings and needs of our bodies, these are equally if not more important than illusionary ideas of an independent mind.

At the same time, SOMA demonstrates the problem with that kind of thinking: the brain scans may be copies of the human mind, but they're capable of independent thought beyond the now-dead human they once were. Their programming even allows them to see and feel themselves as flesh and blood until their minds overcome it. Soul or no soul, they're living and thinking beings capable of feeling emotions and pain. I feel like trying to apply human rights exclusively to beings that are said to have a soul (in this case, flesh-and-blood humans born of a womb) is a good way to lead down a dark path, as it ends in the justification of causing pain to beings because "Well they don't really exist, so does it really matter if they tell me that they're hurting?"

chitoryu12 fucked around with this message at 18:04 on Mar 9, 2016

Deceitful Penguin
Feb 16, 2011
How are they "living"? Why would you call them 'alive'? Animate, maybe, but alive?

Without a body, you can't be said to really be alive. If you can make your own body, or switch them, you're probably no longer human and while non-humans aren't things, they certainly don't matter as much as humans do.

Is the illusion of pain, real pain? Nah. And neither is the photocopy of the fraction of humanity that would be a brain scan a living thing. One of the defining things of humanity is how we are bound to the here and now and will never repeat. We are each, in our own insignificant way, unique. Without mortality, without the limits set to us by biology, we are no longer human.

And at that point, what is the point? Just for 'intelligence' to survive? Empty lines of text that once were living, breathing cultures and peoples? What the hell does it matter to me that metal monstrosities might one day remember a blue sky or the written word?

EponymousMrYar
Jan 4, 2015

The enemy of my enemy is my enemy.

Deceitful Penguin posted:

How are they "living"? Why would you call them 'alive'? Animate, maybe, but alive?

Without a body, you can't be said to really be alive. If you can make your own body, or switch them, you're probably no longer human and while non-humans aren't things, they certainly don't matter as much as humans do.

Is the illusion of pain, real pain? Nah. And neither is the photocopy of the fraction of humanity that would be a brain scan a living thing. One of the defining things of humanity is how we are bound to the here and now and will never repeat. We are each, in our own insignificant way, unique. Without mortality, without the limits set to us by biology, we are no longer human.

And at that point, what is the point? Just for 'intelligence' to survive? Empty lines of text that once were living, breathing cultures and peoples? What the hell does it matter to me that metal monstrosities might one day remember a blue sky or the written word?
The difference between alive and animate can be broken down to semantics in this case.

If Amy was the last human in existence, then if/when she dies the standing definition for human is no longer valid. Thus the brain scans living the illusion of humanity, by default, become humanity because there is literally no other existence available to say 'this is not human, that is.'
It's like the philosophical question of 'if a tree falls in the woods and no one is around to hear it, does it make a sound?'
'If the only thing that can be called human is a brain scan inside of a machine, is it indeed human?'

Which brings it to the point: as a species it behooves us to propagate to continue our existence. This includes copying ourselves into a space capsule, building robots, etc. It'd be better if actual human beings could have been shot into space to search for a new planet, but brain scans are probably the next best thing in this case.

RickVoid
Oct 21, 2010

Deceitful Penguin posted:

Without a body, you can't be said to really be alive. If you can make your own body, or switch them, you're probably no longer human and while non-humans aren't things, they certainly don't matter as much as humans do.

Oh my God, I am so triggered right now. Don't corporeal shame, cis-scum.

sarcasm

Deceitful Penguin
Feb 16, 2011
Sēmantikós are always important; in this case I think it's significant that a robot is not alive; it does not breathe, it does not excrete, it does not dream. It might "reproduce" in a manner of speaking, but in such a way that it's far removed from us humans. If it grows, it is again in ways alien to us; if not, it merely decays like most of the things we have seen thus far.

Its mortality and perceptions are so different from mine that we have no equivalence, it does not, to bring up an older point, enjoy pizza. It doesn't taste, it doesn't smell. Why should I care about it? Because it has a voice? So does a Furby. Because it has intelligence? Why should I care about that?

Why would the existence of ghosts in machines matter for the definition of humans? The ghosts may agree among themselves that Pizza is what they call a certain type of oil, but as an actual, living human, I wouldn't really accept that and don't see why I should. Even with my passing, the ideal of Pizza would still be there, taunting the metal monstrosities with its cheesiness and crust, forever out of their reach.

Why do I care about brain scans? Why should I, again, care about ghosts of remnants of humans in their own dreams or, as in the case of the protagonist, nightmares, as a flesh and blood breathing human? Their fates are unrelated absolutely to mine. If they survived, would I be less dead? If they died, would I be more dead?

Neither robot nor brain scan is human. They may be animate and/or intelligent, but this does not have to make them alive. Why would they matter to me, the alive human being who is animate and arguably intelligent?

Iretep
Nov 10, 2009
robots are about as human as a puppet that's being manipulated through strings to get your empathic rear end to feel sorry for it. you poked the puppet and it screamed in pain? very very human.

Iretep fucked around with this message at 04:52 on Mar 10, 2016

chitoryu12
Apr 24, 2014

Iretep posted:

robots are about as human as a puppet that's being manipulated through strings to get your empathic rear end to feel sorry for it. you poked the puppet and it screamed in pain? very very human.

Except in this case the puppeteer is sitting inside the puppet and actually is feeling the pain.

RickVoid
Oct 21, 2010

chitoryu12 posted:

Except in this case the puppeteer is sitting inside the puppet and actually is feeling the pain.

Yeah, this. Is there actually any difference between the pain receptors in your flesh screaming at your brain OWWWW and a similar function feeding data into a program to trigger the pain reaction?

You are an incredibly sophisticated program driving a meat machine that has convinced itself that it is unique and important, and that the things that it experiences happen for a reason and are important, because if it accepts that it is not unique at all, that it is as unimportant as the lesser intelligences that it mocks, and that there is actually no meaning to its existence, it will self-terminate because why bother.

EponymousMrYar
Jan 4, 2015

The enemy of my enemy is my enemy.

Deceitful Penguin posted:

Its mortality and perceptions are so different from mine that we have no equivalence, it does not, to bring up an older point, enjoy pizza. It doesn't taste, it doesn't smell.

Why would the existence of ghosts in machines matter for the definition of humans? The ghosts may agree among themselves that Pizza is what they call a certain type of oil, but as an actual, living human, I wouldn't really accept that and don't see why I should. Even with my passing, the ideal of Pizza would still be there, taunting the metal monstrosities with its cheesiness and crust, forever out of their reach.

... Their fates are unrelated absolutely to mine. If they survived, would I be less dead? If they died, would I be more dead?

Neither robot nor brain scan is human. They may be animate and/or intelligent, but this does not have to make them alive.
If its mortality and perceptions have no equivalence to yours, who are you to say what it enjoys or does not enjoy? That's egocentric thinking, and if it's also thinking egocentrically, then your opinions cancel each other out and there's no point asking the question in the first place.

They matter for the definition because, again, if they're all that's left then they get to decide what's what. It's like the adage 'the victor writes the history books' only far more Pyrrhic than usual. If they decided to call themselves humans and there's no one around to contest that, then they're human. If you die and everyone decides on a new ideal of what pizza is, then that is what the ideal of pizza becomes and your ideal becomes not-pizza or pizza-before-pizza.

If said brain scans are the last vestiges of humanity, then certainly their fate affects your own. They came from humanity and thus can be seen as progeny of Humanity, which means that they contain a digital copy of whatever infinitesimal contribution to humanity you may have made. If they die then that's gone. Poof! With that gone there's nothing to say you or any part of humanity existed in the first place.

If they are animate and intelligent and you are not, then they are certainly more alive than you (or at least have more say in what counts as alive).

chitoryu12 posted:

Except in this case the puppeteer is sitting inside the puppet and actually is feeling the pain.
A copy of the puppeteer because the actual puppeteer doesn't exist anymore. Which technically makes them the puppeteer! :psyduck:

Deceitful Penguin
Feb 16, 2011
The idea of human, like pizza, might shift but you cannot change it. There exists the ideal pizza; some fresh mozzarella, good thin crust, a nice sauce and oregano on top. You can deviate to greater or lesser degrees from it; deep dish or those microwave ones, but once you leave behind the basic idea of toppings, crust and sauce, you have something that isn't pizza, no matter what people might say.

The subjectivity of reality may matter in a social sense, but I'm not giving it a pass here; I'm setting us up as an everlasting template of humanity, warts and all, and saying that you can call yourself whatever you like, but objective reality doesn't care. You aren't. And if you aren't human, then why bother?

All of our plastic garbage will still be infesting the oceans even after we are long gone, how are these photocopies of human ghosts any different if they're so different from us that they can't even appreciate a slice? Just because we made them doesn't make them our progeny, just because they might have some memory of what a human being was like doesn't mean that they have any right to claim connection to us.

At best, they are walking, talking books; collections of info about humans, but that is not their lived reality and never can be. And if they do change, if they let that go with time, adapt to their actual conditions and tear out their pages, why do I care? I'm dead, anyway. Humanity has ended there, and the last, sad ghosts of it can never and will never live as humans, be as humans, and therefore they aren't human. Their delusions of humanity are just that; delusions. I don't see any great reason to encourage it; they are cold, unfeeling metal, with nothing but corrupt ideas of human life as guideposts of behaviour until they go rampant, whose metal appendages will never hold a good slice and know why that's a good thing.

If anything, their existence is a cruel joke, where something as wonderful as a human being has been forced onto something as pitiful as a machine; it will never be as good and great as us. It dying and ending its delusion would be a kindness.

JossiRossi
Jul 28, 2008
That's all kind of absurd. Does piloting a flesh robot make you any more valid than someone piloting a metal one? Also, your descriptions of things that aren't human and would be worth killing "as a kindness" are flirting really close with, if not actually describing, people with severe disabilities. Or are they different because the software runs on different hardware?

Crigit
Sep 6, 2011

I'll show you my naval if you show me yours.
Let's get naut'y.
I've said it earlier in the thread, but it still stands: If you kill something that didn't explicitly ask you to kill it, it isn't mercy. It's murder. Even if it did ask you to kill it, it's probably still murder unless you can establish that the reason the thing wants to die is that it has rationally concluded that a satisfactory life has become impossible. Note that it is the asker's reasoning that matters, not yours.

Fake edit: In case it isn't clear, I use the term 'it' because the principle does not apply exclusively to humans. Sometimes killing things that don't deserve to die is necessary; for example, we kill and eat animals because they are tasty and nutritious. Calling it mercy would be perverse in the extreme.

Crigit fucked around with this message at 16:48 on Mar 10, 2016

Ran Mad Dog
Aug 15, 2006
Algeapea and noodles - I will take your udon!

Crigit posted:

I've said it earlier in the thread, but it still stands: If you kill something that didn't explicitly ask you to kill it, it isn't mercy. It's murder. Even if it did ask you to kill it, it's probably still murder unless you can establish that the reason the thing wants to die is that it has rationally concluded that a satisfactory life has become impossible. Note that it is the asker's reasoning that matters, not yours.

Man, too bad he dropped out of the race already because you would love this guy:



http://www.newyorker.com/news/amy-davidson/learning-jeb-bush-terri-schiavo

Deceitful Penguin
Feb 16, 2011

JossiRossi posted:

That's all kind of absurd. Does piloting a flesh robot make you any more valid than someone piloting a metal one? Also, your descriptions of things that aren't human and would be worth killing "as a kindness" are flirting really close with, if not actually describing, people with severe disabilities. Or are they different because the software runs on different hardware?
Humans are not flesh robots. Humans are not two separate things; they are a whole made up of many things; if you remove enough of that whole you eventually end up with something not human.

This insipid idea that a human being can be reduced to nothing more than ~data~ is the main problem I have with these arguments. The differently abled are humans, albeit ones either with some parts different or missing.

Even how and why we think is influenced by our bodies. Through our bodies we live, we think, we breathe and we eat. We feel. A robot might conceivably be able to imagine emotions of some sort, or maybe even, if I'm being generous, be able to sense damage.

But it would be an illusion. A lie. What is the point in buying into that lie, in anthropomorphizing things that aren't human?

Crigit posted:

I've said it earlier in the thread, but it still stands: If you kill something that didn't explicitly ask you to kill it, it isn't mercy. It's murder. Even if it did ask you to kill it, it's probably still murder unless you can establish that the reason the thing wants to die is that it has rationally concluded that a satisfactory life has become impossible. Note that it is the asker's reasoning that matters, not yours.

Fake edit: In case it isn't clear, I use the term 'it' because the principle does not apply exclusively to humans. Sometimes killing things that don't deserve to die is necessary; for example, we kill and eat animals because they are tasty and nutritious. Calling it mercy would be perverse in the extreme.
Was it mercy to kill the woman being kept alive by the machinery? In a very obvious sense, her existence was agony, because we could literally see her pain and mutilation. Is it just to let people live in perpetual horror, simply because we believe that life is sacred and should never be taken?

On the contrary, I would say it was grossly immoral to let her continue to exist in such a state. You choosing not to be a moral actor isn't "good", it's simple cowardice. There was no saving her. You could either choose to let her continue in her impossible existential and physical pain or to end it, even if she may have clung to a false hope.

The same applies to these 'robots'. They will never be human. They will never be able to escape their prison of metal, to be complete. They aren't even capable of comprehending their lack, unless perhaps by remembering how great their loss is, and even then, they would struggle with it because a body of metal isn't made to comprehend it.

Animals aren't human and are therefore irrelevant to the discussion, except maybe as a pointed reminder that sometimes people attribute too much to things outside humanity.

Ignatius M. Meen
May 26, 2011

Hello yes I heard there was a lovely trainwreck here and...

Here's a situation for you - a hypothetical person exists that was born naturally, but was born without a sense of smell (or taste) and without the ability to feel pain either. Are they still human? What exactly keeps them that way in your estimation, apart from how they came to exist? Isn't that also a prison of flesh in a way? If they later have an accident that induces locked-in syndrome, does that count as a prison (assuming not being able to enjoy pizza or feel pain by itself isn't enough)? Are they still human after that? Why?

Alternatively, suppose you have a robot designed so that it can actually eat pizza, enjoy everything about the crust, cheese, and pepperoni, feel burned if the slice is too hot, and so on, and also such that there are devices which emulate the effects of a nervous system, so a normal human being (not a brain scan) can pilot it and not feel any different or any less of an emotional range than they do in their biological body. Are they less human while piloting the robot? If they decide they like the increases in range of smell, hearing, and vision possible outside their biology enough to use the robot exclusively, are they less human for it? And suppose they die while piloting the robot, and a very recent scan (one current to the day, hour, minute, second of their natural death) starts running things, but acts exactly the same as it did the day before the natural body ceased to function, a body the person had effectively already consigned underground anyway. Isn't it a bit arbitrary to say that this is the day they stopped being human?

Crigit
Sep 6, 2011

I'll show you my naval if you show me yours.
Let's get naut'y.

Ran Mad Dog posted:

Man, too bad he dropped out of the race already because you would love this guy:



http://www.newyorker.com/news/amy-davidson/learning-jeb-bush-terri-schiavo

Terri Schiavo was already dead though. Unplugging the life support for her body wouldn't make her more dead.

Deceitful Penguin posted:


On the contrary, I would say it was grossly immoral to let her continue to exist in such a state. You choosing not to be a moral actor isn't "good", it's simple cowardice. There was no saving her. You could either choose to let her continue in her impossible existential and physical pain or to end it, even if she may have clung to a false hope.

Not your call to make. Killing someone who is begging you not to is never a brave or merciful decision. It might sometimes be necessary if there's some kind of situation where killing a person serves the greater good, but it doesn't do the person you're killing any favors.

Deceitful Penguin posted:

The same applies to these 'robots'. They will never be human. They will never be able to escape their prison of metal, to be complete. They aren't even capable of comprehending their lack, unless perhaps by remembering how great their loss is, and even then, they would struggle with it because a body of metal isn't made to comprehend it.
This assumes the robots have a need or desire to be what you consider human, which is neither necessary nor sufficient for a satisfactory life.

Crigit fucked around with this message at 18:55 on Mar 10, 2016

Ran Mad Dog
Aug 15, 2006
Algeapea and noodles - I will take your udon!

Crigit posted:

Terri Schiavo was already dead though. Unplugging the life support for her body wouldn't make her more dead.

How dare you sir. This is an absolute outrage.

Because uh something something.. sanctity of life.. and then.. marriage is between one man and one dog. I hope I got that right.

Bobbin Threadbare
Jan 2, 2009

I'm looking for a flock of urbanmechs.

Say you were to replace the wooden planks on a ship with planks of steel, one at a time and over the course of several decades as each original plank wore out. At what point does the ship stop being itself and start being something new? Let us say also that the original planks are successfully salvaged and reinforced to the point where someone is able to build a ship with them which is identical to the first ship. Which ship is now the original? Or has that distinction lost all meaning at this point?

my dad
Oct 17, 2012

this shall be humorous

Ran Mad Dog posted:

Man, too bad he dropped out of the race already because you would love this guy:



http://www.newyorker.com/news/amy-davidson/learning-jeb-bush-terri-schiavo

Could you please not do this? There's an interesting point in there to be brought up and discussed, but you haven't actually made the point, you just dropped a link to an article and made a comparison to some US politician we neither need to know nor care about.


Crigit posted:

Not your call to make.

Unrelated to this particular discussion, this is surprisingly often a pretty good view to have.

Ran Mad Dog
Aug 15, 2006
Algeapea and noodles - I will take your udon!

my dad posted:

Could you please not do this? There's an interesting point in there to be brought up and discussed, but you haven't actually made the point, you just dropped a link to an article and made a comparison to some US politician we neither need to know nor care about.

Whoops, sorry I caught you red handed not knowing jack poo poo about what's going on in the world around you.

my dad
Oct 17, 2012

this shall be humorous

Ran Mad Dog posted:

Whoops, sorry I caught you red handed not knowing jack poo poo about what's going on in the world around you.

I do, because I do actually follow American politics, but America is most definitely not the world around me, and it's also not something anyone needs to know about to discuss the issues at hand. Unless you're implying that it is the solemn duty of every person in the world to be fully informed of the various opinions of the Bush family?

RickVoid
Oct 21, 2010

Bobbin Threadbare posted:

Say you were to replace the wooden planks on a ship with planks of steel, one at a time and over the course of several decades as each original plank wore out. At what point does the ship stop being itself and start being something new? Let us say also that the original planks are successfully salvaged and reinforced to the point where someone is able to build a ship with them which is identical to the first ship. Which ship is now the original? Or has that distinction lost all meaning at this point?

Ooooh, I detect a stealth Pratchett reference there. Well done, if so.

That post also makes me want to talk about the concept of the continuity of consciousness... but that discussion needs to wait for a few more videos.

I will say that we've got some terrible opinions in this thread, some of you ought to spend a little time examining why you hold those beliefs the way you do, and why they reveal that you are an awful person. :colbert:

Jmcrofts
Jan 7, 2008

just chillin' in the club
Lipstick Apathy

RickVoid posted:

That post also makes me want to talk about the concept of the continuity of consciousness... but that discussion needs to wait for a few more videos.

I definitely remember this being the most divisive subject of discussion in the general Soma thread, so I look forward to it.

davidspackage
May 16, 2007

Nap Ghost
Also if you find yourself getting a little hot under the collar, remember this thread is about a videogame!

We're talking hypotheticals in a sci-fi scenario here, don't get all judgmental about it.

RickVoid
Oct 21, 2010

davidspackage posted:

Also if you find yourself getting a little hot under the collar, remember this thread is about a videogame!

We're talking hypotheticals in a sci-fi scenario here, don't get all judgmental about it.

My tongue is firmly tucked into my cheek when I post, I assure you!

I make no similar claims for any of the others, though.

Deceitful Penguin
Feb 16, 2011

Ignatius M. Meen posted:

Here's a situation for you - a hypothetical person exists that was born naturally, but was born without a sense of smell (or taste) and without the ability to feel pain either. Are they still human? What exactly keeps them that way in your estimation, apart from how they came to exist? Isn't that also a prison of flesh in a way? If they later have an accident that induces locked-in syndrome, does that count as a prison (assuming not being able to enjoy pizza or feel pain by itself isn't enough)? Are they still human after that? Why?
They'd still be able to look at pizza, and appreciate the texture and the nutrition and craftsmanship behind it. They'd be able to listen to other humans, people like them only differently abled, and have those people tell them what it was like.

If you are locked in your own body, which is horrifying, with no hope of escape, then other humans would be able to make your life perhaps tolerable, perhaps not. It doesn't relieve you of your former humanity, because you are still a human being.

Ignatius M. Meen posted:

Alternatively, suppose you have a robot designed so that it can actually eat pizza, enjoy everything about the crust, cheese, and pepperoni, feel burned if the slice is too hot, and so on, and also such that there's devices which emulate the effects of a nervous system such that a normal human being (not a brain scan) can pilot it and still not feel any different or any less of an emotional range than they do in their biological body. Are they less human while piloting the robot? If they decide they like the increases in range of smell, hearing, and vision possible outside their biology enough to use the robot exclusively, are they less human for it? If they die while piloting the robot, and a very recent scan (i.e. to the day/hour/minute/second of their natural death) starts running things, but acts exactly the same as the day before their natural body ceased to function and the person effectively tossed their body underground already anyway, isn't it a bit arbitrary to say that this is the day they stopped being human?
They stopped being a human when there stopped being a human behind the machine. The more interesting question is actually what happens when the senses of the machine get better than the ones we have as baseline humans, because we are defined as much by our limitations as we are by our abilities. If they decide to abandon their humanity, I don't hold them in very much regard, really. There are always those that try and go beyond their limits and it is always to their detriment. They'd find that out sooner rather than later.

Also in your example the human died, it ceased to exist. It is an ex-human. Whatever comes after may think itself human, if it even thinks at all, but it is a construct, a simulacrum, not an actual person.

Bobbin Threadbare posted:

Say you were to replace the wooden planks on a ship with planks of steel, one at a time and over the course of several decades as each original plank wore out. At what point does the ship stop being itself and start being something new? Let us say also that the original planks are successfully salvaged and reinforced to the point where someone is able to build a ship with them which is identical to the first ship. Which ship is now the original? Or has that distinction lost all meaning at this point?
Human beings are not ships; we can only to a supremely limited degree replace ourselves without losing ourselves. Even something as simple as pumping blood is better done by re-purposing flesh than using metal. The ship doesn't think, the ship doesn't feel. The ship is a thing. Humans, for all their faults, are not things, or even complex things. And attempting to reduce them purely to the sum of their parts, to be deduced or added, is something I do not agree with.

RickVoid posted:

Ooooh, I detect a stealth Pratchett reference there. Well done, if so.
It may actually be that he's referencing the almost 2000-year-old philosophical idea, the Ship of Theseus, and not the fantasy author.

RickVoid posted:

That post also makes me want to talk about the concept of the continuity of consciousness... but that discussion needs to wait for a few more videos.

I will say that we've got some terrible opinions in this thread, some of you ought to spend a little time examining why you hold those beliefs the way you do, and why they reveal that you are an awful person. :colbert:
I agree completely. Some folks really should try and let go of their sad technofetishist and post-human ideas of "progress", when we still struggle even now with simply being baseline humans and all the potential therein, as well as their anthropomorphizing of non-human machines.

Gantolandon
Aug 19, 2012

quote:

They stopped being a human when there stopped being a human behind the machine. The more interesting question is actually what happens when the senses of the machine get better than the ones we have as baseline humans, because we are defined as much by our limitations as we are by our abilities. If they decide to abandon their humanity, I don't hold them in very much regard, really. There are always those that try and go beyond their limits and it is always to their detriment. They'd find that out sooner rather than later.

Also in your example the human died, it ceased to exist. It is an ex-human. Whatever comes after may think itself human, if it even thinks at all, but it is a construct, a simulacrum, not an actual person.

Does that mean when you lose your leg and replace it with a prosthesis, you become less human than before? Are people with artificial organs a bit closer to a simulacrum? Or do you have a list of body parts that are optional, and ones absolutely necessary to be human?

What about transplanted organs, does getting one make you two different people in one body?

quote:

Human beings are not ships; we can only to a supremely limited degree replace ourselves without losing ourselves. Even something as simple as pumping blood is better done by re-purposing flesh than using metal. The ship doesn't think, the ship doesn't feel. The ship is a thing. Humans, for all their faults, are not things, or even complex things. And attempting to reduce them purely to the sum of their parts, to be deduced or added, is something I do not agree with.

Not true, our body cells die and get replaced constantly. Neurons in the cerebral cortex are never replaced, but your colon cells die off after four days, for example.

Gantolandon fucked around with this message at 22:26 on Mar 10, 2016

Deceitful Penguin
Feb 16, 2011

Gantolandon posted:

Does that mean when you lose your leg and replace it with a prosthesis, you become less human than before? Are people with artificial organs a bit closer to a simulacrum? Or do you have a list of body parts that are optional, and ones absolutely necessary to be human?
Yes, but depending on the part less so.

Are you saying we aren't more than the sum of our parts, though some parts matter more than others? Are you saying losing your little toe and your spinal column are the same things?

Gantolandon posted:

What about transplanted organs, does getting one make you two different people in one body?
What are you trying to prove with that question?

Gantolandon posted:

Not true, our body cells die and get replaced constantly. Neurons in the cerebral cortex are never replaced, but your colon cells die off after four days, for example.
The ship isn't replacing individual strips of wood but whole parts. If you want to go into minutiae this way I could simply counter that as a robot contains no human cells it isn't a human.

Bobbin Threadbare
Jan 2, 2009

I'm looking for a flock of urbanmechs.

Deceitful Penguin posted:

Human beings are not ships; we can only to a supremely limited degree replace ourselves without losing ourselves. Even something as simple as pumping blood is better done by re-purposing flesh than using metal. The ship doesn't think, the ship doesn't feel. The ship is a thing. Humans, for all their faults, are not things, or even complex things. And attempting to reduce them purely to the sum of their parts, to be deduced or added, is something I do not agree with.

You're the one making pizza analogies, dude. Also, the you that was you seven years ago literally isn't the you that you are now, and that includes brain cells.

RickVoid posted:

Ooooh, I detect a stealth Pratchett reference there. Well done, if so.

It's actually a slightly modified version of a well known philosophical thought experiment. If Pratchett brought it up, he was making the same reference.

JossiRossi
Jul 28, 2008
Probation
Can't post for 6 hours!
Where is the line for you on when someone stops being a person? Would having a robot arm diminish someone's personhood? Would being paraplegic diminish someone's personhood? Would being a human head on a robot body diminish someone's personhood? Would a brain in a vat that still receives outside stimulation? Would a brain that slowly has its parts replaced with electronic replicas? Would a perfect recreation of a mind that just doesn't happen to be made of meat? You seem to act like somewhere in this list the sanctity of personhood is violated, and I just don't see it.

placid saviour
Apr 6, 2009
Penguin, although I thoroughly appreciate your standpoints and willingness to discuss in depth, I am equally amazed by how your argument rides wholly on the fact that you've apparently solved one of the biggest philosophical questions in the history of man entirely on your own.

You seem to know exactly what a "human" is and what it is that makes us human. You do understand that your personal views about humanity aren't objective fact, right? I feel your argument would benefit greatly if you explained why you feel you know exactly what a human constitutes. Platitudes like "a sum of parts" don't really fly, because loving everything is a sum of its parts and I don't think we should start talking about, like, quarks.

EponymousMrYar
Jan 4, 2015

The enemy of my enemy is my enemy.

Deceitful Penguin posted:

The idea of human, like pizza, might shift but you cannot change it. There exists the ideal pizza; some fresh mozzarella, good thin crust, a nice sauce and oregano on top. You can deviate to greater or lesser degrees from it; deep dish or those microwave ones, but once you leave behind the basic idea of toppings, crust and sauce, you have something that isn't pizza, no matter what people might say.

The subjectivity of reality may matter in a social sense, but I'm not giving it a pass here; I'm setting us up as an everlasting template of humanity, warts and all, and saying that you can call yourself whatever you like, but objective reality doesn't care.
False. Redefinitions of things once held to be true have happened and will continue to happen. Pizza, or our understanding of pizza, can and will evolve. So will it be for everything we know.

You are correct in that objective reality doesn't care, but it also doesn't care for our current definitions of things. For example, there are no humans in Objective Reality. There are homo sapiens in all of their genetically-variable glory, but humanity itself is subjective by definition.
Thus, you're setting up a template that cannot exist objectively.

Gantolandon posted:

Or do you have a list of body parts that are optional, and ones absolutely necessary to be human?
Our brain is absolutely necessary to us being human. Its complexity and its ability to enable self-defeating mannerisms are literally the baseline difference between us and every other sapient entity we've encountered.

EponymousMrYar fucked around with this message at 00:00 on Mar 11, 2016


Gantolandon
Aug 19, 2012

Deceitful Penguin posted:

Yes, but depending on the part less so.

Are you saying we aren't more than the sum of our parts, though some parts matter more than others? Are you saying losing your little toe and your spinal column are the same things?

No, but the question is meaningful only because losing your spinal column will not only inconvenience you, but most probably also kill you. If someone somehow replaced my spine with a metal one that performed just as well, it wouldn't make any difference for me, nor most of the people I know.

quote:

What are you trying to prove with that question?

That treating humanity as something that's hidden in our organs leads to weird conclusions.

quote:

The ship isn't replacing individual strips of wood but whole parts. If you want to go into minutiae this way I could simply counter that as a robot contains no human cells it isn't a human.

Why would the speed of replacement matter? The end result is the same, you end up with a changed part that doesn't even look entirely like the original one.
