|
maskenfreiheit posted:yes, kidnapping is a less serious crime than murder. it's a bit scary you don't seem to understand this... In real life, if you were kidnapped and being tortured and had to kill your kidnapper to escape, we call that self defense and we don't lock people up for that. Now imagine instead of outright killing your kidnapper, you just ran past him and locked him in the room you escaped from, figuring that he had a way out of it. But oops, turns out he didn't and he dies of starvation in there. Still self defense, just as much so if you had choked the life out of him to escape. In the episode, the cookies have no way out of their kidnap torture hell other than this one plan, and the unfortunate side effect is his death. It's still self defense.
|
# ? Mar 3, 2018 04:15 |
|
maskenfreiheit posted:yes, kidnapping is a less serious crime than murder. it's a bit scary you don't seem to understand this... So child-killing Daly is the worse one. Is that the point?
|
# ? Mar 3, 2018 04:29 |
|
WampaLord posted:Now imagine instead of outright killing your kidnapper, you just ran past him and locked him in the room you escaped from, figuring that he had a way out of it. But oops, turns out he didn't and he dies of starvation in there. Still self defense, just as much so if you had choked the life out of him to escape. i mean, if you were out in the woods and it took several days to find help and he died of thirst while you're drinking from streams, yes. if you're found and refuse to tell the cops where he is, yes, that is manslaughter. his death was preventable - they could have messaged someone - that was a major plot point earlier in the show.
|
# ? Mar 3, 2018 04:35 |
|
Joe Chill posted:So child-killing Daly is the worse one. Is that the point? yes, i'm sorry i've spawned this huge derail but that's my point: that daly was wronged, tried to find a nonviolent outlet, and then was killed for it. and that maybe that murder is worse than... abusing sims.
|
# ? Mar 3, 2018 04:36 |
|
maskenfreiheit posted:if you're found and refuse to tell the cops where he is, yes, that is manslaughter No, that's obstruction of justice at most. Also, they had no idea that what they were doing would leave him trapped in the game world. You make it seem like it's obvious that them doing the wormhole plan will lead straight to his death, but the line of causality is not that simple. It's not murder, no matter how many times you try to frame it that way.
|
# ? Mar 3, 2018 04:45 |
|
WampaLord posted:No, that's obstruction of justice at most. creating circumstances that lead to someone's death, like locking them in a room they starve in, is textbook negligent homicide, usually codified as involuntary manslaughter. feel free to provide a citation if you want to continue to assert otherwise.
|
# ? Mar 3, 2018 05:13 |
|
maskenfreiheit posted:creating circumstances that lead to someone's death, like locking them in a room they starve in, is textbook negligent homicide, usually codified as involuntary manslaughter. feel free to provide a citation if you want to continue to assert otherwise Not if done in self defense: that's been my entire point, they're acting out of self defense. Again, if you were being kidnapped and tortured, you are LEGALLY ALLOWED to kill your kidnapper if it helps you get free.
|
# ? Mar 3, 2018 05:21 |
|
WampaLord posted:Not if done in self defense, that's been my entire point, they're acting out of self defense. yes, in the moment. once you are free, you don't have that right. the threat was over and they were free. they had an obligation to mitigate the damages at that point. just like you can shoot a burglar in your apartment, but if they run away you can't follow them outside and shoot them in the parking lot. i'm going to assume you're pretending to not understand this to troll me, and won't be commenting further if you can't provide a source to back up your assertion. have a nice night!
|
# ? Mar 3, 2018 05:27 |
|
maskenfreiheit posted:the threat was over and they were free. they had an obligation to mitigate the damages at that point. What damages? How could they possibly know? maskenfreiheit posted:have a nice night! Please don't do this cutesy poo poo.
|
# ? Mar 3, 2018 05:29 |
|
WampaLord posted:What damages? How could they possibly know? They mentioned they knew he was trapped in the game. And please don't swear at me, I'm being civil. Gnight for real this time
|
# ? Mar 3, 2018 05:47 |
|
maskenfreiheit posted:ok, let's concede they're people (i don't, but let's for now) If you don't concede they are people, then they can't be liable for manslaughter; Daly basically committed suicide via the programme he created. If you do concede they are people, then gently caress Daly for torturing them and killing a child. They might have committed manslaughter, but he's committed murder and they were acting in self defence. Please stop being wrong and go away.
|
# ? Mar 3, 2018 11:53 |
|
BSam posted:If you don't concede they are people, then they can't be liable for manslaughter, Daly basically committed suicide via the programme he created. This right here. These goalposts you're moving? Wherever they go, your point is wrong. They're either people who defended themselves (and that would also make Daly a crazy murderer, for the kid stuff) or they're not, so they can't be held responsible.
|
# ? Mar 3, 2018 15:32 |
|
USS Callister does have kind of a "and then what?" ending. The AI clones get to truck along happily in cyberspace, sure, but what happened to the real life Your Mother? She's never gonna get any closure on her cloud hacker, and the apartment she was blackmailed into breaking in to ended up with a guy dead from frybrains. She is gonna think she murdered a dude who, to her knowledge, never wronged her, and probably there will be forensic evidence in his apartment that tells the same story.
|
# ? Mar 3, 2018 15:44 |
|
Zulily Zoetrope posted:USS Callister does have kind of a "and then what?" ending. The AI clones get to truck along happily in cyberspace, sure, but what happened to the real life Your Mother? She's never gonna get any closure on her cloud hacker, and the apartment she was blackmailed into breaking in to ended up with a guy dead from frybrains. She is gonna think she murdered a dude who, to her knowledge, never wronged her, and probably there will be forensic evidence in his apartment that tells the same story. and justice will prevail
|
# ? Mar 3, 2018 23:50 |
|
I really love the discussions USS Callister spawns, so much more entertaining than the actual episode. BTW, make sentient AIs and torture them all you want, they're just code.
|
# ? Mar 4, 2018 00:46 |
|
Grem posted:I really love the discussions USS Callister spawns, so much more entertaining than the actual episode. But...but they're sentient
|
# ? Mar 4, 2018 03:08 |
|
A. Beaverhausen posted:But...but they're sentient And not AIs, they are a 100% copy of a person's brain in a virtual online multiplayer game, they were not created/coded from nothing. Imagine putting a digital copy of a person's brain into another body cloned from their DNA; they would be indistinguishable from their original self.
|
# ? Mar 4, 2018 04:44 |
|
Actually, saliva doesn't actually contain enough information to replicate a person and all their memories. All of that is just roleplaying the Captain does to believe in the fiction of torturing his coworkers. In reality they're just code made to approximate their personalities and attempt escape only so the Captain can catch them. He programmed them too well and they killed him.
|
# ? Mar 4, 2018 05:05 |
|
That is not the text. You didn't watch the episode.
|
# ? Mar 4, 2018 05:37 |
|
BSam posted:If you don't concede they are people, then they can't be liable for manslaughter, Daly basically committed suicide via the programme he created. replied subsequently but not to this. what a surprise
|
# ? Mar 4, 2018 05:53 |
|
Joe Chill posted:And not AIs, they are a 100% copy of a person's brain in a virtual online multiplayer game, they were not created/coded from nothing. If you pull the plug on the servers they don't die, they just disappear. They're not people. e: I'm also okay saying dude committed suicide via his own super smart Rube Goldberg machine
|
# ? Mar 4, 2018 11:44 |
|
So do you think the torturing they go through is ok in White Christmas since they aren't real?
|
# ? Mar 4, 2018 12:12 |
|
Grem posted:If you pull the plug on the servers they don't die, they just disappear. They're not people.
|
# ? Mar 4, 2018 12:23 |
|
Mu Zeta posted:So do you think the torturing they go through is ok in White Christmas since they aren't real? Yea, and it's a pretty cool way to manage your house, too. Fake me probably makes a mean toast. e: they can't sleep, real people can sleep, they're code, and because of that they are able to be manipulated by just switching some poo poo around on a keyboard, so it's ok to have them open your blinds in the morning. Grem fucked around with this message at 13:00 on Mar 4, 2018 |
# ? Mar 4, 2018 12:54 |
|
Mantis42 posted:Actually, saliva doesn't actually contain enough information to replicate a person and all their memories. All of that is just roleplaying the Captain does to believe in the fiction of torturing his coworkers. In reality they're just code made to approximate their personalities and attempt escape only so the Captain can catch them. He programmed them too well and they killed him. So Daly knew about Cole's secret cloud pictures too? Because he would have to know about those super secret pictures to program their memories. That does not make any sense. Mantis42 posted:Actually, saliva doesn't actually contain enough information to replicate a person and all their memories. Actually, saliva does not have ANY memories! It's like this show is fantasy or something! Grem posted:If you pull the plug on the servers they don't die, they just disappear. They're not people. What happens to your mind when you die? Is ceasing to exist unlike dying? Why does that make them not people? If their mind was transferred into a clone body, then died, wouldn't they be a person then by your definition?
|
# ? Mar 4, 2018 13:08 |
|
Grem posted:Yea, and it's a pretty cool way to manage your house, too. Fake me probably makes a mean toast. Dude, that poor lady was traumatized and broken by the time she was the house servant. They 'feel' and 'think'. I think, therefore I am.
|
# ? Mar 4, 2018 19:16 |
|
If the digital copies aren't real people, then why was the episode largely from their point of view? What's the point of seeing what happens to them after they go through the wormhole?
|
# ? Mar 4, 2018 19:44 |
|
The thinking they do is just a very close approximation to what the human they're programmed after is able to do. Do you guys think that the robot in I'll Be Right Back was human? All it could do was what it learned about the thing it was copying. Same thing for the things in White Christmas and USS Callister, they're just a bit more advanced.
|
# ? Mar 4, 2018 20:19 |
|
Grem posted:The thinking they do is just a very close approximation to what the human they're programmed after is able to do. Do you guys think that the robot in I'll Be Right Back was human? All it could do was what it learned about the thing it was copying. Same thing for the things in White Christmas and USS Callister, they're just a bit more advanced. I think the jump in technology is a fair bit more than that, with regards to Be Right Back vs White Christmas / Callister (and also Black Museum). Edit: I mean, the thing in Be Right Back was an approximation, not able to really think or feel on its own, and didn't really have the same self-awareness as the other examples. It was closer to what we have today with AI. Rupert Buttermilk fucked around with this message at 20:34 on Mar 4, 2018 |
# ? Mar 4, 2018 20:32 |
|
Not far enough to make a real human though. I'll go ahead and put a qualifier on it: if they have an episode where they copy a day-old baby and it grows to develop new feelings and emotions, like the existential dread of an eternity of silent isolation, then I'll think it's sentient. Right now, the things in USS Callister fear things like the death of a loved one because the people they're copied from already have an idea of what that emotion is like.
|
# ? Mar 4, 2018 20:37 |
|
Grem posted:Right now, the things in USS Callister fear things like the death of a loved one because the people they're copied from already have an idea of what that emotion is like. And this makes torturing them by killing their loved ones in front of them ok?
|
# ? Mar 4, 2018 23:58 |
|
I don't get upset when people do hosed up things to characters in Skyrim, either.
|
# ? Mar 5, 2018 01:15 |
|
Grem posted:I don't get upset when people do hosed up things to characters in Skyrim, either. When those characters eventually become more complex than you are, with deeper and more meaningful things going on in their virtual lives than you have in your own, they'll be able to say the same of you. For now they're not and that's fine. The "sentience debate" is just the slavery debate all over again and that's already pretty well explored. Everyone wants all sorts of slaves making them happy, and no one wants to be one. We can design artificial slaves that have extremely specialized and transient existences, blissfully unable to consider what sort of slave they are (like those easily coded Skyrim NPCs you've got dancing around and killing each other), but if what they're doing gives them complex lives that are more meaningful than our own then the above paragraph applies. People who fearmonger about AI taking over the world are exactly the same as anyone else in history who was terrified of a potential slave uprising. Yeah, no poo poo no one wants to be on the receiving end of a slave uprising, but it's probably better to not create a world full of pissed off slaves in the first place. An AI is overworked and abused if it's too smart for the task it's doing. It makes a difference whether the AI is something like a calculator program your brain is delegating work to, versus a full fledged complex being with a deeply meaningful life being wasted on a simulation for someone of equal or lesser value. Happy Thread fucked around with this message at 04:14 on Mar 5, 2018 |
# ? Mar 5, 2018 03:57 |
|
Boy there's been an upswing in posters who fixate on the metaphysics of the tech in the show while being coldly unmoved by the themes, characterization, and actual Point, huh.
|
# ? Mar 5, 2018 04:32 |
|
Supercar Gautier posted:Boy there's been an upswing in posters who fixate on the metaphysics of the tech in the show while being coldly unmoved by the themes, characterization, and actual Point, huh.
|
# ? Mar 5, 2018 05:01 |
|
I'm an autistic person and I agree with supercar; the number of empathy-deprived canon-prodders in this thread has legit made me sick to my stomach.
|
# ? Mar 5, 2018 05:06 |
|
I'm legitimately surprised that there's someone who doesn't feel some sort of pity for that poor woman from White Christmas.
|
# ? Mar 5, 2018 05:41 |
|
Why? She's got her house running by a very personalized Alexa, sounds like she made out pretty good, unless she was gouged financing the program or something.
|
# ? Mar 5, 2018 06:02 |
|
Your refusal to concede and acknowledge the humanity of Cookies speaks to a profoundly broken sense of empathy and I would not trust you to feed a starving dog.
|
# ? Mar 5, 2018 06:06 |
|
DoctorWhat posted:Your refusal to concede and acknowledge the humanity of Cookies speaks to a profoundly broken sense of empathy and I would not trust you to feed a starving dog. I'm sure my Cookie has that covered. https://petnet.io/
|
# ? Mar 5, 2018 06:10 |