Ravenfood
Nov 4, 2011

There Bias Two posted:

So I just finished this show and I have really mixed feelings about it. It was definitely fun, but a lot of the elements fell completely flat for me. I wasn't sold on the whole idea of the Envoys trying to destroy stack technology. What about the Meths makes them any worse than regular ol' privileged psychopaths? It seems stupid to attack a tool when clearly the main issues were socioeconomic ones.

Besides, with the existing biotechnology, they'd probably develop an alternative system within a few years.

Tak's whole backstory in general was a mess here.
Yeah, your position is basically the book's position. Meths are privileged psychopaths who are slightly worse simply because their longer lifespans allow them more time to accumulate power and wealth while stunting everyone else around them. The saying that "science advances one funeral at a time" doesn't really work as well when people don't die. Or, more broadly, it's possible that societal change would stagnate somewhat if lifespans increased (once the initial upheaval settled). It's a shame that every TV show that has any kind of transhumanism in it seems to settle on yelling about how it's unnatural instead of exploring it more.

The books do also address the current argument between Zaphod/Hammerstein and Battuta, because while something like a needlecast is pretty clearly still "you", the branching forks and backups muddy it quite a bit. Kovacs is impressed despite himself that Bancroft kills himself even knowing he has a backup, because he knows he's losing those new experiences and is "dying". His clone would be demonstrably different from him (not having Rawlings, for one); he's just so convinced that the world needs "him"-as-of-47-hours-ago that he kills himself to save what is effectively a very, very slightly different person, but one the rest of the world will see as him. Ego and willpower, to some extent. It's not just that Meths have the means, but that they have the desire. Plus, there's that argument that Kovacs has with himself that I should re-read at some point.

Ravenfood fucked around with this message at 04:48 on Feb 14, 2018

Zaphod42
Sep 13, 2012

If there's anything more important than my ego around, I want it caught and shot now.

bring back old gbs posted:

That's interesting, because the Carnage guy was a human stack in a shoddy android body.

So in theory could Poe (or AI in general) be transferred to a stack/stack-like interface and put in a meat+bones clone?

Yeah that'd be cool as poo poo. Go all Agent Smith on this.

Ravenfood
Nov 4, 2011

Zaphod42 posted:

Yeah that'd be cool as poo poo. Go all Agent Smith on this.

Surely transferring Poe into a different substrate for consciousness and experience would be killing him, in your opinion, yeah?

Ravenfood fucked around with this message at 05:27 on Feb 14, 2018

bring back old gbs
Feb 28, 2007

by LITERALLY AN ADMIN

Ravenfood posted:

Yeah, your position is basically the book's position. Meths are privileged psychopaths who are slightly worse simply because their longer lifespans allow them more time to accumulate power and wealth while stunting everyone else around them. The saying that "science advances one funeral at a time" doesn't really work as well when people don't die. Or, more broadly, it's possible that societal change would stagnate somewhat if lifespans increased (once the initial upheaval settled). It's a shame that every TV show that has any kind of transhumanism in it seems to settle on yelling about how it's unnatural instead of exploring it more.

The books do also address the current argument between Zaphod/Hammerstein and Battuta, because while something like a needlecast is pretty clearly still "you", the branching forks and backups muddy it quite a bit. Kovacs is impressed despite himself that Bancroft kills himself even knowing he has a backup, because he knows he's losing those new experiences and is "dying". His clone would be demonstrably different from him (not having Rawlings, for one); he's just so convinced that the world needs "him"-as-of-47-hours-ago that he kills himself to save what is effectively a very, very slightly different person, but one the rest of the world will see as him. Ego and willpower, to some extent. It's not just that Meths have the means, but that they have the desire. Plus, there's that argument that Kovacs has with himself that I should re-read at some point.

This post should be added to the OP, it's a really succinct way to explain the technically-death-even-if-you-backup thing. Ego run amok, clouding you from the truth that it's not actually "you".

Neddy Seagoon posted:

No, you couldn't. Altered Carbon has an interesting perception of AIs, on the understanding that they're wasted as humanoid robots. You get the most out of them running infrastructure like hotels or cities, burning through thousands of parallel processes. Seeing a genuine humanoid robot is an antique oddity.

Poe's little nanotech body is just something from the show (not that it's a bad change for the format). The Hendrix in the book just supplies meals via dumbwaiters and interacts through screens.

It's literally cheaper to sleeve someone in a synth body than to go in for AI in a synth body.

Ah, I kept wondering what changes everybody was talking about w/r/t Poe. Yeah, loved the character, glad he was there. It's also a great concept that a physical, singular body is beneath them or something. Makes sense.

bring back old gbs fucked around with this message at 05:40 on Feb 14, 2018

Neddy Seagoon
Oct 12, 2012

"Hi Everybody!"

bring back old gbs posted:

That's interesting, because the Carnage guy was a human stack in a shoddy android body.

So in theory could Poe (or AI in general) be transferred to a stack/stack-like interface and put in a meat+bones clone?

No, you couldn't. Altered Carbon has an interesting perception of AIs, on the understanding that they're wasted as humanoid robots. You get the most out of them running infrastructure like hotels or cities, burning through thousands of parallel processes. Seeing a genuine humanoid robot is an antique oddity.

Poe's little nanotech body is just something from the show (not that it's a bad change for the format). The Hendrix in the book just supplies meals via dumbwaiters and interacts through screens.

It's literally cheaper to sleeve someone in a synth body than to go in for AI in a synth body.

General Battuta
Feb 7, 2011

This is how you communicate with a fellow intelligence: you hurt it, you keep on hurting it, until you can distinguish the posts from the screams.

bring back old gbs posted:

This post should be added to the OP, it's a really succinct way to explain the technically-death-even-if-you-backup thing. Ego run amok, clouding you from the truth that it's not actually "you".

That only holds if your backup or stack is somehow out of date - and only to the extent that you consider losing a few hours of your existence 'real death'. I don't think most people believe retrograde amnesia would kill them.

Putting an AI in a synth body would probably qualify as some sort of crime against (in)humanity.

Neddy Seagoon
Oct 12, 2012

"Hi Everybody!"

General Battuta posted:

That only holds if your backup or stack is somehow out of date - and only to the extent that you consider losing a few hours of your existence 'real death'. I don't think most people believe retrograde amnesia would kill them.

Putting an AI in a synth body would probably qualify as some sort of crime against (in)humanity.

It's nice that the new you thinks that, but you're still cooling meat making GBS threads itself on the floor.

Ravenfood
Nov 4, 2011

General Battuta posted:

That only holds if your backup or stack is somehow out of date - and only to the extent that you consider losing a few hours of your existence 'real death'. I don't think most people believe retrograde amnesia would kill them.
Yeah, it's "technically" death only because the backup and original have different experiences. I personally think I'd find it difficult to do, but I'm also not someone who grew up in that society.

e: You're wrong about the retrograde amnesia part though. Yes, from the perspective of the backup, it would be equivalent to suffering amnesia and not a big deal, and if you asked Bancroft at -49 hours whether he'd be okay losing the next 48 hours of memory for very good reasons, he probably would. But neither of those is the same as what the Bancroft fork who blew out his stack experiences, so from the perspective of that fork, it IS dying. At least that's how I see it.

Ravenfood fucked around with this message at 06:06 on Feb 14, 2018

The Ninth Layer
Jun 20, 2007

Neddy Seagoon posted:

It's nice that the new you thinks that, but you're still cooling meat making GBS threads itself on the floor.

Why would this have to be the case, assuming the technology is sophisticated enough? Imagine being hooked up to a virtual reality system, a fusion of biological and digital signal processing. As you were connected, a software system would slowly deactivate parts of your biological brain and replace them with digital processes stored on hardware, in such a way that you yourself would not be able to tell the difference. Couldn't this process slowly kill off your biological self while maintaining your continuous state of consciousness, so that *you* would be fully transferred over in a way you could be satisfied with?

If that's indeed the case, then why couldn't this process happen a lot more quickly, depending on whatever CPU you're using?

Neddy Seagoon
Oct 12, 2012

"Hi Everybody!"

The Ninth Layer posted:

Why would this have to be the case, assuming the technology is sophisticated enough? Imagine being hooked up to a virtual reality system, a fusion of biological and digital signal processing. As you were connected, a software system would slowly deactivate parts of your biological brain and replace them with digital processes stored on hardware, in such a way that you yourself would not be able to tell the difference. Couldn't this process slowly kill off your biological self while maintaining your continuous state of consciousness, so that *you* would be fully transferred over in a way you could be satisfied with?

If that's indeed the case, then why couldn't this process happen a lot more quickly, depending on whatever CPU you're using?

At some point your mind keeps on going separately as software, but you yourself just die as your brain gets shut off.

The basic thing people don't seem to comprehend is that when you transfer data of any kind from one system to another, there is no actual magical data entity moving through the signal to its new home. System A sends instructions to System B for how to write down its own copy of the data. Once transfer is complete and verified, System A deletes the original. That's it.
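
To make that concrete, here's a minimal sketch of what a "transfer" actually does, with the two systems as plain Python dicts (all names hypothetical):

code:
import pickle

def transfer(system_a, system_b, key):
    """A "move" is really copy, verify, delete: nothing ever travels.
    System B just builds its own identical instance of the data."""
    payload = pickle.dumps(system_a[key])
    system_b[key] = pickle.loads(payload)              # B writes its own copy
    verified = pickle.dumps(system_b[key]) == payload  # confirm the copies match
    if verified:
        del system_a[key]                              # only now does the original go away
    return verified

a = {"mind": ["first memory", "latest memory"]}
b = {}
print(transfer(a, b, "mind"))  # True
print(a, b)                    # {} {'mind': [...]}: a copy plus a deletion, not a move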

Neddy Seagoon fucked around with this message at 06:08 on Feb 14, 2018

Ravenfood
Nov 4, 2011

Neddy Seagoon posted:

At some point your mind keeps on going separately as software, but you yourself just die as your brain gets shut off.

The basic thing people don't seem to comprehend is that when you transfer data of any kind from one system to another, there is no actual magical data entity moving through the signal to its new home. System A sends instructions to System B for how to write down its own copy of the data. Once transfer is complete and verified, System A deletes the original. That's it.
How are "your mind" and "you yourself" different?

Yeah, ok, and? If all "I" am is my experiences, and those experiences continue, then I continue. It only gets weird when system A doesn't delete the original.

Ravenfood fucked around with this message at 06:14 on Feb 14, 2018

Neddy Seagoon
Oct 12, 2012

"Hi Everybody!"

Ravenfood posted:

How are "your mind" and "you yourself" different?

Yeah, ok, and? If all "I" am is my experiences, and those experiences continue, then I continue. It only gets weird when system A doesn't delete the original.


A copied instance does, yes. You yourself are still quite dead. It only "gets weird" because you realize the true nature of what that transfer's just done.

Ravenfood
Nov 4, 2011

Neddy Seagoon posted:

A copied instance does, yes. You yourself are still quite dead.
No, because you haven't specified what makes an identical copy (with the original deleted) different from the original simply continuing. You haven't said what makes "you yourself" in any way special or somehow different from something else with an identical experience; you just keep saying that it is.

e: And no, the weirdness comes when trying to determine how wide the tolerance is between "same shared information = same person" and "is now a different person". It's very easy to say a perfectly identical copy is the same as the original if the original ceases to exist the moment the copy comes into existence, but it's harder to do that if they exist in tandem.

Ravenfood fucked around with this message at 06:26 on Feb 14, 2018

bring back old gbs
Feb 28, 2007

by LITERALLY AN ADMIN

General Battuta posted:

That only holds if your backup or stack is somehow out of date - and only to the extent that you consider losing a few hours of your existence 'real death'. I don't think most people believe retrograde amnesia would kill them.

Putting an AI in a synth body would probably qualify as some sort of crime against (in)humanity.

Well, the only way to do the transfer SO FAST that your brain doesn't diverge into a new consciousness is to do a needlecast (or to be unconscious AND somehow not dreaming while a backup is done, and then you get killed in a complicated contraption of a plan), right? So yeah, we'd have to define what we consider to be a completely new consciousness. Is an hour-old backup of you a completely different person? But why an hour? Technically your experiences diverge the instant you're backed up and the version that stayed conscious and out of storage begins forming new memories. So the time element is almost entirely irrelevant.
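
A toy sketch of that instant divergence, treating a mind as nothing but an experience log (hypothetical, obviously):

code:
import copy

original = {"memories": ["breakfast", "backup appointment"]}
backup = copy.deepcopy(original)  # identical at the instant of backup

original["memories"].append("walked home in the rain")  # stayed conscious
backup["memories"].append("restored into a synth")      # woke from storage

print(original == backup)  # False: divergence begins with the very next experience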

I'd agree with you about the amnesia thing, of course it's still you, but the ability to just take a copy of the indefinable spark that makes you you, and have it not only still inside you but also in your hand, slightly out of date yet still capable of being put into a dummy body and functionally replacing your life...I mean poo poo, that would be insane. I don't even really know how to approach that. Good fuckin luck to the lawmakers that do.

Remember that story a few years ago about the Silicon Valley guy who paid like $60k of his $250k/year programming salary to a guy in India who basically did his entire job for him? But you could do that yourself, with you: just take half the time off while you and your clone tap in and out of the daily bullshit.

Ok I'll put the bong down now.

bring back old gbs fucked around with this message at 06:25 on Feb 14, 2018

General Battuta
Feb 7, 2011

This is how you communicate with a fellow intelligence: you hurt it, you keep on hurting it, until you can distinguish the posts from the screams.

Neddy Seagoon posted:

A copied instance does, yes. You yourself are still quite dead. It only "gets weird" because you realize the true nature of what that transfer's just done.

The basic thing you don’t seem to get is that there’s no magical quality to the place data lives. Information doesn’t care what’s used to encode it as long as it’s preserved. Your mind is a system for moving atoms around according to rules. If those rules are suddenly moving bits around instead, they don’t care. They’re still you.

This should be trivially obvious once you recognize that all the protons in your brain could be swapped out for identical duplicates without killing you. A proton functions as a proton.

Identifying information with substrate is a trap. The fear that a brain upload would replace you with an ineffably different copy and leave you dead doesn’t hold up to logic. It’s just a fear created by the fact that the same process in day-to-day life casts off old substrate as heat, sweat, and poo poo rather than a corpse.

The Ninth Layer
Jun 20, 2007

Neddy Seagoon posted:

At some point your mind keeps on going separately as software, but you yourself just die as your brain gets shut off.

The basic thing people don't seem to comprehend is that when you transfer data of any kind from one system to another, there is no actual magical data entity moving through the signal to its new home. System A sends instructions to System B for how to write down its own copy of the data. Once transfer is complete and verified, System A deletes the original. That's it.

Well, this depends on whether you view yourself as your brain, heart, lungs, hands, feet, etc., or as the entity that experiences the functioning of your brain, heart, lungs, hands, feet, etc. As human beings we generally consider ourselves to be -both-, as there are merits to both views: our consciousness is a function of the structure of our brain and a bunch of biological processes, and our brain is essentially a computational network of processes that can individually be simulated (if poorly) by computer networks.

Altered Carbon comes down firmly on one side, aka you are the network overseeing a wide variety of processes, any component of which could theoretically be replaced by an artificial or synthetic component without interrupting the whole. You seem to be coming down hard on the other side, but I'll bet I could put forward tech scenarios in which you would accept that your consciousness can indeed be transferred.

For example, say I have a virtual reality machine that will incrementally depower your neural functions while simultaneously keeping your brain healthy and sending the appropriate signals to the biological systems that rely on your brain to function. Say we go about this in the same way I did in my first post: I slowly shut down your biological neural components, but in such a way that subjectively you cannot tell the difference. Once I've transferred you all the way over, and provided I can sufficiently prove to you that your brain was shut off, I proceed to "restart" various portions of your brain and discard the corresponding digital systems, again in such a way that you wouldn't be able to tell. Eventually I disconnect you from the VR and you're at 100% brain power. In this situation would you now accept that I have transferred your *self* back and forth, or would you insist that I have simply created and then deleted a copy while I paralyzed you?
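
If it helps, here's a toy version of that gradual handover, with brain "modules" as plain functions and behaviorally identical digital replacements. A sketch of the thought experiment, not a claim about real brains:

code:
# Biological modules and their digital replacements behave identically.
bio = {
    "vision":   lambda x: x * 2,
    "memory":   lambda x: x + 1,
    "language": lambda x: x - 3,
}
digital = {
    "vision":   lambda x: x * 2,
    "memory":   lambda x: x + 1,
    "language": lambda x: x - 3,
}

def think(modules, stimulus):
    out = stimulus
    for name in ("vision", "memory", "language"):
        out = modules[name](out)
    return out

brain = dict(bio)
baseline = think(brain, 10)
for name in ("vision", "memory", "language"):
    brain[name] = digital[name]          # swap one component at a time
    assert think(brain, 10) == baseline  # subjectively, nothing ever changes
print("fully digital, same behavior throughout")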

Neddy Seagoon
Oct 12, 2012

"Hi Everybody!"

The Ninth Layer posted:

Well, this depends on whether you view yourself as your brain, heart, lungs, hands, feet, etc., or as the entity that experiences the functioning of your brain, heart, lungs, hands, feet, etc. As human beings we generally consider ourselves to be -both-, as there are merits to both views: our consciousness is a function of the structure of our brain and a bunch of biological processes, and our brain is essentially a computational network of processes that can individually be simulated (if poorly) by computer networks.

Altered Carbon comes down firmly on one side, aka you are the network overseeing a wide variety of processes, any component of which could theoretically be replaced by an artificial or synthetic component without interrupting the whole. You seem to be coming down hard on the other side, but I'll bet I could put forward tech scenarios in which you would accept that your consciousness can indeed be transferred.

For example, say I have a virtual reality machine that will incrementally depower your neural functions while simultaneously keeping your brain healthy and sending the appropriate signals to the biological systems that rely on your brain to function. Say we go about this in the same way I did in my first post: I slowly shut down your biological neural components, but in such a way that subjectively you cannot tell the difference. Once I've transferred you all the way over, and provided I can sufficiently prove to you that your brain was shut off, I proceed to "restart" various portions of your brain and discard the corresponding digital systems, again in such a way that you wouldn't be able to tell. Eventually I disconnect you from the VR and you're at 100% brain power. In this situation would you now accept that I have transferred your *self* back and forth, or would you insist that I have simply created and then deleted a copy while I paralyzed you?

You have deleted a copy and paralyzed someone.


General Battuta posted:

The basic thing you don’t seem to get is that there’s no magical quality to the place data lives. Information doesn’t care what’s used to encode it as long as it’s preserved. Your mind is a system for moving atoms around according to rules. If those rules are suddenly moving bits around instead, they don’t care. They’re still you.

This should be trivially obvious once you recognize that all the protons in your brain could be swapped out for identical duplicates without killing you. A proton functions as a proton.

Identifying information with substrate is a trap. The fear that a brain upload would replace you with an ineffably different copy and leave you dead doesn’t hold up to logic. It’s just a fear created by the fact that the same process in day-to-day life casts off old substrate as heat, sweat, and poo poo rather than a corpse.

You seem to be contemplating information as some kind of intangible elemental force. It is not.

Ravenfood
Nov 4, 2011

Neddy Seagoon posted:

You have deleted a copy and paralyzed someone.


You seem to be contemplating information as some kind of intangible elemental force. It is not.

What fundamentally separates a perfect copy from the original if the original ceases to exist the instant the copy is created?

e: Or even a copy from the original if both exist?

General Battuta
Feb 7, 2011

This is how you communicate with a fellow intelligence: you hurt it, you keep on hurting it, until you can distinguish the posts from the screams.

Neddy Seagoon posted:

You have deleted a copy and paralyzed someone.


You seem to be contemplating information as some kind of intangible elemental force. It is not.

No. Information is a well-understood and quantified concept in physics - conservation of information may even be as fundamental a law as conservation of mass-energy. If this is new to you, check out the Wikipedia page on physical information.

Again. Information is the arrangement of elements into a structure: the difference between several piles of base elements and a human being of the same composition. The elements may be swapped out (neurons for simulated digital equivalents, or nanomachines, or whatnot) without altering the information encoded.

You surely understand that all protons are identical, right? All atoms of carbon or oxygen and so forth? So you grasp that if a wizard were to replace every atom in your body with an identical replacement, nothing would change. You would not die.

Now imagine the wizard leaves the matter he swapped out in a pile by your side. Do you scream “My poor dead atoms! Now I am only a clone!”? No, you don’t.

Now imagine the wizard leaves the matter he swapped out in the shape of your dead corpse. It’s dead because the wizard didn’t transfer any brain activity. Do you shriek “I’ve been murdered! Murder!”?

Or do you see why getting hung up on substrate is a trap?
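
Here's one way to see the substrate point, as a sketch rather than physics: the same pattern held in RAM and on disk hashes identically, even though the two carriers share no atoms.

code:
import hashlib
import os
import tempfile

pattern = "the arrangement is the information".encode()

# Substrate 1: bytes in RAM. Substrate 2: a file on disk.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(pattern)
    path = f.name
with open(path, "rb") as f:
    from_disk = f.read()
os.remove(path)

# Completely different physics, identical information.
print(hashlib.sha256(pattern).hexdigest()
      == hashlib.sha256(from_disk).hexdigest())  # True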

Proteus Jones
Feb 28, 2013



We are the sum of our state changes in Planck time.

Neddy Seagoon
Oct 12, 2012

"Hi Everybody!"

General Battuta posted:

No. Information is a well-understood and quantified concept in physics - conservation of information may even be as fundamental a law as conservation of mass-energy. If this is new to you, check out the Wikipedia page on physical information.

Again. Information is the arrangement of elements into a structure: the difference between several piles of base elements and a human being of the same composition. The elements may be swapped out (neurons for simulated digital equivalents, or nanomachines, or whatnot) without altering the information encoded.

You surely understand that all protons are identical, right? All atoms of carbon or oxygen and so forth? So you grasp that if a wizard were to replace every atom in your body with an identical replacement, nothing would change. You would not die.

Now imagine the wizard leaves the matter he swapped out in a pile by your side. Do you scream “My poor dead atoms! Now I am only a clone!”? No, you don’t.

Now imagine the wizard leaves the matter he swapped out in the shape of your dead corpse. It’s dead because the wizard didn’t transfer any brain activity. Do you shriek “I’ve been murdered! Murder!”?

Or do you see why getting hung up on substrate is a trap?

Your comprehension of conservation of information seems to be very selective, because the concepts you're playing with are still a pattern of information and not a transferred instance. You have created a separate entity with delusions of continuity, because there is no blip in the perception of the replacement, not actual continuity. Nothing in your argument actually governs transfer, only the alteration or creation of identical information in a second object.

General Battuta
Feb 7, 2011

This is how you communicate with a fellow intelligence: you hurt it, you keep on hurting it, until you can distinguish the posts from the screams.
By that objection you’ve disproven the existence of your own brain. Do you think your thoughts depend on specially labeled atoms? Unique proteins that know they’re part of you? A specially privileged part of spacetime? You are not an object, friendo. You are not an instance or substrate. You’re a pattern of information computing on a substrate, and wherever that pattern operates, so do you.

The continuity here is not an illusion. It’s mathematically provable that no information is lost, only substrate. And as we laid out above, substrate doesn’t matter.

But let’s be more specific about your objection. In which of the above scenarios do you feel the wizard killed you?

The Ninth Layer
Jun 20, 2007

Neddy Seagoon posted:

Your comprehension of conservation of information seems to be very selective, because the concepts you're playing with are still a pattern of information and not a transferred instance. You have created a separate entity with delusions of continuity, because there is no blip in the perception of the replacement, not actual continuity. Nothing in your argument actually governs transfer, only the alteration or creation of identical information in a second object.

So then you are left having to tell us where the dividing line is at which you are no longer you and become the copy. If you cannot tell this line by experience, to the point where I could transfer you into my VR system and back out of it without you experiencing the journey any differently than you'd experience riding the subway from home to work and back again, then where is the line?

I could shut down your visual cortex and eye, and replace them with a digital network that did form processing and object recognition on a pair of cameras in exactly the way your brain does, and have it connect to your biological brain in an identical fashion to the cells I disconnected, and you would probably agree that *you* would experience sight out of these cameras.

I could shut down your long-term memory and how it relates to semantic networks, and maybe take your language processes and semantic network with it, and swap it all out for identically structured and connected replacements. Now maybe at this point you are no longer *you*, if we consider consciousness and memory and semantic language to go hand in hand... but we'd hesitate to say someone is no longer *them* if they got into an accident and lost those areas, or that they had lost all conscious experience (depending on how severe and localized the accident was).

If any one of these components could be seamlessly replaced individually in such a way that you would still consider yourself *you*, then where's the line where suddenly you are not *yourself*?

bring back old gbs
Feb 28, 2007

by LITERALLY AN ADMIN
So do you think you can just learn a skill like chopping wood well by arranging neurons or atoms in the right structure in the brain? Is a memory a specific structure and arrangement of neurons, or can this change from person to person - is how to chop wood stored in the same bundle of whatever, in the same spot, in the same arrangement? And further on from that, copy all the skills one has ever acquired in their life. And somehow do this for emotions too. And then you put that same order in a different brain, transferring all those skills/emotions and whatever paths muscle memory has worked out, and this process actually works.

Does this person get angry in traffic like the original? Do they have the same taste in food? Where does that poo poo come from? How do you quantify the "data" or even begin to parse it?

:psyduck:

General Battuta
Feb 7, 2011

This is how you communicate with a fellow intelligence: you hurt it, you keep on hurting it, until you can distinguish the posts from the screams.
You could probably do that, though I shudder to think how hard it might be to adapt one brain’s idiolect to another’s. It might be easier to just provide realistic simulated input - canned, compressed ‘practice’.

A lot of the sleeves in Altered Carbon seem to come with skills, but I don’t know if it’s laid out how much they’re neural and how much they’re somatic/reflexive - probably a lot of both, realistically.

The Ninth Layer
Jun 20, 2007

bring back old gbs posted:

So do you think you can just learn a skill like chopping wood well by arranging neurons or atoms in the right structure in the brain? Is a memory a specific structure and arrangement of neurons, or can this change from person to person - is how to chop wood stored in the same bundle of whatever, in the same spot, in the same arrangement? And further on from that, copy all the skills one has ever acquired in their life. And somehow do this for emotions too. And then you put that same order in a different brain, transferring all those skills/emotions and whatever paths muscle memory has worked out, and this process actually works.

Does this person get angry in traffic like the original? Do they have the same taste in food? Where does that poo poo come from? How do you quantify the "data" or even begin to parse it?

:psyduck:

Assuming brains have perfect control of their bodies, you absolutely could do all of this, because our brain is structured with all of these things stored as weights. In other words you have a neural network for wood chopping that puts a certain amount of effort into moving some muscles and not others in a way that facilitates your chopping, and through biological feedback systems those weights get adjusted as necessary. If you were in a robot body with corresponding robot muscles, joints, touch/sensory information and so on, to where it'd be a 1:1 copy of the human body, then you could easily acquire an "optimal wood chopping" network of motor actions etc.

Could you take ~someone else's~ chopping system and put it into ~your~ physical brain? I think that's harder to answer, and it's unlikely we all form our networks exactly the same way. What's effective for a 7'4" muscled giant may not work for me at 5'10", but if we're not limited by the tech we're using then it's perfectly possible that a computer algorithm exists that could "convert" the giant's wood-chopping skill into something that would work for my physical wiring.

The decision to yell in traffic is absolutely calculated and could easily be part of a digital simulation of personality. You may have to simulate the effects of hormone production and how it affects your emotional state, but even without that you could explain someone yelling in traffic through learned neural network responses. Traffic is frustrating, frustration creates a feedback loop of worry/anxiety, and yelling may be a learned method of breaking that feedback loop.
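
A toy version of the weight idea, with numpy arrays standing in for the networks; the "conversion" step between differently wired brains is pure hand-waving, which is sort of the point:

code:
import numpy as np

rng = np.random.default_rng(0)

# The giant's "chopping network": the skill is nothing but these weights.
giant = {"w1": rng.normal(size=(4, 4)), "w2": rng.normal(size=(4, 1))}

# Same architecture: transferring the skill is a literal weight copy.
me_same_wiring = {k: v.copy() for k, v in giant.items()}
assert all(np.array_equal(giant[k], me_same_wiring[k]) for k in giant)

# Different architecture (smaller network): a direct copy can't work,
# so some conversion algorithm has to remap it. Here, a crude truncation.
me_smaller = {"w1": giant["w1"][:3, :3].copy(), "w2": giant["w2"][:3].copy()}
print(giant["w1"].shape, me_smaller["w1"].shape)  # (4, 4) vs (3, 3)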

Zaphod42
Sep 13, 2012

If there's anything more important than my ego around, I want it caught and shot now.

Ravenfood posted:

e: You're wrong about the retrograde amnesia part though. Yes, from the perspective of the backup, it would be equivalent to suffering amnesia and not a big deal, and if you asked Bancroft at -49 hours whether he'd be okay losing the next 48 hours of memory for very good reasons, he probably would. But neither of those is the same as what the Bancroft fork who blew out his stack experiences, so from the perspective of that fork, it IS dying. At least that's how I see it.

Yep that's what I keep saying.

Neddy Seagoon posted:

A copied instance does, yes. You yourself are still quite dead. It only "gets weird" because you realize the true nature of what that transfer's just done.

Yep

Zaphod42
Sep 13, 2012

If there's anything more important than my ego around, I want it caught and shot now.

Ravenfood posted:

What fundamentally separates a perfect copy from the original if the original ceases to exist the instant the copy is created?

e: Or even a copy from the original if both exist?

They're not the same "thing". Pick whatever word you want to call that! You seemed to really hate "entity", how about "version"? But "version" makes them sound different in state, which they aren't.

But even if two things have the exact same state, they aren't identical, if only by virtue of there being TWO of them!

Let's say I have 2 USB sticks. I copy the data from one to the other. Does that mean I now only have 1 USB stick? No, there are 2 USB sticks that both hold the same data. And then if I delete one, I still have that data, I haven't lost it, but one USB stick is now empty. Get it?

Information state alone is not existence. You have a physical place. A process, in whatever form, electronic or biological. When you copy something you have the exact same information but in a different place; they're not the same thing, not the same entity. Fundamentally.

We do not exist in a universe of pure ideas and information.
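
Programming languages already draw exactly this distinction, equality of state versus identity of instance. A minimal sketch:

code:
original = ["every", "memory", "you", "have"]
clone = list(original)    # perfect copy: identical state, second instance

print(clone == original)  # True:  same information
print(clone is original)  # False: two distinct things holding it

original.clear()          # "blow out the stack" on the original...
print(clone)              # ...and the copy is untouched, because it never was the original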

Zaphod42 fucked around with this message at 08:14 on Feb 14, 2018

Zaphod42
Sep 13, 2012

If there's anything more important than my ego around, I want it caught and shot now.

Neddy Seagoon posted:

Your comprehension of conservation of information seems to be very selective, because the concepts you're playing with are still a pattern of information and not a transferred instance. You have created a separate entity with delusions of continuity, because there is no blip in the perception of the replacement, not actual continuity. Nothing in your argument actually governs transfer, only the alteration or creation of identical information in a second object.

This a million times

Zaphod42
Sep 13, 2012

If there's anything more important than my ego around, I want it caught and shot now.

General Battuta posted:

wherever that pattern operates, so do you.

No, this isn't given! You can't just declare this!

You're assuming that if somebody happens to assemble atoms in the future in a way that perfectly matches my current pattern, that'll magically mean that my consciousness will transfer from the past to the future. That isn't a given! Maybe it'll just be another me. There's nothing written in the laws of the universe that says identical patterns maintain the same process. It's a different process with identical behavior.

In theory there are ways you could transform a consciousness while maintaining it, but ALSO in theory there are ways you can copy it while leaving the same one going, and then kill it afterwards. And again, the latter is what Altered Carbon demonstrates.

Ravenfood
Nov 4, 2011

General Battuta posted:

A lot of the sleeves in Altered Carbon seem to come with skills, but I don’t know if it’s laid out how much they’re neural and how much they’re somatic/reflexive - probably a lot of both, realistically.
From what I remember, it's mostly the latter. I can't remember any sleeves coming with actual skills, just speed, reflexes, and coordination. That one person had a supercomputer jammed into her skull, I suppose. I wonder if early in stack/clone development, before they got good at "growing" them with those reflexes, muscle mass, etc., they just had someone sleeve into them and practice incessantly as part of their day job so the sleeve would be ready for whatever when the person who owned it wanted it. Presumably they have a way of preserving unused sleeves without damaging them over time, though I don't know if, say, their hearts beat or not. If they do, those sleeves would need regular maintenance and upkeep.

Zaphod42
Sep 13, 2012

If there's anything more important than my ego around, I want it caught and shot now.

The Ninth Layer posted:

So then you are left having to tell us where the dividing line is at which you are no longer you and become the copy. If you cannot tell this line by experience, to the point where I could transfer you into my VR system and back out of it without you experiencing the journey any differently than you'd experience riding the subway from home to work and back again, then where is the line?

I could shut down your visual cortex and eye, and replace them with a digital network that did form processing and object recognition on a pair of cameras in exactly the way your brain does, and have it connect to your biological brain in an identical fashion to the cells I disconnected, and you would probably agree that *you* would experience sight out of these cameras.

I could shut down your long-term memory and how it relates to semantic networks, and maybe take your language processes and semantic network with it, and swap it all out for identically structured and connected replacements. Now maybe at this point you are no longer *you*, if we consider consciousness and memory and semantic language to go hand in hand... but we'd hesitate to say someone is no longer *them* if they got into an accident and lost those areas, or that they had lost all conscious experience (depending on how severe and localized the accident was).

If any one of these components could be seamlessly replaced individually in such a way that you would still consider yourself *you*, then where's the line where suddenly you are not *yourself*?

The part where Battuta said you euthanize the body. That's where you draw the line.

Zaphod42 posted:

In theory there are ways you could transform a consciousness while maintaining it, but ALSO in theory there are ways you can copy it while leaving the same one going, and then kill it afterwards. And again, the latter is what Altered Carbon demonstrates.

It's not that you can't do that. You possibly could. But what we see in the show isn't that.

Consider double-sleeving. The instant there are 2 consciousnesses alive that can talk to each other and experience things individually, THAT is the line. See how easy that is? It's not nearly as impossible as you're making it out to be.

Blisster
Mar 10, 2010

What you are listening to are musicians performing psychedelic music under the influence of a mind altering chemical called...

Neddy Seagoon posted:

This is actually true. Ortega explicitly states she's not done unpacking her poo poo because the apartment is meant to be shared with Ryker and she wanted to do it with him.

Admittedly I forgot about this. The apartment was just the first set I thought of when trying to explain how I feel about the look of the show. I'm sure I'm not articulating it right, but something about the world in AC just feels empty or not right. Like they could have gone balls-out crazy and done literally anything, but instead everything looks very pedestrian.

bring back old gbs
Feb 28, 2007

by LITERALLY AN ADMIN

The Ninth Layer posted:


Could you take ~someone else's~ chopping system and put it into ~your~ physical brain? I think that's harder to answer, and it's unlikely we all form our networks exactly the same way. What's effective for a 7'4" muscled giant may not work for me at 5'10", but if we're not limited by the tech we're using then it's perfectly possible that a computer algorithm exists that could "convert" the giant's wood-chopping skill into something that would work for my physical wiring.



That's sort of what I was getting at. Could an Olympic athlete sell a copy of their skills, or a pro skateboarder sell his new trick for $14.99? Could you buy a P90X routine from a huge guy in an infomercial that was his entire memory of correctly learned exercises and food-prep routines? Or would you get his roid rage / general distemper along with it?

A theme park where they temporarily load you up with the best jet-fighting skills in the world and let you tear rear end around in real jets, or gunslinging, whatever.

Zaphod42
Sep 13, 2012

If there's anything more important than my ego around, I want it caught and shot now.

bring back old gbs posted:

That's sort of what I was getting at. Could an Olympic athlete sell a copy of their skills, or a pro skateboarder sell his new trick for $14.99? Could you buy a P90X routine from a huge guy in an infomercial that was his entire memory of correctly learned exercises and food-prep routines? Or would you get his roid rage / general distemper along with it?

We don't know enough to say, but given infinite technology, why not?

Sufficiently advanced technology is indistinguishable from magic and all that jazz.

I agree with what Ninth Layer said though, they are probably stored in vastly different ways but there's no saying we couldn't have some intelligent computer system map one to the other and do a conversion or whatever.

Ravenfood
Nov 4, 2011

Zaphod42 posted:

Consider double-sleeving. The instant there are 2 consciousnesses alive that can talk to each other and experience things individually, THAT is the line. See how easy that is? It's not nearly as impossible as you're making it out to be.
Literally nobody is disagreeing about what happens in double-sleeving, or in instances of forking, though.

bring back old gbs posted:

That's sort of what I was getting at. Could an Olympic athlete sell a copy of their skills, or a pro skateboarder sell his new trick for $14.99? Could you buy a P90X routine from a huge guy in an infomercial that was his entire memory of correctly learned exercises and food-prep routines? Or would you get his roid rage / general distemper along with it?
In Altered Carbon, no, you couldn't, at least not yet. VR training and time compression is your best bet there. You could buy an athlete's sleeve and get all of their practiced reflexes, but you'd have to spend a while sorting them out and integrating them with your conscious thoughts; it'd be awkward, but you'd get there eventually. Morgan also never really talks about what would happen if someone with a traumatic brain injury or seizure history or severed corpus callosum or Alzheimer's is placed in a neurologically intact sleeve or vice versa, at least that I recall. He seems to separate all neurological activity into the stack, probably for convenience's sake.

Neddy Seagoon
Oct 12, 2012

"Hi Everybody!"
If you want to take the Altered Carbon example, consider this: Battuta's concepts would explicitly prevent double-sleeving.

It works by bouncing someone as d.h.f. across a planet to a resleeving facility, faking a failed transfer, and having the original restored from the transmission buffer. Oh well, guess I'll go home then. Nobody even knows the second one's walking around until they get back.



Blisster posted:

Admittedly I forgot about this. The apartment was just the first set I thought of when trying to explain how I feel about the look of the show. I'm sure I'm not articulating it right, but something about the world in AC just feels empty or not right. Like they could have gone balls-out crazy and done literally anything, but instead everything looks very pedestrian.

This is actually drawn from the book, but as usual the writers don't understand the nuances of the source material they drew from. The short answer is that Earth isn't the super-populated technological hub of the Protectorate. They're headquartered there, buuut that's about all the planet really has going for it aside from general corporations. See, once the Martian starcharts were discovered, with maps to worlds that were guaranteed habitable, anyone with a desire for adventure got on the first Colony Barge they could.

The only people left on Earth were the ones who didn't want to leave, or couldn't (like the Catholics), and Laurens Bancroft was actually around to watch the Barges take off into the sky. It's what his telescope was used for. Bay City is probably about identical to how it is nowadays, but a big theme of the book is that everything is recycled from what it once was. It's why Fell St Police Station has the big glass windows; it used to be a church. 'Licktown' is just low-lying under-the-freeway-strip-mall territory, and most of the traffic there is regular old ground-cars.

Another example: the Fight Drome in the book was a beached aircraft carrier called the Panama Rose (and the series flubs its script editing by calling the Fight Drome 'Panama Rose' a few times. Oops.). Even the vaunted Head in the Clouds isn't some uber-advanced floating platform. It's just a massive repurposed blimp that was once used to monitor the Earth's environment.

Zaphod42
Sep 13, 2012

If there's anything more important than my ego around, I want it caught and shot now.
According to Battuta's logic, booting up 2 people with the same state would mean one person would be experiencing both, because "you" are instantly anywhere that your state exists.

But that's not how it works; there's 2 different "yous" that diverge instantly.

Ravenfood posted:

Literally nobody is disagreeing about what happens in double-sleeving, or in instances of forking, though.

Okay, but it's the same technology. Consider the ramifications.

Like in Bancroft's case, he definitely isn't immortal, right? He died. There's still a Bancroft, but he definitely died; the backup wasn't even exactly the same person.

But it isn't the lack of 40 minutes of memory that kills him. It's his body and current consciousness dying. Booting up a backup afterwards doesn't undo the death. You can't say someone died and then 40 minutes later retroactively say they didn't die.

And it's the same thing even without the 40 minutes, even if it's instantaneous. Unless you were doing a transfer instead of a copy, but we know you aren't, because you have to be able to copy to double-sleeve.

I guess you could argue that while Bancroft and double-sleeves are copies, everything else in Altered Carbon is a transfer, and that backups are a special read-only operation, separate and completely different from normal needlecasting. Certainly that's how the population chooses to look at it, but I think they're delusional. I mean, they choose to look at even Bancroft's backup as being the same thing, but we know it's not. He died. If they think Bancroft didn't die and they're wrong, couldn't they be wrong about the rest?

Ravenfood
Nov 4, 2011
That's the cheap, illegal way to double-sleeve. IIRC, the conceit is that storing stack information in a way that can be used as a backup is difficult and takes a ton of memory, but for some reason stacks themselves don't need it.

I don't think anything says that Bancroft couldn't wake up his "backup" without also having RD'd his original, except that it'd be illegal as gently caress.


bring back old gbs
Feb 28, 2007

by LITERALLY AN ADMIN
In the book, does the daughter get caught in her mother's sleeve by Kovacs? Because that made resleeving seem pretty pedestrian, even if they are rich. More like a high-end designer dress or sports car.
