Taeke
Feb 2, 2010


Shelvocke posted:

Also in Matter there's the pilot of the Bliterator (I forget her name), who addresses the fact that the back-up doesn't save you: you still die. It just makes a copy of you that lives on, and while it's a copy down to the molecular level, there's no transference of consciousness, no waking up in another body; you die, and the you-robot wakes up elsewhere.

Which is where the philosophical questions come in (and the Star Trek transporter), because how is it any different from someone who briefly died but was resuscitated, was knocked out (or showed no brain activity for some reason), or even just went to sleep and woke up, or hell, the normal state of affairs of going from one moment to the next? All we are, essentially, is our experience in any given moment, including our memories and whatever. You could argue that every time we go from one moment to the next our old self dies and our new self takes over, and whether or not there was any time or space in between the two, and whether there is continuity in the 'vessel' that harbors our consciousness, is not all that relevant to what is essentially a question we won't ever get a complete answer to.

But like I said, we discussed this in depth before and it's a very interesting but complicated issue.


Seaside Loafer
Feb 7, 2012

Waiting for a train, I needed a shit. You won't bee-lieve what happened next

Tony Montana posted:

Could you explain the philosophical implications? You're not more than bits of data on a datastream for a period?
The viewpoint that the mind is just a biological computer is the hard-reductionist position (http://en.wikipedia.org/wiki/Reductionism). In this view your neural pathways define everything you do, and though you think you have free will, you don't. In this view the concept of free will is a self-sustaining myth, defined by the mind to keep us all from going bonkers; if you woke up every morning thinking you didn't have any free will it would be a rather depressing world to live in, wouldn't it? But if it feels like free will, it doesn't really matter, does it?

(Personally I believe that to a point, but I also think quantum effects on the neural net chuck in a bit of randomness.)

e: I've just done a philosophy of AI module and could post some good reading on it if anyone's interested.

Seaside Loafer fucked around with this message at 16:18 on Jan 14, 2014

General Battuta
Feb 7, 2011

This is how you communicate with a fellow intelligence: you hurt it, you keep on hurting it, until you can distinguish the posts from the screams.

Shelvocke posted:

Also in Matter there's the pilot of the Bliterator (I forget her name), who addresses the fact that the back-up doesn't save you: you still die. It just makes a copy of you that lives on, and while it's a copy down to the molecular level, there's no transference of consciousness, no waking up in another body; you die, and the you-robot wakes up elsewhere.

Fortunately for her, the 'copy' is a valid fork of her identity from the moment the copy was made, so she can choose to view herself as the duplicate and the original as safe. It's just as fallacious a position (both forks, the Bliterator pilot and the transmitted backup are 'the real her'), but at least she can focus on the fact that one of her is going to live.

(there's no 'you-robot', just two yous, and there is transference of consciousness to the extent that there are now two consciousnesses diverging from the moment the scan is made)

General Battuta fucked around with this message at 19:22 on Jan 14, 2014

Taeke
Feb 2, 2010


Isn't there a thought experiment where you are duplicated but it isn't revealed (or even possible to tell) which one of you is the original and which one is the copy? Effectively such questions become meaningless, and the more I think about it, the less special everything becomes. The same goes for the question of free will, and while I do have a rather bleak outlook on these things, I take solace in the fact that while every choice I make might be predetermined, the important thing is that the choice still has to be made and that it's my experience that matters. Still, nihilism is always just around the corner.

The Dark One
Aug 19, 2005

I'm your friend and I'm not going to just stand by and let you do this!
To put the Culture's biotech into perspective, they're so fuzzy around the edges that they've taken special care to make themselves fertile with almost anything with two legs. People are mentioned as having lived as birds, fish, dirigibles and "small clouds of cohesive smoke" as fashion dictated, but it's unclear if that was a genetic inheritance thing or just them transferring their consciousness into a ready-made host. The monkey guy in Look to Windward is called out as having a form different from what his genes suggested.

And from the hardware side of things, Excession also touches on those ideas of self and continuity with an Elencher drone whose ship comes under a swift physical/memetic attack.

Seaside Loafer
Feb 7, 2012

Waiting for a train, I needed a shit. You won't bee-lieve what happened next

General Battuta posted:

(there's no 'you-robot', just two yous, and there is transference of consciousness to the extent that there are now two consciousnesses diverging from the moment the scan is made)
Exactly. If it were possible to copy a complete neural network, the very instant it was done the two would diverge because of quantum effects on whatever medium it's stored on. So one second later you have a consciousness very close to the original but already becoming different.

So if you don't believe in the soul, you have just made a different entity.

If you do, well, that's a whole kettle of fish I'm not interested in.

Seaside Loafer fucked around with this message at 18:25 on Jan 14, 2014

RoboChrist 9000
Dec 14, 2006

Mater Dolorosa
The Bliterator is from Surface Detail, not Matter. Surface Detail, unsurprisingly, deals a fair bit with death and mind-states. There's the Chay character, after all.

Barry Foster
Dec 24, 2007

What is going wrong with that one (face is longer than it should be)

Seaside Loafer posted:

The viewpoint that the mind is just a biological computer is the hard-reductionist position (http://en.wikipedia.org/wiki/Reductionism). In this view your neural pathways define everything you do, and though you think you have free will, you don't. In this view the concept of free will is a self-sustaining myth, defined by the mind to keep us all from going bonkers; if you woke up every morning thinking you didn't have any free will it would be a rather depressing world to live in, wouldn't it? But if it feels like free will, it doesn't really matter, does it?

(Personally I believe that to a point, but I also think quantum effects on the neural net chuck in a bit of randomness.)

From what I understand, quantum effects don't occur on the scale of our neurons/synapses/electrical signals/etc, so that particular argument for the possibility of free will doesn't really hold water (I might be wrong, and would be very interested if anyone knows more about this). Although either way, whether it's absolute determinism or randomness, either one could be said to preclude free will. It's not like you get to choose the dice rolls, after all...

General Battuta
Feb 7, 2011

This is how you communicate with a fellow intelligence: you hurt it, you keep on hurting it, until you can distinguish the posts from the screams.
Yeah, quantum effects have no bearing on the question of free will, since even if they do alter your nerves' action potentials the influence would be random. The best way to preserve free will in a monist, causally closed universe is to think of yourself as an RPG character: your actions are a product of your particular internal states and history. They may be determined but they're determined by traits unique to you and contingent on your past.

Taeke
Feb 2, 2010


General Battuta posted:

Yeah, quantum effects have no bearing on the question of free will, since even if they do alter your nerves' action potentials the influence would be random. The best way to preserve free will in a monist, causally closed universe is to think of yourself as an RPG character: your actions are a product of your particular internal states and history. They may be determined but they're determined by traits unique to you and contingent on your past.

Exactly. The way I see it, ultimately the decisions you make might be predetermined by factors outside of your control but that doesn't change the fact that they still have to be made and that you are the agent making them. The very fact that the predetermined outcome is unknown/unknowable to you until the moment you act makes it worthwhile.

To reduce it to a very simple analogy: it's kind of like watching a movie. The plot and outcome might be set but that doesn't detract at all from the enjoyment of watching it.

Shelvocke
Aug 6, 2013

Microwave Engraver
Whoops, yes, it was Surface Detail.

General Battuta posted:

Fortunately for her, the 'copy' is a valid fork of her identity from the moment the copy was made, so she can choose to view herself as the duplicate and the original as safe. It's just as fallacious a position (both forks, the Bliterator pilot and the transmitted backup are 'the real her'), but at least she can focus on the fact that one of her is going to live.

(there's no 'you-robot', just two yous, and there is transference of consciousness to the extent that there are now two consciousnesses diverging from the moment the scan is made)

I wouldn't want there to be a backup were I a Culture citizen, unless I had family/friends who would be genuinely miserable were I not to come back.

From my own selfish point of view, this particular bag of meat that I'm driving will cease to exist, along with the chemical reactions going on in my brain. Just as I don't care what happens to my body after I die - bury it, burn it, chuck it in a bin - I don't really care about the me-copy that would go on living, beyond basic human compassion, because I won't continue to live on except from the perspective of the new me and the people around me. F that guy, for jumping into my shoes!

General Battuta
Feb 7, 2011

This is how you communicate with a fellow intelligence: you hurt it, you keep on hurting it, until you can distinguish the posts from the screams.

Shelvocke posted:

Whoops, yes, it was Surface Detail.


I wouldn't want there to be a backup were I a Culture citizen, unless I had family/friends who would be genuinely miserable were I not to come back.

From my own selfish point of view, this particular bag of meat that I'm driving will cease to exist, along with the chemical reactions going on in my brain. Just as I don't care what happens to my body after I die - bury it, burn it, chuck it in a bin - I don't really care about the me-copy that would go on living, beyond basic human compassion, because I won't continue to live on except from the perspective of the new me and the people around me. F that guy, for jumping into my shoes!

I don't know if it'll change your mind, but there is a key factual mistake here - you will keep on living, since one of the forks the backup copy produces will be you in every way, including your interior subjectivity, your qualia, the indefinable first-person you. But another one of you, with all the same credentials to be 'the real you', will die. You're automatically assuming the perspective of this fork, as if the backup is in your local past and the other fork has, in your words, 'jumped into your shoes'. But if the backup's yet to come, in your subjective future, well, it's a very different matter - you get the security of knowing that one causal descendant of you will live.

The divergence occurs at the exact moment of the backup. If you're interested I can walk you through the thought experiments that demonstrate why both forks are 'the real you', and why it's fallacious to talk about one as the copy and one as 'me'.

Or, in other words: your post is completely valid from the perspective of the fork who's about to die, but it's not valid from the perspective of you before the backup, or the other fork who remains after the backup. Maybe majority rules :v:

Strategic Tea
Sep 1, 2012

Or perhaps consciousness as we think of it is an illusion. Billions of individual, discrete mindstates flipping past like a flipbook animation. If you took away one flipbook and replaced it with an identical one, and did it faster than the time between individual images, there wouldn't be any difference to the picture. In a way, we die thousands of times every second as the brain changes - but consciousness isn't a thing, it's a process. The survival of the process is the survival of the consciousness.

Shelvocke
Aug 6, 2013

Microwave Engraver

General Battuta posted:

I don't know if it'll change your mind, but there is a key factual mistake here - you will keep on living, since one of the forks the backup copy produces will be you in every way, including your interior subjectivity, your qualia, the indefinable first-person you. But another one of you, with all the same credentials to be 'the real you', will die. You're automatically assuming the perspective of this fork, as if the backup is in your local past and the other fork has, in your words, 'jumped into your shoes'. But if the backup's yet to come, in your subjective future, well, it's a very different matter - you get the security of knowing that one causal descendant of you will live.

The divergence occurs at the exact moment of the backup. If you're interested I can walk you through the thought experiments that demonstrate why both forks are 'the real you', and why it's fallacious to talk about one as the copy and one as 'me'.

Or, in other words: your post is completely valid from the perspective of the fork who's about to die, but it's not valid from the perspective of you before the backup, or the other fork who remains after the backup. Maybe majority rules :v:

I understand the premise, but as for the perspective of me before the backup, who cares about that? The me that gives a crap is the one about to die; when the biological processes stop and the lights go out, that's it. I don't get time after that to think, oh boy, it sure is a good thing I had a copy made.

I don't care enough about my impact on the world, or maybe I'm gloomy enough to be realistic about it, to care whether even a direct copy of my mind state still exists. From the moment you are created, even not knowing if you're the copy or the original, you are a discrete, (almost) unique biological robot experiencing the universe. If the other one dies, he doesn't get to inhabit my head.

I don't think any amount of philosophy or quantum consideration will change the fact that I won't be any more or less reckless with my life if a copy exists.

General Battuta
Feb 7, 2011

This is how you communicate with a fellow intelligence: you hurt it, you keep on hurting it, until you can distinguish the posts from the screams.

Shelvocke posted:

I understand the premise, but as for the perspective of me before the backup, who cares about that? The me that gives a crap is the one about to die; when the biological processes stop and the lights go out, that's it. I don't get time after that to think, oh boy, it sure is a good thing I had a copy made.

I don't care enough about my impact on the world, or maybe I'm gloomy enough to be realistic about it, to care whether even a direct copy of my mind state still exists. From the moment you are created, even not knowing if you're the copy or the original, you are a discrete, (almost) unique biological robot experiencing the universe. If the other one dies, he doesn't get to inhabit my head.

I don't think any amount of philosophy or quantum consideration will change the fact that I won't be any more or less reckless with my life if a copy exists.

You're not killing yourself, so you clearly don't mind continuing to exist (please don't take this as a suggestion :ohdear:), which means that on some level you do care about whether a direct copy of your mind state still exists - you're gonna be alive in five minutes, right? That's a valid causal descendant of your current mindstate.

But you're right that the backup process won't save an individual fork about to die. If your concern is never allowing any valid causal descendant to die then it won't do anything for you.

Strom Cuzewon
Jul 1, 2010

Seaside Loafer posted:

The viewpoint that the mind is just a biological computer is the hard-reductionist position (http://en.wikipedia.org/wiki/Reductionism). In this view your neural pathways define everything you do, and though you think you have free will, you don't. In this view the concept of free will is a self-sustaining myth, defined by the mind to keep us all from going bonkers; if you woke up every morning thinking you didn't have any free will it would be a rather depressing world to live in, wouldn't it? But if it feels like free will, it doesn't really matter, does it?

(Personally I believe that to a point, but I also think quantum effects on the neural net chuck in a bit of randomness.)

e: I've just done a philosophy of AI module and could post some good reading on it if anyone's interested.

Isn't this more monism/physicalism? I never really relate reductionism vs emergence to free will.

But on the other hand, after a lecture on epiphenomenalism I became convinced that we don't actually have thoughts, we just imagine that we do, so my understanding of this is shaky at best.

Seaside Loafer
Feb 7, 2012

Waiting for a train, I needed a shit. You won't bee-lieve what happened next

Strom Cuzewon posted:

Isn't this more monism/physicalism? I never really relate reductionism vs emergence to free will.

But on the other hand, after a lecture on epiphenomenalism I became convinced that we don't actually have thoughts, we just imagine that we do, so my understanding of this is shaky at best.
I'm not sure, man. It was a very interesting module but there were more 'isms' than I could keep track of!

This was the reading list if any of you guys want to check them out:

Blay Whitby (2003), A Beginner's Guide to Artificial Intelligence
Harnish, R.M., Minds, Brains, Computers: An Historical Introduction to the Foundations of Cognitive Science
Green, D.W. et al, Cognitive Science: An Introduction
Churchland P.M. (1988) Matter and Consciousness: A Contemporary Introduction to the Philosophy of Mind (Revised Edition), MIT

Although I really liked this one:
Michael Negnevitsky, Artificial Intelligence, A Guide To Intelligent Systems. (gently caress load of hard maths in there though, it's more about design than philosophy, but there are plenty of good bits)

Daktar
Aug 19, 2008

I done turned 'er head into a slug an' now she's a-stucked!

Tony Montana posted:

No poo poo. I've only read UoW, there are other civilizations more advanced than The Culture? So using humans with humans is political as much as anything else. Mind blown, dudes, cheers :)

There's a bit in Matter (I think) where it's said that technological development is more like a cliff face than a ladder. Different civilisations get to the top in different ways. For a very early example, a species evolving on an ice planet might never develop the wheel in favour of sledges. The top civilisations in the galaxy are generally all equally powerful, technologically speaking, but it pays to be cautious because you never know whether your neighbour is going to blindside you with some tech you never even thought of because of your developmental history.

MeLKoR
Dec 23, 2004

by FactsAreUseless
On the other hand didn't someone mention in a book how the top tier civilizations have more or less the same kind of tech? Makes sense that at the end of the search for the most powerful weapons everybody would reach the same things.

the fart question
Mar 21, 2007

College Slice

MeLKoR posted:

On the other hand didn't someone mention in a book how the top tier civilizations have more or less the same kind of tech? Makes sense that at the end of the search for the most powerful weapons everybody would reach the same things.

In Hydrogen Sonata, the Culture and the other guys are supposed to be at the same developmental level, but the Culture's experience in the interstellar war with the Idirans and their propensity for exploring everything gave them a significant edge.

Strategic Tea
Sep 1, 2012

Although of course Excession kind of proves the Involved wrong when they talk about being at the civilisation cruise stage. There's a way between universes and none of them have the slightest clue how it works. Not to mention the Sublimed's attitude that a Culture-level civ has already done everything and should politely retire. There's a ton more out there. Who's being immature now :smug:?

sebmojo
Oct 23, 2010


Legit Cyberpunk

The pen-and-paper RPG Eclipse Phase allows backing up and beaming of consciousness, and managing the mental strain from this is a big part of the game.

It's a bit clunky in the mechanics but otherwise brilliant, and it's available for free download under a Creative Commons license, so worth checking out.

Strom Cuzewon
Jul 1, 2010

Seaside Loafer posted:

I'm not sure, man. It was a very interesting module but there were more 'isms' than I could keep track of!

This was the reading list if any of you guys want to check them out:

Blay Whitby (2003), A Beginner's Guide to Artificial Intelligence
Harnish, R.M., Minds, Brains, Computers: An Historical Introduction to the Foundations of Cognitive Science
Green, D.W. et al, Cognitive Science: An Introduction
Churchland P.M. (1988) Matter and Consciousness: A Contemporary Introduction to the Philosophy of Mind (Revised Edition), MIT

Although I really liked this one:
Michael Negnevitsky, Artificial Intelligence, A Guide To Intelligent Systems. (gently caress load of hard maths in there though, it's more about design than philosophy, but there are plenty of good bits)

According to Wikipedia, Churchland doesn't believe in thoughts. Either that's some heinous strawmanning, or he's gone off the deep end of hyper-reductionism. Either way, I'm gonna check his book out, thanks!

OtherworldlyInvader
Feb 10, 2005

The X-COM project did not deliver the universe's ultimate cup of coffee. You have failed to save the Earth.


Tony Montana posted:

I won't be too quick to judge UoW.. I will certainly let it settle and then read through the thread.

But.. I just want to put this out there. I just devoured Starship Troopers. I couldn't put it down. Why do people rag on Heinlein so much? I think he's fantastic.. some of the moral and ethical discussion in the book I just read rates right up there with anything I've read or seen in the genre. The action is fast and frantic, truly exciting and doesn't feel fake. The technology is mindblowing and so much fun.. but still within the realms of a reality we have some connection with. To think this book was published in '59 really blows the mind.

Banks' exploration of AI was brilliant and I've never encountered a viewpoint like that; I am certainly richer for it. To say UoW isn't about scifi pew pew and more about a man and his journey.. you could say exactly the same thing about Troopers.

Why do people say Heinlein is a lovely writer? Is it because books like Troopers obviously display some dated viewpoints? I just see those for what they are.. even revel in some of the upfront honesty in how people thought in those times. At no point, at any point in Troopers did I think 'hrm, that was a bit clumsy'. He got pretty militarily specific toward the conclusion, but he was a military man and it showed the contrast between a grunt's thought process and how much more is going on in the head of a commander.

There is some really relevant deeper thought in there.. about the society they live in and how it relates to us. He even makes direct comparisons which ring surprisingly true now so many decades after it was written.

As you all seem to be a step above the groupthink and parroting circlejerk (the way in which you answered my initial posts about UoW) - tell me why Heinlein is considered to be anything less than masterful.

For me (you're certainly free to disagree), it's because the philosophy in Banks' Culture novels is a hell of a lot more ambiguous and open to criticism. From what I remember of Troopers, critics of Heinlein's philosophy in his books tend to exist only to be shot down. Authors have biases, that's fine, but a big measure of them is how they acknowledge those biases in their writing. Obviously the Culture has aspects which reflect Banks' own political beliefs, and yet the main character of the very first Culture novel is someone who's decided to fight against them in a war. He's not an ignorant child to be converted and taught, or a straw man to be torn down and proven wrong. Throughout the book he levies criticisms against the Culture which are perfectly valid, even if you believe the Culture are "the good guys". It doesn't end there, either: throughout the entire series I've read so far, Banks is constantly asking the reader "is the Culture really utopia?", and that ends up being a really complex question people can have disagreements about. By comparison, the political philosophy in Troopers is almost a Randian-style soapbox speech. I liked it when I read it at something-teen, but largely for the space battles, because the political messages the book conveys can very easily be interpreted as fascist propaganda.

Anyway, I'm just now finishing up Excession. It's great to see the Culture finally face something even their near god-like Minds are completely baffled by. Though I feel like the premise (of the Culture actually facing a real existential crisis) was never explored as deeply as it could have been; everything got wrapped up a bit too quickly.

Taeke posted:

Isn't there a thought experiment where you are duplicated but it isn't revealed (or even possible to tell) which one of you is the original and which one is the copy? Effectively such questions become meaningless, and the more I think about it, the less special everything becomes. The same goes for the question of free will, and while I do have a rather bleak outlook on these things, I take solace in the fact that while every choice I make might be predetermined, the important thing is that the choice still has to be made and that it's my experience that matters. Still, nihilism is always just around the corner.

Alastair Reynolds' House of Suns has precisely this. The main characters are part of a group of 999 clones and one original. As all of them have exactly the same memories leading up to the divergence, not even the original knows if they're really it or a clone. By the time the book takes place, none of them seem particularly interested in the distinction (if it actually exists) anymore, and the author never makes any attempt at answering which one is "the original".

Pompous Rhombus
Mar 11, 2007
Found a copy of Raw Spirit in the clearance bin at the bookstore yesterday. It's basically Iain Banks going around Scotland drinking/writing about scotch.

To say I'm excited to read about so many of my favorite things at once would be an understatement!

andrew smash
Jun 26, 2006

smooth soul
For those of you who haven't seen Her, you should; it reminded me of Banks in that it's basically the story of a mind being born, framed as an oddball love story. It also has some really interesting near-future speculation sprinkled in the background.

Fragmented
Oct 7, 2003

I'm not ready =(

Thanks. The movie is a little slow but it's a Sunday edible day, so it's perfect.

-edit: Those were baby Minds at best. And the fuckers sublime right away. Still an OK movie.

Fragmented fucked around with this message at 01:10 on Feb 17, 2014

andrew smash
Jun 26, 2006

smooth soul

Fragmented posted:

Thanks. The movie is a little slow but it's a Sunday edible day, so it's perfect.

-edit: Those were baby Minds at best. And the fuckers sublime right away. Still an OK movie.

I considered the OSes to be roughly human-level AIs, and the event at the end of the movie to be the creation of a group-consciousness proto-Mind. I'm pretty sure some of the Culture stuff mentioned a similar process in early Culture history.

fookolt
Mar 13, 2012

Where there is power
There is resistance
http://io9.com/the-final-iain-banks-book-will-be-published-next-februa-1525319337

I had no idea there was one last thing. I'm still feeling gutted :(

GeneralZod
May 28, 2003

Kneel before Zod!
Grimey Drawer
For UK goons: "The Crow Road" is currently available for £0.99 via Amazon's Kindle Daily Deal.

Those On My Left
Jun 25, 2010

Re-reading Use of Weapons for the first time, and one thing I'm noticing in the alternate roman-numeral 'backwards' chapters is that they seem to be a chronicle of Zakalwe loving up and experiencing serious pain. There's the balls-up at the Winter Palace, the near-death in the caldera, the beheading, and that whole thing with the crying hysterical woman he has tied to the chair.

Nothingtoseehere
Nov 11, 2010


On the discussion about back-ups, there is also the case of Quitrilis in Matter, who backs up before he leaves home and goes travelling, but then dies when he accidentally discovers that most of the Oct fleet is a hologram (by crashing into one). He himself says it's a proper death, as he had changed massively in the 500 days since he left home.

Gravitas Shortfall
Jul 17, 2007

Utility is seven-eighths Proximity.


There's also a Culture tradition (Law?) that a copied mind-state that's been operating independently for a certain amount of time is considered its own entity, and has to be asked whether or not it wants to be deleted/merged with the original.

It comes up very briefly in Hydrogen Sonata. I think it's Tefwe that sends out copies of herself to do things.

Liveware
Feb 5, 2014

Gravitas Shortfall posted:

There's also a Culture tradition (Law?) that a copied mind-state that's been operating independently for a certain amount of time is considered its own entity, and has to be asked whether or not it wants to be deleted/merged with the original.

It comes up very briefly in Hydrogen Sonata. I think it's Tefwe that sends out copies of herself to do things.

I thought the original was asked if the instance should be integrated?

Gravitas Shortfall
Jul 17, 2007

Utility is seven-eighths Proximity.


FrozenDorf posted:

I thought the original was asked if the instance should be integrated?

I'm pretty sure the original asked the instance if it wanted to be integrated, though I don't have the book on me to check.

the fart question
Mar 21, 2007

College Slice

FrozenDorf posted:

I thought the original was asked if the instance should be integrated?

All the instances spent so much time apart that they all had to be asked. They all went to do their own thing if I remember correctly.

Liveware
Feb 5, 2014

gender illusionist posted:

All the instances spent so much time apart that they all had to be asked. They all went to do their own thing if I remember correctly.

Gravitas Shortfall posted:

I'm pretty sure the original asked the instance if it wanted to be integrated, though I don't have the book on me to check.

I took a brief gander, and the only passage I found relevant was in Tefwe's first scene:

quote:

And a no-constraints chance to negotiate over subsequent re-integration, just me and it, or them.

The debrief scene just showed her reviewing the memories of her two instances as they were streamed live, while said instances were away on different ships.

I think I may have conflated it with Byr's uncle Tishlin in Excession, who said:

quote:

They asked if I wanted it to be reincorporated after it had done its job. They said it would be sent back and sort of put back inside my head, but I said no. Gave me a creepy feeling just thinking about it.

Both imply the original has a fairly significant amount of control over reintegration, but to different degrees.

Well, anyway. I apologize for sidetracking.

Liveware fucked around with this message at 02:46 on Mar 5, 2014

andrew smash
Jun 26, 2006

smooth soul
For those interested in this topic the excellent Ancillary Justice explores it in a fascinating if initially confusing way.

Fragmented
Oct 7, 2003

I'm not ready =(

I wonder what the biggest battle in the Idiran War would have been like. We only saw skirmishes in Consider Phlebas. The numbers on Wikipedia are insane. I'm thinking a hundred thousand warships on each side. Gridfire popping everywhere. How long would the fight take? Four seconds? Five? War loving sucks.


MikeJF
Dec 20, 2003

Well, we know that in one of the final large battles, the Lasting Damage blew two stars supernova.
