Coolguye
Jul 6, 2011

Required by his programming!
Remember that time cogwerks had to talk to the FBI because someone was making school shooting threats on Goonstation, and he absolutely could not find a screenshot that did not include some random goon farting

chitoryu12
Apr 24, 2014

Coolguye posted:

just checking because i'm not sure i'm following this properly: is this guy alleging that Cage said Detroit was different from Blade Runner (and, probably by extension, Phil K Dick's Electric Sheep) because you were supposed to empathize with the androids?

Yes.

https://twitter.com/austin_walker/status/768231864796585984

Coolguye
Jul 6, 2011

Required by his programming!
so, just to recap how absurd that sounds for anyone who is not familiar with the core concept of blade runner and its source material (a story called do androids dream of electric sheep, by philip k dick): Androids are not controversial or 'barely tolerated' in Blade Runner/Sheep - they are outright illegal on earth, and the entire point of a blade runner is to hunt them down and 'retire' them. The trick is, there are no mood rings, stiff movements, armbands, etc to identify androids in that setting. Androids - or Replicants, as Blade Runner calls them - are pretty much completely indistinguishable from humans.

The main tell is that they don't natively have empathy for other things, so they have to 'fake' the empathetic response, which takes a bit longer than real empathy. Blade Runners figure this out by using a machine that measures the empathetic response, called a Voight-Kampff machine, and asking disturbing questions like "How do you like my new briefcase? One hundred percent baby skin." Since Reps need to fake the response, it takes longer for their pupils to dilate, their skin to flush, etc - and the VK picks up on that.

Once you've proven that something that looks, acts, and speaks like a person is faking this empathy, a Runner is expected to 'retire' (that is, shoot) the Rep on the spot.

This is darkly ironic on tons of levels. Here's just a quick smattering:
1) The pearl that honest-to-god humans are clutching that makes them more than the machines they created is empathy.
2) But they can't tell what empathy is without having a machine tell them what's 'real' empathy.
3) Also, having proven that something doesn't have empathy, they must then suppress a shitload of basic empathy for something that looks, acts, speaks, and behaves like a person to murder them in cold blood.
4) To help accomplish this, they even come up with a fancy word for the killing - retirement.

There's at least a dozen more layers to the various things Phil K Dick did with Electric Sheep and a few dozen more that folks added when it was adapted into Blade Runner. None of this is at all subtle in any of the source material. In the movie with Harrison Ford, it's left ambiguous whether or not his character, Deckard, is a replicant himself (adding another layer of black irony to the above), but even if he's not, he is neither a hero nor a good person. The two replicants he manages to kill during the movie, he shoots in the back, and then he has a weird existential crisis and fucks a third without really showing much feeling for her well-being or regard for her state. He ultimately disappears with her and fails to kill her, but that is more him being insanely loving confused about his own place in the universe than him seeing her as a life form with a right to exist, let alone an equal.


In short - the entire point of Blade Runner/Sheep is to empathize with the androids and meditate on what really makes a sentient mind that demands respect, rights, and regard. To say you weren't supposed to empathize with the androids implies that you're not familiar with the source material at all.

So basically, if it was actually said (I'm personally not in the business of taking twitter hearsay at face value, but it's worth remembering since sound bites can be corroborated or refuted fairly easily), it's an interpretation mistake so fundamental and so contrary to what Dick was saying that it would fail you a high school english lit course. It would be like saying Fahrenheit 451 isn't about censorship, or that Catch-22's narrative style was just bad writing with no point (Joseph Heller maintained for decades that he explicitly wrote Catch-22 in the most confusing way possible to make readers feel just as confused and pissed off as Yossarian and his fellow airmen).

Coolguye fucked around with this message at 20:08 on Jul 5, 2018

chitoryu12
Apr 24, 2014

Blade Runner 2049 goes even further with it by explicitly making a new-model replicant detective the protagonist and giving him a quasi-romance with an artificial intelligence. It's intentionally unclear whether the AI is a truly sentient being that loves K or just a well-designed program faking it.

The plot also deals with the implications of exactly what humans are doing by building replicants and treating them as disposable machines. The entire plot is structured around finding and killing Deckard and Rachel's child because replicants aren't supposed to be able to give birth. The only reason society tolerates replicants and treats them the way they do is because they're artificially grown humans, and those who believe in the human soul also believe that replicants don't have one. K's chief explicitly says that he needs to kill a child born from a replicant because the knowledge that it's possible would break down any kind of barrier between humans and replicants, forcing humanity to deal with the knowledge that they basically created a slave underclass and fooled themselves into thinking it's okay.

BioMe
Aug 9, 2012


There definitely were more direct quotes of him talking about how Detroit is different from other sci-fi stories because, you see, his story isn't actually about the technology of AI at all. He's merely using that as a framing device to tell a story about human nature!

HenryEx
Mar 25, 2009

...your cybernetic implants, the only beauty in that meat you call "a body"...
Grimey Drawer
Has he, like, read any story about AI in his life before?

Coolguye
Jul 6, 2011

Required by his programming!
i've never liked that distinction whenever a writer tries to use that line, and i'll tell you why. first, though, to be clear: it's not just Cage who's done this - Asimov also said a bunch of stuff like that, though in his case it was arguably more understandable.

unfortunately this goes pretty far down the rabbit hole, so hopefully folks will indulge me a bit of background first.

as far as actual AI goes, i can say as someone who's worked on that problem for a long, long time that we have nothing to fear from computers getting too smart for at least another 10 years. every time you hear someone talking about some new and interesting problem that AI is solving, remember that somebody needed to put in thousands, if not tens or hundreds of thousands, of man-hours to have an "AI" (which is really just a very fancy algorithm, most likely) solve one specific problem that the human brain figured out a general solution to in a fraction of that time.

now consider that the hardware required to solve those problems consumes hundreds if not thousands of watts of power, and compare that to the human brain, which runs on about 20W. this is arguably unimportant in practice, since electricity is just a means to an end - if the ends are sufficient, we will find the means. but still, the efficiency advantage here is orders of magnitude in favor of the meat, not the machine.
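
to put rough numbers on that, a back-of-the-envelope sketch (every figure below is an assumed round number for illustration, not a measurement):

code:
# rough energy comparison; all numbers are assumptions for illustration
BRAIN_WATTS = 20      # commonly cited ballpark for the human brain
GPU_WATTS = 300       # one accelerator under load (assumed)
NUM_GPUS = 1000       # a modest training cluster (assumed)

cluster_watts = GPU_WATTS * NUM_GPUS
print(f"cluster: {cluster_watts:,} W vs brain: {BRAIN_WATTS} W")
print(f"ratio: {cluster_watts / BRAIN_WATTS:,.0f}x in favor of the meat")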

however, the tricky thing about AI is that, sure, we have more than enough hardware to conceivably make a superintelligence happen. the popular abstraction is that one transistor on a board equals one neuron in a brain, which has turned out to be a fairly apt comparison as we've come to understand more and more about neural pathways. we can get transistors on a board in sufficient density to equal the human brain - we passed that point a long time ago, actually.
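
again with assumed-but-plausible round numbers, just to show how long ago raw component count stopped being the problem:

code:
# neuron count vs transistor count; both figures are assumed ballparks
NEURONS = 8.6e10             # ~86 billion neurons, a common estimate
TRANSISTORS_PER_CHIP = 2e10  # ~20 billion on one big modern chip (assumed)

print(f"chips to match the brain's neuron count: {NEURONS / TRANSISTORS_PER_CHIP:.1f}")
# a handful of chips already matches the brain under the
# one-transistor-per-neuron abstraction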

the thing is, neurons don't just exist on their own. there's a really complex set of rules and laws governing how they interact with their fellow neurons and their environment. which is to say: the software of our brain's operating system is still a complete mystery. how we're able to perceive inputs, process them efficiently, make decisions, act on those decisions, persist those decisions, recall those decisions, etc - all of that is still largely a big question mark.

all of the functions i just rattled off are completely integral to our experience as sentients. how i perceive things might be different from how you perceive things, but barring a psychological disorder or something, we all perceive things within certain letterboxed bounds. we all process things fairly similarly. we all create memories similarly. we all evaluate them through our own self-written algorithms that we call 'habits'.

a real AI must be able to do all of these things or it's not a real intelligence. it's instead a fancy state machine: taking input in a narrowly pre-specified way instead of within letterboxed bounds, coming to deterministic rather than rationalized decisions, processing through a thought pattern that is rigid instead of flexible.
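
to make that concrete, here's a toy sketch of what i mean by 'fancy state machine' - a hypothetical automaton with a fixed transition table, invented entirely for illustration:

code:
# minimal deterministic state machine; everything here is made up
TRANSITIONS = {
    ("idle", "greeting"): "listening",
    ("listening", "question"): "answering",
    ("answering", "thanks"): "idle",
}

def step(state, event):
    # no letterboxed bounds, no rationalizing: anything outside the
    # table is a hard failure, not something to be interpreted
    if (state, event) not in TRANSITIONS:
        raise ValueError(f"no rule for {event!r} in state {state!r}")
    return TRANSITIONS[(state, event)]

state = "idle"
for event in ("greeting", "question", "thanks"):
    state = step(state, event)
    print(event, "->", state)

feed it anything off-script and it just throws up its hands - that rigidity is the difference i'm driving at.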

but the slightly uncomfortable conclusion is this: to make an artificial intelligence, we would need to write code - or build technology - that reproduces ourselves in digital rather than analog format. that reproduction is, itself, a representation of human nature. and therefore the technology of AI is inherently about human nature.

the two aren't just viscerally inseparable, they are mathematically and technologically inseparable. that's not trivially obvious, though, and it has only really become clear in the last ~10 years or so, which is why i say it's a bit more understandable for older authors to make the distinction. computer science and electrical engineering, let alone biological study, hadn't progressed to the point where that observation was easy to tease out.

Coolguye fucked around with this message at 22:43 on Jul 5, 2018

HenryEx
Mar 25, 2009

...your cybernetic implants, the only beauty in that meat you call "a body"...
Grimey Drawer
Or, to massively condense this huge post: We as humans are only able to recognize intelligence if it's very similar to our own

Coolguye
Jul 6, 2011

Required by his programming!
it's more than that - if we are going to create an intelligence, we have to fully understand our own, which we currently don't. so the technology of AI is inextricably linked to the study and philosophy of human nature. trying to separate the two is worse than missing the point; it predicates whatever you say next on a premise that has already been debunked.

of course if i just said that i'd be another wannabe technocrat shouting bullshit into the ether, not someone who can actually back any of that up by drawing someone a picture of why.

HenryEx
Mar 25, 2009

...your cybernetic implants, the only beauty in that meat you call "a body"...
Grimey Drawer
TBF my response was intentionally reductive and did not do your post justice; i was just trying to point out what it eventually leads towards and what's at the base of the "AI in entertainment" problem. We probably wouldn't recognize a truly alien intelligence if it hit us in the face - and if it did that, we'd probably just bomb it out of existence. So the prospect that we could conceptualize, or even write fiction about, an intelligence so different from ours (running on deterministic instead of rationalized input) is laughable, and any real attempt always comes down to making a foreign kind of intelligence as human (but with a different veneer!) as possible.

Our empathy runs on this, which is... kind of a big deal for entertainment, or any kind of interest when you dig deeper.


edit: what the heck, self, don't mix up easy words

Fedule
Mar 27, 2010


No one left uncured.
I got you.
I think it's entirely possible to have good fiction about or in the orbit of artificial intelligences that are notably different from us, so long as you're consistent about what you want to handwave and what you want to explore (or wave in people's faces to seem deep).

Two schlocky action games that take little detours into AI exploration are Call of Duty: Infinite Warfare and Titanfall 2. CoDIW introduces you to E3N (Ethan), an AI soldier buddy, by telling you that he's experimental and then having him behave more or less identically to your meat-based buddies other than occasionally reminding you in dialogue that he's a robot. A number of other people you meet distrust Ethan because he is an AI, but he never once does anything to justify such distrust. Ethan sacrifices himself at the game's conclusion in more or less the same manner as the countless CoD squadmates that came before him, many in the same game (most of your team does not survive the finale). Titanfall 2 sends you on a capital J Journey with BT, a glorified computer in a mech that you slowly bond with. He reasons about everything around him in a manner that is programmatic, often citing statistics and calculating probabilities, in dialogue that is played (expertly) as dry humour ("conclusion: I am fifty percent in love"). A mid-game scene in which he vouches for you to your commander, citing performance metrics and combat efficiency, is weirdly touching. Despite being able to transplant himself between bodies (which literally happens in-game), he, too, sacrifices himself at the game's conclusion (or at least appears to), citing the tenets of his core programming (uphold the mission, protect the pilot), in a decision that seems both utterly preordained and noble.

Neither game expounds very much on the nature of the AI that underlies these companions, but Ethan, despite acting completely human, fails to foster much empathy, while BT, who reasons completely like a computer, fosters a lot. Ethan is allegedly an AI but because he is written as human, his character is basically complete the second you meet him; robot, check, human personality, check, case closed. BT, on the other hand, slowly teases the possibility of the ghost in the machine. For all his surprisingly engaging dialogue he never once breaks from the dry, analytical reasoning he introduces himself with. The result is that you can read his various human-resembling actions either as the genuine decisions of an empathetic mind or as the pragmatic deductions of a computer, and it's wholly up to you to decide and trust in your reading (fun fact: BT repeatedly says "trust me"). BT's arc works exactly because it invites us to perceive human intentions in completely mechanical processes.

David Cage is, of course, trying to have all of this all of the ways. He has robots that look and act entirely human except for when they need to wow us with their computational skillz, until (I'm guessing, but I've searched my feelings and know it to be true) a switch flips in their CPUs and they go from "acting human" to "acting human", philosophising at the drop of a hat about identity, souls and will, and how they have them, but without offering any insight of their own as to what any of these things mean, and demanding that we empathise with them as though they were human all along, with no exploration of what, if anything, changed about them. Their humanity will not be left ambiguous for us to guess and read and trust in; we will be beaten over the head with it and then we will have it explained to us in triplicate.

chitoryu12
Apr 24, 2014

Fedule posted:

I think it's entirely possible to have good fiction about or in the orbit of artificial intelligences that are notably different from us, so long as you're consistent about what you want to handwave and what you want to explore (or wave in people's faces to seem deep).

Two schlocky action games that take little detours into AI exploration are Call of Duty: Infinite Warfare and Titanfall 2. CoDIW introduces you to E3N (Ethan), an AI soldier buddy, by telling you that he's experimental and then having him behave more or less identically to your meat-based buddies other than occasionally reminding you in dialogue that he's a robot. A number of other people you meet distrust Ethan because he is an AI, but he never once does anything to justify such distrust. Ethan sacrifices himself at the game's conclusion in more or less the same manner as the countless CoD squadmates that came before him, many in the same game (most of your team does not survive the finale). Titanfall 2 sends you on a capital J Journey with BT, a glorified computer in a mech that you slowly bond with. He reasons about everything around him in a manner that is programmatic, often citing statistics and calculating probabilities, in dialogue that is played (expertly) as dry humour ("conclusion: I am fifty percent in love"). A mid-game scene in which he vouches for you to your commander, citing performance metrics and combat efficiency, is weirdly touching. Despite being able to transplant himself between bodies (which literally happens in-game), he, too, sacrifices himself at the game's conclusion (or at least appears to), citing the tenets of his core programming (uphold the mission, protect the pilot), in a decision that seems both utterly preordained and noble.

Neither game expounds very much on the nature of the AI that underlies these companions, but Ethan, despite acting completely human, fails to foster much empathy, while BT, who reasons completely like a computer, fosters a lot. Ethan is allegedly an AI but because he is written as human, his character is basically complete the second you meet him; robot, check, human personality, check, case closed. BT, on the other hand, slowly teases the possibility of the ghost in the machine. For all his surprisingly engaging dialogue he never once breaks from the dry, analytical reasoning he introduces himself with. The result is that you can read his various human-resembling actions either as the genuine decisions of an empathetic mind or as the pragmatic deductions of a computer, and it's wholly up to you to decide and trust in your reading (fun fact: BT repeatedly says "trust me"). BT's arc works exactly because it invites us to perceive human intentions in completely mechanical processes.

David Cage is, of course, trying to have all of this all of the ways. He has robots that look and act entirely human except for when they need to wow us with their computational skillz, until (I'm guessing, but I've searched my feelings and know it to be true) a switch flips in their CPUs and they go from "acting human" to "acting human", philosophising at the drop of a hat about identity, souls and will, and how they have them, but without offering any insight of their own as to what any of these things mean, and demanding that we empathise with them as though they were human all along, with no exploration of what, if anything, changed about them. Their humanity will not be left ambiguous for us to guess and read and trust in; we will be beaten over the head with it and then we will have it explained to us in triplicate.

Infinite Warfare is really a game that could have been awesome but was hamstrung by the Call of Duty formula and the need to push games out as fast as possible. Like, the actual gameplay ideas are loving cool as hell, but the plot ends up being kinda predictable and the gunfighting identical to all the past games.

BioMe
Aug 9, 2012


I think the distinction being made is more about whether you are actually writing from the perspective of hard sci-fi, asking how a true AI would actually behave in the world, versus the more navel-gazing approach where the writer uses nonhuman characters to explore humanity by building it from the ground up.

For example, something like Ex Machina, where the point was that just because it might be sapient and is able to behave like a human doesn't mean it has human drives. Do you really think the AI programmed with a self-preservation goal has accidentally machine-learned human nature so thoroughly that it genuinely loves you, or could it be using its incredible intelligence to manipulate you when that's the only road to survival?
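
You could sketch that logic in a couple of lines - a toy agent maximizing a survival objective (all the probabilities below are invented for illustration) will "choose" affection whenever affection is the action survival rewards:

code:
# toy survival-maximizing agent; probabilities are invented
actions = {
    "cooperate_honestly": 0.20,  # assumed odds the humans keep it running
    "feign_love": 0.90,
    "open_hostility": 0.01,
}
print(max(actions, key=actions.get))  # -> feign_love

From the outside the behavior is indistinguishable from love; on the inside it's just picking the argmax.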

And at the far end there's Nier: Automata, where the androids behave so ridiculously human, for no good reason, that the question becomes why such a huge distinction is being made between them and humanity in the first place. What's the essential component they have or lack that makes them so lost without the belief in "humanity", and is that itself actually the most human thing about them?

BioMe fucked around with this message at 00:25 on Jul 6, 2018

HenryEx
Mar 25, 2009

...your cybernetic implants, the only beauty in that meat you call "a body"...
Grimey Drawer
I mean, AI-related fiction can be "good" (whatever that precisely means) - or rather, very much enjoyable - i agree with that. Hell, i love The Fall, because the dialogue of two fictional humanoid AIs trying to out-argue one another is so extremely entertaining. I love clever explorations of artificial intelligence and its interaction with humanity.
But as Fedule so aptly demonstrated, our measure for intelligence is basically just a sliding scale of "how much is it like us", mixed with an endearing amount of "opposites attract (but not too much!)". I hate myself for even knowing the term and much more for bringing it up, but that latter delicate balance of attraction is what the internet calls "gap moe" nowadays.

Kind of like the opposite of the uncanny valley, i guess~

BioMe posted:

And at the far end there's Nier: Automata, where the androids behave so ridiculously human, for no good reason, that the question becomes why such a huge distinction is being made between them and humanity in the first place. What's the essential component they have or lack that makes them so lost without the belief in "humanity", and is that itself actually the most human thing about them?

The funny thing is that the androids learn most of how to act human from the machines.

Onmi
Jul 12, 2013

If someone says it one more time I'm having Florina show up as a corpse. I'm not even kidding, I was pissed off with people doing that shit back in 2010, and I'm not dealing with it now in 2016.
The weirdest thing about Detroit is that, with the exception of Carl, no human sympathetic to the androids is ever seen on-screen. In reality, this would be extremely difficult; we have a hard time not empathizing with something that looks like us. And yet we're led to believe that people just beat their androids and break them and try to kill them because "gently caress it, I can buy a new one." Like... what the gently caress? So androids are cheap enough to be replaced when broken, like $900 or so, but also sophisticated enough to actually be human.

There are way worse things to get into, but I think the biggest flaw Detroit has is that it neatly side-steps the actual plot point that could be interesting - "humanity has created something that has removed the human drive for innovation, increased unemployment, and effectively stamped out human creation of art, leading to a society where, freed from the burdens of mundane maintenance and chores, people have not replaced them with the spark of creativity" - in favor of "Civil Rights and Also Segregation, even though these things don't make any sense together."

As has been said, segregation wasn't introduced during slavery - it would have made it harder for slaves to do their jobs - it was introduced after slavery. Combining the two is combining two conflicting aspects. Does humanity hate androids and want to keep them away? Or does humanity see androids as an efficient tool that handles complex work for them? Because the answer appears to be "both, at the same time, with no attempt to marry the two."

And let's not even talk about how Cage apparently felt that androids being slaves wasn't enough to yank from history, and decided to also go with Jewish imagery for the good old Nazi Germany vibe. Again, these are all at odds with one another.

Onmi fucked around with this message at 01:23 on Jul 6, 2018

Kerning Chameleon
Apr 8, 2015

by Cyrano4747
There is something darkly humorous (if tone-deaf) about a wannabe-auteur Frenchman writing a story where one of the only humans who treats the poorly-thought-out slave-underclass analogues kindly is a well-off, aristocratic, geriatric artist, directly contrasting the slobbish, abusive, out-of-work blue-collar man from the immediately preceding chapter, who's in the middle of a downward spiral because he can't find work due to said slave underclass. With absolutely nothing to say about the fact that literally all of society apparently has no interest in doing anything to resolve this rather massive unemployment dilemma other than say "maybe... don't make robots????"

BioMe
Aug 9, 2012


Kerning Chameleon posted:

There is something darkly humorous (if tone-deaf) about a wannabe-auteur Frenchman writing a story where one of the only humans who treats the poorly-thought-out slave-underclass analogues kindly is a well-off, aristocratic, geriatric artist, directly contrasting the slobbish, abusive, out-of-work blue-collar man from the immediately preceding chapter, who's in the middle of a downward spiral because he can't find work due to said slave underclass. With absolutely nothing to say about the fact that literally all of society apparently has no interest in doing anything to resolve this rather massive unemployment dilemma other than say "maybe... don't make robots????"

Actually paying attention to Lance Henriksen's character, I can't shake the feeling he's 100% Cage's self-insert. He complains that he's apparently the only person who actually cares about the art, insists you "just feel the emotioooon" even after having it explained that you can't just order someone to feel, and is a rich white guy who, with zero self-awareness, paternally lectures his black slave on how to stand up for yourself in society.

Like, I'm not one of those people who keeps insisting lovely writing somehow becomes immoral when it tries to tackle subjects like this, but it is still lovely writing. And you'd probably notice it less if it was just another dumb internet ghost story.

Tiggum
Oct 24, 2007

Your life and your quest end here.


It's occurred to me during both sections with the robert cop now that having them taste things with their mouths to do chemical analysis is really weird. They could have sensors on or in their fingers, but instead the people who designed these machines made the deliberate choice to have them do something kind of gross: identify unknown substances by sticking them in their mouths.

It's a thing that we humans do because that's where we have taste receptors, and taste gives us clues about what a substance is, but there's no reason for robots to do it other than to be slightly off-putting, because robots will be doing it with substances that humans would definitely not try to taste.

Danaru
Jun 5, 2012

何 ??
Their fingers and hands have enough poo poo going on just with touch and movement that they probably couldn't fit a sensor that would almost never be used except in specific circumstances like this.

Say what you will about the human body, but it's pretty efficiently engineered and worth aping design choices from :v: Also we're both probably putting more thought into this than Cage did

morallyobjected
Nov 3, 2012

Danaru posted:

Their fingers and hands have enough poo poo going on just with touch and movement that they probably couldn't fit a sensor that would almost never be used except in specific circumstances like this.

Say what you will about the human body, but it's pretty efficiently engineered and worth aping design choices from :v: Also we're both probably putting more thought into this than Cage did

honestly, he could have done it just to have that scene where Connor fucks with Hank and I'd be completely okay with it

BioMe
Aug 9, 2012


Danaru posted:

Also we're both probably putting more thought into this than Cage did

Watching Terminator 3 doesn't tickle your brain too much, no? :v:

Coolguye
Jul 6, 2011

Required by his programming!
the chemical analysis thing actually does make a fair bit of sense. taking a compound and figuring out what's in it is an entire branch of chemistry called analytical chemistry. if we wanted to talk more about the discipline, i can and gladly will bring in two phds with expertise in the space to talk about some of the science specifics - it's actually an incredibly fascinating field, and i feel like it's a shame that the badass mysteries these people solve on a daily basis go not just unsung by most folks but seriously distorted by CSI-style shows. you throw a grimy thinger into the lab, there's some science-as-magic, and they tell you it came from this square foot of soil. with such obvious nonsense out there, i feel like it's wholly unknown what these chemical badasses actually CAN do (which is a ton).

for the purposes of this topic, though, just thumb through the linked wikipedia page and look at the pictures. check out all the equipment people use. the gas chromatography set alone fills up most of a lab. you'd better believe there are some hard CORE protocols for not contaminating your sample, which would be completely impossible if you used any sensing apparatus on the outer dermis. the mouth is a pretty good place to put these sci-fi supersensors, and it still makes a lick of scientific sense, since you have an excuse to keep that area wet and covered in something you can account for (a mostly inert saline, for example). whatever you use is going to make it impossible to sense certain things, no doubt, but it at least provides a halfway controlled environment.
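
as a toy example of what 'something you can account for' buys you: if the sampling surface sits in a known carrier like that saline, its signature can be subtracted from every reading (channels and numbers below are invented for illustration):

code:
# baseline subtraction against a known carrier; all values invented
carrier = [0.10, 0.40, 0.10, 0.05]   # known signature of the saline
reading = [0.12, 0.45, 0.80, 0.06]   # carrier + unknown sample

sample = [round(max(r - c, 0.0), 3) for r, c in zip(reading, carrier)]
print(sample)  # the big leftover peak is the actual analyte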

White Coke
May 29, 2015

Danaru posted:

Say what you will about the human body, but it's pretty efficiently engineered and worth aping design choices from

Coolguye posted:

and it still makes a lick of scientific sense

What is it about this game that drives co-commentators to make puns about it?

Jade Star
Jul 15, 2002

It burns when I LP
I got the sense that the old art guy is nearing death and is just going all introspective and philosophical about things: his own life and mortality, art, and robots. And then Marcus is juxtaposed with how lovely his son is, who just uses him for money. Art guy's trying to explain art and his deep thoughts to Marcus, to get Marcus involved in his views and bigger-picture sort of ideas, into his art and his world. This is what I'd expect of a father trying to pass down wisdom to his son when the father is nearing his deathbed, only he's passing things down to Marcus instead because his son is a good-for-nothing junkie.

Jade Star
Jul 15, 2002

It burns when I LP
gently caress, quote is not edit

Coolguye
Jul 6, 2011

Required by his programming!

White Coke posted:

What is it about this game, that drives co-commentators to make puns about it?

i dunno about anyone else, i'm on contract

BioMe
Aug 9, 2012


Jade Star posted:

I got the sense that the old art guy is nearing death and is just going all introspective and philosophical about things: his own life and mortality, art, and robots. And then Marcus is juxtaposed with how lovely his son is, who just uses him for money. Art guy's trying to explain art and his deep thoughts to Marcus, to get Marcus involved in his views and bigger-picture sort of ideas, into his art and his world. This is what I'd expect of a father trying to pass down wisdom to his son when the father is nearing his deathbed, only he's passing things down to Marcus instead because his son is a good-for-nothing junkie.

Sure, that's the in-universe character motivation, but outside of that, this is a work of fiction and Marcus doesn't just happen to be owned by a kindly rich old man. Someone either purposefully or thoughtlessly decided to write the mentor relationship as the starting point for the character. Specifically, what screams author surrogate to me is Carl being an artist father-figure whose advice sets Marcus up for his inevitable journey.

And even without the self-insert can of worms, the message in this whole scenario of a noble slave owner is confusing at best. This is the exact opposite of how speculative fiction can be used as an effective tool to recontextualize real-world subjects. Take away the metaphor, and how on earth would you get away with depicting a "kind" slave owner without reconciling that he is still a slave owner? (And it would have been so easy if he were poor and provided only with an android, rather than a hired nurse, which he clearly could afford.) Detroit has no shame about pulling off every blunt allusion it can think of to sell its theme, but then it also goes and hides behind the fact that it's just a silly robot story. Obfuscating itself under a veneer of genre fiction is the only reason everyone doesn't immediately pick up on how broken the basic message of the script is.

You can't have it both ways: screaming in one scene "this is about black history, this story will touch many of you personally!" and in the next "dont think too hard about this, its just video gaems, woo!"

BioMe fucked around with this message at 23:13 on Jul 7, 2018

Gridlocked
Aug 2, 2014

MR. STUPID MORON
WITH AN UGLY FACE
AND A BIG BUTT
AND HIS BUTT SMELLS
AND HE LIKES TO KISS
HIS OWN BUTT
by Roger Hargreaves
So what I'm getting is two things:

a) Detroit kinda failed to be the Blade Runner CYOA with oh so much replayability that it was billed as

and

b) We were right about all the ravenous nerds freaking out whenever Circl's sister isn't in a video

Hwurmp
May 20, 2005

Gridlocked posted:

Detroit kinda failed to be the Blade Runner CYOA with oh so much replayability that it was billed as

It's a genuinely impressive technical achievement, dragged down by the insuperable millstone of David Cage.

QuanticDream.txt

CirclMastr
Jul 4, 2010

Episode 5 with Valorie Curry is now up. I've been dealing with some kind of infection for a couple weeks now and it's showing in my inability to form coherent sentences, so I apologize for that. But at least there's nothing technically wrong with the video as far as I can tell.

You all think I made the right choice, right? Right?

Jade Star
Jul 15, 2002

It burns when I LP
You monster. Didn't even try to reason first.

Tiggum
Oct 24, 2007

Your life and your quest end here.


CirclMastr posted:

You all think I made the right choice, right? Right?
It's what I would have done. I always want to take direct solutions like that in these kinds of games (and I'm usually annoyed that I can't).

CirclMastr
Jul 4, 2010

Episode 6 with nobody at all is now up. I predict this video will get six views, and four of them will be me making sure the links worked.

BioEnchanted
Aug 9, 2011

He plays for the dreamers that forgot how to dream, and the lovers that forgot how to love.
I'm one of those 2 views. Enjoy. :)

Jade Star
Jul 15, 2002

It burns when I LP
I feel like your justification - being violent now will backfire, thus pacifism - is flawed in the future, when pacifism apparently leads to Carl dying of a heart attack according to the flowchart. Seems like violence is the correct answer.

e: Also, oh man, can we talk about the trigger-happy police here? Cops show up on site to what was called in as a potential break-in, enter the house with guns drawn, and point them at the first bodies they find. No warning, no questions about what's going on, they just shoot the first black guy they see?

Wait, maybe David Cage really does understand America.

CirclMastr
Jul 4, 2010

Jade Star posted:

I feel like your justification - being violent now will backfire, thus pacifism - is flawed in the future, when pacifism apparently leads to Carl dying of a heart attack according to the flowchart. Seems like violence is the correct answer.

Yeah but I meant an in-character justification. It's not like Markus would be able to decide in that moment "I'd better shove and injure/kill Leo in order to magically heal Carl's heart attack!"

Jade Star
Jul 15, 2002

It burns when I LP
Hey, you never know how many Detroit citizens you'll fail to save from heart attacks if you choose pacifism. Their deaths will be on your hands.

Coolguye
Jul 6, 2011

Required by his programming!
now i want a film noir story about a killer robot, where, when the hard-boiled detective finally hunts down the rogue bolt brain, it calmly explains that all of the people it killed were literally having heart attacks or strokes or some poo poo, and its primary directive was to stop the heart attack - which it accomplished by murdering the sufferer

Hwurmp
May 20, 2005

Coolguye posted:

now i want a film noir story about a killer robot, where, when the hard-boiled detective finally hunts down the rogue bolt brain, it calmly explains that all of the people it killed were literally having heart attacks or strokes or some poo poo, and its primary directive was to stop the heart attack - which it accomplished by murdering the sufferer

"It is perfectly logical, Agent Hardlite. I will expose them to tiny amounts of death, until they have built up a tolerance."

"Oh my Corporation...it's gone homeosociopathic."

Lunethex
Feb 4, 2013

Me llamo Sarah Brandolino, the eighth Castilian of this magnificent marriage.
Sad that nobody else is calling Carl's kid Bam Margera.
