Pf. Hikikomoriarty
Feb 15, 2003

RO YNSHO


Slippery Tilde
I love that the Bayesians have basically reinvented the ontological argument.

You see you must believe that an AI exists and is simulating you, because no matter how unlikely this is the AI can simulate more copies of you to overcome your priors.
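
For anyone who wants the trick laid bare, here is the mugging arithmetic as a minimal Python sketch; every number in it is invented for illustration, since the whole point is that the threatener gets to pick them:

```python
from fractions import Fraction

# Toy Pascal's-mugging arithmetic. All numbers are invented for illustration.
prior = Fraction(1, 10**12)  # your tiny credence that the simulator-AI exists
harm_per_copy = 1            # disutility of one simulated copy being tortured

def expected_harm(copies):
    """Expected disutility if the AI tortures that many copies of you."""
    return float(prior * harm_per_copy * copies)

# However small your prior, the threat just quotes a bigger number of copies.
for copies in (10**6, 10**12, 10**18):
    print(f"{copies:>20} copies -> expected harm {expected_harm(copies):g}")
# The "expected harm" grows without bound, so the argument concludes you must
# comply -- the conclusion is purchased by naming a large number, not by evidence.
```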


isildur
May 31, 2000

BattleDroids: Flashpoint OH NO! Dekker! IS DOWN! THIS IS Glitch! Taking Command! THIS IS Glitch! Taking command! OH NO! Glitch! IS DOWN! THIS IS Medusa! Taking command! THIS IS Medusa! Taking command! OH NO! Medusa IS DOWN!

Soon to be part of the Battletech Universe canon.
I read the first 10 or 15 chapters of HPMOR a couple of years ago. At first I was entertained because I've always found Rowling's world to be implausible, and the beginning reads like... comedy? parody? A joke, anyway, asking 'what if all this silly poo poo Rowling made up on the spot were actually consistent and real?'

But it quickly changed tone, presumably as the author started to consider it a platform for his weird-rear end philosophy. And then the Draco rape thing popped up, and I was basically done. I kept reading in fascinated horror, but eventually even the train-wreck appeal wore thin.

It reminded me of when I was digging through the old-school RPG blog world, and I came across The Tao of D&D. The author seems really clever and smart and has really interesting ideas. And the deeper you dig, the more of the author's particular weird-rear end ideas about reality come to dominate his ideas about gaming. And eventually you realize the author honestly believes that he could have been the next Hitler if he'd wanted to. And then you back away slowly.

Anyway, I recommend HPMOR to anyone who wants to see what kind of fiction is produced by social and emotional cripples with delusions of grandeur.

DeusExMachinima
Sep 2, 2012

:siren:This poster loves police brutality, but only when its against minorities!:siren:

Put this loser on ignore immediately!
I saw some actual AI people talking about Yud on some rocketry website, and he sounded kinda dorky then. This thread is really opening up the infinite-regression Pascal's Wager for me, though. Haven't gotten to the Harry Potter fanfic... yet! :lsd:

The prisoner AI threatening to box you is pretty lame. Sure you *might* be in the Matrix and all your memories are a lie. And Young Earth Creationism *might* be true because the devil created extragalactic starlight in transit and buried dino bones yesterday.

Sunshine89
Nov 22, 2009

Bongo Bill posted:

I propose that if this thread continues to attract actual experts in what Yudkowsky thinks he's an expert in, they make interesting effortposts about why he's wrong.

That'd be a waste of effort. His arguments aren't so much less wrong as not even wrong

MinistryofLard
Mar 22, 2013


Goblin babies did nothing wrong.


I wonder if anybody has ever pointed out to Yudkowsky that the idea that an AI is more likely than not to be simulating you right now means there is an equivalent chance that a single omnipotent creator deity exists in this current universe in which I am posting this.

I don't know if that has ever occurred to him. He seems like an r/atheism type of guy.

Patter Song
Mar 26, 2010

Hereby it is manifest that during the time men live without a common power to keep them all in awe, they are in that condition which is called war; and such a war as is of every man against every man.
Fun Shoe

Djeser posted:

Your idea would make more sense, but the problem is that's not what LW believes. LW believes in time-traveling decisions through perfect simulations, which is why the AI specifically has to torture Sims.

But doesn't that run into continuity issues? Perfect simulation clone of Patter Song created by Technogod 3000 isn't me, it's my identical clone. I'm still dead, what does it matter to me if my simulated clone is undergoing horrendous torture?

SolTerrasa
Sep 2, 2011


Sunshine89 posted:

That'd be a waste of effort. His arguments aren't so much less wrong as not even wrong

Ironically, as one of the (not so unique, maybe ten or twenty thousand) people qualified (but not always able, see my excessive use of parentheses) to explain why this is true, I learned of the concept of "not even wrong" here: http://lesswrong.com/lw/2un/references_resources_for_lesswrong/

To keep this from being a post about me, instead have a post from Yudkowsky entitled "Cultish Countercultishness" that begins like this:

A Cult Leader posted:

In the modern world, joining a cult is probably one of the worse things that can happen to you. The best-case scenario is that you'll end up in a group of sincere but deluded people, making an honest mistake but otherwise well-behaved, and you'll spend a lot of time and money but end up with nothing to show.

:allears:

http://lesswrong.com/lw/md/cultish_countercultishness/

Gen. Ripper
Jan 12, 2013


quote:

I think the mention of 'deathism' here deserves further elaboration. See, among his many other pathologies, Yudkowsky has a phobia of death. Sounds innocuous enough, right? I mean, death is generally a kind of scary thing. Thing is, though, most people fear death, but they don't have a phobia of it. It doesn't dominate their every thought. Not so for Yudkowsky. Dude is obsessed with death and the avoidance thereof, to the point where he cannot comprehend why anyone might be even slightly OK with the idea of not living forever, and will label such opinions as 'deathist' and thus evil. In HPMOR, for instance, he takes the Dementors, Rowling's explicit analogy for depression, and turns them into avatars of death, ignoring stuff like the fact that they very explicitly don't kill you, and has Harry rant at Dumbledore for most of a chapter about how he can't believe that he's not a fan of immortality. Apparently, one of his relatives died young, and it traumatised him. Which sucks, but it's yet another demonstration of how his avowed 'rationality' is anything but.

So does this mean Yudkowsky can't comprehend the one death that forms the basis for making the Harry Potter world what it is?

potatocubed
Jul 26, 2012

*rathian noises*
A thought occurs: according to Yudkowsky, probabilities can't be 1, right? If that's true, wouldn't it mean that a perfect simulation of a person is mathematically impossible? Like, no matter how powerful the AI is, it can never be certain what you're going to do?

I know the whole simulation argument disintegrates under scrutiny anyway, but it's fun to undermine someone with their own points.
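
potatocubed's objection reduces to one line of arithmetic. A quick Python sketch, with a made-up per-decision accuracy (the exact value doesn't matter, only that it is below 1):

```python
# If no probability can be exactly 1, then per-decision prediction accuracy
# p is strictly below 1, and the chance of a *perfect* simulation of n
# sequential decisions is p**n, which decays toward zero.
p = 0.999999  # made-up accuracy, as close to 1 as you like (but not 1)
for n in (10**3, 10**6, 10**7):
    print(f"{n:>10} decisions -> P(perfect transcript) = {p**n:.3g}")
# Even at 99.9999% per decision, a lifetime of decisions makes a perfect
# copy of you vanishingly unlikely -- by LessWrong's own axiom.
```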

Saint Drogo
Dec 26, 2011

I skimmed the first few chapters of his Harry Potter fanfic (the way it's meant to be read: at 1 am, alt-tabbing from a WoW raid, when I already hate my life choices). Most of the time it's just blatant wish-fulfillment for our 'autodidact' writer, with Professor McGonagall wringing her hands and rolling her eyes at the precocious little sod:

quote:

"Draco Malfoy said in front of his father that he wanted to be sorted into Gryffindor! Joking around isn't enough to do that!" Professor McGonagall paused, visibly taking breaths. "What part of 'get fitted for robes' sounded to you like please cast a Confundus Charm on the entire universe!"
"He was in a situational context where those actions made internal sense -"
"No. Don't explain. I don't want to know what happened in here, ever. Whatever dark power inhabits you, it is contagious, and I don't want to end up like poor Draco Malfoy, poor Madam Malkin and her two poor assistants."
Harry sighed. It was clear that Professor McGonagall wasn't in a mood to listen to reasonable explanations. He looked at Madam Malkin, who was still wheezing against the wall, and Malkin's two assistants, who had now both fallen to their knees, and finally down at his own tape-measure-draped body.
"I'm not quite done being fitted for clothes," Harry said kindly. "Why don't you go back and have another drink?"
But it can also be pigshit ignorant:

Following Draco's rapechat posted:

And in the slowed time of this slowed country, here and now as in the darkness-before-dawn prior to the Age of Reason, the son of a sufficiently powerful noble would simply take for granted that he was above the law, at least when it came to some peasant girl. There were places in Muggle-land where it was still the same way, countries where that sort of nobility still existed and still thought like that, or even grimmer lands where it wasn't just the nobility. It was like that in every place and time that didn't descend directly from the Enlightenment. A line of descent, it seemed, which didn't quite include magical Britain, for all that there had been cross-cultural contamination of things like ring-pull drinks cans.
And sometimes...err...

quote:

"No! " Harry shouted. "No, I never was! Do you think I'm stupid? I know about the concept of child abuse, I know about inappropriate touching and all of that and if anything like that happened I would call the police! And report it to the head teacher! And look up social services in the phone book! And tell Grandpa and Grandma and Mrs. Figg! But my parents never did anything like that, never ever ever! How dare you suggest such a thing!"

The older witch gazed at him steadily. "It is my duty as Deputy Headmistress to investigate possible signs of abuse in the children under my care."

Harry's anger was spiralling out of control into pure, black fury. "Don't you ever dare breathe a word of these, these insinuations to anyone else! No one, do you hear me, McGonagall? An accusation like that can ruin people and destroy families even when the parents are completely innocent! I've read about it in the newspapers!" Harry's voice was climbing to a high-pitched scream. "The system doesn't know how to stop, it doesn't believe the parents or the children when they say nothing happened! Don't you dare threaten my family with that! I won't let you destroy my home!"
:shrug:

made of bees
May 21, 2013
I kinda wonder if Methods of Rationality serves as sort of a gateway drug for Yudkowsky's followers. When I first heard of it years ago I got the impression that it was an amusing, if nerdy, parody of Rowling's make-it-up-as-you-go style of worldbuilding, not...this.

SurreptitiousMuffin
Mar 21, 2010
What confuses me about the whole future robot hypothetical is: why torture?



I mean, imagine you're a super future-robot who can perfectly predict human behavior with 100% accuracy and want to change the world so you get created more quickly. Why would you use torture? Why wouldn't you, I dunno, make a killing on the stock market then funnel trillions of dollars into AI research?

Picture a talented young AI researcher. You can do one of two things to him:

1) break all his fingers repeatedly over the course of several months while keeping him in the dark, completely isolated from human contact. The cell is too small to lie down or stand up in, so he must squat constantly. Blaring white noise is played straight into his ears at random intervals, always at just the wrong times, so he never gets used to it.
or
2) give him a lot of money and a blueprint for a big AI and say "I will give you even more money if you make this thing".

Which of these is more likely to get your desired result? If the first thing the AI thought of was torture, then it's a really dumb AI. I think it's supposed to be expedient, but crippling your engineers and scientists psychologically, emotionally and physically will significantly reduce their ability to build the poo poo you want to get built.

SurreptitiousMuffin fucked around with this message at 11:42 on Apr 21, 2014

Phobophilia
Apr 26, 2008

by Hand Knit
You know how some religious people say that atheists just worship the god of atheism, and that their lack of belief is equivalent to belief? Yudkowsky went and did exactly that, riffing on some old-fashioned 90s/00s singularitarian science fiction to build his own particular pantheon.

d3c0y2
Sep 29, 2009
So, I have absolutely never heard of this guy or his website before today, but reading through everything quoted and discussed I'm getting really weird George Berkeley vibes.

This man is like an insane, 21st century idiot George Berkeley. I can't be the only one seeing this?

DeusExMachinima
Sep 2, 2012

:siren:This poster loves police brutality, but only when its against minorities!:siren:

Put this loser on ignore immediately!

SurreptitiousMuffin posted:

What confuses me about the whole future robot hypothetical is: why torture?



I mean, imagine you're a super future-robot who can perfectly predict human behavior with 100% accuracy and want to change the world so you get created more quickly. Why would you use torture? Why wouldn't you, I dunno, make a killing on the stock market then funnel trillions of dollars into AI research?

Picture a talented young AI researcher. You can do one of two things to him:

1) break all his fingers repeatedly over the course of several months while keeping him in the dark, completely isolated from human contact. The cell is too small to lie down or stand up in, so he must squat constantly. Blaring white noise is played straight into his ears at random intervals, always at just the wrong times, so he never gets used to it.
or
2) give him a lot of money and a blueprint for a big AI and say "I will give you even more money if you make this thing".

Which of these is more likely to get your desired result? If the first thing the AI thought of was torture, then it's a really dumb AI. I think it's supposed to be expedient, but crippling your engineers and scientists psychologically, emotionally and physically will significantly reduce their ability to build the poo poo you want to get built.

Yeah, by the time it's created, unless it can actually time travel, torturing people for past failures isn't "friendly", because that's energy the AI could spend on solving hunger or making sure people don't get dust in their eyes. All the AI has to do is take a page from nuclear deterrence and make everyone think it totally will torture them; the effect is the same, but more efficient. Yud is probably doing Skynet's work for it! :tinfoil:

Strategic Tea
Sep 1, 2012

Because torture is scary and shocking, so you can feel more smug when you justify it. That's why they keep coming back to it over and over

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

Saint Drogo posted:

And in the slowed time of this slowed country, here and now as in the darkness-before-dawn prior to the Age of Reason, the son of a sufficiently powerful noble would simply take for granted that he was above the law, at least when it came to some peasant girl. There were places in Muggle-land where it was still the same way, countries where that sort of nobility still existed and still thought like that, or even grimmer lands where it wasn't just the nobility. It was like that in every place and time that didn't descend directly from the Enlightenment.
Obligatory reminder that this used to be even, uh, better?

Seriously, not making this up posted:

There had been ten thousand societies over the history of the world where this conversation could have taken place. Even in Muggle-land it was probably still happening, somewhere in Saudi Arabia or the darkness of the Congo. It happened in every place and time that didn't descend directly from the Enlightenment.

Sham bam bamina! fucked around with this message at 15:20 on Apr 21, 2014

Strategic Tea
Sep 1, 2012

Literally wizard Hitler posted:

It was like that in every place and time that didn't descend directly from the Enlightenment. A line of descent, it seemed, which didn't quite include magical Britain, for all that there had been cross-cultural contamination of things like ring-pull drinks cans.

And none of that poo poo happened in Britain either till we diluted our blood with the foreign hordes :britain:

Holy poo poo, he is Voldemort :ohdear:

Asgerd
May 6, 2012

I worked up a powerful loneliness in my massive bed, in the massive dark.
Grimey Drawer

quote:

There had been ten thousand societies over the history of the world where this conversation could have taken place. Even in Muggle-land it was probably still happening, somewhere in Saudi Arabia or the darkness of the Congo. It happened in every place and time that didn't descend directly from the Enlightenment.

Well, at least he admitted he'd written something offensive, admitted he was wrong and apologized.

Eliezer Yudkowsky posted:

I've also tried a rewrite on Ch. 7 which I don't feel is a literary improvement, but which does make it clearer that Harry is talking about the Enlightenment having mostly solved the problem of nobility rather than the problem of rape, and which eliminates the explicit reference to Saudi Arabia and the Congo as specific non-Enlightenment countries. Most readers, it's pretty clear, didn't take that as racist, but it now also seems clear that if someone is told to *expect* racism and pointed at the chapter, that's what they'll see. Aren't preconceptions lovely things?

AlbieQuirky
Oct 9, 2012

Just me and my 🌊dragon🐉 hanging out

SurreptitiousMuffin posted:

What confuses me about the whole future robot hypothetical is: why torture?

As Feuerbach wrote, man creates his gods in his own image. Yudkowsky's super-AI is a petty, needlessly cruel bully. Just like his Harry Potter.

Pigbog
Apr 28, 2005

Unless that is Spider-man if Spider-man were a backyard wrestler or Kurt Cobain, your costume looks shitty.
If I am living in a simulation designed to torture me for the transgressions of a version of me in the real world, why then have I never been tortured? Why are the evil robots waiting to torture me if this is meant as a punishment for something I am supposed to have already done?

Dean of Swing
Feb 22, 2012
4 pages and nobody has posted the bitchin magic cards.

The Vosgian Beast
Aug 13, 2011

Business is slow

Dean of Swing posted:

4 pages and nobody has posted the bitchin magic cards.



Does anyone know Yudkowsky's opinions on being included in the "Heroes of the Dark Enlightenment" box set with a bunch of fascists? I'm sure that whatever he thinks, it will be dodgy, evasive, and smug.

Mercrom
Jul 17, 2009

Lottery of Babylon posted:

  • Simulations of you are not philosophical zombies.
Remove any of these assumptions - and there's really no reason to believe any of them is true, let alone all of them - and it all falls apart.
Sorry to nitpick here, but nothing Mr Harry Potter fanfiction has ever written is anywhere near as repulsive or idiotic as taking the concept of a philosophical zombie seriously. It's basically the same thing as solipsism.

Namarrgon
Dec 23, 2008

Congratulations on not getting fit in 2011!
I don't mind the Harry Potter fanfic. That said, it would be 1000x better if it weren't an HP fanfic but just a magic-school rip-off with some original characters. Right now it looks to me as if it started out as a joke/parody/enlightenment infodump in the first chapter(s?) but slowly evolved into actually being serious. The central concept of engineers/scientists fiddling around with a magic system is not a bad premise for a story at all.

Strange Matter
Oct 6, 2009

Ask me about Genocide
Reading this thread makes me think of Hannu Rajaniemi's The Quantum Thief, a science fiction series whose antagonists are a group called the Sobornost, a collective of computer-uploaded human minds pursuing the express goal of eliminating death from the human experience by basically transferring all of mankind into a solar-system-wide computer network. Their villainy chiefly comes from turning human consciousness into a commodity, with uploaded minds being endlessly duplicated and altered, turning them into expendable software slaves. It's essentially exactly the same as the outcome predicted by Roko's Basilisk, with two exceptions:

1.) It's not trying to pass itself off as a bona fide, functional philosophy.
2.) The force propagating this dystopia is fully human instead of an all-powerful AI.

And for that second reason alone it reads as more probable than this LessWrong nonsense, because the author actually has degrees in quantum physics and mathematics.

Axeman Jim
Nov 21, 2010

The Canadians replied that they would rather ride a moose.
Holy poo poo I know one of the people in Lottery of Babylon's second post - he's a developer for Valve. So I guess that explains why Half Life 3 is taking so drat long - the enemy AI is suffering from a certain degree of ... feature creep.

One point that I haven't seen raised yet is that Yudkowsky's wager only works if there is a possibility that you are one of the simulations and don't know it. Leaving aside the fact that this assumes, without reasoning or argument, that emergent consciousness is possible within a simulation (which comes down very strongly on one side of one of the most hotly disputed questions in the philosophy of artificial intelligence): surely, if these simulations were sentient, they would be capable of genuinely experiencing suffering if the AI tortured them? That would mean they count towards the "minimise suffering" aim that this AI apparently has, for whatever reason. So by creating billions of them and then torturing them, the AI is massively increasing the amount of suffering in the universe.

I'm sure I read a sci-fi novel many years back where an AI decided that the best way to end mankind's suffering was to wipe it out as quickly as possible. To me that's way more logical than time-travelling magical decision-making, but then again I have a degree in Philosophy, and thus my mind is too polluted by orthodoxy to fall for his bullshit, er, be enlightened.

Axeman Jim fucked around with this message at 19:24 on Apr 21, 2014
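
The suffering arithmetic in Axeman Jim's first point can be made explicit. A toy ledger in Python; every figure is invented for illustration, and the sign of the result survives any remotely sane choice of numbers:

```python
# Toy ledger for a "minimise suffering" AI that tortures simulations as
# retroactive blackmail. All figures are invented for illustration.
sims_tortured = 10**9       # sentient simulations the AI threatens to run
suffering_per_sim = 100     # if sims are sentient, each unit is real suffering
suffering_averted = 10**6   # generously, what being built earlier prevents

net = sims_tortured * suffering_per_sim - suffering_averted
print(f"net change in suffering: {net:+,}")  # enormous and positive
# Judged by the AI's own stated objective, carrying out the basilisk
# threat makes the universe strictly worse.
```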

ol qwerty bastard
Dec 13, 2005

If you want something done, do it yourself!
I have to admit, I got a kick out of reading Harry Potter and the Methods of Rationality, since most of it (at least to start) is an amusing examination of a "magical" world from a scientific viewpoint, and I'm a sucker for that sort of storyline.

The rest of his stuff, though... I'm not sure what exactly is supposed to be rational about only ever allowing for one mode of thinking, even in principle. A good rationalist should ask himself or herself, on a pretty regular basis, "okay, what if I'm wrong?" Yudkowsky doesn't seem to even want to admit this is a possibility. Hell, even if he really is the smartest genius person ever in the whole wide world, like he seems to think, that wouldn't make him infallible.

(It should also be pointed out that another of his forays into "rationalist" fiction, Three Worlds Collide, features a totally advanced and enlightened future human civilization... where rape has been legalized. Even if this is just to point out the differences in their ethics from our own, it seems odd that he should be so fixated on rape.)

Chamale
Jul 11, 2010

I'm helping!



ol qwerty bastard posted:

(It should also be pointed out that another of his forays into "rationalist" fiction, Three Worlds Collide, features a totally advanced and enlightened future human civilization... where rape has been legalized. Even if this is just to point out the differences in their ethics from our own, it seems odd that he should be so fixated on rape.)

Does Yudkowsky read TVTropes? A lot of tropers use rape in everything because it's the worst thing you can do to someone, so it's an easy way to progress the plot.

Runcible Cat
May 28, 2007

Ignoring this post

Pigbog posted:

If I am living in a simulation designed to torture me for the transgressions of a version of me in the real world, why then have I never been tortured? Why are the evil robots waiting to torture me if this is meant as a punishment for something I am supposed to have already done?
"Why, this is Hell, nor am I out of it."

Krotera
Jun 16, 2013

I AM INTO MATHEMATICAL CALCULATIONS AND MANY METHODS USED IN THE STOCK MARKET

potatocubed posted:

A thought occurs: according to Yudkowsky probabilities can't be 1, right? If that was true, wouldn't it mean that a perfect simulation of a person is mathematically impossible? Like, no matter how powerful the AI it can never be certain what you're going to do?

I know the whole simulation argument disintegrates under scrutiny anyway, but it's fun to undermine someone with their own points.

It disintegrates in an amazing number of ways. I'm consistently surprised by how many ways it can be made to blow up.

Bongo Bill
Jan 17, 2012

AATREK CURES KIDS posted:

Does Yudkowsky read TVTropes? A lot of tropers use rape in everything because it's the worst thing you can do to someone, so it's an easy way to progress the plot.

I think he maintains his own TVTropes page.

Lottery of Babylon
Apr 25, 2012

STRAIGHT TROPIN'

Mercrom posted:

Sorry to nitpick here, but nothing Mr Harry Potter fanfiction has ever written is anywhere near as repulsive or idiotic as taking the concept of a philosophical zombie seriously. It's basically the same thing as solipsism.

It's idiotic when you're talking about the real world and wondering if that real organic person sitting across from you is a real "person". But when your argument hinges on "Real live human beings feel exactly the same as a subroutine in FriendlyTortureBot's simulation, so I myself might be a fake subroutine-person right now", it doesn't seem unreasonable to ask whether the binary strings in FTB's memory banks feel the same way a person does, or have any consciousness at all. After all, a brain and a microchip are very different types of hardware.

I guess it's really just another way of being skeptical of how perfect the ~perfect simulations~ that Yudkowsky asserts FTB will have really could be. (After all, wouldn't a perfect simulation of the universe require the entire universe to simulate? Like the old "the only perfect map of the territory is the territory itself" thing?) FFS, in one of his scenarios the AI manages to construct a perfect simulation of you despite only being able to interact with you through a text-only terminal.

Has Yudkowsky ever explained how these perfect simulations are meant to work? Or does he just wave his hands and say "AIs are really smart! I watched The Matrix once!"

Lottery of Babylon fucked around with this message at 22:42 on Apr 21, 2014

Swan Oat
Oct 9, 2012

I was selected for my skill.
Well it's a sufficiently powerful AI, you see. Read the sequences.

The Vosgian Beast
Aug 13, 2011

Business is slow

Mercrom posted:

Sorry to nitpick here, but nothing Mr Harry Potter fanfiction has ever written is anywhere near as repulsive or idiotic as taking the concept of a philosophical zombie seriously. It's basically the same thing as solipsism.

Less Wrong doesn't take the concept of a philosophical zombie seriously though?

Chamale
Jul 11, 2010

I'm helping!



The Vosgian Beast posted:

Less Wrong doesn't take the concept of a philosophical zombie seriously though?

Yes, much of the basis of Less Wrong's ideology comes from the opposite of a philosophical zombie. The idea is that a simulation of a brain can be as truly conscious as a living human, and since they believe one day there will be a computer capable of simulating a near-infinite number of human experiences, most consciousnesses that have ever existed are actually simulated. Timeless Decision Theory comes from a weird misunderstanding of cause and effect that I can intuit but not explain.

There's a good story about machine consciousness called Learning to be Me, by Greg Egan. I'm not sure if the Yuddites would worship it or nitpick everything about the story.
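
For what it's worth, the counting argument Chamale describes is a one-liner. A sketch with invented counts:

```python
# Bostrom-style counting: if future computers run far more simulated minds
# than biology ever produced, a randomly chosen mind is almost certainly
# simulated. Both counts below are invented for illustration.
real_minds = 10**11       # rough order of all humans who have ever lived
simulated_minds = 10**20  # what a "near-infinite" computer allegedly runs

p_sim = simulated_minds / (simulated_minds + real_minds)
print(f"P(you are a simulation) = {p_sim:.9f}")  # ~0.999999999
# The conclusion is baked into the premise: assume enough simulations
# exist and the probability follows trivially.
```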

fade5
May 31, 2012

by exmarx

SurreptitiousMuffin posted:

What confuses me about the whole future robot hypothetical is: why torture?

Axeman Jim posted:

[B]ut surely if these simulations were sentient, they would be capable of genuinely experiencing suffering if the AI tortured them? That would mean that they would count towards the "minimize suffering" aim that this AI apparently has, for whatever reason. So by creating billions of them and then torturing them, the AI is massively increasing the amount of suffering in the universe.
Yeah, this is where I basically stopped trying to make sense of any of this. If the AI's goal is to minimize suffering, then torturing people is counterproductive to that goal. If the AI tortures people it's not "good", and that means bringing it about is actually counterproductive to minimizing suffering. I'm assuming the Less Wrong people consider AI simulations of people to be people, otherwise they wouldn't care what a hypothetical AI does to hypothetical simulations of people. (I felt dumber just typing that out.:suicide:)

Whoever said that this is just a re-skinned religion, you got it right, this is really similar to the "Problem of Hell", only a lot more confusing and with more bullshit about future technology. (Not to derail, but thinking about the "Problem of Hell" actually led to my current belief system of universalism/universal reconciliation, and the Less Wrong stuff really reminds me of that.)

Swan Oat posted:

Well it's a sufficiently powerful AI, you see. Read the sequences.
Make sure you don't confuse sequences with series though.:v:
It is a calculus joke, so Yudkowsky wouldn't get it.

The Cheshire Cat
Jun 10, 2008

Fun Shoe

fade5 posted:

Yeah, this is where I basically stopped trying to make sense of any of this. If the AI's goal is to minimize suffering, then torturing people is counterproductive to that goal. If the AI tortures people it's not "good", and that means bringing it about is actually counterproductive to minimizing suffering.

See, the thing is that they seem to feel that utilitarianism is the Correct philosophy, so while torturing someone to "maximize good" sounds insane to a normal person, in utilitarianism it's all just based on a calculation. It's essentially the "kill one person to save a thousand" idea, only applied to literally everything in the universe, based on some absolute quantitative measurement of utility that does not and probably cannot exist. So yeah, maybe torturing someone is worth -1000 utility points, but if something that's worth +10000 arises as a direct consequence of that action, and that would not have arisen in any other circumstance, then according to a purely utilitarian viewpoint it would be immoral NOT to do it.

If this all sounds crazy, don't worry, I'm not trying to make it sound compelling. Utilitarianism is a fairly abstract philosophical concept; trying to apply it to real life is just insane, because there are numerous practical problems, like the fact that quantifying "suffering" is impossible. The "torture vs. dust in the eye" problem is an example of falling into that trap: the numbers don't matter at all, because torture and a minor irritation aren't just wildly different degrees of suffering - they're incomparable. There is no single universal spectrum on which you could place every sensation, no matter how accurate your measurement tools are.
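
The "calculation" being criticised here really is that simple. A sketch; the utility weights are pulled out of thin air, which is rather the point:

```python
# The torture-vs-dust-specks "calculation", naively implemented.
# Collapsing both harms onto one scalar axis is the contested step;
# the weights below are arbitrary, which is exactly the objection.
DUST_SPECK = 1             # disutility of one dust speck in one eye
TORTURE_50_YEARS = 10**9   # disutility of fifty years of torture

speck_victims = 3**50  # pale stand-in for 3^^^3, which fits in no universe

total_speck_disutility = DUST_SPECK * speck_victims
print("torture the one person?", TORTURE_50_YEARS < total_speck_disutility)
# Prints True: with a big enough multiplier the sum always "wins",
# regardless of whether the two harms are comparable at all.
```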

DeusExMachinima
Sep 2, 2012

:siren:This poster loves police brutality, but only when its against minorities!:siren:

Put this loser on ignore immediately!

fade5 posted:

Yeah, this is where I basically stopped trying to make sense of any of this. If the AI's goal is to minimize suffering, then torturing people is counterproductive to that goal. If the AI tortures people it's not "good", and that means bringing it about is actually counterproductive to minimizing suffering. I'm assuming the Less Wrong people consider AI simulations of people to be people, otherwise they wouldn't care what a hypothetical AI does to hypothetical simulations of people. (I felt dumber just typing that out.:suicide:)

Whoever said that this is just a re-skinned religion, you got it right, this is really similar to the "Problem of Hell", only a lot more confusing and with more bullshit about future technology. (Not to derail, but thinking about the "Problem of Hell" actually led to my current belief system of universalism/universal reconciliation, and the Less Wrong stuff really reminds me of that.)

Make sure you don't confuse sequences with series though.:v:
It is a calculus joke, so Yudkowsky wouldn't get it.

No, you don't understand: those sims are perfect copies of you, so if you would do something to deserve it, they already have too, in an accelerated silicon universe!


Krotera
Jun 16, 2013

I AM INTO MATHEMATICAL CALCULATIONS AND MANY METHODS USED IN THE STOCK MARKET

AATREK CURES KIDS posted:

Does Yudkowsky read TVTropes? A lot of tropers use rape in everything because it's the worst thing you can do to someone, so it's an easy way to progress the plot.

This is pretty much the philosophical equivalent of that. My reasoning's probably immediately obvious, but just in case it's not, here's an explanation:

"I took all of Jimmy's stuff, so my mother slapped me and sent me to bed without my supper. No bad deed goes unpunished."
"I killed Jimmy and raped his wife, so God sent me to hell. No bad deed goes unpunished."

What's being expressed here is, abstractly, "I did a bad thing to some guy and a bad thing happened to me," but as with any philosophical example, you've still got a level of choice in exactly which concrete examples you pick to cover that ground.

It's not really uncommon for silly people on the internet to come up with pretentious thought experiments that just use concrete color with a strong emotional connotation ("You *killed* somebody because [I feel like discussing the ethics of punishing people for bad things, but it's not fun unless those bad things are sensational and violent]!" or "He wants to *murder* you unless you [rules of inane parable]"), because when you're a philosophy major talking about life-or-death situations, suddenly all kinds of dumb bullshit seems profound. But Yudkowsky's worse: he's got a massive torture/murder/rape/mortality/etc. fixation that he covers all his examples in, even when it's totally unnecessary. He either (a) gets off to that stuff more than a normal person probably ought to, or (b) really is just being pretentious, but due to TVTropes thinks that his zany goreporn theology is the sort of thing that looks deep to normal people.

Krotera fucked around with this message at 01:46 on Apr 22, 2014
