Sham bam bamina!
Nov 6, 2012

ƨtupid cat
The real problem here is the idea that dust is substantially equivalent to torture, that both exist on some arbitrary scale of "suffering" that allows some arbitrary amount of annoyance to "equal" some arbitrary amount of torture. They don't. You don't suffer when you get a paper cut any more than you are annoyed by the final stages of cancer, even though both scenarios involve physical pain. The reason that non-Yudkowskyites pick dust over torture is that that is the situation in which a person is not tortured. :rolleye:

Sham bam bamina! fucked around with this message at 03:00 on Apr 20, 2014

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

Tendales posted:

Even if you accept the premise that dust and torture can be measured on the same scale of suffering, the logic STILL doesn't hold.

Compare: 6 billion people all get a speck of dust in the eye worth 1 Sufferbuck each, or 1 person gets an atomic wedgie worth 6 billion Sufferbucks. Even if you accept the utilitarian premise, they're STILL not equal! Just keep following the dumbass logic. It takes a person 1 Chilltime to recover from 1 Sufferbuck of trauma. That means 1 Chilltime from now, all 6 billion people will be at 0 Sufferbucks. On the other hand, in 1 Chilltime, the Bayesian Martyr still has 6 billion minus 1 Sufferbucks. So even in Yudkowsky's own bizarro worldview, he's an asshole.
No, see, the six billion people wait six billion simultaneous Chilltimes, and the Martyr has to wait six billion consecutive Chilltimes. Checkmate, bitch. :chord:
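
For the record, the joke arithmetic checks out. A toy sketch, treating the thread's made-up Sufferbuck and Chilltime units as if they meant anything (every number comes from the posts above; nothing else is real):

code:

# "Sufferbucks" and "Chilltimes" are this thread's joke units, not real math.
N = 6_000_000_000  # people who each catch 1 Sufferbuck of dust

# Dust: everyone recovers in parallel.
dust_wall_clock = 1    # one Chilltime later, all 6 billion are back at 0
dust_person_time = N   # 6e9 person-Chilltimes, all simultaneous

# Torture: one Bayesian Martyr holding all N Sufferbucks, shedding 1 per Chilltime.
torture_wall_clock = N                  # 6e9 consecutive Chilltimes
torture_person_time = N * (N + 1) // 2  # integrated lingering trauma, ~1.8e19

print(dust_wall_clock, torture_wall_clock)  # 1 vs. 6000000000

Either way you slice it, the Martyr comes out roughly ten orders of magnitude worse.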

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

Improbable Lobster posted:

My biggest question is who cares if cyber-you gets tortured? If you're just a simulation ripe for the torturing then it doesn't matter how much money cyber-you donates because cyber-you is fake and physical you can't be tortured by the AI.
The idea is that, since cyber-you is a copy of you exact enough to essentially be you, anything that cyber-you does, you would also do. So the distinction is erased - with that kind of 1:1 correspondence, it doesn't matter if "you" are a simulation or the original; either way, if you donate, the AI gets funded. (Let's just ignore the practical impossibility of knowing whether your donation would actually end up funding this hypothetical future AI specifically.)

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

Saint Drogo posted:

And in the slowed time of this slowed country, here and now as in the darkness-before-dawn prior to the Age of Reason, the son of a sufficiently powerful noble would simply take for granted that he was above the law, at least when it came to some peasant girl. There were places in Muggle-land where it was still the same way, countries where that sort of nobility still existed and still thought like that, or even grimmer lands where it wasn't just the nobility. It was like that in every place and time that didn't descend directly from the Enlightenment.
Obligatory reminder that this used to be even, uh, better?

Seriously, not making this up posted:

There had been ten thousand societies over the history of the world where this conversation could have taken place. Even in Muggle-land it was probably still happening, somewhere in Saudi Arabia or the darkness of the Congo. It happened in every place and time that didn't descend directly from the Enlightenment.

Sham bam bamina! fucked around with this message at 15:20 on Apr 21, 2014

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

LaughMyselfTo posted:

"You" can't rationally disprove solipsism. :colbert:
:v:

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

Lottery of Babylon posted:

This is similar to LessWrong's Pascal's Mugging thought experiment. The way he presents it is borderline unreadable because he spends pages and pages talking about up arrow notation and Kolmogorov and Solomonoff, but here's the short version: A man comes up to you and says "Give me $5 or I'll use magic wizard matrix powers to torture a really big number of people." If you go by the linearly-add-utility-functions-multiplied-by-probabilities thing that Yudkowsky always asserts is obviously correct, then the small chance that the man is really a wizard can be made up for by threatening arbitrarily large numbers of people, so logic seems to dictate that you should give him money.
Ahahahaha, that is beautiful. So the effective probability that your claim is truthful approaches 1 as the bullshit number that you're claiming approaches infinity? I get that he's not claiming that this actually makes it more probable, but why the hell should I treat it as if it is? If anything, shouldn't I give it less legitimacy the further the number rockets away from sanity?

This is like that one kid who always "won" by killing you with his infinity laser and you could never kill him because of his infinity shield, except now I'm supposed to be actually terrified instead of annoyed.
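
If anyone wants the mugger's arithmetic spelled out, here's a minimal sketch of the naive expected-utility reasoning being mocked. Every figure is invented for illustration:

code:

def naive_expected_harm(p_wizard, victims):
    # Expected people tortured if you refuse to hand over the $5,
    # under Yudkowsky-style "probability times utility" bookkeeping.
    return p_wizard * victims

p = 1e-30  # your fixed, microscopic credence that he's really a wizard
for victims in (1e6, 1e30, 1e100, 1e300):
    print(f"{victims:.0e} threatened -> {naive_expected_harm(p, victims):.0e} expected")

# At 1e100 threatened, "expected harm" is 1e70 people: logic says pay the man.
# Sham's objection, formalized: p should shrink with the size of the claim,
# and faster than the claim grows, so the product goes to zero instead.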

Sham bam bamina! fucked around with this message at 01:46 on Apr 23, 2014

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

Darth Walrus posted:

I was expecting to be the youngest person there, but it turned out that my age wasn't unusual—there were several accomplished individuals who were younger. This was the point at which I realized that my child prodigy license had officially completely expired.

Now, admittedly, this was a closed conference run by people clueful enough to think "Let's invite Eliezer Yudkowsky" even though I'm not a CEO. So this was an incredibly cherry-picked sample. Even so...

Even so, these people of the Power Elite were visibly much smarter than average mortals.
Christ, what is even there to say about someone so abjectly craniorectal?

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

Jazu posted:

It's kind of like Cow Tools; it sounds like it should make more sense than it does.
Except that Cow Tools could easily have been comprehensible; all that Gary had to do was draw an archaeologist instead of a cow (and why the hell he didn't do that to begin with is anyone's guess). By contrast, there's no easy way to make this garbage work - Cow Tools was a decent idea executed badly; TDT is perfectly-communicated bullshit.

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

Jonny Angel posted:

One bit from his Babykillers story that had me laughing really hard was when the Babykillers send the humans a poem trying to convince them to kill their own babies. The humans realize that to the Babykillers, this piece of shit poem is probably one of their greatest cultural achievements. One of the human crew members expresses it like, "This is their Shakespeare, or their Fate/Stay Night!"

Now, there's a chance that it was a deliberate joke on Yudkowsky's part, this idea of "Hah, isn't this future culture of humanity so bizarre, that they compare those two?"

At the same time, looking at the pieces of shit that he does list as favorite works of media, I'm inclined to believe he puts Fate/Stay Night up there too. Which is hilarious.
Reminder that Yudkowsky typed this in complete seriousness:

quote:

Think of the truly great stories, the ones that have become legendary for being the very best of the best of their genre: The Iliad, Romeo and Juliet, The Godfather, Watchmen, Planescape: Torment, the second season of Buffy the Vampire Slayer, or that ending in Tsukihime.
His whole "I'm rational enough to see through the arbitrary bias and elitism of 'culture' and truly appreciate the depth of anime and RPGs :smuggo:" schtick is just insufferable. It reeks of that Troper idea that since you're Really Frickin' Smart, everything that you enjoy is necessarily Really Frickin' Smart too. After all, how could anything less satisfy your prodigious intellect? :allears:

Sham bam bamina! fucked around with this message at 18:49 on Apr 26, 2014

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

ArchangeI posted:

legalized rape (again, an oxymoron like non-voluntary suicide, i.e. murder)
Literally everything about this is wrong. I'm not sure that you know what the word "rape" even means if you think that it's defined by violation of the law and not a person's body, and a "non-voluntary suicide" (the word you're looking for is "involuntary") would be an accidental death without anyone else's involvement.

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

ol qwerty bastard posted:

Who are they even fooling, other than themselves?
Wouldn't anyone that they fool become one of them by definition?

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

Wales Grey posted:

I dunno, I'd be pretty inclined to let this AI loose onto the internet.
Jerk City isn't autogenerated; that would be Mezzacotta.

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

oscarthewilde posted:

(sorry Aristotle, but 1 kg of feathers doesn't fall at the same rate as 1 kg of iron!)
Uh...

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

SolTerrasa posted:

They accelerate at the same rate towards an infinite plane of uniform density in a vacuum, but they don't fall at the same rate on Earth in an atmosphere. Assume a spherical cow, and all that.
Aristotle never claimed that they did; in fact, he said that heavier objects inherently fall faster. This is why Galileo's experiments with lead weights were so important: they showed that, air resistance aside, weight does not affect gravitational acceleration. You "refuted" a true claim that Aristotle never made with a false one that he did.
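
The cancellation is worth spelling out. A quick sketch with standard constants (the drag term is the usual quadratic model; the point, not the exact numbers, is what matters):

code:

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24   # Earth's mass, kg
r = 6.371e6    # Earth's radius, m

def free_fall_accel(m):
    # a = F/m = (G*M*m/r^2)/m = G*M/r^2 -- the falling mass m cancels out.
    return G * M / r**2

print(free_fall_accel(1.0), free_fall_accel(1000.0))  # both ~9.82 m/s^2

def accel_with_drag(m, v, cd_area):
    # With air, the drag term is divided by m, so mass matters again:
    # a = g - (0.5 * rho * v^2 * Cd * A) / m
    rho = 1.225  # sea-level air density, kg/m^3
    return free_fall_accel(m) - 0.5 * rho * v**2 * cd_area / m

# A kilogram of feathers (huge Cd*A) loses far more acceleration to drag
# than a kilogram of iron, which is the true claim SolTerrasa was making.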

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

potatocubed posted:

More likely than not, most folks who die today didn't have to die! Yes, I am skeptical of most medicine because on average it seems folks who get more medicine aren't healthier. But I'll heartily endorse one medical procedure: cryonics, i.e., freezing folks in liquid nitrogen when the rest of medicine gives up on them.
Ignoring the "folks who get more medicine aren't healthier" :pwn:ery, he is aware that most folks who die today have no access to cryonic preservation in the first place, right? I can guarantee that you won't have the impoverished Chinese or Indian masses or the sub-Saharan Africans lining up for this Alcor thing, even assuming that they are RATIONAL enough to be on board with the idea.

Sham bam bamina!
Nov 6, 2012

ƨtupid cat
↑ Ugh, beaten! :argh:

Slate Action posted:

If you are alive and you get frozen, you die. If your body is frozen it sustains a catastrophic amount of damage down to the cellular level. It would probably be easier to just clone someone in the future rather than try to revive them from a frozen state.
I wondered how they avoided every cell getting perforated by ice but just assumed that they had something figured out because why would they do this otherwise? I'm beginning to think that this might have been a bit naïve of me. :v:

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

Ktb posted:

Or is it just because it sounds cool and future sciencey?
What does your heart tell you?

Sham bam bamina! fucked around with this message at 21:28 on Apr 29, 2014

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

chrisoya posted:

Or I guess you could just say "I will not engage in acausal trade with things that will torture simulations of me, what the fuck" and the AI will sigh and let you go, because there's no point in torturing you. You have to say it out loud in case the simulation the AI's monitoring has privacy protections that stop it peering inside your thoughts and it's just monitoring the air around you for speech and staring creepily at you from all possible angles.
No sooner read than done. Phew, this crisis was surprisingly easy to avert. :sweatdrop:

Sham bam bamina!
Nov 6, 2012

ƨtupid cat
I'm beginning to question the value of splitting this discussion from the TV Tropes thread.

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

Dr Pepper posted:

Yes that's right, in order to be a good Philosopher you must be a computer programmer.
And no programmer has ever produced bad, confused code that doesn't compile. :rolleye:

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

The Iron Rose posted:

Honestly I don't see how the possibility of an AI torturing me in the future would make me inclined to support said AI. Really, it seems to me like any decent person should/would oppose it at all costs...
That's how it gets you!!! :byodood:

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

Don Gato posted:

The more I read about Roko's Basilisk, the more it sounds like someone's Matrix fanfiction gone wrong. It makes about as much sense under close observation
There we go.

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

pentyne posted:

Huh, that name seems familiar.

pentyne posted:

'looks like a child but is really a 600-year-old anime love interest for a harem anime'
Why is this familiar to you? :crossarms:

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

pentyne posted:

http://negima.wikia.com/wiki/Evangeline_A.K._McDowell


Fan wikis are nothing if not obsessively detailed.
Uh, that's not what I was talking about. It was that you were familiar with the creepy anime character in the first place.

For God's sake, that "random years (mentally)" bit is just gross.

Sham bam bamina! fucked around with this message at 08:44 on Jul 28, 2014

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

SolTerrasa posted:

I was once at a dinner party, trying to explain to a man what I did for a living, when he said: "I don't believe Artificial Intelligence is possible because only God can make a soul."

At this point I must have been divinely inspired, because I instantly responded: "You mean if I can make an Artificial Intelligence, it proves your religion is false?"

He said, "What?"

I said, "Well, if your religion predicts that I can't possibly make an Artificial Intelligence, then, if I make an Artificial Intelligence, it means your religion is false. Either your religion allows that it might be possible for me to build an AI; or, if I build an AI, that disproves your religion."

There was a pause, as the one realized he had just made his hypothesis vulnerable to falsification, and then he said, "Well, I didn't mean that you couldn't make an intelligence, just that it couldn't be emotional in the same way we are."

I said, "So if I make an Artificial Intelligence that, without being deliberately preprogrammed with any sort of script, starts talking about an emotional life that sounds like ours, that means your religion is wrong."

He said, "Well, um, I guess we may have to agree to disagree on this."
I don't believe that this happened. I think that this conversation would have gone more like this:

:catholic: "I don't believe Artificial Intelligence is possible because only God can make a soul."

:smugbert: "Well, if your religion predicts that I can't possibly make an Artificial Intelligence, then, if I make an Artificial Intelligence, it means your religion is false. Either your religion allows that it might be possible for me to build an AI; or, if I build an AI, that disproves your religion."

:catholic: "But I don't believe that you can build one, so the point is moot. Didn't you hear a word that I said?"

I'm not sure that Yudkowsky even understands his opponent's opening statement here.

Sham bam bamina! fucked around with this message at 16:18 on Aug 25, 2014

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

Dr Pepper posted:

Calling it a fanfic implies that the author [...] at least has some vague understanding of its themes.
Actually, I'm pretty sure that it doesn't. :v:

Sham bam bamina!
Nov 6, 2012

ƨtupid cat
The thing about cryonics that I don't get is, even assuming that it can work reliably, how the hell are they going to get me frozen before my brain tissue starts degrading? That takes what, five minutes? The only thing that I can think of is freezing while I'm still alive, and that's impossible since they'd still have to replace all of my blood with antifreeze (probably not a five-minute process) before starting the freezing itself.

Basically, the choice is between freezing after death and waking up as a brain-damaged zombie or killing myself in a way that I hope will be reversible later on (and still probably ending up as a zombie).

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

ungulateman posted:

C'mon. Lower your standards and live a little. If you've never read a fanfic that blatantly straddles the line between embarrassing and enjoyable, you haven't lived. :unsmith:
The world is overflowing with genuinely good writing; someone can "live a little" without lowering their standards in the slightest. Also, you're a literal autistic brony.

Sham bam bamina! fucked around with this message at 01:17 on Sep 2, 2014

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

Cardiovorax posted:

Until they are assimilated, anyway. A lot of ex-Borg say that being part of the Collective is actually a very pleasant, communal experience. They'd just prefer to have been asked first. I suppose you could make a utilitarian argument out of that - the displeasure of being forcibly included is minor compared to the following lifetime of happiness.

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

quote:

“Candy bars and other sweets taste better than anything that could have occurred naturally,” she said, staring gently at him through the pad. “Humans applied their intelligence to pushing the pleasure buttons on their tongues by making very tasty sugary and fatty foods.
Snickers bars: the pinnacle of cuisine.

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

Qwertycoatl posted:

It's possible that he's not actually supposed to be intelligent or nice and it's just the pony AI feeding him what he wants to hear. But that's probably giving this story too much credit.
I think that my favorite thing about this story is how indistinguishable it is from satire or critique while being 100% dead earnest. It's like a friggin' Chick tract, but with computer ponies.

Sham bam bamina! fucked around with this message at 12:55 on Sep 3, 2014

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

SerialKilldeer posted:

What if the Basilisk will be a pony? :ohdear:
The concept of the simulation-torturing AI is the Basilisk, not the AI itself. The idea is that being informed of the concept at all supposedly dooms you, much like making eye contact with a basilisk.

But, yes, the AI will be a pony. The most perfect form, the purest being.

Sham bam bamina! fucked around with this message at 23:34 on Sep 3, 2014

Sham bam bamina!
Nov 6, 2012

ƨtupid cat
It's a brony story; I really should have seen that coming.

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

Djeser posted:

The world is just pixels
It's three-dimensional, so they're actually voxels. It's fucking Minecraft. :greencube:

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

Djeser posted:

Unlike the physical universe, Equestrian light strikes your eye to satisfy your values through friendship and ponies.
How can someone write a sentence like this without seeing how goddamn arbitrary their horrid utopia is?

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

Cardiovorax posted:

It makes a bit of sense when you think about it as an ontological statement. Basically the AI is telling him that the real world is naturalistic, while the simulation is teleological. Photons just exist. Light in the simulation exists for your benefit. That the benefit is arbitrarily friendship and ponies is incidental.
I meant the equal weight given to "friendship" and "ponies" in the sentence. Like, OK, there's conceivably some objective justification for prioritizing friendship; community is a pretty fundamental element of human nature. But ponies? Seriously?

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

ungulateman posted:

I know, I know way too much about MLP fanfic and should kill my are self... :sigh:
Please, it's "kill I am self".

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

Moatman posted:

Okay, post about Fun Theory stuff may take a while. It's making me irrationally angry.
e: This was linked from the first fun theory post. Emphasis Yud's

fucking lol.
Beautiful quote.

Please get rid of that avatar.

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

Besesoth posted:

friendly or hostile (or indifferent, although LW doesn't seem to admit to that possibility, or for that matter anything other than Pure Friendly and Pure Hostile)
It's because this thing will inevitably improve itself into essential omnipotence and will have either human good as its chief goal or something else. If the goal is something else, human good will be subordinate to that goal and will at some point be sacrificed for it (because even near-omnipotent power still has limited resources at its disposal, or else because the goal itself conflicts with human good).

Sham bam bamina!
Nov 6, 2012

ƨtupid cat
Huh. Well.
