|
The real problem here is the idea that dust is substantially equivalent to torture, that both exist on some arbitrary scale of "suffering" that allows some arbitrary amount of annoyance to "equal" some arbitrary amount of torture. They don't. You don't suffer when you get a paper cut any more than you are annoyed by the final stages of cancer, even though both scenarios involve physical pain. The reason that non-Yudkowskyites pick dust over torture is that it is the situation in which no one gets tortured.
Sham bam bamina! fucked around with this message at 03:00 on Apr 20, 2014 |
# ¿ Apr 20, 2014 02:56 |
|
Tendales posted:Even if you accept the premise that dust and torture can be measured on the same scale of suffering, the logic STILL doesn't hold.
|
# ¿ Apr 20, 2014 05:35 |
|
Improbable Lobster posted:My biggest question is who cares if cyber-you gets tortured? If you're just a simulation ripe for the torturing then it doesn't matter how much money cyber-you donates because cyber-you is fake and physical you can't be tortured by the AI.
|
# ¿ Apr 20, 2014 21:21 |
|
Saint Drogo posted:And in the slowed time of this slowed country, here and now as in the darkness-before-dawn prior to the Age of Reason, the son of a sufficiently powerful noble would simply take for granted that he was above the law, at least when it came to some peasant girl. There were places in Muggle-land where it was still the same way, countries where that sort of nobility still existed and still thought like that, or even grimmer lands where it wasn't just the nobility. It was like that in every place and time that didn't descend directly from the Enlightenment. Seriously, not making this up posted:There had been ten thousand societies over the history of the world where this conversation could have taken place. Even in Muggle-land it was probably still happening, somewhere in Saudi Arabia or the darkness of the Congo. It happened in every place and time that didn't descend directly from the Enlightenment. Sham bam bamina! fucked around with this message at 15:20 on Apr 21, 2014 |
# ¿ Apr 21, 2014 15:16 |
|
LaughMyselfTo posted:"You" can't rationally disprove solipsism.
|
# ¿ Apr 22, 2014 02:23 |
|
Lottery of Babylon posted:This is similar to Lesswrong's Pascal's Mugging thought experiment. The way he presents it is borderline unreadable because he spends pages and pages talking about up arrow notation and Kolmogorov and Solomonoff, but here's the short version: A man comes up to you and says "Give me $5 or I'll use magic wizard matrix powers to torture a really big number of people." If you go by the linearly-add-utility-functions-multiplied-by-probabilities thing that Yudkowsky always asserts is obviously correct, then the small chance that the man is really a wizard can be made up for by threatening arbitrarily large numbers of people, so logic seems to dictate that you should give him money. This is like that one kid who always "won" by killing you with his infinity laser and you could never kill him because of his infinity shield, except now I'm supposed to be actually terrified instead of annoyed. Sham bam bamina! fucked around with this message at 01:46 on Apr 23, 2014 |
# ¿ Apr 23, 2014 01:27 |
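The naive arithmetic being mocked above can be sketched in a few lines. This is a toy illustration with made-up numbers (the probability, the victim count, and the utility units are all hypothetical), not anything from Lesswrong itself: if you multiply an arbitrarily huge threatened harm by even a vanishingly small probability, the product can always be made to exceed the $5.

```python
# Toy sketch of the naive expected-utility reasoning the "mugger" exploits.
# All numbers here are hypothetical illustration, not anyone's real estimates.

def naive_expected_disutility(p_wizard: float, victims: float) -> float:
    """Expected disutility of refusing, counting each victim as 1 unit."""
    return p_wizard * victims

COST_OF_PAYING = 5.0  # the $5, treated as 5 units of disutility

p = 1e-20          # vanishingly small chance the man really is a wizard
victims = 10.0**30  # "a really big number of people" -- pick it large enough

# The threatened number can always outgrow the tiny probability:
assert naive_expected_disutility(p, victims) > COST_OF_PAYING
```

The joke, of course, is that the victim count is a free parameter: whatever probability you assign, the mugger just names a bigger number, exactly like the infinity laser.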
|
Darth Walrus posted:I was expecting to be the youngest person there, but it turned out that my age wasn't unusual—there were several accomplished individuals who were younger. This was the point at which I realized that my child prodigy license had officially completely expired.
|
# ¿ Apr 23, 2014 13:29 |
|
Jazu posted:It's kind of like Cow Tools; it sounds like it should make more sense than it does.
|
# ¿ Apr 25, 2014 15:19 |
|
Jonny Angel posted:I remember one bit from his Babykillers story that had me laughing really hard: the Babykillers send the humans a poem trying to convince them to kill their own babies. The humans realize that to the Babykillers, this piece of poo poo poem is probably one of their greatest cultural achievements. One of the human crew members expresses it like, "This is their Shakespeare, or their Fate/Stay Night!" quote:Think of the truly great stories, the ones that have become legendary for being the very best of the best of their genre: The Iliad, Romeo and Juliet, The Godfather, Watchmen, Planescape: Torment, the second season of Buffy the Vampire Slayer, or that ending in Tsukihime. Sham bam bamina! fucked around with this message at 18:49 on Apr 26, 2014 |
# ¿ Apr 26, 2014 18:17 |
|
ArchangeI posted:legalized rape (again, an oxymoron like non-voluntary suicide, i.e. murder)
|
# ¿ Apr 27, 2014 01:22 |
|
ol qwerty bastard posted:Who are they even fooling, other than themselves?
|
# ¿ Apr 28, 2014 05:18 |
|
Wales Grey posted:I dunno, I'd be pretty inclined to let this AI loose onto the internet.
|
# ¿ Apr 28, 2014 12:15 |
|
oscarthewilde posted:(sorry Aristotle, but 1 kg of feathers doesn't fall at the same rate as 1 kg of iron!)
|
# ¿ Apr 29, 2014 06:26 |
|
SolTerrasa posted:They accelerate at the same rate towards an infinite plane of uniform density in a vacuum, but they don't fall at the same rate on Earth in an atmosphere. Assume a spherical cow, and all that.
|
# ¿ Apr 29, 2014 08:18 |
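SolTerrasa's point can be checked with a quick simulation. This is a rough sketch with hypothetical shapes (iron as a compact ball, feathers as a big fluffy blob); the drag coefficients and areas are invented for illustration, and only the qualitative result matters: equal masses, equal gravity, very different drag.

```python
# Euler-integrate a fall with quadratic air drag:
#   m * dv/dt = m*g - 0.5 * rho * Cd * A * v^2
# Shapes and drag numbers below are hypothetical, chosen only to
# contrast a dense object with a fluffy one of the same mass.

def fall_time(mass_kg, area_m2, drag_coeff, height_m=10.0, dt=1e-4):
    """Seconds to fall height_m from rest through sea-level air."""
    g, rho = 9.81, 1.225  # gravity (m/s^2), air density (kg/m^3)
    v, y, t = 0.0, height_m, 0.0
    while y > 0:
        drag = 0.5 * rho * drag_coeff * area_m2 * v * v
        v += (g - drag / mass_kg) * dt
        y -= v * dt
        t += dt
    return t

# Same 1 kg mass, very different frontal area and drag coefficient:
t_iron = fall_time(1.0, area_m2=0.005, drag_coeff=0.47)  # compact ball
t_feathers = fall_time(1.0, area_m2=0.5, drag_coeff=1.0)  # fluffy blob

assert t_iron < t_feathers  # in an atmosphere, the iron lands first
```

In a vacuum (set `rho = 0`) both times come out identical, which is the "spherical cow" half of the joke.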
|
potatocubed posted:More likely than not, most folks who die today didn't have to die! Yes, I am skeptical of most medicine because on average it seems folks who get more medicine aren't healthier. But I'll heartily endorse one medical procedure: cryonics, i.e., freezing folks in liquid nitrogen when the rest of medicine gives up on them.
|
# ¿ Apr 29, 2014 17:39 |
|
↑ Ugh, beaten! Slate Action posted:If you are alive and you get frozen, you die. If your body is frozen it sustains a catastrophic amount of damage down to the cellular level. It would probably be easier to just clone someone in the future rather than try to revive them from a frozen state.
|
# ¿ Apr 29, 2014 19:31 |
|
Ktb posted:Or is it just because it sounds cool and future sciencey? Sham bam bamina! fucked around with this message at 21:28 on Apr 29, 2014 |
# ¿ Apr 29, 2014 21:26 |
|
chrisoya posted:Or I guess you could just say "I will not engage in acausal trade with things that will torture simulations of me, what the gently caress" and the AI will sigh and let you go, because there's no point in torturing you. You have to say it out loud in case the simulation the AI's monitoring has privacy protections that stop it peering inside your thoughts and it's just monitoring the air around you for speech and staring creepily at you from all possible angles.
|
# ¿ Apr 30, 2014 17:18 |
|
Aerial Tollhouse posted:Looking at that page led me to more of his fiction-writing advice.
|
# ¿ May 1, 2014 05:38 |
|
Dr Pepper posted:Yes that's right, in order to be a good Philosopher you must be a computer programmer.
|
# ¿ May 8, 2014 18:13 |
|
The Iron Rose posted:Honestly I don't see how the possibility of an AI torturing me in the future would make me inclined to support said AI. Really, it seems to me like any decent person should/would oppose it at all costs...
|
# ¿ Jul 19, 2014 01:05 |
|
Don Gato posted:The more I read about Roko's Basilisk, the more it sounds like someone's Matrix fanfiction gone wrong. It makes about as much sense under
|
# ¿ Jul 19, 2014 15:33 |
|
pentyne posted:Huh, that name seems familiar. pentyne posted:'looks like a child but is really 600 years old' anime love interest for a harem anime
|
# ¿ Jul 28, 2014 03:53 |
|
pentyne posted:http://negima.wikia.com/wiki/Evangeline_A.K._McDowell For God's sake, that "random years (mentally)" bit is just gross. Sham bam bamina! fucked around with this message at 08:44 on Jul 28, 2014 |
# ¿ Jul 28, 2014 08:36 |
|
SolTerrasa posted:I was once at a dinner party, trying to explain to a man what I did for a living, when he said: "I don't believe Artificial Intelligence is possible because only God can make a soul." "Well, if your religion predicts that I can't possibly make an Artificial Intelligence, then, if I make an Artificial Intelligence, it means your religion is false. Either your religion allows that it might be possible for me to build an AI; or, if I build an AI, that disproves your religion." "But I don't believe that you can build one, so the point is moot. Didn't you hear a word that I said?" I'm not sure that Yudkowsky even understands his opponent's opening statement here. Sham bam bamina! fucked around with this message at 16:18 on Aug 25, 2014 |
# ¿ Aug 25, 2014 16:12 |
|
Dr Pepper posted:Calling it a fanfic implies that the author [...] at least has some vague understanding of its themes.
|
# ¿ Aug 25, 2014 20:48 |
|
The thing about cryonics that I don't get is, even assuming that it can work reliably, how the hell are they going to get me frozen before my brain tissue starts degrading? That takes what, five minutes? The only thing that I can think of is freezing while I'm still alive, and that's impossible since they'd still have to replace all of my blood with antifreeze (probably not a five-minute process) before starting the freezing itself. Basically, the choice is between freezing after death and waking up as a brain-damaged zombie or killing myself in a way that I hope will be reversible later on (and still probably ending up as a zombie).
|
# ¿ Aug 27, 2014 14:52 |
|
ungulateman posted:C'mon. Lower your standards and live a little. If you've never read a fanfic that blatantly straddles the line between embarrassing and enjoyable, you haven't lived. Sham bam bamina! fucked around with this message at 01:17 on Sep 2, 2014 |
# ¿ Sep 2, 2014 01:15 |
|
Cardiovorax posted:Until they are assimilated, anyway. A lot of ex-Borg say that being part of the Collective is actually a very pleasant, communal experience. They'd just prefer to have been asked first. I suppose you could make a utilitarian argument out of that - the momentary displeasure of being forcibly included is minor compared to the following lifetime of happiness.
|
# ¿ Sep 2, 2014 20:58 |
|
quote:“Candy bars and other sweets taste better than anything that could have occurred naturally,” she said, staring gently at him through the pad. “Humans applied their intelligence to pushing the pleasure buttons on their tongues by making very tasty sugary and fatty foods.
|
# ¿ Sep 3, 2014 11:47 |
|
Qwertycoatl posted:It's possible that he's not actually supposed to be intelligent or nice and it's just the pony AI feeding him what he wants to hear. But that's probably giving this story too much credit. Sham bam bamina! fucked around with this message at 12:55 on Sep 3, 2014 |
# ¿ Sep 3, 2014 12:53 |
|
SerialKilldeer posted:What if the Basilisk will be a pony? But, yes, the AI will be a pony. The most perfect form, the purest being. Sham bam bamina! fucked around with this message at 23:34 on Sep 3, 2014 |
# ¿ Sep 3, 2014 23:26 |
|
It's a brony story; I really should have seen that coming.
|
# ¿ Sep 4, 2014 00:28 |
|
Djeser posted:The world is just pixels
|
# ¿ Sep 4, 2014 03:01 |
|
Djeser posted:Unlike the physical universe, Equestrian light strikes your eye to satisfy your values through friendship and ponies.
|
# ¿ Sep 4, 2014 12:54 |
|
Cardiovorax posted:It makes a bit of sense when you think about it as an ontological statement. Basically the AI is telling him that the real world is naturalistic, while the simulation is teleological. Photons just exist. Light in the simulation exists for your benefit. That the benefit is arbitrarily friendship and ponies is incidental.
|
# ¿ Sep 4, 2014 13:11 |
|
ungulateman posted:I know, I know way too much about MLP fanfic and should kill myself...
|
# ¿ Sep 4, 2014 15:06 |
|
Moatman posted:Okay, post about Fun Theory stuff may take a while. It's making me irrationally angry. Please get rid of that avatar.
|
# ¿ Sep 26, 2014 18:49 |
|
Besesoth posted:friendly or hostile (or indifferent, although LW doesn't seem to admit to that possibility, or for that matter anything other than Pure Friendly and Pure Hostile)
|
# ¿ Sep 27, 2014 16:28 |
|
|
Huh. Well.
|
# ¿ Sep 28, 2014 16:39 |