 
  • Locked thread
Cardiovorax
Jun 5, 2011

I mean, if you're a successful actress and you go out of the house in a skirt and without underwear, knowing that paparazzi are just waiting for opportunities like this and that it has happened many times before, then there's really nobody you can blame for it but yourself.

The Unholy Ghost posted:

an awesome metaphor
hahahahaha no


Swan Oat
Oct 9, 2012

I was selected for my skill.
I read a Harry Potter alternate universe fanfiction devoted to promoting rationalism and found it not all that bad. Actually, quite good in fact. Parts of it were quite epic and mindblowing if you must know. Really aligned with the things I want to know about, and informed me in a satisfying way. Anyway, having said that, I think I have a lot to contribute, in general.

pentyne
Nov 7, 2012

The Unholy Ghost posted:

Okay, so I posted earlier in here about how I didn't understand the hate for this guy, and now I understand even less. Everyone here told me that HPMOR goes completely bonkers or whatever and that Harry summons Carl Sagan as a patronus, but...

Well, I'm at Chapter 46, and Harry's patronus is a human in general, not Carl Sagan. I still haven't seen anything offensive yet, and I'm just overall feeling kind of disappointed that this thread is working so hard to hate a perfectly fine, somewhat eye-opening story. Call me an idiot if you want, but some of the concepts that are discussed in the story were mindblowing to me, and my basic research afterwards essentially confirmed the ideas I was most interested in.

I mean, I guess my point is that you guys are picking at the stupidest crap which then turns out to not even be true. I was telling myself that I'd stop reading once Carl Sagan popped up, because that really would be idiotic, but instead the patronus was an awesome metaphor that essentially summed up Harry's ideals in the story.

So I have to wonder how much of the stuff you people are complaining about is not even something he wrote.

Now, maybe his forums are crazy (although with the track record of this thread I'd be less inclined to believe it than before) but I really don't see any problems with his work other than that it's a fanfiction, in which case I can kind of see that he may have been looking for a friendlier medium to communicate his basic ideas. ...Or, maybe he just wants to have fun.

Well, if you only ever read other fanfiction, then as a basis for comparison I can see why you'd think HPMOR is "mindblowing", but this thread has already broken it down and explained just how shoddy the writing is, in addition to a host of other problems with Yud in general in the way he presents arguments, claims to have perfect solutions, discourages dissent, ridicules "published academia", etc. The list is far too long for one post, and the majority of this thread is dedicated to people discussing how stupid everything Yud does is, with people who work in said fields on a professional level savaging all of what Yud thinks are his brilliant ideas.

Yud is literally a 21st-century internet charlatan, making all sorts of nebulous promises, drawing a cult of personality around himself, and getting a lot of money from dumb people to produce research which he claims definitely exists but is too dangerous for "the others" (a byword for people too stupid and ignorant to appreciate Yud's work, and likely to pervert it and ruin the world) to read.

Also, Yud wants to "have fun" in the same way that Ayn Rand wanted to be a fiction writer. Literally everything he does in some way, shape, or form pushes his intellectual agenda and is rife with the kinds of mistakes a first-year English major would spot and cross out during an editorial process.

There's a reason Yud has such a following: at a first, uninformed glance he seems like a smart guy. But when you peel back the layers you realize he's nothing more than a smug, self-educated "autodidact" styling himself as the next Socrates or Hume, pretending to swim in the intellectual ocean when he's really floating with waders in the kiddie pool.

Pavlov
Oct 21, 2012

I've long been fascinated with how the alt-right develops elaborate and obscure dog whistles to try to communicate their meaning without having to say it out loud
Stepan Andreyevich Bandera being the most prominent example of that

The Unholy Ghost posted:

Everyone here told me that HPMOR goes completely bonkers

You misunderstand. Yudkowsky is bonkers. HPMOR is mostly just long-winded with a mediocre writing style. That is, of course, except for chapter 19, where it briefly and suddenly becomes a torture fic. I have no idea what the gently caress was up with that.

su3su2u1
Apr 23, 2014

The Unholy Ghost posted:

Okay, so I posted earlier in here about how I didn't understand the hate for this guy, and now I understand even less. Everyone here told me that HPMOR goes completely bonkers or whatever and that Harry summons Carl Sagan as a patronus, but...

Well, I'm at Chapter 46, and Harry's patronus is a human in general, not Carl Sagan. I still haven't seen anything offensive yet, and I'm just overall feeling kind of disappointed that this thread is working so hard to hate a perfectly fine, somewhat eye-opening story.

So I'm blogging my experience of reading HPMOR here: http://su3su2u1.tumblr.com/tagged/Hariezer-Yudotter/chrono

Basically none of the science mentioned in HPMOR is correct. I'm 27 chapters in and have yet to encounter anything eye-opening, just a lot of cloying elitism and really poor science.

Squidster
Oct 7, 2008

✋😢Life's just better with Ominous Gloves🤗🧤

su3su2u1 posted:

So I'm blogging my experience of reading HPMOR here: http://su3su2u1.tumblr.com/tagged/Hariezer-Yudotter/chrono

Basically none of the science mentioned in HPMOR is correct. I'm 27 chapters in and have yet to encounter anything eye-opening, just a lot of cloying elitism and really poor science.
Incidentally, I'm enjoying your summaries! Keep on posting, good sir.

The Vosgian Beast
Aug 13, 2011

Business is slow
In which a blogger takes a long time to say "Less Wrong is good because we care about the truth, unlike everyone else"

The Unholy Ghost
Feb 19, 2011
Okay, I misspoke, it was not really the science in the story that surprised me but the logical arguments and the way Harry manipulates people. From the perspective of someone who just wants to enjoy a story, it's quite entertaining.

From the thread's perspective you're watching a guy entrance people into his "cult" and get them to give him money. If he's convincing people he can find a way to immortality, that really is stupid.

I think if you look at the story as Harry taking over the magical world with both charlatan and legitimate methods, the story becomes incredibly interesting.

(Also: Dementors are kind of complete mysteries in Harry Potter so it's not that big of a deal if Yudkowsky wants to attribute some noncanon meaning to them. It is a Fanfiction after all [isn't it more about the overall metaphor than the exact details?])

The Vosgian Beast
Aug 13, 2011

Business is slow

The Unholy Ghost posted:

Okay, I misspoke, it was not really the science in the story that surprised me but the logical arguments and the way Harry manipulates people. From the perspective of someone who just wants to enjoy a story, it's quite entertaining.

From the thread's perspective you're watching a guy entrance people into his "cult" and give him money. If he's convincing people he can find a way into immortality that really is stupid.

I think if you look at the story as Harry taking over the magical world with both charlatan and legitimate methods, the story becomes incredibly interesting.

(Also: Dementors are kind of complete mysteries in Harry Potter so it's not that big of a deal if Yudkowsky wants to attribute some noncanon meaning to them. It is a Fanfiction after all [isn't it more about the overall metaphor than the exact details?])

They represent depression. This is reinforced by everything they do in the story, everything Rowling has said, and everything the stories are trying to accomplish. They represent death because Yudkowsky either doesn't know what the point of Harry Potter was, or doesn't care.

Pidmon
Mar 18, 2009

NO ONE risks painful injury on your GREEN SLIME GHOST POGO RIDE.

No one but YOU.

The Unholy Ghost posted:

Okay, I misspoke, it was not really the science in the story that surprised me but the logical arguments and the way Harry manipulates people. From the perspective of someone who just wants to enjoy a story, it's quite entertaining.

Maybe you'd be interested in this thread, where a whole lot of people 'manipulate' fictional characters just like Yudotter does!

It's called shit_that_didn't_happen.txt for a reason.

Maybe if you're lucky your favourite fictional smartybrains will marry an eternally applauding Albert Einstein-Dumbledore at the end of the fic.

The Vosgian Beast
Aug 13, 2011

Business is slow
Also you could just watch House of Cards.

Or read/watch Count of Monte Cristo.

Or The Talented Mr. Ripley.

Many things you could do that don't involve reading Big Yud

Literally Kermit
Mar 4, 2012
t

The Vosgian Beast posted:

Also you could just watch House of Cards.

Or read/watch Count of Monte Cristo.

Or The Talented Mr. Ripley.

Many things you could do that don't involve reading Big Yud

I've taken to thinking of him as 'Yud the Spud' as time goes on.

LaughMyselfTo
Nov 15, 2012

by XyloJW
I can actually say without reservation that HPMOR is one of the best fanfics I've ever read!

I do not intend this as an endorsement of HPMOR.

Moatman
Mar 21, 2014

Because the goof is all mine.
Okay, so you know how I said I'd read Yud's Fun Theory sequence like 3 weeks ago?
Well, school got into full swing and tbh I've really been trying to avoid reading it, so I didn't. I'm starting on it today.

edit: I haven't even started the actual sequence and it already hurts

Fuckin Yud posted:

Fun Theory is also the fully general reply to religious theodicy (attempts to justify why God permits evil). Our present world has flaws even from the standpoint of such eudaimonic considerations as freedom, personal responsibility, and self-reliance. Fun Theory tries to describe the dimensions along which a benevolently designed world can and should be optimized, and our present world is clearly not the result of such optimization - there is room for improvement. Fun Theory also highlights the flaws of any particular religion's perfect afterlife - you wouldn't want to go to their Heaven.
I know I don't want to go to LW's heaven

Moatman fucked around with this message at 17:46 on Sep 26, 2014

Cardiovorax
Jun 5, 2011

I mean, if you're a successful actress and you go out of the house in a skirt and without underwear, knowing that paparazzi are just waiting for opportunities like this and that it has happened many times before, then there's really nobody you can blame for it but yourself.

Squidster posted:

Incidentally, I'm enjoying your summaries! Keep on posting, good sir.
Yes, do.

The Unholy Ghost posted:

I think if you look at the story as Harry taking over the magical world with both charlatan and legitimate methods, the story becomes incredibly interesting.
I'll grant you that it's a more interesting concept than the actual plot of the Harry Potter books, but that's only because they're garbage that relies on everyone being absurdly stupid all the time to contrive the plot into happening. At least Rowling can write, though.

Epitope
Nov 27, 2006

Grimey Drawer

The Unholy Ghost posted:

Okay, I misspoke, it was not really the science in the story that surprised me but the logical arguments and the way Harry manipulates people. From the perspective of someone who just wants to enjoy a story, it's quite entertaining.

From the thread's perspective you're watching a guy entrance people into his "cult" and give him money. If he's convincing people he can find a way into immortality that really is stupid.

I think if you look at the story as Harry taking over the magical world with both charlatan and legitimate methods, the story becomes incredibly interesting.

(Also: Dementors are kind of complete mysteries in Harry Potter so it's not that big of a deal if Yudkowsky wants to attribute some noncanon meaning to them. It is a Fanfiction after all [isn't it more about the overall metaphor than the exact details?])

It's not that every word he speaks is horrible; this thread has plenty of posts saying "I enjoyed such-and-such of his." Most everyone is quick to say they don't like him overall, though. This is because he is rather deranged and something of a cult leader. If he had nothing of interest to say, no one would go to his website and he wouldn't have anyone following him.
Keep seeking outside info on what you read there, and maybe look at how cults work in general. Hopefully you won't get sucked in too deep.

Cardiovorax
Jun 5, 2011

I mean, if you're a successful actress and you go out of the house in a skirt and without underwear, knowing that paparazzi are just waiting for opportunities like this and that it has happened many times before, then there's really nobody you can blame for it but yourself.

Epitope posted:

It's not that every word he speaks is horrible, this thread has plenty of posts saying- I enjoyed such and such of his. Most everyone is quick to say they don't like him overall though. This is because he is rather deranged and something of a cult leader. If he had nothing of interest to say no one would go to his website and he wouldn't have anyone following him.
Keep seeking outside info on what you read there, and maybe look at how cults work in general. Hopefully you won't get sucked in too deep.
The problem with Yudkowsky is that, in isolation, no single idea of his is actually insane. It's when you put them all together that they turn into an absolute catastrophe. Yudkowsky has taken all these disparate, not-strictly-wrong ideas and joined them into a greater whole created in his own image - pompous, petty and only a fraction as clever as he actually thinks it is.

LaughMyselfTo
Nov 15, 2012

by XyloJW
Not Strictly Wrong would be a pretty good name for Yudkowsky's organization.

Political Whores
Feb 13, 2012

The problem is that Yud has just enough intellect to superficially mimic a scholar, but no more than that. He is the quintessential internet smart guy, who gleans his knowledge from skimming the Wikipedia article in another window. Just look at his awful butchering of Bayesian statistics.

Moatman
Mar 21, 2014

Because the goof is all mine.
Okay, post about Fun Theory stuff may take a while. It's making me irrationally angry.
e: This was linked from the first fun theory post. Emphasis Yud's

quote:

If this was an attempt to focus the young Eliezer on intelligence uber alles, it was the most wildly successful example of reverse psychology I've ever heard of.

But my parents aren't that cunning, and the results weren't exactly positive.


For a long time, I thought that the moral of this story was that experience was no match for sheer raw native intelligence. It wasn't until a lot later, in my twenties, that I looked back and realized that I couldn't possibly have been more intelligent than my parents before puberty, with my brain not even fully developed. At age eleven, when I was already nearly a full-blown atheist, I could not have defeated my parents in any fair contest of mind. My SAT scores were high for an 11-year-old, but they wouldn't have beaten my parents' SAT scores in full adulthood. In a fair fight, my parents' intelligence and experience could have stomped any prepubescent child flat. It was dysrationalia that did them in; they used their intelligence only to defeat itself.

But that understanding came much later, when my intelligence had processed and distilled many more years of experience.
loving lol.

Moatman fucked around with this message at 18:38 on Sep 26, 2014

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

Moatman posted:

Okay, post about Fun Theory stuff may take a while. It's making me irrationally angry.
e: This was linked from the first fun theory post. Emphasis Yud's

loving lol.
Beautiful quote.

Please get rid of that avatar.

SolTerrasa
Sep 2, 2011

Political Whores posted:

The problem is that Yud has just enough intellect to superficially mimic a scholar, but no more than that. He is the quintessential internet smart guy, who gleans his knowledge from skimming the Wikipedia article in another window. Just look at his awful butchering of Bayesian statistics.

Wait wait wait. I use bayesian stats on a daily basis, but I learned about them from a real person after I learned about them from Yud the Spud. I'm aware he misapplies them to situations where using real numbers would be ludicrous (see anything that has Knuth's Up Arrow Notation), but I'm not aware of anything he gets actually wrong. Is there something?
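
For readers who haven't seen it, Knuth's up-arrow notation is just iterated exponentiation, and it explodes fast enough that attaching concrete probabilities to up-arrow-sized stakes is exactly as ludicrous as it sounds. A quick sketch of the standard definition:

```python
# Knuth's up-arrow notation: a ↑^n b. One arrow (n=1) is plain
# exponentiation; each extra arrow iterates the previous operation.
def knuth_up(a: int, n: int, b: int) -> int:
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    return knuth_up(a, n - 1, knuth_up(a, n, b - 1))

print(knuth_up(3, 1, 3))  # 3^3 = 27
print(knuth_up(3, 2, 3))  # 3↑↑3 = 3^(3^3) = 7625597484987
# 3↑↑↑3 is a power tower of 7,625,597,484,987 threes -- do not print it.
```

That last comment is the whole joke: one more arrow and the number no longer fits in the observable universe, let alone in an expected-value calculation.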

Moatman
Mar 21, 2014

Because the goof is all mine.

SolTerrasa posted:

Wait wait wait. I use bayesian stats on a daily basis, but I learned about them from a real person after I learned about them from Yud the Spud. I'm aware he misapplies them to situations where using real numbers would be ludicrous (see anything that has Knuth's Up Arrow Notation), but I'm not aware of anything he gets actually wrong. Is there something?

He doesn't understand that you can (and are supposed to) update priors. Also his hateboner for any other type of statistics is pretty ridiculous.
I'll switch the av once I figure out what to replace it with and/or have enough money to Scroogeproof it.
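
For anyone following along: "updating a prior" is the mechanical core of Bayesian statistics, not an optional extra. A toy coin-flip sketch with made-up numbers (the standard Beta-Binomial conjugate update, nothing Yud-specific):

```python
# Beta-Binomial conjugate update: start with a Beta(a, b) prior over a
# coin's heads-probability, then fold in the flips you actually observed.
def update(prior_a: float, prior_b: float, heads: int, tails: int):
    """Return the posterior Beta parameters after observing the flips."""
    return prior_a + heads, prior_b + tails

a, b = 1.0, 1.0                     # uniform prior: no opinion about the coin
a, b = update(a, b, heads=7, tails=3)
print(a / (a + b))                  # posterior mean: 8/12, about 0.667
```

The point is that the prior is just a starting guess that evidence is supposed to overwrite; holding it fixed defeats the whole machinery.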

SolTerrasa
Sep 2, 2011

The Sequences Digression One

This has taken way too long to assemble because I am too easily angered by Internet Smart Guys.

A long, long time ago, LessWrong was part of another website, called Overcoming Bias. OB is run by someone named Robin Hanson, a legitimately intelligent professor of economics at what appears to be a real school. Hanson thinks that the singularity will probably happen, but he does not think that friendly AI is important. In this, he is just a more optimistic version of me. I hope that the singularity will happen, but consider it ludicrous that we would need Friendly AI as a concept before we have consistently self-modifying AI. He works at the least crazy of the MIRI-clones, the Future of Humanity Institute. He also (being older and wiser than Big Yud) has a much better grasp on the whole "systematic human bias" thing. Basically he's a better version of Big Yud minus about thirty percent of the crazy.

Well, you know how Big Yud wants to misapply the Agreement Theorem to make sure no one ever has a difference of opinion? They had a debate. Robin Hanson (then Yud's mentor-figure) is uniquely suited to this debate. Anyone who's had an economics class knows: there's always that one kid who thinks he knows so, so much better than everyone else, including the professor. Yudkowsky is that kid, and Hanson treats him like that, over and over and over.

This debate goes so badly for Big Yud that it seems to be the reason that LessWrong exists at all; he left OB because of irreconcilable differences.

There is the slight problem of :words:. There are seven hundred pages of this debate. So I have distilled what I think are the most interesting parts, but please ask me if it turns out that some context is missing or unclear.

Here we go.

The posts begin innocuously enough. Hanson postulates something which he calls UberTool. UberTool is a tool which aids in task X, which aids in task Y, which aids in task Z, where task Z includes building better UberTools. Hanson asks "would you fund a venture capitalist who said they had discovered such a cycle, and whose plan was to use the subsequent inventions to dominate the world markets for X, Y, and Z?"

Yudkowsky posted:

You’ve got to be doing something that’s the same order of Cool as the invention of “animal
brains, human brains, farming, and industry.” I think this is the wrong list,
really; “farming” sets too low a standard. And certainly venture capitalists have
a tendency and a motive to exaggerate how neat their projects are.
But if, without exaggeration, you find yourself saying, “Well, that looks like
a much larger innovation than farming”—so as to leave some safety margin—
then why shouldn’t it have at least that large an impact?

Yudkowsky thinks this is bigger than farming, because he drew a direct parallel to AI, which is (in his head) capable of direct self-improvement in this way. Yudkowsky is like that; he likes extremes. AI is either the savior or destroyer of humanity. An idea is either better than farming or so dangerous as to be banned speech.

Hanson replies, effectively, "slow down there, Eliezer, I actually meant something more like Douglas Engelbart".

Douglas Engelbart is the guy who pioneered the GUI. For the programmers among us, now it makes sense. He also helped pioneer hypertext (the HT in HTTP), invented the mouse, and worked on early networking. These are all things that make you better at using computers, thus better at programming, thus better at making new cool inventions like that. He notably did not take over the world.

Hanson is saying "I would not fund UberTool. I don't think they'd take over the world, because other people who have accomplished their mission didn't." He is also saying "Yudkowsky is wrong." It's a call-out that he is not trying very hard to disguise. This was Hanson's point all along: self-improvement is not a magical device that will fix everything. His first post was a trap, meant to get Yudkowsky to say that self-improvement *is* a magical device that will fix everything (never answer an economist's rhetorical questions; they are always traps :v:). Then he explains that UberTool is really just computers and we already have them, and in post three he explains why self-improvement is not magical:

Hanson posted:

It is not so much that Engelbart missed a few key insights about what computer productivity tools would look like. I doubt it would have made much difference had he traveled in time to see a demo of modern tools. The point is that most tools require lots more than a few key insights to be effective—they also require thousands of small insights that usually accumulate from a large community of tool builders and users.

Small teams have at times suddenly acquired disproportionate power, and I'm sure their associates who anticipated this possibility used the usual human ways to consider that team's "friendliness." But I can't recall a time when such sudden small team power came from an UberTool scenario of rapidly mutually improving tools.

Some say we should worry that a small team of AI minds, or even a single mind, will find a way to rapidly improve themselves and take over the world. But what makes that scenario reasonable if the UberTool scenario is not?

This is a rhetorical question. Never answer an economist's rhetorical questions, they are always traps. :v:

Yudkowsky is invested in his argument, though. He's already started the Singularity Institute by this time, based on the idea that one such feedback loop (AI self-improvement) is so dangerous that it must be countered this instant, else we will all die a quick death at the hands of a cruel and uncaring god. So it is necessary in his head that all instances of feedback loops build immediately to world-domination levels.

So Yudkowsky starts telling his side of the story, why AI is inherently different from all other cases. A normal person would say "Aha! Special pleading!" But Robin Hanson is a polite sort of person, and instead lets Big Yud go on and on.

They go back and forth for a little while. Yudkowsky asserts that there are three kinds of predictability for the future. There is the "Strong Inside View," where every component of a system is predictable and easy to understand; most of the work is engineering effort and it's easy to know where things will go. Then there's the "Outside View," where you're basically just making poo poo up. Then there's the "Weak Inside View," where you're combining the two: you can predict some elements of the system but not all of them. He asserts that Hanson is using the Outside View while he himself is using the Weak Inside View, and consequently he is just more reliable and we shall all have to trust him.

Hanson, for reference, states that he believes the next major innovation (which may be AI, or may not) will come soon, but slowly, and be adopted worldwide over the course of years or decades, if not centuries, much like the concept of "industry." He believes this based on some complicated-but-not-obviously-wrong economic analysis; I wouldn't notice if it was wrong, I'm not an economist. Yudkowsky, meanwhile, is constantly dodging and diving around Hanson's increasingly direct requests for him to go ahead and state his position for real.

Hanson says:

Hanson posted:

I suspect it is near time for you to reveal to us your "weak inside view," i.e., the analysis that suggests to you that hand-coded AI is likely to appear in the next few decades, and that it is likely to appear in the form of a single machine suddenly able to take over the world.

Basically, Hanson is saying "then say why you think this, don't just assert it without evidence". This goes on, and on, and on... Yudkowsky continues never to provide evidence, or even to back up his theories at all.

After one more Yudkowsky :words: post which is "background" (read: does not state his position), Hanson starts being more direct with his "get to the point":

Hanson posted:

Eliezer, I can't imagine you really think I disagree with anything important in the above description.

Then Hanson talks for a while about how he thinks Yudkowsky's Inside vs Outside views theory is bullshit, but can't know for sure until Yudkowsky states his position. And Yudkowsky posts again with many, many :words:, without stating his position.

Hanson posted:

Eliezer, have I completely failed to communicate here?


And so Eliezer starts getting rude:

Eliezer posted:

Well . . . it shouldn't be surprising if you've communicated less than you thought.

Seriously, man, get to the point.

There's some more meaningless back and forth, :words: and :words: and :words:. The shortest possible version is that Yudkowsky is sticking to his guns on having secret knowledge that he can't share, but no really, we need to trust him. Hanson tries to summarize what he assumes Yudkowsky's points are (a technique called 'steelmanning', the opposite of strawmanning, where you formulate an argument for your opponent as strongly as you possibly can).

Eventually Hanson just gets fed up and posts the best economics-professor-tired-of-arrogant-student post I have ever seen:

Hanson posted:

It seems that the basis for Eliezer's claim that my analysis is based on untrustworthy "surface analogies" while his is based on "deep causes" is that, while I use long-vetted general social science understandings of factors influencing innovation, he uses his own new untested meta-level determinism theory. It seems he could accept that those not yet willing to accept his new theory might instead reasonably rely on my analysis.

At this point I'm going to stop posting, because I have hit the point where I realize that this may actually only be interesting to me. We are through 60 pages of 700, and if this is boring you to tears, well, it's not worth continuing. My notes don't start turning into "oh jesus christ gently caress you gently caress you gently caress you" until page 150 or so.

Peel
Dec 3, 2007

Please continue.

I'll just note that 'steelmanning' is also known as the 'principle of charity' and has been common in philosophy for decades if not longer, in case it's something LWers like to lay claim to. I've heard them do so a couple of times.

Peel fucked around with this message at 04:40 on Sep 27, 2014

Political Whores
Feb 13, 2012

SolTerrasa posted:

Wait wait wait. I use bayesian stats on a daily basis, but I learned about them from a real person after I learned about them from Yud the Spud. I'm aware he misapplies them to situations where using real numbers would be ludicrous (see anything that has Knuth's Up Arrow Notation), but I'm not aware of anything he gets actually wrong. Is there something?


Moatman posted:

He doesn't understand that you can/are supposed to update priors. Also his hateboner for any other type of statistics is pretty ridiculous.
I'll switch the av once I figure out what to replace it with amd/or have enough money to Scroogeproof it.


Priors are a part of it, but what really gets me is that Yudkowsky is obviously coming at the situation from the opposite end of someone actually interested in scientific enquiry or prediction, i.e. a real scientist. Where does Yudkowsky pull his prior probability distributions from? From his own imagination. They have basically no ties to observable reality, and the events he references (a hypothetical god AI coming into existence, for instance) are all extremely contingent on any number of other variables, if they are possible at all. It's not like he even justifies using some variant of reference prior for his beliefs, and in practice it doesn't really matter, because choosing a probability distribution is just a smokescreen for concealing his extremely unscientific certainty in his own wild imagination.

That's ultimately what the gigantic up-arrow-notation numbers represent: his certainty in his prediction, masked as a hypothetical about the number of people he could save. He already came to the conclusion he wants (that his stupid "research" is worth funding) and uses Bayesian statistics to create a scenario which basically validates it, with the bonus of it being ~scientific~. The numbers he uses may be "correct", but it's basically just cargo-cult probability calculation at that point.
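
The "made-up prior in, made-up conclusion out" point is easy to demonstrate: in the odds form of Bayes' rule, posterior odds are just prior odds times the likelihood ratio, so whatever you invented as the prior survives into the answer. A toy sketch with invented numbers:

```python
# Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio.
# With thin evidence, the invented prior completely determines the answer.
def posterior_prob(prior_p: float, likelihood_ratio: float) -> float:
    prior_odds = prior_p / (1.0 - prior_p)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)  # convert odds back to a probability

evidence = 10.0  # modest evidence favoring the hypothesis
print(posterior_prob(1e-9, evidence))  # prior of one-in-a-billion: still negligible
print(posterior_prob(0.5, evidence))   # coin-flip prior: now about 0.91
```

Same evidence, wildly different conclusions; the only thing that changed is the number pulled out of the imagination at the start.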

Tiggum
Oct 24, 2007

Your life and your quest end here.


The Unholy Ghost posted:

Okay, I misspoke, it was not really the science in the story that surprised me but the logical arguments and the way Harry manipulates people. From the perspective of someone who just wants to enjoy a story, it's quite entertaining.

I think if you look at the story as Harry taking over the magical world with both charlatan and legitimate methods, the story becomes incredibly interesting.

(Also: Dementors are kind of complete mysteries in Harry Potter so it's not that big of a deal if Yudkowsky wants to attribute some noncanon meaning to them. It is a Fanfiction after all [isn't it more about the overall metaphor than the exact details?])
I agree that there is certainly something to enjoy in there. As you say, fiction need not be realistic. And I did enjoy the early parts of the story quite a lot, as simply a work of fiction. The idea of "Harry Potter, but with a protagonist smart enough to see how dumb it all is" could certainly make for a pretty good parody. But the more of HPMOR you read the further it diverges from that premise, and the more it becomes Yudkowsky arguing with straw men.

And yeah, you could write a Harry Potter fanfic (or parody) where the dementors represented death instead of depression, but you'd have to change them to make that work, and Yudkowsky hasn't done that.

SolTerrasa posted:

At this point I'm going to stop posting, because I have hit the point where I realize that this may actually only be interesting to me. We are through 60 pages of 700, and if this is boring you to tears, well, it's not worth continuing.
I'm enjoying it also.

Tunicate
May 15, 2012

quote:

The idea of "Harry Potter, but with a protagonist smart enough to see how dumb it all is" could certainly make for a pretty good parody.
Andrew Hussie actually did that. It's hilarious.

AlbieQuirky
Oct 9, 2012

Just me and my 🌊dragon🐉 hanging out
HPMOR is worth reading because of the interest the HP books spark in people. It's the same reason The Wind Done Gone or Wide Sargasso Sea is worth reading. It's the same reason Weird Al Yankovic songs are worth listening to.

It's because the original source material had an impact (for good or ill) so the parody is also interesting because it explores some of the "but what if?" ideas people might have had while reading the original.

Yud confuses this with him being a philosopher/novelist in his own right, instead of a fan fictioneer/parodist.

Spazzle
Jul 5, 2003

I'm marginally involved in the Bay Area biohacking community. I'm continually frustrated by how most of these people are know-nothing shitheads.

I went to a party on the marotol and some guy tried to talk to me about Less Wrong. I blew it off as some random atheist community, but this thread has opened my eyes to the absurdity beneath.

The dumbest thing I heard recently was some chick at Noisebridge talking about how transcranial direct current stimulation was the best thing ever.

Nessus
Dec 22, 2003

After a Speaker vote, you may be entitled to a valuable coupon or voucher!



Spazzle posted:

I'm marginally involved in the Bay Area biohacking community. I'm continually frustrated by how most of these people are know-nothing shitheads.

I went to a party on the marotol and some guy tried to talk to me about Less Wrong. I blew it off as some random atheist community, but this thread has opened my eyes to the absurdity beneath.

The dumbest thing I heard recently was some chick at Noisebridge talking about how transcranial direct current stimulation was the best thing ever.
What the gently caress are all of these things you just mentioned? This sounds interesting. The last bit sounds like Niven's wireheading, has that been invented then?

Qwertycoatl
Dec 31, 2008

Nessus posted:

What the gently caress are all of these things you just mentioned? This sounds interesting. The last bit sounds like Niven's wireheading, has that been invented then?

It supposedly makes you more intelligent. This means it's a good idea to buy a kit from random people on the internet, attach electrodes to your head and fire it up.

Cardiovorax
Jun 5, 2011

I mean, if you're a successful actress and you go out of the house in a skirt and without underwear, knowing that paparazzi are just waiting for opportunities like this and that it has happened many times before, then there's really nobody you can blame for it but yourself.
In case it's not obvious, the whole thing is nonsense. More or less harmless, but about as useful as homeopathy.

Biohacking is cool, though. Ever since I heard about them, I've wanted one of those magnetic implants that let you feel electromagnetic fields as if you were touching them. Too bad they don't work long-term.

Tunicate
May 15, 2012

Cardiovorax posted:

In case it's not obvious, the whole thing is nonsense. More or less harmless, but about as useful as homeopathy.

Biohacking is cool, though. Ever since I heard about them, I've wanted one of those magnetic implants that let you feel electromagnetic fields as if you were touching them. Too bad they don't work long-term.

You get most of the effects by supergluing them on. Just shorter term, and with less surgery.

ungulateman
Apr 18, 2012

pretentious fuckwit who isn't half as literate or insightful or clever as he thinks he is

Tunicate posted:

Andrew Hussie actually did that. It's hilarious.

Is this a joke post, or do you have a link to the actual thing? As SA's resident person-who-likes-fanfic-too-much I'd like to see this!

Cardiovorax
Jun 5, 2011

I mean, if you're a successful actress and you go out of the house in a skirt and without underwear, knowing that paparazzi are just waiting for opportunities like this and that it has happened many times before, then there's really nobody you can blame for it but yourself.

Tunicate posted:

You get most of the effects by supergluing them on. Just shorter term, and with less surgery.
It's probably not nearly the same as having a magnet directly stimulate the nerves inside one of your fingers, but I'll have to give that a try.

Nessus
Dec 22, 2003

After a Speaker vote, you may be entitled to a valuable coupon or voucher!



ungulateman posted:

Is this a joke post, or do you have a link to the actual thing? As SA's resident person-who-likes-fanfic-too-much I'd like to see this!
You fool! He's tricked you into asking about Homestuck!

Strom Cuzewon
Jul 1, 2010

Tunicate posted:

You get most of the effects by supergluing them on. Just shorter term, and with less surgery.

Get a magnetic loop hearing aid and you can hear them too.

And see them if they're large enough

Ratoslov
Feb 15, 2012

Now prepare yourselves! You're the guests of honor at the Greatest Kung Fu Cannibal BBQ Ever!

Moatman posted:

He doesn't understand that you can/are supposed to update priors. Also his hateboner for any other type of statistics is pretty ridiculous.

Also, he refuses to acknowledge that something could have a probability of 1 or 0 because a lot of his dumb pseudo-Bayesian arguments rely on there being a non-zero possibility for something ridiculous. Of course this means he has no way of describing p(a|a) or p(a|!a). :eng99:
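For what it's worth, the 0-and-1 point falls straight out of the definition of conditional probability, P(A|B) = P(A and B) / P(B). Here's a toy enumeration (the two-coin sample space is just an arbitrary example) showing that P(A|A) = 1 and P(A|~A) = 0 regardless of what A is:

```python
from fractions import Fraction

# A tiny discrete sample space: two fair coin flips, each outcome 1/4.
omega = ["HH", "HT", "TH", "TT"]
p = {w: Fraction(1, 4) for w in omega}

def prob(event):
    """P(event) for a set of outcomes."""
    return sum(p[w] for w in event)

def cond(a, b):
    """P(a|b) computed directly from the definition P(a & b) / P(b)."""
    return prob(a & b) / prob(b)

A = {w for w in omega if w[0] == "H"}  # event: first flip is heads
not_A = set(omega) - A

print(cond(A, A))      # always 1: conditioning on A makes A certain
print(cond(A, not_A))  # always 0: A and ~A share no outcomes
```

So a probability calculus that forbids 0 and 1 outright can't even state these identities, which is the complaint.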


Crust First
May 1, 2013

Wrong lads.
Can anyone explain to me why Yudkowsky believes humans could even control his "Friendly AI" to begin with? Surely if we built something (a true AI) so good at self-improvement that it vastly outpaced the need for humans, it would likely self-improve right out of whatever initial box we built it in; isn't he just building blinders that would eventually get torn off, either by accident or on purpose, anyway?

Does he believe that an AI we couldn't comprehend would still use whatever he thinks of as "human logic"? I understand, to a point, wanting to start it off "Friendly", but surely that concept is going to be discarded like an old husk at some stage.

(I'm assuming he thinks this AI would grow rapidly and incomprehensibly powerful and uncontrollable, since otherwise who cares. I'm not sure that's a realistic scenario, but even if it were, why does he believe he can do something about it?)
