Added Space
Jul 13, 2012

Free Markets
Free People

Curse you Hayard-Gunnes!

Darth Walrus posted:

Seriously, I really want to see some Lesswrong posts on Aristotle. I mean, they have every reason to be :smug: about how he's a pre-Enlightenment guy who got a vast amount of stuff comically wrong, and yet his scientific method is an ideal fit for Yudkowsky, and HPMOR really does seem to be championing it in practice if not necessarily in theory.

http://lesswrong.com/lw/ns/empty_labels/

EY pointing out in extended, pedantic fashion that syllogisms don't map well to real life.

http://lesswrong.com/lw/te/three_fallacies_of_teleology/

A commentary on how intent is confused with result (accurate as far as I can tell).

http://lesswrong.com/lw/m1/guardians_of_ayn_rand/

Part of a too-long sequence about cults, saying that while Ayn Rand and Aristotle may have been good in their day, they fell into a trap of cultish behavior as time and sensibilities marched past them.

JosephWongKS
Apr 4, 2009

by Nyc_Tattoo
Chapter 8: Positive Bias
Part Two


quote:


Hermione opened her mouth to reply to this, but then she couldn't think of any possible reply to... whatever it was she'd just heard, even as the boy walked over to her, looked inside the compartment, nodded with satisfaction, and sat down on the bench across from her own. His trunk scurried in after him, grew to three times its former diameter and snuggled up next to her own in an oddly disturbing fashion.

"Please, have a seat," said the boy, "and do please close the door behind you, if you would. Don't worry, I don't bite anyone who doesn't bite me first." He was already unwinding the scarf from around his head.

The imputation that this boy thought she was scared of him made her hand send the door sliding shut, jamming it into the wall with unnecessary force. She spun around and saw a young face with bright, laughing green eyes, and an angry red-dark scar set into his forehead that reminded her of something in the back of her mind but right now she had more important things to think about. "I didn't say I was Hermione Granger!"

"I didn't say you said you were Hermione Granger, I just said you were Hermione Granger. If you're asking how I know, it's because I know everything. Good evening ladies and gentlemen, my name is Harry James Potter-Evans-Verres or Harry Potter for short, I know that probably doesn't mean anything to you for a change -"


What on earth made Eliezarry think this was a good way to introduce yourself to someone else? There’s being dorky and awkward, and then there’s being obnoxious and annoying.


quote:


Hermione's mind finally made the connection. The scar on his forehead, the shape of a lightning bolt. "Harry Potter! You're in Modern Magical History and The Rise and Fall of the Dark Arts and Great Wizarding Events of the Twentieth Century." It was actually the very first time in her whole life that she'd met someone from inside a book, and it was a rather odd feeling.

The boy blinked three times. "I'm in books? Wait, of course I'm in books... what a strange thought."

"Goodness, didn't you know?" said Hermione. "I'd have found out everything I could if it was me."

The boy spoke rather dryly. "Miss Granger, it has been less than 72 hours since I went to Diagon Alley and discovered my claim to fame. I have spent the last two days buying science books. Believe me, I intend to find out everything I can." The boy hesitated. "What do the books say about me?"


Less than 72 hours, yet he already knows so much about wizarding society and politics, enough to discourse with Malfoy on an equal basis. What a prodigy! Such genius! :alllears:


quote:


Hermione Granger's mind flashed back, she hadn't realised she would be tested on those books so she'd read them only once, but it was just a month ago so the material was still fresh in her mind. "You're the only one who's survived the Killing Curse so you're called the Boy-Who-Lived. You were born to James Potter and Lily Potter formerly Lily Evans on the 31st of July 1980. On the 31st of October 1981 the Dark Lord He-Who-Must-Not-Be-Named though I don't know why not attacked your home. You were found alive with the scar on your forehead in the ruins of your parents' house near the burnt remains of You-Know-Who's body. Chief Warlock Albus Percival Wulfric Brian Dumbledore sent you off somewhere, no one knows where. The Rise and Fall of the Dark Arts claims that you survived because of your mother's love and that your scar contains all of the Dark Lord's magical power and that the centaurs fear you, but Great Wizarding Events of the Twentieth Century doesn't mention anything like that and Modern Magical History warns that there are lots of crackpot theories about you."

The boy's mouth was hanging open. "Were you told to wait for Harry Potter on the train to Hogwarts, or something like that?"

"No," Hermione said. "Who told you about me? "

"Professor McGonagall and I believe I see why. Do you have an eidetic memory, Hermione?"

Hermione shook her head. "It's not photographic, I've always wished it was but I had to read my school books five times over to memorize them all."

"Really," the boy said in a slightly strangled voice. "I hope you don't mind if I test that - it's not that I don't believe you, but as the saying goes, 'Trust, but verify'. No point in wondering when I can just do the experiment."

Hermione smiled, rather smugly. She so loved tests. "Go ahead."

The boy stuck a hand into a pouch at his side and said "Magical Drafts and Potions by Arsenius Jigger". When he withdrew his hand it was holding the book he'd named.

Instantly Hermione wanted one of those pouches more than she'd ever wanted anything.

The boy opened the book to somewhere in the middle and looked down. "If you were making oil of sharpness -"

"I can see that page from here, you know!"

The boy tilted the book so that she couldn't see it any more, and flipped the pages again. "If you were brewing a potion of spider climbing, what would be the next ingredient you added after the Acromantula silk?"

"After dropping in the silk, wait until the potion has turned exactly the shade of the cloudless dawn sky, 8 degrees from the horizon and 8 minutes before the tip of the sun first becomes visible. Stir eight times widdershins and once deasil, and then add eight drams of unicorn bogies."

The boy shut the book with a sharp snap and put the book back into his pouch, which swallowed it with a small burping noise. "Well well well well well well. I should like to make you a proposition, Miss Granger."

"A proposition?" Hermione said suspiciously. Girls weren't supposed to listen to those.


Nothing particularly offensive past Harry’s ridiculous self-introduction, and Hermione hasn’t prostrated herself before Harry’s brilliance. I’ll give the beginning of this chapter a passing mark.

Night10194
Feb 13, 2012

We'll start,
like many good things,
with a bear.

Those are, of course, all D&D potions.

Luna Was Here
Mar 21, 2013

Lipstick Apathy
Saw this, decided to give the thread a read

what sort of state of mind do you have to be in to say half of these things, whether it's hpmor or just other stuff yud has written, like

loving christ

JosephWongKS
Apr 4, 2009

by Nyc_Tattoo
Chapter 8: Positive Bias
Part Three


quote:


It was also at this point that Hermione realised the other thing - well, one of the things - which was odd about the boy. Apparently people who were in books actually sounded like a book when they talked. This was quite the surprising discovery.

The boy reached into his pouch and said, "can of pop", retrieving a bright green cylinder. He held it out to her and said, "Can I offer you something to drink?"

Hermione politely accepted the fizzy drink. In fact she was feeling sort of thirsty by now. "Thank you very much," Hermione said as she popped the top. "Was that your proposition?"

The boy coughed. "No," he said. Just as Hermione started to drink, he said, "I'd like you to help me take over the universe."

Hermione finished her drink and lowered the can. "No thank you, I'm not evil."

The boy looked at her in surprise, as though he'd been expecting some other answer. "Well, I was speaking a bit rhetorically," he said. "In the sense of the Baconian project, you know, not political power. 'The effecting of all things possible' and so on. I want to conduct experimental studies of spells, figure out the underlying laws, bring magic into the domain of science, merge the wizarding and Muggle worlds, raise the entire planet's standard of living, move humanity centuries ahead, discover the secret of immortality, colonize the Solar System, explore the galaxy, and most importantly, figure out what the heck is really going on here because all of this is blatantly impossible."

That sounded a bit more interesting. "And?"

The boy stared at her incredulously. "And? That's not enough? "

"And what do you want from me?" said Hermione.

"I want you to help me do the research, of course. With your encyclopedic memory added to my intelligence and rationality, we'll have the Baconian project finished in no time, where by 'no time' I mean probably at least thirty-five years."

Hermione was beginning to find this boy annoying. "I haven't seen you do anything intelligent. Maybe I'll let you help me with my research."


Hermione is instantly my favourite character in this story.


quote:


There was a certain silence in the compartment.

"So you're asking me to demonstrate my intelligence, then," said the boy after a long pause.

Hermione nodded.

"I warn you that challenging my ingenuity is a dangerous project, and tends to make your life a lot more surreal."


On a scale of 1-10 for pretentiousness, this must be at least an 8. And there are still more than 110 chapters of this story to go. Plenty of time for Eliezarry to top himself.


quote:


"I'm not impressed yet," Hermione said. Unnoticed, the green drink once again rose to her lips.

"Well, maybe this will impress you," the boy said. He leaned forward and looked at her intensely. "I've already done a bit of experimenting and I found out that I don't need the wand, I can make anything I want happen just by snapping my fingers."

It came just as Hermione was in the middle of swallowing, and she choked and coughed and expelled the bright green fluid.

Onto her brand new, never-worn witch's robes, on the very first day of school.

Hermione actually screamed. It was a high-pitched sound that sounded like an air raid siren in the closed compartment. "Eek! My clothes! "

"Don't panic!" said the boy. "I can fix it for you. Just watch!" He raised a hand and snapped his fingers.
"You'll -" Then she looked down at herself.

The green fluid was still there, but even as she watched, it started to vanish and fade and within just a few moments, it was like she'd never spilled anything at herself.

Hermione stared at the boy, who was wearing a rather smug sort of smile.

Wordless wandless magic! At his age? When he'd only gotten the schoolbooks three days ago?


That’s not a showcase of “intelligence” per se, that’s just Eliezarry getting super-special snowflake powers for no reason other than him being the author’s self-insert Harry Sue of the story.

Telarra
Oct 9, 2012

No, it's him being a smartass and letting her think he did something when the drink did it itself.

Seraphic Neoman
Jul 19, 2011


no guys this is called negging my friend told me all about this no wait where are you going

Legacyspy
Oct 25, 2008

akulanization posted:

Ah yes, the old "I meant to do that" defense, Harry fails at being High King of Rational Mountain because a) a lot of his science is lovely, half right, or surface level but is made into the gospel truth in the story b) because he isn't actually motivated by knowledge or exploration, he is motivated by power. You say he might have been intended to be like Artemis Fowl, but that is trivially untrue; Artemis Fowl is never held up as an example, and he certainly isn't meant to convince you that you should follow the Way of Fowl. Harry however is supposed to teach the audience how to be "rationalists" and really is never defeated or outdone in his area of "expertise" in the context of the story. Much of this is achieved by undermining the other characters or by choosing to have the world work the way that harry guesses it will work.

First of all, it was Nessus, not me, who said Harry is like Artemis. I could remember some similarity so I rolled with it. However I haven't read any Artemis Fowl books in at least 10 years, so IDK. So if you want to argue whether or not Harry is like Artemis Fowl, argue with Nessus, he clearly thinks so.

Second of all, to those of you saying that Harry Potter was intended to be a paragon of rationality I literally just asked Eliezer:

quote:

re: Is harry supposed to be a uber rationalist?

from EliezerYudkowsky sent just now

Nope, he's supposed to be an inexperienced baby rationalist.


So clearly Harry isn't supposed to be "High King of Rational Mountain".

SSNeoman posted:

Ah, but you see he was! We're supposed to think the science is real. We're supposed to think he is uber-rationalistic. The author pretty much flat out admitted this point. If he fails to deliver, then it's a bad story. And it fails to deliver.

See above.

How was Harry's blackmail going to ruin McGonagall's life? His blackmail consisted of "If you don't tell me the truth, I'll go ask questions elsewhere", which I think is a perfectly fine thing to say when the truth is about whether or not the Dark Lord who killed your parents, and tried to kill you, is still alive.

Your last paragraph I agree with, except that behavior doesn't bother me. I think we were meant to like him, I like him, and many others do. If he was a real person I probably wouldn't like interacting with him, though.

akulanization posted:

it's author so odious

How so?

Nessus posted:

It seems like your goal is to get people to say "I am upset by the behavior of the character in the lovely fanfiction being roasted."

Naw. I have two motivations. I wanted to understand why people dislike Harry so much, and initially people were going "He is infuriating because he is irritating", which doesn't explain much; there have been better explanations since then. The second is that I honestly like hpmor & Eliezer. I'd rate it a 7/10 and I do not think the hate against Eliezer is warranted. I think a lot of it stems from people not understanding what he writes (through little fault of his own). Like the guy on the first page who, when speaking of the torture vs dust specks, says "3^^^3 is the same thing as 3^3^3^3" despite the fact that in the very post where Eliezer raised the discussion, he explained the notation before even getting to the torture & dust specks. I don't think this guy is at fault for this. He probably got this from someone else. I'm just using this as a really simple example of how the people who mock Eliezer's writing don't even understand basic things he's written and are mocking him from a position of ignorance. Similarly I don't think "friendly A.I" is some sort of crazy idea. It seems pretty reasonable. It is basically: "Are there solutions to problems where the solutions are so complex we can't understand everything about the solution? If yes, how do we build something that will give solutions to problems without those solutions conflicting with other things we care about?"

Telarra
Oct 9, 2012

Ohhhh, it's 3^^^3 dust specks, not 3^3^3^3? In that case I agree, the torture option is obviously the morally correct choice.

Telarra
Oct 9, 2012

More seriously, yes, friendly AI isn't some crazy idea. Science fiction has toyed with the idea of artificial minds overthrowing their creators for probably a century by now, to the point where everyone is aware of it. The problem is that Yud claims to have more than science fiction to contribute here, and he doesn't. He has unitless utilitarianism and a fetish for Bayes' theorem, and that's about it.
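
(For reference, the theorem getting fetishized is a one-line identity for flipping a conditional probability; in LaTeX:)

code:

P(H \mid E) = \frac{P(E \mid H)\, P(H)}{P(E)}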

Nessus
Dec 22, 2003

After a Speaker vote, you may be entitled to a valuable coupon or voucher!



Legacyspy posted:

First of all, it was Nessus, not me, who said Harry is like Artemis. I could remember some similarity so I rolled with it. However I haven't read any Artemis Fowl books in at least 10 years, so IDK. So if you want to argue whether or not Harry is like Artemis Fowl, argue with Nessus, he clearly thinks so.

[…]
I think you should buy Big Yud an account. I'd love to have him come in and explain AI Jesus to us.

Krotera
Jun 16, 2013

I AM INTO MATHEMATICAL CALCULATIONS AND MANY METHODS USED IN THE STOCK MARKET
If it makes a difference to you, Legacyspy, I hate on Eliezer all the time for a lot of the same reasons as the rest, but this fic is better than I expected.

Uh, it's not hate, but I think he's pretty pretentious, doesn't fact-check himself adequately, and tries to look important for achievements that are pretty insignificant. I haven't seen him do anything that I don't imagine I could have done, but I don't think I'm self-important enough to represent his achievements the way he does, so it annoys me when he gets money and popularity and fame. I think it's a stretch to say he's done nothing because he's written hundreds of thousands of words, but I'm not convinced the words are very substantial or novel. I think his attitude towards others' accomplishments makes him unlikely to succeed in a real sense, although it might get him some attention.

He's a natural for shilling, which I don't mean as an insult. The last few times I've had to shill something it's left me feeling a little sour and uncomfortable, but he seems to shill his own work pretty effortlessly. Maybe it's practice.

Oh yeah, his programming language, Flare, didn't impress me, and I have longposts about that in the Less Wrong thread. It's not that bad for a first-time language designer (I'd be lying if I said I thought it was average or above average), but I exaggerated for humor. It's pretty nebulous, like the rest of what he does. That's left a more lasting impression on me than it really deserves (it was a long time ago, too) because language design is a big area of interest for me outside of mockthread stuff. It reminds me of stuff I was doing when I was 14.

Fic has some really lovely moments (every Malfoy convo except the first one) and I think Harry is irritating, but it's funny when Yud doesn't hijack other characters to make Harry look smart. He's pretty good at providing examples of cleverness that aren't totally contrived, although Harry's plans, in all their intricate details, have such a strong track record of success so far that they feel contrived -- other characters are reacting to Harry or following Harry's script more than Harry appears to be reacting to other characters, which makes everything feel a little bit authorial.

There's something really stilted about the prose, but it's ignorable and I blame inexperience on Yud's part. I give the fic as a whole three out of five stars so far, with some four-star moments. Mind that I'd be a lot more unfavorable if I thought he took this as seriously as a lot of the people following this "Let's Read" seem to think he does.

Oh yeah, it's hypothetical but if he had an account I'd be interested in talking. I don't know if he would go for that though.

i81icu812
Dec 5, 2006

Fried Chicken posted:

Well I was expecting something weird and arrogant based off everything else, but instead that's really sad. I dunno, I can recognize the pain there, but at a certain point you need to rise above and not be a complete poo poo like he is as an adult


http://www.scientificamerican.com/article/the-secret-to-raising-smart-kids1/

I think it definitely counts as weird. Sad, yes. But definitely very weird. I'm sure some actual developmental psych person could help make sense of it all, but it is a fascinating glimpse into the mind that wrote this. http://web.archive.org/web/20010205221413/http://sysopmind.com/eliezer.html#timeline_the



RE raising smart children and reinforcement:

Northwestern's Center for Talent Development, Hopkins's Center for Talented Youth, and other gifted-and-talented programs run testing programs that administer standardized tests to children, typically around middle school, after which students can be sent off for summer programs. Ostensibly this is to identify smart children and offer them opportunities to attend advanced classes amongst their peers. Less charitable wags would note that the programs are primarily funded through tuition from the classes they offer all students who take their screening tests, and that enrollment has increased year-over-year for decades. IDK. Gifted education is kinda a mess.

Regardless, the programs exist to tell people their kids are smart and should be around other smart kids. Let's see what taking the test (and some practice tests!) did to Yud:

quote:

I obtained a couple of SAT preparation books - one targeted specifically on Math, and one targeted on the whole SAT (Math and Verbal). I took a few practice tests from the Math book, and with each additional test, my scores went down. I got a 570, then a 530, then a 460 (9). "Huh?" I said to myself.

So I took another practice test, this time resolving to, as Ben Kenobi would say, "act on instinct". (That actual phrase, in Ben's voice, ran through my head.) (10). I got a 640 Math. The lesson I learned was to trust my intuitions, because my intuitions are always right - probably one of the most important lessons of my entire life.

On my actual SAT, I got a 670 Verbal and a 740 Math. The Midwest Talent Search informed me that this had placed second Verbal, third Math, and second Combined, for the seventh grade, for the Midwest. Their statistics said I was at the 99.9998th percentile. It wasn't until years later that I realized their stats were worthless because I'd skipped a grade, and to this day, I still don't know what percentile I'm really in.

This was the first real sign that I was not only bright but waayy out of the ordinary...

I think, parentally, this is where you try to talk about standardized tests' poor behavior at the edges of the bell curve and about test repeatability/teachability, and try to keep the kid grounded... IDK. Gifted education is really a mess, and Yud's childhood case is sadly not uncommon.


Anecdotally, I took the SAT around the same time as Yud in a similar gifted screening/summer school salesmanship exercise. I'm pretty sure I scored about the same as Yud did. The top ~30ish scorers from my state got certificates and they all turned out more or less normal by the time high school graduation rolled around (small state, most ended up in the same 2 high schools). But for the grace of God...

i81icu812 fucked around with this message at 06:45 on Mar 25, 2015

Seraphic Neoman
Jul 19, 2011


I will defend my position but I need JWKS to get to the later parts of this dreck before I can. That said,

Legacyspy posted:

Naw. I have two motivations. I wanted to understand why people dislike Harry so much, and initially people were going "He is infuriating because he is irritating", which doesn't explain much; there have been better explanations since then. The second is that I honestly like hpmor & Eliezer. I'd rate it a 7/10 and I do not think the hate against Eliezer is warranted. I think a lot of it stems from people not understanding what he writes (through little fault of his own). Like the guy on the first page who, when speaking of the torture vs dust specks, says "3^^^3 is the same thing as 3^3^3^3" despite the fact that in the very post where Eliezer raised the discussion, he explained the notation before even getting to the torture & dust specks.

Right, my mistake. It's actually 3^^7625597484987, which is the godawful number you get by stacking a power tower of 3s 7,625,597,484,987 levels tall, and which is still the same thing as "a meaninglessly big number", which means my point still stands.
There is no reason to involve numbers in this problem, it's a problem concerning philosophical morality, unless you're a smartass who wants to flash his cock. Yud could have asked "Should one person get tortured for a decade or should every other person on Earth get a grain of sand in their eye?" But that wasn't nerdy enough so they do this poo poo.
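
For anyone who wants the notation pinned down, it's a three-line recursion. (A minimal sketch; the function name and arguments are mine.)

code:

# Knuth's up-arrow notation: one arrow is plain exponentiation,
# and each extra arrow iterates the operation one level below it.
def up_arrow(a, arrows, b):
    if arrows == 1:
        return a ** b
    if b == 0:
        return 1
    return up_arrow(a, arrows - 1, up_arrow(a, arrows, b - 1))

print(up_arrow(3, 1, 3))  # 3^3 = 27
print(up_arrow(3, 2, 3))  # 3^^3 = 3^(3^3) = 7,625,597,484,987
# up_arrow(3, 3, 3) is 3^^^3: a tower of 3s 7,625,597,484,987 levels tall.
# Don't run that one; it would blow the stack long before the universe ended.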

But you know what fine, whatever. Lemme explain why Yud's conclusion is nuts.

So there is a branch of philosophy called utilitarianism. The basic premise is that it wants to achieve the greatest amount of happiness for the greatest number of people (in broad strokes).
In Yud's PHILO101 he blasts the problem with a blunt solution. One dude is tortured for 50 years, or a huge number of people are each inconvenienced (i.e. the dust speck) for like 1 second? Well obviously we'd pick option 2. But when you SHUT UP AND MULTIPLY our second by that "meaninglessly big number", suddenly having one dude tortured for 50 years doesn't sound that bad, right? BEEP BOOP PROBLEM SOLVED GET hosed CENTURIES OF PHILOSOPHY
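
Here's the "multiply" step in toy numbers. (The utility units and magnitudes are entirely made up for illustration; nobody has a real scale, which is rather the point.)

code:

# Naive additive aggregation, the move the whole argument depends on.
SPECK_HARM = 1e-9        # hypothetical disutility of one dust speck
TORTURE_HARM = 1e12      # hypothetical disutility of 50 years of torture
PEOPLE = 10 ** 100       # stand-in for 3^^^3, which fits in no machine

print(SPECK_HARM * PEOPLE > TORTURE_HARM)  # True: pick the torture
# The conclusion follows only if both harms sit on one additive scale,
# which is exactly the premise disputed below.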

Well of course not. There are other factors involved when you realize that we are dealing with a being who has life, willpower, the ability to feel pain and all that other good poo poo. We are taking 50 years out of a person's life and replacing it with pain and misery, instead of inconveniencing a lot of people for an insignificant amount of time. This logic, by the way, is still equating torture with the pain caused by dust specks. I am still playing by Yud's rules despite the fact the two are obviously not equal. You are crushing a person's dreams and ambitions just for the sake of not bothering a whole lot of people over something they won't even remember. What about the life this poor person is missing out on by going through this torture? All those people won't even remember the speck by the end of the day, but that person will carry his PTSD for the rest of his (no doubt shortened) life, if he's not loving catatonic by the end of the first year.
I am explaining this in-depth because it's a problem that requires you to do so. It is something LW themselves avoid doing, and I hope you are not falling into the same trap.

Yud, by the way, totally dismisses people who point this out, because he misapplies his own concept of Scope Insensitivity to the solution. He uses an absolutely absurd and unrelated example to prove his point, one fantastically different from the problem at hand.

Legacyspy posted:

I'm just using this as a really simple example of how the people who mock Eliezer's writing don't even understand basic things hes written and are mocking him from a position of ignorance.

NO.
I understand what Yud and co write. What I don't, I ask others until I do. Some of it is profound, but most of it is them re-inventing philosophical wheels. And sometimes they decide that these wheels should be squares instead.
Do you remember Roko's Basilisk? That was a user taking Yud's ridiculous loving AI anecdote to its absurd conclusion. That is the reason (I suspect) why Yud hates it so, because it shows what a house of cards his whole philosophy is.
That is why he hates non-Bayesian AI ideas; because they make him irrelevant.
At one point he was even backed into a philosophical corner by that same "we know in our hearts torture is the right answer" guy and he threw a hissy fit instead of saying "I don't know".
I could go on about their myriad faults, but I am very much not arguing from a position of ignorance. Yud's research center has barely published any papers, and none are helpful or relevant to society. He is pondering an irrelevant problem that will never have a solution, nor does it even need to be solved. And if it does come up, Yud's solution will be wrong. I can explain why in detail if you want me to, but this part is already long as gently caress.

I'm really annoyed that you'd use this defense because this is the sort of bullshit LW loves. Instead of trying to explain things in simple terms, they use complex ones or ones they made up. Not only does this go against their own mission statement, it's condescending and disingenuous. People have debunked Yud's ideas, he just doesn't want to admit it (like how he doesn't want to admit that he lost his AI box roleplay 4 times in a row after two wins, which is why he doesn't offer that challenge anymore). So no, you're wrong.

Legacyspy posted:

Similarly I don't think "friendly A.I" is some sort of crazy idea. It seems pretty reasonable. It is basically: "Are there solutions to problems where the solutions are so complex we can't understand everything about the solution? If yes, how do we build something that will give solutions to problems without those solutions conflicting with other things we care about?"

Except Yud wants a very specific friendly AI, one that uses Bayesian probability to achieve godlike omnipotence, one that will handle all tasks ever and can easily lord over our entire society.

Seraphic Neoman fucked around with this message at 08:25 on Mar 24, 2015

VictualSquid
Feb 29, 2012

Gently enveloping the target with indiscriminate love.
That numbers game is one of the most obvious indications that Yudkowsky doesn't really get what he is talking about.

The exact number that is needed is not relevant to the example.
Most real philosophers or mathematicians would just say that there is an arbitrary number with those properties and argue from there.
The rest would compute the number, like Graham famously did that one time.

But Yudkowsky insists on inventing a number. He has no justification at all for choosing this number, but his followers insist that it is the right number. Why?
For the purposes of the argument there is no difference between 3^^4 and 3^^^^3, or even 3 itself. I even suspect that the reason Yud used 3 instead of 4 as the base was that Wikipedia already has worked examples for it.

Tunicate
May 15, 2012

i81icu812 posted:

Anecdotally, I took the SAT around the same time as Yud in a similar gifted screening/summer school salesmanship exercise. I'm pretty sure I scored about the same as Yud did. The top ~30ish scorers from my state got certificates and they all turned out more or less normal by the time high school graduation rolled around (small state, most ended up in the same 2 high schools). But for the grace of God...

Also, like ten people at my middle school got perfect 1600s on the SAT. Admittedly, that's across sixth-through-eighth, but I strongly suspect Yud fell for a scam.

Tunicate fucked around with this message at 16:03 on Mar 24, 2015

anilEhilated
Feb 17, 2014

But I say fuck the rain.

Grimey Drawer

quote:

So I took another practice test, this time resolving to, as Ben Kenobi would say, "act on instinct". (That actual phrase, in Ben's voice, ran through my head.) (10). I got a 640 Math. The lesson I learned was to trust my intuitions, because my intuitions are always right - probably one of the most important lessons of my entire life.
Now that's not very scientific, is it? The level of doublethink required to keep thinking that while still maintaining the pretense of rationality must be insane.

anilEhilated fucked around with this message at 16:25 on Mar 24, 2015

Luna Was Here
Mar 21, 2013

Lipstick Apathy
If you are in the 99.9999th percentile of loving anything, you do not go on in life to write Harry Potter fanfics and logical fallacies. gently caress, I know people who got 10s and 12s on the ACT and even they get the basic concept of prisoner's dilemmas

I'm not meaning to be dissing on the guy and it's likely that he did do pretty well on tests way back in the day, but doing well on basic English and math tests in middle school and then dropping out of school because you're so much better than the system does not give you the qualifications to make up terms and assert that your way of thinking is so much better than everyone else's

su3su2u1
Apr 23, 2014

Legacyspy posted:

[…] I'm just using this as a really simple example of how the people who mock Eliezer's writing don't even understand basic things he's written and are mocking him from a position of ignorance.

I mock Yudkowsky from a position of strength rather than of ignorance: I have a PhD in physics, and Yud included (wrong) physics references in his fanfic. He incorrectly references psychology experiments in his fanfic. He uses the incorrect names for biases in his fanfic. He gets computational complexity stuff wrong in his fanfic. He does this despite insisting on page 1 that "All science mentioned is real science." Let me ask you: if an AI researcher can't get computational complexity correct, why should I trust anything else he writes? If someone who has founded a community around avoiding mental biases can't get the references right in a fanfic, why should I trust his other writing?

His technical work is no better. His "Timeless Decision Theory" paper is 100 pages of rambling, with no formal definition of the theory anywhere (and it would be super easy to formalize the theory as described). His research institute is a joke: they've been operating for more than a decade with only 1 paper on arXiv and basically no citations to any of their self-published garbage.

Bobbin Threadbare
Jan 2, 2009

I'm looking for a flock of urbanmechs.

Luna Was Here posted:

If you are in the 99.9999th percentile of loving anything, you do not go on in life to write Harry Potter fanfics and logical fallacies. gently caress, I know people who got 10s and 12s on the ACT and even they get the basic concept of prisoner's dilemmas

I'm not meaning to be dissing on the guy and it's likely that he did do pretty well on tests way back in the day, but doing well on basic English and math tests in middle school and then dropping out of school because you're so much better than the system does not give you the qualifications to make up terms and assert that your way of thinking is so much better than everyone else's

Now you're giving too much credit to standardized testing.

akulanization
Dec 21, 2013

Legacyspy posted:

First of all, it was Nessus, not me, who said Harry is like Artemis. I could remember some similarity so I rolled with it. However I haven't read any Artemis Fowl books in at least 10 years, so IDK. So if you want to argue whether or not Harry is like Artemis Fowl, argue with Nessus, he clearly thinks so.

Legacyspy posted:

Are you saying that Harry is supposed to be a "uber-rationalist super-optimizer", but fails to do so? I never got that conclusion that at all. I thought he was supposed to be another character in the vein of Artemis Fowl, Ender, Bean, or Miles Vorkosigan.
:ironicat:


Legacyspy posted:

Second of all, to those of you saying that Harry Potter was intended to be a paragon of rationality I literally just asked Eliezer:


So clearly Harry isn't supposed to be "High King of Rational Mountain".
I have no evidence this conversation actually occurred, but if it did, that still doesn't change my point. Harriezer is held up as an example; he's powerful and has agency because he is a rationalist. He is obviously meant to inspire people to be rationalists like himself, a perception that isn't helped by the author saying poo poo like this:

Big Yud posted:

To learn almost everything that Harry knows, the best current free online solution is to read the Sequences at LessWrong.com – two years of blog posts that tried to introduce just about everything that I thought a rationalist needed to know as of 2007, starting with basic theory of knowledge, Bayesian probability theory, cognitive biases, evolutionary psychology, social psychology, and going on into the more arcane realms of reductionism and demystified quantum mechanics. Believe it or not, Harry is only allowed to draw on around half of the easier Sequences – if he knew all of them, he would be too powerful a character and break the story.
You see, rationalism is a superpower for Yud and Harriezer is definitely supposed to illustrate that. If Harriezer wasn't an example I doubt Yud could use him to pimp his stupid website, and he might have said something to dissuade all those people who left comments talking about how the character is an example. But he didn't because this work exists to demonstrate the "power" of rationality.

Legacyspy posted:

How was Harry's blackmail going to ruin McGonagall's life? His blackmail consisted of "If you don't tell me the truth, I'll go ask questions elsewhere", which I think is a perfectly fine thing to say when the truth is about whether or not the Dark Lord who killed your parents, and tried to kill you, is still alive.
Harriezer knows little to nothing about his world, and he's talking to a powerful adult who has done nothing unkind or underhanded to him. McGonagall actually was one of the people fighting said Dark Lord, while Harriezer is a snot-nosed brat who has demonstrated that he lacks the maturity to deal with the secrets he wants. He demands that someone whose mettle has been proven by blood and sacrifice unconditionally trust him when he had no idea what the issues were or what might have been at stake when he woke up that morning. He then says that he has defenses against a magic that he doesn't have any reason to know existed (I guess he found a copy of the script in his wizard bank vault or something), which is obviously bullshit, but he doesn't get his mind wiped because he's dealing with a good person. That sequence is all about how his guardians are absolutely right to keep secrets from him, because he doesn't have enough self-control to be trusted with anything. But I doubt that Big "Teacher Biting" Yud thinks that's a negative.

Legacyspy posted:

Your last paragraph I agree with, except that behavior doesn't bother me. I think we were meant to like him, I like him, and many others do. If he was a real person I probably wouldn't like interacting with him, though.
I agree with the earlier poster, you must be fantasizing about putting yourself in his shoes, since there have been maybe two conversations thus far where he wasn't a oval office.

Legacyspy posted:

Naw. I have two motivations. I wanted to understand why people dislike Harry so much […] Like the guy on the first page who, when speaking of the torture vs dust specks, says "3^^^3 is the same thing as 3^3^3^3" […] Similarly I don't think "friendly A.I" is some sort of crazy idea. It seems pretty reasonable.
You obviously don't understand that treating people like dirt and being an rear end for no reason makes someone unlikeable to many people; you really must fit in at Less Wrong! Also, the actual number is immaterial if you want to approach this problem from the perspective of mathematics: it's an arbitrarily large real number. If you grant the premise that both events are measured on the same scale and that minimizing that number is a priority, then the number doesn't matter; you can always select a number big enough that the "math" works out the way you want. The problem with the torture proponents' stance is that, as SSNEOMAN pointed out, they don't actually consider the problem. They wipe out a life because that's supposedly better than a trifling inconvenience to a huge number of people.

Yud's friendly AI meanwhile is probably a worthless and intellectually bankrupt idea. While I'm going secondhand here, given that Yud has never been cited and hasn't published a model himself, it's pretty clear that even if you think his goals are worthwhile you should recognize that he will never move them forward, let alone complete them as he promises.

SolTerrasa
Sep 2, 2011

Legacyspy posted:

I have two motivations. I wanted to understand why people dislike Harry so much, and initially people were going "He is infuriating because he is irritating", which doesn't explain much; there have been better explanations since then. The second is that I honestly like hpmor & Eliezer. I'd rate it a 7/10 and I do not think the hate against Eliezer is warranted. I think a lot of it stems from people not understanding what he writes (through little fault of his own).

Like su3su2u1, I am not mocking Yudkowsky from a position of ignorance. I feel that I'm in a pretty good place to criticize his AI work, which I have, in the mock thread. But a short version here: he is so needlessly wordy that it's difficult to notice how incredibly basic his ideas are. Timeless Decision Theory is a great example: I formalized it in one paragraph in the other thread, which Yudkowsky failed to do in a hundred pages. And it wasn't even a new idea once I wrote it down!

As for his favorite model of an AI, the Bayesian inference system: I've built one, and I can't tell if he has. Bayesian inference doesn't parallelize well; the units of work are too small, so efficiency gains are nearly canceled by overhead. Mine couldn't play an RTS game because it was computationally bound, even after I taught it the rules. Yudkowsky's would need to do way better than mine to literally take over the world, and he has never had a plan for how that would happen. Mine was textbook; his would need to be orders of magnitude above it.
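
For the curious, the core loop of such a system is nothing exotic. A toy sketch of my own (not anyone's actual codebase): estimating a coin's bias on a grid. Note how small each unit of work is.

code:

import numpy as np

# Grid Bayesian inference for a coin's bias:
# posterior is proportional to likelihood * prior.
grid = np.linspace(0.01, 0.99, 99)             # candidate biases
posterior = np.full(len(grid), 1 / len(grid))  # uniform prior

flips = [1, 0, 1, 1, 0, 1]                     # hypothetical data, 1 = heads
for flip in flips:
    likelihood = grid if flip else 1 - grid
    posterior *= likelihood                    # Bayes' rule, unnormalized
    posterior /= posterior.sum()               # renormalize

print(grid[posterior.argmax()])                # MAP estimate of the bias
# Each update is a cheap elementwise multiply plus a normalization, and the
# updates depend on each other sequentially -- tiny work units and lots of
# synchronization, which is why naive parallelization buys you little.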

All that said, even I don't think friendly AI is a crazy problem for crazy people; I think it's an engineering problem for a domain that doesn't exist yet.

su3su2u1
Apr 23, 2014

SolTerrasa posted:

All that said, even I don't think friendly AI is a crazy problem for crazy people; I think it's an engineering problem for a domain that doesn't exist yet.

Although, the specific direction that MIRI is running in (creating mathematical ideas of friendliness) is a huge misunderstanding of how applied math works.

su3su2u1 fucked around with this message at 21:19 on Mar 24, 2015

petrol blue
Feb 9, 2013

sugar and spice
and
ethanol slammers
Yeah, the 'position of ignorance' line probably wasn't a good move, Legacyspy; goons like physics and AI almost as much as Mt. Dew.

:eng101:

Am I reading it right that the Friendly AI should have human values and responses? Because that would imply Yud believes that humans are good to each other and would never, say, try to wipe out a group they consider inferior.

Also, can someone link to the lesswrong thread, please?

Nessus
Dec 22, 2003

After a Speaker vote, you may be entitled to a valuable coupon or voucher!



petrol blue posted:

Yeah, the 'position of ignorance' line probably wasn't a good move, Legacyspy; goons like physics and AI almost as much as Mt. Dew.

:eng101:

Am I reading it right that the Friendly AI should have human values and responses? Because that would imply Yud believes that humans are good to each other and would never, say, try to wipe out a group they consider inferior.

Also, can someone link to the lesswrong thread, please?
Here you go, buddy! http://forums.somethingawful.com/showthread.php?threadid=3627012

The idea is that the AI is inevitably going to become God, so it is the most important thing ever to ensure that when the computer inevitably becomes Literally God, it's a nice friendly New Testament God who cares for us, rather than an Old Testament God who will send us all to Robot Hell.

i81icu812
Dec 5, 2006

Luna Was Here posted:

If you are in the 99.9999th percentile of loving anything, you do not go on in life to write Harry Potter fanfics and logical fallacies. gently caress, I know people who got 10s and 12s on the ACT and even they get the basic concept of prisoner's dilemmas

I'm not meaning to be dissing on the guy and it's likely that he did do pretty well on tests way back in the day, but doing well on basic English and math tests in middle school and then dropping out of school because you're so much better than the system does not give you the qualifications to make up terms and assert that your way of thinking is so much better than everyone else's

Tunicate posted:

Also, like ten people at my middle school got perfect 1600s on the SAT. Admittedly, that's across sixth-through-eighth, but I strongly suspect Yud fell for a scam.

Yeah, the old 1600-scale SATs are wonderfully teachable tests. Take a few dozen old tests, memorize a list of vocab words, and your scores will go up. SATs are really, really lovely at differentiating the top end of the scale, though: the test is too teachable, and way too many people can max it out for a top score to be very meaningful. And even if you are one in a million, there's 7,000 people smarter than you.

But it's the perfect scam for the colleges running the gifted and talented summer school programs--all data are real and truthful and everything they tell the students is completely accurate social science! They administer actual SATs for that year in a special middle-school-only session, and simply add an informational paper to the results reported by the College Board saying how your results scale against the study run years ago on other kids your age, confidence levels, error bars and everything. Add award ceremonies for high scorers and some cheap recognition certificates, and suburban parents can't throw money at your summer school programs fast enough.

Though perhaps I'm less than charitable, I'm sure some kids benefit tremendously from being around other smart kids and that it looks good on college applications.

Legacyspy posted:

position of ignorance

And since Legacyspy is doing such a wonderful job of getting people to respond--like SolTerrasa and su3su2u1, I'm not mocking from a position of ignorance. I've actually coded simulations and built robotic systems using Bayesian networks/Markov chains. They work well for very specific tasks (the classic example is training robots to walk) and are terribly inefficient for others. But an AI with some sort of agency? Yud is crazy; not in our lifetimes. The mock thread goes into far more detail if you want.
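
If you've never seen one, the Markov chain end of that toolbox is almost embarrassingly simple. A made-up two-state example; the states and probabilities are invented for illustration:

code:

import random

# Toy Markov chain: the next state depends only on the current one.
transitions = {
    "stable":    {"stable": 0.9, "stumbling": 0.1},
    "stumbling": {"stable": 0.6, "stumbling": 0.4},
}

state = "stable"
history = [state]
for _ in range(20):
    options = transitions[state]
    state = random.choices(list(options), weights=list(options.values()))[0]
    history.append(state)

print(history)
# Fine for a small, well-defined state space like a walking gait;
# "an AI with some sort of agency" is a different beast entirely.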

Besides, I claimed a good score on a middle school SAT--the exact same credentials Yud has attained in his academic career. Clearly I am the most qualified person to mock him.

Night10194
Feb 13, 2012

We'll start,
like many good things,
with a bear.

I mock the writing in the story because it's bad and we're on somethingawful.com, but Yud's cult is legitimately fascinating to me as a religious studies guy who is interested in getting into studying emerging religions, and so his proselytization-fiction is actually really interesting from an academic standpoint.

Also, as someone who came out of a high school for the gifted, I can say it's definitely productive in some ways (I had access to a lot of very good teachers and classes) but counterproductive in others (there was a sort of accidental promotion of the kind of learning style I see in Yud, and I was definitely not above it. I never really learned to buckle down and work on stuff I didn't 'love' until I got to college, where a goodly number of my fellow graduates flunked.)

Night10194 fucked around with this message at 01:31 on Mar 25, 2015

SolTerrasa
Sep 2, 2011

su3su2u1 posted:

Although, the specific direction that MIRI is running in (creating mathematical ideas of friendliness) is a huge misunderstanding of how applied math works.

Interesting! Would you mind posting (here or in the mock thread) how they misunderstood applied math? Whenever I see unnecessarily fiddly math in AI papers I assume they're just trying to impress the reviewers. I had a paper rejected once because it "didn't have enough equations", despite attaching working source code.

petrol blue posted:

Am I reading it right that the Friendly AI should have human values and responses? Because that would imply Yud believes that humans are good to each other and would never, say, try to wipe out a group they consider inferior.

Nope, that's not quite what it means. What does it mean? Well, Yud doesn't seem to know either; he's never really explained it. As far as I can discern, it means that the AI will never do anything that would violate the "coherent extrapolated volition" of humanity. So, basically: take everyone's opinions (no explanation given for collecting these), throw out the opinions that are bad (no explanation given for deciding which opinions are bad), then do whatever best satisfies the rest. The AI itself doesn't need to seem human or have human feelings, just to act in a way that optimizes around human feelings.
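
Transcribed literally, the recipe looks like this. Every name here is mine, and every body is a placeholder for a step no MIRI writing actually specifies:

code:

def collect_opinions(humanity):
    raise NotImplementedError("no explanation given for collecting these")

def discard_bad(opinions):
    raise NotImplementedError("no explanation given for deciding which are bad")

def best_satisfying_action(opinions):
    raise NotImplementedError("no explanation given for 'best satisfies' either")

def coherent_extrapolated_volition(humanity):
    # The whole of CEV, as far as anyone can discern it.
    return best_satisfying_action(discard_bad(collect_opinions(humanity)))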

Edit: here, try to derive what he means from this, which, if you can believe it, he tried to include in an AI paper.

quote:

Our coherent extrapolated volition is our wish if we knew more, thought faster, were more the people we wished we were, had grown up farther together; where the extrapolation converges rather than diverges, where our wishes cohere rather than interfere; extrapolated as we wish that extrapolated, interpreted as we wish that interpreted.

SolTerrasa fucked around with this message at 01:49 on Mar 25, 2015

petrol blue
Feb 9, 2013

sugar and spice
and
ethanol slammers
This AI is going to be a brony, isn't it?

JosephWongKS
Apr 4, 2009

by Nyc_Tattoo
Chapter 8: Positive Bias
Part Four


quote:


Then she remembered what she'd read, and she gasped and flinched back from him. All the Dark Lord's magical power! In his scar!

She rose hastily to her feet. "I, I, I need to go the toilet, wait here all right -" she had to find a grownup she had to tell them -

The boy's smile faded. "It was just a trick, Hermione. I'm sorry, I didn't mean to scare you."

Her hand halted on the door handle. "A trick? "

"Yes," said the boy. "You asked me to demonstrate my intelligence. So I did something apparently impossible, which is always a good way to show off. I can't really do anything just by snapping my fingers." The boy paused. "At least I don't think I can, I've never actually tested it experimentally." The boy raised his hand and snapped his fingers again. "Nope, no banana."


I was wrong about the source of Harry’s “wandless magic” and Moddington was right. That’s what comes of reading in little chunks.

I still don’t understand how this “demonstrates Harry’s intelligence”, though. He’s just making use of a quality of the Comed-Tea that Hermione wasn’t aware of – it’s a gap in knowledge rather than a sign of “intelligence” on his part per se.



quote:


Hermione was as confused as she'd ever been in her life.

The boy was now smiling again at the look on her face. "I did warn you that challenging my ingenuity tends to make your life surreal. Do remember this the next time I warn you about something."

"But, but," Hermione stammered. "What did you do, then?"

The boy's gaze took on a measuring, weighing quality that she'd never seen before from someone her own age. "You think you have what it takes to be a scientist in your own right, with or without my help? Then let's see how you investigate a confusing phenomenon."


Here it comes – Hermione being forced by author fiat to bow to Eliezarry’s superiority. :negative:

Also, when Hermione said "Maybe I'll let you help me with my research", it was clear that it was a verbal riposte to Harry's arrogance and obnoxiousness. She never said that she actually thought she was a magic-scientist or wanted to be one.



quote:


"I..." Hermione's mind went blank for a moment. She loved tests but she'd never had a test like this before. Frantically, she tried to cast back for anything she'd read about what scientists were supposed to do. Her mind skipped gears, ground against itself, and spat back the instructions for doing a science investigation project:
Step 1: Form a hypothesis.
Step 2: Do an experiment to test your hypothesis.
Step 3: Measure the results.
Step 4: Make a cardboard poster.


Step 1 was to form a hypothesis. That meant, try to think of something that could have happened just now. "All right. My hypothesis is that you cast a Charm on my robes to make anything spilled on it vanish."

"All right," said the boy, "is that your answer?"

The shock was wearing off, and Hermione's mind was starting to work properly. "Wait, that can't be right. I didn't see you touch your wand or say any spells so how could you have cast a Charm?"

The boy waited, his face neutral.

"But suppose all the robes come from the store with a Charm already on them to keep them clean, which would be a useful sort of Charm for them to have. You found that out by spilling something on yourself earlier."

Now the boy's eyebrows lifted. "Is that your answer?"


This is starting to look a little like “negging”, as SSNeoman highlighted. Has Eliezer expressed any Men’s Rights Activist views in his writings in the past?

Added Space
Jul 13, 2012

Free Markets
Free People

Curse you Hayard-Gunnes!
To try to explain, imagine you have a genie.

You want to make a wish, but the genie might actively screw with you Monkey Paw style, so you wish for the genie to be obedient.

You might not have thought your wish out enough, so something comes back to bite you; so you wish there were no bad consequences...

Except you're not sure how to do that, so you wish for the proper form of a wish, so you can get what you want without negative consequences...

You're not really sure how to do that either, so you wish that you were smart enough to know what you would wish for if you were smarter.

Now write an AI that is also a genie, and you're set.
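
In code, the regress looks something like this - a toy sketch of my own, nothing from the paper:

code:

# Toy sketch of the wish regress. Purely illustrative; the names are mine.

def patch(wish: str) -> str:
    # Each layer wraps the previous wish in another safety clause.
    return f"({wish}) with no unintended consequences"

def formulate(wish: str, layers: int) -> str:
    # There's no fixed point in sight; you just pick a depth and stop.
    for _ in range(layers):
        wish = patch(wish)
    return wish

print(formulate("make me happy", 3))
# (((make me happy) with no unintended consequences) with ...) with ...

The "wish to be smart enough to wish properly" move is just hoping the loop terminates somewhere; CEV is that hope with a name.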

e: No, JKWS, it's textbook Socratic method. Sometimes you really overreach in your criticisms. Please just stick to the actually stupid things?

Added Space fucked around with this message at 02:41 on Mar 25, 2015

Darth Walrus
Feb 13, 2012

JosephWongKS posted:

Chapter 8: Positive Bias
Part Four



I was wrong about the source of Harry’s “wandless magic” and Moddington was right. That’s what comes of reading in little chunks.

I still don’t understand how this “demonstrates Harry’s intelligence”, though. He’s just making use of a quality of the Comed-Tea that Hermione wasn’t aware of – it’s a gap in knowledge rather than a sign of “intelligence” on his part per se.



Here it comes – Hermione being forced by author fiat to bow to Eliezarry’s superiority. :negative:

Also, when Hermione said "Maybe I'll let you help me with my research", it was clearly a verbal riposte to Harry's arrogance and obnoxiousness. She never said that she actually thought she was a magic-scientist or wanted to be one.



This is starting to look a little like “negging”, as SSNeoman highlighted. Has Eliezer expressed any Men’s Rights Activist views in his writings in the past?

I went looking, and now my world is a silent, wordless scream.

JosephWongKS
Apr 4, 2009

by Nyc_Tattoo

Added Space posted:

e: No, JKWS, it's textbook Socratic method.

Alright, fair enough.


quote:

Sometimes you really overreach in your criticisms. Please just stick to the actually stupid things?

I'll try my best. It's a mock thread though, so you guys are equally free to mock me when I'm being stupid.

i81icu812
Dec 5, 2006

SolTerrasa posted:


Nope, that's not quite what it means. What does it mean? Well, Yud doesn't seem to know either; he's never really explained it. As far as I can discern, it means that the AI will never do anything that would violate the "coherent extrapolated volition" of humanity. So, basically: take everyone's opinions (no explanation given for how to collect these), throw out the opinions that are bad (no explanation given for how to decide which opinions are bad), then do whatever best satisfies the opinions that are left. The AI itself doesn't need to seem human or have human feelings, just to act in a way that optimizes around human feelings.

Edit: here, try to derive what he means from this, which, if you can believe it, he tried to include in an AI paper.

quote:

Our coherent extrapolated volition is our wish if we knew more, thought faster, were more the people we wished we were, had grown up farther together; where the extrapolation converges rather than diverges, where our wishes cohere rather than interfere; extrapolated as we wish that extrapolated, interpreted as we wish that interpreted.

This was for an AI-generated poetry anthology, right?

Night10194
Feb 13, 2012

We'll start,
like many good things,
with a bear.


What the hell do you expect from a guy who is 'Hey! Evolutionary Psych and :biotruths: are amazing!'

Darth Walrus
Feb 13, 2012

Night10194 posted:

What the hell do you expect from a guy who is 'Hey! Evolutionary Psych and :biotruths: are amazing!'

Expectation had nothing to do with it. Like most of this thread's posters, I stuck my hand in that bear trap voluntarily.

i81icu812
Dec 5, 2006

Oh good! Now you know how Yud feels all the time!

http://web.archive.org/web/20010205221413/http://sysopmind.com/eliezer.html#timeline_the posted:

There's a single emotional tone - an emotional tone is a modular component of the emotional symphonies we have English words for - common to sorrow, despair, and frustration. The tone is invoked by an effort failing to produce the expected reward ("frustration"), or by the anticipation of something going wrong ("despair"), or by watching something go wrong ("sorrow"). The message of this tone can be summarized as: "This isn't working. Stop what you're doing, try to figure out what you're doing wrong, and try something else." The cognitive methods activated by this tone (21) include what I would now call "causal analysis", "combinatorial design", and "reflectivity". The motivational effect of the tone includes, of course, low mental energy.

To get an idea of what this tone feels like, tilt your head back and try to scream, inaudibly, at the pitch bats use; then, add the burning sensation you get at the back of your throat when you're about to cry; the result is the tactile aspect of the tone.

That catch in the throat is always with me.
It is present when I get up in the morning, when I go to sleep at night, and at every moment in between. Such emotions often have specific neurological substrate, and it is known that neurological perturbations can alter or obliterate both entire emotions and specific facets of emotions. (22). The catch in the throat, and the low mental energy that goes with it, and the cognitive methods it invokes, are constantly present in my mind. These characteristics are "nailed down", present at all times and regardless of external conditions.

Thus a single cause neatly accounts for both my SAT scores and my Great Depression, with the available evidence suggesting it goes back to birth or earlier (23). I have no idea whether this perturbation, this "neurohack", is genetic or prenatal or the result of some disease in infancy, but in retrospect, it's clear that it goes back as far as I - or my parents - can remember.

Are you smarter than everyone you know, but unable to force yourself to get stuff done? If so, I have this great fanfic you should read!

i81icu812 fucked around with this message at 04:11 on Mar 25, 2015

petrol blue
Feb 9, 2013

sugar and spice
and
ethanol slammers

Darth Walrus posted:

Expectation had nothing to do with it. Like most of this thread's posters, I stuck my hand in that bear trap voluntarily.

It's worse than that: we saw your hand in the bear trap and jumped right in after you.

quote:

Our species does definitely have a problem. If you've managed to find your perfect mate, then I am glad for you, but try to have some sympathy on the rest of your poor species—they aren't just incompetent. Not all women and men are the same, no, not at all. But if you drew two histograms of the desired frequencies of intercourse for both sexes, you'd see that the graphs don't match up, and it would be the same way on many other dimensions. There can be lucky couples, and every person considered individually, probably has an individual soulmate out there somewhere... if you don't consider the competition. Our species as a whole has a statistical sex problem!

But splitting in two and generating optimized nonsentient romantic/sexual partner(s) for both halves, doesn't strike me as solving the problem so much as running away from it. There should be superior alternatives. I'm willing to bet that a few psychological nudges in both sexes—to behavior and/or desire—could solve 90% of the needlessly frustrating aspects of relationships for large sectors of the population, while still keeping the complexity and interest of loving someone who isn't tailored to your desires.

On the plus side, I think his utopia involves putting everyone in a volcano. I couldn't agree more.

Added Space
Jul 13, 2012

Free Markets
Free People

Curse you Hayard-Gunnes!
That article is basically the "Don't Date Robots!" bit from Futurama, only with catgirls.

Adbot
ADBOT LOVES YOU

su3su2u1
Apr 23, 2014

SolTerrasa posted:

Interesting! Would you mind posting (here or in the mock thread) how they misunderstood applied math? Whenever I see unnecessarily fiddly math in AI papers I assume they're just trying to impress the reviewers. I had a paper rejected once because it "didn't have enough equations", despite attaching working source code.

The definition of friendliness they create will just be an abstraction that shares some properties of what we might think of as "friendly." It's like the mathematical definitions of "secure" you might use in cryptography stuff - it captures some feature but it's not going to be perfect. You still have all the engineering challenges that go along with the real world. A provably secure algorithm might fail in practice.

They have a nebulous mathematical definition of superhuman intelligence (i.e. how near to an ideal Bayesian agent/AIXI does something behave?) that doesn't capture lots of properties of intelligence (e.g. idea generation - an AIXI agent starts with all possible ideas), and they'll move to a nebulous idea of friendliness that also doesn't capture some important features, etc.
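
To be concrete about what a measure like that even looks like - this is my toy illustration, not their formalism - you'd score an agent by how often it matches the bet an ideal Bayesian reasoner would make on the same data:

code:

# Toy version of "how near to an ideal Bayesian agent does this behave?"
# My illustration, not MIRI's actual math.

import random

# Two hypotheses about a coin, uniform prior over them.
HYPOTHESES = {"heads-loaded": 0.9, "tails-loaded": 0.1}

def ideal_bet(flips):
    # Bayes rule over the hypotheses, then bet on the likelier next flip.
    post = {}
    for name, p in HYPOTHESES.items():
        like = 1.0
        for f in flips:
            like *= p if f == 1 else (1 - p)
        post[name] = like * 0.5  # uniform prior
    z = sum(post.values())
    p_heads = sum((post[h] / z) * HYPOTHESES[h] for h in post)
    return "heads" if p_heads > 0.5 else "tails"

def agreement(agent, trials=1000):
    # Fraction of random histories where the agent matches the ideal bet.
    rng = random.Random(0)
    hits = 0
    for _ in range(trials):
        flips = [rng.randint(0, 1) for _ in range(5)]
        hits += (agent(flips) == ideal_bet(flips))
    return hits / trials

# An agent that always bets heads agrees with the ideal agent only
# about half the time on fair-coin data:
print(agreement(lambda flips: "heads"))

And matching bets on coin flips says nothing about whether the thing can generate a hypothesis it wasn't handed - which is the problem.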

What they really want to create is a sort of "best practices guide to AI development such that it doesn't kill everyone" - that isn't a math problem.
