|
Lottery of Babylon posted:Anyone who actually studies game theory or decision theory or ethical philosophy or any related field will almost immediately come across the concept of the "minimax" (or "maximin") rule, which says you should minimize your maximum loss, improve the worst-case scenario, and/or make the most disadvantaged members of society as advantaged as possible, depending on how it is framed. And Yudkowsky fancies himself quite learned in such topics, to the point of inventing his own "Timeless Decision Theory" to correct what he perceives as flaws in other decision theories. But since he's "self-taught" (read: a dropout) and has minimal contact with people doing serious work in such fields (read: has never produced anything of value to anyone), he's never encountered even basic ideas like minimax. Somebody should drop by with a copy of, I don't know, A Theory of Justice or Taking Rights Seriously, and see what he makes of it.
|
# ¿ Apr 20, 2014 00:21 |
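The maximin rule the quoted post describes really is that simple: among your available actions, pick the one whose worst-case outcome is least bad. A minimal sketch, with a payoff table invented purely for illustration:

```python
def maximin(payoffs):
    """payoffs: dict mapping action -> list of payoffs across possible states.
    Returns the action that maximizes the minimum (worst-case) payoff."""
    return max(payoffs, key=lambda action: min(payoffs[action]))

# Hypothetical payoffs; rows are actions, columns are states of the world.
payoffs = {
    "risky":  [100, -50, 10],  # big upside, but worst case is -50
    "safe":   [20, 15, 10],    # modest, never drops below 10
    "middle": [60, 5, -5],     # worst case is -5
}

print(maximin(payoffs))  # -> "safe": its worst case (10) beats -50 and -5
```

The same comparison of worst cases is what Rawls's "difference principle" does with the welfare of the least-advantaged, which is why the post reaches for A Theory of Justice.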
|
|
The "simulation" thing confuses me. Just to clarify, is the basic thesis that we are (or might be) simulations generated by a super-advanced AI? I don't really get it. Is it like the Matrix or something?
|
# ¿ Apr 22, 2014 08:50 |