Argh, if it's not breaking down the meaning behind your words and figuring out how to control the underlying systems, it's not natural language processing, it's a complicated button! If you still have to understand the underlying systems, it's just an interface like any other. That said, bolted onto a specific system cleverly enough, it could do natural language processing for specific use cases. Y'know, the way it can't now, what with being very confidently, entirely wrong far too often to rely on. Unless it can unfuck its own mistakes without human prompting, double-check when it's unsure, and communicate its lack of confidence, it doesn't understand a goddamn thing. It's not thinking. Though to get there I think it might need frustration, anxiety, and self-doubt. And if we're programming a thinking, feeling computer and giving it all those negative emotions, that smells like the Torment Nexus to me.
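To make that concrete, here's a minimal sketch of the loop being demanded: retry on your own when unsure, and flag low confidence instead of bluffing. Everything in it (`generate_answer`, `verify`, the threshold) is a hypothetical stand-in, not any real API.

```python
import random

CONFIDENCE_THRESHOLD = 0.8  # hypothetical cutoff; real models rarely expose calibrated confidence

def generate_answer(prompt: str) -> tuple[str, float]:
    """Stand-in for a model call; returns (answer, self-reported confidence)."""
    return f"answer to: {prompt}", random.uniform(0.0, 1.0)

def verify(prompt: str, answer: str) -> bool:
    """Stand-in for an independent check (retrieval, a second model, a test suite)."""
    return len(answer) > 0  # trivially true here; real verification is the hard part

def respond(prompt: str, max_retries: int = 2) -> str:
    answer, confidence = generate_answer(prompt)
    for _ in range(max_retries):
        if confidence >= CONFIDENCE_THRESHOLD and verify(prompt, answer):
            return answer
        # Unsure: double-check and revise without a human prompting it to.
        answer, confidence = generate_answer(prompt)
    if confidence >= CONFIDENCE_THRESHOLD and verify(prompt, answer):
        return answer
    # Still unsure after retries: communicate the lack of confidence.
    return f"Not confident in this (p={confidence:.2f}): {answer}"
```

The whole argument is that current systems do none of this unprompted, and the `verify` step is the part nobody knows how to write for open-ended language.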
# Sep 27, 2023 22:25
NLP requires a model of the systems it's interfacing with. AGI requires a model of itself that it's capable of evaluating and changing.
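A toy illustration of the first half of that, with every name invented for the example: the keyword matching is the "complicated button" part, and the only thing that earns the phrase "model of the system" is the `Thermostat` class, which carries state and limits the interface can actually reason against.

```python
from dataclasses import dataclass

@dataclass
class Thermostat:
    """Toy system model: state plus limits, so the interface can sanity-check requests."""
    setpoint: float = 20.0
    min_temp: float = 5.0
    max_temp: float = 30.0

    def set_target(self, value: float) -> str:
        if not self.min_temp <= value <= self.max_temp:
            return f"Refusing: {value} C is outside the safe range {self.min_temp}-{self.max_temp} C."
        self.setpoint = value
        return f"Setpoint is now {value} C."

def handle_utterance(text: str, system: Thermostat) -> str:
    # The "complicated button" half: crude keyword matching picks an action...
    words = text.lower().split()
    if "warmer" in words:
        return system.set_target(system.setpoint + 1.0)
    if "cooler" in words:
        return system.set_target(system.setpoint - 1.0)
    for w in words:
        try:
            # ...while the system model decides whether the action makes sense.
            return system.set_target(float(w))
        except ValueError:
            continue
    return "No idea how that maps onto the thermostat."

print(handle_utterance("make it warmer", Thermostat()))  # Setpoint is now 21.0 C.
print(handle_utterance("set it to 900", Thermostat()))   # Refusing: 900.0 C is outside...
```

Strip out the `Thermostat` model and the parsing half is exactly the button the post above complains about.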
Presumably there's a fuckload of other problems to solve too, but that self-model evaluation is a requirement we don't know how to implement. We do have tests for it, but the way machine learning works, as soon as the training data includes people discussing the test, the test is useless.
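A crude sketch of why that's hard to even detect: exact n-gram overlap is the usual first pass at catching test leakage, but people *discussing* a test paraphrase it, which sails right past this kind of check. The function names and the choice of n here are arbitrary.

```python
def ngrams(text: str, n: int = 8) -> set[tuple[str, ...]]:
    """All word n-grams in a text, lowercased."""
    toks = text.lower().split()
    return {tuple(toks[i:i + n]) for i in range(len(toks) - n + 1)}

def contamination_rate(test_items: list[str], training_corpus: str, n: int = 8) -> float:
    """Fraction of test items sharing at least one n-gram with the training text.

    Only catches verbatim leakage: a forum thread paraphrasing the test
    contaminates the model just as well and scores zero here.
    """
    corpus_grams = ngrams(training_corpus, n)
    hits = sum(1 for item in test_items if ngrams(item, n) & corpus_grams)
    return hits / len(test_items) if test_items else 0.0
```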