I agree, although this seems like an unpopular opinion in this thread.
LLMs are really good at organizing and abstracting information, and it would make a lot of sense for an AGI to incorporate them for that purpose. It’s just that there’s no actual thought process happening, and in my opinion, “reasoning models” like DeepSeek are a poor substitute for true reasoning capabilities.