But an AGI built on a linguistic mapping of reality, without any sensory grounding, is going to be fine; I don't see any issues. I mean, it can probably infer the true nature of reality from linguistic and mathematical structures alone. Like, look how good GPT-4 is at programming, and it's not too bad at genetics, and, umm, chemistry... and there's Wolfram Alpha, and, uhh, plugins, and... other stuff... right? THE MAP IS NOT THE TERRITORY; ...right?