I've been thinking about this question and see some glaring issues with the entire concept. The idea is to model AI after organic neural networks like brains, but brains are TERRIBLE!

a) Brains are susceptible to all kinds of errors: hallucinations, mistakes, biases, degeneration, fallacies, tricks, etc.

b) It's taken billions of years of evolution just to get ONE species to human intelligence. As far as we know, nothing has made the leap beyond it, and millions and millions of other permutations have barely even managed to approach it.

c) You can't "train" more intellect into an organism/person. You can teach them tons of information or train them at tasks, etc., but their level of intelligence is basically hard-wired in genetically and/or from a young age. Chimps and humans share roughly 99% of their DNA and have brains with nearly the same form and layout, but no matter how much training data you feed a chimp, it'll never make the leap to human-level intelligence.

d) Companies keep adding orders of magnitude more connections and feeding in orders of magnitude more training data and expecting better results, but absolute brain size/number of connections has no inherent correlation with intelligence in animals (an elephant's brain has roughly three times as many neurons as a human's), so why would it in neural nets? And again, you could train a person to do a zillion different things without them actually understanding what they're doing; that would make them more useful, but it wouldn't make them more intelligent.

e) There's no such thing as an organic general intelligence that is an expert in all fields the way people expect AGI to be, and nobody even expects that to be possible in a person. There seems to be an inherent limit to the number of things one neural net can do well at the same time, and like intelligence, that limit doesn't necessarily seem to be correlated with network size, i.e. you can't just keep making it bigger and expect it to get that much better or do that much more.
The compute resources and energy requirements balloon very quickly as you scale up. Compromises always have to be made in evolution, so why wouldn't the same be true of artificial neural nets?
So what do you think: are GPTs and neural nets just a dead end for creating AGI, or am I way off?