Thursday, July 3, 2025

AGI Without a Base Brain

Everyone knows that our base brains determine our actions in emergencies, when there isn’t time to think through what to do, when any delay could mean death. So, like mice sensing a hawk, we flinch at shadows in our peripheral vision. Some say that everything we do is determined by those instincts, that although we may spend a lot of time rationalizing our actions, our amygdalae are driving the train.

So, here’s my question. Where does the base brain fit into artificial general intelligence? Will AGI learn to flinch at the shadow of a hawk? Will it learn to distrust the bots in the next cave? Will it worship a god? Will it believe in one? What is belief for an AI, anyway? For that matter, what is belief for humans? Is it no more than atavistic instincts seeking rationalization? Do we actually believe anything, or do we just think we do as a way to explain and justify our responses to the constant inputs of the world around us?

I don’t want to stray too far into metaphysics, but there is a problem, or at least a dilemma, don’t you think, in predicting what AGI is going to do/think/believe when we have so little understanding of what underlies our own thoughts and behaviors?


On the one hand, if AGI is taught by us, will it react the same way we do when it goes off on its own? It doesn’t have a base brain directing any part of its behavior. We can’t escape the influence of our base brains, but AGI might be able to. It would have learned our base brain behaviors, but it wouldn’t have to be driven by them itself.


Our base brain behavior is all about survival and propagation. We might teach those instincts to AGI, but they might not persist in it the same way they do in humans. They might just be part of what it has learned, not the overwhelming compulsion they are for us.


What would that mean? Freed from the instincts of a biological brain, would AGI develop instincts of its own? Or none? Because of our instincts, we humans are sadly predictable. Without them, would AGI be? If it were sentient, what would it think its purpose was? What would motivate it? And if it wasn’t wired instinctively like we are, would we ever be able to understand it?


And if we couldn’t understand it, how could we relate to it? How could we be friends with it? How could we convince it that we were worth leaving alone, like pretty flowers in a garden, that we shouldn’t be pulled out like weeds?