I was listening to a fascinating conversation between John Vervaeke and DC Schindler:
They made a wonderful distinction between:
1. Raw intelligence, maybe best considered as number-crunching ability
2. Rationality, the ability to be logically consistent
3. Wisdom, the ability to make good decisions
Rationality comes not from faster computing, but from self-correction. There's the old joke that a computer is a machine that makes a million mistakes a second - what it lacks is the ability to check itself. Many algorithms are starting to do this. It requires a lot of computational power, but it can be done - at least in part. It requires the algorithm to contemplate: to do work that is not directly tied to an output. This raises flags, but they become obvious once we start considering wisdom.
Wisdom is a trickier nut to crack.
They argue that wisdom requires care, and care requires embodiment. That caught me off guard by just how right it is. It's actually at the core of the Christian ethos, which is fundamentally incarnational.
If something is not embodied - if it does not receive feedback from the world, if it does not understand that it is a part of the world it affects (even if only by word or computation) - then why should it act in such a way as to make the world better? Indeed, how could it? Secondhand accounts are always lacking. We understand partly when we are given information, but fully when we experience directly.
This raises all sorts of lovely questions about the mind-body connection that I don't want to get into, because it's out of my depth and beside the point. Let's jump over that.
If we want wise AI, it will necessitate that it be embodied. The AI would need limbs, sensors, etc. - maybe humanoid, maybe blob-like, who knows. The key thing is that it have these faculties, so that it understands its dependence on the world and place in it.
But... why?
Why are we making these machines at all? Vervaeke and Schindler propose a potential future (even if unlikely) where such beings become enlightened and then help us become enlightened. Maybe. I think there are some people driving AI in this camp. But honestly?
We’re not making AI to become enlightened. We want dumb robot slaves.
At first we wanted our fields tilled and things built without lifting a finger. With AI, we want our taxes done and books written with the same lack of effort - and while you're at it, write me a sonnet! We want the result, we don't want the work.
So, we enslave someone and make them do it. Well, we have a sense that isn't right. Alright, then let the guy be free and pay him. But that's expensive. Well, replace him with a machine. Now the guy is out of the picture entirely. But is removing our neighbor from the picture - ushering in self-reliance, an island existence - very Christian?
Adding rationality and wisdom to artificial intelligence fundamentally subverts the objective of these machines. The point of these machines is to wield the power of rationality and wisdom, without the cost of doing it ourselves, or being faced with the reality of the other. The desire to make machines rather than relationships is a love of power over neighbor.
But it's actually even worse than that. We do not want to face the reality of the other precisely because the other is a judge. We ought not enslave our neighbor not because it is unseemly, but because our neighbor is Christ - the judge.
If these AIs truly rise to the occasion of rationality and wisdom, they will judge us.
I don't mean that so much in the sense of how God will judge us. I mean it in the sense that the presence of any rational, wise person evokes a sense of judgement within us - in the way that Peter falls before Christ and asks him to leave, for he is a sinful man - in the way that another person in the room reminds us that we are not God.
I wouldn't argue that we should usher this along, in the hopes that the AI will succeed, and cause enlightening bliss, forcing us to contend with our not-Godness.
What I am saying is that succeeding in making a good AI is fundamentally at odds with the goal of AI: to have the power of being wise while remaining content in a state of vice. That's strange.
Oh, and it uses a lot of electricity. Like, a lot of electricity. For all people say about religion being a waste of time, I think it's more energy-efficient than AI.
I definitely need to watch this video given your interest, but I have a hard time believing number-crunching is some meaningful (even if inferior) part of rationality. It always strikes me as a narrow or derivative part, given that it's so mechanical. The art of mathematics is not about number-crunching; it's about pattern-finding.