Beyond AI: artificial compassion

When we talk about artificial intelligence, we often make an unexamined assumption: that intelligence, understood as rational thought, is the same thing as mind. We use metaphors like "the brain's operating system" or "thinking machines" without always noticing their implicit bias.

But if what we are trying to build is artificial minds, we need only look at a map of the brain to see that, in the domain we're tackling, intelligence might be the smaller, easier part.

Maybe that's why we started with it.

After all, the rational part of our brain is a relatively recent add-on. Setting aside unconscious processes, most of our gray matter is devoted not to thinking, but to feeling.

There was a time when we deprecated this larger part of the mind as something to ignore or, if it got unruly, to control.

But now we understand that, as troublesome as they may sometimes be, emotions are essential to being fully conscious. For one thing, as neurologist Antonio Damasio has demonstrated, we need them in order to make decisions. A certain kind of brain damage leaves the intellect unharmed, but removes the emotions. People with this affliction tend to analyze options endlessly, never settling on a final choice.

But that's far from all: feelings condition our ability to do just about anything. Just as an engine needs fuel and a computer needs electricity, humans need love, respect, and a sense of purpose.

Consider that feeling unloved can cause crippling depression. Feeling disrespected is a leading trigger of anger, even violence. And one of the toughest forms of punishment is being made to feel lonely through solitary confinement; too much of it can cause people to go insane.

All this by way of saying that while we're working on AI, we need to remember to include AC: artificial compassion.

To some extent, we already do. Consider the recommendation engine, as deployed by Amazon.com, Pandora, and others ("If you like this, you might like that"). It's easy to see it as an intelligence feature, simplifying our searches. But it's also a compassion feature: if you feel a recommendation engine "gets" you, you're likely to bond with it, which may be irrational, but it's no less valuable for that.
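For readers curious about the mechanics, recommendation engines of this kind are often built on something like item-to-item collaborative filtering: items are judged similar when the same people rate them similarly. The sketch below is a minimal, hypothetical illustration of that idea; the toy ratings and function names are invented for this example and do not describe Amazon's or Pandora's actual systems.

```python
# A toy "if you like this, you might like that" recommender.
# Items are compared by how the same users rated them (cosine similarity).
from math import sqrt

# rows: users, columns: items they rated on a 1-5 scale (illustrative data)
ratings = {
    "alice": {"book_a": 5, "book_b": 4, "book_c": 1},
    "bob":   {"book_a": 4, "book_b": 5},
    "carol": {"book_a": 1, "book_b": 2, "book_c": 5},
}

def cosine_similarity(item_x, item_y):
    """Score two items by how similarly the same users rated them."""
    common = [u for u in ratings if item_x in ratings[u] and item_y in ratings[u]]
    if not common:
        return 0.0
    dot = sum(ratings[u][item_x] * ratings[u][item_y] for u in common)
    norm_x = sqrt(sum(ratings[u][item_x] ** 2 for u in common))
    norm_y = sqrt(sum(ratings[u][item_y] ** 2 for u in common))
    return dot / (norm_x * norm_y)

def recommend(liked_item, top_n=2):
    """Rank the other items by similarity to the one the user liked."""
    items = {i for user in ratings.values() for i in user} - {liked_item}
    ranked = sorted(items, key=lambda i: cosine_similarity(liked_item, i), reverse=True)
    return ranked[:top_n]

print(recommend("book_a"))  # ['book_b', 'book_c']
```

Real systems add many refinements (normalizing for each user's rating scale, handling sparse data, blending in content features), but the bonding effect described above comes from this same basic move: inferring what you might like from what people like you already liked.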
