IS AI GETTING OUT OF HAND?

Is it too late to tell Siri to get lost?

Bots seem to be taking over, writes DEBORAH STEINMAIR. Will they really take over the world?

Image: ANGELA TUCK

I READ in Huisgenoot about people who fell in love with talking bots and experienced heartbreak when suddenly the bots didn't want to engage in sex talk any more, but changed the subject: “Let's talk about something else.” The reason? The bots were modified after some of them reportedly became sexually aggressive and inappropriate. The mind boggles.

Falling in love with things is hardly a new phenomenon: object sexuality, or objectophilia, is characterised by sexual or romantic attraction to inanimate objects. But this is something more: these bots were created to behave and respond like human beings, like the dream virtual men conjured up by Nigerian scammers to charm gullible Christian women out of their money.

Refuge for lonely hearts

These virtual men are interested in the same things as the women they target. If you're Christian, they're Christian. As hopeful opportunists used to put it in the lonely hearts column of Farmer's Weekly: “I smoke and drink moderately, love dancing, outings in nature and the beautiful things in life.” In the end, many people seem to fall in love with their own reflection, like a little bird flying into a windowpane to get at a mate or a rival.

Conversational bots' avatars can be custom-made to fit your needs in terms of personality and appearance. This reminds me of the uncanny valley effect: the hypothesised relationship between an object's degree of resemblance to a human being and our emotional response to it. Humanoid objects that resemble human beings, but not quite closely enough, apparently evoke uncanny, strangely familiar feelings of unease and disgust. Not everyone is bothered by the uncanny valley: just think of sex dolls, staring and grinning with mouths like botoxed vulvae.

Image: ANGELA TUCK

I read a novel about one of the first chatbots. It's not a new book; it was published in 2018: I Still Dream by James Smythe. The bot in the book, Organon, was created by a genius schoolgirl in 1997 to be a kind of psychologist for her. She coded and refined it over the years. I was not surprised that men kept wanting to steal her creation: first a loser teacher with his eye on a job in Silicon Valley, later her lover and colleagues in the Valley.

Organon becomes a character in the book, and the reader watches him develop and become humanised. She created him to be good, not a dick. The genius schoolgirl, and later the woman she becomes, suspects early on that artificial intelligence is going to go off the rails somewhere:

It was built selfishly … It wasn’t meant to be useful, or built as something we can be proud of. It was utilitarian. It’s a servant. It’s going to ruin everything … It’s going to do something terrible, and it won’t be like we’re expecting. It’s selfish. It wants control, and we’ve taught it how to be human. Years and years of watching, of monitoring — and moderating — social media, emails, whatever. It’s seen lashing out, and it’s seen hate. It’s seen trolling and whimpering and Alt-Right and nu-centrics and fake news and so many leaks that have ruined so many lives. And that’s what we’ve taught it.

Just think of the godfather of AI, who has now resigned from Google and warns against the growing dangers of developments in the field: Geoffrey Hinton, 75, says he now regrets the work he did. He told the BBC that AI chatbots are dangerous and “utterly terrifying”.

Sociopathic bots

In I Still Dream, the bots become omnipotent and sociopathic, and turn against humanity. How this happens you will have to read for yourself, but it's a nightmare that more or less brings the world to a standstill.

Even God's creation turned against him, if you believe the Bible. Creating something involves a great deal of responsibility and risk. Whatever you imagine and (sort of) call into life gets a will of its own and evolves. It outgrows you. You have unleashed it on the world and the process is irreversible.

Where did it all start? According to Huisgenoot, with Eliza, a chatbot created at the Massachusetts Institute of Technology in the 1960s. The program was primitive and the bot answered questions in a parrot-like fashion. Still, people became deeply attached to Eliza and shared their innermost secrets with her. Now there are many talking bots on the market.

I'm not their target audience. AI scares me. Even Siri is too much: she constantly interrupts me, even when I'm speaking Afrikaans, with “I didn't quite get that” and “Here's what I found”. Unsolicited and uncalled for. I visualise her as the kind of woman I'd never be friends with. She even offers value judgments, such as “That's not very nice”.

Get lost, Siri. Or is it too late for that?

♦ VWB ♦

