My fellow Today presenters have a new colleague. It’s small and child-like, with a vacant expression and a limited collection of rather direct one-liners, some of which are disconcertingly familiar.
But could it soon put me, or the venerable John Humphrys, out of a job?
It’s a tantalising – and slightly terrifying – question.
For this space age creature is in fact an extraordinary robot which has been christened, temporarily for our purposes, the Mishalbot.
It is currently being developed by scientists at Sheffield University who are seeking to improve the capacity of Artificial Intelligence, or AI, in relation to language and understanding.
But for the Saturday edition of the Today programme, it had been fed a series of around 30 transcribed interviews I had carried out, which had been analysed for key phrases. The point was to see whether a robot could do my job as well as, or better than, I could.
I was curious to know two things in particular, relating to the most difficult aspects of interviewing – would she be able to recall the right nugget of information or the right fact and deploy it at the right moment? Would she be able to get the tone of the conversation right? This was the sort of capability that would ring alarm bells for me and, for that matter, for everyone else working in broadcasting.
As the Mishalbot started talking, I did recognise – to my chagrin – words and phrases that I obviously over-use: she said ‘right’ rather often, and urged, a little too pressingly, ‘I don’t know what you mean. Can you give an example?’ At one point, she even interrupted Professor Tony Prescott, who leads the Sheffield team with: ‘I’m going to stop you there.’
It was an unnerving, albeit fascinating, experience.
But the reality is it has given me a newfound appreciation for the nuances and complexity of the art of human conversation. Language is precious, the essence of being human, and reassuringly hard for a machine to mimic.
Rather happily, from my point of view, these limitations were obvious. The bot did not display any particular insight, and on occasion she did not seem to understand the answers to her questions at all.
And this is the frontier of machine learning, where most of the work in the fast-developing field of AI is taking place. We are told that, in 30 years’ time, many of our jobs could be taken over by robots.
But, for the moment, it feels as if the jobs that involve the most human interaction and emotion will be the safest, because those are so much more complex than the mechanical and the functional.
This was a year in which AI defeated the world’s top player of Go, the fiendishly complex board game that originated in ancient China.
So the Today programme was keen to explore its potential in a way that would be meaningful to listeners.
The original idea was for AI to be one of our guest editors for the programmes that run in the usually slow news period between Christmas and the New Year.
We rely on these guest editors to generate new and interesting material for us and they haven’t always been real people – last New Year’s Eve, we chose Hull as the purported guest editor as its year as UK City of Culture began.
But while an AI guest editor would be able to use existing running orders and probably make suggestions about what to do in each slot, based on past history, actually developing the content of the conversation within those slots was a different matter.
But could it replace one of the presenters instead?
From here, it all gets rather personal. The programme has five presenters but the one they wanted to explore replacing was me.
My editor, Sarah Sands, tried to put it more delicately, talking of creating ‘an artificially enhanced version of Mishal, to see if it is indeed possible to improve her.’ Hmm. I could see where this was heading – probably best to see it for myself.
The Sheffield team is one of the few in the UK working with a humanoid robot called the iCub. There are about 30 of these in existence in the world, all being developed in different ways to see what the possibilities of the technology might be.
In person, as it were, the ‘humanoid’ form of the iCub technology looks a little like the top half of a child aged about eight, with particular emphasis on the hands, arms and upper body, which are lined with sensors. A small camera in each eye and movement sensors allow it to gauge what is happening in its immediate environment.
When I walked into the room, it was sitting at a small table, and I sat down opposite it. It didn’t sound like me – although it was polite.
All AI capability depends on the data put into a system; in this case, my interviews. Prof Prescott and his team, Daniel Camilleri and Dr Peter Wallis, explained that the Mishalbot’s capacity for language involved recognising key words rather than understanding meaning, because speech and conversation are incredibly complex to break down.
While I see speech as natural, and games like chess or Go as more taxing, for a computer it is the opposite.
Dr Wallis showed me the ‘annotation model’ he had used to analyse my interviews, looking for common threads between them and coming up with a set of frequently used individual parts, which he called ‘dialog moves’.
Some of these constituent parts were easier to identify than others, for example, the fact that the introduction always starts with a time reference. Then, the conversation continues with a selection of the following: an invitation to the interviewee to state their case, an expansion, perhaps a request for clarification or an on-the-spot question.
Each of these was given a code and explained, with the closing annotated as ‘CLS’, alongside the explanation that this ‘can be quite abrupt’.
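For readers curious how such a system might work under the bonnet, here is a minimal sketch of the kind of keyword-driven ‘dialog move’ selection described above. The codes, trigger words and canned phrases are invented for illustration; the Sheffield team’s actual annotation model is not published here.

```python
# Hypothetical sketch: pick a canned 'dialog move' by spotting key words
# in the interviewee's answer, not by understanding its meaning.
DIALOG_MOVES = {
    "CLS": "I'm going to stop you there.",  # closing - 'can be quite abrupt'
    "CLR": "I don't know what you mean. Can you give an example?",  # clarification
    "INV": "What's your case for that?",  # invite the interviewee to state their case
}

def choose_move(answer: str) -> str:
    """Return a stock interviewer phrase triggered by surface keywords."""
    words = answer.lower()
    if "finally" in words or "to sum up" in words:
        return DIALOG_MOVES["CLS"]
    if "complicated" in words or "it depends" in words:
        return DIALOG_MOVES["CLR"]
    return DIALOG_MOVES["INV"]
```

The point the sketch makes is the one the team made to me: nothing here grasps what was actually said, which is why the bot’s replies can feel so strangely disconnected.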
Even with this dissection of the different ways in which we converse, the Mishalbot’s capacity was limited, which, from the selfish perspective of my own career prospects, is a good thing.
The more I heard the stilted, robotic voice, the more I heard its strange disconnected answers, the less it was possible to imagine it as having any human quality.
But as I sat opposite her and had my own chance to talk to her, I began to understand the potential of this combination of AI in humanoid form.
Her look and her voice started to become almost endearing, and I realised that for those whose lives offer little human contact, whether through living alone or being isolated, she would provide a welcome degree of companionship.
And so, while I am reassured that she won’t kill my job just yet, she definitely has a value. Part of that is scientific – the Mishalbot lives on because the work done will continue to play into the research at Sheffield Robotics.
Who knows – in 20 years’ time, with a wealth of material to work from, she might just be good enough to sit in for me from time to time.
But her value also lies in the simple yet important reminder of what a complex and unique jigsaw a human being really is.