Marilyn Monroe and a young Kate Moss could make a return to the catwalk as digital holograms

Anna Wintour, the formidable editor-in-chief of US Vogue, takes her seat in the front row beside the catwalk, between the Beckhams and the Kardashians, and the lights begin to dim.

A murmur of anticipation ripples around the small, select crowd, made up of the biggest names in the fashion world who have assembled to witness a historic moment. 

Loud, thumping music starts to play. Photographers prime their lenses.

Then, in a sea of flashes, on to the runway steps a confident 19-year-old Kate Moss – walking in the brand new Autumn/Winter 2023 collection from Calvin Klein.

It takes a second to sink in that the figure on the catwalk is a fresh-faced teenager, and that Moss, the world’s most prolific supermodel, last month turned 49.

Future of fashion? A mock-up of how the catwalk could look in the future, with Kate Moss, Audrey Hepburn, Marilyn Monroe and Grace Kelly all appearing together – virtually

Kate Moss is today 49 but, thanks to AI technology, she could appear on the catwalk again as a 19-year-old, wearing designs from 2023

Abba, the legendary Swedish group whose four members – now each in their 70s – appear nightly on stage as hyper-realistic digital characters in the groundbreaking Voyage concert. Above: Agnetha’s Abbatar

Yet the Croydon supermodel looks as dewy and waif-like as she did in her very first shoot for the legendary American designer in the early 1990s.

It’s a tantalising prospect – and one that suggests the future of the fashion show could be virtual. 

If Moss ever makes her way down the runway again as a 19-year-old ingenue, it will be as a digital construction called an avatar.

The ability to revive yesteryear’s model and dress her in tomorrow’s couture marks the next leap forward in the unrelenting advance of artificial intelligence (AI) technology. 

It has already been proven to work in practice by Abba, the legendary Swedish group whose four members – now each in their 70s – appear nightly on stage as hyper-realistic digital characters in the groundbreaking Voyage concert. 

‘De-aged’ to look as they did in their 1970s heyday, the band have shown the way in blurring the line between the digital world and reality.

The ‘Abbatars’ were created by having the group pre-record performances in front of 160 cameras over five weeks while wearing motion-capture suits.

State-of-the-art computer technology then tracked key reference points on the suits to manipulate the band’s appearance; the result is beamed on to an enormous (but barely visible) screen at a specially built concert venue – the 3,000-capacity Abba Arena in Stratford, East London.

Some 800,000 people have already seen the mind-bending show since it opened last May. 

Playing to a packed audience every night, it is expected to run for at least another three years.

Little wonder, then, that serious money is now being poured into similar projects by the multi-billion-pound fashion industry.

ITV’s latest sitcom, Deep Fake Neighbour Wars, uses AI to create likenesses of celebrities, such as England football captain Harry Kane and rapper Stormzy, for use in comedy sketches

Last year, one leading luxury brand hosted an experimental fashion show using avatars which could be watched through virtual reality headsets.

Nicole Reader, founder of Modern Mirror, a New York fashion technology agency, said: ‘I don’t think we’re far off from the digital and physical worlds colliding over fashion. 

‘It is already possible to put a 3D model on a runway and make her walk. Using special holographic tech, they could even appear and walk alongside the real model.

‘AI tools could help bring this all to life within a year.’

So a 19-year-old Kate Moss could be followed down the catwalk by some of the most famous faces the world has ever known. 

Grace Kelly, swinging the latest Hermès It-bag (the fashion house named its most sought-after handbag in her honour).

Marilyn Monroe, struggling to hold down her flyaway dress – as the Hollywood legend famously did while tottering across a subway air vent during the filming of The Seven Year Itch – only this time in the latest pair of Jimmy Choos.

Audrey Hepburn dressed in Givenchy’s latest little black dress, and draped in Tiffany jewels.

Even Diana, Princess of Wales, wearing key pieces from the next collection by her erstwhile friend Donatella Versace.

It all sounds far-fetched – even ghoulish – until you watch ITV’s latest sitcom, Deep Fake Neighbour Wars, which uses AI to create likenesses of celebrities that can then be parlayed into comedy sketches. 

Last week’s episode showed a deep-fake of Wonder Woman actress Gal Gadot acting as peacemaker in a fight between England football captain Harry Kane and rapper Stormzy.

According to AI experts, it won’t be long before these avatars, and those used in the Abba show, are transplanted on to the catwalk, making millions for modelling agencies. 

For Abba Voyage, a ‘wardrobe’ of digital couture was commissioned and designed by Dolce & Gabbana. The digital Agnetha, Bjorn, Benny and Anni-Frid perform several costume changes in the 90-minute show.

Clearly inspired by the collaboration, the Italian luxury brand will tomorrow announce it is sponsoring a virtual collection at next month’s Metaverse Fashion Week.

For a recent catwalk show, Spanish label Balenciaga used AI technology to present its collection virtually, with 44 clones of the same model. 

Creative director Demna Gvasalia called it a ‘deepfake of a fashion show’. ‘It’s a show that never happened, but the clothes are real – they were made,’ he said.

The film industry has long been fascinated by how AI could be used to bring its greatest stars back to the big screen. 

When Oliver Reed died in 1999, during the filming of Gladiator, his face was digitally grafted on to that of an extra to complete his missing scenes.

For the final Star Wars film, The Rise Of Skywalker, actress Carrie Fisher, who played Princess Leia but died before it went into production, was recreated using archive footage of her face superimposed on to a digitised body.

However, using a celebrity’s likeness – even a dead one – without permission raises questions about licensing rights as well as ethical considerations.

Henry Ajder, an adviser on deepfakes and generative AI, raised concerns on Radio 4’s Today programme last week, noting: ‘We’ve seen this technology advance dramatically over the past five years.

‘In the film industry in particular, there are some really key questions around things such as consent, in the case of what we call synthetic resurrection – bringing back deceased actors for new films where their face is swapped on to a body double, or perhaps their entire likeness is synthetically recreated. 

‘We’re seeing some interesting questions around the legal aspects, too.

‘If I license my face for one film, can it then be used in other productions? If I’m breaking through as an actor and, excited to get a part, I sign a contract, does that mean the studio can use my likeness in other ways? And can I be compensated for that? How much control do I have over what they do with my face and my voice?’

Such considerations are especially pertinent in a ‘fake news’ era. At present, there is no technology that can flag up when AI has been used to manipulate a video clip. 

It means conspiracy theorists and political extremists are able to claim that genuine footage has been faked with AI whenever it doesn’t suit their agenda.

As for the fashion industry, improvements in technology mean that global brands are increasingly keen to investigate how to use virtual avatars.

According to industry sources, virtual modelling is under serious consideration at Europe’s biggest agency, Models1 – home to Linda Evangelista, Sophie Dahl and Amber Le Bon – and London agency Storm, whose founder Sarah Doukas scouted a young Kate Moss and represents Cindy Crawford and her daughter Kaia Gerber, Wonderbra model Eva Herzigova, Carla Bruni and Lady Amelia Windsor.

A spokesman for Moss, who now runs her own modelling agency, declined to comment on whether she has been approached to turn herself into a digital avatar. But Storm director Simon Chambers said he would be surprised if she wasn’t on every luxury brand’s digital wishlist.

He said: ‘This is something we are watching and learning about, and there is little doubt that virtual models will be possible.

‘We feel it’s perfectly possible that we could look after an individual and their avatar, or potentially manage just an avatar in the same way that we do for real people.’

German tech company DMIx, which makes photo-realistic avatars for brands, lists Hugo Boss and Valentino as clients.

Valentino has also worked with the AI designer Vittorio Maria Dal Maso to create virtual models for a menswear collection.

Gerd Willschutz, DMIx’s co-founder, told The Mail on Sunday that the possibilities for the technology are ‘endless’. 

His technology can scan a living model and make a digital avatar of them that is so realistic that online shoppers can’t tell the difference.

He said: ‘Think of football players in video games. They’re licensed to a game manufacturer, then they play their digital role. 

‘We can do the same with normal fashion models. In your life as a model, you could digitise yourself at different ages, and license all of them.

‘At the same time, if you are a successful model, you could digitise yourself and do work all over the planet in the same second.

‘We did blind testing, putting our images into e-commerce sites and not telling the users, and in the end we asked if they noticed anything. Nobody did.’

Yet there are already concerns that avatars could breach data protection rights, and throw up copyright issues that have no legal precedent. Then there’s the question of how much a model should be paid for her virtual efforts.

Ms Reader said models must seize the chance to be 3D-imaged and turned into avatars so they can own their data and intellectual property rights, and license themselves out to make money from virtual runways, e-commerce and gaming. 

If they don’t, they risk having their images on social media ‘scraped’ by AI tools, which copy their data to create virtual models in their likeness without their consent.

Virtual influencer Lil Miquela, a 19-year-old robot with almost three million followers on Instagram, earns $8,000 from brands for each paid Instagram post. It is estimated that she earned more than £8.9 million last year.

‘Models and celebrities can grab new income streams by being scanned into virtual avatars of themselves,’ Ms Reader said. 

‘If you think about it, they can continue to license their image over and over again without having to be present.’

Soon – to misquote the maxim of the legendary 1990s supermodel Linda Evangelista – she won’t have to get out of bed for less than $10,000 a day.
