AI computer chip that ‘SMELLS your armpit and tells you if you have body odour’ may one day be sewn into your clothes, scientists say

  • E-nose identifies dozens of gassy chemicals that cause human stench
  • Ranks odour from one to five – so people know whether to reapply deodorant 
  • Device could one day even be added to food packaging to check for freshness 

Sticky public transport and sweltering offices leave many worrying they have body odour.

But new AI technology may mean people no longer have to ‘subtly’ sniff their armpits to check for foul odours. 

Cambridge-based chip designer Arm – which designs the processors found in most smartphones – is creating a smart chip that can ‘smell’.

The ‘e-nose’ identifies the dozens of gassy chemicals behind human stench before ranking the scent from one to five – so people know whether it is time to reapply their deodorant.

AI technology could lead to an ‘e-nose’ that tells us if we have BO (stock)

The firm hopes the devices may one day be built into clothes to monitor freshness throughout the day, and they could even be added to food packaging to check if produce has gone off. 

The e-nose is being created as part of Project PlasticARMPit, which aims to create flexible electronic devices that can recognise everything from fingerprints to odours. 

It will consist of a thin sheet of plastic film with built-in gas-sensing technology, which was developed by the University of Manchester. 

A total of eight sensors will detect molecules within the gases that make up BO. The resulting readings will then be interpreted by AI technology to determine how strong the odour is.  

‘It’s the job of the machine learning to collect and interpret all the data and then alert the user if action is needed,’ James Myers, principal research engineer at Arm, told New Scientist.  

This technology may also one day be used instead of ‘best before’ dates, which could help to limit food waste. 

This may be difficult, however, if different types of food require specific AI chips.

‘Bad fish smells different from bad burgers and from bad milk,’ Emre Ozer, project leader of PlasticARMPit, told IEEE Spectrum. 

The concept of e-noses is not new. They were first developed by Julian Gardner, a professor of electronic engineering at the University of Warwick. 

HOW WILL THE ‘E-NOSE’ WORK? 

The e-nose, which is still being created, will consist of a thin sheet of plastic film with built-in gas-sensing technology.

A total of eight sensors will detect molecules within the gases that make up BO, the team at Cambridge-based Arm say. 

The readings will then be interpreted by AI technology to rank the odour on a scale of one to five.  
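For a rough idea of what that interpretation step might look like, here is a minimal sketch in Python: eight sensor readings go into a machine-learning classifier, which returns an odour score from one to five. Arm has not published its model or data, so the synthetic readings, the random-forest classifier and the `odour_score` helper are all illustrative assumptions.

```python
# Hypothetical sketch only: Arm's real chip, model and training data are
# not public. Synthetic numbers stand in for genuine gas-sensor readings.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Pretend training set: 500 samples of 8 sensor channels, each labelled
# with an odour strength from 1 (fresh) to 5 (reapply deodorant now).
X_train = rng.uniform(0.0, 1.0, size=(500, 8))
y_train = np.clip(np.rint(X_train.mean(axis=1) * 5 + 0.5), 1, 5).astype(int)

model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(X_train, y_train)

def odour_score(readings):
    """Map one set of eight sensor readings to a 1-5 odour rating."""
    return int(model.predict(np.asarray(readings).reshape(1, -1))[0])

print(odour_score([0.9] * 8))  # strong readings give a high score, e.g. 5
```

On the real chip the classifier would have to be far smaller and run on flexible plastic hardware, but the shape of the pipeline – sensor readings in, odour rating out – is the same.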

In 1993 Professor Gardner co-founded the company Alpha MOS, which sells these devices to the food industry.

Although Professor Gardner helped bring the cost of the devices down from around £15,810 ($20,000) to just a few dollars, he argued they will need to be even cheaper before people will be persuaded to buy clothes with them sewn in. 

The chips will also need to be resilient enough to survive the wash.

Getting e-noses into food packaging may be even more difficult, according to Alex Bond, CEO of Fresh Check, a start-up that is developing sprays and wipes that change colour if food is contaminated. 

‘Any increase to packaging costs is hard to justify,’ Mr Bond told New Scientist. 

‘Most food manufacturers have exceptionally tight profit margins, so there has to be an incredibly strong incentive for them to adopt more expensive packaging.’

The chips may therefore only be realistic in developed countries or those with a high risk of contamination.  

HOW DOES ARTIFICIAL INTELLIGENCE LEARN?

AI systems rely on artificial neural networks (ANNs), which try to simulate the way the brain works in order to learn.

ANNs can be trained to recognise patterns in information – including speech, text data, or visual images – and are the basis for a large number of the developments in AI over recent years.

Conventional AI uses input to ‘teach’ an algorithm about a particular subject by feeding it massive amounts of information.   
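As a toy illustration of that teaching process, the sketch below trains a single artificial neuron to tell two clusters of points apart by repeatedly showing it labelled examples. It is a minimal, self-contained example, not anything Arm or the researchers quoted here have published.

```python
# Toy example: one artificial neuron learns to separate two clusters of
# points after repeated exposure to labelled data. Purely illustrative.
import numpy as np

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-1, 0.5, (200, 2)),   # class 0 cluster
               rng.normal(+1, 0.5, (200, 2))])  # class 1 cluster
y = np.array([0] * 200 + [1] * 200)

w, b = np.zeros(2), 0.0
for _ in range(1000):                        # repeated exposure to the data
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid activation
    grad = p - y                             # cross-entropy gradient
    w -= 0.1 * (X.T @ grad) / len(y)         # nudge the weights...
    b -= 0.1 * grad.mean()                   # ...towards fewer mistakes

print(f"training accuracy: {((p > 0.5) == y).mean():.2%}")  # near 100%
```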

Practical applications include Google’s language translation services, Facebook’s facial recognition software and Snapchat’s image altering live filters.

The process of inputting this training data can be extremely time-consuming, and the resulting system is limited to one type of knowledge. 

A new breed of ANN, known as a generative adversarial network (GAN), pits the wits of two AI systems against each other, allowing them to learn from one another. 

This approach is designed to speed up the process of learning, as well as refining the output created by AI systems. 
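The sketch below shows that adversarial set-up in miniature, assuming PyTorch and a toy one-dimensional target (both illustrative choices, not anything from the article): a ‘generator’ network learns to fake samples from a distribution while a ‘discriminator’ learns to spot the fakes, and each improves by competing with the other.

```python
# Toy adversarial training: a generator G fakes samples from a target
# distribution (a Gaussian around 3.0) while a discriminator D learns to
# tell real samples from fakes. Purely illustrative assumptions.
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))  # generator
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))  # discriminator
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()
real_label, fake_label = torch.ones(64, 1), torch.zeros(64, 1)

for step in range(2000):
    real = torch.randn(64, 1) * 0.5 + 3.0   # genuine samples from the target
    fake = G(torch.randn(64, 8))            # generator's attempts

    # The discriminator learns to label real data 1 and fakes 0.
    opt_d.zero_grad()
    d_loss = loss_fn(D(real), real_label) + loss_fn(D(fake.detach()), fake_label)
    d_loss.backward()
    opt_d.step()

    # The generator learns to make the discriminator call its fakes real.
    opt_g.zero_grad()
    g_loss = loss_fn(D(fake), real_label)
    g_loss.backward()
    opt_g.step()

print(G(torch.randn(1000, 8)).mean().item())  # should drift towards 3.0
```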
