Nobel Prize in Physics is awarded to two scientists for developing the methods that are the foundation of today’s most POWERFUL AI tools

The 2024 Nobel Prize in Physics has been awarded to two scientists for developing the methods which lay the foundations for today’s powerful AI.

John Hopfield and Geoffrey Hinton received the prestigious award for ‘foundational discoveries and inventions that enable machine learning with artificial neural networks.’

John Hopfield, of Princeton University, invented the first methods which allowed machine learning systems to save and recreate patterns.

Geoffrey Hinton, of the University of Toronto, gave these networks the ability to find specific properties, allowing them to complete tasks like recognising elements in pictures.

These scientists’ discoveries paved the way for the artificial neural networks which power modern chatbots such as ChatGPT.

The 2024 Nobel Prize in Physics has been awarded to John Hopfield and Geoffrey Hinton for developing the methods that are the foundations for today’s powerful AI

Why did John Hopfield and Geoffrey Hinton win the Nobel Prize for Physics?

Hopfield and Hinton were awarded the prize for ‘foundational discoveries and inventions that enable machine learning with artificial neural networks.’

John Hopfield invented the ‘Hopfield network’, a method for storing and recognising images within networks of nodes.

Geoffrey Hinton created the ‘Boltzmann machine’, which adapts a Hopfield network to recognise patterns and common attributes in data.

These advances paved the way for modern artificial neural networks which enable our most powerful AI tools.  

Most modern artificial intelligences are based on a type of technology called artificial neural networks which mimic the connections between neurons in the brain.

Inside an AI, the neurons are represented by nodes that influence each other through connections which can be made weaker or stronger – allowing the system to learn over time.
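
As a rough illustration (a generic sketch, not the laureates’ own models), a single artificial ‘neuron’ can be written in a few lines of Python: it adds up its weighted inputs, and learning simply means making those connection weights stronger or weaker.

```python
# A minimal sketch of one artificial 'neuron' (illustrative only).
# It sums its weighted inputs and fires if the total crosses a threshold;
# training a network means adjusting the connection weights.

def neuron_output(inputs, weights, threshold=0.0):
    """Return 1 if the weighted sum of the inputs exceeds the threshold."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total > threshold else 0

# Two inputs with connection strengths of 0.8 and -0.4; making these
# connections stronger or weaker changes what the neuron responds to.
print(neuron_output([1, 1], [0.8, -0.4]))  # prints 1, since 0.8 - 0.4 > 0
```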

Without this technology, the powerful systems which run everything from ChatGPT to Apple Intelligence would not be possible.

This year’s Nobel laureates were both instrumental in laying the foundations for these important advances from the 1980s onward.

Ellen Moons, Chair of the Nobel Committee for Physics, says: ‘The laureates’ work has already been of the greatest benefit. 

‘In physics we use artificial neural networks in a vast range of areas, such as developing new materials with specific properties.’

John Hopfield was responsible for inventing a system called the ‘Hopfield Network’ which allows AI to save and recreate patterns.

With his background in physics, Hopfield wanted to understand how the individual neurons in the brain work together to create new abilities like memory and reason.

Drawing on examples found in magnetic metals, Hopfield imagined that the neurons could be represented as a network of ‘nodes’ joined together by connections of different strengths.  

Today’s AIs use a system called Artificial Neural Networks which would not be possible without the work of Hopfield and Hinton 

In his earliest work, these nodes each stored a value of either ‘1’ or ‘0’ – like the pixels in a black and white photo.

Hopfield found a way to describe these networks using a property called ‘energy’, which is calculated from the values of all the nodes and the strengths of the connections between them.

Using these equations, networks could be programmed by giving them an image made up of black and white pixels and adjusting the connections between the nodes so the saved picture has low energy. 

When a new pattern is given to the network, the program checks each node to see whether the energy of the system would be lower if it were changed from black to white or vice versa.

The Nobel Prize Committee for Physics awarded the two scientists the prize for their ‘foundational discoveries’ which led to the development of machine learning 

By following this rule, the network will check every node until it has eventually reproduced the original picture.
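
A minimal Python sketch of the idea (an illustration of a Hopfield-style network, not Hopfield’s original implementation, and using +1/-1 pixel values rather than 1/0 for convenience) stores a single pattern and then repairs a corrupted copy by flipping any pixel that lowers the energy:

```python
import numpy as np

# Illustrative Hopfield-style network (a simplified sketch). Pixels are
# represented as +1/-1 instead of 1/0 here, which makes the energy tidier.

def train(patterns):
    """Set the connection strengths so the stored patterns sit at low energy."""
    n = patterns.shape[1]
    weights = np.zeros((n, n))
    for p in patterns:
        weights += np.outer(p, p)   # strengthen connections between matching pixels
    np.fill_diagonal(weights, 0)    # a node is not connected to itself
    return weights / n

def energy(state, weights):
    """Hopfield's 'energy': lowest when the state matches a stored pattern."""
    return -0.5 * state @ weights @ state

def recall(state, weights, sweeps=5):
    """Check each pixel in turn, flipping it whenever that lowers the energy."""
    state = state.copy()
    for _ in range(sweeps):
        for i in range(len(state)):
            state[i] = 1 if weights[i] @ state >= 0 else -1
    return state

stored = np.array([[1, -1, 1, -1, 1, -1, 1, -1]])
w = train(stored)
noisy = np.array([1, -1, 1, 1, 1, -1, 1, -1])       # one pixel corrupted
restored = recall(noisy, w)
print(restored)                                      # matches the stored pattern
print(energy(noisy, w), '->', energy(restored, w))   # energy falls as the image is repaired
```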

What makes this technique so special is that you can use one network to store lots of different images.

Whenever you give the network any new image, it will always return the most similar saved pattern. 

You can think about this like shaping a landscape of peaks and troughs – when the network saves an image, it creates a valley in a virtual landscape where the bottom of the valley has the lowest energy.

If you dropped a ball into this landscape, it would keep rolling downwards towards lower energy until it was surrounded by uphill slopes on all sides – that valley would be the saved pattern closest to the input pattern.

That discovery opened up the possibility of networks which could recognise similarities between data.

John Hopfield discovered a way of storing images in artificial networks which gives computers the ability to find the closest matching saved image when provided with partially distorted data

Geoffrey Hinton received the Nobel Prize for his work creating the ‘Boltzmann machine’, which expanded on this concept in a revolutionary new way.

These machines use the Hopfield network as their basis but give the network the new ability to recognise characteristic elements in a given type of data.

Just as humans are able to recognise and interpret data according to categories, Hinton wanted to know if the same would be possible for machines.

To do this, Hinton and his colleague Terrence Sejnowski combined Hopfield’s energy landscapes with ideas taken from statistical physics.

These methods allow scientists to describe systems that have too many individual parts to keep track of individually, such as the molecules which make up a cloud of gas. 

Even though we can’t keep track of all the parts, we can describe some of the states in which they could exist as more likely to occur than others, and calculate those probabilities based on the amount of available energy.
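
As a toy Python illustration of that physics idea (not the committee’s own formulation), the probability of a state falls off exponentially with its energy – the so-called Boltzmann factor:

```python
import math

# Toy illustration of the Boltzmann factor from statistical physics:
# a state's probability is proportional to exp(-energy / temperature),
# so lower-energy states are exponentially more likely to occur.

def boltzmann_weight(energy, temperature=1.0):
    return math.exp(-energy / temperature)

energies = [0.5, 1.0, 2.0]
weights = [boltzmann_weight(e) for e in energies]
total = sum(weights)
probabilities = [w / total for w in weights]
print(probabilities)   # the lowest-energy state gets the biggest share
```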

Geoffrey Hinton (pictured) is often described as the ‘godfather of AI’ for his work creating the first ‘generative’ algorithms capable of learning from examples 

Geoffrey Hinton received the award for his work creating the Boltzmann Machine (illustrated) which expanded Hopfield networks to include ‘hidden’ layers which allow them to learn from examples

Hinton’s breakthrough was to take an equation by the nineteenth-century physicist Ludwig Boltzmann which describes this process and apply it to a Hopfield network. 

The resulting ‘Boltzmann machine’ has nodes like a Hopfield network but also contains a layer of ‘hidden’ nodes.

The machine is run by updating the values of the nodes one at a time until it finds a state where the pattern of the nodes can change without altering the properties of the network as a whole.

This allows the machine to learn by being given examples of what you are looking for.

The machine can be trained by changing the values of its connections until the example pattern has the highest probability of appearing on the ‘visible nodes’. 
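
A rough Python sketch of that sampling step is shown below (illustrative only: it uses a simplified ‘restricted’ layout in which connections run only between the visible and hidden layers, and it leaves out the weight-adjusting training step described above):

```python
import numpy as np

# Illustrative sketch of a Boltzmann-style machine with visible and hidden
# nodes (a simplified 'restricted' layout, not Hinton's exact formulation).
# Each node switches on or off at random, with a probability set by the
# strength of the connections feeding into it.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample_hidden(visible, weights):
    """Each hidden node turns on with a probability set by its weighted inputs."""
    return (rng.random(weights.shape[1]) < sigmoid(visible @ weights)).astype(float)

def sample_visible(hidden, weights):
    """The visible nodes are then resampled from the hidden ones in the same way."""
    return (rng.random(weights.shape[0]) < sigmoid(hidden @ weights.T)).astype(float)

# Three visible 'pixels' connected to two hidden feature detectors.
weights = rng.normal(size=(3, 2))
visible = np.array([1.0, 0.0, 1.0])               # an example pattern

hidden = sample_hidden(visible, weights)          # which hidden features switch on?
reconstruction = sample_visible(hidden, weights)  # what pattern do they suggest?
print(hidden, reconstruction)

# Training (not shown) would adjust 'weights' until example patterns like this
# one have the highest probability of appearing on the visible nodes.
```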

The advances made by Hopfield and Hinton created the basis for the neural networks which power the most advanced modern AI (file photo)

AI chatbots such as ChatGPT use artificial neural networks to power their vast systems; these would not have been possible without the basic research conducted by Hopfield and Hinton (file photo)

This allows the AI to recognise patterns in things it hasn’t seen before – just as you can instantly tell a tiger is somehow related to your housecat, even if you have never seen a tiger before.

By layering lots of these networks on top of each other, we can create something which starts to resemble many of the AIs we recognise today.

For example, a simple Boltzmann machine might be used to recommend films based on what you have enjoyed before. 

Although the field of AI has come a long way since Hopfield and Hinton made their initial discoveries, their work has laid the basis for some of the most important innovations in recent history. 

On Monday, Victor Ambros and Gary Ruvkun were awarded the 2024 Nobel Prize in Physiology or Medicine for their discovery of microRNAs.

The Nobel Prize for Chemistry is due to be announced tomorrow morning.  
