New technology is giving autonomous vehicles ‘X-ray’ vision to help them track pedestrians, cyclists and other vehicles that may be obscured.
Experts in Australia are now commercialising the technology, which is called cooperative or collective perception (CP).
It involves the installation of roadside information-sharing units (‘ITS stations’) equipped with sensors such as cameras and lidar.
At a busy junction, for example, vehicles would use these units to share what they ‘see’ with other vehicles.
This gives each vehicle X-ray-style vision that can 'see through' a bus to spot a pedestrian, or detect a fast-moving van around a corner that's about to run a red light.
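To make the idea concrete, the sketch below is a minimal, hypothetical illustration in Python of how a vehicle might merge its own detections with objects broadcast by a roadside ITS station. Real CP systems exchange standardised vehicle-to-everything (V2X) messages; the class names, fields and the simple ID-based merge here are illustrative assumptions, not part of the project's actual software or any CP standard.

```python
from dataclasses import dataclass

# Hypothetical, simplified picture of collective perception: a roadside
# ITS station shares the objects it detects, and a vehicle merges them
# with its own onboard detections. All names and fields are illustrative.

@dataclass
class DetectedObject:
    object_id: str    # e.g. "ped-17"
    object_type: str  # "pedestrian", "cyclist", "vehicle", ...
    x: float          # position in a shared map frame (metres)
    y: float
    speed: float      # metres per second

def fuse_perception(onboard, shared):
    """Combine the vehicle's own detections with those broadcast by
    roadside units, keeping one entry per object ID."""
    fused = {obj.object_id: obj for obj in onboard}
    for obj in shared:
        # An object seen only by the roadside sensors (e.g. a pedestrian
        # hidden behind a bus) is added to the vehicle's world model.
        fused.setdefault(obj.object_id, obj)
    return list(fused.values())

# Example: the vehicle cannot see the pedestrian, but the ITS station can.
onboard = [DetectedObject("car-3", "vehicle", 12.0, 4.5, 8.0)]
shared = [DetectedObject("ped-17", "pedestrian", 15.0, 2.0, 1.4)]

for obj in fuse_perception(onboard, shared):
    print(obj.object_type, "at", (obj.x, obj.y))
```

In this toy example the pedestrian detected only by the roadside unit ends up in the vehicle's world model, which is the essence of the 'X-ray vision' effect described above.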
Example of a CP scenario at an intersection. The car on the left would be able to alert the other car to what's happening – that a pedestrian is crossing the road
Autonomous vehicles are powered by artificial intelligence (AI) that’s trained to detect pedestrians in order to know when to stop and avoid a collision.
But they can only be widely adopted once they can be trusted to drive more safely than human drivers.
Teaching them to respond to unusual situations at least as capably as a human driver will therefore be crucial to their full rollout.
The Australian project is being undertaken by iMOVE, a government-funded research centre, with support from transport software firm Cohda Wireless and the University of Sydney.
They’ve released their findings in a final report following three years of research and development.
Following the R&D work, which involved trials on public roads in Sydney, Cohda is now commercialising the technology's applications.
‘This is a game changer for both human-operated and autonomous vehicles which we hope will substantially improve the efficiency and safety of road transportation,’ said Professor Eduardo Nebot at the University of Sydney’s Australian Centre for Field Robotics.
‘CP enables the smart vehicles to break the physical and practical limitations of onboard perception sensors.’
In one test, a vehicle equipped with the tech was able to track a pedestrian visually obstructed by a building.
This image shows the view of the autonomous vehicle equipped with the technology. To the right is a fast-moving van obscured by a building, about to go through a red light. The X-ray-style vision lets the vehicle detect the van and put on the brakes to avoid a collision
Right is a pedestrian ‘about to make an error of judgement’ by walking into the road. CP would allow a vehicle to brake in time to prevent a collision
‘This was achieved seconds before its local perception sensors or the driver could possibly see the same pedestrian around the corner, providing extra time for the driver or the navigation stack to react to this safety hazard,’ said Professor Nebot.
In a real-life setting, CP would allow a moving vehicle to know that a pedestrian is about to walk out in front of traffic – perhaps because they’re too busy looking at their phone – and brake in time to stop a collision.
In this sense, X-ray vision is an example of how an autonomous vehicle would improve upon the capability of a standard car operated by a human.
However, autonomous vehicle technology has yet to master many of the basics, including reliably recognising darker-skinned faces in low light.
Professor Paul Alexander, the chief technical officer of Cohda Wireless, said the new technology ‘has the potential to increase safety in scenarios with both human operated and autonomous vehicles’.
Safety continues to be a major challenge for autonomous vehicles, which have undergone multiple trials globally. Some self-driving cars have been involved in fatal accidents
‘CP enables the smart vehicles to break the physical and practical limitations of onboard perception sensors, and embrace improved perception quality and robustness,’ he said.
‘This could lower per vehicle cost to facilitate the massive deployment of CAV [connected and automated vehicles] technology.’
2021 was previously touted as the year fully automated vehicles would roll out on UK roads – but the technology is still in the trial phase.
Last year, Oxbotica, an Oxford-based autonomous vehicle software firm, launched a test fleet of six self-driving Ford Mondeos in the city.
The vehicles were each fitted with a dozen cameras, three lidar sensors and two radar sensors, giving the fleet 'Level 4' autonomy – the ability to handle almost all driving situations itself.
In January this year, new lane-keeping technology approved under United Nations regulations came into force in the UK.
This effectively means vehicles can be fitted with an Automated Lane Keeping System (ALKS), which keeps the vehicle within its lane, controlling its movements for extended periods of time without the driver needing to do anything.
The driver must be ready and able to resume driving control when prompted by the vehicle, but it would mean drivers could cruise along the motorway while sending a text or even watching a film.
Car manufacturers would potentially have to install shaking seats to alert drivers when they would have to take control of the vehicle.
ALKS is classified by the UN as Level 3 automation – the third of five steps towards fully autonomous vehicles.
A fleet of six self-driving Ford Mondeos navigated the streets of Oxford in all hours and all weathers to test the abilities of driverless cars as part of a trial in 2020
Safety continues to be a major challenge for autonomous vehicles, which have undergone multiple trials globally.
Several self-driving cars have been involved in serious accidents – in March 2018, for example, an autonomous Uber vehicle struck and killed a female pedestrian crossing the street in Tempe, Arizona, in the US.
The Uber safety driver in the vehicle was watching videos on her phone, according to reports at the time.