Self-driving vehicles may be inherently racist because they’re unable to detect dark-skinned faces in the dark, experts have warned.
The Law Commission says racial bias ‘has crept into the design of vehicles and automated systems’, which could have disastrous consequences.
Autonomous vehicles are powered by artificial intelligence (AI) that’s trained to detect pedestrians in order to know when to stop and avoid a collision.
But this inherent bias effectively means anyone with a ‘non-white’ skin tone might be at greater risk of being involved in an accident in poor light conditions.
Self-driving vehicles may also be prejudiced against women and the mobility-impaired, because their operating systems have largely been created by able-bodied men, according to the Law Commission.
The self-driving vehicle market will be worth nearly £42 billion to the UK by 2035, according to the Department for Transport – but scientists are racing to get the technology right
Driverless cars are more likely to HIT people with darker skin
In 2019, a report found that facial recognition systems in self-driving cars are better at identifying the faces of white people than those of darker skin tones.
Researchers say the inherent racism of these systems likely stems from a lack of dark-skinned individuals included in the technology’s training set.
The system used in the research, BDD100K, was developed by UC Berkeley and contained more than 100,000 videos of real-world footage taken by vehicle mounted cameras.
The study found that the facial recognition systems being built for autonomous cars are up to 12 per cent worse at spotting people with darker skin.
On average, the technology is 4.8 per cent more accurate at correctly spotting light-skinned individuals.
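The gap quoted above is a straightforward difference in per-group detection accuracy. A minimal sketch of how such a disparity might be measured is below – the detection outcomes are invented purely for illustration and do not come from the study:

```python
# Hypothetical example: comparing pedestrian-detection accuracy across
# skin-tone groups. Outcomes (1 = pedestrian detected, 0 = missed) are
# made up for illustration only.

def accuracy(results):
    """Fraction of pedestrians correctly detected."""
    return sum(results) / len(results)

light_skin = [1, 1, 1, 0, 1, 1, 1, 1, 1, 1]   # 9 of 10 detected
dark_skin  = [1, 1, 0, 1, 1, 0, 1, 1, 1, 0]   # 7 of 10 detected

# The disparity is simply the difference between the two accuracy rates
gap = accuracy(light_skin) - accuracy(dark_skin)
print(f"Accuracy gap: {gap:.1%}")  # prints "Accuracy gap: 20.0%"
```

A real evaluation would run over many thousands of labelled images, but the headline figures in such studies boil down to exactly this kind of subtraction.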
The independent body is drawing up the legal framework for the rollout of self-driving cars on UK roads.
‘Systems may not have been trained to deal with the full variety of wheelchairs and mobility scooters,’ it says in its joint consultation with the Scottish Law Commission.
‘Air bags save many lives, but the first generation… posed risks to smaller passengers, such as women of small stature, the elderly, and children, because they were developed with adult males in mind.
‘Current facial recognition software may also exhibit a bias towards white, male faces.
‘For non-white and non-male faces, the accuracy of facial recognition systems may decline significantly.’
The Law Commission also said that if systems are designed to recognise pedestrians through leg movements, ‘those movements may not be as pronounced for people wearing long skirts or robes’.
‘Where designers are drawn predominantly from one demographic group (such as young men) it is easy for the diversity of those affected by the design to be overlooked,’ it says.
The self-driving vehicle market will be worth nearly £42 billion to the UK by 2035, according to the Department for Transport – by which time, 40 per cent of new UK car sales could have self-driving capabilities.
But autonomous vehicles can only be widely adopted once they can be trusted to drive more safely than human drivers.
Teaching them to respond to unique situations as capably as a human driver will therefore be crucial to their full rollout.
‘When it comes to autonomous cars, that technology must be accurate, precise and non-discriminatory,’ Edmund King, president of the AA, told the Times.
‘Human error is a factor in a majority of crashes but we shouldn’t just transfer the risks and accept robot error.
‘The last thing we need is the next generation of Mondeo Man being a racist, misogynist self-driving automobile.
THE FIVE LEVELS OF AUTONOMOUS DRIVING
Level 1 – The system handles a small amount of control, such as adaptive braking when a car gets too close.
Level 2 – The system can control the speed and direction of the car allowing the driver to take their hands off temporarily, but they have to monitor the road at all times and be ready to take over.
Level 3 – The driver does not have to monitor the system at all times in some specific cases, such as on motorways, but must be ready to resume control if the system requests it.
Level 4 – The system can cope with all situations automatically within a defined use case, but it may not be able to cope with all weather or road conditions. The system will rely on high-definition mapping.
Level 5 – Full automation. System can cope with all weather, traffic and lighting conditions. It can go anywhere, at any time in any conditions.
Note: Level 0 is often used to describe vehicles fully controlled by a human driver.
‘These technological hurdles need to be overcome before drivers can take their hands off the wheel.’
Self-driving vehicles could prevent 47,000 serious accidents and save 3,900 lives over the next decade, according to Mike Hawes, chief executive of the Society of Motor Manufacturers and Traders.
But Hawes also said fully automated driving – known as level 5 – is ‘some way off’.
2021 was previously touted as the year fully automated vehicles would roll out on UK roads – but the technology is still in the trial phase.
Last autumn, Oxbotica, an Oxford-based autonomous vehicle software firm, launched a test fleet of six self-driving Ford Mondeos in the city.
The vehicles were each fitted with a dozen cameras, three Lidar sensors and two radar sensors, giving the fleet ‘level 4’ – the ability to handle almost all situations itself.
In January this year, new lane-keeping technology approved in United Nations regulations came into force in the UK.
This effectively means vehicles can be fitted with an Automated Lane Keeping System (ALKS), which keeps the vehicle within its lane, controlling its movements for extended periods of time without the driver needing to do anything.
The driver must be ready and able to resume driving control when prompted by the vehicle, however.
But it would mean drivers could cruise along the motorway at 70mph while sending a text or even watching a film.
Car manufacturers would potentially have to install shaking seats to alert drivers when they would have to take control of the vehicle.
A fleet of six self-driving Ford Mondeos navigated the streets of Oxford at all hours and in all weathers to test the abilities of driverless cars as part of a trial in 2020
The ALKS system is classified by the UN as Level 3 automation – the third of five steps towards fully-autonomous vehicles.
Safety continues to be a major challenge for autonomous vehicles, which have undergone multiple trials globally.
Several self-driving cars have been involved in nasty accidents – in March 2018, for example, an autonomous Uber vehicle killed a female pedestrian crossing the street in Tempe, Arizona in the US.
The Uber engineer in the vehicle was watching videos on her phone, according to reports at the time.
SELF-DRIVING CARS ‘SEE’ USING LIDAR, CAMERAS AND RADAR
Self-driving cars often use a combination of normal two-dimensional cameras and depth-sensing ‘LiDAR’ units to recognise the world around them.
However, others make use of visible light cameras that capture imagery of the roads and streets.
They are trained with a wealth of information and vast databases of hundreds of thousands of clips which are processed using artificial intelligence to accurately identify people, signs and hazards.
In LiDAR (light detection and ranging) scanning – which is used by Waymo – one or more lasers send out short pulses, which bounce back when they hit an obstacle.
These sensors constantly scan the surrounding areas looking for information, acting as the ‘eyes’ of the car.
While the units supply depth information, their low resolution makes it hard to detect small, faraway objects without help from a normal camera linked to it in real time.
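The ranging principle behind those laser pulses is simple time-of-flight: the pulse travels to the obstacle and back at the speed of light, so halving the round trip gives the distance. A small illustrative sketch (the timing value is invented for the example):

```python
# Time-of-flight ranging, the principle behind LiDAR:
# distance = (speed of light * round-trip time) / 2

C = 299_792_458  # speed of light in metres per second

def lidar_distance(round_trip_seconds):
    """Distance to an obstacle from a laser pulse's round-trip time."""
    # Divide by 2 because the pulse travels out AND back
    return C * round_trip_seconds / 2

# A pulse returning after roughly 200 nanoseconds indicates an
# obstacle about 30 metres away
print(f"{lidar_distance(200e-9):.1f} m")  # prints "30.0 m"
```

Because these round-trip times are measured in nanoseconds, a production unit repeats this calculation millions of times per second to build its 3D picture of the road.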
In November last year Apple revealed details of its driverless car system that uses lasers to detect pedestrians and cyclists from a distance.
The Apple researchers said they were able to get ‘highly encouraging results’ in spotting pedestrians and cyclists with just LiDAR data.
They also wrote they were able to beat other approaches for detecting three-dimensional objects that use only LiDAR.
Other self-driving cars generally rely on a combination of cameras, sensors and lasers.
An example is Volvo’s self-driving cars, which rely on around 28 cameras, sensors and lasers.
A network of computers process information, which together with GPS, generates a real-time map of moving and stationary objects in the environment.
Twelve ultrasonic sensors around the car are used to identify objects close to the vehicle and support autonomous drive at low speeds.
A wave radar and camera placed on the windscreen read traffic signs and the road’s curvature, and can detect objects on the road such as other road users.
Four radars behind the front and rear bumpers also locate objects.
Two long-range radars on the bumper are used to detect fast-moving vehicles approaching from far behind, which is useful on motorways.
Four cameras – two on the wing mirrors, one on the grille and one on the rear bumper – monitor objects in close proximity to the vehicle and lane markings.