
Are we right not to trust autonomous vehicles?


Dr. Ana MacIntosh, a researcher from the University of York, comments on how self-driving cars are being developed and discusses media reports on their development and testing.

65% of people would not feel safe in a car without a human driver.

So, should we trust an autonomous vehicle to be safe?

Self-driving cars are not yet on the market, but the technology that will enable their introduction is being developed and tested. Media reports about the possible capabilities of highly automated and autonomous vehicles are frequent, as are the inevitable reports of crashes when the technology has been trialled, and has failed, before its safety is assured.

In their latest World Risk Poll, Lloyd’s Register Foundation asked people across the globe if they’d feel safe in a car without a human driver. They found that 65% would not.

This statistic isn't surprising for our team at the Assuring Autonomy International Programme (AAIP), a £12M partnership between Lloyd’s Register Foundation (LRF) and the University of York. Autonomous systems replace human perception, understanding of context, and decision-making, but none of the current manufacturers or software companies can demonstrate that their autonomous systems are safe in a way that gives stakeholders, including the public, confidence and trust.

Education and guidance

Providing developers with the methodologies and tools to systematically assure the safety of their system is essential if we are to give the public assurances of safety that they can believe. With the support of the Foundation, we’re making an impact in this area.

We have introduced the world’s first systematic, expert guidance on how to assure the safety of autonomous systems. Our guidance is freely accessible across the globe and has been accessed by safety engineers, software architects, CEOs, and machine learning developers. We know that it is now being adopted by organisations in many industries.

We are also training developers, engineers, and regulators to assure the safety of systems and to understand what questions to ask of manufacturers and system integrators, so that they can gain confidence in the systems presented to them.

This approach of increasing the skills and knowledge of the right people, and providing them with the guidance they need to assure their systems, is essential if we are to gain societal acceptance of self-driving cars.

Public engagement

The World Risk Poll also found that, as with optimism about Artificial Intelligence (AI) generally, people with higher levels of education were most likely to say they would feel safe in a self-driving car. Overall, 35% of those with post-secondary education responded this way, versus 28% of those with secondary education and 25% of those with primary education or less.

We know from our own work that engaging the public and helping them understand the technology behind self-driving cars and other autonomous systems is incredibly important. Focus groups that we ran in the UK in 2020 and 2021 echo this understandable global hesitancy to accept the safety of autonomous cars.

We have also undertaken work to increase the public’s understanding of autonomous systems. Earlier this year we worked with the UK’s Science Museum Group to develop an exhibition for the National Railway Museum. The central point of the exhibition was a film that illustrated how a machine learning algorithm in a self-driving car will identify and categorise potential hazards in the world around it. Supporting elements of the exhibition considered the wider context of the introduction of autonomous systems, including a look at the ethical questions raised by autonomy.

A safe future

Whether you are developing, regulating, buying, or using a self-driving car, you need to trust that it will behave safely and as expected. To build this trust, all stakeholders need justified confidence in the system's safety.


The work we’re doing, supported by the Foundation, is giving technology and automotive companies the skills, tools, and methodologies they need in order to generate and give others justified confidence in their self-driving car. There's more work to be done (by the AAIP and others) to gather momentum and reach a point in the future where all stakeholders will feel safe in a car without a human driver.