There is no easy answer to the question of how safe self-driving cars should be, an adviser to a new UK Government-backed study has told the BBC.
The report from the Centre for Data Ethics and Innovation (CDEI) warns that it might not be enough for self-driving cars to be safer than human drivers.
It suggests the public may have much higher expectations of self-driving car safety.
This comes as the Government sets out further details of its plans for self-driving cars.
These include a "safety ambition" for vehicles - that they should be as safe as a competent human driver.
It says this will inform the standards vehicles need to meet before they are allowed to "self-drive" on the roads, and manufacturers could face sanctions if those standards are not met.
But the CDEI, an expert body which advises governments on artificial intelligence, says the question of how safe autonomous vehicles should be is not one that science alone can answer.
Little tolerance for crashes
It says the public may have little tolerance for crashes seen as the fault of "faceless technology companies or lax regulation", even if driverless cars are, on average, safer than human drivers.
And if the public expect self-driving cars to be as safe as trains or planes, it warns, that would require a hundred-fold improvement in average safety over manually driven cars.
"What we wanted to do was say there's not an easy answer to this question," said Professor Jack Stilgoe of University College London who advised the CDEI. He suggested that establishing how safe they should be was a democratic decision.
The CDEI says it is also important to consider how risk is distributed between different groups. Even if there are improvements in overall safety, "some groups may see substantial safety improvements while others see none or even face new hazards".
The report advises that other risks will need scrutiny as the technology is rolled out.
One is the potential for bias in the algorithms that control the cars.
It warns that some groups, such as wheelchair users, may be under-represented in the data used to train those algorithms - a gap that could introduce bias.
Clearly identified
The report also says that self-driving cars should be clearly identified, and that "people have a right to know what sort of agents they are sharing the road with".
A survey by the CDEI suggested that 86% of the public agreed with this.
Professor Stilgoe said there were also serious moral questions about how the testing of self-driving vehicles is conducted on public roads, as other road users could, in effect, become participants in those trials whether they liked it or not.
"There is something quite important about the ethical principle of informed consent," he told the BBC.
The technology might result in pressure to alter roads and the rules of the road to suit self-driving cars.
Professor Stilgoe said these needed to be debated and discussed transparently.
"The danger is sort of sleepwalking into a world in which these changes happen in order to suit one mode of transport - and the benefits then don't get spread very widely," he said.
Questions for Tesla
Meanwhile, US auto safety regulators yesterday asked electric-car maker Tesla to answer questions about the in-car camera it uses to monitor driver awareness, as part of a probe into 830,000 Tesla vehicles fitted with its advanced driver-assistance system, Autopilot.
Reuters says the National Highway Traffic Safety Administration (NHTSA) is assessing the performance of Autopilot after earlier identifying a dozen crashes in which Tesla vehicles struck stopped emergency vehicles.
In June, it upgraded its probe to an engineering analysis - a required step before it could potentially demand a recall.
NHTSA's nine-page letter demands that Tesla answer questions by October 12 about "the role that the cabin camera plays in the enforcement of driver engagement/attentiveness".