Human drivers should not be legally accountable for road safety in the era of autonomous cars, a report says.
In these cars, the driver should be redefined as a "user-in-charge", with very different legal responsibilities, according to the law commissions for England and Wales, and Scotland.
If anything goes wrong, the company behind the driving system would be responsible, rather than the driver.
And a new regime should define whether a vehicle qualifies as self-driving.
In the interim, carmakers must be extremely clear about the difference between self-drive and driver-assist features.
There should be no sliding scale of driverless capabilities - a car is either autonomous or not.
And if any sort of monitoring is required - in extreme weather conditions, for example - it should not be considered autonomous and current driving rules should apply.
The law commissions were asked in 2018 to come up with a series of reports on the regulatory framework for automated vehicles and their use on public roads.
This final report sets out their recommendations.
Transport Minister Trudy Harrison said the government would "fully consider" the recommendations.
The Scottish and Welsh governments will also decide whether to introduce legislation.
Matthew Avery, chief research strategy officer at Thatcham Research, which was involved in the consultation, said: "We applaud the recommendations that compel carmakers to use appropriate terminology when marketing these systems, to prevent motorists from becoming convinced that their car is fully self-driving when it is not.
"In the next 12 months, we're likely to see the first iterations of self-driving features on cars in the UK.
"It's significant that the Law Commission report highlights the driver's legal obligations and how they must understand that their vehicle is not yet fully self-driving."
Last year, the Department for Transport gave the green light to automated lane-keeping systems (ALKS), the first type of hands-free driving to be legalised in the UK.
Drivers using ALKS will not need to monitor the road or keep their hands on the wheel but must stay alert and be able to take over within 10 seconds when requested by the system.
Tesla, one of the leading companies developing driverless cars, has faced a barrage of questions over its marketing of Autopilot, which is similar to ALKS and considered level two on the five defined levels of self-driving cars.
Last week, Californian prosecutors filed two counts of vehicular manslaughter against the driver of a Tesla that, while using Autopilot, went through a red light, hit another car and killed two people - the first time someone has been charged with manslaughter while using a partially automated driving system.
Previously in the US, a driver was killed while playing a video game with Autopilot engaged.
And in 2018, a UK resident was banned from driving after climbing into the passenger seat of his Tesla on the motorway.