► Robotaxi tested in the UK
► Nissan hardware, startup software
► What does it mean for autonomy?
270 cameras dotted around a small part of East London, 24km of digital mapping that's accurate to 1cm, a car equipped with four lidar sensors, more than 100kg of added electronic goodness and a boot crammed with so much secretive tech that I wasn't allowed to take a picture of it. That, in layman's terms, is what it takes to make an autonomous robotaxi work in the UK.
The car in question is a 2018 Nissan Leaf, heavily modified by East London outfit Smart Mobility Living Lab, which gives the car a living, breathing, real-world testbed to drive in. The project is funded partly by the government and partly by industry: £10.7 million has been spent on it so far, 1600 miles have been covered and there have been zero road accidents.
How did the ride go?
Honestly, smoother than a lot of Ubers I've been in. I should state here that the car in question is Level 4 autonomous, meaning it can drive itself fully within a defined operating area. But UK law dictates that an actual human must sit behind the wheel, although he's allowed to keep his hands off it.
Things start off smoothly; the car indicates to pull out of the junction as we start the 2.7-mile route in Woolwich. Interestingly, although it indicates itself, it doesn't interpret other cars' indicators. It can, but it's been told not to, because humans are so indifferent/incompetent at using them correctly.
Right, we're away. I'm in the back, and in front of me sits a large screen showing a visual interpretation of what the car 'sees'. It looks a bit like the display you get in a Tesla.
A red light is up ahead; the car slows down naturally and pulls away in reasonable time when it goes green. All good so far. Next up is a roundabout and a lane change, once again dealt with adeptly.
On the screen in front of me you can see images of generic cars, bikes and humans. Two junctions away, a bus shows up red, indicating that it has stopped. The car makes a lane change to avoid getting stuck behind it. All very clever stuff, and something my human eyes couldn't have seen.
A zebra crossing quickly approaches and the car slows tentatively as it can 'see' a pedestrian nearby. The pedestrian doesn't step out, so the car carries on its merry way.
It’s all very impressive. But not quite without fault. Our human driver did need to intervene at one point to avoid a pothole. The car isn’t sentient and doesn’t understand the concept of suspension bushes.
What does it mean for drivers?
Broadly, that robotaxis can work in a city such as London. California, a land with large open spaces, has working driverless taxis already – Amazon's toaster-shaped Zoox is currently roaming around sans steering wheel or pedals.
But Nissan's venture proves the concept can work in much harder environments, with fast- and slow-moving traffic, roundabouts and errant Londoners.
It’s not all perfect though. The system is compromised in poor weather, especially snow, where sensors can clog and the cameras struggle to make sense of what’s happening. And there’s the whole mapping issue.
Our test route was extremely well mapped and surveilled. There's even a command centre back at base, with dozens of screens monitoring your every move. This type of robotaxi can work in populous places where millions and millions are being spent on infrastructure and the car itself. But we're still a very long way from any of this working seamlessly in areas without that huge infrastructure and mapped data.