Tesla this week unveiled new software updates to improve the self-driving capabilities of its vehicles, including new limits on highway speeds and a way for owners to "summon" their empty vehicles.
But Tesla chief executive Elon Musk said he already believes the technology has proven itself, even though Tesla introduced the self-driving software into its fleet only late last year and even then the update stopped well short of making the cars fully autonomous.
"It's probably better than a person right now [at driving]," Musk said during a conference call with reporters. Musk also expects rapid improvements over the next 24 to 36 months, when a Tesla "will be able to drive virtually all roads at a safety level significantly better than humans," he said.
Tesla tweaks the software running its vehicles by analysing data from the hundreds of millions of miles driven by current owners. The carmaker is constantly updating the autopilot software, which it still considers an unfinished "beta" product. But Musk said he was not aware of any accidents caused by autopilot; the closest cases, he said, were accidents in which drivers mistakenly believed they were in autopilot mode.
The new software updates show the progress Tesla appears to have made - what Musk called "baby steps" toward fully autonomous vehicles.
With the "summon" feature, a driver can tell her Tesla to park itself or pull out from a parking spot or garage. For now, the driver has to be standing nearby. But Musk said he sees that changing quickly. "I think within two years you'll be able to summon your car from across the country," Musk said, allowing that he might be a little optimistic about the time frame.
The carmaker also pushed out new limits on its current autopilot driving feature. Teslas in autopilot mode will now be limited to 5mph (8km/h) over the speed limit on undivided highways and in residential neighborhoods. The autopilot feature will also slow the car - as a human driver might - when driving along curved roads.
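Tesla has not published how these caps are implemented, but a minimal, hypothetical sketch of the reported behaviour might look like the following. The function name, road-type labels and curve-slowing factor are illustrative assumptions, not Tesla's actual logic.

```python
# Hypothetical sketch of the speed-cap rule described in the article.
# Tesla has not released its Autopilot code; all names and factors here
# are assumptions for illustration only.

def autopilot_target_speed(set_speed_mph: float,
                           posted_limit_mph: float,
                           road_type: str,
                           in_curve: bool = False) -> float:
    """Return the speed Autopilot would aim for under the reported rules."""
    target = set_speed_mph

    # On undivided highways and residential roads, cap at 5 mph over the limit.
    if road_type in ("undivided_highway", "residential"):
        target = min(target, posted_limit_mph + 5)

    # Slow through curves, as a human driver might (factor is illustrative).
    if in_curve:
        target *= 0.8

    return target


# Example: driver sets 50 mph on a 35 mph residential street.
print(autopilot_target_speed(50, 35, "residential"))        # -> 40
print(autopilot_target_speed(50, 35, "residential", True))  # -> 32.0
```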
A stretch of the 101 freeway links downtown Los Angeles with the Hollywood Hills. It sounds glamorous, but for local residents it’s eight lanes of concrete drudgery. If Chris Rea had lived in America, the 101 — not the M25 — would have been his road to hell.
Driving along it would normally exasperate me, but I’ve handed over control of my Tesla Model S to its Autopilot software. Two flicks of a column stalk activate first the throttle and braking system and then the automated steering. Another column stalk triggers an autonomous lane change, should the need arise. All you have to do is settle back and let the Autopilot do the work, just as a captain did on the Airbus A380 that brought me here.
In Britain the law dictates that the driver maintain control by keeping a hand on the steering wheel, but in LA that’s not necessary. While technically (and legally) in control, I’m free to faff around with my phone with one hand while adjusting the rear-view mirror with the other.
Tesla owners are now the company’s guinea pigs. It admits as much, saying that it uses driving data to improve its autonomous systems. Tesla even describes its automatic steering as a “beta version”. The term is usually applied to computer software released early to a group of users who can identify bugs, which can then be eliminated before the product’s official launch. The worst that can happen with software is that your computer crashes. If it all goes wrong in a Tesla, the crash could hurt.
But after a few minutes of stop-start traffic it starts to feel remarkably normal. Even when a mustachioed trucker makes a desperate lunge for an exit, the Tesla dabs the brakes and gives him room. All it forgets is to shout, “Hey, man!” and extend the appropriate digit.
There’s no denying it takes the stress and fatigue out of commuting. Even when the traffic momentarily opens up and we hit 60mph, the Tesla goes with the flow. For the first time in my life I’m enjoying not driving a car.
It works so well on the freeway, it’s easy to forget that the technology is still in its infancy. When I eventually head for the exit, the Tesla misjudges the bend of the slip road and I’m forced to grab the wheel and take control.
I’m not the only one. In October last year Tesla updated recent Model S cars with new software, sending a patch via the cars’ built-in Sim cards. It installed the Autopilot self-driving technology, which works in tandem with sensors fitted as standard to cars built since September 2014. Within hours drivers in America and Europe were posting videos of near misses, in which the car appeared to swerve suddenly.
It was no wonder there were a few close shaves. In some clips owners could be seen sitting in the back of the car, leaving the driver’s seat empty. Others tried to cruise through towns, along winding country roads or across busy junctions, which could all prove too complex for Tesla’s fledgling software. The company said that Autopilot should be used only on motorways, which have clear white lines to guide the car. Most of the near misses appeared to be on single-carriageway roads.
Other manufacturers offer cars that control their own accelerator, brakes and steering on motorways, but Tesla’s version differs in a few key areas. It is the first to offer an automated lane-change option, triggered by the driver clicking the indicator. And it is promoted differently. Rival companies bill self-steering technology as a driver assistance system, but the Autopilot name suggests that the car can drive itself.
Autopilot allows drivers to take their hands off the wheel for considerably longer periods than is possible in any other car we have tested. In Europe fully automated steering systems are banned at speeds of more than 7.5mph. So the new BMW 7-series, for example, can track the white lines on roads and steer itself as long as you have your hands on the wheel. Take your hands off and an alarm will sound after a few seconds. Keep them off and the self-steering will stop.
When we tested a Tesla Model S in Britain last year we were able to drive up the M4 with our hands hovering over — but not touching — the wheel for almost three minutes before an alarm sounded. Critics say this system is irresponsible, as drivers who take their hands off the wheel for such a length of time could become distracted and unable to intervene immediately in an emergency.
“In the app industry you can launch products on the market that are 70%-80% ready and then complete their development with the customer,” said Harald Krueger, the chief executive of BMW, in an interview. “That is absolutely impossible with safety features in a car.” Jaguar said the system could create a “false sense of security”.
One manufacturer behind numerous safety innovations says that drivers should never be expected to remain in control once they take their hands off the wheel. “Once we go to autopilot we think it’s not reasonable to expect the driver to intervene,” said Erik Coelingh, senior technical director of Volvo’s driverless car projects. “Drivers will not be prepared [to take control suddenly in an emergency]. When we offer autopilot, we say to the driver, ‘You can sit back, relax and do something else.’ We take the responsibility and we have to make sure that the car can deal with all situations.”
The law is having trouble keeping up with these advances, and manufacturers aren’t just outpacing current regulations but future ones too. Car technology is regulated by the international Vienna convention on road traffic, which states that a driver must “at all times be able to control his vehicle”. Unsurprisingly, the 1968 rule doesn’t tackle assistance systems. But next month an amendment will come into force, stating that fully autonomous systems that can be switched off or overridden by the driver are permitted. Many believe this broad description, combined with a 2018 update to automatic steering rules, will open the floodgates to new technology that could sideline drivers even further.
The head of the United Nations group responsible for road traffic safety, which developed the amendment, said manufacturers were already pushing the boundaries. “The amendment just considers that the driver will be in the car, but now we see the BMW 7-series, where the driver can park the car by standing outside and controlling it with their key fob,” Luciana Iorio said. “New technology is becoming available and we are having to go faster with legislation.
“We have to consider the role of drivers and how they will behave with full freedom. You might have a mum changing her baby’s nappy [in the car] and then something goes wrong. She can’t take control straight away. We have to look at liability too. It is like opening a Pandora’s box — a good one, which will revolutionise the world for the better — but still a Pandora’s box with a lot of unknowns.”
Elon Musk, the founder of Tesla, has predicted that we will see a fully autonomous car for sale in two years. Mercedes believes that the regulation changes will allow buyers of its new E-class to be given a software update that will allow it to drive itself on motorways without any steering by the driver. “The regulations would give the basis for hands-off steering,” said Michael Hafner, director of driver assistance systems and active safety for Mercedes.
Hafner said the update would not be offered until it was perfected, and dismissed Musk’s prediction. “I just don’t see full autonomy in two years — 99% of problems can be solved quickly, and that’s where we are now. But solving the remaining 1% takes 99% of the time.
“Obstacles or dirt on the road, construction areas and the loss of road markings are all areas that still need to be addressed. We need more detailed, higher-definition maps than we have today and we need a way of updating them through connectivity. There are key real-world challenges — can you imagine trying to navigate around several lanes of unmarked traffic at the Arc de Triomphe in Paris, for example? That’s the 1% at the moment.”
Coelingh at Volvo also urged caution, saying that car makers must be so confident before launching fully driverless cars that they will accept responsibility if something goes wrong. “If a lift broke, you wouldn’t accept responsibility just because you were inside,” he said. “And it’s the same with autonomous driving.”
Volvo will begin a public trial of autonomous cars in Sweden next year, but Coelingh said it would not be as futuristic as you might think. “We have tested people’s reactions in what we call Wizard of Oz cars, which appear to be autonomous,” he said. “Drivers hidden in the back control the steering. Most people are really excited when they get in, but then it drives along smoothly and they are disappointed because it’s just relaxed and comfortable.”