No matter how far Google Inc. comes with self-driving cars, the technology will never be perfect. Human error and a chaotic world will not allow it.
This may be why Google, prodded by an Associated Press report about accidents involving the company's self-driving cars, revealed this month that its fleet has been involved in 11 minor accidents over the course of 1.7 million miles of testing.
Google insisted its software wasn't at fault in any of the crashes, yet the data prompted a search for meaning, as analysts compared Google's record to U.S. averages and asked whether the self-driving car had a safety problem.
It was a fundamentally misguided analysis. Google's research takes place mostly on urban roads, where minor accidents are more common than on highways. Google must also disclose every accident involving its self-driving cars to the State of California, unlike the legions of drivers who don't report minor fender benders for fear of raising their insurance rates.
Yet this probing, unfair as it was, illustrated the bar that Google's self-driving car will need to clear to be accepted. It must fit onto the road seamlessly, so drivers sharing the road aren't surprised or put at risk by its manners. It must be not only safer than human drivers, but unquestionably safe.
The blog post that Google released this month explained a handful of fender benders, but it could very well have been about the first person to be seriously hurt or killed by a self-driving car.
"Even when our software and sensors can detect a sticky situation and take action earlier and faster than an alert human driver, sometimes we won't be able to overcome the realities of speed and distance," Chris Urmson, the director of Google's self-driving cars project, wrote in a post on the website Medium. This is "important context for communities with self-driving cars on their streets," he added. "Although we wish we could avoid all accidents, some will be unavoidable."
People are wary of new technology and eager to seize on its flaws. Think back to last year, when reports surfaced of a few battery fires in the Tesla Model S. People were alarmed, and bafflingly so, given that gasoline fires take place every day. Yet this fear carried a material cost: Tesla ultimately spent millions of dollars to retrofit its cars with titanium shields and defuse the controversy.
Google will find itself in the same position someday. If its cars drive enough miles, a tragic, one-in-a-million event will occur. So now, as it prepares to run pilot programs on the public roads of its hometown, Mountain View, Calif., Google must think beyond engineering -- about culture, psychology and marketing.
For a strange new technology seen as somehow intimidating, the cure is familiarity. To succeed, self-driving cars must be not just technically better, but also welcomed, which is why Google gave its prototype a rounded, friendly look. It's also why Google tests its cars in Mountain View. That's where familiarity will form fastest. This affinity must be so strong that it cannot be broken when something goes wrong.
Volvo Will Accept Liability
If a driverless car crashes into another driverless car, who is to blame? The carmaker that made the faulty depth sensor? The human asleep in the back seat? Or the artificial intelligence that was, for a brief second, unintelligent?
Volvo appears to have answered these questions. The Swedish carmaker has said that it will accept full responsibility for a crash involving one of its driverless vehicles as long as the accident resulted from a flaw in Volvo’s design and not from human meddling.
Hakan Samuelsson, president of Volvo, made the pledge yesterday at a conference in Washington. Erik Coelingh, Volvo’s senior technical leader for safety, said: “There will be fewer crashes with autonomous cars but there will be crashes. In these rare events, people need to know what will happen.”
Volvo is aiming to provide 100 customers in Gothenburg, Sweden, with such cars by 2017. These people would be covered by Volvo’s liability promise.
There is no consensus among makers of driverless cars about who should bear responsibility for accidents involving their vehicles. Experts argue that fault could lie with a driverless car’s hardware or with its software. Volvo’s pledge is therefore likely to calm the nerves of passengers, insurance companies and road regulators.
There are likely to be limits to the carmaker’s pledge. An owner who fails to update their vehicle’s software might be deemed partly responsible for a crash. If an owner added a huge exhaust pipe that fell off and hit another vehicle, the car would probably not be to blame.
Tesla, the American manufacturer, appears to be leaving liability with the driver. According to The Wall Street Journal, its technology would let a driver overtake a car simply by flicking their indicator. However, with this flick the driver is also taking responsibility for the safety of the manoeuvre.
Changing Road Laws
(London Times 18 Jan 2015)
MINISTERS have decided to allow driverless cars to share Britain’s roads, but the Highway Code will have to be rewritten to help vehicles on autopilot cope with the UK’s unpredictable traffic.
The biggest concerns involve how control is handed from man to machine. Graham Parkhurst, head of an academic research programme on transport in Bristol, one of four official pilot projects, said: “It is like the laws in the infancy of motoring when a man had to walk in front of a motor vehicle waving a red flag.”
Under a review conducted by the Department for Transport (DfT), cars will be allowed to operate without human intervention on the road network, but drivers must be able and ready to take control.
It means they will no longer be required to keep both hands on the wheel but will have to wear a seatbelt and will face penalties for speeding or weaving across the road.
As part of his research Parkhurst will examine the length of time that people can remain alert if they are sitting in the driver’s seat with nothing to do other than to react in a crisis.
In the long term, drivers will be able to hand over full responsibility to the vehicle’s computerised controls, giving mobility to non-drivers, including elderly and disabled people.
“Automated vehicles that never get tired or distracted also hold the key to improving road safety substantially,” a government source said.
An official trial that will start in Bristol in April will try to deal with the short-term problem of allowing robots to share the road with 35m conventional vehicles.
Because autonomous vehicles are programmed to brake when they detect a human in their way, there are concerns that they may be too “timid” to nudge their way through busy urban streets when pedestrians are walking among near-stationary traffic.
Cyclists will also cause problems, because a robot car would follow the letter of the Highway Code, crawling behind them as it waits for a gap as wide as the one it would leave when overtaking a car. Robot cars may also become marooned at roundabouts, waiting for a safe gap and unable to recognise a wave from a driver letting them into the traffic.
Parkhurst, who is part of the Venturer consortium in Bristol, which also includes the insurance giant Axa, said: “The debate needs to be had whether driverless cars can drive to different standards. It may be that the requirements of the Highway Code can be relaxed, for example, because they can pass with precision closer to another vehicle.”
Rules on tailgating may also need to be changed for driverless cars to fulfil their potential to cut fuel bills and reduce overcrowding on the roads.
The revolution will start in Milton Keynes in late autumn when a pod becomes the first fully autonomous vehicle to operate in a public space. It will weave at a modest 10mph on pavements and in pedestrianised areas between the railway station and a shopping centre.
The Bristol trial will see a highly automated Bowler Wildcat, based on a Land Rover Defender, tested on public roads at the campus of the University of the West of England.
A DfT consultation document says road users could struggle to respond when they encounter a car where the person in charge is not obviously “driving”.
It suggests that a car on autopilot should display a warning signal, either a sticker or a different numberplate.
Under plans to be announced next month, Britain will seek to become an international test centre for a new generation of robot cars. It will compete with California, which is hosting trials by Google, and Sweden, which permits Volvo to run tests in Gothenburg.
A review of traffic laws has concluded that there are “no barriers” and “huge safety benefits” to testing the new technology.
Patrick McLoughlin, the transport secretary, speaking before the review was published, said: “We need to grab this opportunity to place the UK at the forefront of this development.” He has been working with the Department for Business, Innovation and Skills. A government source said: “We are setting out the best possible framework to support the testing of entirely automated vehicles and providing the legal clarity to encourage the largest global businesses to come to the UK to develop and test new models.”
According to the source, a new regime of laws and regulations will be introduced before the first driverless cars go on sale to the public. New rules governing insurance liability, tax and the MoT test, along with a revamped Highway Code and driving licence regime, are expected to be agreed by the summer of 2017.
Lawyers and the insurance industry must now work out who is liable if a car controlled by a computer rather than a human being collides with another road-user.
The legal problems are unlikely to stop there. Once cars operate on autopilot, the charge of dangerous driving might, for example, have to give way to an offence of operating a car without a compulsory software upgrade.
Google Cars Programmed to Break Speed Limits
(London Times 22 Aug 2014)
Google has programmed its driverless cars to break speed limits by up to 10mph because it believes that allowing them to do so will improve road safety.
Dmitri Dolgov, the lead software engineer on Google’s driverless cars project, said research had shown that keeping to a speed limit when nearby cars were going faster was more dangerous than speeding up.
Google is testing its cars on the streets of Mountain View, the Silicon Valley town that is home to Google’s headquarters. The cars have not yet been tested in the UK, but Vince Cable, the business secretary, announced last month that companies will be able to test driverless cars in certain cities from the start of next year.
The Highway Code states that vehicles cannot travel faster than the national speed limit in any circumstance. The government has promised to review road rules in advance of the introduction of driverless car testing.
Some research has suggested that a car moving slowly amid faster-moving traffic is likely to cause other vehicles to bunch up behind it, which could lead to an accident. “Thousands and thousands of people are killed in car accidents every year,” Mr Dolgov said. Allowing driverless cars to speed “could change that”.
J Christian Gerdes, faculty director of the REVS Institute for Automotive Research at Stanford University, said that recognising unusual objects and reacting in abnormal situations were significant hurdles the Google car had yet to overcome.
There were also ethical issues with driverless cars, he said. “Should a car try to protect its occupants at the expense of hitting pedestrians? And will we accept it when machines make mistakes, even if they make far fewer mistakes than humans?”
There are also unresolved issues around legal liability when a driverless car is involved in a crash.
Google’s driverless car project, which began in 2009, is being run by its Google X experimental technology division. The same unit developed Google Glass, the “smart” eyewear that was released earlier this year.
* Britain is a nation of middle-lane hoggers, even though motorists face a fine of £100 for breaking the law. A study by ICM Research found that almost six drivers in ten say they hog the middle lane of the motorway, and almost one in ten admit that they always or regularly do so.