|| posted on 14-7-2016 at 05:16
|I found a slush box to be much easier to drive in heavy traffic. Though I do prefer "stick".
|| posted on 13-7-2016 at 23:32
|This is quite an interesting investigation. The motor trade as is has a vested interest in keeping the status quo. Every so often something comes
along that threatens to take away their market share. Tesla has long been championed by geeky types who actually want the comfort of their own vehicle
without having to drive them themselves. I found America's long grid-like streets and automatic gears soporific and went back to a stick shift to
stay alert. I suspect taking your hands off the wheel would be such an easy thing to do when the car is, apparently, in charge.
|| posted on 13-7-2016 at 23:03
|FYI #2 Update: Tesla "Versus" The NY Times: Another Lengthy Article
This article appeared on 07/16: "As U.S. Investigates Fatal Tesla Crash, Company Defends Autopilot System"
I've lost track of the number of articles in the Times re Tesla since the first fatality (05/07), but it's pushing about 10, and they are ALL
lengthy. At some point a "saga" becomes a "crusade", and IMO that's what hard-copy printed newspapers have always excelled at.
Rather than c & p the entire article, I've cherry picked what IMO are the most relevant paragraphs, and on occasion commented on same:
The National Highway Traffic Safety Administration (a different agency from the NTSB) on Tuesday released a detailed set of questions for the carmaker
about its automated driving system, particularly the emergency braking function.
Despite that acknowledgment by the company, as the federal agency pushes for answers about the accident and whether the Autopilot system failed to
work properly, Tesla officials continue to say that the technology is safe. They also say they have no plans to disable the feature, which is
installed in about 70,000 Tesla cars on the road (did NOT specify if that number is USA only). Instead, they indicate that drivers may be to blame for the crash.
The executive, whom Tesla authorized to speak only if he was not named, said drivers needed to be aware of road conditions and be able to take control
of the car at a moment’s notice — even though he said Autopilot’s self-steering and speed controls could operate for up to three minutes without
any driver involvement.
“With any driver assistance system, a lack of customer education is a very real risk,” the executive said.
But the Tesla executive said the Autopilot system had performed safely during tens of millions of miles of driving by consumers. “It’s not like we
are starting to test this using our customers as guinea pigs,” he said.
Elon Musk, Tesla’s chief executive, said in a Wall Street Journal interview published on Tuesday that the company planned a blog post to educate
Tesla owners on how to use the system safely. “A lot of people don’t understand what it is and how you turn it on,” he said. (me here: so they
sold these cars knowing that some of the new owners didn't fully understand the tech's limits as they drove away from the dealership???)
The questions raised by the N.H.T.S.A., in a nine-page letter that was dated July 8 but not made public until Tuesday, indicated the agency was
investigating whether there are defects in the various crash-prevention systems related to Autopilot.
The attention generated by the fatal Tesla accident has stoked a public debate about the safety and wisdom of making “self-driving cars.” But in
reality — YouTube videos of daredevil Tesla drivers notwithstanding — the current generation of technology involves driver-assistance features
mainly meant to make vehicles safer and help avoid accidents.
...said David Teater, a prominent traffic safety advocate. The problem, Mr. Teater said, is that the technology sends a mixed message to drivers,
suggesting they can disengage but also requiring them to take over at a moment’s notice.
“Would you want your kid jumping into a Tesla and turning on driving assist after what’s just happened?” Mr. Teater asked. Right now the
technology, he said, “is more about marketing than safety.”
While the systems enable drivers to briefly take their hands off the wheel, they are mainly designed to help drivers avoid accidents.
But the issues raised by the Tesla fatality, and the questions that the National Highway Traffic Safety Administration is asking, involve whether
Autopilot’s potential safety advantages offset the risks that such technology lulls drivers into a false sense of security.
Tesla remotely collects vast volumes of information about its cars as they are being driven. The federal safety agency apparently wants the company to
thoroughly mine that data.
The agency’s letter asks Tesla to provide a wide range of information, including a list of all vehicles sold in the United States that are equipped
with the Autopilot system. It also asks how many miles have been driven with Autosteer activated, and how many times the automatic system warned
drivers to put their hands back on the steering wheel.
The N.H.T.S.A. also wants to know, among other things, the number of incidents in which Tesla vehicles’ automatic emergency braking was activated.
The letter asks Tesla to turn over any information on consumer complaints or reports of crashes, or other incidents in which a vehicle’s
accident-prevention systems may not have worked properly.