|| posted on 5-9-2016 at 09:30
|I would like to think that we were beyond "pissing games" but, sadly, I don't think that will ever occur. It would be relatively easy to
incorporate technology into vehicles so that, based on their location on the road, they are limited to an appropriate speed. As with cruise control,
there would always be a means to override it, but the override should also come with an audible warning, like the ground proximity warnings in
aircraft. We already have lane departure warnings. Nor would it be too hard to incorporate a smart licence into a car, where your driver's licence is
required to start the car and the performance of the car is determined by the licence used to start it.
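The licence-keyed idea above could be sketched in a few lines. To be clear, the licence classes and speed caps here are invented purely for illustration; nothing like this exists in production cars:

```python
# Illustrative sketch of the "smart licence" idea: the car reads the
# licence used to start it and caps performance accordingly.
# All licence classes and limits below are hypothetical.

LICENCE_CAPS_KMH = {
    "learner": 80,
    "probationary": 100,
    "full": 130,
}

def max_speed_kmh(licence_class: str, posted_limit_kmh: int) -> int:
    """Cap speed at the lower of the licence cap and the posted limit."""
    cap = LICENCE_CAPS_KMH.get(licence_class, 0)  # unknown licence: car won't move
    return min(cap, posted_limit_kmh)
```

So a learner in a 110 km/h zone would be held to 80, while a full-licence holder would be held to the posted limit itself.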
|| posted on 4-9-2016 at 23:51
You're defining the crux of the problem; that is, just what the transition to self-driving could/should be (on a society-wide basis).
I have yet to come across anyone, at all, ever, speaking to that, even conceptually/theoretically.
And it's small wonder. It is very obvious that any transition "plan" would be engulfed in tsunami-level social and political controversy;
as one small example: there are a large number of drivers in our societies who would feel outraged at having their speed capped at the posted
speed limit by the self-driving tech.
Citizens would fabricate every single excuse/rationalization to defeat any such plan, i.e., every single one would be DOA.
The theoretical, and IMO strongest, selling point of self-driving tech is that it would greatly reduce traffic accidents/injuries/etc. That
matters not at all to a good many citizens. And that's really no surprise if you believe that a car is a real-world psychological extension of the
ego of the driver, and exists not just to get from Point A to Point B but, among other needs, as a personal resource to vent/etc.
|| posted on 4-9-2016 at 21:08
|Personally, I think that the technology has some merit; however, it would have to be a blanket thing where ALL cars were required to be self-drive and
roads developed to that end. Sort of like aircraft, which are largely "self-drive". Considering the miles aircraft do, it is one of the safest forms
of transport. The next mode that could benefit fairly easily from self-drive is rail travel. The problem with automobile self-drive isn't so much the
technology but the other factors outside the control of the manufacturers, i.e. road surfaces, the nut behind the wheel (and not necessarily in the
self-drive vehicle) and the nuts using other forms of transport, i.e. Shanks' pony, pushies, motorbikes, horses etc.
|| posted on 4-9-2016 at 18:50
Of course WE (the entire planet) do.
One more time: my saga with all these FYIs is that this tech, along with commercial drones, is just the latest chapter in Big Government selling
us down the proverbial drain.
That's part of the reason I commented that each country sets its own rules re auto requirements. So there's really nothing to prevent ANY sovereign
nation from stopping this tech in its tracks, i.e., doing so would prevent the local auto distributors from importing any cars with the tech.
It may take, and likely will, a groundswell of public outcry to do so, but getting a grassroots uprising going is no parlor game. It will take
getting one's hands 'dirty': (1) raising public awareness that it is happening right now, (2) getting the editorial boards of newspapers to publish
their position on the matter in their editorial pages, (3) the list goes on and on.
Maybe the govt will be proactive, but I bet it won't, and even when the casualties start accumulating, ALL govts tend to drag their feet, especially
when it comes to a total ban, because IMO there's no intermediate position. To me this issue is identical to texting while driving: some people do it
regardless of the law against it, and now it's way too late to close the barn door.
|| posted on 4-9-2016 at 12:51
|I think that there are a wealth of other issues with driver-assist technologies. How would it cope with black ice, or other unstable road surfaces?
Coming back from my son's place tonight, albeit on dirt roads, there were many times when I had to correct a skid on a tight bend. How would it cope
with suicidal animals that seem to think it's OK to kiss the front of the car whilst you are travelling down the road at a gazillion miles an
hour? Or don't you have those issues over there?
|| posted on 3-9-2016 at 14:55
|FYI #3 Re Self-Driving/Driverless Cars--Big Carmakers Merge, Cautiously, Into the Self-Driving Lane
The NY Times continues to follow the self-driving/driverless tech with major space devoted to the subject in its print/online edition. This article
was published on 09/02/16.
Since IMO this tech puts all of us in the category of 'expendable' re having to coexist with this very flawed tech, I have opted to post this entire
lengthy article. It is also my opinion that my govt is reactive (rather than proactive) to accidents, and any new regulations are subject to the
extremely powerful lobbying efforts of auto manufacturers, especially since they, and their suppliers, are a major source of jobs [even if a good many of these jobs are really outside the borders of the USA].
I should note that each country could have its own set of regs re these cars, i.e., what is "allowed" in one country may not be allowed in another
re the various features of the tech.
When General Motors starts selling a new Cadillac sedan next year, it plans to equip the car with G.M.’s answer to Tesla Motors’ Autopilot. [In the USA this is a "sedan"; it is likely called a "saloon" car in other countries].
Like Autopilot, G.M.’s SuperCruise feature will be designed to steer a car for long stretches of highway driving, pass other vehicles, brake for
traffic, speed up and change lanes — all with minimal effort by the driver.
There is one thing the G.M. system will force drivers to do, though, that Tesla’s system does not: keep their eyes on the road.
Unlike Tesla, G.M. has chosen to place a camera, near the rearview mirror, that monitors the driver’s actions.
Along with other mainstream automakers cautiously rolling out Autopilot-like capabilities, G.M. is unwilling to assume that human drivers will — or
even can — be trusted to remain safely engaged in the vehicle’s operation.
The big carmakers hope to avoid the criticism that has enveloped Tesla’s Autopilot — that the driver-assistance technology can lull the person
behind the wheel into a mind-wandering sense of false security.
“Through the driver’s eyes, you can detect his or her level of attention,” Mark Reuss, G.M.’s executive vice president for global product
development, said in July at an auto technology conference in Detroit.
The monitoring system, which analyzes images of the drivers’ eyes and head to tell if they are looking forward, will notice if drivers are drowsy,
looking down at their cellphones or have turned to reach into the rear seat.
If the driver does not turn back to the road after a few seconds, warning tones and lights go off. If the driver does not respond, SuperCruise can
slow or stop the car, Mr. Reuss said.
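As a rough illustration (not G.M.'s actual logic; the timing thresholds here are guesses, since the article only says "a few seconds"), the escalation described above amounts to a simple timer-driven policy:

```python
# Sketch of the driver-monitoring escalation described for SuperCruise:
# attentive -> warn -> slow/stop. Thresholds are hypothetical; the real
# system is proprietary and far more sophisticated.

WARN_AFTER_S = 3.0   # assumed: warn after a few seconds of inattention
STOP_AFTER_S = 8.0   # assumed: begin slowing if warnings are ignored

def monitor_action(seconds_inattentive: float) -> str:
    """Map continuous inattention time to a system response."""
    if seconds_inattentive < WARN_AFTER_S:
        return "normal"        # driver is looking at the road
    if seconds_inattentive < STOP_AFTER_S:
        return "warn"          # warning tones and lights go off
    return "slow_or_stop"      # system slows or stops the car
```

The point of the sketch is the ordering, not the numbers: the camera feeds a timer, and the car's response escalates only if the driver keeps ignoring it.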
Audi, the German luxury carmaker, plans to add similar driver-monitoring technology next year when an upgraded version of its driver-assistance
technology is offered in the 2018 A8 sedan.
“The car will see the driver’s condition and be able to say, ‘O.K., you’re paying attention and alert,’ and then it can be engaged,” said
Brad Stertz, Audi’s director of government affairs.
As an added safeguard, the G.M. and Audi systems will work only on divided highways whose curves and exits have been plotted on digital maps, which
enables cars to track their precise location on the road and on the three-dimensional surrounding terrain. The systems will also recognize objects
like overpasses and road signs. [Me here: so what about road work, accidents that block lanes of traffic, etc.? They won't be plotted on any auto-navigation gear]
Outside those digitally mapped areas — on winding country roads, say, or uncharted city streets — G.M.’s SuperCruise and Audi’s Traffic Jam
Pilot will not operate.
Such precautions are intended to prevent the kind of accident that has put Tesla and Autopilot under heavy scrutiny in the last few months.
In May, the driver of a Tesla Model S was killed when his car collided with a tractor-trailer crossing a state highway in Florida. Autopilot was
operating at the time and failed to recognize the white truck against a bright sky. Neither Autopilot nor the 40-year-old driver, Joshua Brown, hit the brakes.
The National Highway Traffic Safety Administration is investigating to see if flaws exist in Autopilot, but Tesla has said the accident was at least
partly a case of operator error.
Even as G.M., Audi and other carmakers seek to adopt some of the capabilities of Autopilot, many experts say the concept has a basic flaw — the
so-called handoff problem. These critics say too many experiments have shown that a person behind the wheel of a car that seems to be driving itself
simply cannot respond quickly enough to take over in the instant the unexpected happens and the technology is not equipped to handle it.
That is why at least one major automaker, Ford Motor, and Google have each said they would skip driver-assistance systems and focus on future, fully
self-driving cars that require no human intervention.
Despite their multitude of sensors and processors, autonomous cars have a lot of trouble with some everyday aspects of driving.
“It’s very difficult to have proper driver engagement,” said Raj Nair, Ford’s chief technical officer.
Tesla, in contrast, is intent on sticking with the Autopilot approach but improving it.
When Tesla introduced Autopilot last fall to much fanfare, the Silicon Valley company appeared to leap ahead of conventional automakers that have
decades more experience in developing automobiles.
The system and the way Tesla distributed it — by beaming software updates wirelessly to cars already on the road — fostered the impression that
Silicon Valley, with its charge-ahead mind-set, was reinventing the auto industry for the digital age, while leaving the plodding, old-line carmakers behind.
The contrast between Tesla’s approach and the more cautious pace taken by traditional automakers reflects a “clash of philosophies,” said Amin
Kashi, director of automated driving assist systems and automated driving at Mentor Graphics, which helps automakers design electronics systems.
Mr. Kashi has experience in both worlds. He worked at Ford for 14 years before moving to Silicon Valley, and he has been at Mentor since April.
Established automakers take a more conservative view of new technology and tend to have their own engineers refine and test it until it works as
intended. The companies also typically hold clinics where they watch customers try new technologies, to make sure they are easy to use and to discover
how some customers might misuse them.
“Sometimes, maybe, they are too careful,” Mr. Kashi said. “That’s something I didn’t like at Ford.”
Mr. Kashi said he was impressed by how quickly Tesla was able to speed ahead with Autopilot. Still, he said that while the “negative press” Tesla
has received over the Florida accident focused on “customer misuse” of Autopilot, it also shows the need to make sure customers understand the technology.
Tesla warns drivers that Autopilot is not meant to be a fully autonomous system. When Autopilot is activated, the dashboard screen and audio alerts
remind drivers to remain engaged and keep their hands on the steering wheel.
But many Tesla owners, including Mr. Brown, the driver killed in Florida, have posted videos on YouTube showing it is possible to go several minutes
without looking at the road or holding the wheel.
Mercedes-Benz has just introduced a semiautomated driver-assist system in the new 2017 E-Class sedan that is similar to Autopilot but forces drivers
to have their hands on the wheel more frequently. On an open highway, with few cars on the road, the system, Drive Pilot, might allow hands-free
driving for up to a minute, said Bart Herring, general manager of product management at Mercedes-Benz USA. But in traffic, it sets off alarms after
only a few seconds.
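The traffic-dependent tolerance described for Drive Pilot can be caricatured as a lookup on traffic density. Only the one-minute and few-seconds figures come from the article; the intermediate tier and the density counts are invented:

```python
# Hypothetical sketch of a context-dependent hands-off timer like the
# one described for Drive Pilot: generous on an open highway, almost
# nothing in traffic. Tiers and counts are illustrative guesses.

def hands_off_allowance_s(vehicles_nearby: int) -> float:
    """Return how long (in seconds) hands-off driving is tolerated."""
    if vehicles_nearby == 0:
        return 60.0   # open highway: up to a minute (per the article)
    if vehicles_nearby <= 3:
        return 15.0   # light traffic: assumed intermediate value
    return 5.0        # dense traffic: alarms after only a few seconds
```

The design point is that the system's trust in the driver shrinks as the cost of a missed handoff grows.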
“We love big leaps, but we don’t make those until they’re ready for prime time,” Mr. Herring said.
Audi has even further restrictions. The system it plans to release next year is meant to work only up to 35 miles per hour, to relieve the driver in
stop-and-go traffic. Not until the next version, expected by 2020 or 2021, will the Audi system operate at highway speeds.
“We feel it’s important to walk along with customers as they get used to the technology,” Mr. Stertz said.
Both the Audi system and G.M.’s SuperCruise also use radar sensors, cameras and lidar — a kind of radar based on lasers — to read the road and
identify vehicles and pedestrians.
Elon Musk, Tesla’s chief executive, has said the company does not think lidar is necessary for a semiautonomous driving system like Autopilot. On
Wednesday, he posted a Twitter message promising a software upgrade that would include improvements to Autopilot “primarily through advanced
processing of radar signals.”
Many experts say lidar is better at identifying objects than radar. But it is more costly and does not see as far in front of the car as radar, a
Tesla spokeswoman, Alexis Georgeson, said.
Other big automakers are following an even more cautious pace than G.M., Audi and Mercedes. Honda and Toyota offer systems with radar, cameras and
automatic braking that are intended to mitigate or prevent accidents. And a steering capability keeps their cars from drifting out of their lanes. But
Honda and Toyota stop short of letting their cars drive themselves.
“We have to be clear about the technology’s limitations,” said Jim Keller, a chief engineer at Honda R&D Americas. “Customers will find
situations where the system won’t work as they think it will, and there will be consequences.”
One final thought from me: I have a very strong suspicion that owners of cars that can drive themselves will increase their alcohol/drug use before
driving, figuring that, since erratic driving is a dead giveaway when spotted by a patrol officer, they can then "drive" undetected.