Karl's PC Help Forums

In memory of Karl Davis, founder of this board, who made his final journey 12th June 2007


Topic Review

posted on 13-7-2016 at 18:18
It doesn't appeal to me at all. They'd have done better to build monorails or trams to everywhere and just get rid of cars if they wanted to automate everything.

posted on 13-7-2016 at 17:56
Originally posted by LSemmens
The problem with most new technologies is not so much the technology but the idiot operating it.

Let me take this discussion of these "accidents" and refocus it on what to do about this tech in the hands of idiots.

We know from the news article re the accident in Montana that the Tesla already has in its software the ability to detect when the driver does not have at least one hand on the wheel.

In the area of hypothetical/social policy scenarios, let me suppose that the Tesla was programmed to give a verbal warning to the driver: if they continue to fail to keep at least one hand on the wheel, the car will AUTOMATICALLY (on its own) pull over to the side of the road, come to a complete stop, and remain completely inoperable for an hour (with the exception of the environmental controls/radio/etc.).

At the end of the hour, the car will announce that it is ready to drive, but IF a 2nd offense occurs, it will result in a complete shutdown for a period of 24 hours, and the program will offer to call a taxi (and a car left on the side of the road could well end up towed away, if a patrol officer spots it).

A 3rd offense would result in a shutdown that can only be reactivated by a Tesla dealer, and the driver/owner would have to undergo a "Chinese style" re-education program/training (at the dealership) before the dealer will reactivate the car.

And offenses number 2 and 3 would result in the driver's name, etc., being posted to a social media site.
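Just to make the proposal concrete, the escalation scheme above can be sketched as a small state machine. This is purely hypothetical code illustrating the post's proposed policy, not anything Tesla actually implements; the class and method names are invented for illustration.

```python
class HandsOffPolicy:
    """Hypothetical escalating-penalty scheme for ignoring hands-on-wheel
    warnings, as proposed in the post above: 1-hour lockout, then a
    24-hour shutdown, then a dealer-only reactivation."""

    def __init__(self):
        self.offenses = 0  # count of ignored hands-on-wheel warnings

    def record_hands_off_violation(self):
        """Driver ignored the verbal warning; pull over and escalate."""
        self.offenses += 1
        if self.offenses == 1:
            return "pull over; car inoperable for 1 hour (climate/radio still on)"
        if self.offenses == 2:
            return "pull over; 24-hour shutdown; offer to call a taxi; post name to social media"
        return "full shutdown; reactivation only by a dealer after retraining"


policy = HandsOffPolicy()
for _ in range(3):
    print(policy.record_hands_off_violation())
```

Each call escalates the response one step, matching the three offense tiers described above.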

This reply is not intended on my part as a funny-ha-ha response, but to prompt the posters on this site to consider just how far our various societies should go to deal with idiots whose very existence threatens our well-being, including responses that have heretofore been unthinkable re the law and social policy/norms of behavior.

posted on 13-7-2016 at 15:30
The problem with most new technologies is not so much the technology but the idiot operating it. I've not spent any real time researching all the incidents with the Tesla, but it wouldn't surprise me if the problem was not so much the technology as the expectations of the operator.

posted on 12-7-2016 at 19:24
From CNN (07/12)

Another Tesla Model X crashed while in Autopilot mode over the weekend (prior to 07/12), this time in Montana (USA), the third serious accident apparently tied to the self-driving feature.

The crashes are calling the safety of such automatic driving features into question, just as they're being incorporated into more and more cars on the road.

At least two federal safety agencies are looking into the most serious Tesla Autopilot crash, a fatal accident in Florida that took place in May.

The driver in Montana was headed from Seattle to Yellowstone National Park when he crashed on a two-lane highway near Cardwell, at 12:30 a.m. Saturday, said Montana State Trooper Jade Shope. Neither the driver nor his passenger were injured in the accident, but it was serious enough that the car lost its front passenger side wheel.

"It's a winding road going through a canyon, with no shoulder," Shope told CNNMoney. The driver told Shope the car was in Autopilot mode, traveling between 55 and 60 mph when it veered to the right and hit a series of wooden stakes on the side of the road. Tesla confirmed that the data it has from the car shows it was in Autopilot mode, and that the driver likely did not have his hands on the wheel.

"No force was detected on the steering wheel for over two minutes after autosteer was engaged," said Tesla, which added that it can detect even a very small amount of force, such as one hand resting on the wheel.

"As road conditions became increasingly uncertain, the vehicle again alerted the driver to put his hands on the wheel," said Tesla. "He did not do so and shortly thereafter the vehicle collided with a post on the edge of the roadway." Tesla said Autopilot is best used on highways with a center divider or while in slow-moving traffic.

"We specifically advise against its use at high speeds on undivided roads," it said. Tesla states clearly in its owner's manual that drivers should stay alert and keep their hands on the wheel to avoid accidents when the Autopilot feature is engaged.

The driver received a traffic citation for careless driving following the accident.

IMO anyone who owns a car such as this should have to pass a very simple test [before they can purchase such a car]: demo that they can walk and chew gum at the same time. It wouldn't hurt to have a built-in DUI-type breathalyzer that the driver has to pass before the car will start. The driver must know before they start that Tesla is the 2016 equivalent of Big Brother re the car's corporate feedback devices. It would be interesting to see what Tesla's corporate policy would be if this driver wants to purchase another Tesla car!

posted on 12-7-2016 at 13:42
FYI: Update On Self-Driving Cars & NY Times Editorial On Them

In Monday's (07/11) NY Times there was a rather lengthy editorial on self-driving/assisted driving/driverless cars titled "Lessons From The Tesla Crash" [apparently there are engineering/legalese distinctions re the choice of words for the tech].

While the editorial's title singled out Tesla, it's clear that the Times fired a shot across the bow of all the auto companies involved in the tech, including Google's program; i.e., it seems likely that Google is perched on the edge of adding consumer vehicles to its stable of products.

In the editorial the Times cited the YouTube video where the driver climbed into the back seat. This editorial is the 7th or 8th article on this tech in the last 2 weeks (in the Times alone). In another part of the paper, it carried a story of the USA NTSB (National Transportation Safety Board) launching a separate [its own in-house] investigation of the Tesla Florida accident. The NTSB historically investigates aircraft/ship/truck/bus/train accidents, so its decision to investigate the Florida accident represents a significant departure from its past practices (although the Florida accident did involve a tractor trailer truck).

At this point I wouldn't go so far as to say that the Times is launching, or is about to launch, a crusade re the tech, but it certainly is laying the groundwork for one. In the current world of journalism and news cycles, the fact alone that multiple mainstream media outlets are continuing to do follow-up stories is of itself significant.

One such article that caught my eye advocated that drivers (owners) of such vehicles undergo training, including the development of a special motor vehicle license category, comparable, as an example, to the special license needed to drive a truck over a certain gross weight.

The "industry" is of course not going to sit idly by re push back re tech that will further add to the costs of a vehicle and erode a potential market. Inherent in the likely manufacturer response is to avoid a consumer/legislative pre-conceived perception that they are pushing tech that has too many flaws, not to mention that their vehicles are being bought by a subset of irresponsible idiots (just like what has ALREADY happened to consumer drones).


There are 2 significant paragraphs in the editorial that I will quote:

"It’s not surprising that technology that helps drivers can lull them into thinking they need not pay attention at all. Chris Urmson, who heads Google’s driverless car project, said in a TED talk last year that when his company tested a driver assistance system some drivers became so dangerously distracted that Google pulled back on that concept. It has decided to focus its efforts on fully self-driving cars instead."

"Federal officials could take lessons from the history of airbags and the lack of strong regulations. After airbags were installed widely in cars, it became clear that they could be deadly to women and children because the bags inflated with too much force. That prompted the traffic safety agency to set new rules that reduced the force of the bags, changed how the devices were tested and required that they not inflate in low-speed crashes."

IMO there is not much cause for optimism that the vehicle manufacturers will get it "right" out of the box, nor that the govt regs, and the enforcement of same, will be any different from what they have always been.