Tesla Automatic Driving Under Scrutiny by US Regulators

The US National Highway Traffic Safety Administration (NHTSA) has opened a formal investigation into Tesla's automated driving features (PDF), citing 11 accidents of concern. In particular, the agency is looking at how the features Tesla calls "Autopilot" and "traffic-aware cruise control" behave when approaching stopped first-responder vehicles like fire trucks or ambulances. According to the statement from NHTSA, most of the incidents occurred at night and also involved warning measures such as cones, flashing lights, or an arrow sign that, you would presume, would have made a human driver cautious.

Quote from Tesla's support page: "The currently enabled Autopilot and Full Self-Driving features require active driver supervision and do not make the vehicle autonomous." There are no details about the severity of those accidents. In the events being studied, the NHTSA reports that vehicles using traffic-aware cruise control "encountered first responder scenes and subsequently struck one or more vehicles involved with those scenes."

Despite how the features have been marketed, Tesla will tell you that none of its vehicles are truly self-driving and that the driver must maintain control. That's assuming a lot, even if you ignore the fact that some Tesla owners have gone to great lengths to bypass the need to have a driver in control. Tesla has promised full driving automation and is testing that capability, but as of this writing the company still says active driver supervision is necessary when using the existing "Full Self-Driving" features.

We’ve talked a lot about self-driving car safety in the past. We’ve also covered some of the more public accidents we’ve heard about. What do you think? Are self-driving cars as close to reality as they’d like you to believe? Let us know what you think in the comments.
