1 second before the accident, Autopilot automatically exits


An investigation into Tesla has caused an uproar on social media. Some netizens even described its findings as "epic".

\"Tesla

 

\"Tesla

This investigation comes from the U.S. National Highway Traffic Safety Administration (NHTSA). In the released documents, one sentence stands out: Autopilot aborted control of the vehicle less than one second before the first impact.

 

\"Tesla

This NHTSA finding hit the public like a bombshell, because it amounts to "the robot handing the steering wheel back to you just before the car hits"…

\"Tesla

 

Many netizens then began to ask: with a design like this, can Tesla deny that the accidents were Autopilot's fault?

Ah, this…

Handing the steering wheel back less than a second before the accident

The
NHTSA investigation into
Tesla has been underway since last August.

The direct trigger for the investigation was that more than one Tesla, with the Autopilot driver-assistance system engaged, drove into an existing accident scene and hit an ambulance, a police car, or a crashed vehicle already parked on the roadside.

 

\"Tesla

They didn't know until they looked, and what they found was shocking.

They found that in 16 of these crashes, most of the vehicles' Forward Collision Warning (FCW) had activated before the collision, and in about half of them the Automatic Emergency Braking (AEB) then actively intervened. In the end, though, none of it prevented the crashes (one person, unfortunately, died).

Worse, on average across all 16 crashes, Autopilot aborted control less than one second before the actual impact, leaving human drivers nowhere near enough time to take over.
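As a rough sanity check, compare that handoff window with a typical human perception-reaction time. The reaction-time figure below is our own assumption (commonly cited estimates run around 1.5 to 2.5 seconds), not a number from NHTSA's documents:

```python
# Back-of-the-envelope takeover check. REACTION_TIME_S is an assumed
# perception-reaction time, not a figure from the NHTSA report.
HANDOFF_BEFORE_IMPACT_S = 1.0  # finding: Autopilot aborted < 1 s before impact
REACTION_TIME_S = 1.5          # assumed low-end human perception-reaction time

# The driver cannot even begin to react before impact, let alone brake or steer.
print("takeover feasible:", HANDOFF_BEFORE_IMPACT_S > REACTION_TIME_S)  # False
```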

Crash videos showed that the existing accident scene would generally have been visible to the human driver about 8 seconds before the collision.

Yet recorded data from 11 of the crashes showed that no driver took evasive action in the 2 to 5 seconds before impact, even though they all kept their hands on the steering wheel, as Autopilot requires.

Perhaps most drivers still "believed" in Autopilot at the time: in nine of the crashes, the driver did not respond to the system's visual or audible alerts in the final minute before the collision.

In four of the cars, though, no alerts were issued at all.

To better understand the safety of Autopilot and related systems (and the extent to which they can undermine human drivers' supervision and exacerbate risk), NHTSA decided to escalate this preliminary investigation into an engineering analysis (EA).

It also expanded the scope to all four Tesla models: the Model S, Model X, Model 3, and Model Y, about 830,000 vehicles in total.

 

\"Tesla

As soon as the results of this investigation came to light, victims began to speak out.

One user said that as early as 2016, her Model S crashed into a parked car while changing lanes. Tesla told her the accident was her fault because she had hit the brakes before the crash, causing Autopilot to cut out. But the system only raised its alarm at that very moment, the woman said, so by that logic she was wrong either way: brake, and she disabled Autopilot; don't brake, and she ignored the warning.

Musk, however, did not respond to the matter directly. Six hours after netizens broke the news, he simply tweeted an NHTSA report on Tesla dating back to 2018.

 

\"Tesla

That report said Tesla's Model S (2014 production) and Model X (2015 production) had shown the lowest probability of injury in a crash of any vehicles in all of NHTSA's previous testing.

It also found that the newer Model 3 (2018 production) went on to displace the Model S and Model X in first place.

 

\"Tesla

Tesla also took the opportunity to run a wave of publicity.

This was a response to the "car destroyed, but no one injured" Tesla accident in Shanghai a few days earlier.

Interestingly, just as everyone was decrying how unreliable Tesla's Autopilot was, someone stood up and said everyone was unfairly "smearing" Tesla.

 

\"Tesla

In fact, by cutting Autopilot off one second before the accident, Tesla was not shifting the blame onto humans: when Tesla counts its accidents, any crash in which Autopilot was active within 5 seconds of the collision is attributed to Autopilot.

Later, Tesla officials also came
out to confirm this rule.
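A minimal sketch of that counting rule, assuming logged timestamps for the impact and for when Autopilot was last active (an illustration of the rule as described, not Tesla's actual code):

```python
AUTOPILOT_WINDOW_S = 5.0  # attribution window described above

def attributed_to_autopilot(impact_time_s: float, last_active_s: float) -> bool:
    """A crash counts against Autopilot if the system was active
    at any point within 5 seconds of the impact."""
    return (impact_time_s - last_active_s) <= AUTOPILOT_WINDOW_S

# Example: Autopilot disengages 1 second before impact -> still counted.
print(attributed_to_autopilot(impact_time_s=100.0, last_active_s=99.0))  # True
```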

 

\"Tesla

However, even though humans are spared the blame, the dubious move of cutting off automated driving just one second before an accident does not change the fact that the human driver has no time to take over the steering wheel.

Is Autopilot reliable?

With NHTSA's findings in mind, let's take another look at Autopilot, the focus of all this public attention.

Autopilot is Tesla's suite of advanced driver-assistance systems (ADAS). It sits at Level 2 (L2) on the driving-automation scale defined by SAE International, which divides driving automation into six levels, from L0 to L5.
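For reference, here is the six-level scale as a compact Python enum; the one-line glosses are our paraphrase of SAE J3016:

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving-automation levels, paraphrased."""
    L0 = 0  # no automation: the human does all the driving
    L1 = 1  # driver assistance: steering OR speed support, not both
    L2 = 2  # partial automation: steering AND speed, driver must supervise
    L3 = 3  # conditional automation: system drives, human takes over on request
    L4 = 4  # high automation: no driver needed within a defined domain
    L5 = 5  # full automation: no driver needed anywhere

AUTOPILOT_LEVEL = SAELevel.L2  # where Tesla Autopilot sits on the scale
```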

According to Tesla's official description, Autopilot's functions include assisted steering, accelerating and braking within the lane, automatic parking, and "summoning" the car from a garage or parking space.

So does this mean the driver can sit back and leave everything to the car? Not at all.

At its current stage, Tesla's Autopilot can only play an "assistance" role; it is not fully autonomous driving, and it requires the driver to actively and continuously supervise it.

\"Tesla

 

In its official introduction, however, Tesla also briefly describes a "full self-driving" capability: all new Teslas have the hardware that full self-driving will require in almost all situations, enabling both short and long trips without a driver.

 

\"Tesla

But for Autopilot to achieve these goals, it must be far superior to humans in terms of safety. Tesla says it has proven this over billions of miles of driving, and Musk has said more than once that Tesla's full self-driving is much safer than an ordinary driver.

\"Tesla

 

But is this the case?

Whatever Tesla and Musk claim, in practice the safety of Tesla's Autopilot has been controversial.

For example, the notorious "ghost braking" problem has pushed it to the forefront of public opinion again and again, and it is also one of the main reasons NHTSA launched this investigation.

"Ghost braking" means that with Tesla's Autopilot driver-assistance function engaged, the vehicle performs an unnecessary emergency brake even though there is no obstacle ahead and no risk of colliding with the car in front.
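In code terms, the condition described above might be flagged roughly like this; the signal names are hypothetical, and this is only a sketch of the definition, not how Tesla or NHTSA classify events:

```python
def is_ghost_brake(hard_brake: bool, obstacle_ahead: bool,
                   closing_on_lead_car: bool) -> bool:
    """Flag an emergency brake with no apparent cause, per the
    definition above. All inputs are hypothetical logged signals."""
    return hard_brake and not obstacle_ahead and not closing_on_lead_car

# Example: the car slams on the brakes on an empty road -> ghost braking.
print(is_ghost_brake(hard_brake=True, obstacle_ahead=False,
                     closing_on_lead_car=False))  # True
```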

This poses a huge safety hazard
to drivers and other vehicles on the road.

Beyond that, a closer look at Tesla's trending news makes it easy to see that many of the stories concern the safety of Autopilot.

So what does NHTSA think about
this?

In the document, NHTSA reminds readers that there are currently no fully autonomous vehicles on the market: "every vehicle requires the driver to be in control at all times, and all state laws hold the driver responsible for the operation of his vehicle."

As for how this investigation will affect Tesla going forward, NHTSA says that if safety-related defects are found, it has the authority to issue a "recall request" letter to the manufacturer.

Finally, a quick poll: do you trust Autopilot?
