Dept. Of Transportation Warns: “Tesla’s Autopilot Requires The Continual And Full Attention Of The Driver”

Liberty Fight

The National Highway Traffic Safety Administration (NHTSA), under the U.S. Department of Transportation, has issued its official report on the crash of a Tesla operating on its so-called “Autopilot” system, which collided with a big rig last summer.

While the corporate media has tried to blame the crash solely on driver error, the very first line of the actual NHTSA report admits that the vehicle may not have functioned as designed:

“Problem Description: The Automatic Emergency Braking (AEB) or Autopilot systems may not function as designed, increasing the risk of a crash.”

The report continues with a summary of the fatal wreck:

On May 7, 2016, a 2015 Tesla Model S collided with a tractor trailer crossing an uncontrolled intersection on a highway west of Williston, Florida, resulting in fatal injuries to the Tesla driver. Data obtained from the Model S indicated that: 1) the Tesla was being operated in Autopilot mode at the time of the collision; 2) the Automatic Emergency Braking (AEB) system did not provide any warning or automated braking for the collision event; and 3) the driver took no braking, steering or other actions to avoid the collision. On June 28, 2016, NHTSA opened PE16-007 to “examine the design and performance of any automated driving systems in use at the time of the crash.”

The NHTSA continually emphasizes throughout the report that “Advanced Driver Assistance Systems, such as Tesla’s Autopilot, require the continual and full attention of the driver to monitor the traffic environment and be prepared to take action to avoid crashes.”

They conclude noting “The closing of this investigation does not constitute a finding by NHTSA that no safety-related defect exists.”

Below are some more excerpts of interest from the full DOT report; the particular areas of interest are highlighted in bold text.

But first, some pictures of the actual wreck:

“The Florida fatal crash appears to have involved a period of extended distraction (at least 7 seconds),” the report states.

Also be sure to see:

A Cautionary Tale: Safe driving tips for motorists around big rigs

Note: I love quoting the actual NHTSA report because for years one of my pet peeves has been the absolutely false, hysterical, paranoid & inane claims that “DRIVERLESS TRUCKS ARE OUR FUTURE! EVERY DRIVER WILL BE OUT OF WORK!”

It will NEVER happen; they can’t even design a car correctly without killing people, and even if the technology improves, humanity will not go for it. Furthermore, I follow this stuff: when Daimler’s allegedly “self-driving” big rig was touted in the sensationalist click-bait media at a truck show in Las Vegas a few years ago, the manufacturer specifically stated that it was NOT a “self-driving” vehicle, and that the enhanced features were only meant to aid the human driver, never to replace him.

Also note that the DOT report states “NHTSA recognizes that other jurisdictions have raised concerns about Tesla’s use of the name “Autopilot.” This issue is outside the scope of this investigation.”

Here are some DOT report excerpts on the fatal Tesla crash:

Advanced Driver Assistance Systems, such as Tesla’s Autopilot, require the continual and full attention of the driver to monitor the traffic environment and be prepared to take action to avoid crashes. Automated Emergency Braking systems have been developed to aid in avoiding or mitigating rear-end collisions.

The systems have limitations and may not always detect threats or provide warnings or automatic braking early enough to avoid collisions. Although perhaps not as specific as it could be, Tesla has provided information about system limitations in the owner’s manuals, user interface and associated warnings/alerts, as well as a driver monitoring system that is intended to aid the driver in remaining engaged in the driving task at all times. Drivers should read all instructions and warnings provided in owner’s manuals for ADAS technologies and be aware of system limitations.23 While ADAS technologies are continually improving in performance in larger percentages of crash types, a driver should never wait for automatic braking to occur when a collision threat is perceived.

NHTSA’s examination did not identify any defects in design or performance of the AEB or Autopilot systems of the subject vehicles nor any incidents in which the systems did not perform as designed.

The Autopilot system is an Advanced Driver Assistance System (ADAS) that requires the continual and full attention of the driver to monitor the traffic environment and be prepared to take action to avoid crashes. Tesla’s design included a hands-on the steering wheel system for monitoring driver engagement. That system has been updated to further reinforce the need for driver engagement through a “strike out” strategy. Drivers that do not respond to visual cues in the driver monitoring system alerts may “strike out” and lose Autopilot function for the remainder of the drive cycle.
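The “strike out” monitoring strategy described above can be sketched as a simple state machine. This is purely an illustration of the report’s description, not Tesla’s implementation; the three-strike threshold, class, and method names are assumptions made for the example.

```python
# A minimal sketch (hypothetical threshold and names) of the "strike out"
# driver monitoring strategy: a driver who repeatedly ignores hands-on-wheel
# alerts loses Autopilot for the remainder of the drive cycle.

class DriverMonitor:
    def __init__(self, max_strikes=3):        # threshold is an assumption
        self.strikes = 0
        self.max_strikes = max_strikes
        self.autopilot_locked_out = False

    def alert_ignored(self):
        """Record a visual alert the driver failed to respond to."""
        self.strikes += 1
        if self.strikes >= self.max_strikes:
            # Locked out until the next drive cycle begins.
            self.autopilot_locked_out = True

    def hands_detected(self):
        """Driver responded; reset the strike count (a lockout persists)."""
        self.strikes = 0

    def new_drive_cycle(self):
        """Ignition cycle: Autopilot availability is restored."""
        self.strikes = 0
        self.autopilot_locked_out = False
```

The key design point the report emphasizes is that the lockout lasts for the whole drive cycle, which is why only `new_drive_cycle` clears the lockout flag.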

ODI’s analysis of Tesla’s AEB system finds that 1) the system is designed to avoid or mitigate rear-end collisions; 2) the system’s capabilities are in line with industry state of the art for AEB performance through MY 2016; and 3) braking for crossing path collisions, such as that present in the Florida fatal crash, is outside the expected performance capabilities of the system.

3.1 Traffic-Aware Cruise Control (TACC). The Tesla TACC system uses information from the forward-looking camera and radar sensor to determine if there is a vehicle in front of the Tesla in the same lane. If there is no vehicle in front of the Tesla, TACC maintains a set driving speed selected by the driver. When there is a lead vehicle detected that is travelling slower than the Tesla’s set speed, the TACC will control motor torques to maintain a selected time-based distance from the lead vehicle. The Tesla Model S owner’s manual states that TACC “is primarily intended for driving on dry, straight roads, such as highways and freeways. It should not be used on city streets.” The manual includes several additional warnings related to system limitations, use near pedestrians and cyclists, and use on winding roads with sharp curves or with slippery surfaces or poor weather conditions. The system does not prevent operation on any road types.

3.2 Autosteer. The Tesla Autosteer system uses information from the forward-looking camera, the radar sensor, and the ultrasonic sensors to detect lane markings and the presence of vehicles and objects, providing automated lane-centering steering control based on the lane markings and the vehicle directly in front of the Tesla, if present.


8. Object classification algorithms in the Tesla and peer vehicles with AEB technologies are designed to avoid false-positive brake activations. The Florida crash involved a target image (side of a tractor trailer) that would not be a “true” target in the EyeQ3 vision system dataset and the tractor trailer was not moving in the same longitudinal direction as the Tesla, which is the vehicle kinematic scenario the radar system is designed to detect.

9. NHTSA recognizes that other jurisdictions have raised concerns about Tesla’s use of the name “Autopilot.” This issue is outside the scope of this investigation.

10. 2016 Tesla Model S Owner’s Manual

The Tesla owner’s manual contains the following warnings: 1) “Autosteer is intended for use only on highways and limited-access roads with a fully attentive driver. When using Autosteer, hold the steering wheel and be mindful of road conditions and surrounding traffic. Do not use Autosteer on city streets, in construction zones, or in areas where bicyclists or pedestrians may be present.

Never depend on Autosteer to determine an appropriate driving path. Always be prepared to take immediate action. Failure to follow these instructions could cause serious property damage, injury or death;” and 2) “Many unforeseen circumstances can impair the operation of Autosteer. Always keep this in mind and remember that as a result, Autosteer may not steer Model S appropriately. Always drive attentively and be prepared to take immediate action.” The system does not prevent operation on any road types.

4.2 System limitations. Tesla provides information about system limitations at multiple levels, including: 1) the owner’s manual; 2) in the release notes for new software releases, which refer to the owner’s manual; 3) a user agreement required before enabling Autosteer for the first time or after an ignition cycle that concluded with Autosteer being switched off; 4) a dialog box that appears every time Autosteer is activated reminding the driver to “Always keep your hands on the wheel” and “Be prepared to take over at any time” (Figure 4); 5) the information in the user interface, which appears at all times while driving – the blue shaded circle around the white steering wheel indicates Autosteer is in operation, as opposed to when the background is gray meaning Autosteer is available should the driver decide to enable it (Figure 5).

5.1 Autopilot crashes. ODI analyzed data from crashes of Tesla Model S and Model X vehicles involving airbag deployments that occurred while operating in, or within 15 seconds of transitioning from, Autopilot mode.14 Some crashes involved impacts from other vehicles striking the Tesla from various directions with little to no warning to the Tesla driver. Other crashes involved scenarios known to be outside of the state-of-technology for current-generation Level 1 or 2 systems, such as cut-ins, cut-outs and crossing path collisions.

Data logs, image files, and records related to the crashes were provided by Tesla in response to NHTSA subpoenas.

5.2 Driver behavior factors. Many of the crashes appear to involve driver behavior factors, including travelling too fast for conditions, mode confusion, and distraction. Most of these involve late steering and/or braking actions by the driver to avoid the collision, but a few do not show any actions prior to impact. Highway incidents, which accounted for a little over half of the crashes reviewed by ODI, involved cut-ins, cut-outs, and sudden changes in traffic flow. Some crashes occurred in environments that are not appropriate for semi-autonomous driving (e.g., city traffic, highway entrance/exit ramps, construction zones, in heavy rain, and road junctions/intersections). ODI’s analysis of incidents related to mode confusion did not identify a pattern of failures indicating a potential design defect. Incidents included apparent mode confusion during attempted Autopilot activations and mode confusion after inadvertent overrides. The incidents associated with each of these scenarios were isolated events that involved different sets of contributing factors. Recent changes implemented by Tesla have been made to further reduce the potential for mode confusion in the subject vehicles.16

The Florida fatal crash appears to have involved a period of extended distraction (at least 7 seconds). Most of the incidents reviewed by ODI involved events with much shorter time available for the system and driver to detect/observe and react to the pending collision (less than 3 seconds). An attentive driver has superior situational awareness in most of these types of events, particularly when coupled with the ability of an experienced driver to anticipate the actions of other drivers. Tesla has changed its driver monitoring strategy to promote driver attention to the driving environment.

5.3 Driver distraction. Figure 10 shows the distributions of off-road glances by duration that were observed in a research study by General Motors of driver behaviors in vehicles with Limited-Ability Autonomous Driving Systems (LAADS)17 when operated in SAE Level 1 and Level 2 modes.18 The data show distractions occur in each operating mode and that the majority occur for 3 seconds or less when driving in ACC mode or with ACC and Lane Centering Control used together. ODI’s analysis of field incidents found that most of the crashes developed in less than 3-4 seconds. Distractions greater than seven seconds, such as appears to have occurred in the fatal Florida crash, are uncommon but foreseeable.

15. Tesla’s Model S Owner’s Manual is not as specific as the examples cited here, opting instead to identify a handful of scenarios under which a vehicle may not be detected, followed by a broad warning: “The limitations described above do not represent an exhaustive list of situations that may interfere with proper operation of Collision Avoidance Assist features. These features may fail to provide their intended function for many other reasons. It is the driver’s responsibility to avoid collisions by staying alert and paying attention to the area beside Model S so you can anticipate the need to take corrective action as early as possible.”

16. This review included an assessment of the user interface in the subject vehicles, which has a larger display and symbols showing system status than peer vehicles with SAE Level 1 or Level 2 technologies reviewed by ODI.

17. LAADS are defined in the study as systems that “can control vehicle speed and steering on public roads for substantial distances and time” and “in some situations requires that the driver/operator intervene to assure a safe and comfortable trip,” with the latter element accounting for the “limited-ability.” The study showed that vehicles with an “ACC and perfect Lane Centering (PADS)” system may have slightly more frequent longer-duration off-road glances than vehicles with “ACC and imperfect Lane Centering (LAADS)” systems.

18. Salinger, J. Human Factors for Limited-Ability Autonomous Driving Systems. (2012). General Motors Research.

To probe the foreseeability issue further,19 the Agency issued a Special Order to Tesla to evaluate the types of driver misuse, including driver distraction, that were considered by the company and any safeguards that were incorporated into the Autopilot design. It appears that over the course of researching and developing Autopilot, Tesla considered the possibility that drivers could misuse the system in a variety of ways, including those identified above – i.e., through mode confusion, distracted driving, and use of the system outside preferred environments and conditions.

Included in the types of driver distraction that Tesla engineers considered are that a driver might fail to pay attention, fall asleep, or become incapacitated while using Autopilot. The potential for driver misuse was evaluated as part of Tesla’s design process and solutions were tested, validated, and incorporated into the wide release of the product. It appears that Tesla’s evaluation of driver misuse and its resulting actions addressed the unreasonable risk to safety that may be presented by such misuse.20

7.0 CONCLUSION

Advanced Driver Assistance Systems, such as Tesla’s Autopilot, require the continual and full attention of the driver to monitor the traffic environment and be prepared to take action to avoid crashes.

Automated Emergency Braking systems have been developed to aid in avoiding or mitigating rear-end collisions. The systems have limitations and may not always detect threats or provide warnings or automatic braking early enough to avoid collisions. Although perhaps not as specific as it could be, Tesla has provided information about system limitations in the owner’s manuals, user interface and associated warnings/alerts, as well as a driver monitoring system that is intended to aid the driver in remaining engaged in the driving task at all times. Drivers should read all instructions and warnings provided in owner’s manuals for ADAS technologies and be aware of system limitations.23 While ADAS technologies are continually improving in performance in larger percentages of crash types, a driver should never wait for automatic braking to occur when a collision threat is perceived.

NHTSA’s examination did not identify any defects in design or performance of the AEB or Autopilot systems of the subject vehicles nor any incidents in which the systems did not perform as designed. AEB systems used in the automotive industry through MY 2016 are rear-end collision avoidance technologies that are not designed to reliably perform in all crash modes, including crossing path collisions. Tesla appears to have recognized HMI factors, such as the potential for driver distraction, in its design process for the Autopilot system. Tesla’s design included a hands-on the steering wheel system for monitoring driver engagement. That system has been updated to further reinforce the need for driver engagement through a “strike out” strategy. Drivers that do not respond to visual cues in the driver monitoring system alerts may “strike out” and lose Autopilot function for the remainder of the drive cycle.

A safety-related defect trend has not been identified at this time and further examination of this issue does not appear to be warranted. Accordingly, this investigation is closed. The closing of this investigation does not constitute a finding by NHTSA that no safety-related defect exists. The agency will monitor the issue and reserves the right to take future action if warranted by the circumstances.

You can download the full 13-page report here. You can also visit https://www.nhtsa.gov.

– See more at: http://libertyfight.com/2017/Feb/NHTSA-tesla-crash-autopilot-not-function-as-designed.html

2 thoughts on “Dept. Of Transportation Warns: ‘Tesla’s Autopilot Requires The Continual And Full Attention Of The Driver’”

  1. Even if they get the “autopilot” for driverless vehicles working with close to flawless reliability, most likely it will still be necessary to hire people to sit in them and supervise them during their travel. Someone will need to be on hand if there’s a flat tire or other mechanical failure, or if an accident occurs due to another driver’s fault. Guarding against vandalism will also likely be a necessity.

    Whatever. I refuse to have this technology shoved down my throat. Its purpose is probably to help track our movements (GPS) and to visually record activities conducted near the car (cameras). These cars can also be hacked and used to kill occupants, as probably happened to Michael Hastings. Besides, I enjoy driving and would find it very annoying to be “chauffeured” by a computer that won’t exceed the speed limit and that will probably take longer than I do to park and maneuver through tight situations.
