How to Improve the Performance & Safety of Level 2 Automation: Tesla Mountain View Examination

The National Transportation Safety Board’s widely anticipated preliminary report on the March 23 crash of a 2017 Tesla Model X in California is out.

The National Transportation Safety Board (NTSB) issued last Thursday its preliminary report for the investigation of the fatal, March 23, 2018, crash of a Tesla on U.S. Highway 101 in Mountain View, California.

Information contained in the report is preliminary and subject to change during the NTSB’s ongoing investigation. Preliminary reports, by their nature, do not contain analysis and do not discuss probable cause; as such, no conclusions about the cause of the crash should be drawn from the preliminary report.

According to performance data downloaded from the crash vehicle, a 2017 Tesla Model X P100D, the driver was using traffic-aware cruise control and autosteer lane-keeping assistance, which are advanced driver assistance features that Tesla refers to as Autopilot. The vehicle was approaching the state Highway 85 interchange, traveling south on U.S. Highway 101, in the second lane from the left — a high-occupancy-vehicle lane.

As the vehicle approached the paved “gore area” dividing the main travel lane of U.S. Highway 101 from the state Highway 85 exit ramp, it moved to the left and entered the gore area at approximately 71 mph, striking a previously damaged, SCI smart cushion crash attenuator system. The speed limit for the roadway is 65 mph. The vehicle’s traffic-aware cruise control was set to 75 mph at the time of the crash.

The Tesla was subsequently involved in collisions with a 2010 Mazda 3 and a 2017 Audi A4. The Tesla’s 400-volt, lithium-ion, high-voltage battery was breached during the crash and a post-crash fire ensued. The Tesla’s driver was found belted in his seat and bystanders removed him from the vehicle before it was engulfed in flames. The Tesla driver suffered fatal injuries while the driver of the Mazda suffered minor injuries and the driver of the Audi was not injured.

[Image: Tesla Mountain View crash scene]

What Went Wrong

Among the questions raised: Was Tesla’s driver monitoring system adequate to ensure the driver’s proper use of the Autopilot system? Why was the vehicle silent while driving straight into a concrete median barrier (no warning)? Did the car’s forward radar notice anything amiss while careening into the barrier?

According to VSI analysis, the accident was caused in part by the liberal grace period Tesla permits before issuing a warning. Based on our own research vehicle, you get about two minutes from the time Autopilot is engaged until the system starts prompting you with warnings to grab the wheel.

If you do not grab the wheel, the alerts get more pronounced until the system eventually disengages, and you are presented with a message that says you may no longer use Autopilot for the duration of the trip.
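The escalation described above can be sketched as a simple state machine. This is an illustrative reconstruction based on our observations, not Tesla's actual implementation; the threshold values are assumptions (we observed roughly a two-minute grace period, but the later thresholds are invented for illustration).

```python
from enum import Enum

class AlertLevel(Enum):
    NONE = 0        # within the grace period, no prompt
    VISUAL = 1      # on-screen prompt to grab the wheel
    AUDIBLE = 2     # more pronounced alerts
    DISENGAGE = 3   # Autopilot locked out for the rest of the trip

# Hypothetical thresholds in seconds. The ~120 s grace period matches
# what we observed in our research vehicle; the rest are assumed.
GRACE_PERIOD_S = 120.0
AUDIBLE_AFTER_S = 135.0
DISENGAGE_AFTER_S = 150.0

def alert_level(seconds_hands_off: float) -> AlertLevel:
    """Map continuous hands-off time to an escalating alert level."""
    if seconds_hands_off < GRACE_PERIOD_S:
        return AlertLevel.NONE
    if seconds_hands_off < AUDIBLE_AFTER_S:
        return AlertLevel.VISUAL
    if seconds_hands_off < DISENGAGE_AFTER_S:
        return AlertLevel.AUDIBLE
    return AlertLevel.DISENGAGE
```

Shortening `GRACE_PERIOD_S` is exactly the kind of software-only change we recommend later in this article.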

As a driver monitoring solution (DMS), Tesla uses a torque sensor in the steering wheel to measure driver engagement. The system forces engagement because you must apply a little bit of resistance for the vehicle to confirm you are holding the steering wheel. This is actually a pretty good system in our opinion because it requires a level of engagement. So unlike a traditional DMS, you are not monitoring the driver’s attentiveness but rather measuring engagement. For a Level 2 system, engagement is the more important item. Cadillac’s Super Cruise measures both engagement and attentiveness, as it monitors the driver’s forward-looking pose.
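A torque-based engagement check can be sketched in a few lines. The threshold and window values here are invented for illustration; the point is that the driver must periodically apply measurable resistance to the wheel, which is what distinguishes this from a camera-based attentiveness monitor.

```python
# Illustrative sketch of torque-based engagement detection.
# Threshold and window are assumptions, not Tesla's actual values.
TORQUE_THRESHOLD_NM = 0.3   # minimum resistance counted as "hands on"
WINDOW_S = 15.0             # engagement must be seen within this window

def hands_on(torque_samples_nm, sample_rate_hz=10.0):
    """Return True if any recent torque sample exceeds the threshold,
    i.e. the driver applied enough resistance to confirm engagement."""
    window = int(WINDOW_S * sample_rate_hz)
    recent = torque_samples_nm[-window:]
    return any(abs(t) >= TORQUE_THRESHOLD_NM for t in recent)
```

Note that this confirms the hands are on the wheel, not that the eyes are on the road, which is the gap a pose-tracking system like Super Cruise's closes.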

A Vision First Solution

Tesla Autopilot is a “vision first” solution, as it requires camera visibility of lane lines before it can be enabled. If there are no lines, the system will simply not work. Vision-only systems work reasonably well in most use cases, such as interstates or divided highways. They are also adequate on a two-lane road without signaled intersections and stop signs, as the Tesla has no way to identify those.

Radar is another part of the solution but is secondary to the vision system. Radar by itself cannot steer a car, but it can prevent you from hitting something. Radar does a very good job with car following (a.k.a. adaptive cruise control). But radar is not perfect. Radar does a poor job on static objects: it must filter most of them out, because if it did not, there would be too many false positives. This creates hazards. VSI has experienced false positives from time to time in its Tesla, but it is usually nothing more than a rapid slowdown when the system misinterprets another vehicle as being in its trajectory.
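The static-object problem can be illustrated with a minimal sketch of a common filtering heuristic (this is a generic L2 design pattern, not Tesla's actual filter): a stationary object's radar return has a Doppler velocity of roughly minus the ego speed, making it indistinguishable from roadside clutter such as signs, barriers, and overpasses, so it gets dropped.

```python
# Illustrative sketch of why automotive radar discards stationary
# returns. Speeds in m/s; the tolerance value is an assumption.
EGO_SPEED_MPS = 31.7  # ~71 mph, the Tesla's speed at impact

def is_trackable(relative_velocity_mps, ego_speed_mps=EGO_SPEED_MPS,
                 tolerance_mps=2.0):
    """Keep a radar return only if it appears to move over the ground.
    Stationary returns (ground speed ~0) are filtered out to avoid
    constant false positives -- which also means a fixed crash
    attenuator may never be presented to the planner."""
    ground_speed = ego_speed_mps + relative_velocity_mps
    return abs(ground_speed) > tolerance_mps
```

A slower-moving lead car passes the filter, while a fixed barrier closing at exactly the ego speed is dropped, which is consistent with the vehicle giving no warning before the impact.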

While camera plus radar does a reasonable job of maintaining Level 2 performance, it is not perfect. If the driver is still in the loop, this is a satisfactory approach. That is why driver monitoring is so important for any form of automation.

Another factor that comes into play in a vision-first solution is a map, or at least a lane model. Without a lane model you have no ground truth to go by. You don’t necessarily need super-accurate precision, but you need to know what a proper trajectory is. Camera-based lane-keep algorithms really don’t know the difference between right and wrong.

For example, when lane markings are out of the ordinary, the system can easily get confused. In the Mountain View case, the vehicle misinterpreted the lines and got caught between the two lanes, where it hit the barrier.

As we have illustrated, one lane splits into two. The Tesla got confused and thought the area between the two lanes was in fact another lane. We suspect the lane markings were worn or not properly applied.


In the Mountain View accident, a likely contributing factor is the change in road surface. The dark surface is asphalt, while the light surface is concrete. Autopilot may have misinterpreted the change of surface as a lane line, leading to the improper trajectory. If Autopilot had a lane model and was localizing against that lane model, this type of accident could be prevented. This is why Autopilot (and any other L2 system) requires constant driver attention and engagement.
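The sanity check a lane model enables can be sketched as follows. This is a hypothetical illustration of the idea, not any shipping system: compare the camera's proposed lateral position against the map's lane center and flag the trajectory when they disagree, as they would if the camera latched onto a pavement seam.

```python
# Hypothetical plausibility check against a lane model.
# The disagreement threshold is an invented value.
MAX_DISAGREEMENT_M = 0.75

def trajectory_plausible(camera_offset_m, map_lane_center_offset_m):
    """Return True if the vision-derived lateral position agrees with
    the lane model. A large disagreement suggests the camera latched
    onto a false feature (pavement seam, gore-area markings) and the
    system should alert the driver rather than steer toward it."""
    delta = abs(camera_offset_m - map_lane_center_offset_m)
    return delta <= MAX_DISAGREEMENT_M
```

Note this does not require centimeter-level localization; even a coarse lane model is enough to reject a trajectory that drifts a full lane width into a gore area.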

The NTSB described this as a "gore area": a triangular-shaped boundary created by white lines, marking an area of pavement formed by the convergence or divergence of a mainline travel lane and an exit/entrance lane.

VSI remains optimistic, noting that these accidents are addressable. If we were Tesla, the first thing to do would be to remove the 2+ minute grace period. The second suggestion is a map-based localization method. There are several ways to go about this, and VSI has written extensively about localization methods.

Both recommendations could be enabled with software updates, and we would expect this to happen within a short amount of time.

Summary

The gradual rollout of active safety and assisted driving features is going to drive auto sales for the next ten-plus years. Just as ADAS features have found their way into the majority of modern cars, so too will L2 automation. And all these vehicles will require driver monitoring that makes sure the driver is attentive and engaged in the driving task.

Some might ask: what is the point of an automated vehicle if you must maintain attention and engagement? Well, unless you have experienced L2 automation on a decent stretch of highway, you may never understand the benefits from a comfort and safety standpoint.