Wednesday, June 23, 2021

How Tesla is working to improve autonomous driving

This 35-minute lecture (you can start at 2:05) is long and tedious. Andrej Karpathy can be hard to understand, and he talks very fast.

But the video describes the challenge of compiling and analyzing petabytes of data from difficult real-world scenarios (cases where the car's autonomous prediction disagrees with what the human driver actually did) in order to improve the cars' interpretation of their surroundings and make driving smoother and safer. He contrasts "training time" (when the supercomputer analyzes everything in detail, with the benefit of hindsight) with "test time" in real-world driving, where efficiency (short latency) is paramount.
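As a rough sketch of that hindsight idea (my own illustration, not code from the talk; the function names and numbers are made up): offline, a labeler can look at frames recorded both before and after a moment to work out what really happened, while the car itself only has frames up to the present and must answer within milliseconds.

```python
import numpy as np

def offline_label_velocity(positions, t, dt=0.05, window=10):
    # "Training time": the whole clip is on disk, so the labeler can use frames
    # both before AND after frame t (the benefit of hindsight) and take a wide
    # central difference for a smooth velocity estimate.
    lo, hi = max(0, t - window), min(len(positions) - 1, t + window)
    return (positions[hi] - positions[lo]) / ((hi - lo) * dt)

def online_estimate_velocity(positions, t, dt=0.05):
    # "Test time": in the car, only frames up to t exist and the answer is
    # needed within a few milliseconds, so use a cheap backward difference.
    if t == 0:
        return 0.0
    return (positions[t] - positions[t - 1]) / dt

# Toy trace of the lead car's distance ahead of us (meters), one frame per 50 ms.
positions = np.array([30.0, 29.5, 28.9, 28.2, 27.4, 26.5, 25.5, 24.4])
print(offline_label_velocity(positions, t=4))    # smoother, non-causal
print(online_estimate_velocity(positions, t=4))  # noisier, causal, low latency
```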


24:00 is a good example - the car in front slams on its brakes. The radar has trouble determining whether the car in front has stopped or whether it is picking up an incidental stationary object, so the radar track "drops" six times in rapid succession (the system assumes it's an erroneous signal), while the vision-only system "understands" the situation more smoothly and consistently.
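To make that "drop" behavior concrete, here is a toy sketch (entirely my own and hypothetical, not Tesla's tracker): a naive radar tracker that discards any return whose closing speed looks like it could belong to a stationary object, next to a vision path that simply keeps smoothing its own depth estimate and never drops out.

```python
def track_lead_car(radar_returns, ego_speed, vision_depths, alpha=0.3):
    # radar_returns: list of (range_m, closing_speed_mps) per frame.
    # vision_depths: the vision network's depth-to-lead-car estimate per frame.
    radar_track, vision_track = [], []
    smoothed = vision_depths[0]
    for (rng, closing_speed), depth in zip(radar_returns, vision_depths):
        # Radar side: a return closing at roughly our own speed could just as
        # well be a stationary roadside object, so the naive tracker throws it
        # away -- the repeated "drop" seen in the clip.
        looks_stationary = abs(closing_speed - ego_speed) < 1.0
        radar_track.append(None if looks_stationary else rng)
        # Vision side: keep smoothing the depth estimate, frame after frame.
        smoothed = alpha * depth + (1 - alpha) * smoothed
        vision_track.append(smoothed)
    return radar_track, vision_track
```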

25:39 explains "phantom braking" under bridges and overpasses - radar can't distinguish between a fixed object like a bridge and an obstruction like a stopped vehicle, so the system "panics" and looks to the visual input to confirm that there's a stopped object ahead. Sometimes that confirmation is erroneous, and braking is initiated. By "doubling down" on interpretation of the visual information and ignoring radar, they can achieve smoother driving.
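A minimal sketch of that decision logic as I understand it from the description (the thresholds and function names are assumptions, not from the talk):

```python
def fused_should_brake(radar_sees_stationary, vision_stopped_prob):
    # Fused behavior described at 25:39: radar flags *something* stationary
    # (it can't tell a bridge from a stopped car), the system asks vision to
    # confirm, and even a marginal vision score tips it into braking.
    return radar_sees_stationary and vision_stopped_prob > 0.5

def vision_only_should_brake(vision_stopped_prob):
    # Vision-only alternative: act only on a confident vision detection, so an
    # overpass that vision scores low never triggers the brakes.
    return vision_stopped_prob > 0.9

# Example: under an overpass, radar reports a stationary object and vision is
# only mildly suspicious (0.6) -- fused braking triggers, vision-only does not.
print(fused_should_brake(True, 0.6))   # True  -> phantom braking
print(vision_only_should_brake(0.6))   # False -> drives through smoothly
```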

29:31 "We expect the legacy stack (the existing sensor configuration) to have one accident every 5 million miles"

