
Blog: Autopilot Saves Wildlife


By Alex

By now I am sure we have all seen Scott Kubo’s video of him disengaging Autopilot because it did not register the debris in the oncoming lane. Whether that non-classification of the debris was intentional or not, only the Tesla engineers who review the disengagement will be able to tell us. However, judging from a video that surfaced earlier this week, I can say with confidence that such classification was most likely intentional.

As we can see in Scott’s video below, Autopilot does an exceptional job of navigating heavily congested interstate traffic without driver lane-change confirmation, working all the way over to the left lane and eventually into the HOV lane. After traveling in that lane for a while, however, Scott passes under an overpass and, to both his and the vehicle’s surprise, a large piece of debris is blocking part of the left lane.

Scott Kubo Autopilot Debris Disengage

Once he notices it, Scott disengages Autopilot a mere couple of meters from the debris and swerves to the right just in time to avoid it. It is another perfect reminder of why we must remain vigilant at all times when using the Autopilot system. Since this happened early this past weekend, it makes me wonder whether some of the executives and engineers we saw at Tesla’s Autonomy Day have already reviewed the disengagement footage and corrected the path the car should have taken. This also raises an interesting point: a core rule in current radar- and lidar-based autonomous vehicles is to ignore stationary objects detected at highway speeds. See the video below of a Model S crashing directly into a stationary fire truck at highway speed last May while Autopilot was engaged.

Tesla Autopilot Crashes Into Fire Truck

Autopilot and other self-driving systems have an inherent problem to solve: the road is full of stationary objects that pose no risk to you or the vehicle, yet could lead the system to trigger emergency braking to avoid them. These objects are as common as signs on the side of the road, construction equipment and barriers, stalled cars on the shoulder, debris, and hundreds of other variables we encounter while driving. Because of this, these systems have been designed to ignore stationary objects detected while traveling at highway speeds. As we can see in Scott’s video, Autopilot performed as intended. Hopefully, however, Tesla, Waymo, and GM have been able to improve their classifiers to better distinguish the stationary objects that actually require the vehicle to come to a complete stop at highway speed (a big red fire truck, for example).
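To make that rule concrete, here is a minimal sketch of the filtering logic as I understand it. To be clear, the function name, signature, and thresholds are all my own invention for illustration, not Tesla’s, Waymo’s, or GM’s actual code; it only shows why a ground-stationary return gets dropped at highway speed:

```python
# Illustrative sketch only: hypothetical names and thresholds, not any
# manufacturer's real logic. We assume the object's absolute (over-ground)
# speed has already been derived from the radar's relative measurement.

HIGHWAY_SPEED_MPS = 25.0   # ~55 mph; assumed cutoff for "highway speed"
STATIONARY_EPS_MPS = 0.5   # absolute speed below this counts as stationary

def should_brake_for(track_abs_speed_mps: float,
                     ego_speed_mps: float,
                     in_ego_lane: bool) -> bool:
    """Decide whether a detected object warrants braking.

    The naive rule described above: at highway speeds, ignore anything
    that is not itself moving, because overhead signs, barriers, and
    shoulder debris would otherwise cause constant false emergency stops.
    """
    if not in_ego_lane:
        return False
    is_stationary = abs(track_abs_speed_mps) < STATIONARY_EPS_MPS
    if ego_speed_mps >= HIGHWAY_SPEED_MPS and is_stationary:
        # This branch is exactly what lets a car drive into a parked
        # fire truck: the truck is stationary, so it is filtered out.
        return False
    return True

# A stopped fire truck in-lane at 30 m/s ego speed is ignored:
print(should_brake_for(0.0, 30.0, True))   # False
# A slow-moving vehicle ahead is not:
print(should_brake_for(5.0, 30.0, True))   # True
```

The point of the toy example is simply that, under this rule, the parked fire truck and the debris in Scott’s lane both fall through the same filter.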

Now, what all of this has to do with the title is that Tesla has clearly demonstrated that its classifier is indeed getting vastly better than last year. In a video that surfaced earlier this week, we see Autopilot engaged on a dark, foggy road when, out of nowhere, a little hopping bunny rabbit decides to cross the road through the fog. Even before the driver or the vehicle had time to ask him why he was crossing the road, Autopilot made the decision to veer right out of the lane onto the usable paved road, ultimately prolonging Roger’s life for another day.

Tesla Autopilot Avoids Rabbit

This, to me, is a fascinating case for Autopilot. While not the same circumstances as the debris on the road, it is a perfect example of how Tesla classifies small objects on the roadway, moving or stationary. From the few frames after the rabbit comes into view of the camera, we can see that it is running quickly across the street, and evidently quickly enough to trigger Autopilot to change course, slow down, and veer toward the right shoulder of the road. It is an interesting question what it actually takes to get Autopilot to trigger an evasive maneuver. On one side, we have a large piece of what looks like a bumper in the lane; hitting it would certainly damage the car, and possibly other vehicles as a result of the impact. In that case, Autopilot did nothing to avoid it, and without driver intervention it would have been another CNN headline for us all to bask in. In the other case, there were no other travelers on the road, the impact most likely would not have seriously damaged the vehicle, and hitting the object would have had no effect on surrounding drivers and vehicles. Yet here Autopilot intervenes, pulls the vehicle to the shoulder, and applies the brakes.

As an aside, I would also like to mention something that puzzles me, and I would appreciate it if any of you reading could clarify it for me. Judging by the height of and reflection off the speed limit sign down the street, it appears the vehicle’s high beams are enabled, but as the car begins its evasive action we see those lights become disabled. I wonder whether Autopilot did this for some reason, or whether the driver, quickly grabbing the wheel, accidentally hit the high beams. Then again, this being a Tesla, it does not have the same stalks and knobs for turning the lights on and off that most traditional cars do; on Teslas that function is reserved for the touchscreen, which almost guarantees that Autopilot did in fact disable the high beams for some reason. Any and all input on this would be appreciated. I am going to reach out to Dealer of Happiness and will update with any information.

Now, I do not want to come across as a Tesla short or as anti-self-driving; quite the opposite. I write this to highlight the still-massive hill of challenges that is artificial-intelligence vision: to stop or not to stop when an object appears in your lane of travel. What would happen if there had been a vehicle traveling next to Dealer of Happiness in the rabbit Twitter video? Would Autopilot have known that this was in fact a rabbit, and that hitting it while staying in the current lane would cause less harm to the surrounding traffic and driver? Or would it have classified it as a tire that came loose from a nearby vehicle and made the same veer-and-stop maneuver anyway, resulting in a side collision with the vehicle next to Dealer of Happiness? That is the way my mind is racing around this whole topic and these questions. I know that Tesla and its engineers have a giant uphill battle still ahead, and I am just happy that, for now, I am the one who gets to ask these questions and not answer them.
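One crude way to picture that stop-or-swerve dilemma is as a cost comparison between hitting the object and performing the evasive maneuver given the surrounding traffic. The sketch below is purely my own framing, with invented object categories and cost numbers, not anything Tesla has published:

```python
# Hypothetical cost framing of the swerve-or-stay decision; the obstacle
# categories and cost values here are invented for illustration only.

from dataclasses import dataclass

@dataclass
class Obstacle:
    kind: str             # e.g. "rabbit", "bumper", "fire_truck"
    damage_if_hit: float  # expected cost of striking it (0..1)

def evasive_cost(adjacent_lane_occupied: bool,
                 shoulder_available: bool) -> float:
    """Expected cost of swerving, dominated by who is around us."""
    if adjacent_lane_occupied:
        return 0.9   # swerving risks a side collision
    if shoulder_available:
        return 0.1   # empty road with a usable shoulder: cheap maneuver
    return 0.5

def decide(obstacle: Obstacle, adjacent_lane_occupied: bool,
           shoulder_available: bool) -> str:
    swerve = evasive_cost(adjacent_lane_occupied, shoulder_available)
    return "swerve" if swerve < obstacle.damage_if_hit else "stay"

# Rabbit on an empty foggy road with a paved shoulder: swerving is cheap.
print(decide(Obstacle("rabbit", 0.2), False, True))   # swerve
# Same rabbit with a car alongside: hitting it is the lesser cost.
print(decide(Obstacle("rabbit", 0.2), True, False))   # stay
```

Under this toy framing, the same rabbit gets swerved around on an empty road and driven through when a car sits in the adjacent lane, which is exactly the judgment call I am wondering whether Autopilot can make.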

Thank you for reading. If you are new here, I write these blogs in an attempt to create passive income to purchase my own Tesla Model 3 and document the ever-increasing updates to the Tesla Enhanced Autopilot system. For a poor CS college student, this is not an easy task. If you have gotten value from these blogs or are interested in the progression, please support my Patreon and follow me on Twitter.

