What’s Up Wednesday 5/1/19

Tesla Provides Updates On Autopilot

Tesla recently provided an update on its Autopilot program and the progress it has made toward achieving fully autonomous vehicles.  The company, which only began selling mainstream passenger vehicles (not counting the original Roadster) with its Model S sedan in the 2012 model year, has seemingly made incredible strides toward building vehicles that are completely autonomous.

The company has been developing its own in-house chip for autonomous driving for several years, after the available options from other vendors were deemed inadequate.  Through these in-house efforts, Tesla engineers were able to create a chip that consumes minimal power while still providing exceptional processing capacity.  It’s a far superior solution to what was previously used in Tesla vehicles (and, by extension, in those of competing manufacturers).  Elon Musk says that from this point forward, all Tesla vehicles will be equipped with this new chip and all the hardware necessary for fully autonomous level-5 operation.  This package is now standard equipment in every car sold.

The plan is to pursue regulatory approval for this technology over the next couple of years.  Tesla hopes to have a fleet of robotaxis operating in select cities within that time frame.

It’s an amazing accomplishment for such a young company, and it seems Tesla is far ahead of all competing companies when it comes to level-5 autonomy.  The large manufacturers, even after devoting tens of millions of dollars to research and development, seem ill-equipped to compete with Tesla in this area at this time.  Tesla has made it clear that it believes the use of LIDAR is a mistake; this is one of the primary differences between Tesla’s liberal use of cameras and competing manufacturers’ steadfast reliance on LIDAR.  Musk claims that other companies will eventually realize their dependence on LIDAR has been a mistake and will switch to a comprehensive camera setup instead.

One of the things that level-5 autonomy requires is the ability of the system to learn as it encounters different driving situations on the road.  Tesla has accumulated vast amounts of data from its fleet of semi-autonomous vehicles over the past few years.  As the company’s sales volume has increased, the usable data from the cars it has put on the road has also grown.  This has given Tesla a massive lead in one of the most important factors in testing an autopilot system – real-world road miles.  Not simulated miles (which can be helpful in their own way), but actual automobiles on the road encountering the varied, unpredictable situations that all drivers face.

But there have also been some negatives to having a limited version of Tesla’s Autopilot system in the hands of consumers.  There have been multiple deaths of Tesla drivers who were using the Autopilot feature (as well as other accidents that didn’t result in fatalities).  Even though Tesla has always stated that the current system was not designed to be used without human input, many drivers have disregarded this warning and treated the tech like a fully autonomous cruise control, allowing their vehicles to drive themselves with little operator attention to the road or surroundings.

The family of one of the drivers who died while using Tesla’s Autopilot is now suing Tesla for wrongful death.  The family’s claim is that Tesla put a faulty product in the hands of uninformed consumers and used early adopters of the technology as guinea pigs.  That lawsuit was just filed this week in California.  The person who died was an Apple engineer.  He was driving a Tesla Model X with Autopilot engaged when the car failed to properly track the lane markings and struck a concrete barrier where California Highway 101 splits.  We’ll see what happens with this lawsuit, but it’s certainly one of the downsides of testing these autonomous systems in real-world situations.

At the present time, we’re forced to take Musk’s word that this technology is as good as advertised – that it’s not simply a slightly more complex version of a system that has already killed and injured a regrettable number of people.  That has to be a concern.  So far the damage (injuries and deaths in Tesla vehicle accidents) has been limited primarily to Tesla drivers and their passengers.  What happens if, as more of these systems are put into Tesla vehicles (and presumably into greater use), innocent people in other cars, on motorcycles, riding bicycles, or walking or jogging on the shoulder of some road begin getting killed in significant numbers by driverless Tesla vehicles?

The way we read it, a driverless system that actually works could be a boon for whoever develops it – or if it fails to function as promised, it could be an absolute disaster.
