Additional information has now been disclosed about the fatal accident that occurred earlier this week when a 2017 Tesla (TSLA) Model X crashed into an “exit divider” barrier. The accident occurred at a major interchange in the San Francisco Bay Area where southbound traffic on U.S. 101, the major north/south highway on the San Francisco Peninsula, can exit to the southwest onto California Route 85, which leads to Cupertino.
As a longtime San Francisco Bay Area resident, I know the interchange well. When I initially heard what had happened, the crash made no sense to me. It occurred in bright daylight, the exit ramp is clearly visible, and, due to the typically heavy traffic in the Bay Area, most drivers have essentially become “automatons,” simply following other vehicles in an endless line of traffic. So what would cause a vehicle in that situation to swerve into the divider separating the diverging lanes of traffic?
I immediately suspected a “confused” Autopilot system that might have attempted to redirect the vehicle back into the southbound lanes of U.S. 101 instead of continuing to the right on the exit ramp for Route 85. At the time, however, I had no other information to support such a scenario. Tesla itself has now effectively provided such evidence.
There has already been an excellent article on Seeking Alpha about the crash (by FundamentalSpeculation.IO), but Tesla, the gift that keeps on giving with its manipulative narratives, has now confirmed that the vehicle was using Autopilot at the time of the crash. As I will describe below, I also think that what accompanied Tesla’s confirmation is both appalling in its attempt to “spin the narrative” and further highlights the potential risks of investing in Tesla’s stock.
Tesla’s “Narrative” about the crash
A Reuters article provides what I believe is a very factual and straightforward account of Tesla’s comments about the crash. As the article reports, Tesla has confirmed that the vehicle was using Autopilot:
Tesla Inc said on Friday that a Tesla Model X involved in a fatal crash in California last week had activated its Autopilot system
But, apparently in Tesla’s view, the crash was actually the fault of the driver:
Tesla also said vehicle logs from the accident showed no action had been taken by the driver soon before the crash and that he had received earlier warnings to put his hands on the wheel.
“The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken,” Tesla said.
As the Reuters article then describes, however:
The statement did not say why the Autopilot system apparently did not detect the concrete divider.
Tesla then did go on to reportedly say:
Tesla said late Friday that “Autopilot does not prevent all accidents – such a standard would be impossible – but it makes them much less likely to occur. It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists.”
But the Tesla narrative then goes further:
Tesla said that in the United States “there is one automotive fatality every 86 million miles across all vehicles from all manufacturers. For Tesla, there is one fatality, including known pedestrian fatalities, every 320 million miles in vehicles equipped with Autopilot hardware.”
Tesla’s “disclosures” and “communications” are ever artful in their subtle use of words (somewhat like the artful way Bill Clinton used words). From the statement above, we don’t really know whether Autopilot was actually engaged for all of those 320 million miles in those “Autopilot-equipped vehicles,” how it was being used, or, since Tesla implies it keeps such good logs, for how many of those miles Autopilot was actually engaged.
In any case, with the recent Mountain View fatality, there have now been two such fatalities that we know of in the U.S., so that statistic is only half as good: the apparent new figure is one fatality every 160 million miles driven. Also, in case anyone has forgotten, there was another fatality in China involving a vehicle being driven on Autopilot, so the statistics are not as good as Tesla would have you believe.
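The arithmetic behind those figures is simple enough to check directly. The sketch below uses only the mileage base Tesla itself publicized (320 million miles per fatality) and the fatality counts discussed above; it is illustrative back-of-the-envelope arithmetic, not an actuarial analysis, and the comparison assumes the mileage base stays fixed as fatalities are added:

```python
# Sanity check of the implied fatality rates, using Tesla's own
# publicized figures (illustrative arithmetic only).

TESLA_CLAIMED_MILES = 320e6   # Tesla's claim: one fatality per 320M miles
US_BASELINE_MILES = 86e6      # Tesla's cited U.S. average: one per 86M miles

# A second known U.S. fatality over the same mileage base halves the
# miles-per-fatality figure:
two_fatalities = TESLA_CLAIMED_MILES / 2        # 160 million miles
# Counting the reported fatality in China as well makes it three:
three_fatalities = TESLA_CLAIMED_MILES / 3      # ~107 million miles

print(two_fatalities / 1e6)                     # 160.0
print(round(three_fatalities / 1e6))            # 107
# How the three-fatality figure compares to the U.S. baseline:
print(round(three_fatalities / US_BASELINE_MILES, 2))  # 1.24
```

In other words, on Tesla’s own mileage base, counting all three known fatalities leaves the “Autopilot hardware” fleet only about 1.2x better than the all-vehicle U.S. average Tesla cites, not the nearly 4x advantage its statement implies.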
A statistical digression…
Given my own view of the truthfulness of Elon Musk’s and Tesla’s statements and comments (see Bill Maurer’s excellent summary of such things), I am not inclined to accept Tesla’s statistics about anything. I do have some suggestions, however, about what would make Tesla’s statistics far more helpful, and also some comments about how manipulative those statistics are in any case.
The “baseline” presented by Tesla of “one automotive fatality every 86 million miles across all vehicles from all manufacturers” is from a driver and vehicle universe very different than that of the typical Tesla vehicle owner.
That universe would include a huge number of much less affluent drivers driving much older and possibly poorly maintained vehicles. It would also probably include a huge number of vehicle owners with far less education than the typical Tesla owner, and while less education would not necessarily contribute to more “accidents” occurring, I would guess that is possibly the case. The overall universe would probably also include many uses for vehicles other than what I would assume are the primary uses of Tesla vehicles: commuting or casual personal driving.
And so, I would like Tesla to provide fatality statistics for a universe of drivers with demographics similar to Tesla owners, and then let’s see how the statistics look. I would also like to see Tesla provide fatality statistics for a universe of vehicles similar to Tesla’s, to see how those compare. Examples of similar vehicles would include the Mercedes S-Class, BMW 7 Series, Lexus LS, etc., as well as “high-end” SUVs from such companies. Until Tesla is willing to provide more detailed statistics for both a similar owner demographic and for similar vehicles, none of Tesla’s statistics should have any credibility at all.
Tesla’s manipulative use of statistics
There is actually a far more important issue I have with Tesla’s use of what I consider misleading statistics: such statistics are also used to “spin the narrative” that we should blindly accept both anything Tesla is doing and the overall concept of “driving assistance” functions and, evolving from that, “self-driving cars.” Implicit in the way Tesla manipulates statistics is the patently false argument that:
- since people are going to die in auto accidents anyway, it is ok that some people die while using our systems while we are still developing them.
And since Tesla would apparently hope for us to accept its view of “progress,” we should turn a blind eye to what happens in the interim. This is, of course, similar to other aspects of Tesla, such as the suggestion that investors should also turn a blind eye to any scrutiny of Tesla’s truly horrendous financial characteristics (what else can you say about 14 years of losses and over $25 billion in financial obligations?) given all of the “progress” the company would like you to believe will be achieved.
In my opinion, Tesla’s narrative about “statistics” is actually just a cover-up for Tesla’s ongoing dysfunctional overall operations. There is no better example of that than what we are currently seeing from the very strange “Model 3 ramp-up.”
First of all, it was Elon Musk who announced that the vehicle would go into production in “July 2017,” but so far barely 10,000 vehicles have apparently been produced. Musk also said that “100,000 to 200,000 Model 3s” would be produced in the second half of 2017, but fewer than 3,000 actually were. The other very odd thing about the Model 3 “ramp” is that none of this was supposed to happen at all.
Given the previous instance of “production hell” when the Model X was introduced, and all of Musk’s comments about how much was learned from that experience and how the Model 3 was “designed for manufacturing,” why can’t Tesla produce more of the vehicles? The failure appears even odder given that Tesla has now been manufacturing around 2,000 Model S and Model X vehicles per week for over 18 months. As such, the Model 3 production ramp tells me that there are severely dysfunctional things inside Tesla that are not yet widely known or appreciated.
With the very strange Model 3 introduction, however, there is at least a reasonable amount of public visibility into what is happening. With the functions buried in huge amounts of software code for “driving assistance” and “self-driving vehicle” systems, there is unfortunately no visibility at all into how well developed and tested any of them might actually be.
All we can wait for as data points are unfortunate individual incidents such as the one that recently occurred in Mountain View, California. What is even more concerning is Tesla’s well-established practice of having its customers effectively serve as “beta testers” of its vehicles and developing systems.
Additional evidence of possible Autopilot failures
As FundamentalSpeculation.IO also highlighted in his recent Tesla article, there was a local news report that the now-deceased driver had reportedly described to Tesla that his vehicle had swerved toward the exit ramp barrier during earlier trips along the same stretch of road. Having read many accounts (too many to link!) of Tesla drivers’ experiences with the erratic new “Enhanced Autopilot” (AP 2.0 or 2.5), introduced after Tesla lost access to Mobileye’s systems in 2016, that is why I immediately suspected that Autopilot was somehow confused by the diverging lanes at that particular exit.
Tesla, however, merely offers up more statistics: “85,000 trips” past that exit by other Tesla drivers without incident. Regardless of those statistics, in the event that happened this week – in broad daylight, in a vehicle driven by someone who traveled the same stretch of road every day on his way to work, a person who reportedly had previously told Tesla about other instances of his vehicle “swerving” toward the same barrier – this driver’s Model X did inexplicably swerve toward the barrier and make contact with it. This incident also follows an earlier incident this year (which fortunately didn’t result in fatalities) in which a Tesla on Autopilot plowed into the back of a fire truck (a hopefully visible vehicle or object!).
Tesla’s overall response and further risks for the stock
Tesla also mentions in its narrative about the current fatality that the driver had ignored “repeated warnings” from Autopilot to take control of the vehicle and re-engage the system.
I can’t wait for hungry “plaintiff’s bar” lawyers to seize upon that statement, as it is, to me, clear-cut evidence that Tesla’s systems are still not properly developed or deployed. That means they are still very dangerous, both to the drivers of Tesla vehicles and to any other vehicles on the road nearby. Tesla has effectively admitted to its own design flaw: after even a single warning that goes unanswered, the automated systems should probably be disabled.
I also believe that the overall environment and public attitudes will start changing about how “technology” companies are operating as businesses. As I already commented in another recent article (titled Facebook, Uber, and Tesla), I believe there will be a growing backlash against how many companies are treating their customers as pawns in an out-of-control race to further dominate markets and increase revenue growth.
As we have also now seen from how quickly protests have grown from what is typically a pretty apolitical group (U.S. teenagers) about gun violence, there could also now start to be significant public backlash about possible “car violence” from poorly developed and implemented systems from any auto company. I would now expect there to start being far more controls and requirements about ongoing development projects and testing of such systems and vehicles. As such, I think such trends would be a very underappreciated risk for Tesla investors who seem to assume that every Elon Musk proclamation will come true.
I also continue to believe that at least 20 percent of Tesla’s market value is based on the view that Tesla has a “lead” in the development of such automated systems (including crazed projections of huge revenues from a future “Tesla Network”). I believe that is clearly not the case in terms of Tesla’s actual and likely capabilities. Moreover, if the development of such systems becomes much more heavily monitored, regulated, and controlled, companies with a much less consistent record of successfully developing and introducing products (just think of what has happened over the past few years with the Model X, Model 3, and AP 2.0 introductions) would be put under much greater scrutiny. Since Tesla’s ongoing narratives attempt to manage any scrutiny applied to the company, Tesla would likely feel very constrained by a lot more scrutiny of its operations.
As another interesting data point this week about how complicated such systems are to develop and safely deploy, there was a fairly amusing anecdote about a “self-driving” car getting a ticket for failing to yield to a pedestrian in a crosswalk. Fortunately, no pedestrians were harmed in that instance, but I believe it shows the complexity of trying to develop such systems.
Since this article also essentially discusses how much we should trust both Elon Musk’s and Tesla’s statements and announced development activities, I thought it would be appropriate to remind readers about another aggressively announced Tesla “feature” which you can see in the following video:
What is especially comical about this video is the number of deluded sycophants enthusiastically cheering and making comments like “all right, awesome, cool, wow!” during the video (about something that then never happened!).
There is another comical part of the same presentation: at the start of the “introduction” (which occurred in 2013!), the first pitchman described the achievement of Tesla’s “first profitable quarter” and promised there would be “many more to come!” Yet another comical part of that video is that a Model X was also “introduced” at the same event (although it would then take three years to sort of achieve volume production… sound familiar at all?).
What is not comical to me, however, is having what I view as Tesla’s poorly developed and implemented systems being randomly deployed out on our highways with essentially no oversight or visibility. As I have described about ongoing societal trends, I also believe such business practices are becoming much less acceptable to the general public.
Since Tesla has so far been very much supported by the unquestioning acceptance that what it is doing represents “progress,” I also think any turn in such sentiment could have huge negative leverage for Tesla. For a severely unprofitable company that is also very heavily leveraged financially (with over $10 billion of debt and an additional $15 billion of other financial obligations), such a sharp change in sentiment about the company could very quickly result in overall financial distress.
Disclosure: I am/we are short TSLA.
I wrote this article myself, and it expresses my own opinions. I am not receiving compensation for it (other than from Seeking Alpha). I have no business relationship with any company whose stock is mentioned in this article.
Additional disclosure: This article expresses the author’s opinions and perspectives about various investment-related topics. Since all statements in the article are represented as opinions, rather than facts, such opinions are not a recommendation to buy or sell a security. My own investment position described in the disclosures is not intended to provide investment advice or a recommendation of a specific investment strategy but is a required disclosure item by Seeking Alpha. My own investment position may have been initiated at very different price levels than current price levels, and so that is also why my disclosed position is definitely not intended as an investment recommendation. All investors should also do their own research before making any investment decision.