Musk’s Fury Over a Tesla Investigation Foreshadowed His War on Washington
Faiz Siddiqui, The Washington Post
Illustration of Elon Musk and US Capitol building. (photo: Anson Chan/For The Washington Post)
The upcoming book “Hubris Maximus” details Tesla CEO Elon Musk’s disregard for the public and his contempt for the regulators charged with overseeing his companies.
Walter Huang believed in Tesla. His lawyers would later say that he thought his car, when placed in Tesla’s Autopilot mode, was safer than one driven by a human.
He was such a dedicated enthusiast that he joined a Facebook group for Model X owners and regularly talked to a friend about the performance of the Autopilot software. His wife said he would watch YouTube videos of Autopilot in his spare time.
This reconstruction of Huang’s drive is based substantially on documents from the National Transportation Safety Board’s (NTSB) accident docket.
Lately, Huang had been having problems with his Model X. The car, which he regularly used in Autopilot mode, repeatedly veered toward a highway barrier on U.S. 101 that he passed daily on his way to work, prompting him to correct its course.
At around 9 a.m. on March 23, 2018, minutes after dropping off his son Tristan at preschool, Huang, an Apple software engineer, flicked on Autopilot and headed south on 101 toward work. At around the same time, he fired up a mobile strategy game he had been playing — Three Kingdoms.
Noting his lack of attention, the car repeatedly prompted him — a visual warning escalating to an annoying beep, the kind meant to provoke a reaction. Tesla detected whether a driver was holding the steering wheel, but Huang didn’t nudge it in response to the warnings — a phenomenon familiar to drivers who tend to tune out alerts from automated systems. Autopilot, programmed to follow lane lines and keep its distance from vehicles ahead, stayed engaged.
About twenty-seven minutes into his drive, near the interchange between U.S. 101 and State Route 85 north of San Jose, Huang reached the spot where his car had veered off before. Apparently, he did not react quickly enough, and the car lost its lane. From there, it followed a “faded and nearly obliterated” lane line into an empty space between two lanes; ahead, the bustling U.S. 101 and the on-ramp for Route 85 were separated by a concrete highway median.
Once in that no-man’s-land and with no traffic ahead of it, Huang’s SUV did what it had been programmed to do — it accelerated to the maximum speed the driver had set. It climbed from the 62 mph speed of traffic up to 65 two seconds later, then 68 a second later, then 70 in the final second.
It never reached the speed of 75 mph to which Huang’s Autopilot had been preset. Huang’s car was an unstoppable force about to meet an immovable object. Huang crashed into the median at around 71 mph, investigators said, spinning “the SUV counterclockwise and [causing] the front body structure to separate from the rear of the vehicle.” It also struck two other vehicles, leaving the twenty-five-year-old driver of a Mazda 3 with minor injuries — and the car’s front driver’s side fender heavily damaged — when the Tesla rotated into the lane of traffic before coming to rest. The Tesla, its battery compartment ripped open, erupted in flames.
Bystanders found Huang strapped into the driver’s seat. They pulled him out of the wreckage, and he was taken by ambulance to Stanford Health Care Hospital.
At 1:02 p.m., Walter Huang was pronounced dead.
A tense phone call
Robert Sumwalt’s job had sent him to the sites of devastating plane crashes, train derailments, and infrastructure failures over a more than decade-long career at the NTSB.
But seated at a conference table in his sixth-floor Washington, DC, office, one corner in the labyrinth of federal agencies known as L’Enfant Plaza, the nation’s top federal safety investigator looked at his iPhone and was stunned as never before.
“He hung up on us.”
“Yeah, he did,” said Dennis Jones, a nearly forty-year veteran of the agency sitting across the table, also trying to process the ordeal.
Over twenty-seven contentious minutes on April 11, 2018, in Sumwalt’s later recollection, Elon Musk had fumed, protested, threatened to sue, and abruptly exited the conversation when safety investigators refused to bend to his will. It was a textbook example of Musk’s disregard for a public that had imbued him with godlike power — and his contempt for the safety establishment charged with ensuring he didn’t abuse it.
Autopilot, Musk believed, would play a pivotal role in advancing traffic safety, ushering in a future where people no longer had to die on the road. Its very origins were tied to an internal meeting at Tesla where the subject of eradicating road deaths had gripped the engineering staff as one of them wrote the annual number of road deaths on a whiteboard. Already, major tech companies such as Google and Uber were envisioning populating the roads with self-driving fleets, but Tesla would be unique in pursuing autonomy through privately owned personal vehicles. And the company wanted to make it happen as quickly as possible.
Autopilot is a set of driver-assistance features that enable Teslas to maneuver from highway on-ramps to off-ramps without the driver’s physical input, a type of hyper-advanced cruise control that gave consumers a tangible demonstration of Tesla’s technological lead over other automakers. It controls the cars’ speed and distance from other cars, follows lane lines, and can even make lane changes along a route. Full Self-Driving, meanwhile, sought to bring those capabilities to city and residential streets, adding the ability to make turns, halt for red lights and stop signs, and follow turn-by-turn directions.
Tesla had developed a handy talking point for its discussions of Autopilot: it was safer than normal driving when crash data was compared. (The argument carried a fundamental flaw: Autopilot was intended for highways, and highway driving was inherently less complicated.) But even years later, Musk’s position had hardly evolved. He applied the same logic to Full Self-Driving, Tesla’s more advanced iteration of Autopilot, designed for use in far more complicated city and residential street driving. Musk’s arguments here were at best unproven and at worst reckless: he was encouraging drivers to view systems geared toward convenience as lifesaving breakthroughs that could prevent crashes.
Regardless, Musk seemed to believe that even if some lives were lost in the process, those who opposed his vision of the future were roadblocks to progress. He fully articulated this philosophy at an autonomy-focused event years later, in 2022: “At the point of which you believe that adding autonomy reduces injury and death, I think you have a moral obligation to deploy it even though you’re going to get sued and blamed by a lot of people. Because the people whose lives you saved don’t know that their lives were saved, and the people who do occasionally die or get injured, they definitely know — or their state does.”
His position was that the processes established by society to prevent automotive calamities were ineffective or, worse, obstacles to this moral imperative. Musk had legions of admirers and online fanboys who validated this belief; his methods were the right ones, and his way was the only path forward. Who was the government to stand in the way? How could they possibly possess the requisite knowledge, technological know-how, and raw data to undermine him? What had they ever built?
Musk’s beef with Washington
Musk and Tesla already had a fraught history with regulators in Washington. Tesla was staking its future on the artificial intelligence bet of the century: putting a fully autonomous vehicle in the hands of customers, a moon shot that differed from the mostly commercial ambitions of the robotaxi projects from Big Tech competitors like Google and Apple. Regulators and safety officials in the federal government who were building a set of rules to regulate Silicon Valley’s lab experiments — small-scale testing in a highly regulated space — were caught largely flat-footed when Tesla began adding features that resembled autonomy to its cars in late 2014.
Musk may have resented Washington’s meddling, but he also owed much of Tesla’s success to it. In 2009, the company was on the verge of collapse as the Great Recession promised to wipe out demand for its pricey electric cars. Tesla had produced a sleek sports car, the Roadster, which offered the thrill of instant torque combined with an electric power train, in part inspired by the mid-engine Lotus Elise. That thrill came at a price: the vehicle cost around $100,000, or more than double the sticker price of the Elise. Faced with a souring economy that threatened its ability to produce the Model S — the car that would later make the company a household name — Tesla found two saviors. Daimler, the auto group that encompassed Mercedes-Benz, approached Tesla to build power trains for its electric Smart cars. Meanwhile, the U.S. government, aiming to bring electric cars to the masses, made a bet on Tesla. The Department of Energy provided the company with a $465 million loan, critical cash at a time of existential uncertainty.
In a 2011 interview with The Atlantic, Musk acknowledged Tesla’s reliance on the government.
“Tesla has received a loan from the government,” he said. “If Tesla is to compete effectively against GM, Ford, Chrysler and others and those guys are getting massive amounts of money from the government at zero cost of capital and we don’t participate in that game it makes a very difficult job even harder. And so it just would be really unwise if we didn’t do that.”
In the coming years, Tesla would secure another coup. The government was encouraging big automakers to go electric, but they didn’t have the capacity or willpower to do so, especially in a declining economy. Tesla, on the other hand, would pump out thousands of electric vehicles per quarter. Why couldn’t Detroit simply take credit for Tesla’s work? Automakers such as Chrysler started buying what were called “regulatory credits” from Tesla so they could meet state emissions requirements under the federal Clean Air Act. This arrangement propelled Tesla to the profitability that helped make Musk the world’s richest person. Not only would his company get a years-long head start on the competition, it could also cash in on their failure to adapt.
Musk may have outmaneuvered competitors in Silicon Valley and Detroit, but the threat of regulation still hung over him. Though the safety investigators with the NTSB had a different mandate from their counterparts at the National Highway Traffic Safety Administration (NHTSA), Sumwalt felt that Musk lumped all the DC suits together. Musk had been irate when NHTSA regulators called him after another, similar crash in 2016: a Tesla in Autopilot had slammed into a tractor-trailer at 70 mph, killing the driver, after failing, in Tesla’s explanation, to distinguish the rig from the sky behind it. Musk yelled on the phone and threatened a lawsuit when he was told regulators were getting involved, said a former safety official, who spoke on the condition of anonymity to discuss a sensitive matter.
Musk’s view was simple: “We’re all beating up on him,” Dennis Jones, the former NTSB managing director, recalled.
Musk’s relationship with regulators and safety officials fully eroded during the five most critical years in his self-driving push. As he promoted his vision of consumer robotaxis, Musk tested a strategy of harnessing online armies of fanboys — oftentimes, enthusiastic investors whose toxic digital personas were aimed at silencing short sellers or naysayers — against those who threatened to slow the progress of Autopilot and its companion mode, Full Self-Driving, making life a nightmare for those who stood in his way. All in the name, Musk argued, of safety. One official was forced to flee her home in response to what local authorities regarded as a dangerous threat after Tesla fans erupted over her appointment as an NHTSA adviser, and Musk joined in the public attack. In another instance, authorities had to get involved after criticism of a government official escalated into personal threats from online trolls.
A real-life Tony Stark
On April 6, 2018 — the Friday before Musk would angrily hang up on him — Robert Sumwalt knew that he was faced with a potentially unpleasant task: calling the CEO about the latest deadly crash involving a Tesla on Autopilot. Things began quite cordially, but Sumwalt would soon learn the same lesson as so many who have crossed Musk’s path over the years: the mercurial billionaire can charm and play nice with those who have power over his empire, but he can turn on them just as quickly if he feels they’re threatening to stand in his way.
Federal safety investigators have a duty to the public: ensuring that the errors contributing to fatal crashes are not repeated or, worse, built into safety-critical systems. Many feel this responsibility deeply. Sumwalt — an easygoing but direct communicator who had spent decades as a commercial pilot — certainly did. Even so, he was starstruck as he first dialed Musk’s cellphone to discuss the matter.
“In fact, I was amazed … I thought ‘this was pretty cool, I’m talking to Elon Musk,’” said Sumwalt, now retired from the safety board, recalling the conversation after a dinner at Cracker Barrel in Florida.
The two men exchanged pleasantries. Sumwalt, flanked by a coterie of Washington officials huddled around a speakerphone on a sofa outside his office, explained to Musk that he wanted Tesla to be a party to the investigation. This was an especially critical step for a company with vast amounts of internal data, whose technical understanding of its own systems far outmatched that of safety officials. Dennis Jones, the NTSB’s longtime managing director, liked to joke that the agency was charged with investigating airplane manufacturers who could pay for its whole budget with a single airliner. In Tesla’s case, investigators could not retrieve and decode the proprietary data from the company’s on-board computers without internal assistance. Musk should have been well aware of this knowledge gap.
But the investigative process also benefited Tesla: if the company played a part in the investigation, it would be aware of potentially damaging information and could offer input and clarity about possible damning investigative findings. The ultimate goal was to keep the public safe — and a company that didn’t want to mask wrongdoing had little reason not to cooperate.
There were also rules: a party to a federal investigation could not unilaterally release information that might factor into the NTSB probe. The NTSB was “unhappy” with Tesla’s release of investigative information in the Huang crash, which had implied Huang’s inattention was a factor.
Basically, Tesla wasn’t allowed to spin its own version of the crash if it wanted to collaborate with safety officials in good faith. Sumwalt had been concerned about Tesla preemptively disclosing data that was subject to the investigation and wanted to make sure Musk understood the rules. On that spring day, Musk was polite and professional on the phone with Sumwalt — and he expressed openness to cooperating. He wanted Tesla to be part of the probe, he said. Sumwalt took that as an indication that Musk intended to follow the rules. So Elon Musk and the country’s top transportation safety investigator agreed to work together.
With the business portion of their conversation concluded, Sumwalt had more questions.
Musk had yet to become the richest person on earth, but he was quickly becoming a household name, a celebrity CEO whom many already regarded as the real-life Tony Stark.
What’s a day in the life of Elon Musk? Sumwalt asked.
Musk sounded tired to Sumwalt as he explained that his work schedule was intense. He had been sleeping on the floor of Tesla’s factory. The year 2018 had been the most painful of his career, as the company tried to sort out production problems with its mass-market Model 3. He told Sumwalt that around 75 to 80 percent of his time was dedicated to Tesla, around 20 percent to SpaceX, and the rest, a share larger than the 5 percent or less that remained, was devoted to his other companies. “I think that was intentional,” Sumwalt told me of Musk’s description exceeding 100 percent.
The conversation lasted thirty minutes. Sumwalt left feeling good about it.
But over the next few days, as Tesla continued to release information about the crash, even speculating on its cause, the situation deteriorated. Tesla was under pressure as the gruesome details of the crash were revealed. Other crashes had garnered NTSB attention, of course: a later case in Delray Beach, Florida, and an earlier one in Williston, Florida. The Williston crash was among the most troubling: the Tesla, operating in Autopilot mode, failed to distinguish the side of a tractor-trailer from a bright sky.
In that crash, forty-year-old Joshua David Brown was killed when a truck turned across the path of his 2015 Tesla Model S, which failed to slow down. The top of the Tesla was sheared off in a crash that safety officials attributed to distraction after the driver ignored at least seven visual safety warnings, on-screen prompts to pay attention. Investigators cited both Brown’s overreliance on the software and a more novel concept, the car’s “operational design domain,” or the set of conditions and locations in which Autopilot could be activated. Regulators at NHTSA, meanwhile, held Tesla largely free from blame, and Musk called their findings “very positive.”
A PR crisis and broken trust
But in the crash involving Huang, Tesla risked losing control. Musk was on the verge of positioning Autopilot as the most important product in Tesla’s portfolio, with potential value exceeding that of the company’s automotive business. He had seemed to realize the company’s image was taking a hit, so Tesla chose a strategy that most big automakers in its position wouldn’t think of: it started running interference. Again.
Suddenly, days after the call in which Musk agreed to the ground rules, details of the crash that were under investigation by the NTSB started pouring out of Tesla, directly from its PR department.
“According to the family, Mr. Huang was well aware that Autopilot was not perfect and, specifically, he told them it was not reliable in that exact location, yet he nonetheless engaged Autopilot at that location,” the company said in a press statement reported by outlets such as Fortune and ABC News. “The crash happened on a clear day with several hundred feet of visibility ahead, which means that the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so.”
When his agency stumbled on the press clippings a day after Tesla’s statement went out, Sumwalt couldn’t believe it. Tesla was blaming the driver for the crash that had killed him, after Sumwalt had explicitly warned Musk about the rules. It was beyond inappropriate; it was unconscionable.
He picked up the phone and dialed Musk, who wasn’t immediately available.
Later that day, Jones had been visiting with Sumwalt in his office when a call popped up on Sumwalt’s screen.
Jones remembers the moment vividly. “Wow, that’s Elon Musk,” Sumwalt said.
Sumwalt signaled for Jones to stay — they’d finish their conversation once the call had concluded. He knew it would serve him well to have a witness to their exchange.
They put Musk on speakerphone, and Sumwalt quickly got to the point.
“What you did, Elon, was a violation of our party agreement. We spoke about this last week. You agreed that you would abide by our requirements.”
There was nothing but silence for almost ten seconds.
Then Musk, growing agitated and shorter with his words, launched into a tirade. Sumwalt recalled him arguing: “You’re making a bad mistake. More people [will] die because of this, because of what you’re doing.”
The investigators were out of line, he indicated, behind the curve with their slow bureaucratic process, as Tesla had already drawn conclusions about the crash using its vast amounts of data, and there was no doubt that the driver had been at fault.
“He goes into a diatribe,” Sumwalt said, “about ‘well you’re decreasing safety by virtue of the fact that our car is safer when it’s on Autopilot, we’re saving more lives because of Autopilot than people are lost but by your removing us from the investigation you’re decreasing safety.’”
Musk then threatened to sue, though it was not clear what standing he would have had.
“That’s fine, go ahead,” Sumwalt responded.
Musk launched back into his argument about the safety of Autopilot, reminding them of its potential lifesaving capabilities. Sumwalt wondered: How many times do I need to tell you this?
After Musk had finished, Sumwalt signaled for Jones to weigh in. The managing director explained how the parties were expected to work collaboratively, how the agency maintained productive relationships with automakers subject to its investigations, and he noted the harmonious relationship the agency had had with SpaceX in the past. Musk didn’t respond right away.
“I don’t want us to be removed from the investigation,” Musk finally said, as if the prior twenty-seven minutes hadn’t happened.
“It’s too late for that,” Sumwalt said, at no small risk to his agency, which relied on Tesla’s expertise to decode its data.
The line went dead.
Sumwalt and Jones looked at each other in shock, trying to process what had happened. It wasn’t just being hung up on; Musk’s demeanor and attitude and his unconvincing argument — a repetition-filled script — had left Sumwalt thoroughly unimpressed. There simply wasn’t enough evidence to demonstrate that Autopilot, a suite of driver-assistance features with a catchy name, was the transformative and revolutionary system with the lifesaving capabilities Musk touted; in this particular instance, it was at the center of a fatal crash, a high-tech calamity that safety investigators could examine to uncover new findings about the intersection of technology, driver distraction, and speed.
It was clear to them both that Musk didn’t even recognize the difference between the roles of the safety investigators from the NTSB and the regulators from NHTSA, an important distinction that the head of an automaker should understand.
Tesla would later claim it hadn’t been booted from the investigation — it had withdrawn on its own. In a statement reported by outlets including CNBC, Tesla elaborated on its apparent decision to withdraw. “It’s been clear in our conversations with the NTSB that they’re more concerned with press headlines than actually promoting safety,” the company said. “Among other things, they repeatedly released partial bits of incomplete information to the media in violation of their own rules, at the same time that they were trying to prevent us from telling all the facts.”
Sumwalt called the maneuver a case of “You can’t fire me, I quit.”