Elon Musk’s Crash Course at Tesla

Never mind the dumpster fire at Twitter. The (sometimes) richest man in the world’s bigger headache just may be his once-darling car company

Elon Musk’s chaotic acquisition of the social-media giant Twitter—which saw the re-platforming of ex-president Donald Trump and a host of deplorables alongside the dismissal of half of Twitter’s workforce—gave the world a bracing primer in the management style of the South African-born billionaire mogul. One can theorize that Musk’s notorious impulsiveness and apparent struggles with emotional regulation might be interfering with his vision of a Muskified future of space travel on demand and cars that drive themselves, embodied by his two principal companies: SpaceX, the leading private space-transport company, and Tesla, the definitive electric-vehicle brand. While both are alive and humming, the instant, just-add-water future they once offered has yet to materialize.

Wreckage of a Tesla that crashed on Autopilot. (PHOTO: NTSB via AP, file)

In no arena is this truer than that of the driverless vehicle. What was once an inspiring moonshot, one that as late as 2017 Musk was insisting would be commonplace by now, has instead settled into a morass of regulatory scrutiny, scaled-back expectations, and finger-pointing over who is to blame for bringing an overhyped technology crashing back to earth. Increasingly, Tesla and its temperamental CEO are taking the heat.

Tesla took its first steps toward autonomous cars in 2014, when it began outfitting its Model S sedans with technology to automate certain functions, like acceleration, braking, and steering. The feature, evocatively dubbed “Autopilot,” combined lane-keeping assistance, meant to keep vehicles inside the lane lines painted on asphalt, with elements of traditional adaptive cruise control, which had debuted on the 1998 Mercedes S-Class sedan. A blog post released by the company in 2015 insisted that Autopilot was not a “self-driving” mechanism and that the driver would be “responsible for, and ultimately in control of, the car.” The same post stated that Tesla vehicles were equipped with hardware that would allow “the introduction of self-driving technology” through incremental software releases.

A battery fire destroys a Tesla in 2022. (PHOTO: Morris Township volunteer fire department)

Musk, ever to the frustration of his publicists, appeared to trample the company’s cautious corporate messaging a year later by telling reporters on a conference call that he believed Autopilot to be “probably better” than human drivers. On that same call, he forecast that Teslas would be equipped to drive better than humans within three years and that, within two, it would be possible to “summon” one’s Tesla from remote locations. “eg, you’re in L.A. and the car is in N.Y.,” Musk later tweeted. The unsurprising takeaway: Teslas could already drive themselves. Soon, YouTube was broadcasting videos of Model S owners riding shotgun as their cars hurtled down highways, driver seats serenely empty.

According to Consumer Reports, Teslas on Autopilot moved ‘like a drunken or distracted driver.’

Then the accident reports started rolling in. In January 2016, a 23-year-old in the Chinese province of Hebei was killed while driving home from a family wedding; the Tesla Model S he was driving reportedly collided with the rear of a street sweeper. His family sued Tesla, claiming Autopilot was engaged at the time of the accident. A few months later, Joshua Brown, 40, of Canton, Ohio, was killed when his Model S slammed into the side of a tractor-trailer in Levy County, Florida. Brown was a Tesla enthusiast who frequently made and published YouTube videos of himself using Autopilot, one of which Musk reposted to his own Twitter account, writing that it showed Autopilot steering to “avoid collision with a truck.”

Elon Musk. (AP PHOTO/Marcio Jose Sanchez)

Brown’s death caught the attention of the National Transportation Safety Board, which launched an investigation. In its final report, the agency backtracked on its preliminary finding that Autopilot had failed to apply the brakes, concluding that Brown’s crash was not in fact the result of a technological defect. Ultimately, a 2019 independent review by the Quality Control Systems Corporation disputed those findings, stating that collisions involving airbag deployment in fact likely increased following the integration of Autopilot.

Meanwhile, scrutiny of Tesla’s driverless features continues to mount. Tests conducted by Consumer Reports found that the Autopilot feature required “significant driver intervention” and that it “lagged far behind a human driver’s skills.” The Consumer Reports test vehicle cut off nearby cars and passed others in ways that violated state laws. The magazine’s test of the “Smart Summon” feature found that a driverless Tesla often crossed lane lines, had difficulty navigating parking lots, and moved “like a drunken or distracted driver.”

The National Highway Traffic Safety Administration has thus far identified 11 instances in which Teslas on Autopilot have struck first responders or emergency vehicles using flashing lights, flares, or illuminated signage to warn of hazards. In August 2021, the agency opened an investigation into alleged safety defects in Autopilot, bolstered by calls from Congress to look into what some have described as Tesla’s deceptive marketing practices around the feature; a year earlier, a court in Germany banned Tesla from using the term “Autopilot” in its sales and marketing materials. The NHTSA also ordered the company to disclose information about any crashes involving its driverless-vehicle technology, along with the marketing materials describing its functionality.

Tesla responded to the NHTSA inquiry in October 2021, claiming the request violated its right to shield trade secrets and confidential business information. That same month, however, Tesla issued a recall for over 11,000 vehicles running its latest “Full Self-Driving,” or FSD, beta software—one of the incremental upgrades to Autopilot the company had promised, which was installed in 160,000 Teslas whose owners had agreed to road-test the program.

Soon, drivers with FSD deployed reported instances of “phantom braking,” or sudden, jarring braking while the car is in motion. A report in the Washington Post documented similar problems with FSD-equipped Teslas. The Post assembled a panel of six experts in driverless-vehicle technology, which reviewed a series of authenticated driver videos published to YouTube showcasing the FSD feature. The panel identified multiple instances of concern, including a scene in which a driver appeared to be fighting FSD for control and vehicles “failing to properly interpret critical road markings and signs” as well as “ordinary pedestrian behavior.”

One video reviewed by the Post, reported to be the first recorded crash of a car running FSD, showed a Tesla making an automated right turn through an intersection in San Jose, California, and colliding head-on with a green bollard installed to protect a bike lane from passing traffic. A video review of an FSD-equipped Model 3 sedan released by CNN described the experience as “a little like teaching a teenager how to drive.”

Last November, Chinese law-enforcement authorities launched an investigation into an incident in Guangdong province in which a Model Y attempting to park suddenly took off down a two-lane road at breakneck speed before colliding with a storefront about 30 seconds later. The driver reportedly survived with injuries, but two other people were killed in the incident. Rumors swirled online that the vehicle had been attempting to auto-park before it mysteriously accelerated away on its own. Tesla has claimed that logs from the vehicle in question show the brake pedal was not applied during the incident. Surveillance footage of the speeding Tesla shows that its rear brake lights were not illuminated for most of the 30-second stretch, though they flash on briefly about 23 seconds in.

PARK IT. Santa Barbara engineer Dan O’Dowd wants Tesla to pull its “terrible” self-driving software. (PHOTO: Jonas Jungblut)

Dan O’Dowd is a Santa Barbara-based engineer and CEO of Green Hills Software, which provides programming for intercontinental nuclear bombers and what he describes as “systems that require the highest degree of reliability.” O’Dowd is also founder of the Dawn Project, a software-safety advocacy organization that has kept Tesla squarely in its crosshairs over the years.

“It’s machinery that has the potential to kill many people,” he says of FSD-outfitted Tesla models. “And it is being developed with the move-fast-and-break-things methodology of Silicon Valley. We shouldn’t be relying on that methodology when building things that many lives depend on.”

O’Dowd, who last year ran an unsuccessful campaign for U.S. Senate in California on a platform of banning FSD from being tested on public streets, describes FSD as “terrible” and disputes Tesla’s characterization of the software as “driver assistance.”

“It drives the car for you,” he tells Los Angeles. “It just drives the car badly.”

According to Dawn Project studies of various Tesla models, FSD engages in maneuvers that “would cost you your license” approximately every eight minutes. One of the main bugs identified by O’Dowd and his team concerns FSD’s ability to read traffic signs. “It doesn’t know what a do-not-enter sign means,” he says. “So it’ll drive down a one-way street.” FSD also comes with speed limits preprogrammed, meaning it won’t recognize or honor temporary speed limits posted in construction zones, for example. “It should not be sold,” O’Dowd insists. “This product is simply not ready to be on the road.”

Since the NHTSA launched its inquiry into the safety of FSD, the picture has only gotten gloomier for Tesla. Last February, the company announced a recall of nearly 54,000 vehicles equipped with FSD, which, in addition to its other vagaries, allows vehicles to make “dangerous and potentially fatal rolling stops at stop signs,” according to Los Angeles personal injury lawyer Peter Steinberg. To date, the company has recalled more than 1.5 million vehicles over issues with various automatic functions, including FSD. (Los Angeles reached out to Tesla for comment on current or forthcoming safety updates to Autopilot and FSD but did not receive a response prior to publication.)

Tesla is not the only company running into problems implementing driverless technology. New York Times reporter Cade Metz relayed his experience riding through the streets of San Francisco in a driverless Chevy Bolt operated by Cruise, the rideshare service backed by General Motors. Metz reported that the Bolt swerved sharply to avoid a car it mistook for a pedestrian, failed to decelerate adequately when braking at red lights, and at one point detected an imaginary accident in the road, pulled over, and promptly shut down for the night.

It doesn’t know what a do-not-enter sign means. So it’ll drive down a one-way street.

For Alexander Wyglinski, an associate dean for graduate studies in electrical and computer engineering at Worcester Polytechnic Institute in Massachusetts, the present dreary outlook for driverless technology may not be permanent. He’s particularly bullish on research into connected and autonomous vehicles, or CAVs, which incorporate data communicated from nearby vehicles to map their environment in real time; already, Hawaii, Pennsylvania, and Michigan have implemented programs to adopt CAV specifications for future autonomous vehicles.

An autonomous car’s collision avoidance system that relies on, say, 11 internally stored images of a deer jumping across the road is limited to those exact scenarios; a CAV can access thousands of images of deer, standing still and in motion, solo or in groups, to recognize and respond to the threat before a set of antlers comes crashing through the windshield. There are no consumer vehicles currently on the market that have integrated CAV technology, “but there are a ton of people working on it,” Wyglinski notes.

Meanwhile, Tesla plows ahead with FSD. Last November, Musk took to his just-purchased Twitter to announce: “Tesla Full Self-Driving Beta is now available to anyone in North America who requests it from the car screen . . . Congrats to Tesla Autopilot/AI team on achieving a major milestone!”

To which a commenter responded: “It should be illegal to ‘beta test’ shit like that on the open road.”
