Tesla's self-driving bias: Musk and influencers get priority in autonomous driving AI development



  • Tesla gives special treatment to data from VIPs, including Elon Musk, when training its self-driving AI.
  • Current and former staff say they were told to work on routes driven by Musk and by influencers.
  • Experts say Tesla’s self-driving resources are being unevenly distributed as a result.

Tesla’s self-driving cars seem like a marvel of machine learning.

But in reality, the company relies on a small army of human “data annotators” who continuously improve how the cars drive by reviewing camera footage from thousands of Tesla drivers and teaching the vehicle to behave like a human driver, such as deciding when it’s appropriate to use a blinker or identifying a construction cone.

Business Insider has learned that those annotators focus their efforts on two high-profile categories of drivers: Tesla CEO Elon Musk and a select set of “VIP” drivers.

BI spoke with over a dozen current and former Tesla employees, all but one of whom spoke on condition of anonymity, who said images and video clips from Musk’s Teslas received meticulous scrutiny, while data from high-profile drivers like YouTubers received “VIP” treatment in identifying and addressing issues with the Full Self-Driving software. The result is that Tesla’s Autopilot and FSD software may better navigate routes taken by Musk and other high-profile drivers, making their rides smoother and more straightforward.

That means, experts say, Tesla’s resources are being unevenly distributed and could serve as a distraction from the company’s larger mission of truly autonomous driving.

Each Tesla is equipped with nine cameras, and owners can opt to share video from those cameras to improve Tesla’s systems.

Tesla’s legion of data annotators review the clips shared with Tesla and use the images to train the system to execute a proper left turn or identify a stop sign (and stop at it). The workers also review situations where the system failed to respond properly and a driver had to take back control of the vehicle.

The annotators label videos where the system worked properly and instances when it malfunctioned. By ID’ing issues, the data-annotation team can update Tesla’s global database with new information to clear up any confusion for other Teslas that encounter the same situations. Put simply, they teach Tesla’s AI that the stop sign at First and Main is part of a four-way stop.

Analyzing data from Musk’s vehicles has been a priority since the program’s inception, multiple workers said.

Eight workers said they recalled labeling data that they believed was associated with the billionaire. Two workers said they labeled a route in 2021 that went in and out of the driveway of a mansion in Hillsborough, California, that they later discovered belonged to Musk. The Tesla CEO sold the house in November 2021 for $32 million.

Several workers said they spent a significant amount of time labeling routes in and out of Tesla’s Austin and Fremont, California, factories, as well as the SpaceX office in Hawthorne, California.

‘It felt dishonest’

While the annotators could have been viewing data from other Tesla employees or SpaceX workers who owned Teslas, they said the same focus was not given to other factory or office parking lots in California or elsewhere. In addition, one recalled labeling a series of clips from late 2022 and early 2023 that involved Twitter’s headquarters in San Francisco. The worker said the team was told to focus on data from areas near Twitter headquarters during the same time period that Musk took control of the social-media company.

John Bernal, a former Autopilot analyst and test driver, and three other former workers told BI they were informed they were working on data from Musk’s car and specifically told to treat the clips with care. While workers at the data-labeling sites are typically rated on how fast they can annotate the data, Bernal and two other workers said they were told to take more time with data from Musk’s vehicles, adding that the clips would go through an extra round of quality assurance as well.

“It seems pretty clear that Elon’s experience would be better than anyone else’s,” one former employee said. “He was seeing the software at its best.”

Another worker said they had some misgivings about the initiative.

“It seemed like we were purposely making his car better to make Autopilot look different than it was,” another former employee said. “It felt dishonest.”

Four other workers said they believed they’d labeled routes associated with Musk but had not been expressly told by supervisors. When annotators view the data, they can see a timestamp from when the footage was taken and the geolocation, but they’re unable to view anything that explicitly identifies the specific vehicle or driver. Instead, annotators said they could rely on context clues, particularly the routes and locations the vehicle visited.

Some workers said the cost of failing to label Musk’s data correctly could be high. Two former workers recalled an incident where a data annotator was terminated shortly after labeling a clip they believed came from Musk’s car. The worker was escorted out of Tesla’s facility in Buffalo, New York, the workers said, after the data labeler failed to properly label a highway exit sign. One former worker said it was highly unusual for someone in data annotation to be fired without warning and that the employees were typically put on notice if they were not hitting their metrics.

One former worker told BI they recalled labeling a route in 2020 from a house in Los Angeles to SpaceX’s headquarters in Hawthorne, where the software struggled to identify the lines on the road toward an off-ramp. Tesla’s Autopilot software has struggled to follow patchy lane markings in the past. In Walter Isaacson’s biography on Musk, the author said that earlier on in the Autopilot program, Tesla persuaded a “Musk fan” at the California Department of Transportation to repaint the lines on Interstate 405 after Musk encountered issues with Autopilot as a result of the faded lane markings.

However, one former worker said there was no way for any of the labelers to know with certainty whether the clip belonged to any one driver, adding that anyone who thinks they know the owner of the car would be operating on “pure speculation.”

Representatives for Tesla and Musk did not reply when reached for comment.

Tesla influencers also get extra attention

Musk isn’t the only driver who gets special treatment.

Since FSD was released in 2020, Tesla fans and critics alike have taken to social media to share videos of the technology succeeding and failing, from clips of it navigating difficult routes without human intervention to buggy videos showing the car running over toddler-sized dummies or mistaking the moon for a stop light.

These videos do not go unnoticed by Tesla staff. In fact, the company created a system to prioritize data from drivers most likely to share their experience online, three current and former workers with direct knowledge of the issue told BI. These drivers are internally referred to as “VIP” users and their data is at times put in VIP queues, according to the workers.

Data collected from VIP users, including high-profile Tesla drivers who post on YouTube, is scrutinized more heavily and more likely to be labeled, three current and former workers said. They said they’d been specifically told by leads on their teams that they were working on “VIP data” and had received overtime pay to work on the data ahead of FSD updates.

“We would annotate every area that car regularly drove in,” one former worker, who said they were told by their manager they were working on “Tesla influencer” data, added. “We’d home in on where they lived and label everything we could along that route.”
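The “VIP queues” the workers describe amount to a priority queue: clips from flagged accounts jump ahead of the general annotation backlog. A minimal sketch of that idea — the `is_vip` flag, priorities, and queue structure are assumptions for illustration, not Tesla’s implementation:

```python
import heapq
import itertools

# Illustrative sketch of a VIP-first annotation queue, as described
# by workers. Names and priorities are assumptions, not Tesla's code.

_counter = itertools.count()  # tie-breaker keeps FIFO order within a tier

class AnnotationQueue:
    def __init__(self):
        self._heap = []

    def submit(self, clip_id: str, is_vip: bool) -> None:
        # Lower number = higher priority; VIP clips sort ahead of the rest.
        priority = 0 if is_vip else 1
        heapq.heappush(self._heap, (priority, next(_counter), clip_id))

    def next_clip(self) -> str:
        return heapq.heappop(self._heap)[2]

q = AnnotationQueue()
q.submit("commuter-001", is_vip=False)
q.submit("youtuber-lombard-st", is_vip=True)
q.submit("commuter-002", is_vip=False)
print(q.next_clip())  # the VIP clip comes off the queue first
```

Under this scheme a VIP clip submitted last is still annotated first, which matches the workers’ account of VIP data being “more likely to be labeled” ahead of FSD updates.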

Bernal said that Tesla sent multiple test drivers out to routes YouTubers had driven, including routes driven by Raj Balwani and Chuck Cook, two YouTubers who often review the software.

Bernal said he was one of eight or nine test drivers who went to Lombard Street to work on a solution, after Balwani, also known by his YouTube account Tesla Raj, posted a video of FSD repeatedly attempting to veer off the famously curvy road. The company eventually coded invisible barriers into the system to fix the issue specifically for Lombard Street, according to Bernal. (Bernal was terminated in 2022. He said it was the result of sharing a series of videos on his YouTube channel of his personal Tesla malfunctioning while using FSD.)
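A location-specific patch like the “invisible barriers” Bernal describes can be thought of as a geofenced override: if the car is inside a hand-drawn bounding box, hand-tuned constraints replace the normal perception output. The coordinates and structure below are rough guesses for illustration; Tesla has not disclosed how the Lombard Street fix works.

```python
# Rough sketch of a geofenced override like the "invisible barriers"
# Bernal describes. Bounding box and return values are illustrative.

# (min_lat, max_lat, min_lon, max_lon) approximately covering the
# curvy block of Lombard Street in San Francisco (for illustration).
LOMBARD_BOX = (37.8018, 37.8023, -122.4200, -122.4170)

def in_geofence(lat: float, lon: float, box: tuple) -> bool:
    min_lat, max_lat, min_lon, max_lon = box
    return min_lat <= lat <= max_lat and min_lon <= lon <= max_lon

def lane_constraints(lat: float, lon: float) -> str:
    """Return hand-tuned virtual barriers inside a geofenced hotspot,
    otherwise fall back to normal perception-derived lane edges."""
    if in_geofence(lat, lon, LOMBARD_BOX):
        return "hand-tuned virtual barriers"
    return "perception-based lane edges"
```

The trade-off experts raise later in the piece follows directly from this structure: a geofenced fix helps only drivers inside the box, not the fleet at large.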

Balwani told BI he’d never been contacted by any Tesla employees regarding his videos but that he saw the company’s focus on online feedback as a positive sign.

“I think this just means that their teams are monitoring and are engaged in the areas that they need to be,” Balwani said.

“For most of what I’ve recorded and done since I’ve had FSD, almost everything has been solved, which is pretty incredible,” he added.

In 2022, Musk congratulated Cook on Twitter for giving Tesla a “tough case to solve” after the carmaker rolled out an update meant to address an issue with unprotected left turns that Cook had pointed out in his videos.

Cook told BI he’s very aware of Tesla’s focus on his content — in fact, he said, he sees test drivers in his neighborhood on a weekly basis. The YouTuber said he’d attempted to reach out to Autopilot engineers over email and social media but that they never responded and the test drivers in his neighborhood had been very “tight-lipped” about their work.

Cook said he sent an email in 2020 to an account for FSD beta testers asking whether Tesla was really looking at his data.

“They sent a screenshot of what my camera was seeing in my car just 30 minutes prior,” he said.

Cook feels Tesla is focused less on handpicking influencers and more on gathering the best data for training, he said.

“They know I’m not just blabbing and fanboying or overcriticizing,” Cook said. “I’m fair.”

One worker with knowledge of the issue said the VIP system wasn’t designed to give preferential treatment but served as an additional method for Tesla to improve FSD for all drivers.

YouTubers are “going out and constantly trying to break the system,” the worker said, adding: “They’re identifying difficulties that could be translated to other routes and calling attention to them.”

“In a way, they’re a second tier of test drivers,” they added.

But Tesla’s focus on Musk and VIP users could be detrimental to the company’s efforts to achieve truly autonomous driving, Missy Cummings, a former safety advisor for the National Highway Traffic Safety Administration, said.

“It’s going to be hard to make a self-driving car for the masses if it’s only operating well around Elon’s house,” Cummings told BI.

The issue comes down to whether Tesla’s focus on VIP users contributes to niche improvements like the fixes to Lombard Street or ones that will benefit the entire community, Philip Koopman, a computer-engineering expert at Carnegie Mellon University, told BI.

“I expect there is marketing pressure to make the VIP drivers look good in their videos, and it’s hard to know how much of that is theater and how much is reality without Tesla disclosing the amount of safety improvements provided by each change,” Koopman said.

Tesla’s self-driving in the regulatory spotlight

Tesla has come under increasing scrutiny from regulators over the self-driving software and the company’s marketing of the service. In April, the National Highway Traffic Safety Administration’s investigation into Tesla’s Autopilot and Full Self-Driving linked the software to hundreds of crashes and dozens of deaths, citing inadequate measures to ensure driver attention.

Additionally, the US Justice Department is investigating whether Tesla committed securities or wire fraud over claims that it misled investors and consumers about its electric vehicles’ self-driving capabilities.

Meanwhile, Musk has repeatedly said Tesla is getting closer to its self-driving goal, including plans to unveil its Robotaxi service later this year.

Musk sees Autopilot and FSD as existentially important for Tesla. Self-driving, he said in a 2022 interview, is “really the difference between Tesla being worth a lot of money or worth basically zero.”

Do you work for Tesla or have a tip? Reach out to the reporter via a non-work email and device at gkay@businessinsider.com or 248-894-6012.


