A Tesla on the latest Full Self-Driving (FSD) Supervised update abruptly veered off the road and flipped upside down, causing a scary crash that the driver said he couldn't prevent.
We have seen many crashes involving Tesla's Supervised FSD over the years, but the vast majority of them have a major contributing factor in common: the driver is not paying attention or is not ready to take control.
A typical crash scenario with Tesla FSD is that the vehicle doesn't see an obstacle on the road, like another vehicle, and crashes into it, even though the driver would have had time to react if they had been paying enough attention.
Despite its name, Full Self-Driving (FSD) is still considered a level 2 driver assistance system and is not fully self-driving. It requires drivers to stay attentive at all times and to be ready to take control, hence why Tesla more recently added 'Supervised' to the name.
According to Tesla, the driver is always responsible in a crash, even when FSD is activated.
The automaker has implemented driver monitoring systems to ensure drivers' attention, but it has been gradually relaxing them.
Just today, Tesla released a post on X in which it said drivers just need to "lean back and watch the road" when using FSD:

Sitting back and watching the road was exactly what Wally, a Tesla driver in Alabama, was doing when his car abruptly veered off the road in Toney, Alabama, earlier this year.
Wally leased a brand new 2025 Tesla Model 3 with FSD and understood that he needed to pay attention. When talking with Electrek yesterday, he said that he used the feature regularly:
I used FSD every chance I could get. I actually watched YouTube videos to tailor my FSD settings and experience. I was happy it could drive me to Waffle House and I could just sit back and relax while it drove me on my morning commute to work.
Two months ago, he was driving to work on Tesla Full Self-Driving when his car abruptly swerved off the road. He shared the Tesla camera video of the crash:
Wally told Electrek that he didn't have time to react even though he was paying attention:
I was driving to work and had Full Self-Driving on. The oncoming car passed, and the wheel started turning rapidly, driving into the ditch, side-swiping the tree, and the car flipped over. I didn't have any time to react.
The car ended up flipping upside down from the crash:



Thankfully, Wally only suffered a relatively minor chin injury from the accident, but it was a scary experience:
My chin split open, and I had to get 7 stitches. After the impact, I was hanging upside down, watching blood drip down to the glass sunroof, not knowing where I was bleeding from. I unbuckled my seatbelt and sat on the fabric interior between the two front seats, and saw that my phone's crash detection had gone off and told me first responders were on their way. My whole body was in shock from the incident.
The Tesla driver said that one of his neighbors came out of their house to check that he was okay, and the local firefighters arrived to get him out of the upside-down Model 3.
Wally said he was on Tesla FSD v13.2.8 on Hardware 4, Tesla's latest FSD technology. He asked Tesla to send him the data from his car to better understand what happened.
Electrek’s Take
This is where Tesla FSD gets really scary. I get that Tesla admits FSD can make mistakes at the worst possible moment and that the driver needs to pay attention at all times.
The idea is that if you pay attention, you can correct these mistakes, which is true most of the time, but not always.
In this case, the driver had less than a second to react, and even if he had reacted, it might have made things worse, for example by correcting, but not enough to get back on the road, and hitting the tree head-on instead.
In cases like this one, it's hard to put the blame on the driver. He was doing exactly what Tesla says you should do: "lean back and watch the road."
A very similar thing happened to me last year when my Model 3 on FSD veered to the left, trying to take an emergency exit on the highway for no reason. I was able to take control in time, but it created a dangerous situation as I almost overcorrected into a vehicle in the right lane.
In Wally's case, it's unclear what happened. It's possible that FSD believed it was about to hit something because of the shadows on the ground. Here's the view from the front-facing camera, a fraction of a second before FSD veered to the left:

But that's just speculation at this point.
Either way, I think Tesla has a complacency problem with FSD, where drivers are starting to pay less attention, leading to some crashes. But there are also crashes like this one that appear to be 100% caused by FSD, with little to no opportunity for the driver to prevent them.
That's even scarier.