Nurse's Tesla video ignites self-driving safety debate

A registered nurse shared a TikTok video in which she appeared to be asleep while her Tesla drove in Full Self-Driving mode after a night shift, drawing widespread criticism for promoting risky behavior. Commenters highlighted the dangers and illegality of relying entirely on the technology, though the nurse clarified that her eyes were open. The incident underscores ongoing concerns about the safeguards of Tesla's Full Self-Driving system.

Vanessa Lim, a registered nurse known on TikTok as @vanessalim, posted a video showing her slouched in the driver's seat of her Tesla electric vehicle during a solo ride home following a night shift. The clip includes a text overlay stating, “thank god for my self-driving Tesla cause how tf was I gunna drive home after that nightshift,” with buildings and cars visible passing by outside. Her right arm appears extended as if recording, her left arm wrapped around her neck, and neither hand touches the steering wheel. Although her eyes seem closed at first, closer inspection reveals fluttering eyelids, a point noted by a commenter: “Do people not see you blinking?”

The video quickly attracted backlash from viewers who deemed it irresponsible. One remarked, “Yeah lets not play with peoples lives,” emphasizing the potential for fatal outcomes. Another warned, “Babe no this is illegal and dangerous. These self-driving cars are not to be fully trusted. Protect yourself. I worked nights too so I feel the struggle though.” In response, Lim questioned, “Wait how is driving a Tesla on Full self-driving mode illegal?” A third commenter observed, “Lmao I think she thinks she actually fully asleep…while recording mind you,” to which Lim replied, “Dude my eyes are open in this video.”

Tesla's Full Self-Driving (Supervised) system requires drivers to remain attentive and ready to take control at any moment, as stated by the company. Discussions on forums such as Tesla Motors Club suggest that while software updates were meant to prevent drivers from dozing off, online footage indicates these measures are not always effective. A May 2025 update dialed back in-cabin eye tracking in response to complaints about excessive alerts issued even when drivers were paying attention. Tesla CEO Elon Musk has claimed that owners will soon be able to safely text while the technology drives. Reports also describe prompts aimed at drowsy drivers, such as a pop-up reading: “Pull down to activate FSD (Supervised). Lane drift detected. Let FSD assist so you can stay focused.” InsideEVs contacted Lim and Tesla for comment.

Related Articles


NHTSA investigates Tesla's Full Self-Driving software for traffic violations


The U.S. National Highway Traffic Safety Administration has launched its sixth investigation into Tesla's Full Self-Driving software following reports of dangerous traffic violations. The probe examines incidents including running red lights and driving in wrong lanes, which led to crashes and injuries. This comes amid Tesla's push toward robotaxis and unsupervised driving.

A video has surfaced showing a Tesla Cybertruck driver playing the video game Grand Theft Auto while the vehicle's Full Self-Driving system is engaged on the highway. The driver uses a controller, with eyes focused on the game screen, as the truck navigates traffic. This incident highlights ongoing efforts by drivers to bypass Tesla's driver-monitoring safeguards.


A Tesla Model 3 veered into oncoming traffic during a livestream demonstration of its Full Self-Driving features in China, causing a head-on collision. No one was critically injured, but the incident has raised fresh concerns about overreliance on the system's capabilities. The driver released footage showing that the software initiated the erroneous lane change.

California regulators are poised to suspend Tesla's vehicle sales license in the state for 30 days unless the company revises its marketing for self-driving features. An administrative law judge ruled that terms like 'Autopilot' and 'Full Self-Driving' mislead consumers about the technology's capabilities, which require constant human supervision. Tesla has 90 days to comply and avoid the penalty.


A driver in Houston has filed a lawsuit against Tesla after her Cybertruck allegedly attempted to drive off an overpass while Autopilot was engaged. The suit claims that Tesla's self-driving technology is defectively designed and misleadingly marketed as fully autonomous. The incident occurred last year.

The family of Jeffrey Nissen Jr., a 28-year-old motorcyclist killed in an April 2024 collision with a Tesla Model S operating on Autopilot, has filed a wrongful death lawsuit against the company. They allege that misleading marketing led to over-reliance on the system, and they seek damages and a halt to sales; authorities have since announced that the driver will face no criminal charges. The case underscores ongoing scrutiny of Tesla's autonomous technology.


A Georgia man survived a heart attack thanks to Tesla's Full Self-Driving system, which redirected his vehicle to a nearby medical center after his son changed the destination via the app. The incident occurred while the man was driving through Atlanta en route to Birmingham. Doctors later confirmed the quick reroute was life-saving.
